Open Source · MIT License · Free Forever

Turn raw logs into security intelligence with AI

A self-hosted SIEM platform that scores every event across 6 security criteria using LLMs, maps threats to MITRE ATT&CK, and filters PII before any data reaches the AI. Deploy in 5 minutes. Run for as little as $2/month.

Get Started on GitHub · Quick Start Guide
LogPulse AI Dashboard — real-time 6-criteria AI score bars for all monitored systems
6 AI Scoring Criteria · $2–10 Monthly LLM Cost · 5 min Docker Deploy · 11+ PII Masking Categories

What it does

🛡

Privacy-First AI

PII is filtered before any data reaches the LLM. 11 built-in masking categories (IPs, emails, credentials, etc.), custom regex patterns, and full field stripping. Verify with a live test filter.
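
The masking step can be sketched like this (the regex patterns, category names, and placeholder tokens below are illustrative assumptions, not LogPulse AI's built-in definitions):

```python
import re

# Two illustrative masking categories; the product ships 11+ and also
# supports custom regex patterns and whole-field stripping.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "ipv4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
}

def mask_pii(line: str) -> str:
    """Replace each category's matches with a placeholder before the LLM sees the line."""
    for category, pattern in PII_PATTERNS.items():
        line = pattern.sub(f"<{category.upper()}>", line)
    return line

print(mask_pii("login failed for bob@example.com from 10.0.0.7"))
# -> login failed for <EMAIL> from <IPV4>
```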

🤖

6-Criteria AI Scoring

Every event is evaluated by an LLM across IT Security, Performance, Failure Prediction, Anomaly Detection, Compliance, and Operational Risk. Each criterion uses a dedicated, tunable prompt.
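
In spirit, per-criterion scoring means rendering one dedicated prompt per criterion for every event. A minimal sketch (the template texts and criterion keys are illustrative, not the shipped, tunable prompts):

```python
# Hypothetical prompt templates, one per criterion; the real prompts
# are editable in LogPulse AI and will differ.
CRITERIA_PROMPTS = {
    "it_security": "Rate the IT-security relevance of this event, 0-100:\n{event}",
    "performance": "Rate the performance impact of this event, 0-100:\n{event}",
    "failure_prediction": "Rate how strongly this event predicts a failure, 0-100:\n{event}",
    "anomaly_detection": "Rate how anomalous this event is, 0-100:\n{event}",
    "compliance": "Rate the compliance relevance of this event, 0-100:\n{event}",
    "operational_risk": "Rate the operational risk of this event, 0-100:\n{event}",
}

def build_scoring_prompts(event: str) -> dict[str, str]:
    """Render one dedicated prompt per criterion for a single event."""
    return {name: tmpl.format(event=event) for name, tmpl in CRITERIA_PROMPTS.items()}

prompts = build_scoring_prompts("sshd: Failed password for root from 10.0.0.7")
print(len(prompts))  # 6 -- one prompt per criterion
```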

🎯

MITRE ATT&CK Mapping

AI findings carry optional technique IDs and confidence scores, mapped to the MITRE ATT&CK framework. Go from raw log to threat classification automatically.
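
A finding's ATT&CK annotation boils down to a technique ID plus a model-reported confidence. A minimal sketch (field names and the validation rule are assumptions, not LogPulse AI's schema):

```python
import re
from dataclasses import dataclass

# ATT&CK technique IDs look like T1110 or, with a sub-technique, T1110.001.
TECHNIQUE_ID = re.compile(r"^T\d{4}(\.\d{3})?$")

@dataclass
class AttackMapping:
    """Illustrative shape of a finding's optional ATT&CK annotation."""
    technique_id: str
    confidence: float  # 0.0-1.0, as reported by the model

    def __post_init__(self):
        if not TECHNIQUE_ID.match(self.technique_id):
            raise ValueError(f"not a MITRE ATT&CK technique ID: {self.technique_id}")

m = AttackMapping("T1110.001", 0.82)  # Brute Force: Password Guessing
```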

🔍

Meta-Analysis & Findings

A sliding-window pipeline aggregates scores into structured findings, with deduplication (TF-IDF + Jaccard + LLM), severity decay, and auto-resolution when issues stop recurring.
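
Two of the cheaper stages are easy to illustrate: token-set Jaccard similarity for dedup, and exponential severity decay (the half-life below is an assumed value, not a LogPulse AI default):

```python
def jaccard(a: str, b: str) -> float:
    """Token-set Jaccard similarity, a cheap dedup stage that runs before the LLM pass."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def decayed_severity(severity: float, hours_since_last_seen: float,
                     half_life_h: float = 24.0) -> float:
    """Exponential decay: findings that stop recurring fade toward auto-resolution."""
    return severity * 0.5 ** (hours_since_last_seen / half_life_h)

print(round(jaccard("failed ssh login root", "failed ssh login admin"), 2))  # 0.6
print(round(decayed_severity(80.0, 48.0), 1))  # 20.0 after two half-lives
```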

💬

RAG "Ask AI"

Natural language queries over your entire event history. Ask "Were there failed SSH logins last night?" or "Summarize Docker issues from the past week." Persistent chat history.
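
At its core, RAG means retrieving relevant events and grounding the LLM's answer in them. A toy keyword-overlap sketch (LogPulse AI's actual retrieval is more involved than this):

```python
def retrieve(question: str, events: list[str], k: int = 3) -> list[str]:
    """Rank events by naive token overlap with the question; keep the top k."""
    q = set(question.lower().split())
    scored = sorted(events, key=lambda e: len(q & set(e.lower().split())), reverse=True)
    return scored[:k]

events = [
    "sshd: Failed password for root from 10.0.0.7",
    "dockerd: container web-1 restarted",
    "sshd: Accepted publickey for deploy",
]
question = "were there failed ssh logins last night"
context = retrieve(question, events, k=2)

# The retrieved events become the grounding context for the LLM.
prompt = "Answer using only this context:\n" + "\n".join(context) + "\n\nQ: " + question
```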

💰

16 Cost Optimizations

Template deduplication, score caching, severity pre-filtering, batch sizing, and 12 more techniques reduce LLM costs by 80-95%. Real-time usage tracking per model and system.
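
Template deduplication plus score caching captures the intuition behind much of the savings: events that differ only in variable parts share one LLM call. A minimal sketch (the templating rule here is deliberately crude; real systems also mask IPs, IDs, and paths):

```python
import re

_NUM = re.compile(r"\d+")

def template_of(line: str) -> str:
    """Collapse numbers so repeated events share one cache entry."""
    return _NUM.sub("<N>", line)

score_cache: dict[str, float] = {}

def score_event(line: str, llm_score) -> float:
    """Only call the (expensive) LLM for templates we have not scored yet."""
    key = template_of(line)
    if key not in score_cache:
        score_cache[key] = llm_score(line)
    return score_cache[key]

calls = []
fake_llm = lambda line: calls.append(line) or 42.0  # stand-in for a real LLM call
score_event("disk usage at 91%", fake_llm)
score_event("disk usage at 97%", fake_llm)
print(len(calls))  # 1 -- the second event hit the template cache
```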

🔒

Enterprise Security

RBAC with 20 permissions, an immutable audit log (PostgreSQL trigger), bcrypt password hashing, session token hashing, API key scopes with IP allowlists, and hardening against the OWASP Top 10.

🔌

Any LLM Provider

Works with OpenAI, Azure, Ollama, LM Studio, vLLM — any OpenAI-compatible API. Swap models from the UI without redeployment. Self-host for air-gapped environments.
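
Because every listed provider speaks the same /v1/chat/completions wire format, switching providers is essentially a base-URL change. A stdlib-only sketch (the localhost ports are those providers' conventional defaults, not LogPulse AI settings):

```python
import json

# Conventional default endpoints; any OpenAI-compatible server slots in here.
BASE_URLS = {
    "openai": "https://api.openai.com/v1",
    "ollama": "http://localhost:11434/v1",
    "vllm": "http://localhost:8000/v1",
}

def chat_request(provider: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for a chat-completions call to any provider."""
    url = f"{BASE_URLS[provider]}/chat/completions"
    body = json.dumps({"model": model,
                       "messages": [{"role": "user", "content": prompt}]})
    return url, body.encode()

url, body = chat_request("ollama", "llama3", "Summarize: sshd failed login burst")
print(url)  # http://localhost:11434/v1/chat/completions
```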

🚀

Universal Ingestion

Syslog (UDP/TCP), OpenTelemetry, Fluent Bit, Vector, Logstash, Beats. Pull connectors for Elasticsearch, Loki, VictoriaLogs, Kafka, RabbitMQ. Auto-discovery of new sources.
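
You can smoke-test the syslog path by hand-crafting a minimal RFC 3164 packet (the host and port below are assumptions; point them at wherever your LogPulse AI syslog listener is bound):

```python
import socket
from datetime import datetime

def rfc3164_message(tag: str, msg: str, facility: int = 1, severity: int = 6) -> bytes:
    """Build a minimal RFC 3164 syslog packet; priority = facility * 8 + severity."""
    pri = facility * 8 + severity
    ts = datetime.now().strftime("%b %d %H:%M:%S")
    return f"<{pri}>{ts} testhost {tag}: {msg}".encode()

packet = rfc3164_message("demo", "hello from the quick start")

# Fire the test event at a local UDP collector.
with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
    s.sendto(packet, ("127.0.0.1", 514))
```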

Why LogPulse AI over alternatives?

1

vs. Wazuh

No multi-hour setup. No YAML editing. Docker up in 5 minutes with full GUI configuration. Plus LLM-powered analysis that Wazuh doesn't have.

2

vs. Graylog

No feature-gated enterprise tier. Everything is free and open-source under MIT. AI analysis included, not an expensive add-on.

3

vs. ELK Stack

Built-in AI scoring and findings pipeline. No need to build detection rules, dashboards, or alerting from scratch. Connect Elasticsearch as a hybrid data source.

4

vs. Splunk

Self-hosted, no per-GB pricing, no vendor lock-in. AI analysis runs on any LLM provider you choose — including your own hardware.

Get running in 5 minutes

# Clone the repo
git clone https://github.com/PhilipLykov/LogPulseAI.git
cd LogPulseAI/docker

# Configure: set a DB password and point to the bundled PostgreSQL
cp .env.example .env
sed -i 's/DB_HOST=localhost/DB_HOST=postgres/' .env
sed -i 's/CHANGE_ME_STRONG_PASSWORD/YourStr0ngP@ss!/' .env

# Start everything (backend + dashboard + PostgreSQL)
docker compose --profile db up -d --build

# Get your login credentials
docker compose exec backend sh -lc "cat /app/bootstrap-secrets.txt"

# Open http://localhost:8070 and configure your LLM in Settings