AI SQL Generation

Logchef can translate natural language questions into ClickHouse SQL queries using OpenAI-compatible LLMs. The AI is schema-aware — it knows your table’s columns and types.

  1. Open any log source
  2. Click the AI assistant button (wand icon) in the query editor
  3. Describe what you want in plain English
  4. Review the generated SQL, then insert it into the editor

If you already have a query in the editor, the AI uses it as context for refinements.

"Show error logs from the last 6 hours"
"Count logs by severity for today"
"Find logs containing 'connection refused' from the payments service"
"Top 10 most frequent error messages this week"
"Logs where trace_id = 'abc123'"

Configure via Administration → System Settings → AI:

| Setting | Description | Default |
| --- | --- | --- |
| Enabled | Toggle AI features | true |
| API Key | Your OpenAI API key | (none) |
| Base URL | API endpoint | https://api.openai.com/v1 |
| Model | Model name | gpt-4o |
| Max Tokens | Max tokens to generate | 1024 |
| Temperature | Randomness (0.0–1.0) | 0.1 |

Any OpenAI-compatible API works:

| Provider | Base URL |
| --- | --- |
| OpenAI | https://api.openai.com/v1 (default) |
| OpenRouter | https://openrouter.ai/api/v1 |
| Azure OpenAI | Your Azure endpoint |
| Local models | Your local server URL |
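Before pointing Logchef at an endpoint, you can sanity-check that it speaks the OpenAI chat-completions protocol with a plain HTTP request. A minimal sketch, assuming your key is in `OPENAI_API_KEY` (swap the base URL and model for other providers):

```shell
# Should return a JSON chat completion if the endpoint is OpenAI-compatible.
curl -s https://api.openai.com/v1/chat/completions \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o", "messages": [{"role": "user", "content": "Say OK"}]}'
```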

For containerized deployments, seed the initial config via env vars:

```shell
export LOGCHEF_AI__ENABLED=true
export LOGCHEF_AI__API_KEY="sk-your-key"
export LOGCHEF_AI__MODEL="gpt-4o"
```

After first boot, the Admin UI takes precedence over env vars.
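The same variables can be passed straight to the container. A sketch — the image name and port mapping here are assumptions; use whatever your deployment actually runs:

```shell
# Illustrative only: image tag and port are placeholders for your setup.
docker run -p 8125:8125 \
  -e LOGCHEF_AI__ENABLED=true \
  -e LOGCHEF_AI__API_KEY="sk-your-key" \
  -e LOGCHEF_AI__MODEL="gpt-4o" \
  logchef:latest
```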

For AI assistant integration outside the web UI (e.g., Claude Desktop), see MCP Server.

AI-generated SQL should always be reviewed before execution. The model can produce incorrect queries, especially for complex aggregations or unfamiliar schemas. Start simple and iterate.