Natural Language to SQL
Transform plain English questions into optimized ClickHouse SQL queries instantly.
LogChef uses Large Language Models (LLMs) to translate natural language questions about your logs directly into ClickHouse SQL queries. This lowers the barrier to entry for log exploration, helping users who are less familiar with SQL gain insights quickly, and speeds up query construction for experienced users.
Simply type your query in plain English, and LogChef will do its best to generate the appropriate SQL to fetch the data you need from your connected ClickHouse log sources.
- Schema-Aware Generation: AI understands your log structure and generates queries using actual field names and types.
- Multiple Provider Support: works with OpenAI, OpenRouter, Azure OpenAI, and other OpenAI-compatible APIs.
- Context-Aware Suggestions: AI considers your current query context to provide better suggestions and refinements.
When you use the AI SQL generation feature for a specific log source, LogChef provides the underlying table schema and your natural language query to an AI model. The model then constructs a ClickHouse SQL query tailored to your request and the schema of your data.
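As a rough illustration, the sketch below pairs a simplified, hypothetical log table with the kind of query the model might return for a question like "Show me errors from checkout-service in the last hour." The table name, columns, and engine here are illustrative assumptions, not a schema LogChef requires or guarantees.

```sql
-- Hypothetical log table (illustrative only; LogChef sends your source's real schema to the model)
CREATE TABLE logs_table
(
    timestamp     DateTime64(3),
    severity_text LowCardinality(String),
    service_name  LowCardinality(String),
    body          String
)
ENGINE = MergeTree
ORDER BY timestamp;

-- The kind of query the model might generate for:
-- "Show me errors from checkout-service in the last hour"
SELECT *
FROM logs_table
WHERE service_name = 'checkout-service'
  AND severity_text = 'ERROR'
  AND timestamp >= now() - INTERVAL 1 HOUR
ORDER BY timestamp DESC
LIMIT 100;
```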
Below are examples of natural language queries you can use with LogChef, demonstrating the versatility of the AI SQL generation feature. These are tailored for common log exploration use cases.
- "Show me all logs from auth-service today."
- "Show logs from the api-gateway service."
- "Find logs containing abc123."
- "Show logs from the payments namespace with severity ERROR."
- "Show logs from checkout-service in the default namespace."
- "Show logs from the syslog namespace for the last 12 hours."
- "Find logs with the field user_id and value 42."
- "Show logs where env=prod."
- "Find logs where span_id is missing or empty."
- "Show logs containing retry and failed in the body."

To enable AI SQL generation, configure the following settings in your config.toml:
[ai]
# Enable AI features
enabled = true

# OpenAI API key (required)
api_key = "sk-your_api_key_here"

# Model to use (default: gpt-4o)
model = "gpt-4o"

# Optional: Custom API endpoint for OpenRouter, Azure OpenAI, etc.
# base_url = "https://openrouter.ai/api/v1"

# Model parameters
max_tokens = 1024
temperature = 0.1
The AI integration supports OpenAI-compatible APIs, including:

- OpenAI (the default endpoint)
- OpenRouter (https://openrouter.ai/api/v1)
- Azure OpenAI
- Any other OpenAI-compatible endpoint, configured via base_url
For production deployments, use environment variables for sensitive configuration:
export LOGCHEF_AI__ENABLED=true
export LOGCHEF_AI__API_KEY="sk-your_api_key_here"
export LOGCHEF_AI__MODEL="gpt-4o"
Natural Language: "Show me all error logs from the last 6 hours"
Generated SQL:

SELECT *
FROM logs_table
WHERE severity_text = 'ERROR'
  AND timestamp >= now() - INTERVAL 6 HOUR
ORDER BY timestamp DESC
LIMIT 100
If you already have a query in the editor, the AI will consider it as context:
Current Query: SELECT * FROM logs WHERE service = 'api-gateway'

Natural Language: "Add a filter for errors in the last hour"
Enhanced SQL:

SELECT *
FROM logs
WHERE service = 'api-gateway'
  AND severity_text = 'ERROR'
  AND timestamp >= now() - INTERVAL 1 HOUR
ORDER BY timestamp DESC
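Attribute-style questions, such as the env=prod and span_id examples earlier on this page, depend heavily on how your schema stores attributes. The sketch below assumes an OpenTelemetry-style log_attributes Map(String, String) column and a plain span_id string column; these names are assumptions, and the SQL LogChef generates for your source may differ.

```sql
-- "Show logs where env=prod."
-- Assumes attributes live in a Map(String, String) column named log_attributes (an assumption, not a requirement).
SELECT *
FROM logs_table
WHERE log_attributes['env'] = 'prod'
ORDER BY timestamp DESC
LIMIT 100;

-- "Find logs where span_id is missing or empty."
SELECT *
FROM logs_table
WHERE empty(span_id)
ORDER BY timestamp DESC
LIMIT 100;
```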
For AI assistant integration outside the web interface, use the LogChef MCP server, which enables natural language log analysis through AI assistants like Claude Desktop.
While powerful, AI-generated SQL should always be reviewed, especially for critical or complex queries. The feature is a tool to assist and accelerate your log exploration, not a complete replacement for understanding your data and query logic.