Configuration
LogChef uses a TOML configuration file to manage its settings. This guide explains all available configuration options.
Server Settings
Configure the HTTP server and frontend settings:
```toml
[server]
# Port for the HTTP server (default: 8125)
port = 8125

# Host address to bind to (default: "0.0.0.0")
host = "0.0.0.0"

# URL of the frontend application
# Leave empty in production; used only in development
frontend_url = ""

# HTTP server timeout for requests (default: 30s)
http_server_timeout = "30s"
```
Database Configuration
SQLite database settings for storing metadata:
```toml
[sqlite]
# Path to the SQLite database file
path = "logchef.db"
```
Authentication
OpenID Connect (OIDC)
Configure your SSO provider (example using Dex):
```toml
[oidc]
# URL of your OIDC provider
provider_url = "http://dex:5556/dex"

# Authentication endpoint URL (optional: often discovered via provider_url)
auth_url = "http://dex:5556/dex/auth"

# Token endpoint URL (optional: often discovered via provider_url)
token_url = "http://dex:5556/dex/token"

# OIDC client credentials
client_id = "logchef"
client_secret = "logchef-secret"

# Callback URL for OIDC authentication
# Must match the URL configured in your OIDC provider
redirect_url = "http://localhost:8125/api/v1/auth/callback"

# Required OIDC scopes
scopes = ["openid", "email", "profile"]
```
Auth Settings
Configure authentication behavior:
```toml
[auth]
# List of email addresses that have admin privileges
admin_emails = ["admin@corp.internal"]

# Duration of user sessions (e.g., "8h", "24h", "7d")
session_duration = "8h"

# Maximum number of concurrent sessions per user
max_concurrent_sessions = 1
```
Logging
Configure application logging:
```toml
[logging]
# Log level: "debug", "info", "warn", "error"
level = "info"
```
AI SQL Generation
Configure settings for AI-powered SQL generation using OpenAI-compatible APIs.
```toml
[ai]
# Enable or disable AI features (default: false)
enabled = true

# --- API Endpoint Configuration ---
# Optional: Base URL for OpenAI-compatible endpoints
# Leave empty for the standard OpenAI API
# Examples:
# - OpenRouter: "https://openrouter.ai/api/v1"
# - Azure OpenAI: "https://your-resource.openai.azure.com/"
# - Custom proxy: "https://your-proxy.com/v1"
base_url = ""

# OpenAI API key (required if AI features are enabled)
# Can also be set via the LOGCHEF_AI__API_KEY environment variable
api_key = "sk-your_api_key_here"

# --- Model Parameters ---
# Model to use for SQL generation (default: "gpt-4o")
# Popular options: "gpt-4o", "gpt-4o-mini", "gpt-3.5-turbo"
# For OpenRouter: model names like "openai/gpt-4o", "anthropic/claude-3-sonnet"
model = "gpt-4o"

# Maximum number of tokens to generate (default: 1024)
max_tokens = 1024

# Temperature for generation (0.0-1.0; lower is more deterministic; default: 0.1)
temperature = 0.1
```
Supported Providers
The AI integration works with any OpenAI-compatible API:
- OpenAI: Leave `base_url` empty (default)
- OpenRouter: Set `base_url = "https://openrouter.ai/api/v1"`
- Azure OpenAI: Configure your Azure endpoint
- Local models: Point to your local OpenAI-compatible server
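For instance, an OpenRouter setup combines the options described above. The values below are the examples documented in this guide; substitute your own model and supply the real key via the environment rather than the file:

```toml
[ai]
enabled = true
# OpenRouter's OpenAI-compatible endpoint
base_url = "https://openrouter.ai/api/v1"
# OpenRouter uses provider-prefixed model names
model = "openai/gpt-4o"
# Placeholder only; prefer setting LOGCHEF_AI__API_KEY instead
api_key = "sk-your_api_key_here"
```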
API Token Requirements
Your API token needs appropriate permissions for the model you’re using. For OpenRouter, make sure your token has access to the specific model.
Security Considerations
- Store API keys in environment variables for production
- Use the least privileged API tokens possible
- Monitor API usage and costs
- Consider rate limiting for high-traffic deployments
Environment Variables
All configuration options set in the TOML file can be overridden or supplied via environment variables. This is particularly useful for sensitive information like API keys or for containerized deployments.
Environment variables are prefixed with `LOGCHEF_`. For nested keys in the TOML structure, use a double underscore (`__`) to represent the nesting.

Format: `LOGCHEF_SECTION__KEY=value`
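The mapping is mechanical, as this small shell snippet illustrates (a demonstration of the naming convention only, not part of LogChef): strip the prefix, lowercase, and turn each double underscore into a dot. Single underscores stay inside key names.

```shell
# LOGCHEF_OIDC__PROVIDER_URL -> [oidc] provider_url
name="LOGCHEF_OIDC__PROVIDER_URL"
# Remove the LOGCHEF_ prefix, lowercase, replace "__" with "."
path="$(printf '%s' "${name#LOGCHEF_}" | tr '[:upper:]' '[:lower:]' | sed 's/__/./g')"
echo "$path"   # prints "oidc.provider_url"
```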
Examples:

- Set server port:

  ```shell
  export LOGCHEF_SERVER__PORT=8125
  ```

- Set OIDC provider URL:

  ```shell
  export LOGCHEF_OIDC__PROVIDER_URL="http://dex.example.com/dex"
  ```

- Set admin emails (comma-separated for arrays):

  ```shell
  export LOGCHEF_AUTH__ADMIN_EMAILS="admin@example.com,ops@example.com"
  ```

- Set AI API key:

  ```shell
  export LOGCHEF_AI__API_KEY="sk-your_actual_api_key_here"
  ```

- Enable AI features and set the model:

  ```shell
  export LOGCHEF_AI__ENABLED=true
  export LOGCHEF_AI__MODEL="gpt-4o"
  ```
Environment variables take precedence over values defined in the TOML configuration file.
Production Configuration
For production deployments, ensure you:
- Set appropriate `host` and `port` values
- Configure a secure `client_secret` for OIDC
- Set the correct `redirect_url` matching your domain
- Configure admin emails for initial access
- Adjust session duration based on your security requirements
- Set logging level to `"info"` or `"warn"`
- If using AI features, ensure `LOGCHEF_AI__API_KEY` is set securely
Example Production Configuration
```toml
[server]
port = 8125
host = "0.0.0.0"
http_server_timeout = "30s"

[sqlite]
path = "/data/logchef.db"

[oidc]
provider_url = "https://dex.example.com"
client_id = "logchef"
client_secret = "your-secure-secret"
redirect_url = "https://logchef.example.com/api/v1/auth/callback"
scopes = ["openid", "email", "profile"]

[auth]
admin_emails = ["admin@example.com"]
session_duration = "8h"
max_concurrent_sessions = 1

[logging]
level = "info"

# AI features configuration (API key should be set via env var)
[ai]
enabled = true
# base_url = ""  # Leave empty for OpenAI, or set for other providers
# api_key = ""   # Use the LOGCHEF_AI__API_KEY environment variable
model = "gpt-4o"
max_tokens = 1024
temperature = 0.1
```