# Log Pipelines
Every log entry passes through a configurable pipeline before it is stored. Pipelines let you parse, enrich, and redact your data automatically on ingest.
When a log entry is received, it goes through two phases:

- Field normalization — Lightning Logs maps raw payload fields to standard columns (`level`, `msg`, `service`, etc.), resolves the log type, and normalizes trace/span IDs to UUID format.
- Pipeline steps — your configured steps run in order on the normalized entry. Each step can read and modify all fields. If a step throws an error, the failure is recorded in `attrs._pipeline_errors` and the next step continues — a single bad entry never blocks the rest of the batch.
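The per-entry error isolation described above can be sketched in Python. This is an illustrative model only — the step functions and the `run_pipeline` helper are hypothetical, not the service's actual internals:

```python
import datetime

def run_pipeline(entry, steps):
    """Run each configured step in order; record failures instead of raising.

    `entry` is a normalized log entry (a dict); `steps` is a list of
    (name, callable) pairs standing in for configured pipeline steps.
    """
    for name, step in steps:
        try:
            step(entry)  # steps mutate the entry in place
        except Exception as exc:
            # A failing step is recorded on the entry itself; the
            # remaining steps still run.
            entry.setdefault("attrs", {}).setdefault("_pipeline_errors", []).append({
                "step": name,
                "error": str(exc),
                "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            })
    return entry

def bad_step(entry):
    raise ValueError("invalid regex pattern")

entry = run_pipeline(
    {"msg": "hello", "attrs": {}},
    [("redact", bad_step), ("enrich", lambda e: e.setdefault("env", "production"))],
)
# The bad "redact" step is recorded in attrs._pipeline_errors,
# and the later "enrich" step still ran.
```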
Pipeline configuration is loaded from Settings → Pipeline with a 5-minute cache. Changes take effect within minutes without any redeployment.
All tenants start with this pipeline. You can modify or replace it at any time from Settings → Pipeline.
```json
{
  "steps": [
    { "type": "parse" },
    { "type": "enrich" },
    {
      "type": "redact",
      "config": {
        "patterns": ["email", "credit_card", "ssn"]
      }
    }
  ]
}
```

## Step Reference
### parse

Attempts to further decode and extract fields from the normalized entry.
- If `attrs` is a JSON string, it is decoded into an object.
- If `msg`, `level`, or `service` are missing, common aliases inside `attrs` are checked (`message`, `error`, `severity`, `app`).
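A rough Python model of the behavior above. The alias mapping comes from this page; the function itself and its field-handling details are an illustrative sketch, not the actual implementation:

```python
import json

# Aliases checked (per this page) when a standard field is missing.
ALIASES = {"msg": ["message", "error"], "level": ["severity"], "service": ["app"]}

def parse_step(entry):
    attrs = entry.get("attrs")
    # Decode attrs when it arrived as a JSON string.
    if isinstance(attrs, str):
        try:
            attrs = json.loads(attrs)
        except ValueError:
            attrs = {}
        entry["attrs"] = attrs
    # Fill missing standard fields from common aliases inside attrs.
    for field, aliases in ALIASES.items():
        if entry.get(field) is None and isinstance(attrs, dict):
            for alias in aliases:
                if alias in attrs:
                    entry[field] = attrs[alias]
                    break
    return entry

parsed = parse_step({"attrs": '{"message": "boot ok", "severity": "info"}'})
# parsed["msg"] == "boot ok", parsed["level"] == "info"
```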
#### Configuration
The parse step has no configuration options.
{ "type": "parse" }Fills in missing fields using static defaults you define.
- `defaults.env` — applied when the entry has no `env` field.
- `defaults.service` — applied when `service` is missing.
- `defaults.host` — applied when `host` is missing.
- `defaults.generate_req_id` — when `true`, a UUID is generated for entries that have no `req_id`.
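The defaults behavior above can be sketched like this (the `enrich_step` function is hypothetical; only the option names come from this page):

```python
import uuid

def enrich_step(entry, config):
    """Apply static defaults to an entry without overwriting existing values."""
    defaults = config.get("defaults", {})
    for field in ("env", "service", "host"):
        if field in defaults and entry.get(field) is None:
            entry[field] = defaults[field]
    # generate_req_id only fills in a UUID when req_id is absent.
    if defaults.get("generate_req_id") and entry.get("req_id") is None:
        entry["req_id"] = str(uuid.uuid4())
    return entry

enriched = enrich_step(
    {"msg": "ready", "service": "checkout"},
    {"defaults": {"env": "production", "service": "my-api", "generate_req_id": True}},
)
# enriched["env"] == "production"; the existing "service" value is untouched
```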
#### Configuration
```json
{
  "type": "enrich",
  "config": {
    "defaults": {
      "env": "production",
      "service": "my-api",
      "generate_req_id": true
    }
  }
}
```

### redact

Scans `msg` and all string values inside `attrs` and replaces matched patterns with a safe placeholder. Nested objects and arrays are walked recursively.
| Pattern name | Matches | Replaced with |
|---|---|---|
| `email` | Email addresses | `[email]` |
| `credit_card` | 13–19 digit card numbers | `[credit_card]` |
| `ssn` | US Social Security Numbers (XXX-XX-XXXX) | `[ssn]` |
| `phone` | US phone numbers (various formats) | `[phone]` |
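The recursive walk can be approximated in Python as follows. The email regex here is a simplified stand-in — the service's built-in patterns are not published:

```python
import re

# Simplified stand-in pattern; not the service's actual regex.
PATTERNS = {"email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")}

def redact(value, patterns=PATTERNS):
    """Replace matched patterns in strings, recursing into dicts and lists."""
    if isinstance(value, str):
        for name, rx in patterns.items():
            value = rx.sub(f"[{name}]", value)
        return value
    if isinstance(value, dict):
        return {k: redact(v, patterns) for k, v in value.items()}
    if isinstance(value, list):
        return [redact(v, patterns) for v in value]
    return value  # numbers, booleans, None pass through untouched

out = redact({"msg": "user bob@example.com logged in", "ids": [1, "a@b.co"]})
# out["msg"] == "user [email] logged in"
```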
#### Configuration
```json
{
  "type": "redact",
  "config": {
    "patterns": ["email", "credit_card", "ssn", "phone"],
    "custom": [
      {
        "pattern": "sk-[a-zA-Z0-9]{32,}",
        "replacement": "[api_key]"
      }
    ]
  }
}
```

## Minimal — parse only
Useful when your SDK already sends well-structured payloads and you want no overhead.
```json
{
  "steps": [
    { "type": "parse" }
  ]
}
```

## Strict PII redaction
All built-in patterns plus a custom rule for internal API keys.
```json
{
  "steps": [
    { "type": "parse" },
    { "type": "enrich", "config": { "defaults": { "env": "production" } } },
    {
      "type": "redact",
      "config": {
        "patterns": ["email", "credit_card", "ssn", "phone"],
        "custom": [
          { "pattern": "Bearer\\s+[\\w\\-\\.]+", "replacement": "Bearer [token]" },
          { "pattern": "sk-[a-zA-Z0-9]{32,}", "replacement": "[api_key]" }
        ]
      }
    }
  ]
}
```

## Disabled pipeline
Pass an empty `steps` array to skip all pipeline processing (field normalization still runs).
{ "steps": [] }If a pipeline step throws an error for a specific log entry, the error is recorded inside that entry's attrs._pipeline_errors field and processing continues with the next step. Other entries in the same batch are unaffected.
You can query for affected entries using the DSL:
```
attrs._pipeline_errors IS NOT NULL
```

Each error object contains:

```json
{
  "step": "redact",
  "error": "invalid regex pattern",
  "timestamp": "2026-03-19T12:00:00.000Z"
}
```

## Updating Your Pipeline

- Go to Settings → Pipeline in the Lightning Logs dashboard.
- Edit the JSON configuration in the editor. The editor validates JSON syntax inline.
- Click Save Configuration. The change is applied to new log entries within 5 minutes (cache TTL).
Pipeline configuration is per-tenant. Changes only affect your account's log entries and do not impact other tenants.