v0.1.0 · stable · 0 deps

One log shape.
Four runtimes. Zero dependencies.

OpenInfra Logger is structured telemetry for Node, Python, Go and Rust — built from each runtime's standard library. Native batching, auto-redaction, and an offline AI analyzer for root-cause diagnosis.

Read the docs →
ships native for ›
Node.js
npm i @jonathascordeiro20/openinfra-logger
Python
from openinfra_logger import log
Go
import openinfralogger "…/openinfra-logger/go"
Rust
use openinfra_logger::Logger;
enterprise capabilities

Three pillars. One library.

01
auto-redaction

Data Shield

Intercepts every event before it touches the wire. Passwords, API keys, JWTs, credit cards — replaced with [REDACTED] in the buffer. LGPD- and GDPR-compliant out of the box, no rules to write.

5 patterns
default · fully extensible
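The idea behind Shield-style redaction can be sketched in a few lines: match sensitive keys and value shapes, replace hits before the event leaves the process. This is an illustrative sketch, not OIL's source — the key list and regexes here are assumptions standing in for the library's five default patterns.

```javascript
// Illustrative auto-redaction sketch (not the library's actual code).
// Two signals: a suspicious key name, or a suspicious value shape.
const SENSITIVE_KEYS = /^(password|pass|secret|token|api[_-]?key|card)$/i;
const SENSITIVE_VALUES = [
  /\b\d{13,16}\b/,                                           // card-like digit runs
  /\beyJ[A-Za-z0-9_-]+\.[A-Za-z0-9_-]+\.[A-Za-z0-9_-]+\b/,   // JWT-shaped tokens
];

function redact(metadata) {
  const out = {};
  for (const [key, value] of Object.entries(metadata)) {
    const hit =
      SENSITIVE_KEYS.test(key) ||
      SENSITIVE_VALUES.some((re) => re.test(String(value)));
    out[key] = hit ? '[REDACTED]' : value;
  }
  return out;
}

console.log(redact({ userId: 'u_8821', card: '4111111111111111' }));
// → { userId: 'u_8821', card: '[REDACTED]' }
```

Because the replacement happens in the buffer, nothing downstream — transport, formatter, or backend — ever sees the raw value.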
02
native buffer · 2s flush

Remote Batching

Instead of HTTP-per-log, OIL holds events in a runtime-native buffer and ships them in batches of up to 100 every two seconds. No third-party transport. No backpressure surprises.

~100×
fewer egress requests vs. log-per-call
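The batching behavior described above — buffer in memory, flush up to 100 events every two seconds or sooner when the buffer fills — can be sketched like this. The class name and API here are hypothetical; only the batch size and flush interval come from the copy above.

```javascript
// Sketch of remote batching: one send() call per lot instead of one per event.
class Batcher {
  constructor(send, { maxBatch = 100, flushMs = 2000 } = {}) {
    this.send = send;          // (events) => void, e.g. a single HTTP POST
    this.maxBatch = maxBatch;
    this.buffer = [];
    this.timer = setInterval(() => this.flush(), flushMs);
  }
  push(event) {
    this.buffer.push(event);
    if (this.buffer.length >= this.maxBatch) this.flush(); // early flush when full
  }
  flush() {
    if (this.buffer.length === 0) return;
    const lot = this.buffer.splice(0, this.maxBatch);
    this.send(lot);            // one request carries up to 100 events
  }
  stop() { clearInterval(this.timer); this.flush(); }      // drain on shutdown
}
```

250 events through this sketch produce three requests (100 + 100 + 50) instead of 250 — the ~100× egress reduction the pillar claims.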
03
tools/ai-analyzer

AI Root-Cause

A CLI that reads your local error files and asks Claude to diagnose. Stack traces in, structured remediation out — your code never leaves your machine until you choose to send the prompt.

claude-haiku
or any Anthropic model
universal api

The same call. Every runtime.

OIL formats and ships JSON identically regardless of language. Drop it into a polyglot stack — Datadog won't be able to tell which service wrote which event.

shield active
const { log } = require('@jonathascordeiro20/openinfra-logger');

log('order.placed', 'info', {
  userId: user.id,
  card:   user.card  // → [REDACTED]
});
from openinfra_logger import log

log("order.placed", "info", {
    "user_id": user.id,
    "card":    user.card  # → [REDACTED]
})
import openinfralogger "github.com/jonathascordeiro20/openinfra-logger/go"

openinfralogger.Log("order.placed", "info", map[string]interface{}{
  "userId": user.ID,
  "card":   user.Card  // → [REDACTED]
})
use std::collections::HashMap;
use openinfra_logger::{Logger, Config};

let logger = Logger::new(Config::default());
let mut md = HashMap::new();
md.insert("userId".to_string(), user.id.clone());
md.insert("card".to_string(), user.card.clone());  // → [REDACTED]
logger.log("order.placed", "info", md);
output · datadog json
{
  "timestamp": "2026-05-15T14:08:22.041Z",
  "status": "info",
  "message": "order.placed",
  "service": "checkout-api",
  "runtime": "node.js",
  "dd.trace_id": "a4f1c9…d3",
  "userId": "u_8821",
  "card": "[REDACTED]"
}
native integrations

No extra packages.
Just set the format.

Switch destinations with a one-line config. OIL formats the wire payload natively for every major observability backend — no transformers, no exporters, no second sidecar.

Datadog

status, dd.trace_id, dd.span_id

Elastic / ECS

@timestamp, log.level

OpenTelemetry

trace_id + span_id auto-link

Grafana Loki

JSON over HTTP push

Plain JSON

stdout / file / remote

Custom transport

configurable remoteUrl + headers
configure({ "formatter": "datadog" | "elastic" | "default" })
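What a formatter switch actually does can be sketched as a re-keying of one base event per backend. The mapping below is illustrative, not OIL's source — it only uses the field names from the table above (Datadog's status, ECS's @timestamp and log.level), and the event shape is an assumption.

```javascript
// Illustrative formatter table: same event, backend-specific field names.
const formatters = {
  default: (e) => e,
  datadog: (e) => ({
    status: e.level,               // Datadog reads severity from "status"
    message: e.message,
    ...e.metadata,
  }),
  elastic: (e) => ({
    '@timestamp': e.timestamp,     // ECS field names
    'log.level': e.level,
    message: e.message,
    ...e.metadata,
  }),
};

function format(name, event) {
  return JSON.stringify(formatters[name](event));
}
```

Switching backends then really is a one-line change: pick a different key into the table, and the wire payload changes shape while every call site stays identical.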
tools/ai-analyzer

A senior engineer in your CI logs.

Run npm run analyze against any rolled error file. The CLI asks Claude to read the trace, identify the root cause, and propose a fix. Local files; explicit prompts; no telemetry phone-home.

~/checkout-api · npm run analyze
$ npm run analyze ./logs/error-2026-05-15.jsonl
📄 Analyzing log file: ./logs/error-2026-05-15.jsonl...
🔍 Found 12 anomalies/errors. Preparing AI analysis...
🧠 Connecting to Anthropic Claude API...

◆ Root Cause Analysis
  db.timeout cascade originating in checkout.repo.findOrder(),
  triggered when order_items.user_id index is missing on
  the read replica (last migration: 2026-04-22).

◆ Suggested fix
  1. add CREATE INDEX CONCURRENTLY on replica
  2. raise findOrder timeout from 800ms → 2000ms
  3. add trace_id to the retry budget key

→ wrote remediation to oil-analysis-a4f1.md
$ 
01

reads local files

The analyzer parses your rolled error logs without uploading them. Only the redacted, summarised prompt is sent to the model.

02

claude under the hood

Uses your own ANTHROPIC_API_KEY. Pick haiku for speed, sonnet or opus for depth — or pipe the prompt to claude.ai manually.

03

parseable output

The report is plain markdown / text — ready to attach to a PR, an incident doc, or a Slack thread.

privacy
The analyzer ignores any field already replaced with [REDACTED] by the Shield. PII never reaches the model.
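The privacy guarantee above amounts to a filter step before prompt assembly: any value the Shield already replaced is dropped, so it can never appear in what is sent to the model. A minimal sketch (the function name is hypothetical):

```javascript
// Drop fields the Shield already redacted before they reach the model prompt.
function stripRedacted(event) {
  return Object.fromEntries(
    Object.entries(event).filter(([, value]) => value !== '[REDACTED]')
  );
}

console.log(stripRedacted({ userId: 'u_8821', card: '[REDACTED]' }));
// → { userId: 'u_8821' }
```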
0
external dependencies
across every runtime, every release
4
runtimes supported
Node · Python · Go · Rust
69
tests passing
unit + integration across all stacks
MIT
free for any use
community-built, community-owned
get started

Stop transforming logs.
Start writing them.

One install, zero deps. Drop OIL into your Node, Python, Go, or Rust service and your telemetry is consistent on day one — across every team, every region, every backend.

Read the quickstart → Star on GitHub
MIT · v0.1.0 · semver-stable