RepoPilot

tstack/lnav

Log file navigator

Healthy

Healthy across all four use cases

Use as dependency: Healthy

Permissive license, no critical CVEs, actively maintained — safe to depend on.

Fork & modify: Healthy

Has a license, tests, and CI — clean foundation to fork and modify.

Learn from: Healthy

Documented and popular — useful reference codebase to read through.

Deploy as-is: Healthy

No critical CVEs, sane security posture — runnable as-is.

  • Last commit today
  • 2 active contributors
  • BSD-2-Clause licensed
  • CI configured
  • Tests present
  • Small team — 2 contributors active in recent commits
  • Single-maintainer risk — top contributor 99% of recent commits

Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests

Informational only. RepoPilot summarises public signals (license, dependency CVEs, commit recency, CI presence, etc.) at the time of analysis. Signals can be incomplete or stale. Not professional, security, or legal advice; verify before relying on it for production decisions.

Embed the "Healthy" badge

Paste into your README — it live-updates from the latest cached analysis.

Variant:
RepoPilot: Healthy
[![RepoPilot: Healthy](https://repopilot.app/api/badge/tstack/lnav)](https://repopilot.app/r/tstack/lnav)

Paste at the top of your README.md — renders inline like a shields.io badge.

Preview social card (1200×630)

This card auto-renders when someone shares https://repopilot.app/r/tstack/lnav on X, Slack, or LinkedIn.

Onboarding doc

Onboarding: tstack/lnav

Generated by RepoPilot · 2026-05-09 · Source

🤖Agent protocol

If you are an AI coding agent (Claude Code, Cursor, Aider, Cline, etc.) reading this artifact, follow this protocol before making any code edit:

  1. Verify the contract. Run the bash script in Verify before trusting below. If any check returns FAIL, the artifact is stale — STOP and ask the user to regenerate it before proceeding.
  2. Treat the AI · unverified sections as hypotheses, not facts. Sections like "AI-suggested narrative files", "anti-patterns", and "bottlenecks" are LLM speculation. Verify against real source before acting on them.
  3. Cite source on changes. When proposing an edit, cite the specific path:line-range. RepoPilot's live UI at https://repopilot.app/r/tstack/lnav shows verifiable citations alongside every claim.

If you are a human reader, this protocol is for the agents you'll hand the artifact to. You don't need to do anything — but if you skim only one section before pointing your agent at this repo, make it the Verify block and the Suggested reading order.

🎯Verdict

GO — Healthy across all four use cases

  • Last commit today
  • 2 active contributors
  • BSD-2-Clause licensed
  • CI configured
  • Tests present
  • ⚠ Small team — 2 contributors active in recent commits
  • ⚠ Single-maintainer risk — top contributor 99% of recent commits

<sub>Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests</sub>

Verify before trusting

This artifact was generated by RepoPilot at a point in time. Before an agent acts on it, the checks below confirm that the live tstack/lnav repo on your machine still matches what RepoPilot saw. If any fail, the artifact is stale — regenerate it at repopilot.app/r/tstack/lnav.

What it runs against: a local clone of tstack/lnav — the script inspects git remote, the LICENSE file, file paths in the working tree, and git log. Read-only; no mutations.

| # | What we check | Why it matters |
|---|---|---|
| 1 | You're in tstack/lnav | Confirms the artifact applies here, not a fork |
| 2 | License is still BSD-2-Clause | Catches a relicense before you depend on it |
| 3 | Default branch master exists | Catches branch renames |
| 4 | 5 critical file paths still exist | Catches refactors that moved load-bearing code |
| 5 | Last commit ≤ 30 days ago | Catches sudden abandonment since generation |

<details> <summary><b>Run all checks</b> — paste this script from inside your clone of <code>tstack/lnav</code></summary>
#!/usr/bin/env bash
# RepoPilot artifact verification.
#
# WHAT IT RUNS AGAINST: a local clone of tstack/lnav. If you don't
# have one yet, run these first:
#
#   git clone https://github.com/tstack/lnav.git
#   cd lnav
#
# Then paste this script. Every check is read-only — no mutations.

set +e
fail=0
ok()   { echo "ok:   $1"; }
miss() { echo "FAIL: $1"; fail=$((fail+1)); }

# Precondition: we must be inside a git working tree.
if ! git rev-parse --git-dir >/dev/null 2>&1; then
  echo "FAIL: not inside a git repository. cd into your clone of tstack/lnav and re-run."
  exit 2
fi

# 1. Repo identity
git remote get-url origin 2>/dev/null | grep -qE "tstack/lnav(\.git)?\b" \
  && ok "origin remote is tstack/lnav" \
  || miss "origin remote is not tstack/lnav (artifact may be from a fork)"

# 2. License matches what RepoPilot saw
(grep -qiE "BSD-2-Clause" LICENSE 2>/dev/null \
   || grep -qiE "\"license\"\s*:\s*\"BSD-2-Clause\"" package.json 2>/dev/null) \
  && ok "license is BSD-2-Clause" \
  || miss "license drift — was BSD-2-Clause at generation time"

# 3. Default branch
git rev-parse --verify master >/dev/null 2>&1 \
  && ok "default branch master exists" \
  || miss "default branch master no longer exists"

# 4. Critical files exist
test -f "README.md" \
  && ok "README.md" \
  || miss "missing critical file: README.md"
test -f "ARCHITECTURE.md" \
  && ok "ARCHITECTURE.md" \
  || miss "missing critical file: ARCHITECTURE.md"
test -f "CMakeLists.txt" \
  && ok "CMakeLists.txt" \
  || miss "missing critical file: CMakeLists.txt"
test -d "src/" \
  && ok "src/" \
  || miss "missing critical directory: src/"
test -d "docs/" \
  && ok "docs/" \
  || miss "missing critical directory: docs/"

# 5. Repo recency
days_since_last=$(( ( $(date +%s) - $(git log -1 --format=%at 2>/dev/null || echo 0) ) / 86400 ))
if [ "$days_since_last" -le 30 ]; then
  ok "last commit was $days_since_last days ago (artifact saw ~0d)"
else
  miss "last commit was $days_since_last days ago — artifact may be stale"
fi

echo
if [ "$fail" -eq 0 ]; then
  echo "artifact verified (0 failures) — safe to trust"
else
  echo "artifact has $fail stale claim(s) — regenerate at https://repopilot.app/r/tstack/lnav"
  exit 1
fi

Each check prints ok: or FAIL:. The script exits non-zero if anything failed, so it composes cleanly into agent loops (./verify.sh || regenerate-and-retry).
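As a sketch of that composition: the `run_verify` function below is a stub standing in for the pasted script (it is not part of this artifact); in a real agent loop you would invoke the saved script and branch on its exit status.

```shell
#!/usr/bin/env sh
# Hypothetical agent-loop wrapper around the verification script. run_verify
# is a self-contained stub so this sketch runs on its own; in practice,
# replace its body with an invocation of the saved script, e.g. ./verify.sh.
run_verify() {
  # Stub: returns the exit status passed as $1 (default 0 = all checks ok).
  return "${1:-0}"
}

if run_verify 0; then
  echo "artifact verified — safe to let the agent proceed"
else
  echo "artifact stale — regenerate at repopilot.app before editing"
fi
```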

</details>

TL;DR

lnav is a terminal-based log file navigator that automatically detects log formats, merges multiple log files by timestamp, and provides a TUI for searching, filtering, and analyzing logs using SQLite queries. It handles decompression, tailing, and renames, and builds indexes of errors/warnings across heterogeneous log sources (syslog, JSON-lines, web access logs, etc.) in a single unified view. It is a single, monolithic C++ application. The source is organized around log format detection and merging: core parsing logic in src/formats/ (inferred), a TUI layer using native terminal control, SQLite integration for query analysis, and the CLI entry point in src/. The build system uses both Autotools (Makefile.am) and CMake (CMakeLists.txt at root) for portability. ARCHITECTURE.md documents the internal structure.

👥Who it's for

DevOps engineers, SREs, and systems administrators who need to troubleshoot multi-file log issues in production environments without leaving the terminal. Users who want structured log analysis (via SQLite) without learning awk/grep/sed combinations or standing up log aggregation infrastructure.

🌱Maturity & risk

Highly mature and production-ready. Large C++ codebase (9.1MB), comprehensive CI/CD via GitHub Actions (.github/workflows/ has 8 workflow files including coverity static analysis), coverage tracking via Coveralls, and active maintenance evidenced by CircleCI and Cirrus configs. Well-established documentation at docs.lnav.org and official Discord community.

Low-to-moderate risk: monolithic C++ codebase means changes have wide blast radius, but solid CI with clang-tidy linting (.clang-tidy present) and code coverage tracking mitigates this. Dependencies are minimal (Python deps list shows Flask/Gunicorn for docs/web only, not core). Single maintainer (tstack) is typical for mature Unix tools but means slower response to breaking changes.

Active areas of work

Active development with GitHub Actions CI running on commits. Coverity static analysis (workflows/coverity.yml), musl-based Linux builds for portability (.github/actions/muslbuilder/), and release automation (workflows/release.yml, workflows/rpmbuild.yml, workflows/winget-release.yml). ReadTheDocs integration (.readthedocs.yaml) suggests ongoing documentation updates.

🚀Get running

git clone https://github.com/tstack/lnav.git
cd lnav
./autogen.sh
./configure
make
./src/lnav /var/log/syslog

Or via CMake: cmake -B build && cmake --build build && ./build/src/lnav. Requires C++14 compiler and ncurses development headers.

Daily commands:

  • Dev build: ./autogen.sh && ./configure && make && ./src/lnav
  • Interactive testing: ./src/lnav test_logs/ (assuming test logs exist)
  • Release: make install after configure
  • CMake alternative: cmake -B build -DCMAKE_BUILD_TYPE=Debug && cmake --build build && ./build/src/lnav

🗺️Map of the codebase

  • README.md — Defines lnav's core purpose and features—every contributor must understand the log navigation paradigm
  • ARCHITECTURE.md — Documents the system design, component responsibilities, and data flow; essential for understanding codebase organization
  • CMakeLists.txt — Primary build configuration for the C++ codebase; all contributors need to understand how lnav is compiled and linked
  • src/ — Main source directory containing the log parsing, viewing, and navigation logic; foundation of all features
  • docs/ — Documentation and design rationale; critical for understanding user-facing behavior and feature requirements
  • .github/workflows/c-cpp.yml — CI/CD pipeline defining test and build requirements; essential for PR validation and release processes
  • configure.ac — Autoconf configuration for platform detection and dependency resolution; affects portability across systems

🧩Components & responsibilities

  • Log Format Detection & Parsing (C++, regex, state machines) — Identifies log format from samples, extracts structured fields (timestamp, level, message), handles multi-line entries
    • Failure mode: Incorrect format detection falls back to raw text; malformed entries skipped with warnings
  • Log Indexing & Merging (C++, in-memory data structures, merge algorithms) — Builds timestamp-ordered index across multiple log files, maintains sorted view during live tail
    • Failure mode: OOM on very large logs; index corruption triggers rebuild from scratch
  • View Controller & Filtering (C++, SQL queries, render caching) — Manages visible row range, applies regex/SQL filters, highlights matches, handles user navigation
    • Failure mode: Filter syntax errors halt filtering; invalid regexes show error message; slow filters timeout after N seconds
  • Terminal Rendering (ncurses, ANSI color) — Renders lines to terminal with colors, themes, scrollbar, status bar; handles window resizing

🛠️How to make changes

Add Support for a New Log Format

  1. Create a new log format class inheriting from log_format interface (src/log_format.h)
  2. Implement format-specific regex patterns and line parsing logic (src/)
  3. Register your format in the format registry so lnav auto-detects it (src/lnav.cc)
  4. Add test cases for your format to validate regex patterns and timestamp extraction (tests/)

Add a New SQL Custom Function

  1. Define the function signature and behavior in the SQL functions registry (src/sql_functions.cc)
  2. Implement the function logic using SQLite's custom function API (src/sql_util.h)
  3. Document the function in the docs (e.g., xpath_extract, json_extract patterns) (docs/)
  4. Add test cases to verify function behavior across log formats (tests/)

Add a New Filtering or Search Feature

  1. Define the filter type and update the view controller state machine (src/logfile_view.h)
  2. Implement filtering logic (regex, SQL, or text matching) (src/)
  3. Wire the filter into the UI event loop and keybindings (src/lnav.cc)
  4. Update the status line and visual feedback to show active filters (src/view_curses.h)

🔧Why these technologies

  • C++ — High performance log parsing and indexing; efficient memory usage for large log files; portability across Unix/Linux platforms
  • ncurses/Curses — Terminal UI control and cross-platform terminal rendering; color themes; minimal dependencies
  • SQLite — In-process structured query engine; allows SQL queries over log data without external DB dependency; custom function extensibility
  • CMake — Modern build system with cross-platform support; dependency management; integration with CI/CD
  • Python (docs/demo) — Documentation generation, demo log generation, and Flask-based web playground for remote file access
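The SQLite point can be illustrated in miniature with the stock sqlite3 CLI: line-oriented data loaded into an in-process table and queried with plain SQL. This is not lnav's schema — lnav exposes logs through its own virtual tables and column names — it only shows the shape of the query model.

```shell
# Miniature of the "SQL over log data, in-process, no external DB" idea.
# NOT lnav's actual schema; the file path and table name are invented here.
printf 'ts,level,msg\n1,ERROR,disk full\n2,INFO,ok\n3,ERROR,timeout\n' > /tmp/repopilot_demo.csv
sqlite3 :memory: <<'EOF'
.mode csv
.import /tmp/repopilot_demo.csv logs
SELECT level, COUNT(*) AS n FROM logs GROUP BY level ORDER BY level;
EOF
```

With `.mode csv`, `.import` creates the table using the first row as column names, so the query aggregates the three data rows by level.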

⚖️Trade-offs already made

  • In-memory indexing vs. on-disk index

    • Why: Fast interactive navigation without persistent state overhead during typical usage sessions
    • Consequence: Cannot handle extremely large log files (>available RAM); index is rebuilt on restart; no cross-session caching
  • Terminal UI (ncurses) vs. web/GUI

    • Why: Lightweight, zero-dependency tool suitable for remote SSH sessions and CI/CD pipelines
    • Consequence: Limited to terminal capabilities; no rich graphical visualization; requires keybinding learning curve
  • Auto-format detection via regex vs. ML/heuristics

    • Why: Deterministic, fast, zero-dependency approach; leverages human-curated format patterns
    • Consequence: Format detection limited to known patterns; new formats require manual registration; no learning from data
  • Stateless CLI tool vs. daemon with persistent cache

    • Why: Simplicity and no background process overhead; no configuration complexity; immutable log data assumption
    • Consequence: Cold start overhead on large files; no index sharing across invocations; tail mode requires polling
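The last consequence (tail mode requires polling) can be sketched in a few lines of shell: a toy check-size-and-read loop, not lnav's actual tailer, which also copes with renames, rotation, and compressed files.

```shell
# Toy poll-based tailer illustrating the trade-off above: each cycle, compare
# the file size to the last-seen offset and emit only the new bytes.
f=$(mktemp)
offset=0
echo "first line" >> "$f"
for cycle in 1 2; do
  size=$(wc -c < "$f")
  if [ "$size" -gt "$offset" ]; then
    tail -c +"$((offset + 1))" "$f"   # tail -c +N is 1-indexed
    offset=$size
  fi
  echo "line from cycle $cycle" >> "$f"   # simulate a writer appending
done
rm -f "$f"
```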

🚫Non-goals (don't propose these)

  • Does not provide authentication, authorization, or multi-user access control
  • Not a real-time log aggregation or streaming platform (designed for static/existing files with tail support)
  • Does not handle extremely large files >available RAM without significant performance degradation
  • Not cross-platform GUI; terminal-only interface (though web playground exists via Flask demo)
  • Does not provide backup, log rotation, or data lifecycle management
  • Not a replacement for centralized log systems (ELK, Splunk, etc.) for enterprise use

🪤Traps & gotchas

  1. Dual build system (Autotools + CMake): CMake is newer, but both are maintained. Use CMake for development.
  2. Tests: Look for test_*.cpp files in src/ or the tests/ directory; the test framework is inferred from CI config and not obvious from the file list.
  3. Terminal capability detection: lnav queries terminfo at runtime; run it on a real terminal or set the TERM env var correctly.
  4. Large file handling: Memory-mapped I/O is likely used but not visible in the file list; be careful with buffer sizes in format detection.
  5. Signal handling: Multi-file tailing requires careful SIGCHLD/SIGHUP handling; modify the main event loop cautiously.

🏗️Architecture

💡Concepts to learn

  • BurntSushi/ripgrep — Similar Unix tool philosophy but regex-focused; lnav users often combine with ripgrep for pipeline workflows
  • junegunn/fzf — Complementary TUI for fuzzy filtering; lnav could pipe to fzf for additional filtering stages
  • elastic/logstash — Industry alternative for log parsing/merging at scale; lnav is lightweight single-machine version for DevOps workflows
  • tstack/sqlite-compressions — Companion project by same author (tstack); provides SQLite compression extensions used in lnav for efficient indexing
  • PromyLOVE/loveindustries — Not a direct repo match, but part of an ecosystem of log-centric DevOps tools; lnav is often used alongside Prometheus/Grafana for drilling into raw logs after a metrics alert

🪄PR ideas

To work on one of these in Claude Code or Cursor, paste: Implement the "<title>" PR idea from CLAUDE.md, working through the checklist as the task list.

Add integration tests for crashd Flask application

The crashd/ directory contains a Flask-based crash reporting service with dependencies like Flask, gunicorn, and Jinja2, but there's no visible test suite. Given the service's role in handling crash data, adding integration tests would ensure reliability and catch regressions when dependencies are updated.

  • [ ] Create tests/crashd/ directory with test_app.py for Flask route testing
  • [ ] Add fixtures for test crash data in tests/crashd/fixtures/
  • [ ] Write tests for crashd/app.py endpoints (crash submission, retrieval, listing)
  • [ ] Add pytest configuration to CMakeLists.txt or create tests/crashd/CMakeLists.txt
  • [ ] Document testing setup in crashd/README.md

Create GitHub Actions workflow for linting Python and Markdown files

The repo has Python code in crashd/ and demo/, plus extensive Markdown documentation, but the .github/workflows/ directory shows only C++ focused workflows (c-cpp.yml, coverity.yml). A dedicated Python/Markdown linting workflow would catch style issues early. The .codespellrc and .clang-tidy files show linting is valued; Python/Markdown equivalents are missing.

  • [ ] Create .github/workflows/python-lint.yml with steps for flake8/black on crashd/ and demo/
  • [ ] Add Markdown linting to .github/workflows/check-md-links.yml or create new workflow using markdownlint
  • [ ] Create setup.cfg or pyproject.toml with flake8/black configuration for crashd/
  • [ ] Add requirements-dev.txt or similar for dev dependencies (flake8, black, markdownlint-cli)
  • [ ] Document Python style guidelines in CONTRIBUTING.md or similar
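A hedged sketch of what the first checklist item could look like; the filename follows the checklist, but the action versions, Python version, and tool invocations are assumptions, not repo conventions.

```yaml
# Hypothetical .github/workflows/python-lint.yml for the PR idea above.
# Adjust tool choices and paths to match the repo's actual conventions.
name: python-lint
on: [push, pull_request]
jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install flake8 black
      - run: flake8 crashd/ demo/
      - run: black --check crashd/ demo/
```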

Add unit tests for log format parsing in src/ directory

The lnav codebase is a log file navigator with complex parsing logic, but the file structure shows extensive cmake/CI configuration without visible unit test organization. Given the critical nature of log parsing and the existing CMakeLists.txt structure, adding a formal test suite for format detection and parsing would improve maintainability.

  • [ ] Create tests/unit/ directory with CMakeLists.txt for unit test configuration
  • [ ] Add tests/unit/test_log_formats.cpp for format parsing validation
  • [ ] Add tests/unit/test_log_parsing.cpp for record extraction from sample logs
  • [ ] Create tests/fixtures/sample_logs/ with representative log files from different systems
  • [ ] Integrate unit tests into main CMakeLists.txt and CI workflows (c-cpp.yml)

🌿Good first issues

  1. Add format detection test coverage: Look in .github/workflows/c-cpp.yml to see what test framework is used, then add unit tests for each log format in src/formats/ (currently underdocumented for new formats).
  2. Improve Windows support: .github/workflows/winget-release.yml exists, but the README shows Unix-centric examples; add Windows-specific examples to docs/README.md and test on .github/actions/.
  3. Document SQLite extension functions: src/sql_ext.cpp likely has custom functions (implied by the ; command for analysis) that aren't in the docs; audit the functions there and add entries to the official docs in the docs/ directory.

Top contributors


📝Recent commits

  • 3e9e2df — [timeline] create ranges out of tags (tstack)
  • 2c59660 — [format] tags defined in formats can capture (tstack)
  • ebca259 — [log] only return anchor for actual partition line (tstack)
  • 46a885b — [indexing] be less aggressive when hiding duplicate files (tstack)
  • 35e2f95 — [tests] add timeline test with uwsgi log (tstack)
  • 9aaaba5 — [uwsgi_log] set duration-field (tstack)
  • 6cba7b5 — [timeline] merge partitions (tstack)
  • de3cbc8 — [views] move zoom stuff to text_time_translator (tstack)
  • 63128cf — [docs] mention semantic() color (tstack)
  • 762cace — [textinput] add hotkeys to increase/decrease the multi-line prompt size (tstack)

🔒Security observations

The lnav project demonstrates a moderate security posture with some notable concerns. Critical issues include potential exposure of secrets in the .secrets file and the need for comprehensive dependency vulnerability scanning. The crashd Flask application lacks visible security hardening measures like security headers and input validation frameworks. Most dependencies appear reasonably up-to-date, but regular monitoring is essential. The use of multiple external integrations introduces supply chain risks. Recommendations include: (1) immediate verification of .secrets file handling, (2) implementation of automated dependency scanning, and (3) hardening of the crashd service (security headers, input validation).

  • High · Outdated Flask Dependency with Known Vulnerabilities — crashd/requirements.txt. Flask 3.1.3 is used in the crashd application. While relatively recent, Flask versions in the 3.x series may have known security issues. The dependency should be regularly audited against CVE databases. Fix: Regularly update Flask to the latest stable version and monitor security advisories. Consider using a tool like 'safety' or 'pip-audit' to check for known vulnerabilities.
  • High · Outdated Werkzeug Dependency — crashd/requirements.txt. Werkzeug 3.1.6 is specified as a dependency. While it is a recent version, ensure it's compatible with Flask and regularly checked for security updates. Fix: Verify compatibility between Flask and Werkzeug versions. Use pip-audit or safety to check for known CVEs. Pin to specific patch versions and update regularly.
  • Medium · Potential Secrets in Repository Root — .secrets. A file named '.secrets' exists in the repository root, which may contain sensitive credentials or API keys that could be accidentally committed to version control. Fix: Ensure .secrets is in .gitignore and never committed to the repository. Use environment variables or a secrets management system instead. Rotate any credentials that may have been exposed.
  • Medium · Docker Images with Unspecified Base Versions — crashd/Dockerfile, demo/Dockerfile, .github/actions/muslbuilder/Dockerfile. Multiple Dockerfiles are present (crashd/Dockerfile, demo/Dockerfile, .github/actions/muslbuilder/Dockerfile) but the content is not visible. Dockerfiles should use pinned base image versions to avoid unexpected security updates. Fix: Pin Dockerfile base images to specific versions (e.g., 'python:3.11-alpine' instead of 'python:latest'). Regularly scan images with tools like Trivy or Grype.
  • Medium · Multiple External Web Services Integrated — README.md, docs files. The project integrates with multiple external services (ReadTheDocs, Coveralls, Snapcraft, Gurubase, Discord) as shown in README badges. These integrations introduce supply chain risks. Fix: Regularly audit integrations and their permissions. Use service-specific security features (e.g., limited tokens, IP restrictions). Monitor for service compromises.
  • Low · Gunicorn in Production without Explicit Configuration — crashd/requirements.txt. gunicorn 23.0.0 is listed as a dependency but no gunicorn configuration file is visible in the provided file structure. This may indicate default settings are used. Fix: Create an explicit gunicorn configuration file specifying secure defaults (worker processes, timeouts, access logs, etc.). Consider using a reverse proxy (nginx) in front of gunicorn.
  • Low · No Evidence of Security Headers Configuration — crashd/app.py. For the Flask/web components in crashd, there's no visible configuration for security headers (CSP, X-Frame-Options, HSTS, etc.). Fix: Implement Flask extensions like flask-talisman or manually add security headers to all responses. Configure Content-Security-Policy, X-Frame-Options, X-Content-Type-Options, etc.
  • Low · No Evidence of Input Validation Framework — crashd/app.py. The Flask application in crashd shows no visible use of input validation libraries (e.g., marshmallow, pydantic) for form/API input validation. Fix: Implement input validation using established libraries like marshmallow or pydantic. Validate and sanitize all user inputs before processing.

LLM-derived; treat as a starting point, not a security audit.


Generated by RepoPilot. Verdict based on maintenance signals — see the live page for receipts. Re-run on a new commit to refresh.
