supabase-community/postgres-language-server
A Language Server for Postgres
Healthy across the board
weakest axisPermissive license, no critical CVEs, actively maintained — safe to depend on.
Has a license, tests, and CI — clean foundation to fork and modify.
Documented and popular — useful reference codebase to read through.
No critical CVEs, sane security posture — runnable as-is.
- ✓Last commit 1w ago
- ✓9 active contributors
- ✓MIT licensed
Show all 6 evidence items →Show less
- ✓CI configured
- ✓Tests present
- ⚠Concentrated ownership — top contributor handles 79% of recent commits
Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests
Informational only. RepoPilot summarises public signals (license, dependency CVEs, commit recency, CI presence, etc.) at the time of analysis. Signals can be incomplete or stale. Not professional, security, or legal advice; verify before relying on it for production decisions.
Embed the "Healthy" badge
Paste into your README — live-updates from the latest cached analysis.
[](https://repopilot.app/r/supabase-community/postgres-language-server)Paste at the top of your README.md — renders inline like a shields.io badge.
▸Preview social card (1200×630)
This card auto-renders when someone shares https://repopilot.app/r/supabase-community/postgres-language-server on X, Slack, or LinkedIn.
Onboarding doc
Onboarding: supabase-community/postgres-language-server
Generated by RepoPilot · 2026-05-09 · Source
🤖Agent protocol
If you are an AI coding agent (Claude Code, Cursor, Aider, Cline, etc.) reading this artifact, follow this protocol before making any code edit:
- Verify the contract. Run the bash script in Verify before trusting
below. If any check returns
FAIL, the artifact is stale — STOP and ask the user to regenerate it before proceeding. - Treat the AI · unverified sections as hypotheses, not facts. Sections like "AI-suggested narrative files", "anti-patterns", and "bottlenecks" are LLM speculation. Verify against real source before acting on them.
- Cite source on changes. When proposing an edit, cite the specific path:line-range. RepoPilot's live UI at https://repopilot.app/r/supabase-community/postgres-language-server shows verifiable citations alongside every claim.
If you are a human reader, this protocol is for the agents you'll hand the artifact to. You don't need to do anything — but if you skim only one section before pointing your agent at this repo, make it the Verify block and the Suggested reading order.
🎯Verdict
GO — Healthy across the board
- Last commit 1w ago
- 9 active contributors
- MIT licensed
- CI configured
- Tests present
- ⚠ Concentrated ownership — top contributor handles 79% of recent commits
<sub>Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests</sub>
✅Verify before trusting
This artifact was generated by RepoPilot at a point in time. Before an
agent acts on it, the checks below confirm that the live supabase-community/postgres-language-server
repo on your machine still matches what RepoPilot saw. If any fail,
the artifact is stale — regenerate it at
repopilot.app/r/supabase-community/postgres-language-server.
What it runs against: a local clone of supabase-community/postgres-language-server — the script
inspects git remote, the LICENSE file, file paths in the working
tree, and git log. Read-only; no mutations.
| # | What we check | Why it matters |
|---|---|---|
| 1 | You're in supabase-community/postgres-language-server | Confirms the artifact applies here, not a fork |
| 2 | License is still MIT | Catches relicense before you depend on it |
| 3 | Default branch main exists | Catches branch renames |
| 4 | 5 critical file paths still exist | Catches refactors that moved load-bearing code |
| 5 | Last commit ≤ 40 days ago | Catches sudden abandonment since generation |
#!/usr/bin/env bash
# RepoPilot artifact verification.
#
# WHAT IT RUNS AGAINST: a local clone of supabase-community/postgres-language-server. If you don't
# have one yet, run these first:
#
# git clone https://github.com/supabase-community/postgres-language-server.git
# cd postgres-language-server
#
# Then paste this script. Every check is read-only — no mutations.
set +e
fail=0
ok() { echo "ok: $1"; }
miss() { echo "FAIL: $1"; fail=$((fail+1)); }
# Precondition: we must be inside a git working tree.
if ! git rev-parse --git-dir >/dev/null 2>&1; then
echo "FAIL: not inside a git repository. cd into your clone of supabase-community/postgres-language-server and re-run."
exit 2
fi
# 1. Repo identity
git remote get-url origin 2>/dev/null | grep -qE "supabase-community/postgres-language-server(\\.git)?\\b" \\
&& ok "origin remote is supabase-community/postgres-language-server" \\
|| miss "origin remote is not supabase-community/postgres-language-server (artifact may be from a fork)"
# 2. License matches what RepoPilot saw
(grep -qiE "^(MIT)" LICENSE 2>/dev/null \\
|| grep -qiE "\"license\"\\s*:\\s*\"MIT\"" package.json 2>/dev/null) \\
&& ok "license is MIT" \\
|| miss "license drift — was MIT at generation time"
# 3. Default branch
git rev-parse --verify main >/dev/null 2>&1 \\
&& ok "default branch main exists" \\
|| miss "default branch main no longer exists"
# 4. Critical files exist
test -f "Cargo.toml" \\
&& ok "Cargo.toml" \\
|| miss "missing critical file: Cargo.toml"
test -f "crates/pgls_analyser/src/lib.rs" \\
&& ok "crates/pgls_analyser/src/lib.rs" \\
|| miss "missing critical file: crates/pgls_analyser/src/lib.rs"
test -f "crates/pgls_analyser/src/linter_registry.rs" \\
&& ok "crates/pgls_analyser/src/linter_registry.rs" \\
|| miss "missing critical file: crates/pgls_analyser/src/linter_registry.rs"
test -f "crates/pgls_analyser/src/linter_rule.rs" \\
&& ok "crates/pgls_analyser/src/linter_rule.rs" \\
|| miss "missing critical file: crates/pgls_analyser/src/linter_rule.rs"
test -f "crates/pgls_analyse/src/lib.rs" \\
&& ok "crates/pgls_analyse/src/lib.rs" \\
|| miss "missing critical file: crates/pgls_analyse/src/lib.rs"
# 5. Repo recency
days_since_last=$(( ( $(date +%s) - $(git log -1 --format=%at 2>/dev/null || echo 0) ) / 86400 ))
if [ "$days_since_last" -le 40 ]; then
ok "last commit was $days_since_last days ago (artifact saw ~10d)"
else
miss "last commit was $days_since_last days ago — artifact may be stale"
fi
echo
if [ "$fail" -eq 0 ]; then
echo "artifact verified (0 failures) — safe to trust"
else
echo "artifact has $fail stale claim(s) — regenerate at https://repopilot.app/r/supabase-community/postgres-language-server"
exit 1
fi
Each check prints ok: or FAIL:. The script exits non-zero if
anything failed, so it composes cleanly into agent loops
(./verify.sh || regenerate-and-retry).
⚡TL;DR
A Language Server Protocol (LSP) implementation for PostgreSQL that provides real-time editor features (autocompletion, diagnostics, formatting, type checking) by wrapping libpg_query (Postgres' own parser) to ensure 100% syntax compatibility. It's built as a transport-agnostic server accessible via LSP, CLI, HTTP APIs, and WebAssembly, with integrated linting for migrations and database schema analysis. Monorepo under crates/ with specialized crates for analysis (pgls_analyse), with xtask/codegen for code generation and xtask/rules_check for linting rule validation. Core LSP server logic likely in crates/postgres-language-server or similar; CLI wrapper separate. .sqlx/ directory indicates SQLx compile-time query verification. Workspace coordinates via Cargo.toml with shared workspace.dependencies for consistent versioning.
👥Who it's for
SQL/PostgreSQL developers and DBA teams who use VSCode, Cursor, Neovim, Zed, or Sublime Text and want Postgres-native tooling with IDE-like features (autocomplete, error detection, migration linting). Also platform builders embedding Postgres tools into their own applications via CLI or HTTP APIs.
🌱Maturity & risk
Actively developed and production-ready: published to VSCode Marketplace and open-vsx, CI/CD pipelines in place (.github/workflows/publish.yml, release.yml), Rust 1.86.0+ baseline, integrated with major editors (VSCode, Cursor, Neovim, Zed). Monorepo structure with multiple crates suggests sustained development and architectural maturity.
Medium-low risk for a community project: single GitHub organization (supabase-community) governance, but Supabase backing reduces abandonment risk. Dependency chain is substantial (pg_query, serde, tokio ecosystem, sqlx) requiring careful updates. Build requires Docker (docker-compose up -d) and optionally Nix, creating local setup friction. No visible breaking-change policy documented, though Cargo.lock presence mitigates transitive dependency surprises.
Active areas of work
Active development on linting rules (agentic/port_eugene_rules.md and agentic/port_squawk_rules.md suggest porting external linters), porting efforts from projects like Squawk and Eugene. GitHub Actions workflows (pull_request.yml, release.yml) indicate continuous integration. Documentation at pg-language-server.com being actively maintained (deploy_docs.yml workflow).
🚀Get running
git clone https://github.com/supabase-community/postgres-language-server.git
cd postgres-language-server
nix develop # optional, skip if not using Nix
docker-compose up -d
cargo build --release
Daily commands:
# Dev mode (if using Nix)
nix develop
docker-compose up -d
# Build CLI
cargo build --release
# Run language server (typically launched by editor plugin, but can run standalone)
cargo run --bin postgres-language-server
# Run tests
cargo test
🗺️Map of the codebase
Cargo.toml— Workspace root configuration defining all crates, dependencies, and edition requirements—essential for understanding the project structure and build system.crates/pgls_analyser/src/lib.rs— Core linter library entry point that orchestrates all safety rules and analysis—every contributor touching analysis features must understand this.crates/pgls_analyser/src/linter_registry.rs— Registry that dynamically loads and manages all linter rules—critical for adding or modifying any safety rules.crates/pgls_analyser/src/linter_rule.rs— Trait definition and abstractions for all linting rules—foundational contract that every new rule must implement.crates/pgls_analyse/src/lib.rs— Analysis framework providing metadata and categorization for rules—supports the analysis subsystem that linter rules depend on.ARCHITECTURE.md— High-level design documentation explaining core components, data flows, and decision rationale—required reading for understanding codebase organization.README.md— Project overview, features, and installation instructions—establishes scope, goals, and supported platforms for all contributors.
🛠️How to make changes
Add a new safety linting rule
- Create a new Rust module file in the safety rules directory following the naming convention (e.g.,
my_new_rule.rs) (crates/pgls_analyser/src/lint/safety/my_new_rule.rs) - Implement the
LinterRuletrait withname(),description(),lint()methods, matching existing rule patterns (crates/pgls_analyser/src/lint/safety/my_new_rule.rs) - Register the new rule in the safety module by adding a public declaration and module export (
crates/pgls_analyser/src/lint/safety.rs) - Create test specification files (YAML) in the test specs directory with positive and negative test cases (
crates/pgls_analyser/tests/specs/safety/myNewRule) - Run
cargo testto validate the rule works correctly against test specifications (crates/pgls_analyser/tests/rules_tests.rs)
Update rule metadata and categorization
- Define rule category and metadata attributes in your rule implementation using the
pgls_analysecategories system (crates/pgls_analyser/src/lint/safety/my_new_rule.rs) - Ensure the rule exports its metadata via the analysis framework so it can be discovered (
crates/pgls_analyse/src/categories.rs) - Run code generation tools to update documentation and rule listings (
xtask/codegen)
Extend the analysis context passed to rules
- Add new fields to the
LinterContextstruct if you need additional parsed AST data or metadata (crates/pgls_analyser/src/linter_context.rs) - Update the context builder in the linter to populate the new fields from parsed SQL (
crates/pgls_analyser/src/lib.rs) - Use the new context fields in your rule's
lint()implementation (crates/pgls_analyser/src/lint/safety/my_new_rule.rs)
🔧Why these technologies
- Rust with tree-sitter/libpg_query — 100% Postgres-compatible SQL parsing without reinventing a parser; compiled to fast binary for LSP and CLI use
- Language Server Protocol (LSP) — Industry-standard for IDE integration; enables single backend to serve VSCode, Neovim, Zed, Sublime Text, and Cursor simultaneously
- Trait-based rule system (LinterRule) — Allows modular, extensible safety rules; new rules are isolated implementations without touching core engine
- Cargo workspace with multiple crates — Separates concerns (analysis metadata, linting, test utilities) and enables partial compilation/reuse in different tools
- YAML test specifications — Human-readable before/after SQL examples for rule validation without needing Rust test code for each rule case
⚖️Trade-offs already made
-
Parsing done via libpg_query (C FFI) rather than pure Rust parser
- Why: libpg_query is the official Postgres parser; guarantees 100% syntax compatibility across all Postgres versions
- Consequence: Adds FFI overhead and C dependency; but eliminates risk of custom parser divergence from upstream Postgres
-
Rules are registered statically via macro/registry rather than dynamic plugin loading
- Why: Simpler architecture, compile-time safety, easier to test and distribute a single binary
- Consequence: Adding a rule requires code change and recompile; rules cannot be added by end-users at runtime without rebuilding
-
Rules execute sequentially (not parallelized within a file)
- Why: Simplicity and predictable diagnostics ordering; avoids race conditions on shared context
- Consequence: Analysis time grows linearly with rule count; acceptable for ~40 rules and small-to-medium files, but could bottleneck large migrations
-
LSP and CLI share same rule engine via library crate (pgls_analyser)
- Why: Single source of truth for all linting logic; ensures consistency across tools
- Consequence: Tight coupling between tools; breaking changes in analysis API affect both LSP server and CLI simultaneously
🚫Non-goals (don't propose these)
- Does not execute or simulate SQL—only static analysis of schema and migration statements
- Does not provide authentication, access control, or multi-tenant isolation—responsibility of embedding application
- Does not auto-format or refactor SQL—focuses on linting and diagnostics, not transformation
- Does not optimize queries or provide cost estimation—focuses on schema safety patterns, not query performance
- Does not version-gate rules per Postgres version—rules are applied uniformly regardless of target Postgres version
🪤Traps & gotchas
- Docker-compose must be running (docker-compose up -d) before tests/dev; LSP needs live Postgres for full features. 2) Rust edition 2024 requires rustc 1.86.0+; older toolchains will fail. 3) SQLx compile-time checking (.sqlx/ directory) requires DATABASE_URL env var set at build time — missing it breaks queries. 4) pg_query is a C FFI binding; build needs C compiler (cc) and Postgres headers. 5) Nix flake is optional but recommended; without it, manual dependency installation required (see .github/actions/setup-postgres for dependencies). 6) Tree-sitter CLI version (.tree-sitter-cli-version) must match; mismatches break parser generation.
🏗️Architecture
💡Concepts to learn
- Language Server Protocol (LSP) — Core abstraction this project implements; understanding LSP message flows, document synchronization, and capability negotiation is essential to extending features across editors
- Parse tree analysis and semantic traversal — pgls_analyse walks libpg_query AST output to extract symbols, types, and violations; knowing how parse trees represent SQL statements (e.g., SelectStmt, CreateTableStmt nodes) is crucial for adding new rules
- Transport-agnostic server design — This LSP runs over stdio (editor), HTTP (web), and WASM (browser); understanding how message serialization and request routing decouples from transport lets you add new frontends without core changes
- Compile-time SQL query verification (sqlx) — Codebase uses SQLx macros (.sqlx/ cache) to verify SQL at build time against live schema; breaking this requires DATABASE_URL and schema setup, and understanding it prevents CI gotchas
- FFI (Foreign Function Interface) with C — pg_query wraps libpg_query C library via Rust FFI; understanding unsafe Rust blocks, pointer marshaling, and C memory models is needed to debug parser crashes or add parser features
- Diagnostic rule engines and linting frameworks — This project ports linting rules from Squawk/Eugene using a modular rule-check framework (xtask/rules_check); understanding rule registration, message templates, and violation severity levels unlocks rule contributions
- Workspace monorepo patterns (Cargo workspaces) — Multiple specialized crates (pgls_analyse, codegen, rules_check) share dependencies via workspace.dependencies; understanding member interdependencies and version coordination avoids breaking builds when modifying shared code
🔗Related repos
pganalyze/libpg_query— Direct dependency (pg_query crate) — the extracted Postgres parser this LSP wraps; understanding it is essential to troubleshooting parser errorssbdchd/squawk— Direct inspiration and linting rule source (agentic/port_squawk_rules.md); understanding Squawk's rule architecture aids migrating new rulesbiomejs/biome— Architectural inspiration (acknowledged in README) for LSP toolchain patterns; similar multi-output approach (CLI, LSP, lib interfaces)
🪄PR ideas
To work on one of these in Claude Code or Cursor, paste:
Implement the "<title>" PR idea from CLAUDE.md, working through the checklist as the task list.
Add comprehensive integration tests for safety linting rules in crates/pgls_analyser/src/lint/safety
The safety lint rules directory (crates/pgls_analyser/src/lint/safety/) contains critical migration safety checks like add_serial_column.rs, adding_field_with_default.rs, and adding_not_null_field.rs. These rules prevent production incidents but appear to lack corresponding integration tests. Adding test cases would validate rule detection accuracy and prevent regressions as the linter evolves.
- [ ] Review existing test patterns in crates/pgls_analyser/src/lint/ and insta snapshot test setup
- [ ] Create crates/pgls_analyser/src/lint/safety/tests/ directory with fixture SQL files for each safety rule
- [ ] Add test cases for add_serial_column.rs covering valid and invalid migration patterns
- [ ] Add test cases for adding_not_null_field.rs with and without defaults
- [ ] Add test cases for adding_foreign_key_constraint.rs validating constraint checks
- [ ] Integrate with existing insta snapshot testing framework per Cargo.toml (insta = 1.31.0)
Create CI workflow for validating .sqlx query metadata files against schema changes
The .sqlx/ directory contains 12+ compiled query metadata files that sqlx uses for compile-time checking. When schema changes occur, these can become stale without detection. A GitHub Action should validate that .sqlx/ metadata reflects the current database schema during pull requests, preventing silent query validation failures.
- [ ] Review .github/workflows/ structure (pull_request.yml, release.yml) to understand existing patterns
- [ ] Create .github/workflows/validate-sqlx.yml that runs on PR with schema changes
- [ ] Leverage .github/actions/setup-postgres/action.yml to spin up test Postgres instance
- [ ] Add sqlx CLI prepare check command to validate .sqlx/ files match actual schema
- [ ] Configure workflow to fail if .sqlx metadata is outdated relative to Cargo.toml migrations
- [ ] Document in CONTRIBUTING.md how to regenerate .sqlx/ files locally
Add missing unit tests for crates/pgls_analyse/src/filter.rs and registry.rs modules
The pgls_analyse crate (filter.rs and registry.rs) provides core analysis filtering and rule registration infrastructure, but these files lack corresponding unit tests. These modules are foundational to how rules are discovered and applied. Adding tests would improve maintainability and enable future refactoring with confidence.
- [ ] Examine crates/pgls_analyse/src/lib.rs and categories.rs to understand module public APIs
- [ ] Add #[cfg(test)] module to crates/pgls_analyse/src/filter.rs with test cases for filter logic
- [ ] Add unit tests for registry initialization, rule registration, and rule lookup in registry.rs
- [ ] Create test fixtures for mock Rule and Category types if needed
- [ ] Ensure tests cover edge cases: empty registry, duplicate rule registration, filter mismatches
- [ ] Run cargo test to validate and ensure >80% code coverage for both modules
🌿Good first issues
- Add comprehensive tests for crates/pgls_analyse rule validation logic (xtask/rules_check exists but test coverage gaps evident for new rule types being ported from Eugene/Squawk in agentic/); start by running cargo test and identifying uncovered analyze paths
- Expand hover documentation for PL/pgSQL built-in functions (README mentions 'PL/pgSQL Support' feature but no evidence of comprehensive function-level docstrings in codebase); contribute hover text by examining how PostgreSQL's pg_proc catalog is queried
- Add missing LSP ClientCapabilities branching in server initialization (typical LSP codebases handle dynamic registration; check if crates/postgres-language-server/src/ properly detects and adapts to client capabilities like dynamic formatting or code lens support)
⭐Top contributors
Click to expand
Top contributors
- @psteinroe — 79 commits
- @juleswritescode — 13 commits
- @shihanng — 2 commits
- @lneves12 — 1 commits
- @flvmnt — 1 commits
📝Recent commits
Click to expand
Recent commits
cc88ac9— Add is_incomplete to signal truncated completion (#733) (shihanng)ba029fe— Update cargo-pgrx to 0.18.0 to match pglinter (#734) (shihanng)8088fb9— chore: improve release changelog formatting (#732) (psteinroe)178aa3c— feat: expose new workspace function splitStatements with ranges (#730) (lneves12)cfeb686— chore: add multi-file pretty print SQL fixtures (#731) (psteinroe)aa8b88b— fix: build wasm from release assets (psteinroe)145bf68— fix: pin bun version and use bundler resolution for wasm package (psteinroe)2c7f5a3— fix: enable rustls-backed Postgres TLS connections (#729) (psteinroe)a805c22— feat: add configurable plpgsql_check function arguments (#726) (psteinroe)bbe3430— fix: handle WITH in ALTER statement splitting (#728) (psteinroe)
🔒Security observations
- High · Hardcoded SSL Certificate in Dockerfile —
Dockerfile - openssl req command. The Dockerfile generates a self-signed SSL certificate with hardcoded parameters directly in the image build. This certificate is valid for 3650 days and uses predictable generation parameters. While self-signed certs are expected in development, this approach creates a static certificate in all built images, which could be problematic if the same image is used across environments. Fix: For production use, implement certificate management through external volume mounts or secrets management systems. Use environment-specific certificate generation or rotate certificates regularly. Consider using a certificate management tool like cert-manager if running in Kubernetes. - High · Unencrypted .env File in Repository —
.env file. The presence of a .env file in the repository root (visible in file structure) suggests configuration and potentially sensitive data may be committed to version control. Even if currently empty, this pattern indicates a security misconfiguration. Fix: Ensure .env is in .gitignore and never commit environment-specific secrets. Use .env.example template files instead. Implement proper secrets management through CI/CD (GitHub Secrets) or tools like HashiCorp Vault. - Medium · Broad Package Installation Without Verification —
Dockerfile - apt-get install command. The Dockerfile installs multiple packages (postgresql-server-dev-15, gcc, make, git, curl, pkg-config, libssl-dev, libclang-dev, clang, libicu-dev, openssl) without explicit version pinning or checksum verification. This increases the attack surface and makes builds non-deterministic. Fix: Pin specific versions of all installed packages. Example: 'postgresql-server-dev-15=15.X-Y' instead of relying on apt caching. Implement security scanning in the CI/CD pipeline for container images using tools like Trivy or Snyk. - Medium · Git Clone Without Depth or Integrity Verification —
Dockerfile - git clone of plpgsql_check. While the plpgsql_check repository is cloned with --depth 1 (good), there's no verification of the cloned code's integrity (no GPG signature verification). External dependency cloning during Docker build could introduce supply chain attacks. Fix: Implement GPG signature verification for cloned repositories. Verify commits with: 'git verify-commit HEAD'. Consider using git submodules with locked commit hashes instead of cloning during builds. Scan cloned code for vulnerabilities. - Medium · Incomplete Dockerfile Build Process —
Dockerfile - end of file. The Dockerfile content is truncated, with an incomplete comment about Rust installation for 'pgrx-based' component. This incomplete build definition makes it impossible to fully assess security implications of the complete build process. Fix: Complete the Dockerfile and ensure all multi-stage builds are present. Review the Rust installation process for potential security issues. Use a .dockerignore file to exclude unnecessary files from the build context. - Medium · Permissive File Permissions on Private Key —
Dockerfile - chmod and chown commands. The Dockerfile sets server.key to chmod 600 (correct) but the chown command changes ownership to postgres:postgres. In a containerized environment, this could allow the postgres process to read its own private key, which should ideally be handled by an external certificate management system. Fix: For production, use external secret management. If using file-based secrets, restrict access further and consider using Docker secrets (Swarm) or Kubernetes secrets with proper RBAC. - Low · Missing Security Headers in Repository Configuration —
Repository root. No evidence of security.txt, SECURITY.md, or vulnerability disclosure policy in the file structure. This makes it difficult for security researchers to responsibly report vulnerabilities. Fix: Add a SECURITY.md file with vulnerability disclosure guidelines. Create a .well-known/security.txt file following RFC 9116 standards. Establish a responsible disclosure process. - Low · Incomplete Dependency Review Visibility —
Cargo.toml. While workspace dependencies are defined, the actual binary dependencies' security advisories and their transitive dependencies are not fully visible in the provided context. The use of dated crates (e.g., 'edition = 2024' appears to be a future edition) suggests potential versioning issues. Fix: undefined
LLM-derived; treat as a starting point, not a security audit.
👉Where to read next
- Open issues — current backlog
- Recent PRs — what's actively shipping
- Source on GitHub
Generated by RepoPilot. Verdict based on maintenance signals — see the live page for receipts. Re-run on a new commit to refresh.