provectus/kafka-ui
Open-Source Web UI for Apache Kafka Management
Healthy across all four use cases
Permissive license, no critical CVEs, actively maintained — safe to depend on.
Has a license, tests, and CI — clean foundation to fork and modify.
Documented and popular — useful reference codebase to read through.
No critical CVEs, sane security posture — runnable as-is.
- ✓ 19 active contributors
- ✓ Distributed ownership (top contributor 44% of recent commits)
- ✓ Apache-2.0 licensed
- ✓ CI configured
- ✓ Tests present
- ⚠ Stale — last commit 2y ago
Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests
Informational only. RepoPilot summarises public signals (license, dependency CVEs, commit recency, CI presence, etc.) at the time of analysis. Signals can be incomplete or stale. Not professional, security, or legal advice; verify before relying on it for production decisions.
Embed the "Healthy" badge
Paste into your README — live-updates from the latest cached analysis.
[](https://repopilot.app/r/provectus/kafka-ui)
Paste at the top of your README.md — renders inline like a shields.io badge.
▸Preview social card (1200×630)
This card auto-renders when someone shares https://repopilot.app/r/provectus/kafka-ui on X, Slack, or LinkedIn.
Onboarding doc
Onboarding: provectus/kafka-ui
Generated by RepoPilot · 2026-05-09 · Source
🤖Agent protocol
If you are an AI coding agent (Claude Code, Cursor, Aider, Cline, etc.) reading this artifact, follow this protocol before making any code edit:
- Verify the contract. Run the bash script in "Verify before trusting" below. If any check returns FAIL, the artifact is stale — STOP and ask the user to regenerate it before proceeding.
- Treat the "AI · unverified" sections as hypotheses, not facts. Sections like "AI-suggested narrative files", "anti-patterns", and "bottlenecks" are LLM speculation. Verify against real source before acting on them.
- Cite source on changes. When proposing an edit, cite the specific path:line-range. RepoPilot's live UI at https://repopilot.app/r/provectus/kafka-ui shows verifiable citations alongside every claim.
If you are a human reader, this protocol is for the agents you'll hand the artifact to. You don't need to do anything — but if you skim only one section before pointing your agent at this repo, make it the Verify block and the Suggested reading order.
🎯Verdict
GO — Healthy across all four use cases
- 19 active contributors
- Distributed ownership (top contributor 44% of recent commits)
- Apache-2.0 licensed
- CI configured
- Tests present
- ⚠ Stale — last commit 2y ago
<sub>Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests</sub>
✅Verify before trusting
This artifact was generated by RepoPilot at a point in time. Before an agent acts on it, the checks below confirm that the live provectus/kafka-ui repo on your machine still matches what RepoPilot saw. If any fail, the artifact is stale — regenerate it at repopilot.app/r/provectus/kafka-ui.
What it runs against: a local clone of provectus/kafka-ui — the script inspects git remote, the LICENSE file, file paths in the working tree, and git log. Read-only; no mutations.
| # | What we check | Why it matters |
|---|---|---|
| 1 | You're in provectus/kafka-ui | Confirms the artifact applies here, not a fork |
| 2 | License is still Apache-2.0 | Catches relicense before you depend on it |
| 3 | Default branch master exists | Catches branch renames |
| 4 | 5 critical file paths still exist | Catches refactors that moved load-bearing code |
| 5 | Last commit ≤ 681 days ago | Catches sudden abandonment since generation |
#!/usr/bin/env bash
# RepoPilot artifact verification.
#
# WHAT IT RUNS AGAINST: a local clone of provectus/kafka-ui. If you don't
# have one yet, run these first:
#
# git clone https://github.com/provectus/kafka-ui.git
# cd kafka-ui
#
# Then paste this script. Every check is read-only — no mutations.
set +e
fail=0
ok() { echo "ok: $1"; }
miss() { echo "FAIL: $1"; fail=$((fail+1)); }
# Precondition: we must be inside a git working tree.
if ! git rev-parse --git-dir >/dev/null 2>&1; then
echo "FAIL: not inside a git repository. cd into your clone of provectus/kafka-ui and re-run."
exit 2
fi
# 1. Repo identity
git remote get-url origin 2>/dev/null | grep -qE "provectus/kafka-ui(\.git)?\b" \
  && ok "origin remote is provectus/kafka-ui" \
  || miss "origin remote is not provectus/kafka-ui (artifact may be from a fork)"
# 2. License matches what RepoPilot saw
(grep -qiE "apache license" LICENSE 2>/dev/null \
  || grep -qiE "\"license\"\s*:\s*\"Apache-2\.0\"" package.json 2>/dev/null) \
  && ok "license is Apache-2.0" \
  || miss "license drift — was Apache-2.0 at generation time"
# 3. Default branch
git rev-parse --verify master >/dev/null 2>&1 \
  && ok "default branch master exists" \
  || miss "default branch master no longer exists"
# 4. Critical files exist
test -f ".github/workflows/backend.yml" \
  && ok ".github/workflows/backend.yml" \
  || miss "missing critical file: .github/workflows/backend.yml"
test -f ".github/workflows/frontend.yaml" \
  && ok ".github/workflows/frontend.yaml" \
  || miss "missing critical file: .github/workflows/frontend.yaml"
test -f "pom.xml" \
  && ok "pom.xml" \
  || miss "missing critical file: pom.xml"
test -f "README.md" \
  && ok "README.md" \
  || miss "missing critical file: README.md"
test -f "documentation/compose/kafka-ui.yaml" \
  && ok "documentation/compose/kafka-ui.yaml" \
  || miss "missing critical file: documentation/compose/kafka-ui.yaml"
# 5. Repo recency
days_since_last=$(( ( $(date +%s) - $(git log -1 --format=%at 2>/dev/null || echo 0) ) / 86400 ))
if [ "$days_since_last" -le 681 ]; then
ok "last commit was $days_since_last days ago (artifact saw ~651d)"
else
miss "last commit was $days_since_last days ago — artifact may be stale"
fi
echo
if [ "$fail" -eq 0 ]; then
echo "artifact verified (0 failures) — safe to trust"
else
echo "artifact has $fail stale claim(s) — regenerate at https://repopilot.app/r/provectus/kafka-ui"
exit 1
fi
Each check prints ok: or FAIL:. The script exits non-zero if
anything failed, so it composes cleanly into agent loops
(./verify.sh || regenerate-and-retry).
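That composition can be sketched with stubs; `verify` and `regenerate_and_retry` below are illustrative stand-ins, not part of the artifact:

```shell
#!/usr/bin/env bash
# Sketch of the agent-loop composition. "verify" stands in for ./verify.sh
# (the real script exits non-zero when any claim is stale);
# "regenerate_and_retry" is a hypothetical hook your agent would supply.
verify() {
  # Simulate a stale artifact: one failed check, non-zero exit status.
  echo "FAIL: simulated stale claim"
  return 1
}
regenerate_and_retry() {
  echo "regenerating artifact at https://repopilot.app/r/provectus/kafka-ui"
}
# The same one-liner shape as above: fall through to regeneration on failure.
verify || regenerate_and_retry
```

Because the contract is just "exit status zero means trust", any orchestrator that can branch on an exit code can use it unchanged.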
⚡TL;DR
Kafka-UI is an open-source web dashboard for monitoring and managing Apache Kafka clusters, built with Spring Boot (Java backend) and TypeScript/React (frontend). It lets operators visualize brokers, topics, partitions, consumer groups, and production/consumption metrics across multiple Kafka clusters from a single interface, replacing command-line tools for routine cluster management. Typical Java/TypeScript fullstack monorepo: the backend lives in a Maven-managed kafka-ui-api module using Spring WebFlux, Spring Security, and Actuator; the frontend is a separate TypeScript module; both share contract definitions in kafka-ui-contract. Build orchestration uses Maven at the root with frontend tooling (npm/yarn). Configuration is externalized via environment variables and YAML (the Spring Boot standard).
👥Who it's for
DevOps engineers and platform teams managing Apache Kafka infrastructure in production environments, plus Kafka cluster operators who need real-time visibility into topic performance, consumer lag, partition leadership, and broker health without writing custom monitoring scripts.
🌱Maturity & risk
Production-ready and mature, though note the staleness flag above: the last commit was roughly two years ago, so "actively maintained" no longer holds at face value. The project has substantial codebases (1.4M lines Java, 1M lines TypeScript), comprehensive CI/CD with 30+ GitHub workflows, Docker Hub distribution with measurable pull volumes, and activity visible in the workflow files at generation time. It's backed by Provectus with professional-services support, though community-driven.
Low risk for the core use case. Monolithic Spring Boot backend and React frontend are both mature stacks; the codebase is large enough to have caught major issues. Risks are moderate dependency churn (Java/Node ecosystem volatility) and the project's reliance on community contributions for non-critical features. No single-maintainer bottleneck evident from CODEOWNERS setup.
Active areas of work
At generation time, development spanned multiple parallel workflows: E2E automation (e2e-automation.yml, e2e-checks.yaml, e2e-weekly.yml), CodeQL security scanning, dependabot dependency updates, AWS Marketplace publishing, and release automation (release.yaml, release-serde-api.yaml). The .github/workflows/ directory shows 30+ pipelines, indicating rapid iteration and strict quality gates.
🚀Get running
Clone with git clone https://github.com/provectus/kafka-ui.git && cd kafka-ui. Build backend: ./mvnw clean install (Maven Wrapper in .mvn/). Build frontend: cd kafka-ui-react-app && npm install && npm start. Use .devcontainer/devcontainer.json for a containerized dev setup with VS Code. Full Docker Compose examples are in documentation/compose/.
Daily commands:
Backend dev: ./mvnw spring-boot:run -pl kafka-ui-api. Frontend dev: cd kafka-ui-react-app && npm start (likely localhost:3000 for the UI, localhost:8080 for the backend). Production: Docker images published to hub.docker.com/r/provectuslabs/kafka-ui (see .github/workflows/build-public-image.yml). Kubernetes-ready, with docker-compose examples in documentation/.
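A quick preflight before running the commands above can save a failed first build. The tool list here is an assumption drawn from this doc (the Maven wrapper needs a JDK; the React app needs Node/npm; Compose examples need Docker):

```shell
#!/usr/bin/env bash
# Preflight sketch for the dev setup above. Tool list is an assumption
# based on this doc, not an authoritative requirements list.
# Read-only: reports what's on PATH, never fails the shell.
preflight() {
  local tool
  for tool in git java node npm docker; do
    if command -v "$tool" >/dev/null 2>&1; then
      echo "ok: $tool found"
    else
      echo "missing: $tool"
    fi
  done
}
preflight
```

Run it from anywhere; anything reported `missing:` needs installing before the corresponding build step.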
🗺️Map of the codebase
- .github/workflows/backend.yml — Defines the backend build, test, and deployment pipeline for the Java API; essential for understanding CI/CD expectations and artifact naming.
- .github/workflows/frontend.yaml — Controls the frontend (React/TypeScript) build and publication; critical for understanding UI build artifacts and the deployment process.
- pom.xml — Root Maven configuration defining kafka-ui-api as the main backend module with JaCoCo coverage; essential for understanding dependency management and build configuration.
- README.md — Entry point describing Kafka-UI as a web UI for Kafka cluster management; establishes project scope, licensing, and community channels.
- documentation/compose/kafka-ui.yaml — Docker Compose configuration for running Kafka-UI with a Kafka cluster; demonstrates the typical deployment architecture and required services.
- .github/CODEOWNERS — Specifies code owners and review requirements for different parts of the codebase; critical for understanding governance and review expectations.
- CONTRIBUTING.md — Developer contribution guidelines covering setup, testing, and PR submission; mandatory reading for any contributor.
🛠️How to make changes
Add a new Kafka cluster configuration
- Create a new Docker Compose YAML file in documentation/compose/ referencing the kafka-ui.yaml template. (documentation/compose/kafka-ui-<scenario>.yaml)
- Define Kafka broker, Zookeeper, and Kafka-UI services with appropriate environment variables (KAFKA_BROKERS, authentication params, etc.). (documentation/compose/kafka-ui-<scenario>.yaml)
- Document the scenario in DOCKER_COMPOSE.md with instructions on when to use this configuration. (documentation/compose/DOCKER_COMPOSE.md)
- If adding security features (SASL, SSL, ACL), reference existing example files like kafka-ui-sasl.yaml or kafka-ssl.yml. (documentation/compose/kafka-ui-sasl.yaml)
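The first two steps can be sketched as a script. The scenario name, image tags, ports, and broker image below are illustrative assumptions, not taken from the repo; compare against documentation/compose/kafka-ui.yaml in a real clone before use:

```shell
#!/usr/bin/env bash
# Sketch of steps 1-2: write a minimal compose file for a hypothetical
# "plaintext" scenario. KAFKA_CLUSTERS_0_* is the kafka-ui env-var
# convention; everything else here is an assumption.
set -euo pipefail
out="documentation/compose/kafka-ui-plaintext.yaml"
mkdir -p "$(dirname "$out")"
cat > "$out" <<'YAML'
version: '2'
services:
  kafka-ui:
    image: provectuslabs/kafka-ui:latest
    ports:
      - "8080:8080"
    environment:
      KAFKA_CLUSTERS_0_NAME: local
      KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS: kafka:9092
  kafka:
    image: confluentinc/cp-kafka:latest   # broker image is an assumption
YAML
echo "wrote $out"
```

From there, `docker compose -f "$out" up` would bring the stack up, and the remaining steps are documentation.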
Add a new Kafka Connector example
- Create a new JSON connector configuration in documentation/compose/connectors/ following the pattern of existing sink/source connectors. (documentation/compose/connectors/<connector-name>.json)
- Update the connector startup script to include the new connector in its deployment sequence. (documentation/compose/connectors/start.sh)
- Document the connector setup in DOCKER_COMPOSE.md with example topics and expected data flow. (documentation/compose/DOCKER_COMPOSE.md)
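Step 1 can be sketched as follows. The connector name, class, and topic are hypothetical; mirror an existing file under documentation/compose/connectors/ from a real clone instead of this skeleton:

```shell
#!/usr/bin/env bash
# Sketch of step 1: author a connector config and sanity-check it.
# "my-sink" and HypotheticalSinkConnector are made-up placeholders.
set -euo pipefail
cfg="my-sink.json"
cat > "$cfg" <<'JSON'
{
  "name": "my-sink",
  "config": {
    "connector.class": "org.example.HypotheticalSinkConnector",
    "tasks.max": "1",
    "topics": "activities"
  }
}
JSON
# Minimal sanity check before handing the file to the Kafka Connect
# REST API (POST /connectors on a running Connect worker).
for key in name connector.class topics; do
  grep -q "\"$key\"" "$cfg" && echo "ok: $key present"
done
```

The startup script (step 2) would then submit this file with curl against the Connect worker's REST endpoint.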
Extend the CI/CD pipeline for a new build artifact
- Add a new job or step in the appropriate GitHub Actions workflow (.github/workflows/backend.yml or frontend.yaml). (.github/workflows/backend.yml)
- Define build commands, test execution, coverage reporting (JaCoCo for Java), and artifact publishing (Docker Hub, Maven Central, etc.). (.github/workflows/backend.yml)
- Update build-public-image.yml to include the new artifact in Docker image builds if applicable. (.github/workflows/build-public-image.yml)
- Ensure the release.yaml workflow includes steps to publish the new artifact during version releases. (.github/workflows/release.yaml)
Configure a new security scenario (SASL, SSL, ACL, LDAP)
- Create a new Docker Compose YAML file in documentation/compose/ that includes the security mechanism (reference kafka-ui-sasl.yaml, kafka-ssl.yml, or kafka-ui-auth-context.yaml). (documentation/compose/kafka-ui-<security-type>.yaml)
- Add JAAS configuration files if needed in documentation/compose/jaas/ (e.g., kafka_server.conf, schema_registry.jaas). (documentation/compose/jaas/kafka_server.conf)
- If using SSL, add certificate generation and keystore/truststore files, referencing documentation/compose/ssl/generate_certs.sh. (documentation/compose/ssl/generate_certs.sh)
- Document the security configuration in DOCKER_COMPOSE.md with environment variable mappings and the authentication flow. (documentation/compose/DOCKER_COMPOSE.md)
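For the SSL step, the repo's own documentation/compose/ssl/generate_certs.sh is the authoritative reference (it builds Java keystores/truststores). A generic sketch of the openssl half, with illustrative filenames and a throwaway CN, looks like this:

```shell
#!/usr/bin/env bash
# Generic self-signed cert sketch for local TLS testing only.
# Filenames and CN are assumptions; a JKS keystore for Kafka would
# additionally require keytool/openssl pkcs12 conversion steps.
set -euo pipefail
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=kafka-ui.local" \
  -keyout kafka-ui.key -out kafka-ui.crt 2>/dev/null
# Inspect what was generated.
openssl x509 -in kafka-ui.crt -noout -subject
```

Self-signed certs like this are for local compose scenarios only; production deployments need certificates from a real CA.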
🔧Why these technologies
- Java / Spring Boot — Backend API for Kafka cluster management, chosen for JVM ecosystem integration with Kafka client libraries and cross-platform deployment.
- React / TypeScript — Frontend web UI providing responsive dashboard; TypeScript ensures type safety for managing complex cluster state and topology.
- Docker & Docker Compose — Containerization for consistent deployments across environments; Compose enables local multi-service stacks (Kafka, Zookeeper, Kafka-UI, optional connectors).
- Maven & Maven Wrapper — Build automation and dependency management for Java backend; wrapper ensures consistent Maven version across developers and CI/CD.
- GitHub Actions — CI/CD orchestration for automated testing, building, publishing Docker images, and releasing; deep GitHub integration for PR checks and releases.
- JaCoCo — Code coverage analysis for Java backend; integrated with SonarQube for quality gate enforcement.
⚖️Trade-offs already made
- Monolithic Docker image containing both backend API and frontend UI
  - Why: Simplifies deployment and reduces operational complexity for users running a single docker run command.
  - Consequence: Tighter coupling between frontend and backend versions; harder to scale the frontend independently or use the UI with an alternative backend.
- In-memory caching for cluster metadata instead of a persistent cache (Redis, etc.)
  - Why: Reduces external dependencies and operational overhead for a read-heavy UI use case.
  - Consequence: Metadata loss on pod restart; not suitable for multi-instance deployments without session affinity; limited to single-node scaling.
- Support for multiple security mechanisms (SASL, SSL, OAuth, ACL, LDAP) via configuration
  - Why: Enables integration with diverse enterprise Kafka deployments without code changes.
  - Consequence: Increased configuration complexity and testing surface; potential for security misconfiguration by users.
- Kafka Connector management as an optional feature (not core UI functionality)
  - Why: Keeps core Kafka management focused; connectors are optional.
  - Consequence: Not captured by the analysis.
🪤Traps & gotchas
- Spring Boot expects KAFKA_CLUSTERS_* environment variables for multi-cluster setup (see the Docker examples).
- The frontend build requires Node.js ≥ 14 (inferred from modern TypeScript usage).
- OAuth2 config is optional, but security is enabled by default — local dev may need SPRING_SECURITY_ENABLED=false or mock credentials.
- The serde-api (custom serialization) is a separate releasable module; changes there trigger the release-serde-api.yaml workflow.
- E2E tests require a running Kafka broker (Docker Compose is mandatory for CI/CD).
- The Maven Wrapper in .mvn/ is provided — use ./mvnw, not system mvn, to ensure version consistency.
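The KAFKA_CLUSTERS_* convention makes a one-container local launch short. The env-var names and image below come from this doc; the bootstrap address and the AUTH_TYPE=DISABLED toggle are assumptions for local dev only. This is a dry run that only prints the command, so nothing is pulled or started:

```shell
#!/usr/bin/env bash
# Dry-run sketch of a single-cluster local launch.
set -euo pipefail
cmd=(docker run -p 8080:8080
  -e KAFKA_CLUSTERS_0_NAME=local
  -e KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS=localhost:9092
  -e AUTH_TYPE=DISABLED          # assumption: skip login for local dev
  provectuslabs/kafka-ui:latest)
# Print rather than execute; drop the echo to actually run it.
echo "${cmd[@]}"
```

Each additional cluster just increments the index: KAFKA_CLUSTERS_1_NAME, KAFKA_CLUSTERS_1_BOOTSTRAPSERVERS, and so on.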
🏗️Architecture
💡Concepts to learn
- Spring WebFlux (Reactive Streams) — kafka-ui backend uses non-blocking async I/O to handle multiple concurrent Kafka cluster connections without thread pool exhaustion
- OAuth2 / OpenID Connect — The UI supports enterprise authentication via oauth2-client; understanding grant flows is essential for securing multi-tenant deployments
- Consumer Group Offset Management — kafka-ui displays per-partition offsets and lag; understanding Kafka's offset commit protocol is critical for interpreting consumer health correctly
- ANTLR (ANother Tool for Language Recognition) — The repo uses ANTLR (17k lines) for parsing Kafka config strings and KQL queries; understanding grammar-based parsing helps extend the UI's query capabilities
- JaCoCo Code Coverage — Build pipeline enforces code coverage thresholds; contributors must understand coverage reports to pass CI gates
- Kafka Partition Leadership & ISR (In-Sync Replicas) — kafka-ui displays broker assignments and replica status; understanding leader election and ISR dynamics is essential for interpreting cluster health metrics
- Multi-Cluster Abstraction Pattern — The UI manages multiple independent Kafka clusters from one dashboard; the cluster registry and routing pattern is a key architectural design
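The offset-management concept above reduces to simple arithmetic per partition: lag is the log-end offset minus the consumer group's last committed offset. A tiny sketch with made-up offsets:

```shell
#!/usr/bin/env bash
# Consumer lag as a UI like kafka-ui reports it, per partition:
# log-end offset minus committed offset. Offsets below are made up.
lag() { echo $(( $1 - $2 )); }   # $1 = end offset, $2 = committed offset
echo "partition 0 lag: $(lag 1500 1480)"   # → 20
echo "partition 1 lag: $(lag 900 900)"     # → 0 (fully caught up)
```

A persistently growing lag on one partition (with others at zero) usually points at a hot key or a stuck consumer, which is exactly the pattern the UI's lag view makes visible.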
🔗Related repos
- apache/kafka — The upstream Kafka project; kafka-ui depends on its client libraries and protocol definitions
- confluentinc/control-center — Commercial Kafka management UI by Confluent; kafka-ui is the open-source alternative with simpler setup
- edenhill/kcat — CLI tool for Kafka inspection (an alternative to a UI for power users); kafka-ui provides a GUI alternative to kcat workflows
- provectus/kafka-ui-azure — Azure Marketplace integration package for kafka-ui; companion deployment option for cloud environments
- spring-projects/spring-kafka — Spring Framework's Kafka client abstraction; kafka-ui uses this for its broker/topic management backend
🪄PR ideas
To work on one of these in Claude Code or Cursor, paste:
Implement the "<title>" PR idea from CLAUDE.md, working through the checklist as the task list.
Add GitHub Action workflow for frontend unit test coverage reporting
The repo has comprehensive backend testing (JaCoCo/SonarQube setup in pom.xml) and a frontend.yaml workflow, but there's no dedicated frontend test coverage workflow. The frontend.yaml likely runs builds but doesn't integrate coverage reporting to SonarQube like the backend does. This would ensure frontend code quality parity with backend standards and catch regressions early.
- [ ] Create .github/workflows/frontend-coverage.yml that runs npm/yarn test with coverage flags
- [ ] Configure coverage reporter (e.g., nyc/jest) to generate LCOV reports
- [ ] Add SonarQube scanner step to upload frontend coverage to SonarCloud (similar to backend setup in backend.yml)
- [ ] Document coverage thresholds in the workflow and in CONTRIBUTING.md
Create missing integration test suite for Kafka Connect API endpoints
The file structure shows documentation/compose/connectors with sample connector configs (github-source.json, s3-sink.json, source-activities.json, sink-activities.json), but there's no evident e2e test suite validating connector CRUD operations. Given kafka-ui manages connectors, this is a critical gap. The e2e-automation.yml workflow exists but likely lacks connector-specific scenarios.
- [ ] Create backend/src/test/java/com/provectus/kafka/connector/ConnectorApiIntegrationTest.java with tests for create/list/update/delete connector operations
- [ ] Extend e2e-automation.yml or create separate e2e-connectors.yml workflow using documentation/compose/connectors test data
- [ ] Add Testcontainers-based integration tests for Kafka Connect cluster scenarios
- [ ] Document test setup in CONTRIBUTING.md with Kafka Connect environment requirements
Add missing security headers and CSRF protection configuration documentation
The repo includes spring-boot-starter-security and spring-boot-starter-oauth2-client dependencies in pom.xml, plus a SECURITY.md file exists, but there's no specific guide for configuring security headers (CSP, X-Frame-Options, etc.) or CSRF protection for the Kafka UI. Given this is a management interface, security configuration clarity is critical for production deployments.
- [ ] Create documentation/SECURITY_HARDENING.md detailing recommended security headers configuration for kafka-ui-api
- [ ] Add security configuration examples in documentation/compose/data or a new documentation/security-config directory with Spring Security bean examples
- [ ] Add GitHub Action in .github/workflows/security-checks.yaml to validate common misconfigurations (OWASP dependency-check integration)
- [ ] Link new security guide from README.md and CONTRIBUTING.md
🌿Good first issues
- Add TypeScript unit tests for the consumer-group lag-calculation logic in the frontend (likely kafka-ui-react-app/src/ — missing tests parallel to new lag features)
- Improve error handling and logging in the Spring Boot Kafka client integration layer (kafka-ui-api/src/) for broker connection failures — add debug messages and structured logging
- Document the ANTLR grammar extensions (17k lines in the repo) with inline examples showing how to parse Kafka configuration strings for the UI's config editor
⭐Top contributors
Click to expand
Top contributors
- @dependabot[bot] — 44 commits
- @Haarolean — 20 commits
- @iliax — 17 commits
- @Narekmat — 3 commits
- @Vladysl — 2 commits
📝Recent commits
Click to expand
Recent commits
- 83b5a60 — added commons-collections4 library instead of commons-collections (#4427) (Vladysl)
- 3dc4446 — added env variable filtering.groovy.enabled which allows to enable/disable groovy script executions (#4426) (Vladysl)
- 53a6553 — Bump vite-tsconfig-paths from 4.2.0 to 4.2.1 in /kafka-ui-react-app (#4215) (dependabot[bot])
- fc97dfa — Bump react-router-dom from 6.3.0 to 6.15.0 in /kafka-ui-react-app (#4217) (dependabot[bot])
- 68f08a0 — Bump Spring Boot (#4227) (freeformz)
- cc12814 — Bump @typescript-eslint/parser from 5.29.0 to 5.62.0 in /kafka-ui-react-app (#4214) (dependabot[bot])
- 5d53580 — Bump @testing-library/dom from 9.0.0 to 9.3.1 in /kafka-ui-react-app (#4216) (dependabot[bot])
- de2f06c — Bump react-hot-toast from 2.4.0 to 2.4.1 in /kafka-ui-react-app (#4218) (dependabot[bot])
- ff106a2 — Bump @openapitools/openapi-generator-cli from 2.5.2 to 2.7.0 in /kafka-ui-react-app (#4219) (dependabot[bot])
- c00cb32 — Bump @types/lossless-json from 1.0.1 to 1.0.2 in /kafka-ui-react-app (#4220) (dependabot[bot])
🔒Security observations
The kafka-ui project demonstrates a reasonable security baseline, with Spring Security and OAuth2 support included, but several concerns require attention. The most critical issue is the incomplete/truncated pom.xml, which prevents full dependency analysis. Key risks follow, ordered by severity.
- High · Incomplete Dependency Declaration in POM — pom.xml, kafka-ui-serde-api dependency. The pom.xml shows a truncated dependency declaration for kafka-ui-serde-api, with an incomplete <version> tag ending in '<v'. This could indicate a corrupted or incomplete build configuration, leading to unpredictable dependency resolution or build failures. Fix: complete the dependency declaration, ensuring the version tag is properly closed with the correct version (typically ${project.version}).
- Medium · OAuth2 Client Without Explicit PKCE Configuration — pom.xml, spring-boot-starter-oauth2-client dependency. The project includes spring-boot-starter-oauth2-client, but no configuration enforcing PKCE (Proof Key for Code Exchange) is visible in the provided context. OAuth2 implementations without PKCE can be vulnerable to authorization-code interception attacks. Fix: verify that the OAuth2 configuration in application.yml/properties enables PKCE for public client flows (in Spring Security, registering the client with client-authentication-method none marks it as a public client and triggers PKCE).
- Medium · Outdated JaCoCo Code Coverage Plugin — pom.xml, jacoco.version property. JaCoCo 0.8.10 is pinned in properties and may contain known vulnerabilities; check current stable versions against CVE databases. Fix: update JaCoCo to the latest stable version and verify compatibility with the current Java/Maven setup. Run mvn versions:display-property-updates to check for updates.
- Medium · Exposed Actuator Endpoints — pom.xml, spring-boot-starter-actuator dependency. If not properly configured, actuator endpoints can expose sensitive application metrics, environment variables, and configuration details to unauthorized users. Fix: restrict management.endpoints.web.exposure.include to only the necessary endpoints, require authentication for sensitive ones, and set management.endpoint.health.show-details=when-authorized.
- Medium · No Visible CSRF Protection Configuration — pom.xml, spring-boot-starter-webflux and spring-boot-starter-security. While Spring Security is included, no CSRF token configuration is visible in the provided file structure; WebFlux applications need explicit CSRF configuration. Fix: verify CSRF protection is explicitly configured for reactive endpoints, and disable CSRF only if intentionally building a stateless API with proper token-based authentication.
- Low · Missing Dependency Version Management — pom.xml, overall dependency management. The pom.xml relies on the parent POM for dependency versions, with no visible version lock or bill-of-materials (BOM) enforcement for transitive dependencies. Fix: consider a dependencyManagement section with a Spring Cloud or Spring Boot BOM to explicitly manage transitive-dependency versions and reduce supply-chain risk.
- Low · Incomplete File Structure Analysis — pom.xml, entire file. The dependency file content is truncated (it ends at the kafka-ui-serde-api dependency), preventing full analysis of all security-relevant dependencies and configurations. Fix: provide the complete pom.xml for comprehensive analysis, and audit the plugins section for insecure or outdated build plugins.
- Low · No Visible Security Headers Configuration — configuration/application properties (not provided). For a web UI managing Kafka clusters, security headers (X-Frame-Options, X-Content-Type-Options, Content-Security-Policy) are critical but not visible in the provided context. Fix: configure Spring Security to set these headers (for example, denying framing) to mitigate clickjacking and XSS.
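The security-headers item is easy to verify against a running deployment. In practice you would pipe `curl -sI https://your-kafka-ui/` into a check like the sketch below; here a fabricated header fixture stands in for a live instance (the header names are the standard ones, the fixture values are made up):

```shell
#!/usr/bin/env bash
# Offline check for the response headers named in the last finding.
required="X-Frame-Options X-Content-Type-Options Content-Security-Policy"
check_headers() {
  local hdrs missing=0
  hdrs="$(cat)"   # raw response headers on stdin
  for h in $required; do
    if echo "$hdrs" | grep -qi "^$h:"; then
      echo "ok: $h"
    else
      echo "missing: $h"
      missing=$((missing+1))
    fi
  done
  return "$missing"   # non-zero count of missing headers
}
# Fabricated fixture: only one of the three headers is set.
printf 'HTTP/1.1 200 OK\nX-Frame-Options: DENY\nContent-Type: text/html\n' \
  | check_headers || echo "hardening gaps found: $?"
```

Swapping the fixture for a real `curl -sI` makes this a one-liner smoke test for the hardening guide proposed in the PR ideas above.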
LLM-derived; treat as a starting point, not a security audit.
👉Where to read next
- Open issues — current backlog
- Recent PRs — what's actively shipping
- Source on GitHub
Generated by RepoPilot. Verdict based on maintenance signals — see the live page for receipts. Re-run on a new commit to refresh.