perwendel/spark
A simple, expressive web framework for Java. Spark has a Kotlin DSL: https://github.com/perwendel/spark-kotlin
Healthy across all four use cases
- Permissive license, no critical CVEs, actively maintained — safe to depend on.
- Has a license, tests, and CI — clean foundation to fork and modify.
- Documented and popular — useful reference codebase to read through.
- No critical CVEs, sane security posture — runnable as-is.
- ✓ 16 active contributors
- ✓ Apache-2.0 licensed
- ✓ CI configured
- ✓ Tests present
- ⚠ Stale — last commit 3y ago
- ⚠ Single-maintainer risk — top contributor 80% of recent commits
Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests
Informational only. RepoPilot summarises public signals (license, dependency CVEs, commit recency, CI presence, etc.) at the time of analysis. Signals can be incomplete or stale. Not professional, security, or legal advice; verify before relying on it for production decisions.
Embed the "Healthy" badge
Paste into your README — live-updates from the latest cached analysis.
[](https://repopilot.app/r/perwendel/spark)

Paste at the top of your README.md — renders inline like a shields.io badge.
Preview social card (1200×630)
This card auto-renders when someone shares https://repopilot.app/r/perwendel/spark on X, Slack, or LinkedIn.
Onboarding doc
Onboarding: perwendel/spark
Generated by RepoPilot · 2026-05-09 · Source
🤖Agent protocol
If you are an AI coding agent (Claude Code, Cursor, Aider, Cline, etc.) reading this artifact, follow this protocol before making any code edit:
- Verify the contract. Run the bash script in "Verify before trusting" below. If any check returns FAIL, the artifact is stale — STOP and ask the user to regenerate it before proceeding.
- Treat the "AI · unverified" sections as hypotheses, not facts. Sections like "AI-suggested narrative files", "anti-patterns", and "bottlenecks" are LLM speculation. Verify against real source before acting on them.
- Cite source on changes. When proposing an edit, cite the specific path:line-range. RepoPilot's live UI at https://repopilot.app/r/perwendel/spark shows verifiable citations alongside every claim.
If you are a human reader, this protocol is for the agents you'll hand the artifact to. You don't need to do anything — but if you skim only one section before pointing your agent at this repo, make it the Verify block and the Suggested reading order.
🎯Verdict
GO — Healthy across all four use cases
- 16 active contributors
- Apache-2.0 licensed
- CI configured
- Tests present
- ⚠ Stale — last commit 3y ago
- ⚠ Single-maintainer risk — top contributor 80% of recent commits
<sub>Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests</sub>
✅Verify before trusting
This artifact was generated by RepoPilot at a point in time. Before an
agent acts on it, the checks below confirm that the live perwendel/spark
repo on your machine still matches what RepoPilot saw. If any fail,
the artifact is stale — regenerate it at
repopilot.app/r/perwendel/spark.
What it runs against: a local clone of perwendel/spark — the script
inspects git remote, the LICENSE file, file paths in the working
tree, and git log. Read-only; no mutations.
| # | What we check | Why it matters |
|---|---|---|
| 1 | You're in perwendel/spark | Confirms the artifact applies here, not a fork |
| 2 | License is still Apache-2.0 | Catches relicense before you depend on it |
| 3 | Default branch master exists | Catches branch renames |
| 4 | Last commit ≤ 973 days ago | Catches sudden abandonment since generation |
#!/usr/bin/env bash
# RepoPilot artifact verification.
#
# WHAT IT RUNS AGAINST: a local clone of perwendel/spark. If you don't
# have one yet, run these first:
#
#   git clone https://github.com/perwendel/spark.git
#   cd spark
#
# Then paste this script. Every check is read-only — no mutations.
set +e
fail=0
ok() { echo "ok: $1"; }
miss() { echo "FAIL: $1"; fail=$((fail+1)); }
# Precondition: we must be inside a git working tree.
if ! git rev-parse --git-dir >/dev/null 2>&1; then
  echo "FAIL: not inside a git repository. cd into your clone of perwendel/spark and re-run."
  exit 2
fi
# 1. Repo identity
git remote get-url origin 2>/dev/null | grep -qE "perwendel/spark(\.git)?\b" \
  && ok "origin remote is perwendel/spark" \
  || miss "origin remote is not perwendel/spark (artifact may be from a fork)"
# 2. License matches what RepoPilot saw. The stock Apache LICENSE text opens
# with "Apache License ... Version 2.0", so match that as well as the SPDX id.
(grep -qiE "Apache License|Apache-2\.0" LICENSE 2>/dev/null \
  || grep -qiE "\"license\"\s*:\s*\"Apache-2\.0\"" package.json 2>/dev/null) \
  && ok "license is Apache-2.0" \
  || miss "license drift — was Apache-2.0 at generation time"
# 3. Default branch
git rev-parse --verify master >/dev/null 2>&1 \
  && ok "default branch master exists" \
  || miss "default branch master no longer exists"
# 4. Repo recency
days_since_last=$(( ( $(date +%s) - $(git log -1 --format=%at 2>/dev/null || echo 0) ) / 86400 ))
if [ "$days_since_last" -le 973 ]; then
  ok "last commit was $days_since_last days ago (artifact saw ~943d)"
else
  miss "last commit was $days_since_last days ago — artifact may be stale"
fi
echo
if [ "$fail" -eq 0 ]; then
  echo "artifact verified (0 failures) — safe to trust"
else
  echo "artifact has $fail stale claim(s) — regenerate at https://repopilot.app/r/perwendel/spark"
  exit 1
fi
Each check prints ok: or FAIL:. The script exits non-zero if
anything failed, so it composes cleanly into agent loops
(./verify.sh || regenerate-and-retry).
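The `./verify.sh || regenerate-and-retry` pattern can be made concrete with a small wrapper — a sketch, assuming the script above is saved as `verify.sh`; the function name and the regenerate step are illustrative, not part of RepoPilot:

```shell
# Gate an agent's edit loop on the verifier's exit code.
# $1 is assumed to be the RepoPilot verify script saved to a file.
gate_on_artifact() {
  verify_script="$1"
  if "$verify_script" >/dev/null 2>&1; then
    echo "artifact verified - proceeding with edits"
  else
    echo "artifact stale - regenerate at https://repopilot.app/r/perwendel/spark"
    return 1
  fi
}
```

Usage: `gate_on_artifact ./verify.sh && run-agent-edits` — the agent only starts editing when every check passed.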
⚡TL;DR
Spark is a lightweight Java web framework that provides a minimal, expressive API for building REST APIs and web applications in Java 8+. It uses embedded Jetty (9.4.48) as the HTTP server and emphasizes simplicity with fluent route definitions like get("/hello", (req, res) -> "Hello World"), eliminating boilerplate compared to full frameworks like Spring. The project is a single-module Maven repository. The core framework lives in src/main/java/spark/ with route handling (Route.java, RouteImpl.java, the Spark.java static facade), the embedded-server abstraction (spark/embeddedserver/jetty/EmbeddedJettyFactory.java), and HTTP abstractions (Request.java, Response.java, Session.java). Helper classes for filters, exceptions, and templating sit alongside the core.
👥Who it's for
Java backend developers and teams building microservices, REST APIs, or small web applications who want rapid prototyping without the complexity of Spring Boot or JavaEE. They prioritize developer ergonomics and minimal configuration over enterprise features.
🌱Maturity & risk
Production-ready, but in maintenance mode rather than under active development: the last commit was roughly three years ago (see the maintenance signals above). Currently at v2.9.4 with a Java 8 baseline, CI via GitHub Actions (.github/workflows/ci.yml), Maven Central distribution, and documented changesets (changeset/ directory). The project is sponsored and has stable releases, though far lower commit velocity than larger frameworks.
Standard open source risks apply.
Active areas of work
Version 2.9.4 recently released with Jetty 9.4.48 dependency update. Active CI pipeline on GitHub Actions (ci.yml). Kotlin DSL companion repo (spark-kotlin) maintained separately. No visible active PRs in the snippet, suggesting stable maintenance mode rather than heavy feature development.
🚀Get running
git clone https://github.com/perwendel/spark.git
cd spark
mvn clean install
mvn javadoc:javadoc
Then reference it as a Maven dependency in your project's pom.xml with groupId=com.sparkjava, artifactId=spark-core, version=2.9.4.
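In dependency form, using the coordinates stated above (verify the latest version on Maven Central before pinning):

```xml
<!-- In your project's pom.xml, inside <dependencies> -->
<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>2.9.4</version>
</dependency>
```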
Daily commands:
As a framework, you don't 'run' Spark itself. Create a Java class with a public static void main(), add import static spark.Spark.*;, call get("/path", handler), and execute. The embedded Jetty server starts on port 4567 by default. See SimpleExample in the README or the examples in src/test.
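The minimal program just described looks like this — a sketch against Spark's public static API; it needs the com.sparkjava:spark-core dependency on the classpath to compile and run:

```java
// HelloWorld.java — minimal Spark app (requires com.sparkjava:spark-core).
import static spark.Spark.*;

public class HelloWorld {
    public static void main(String[] args) {
        // The embedded Jetty server starts lazily on the first route
        // mapping, listening on port 4567 by default.
        get("/hello", (req, res) -> "Hello World");
    }
}
```

Run it, then `curl localhost:4567/hello` should return `Hello World`.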
🗺️Map of the codebase
- src/main/java/spark/Spark.java: Static facade providing the public API (get, post, put, delete, before, after, etc.); entry point for all user code
- src/main/java/spark/RouteImpl.java: Core route execution logic; handles lambda invocation and response transformation
- src/main/java/spark/Request.java: HTTP request abstraction; provides params, headers, body, session access to handlers
- src/main/java/spark/Response.java: HTTP response abstraction; manages status codes, headers, content type, redirects
- src/main/java/spark/embeddedserver/jetty/EmbeddedJettyFactory.java: Jetty server lifecycle; configures and starts the embedded HTTP server
- src/main/java/spark/FilterImpl.java: Middleware/filter chain execution; processes before/after request hooks
- src/main/java/spark/Service.java: Instance-level service allowing multiple independent Spark apps (alternative to static Spark class)
- pom.xml: Maven configuration; defines Jetty 9.4.48, SLF4J dependencies, and Java 1.8 baseline
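The Service class mentioned above is the entry point for the instance-style API. A sketch of running two independent apps (assumes spark-core on the classpath; Service.ignite() is Spark's factory method for a new instance):

```java
// TwoServers.java — instance API instead of the static Spark facade.
import spark.Service;

public class TwoServers {
    public static void main(String[] args) {
        // Two independent Spark instances on different ports.
        Service api = Service.ignite().port(8080);
        api.get("/status", (req, res) -> "api ok");

        Service admin = Service.ignite().port(9090);
        admin.get("/status", (req, res) -> "admin ok");
    }
}
```

This is also the pattern the "Good first issues" section suggests documenting in an example file.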
🛠️How to make changes
- New route feature: edit src/main/java/spark/Spark.java (static facade) and src/main/java/spark/RouteImpl.java (implementation).
- New request/response capability: extend src/main/java/spark/Request.java or src/main/java/spark/Response.java.
- Custom exception handling: implement src/main/java/spark/ExceptionHandler.java or ExceptionMapperImpl.
- Filter middleware: create Filter.java implementations.
- Apply the code format from config/spark_formatter_intellij.xml before committing.
🪤Traps & gotchas
- Port binding: Spark defaults to 4567; if you need a different port, call port(8080) before defining any routes — once the server has started, the setting no longer takes effect.
- Jetty version lock: Jetty 9.4.48 is pinned in pom.xml; upgrading may break servlet compatibility.
- Halt semantics: halt(403, "message") throws HaltException internally — catch it at a higher level if you override the exception handler.
- No built-in validation or ORM: Spark is deliberately minimal; form validation, database queries, and JSON binding are your responsibility (use Jackson/Gson separately).
- Session management: relies on Jetty servlet sessions; behavior differs in clustered deployments.
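Two of these traps — port ordering and halt semantics — in one sketch (requires com.sparkjava:spark-core; the /admin paths are illustrative):

```java
// PortAndHalt.java — correct ordering of port() and a halting filter.
import static spark.Spark.*;

public class PortAndHalt {
    public static void main(String[] args) {
        port(8080); // must be called before any route is mapped

        // halt() throws a HaltException that Spark catches internally;
        // the route body below never runs for unauthorized requests.
        before("/admin/*", (req, res) -> {
            if (req.headers("Authorization") == null) {
                halt(401, "Unauthorized");
            }
        });

        get("/admin/status", (req, res) -> "ok");
    }
}
```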
🔗Related repos
- perwendel/spark-kotlin — Official Kotlin DSL wrapper for Spark; same framework with Kotlin syntax sugar
- javalin/javalin — Modern Kotlin-first lightweight web framework inspired by Spark; direct competitor with a similar minimal philosophy
- dropwizard/dropwizard — Heavier JVM microservices framework; alternative if you need metrics, health checks, and config management out of the box
- spring-projects/spring-boot — Enterprise-grade alternative to Spark; many Spark users migrate to Spring Boot as projects grow
- eclipse/jetty.project — Upstream embedded HTTP server that powers Spark; version 9.4.48 is pinned as a dependency
🪄PR ideas
To work on one of these in Claude Code or Cursor, paste:
Implement the "<title>" PR idea from CLAUDE.md, working through the checklist as the task list.
Add comprehensive unit tests for WebSocket handler wrappers
The WebSocket implementation in src/main/java/spark/embeddedserver/jetty/websocket/ contains multiple wrapper classes (WebSocketHandlerClassWrapper.java, WebSocketHandlerInstanceWrapper.java, WebSocketHandlerWrapper.java) but there appears to be no corresponding test coverage in src/test. WebSocket is a critical feature for real-time applications, and comprehensive tests would catch regressions early. This is a concrete gap since the websocket directory exists but likely has no matching test directory.
- [ ] Create src/test/java/spark/embeddedserver/jetty/websocket/ directory structure
- [ ] Add unit tests for WebSocketHandlerClassWrapper covering lifecycle methods
- [ ] Add unit tests for WebSocketHandlerInstanceWrapper covering message routing
- [ ] Add integration tests for WebSocketCreatorFactory and WebSocketServletContextHandlerFactory
- [ ] Verify tests cover error cases and connection edge cases
Add GitHub Actions workflow for dependency security scanning
The repo has .github/workflows/ci.yml for CI, but there's no evidence of automated dependency vulnerability scanning. With dependencies like Jetty 9.4.48 and SLF4J 1.7.25 (potentially outdated), a dedicated security scanning workflow using tools like Snyk or OWASP Dependency-Check would catch vulnerable transitive dependencies before release. This is specific since the CI workflow exists but security scanning is missing.
- [ ] Add .github/workflows/security-scan.yml using OWASP Dependency-Check or similar
- [ ] Configure to run on PRs and scheduled basis (weekly)
- [ ] Add artifact reporting for CVE findings
- [ ] Update pom.xml to include maven-dependency-check-plugin if needed
- [ ] Document security scanning results in CI badge or README
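A hypothetical starting point for that workflow — the file name, schedule, and action versions are assumptions; org.owasp:dependency-check-maven is the OWASP-maintained Maven plugin invoked here:

```yaml
# .github/workflows/security-scan.yml (sketch)
name: dependency-security-scan
on:
  pull_request:
  schedule:
    - cron: "0 6 * * 1"   # weekly, Monday 06:00 UTC
jobs:
  dependency-check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-java@v4
        with:
          distribution: temurin
          java-version: "11"
      - name: OWASP Dependency-Check
        run: mvn -B org.owasp:dependency-check-maven:check
```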
Refactor FilterImpl and RouteImpl into separate concrete implementations with shared base
src/main/java/spark/ contains both FilterImpl.java and RouteImpl.java alongside their interfaces (Filter.java, Route.java), plus ResponseTransformerRouteImpl.java and TemplateViewRouteImpl.java. These implementations likely share common logic but are implemented separately. Extracting a BaseRouteImpl or AbstractRouteImpl would reduce duplication and make the codebase more maintainable. This is a specific refactoring opportunity visible in the file structure.
- [ ] Analyze FilterImpl.java, RouteImpl.java, ResponseTransformerRouteImpl.java, and TemplateViewRouteImpl.java for shared logic
- [ ] Create AbstractRouteImpl base class with common functionality
- [ ] Update FilterImpl and RouteImpl to extend the new abstract class
- [ ] Verify ResponseTransformerRouteImpl and TemplateViewRouteImpl can also leverage the base class
- [ ] Add unit tests to ensure refactoring maintains behavior parity
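The proposed shape, reduced to a self-contained pure-Java sketch — class and method names here are illustrative assumptions, not Spark's actual internals:

```java
import java.util.function.BiFunction;

// Hypothetical shared base holding state that the concrete
// *Impl classes would otherwise duplicate.
abstract class AbstractRouteImpl {
    private final String path;
    private final String acceptType;

    protected AbstractRouteImpl(String path, String acceptType) {
        this.path = path;
        this.acceptType = acceptType;
    }

    public String getPath() { return path; }
    public String getAcceptType() { return acceptType; }

    public abstract Object handle(String request, String response);
}

// One concrete implementation delegating to a handler lambda,
// standing in for RouteImpl/FilterImpl in this sketch.
class SimpleRouteImpl extends AbstractRouteImpl {
    private final BiFunction<String, String, Object> handler;

    SimpleRouteImpl(String path, BiFunction<String, String, Object> handler) {
        super(path, "*/*");
        this.handler = handler;
    }

    @Override
    public Object handle(String request, String response) {
        return handler.apply(request, response);
    }
}
```

The real refactor would move path matching and accept-type state into the base while each subclass keeps only its invocation logic.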
🌿Good first issues
- Add unit test coverage for src/main/java/spark/QueryParamsMap.java (currently untested; critical for query string parsing)
- Document the Service class in a new examples/MultipleInstances.java showing how to run 2+ Spark servers on different ports (solves testing pain point)
- Implement missing HTTP method helpers: patch(), options(), head() as static methods in Spark.java with examples in README (low complexity, high utility)
⭐Top contributors
- @perwendel — 80 commits
- Joao Alexandre Mendonca Marson — 4 commits
- @dustinkredmond — 2 commits
- @xrl2408 — 2 commits
- @chenzhang22 — 1 commit
📝Recent commits
- 1973e40 — Update README.md (perwendel)
- 053c278 — Bumping to newest version of jetty (#1261) (perwendel)
- cd0a6ce — Improve Travis CI build Performance (#1246) (chenzhang22)
- 4f4b210 — add build status (#1258) (Clivern)
- 584b17f — fix(JeetySecurity): upgrade Jetty-Server (#1241) (mcgivrer)
- 54079b0 — updated changeset (perwendel)
- 8fb8619 — updated changeset (perwendel)
- 71e383c — fix for #1146 (#1202) (perwendel)
- 11779c7 — fixed incorrect javadocs (perwendel)
- 30762ea — fixed errors in changeset (perwendel)
🔒Security observations
- High · Outdated Jetty dependency — pom.xml, jetty.version property. The project uses Jetty 9.4.48.v20220622. The Jetty 9.4.x line has reached end of community support and has known security vulnerabilities; the 10.x and 11.x series receive active security updates. Fix: update to a supported Jetty release, review release notes for breaking changes, and test thoroughly after the upgrade.
- High · Outdated SLF4J dependency — pom.xml, slf4j-api and slf4j-simple at version 1.7.25. The 1.7.x line has reached end-of-life; the 2.x series contains important bug fixes and security improvements. Fix: upgrade to SLF4J 2.x, ensure compatibility with other logging frameworks in use, and test the upgrade thoroughly.
- High · Severely outdated Mockito test dependency — pom.xml, mockito.version property (1.10.19). Mockito 1.10.19 is many years old and contains known vulnerabilities. It is only used in tests, but vulnerable test dependencies still pose risks in development and CI/CD environments. Fix: upgrade to Mockito 4.x or 5.x.
- High · Outdated PowerMock test dependency — pom.xml, powermock.version property (1.7.4). PowerMock 1.7.4 is no longer actively maintained and has known vulnerabilities and compatibility issues with modern Java versions. Fix: consider replacing PowerMock with Mockito 4.x+ inline mocking, or refactor tests to reduce the need for static/class mocking.
- Medium · Incomplete pom.xml in the analyzed snapshot — the pom.xml RepoPilot analyzed ends with '<depe', indicating truncation (an incomplete dependency declaration for slf4j-simple). This more likely reflects a truncated analysis input than a broken file in the repo. Fix: verify the repository's actual pom.xml is intact and that all dependencies are properly declared and closed.
- Medium · No security-header support visible — src/main/java/spark/ (framework core). Static analysis shows no built-in security-header middleware or secure defaults, so applications built with Spark may be vulnerable to common web attacks unless developers add security headers explicitly. Fix: document security best practices for Spark users, and consider built-in support for headers such as HSTS, CSP, and X-Frame-Options, or provide example filters.
- Medium · No input-validation framework — src/main/java/spark/Request.java, src/main/java/spark/QueryParamsMap.java. Request handling exposes raw parameters without built-in validation, leaving protection against injection and malformed input to application code. Fix: document how input validation should be performed in Spark applications, with examples for query parameters, form data, and path variables.
- Low · Legacy Java version target — pom.xml, java.version property (1.8). The project targets Java 8, an aging baseline (Oracle's extended support for it runs through December 2030) that lacks newer platform features such as modules, records, sealed classes, and pattern matching. Fix: consider raising the minimum to Java 11 LTS or later; this may require compatibility testing and code updates.
LLM-derived; treat as a starting point, not a security audit.
👉Where to read next
- Open issues — current backlog
- Recent PRs — what's actively shipping
- Source on GitHub
Generated by RepoPilot. Verdict based on maintenance signals — see the live page for receipts. Re-run on a new commit to refresh.