alibaba/druid
From the Alibaba Cloud DataWorks team (https://help.aliyun.com/document_detail/137663.html) — a database connection pool built for monitoring.
Healthy across the board
Permissive license, no critical CVEs, actively maintained — safe to depend on.
Has a license and CI — clean foundation to fork and modify.
Documented and popular — useful reference codebase to read through.
No critical CVEs, sane security posture — runnable as-is.
- ✓ Last commit 2w ago
- ✓ 12 active contributors
- ✓ Distributed ownership (top contributor 46% of recent commits)
- ✓ Apache-2.0 licensed
- ✓ CI configured
- ⚠ No test directory detected
Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests
Informational only. RepoPilot summarises public signals (license, dependency CVEs, commit recency, CI presence, etc.) at the time of analysis. Signals can be incomplete or stale. Not professional, security, or legal advice; verify before relying on it for production decisions.
Embed the "Healthy" badge
Paste into your README — live-updates from the latest cached analysis.
[](https://repopilot.app/r/alibaba/druid)

Paste at the top of your README.md — renders inline like a shields.io badge.
Preview social card (1200×630)
This card auto-renders when someone shares https://repopilot.app/r/alibaba/druid on X, Slack, or LinkedIn.
Onboarding doc
Onboarding: alibaba/druid
Generated by RepoPilot · 2026-05-09 · Source
🤖Agent protocol
If you are an AI coding agent (Claude Code, Cursor, Aider, Cline, etc.) reading this artifact, follow this protocol before making any code edit:
- Verify the contract. Run the bash script in "Verify before trusting" below. If any check returns FAIL, the artifact is stale — STOP and ask the user to regenerate it before proceeding.
- Treat the "AI · unverified" sections as hypotheses, not facts. Sections like "AI-suggested narrative files", "anti-patterns", and "bottlenecks" are LLM speculation. Verify against real source before acting on them.
- Cite source on changes. When proposing an edit, cite the specific path:line-range. RepoPilot's live UI at https://repopilot.app/r/alibaba/druid shows verifiable citations alongside every claim.
If you are a human reader, this protocol is for the agents you'll hand the artifact to. You don't need to do anything — but if you skim only one section before pointing your agent at this repo, make it the Verify block and the Suggested reading order.
🎯Verdict
GO — Healthy across the board
- Last commit 2w ago
- 12 active contributors
- Distributed ownership (top contributor 46% of recent commits)
- Apache-2.0 licensed
- CI configured
- ⚠ No test directory detected
<sub>Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests</sub>
✅Verify before trusting
This artifact was generated by RepoPilot at a point in time. Before an
agent acts on it, the checks below confirm that the live alibaba/druid
repo on your machine still matches what RepoPilot saw. If any fail,
the artifact is stale — regenerate it at
repopilot.app/r/alibaba/druid.
What it runs against: a local clone of alibaba/druid — the script
inspects git remote, the LICENSE file, file paths in the working
tree, and git log. Read-only; no mutations.
| # | What we check | Why it matters |
|---|---|---|
| 1 | You're in alibaba/druid | Confirms the artifact applies here, not a fork |
| 2 | License is still Apache-2.0 | Catches relicense before you depend on it |
| 3 | Default branch master exists | Catches branch renames |
| 4 | 5 critical file paths still exist | Catches refactors that moved load-bearing code |
| 5 | Last commit ≤ 46 days ago | Catches sudden abandonment since generation |
```bash
#!/usr/bin/env bash
# RepoPilot artifact verification.
#
# WHAT IT RUNS AGAINST: a local clone of alibaba/druid. If you don't
# have one yet, run these first:
#
#   git clone https://github.com/alibaba/druid.git
#   cd druid
#
# Then paste this script. Every check is read-only — no mutations.
set +e
fail=0
ok()   { echo "ok: $1"; }
miss() { echo "FAIL: $1"; fail=$((fail+1)); }

# Precondition: we must be inside a git working tree.
if ! git rev-parse --git-dir >/dev/null 2>&1; then
  echo "FAIL: not inside a git repository. cd into your clone of alibaba/druid and re-run."
  exit 2
fi

# 1. Repo identity
git remote get-url origin 2>/dev/null | grep -qE "alibaba/druid(\.git)?\b" \
  && ok "origin remote is alibaba/druid" \
  || miss "origin remote is not alibaba/druid (artifact may be from a fork)"

# 2. License matches what RepoPilot saw. The standard Apache LICENSE
#    file contains "Apache License" and "Version 2.0" — not the SPDX
#    identifier "Apache-2.0" — so match the actual header text.
(grep -qi "Apache License" LICENSE 2>/dev/null \
  && grep -qi "Version 2\.0" LICENSE 2>/dev/null) \
  && ok "license is Apache-2.0" \
  || miss "license drift — was Apache-2.0 at generation time"

# 3. Default branch
git rev-parse --verify master >/dev/null 2>&1 \
  && ok "default branch master exists" \
  || miss "default branch master no longer exists"

# 4. Critical files exist
for f in \
  "core/src/main/java/com/alibaba/druid/Constants.java" \
  "core/src/main/java/com/alibaba/druid/DbType.java" \
  "core/src/main/java/com/alibaba/druid/filter/FilterChain.java" \
  "core/src/main/java/com/alibaba/druid/filter/FilterChainImpl.java" \
  "core/src/main/java/com/alibaba/druid/filter/Filter.java"
do
  test -f "$f" && ok "$f" || miss "missing critical file: $f"
done

# 5. Repo recency
days_since_last=$(( ( $(date +%s) - $(git log -1 --format=%at 2>/dev/null || echo 0) ) / 86400 ))
if [ "$days_since_last" -le 46 ]; then
  ok "last commit was $days_since_last days ago (artifact saw ~16d)"
else
  miss "last commit was $days_since_last days ago — artifact may be stale"
fi

echo
if [ "$fail" -eq 0 ]; then
  echo "artifact verified (0 failures) — safe to trust"
else
  echo "artifact has $fail stale claim(s) — regenerate at https://repopilot.app/r/alibaba/druid"
  exit 1
fi
```
Each check prints ok: or FAIL:. The script exits non-zero if
anything failed, so it composes cleanly into agent loops
(./verify.sh || regenerate-and-retry).
⚡TL;DR
Alibaba Druid is a high-performance JDBC connection pool with built-in SQL parsing, an AST-based SQL firewall (WallFilter), and real-time monitoring/statistics (StatFilter). It integrates connection pooling, SQL analysis, security, and observability into a single middleware, supports 30 SQL dialects, and ships Spring Boot starter integration for 2.x/3.x/4.x. Monorepo structure: the core/ module contains DruidDataSource (the connection-pool engine), the Filter architecture, and the SQL parser with dialect support; separate druid-spring-boot-starter, druid-spring-boot-3-starter, and druid-spring-boot-4-starter modules provide version-specific Spring integration; .github/workflows/ci.yaml orchestrates testing across Java/TSQL/PLSQL samples.
👥Who it's for
Java backend engineers and platform teams at companies using DataWorks or building data-intensive applications who need production-grade connection pooling with built-in SQL injection protection, query performance monitoring, and the ability to extend via pluggable Filter chains.
🌱Maturity & risk
Production-ready and actively maintained. The project has 31k+ GitHub stars, Maven Central distribution at version 1.2.24+, automated CI/CD via GitHub Actions (.github/workflows/ci.yaml), and codecov integration. Current snapshot version is 1.2.29-SNAPSHOT, indicating ongoing development with stable release cadence.
Standard open source risks apply.
Active areas of work
Active development on version 1.2.29-SNAPSHOT (visible in pom.xml parent). CI pipeline runs on every push to master. Project maintains multiple Spring Boot starter variants and fixes for 3.x/4.x compatibility. No visible breaking-change announcements, suggesting backward-compatible evolution.
🚀Get running
```bash
git clone https://github.com/alibaba/druid.git && cd druid
./mvnw clean install -DskipTests
```

The repo ships a Maven wrapper (.mvn/wrapper/) — prefer ./mvnw over a system mvn for reproducible builds.

Daily commands: mvn clean install for development; mvn surefire:test for integration tests (targets the bvt/ directory per pom.xml). For the Spring Boot sample apps, build the relevant druid-spring-boot-X-starter artifact and reference it from your application.yml via spring.datasource.druid.* config keys.
🗺️Map of the codebase
- core/src/main/java/com/alibaba/druid/Constants.java — Core constants and configuration values used throughout the connection pool and SQL parser; essential for understanding how Druid is configured and initialized
- core/src/main/java/com/alibaba/druid/DbType.java — Database type enumeration defining supported databases; critical for understanding SQL dialect support and database-specific behavior throughout the codebase
- core/src/main/java/com/alibaba/druid/filter/FilterChain.java — Core filter chain interface defining the interceptor pattern used for monitoring, security, and SQL parsing; fundamental to Druid's extensibility architecture
- core/src/main/java/com/alibaba/druid/filter/FilterChainImpl.java — Default implementation of the filter chain; every JDBC operation flows through it, making it essential for understanding the request lifecycle and monitoring
- core/src/main/java/com/alibaba/druid/filter/Filter.java — Base interface for all filters; understanding it is a prerequisite for implementing custom filters for monitoring, config, or security enhancements
- core/src/main/java/com/alibaba/druid/filter/FilterManager.java — Central manager for filter registration and auto-loading; critical for understanding how Druid discovers and initializes monitoring and security filters
- core/pom.xml — Maven POM defining the core module's dependencies, Java version (1.8+), and build configuration; essential for the build process and minimum requirements
🛠️How to make changes
Add a Custom Filter for Monitoring or Security
- Create a new Java class that extends FilterAdapter in core/src/main/java/com/alibaba/druid/filter/ (core/src/main/java/com/alibaba/druid/filter/FilterAdapter.java)
- Override the relevant callback methods (e.g., connection_connect, statement_execute) to intercept JDBC operations (core/src/main/java/com/alibaba/druid/filter/Filter.java)
- If auto-loading is needed, annotate the class with @AutoLoad to enable service-loader discovery (core/src/main/java/com/alibaba/druid/filter/AutoLoad.java)
- Add the custom filter to the filter chain via FilterManager.getFilters() or datasource configuration (core/src/main/java/com/alibaba/druid/filter/FilterManager.java)
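The steps above follow Druid's chain-of-responsibility Filter design: each filter intercepts an operation, does its work, and delegates to the rest of the chain. A self-contained sketch of that shape — the names (MiniFilter, MiniChain) and the String-returning signature are illustrative stand-ins, not Druid's actual API, which passes JDBC proxy objects through the chain:

```java
import java.util.List;

// Minimal stand-in for Druid's Filter/FilterChain pattern: each filter
// may observe or short-circuit an operation, then delegate to the rest
// of the chain. Names are illustrative, not Druid's real API.
interface MiniFilter {
    String statementExecute(MiniChain chain, String sql);
}

class MiniChain {
    private final List<MiniFilter> filters;
    private final int pos;

    MiniChain(List<MiniFilter> filters) { this(filters, 0); }
    private MiniChain(List<MiniFilter> filters, int pos) {
        this.filters = filters;
        this.pos = pos;
    }

    // Delegate to the next filter, or "hit the driver" at the chain's end.
    String statementExecute(String sql) {
        if (pos < filters.size()) {
            return filters.get(pos).statementExecute(new MiniChain(filters, pos + 1), sql);
        }
        return "executed: " + sql;
    }
}

// A logging filter in the spirit of extending FilterAdapter: override
// one callback, do its work around the call, and pass control onward.
class LoggingFilter implements MiniFilter {
    @Override
    public String statementExecute(MiniChain chain, String sql) {
        System.out.println("before: " + sql);
        String result = chain.statementExecute(sql);
        System.out.println("after:  " + sql);
        return result;
    }
}

public class FilterChainSketch {
    public static void main(String[] args) {
        MiniChain chain = new MiniChain(List.of(new LoggingFilter()));
        System.out.println(chain.statementExecute("SELECT 1"));
    }
}
```

In real Druid code, the equivalent of LoggingFilter would extend FilterAdapter and be wired in via FilterManager or the datasource's filters property.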
Add Support for a New Database Type
- Add a new enum constant to the DbType enumeration for the new database (core/src/main/java/com/alibaba/druid/DbType.java)
- Update Constants.java with database-specific default configuration values such as timeouts and driver class names (core/src/main/java/com/alibaba/druid/Constants.java)
- If the SQL dialect differs, create or enhance SQL parser rules for the new database variant (core/src/main/java/com/alibaba/druid/DbType.java)
Add Configuration Encryption/Decryption Support
- Review the existing encryption implementation in ConfigTools for algorithm and key-management patterns (core/src/main/java/com/alibaba/druid/filter/config/ConfigTools.java)
- Extend or customize ConfigFilter to define which properties require decryption in the datasource configuration (core/src/main/java/com/alibaba/druid/filter/config/ConfigFilter.java)
- Ensure encrypted values are marked with a prefix (e.g., 'enc:') so ConfigFilter recognizes them (core/src/main/java/com/alibaba/druid/filter/config/ConfigFilter.java)
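The prefix convention in the last step can be sketched as a small helper. The 'enc:' marker and the Base64 "cipher" below are placeholders for illustration only — Druid's real ConfigTools performs actual key-pair-based encryption, not encoding:

```java
import java.util.Base64;
import java.util.Properties;

// Sketch of prefix-gated decryption in the spirit of ConfigFilter: only
// values carrying a marker prefix are decoded; everything else passes
// through untouched. The "enc:" prefix and Base64 "cipher" are
// placeholders — real config encryption uses an actual cryptosystem.
public class PrefixDecryptSketch {
    static final String PREFIX = "enc:";

    static String resolve(String value) {
        if (value != null && value.startsWith(PREFIX)) {
            byte[] raw = Base64.getDecoder().decode(value.substring(PREFIX.length()));
            return new String(raw);
        }
        return value; // plain values are returned as-is
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("password",
                PREFIX + Base64.getEncoder().encodeToString("s3cret".getBytes()));
        props.setProperty("username", "app");
        System.out.println(resolve(props.getProperty("password"))); // s3cret
        System.out.println(resolve(props.getProperty("username"))); // app
    }
}
```

The design point is that the marker makes decryption opt-in per property, so non-sensitive values never round-trip through the crypto layer.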
🔧Why these technologies
- Java JDBC — Druid is a JDBC datasource implementation; JDBC is the standard Java database access API and enables database agnosticity
- Filter/Interceptor Pattern — Allows pluggable monitoring, security, and SQL parsing without modifying core pool logic; enables extensibility
- Service Loader (AutoLoad) — Enables automatic discovery and initialization of custom filters at runtime without explicit configuration
- Maven — Standard Java build and dependency management; allows multi-module structure (core, sql, etc.)
⚖️Trade-offs already made
- Heavyweight feature set (pool + SQL parser + monitoring + security in one library)
  - Why: Alibaba's DataWorks platform needed deep monitoring and SQL analysis; integrating them reduces dependencies and improves coordination
  - Consequence: larger JAR footprint and surface area, and a steeper learning curve — but an integrated view of database behavior
- Synchronous filter chain with blocking callbacks
  - Why: simplicity and predictability; easier to reason about execution order and state
  - Consequence: filter execution is sequential and can become a bottleneck under extreme throughput; async monitoring is deferred to separate frameworks
- Configuration stored in properties or annotations rather than separate config files
  - Why: matches Spring Boot conventions and simplifies deployment; reduces file-management overhead
  - Consequence: configuration is embedded in the application; less flexible for runtime changes without a restart
🚫Non-goals (don't propose these)
- Real-time streaming analytics (monitoring is aggregated, not event-driven)
- Cross-database transactions and distributed ACID (single-database JDBC pools only)
- ORM functionality (Druid is a low-level JDBC pool, not a data access framework)
- Runtime SQL optimization or query rewriting (SQL parsing is for analysis and security, not optimizer)
- Automatic connection migration or failover to replicas (pool manages single endpoint)
🪤Traps & gotchas
- Filter ordering matters: StatFilter should wrap WallFilter if you want to capture stats for blocked SQL.
- PSCache (the PreparedStatement cache) assumes matching PreparedStatement usage patterns; a misconfigured cache size can cause cache thrashing.
- The Spring Boot starter's auto-configuration only activates if spring-boot-starter-jdbc is on the classpath.
- WallFilter has dialect-specific AST behavior; MySQL grammar differs from PostgreSQL — check the database profile in your WallConfig.
- Default pool validation queries vary by database (SELECT 1 works for MySQL and PostgreSQL, but Oracle needs SELECT 1 FROM DUAL); an incorrect validation query causes connection churn.
🏗️Architecture
💡Concepts to learn
- Abstract Syntax Tree (AST) for SQL — Druid parses SQL into AST to enable security checks (WallFilter), formatting, rewriting, and dialect-aware optimization without regex hacks
- PreparedStatement Caching (PSCache) — Druid's PSCache reuses compiled prepared statements per connection, reducing parse overhead; requires understanding statement key matching and cache invalidation
- Filter Chain pattern — Druid's pluggable Filter architecture (StatFilter, WallFilter, custom filters) chains request/response interception similar to servlet filters but for JDBC operations
- Connection Pool State Machine — DruidDataSource manages ACTIVE/WAIT/IDLE connection states, KeepAlive background tasks, and idle timeout eviction; critical for understanding memory leaks and pool exhaustion
- SQL Injection via AST-based validation — WallFilter's black/white list rules operate on AST nodes (e.g., BlockStatement detection), not string patterns, making bypasses harder than regex-based WAF
- Slow SQL detection and collection — StatFilter tracks SQL execution duration, groups by normalized query, and surfaces top slow queries for performance tuning; foundational for DataWorks observability
- Multi-database dialect support — Druid's parser handles MySQL, PostgreSQL, Oracle, SQL Server, MariaDB, Hive, ClickHouse syntax differences in one AST model; requires dialect-aware lexer/grammar
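StatFilter's grouping by normalized query can be illustrated with a toy normalizer that collapses literals to placeholders, so structurally identical queries share one statistics key. The regex version below is a sketch only — Druid normalizes through its SQL parser's AST, which handles quoting, comments, and dialect edge cases that regexes cannot:

```java
import java.util.regex.Pattern;

// Toy SQL normalizer: collapse string and numeric literals to '?' so
// structurally identical queries group under one key. A regex sketch
// only — Druid's StatFilter derives the key from the parsed AST.
public class SqlNormalizeSketch {
    private static final Pattern STRING_LIT = Pattern.compile("'[^']*'");
    private static final Pattern NUMBER_LIT = Pattern.compile("\\b\\d+\\b");

    static String normalize(String sql) {
        String s = STRING_LIT.matcher(sql).replaceAll("?");
        return NUMBER_LIT.matcher(s).replaceAll("?");
    }

    public static void main(String[] args) {
        System.out.println(normalize("SELECT * FROM t WHERE id = 42"));
        System.out.println(normalize("SELECT * FROM t WHERE id = 7"));
        // both print: SELECT * FROM t WHERE id = ?
    }
}
```

With a key like this, a slow-query collector only needs a map from normalized SQL to aggregate duration/count — which is essentially what StatFilter surfaces in its top-slow-queries view.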
🔗Related repos
- alibaba/canal — complementary Alibaba project for MySQL binlog parsing; often used with Druid for real-time data sync and change-data-capture pipelines
- apache/commons-dbcp2 — competing Apache connection pool; lighter-weight, but lacks Druid's monitoring, SQL parsing, and WallFilter security features
- brettwooldridge/HikariCP — industry-standard high-performance pool (Spring Boot's default); Druid differentiates via SQL AST analysis and StatFilter observability
- sharding-sphere/sharding-jdbc — distributed SQL sharding layer (now Apache ShardingSphere); often layered on top of Druid for transparent multi-shard routing
- alibaba/DataX — Alibaba's heterogeneous data-sync tool; uses Druid as its pooling layer for JDBC sources in ETL workflows
🪄PR ideas
To work on one of these in Claude Code or Cursor, paste:
Implement the "<title>" PR idea from CLAUDE.md, working through the checklist as the task list.
Add comprehensive integration tests for database-specific connection pool behaviors
Druid is a monitoring-focused database connection pool. The current test suite appears limited (only bvt/** tests configured in maven-surefire-plugin). Add database-specific integration tests for MySQL, PostgreSQL, Oracle, and SQL Server to validate connection pooling, timeout handling, and monitoring metrics. This ensures reliability across different database vendors and catches regressions early.
- [ ] Create src/test/java/com/alibaba/druid/integration/mysql/DruidMySQLConnectionPoolTest.java with connection lifecycle tests
- [ ] Create src/test/java/com/alibaba/druid/integration/postgresql/DruidPostgresConnectionPoolTest.java
- [ ] Add docker-compose.yml or testcontainers setup in src/test/resources/ for test database infrastructure
- [ ] Update .github/workflows/ci.yaml to run integration tests against containerized databases
- [ ] Document in README.md how to run integration tests locally
Add GitHub Action for automated dependency vulnerability scanning and updates
The repo uses Maven with multiple dependencies (Spring 4.3.20, JUnit Jupiter 5.11.4) but has no automated vulnerability scanning or dependency update workflow. Add Dependabot or similar to detect security issues and outdated dependencies, with automated PR creation. This is critical for a connection pool library used in production systems.
- [ ] Create .github/dependabot.yml with Maven ecosystem configuration and security-updates grouping
- [ ] Create .github/workflows/security-scan.yaml with OWASP Dependency-Check or Snyk integration
- [ ] Add branch protection rule requiring security scan checks to pass before merge
- [ ] Document security policy in SECURITY.md with vulnerability disclosure process
Add comprehensive monitoring metrics documentation and example implementations
Druid is explicitly described as built 'for monitoring' (为监控而生), but the README lacks specific documentation on available metrics, monitoring integrations, and example code. Create concrete examples showing Prometheus metrics export, JMX monitoring setup, and custom metric collection patterns. This reduces friction for users trying to integrate Druid monitoring into their observability stack.
- [ ] Create docs/MONITORING.md documenting all available metrics (connection pool size, wait time, error rates, etc.)
- [ ] Create src/test/java/com/alibaba/druid/examples/MonitoringPrometheusExample.java with Prometheus exporter setup
- [ ] Create src/test/java/com/alibaba/druid/examples/MonitoringJMXExample.java with JMX MBean registration
- [ ] Add configuration examples to README.md linking to docs/MONITORING.md
- [ ] Add metrics to .github/ISSUE_TEMPLATE/bug_report.yml template to capture monitoring-related issues
🌿Good first issues
- Add JMX metric export for MeterRegistry in StatFilter (currently Web console only) to integrate with Prometheus/OpenTelemetry in monitoring-conscious teams.
- Expand SQL dialect coverage: add test cases in src/test/java/ for recent Postgres 16+ window function syntax or MySQL 8.0 JSON path expressions, which parse but don't format cleanly.
- Document Filter chaining best practices with runnable examples in examples/ folder — currently only inline code snippets exist; build a complete SpringBoot + WallFilter + StatFilter + custom LoggingFilter demo.
⭐Top contributors
- @wenshao — 46 commits
- @lingo-xp — 37 commits
- @Gladys-111 — 5 commits
- @cheetahburg — 3 commits
- @WuTaoyu — 2 commits
📝Recent commits
- 2790bd7 — fix: add LISTAGG to aggregate functions for WITHIN GROUP parsing in redshift. (Gladys-111)
- 08ab719 — feat: fix sr hint issues (#6632) (lingo-xp)
- dbbf4ae — fix: WallFilter comprehensive audit — 13 bug fixes and architecture improvements (#6623) (wenshao)
- d9f9e2c — fix: SQL Parser comprehensive audit — 30 bug fixes across Lexer, AST, Visitor, and multi-dialect parsers (#6622) (wenshao)
- 08cc4ab — feat: enhance BigQuery dialect with EXPORT DATA, CREATE SCHEMA/PROCEDURE support (wenshao)
- 12f4184 — docs: fix issues found in code review (wenshao)
- 8ac63fa — docs: fix errors found in third documentation audit (wenshao)
- c66fafd — docs: fix errors found in second documentation audit (wenshao)
- 3de5fdb — docs: fix errors found during documentation audit (wenshao)
- 65fc990 — docs: rename CLAUDE.md to AGENTS.md (wenshao)
🔒Security observations
The Alibaba Druid database connection pool codebase shows moderate security maturity with a defined vulnerability reporting program. However, critical security concerns exist: the Spring Framework dependency is severely outdated (4.3.20.RELEASE from 2018) with known CVEs affecting RCE and authentication, and the project uses SNAPSHOT versions unsuitable for production. The Java 1.8 target is legacy. The codebase would benefit from: (1) immediate Spring Framework upgrade to 5.3.x or 6.x; (2) release of stable non-SNAPSHOT versions; (3) Java version upgrade to LTS 11+; (4) comprehensive Software Composition Analysis (SCA) integration in CI/CD; (5) dependency audit and regular update processes. The existing SECURITY.md vulnerability reporting channel is well-established and provides clear guidance for responsible disclosure.
- High · Outdated Spring Framework Dependency — core/pom.xml, spring.version property. The POM specifies Spring Framework 4.3.20.RELEASE, released in 2018 and no longer actively maintained. This line contains known security vulnerabilities in the Spring ecosystem (e.g., CVE-2018-1258, CVE-2020-5410) that could lead to RCE, information disclosure, or authentication bypass. Fix: upgrade to Spring Framework 5.3.x or 6.x to receive security patches; review application compatibility before upgrading.
- Medium · SNAPSHOT Version in Production Build — core/pom.xml, version 1.2.29-SNAPSHOT. The project version is 1.2.29-SNAPSHOT, indicating unreleased development code. SNAPSHOT versions should never be used in production, as they may contain incomplete features, untested code, and security issues. Fix: use stable released (non-SNAPSHOT) versions in production; reserve SNAPSHOTs for development and testing.
- Medium · JUnit 5 Dependency Version Pinning — core/pom.xml, junit-jupiter.version property. junit-jupiter.version is pinned to 5.11.4. While not critically vulnerable, pinned test dependencies should be reviewed for security updates and JDK compatibility. Fix: regularly update JUnit and other test dependencies; consider dependency-management tooling to monitor for security updates.
- Medium · Insufficient Maven Plugin Configuration — core/pom.xml, build/plugins section. The maven-surefire-plugin configuration appears incomplete (truncated in the build section). Incomplete configurations may lead to tests being skipped or not properly validated, potentially hiding security issues. Fix: complete and verify all Maven plugin configurations; ensure security-related tests run in the CI/CD pipeline.
- Low · Java Version Target Compatibility — core/pom.xml, jdk.version property. The project targets JDK 1.8, an older Java version. While still widely supported, newer Java versions offer improved security features and performance. Fix: consider upgrading to an LTS release (JDK 11 or later); maintain backward-compatibility documentation if needed.
- Low · Missing Security Headers and Configuration — project documentation and configuration files. No explicit security-related configuration guidance is visible in the file structure. For a connection-pool library this is less critical, but documenting security best practices is recommended. Fix: document secure-configuration guidelines, authentication mechanisms, and recommendations for secure database connectivity.
- Low · Incomplete SECURITY.md Vulnerability Scope — SECURITY.md. The file defines a vulnerability-collection scope but does not explicitly address dependency or supply-chain vulnerabilities, which are increasingly important attack vectors. Fix: expand the scope to cover supply-chain security and transitive-dependency risks; integrate Software Composition Analysis (SCA) tooling into CI/CD.
LLM-derived; treat as a starting point, not a security audit.
👉Where to read next
- Open issues — current backlog
- Recent PRs — what's actively shipping
- Source on GitHub
Generated by RepoPilot. Verdict based on maintenance signals — see the live page for receipts. Re-run on a new commit to refresh.