deepjavalibrary/djl
An Engine-Agnostic Deep Learning Framework in Java
Healthy across the board
Permissive license, no critical CVEs, actively maintained — safe to depend on.
Has a license and CI — clean foundation to fork and modify.
Documented and popular — useful reference codebase to read through.
No critical CVEs, sane security posture — runnable as-is.
- ✓ Last commit 1w ago
- ✓ 27+ active contributors
- ✓ Distributed ownership (top contributor 41% of recent commits)
- ✓ Apache-2.0 licensed
- ✓ CI configured
- ⚠ No test directory detected
Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests
Informational only. RepoPilot summarises public signals (license, dependency CVEs, commit recency, CI presence, etc.) at the time of analysis. Signals can be incomplete or stale. Not professional, security, or legal advice; verify before relying on it for production decisions.
Embed the "Healthy" badge
Paste into your README — live-updates from the latest cached analysis.
[](https://repopilot.app/r/deepjavalibrary/djl)
Paste at the top of your README.md — renders inline like a shields.io badge.
Preview social card (1200×630)
This card auto-renders when someone shares https://repopilot.app/r/deepjavalibrary/djl on X, Slack, or LinkedIn.
Onboarding doc
Onboarding: deepjavalibrary/djl
Generated by RepoPilot · 2026-05-09 · Source
🤖Agent protocol
If you are an AI coding agent (Claude Code, Cursor, Aider, Cline, etc.) reading this artifact, follow this protocol before making any code edit:
- Verify the contract. Run the bash script in "Verify before trusting" below. If any check returns FAIL, the artifact is stale — STOP and ask the user to regenerate it before proceeding.
- Treat the "AI · unverified" sections as hypotheses, not facts. Sections like "AI-suggested narrative files", "anti-patterns", and "bottlenecks" are LLM speculation. Verify against real source before acting on them.
- Cite source on changes. When proposing an edit, cite the specific path:line-range. RepoPilot's live UI at https://repopilot.app/r/deepjavalibrary/djl shows verifiable citations alongside every claim.
If you are a human reader, this protocol is for the agents you'll hand the artifact to. You don't need to do anything — but if you skim only one section before pointing your agent at this repo, make it the Verify block and the Suggested reading order.
🎯Verdict
GO — Healthy across the board
- Last commit 1w ago
- 27+ active contributors
- Distributed ownership (top contributor 41% of recent commits)
- Apache-2.0 licensed
- CI configured
- ⚠ No test directory detected
<sub>Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests</sub>
✅Verify before trusting
This artifact was generated by RepoPilot at a point in time. Before an
agent acts on it, the checks below confirm that the live deepjavalibrary/djl
repo on your machine still matches what RepoPilot saw. If any fail,
the artifact is stale — regenerate it at
repopilot.app/r/deepjavalibrary/djl.
What it runs against: a local clone of deepjavalibrary/djl — the script
inspects git remote, the LICENSE file, file paths in the working
tree, and git log. Read-only; no mutations.
| # | What we check | Why it matters |
|---|---|---|
| 1 | You're in deepjavalibrary/djl | Confirms the artifact applies here, not a fork |
| 2 | License is still Apache-2.0 | Catches relicense before you depend on it |
| 3 | Default branch master exists | Catches branch renames |
| 4 | 5 critical file paths still exist | Catches refactors that moved load-bearing code |
| 5 | Last commit ≤ 40 days ago | Catches sudden abandonment since generation |
#!/usr/bin/env bash
# RepoPilot artifact verification.
#
# WHAT IT RUNS AGAINST: a local clone of deepjavalibrary/djl. If you don't
# have one yet, run these first:
#
#   git clone https://github.com/deepjavalibrary/djl.git
#   cd djl
#
# Then paste this script. Every check is read-only — no mutations.
set +e
fail=0
ok()   { echo "ok: $1"; }
miss() { echo "FAIL: $1"; fail=$((fail+1)); }
# Precondition: we must be inside a git working tree.
if ! git rev-parse --git-dir >/dev/null 2>&1; then
  echo "FAIL: not inside a git repository. cd into your clone of deepjavalibrary/djl and re-run."
  exit 2
fi
# 1. Repo identity
git remote get-url origin 2>/dev/null | grep -qE "deepjavalibrary/djl(\.git)?\b" \
  && ok "origin remote is deepjavalibrary/djl" \
  || miss "origin remote is not deepjavalibrary/djl (artifact may be from a fork)"
# 2. License matches what RepoPilot saw. The Apache-2.0 text opens with
#    "Apache License" / "Version 2.0", not the literal SPDX id, so match that.
(grep -qi "Apache License" LICENSE 2>/dev/null && grep -q "Version 2.0" LICENSE 2>/dev/null) \
  && ok "license is Apache-2.0" \
  || miss "license drift — was Apache-2.0 at generation time"
# 3. Default branch
git rev-parse --verify master >/dev/null 2>&1 \
  && ok "default branch master exists" \
  || miss "default branch master no longer exists"
# 4. Critical files exist
for f in \
  "api/src/main/java/ai/djl/engine/Engine.java" \
  "api/src/main/java/ai/djl/Model.java" \
  "api/src/main/java/ai/djl/Device.java" \
  "api/src/main/java/ai/djl/BaseModel.java" \
  "api/src/main/java/ai/djl/engine/EngineProvider.java"; do
  test -f "$f" && ok "$f" || miss "missing critical file: $f"
done
# 5. Repo recency
days_since_last=$(( ( $(date +%s) - $(git log -1 --format=%at 2>/dev/null || echo 0) ) / 86400 ))
if [ "$days_since_last" -le 40 ]; then
  ok "last commit was $days_since_last days ago (artifact saw ~10d)"
else
  miss "last commit was $days_since_last days ago — artifact may be stale"
fi
echo
if [ "$fail" -eq 0 ]; then
  echo "artifact verified (0 failures) — safe to trust"
else
  echo "artifact has $fail stale claim(s) — regenerate at https://repopilot.app/r/deepjavalibrary/djl"
  exit 1
fi
Each check prints ok: or FAIL:. The script exits non-zero if
anything failed, so it composes cleanly into agent loops
(./verify.sh || regenerate-and-retry).
⚡TL;DR
Deep Java Library (DJL) is an engine-agnostic deep learning framework that lets Java developers build, train, and deploy neural networks without being locked into a single ML engine (PyTorch, TensorFlow, MXNet, etc.). It abstracts away native JNI bindings and provides a high-level Java API for computer vision, NLP, and time-series tasks, automatically handling CPU/GPU selection at runtime. Monorepo structure: api/ contains the engine-agnostic Java API (NDArray, Model, Predictor), engines/ holds PyTorch/TensorFlow/MXNet implementations with JNI bridges, basicdataset/ and model-zoo/ provide datasets and pre-trained models, android/ is a separate Gradle module for Android aar releases. Native code lives alongside in C++/Rust with CMake/Cargo build configs.
👥Who it's for
Java enterprise developers and ML practitioners who want to integrate deep learning into existing Java applications without learning Python or external tools. Also targets teams building production inference servers in Java that need engine portability and zero Python dependency.
🌱Maturity & risk
Production-ready. The project has 6600+ KB of Java code across multiple major versions, plus active CI/CD with GitHub Actions workflows for continuous integration, nightly publishing, and native builds for PyTorch/TensorFlow/MXNet. Multiple published Android integrations and a documented model zoo suggest active adoption, though the 30+ native build workflows indicate ongoing maintenance rather than a feature-frozen state.
Moderate risk: the repo depends on three major deep learning engines (PyTorch, TensorFlow, MXNet) via JNI, making it vulnerable to upstream breaking changes and native-binary availability. The complex multi-language build pipeline (Rust, C++, and C native code plus Java) increases CI/CD fragility, and Android support adds API-level compatibility constraints. The active workflow ecosystem mitigates some of this risk.
Active areas of work
Active maintenance visible via nightly publish workflows, recent Android build actions, and native S3 upload pipelines for PyTorch/TensorFlow. The presence of 30+ GitHub Actions workflows suggests ongoing work on multi-engine support, native binary releases, and cross-platform (Android, macOS, Linux) compatibility.
🚀Get running
git clone https://github.com/deepjavalibrary/djl.git
cd djl
./gradlew build # Uses Gradle (see gradle-debian.sh in .devcontainer)
# Or use DevContainer: open in VS Code with Remote Containers extension
Daily commands:
No single 'dev server' — DJL is a library. Run tests: ./gradlew test. Build docs: ./gradlew javadoc. Try examples: cd examples && ./gradlew run (see referenced example-... subdirectories in structure).
🗺️Map of the codebase
- api/src/main/java/ai/djl/engine/Engine.java — Core engine abstraction that defines the interface for all deep learning framework integrations; every contributor must understand how engines are registered and selected.
- api/src/main/java/ai/djl/Model.java — Central model interface defining the contract for loading, training, and inference; fundamental to all DJL operations across all backends.
- api/src/main/java/ai/djl/Device.java — Abstracts hardware device selection (CPU/GPU) across different engines; critical for portability and performance tuning.
- api/src/main/java/ai/djl/BaseModel.java — Default implementation of the Model interface providing shared lifecycle management; essential reference for implementing engine-specific model classes.
- api/src/main/java/ai/djl/engine/EngineProvider.java — Service provider interface for engine discovery and instantiation; required to integrate new deep learning frameworks into DJL.
- android/core/src/main/java/ai/djl/android/core/BitmapImageFactory.java — Android-specific image handling bridge; essential for mobile deployment and image preprocessing pipelines on Android.
- android/build.gradle — Root Android build configuration managing multi-module compilation and dependency resolution for Android modules.
🛠️How to make changes
Integrate a New Deep Learning Engine
- Create a new engine provider class implementing the EngineProvider interface to define engine metadata, version, and instantiation logic (api/src/main/java/ai/djl/engine/EngineProvider.java)
- Implement the Engine interface to handle device management, model loading, and framework-specific initialization (api/src/main/java/ai/djl/engine/Engine.java)
- Implement BaseModel or the Model interface with engine-specific model loading, prediction, and training logic (api/src/main/java/ai/djl/BaseModel.java)
- Register your engine provider via Java SPI by adding an entry to META-INF/services/ai.djl.engine.EngineProvider (api/src/main/java/ai/djl/engine/EngineProvider.java)
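The SPI registration step above can be sketched with toy types. The names below (ToyEngineProvider, MyEngineProvider) are illustrative, not DJL's actual classes, and the hand-rolled registry stands in for `java.util.ServiceLoader.load(EngineProvider.class)`, which in a real build discovers providers through their META-INF/services entries:

```java
import java.util.HashMap;
import java.util.Map;

// Toy sketch of the engine-provider SPI pattern. In DJL, discovery happens
// via java.util.ServiceLoader reading META-INF/services files; here a plain
// map simulates that registry so the example runs from a single file.
interface ToyEngine {
    String getEngineName();
}

interface ToyEngineProvider {
    String getEngineName();
    ToyEngine getEngine();
}

class MyEngineProvider implements ToyEngineProvider {
    public String getEngineName() { return "MyEngine"; }
    // ToyEngine has a single abstract method, so a lambda suffices.
    public ToyEngine getEngine() { return () -> "MyEngine"; }
}

public class SpiDemo {
    // Stand-in for ServiceLoader discovery.
    static final Map<String, ToyEngineProvider> REGISTRY = new HashMap<>();

    static void register(ToyEngineProvider p) {
        REGISTRY.put(p.getEngineName(), p);
    }

    static ToyEngine getEngine(String name) {
        ToyEngineProvider p = REGISTRY.get(name);
        if (p == null) {
            throw new IllegalArgumentException("No engine registered: " + name);
        }
        return p.getEngine();
    }

    public static void main(String[] args) {
        register(new MyEngineProvider());
        System.out.println(getEngine("MyEngine").getEngineName()); // MyEngine
    }
}
```

The real registration file would contain one fully qualified provider class name per line; consult the existing engines under engines/ for the authoritative pattern.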
Add Android Native Library Support
- Create a new module directory under android/ (e.g., android/myframework-native) with a build.gradle following the pytorch-native pattern (android/pytorch-native/build.gradle)
- Add an AndroidManifest.xml with the required native-library declarations and permissions (android/pytorch-native/src/main/AndroidManifest.xml)
- Implement native bindings in Java, mirroring the BitmapImageFactory pattern for framework-specific preprocessing (android/core/src/main/java/ai/djl/android/core/BitmapImageFactory.java)
- Register the module in android/settings.gradle and update the root android/build.gradle with version variables (android/settings.gradle)
Extend Device Abstraction for New Hardware
- Define device-type constants and device-detection logic in the Device class (api/src/main/java/ai/djl/Device.java)
- Update StandardCapabilities with feature flags for the new device type (e.g., GPU memory size, quantization support) (api/src/main/java/ai/djl/engine/StandardCapabilities.java)
- Implement device-specific allocation and deallocation in the engine-specific Engine implementations (api/src/main/java/ai/djl/engine/Engine.java)
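The device-abstraction shape described above can be sketched as a small value type. This is a simplified stand-in (ToyDevice is a hypothetical name; DJL's real ai.djl.Device carries more state and engine hooks), showing the factory-method pattern that lets engines request devices by type and ordinal:

```java
// Minimal sketch of a Device-style abstraction: an immutable value type
// identified by a device type string plus an ordinal id.
public final class ToyDevice {
    private final String deviceType; // e.g. "cpu", "gpu", or a new type like "npu"
    private final int deviceId;      // ordinal within the type; -1 for "no ordinal"

    private ToyDevice(String deviceType, int deviceId) {
        this.deviceType = deviceType;
        this.deviceId = deviceId;
    }

    // Factory method: callers ask for a device rather than constructing one,
    // which leaves room for caching or engine-specific validation later.
    public static ToyDevice of(String deviceType, int deviceId) {
        return new ToyDevice(deviceType, deviceId);
    }

    public static ToyDevice cpu() { return of("cpu", -1); }
    public static ToyDevice gpu(int id) { return of("gpu", id); }

    public String getDeviceType() { return deviceType; }
    public int getDeviceId() { return deviceId; }

    @Override
    public String toString() {
        return deviceId < 0 ? deviceType + "()" : deviceType + "(" + deviceId + ")";
    }

    public static void main(String[] args) {
        System.out.println(ToyDevice.gpu(0)); // gpu(0)
        System.out.println(ToyDevice.cpu());  // cpu()
    }
}
```

Adding a new hardware type in this scheme means adding a factory plus detection logic, without touching call sites that already pass devices around opaquely.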
Add RPC-based Distributed Inference Support
- Create an RpcClient implementation with serialization logic for NDArray and model metadata (api/src/main/java/ai/djl/engine/rpc/RpcClient.java)
- Implement an RpcEngine wrapping local engine calls with network transport (api/src/main/java/ai/djl/engine/rpc/RpcEngine.java)
- Register RpcEngine via an RpcEngineProvider in META-INF/services for SPI discovery (api/src/main/java/ai/djl/engine/rpc/RpcEngineProvider.java)
🔧Why these technologies
- Java/JVM — Enterprise-grade language with mature tooling, broad deployment base, and strong typing for safety-critical ML applications
- Gradle (Kotlin DSL & Groovy) — Flexible multi-module build system supporting complex dependency management across framework, Android, and native bindings
- Java SPI (Service Provider Interface) — Enables pluggable engine implementations without coupling; allows dynamic discovery of PyTorch, TensorFlow, MXNet backends at runtime
- Android Gradle Plugin — Native Android build system integration for mobile deployment; required for resource management and NDK compilation of native bindings
⚖️Trade-offs already made
- Engine-agnostic abstraction over framework-specific optimizations
  - Why: Maximize portability and minimize vendor lock-in for users switching backends
  - Consequence: Performance may lag hand-optimized native implementations; requires careful API design to avoid lowest-common-denominator limitations
- Multi-module Gradle build with separate Android and API modules
  - Why: Separation of concerns: the core API stays portable to all platforms while the Android module holds mobile-specific code, allowing independent versioning
  - Consequence: Higher build complexity and potential version-mismatch issues; requires careful dependency management
- RPC engine as optional layer over local engine
  - Why / Consequence: not captured at generation time. Treat this entry as an unverified hypothesis (per the agent protocol above) and confirm against source before relying on it.
🪤Traps & gotchas
- JNI native bindings: You must have the correct engine's native libraries on your system or in LD_LIBRARY_PATH/DYLD_LIBRARY_PATH; a pure-Java ./gradlew test may fail if engine natives aren't pre-built.
- Multi-engine complexity: The monorepo expects you to build engines selectively (e.g., ./gradlew :engines:pytorch:build), or you'll wait for all engines.
- Android toolchain: The Android module requires the Android SDK and NDK; see android/README.md.
- Model zoo offline: Pre-trained models download from a remote zoo; tests may fail without internet or require cache setup.
- CMake + Cargo in parallel: Native builds use both C++ (CMake) and Rust (Cargo); version skew can break compilation.
🏗️Architecture
(Architecture diagram not captured in this export; see the live RepoPilot page.)
💡Concepts to learn
- JNI (Java Native Interface) Bridge — DJL's entire engine strategy relies on JNI to call C++/Rust deep learning kernels from Java; understanding JNI binding generation, memory safety, and binary compatibility is essential for debugging engine integration issues
- NDArray Abstraction — DJL's core innovation is a single NDArray interface backed by different engines; understanding how operations dispatch to engine-specific implementations helps when adding new ops or debugging performance
- Criteria Builder Pattern — DJL uses the builder pattern for model loading to guide developers toward correct practices (specifying inputs, outputs, application type); understanding this pattern is essential for the library's public API design philosophy
- Model Zoo & Repository Pattern — DJL abstracts model discovery and download via a zoo concept; understanding how models are versioned, cached, and selected helps when adding datasets or managing pre-trained artifacts
- ONNX (Open Neural Network Exchange) — DJL can load ONNX models via its engines; ONNX is the portable format many pre-trained models use, so understanding ONNX opsets and operators is useful for model compatibility debugging
- AAR (Android Archive) Packaging — The Android module builds .aar files with bundled native libraries; understanding AAR packaging and Gradle's Android plugin is required for Android development in DJL
- Engine Provider SPI (Service Provider Interface) — DJL discovers engines at runtime via SPI; understanding how engines are registered, loaded, and selected ensures you can add new engine backends without core API changes
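The Criteria builder pattern listed above can be sketched with a toy builder. All names here are hypothetical stand-ins (DJL's real API is ai.djl.repository.zoo.Criteria, whose methods and generics differ); the sketch only illustrates the fluent-chain-plus-validation shape:

```java
// Toy sketch of a Criteria-style builder: fluent setters collect model-loading
// options, and build() validates required fields in one place.
public class ToyCriteria {
    private final String application; // e.g. "cv/image_classification"
    private final String engine;      // e.g. "PyTorch"
    private final String modelUrl;

    private ToyCriteria(Builder b) {
        this.application = b.application;
        this.engine = b.engine;
        this.modelUrl = b.modelUrl;
    }

    public String describe() {
        return application + " via " + engine + " from " + modelUrl;
    }

    public static Builder builder() { return new Builder(); }

    public static class Builder {
        private String application;
        private String engine = "any"; // optional field with a default
        private String modelUrl;

        // Each setter returns the builder so options read as a fluent chain.
        public Builder optApplication(String a) { this.application = a; return this; }
        public Builder optEngine(String e) { this.engine = e; return this; }
        public Builder optModelUrl(String u) { this.modelUrl = u; return this; }

        public ToyCriteria build() {
            if (application == null || modelUrl == null) {
                throw new IllegalStateException("application and modelUrl are required");
            }
            return new ToyCriteria(this);
        }
    }

    public static void main(String[] args) {
        ToyCriteria c = ToyCriteria.builder()
                .optApplication("cv/image_classification")
                .optEngine("PyTorch")
                .optModelUrl("djl://ai.djl.pytorch/resnet")
                .build();
        System.out.println(c.describe());
    }
}
```

The design choice being illustrated: required options fail fast at build() rather than at first use, which is why the pattern suits model loading, where a missing input/output type would otherwise surface as a confusing engine-level error.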
🔗Related repos
- pytorch/pytorch — Upstream engine that DJL wraps via JNI; understanding PyTorch C++ APIs helps when debugging native bridge issues
- tensorflow/tensorflow — Another upstream engine DJL supports; required for the TensorFlow backend and Java SavedModel loading
- apache/mxnet — Third supported engine; DJL maintains MXNet bindings and model zoo integration
- onnx/onnx — Model format DJL can load via its engines; many pre-trained models in the model zoo export to ONNX
- eclipse/deeplearning4j — Alternative Java deep learning library (pure JVM without JNI); users often compare DJL vs. DL4J for Java projects
🪄PR ideas
To work on one of these in Claude Code or Cursor, paste:
Implement the "<title>" PR idea from CLAUDE.md, working through the checklist as the task list.
Add comprehensive Android instrumentation tests for DJL model loading across different backends
The Android module has minimal test coverage. Currently only ModelLoadingTest.java and BitmapWrapperTest.java exist in android/core/src/androidTest/. With DJL's engine-agnostic design supporting PyTorch, TensorFlow, and MXNet, Android developers need validated examples of loading models from each backend. This PR would add parameterized instrumentation tests covering model loading from local assets, remote URLs, and different model formats (traced vs scripted for PyTorch, SavedModel for TF) on actual Android devices.
- [ ] Create android/core/src/androidTest/java/ai/djl/android/core/ModelLoadingInstrumentedTest.java with parameterized tests for the PyTorch and TensorFlow backends
- [ ] Add test assets (small pre-trained models) to android/core/src/androidTest/assets/ for offline testing
- [ ] Verify AndroidManifest.xml in android/core/src/main/ includes the INTERNET permission for remote model tests
- [ ] Document test setup and backend-specific requirements in android/README.md
Add GitHub Actions workflow for Android native library build and publishing validation
The repo has native_jni_s3_pytorch_android.yml and nightly_android.yml workflows, but no dedicated workflow validating Android library builds against multiple API levels or validating the published artifacts from android/pytorch-native/. This PR would create a new workflow that builds the Android core and pytorch-native modules, runs basic checks on generated AAR files, and validates compatibility across Android API 24-35.
- [ ] Create .github/workflows/android_build_validation.yml that triggers on PRs modifying android/** files
- [ ] Add Gradle build steps for both the android:core and android:pytorch-native modules with API-level matrix testing
- [ ] Include AAR artifact validation (checking for required JNI libraries, resource files)
- [ ] Add a step to verify the build.gradle configurations in android/core/ and android/pytorch-native/ are consistent
Refactor and expand Android formatter Gradle script to enforce code style consistency
The repo references tools/gradle/android-formatter.gradle in the Android build configuration, but this file and its functionality are not visible in the provided structure. Android development requires strict lint and code style enforcement. This PR would create a robust formatter Gradle plugin that enforces DJL's code standards for Android modules, preventing formatting inconsistencies in android/core/src/ and android/pytorch-native/ and making PRs easier to review.
- [ ] Create or enhance tools/gradle/android-formatter.gradle with Android Lint, Spotless, or a similar plugin configuration
- [ ] Define code-style rules (naming conventions, import ordering, spacing) matching DJL Java standards
- [ ] Add a format-verification task to CI by modifying .github/workflows/continuous.yml to run the formatter on Android code
- [ ] Document formatter usage in CONTRIBUTING.md with specific Android contribution guidelines
🌿Good first issues
- Add missing JavaDoc: Many public methods in api/src/main/java/ai/djl/ndarray/NDArray.java lack @param/@return/@throws; generate and submit JavaDoc completions for core ops like reshape, matmul, transpose.
- Implement missing unit tests for BasicDataset: basicdataset/src/main/java/ai/djl/basicdataset/ has loaders for MNIST, ImageNet, etc., but androidTest coverage is sparse — add Android unit tests that verify dataset download and caching behavior in android/core/src/androidTest/.
- Document engine selection in examples: The examples/ directory exists, but no README explains which engine to install for "hello world" inference; add a quick-start guide showing PyTorch-only vs. multi-engine setup with version tables.
⭐Top contributors
- @xyang16 — 41 commits
- @bryanktliu — 12 commits
- @siddvenk — 6 commits
- @xinhuagu — 5 commits
- @access2rohit — 3 commits
📝Recent commits
- 126c751 — [fix] Fix huggingface workflow (#3855) (xyang16)
- 07a01b0 — [tokenizers] Reuse JNI class and constructor lookups in getTokenCharSpans (#3854) (lewisbobrow)
- 3edd20a — Fix the first link in the 'Quick start' documentation (#3834) (bhq12)
- a82f2a9 — fix security issues (#3826) (Majid-Taheri)
- 029150d — Fix JNI memory safety issues in native utils (#3825) (Majid-Taheri)
- 83f7e05 — add minimal permissions for CodeQL analysis workflow (#3820) (AdnaneKhan)
- fb2e413 — Increase build version to 0.37.0 (#3822) (ethnzhng)
- dbee017 — Increase DJL version to 0.36.0 (#3823) (ethnzhng)
- ed5e096 — NDManager.create(Number) unsupported type error message improvement (#3818) (fracpete)
- e59e30c — Update the Documentation (via DocString) to indicate the risks of using Utils.openUrl() with untrusted inputs (#3819) (access2rohit)
🔒Security observations
The DJL Android build configuration has moderate security concerns, primarily around dependency management and artifact sourcing. Key issues: (1) use of mutable snapshot repositories in production builds, (2) inclusion of the local Maven cache without environment restrictions, (3) missing dependency-verification mechanisms, and (4) incomplete obfuscation configuration. The project shows good practices, with separate module organization and existing ProGuard rules in some components, but needs consistent security hardening across all modules. No hardcoded secrets were detected in the analyzed files. Implementing stricter dependency management, enabling signature verification, and enforcing HTTPS-only repositories would improve the overall security posture.
- High · Outdated Gradle Android Plugin — android/build.gradle (buildscript dependencies). The build file pins 'com.android.tools.build:gradle:8.5.1', which may contain known security vulnerabilities; build tools should be kept current to patch issues in the build pipeline. Fix: Update to the latest stable Android Gradle Plugin, check for updates regularly, and apply security patches within 30 days of release.
- High · Insecure Maven Repository Configuration — android/build.gradle (allprojects repositories). The configuration includes 'https://central.sonatype.com/repository/maven-snapshots/', which hosts snapshot dependencies. Snapshot artifacts are mutable and can change after initial download, enabling dependency-substitution attacks. Fix: Restrict snapshot-repository access to development environments, use release versions for production builds, and pin versions with dependency lock files.
- Medium · Overly Permissive Maven Local Repository — android/build.gradle (allprojects repositories). The configuration includes 'mavenLocal()' without restrictions, so artifacts from the local Maven cache can be used and could be compromised if the local system is breached. Fix: Restrict mavenLocal() to development via environment-specific Gradle configurations that exclude it from CI/CD and production builds.
- Medium · Missing Dependency Verification — android/build.gradle. No evidence of checksum or signature validation in the build configuration, which raises the risk of supply-chain attacks through tampered dependencies. Fix: Enable Gradle's built-in dependency verification and record checksums/signatures for critical dependencies so builds verify them.
- Medium · Missing ProGuard/R8 Configuration in Core Module — android/core/ (missing consumer-rules.pro implementation). While tokenizer-native and pytorch-native have proguard-rules.pro files, the core module lacks obfuscation configuration, which could expose internal logic and class names in production APKs. Fix: Create ProGuard/R8 rules for the core module that keep API classes public while obfuscating internal implementation details.
- Low · Unencrypted Maven Repository URLs — android/build.gradle (repositories block). google() and mavenCentral() use HTTPS by default, but explicit HTTP URLs could theoretically be introduced. Fix: Enforce HTTPS explicitly, document repository security requirements in CONTRIBUTING.md, and use dependency verification to detect tampering.
- Low · Snapshot Version in Release Configuration — android/build.gradle (version assignment). The version logic switches snapshot naming on a release property but doesn't validate that the property exists in CI environments, potentially allowing accidental snapshot releases. Fix: Validate the release property explicitly in CI/CD and fail release builds when it isn't set.
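Gradle's built-in dependency verification, named in the Missing Dependency Verification finding above, can be bootstrapped in two steps. This is a sketch: the flags assume Gradle 6.2 or newer (the version that introduced the feature), so check them against the wrapper version the repo pins.

```shell
# Generate gradle/verification-metadata.xml with SHA-256 checksums for every
# dependency the build currently resolves (run from the repo root; `help` is
# just a cheap task that forces dependency resolution).
./gradlew --write-verification-metadata sha256 help
```

Commit the generated gradle/verification-metadata.xml; subsequent builds then fail if any dependency's checksum drifts from the recorded value.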
LLM-derived; treat as a starting point, not a security audit.
👉Where to read next
- Open issues — current backlog
- Recent PRs — what's actively shipping
- Source on GitHub
Generated by RepoPilot. Verdict based on maintenance signals — see the live page for receipts. Re-run on a new commit to refresh.