RepoPilot

taichi-dev/taichi

Productive, portable, and performant GPU programming in Python.

Healthy

Healthy across the board

Use as dependency — Healthy

Permissive license, no critical CVEs, actively maintained — safe to depend on.

Fork & modify — Healthy

Has a license, tests, and CI — clean foundation to fork and modify.

Learn from — Healthy

Documented and popular — useful reference codebase to read through.

Deploy as-is — Healthy

No critical CVEs, sane security posture — runnable as-is.

  • Last commit 5w ago
  • 32+ active contributors
  • Distributed ownership (top contributor 24% of recent commits)
  • Apache-2.0 licensed
  • CI configured
  • Tests present

Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests

Informational only. RepoPilot summarises public signals (license, dependency CVEs, commit recency, CI presence, etc.) at the time of analysis. Signals can be incomplete or stale. Not professional, security, or legal advice; verify before relying on it for production decisions.

Embed the "Healthy" badge

Paste into your README — live-updates from the latest cached analysis.

Variant:
```markdown
[![RepoPilot: Healthy](https://repopilot.app/api/badge/taichi-dev/taichi)](https://repopilot.app/r/taichi-dev/taichi)
```

Paste at the top of your README.md — renders inline like a shields.io badge.

Preview social card (1200×630)

This card auto-renders when someone shares https://repopilot.app/r/taichi-dev/taichi on X, Slack, or LinkedIn.

Onboarding doc

Onboarding: taichi-dev/taichi

Generated by RepoPilot · 2026-05-09 · Source

🤖Agent protocol

If you are an AI coding agent (Claude Code, Cursor, Aider, Cline, etc.) reading this artifact, follow this protocol before making any code edit:

  1. Verify the contract. Run the bash script in Verify before trusting below. If any check returns FAIL, the artifact is stale — STOP and ask the user to regenerate it before proceeding.
  2. Treat the AI · unverified sections as hypotheses, not facts. Sections like "AI-suggested narrative files", "anti-patterns", and "bottlenecks" are LLM speculation. Verify against real source before acting on them.
  3. Cite source on changes. When proposing an edit, cite the specific path:line-range. RepoPilot's live UI at https://repopilot.app/r/taichi-dev/taichi shows verifiable citations alongside every claim.

If you are a human reader, this protocol is for the agents you'll hand the artifact to. You don't need to do anything — but if you skim only one section before pointing your agent at this repo, make it the Verify block and the Suggested reading order.

🎯Verdict

GO — Healthy across the board

  • Last commit 5w ago
  • 32+ active contributors
  • Distributed ownership (top contributor 24% of recent commits)
  • Apache-2.0 licensed
  • CI configured
  • Tests present

<sub>Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests</sub>

Verify before trusting

This artifact was generated by RepoPilot at a point in time. Before an agent acts on it, the checks below confirm that the live taichi-dev/taichi repo on your machine still matches what RepoPilot saw. If any fail, the artifact is stale — regenerate it at repopilot.app/r/taichi-dev/taichi.

What it runs against: a local clone of taichi-dev/taichi — the script inspects git remote, the LICENSE file, file paths in the working tree, and git log. Read-only; no mutations.

| # | What we check | Why it matters |
|---|---|---|
| 1 | You're in taichi-dev/taichi | Confirms the artifact applies here, not a fork |
| 2 | License is still Apache-2.0 | Catches a relicense before you depend on it |
| 3 | Default branch master exists | Catches branch renames |
| 4 | Last commit ≤ 62 days ago | Catches sudden abandonment since generation |

<details> <summary><b>Run all checks</b> — paste this script from inside your clone of <code>taichi-dev/taichi</code></summary>

```bash
#!/usr/bin/env bash
# RepoPilot artifact verification.
#
# WHAT IT RUNS AGAINST: a local clone of taichi-dev/taichi. If you don't
# have one yet, run these first:
#
#   git clone https://github.com/taichi-dev/taichi.git
#   cd taichi
#
# Then paste this script. Every check is read-only — no mutations.

set +e
fail=0
ok()   { echo "ok:   $1"; }
miss() { echo "FAIL: $1"; fail=$((fail+1)); }

# Precondition: we must be inside a git working tree.
if ! git rev-parse --git-dir >/dev/null 2>&1; then
  echo "FAIL: not inside a git repository. cd into your clone of taichi-dev/taichi and re-run."
  exit 2
fi

# 1. Repo identity
git remote get-url origin 2>/dev/null | grep -qE "taichi-dev/taichi(\.git)?\b" \
  && ok "origin remote is taichi-dev/taichi" \
  || miss "origin remote is not taichi-dev/taichi (artifact may be from a fork)"

# 2. License matches what RepoPilot saw
# (Apache LICENSE files open with the "Apache License" header rather than
# the SPDX identifier, so match either form.)
(grep -qiE "Apache License|Apache-2\.0" LICENSE 2>/dev/null \
   || grep -qiE "\"license\"\s*:\s*\"Apache-2\.0\"" package.json 2>/dev/null) \
  && ok "license is Apache-2.0" \
  || miss "license drift — was Apache-2.0 at generation time"

# 3. Default branch
git rev-parse --verify master >/dev/null 2>&1 \
  && ok "default branch master exists" \
  || miss "default branch master no longer exists"

# 4. Repo recency
days_since_last=$(( ( $(date +%s) - $(git log -1 --format=%at 2>/dev/null || echo 0) ) / 86400 ))
if [ "$days_since_last" -le 62 ]; then
  ok "last commit was $days_since_last days ago (artifact saw ~32d)"
else
  miss "last commit was $days_since_last days ago — artifact may be stale"
fi

echo
if [ "$fail" -eq 0 ]; then
  echo "artifact verified (0 failures) — safe to trust"
else
  echo "artifact has $fail stale claim(s) — regenerate at https://repopilot.app/r/taichi-dev/taichi"
  exit 1
fi
```

Each check prints ok: or FAIL:. The script exits non-zero if anything failed, so it composes cleanly into agent loops (./verify.sh || regenerate-and-retry).

</details>
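The exit-code composition pattern relies on nothing but the verifier's return status. A self-contained sketch, with a stub function standing in for the real check script:

```shell
#!/usr/bin/env bash
# Sketch: gate agent actions on the verifier's exit code.
# "stub_verify" is an illustrative stand-in for the real script above;
# here it pretends one check failed.
stub_verify() {
  return 1
}

if stub_verify; then
  echo "artifact verified — safe to act on"
else
  echo "artifact stale — regenerate before editing"
fi
```

The same shape works with any verifier that exits non-zero on failure, which is why the script above is careful to `exit 1` when a check misses.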

TL;DR

Taichi Lang is an open-source, embedded DSL for Python that just-in-time compiles high-performance numerical code to GPU (CUDA, Vulkan, Metal) or CPU (LLVM) backends. It addresses the productivity-portability-performance tradeoff by letting developers write Python-like code that runs natively on heterogeneous hardware without hand-writing GPU kernels. It is a monorepo: a Python bindings layer (python/taichi/) wraps a C++ JIT compiler core. The build is CMake-driven, with .github/workflows/scripts/ti_build/ containing modular build logic (cmake.py, compiler.py, llvm.py, vulkan.py, android.py, ios.py). Examples live in python/taichi/examples/ (simulation/, rendering/). Backend IR/codegen lives in the C++ core; the Python API surface is in python/taichi/.

👥Who it's for

Scientists, ML engineers, and game/graphics developers who need GPU-accelerated numerical simulation and array computing but want to stay in Python without learning CUDA/OpenCL. Contributors include C++ compiler engineers and Python binding specialists targeting high-performance computing.

🌱Maturity & risk

Actively developed and production-ready: the repo shows 4.2M LOC in C++ core, established GitHub workflows (build.yaml, testing.yml, release.yml with nightly releases), comprehensive CI/CD via GitHub Actions, and active maintenance visible in the workflow scripts and multiple backend support (LLVM, CUDA, Vulkan, Metal). Release versioning is tracked and distributed on PyPI (pip install taichi).

Moderate risk factors: multi-backend architecture (LLVM, CUDA, Vulkan, Metal, OpenGL) creates high surface area for compiler bugs; rapid feature velocity suggests potential breaking API changes between minor versions; heavy C++ codebase (~4.2M LOC) means contributor barrier is steep. However, the structured build system (ti_build/ scripts, CMake, cross-platform tooling) and mandatory CI checks (clang-tidy in .clang-tidy, pre-commit hooks) mitigate stability concerns.

Active areas of work

Active CI/CD with multiple workflows: build.yaml (cross-platform builds), testing.yml (continuous testing), release.yml (nightly + stable releases), perf.yml (benchmarking), and ghstack_land.yml (PR integration). The ti_build/ system shows recent investment in improving cross-platform support (Android, iOS, compiler abstractions). Windows PowerShell scripts (win_test.ps1, common-utils.ps1) and Unix equivalents (unix_test.sh, unix-perf-mon.sh) indicate active maintenance of multi-OS build pipelines.

🚀Get running

```shell
git clone https://github.com/taichi-dev/taichi.git
cd taichi
pip install -e .  # Development install from source
ti gallery        # Launch the interactive demo gallery
```

For build customization, see .github/workflows/scripts/ti_build/ and CMakeLists.txt for backend selection (LLVM, CUDA, Vulkan, Metal).

Daily commands:

```shell
python -c "import taichi as ti; ti.init(arch=ti.gpu); print('Ready')"
# or for interactive demos:
ti gallery
# For development:
python -m pytest tests/  # Run test suite (inferred from CI workflows)
```

Backend selection: set it via the environment (TI_BACKEND_DISPLAY_GFX=1 python script.py) or programmatically (ti.init(arch=ti.cuda)).
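An environment-driven selection pattern can be sketched as below. The helper and the TAICHI_ARCH variable are illustrative inventions, not part of Taichi's API — they just show the env-to-init plumbing:

```python
# Hypothetical helper (not a Taichi API): resolve a backend name from an
# environment variable before calling ti.init, defaulting to CPU.
import os

_KNOWN_ARCHS = {"cuda", "vulkan", "metal", "opengl", "cpu"}

def pick_arch(env=None):
    """Return the backend requested via TAICHI_ARCH, or 'cpu' if unset/unknown."""
    env = os.environ if env is None else env
    name = env.get("TAICHI_ARCH", "").lower()
    return name if name in _KNOWN_ARCHS else "cpu"

print(pick_arch({"TAICHI_ARCH": "cuda"}))  # cuda
print(pick_arch({}))                       # cpu
```

Centralizing the fallback like this avoids the silent-CPU-fallback trap noted in the gotchas section: the chosen backend is explicit and loggable.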

🗺️Map of the codebase

🛠️How to make changes

  • IR/compiler changes: C++ files in the core (not fully visible in the file list, but referenced in CMakeLists.txt).
  • Python API: python/taichi/*.py for the high-level API.
  • Examples/tests: python/taichi/examples/ for reference implementations; add new examples there.
  • Build system: .github/workflows/scripts/ti_build/ for backend integration; CMakeLists.txt for compilation flags.
  • Linting: run clang-format and clang-tidy per the .clang-format and .clang-tidy configuration before committing.

🪤Traps & gotchas

  • Environment & backend selection: Taichi auto-detects GPUs but can silently fall back to CPU if CUDA/Vulkan libraries are missing; set TI_BACKEND=cuda explicitly if unsure.
  • Build artifacts: ti_build/ scripts use sccache (sccache.py) for caching; first builds are very slow (~30+ min for the full pipeline).
  • Python version pin: check .github/workflows/testing.yml for supported Python versions; 3.8+ assumed.
  • Dependency resolution: CMakeLists.txt includes git submodules (.gitmodules); run git submodule update --init after cloning.
  • Windows builds: require Visual Studio (see ti_build/compiler.py); pre-commit hooks (.pre-commit-config.yaml) must run before push to avoid clang-tidy failures.

💡Concepts to learn

  • Just-In-Time (JIT) Compilation — Taichi's core mechanism: Python code is compiled to native GPU/CPU at runtime, enabling high performance without explicit kernel writing
  • Intermediate Representation (IR) — Taichi lowers Python AST to a custom IR, then to backend-specific code (LLVM, CUDA); understanding IR is critical for compiler contributions
  • Single Program, Multiple Data (SPMD) — Taichi kernel execution model: one kernel function runs on many parallel threads with different data; fundamental to GPU utilization
  • Multi-backend Code Generation — Taichi abstracts away CUDA, Vulkan, Metal, LLVM differences; the ti_build/ system and backend plugins enable write-once-run-anywhere compute
  • Memory-mapped GPU Buffers — Taichi abstracts GPU memory allocation/sync via Python arrays; understanding this abstraction is key to debugging performance issues
  • Type Inference & Dynamic-to-Static Binding — Taichi infers types from Python code and binds them to static compiled kernels; critical for performance but source of subtle bugs
  • Sparse Data Structures — Taichi's sparse matrix and SoA (Structure of Arrays) support enables efficient numerical compute on irregular grids; visible in examples/simulation/
Related repositories:

  • numba/numba — Python JIT compiler for NumPy arrays targeting CPU/GPU; solves a similar productivity-performance tradeoff but is more array-centric, with less control over the IR
  • pytorch/pytorch — Deep learning framework with Python API and multi-backend JIT (CUDA, Metal, Vulkan); ecosystem competitor for numerical GPU compute but focused on neural networks
  • google/jax — Functional Python AD framework with XLA JIT backend; solves GPU compute but functional-first paradigm vs. Taichi's imperative approach
  • taichi-dev/taichi-releases — Official binary release repo; use for pre-built wheels instead of from-source builds
  • taichi-dev/taichi.graphics — Official documentation and tutorial website for Taichi Lang; essential for understanding API design decisions
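The SPMD execution model listed above can be sketched in plain Python. This is a conceptual illustration only — `spmd_run` is not a Taichi API, and a real backend launches the iterations as concurrent GPU threads rather than a loop:

```python
# Conceptual SPMD sketch: one kernel body, many logical threads.
def spmd_run(kernel, n_threads, args):
    # A GPU backend would run every thread concurrently; a serial loop
    # is enough to show the per-thread semantics.
    for tid in range(n_threads):
        kernel(tid, args)

def saxpy_kernel(tid, buffers):
    # Each thread touches exactly one element — no cross-thread dependencies,
    # which is what makes the loop safe to parallelize.
    a, x, y, out = buffers
    out[tid] = a * x[tid] + y[tid]

x = [1.0, 2.0, 3.0, 4.0]
y = [10.0, 10.0, 10.0, 10.0]
out = [0.0] * 4
spmd_run(saxpy_kernel, 4, (2.0, x, y, out))
print(out)  # [12.0, 14.0, 16.0, 18.0]
```

The absence of cross-thread dependencies in the kernel body is exactly the property Taichi's auto-parallelized outer loops rely on.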

🪄PR ideas

To work on one of these in Claude Code or Cursor, paste: Implement the "<title>" PR idea from CLAUDE.md, working through the checklist as the task list.

Add comprehensive CI workflow for clang-tidy static analysis automation

The repo has a .clang-tidy config and a check_clang_tidy.sh script in .github/workflows/scripts/, but there's no dedicated GitHub Actions workflow for running clang-tidy checks on PRs. Such a workflow would catch C++ code-quality issues early and prevent regressions. The infrastructure partially exists (check_clang_tidy.sh) but needs a proper workflow definition in .github/workflows/.

  • [ ] Create .github/workflows/clang-tidy.yml that runs on pull_request events
  • [ ] Integrate with existing .github/workflows/scripts/check_clang_tidy.sh
  • [ ] Configure to run on C++ source files in taichi/
  • [ ] Set up PR comments to report violations (similar to existing notify_on_failure.yml pattern)
  • [ ] Test against a few sample C++ files to validate the workflow

Add iOS build integration tests to CI pipeline

The repo has .github/workflows/scripts/ti_build/ios.py indicating iOS support, but there's no corresponding GitHub Actions workflow file for iOS builds. Currently only Android, Linux, macOS, and Windows have explicit workflows. Adding iOS CI would ensure cross-platform mobile support doesn't regress.

  • [ ] Create .github/workflows/ios-build.yml workflow file
  • [ ] Reference existing ios.py build script from ti_build/
  • [ ] Define build matrix for iOS SDK versions and architectures
  • [ ] Configure to run on macOS runners (required for iOS builds)
  • [ ] Add basic sanity checks like binary generation and link validation

Create unit tests for ti_build build system modules

The .github/workflows/scripts/ti_build/ directory contains multiple Python modules (compiler.py, dep.py, cmake.py, llvm.py, vulkan.py, etc.) but there's no corresponding test directory. These modules are critical for CI/CD but lack automated test coverage. Adding pytest-based tests would improve reliability and make contributions safer.

  • [ ] Create .github/workflows/scripts/ti_build/tests/ directory structure
  • [ ] Write unit tests for ti_build/cmake.py (CMake invocation logic)
  • [ ] Write unit tests for ti_build/compiler.py (compiler detection/validation)
  • [ ] Write unit tests for ti_build/dep.py (dependency resolution)
  • [ ] Add pytest configuration and integrate into testing.yml workflow

🌿Good first issues

  • Add missing integration tests for the Vulkan backend in the test matrix (.github/workflows/testing.yml currently may have gaps); check vulkan.py in ti_build/ for untested compilation paths.
  • Document the ti_build/ modular build system: each file (cmake.py, llvm.py, vulkan.py, etc.) lacks inline docstrings explaining the build flow and configuration options for new contributors.
  • Expand examples/ with a simple 'hello_taichi' template in python/taichi/examples/ that demonstrates all four backends (LLVM, CUDA, Vulkan, Metal) with the same kernel, complementing the current backend-specific examples.

Top contributors


📝Recent commits

  • ba0e81d — [ir] Sanitize SPIRV debugprintf string (#8752) (feisuzhu)
  • 7c7a3ef — [ci] Windows needs Vulkan SDK (#8754) (feisuzhu)
  • 898df1a — [Build] Revert "Drop OpenGL build (#8751)" (#8753) (feisuzhu)
  • 7d45eb7 — [Build] Drop OpenGL build (#8751) (feisuzhu)
  • 9ae2e38 — [build] Update minimum requirement for CMAKE (#8703) (linull24)
  • 6f36693 — [build] Add Python 3.13 build (#8748) (feisuzhu)
  • 41915a0 — [build] Identify aarch64 wheels (#8716) (johnnynunez)
  • faae67d — [build] Update setup function to use Miniforge and correct download URLs (#8719) (johnnynunez)
  • ef34a3f — [build] Update Vulkan SDK version (#8749) (cielavenir)
  • 628a8f9 — [build] Update sccache download URLs to version 0.10.0 for various plat… (#8717) (johnnynunez)

🔒Security observations


  • Medium · Insecure Dependency: jsbeautifier — Dependencies/Package file - jsbeautifier. jsbeautifier is a JavaScript beautification library that has had security vulnerabilities in the past. It may be used to process untrusted input, which could lead to code injection or DoS attacks if not properly validated. Fix: 1. Verify the current version of jsbeautifier is up-to-date. 2. Review how jsbeautifier is used in the codebase - ensure it only processes trusted input. 3. Consider replacing with actively maintained alternatives if security patches are slow. 4. Implement input validation and sandboxing if processing user-supplied code.
  • Low · Insecure Dependency: bokeh — Dependencies/Package file - bokeh. bokeh is a visualization library that historically had security concerns related to XSS vulnerabilities in generated HTML output. While generally safe for internal use, it should be kept updated. Fix: 1. Keep bokeh updated to the latest version. 2. If bokeh output is served to end-users, ensure proper Content Security Policy (CSP) headers are implemented. 3. Validate and sanitize any user input passed to bokeh visualization functions. 4. Review bokeh security advisories regularly.
  • Medium · Automated Workflow Security: GitHub Actions Configuration — .github/workflows/ directory (build.yaml, pull_request.yml, issue_comment.yml, etc.). Multiple GitHub Actions workflow files (.github/workflows/*.yml) are present. These workflows may execute untrusted code from pull requests or issue comments, potentially leading to credential exposure or supply chain attacks if not properly secured. Fix: 1. Review all workflow files for use of 'pull_request' trigger with 'write' permissions - use 'pull_request_target' with restricted permissions instead. 2. Never pass secrets to workflows triggered by external events without explicit approval. 3. Use environment-level protection rules and required reviewers. 4. Audit scripts in .github/workflows/scripts/ for shell injection vulnerabilities. 5. Implement branch protection rules requiring workflow checks to pass.
  • Low · Shell Script Security in Workflow Scripts — .github/workflows/scripts/*.sh files (unix-perf-mon.sh, unix-test-docs.sh, unix_aot_compat_test.sh, unix_test.sh, etc.). Multiple shell scripts in .github/workflows/scripts/ (*.sh files) are executed in CI/CD pipelines. These scripts may be vulnerable to command injection if they process environment variables or user input unsafely. Fix: 1. Use shellcheck to scan all shell scripts for common vulnerabilities. 2. Quote all variable expansions: use "$var" instead of $var. 3. Use 'set -euo pipefail' at the beginning of scripts. 4. Avoid eval and command substitution with user input. 5. Validate all input before use.
  • Low · PowerShell Script Security — .github/workflows/scripts/common-utils.ps1, win_test.ps1. PowerShell scripts in CI/CD workflows (.github/workflows/scripts/*.ps1) may have execution policy issues or be vulnerable to script injection. Fix: 1. Set explicit execution policy in workflows before running scripts. 2. Sign PowerShell scripts with a code signing certificate. 3. Use strict mode: 'Set-StrictMode -Version Latest' at the beginning of scripts. 4. Quote all variable expansions and use parameterized queries. 5. Avoid using ConvertFrom-SecureString without proper key management.
  • Low · Python Script Security in Build Tools — .github/workflows/scripts/ti_build/ (entry.py, cmake.py, compiler.py, etc.) and .github/workflows/scripts/*.py. Multiple Python scripts in .github/workflows/scripts/ti_build/ directory may process system commands or untrusted input. The use of subprocess, os.system, or similar functions could lead to command injection if not properly sanitized. Fix: 1. Use subprocess.run() with shell=False and pass arguments as a list instead of shell=True. 2. Validate and sanitize all user input and environment variables. 3. Use shlex.quote() for any shell metacharacters that must be passed. 4. Avoid pickle module for deserializing untrusted data. 5. Use type hints and input validation throughout.

LLM-derived; treat as a starting point, not a security audit.
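The subprocess hardening advice in the observations above can be shown with a minimal sketch (generic Python, not code taken from ti_build/):

```python
# Sketch: shell=False with an argv list keeps shell metacharacters inert.
import subprocess

user_input = "hello; rm -rf /"  # hostile-looking input

# Unsafe pattern (DON'T): subprocess.run(f"echo {user_input}", shell=True)
# hands the whole string to a shell, which treats ';' as a command separator.

# Safer pattern: no shell involved, arguments passed as a list.
result = subprocess.run(["echo", user_input], capture_output=True, text=True)
print(result.stdout.strip())  # the metacharacters are printed literally, not executed
```

With the list form, `user_input` reaches the child process as a single argv entry, so no quoting gymnastics are needed; shlex.quote() is only required when a shell genuinely must be in the loop.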


Generated by RepoPilot. Verdict based on maintenance signals — see the live page for receipts. Re-run on a new commit to refresh.
