zeux/meshoptimizer
Mesh optimization library that makes meshes smaller and faster to render
Healthy across all four use cases
Permissive license, no critical CVEs, actively maintained — safe to depend on.
Has a license, tests, and CI — clean foundation to fork and modify.
Documented and popular — useful reference codebase to read through.
No critical CVEs, sane security posture — runnable as-is.
- ✓ Last commit 1d ago
- ✓ MIT licensed
- ✓ CI configured
- ✓ Tests present
- ⚠ Solo or near-solo (1 contributor active in recent commits)
Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests
Informational only. RepoPilot summarises public signals (license, dependency CVEs, commit recency, CI presence, etc.) at the time of analysis. Signals can be incomplete or stale. Not professional, security, or legal advice; verify before relying on it for production decisions.
Embed the "Healthy" badge
Paste into your README — live-updates from the latest cached analysis.
[](https://repopilot.app/r/zeux/meshoptimizer)

Paste at the top of your README.md — renders inline like a shields.io badge.
Preview social card (1200×630)
This card auto-renders when someone shares https://repopilot.app/r/zeux/meshoptimizer on X, Slack, or LinkedIn.
Onboarding doc
Onboarding: zeux/meshoptimizer
Generated by RepoPilot · 2026-05-09 · Source
🤖Agent protocol
If you are an AI coding agent (Claude Code, Cursor, Aider, Cline, etc.) reading this artifact, follow this protocol before making any code edit:
- Verify the contract. Run the bash script in "Verify before trusting" below. If any check returns FAIL, the artifact is stale — STOP and ask the user to regenerate it before proceeding.
- Treat the "AI · unverified" sections as hypotheses, not facts. Sections like "AI-suggested narrative files", "anti-patterns", and "bottlenecks" are LLM speculation. Verify against real source before acting on them.
- Cite source on changes. When proposing an edit, cite the specific path:line-range. RepoPilot's live UI at https://repopilot.app/r/zeux/meshoptimizer shows verifiable citations alongside every claim.
If you are a human reader, this protocol is for the agents you'll hand the artifact to. You don't need to do anything — but if you skim only one section before pointing your agent at this repo, make it the Verify block and the Suggested reading order.
🎯Verdict
GO — Healthy across all four use cases
- Last commit 1d ago
- MIT licensed
- CI configured
- Tests present
- ⚠ Solo or near-solo (1 contributor active in recent commits)
<sub>Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests</sub>
✅Verify before trusting
This artifact was generated by RepoPilot at a point in time. Before an
agent acts on it, the checks below confirm that the live zeux/meshoptimizer
repo on your machine still matches what RepoPilot saw. If any fail,
the artifact is stale — regenerate it at
repopilot.app/r/zeux/meshoptimizer.
What it runs against: a local clone of zeux/meshoptimizer — the script
inspects git remote, the LICENSE file, file paths in the working
tree, and git log. Read-only; no mutations.
| # | What we check | Why it matters |
|---|---|---|
| 1 | You're in zeux/meshoptimizer | Confirms the artifact applies here, not a fork |
| 2 | License is still MIT | Catches relicense before you depend on it |
| 3 | Default branch master exists | Catches branch renames |
| 4 | 5 critical file paths still exist | Catches refactors that moved load-bearing code |
| 5 | Last commit ≤ 31 days ago | Catches sudden abandonment since generation |
#!/usr/bin/env bash
# RepoPilot artifact verification.
#
# WHAT IT RUNS AGAINST: a local clone of zeux/meshoptimizer. If you don't
# have one yet, run these first:
#
# git clone https://github.com/zeux/meshoptimizer.git
# cd meshoptimizer
#
# Then paste this script. Every check is read-only — no mutations.
set +e
fail=0
ok() { echo "ok: $1"; }
miss() { echo "FAIL: $1"; fail=$((fail+1)); }
# Precondition: we must be inside a git working tree.
if ! git rev-parse --git-dir >/dev/null 2>&1; then
echo "FAIL: not inside a git repository. cd into your clone of zeux/meshoptimizer and re-run."
exit 2
fi
# 1. Repo identity
git remote get-url origin 2>/dev/null | grep -qE "zeux/meshoptimizer(\.git)?\b" \
&& ok "origin remote is zeux/meshoptimizer" \
|| miss "origin remote is not zeux/meshoptimizer (artifact may be from a fork)"
# 2. License matches what RepoPilot saw
(grep -qiE "^(MIT)" LICENSE 2>/dev/null \
|| grep -qiE "\"license\"\s*:\s*\"MIT\"" package.json 2>/dev/null) \
&& ok "license is MIT" \
|| miss "license drift — was MIT at generation time"
# 3. Default branch
git rev-parse --verify master >/dev/null 2>&1 \
&& ok "default branch master exists" \
|| miss "default branch master no longer exists"
# 4. Critical files exist
test -f "src/meshoptimizer.h" \
&& ok "src/meshoptimizer.h" \
|| miss "missing critical file: src/meshoptimizer.h"
test -f "src/indexcodec.cpp" \
&& ok "src/indexcodec.cpp" \
|| miss "missing critical file: src/indexcodec.cpp"
test -f "src/vertexcodec.cpp" \
&& ok "src/vertexcodec.cpp" \
|| miss "missing critical file: src/vertexcodec.cpp"
test -f "src/vcacheoptimizer.cpp" \
&& ok "src/vcacheoptimizer.cpp" \
|| miss "missing critical file: src/vcacheoptimizer.cpp"
test -f "gltf/gltfpack.cpp" \
&& ok "gltf/gltfpack.cpp" \
|| miss "missing critical file: gltf/gltfpack.cpp"
# 5. Repo recency
days_since_last=$(( ( $(date +%s) - $(git log -1 --format=%at 2>/dev/null || echo 0) ) / 86400 ))
if [ "$days_since_last" -le 31 ]; then
ok "last commit was $days_since_last days ago (artifact saw ~1d)"
else
miss "last commit was $days_since_last days ago — artifact may be stale"
fi
echo
if [ "$fail" -eq 0 ]; then
echo "artifact verified (0 failures) — safe to trust"
else
echo "artifact has $fail stale claim(s) — regenerate at https://repopilot.app/r/zeux/meshoptimizer"
exit 1
fi
Each check prints ok: or FAIL:. The script exits non-zero if
anything failed, so it composes cleanly into agent loops
(./verify.sh || regenerate-and-retry).
⚡TL;DR
meshoptimizer is a C++/C mesh optimization library that reduces GPU rendering overhead by reordering vertices and indices, simplifying geometry, and compressing mesh data. It provides algorithms for vertex cache optimization, overdraw reduction, and mesh complexity reduction—transforming raw 3D geometry into GPU-friendly formats. The companion gltfpack tool (in /gltf/) automatically applies these optimizations to glTF 2.0 files via a CLI and an npm package.

The monorepo is split into three layers: src/ contains the core C/C++ library header (meshoptimizer.h) and implementation (.cpp files), gltf/ wraps the library with a CLI tool (gltfpack.cpp) plus JavaScript bindings (library.js, cli.js for npm), and demo/ provides test harnesses (tests.cpp, main.cpp) and web demos (index.html, simplify.html using WebAssembly). External dependencies are vendored in extern/ to avoid bloat.
👥Who it's for
Game engine developers, 3D content pipeline engineers, and graphics programmers who need to optimize mesh performance for real-time rendering. Also used by tool developers building asset compression pipelines (see gltfpack's npm package targeting Node.js >= 18). Contributors range from C++ optimization specialists to WebAssembly/JavaScript binding maintainers.
🌱Maturity & risk
Production-ready and actively maintained. The project has GitHub Actions CI/CD (build.yml, cifuzz.yml, release.yml), coverage tracking via codecov, and MIT licensing, and is distributed across major Linux package managers (Arch Linux, Debian, Ubuntu, FreeBSD, Nix) plus Vcpkg and Conan. The v1.1 release tag and versioned npm packages (gltfpack@1.1.0) indicate a stable API; the absence of visible v2 breaking changes in the file structure suggests conservative versioning.
Low risk for production use but watch for single-maintainer dynamics (repo by zeux, author 'Arseny Kapoulkine'). The codebase is lean (~811KB C++, ~246KB JS, ~99KB C) with minimal external dependencies (extern/ only contains cgltf.h, fast_obj.h, sdefl.h—all vendored headers). Primary risk is build/platform coverage; CMakeLists.txt and Makefile suggest manual maintenance of multiple build systems rather than relying on one. No visible NPM dependencies beyond dev tooling in gltf/package.json.
Active areas of work
Active maintenance with CI pipelines for release automation (release.yml), fuzzing coverage (cifuzz.yml), and cross-platform builds (build.yml). The gltfpack npm package is published regularly; .git-blame-ignore-revs suggests ongoing code quality work. Repository includes formatter config (.clang-format, .editorconfig, .prettierrc) indicating recent linting standardization efforts.
🚀Get running
git clone https://github.com/zeux/meshoptimizer.git
cd meshoptimizer
make
For gltfpack CLI via npm: npm install -g gltfpack (v1.1.0+). For development: use CMakeLists.txt for cross-platform builds or Makefile for Unix/Linux. See gltf/package.json for WebAssembly build setup (Node.js 18+).
Daily commands:
- C++ library: make builds static/dynamic libs; outputs to build/ or system paths.
- gltfpack CLI: gltfpack input.glb -o output.glb (pre-built binaries on the releases page).
- JavaScript/Node: npm install gltfpack && gltfpack input.glb (wraps the WASM binary).
- Web demos: open demo/index.html or demo/simplify.html in a browser (requires meshoptimizer.js via the npm package export or a local build).
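The library's typical first step is vertex deduplication (the real API for this is meshopt_generateVertexRemap in src/meshoptimizer.h). Since this artifact can't compile the library itself, here is a self-contained, simplified sketch of what that step does conceptually: the Vertex layout, hash, and function name are illustrative assumptions, not the library's implementation.

```cpp
#include <cassert>
#include <cstring>
#include <unordered_map>
#include <vector>

// Simplified stand-in for meshopt_generateVertexRemap: collapse duplicate
// vertices and rewrite the index buffer to point at the deduplicated set.
// The real library does this (and much more) far more efficiently.
struct Vertex {
    float px, py, pz;
    bool operator==(const Vertex& o) const {
        return px == o.px && py == o.py && pz == o.pz;
    }
};

struct VertexHash {
    size_t operator()(const Vertex& v) const {
        unsigned int bits[3];
        std::memcpy(bits, &v, sizeof(bits));  // hash the raw float bits
        size_t h = 0;
        for (unsigned int b : bits) h = h * 1099511628211ull ^ b;
        return h;
    }
};

// Dedupes `vertices` in place, rewrites `indices`, returns the remap table.
std::vector<unsigned int> remapIndices(std::vector<Vertex>& vertices,
                                       std::vector<unsigned int>& indices) {
    std::unordered_map<Vertex, unsigned int, VertexHash> unique;
    std::vector<Vertex> deduped;
    std::vector<unsigned int> remap(vertices.size());
    for (size_t i = 0; i < vertices.size(); ++i) {
        auto it = unique.find(vertices[i]);
        if (it == unique.end()) {
            unsigned int id = (unsigned int)deduped.size();
            unique.emplace(vertices[i], id);
            deduped.push_back(vertices[i]);
            remap[i] = id;
        } else {
            remap[i] = it->second;  // duplicate vertex: reuse earlier slot
        }
    }
    for (unsigned int& idx : indices) idx = remap[idx];
    vertices = deduped;
    return remap;
}
```

After remapping, the real pipeline typically continues with meshopt_optimizeVertexCache and meshopt_optimizeVertexFetch; consult src/meshoptimizer.h for the authoritative signatures.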
🗺️Map of the codebase
- src/meshoptimizer.h — The main public API header exposing all mesh optimization functions; every contributor must understand the interface contract
- src/indexcodec.cpp — Core index buffer compression codec, heavily used by all index optimization pipelines and foundational to the library's compression strategy
- src/vertexcodec.cpp — Core vertex buffer compression codec, critical path for vertex optimization and storage reduction
- src/vcacheoptimizer.cpp — Vertex cache optimization engine that reorders indices for GPU pipeline efficiency; one of the library's primary use cases
- gltf/gltfpack.cpp — Command-line tool entry point that orchestrates all optimization passes; shows real-world integration patterns
- src/simplifier.cpp — Mesh simplification algorithm that reduces triangle count; essential for LOD and complexity reduction workflows
- src/clusterizer.cpp — Meshlet generation and clustering engine for GPU-friendly mesh organization and rendering optimization
🧩Components & responsibilities
- Index Codec (src/indexcodec.cpp) (Entropy coding, delta compression, bit packing) — Compresses and decompresses triangle index buffers using delta + entropy encoding
- Failure mode: Corruption of index stream → rendering artifacts, broken topology
- Vertex Codec (src/vertexcodec.cpp) (Quantization, prediction filters, entropy coding) — Quantizes and compresses vertex attributes (positions, normals, UVs) with optional filtering
- Failure mode: Precision loss → visual distortion, lighting artifacts, texture misalignment
- VCache Optimizer (src/vcacheoptimizer.cpp) (Cache simulation, greedy graph traversal) — Reorders indices to maximize GPU vertex cache hits via FIFO simulation
- Failure mode: Poor reordering → increased memory bandwidth, lower GPU utilization
- Simplifier (src/simplifier.cpp) (Quadric error metrics, progressive mesh decimation) — Decimates meshes while preserving silhouettes; generates LOD variants
- Failure mode: Over-simplification → loss of detail, broken geometry, visual popping
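The Vertex Codec's "precision loss" failure mode comes from attribute quantization. As a toy illustration (the real helpers are meshopt_quantizeUnorm and meshopt_quantizeHalf in src/meshoptimizer.h; this sketch is not their implementation), quantizing a [0, 1] float to N bits bounds the round-trip error by about half a quantization step:

```cpp
#include <cassert>
#include <cmath>

// Quantize a [0, 1] float to an N-bit unsigned integer. Round-trip error is
// at most roughly 0.5 / (2^bits - 1), which is the trade the codec makes
// for smaller vertex buffers.
int quantizeUnorm(float v, int bits) {
    float scale = float((1 << bits) - 1);
    v = v < 0 ? 0 : (v > 1 ? 1 : v);   // clamp to [0, 1]
    return int(v * scale + 0.5f);       // round to the nearest level
}

float dequantizeUnorm(int q, int bits) {
    return float(q) / float((1 << bits) - 1);
}
```

For positions, 10 to 16 bits per component is a common range; the acceptable bit count depends on mesh scale and viewing distance.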
🛠️How to make changes
Add a new mesh optimization algorithm
- Create a new .cpp file in src/ with your algorithm (e.g., src/newalgo.cpp)
- Add function declarations to the public API header (src/meshoptimizer.h)
- Implement the core optimization function following the naming convention meshopt_newalgo() (src/newalgo.cpp)
- Register the algorithm in the glTF pipeline if applicable (gltf/gltfpack.cpp)
- Add test cases in tools/ (e.g., tools/algotest.cpp)
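A hypothetical skeleton for those steps, with an invented algorithm (meshopt_reverseWinding) that only illustrates the library's C-style calling conventions — raw pointers plus element counts, in-place operation, no allocation. The name and file are placeholders, not part of the real API:

```cpp
#include <cassert>
#include <cstddef>

// Illustrative new-algorithm skeleton (would live in src/newalgo.cpp, with
// the declaration added to src/meshoptimizer.h). Flips each triangle's
// winding order by swapping its second and third indices.
void meshopt_reverseWinding(unsigned int* indices, size_t index_count) {
    for (size_t i = 0; i + 2 < index_count; i += 3) {
        unsigned int tmp = indices[i + 1];
        indices[i + 1] = indices[i + 2];
        indices[i + 2] = tmp;
    }
}
```

Matching the existing pointer-plus-count style keeps the C API FFI-friendly, which is why the real headers avoid C++ containers at the boundary.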
Add a new glTF processing feature
- Implement the core feature logic in a new gltf/ module (gltf/feature.cpp)
- Add command-line argument parsing in gltfpack main (gltf/gltfpack.cpp)
- Integrate processing into the mesh pipeline (gltf/mesh.cpp or gltf/parsegltf.cpp)
- Ensure proper serialization in write.cpp if the output changes (gltf/write.cpp)
Expose a new function to JavaScript/WASM
- Ensure the C function is declared in src/meshoptimizer.h
- Create or update the appropriate .js wrapper (decoder, encoder, simplifier, etc.) (js/meshopt_decoder.mjs)
- Add TypeScript definitions for type safety (js/meshopt_decoder.d.ts)
- Add test coverage for the new binding (js/meshopt_decoder.test.js)
🔧Why these technologies
- C++ with C API wrapper — Performance-critical mesh processing requires low-level control; C API enables cross-language FFI bindings (JavaScript, Rust, C#)
- WASM bindings (Emscripten) — Allows browser and Node.js execution of compression/decompression without native dependencies; essential for web tooling
- CMake + Makefile dual build — CMake for modern dependency management and IDE integration; Makefile for simple builds and embedded use
- glTF 2.0 as primary format — Industry standard 3D asset format; gltfpack CLI shows production-ready integration for real-world pipelines
⚖️Trade-offs already made
- Predictable codec speed over maximal compression ratio
  - Why: The codecs prioritize compression speed and predictable performance over bitrate efficiency for edge cases
  - Consequence: Some meshes may not compress as aggressively as with specialized codecs, but performance stays predictable
- Single-pass algorithms where possible
  - Why: Reduces O(n²) or higher complexity for interactive/CLI use
  - Consequence: May miss global optimization opportunities that multi-pass algorithms would find
- No built-in async/streaming for large meshes
  - Why: Simplifies the API surface and memory management; meshes typically fit in memory
  - Consequence: Not suitable for extremely large (>1GB) point clouds or streaming applications
🚫Non-goals (don't propose these)
- Does not provide real-time ray tracing acceleration structures
- Does not include physics simulation or collision detection
- Does not perform automatic LOD generation beyond simplification output
- Does not handle non-triangular mesh topologies (quads, n-gons)
- Does not provide GPU-resident data structure compilation
🪤Traps & gotchas
No major hidden traps, but note:
- The CMake and Makefile build systems must be kept in sync manually; edits to source lists require updates in both.
- WebAssembly builds require the Emscripten toolchain (not in the repo, but implied by the npm publish flow).
- gltfpack binary distribution via npm vendors .wasm files through the package.json 'files' array; new algorithm changes need a rebuild before npm publish or the shipped binary will be stale.
- Vertex deduplication during optimization assumes unique positions/attributes; mesh generation tools must pre-merge duplicate vertices or results will be suboptimal.
💡Concepts to learn
- Vertex Cache Optimization (Post-Transform Cache) — GPUs buffer recently-transformed vertices; reordering index data to group spatially-close triangles maximizes cache hits and reduces overdraw—core to meshoptimizer's index optimization module
- Quadric Error Metrics (QEM) — simplify.cpp uses QEM to greedily collapse edges while minimizing visual error; essential algorithm for LOD generation in mesh simplification
- Triangle Strips & Fans (Index Reordering) — stripify.cpp converts indexed triangles into degenerate triangle strips to reduce index buffer size by ~50%; critical for older GPU architectures and bandwidth-constrained platforms
- Vertex Attribute Quantization & Encoding — encode.cpp reduces 32-bit float positions to 16-bit or packed formats; necessary for mobile/WebGL targets where bandwidth is bottleneck and precision loss is acceptable
- Meshlet Clustering (for Nanite-style LOD) — demo/clusterlod.h and demo/nanite.cpp show how meshoptimizer integrates with continuous LOD via geometric clustering; relevant for modern rendering pipelines using hardware-accelerated meshlet dispatching
- glTF 2.0 Asset Compression Pipeline — gltfpack orchestrates multiple optimization passes (simplify → reorder → quantize → deflate) on glTF JSON + buffers; understanding the composition is key to using library effectively for asset distribution
- WebAssembly Module Binding & WASM/JS Interop — gltf/library.js wraps compiled WASM binary; memory management between JS heap and WASM linear memory is non-obvious; essential for npm package consumers porting C++ algorithms to browser
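To make the first concept concrete, the metric vertex cache optimizers target can be simulated directly. The sketch below is a self-contained FIFO post-transform cache model computing ACMR (average cache miss ratio: transformed vertices per triangle; lower is better, ~0.5 floor for regular grids, 3.0 worst case). The cache size of 16 is an illustrative assumption; the real optimizer in src/vcacheoptimizer.cpp uses a more sophisticated model:

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <deque>

// Toy post-transform cache simulation: every index not found in the FIFO
// cache counts as a vertex the GPU must re-transform. ACMR = misses / triangles.
double computeACMR(const unsigned int* indices, size_t index_count,
                   size_t cache_size = 16) {
    std::deque<unsigned int> cache;  // FIFO of recently transformed vertices
    size_t misses = 0;
    for (size_t i = 0; i < index_count; ++i) {
        unsigned int v = indices[i];
        if (std::find(cache.begin(), cache.end(), v) == cache.end()) {
            ++misses;                    // cache miss: vertex re-transformed
            cache.push_back(v);
            if (cache.size() > cache_size) cache.pop_front();
        }
    }
    return index_count ? double(misses) / double(index_count / 3) : 0.0;
}
```

Reordering triangles so that neighbors sharing vertices are emitted consecutively drives this ratio down, which is exactly what the vcache optimizer does.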
🔗Related repos
- KhronosGroup/glTF — Official glTF 2.0 specification and sample models that gltfpack is designed to optimize; canonical reference for mesh encoding
- google/draco — Alternative mesh compression library with stronger geometry compression (used as a fallback in gltfpack for some cases); competes on compression-ratio vs. decode-speed trade-offs
- zeux/pugixml — Sibling project by the same author (zeux) with a similar minimal-dependency philosophy; demonstrates the author's patterns for vendored, self-contained libraries
- mrdoob/three.js — Primary consumer ecosystem: three.js recommends meshoptimizer preprocessing for WebGL optimization; the gltfpack npm package integrates into three.js asset pipelines
- BabylonJS/Babylon.js — Second major 3D engine integration point; Babylon.js examples use gltfpack-optimized assets; common deployment target alongside three.js
🪄PR ideas
To work on one of these in Claude Code or Cursor, paste:
Implement the "<title>" PR idea from CLAUDE.md, working through the checklist as the task list.
Add comprehensive test suite for gltf/mesh.cpp optimization algorithms
The gltf module lacks dedicated unit tests for mesh optimization functions. mesh.cpp contains critical optimization logic but has no corresponding test file. This would catch regressions in mesh processing, vertex/index reordering, and compression algorithms before they reach users.
- [ ] Create gltf/mesh.test.cpp with test cases for mesh optimization functions
- [ ] Test vertex reordering algorithms with various mesh topologies
- [ ] Test index optimization and compression paths
- [ ] Integrate test execution into .github/workflows/build.yml
- [ ] Compare test results against reference outputs from demo/tests.cpp
Add TypeScript type definitions test suite for js/ bindings
The JavaScript module exports .d.ts files (meshopt_decoder.d.ts, meshopt_encoder.d.ts, meshopt_clusterizer.d.ts) but there are no TypeScript compilation tests to verify the types match actual implementations. This causes type safety issues for TypeScript consumers.
- [ ] Create js/types.test.ts to verify all exported TypeScript definitions compile correctly
- [ ] Test type compatibility between decoder/encoder/clusterizer interfaces and actual implementations
- [ ] Add 'tsc --noEmit' check to package.json scripts
- [ ] Integrate TypeScript type checking into .github/workflows/build.yml
- [ ] Verify index.d.ts exports match index.js and library.js functionality
Add fuzzing harness for gltf/parsegltf.cpp with integration to CIFuzz workflow
The repo has .github/workflows/cifuzz.yml and gltf/fuzz.dict, but gltf/parsegltf.cpp (the main glTF parser) lacks a dedicated fuzzing harness. Given that this is a parser for untrusted file formats, a robust fuzzing target would catch memory safety issues and parser edge cases.
- [ ] Create gltf/fuzz_parsegltf.cpp with LLVMFuzzerTestOneInput harness targeting parseGLTF()
- [ ] Extend gltf/fuzz.dict with additional glTF structure tokens (node, mesh, primitive, material)
- [ ] Configure CMakeLists.txt to build fuzzing target when FUZZING=ON
- [ ] Update .github/workflows/cifuzz.yml to include gltf/fuzz_parsegltf target
- [ ] Verify coverage includes error paths in parsegltf.cpp and parseobj.cpp
🌿Good first issues
- Add missing C API examples to src/meshoptimizer.h documentation: write 2-3 annotated code samples showing (a) how to optimize an index buffer with overdraw reduction, (b) how to simplify a mesh to 50% polygons, (c) how to encode the result for GPU streaming. Currently only header comments, no runnable examples.
- Extend demo/tests.cpp to cover edge cases in simplify.cpp: add tests for degenerate meshes (single triangle, coplanar vertices, isolated disconnected components) and validate LOD quality metrics. Current test suite focuses on happy-path.
- Write Node.js/TypeScript type definitions for gltf/library.js exports: create gltf/library.d.ts declaring function signatures, callbacks, and buffer layouts so TypeScript users of the npm package get IDE autocomplete. Currently no .d.ts file exists.
📝Recent commits
- 14b045c — Update README.md (zeux)
- 5f0a4a2 — js: Add RegularizeLight option support to simplifier (zeux)
- f116303 — Merge pull request #1051 from zeux/js-tangents (zeux)
- 4fec35d — js: Add a deprecated type alias Flags for simplification flags (zeux)
- dd56fe0 — Add a GitHub Actions check for TypeScript declarations (zeux)
- f75abf4 — js: Update README (zeux)
- 148a7f2 — js: Add a test for MeshoptTangents (zeux)
- f4b56d7 — js: Add TypeScript types for MeshoptTangents (zeux)
- 4b05dde — js: Add MeshoptTangents module for tangent generation (zeux)
- c835967 — indexcodec: Fix array formatting to avoid clang-format wrapping (zeux)
🔒Security observations
The meshoptimizer codebase demonstrates a solid overall security posture for a mesh optimization library, with an MIT license and active maintenance. However, several moderate concerns exist: (1) the Node.js engine version is not pinned in package.json, creating potential exposure to unpatched vulnerabilities; (2) no explicit dependency management is visible, increasing supply chain risk; (3) the CLI tool (gltfpack) may be vulnerable to command injection if user input is not properly sanitized when invoking external encoders; (4) fuzzing infrastructure exists but lacks documentation; (5) external libraries in the extern/ directory are not version-pinned. The codebase lacks a security policy file. Recommendations focus on dependency pinning, input validation in CLI tools, documentation of security practices, and integration of security scanning into CI/CD pipelines.
- Medium · Permissive Node Engine Requirement — gltf/package.json. The package.json specifies 'node': '>=18', which allows any version 18 or higher. This is overly permissive and does not pin to a stable, well-tested version; Node.js releases may contain security vulnerabilities that are patched in later minor/patch versions. Fix: Pin to a specific Node.js LTS version, or at least a specific major.minor version (e.g., '^18.17.0' or '18.17.x'), so security patches are applied while maintaining some flexibility.
- Medium · Missing Dependency Version Pinning — gltf/package.json. The package.json file does not specify any dependencies or devDependencies. While the CLI script references external tools (basis, webp encoders), there is no lock file or version constraint visible, which could lead to supply chain attacks or unexpected behavior from uncontrolled transitive dependencies. Fix: Explicitly list all runtime and development dependencies with specific versions; generate and commit a package-lock.json or yarn.lock file; consider npm audit and dependency scanning tools in CI/CD.
- Medium · Potential Command Injection in CLI Tool — gltf/cli.js. The gltfpack CLI tool processes user input and may invoke external programs (basis encoder, webp encoder). Without proper input validation and sanitization, this could lead to command injection vulnerabilities if user-supplied file paths or arguments are passed unsanitized to shell commands. Fix: Use safe APIs for spawning child processes (e.g., Node.js child_process.execFile() instead of exec()); avoid shell interpretation; validate and sanitize all user inputs; use an allowlist for file paths and formats.
- Low · No Security Policy Defined — repository root. The repository does not contain a SECURITY.md file or a security policy documented in the README, making it unclear how security vulnerabilities should be reported responsibly. Fix: Create a SECURITY.md file following GitHub's security policy guidelines, with instructions for responsible disclosure and private vulnerability reports.
- Low · Fuzzing Dictionary Without Context — gltf/fuzz.dict, gltf/fuzz.glb. The file gltf/fuzz.dict exists alongside gltf/fuzz.glb, indicating fuzzing infrastructure, but there is no documentation of fuzzing results or crash-handling procedures, which could mask potential security issues. Fix: Document the fuzzing infrastructure and results; ensure all discovered crashes are analyzed and fixed; include fuzzing in the CI/CD pipeline with sanitizers enabled; maintain a public fuzzing status or use OSS-Fuzz integration.
- Low · External Dependencies Without Version Control — extern/. The extern/ directory contains third-party headers (cgltf.h, fast_obj.h, sdefl.h) with no visible version information or lock mechanism; these are critical parsing libraries that could have vulnerabilities. Fix: Document the versions of external dependencies; use git submodules or a dependency manager to pin versions; regularly audit these dependencies for security updates and apply patches promptly.
LLM-derived; treat as a starting point, not a security audit.
👉Where to read next
- Open issues — current backlog
- Recent PRs — what's actively shipping
- Source on GitHub
Generated by RepoPilot. Verdict based on maintenance signals — see the live page for receipts. Re-run on a new commit to refresh.