
PointCloudLibrary/pcl

Point Cloud Library (PCL)

Healthy across the board (overall verdict = worst of 4 axes)

  • Use as dependency: Concerns (non-standard license: Other)
  • Fork & modify: Healthy. Has a license, tests, and CI — clean foundation to fork and modify.
  • Learn from: Healthy. Documented and popular — useful reference codebase to read through.
  • Deploy as-is: Healthy. No critical CVEs, sane security posture — runnable as-is.

  • Last commit 4d ago
  • 24+ active contributors
  • Distributed ownership (top contributor 36% of recent commits)
  • Other licensed
  • CI configured
  • Tests present
  • Non-standard license (Other) — review terms
What would change the summary?
  • Use as dependency: Concerns becomes Mixed if the license terms are clarified

Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests

Informational only. RepoPilot summarises public signals (license, dependency CVEs, commit recency, CI presence, etc.) at the time of analysis. Signals can be incomplete or stale. Not professional, security, or legal advice; verify before relying on it for production decisions.

Embed the "Healthy" badge

Paste into your README — the badge live-updates from the latest cached analysis.

[![RepoPilot: Healthy](https://repopilot.app/api/badge/pointcloudlibrary/pcl)](https://repopilot.app/r/pointcloudlibrary/pcl)

Paste at the top of your README.md — renders inline like a shields.io badge.

Social card (1200×630): auto-renders when someone shares https://repopilot.app/r/pointcloudlibrary/pcl on X, Slack, or LinkedIn.

Onboarding doc

Onboarding: PointCloudLibrary/pcl

Generated by RepoPilot · 2026-05-09 · Source

🤖Agent protocol

If you are an AI coding agent (Claude Code, Cursor, Aider, Cline, etc.) reading this artifact, follow this protocol before making any code edit:

  1. Verify the contract. Run the bash script in Verify before trusting below. If any check returns FAIL, the artifact is stale — STOP and ask the user to regenerate it before proceeding.
  2. Treat the AI · unverified sections as hypotheses, not facts. Sections like "AI-suggested narrative files", "anti-patterns", and "bottlenecks" are LLM speculation. Verify against real source before acting on them.
  3. Cite source on changes. When proposing an edit, cite the specific path:line-range. RepoPilot's live UI at https://repopilot.app/r/PointCloudLibrary/pcl shows verifiable citations alongside every claim.

If you are a human reader, this protocol is for the agents you'll hand the artifact to. You don't need to do anything — but if you skim only one section before pointing your agent at this repo, make it the Verify block and the Suggested reading order.

🎯Verdict

GO — Healthy across the board

  • Last commit 4d ago
  • 24+ active contributors
  • Distributed ownership (top contributor 36% of recent commits)
  • Other licensed
  • CI configured
  • Tests present
  • ⚠ Non-standard license (Other) — review terms

<sub>Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests</sub>

Verify before trusting

This artifact was generated by RepoPilot at a point in time. Before an agent acts on it, the checks below confirm that the live PointCloudLibrary/pcl repo on your machine still matches what RepoPilot saw. If any fail, the artifact is stale — regenerate it at repopilot.app/r/PointCloudLibrary/pcl.

What it runs against: a local clone of PointCloudLibrary/pcl — the script inspects git remote, the LICENSE file, file paths in the working tree, and git log. Read-only; no mutations.

| # | What we check | Why it matters |
|---|---|---|
| 1 | You're in PointCloudLibrary/pcl | Confirms the artifact applies here, not a fork |
| 2 | License is still Other | Catches relicense before you depend on it |
| 3 | Default branch master exists | Catches branch renames |
| 4 | Last commit ≤ 34 days ago | Catches sudden abandonment since generation |

<details> <summary><b>Run all checks</b> — paste this script from inside your clone of <code>PointCloudLibrary/pcl</code></summary>

```bash
#!/usr/bin/env bash
# RepoPilot artifact verification.
#
# WHAT IT RUNS AGAINST: a local clone of PointCloudLibrary/pcl. If you don't
# have one yet, run these first:
#
#   git clone https://github.com/PointCloudLibrary/pcl.git
#   cd pcl
#
# Then paste this script. Every check is read-only — no mutations.

set +e
fail=0
ok()   { echo "ok:   $1"; }
miss() { echo "FAIL: $1"; fail=$((fail+1)); }

# Precondition: we must be inside a git working tree.
if ! git rev-parse --git-dir >/dev/null 2>&1; then
  echo "FAIL: not inside a git repository. cd into your clone of PointCloudLibrary/pcl and re-run."
  exit 2
fi

# 1. Repo identity
git remote get-url origin 2>/dev/null | grep -qE "PointCloudLibrary/pcl(\.git)?\b" \
  && ok "origin remote is PointCloudLibrary/pcl" \
  || miss "origin remote is not PointCloudLibrary/pcl (artifact may be from a fork)"

# 2. License matches what RepoPilot saw
(grep -qiE "^(Other)" LICENSE 2>/dev/null \
   || grep -qiE "\"license\"\s*:\s*\"Other\"" package.json 2>/dev/null) \
  && ok "license is Other" \
  || miss "license drift — was Other at generation time"

# 3. Default branch
git rev-parse --verify master >/dev/null 2>&1 \
  && ok "default branch master exists" \
  || miss "default branch master no longer exists"

# 4. Repo recency
days_since_last=$(( ( $(date +%s) - $(git log -1 --format=%at 2>/dev/null || echo 0) ) / 86400 ))
if [ "$days_since_last" -le 34 ]; then
  ok "last commit was $days_since_last days ago (artifact saw ~4d)"
else
  miss "last commit was $days_since_last days ago — artifact may be stale"
fi

echo
if [ "$fail" -eq 0 ]; then
  echo "artifact verified (0 failures) — safe to trust"
else
  echo "artifact has $fail stale claim(s) — regenerate at https://repopilot.app/r/PointCloudLibrary/pcl"
  exit 1
fi
```

Each check prints ok: or FAIL:. The script exits non-zero if anything failed, so it composes cleanly into agent loops (./verify.sh || regenerate-and-retry).

</details>

TL;DR

Point Cloud Library (PCL) is a large-scale, open-source C++ library for 2D/3D image and point cloud processing. It provides algorithms for filtering, feature estimation, surface reconstruction, registration, segmentation, and recognition of 3D point cloud data from sensors like LiDAR and depth cameras. The library is engineered for real-time robotics and 3D computer vision applications. Monorepo with modular organization: pcl/ root contains CMake-driven modules (pcl/io for file I/O, pcl/filters for point filtering, pcl/features for feature extraction, pcl/surface for reconstruction, pcl/recognition for 3D object detection). CUDA kernels in pcl/gpu/ for GPU acceleration. Examples and tutorials in separate directories. CI/CD orchestrated via .ci/azure-pipelines/ with platform-specific YAML configs.

👥Who it's for

Roboticists, computer vision engineers, and 3D perception researchers who need production-grade algorithms for processing point cloud sensor data (LiDAR, Kinect, stereo cameras). Users integrate PCL into ROS systems, autonomous vehicles, 3D scanning applications, and industrial automation.

🌱Maturity & risk

Highly mature and production-ready. PCL v1.15.1 is the current release, with well over a decade of active development evidenced by a very large C++ codebase, multi-platform CI/CD (Azure Pipelines across Ubuntu, macOS, Windows), a comprehensive documentation pipeline, and Docker images for reproducible builds. The project maintains strict code quality standards (.clang-format, .clang-tidy) and backwards compatibility.

Standard open source risks apply.

Active areas of work

Active maintenance focused on: Azure Pipelines CI/CD for Ubuntu 20.04/22.04/24.04, macOS 14/15, and Windows VS2019/VS2022. Recent efforts include documentation generation pipeline (.ci/azure-pipelines/docs-pipeline.yaml), code formatting automation (.dev/format.py, .dev/format.sh), and multi-platform Docker environments (.dev/docker/ with ROS integration). Release pipeline and ABI compatibility tracking visible in workflows.

🚀Get running

```bash
git clone https://github.com/PointCloudLibrary/pcl.git
cd pcl
mkdir build && cd build
cmake .. -DCMAKE_BUILD_TYPE=Release
make -j$(nproc)
sudo make install
```

See .ci/azure-pipelines/build/ YAML files for platform-specific CMake configurations and dependency installation scripts.

Daily commands: PCL is primarily a library, not an executable. After building with CMake, run the example binaries (./pcl/build/bin/pcl_*) or write C++ against the #include <pcl/...> headers. For visualization, use the pcl_viewer tool. Docker environments are pre-configured in .dev/docker/ for reproducible builds: docker build -f .dev/docker/env/Dockerfile .
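After make install, downstream projects typically consume PCL through CMake's find_package, following the pattern from PCL's own tutorials. A minimal consumer CMakeLists.txt sketch — the project name, version pin, and component list here are illustrative assumptions, not taken from this repo:

```cmake
cmake_minimum_required(VERSION 3.16)
project(pcl_consumer)  # placeholder project/target name

# Request only the PCL components you use; 1.15 matches the release
# mentioned above but any compatible version works.
find_package(PCL 1.15 REQUIRED COMPONENTS common io filters)

# The classic PCL consumer boilerplate from the official tutorials.
include_directories(${PCL_INCLUDE_DIRS})
link_directories(${PCL_LIBRARY_DIRS})
add_definitions(${PCL_DEFINITIONS})

add_executable(pcl_consumer main.cpp)  # main.cpp is a placeholder
target_link_libraries(pcl_consumer ${PCL_LIBRARIES})
```

Configure and build as usual (`cmake -B build && cmake --build build`); if find_package fails, point CMake at your install with -DPCL_DIR.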

🗺️Map of the codebase

  • CMakeLists.txt: Root CMake configuration defining all PCL modules, dependencies (Eigen, Boost, VTK), compiler flags, and optional features (CUDA, Python bindings)
  • .ci/azure-pipelines/azure-pipelines.yaml: Master CI/CD pipeline orchestrating builds across Ubuntu, macOS, Windows; stages for formatting, docs, tutorials, and releases
  • pcl/io/src/: Point cloud file I/O implementations (PCD, PLY, OBJ, LAZ formats) critical for data loading—entry point for many applications
  • pcl/filters/src/: Core spatial filters (voxel downsampling, statistical outlier removal, conditional filters) most frequently used in preprocessing pipelines
  • pcl/registration/src/: Point cloud alignment algorithms (ICP, NDT) fundamental for SLAM and multi-scan fusion
  • pcl/gpu/: CUDA-accelerated implementations for performance-critical operations (nearest-neighbor search, normal estimation, downsampling)
  • .dev/format.py: Automated code formatting enforcement ensuring consistency across 20M+ lines of C++ codebase
  • .clang-tidy: Static analysis configuration for code quality checks, integrated into CI pipeline via .github/workflows/clang-tidy.yml
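The registration module in the map above centers on algorithms like ICP. To illustrate the core idea (match each source point to its nearest target point, then estimate a transform from the correspondences), here is a toy, PCL-free, translation-only single iteration in plain C++. This is a sketch of the concept, not PCL's API; the real pcl::IterativeClosestPoint also estimates rotation, uses a KD-tree for matching, and iterates to convergence:

```cpp
#include <array>
#include <cmath>
#include <cstdio>
#include <vector>

using Point = std::array<double, 3>;

static double sqDist(const Point& a, const Point& b) {
  double s = 0;
  for (int i = 0; i < 3; ++i) { double d = a[i] - b[i]; s += d * d; }
  return s;
}

// One translation-only ICP iteration: returns the mean offset from each
// source point to its nearest target point.
Point icpTranslationStep(const std::vector<Point>& src, const std::vector<Point>& tgt) {
  Point mean{0, 0, 0};
  for (const Point& p : src) {
    // Brute-force nearest neighbour; real implementations use FLANN/KD-trees.
    const Point* best = &tgt[0];
    for (const Point& q : tgt)
      if (sqDist(p, q) < sqDist(p, *best)) best = &q;
    for (int i = 0; i < 3; ++i) mean[i] += ((*best)[i] - p[i]) / src.size();
  }
  return mean;
}

int main() {
  // Target: a 4x4 planar grid with unit spacing.
  std::vector<Point> tgt;
  for (int x = 0; x < 4; ++x)
    for (int y = 0; y < 4; ++y) tgt.push_back({double(x), double(y), 0.0});
  // Source: the same grid shifted by (-0.1, 0, 0); offset << spacing,
  // so nearest-neighbour matching recovers the true correspondences.
  std::vector<Point> src;
  for (const Point& p : tgt) src.push_back({p[0] - 0.1, p[1], p[2]});
  Point t = icpTranslationStep(src, tgt);
  std::printf("estimated translation: %.3f %.3f %.3f\n", t[0], t[1], t[2]);
  return 0;
}
```

Because the offset is small relative to point spacing, one step recovers the full translation; with larger misalignments ICP must iterate, which is why initial alignment quality matters so much in practice.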

🛠️How to make changes

  1. Add a point cloud processing algorithm: create a new filter class in pcl/filters/src/ following existing patterns (e.g., voxel_grid.cpp).
  2. Bug fix: locate the module in pcl/<module>/src/ or include/pcl/<module>/, and add a unit test in test/.
  3. GPU acceleration: add a CUDA kernel in pcl/gpu/<module>/ with corresponding .cu files.
  4. Code style: run .dev/format.sh before committing.
  5. CI validation: pushing triggers an Azure Pipelines run via .ci/azure-pipelines/azure-pipelines.yaml.

🪤Traps & gotchas

  1. CMake dependency order matters: VTK, Eigen3, and Boost must be installed before configuring PCL; the order in CMakeLists.txt is strict.
  2. CUDA is optional, but GPU modules won't build without nvcc: check cmake .. -DBUILD_GPU=ON and verify that the CUDA Toolkit version matches your GPU architecture.
  3. Python bindings are opt-in: -DPYTHON_LIBRARY_DIRS and -DPYTHON_INCLUDE_DIRS must be set explicitly; the defaults often miss Anaconda/venv installs.
  4. ABI compatibility: PCL v1.x has stricter ABI guarantees than v2.x (experimental). Linking against different Boost versions causes mysterious crashes.
  5. Large memory footprint: loading uncompressed point cloud files (>1B points) requires explicit memory management; there is no built-in lazy loading.
  6. Visualization requires X11/Wayland on Linux: the VTK-based PCL Viewer fails silently in headless containers without a DISPLAY variable and OpenGL.

💡Concepts to learn

  • Voxel Grid Downsampling — Essential preprocessing technique in PCL for reducing point cloud density while preserving spatial structure; most pipelines start here to improve algorithm speed
  • Iterative Closest Point (ICP) — Core PCL algorithm for point cloud registration/alignment; fundamental for SLAM, multi-scan fusion, and 3D reconstruction—extensively used across robotics
  • FLANN (Fast Library for Approximate Nearest Neighbors) — PCL depends heavily on FLANN for accelerating k-NN queries in feature matching and registration; understanding its tree structures (KD-Tree, Random Forests) is critical for performance tuning
  • Surface Reconstruction (Poisson, Moving Least Squares) — PCL offers multiple algorithms to convert sparse point clouds into continuous mesh surfaces; essential for 3D scanning and object reconstruction workflows
  • Point Feature Descriptors (FPFH, VFH, SHOT) — Fast Point Feature Histograms and variants are local shape descriptors used in PCL for object recognition and correspondence matching across clouds
  • Normal Estimation via Covariance Analysis — Computing surface normals from point neighborhoods is a prerequisite for most PCL algorithms (feature extraction, segmentation, visualization); PCA-based approach used throughout
  • Octree Spatial Indexing — PCL uses octrees for efficient spatial queries, compression, and change detection in large point clouds; critical for handling dynamic sensor streams
  • opencv/opencv — 2D image processing equivalent; PCL and OpenCV are complementary—users typically combine them for hybrid 2D/3D vision pipelines
  • ros-perception/perception_pcl — Official ROS integration layer for PCL; wraps PCL algorithms as ROS nodes (tf frames, sensor_msgs/PointCloud2), essential for robotics workflows
  • ceres-solver/ceres-solver — Non-linear optimization library that PCL depends on for advanced registration algorithms and surface fitting
  • PointCloudLibrary/pcl_ros — ROS 2 bindings and utilities for PCL, successor to perception_pcl for modern ROS 2 systems
  • davisking/dlib — Alternative 3D computer vision library with some overlapping functionality (registration, ML-based detection) but lighter weight

🪄PR ideas

To work on one of these in Claude Code or Cursor, paste: Implement the "<title>" PR idea from CLAUDE.md, working through the checklist as the task list.

Add comprehensive unit tests for 2D module (convolution, morphology, keypoint detection)

The 2d/ module has implementation files (convolution.cpp, examples.cpp) but no corresponding test directory is visible in the file structure. PCL has mature testing infrastructure in other modules. Adding unit tests for 2D image processing operations (convolution.h, morphology.h, keypoint.h, edge.h) would improve code reliability and catch regressions, especially since 2D operations are frequently used in computer vision pipelines.

  • [ ] Create test/2d/ directory following PCL's existing test structure
  • [ ] Add unit tests for convolution operations (kernel.hpp, convolution.hpp edge cases)
  • [ ] Add unit tests for morphology operations (erosion, dilation) with sample 2D arrays
  • [ ] Add unit tests for keypoint detection and edge detection algorithms
  • [ ] Integrate tests into CMake build via 2d/CMakeLists.txt
  • [ ] Ensure tests run in existing CI pipelines (.ci/azure-pipelines/build/*.yaml)

Create GitHub Actions workflow to validate code formatting consistency

PCL has .clang-format and .dev/format.sh/.dev/format.py scripts but currently only runs clang-tidy in .github/workflows/clang-tidy.yml. A dedicated GitHub Actions workflow that automatically runs the formatting check on PRs would catch style violations before review, reducing back-and-forth on formatting issues and maintaining consistency across the large codebase.

  • [ ] Create .github/workflows/code-format-check.yml
  • [ ] Add step to run .dev/format.py against modified files in PR
  • [ ] Configure to fail the check if formatting violations are found
  • [ ] Add option to auto-fix and commit formatting on demand using workflow dispatch
  • [ ] Reference .clang-format configuration in the workflow
  • [ ] Document in CONTRIBUTING.md (if it exists) or via workflow description

Add macOS-specific compiler warning suppressions and build documentation

The .ci/azure-pipelines/build/macos.yaml exists but there's no visible documentation explaining macOS-specific build quirks or compiler warnings. PCL likely encounters clang-specific warnings on macOS that differ from Linux/Windows. Adding a macOS build guide with documented suppressions in .clang-format or CMakeLists.txt would help contributors debug macOS build failures and reduce duplicate issues.

  • [ ] Analyze macOS CI logs to identify common clang warnings in PCL modules
  • [ ] Document macOS-specific compiler flags in CMakeLists.txt or a BUILDING_macOS.md file
  • [ ] Add macOS compiler warning suppressions to .clang-format if needed
  • [ ] Create or update build troubleshooting documentation for macOS developers
  • [ ] Add macOS-specific test cases to .ci/azure-pipelines/build/macos.yaml for edge cases
  • [ ] Link to documentation from .github/ISSUE_TEMPLATE/compilation-failure.md

🌿Good first issues

  • Add comprehensive unit tests for pcl/filters/src/radius_outlier_removal.cpp—currently lacks edge-case coverage for empty clouds and very small radius values. See existing test pattern in test/filters/test_statistical_outlier.cpp.
  • Document CUDA compute capability requirements in README and .dev/docker/ Dockerfiles. Currently unclear whether PCL supports Pascal (CC 6.1) vs. newer architectures; create a compatibility matrix.
  • Implement missing Python bindings for pcl::gpu:: modules. The pcl/bindings/py/ directory exists but GPU acceleration is not exposed to Python users—wrap key functions like gpu::downsampling using pybind11 pattern from pcl/bindings/py/pcl_io.cpp.


📝Recent commits

  • 4a6fbd8 — Merge pull request #6435 from daeho-ro/master (mvieth)
  • 767d24b — Merge pull request #6436 from AlrIsmail/optimize-ellipse3d-eigen-solver (mvieth)
  • 185b909 — sample_consensus: optimize Ellipse3D solver with computeDirect (float) (AlrIsmail)
  • ae58fbb — Merge pull request #6430 from AlrIsmail/refactor-ellipse3d-levmarq-to-internal (mvieth)
  • 490996e — Add x11 on linux when vtk 9.6+ (daeho-ro)
  • 4ddd51a — sample_consensus: reduce SampleConsensusModelEllipse3D template instantiations (AlrIsmail)
  • 92e58bb — Fix typo in warning message about field casting (#6433) (mvieth)
  • aa050fd — Fix typo in warning message about field casting (themightyoarfish)
  • f0e75e5 — Replace some deprecated Eigen and VTK functions (#6420) (mvieth)
  • 29feb18 — Added unit test for points almost on a line, that previously resulted in NAN values. (#6426) (larshg)

🔒Security observations

The Point Cloud Library codebase shows a reasonable security posture, with a permissive BSD-style license (reported as "Other" by repository metadata) and active CI/CD infrastructure. The primary concern is a historical security incident involving the project's previous website. The main areas requiring attention are: (1) review of Docker configurations for best practices, (2) audit of CI/CD pipeline files for credential management, (3) review of shell and Python scripts for injection vulnerabilities, and (4) ongoing monitoring for supply chain security. No critical vulnerabilities were identified in the provided file structure, but full content analysis of scripts and configuration files is recommended.

  • Medium · Potential Old Website Security History — README.md. The README mentions that an old version of the project website (pointclouds.org) was previously hacked and could still be hosting malicious code. This is a historical security incident that users should be aware of. Fix: Ensure all users are directed to the current, rebuilt official website (https://pointclouds.org). Consider adding security.txt and HSTS headers to the site. Monitor for any reports of compromised versions or artifacts from the old deployment.
  • Low · Docker Configuration Files Present — .dev/docker/*/Dockerfile. Multiple Dockerfile entries are present in the repository (.dev/docker). While not inherently vulnerable, Docker files should be reviewed to ensure they don't contain hardcoded credentials, use minimal base images, and follow security best practices. Fix: Review all Dockerfile configurations for: (1) Hardcoded credentials, (2) Use of latest/unspecified base image tags, (3) Running processes as root, (4) Unnecessary layers and dependencies. Implement image scanning in CI/CD pipeline.
  • Low · CI/CD Pipeline Configuration Review Needed — .ci/azure-pipelines/*.yaml, .github/workflows/*.yml. Azure Pipelines and GitHub Actions workflows are present but the actual content of these files was not provided for analysis. These files could potentially contain secrets, insecure build practices, or privilege escalation risks. Fix: Review all CI/CD pipeline files for: (1) Hardcoded secrets or credentials, (2) Use of secure secret management, (3) Principle of least privilege for service accounts, (4) Build artifact signing and verification.
  • Low · Script Files Requiring Security Review — .ci/scripts/*.sh, .dev/scripts/*.bash, .dev/scripts/*.py. Shell and Python scripts are present in the repository (.ci/scripts/, .dev/scripts/) which should be reviewed for injection vulnerabilities, unsafe command execution, and proper input validation. Fix: Review all scripts for: (1) Input validation and sanitization, (2) Safe command execution without shell injection risks, (3) Proper use of quoting and escaping, (4) Avoidance of eval() and similar unsafe functions.

LLM-derived; treat as a starting point, not a security audit.


Generated by RepoPilot. Verdict based on maintenance signals — see the live page for receipts. Re-run on a new commit to refresh.
