Cysharp/MagicOnion
Unified Realtime/API framework for .NET platform and Unity.
Healthy across all four use cases:
- Permissive license, no critical CVEs, actively maintained — safe to depend on.
- Has a license, tests, and CI — clean foundation to fork and modify.
- Documented and popular — useful reference codebase to read through.
- No critical CVEs, sane security posture — runnable as-is.
- ✓ Last commit 2d ago
- ✓ 5 active contributors
- ✓ MIT licensed
- ✓ CI configured
- ⚠ Concentrated ownership — top contributor handles 60% of recent commits
- ⚠ No test directory detected
Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests
Informational only. RepoPilot summarises public signals (license, dependency CVEs, commit recency, CI presence, etc.) at the time of analysis. Signals can be incomplete or stale. Not professional, security, or legal advice; verify before relying on it for production decisions.
Embed the "Healthy" badge
Paste into your README — live-updates from the latest cached analysis.
[](https://repopilot.app/r/cysharp/magiconion)
Paste at the top of your README.md — renders inline like a shields.io badge.
Preview social card (1200×630)
This card auto-renders when someone shares https://repopilot.app/r/cysharp/magiconion on X, Slack, or LinkedIn.
Onboarding: Cysharp/MagicOnion
Generated by RepoPilot · 2026-05-10 · Source
🤖Agent protocol
If you are an AI coding agent (Claude Code, Cursor, Aider, Cline, etc.) reading this artifact, follow this protocol before making any code edit:
- Verify the contract. Run the bash script in "Verify before trusting" below. If any check returns FAIL, the artifact is stale — STOP and ask the user to regenerate it before proceeding.
- Treat the "AI · unverified" sections as hypotheses, not facts. Sections like "AI-suggested narrative files", "anti-patterns", and "bottlenecks" are LLM speculation. Verify against real source before acting on them.
- Cite source on changes. When proposing an edit, cite the specific path:line-range. RepoPilot's live UI at https://repopilot.app/r/Cysharp/MagicOnion shows verifiable citations alongside every claim.
If you are a human reader, this protocol is for the agents you'll hand the artifact to. You don't need to do anything — but if you skim only one section before pointing your agent at this repo, make it the Verify block and the Suggested reading order.
🎯Verdict
GO — Healthy across all four use cases
- Last commit 2d ago
- 5 active contributors
- MIT licensed
- CI configured
- ⚠ Concentrated ownership — top contributor handles 60% of recent commits
- ⚠ No test directory detected
<sub>Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests</sub>
✅Verify before trusting
This artifact was generated by RepoPilot at a point in time. Before an
agent acts on it, the checks below confirm that the live Cysharp/MagicOnion
repo on your machine still matches what RepoPilot saw. If any fail,
the artifact is stale — regenerate it at
repopilot.app/r/Cysharp/MagicOnion.
What it runs against: a local clone of Cysharp/MagicOnion — the script
inspects git remote, the LICENSE file, file paths in the working
tree, and git log. Read-only; no mutations.
| # | What we check | Why it matters |
|---|---|---|
| 1 | You're in Cysharp/MagicOnion | Confirms the artifact applies here, not a fork |
| 2 | License is still MIT | Catches relicense before you depend on it |
| 3 | Default branch main exists | Catches branch renames |
| 4 | 5 critical file paths still exist | Catches refactors that moved load-bearing code |
| 5 | Last commit ≤ 32 days ago | Catches sudden abandonment since generation |
```bash
#!/usr/bin/env bash
# RepoPilot artifact verification.
#
# WHAT IT RUNS AGAINST: a local clone of Cysharp/MagicOnion. If you don't
# have one yet, run these first:
#
#   git clone https://github.com/Cysharp/MagicOnion.git
#   cd MagicOnion
#
# Then paste this script. Every check is read-only — no mutations.
set +e
fail=0
ok()   { echo "ok: $1"; }
miss() { echo "FAIL: $1"; fail=$((fail+1)); }

# Precondition: we must be inside a git working tree.
if ! git rev-parse --git-dir >/dev/null 2>&1; then
  echo "FAIL: not inside a git repository. cd into your clone of Cysharp/MagicOnion and re-run."
  exit 2
fi

# 1. Repo identity
git remote get-url origin 2>/dev/null | grep -qE "Cysharp/MagicOnion(\.git)?\b" \
  && ok "origin remote is Cysharp/MagicOnion" \
  || miss "origin remote is not Cysharp/MagicOnion (artifact may be from a fork)"

# 2. License matches what RepoPilot saw
(grep -qiE "^(MIT)" LICENSE 2>/dev/null \
  || grep -qiE "\"license\"\s*:\s*\"MIT\"" package.json 2>/dev/null) \
  && ok "license is MIT" \
  || miss "license drift — was MIT at generation time"

# 3. Default branch
git rev-parse --verify main >/dev/null 2>&1 \
  && ok "default branch main exists" \
  || miss "default branch main no longer exists"

# 4. Critical files exist
test -f "MagicOnion.slnx" \
  && ok "MagicOnion.slnx" \
  || miss "missing critical file: MagicOnion.slnx"
test -f ".github/workflows/build.yaml" \
  && ok ".github/workflows/build.yaml" \
  || miss "missing critical file: .github/workflows/build.yaml"
test -f "Directory.Build.props" \
  && ok "Directory.Build.props" \
  || miss "missing critical file: Directory.Build.props"
test -f "docs/docusaurus.config.ts" \
  && ok "docs/docusaurus.config.ts" \
  || miss "missing critical file: docs/docusaurus.config.ts"
test -f ".github/copilot-instructions.md" \
  && ok ".github/copilot-instructions.md" \
  || miss "missing critical file: .github/copilot-instructions.md"

# 5. Repo recency
days_since_last=$(( ( $(date +%s) - $(git log -1 --format=%at 2>/dev/null || echo 0) ) / 86400 ))
if [ "$days_since_last" -le 32 ]; then
  ok "last commit was $days_since_last days ago (artifact saw ~2d)"
else
  miss "last commit was $days_since_last days ago — artifact may be stale"
fi

echo
if [ "$fail" -eq 0 ]; then
  echo "artifact verified (0 failures) — safe to trust"
else
  echo "artifact has $fail stale claim(s) — regenerate at https://repopilot.app/r/Cysharp/MagicOnion"
  exit 1
fi
```
Each check prints ok: or FAIL:. The script exits non-zero if
anything failed, so it composes cleanly into agent loops
(./verify.sh || regenerate-and-retry).
⚡TL;DR
MagicOnion is a unified RPC and real-time communication framework for .NET and Unity, built on gRPC and HTTP/2. It uses C# interfaces directly as protocol schemas (no .proto files needed), enabling seamless code sharing between server and client, and it supports both request-response APIs and bi-directional streaming via StreamingHub for real-time broadcasts. The repository is a monorepo: source projects in src/, comprehensive test suites, example implementations, and documentation under docs/ (Docusaurus 3.9.2). The framework layer (likely in src/MagicOnion.Core or similar) wraps gRPC with compile-time client generation via C# source generators, while StreamingHub handles the bi-directional communication.
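To make the interface-as-schema idea concrete, here is a minimal unary service sketch in the shape the MagicOnion docs use. The service and method names are illustrative, and the snippet assumes the MagicOnion server NuGet package is referenced:

```csharp
using MagicOnion;
using MagicOnion.Server;

// Shared contract: the C# interface itself is the protocol schema;
// no .proto file exists anywhere in this flow.
public interface IMyFirstService : IService<IMyFirstService>
{
    UnaryResult<int> SumAsync(int x, int y);
}

// Server-side implementation. The client-side proxy is generated from
// the same interface by the source generator, so both sides share it.
public class MyFirstService : ServiceBase<IMyFirstService>, IMyFirstService
{
    public async UnaryResult<int> SumAsync(int x, int y)
    {
        return x + y;
    }
}
```

On the client side, a proxy for the same interface is typically obtained via MagicOnionClient.Create<IMyFirstService>(channel) and invoked like an ordinary async call.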
👥Who it's for
.NET and Unity developers building multiplayer games, real-time dashboards, or microservices who want faster development cycles by using C# interfaces as contracts instead of Protocol Buffers, and who need bi-directional communication like SignalR or Socket.io but with gRPC performance.
🌱Maturity & risk
This is an actively maintained, production-ready framework from Cysharp with comprehensive CI/CD pipelines (.github/workflows contains build.yaml, release.yaml, benchmark.yaml, codeql.yml), multi-platform support (Windows, macOS, iOS, Android), and extensive documentation. The large codebase (3.4M lines of C#) and structured release process indicate maturity, though it's a specialized framework so adoption is narrower than mainstream alternatives.
The framework is maintained by a single organization (Cysharp) with no obvious indication of broader community maintainers, creating potential single-source-of-truth risk. Requires .NET 8+ on server-side which may exclude legacy projects. The complexity of gRPC + streaming + code generation means breaking changes in minor versions could impact dependent projects significantly.
Active areas of work
Active development with multiple workflows: benchmark.yaml suggests performance monitoring, build-canary.yaml indicates pre-release testing, and workflows for docs translation (docs-translation.yaml) and source generator updates (build-update-sourcegenerator-for-unity.yaml) show ongoing enhancement for Unity support and documentation expansion.
🚀Get running
```bash
git clone https://github.com/Cysharp/MagicOnion.git
cd MagicOnion
dotnet restore
dotnet build MagicOnion.slnx
```
Daily commands:
Use dotnet build MagicOnion.slnx to build the solution; specific example projects in src/ will have their own README or run configurations. The .github/workflows/build.yaml shows the official build command structure.
🗺️Map of the codebase
- MagicOnion.slnx — Primary solution file that defines the multi-project structure and build configuration for the entire MagicOnion framework
- .github/workflows/build.yaml — Core CI/CD pipeline that validates all commits and ensures code quality across .NET, Unity, and documentation builds
- Directory.Build.props — Centralized MSBuild properties file that enforces consistent versioning, compiler settings, and package dependencies across all projects
- docs/docusaurus.config.ts — Documentation site configuration that serves as the primary reference for all users and contributors learning the framework
- .github/copilot-instructions.md — AI assistant guidelines that document coding conventions and architectural decisions for new contributions
- NuGet.Config — Package source configuration critical for resolving internal and external dependencies during builds
🛠️How to make changes
Add a new StreamingHub (real-time service)
- Create a new interface inheriting from IStreamingHub<THub, TReceiver> in your service project (docs/docs/streaminghub/define-interface.md)
- Implement the interface in a concrete class with hub methods and server-to-client methods (docs/docs/streaminghub/getting-started.md)
- Register the hub in dependency injection and configure the gRPC service endpoint (docs/docs/integration/dependency-injection.md)
- Apply filters (if needed) for authentication, logging, or custom middleware via IStreamingHubFilter (docs/docs/filter/streaminghub-filter.md)
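A hedged sketch of what the first two steps define; the hub, receiver, and method names below are invented for illustration (the canonical shapes are in the docs paths above):

```csharp
using System.Threading.Tasks;
using MagicOnion;

// Receiver: methods the server pushes to connected clients.
public interface IChatHubReceiver
{
    void OnReceiveMessage(string user, string message);
}

// Hub: methods clients invoke on the server. The two interfaces together
// define the bi-directional contract, again without any .proto file.
public interface IChatHub : IStreamingHub<IChatHub, IChatHubReceiver>
{
    Task JoinAsync(string roomName, string user);
    Task SendMessageAsync(string message);
}
```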
Add a new Unary RPC service
- Define a service interface whose methods return UnaryResult<T> (MagicOnion's unary return type) (docs/docs/unary/getting-started.md)
- Implement the service class, optionally handling raw-bytes responses for binary protocols (docs/docs/unary/raw-bytes-response.md)
- Register in the gRPC service builder and configure authentication if needed (docs/docs/fundamentals/authentication.md)
- Apply client-side or server-side filters for request/response transformation (docs/docs/filter/fundamentals.md)
Configure authentication and security
- Choose an authentication method (JWT, OAuth, mTLS) and update the gRPC channel configuration (docs/docs/fundamentals/authentication.md)
- Enable HTTPS and configure SSL certificates for production deployments (docs/docs/fundamentals/https.md)
- Implement custom filters to validate credentials and attach user context (docs/docs/filter/dependency-injection.md)
- Test the authentication flow with the client generated via source generator (docs/docs/source-generator/client.md)
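Step 3's custom filter can be sketched as follows. MagicOnionFilterAttribute is MagicOnion's documented filter base type; the header name, token check, and filter name here are placeholders, not the project's actual auth logic:

```csharp
using System;
using System.Threading.Tasks;
using Grpc.Core;
using MagicOnion.Server;

// Hypothetical filter: rejects calls that arrive without a token.
public class RequireTokenFilterAttribute : MagicOnionFilterAttribute
{
    public override async ValueTask Invoke(
        ServiceContext context,
        Func<ServiceContext, ValueTask> next)
    {
        // Placeholder check: look for an "authorization" entry in the
        // incoming gRPC metadata and fail the call if it is absent.
        var token = context.CallContext.RequestHeaders.GetValue("authorization");
        if (string.IsNullOrEmpty(token))
        {
            throw new ReturnStatusException(StatusCode.Unauthenticated, "missing token");
        }

        await next(context);
    }
}
```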
Customize serialization and add encryption
- Select a serialization format (MessagePack, MemoryPack, or custom) in MagicOnionOptions (docs/docs/advanced/magiconionoptions.md)
- Implement IMessageSerializer if using custom serialization logic (docs/docs/advanced/customize-serialization-encryption.md)
- Add encryption middleware via a custom filter to encrypt payloads at the transport layer (docs/docs/filter/extensibility.md)
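Configuration happens through MagicOnionOptions at registration time. This is a sketch under the assumption that serializer swapping is exposed as an options property; the commented-out provider type name is a guess from the docs path above, so verify it against docs/docs/advanced/magiconionoptions.md before use:

```csharp
using MagicOnion.Server;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddMagicOnion(options =>
{
    // MessagePack is the default. To switch serializers, assign an
    // alternative provider here; the exact type name is an assumption:
    // options.MessageSerializer = MemoryPackMagicOnionSerializerProvider.Instance;
});
```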
🔧Why these technologies
- gRPC over HTTP/2 — Provides HTTP/2 multiplexing, bi-directional streaming, and efficient binary framing for low-latency RPC and real-time games (MagicOnion swaps Protocol Buffers for MessagePack as the payload serializer)
- C# Source Generators — Eliminates manual code generation and enables compile-time client proxy generation without runtime reflection overhead
- MessagePack / MemoryPack — Faster and more compact serialization than JSON, critical for game networking and real-time performance
- StreamingHub pattern — Abstracts away raw gRPC complexity while enabling push-based real-time server-to-client events for multiplayer games
- Docusaurus — Modern documentation platform with multi-language support (English/Japanese) and Markdown-based versioning for framework guides
⚖️Trade-offs already made
- Require source code generation instead of reflection-based proxies
  - Why: Eliminates runtime overhead and enables AOT compilation (no .NET JIT required in production)
  - Consequence: Build-time dependency on code generation tools; requires recompilation on interface changes
- Use gRPC over REST/HTTP for all RPC calls
  - Why: HTTP/2 multiplexing and streaming efficiency; symmetric messaging for games
  - Consequence: Browser clients require WebAssembly or browser gRPC libraries; less discoverable than REST APIs
- Support both unary and streaming hubs as first-class patterns
  - Why: Covers request-response (simple APIs) and push-based real-time (games) with a single framework
  - Consequence: Larger API surface; developers must choose the appropriate pattern for each use case
- Integrate filters (middleware) at framework level rather than per-service
  - Why: Cross-cutting concerns (auth, logging, metrics) apply uniformly across all services
  - Consequence: Less granular control than per-route middleware; requires understanding filter order and context propagation
🚫Non-goals (don't propose these)
- Does not provide authentication/authorization implementation; framework assumes external identity provider (JWT, OAuth, mTLS)
- Does not include database ORM or persistence layer; services are responsible for data access
- Does not support pure REST/HTTP (only gRPC); transcoding to HTTP JSON available but not primary interface
- Does not handle horizontal scaling or distributed deployment directly; relies on external load balancers and Kubernetes
- Does not provide UI framework; integrates with Unity and Blazor but does not include visual components
🪤Traps & gotchas
Requires .NET 8+ for server deployment (see README: 'MagicOnion server requires .NET 8+'). Unity support needs Unity 2022.3 LTS minimum. Source generators are involved, so cached build artifacts can cause stale code generation—clean builds may be needed. The streaming/real-time features create channel lifecycle complexity; incorrect hub disposal patterns will leak connections.
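The hub-disposal trap in particular can be avoided by tearing connections down deterministically on the client. Below is a sketch using StreamingHubClient; the IChatHub/IChatHubReceiver types and the ChatHubReceiver class are illustrative stand-ins for your own hub contract:

```csharp
using Grpc.Net.Client;
using MagicOnion.Client;

var channel = GrpcChannel.ForAddress("https://localhost:5001");
IChatHubReceiver receiver = new ChatHubReceiver(); // your receiver implementation

// ConnectAsync opens the underlying duplex streaming call.
var hub = await StreamingHubClient.ConnectAsync<IChatHub, IChatHubReceiver>(
    channel, receiver);
try
{
    await hub.JoinAsync("lobby", "alice");
}
finally
{
    // Skipping this leaks the streaming call and its connection.
    await hub.DisposeAsync();
}
```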
🏗️Architecture
💡Concepts to learn
- gRPC and HTTP/2 — MagicOnion's entire transport layer is built on gRPC over HTTP/2, so understanding multiplexing, server push, and binary framing is critical for diagnosing network issues and optimization
- StreamingHub (Bi-directional Streaming) — MagicOnion's core feature for real-time communication uses gRPC streaming patterns; you need to understand server-initiated pushes and channel lifecycle to build correct real-time features
- C# Source Generators — MagicOnion uses source generators to eliminate .proto boilerplate by generating code from C# interfaces; understanding this mechanism is essential for extending the framework or debugging code generation
- MessagePack Serialization — Although MagicOnion abstracts it away, its default wire format is MessagePack rather than Protocol Buffers; understanding MessagePack encoding helps optimize message sizes and performance
- Interface-Based Contracts (Anti-.proto) — MagicOnion's differentiator is using C# interfaces directly as RPC contracts instead of .proto IDL; this requires understanding how interface contracts map to service definitions and method signatures
- Monorepo Build Patterns (NuGet Central Package Management) — The repo uses Directory.Build.props and Directory.Packages.props for centralized dependency management across multiple projects; this pattern ensures consistency but requires understanding transitive dependency resolution
- Unity Integration with C# Generics and IL2CPP — The framework supports Unity IL2CPP compilation (mentioned in README: 'IL2C') which has different constraints than JIT; generic type handling and reflection patterns must account for IL2CPP limitations when writing framework extensions
🔗Related repos
- grpc/grpc-dotnet — The underlying gRPC transport layer that MagicOnion wraps; understanding native gRPC is essential for troubleshooting binary protocol issues
- aspnet/AspNetCore — MagicOnion server targets .NET 8+ and integrates with the ASP.NET Core runtime; knowledge of ASP.NET Core middleware and service registration patterns applies directly
- Cysharp/MemoryPack — Companion serialization library for MagicOnion users who want faster binary serialization alternatives to Protocol Buffers; recommended in the framework docs
- SignalR/SignalR — The primary alternative for .NET real-time communication; MagicOnion's StreamingHub feature is positioned as a gRPC-based alternative with different performance/latency trade-offs
- Cysharp/MessagePack-CSharp — Serialization framework also from Cysharp used in MagicOnion's serialization pipeline; understanding MessagePack schema is valuable for performance tuning
🪄PR ideas
To work on one of these in Claude Code or Cursor, paste:
Implement the "<title>" PR idea from CLAUDE.md, working through the checklist as the task list.
Add integration tests for gRPC authentication flows
The repo has docs/docs/fundamentals/authentication.md documenting authentication features, but there's no visible test coverage for authentication scenarios in a .NET/gRPC context. A new contributor could add integration tests covering JWT, mTLS, and custom authentication filters to ensure the authentication mechanisms described in the docs actually work end-to-end.
- [ ] Review docs/docs/fundamentals/authentication.md to understand supported auth mechanisms
- [ ] Create new test project (e.g., tests/MagicOnion.Tests.Integration.Auth/) with gRPC auth scenarios
- [ ] Add test cases for: JWT token validation, mTLS certificate verification, custom auth filter chains
- [ ] Integrate tests into .github/workflows/build.yaml CI pipeline
- [ ] Ensure tests cover both StreamingHub and Unary service patterns
Add CodeQL security scan for C# code with custom rules for gRPC/serialization safety
The repo has .github/codeql-config.yaml but it appears minimal. Given MagicOnion's focus on serialization (MemoryPack mentioned in docs) and streaming, custom CodeQL queries for unsafe deserialization, buffer overflows, and gRPC-specific vulnerabilities would strengthen security posture for contributors.
- [ ] Extend .github/codeql-config.yaml with C#-specific queries
- [ ] Create .github/codeql-custom-queries/ directory with rules for: unsafe serialization in MemoryPack, StreamingHub message validation, gRPC metadata injection
- [ ] Test queries against existing codebase (src/ directory) for false positives
- [ ] Document new queries in a SECURITY_SCANNING.md file
- [ ] Update .github/workflows/codeql.yml to use custom queries
Add Docusaurus component tests and accessibility compliance checks
The docs use Docusaurus 3.9.2 with mermaid diagrams, but there's no visible test coverage for the documentation site itself. A new contributor could add Playwright tests for critical doc flows (navigation, code examples, search) and WCAG 2.1 accessibility checks to prevent regression as docs grow.
- [ ] Create docs/tests/ directory with Playwright E2E tests
- [ ] Add test suite covering: doc homepage navigation, sidebar expansion, code block copy buttons, search functionality
- [ ] Add axe-core accessibility testing to docs/tests/ to verify WCAG 2.1 compliance
- [ ] Create GitHub Action in .github/workflows/docs-testing.yaml to run Playwright + a11y tests on doc changes
- [ ] Document test setup in docs/README.md with instructions for contributors
🌿Good first issues
- idea: Add comprehensive documentation examples for the MemoryPack serialization integration mentioned in docs/docs/advanced/memorypack.md—currently sparse, could benefit from side-by-side Protocol Buffer vs MemoryPack examples
- idea: Write integration tests for the 'Map Service Endpoints' feature (docs/docs/advanced/map-service-endpoints.md) to ensure endpoint mapping behavior is consistent across .NET Framework and Unity targets
- idea: Create a benchmark comparison script in .github/scripts/ (alongside existing benchmark-client-run.sh) that measures gRPC vs MagicOnion performance for same workload to quantify the interface-based abstraction overhead
⭐Top contributors
- @guitarrapc — 60 commits
- @mayuki — 33 commits
- @github-actions[bot] — 4 commits
- @dependabot[bot] — 2 commits
- @jeff-at-trimble — 1 commit
📝Recent commits
- b7af52f — Merge pull request #1052 from Cysharp/dependabot/github_actions/dependencies-ba79bdfdfb (mayuki)
- 05fcf63 — chore(deps): bump the dependencies group across 1 directory with 5 updates (dependabot[bot])
- f49f2c3 — Merge pull request #1046 from Cysharp/chore/bench2 (guitarrapc)
- a640d14 — chore: benchmark NuGet server publish failed, due to API change (guitarrapc)
- a0726e2 — Merge pull request #1045 from Cysharp/fix/timeout (guitarrapc)
- 05988a1 — chore: add logs (guitarrapc)
- d6562d6 — chore: simpler handling (guitarrapc)
- 96156a2 — chore: add datadog sent canceled log (guitarrapc)
- 29d5361 — chore: update (guitarrapc)
- 114e3a5 — chore: log (guitarrapc)
🔒Security observations
The MagicOnion repository's documentation site (built with Docusaurus) has a moderate security posture. The main concerns are loose version constraints on dependencies, missing explicit security header configuration, and lack of visible security scanning in the CI/CD pipeline. The Node.js version constraint is overly permissive. The absence of a visible lock file is concerning for reproducibility. The core dependencies (React, Docusaurus) are relatively current but should be monitored regularly. No hardcoded secrets or obvious injection risks were detected in the visible file structure. The .NET framework components (not analyzed in detail due to lack of visible source code) should be separately reviewed for security considerations.
- Medium · Outdated React Version with Known Vulnerabilities — package.json, dependencies: react, react-dom. React 19.2.3 is relatively recent but should be monitored for security patches, and react-dom should always match the React version to prevent compatibility issues that could lead to security gaps. Fix: regularly update React and react-dom to the latest stable versions; enable Dependabot alerts and automated security updates in the repository settings.
- Medium · Loose Version Constraints on Docusaurus Dependencies — package.json, dependencies section. Several Docusaurus dependencies use caret (^) constraints (@docusaurus/plugin-content-docs, @docusaurus/theme-common, @docusaurus/theme-mermaid, @mdx-js/react), which allow minor and patch updates automatically and could introduce unexpected changes or vulnerabilities from transitive dependencies. Fix: pin exact versions for critical dependencies, or at least use tilde (~) constraints to limit updates to patch versions; review the lock file (package-lock.json or yarn.lock) to ensure reproducible builds.
- Low · Missing Security Headers Configuration — docs configuration (likely docusaurus.config.js, not visible in the provided structure). The Docusaurus configuration does not explicitly set security headers (CSP, X-Frame-Options, X-Content-Type-Options) for the documentation site. Fix: configure security headers in the Docusaurus config or via the hosting platform's settings; implement a Content Security Policy (CSP) to prevent XSS attacks.
- Low · No Lock File Visible in Repository Structure — repository root. The file structure does not show package-lock.json or yarn.lock, which are critical for reproducible and secure builds; without lock files, dependency resolution can be non-deterministic. Fix: commit package-lock.json (for npm) or yarn.lock (for yarn) and keep it up-to-date so all developers and CI/CD pipelines use the same exact versions.
- Low · Node.js Version Constraint Could Be More Specific — package.json, engines.node. The engines field specifies 'node >= 18.0', which allows any Node.js version 18 or higher, including versions with known security vulnerabilities. Fix: specify a more restrictive range such as '18.x || 20.x || 22.x' and update it as new LTS versions are released; use .nvmrc or .node-version files for consistency.
- Low · No Dependency Audit Configuration Visible — package.json, scripts section. No npm audit scripts or security scanning tools appear in the scripts section, making it harder to detect vulnerable dependencies in the CI/CD pipeline. Fix: add 'npm audit' or 'yarn audit' to the CI/CD pipeline; consider security scanning tools like Snyk or WhiteSource for continuous vulnerability monitoring.
LLM-derived; treat as a starting point, not a security audit.
👉Where to read next
- Open issues — current backlog
- Recent PRs — what's actively shipping
- Source on GitHub
Generated by RepoPilot. Verdict based on maintenance signals — see the live page for receipts. Re-run on a new commit to refresh.