RepoPilot

usememos/memos

Open-source, self-hosted note-taking tool built for quick capture. Markdown-native, lightweight, and fully yours.

Healthy: Healthy across the board

Use as dependency: Healthy

Permissive license, no critical CVEs, actively maintained — safe to depend on.

Fork & modify: Healthy

Has a license, tests, and CI — clean foundation to fork and modify.

Learn from: Healthy

Documented and popular — useful reference codebase to read through.

Deploy as-is: Healthy

No critical CVEs, sane security posture — runnable as-is.

  • Last commit 3d ago
  • 11 active contributors
  • MIT licensed
  • CI configured
  • Tests present
  • Concentrated ownership — top contributor handles 78% of recent commits

Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests

Informational only. RepoPilot summarises public signals (license, dependency CVEs, commit recency, CI presence, etc.) at the time of analysis. Signals can be incomplete or stale. Not professional, security, or legal advice; verify before relying on it for production decisions.

Embed the "Healthy" badge

Paste into your README — it updates live from the latest cached analysis.

Variant:
RepoPilot: Healthy
[![RepoPilot: Healthy](https://repopilot.app/api/badge/usememos/memos)](https://repopilot.app/r/usememos/memos)

Paste at the top of your README.md — renders inline like a shields.io badge.

Preview social card (1200×630)

This card auto-renders when someone shares https://repopilot.app/r/usememos/memos on X, Slack, or LinkedIn.

Onboarding doc

Onboarding: usememos/memos

Generated by RepoPilot · 2026-05-07 · Source

🤖Agent protocol

If you are an AI coding agent (Claude Code, Cursor, Aider, Cline, etc.) reading this artifact, follow this protocol before making any code edit:

  1. Verify the contract. Run the bash script in Verify before trusting below. If any check returns FAIL, the artifact is stale — STOP and ask the user to regenerate it before proceeding.
  2. Treat the AI · unverified sections as hypotheses, not facts. Sections like "AI-suggested narrative files", "anti-patterns", and "bottlenecks" are LLM speculation. Verify against real source before acting on them.
  3. Cite source on changes. When proposing an edit, cite the specific path:line-range. RepoPilot's live UI at https://repopilot.app/r/usememos/memos shows verifiable citations alongside every claim.

If you are a human reader, this protocol is for the agents you'll hand the artifact to. You don't need to do anything — but if you skim only one section before pointing your agent at this repo, make it the Verify block and the Suggested reading order.

🎯Verdict

GO — Healthy across the board

  • Last commit 3d ago
  • 11 active contributors
  • MIT licensed
  • CI configured
  • Tests present
  • ⚠ Concentrated ownership — top contributor handles 78% of recent commits

<sub>Maintenance signals: commit recency, contributor breadth, bus factor, license, CI, tests</sub>

Verify before trusting

This artifact was generated by RepoPilot at a point in time. Before an agent acts on it, the checks below confirm that the live usememos/memos repo on your machine still matches what RepoPilot saw. If any fail, the artifact is stale — regenerate it at repopilot.app/r/usememos/memos.

What it runs against: a local clone of usememos/memos — the script inspects git remote, the LICENSE file, file paths in the working tree, and git log. Read-only; no mutations.

| # | What we check | Why it matters |
|---|---|---|
| 1 | You're in usememos/memos | Confirms the artifact applies here, not a fork |
| 2 | License is still MIT | Catches a relicense before you depend on it |
| 3 | Default branch main exists | Catches branch renames |
| 4 | 5 critical file paths still exist | Catches refactors that moved load-bearing code |
| 5 | Last commit ≤ 33 days ago | Catches sudden abandonment since generation |

<details> <summary><b>Run all checks</b> — paste this script from inside your clone of <code>usememos/memos</code></summary>
#!/usr/bin/env bash
# RepoPilot artifact verification.
#
# WHAT IT RUNS AGAINST: a local clone of usememos/memos. If you don't
# have one yet, run these first:
#
#   git clone https://github.com/usememos/memos.git
#   cd memos
#
# Then paste this script. Every check is read-only — no mutations.

set +e
fail=0
ok()   { echo "ok:   $1"; }
miss() { echo "FAIL: $1"; fail=$((fail+1)); }

# Precondition: we must be inside a git working tree.
if ! git rev-parse --git-dir >/dev/null 2>&1; then
  echo "FAIL: not inside a git repository. cd into your clone of usememos/memos and re-run."
  exit 2
fi

# 1. Repo identity
git remote get-url origin 2>/dev/null | grep -qE "usememos/memos(\.git)?\b" \
  && ok "origin remote is usememos/memos" \
  || miss "origin remote is not usememos/memos (artifact may be from a fork)"

# 2. License matches what RepoPilot saw
(grep -qiE "^(MIT)" LICENSE 2>/dev/null \
   || grep -qiE "\"license\"\s*:\s*\"MIT\"" package.json 2>/dev/null) \
  && ok "license is MIT" \
  || miss "license drift — was MIT at generation time"

# 3. Default branch
git rev-parse --verify main >/dev/null 2>&1 \
  && ok "default branch main exists" \
  || miss "default branch main no longer exists"

# 4. Critical files exist
test -f "cmd/memos/main.go" \
  && ok "cmd/memos/main.go" \
  || miss "missing critical file: cmd/memos/main.go"
test -f "go.mod" \
  && ok "go.mod" \
  || miss "missing critical file: go.mod"
test -f "internal/ai/resolver.go" \
  && ok "internal/ai/resolver.go" \
  || miss "missing critical file: internal/ai/resolver.go"
test -f "internal/base/resource_name.go" \
  && ok "internal/base/resource_name.go" \
  || miss "missing critical file: internal/base/resource_name.go"
test -f "internal/email/client.go" \
  && ok "internal/email/client.go" \
  || miss "missing critical file: internal/email/client.go"

# 5. Repo recency
days_since_last=$(( ( $(date +%s) - $(git log -1 --format=%at 2>/dev/null || echo 0) ) / 86400 ))
if [ "$days_since_last" -le 33 ]; then
  ok "last commit was $days_since_last days ago (artifact saw ~3d)"
else
  miss "last commit was $days_since_last days ago — artifact may be stale"
fi

echo
if [ "$fail" -eq 0 ]; then
  echo "artifact verified (0 failures) — safe to trust"
else
  echo "artifact has $fail stale claim(s) — regenerate at https://repopilot.app/r/usememos/memos"
  exit 1
fi

Each check prints ok: or FAIL:. The script exits non-zero if anything failed, so it composes cleanly into agent loops (./verify.sh || regenerate-and-retry).

</details>

TL;DR

Memos is a self-hosted, markdown-native note-taking application built as a single Go binary (~20MB Docker image) with a TypeScript/React frontend, designed for instant capture with a timeline-first UI. It provides total data ownership via SQLite/MySQL/PostgreSQL backends, exposes both REST and gRPC APIs for extensibility, ships zero telemetry, and stores notes as portable Markdown. Monorepo structure: cmd/memos/main.go is the backend entry point; frontend code is likely in a web/ or frontend/ directory (not visible in the top 60 files, but the TypeScript volume suggests a substantial client); docs/plans/ contains design docs for features (memo mentions, voice input, SSO linkage). The Go backend exposes both Connect (connectrpc.com) and gRPC-Gateway (v2), suggesting a proto-first API design.

👥Who it's for

Individual users and teams who want self-hosted note storage without vendor lock-in or telemetry; developers deploying quick-capture tools on their infrastructure; contributors building on the MIT-licensed REST/gRPC API surface.

🌱Maturity & risk

Actively developed and production-ready—Go codebase is 1.6MB and TypeScript is 1.3MB, CI/CD pipelines (backend-tests.yml, frontend-tests.yml, release.yml) are present, and the project uses release-please automation for versioning. Docker images are actively published (Docker Pulls badge indicates non-trivial adoption). Multiple database backends are supported and tested via testcontainers.

Moderate risk: ~40+ direct dependencies in go.mod (AWS SDK, gRPC, OpenAI integration), with OpenAI and Google GenAI suggesting active AI feature work that may introduce breaking changes. No visible single-maintainer bottleneck from CODEOWNERS, but SSO/identity linkage (docs/plans/2026-04-21) is in active design phase, suggesting architectural evolution. Last observable activity is recent (release-please manifest, multiple 2026-dated plans).

Active areas of work

Active feature development across six areas: (1) memo detail outline and UI refinement (2026-03-23); (2) tag blur attribute for privacy (2026-03-23); (3) user resource identifiers (2026-03-24); (4) voice input (2026-03-31); (5) memo mentions/tagging (2026-04-06); (6) SSO/identity linkage (2026-04-21). AI integrations (OpenAI, Google GenAI) and MCP protocol support (mark3labs/mcp-go) are already in dependencies.

🚀Get running

git clone https://github.com/usememos/memos.git
cd memos
# Build Go backend
go mod download
go build -o memos ./cmd/memos
# Or run via Docker
docker build -t memos:latest .
docker run -p 5230:5230 memos:latest

Daily commands:

# Backend (from repo root after go mod download)
go run ./cmd/memos
# or
./memos  # after go build
# Likely listens on :5230 (Docker exposes 5230)

# Frontend (if separate): not visible, but likely npm/yarn in web/ directory
# Check Makefile or .github/workflows/frontend-tests.yml for exact commands

🗺️Map of the codebase

  • cmd/memos/main.go — Application entry point; initializes server, database, and core services that all requests flow through.
  • go.mod — Defines all external dependencies (Connect RPC, Echo, OpenAI, AWS S3, PostgreSQL drivers); critical for understanding integration points.
  • internal/ai/resolver.go — Routes AI requests to appropriate providers (Gemini, OpenAI); core abstraction for multi-provider AI support.
  • internal/base/resource_name.go — Implements resource naming convention used throughout the codebase for consistent identifier handling.
  • internal/email/client.go — Email service abstraction; demonstrates communication patterns used across internal services.
  • internal/cron/cron.go — Custom cron scheduler implementation; powers background tasks and scheduled operations.
  • internal/ai/audio/webm.go — WebM audio handling for transcription pipeline; critical for voice input processing.

🛠️How to make changes

Add a new AI provider (e.g., Claude, Anthropic)

  1. Create provider implementation in internal/ai/[provider_name]/[provider_name].go with interface matching stt.STT or audiollm.AudioLLM (internal/ai/stt/openai/openai.go (as reference template))
  2. Add provider option struct in internal/ai/stt/options.go or internal/ai/audiollm/options.go (internal/ai/stt/options.go)
  3. Register provider in internal/ai/resolver.go's provider resolution logic to route requests appropriately (internal/ai/resolver.go)
  4. Add unit tests following existing pattern in internal/ai/stt/openai/openai_test.go (internal/ai/stt/openai/openai_test.go (as reference))

Add scheduled background task (e.g., cleanup, sync, notification)

  1. Define task logic as a function matching cron.Job interface in internal/cron/cron.go (internal/cron/cron.go)
  2. Register cron schedule in application bootstrap using cron.New() with your job and schedule expression (cmd/memos/main.go)
  3. Add unit tests for cron schedule parsing and execution logic (internal/cron/cron_test.go)

Add email notification feature

  1. Define email template and message in internal/email/message.go (internal/email/message.go)
  2. Create email sending function using internal/email/client.go's Send() method (internal/email/client.go)
  3. Hook email sending into relevant service (e.g., on memo creation, user registration) with configuration check (internal/email/config.go)
  4. Add tests validating SMTP client behavior and message formatting (internal/email/email_test.go)

Add support for new audio codec

  1. Implement codec decoder in internal/ai/audio/[codec_name].go following WebM pattern (internal/ai/audio/webm.go (as reference))
  2. Add codec detection and validation to audio preprocessing pipeline (internal/ai/audio/webm.go)
  3. Update AI service resolver to handle new codec before sending to STT/AudioLLM (internal/ai/resolver.go)
  4. Add codec-specific unit tests with sample files (internal/ai/audio/webm_test.go (as reference))

🔧Why these technologies

  • Connect RPC + gRPC-Gateway — Provides type-safe service definitions, code generation, and bridges gRPC to HTTP/JSON for browser clients and REST consumers.
  • Echo (HTTP framework) — Lightweight, fast HTTP server with middleware support; chosen for quick prototyping and minimal overhead in self-hosted deployments.
  • PostgreSQL + MySQL drivers — Supports multiple SQL backends; PostgreSQL for production deployments, MySQL for compatibility and lower resource footprint.
  • AWS S3 SDK — Enables offloading media storage (audio, attachments) to S3-compatible services; reduces self-hosted storage burden.
  • OpenAI + Google Gemini APIs — Multi-provider AI support for STT and LLM; avoids vendor lock-in and allows users to choose cost/quality tradeoffs.
  • Custom Cron scheduler — Replaces external schedulers; enables purely self-contained deployments without dependencies on external task runners.

⚖️Trade-offs already made

  • Self-hosted cron vs. external job queue (e.g., Celery, Bull)

    • Why: Reduces operational complexity and external dependencies for single-instance deployments.
    • Consequence: Cron tasks execute in-process; not suitable for distributed deployments or high-frequency polling. No built-in retry/backoff for failed jobs.
  • Direct AI API calls vs. local ML models

    • Why: Avoids VRAM/compute overhead; democratizes AI features to resource-constrained self-hosted instances.
    • Consequence: Requires external API credentials (OpenAI, Gemini); introduces latency (~2-8s per request) and per-request costs. Privacy concerns for sensitive content.
  • SQL database vs. document store (e.g., MongoDB)

    • Why: Simpler schema, ACID guarantees, and well-understood migrations for a note-taking tool with hierarchical data (users → memos → tags).
    • Consequence: Less flexible for unstructured data; schema migrations required for new features.
  • WebM audio codec vs. universal formats (MP3, WAV)

    • Why: WebM is browser-native and reduces transcoding overhead in client-side recording.
    • Consequence: Requires explicit WebM decoder (at-wat/ebml-go); older clients or third-party tools may not support it.

🚫Non-goals (don't propose these)

  • Real-time collaborative editing (designed for personal/team async note-taking, not shared documents).
  • Distributed deployments with multi-node synchronization (single-instance or simple replication only).
  • On-device LLM inference (relies on external cloud APIs for AI features).
  • Windows-native GUI (web-based, supports any OS with a browser).
  • Closed-source or proprietary (fully open-source; no enterprise licensing tiers).

🪤Traps & gotchas

  • Multi-DB support complexity: code must handle SQLite (testcontainers/modernc), MySQL (testcontainers/mysql), and PostgreSQL (testcontainers/postgres) — schema migrations and quirks differ per backend.
  • gRPC + REST dual API: connectrpc.com and grpc-gateway v2 are both in use — proto changes affect two code paths.
  • Frontend location unclear: not in the top 60 files; likely a separate web/ or frontend/ directory or submodule — check the git structure before assuming src/.
  • AI integrations are active: OpenAI and Google GenAI in dependencies suggest feature-gated code paths; config may require API keys for local dev.
  • Voice/audio handling: pion/opus and ebml-go suggest WebM media processing — not trivial to test locally without media input.

🏗️Architecture

💡Concepts to learn

  • Protocol Buffers (proto3) + Code Generation — Memos uses connectrpc.com and grpc-gateway to auto-generate REST + gRPC from single proto definition; understanding protoc compilation is essential for API changes
  • gRPC-Web and Connect RPC — connectrpc.com/connect v1 is in dependencies for browser-compatible gRPC; different from traditional gRPC/2 streaming, affects how frontend calls backend
  • SQLite + Testcontainers for DB-Agnostic Testing — Repo supports 3 databases; testcontainers-go spins up real MySQL and PostgreSQL for CI; testcontainers itself is a non-obvious dependency for integration tests
  • Markdown Parsing with Goldmark — yuin/goldmark is in dependencies; Memos stores notes as Markdown and parses/renders them; understanding its AST and plugins is key for adding syntax features (mentions, blur attribute)
  • OAuth2 and SSO Integration — golang.org/x/oauth2 is present and SSO/identity linkage (2026-04-21 plan) is active; single sign-on patterns are non-trivial (token refresh, OIDC vs. OAuth2 differences)
  • S3-Compatible Storage (AWS SDK v2) — AWS SDK v2 + S3 service in dependencies; Memos likely supports object storage for media/attachments; understanding v2 async patterns differs from v1
  • Cobra CLI Framework — spf13/cobra + viper (config) are in dependencies; Memos binary has CLI flags for database selection, server port, etc.; understanding Cobra's command tree is essential for config changes
  • logseq/logseq — Markdown-native note-taking with local-first philosophy; similar 'total data ownership' positioning but for knowledge graphs vs. timeline capture
  • obsidian-md/obsidian-releases — Desktop markdown note-taking with sync; competes on ease-of-use but closed-source; Memos differentiator is self-hosted + web-native
  • nextcloud/notes — Self-hosted note app built on Nextcloud; similar infrastructure-agnostic deployment model but less lightweight (PHP vs. Go binary)
  • prometheus-community/helm-charts — Popular Helm patterns for deploying single Go binaries in Kubernetes; relevant for Memos users scaling beyond Docker
  • openai/openai-go — Official OpenAI Go SDK (in Memos dependencies); reference for AI feature integration patterns used in voice/mentions

🪄PR ideas

To work on one of these in Claude Code or Cursor, paste: Implement the "<title>" PR idea from CLAUDE.md, working through the checklist as the task list.

Add unit tests for internal/ai/audio/webm.go WebM parsing

The parser internal/ai/audio/webm.go exists; webm_test.go is present but likely incomplete. Given the audio processing pipeline for voice input and transcription features (docs/plans/2026-03-31-quick-voice-input, docs/superpowers/plans/2026-05-02-stt-audiollm-split.md), comprehensive tests for WebM container parsing, frame extraction, and error handling are critical. This directly supports the audio/LLM transcription pipeline.

  • [ ] Review internal/ai/audio/webm.go to understand WebM parsing logic
  • [ ] Add test cases for valid WebM file parsing with different codec configurations
  • [ ] Add test cases for malformed/invalid WebM files and edge cases
  • [ ] Add test cases for frame extraction and metadata parsing
  • [ ] Ensure coverage of error paths and boundary conditions
  • [ ] Run tests with go test ./internal/ai/audio/... and verify >80% coverage

Add GitHub Actions workflow for Go security vulnerability scanning

The repo has backend-tests.yml and proto-linter.yml workflows, but no dedicated security scanning. With dependencies on AWS SDK, OpenAI, Google GenAI, and OAuth2 packages, a gosec or go-sec workflow is essential for catching vulnerabilities early. This is critical for a self-hosted note-taking tool handling user data.

  • [ ] Create .github/workflows/go-security-scan.yml
  • [ ] Configure golangci-lint with gosec linter (already configured in .golangci.yaml)
  • [ ] Run on PR and push to main branches
  • [ ] Set to fail on medium/high severity issues
  • [ ] Add step to check for outdated Go dependencies using go list -json -m all | nancy sleuth
  • [ ] Reference this in SECURITY.md to document the scanning process
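A minimal sketch of such a workflow, with illustrative action versions and job names (verify against the repo's existing workflows and pin versions before adopting):

```yaml
# .github/workflows/go-security-scan.yml — illustrative sketch, not the
# repo's actual workflow.
name: go-security-scan
on:
  push:
    branches: [main]
  pull_request:
jobs:
  gosec:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-go@v5
        with:
          go-version: stable
      - name: Run gosec via golangci-lint
        run: |
          go install github.com/golangci/golangci-lint/cmd/golangci-lint@latest
          golangci-lint run --enable gosec ./...
```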

Add integration tests for AI providers (Gemini, OpenAI) with test containers

The internal/ai/ directory has audiollm/gemini and openai integrations, but no visible integration tests. The repo already uses testcontainers-go for MySQL and PostgreSQL (go.mod). Adding containerized tests for AI provider interactions (with mocked endpoints) would ensure the AI pipeline works correctly across the supported providers and catch integration issues before release.

  • [ ] Create internal/ai/integration_test.go for provider integration tests
  • [ ] Set up mock OpenAI and Gemini API endpoints using httptest.Server
  • [ ] Add tests for AudioLLM transcription flow with internal/ai/audiollm/audiollm.go
  • [ ] Add tests for error handling (API failures, rate limits, timeouts)
  • [ ] Use build tags (//go:build integration) to make them optional in CI
  • [ ] Document how to run integration tests in CONTRIBUTING.md or equivalent
  • [ ] Verify tests run with go test -tags=integration ./internal/ai/...

🌿Good first issues

  • Add missing TypeScript tests for frontend components: frontend-tests.yml exists in workflows but frontend source not visible in top 60 files—likely lacking unit test coverage on new React/Vue components; pick a component from the UI and add Jest or Vitest tests
  • Document gRPC API surface with code examples: repo claims 'full REST and gRPC APIs' but no proto definitions visible in top 60 and no examples in docs/; locate proto/ directory and add a docs/api-examples.md with curl/grpcurl samples for common operations (create memo, list tags, etc.)
  • Add integration test for SQLite + PostgreSQL schema parity: Multiple DB backends are tested separately via testcontainers but top 60 shows no integration test comparing query results across all three; add a test in internal/ that runs the same queries on SQLite, MySQL, and PostgreSQL and asserts identical output


📝Recent commits

  • bcbcb03 — chore: update zh-Hant translation (#5930) (hchengting)
  • 5ccba98 — refactor: split STT and Audio-LLM into separate interfaces (#5928) (boojack)
  • 238f27d — feat(transcription): explicit STT settings with provider, model, prompt (#5926) (boojack)
  • ef55013 — feat(memo): create memos on the selected calendar date (#5925) (boojack)
  • d349fe4 — chore(theme): rebalance dark palette for readability (boojack)
  • 8daef1d — feat(activity-calendar): aggregate by ViewContext.timeBasis (boojack)
  • ea0625d — feat(stats): admin instance resource statistics (boojack)
  • cd4f28a — feat(notification): add smtp email settings (boojack)
  • 35bf761 — fix(security): enforce attachment ownership on memo updates (boojack)
  • 603781f — fix(frontend): use correct url path for memos in sitemap.xml (#5921) (tokenicrat)

🔒Security observations

  • High · Dependency on Outdated Go Version — go.mod. The project specifies 'go 1.26.2' which appears to be a future/non-existent version. This may indicate version management issues or incompatibility with current Go toolchain. Standard Go versions should be verified to ensure security patches are applied. Fix: Verify and update to a stable, currently supported Go version (e.g., 1.23.x or later stable release). Ensure all dependencies are compatible with the specified version.
  • High · Missing Input Validation Framework — internal/ai/, internal/cron/, and data processing endpoints. The codebase includes multiple data processing modules (AI, STT, AudioLLM, Memo operations) but the file structure does not indicate a comprehensive input validation layer. This creates risks for injection attacks (SQLi, XSS) in note-taking functionality. Fix: Implement comprehensive input validation and sanitization for all user inputs, particularly for memo content, markdown processing, and API parameters. Use parameterized queries for database operations.
  • High · Incomplete Security Policy — SECURITY.md. SECURITY.md indicates that Memos is in 0.x stage with no formal disclosure program or CVE tracking. The policy states security fixes are only for the latest release with no backports, creating risks for users on older versions. Fix: Establish a more comprehensive security policy including: extended support window, security advisory process, CVE coordination, and a security.txt file. Consider adopting semantic versioning to improve trust.
  • Medium · OAuth2 Implementation Without Visible Security Controls — golang.org/x/oauth2 dependency, docs/plans/2026-04-21-sso-user-identity-linkage/. The project depends on 'golang.org/x/oauth2' and includes plans for SSO user identity linkage (docs/plans/2026-04-21-sso-user-identity-linkage), but no visible validation of PKCE, state parameter handling, or token refresh logic in the file structure. Fix: Ensure OAuth2 implementation includes: PKCE support, state parameter validation, secure token storage, short-lived access tokens, and refresh token rotation. Conduct security audit of authentication flow.
  • Medium · External AI/LLM Integration Risks — internal/ai/audiollm/gemini/, internal/ai/stt/openai/, internal/ai/. The project integrates multiple external AI services (OpenAI, Google Gemini, AudioLLM) and may process user data through these third-party services. No visible data sanitization or PII handling policies are apparent in the file structure. Fix: Implement: data privacy policies for external API calls, user consent mechanisms for data sharing, optional local-only processing mode, API key rotation, rate limiting, and audit logging for external integrations.
  • Medium · Docker Image Build Without Security Scanning — .github/workflows/, .dockerignore, Dockerfile (not provided). CI/CD workflows include build-canary-image.yml and release.yml but .dockerignore and Dockerfile best practices are not visible. No evidence of container image scanning in workflows. Fix: Add container image scanning (Trivy, Grype) to CI/CD pipeline. Implement Dockerfile security best practices: non-root user, minimal base images, secrets not baked into images, multi-stage builds.
  • Medium · Database Connection Security Not Visually Verified — github.com/go-sql-driver/mysql, github.com/lib/pq dependencies. The project supports multiple databases (MySQL, PostgreSQL, SQLite) via go-sql-driver/mysql and lib/pq. No visible connection string validation or SSL/TLS enforcement configuration in file structure. Fix: Enforce SSL/TLS for all database connections. Use environment variables for credentials (never hardcode). Implement connection pooling with secure defaults. Validate connection strings before use.
  • Medium · AWS S3 Integration Without Visible Access Controls — github.com/aws/aws-sdk-go-v2/service/s3 dependency. Project uses AWS SDK v2 for S3 operations but no visible configuration for bucket policies, encryption, or access key rotation in the file structure. Fix: Implement least-privilege bucket policies, server-side encryption, and regular access key rotation.

LLM-derived; treat as a starting point, not a security audit.


Generated by RepoPilot. Verdict based on maintenance signals — see the live page for receipts. Re-run on a new commit to refresh.

Healthy signals · usememos/memos — RepoPilot