Compare commits


70 Commits

Author SHA1 Message Date
Arunavo Ray
7d6bbe908f fix: respect BASE_URL in SAML callback fallback 2026-04-02 08:15:14 +05:30
Arunavo Ray
96e4653cda feat: support reverse proxy path prefixes 2026-04-02 08:03:54 +05:30
Arunavo Ray
c87513b648 chore: bump version to 3.14.2 2026-03-27 13:55:56 +05:30
ARUNAVO RAY
4f3cbc866e fix private github mirror auth (#255) 2026-03-27 13:49:36 +05:30
ARUNAVO RAY
60548f2062 fix sync target resolution for mirrored repos (#249) 2026-03-27 12:33:59 +05:30
dependabot[bot]
74dab43e89 build(deps): bump picomatch (#251)
Bumps the npm_and_yarn group with 1 update in the /www directory: [picomatch](https://github.com/micromatch/picomatch).


Updates `picomatch` from 2.3.1 to 2.3.2
- [Release notes](https://github.com/micromatch/picomatch/releases)
- [Changelog](https://github.com/micromatch/picomatch/blob/master/CHANGELOG.md)
- [Commits](https://github.com/micromatch/picomatch/compare/2.3.1...2.3.2)

---
updated-dependencies:
- dependency-name: picomatch
  dependency-version: 2.3.2
  dependency-type: indirect
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-03-27 09:39:17 +05:30
dependabot[bot]
01a8025140 build(deps): bump smol-toml (#250)
Bumps the npm_and_yarn group with 1 update in the /www directory: [smol-toml](https://github.com/squirrelchat/smol-toml).


Updates `smol-toml` from 1.6.0 to 1.6.1
- [Release notes](https://github.com/squirrelchat/smol-toml/releases)
- [Commits](https://github.com/squirrelchat/smol-toml/compare/v1.6.0...v1.6.1)

---
updated-dependencies:
- dependency-name: smol-toml
  dependency-version: 1.6.1
  dependency-type: indirect
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-03-26 22:35:57 +05:30
Arunavo Ray
8346748f5a fix: move --accept-flake-config before -- in bun2nix step
The flag was being passed to bun2nix instead of nix, causing an
"unexpected argument" error.
2026-03-24 08:22:04 +05:30
Arunavo Ray
38002019ea fix: regenerate bun.nix in CI to prevent stale dependency errors
The Nix build has been failing since v3.9.6 because bun.nix fell out
of sync with bun.lock. During the sandboxed build bun install cannot
fetch missing packages, causing ConnectionRefused errors.

- Add bun2nix regeneration step before nix build in CI
- Trigger workflow on bun.lock and package.json changes
- Update flake.nix version from 3.9.6 to 3.14.1
2026-03-24 08:20:26 +05:30
Arunavo Ray
32eb27c8a6 chore: bump version to 3.14.1 2026-03-24 07:35:36 +05:30
dependabot[bot]
d33b4ff64f build(deps): bump h3 (#244)
Bumps the npm_and_yarn group with 1 update in the /www directory: [h3](https://github.com/h3js/h3).


Updates `h3` from 1.15.8 to 1.15.9
- [Release notes](https://github.com/h3js/h3/releases)
- [Changelog](https://github.com/h3js/h3/blob/v1.15.9/CHANGELOG.md)
- [Commits](https://github.com/h3js/h3/compare/v1.15.8...v1.15.9)

---
updated-dependencies:
- dependency-name: h3
  dependency-version: 1.15.9
  dependency-type: indirect
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-03-24 07:34:40 +05:30
ARUNAVO RAY
6f2e0cbca0 Add GitHub starred-list filtering with searchable selector (#247)
* feat: add starred list filtering and selector UI

* docs: add starred lists UI screenshot

* lib: improve starred list name matching
2026-03-24 07:33:46 +05:30
dependabot[bot]
95e6eb7602 build(deps): bump h3 (#242)
Bumps the npm_and_yarn group with 1 update in the /www directory: [h3](https://github.com/h3js/h3).


Updates `h3` from 1.15.5 to 1.15.8
- [Release notes](https://github.com/h3js/h3/releases)
- [Changelog](https://github.com/h3js/h3/blob/main/CHANGELOG.md)
- [Commits](https://github.com/h3js/h3/compare/v1.15.5...v1.15.8)

---
updated-dependencies:
- dependency-name: h3
  dependency-version: 1.15.8
  dependency-type: indirect
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-03-19 08:24:09 +05:30
Arunavo Ray
f50f49fc41 chore: bump version to 3.14.0 2026-03-19 00:59:59 +05:30
ARUNAVO RAY
5ea2abff85 feat: custom sync start time and frequency scheduling (#241)
* feat: add custom sync start time scheduling

* Updated UI

* docs: add updated issue 240 UI screenshot

* fix: improve schedule UI with client-side next run calc and timezone handling

- Compute next scheduled run client-side via useMemo to avoid permanent
  "Calculating..." state when server hasn't set nextRun yet
- Default to browser timezone when enabling syncing (not UTC)
- Show actual saved timezone in badge, use it consistently in all handlers
- Match time input background to select trigger in dark mode
- Add clock icon to time picker with hidden native indicator
2026-03-19 00:58:10 +05:30
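
The client-side next-run computation this commit describes can be sketched as a simple local-clock calculation. This is a simplification — it ignores the saved-timezone handling the commit also adds — and the function name is illustrative, not the project's actual API:

```typescript
// Given a daily HH:MM schedule, compute the next occurrence from "now"
// instead of showing "Calculating..." until the server sets nextRun.
function nextDailyRun(now: Date, hhmm: string): Date {
  const [h = 0, m = 0] = hhmm.split(":").map(Number);
  const next = new Date(now);
  next.setHours(h, m, 0, 0);
  // If today's slot has already passed, roll over to tomorrow.
  if (next <= now) next.setDate(next.getDate() + 1);
  return next;
}
```

Wrapping this in a `useMemo` keyed on the schedule string gives the UI an immediate value without a server round trip.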
Arunavo Ray
9d131b9a09 fix security alerts 2026-03-18 20:10:45 +05:30
github-actions[bot]
5f77fceaca chore: sync version to 3.13.4 2026-03-18 13:15:13 +00:00
ARUNAVO RAY
5d2462e5a0 feat: add notification system with Ntfy.sh and Apprise support (#238)
* feat: add notification system with Ntfy.sh and Apprise providers (#231)

Add push notification support for mirror job events with two providers:

- Ntfy.sh: direct HTTP POST to ntfy topics with priority/tag support
- Apprise API: aggregator gateway supporting 100+ notification services

Includes database migration (0010), settings UI tab, test endpoint,
auto-save integration, token encryption, and comprehensive tests.
Notifications are fire-and-forget and never block the mirror flow.

* fix: address review findings for notification system

- Fix silent catch in GET handler that returned ciphertext to UI,
  causing double-encryption on next save. Now clears token to ""
  on decryption failure instead.
- Add Zod schema validation to test notification endpoint, following
  project API route pattern guidelines.
- Mark notifyOnNewRepo toggle as "coming soon" with disabled state,
  since the backend doesn't yet emit new_repo events. The schema
  and type support is in place for when it's implemented.

* fix notification gating and config validation

* trim sync notification details
2026-03-18 18:36:51 +05:30
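
The fire-and-forget contract this commit describes — notification failures are logged and swallowed, never propagated into the mirror flow — can be sketched with a small wrapper (illustrative name and shape, not the project's actual API):

```typescript
// Run a notification task; log and swallow any rejection so the caller
// (a mirror job) can never fail because a push notification failed.
function fireAndForget(task: () => Promise<void>, label = "notification"): Promise<void> {
  return task().catch((err) => {
    console.warn(`${label} failed (ignored):`, err);
  });
}
```

Callers invoke it without awaiting the result, so a slow or dead Ntfy/Apprise endpoint cannot block a sync either.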
ARUNAVO RAY
0000a03ad6 fix: improve reverse proxy support for subdomain deployments (#237)
* fix: improve reverse proxy support for subdomain deployments (#63)

- Add X-Accel-Buffering: no header to SSE endpoint to prevent Nginx
  from buffering the event stream
- Auto-detect trusted origin from Host/X-Forwarded-* request headers
  so the app works behind a proxy without manual env var configuration
- Add prominent reverse proxy documentation to advanced docs page
  explaining BETTER_AUTH_URL, PUBLIC_BETTER_AUTH_URL, and
  BETTER_AUTH_TRUSTED_ORIGINS are mandatory for proxy deployments
- Add reverse proxy env var comments and entries to both
  docker-compose.yml and docker-compose.alt.yml
- Add dedicated reverse proxy configuration section to .env.example

* fix: address review findings for reverse proxy origin detection

- Fix x-forwarded-proto multi-value handling: take first value only
  and validate it is "http" or "https" before using
- Update comment to accurately describe auto-detection scope: helps
  with per-request CSRF checks but not callback URL validation
- Restore startup logging of static trusted origins for debugging

* fix: handle multi-value x-forwarded-host in chained proxy setups

x-forwarded-host can be comma-separated (e.g. "proxy1.example.com,
proxy2.example.com") in chained proxy setups. Take only the first
value, matching the same handling already applied to x-forwarded-proto.

* test: add unit tests for reverse proxy origin detection

Extract resolveTrustedOrigins into a testable exported function and
add 11 tests covering:
- Default localhost origins
- BETTER_AUTH_URL and BETTER_AUTH_TRUSTED_ORIGINS env vars
- Invalid URL handling
- Auto-detection from x-forwarded-host + x-forwarded-proto
- Multi-value header handling (chained proxy setups)
- Invalid proto rejection (only http/https allowed)
- Deduplication
- Fallback to host header when x-forwarded-host absent
2026-03-18 15:47:15 +05:30
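
The forwarded-header handling these commits describe — take only the first comma-separated value, accept only http/https, fall back to the plain Host header — reduces to roughly this shape (function name and signature are illustrative, not the project's actual code):

```typescript
function detectOriginFromHeaders(headers: Record<string, string | undefined>): string | null {
  // x-forwarded-* headers can be comma-separated in chained proxy
  // setups; only the first (client-facing) value is trusted.
  const first = (v?: string) => v?.split(",")[0]?.trim();

  const proto = first(headers["x-forwarded-proto"]);
  // Reject anything that is not plain http/https.
  if (proto !== "http" && proto !== "https") return null;

  // Prefer x-forwarded-host, fall back to the Host header.
  const host = first(headers["x-forwarded-host"]) ?? first(headers["host"]);
  if (!host) return null;

  return `${proto}://${host}`;
}
```

As the commit notes, this kind of per-request detection helps with CSRF origin checks but does not replace the BETTER_AUTH_URL-style env vars needed for callback URL validation.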
ARUNAVO RAY
d697cb2bc9 fix: prevent starred repo name collisions during concurrent mirroring (#236)
* fix: prevent starred repo name collisions during concurrent mirroring (#95)

When multiple starred repos share the same short name (e.g. alice/dotfiles
and bob/dotfiles), concurrent batch mirroring could cause 409 Conflict
errors because generateUniqueRepoName only checked Gitea via HTTP, missing
repos that were claimed in the local DB but not yet created remotely.

Three fixes:
- Add DB-level check in generateUniqueRepoName so it queries the local
  repositories table for existing mirroredLocation claims, preventing two
  concurrent jobs from picking the same target name.
- Clear mirroredLocation on failed mirror so a failed repo doesn't falsely
  hold a location that was never successfully created, which would block
  retries and confuse the uniqueness check.
- Extract isMirroredLocationClaimedInDb helper for the DB lookup, using
  ne() to exclude the current repo's own record from the collision check.

* fix: address review findings for starred repo name collision fix

- Make generateUniqueRepoName immediately claim name by writing
  mirroredLocation to DB, closing the TOCTOU race window between
  name selection and the later status="mirroring" DB update
- Add fullName validation guard (must contain "/")
- Make isMirroredLocationClaimedInDb fail-closed (return true on
  DB error) to be conservative about preventing collisions
- Scope mirroredLocation clear on failure to starred repos only,
  preserving it for non-starred repos that may have been partially
  created in Gitea and need the location for recovery

* fix: address P1/P2 review findings for starred repo name collision

P1a: Remove early name claiming from generateUniqueRepoName to prevent
stale claims on early return paths. The function now only checks
availability — the actual claim happens at the status="mirroring" DB
write (after both idempotency checks), which is protected by a new
unique partial index.

P1b: Add unique partial index on (userId, mirroredLocation) WHERE
mirroredLocation != '' via migration 0010. This enforces atomicity at
the DB level: if two concurrent workers try to claim the same name,
the second gets a constraint violation rather than silently colliding.

P2: Only clear mirroredLocation on failure if the Gitea migrate call
itself failed (migrateSucceeded flag). If migrate succeeded but
metadata mirroring failed, preserve the location since the repo
physically exists in Gitea and we need it for recovery/retry.
2026-03-18 15:27:20 +05:30
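
The claim semantics this commit arrives at can be modeled in memory: name selection only *checks* availability, and the real claim is enforced atomically at write time by a uniqueness constraint (the partial index in migration 0010), so the second concurrent claimant fails instead of silently colliding. An illustrative sketch, simulating the constraint with a Map rather than the project's DB code:

```typescript
class MirroredLocationRegistry {
  private claims = new Map<string, string>(); // location -> repoId

  // Availability check; excludes the caller's own record, mirroring the
  // ne() filter described in the commit.
  isClaimed(location: string, selfId: string): boolean {
    const owner = this.claims.get(location);
    return owner !== undefined && owner !== selfId;
  }

  // Atomic claim; mimics the unique partial index rejecting a duplicate.
  claim(location: string, repoId: string): void {
    const owner = this.claims.get(location);
    if (owner !== undefined && owner !== repoId) {
      throw new Error(`UNIQUE constraint failed: ${location}`);
    }
    this.claims.set(location, repoId);
  }
}
```

The fail-closed behavior described for DB errors would correspond to `isClaimed` returning `true` when the lookup itself fails.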
ARUNAVO RAY
ddd071f7e5 fix: prevent excessive disk usage from repo backups (#235)
* fix: prevent excessive disk usage from repo backups (#234)

Legacy configs with backupBeforeSync: true but no explicit backupStrategy
silently resolved to "always", creating full git bundles on every sync
cycle. This caused repo-backups to grow to 17GB+ for users with many
repositories.

Changes:
- Fix resolveBackupStrategy to map backupBeforeSync: true → "on-force-push"
  instead of "always", so legacy configs only backup when force-push is detected
- Fix config mapper to always set backupStrategy explicitly ("on-force-push")
  preventing the backward-compat fallback from triggering
- Lower default backupRetentionCount from 20 to 5 bundles per repo
- Add time-based retention (backupRetentionDays, default 30 days) alongside
  count-based retention, with safety net to always keep at least 1 bundle
- Add "high disk usage" warning on "Always Backup" UI option
- Update docs and tests to reflect new defaults and behavior

* fix: preserve legacy backupBeforeSync:false on UI round-trip and expose retention days

P1: mapDbToUiConfig now checks backupBeforeSync === false before
defaulting backupStrategy, preventing legacy "disabled" configs from
silently becoming "on-force-push" after any auto-save round-trip.

P3: Added "Snapshot retention days" input field to the backup settings
UI, matching the documented setting in FORCE_PUSH_PROTECTION.md.
2026-03-18 15:05:00 +05:30
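
A minimal sketch of the legacy-config resolution this commit describes, assuming the field and function names from the commit message: an explicit `backupStrategy` always wins, and a legacy `backupBeforeSync` boolean maps `true` to "on-force-push" (not "always") and `false` to "disabled":

```typescript
type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";

function resolveBackupStrategy(config: {
  backupStrategy?: BackupStrategy;
  backupBeforeSync?: boolean;
}): BackupStrategy {
  if (config.backupStrategy) return config.backupStrategy; // explicit setting wins
  if (config.backupBeforeSync === true) return "on-force-push"; // legacy opt-in
  if (config.backupBeforeSync === false) return "disabled"; // legacy opt-out
  return "on-force-push"; // default introduced by the force-push protection feature
}
```

The round-trip bug the commit fixes corresponds to the UI mapper dropping the `backupBeforeSync === false` branch, so disabled legacy configs silently landed in the default case.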
Arunavo Ray
4629ab4335 chore: bump version to 3.13.3 2026-03-18 05:20:21 +05:30
Arunavo Ray
0f303c4b79 nix: regenerate bun.nix 2026-03-18 04:47:16 +05:30
ARUNAVO RAY
7c7c259d0a fix repo links to use external gitea url (#233) 2026-03-18 04:36:14 +05:30
Arunavo Ray
fe6bcc5288 chore: bump version to 3.13.2 2026-03-15 14:11:22 +05:30
ARUNAVO RAY
e26ed3aa9c fix: rewrite migration 0009 for SQLite compatibility and add migration validation (#230)
SQLite rejects ALTER TABLE ADD COLUMN with expression defaults like
DEFAULT (unixepoch()), which Drizzle-kit generated for the imported_at
column. This broke upgrades from v3.12.x to v3.13.0 (#228, #229).

Changes:
- Rewrite migration 0009 using table-recreation pattern (CREATE, INSERT
  SELECT, DROP, RENAME) instead of ALTER TABLE
- Add migration validation script with SQLite-specific lint rules that
  catch known invalid patterns before they ship
- Add upgrade-path testing with seeded data and verification fixtures
- Add runtime repair for users whose migration record may be stale
- Add explicit migration validation step to CI workflow

Fixes #228
Fixes #229
2026-03-15 14:10:06 +05:30
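
A lint rule in the spirit of the validation script this commit mentions might flag the known-bad pattern like this (a hypothetical implementation, not the project's actual script):

```typescript
// SQLite rejects ALTER TABLE ... ADD COLUMN with a non-constant DEFAULT
// such as (unixepoch()); catch that pattern before a migration ships.
function lintSqliteMigration(sql: string): string[] {
  const errors: string[] = [];
  const exprDefault = /ALTER\s+TABLE[\s\S]+?ADD\s+COLUMN[\s\S]+?DEFAULT\s*\(/i;
  if (exprDefault.test(sql)) {
    errors.push(
      "ALTER TABLE ADD COLUMN with an expression DEFAULT is invalid in SQLite; " +
      "use the table-recreation pattern (CREATE, INSERT SELECT, DROP, RENAME)",
    );
  }
  return errors;
}
```

Running such a check in CI, as the commit adds, turns a runtime upgrade failure into a build-time error.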
Arunavo Ray
efb96b6e60 chore: bump version to 3.13.1 2026-03-15 09:54:44 +05:30
Arunavo Ray
342cafed0e fix: force Go 1.25.8 toolchain and update x/crypto for git-lfs build
The git-lfs go.mod contains a `toolchain go1.25.3` directive which
causes Go to auto-download and use Go 1.25.3 instead of our installed
1.25.8. Set GOTOOLCHAIN=local to force using the installed version.

Also update golang.org/x/crypto to latest before building to resolve
CVE-2025-47913 (needs >= 0.43.0, was pinned at 0.36.0).
2026-03-15 09:35:50 +05:30
Arunavo Ray
fc7c6b59d7 docs: update README to reference Gitea/Forgejo as supported targets 2026-03-15 09:33:41 +05:30
Arunavo Ray
a77ec0447a chore: bump version to 3.13.0 2026-03-15 09:28:01 +05:30
Arunavo Ray
82b5ac8160 fix: build git-lfs from source with Go 1.25.8 to resolve remaining CVEs
Git-lfs v3.7.1 pre-built binaries use Go 1.25.3, which is affected by
CVE-2025-68121 (critical), CVE-2026-27142, CVE-2026-25679, CVE-2025-61729,
CVE-2025-61726, and CVE-2025-47913 (golang.org/x/crypto).

Since no newer git-lfs release exists, compile from source in a dedicated
build stage using Go 1.25.8 (latest patched release). Only the final
binary is copied into the runner image.
2026-03-15 09:22:50 +05:30
ARUNAVO RAY
299659eca2 fix: resolve CVEs, upgrade to Astro v6, and harden API security (#227)
* fix: resolve CVEs, upgrade to Astro v6, and harden API security

Docker image CVE fixes:
- Install git-lfs v3.7.1 from GitHub releases (Go 1.25) instead of
  Debian apt (Go 1.23.12), fixing CVE-2025-68121 and 8 other Go stdlib CVEs
- Strip build-only packages (esbuild, vite, rollup, svgo, tailwindcss)
  from production image, eliminating 9 esbuild Go stdlib CVEs

Dependency upgrades:
- Astro v5 → v6 (includes Vite 7, Zod 4)
- Remove legacy content config (src/content/config.ts)
- Update HealthResponse type for simplified health endpoint
- npm overrides for fast-xml-parser ≥5.3.6, devalue ≥5.6.2,
  node-forge ≥1.3.2, svgo ≥4.0.1, rollup ≥4.59.0

API security hardening:
- /api/auth/debug: dev-only, require auth, remove user-creation POST,
  strip trustedOrigins/databaseConfig from response
- /api/auth/check-users: return boolean hasUsers instead of exact count
- /api/cleanup/auto: require authentication, remove per-user details
- /api/health: remove OS version, memory, uptime from response
- /api/config: validate Gitea URL protocol (http/https only)
- BETTER_AUTH_SECRET: log security warning when using insecure defaults
- generateRandomString: replace Math.random() with crypto.getRandomValues()
- hashValue: add random salt and timing-safe verification

* repositories: migrate table to tanstack

* Revert "repositories: migrate table to tanstack"

This reverts commit a544b29e6d.

* fixed lock file
2026-03-15 09:19:24 +05:30
ARUNAVO RAY
6f53a3ed41 feat: add importedAt-based repository sorting (#226)
* repositories: add importedAt sorting

* repositories: use tanstack table for repo list
2026-03-15 08:52:45 +05:30
ARUNAVO RAY
1bca7df5ab feat: import repo topics and description into Gitea (#224)
* lib: sync repo topics and descriptions

* lib: harden metadata sync for existing repos
2026-03-15 08:22:44 +05:30
Arunavo Ray
b5210c3916 updated packages 2026-03-15 07:52:34 +05:30
ARUNAVO RAY
755647e29c scripts: add startup repair progress logs (#223) 2026-03-14 17:44:52 +05:30
dependabot[bot]
018c9d1a23 build(deps): bump devalue (#220) 2026-03-13 00:17:30 +05:30
Arunavo Ray
c89011819f chore: sync version to 3.12.5 2026-03-07 07:00:30 +05:30
ARUNAVO RAY
c00d48199b fix: gracefully handle SAML-protected orgs during GitHub import (#217) (#218) 2026-03-07 06:57:28 +05:30
ARUNAVO RAY
de28469210 nix: refresh bun deps and ci flake trust (#216) 2026-03-06 12:31:51 +05:30
github-actions[bot]
0e2f83fee0 chore: sync version to 3.12.4 2026-03-06 05:10:04 +00:00
ARUNAVO RAY
1dd3dea231 fix preserve strategy fork owner routing (#215) 2026-03-06 10:15:47 +05:30
Arunavo Ray
db783c4225 nix: reduce bun install CI stalls 2026-03-06 09:41:22 +05:30
github-actions[bot]
8a4716bdbd chore: sync version to 3.12.3 2026-03-06 03:35:40 +00:00
Arunavo Ray
9d37966c10 ci: only run nix flake check when nix files change 2026-03-06 09:03:32 +05:30
Arunavo Ray
ac16ae56ea ci: increase workflow timeouts to 25m and upgrade CodeQL Action to v4 2026-03-06 08:55:11 +05:30
Arunavo Ray
df3e665978 fix: bump Bun to 1.3.10 and harden startup for non-AVX CPUs (#213)
Bun 1.3.9 crashes with a segfault on CPUs without AVX support due to a
WASM IPInt bug (oven-sh/bun#27340), fixed in 1.3.10 via oven-sh/bun#26922.

- Bump Bun from 1.3.9 to 1.3.10 in Dockerfile, CI workflows, and packageManager
- Skip env config script when no GitHub/Gitea env vars are set
- Make startup scripts (env-config, recovery, repair) fault-tolerant so
  a crash in a non-critical script doesn't abort the entrypoint via set -e
2026-03-06 08:19:44 +05:30
github-actions[bot]
8a26764d2c chore: sync version to 3.12.2 2026-03-05 04:34:51 +00:00
ARUNAVO RAY
ce365a706e ci: persist release version to main (#212) 2026-03-05 09:55:59 +05:30
ARUNAVO RAY
be7daac5fb ci: automate release version from tag (#211) 2026-03-05 09:34:49 +05:30
dependabot[bot]
e32b7af5eb build(deps): bump svgo (#210)
Bumps the npm_and_yarn group with 1 update in the /www directory: [svgo](https://github.com/svg/svgo).


Updates `svgo` from 4.0.0 to 4.0.1
- [Release notes](https://github.com/svg/svgo/releases)
- [Commits](https://github.com/svg/svgo/compare/v4.0.0...v4.0.1)

---
updated-dependencies:
- dependency-name: svgo
  dependency-version: 4.0.1
  dependency-type: indirect
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-03-05 08:53:34 +05:30
ARUNAVO RAY
d0693206c3 feat: selective starred repo mirroring with autoMirrorStarred toggle (#208)
* feat: add autoMirrorStarred toggle for selective starred repo mirroring (#205)

Add `githubConfig.autoMirrorStarred` (default: false) to control whether
starred repos are included in automatic mirroring operations. Manual
per-repo actions always work regardless of this toggle.

Bug fixes:
- Cleanup service no longer orphans starred repos when includeStarred is
  disabled (prevents data loss)
- First-boot auto-start now gates initial mirror behind autoMirror config
  (previously mirrored everything unconditionally)
- "Mirror All" button now respects autoMirrorStarred setting
- Bulk mirror and getAvailableActions now include pending-approval status

Changes span schema, config mapping, env loader, scheduler, cleanup
service, UI settings toggle, and repository components.

* fix: log activity when repos are auto-imported during scheduled sync

Auto-discovered repositories (including newly starred ones) were inserted
into the database without creating activity log entries, so they appeared
in the dashboard but not in the activity log.

* ci: set 10-minute timeout on all CI jobs
2026-03-04 08:22:44 +05:30
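
The gating rule this commit describes — starred repos join automatic runs only when the toggle is on, defaulting to off, while non-starred repos are unaffected — can be expressed as a small predicate (field names taken from the commit message; the shapes are illustrative):

```typescript
interface RepoLike { isStarred: boolean }
interface GithubConfigLike { autoMirrorStarred?: boolean }

function includeInAutomaticMirror(repo: RepoLike, config: GithubConfigLike): boolean {
  // Starred repos require the explicit opt-in (default: false).
  // Manual per-repo actions bypass this check entirely.
  if (repo.isStarred) return config.autoMirrorStarred === true;
  return true;
}
```

The cleanup-service fix amounts to applying the same predicate asymmetrically: a disabled toggle must stop *new* starred imports without treating already-mirrored starred repos as orphans.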
Arunavo Ray
b079070c30 ci: also exclude helm/** from app CI workflows 2026-03-02 16:28:04 +05:30
Arunavo Ray
e68e9c38a8 ci: skip app CI workflows for www-only changes
Add www/** to paths-ignore in astro-build-test, e2e-tests, and
nix-build workflows. docker-build and helm-test already use positive
path filters and were unaffected.
2026-03-02 16:25:54 +05:30
Arunavo Ray
534150ecf9 chore(www): update website content, fix build, add Helm/Nix install methods
- Update softwareVersion from 3.9.2 to 3.11.0
- Add Helm and Nix installation tabs to Getting Started section
- Fix Helm instructions to use local chart path (no published repo)
- Update Features section: add Metadata Preservation, Force-Push Protection, Git LFS Support
- Remove unused @radix-ui/react-icons import from Hero.tsx and dependency from package.json
- Update structured data featureList with newer capabilities
2026-03-02 16:23:32 +05:30
ARUNAVO RAY
98da7065e0 feat: smart force-push protection with backup strategies (#206)
* feat: smart force-push protection with backup strategies (#187)

Replace blunt `backupBeforeSync` boolean with `backupStrategy` enum
offering four modes: disabled, always, on-force-push (default), and
block-on-force-push. This dramatically reduces backup storage for large
mirror collections by only creating snapshots when force-pushes are
actually detected.

Detection works by comparing branch SHAs between Gitea and GitHub APIs
before each sync — no git cloning required. Fail-open design ensures
detection errors never block sync.

Key changes:
- Add force-push detection module (branch SHA comparison via APIs)
- Add backup strategy resolver with backward-compat migration
- Add pending-approval repo status with approve/dismiss UI + API
- Add block-on-force-push mode requiring manual approval
- Fix checkAncestry to only treat 404 as confirmed force-push
  (transient errors skip branch instead of false-positive blocking)
- Fix approve-sync to bypass detection gate (skipForcePushDetection)
- Fix backup execution to not be hard-gated by deprecated flag
- Persist backupStrategy through config-mapper round-trip

* fix: resolve four bugs in smart force-push protection

P0: Approve flow re-blocks itself — approve-sync now calls
syncGiteaRepoEnhanced with skipForcePushDetection: true so the
detection+block gate is bypassed on approved syncs.

P1: backupStrategy not persisted — added to both directions of the
config-mapper. Don't inject a default in the mapper; let
resolveBackupStrategy handle fallback so legacy backupBeforeSync
still works for E2E tests and existing configs.

P1: Backup hard-gated by deprecated backupBeforeSync — added force
flag to createPreSyncBundleBackup; strategy-driven callers and
approve-sync pass force: true to bypass the legacy guard.

P1: checkAncestry false positives — now only returns false for
404/422 (confirmed force-push). Transient errors (rate limits, 500s)
are rethrown so detectForcePush skips that branch (fail-open).

* test(e2e): migrate backup tests from backupBeforeSync to backupStrategy

Update E2E tests to use the new backupStrategy enum ("always",
"disabled") instead of the deprecated backupBeforeSync boolean.

* docs: add backup strategy UI screenshot

* refactor(ui): move Destructive Update Protection to GitHub config tab

Relocates the backup strategy section from GiteaConfigForm to
GitHubConfigForm since it protects against GitHub-side force-pushes.
Adds ShieldAlert icon to match other section header patterns.

* docs: add force-push protection documentation and Beta badge

Add docs/FORCE_PUSH_PROTECTION.md covering detection mechanism,
backup strategies, API usage, and troubleshooting. Link it from
README features list and support section. Mark the feature as Beta
in the UI with an outline badge.

* fix(ui): match Beta badge style to Git LFS badge
2026-03-02 15:48:59 +05:30
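
The detection mechanism this feature describes — compare branch SHAs between Gitea and GitHub, confirm a rewrite via an ancestry check, and fail open on transient errors — reduces to roughly this shape. This is a synchronous sketch: in the real feature the branch lists and ancestry check come from the Gitea and GitHub APIs, and the names here are illustrative:

```typescript
function detectForcePush(
  mirroredShas: Record<string, string>,  // branch -> SHA currently in the mirror
  upstreamShas: Record<string, string>,  // branch -> SHA upstream
  isAncestor: (oldSha: string, newSha: string) => boolean,
): string[] {
  const rewritten: string[] = [];
  for (const [branch, oldSha] of Object.entries(mirroredShas)) {
    const newSha = upstreamShas[branch];
    if (!newSha || newSha === oldSha) continue; // branch deleted or unchanged
    try {
      // Old SHA no longer an ancestor of the new tip => history rewrite.
      if (!isAncestor(oldSha, newSha)) rewritten.push(branch);
    } catch {
      // Transient error (rate limit, 500): skip this branch rather than
      // report a false positive — fail-open, never block sync.
    }
  }
  return rewritten;
}
```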
ARUNAVO RAY
58e0194aa6 fix(nix): ensure absolute bundle path in pre-sync backup (#204)
* fix(nix): ensure absolute bundle path in pre-sync backup (#203)

Use path.resolve() instead of conditional path.isAbsolute() check to
guarantee bundlePath is always absolute before passing to git -C. On
NixOS, relative paths were interpreted relative to the temp mirror
clone directory, causing "No such file or directory" errors.

Closes #203

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(nix): ensure absolute bundle path in pre-sync backup (#203)

Use path.resolve() instead of conditional path.isAbsolute() check to
guarantee bundlePath is always absolute before passing to git -C. On
NixOS, relative paths were interpreted relative to the temp mirror
clone directory, causing "No such file or directory" errors.

Extract resolveBackupPaths() for testability. Bump version to 3.10.1.

Closes #203

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* ci: drop macos matrix and only run nix build on main/tags

- Remove macos-latest from Nix CI matrix (ubuntu-only)
- Only run `nix build` on main branch and version tags, skip on PRs
- `nix flake check` still runs on all PRs for validation

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 08:37:18 +05:30
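
The fix above rests on one guarantee: `path.resolve()` always returns an absolute path, resolving relative input against `process.cwd()`, so `git -C` can never re-interpret the bundle path relative to the temp clone directory. A minimal sketch (helper name illustrative):

```typescript
import path from "node:path";

function toAbsoluteBundlePath(bundlePath: string): string {
  // Unlike a conditional isAbsolute() check, resolve() is unconditional:
  // absolute input passes through unchanged, relative input is anchored
  // to the process working directory.
  return path.resolve(bundlePath);
}
```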
Arunavo Ray
7864c46279 unused file 2026-03-01 08:06:11 +05:30
Arunavo Ray
e3970e53e1 chore: release v3.10.0
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 08:01:02 +05:30
ARUNAVO RAY
be46cfdffa feat: add target organization to Add Repository dialog (#202)
* feat: add target organization field to Add Repository dialog

Allow users to specify a destination Gitea organization when adding a
single repository, instead of relying solely on the default mirror
strategy. The field is optional — when left empty, the existing strategy
logic applies as before.

Closes #200

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs: add screenshot of target organization field in Add Repository dialog

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 07:55:27 +05:30
Xyndra
2e00a610cb Add E2E testing (#201)
* feat: add E2E testing infrastructure with fake GitHub, Playwright, and CI workflow

- Add fake GitHub API server (tests/e2e/fake-github-server.ts) with
  management API for seeding test data
- Add Playwright E2E test suite covering full mirror workflow:
  service health checks, user registration, config, sync, verify
- Add Docker Compose for E2E Gitea instance
- Add orchestrator script (run-e2e.sh) with cleanup
- Add GitHub Actions workflow (e2e-tests.yml) with Gitea service container
- Make GITHUB_API_URL configurable via env var for testing
- Add npm scripts: test:e2e, test:e2e:ci, test:e2e:keep, test:e2e:cleanup

* feat: add real git repos + backup config testing to E2E suite

- Create programmatic test git repos (create-test-repos.ts) with real
  commits, branches (main, develop, feature/*), and tags (v1.0.0, v1.1.0)
- Add git-server container to docker-compose serving bare repos via
  dumb HTTP protocol so Gitea can actually clone them
- Update fake GitHub server to emit reachable clone_url fields pointing
  to the git-server container (configurable via GIT_SERVER_URL env var)
- Add management endpoint POST /___mgmt/set-clone-url for runtime config
- Update E2E spec with real mirroring verification:
  * Verify repos appear in Gitea with actual content
  * Check branches, tags, commits, file content
  * Verify 4/4 repos mirrored successfully
- Add backup configuration test suite:
  * Enable/disable backupBeforeSync config
  * Toggle blockSyncOnBackupFailure
  * Trigger re-sync with backup enabled and verify activities
  * Verify config persistence across changes
- Update CI workflow to use docker compose (not service containers)
  matching the local run-e2e.sh approach
- Update cleanup.sh for git-repos directory and git-server port
- All 22 tests passing with real git content verification

* refactor: split E2E tests into focused files + add force-push tests

Split the monolithic e2e.spec.ts (1335 lines) into 5 focused spec files
and a shared helpers module:

  helpers.ts                 — constants, GiteaAPI, auth, saveConfig, utilities
  01-health.spec.ts          — service health checks (4 tests)
  02-mirror-workflow.spec.ts — full first-mirror journey (8 tests)
  03-backup.spec.ts          — backup config toggling (6 tests)
  04-force-push.spec.ts      — force-push simulation & backup verification (9 tests)
  05-sync-verification.spec.ts — dynamic repos, content integrity, reset (5 tests)

The force-push tests are the critical addition:
  F0: Record original state (commit SHAs, file content)
  F1: Rewrite source repo history (simulate force-push)
  F2: Sync to Gitea WITHOUT backup
  F3: Verify data loss — LICENSE file gone, README overwritten
  F4: Restore source, re-mirror to clean state
  F5: Enable backup, force-push again, sync through app
  F6: Verify Gitea reflects the force-push
  F7: Verify backup system was invoked (snapshot activities logged)
  F8: Restore source repo for subsequent tests

Also added to helpers.ts:
  - GiteaAPI.getBranch(), .getCommit(), .triggerMirrorSync()
  - getRepositoryIds(), triggerMirrorJobs(), triggerSyncRepo()

All 32 tests passing.

* Try to fix actions

* Try to fix the other action

* Add debug info to check why e2e action is failing

* More debug info

* Even more debug info

* E2E fix attempt #1

* E2E fix attempt #2

* more debug again

* E2E fix attempt #3

* E2E fix attempt #4

* Remove a bunch of debug info

* Hopefully fix backup bug

* Force backups to succeed
2026-03-01 07:35:13 +05:30
Arunavo Ray
61841dd7a5 chore: release v3.9.6
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-27 12:45:25 +05:30
ARUNAVO RAY
5aa0f3260d fix(nix): enable sandboxed builds with bun2nix (#199)
* fix(nix): enable sandboxed builds with bun2nix

The Nix package was broken on Linux because `bun install` requires
network access, which is blocked by Nix sandboxing (enabled by default
on Linux).

This switches to bun2nix for dependency management:
- Add bun2nix flake input to pre-fetch all npm dependencies
- Generate bun.nix lockfile for reproducible dependency resolution
- Copy bun cache to writable location during build to avoid EACCES
  errors from bunx writing to the read-only Nix store
- Add nanoid as an explicit dependency (was imported directly but only
  available as a transitive dep, which breaks with isolated linker)
- Update CI workflow to perform a full sandboxed build
- Add bun2nix to devShell for easy lockfile regeneration

Closes #197

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(nix): create writable workdir for database access

The app uses process.cwd()/data for the database path, but when running
from the Nix store the cwd is read-only. Create a writable working
directory with symlinks to app files and a real data directory.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-27 12:43:37 +05:30
ARUNAVO RAY
d0efa200d9 fix(docker): add git and git-lfs to runner image (#198)
The runner stage was missing git, causing pre-sync backups to fail with
"Executable not found in $PATH: git". The backup feature (enabled by
default) shells out to git for clone --mirror and bundle create.

Closes #196

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-27 11:12:35 +05:30
Arunavo Ray
c26b5574e0 chore: release v3.9.5 2026-02-26 11:02:00 +05:30
ARUNAVO RAY
89a6372565 nix: fix runtime wrapper paths and startup script packaging (#194)
* nix: fix flake module and runtime scripts

* docs: refresh readme and docs links/examples
2026-02-26 10:59:56 +05:30
ARUNAVO RAY
f40cad4713 nix: fix flake module and runtime scripts (#192) 2026-02-26 10:39:50 +05:30
ARUNAVO RAY
855906d990 auth: clarify invalid origin error toast guidance (#193)
* nix: fix flake module and runtime scripts

* auth: clarify invalid origin toast
2026-02-26 10:39:08 +05:30
ARUNAVO RAY
08da526ddd fix(github): keep disabled repos from cleanup while skipping new imports (#191)
* fix: preserve disabled repos while skipping new imports

* ci: upgrade bun to 1.3.6 for test workflow
2026-02-26 10:19:28 +05:30
ARUNAVO RAY
2395e14382 Add pre-sync snapshot protection for mirror rewrites (#190)
* add pre-sync snapshot protection

* stabilize test module mocks

* fix cross-test gitea mock exports

* fix gitea mock strategy behavior
2026-02-26 10:13:13 +05:30
177 changed files with 23851 additions and 2699 deletions


@@ -9,6 +9,8 @@
NODE_ENV=production
HOST=0.0.0.0
PORT=4321
# Optional application base path (use "/" for root, or "/mirror" for subpath deployments)
BASE_URL=/
# Database Configuration
# For self-hosted, SQLite is used by default
@@ -18,9 +20,32 @@ DATABASE_URL=sqlite://data/gitea-mirror.db
# Generate with: openssl rand -base64 32
BETTER_AUTH_SECRET=change-this-to-a-secure-random-string-in-production
BETTER_AUTH_URL=http://localhost:4321
# PUBLIC_BETTER_AUTH_URL=https://your-domain.com # Optional: Set this if accessing from different origins (e.g., IP and domain)
# ENCRYPTION_SECRET=optional-encryption-key-for-token-encryption # Generate with: openssl rand -base64 48
# ===========================================
# REVERSE PROXY CONFIGURATION
# ===========================================
# REQUIRED when accessing Gitea Mirror through a reverse proxy (Nginx, Caddy, Traefik, etc.).
# Without these, sign-in will fail with "invalid origin" errors and pages may appear blank.
#
# Set all three to your external URL, e.g.:
# BETTER_AUTH_URL=https://gitea-mirror.example.com
# PUBLIC_BETTER_AUTH_URL=https://gitea-mirror.example.com
# BETTER_AUTH_TRUSTED_ORIGINS=https://gitea-mirror.example.com
#
# If your app is served from a path prefix (e.g. https://git.example.com/mirror), set:
# BASE_URL=/mirror
# BETTER_AUTH_URL=https://git.example.com
# PUBLIC_BETTER_AUTH_URL=https://git.example.com
# BETTER_AUTH_TRUSTED_ORIGINS=https://git.example.com
#
# BETTER_AUTH_URL - Used server-side for auth callbacks and redirects
# PUBLIC_BETTER_AUTH_URL - Used client-side (browser) for auth API calls
# BETTER_AUTH_TRUSTED_ORIGINS - Comma-separated list of origins allowed to make auth requests
# (e.g. https://gitea-mirror.example.com,https://alt.example.com)
PUBLIC_BETTER_AUTH_URL=http://localhost:4321
# BETTER_AUTH_TRUSTED_ORIGINS=
# ===========================================
# DOCKER CONFIGURATION (Optional)
# ===========================================
@@ -46,6 +71,7 @@ DOCKER_TAG=latest
# INCLUDE_ARCHIVED=false
# SKIP_FORKS=false
# MIRROR_STARRED=false
# MIRROR_STARRED_LISTS=homelab,dottools # Optional: comma-separated star list names; empty = all starred repos
# STARRED_REPOS_ORG=starred # Organization name for starred repos
# STARRED_REPOS_MODE=dedicated-org # dedicated-org | preserve-owner

Binary file not shown (image, 34 KiB).


@@ -43,6 +43,9 @@ This workflow builds Docker images on pushes and pull requests, and pushes to Gi
- Skips registry push for fork PRs (avoids package write permission failures)
- Uses build caching to speed up builds
- Creates multiple tags for each image (latest, semver, sha)
- Auto-syncs `package.json` version from `v*` tags during release builds
- Validates release tags use semver format before building
- After tag builds succeed, writes the same version back to `main/package.json`
### Docker Security Scan (`docker-scan.yml`)


@@ -6,11 +6,15 @@ on:
paths-ignore:
- 'README.md'
- 'docs/**'
- 'www/**'
- 'helm/**'
pull_request:
branches: [ '*' ]
paths-ignore:
- 'README.md'
- 'docs/**'
- 'www/**'
- 'helm/**'
permissions:
contents: read
@@ -20,6 +24,7 @@ jobs:
build-and-test:
name: Build and Test Astro Project
runs-on: ubuntu-latest
timeout-minutes: 25
steps:
- name: Checkout repository
@@ -28,7 +33,7 @@ jobs:
- name: Setup Bun
uses: oven-sh/setup-bun@v1
with:
bun-version: '1.2.16'
bun-version: '1.3.10'
- name: Check lockfile and install dependencies
run: |
@@ -43,6 +48,12 @@ jobs:
- name: Run tests
run: bun test --coverage
- name: Check Drizzle migrations
run: bun run db:check
- name: Validate migrations (SQLite lint + upgrade path)
run: bun test:migrations
- name: Build Astro project
run: bunx --bun astro build


@@ -36,6 +36,7 @@ env:
jobs:
docker:
runs-on: ubuntu-latest
timeout-minutes: 25
permissions:
contents: write
@@ -76,13 +77,34 @@ jobs:
id: tag_version
run: |
if [[ $GITHUB_REF == refs/tags/v* ]]; then
echo "VERSION=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT
echo "Using version tag: ${GITHUB_REF#refs/tags/}"
TAG_VERSION="${GITHUB_REF#refs/tags/}"
if [[ ! "$TAG_VERSION" =~ ^v[0-9]+\.[0-9]+\.[0-9]+([.-][0-9A-Za-z.-]+)?(\+[0-9A-Za-z.-]+)?$ ]]; then
echo "::error::Release tag '${TAG_VERSION}' is invalid. Expected semver tag format like v1.2.3 or v1.2.3-rc.1"
exit 1
fi
APP_VERSION="${TAG_VERSION#v}"
echo "VERSION=${TAG_VERSION}" >> $GITHUB_OUTPUT
echo "APP_VERSION=${APP_VERSION}" >> $GITHUB_OUTPUT
echo "Using version tag: ${TAG_VERSION}"
else
echo "VERSION=latest" >> $GITHUB_OUTPUT
echo "APP_VERSION=dev" >> $GITHUB_OUTPUT
echo "No version tag, using 'latest'"
fi
# Keep version files aligned automatically for tag-based releases
- name: Sync app version from release tag
if: startsWith(github.ref, 'refs/tags/v')
run: |
VERSION="${{ steps.tag_version.outputs.APP_VERSION }}"
echo "Syncing package.json version to ${VERSION}"
jq --arg version "${VERSION}" '.version = $version' package.json > package.json.tmp
mv package.json.tmp package.json
echo "Version sync diff (package.json):"
git --no-pager diff -- package.json
# Extract metadata for Docker
- name: Extract Docker metadata
id: meta
@@ -231,8 +253,49 @@ jobs:
# Upload security scan results to GitHub Security tab
- name: Upload Docker Scout scan results to GitHub Security tab
uses: github/codeql-action/upload-sarif@v3
uses: github/codeql-action/upload-sarif@v4
if: always()
continue-on-error: true
with:
sarif_file: scout-results.sarif
sync-version-main:
name: Sync package.json version back to main
if: startsWith(github.ref, 'refs/tags/v')
runs-on: ubuntu-latest
needs: docker
permissions:
contents: write
steps:
- name: Checkout default branch
uses: actions/checkout@v4
with:
ref: ${{ github.event.repository.default_branch }}
- name: Update package.json version on main
env:
TAG_VERSION: ${{ github.ref_name }}
TARGET_BRANCH: ${{ github.event.repository.default_branch }}
run: |
if [[ ! "$TAG_VERSION" =~ ^v[0-9]+\.[0-9]+\.[0-9]+([.-][0-9A-Za-z.-]+)?(\+[0-9A-Za-z.-]+)?$ ]]; then
echo "::error::Release tag '${TAG_VERSION}' is invalid. Expected semver tag format like v1.2.3 or v1.2.3-rc.1"
exit 1
fi
APP_VERSION="${TAG_VERSION#v}"
echo "Syncing ${TARGET_BRANCH}/package.json to ${APP_VERSION}"
jq --arg version "${APP_VERSION}" '.version = $version' package.json > package.json.tmp
mv package.json.tmp package.json
if git diff --quiet -- package.json; then
echo "package.json on ${TARGET_BRANCH} already at ${APP_VERSION}; nothing to commit."
exit 0
fi
git config user.name "github-actions[bot]"
git config user.email "41898282+github-actions[bot]@users.noreply.github.com"
git add package.json
git commit -m "chore: sync version to ${APP_VERSION}"
git push origin "HEAD:${TARGET_BRANCH}"

.github/workflows/e2e-tests.yml (new file)

@@ -0,0 +1,285 @@
name: E2E Integration Tests
on:
push:
branches: ["*"]
paths-ignore:
- "README.md"
- "docs/**"
- "CHANGELOG.md"
- "LICENSE"
- "www/**"
- "helm/**"
pull_request:
branches: ["*"]
paths-ignore:
- "README.md"
- "docs/**"
- "CHANGELOG.md"
- "LICENSE"
- "www/**"
- "helm/**"
workflow_dispatch:
inputs:
debug_enabled:
description: "Enable debug logging"
required: false
default: "false"
type: boolean
permissions:
contents: read
actions: read
concurrency:
group: e2e-${{ github.ref }}
cancel-in-progress: true
env:
GITEA_PORT: 3333
FAKE_GITHUB_PORT: 4580
GIT_SERVER_PORT: 4590
APP_PORT: 4321
BUN_VERSION: "1.3.10"
jobs:
e2e-tests:
name: E2E Integration Tests
runs-on: ubuntu-latest
timeout-minutes: 25
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Bun
uses: oven-sh/setup-bun@v1
with:
bun-version: ${{ env.BUN_VERSION }}
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: "22"
- name: Install dependencies
run: |
bun install
echo "✓ Dependencies installed"
- name: Install Playwright
run: |
npx playwright install chromium
npx playwright install-deps chromium
echo "✓ Playwright ready"
- name: Create test git repositories
run: |
echo "Creating bare git repos for E2E testing..."
bun run tests/e2e/create-test-repos.ts --output-dir tests/e2e/git-repos
if [ ! -f tests/e2e/git-repos/manifest.json ]; then
echo "ERROR: Test git repos were not created (manifest.json missing)"
exit 1
fi
echo "✓ Test repos created:"
cat tests/e2e/git-repos/manifest.json | jq -r '.repos[] | " • \(.owner)/\(.name) — \(.description)"'
- name: Start Gitea and git-server containers
run: |
echo "Starting containers via docker compose..."
docker compose -f tests/e2e/docker-compose.e2e.yml up -d
# Wait for git-server
echo "Waiting for git HTTP server..."
for i in $(seq 1 30); do
if curl -sf http://localhost:${{ env.GIT_SERVER_PORT }}/manifest.json > /dev/null 2>&1; then
echo "✓ Git HTTP server is ready"
break
fi
if [ $i -eq 30 ]; then
echo "ERROR: Git HTTP server did not start"
docker compose -f tests/e2e/docker-compose.e2e.yml logs git-server
exit 1
fi
sleep 1
done
# Wait for Gitea
echo "Waiting for Gitea to be ready..."
for i in $(seq 1 60); do
if curl -sf http://localhost:${{ env.GITEA_PORT }}/api/v1/version > /dev/null 2>&1; then
version=$(curl -sf http://localhost:${{ env.GITEA_PORT }}/api/v1/version | jq -r '.version // "unknown"')
echo "✓ Gitea is ready (version: $version)"
break
fi
if [ $i -eq 60 ]; then
echo "ERROR: Gitea did not become healthy within 120s"
docker compose -f tests/e2e/docker-compose.e2e.yml logs gitea-e2e --tail=30
exit 1
fi
sleep 2
done
- name: Initialize database
run: |
bun run manage-db init
echo "✓ Database initialized"
- name: Build application
env:
GH_API_URL: http://localhost:4580
BETTER_AUTH_SECRET: e2e-test-secret
run: |
bun run build
echo "✓ Build complete"
- name: Start fake GitHub API server
run: |
# Start with GIT_SERVER_URL pointing to the git-server container name
# (Gitea will resolve it via Docker networking)
PORT=${{ env.FAKE_GITHUB_PORT }} GIT_SERVER_URL="http://git-server" \
npx tsx tests/e2e/fake-github-server.ts &
echo $! > /tmp/fake-github.pid
echo "Waiting for fake GitHub API..."
for i in $(seq 1 30); do
if curl -sf http://localhost:${{ env.FAKE_GITHUB_PORT }}/___mgmt/health > /dev/null 2>&1; then
echo "✓ Fake GitHub API is ready"
break
fi
if [ $i -eq 30 ]; then
echo "ERROR: Fake GitHub API did not start"
exit 1
fi
sleep 1
done
# Ensure clone URLs are set for the git-server container
curl -sf -X POST http://localhost:${{ env.FAKE_GITHUB_PORT }}/___mgmt/set-clone-url \
-H "Content-Type: application/json" \
-d '{"url": "http://git-server"}' || true
echo "✓ Clone URLs configured for git-server container"
- name: Start gitea-mirror application
env:
GH_API_URL: http://localhost:4580
BETTER_AUTH_SECRET: e2e-test-secret
BETTER_AUTH_URL: http://localhost:4321
DATABASE_URL: file:data/gitea-mirror.db
HOST: 0.0.0.0
PORT: ${{ env.APP_PORT }}
NODE_ENV: production
PRE_SYNC_BACKUP_ENABLED: "false"
ENCRYPTION_SECRET: "e2e-encryption-secret-32char!!"
run: |
# Re-init DB in case build step cleared it
bun run manage-db init 2>/dev/null || true
bun run start &
echo $! > /tmp/app.pid
echo "Waiting for gitea-mirror app..."
for i in $(seq 1 90); do
if curl -sf http://localhost:${{ env.APP_PORT }}/api/health > /dev/null 2>&1 || \
curl -sf -o /dev/null -w "%{http_code}" http://localhost:${{ env.APP_PORT }}/ 2>/dev/null | grep -q "^[23]"; then
echo "✓ gitea-mirror app is ready"
break
fi
if ! kill -0 $(cat /tmp/app.pid) 2>/dev/null; then
echo "ERROR: App process died"
exit 1
fi
if [ $i -eq 90 ]; then
echo "ERROR: gitea-mirror app did not start within 180s"
exit 1
fi
sleep 2
done
- name: Run E2E tests
env:
APP_URL: http://localhost:${{ env.APP_PORT }}
GITEA_URL: http://localhost:${{ env.GITEA_PORT }}
FAKE_GITHUB_URL: http://localhost:${{ env.FAKE_GITHUB_PORT }}
GIT_SERVER_URL: http://localhost:${{ env.GIT_SERVER_PORT }}
CI: true
run: |
mkdir -p tests/e2e/test-results
npx playwright test \
--config tests/e2e/playwright.config.ts \
--reporter=github,html
- name: Diagnostic info on failure
if: failure()
run: |
echo "═══════════════════════════════════════════════════════════"
echo " Diagnostic Information"
echo "═══════════════════════════════════════════════════════════"
echo ""
echo "── Git server status ──"
curl -sf http://localhost:${{ env.GIT_SERVER_PORT }}/manifest.json 2>/dev/null | jq . || echo "(unreachable)"
echo ""
echo "── Gitea status ──"
curl -sf http://localhost:${{ env.GITEA_PORT }}/api/v1/version 2>/dev/null || echo "(unreachable)"
echo ""
echo "── Fake GitHub status ──"
curl -sf http://localhost:${{ env.FAKE_GITHUB_PORT }}/___mgmt/health 2>/dev/null | jq . || echo "(unreachable)"
echo ""
echo "── App status ──"
curl -sf http://localhost:${{ env.APP_PORT }}/api/health 2>/dev/null || echo "(unreachable)"
echo ""
echo "── Docker containers ──"
docker compose -f tests/e2e/docker-compose.e2e.yml ps 2>/dev/null || true
echo ""
echo "── Gitea container logs (last 50 lines) ──"
docker compose -f tests/e2e/docker-compose.e2e.yml logs gitea-e2e --tail=50 2>/dev/null || echo "(no container)"
echo ""
echo "── Git server logs (last 20 lines) ──"
docker compose -f tests/e2e/docker-compose.e2e.yml logs git-server --tail=20 2>/dev/null || echo "(no container)"
echo ""
echo "── Running processes ──"
ps aux | grep -E "(fake-github|astro|bun|node)" | grep -v grep || true
- name: Upload Playwright report
uses: actions/upload-artifact@v4
if: always()
with:
name: e2e-playwright-report
path: tests/e2e/playwright-report/
retention-days: 14
- name: Upload test results
uses: actions/upload-artifact@v4
if: always()
with:
name: e2e-test-results
path: tests/e2e/test-results/
retention-days: 14
- name: Cleanup
if: always()
run: |
# Stop background processes
if [ -f /tmp/fake-github.pid ]; then
kill $(cat /tmp/fake-github.pid) 2>/dev/null || true
rm -f /tmp/fake-github.pid
fi
if [ -f /tmp/app.pid ]; then
kill $(cat /tmp/app.pid) 2>/dev/null || true
rm -f /tmp/app.pid
fi
# Stop containers
docker compose -f tests/e2e/docker-compose.e2e.yml down --volumes --remove-orphans 2>/dev/null || true
echo "✓ Cleanup complete"


@@ -21,6 +21,7 @@ jobs:
yamllint:
name: Lint YAML
runs-on: ubuntu-latest
timeout-minutes: 25
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
@@ -35,6 +36,7 @@ jobs:
helm-template:
name: Helm lint & template
runs-on: ubuntu-latest
timeout-minutes: 25
steps:
- uses: actions/checkout@v4
- name: Setup Helm


@@ -5,18 +5,34 @@ on:
branches: [main, nix]
tags:
- 'v*'
paths:
- 'flake.nix'
- 'flake.lock'
- 'bun.nix'
- 'bun.lock'
- 'package.json'
- '.github/workflows/nix-build.yml'
pull_request:
branches: [main]
paths:
- 'flake.nix'
- 'flake.lock'
- 'bun.nix'
- 'bun.lock'
- 'package.json'
- '.github/workflows/nix-build.yml'
permissions:
contents: read
jobs:
check:
strategy:
matrix:
os: [ubuntu-latest, macos-latest]
runs-on: ${{ matrix.os }}
runs-on: ubuntu-latest
timeout-minutes: 45
env:
NIX_CONFIG: |
accept-flake-config = true
access-tokens = github.com=${{ secrets.GITHUB_TOKEN }}
steps:
- uses: actions/checkout@v4
@@ -27,19 +43,15 @@ jobs:
- name: Setup Nix Cache
uses: DeterminateSystems/magic-nix-cache-action@main
- name: Regenerate bun.nix from bun.lock
run: nix run --accept-flake-config github:nix-community/bun2nix -- -o bun.nix
- name: Check flake
run: nix flake check
run: nix flake check --accept-flake-config
- name: Show flake info
run: nix flake show
run: nix flake show --accept-flake-config
- name: Evaluate package
run: |
# Evaluate the derivation without building (validates the Nix expression)
nix eval .#packages.$(nix eval --impure --expr 'builtins.currentSystem').default.name
echo "Flake evaluation successful"
# Note: Full build requires network access for bun install.
# Nix sandboxed builds block network access.
# To build locally: nix build --option sandbox false
# Or use: nix develop && bun install && bun run build
- name: Build package
if: github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/tags/v')
run: nix build --print-build-logs --accept-flake-config

.gitignore

@@ -37,3 +37,18 @@ result
result-*
.direnv/
# E2E test artifacts
tests/e2e/test-results/
tests/e2e/playwright-report/
tests/e2e/.auth/
tests/e2e/e2e-storage-state.json
tests/e2e/.fake-github.pid
tests/e2e/.app.pid
tests/e2e/git-repos/
# Playwright
/test-results/
/playwright-report/
/blob-report/
/playwright/.cache/
/playwright/.auth/


@@ -1,169 +0,0 @@
# Nix Distribution - Ready to Use!
## Current Status: WORKS NOW
Your Nix package is **already distributable**! Users can run it directly from GitHub without any additional setup on your end.
## How Users Will Use It
### Simple: Just Run From GitHub
```bash
nix run --extra-experimental-features 'nix-command flakes' github:RayLabsHQ/gitea-mirror
```
That's it! No releases, no CI, no infrastructure needed. It works right now.
---
## What Happens When They Run This?
1. **Nix fetches** your repo from GitHub
2. **Nix reads** `flake.nix` and `flake.lock`
3. **Nix builds** the package on their machine
4. **Nix runs** the application
5. **Result cached** in `/nix/store` for reuse
---
## Do You Need CI or Releases?
### For Basic Usage: **NO**
Users can already use it from GitHub. No CI or releases required.
### For CI Validation: **Already Set Up**
GitHub Actions validates builds on every push with Magic Nix Cache (free, no setup).
---
## Next Steps (Optional)
### Option 1: Release Versioning (2 minutes)
**Why:** Users can pin to specific versions
**How:**
```bash
# When ready to release
git tag v3.8.11
git push origin v3.8.11
# Users can then pin to this version
nix run github:RayLabsHQ/gitea-mirror/v3.8.11
```
No additional CI needed - tags work automatically with flakes!
### Option 2: Submit to nixpkgs (Long Term)
**Why:** Maximum discoverability and trust
**When:** After package is stable and well-tested
**How:** Submit PR to https://github.com/NixOS/nixpkgs
---
## Files Created
### Essential (Already Working)
- `flake.nix` - Package definition
- `flake.lock` - Dependency lock file
- `.envrc` - direnv integration
### Documentation
- `NIX.md` - Quick reference for users
- `docs/NIX_DEPLOYMENT.md` - Complete deployment guide
- `docs/NIX_DISTRIBUTION.md` - Distribution guide for you (maintainer)
- `README.md` - Updated with Nix instructions
### CI (Already Set Up)
- `.github/workflows/nix-build.yml` - Builds and validates on Linux + macOS
### Updated
- `.gitignore` - Added Nix artifacts
---
## Comparison: Your Distribution Options
| Setup | Time | User Experience | What You Need |
|-------|------|----------------|---------------|
| **Direct GitHub** | 0 min | Slow (build from source) | Nothing! Works now |
| **+ Git Tags** | 2 min | Versionable | Just push tags |
| **+ nixpkgs** | Hours | Official/Trusted | PR review process |
**Recommendation:** Direct GitHub works now. Add git tags for versioning. Consider nixpkgs submission once stable.
---
## Testing Your Distribution
You can test it right now:
```bash
# Test direct GitHub usage
nix run --extra-experimental-features 'nix-command flakes' github:RayLabsHQ/gitea-mirror
# Test with specific commit
nix run github:RayLabsHQ/gitea-mirror/$(git rev-parse HEAD)
# Validate flake
nix flake check
```
---
## User Documentation Locations
Users will find instructions in:
1. **README.md** - Installation section (already updated)
2. **NIX.md** - Quick reference
3. **docs/NIX_DEPLOYMENT.md** - Detailed guide
All docs include the correct commands with experimental features flags.
---
## When to Release New Versions
### For Git Tag Releases:
```bash
# 1. Update version in package.json
vim package.json
# 2. Update version in flake.nix (line 17)
vim flake.nix # version = "3.8.12";
# 3. Commit and tag
git add package.json flake.nix
git commit -m "chore: bump version to v3.8.12"
git tag v3.8.12
git push origin main
git push origin v3.8.12
```
Users can then use: `nix run github:RayLabsHQ/gitea-mirror/v3.8.12`
### No Release Needed For:
- Bug fixes
- Small changes
- Continuous updates
Users can always use latest from main: `nix run github:RayLabsHQ/gitea-mirror`
---
## Summary
**Ready to distribute RIGHT NOW**
- Just commit and push your `flake.nix`
- Users can run directly from GitHub
- CI validates builds automatically
**Optional: Submit to nixpkgs**
- Maximum discoverability
- Official Nix repository
- Do this once package is stable
See `docs/NIX_DISTRIBUTION.md` for complete details!


@@ -1,13 +1,15 @@
# syntax=docker/dockerfile:1.4
FROM oven/bun:1.3.9-debian AS base
FROM oven/bun:1.3.10-debian AS base
WORKDIR /app
RUN apt-get update && apt-get install -y --no-install-recommends \
RUN apt-get update && apt-get -y upgrade && apt-get install -y --no-install-recommends \
python3 make g++ gcc wget sqlite3 openssl ca-certificates \
&& rm -rf /var/lib/apt/lists/*
# ----------------------------
FROM base AS builder
ARG BASE_URL=/
ENV BASE_URL=${BASE_URL}
COPY package.json ./
COPY bun.lock* ./
RUN bun install --frozen-lockfile
@@ -26,21 +28,54 @@ COPY bun.lock* ./
RUN bun install --production --omit=peer --frozen-lockfile
# ----------------------------
FROM oven/bun:1.3.9-debian AS runner
WORKDIR /app
RUN apt-get update && apt-get install -y --no-install-recommends \
wget sqlite3 openssl ca-certificates \
# Build git-lfs from source with patched Go to resolve Go stdlib CVEs
FROM debian:trixie-slim AS git-lfs-builder
RUN apt-get update && apt-get -y upgrade && apt-get install -y --no-install-recommends \
wget ca-certificates git make \
&& rm -rf /var/lib/apt/lists/*
ARG GO_VERSION=1.25.8
ARG GIT_LFS_VERSION=3.7.1
RUN ARCH="$(dpkg --print-architecture)" \
&& wget -qO /tmp/go.tar.gz "https://go.dev/dl/go${GO_VERSION}.linux-${ARCH}.tar.gz" \
&& tar -C /usr/local -xzf /tmp/go.tar.gz \
&& rm /tmp/go.tar.gz
ENV PATH="/usr/local/go/bin:/root/go/bin:${PATH}"
# Force using our installed Go (not the version in go.mod toolchain directive)
ENV GOTOOLCHAIN=local
RUN git clone --branch "v${GIT_LFS_VERSION}" --depth 1 https://github.com/git-lfs/git-lfs.git /tmp/git-lfs \
&& cd /tmp/git-lfs \
&& go get golang.org/x/crypto@latest \
&& go mod tidy \
&& make \
&& install -m 755 /tmp/git-lfs/bin/git-lfs /usr/local/bin/git-lfs
# ----------------------------
FROM oven/bun:1.3.10-debian AS runner
WORKDIR /app
RUN apt-get update && apt-get -y upgrade && apt-get install -y --no-install-recommends \
git wget sqlite3 openssl ca-certificates \
&& rm -rf /var/lib/apt/lists/*
COPY --from=git-lfs-builder /usr/local/bin/git-lfs /usr/local/bin/git-lfs
RUN git lfs install
COPY --from=pruner /app/node_modules ./node_modules
COPY --from=builder /app/dist ./dist
COPY --from=builder /app/package.json ./package.json
COPY --from=builder /app/docker-entrypoint.sh ./docker-entrypoint.sh
COPY --from=builder /app/drizzle ./drizzle
# Remove build-only packages that are not needed at runtime
# (esbuild, vite, rollup, tailwind, svgo — all only used during `astro build`)
RUN rm -rf node_modules/esbuild node_modules/@esbuild \
node_modules/rollup node_modules/@rollup \
node_modules/vite node_modules/svgo \
node_modules/@tailwindcss/vite \
node_modules/tailwindcss
ENV NODE_ENV=production
ENV HOST=0.0.0.0
ENV PORT=4321
ENV DATABASE_URL=file:data/gitea-mirror.db
ENV BASE_URL=/
# Create directories and setup permissions
RUN mkdir -p /app/certs && \
@@ -58,6 +93,6 @@ VOLUME /app/data
EXPOSE 4321
HEALTHCHECK --interval=30s --timeout=5s --start-period=5s --retries=3 \
CMD wget --no-verbose --tries=1 --spider http://localhost:4321/api/health || exit 1
CMD sh -c 'BASE="${BASE_URL:-/}"; if [ "$BASE" = "/" ]; then BASE=""; else BASE="${BASE%/}"; fi; wget --no-verbose --tries=1 --spider "http://localhost:4321${BASE}/api/health" || exit 1'
ENTRYPOINT ["./docker-entrypoint.sh"]
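The `BASE_URL` normalization in the healthcheck `CMD` above can be pulled out into a small function to see what it does: `/` becomes empty, any trailing slash is stripped, and `/api/health` is appended. A sketch (the function name is mine, the logic mirrors the Dockerfile line):

```shell
# Reproduce the healthcheck's BASE_URL handling as a standalone function
normalize() {
  BASE="${1:-/}"
  if [ "$BASE" = "/" ]; then BASE=""; else BASE="${BASE%/}"; fi
  echo "http://localhost:4321${BASE}/api/health"
}

normalize "/"         # -> http://localhost:4321/api/health
normalize "/mirror"   # -> http://localhost:4321/mirror/api/health
normalize "/mirror/"  # -> http://localhost:4321/mirror/api/health
```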

NIX.md

@@ -24,7 +24,7 @@ Secrets auto-generate, database auto-initializes, and the web UI starts at http:
nix run --extra-experimental-features 'nix-command flakes' github:RayLabsHQ/gitea-mirror
# Pin to specific version
nix run github:RayLabsHQ/gitea-mirror/v3.8.11
nix run github:RayLabsHQ/gitea-mirror/vX.Y.Z
```
### 2. Install to Profile


@@ -1,7 +1,7 @@
<p align="center">
<img src=".github/assets/logo.png" alt="Gitea Mirror Logo" width="120" />
<h1>Gitea Mirror</h1>
<p><i>Automatically mirror repositories from GitHub to your self-hosted Gitea instance.</i></p>
<p><i>Automatically mirror repositories from GitHub to your self-hosted Gitea/Forgejo instance.</i></p>
<p align="center">
<a href="https://github.com/RayLabsHQ/gitea-mirror/releases/latest"><img src="https://img.shields.io/github/v/tag/RayLabsHQ/gitea-mirror?label=release" alt="release"/></a>
<a href="https://github.com/RayLabsHQ/gitea-mirror/actions/workflows/astro-build-test.yml"><img src="https://img.shields.io/github/actions/workflow/status/RayLabsHQ/gitea-mirror/astro-build-test.yml?branch=main" alt="build"/></a>
@@ -19,7 +19,7 @@ docker compose -f docker-compose.alt.yml up -d
# Access at http://localhost:4321
```
First user signup becomes admin. Configure GitHub and Gitea through the web interface!
First user signup becomes admin. Configure GitHub and Gitea/Forgejo through the web interface!
<p align="center">
<img src=".github/assets/dashboard.png" alt="Dashboard" width="600" />
@@ -28,7 +28,7 @@ First user signup becomes admin. Configure GitHub and Gitea through the web inte
## ✨ Features
- 🔁 Mirror public, private, and starred GitHub repos to Gitea
- 🔁 Mirror public, private, and starred GitHub repos to Gitea/Forgejo
- 🏢 Mirror entire organizations with flexible strategies
- 🎯 Custom destination control for repos and organizations
- 📦 **Git LFS support** - Mirror large files with Git LFS
@@ -40,6 +40,7 @@ First user signup becomes admin. Configure GitHub and Gitea through the web inte
- 🔄 **Auto-discovery** - Automatically import new GitHub repositories (v3.4.0+)
- 🧹 **Repository cleanup** - Auto-remove repos deleted from GitHub (v3.4.0+)
- 🎯 **Proper mirror intervals** - Respects configured sync intervals (v3.4.0+)
- 🛡️ **[Force-push protection](docs/FORCE_PUSH_PROTECTION.md)** - Smart detection with backup-on-demand or block-and-approve modes (Beta)
- 🗑️ Automatic database cleanup with configurable retention
- 🐳 Dockerized with multi-arch support (AMD64/ARM64)
@@ -112,7 +113,7 @@ docker compose up -d
#### Using Pre-built Image Directly
```bash
docker pull ghcr.io/raylabshq/gitea-mirror:v3.1.1
docker pull ghcr.io/raylabshq/gitea-mirror:latest
```
### Configuration Options
@@ -198,12 +199,12 @@ bun run dev
1. **First Time Setup**
- Navigate to http://localhost:4321
- Create admin account (first user signup)
- Configure GitHub and Gitea connections
- Configure GitHub and Gitea/Forgejo connections
2. **Mirror Strategies**
- **Preserve Structure**: Maintains GitHub organization structure
- **Single Organization**: All repos go to one Gitea organization
- **Flat User**: All repos under your Gitea user account
- **Single Organization**: All repos go to one Gitea/Forgejo organization
- **Flat User**: All repos under your Gitea/Forgejo user account
- **Mixed Mode**: Personal repos in one org, organization repos preserve structure
3. **Customization**
@@ -216,13 +217,13 @@ bun run dev
### Git LFS (Large File Storage)
Mirror Git LFS objects along with your repositories:
- Enable "Mirror LFS" option in Settings → Mirror Options
- Requires Gitea server with LFS enabled (`LFS_START_SERVER = true`)
- Requires Gitea/Forgejo server with LFS enabled (`LFS_START_SERVER = true`)
- Requires Git v2.1.2+ on the server
### Metadata Mirroring
Transfer complete repository metadata from GitHub to Gitea:
Transfer complete repository metadata from GitHub to Gitea/Forgejo:
- **Issues** - Mirror all issues with comments and labels
- **Pull Requests** - Transfer PR discussions to Gitea
- **Pull Requests** - Transfer PR discussions to Gitea/Forgejo
- **Labels** - Preserve repository labels
- **Milestones** - Keep project milestones
- **Wiki** - Mirror wiki content
@@ -242,7 +243,7 @@ Gitea Mirror provides powerful automatic synchronization features:
#### Features (v3.4.0+)
- **Auto-discovery**: Automatically discovers and imports new GitHub repositories
- **Repository cleanup**: Removes repositories that no longer exist in GitHub
- **Proper intervals**: Mirrors respect your configured sync intervals (not Gitea's default 24h)
- **Proper intervals**: Mirrors respect your configured sync intervals (not Gitea/Forgejo's default 24h)
- **Smart scheduling**: Only syncs repositories that need updating
- **Auto-start on boot** (v3.5.3+): Automatically imports and mirrors all repositories when `SCHEDULE_ENABLED=true` or `GITEA_MIRROR_INTERVAL` is set - no manual clicks required!
@@ -253,7 +254,7 @@ Navigate to the Configuration page and enable "Automatic Syncing" with your pref
**🚀 Set it and forget it!** With these environment variables, Gitea Mirror will automatically:
1. **Import** all your GitHub repositories on startup (no manual import needed!)
2. **Mirror** them to Gitea immediately
2. **Mirror** them to Gitea/Forgejo immediately
3. **Keep them synchronized** based on your interval
4. **Auto-discover** new repos you create/star on GitHub
5. **Clean up** repos you delete from GitHub
@@ -283,23 +284,35 @@ CLEANUP_DRY_RUN=false # Set to true to test without changes
- **Auto-Start**: When `SCHEDULE_ENABLED=true` or `GITEA_MIRROR_INTERVAL` is set, the service automatically imports all GitHub repositories and mirrors them on startup. No manual "Import" or "Mirror" button clicks required!
- The scheduler checks every minute for tasks to run. The `GITEA_MIRROR_INTERVAL` determines how often each repository is actually synced. For example, with `8h`, each repo syncs every 8 hours from its last successful sync.
- **Large repo bootstrap**: For first-time mirroring of large repositories (especially with metadata/LFS), avoid very short intervals (for example `5m`). Start with a longer interval (`1h` to `8h`) or temporarily disable scheduling during the initial import/mirror run, then enable your regular interval after the first pass completes.
- **Why this matters**: If your Gitea instance takes a long time to complete migrations/imports, aggressive schedules can cause repeated retries and duplicate-looking mirror attempts.
- **Why this matters**: If your Gitea/Forgejo instance takes a long time to complete migrations/imports, aggressive schedules can cause repeated retries and duplicate-looking mirror attempts.
**🛡️ Backup Protection Features**:
- **No Accidental Deletions**: Repository cleanup is automatically skipped if GitHub is inaccessible (account deleted, banned, or API errors)
- **Archive Never Deletes Data**: The `archive` action preserves all repository data:
- Regular repositories: Made read-only using Gitea's archive feature
- Mirror repositories: Renamed with `archived-` prefix (Gitea API limitation prevents archiving mirrors)
- Regular repositories: Made read-only using Gitea/Forgejo's archive feature
- Mirror repositories: Renamed with `archived-` prefix (Gitea/Forgejo API limitation prevents archiving mirrors)
- Failed operations: Repository remains fully accessible even if marking as archived fails
- **Manual Sync on Demand**: Archived mirrors stay in Gitea/Forgejo with automatic syncs disabled; trigger `Manual Sync` from the Repositories page whenever you need fresh data.
- **The Whole Point of Backups**: Your Gitea/Forgejo mirrors are preserved even when GitHub sources disappear - that's why you have backups!
- **Strongly Recommended**: Always use `CLEANUP_ORPHANED_REPO_ACTION=archive` (default) instead of `delete`
## Troubleshooting
### Reverse Proxy Configuration
If you use a reverse proxy (e.g., Nginx Proxy Manager) and JavaScript files fail to load properly, try enabling HTTP/2 support in your proxy configuration. While not required by the application, some proxy configurations have better compatibility with HTTP/2 enabled. See [issue #43](https://github.com/RayLabsHQ/gitea-mirror/issues/43) for reference.
If you run behind a reverse proxy on a subpath (for example `https://git.example.com/mirror`), configure:
```bash
BASE_URL=/mirror
BETTER_AUTH_URL=https://git.example.com
PUBLIC_BETTER_AUTH_URL=https://git.example.com
BETTER_AUTH_TRUSTED_ORIGINS=https://git.example.com
```
Notes:
- `BASE_URL` sets the application path prefix.
- `BETTER_AUTH_TRUSTED_ORIGINS` should contain origins only (no path).
- When building Docker images, pass `BASE_URL` at build time as well.
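For reference, a minimal nginx configuration for this subpath layout could look like the sketch below. The server name, upstream address, and TLS setup are placeholders, not part of this project; since the app itself serves under `/mirror` when `BASE_URL=/mirror`, the prefix is passed through unchanged rather than stripped:

```nginx
server {
    listen 443 ssl;
    server_name git.example.com;

    # Pass the /mirror prefix through to the app; BASE_URL=/mirror
    # makes the app handle the prefix itself.
    location /mirror/ {
        proxy_pass http://127.0.0.1:4321;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```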
### Mirror Token Rotation (GitHub Token Changed)
@@ -308,7 +321,7 @@ For existing pull-mirror repositories, changing the GitHub token in Gitea Mirror
If sync logs show authentication failures (for example `terminal prompts disabled`), do one of the following:
1. In Gitea/Forgejo, open repository **Settings → Mirror Settings** and update the mirror authorization password/token.
2. Or delete and re-mirror the repository so it is recreated with current credentials.
### Re-sync Metadata After Changing Mirror Options
@@ -333,7 +346,7 @@ If your Gitea/Forgejo server has `mirror.MIN_INTERVAL` set to a higher value (fo
To avoid this:
1. Set Gitea Mirror interval to a value greater than or equal to your server `MIN_INTERVAL`.
2. Do not rely on manual per-repository mirror interval edits in Gitea/Forgejo, as they will be overwritten on sync.
## Development
@@ -355,13 +368,13 @@ bun run build
- **Frontend**: Astro, React, Shadcn UI, Tailwind CSS v4
- **Backend**: Bun runtime, SQLite, Drizzle ORM
- **APIs**: GitHub (Octokit), Gitea/Forgejo REST API
- **Auth**: Better Auth with session-based authentication
## Security
### Token Encryption
- All GitHub and Gitea/Forgejo API tokens are encrypted at rest using AES-256-GCM
- Encryption is automatic and transparent to users
- Set `ENCRYPTION_SECRET` environment variable for production deployments
- Falls back to `BETTER_AUTH_SECRET` if not set
@@ -455,13 +468,13 @@ Gitea Mirror can also act as an OIDC provider for other applications. Register O
## Known Limitations
### Pull Request Mirroring Implementation
Pull requests **cannot be created as actual PRs** in Gitea/Forgejo due to API limitations. Instead, they are mirrored as **enriched issues** with comprehensive metadata.
**Why real PR mirroring isn't possible:**
- Gitea/Forgejo's API doesn't support creating pull requests from external sources
- Real PRs require actual Git branches with commits to exist in the repository
- Would require complex branch synchronization and commit replication
- The mirror relationship is one-way (GitHub → Gitea/Forgejo) for repository content
**How we handle Pull Requests:**
PRs are mirrored as issues with rich metadata including:
@@ -475,7 +488,7 @@ PRs are mirrored as issues with rich metadata including:
- 🔀 Base and head branch information
- ✅ Merge status tracking
This approach preserves all important PR information while working within Gitea/Forgejo's API constraints. The PRs appear in the issue tracker with clear visual distinction and comprehensive details.
## Contributing
@@ -483,7 +496,7 @@ Contributions are welcome! Please read our [Contributing Guidelines](CONTRIBUTIN
## License
GNU Affero General Public License v3.0 (AGPL-3.0) - see [LICENSE](LICENSE) file for details.
## Star History
@@ -498,7 +511,8 @@ GNU General Public License v3.0 - see [LICENSE](LICENSE) file for details.
## Support
- 📖 [Documentation](https://github.com/RayLabsHQ/gitea-mirror/tree/main/docs)
- 🔐 [Custom CA Certificates](docs/CA_CERTIFICATES.md)
- 🔐 [Environment Variables](docs/ENVIRONMENT_VARIABLES.md)
- 🛡️ [Force-Push Protection](docs/FORCE_PUSH_PROTECTION.md)
- 🐛 [Report Issues](https://github.com/RayLabsHQ/gitea-mirror/issues)
- 💬 [Discussions](https://github.com/RayLabsHQ/gitea-mirror/discussions)
- 🔧 [Proxmox VE Script](https://community-scripts.github.io/ProxmoxVE/scripts?id=gitea-mirror)


@@ -4,8 +4,25 @@ import tailwindcss from '@tailwindcss/vite';
import react from '@astrojs/react';
import node from '@astrojs/node';
const normalizeBaseUrl = (value) => {
  if (!value || value.trim() === '') {
    return '/';
  }

  let normalized = value.trim();
  if (!normalized.startsWith('/')) {
    normalized = `/${normalized}`;
  }

  normalized = normalized.replace(/\/+$/, '');
  return normalized || '/';
};

const base = normalizeBaseUrl(process.env.BASE_URL);

// https://astro.build/config
export default defineConfig({
  base,
  output: 'server',
  adapter: node({
    mode: 'standalone',

bun.lock

File diff suppressed because it is too large Load Diff

bun.nix Normal file

File diff suppressed because it is too large Load Diff


@@ -3,4 +3,7 @@
timeout = 5000
# Preload the setup file
preload = ["./src/tests/setup.bun.ts"]
# Only run tests in src/ directory (excludes tests/e2e/ which are Playwright tests)
root = "./src/"

design/giteamirror.pen Normal file

@@ -0,0 +1,804 @@
{
"version": "2.9",
"children": [
{
"type": "frame",
"id": "eIiDx",
"x": 0,
"y": 0,
"name": "Scheduling Settings - Redesign",
"width": 1080,
"fill": "#09090B",
"cornerRadius": 16,
"gap": 24,
"padding": 32,
"children": [
{
"type": "frame",
"id": "7r0Wv",
"name": "Automatic Syncing Card",
"clip": true,
"width": "fill_container",
"fill": "#18181B",
"cornerRadius": 12,
"stroke": {
"align": "inside",
"thickness": 1,
"fill": "#27272A"
},
"layout": "vertical",
"children": [
{
"type": "frame",
"id": "gyCPG",
"name": "Header",
"width": "fill_container",
"gap": 12,
"padding": [
20,
24
],
"alignItems": "center",
"children": [
{
"type": "icon_font",
"id": "OunzZ",
"name": "headerIcon",
"width": 20,
"height": 20,
"iconFontName": "refresh-cw",
"iconFontFamily": "lucide",
"fill": "#A1A1AA"
},
{
"type": "text",
"id": "fMdlX",
"name": "headerTitle",
"fill": "#FAFAFA",
"content": "Automatic Syncing",
"fontFamily": "Inter",
"fontSize": 16,
"fontWeight": "600"
}
]
},
{
"type": "rectangle",
"id": "4cX02",
"name": "divider1",
"fill": "#27272A",
"width": "fill_container",
"height": 1
},
{
"type": "frame",
"id": "Kiezh",
"name": "Toggle Section",
"width": "fill_container",
"gap": 14,
"padding": [
20,
24
],
"children": [
{
"type": "frame",
"id": "QCPzN",
"name": "Checkbox",
"width": 20,
"height": 20,
"fill": "#6366F1",
"cornerRadius": 4,
"layout": "none",
"children": [
{
"type": "icon_font",
"id": "4FTax",
"x": 3,
"y": 3,
"name": "checkIcon",
"width": 14,
"height": 14,
"iconFontName": "check",
"iconFontFamily": "lucide",
"fill": "#FFFFFF"
}
]
},
{
"type": "frame",
"id": "FTzs6",
"name": "toggleText",
"width": "fill_container",
"layout": "vertical",
"gap": 4,
"children": [
{
"type": "text",
"id": "1nJKC",
"name": "toggleLabel",
"fill": "#FAFAFA",
"content": "Enable automatic repository syncing",
"fontFamily": "Inter",
"fontSize": 14,
"fontWeight": "500"
},
{
"type": "text",
"id": "r1O5t",
"name": "toggleDesc",
"fill": "#71717A",
"textGrowth": "fixed-width",
"width": "fill_container",
"content": "Periodically sync GitHub changes to Gitea",
"fontFamily": "Inter",
"fontSize": 13
}
]
}
]
},
{
"type": "rectangle",
"id": "nvQ6R",
"name": "divider2",
"fill": "#27272A",
"width": "fill_container",
"height": 1
},
{
"type": "frame",
"id": "FOoBn",
"name": "Schedule Builder",
"width": "fill_container",
"layout": "vertical",
"gap": 20,
"padding": 24,
"children": [
{
"type": "frame",
"id": "IqHEu",
"name": "schedHeader",
"width": "fill_container",
"justifyContent": "space_between",
"alignItems": "center",
"children": [
{
"type": "text",
"id": "RnVoM",
"name": "schedTitle",
"fill": "#A1A1AA",
"content": "SCHEDULE",
"fontFamily": "Inter",
"fontSize": 12,
"fontWeight": "600",
"letterSpacing": 1
},
{
"type": "frame",
"id": "aVtIZ",
"name": "tzBadge",
"fill": "#27272A",
"cornerRadius": 20,
"gap": 6,
"padding": [
4,
10
],
"alignItems": "center",
"children": [
{
"type": "icon_font",
"id": "iXpYV",
"name": "tzIcon",
"width": 12,
"height": 12,
"iconFontName": "globe",
"iconFontFamily": "lucide",
"fill": "#71717A"
},
{
"type": "text",
"id": "WjPMl",
"name": "tzText",
"fill": "#A1A1AA",
"content": "UTC",
"fontFamily": "Inter",
"fontSize": 11,
"fontWeight": "500"
}
]
}
]
},
{
"type": "frame",
"id": "P02fk",
"name": "formRow",
"width": "fill_container",
"gap": 12,
"children": [
{
"type": "frame",
"id": "kcYK5",
"name": "Frequency",
"width": "fill_container",
"layout": "vertical",
"gap": 6,
"children": [
{
"type": "text",
"id": "vMvsN",
"name": "label2",
"fill": "#A1A1AA",
"content": "Frequency",
"fontFamily": "Inter",
"fontSize": 12,
"fontWeight": "500"
},
{
"type": "frame",
"id": "3prth",
"name": "select2",
"width": "fill_container",
"height": 40,
"fill": "#27272A",
"cornerRadius": 8,
"stroke": {
"align": "inside",
"thickness": 1,
"fill": "#3F3F46"
},
"padding": [
0,
12
],
"justifyContent": "space_between",
"alignItems": "center",
"children": [
{
"type": "text",
"id": "ANY36",
"name": "sel2Text",
"fill": "#FAFAFA",
"content": "Daily",
"fontFamily": "Inter",
"fontSize": 13
},
{
"type": "icon_font",
"id": "GUWfd",
"name": "sel2Icon",
"width": 16,
"height": 16,
"iconFontName": "chevron-down",
"iconFontFamily": "lucide",
"fill": "#71717A"
}
]
}
]
},
{
"type": "frame",
"id": "xphp0",
"name": "Start Time",
"width": "fill_container",
"layout": "vertical",
"gap": 6,
"children": [
{
"type": "text",
"id": "l6VkR",
"name": "label3",
"fill": "#A1A1AA",
"content": "Start Time",
"fontFamily": "Inter",
"fontSize": 12,
"fontWeight": "500"
},
{
"type": "frame",
"id": "lWBDi",
"name": "timeInput",
"width": "fill_container",
"height": 40,
"fill": "#27272A",
"cornerRadius": 8,
"stroke": {
"align": "inside",
"thickness": 1,
"fill": "#3F3F46"
},
"padding": [
0,
12
],
"justifyContent": "space_between",
"alignItems": "center",
"children": [
{
"type": "text",
"id": "fbuMi",
"name": "timeText",
"fill": "#FAFAFA",
"content": "10:00 PM",
"fontFamily": "Inter",
"fontSize": 13
},
{
"type": "icon_font",
"id": "5xKW7",
"name": "timeIcon",
"width": 16,
"height": 16,
"iconFontName": "clock-4",
"iconFontFamily": "lucide",
"fill": "#71717A"
}
]
}
]
}
]
}
]
},
{
"type": "rectangle",
"id": "BtYt7",
"name": "divider3",
"fill": "#27272A",
"width": "fill_container",
"height": 1
},
{
"type": "frame",
"id": "520Kb",
"name": "Status Bar",
"width": "fill_container",
"padding": [
16,
24
],
"justifyContent": "space_between",
"alignItems": "center",
"children": [
{
"type": "frame",
"id": "J8JzX",
"name": "lastSync",
"gap": 8,
"alignItems": "center",
"children": [
{
"type": "icon_font",
"id": "MS5VM",
"name": "lastIcon",
"width": 14,
"height": 14,
"iconFontName": "history",
"iconFontFamily": "lucide",
"fill": "#52525B"
},
{
"type": "text",
"id": "8KJHY",
"name": "lastLabel",
"fill": "#52525B",
"content": "Last sync",
"fontFamily": "Inter",
"fontSize": 12
},
{
"type": "text",
"id": "Fz116",
"name": "lastValue",
"fill": "#A1A1AA",
"content": "Never",
"fontFamily": "Inter",
"fontSize": 12,
"fontWeight": "500"
}
]
},
{
"type": "frame",
"id": "ZbRFN",
"name": "nextSync",
"gap": 8,
"alignItems": "center",
"children": [
{
"type": "icon_font",
"id": "wIKSk",
"name": "nextIcon",
"width": 14,
"height": 14,
"iconFontName": "calendar",
"iconFontFamily": "lucide",
"fill": "#52525B"
},
{
"type": "text",
"id": "ejqSP",
"name": "nextLabel",
"fill": "#52525B",
"content": "Next sync",
"fontFamily": "Inter",
"fontSize": 12
},
{
"type": "text",
"id": "M4oJ7",
"name": "nextValue",
"fill": "#6366F1",
"content": "Calculating...",
"fontFamily": "Inter",
"fontSize": 12,
"fontWeight": "500"
}
]
}
]
}
]
},
{
"type": "frame",
"id": "7PK7H",
"name": "Database Maintenance Card",
"clip": true,
"width": "fill_container",
"height": "fill_container",
"fill": "#18181B",
"cornerRadius": 12,
"stroke": {
"align": "inside",
"thickness": 1,
"fill": "#27272A"
},
"layout": "vertical",
"children": [
{
"type": "frame",
"id": "FAaon",
"name": "Header",
"width": "fill_container",
"gap": 12,
"padding": [
20,
24
],
"alignItems": "center",
"children": [
{
"type": "icon_font",
"id": "64CaE",
"name": "rHeaderIcon",
"width": 20,
"height": 20,
"iconFontName": "database",
"iconFontFamily": "lucide",
"fill": "#A1A1AA"
},
{
"type": "text",
"id": "rvZlC",
"name": "rHeaderTitle",
"fill": "#FAFAFA",
"content": "Database Maintenance",
"fontFamily": "Inter",
"fontSize": 16,
"fontWeight": "600"
}
]
},
{
"type": "rectangle",
"id": "nsM0M",
"name": "rDivider1",
"fill": "#27272A",
"width": "fill_container",
"height": 1
},
{
"type": "frame",
"id": "8zhPi",
"name": "Toggle Section",
"width": "fill_container",
"gap": 14,
"padding": [
20,
24
],
"children": [
{
"type": "frame",
"id": "eQbZk",
"name": "Checkbox",
"width": 20,
"height": 20,
"fill": "#6366F1",
"cornerRadius": 4,
"layout": "none",
"children": [
{
"type": "icon_font",
"id": "t6PbY",
"x": 3,
"y": 3,
"name": "rCheckIcon",
"width": 14,
"height": 14,
"iconFontName": "check",
"iconFontFamily": "lucide",
"fill": "#FFFFFF"
}
]
},
{
"type": "frame",
"id": "lpBPI",
"name": "rToggleText",
"width": "fill_container",
"layout": "vertical",
"gap": 4,
"children": [
{
"type": "text",
"id": "Kuy1S",
"name": "rToggleLabel",
"fill": "#FAFAFA",
"content": "Enable automatic database cleanup",
"fontFamily": "Inter",
"fontSize": 14,
"fontWeight": "500"
},
{
"type": "text",
"id": "OviVY",
"name": "rToggleDesc",
"fill": "#71717A",
"textGrowth": "fixed-width",
"width": "fill_container",
"content": "Remove old activity logs to optimize storage",
"fontFamily": "Inter",
"fontSize": 13
}
]
}
]
},
{
"type": "rectangle",
"id": "1og3D",
"name": "rDivider2",
"fill": "#27272A",
"width": "fill_container",
"height": 1
},
{
"type": "frame",
"id": "J7576",
"name": "Retention Section",
"width": "fill_container",
"layout": "vertical",
"gap": 16,
"padding": 24,
"children": [
{
"type": "frame",
"id": "JZA6R",
"name": "retLabelRow",
"gap": 6,
"alignItems": "center",
"children": [
{
"type": "text",
"id": "Diiak",
"name": "retLabel",
"fill": "#FAFAFA",
"content": "Data retention period",
"fontFamily": "Inter",
"fontSize": 14,
"fontWeight": "500"
},
{
"type": "icon_font",
"id": "1qqCe",
"name": "retInfoIcon",
"width": 14,
"height": 14,
"iconFontName": "info",
"iconFontFamily": "lucide",
"fill": "#52525B"
}
]
},
{
"type": "frame",
"id": "kfUjs",
"name": "retRow",
"width": "fill_container",
"gap": 16,
"alignItems": "center",
"children": [
{
"type": "frame",
"id": "9bhls",
"name": "retSelect",
"width": 180,
"height": 40,
"fill": "#27272A",
"cornerRadius": 8,
"stroke": {
"align": "inside",
"thickness": 1,
"fill": "#3F3F46"
},
"padding": [
0,
12
],
"justifyContent": "space_between",
"alignItems": "center",
"children": [
{
"type": "text",
"id": "3NOod",
"name": "retSelText",
"fill": "#FAFAFA",
"content": "1 month",
"fontFamily": "Inter",
"fontSize": 13
},
{
"type": "icon_font",
"id": "8QBA8",
"name": "retSelIcon",
"width": 16,
"height": 16,
"iconFontName": "chevron-down",
"iconFontFamily": "lucide",
"fill": "#71717A"
}
]
},
{
"type": "text",
"id": "GA6ye",
"name": "retHelper",
"fill": "#52525B",
"content": "Cleanup runs every 2 days",
"fontFamily": "Inter",
"fontSize": 12
}
]
}
]
},
{
"type": "rectangle",
"id": "WfXVB",
"name": "rDivider3",
"fill": "#27272A",
"width": "fill_container",
"height": 1
},
{
"type": "frame",
"id": "WpXnI",
"name": "Cleanup Status",
"width": "fill_container",
"layout": "vertical",
"gap": 12,
"padding": [
16,
24
],
"children": [
{
"type": "frame",
"id": "fbpm5",
"name": "lastCleanup",
"width": "fill_container",
"justifyContent": "space_between",
"alignItems": "center",
"children": [
{
"type": "frame",
"id": "DdLix",
"name": "lastCleanupLeft",
"gap": 8,
"alignItems": "center",
"children": [
{
"type": "icon_font",
"id": "FN2cj",
"name": "lastCleanIcon",
"width": 14,
"height": 14,
"iconFontName": "history",
"iconFontFamily": "lucide",
"fill": "#52525B"
},
{
"type": "text",
"id": "JjmMa",
"name": "lastCleanLabel",
"fill": "#52525B",
"content": "Last cleanup",
"fontFamily": "Inter",
"fontSize": 12
}
]
},
{
"type": "text",
"id": "l1Kph",
"name": "lastCleanValue",
"fill": "#A1A1AA",
"content": "Never",
"fontFamily": "Inter",
"fontSize": 12,
"fontWeight": "500"
}
]
},
{
"type": "frame",
"id": "AWHY8",
"name": "nextCleanup",
"width": "fill_container",
"justifyContent": "space_between",
"alignItems": "center",
"children": [
{
"type": "frame",
"id": "sj0qN",
"name": "nextCleanupLeft",
"gap": 8,
"alignItems": "center",
"children": [
{
"type": "icon_font",
"id": "V6RTK",
"name": "nextCleanIcon",
"width": 14,
"height": 14,
"iconFontName": "calendar",
"iconFontFamily": "lucide",
"fill": "#52525B"
},
{
"type": "text",
"id": "wf0b4",
"name": "nextCleanLabel",
"fill": "#52525B",
"content": "Next cleanup",
"fontFamily": "Inter",
"fontSize": 12
}
]
},
{
"type": "text",
"id": "YWZGH",
"name": "nextCleanValue",
"fill": "#6366F1",
"content": "March 20, 2026 at 12:19 AM",
"fontFamily": "Inter",
"fontSize": 12,
"fontWeight": "500"
}
]
}
]
}
]
}
]
}
]
}


@@ -18,6 +18,12 @@ services:
- BETTER_AUTH_SECRET=${BETTER_AUTH_SECRET} # Min 32 chars, required for sessions
- BETTER_AUTH_URL=${BETTER_AUTH_URL:-http://localhost:4321}
- BETTER_AUTH_TRUSTED_ORIGINS=${BETTER_AUTH_TRUSTED_ORIGINS:-http://localhost:4321}
# REVERSE PROXY: If accessing via a reverse proxy, set all three to your external URL:
# BETTER_AUTH_URL=https://gitea-mirror.example.com
# PUBLIC_BETTER_AUTH_URL=https://gitea-mirror.example.com
# BETTER_AUTH_TRUSTED_ORIGINS=https://gitea-mirror.example.com
# NOTE: Path-prefix deployments (e.g. /mirror) require BASE_URL at build time.
# Use docker-compose.yml (which builds from source) and set BASE_URL there.
# === CORE SETTINGS ===
# These are technically required but have working defaults
@@ -25,6 +31,7 @@ services:
- DATABASE_URL=file:data/gitea-mirror.db
- HOST=0.0.0.0
- PORT=4321
- BASE_URL=${BASE_URL:-/}
- PUBLIC_BETTER_AUTH_URL=${PUBLIC_BETTER_AUTH_URL:-http://localhost:4321}
# Optional concurrency controls (defaults match in-app defaults)
# If you want perfect ordering of issues and PRs, set these at 1
@@ -32,7 +39,11 @@ services:
- MIRROR_PULL_REQUEST_CONCURRENCY=${MIRROR_PULL_REQUEST_CONCURRENCY:-5}
healthcheck:
test:
[
"CMD-SHELL",
"BASE=\"${BASE_URL:-/}\"; if [ \"$${BASE}\" = \"/\" ]; then BASE=\"\"; else BASE=\"$${BASE%/}\"; fi; wget --no-verbose --tries=3 --spider \"http://localhost:4321$${BASE}/api/health\"",
]
interval: 30s
timeout: 10s
retries: 5


@@ -45,6 +45,8 @@ services:
build:
context: .
dockerfile: Dockerfile
args:
BASE_URL: ${BASE_URL:-/}
platforms:
- linux/amd64
- linux/arm64
@@ -66,6 +68,7 @@ services:
- DATABASE_URL=file:data/gitea-mirror.db
- HOST=0.0.0.0
- PORT=4321
- BASE_URL=${BASE_URL:-/}
- BETTER_AUTH_SECRET=dev-secret-key
# GitHub/Gitea Mirror Config
- GITHUB_USERNAME=${GITHUB_USERNAME:-your-github-username}
@@ -89,7 +92,11 @@ services:
# Optional: Skip TLS verification (insecure, use only for testing)
# - GITEA_SKIP_TLS_VERIFY=${GITEA_SKIP_TLS_VERIFY:-false}
healthcheck:
test:
[
"CMD-SHELL",
"BASE=\"${BASE_URL:-/}\"; if [ \"$${BASE}\" = \"/\" ]; then BASE=\"\"; else BASE=\"$${BASE%/}\"; fi; wget --no-verbose --tries=1 --spider \"http://localhost:4321$${BASE}/api/health\"",
]
interval: 30s
timeout: 5s
retries: 3


@@ -7,6 +7,8 @@ services:
build:
context: .
dockerfile: Dockerfile
args:
BASE_URL: ${BASE_URL:-/}
platforms:
- linux/amd64
- linux/arm64
@@ -30,8 +32,21 @@ services:
- DATABASE_URL=file:data/gitea-mirror.db
- HOST=0.0.0.0
- PORT=4321
- BASE_URL=${BASE_URL:-/}
- BETTER_AUTH_SECRET=${BETTER_AUTH_SECRET:-your-secret-key-change-this-in-production}
- BETTER_AUTH_URL=${BETTER_AUTH_URL:-http://localhost:4321}
# REVERSE PROXY: If you access Gitea Mirror through a reverse proxy (e.g. Nginx, Caddy, Traefik),
# you MUST set these three variables to your external URL. Example:
# BETTER_AUTH_URL=https://gitea-mirror.example.com
# PUBLIC_BETTER_AUTH_URL=https://gitea-mirror.example.com
# BETTER_AUTH_TRUSTED_ORIGINS=https://gitea-mirror.example.com
# If deployed under a path prefix (e.g. https://git.example.com/mirror), also set:
# BASE_URL=/mirror
# BETTER_AUTH_URL=https://git.example.com
# PUBLIC_BETTER_AUTH_URL=https://git.example.com
# BETTER_AUTH_TRUSTED_ORIGINS=https://git.example.com
- PUBLIC_BETTER_AUTH_URL=${PUBLIC_BETTER_AUTH_URL:-http://localhost:4321}
- BETTER_AUTH_TRUSTED_ORIGINS=${BETTER_AUTH_TRUSTED_ORIGINS:-}
# Optional: ENCRYPTION_SECRET will be auto-generated if not provided
# - ENCRYPTION_SECRET=${ENCRYPTION_SECRET:-}
# GitHub/Gitea Mirror Config
@@ -74,7 +89,11 @@ services:
- HEADER_AUTH_AUTO_PROVISION=${HEADER_AUTH_AUTO_PROVISION:-false}
- HEADER_AUTH_ALLOWED_DOMAINS=${HEADER_AUTH_ALLOWED_DOMAINS:-}
healthcheck:
test:
[
"CMD-SHELL",
"BASE=\"${BASE_URL:-/}\"; if [ \"$${BASE}\" = \"/\" ]; then BASE=\"\"; else BASE=\"$${BASE%/}\"; fi; wget --no-verbose --tries=3 --spider \"http://localhost:4321$${BASE}/api/health\"",
]
interval: 30s
timeout: 10s
retries: 5


@@ -139,16 +139,29 @@ fi
# Initialize configuration from environment variables if provided
echo "Checking for environment configuration..."
# Only run the env config script if relevant env vars are set
# This avoids spawning a heavy Bun process on memory-constrained systems
HAS_ENV_CONFIG=false
if [ -n "$GITHUB_USERNAME" ] || [ -n "$GITHUB_TOKEN" ] || [ -n "$GITEA_URL" ] || [ -n "$GITEA_USERNAME" ] || [ -n "$GITEA_TOKEN" ]; then
  HAS_ENV_CONFIG=true
fi

if [ "$HAS_ENV_CONFIG" = "true" ]; then
  if [ -f "dist/scripts/startup-env-config.js" ]; then
    echo "Loading configuration from environment variables..."
    bun dist/scripts/startup-env-config.js || ENV_CONFIG_EXIT_CODE=$?
    ENV_CONFIG_EXIT_CODE=${ENV_CONFIG_EXIT_CODE:-0}
  elif [ -f "scripts/startup-env-config.ts" ]; then
    echo "Loading configuration from environment variables..."
    bun scripts/startup-env-config.ts || ENV_CONFIG_EXIT_CODE=$?
    ENV_CONFIG_EXIT_CODE=${ENV_CONFIG_EXIT_CODE:-0}
  else
    echo "Environment configuration script not found. Skipping."
    ENV_CONFIG_EXIT_CODE=0
  fi
else
  echo "No GitHub/Gitea environment variables found, skipping env config initialization."
  ENV_CONFIG_EXIT_CODE=0
fi
@@ -161,17 +174,15 @@ fi
# Run startup recovery to handle any interrupted jobs
echo "Running startup recovery..."
RECOVERY_EXIT_CODE=0
if [ -f "dist/scripts/startup-recovery.js" ]; then
  echo "Running startup recovery using compiled script..."
  bun dist/scripts/startup-recovery.js --timeout=30000 || RECOVERY_EXIT_CODE=$?
elif [ -f "scripts/startup-recovery.ts" ]; then
  echo "Running startup recovery using TypeScript script..."
  bun scripts/startup-recovery.ts --timeout=30000 || RECOVERY_EXIT_CODE=$?
else
  echo "Warning: Startup recovery script not found. Skipping recovery."
fi
# Log recovery result
@@ -185,17 +196,15 @@ fi
# Run repository status repair to fix any inconsistent mirroring states
echo "Running repository status repair..."
REPAIR_EXIT_CODE=0
if [ -f "dist/scripts/repair-mirrored-repos.js" ]; then
  echo "Running repository repair using compiled script..."
  bun dist/scripts/repair-mirrored-repos.js --startup || REPAIR_EXIT_CODE=$?
elif [ -f "scripts/repair-mirrored-repos.ts" ]; then
  echo "Running repository repair using TypeScript script..."
  bun scripts/repair-mirrored-repos.ts --startup || REPAIR_EXIT_CODE=$?
else
  echo "Warning: Repository repair script not found. Skipping repair."
fi
# Log repair result


@@ -310,26 +310,25 @@ bunx tsc --noEmit
## Release Process
1. **Choose release version** (`X.Y.Z`) and update `CHANGELOG.md`
2. **Build and test**:
```bash
bun run build
bun test
```
3. **Create release tag** (semver format required):
```bash
git tag vX.Y.Z
git push origin vX.Y.Z
```
4. **Create GitHub release**
5. **CI version sync (automatic)**:
- On `v*` tags, release CI updates `package.json` version in the build context from the tag (`vX.Y.Z` -> `X.Y.Z`), so Docker release images always report the correct app version.
- After the release build succeeds, CI commits the same `package.json` version back to `main` automatically.
## Contributing
@@ -349,6 +348,6 @@ git push origin v2.23.0
## Getting Help
- Check existing [issues](https://github.com/RayLabsHQ/gitea-mirror/issues)
- Join [discussions](https://github.com/RayLabsHQ/gitea-mirror/discussions)
- Review project docs in [docs/README.md](./README.md)


@@ -33,6 +33,7 @@ Essential application settings required for running Gitea Mirror.
| `NODE_ENV` | Application environment | `production` | No |
| `HOST` | Server host binding | `0.0.0.0` | No |
| `PORT` | Server port | `4321` | No |
| `BASE_URL` | Application base path. Use `/` for root deployments, or a prefix such as `/mirror` when serving behind a reverse-proxy path prefix. | `/` | No |
| `DATABASE_URL` | Database connection URL | `sqlite://data/gitea-mirror.db` | No |
| `BETTER_AUTH_SECRET` | Secret key for session signing (generate with: `openssl rand -base64 32`) | - | Yes |
| `BETTER_AUTH_URL` | Primary base URL for authentication. This should be the main URL where your application is accessed. | `http://localhost:4321` | No |
@@ -61,6 +62,7 @@ Settings for connecting to and configuring GitHub repository sources.
| `INCLUDE_ARCHIVED` | Include archived repositories | `false` | `true`, `false` |
| `SKIP_FORKS` | Skip forked repositories | `false` | `true`, `false` |
| `MIRROR_STARRED` | Mirror starred repositories | `false` | `true`, `false` |
| `MIRROR_STARRED_LISTS` | Optional comma-separated GitHub Star List names to mirror (only used when `MIRROR_STARRED=true`) | - | Comma-separated list names (empty = all starred repos) |
| `STARRED_REPOS_ORG` | Organization name for starred repos | `starred` | Any string |
| `STARRED_REPOS_MODE` | How starred repos are mirrored | `dedicated-org` | `dedicated-org`, `preserve-owner` |
@@ -78,6 +80,7 @@ Settings for connecting to and configuring GitHub repository sources.
| Variable | Description | Default | Options |
|----------|-------------|---------|---------|
| `SKIP_STARRED_ISSUES` | Enable lightweight mode for starred repos (skip issues) | `false` | `true`, `false` |
| `AUTO_MIRROR_STARRED` | Automatically mirror starred repos during scheduled syncs and "Mirror All". When `false`, starred repos are imported for browsing but must be mirrored individually. | `false` | `true`, `false` |
## Gitea Configuration
@@ -300,6 +303,7 @@ services:
environment:
# Core Configuration
- NODE_ENV=production
- BASE_URL=/
- DATABASE_URL=file:data/gitea-mirror.db
- BETTER_AUTH_SECRET=your-secure-secret-here
# Primary access URL:
@@ -368,6 +372,21 @@ This setup allows you to:
**Important:** When accessing from different origins (IP vs domain), you'll need to log in separately on each origin as cookies cannot be shared across different origins for security reasons.
### Path Prefix Deployments
If you serve Gitea Mirror under a subpath such as `https://git.example.com/mirror`, set:
```bash
BASE_URL=/mirror
BETTER_AUTH_URL=https://git.example.com
PUBLIC_BETTER_AUTH_URL=https://git.example.com
BETTER_AUTH_TRUSTED_ORIGINS=https://git.example.com
```
Notes:
- `BETTER_AUTH_TRUSTED_ORIGINS` must contain origins only (no path).
- `BASE_URL` is applied at build time, so set it for image builds too.
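The Docker healthchecks in the compose files normalize `BASE_URL` the same way before probing `/api/health`. Expressed in TypeScript (the helper name is illustrative, not part of the app's API):

```typescript
// Sketch of the prefix handling used by the compose healthchecks:
// "/" contributes no prefix; any other value is used without trailing slashes.
function healthUrl(baseUrl: string): string {
  const prefix = baseUrl === "/" ? "" : baseUrl.replace(/\/+$/, "");
  return `http://localhost:4321${prefix}/api/health`;
}

// healthUrl("/")        => "http://localhost:4321/api/health"
// healthUrl("/mirror")  => "http://localhost:4321/mirror/api/health"
// healthUrl("/mirror/") => "http://localhost:4321/mirror/api/health"
```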
### Trusted Origins
The `BETTER_AUTH_TRUSTED_ORIGINS` variable serves multiple purposes:


@@ -0,0 +1,183 @@
# Force-Push Protection
This document describes the smart force-push protection system introduced in gitea-mirror v3.11.0+.
## The Problem
GitHub repositories can be force-pushed at any time — rewriting history, deleting branches, or replacing commits entirely. When gitea-mirror syncs a force-pushed repo, the old history in Gitea is silently overwritten. Files, commits, and branches disappear with no way to recover them.
The original workaround (`backupBeforeSync: true`) created a full git bundle backup before **every** sync. This doesn't scale — a user with 100+ GiB of mirrors would need up to 2 TB of backup storage with default retention settings, even though force-pushes are rare.
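To make that figure concrete, assuming retention keeps on the order of 20 bundle snapshots per repository (an illustrative number, not a documented default), the arithmetic is roughly:

```typescript
// Back-of-envelope storage cost of always-backup (all numbers are illustrative):
const mirrorSetGiB = 100;  // total size of mirrored repositories
const snapshotsKept = 20;  // assumed retention count per repo
const worstCaseGiB = mirrorSetGiB * snapshotsKept; // 2000 GiB, on the order of 2 TB
```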
## Solution: Smart Detection
Instead of backing up everything every time, the system detects force-pushes **before** they happen and only acts when needed.
### How Detection Works
Before each sync, the app compares branch SHAs between Gitea (the mirror) and GitHub (the source):
1. **Fetch branches from both sides** — lightweight API calls to get branch names and their latest commit SHAs
2. **Compare each branch**:
- SHAs match → nothing changed, no action needed
- SHAs differ → check if the change is a normal push or a force-push
3. **Ancestry check** — for branches with different SHAs, call GitHub's compare API to determine if the new SHA is a descendant of the old one:
- **Fast-forward** (new SHA descends from old) → normal push, safe to sync
- **Diverged** (histories split) → force-push detected
- **404** (old SHA doesn't exist on GitHub anymore) → history was rewritten, force-push detected
- **Branch deleted on GitHub** → flagged as destructive change
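The per-branch decision above can be sketched as a pure function (hypothetical names and signature; the real logic lives in `src/lib/utils/force-push-detection.ts`):

```typescript
// Hypothetical sketch of the per-branch classification described above.
// `compareStatus` mirrors GitHub's compare API: "ahead", "behind", "diverged",
// "identical", or null when the old SHA no longer exists on GitHub (404).
type BranchChange = "unchanged" | "fast-forward" | "force-push" | "deleted";

function classifyBranch(
  giteaSha: string,
  githubSha: string | undefined,
  compareStatus: "ahead" | "behind" | "diverged" | "identical" | null,
): BranchChange {
  if (githubSha === undefined) return "deleted";      // branch gone on GitHub
  if (giteaSha === githubSha) return "unchanged";     // SHAs match, nothing to do
  if (compareStatus === "ahead" || compareStatus === "identical")
    return "fast-forward";                            // normal push, safe to sync
  return "force-push";                                // diverged, behind, or old SHA gone
}
```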
### What Happens on Detection
Depends on the configured strategy (see below):
- **Backup strategies** (`always`, `on-force-push`): create a git bundle snapshot, then sync
- **Block strategy** (`block-on-force-push`): halt the sync, mark the repo as `pending-approval`, wait for user action
### Fail-Open Design
If detection itself fails (GitHub rate limits, network errors, API outages), sync proceeds normally. Detection never blocks a sync due to its own failure. Individual branch check failures are skipped — one flaky branch doesn't affect the others.
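A minimal sketch of the fail-open contract (hypothetical; the real detector is asynchronous and logs skip reasons):

```typescript
// Hypothetical: a detector error must never block the sync.
function detectFailOpen(
  detect: () => boolean,
): { forcePush: boolean; skipped: boolean } {
  try {
    return { forcePush: detect(), skipped: false };
  } catch {
    // Rate limits, network errors, API outages: treat as "no force-push found".
    return { forcePush: false, skipped: true };
  }
}
```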
## Backup Strategies
Configure via **Settings → GitHub Configuration → Destructive Update Protection**.
| Strategy | What It Does | Storage Cost | Best For |
|---|---|---|---|
| **Disabled** | No detection, no backups | Zero | Repos you don't care about losing |
| **Always Backup** | Snapshot before every sync (original behavior) | High | Small mirror sets, maximum safety |
| **Smart** (default) | Detect force-pushes, backup only when found | Near-zero normally | Most users — efficient protection |
| **Block & Approve** | Detect force-pushes, block sync until approved | Zero | Critical repos needing manual review |
### Strategy Details
#### Disabled
Syncs proceed without any detection or backup. If a force-push happens on GitHub, the mirror is silently overwritten.
#### Always Backup
Creates a git bundle snapshot before every sync regardless of whether a force-push occurred. This is the legacy behavior (equivalent to the old `backupBeforeSync: true`). Safe but expensive for large mirror sets.
#### Smart (`on-force-push`) — Recommended
Runs the force-push detection before each sync. On normal days (no force-pushes), syncs proceed without any backup overhead. When a force-push is detected, a snapshot is created before the sync runs.
This gives you protection when it matters with near-zero cost when it doesn't.
#### Block & Approve (`block-on-force-push`)
Runs detection and, when a force-push is found, **blocks the sync entirely**. The repository is marked as `pending-approval` and excluded from future scheduled syncs until you take action:
- **Approve**: creates a backup first, then syncs (safe)
- **Dismiss**: clears the flag and resumes normal syncing (no backup)
Use this for repos where you want manual control over destructive changes.
## Additional Settings
These appear when any non-disabled strategy is selected:
### Snapshot Retention Count
How many backup snapshots to keep per repository. Oldest snapshots are deleted when this limit is exceeded. Default: **5**.
### Snapshot Retention Days
Maximum age (in days) for backup snapshots. Bundles older than this are deleted during retention enforcement, though at least one bundle is always kept. Set to `0` to disable time-based retention. Default: **30**.
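Taken together, the two retention settings can be pictured as follows (a hypothetical sketch; the real enforcement operates on bundle files on disk):

```typescript
// Hypothetical retention sketch: keep at most `maxCount` bundles, drop bundles
// older than `maxAgeDays` (0 disables age-based pruning), but always keep at
// least the newest bundle. Returns the timestamps of bundles to delete.
function bundlesToDelete(
  timestamps: number[],          // bundle mtimes, ms since epoch
  maxCount: number,              // retention count, e.g. 5
  maxAgeDays: number,            // retention days, e.g. 30; 0 = no age limit
  now: number = Date.now(),
): number[] {
  const sorted = [...timestamps].sort((a, b) => b - a);  // newest first
  const cutoff = maxAgeDays > 0 ? now - maxAgeDays * 86_400_000 : -Infinity;
  // i > 0 guarantees the newest bundle is always retained.
  return sorted.filter((ts, i) => i > 0 && (i >= maxCount || ts < cutoff));
}
```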
### Snapshot Directory
Where git bundle backups are stored. Default: **`data/repo-backups`**. Bundles are organized as `<directory>/<owner>/<repo>/<timestamp>.bundle`.
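A hypothetical helper mirroring the documented layout (the exact timestamp format used by the app is an assumption here):

```typescript
// Hypothetical: build <directory>/<owner>/<repo>/<timestamp>.bundle.
// The ISO timestamp is sanitized so it is safe as a filename.
function bundlePath(dir: string, owner: string, repo: string, ts: Date): string {
  const stamp = ts.toISOString().replace(/[:.]/g, "-");
  return `${dir}/${owner}/${repo}/${stamp}.bundle`;
}
```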
### Block Sync on Snapshot Failure
Available for **Always Backup** and **Smart** strategies. When enabled, if the snapshot creation fails (disk full, permissions error, etc.), the sync is also blocked. When disabled, sync continues even if the snapshot couldn't be created.
Recommended: **enabled** if you rely on backups for recovery.
## Backward Compatibility
The old `backupBeforeSync` boolean is still recognized:
| Old Setting | New Equivalent |
|---|---|
| `backupBeforeSync: true` | `backupStrategy: "on-force-push"` |
| `backupBeforeSync: false` | `backupStrategy: "disabled"` |
| Neither set | `backupStrategy: "on-force-push"` (new default) |
Existing configurations are automatically mapped. The old field is deprecated but will continue to work.
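The mapping in the table above can be sketched as a small resolver (hypothetical names; the real resolver lives in `src/lib/repo-backup.ts`):

```typescript
// Hypothetical sketch of the legacy-field mapping described above.
type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";

interface BackupConfig {
  backupStrategy?: BackupStrategy;   // new field
  backupBeforeSync?: boolean;        // deprecated boolean
}

function resolveBackupStrategy(cfg: BackupConfig): BackupStrategy {
  if (cfg.backupStrategy) return cfg.backupStrategy;          // new field wins
  if (cfg.backupBeforeSync === true) return "on-force-push";  // legacy: backups on
  if (cfg.backupBeforeSync === false) return "disabled";      // legacy: backups off
  return "on-force-push";                                     // new default
}
```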
## Environment Variables
No new environment variables are required. The backup strategy is configured through the web UI and stored in the database alongside other config.
## API
### Approve/Dismiss Blocked Repos
When using the `block-on-force-push` strategy, repos that are blocked can be managed via the API:
```bash
# Approve sync (creates backup first, then syncs)
curl -X POST http://localhost:4321/api/job/approve-sync \
-H "Content-Type: application/json" \
-H "Cookie: <session>" \
-d '{"repositoryIds": ["<id>"], "action": "approve"}'
# Dismiss (clear the block, resume normal syncing)
curl -X POST http://localhost:4321/api/job/approve-sync \
-H "Content-Type: application/json" \
-H "Cookie: <session>" \
-d '{"repositoryIds": ["<id>"], "action": "dismiss"}'
```
Blocked repos also show an **Approve** / **Dismiss** button in the repository table UI.
## Architecture
### Key Files
| File | Purpose |
|---|---|
| `src/lib/utils/force-push-detection.ts` | Core detection: fetch branches, compare SHAs, check ancestry |
| `src/lib/repo-backup.ts` | Strategy resolver, backup decision logic, bundle creation |
| `src/lib/gitea-enhanced.ts` | Sync flow integration (calls detection + backup before mirror-sync) |
| `src/pages/api/job/approve-sync.ts` | Approve/dismiss API endpoint |
| `src/components/config/GitHubConfigForm.tsx` | Strategy selector UI |
| `src/components/repositories/RepositoryTable.tsx` | Pending-approval badge + action buttons |
### Detection Flow
```
syncGiteaRepoEnhanced()
├─ Resolve backup strategy (config → backupStrategy → backupBeforeSync → default)
├─ If strategy needs detection ("on-force-push" or "block-on-force-push"):
│ │
│ ├─ fetchGiteaBranches() — GET /api/v1/repos/{owner}/{repo}/branches
│ ├─ fetchGitHubBranches() — octokit.paginate(repos.listBranches)
│ │
│ └─ For each Gitea branch where SHA differs:
│ └─ checkAncestry() — octokit.repos.compareCommits()
│ ├─ "ahead" or "identical" → fast-forward (safe)
│ ├─ "diverged" or "behind" → force-push detected
│ └─ 404/422 → old SHA gone → force-push detected
├─ If "block-on-force-push" + detected:
│ └─ Set repo status to "pending-approval", return early
├─ If backup needed (always, or on-force-push + detected):
│ └─ Create git bundle snapshot
└─ Proceed to mirror-sync
```
## Troubleshooting
**Repos stuck in "pending-approval"**: Use the Approve or Dismiss buttons in the repository table, or call the approve-sync API endpoint.
**Detection always skipped**: Check the activity log for skip reasons. Common causes: Gitea repo not yet mirrored (first sync), GitHub API rate limits, network errors. All are fail-open by design.
**Backups consuming too much space**: Lower the retention count, or switch from "Always Backup" to "Smart" which only creates backups on actual force-pushes.
**False positives**: The detection compares branch-by-branch. A rebase (which must be force-pushed to publish) will correctly trigger detection. If you routinely rebase branches, consider using "Smart" instead of "Block & Approve" to avoid constant approval prompts.


@@ -16,7 +16,7 @@ nix run --extra-experimental-features 'nix-command flakes' github:RayLabsHQ/gite
nix run github:RayLabsHQ/gitea-mirror/abc123def
# Pin to git tag
nix run github:RayLabsHQ/gitea-mirror/v3.8.11
nix run github:RayLabsHQ/gitea-mirror/vX.Y.Z
```
**How it works:**
@@ -110,11 +110,11 @@ GitHub Actions workflow validates builds on every push/PR:
Tag releases for version pinning:
```bash
git tag v3.8.11
git push origin v3.8.11
git tag vX.Y.Z
git push origin vX.Y.Z
# Users can then pin:
nix run github:RayLabsHQ/gitea-mirror/v3.8.11
nix run github:RayLabsHQ/gitea-mirror/vX.Y.Z
```
### Phase 4: nixpkgs Submission (Long Term)
@@ -143,13 +143,13 @@ nix profile install --extra-experimental-features 'nix-command flakes' github:Ra
```bash
# Pin to git tag
nix run github:RayLabsHQ/gitea-mirror/v3.8.11
nix run github:RayLabsHQ/gitea-mirror/vX.Y.Z
# Pin to commit
nix run github:RayLabsHQ/gitea-mirror/abc123def
# Lock in flake.nix
inputs.gitea-mirror.url = "github:RayLabsHQ/gitea-mirror/v3.8.11";
inputs.gitea-mirror.url = "github:RayLabsHQ/gitea-mirror/vX.Y.Z";
```
#### Option 3: NixOS Configuration
@@ -160,7 +160,7 @@ inputs.gitea-mirror.url = "github:RayLabsHQ/gitea-mirror/v3.8.11";
nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
gitea-mirror.url = "github:RayLabsHQ/gitea-mirror";
# Or pin to version:
# gitea-mirror.url = "github:RayLabsHQ/gitea-mirror/v3.8.11";
# gitea-mirror.url = "github:RayLabsHQ/gitea-mirror/vX.Y.Z";
};
outputs = { nixpkgs, gitea-mirror, ... }: {
@@ -257,7 +257,7 @@ git tag -l
git ls-remote --tags origin
# Test specific tag
nix run github:RayLabsHQ/gitea-mirror/v3.8.11
nix run github:RayLabsHQ/gitea-mirror/vX.Y.Z
```
---

docs/NOTIFICATIONS.md (new file)

@@ -0,0 +1,88 @@
# Notifications
Gitea Mirror supports push notifications for mirror events. You can be alerted when jobs succeed, fail, or when new repositories are discovered.
## Supported Providers
### 1. Ntfy.sh (Direct)
[Ntfy.sh](https://ntfy.sh) is a simple HTTP-based pub-sub notification service. You can use the public server at `https://ntfy.sh` or self-host your own instance.
**Setup (public server):**
1. Go to **Configuration > Notifications**
2. Enable notifications and select **Ntfy.sh** as the provider
3. Set the **Topic** to a unique name (e.g., `my-gitea-mirror-abc123`)
4. Leave the Server URL as `https://ntfy.sh`
5. Subscribe to the same topic on your phone or desktop using the [ntfy app](https://ntfy.sh/docs/subscribe/phone/)
**Setup (self-hosted):**
1. Deploy ntfy using Docker: `docker run -p 8080:80 binwiederhier/ntfy serve`
2. Set the **Server URL** to your instance (e.g., `http://ntfy:8080`)
3. If authentication is enabled, provide an **Access token**
4. Set your **Topic** name
**Priority levels:**
- `min` / `low` / `default` / `high` / `urgent`
- Error notifications automatically use `high` priority regardless of the default setting
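Ntfy accepts the priority via the `X-Priority` request header; the error override can be sketched as (hypothetical helper, not the app's code):

```typescript
// Hypothetical: error notifications are always sent at "high" priority,
// overriding the configured default (matches the behavior described above).
type NtfyPriority = "min" | "low" | "default" | "high" | "urgent";

function ntfyHeaders(configured: NtfyPriority, isError: boolean): Record<string, string> {
  return { "X-Priority": isError ? "high" : configured };
}
```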
### 2. Apprise API (Aggregator)
[Apprise](https://github.com/caronc/apprise-api) is a notification aggregator that supports 100+ services (Slack, Discord, Telegram, Email, Pushover, and many more) through a single API.
**Setup:**
1. Deploy the Apprise API server:
```yaml
# docker-compose.yml
services:
apprise:
image: caronc/apprise:latest
ports:
- "8000:8000"
volumes:
- apprise-config:/config
volumes:
apprise-config:
```
2. Configure your notification services in Apprise (via its web UI at `http://localhost:8000` or API)
3. Create a configuration token/key in Apprise
4. In Gitea Mirror, go to **Configuration > Notifications**
5. Enable notifications and select **Apprise API**
6. Set the **Server URL** to your Apprise instance (e.g., `http://apprise:8000`)
7. Enter the **Token/path** you created in step 3
**Tag filtering:**
- Optionally set a **Tag** to only notify specific Apprise services
- Leave empty to notify all configured services
## Event Types
| Event | Default | Description |
|-------|---------|-------------|
| Sync errors | On | A mirror job failed |
| Sync success | Off | A mirror job completed successfully |
| New repo discovered | Off | A new GitHub repo was auto-imported during scheduled sync |
## Testing
Use the **Send Test Notification** button on the Notifications settings page to verify your configuration. The test sends a sample success notification to your configured provider.
## Troubleshooting
**Notifications not arriving:**
- Check that notifications are enabled in the settings
- Verify the provider configuration (URL, topic/token)
- Use the Test button to check connectivity
- Check the server logs for `[NotificationService]` messages
**Ntfy authentication errors:**
- Ensure your access token is correct
- If self-hosting, verify the ntfy server allows the topic
**Apprise connection refused:**
- Verify the Apprise API server is running and accessible from the Gitea Mirror container
- If using Docker, ensure both containers are on the same network
- Check the Apprise server logs for errors
**Tokens and security:**
- Notification tokens (ntfy access tokens, Apprise tokens) are encrypted at rest using the same AES-256-GCM encryption as GitHub/Gitea tokens
- Tokens are decrypted only when sending notifications or displaying in the settings UI


@@ -7,6 +7,8 @@ This folder contains engineering and operations references for the open-source G
### Core workflow
- **[DEVELOPMENT_WORKFLOW.md](./DEVELOPMENT_WORKFLOW.md)** Set up a local environment, run scripts, and understand the repo layout (app + marketing site).
- **[ENVIRONMENT_VARIABLES.md](./ENVIRONMENT_VARIABLES.md)** Complete reference for every configuration flag supported by the app and Docker images.
- **[NIX_DEPLOYMENT.md](./NIX_DEPLOYMENT.md)** User-facing deployment guide for Nix and NixOS.
- **[NIX_DISTRIBUTION.md](./NIX_DISTRIBUTION.md)** Maintainer notes for packaging, releases, and distribution strategy.
### Reliability & recovery
- **[GRACEFUL_SHUTDOWN.md](./GRACEFUL_SHUTDOWN.md)** How signal handling, shutdown coordination, and job persistence work in v3.
@@ -32,8 +34,6 @@ The first user you create locally becomes the administrator. All other configura
## Contributing & support
- 🎯 Contribution guide: [../CONTRIBUTING.md](../CONTRIBUTING.md)
- 📘 Code of conduct: [../CODE_OF_CONDUCT.md](../CODE_OF_CONDUCT.md)
- 🐞 Issues & feature requests: <https://github.com/RayLabsHQ/gitea-mirror/issues>
- 💬 Discussions: <https://github.com/RayLabsHQ/gitea-mirror/discussions>
Security disclosures should follow the process in [../SECURITY.md](../SECURITY.md).
- 🔐 Security policy & advisories: <https://github.com/RayLabsHQ/gitea-mirror/security>

Binary file not shown (new image, 22 KiB).

Binary file not shown (new image, 22 KiB).

Binary file not shown (new image, 30 KiB).

Binary file not shown (new image, 144 KiB).


@@ -0,0 +1,149 @@
CREATE TABLE `__new_repositories` (
`id` text PRIMARY KEY NOT NULL,
`user_id` text NOT NULL,
`config_id` text NOT NULL,
`name` text NOT NULL,
`full_name` text NOT NULL,
`normalized_full_name` text NOT NULL,
`url` text NOT NULL,
`clone_url` text NOT NULL,
`owner` text NOT NULL,
`organization` text,
`mirrored_location` text DEFAULT '',
`is_private` integer DEFAULT false NOT NULL,
`is_fork` integer DEFAULT false NOT NULL,
`forked_from` text,
`has_issues` integer DEFAULT false NOT NULL,
`is_starred` integer DEFAULT false NOT NULL,
`is_archived` integer DEFAULT false NOT NULL,
`size` integer DEFAULT 0 NOT NULL,
`has_lfs` integer DEFAULT false NOT NULL,
`has_submodules` integer DEFAULT false NOT NULL,
`language` text,
`description` text,
`default_branch` text NOT NULL,
`visibility` text DEFAULT 'public' NOT NULL,
`status` text DEFAULT 'imported' NOT NULL,
`last_mirrored` integer,
`error_message` text,
`destination_org` text,
`metadata` text,
`imported_at` integer DEFAULT (unixepoch()) NOT NULL,
`created_at` integer DEFAULT (unixepoch()) NOT NULL,
`updated_at` integer DEFAULT (unixepoch()) NOT NULL,
FOREIGN KEY (`user_id`) REFERENCES `users`(`id`) ON UPDATE no action ON DELETE no action,
FOREIGN KEY (`config_id`) REFERENCES `configs`(`id`) ON UPDATE no action ON DELETE no action
);
--> statement-breakpoint
INSERT INTO `__new_repositories` (
`id`,
`user_id`,
`config_id`,
`name`,
`full_name`,
`normalized_full_name`,
`url`,
`clone_url`,
`owner`,
`organization`,
`mirrored_location`,
`is_private`,
`is_fork`,
`forked_from`,
`has_issues`,
`is_starred`,
`is_archived`,
`size`,
`has_lfs`,
`has_submodules`,
`language`,
`description`,
`default_branch`,
`visibility`,
`status`,
`last_mirrored`,
`error_message`,
`destination_org`,
`metadata`,
`imported_at`,
`created_at`,
`updated_at`
)
SELECT
`repositories`.`id`,
`repositories`.`user_id`,
`repositories`.`config_id`,
`repositories`.`name`,
`repositories`.`full_name`,
`repositories`.`normalized_full_name`,
`repositories`.`url`,
`repositories`.`clone_url`,
`repositories`.`owner`,
`repositories`.`organization`,
`repositories`.`mirrored_location`,
`repositories`.`is_private`,
`repositories`.`is_fork`,
`repositories`.`forked_from`,
`repositories`.`has_issues`,
`repositories`.`is_starred`,
`repositories`.`is_archived`,
`repositories`.`size`,
`repositories`.`has_lfs`,
`repositories`.`has_submodules`,
`repositories`.`language`,
`repositories`.`description`,
`repositories`.`default_branch`,
`repositories`.`visibility`,
`repositories`.`status`,
`repositories`.`last_mirrored`,
`repositories`.`error_message`,
`repositories`.`destination_org`,
`repositories`.`metadata`,
COALESCE(
(
SELECT MIN(`mj`.`timestamp`)
FROM `mirror_jobs` `mj`
WHERE `mj`.`user_id` = `repositories`.`user_id`
AND `mj`.`status` = 'imported'
AND (
(`mj`.`repository_id` IS NOT NULL AND `mj`.`repository_id` = `repositories`.`id`)
OR (
`mj`.`repository_id` IS NULL
AND `mj`.`repository_name` IS NOT NULL
AND (
lower(trim(`mj`.`repository_name`)) = `repositories`.`normalized_full_name`
OR lower(trim(`mj`.`repository_name`)) = lower(trim(`repositories`.`name`))
)
)
)
),
`repositories`.`created_at`,
unixepoch()
) AS `imported_at`,
`repositories`.`created_at`,
`repositories`.`updated_at`
FROM `repositories`;
--> statement-breakpoint
DROP TABLE `repositories`;
--> statement-breakpoint
ALTER TABLE `__new_repositories` RENAME TO `repositories`;
--> statement-breakpoint
CREATE INDEX `idx_repositories_user_id` ON `repositories` (`user_id`);
--> statement-breakpoint
CREATE INDEX `idx_repositories_config_id` ON `repositories` (`config_id`);
--> statement-breakpoint
CREATE INDEX `idx_repositories_status` ON `repositories` (`status`);
--> statement-breakpoint
CREATE INDEX `idx_repositories_owner` ON `repositories` (`owner`);
--> statement-breakpoint
CREATE INDEX `idx_repositories_organization` ON `repositories` (`organization`);
--> statement-breakpoint
CREATE INDEX `idx_repositories_is_fork` ON `repositories` (`is_fork`);
--> statement-breakpoint
CREATE INDEX `idx_repositories_is_starred` ON `repositories` (`is_starred`);
--> statement-breakpoint
CREATE INDEX `idx_repositories_user_imported_at` ON `repositories` (`user_id`,`imported_at`);
--> statement-breakpoint
CREATE UNIQUE INDEX `uniq_repositories_user_full_name` ON `repositories` (`user_id`,`full_name`);
--> statement-breakpoint
CREATE UNIQUE INDEX `uniq_repositories_user_normalized_full_name` ON `repositories` (`user_id`,`normalized_full_name`);


@@ -0,0 +1,9 @@
-- Add index for mirroredLocation lookups (used by name collision detection)
CREATE INDEX IF NOT EXISTS `idx_repositories_mirrored_location` ON `repositories` (`user_id`, `mirrored_location`);
-- Add unique partial index to enforce that no two repos for the same user
-- can claim the same non-empty mirroredLocation. This prevents race conditions
-- during concurrent batch mirroring of starred repos with duplicate names.
CREATE UNIQUE INDEX IF NOT EXISTS `uniq_repositories_user_mirrored_location`
ON `repositories` (`user_id`, `mirrored_location`)
WHERE `mirrored_location` != '';


@@ -0,0 +1 @@
ALTER TABLE `configs` ADD `notification_config` text DEFAULT '{"enabled":false,"provider":"ntfy","notifyOnSyncError":true,"notifyOnSyncSuccess":false,"notifyOnNewRepo":false}' NOT NULL;

File diff suppressed because it is too large.

File diff suppressed because it is too large.


@@ -64,6 +64,27 @@
"when": 1761802056073,
"tag": "0008_serious_thena",
"breakpoints": true
},
{
"idx": 9,
"version": "6",
"when": 1773542995732,
"tag": "0009_nervous_tyger_tiger",
"breakpoints": true
},
{
"idx": 10,
"version": "6",
"when": 1774054800000,
"tag": "0010_mirrored_location_index",
"breakpoints": true
},
{
"idx": 11,
"version": "6",
"when": 1774058400000,
"tag": "0011_notification_config",
"breakpoints": true
}
]
}
}

flake.lock (generated)

@@ -1,8 +1,50 @@
{
"nodes": {
"bun2nix": {
"inputs": {
"flake-parts": "flake-parts",
"import-tree": "import-tree",
"nixpkgs": [
"nixpkgs"
],
"systems": "systems",
"treefmt-nix": "treefmt-nix"
},
"locked": {
"lastModified": 1770895533,
"narHash": "sha256-v3QaK9ugy9bN9RXDnjw0i2OifKmz2NnKM82agtqm/UY=",
"owner": "nix-community",
"repo": "bun2nix",
"rev": "c843f477b15f51151f8c6bcc886954699440a6e1",
"type": "github"
},
"original": {
"owner": "nix-community",
"repo": "bun2nix",
"type": "github"
}
},
"flake-parts": {
"inputs": {
"nixpkgs-lib": "nixpkgs-lib"
},
"locked": {
"lastModified": 1769996383,
"narHash": "sha256-AnYjnFWgS49RlqX7LrC4uA+sCCDBj0Ry/WOJ5XWAsa0=",
"owner": "hercules-ci",
"repo": "flake-parts",
"rev": "57928607ea566b5db3ad13af0e57e921e6b12381",
"type": "github"
},
"original": {
"owner": "hercules-ci",
"repo": "flake-parts",
"type": "github"
}
},
"flake-utils": {
"inputs": {
"systems": "systems"
"systems": "systems_2"
},
"locked": {
"lastModified": 1731533236,
@@ -18,6 +60,21 @@
"type": "github"
}
},
"import-tree": {
"locked": {
"lastModified": 1763762820,
"narHash": "sha256-ZvYKbFib3AEwiNMLsejb/CWs/OL/srFQ8AogkebEPF0=",
"owner": "vic",
"repo": "import-tree",
"rev": "3c23749d8013ec6daa1d7255057590e9ca726646",
"type": "github"
},
"original": {
"owner": "vic",
"repo": "import-tree",
"type": "github"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1761672384,
@@ -34,8 +91,24 @@
"type": "github"
}
},
"nixpkgs-lib": {
"locked": {
"lastModified": 1769909678,
"narHash": "sha256-cBEymOf4/o3FD5AZnzC3J9hLbiZ+QDT/KDuyHXVJOpM=",
"owner": "nix-community",
"repo": "nixpkgs.lib",
"rev": "72716169fe93074c333e8d0173151350670b824c",
"type": "github"
},
"original": {
"owner": "nix-community",
"repo": "nixpkgs.lib",
"type": "github"
}
},
"root": {
"inputs": {
"bun2nix": "bun2nix",
"flake-utils": "flake-utils",
"nixpkgs": "nixpkgs"
}
@@ -54,6 +127,42 @@
"repo": "default",
"type": "github"
}
},
"systems_2": {
"locked": {
"lastModified": 1681028828,
"narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=",
"owner": "nix-systems",
"repo": "default",
"rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e",
"type": "github"
},
"original": {
"owner": "nix-systems",
"repo": "default",
"type": "github"
}
},
"treefmt-nix": {
"inputs": {
"nixpkgs": [
"bun2nix",
"nixpkgs"
]
},
"locked": {
"lastModified": 1770228511,
"narHash": "sha256-wQ6NJSuFqAEmIg2VMnLdCnUc0b7vslUohqqGGD+Fyxk=",
"owner": "numtide",
"repo": "treefmt-nix",
"rev": "337a4fe074be1042a35086f15481d763b8ddc0e7",
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "treefmt-nix",
"type": "github"
}
}
},
"root": "root",

flake.nix

@@ -1,25 +1,43 @@
{
description = "Gitea Mirror - Self-hosted GitHub to Gitea mirroring service";
nixConfig = {
extra-substituters = [
"https://nix-community.cachix.org"
];
extra-trusted-public-keys = [
"nix-community.cachix.org-1:mB9FSh9qf2dCimDSUo8Zy7bkq5CX+/rkCWyvRCYg3Fs="
];
};
inputs = {
nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
flake-utils.url = "github:numtide/flake-utils";
bun2nix = {
url = "github:nix-community/bun2nix";
inputs.nixpkgs.follows = "nixpkgs";
};
};
outputs = { self, nixpkgs, flake-utils }:
flake-utils.lib.eachDefaultSystem (system:
outputs = { self, nixpkgs, flake-utils, bun2nix }:
let
forEachSystem = flake-utils.lib.eachDefaultSystem;
in
(forEachSystem (system:
let
pkgs = nixpkgs.legacyPackages.${system};
b2n = bun2nix.packages.${system}.default;
# Build the application
gitea-mirror = pkgs.stdenv.mkDerivation {
pname = "gitea-mirror";
version = "3.8.11";
version = "3.14.1";
src = ./.;
nativeBuildInputs = with pkgs; [
bun
nativeBuildInputs = [
pkgs.bun
b2n.hook
];
buildInputs = with pkgs; [
@@ -27,21 +45,54 @@
openssl
];
configurePhase = ''
export HOME=$TMPDIR
export BUN_INSTALL=$TMPDIR/.bun
export PATH=$BUN_INSTALL/bin:$PATH
'';
bunDeps = b2n.fetchBunDeps {
bunNix = ./bun.nix;
};
# bun2nix defaults to isolated installs on Linux, which can be
# very slow in CI for larger dependency trees and may appear stuck.
# Use hoisted linker and fail fast on lockfile drift.
bunInstallFlags = if pkgs.stdenv.hostPlatform.isDarwin then [
"--linker=hoisted"
"--backend=copyfile"
"--frozen-lockfile"
"--no-progress"
] else [
"--linker=hoisted"
"--frozen-lockfile"
"--no-progress"
];
# Let the bun2nix hook handle dependency installation via the
# pre-fetched cache, but skip its default build/check/install
# phases since we have custom ones.
dontUseBunBuild = true;
dontUseBunCheck = true;
dontUseBunInstall = true;
buildPhase = ''
# Install dependencies
bun install --frozen-lockfile --no-progress
runHook preBuild
export HOME=$TMPDIR
# Build the application
# The bun2nix cache is in the read-only Nix store, but bunx/astro
# may try to write to it at build time. Copy the cache to a
# writable location.
if [ -n "$BUN_INSTALL_CACHE_DIR" ] && [ -d "$BUN_INSTALL_CACHE_DIR" ]; then
WRITABLE_CACHE="$TMPDIR/bun-cache"
cp -rL "$BUN_INSTALL_CACHE_DIR" "$WRITABLE_CACHE" 2>/dev/null || true
chmod -R u+w "$WRITABLE_CACHE" 2>/dev/null || true
export BUN_INSTALL_CACHE_DIR="$WRITABLE_CACHE"
fi
# Build the Astro application
bun run build
runHook postBuild
'';
installPhase = ''
runHook preInstall
mkdir -p $out/lib/gitea-mirror
mkdir -p $out/bin
@@ -49,11 +100,14 @@
cp -r dist $out/lib/gitea-mirror/
cp -r node_modules $out/lib/gitea-mirror/
cp -r scripts $out/lib/gitea-mirror/
cp -r src $out/lib/gitea-mirror/
cp -r drizzle $out/lib/gitea-mirror/
cp package.json $out/lib/gitea-mirror/
cp tsconfig.json $out/lib/gitea-mirror/
# Create entrypoint script that matches Docker behavior
cat > $out/bin/gitea-mirror <<'EOF'
#!/usr/bin/env bash
#!${pkgs.bash}/bin/bash
set -e
# === DEFAULT CONFIGURATION ===
@@ -75,7 +129,19 @@ export MIRROR_PULL_REQUEST_CONCURRENCY=''${MIRROR_PULL_REQUEST_CONCURRENCY:-5}
# Create data directory
mkdir -p "$DATA_DIR"
cd $out/lib/gitea-mirror
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
APP_DIR="$SCRIPT_DIR/../lib/gitea-mirror"
# The app uses process.cwd()/data for the database, but the Nix store
# is read-only. Create a writable working directory with symlinks to
# the app files and a real data directory.
WORK_DIR="$DATA_DIR/.workdir"
mkdir -p "$WORK_DIR"
for item in dist node_modules scripts src drizzle package.json tsconfig.json; do
ln -sfn "$APP_DIR/$item" "$WORK_DIR/$item"
done
ln -sfn "$DATA_DIR" "$WORK_DIR/data"
cd "$WORK_DIR"
# === AUTO-GENERATE SECRETS ===
BETTER_AUTH_SECRET_FILE="$DATA_DIR/.better_auth_secret"
@@ -112,7 +178,7 @@ if [ -z "$ENCRYPTION_SECRET" ]; then
fi
# === DATABASE INITIALIZATION ===
DB_PATH=$(echo "$DATABASE_URL" | sed 's|^file:||')
DB_PATH=$(echo "$DATABASE_URL" | ${pkgs.gnused}/bin/sed 's|^file:||')
if [ ! -f "$DB_PATH" ]; then
echo "Database not found. It will be created and initialized via Drizzle migrations on first app startup..."
touch "$DB_PATH"
@@ -123,25 +189,25 @@ fi
# === STARTUP SCRIPTS ===
# Initialize configuration from environment variables
echo "Checking for environment configuration..."
if [ -f "dist/scripts/startup-env-config.js" ]; then
if [ -f "scripts/startup-env-config.ts" ]; then
echo "Loading configuration from environment variables..."
${pkgs.bun}/bin/bun dist/scripts/startup-env-config.js && \
${pkgs.bun}/bin/bun scripts/startup-env-config.ts && \
echo " Environment configuration loaded successfully" || \
echo " Environment configuration loading completed with warnings"
fi
# Run startup recovery
echo "Running startup recovery..."
if [ -f "dist/scripts/startup-recovery.js" ]; then
${pkgs.bun}/bin/bun dist/scripts/startup-recovery.js --timeout=30000 && \
if [ -f "scripts/startup-recovery.ts" ]; then
${pkgs.bun}/bin/bun scripts/startup-recovery.ts --timeout=30000 && \
echo " Startup recovery completed successfully" || \
echo " Startup recovery completed with warnings"
fi
# Run repository status repair
echo "Running repository status repair..."
if [ -f "dist/scripts/repair-mirrored-repos.js" ]; then
${pkgs.bun}/bin/bun dist/scripts/repair-mirrored-repos.js --startup && \
if [ -f "scripts/repair-mirrored-repos.ts" ]; then
${pkgs.bun}/bin/bun scripts/repair-mirrored-repos.ts --startup && \
echo " Repository status repair completed successfully" || \
echo " Repository status repair completed with warnings"
fi
@@ -170,13 +236,16 @@ EOF
# Create database management helper
cat > $out/bin/gitea-mirror-db <<'EOF'
#!/usr/bin/env bash
#!${pkgs.bash}/bin/bash
export DATA_DIR=''${DATA_DIR:-"$HOME/.local/share/gitea-mirror"}
mkdir -p "$DATA_DIR"
cd $out/lib/gitea-mirror
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
cd "$SCRIPT_DIR/../lib/gitea-mirror"
exec ${pkgs.bun}/bin/bun scripts/manage-db.ts "$@"
EOF
chmod +x $out/bin/gitea-mirror-db
runHook postInstall
'';
meta = with pkgs.lib; {
@@ -201,6 +270,7 @@ EOF
bun
sqlite
openssl
b2n
];
shellHook = ''
@@ -211,182 +281,185 @@ EOF
echo " bun run dev # Start development server"
echo " bun run build # Build for production"
echo ""
echo "Nix packaging:"
echo " bun2nix -o bun.nix # Regenerate bun.nix after dependency changes"
echo " nix build # Build the package"
echo ""
echo "Database:"
echo " bun run manage-db init # Initialize database"
echo " bun run db:studio # Open Drizzle Studio"
'';
};
# NixOS module
nixosModules.default = { config, lib, pkgs, ... }:
with lib;
let
cfg = config.services.gitea-mirror;
in {
options.services.gitea-mirror = {
enable = mkEnableOption "Gitea Mirror service";
}
)) // {
nixosModules.default = { config, lib, pkgs, ... }:
with lib;
let
cfg = config.services.gitea-mirror;
in {
options.services.gitea-mirror = {
enable = mkEnableOption "Gitea Mirror service";
package = mkOption {
type = types.package;
default = self.packages.${system}.default;
description = "The Gitea Mirror package to use";
};
dataDir = mkOption {
type = types.path;
default = "/var/lib/gitea-mirror";
description = "Directory to store data and database";
};
user = mkOption {
type = types.str;
default = "gitea-mirror";
description = "User account under which Gitea Mirror runs";
};
group = mkOption {
type = types.str;
default = "gitea-mirror";
description = "Group under which Gitea Mirror runs";
};
host = mkOption {
type = types.str;
default = "0.0.0.0";
description = "Host to bind to";
};
port = mkOption {
type = types.port;
default = 4321;
description = "Port to listen on";
};
betterAuthUrl = mkOption {
type = types.str;
default = "http://localhost:4321";
description = "Better Auth URL (external URL of the service)";
};
betterAuthTrustedOrigins = mkOption {
type = types.str;
default = "http://localhost:4321";
description = "Comma-separated list of trusted origins for Better Auth";
};
mirrorIssueConcurrency = mkOption {
type = types.int;
default = 3;
description = "Number of concurrent issue mirror operations (set to 1 for perfect ordering)";
};
mirrorPullRequestConcurrency = mkOption {
type = types.int;
default = 5;
description = "Number of concurrent PR mirror operations (set to 1 for perfect ordering)";
};
environmentFile = mkOption {
type = types.nullOr types.path;
default = null;
description = ''
Path to file containing environment variables.
Only needed if you want to set BETTER_AUTH_SECRET or ENCRYPTION_SECRET manually.
Otherwise, secrets will be auto-generated and stored in the data directory.
Example:
BETTER_AUTH_SECRET=your-32-character-secret-here
ENCRYPTION_SECRET=your-encryption-secret-here
'';
};
openFirewall = mkOption {
type = types.bool;
default = false;
description = "Open the firewall for the specified port";
};
package = mkOption {
type = types.package;
default = self.packages.${pkgs.system}.default;
description = "The Gitea Mirror package to use";
};
dataDir = mkOption {
type = types.path;
default = "/var/lib/gitea-mirror";
description = "Directory to store data and database";
};
user = mkOption {
type = types.str;
default = "gitea-mirror";
description = "User account under which Gitea Mirror runs";
};
group = mkOption {
type = types.str;
default = "gitea-mirror";
description = "Group under which Gitea Mirror runs";
};
host = mkOption {
type = types.str;
default = "0.0.0.0";
description = "Host to bind to";
};
port = mkOption {
type = types.port;
default = 4321;
description = "Port to listen on";
};
};
}
) // {
config = mkIf cfg.enable {
users.users.${cfg.user} = {
isSystemUser = true;
group = cfg.group;
home = cfg.dataDir;
createHome = true;
};
users.groups.${cfg.group} = {};
systemd.services.gitea-mirror = {
description = "Gitea Mirror - GitHub to Gitea mirroring service";
after = [ "network.target" ];
wantedBy = [ "multi-user.target" ];
environment = {
DATA_DIR = cfg.dataDir;
DATABASE_URL = "file:${cfg.dataDir}/gitea-mirror.db";
HOST = cfg.host;
PORT = toString cfg.port;
NODE_ENV = "production";
BETTER_AUTH_URL = cfg.betterAuthUrl;
BETTER_AUTH_TRUSTED_ORIGINS = cfg.betterAuthTrustedOrigins;
PUBLIC_BETTER_AUTH_URL = cfg.betterAuthUrl;
MIRROR_ISSUE_CONCURRENCY = toString cfg.mirrorIssueConcurrency;
MIRROR_PULL_REQUEST_CONCURRENCY = toString cfg.mirrorPullRequestConcurrency;
};
serviceConfig = {
Type = "simple";
User = cfg.user;
Group = cfg.group;
ExecStart = "${cfg.package}/bin/gitea-mirror";
Restart = "always";
RestartSec = "10s";
# Security hardening
NoNewPrivileges = true;
PrivateTmp = true;
ProtectSystem = "strict";
ProtectHome = true;
ReadWritePaths = [ cfg.dataDir ];
# Graceful shutdown
TimeoutStopSec = "30s";
KillMode = "mixed";
KillSignal = "SIGTERM";
} // optionalAttrs (cfg.environmentFile != null) {
EnvironmentFile = cfg.environmentFile;
};
};
# Health check timer (optional monitoring)
systemd.timers.gitea-mirror-healthcheck = {
description = "Gitea Mirror health check timer";
wantedBy = [ "timers.target" ];
timerConfig = {
OnBootSec = "5min";
OnUnitActiveSec = "5min";
};
};
systemd.services.gitea-mirror-healthcheck = {
description = "Gitea Mirror health check";
after = [ "gitea-mirror.service" ];
serviceConfig = {
Type = "oneshot";
ExecStart = "${pkgs.bash}/bin/bash -c '${pkgs.curl}/bin/curl -f http://127.0.0.1:${toString cfg.port}/api/health || true'";
User = "nobody";
};
};
networking.firewall = mkIf cfg.openFirewall {
allowedTCPPorts = [ cfg.port ];
};
};
};
# Overlay for adding to nixpkgs
overlays.default = final: prev: {
gitea-mirror = self.packages.${final.system}.default;

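For reference, a minimal sketch of how the module above could be enabled from a consuming NixOS configuration. This is illustrative only: it assumes the flake exposes the module as `nixosModules.default` and uses a made-up external URL; the option names match the module definition above.

```nix
{
  # Hypothetical consumer config; assumes the flake input is named "gitea-mirror"
  # and that the module is exported as nixosModules.default.
  imports = [ gitea-mirror.nixosModules.default ];

  services.gitea-mirror = {
    enable = true;
    # External URL of the service; with the new reverse-proxy support
    # this may include a path prefix.
    betterAuthUrl = "https://example.com/mirror";
    betterAuthTrustedOrigins = "https://example.com";
    openFirewall = true;
  };
}
```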
View File

@@ -1,7 +1,7 @@
{
"name": "gitea-mirror",
"type": "module",
"version": "3.9.4",
"version": "3.14.2",
"engines": {
"bun": ">=1.2.9"
},
@@ -34,20 +34,29 @@
"start": "bun dist/server/entry.mjs",
"start:fresh": "bun run cleanup-db && bun run manage-db init && bun dist/server/entry.mjs",
"test": "bun test",
"test:migrations": "bun scripts/validate-migrations.ts",
"test:watch": "bun test --watch",
"test:coverage": "bun test --coverage",
"test:e2e": "bash tests/e2e/run-e2e.sh",
"test:e2e:ci": "bash tests/e2e/run-e2e.sh --ci",
"test:e2e:keep": "bash tests/e2e/run-e2e.sh --keep",
"test:e2e:cleanup": "bash tests/e2e/cleanup.sh",
"astro": "bunx --bun astro"
},
"overrides": {
"@esbuild-kit/esm-loader": "npm:tsx@^4.21.0",
"devalue": "^5.5.0"
"devalue": "^5.6.4",
"fast-xml-parser": "^5.5.6",
"node-forge": "^1.3.3",
"svgo": "^4.0.1",
"rollup": ">=4.59.0"
},
"dependencies": {
"@astrojs/check": "^0.9.6",
"@astrojs/mdx": "4.3.13",
"@astrojs/node": "9.5.4",
"@astrojs/react": "^4.4.2",
"@better-auth/sso": "1.4.19",
"@astrojs/check": "^0.9.7",
"@astrojs/mdx": "5.0.0",
"@astrojs/node": "10.0.1",
"@astrojs/react": "^5.0.0",
"@better-auth/sso": "1.5.5",
"@octokit/plugin-throttling": "^11.0.3",
"@octokit/rest": "^22.0.1",
"@radix-ui/react-accordion": "^1.2.12",
@@ -69,14 +78,15 @@
"@radix-ui/react-tabs": "^1.1.13",
"@radix-ui/react-tooltip": "^1.2.8",
"@tailwindcss/vite": "^4.2.1",
"@tanstack/react-table": "^8.21.3",
"@tanstack/react-virtual": "^3.13.19",
"@types/canvas-confetti": "^1.9.0",
"@types/react": "^19.2.14",
"@types/react-dom": "^19.2.3",
"astro": "^5.17.3",
"astro": "^6.0.4",
"bcryptjs": "^3.0.3",
"better-auth": "1.5.5",
"buffer": "^6.0.3",
"better-auth": "1.4.19",
"canvas-confetti": "^1.9.4",
"class-variance-authority": "^0.7.1",
"clsx": "^2.1.1",
@@ -85,7 +95,8 @@
"drizzle-orm": "^0.45.1",
"fuse.js": "^7.1.0",
"jsonwebtoken": "^9.0.3",
"lucide-react": "^0.575.0",
"lucide-react": "^0.577.0",
"nanoid": "^5.1.6",
"next-themes": "^0.4.6",
"react": "^19.2.4",
"react-dom": "^19.2.4",
@@ -100,17 +111,19 @@
"zod": "^4.3.6"
},
"devDependencies": {
"@playwright/test": "^1.58.2",
"@testing-library/jest-dom": "^6.9.1",
"@testing-library/react": "^16.3.2",
"@types/bcryptjs": "^3.0.0",
"@types/bun": "^1.3.9",
"@types/bun": "^1.3.10",
"@types/jsonwebtoken": "^9.0.10",
"@types/node": "^25.5.0",
"@types/uuid": "^11.0.0",
"@vitejs/plugin-react": "^5.1.4",
"@vitejs/plugin-react": "^6.0.1",
"drizzle-kit": "^0.31.9",
"jsdom": "^28.1.0",
"tsx": "^4.21.0",
"vitest": "^4.0.18"
"vitest": "^4.1.0"
},
"packageManager": "bun@1.3.3"
"packageManager": "bun@1.3.10"
}

View File

@@ -15,33 +15,40 @@ import { repoStatusEnum } from "@/types/Repository";
const isDryRun = process.argv.includes("--dry-run");
const specificRepo = process.argv.find(arg => arg.startsWith("--repo-name="))?.split("=")[1];
const isStartupMode = process.argv.includes("--startup");
+ const requestTimeoutMs = parsePositiveInteger(process.env.GITEA_REPAIR_REQUEST_TIMEOUT_MS, 15000);
+ const progressInterval = parsePositiveInteger(process.env.GITEA_REPAIR_PROGRESS_INTERVAL, 100);
- async function checkRepoInGitea(config: any, owner: string, repoName: string): Promise<boolean> {
- try {
- if (!config.giteaConfig?.url || !config.giteaConfig?.token) {
- return false;
- }
- const response = await fetch(
- `${config.giteaConfig.url}/api/v1/repos/${owner}/${repoName}`,
- {
- headers: {
- Authorization: `token ${config.giteaConfig.token}`,
- },
- }
- );
- return response.ok;
- } catch (error) {
- console.error(`Error checking repo ${owner}/${repoName} in Gitea:`, error);
- return false;
- }
- }
+ type GiteaLookupResult = {
+ exists: boolean;
+ details: any | null;
+ timedOut: boolean;
+ error: string | null;
+ };
+ function parsePositiveInteger(value: string | undefined, fallback: number): number {
+ const parsed = Number.parseInt(value ?? "", 10);
+ if (!Number.isFinite(parsed) || parsed <= 0) {
+ return fallback;
+ }
+ return parsed;
+ }
+ function isTimeoutError(error: unknown): boolean {
+ if (!(error instanceof Error)) {
+ return false;
+ }
+ return error.name === "TimeoutError" || error.name === "AbortError";
+ }
- async function getRepoDetailsFromGitea(config: any, owner: string, repoName: string): Promise<any> {
+ async function getRepoDetailsFromGitea(config: any, owner: string, repoName: string): Promise<GiteaLookupResult> {
try {
if (!config.giteaConfig?.url || !config.giteaConfig?.token) {
- return null;
+ return {
+ exists: false,
+ details: null,
+ timedOut: false,
+ error: "Missing Gitea URL or token in config",
+ };
}
const response = await fetch(
@@ -50,16 +57,41 @@ async function getRepoDetailsFromGitea(config: any, owner: string, repoName: str
headers: {
Authorization: `token ${config.giteaConfig.token}`,
},
+ signal: AbortSignal.timeout(requestTimeoutMs),
}
);
if (response.ok) {
- return await response.json();
+ return {
+ exists: true,
+ details: await response.json(),
+ timedOut: false,
+ error: null,
+ };
}
- return null;
+ if (response.status === 404) {
+ return {
+ exists: false,
+ details: null,
+ timedOut: false,
+ error: null,
+ };
+ }
+ return {
+ exists: false,
+ details: null,
+ timedOut: false,
+ error: `Gitea API returned HTTP ${response.status}`,
+ };
} catch (error) {
console.error(`Error getting repo details for ${owner}/${repoName}:`, error);
- return null;
+ return {
+ exists: false,
+ details: null,
+ timedOut: isTimeoutError(error),
+ error: error instanceof Error ? error.message : String(error),
+ };
}
}
@@ -99,6 +131,8 @@ async function repairMirroredRepositories() {
.from(repositories)
.where(whereConditions);
+ const totalRepos = repos.length;
if (repos.length === 0) {
if (!isStartupMode) {
console.log("✅ No repositories found that need repair");
@@ -109,13 +143,25 @@ async function repairMirroredRepositories() {
if (!isStartupMode) {
console.log(`📋 Found ${repos.length} repositories to check:`);
console.log("");
+ } else {
+ console.log(`Checking ${totalRepos} repositories for status inconsistencies...`);
+ console.log(`Request timeout: ${requestTimeoutMs}ms | Progress interval: every ${progressInterval} repositories`);
}
+ const startedAt = Date.now();
+ const configCache = new Map<string, any>();
let checkedCount = 0;
let repairedCount = 0;
let skippedCount = 0;
let errorCount = 0;
+ let timeoutCount = 0;
+ let giteaErrorCount = 0;
+ let giteaErrorSamples = 0;
+ let startupSkipWarningCount = 0;
for (const repo of repos) {
checkedCount++;
if (!isStartupMode) {
console.log(`🔍 Checking repository: ${repo.name}`);
console.log(` Current status: ${repo.status}`);
@@ -124,13 +170,29 @@ async function repairMirroredRepositories() {
try {
// Get user configuration
- const config = await db
- .select()
- .from(configs)
- .where(eq(configs.id, repo.configId))
- .limit(1);
- if (config.length === 0) {
+ const configKey = String(repo.configId);
+ let userConfig = configCache.get(configKey);
+ if (!userConfig) {
+ const config = await db
+ .select()
+ .from(configs)
+ .where(eq(configs.id, repo.configId))
+ .limit(1);
+ if (config.length === 0) {
+ if (!isStartupMode) {
+ console.log(`   ❌ No configuration found for repository`);
+ }
+ errorCount++;
+ continue;
+ }
+ userConfig = config[0];
+ configCache.set(configKey, userConfig);
+ }
+ if (!userConfig) {
if (!isStartupMode) {
console.log(`   ❌ No configuration found for repository`);
}
@@ -138,7 +200,6 @@ async function repairMirroredRepositories() {
continue;
}
- const userConfig = config[0];
const giteaUsername = userConfig.giteaConfig?.defaultOwner;
if (!giteaUsername) {
@@ -153,25 +214,59 @@ async function repairMirroredRepositories() {
let existsInGitea = false;
let actualOwner = giteaUsername;
let giteaRepoDetails = null;
+ let repoRequestTimedOut = false;
+ let repoRequestErrored = false;
// First check user location
- existsInGitea = await checkRepoInGitea(userConfig, giteaUsername, repo.name);
- if (existsInGitea) {
- giteaRepoDetails = await getRepoDetailsFromGitea(userConfig, giteaUsername, repo.name);
- }
+ const userLookup = await getRepoDetailsFromGitea(userConfig, giteaUsername, repo.name);
+ existsInGitea = userLookup.exists;
+ giteaRepoDetails = userLookup.details;
+ if (userLookup.timedOut) {
+ timeoutCount++;
+ repoRequestTimedOut = true;
+ } else if (userLookup.error) {
+ giteaErrorCount++;
+ repoRequestErrored = true;
+ if (!isStartupMode || giteaErrorSamples < 3) {
+ console.log(`   ⚠️ Gitea lookup issue for ${giteaUsername}/${repo.name}: ${userLookup.error}`);
+ giteaErrorSamples++;
+ }
+ }
// If not found in user location and repo has organization, check organization
if (!existsInGitea && repo.organization) {
- existsInGitea = await checkRepoInGitea(userConfig, repo.organization, repo.name);
+ const orgLookup = await getRepoDetailsFromGitea(userConfig, repo.organization, repo.name);
+ existsInGitea = orgLookup.exists;
if (existsInGitea) {
actualOwner = repo.organization;
- giteaRepoDetails = await getRepoDetailsFromGitea(userConfig, repo.organization, repo.name);
+ giteaRepoDetails = orgLookup.details;
}
+ if (orgLookup.timedOut) {
+ timeoutCount++;
+ repoRequestTimedOut = true;
+ } else if (orgLookup.error) {
+ giteaErrorCount++;
+ repoRequestErrored = true;
+ if (!isStartupMode || giteaErrorSamples < 3) {
+ console.log(`   ⚠️ Gitea lookup issue for ${repo.organization}/${repo.name}: ${orgLookup.error}`);
+ giteaErrorSamples++;
+ }
+ }
}
if (!existsInGitea) {
if (!isStartupMode) {
console.log(` ⏭️ Repository not found in Gitea - skipping`);
+ } else if (repoRequestTimedOut || repoRequestErrored) {
+ if (startupSkipWarningCount < 3) {
+ console.log(`   ⚠️ Skipping ${repo.name}; Gitea was slow/unreachable during lookup`);
+ startupSkipWarningCount++;
+ if (startupSkipWarningCount === 3) {
+ console.log(`   Additional slow/unreachable lookup warnings suppressed; progress logs will continue`);
+ }
+ }
}
skippedCount++;
continue;
@@ -241,22 +336,43 @@ async function repairMirroredRepositories() {
if (!isStartupMode) {
console.log("");
+ } else if (checkedCount % progressInterval === 0 || checkedCount === totalRepos) {
+ const elapsedSeconds = Math.floor((Date.now() - startedAt) / 1000);
+ console.log(
+ `Repair progress: ${checkedCount}/${totalRepos} checked | repaired=${repairedCount}, skipped=${skippedCount}, errors=${errorCount}, timeouts=${timeoutCount} | elapsed=${elapsedSeconds}s`
+ );
+ }
}
if (isStartupMode) {
// In startup mode, only log if there were repairs or errors
+ const elapsedSeconds = Math.floor((Date.now() - startedAt) / 1000);
+ console.log(
+ `Repository repair summary: checked=${checkedCount}, repaired=${repairedCount}, skipped=${skippedCount}, errors=${errorCount}, timeouts=${timeoutCount}, elapsed=${elapsedSeconds}s`
+ );
if (repairedCount > 0) {
console.log(`Repaired ${repairedCount} repository status inconsistencies`);
}
if (errorCount > 0) {
console.log(`Warning: ${errorCount} repositories had errors during repair`);
}
+ if (timeoutCount > 0) {
+ console.log(
+ `Warning: ${timeoutCount} Gitea API requests timed out. Increase GITEA_REPAIR_REQUEST_TIMEOUT_MS if your Gitea instance is under heavy load.`
+ );
+ }
+ if (giteaErrorCount > 0) {
+ console.log(`Warning: ${giteaErrorCount} Gitea API requests failed with non-timeout errors.`);
+ }
} else {
console.log("📊 Repair Summary:");
console.log(` Checked: ${checkedCount}`);
console.log(` Repaired: ${repairedCount}`);
console.log(` Skipped: ${skippedCount}`);
console.log(` Errors: ${errorCount}`);
+ console.log(`   Timeouts: ${timeoutCount}`);
+ if (giteaErrorCount > 0) {
+ console.log(`   Gitea API Errors: ${giteaErrorCount}`);
+ }
if (isDryRun && repairedCount > 0) {
console.log("");

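The two small helpers the diff above introduces are pure functions, so their behavior is easy to sanity-check in isolation. A self-contained sketch, with both helpers copied verbatim from the diff:

```typescript
// Copied from the repair script above: clamp env-provided values to positive integers,
// falling back when the value is missing, non-numeric, zero, or negative.
function parsePositiveInteger(value: string | undefined, fallback: number): number {
  const parsed = Number.parseInt(value ?? "", 10);
  if (!Number.isFinite(parsed) || parsed <= 0) {
    return fallback;
  }
  return parsed;
}

// Copied from the repair script above: AbortSignal.timeout() rejects with a
// TimeoutError (some runtimes surface AbortError), so both names are checked.
function isTimeoutError(error: unknown): boolean {
  if (!(error instanceof Error)) {
    return false;
  }
  return error.name === "TimeoutError" || error.name === "AbortError";
}

console.log(parsePositiveInteger("15000", 15000)); // 15000
console.log(parsePositiveInteger("-5", 15000));    // falls back to 15000
console.log(parsePositiveInteger(undefined, 100)); // falls back to 100

const timedOut = new Error("The operation timed out");
timedOut.name = "TimeoutError";
console.log(isTimeoutError(timedOut));           // true
console.log(isTimeoutError(new Error("boom")));  // false
```

This is why `GITEA_REPAIR_REQUEST_TIMEOUT_MS=0` or a typo in the variable silently falls back to the 15s default rather than disabling the timeout.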
View File

@@ -0,0 +1,265 @@
#!/usr/bin/env bun
import { Database } from "bun:sqlite";
import { readFileSync } from "fs";
import path from "path";
type JournalEntry = {
idx: number;
tag: string;
when: number;
breakpoints: boolean;
};
type Migration = {
entry: JournalEntry;
statements: string[];
};
type UpgradeFixture = {
seed: (db: Database) => void;
verify: (db: Database) => void;
};
type TableInfoRow = {
cid: number;
name: string;
type: string;
notnull: number;
dflt_value: string | null;
pk: number;
};
const migrationsFolder = path.join(process.cwd(), "drizzle");
const migrations = loadMigrations();
const latestMigration = migrations.at(-1);
/**
* Known SQLite limitations that Drizzle-kit's auto-generated migrations
* can violate. Each rule is checked against every SQL statement.
*/
const SQLITE_LINT_RULES: { pattern: RegExp; message: string }[] = [
{
pattern: /ALTER\s+TABLE\s+\S+\s+ADD\s+(?:COLUMN\s+)?\S+[^;]*DEFAULT\s*\(/i,
message:
"ALTER TABLE ADD COLUMN with an expression default (e.g. DEFAULT (unixepoch())) " +
"is not allowed in SQLite. Use the table-recreation pattern instead " +
"(CREATE new table, INSERT SELECT, DROP old, RENAME).",
},
{
pattern: /ALTER\s+TABLE\s+\S+\s+ADD\s+(?:COLUMN\s+)?\S+[^;]*DEFAULT\s+CURRENT_(TIME|DATE|TIMESTAMP)\b/i,
message:
"ALTER TABLE ADD COLUMN with DEFAULT CURRENT_TIME/CURRENT_DATE/CURRENT_TIMESTAMP " +
"is not allowed in SQLite. Use the table-recreation pattern instead.",
},
];
function loadMigrations(): Migration[] {
const journalPath = path.join(migrationsFolder, "meta", "_journal.json");
const journal = JSON.parse(readFileSync(journalPath, "utf8")) as {
entries: JournalEntry[];
};
return journal.entries.map((entry) => {
const migrationPath = path.join(migrationsFolder, `${entry.tag}.sql`);
const statements = readFileSync(migrationPath, "utf8")
.split("--> statement-breakpoint")
.map((statement) => statement.trim())
.filter(Boolean);
return { entry, statements };
});
}
function assert(condition: unknown, message: string): asserts condition {
if (!condition) {
throw new Error(message);
}
}
function runMigration(db: Database, migration: Migration) {
db.run("BEGIN");
try {
for (const statement of migration.statements) {
db.run(statement);
}
db.run("COMMIT");
} catch (error) {
try {
db.run("ROLLBACK");
} catch {
// Ignore rollback errors so the original failure is preserved.
}
throw error;
}
}
function runMigrations(db: Database, selectedMigrations: Migration[]) {
for (const migration of selectedMigrations) {
runMigration(db, migration);
}
}
function seedPre0009Database(db: Database) {
// Seed every existing table so ALTER TABLE paths run against non-empty data.
db.run("INSERT INTO users (id, email, username, name) VALUES ('u1', 'u1@example.com', 'user1', 'User One')");
db.run("INSERT INTO configs (id, user_id, name, github_config, gitea_config, schedule_config, cleanup_config) VALUES ('c1', 'u1', 'Default', '{}', '{}', '{}', '{}')");
db.run("INSERT INTO accounts (id, account_id, user_id, provider_id, access_token, refresh_token, id_token, access_token_expires_at, refresh_token_expires_at, scope) VALUES ('acct1', 'acct-1', 'u1', 'github', 'access-token', 'refresh-token', 'id-token', 2000, 3000, 'repo')");
db.run("INSERT INTO events (id, user_id, channel, payload) VALUES ('evt1', 'u1', 'sync', '{\"status\":\"queued\"}')");
db.run("INSERT INTO mirror_jobs (id, user_id, repository_id, repository_name, status, message, timestamp) VALUES ('job1', 'u1', 'r1', 'owner/repo', 'imported', 'Imported repository', 900)");
db.run("INSERT INTO organizations (id, user_id, config_id, name, avatar_url, public_repository_count, private_repository_count, fork_repository_count) VALUES ('org1', 'u1', 'c1', 'Example Org', 'https://example.com/org.png', 1, 0, 0)");
db.run("INSERT INTO repositories (id, user_id, config_id, name, full_name, normalized_full_name, url, clone_url, owner, organization, default_branch, created_at, updated_at, metadata) VALUES ('r1', 'u1', 'c1', 'repo', 'owner/repo', 'owner/repo', 'https://example.com/repo', 'https://example.com/repo.git', 'owner', 'Example Org', 'main', 1000, 1100, '{\"issues\":true}')");
db.run("INSERT INTO sessions (id, token, user_id, expires_at) VALUES ('sess1', 'session-token', 'u1', 4000)");
db.run("INSERT INTO verification_tokens (id, token, identifier, type, expires_at) VALUES ('vt1', 'verify-token', 'u1@example.com', 'email', 5000)");
db.run("INSERT INTO verifications (id, identifier, value, expires_at) VALUES ('ver1', 'u1@example.com', '123456', 6000)");
db.run("INSERT INTO oauth_applications (id, client_id, client_secret, name, redirect_urls, type, user_id) VALUES ('app1', 'client-1', 'secret-1', 'Example App', '[\"https://example.com/callback\"]', 'confidential', 'u1')");
db.run("INSERT INTO oauth_access_tokens (id, access_token, refresh_token, access_token_expires_at, refresh_token_expires_at, client_id, user_id, scopes) VALUES ('oat1', 'oauth-access-token', 'oauth-refresh-token', 7000, 8000, 'client-1', 'u1', '[\"repo\"]')");
db.run("INSERT INTO oauth_consent (id, user_id, client_id, scopes, consent_given) VALUES ('consent1', 'u1', 'client-1', '[\"repo\"]', true)");
db.run("INSERT INTO sso_providers (id, issuer, domain, oidc_config, user_id, provider_id) VALUES ('sso1', 'https://issuer.example.com', 'example.com', '{}', 'u1', 'provider-1')");
db.run("INSERT INTO rate_limits (id, user_id, provider, `limit`, remaining, used, reset, retry_after, status, last_checked) VALUES ('rl1', 'u1', 'github', 5000, 4999, 1, 9000, NULL, 'ok', 8500)");
}
function verify0009Migration(db: Database) {
const repositoryColumns = db.query("PRAGMA table_info(repositories)").all() as TableInfoRow[];
const importedAtColumn = repositoryColumns.find((column) => column.name === "imported_at");
assert(importedAtColumn, "Expected repositories.imported_at column to exist after migration");
assert(importedAtColumn.notnull === 1, "Expected repositories.imported_at to be NOT NULL");
assert(importedAtColumn.dflt_value === "unixepoch()", `Expected repositories.imported_at default to be unixepoch(), got ${importedAtColumn.dflt_value ?? "null"}`);
const existingRepo = db.query("SELECT imported_at FROM repositories WHERE id = 'r1'").get() as { imported_at: number } | null;
assert(existingRepo?.imported_at === 900, `Expected existing repository imported_at to backfill from mirror_jobs timestamp 900, got ${existingRepo?.imported_at ?? "null"}`);
db.run("INSERT INTO repositories (id, user_id, config_id, name, full_name, normalized_full_name, url, clone_url, owner, default_branch) VALUES ('r2', 'u1', 'c1', 'repo-two', 'owner/repo-two', 'owner/repo-two', 'https://example.com/repo-two', 'https://example.com/repo-two.git', 'owner', 'main')");
const newRepo = db.query("SELECT imported_at FROM repositories WHERE id = 'r2'").get() as { imported_at: number } | null;
assert(typeof newRepo?.imported_at === "number" && newRepo.imported_at > 0, "Expected new repository insert to receive imported_at from the column default");
const importedAtIndex = db
.query("SELECT name FROM sqlite_master WHERE type = 'index' AND tbl_name = 'repositories' AND name = 'idx_repositories_user_imported_at'")
.get() as { name: string } | null;
assert(importedAtIndex?.name === "idx_repositories_user_imported_at", "Expected repositories imported_at index to exist after migration");
}
function seedPre0010Database(db: any) {
// Seed a repo row to verify index creation doesn't break existing data
seedPre0009Database(db);
}
function verify0010Migration(db: any) {
const indexes = db.prepare(
"SELECT name FROM sqlite_master WHERE type='index' AND name='uniq_repositories_user_mirrored_location'"
).all();
if (indexes.length === 0) {
throw new Error("Missing unique partial index uniq_repositories_user_mirrored_location");
}
const lookupIdx = db.prepare(
"SELECT name FROM sqlite_master WHERE type='index' AND name='idx_repositories_mirrored_location'"
).all();
if (lookupIdx.length === 0) {
throw new Error("Missing lookup index idx_repositories_mirrored_location");
}
}
function seedPre0011Database(db: any) {
seedPre0009Database(db);
runMigration(db, migrations.find((m) => m.entry.tag === "0009_nervous_tyger_tiger")!);
runMigration(db, migrations.find((m) => m.entry.tag === "0010_mirrored_location_index")!);
}
function verify0011Migration(db: any) {
const configColumns = db.query("PRAGMA table_info(configs)").all() as TableInfoRow[];
const notificationConfigColumn = configColumns.find((column: any) => column.name === "notification_config");
assert(notificationConfigColumn, "Expected configs.notification_config column to exist after migration");
assert(notificationConfigColumn.notnull === 1, "Expected configs.notification_config to be NOT NULL");
assert(
notificationConfigColumn.dflt_value !== null,
"Expected configs.notification_config to have a default value",
);
const existingConfig = db.query("SELECT notification_config FROM configs WHERE id = 'c1'").get() as { notification_config: string } | null;
assert(existingConfig, "Expected existing config row to still exist");
const parsed = JSON.parse(existingConfig.notification_config);
assert(parsed.enabled === false, "Expected default notification_config.enabled to be false");
assert(parsed.provider === "ntfy", "Expected default notification_config.provider to be 'ntfy'");
}
const latestUpgradeFixtures: Record<string, UpgradeFixture> = {
"0009_nervous_tyger_tiger": {
seed: seedPre0009Database,
verify: verify0009Migration,
},
"0010_mirrored_location_index": {
seed: seedPre0010Database,
verify: verify0010Migration,
},
"0011_notification_config": {
seed: seedPre0011Database,
verify: verify0011Migration,
},
};
function lintMigrations(selectedMigrations: Migration[]) {
const violations: string[] = [];
for (const migration of selectedMigrations) {
for (const statement of migration.statements) {
for (const rule of SQLITE_LINT_RULES) {
if (rule.pattern.test(statement)) {
violations.push(`[${migration.entry.tag}] ${rule.message}\n Statement: ${statement.slice(0, 120)}...`);
}
}
}
}
assert(
violations.length === 0,
`SQLite lint found ${violations.length} violation(s):\n\n${violations.join("\n\n")}`,
);
}
function validateMigrations() {
assert(latestMigration, "No migrations found in drizzle/meta/_journal.json");
// Lint all migrations for known SQLite pitfalls before running anything.
lintMigrations(migrations);
const emptyDb = new Database(":memory:");
try {
runMigrations(emptyDb, migrations);
} finally {
emptyDb.close();
}
const upgradeFixture = latestUpgradeFixtures[latestMigration.entry.tag];
assert(
upgradeFixture,
`Missing upgrade fixture for latest migration ${latestMigration.entry.tag}. Add one in scripts/validate-migrations.ts.`,
);
const upgradeDb = new Database(":memory:");
try {
runMigrations(upgradeDb, migrations.slice(0, -1));
upgradeFixture.seed(upgradeDb);
runMigration(upgradeDb, latestMigration);
upgradeFixture.verify(upgradeDb);
} finally {
upgradeDb.close();
}
console.log(
`Validated ${migrations.length} migrations from scratch and upgrade path for ${latestMigration.entry.tag}.`,
);
}
try {
validateMigrations();
} catch (error) {
console.error("Migration validation failed:");
console.error(error instanceof Error ? error.stack ?? error.message : String(error));
process.exit(1);
}

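The lint rules in the validation script above are plain regular expressions, so their behavior can be checked directly. A minimal sketch using the first rule's pattern (copied from `SQLITE_LINT_RULES`); the SQL statements are illustrative examples, not taken from the migrations:

```typescript
// First lint rule from SQLITE_LINT_RULES above: flag ALTER TABLE ... ADD COLUMN
// with an expression default, which SQLite rejects at runtime.
const exprDefault =
  /ALTER\s+TABLE\s+\S+\s+ADD\s+(?:COLUMN\s+)?\S+[^;]*DEFAULT\s*\(/i;

// Disallowed: expression default on an added column (hypothetical statement).
const bad =
  "ALTER TABLE `repositories` ADD `imported_at` integer DEFAULT (unixepoch()) NOT NULL";
// Allowed: a constant default passes the same rule (hypothetical statement).
const ok =
  "ALTER TABLE `repositories` ADD `imported_at` integer DEFAULT 0 NOT NULL";

console.log(exprDefault.test(bad)); // true  -> would be reported as a violation
console.log(exprDefault.test(ok));  // false -> passes this rule
```

Catching this at test time matters because SQLite only rejects such statements when the migration actually runs, i.e. on a user's production database.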
View File

@@ -1,6 +1,7 @@
import { Button } from "@/components/ui/button";
import { Card, CardContent, CardHeader } from "@/components/ui/card";
import { Home, ArrowLeft, GitBranch, BookOpen, Settings, FileQuestion } from "lucide-react";
+ import { withBase } from "@/lib/base-path";
export function NotFound() {
return (
@@ -21,7 +22,7 @@ export function NotFound() {
{/* Action Buttons */}
<div className="flex flex-col gap-3">
<Button asChild className="w-full">
<a href="/">
<a href={withBase("/")}>
<Home className="mr-2 h-4 w-4" />
Go to Dashboard
</a>
@@ -45,21 +46,21 @@ export function NotFound() {
{/* Quick Links */}
<div className="grid grid-cols-3 gap-3">
<a
href="/repositories"
href={withBase("/repositories")}
className="flex flex-col items-center gap-2 p-3 rounded-md hover:bg-muted transition-colors"
>
<GitBranch className="h-5 w-5 text-muted-foreground" />
<span className="text-xs">Repositories</span>
</a>
<a
href="/config"
href={withBase("/config")}
className="flex flex-col items-center gap-2 p-3 rounded-md hover:bg-muted transition-colors"
>
<Settings className="h-5 w-5 text-muted-foreground" />
<span className="text-xs">Config</span>
</a>
<a
href="/docs"
href={withBase("/docs")}
className="flex flex-col items-center gap-2 p-3 rounded-md hover:bg-muted transition-colors"
>
<BookOpen className="h-5 w-5 text-muted-foreground" />
@@ -77,4 +78,4 @@ export function NotFound() {
</Card>
</div>
);
}
}

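The `withBase` helper that these components now import from `@/lib/base-path` is not shown in this compare view. One plausible shape, assuming it joins a build-time path prefix onto app-relative URLs (the `/mirror` prefix and this implementation are purely illustrative, not the actual code):

```typescript
// Hypothetical sketch only -- the real helper lives in src/lib/base-path and is
// not part of this diff. Assumes a configured prefix such as "/mirror" when the
// app is served behind a reverse proxy under a sub-path.
const BASE_PATH = "/mirror"; // illustrative stand-in for the configured prefix

function withBase(path: string): string {
  const base = BASE_PATH.replace(/\/+$/, ""); // drop trailing slashes
  if (base === "") return path;               // no prefix configured
  return path.startsWith("/") ? `${base}${path}` : `${base}/${path}`;
}

console.log(withBase("/"));             // "/mirror/"
console.log(withBase("/repositories")); // "/mirror/repositories"
```

Whatever the exact implementation, the pattern in the diffs is consistent: every hard-coded absolute `href`, `src`, redirect, and `fetch` URL is wrapped so the app works when hosted under a reverse-proxy path prefix.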
View File

@@ -36,6 +36,7 @@ import { toast } from 'sonner';
import { useLiveRefresh } from '@/hooks/useLiveRefresh';
import { useConfigStatus } from '@/hooks/useConfigStatus';
import { useNavigation } from '@/components/layout/MainLayout';
+ import { withBase } from '@/lib/base-path';
import {
Drawer,
DrawerClose,
@@ -321,7 +322,7 @@ export function ActivityLog() {
setIsInitialLoading(true);
setShowCleanupDialog(false);
- const response = await fetch('/api/activities/cleanup', {
+ const response = await fetch(withBase('/api/activities/cleanup'), {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ userId: user.id }),

View File

@@ -12,6 +12,7 @@ import { Separator } from '@/components/ui/separator';
import { toast, Toaster } from 'sonner';
import { showErrorToast } from '@/lib/utils';
import { Loader2, Mail, Globe, Eye, EyeOff } from 'lucide-react';
+ import { withBase } from '@/lib/base-path';
export function LoginForm() {
@@ -47,7 +48,7 @@ export function LoginForm() {
toast.success('Login successful!');
// Small delay before redirecting to see the success message
setTimeout(() => {
- window.location.href = '/';
+ window.location.href = withBase('/');
}, 1000);
} catch (error) {
showErrorToast(error, toast);
@@ -64,12 +65,15 @@ export function LoginForm() {
return;
}
- const baseURL = typeof window !== 'undefined' ? window.location.origin : 'http://localhost:4321';
+ const callbackURL =
+ typeof window !== 'undefined'
+ ? new URL(withBase('/'), window.location.origin).toString()
+ : `http://localhost:4321${withBase('/')}`;
await authClient.signIn.sso({
email: ssoEmail || undefined,
domain: domain,
providerId: providerId,
- callbackURL: `${baseURL}/`,
+ callbackURL,
scopes: ['openid', 'email', 'profile'], // TODO: This is not being respected by the SSO plugin.
});
} catch (error) {
@@ -85,7 +89,7 @@ export function LoginForm() {
<CardHeader className="text-center">
<div className="flex justify-center mb-4">
<img
src="/logo.png"
src={withBase('/logo.png')}
alt="Gitea Mirror Logo"
className="h-8 w-10"
/>

View File

@@ -7,6 +7,7 @@ import { toast, Toaster } from 'sonner';
import { showErrorToast } from '@/lib/utils';
import { useAuth } from '@/hooks/useAuth';
import { Eye, EyeOff } from 'lucide-react';
+ import { withBase } from '@/lib/base-path';
export function SignupForm() {
const [isLoading, setIsLoading] = useState(false);
@@ -42,7 +43,7 @@ export function SignupForm() {
toast.success('Account created successfully! Redirecting to dashboard...');
// Small delay before redirecting to see the success message
setTimeout(() => {
- window.location.href = '/';
+ window.location.href = withBase('/');
}, 1500);
} catch (error) {
showErrorToast(error, toast);
@@ -57,7 +58,7 @@ export function SignupForm() {
<CardHeader className="text-center">
<div className="flex justify-center mb-4">
<img
src="/logo.png"
src={withBase('/logo.png')}
alt="Gitea Mirror Logo"
className="h-8 w-10"
/>

View File

@@ -1,7 +1,8 @@
import { useEffect } from "react";
import { useEffect, useMemo } from "react";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Checkbox } from "@/components/ui/checkbox";
import { Label } from "@/components/ui/label";
import { Input } from "@/components/ui/input";
import {
Select,
SelectContent,
@@ -19,6 +20,7 @@ import {
Zap,
Info,
Archive,
Globe,
} from "lucide-react";
import {
Tooltip,
@@ -28,6 +30,10 @@ import {
} from "@/components/ui/tooltip";
import type { ScheduleConfig, DatabaseCleanupConfig } from "@/types/config";
import { formatDate } from "@/lib/utils";
import {
buildClockCronExpression,
getNextCronOccurrence,
} from "@/lib/utils/schedule-utils";
interface AutomationSettingsProps {
scheduleConfig: ScheduleConfig;
@@ -38,15 +44,13 @@ interface AutomationSettingsProps {
isAutoSavingCleanup?: boolean;
}
-const scheduleIntervals = [
-  { label: "Every hour", value: 3600 },
-  { label: "Every 2 hours", value: 7200 },
-  { label: "Every 4 hours", value: 14400 },
-  { label: "Every 8 hours", value: 28800 },
-  { label: "Every 12 hours", value: 43200 },
-  { label: "Daily", value: 86400 },
-  { label: "Every 2 days", value: 172800 },
-  { label: "Weekly", value: 604800 },
+const clockFrequencies = [
+  { label: "Every hour", value: 1 },
+  { label: "Every 2 hours", value: 2 },
+  { label: "Every 4 hours", value: 4 },
+  { label: "Every 8 hours", value: 8 },
+  { label: "Every 12 hours", value: 12 },
+  { label: "Daily", value: 24 },
];
const retentionPeriods = [
@@ -85,6 +89,27 @@ export function AutomationSettings({
isAutoSavingSchedule,
isAutoSavingCleanup,
}: AutomationSettingsProps) {
const browserTimezone =
typeof Intl !== "undefined"
? Intl.DateTimeFormat().resolvedOptions().timeZone || "UTC"
: "UTC";
// Use the saved timezone, falling back to the browser timezone when none is set
const effectiveTimezone = scheduleConfig.timezone || browserTimezone;
const nextScheduledRun = useMemo(() => {
if (!scheduleConfig.enabled) return null;
const startTime = scheduleConfig.startTime || "22:00";
const frequencyHours = scheduleConfig.clockFrequencyHours || 24;
const cronExpression = buildClockCronExpression(startTime, frequencyHours);
if (!cronExpression) return null;
try {
return getNextCronOccurrence(cronExpression, new Date(), effectiveTimezone);
} catch {
return null;
}
}, [scheduleConfig.enabled, scheduleConfig.startTime, scheduleConfig.clockFrequencyHours, effectiveTimezone]);
// Update nextRun for cleanup when settings change
useEffect(() => {
if (cleanupConfig.enabled && !cleanupConfig.nextRun) {
@@ -125,7 +150,7 @@ export function AutomationSettings({
<CardContent className="space-y-6">
<div className="grid grid-cols-1 md:grid-cols-2 gap-4">
{/* Automatic Syncing Section */}
-<div className="space-y-4 p-4 border border-border rounded-lg bg-card/50">
+<div className="flex flex-col gap-4 p-4 border border-border rounded-lg bg-card/50">
<div className="flex items-center justify-between">
<h3 className="text-sm font-medium flex items-center gap-2">
<RefreshCw className="h-4 w-4 text-primary" />
@@ -136,14 +161,21 @@ export function AutomationSettings({
)}
</div>
-<div className="space-y-4">
+<div className="flex-1 flex flex-col gap-4">
<div className="flex items-start space-x-3">
<Checkbox
id="enable-auto-mirror"
checked={scheduleConfig.enabled}
className="mt-1.25"
onCheckedChange={(checked) =>
-onScheduleChange({ ...scheduleConfig, enabled: !!checked })
+onScheduleChange({
+  ...scheduleConfig,
+  enabled: !!checked,
+  timezone: checked ? browserTimezone : scheduleConfig.timezone,
+  startTime: scheduleConfig.startTime || "22:00",
+  clockFrequencyHours: scheduleConfig.clockFrequencyHours || 24,
+  scheduleMode: "clock",
+})
}
/>
<div className="space-y-0.5 flex-1">
@@ -154,79 +186,123 @@ export function AutomationSettings({
Enable automatic repository syncing
</Label>
<p className="text-xs text-muted-foreground">
-Periodically check GitHub for changes and mirror them to Gitea
+Periodically sync GitHub changes to Gitea
</p>
</div>
</div>
{scheduleConfig.enabled && (
-<div className="ml-6 space-y-3">
-<div>
-<Label htmlFor="mirror-interval" className="text-sm">
-Sync frequency
-</Label>
-<Select
-value={scheduleConfig.interval.toString()}
-onValueChange={(value) =>
-onScheduleChange({
-...scheduleConfig,
-interval: parseInt(value, 10),
-})
-}
->
-<SelectTrigger id="mirror-interval" className="mt-1.5">
-<SelectValue />
-</SelectTrigger>
-<SelectContent>
-{scheduleIntervals.map((option) => (
-<SelectItem
-key={option.value}
-value={option.value.toString()}
->
-{option.label}
-</SelectItem>
-))}
-</SelectContent>
-</Select>
+<div className="space-y-3">
+<div className="flex flex-wrap items-center justify-between gap-2">
+<p className="text-[11px] font-medium uppercase tracking-wide text-muted-foreground">
+Schedule
+</p>
+<span className="inline-flex items-center gap-1.5 rounded-full border border-border/70 px-2.5 py-0.5 text-[11px] text-muted-foreground">
+<Globe className="h-3 w-3" />
+{effectiveTimezone}
+</span>
+</div>
+<div className="grid gap-3 sm:grid-cols-2">
+<div className="space-y-1.5">
+<Label
+htmlFor="clock-frequency"
+className="text-xs font-medium uppercase tracking-wide text-muted-foreground"
+>
+Frequency
+</Label>
+<Select
+value={String(scheduleConfig.clockFrequencyHours || 24)}
+onValueChange={(value) =>
+onScheduleChange({
+...scheduleConfig,
+scheduleMode: "clock",
+clockFrequencyHours: parseInt(value, 10),
+startTime: scheduleConfig.startTime || "22:00",
+timezone: effectiveTimezone,
+})
+}
+>
+<SelectTrigger id="clock-frequency" className="w-full">
+<SelectValue />
+</SelectTrigger>
+<SelectContent>
+{clockFrequencies.map((option) => (
+<SelectItem
+key={option.value}
+value={option.value.toString()}
+>
+{option.label}
+</SelectItem>
+))}
+</SelectContent>
+</Select>
+</div>
+<div className="space-y-1.5">
+<Label
+htmlFor="clock-start-time"
+className="text-xs font-medium uppercase tracking-wide text-muted-foreground"
+>
+Start time
+</Label>
+<div className="relative">
+<div className="text-muted-foreground pointer-events-none absolute inset-y-0 left-0 flex items-center justify-center pl-3">
+<Clock className="size-4" />
+</div>
+<Input
+id="clock-start-time"
+type="time"
+value={scheduleConfig.startTime || "22:00"}
+onChange={(event) =>
+onScheduleChange({
+...scheduleConfig,
+scheduleMode: "clock",
+startTime: event.target.value,
+clockFrequencyHours:
+scheduleConfig.clockFrequencyHours || 24,
+timezone: effectiveTimezone,
+})
+}
+className="appearance-none pl-9 dark:bg-input/30 [&::-webkit-calendar-picker-indicator]:hidden [&::-webkit-calendar-picker-indicator]:appearance-none"
+/>
+</div>
+</div>
</div>
</div>
)}
-<div className="space-y-2 p-3 bg-muted/30 dark:bg-muted/10 rounded-md border border-border/50">
-<div className="flex items-center justify-between text-xs">
-<span className="flex items-center gap-1.5">
-<Clock className="h-3.5 w-3.5" />
-Last sync
-</span>
-<span className="font-medium text-muted-foreground">
+<div className="mt-auto flex items-center justify-between border-t border-border/50 pt-3 text-xs text-muted-foreground">
+<span className="flex items-center gap-1.5">
+<Clock className="h-3.5 w-3.5" />
+Last sync{" "}
+<span className="font-medium">
{scheduleConfig.lastRun
? formatDate(scheduleConfig.lastRun)
: "Never"}
</span>
-</div>
+</span>
{scheduleConfig.enabled ? (
-scheduleConfig.nextRun && (
-<div className="flex items-center justify-between text-xs">
-<span className="flex items-center gap-1.5">
-<Calendar className="h-3.5 w-3.5" />
-Next sync
-</span>
-<span className="font-medium">
-{formatDate(scheduleConfig.nextRun)}
-</span>
-</div>
-)
+<span className="flex items-center gap-1.5">
+<Calendar className="h-3.5 w-3.5" />
+Next sync{" "}
+<span className="font-medium text-primary">
+{scheduleConfig.nextRun
+? formatDate(scheduleConfig.nextRun)
+: nextScheduledRun
+? formatDate(nextScheduledRun)
+: "Calculating..."}
+</span>
+</span>
) : (
-<div className="text-xs text-muted-foreground">
-Enable automatic syncing to schedule periodic repository updates
-</div>
+<span>Enable syncing to schedule updates</span>
)}
</div>
</div>
</div>
</div>
{/* Database Cleanup Section */}
-<div className="space-y-4 p-4 border border-border rounded-lg bg-card/50">
+<div className="flex flex-col gap-4 p-4 border border-border rounded-lg bg-card/50">
<div className="flex items-center justify-between">
<h3 className="text-sm font-medium flex items-center gap-2">
<Database className="h-4 w-4 text-primary" />
@@ -237,7 +313,7 @@ export function AutomationSettings({
)}
</div>
-<div className="space-y-4">
+<div className="flex-1 flex flex-col gap-4">
<div className="flex items-start space-x-3">
<Checkbox
id="enable-auto-cleanup"
@@ -255,13 +331,13 @@ export function AutomationSettings({
Enable automatic database cleanup
</Label>
<p className="text-xs text-muted-foreground">
-Remove old activity logs and events to optimize storage
+Remove old activity logs to optimize storage
</p>
</div>
</div>
{cleanupConfig.enabled && (
-<div className="ml-6 space-y-5">
+<div className="space-y-5">
<div className="space-y-2">
<Label htmlFor="retention-period" className="text-sm flex items-center gap-2">
Data retention period
@@ -312,7 +388,7 @@ export function AutomationSettings({
</div>
)}
-<div className="space-y-2 p-3 bg-muted/30 dark:bg-muted/10 rounded-md border border-border/50">
+<div className="mt-auto space-y-2 pt-3 border-t border-border/50">
<div className="flex items-center justify-between text-xs">
<span className="flex items-center gap-1.5">
<Clock className="h-3.5 w-3.5" />

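The `buildClockCronExpression` and `getNextCronOccurrence` helpers imported from `@/lib/utils/schedule-utils` are not part of this diff. A hedged sketch of one plausible contract for the first helper, mapping a start time plus a frequency in hours onto a standard five-field cron expression; this is an assumption about its behavior, not the actual implementation:

```typescript
// Sketch under assumptions: the real schedule-utils implementation is not shown.
export function buildClockCronExpression(
  startTime: string,      // "HH:MM", e.g. "22:00"
  frequencyHours: number, // 1..24
): string | null {
  const match = /^(\d{1,2}):(\d{2})$/.exec(startTime);
  if (!match) return null;
  const hour = Number(match[1]);
  const minute = Number(match[2]);
  if (hour > 23 || minute > 59 || frequencyHours < 1 || frequencyHours > 24) {
    return null;
  }
  if (frequencyHours === 24) {
    return `${minute} ${hour} * * *`; // daily at the chosen time
  }
  // Anchor the repeating hours so one of them lands on the chosen start hour:
  // e.g. 22:00 every 4 hours -> hours 2,6,10,14,18,22.
  const hours: number[] = [];
  for (let h = hour % frequencyHours; h < 24; h += frequencyHours) {
    hours.push(h);
  }
  return `${minute} ${hours.join(",")} * * *`;
}
```

The component only needs the returned expression to be parseable by whatever cron library backs `getNextCronOccurrence`; the anchoring choice above is one possible design.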

@@ -3,6 +3,7 @@ import { GitHubConfigForm } from './GitHubConfigForm';
import { GiteaConfigForm } from './GiteaConfigForm';
import { AutomationSettings } from './AutomationSettings';
import { SSOSettings } from './SSOSettings';
import { NotificationSettings } from './NotificationSettings';
import type {
ConfigApiResponse,
GiteaConfig,
@@ -13,6 +14,7 @@ import type {
DatabaseCleanupConfig,
MirrorOptions,
AdvancedOptions,
NotificationConfig,
} from '@/types/config';
import { Button } from '../ui/button';
import { useAuth } from '@/hooks/useAuth';
@@ -22,6 +24,7 @@ import { toast } from 'sonner';
import { Skeleton } from '@/components/ui/skeleton';
import { invalidateConfigCache } from '@/hooks/useConfigStatus';
import { Tabs, TabsContent, TabsList, TabsTrigger } from '@/components/ui/tabs';
import { withBase } from '@/lib/base-path';
type ConfigState = {
githubConfig: GitHubConfig;
@@ -30,8 +33,11 @@ type ConfigState = {
cleanupConfig: DatabaseCleanupConfig;
mirrorOptions: MirrorOptions;
advancedOptions: AdvancedOptions;
notificationConfig: NotificationConfig;
};
const CONFIG_API_PATH = withBase('/api/config');
export function ConfigTabs() {
const [config, setConfig] = useState<ConfigState>({
githubConfig: {
@@ -39,6 +45,7 @@ export function ConfigTabs() {
token: '',
privateRepositories: false,
mirrorStarred: false,
starredLists: [],
},
giteaConfig: {
url: '',
@@ -50,6 +57,11 @@ export function ConfigTabs() {
starredReposOrg: 'starred',
starredReposMode: 'dedicated-org',
preserveOrgStructure: false,
backupStrategy: "on-force-push",
backupRetentionCount: 5,
backupRetentionDays: 30,
backupDirectory: 'data/repo-backups',
blockSyncOnBackupFailure: true,
},
scheduleConfig: {
enabled: false, // Don't set defaults here - will be loaded from API
@@ -79,6 +91,14 @@ export function ConfigTabs() {
advancedOptions: {
skipForks: false,
starredCodeOnly: false,
autoMirrorStarred: false,
},
notificationConfig: {
enabled: false,
provider: "ntfy",
notifyOnSyncError: true,
notifyOnSyncSuccess: false,
notifyOnNewRepo: false,
},
});
const { user } = useAuth();
@@ -89,10 +109,12 @@ export function ConfigTabs() {
const [isAutoSavingCleanup, setIsAutoSavingCleanup] = useState<boolean>(false);
const [isAutoSavingGitHub, setIsAutoSavingGitHub] = useState<boolean>(false);
const [isAutoSavingGitea, setIsAutoSavingGitea] = useState<boolean>(false);
const [isAutoSavingNotification, setIsAutoSavingNotification] = useState<boolean>(false);
const autoSaveScheduleTimeoutRef = useRef<NodeJS.Timeout | null>(null);
const autoSaveCleanupTimeoutRef = useRef<NodeJS.Timeout | null>(null);
const autoSaveGitHubTimeoutRef = useRef<NodeJS.Timeout | null>(null);
const autoSaveGiteaTimeoutRef = useRef<NodeJS.Timeout | null>(null);
const autoSaveNotificationTimeoutRef = useRef<NodeJS.Timeout | null>(null);
const isConfigFormValid = (): boolean => {
const { githubConfig, giteaConfig } = config;
@@ -119,19 +141,31 @@ export function ConfigTabs() {
if (!user?.id) return;
setIsSyncing(true);
try {
-const result = await apiRequest<{ success: boolean; message?: string }>(
+const result = await apiRequest<{ success: boolean; message?: string; failedOrgs?: string[]; recoveredOrgs?: number }>(
`/sync?userId=${user.id}`,
{ method: 'POST' },
);
-result.success
-? toast.success(
-'GitHub data imported successfully! Head to the Repositories page to start mirroring.',
-)
-: toast.error(
-`Failed to import GitHub data: ${
-result.message || 'Unknown error'
-}`,
+if (result.success) {
+toast.success(
+'GitHub data imported successfully! Head to the Repositories page to start mirroring.',
);
+if (result.failedOrgs && result.failedOrgs.length > 0) {
+toast.warning(
+`${result.failedOrgs.length} org${result.failedOrgs.length > 1 ? 's' : ''} failed to import (${result.failedOrgs.join(', ')}). Check the Organizations tab for details.`,
+);
+}
+if (result.recoveredOrgs && result.recoveredOrgs > 0) {
+toast.success(
+`${result.recoveredOrgs} previously failed org${result.recoveredOrgs > 1 ? 's' : ''} recovered successfully.`,
+);
+}
+} else {
+toast.error(
+`Failed to import GitHub data: ${
+result.message || 'Unknown error'
+}`,
+);
+}
} catch (error) {
toast.error(
`Error importing GitHub data: ${
@@ -167,7 +201,7 @@ export function ConfigTabs() {
};
try {
-const response = await fetch('/api/config', {
+const response = await fetch(CONFIG_API_PATH, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(reqPayload),
@@ -233,7 +267,7 @@ export function ConfigTabs() {
};
try {
-const response = await fetch('/api/config', {
+const response = await fetch(CONFIG_API_PATH, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(reqPayload),
@@ -298,7 +332,7 @@ export function ConfigTabs() {
};
try {
-const response = await fetch('/api/config', {
+const response = await fetch(CONFIG_API_PATH, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(reqPayload),
@@ -347,7 +381,7 @@ export function ConfigTabs() {
};
try {
-const response = await fetch('/api/config', {
+const response = await fetch(CONFIG_API_PATH, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(reqPayload),
@@ -387,7 +421,7 @@ export function ConfigTabs() {
};
try {
-const response = await fetch('/api/config', {
+const response = await fetch(CONFIG_API_PATH, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(reqPayload),
@@ -422,7 +456,7 @@ export function ConfigTabs() {
};
try {
-const response = await fetch('/api/config', {
+const response = await fetch(CONFIG_API_PATH, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(reqPayload),
@@ -442,6 +476,55 @@ export function ConfigTabs() {
}
}, [user?.id, config.githubConfig, config.giteaConfig, config.scheduleConfig, config.cleanupConfig, config.mirrorOptions]);
// Auto-save function for notification config changes
const autoSaveNotificationConfig = useCallback(async (notifConfig: NotificationConfig) => {
if (!user?.id) return;
// Clear any existing timeout
if (autoSaveNotificationTimeoutRef.current) {
clearTimeout(autoSaveNotificationTimeoutRef.current);
}
// Debounce the auto-save to prevent excessive API calls
autoSaveNotificationTimeoutRef.current = setTimeout(async () => {
setIsAutoSavingNotification(true);
const reqPayload = {
userId: user.id!,
githubConfig: config.githubConfig,
giteaConfig: config.giteaConfig,
scheduleConfig: config.scheduleConfig,
cleanupConfig: config.cleanupConfig,
mirrorOptions: config.mirrorOptions,
advancedOptions: config.advancedOptions,
notificationConfig: notifConfig,
};
try {
const response = await fetch(CONFIG_API_PATH, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(reqPayload),
});
const result: SaveConfigApiResponse = await response.json();
if (result.success) {
// Silent success - no toast for auto-save
invalidateConfigCache();
} else {
showErrorToast(
`Auto-save failed: ${result.message || 'Unknown error'}`,
toast
);
}
} catch (error) {
showErrorToast(error, toast);
} finally {
setIsAutoSavingNotification(false);
}
}, 500); // 500ms debounce
}, [user?.id, config.githubConfig, config.giteaConfig, config.scheduleConfig, config.cleanupConfig, config.mirrorOptions, config.advancedOptions]);
// Cleanup timeouts on unmount
useEffect(() => {
return () => {
@@ -457,6 +540,9 @@ export function ConfigTabs() {
if (autoSaveGiteaTimeoutRef.current) {
clearTimeout(autoSaveGiteaTimeoutRef.current);
}
if (autoSaveNotificationTimeoutRef.current) {
clearTimeout(autoSaveNotificationTimeoutRef.current);
}
};
}, []);
@@ -488,6 +574,8 @@ export function ConfigTabs() {
},
advancedOptions:
response.advancedOptions || config.advancedOptions,
notificationConfig:
(response as any).notificationConfig || config.notificationConfig,
});
}
@@ -617,9 +705,10 @@ export function ConfigTabs() {
{/* Content section - Tabs layout */}
<Tabs defaultValue="connections" className="space-y-4">
-<TabsList className="grid w-full grid-cols-3">
+<TabsList className="grid w-full grid-cols-4">
<TabsTrigger value="connections">Connections</TabsTrigger>
<TabsTrigger value="automation">Automation</TabsTrigger>
<TabsTrigger value="notifications">Notifications</TabsTrigger>
<TabsTrigger value="sso">Authentication</TabsTrigger>
</TabsList>
@@ -656,9 +745,20 @@ export function ConfigTabs() {
: update,
}))
}
giteaConfig={config.giteaConfig}
setGiteaConfig={update =>
setConfig(prev => ({
...prev,
giteaConfig:
typeof update === 'function'
? update(prev.giteaConfig)
: update,
}))
}
onAutoSave={autoSaveGitHubConfig}
onMirrorOptionsAutoSave={autoSaveMirrorOptions}
onAdvancedOptionsAutoSave={autoSaveAdvancedOptions}
onGiteaAutoSave={autoSaveGiteaConfig}
isAutoSaving={isAutoSavingGitHub}
/>
<GiteaConfigForm
@@ -696,6 +796,17 @@ export function ConfigTabs() {
/>
</TabsContent>
<TabsContent value="notifications" className="space-y-4">
<NotificationSettings
notificationConfig={config.notificationConfig}
onNotificationChange={(newConfig) => {
setConfig(prev => ({ ...prev, notificationConfig: newConfig }));
autoSaveNotificationConfig(newConfig);
}}
isAutoSaving={isAutoSavingNotification}
/>
</TabsContent>
<TabsContent value="sso" className="space-y-4">
<SSOSettings />
</TabsContent>

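Each auto-save callback in `ConfigTabs` (schedule, cleanup, GitHub, Gitea, notifications) repeats the same clear-then-schedule debounce around a POST. The shared pattern, reduced to a generic sketch with illustrative names:

```typescript
// Generic debounce-the-last-write pattern, as used by the auto-save handlers.
// Names here are illustrative; the component inlines this per config section.
export function makeDebouncedSaver<T>(
  save: (value: T) => Promise<void>,
  delayMs = 500, // matches the 500ms debounce in the diff
): (value: T) => void {
  let timer: ReturnType<typeof setTimeout> | null = null;
  return (value: T) => {
    if (timer) clearTimeout(timer); // drop any pending save
    timer = setTimeout(() => {
      timer = null;
      void save(value); // only the most recent value is persisted
    }, delayMs);
  };
}
```

Rapid successive edits collapse into a single request; component unmount still needs the `clearTimeout` cleanup that the diff adds in its `useEffect` return.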

@@ -7,10 +7,11 @@ import {
CardTitle,
} from "@/components/ui/card";
import { githubApi } from "@/lib/api";
-import type { GitHubConfig, MirrorOptions, AdvancedOptions } from "@/types/config";
+import type { GitHubConfig, MirrorOptions, AdvancedOptions, GiteaConfig, BackupStrategy } from "@/types/config";
import { Input } from "../ui/input";
import { toast } from "sonner";
-import { Info } from "lucide-react";
+import { Info, ShieldAlert } from "lucide-react";
import { Badge } from "@/components/ui/badge";
import { GitHubMirrorSettings } from "./GitHubMirrorSettings";
import { Separator } from "../ui/separator";
import {
@@ -26,23 +27,29 @@ interface GitHubConfigFormProps {
setMirrorOptions: React.Dispatch<React.SetStateAction<MirrorOptions>>;
advancedOptions: AdvancedOptions;
setAdvancedOptions: React.Dispatch<React.SetStateAction<AdvancedOptions>>;
giteaConfig?: GiteaConfig;
setGiteaConfig?: React.Dispatch<React.SetStateAction<GiteaConfig>>;
onAutoSave?: (githubConfig: GitHubConfig) => Promise<void>;
onMirrorOptionsAutoSave?: (mirrorOptions: MirrorOptions) => Promise<void>;
onAdvancedOptionsAutoSave?: (advancedOptions: AdvancedOptions) => Promise<void>;
onGiteaAutoSave?: (giteaConfig: GiteaConfig) => Promise<void>;
isAutoSaving?: boolean;
}
export function GitHubConfigForm({
config,
setConfig,
config,
setConfig,
mirrorOptions,
setMirrorOptions,
advancedOptions,
setAdvancedOptions,
-onAutoSave,
+giteaConfig,
+setGiteaConfig,
+onAutoSave,
onMirrorOptionsAutoSave,
onAdvancedOptionsAutoSave,
-isAutoSaving
+onGiteaAutoSave,
+isAutoSaving
}: GitHubConfigFormProps) {
const [isLoading, setIsLoading] = useState(false);
@@ -202,7 +209,161 @@ export function GitHubConfigForm({
if (onAdvancedOptionsAutoSave) onAdvancedOptionsAutoSave(newOptions);
}}
/>
{giteaConfig && setGiteaConfig && (
<>
<Separator />
<div className="space-y-4">
<h3 className="text-sm font-medium flex items-center gap-2">
<ShieldAlert className="h-4 w-4 text-primary" />
Destructive Update Protection
<Badge variant="secondary" className="ml-2 text-[10px] px-1.5 py-0">BETA</Badge>
</h3>
<p className="text-xs text-muted-foreground">
Choose how to handle force-pushes or rewritten upstream history on GitHub.
</p>
<div className="grid grid-cols-2 md:grid-cols-4 gap-2">
{([
{
value: "disabled",
label: "Disabled",
desc: "No detection or backups",
},
{
value: "always",
label: "Always Backup",
desc: "Snapshot before every sync (high disk usage)",
},
{
value: "on-force-push",
label: "Smart",
desc: "Backup only on force-push",
},
{
value: "block-on-force-push",
label: "Block & Approve",
desc: "Require approval on force-push",
},
] as const).map((opt) => {
const isSelected = (giteaConfig.backupStrategy ?? "on-force-push") === opt.value;
return (
<button
key={opt.value}
type="button"
onClick={() => {
const newConfig = { ...giteaConfig, backupStrategy: opt.value as BackupStrategy };
setGiteaConfig(newConfig);
if (onGiteaAutoSave) onGiteaAutoSave(newConfig);
}}
className={`flex flex-col items-start gap-1 rounded-lg border p-3 text-left text-sm transition-colors ${
isSelected
? "border-primary bg-primary/5 ring-1 ring-primary"
: "border-input hover:bg-accent hover:text-accent-foreground"
}`}
>
<span className="font-medium">{opt.label}</span>
<span className="text-xs text-muted-foreground">{opt.desc}</span>
</button>
);
})}
</div>
{(giteaConfig.backupStrategy ?? "on-force-push") !== "disabled" && (
<>
<div className="grid grid-cols-1 md:grid-cols-3 gap-4">
<div>
<label htmlFor="backup-retention" className="block text-sm font-medium mb-1.5">
Snapshot retention count
</label>
<input
id="backup-retention"
name="backupRetentionCount"
type="number"
min={1}
value={giteaConfig.backupRetentionCount ?? 5}
onChange={(e) => {
const newConfig = {
...giteaConfig,
backupRetentionCount: Math.max(1, Number.parseInt(e.target.value, 10) || 5),
};
setGiteaConfig(newConfig);
if (onGiteaAutoSave) onGiteaAutoSave(newConfig);
}}
className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring"
/>
</div>
<div>
<label htmlFor="backup-retention-days" className="block text-sm font-medium mb-1.5">
Snapshot retention days
</label>
<input
id="backup-retention-days"
name="backupRetentionDays"
type="number"
min={0}
value={giteaConfig.backupRetentionDays ?? 30}
onChange={(e) => {
const newConfig = {
...giteaConfig,
backupRetentionDays: Math.max(0, Number.parseInt(e.target.value, 10) || 0),
};
setGiteaConfig(newConfig);
if (onGiteaAutoSave) onGiteaAutoSave(newConfig);
}}
className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring"
/>
<p className="text-xs text-muted-foreground mt-1">0 = no time-based limit</p>
</div>
<div>
<label htmlFor="backup-directory" className="block text-sm font-medium mb-1.5">
Snapshot directory
</label>
<input
id="backup-directory"
name="backupDirectory"
type="text"
value={giteaConfig.backupDirectory || "data/repo-backups"}
onChange={(e) => {
const newConfig = { ...giteaConfig, backupDirectory: e.target.value };
setGiteaConfig(newConfig);
if (onGiteaAutoSave) onGiteaAutoSave(newConfig);
}}
className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring"
placeholder="data/repo-backups"
/>
</div>
</div>
{((giteaConfig.backupStrategy ?? "on-force-push") === "always" ||
(giteaConfig.backupStrategy ?? "on-force-push") === "on-force-push") && (
<label className="flex items-start gap-3 text-sm">
<input
name="blockSyncOnBackupFailure"
type="checkbox"
checked={Boolean(giteaConfig.blockSyncOnBackupFailure)}
onChange={(e) => {
const newConfig = { ...giteaConfig, blockSyncOnBackupFailure: e.target.checked };
setGiteaConfig(newConfig);
if (onGiteaAutoSave) onGiteaAutoSave(newConfig);
}}
className="mt-0.5 rounded border-input"
/>
<span>
Block sync when snapshot fails
<p className="text-xs text-muted-foreground">
Recommended for backup-first behavior. If disabled, sync continues even when snapshot creation fails.
</p>
</span>
</label>
)}
</>
)}
</div>
</>
)}
{/* Mobile: Show button at bottom */}
<Button
type="button"

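The retention inputs above (`backupRetentionCount`, and `backupRetentionDays` where 0 disables the time limit) describe pruning that happens server-side, outside this diff. A hypothetical sketch of how the two settings might combine, assuming the newest snapshots are kept first; the function name and `Snapshot` shape are illustrative:

```typescript
// Hypothetical pruning logic; the actual server-side implementation is not shown.
interface Snapshot {
  path: string;
  createdAt: Date;
}

export function selectSnapshotsToKeep(
  snapshots: Snapshot[],
  retentionCount: number, // keep at most this many (min 1, as the form enforces)
  retentionDays: number,  // 0 disables the age limit
  now: Date = new Date(),
): Snapshot[] {
  const newestFirst = [...snapshots].sort(
    (a, b) => b.createdAt.getTime() - a.createdAt.getTime(),
  );
  const maxAgeMs = retentionDays * 24 * 60 * 60 * 1000;
  return newestFirst.filter((snap, index) => {
    if (index >= retentionCount) return false; // over the count cap
    if (retentionDays > 0 && now.getTime() - snap.createdAt.getTime() > maxAgeMs) {
      return false; // older than the retention window
    }
    return true;
  });
}
```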

@@ -4,6 +4,7 @@ import { Label } from "@/components/ui/label";
import { Separator } from "@/components/ui/separator";
import { Badge } from "@/components/ui/badge";
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import {
Tooltip,
TooltipContent,
@@ -17,6 +18,7 @@ import {
} from "@/components/ui/popover";
import {
Info,
Check,
GitBranch,
Star,
Lock,
@@ -31,7 +33,9 @@ import {
ChevronDown,
Funnel,
HardDrive,
-FileCode2
+FileCode2,
+Plus,
+X
} from "lucide-react";
import type { GitHubConfig, MirrorOptions, AdvancedOptions, DuplicateNameStrategy } from "@/types/config";
import {
@@ -41,7 +45,16 @@ import {
SelectTrigger,
SelectValue,
} from "@/components/ui/select";
import {
Command,
CommandEmpty,
CommandGroup,
CommandInput,
CommandItem,
CommandList,
} from "@/components/ui/command";
import { cn } from "@/lib/utils";
import { githubApi } from "@/lib/api";
interface GitHubMirrorSettingsProps {
githubConfig: GitHubConfig;
@@ -60,8 +73,42 @@ export function GitHubMirrorSettings({
onMirrorOptionsChange,
onAdvancedOptionsChange,
}: GitHubMirrorSettingsProps) {
-const handleGitHubChange = (field: keyof GitHubConfig, value: boolean | string) => {
+const [starListsOpen, setStarListsOpen] = React.useState(false);
+const [starListSearch, setStarListSearch] = React.useState("");
+const [customStarListName, setCustomStarListName] = React.useState("");
+const [availableStarLists, setAvailableStarLists] = React.useState<string[]>([]);
+const [loadingStarLists, setLoadingStarLists] = React.useState(false);
+const [loadedStarLists, setLoadedStarLists] = React.useState(false);
+const [attemptedStarListLoad, setAttemptedStarListLoad] = React.useState(false);
+const normalizeStarListNames = React.useCallback((lists: string[] | undefined): string[] => {
+if (!Array.isArray(lists)) return [];
+const seen = new Set<string>();
+const normalized: string[] = [];
+for (const list of lists) {
+const trimmed = list.trim();
+if (!trimmed) continue;
+const key = trimmed.toLowerCase();
+if (seen.has(key)) continue;
+seen.add(key);
+normalized.push(trimmed);
+}
+return normalized;
+}, []);
+const selectedStarLists = React.useMemo(
+() => normalizeStarListNames(githubConfig.starredLists),
+[githubConfig.starredLists, normalizeStarListNames],
+);
+const allKnownStarLists = React.useMemo(
+() => normalizeStarListNames([...availableStarLists, ...selectedStarLists]),
+[availableStarLists, selectedStarLists, normalizeStarListNames],
+);
+const handleGitHubChange = (field: keyof GitHubConfig, value: boolean | string | string[]) => {
onGitHubConfigChange({ ...githubConfig, [field]: value });
};
@@ -83,6 +130,59 @@ export function GitHubMirrorSettings({
onAdvancedOptionsChange({ ...advancedOptions, [field]: value });
};
const setSelectedStarLists = React.useCallback((lists: string[]) => {
onGitHubConfigChange({
...githubConfig,
starredLists: normalizeStarListNames(lists),
});
}, [githubConfig, normalizeStarListNames, onGitHubConfigChange]);
const loadStarLists = React.useCallback(async () => {
if (
loadingStarLists ||
loadedStarLists ||
attemptedStarListLoad ||
!githubConfig.mirrorStarred
) return;
setAttemptedStarListLoad(true);
setLoadingStarLists(true);
try {
const response = await githubApi.getStarredLists();
setAvailableStarLists(normalizeStarListNames(response.lists));
setLoadedStarLists(true);
} catch {
// Keep UX usable with manual custom input even if list fetch fails.
// Allow retry on next popover open.
setLoadedStarLists(false);
} finally {
setLoadingStarLists(false);
}
}, [
attemptedStarListLoad,
githubConfig.mirrorStarred,
loadedStarLists,
loadingStarLists,
normalizeStarListNames,
]);
React.useEffect(() => {
if (!starListsOpen || !githubConfig.mirrorStarred) return;
void loadStarLists();
}, [starListsOpen, githubConfig.mirrorStarred, loadStarLists]);
React.useEffect(() => {
if (!githubConfig.mirrorStarred) {
setStarListsOpen(false);
}
}, [githubConfig.mirrorStarred]);
React.useEffect(() => {
if (!starListsOpen) {
setAttemptedStarListLoad(false);
}
}, [starListsOpen]);
// When metadata is disabled, all components should be disabled
const isMetadataEnabled = mirrorOptions.mirrorMetadata;
@@ -98,6 +198,17 @@ export function GitHubMirrorSettings({
const starredContentCount = Object.entries(starredRepoContent).filter(([key, value]) => key !== 'code' && value).length;
const totalStarredOptions = 4; // releases, issues, PRs, wiki
const normalizedStarListSearch = starListSearch.trim();
const canAddSearchAsStarList = normalizedStarListSearch.length > 0
&& !allKnownStarLists.some((list) => list.toLowerCase() === normalizedStarListSearch.toLowerCase());
const addCustomStarList = () => {
const trimmed = customStarListName.trim();
if (!trimmed) return;
setSelectedStarLists([...selectedStarLists, trimmed]);
setCustomStarListName("");
};
return (
<div className="space-y-6">
{/* Repository Selection Section */}
@@ -287,6 +398,168 @@ export function GitHubMirrorSettings({
</div>
</div>
{/* Auto-mirror starred repos toggle */}
{githubConfig.mirrorStarred && (
<div className="mt-4">
<div className="flex items-start space-x-3">
<Checkbox
id="auto-mirror-starred"
checked={advancedOptions.autoMirrorStarred ?? false}
onCheckedChange={(checked) => handleAdvancedChange('autoMirrorStarred', !!checked)}
/>
<div className="space-y-0.5 flex-1">
<Label
htmlFor="auto-mirror-starred"
className="text-sm font-normal cursor-pointer flex items-center gap-2"
>
<Star className="h-3.5 w-3.5" />
Auto-mirror new starred repositories
</Label>
<p className="text-xs text-muted-foreground">
When disabled, starred repos are imported for browsing but not automatically mirrored. You can still mirror individual repos manually.
</p>
</div>
</div>
</div>
)}
{/* Star list selection */}
{githubConfig.mirrorStarred && (
<div className="mt-4 space-y-2">
<Label className="text-xs font-medium text-muted-foreground">
Star Lists (optional)
</Label>
<Popover open={starListsOpen} onOpenChange={setStarListsOpen}>
<PopoverTrigger asChild>
<Button
variant="outline"
role="combobox"
aria-expanded={starListsOpen}
className="w-full justify-between h-9 text-xs font-normal"
>
<span className="truncate text-left">
{selectedStarLists.length === 0
? "All starred repositories"
: `${selectedStarLists.length} list${selectedStarLists.length === 1 ? "" : "s"} selected`}
</span>
<ChevronDown className="ml-2 h-3 w-3 opacity-50 shrink-0" />
</Button>
</PopoverTrigger>
<PopoverContent className="w-[360px] p-0" align="start">
<Command>
<CommandInput
value={starListSearch}
onValueChange={setStarListSearch}
placeholder="Search GitHub star lists..."
/>
<CommandList>
<CommandEmpty>
{loadingStarLists ? "Loading star lists..." : "No matching lists"}
</CommandEmpty>
<CommandGroup>
{allKnownStarLists.map((list) => {
const isSelected = selectedStarLists.some(
(selected) => selected.toLowerCase() === list.toLowerCase(),
);
return (
<CommandItem
key={list}
value={list}
onSelect={() => {
if (isSelected) {
setSelectedStarLists(
selectedStarLists.filter(
(selected) => selected.toLowerCase() !== list.toLowerCase(),
),
);
} else {
setSelectedStarLists([...selectedStarLists, list]);
}
}}
>
<Check
className={cn(
"mr-2 h-4 w-4",
isSelected ? "opacity-100" : "opacity-0",
)}
/>
<span className="truncate">{list}</span>
</CommandItem>
);
})}
</CommandGroup>
</CommandList>
</Command>
{canAddSearchAsStarList && (
<div className="border-t p-2">
<Button
variant="ghost"
size="sm"
className="w-full justify-start text-xs"
onClick={() => {
setSelectedStarLists([...selectedStarLists, normalizedStarListSearch]);
setStarListSearch("");
}}
>
<Plus className="mr-2 h-3.5 w-3.5" />
Add "{normalizedStarListSearch}"
</Button>
</div>
)}
</PopoverContent>
</Popover>
<p className="text-xs text-muted-foreground">
Leave empty to mirror all starred repositories. Select one or more lists to limit syncing.
</p>
{selectedStarLists.length > 0 && (
<div className="flex flex-wrap gap-1.5">
{selectedStarLists.map((list) => (
<Badge key={list} variant="secondary" className="gap-1">
<span>{list}</span>
<button
type="button"
onClick={() =>
setSelectedStarLists(
selectedStarLists.filter(
(selected) => selected.toLowerCase() !== list.toLowerCase(),
),
)
}
className="rounded-sm hover:text-foreground/80"
aria-label={`Remove ${list} list`}
>
<X className="h-3 w-3" />
</button>
</Badge>
))}
</div>
)}
<div className="flex items-center gap-2">
<Input
value={customStarListName}
onChange={(event) => setCustomStarListName(event.target.value)}
placeholder="Add custom list name"
className="h-8 text-xs"
/>
<Button
type="button"
variant="outline"
size="sm"
className="h-8"
onClick={addCustomStarList}
disabled={!customStarListName.trim()}
>
Add
</Button>
</div>
</div>
)}
{/* Duplicate name handling for starred repos */}
{githubConfig.mirrorStarred && (
<div className="mt-4 space-y-2">


@@ -100,9 +100,14 @@ export function GiteaConfigForm({ config, setConfig, onAutoSave, isAutoSaving, g
);
}
const normalizedValue =
type === "checkbox"
? checked
: value;
const newConfig = {
...config,
[name]: type === "checkbox" ? checked : value,
[name]: normalizedValue,
};
setConfig(newConfig);
@@ -286,7 +291,7 @@ export function GiteaConfigForm({ config, setConfig, onAutoSave, isAutoSaving, g
if (onAutoSave) onAutoSave(newConfig);
}}
/>
{/* Mobile: Show button at bottom */}
<Button
type="button"


@@ -0,0 +1,395 @@
import { useState } from "react";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Label } from "@/components/ui/label";
import { Input } from "@/components/ui/input";
import {
Select,
SelectContent,
SelectItem,
SelectTrigger,
SelectValue,
} from "@/components/ui/select";
import { Switch } from "@/components/ui/switch";
import { Button } from "@/components/ui/button";
import { Bell, Activity, Send } from "lucide-react";
import { toast } from "sonner";
import type { NotificationConfig } from "@/types/config";
import { withBase } from "@/lib/base-path";
interface NotificationSettingsProps {
notificationConfig: NotificationConfig;
onNotificationChange: (config: NotificationConfig) => void;
isAutoSaving?: boolean;
}
export function NotificationSettings({
notificationConfig,
onNotificationChange,
isAutoSaving,
}: NotificationSettingsProps) {
const [isTesting, setIsTesting] = useState(false);
const handleTestNotification = async () => {
setIsTesting(true);
try {
const resp = await fetch(withBase("/api/notifications/test"), {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ notificationConfig }),
});
const result = await resp.json();
if (result.success) {
toast.success("Test notification sent successfully!");
} else {
toast.error(`Test failed: ${result.error || "Unknown error"}`);
}
} catch (error) {
toast.error(
`Test failed: ${error instanceof Error ? error.message : String(error)}`
);
} finally {
setIsTesting(false);
}
};
return (
<Card className="w-full">
<CardHeader>
<CardTitle className="text-lg font-semibold flex items-center gap-2">
<Bell className="h-5 w-5" />
Notifications
{isAutoSaving && (
<Activity className="h-4 w-4 animate-spin text-muted-foreground ml-2" />
)}
</CardTitle>
</CardHeader>
<CardContent className="space-y-6">
{/* Enable/disable toggle */}
<div className="flex items-center justify-between">
<div className="space-y-0.5">
<Label htmlFor="notifications-enabled" className="text-sm font-medium cursor-pointer">
Enable notifications
</Label>
<p className="text-xs text-muted-foreground">
Receive alerts when mirror jobs complete or fail
</p>
</div>
<Switch
id="notifications-enabled"
checked={notificationConfig.enabled}
onCheckedChange={(checked) =>
onNotificationChange({ ...notificationConfig, enabled: checked })
}
/>
</div>
{notificationConfig.enabled && (
<>
{/* Provider selector */}
<div className="space-y-2">
<Label htmlFor="notification-provider" className="text-sm font-medium">
Notification provider
</Label>
<Select
value={notificationConfig.provider}
onValueChange={(value: "ntfy" | "apprise") =>
onNotificationChange({ ...notificationConfig, provider: value })
}
>
<SelectTrigger id="notification-provider">
<SelectValue />
</SelectTrigger>
<SelectContent>
<SelectItem value="ntfy">Ntfy.sh</SelectItem>
<SelectItem value="apprise">Apprise API</SelectItem>
</SelectContent>
</Select>
</div>
{/* Ntfy configuration */}
{notificationConfig.provider === "ntfy" && (
<div className="space-y-4 p-4 border border-border rounded-lg bg-card/50">
<h3 className="text-sm font-medium">Ntfy.sh Settings</h3>
<div className="space-y-2">
<Label htmlFor="ntfy-url" className="text-sm">
Server URL
</Label>
<Input
id="ntfy-url"
type="url"
placeholder="https://ntfy.sh"
value={notificationConfig.ntfy?.url || "https://ntfy.sh"}
onChange={(e) =>
onNotificationChange({
...notificationConfig,
ntfy: {
...notificationConfig.ntfy!,
url: e.target.value,
topic: notificationConfig.ntfy?.topic || "",
priority: notificationConfig.ntfy?.priority || "default",
},
})
}
/>
<p className="text-xs text-muted-foreground">
Use https://ntfy.sh for the public server or your self-hosted instance URL
</p>
</div>
<div className="space-y-2">
<Label htmlFor="ntfy-topic" className="text-sm">
Topic <span className="text-destructive">*</span>
</Label>
<Input
id="ntfy-topic"
placeholder="gitea-mirror"
value={notificationConfig.ntfy?.topic || ""}
onChange={(e) =>
onNotificationChange({
...notificationConfig,
ntfy: {
...notificationConfig.ntfy!,
url: notificationConfig.ntfy?.url || "https://ntfy.sh",
topic: e.target.value,
priority: notificationConfig.ntfy?.priority || "default",
},
})
}
/>
<p className="text-xs text-muted-foreground">
Choose a unique topic name. Anyone with the topic name can subscribe.
</p>
</div>
<div className="space-y-2">
<Label htmlFor="ntfy-token" className="text-sm">
Access token (optional)
</Label>
<Input
id="ntfy-token"
type="password"
placeholder="tk_..."
value={notificationConfig.ntfy?.token || ""}
onChange={(e) =>
onNotificationChange({
...notificationConfig,
ntfy: {
...notificationConfig.ntfy!,
url: notificationConfig.ntfy?.url || "https://ntfy.sh",
topic: notificationConfig.ntfy?.topic || "",
token: e.target.value,
priority: notificationConfig.ntfy?.priority || "default",
},
})
}
/>
<p className="text-xs text-muted-foreground">
Required if your ntfy server uses authentication
</p>
</div>
<div className="space-y-2">
<Label htmlFor="ntfy-priority" className="text-sm">
Default priority
</Label>
<Select
value={notificationConfig.ntfy?.priority || "default"}
onValueChange={(value: "min" | "low" | "default" | "high" | "urgent") =>
onNotificationChange({
...notificationConfig,
ntfy: {
...notificationConfig.ntfy!,
url: notificationConfig.ntfy?.url || "https://ntfy.sh",
topic: notificationConfig.ntfy?.topic || "",
priority: value,
},
})
}
>
<SelectTrigger id="ntfy-priority">
<SelectValue />
</SelectTrigger>
<SelectContent>
<SelectItem value="min">Min</SelectItem>
<SelectItem value="low">Low</SelectItem>
<SelectItem value="default">Default</SelectItem>
<SelectItem value="high">High</SelectItem>
<SelectItem value="urgent">Urgent</SelectItem>
</SelectContent>
</Select>
<p className="text-xs text-muted-foreground">
Error notifications always use "high" priority regardless of this setting
</p>
</div>
</div>
)}
{/* Apprise configuration */}
{notificationConfig.provider === "apprise" && (
<div className="space-y-4 p-4 border border-border rounded-lg bg-card/50">
<h3 className="text-sm font-medium">Apprise API Settings</h3>
<div className="space-y-2">
<Label htmlFor="apprise-url" className="text-sm">
Server URL <span className="text-destructive">*</span>
</Label>
<Input
id="apprise-url"
type="url"
placeholder="http://apprise:8000"
value={notificationConfig.apprise?.url || ""}
onChange={(e) =>
onNotificationChange({
...notificationConfig,
apprise: {
...notificationConfig.apprise!,
url: e.target.value,
token: notificationConfig.apprise?.token || "",
},
})
}
/>
<p className="text-xs text-muted-foreground">
URL of your Apprise API server (e.g., http://apprise:8000)
</p>
</div>
<div className="space-y-2">
<Label htmlFor="apprise-token" className="text-sm">
Token / path <span className="text-destructive">*</span>
</Label>
<Input
id="apprise-token"
placeholder="gitea-mirror"
value={notificationConfig.apprise?.token || ""}
onChange={(e) =>
onNotificationChange({
...notificationConfig,
apprise: {
...notificationConfig.apprise!,
url: notificationConfig.apprise?.url || "",
token: e.target.value,
},
})
}
/>
<p className="text-xs text-muted-foreground">
The Apprise API configuration token or key
</p>
</div>
<div className="space-y-2">
<Label htmlFor="apprise-tag" className="text-sm">
Tag filter (optional)
</Label>
<Input
id="apprise-tag"
placeholder="all"
value={notificationConfig.apprise?.tag || ""}
onChange={(e) =>
onNotificationChange({
...notificationConfig,
apprise: {
...notificationConfig.apprise!,
url: notificationConfig.apprise?.url || "",
token: notificationConfig.apprise?.token || "",
tag: e.target.value,
},
})
}
/>
<p className="text-xs text-muted-foreground">
Optional tag to filter which Apprise services receive notifications
</p>
</div>
</div>
)}
{/* Event toggles */}
<div className="space-y-4 p-4 border border-border rounded-lg bg-card/50">
<h3 className="text-sm font-medium">Notification Events</h3>
<div className="flex items-center justify-between">
<div className="space-y-0.5">
<Label htmlFor="notify-sync-error" className="text-sm font-normal cursor-pointer">
Sync errors
</Label>
<p className="text-xs text-muted-foreground">
Notify when a mirror job fails
</p>
</div>
<Switch
id="notify-sync-error"
checked={notificationConfig.notifyOnSyncError}
onCheckedChange={(checked) =>
onNotificationChange({ ...notificationConfig, notifyOnSyncError: checked })
}
/>
</div>
<div className="flex items-center justify-between">
<div className="space-y-0.5">
<Label htmlFor="notify-sync-success" className="text-sm font-normal cursor-pointer">
Sync success
</Label>
<p className="text-xs text-muted-foreground">
Notify when a mirror job completes successfully
</p>
</div>
<Switch
id="notify-sync-success"
checked={notificationConfig.notifyOnSyncSuccess}
onCheckedChange={(checked) =>
onNotificationChange({ ...notificationConfig, notifyOnSyncSuccess: checked })
}
/>
</div>
<div className="flex items-center justify-between">
<div className="space-y-0.5">
<Label htmlFor="notify-new-repo" className="text-sm font-normal cursor-pointer text-muted-foreground">
New repository discovered (coming soon)
</Label>
<p className="text-xs text-muted-foreground">
Notify when a new GitHub repository is auto-imported
</p>
</div>
<Switch
id="notify-new-repo"
checked={notificationConfig.notifyOnNewRepo}
disabled
onCheckedChange={(checked) =>
onNotificationChange({ ...notificationConfig, notifyOnNewRepo: checked })
}
/>
</div>
</div>
{/* Test button */}
<div className="flex justify-end">
<Button
variant="outline"
onClick={handleTestNotification}
disabled={isTesting}
>
{isTesting ? (
<>
<Activity className="h-4 w-4 animate-spin mr-2" />
Sending...
</>
) : (
<>
<Send className="h-4 w-4 mr-2" />
Send Test Notification
</>
)}
</Button>
</div>
</>
)}
</CardContent>
</Card>
);
}
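The component above only stores the ntfy settings; the actual publish happens server-side. For reference, ntfy's publish API is a plain HTTP POST to `<server>/<topic>`, so a request can be assembled from this config roughly as below. This is a sketch, not the app's actual server code; `buildNtfyRequest` is a hypothetical helper.

```typescript
// Hypothetical helper mirroring the NotificationConfig["ntfy"] shape used above.
interface NtfyConfig {
  url: string;      // e.g. "https://ntfy.sh" or a self-hosted instance
  topic: string;
  token?: string;   // "tk_..." access token, if the server requires auth
  priority?: "min" | "low" | "default" | "high" | "urgent";
}

// Builds the fetch() arguments for a single ntfy publish.
// Errors are forced to "high" priority, matching the hint in the UI copy.
function buildNtfyRequest(cfg: NtfyConfig, title: string, message: string, isError = false) {
  const base = cfg.url.endsWith("/") ? cfg.url.slice(0, -1) : cfg.url;
  const headers: Record<string, string> = {
    Title: title,
    Priority: isError ? "high" : cfg.priority ?? "default",
  };
  if (cfg.token) headers.Authorization = `Bearer ${cfg.token}`;
  return { url: `${base}/${cfg.topic}`, init: { method: "POST", headers, body: message } };
}
```

Passing the result to `fetch(req.url, req.init)` would deliver the notification; the test endpoint wired to the "Send Test Notification" button presumably does something equivalent.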


@@ -14,6 +14,7 @@ import { Badge } from '../ui/badge';
import { Tabs, TabsContent, TabsList, TabsTrigger } from '@/components/ui/tabs';
import { Textarea } from '@/components/ui/textarea';
import { MultiSelect } from '@/components/ui/multi-select';
import { withBase } from '@/lib/base-path';
function isTrustedIssuer(issuer: string, allowedHosts: string[]): boolean {
try {
@@ -100,6 +101,9 @@ export function SSOSettings() {
digestAlgorithm: 'sha256',
identifierFormat: 'urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress',
});
const appOrigin = typeof window !== 'undefined' ? window.location.origin : '';
const buildAbsoluteAppUrl = (path: string) =>
appOrigin ? new URL(withBase(path), appOrigin).toString() : withBase(path);
@@ -179,8 +183,8 @@ export function SSOSettings() {
} else {
requestData.entryPoint = providerForm.entryPoint;
requestData.cert = providerForm.cert;
requestData.callbackUrl = providerForm.callbackUrl || `${window.location.origin}/api/auth/sso/saml2/callback/${providerForm.providerId}`;
requestData.audience = providerForm.audience || window.location.origin;
requestData.callbackUrl = providerForm.callbackUrl || buildAbsoluteAppUrl(`/api/auth/sso/saml2/callback/${providerForm.providerId}`);
requestData.audience = providerForm.audience || appOrigin;
requestData.wantAssertionsSigned = providerForm.wantAssertionsSigned;
requestData.signatureAlgorithm = providerForm.signatureAlgorithm;
requestData.digestAlgorithm = providerForm.digestAlgorithm;
@@ -517,7 +521,7 @@ export function SSOSettings() {
<AlertCircle className="h-4 w-4" />
<AlertDescription>
<div className="space-y-2">
<p>Redirect URL: {window.location.origin}/api/auth/sso/callback/{providerForm.providerId || '{provider-id}'}</p>
<p>Redirect URL: {buildAbsoluteAppUrl(`/api/auth/sso/callback/${providerForm.providerId || '{provider-id}'}`)}</p>
{isTrustedIssuer(providerForm.issuer, ['google.com']) && (
<p className="text-xs text-muted-foreground">
Note: Google doesn't support the "offline_access" scope. Make sure to exclude it from the selected scopes.
@@ -563,8 +567,8 @@ export function SSOSettings() {
<AlertCircle className="h-4 w-4" />
<AlertDescription>
<div className="space-y-1">
<p>Callback URL: {window.location.origin}/api/auth/sso/saml2/callback/{providerForm.providerId || '{provider-id}'}</p>
<p>SP Metadata: {window.location.origin}/api/auth/sso/saml2/sp/metadata?providerId={providerForm.providerId || '{provider-id}'}</p>
<p>Callback URL: {buildAbsoluteAppUrl(`/api/auth/sso/saml2/callback/${providerForm.providerId || '{provider-id}'}`)}</p>
<p>SP Metadata: {buildAbsoluteAppUrl(`/api/auth/sso/saml2/sp/metadata?providerId=${providerForm.providerId || '{provider-id}'}`)}</p>
</div>
</AlertDescription>
</Alert>
@@ -724,4 +728,4 @@ export function SSOSettings() {
</Card>
</div>
);
}
}
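The `buildAbsoluteAppUrl` helper introduced in this hunk relies on the `URL` constructor resolving an already-prefixed absolute path against the origin. A standalone illustration of that composition (the `/mirror` prefix and the origin are made-up stand-ins; the real prefix comes from `withBase` and `window.location.origin`):

```typescript
// Stand-ins for the app's withBase helper and window.location.origin.
const withBase = (path: string) => `/mirror${path}`; // assumed deploy prefix
const appOrigin = "https://git.example.com";

// Same shape as the helper in the diff above: resolve the prefixed path
// against the origin, falling back to a relative path when there is no origin (SSR).
const buildAbsoluteAppUrl = (path: string) =>
  appOrigin ? new URL(withBase(path), appOrigin).toString() : withBase(path);
```

Because `withBase` runs first, the base-path prefix survives into the absolute callback and metadata URLs shown to the user, which is the point of this change.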


@@ -16,6 +16,7 @@ import { useLiveRefresh } from "@/hooks/useLiveRefresh";
import { usePageVisibility } from "@/hooks/usePageVisibility";
import { useConfigStatus } from "@/hooks/useConfigStatus";
import { useNavigation } from "@/components/layout/MainLayout";
import { withBase } from "@/lib/base-path";
// Helper function to format last sync time
function formatLastSyncTime(date: Date | null): string {
@@ -110,7 +111,7 @@ export function Dashboard() {
useEffectForToasts(() => {
if (!user?.id) return;
const eventSource = new EventSource(`/api/events?userId=${user.id}`);
const eventSource = new EventSource(`${withBase("/api/events")}?userId=${user.id}`);
eventSource.addEventListener("rate-limit", (event) => {
try {


@@ -3,6 +3,7 @@ import type { MirrorJob } from "@/lib/db/schema";
import { formatDate, getStatusColor } from "@/lib/utils";
import { Button } from "../ui/button";
import { Activity, Clock } from "lucide-react";
import { withBase } from "@/lib/base-path";
interface RecentActivityProps {
activities: MirrorJob[];
@@ -14,7 +15,7 @@ export function RecentActivity({ activities }: RecentActivityProps) {
<CardHeader className="flex flex-row items-center justify-between">
<CardTitle>Recent Activity</CardTitle>
<Button variant="outline" asChild>
<a href="/activity">View All</a>
<a href={withBase("/activity")}>View All</a>
</Button>
</CardHeader>
<CardContent>
@@ -27,7 +28,7 @@ export function RecentActivity({ activities }: RecentActivityProps) {
</p>
<div className="flex gap-2">
<Button variant="outline" size="sm" asChild>
<a href="/activity">
<a href={withBase("/activity")}>
<Activity className="h-3.5 w-3.5 mr-1.5" />
View History
</a>


@@ -4,7 +4,9 @@ import { GitFork } from "lucide-react";
import { SiGithub, SiGitea } from "react-icons/si";
import type { Repository } from "@/lib/db/schema";
import { getStatusColor } from "@/lib/utils";
import { buildGiteaWebUrl } from "@/lib/gitea-url";
import { useGiteaConfig } from "@/hooks/useGiteaConfig";
import { withBase } from "@/lib/base-path";
interface RepositoryListProps {
repositories: Repository[];
@@ -15,11 +17,6 @@ export function RepositoryList({ repositories }: RepositoryListProps) {
// Helper function to construct Gitea repository URL
const getGiteaRepoUrl = (repository: Repository): string | null => {
const rawBaseUrl = giteaConfig?.externalUrl || giteaConfig?.url;
if (!rawBaseUrl) {
return null;
}
// Only provide Gitea links for repositories that have been or are being mirrored
const validStatuses = ['mirroring', 'mirrored', 'syncing', 'synced'];
if (!validStatuses.includes(repository.status)) {
@@ -38,12 +35,7 @@ export function RepositoryList({ repositories }: RepositoryListProps) {
repoPath = `${owner}/${repository.name}`;
}
// Ensure the base URL doesn't have a trailing slash
const baseUrl = rawBaseUrl.endsWith("/")
? rawBaseUrl.slice(0, -1)
: rawBaseUrl;
return `${baseUrl}/${repoPath}`;
return buildGiteaWebUrl(giteaConfig, repoPath);
};
return (
@@ -51,7 +43,7 @@ export function RepositoryList({ repositories }: RepositoryListProps) {
<CardHeader className="flex flex-row items-center justify-between">
<CardTitle>Repositories</CardTitle>
<Button variant="outline" asChild>
<a href="/repositories">View All</a>
<a href={withBase("/repositories")}>View All</a>
</Button>
</CardHeader>
<CardContent>
@@ -63,7 +55,7 @@ export function RepositoryList({ repositories }: RepositoryListProps) {
Configure your GitHub connection to start mirroring repositories.
</p>
<Button asChild>
<a href="/config">Configure GitHub</a>
<a href={withBase("/config")}>Configure GitHub</a>
</Button>
</div>
) : (
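This hunk replaces the inline trailing-slash handling with a shared `buildGiteaWebUrl` helper. Its implementation is not part of this diff; judging from the inline code it replaces, it plausibly looks like the following sketch (the `GiteaConfigLike` type is assumed, not the app's real type):

```typescript
// Assumed config shape; mirrors the giteaConfig fields read above.
interface GiteaConfigLike {
  url?: string;          // internal Gitea URL
  externalUrl?: string;  // user-facing URL, preferred for web links
}

// Returns a web URL for a repo/org path, or null when no base URL is configured.
function buildGiteaWebUrl(config: GiteaConfigLike | null | undefined, path: string): string | null {
  const rawBaseUrl = config?.externalUrl || config?.url;
  if (!rawBaseUrl) return null;
  // Strip a trailing slash so the joined URL has exactly one separator.
  const baseUrl = rawBaseUrl.endsWith("/") ? rawBaseUrl.slice(0, -1) : rawBaseUrl;
  return `${baseUrl}/${path}`;
}
```

Centralizing this lets `RepositoryList` and `OrganizationList` drop their duplicated null checks and slash trimming, as the surrounding hunks show.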


@@ -14,6 +14,7 @@ import {
DropdownMenuItem,
DropdownMenuTrigger,
} from "@/components/ui/dropdown-menu";
import { withBase } from "@/lib/base-path";
interface HeaderProps {
currentPage?: "dashboard" | "repositories" | "organizations" | "configuration" | "activity-log";
@@ -101,14 +102,14 @@ export function Header({ currentPage, onNavigate, onMenuClick, onToggleCollapse,
<button
onClick={() => {
if (currentPage !== 'dashboard') {
window.history.pushState({}, '', '/');
window.history.pushState({}, '', withBase('/'));
onNavigate?.('dashboard');
}
}}
className="flex items-center gap-2 py-1 hover:opacity-80 transition-opacity"
>
<img
src="/logo.png"
src={withBase('/logo.png')}
alt="Gitea Mirror Logo"
className="h-5 w-6"
/>
@@ -163,7 +164,7 @@ export function Header({ currentPage, onNavigate, onMenuClick, onToggleCollapse,
</DropdownMenu>
) : (
<Button variant="outline" size="sm" asChild>
<a href="/login">Login</a>
<a href={withBase('/login')}>Login</a>
</Button>
)}
</div>


@@ -11,6 +11,7 @@ import { Toaster } from "@/components/ui/sonner";
import { useAuth } from "@/hooks/useAuth";
import { useRepoSync } from "@/hooks/useSyncRepo";
import { useConfigStatus } from "@/hooks/useConfigStatus";
import { stripBasePath, withBase } from "@/lib/base-path";
// Navigation context to signal when navigation happens
const NavigationContext = createContext<{ navigationKey: number }>({ navigationKey: 0 });
@@ -71,7 +72,7 @@ function AppWithProviders({ page: initialPage }: AppProps) {
// Handle browser back/forward navigation
useEffect(() => {
const handlePopState = () => {
const path = window.location.pathname;
const path = stripBasePath(window.location.pathname);
const pageMap: Record<string, AppProps['page']> = {
'/': 'dashboard',
'/repositories': 'repositories',
@@ -125,7 +126,7 @@ function AppWithProviders({ page: initialPage }: AppProps) {
if (!authLoading && !user) {
// Use window.location for client-side redirect
if (typeof window !== 'undefined') {
window.location.href = '/login';
window.location.href = withBase('/login');
}
return null;
}


@@ -9,6 +9,7 @@ import {
TooltipProvider,
TooltipTrigger,
} from "@/components/ui/tooltip";
import { stripBasePath, withBase } from "@/lib/base-path";
interface SidebarProps {
className?: string;
@@ -24,14 +25,14 @@ export function Sidebar({ className, onNavigate, isOpen, isCollapsed = false, on
useEffect(() => {
// Hydration happens here
const path = window.location.pathname;
const path = stripBasePath(window.location.pathname);
setCurrentPath(path);
}, []);
// Listen for URL changes (browser back/forward)
useEffect(() => {
const handlePopState = () => {
setCurrentPath(window.location.pathname);
setCurrentPath(stripBasePath(window.location.pathname));
};
window.addEventListener('popstate', handlePopState);
@@ -45,7 +46,7 @@ export function Sidebar({ className, onNavigate, isOpen, isCollapsed = false, on
if (currentPath === href) return;
// Update URL without page reload
window.history.pushState({}, '', href);
window.history.pushState({}, '', withBase(href));
setCurrentPath(href);
// Map href to page name for the parent component
@@ -163,7 +164,7 @@ export function Sidebar({ className, onNavigate, isOpen, isCollapsed = false, on
Check out the documentation for help with setup and configuration.
</p>
<a
href="/docs"
href={withBase("/docs")}
target="_blank"
rel="noopener noreferrer"
className="inline-flex items-center gap-1.5 text-xs md:text-xs text-primary hover:underline py-2 md:py-0"
@@ -177,7 +178,7 @@ export function Sidebar({ className, onNavigate, isOpen, isCollapsed = false, on
<Tooltip delayDuration={0}>
<TooltipTrigger asChild>
<a
href="/docs"
href={withBase("/docs")}
target="_blank"
rel="noopener noreferrer"
className={cn(


@@ -9,8 +9,10 @@ import type { FilterParams } from "@/types/filter";
import Fuse from "fuse.js";
import { Skeleton } from "@/components/ui/skeleton";
import { cn } from "@/lib/utils";
import { buildGiteaWebUrl } from "@/lib/gitea-url";
import { MirrorDestinationEditor } from "./MirrorDestinationEditor";
import { useGiteaConfig } from "@/hooks/useGiteaConfig";
import { withBase } from "@/lib/base-path";
import {
DropdownMenu,
DropdownMenuContent,
@@ -67,11 +69,6 @@ export function OrganizationList({
// Helper function to construct Gitea organization URL
const getGiteaOrgUrl = (organization: Organization): string | null => {
const rawBaseUrl = giteaConfig?.externalUrl || giteaConfig?.url;
if (!rawBaseUrl) {
return null;
}
// Only provide Gitea links for organizations that have been mirrored
const validStatuses = ['mirroring', 'mirrored'];
if (!validStatuses.includes(organization.status || '')) {
@@ -84,17 +81,12 @@ export function OrganizationList({
return null;
}
// Ensure the base URL doesn't have a trailing slash
const baseUrl = rawBaseUrl.endsWith("/")
? rawBaseUrl.slice(0, -1)
: rawBaseUrl;
return `${baseUrl}/${orgName}`;
return buildGiteaWebUrl(giteaConfig, orgName);
};
const handleUpdateDestination = async (orgId: string, newDestination: string | null) => {
// Call API to update organization destination
const response = await fetch(`/api/organizations/${orgId}`, {
const response = await fetch(`${withBase("/api/organizations")}/${orgId}`, {
method: "PATCH",
headers: {
"Content-Type": "application/json",
@@ -198,7 +190,7 @@ export function OrganizationList({
<div className="flex items-center gap-2 min-w-0">
<Building2 className="h-5 w-5 text-muted-foreground flex-shrink-0" />
<a
href={`/repositories?organization=${encodeURIComponent(org.name || '')}`}
href={`${withBase('/repositories')}?organization=${encodeURIComponent(org.name || '')}`}
className="font-medium hover:underline cursor-pointer truncate"
>
{org.name}
@@ -248,6 +240,11 @@ export function OrganizationList({
</div>
</div>
{/* Error message for failed orgs */}
{org.status === "failed" && org.errorMessage && (
<p className="text-xs text-destructive line-clamp-2">{org.errorMessage}</p>
)}
{/* Destination override section */}
<div>
<MirrorDestinationEditor
@@ -268,7 +265,7 @@ export function OrganizationList({
<div className="flex-1">
<div className="flex items-center gap-3 mb-1">
<a
href={`/repositories?organization=${encodeURIComponent(org.name || '')}`}
href={`${withBase('/repositories')}?organization=${encodeURIComponent(org.name || '')}`}
className="text-xl font-semibold hover:underline cursor-pointer"
>
{org.name}
@@ -304,6 +301,13 @@ export function OrganizationList({
/>
</div>
{/* Error message for failed orgs */}
{org.status === "failed" && org.errorMessage && (
<div className="mb-4 p-3 rounded-md bg-destructive/10 border border-destructive/20">
<p className="text-sm text-destructive">{org.errorMessage}</p>
</div>
)}
{/* Repository statistics */}
<div className="mb-4">
<div className="flex items-center gap-4 text-sm">
@@ -313,7 +317,7 @@ export function OrganizationList({
{org.repositoryCount === 1 ? "repository" : "repositories"}
</span>
</div>
{/* Repository breakdown - only show non-zero counts */}
{(() => {
const counts = [];
@@ -326,7 +330,7 @@ export function OrganizationList({
if (org.forkRepositoryCount && org.forkRepositoryCount > 0) {
counts.push(`${org.forkRepositoryCount} ${org.forkRepositoryCount === 1 ? 'fork' : 'forks'}`);
}
return counts.length > 0 ? (
<div className="flex items-center gap-3 text-xs text-muted-foreground">
{counts.map((count, index) => (
@@ -415,7 +419,7 @@ export function OrganizationList({
)}
</>
)}
{/* Dropdown menu for additional actions */}
{org.status !== "mirroring" && (
<DropdownMenu>
@@ -426,7 +430,7 @@ export function OrganizationList({
</DropdownMenuTrigger>
<DropdownMenuContent align="end">
{org.status !== "ignored" && (
<DropdownMenuItem
<DropdownMenuItem
onClick={() => org.id && onIgnore && onIgnore({ orgId: org.id, ignore: true })}
>
<Ban className="h-4 w-4 mr-2" />
@@ -449,7 +453,7 @@ export function OrganizationList({
</DropdownMenu>
)}
</div>
<div className="flex items-center gap-2 justify-center">
{(() => {
const giteaUrl = getGiteaOrgUrl(org);


@@ -18,10 +18,12 @@ interface AddRepositoryDialogProps {
repo,
owner,
force,
destinationOrg,
}: {
repo: string;
owner: string;
force?: boolean;
destinationOrg?: string;
}) => Promise<void>;
}
@@ -32,6 +34,7 @@ export default function AddRepositoryDialog({
}: AddRepositoryDialogProps) {
const [repo, setRepo] = useState<string>("");
const [owner, setOwner] = useState<string>("");
const [destinationOrg, setDestinationOrg] = useState<string>("");
const [isLoading, setIsLoading] = useState<boolean>(false);
const [error, setError] = useState<string>("");
@@ -40,6 +43,7 @@ export default function AddRepositoryDialog({
setError("");
setRepo("");
setOwner("");
setDestinationOrg("");
}
}, [isDialogOpen]);
@@ -54,11 +58,16 @@ export default function AddRepositoryDialog({
try {
setIsLoading(true);
await onAddRepository({ repo, owner });
await onAddRepository({
repo,
owner,
destinationOrg: destinationOrg.trim() || undefined,
});
setError("");
setRepo("");
setOwner("");
setDestinationOrg("");
setIsDialogOpen(false);
} catch (err: any) {
setError(err?.message || "Failed to add repository.");
@@ -124,6 +133,27 @@ export default function AddRepositoryDialog({
/>
</div>
<div>
<label
htmlFor="destinationOrg"
className="block text-sm font-medium mb-1.5"
>
Target Organization{" "}
<span className="text-muted-foreground font-normal">
(optional)
</span>
</label>
<input
id="destinationOrg"
type="text"
value={destinationOrg}
onChange={(e) => setDestinationOrg(e.target.value)}
className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring"
placeholder="Gitea org or user (uses default strategy if empty)"
autoComplete="off"
/>
</div>
{error && <p className="text-sm text-red-500 mt-1">{error}</p>}
</div>


@@ -50,19 +50,30 @@ import AddRepositoryDialog from "./AddRepositoryDialog";
import { useLiveRefresh } from "@/hooks/useLiveRefresh";
import { useConfigStatus } from "@/hooks/useConfigStatus";
import { useNavigation } from "@/components/layout/MainLayout";
import { withBase } from "@/lib/base-path";
const REPOSITORY_SORT_OPTIONS = [
{ value: "imported-desc", label: "Recently Imported" },
{ value: "imported-asc", label: "Oldest Imported" },
{ value: "updated-desc", label: "Recently Updated" },
{ value: "updated-asc", label: "Oldest Updated" },
{ value: "name-asc", label: "Name (A-Z)" },
{ value: "name-desc", label: "Name (Z-A)" },
] as const;
export default function Repository() {
const [repositories, setRepositories] = useState<Repository[]>([]);
const [isInitialLoading, setIsInitialLoading] = useState(true);
const { user } = useAuth();
const { registerRefreshCallback, isLiveEnabled } = useLiveRefresh();
const { isGitHubConfigured, isFullyConfigured } = useConfigStatus();
const { isGitHubConfigured, isFullyConfigured, autoMirrorStarred, githubOwner } = useConfigStatus();
const { navigationKey } = useNavigation();
const { filter, setFilter } = useFilterParams({
searchTerm: "",
status: "",
organization: "",
owner: "",
sort: "imported-desc",
});
const [isDialogOpen, setIsDialogOpen] = useState<boolean>(false);
const [selectedRepoIds, setSelectedRepoIds] = useState<Set<string>>(new Set());
@@ -233,10 +244,12 @@ export default function Repository() {
// Filter out repositories that are already mirroring, mirrored, or ignored
const eligibleRepos = repositories.filter(
(repo) =>
repo.status !== "mirroring" &&
repo.status !== "mirrored" &&
repo.status !== "mirroring" &&
repo.status !== "mirrored" &&
repo.status !== "ignored" && // Skip ignored repositories
repo.id
repo.id &&
// Skip starred repos from other owners when autoMirrorStarred is disabled
!(repo.isStarred && !autoMirrorStarred && repo.owner !== githubOwner)
);
if (eligibleRepos.length === 0) {
@@ -292,7 +305,7 @@ export default function Repository() {
const selectedRepos = repositories.filter(repo => repo.id && selectedRepoIds.has(repo.id));
const eligibleRepos = selectedRepos.filter(
repo => repo.status === "imported" || repo.status === "failed"
repo => repo.status === "imported" || repo.status === "failed" || repo.status === "pending-approval"
);
if (eligibleRepos.length === 0) {
@@ -301,7 +314,7 @@ export default function Repository() {
}
const repoIds = eligibleRepos.map(repo => repo.id as string);
setLoadingRepoIds(prev => {
const newSet = new Set(prev);
repoIds.forEach(id => newSet.add(id));
@@ -694,14 +707,90 @@ export default function Repository() {
}
};
const handleApproveSyncAction = async ({ repoId }: { repoId: string }) => {
try {
if (!user || !user.id) return;
setLoadingRepoIds((prev) => new Set(prev).add(repoId));
const response = await apiRequest<{
success: boolean;
message?: string;
error?: string;
repositories: Repository[];
}>("/job/approve-sync", {
method: "POST",
data: { repositoryIds: [repoId], action: "approve" },
});
if (response.success) {
toast.success("Sync approved — backup + sync started");
setRepositories((prevRepos) =>
prevRepos.map((repo) => {
const updated = response.repositories.find((r) => r.id === repo.id);
return updated ? updated : repo;
}),
);
} else {
showErrorToast(response.error || "Error approving sync", toast);
}
} catch (error) {
showErrorToast(error, toast);
} finally {
setLoadingRepoIds((prev) => {
const newSet = new Set(prev);
newSet.delete(repoId);
return newSet;
});
}
};
const handleDismissSyncAction = async ({ repoId }: { repoId: string }) => {
try {
if (!user || !user.id) return;
setLoadingRepoIds((prev) => new Set(prev).add(repoId));
const response = await apiRequest<{
success: boolean;
message?: string;
error?: string;
repositories: Repository[];
}>("/job/approve-sync", {
method: "POST",
data: { repositoryIds: [repoId], action: "dismiss" },
});
if (response.success) {
toast.success("Force-push alert dismissed");
setRepositories((prevRepos) =>
prevRepos.map((repo) => {
const updated = response.repositories.find((r) => r.id === repo.id);
return updated ? updated : repo;
}),
);
} else {
showErrorToast(response.error || "Error dismissing alert", toast);
}
} catch (error) {
showErrorToast(error, toast);
} finally {
setLoadingRepoIds((prev) => {
const newSet = new Set(prev);
newSet.delete(repoId);
return newSet;
});
}
};
const handleAddRepository = async ({
repo,
owner,
force = false,
destinationOrg,
}: {
repo: string;
owner: string;
force?: boolean;
destinationOrg?: string;
}) => {
if (!user || !user.id) {
return;
@@ -736,6 +825,7 @@ export default function Repository() {
repo: trimmedRepo,
owner: trimmedOwner,
force,
...(destinationOrg ? { destinationOrg } : {}),
};
const response = await apiRequest<AddRepositoriesApiResponse>(
@@ -860,7 +950,7 @@ export default function Repository() {
const actions = [];
// Check if any selected repos can be mirrored
if (selectedRepos.some(repo => repo.status === "imported" || repo.status === "failed")) {
if (selectedRepos.some(repo => repo.status === "imported" || repo.status === "failed" || repo.status === "pending-approval")) {
actions.push('mirror');
}
@@ -898,7 +988,7 @@ export default function Repository() {
const selectedRepos = repositories.filter(repo => repo.id && selectedRepoIds.has(repo.id));
return {
mirror: selectedRepos.filter(repo => repo.status === "imported" || repo.status === "failed").length,
mirror: selectedRepos.filter(repo => repo.status === "imported" || repo.status === "failed" || repo.status === "pending-approval").length,
sync: selectedRepos.filter(repo => repo.status === "mirrored" || repo.status === "synced").length,
rerunMetadata: selectedRepos.filter(repo => ["mirrored", "synced", "archived"].includes(repo.status)).length,
retry: selectedRepos.filter(repo => repo.status === "failed").length,
@@ -920,6 +1010,7 @@ export default function Repository() {
status: "",
organization: "",
owner: "",
sort: filter.sort || "imported-desc",
});
};
@@ -1060,6 +1151,33 @@ export default function Repository() {
</SelectContent>
</Select>
</div>
{/* Sort Filter */}
<div className="space-y-2">
<label className="text-sm font-medium flex items-center gap-2">
<span className="text-muted-foreground">Sort</span>
</label>
<Select
value={filter.sort || "imported-desc"}
onValueChange={(value) =>
setFilter((prev) => ({
...prev,
sort: value,
}))
}
>
<SelectTrigger className="w-full h-10">
<SelectValue placeholder="Sort repositories" />
</SelectTrigger>
<SelectContent>
{REPOSITORY_SORT_OPTIONS.map((option) => (
<SelectItem key={option.value} value={option.value}>
{option.label}
</SelectItem>
))}
</SelectContent>
</Select>
</div>
</div>
<DrawerFooter className="gap-2 px-4 pt-2 pb-4 border-t">
@@ -1162,6 +1280,27 @@ export default function Repository() {
</SelectContent>
</Select>
<Select
value={filter.sort || "imported-desc"}
onValueChange={(value) =>
setFilter((prev) => ({
...prev,
sort: value,
}))
}
>
<SelectTrigger className="w-[190px] h-10">
<SelectValue placeholder="Sort repositories" />
</SelectTrigger>
<SelectContent>
{REPOSITORY_SORT_OPTIONS.map((option) => (
<SelectItem key={option.value} value={option.value}>
{option.label}
</SelectItem>
))}
</SelectContent>
</Select>
<Button
variant="outline"
size="icon"
@@ -1380,7 +1519,7 @@ export default function Repository() {
<Button
variant="default"
onClick={() => {
window.history.pushState({}, '', '/config');
window.history.pushState({}, '', withBase('/config'));
// We need to trigger a page change event for the navigation system
window.dispatchEvent(new PopStateEvent('popstate'));
}}
@@ -1406,6 +1545,8 @@ export default function Repository() {
await fetchRepositories(false);
}}
onDelete={handleRequestDeleteRepository}
onApproveSync={handleApproveSyncAction}
onDismissSync={handleDismissSyncAction}
/>
)}

View File

@@ -1,11 +1,20 @@
import { useMemo, useRef } from "react";
import Fuse from "fuse.js";
import {
getCoreRowModel,
getFilteredRowModel,
getSortedRowModel,
useReactTable,
type ColumnDef,
type ColumnFiltersState,
type SortingState,
} from "@tanstack/react-table";
import { useVirtualizer } from "@tanstack/react-virtual";
import { FlipHorizontal, GitFork, RefreshCw, RotateCcw, Star, Lock, Ban, Check, ChevronDown, Trash2 } from "lucide-react";
import { FlipHorizontal, GitFork, RefreshCw, RotateCcw, Star, Lock, Ban, Check, ChevronDown, Trash2, X } from "lucide-react";
import { SiGithub, SiGitea } from "react-icons/si";
import type { Repository } from "@/lib/db/schema";
import { Button } from "@/components/ui/button";
import { formatDate, formatLastSyncTime, getStatusColor } from "@/lib/utils";
import { formatLastSyncTime } from "@/lib/utils";
import { buildGiteaWebUrl } from "@/lib/gitea-url";
import type { FilterParams } from "@/types/filter";
import { Skeleton } from "@/components/ui/skeleton";
import { useGiteaConfig } from "@/hooks/useGiteaConfig";
@@ -19,6 +28,7 @@ import {
import { InlineDestinationEditor } from "./InlineDestinationEditor";
import { Card, CardContent } from "@/components/ui/card";
import { Badge } from "@/components/ui/badge";
import { withBase } from "@/lib/base-path";
import {
DropdownMenu,
DropdownMenuContent,
@@ -42,6 +52,32 @@ interface RepositoryTableProps {
onSelectionChange: (selectedIds: Set<string>) => void;
onRefresh?: () => Promise<void>;
onDelete?: (repoId: string) => void;
onApproveSync?: ({ repoId }: { repoId: string }) => Promise<void>;
onDismissSync?: ({ repoId }: { repoId: string }) => Promise<void>;
}
function getTimestamp(value: Date | string | null | undefined): number {
if (!value) return 0;
const timestamp = new Date(value).getTime();
return Number.isNaN(timestamp) ? 0 : timestamp;
}
function getTableSorting(sortOrder: string | undefined): SortingState {
switch (sortOrder ?? "imported-desc") {
case "imported-asc":
return [{ id: "importedAt", desc: false }];
case "updated-desc":
return [{ id: "updatedAt", desc: true }];
case "updated-asc":
return [{ id: "updatedAt", desc: false }];
case "name-asc":
return [{ id: "fullName", desc: false }];
case "name-desc":
return [{ id: "fullName", desc: true }];
case "imported-desc":
default:
return [{ id: "importedAt", desc: true }];
}
}
export default function RepositoryTable({
@@ -59,13 +95,15 @@ export default function RepositoryTable({
onSelectionChange,
onRefresh,
onDelete,
onApproveSync,
onDismissSync,
}: RepositoryTableProps) {
const tableParentRef = useRef<HTMLDivElement>(null);
const { giteaConfig } = useGiteaConfig();
const handleUpdateDestination = async (repoId: string, newDestination: string | null) => {
// Call API to update repository destination
const response = await fetch(`/api/repositories/${repoId}`, {
const response = await fetch(`${withBase("/api/repositories")}/${repoId}`, {
method: "PATCH",
headers: {
"Content-Type": "application/json",
@@ -88,10 +126,6 @@ export default function RepositoryTable({
// Helper function to construct Gitea repository URL
const getGiteaRepoUrl = (repository: Repository): string | null => {
if (!giteaConfig?.url) {
return null;
}
// Only provide Gitea links for repositories that have been or are being mirrored
const validStatuses = ['mirroring', 'mirrored', 'syncing', 'synced', 'archived'];
if (!validStatuses.includes(repository.status)) {
@@ -108,48 +142,92 @@ export default function RepositoryTable({
repoPath = `${owner}/${repository.name}`;
}
// Ensure the base URL doesn't have a trailing slash
const baseUrl = giteaConfig.url.endsWith('/')
? giteaConfig.url.slice(0, -1)
: giteaConfig.url;
return `${baseUrl}/${repoPath}`;
return buildGiteaWebUrl(giteaConfig, repoPath);
};
const hasAnyFilter = Object.values(filter).some(
(val) => val?.toString().trim() !== ""
);
const hasAnyFilter = [
filter.searchTerm,
filter.status,
filter.owner,
filter.organization,
].some((val) => val?.toString().trim() !== "");
const filteredRepositories = useMemo(() => {
let result = repositories;
const columnFilters = useMemo<ColumnFiltersState>(() => {
const next: ColumnFiltersState = [];
if (filter.status) {
result = result.filter((repo) => repo.status === filter.status);
next.push({ id: "status", value: filter.status });
}
if (filter.owner) {
result = result.filter((repo) => repo.owner === filter.owner);
next.push({ id: "owner", value: filter.owner });
}
if (filter.organization) {
result = result.filter(
(repo) => repo.organization === filter.organization
);
next.push({ id: "organization", value: filter.organization });
}
if (filter.searchTerm) {
const fuse = new Fuse(result, {
keys: ["name", "fullName", "owner", "organization"],
threshold: 0.3,
});
result = fuse.search(filter.searchTerm).map((res) => res.item);
}
return next;
}, [filter.status, filter.owner, filter.organization]);
return result;
}, [repositories, filter]);
const sorting = useMemo(() => getTableSorting(filter.sort), [filter.sort]);
const columns = useMemo<ColumnDef<Repository>[]>(
() => [
{
id: "fullName",
accessorFn: (row) => row.fullName,
},
{
id: "owner",
accessorFn: (row) => row.owner,
filterFn: "equalsString",
},
{
id: "organization",
accessorFn: (row) => row.organization ?? "",
filterFn: "equalsString",
},
{
id: "status",
accessorFn: (row) => row.status,
filterFn: "equalsString",
},
{
id: "importedAt",
accessorFn: (row) => getTimestamp(row.importedAt),
enableGlobalFilter: false,
enableColumnFilter: false,
},
{
id: "updatedAt",
accessorFn: (row) => getTimestamp(row.updatedAt),
enableGlobalFilter: false,
enableColumnFilter: false,
},
],
[]
);
const table = useReactTable({
data: repositories,
columns,
state: {
globalFilter: filter.searchTerm ?? "",
columnFilters,
sorting,
},
getCoreRowModel: getCoreRowModel(),
getFilteredRowModel: getFilteredRowModel(),
getSortedRowModel: getSortedRowModel(),
});
const visibleRepositories = table
.getRowModel()
.rows.map((row) => row.original);
const rowVirtualizer = useVirtualizer({
count: filteredRepositories.length,
count: visibleRepositories.length,
getScrollElement: () => tableParentRef.current,
estimateSize: () => 65,
overscan: 5,
@@ -158,7 +236,11 @@ export default function RepositoryTable({
// Selection handlers
const handleSelectAll = (checked: boolean) => {
if (checked) {
const allIds = new Set(filteredRepositories.map(repo => repo.id).filter((id): id is string => !!id));
const allIds = new Set(
visibleRepositories
.map((repo) => repo.id)
.filter((id): id is string => !!id)
);
onSelectionChange(allIds);
} else {
onSelectionChange(new Set());
@@ -175,8 +257,9 @@ export default function RepositoryTable({
onSelectionChange(newSelection);
};
const isAllSelected = filteredRepositories.length > 0 &&
filteredRepositories.every(repo => repo.id && selectedRepoIds.has(repo.id));
const isAllSelected =
visibleRepositories.length > 0 &&
visibleRepositories.every((repo) => repo.id && selectedRepoIds.has(repo.id));
const isPartiallySelected = selectedRepoIds.size > 0 && !isAllSelected;
// Mobile card layout for repository
@@ -231,7 +314,7 @@ export default function RepositoryTable({
{/* Status & Last Mirrored */}
<div className="flex items-center justify-between">
<Badge
<Badge
className={`capitalize
${repo.status === 'imported' ? 'bg-yellow-500/10 text-yellow-600 hover:bg-yellow-500/20 dark:text-yellow-400' :
repo.status === 'mirrored' || repo.status === 'synced' ? 'bg-green-500/10 text-green-600 hover:bg-green-500/20 dark:text-green-400' :
@@ -239,13 +322,14 @@ export default function RepositoryTable({
repo.status === 'failed' ? 'bg-red-500/10 text-red-600 hover:bg-red-500/20 dark:text-red-400' :
repo.status === 'ignored' ? 'bg-gray-500/10 text-gray-600 hover:bg-gray-500/20 dark:text-gray-400' :
repo.status === 'skipped' ? 'bg-orange-500/10 text-orange-600 hover:bg-orange-500/20 dark:text-orange-400' :
repo.status === 'pending-approval' ? 'bg-amber-500/10 text-amber-600 hover:bg-amber-500/20 dark:text-amber-400' :
'bg-muted hover:bg-muted/80'}`}
variant="secondary"
>
{repo.status}
</Badge>
<span className="text-xs text-muted-foreground">
{formatLastSyncTime(repo.lastMirrored)}
{formatLastSyncTime(repo.lastMirrored ?? null)}
</span>
</div>
</div>
@@ -316,7 +400,40 @@ export default function RepositoryTable({
)}
</Button>
)}
{repo.status === "pending-approval" && (
<div className="flex gap-2 w-full">
<Button
size="default"
variant="default"
onClick={() => repo.id && onApproveSync?.({ repoId: repo.id })}
disabled={isLoading}
className="flex-1 h-10"
>
{isLoading ? (
<>
<Check className="h-4 w-4 mr-2 animate-spin" />
Approving...
</>
) : (
<>
<Check className="h-4 w-4 mr-2" />
Approve Sync
</>
)}
</Button>
<Button
size="default"
variant="outline"
onClick={() => repo.id && onDismissSync?.({ repoId: repo.id })}
disabled={isLoading}
className="flex-1 h-10"
>
<X className="h-4 w-4 mr-2" />
Dismiss
</Button>
</div>
)}
{/* Ignore/Include button */}
{repo.status === "ignored" ? (
<Button
@@ -341,7 +458,7 @@ export default function RepositoryTable({
Ignore Repository
</Button>
)}
{/* External links */}
<div className="flex gap-2">
<Button variant="outline" size="default" className="flex-1 h-10 min-w-0" asChild>
@@ -472,7 +589,7 @@ export default function RepositoryTable({
{hasAnyFilter && (
<div className="mb-4 flex items-center gap-2">
<span className="text-sm text-muted-foreground">
Showing {filteredRepositories.length} of {repositories.length} repositories
Showing {visibleRepositories.length} of {repositories.length} repositories
</span>
<Button
variant="ghost"
@@ -483,6 +600,7 @@ export default function RepositoryTable({
status: "",
organization: "",
owner: "",
sort: filter.sort || "imported-desc",
})
}
>
@@ -491,7 +609,7 @@ export default function RepositoryTable({
</div>
)}
{filteredRepositories.length === 0 ? (
{visibleRepositories.length === 0 ? (
<div className="text-center py-8">
<p className="text-muted-foreground">
{hasAnyFilter
@@ -512,12 +630,12 @@ export default function RepositoryTable({
className="h-5 w-5"
/>
<span className="text-sm font-medium">
Select All ({filteredRepositories.length})
Select All ({visibleRepositories.length})
</span>
</div>
{/* Repository cards */}
{filteredRepositories.map((repo) => (
{visibleRepositories.map((repo) => (
<RepositoryCard key={repo.id} repo={repo} />
))}
</div>
@@ -563,13 +681,14 @@ export default function RepositoryTable({
position: "relative",
}}
>
{rowVirtualizer.getVirtualItems().map((virtualRow, index) => {
const repo = filteredRepositories[virtualRow.index];
{rowVirtualizer.getVirtualItems().map((virtualRow) => {
const repo = visibleRepositories[virtualRow.index];
if (!repo) return null;
const isLoading = loadingRepoIds.has(repo.id ?? "");
return (
<div
key={index}
key={virtualRow.key}
ref={rowVirtualizer.measureElement}
style={{
position: "absolute",
@@ -632,7 +751,7 @@ export default function RepositoryTable({
{/* Last Mirrored */}
<div className="h-full p-3 flex items-center flex-[1]">
<p className="text-sm">
{formatLastSyncTime(repo.lastMirrored)}
{formatLastSyncTime(repo.lastMirrored ?? null)}
</p>
</div>
@@ -642,7 +761,7 @@ export default function RepositoryTable({
<TooltipProvider>
<Tooltip>
<TooltipTrigger asChild>
<Badge
<Badge
variant="destructive"
className="cursor-help capitalize"
>
@@ -655,7 +774,7 @@ export default function RepositoryTable({
</Tooltip>
</TooltipProvider>
) : (
<Badge
<Badge
className={`capitalize
${repo.status === 'imported' ? 'bg-yellow-500/10 text-yellow-600 hover:bg-yellow-500/20 dark:text-yellow-400' :
repo.status === 'mirrored' || repo.status === 'synced' ? 'bg-green-500/10 text-green-600 hover:bg-green-500/20 dark:text-green-400' :
@@ -663,6 +782,7 @@ export default function RepositoryTable({
repo.status === 'failed' ? 'bg-red-500/10 text-red-600 hover:bg-red-500/20 dark:text-red-400' :
repo.status === 'ignored' ? 'bg-gray-500/10 text-gray-600 hover:bg-gray-500/20 dark:text-gray-400' :
repo.status === 'skipped' ? 'bg-orange-500/10 text-orange-600 hover:bg-orange-500/20 dark:text-orange-400' :
repo.status === 'pending-approval' ? 'bg-amber-500/10 text-amber-600 hover:bg-amber-500/20 dark:text-amber-400' :
'bg-muted hover:bg-muted/80'}`}
variant="secondary"
>
@@ -680,6 +800,8 @@ export default function RepositoryTable({
onRetry={() => onRetry({ repoId: repo.id ?? "" })}
onSkip={(skip) => onSkip({ repoId: repo.id ?? "", skip })}
onDelete={onDelete && repo.id ? () => onDelete(repo.id as string) : undefined}
onApproveSync={onApproveSync ? () => onApproveSync({ repoId: repo.id ?? "" }) : undefined}
onDismissSync={onDismissSync ? () => onDismissSync({ repoId: repo.id ?? "" }) : undefined}
/>
</div>
{/* Links */}
@@ -743,7 +865,7 @@ export default function RepositoryTable({
<div className={`h-1.5 w-1.5 rounded-full ${isLiveActive ? 'bg-emerald-500' : 'bg-primary'}`} />
<span className="text-sm font-medium text-foreground">
{hasAnyFilter
? `Showing ${filteredRepositories.length} of ${repositories.length} repositories`
? `Showing ${visibleRepositories.length} of ${repositories.length} repositories`
: `${repositories.length} ${repositories.length === 1 ? 'repository' : 'repositories'} total`}
</span>
</div>
@@ -791,6 +913,8 @@ function RepoActionButton({
onRetry,
onSkip,
onDelete,
onApproveSync,
onDismissSync,
}: {
repo: { id: string; status: string };
isLoading: boolean;
@@ -799,7 +923,36 @@ function RepoActionButton({
onRetry: () => void;
onSkip: (skip: boolean) => void;
onDelete?: () => void;
onApproveSync?: () => void;
onDismissSync?: () => void;
}) {
// For pending-approval repos, show approve/dismiss actions
if (repo.status === "pending-approval") {
return (
<div className="flex gap-1">
<Button
variant="default"
size="sm"
disabled={isLoading}
onClick={onApproveSync}
className="min-w-[70px]"
>
<Check className="h-4 w-4 mr-1" />
Approve
</Button>
<Button
variant="outline"
size="sm"
disabled={isLoading}
onClick={onDismissSync}
>
<X className="h-4 w-4 mr-1" />
Dismiss
</Button>
</div>
);
}
// For ignored repos, show an "Include" action
if (repo.status === "ignored") {
return (

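The sort plumbing added in this file is small enough to exercise standalone. The sketch below mirrors the `getTimestamp` and `getTableSorting` helpers from the diff, with TanStack Table's `SortingState` type inlined so the snippet is self-contained (the real component imports it from `@tanstack/react-table`):

```typescript
// Inlined stand-in for @tanstack/react-table's SortingState type.
type SortingState = { id: string; desc: boolean }[];

// Invalid or missing dates sort to 0 (the epoch) instead of producing NaN,
// which would make comparisons unstable.
function getTimestamp(value: Date | string | null | undefined): number {
  if (!value) return 0;
  const timestamp = new Date(value).getTime();
  return Number.isNaN(timestamp) ? 0 : timestamp;
}

// Maps the filter drawer's sort value onto table sorting state;
// unknown or missing values fall back to newest-imported-first.
function getTableSorting(sortOrder: string | undefined): SortingState {
  switch (sortOrder ?? "imported-desc") {
    case "imported-asc":
      return [{ id: "importedAt", desc: false }];
    case "updated-desc":
      return [{ id: "updatedAt", desc: true }];
    case "updated-asc":
      return [{ id: "updatedAt", desc: false }];
    case "name-asc":
      return [{ id: "fullName", desc: false }];
    case "name-desc":
      return [{ id: "fullName", desc: true }];
    case "imported-desc":
    default:
      return [{ id: "importedAt", desc: true }];
  }
}

console.log(getTableSorting("name-asc")); // [{ id: "fullName", desc: false }]
console.log(getTimestamp("not-a-date")); // 0
```

The fallback in both helpers is what keeps the virtualized table stable when a persisted filter contains a stale sort value or a repository row has a malformed date.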
View File

@@ -1,4 +0,0 @@
import { defineCollection, z } from 'astro:content';
// Export empty collections since docs have been moved
export const collections = {};

View File

@@ -8,6 +8,7 @@ import {
} from "react";
import { authApi } from "@/lib/api";
import type { ExtendedUser } from "@/types/user";
import { withBase } from "@/lib/base-path";
interface AuthContextType {
user: ExtendedUser | null;
@@ -61,9 +62,9 @@ export function AuthProvider({ children }: { children: React.ReactNode }) {
// Redirect user based on error
if (err?.message === "No users found") {
window.location.href = "/signup";
window.location.href = withBase("/signup");
} else {
window.location.href = "/login";
window.location.href = withBase("/login");
}
console.error("Auth check failed", err);
} finally {
@@ -111,7 +112,7 @@ export function AuthProvider({ children }: { children: React.ReactNode }) {
try {
await authApi.logout();
setUser(null);
window.location.href = "/login";
window.location.href = withBase("/login");
} catch (err) {
console.error("Logout error:", err);
} finally {

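The redirects above route through a `withBase` helper whose module (`@/lib/base-path`) is not shown in this compare view. A plausible sketch, assuming the deploy-time prefix is available as a single constant (the real module may read it from `import.meta.env.BASE_URL` or a `BASE_URL` env var instead; `BASE_PATH` and its value here are illustrative):

```typescript
// Hypothetical deploy-time prefix; "" when the app is served from the root.
const BASE_PATH = "/mirror";

// Joins the configured prefix with an app-relative path, normalising
// slashes so withBase("/login") and withBase("login") agree, and so a
// root deployment (BASE_PATH === "") is a no-op.
function withBase(path: string): string {
  const prefix = BASE_PATH.endsWith("/") ? BASE_PATH.slice(0, -1) : BASE_PATH;
  const suffix = path.startsWith("/") ? path : `/${path}`;
  return `${prefix}${suffix}`;
}

console.log(withBase("/login")); // "/mirror/login"
```

Routing every hard-coded `window.location.href`, `fetch`, and `EventSource` URL through one helper like this is what makes the reverse-proxy path-prefix support in this changeset a single-point configuration rather than a scattered find-and-replace.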
View File

@@ -8,6 +8,7 @@ import {
} from "react";
import { authClient, useSession as useBetterAuthSession } from "@/lib/auth-client";
import type { Session, AuthUser } from "@/lib/auth-client";
import { withBase } from "@/lib/base-path";
interface AuthContextType {
user: AuthUser | null;
@@ -46,7 +47,7 @@ export function AuthProvider({ children }: { children: React.ReactNode }) {
const result = await authClient.signIn.email({
email,
password,
callbackURL: "/",
callbackURL: withBase("/"),
});
if (result.error) {
@@ -73,7 +74,7 @@ export function AuthProvider({ children }: { children: React.ReactNode }) {
email,
password,
name: username, // Better Auth uses 'name' field for display name
callbackURL: "/",
callbackURL: withBase("/"),
});
if (result.error) {
@@ -94,7 +95,7 @@ export function AuthProvider({ children }: { children: React.ReactNode }) {
await authClient.signOut({
fetchOptions: {
onSuccess: () => {
window.location.href = "/login";
window.location.href = withBase("/login");
},
},
});
@@ -140,4 +141,4 @@ export function useAuth() {
}
// Export the Better Auth session hook for direct use when needed
export { useBetterAuthSession };
export { useBetterAuthSession };

View File

@@ -9,6 +9,8 @@ interface ConfigStatus {
isFullyConfigured: boolean;
isLoading: boolean;
error: string | null;
autoMirrorStarred: boolean;
githubOwner: string;
}
// Cache to prevent duplicate API calls across components
@@ -33,6 +35,8 @@ export function useConfigStatus(): ConfigStatus {
isFullyConfigured: false,
isLoading: true,
error: null,
autoMirrorStarred: false,
githubOwner: '',
});
// Track if this hook has already checked config to prevent multiple calls
@@ -46,6 +50,8 @@ export function useConfigStatus(): ConfigStatus {
isFullyConfigured: false,
isLoading: false,
error: 'No user found',
autoMirrorStarred: false,
githubOwner: '',
});
return;
}
@@ -78,6 +84,8 @@ export function useConfigStatus(): ConfigStatus {
isFullyConfigured,
isLoading: false,
error: null,
autoMirrorStarred: configResponse?.advancedOptions?.autoMirrorStarred ?? false,
githubOwner: configResponse?.githubConfig?.username ?? '',
});
return;
}
@@ -119,6 +127,8 @@ export function useConfigStatus(): ConfigStatus {
isFullyConfigured,
isLoading: false,
error: null,
autoMirrorStarred: configResponse?.advancedOptions?.autoMirrorStarred ?? false,
githubOwner: configResponse?.githubConfig?.username ?? '',
});
hasCheckedRef.current = true;
@@ -129,6 +139,8 @@ export function useConfigStatus(): ConfigStatus {
isFullyConfigured: false,
isLoading: false,
error: error instanceof Error ? error.message : 'Failed to check configuration',
autoMirrorStarred: false,
githubOwner: '',
});
hasCheckedRef.current = true;
}

View File

@@ -7,6 +7,7 @@ const FILTER_KEYS: (keyof FilterParams)[] = [
"membershipRole",
"owner",
"organization",
"sort",
"type",
"name",
];

View File

@@ -1,5 +1,6 @@
import { useEffect, useState, useRef, useCallback } from "react";
import type { MirrorJob } from "@/lib/db/schema";
import { withBase } from "@/lib/base-path";
interface UseSSEOptions {
userId?: string;
@@ -41,7 +42,7 @@ export const useSSE = ({
}
// Create new EventSource connection
const eventSource = new EventSource(`/api/sse?userId=${userId}`);
const eventSource = new EventSource(`${withBase("/api/sse")}?userId=${userId}`);
eventSourceRef.current = eventSource;
const handleMessage = (event: MessageEvent) => {

View File

@@ -1,5 +1,6 @@
import { useEffect, useRef } from "react";
import { useAuth } from "./useAuth";
import { withBase } from "@/lib/base-path";
interface UseRepoSyncOptions {
userId?: string;
@@ -51,7 +52,7 @@ export function useRepoSync({
const sync = async () => {
try {
const response = await fetch("/api/job/schedule-sync-repo", {
const response = await fetch(withBase("/api/job/schedule-sync-repo"), {
method: "POST",
headers: {
"Content-Type": "application/json",

View File

@@ -2,6 +2,7 @@
import '../styles/global.css';
import '../styles/docs.css';
import ThemeScript from '@/components/theme/ThemeScript.astro';
import { withBase } from '@/lib/base-path';
// Accept title as a prop with a default value
const { title = 'Gitea Mirror' } = Astro.props;
@@ -11,7 +12,7 @@ const { title = 'Gitea Mirror' } = Astro.props;
<head>
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width" />
<link rel="icon" type="image/svg+xml" href="/favicon.svg" />
<link rel="icon" type="image/svg+xml" href={withBase('/favicon.svg')} />
<title>{title}</title>
<ThemeScript />
</head>

View File

@@ -1,5 +1,7 @@
import { withBase } from "@/lib/base-path";
// Base API URL
const API_BASE = "/api";
const API_BASE = withBase("/api");
// Helper function for API requests
async function apiRequest<T>(
@@ -78,6 +80,10 @@ export const githubApi = {
method: "POST",
body: JSON.stringify({ token }),
}),
getStarredLists: () =>
apiRequest<{ success: boolean; lists: string[] }>("/github/starred-lists", {
method: "GET",
}),
};
// Gitea API
@@ -91,35 +97,17 @@ export const giteaApi = {
// Health API
export interface HealthResponse {
status: "ok" | "error";
status: "ok" | "error" | "degraded";
timestamp: string;
version: string;
latestVersion: string;
updateAvailable: boolean;
database: {
connected: boolean;
message: string;
};
system: {
uptime: {
startTime: string;
uptimeMs: number;
formatted: string;
};
memory: {
rss: string;
heapTotal: string;
heapUsed: string;
external: string;
systemTotal: string;
systemFree: string;
};
os: {
platform: string;
version: string;
arch: string;
};
env: string;
recovery?: {
status: string;
jobsNeedingRecovery: number;
};
error?: string;
}

View File

@@ -3,6 +3,12 @@ import { createAuthClient } from "better-auth/react";
import { oidcClient } from "better-auth/client/plugins";
import { ssoClient } from "@better-auth/sso/client";
import type { Session as BetterAuthSession, User as BetterAuthUser } from "better-auth";
import { withBase } from "@/lib/base-path";
function normalizeAuthBaseUrl(url: string): string {
const validatedUrl = new URL(url.trim());
return validatedUrl.origin;
}
export const authClient = createAuthClient({
// Use PUBLIC_BETTER_AUTH_URL if set (for multi-origin access), otherwise use current origin
@@ -18,9 +24,8 @@ export const authClient = createAuthClient({
// Validate and clean the URL if provided
if (url && typeof url === 'string' && url.trim() !== '') {
try {
// Validate URL format and remove trailing slash
const validatedUrl = new URL(url.trim());
return validatedUrl.origin; // Use origin to ensure clean URL without path
// Validate URL format and preserve optional base path
return normalizeAuthBaseUrl(url);
} catch (e) {
console.warn(`Invalid PUBLIC_BETTER_AUTH_URL: ${url}, falling back to default`);
}
@@ -34,7 +39,7 @@ export const authClient = createAuthClient({
// Default for SSR - always return a valid URL
return 'http://localhost:4321';
})(),
basePath: '/api/auth', // Explicitly set the base path
basePath: withBase('/api/auth'), // Explicitly set the base path
plugins: [
oidcClient(),
ssoClient(),

View File

@@ -0,0 +1,119 @@
import { describe, test, expect, beforeEach, afterEach } from "bun:test";
import { resolveTrustedOrigins } from "./auth";
// Helper to create a mock Request with specific headers
function mockRequest(headers: Record<string, string>): Request {
return new Request("http://localhost:4321/api/auth/sign-in", {
headers: new Headers(headers),
});
}
describe("resolveTrustedOrigins", () => {
const savedEnv: Record<string, string | undefined> = {};
beforeEach(() => {
// Save and clear relevant env vars
for (const key of ["BETTER_AUTH_URL", "BETTER_AUTH_TRUSTED_ORIGINS"]) {
savedEnv[key] = process.env[key];
delete process.env[key];
}
});
afterEach(() => {
// Restore env vars
for (const [key, val] of Object.entries(savedEnv)) {
if (val === undefined) delete process.env[key];
else process.env[key] = val;
}
});
test("includes localhost defaults when called without request", async () => {
const origins = await resolveTrustedOrigins();
expect(origins).toContain("http://localhost:4321");
expect(origins).toContain("http://localhost:8080");
});
test("includes BETTER_AUTH_URL from env", async () => {
process.env.BETTER_AUTH_URL = "https://gitea-mirror.example.com";
const origins = await resolveTrustedOrigins();
expect(origins).toContain("https://gitea-mirror.example.com");
});
test("includes BETTER_AUTH_TRUSTED_ORIGINS (comma-separated)", async () => {
process.env.BETTER_AUTH_TRUSTED_ORIGINS = "https://a.example.com, https://b.example.com";
const origins = await resolveTrustedOrigins();
expect(origins).toContain("https://a.example.com");
expect(origins).toContain("https://b.example.com");
});
test("skips invalid URLs in env vars", async () => {
process.env.BETTER_AUTH_URL = "not-a-url";
process.env.BETTER_AUTH_TRUSTED_ORIGINS = "also-invalid, https://valid.example.com";
const origins = await resolveTrustedOrigins();
expect(origins).not.toContain("not-a-url");
expect(origins).not.toContain("also-invalid");
expect(origins).toContain("https://valid.example.com");
});
test("auto-detects origin from x-forwarded-host + x-forwarded-proto", async () => {
const req = mockRequest({
"x-forwarded-host": "gitea-mirror.mydomain.tld",
"x-forwarded-proto": "https",
});
const origins = await resolveTrustedOrigins(req);
expect(origins).toContain("https://gitea-mirror.mydomain.tld");
});
test("falls back to host header when x-forwarded-host is absent", async () => {
const req = mockRequest({
host: "myserver.local:4321",
});
const origins = await resolveTrustedOrigins(req);
expect(origins).toContain("http://myserver.local:4321");
});
test("handles multi-value x-forwarded-host (chained proxies)", async () => {
const req = mockRequest({
"x-forwarded-host": "external.example.com, internal.proxy.local",
"x-forwarded-proto": "https",
});
const origins = await resolveTrustedOrigins(req);
expect(origins).toContain("https://external.example.com");
expect(origins).not.toContain("https://internal.proxy.local");
});
test("handles multi-value x-forwarded-proto (chained proxies)", async () => {
const req = mockRequest({
"x-forwarded-host": "gitea.example.com",
"x-forwarded-proto": "https, http",
});
const origins = await resolveTrustedOrigins(req);
expect(origins).toContain("https://gitea.example.com");
// Should NOT create an origin with "https, http" as proto
expect(origins).not.toContain("https, http://gitea.example.com");
});
test("rejects invalid x-forwarded-proto values", async () => {
const req = mockRequest({
"x-forwarded-host": "gitea.example.com",
"x-forwarded-proto": "ftp",
});
const origins = await resolveTrustedOrigins(req);
expect(origins).not.toContain("ftp://gitea.example.com");
});
test("deduplicates origins", async () => {
process.env.BETTER_AUTH_URL = "http://localhost:4321";
const origins = await resolveTrustedOrigins();
const count = origins.filter(o => o === "http://localhost:4321").length;
expect(count).toBe(1);
});
test("defaults proto to http when x-forwarded-proto is absent", async () => {
const req = mockRequest({
"x-forwarded-host": "gitea.internal",
});
const origins = await resolveTrustedOrigins(req);
expect(origins).toContain("http://gitea.internal");
});
});
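The proxy-header handling these tests exercise can be sketched in isolation. This mirrors the detection logic added to `resolveTrustedOrigins` in `auth.ts`: chained proxies may send comma-separated header values, so only the first (client-facing) entry is used, and the scheme is restricted to http/https:

```typescript
// Derives a trusted origin from reverse-proxy headers, or null if the
// headers are absent, malformed, or use a disallowed scheme.
function detectForwardedOrigin(headers: Headers): string | null {
  // Take the first value only; chained proxies append comma-separated entries.
  const rawHost = headers.get("x-forwarded-host") || headers.get("host");
  const host = rawHost?.split(",")[0].trim();
  if (!host) return null;

  const rawProto = headers.get("x-forwarded-proto") || "http";
  const proto = rawProto.split(",")[0].trim().toLowerCase();
  if (proto !== "http" && proto !== "https") return null; // reject ftp:// etc.

  try {
    return new URL(`${proto}://${host}`).origin;
  } catch {
    return null; // malformed host header
  }
}

console.log(
  detectForwardedOrigin(
    new Headers({
      "x-forwarded-host": "external.example.com, internal.proxy.local",
      "x-forwarded-proto": "https, http",
    }),
  ),
); // "https://external.example.com"
```

Trusting only the first header entry matters for the CSRF check: downstream proxies append their own hostnames, and accepting those would widen the trusted-origin set to internal infrastructure.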

View File

@@ -5,6 +5,73 @@ import { sso } from "@better-auth/sso";
import { db, users } from "./db";
import * as schema from "./db/schema";
import { eq } from "drizzle-orm";
import { withBase } from "./base-path";
/**
* Resolves the list of trusted origins for Better Auth CSRF validation.
* Exported for testing. Called per-request with the incoming Request,
* or at startup with no request (static origins only).
*/
export async function resolveTrustedOrigins(request?: Request): Promise<string[]> {
const origins: string[] = [
"http://localhost:4321",
"http://localhost:8080", // Keycloak
];
// Add the primary URL from BETTER_AUTH_URL
const primaryUrl = process.env.BETTER_AUTH_URL;
if (primaryUrl && typeof primaryUrl === 'string' && primaryUrl.trim() !== '') {
try {
const validatedUrl = new URL(primaryUrl.trim());
origins.push(validatedUrl.origin);
} catch {
// Skip if invalid
}
}
// Add additional trusted origins from environment
if (process.env.BETTER_AUTH_TRUSTED_ORIGINS) {
const additionalOrigins = process.env.BETTER_AUTH_TRUSTED_ORIGINS
.split(',')
.map(o => o.trim())
.filter(o => o !== '');
for (const origin of additionalOrigins) {
try {
const validatedUrl = new URL(origin);
origins.push(validatedUrl.origin);
} catch {
console.warn(`Invalid trusted origin: ${origin}, skipping`);
}
}
}
// Auto-detect origin from the incoming request's Host header when running
// behind a reverse proxy. Helps with Better Auth's per-request CSRF check.
if (request?.headers) {
// Take first value only — headers can be comma-separated in chained proxy setups
const rawHost = request.headers.get("x-forwarded-host") || request.headers.get("host");
const host = rawHost?.split(",")[0].trim();
if (host) {
const rawProto = request.headers.get("x-forwarded-proto") || "http";
const proto = rawProto.split(",")[0].trim().toLowerCase();
if (proto === "http" || proto === "https") {
try {
const detected = new URL(`${proto}://${host}`);
origins.push(detected.origin);
} catch {
// Malformed header, ignore
}
}
}
}
const uniqueOrigins = [...new Set(origins.filter(Boolean))];
if (!request) {
console.info("Trusted origins (static):", uniqueOrigins);
}
return uniqueOrigins;
}
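The header-handling core of `resolveTrustedOrigins` can be isolated as a pure helper for unit testing. A minimal sketch, with raw header values passed in instead of a `Request` (the helper name `detectOrigin` is hypothetical, not part of the codebase):

```typescript
// Sketch of the X-Forwarded-* origin detection above, as a pure function.
function detectOrigin(
  forwardedHost: string | null,
  hostHeader: string | null,
  forwardedProto: string | null,
): string | null {
  // Take the first value only: chained proxies may append comma-separated entries.
  const host = (forwardedHost ?? hostHeader)?.split(",")[0].trim();
  if (!host) return null;
  const proto = (forwardedProto ?? "http").split(",")[0].trim().toLowerCase();
  if (proto !== "http" && proto !== "https") return null;
  try {
    return new URL(`${proto}://${host}`).origin; // normalizes and validates the host
  } catch {
    return null; // malformed header, ignore
  }
}
```

This mirrors the tests at the top of this diff: a missing `x-forwarded-proto` falls back to `http`, and non-HTTP schemes are rejected rather than trusted.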
export const auth = betterAuth({
// Database configuration
@@ -31,7 +98,7 @@ export const auth = betterAuth({
try {
// Validate URL format and ensure it's a proper origin
const validatedUrl = new URL(url.trim());
const cleanUrl = validatedUrl.origin; // Use origin to ensure no trailing paths
const cleanUrl = validatedUrl.origin;
console.info('Using BETTER_AUTH_URL:', cleanUrl);
return cleanUrl;
} catch (e) {
@@ -41,50 +108,13 @@ export const auth = betterAuth({
return defaultUrl;
}
})(),
basePath: "/api/auth", // Specify the base path for auth endpoints
basePath: withBase("/api/auth"), // Specify the base path for auth endpoints
// Trusted origins - this is how we support multiple access URLs
trustedOrigins: (() => {
const origins: string[] = [
"http://localhost:4321",
"http://localhost:8080", // Keycloak
];
// Add the primary URL from BETTER_AUTH_URL
const primaryUrl = process.env.BETTER_AUTH_URL;
if (primaryUrl && typeof primaryUrl === 'string' && primaryUrl.trim() !== '') {
try {
const validatedUrl = new URL(primaryUrl.trim());
origins.push(validatedUrl.origin);
} catch {
// Skip if invalid
}
}
// Add additional trusted origins from environment
// This is where users can specify multiple access URLs
if (process.env.BETTER_AUTH_TRUSTED_ORIGINS) {
const additionalOrigins = process.env.BETTER_AUTH_TRUSTED_ORIGINS
.split(',')
.map(o => o.trim())
.filter(o => o !== '');
// Validate each additional origin
for (const origin of additionalOrigins) {
try {
const validatedUrl = new URL(origin);
origins.push(validatedUrl.origin);
} catch {
console.warn(`Invalid trusted origin: ${origin}, skipping`);
}
}
}
// Remove duplicates and empty strings, then return
const uniqueOrigins = [...new Set(origins.filter(Boolean))];
console.info('Trusted origins:', uniqueOrigins);
return uniqueOrigins;
})(),
// Trusted origins - this is how we support multiple access URLs.
// Uses the function form so that the origin can be auto-detected from
// the incoming request's Host / X-Forwarded-* headers, which makes the
// app work behind a reverse proxy without manual env var configuration.
trustedOrigins: (request?: Request) => resolveTrustedOrigins(request),
// Authentication methods
emailAndPassword: {
@@ -121,8 +151,8 @@ export const auth = betterAuth({
plugins: [
// OIDC Provider plugin - allows this app to act as an OIDC provider
oidcProvider({
loginPage: "/login",
consentPage: "/oauth/consent",
loginPage: withBase("/login"),
consentPage: withBase("/oauth/consent"),
// Allow dynamic client registration for flexibility
allowDynamicClientRegistration: true,
// Note: trustedClients would be configured here if Better Auth supports it

src/lib/base-path.test.ts Normal file

@@ -0,0 +1,48 @@
import { afterEach, describe, expect, test } from "bun:test";
const originalBaseUrl = process.env.BASE_URL;
async function loadModule(baseUrl?: string) {
if (baseUrl === undefined) {
delete process.env.BASE_URL;
} else {
process.env.BASE_URL = baseUrl;
}
return import(`./base-path.ts?case=${encodeURIComponent(baseUrl ?? "default")}-${Date.now()}-${Math.random()}`);
}
afterEach(() => {
if (originalBaseUrl === undefined) {
delete process.env.BASE_URL;
} else {
process.env.BASE_URL = originalBaseUrl;
}
});
describe("base-path helpers", () => {
test("defaults to root paths", async () => {
const mod = await loadModule(undefined);
expect(mod.BASE_PATH).toBe("/");
expect(mod.withBase("/api/health")).toBe("/api/health");
expect(mod.withBase("repositories")).toBe("/repositories");
expect(mod.stripBasePath("/config")).toBe("/config");
});
test("normalizes prefixed base paths", async () => {
const mod = await loadModule("mirror/");
expect(mod.BASE_PATH).toBe("/mirror");
expect(mod.withBase("/api/health")).toBe("/mirror/api/health");
expect(mod.withBase("repositories")).toBe("/mirror/repositories");
expect(mod.stripBasePath("/mirror/config")).toBe("/config");
expect(mod.stripBasePath("/mirror")).toBe("/");
});
test("keeps absolute URLs unchanged", async () => {
const mod = await loadModule("/mirror");
expect(mod.withBase("https://example.com/path")).toBe("https://example.com/path");
});
});

src/lib/base-path.ts Normal file

@@ -0,0 +1,63 @@
const URL_SCHEME_REGEX = /^[a-zA-Z][a-zA-Z\d+\-.]*:/;
function normalizeBasePath(basePath: string | null | undefined): string {
if (!basePath) {
return "/";
}
let normalized = basePath.trim();
if (!normalized) {
return "/";
}
if (!normalized.startsWith("/")) {
normalized = `/${normalized}`;
}
normalized = normalized.replace(/\/+$/, "");
return normalized || "/";
}
const rawBasePath =
(typeof import.meta !== "undefined" && import.meta.env?.BASE_URL) ||
process.env.BASE_URL ||
"/";
export const BASE_PATH = normalizeBasePath(rawBasePath);
export function withBase(path: string): string {
if (!path) {
return BASE_PATH === "/" ? "/" : `${BASE_PATH}/`;
}
if (URL_SCHEME_REGEX.test(path) || path.startsWith("//")) {
return path;
}
const normalizedPath = path.startsWith("/") ? path : `/${path}`;
if (BASE_PATH === "/") {
return normalizedPath;
}
return `${BASE_PATH}${normalizedPath}`;
}
export function stripBasePath(pathname: string): string {
if (!pathname) {
return "/";
}
if (BASE_PATH === "/") {
return pathname;
}
if (pathname === BASE_PATH || pathname === `${BASE_PATH}/`) {
return "/";
}
if (pathname.startsWith(`${BASE_PATH}/`)) {
return pathname.slice(BASE_PATH.length) || "/";
}
return pathname;
}
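The invariant the base-path tests rely on is that `stripBasePath` undoes `withBase` for any path under the prefix. A sketch of that round trip, with the base path passed explicitly instead of read from `BASE_URL` at module load (the names `joinBase` and `stripBase` are hypothetical stand-ins for the module's helpers):

```typescript
// joinBase/stripBase: parameterized versions of withBase/stripBasePath above.
function joinBase(base: string, path: string): string {
  const p = path.startsWith("/") ? path : `/${path}`;
  return base === "/" ? p : `${base}${p}`;
}

function stripBase(base: string, pathname: string): string {
  if (base === "/") return pathname;
  if (pathname === base || pathname === `${base}/`) return "/"; // bare prefix maps to root
  if (pathname.startsWith(`${base}/`)) return pathname.slice(base.length) || "/";
  return pathname; // outside the base path; left unchanged
}
```

Making the base an argument keeps the round-trip property (`stripBase(b, joinBase(b, p)) === p` for normalized `p`) testable without the cache-busting dynamic imports the test file above needs.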


@@ -19,8 +19,23 @@ export const ENV = {
},
// Better Auth secret for authentication
BETTER_AUTH_SECRET:
process.env.BETTER_AUTH_SECRET || "your-secret-key-change-this-in-production",
get BETTER_AUTH_SECRET(): string {
const secret = process.env.BETTER_AUTH_SECRET;
const knownInsecureDefaults = [
"your-secret-key-change-this-in-production",
"dev-only-insecure-secret-do-not-use-in-production",
];
if (!secret || knownInsecureDefaults.includes(secret)) {
if (process.env.NODE_ENV === "production") {
console.error(
"\x1b[31m[SECURITY WARNING]\x1b[0m BETTER_AUTH_SECRET is missing or using an insecure default. " +
"Set a strong secret: openssl rand -base64 32"
);
}
return secret || "dev-only-insecure-secret-do-not-use-in-production";
}
return secret;
},
// Server host and port
HOST: process.env.HOST || "localhost",
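The insecure-default check in the `BETTER_AUTH_SECRET` getter above can be factored into a pure predicate so the warning path is testable without mutating `process.env`. A minimal sketch (the name `isInsecureSecret` is an assumption, not an export of the config module):

```typescript
// Known placeholder values that must never be used in production.
const INSECURE_DEFAULTS = new Set([
  "your-secret-key-change-this-in-production",
  "dev-only-insecure-secret-do-not-use-in-production",
]);

// True when the secret is missing or one of the known insecure defaults.
function isInsecureSecret(secret: string | undefined): boolean {
  return !secret || INSECURE_DEFAULTS.has(secret);
}
```

The getter then only decides what to do with the result: log a security warning in production and fall back to the dev-only placeholder.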


@@ -35,13 +35,54 @@ if (process.env.NODE_ENV !== "test") {
// Create drizzle instance with the SQLite client
db = drizzle({ client: sqlite });
/**
* Fix migration records that were marked as applied but whose DDL actually
* failed (e.g. the v3.13.0 release where ALTER TABLE with expression default
* was rejected by SQLite). Without this, Drizzle skips the migration on
* retry because it thinks it already ran.
*
* Drizzle tracks migrations by `created_at` (= journal timestamp) and only
* looks at the most recent record. If the last recorded timestamp is >= the
* failed migration's timestamp but the expected column is missing, we delete
* stale records so the migration re-runs.
*/
function repairFailedMigrations() {
try {
const migrationsTableExists = sqlite
.query("SELECT name FROM sqlite_master WHERE type='table' AND name='__drizzle_migrations'")
.get();
if (!migrationsTableExists) return;
// Migration 0009 journal timestamp (from drizzle/meta/_journal.json)
const MIGRATION_0009_TIMESTAMP = 1773542995732;
const lastMigration = sqlite
.query("SELECT id, created_at FROM __drizzle_migrations ORDER BY created_at DESC LIMIT 1")
.get() as { id: number; created_at: number } | null;
if (!lastMigration || Number(lastMigration.created_at) < MIGRATION_0009_TIMESTAMP) return;
// Migration 0009 is recorded as applied — verify the column actually exists
const columns = sqlite.query("PRAGMA table_info(repositories)").all() as { name: string }[];
const hasImportedAt = columns.some((c) => c.name === "imported_at");
if (!hasImportedAt) {
console.log("🔧 Detected failed migration 0009 (imported_at column missing). Removing stale record so it can re-run...");
sqlite.prepare("DELETE FROM __drizzle_migrations WHERE created_at >= ?").run(MIGRATION_0009_TIMESTAMP);
}
} catch (error) {
console.warn("⚠️ Migration repair check failed (non-fatal):", error);
}
}
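The decision `repairFailedMigrations` makes reduces to two inputs: the last recorded migration timestamp and whether the expected column exists. A sketch of just that decision, separated from the SQLite queries (the name `needsMigrationRepair` is hypothetical; the timestamp is the journal value quoted above):

```typescript
// Migration 0009 journal timestamp (from drizzle/meta/_journal.json).
const MIGRATION_0009_TIMESTAMP = 1773542995732;

// Decide whether stale migration records must be deleted so 0009 re-runs.
function needsMigrationRepair(
  lastRecordedAt: number | null,
  hasImportedAtColumn: boolean,
): boolean {
  if (lastRecordedAt === null) return false; // no migrations recorded yet
  if (lastRecordedAt < MIGRATION_0009_TIMESTAMP) return false; // 0009 never claimed to run
  return !hasImportedAtColumn; // recorded as applied, but the DDL actually failed
}
```

Keeping the predicate pure makes the v3.13.0 failure mode (recorded-but-not-applied) directly assertable, while the surrounding function handles the `__drizzle_migrations` and `PRAGMA table_info` lookups.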
/**
* Run Drizzle migrations
*/
function runDrizzleMigrations() {
try {
console.log("🔄 Checking for pending migrations...");
// Check if migrations table exists
const migrationsTableExists = sqlite
.query("SELECT name FROM sqlite_master WHERE type='table' AND name='__drizzle_migrations'")
@@ -51,9 +92,12 @@ if (process.env.NODE_ENV !== "test") {
console.log("📦 First time setup - running initial migrations...");
}
// Fix any migrations that were recorded but actually failed (e.g. v3.13.0 bug)
repairFailedMigrations();
// Run migrations using Drizzle migrate function
migrate(db, { migrationsFolder: "./drizzle" });
console.log("✅ Database migrations completed successfully");
} catch (error) {
console.error("❌ Error running migrations:", error);


@@ -0,0 +1,26 @@
import { expect, test } from "bun:test";
function decodeOutput(output: ArrayBufferLike | Uint8Array | null | undefined) {
if (!output) {
return "";
}
return Buffer.from(output as ArrayBufferLike).toString("utf8");
}
test("migration validation script passes", () => {
const result = Bun.spawnSync({
cmd: ["bun", "scripts/validate-migrations.ts"],
cwd: process.cwd(),
stdout: "pipe",
stderr: "pipe",
});
const stdout = decodeOutput(result.stdout);
const stderr = decodeOutput(result.stderr);
expect(
result.exitCode,
`Migration validation script failed.\nstdout:\n${stdout}\nstderr:\n${stderr}`,
).toBe(0);
});


@@ -26,13 +26,22 @@ export const githubConfigSchema = z.object({
includeOrganizations: z.array(z.string()).default([]),
starredReposOrg: z.string().optional(),
starredReposMode: z.enum(["dedicated-org", "preserve-owner"]).default("dedicated-org"),
starredLists: z.array(z.string()).default([]),
mirrorStrategy: z.enum(["preserve", "single-org", "flat-user", "mixed"]).default("preserve"),
defaultOrg: z.string().optional(),
starredCodeOnly: z.boolean().default(false),
autoMirrorStarred: z.boolean().default(false),
skipStarredIssues: z.boolean().optional(), // Deprecated: kept for backward compatibility, use starredCodeOnly instead
starredDuplicateStrategy: z.enum(["suffix", "prefix", "owner-org"]).default("suffix").optional(),
});
export const backupStrategyEnum = z.enum([
"disabled",
"always",
"on-force-push",
"block-on-force-push",
]);
export const giteaConfigSchema = z.object({
url: z.url(),
externalUrl: z.url().optional(),
@@ -65,6 +74,12 @@ export const giteaConfigSchema = z.object({
mirrorPullRequests: z.boolean().default(false),
mirrorLabels: z.boolean().default(false),
mirrorMilestones: z.boolean().default(false),
backupStrategy: backupStrategyEnum.default("on-force-push"),
backupBeforeSync: z.boolean().default(true), // Deprecated: kept for backward compat, use backupStrategy
backupRetentionCount: z.number().int().min(1).default(5),
backupRetentionDays: z.number().int().min(0).default(30),
backupDirectory: z.string().optional(),
blockSyncOnBackupFailure: z.boolean().default(true),
});
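The four `backupStrategyEnum` values drive the strategy helpers used later in the sync path (`strategyNeedsDetection`, `shouldBackupForStrategy`, `shouldBlockSyncForStrategy`). A sketch of how those helpers could map strategy plus force-push state to behavior; this is inferred from how the sync code calls them, and the real implementations in `repo-backup.ts` may differ in detail:

```typescript
type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";

// Only the force-push-aware strategies need the (expensive) detection pass.
function strategyNeedsDetection(s: BackupStrategy): boolean {
  return s === "on-force-push" || s === "block-on-force-push";
}

// Whether a pre-sync bundle snapshot should be created (assumed semantics:
// "block-on-force-push" blocks for approval instead of snapshotting here).
function shouldBackupForStrategy(s: BackupStrategy, forcePushDetected: boolean): boolean {
  if (s === "always") return true;
  if (s === "on-force-push") return forcePushDetected;
  return false;
}

// Whether the sync must stop and wait for manual approval.
function shouldBlockSyncForStrategy(s: BackupStrategy, forcePushDetected: boolean): boolean {
  return s === "block-on-force-push" && forcePushDetected;
}
```

The deprecated boolean `backupBeforeSync` would then just resolve to `"always"` or `"disabled"` before these helpers run.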
export const scheduleConfigSchema = z.object({
@@ -108,6 +123,31 @@ export const cleanupConfigSchema = z.object({
nextRun: z.coerce.date().optional(),
});
export const ntfyConfigSchema = z.object({
url: z.string().default("https://ntfy.sh"),
topic: z.string().default(""),
token: z.string().optional(),
priority: z.enum(["min", "low", "default", "high", "urgent"]).default("default"),
});
export const appriseConfigSchema = z.object({
url: z.string().default(""),
token: z.string().default(""),
tag: z.string().optional(),
});
export const notificationConfigSchema = z.object({
enabled: z.boolean().default(false),
provider: z.enum(["ntfy", "apprise"]).default("ntfy"),
notifyOnSyncError: z.boolean().default(true),
notifyOnSyncSuccess: z.boolean().default(false),
notifyOnNewRepo: z.boolean().default(false),
ntfy: ntfyConfigSchema.optional(),
apprise: appriseConfigSchema.optional(),
});
export type NotificationConfig = z.infer<typeof notificationConfigSchema>;
export const configSchema = z.object({
id: z.string(),
userId: z.string(),
@@ -161,12 +201,14 @@ export const repositorySchema = z.object({
"syncing",
"synced",
"archived",
"pending-approval", // Blocked by force-push detection, needs manual approval
])
.default("imported"),
lastMirrored: z.coerce.date().optional().nullable(),
errorMessage: z.string().optional().nullable(),
destinationOrg: z.string().optional().nullable(),
metadata: z.string().optional().nullable(), // JSON string for metadata sync state
importedAt: z.coerce.date(),
createdAt: z.coerce.date(),
updatedAt: z.coerce.date(),
});
@@ -192,6 +234,7 @@ export const mirrorJobSchema = z.object({
"syncing",
"synced",
"archived",
"pending-approval",
])
.default("imported"),
message: z.string(),
@@ -320,6 +363,11 @@ export const configs = sqliteTable("configs", {
.$type<z.infer<typeof cleanupConfigSchema>>()
.notNull(),
notificationConfig: text("notification_config", { mode: "json" })
.$type<z.infer<typeof notificationConfigSchema>>()
.notNull()
.default(sql`'{"enabled":false,"provider":"ntfy","notifyOnSyncError":true,"notifyOnSyncSuccess":false,"notifyOnNewRepo":false}'`),
createdAt: integer("created_at", { mode: "timestamp" })
.notNull()
.default(sql`(unixepoch())`),
@@ -380,6 +428,9 @@ export const repositories = sqliteTable("repositories", {
destinationOrg: text("destination_org"),
metadata: text("metadata"), // JSON string storing metadata sync state (issues, PRs, releases, etc.)
importedAt: integer("imported_at", { mode: "timestamp" })
.notNull()
.default(sql`(unixepoch())`),
createdAt: integer("created_at", { mode: "timestamp" })
.notNull()
@@ -395,8 +446,10 @@ export const repositories = sqliteTable("repositories", {
index("idx_repositories_organization").on(table.organization),
index("idx_repositories_is_fork").on(table.isForked),
index("idx_repositories_is_starred").on(table.isStarred),
index("idx_repositories_user_imported_at").on(table.userId, table.importedAt),
uniqueIndex("uniq_repositories_user_full_name").on(table.userId, table.fullName),
uniqueIndex("uniq_repositories_user_normalized_full_name").on(table.userId, table.normalizedFullName),
index("idx_repositories_mirrored_location").on(table.userId, table.mirroredLocation),
]);
export const mirrorJobs = sqliteTable("mirror_jobs", {


@@ -22,8 +22,10 @@ interface EnvConfig {
preserveOrgStructure?: boolean;
onlyMirrorOrgs?: boolean;
starredCodeOnly?: boolean;
autoMirrorStarred?: boolean;
starredReposOrg?: string;
starredReposMode?: 'dedicated-org' | 'preserve-owner';
starredLists?: string[];
mirrorStrategy?: 'preserve' | 'single-org' | 'flat-user' | 'mixed';
};
gitea: {
@@ -98,6 +100,9 @@ function parseEnvConfig(): EnvConfig {
const protectedRepos = process.env.CLEANUP_PROTECTED_REPOS
? process.env.CLEANUP_PROTECTED_REPOS.split(',').map(r => r.trim()).filter(Boolean)
: undefined;
const starredLists = process.env.MIRROR_STARRED_LISTS
? process.env.MIRROR_STARRED_LISTS.split(',').map((list) => list.trim()).filter(Boolean)
: undefined;
return {
github: {
@@ -113,8 +118,10 @@ function parseEnvConfig(): EnvConfig {
preserveOrgStructure: process.env.PRESERVE_ORG_STRUCTURE === 'true',
onlyMirrorOrgs: process.env.ONLY_MIRROR_ORGS === 'true',
starredCodeOnly: process.env.SKIP_STARRED_ISSUES === 'true',
autoMirrorStarred: process.env.AUTO_MIRROR_STARRED === 'true',
starredReposOrg: process.env.STARRED_REPOS_ORG,
starredReposMode: process.env.STARRED_REPOS_MODE as 'dedicated-org' | 'preserve-owner',
starredLists,
mirrorStrategy: process.env.MIRROR_STRATEGY as 'preserve' | 'single-org' | 'flat-user' | 'mixed',
},
gitea: {
@@ -264,6 +271,8 @@ export async function initializeConfigFromEnv(): Promise<void> {
mirrorStrategy,
defaultOrg: envConfig.gitea.organization || existingConfig?.[0]?.githubConfig?.defaultOrg || 'github-mirrors',
starredCodeOnly: envConfig.github.starredCodeOnly ?? existingConfig?.[0]?.githubConfig?.starredCodeOnly ?? false,
autoMirrorStarred: envConfig.github.autoMirrorStarred ?? existingConfig?.[0]?.githubConfig?.autoMirrorStarred ?? false,
starredLists: envConfig.github.starredLists ?? existingConfig?.[0]?.githubConfig?.starredLists ?? [],
};
// Build Gitea config


@@ -13,6 +13,11 @@ const mockMirrorGitRepoPullRequestsToGitea = mock(() => Promise.resolve());
const mockMirrorGitRepoLabelsToGitea = mock(() => Promise.resolve());
const mockMirrorGitRepoMilestonesToGitea = mock(() => Promise.resolve());
const mockGetGiteaRepoOwnerAsync = mock(() => Promise.resolve("starred"));
const mockCreatePreSyncBundleBackup = mock(() =>
Promise.resolve({ bundlePath: "/tmp/mock.bundle" })
);
let mockShouldCreatePreSyncBackup = false;
let mockShouldBlockSyncOnBackupFailure = true;
// Mock the database module
const mockDb = {
@@ -28,8 +33,14 @@ const mockDb = {
mock.module("@/lib/db", () => ({
db: mockDb,
users: {},
configs: {},
organizations: {},
mirrorJobs: {},
repositories: {}
repositories: {},
events: {},
accounts: {},
sessions: {},
}));
// Mock config encryption
@@ -235,6 +246,12 @@ mock.module("@/lib/http-client", () => ({
HttpError: MockHttpError
}));
mock.module("@/lib/repo-backup", () => ({
createPreSyncBundleBackup: mockCreatePreSyncBundleBackup,
shouldCreatePreSyncBackup: () => mockShouldCreatePreSyncBackup,
shouldBlockSyncOnBackupFailure: () => mockShouldBlockSyncOnBackupFailure,
}));
// Now import the modules we're testing
import {
getGiteaRepoInfo,
@@ -264,6 +281,15 @@ describe("Enhanced Gitea Operations", () => {
mockMirrorGitRepoMilestonesToGitea.mockClear();
mockGetGiteaRepoOwnerAsync.mockClear();
mockGetGiteaRepoOwnerAsync.mockImplementation(() => Promise.resolve("starred"));
mockHttpGet.mockClear();
mockHttpPost.mockClear();
mockHttpDelete.mockClear();
mockCreatePreSyncBundleBackup.mockClear();
mockCreatePreSyncBundleBackup.mockImplementation(() =>
Promise.resolve({ bundlePath: "/tmp/mock.bundle" })
);
mockShouldCreatePreSyncBackup = false;
mockShouldBlockSyncOnBackupFailure = true;
// Reset tracking variables
orgCheckCount = 0;
orgTestContext = "";
@@ -529,6 +555,182 @@ describe("Enhanced Gitea Operations", () => {
expect(releaseCall.octokit).toBeDefined();
});
test("prefers recorded mirroredLocation when owner resolution changes", async () => {
mockGetGiteaRepoOwnerAsync.mockImplementation(() => Promise.resolve("ceph"));
const config: Partial<Config> = {
userId: "user123",
githubConfig: {
username: "testuser",
token: "github-token",
privateRepositories: false,
mirrorStarred: true,
},
giteaConfig: {
url: "https://gitea.example.com",
token: "encrypted-token",
defaultOwner: "testuser",
mirrorReleases: true,
},
};
const repository: Repository = {
id: "repo789",
name: "test-repo",
fullName: "ceph/test-repo",
owner: "ceph",
cloneUrl: "https://github.com/ceph/test-repo.git",
isPrivate: false,
isStarred: true,
status: repoStatusEnum.parse("mirrored"),
visibility: "public",
userId: "user123",
mirroredLocation: "starred/test-repo",
createdAt: new Date(),
updatedAt: new Date(),
};
const result = await syncGiteaRepoEnhanced(
{ config, repository },
{
getGiteaRepoOwnerAsync: mockGetGiteaRepoOwnerAsync,
mirrorGitHubReleasesToGitea: mockMirrorGitHubReleasesToGitea,
mirrorGitRepoIssuesToGitea: mockMirrorGitRepoIssuesToGitea,
mirrorGitRepoPullRequestsToGitea: mockMirrorGitRepoPullRequestsToGitea,
mirrorGitRepoLabelsToGitea: mockMirrorGitRepoLabelsToGitea,
mirrorGitRepoMilestonesToGitea: mockMirrorGitRepoMilestonesToGitea,
}
);
expect(result).toEqual({ success: true });
const mirrorSyncCalls = mockHttpPost.mock.calls.filter((call) =>
String(call[0]).includes("/mirror-sync")
);
expect(mirrorSyncCalls).toHaveLength(1);
expect(String(mirrorSyncCalls[0][0])).toContain("/api/v1/repos/starred/test-repo/mirror-sync");
expect(String(mirrorSyncCalls[0][0])).not.toContain("/api/v1/repos/ceph/test-repo/mirror-sync");
});
test("blocks sync when pre-sync snapshot fails and blocking is enabled", async () => {
mockShouldCreatePreSyncBackup = true;
mockShouldBlockSyncOnBackupFailure = true;
mockCreatePreSyncBundleBackup.mockImplementation(() =>
Promise.reject(new Error("simulated backup failure"))
);
const config: Partial<Config> = {
userId: "user123",
githubConfig: {
username: "testuser",
token: "github-token",
privateRepositories: false,
mirrorStarred: true,
},
giteaConfig: {
url: "https://gitea.example.com",
token: "encrypted-token",
defaultOwner: "testuser",
mirrorReleases: false,
backupStrategy: "always",
blockSyncOnBackupFailure: true,
},
};
const repository: Repository = {
id: "repo456",
name: "mirror-repo",
fullName: "user/mirror-repo",
owner: "user",
cloneUrl: "https://github.com/user/mirror-repo.git",
isPrivate: false,
isStarred: true,
status: repoStatusEnum.parse("mirrored"),
visibility: "public",
userId: "user123",
createdAt: new Date(),
updatedAt: new Date(),
};
await expect(
syncGiteaRepoEnhanced(
{ config, repository },
{
getGiteaRepoOwnerAsync: mockGetGiteaRepoOwnerAsync,
mirrorGitHubReleasesToGitea: mockMirrorGitHubReleasesToGitea,
mirrorGitRepoIssuesToGitea: mockMirrorGitRepoIssuesToGitea,
mirrorGitRepoPullRequestsToGitea: mockMirrorGitRepoPullRequestsToGitea,
mirrorGitRepoLabelsToGitea: mockMirrorGitRepoLabelsToGitea,
mirrorGitRepoMilestonesToGitea: mockMirrorGitRepoMilestonesToGitea,
}
)
).rejects.toThrow("Snapshot failed; sync blocked to protect history.");
const mirrorSyncCalls = mockHttpPost.mock.calls.filter((call) =>
String(call[0]).includes("/mirror-sync")
);
expect(mirrorSyncCalls.length).toBe(0);
});
test("continues sync when pre-sync snapshot fails and blocking is disabled", async () => {
mockShouldCreatePreSyncBackup = true;
mockShouldBlockSyncOnBackupFailure = false;
mockCreatePreSyncBundleBackup.mockImplementation(() =>
Promise.reject(new Error("simulated backup failure"))
);
const config: Partial<Config> = {
userId: "user123",
githubConfig: {
username: "testuser",
token: "github-token",
privateRepositories: false,
mirrorStarred: true,
},
giteaConfig: {
url: "https://gitea.example.com",
token: "encrypted-token",
defaultOwner: "testuser",
mirrorReleases: false,
backupBeforeSync: true,
blockSyncOnBackupFailure: false,
},
};
const repository: Repository = {
id: "repo457",
name: "mirror-repo",
fullName: "user/mirror-repo",
owner: "user",
cloneUrl: "https://github.com/user/mirror-repo.git",
isPrivate: false,
isStarred: true,
status: repoStatusEnum.parse("mirrored"),
visibility: "public",
userId: "user123",
createdAt: new Date(),
updatedAt: new Date(),
};
const result = await syncGiteaRepoEnhanced(
{ config, repository },
{
getGiteaRepoOwnerAsync: mockGetGiteaRepoOwnerAsync,
mirrorGitHubReleasesToGitea: mockMirrorGitHubReleasesToGitea,
mirrorGitRepoIssuesToGitea: mockMirrorGitRepoIssuesToGitea,
mirrorGitRepoPullRequestsToGitea: mockMirrorGitRepoPullRequestsToGitea,
mirrorGitRepoLabelsToGitea: mockMirrorGitRepoLabelsToGitea,
mirrorGitRepoMilestonesToGitea: mockMirrorGitRepoMilestonesToGitea,
}
);
expect(result).toEqual({ success: true });
const mirrorSyncCalls = mockHttpPost.mock.calls.filter((call) =>
String(call[0]).includes("/mirror-sync")
);
expect(mirrorSyncCalls.length).toBe(1);
});
test("mirrors metadata components when enabled and not previously synced", async () => {
const config: Partial<Config> = {
userId: "user123",


@@ -15,6 +15,16 @@ import { httpPost, httpGet, httpPatch, HttpError } from "./http-client";
import { db, repositories } from "./db";
import { eq } from "drizzle-orm";
import { repoStatusEnum } from "@/types/Repository";
import {
createPreSyncBundleBackup,
shouldCreatePreSyncBackup,
shouldBlockSyncOnBackupFailure,
resolveBackupStrategy,
shouldBackupForStrategy,
shouldBlockSyncForStrategy,
strategyNeedsDetection,
} from "./repo-backup";
import { detectForcePush } from "./utils/force-push-detection";
import {
parseRepositoryMetadataState,
serializeRepositoryMetadataState,
@@ -42,6 +52,41 @@ interface GiteaRepoInfo {
private: boolean;
}
interface SyncTargetCandidate {
owner: string;
repoName: string;
}
function parseMirroredLocation(location?: string | null): SyncTargetCandidate | null {
if (!location) return null;
const trimmed = location.trim();
if (!trimmed) return null;
const slashIndex = trimmed.indexOf("/");
if (slashIndex <= 0 || slashIndex === trimmed.length - 1) return null;
const owner = trimmed.slice(0, slashIndex).trim();
const repoName = trimmed.slice(slashIndex + 1).trim();
if (!owner || !repoName) return null;
return { owner, repoName };
}
function dedupeSyncTargets(targets: SyncTargetCandidate[]): SyncTargetCandidate[] {
const seen = new Set<string>();
const deduped: SyncTargetCandidate[] = [];
for (const target of targets) {
const key = `${target.owner}/${target.repoName}`.toLowerCase();
if (seen.has(key)) continue;
seen.add(key);
deduped.push(target);
}
return deduped;
}
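Together these two helpers give the sync path its candidate ordering: the recorded `mirroredLocation` is tried before the owner derived from the current strategy, with case-insensitive dedupe. A self-contained sketch of that ordering (`parseMirroredLocation` is re-declared locally so the example runs standalone; `buildCandidates` is a hypothetical name for the inline logic in `syncGiteaRepoEnhanced`):

```typescript
interface SyncTargetCandidate {
  owner: string;
  repoName: string;
}

// Local copy of the parser above: "owner/repo" with both parts non-empty.
function parseMirroredLocation(location?: string | null): SyncTargetCandidate | null {
  const trimmed = location?.trim();
  if (!trimmed) return null;
  const i = trimmed.indexOf("/");
  if (i <= 0 || i === trimmed.length - 1) return null;
  const owner = trimmed.slice(0, i).trim();
  const repoName = trimmed.slice(i + 1).trim();
  return owner && repoName ? { owner, repoName } : null;
}

// Recorded location first, then the strategy-derived owner, deduped case-insensitively.
function buildCandidates(
  mirroredLocation: string | null,
  expectedOwner: string,
  repoName: string,
): SyncTargetCandidate[] {
  const recorded = parseMirroredLocation(mirroredLocation);
  const targets = [...(recorded ? [recorded] : []), { owner: expectedOwner, repoName }];
  const seen = new Set<string>();
  return targets.filter((t) => {
    const key = `${t.owner}/${t.repoName}`.toLowerCase();
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}
```

This is why the test above expects the sync call to hit `starred/test-repo` even after owner resolution changes to `ceph`: the recorded location wins whenever it still points at a mirror.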
/**
* Check if a repository exists in Gitea and return its details
*/
@@ -250,9 +295,12 @@ export async function getOrCreateGiteaOrgEnhanced({
export async function syncGiteaRepoEnhanced({
config,
repository,
skipForcePushDetection,
}: {
config: Partial<Config>;
repository: Repository;
/** When true, skip force-push detection and blocking (used by approve-sync). */
skipForcePushDetection?: boolean;
}, deps?: SyncDependencies): Promise<any> {
try {
if (!config.userId || !config.giteaConfig?.url || !config.giteaConfig?.token) {
@@ -272,19 +320,78 @@ export async function syncGiteaRepoEnhanced({
})
.where(eq(repositories.id, repository.id!));
// Get the expected owner
// Resolve sync target in a backward-compatible order:
// 1) recorded mirroredLocation (actual historical mirror location)
// 2) owner derived from current strategy/config
const dependencies = deps ?? (await import("./gitea"));
const repoOwner = await dependencies.getGiteaRepoOwnerAsync({ config, repository });
const expectedOwner = await dependencies.getGiteaRepoOwnerAsync({ config, repository });
const recordedTarget = parseMirroredLocation(repository.mirroredLocation);
const candidateTargets = dedupeSyncTargets([
...(recordedTarget ? [recordedTarget] : []),
{ owner: expectedOwner, repoName: repository.name },
]);
// Check if repo exists and get its info
const repoInfo = await getGiteaRepoInfo({
config,
owner: repoOwner,
repoName: repository.name,
});
let repoOwner = expectedOwner;
let repoName = repository.name;
let repoInfo: GiteaRepoInfo | null = null;
let firstNonMirrorTarget: SyncTargetCandidate | null = null;
for (const target of candidateTargets) {
const candidateInfo = await getGiteaRepoInfo({
config,
owner: target.owner,
repoName: target.repoName,
});
if (!candidateInfo) {
continue;
}
if (!candidateInfo.mirror) {
if (!firstNonMirrorTarget) {
firstNonMirrorTarget = target;
}
continue;
}
repoOwner = target.owner;
repoName = target.repoName;
repoInfo = candidateInfo;
break;
}
if (!repoInfo) {
throw new Error(`Repository ${repository.name} not found in Gitea at ${repoOwner}/${repository.name}`);
if (firstNonMirrorTarget) {
console.warn(
`[Sync] Repository ${repository.name} exists at ${firstNonMirrorTarget.owner}/${firstNonMirrorTarget.repoName} but is not configured as a mirror`
);
await db
.update(repositories)
.set({
status: repoStatusEnum.parse("failed"),
updatedAt: new Date(),
errorMessage: "Repository exists in Gitea but is not configured as a mirror. Manual intervention required.",
})
.where(eq(repositories.id, repository.id!));
await createMirrorJob({
userId: config.userId,
repositoryId: repository.id,
repositoryName: repository.name,
message: `Cannot sync ${repository.name}: Not a mirror repository`,
details: `Repository ${repository.name} exists in Gitea but is not configured as a mirror. You may need to delete and recreate it as a mirror, or manually configure it as a mirror in Gitea.`,
status: "failed",
});
throw new Error(`Repository ${repository.name} is not a mirror. Cannot sync.`);
}
throw new Error(
`Repository ${repository.name} not found in Gitea. Tried locations: ${candidateTargets
.map((t) => `${t.owner}/${t.repoName}`)
.join(", ")}`
);
}
// Check if it's a mirror repository
@@ -313,25 +420,160 @@ export async function syncGiteaRepoEnhanced({
throw new Error(`Repository ${repository.name} is not a mirror. Cannot sync.`);
}
// ---- Smart backup strategy with force-push detection ----
const backupStrategy = resolveBackupStrategy(config);
let forcePushDetected = false;
if (backupStrategy !== "disabled") {
// Run force-push detection if the strategy requires it
// (skip when called from approve-sync to avoid re-blocking)
if (strategyNeedsDetection(backupStrategy) && !skipForcePushDetection) {
try {
const decryptedGithubToken = decryptedConfig.githubConfig?.token;
if (decryptedGithubToken) {
const fpOctokit = new Octokit({ auth: decryptedGithubToken });
const detectionResult = await detectForcePush({
giteaUrl: config.giteaConfig.url,
giteaToken: decryptedConfig.giteaConfig.token,
giteaOwner: repoOwner,
giteaRepo: repoName,
octokit: fpOctokit,
githubOwner: repository.owner,
githubRepo: repository.name,
});
forcePushDetected = detectionResult.detected;
if (detectionResult.skipped) {
console.log(
`[Sync] Force-push detection skipped for ${repository.name}: ${detectionResult.skipReason}`,
);
} else if (forcePushDetected) {
const branchNames = detectionResult.affectedBranches
.map((b) => `${b.name} (${b.reason})`)
.join(", ");
console.warn(
`[Sync] Force-push detected on ${repository.name}: ${branchNames}`,
);
}
} else {
console.log(
`[Sync] Skipping force-push detection for ${repository.name}: no GitHub token`,
);
}
} catch (detectionError) {
// Fail-open: detection errors should never block sync
console.warn(
`[Sync] Force-push detection failed for ${repository.name}, proceeding with sync: ${
detectionError instanceof Error ? detectionError.message : String(detectionError)
}`,
);
}
}
// Check if sync should be blocked (block-on-force-push mode)
if (shouldBlockSyncForStrategy(backupStrategy, forcePushDetected)) {
const branchInfo = `Force-push detected; sync blocked for manual approval.`;
await db
.update(repositories)
.set({
status: "pending-approval",
updatedAt: new Date(),
errorMessage: branchInfo,
})
.where(eq(repositories.id, repository.id!));
await createMirrorJob({
userId: config.userId,
repositoryId: repository.id,
repositoryName: repository.name,
message: `Sync blocked for ${repository.name}: force-push detected`,
details: branchInfo,
status: "pending-approval",
});
console.warn(`[Sync] Sync blocked for ${repository.name}: pending manual approval`);
return { blocked: true, reason: branchInfo };
}
// Create backup if strategy says so
if (shouldBackupForStrategy(backupStrategy, forcePushDetected)) {
const cloneUrl =
repoInfo.clone_url ||
`${config.giteaConfig.url.replace(/\/$/, "")}/${repoOwner}/${repoName}.git`;
try {
const backupResult = await createPreSyncBundleBackup({
config,
owner: repoOwner,
repoName,
cloneUrl,
force: true, // Strategy already decided to backup; skip legacy gate
});
await createMirrorJob({
userId: config.userId,
repositoryId: repository.id,
repositoryName: repository.name,
message: `Snapshot created for ${repository.name}`,
details: `Pre-sync snapshot created at ${backupResult.bundlePath}.`,
status: "syncing",
});
} catch (backupError) {
const errorMessage =
backupError instanceof Error ? backupError.message : String(backupError);
await createMirrorJob({
userId: config.userId,
repositoryId: repository.id,
repositoryName: repository.name,
message: `Snapshot failed for ${repository.name}`,
details: `Pre-sync snapshot failed: ${errorMessage}`,
status: "failed",
});
if (shouldBlockSyncOnBackupFailure(config)) {
await db
.update(repositories)
.set({
status: repoStatusEnum.parse("failed"),
updatedAt: new Date(),
errorMessage: `Snapshot failed; sync blocked to protect history. ${errorMessage}`,
})
.where(eq(repositories.id, repository.id!));
throw new Error(
`Snapshot failed; sync blocked to protect history. ${errorMessage}`,
);
}
console.warn(
`[Sync] Snapshot failed for ${repository.name}, continuing because blockSyncOnBackupFailure=false: ${errorMessage}`,
);
}
}
}
// Update mirror interval if needed
if (config.giteaConfig?.mirrorInterval) {
try {
- console.log(`[Sync] Updating mirror interval for ${repository.name} to ${config.giteaConfig.mirrorInterval}`);
- const updateUrl = `${config.giteaConfig.url}/api/v1/repos/${repoOwner}/${repository.name}`;
+ console.log(`[Sync] Updating mirror interval for ${repoOwner}/${repoName} to ${config.giteaConfig.mirrorInterval}`);
+ const updateUrl = `${config.giteaConfig.url}/api/v1/repos/${repoOwner}/${repoName}`;
await httpPatch(updateUrl, {
mirror_interval: config.giteaConfig.mirrorInterval,
}, {
Authorization: `token ${decryptedConfig.giteaConfig.token}`,
});
- console.log(`[Sync] Successfully updated mirror interval for ${repository.name}`);
+ console.log(`[Sync] Successfully updated mirror interval for ${repoOwner}/${repoName}`);
} catch (updateError) {
- console.warn(`[Sync] Failed to update mirror interval for ${repository.name}:`, updateError);
+ console.warn(`[Sync] Failed to update mirror interval for ${repoOwner}/${repoName}:`, updateError);
// Continue with sync even if interval update fails
}
}
// Perform the sync
- const apiUrl = `${config.giteaConfig.url}/api/v1/repos/${repoOwner}/${repository.name}/mirror-sync`;
+ const apiUrl = `${config.giteaConfig.url}/api/v1/repos/${repoOwner}/${repoName}/mirror-sync`;
try {
const response = await httpPost(apiUrl, undefined, {
@@ -388,7 +630,7 @@ export async function syncGiteaRepoEnhanced({
octokit,
repository,
giteaOwner: repoOwner,
- giteaRepoName: repository.name,
+ giteaRepoName: repoName,
});
metadataState.components.releases = true;
metadataUpdated = true;
@@ -420,7 +662,7 @@ export async function syncGiteaRepoEnhanced({
octokit,
repository,
giteaOwner: repoOwner,
- giteaRepoName: repository.name,
+ giteaRepoName: repoName,
});
metadataState.components.issues = true;
metadataState.components.labels = true;
@@ -453,7 +695,7 @@ export async function syncGiteaRepoEnhanced({
octokit,
repository,
giteaOwner: repoOwner,
- giteaRepoName: repository.name,
+ giteaRepoName: repoName,
});
metadataState.components.pullRequests = true;
metadataUpdated = true;
@@ -483,7 +725,7 @@ export async function syncGiteaRepoEnhanced({
octokit,
repository,
giteaOwner: repoOwner,
- giteaRepoName: repository.name,
+ giteaRepoName: repoName,
});
metadataState.components.labels = true;
metadataUpdated = true;
@@ -522,7 +764,7 @@ export async function syncGiteaRepoEnhanced({
octokit,
repository,
giteaOwner: repoOwner,
- giteaRepoName: repository.name,
+ giteaRepoName: repoName,
});
metadataState.components.milestones = true;
metadataUpdated = true;
@@ -560,7 +802,7 @@ export async function syncGiteaRepoEnhanced({
updatedAt: new Date(),
lastMirrored: new Date(),
errorMessage: null,
- mirroredLocation: `${repoOwner}/${repository.name}`,
+ mirroredLocation: `${repoOwner}/${repoName}`,
metadata: metadataUpdated
? serializeRepositoryMetadataState(metadataState)
: repository.metadata ?? null,
@@ -572,7 +814,7 @@ export async function syncGiteaRepoEnhanced({
repositoryId: repository.id,
repositoryName: repository.name,
message: `Sync requested for repository: ${repository.name}`,
- details: `Mirror sync was requested for ${repository.name}. Gitea/Forgejo performs the actual pull asynchronously; check remote logs for pull errors.`,
+ details: `Mirror sync was requested for ${repoOwner}/${repoName}.`,
status: "synced",
});


@@ -24,9 +24,14 @@ mock.module("@/lib/db", () => {
values: mock(() => Promise.resolve())
}))
},
users: {},
configs: {},
repositories: {},
organizations: {},
- events: {}
+ events: {},
+ mirrorJobs: {},
+ accounts: {},
+ sessions: {},
};
});
@@ -59,10 +64,16 @@ const mockGetOrCreateGiteaOrg = mock(async ({ orgName, config }: any) => {
const mockMirrorGitHubOrgRepoToGiteaOrg = mock(async () => {});
const mockIsRepoPresentInGitea = mock(async () => false);
const mockMirrorGithubRepoToGitea = mock(async () => {});
const mockGetGiteaRepoOwnerAsync = mock(async () => "starred");
const mockGetGiteaRepoOwner = mock(() => "starred");
mock.module("./gitea", () => ({
getOrCreateGiteaOrg: mockGetOrCreateGiteaOrg,
mirrorGitHubOrgRepoToGiteaOrg: mockMirrorGitHubOrgRepoToGiteaOrg,
mirrorGithubRepoToGitea: mockMirrorGithubRepoToGitea,
getGiteaRepoOwner: mockGetGiteaRepoOwner,
getGiteaRepoOwnerAsync: mockGetGiteaRepoOwnerAsync,
isRepoPresentInGitea: mockIsRepoPresentInGitea
}));
@@ -226,4 +237,4 @@ describe("Starred Repository Error Handling", () => {
});
});
});
});

src/lib/gitea-url.test.ts (new file, 45 lines)

@@ -0,0 +1,45 @@
import { describe, expect, it } from "bun:test";
import { buildGiteaWebUrl, getGiteaWebBaseUrl } from "@/lib/gitea-url";
describe("getGiteaWebBaseUrl", () => {
it("prefers externalUrl when both urls are present", () => {
const baseUrl = getGiteaWebBaseUrl({
url: "http://gitea:3000",
externalUrl: "https://git.example.com",
});
expect(baseUrl).toBe("https://git.example.com");
});
it("falls back to url when externalUrl is missing", () => {
const baseUrl = getGiteaWebBaseUrl({
url: "http://gitea:3000",
});
expect(baseUrl).toBe("http://gitea:3000");
});
it("trims a trailing slash", () => {
const baseUrl = getGiteaWebBaseUrl({
externalUrl: "https://git.example.com/",
});
expect(baseUrl).toBe("https://git.example.com");
});
});
describe("buildGiteaWebUrl", () => {
it("builds a full repository url and removes leading path slashes", () => {
const url = buildGiteaWebUrl(
{ externalUrl: "https://git.example.com/" },
"/org/repo"
);
expect(url).toBe("https://git.example.com/org/repo");
});
it("returns null when no gitea url is configured", () => {
const url = buildGiteaWebUrl({}, "org/repo");
expect(url).toBeNull();
});
});

src/lib/gitea-url.ts (new file, 28 lines)

@@ -0,0 +1,28 @@
interface GiteaUrlConfig {
url?: string | null;
externalUrl?: string | null;
}
export function getGiteaWebBaseUrl(
config?: GiteaUrlConfig | null
): string | null {
const rawBaseUrl = config?.externalUrl || config?.url;
if (!rawBaseUrl) {
return null;
}
return rawBaseUrl.endsWith("/") ? rawBaseUrl.slice(0, -1) : rawBaseUrl;
}
export function buildGiteaWebUrl(
config: GiteaUrlConfig | null | undefined,
path: string
): string | null {
const baseUrl = getGiteaWebBaseUrl(config);
if (!baseUrl) {
return null;
}
const normalizedPath = path.replace(/^\/+/, "");
return normalizedPath ? `${baseUrl}/${normalizedPath}` : baseUrl;
}
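Taken on its own, the new `gitea-url` module can be exercised standalone. A minimal sketch (the helpers are copied from the file above; the config values are made-up examples):

```typescript
// Standalone copies of the helpers above, for illustration only.
interface GiteaUrlConfig {
  url?: string | null;
  externalUrl?: string | null;
}

function getGiteaWebBaseUrl(config?: GiteaUrlConfig | null): string | null {
  const rawBaseUrl = config?.externalUrl || config?.url;
  if (!rawBaseUrl) return null;
  // Drop a single trailing slash so joined URLs never contain "//".
  return rawBaseUrl.endsWith("/") ? rawBaseUrl.slice(0, -1) : rawBaseUrl;
}

function buildGiteaWebUrl(
  config: GiteaUrlConfig | null | undefined,
  path: string
): string | null {
  const baseUrl = getGiteaWebBaseUrl(config);
  if (!baseUrl) return null;
  const normalizedPath = path.replace(/^\/+/, "");
  return normalizedPath ? `${baseUrl}/${normalizedPath}` : baseUrl;
}

// externalUrl (the user-facing address) wins over the internal container URL:
console.log(
  buildGiteaWebUrl(
    { url: "http://gitea:3000", externalUrl: "https://git.example.com/" },
    "/org/repo"
  )
);
// → https://git.example.com/org/repo
```

This is why the UI can link to `https://git.example.com/...` even when the app talks to Gitea through an internal Docker hostname.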


@@ -27,8 +27,14 @@ mock.module("@/lib/db", () => {
})
})
},
users: {},
configs: {},
repositories: {},
- organizations: {}
+ organizations: {},
+ mirrorJobs: {},
+ events: {},
+ accounts: {},
+ sessions: {},
};
});
@@ -55,8 +61,61 @@ mock.module("@/lib/http-client", () => {
// Mock the gitea module itself
mock.module("./gitea", () => {
const mockGetGiteaRepoOwner = mock(({ config, repository }: any) => {
if (repository?.isStarred && config?.githubConfig?.starredReposMode === "preserve-owner") {
return repository.organization || repository.owner;
}
if (repository?.isStarred) {
return config?.githubConfig?.starredReposOrg || "starred";
}
const mirrorStrategy =
config?.githubConfig?.mirrorStrategy ||
(config?.giteaConfig?.preserveOrgStructure ? "preserve" : "flat-user");
const configuredGitHubOwner =
(config?.githubConfig?.owner || config?.githubConfig?.username || "")
.trim()
.toLowerCase();
const repoOwner = repository?.owner?.trim().toLowerCase();
switch (mirrorStrategy) {
case "preserve":
if (repository?.organization) {
return repository.organization;
}
if (configuredGitHubOwner && repoOwner && repoOwner !== configuredGitHubOwner) {
return repository.owner;
}
return config?.giteaConfig?.defaultOwner || "giteauser";
case "single-org":
return config?.giteaConfig?.organization || config?.giteaConfig?.defaultOwner || "giteauser";
case "mixed":
if (repository?.organization) return repository.organization;
return config?.giteaConfig?.organization || config?.giteaConfig?.defaultOwner || "giteauser";
case "flat-user":
default:
return config?.giteaConfig?.defaultOwner || "giteauser";
}
});
const mockGetGiteaRepoOwnerAsync = mock(async ({ config, repository }: any) => {
if (repository?.isStarred && config?.githubConfig?.starredReposMode === "preserve-owner") {
return repository.organization || repository.owner;
}
if (repository?.destinationOrg) {
return repository.destinationOrg;
}
if (repository?.organization && mockDbSelectResult[0]?.destinationOrg) {
return mockDbSelectResult[0].destinationOrg;
}
return mockGetGiteaRepoOwner({ config, repository });
});
return {
isRepoPresentInGitea: mockIsRepoPresentInGitea,
getGiteaRepoOwner: mockGetGiteaRepoOwner,
getGiteaRepoOwnerAsync: mockGetGiteaRepoOwnerAsync,
mirrorGithubRepoToGitea: mock(async () => {}),
mirrorGitHubOrgRepoToGiteaOrg: mock(async () => {})
};
@@ -328,6 +387,7 @@ describe("Gitea Repository Mirroring", () => {
describe("getGiteaRepoOwner - Organization Override Tests", () => {
const baseConfig: Partial<Config> = {
githubConfig: {
owner: "testuser",
username: "testuser",
token: "token",
preserveOrgStructure: false,
@@ -436,6 +496,18 @@ describe("getGiteaRepoOwner - Organization Override Tests", () => {
expect(result).toBe("giteauser");
});
test("preserve strategy: personal repos owned by another user keep source owner namespace", () => {
const repo = {
...baseRepo,
owner: "nice-user",
fullName: "nice-user/test-repo",
organization: undefined,
isForked: true,
};
const result = getGiteaRepoOwner({ config: baseConfig, repository: repo });
expect(result).toBe("nice-user");
});
test("preserve strategy: org repos go to same org name", () => {
const repo = { ...baseRepo, organization: "myorg" };
const result = getGiteaRepoOwner({ config: baseConfig, repository: repo });
@@ -541,4 +613,26 @@ describe("getGiteaRepoOwner - Organization Override Tests", () => {
expect(result).toBe("FOO");
});
test("getGiteaRepoOwnerAsync preserves external personal owner for preserve strategy", async () => {
const configWithUser: Partial<Config> = {
...baseConfig,
userId: "user-id",
};
const repo = {
...baseRepo,
owner: "nice-user",
fullName: "nice-user/test-repo",
organization: undefined,
isForked: true,
};
const result = await getGiteaRepoOwnerAsync({
config: configWithUser,
repository: repo,
});
expect(result).toBe("nice-user");
});
});


@@ -10,9 +10,10 @@ import type { Organization, Repository } from "./db/schema";
import { httpPost, httpGet, httpDelete, httpPut, httpPatch } from "./http-client";
import { createMirrorJob } from "./helpers";
import { db, organizations, repositories } from "./db";
- import { eq, and } from "drizzle-orm";
+ import { eq, and, ne } from "drizzle-orm";
import { decryptConfigTokens } from "./utils/config-encryption";
import { formatDateShort } from "./utils";
import { buildGithubSourceAuthPayload } from "./utils/mirror-source-auth";
import {
parseRepositoryMetadataState,
serializeRepositoryMetadataState,
@@ -138,14 +139,35 @@ export const getGiteaRepoOwner = ({
// Get the mirror strategy - use preserveOrgStructure for backward compatibility
const mirrorStrategy = config.githubConfig.mirrorStrategy ||
(config.giteaConfig.preserveOrgStructure ? "preserve" : "flat-user");
const configuredGitHubOwner =
(
config.githubConfig.owner ||
(config.githubConfig as typeof config.githubConfig & { username?: string }).username ||
""
)
.trim()
.toLowerCase();
switch (mirrorStrategy) {
case "preserve":
// Keep GitHub structure - org repos go to same org, personal repos to user (or override)
// Keep GitHub structure:
// - org repos stay in the same org
// - personal repos owned by other users keep their source owner namespace
// - personal repos owned by the configured account go to defaultOwner
if (repository.organization) {
return repository.organization;
}
// Use personal repos override if configured, otherwise use username
const normalizedRepoOwner = repository.owner.trim().toLowerCase();
if (
normalizedRepoOwner &&
configuredGitHubOwner &&
normalizedRepoOwner !== configuredGitHubOwner
) {
return repository.owner;
}
// Personal repos from the configured GitHub account go to the configured default owner
return config.giteaConfig.defaultOwner;
case "single-org":
@@ -353,6 +375,161 @@ export const checkRepoLocation = async ({
return { present: false, actualOwner: expectedOwner };
};
const sanitizeTopicForGitea = (topic: string): string =>
topic
.trim()
.toLowerCase()
.replace(/[^a-z0-9-]+/g, "-")
.replace(/-+/g, "-")
.replace(/^-+/, "")
.replace(/-+$/, "");
const normalizeTopicsForGitea = (
topics: string[],
topicPrefix?: string
): string[] => {
const normalizedPrefix = topicPrefix ? sanitizeTopicForGitea(topicPrefix) : "";
const transformedTopics = topics
.map((topic) => sanitizeTopicForGitea(topic))
.filter((topic) => topic.length > 0)
.map((topic) => (normalizedPrefix ? `${normalizedPrefix}-${topic}` : topic));
return [...new Set(transformedTopics)];
};
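In isolation, the sanitize/normalize pipeline above maps arbitrary GitHub topics onto Gitea's lowercase, dash-only topic format, optionally prefixing and always deduplicating. A standalone sketch (the sample topics and prefix are made up):

```typescript
// Standalone copies of the two helpers above, for illustration only.
const sanitizeTopicForGitea = (topic: string): string =>
  topic
    .trim()
    .toLowerCase()
    .replace(/[^a-z0-9-]+/g, "-") // anything outside [a-z0-9-] becomes a dash
    .replace(/-+/g, "-") // collapse runs of dashes
    .replace(/^-+/, "") // strip leading dashes
    .replace(/-+$/, ""); // strip trailing dashes

const normalizeTopicsForGitea = (
  topics: string[],
  topicPrefix?: string
): string[] => {
  const normalizedPrefix = topicPrefix ? sanitizeTopicForGitea(topicPrefix) : "";
  const transformed = topics
    .map(sanitizeTopicForGitea)
    .filter((t) => t.length > 0)
    .map((t) => (normalizedPrefix ? `${normalizedPrefix}-${t}` : t));
  // Dedupe while preserving first-seen order.
  return [...new Set(transformed)];
};

console.log(normalizeTopicsForGitea(["C++", "Self Hosted", "self-hosted"], "Mirror"));
// → [ "mirror-c", "mirror-self-hosted" ]
```

Note that distinct GitHub topics can collapse to the same Gitea topic ("Self Hosted" and "self-hosted" above), which the `Set` absorbs silently.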
const getSourceRepositoryCoordinates = (repository: Repository) => {
const delimiterIndex = repository.fullName.indexOf("/");
if (
delimiterIndex > 0 &&
delimiterIndex < repository.fullName.length - 1
) {
return {
owner: repository.fullName.slice(0, delimiterIndex),
repo: repository.fullName.slice(delimiterIndex + 1),
};
}
return {
owner: repository.owner,
repo: repository.name,
};
};
const fetchGitHubTopics = async ({
octokit,
repository,
}: {
octokit: Octokit;
repository: Repository;
}): Promise<string[] | null> => {
const { owner, repo } = getSourceRepositoryCoordinates(repository);
try {
const response = await octokit.request("GET /repos/{owner}/{repo}/topics", {
owner,
repo,
headers: {
Accept: "application/vnd.github+json",
},
});
const names = (response.data as { names?: unknown }).names;
if (!Array.isArray(names)) {
console.warn(
`[Metadata] Unexpected topics payload for ${repository.fullName}; skipping topic sync.`
);
return null;
}
return names.filter((topic): topic is string => typeof topic === "string");
} catch (error) {
console.warn(
`[Metadata] Failed to fetch topics from GitHub for ${repository.fullName}: ${
error instanceof Error ? error.message : String(error)
}`
);
return null;
}
};
const syncRepositoryMetadataToGitea = async ({
config,
octokit,
repository,
giteaOwner,
giteaRepoName,
giteaToken,
}: {
config: Partial<Config>;
octokit: Octokit;
repository: Repository;
giteaOwner: string;
giteaRepoName: string;
giteaToken: string;
}): Promise<void> => {
const giteaBaseUrl = config.giteaConfig?.url;
if (!giteaBaseUrl) {
return;
}
const repoApiUrl = `${giteaBaseUrl}/api/v1/repos/${giteaOwner}/${giteaRepoName}`;
const authHeaders = {
Authorization: `token ${giteaToken}`,
};
const description = repository.description?.trim() || "";
try {
await httpPatch(
repoApiUrl,
{ description },
authHeaders
);
console.log(
`[Metadata] Synced description for ${repository.fullName} to ${giteaOwner}/${giteaRepoName}`
);
} catch (error) {
console.warn(
`[Metadata] Failed to sync description for ${repository.fullName} to ${giteaOwner}/${giteaRepoName}: ${
error instanceof Error ? error.message : String(error)
}`
);
}
if (config.giteaConfig?.addTopics === false) {
return;
}
const sourceTopics = await fetchGitHubTopics({ octokit, repository });
if (sourceTopics === null) {
console.warn(
`[Metadata] Skipping topic sync for ${repository.fullName} because GitHub topics could not be fetched.`
);
return;
}
const topics = normalizeTopicsForGitea(
sourceTopics,
config.giteaConfig?.topicPrefix
);
try {
await httpPut(
`${repoApiUrl}/topics`,
{ topics },
authHeaders
);
console.log(
`[Metadata] Synced ${topics.length} topic(s) for ${repository.fullName} to ${giteaOwner}/${giteaRepoName}`
);
} catch (error) {
console.warn(
`[Metadata] Failed to sync topics for ${repository.fullName} to ${giteaOwner}/${giteaRepoName}: ${
error instanceof Error ? error.message : String(error)
}`
);
}
};
export const mirrorGithubRepoToGitea = async ({
octokit,
repository,
@@ -376,6 +553,23 @@ export const mirrorGithubRepoToGitea = async ({
// Get the correct owner based on the strategy (with organization overrides)
let repoOwner = await getGiteaRepoOwnerAsync({ config, repository });
const mirrorStrategy = config.githubConfig.mirrorStrategy ||
(config.giteaConfig.preserveOrgStructure ? "preserve" : "flat-user");
const configuredGitHubOwner = (
config.githubConfig.owner ||
(config.githubConfig as typeof config.githubConfig & { username?: string }).username ||
""
)
.trim()
.toLowerCase();
const normalizedRepoOwner = repository.owner.trim().toLowerCase();
const isExternalPersonalRepoInPreserveMode =
mirrorStrategy === "preserve" &&
!repository.organization &&
!repository.isStarred &&
normalizedRepoOwner !== "" &&
configuredGitHubOwner !== "" &&
normalizedRepoOwner !== configuredGitHubOwner;
// Determine the actual repository name to use (handle duplicates for starred repos)
let targetRepoName = repository.name;
@@ -393,6 +587,7 @@ export const mirrorGithubRepoToGitea = async ({
orgName: repoOwner,
baseName: repository.name,
githubOwner,
fullName: repository.fullName,
strategy: config.githubConfig.starredDuplicateStrategy,
});
@@ -427,36 +622,66 @@ export const mirrorGithubRepoToGitea = async ({
});
if (isExisting) {
console.log(
`Repository ${targetRepoName} already exists in Gitea under ${repoOwner}. Updating database status.`
);
// Update database to reflect that the repository is already mirrored
await db
.update(repositories)
.set({
status: repoStatusEnum.parse("mirrored"),
updatedAt: new Date(),
lastMirrored: new Date(),
errorMessage: null,
mirroredLocation: `${repoOwner}/${targetRepoName}`,
})
.where(eq(repositories.id, repository.id!));
// Append log for "mirrored" status
await createMirrorJob({
userId: config.userId,
repositoryId: repository.id,
repositoryName: repository.name,
message: `Repository ${repository.name} already exists in Gitea`,
details: `Repository ${repository.name} was found to already exist in Gitea under ${repoOwner} and database status was updated.`,
status: "mirrored",
const { getGiteaRepoInfo, handleExistingNonMirrorRepo } = await import("./gitea-enhanced");
const existingRepoInfo = await getGiteaRepoInfo({
config,
owner: repoOwner,
repoName: targetRepoName,
});
console.log(
`Repository ${repository.name} database status updated to mirrored`
);
return;
if (existingRepoInfo && !existingRepoInfo.mirror) {
console.log(`Repository ${targetRepoName} exists but is not a mirror. Handling...`);
await handleExistingNonMirrorRepo({
config,
repository,
repoInfo: existingRepoInfo,
strategy: "delete", // Can be configured: "skip", "delete", or "rename"
});
} else if (existingRepoInfo?.mirror) {
console.log(
`Repository ${targetRepoName} already exists in Gitea under ${repoOwner}. Updating database status.`
);
await syncRepositoryMetadataToGitea({
config,
octokit,
repository,
giteaOwner: repoOwner,
giteaRepoName: targetRepoName,
giteaToken: decryptedConfig.giteaConfig.token,
});
// Update database to reflect that the repository is already mirrored
await db
.update(repositories)
.set({
status: repoStatusEnum.parse("mirrored"),
updatedAt: new Date(),
lastMirrored: new Date(),
errorMessage: null,
mirroredLocation: `${repoOwner}/${targetRepoName}`,
})
.where(eq(repositories.id, repository.id!));
// Append log for "mirrored" status
await createMirrorJob({
userId: config.userId,
repositoryId: repository.id,
repositoryName: repository.name,
message: `Repository ${repository.name} already exists in Gitea`,
details: `Repository ${repository.name} was found to already exist in Gitea under ${repoOwner} and database status was updated.`,
status: "mirrored",
});
console.log(
`Repository ${repository.name} database status updated to mirrored`
);
return;
} else {
console.warn(
`[Mirror] Repository ${repoOwner}/${targetRepoName} exists but mirror status could not be verified. Continuing with mirror creation flow.`
);
}
}
console.log(`Mirroring repository ${repository.name}`);
@@ -520,6 +745,13 @@ export const mirrorGithubRepoToGitea = async ({
(orgError.message.includes('Permission denied') ||
orgError.message.includes('Authentication failed') ||
orgError.message.includes('does not have permission'))) {
if (isExternalPersonalRepoInPreserveMode) {
throw new Error(
`Cannot create/access namespace "${repoOwner}" for ${repository.fullName}. ` +
`Refusing fallback to "${config.giteaConfig.defaultOwner}" in preserve mode to avoid cross-owner overwrite.`
);
}
console.warn(`[Fallback] Organization creation/access failed. Attempting to mirror to user account instead.`);
// Update the repository owner to use the user account
@@ -585,16 +817,28 @@ export const mirrorGithubRepoToGitea = async ({
// Add authentication for private repositories
if (repository.isPrivate) {
if (!config.githubConfig.token) {
throw new Error(
"GitHub token is required to mirror private repositories."
);
}
- // Use separate auth fields (required for Forgejo 12+ compatibility)
- migratePayload.auth_username = "oauth2"; // GitHub tokens work with any username
- migratePayload.auth_token = decryptedConfig.githubConfig.token;
+ const githubOwner =
+   (
+     config.githubConfig as typeof config.githubConfig & {
+       owner?: string;
+     }
+   ).owner || "";
+ Object.assign(
+   migratePayload,
+   buildGithubSourceAuthPayload({
+     token: decryptedConfig.githubConfig.token,
+     githubOwner,
+     githubUsername: config.githubConfig.username,
+     repositoryOwner: repository.owner,
+   })
+ );
}
// Track whether the Gitea migrate call succeeded so the catch block
// knows whether to clear mirroredLocation (only safe before migrate succeeds)
let migrateSucceeded = false;
const response = await httpPost(
apiUrl,
migratePayload,
@@ -603,6 +847,17 @@ export const mirrorGithubRepoToGitea = async ({
}
);
migrateSucceeded = true;
await syncRepositoryMetadataToGitea({
config,
octokit,
repository,
giteaOwner: repoOwner,
giteaRepoName: targetRepoName,
giteaToken: decryptedConfig.giteaConfig.token,
});
const metadataState = parseRepositoryMetadataState(repository.metadata);
let metadataUpdated = false;
const skipMetadataForStarred =
@@ -836,14 +1091,21 @@ export const mirrorGithubRepoToGitea = async ({
}`
);
- // Mark repos as "failed" in DB
+ // Mark repos as "failed" in DB. Only clear mirroredLocation if the Gitea
+ // migrate call itself failed (repo doesn't exist in Gitea). If migrate
+ // succeeded but metadata mirroring failed, preserve the location since
+ // the repo physically exists and we need the location for recovery/retry.
const failureUpdate: Record<string, any> = {
status: repoStatusEnum.parse("failed"),
updatedAt: new Date(),
errorMessage: error instanceof Error ? error.message : "Unknown error",
};
if (!migrateSucceeded) {
failureUpdate.mirroredLocation = "";
}
await db
.update(repositories)
- .set({
-   status: repoStatusEnum.parse("failed"),
-   updatedAt: new Date(),
-   errorMessage: error instanceof Error ? error.message : "Unknown error",
- })
+ .set(failureUpdate)
.where(eq(repositories.id, repository.id!));
// Append log for failure
@@ -894,29 +1156,103 @@ export async function getOrCreateGiteaOrg({
}
/**
- * Generate a unique repository name for starred repos with duplicate names
+ * Check if a candidate mirroredLocation is already claimed by another repository
+ * in the local database. This prevents race conditions during concurrent batch
+ * mirroring where two repos could both claim the same name before either
+ * finishes creating in Gitea.
*/
async function isMirroredLocationClaimedInDb({
userId,
candidateLocation,
excludeFullName,
}: {
userId: string;
candidateLocation: string;
excludeFullName: string;
}): Promise<boolean> {
try {
const existing = await db
.select({ id: repositories.id })
.from(repositories)
.where(
and(
eq(repositories.userId, userId),
eq(repositories.mirroredLocation, candidateLocation),
ne(repositories.fullName, excludeFullName)
)
)
.limit(1);
return existing.length > 0;
} catch (error) {
console.error(
`Error checking DB for mirroredLocation "${candidateLocation}":`,
error
);
// Fail-closed: assume claimed to be conservative and prevent collisions
return true;
}
}
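Abstracting away the DB and Gitea calls, the dedup flow implemented below amounts to: probe the base name, then walk suffixed candidates until one is free. A minimal sketch with an injected availability predicate (`pickUniqueRepoName`, `isNameTaken`, and the suffix scheme here are hypothetical simplifications, not the exact names or candidate order used by `generateUniqueRepoName`):

```typescript
// Hypothetical sketch of the name-dedup loop; `isNameTaken` stands in for the
// combined Gitea + local-DB availability probe described above.
async function pickUniqueRepoName(
  baseName: string,
  githubOwner: string,
  isNameTaken: (candidate: string) => Promise<boolean>,
  maxAttempts = 5
): Promise<string> {
  // Fast path: base name is free, use it unchanged.
  if (!(await isNameTaken(baseName))) return baseName;
  // Suffix strategy: qualify the name with its source owner, then a counter.
  const candidates = [
    `${baseName}-${githubOwner}`,
    ...Array.from({ length: maxAttempts }, (_, i) => `${baseName}-${githubOwner}-${i + 2}`),
  ];
  for (const candidate of candidates) {
    if (!(await isNameTaken(candidate))) return candidate;
  }
  throw new Error(`Could not find a free name for ${baseName} after ${maxAttempts + 1} attempts`);
}

// Example: "demo" and "demo-acme" are taken, so the next candidate wins.
const taken = new Set(["demo", "demo-acme"]);
pickUniqueRepoName("demo", "acme", async (n) => taken.has(n)).then(console.log);
// → demo-acme-2
```

Because this only *checks* availability, the real claim still happens at the `mirroredLocation` write, backed by the unique partial index noted in the doc comment below.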
/**
* Generate a unique repository name for starred repos with duplicate names.
* Checks both the Gitea instance (HTTP) and the local DB (mirroredLocation)
* to reduce collisions during concurrent batch mirroring.
*
* NOTE: This function only checks availability — it does NOT claim the name.
* The actual claim happens later when mirroredLocation is written at the
* status="mirroring" DB update, which is protected by a unique partial index
* on (userId, mirroredLocation) WHERE mirroredLocation != ''.
*/
async function generateUniqueRepoName({
config,
orgName,
baseName,
githubOwner,
fullName,
strategy,
}: {
config: Partial<Config>;
orgName: string;
baseName: string;
githubOwner: string;
fullName: string;
strategy?: string;
}): Promise<string> {
if (!fullName?.includes("/")) {
throw new Error(
`Invalid fullName "${fullName}" for starred repo dedup — expected "owner/repo" format`
);
}
const duplicateStrategy = strategy || "suffix";
const userId = config.userId || "";
// Helper: check both Gitea and local DB for a candidate name
const isNameTaken = async (candidateName: string): Promise<boolean> => {
const existsInGitea = await isRepoPresentInGitea({
config,
owner: orgName,
repoName: candidateName,
});
if (existsInGitea) return true;
// Also check local DB to catch concurrent batch operations
// where another repo claimed this location but hasn't created it in Gitea yet
if (userId) {
const claimedInDb = await isMirroredLocationClaimedInDb({
userId,
candidateLocation: `${orgName}/${candidateName}`,
excludeFullName: fullName,
});
if (claimedInDb) return true;
}
return false;
};
// First check if base name is available
- const baseExists = await isRepoPresentInGitea({
-   config,
-   owner: orgName,
-   repoName: baseName,
- });
+ const baseExists = await isNameTaken(baseName);
if (!baseExists) {
return baseName;
@@ -948,11 +1284,7 @@ async function generateUniqueRepoName({
break;
}
- const exists = await isRepoPresentInGitea({
-   config,
-   owner: orgName,
-   repoName: candidateName,
- });
+ const exists = await isNameTaken(candidateName);
if (!exists) {
console.log(`Found unique name for duplicate starred repo: ${candidateName}`);
@@ -1015,6 +1347,7 @@ export async function mirrorGitHubRepoToGiteaOrg({
orgName,
baseName: repository.name,
githubOwner,
fullName: repository.fullName,
strategy: config.githubConfig.starredDuplicateStrategy,
});
@@ -1049,36 +1382,66 @@ export async function mirrorGitHubRepoToGiteaOrg({
});
if (isExisting) {
console.log(
`Repository ${targetRepoName} already exists in Gitea organization ${orgName}. Updating database status.`
);
// Update database to reflect that the repository is already mirrored
await db
.update(repositories)
.set({
status: repoStatusEnum.parse("mirrored"),
updatedAt: new Date(),
lastMirrored: new Date(),
errorMessage: null,
mirroredLocation: `${orgName}/${targetRepoName}`,
})
.where(eq(repositories.id, repository.id!));
// Create a mirror job log entry
await createMirrorJob({
userId: config.userId,
repositoryId: repository.id,
repositoryName: repository.name,
message: `Repository ${targetRepoName} already exists in Gitea organization ${orgName}`,
details: `Repository ${targetRepoName} was found to already exist in Gitea organization ${orgName} and database status was updated.`,
status: "mirrored",
const { getGiteaRepoInfo, handleExistingNonMirrorRepo } = await import("./gitea-enhanced");
const existingRepoInfo = await getGiteaRepoInfo({
config,
owner: orgName,
repoName: targetRepoName,
});
console.log(
`Repository ${targetRepoName} database status updated to mirrored in organization ${orgName}`
);
return;
if (existingRepoInfo && !existingRepoInfo.mirror) {
console.log(`Repository ${targetRepoName} exists but is not a mirror. Handling...`);
await handleExistingNonMirrorRepo({
config,
repository,
repoInfo: existingRepoInfo,
strategy: "delete", // Can be configured: "skip", "delete", or "rename"
});
} else if (existingRepoInfo?.mirror) {
console.log(
`Repository ${targetRepoName} already exists in Gitea organization ${orgName}. Updating database status.`
);
await syncRepositoryMetadataToGitea({
config,
octokit,
repository,
giteaOwner: orgName,
giteaRepoName: targetRepoName,
giteaToken: decryptedConfig.giteaConfig.token,
});
// Update database to reflect that the repository is already mirrored
await db
.update(repositories)
.set({
status: repoStatusEnum.parse("mirrored"),
updatedAt: new Date(),
lastMirrored: new Date(),
errorMessage: null,
mirroredLocation: `${orgName}/${targetRepoName}`,
})
.where(eq(repositories.id, repository.id!));
// Create a mirror job log entry
await createMirrorJob({
userId: config.userId,
repositoryId: repository.id,
repositoryName: repository.name,
message: `Repository ${targetRepoName} already exists in Gitea organization ${orgName}`,
details: `Repository ${targetRepoName} was found to already exist in Gitea organization ${orgName} and database status was updated.`,
status: "mirrored",
});
console.log(
`Repository ${targetRepoName} database status updated to mirrored in organization ${orgName}`
);
return;
} else {
console.warn(
`[Mirror] Repository ${orgName}/${targetRepoName} exists but mirror status could not be verified. Continuing with mirror creation flow.`
);
}
}
console.log(
@@ -1137,20 +1500,31 @@ export async function mirrorGitHubRepoToGiteaOrg({
wiki: shouldMirrorWiki || false,
lfs: config.giteaConfig?.lfs || false,
private: repository.isPrivate,
description: repository.description?.trim() || "",
};
// Add authentication for private repositories
if (repository.isPrivate) {
if (!config.githubConfig?.token) {
throw new Error(
"GitHub token is required to mirror private repositories."
);
}
- // Use separate auth fields (required for Forgejo 12+ compatibility)
- migratePayload.auth_username = "oauth2"; // GitHub tokens work with any username
- migratePayload.auth_token = decryptedConfig.githubConfig.token;
+ const githubOwner =
+   (
+     config.githubConfig as typeof config.githubConfig & {
+       owner?: string;
+     }
+   )?.owner || "";
+ Object.assign(
+   migratePayload,
+   buildGithubSourceAuthPayload({
+     token: decryptedConfig.githubConfig?.token,
+     githubOwner,
+     githubUsername: config.githubConfig?.username,
+     repositoryOwner: repository.owner,
+   })
+ );
}
let migrateSucceeded = false;
const migrateRes = await httpPost(
apiUrl,
migratePayload,
@@ -1159,6 +1533,17 @@ export async function mirrorGitHubRepoToGiteaOrg({
}
);
migrateSucceeded = true;
await syncRepositoryMetadataToGitea({
config,
octokit,
repository,
giteaOwner: orgName,
giteaRepoName: targetRepoName,
giteaToken: decryptedConfig.giteaConfig.token,
});
const metadataState = parseRepositoryMetadataState(repository.metadata);
let metadataUpdated = false;
const skipMetadataForStarred =
@@ -1397,14 +1782,23 @@ export async function mirrorGitHubRepoToGiteaOrg({
error instanceof Error ? error.message : String(error)
}`
);
- // Mark repos as "failed" in DB
+ // Mark repos as "failed" in DB. For starred repos, clear mirroredLocation
+ // to release the name claim for retry. For non-starred repos, preserve it
+ // since the Gitea repo may partially exist and we need the location for recovery.
const failureUpdate2: Record<string, any> = {
status: repoStatusEnum.parse("failed"),
updatedAt: new Date(),
errorMessage: error instanceof Error ? error.message : "Unknown error",
};
// Only clear mirroredLocation if the Gitea migrate call itself failed.
// If migrate succeeded but metadata mirroring failed, preserve the
// location since the repo physically exists in Gitea.
if (!migrateSucceeded) {
failureUpdate2.mirroredLocation = "";
}
await db
.update(repositories)
- .set({
-   status: repoStatusEnum.parse("failed"),
-   updatedAt: new Date(),
-   errorMessage: error instanceof Error ? error.message : "Unknown error",
- })
+ .set(failureUpdate2)
.where(eq(repositories.id, repository.id!));
// Append log for failure


@@ -0,0 +1,319 @@
import { describe, expect, test, mock } from "bun:test";
import {
getGithubStarredListNames,
getGithubStarredRepositories,
} from "@/lib/github";
function makeRestStarredRepo(overrides: Record<string, unknown> = {}) {
return {
name: "demo",
full_name: "acme/demo",
html_url: "https://github.com/acme/demo",
clone_url: "https://github.com/acme/demo.git",
owner: {
login: "acme",
type: "Organization",
},
private: false,
fork: false,
has_issues: true,
archived: false,
size: 123,
language: "TypeScript",
description: "Demo",
default_branch: "main",
visibility: "public",
disabled: false,
created_at: "2024-01-01T00:00:00Z",
updated_at: "2024-01-02T00:00:00Z",
...overrides,
};
}
function makeGraphqlListRepo(
nameWithOwner: string,
overrides: Record<string, unknown> = {},
) {
const [owner, name] = nameWithOwner.split("/");
return {
__typename: "Repository" as const,
name,
nameWithOwner,
url: `https://github.com/${nameWithOwner}`,
sshUrl: `git@github.com:${nameWithOwner}.git`,
isPrivate: false,
isFork: false,
isArchived: false,
isDisabled: false,
hasIssuesEnabled: true,
diskUsage: 456,
description: `${name} repo`,
defaultBranchRef: { name: "main" },
visibility: "PUBLIC" as const,
updatedAt: "2024-01-02T00:00:00Z",
createdAt: "2024-01-01T00:00:00Z",
owner: {
__typename: "Organization" as const,
login: owner,
},
primaryLanguage: { name: "TypeScript" },
...overrides,
};
}
describe("GitHub starred lists support", () => {
test("falls back to REST starred endpoint when no lists are configured", async () => {
const paginate = mock(async () => [makeRestStarredRepo()]);
const graphql = mock(async () => {
throw new Error("GraphQL should not be used in REST fallback path");
});
const octokit = {
paginate,
graphql,
activity: {
listReposStarredByAuthenticatedUser: () => {},
},
} as any;
const repos = await getGithubStarredRepositories({
octokit,
config: { githubConfig: { starredLists: [] } } as any,
});
expect(repos).toHaveLength(1);
expect(repos[0].fullName).toBe("acme/demo");
expect(repos[0].isStarred).toBe(true);
expect(paginate).toHaveBeenCalledTimes(1);
expect(graphql).toHaveBeenCalledTimes(0);
});
test("filters starred repositories by configured list names and de-duplicates", async () => {
const paginate = mock(async () => []);
const graphql = mock(async (_query: string, variables?: Record<string, unknown>) => {
if (!variables || !("listId" in variables)) {
return {
viewer: {
lists: {
nodes: [
null,
{ id: "list-1", name: "HomeLab" },
{ id: "list-2", name: "DotTools" },
{ id: "list-3", name: "Ideas" },
],
pageInfo: { hasNextPage: false, endCursor: null },
},
},
};
}
if (variables.listId === "list-1") {
return {
node: {
items: {
nodes: [
null,
makeGraphqlListRepo("acme/repo-a"),
makeGraphqlListRepo("acme/repo-b"),
],
pageInfo: { hasNextPage: false, endCursor: null },
},
},
};
}
return {
node: {
items: {
nodes: [
makeGraphqlListRepo("acme/repo-b"),
makeGraphqlListRepo("acme/repo-c"),
],
pageInfo: { hasNextPage: false, endCursor: null },
},
},
};
});
const octokit = {
paginate,
graphql,
activity: {
listReposStarredByAuthenticatedUser: () => {},
},
} as any;
const repos = await getGithubStarredRepositories({
octokit,
config: {
githubConfig: {
starredLists: ["homelab", "dottools"],
},
} as any,
});
expect(repos).toHaveLength(3);
expect(repos.map((repo) => repo.fullName).sort()).toEqual([
"acme/repo-a",
"acme/repo-b",
"acme/repo-c",
]);
expect(paginate).toHaveBeenCalledTimes(0);
});
test("matches configured list names even when separators differ", async () => {
const paginate = mock(async () => []);
const graphql = mock(async (_query: string, variables?: Record<string, unknown>) => {
if (!variables || !("listId" in variables)) {
return {
viewer: {
lists: {
nodes: [
{ id: "list-1", name: "UI Frontend" },
{ id: "list-2", name: "Email | Self - Hosted" },
{ id: "list-3", name: "PaaS | Hosting | Deploy" },
],
pageInfo: { hasNextPage: false, endCursor: null },
},
},
};
}
if (variables.listId === "list-1") {
return {
node: {
items: {
nodes: [makeGraphqlListRepo("acme/ui-app")],
pageInfo: { hasNextPage: false, endCursor: null },
},
},
};
}
if (variables.listId === "list-2") {
return {
node: {
items: {
nodes: [makeGraphqlListRepo("acme/email-app")],
pageInfo: { hasNextPage: false, endCursor: null },
},
},
};
}
return {
node: {
items: {
nodes: [makeGraphqlListRepo("acme/paas-app")],
pageInfo: { hasNextPage: false, endCursor: null },
},
},
};
});
const octokit = {
paginate,
graphql,
activity: {
listReposStarredByAuthenticatedUser: () => {},
},
} as any;
const repos = await getGithubStarredRepositories({
octokit,
config: {
githubConfig: {
starredLists: ["ui-frontend", "email-self-hosted", "paas-hosting-deploy"],
},
} as any,
});
expect(repos).toHaveLength(3);
expect(repos.map((repo) => repo.fullName).sort()).toEqual([
"acme/email-app",
"acme/paas-app",
"acme/ui-app",
]);
expect(paginate).toHaveBeenCalledTimes(0);
});
test("throws when configured star list names do not match any GitHub list", async () => {
const paginate = mock(async () => []);
const graphql = mock(async (_query: string, variables?: Record<string, unknown>) => {
if (!variables || !("listId" in variables)) {
return {
viewer: {
lists: {
nodes: [{ id: "list-1", name: "HomeLab" }],
pageInfo: { hasNextPage: false, endCursor: null },
},
},
};
}
return {
node: {
items: {
nodes: [],
pageInfo: { hasNextPage: false, endCursor: null },
},
},
};
});
const octokit = {
paginate,
graphql,
activity: {
listReposStarredByAuthenticatedUser: () => {},
},
} as any;
await expect(
getGithubStarredRepositories({
octokit,
config: {
githubConfig: {
starredLists: ["MissingList"],
},
} as any,
}),
).rejects.toThrow("Configured GitHub star lists not found");
expect(paginate).toHaveBeenCalledTimes(0);
});
test("returns all available starred list names with pagination", async () => {
const graphql = mock(async (_query: string, variables?: Record<string, unknown>) => {
if (!variables?.after) {
return {
viewer: {
lists: {
nodes: [
null,
{ id: "a", name: "HomeLab" },
{ id: "b", name: "DotTools" },
],
pageInfo: { hasNextPage: true, endCursor: "cursor-1" },
},
},
};
}
return {
viewer: {
lists: {
nodes: [
{ id: "c", name: "Ideas" },
],
pageInfo: { hasNextPage: false, endCursor: null },
},
},
};
});
const octokit = { graphql } as any;
const lists = await getGithubStarredListNames({ octokit });
expect(lists).toEqual(["HomeLab", "DotTools", "Ideas"]);
expect(graphql).toHaveBeenCalledTimes(2);
});
});
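The separator-insensitive matching these tests exercise (e.g. `"Email | Self - Hosted"` matching the configured `"email-self-hosted"`) comes down to one normalization step. As a standalone sketch (hypothetical helper mirroring the `getStarredListMatchKey` logic, not part of this diff):

```typescript
// Unicode-normalize, lowercase, then keep only runs of letters/digits so that
// punctuation and spacing differences never affect list-name comparison.
function starListMatchKey(rawValue: string): string {
  const normalized = rawValue.normalize("NFKC").trim().toLowerCase();
  const tokens = normalized.match(/[\p{L}\p{N}]+/gu);
  return tokens ? tokens.join("") : "";
}

// "Email | Self - Hosted" and "email-self-hosted" collapse to the same key.
console.log(starListMatchKey("Email | Self - Hosted")); // "emailselfhosted"
```

Because both the configured names and the GitHub list names are reduced to this key before comparison, users can type list names with any separator convention.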


@@ -22,22 +22,30 @@ if (process.env.NODE_ENV !== "test") {
// Fallback to base Octokit if .plugin is not present
const MyOctokit: any = (Octokit as any)?.plugin?.call
? (Octokit as any).plugin(throttling)
: Octokit as any;
: (Octokit as any);
/**
* Creates an authenticated Octokit instance with rate limit tracking and throttling
*/
export function createGitHubClient(token: string, userId?: string, username?: string): Octokit {
export function createGitHubClient(
token: string,
userId?: string,
username?: string,
): Octokit {
// Create a proper User-Agent to identify our application
// This helps GitHub understand our traffic patterns and can provide better rate limits
const userAgent = username
? `gitea-mirror/3.5.4 (user:${username})`
const userAgent = username
? `gitea-mirror/3.5.4 (user:${username})`
: "gitea-mirror/3.5.4";
// Support GH_API_URL (preferred) or GITHUB_API_URL (may conflict with GitHub Actions)
// GitHub Actions sets GITHUB_API_URL to https://api.github.com by default
const baseUrl = process.env.GH_API_URL || process.env.GITHUB_API_URL || "https://api.github.com";
const octokit = new MyOctokit({
auth: token, // Always use token for authentication (5000 req/hr vs 60 for unauthenticated)
userAgent, // Identify our application and user
baseUrl: "https://api.github.com", // Explicitly set the API endpoint
baseUrl, // Configurable for E2E testing
log: {
debug: () => {},
info: console.log,
@@ -52,14 +60,19 @@ export function createGitHubClient(token: string, userId?: string, username?: st
},
},
throttle: {
onRateLimit: async (retryAfter: number, options: any, octokit: any, retryCount: number) => {
onRateLimit: async (
retryAfter: number,
options: any,
octokit: any,
retryCount: number,
) => {
const isSearch = options.url.includes("/search/");
const maxRetries = isSearch ? 5 : 3; // Search endpoints get more retries
console.warn(
`[GitHub] Rate limit hit for ${options.method} ${options.url}. Retry ${retryCount + 1}/${maxRetries}`
`[GitHub] Rate limit hit for ${options.method} ${options.url}. Retry ${retryCount + 1}/${maxRetries}`,
);
// Update rate limit status and notify UI (if available)
if (userId && RateLimitManager) {
await RateLimitManager.updateFromResponse(userId, {
@@ -68,7 +81,7 @@ export function createGitHubClient(token: string, userId?: string, username?: st
"x-ratelimit-reset": (Date.now() / 1000 + retryAfter).toString(),
});
}
if (userId && publishEvent) {
await publishEvent({
userId,
@@ -83,22 +96,29 @@ export function createGitHubClient(token: string, userId?: string, username?: st
},
});
}
// Retry with exponential backoff
if (retryCount < maxRetries) {
console.log(`[GitHub] Waiting ${retryAfter}s before retry...`);
return true;
}
// Max retries reached
console.error(`[GitHub] Max retries (${maxRetries}) reached for ${options.url}`);
console.error(
`[GitHub] Max retries (${maxRetries}) reached for ${options.url}`,
);
return false;
},
onSecondaryRateLimit: async (retryAfter: number, options: any, octokit: any, retryCount: number) => {
onSecondaryRateLimit: async (
retryAfter: number,
options: any,
octokit: any,
retryCount: number,
) => {
console.warn(
`[GitHub] Secondary rate limit hit for ${options.method} ${options.url}`
`[GitHub] Secondary rate limit hit for ${options.method} ${options.url}`,
);
// Update status and notify UI (if available)
if (userId && publishEvent) {
await publishEvent({
@@ -114,13 +134,15 @@ export function createGitHubClient(token: string, userId?: string, username?: st
},
});
}
// Retry up to 2 times for secondary rate limits
if (retryCount < 2) {
console.log(`[GitHub] Waiting ${retryAfter}s for secondary rate limit...`);
console.log(
`[GitHub] Waiting ${retryAfter}s for secondary rate limit...`,
);
return true;
}
return false;
},
// Throttle options to prevent hitting limits
@@ -129,50 +151,57 @@ export function createGitHubClient(token: string, userId?: string, username?: st
retryAfterBaseValue: 1000, // Base retry in ms
},
});
// Add additional rate limit tracking if userId is provided and RateLimitManager is available
// Add rate limit tracking hooks if userId is provided and RateLimitManager is available
if (userId && RateLimitManager) {
octokit.hook.after("request", async (response: any, options: any) => {
// Update rate limit from response headers
octokit.hook.after("request", async (response: any, _options: any) => {
if (response.headers) {
await RateLimitManager.updateFromResponse(userId, response.headers);
}
});
octokit.hook.error("request", async (error: any, options: any) => {
// Handle rate limit errors
if (error.status === 403 || error.status === 429) {
const message = error.message || "";
if (message.includes("rate limit") || message.includes("API rate limit")) {
console.error(`[GitHub] Rate limit error for user ${userId}: ${message}`);
if (
message.includes("rate limit") ||
message.includes("API rate limit")
) {
console.error(
`[GitHub] Rate limit error for user ${userId}: ${message}`,
);
// Update rate limit status from error response (if available)
if (error.response?.headers && RateLimitManager) {
await RateLimitManager.updateFromResponse(userId, error.response.headers);
await RateLimitManager.updateFromResponse(
userId,
error.response.headers,
);
}
// Create error event for UI (if available)
if (publishEvent) {
await publishEvent({
userId,
channel: "rate-limit",
payload: {
type: "error",
provider: "github",
error: message,
endpoint: options.url,
message: `Rate limit exceeded: ${message}`,
},
});
channel: "rate-limit",
payload: {
type: "error",
provider: "github",
error: message,
endpoint: options.url,
message: `Rate limit exceeded: ${message}`,
},
});
}
}
}
throw error;
});
}
return octokit;
}
@@ -213,7 +242,7 @@ export async function getGithubRepositories({
try {
const repos = await octokit.paginate(
octokit.repos.listForAuthenticatedUser,
{ per_page: 100 }
{ per_page: 100 },
);
const skipForks = config.githubConfig?.skipForks ?? false;
@@ -254,9 +283,11 @@ export async function getGithubRepositories({
visibility: (repo.visibility ?? "public") as GitRepo["visibility"],
status: "imported",
isDisabled: repo.disabled ?? false,
lastMirrored: undefined,
errorMessage: undefined,
importedAt: new Date(),
createdAt: repo.created_at ? new Date(repo.created_at) : new Date(),
updatedAt: repo.updated_at ? new Date(repo.updated_at) : new Date(),
}));
@@ -264,24 +295,297 @@ export async function getGithubRepositories({
throw new Error(
`Error fetching repositories: ${
error instanceof Error ? error.message : String(error)
}`
}`,
);
}
}
function getStarredListMatchKey(rawValue: string): string {
const normalized = rawValue.normalize("NFKC").trim().toLowerCase();
const tokens = normalized.match(/[\p{L}\p{N}]+/gu);
return tokens ? tokens.join("") : "";
}
function normalizeStarredListNames(rawLists: unknown): string[] {
if (!Array.isArray(rawLists)) return [];
const deduped = new Map<string, string>();
for (const value of rawLists) {
if (typeof value !== "string") continue;
const trimmed = value.trim();
if (!trimmed) continue;
const matchKey = getStarredListMatchKey(trimmed);
if (!matchKey || deduped.has(matchKey)) continue;
deduped.set(matchKey, trimmed);
}
return [...deduped.values()];
}
function toHttpsCloneUrl(repoUrl: string): string {
return repoUrl.endsWith(".git") ? repoUrl : `${repoUrl}.git`;
}
interface GitHubStarListNode {
id: string;
name: string;
}
interface GitHubRepositoryListItem {
__typename: "Repository";
name: string;
nameWithOwner: string;
url: string;
sshUrl: string;
isPrivate: boolean;
isFork: boolean;
isArchived: boolean;
isDisabled: boolean;
hasIssuesEnabled: boolean;
diskUsage: number;
description: string | null;
defaultBranchRef: { name: string } | null;
visibility: "PUBLIC" | "PRIVATE" | "INTERNAL";
updatedAt: string;
createdAt: string;
owner: {
__typename: "Organization" | "User" | string;
login: string;
};
primaryLanguage: {
name: string;
} | null;
}
async function getGithubStarLists(octokit: Octokit): Promise<GitHubStarListNode[]> {
const allLists: GitHubStarListNode[] = [];
let cursor: string | null = null;
do {
const result = await octokit.graphql<{
viewer: {
lists: {
nodes: Array<GitHubStarListNode | null> | null;
pageInfo: {
hasNextPage: boolean;
endCursor: string | null;
};
};
};
}>(
`
query($after: String) {
viewer {
lists(first: 50, after: $after) {
nodes {
id
name
}
pageInfo {
hasNextPage
endCursor
}
}
}
}
`,
{ after: cursor },
);
const lists = (result.viewer.lists.nodes ?? []).filter(
(list): list is GitHubStarListNode =>
!!list &&
typeof list.id === "string" &&
typeof list.name === "string",
);
allLists.push(...lists);
if (!result.viewer.lists.pageInfo.hasNextPage) break;
cursor = result.viewer.lists.pageInfo.endCursor;
} while (cursor);
return allLists;
}
async function getGithubRepositoriesForStarList(
octokit: Octokit,
listId: string,
): Promise<GitHubRepositoryListItem[]> {
const repositories: GitHubRepositoryListItem[] = [];
let cursor: string | null = null;
do {
const result = await octokit.graphql<{
node: {
items: {
nodes: Array<GitHubRepositoryListItem | null> | null;
pageInfo: {
hasNextPage: boolean;
endCursor: string | null;
};
};
} | null;
}>(
`
query($listId: ID!, $after: String) {
node(id: $listId) {
... on UserList {
items(first: 100, after: $after) {
nodes {
__typename
... on Repository {
name
nameWithOwner
url
sshUrl
isPrivate
isFork
isArchived
isDisabled
hasIssuesEnabled
diskUsage
description
defaultBranchRef {
name
}
visibility
updatedAt
createdAt
owner {
__typename
login
}
primaryLanguage {
name
}
}
}
pageInfo {
hasNextPage
endCursor
}
}
}
}
}
`,
{ listId, after: cursor },
);
const listNode = result.node;
if (!listNode) break;
const nodes = listNode.items.nodes ?? [];
for (const node of nodes) {
if (node?.__typename === "Repository") {
repositories.push(node);
}
}
if (!listNode.items.pageInfo.hasNextPage) break;
cursor = listNode.items.pageInfo.endCursor;
} while (cursor);
return repositories;
}
function mapGraphqlRepoToGitRepo(repo: GitHubRepositoryListItem): GitRepo {
const visibility = (repo.visibility ?? "PUBLIC").toLowerCase() as GitRepo["visibility"];
const createdAt = repo.createdAt ? new Date(repo.createdAt) : new Date();
const updatedAt = repo.updatedAt ? new Date(repo.updatedAt) : new Date();
return {
name: repo.name,
fullName: repo.nameWithOwner,
url: repo.url,
cloneUrl: toHttpsCloneUrl(repo.url),
owner: repo.owner.login,
organization: repo.owner.__typename === "Organization" ? repo.owner.login : undefined,
mirroredLocation: "",
destinationOrg: null,
isPrivate: repo.isPrivate,
isForked: repo.isFork,
forkedFrom: undefined,
hasIssues: repo.hasIssuesEnabled,
isStarred: true,
isArchived: repo.isArchived,
size: repo.diskUsage ?? 0,
hasLFS: false,
hasSubmodules: false,
language: repo.primaryLanguage?.name ?? null,
description: repo.description,
defaultBranch: repo.defaultBranchRef?.name || "main",
visibility,
status: "imported",
isDisabled: repo.isDisabled,
lastMirrored: undefined,
errorMessage: undefined,
importedAt: new Date(),
createdAt,
updatedAt,
};
}
export async function getGithubStarredRepositories({
octokit,
config,
}: {
octokit: Octokit;
config: Partial<Config>;
}) {
}): Promise<GitRepo[]> {
try {
const configuredLists = normalizeStarredListNames(
config.githubConfig?.starredLists,
);
if (configuredLists.length > 0) {
const allLists = await getGithubStarLists(octokit);
const configuredMatchKeySet = new Set(
configuredLists.map((list) => getStarredListMatchKey(list)),
);
const matchedLists = allLists.filter((list) =>
configuredMatchKeySet.has(getStarredListMatchKey(list.name)),
);
if (matchedLists.length === 0) {
const availableListNames = normalizeStarredListNames(
allLists.map((list) => list.name),
);
const preview = availableListNames.slice(0, 20).join(", ");
const availableSuffix = preview
? `. Available lists: ${preview}${availableListNames.length > 20 ? ", ..." : ""}`
: "";
throw new Error(
`Configured GitHub star lists not found: ${configuredLists.join(", ")}${availableSuffix}`,
);
}
const deduped = new Map<string, GitRepo>();
for (const list of matchedLists) {
const repos = await getGithubRepositoriesForStarList(octokit, list.id);
for (const repo of repos) {
const key = repo.nameWithOwner.toLowerCase();
if (deduped.has(key)) continue;
deduped.set(key, mapGraphqlRepoToGitRepo(repo));
}
}
return [...deduped.values()];
}
const starredRepos = await octokit.paginate(
octokit.activity.listReposStarredByAuthenticatedUser,
{
per_page: 100,
}
},
);
return starredRepos.map((repo) => ({
@@ -314,9 +618,11 @@ export async function getGithubStarredRepositories({
visibility: (repo.visibility ?? "public") as GitRepo["visibility"],
status: "imported",
isDisabled: repo.disabled ?? false,
lastMirrored: undefined,
errorMessage: undefined,
importedAt: new Date(),
createdAt: repo.created_at ? new Date(repo.created_at) : new Date(),
updatedAt: repo.updated_at ? new Date(repo.updated_at) : new Date(),
}));
@@ -324,11 +630,20 @@ export async function getGithubStarredRepositories({
throw new Error(
`Error fetching starred repositories: ${
error instanceof Error ? error.message : String(error)
}`
}`,
);
}
}
export async function getGithubStarredListNames({
octokit,
}: {
octokit: Octokit;
}): Promise<string[]> {
const lists = await getGithubStarLists(octokit);
return normalizeStarredListNames(lists.map((list) => list.name));
}
/**
* Get user github organizations
*/
@@ -338,7 +653,7 @@ export async function getGithubOrganizations({
}: {
octokit: Octokit;
config: Partial<Config>;
}): Promise<GitOrg[]> {
}): Promise<{ organizations: GitOrg[]; failedOrgs: { name: string; avatarUrl: string; reason: string }[] }> {
try {
const { data: orgs } = await octokit.orgs.listForAuthenticatedUser({
per_page: 100,
@@ -347,47 +662,66 @@ export async function getGithubOrganizations({
// Get excluded organizations from environment variable
const excludedOrgsEnv = process.env.GITHUB_EXCLUDED_ORGS;
const excludedOrgs = excludedOrgsEnv
? excludedOrgsEnv.split(',').map(org => org.trim().toLowerCase())
? excludedOrgsEnv.split(",").map((org) => org.trim().toLowerCase())
: [];
// Filter out excluded organizations
const filteredOrgs = orgs.filter(org => {
const filteredOrgs = orgs.filter((org) => {
if (excludedOrgs.includes(org.login.toLowerCase())) {
console.log(`Skipping organization ${org.login} - excluded via GITHUB_EXCLUDED_ORGS environment variable`);
console.log(
`Skipping organization ${org.login} - excluded via GITHUB_EXCLUDED_ORGS environment variable`,
);
return false;
}
return true;
});
const organizations = await Promise.all(
const failedOrgs: { name: string; avatarUrl: string; reason: string }[] = [];
const results = await Promise.all(
filteredOrgs.map(async (org) => {
const [{ data: orgDetails }, { data: membership }] = await Promise.all([
octokit.orgs.get({ org: org.login }),
octokit.orgs.getMembershipForAuthenticatedUser({ org: org.login }),
]);
try {
const [{ data: orgDetails }, { data: membership }] = await Promise.all([
octokit.orgs.get({ org: org.login }),
octokit.orgs.getMembershipForAuthenticatedUser({ org: org.login }),
]);
const totalRepos =
orgDetails.public_repos + (orgDetails.total_private_repos ?? 0);
const totalRepos =
orgDetails.public_repos + (orgDetails.total_private_repos ?? 0);
return {
name: org.login,
avatarUrl: org.avatar_url,
membershipRole: membership.role as MembershipRole,
isIncluded: false,
status: "imported" as RepoStatus,
repositoryCount: totalRepos,
createdAt: new Date(),
updatedAt: new Date(),
};
})
return {
name: org.login,
avatarUrl: org.avatar_url,
membershipRole: membership.role as MembershipRole,
isIncluded: false,
status: "imported" as RepoStatus,
repositoryCount: totalRepos,
createdAt: new Date(),
updatedAt: new Date(),
};
} catch (error: any) {
// Capture organizations that return 403 (SAML enforcement, insufficient token scope, etc.)
if (error?.status === 403) {
const reason = error?.message || "access denied";
console.warn(
`Failed to import organization ${org.login} - ${reason}`,
);
failedOrgs.push({ name: org.login, avatarUrl: org.avatar_url, reason });
return null;
}
throw error;
}
}),
);
return organizations;
return {
organizations: results.filter((org): org is NonNullable<typeof org> => org !== null),
failedOrgs,
};
} catch (error) {
throw new Error(
`Error fetching organizations: ${
error instanceof Error ? error.message : String(error)
}`
}`,
);
}
}
@@ -438,9 +772,11 @@ export async function getGithubOrganizationRepositories({
visibility: (repo.visibility ?? "public") as GitRepo["visibility"],
status: "imported",
isDisabled: repo.disabled ?? false,
lastMirrored: undefined,
errorMessage: undefined,
importedAt: new Date(),
createdAt: repo.created_at ? new Date(repo.created_at) : new Date(),
updatedAt: repo.updated_at ? new Date(repo.updated_at) : new Date(),
}));
@@ -448,7 +784,7 @@ export async function getGithubOrganizationRepositories({
throw new Error(
`Error fetching organization repositories: ${
error instanceof Error ? error.message : String(error)
}`
}`,
);
}
}
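Both `getGithubStarLists` and `getGithubRepositoriesForStarList` above repeat the same `do/while` cursor loop over GraphQL `pageInfo`. As a sketch only (a hypothetical generic helper, not something this diff introduces), the pattern generalizes to:

```typescript
// Shape of one GraphQL connection page: possibly-null nodes plus cursor info.
interface Page<T> {
  nodes: Array<T | null> | null;
  pageInfo: { hasNextPage: boolean; endCursor: string | null };
}

// Fetch pages until hasNextPage is false, dropping the null placeholders
// GitHub may return inside `nodes` (as the functions above also do).
async function paginateAll<T>(
  fetchPage: (after: string | null) => Promise<Page<T>>,
): Promise<T[]> {
  const items: T[] = [];
  let cursor: string | null = null;
  do {
    const page = await fetchPage(cursor);
    items.push(...(page.nodes ?? []).filter((n): n is T => n !== null));
    if (!page.pageInfo.hasNextPage) break;
    cursor = page.pageInfo.endCursor;
  } while (cursor);
  return items;
}
```

Breaking on `hasNextPage` before reading `endCursor` avoids an extra round trip when the final page reports a non-null cursor.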


@@ -3,6 +3,7 @@ import { db, mirrorJobs } from "./db";
import { eq, and, or, lt, isNull } from "drizzle-orm";
import { v4 as uuidv4 } from "uuid";
import { publishEvent } from "./events";
import { triggerJobNotification } from "./notification-service";
export async function createMirrorJob({
userId,
@@ -19,6 +20,7 @@ export async function createMirrorJob({
itemIds,
inProgress,
skipDuplicateEvent,
skipNotification,
}: {
userId: string;
organizationId?: string;
@@ -34,6 +36,7 @@ export async function createMirrorJob({
itemIds?: string[];
inProgress?: boolean;
skipDuplicateEvent?: boolean; // Option to skip event publishing for internal operations
skipNotification?: boolean; // Option to skip push notifications for specific internal operations
}) {
const jobId = uuidv4();
const currentTimestamp = new Date();
@@ -67,7 +70,7 @@ export async function createMirrorJob({
// Insert the job into the database
await db.insert(mirrorJobs).values(job);
// Publish the event using SQLite instead of Redis (unless skipped)
// Publish realtime status events unless explicitly skipped
if (!skipDuplicateEvent) {
const channel = `mirror-status:${userId}`;
@@ -89,6 +92,15 @@ export async function createMirrorJob({
});
}
// Trigger push notifications for terminal statuses (never blocks the mirror flow).
// Keep this independent from skipDuplicateEvent so event-stream suppression does not
// silently disable user-facing notifications.
if (!skipNotification && (status === "failed" || status === "mirrored" || status === "synced")) {
triggerJobNotification({ userId, status, repositoryName, organizationName, message, details }).catch(err => {
console.error("[NotificationService] Background notification failed:", err);
});
}
return jobId;
} catch (error) {
console.error("Error creating mirror job:", error);
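The `triggerJobNotification(...).catch(...)` call above is deliberately not awaited. A minimal sketch of that fire-and-forget pattern (hypothetical names, assuming only what the surrounding comments state):

```typescript
// Stand-in for a notification sender that may fail on provider outages.
async function notify(status: string): Promise<void> {
  if (status === "failed") throw new Error("provider unreachable");
}

function recordTerminalStatus(status: "mirrored" | "synced" | "failed"): string {
  // Detach the notification promise: attach .catch() so a rejection is logged
  // but can never propagate into the caller's await chain or fail the job.
  notify(status).catch((err) => {
    console.error("notification failed:", err);
  });
  return status; // the job result is returned immediately, without awaiting notify
}
```

This keeps push notifications strictly best-effort: the mirror flow's outcome is decided before the notification settles.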


@@ -0,0 +1,221 @@
import { describe, test, expect, beforeEach, mock } from "bun:test";
// Mock fetch globally before importing the module
let mockFetch: ReturnType<typeof mock>;
beforeEach(() => {
mockFetch = mock(() =>
Promise.resolve(new Response("ok", { status: 200 }))
);
globalThis.fetch = mockFetch as any;
});
// Mock encryption module
mock.module("@/lib/utils/encryption", () => ({
encrypt: (val: string) => val,
decrypt: (val: string) => val,
isEncrypted: () => false,
}));
// Import after mocks are set up — db is already mocked via setup.bun.ts
import { sendNotification, testNotification } from "./notification-service";
import type { NotificationConfig } from "@/types/config";
describe("sendNotification", () => {
test("sends ntfy notification when provider is ntfy", async () => {
const config: NotificationConfig = {
enabled: true,
provider: "ntfy",
notifyOnSyncError: true,
notifyOnSyncSuccess: true,
notifyOnNewRepo: false,
ntfy: {
url: "https://ntfy.sh",
topic: "test-topic",
priority: "default",
},
};
await sendNotification(config, {
title: "Test",
message: "Test message",
type: "sync_success",
});
expect(mockFetch).toHaveBeenCalledTimes(1);
const [url] = mockFetch.mock.calls[0];
expect(url).toBe("https://ntfy.sh/test-topic");
});
test("sends apprise notification when provider is apprise", async () => {
const config: NotificationConfig = {
enabled: true,
provider: "apprise",
notifyOnSyncError: true,
notifyOnSyncSuccess: true,
notifyOnNewRepo: false,
apprise: {
url: "http://apprise:8000",
token: "my-token",
},
};
await sendNotification(config, {
title: "Test",
message: "Test message",
type: "sync_success",
});
expect(mockFetch).toHaveBeenCalledTimes(1);
const [url] = mockFetch.mock.calls[0];
expect(url).toBe("http://apprise:8000/notify/my-token");
});
test("does not throw when fetch fails", async () => {
mockFetch = mock(() => Promise.reject(new Error("Network error")));
globalThis.fetch = mockFetch as any;
const config: NotificationConfig = {
enabled: true,
provider: "ntfy",
notifyOnSyncError: true,
notifyOnSyncSuccess: true,
notifyOnNewRepo: false,
ntfy: {
url: "https://ntfy.sh",
topic: "test-topic",
priority: "default",
},
};
// Should not throw
await sendNotification(config, {
title: "Test",
message: "Test message",
type: "sync_success",
});
});
test("skips notification when ntfy topic is missing", async () => {
const config: NotificationConfig = {
enabled: true,
provider: "ntfy",
notifyOnSyncError: true,
notifyOnSyncSuccess: true,
notifyOnNewRepo: false,
ntfy: {
url: "https://ntfy.sh",
topic: "",
priority: "default",
},
};
await sendNotification(config, {
title: "Test",
message: "Test message",
type: "sync_success",
});
expect(mockFetch).not.toHaveBeenCalled();
});
test("skips notification when apprise URL is missing", async () => {
const config: NotificationConfig = {
enabled: true,
provider: "apprise",
notifyOnSyncError: true,
notifyOnSyncSuccess: true,
notifyOnNewRepo: false,
apprise: {
url: "",
token: "my-token",
},
};
await sendNotification(config, {
title: "Test",
message: "Test message",
type: "sync_success",
});
expect(mockFetch).not.toHaveBeenCalled();
});
});
describe("testNotification", () => {
test("returns success when notification is sent", async () => {
const config: NotificationConfig = {
enabled: true,
provider: "ntfy",
notifyOnSyncError: true,
notifyOnSyncSuccess: true,
notifyOnNewRepo: false,
ntfy: {
url: "https://ntfy.sh",
topic: "test-topic",
priority: "default",
},
};
const result = await testNotification(config);
expect(result.success).toBe(true);
expect(result.error).toBeUndefined();
});
test("returns error when topic is missing", async () => {
const config: NotificationConfig = {
enabled: true,
provider: "ntfy",
notifyOnSyncError: true,
notifyOnSyncSuccess: true,
notifyOnNewRepo: false,
ntfy: {
url: "https://ntfy.sh",
topic: "",
priority: "default",
},
};
const result = await testNotification(config);
expect(result.success).toBe(false);
expect(result.error).toContain("topic");
});
test("returns error when fetch fails", async () => {
mockFetch = mock(() =>
Promise.resolve(new Response("bad request", { status: 400 }))
);
globalThis.fetch = mockFetch as any;
const config: NotificationConfig = {
enabled: true,
provider: "ntfy",
notifyOnSyncError: true,
notifyOnSyncSuccess: true,
notifyOnNewRepo: false,
ntfy: {
url: "https://ntfy.sh",
topic: "test-topic",
priority: "default",
},
};
const result = await testNotification(config);
expect(result.success).toBe(false);
expect(result.error).toBeDefined();
});
test("returns error for unknown provider", async () => {
const config = {
enabled: true,
provider: "unknown" as any,
notifyOnSyncError: true,
notifyOnSyncSuccess: true,
notifyOnNewRepo: false,
};
const result = await testNotification(config);
expect(result.success).toBe(false);
expect(result.error).toContain("Unknown provider");
});
});


@@ -0,0 +1,189 @@
import type { NotificationConfig } from "@/types/config";
import type { NotificationEvent } from "./providers/ntfy";
import { sendNtfyNotification } from "./providers/ntfy";
import { sendAppriseNotification } from "./providers/apprise";
import { db, configs } from "@/lib/db";
import { eq } from "drizzle-orm";
import { decrypt } from "@/lib/utils/encryption";
function sanitizeTestNotificationError(error: unknown): string {
if (!(error instanceof Error)) {
return "Failed to send test notification";
}
const safeErrorPatterns = [
/topic is required/i,
/url and token are required/i,
/unknown provider/i,
/bad request/i,
/unauthorized/i,
/forbidden/i,
/not found/i,
/timeout/i,
/network error/i,
/invalid/i,
];
if (safeErrorPatterns.some((pattern) => pattern.test(error.message))) {
return error.message;
}
return "Failed to send test notification";
}
/**
* Sends a notification using the configured provider.
* NEVER throws -- all errors are caught and logged.
*/
export async function sendNotification(
config: NotificationConfig,
event: NotificationEvent,
): Promise<void> {
try {
if (config.provider === "ntfy") {
if (!config.ntfy?.topic) {
console.warn("[NotificationService] Ntfy topic is not configured, skipping notification");
return;
}
await sendNtfyNotification(config.ntfy, event);
} else if (config.provider === "apprise") {
if (!config.apprise?.url || !config.apprise?.token) {
console.warn("[NotificationService] Apprise URL or token is not configured, skipping notification");
return;
}
await sendAppriseNotification(config.apprise, event);
}
} catch (error) {
console.error("[NotificationService] Failed to send notification:", error);
}
}
/**
* Sends a test notification and returns the result.
* Unlike sendNotification, this propagates the success/error status
* so the UI can display the outcome.
*/
export async function testNotification(
notificationConfig: NotificationConfig,
): Promise<{ success: boolean; error?: string }> {
const event: NotificationEvent = {
title: "Gitea Mirror - Test Notification",
message: "This is a test notification from Gitea Mirror. If you see this, notifications are working correctly!",
type: "sync_success",
};
try {
if (notificationConfig.provider === "ntfy") {
if (!notificationConfig.ntfy?.topic) {
return { success: false, error: "Ntfy topic is required" };
}
await sendNtfyNotification(notificationConfig.ntfy, event);
} else if (notificationConfig.provider === "apprise") {
if (!notificationConfig.apprise?.url || !notificationConfig.apprise?.token) {
return { success: false, error: "Apprise URL and token are required" };
}
await sendAppriseNotification(notificationConfig.apprise, event);
} else {
return { success: false, error: `Unknown provider: ${notificationConfig.provider}` };
}
return { success: true };
} catch (error) {
return { success: false, error: sanitizeTestNotificationError(error) };
}
}
/**
* Loads the user's notification config from the database and triggers
* a notification if the event type matches the user's preferences.
*
* NEVER throws -- all errors are caught and logged. This function is
* designed to be called fire-and-forget from the mirror job system.
*/
export async function triggerJobNotification({
userId,
status,
repositoryName,
organizationName,
message,
details,
}: {
userId: string;
status: string;
repositoryName?: string | null;
organizationName?: string | null;
message?: string;
details?: string;
}): Promise<void> {
try {
// Only trigger for terminal statuses
if (status !== "failed" && status !== "mirrored" && status !== "synced") {
return;
}
// Fetch user's config from database
const configResults = await db
.select()
.from(configs)
.where(eq(configs.userId, userId))
.limit(1);
if (configResults.length === 0) {
return;
}
const userConfig = configResults[0];
const notificationConfig = userConfig.notificationConfig as NotificationConfig | undefined;
if (!notificationConfig?.enabled) {
return;
}
// Check event type against user preferences
const isError = status === "failed";
const isSuccess = status === "mirrored" || status === "synced";
if (isError && !notificationConfig.notifyOnSyncError) {
return;
}
if (isSuccess && !notificationConfig.notifyOnSyncSuccess) {
return;
}
// Only decrypt the active provider's token: a stale credential on the
// inactive provider would otherwise fail decryption and drop the notification
const decryptedConfig = { ...notificationConfig };
if (decryptedConfig.provider === "ntfy" && decryptedConfig.ntfy?.token) {
decryptedConfig.ntfy = {
...decryptedConfig.ntfy,
token: decrypt(decryptedConfig.ntfy.token),
};
}
if (decryptedConfig.provider === "apprise" && decryptedConfig.apprise?.token) {
decryptedConfig.apprise = {
...decryptedConfig.apprise,
token: decrypt(decryptedConfig.apprise.token),
};
}
// Build event
const repoLabel = repositoryName || organizationName || "Unknown";
const eventType: NotificationEvent["type"] = isError ? "sync_error" : "sync_success";
const event: NotificationEvent = {
title: isError
? `Mirror Failed: ${repoLabel}`
: `Mirror Success: ${repoLabel}`,
message: [
message || `Repository ${repoLabel} ${isError ? "failed to mirror" : "mirrored successfully"}`,
details ? `\nDetails: ${details}` : "",
]
.filter(Boolean)
.join(""),
type: eventType,
};
await sendNotification(decryptedConfig, event);
} catch (error) {
console.error("[NotificationService] Background notification failed:", error);
}
}
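The fire-and-forget contract described in the doc comment boils down to a terminal-status gate. A standalone sketch (the names `TERMINAL_STATUSES` and `isTerminalStatus` are illustrative, not identifiers from this diff):

```typescript
// Illustrative sketch only -- these names are not part of the diff.
// triggerJobNotification ignores any status outside this terminal set
// before touching the database, so intermediate states never notify.
const TERMINAL_STATUSES = new Set(["failed", "mirrored", "synced"]);

function isTerminalStatus(status: string): boolean {
  return TERMINAL_STATUSES.has(status);
}

// A job runner would call the service without awaiting, e.g.:
//   void triggerJobNotification({ userId, status, repositoryName });
```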


@@ -0,0 +1,98 @@
import { describe, test, expect, beforeEach, mock } from "bun:test";
import { sendAppriseNotification } from "./apprise";
import type { NotificationEvent } from "./ntfy";
import type { AppriseConfig } from "@/types/config";
describe("sendAppriseNotification", () => {
let mockFetch: ReturnType<typeof mock>;
beforeEach(() => {
mockFetch = mock(() =>
Promise.resolve(new Response("ok", { status: 200 }))
);
globalThis.fetch = mockFetch as any;
});
const baseConfig: AppriseConfig = {
url: "http://apprise:8000",
token: "gitea-mirror",
};
const baseEvent: NotificationEvent = {
title: "Test Notification",
message: "This is a test",
type: "sync_success",
};
test("constructs correct URL from config", async () => {
await sendAppriseNotification(baseConfig, baseEvent);
expect(mockFetch).toHaveBeenCalledTimes(1);
const [url] = mockFetch.mock.calls[0];
expect(url).toBe("http://apprise:8000/notify/gitea-mirror");
});
test("strips trailing slash from URL", async () => {
await sendAppriseNotification(
{ ...baseConfig, url: "http://apprise:8000/" },
baseEvent
);
const [url] = mockFetch.mock.calls[0];
expect(url).toBe("http://apprise:8000/notify/gitea-mirror");
});
test("sends correct JSON body format", async () => {
await sendAppriseNotification(baseConfig, baseEvent);
const [, opts] = mockFetch.mock.calls[0];
expect(opts.headers["Content-Type"]).toBe("application/json");
const body = JSON.parse(opts.body);
expect(body.title).toBe("Test Notification");
expect(body.body).toBe("This is a test");
expect(body.type).toBe("success");
});
test("maps sync_error to failure type", async () => {
const errorEvent: NotificationEvent = {
...baseEvent,
type: "sync_error",
};
await sendAppriseNotification(baseConfig, errorEvent);
const [, opts] = mockFetch.mock.calls[0];
const body = JSON.parse(opts.body);
expect(body.type).toBe("failure");
});
test("includes tag when configured", async () => {
await sendAppriseNotification(
{ ...baseConfig, tag: "urgent" },
baseEvent
);
const [, opts] = mockFetch.mock.calls[0];
const body = JSON.parse(opts.body);
expect(body.tag).toBe("urgent");
});
test("omits tag when not configured", async () => {
await sendAppriseNotification(baseConfig, baseEvent);
const [, opts] = mockFetch.mock.calls[0];
const body = JSON.parse(opts.body);
expect(body.tag).toBeUndefined();
});
test("throws on non-200 response", async () => {
mockFetch = mock(() =>
Promise.resolve(new Response("server error", { status: 500 }))
);
globalThis.fetch = mockFetch as any;
await expect(
sendAppriseNotification(baseConfig, baseEvent)
).rejects.toThrow("Apprise error: 500");
});
});


@@ -0,0 +1,15 @@
import type { AppriseConfig } from "@/types/config";
import type { NotificationEvent } from "./ntfy";
// Posts the event to a stateless Apprise API endpoint ({url}/notify/{token}).
export async function sendAppriseNotification(config: AppriseConfig, event: NotificationEvent): Promise<void> {
const url = `${config.url.replace(/\/$/, "")}/notify/${config.token}`;
const headers: Record<string, string> = { "Content-Type": "application/json" };
const body = JSON.stringify({
title: event.title,
body: event.message,
type: event.type === "sync_error" ? "failure" : "success",
tag: config.tag || undefined,
});
const resp = await fetch(url, { method: "POST", body, headers });
if (!resp.ok) throw new Error(`Apprise error: ${resp.status} ${await resp.text()}`);
}
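The event-to-payload mapping above can be isolated for a quick check. `appriseTypeFor` is a hypothetical helper name; `success` and `failure` are two of Apprise's standard notification types:

```typescript
// Illustrative helper (hypothetical name, not part of the diff): mirrors the
// type mapping in sendAppriseNotification. Any non-error event, including
// "new_repo", is reported as a success.
type EventType = "sync_error" | "sync_success" | "new_repo";

function appriseTypeFor(eventType: EventType): "success" | "failure" {
  return eventType === "sync_error" ? "failure" : "success";
}
```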


@@ -0,0 +1,95 @@
import { describe, test, expect, beforeEach, mock } from "bun:test";
import { sendNtfyNotification, type NotificationEvent } from "./ntfy";
import type { NtfyConfig } from "@/types/config";
describe("sendNtfyNotification", () => {
let mockFetch: ReturnType<typeof mock>;
beforeEach(() => {
mockFetch = mock(() =>
Promise.resolve(new Response("ok", { status: 200 }))
);
globalThis.fetch = mockFetch as any;
});
const baseConfig: NtfyConfig = {
url: "https://ntfy.sh",
topic: "gitea-mirror",
priority: "default",
};
const baseEvent: NotificationEvent = {
title: "Test Notification",
message: "This is a test",
type: "sync_success",
};
test("constructs correct URL from config", async () => {
await sendNtfyNotification(baseConfig, baseEvent);
expect(mockFetch).toHaveBeenCalledTimes(1);
const [url] = mockFetch.mock.calls[0];
expect(url).toBe("https://ntfy.sh/gitea-mirror");
});
test("strips trailing slash from URL", async () => {
await sendNtfyNotification(
{ ...baseConfig, url: "https://ntfy.sh/" },
baseEvent
);
const [url] = mockFetch.mock.calls[0];
expect(url).toBe("https://ntfy.sh/gitea-mirror");
});
test("includes Authorization header when token is present", async () => {
await sendNtfyNotification(
{ ...baseConfig, token: "tk_secret" },
baseEvent
);
const [, opts] = mockFetch.mock.calls[0];
expect(opts.headers["Authorization"]).toBe("Bearer tk_secret");
});
test("does not include Authorization header when no token", async () => {
await sendNtfyNotification(baseConfig, baseEvent);
const [, opts] = mockFetch.mock.calls[0];
expect(opts.headers["Authorization"]).toBeUndefined();
});
test("uses high priority for sync_error events", async () => {
const errorEvent: NotificationEvent = {
...baseEvent,
type: "sync_error",
};
await sendNtfyNotification(baseConfig, errorEvent);
const [, opts] = mockFetch.mock.calls[0];
expect(opts.headers["Priority"]).toBe("high");
expect(opts.headers["Tags"]).toBe("warning");
});
test("uses config priority for non-error events", async () => {
await sendNtfyNotification(
{ ...baseConfig, priority: "low" },
baseEvent
);
const [, opts] = mockFetch.mock.calls[0];
expect(opts.headers["Priority"]).toBe("low");
expect(opts.headers["Tags"]).toBe("white_check_mark");
});
test("throws on non-200 response", async () => {
mockFetch = mock(() =>
Promise.resolve(new Response("rate limited", { status: 429 }))
);
globalThis.fetch = mockFetch as any;
await expect(
sendNtfyNotification(baseConfig, baseEvent)
).rejects.toThrow("Ntfy error: 429");
});
});

src/lib/providers/ntfy.ts Normal file

@@ -0,0 +1,21 @@
import type { NtfyConfig } from "@/types/config";
export interface NotificationEvent {
title: string;
message: string;
type: "sync_error" | "sync_success" | "new_repo";
}
// Publishes the event to ntfy at {url}/{topic}; metadata travels in headers.
export async function sendNtfyNotification(config: NtfyConfig, event: NotificationEvent): Promise<void> {
const url = `${config.url.replace(/\/$/, "")}/${config.topic}`;
const headers: Record<string, string> = {
"Title": event.title,
"Priority": event.type === "sync_error" ? "high" : (config.priority || "default"),
"Tags": event.type === "sync_error" ? "warning" : "white_check_mark",
};
if (config.token) {
headers["Authorization"] = `Bearer ${config.token}`;
}
const resp = await fetch(url, { method: "POST", body: event.message, headers });
if (!resp.ok) throw new Error(`Ntfy error: ${resp.status} ${await resp.text()}`);
}
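Unlike Apprise, ntfy carries all metadata in HTTP headers rather than a JSON body. The header construction above can be sketched standalone (`ntfyHeaders` is an illustrative name, not part of this diff):

```typescript
// Illustrative sketch: the header logic used by sendNtfyNotification.
// Errors are forced to high priority regardless of the configured default;
// the bearer token is attached only when present.
function ntfyHeaders(
  eventType: string,
  priority: string = "default",
  token?: string,
): Record<string, string> {
  const headers: Record<string, string> = {
    Priority: eventType === "sync_error" ? "high" : priority,
    Tags: eventType === "sync_error" ? "warning" : "white_check_mark",
  };
  if (token) {
    headers["Authorization"] = `Bearer ${token}`;
  }
  return headers;
}
```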

src/lib/repo-backup.test.ts Normal file

@@ -0,0 +1,248 @@
import path from "node:path";
import { afterEach, beforeEach, describe, expect, test } from "bun:test";
import type { Config } from "@/types/config";
import {
resolveBackupPaths,
resolveBackupStrategy,
shouldBackupForStrategy,
shouldBlockSyncForStrategy,
strategyNeedsDetection,
} from "@/lib/repo-backup";
describe("resolveBackupPaths", () => {
let originalBackupDirEnv: string | undefined;
beforeEach(() => {
originalBackupDirEnv = process.env.PRE_SYNC_BACKUP_DIR;
delete process.env.PRE_SYNC_BACKUP_DIR;
});
afterEach(() => {
if (originalBackupDirEnv === undefined) {
delete process.env.PRE_SYNC_BACKUP_DIR;
} else {
process.env.PRE_SYNC_BACKUP_DIR = originalBackupDirEnv;
}
});
test("returns absolute paths when backupDirectory is relative", () => {
const config: Partial<Config> = {
userId: "user-123",
giteaConfig: {
backupDirectory: "data/repo-backups",
} as Config["giteaConfig"],
};
const { backupRoot, repoBackupDir } = resolveBackupPaths({
config,
owner: "RayLabsHQ",
repoName: "gitea-mirror",
});
expect(path.isAbsolute(backupRoot)).toBe(true);
expect(path.isAbsolute(repoBackupDir)).toBe(true);
expect(repoBackupDir).toBe(
path.join(backupRoot, "user-123", "RayLabsHQ", "gitea-mirror")
);
});
test("returns absolute paths when backupDirectory is already absolute", () => {
const config: Partial<Config> = {
userId: "user-123",
giteaConfig: {
backupDirectory: "/data/repo-backups",
} as Config["giteaConfig"],
};
const { backupRoot, repoBackupDir } = resolveBackupPaths({
config,
owner: "owner",
repoName: "repo",
});
expect(backupRoot).toBe("/data/repo-backups");
expect(path.isAbsolute(repoBackupDir)).toBe(true);
});
test("falls back to cwd-based path when no backupDirectory is set", () => {
const config: Partial<Config> = {
userId: "user-123",
giteaConfig: {} as Config["giteaConfig"],
};
const { backupRoot } = resolveBackupPaths({
config,
owner: "owner",
repoName: "repo",
});
expect(path.isAbsolute(backupRoot)).toBe(true);
expect(backupRoot).toBe(
path.resolve(process.cwd(), "data", "repo-backups")
);
});
test("uses PRE_SYNC_BACKUP_DIR env var when config has no backupDirectory", () => {
process.env.PRE_SYNC_BACKUP_DIR = "custom/backup/path";
const config: Partial<Config> = {
userId: "user-123",
giteaConfig: {} as Config["giteaConfig"],
};
const { backupRoot } = resolveBackupPaths({
config,
owner: "owner",
repoName: "repo",
});
expect(path.isAbsolute(backupRoot)).toBe(true);
expect(backupRoot).toBe(path.resolve("custom/backup/path"));
});
test("sanitizes owner and repoName in path segments", () => {
const config: Partial<Config> = {
userId: "user-123",
giteaConfig: {
backupDirectory: "/backups",
} as Config["giteaConfig"],
};
const { repoBackupDir } = resolveBackupPaths({
config,
owner: "org/with-slash",
repoName: "repo name!",
});
expect(repoBackupDir).toBe(
path.join("/backups", "user-123", "org_with-slash", "repo_name_")
);
});
});
// ---- Backup strategy resolver tests ----
function makeConfig(overrides: Record<string, any> = {}): Partial<Config> {
return {
giteaConfig: {
url: "https://gitea.example.com",
token: "tok",
...overrides,
},
} as Partial<Config>;
}
const envKeysToClean = ["PRE_SYNC_BACKUP_STRATEGY", "PRE_SYNC_BACKUP_ENABLED"];
describe("resolveBackupStrategy", () => {
let savedEnv: Record<string, string | undefined> = {};
beforeEach(() => {
savedEnv = {};
for (const key of envKeysToClean) {
savedEnv[key] = process.env[key];
delete process.env[key];
}
});
afterEach(() => {
for (const [key, value] of Object.entries(savedEnv)) {
if (value === undefined) {
delete process.env[key];
} else {
process.env[key] = value;
}
}
});
test("returns explicit backupStrategy when set", () => {
expect(resolveBackupStrategy(makeConfig({ backupStrategy: "always" }))).toBe("always");
expect(resolveBackupStrategy(makeConfig({ backupStrategy: "disabled" }))).toBe("disabled");
expect(resolveBackupStrategy(makeConfig({ backupStrategy: "on-force-push" }))).toBe("on-force-push");
expect(resolveBackupStrategy(makeConfig({ backupStrategy: "block-on-force-push" }))).toBe("block-on-force-push");
});
test("maps backupBeforeSync: true → 'on-force-push' (backward compat, prevents silent always-backup)", () => {
expect(resolveBackupStrategy(makeConfig({ backupBeforeSync: true }))).toBe("on-force-push");
});
test("maps backupBeforeSync: false → 'disabled' (backward compat)", () => {
expect(resolveBackupStrategy(makeConfig({ backupBeforeSync: false }))).toBe("disabled");
});
test("prefers explicit backupStrategy over backupBeforeSync", () => {
expect(
resolveBackupStrategy(
makeConfig({ backupStrategy: "on-force-push", backupBeforeSync: true }),
),
).toBe("on-force-push");
});
test("falls back to PRE_SYNC_BACKUP_STRATEGY env var", () => {
process.env.PRE_SYNC_BACKUP_STRATEGY = "block-on-force-push";
expect(resolveBackupStrategy(makeConfig({}))).toBe("block-on-force-push");
});
test("falls back to PRE_SYNC_BACKUP_ENABLED env var (legacy)", () => {
process.env.PRE_SYNC_BACKUP_ENABLED = "false";
expect(resolveBackupStrategy(makeConfig({}))).toBe("disabled");
});
test("defaults to 'on-force-push' when nothing is configured", () => {
expect(resolveBackupStrategy(makeConfig({}))).toBe("on-force-push");
});
test("handles empty giteaConfig gracefully", () => {
expect(resolveBackupStrategy({})).toBe("on-force-push");
});
});
describe("shouldBackupForStrategy", () => {
test("disabled → never backup", () => {
expect(shouldBackupForStrategy("disabled", false)).toBe(false);
expect(shouldBackupForStrategy("disabled", true)).toBe(false);
});
test("always → always backup", () => {
expect(shouldBackupForStrategy("always", false)).toBe(true);
expect(shouldBackupForStrategy("always", true)).toBe(true);
});
test("on-force-push → backup only when detected", () => {
expect(shouldBackupForStrategy("on-force-push", false)).toBe(false);
expect(shouldBackupForStrategy("on-force-push", true)).toBe(true);
});
test("block-on-force-push → backup only when detected", () => {
expect(shouldBackupForStrategy("block-on-force-push", false)).toBe(false);
expect(shouldBackupForStrategy("block-on-force-push", true)).toBe(true);
});
});
describe("shouldBlockSyncForStrategy", () => {
test("only block-on-force-push + detected returns true", () => {
expect(shouldBlockSyncForStrategy("block-on-force-push", true)).toBe(true);
});
test("block-on-force-push without detection does not block", () => {
expect(shouldBlockSyncForStrategy("block-on-force-push", false)).toBe(false);
});
test("other strategies never block", () => {
expect(shouldBlockSyncForStrategy("disabled", true)).toBe(false);
expect(shouldBlockSyncForStrategy("always", true)).toBe(false);
expect(shouldBlockSyncForStrategy("on-force-push", true)).toBe(false);
});
});
describe("strategyNeedsDetection", () => {
test("returns true for detection-based strategies", () => {
expect(strategyNeedsDetection("on-force-push")).toBe(true);
expect(strategyNeedsDetection("block-on-force-push")).toBe(true);
});
test("returns false for non-detection strategies", () => {
expect(strategyNeedsDetection("disabled")).toBe(false);
expect(strategyNeedsDetection("always")).toBe(false);
});
});
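Taken together, the strategy tests above pin down a small decision table. A standalone restatement (`shouldBackup` and `shouldBlock` are illustrative names; the real implementations are `shouldBackupForStrategy` and `shouldBlockSyncForStrategy` in `src/lib/repo-backup.ts`):

```typescript
// Illustrative restatement of the semantics the tests above assert;
// not the actual implementation from this diff.
type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";

function shouldBackup(strategy: BackupStrategy, forcePushDetected: boolean): boolean {
  if (strategy === "disabled") return false;
  if (strategy === "always") return true;
  // Both force-push strategies back up only when a force push was detected.
  return forcePushDetected;
}

function shouldBlock(strategy: BackupStrategy, forcePushDetected: boolean): boolean {
  // Only the blocking strategy halts the sync, and only on detection.
  return strategy === "block-on-force-push" && forcePushDetected;
}
```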

Some files were not shown because too many files have changed in this diff.