Compare commits


24 Commits

Author | SHA1 | Message | Date
ARUNAVO RAY
ce365a706e ci: persist release version to main (#212) 2026-03-05 09:55:59 +05:30
ARUNAVO RAY
be7daac5fb ci: automate release version from tag (#211) 2026-03-05 09:34:49 +05:30
dependabot[bot]
e32b7af5eb build(deps): bump svgo (#210)
Bumps the npm_and_yarn group with 1 update in the /www directory: [svgo](https://github.com/svg/svgo).


Updates `svgo` from 4.0.0 to 4.0.1
- [Release notes](https://github.com/svg/svgo/releases)
- [Commits](https://github.com/svg/svgo/compare/v4.0.0...v4.0.1)

---
updated-dependencies:
- dependency-name: svgo
  dependency-version: 4.0.1
  dependency-type: indirect
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-03-05 08:53:34 +05:30
ARUNAVO RAY
d0693206c3 feat: selective starred repo mirroring with autoMirrorStarred toggle (#208)
* feat: add autoMirrorStarred toggle for selective starred repo mirroring (#205)

Add `githubConfig.autoMirrorStarred` (default: false) to control whether
starred repos are included in automatic mirroring operations. Manual
per-repo actions always work regardless of this toggle.
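A minimal sketch of how such a toggle could gate automatic operations (field names follow the commit message; the helper and repo shape are invented for illustration):

```typescript
// Hypothetical sketch: starred repos enter automatic mirroring only when
// both discovery (includeStarred) and the new toggle are enabled.
interface GithubConfig {
  includeStarred: boolean;    // whether starred repos are discovered at all
  autoMirrorStarred: boolean; // default: false
}

function includeInAutoMirror(
  repo: { isStarred: boolean },
  cfg: GithubConfig,
): boolean {
  if (!repo.isStarred) return true; // non-starred repos are unaffected
  return cfg.includeStarred && cfg.autoMirrorStarred;
}
```

Manual per-repo actions would bypass this check entirely, which matches the behavior described above.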

Bug fixes:
- Cleanup service no longer orphans starred repos when includeStarred is
  disabled (prevents data loss)
- First-boot auto-start now gates initial mirror behind autoMirror config
  (previously mirrored everything unconditionally)
- "Mirror All" button now respects autoMirrorStarred setting
- Bulk mirror and getAvailableActions now include pending-approval status

Changes span schema, config mapping, env loader, scheduler, cleanup
service, UI settings toggle, and repository components.

* fix: log activity when repos are auto-imported during scheduled sync

Auto-discovered repositories (including newly starred ones) were inserted
into the database without creating activity log entries, so they appeared
in the dashboard but not in the activity log.

* ci: set 10-minute timeout on all CI jobs
2026-03-04 08:22:44 +05:30
Arunavo Ray
b079070c30 ci: also exclude helm/** from app CI workflows 2026-03-02 16:28:04 +05:30
Arunavo Ray
e68e9c38a8 ci: skip app CI workflows for www-only changes
Add www/** to paths-ignore in astro-build-test, e2e-tests, and
nix-build workflows. docker-build and helm-test already use positive
path filters and were unaffected.
2026-03-02 16:25:54 +05:30
Arunavo Ray
534150ecf9 chore(www): update website content, fix build, add Helm/Nix install methods
- Update softwareVersion from 3.9.2 to 3.11.0
- Add Helm and Nix installation tabs to Getting Started section
- Fix Helm instructions to use local chart path (no published repo)
- Update Features section: add Metadata Preservation, Force-Push Protection, Git LFS Support
- Remove unused @radix-ui/react-icons import from Hero.tsx and dependency from package.json
- Update structured data featureList with newer capabilities
2026-03-02 16:23:32 +05:30
ARUNAVO RAY
98da7065e0 feat: smart force-push protection with backup strategies (#206)
* feat: smart force-push protection with backup strategies (#187)

Replace blunt `backupBeforeSync` boolean with `backupStrategy` enum
offering four modes: disabled, always, on-force-push (default), and
block-on-force-push. This dramatically reduces backup storage for large
mirror collections by only creating snapshots when force-pushes are
actually detected.
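The enum and its backward-compat migration might look roughly like this (type and function names are assumptions based on the commit message, not the project's actual code):

```typescript
// Hypothetical sketch of the backupStrategy enum and its resolver.
type BackupStrategy =
  | "disabled"
  | "always"
  | "on-force-push"
  | "block-on-force-push";

interface LegacyConfig {
  backupStrategy?: BackupStrategy;
  backupBeforeSync?: boolean; // deprecated boolean flag
}

function resolveBackupStrategy(cfg: LegacyConfig): BackupStrategy {
  // New field wins when present.
  if (cfg.backupStrategy) return cfg.backupStrategy;
  // Migrate the legacy boolean: true maps to "always", false to "disabled".
  if (cfg.backupBeforeSync !== undefined) {
    return cfg.backupBeforeSync ? "always" : "disabled";
  }
  // Default when nothing is configured.
  return "on-force-push";
}
```

Keeping the fallback in a resolver rather than injecting a default in the config mapper is what lets legacy `backupBeforeSync` configs keep working, as noted in a later commit below.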

Detection works by comparing branch SHAs between Gitea and GitHub APIs
before each sync — no git cloning required. Fail-open design ensures
detection errors never block sync.
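A sketch of what SHA-comparison detection could look like, with the fail-open behavior made explicit (all names are invented for illustration; the real module presumably works against live API clients):

```typescript
// Illustrative sketch of API-based force-push detection: compare the branch
// head SHAs Gitea has mirrored against the current upstream SHAs on GitHub.
interface BranchSha {
  name: string;
  sha: string;
}

function detectForcePushedBranches(
  giteaBranches: BranchSha[],  // state currently mirrored in Gitea
  githubBranches: BranchSha[], // current upstream state on GitHub
  // null signals a transient error (rate limit, 500), not a verdict
  isAncestor: (oldSha: string, newSha: string) => boolean | null,
): string[] {
  const upstream = new Map(githubBranches.map((b) => [b.name, b.sha]));
  const forcePushed: string[] = [];
  for (const branch of giteaBranches) {
    const newSha = upstream.get(branch.name);
    if (!newSha || newSha === branch.sha) continue; // deleted or unchanged
    const ancestry = isAncestor(branch.sha, newSha);
    // Fail-open: an unknown result skips the branch instead of flagging
    // a false positive that would block the sync.
    if (ancestry === false) forcePushed.push(branch.name);
  }
  return forcePushed;
}
```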

Key changes:
- Add force-push detection module (branch SHA comparison via APIs)
- Add backup strategy resolver with backward-compat migration
- Add pending-approval repo status with approve/dismiss UI + API
- Add block-on-force-push mode requiring manual approval
- Fix checkAncestry to only treat 404 as confirmed force-push
  (transient errors skip branch instead of false-positive blocking)
- Fix approve-sync to bypass detection gate (skipForcePushDetection)
- Fix backup execution to not be hard-gated by deprecated flag
- Persist backupStrategy through config-mapper round-trip

* fix: resolve four bugs in smart force-push protection

P0: Approve flow re-blocks itself — approve-sync now calls
syncGiteaRepoEnhanced with skipForcePushDetection: true so the
detection+block gate is bypassed on approved syncs.

P1: backupStrategy not persisted — added to both directions of the
config-mapper. Don't inject a default in the mapper; let
resolveBackupStrategy handle fallback so legacy backupBeforeSync
still works for E2E tests and existing configs.

P1: Backup hard-gated by deprecated backupBeforeSync — added force
flag to createPreSyncBundleBackup; strategy-driven callers and
approve-sync pass force: true to bypass the legacy guard.

P1: checkAncestry false positives — now only returns false for
404/422 (confirmed force-push). Transient errors (rate limits, 500s)
are rethrown so detectForcePush skips that branch (fail-open).
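The described error discrimination could be sketched like this (synchronous and simplified; the real code presumably wraps an async compare API, and the error shape here is an assumption):

```typescript
// Sketch of the fail-open ancestry check: only a confirmed "not found"
// response from the compare endpoint means history was rewritten.
function checkAncestry(
  compare: (base: string, head: string) => void, // throws on API error
  oldSha: string,
  newSha: string,
): boolean {
  try {
    compare(oldSha, newSha);
    return true; // old commit is reachable from the new head
  } catch (err: unknown) {
    const status = (err as { status?: number }).status;
    // 404/422: the old SHA is gone from history, a confirmed force-push.
    if (status === 404 || status === 422) return false;
    // Rate limits, 500s, etc. are rethrown so the caller skips the branch.
    throw err;
  }
}
```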

* test(e2e): migrate backup tests from backupBeforeSync to backupStrategy

Update E2E tests to use the new backupStrategy enum ("always",
"disabled") instead of the deprecated backupBeforeSync boolean.

* docs: add backup strategy UI screenshot

* refactor(ui): move Destructive Update Protection to GitHub config tab

Relocates the backup strategy section from GiteaConfigForm to
GitHubConfigForm since it protects against GitHub-side force-pushes.
Adds ShieldAlert icon to match other section header patterns.

* docs: add force-push protection documentation and Beta badge

Add docs/FORCE_PUSH_PROTECTION.md covering detection mechanism,
backup strategies, API usage, and troubleshooting. Link it from
README features list and support section. Mark the feature as Beta
in the UI with an outline badge.

* fix(ui): match Beta badge style to Git LFS badge
2026-03-02 15:48:59 +05:30
ARUNAVO RAY
58e0194aa6 fix(nix): ensure absolute bundle path in pre-sync backup (#204)
* fix(nix): ensure absolute bundle path in pre-sync backup (#203)

Use path.resolve() instead of conditional path.isAbsolute() check to
guarantee bundlePath is always absolute before passing to git -C. On
NixOS, relative paths were interpreted relative to the temp mirror
clone directory, causing "No such file or directory" errors.
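The difference is easy to demonstrate (helper name hypothetical):

```typescript
import path from "node:path";

// Minimal illustration of the fix: path.resolve() always yields an absolute
// path, whereas a conditional isAbsolute() check lets relative input leak
// through, to be resolved against whatever cwd git -C happens to run from.
function toBundlePath(configuredPath: string): string {
  return path.resolve(configuredPath); // absolute, anchored at process.cwd()
}
```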

Closes #203

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(nix): ensure absolute bundle path in pre-sync backup (#203)

Use path.resolve() instead of conditional path.isAbsolute() check to
guarantee bundlePath is always absolute before passing to git -C. On
NixOS, relative paths were interpreted relative to the temp mirror
clone directory, causing "No such file or directory" errors.

Extract resolveBackupPaths() for testability. Bump version to 3.10.1.

Closes #203

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* ci: drop macos matrix and only run nix build on main/tags

- Remove macos-latest from Nix CI matrix (ubuntu-only)
- Only run `nix build` on main branch and version tags, skip on PRs
- `nix flake check` still runs on all PRs for validation

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 08:37:18 +05:30
Arunavo Ray
7864c46279 unused file 2026-03-01 08:06:11 +05:30
Arunavo Ray
e3970e53e1 chore: release v3.10.0
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 08:01:02 +05:30
ARUNAVO RAY
be46cfdffa feat: add target organization to Add Repository dialog (#202)
* feat: add target organization field to Add Repository dialog

Allow users to specify a destination Gitea organization when adding a
single repository, instead of relying solely on the default mirror
strategy. The field is optional — when left empty, the existing strategy
logic applies as before.

Closes #200

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* docs: add screenshot of target organization field in Add Repository dialog

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 07:55:27 +05:30
Xyndra
2e00a610cb Add E2E testing (#201)
* feat: add E2E testing infrastructure with fake GitHub, Playwright, and CI workflow

- Add fake GitHub API server (tests/e2e/fake-github-server.ts) with
  management API for seeding test data
- Add Playwright E2E test suite covering full mirror workflow:
  service health checks, user registration, config, sync, verify
- Add Docker Compose for E2E Gitea instance
- Add orchestrator script (run-e2e.sh) with cleanup
- Add GitHub Actions workflow (e2e-tests.yml) with Gitea service container
- Make GITHUB_API_URL configurable via env var for testing
- Add npm scripts: test:e2e, test:e2e:ci, test:e2e:keep, test:e2e:cleanup

* feat: add real git repos + backup config testing to E2E suite

- Create programmatic test git repos (create-test-repos.ts) with real
  commits, branches (main, develop, feature/*), and tags (v1.0.0, v1.1.0)
- Add git-server container to docker-compose serving bare repos via
  dumb HTTP protocol so Gitea can actually clone them
- Update fake GitHub server to emit reachable clone_url fields pointing
  to the git-server container (configurable via GIT_SERVER_URL env var)
- Add management endpoint POST /___mgmt/set-clone-url for runtime config
- Update E2E spec with real mirroring verification:
  * Verify repos appear in Gitea with actual content
  * Check branches, tags, commits, file content
  * Verify 4/4 repos mirrored successfully
- Add backup configuration test suite:
  * Enable/disable backupBeforeSync config
  * Toggle blockSyncOnBackupFailure
  * Trigger re-sync with backup enabled and verify activities
  * Verify config persistence across changes
- Update CI workflow to use docker compose (not service containers)
  matching the local run-e2e.sh approach
- Update cleanup.sh for git-repos directory and git-server port
- All 22 tests passing with real git content verification

* refactor: split E2E tests into focused files + add force-push tests

Split the monolithic e2e.spec.ts (1335 lines) into 5 focused spec files
and a shared helpers module:

  helpers.ts                 — constants, GiteaAPI, auth, saveConfig, utilities
  01-health.spec.ts          — service health checks (4 tests)
  02-mirror-workflow.spec.ts — full first-mirror journey (8 tests)
  03-backup.spec.ts          — backup config toggling (6 tests)
  04-force-push.spec.ts      — force-push simulation & backup verification (9 tests)
  05-sync-verification.spec.ts — dynamic repos, content integrity, reset (5 tests)

The force-push tests are the critical addition:
  F0: Record original state (commit SHAs, file content)
  F1: Rewrite source repo history (simulate force-push)
  F2: Sync to Gitea WITHOUT backup
  F3: Verify data loss — LICENSE file gone, README overwritten
  F4: Restore source, re-mirror to clean state
  F5: Enable backup, force-push again, sync through app
  F6: Verify Gitea reflects the force-push
  F7: Verify backup system was invoked (snapshot activities logged)
  F8: Restore source repo for subsequent tests

Also added to helpers.ts:
  - GiteaAPI.getBranch(), .getCommit(), .triggerMirrorSync()
  - getRepositoryIds(), triggerMirrorJobs(), triggerSyncRepo()

All 32 tests passing.

* Try to fix actions

* Try to fix the other action

* Add debug info to check why e2e action is failing

* More debug info

* Even more debug info

* E2E fix attempt #1

* E2E fix attempt #2

* more debug again

* E2E fix attempt #3

* E2E fix attempt #4

* Remove a bunch of debug info

* Hopefully fix backup bug

* Force backups to succeed
2026-03-01 07:35:13 +05:30
Arunavo Ray
61841dd7a5 chore: release v3.9.6
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-27 12:45:25 +05:30
ARUNAVO RAY
5aa0f3260d fix(nix): enable sandboxed builds with bun2nix (#199)
* fix(nix): enable sandboxed builds with bun2nix

The Nix package was broken on Linux because `bun install` requires
network access, which is blocked by Nix sandboxing (enabled by default
on Linux).

This switches to bun2nix for dependency management:
- Add bun2nix flake input to pre-fetch all npm dependencies
- Generate bun.nix lockfile for reproducible dependency resolution
- Copy bun cache to writable location during build to avoid EACCES
  errors from bunx writing to the read-only Nix store
- Add nanoid as an explicit dependency (was imported directly but only
  available as a transitive dep, which breaks with isolated linker)
- Update CI workflow to perform a full sandboxed build
- Add bun2nix to devShell for easy lockfile regeneration

Closes #197

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(nix): create writable workdir for database access

The app uses process.cwd()/data for the database path, but when running
from the Nix store the cwd is read-only. Create a writable working
directory with symlinks to app files and a real data directory.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-27 12:43:37 +05:30
ARUNAVO RAY
d0efa200d9 fix(docker): add git and git-lfs to runner image (#198)
The runner stage was missing git, causing pre-sync backups to fail with
"Executable not found in $PATH: git". The backup feature (enabled by
default) shells out to git for clone --mirror and bundle create.

Closes #196

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-27 11:12:35 +05:30
Arunavo Ray
c26b5574e0 chore: release v3.9.5 2026-02-26 11:02:00 +05:30
ARUNAVO RAY
89a6372565 nix: fix runtime wrapper paths and startup script packaging (#194)
* nix: fix flake module and runtime scripts

* docs: refresh readme and docs links/examples
2026-02-26 10:59:56 +05:30
ARUNAVO RAY
f40cad4713 nix: fix flake module and runtime scripts (#192) 2026-02-26 10:39:50 +05:30
ARUNAVO RAY
855906d990 auth: clarify invalid origin error toast guidance (#193)
* nix: fix flake module and runtime scripts

* auth: clarify invalid origin toast
2026-02-26 10:39:08 +05:30
ARUNAVO RAY
08da526ddd fix(github): keep disabled repos from cleanup while skipping new imports (#191)
* fix: preserve disabled repos while skipping new imports

* ci: upgrade bun to 1.3.6 for test workflow
2026-02-26 10:19:28 +05:30
ARUNAVO RAY
2395e14382 Add pre-sync snapshot protection for mirror rewrites (#190)
* add pre-sync snapshot protection

* stabilize test module mocks

* fix cross-test gitea mock exports

* fix gitea mock strategy behavior
2026-02-26 10:13:13 +05:30
Arunavo Ray
91c1703bb5 chore: release v3.9.4 2026-02-24 11:47:47 +05:30
ARUNAVO RAY
6a548e3dac security: enforce session-derived user identity on API routes (#186)
* security: enforce session user on api routes

* test: harden auth guard failure path
2026-02-24 11:47:29 +05:30
96 changed files with 12601 additions and 821 deletions

Binary image file added (34 KiB); not shown.

@@ -43,6 +43,9 @@ This workflow builds Docker images on pushes and pull requests, and pushes to Gi
- Skips registry push for fork PRs (avoids package write permission failures)
- Uses build caching to speed up builds
- Creates multiple tags for each image (latest, semver, sha)
- Auto-syncs `package.json` version from `v*` tags during release builds
- Validates release tags use semver format before building
- After tag builds succeed, writes the same version back to `main/package.json`
### Docker Security Scan (`docker-scan.yml`)


@@ -6,11 +6,15 @@ on:
paths-ignore:
- 'README.md'
- 'docs/**'
- 'www/**'
- 'helm/**'
pull_request:
branches: [ '*' ]
paths-ignore:
- 'README.md'
- 'docs/**'
- 'www/**'
- 'helm/**'
permissions:
contents: read
@@ -20,6 +24,7 @@ jobs:
build-and-test:
name: Build and Test Astro Project
runs-on: ubuntu-latest
timeout-minutes: 10
steps:
- name: Checkout repository
@@ -28,7 +33,7 @@ jobs:
- name: Setup Bun
uses: oven-sh/setup-bun@v1
with:
bun-version: '1.2.16'
bun-version: '1.3.6'
- name: Check lockfile and install dependencies
run: |


@@ -36,6 +36,7 @@ env:
jobs:
docker:
runs-on: ubuntu-latest
timeout-minutes: 10
permissions:
contents: write
@@ -76,13 +77,34 @@ jobs:
id: tag_version
run: |
if [[ $GITHUB_REF == refs/tags/v* ]]; then
echo "VERSION=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT
echo "Using version tag: ${GITHUB_REF#refs/tags/}"
TAG_VERSION="${GITHUB_REF#refs/tags/}"
if [[ ! "$TAG_VERSION" =~ ^v[0-9]+\.[0-9]+\.[0-9]+([.-][0-9A-Za-z.-]+)?(\+[0-9A-Za-z.-]+)?$ ]]; then
echo "::error::Release tag '${TAG_VERSION}' is invalid. Expected semver tag format like v1.2.3 or v1.2.3-rc.1"
exit 1
fi
APP_VERSION="${TAG_VERSION#v}"
echo "VERSION=${TAG_VERSION}" >> $GITHUB_OUTPUT
echo "APP_VERSION=${APP_VERSION}" >> $GITHUB_OUTPUT
echo "Using version tag: ${TAG_VERSION}"
else
echo "VERSION=latest" >> $GITHUB_OUTPUT
echo "APP_VERSION=dev" >> $GITHUB_OUTPUT
echo "No version tag, using 'latest'"
fi
# Keep version files aligned automatically for tag-based releases
- name: Sync app version from release tag
if: startsWith(github.ref, 'refs/tags/v')
run: |
VERSION="${{ steps.tag_version.outputs.APP_VERSION }}"
echo "Syncing package.json version to ${VERSION}"
jq --arg version "${VERSION}" '.version = $version' package.json > package.json.tmp
mv package.json.tmp package.json
echo "Version sync diff (package.json):"
git --no-pager diff -- package.json
# Extract metadata for Docker
- name: Extract Docker metadata
id: meta
@@ -236,3 +258,44 @@ jobs:
continue-on-error: true
with:
sarif_file: scout-results.sarif
sync-version-main:
name: Sync package.json version back to main
if: startsWith(github.ref, 'refs/tags/v')
runs-on: ubuntu-latest
needs: docker
permissions:
contents: write
steps:
- name: Checkout default branch
uses: actions/checkout@v4
with:
ref: ${{ github.event.repository.default_branch }}
- name: Update package.json version on main
env:
TAG_VERSION: ${{ github.ref_name }}
TARGET_BRANCH: ${{ github.event.repository.default_branch }}
run: |
if [[ ! "$TAG_VERSION" =~ ^v[0-9]+\.[0-9]+\.[0-9]+([.-][0-9A-Za-z.-]+)?(\+[0-9A-Za-z.-]+)?$ ]]; then
echo "::error::Release tag '${TAG_VERSION}' is invalid. Expected semver tag format like v1.2.3 or v1.2.3-rc.1"
exit 1
fi
APP_VERSION="${TAG_VERSION#v}"
echo "Syncing ${TARGET_BRANCH}/package.json to ${APP_VERSION}"
jq --arg version "${APP_VERSION}" '.version = $version' package.json > package.json.tmp
mv package.json.tmp package.json
if git diff --quiet -- package.json; then
echo "package.json on ${TARGET_BRANCH} already at ${APP_VERSION}; nothing to commit."
exit 0
fi
git config user.name "github-actions[bot]"
git config user.email "41898282+github-actions[bot]@users.noreply.github.com"
git add package.json
git commit -m "chore: sync version to ${APP_VERSION}"
git push origin "HEAD:${TARGET_BRANCH}"

.github/workflows/e2e-tests.yml (new file, 285 lines)

@@ -0,0 +1,285 @@
name: E2E Integration Tests
on:
push:
branches: ["*"]
paths-ignore:
- "README.md"
- "docs/**"
- "CHANGELOG.md"
- "LICENSE"
- "www/**"
- "helm/**"
pull_request:
branches: ["*"]
paths-ignore:
- "README.md"
- "docs/**"
- "CHANGELOG.md"
- "LICENSE"
- "www/**"
- "helm/**"
workflow_dispatch:
inputs:
debug_enabled:
description: "Enable debug logging"
required: false
default: "false"
type: boolean
permissions:
contents: read
actions: read
concurrency:
group: e2e-${{ github.ref }}
cancel-in-progress: true
env:
GITEA_PORT: 3333
FAKE_GITHUB_PORT: 4580
GIT_SERVER_PORT: 4590
APP_PORT: 4321
BUN_VERSION: "1.3.6"
jobs:
e2e-tests:
name: E2E Integration Tests
runs-on: ubuntu-latest
timeout-minutes: 10
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Bun
uses: oven-sh/setup-bun@v1
with:
bun-version: ${{ env.BUN_VERSION }}
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: "22"
- name: Install dependencies
run: |
bun install
echo "✓ Dependencies installed"
- name: Install Playwright
run: |
npx playwright install chromium
npx playwright install-deps chromium
echo "✓ Playwright ready"
- name: Create test git repositories
run: |
echo "Creating bare git repos for E2E testing..."
bun run tests/e2e/create-test-repos.ts --output-dir tests/e2e/git-repos
if [ ! -f tests/e2e/git-repos/manifest.json ]; then
echo "ERROR: Test git repos were not created (manifest.json missing)"
exit 1
fi
echo "✓ Test repos created:"
cat tests/e2e/git-repos/manifest.json | jq -r '.repos[] | " • \(.owner)/\(.name) — \(.description)"'
- name: Start Gitea and git-server containers
run: |
echo "Starting containers via docker compose..."
docker compose -f tests/e2e/docker-compose.e2e.yml up -d
# Wait for git-server
echo "Waiting for git HTTP server..."
for i in $(seq 1 30); do
if curl -sf http://localhost:${{ env.GIT_SERVER_PORT }}/manifest.json > /dev/null 2>&1; then
echo "✓ Git HTTP server is ready"
break
fi
if [ $i -eq 30 ]; then
echo "ERROR: Git HTTP server did not start"
docker compose -f tests/e2e/docker-compose.e2e.yml logs git-server
exit 1
fi
sleep 1
done
# Wait for Gitea
echo "Waiting for Gitea to be ready..."
for i in $(seq 1 60); do
if curl -sf http://localhost:${{ env.GITEA_PORT }}/api/v1/version > /dev/null 2>&1; then
version=$(curl -sf http://localhost:${{ env.GITEA_PORT }}/api/v1/version | jq -r '.version // "unknown"')
echo "✓ Gitea is ready (version: $version)"
break
fi
if [ $i -eq 60 ]; then
echo "ERROR: Gitea did not become healthy within 120s"
docker compose -f tests/e2e/docker-compose.e2e.yml logs gitea-e2e --tail=30
exit 1
fi
sleep 2
done
- name: Initialize database
run: |
bun run manage-db init
echo "✓ Database initialized"
- name: Build application
env:
GH_API_URL: http://localhost:4580
BETTER_AUTH_SECRET: e2e-test-secret
run: |
bun run build
echo "✓ Build complete"
- name: Start fake GitHub API server
run: |
# Start with GIT_SERVER_URL pointing to the git-server container name
# (Gitea will resolve it via Docker networking)
PORT=${{ env.FAKE_GITHUB_PORT }} GIT_SERVER_URL="http://git-server" \
npx tsx tests/e2e/fake-github-server.ts &
echo $! > /tmp/fake-github.pid
echo "Waiting for fake GitHub API..."
for i in $(seq 1 30); do
if curl -sf http://localhost:${{ env.FAKE_GITHUB_PORT }}/___mgmt/health > /dev/null 2>&1; then
echo "✓ Fake GitHub API is ready"
break
fi
if [ $i -eq 30 ]; then
echo "ERROR: Fake GitHub API did not start"
exit 1
fi
sleep 1
done
# Ensure clone URLs are set for the git-server container
curl -sf -X POST http://localhost:${{ env.FAKE_GITHUB_PORT }}/___mgmt/set-clone-url \
-H "Content-Type: application/json" \
-d '{"url": "http://git-server"}' || true
echo "✓ Clone URLs configured for git-server container"
- name: Start gitea-mirror application
env:
GH_API_URL: http://localhost:4580
BETTER_AUTH_SECRET: e2e-test-secret
BETTER_AUTH_URL: http://localhost:4321
DATABASE_URL: file:data/gitea-mirror.db
HOST: 0.0.0.0
PORT: ${{ env.APP_PORT }}
NODE_ENV: production
PRE_SYNC_BACKUP_ENABLED: "false"
ENCRYPTION_SECRET: "e2e-encryption-secret-32char!!"
run: |
# Re-init DB in case build step cleared it
bun run manage-db init 2>/dev/null || true
bun run start &
echo $! > /tmp/app.pid
echo "Waiting for gitea-mirror app..."
for i in $(seq 1 90); do
if curl -sf http://localhost:${{ env.APP_PORT }}/api/health > /dev/null 2>&1 || \
curl -sf -o /dev/null -w "%{http_code}" http://localhost:${{ env.APP_PORT }}/ 2>/dev/null | grep -q "^[23]"; then
echo "✓ gitea-mirror app is ready"
break
fi
if ! kill -0 $(cat /tmp/app.pid) 2>/dev/null; then
echo "ERROR: App process died"
exit 1
fi
if [ $i -eq 90 ]; then
echo "ERROR: gitea-mirror app did not start within 180s"
exit 1
fi
sleep 2
done
- name: Run E2E tests
env:
APP_URL: http://localhost:${{ env.APP_PORT }}
GITEA_URL: http://localhost:${{ env.GITEA_PORT }}
FAKE_GITHUB_URL: http://localhost:${{ env.FAKE_GITHUB_PORT }}
GIT_SERVER_URL: http://localhost:${{ env.GIT_SERVER_PORT }}
CI: true
run: |
mkdir -p tests/e2e/test-results
npx playwright test \
--config tests/e2e/playwright.config.ts \
--reporter=github,html
- name: Diagnostic info on failure
if: failure()
run: |
echo "═══════════════════════════════════════════════════════════"
echo " Diagnostic Information"
echo "═══════════════════════════════════════════════════════════"
echo ""
echo "── Git server status ──"
curl -sf http://localhost:${{ env.GIT_SERVER_PORT }}/manifest.json 2>/dev/null | jq . || echo "(unreachable)"
echo ""
echo "── Gitea status ──"
curl -sf http://localhost:${{ env.GITEA_PORT }}/api/v1/version 2>/dev/null || echo "(unreachable)"
echo ""
echo "── Fake GitHub status ──"
curl -sf http://localhost:${{ env.FAKE_GITHUB_PORT }}/___mgmt/health 2>/dev/null | jq . || echo "(unreachable)"
echo ""
echo "── App status ──"
curl -sf http://localhost:${{ env.APP_PORT }}/api/health 2>/dev/null || echo "(unreachable)"
echo ""
echo "── Docker containers ──"
docker compose -f tests/e2e/docker-compose.e2e.yml ps 2>/dev/null || true
echo ""
echo "── Gitea container logs (last 50 lines) ──"
docker compose -f tests/e2e/docker-compose.e2e.yml logs gitea-e2e --tail=50 2>/dev/null || echo "(no container)"
echo ""
echo "── Git server logs (last 20 lines) ──"
docker compose -f tests/e2e/docker-compose.e2e.yml logs git-server --tail=20 2>/dev/null || echo "(no container)"
echo ""
echo "── Running processes ──"
ps aux | grep -E "(fake-github|astro|bun|node)" | grep -v grep || true
- name: Upload Playwright report
uses: actions/upload-artifact@v4
if: always()
with:
name: e2e-playwright-report
path: tests/e2e/playwright-report/
retention-days: 14
- name: Upload test results
uses: actions/upload-artifact@v4
if: always()
with:
name: e2e-test-results
path: tests/e2e/test-results/
retention-days: 14
- name: Cleanup
if: always()
run: |
# Stop background processes
if [ -f /tmp/fake-github.pid ]; then
kill $(cat /tmp/fake-github.pid) 2>/dev/null || true
rm -f /tmp/fake-github.pid
fi
if [ -f /tmp/app.pid ]; then
kill $(cat /tmp/app.pid) 2>/dev/null || true
rm -f /tmp/app.pid
fi
# Stop containers
docker compose -f tests/e2e/docker-compose.e2e.yml down --volumes --remove-orphans 2>/dev/null || true
echo "✓ Cleanup complete"


@@ -21,6 +21,7 @@ jobs:
yamllint:
name: Lint YAML
runs-on: ubuntu-latest
timeout-minutes: 10
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
@@ -35,6 +36,7 @@ jobs:
helm-template:
name: Helm lint & template
runs-on: ubuntu-latest
timeout-minutes: 10
steps:
- uses: actions/checkout@v4
- name: Setup Helm


@@ -5,18 +5,26 @@ on:
branches: [main, nix]
tags:
- 'v*'
paths-ignore:
- 'README.md'
- 'docs/**'
- 'www/**'
- 'helm/**'
pull_request:
branches: [main]
paths-ignore:
- 'README.md'
- 'docs/**'
- 'www/**'
- 'helm/**'
permissions:
contents: read
jobs:
check:
strategy:
matrix:
os: [ubuntu-latest, macos-latest]
runs-on: ${{ matrix.os }}
runs-on: ubuntu-latest
timeout-minutes: 10
steps:
- uses: actions/checkout@v4
@@ -33,13 +41,6 @@ jobs:
- name: Show flake info
run: nix flake show
- name: Evaluate package
run: |
# Evaluate the derivation without building (validates the Nix expression)
nix eval .#packages.$(nix eval --impure --expr 'builtins.currentSystem').default.name
echo "Flake evaluation successful"
# Note: Full build requires network access for bun install.
# Nix sandboxed builds block network access.
# To build locally: nix build --option sandbox false
# Or use: nix develop && bun install && bun run build
- name: Build package
if: github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/tags/v')
run: nix build --print-build-logs

.gitignore

@@ -37,3 +37,18 @@ result
result-*
.direnv/
# E2E test artifacts
tests/e2e/test-results/
tests/e2e/playwright-report/
tests/e2e/.auth/
tests/e2e/e2e-storage-state.json
tests/e2e/.fake-github.pid
tests/e2e/.app.pid
tests/e2e/git-repos/
# Playwright
/test-results/
/playwright-report/
/blob-report/
/playwright/.cache/
/playwright/.auth/


@@ -1,169 +0,0 @@
# Nix Distribution - Ready to Use!
## Current Status: WORKS NOW
Your Nix package is **already distributable**! Users can run it directly from GitHub without any additional setup on your end.
## How Users Will Use It
### Simple: Just Run From GitHub
```bash
nix run --extra-experimental-features 'nix-command flakes' github:RayLabsHQ/gitea-mirror
```
That's it! No releases, no CI, no infrastructure needed. It works right now.
---
## What Happens When They Run This?
1. **Nix fetches** your repo from GitHub
2. **Nix reads** `flake.nix` and `flake.lock`
3. **Nix builds** the package on their machine
4. **Nix runs** the application
5. **Result cached** in `/nix/store` for reuse
---
## Do You Need CI or Releases?
### For Basic Usage: **NO**
Users can already use it from GitHub. No CI or releases required.
### For CI Validation: **Already Set Up**
GitHub Actions validates builds on every push with Magic Nix Cache (free, no setup).
---
## Next Steps (Optional)
### Option 1: Release Versioning (2 minutes)
**Why:** Users can pin to specific versions
**How:**
```bash
# When ready to release
git tag v3.8.11
git push origin v3.8.11
# Users can then pin to this version
nix run github:RayLabsHQ/gitea-mirror/v3.8.11
```
No additional CI needed - tags work automatically with flakes!
### Option 2: Submit to nixpkgs (Long Term)
**Why:** Maximum discoverability and trust
**When:** After package is stable and well-tested
**How:** Submit PR to https://github.com/NixOS/nixpkgs
---
## Files Created
### Essential (Already Working)
- `flake.nix` - Package definition
- `flake.lock` - Dependency lock file
- `.envrc` - direnv integration
### Documentation
- `NIX.md` - Quick reference for users
- `docs/NIX_DEPLOYMENT.md` - Complete deployment guide
- `docs/NIX_DISTRIBUTION.md` - Distribution guide for you (maintainer)
- `README.md` - Updated with Nix instructions
### CI (Already Set Up)
- `.github/workflows/nix-build.yml` - Builds and validates on Linux + macOS
### Updated
- `.gitignore` - Added Nix artifacts
---
## Comparison: Your Distribution Options
| Setup | Time | User Experience | What You Need |
|-------|------|----------------|---------------|
| **Direct GitHub** | 0 min | Slow (build from source) | Nothing! Works now |
| **+ Git Tags** | 2 min | Versionable | Just push tags |
| **+ nixpkgs** | Hours | Official/Trusted | PR review process |
**Recommendation:** Direct GitHub works now. Add git tags for versioning. Consider nixpkgs submission once stable.
---
## Testing Your Distribution
You can test it right now:
```bash
# Test direct GitHub usage
nix run --extra-experimental-features 'nix-command flakes' github:RayLabsHQ/gitea-mirror
# Test with specific commit
nix run github:RayLabsHQ/gitea-mirror/$(git rev-parse HEAD)
# Validate flake
nix flake check
```
---
## User Documentation Locations
Users will find instructions in:
1. **README.md** - Installation section (already updated)
2. **NIX.md** - Quick reference
3. **docs/NIX_DEPLOYMENT.md** - Detailed guide
All docs include the correct commands with experimental features flags.
---
## When to Release New Versions
### For Git Tag Releases:
```bash
# 1. Update version in package.json
vim package.json
# 2. Update version in flake.nix (line 17)
vim flake.nix # version = "3.8.12";
# 3. Commit and tag
git add package.json flake.nix
git commit -m "chore: bump version to v3.8.12"
git tag v3.8.12
git push origin main
git push origin v3.8.12
```
Users can then use: `nix run github:RayLabsHQ/gitea-mirror/v3.8.12`
### No Release Needed For:
- Bug fixes
- Small changes
- Continuous updates
Users can always use latest from main: `nix run github:RayLabsHQ/gitea-mirror`
---
## Summary
**Ready to distribute RIGHT NOW**
- Just commit and push your `flake.nix`
- Users can run directly from GitHub
- CI validates builds automatically
**Optional: Submit to nixpkgs**
- Maximum discoverability
- Official Nix repository
- Do this once package is stable
See `docs/NIX_DISTRIBUTION.md` for complete details!


@@ -29,7 +29,8 @@ RUN bun install --production --omit=peer --frozen-lockfile
FROM oven/bun:1.3.9-debian AS runner
WORKDIR /app
RUN apt-get update && apt-get install -y --no-install-recommends \
wget sqlite3 openssl ca-certificates \
git git-lfs wget sqlite3 openssl ca-certificates \
&& git lfs install \
&& rm -rf /var/lib/apt/lists/*
COPY --from=pruner /app/node_modules ./node_modules
COPY --from=builder /app/dist ./dist

NIX.md

@@ -24,7 +24,7 @@ Secrets auto-generate, database auto-initializes, and the web UI starts at http:
nix run --extra-experimental-features 'nix-command flakes' github:RayLabsHQ/gitea-mirror
# Pin to specific version
nix run github:RayLabsHQ/gitea-mirror/v3.8.11
nix run github:RayLabsHQ/gitea-mirror/vX.Y.Z
```
### 2. Install to Profile


@@ -40,6 +40,7 @@ First user signup becomes admin. Configure GitHub and Gitea through the web inte
- 🔄 **Auto-discovery** - Automatically import new GitHub repositories (v3.4.0+)
- 🧹 **Repository cleanup** - Auto-remove repos deleted from GitHub (v3.4.0+)
- 🎯 **Proper mirror intervals** - Respects configured sync intervals (v3.4.0+)
- 🛡️ **[Force-push protection](docs/FORCE_PUSH_PROTECTION.md)** - Smart detection with backup-on-demand or block-and-approve modes (Beta)
- 🗑️ Automatic database cleanup with configurable retention
- 🐳 Dockerized with multi-arch support (AMD64/ARM64)
@@ -112,7 +113,7 @@ docker compose up -d
#### Using Pre-built Image Directly
```bash
docker pull ghcr.io/raylabshq/gitea-mirror:v3.1.1
docker pull ghcr.io/raylabshq/gitea-mirror:latest
```
### Configuration Options
@@ -483,7 +484,7 @@ Contributions are welcome! Please read our [Contributing Guidelines](CONTRIBUTIN
## License
GNU General Public License v3.0 - see [LICENSE](LICENSE) file for details.
GNU Affero General Public License v3.0 (AGPL-3.0) - see [LICENSE](LICENSE) file for details.
## Star History
@@ -498,7 +499,8 @@ GNU General Public License v3.0 - see [LICENSE](LICENSE) file for details.
## Support
- 📖 [Documentation](https://github.com/RayLabsHQ/gitea-mirror/tree/main/docs)
- 🔐 [Custom CA Certificates](docs/CA_CERTIFICATES.md)
- 🔐 [Environment Variables](docs/ENVIRONMENT_VARIABLES.md)
- 🛡️ [Force-Push Protection](docs/FORCE_PUSH_PROTECTION.md)
- 🐛 [Report Issues](https://github.com/RayLabsHQ/gitea-mirror/issues)
- 💬 [Discussions](https://github.com/RayLabsHQ/gitea-mirror/discussions)
- 🔧 [Proxmox VE Script](https://community-scripts.github.io/ProxmoxVE/scripts?id=gitea-mirror)

bun.lock

@@ -5,74 +5,77 @@
"": {
"name": "gitea-mirror",
"dependencies": {
"@astrojs/check": "latest",
"@astrojs/mdx": "latest",
"@astrojs/node": "latest",
"@astrojs/react": "latest",
"@better-auth/sso": "latest",
"@octokit/plugin-throttling": "latest",
"@octokit/rest": "latest",
"@radix-ui/react-accordion": "latest",
"@radix-ui/react-avatar": "latest",
"@radix-ui/react-checkbox": "latest",
"@radix-ui/react-collapsible": "latest",
"@radix-ui/react-dialog": "latest",
"@radix-ui/react-dropdown-menu": "latest",
"@radix-ui/react-hover-card": "latest",
"@radix-ui/react-label": "latest",
"@radix-ui/react-popover": "latest",
"@radix-ui/react-progress": "latest",
"@radix-ui/react-radio-group": "latest",
"@radix-ui/react-scroll-area": "latest",
"@radix-ui/react-select": "latest",
"@radix-ui/react-separator": "latest",
"@radix-ui/react-slot": "latest",
"@radix-ui/react-switch": "latest",
"@radix-ui/react-tabs": "latest",
"@radix-ui/react-tooltip": "latest",
"@tailwindcss/vite": "latest",
"@tanstack/react-virtual": "latest",
"@types/canvas-confetti": "latest",
"@types/react": "latest",
"@types/react-dom": "latest",
"astro": "latest",
"bcryptjs": "latest",
"better-auth": "latest",
"buffer": "latest",
"canvas-confetti": "latest",
"class-variance-authority": "latest",
"clsx": "latest",
"cmdk": "latest",
"dotenv": "latest",
"drizzle-orm": "latest",
"fuse.js": "latest",
"jsonwebtoken": "latest",
"lucide-react": "latest",
"next-themes": "latest",
"react": "latest",
"react-dom": "latest",
"react-icons": "latest",
"sonner": "latest",
"tailwind-merge": "latest",
"tailwindcss": "latest",
"tw-animate-css": "latest",
"typescript": "latest",
"uuid": "latest",
"vaul": "latest",
"zod": "latest",
"@astrojs/check": "^0.9.6",
"@astrojs/mdx": "4.3.13",
"@astrojs/node": "9.5.4",
"@astrojs/react": "^4.4.2",
"@better-auth/sso": "1.4.19",
"@octokit/plugin-throttling": "^11.0.3",
"@octokit/rest": "^22.0.1",
"@radix-ui/react-accordion": "^1.2.12",
"@radix-ui/react-avatar": "^1.1.11",
"@radix-ui/react-checkbox": "^1.3.3",
"@radix-ui/react-collapsible": "^1.1.12",
"@radix-ui/react-dialog": "^1.1.15",
"@radix-ui/react-dropdown-menu": "^2.1.16",
"@radix-ui/react-hover-card": "^1.1.15",
"@radix-ui/react-label": "^2.1.8",
"@radix-ui/react-popover": "^1.1.15",
"@radix-ui/react-progress": "^1.1.8",
"@radix-ui/react-radio-group": "^1.3.8",
"@radix-ui/react-scroll-area": "^1.2.10",
"@radix-ui/react-select": "^2.2.6",
"@radix-ui/react-separator": "^1.1.8",
"@radix-ui/react-slot": "^1.2.4",
"@radix-ui/react-switch": "^1.2.6",
"@radix-ui/react-tabs": "^1.1.13",
"@radix-ui/react-tooltip": "^1.2.8",
"@tailwindcss/vite": "^4.2.1",
"@tanstack/react-virtual": "^3.13.19",
"@types/canvas-confetti": "^1.9.0",
"@types/react": "^19.2.14",
"@types/react-dom": "^19.2.3",
"astro": "^5.18.0",
"bcryptjs": "^3.0.3",
"better-auth": "1.4.19",
"buffer": "^6.0.3",
"canvas-confetti": "^1.9.4",
"class-variance-authority": "^0.7.1",
"clsx": "^2.1.1",
"cmdk": "^1.1.1",
"dotenv": "^17.3.1",
"drizzle-orm": "^0.45.1",
"fuse.js": "^7.1.0",
"jsonwebtoken": "^9.0.3",
"lucide-react": "^0.575.0",
"nanoid": "^3.3.11",
"next-themes": "^0.4.6",
"react": "^19.2.4",
"react-dom": "^19.2.4",
"react-icons": "^5.5.0",
"sonner": "^2.0.7",
"tailwind-merge": "^3.5.0",
"tailwindcss": "^4.2.1",
"tw-animate-css": "^1.4.0",
"typescript": "^5.9.3",
"uuid": "^13.0.0",
"vaul": "^1.1.2",
"zod": "^4.3.6",
},
"devDependencies": {
"@testing-library/jest-dom": "latest",
"@testing-library/react": "latest",
"@types/bcryptjs": "latest",
"@types/bun": "latest",
"@types/jsonwebtoken": "latest",
"@types/uuid": "latest",
"@vitejs/plugin-react": "latest",
"drizzle-kit": "latest",
"jsdom": "latest",
"tsx": "latest",
"vitest": "latest",
"@playwright/test": "^1.58.2",
"@testing-library/jest-dom": "^6.9.1",
"@testing-library/react": "^16.3.2",
"@types/bcryptjs": "^3.0.0",
"@types/bun": "^1.3.9",
"@types/jsonwebtoken": "^9.0.10",
"@types/node": "^25.3.2",
"@types/uuid": "^11.0.0",
"@vitejs/plugin-react": "^5.1.4",
"drizzle-kit": "^0.31.9",
"jsdom": "^28.1.0",
"tsx": "^4.21.0",
"vitest": "^4.0.18",
},
},
},
@@ -361,6 +364,8 @@
"@oslojs/encoding": ["@oslojs/encoding@1.1.0", "", {}, "sha512-70wQhgYmndg4GCPxPPxPGevRKqTIJ2Nh4OkiMWmDAVYsTQ+Ta7Sq+rPevXyXGdzr30/qZBnyOalCszoMxlyldQ=="],
"@playwright/test": ["@playwright/test@1.58.2", "", { "dependencies": { "playwright": "1.58.2" }, "bin": { "playwright": "cli.js" } }, "sha512-akea+6bHYBBfA9uQqSYmlJXn61cTa+jbO87xVLCWbTqbWadRVmhxlXATaOjOgcBaWU4ePo0wB41KMFv3o35IXA=="],
"@radix-ui/number": ["@radix-ui/number@1.1.1", "", {}, "sha512-MkKCwxlXTgz6CFoJx3pCwn07GKp36+aZyu/u2Ln2VrA5DcdyCZkASEDBTd8x5whTQQL5CiYf4prXKLcgQdv29g=="],
"@radix-ui/primitive": ["@radix-ui/primitive@1.1.3", "", {}, "sha512-JTF99U/6XIjCBo0wqkU5sK10glYe27MRRsfwoiq5zzOEZLHU3A3KCMa5X/azekYRCJ0HlwI0crAXS/5dEHTzDg=="],
@@ -591,7 +596,7 @@
"@types/nlcst": ["@types/nlcst@2.0.3", "", { "dependencies": { "@types/unist": "*" } }, "sha512-vSYNSDe6Ix3q+6Z7ri9lyWqgGhJTmzRjZRqyq15N0Z/1/UnVsno9G/N40NBijoYx2seFDIl0+B2mgAb9mezUCA=="],
"@types/node": ["@types/node@22.15.23", "", { "dependencies": { "undici-types": "~6.21.0" } }, "sha512-7Ec1zaFPF4RJ0eXu1YT/xgiebqwqoJz8rYPDi/O2BcZ++Wpt0Kq9cl0eg6NN6bYbPnR67ZLo7St5Q3UK0SnARw=="],
"@types/node": ["@types/node@25.3.2", "", { "dependencies": { "undici-types": "~7.18.0" } }, "sha512-RpV6r/ij22zRRdyBPcxDeKAzH43phWVKEjL2iksqo1Vz3CuBUrgmPpPhALKiRfU7OMCmeeO9vECBMsV0hMTG8Q=="],
"@types/react": ["@types/react@19.2.14", "", { "dependencies": { "csstype": "^3.2.2" } }, "sha512-ilcTH/UniCkMdtexkoCN0bI7pMcJDvmQFPvuPvmEaYA/NSfFTAgdUSLAoVjaRJm7+6PvcM+q1zYOwS4wTYMF9w=="],
@@ -669,7 +674,7 @@
"astring": ["astring@1.9.0", "", { "bin": { "astring": "bin/astring" } }, "sha512-LElXdjswlqjWrPpJFg1Fx4wpkOCxj1TDHlSV4PlaRxHGWko024xICaa97ZkMfs6DRKlCguiAI+rbXv5GWwXIkg=="],
"astro": ["astro@5.17.3", "", { "dependencies": { "@astrojs/compiler": "^2.13.0", "@astrojs/internal-helpers": "0.7.5", "@astrojs/markdown-remark": "6.3.10", "@astrojs/telemetry": "3.3.0", "@capsizecss/unpack": "^4.0.0", "@oslojs/encoding": "^1.1.0", "@rollup/pluginutils": "^5.3.0", "acorn": "^8.15.0", "aria-query": "^5.3.2", "axobject-query": "^4.1.0", "boxen": "8.0.1", "ci-info": "^4.3.1", "clsx": "^2.1.1", "common-ancestor-path": "^1.0.1", "cookie": "^1.1.1", "cssesc": "^3.0.0", "debug": "^4.4.3", "deterministic-object-hash": "^2.0.2", "devalue": "^5.6.2", "diff": "^8.0.3", "dlv": "^1.1.3", "dset": "^3.1.4", "es-module-lexer": "^1.7.0", "esbuild": "^0.27.3", "estree-walker": "^3.0.3", "flattie": "^1.1.1", "fontace": "~0.4.0", "github-slugger": "^2.0.0", "html-escaper": "3.0.3", "http-cache-semantics": "^4.2.0", "import-meta-resolve": "^4.2.0", "js-yaml": "^4.1.1", "magic-string": "^0.30.21", "magicast": "^0.5.1", "mrmime": "^2.0.1", "neotraverse": "^0.6.18", "p-limit": "^6.2.0", "p-queue": "^8.1.1", "package-manager-detector": "^1.6.0", "piccolore": "^0.1.3", "picomatch": "^4.0.3", "prompts": "^2.4.2", "rehype": "^13.0.2", "semver": "^7.7.3", "shiki": "^3.21.0", "smol-toml": "^1.6.0", "svgo": "^4.0.0", "tinyexec": "^1.0.2", "tinyglobby": "^0.2.15", "tsconfck": "^3.1.6", "ultrahtml": "^1.6.0", "unifont": "~0.7.3", "unist-util-visit": "^5.0.0", "unstorage": "^1.17.4", "vfile": "^6.0.3", "vite": "^6.4.1", "vitefu": "^1.1.1", "xxhash-wasm": "^1.1.0", "yargs-parser": "^21.1.1", "yocto-spinner": "^0.2.3", "zod": "^3.25.76", "zod-to-json-schema": "^3.25.1", "zod-to-ts": "^1.2.0" }, "optionalDependencies": { "sharp": "^0.34.0" }, "bin": { "astro": "astro.js" } }, "sha512-69dcfPe8LsHzklwj+hl+vunWUbpMB6pmg35mACjetxbJeUNNys90JaBM8ZiwsPK689SAj/4Zqb1ayaANls9/MA=="],
"astro": ["astro@5.18.0", "", { "dependencies": { "@astrojs/compiler": "^2.13.0", "@astrojs/internal-helpers": "0.7.5", "@astrojs/markdown-remark": "6.3.10", "@astrojs/telemetry": "3.3.0", "@capsizecss/unpack": "^4.0.0", "@oslojs/encoding": "^1.1.0", "@rollup/pluginutils": "^5.3.0", "acorn": "^8.15.0", "aria-query": "^5.3.2", "axobject-query": "^4.1.0", "boxen": "8.0.1", "ci-info": "^4.3.1", "clsx": "^2.1.1", "common-ancestor-path": "^1.0.1", "cookie": "^1.1.1", "cssesc": "^3.0.0", "debug": "^4.4.3", "deterministic-object-hash": "^2.0.2", "devalue": "^5.6.2", "diff": "^8.0.3", "dlv": "^1.1.3", "dset": "^3.1.4", "es-module-lexer": "^1.7.0", "esbuild": "^0.27.3", "estree-walker": "^3.0.3", "flattie": "^1.1.1", "fontace": "~0.4.0", "github-slugger": "^2.0.0", "html-escaper": "3.0.3", "http-cache-semantics": "^4.2.0", "import-meta-resolve": "^4.2.0", "js-yaml": "^4.1.1", "magic-string": "^0.30.21", "magicast": "^0.5.1", "mrmime": "^2.0.1", "neotraverse": "^0.6.18", "p-limit": "^6.2.0", "p-queue": "^8.1.1", "package-manager-detector": "^1.6.0", "piccolore": "^0.1.3", "picomatch": "^4.0.3", "prompts": "^2.4.2", "rehype": "^13.0.2", "semver": "^7.7.3", "shiki": "^3.21.0", "smol-toml": "^1.6.0", "svgo": "^4.0.0", "tinyexec": "^1.0.2", "tinyglobby": "^0.2.15", "tsconfck": "^3.1.6", "ultrahtml": "^1.6.0", "unifont": "~0.7.3", "unist-util-visit": "^5.0.0", "unstorage": "^1.17.4", "vfile": "^6.0.3", "vite": "^6.4.1", "vitefu": "^1.1.1", "xxhash-wasm": "^1.1.0", "yargs-parser": "^21.1.1", "yocto-spinner": "^0.2.3", "zod": "^3.25.76", "zod-to-json-schema": "^3.25.1", "zod-to-ts": "^1.2.0" }, "optionalDependencies": { "sharp": "^0.34.0" }, "bin": { "astro": "astro.js" } }, "sha512-CHiohwJIS4L0G6/IzE1Fx3dgWqXBCXus/od0eGUfxrZJD2um2pE7ehclMmgL/fXqbU7NfE1Ze2pq34h2QaA6iQ=="],
"axobject-query": ["axobject-query@4.1.0", "", {}, "sha512-qIj0G9wZbMGNLjLmg1PT6v2mE9AH2zlnADJD/2tC6E00hgmhUOfEB6greHPAfLRSufHqROIUTkw6E+M3lH0PTQ=="],
@@ -1277,6 +1282,10 @@
"picomatch": ["picomatch@4.0.3", "", {}, "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q=="],
"playwright": ["playwright@1.58.2", "", { "dependencies": { "playwright-core": "1.58.2" }, "optionalDependencies": { "fsevents": "2.3.2" }, "bin": { "playwright": "cli.js" } }, "sha512-vA30H8Nvkq/cPBnNw4Q8TWz1EJyqgpuinBcHET0YVJVFldr8JDNiU9LaWAE1KqSkRYazuaBhTpB5ZzShOezQ6A=="],
"playwright-core": ["playwright-core@1.58.2", "", { "bin": { "playwright-core": "cli.js" } }, "sha512-yZkEtftgwS8CsfYo7nm0KE8jsvm6i/PTgVtB8DL726wNf6H2IMsDuxCpJj59KDaxCtSnrWan2AeDqM7JBaultg=="],
"postcss": ["postcss@8.5.3", "", { "dependencies": { "nanoid": "^3.3.8", "picocolors": "^1.1.1", "source-map-js": "^1.2.1" } }, "sha512-dle9A3yYxlBSrt8Fu+IpjGT8SY8hN0mlaA6GY8t0P5PjIOZemULz/E2Bnm/2dcUOena75OTNkHI76uZBNUUq3A=="],
"prettier": ["prettier@3.7.4", "", { "bin": { "prettier": "bin/prettier.cjs" } }, "sha512-v6UNi1+3hSlVvv8fSaoUbggEM5VErKmmpGA7Pl3HF8V6uKY7rvClBOJlH6yNwQtfTueNkGVpOv/mtWL9L4bgRA=="],
@@ -1501,7 +1510,7 @@
"undici": ["undici@7.22.0", "", {}, "sha512-RqslV2Us5BrllB+JeiZnK4peryVTndy9Dnqq62S3yYRRTj0tFQCwEniUy2167skdGOy3vqRzEvl1Dm4sV2ReDg=="],
"undici-types": ["undici-types@6.21.0", "", {}, "sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ=="],
"undici-types": ["undici-types@7.18.2", "", {}, "sha512-AsuCzffGHJybSaRrmr5eHr81mwJU3kjw6M+uprWvCXiNeN9SOGwQ3Jn8jb8m3Z6izVgknn1R0FTCEAP2QrLY/w=="],
"unified": ["unified@11.0.5", "", { "dependencies": { "@types/unist": "^3.0.0", "bail": "^2.0.0", "devlop": "^1.0.0", "extend": "^3.0.0", "is-plain-obj": "^4.0.0", "trough": "^2.0.0", "vfile": "^6.0.0" } }, "sha512-xKvGhPWw3k84Qjh8bI3ZeJjqnyadK+GEFtazSfZv/rKeTkTjOJho6mFqh2SM96iIcZokxiOpg78GazTSg8+KHA=="],
@@ -1741,6 +1750,8 @@
"@types/bcryptjs/bcryptjs": ["bcryptjs@3.0.2", "", { "bin": { "bcrypt": "bin/bcrypt" } }, "sha512-k38b3XOZKv60C4E2hVsXTolJWfkGRMbILBIe2IBITXciy5bOsTKot5kDrf3ZfufQtQOUN5mXceUEpU1rTl9Uog=="],
"@types/jsonwebtoken/@types/node": ["@types/node@22.15.23", "", { "dependencies": { "undici-types": "~6.21.0" } }, "sha512-7Ec1zaFPF4RJ0eXu1YT/xgiebqwqoJz8rYPDi/O2BcZ++Wpt0Kq9cl0eg6NN6bYbPnR67ZLo7St5Q3UK0SnARw=="],
"anymatch/picomatch": ["picomatch@2.3.1", "", {}, "sha512-JU3teHTNjmE2VCGFzuY8EXzCDVwEqB2a8fsIvwaStHhAWJEeVd1o1QD80CU6+ZdEXXSLbSsuLwJjkCBWqRQUVA=="],
"astro/esbuild": ["esbuild@0.27.3", "", { "optionalDependencies": { "@esbuild/aix-ppc64": "0.27.3", "@esbuild/android-arm": "0.27.3", "@esbuild/android-arm64": "0.27.3", "@esbuild/android-x64": "0.27.3", "@esbuild/darwin-arm64": "0.27.3", "@esbuild/darwin-x64": "0.27.3", "@esbuild/freebsd-arm64": "0.27.3", "@esbuild/freebsd-x64": "0.27.3", "@esbuild/linux-arm": "0.27.3", "@esbuild/linux-arm64": "0.27.3", "@esbuild/linux-ia32": "0.27.3", "@esbuild/linux-loong64": "0.27.3", "@esbuild/linux-mips64el": "0.27.3", "@esbuild/linux-ppc64": "0.27.3", "@esbuild/linux-riscv64": "0.27.3", "@esbuild/linux-s390x": "0.27.3", "@esbuild/linux-x64": "0.27.3", "@esbuild/netbsd-arm64": "0.27.3", "@esbuild/netbsd-x64": "0.27.3", "@esbuild/openbsd-arm64": "0.27.3", "@esbuild/openbsd-x64": "0.27.3", "@esbuild/openharmony-arm64": "0.27.3", "@esbuild/sunos-x64": "0.27.3", "@esbuild/win32-arm64": "0.27.3", "@esbuild/win32-ia32": "0.27.3", "@esbuild/win32-x64": "0.27.3" }, "bin": { "esbuild": "bin/esbuild" } }, "sha512-8VwMnyGCONIs6cWue2IdpHxHnAjzxnw2Zr7MkVxB2vjmQ2ivqGFb4LEG3SMnv0Gb2F/G/2yA8zUaiL1gywDCCg=="],
@@ -1753,6 +1764,8 @@
"boxen/string-width": ["string-width@7.2.0", "", { "dependencies": { "emoji-regex": "^10.3.0", "get-east-asian-width": "^1.0.0", "strip-ansi": "^7.1.0" } }, "sha512-tsaTIkKW9b4N+AEj+SVA+WhJzV7/zMhcSu78mLKWSk7cXMOSHsBKFWUs0fWwq8QyK3MgJBQRX6Gbi4kYbdvGkQ=="],
"bun-types/@types/node": ["@types/node@22.15.23", "", { "dependencies": { "undici-types": "~6.21.0" } }, "sha512-7Ec1zaFPF4RJ0eXu1YT/xgiebqwqoJz8rYPDi/O2BcZ++Wpt0Kq9cl0eg6NN6bYbPnR67ZLo7St5Q3UK0SnARw=="],
"cliui/wrap-ansi": ["wrap-ansi@7.0.0", "", { "dependencies": { "ansi-styles": "^4.0.0", "string-width": "^4.1.0", "strip-ansi": "^6.0.0" } }, "sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q=="],
"cmdk/@radix-ui/react-dialog": ["@radix-ui/react-dialog@1.1.14", "", { "dependencies": { "@radix-ui/primitive": "1.1.2", "@radix-ui/react-compose-refs": "1.1.2", "@radix-ui/react-context": "1.1.2", "@radix-ui/react-dismissable-layer": "1.1.10", "@radix-ui/react-focus-guards": "1.1.2", "@radix-ui/react-focus-scope": "1.1.7", "@radix-ui/react-id": "1.1.1", "@radix-ui/react-portal": "1.1.9", "@radix-ui/react-presence": "1.1.4", "@radix-ui/react-primitive": "2.1.3", "@radix-ui/react-slot": "1.2.3", "@radix-ui/react-use-controllable-state": "1.2.2", "aria-hidden": "^1.2.4", "react-remove-scroll": "^2.6.3" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-+CpweKjqpzTmwRwcYECQcNYbI8V9VSQt0SNFKeEBLgfucbsLssU6Ppq7wUdNXEGb573bMjFhVjKVll8rmV6zMw=="],
@@ -1793,6 +1806,8 @@
"parse-entities/@types/unist": ["@types/unist@2.0.11", "", {}, "sha512-CmBKiL6NNo/OqgmMn95Fk9Whlp2mtvIv+KNpQKN2F4SjvrEesubTRWGYSg+BnWZOnlCaSTU1sMpsBOzgbYhnsA=="],
"playwright/fsevents": ["fsevents@2.3.2", "", { "os": "darwin" }, "sha512-xiqMQR4xAeHTuB9uWm+fFRcIOgKBMiOBP+eXiyT7jsgVCq1bkVygt00oASowB7EdtpOHaaPgKt812P9ab+DDKA=="],
"pretty-format/ansi-styles": ["ansi-styles@5.2.0", "", {}, "sha512-Cxwpt2SfTzTtXcfOlzGEee8O+c+MmUgGrNiBcXnuWxuFJHe6a5Hz7qwhwe5OgaSYI0IJvkLqWX1ASG+cJOkEiA=="],
"prompts/kleur": ["kleur@3.0.3", "", {}, "sha512-eTIzlVOSUR+JxdDFepEYcBMtZ9Qqdef+rnzWdRZuMbOywu5tO2w2N7rqjoANZ5k9vywhL6Br1VRjUIgTQx4E8w=="],
@@ -1919,6 +1934,8 @@
"@tailwindcss/node/lightningcss/lightningcss-win32-x64-msvc": ["lightningcss-win32-x64-msvc@1.31.1", "", { "os": "win32", "cpu": "x64" }, "sha512-I9aiFrbd7oYHwlnQDqr1Roz+fTz61oDDJX7n9tYF9FJymH1cIN1DtKw3iYt6b8WZgEjoNwVSncwF4wx/ZedMhw=="],
"@types/jsonwebtoken/@types/node/undici-types": ["undici-types@6.21.0", "", {}, "sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ=="],
"astro/esbuild/@esbuild/aix-ppc64": ["@esbuild/aix-ppc64@0.27.3", "", { "os": "aix", "cpu": "ppc64" }, "sha512-9fJMTNFTWZMh5qwrBItuziu834eOCUcEqymSH7pY+zoMVEZg3gcPuBNxH1EvfVYe9h0x/Ptw8KBzv7qxb7l8dg=="],
"astro/esbuild/@esbuild/android-arm": ["@esbuild/android-arm@0.27.3", "", { "os": "android", "cpu": "arm" }, "sha512-i5D1hPY7GIQmXlXhs2w8AWHhenb00+GxjxRncS2ZM7YNVGNfaMxgzSGuO8o8SJzRc/oZwU2bcScvVERk03QhzA=="],
@@ -1975,6 +1992,8 @@
"boxen/string-width/strip-ansi": ["strip-ansi@7.1.0", "", { "dependencies": { "ansi-regex": "^6.0.1" } }, "sha512-iq6eVVI64nQQTRYq2KtEg2d2uU7LElhTJwsH4YzIHZshxlgZms/wIc4VoDQTlG/IvVIrBKG06CrZnp0qv7hkcQ=="],
"bun-types/@types/node/undici-types": ["undici-types@6.21.0", "", {}, "sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ=="],
"cmdk/@radix-ui/react-dialog/@radix-ui/primitive": ["@radix-ui/primitive@1.1.2", "", {}, "sha512-XnbHrrprsNqZKQhStrSwgRUQzoCI1glLzdw79xiZPoofhGICeZRSQ3dIxAKH1gb3OHfNf4d6f+vAv3kil2eggA=="],
"cmdk/@radix-ui/react-dialog/@radix-ui/react-dismissable-layer": ["@radix-ui/react-dismissable-layer@1.1.10", "", { "dependencies": { "@radix-ui/primitive": "1.1.2", "@radix-ui/react-compose-refs": "1.1.2", "@radix-ui/react-primitive": "2.1.3", "@radix-ui/react-use-callback-ref": "1.1.1", "@radix-ui/react-use-escape-keydown": "1.1.1" }, "peerDependencies": { "@types/react": "*", "@types/react-dom": "*", "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" }, "optionalPeers": ["@types/react", "@types/react-dom"] }, "sha512-IM1zzRV4W3HtVgftdQiiOmA0AdJlCtMLe00FXaHwgt3rAnNsIyDqshvkIW3hj/iu5hu8ERP7KIYki6NkqDxAwQ=="],

bun.nix Normal file

File diff suppressed because it is too large.


@@ -3,4 +3,7 @@
timeout = 5000
# Preload the setup file
preload = ["./src/tests/setup.bun.ts"]
preload = ["./src/tests/setup.bun.ts"]
# Only run tests in src/ directory (excludes tests/e2e/ which are Playwright tests)
root = "./src/"


@@ -310,26 +310,25 @@ bunx tsc --noEmit
## Release Process
1. **Update version**:
```bash
npm version patch # or minor/major
```
1. **Choose release version** (`X.Y.Z`) and update `CHANGELOG.md`
2. **Update CHANGELOG.md**
3. **Build and test**:
2. **Build and test**:
```bash
bun run build
bun test
```
4. **Create release**:
3. **Create release tag** (semver format required):
```bash
git tag v2.23.0
git push origin v2.23.0
git tag vX.Y.Z
git push origin vX.Y.Z
```
5. **Create GitHub release**
4. **Create GitHub release**
5. **CI version sync (automatic)**:
- On `v*` tags, release CI updates `package.json` version in the build context from the tag (`vX.Y.Z` -> `X.Y.Z`), so Docker release images always report the correct app version.
- After the release build succeeds, CI commits the same `package.json` version back to `main` automatically.
## Contributing
@@ -349,6 +348,6 @@ git push origin v2.23.0
## Getting Help
- Check existing [issues](https://github.com/yourusername/gitea-mirror/issues)
- Join [discussions](https://github.com/yourusername/gitea-mirror/discussions)
- Read the [FAQ](./FAQ.md)
- Check existing [issues](https://github.com/RayLabsHQ/gitea-mirror/issues)
- Join [discussions](https://github.com/RayLabsHQ/gitea-mirror/discussions)
- Review project docs in [docs/README.md](./README.md)


@@ -78,6 +78,7 @@ Settings for connecting to and configuring GitHub repository sources.
| Variable | Description | Default | Options |
|----------|-------------|---------|---------|
| `SKIP_STARRED_ISSUES` | Enable lightweight mode for starred repos (skip issues) | `false` | `true`, `false` |
| `AUTO_MIRROR_STARRED` | Automatically mirror starred repos during scheduled syncs and "Mirror All". When `false`, starred repos are imported for browsing but must be mirrored individually. | `false` | `true`, `false` |
## Gitea Configuration


@@ -0,0 +1,179 @@
# Force-Push Protection
This document describes the smart force-push protection system introduced in gitea-mirror v3.11.0+.
## The Problem
GitHub repositories can be force-pushed at any time — rewriting history, deleting branches, or replacing commits entirely. When gitea-mirror syncs a force-pushed repo, the old history in Gitea is silently overwritten. Files, commits, and branches disappear with no way to recover them.
The original workaround (`backupBeforeSync: true`) created a full git bundle backup before **every** sync. This doesn't scale — a user with 100+ GiB of mirrors would need up to 2 TB of backup storage with default retention settings, even though force-pushes are rare.
## Solution: Smart Detection
Instead of backing up everything every time, the system detects force-pushes **before** they happen and only acts when needed.
### How Detection Works
Before each sync, the app compares branch SHAs between Gitea (the mirror) and GitHub (the source):
1. **Fetch branches from both sides** — lightweight API calls to get branch names and their latest commit SHAs
2. **Compare each branch**:
- SHAs match → nothing changed, no action needed
- SHAs differ → check if the change is a normal push or a force-push
3. **Ancestry check** — for branches with different SHAs, call GitHub's compare API to determine if the new SHA is a descendant of the old one:
- **Fast-forward** (new SHA descends from old) → normal push, safe to sync
- **Diverged** (histories split) → force-push detected
- **404** (old SHA doesn't exist on GitHub anymore) → history was rewritten, force-push detected
- **Branch deleted on GitHub** → flagged as destructive change
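The per-branch comparison above can be sketched as a small classifier. The type and function names below (`BranchCheck`, `classifyBranch`) are illustrative, not the app's actual API — the real logic lives in `src/lib/utils/force-push-detection.ts`:

```typescript
type Verdict = "unchanged" | "fast-forward" | "force-push" | "deleted";

interface BranchCheck {
  giteaSha: string;
  githubSha?: string; // undefined models "branch deleted on GitHub"
  // Result of GitHub's compare API (base = old SHA, head = new SHA);
  // null models a 404 (the old SHA no longer exists on GitHub).
  compareStatus?: "ahead" | "identical" | "diverged" | "behind" | null;
}

function classifyBranch(b: BranchCheck): Verdict {
  if (b.githubSha === undefined) return "deleted";     // destructive change
  if (b.giteaSha === b.githubSha) return "unchanged";  // SHAs match, nothing to do
  if (b.compareStatus === null) return "force-push";   // 404: history rewritten
  if (b.compareStatus === "ahead" || b.compareStatus === "identical")
    return "fast-forward";                             // normal push, safe to sync
  return "force-push";                                 // diverged (or behind): histories split
}
```

Only branches whose SHAs differ need the (more expensive) ancestry check, which keeps the common no-change case to two lightweight branch-listing calls.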
### What Happens on Detection
Depends on the configured strategy (see below):
- **Backup strategies** (`always`, `on-force-push`): create a git bundle snapshot, then sync
- **Block strategy** (`block-on-force-push`): halt the sync, mark the repo as `pending-approval`, wait for user action
### Fail-Open Design
If detection itself fails (GitHub rate limits, network errors, API outages), sync proceeds normally. Detection never blocks a sync due to its own failure. Individual branch check failures are skipped — one flaky branch doesn't affect the others.
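The fail-open contract amounts to a try/catch around the detector. A minimal sketch (shown synchronously for brevity; the real detection is asynchronous, and these names are hypothetical):

```typescript
interface DetectionResult {
  forcePush: boolean;
  detectionFailed: boolean; // surfaced in the activity log, never blocks the sync
}

function detectSafely(detect: () => boolean): DetectionResult {
  try {
    return { forcePush: detect(), detectionFailed: false };
  } catch {
    // Rate limit, network error, API outage: proceed as if nothing was detected.
    return { forcePush: false, detectionFailed: true };
  }
}
```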
## Backup Strategies
Configure via **Settings → GitHub Configuration → Destructive Update Protection**.
| Strategy | What It Does | Storage Cost | Best For |
|---|---|---|---|
| **Disabled** | No detection, no backups | Zero | Repos you don't care about losing |
| **Always Backup** | Snapshot before every sync (original behavior) | High | Small mirror sets, maximum safety |
| **Smart** (default) | Detect force-pushes, backup only when found | Near-zero normally | Most users — efficient protection |
| **Block & Approve** | Detect force-pushes, block sync until approved | Zero | Critical repos needing manual review |
### Strategy Details
#### Disabled
Syncs proceed without any detection or backup. If a force-push happens on GitHub, the mirror silently overwrites.
#### Always Backup
Creates a git bundle snapshot before every sync regardless of whether a force-push occurred. This is the legacy behavior (equivalent to the old `backupBeforeSync: true`). Safe but expensive for large mirror sets.
#### Smart (`on-force-push`) — Recommended
Runs the force-push detection before each sync. On normal days (no force-pushes), syncs proceed without any backup overhead. When a force-push is detected, a snapshot is created before the sync runs.
This gives you protection when it matters with near-zero cost when it doesn't.
#### Block & Approve (`block-on-force-push`)
Runs detection and, when a force-push is found, **blocks the sync entirely**. The repository is marked as `pending-approval` and excluded from future scheduled syncs until you take action:
- **Approve**: creates a backup first, then syncs (safe)
- **Dismiss**: clears the flag and resumes normal syncing (no backup)
Use this for repos where you want manual control over destructive changes.
## Additional Settings
These appear when any non-disabled strategy is selected:
### Snapshot Retention Count
How many backup snapshots to keep per repository. Oldest snapshots are deleted when this limit is exceeded. Default: **20**.
### Snapshot Directory
Where git bundle backups are stored. Default: **`data/repo-backups`**. Bundles are organized as `<directory>/<owner>/<repo>/<timestamp>.bundle`.
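The path layout can be expressed as a one-line helper. This is a sketch — the timestamp format is an assumption for illustration, and the real path construction lives in `src/lib/repo-backup.ts`:

```typescript
// <directory>/<owner>/<repo>/<timestamp>.bundle, with the timestamp made
// filename-safe by replacing ":" and "." (assumed format, for illustration).
function bundlePath(dir: string, owner: string, repo: string, when: Date): string {
  const stamp = when.toISOString().replace(/[:.]/g, "-");
  return `${dir}/${owner}/${repo}/${stamp}.bundle`;
}
```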
### Block Sync on Snapshot Failure
Available for **Always Backup** and **Smart** strategies. When enabled, if the snapshot creation fails (disk full, permissions error, etc.), the sync is also blocked. When disabled, sync continues even if the snapshot couldn't be created.
Recommended: **enabled** if you rely on backups for recovery.
## Backward Compatibility
The old `backupBeforeSync` boolean is still recognized:
| Old Setting | New Equivalent |
|---|---|
| `backupBeforeSync: true` | `backupStrategy: "always"` |
| `backupBeforeSync: false` | `backupStrategy: "disabled"` |
| Neither set | `backupStrategy: "on-force-push"` (new default) |
Existing configurations are automatically mapped. The old field is deprecated but will continue to work.
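The mapping table above implies a simple precedence: the new field wins, the deprecated boolean is honored next, and the new default applies when neither is set. A hedged sketch of that resolver (names are illustrative; see `src/lib/repo-backup.ts` for the real one):

```typescript
type Strategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";

function resolveStrategy(cfg: {
  backupStrategy?: Strategy;
  backupBeforeSync?: boolean; // deprecated, still recognized
}): Strategy {
  if (cfg.backupStrategy) return cfg.backupStrategy; // new field takes precedence
  if (cfg.backupBeforeSync === true) return "always";
  if (cfg.backupBeforeSync === false) return "disabled";
  return "on-force-push"; // new default when neither field is set
}
```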
## Environment Variables
No new environment variables are required. The backup strategy is configured through the web UI and stored in the database alongside other config.
## API
### Approve/Dismiss Blocked Repos
When using the `block-on-force-push` strategy, repos that are blocked can be managed via the API:
```bash
# Approve sync (creates backup first, then syncs)
curl -X POST http://localhost:4321/api/job/approve-sync \
-H "Content-Type: application/json" \
-H "Cookie: <session>" \
-d '{"repositoryIds": ["<id>"], "action": "approve"}'
# Dismiss (clear the block, resume normal syncing)
curl -X POST http://localhost:4321/api/job/approve-sync \
-H "Content-Type: application/json" \
-H "Cookie: <session>" \
-d '{"repositoryIds": ["<id>"], "action": "dismiss"}'
```
Blocked repos also show an **Approve** / **Dismiss** button in the repository table UI.
## Architecture
### Key Files
| File | Purpose |
|---|---|
| `src/lib/utils/force-push-detection.ts` | Core detection: fetch branches, compare SHAs, check ancestry |
| `src/lib/repo-backup.ts` | Strategy resolver, backup decision logic, bundle creation |
| `src/lib/gitea-enhanced.ts` | Sync flow integration (calls detection + backup before mirror-sync) |
| `src/pages/api/job/approve-sync.ts` | Approve/dismiss API endpoint |
| `src/components/config/GitHubConfigForm.tsx` | Strategy selector UI |
| `src/components/repositories/RepositoryTable.tsx` | Pending-approval badge + action buttons |
### Detection Flow
```
syncGiteaRepoEnhanced()
├─ Resolve backup strategy (config → backupStrategy → backupBeforeSync → default)
├─ If strategy needs detection ("on-force-push" or "block-on-force-push"):
│ │
│ ├─ fetchGiteaBranches() — GET /api/v1/repos/{owner}/{repo}/branches
│ ├─ fetchGitHubBranches() — octokit.paginate(repos.listBranches)
│ │
│ └─ For each Gitea branch where SHA differs:
│ └─ checkAncestry() — octokit.repos.compareCommits()
│ ├─ "ahead" or "identical" → fast-forward (safe)
│ ├─ "diverged" or "behind" → force-push detected
│ └─ 404/422 → old SHA gone → force-push detected
├─ If "block-on-force-push" + detected:
│ └─ Set repo status to "pending-approval", return early
├─ If backup needed (always, or on-force-push + detected):
│ └─ Create git bundle snapshot
└─ Proceed to mirror-sync
```
## Troubleshooting
**Repos stuck in "pending-approval"**: Use the Approve or Dismiss buttons in the repository table, or call the approve-sync API endpoint.
**Detection always skipped**: Check the activity log for skip reasons. Common causes: Gitea repo not yet mirrored (first sync), GitHub API rate limits, network errors. All are fail-open by design.
**Backups consuming too much space**: Lower the retention count, or switch from "Always Backup" to "Smart" which only creates backups on actual force-pushes.
**False positives**: The detection compares branch-by-branch. A rebase published with a force-push rewrites history, so it will correctly trigger detection. If you routinely rebase branches, consider using "Smart" instead of "Block & Approve" to avoid constant approval prompts.


@@ -16,7 +16,7 @@ nix run --extra-experimental-features 'nix-command flakes' github:RayLabsHQ/gite
nix run github:RayLabsHQ/gitea-mirror/abc123def
# Pin to git tag
nix run github:RayLabsHQ/gitea-mirror/v3.8.11
nix run github:RayLabsHQ/gitea-mirror/vX.Y.Z
```
**How it works:**
@@ -110,11 +110,11 @@ GitHub Actions workflow validates builds on every push/PR:
Tag releases for version pinning:
```bash
git tag v3.8.11
git push origin v3.8.11
git tag vX.Y.Z
git push origin vX.Y.Z
# Users can then pin:
nix run github:RayLabsHQ/gitea-mirror/v3.8.11
nix run github:RayLabsHQ/gitea-mirror/vX.Y.Z
```
### Phase 4: nixpkgs Submission (Long Term)
@@ -143,13 +143,13 @@ nix profile install --extra-experimental-features 'nix-command flakes' github:Ra
```bash
# Pin to git tag
nix run github:RayLabsHQ/gitea-mirror/v3.8.11
nix run github:RayLabsHQ/gitea-mirror/vX.Y.Z
# Pin to commit
nix run github:RayLabsHQ/gitea-mirror/abc123def
# Lock in flake.nix
inputs.gitea-mirror.url = "github:RayLabsHQ/gitea-mirror/v3.8.11";
inputs.gitea-mirror.url = "github:RayLabsHQ/gitea-mirror/vX.Y.Z";
```
#### Option 3: NixOS Configuration
@@ -160,7 +160,7 @@ inputs.gitea-mirror.url = "github:RayLabsHQ/gitea-mirror/v3.8.11";
nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
gitea-mirror.url = "github:RayLabsHQ/gitea-mirror";
# Or pin to version:
# gitea-mirror.url = "github:RayLabsHQ/gitea-mirror/v3.8.11";
# gitea-mirror.url = "github:RayLabsHQ/gitea-mirror/vX.Y.Z";
};
outputs = { nixpkgs, gitea-mirror, ... }: {
@@ -257,7 +257,7 @@ git tag -l
git ls-remote --tags origin
# Test specific tag
nix run github:RayLabsHQ/gitea-mirror/v3.8.11
nix run github:RayLabsHQ/gitea-mirror/vX.Y.Z
```
---


@@ -7,6 +7,8 @@ This folder contains engineering and operations references for the open-source G
### Core workflow
- **[DEVELOPMENT_WORKFLOW.md](./DEVELOPMENT_WORKFLOW.md)** Set up a local environment, run scripts, and understand the repo layout (app + marketing site).
- **[ENVIRONMENT_VARIABLES.md](./ENVIRONMENT_VARIABLES.md)** Complete reference for every configuration flag supported by the app and Docker images.
- **[NIX_DEPLOYMENT.md](./NIX_DEPLOYMENT.md)** User-facing deployment guide for Nix and NixOS.
- **[NIX_DISTRIBUTION.md](./NIX_DISTRIBUTION.md)** Maintainer notes for packaging, releases, and distribution strategy.
### Reliability & recovery
- **[GRACEFUL_SHUTDOWN.md](./GRACEFUL_SHUTDOWN.md)** How signal handling, shutdown coordination, and job persistence work in v3.
@@ -32,8 +34,6 @@ The first user you create locally becomes the administrator. All other configura
## Contributing & support
- 🎯 Contribution guide: [../CONTRIBUTING.md](../CONTRIBUTING.md)
- 📘 Code of conduct: [../CODE_OF_CONDUCT.md](../CODE_OF_CONDUCT.md)
- 🐞 Issues & feature requests: <https://github.com/RayLabsHQ/gitea-mirror/issues>
- 💬 Discussions: <https://github.com/RayLabsHQ/gitea-mirror/discussions>
Security disclosures should follow the process in [../SECURITY.md](../SECURITY.md).
- 🔐 Security policy & advisories: <https://github.com/RayLabsHQ/gitea-mirror/security>

(binary image file changed — new version 22 KiB; preview not shown)

flake.lock generated

@@ -1,8 +1,50 @@
{
"nodes": {
"bun2nix": {
"inputs": {
"flake-parts": "flake-parts",
"import-tree": "import-tree",
"nixpkgs": [
"nixpkgs"
],
"systems": "systems",
"treefmt-nix": "treefmt-nix"
},
"locked": {
"lastModified": 1770895533,
"narHash": "sha256-v3QaK9ugy9bN9RXDnjw0i2OifKmz2NnKM82agtqm/UY=",
"owner": "nix-community",
"repo": "bun2nix",
"rev": "c843f477b15f51151f8c6bcc886954699440a6e1",
"type": "github"
},
"original": {
"owner": "nix-community",
"repo": "bun2nix",
"type": "github"
}
},
"flake-parts": {
"inputs": {
"nixpkgs-lib": "nixpkgs-lib"
},
"locked": {
"lastModified": 1769996383,
"narHash": "sha256-AnYjnFWgS49RlqX7LrC4uA+sCCDBj0Ry/WOJ5XWAsa0=",
"owner": "hercules-ci",
"repo": "flake-parts",
"rev": "57928607ea566b5db3ad13af0e57e921e6b12381",
"type": "github"
},
"original": {
"owner": "hercules-ci",
"repo": "flake-parts",
"type": "github"
}
},
"flake-utils": {
"inputs": {
"systems": "systems"
"systems": "systems_2"
},
"locked": {
"lastModified": 1731533236,
@@ -18,6 +60,21 @@
"type": "github"
}
},
"import-tree": {
"locked": {
"lastModified": 1763762820,
"narHash": "sha256-ZvYKbFib3AEwiNMLsejb/CWs/OL/srFQ8AogkebEPF0=",
"owner": "vic",
"repo": "import-tree",
"rev": "3c23749d8013ec6daa1d7255057590e9ca726646",
"type": "github"
},
"original": {
"owner": "vic",
"repo": "import-tree",
"type": "github"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1761672384,
@@ -34,8 +91,24 @@
"type": "github"
}
},
"nixpkgs-lib": {
"locked": {
"lastModified": 1769909678,
"narHash": "sha256-cBEymOf4/o3FD5AZnzC3J9hLbiZ+QDT/KDuyHXVJOpM=",
"owner": "nix-community",
"repo": "nixpkgs.lib",
"rev": "72716169fe93074c333e8d0173151350670b824c",
"type": "github"
},
"original": {
"owner": "nix-community",
"repo": "nixpkgs.lib",
"type": "github"
}
},
"root": {
"inputs": {
"bun2nix": "bun2nix",
"flake-utils": "flake-utils",
"nixpkgs": "nixpkgs"
}
@@ -54,6 +127,42 @@
"repo": "default",
"type": "github"
}
},
"systems_2": {
"locked": {
"lastModified": 1681028828,
"narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=",
"owner": "nix-systems",
"repo": "default",
"rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e",
"type": "github"
},
"original": {
"owner": "nix-systems",
"repo": "default",
"type": "github"
}
},
"treefmt-nix": {
"inputs": {
"nixpkgs": [
"bun2nix",
"nixpkgs"
]
},
"locked": {
"lastModified": 1770228511,
"narHash": "sha256-wQ6NJSuFqAEmIg2VMnLdCnUc0b7vslUohqqGGD+Fyxk=",
"owner": "numtide",
"repo": "treefmt-nix",
"rev": "337a4fe074be1042a35086f15481d763b8ddc0e7",
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "treefmt-nix",
"type": "github"
}
}
},
"root": "root",

flake.nix

@@ -1,25 +1,43 @@
{
description = "Gitea Mirror - Self-hosted GitHub to Gitea mirroring service";
nixConfig = {
extra-substituters = [
"https://nix-community.cachix.org"
];
extra-trusted-public-keys = [
"nix-community.cachix.org-1:mB9FSh9qf2dCimDSUo8Zy7bkq5CX+/rkCWyvRCYg3Fs="
];
};
inputs = {
nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
flake-utils.url = "github:numtide/flake-utils";
bun2nix = {
url = "github:nix-community/bun2nix";
inputs.nixpkgs.follows = "nixpkgs";
};
};
outputs = { self, nixpkgs, flake-utils }:
flake-utils.lib.eachDefaultSystem (system:
outputs = { self, nixpkgs, flake-utils, bun2nix }:
let
forEachSystem = flake-utils.lib.eachDefaultSystem;
in
(forEachSystem (system:
let
pkgs = nixpkgs.legacyPackages.${system};
b2n = bun2nix.packages.${system}.default;
# Build the application
gitea-mirror = pkgs.stdenv.mkDerivation {
pname = "gitea-mirror";
version = "3.8.11";
version = "3.9.6";
src = ./.;
nativeBuildInputs = with pkgs; [
bun
nativeBuildInputs = [
pkgs.bun
b2n.hook
];
buildInputs = with pkgs; [
@@ -27,21 +45,40 @@
openssl
];
configurePhase = ''
export HOME=$TMPDIR
export BUN_INSTALL=$TMPDIR/.bun
export PATH=$BUN_INSTALL/bin:$PATH
'';
bunDeps = b2n.fetchBunDeps {
bunNix = ./bun.nix;
};
# Let the bun2nix hook handle dependency installation via the
# pre-fetched cache, but skip its default build/check/install
# phases since we have custom ones.
dontUseBunBuild = true;
dontUseBunCheck = true;
dontUseBunInstall = true;
buildPhase = ''
# Install dependencies
bun install --frozen-lockfile --no-progress
runHook preBuild
export HOME=$TMPDIR
# Build the application
# The bun2nix cache is in the read-only Nix store, but bunx/astro
# may try to write to it at build time. Copy the cache to a
# writable location.
if [ -n "$BUN_INSTALL_CACHE_DIR" ] && [ -d "$BUN_INSTALL_CACHE_DIR" ]; then
WRITABLE_CACHE="$TMPDIR/bun-cache"
cp -rL "$BUN_INSTALL_CACHE_DIR" "$WRITABLE_CACHE" 2>/dev/null || true
chmod -R u+w "$WRITABLE_CACHE" 2>/dev/null || true
export BUN_INSTALL_CACHE_DIR="$WRITABLE_CACHE"
fi
# Build the Astro application
bun run build
runHook postBuild
'';
installPhase = ''
runHook preInstall
mkdir -p $out/lib/gitea-mirror
mkdir -p $out/bin
@@ -49,11 +86,14 @@
cp -r dist $out/lib/gitea-mirror/
cp -r node_modules $out/lib/gitea-mirror/
cp -r scripts $out/lib/gitea-mirror/
cp -r src $out/lib/gitea-mirror/
cp -r drizzle $out/lib/gitea-mirror/
cp package.json $out/lib/gitea-mirror/
cp tsconfig.json $out/lib/gitea-mirror/
# Create entrypoint script that matches Docker behavior
cat > $out/bin/gitea-mirror <<'EOF'
#!/usr/bin/env bash
#!${pkgs.bash}/bin/bash
set -e
# === DEFAULT CONFIGURATION ===
@@ -75,7 +115,19 @@ export MIRROR_PULL_REQUEST_CONCURRENCY=''${MIRROR_PULL_REQUEST_CONCURRENCY:-5}
# Create data directory
mkdir -p "$DATA_DIR"
cd $out/lib/gitea-mirror
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
APP_DIR="$SCRIPT_DIR/../lib/gitea-mirror"
# The app uses process.cwd()/data for the database, but the Nix store
# is read-only. Create a writable working directory with symlinks to
# the app files and a real data directory.
WORK_DIR="$DATA_DIR/.workdir"
mkdir -p "$WORK_DIR"
for item in dist node_modules scripts src drizzle package.json tsconfig.json; do
ln -sfn "$APP_DIR/$item" "$WORK_DIR/$item"
done
ln -sfn "$DATA_DIR" "$WORK_DIR/data"
cd "$WORK_DIR"
# === AUTO-GENERATE SECRETS ===
BETTER_AUTH_SECRET_FILE="$DATA_DIR/.better_auth_secret"
@@ -112,7 +164,7 @@ if [ -z "$ENCRYPTION_SECRET" ]; then
fi
# === DATABASE INITIALIZATION ===
DB_PATH=$(echo "$DATABASE_URL" | sed 's|^file:||')
DB_PATH=$(echo "$DATABASE_URL" | ${pkgs.gnused}/bin/sed 's|^file:||')
if [ ! -f "$DB_PATH" ]; then
echo "Database not found. It will be created and initialized via Drizzle migrations on first app startup..."
touch "$DB_PATH"
@@ -123,25 +175,25 @@ fi
# === STARTUP SCRIPTS ===
# Initialize configuration from environment variables
echo "Checking for environment configuration..."
if [ -f "dist/scripts/startup-env-config.js" ]; then
if [ -f "scripts/startup-env-config.ts" ]; then
echo "Loading configuration from environment variables..."
${pkgs.bun}/bin/bun dist/scripts/startup-env-config.js && \
${pkgs.bun}/bin/bun scripts/startup-env-config.ts && \
echo " Environment configuration loaded successfully" || \
echo " Environment configuration loading completed with warnings"
fi
# Run startup recovery
echo "Running startup recovery..."
if [ -f "dist/scripts/startup-recovery.js" ]; then
${pkgs.bun}/bin/bun dist/scripts/startup-recovery.js --timeout=30000 && \
if [ -f "scripts/startup-recovery.ts" ]; then
${pkgs.bun}/bin/bun scripts/startup-recovery.ts --timeout=30000 && \
echo " Startup recovery completed successfully" || \
echo " Startup recovery completed with warnings"
fi
# Run repository status repair
echo "Running repository status repair..."
if [ -f "dist/scripts/repair-mirrored-repos.js" ]; then
${pkgs.bun}/bin/bun dist/scripts/repair-mirrored-repos.js --startup && \
if [ -f "scripts/repair-mirrored-repos.ts" ]; then
${pkgs.bun}/bin/bun scripts/repair-mirrored-repos.ts --startup && \
echo " Repository status repair completed successfully" || \
echo " Repository status repair completed with warnings"
fi
@@ -170,13 +222,16 @@ EOF
# Create database management helper
cat > $out/bin/gitea-mirror-db <<'EOF'
#!/usr/bin/env bash
#!${pkgs.bash}/bin/bash
export DATA_DIR=''${DATA_DIR:-"$HOME/.local/share/gitea-mirror"}
mkdir -p "$DATA_DIR"
cd $out/lib/gitea-mirror
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
cd "$SCRIPT_DIR/../lib/gitea-mirror"
exec ${pkgs.bun}/bin/bun scripts/manage-db.ts "$@"
EOF
chmod +x $out/bin/gitea-mirror-db
runHook postInstall
'';
meta = with pkgs.lib; {
@@ -201,6 +256,7 @@ EOF
bun
sqlite
openssl
b2n
];
shellHook = ''
@@ -211,182 +267,185 @@ EOF
echo " bun run dev # Start development server"
echo " bun run build # Build for production"
echo ""
echo "Nix packaging:"
echo " bun2nix -o bun.nix # Regenerate bun.nix after dependency changes"
echo " nix build # Build the package"
echo ""
echo "Database:"
echo " bun run manage-db init # Initialize database"
echo " bun run db:studio # Open Drizzle Studio"
'';
};
# NixOS module
nixosModules.default = { config, lib, pkgs, ... }:
with lib;
let
cfg = config.services.gitea-mirror;
in {
options.services.gitea-mirror = {
enable = mkEnableOption "Gitea Mirror service";
}
)) // {
nixosModules.default = { config, lib, pkgs, ... }:
with lib;
let
cfg = config.services.gitea-mirror;
in {
options.services.gitea-mirror = {
enable = mkEnableOption "Gitea Mirror service";
package = mkOption {
type = types.package;
default = self.packages.${system}.default;
description = "The Gitea Mirror package to use";
};
dataDir = mkOption {
type = types.path;
default = "/var/lib/gitea-mirror";
description = "Directory to store data and database";
};
user = mkOption {
type = types.str;
default = "gitea-mirror";
description = "User account under which Gitea Mirror runs";
};
group = mkOption {
type = types.str;
default = "gitea-mirror";
description = "Group under which Gitea Mirror runs";
};
host = mkOption {
type = types.str;
default = "0.0.0.0";
description = "Host to bind to";
};
port = mkOption {
type = types.port;
default = 4321;
description = "Port to listen on";
};
betterAuthUrl = mkOption {
type = types.str;
default = "http://localhost:4321";
description = "Better Auth URL (external URL of the service)";
};
betterAuthTrustedOrigins = mkOption {
type = types.str;
default = "http://localhost:4321";
description = "Comma-separated list of trusted origins for Better Auth";
};
mirrorIssueConcurrency = mkOption {
type = types.int;
default = 3;
description = "Number of concurrent issue mirror operations (set to 1 for perfect ordering)";
};
mirrorPullRequestConcurrency = mkOption {
type = types.int;
default = 5;
description = "Number of concurrent PR mirror operations (set to 1 for perfect ordering)";
};
environmentFile = mkOption {
type = types.nullOr types.path;
default = null;
description = ''
Path to file containing environment variables.
Only needed if you want to set BETTER_AUTH_SECRET or ENCRYPTION_SECRET manually.
Otherwise, secrets will be auto-generated and stored in the data directory.
Example:
BETTER_AUTH_SECRET=your-32-character-secret-here
ENCRYPTION_SECRET=your-encryption-secret-here
'';
};
openFirewall = mkOption {
type = types.bool;
default = false;
description = "Open the firewall for the specified port";
};
package = mkOption {
type = types.package;
default = self.packages.${pkgs.system}.default;
description = "The Gitea Mirror package to use";
};
config = mkIf cfg.enable {
users.users.${cfg.user} = {
isSystemUser = true;
group = cfg.group;
home = cfg.dataDir;
createHome = true;
};
dataDir = mkOption {
type = types.path;
default = "/var/lib/gitea-mirror";
description = "Directory to store data and database";
};
users.groups.${cfg.group} = {};
user = mkOption {
type = types.str;
default = "gitea-mirror";
description = "User account under which Gitea Mirror runs";
};
systemd.services.gitea-mirror = {
description = "Gitea Mirror - GitHub to Gitea mirroring service";
after = [ "network.target" ];
wantedBy = [ "multi-user.target" ];
group = mkOption {
type = types.str;
default = "gitea-mirror";
description = "Group under which Gitea Mirror runs";
};
environment = {
DATA_DIR = cfg.dataDir;
DATABASE_URL = "file:${cfg.dataDir}/gitea-mirror.db";
HOST = cfg.host;
PORT = toString cfg.port;
NODE_ENV = "production";
BETTER_AUTH_URL = cfg.betterAuthUrl;
BETTER_AUTH_TRUSTED_ORIGINS = cfg.betterAuthTrustedOrigins;
PUBLIC_BETTER_AUTH_URL = cfg.betterAuthUrl;
MIRROR_ISSUE_CONCURRENCY = toString cfg.mirrorIssueConcurrency;
MIRROR_PULL_REQUEST_CONCURRENCY = toString cfg.mirrorPullRequestConcurrency;
};
host = mkOption {
type = types.str;
default = "0.0.0.0";
description = "Host to bind to";
};
serviceConfig = {
Type = "simple";
User = cfg.user;
Group = cfg.group;
ExecStart = "${cfg.package}/bin/gitea-mirror";
Restart = "always";
RestartSec = "10s";
port = mkOption {
type = types.port;
default = 4321;
description = "Port to listen on";
};
# Security hardening
NoNewPrivileges = true;
PrivateTmp = true;
ProtectSystem = "strict";
ProtectHome = true;
ReadWritePaths = [ cfg.dataDir ];
betterAuthUrl = mkOption {
type = types.str;
default = "http://localhost:4321";
description = "Better Auth URL (external URL of the service)";
};
# Load environment file if specified (optional)
EnvironmentFile = mkIf (cfg.environmentFile != null) cfg.environmentFile;
betterAuthTrustedOrigins = mkOption {
type = types.str;
default = "http://localhost:4321";
description = "Comma-separated list of trusted origins for Better Auth";
};
# Graceful shutdown
TimeoutStopSec = "30s";
KillMode = "mixed";
KillSignal = "SIGTERM";
};
};
mirrorIssueConcurrency = mkOption {
type = types.int;
default = 3;
description = "Number of concurrent issue mirror operations (set to 1 for perfect ordering)";
};
# Health check timer (optional monitoring)
systemd.timers.gitea-mirror-healthcheck = mkIf cfg.enable {
description = "Gitea Mirror health check timer";
wantedBy = [ "timers.target" ];
timerConfig = {
OnBootSec = "5min";
OnUnitActiveSec = "5min";
};
};
mirrorPullRequestConcurrency = mkOption {
type = types.int;
default = 5;
description = "Number of concurrent PR mirror operations (set to 1 for perfect ordering)";
};
systemd.services.gitea-mirror-healthcheck = mkIf cfg.enable {
description = "Gitea Mirror health check";
after = [ "gitea-mirror.service" ];
serviceConfig = {
Type = "oneshot";
ExecStart = "${pkgs.curl}/bin/curl -f http://${cfg.host}:${toString cfg.port}/api/health || true";
User = "nobody";
};
};
environmentFile = mkOption {
type = types.nullOr types.path;
default = null;
description = ''
Path to file containing environment variables.
Only needed if you want to set BETTER_AUTH_SECRET or ENCRYPTION_SECRET manually.
Otherwise, secrets will be auto-generated and stored in the data directory.
networking.firewall = mkIf cfg.openFirewall {
allowedTCPPorts = [ cfg.port ];
};
Example:
BETTER_AUTH_SECRET=your-32-character-secret-here
ENCRYPTION_SECRET=your-encryption-secret-here
'';
};
openFirewall = mkOption {
type = types.bool;
default = false;
description = "Open the firewall for the specified port";
};
};
}
) // {
config = mkIf cfg.enable {
users.users.${cfg.user} = {
isSystemUser = true;
group = cfg.group;
home = cfg.dataDir;
createHome = true;
};
users.groups.${cfg.group} = {};
systemd.services.gitea-mirror = {
description = "Gitea Mirror - GitHub to Gitea mirroring service";
after = [ "network.target" ];
wantedBy = [ "multi-user.target" ];
environment = {
DATA_DIR = cfg.dataDir;
DATABASE_URL = "file:${cfg.dataDir}/gitea-mirror.db";
HOST = cfg.host;
PORT = toString cfg.port;
NODE_ENV = "production";
BETTER_AUTH_URL = cfg.betterAuthUrl;
BETTER_AUTH_TRUSTED_ORIGINS = cfg.betterAuthTrustedOrigins;
PUBLIC_BETTER_AUTH_URL = cfg.betterAuthUrl;
MIRROR_ISSUE_CONCURRENCY = toString cfg.mirrorIssueConcurrency;
MIRROR_PULL_REQUEST_CONCURRENCY = toString cfg.mirrorPullRequestConcurrency;
};
serviceConfig = {
Type = "simple";
User = cfg.user;
Group = cfg.group;
ExecStart = "${cfg.package}/bin/gitea-mirror";
Restart = "always";
RestartSec = "10s";
# Security hardening
NoNewPrivileges = true;
PrivateTmp = true;
ProtectSystem = "strict";
ProtectHome = true;
ReadWritePaths = [ cfg.dataDir ];
# Graceful shutdown
TimeoutStopSec = "30s";
KillMode = "mixed";
KillSignal = "SIGTERM";
} // optionalAttrs (cfg.environmentFile != null) {
EnvironmentFile = cfg.environmentFile;
};
};
# Health check timer (optional monitoring)
systemd.timers.gitea-mirror-healthcheck = {
description = "Gitea Mirror health check timer";
wantedBy = [ "timers.target" ];
timerConfig = {
OnBootSec = "5min";
OnUnitActiveSec = "5min";
};
};
systemd.services.gitea-mirror-healthcheck = {
description = "Gitea Mirror health check";
after = [ "gitea-mirror.service" ];
serviceConfig = {
Type = "oneshot";
ExecStart = "${pkgs.bash}/bin/bash -c '${pkgs.curl}/bin/curl -f http://127.0.0.1:${toString cfg.port}/api/health || true'";
User = "nobody";
};
};
networking.firewall = mkIf cfg.openFirewall {
allowedTCPPorts = [ cfg.port ];
};
};
};
# Overlay for adding to nixpkgs
overlays.default = final: prev: {
gitea-mirror = self.packages.${final.system}.default;


@@ -1,7 +1,7 @@
{
"name": "gitea-mirror",
"type": "module",
"version": "3.9.3",
"version": "3.12.1",
"engines": {
"bun": ">=1.2.9"
},
@@ -36,6 +36,10 @@
"test": "bun test",
"test:watch": "bun test --watch",
"test:coverage": "bun test --coverage",
"test:e2e": "bash tests/e2e/run-e2e.sh",
"test:e2e:ci": "bash tests/e2e/run-e2e.sh --ci",
"test:e2e:keep": "bash tests/e2e/run-e2e.sh --keep",
"test:e2e:cleanup": "bash tests/e2e/cleanup.sh",
"astro": "bunx --bun astro"
},
"overrides": {
@@ -73,10 +77,10 @@
"@types/canvas-confetti": "^1.9.0",
"@types/react": "^19.2.14",
"@types/react-dom": "^19.2.3",
"astro": "^5.17.3",
"astro": "^5.18.0",
"bcryptjs": "^3.0.3",
"buffer": "^6.0.3",
"better-auth": "1.4.19",
"buffer": "^6.0.3",
"canvas-confetti": "^1.9.4",
"class-variance-authority": "^0.7.1",
"clsx": "^2.1.1",
@@ -86,6 +90,7 @@
"fuse.js": "^7.1.0",
"jsonwebtoken": "^9.0.3",
"lucide-react": "^0.575.0",
"nanoid": "^3.3.11",
"next-themes": "^0.4.6",
"react": "^19.2.4",
"react-dom": "^19.2.4",
@@ -100,11 +105,13 @@
"zod": "^4.3.6"
},
"devDependencies": {
"@playwright/test": "^1.58.2",
"@testing-library/jest-dom": "^6.9.1",
"@testing-library/react": "^16.3.2",
"@types/bcryptjs": "^3.0.0",
"@types/bun": "^1.3.9",
"@types/jsonwebtoken": "^9.0.10",
"@types/node": "^25.3.2",
"@types/uuid": "^11.0.0",
"@vitejs/plugin-react": "^5.1.4",
"drizzle-kit": "^0.31.9",


@@ -50,6 +50,10 @@ export function ConfigTabs() {
starredReposOrg: 'starred',
starredReposMode: 'dedicated-org',
preserveOrgStructure: false,
backupStrategy: "on-force-push",
backupRetentionCount: 20,
backupDirectory: 'data/repo-backups',
blockSyncOnBackupFailure: true,
},
scheduleConfig: {
enabled: false, // Don't set defaults here - will be loaded from API
@@ -79,6 +83,7 @@ export function ConfigTabs() {
advancedOptions: {
skipForks: false,
starredCodeOnly: false,
autoMirrorStarred: false,
},
});
const { user } = useAuth();
@@ -656,9 +661,20 @@ export function ConfigTabs() {
: update,
}))
}
giteaConfig={config.giteaConfig}
setGiteaConfig={update =>
setConfig(prev => ({
...prev,
giteaConfig:
typeof update === 'function'
? update(prev.giteaConfig)
: update,
}))
}
onAutoSave={autoSaveGitHubConfig}
onMirrorOptionsAutoSave={autoSaveMirrorOptions}
onAdvancedOptionsAutoSave={autoSaveAdvancedOptions}
onGiteaAutoSave={autoSaveGiteaConfig}
isAutoSaving={isAutoSavingGitHub}
/>
<GiteaConfigForm


@@ -7,10 +7,11 @@ import {
CardTitle,
} from "@/components/ui/card";
import { githubApi } from "@/lib/api";
import type { GitHubConfig, MirrorOptions, AdvancedOptions } from "@/types/config";
import type { GitHubConfig, MirrorOptions, AdvancedOptions, GiteaConfig, BackupStrategy } from "@/types/config";
import { Input } from "../ui/input";
import { toast } from "sonner";
import { Info } from "lucide-react";
import { Info, ShieldAlert } from "lucide-react";
import { Badge } from "@/components/ui/badge";
import { GitHubMirrorSettings } from "./GitHubMirrorSettings";
import { Separator } from "../ui/separator";
import {
@@ -26,23 +27,29 @@ interface GitHubConfigFormProps {
setMirrorOptions: React.Dispatch<React.SetStateAction<MirrorOptions>>;
advancedOptions: AdvancedOptions;
setAdvancedOptions: React.Dispatch<React.SetStateAction<AdvancedOptions>>;
giteaConfig?: GiteaConfig;
setGiteaConfig?: React.Dispatch<React.SetStateAction<GiteaConfig>>;
onAutoSave?: (githubConfig: GitHubConfig) => Promise<void>;
onMirrorOptionsAutoSave?: (mirrorOptions: MirrorOptions) => Promise<void>;
onAdvancedOptionsAutoSave?: (advancedOptions: AdvancedOptions) => Promise<void>;
onGiteaAutoSave?: (giteaConfig: GiteaConfig) => Promise<void>;
isAutoSaving?: boolean;
}
export function GitHubConfigForm({
config,
setConfig,
config,
setConfig,
mirrorOptions,
setMirrorOptions,
advancedOptions,
setAdvancedOptions,
onAutoSave,
giteaConfig,
setGiteaConfig,
onAutoSave,
onMirrorOptionsAutoSave,
onAdvancedOptionsAutoSave,
isAutoSaving
onGiteaAutoSave,
isAutoSaving
}: GitHubConfigFormProps) {
const [isLoading, setIsLoading] = useState(false);
@@ -202,7 +209,139 @@ export function GitHubConfigForm({
if (onAdvancedOptionsAutoSave) onAdvancedOptionsAutoSave(newOptions);
}}
/>
{giteaConfig && setGiteaConfig && (
<>
<Separator />
<div className="space-y-4">
<h3 className="text-sm font-medium flex items-center gap-2">
<ShieldAlert className="h-4 w-4 text-primary" />
Destructive Update Protection
<Badge variant="secondary" className="ml-2 text-[10px] px-1.5 py-0">BETA</Badge>
</h3>
<p className="text-xs text-muted-foreground">
Choose how to handle force-pushes or rewritten upstream history on GitHub.
</p>
<div className="grid grid-cols-2 md:grid-cols-4 gap-2">
{([
{
value: "disabled",
label: "Disabled",
desc: "No detection or backups",
},
{
value: "always",
label: "Always Backup",
desc: "Snapshot before every sync",
},
{
value: "on-force-push",
label: "Smart",
desc: "Backup only on force-push",
},
{
value: "block-on-force-push",
label: "Block & Approve",
desc: "Require approval on force-push",
},
] as const).map((opt) => {
const isSelected = (giteaConfig.backupStrategy ?? "on-force-push") === opt.value;
return (
<button
key={opt.value}
type="button"
onClick={() => {
const newConfig = { ...giteaConfig, backupStrategy: opt.value as BackupStrategy };
setGiteaConfig(newConfig);
if (onGiteaAutoSave) onGiteaAutoSave(newConfig);
}}
className={`flex flex-col items-start gap-1 rounded-lg border p-3 text-left text-sm transition-colors ${
isSelected
? "border-primary bg-primary/5 ring-1 ring-primary"
: "border-input hover:bg-accent hover:text-accent-foreground"
}`}
>
<span className="font-medium">{opt.label}</span>
<span className="text-xs text-muted-foreground">{opt.desc}</span>
</button>
);
})}
</div>
{(giteaConfig.backupStrategy ?? "on-force-push") !== "disabled" && (
<>
<div className="grid grid-cols-1 md:grid-cols-2 gap-4">
<div>
<label htmlFor="backup-retention" className="block text-sm font-medium mb-1.5">
Snapshot retention count
</label>
<input
id="backup-retention"
name="backupRetentionCount"
type="number"
min={1}
value={giteaConfig.backupRetentionCount ?? 20}
onChange={(e) => {
const newConfig = {
...giteaConfig,
backupRetentionCount: Math.max(1, Number.parseInt(e.target.value, 10) || 20),
};
setGiteaConfig(newConfig);
if (onGiteaAutoSave) onGiteaAutoSave(newConfig);
}}
className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring"
/>
</div>
<div>
<label htmlFor="backup-directory" className="block text-sm font-medium mb-1.5">
Snapshot directory
</label>
<input
id="backup-directory"
name="backupDirectory"
type="text"
value={giteaConfig.backupDirectory || "data/repo-backups"}
onChange={(e) => {
const newConfig = { ...giteaConfig, backupDirectory: e.target.value };
setGiteaConfig(newConfig);
if (onGiteaAutoSave) onGiteaAutoSave(newConfig);
}}
className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring"
placeholder="data/repo-backups"
/>
</div>
</div>
{((giteaConfig.backupStrategy ?? "on-force-push") === "always" ||
(giteaConfig.backupStrategy ?? "on-force-push") === "on-force-push") && (
<label className="flex items-start gap-3 text-sm">
<input
name="blockSyncOnBackupFailure"
type="checkbox"
checked={Boolean(giteaConfig.blockSyncOnBackupFailure)}
onChange={(e) => {
const newConfig = { ...giteaConfig, blockSyncOnBackupFailure: e.target.checked };
setGiteaConfig(newConfig);
if (onGiteaAutoSave) onGiteaAutoSave(newConfig);
}}
className="mt-0.5 rounded border-input"
/>
<span>
Block sync when snapshot fails
<p className="text-xs text-muted-foreground">
Recommended for backup-first behavior. If disabled, sync continues even when snapshot creation fails.
</p>
</span>
</label>
)}
</>
)}
</div>
</>
)}
{/* Mobile: Show button at bottom */}
<Button
type="button"


@@ -287,6 +287,31 @@ export function GitHubMirrorSettings({
</div>
</div>
{/* Auto-mirror starred repos toggle */}
{githubConfig.mirrorStarred && (
<div className="mt-4">
<div className="flex items-start space-x-3">
<Checkbox
id="auto-mirror-starred"
checked={advancedOptions.autoMirrorStarred ?? false}
onCheckedChange={(checked) => handleAdvancedChange('autoMirrorStarred', !!checked)}
/>
<div className="space-y-0.5 flex-1">
<Label
htmlFor="auto-mirror-starred"
className="text-sm font-normal cursor-pointer flex items-center gap-2"
>
<Star className="h-3.5 w-3.5" />
Auto-mirror new starred repositories
</Label>
<p className="text-xs text-muted-foreground">
When disabled, starred repos are imported for browsing but not automatically mirrored. You can still mirror individual repos manually.
</p>
</div>
</div>
</div>
)}
{/* Duplicate name handling for starred repos */}
{githubConfig.mirrorStarred && (
<div className="mt-4 space-y-2">


@@ -100,9 +100,14 @@ export function GiteaConfigForm({ config, setConfig, onAutoSave, isAutoSaving, g
);
}
const normalizedValue =
type === "checkbox"
? checked
: value;
const newConfig = {
...config,
[name]: type === "checkbox" ? checked : value,
[name]: normalizedValue,
};
setConfig(newConfig);
@@ -286,7 +291,7 @@ export function GiteaConfigForm({ config, setConfig, onAutoSave, isAutoSaving, g
if (onAutoSave) onAutoSave(newConfig);
}}
/>
{/* Mobile: Show button at bottom */}
<Button
type="button"


@@ -18,10 +18,12 @@ interface AddRepositoryDialogProps {
repo,
owner,
force,
destinationOrg,
}: {
repo: string;
owner: string;
force?: boolean;
destinationOrg?: string;
}) => Promise<void>;
}
@@ -32,6 +34,7 @@ export default function AddRepositoryDialog({
}: AddRepositoryDialogProps) {
const [repo, setRepo] = useState<string>("");
const [owner, setOwner] = useState<string>("");
const [destinationOrg, setDestinationOrg] = useState<string>("");
const [isLoading, setIsLoading] = useState<boolean>(false);
const [error, setError] = useState<string>("");
@@ -40,6 +43,7 @@ export default function AddRepositoryDialog({
setError("");
setRepo("");
setOwner("");
setDestinationOrg("");
}
}, [isDialogOpen]);
@@ -54,11 +58,16 @@ export default function AddRepositoryDialog({
try {
setIsLoading(true);
await onAddRepository({ repo, owner });
await onAddRepository({
repo,
owner,
destinationOrg: destinationOrg.trim() || undefined,
});
setError("");
setRepo("");
setOwner("");
setDestinationOrg("");
setIsDialogOpen(false);
} catch (err: any) {
setError(err?.message || "Failed to add repository.");
@@ -124,6 +133,27 @@ export default function AddRepositoryDialog({
/>
</div>
<div>
<label
htmlFor="destinationOrg"
className="block text-sm font-medium mb-1.5"
>
Target Organization{" "}
<span className="text-muted-foreground font-normal">
(optional)
</span>
</label>
<input
id="destinationOrg"
type="text"
value={destinationOrg}
onChange={(e) => setDestinationOrg(e.target.value)}
className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring"
placeholder="Gitea org or user (uses default strategy if empty)"
autoComplete="off"
/>
</div>
{error && <p className="text-sm text-red-500 mt-1">{error}</p>}
</div>


@@ -56,7 +56,7 @@ export default function Repository() {
const [isInitialLoading, setIsInitialLoading] = useState(true);
const { user } = useAuth();
const { registerRefreshCallback, isLiveEnabled } = useLiveRefresh();
const { isGitHubConfigured, isFullyConfigured } = useConfigStatus();
const { isGitHubConfigured, isFullyConfigured, autoMirrorStarred, githubOwner } = useConfigStatus();
const { navigationKey } = useNavigation();
const { filter, setFilter } = useFilterParams({
searchTerm: "",
@@ -233,10 +233,12 @@ export default function Repository() {
// Filter out repositories that are already mirroring, mirrored, or ignored
const eligibleRepos = repositories.filter(
(repo) =>
repo.status !== "mirroring" &&
repo.status !== "mirrored" &&
repo.status !== "mirroring" &&
repo.status !== "mirrored" &&
repo.status !== "ignored" && // Skip ignored repositories
repo.id
repo.id &&
// Skip starred repos from other owners when autoMirrorStarred is disabled
!(repo.isStarred && !autoMirrorStarred && repo.owner !== githubOwner)
);
if (eligibleRepos.length === 0) {
@@ -292,7 +294,7 @@ export default function Repository() {
const selectedRepos = repositories.filter(repo => repo.id && selectedRepoIds.has(repo.id));
const eligibleRepos = selectedRepos.filter(
repo => repo.status === "imported" || repo.status === "failed"
repo => repo.status === "imported" || repo.status === "failed" || repo.status === "pending-approval"
);
if (eligibleRepos.length === 0) {
@@ -301,7 +303,7 @@ export default function Repository() {
}
const repoIds = eligibleRepos.map(repo => repo.id as string);
setLoadingRepoIds(prev => {
const newSet = new Set(prev);
repoIds.forEach(id => newSet.add(id));
@@ -694,14 +696,90 @@ export default function Repository() {
}
};
const handleApproveSyncAction = async ({ repoId }: { repoId: string }) => {
try {
if (!user || !user.id) return;
setLoadingRepoIds((prev) => new Set(prev).add(repoId));
const response = await apiRequest<{
success: boolean;
message?: string;
error?: string;
repositories: Repository[];
}>("/job/approve-sync", {
method: "POST",
data: { repositoryIds: [repoId], action: "approve" },
});
if (response.success) {
toast.success("Sync approved — backup + sync started");
setRepositories((prevRepos) =>
prevRepos.map((repo) => {
const updated = response.repositories.find((r) => r.id === repo.id);
return updated ? updated : repo;
}),
);
} else {
showErrorToast(response.error || "Error approving sync", toast);
}
} catch (error) {
showErrorToast(error, toast);
} finally {
setLoadingRepoIds((prev) => {
const newSet = new Set(prev);
newSet.delete(repoId);
return newSet;
});
}
};
const handleDismissSyncAction = async ({ repoId }: { repoId: string }) => {
try {
if (!user || !user.id) return;
setLoadingRepoIds((prev) => new Set(prev).add(repoId));
const response = await apiRequest<{
success: boolean;
message?: string;
error?: string;
repositories: Repository[];
}>("/job/approve-sync", {
method: "POST",
data: { repositoryIds: [repoId], action: "dismiss" },
});
if (response.success) {
toast.success("Force-push alert dismissed");
setRepositories((prevRepos) =>
prevRepos.map((repo) => {
const updated = response.repositories.find((r) => r.id === repo.id);
return updated ? updated : repo;
}),
);
} else {
showErrorToast(response.error || "Error dismissing alert", toast);
}
} catch (error) {
showErrorToast(error, toast);
} finally {
setLoadingRepoIds((prev) => {
const newSet = new Set(prev);
newSet.delete(repoId);
return newSet;
});
}
};
const handleAddRepository = async ({
repo,
owner,
force = false,
destinationOrg,
}: {
repo: string;
owner: string;
force?: boolean;
destinationOrg?: string;
}) => {
if (!user || !user.id) {
return;
@@ -736,6 +814,7 @@ export default function Repository() {
repo: trimmedRepo,
owner: trimmedOwner,
force,
...(destinationOrg ? { destinationOrg } : {}),
};
const response = await apiRequest<AddRepositoriesApiResponse>(
@@ -860,7 +939,7 @@ export default function Repository() {
const actions = [];
// Check if any selected repos can be mirrored
if (selectedRepos.some(repo => repo.status === "imported" || repo.status === "failed" || repo.status === "pending-approval")) {
actions.push('mirror');
}
@@ -898,7 +977,7 @@ export default function Repository() {
const selectedRepos = repositories.filter(repo => repo.id && selectedRepoIds.has(repo.id));
return {
mirror: selectedRepos.filter(repo => repo.status === "imported" || repo.status === "failed" || repo.status === "pending-approval").length,
sync: selectedRepos.filter(repo => repo.status === "mirrored" || repo.status === "synced").length,
rerunMetadata: selectedRepos.filter(repo => ["mirrored", "synced", "archived"].includes(repo.status)).length,
retry: selectedRepos.filter(repo => repo.status === "failed").length,
@@ -1406,6 +1485,8 @@ export default function Repository() {
await fetchRepositories(false);
}}
onDelete={handleRequestDeleteRepository}
onApproveSync={handleApproveSyncAction}
onDismissSync={handleDismissSyncAction}
/>
)}


@@ -1,7 +1,7 @@
import { useMemo, useRef } from "react";
import Fuse from "fuse.js";
import { useVirtualizer } from "@tanstack/react-virtual";
import { FlipHorizontal, GitFork, RefreshCw, RotateCcw, Star, Lock, Ban, Check, ChevronDown, Trash2, X } from "lucide-react";
import { SiGithub, SiGitea } from "react-icons/si";
import type { Repository } from "@/lib/db/schema";
import { Button } from "@/components/ui/button";
@@ -42,6 +42,8 @@ interface RepositoryTableProps {
onSelectionChange: (selectedIds: Set<string>) => void;
onRefresh?: () => Promise<void>;
onDelete?: (repoId: string) => void;
onApproveSync?: ({ repoId }: { repoId: string }) => Promise<void>;
onDismissSync?: ({ repoId }: { repoId: string }) => Promise<void>;
}
export default function RepositoryTable({
@@ -59,6 +61,8 @@ export default function RepositoryTable({
onSelectionChange,
onRefresh,
onDelete,
onApproveSync,
onDismissSync,
}: RepositoryTableProps) {
const tableParentRef = useRef<HTMLDivElement>(null);
const { giteaConfig } = useGiteaConfig();
@@ -239,6 +243,7 @@ export default function RepositoryTable({
repo.status === 'failed' ? 'bg-red-500/10 text-red-600 hover:bg-red-500/20 dark:text-red-400' :
repo.status === 'ignored' ? 'bg-gray-500/10 text-gray-600 hover:bg-gray-500/20 dark:text-gray-400' :
repo.status === 'skipped' ? 'bg-orange-500/10 text-orange-600 hover:bg-orange-500/20 dark:text-orange-400' :
repo.status === 'pending-approval' ? 'bg-amber-500/10 text-amber-600 hover:bg-amber-500/20 dark:text-amber-400' :
'bg-muted hover:bg-muted/80'}`}
variant="secondary"
>
@@ -316,7 +321,40 @@ export default function RepositoryTable({
)}
</Button>
)}
{repo.status === "pending-approval" && (
<div className="flex gap-2 w-full">
<Button
size="default"
variant="default"
onClick={() => repo.id && onApproveSync?.({ repoId: repo.id })}
disabled={isLoading}
className="flex-1 h-10"
>
{isLoading ? (
<>
<Check className="h-4 w-4 mr-2 animate-spin" />
Approving...
</>
) : (
<>
<Check className="h-4 w-4 mr-2" />
Approve Sync
</>
)}
</Button>
<Button
size="default"
variant="outline"
onClick={() => repo.id && onDismissSync?.({ repoId: repo.id })}
disabled={isLoading}
className="flex-1 h-10"
>
<X className="h-4 w-4 mr-2" />
Dismiss
</Button>
</div>
)}
{/* Ignore/Include button */}
{repo.status === "ignored" ? (
<Button
@@ -663,6 +701,7 @@ export default function RepositoryTable({
repo.status === 'failed' ? 'bg-red-500/10 text-red-600 hover:bg-red-500/20 dark:text-red-400' :
repo.status === 'ignored' ? 'bg-gray-500/10 text-gray-600 hover:bg-gray-500/20 dark:text-gray-400' :
repo.status === 'skipped' ? 'bg-orange-500/10 text-orange-600 hover:bg-orange-500/20 dark:text-orange-400' :
repo.status === 'pending-approval' ? 'bg-amber-500/10 text-amber-600 hover:bg-amber-500/20 dark:text-amber-400' :
'bg-muted hover:bg-muted/80'}`}
variant="secondary"
>
@@ -680,6 +719,8 @@ export default function RepositoryTable({
onRetry={() => onRetry({ repoId: repo.id ?? "" })}
onSkip={(skip) => onSkip({ repoId: repo.id ?? "", skip })}
onDelete={onDelete && repo.id ? () => onDelete(repo.id as string) : undefined}
onApproveSync={onApproveSync ? () => onApproveSync({ repoId: repo.id ?? "" }) : undefined}
onDismissSync={onDismissSync ? () => onDismissSync({ repoId: repo.id ?? "" }) : undefined}
/>
</div>
{/* Links */}
@@ -791,6 +832,8 @@ function RepoActionButton({
onRetry,
onSkip,
onDelete,
onApproveSync,
onDismissSync,
}: {
repo: { id: string; status: string };
isLoading: boolean;
@@ -799,7 +842,36 @@ function RepoActionButton({
onRetry: () => void;
onSkip: (skip: boolean) => void;
onDelete?: () => void;
onApproveSync?: () => void;
onDismissSync?: () => void;
}) {
// For pending-approval repos, show approve/dismiss actions
if (repo.status === "pending-approval") {
return (
<div className="flex gap-1">
<Button
variant="default"
size="sm"
disabled={isLoading}
onClick={onApproveSync}
className="min-w-[70px]"
>
<Check className="h-4 w-4 mr-1" />
Approve
</Button>
<Button
variant="outline"
size="sm"
disabled={isLoading}
onClick={onDismissSync}
>
<X className="h-4 w-4 mr-1" />
Dismiss
</Button>
</div>
);
}
// For ignored repos, show an "Include" action
if (repo.status === "ignored") {
return (


@@ -9,6 +9,8 @@ interface ConfigStatus {
isFullyConfigured: boolean;
isLoading: boolean;
error: string | null;
autoMirrorStarred: boolean;
githubOwner: string;
}
// Cache to prevent duplicate API calls across components
@@ -33,6 +35,8 @@ export function useConfigStatus(): ConfigStatus {
isFullyConfigured: false,
isLoading: true,
error: null,
autoMirrorStarred: false,
githubOwner: '',
});
// Track if this hook has already checked config to prevent multiple calls
@@ -46,6 +50,8 @@ export function useConfigStatus(): ConfigStatus {
isFullyConfigured: false,
isLoading: false,
error: 'No user found',
autoMirrorStarred: false,
githubOwner: '',
});
return;
}
@@ -78,6 +84,8 @@ export function useConfigStatus(): ConfigStatus {
isFullyConfigured,
isLoading: false,
error: null,
autoMirrorStarred: configResponse?.advancedOptions?.autoMirrorStarred ?? false,
githubOwner: configResponse?.githubConfig?.username ?? '',
});
return;
}
@@ -119,6 +127,8 @@ export function useConfigStatus(): ConfigStatus {
isFullyConfigured,
isLoading: false,
error: null,
autoMirrorStarred: configResponse?.advancedOptions?.autoMirrorStarred ?? false,
githubOwner: configResponse?.githubConfig?.username ?? '',
});
hasCheckedRef.current = true;
@@ -129,6 +139,8 @@ export function useConfigStatus(): ConfigStatus {
isFullyConfigured: false,
isLoading: false,
error: error instanceof Error ? error.message : 'Failed to check configuration',
autoMirrorStarred: false,
githubOwner: '',
});
hasCheckedRef.current = true;
}
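The hook above dedupes config fetches with a module-level cache plus a per-instance `hasCheckedRef`. The same idea can be sketched standalone; the cache shape, TTL, and `fetchConfigOnce` name below are illustrative assumptions, not code from the repo:

```typescript
// Module-level cache shared by all hook instances (assumed shape, not the repo's).
let cached: { value: unknown; fetchedAt: number } | null = null;
const TTL_MS = 30_000; // hypothetical freshness window
let inFlight: Promise<unknown> | null = null;

async function fetchConfigOnce(fetcher: () => Promise<unknown>): Promise<unknown> {
  // Serve a fresh cached value without touching the network.
  if (cached && Date.now() - cached.fetchedAt < TTL_MS) return cached.value;
  // Dedupe concurrent callers onto a single in-flight request.
  if (!inFlight) {
    inFlight = fetcher().then((value) => {
      cached = { value, fetchedAt: Date.now() };
      inFlight = null;
      return value;
    });
  }
  return inFlight;
}
```

Two components mounting at once would share one request instead of issuing duplicate `/api/config` calls.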


@@ -0,0 +1,66 @@
import { describe, expect, mock, test } from "bun:test";
const getSessionMock = mock(async () => null);
mock.module("@/lib/auth", () => ({
auth: {
api: {
getSession: getSessionMock,
},
},
}));
import { requireAuthenticatedUserId } from "./auth-guards";
describe("requireAuthenticatedUserId", () => {
test("returns user id from locals session without calling auth api", async () => {
getSessionMock.mockImplementation(async () => {
throw new Error("should not be called");
});
const result = await requireAuthenticatedUserId({
request: new Request("http://localhost/test"),
locals: {
session: { userId: "local-user-id" },
} as any,
});
expect("userId" in result).toBe(true);
if ("userId" in result) {
expect(result.userId).toBe("local-user-id");
}
});
test("returns user id from auth session when locals are empty", async () => {
getSessionMock.mockImplementation(async () => ({
user: { id: "session-user-id" },
session: { id: "session-id" },
}));
const result = await requireAuthenticatedUserId({
request: new Request("http://localhost/test"),
locals: {} as any,
});
expect("userId" in result).toBe(true);
if ("userId" in result) {
expect(result.userId).toBe("session-user-id");
}
});
test("returns unauthorized response when auth lookup throws", async () => {
getSessionMock.mockImplementation(async () => {
throw new Error("session provider unavailable");
});
const result = await requireAuthenticatedUserId({
request: new Request("http://localhost/test"),
locals: {} as any,
});
expect("response" in result).toBe(true);
if ("response" in result) {
expect(result.response.status).toBe(401);
}
});
});

src/lib/auth-guards.ts

@@ -0,0 +1,45 @@
import type { APIContext } from "astro";
import { auth } from "@/lib/auth";
function unauthorizedResponse() {
return new Response(
JSON.stringify({
success: false,
error: "Unauthorized",
}),
{
status: 401,
headers: { "Content-Type": "application/json" },
}
);
}
/**
* Ensures request is authenticated and returns the authenticated user ID.
* Never trust client-provided userId for authorization decisions.
*/
export async function requireAuthenticatedUserId(
context: Pick<APIContext, "request" | "locals">
): Promise<{ userId: string } | { response: Response }> {
const localUserId =
context.locals?.session?.userId || context.locals?.user?.id;
if (localUserId) {
return { userId: localUserId };
}
let session: Awaited<ReturnType<typeof auth.api.getSession>> | null = null;
try {
session = await auth.api.getSession({
headers: context.request.headers,
});
} catch {
return { response: unauthorizedResponse() };
}
if (!session?.user?.id) {
return { response: unauthorizedResponse() };
}
return { userId: session.user.id };
}
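The guard returns a discriminated union, so callers must narrow with an `"userId" in result` check (as the tests above do) before touching `userId`. A minimal standalone sketch of that call pattern; `resolveUser` and `handle` are illustrative stand-ins, not functions from the repo:

```typescript
// Mirrors the guard's return type: either an authenticated id or a ready-made 401.
type AuthResult = { userId: string } | { response: Response };

// Stand-in for the real session lookup (hypothetical, for illustration only).
function resolveUser(sessionUserId?: string): AuthResult {
  if (sessionUserId) return { userId: sessionUserId };
  return {
    response: new Response(
      JSON.stringify({ success: false, error: "Unauthorized" }),
      { status: 401, headers: { "Content-Type": "application/json" } },
    ),
  };
}

// Caller pattern: narrow with an `in` check and early-return the 401 response.
function handle(sessionUserId?: string): string | Response {
  const auth = resolveUser(sessionUserId);
  if ("response" in auth) return auth.response;
  return `authorized as ${auth.userId}`; // auth.userId is safely narrowed here
}
```

The union forces every route to handle the unauthorized branch explicitly instead of trusting a client-provided userId.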


@@ -29,10 +29,18 @@ export const githubConfigSchema = z.object({
mirrorStrategy: z.enum(["preserve", "single-org", "flat-user", "mixed"]).default("preserve"),
defaultOrg: z.string().optional(),
starredCodeOnly: z.boolean().default(false),
autoMirrorStarred: z.boolean().default(false),
skipStarredIssues: z.boolean().optional(), // Deprecated: kept for backward compatibility, use starredCodeOnly instead
starredDuplicateStrategy: z.enum(["suffix", "prefix", "owner-org"]).default("suffix").optional(),
});
export const backupStrategyEnum = z.enum([
"disabled",
"always",
"on-force-push",
"block-on-force-push",
]);
export const giteaConfigSchema = z.object({
url: z.url(),
externalUrl: z.url().optional(),
@@ -65,6 +73,11 @@ export const giteaConfigSchema = z.object({
mirrorPullRequests: z.boolean().default(false),
mirrorLabels: z.boolean().default(false),
mirrorMilestones: z.boolean().default(false),
backupStrategy: backupStrategyEnum.default("on-force-push"),
backupBeforeSync: z.boolean().default(true), // Deprecated: kept for backward compat, use backupStrategy
backupRetentionCount: z.number().int().min(1).default(20),
backupDirectory: z.string().optional(),
blockSyncOnBackupFailure: z.boolean().default(true),
});
export const scheduleConfigSchema = z.object({
@@ -161,6 +174,7 @@ export const repositorySchema = z.object({
"syncing",
"synced",
"archived",
"pending-approval", // Blocked by force-push detection, needs manual approval
])
.default("imported"),
lastMirrored: z.coerce.date().optional().nullable(),
@@ -192,6 +206,7 @@ export const mirrorJobSchema = z.object({
"syncing",
"synced",
"archived",
"pending-approval",
])
.default("imported"),
message: z.string(),
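Reading the sync path together with `backupStrategyEnum`, the four strategies gate detection, backup, and blocking roughly as follows. The helper names match the imports in `gitea.ts`, but the bodies here are inferred from how the sync code calls them, not copied from `repo-backup.ts`:

```typescript
type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";

// Detection is only worth running when the outcome depends on it.
function strategyNeedsDetection(s: BackupStrategy): boolean {
  return s === "on-force-push" || s === "block-on-force-push";
}

// "always" backs up unconditionally; the force-push strategies only when one was seen.
function shouldBackupForStrategy(s: BackupStrategy, forcePushDetected: boolean): boolean {
  if (s === "disabled") return false;
  if (s === "always") return true;
  return forcePushDetected;
}

// Only "block-on-force-push" halts the sync and waits for manual approval.
function shouldBlockSyncForStrategy(s: BackupStrategy, forcePushDetected: boolean): boolean {
  return s === "block-on-force-push" && forcePushDetected;
}
```

This is why the sync path checks blocking before backup: a blocked repo goes to `pending-approval` and never reaches the snapshot step until approved.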


@@ -22,6 +22,7 @@ interface EnvConfig {
preserveOrgStructure?: boolean;
onlyMirrorOrgs?: boolean;
starredCodeOnly?: boolean;
autoMirrorStarred?: boolean;
starredReposOrg?: string;
starredReposMode?: 'dedicated-org' | 'preserve-owner';
mirrorStrategy?: 'preserve' | 'single-org' | 'flat-user' | 'mixed';
@@ -113,6 +114,7 @@ function parseEnvConfig(): EnvConfig {
preserveOrgStructure: process.env.PRESERVE_ORG_STRUCTURE === 'true',
onlyMirrorOrgs: process.env.ONLY_MIRROR_ORGS === 'true',
starredCodeOnly: process.env.SKIP_STARRED_ISSUES === 'true',
autoMirrorStarred: process.env.AUTO_MIRROR_STARRED === 'true',
starredReposOrg: process.env.STARRED_REPOS_ORG,
starredReposMode: process.env.STARRED_REPOS_MODE as 'dedicated-org' | 'preserve-owner',
mirrorStrategy: process.env.MIRROR_STRATEGY as 'preserve' | 'single-org' | 'flat-user' | 'mixed',
@@ -264,6 +266,7 @@ export async function initializeConfigFromEnv(): Promise<void> {
mirrorStrategy,
defaultOrg: envConfig.gitea.organization || existingConfig?.[0]?.githubConfig?.defaultOrg || 'github-mirrors',
starredCodeOnly: envConfig.github.starredCodeOnly ?? existingConfig?.[0]?.githubConfig?.starredCodeOnly ?? false,
autoMirrorStarred: envConfig.github.autoMirrorStarred ?? existingConfig?.[0]?.githubConfig?.autoMirrorStarred ?? false,
};
// Build Gitea config


@@ -13,6 +13,11 @@ const mockMirrorGitRepoPullRequestsToGitea = mock(() => Promise.resolve());
const mockMirrorGitRepoLabelsToGitea = mock(() => Promise.resolve());
const mockMirrorGitRepoMilestonesToGitea = mock(() => Promise.resolve());
const mockGetGiteaRepoOwnerAsync = mock(() => Promise.resolve("starred"));
const mockCreatePreSyncBundleBackup = mock(() =>
Promise.resolve({ bundlePath: "/tmp/mock.bundle" })
);
let mockShouldCreatePreSyncBackup = false;
let mockShouldBlockSyncOnBackupFailure = true;
// Mock the database module
const mockDb = {
@@ -28,8 +33,14 @@ const mockDb = {
mock.module("@/lib/db", () => ({
db: mockDb,
users: {},
configs: {},
organizations: {},
mirrorJobs: {},
repositories: {},
events: {},
accounts: {},
sessions: {},
}));
// Mock config encryption
@@ -235,6 +246,12 @@ mock.module("@/lib/http-client", () => ({
HttpError: MockHttpError
}));
mock.module("@/lib/repo-backup", () => ({
createPreSyncBundleBackup: mockCreatePreSyncBundleBackup,
shouldCreatePreSyncBackup: () => mockShouldCreatePreSyncBackup,
shouldBlockSyncOnBackupFailure: () => mockShouldBlockSyncOnBackupFailure,
}));
// Now import the modules we're testing
import {
getGiteaRepoInfo,
@@ -264,6 +281,15 @@ describe("Enhanced Gitea Operations", () => {
mockMirrorGitRepoMilestonesToGitea.mockClear();
mockGetGiteaRepoOwnerAsync.mockClear();
mockGetGiteaRepoOwnerAsync.mockImplementation(() => Promise.resolve("starred"));
mockHttpGet.mockClear();
mockHttpPost.mockClear();
mockHttpDelete.mockClear();
mockCreatePreSyncBundleBackup.mockClear();
mockCreatePreSyncBundleBackup.mockImplementation(() =>
Promise.resolve({ bundlePath: "/tmp/mock.bundle" })
);
mockShouldCreatePreSyncBackup = false;
mockShouldBlockSyncOnBackupFailure = true;
// Reset tracking variables
orgCheckCount = 0;
orgTestContext = "";
@@ -529,6 +555,125 @@ describe("Enhanced Gitea Operations", () => {
expect(releaseCall.octokit).toBeDefined();
});
test("blocks sync when pre-sync snapshot fails and blocking is enabled", async () => {
mockShouldCreatePreSyncBackup = true;
mockShouldBlockSyncOnBackupFailure = true;
mockCreatePreSyncBundleBackup.mockImplementation(() =>
Promise.reject(new Error("simulated backup failure"))
);
const config: Partial<Config> = {
userId: "user123",
githubConfig: {
username: "testuser",
token: "github-token",
privateRepositories: false,
mirrorStarred: true,
},
giteaConfig: {
url: "https://gitea.example.com",
token: "encrypted-token",
defaultOwner: "testuser",
mirrorReleases: false,
backupBeforeSync: true,
blockSyncOnBackupFailure: true,
},
};
const repository: Repository = {
id: "repo456",
name: "mirror-repo",
fullName: "user/mirror-repo",
owner: "user",
cloneUrl: "https://github.com/user/mirror-repo.git",
isPrivate: false,
isStarred: true,
status: repoStatusEnum.parse("mirrored"),
visibility: "public",
userId: "user123",
createdAt: new Date(),
updatedAt: new Date(),
};
await expect(
syncGiteaRepoEnhanced(
{ config, repository },
{
getGiteaRepoOwnerAsync: mockGetGiteaRepoOwnerAsync,
mirrorGitHubReleasesToGitea: mockMirrorGitHubReleasesToGitea,
mirrorGitRepoIssuesToGitea: mockMirrorGitRepoIssuesToGitea,
mirrorGitRepoPullRequestsToGitea: mockMirrorGitRepoPullRequestsToGitea,
mirrorGitRepoLabelsToGitea: mockMirrorGitRepoLabelsToGitea,
mirrorGitRepoMilestonesToGitea: mockMirrorGitRepoMilestonesToGitea,
}
)
).rejects.toThrow("Snapshot failed; sync blocked to protect history.");
const mirrorSyncCalls = mockHttpPost.mock.calls.filter((call) =>
String(call[0]).includes("/mirror-sync")
);
expect(mirrorSyncCalls.length).toBe(0);
});
test("continues sync when pre-sync snapshot fails and blocking is disabled", async () => {
mockShouldCreatePreSyncBackup = true;
mockShouldBlockSyncOnBackupFailure = false;
mockCreatePreSyncBundleBackup.mockImplementation(() =>
Promise.reject(new Error("simulated backup failure"))
);
const config: Partial<Config> = {
userId: "user123",
githubConfig: {
username: "testuser",
token: "github-token",
privateRepositories: false,
mirrorStarred: true,
},
giteaConfig: {
url: "https://gitea.example.com",
token: "encrypted-token",
defaultOwner: "testuser",
mirrorReleases: false,
backupBeforeSync: true,
blockSyncOnBackupFailure: false,
},
};
const repository: Repository = {
id: "repo457",
name: "mirror-repo",
fullName: "user/mirror-repo",
owner: "user",
cloneUrl: "https://github.com/user/mirror-repo.git",
isPrivate: false,
isStarred: true,
status: repoStatusEnum.parse("mirrored"),
visibility: "public",
userId: "user123",
createdAt: new Date(),
updatedAt: new Date(),
};
const result = await syncGiteaRepoEnhanced(
{ config, repository },
{
getGiteaRepoOwnerAsync: mockGetGiteaRepoOwnerAsync,
mirrorGitHubReleasesToGitea: mockMirrorGitHubReleasesToGitea,
mirrorGitRepoIssuesToGitea: mockMirrorGitRepoIssuesToGitea,
mirrorGitRepoPullRequestsToGitea: mockMirrorGitRepoPullRequestsToGitea,
mirrorGitRepoLabelsToGitea: mockMirrorGitRepoLabelsToGitea,
mirrorGitRepoMilestonesToGitea: mockMirrorGitRepoMilestonesToGitea,
}
);
expect(result).toEqual({ success: true });
const mirrorSyncCalls = mockHttpPost.mock.calls.filter((call) =>
String(call[0]).includes("/mirror-sync")
);
expect(mirrorSyncCalls.length).toBe(1);
});
test("mirrors metadata components when enabled and not previously synced", async () => {
const config: Partial<Config> = {
userId: "user123",


@@ -15,6 +15,16 @@ import { httpPost, httpGet, httpPatch, HttpError } from "./http-client";
import { db, repositories } from "./db";
import { eq } from "drizzle-orm";
import { repoStatusEnum } from "@/types/Repository";
import {
createPreSyncBundleBackup,
shouldCreatePreSyncBackup,
shouldBlockSyncOnBackupFailure,
resolveBackupStrategy,
shouldBackupForStrategy,
shouldBlockSyncForStrategy,
strategyNeedsDetection,
} from "./repo-backup";
import { detectForcePush } from "./utils/force-push-detection";
import {
parseRepositoryMetadataState,
serializeRepositoryMetadataState,
@@ -250,9 +260,12 @@ export async function getOrCreateGiteaOrgEnhanced({
export async function syncGiteaRepoEnhanced({
config,
repository,
skipForcePushDetection,
}: {
config: Partial<Config>;
repository: Repository;
/** When true, skip force-push detection and blocking (used by approve-sync). */
skipForcePushDetection?: boolean;
}, deps?: SyncDependencies): Promise<any> {
try {
if (!config.userId || !config.giteaConfig?.url || !config.giteaConfig?.token) {
@@ -313,6 +326,141 @@ export async function syncGiteaRepoEnhanced({
throw new Error(`Repository ${repository.name} is not a mirror. Cannot sync.`);
}
// ---- Smart backup strategy with force-push detection ----
const backupStrategy = resolveBackupStrategy(config);
let forcePushDetected = false;
if (backupStrategy !== "disabled") {
// Run force-push detection if the strategy requires it
// (skip when called from approve-sync to avoid re-blocking)
if (strategyNeedsDetection(backupStrategy) && !skipForcePushDetection) {
try {
const decryptedGithubToken = decryptedConfig.githubConfig?.token;
if (decryptedGithubToken) {
const fpOctokit = new Octokit({ auth: decryptedGithubToken });
const detectionResult = await detectForcePush({
giteaUrl: config.giteaConfig.url,
giteaToken: decryptedConfig.giteaConfig.token,
giteaOwner: repoOwner,
giteaRepo: repository.name,
octokit: fpOctokit,
githubOwner: repository.owner,
githubRepo: repository.name,
});
forcePushDetected = detectionResult.detected;
if (detectionResult.skipped) {
console.log(
`[Sync] Force-push detection skipped for ${repository.name}: ${detectionResult.skipReason}`,
);
} else if (forcePushDetected) {
const branchNames = detectionResult.affectedBranches
.map((b) => `${b.name} (${b.reason})`)
.join(", ");
console.warn(
`[Sync] Force-push detected on ${repository.name}: ${branchNames}`,
);
}
} else {
console.log(
`[Sync] Skipping force-push detection for ${repository.name}: no GitHub token`,
);
}
} catch (detectionError) {
// Fail-open: detection errors should never block sync
console.warn(
`[Sync] Force-push detection failed for ${repository.name}, proceeding with sync: ${
detectionError instanceof Error ? detectionError.message : String(detectionError)
}`,
);
}
}
// Check if sync should be blocked (block-on-force-push mode)
if (shouldBlockSyncForStrategy(backupStrategy, forcePushDetected)) {
const branchInfo = `Force-push detected; sync blocked for manual approval.`;
await db
.update(repositories)
.set({
status: "pending-approval",
updatedAt: new Date(),
errorMessage: branchInfo,
})
.where(eq(repositories.id, repository.id!));
await createMirrorJob({
userId: config.userId,
repositoryId: repository.id,
repositoryName: repository.name,
message: `Sync blocked for ${repository.name}: force-push detected`,
details: branchInfo,
status: "pending-approval",
});
console.warn(`[Sync] Sync blocked for ${repository.name}: pending manual approval`);
return { blocked: true, reason: branchInfo };
}
// Create backup if strategy says so
if (shouldBackupForStrategy(backupStrategy, forcePushDetected)) {
const cloneUrl =
repoInfo.clone_url ||
`${config.giteaConfig.url.replace(/\/$/, "")}/${repoOwner}/${repository.name}.git`;
try {
const backupResult = await createPreSyncBundleBackup({
config,
owner: repoOwner,
repoName: repository.name,
cloneUrl,
force: true, // Strategy already decided to backup; skip legacy gate
});
await createMirrorJob({
userId: config.userId,
repositoryId: repository.id,
repositoryName: repository.name,
message: `Snapshot created for ${repository.name}`,
details: `Pre-sync snapshot created at ${backupResult.bundlePath}.`,
status: "syncing",
});
} catch (backupError) {
const errorMessage =
backupError instanceof Error ? backupError.message : String(backupError);
await createMirrorJob({
userId: config.userId,
repositoryId: repository.id,
repositoryName: repository.name,
message: `Snapshot failed for ${repository.name}`,
details: `Pre-sync snapshot failed: ${errorMessage}`,
status: "failed",
});
if (shouldBlockSyncOnBackupFailure(config)) {
await db
.update(repositories)
.set({
status: repoStatusEnum.parse("failed"),
updatedAt: new Date(),
errorMessage: `Snapshot failed; sync blocked to protect history. ${errorMessage}`,
})
.where(eq(repositories.id, repository.id!));
throw new Error(
`Snapshot failed; sync blocked to protect history. ${errorMessage}`,
);
}
console.warn(
`[Sync] Snapshot failed for ${repository.name}, continuing because blockSyncOnBackupFailure=false: ${errorMessage}`,
);
}
}
}
// Update mirror interval if needed
if (config.giteaConfig?.mirrorInterval) {
try {


@@ -24,9 +24,14 @@ mock.module("@/lib/db", () => {
values: mock(() => Promise.resolve())
}))
},
users: {},
configs: {},
repositories: {},
organizations: {},
events: {},
mirrorJobs: {},
accounts: {},
sessions: {},
};
});
@@ -59,10 +64,16 @@ const mockGetOrCreateGiteaOrg = mock(async ({ orgName, config }: any) => {
const mockMirrorGitHubOrgRepoToGiteaOrg = mock(async () => {});
const mockIsRepoPresentInGitea = mock(async () => false);
const mockMirrorGithubRepoToGitea = mock(async () => {});
const mockGetGiteaRepoOwnerAsync = mock(async () => "starred");
const mockGetGiteaRepoOwner = mock(() => "starred");
mock.module("./gitea", () => ({
getOrCreateGiteaOrg: mockGetOrCreateGiteaOrg,
mirrorGitHubOrgRepoToGiteaOrg: mockMirrorGitHubOrgRepoToGiteaOrg,
mirrorGithubRepoToGitea: mockMirrorGithubRepoToGitea,
getGiteaRepoOwner: mockGetGiteaRepoOwner,
getGiteaRepoOwnerAsync: mockGetGiteaRepoOwnerAsync,
isRepoPresentInGitea: mockIsRepoPresentInGitea
}));
@@ -226,4 +237,4 @@ describe("Starred Repository Error Handling", () => {
});
});
});
});


@@ -27,8 +27,14 @@ mock.module("@/lib/db", () => {
})
})
},
users: {},
configs: {},
repositories: {},
organizations: {},
mirrorJobs: {},
events: {},
accounts: {},
sessions: {},
};
});
@@ -55,8 +61,50 @@ mock.module("@/lib/http-client", () => {
// Mock the gitea module itself
mock.module("./gitea", () => {
const mockGetGiteaRepoOwner = mock(({ config, repository }: any) => {
if (repository?.isStarred && config?.githubConfig?.starredReposMode === "preserve-owner") {
return repository.organization || repository.owner;
}
if (repository?.isStarred) {
return config?.githubConfig?.starredReposOrg || "starred";
}
const mirrorStrategy =
config?.githubConfig?.mirrorStrategy ||
(config?.giteaConfig?.preserveOrgStructure ? "preserve" : "flat-user");
switch (mirrorStrategy) {
case "preserve":
return repository?.organization || config?.giteaConfig?.defaultOwner || "giteauser";
case "single-org":
return config?.giteaConfig?.organization || config?.giteaConfig?.defaultOwner || "giteauser";
case "mixed":
if (repository?.organization) return repository.organization;
return config?.giteaConfig?.organization || config?.giteaConfig?.defaultOwner || "giteauser";
case "flat-user":
default:
return config?.giteaConfig?.defaultOwner || "giteauser";
}
});
const mockGetGiteaRepoOwnerAsync = mock(async ({ config, repository }: any) => {
if (repository?.isStarred && config?.githubConfig?.starredReposMode === "preserve-owner") {
return repository.organization || repository.owner;
}
if (repository?.destinationOrg) {
return repository.destinationOrg;
}
if (repository?.organization && mockDbSelectResult[0]?.destinationOrg) {
return mockDbSelectResult[0].destinationOrg;
}
return config?.giteaConfig?.defaultOwner || "giteauser";
});
return {
isRepoPresentInGitea: mockIsRepoPresentInGitea,
getGiteaRepoOwner: mockGetGiteaRepoOwner,
getGiteaRepoOwnerAsync: mockGetGiteaRepoOwnerAsync,
mirrorGithubRepoToGitea: mock(async () => {}),
mirrorGitHubOrgRepoToGiteaOrg: mock(async () => {})
};


@@ -22,22 +22,30 @@ if (process.env.NODE_ENV !== "test") {
// Fallback to base Octokit if .plugin is not present
const MyOctokit: any = (Octokit as any)?.plugin?.call
? (Octokit as any).plugin(throttling)
: (Octokit as any);
/**
* Creates an authenticated Octokit instance with rate limit tracking and throttling
*/
export function createGitHubClient(
token: string,
userId?: string,
username?: string,
): Octokit {
// Create a proper User-Agent to identify our application
// This helps GitHub understand our traffic patterns and can provide better rate limits
const userAgent = username
? `gitea-mirror/3.5.4 (user:${username})`
: "gitea-mirror/3.5.4";
// Support GH_API_URL (preferred) or GITHUB_API_URL (may conflict with GitHub Actions)
// GitHub Actions sets GITHUB_API_URL to https://api.github.com by default
const baseUrl = process.env.GH_API_URL || process.env.GITHUB_API_URL || "https://api.github.com";
const octokit = new MyOctokit({
auth: token, // Always use token for authentication (5000 req/hr vs 60 for unauthenticated)
userAgent, // Identify our application and user
baseUrl, // Configurable for E2E testing
log: {
debug: () => {},
info: console.log,
@@ -52,14 +60,19 @@ export function createGitHubClient(token: string, userId?: string, username?: st
},
},
throttle: {
onRateLimit: async (
retryAfter: number,
options: any,
octokit: any,
retryCount: number,
) => {
const isSearch = options.url.includes("/search/");
const maxRetries = isSearch ? 5 : 3; // Search endpoints get more retries
console.warn(
`[GitHub] Rate limit hit for ${options.method} ${options.url}. Retry ${retryCount + 1}/${maxRetries}`,
);
// Update rate limit status and notify UI (if available)
if (userId && RateLimitManager) {
await RateLimitManager.updateFromResponse(userId, {
@@ -68,7 +81,7 @@ export function createGitHubClient(token: string, userId?: string, username?: st
"x-ratelimit-reset": (Date.now() / 1000 + retryAfter).toString(),
});
}
if (userId && publishEvent) {
await publishEvent({
userId,
@@ -83,22 +96,29 @@ export function createGitHubClient(token: string, userId?: string, username?: st
},
});
}
// Retry with exponential backoff
if (retryCount < maxRetries) {
console.log(`[GitHub] Waiting ${retryAfter}s before retry...`);
return true;
}
// Max retries reached
console.error(
`[GitHub] Max retries (${maxRetries}) reached for ${options.url}`,
);
return false;
},
onSecondaryRateLimit: async (
retryAfter: number,
options: any,
octokit: any,
retryCount: number,
) => {
console.warn(
`[GitHub] Secondary rate limit hit for ${options.method} ${options.url}`,
);
// Update status and notify UI (if available)
if (userId && publishEvent) {
await publishEvent({
@@ -114,13 +134,15 @@ export function createGitHubClient(token: string, userId?: string, username?: st
},
});
}
// Retry up to 2 times for secondary rate limits
if (retryCount < 2) {
console.log(
`[GitHub] Waiting ${retryAfter}s for secondary rate limit...`,
);
return true;
}
return false;
},
// Throttle options to prevent hitting limits
@@ -129,50 +151,57 @@ export function createGitHubClient(token: string, userId?: string, username?: st
retryAfterBaseValue: 1000, // Base retry in ms
},
});
// Add rate limit tracking hooks if userId is provided and RateLimitManager is available
if (userId && RateLimitManager) {
octokit.hook.after("request", async (response: any, _options: any) => {
// Update rate limit from response headers
if (response.headers) {
await RateLimitManager.updateFromResponse(userId, response.headers);
}
});
octokit.hook.error("request", async (error: any, options: any) => {
// Handle rate limit errors
if (error.status === 403 || error.status === 429) {
const message = error.message || "";
if (
message.includes("rate limit") ||
message.includes("API rate limit")
) {
console.error(
`[GitHub] Rate limit error for user ${userId}: ${message}`,
);
// Update rate limit status from error response (if available)
if (error.response?.headers && RateLimitManager) {
await RateLimitManager.updateFromResponse(
userId,
error.response.headers,
);
}
// Create error event for UI (if available)
if (publishEvent) {
await publishEvent({
userId,
channel: "rate-limit",
payload: {
type: "error",
provider: "github",
error: message,
endpoint: options.url,
message: `Rate limit exceeded: ${message}`,
},
});
}
}
}
throw error;
});
}
return octokit;
}
@@ -213,7 +242,7 @@ export async function getGithubRepositories({
try {
const repos = await octokit.paginate(
octokit.repos.listForAuthenticatedUser,
{ per_page: 100 },
);
const skipForks = config.githubConfig?.skipForks ?? false;
@@ -254,6 +283,7 @@ export async function getGithubRepositories({
visibility: (repo.visibility ?? "public") as GitRepo["visibility"],
status: "imported",
isDisabled: repo.disabled ?? false,
lastMirrored: undefined,
errorMessage: undefined,
@@ -264,7 +294,7 @@ export async function getGithubRepositories({
throw new Error(
`Error fetching repositories: ${
error instanceof Error ? error.message : String(error)
}`,
);
}
}
@@ -275,13 +305,13 @@ export async function getGithubStarredRepositories({
}: {
octokit: Octokit;
config: Partial<Config>;
}): Promise<GitRepo[]> {
try {
const starredRepos = await octokit.paginate(
octokit.activity.listReposStarredByAuthenticatedUser,
{
per_page: 100,
},
);
return starredRepos.map((repo) => ({
@@ -314,6 +344,7 @@ export async function getGithubStarredRepositories({
visibility: (repo.visibility ?? "public") as GitRepo["visibility"],
status: "imported",
isDisabled: repo.disabled ?? false,
lastMirrored: undefined,
errorMessage: undefined,
@@ -324,7 +355,7 @@ export async function getGithubStarredRepositories({
throw new Error(
`Error fetching starred repositories: ${
error instanceof Error ? error.message : String(error)
}`,
);
}
}
@@ -347,13 +378,15 @@ export async function getGithubOrganizations({
// Get excluded organizations from environment variable
const excludedOrgsEnv = process.env.GITHUB_EXCLUDED_ORGS;
const excludedOrgs = excludedOrgsEnv
? excludedOrgsEnv.split(",").map((org) => org.trim().toLowerCase())
: [];
// Filter out excluded organizations
const filteredOrgs = orgs.filter((org) => {
if (excludedOrgs.includes(org.login.toLowerCase())) {
console.log(
`Skipping organization ${org.login} - excluded via GITHUB_EXCLUDED_ORGS environment variable`,
);
return false;
}
return true;
@@ -379,7 +412,7 @@ export async function getGithubOrganizations({
createdAt: new Date(),
updatedAt: new Date(),
};
}),
);
return organizations;
@@ -387,7 +420,7 @@ export async function getGithubOrganizations({
throw new Error(
`Error fetching organizations: ${
error instanceof Error ? error.message : String(error)
}`,
);
}
}
@@ -438,6 +471,7 @@ export async function getGithubOrganizationRepositories({
visibility: (repo.visibility ?? "public") as GitRepo["visibility"],
status: "imported",
isDisabled: repo.disabled ?? false,
lastMirrored: undefined,
errorMessage: undefined,
@@ -448,7 +482,7 @@ export async function getGithubOrganizationRepositories({
throw new Error(
`Error fetching organization repositories: ${
error instanceof Error ? error.message : String(error)
}`,
);
}
}

src/lib/repo-backup.test.ts (new file, 248 lines)

@@ -0,0 +1,248 @@
import path from "node:path";
import { afterEach, beforeEach, describe, expect, test } from "bun:test";
import type { Config } from "@/types/config";
import {
resolveBackupPaths,
resolveBackupStrategy,
shouldBackupForStrategy,
shouldBlockSyncForStrategy,
strategyNeedsDetection,
} from "@/lib/repo-backup";
describe("resolveBackupPaths", () => {
let originalBackupDirEnv: string | undefined;
beforeEach(() => {
originalBackupDirEnv = process.env.PRE_SYNC_BACKUP_DIR;
delete process.env.PRE_SYNC_BACKUP_DIR;
});
afterEach(() => {
if (originalBackupDirEnv === undefined) {
delete process.env.PRE_SYNC_BACKUP_DIR;
} else {
process.env.PRE_SYNC_BACKUP_DIR = originalBackupDirEnv;
}
});
test("returns absolute paths when backupDirectory is relative", () => {
const config: Partial<Config> = {
userId: "user-123",
giteaConfig: {
backupDirectory: "data/repo-backups",
} as Config["giteaConfig"],
};
const { backupRoot, repoBackupDir } = resolveBackupPaths({
config,
owner: "RayLabsHQ",
repoName: "gitea-mirror",
});
expect(path.isAbsolute(backupRoot)).toBe(true);
expect(path.isAbsolute(repoBackupDir)).toBe(true);
expect(repoBackupDir).toBe(
path.join(backupRoot, "user-123", "RayLabsHQ", "gitea-mirror")
);
});
test("returns absolute paths when backupDirectory is already absolute", () => {
const config: Partial<Config> = {
userId: "user-123",
giteaConfig: {
backupDirectory: "/data/repo-backups",
} as Config["giteaConfig"],
};
const { backupRoot, repoBackupDir } = resolveBackupPaths({
config,
owner: "owner",
repoName: "repo",
});
expect(backupRoot).toBe("/data/repo-backups");
expect(path.isAbsolute(repoBackupDir)).toBe(true);
});
test("falls back to cwd-based path when no backupDirectory is set", () => {
const config: Partial<Config> = {
userId: "user-123",
giteaConfig: {} as Config["giteaConfig"],
};
const { backupRoot } = resolveBackupPaths({
config,
owner: "owner",
repoName: "repo",
});
expect(path.isAbsolute(backupRoot)).toBe(true);
expect(backupRoot).toBe(
path.resolve(process.cwd(), "data", "repo-backups")
);
});
test("uses PRE_SYNC_BACKUP_DIR env var when config has no backupDirectory", () => {
process.env.PRE_SYNC_BACKUP_DIR = "custom/backup/path";
const config: Partial<Config> = {
userId: "user-123",
giteaConfig: {} as Config["giteaConfig"],
};
const { backupRoot } = resolveBackupPaths({
config,
owner: "owner",
repoName: "repo",
});
expect(path.isAbsolute(backupRoot)).toBe(true);
expect(backupRoot).toBe(path.resolve("custom/backup/path"));
});
test("sanitizes owner and repoName in path segments", () => {
const config: Partial<Config> = {
userId: "user-123",
giteaConfig: {
backupDirectory: "/backups",
} as Config["giteaConfig"],
};
const { repoBackupDir } = resolveBackupPaths({
config,
owner: "org/with-slash",
repoName: "repo name!",
});
expect(repoBackupDir).toBe(
path.join("/backups", "user-123", "org_with-slash", "repo_name_")
);
});
});
// ---- Backup strategy resolver tests ----
function makeConfig(overrides: Record<string, any> = {}): Partial<Config> {
return {
giteaConfig: {
url: "https://gitea.example.com",
token: "tok",
...overrides,
},
} as Partial<Config>;
}
const envKeysToClean = ["PRE_SYNC_BACKUP_STRATEGY", "PRE_SYNC_BACKUP_ENABLED"];
describe("resolveBackupStrategy", () => {
let savedEnv: Record<string, string | undefined> = {};
beforeEach(() => {
savedEnv = {};
for (const key of envKeysToClean) {
savedEnv[key] = process.env[key];
delete process.env[key];
}
});
afterEach(() => {
for (const [key, value] of Object.entries(savedEnv)) {
if (value === undefined) {
delete process.env[key];
} else {
process.env[key] = value;
}
}
});
test("returns explicit backupStrategy when set", () => {
expect(resolveBackupStrategy(makeConfig({ backupStrategy: "always" }))).toBe("always");
expect(resolveBackupStrategy(makeConfig({ backupStrategy: "disabled" }))).toBe("disabled");
expect(resolveBackupStrategy(makeConfig({ backupStrategy: "on-force-push" }))).toBe("on-force-push");
expect(resolveBackupStrategy(makeConfig({ backupStrategy: "block-on-force-push" }))).toBe("block-on-force-push");
});
test("maps backupBeforeSync: true → 'always' (backward compat)", () => {
expect(resolveBackupStrategy(makeConfig({ backupBeforeSync: true }))).toBe("always");
});
test("maps backupBeforeSync: false → 'disabled' (backward compat)", () => {
expect(resolveBackupStrategy(makeConfig({ backupBeforeSync: false }))).toBe("disabled");
});
test("prefers explicit backupStrategy over backupBeforeSync", () => {
expect(
resolveBackupStrategy(
makeConfig({ backupStrategy: "on-force-push", backupBeforeSync: true }),
),
).toBe("on-force-push");
});
test("falls back to PRE_SYNC_BACKUP_STRATEGY env var", () => {
process.env.PRE_SYNC_BACKUP_STRATEGY = "block-on-force-push";
expect(resolveBackupStrategy(makeConfig({}))).toBe("block-on-force-push");
});
test("falls back to PRE_SYNC_BACKUP_ENABLED env var (legacy)", () => {
process.env.PRE_SYNC_BACKUP_ENABLED = "false";
expect(resolveBackupStrategy(makeConfig({}))).toBe("disabled");
});
test("defaults to 'on-force-push' when nothing is configured", () => {
expect(resolveBackupStrategy(makeConfig({}))).toBe("on-force-push");
});
test("handles empty giteaConfig gracefully", () => {
expect(resolveBackupStrategy({})).toBe("on-force-push");
});
});
describe("shouldBackupForStrategy", () => {
test("disabled → never backup", () => {
expect(shouldBackupForStrategy("disabled", false)).toBe(false);
expect(shouldBackupForStrategy("disabled", true)).toBe(false);
});
test("always → always backup", () => {
expect(shouldBackupForStrategy("always", false)).toBe(true);
expect(shouldBackupForStrategy("always", true)).toBe(true);
});
test("on-force-push → backup only when detected", () => {
expect(shouldBackupForStrategy("on-force-push", false)).toBe(false);
expect(shouldBackupForStrategy("on-force-push", true)).toBe(true);
});
test("block-on-force-push → backup only when detected", () => {
expect(shouldBackupForStrategy("block-on-force-push", false)).toBe(false);
expect(shouldBackupForStrategy("block-on-force-push", true)).toBe(true);
});
});
describe("shouldBlockSyncForStrategy", () => {
test("only block-on-force-push + detected returns true", () => {
expect(shouldBlockSyncForStrategy("block-on-force-push", true)).toBe(true);
});
test("block-on-force-push without detection does not block", () => {
expect(shouldBlockSyncForStrategy("block-on-force-push", false)).toBe(false);
});
test("other strategies never block", () => {
expect(shouldBlockSyncForStrategy("disabled", true)).toBe(false);
expect(shouldBlockSyncForStrategy("always", true)).toBe(false);
expect(shouldBlockSyncForStrategy("on-force-push", true)).toBe(false);
});
});
describe("strategyNeedsDetection", () => {
test("returns true for detection-based strategies", () => {
expect(strategyNeedsDetection("on-force-push")).toBe(true);
expect(strategyNeedsDetection("block-on-force-push")).toBe(true);
});
test("returns false for non-detection strategies", () => {
expect(strategyNeedsDetection("disabled")).toBe(false);
expect(strategyNeedsDetection("always")).toBe(false);
});
});

src/lib/repo-backup.ts (new file, 276 lines)

@@ -0,0 +1,276 @@
import { mkdir, mkdtemp, readdir, rm, stat } from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import type { Config, BackupStrategy } from "@/types/config";
import { decryptConfigTokens } from "./utils/config-encryption";
const TRUE_VALUES = new Set(["1", "true", "yes", "on"]);
function parseBoolean(value: string | undefined, fallback: boolean): boolean {
if (value === undefined) return fallback;
return TRUE_VALUES.has(value.trim().toLowerCase());
}
function parsePositiveInt(value: string | undefined, fallback: number): number {
if (!value) return fallback;
const parsed = Number.parseInt(value, 10);
if (!Number.isFinite(parsed) || parsed <= 0) {
return fallback;
}
return parsed;
}
function sanitizePathSegment(input: string): string {
return input.replace(/[^a-zA-Z0-9._-]/g, "_");
}
function buildTimestamp(): string {
// Example: 2026-02-25T18-34-22-123Z
return new Date().toISOString().replace(/[:.]/g, "-");
}
function buildAuthenticatedCloneUrl(cloneUrl: string, token: string): string {
const parsed = new URL(cloneUrl);
if (parsed.protocol !== "http:" && parsed.protocol !== "https:") {
return cloneUrl;
}
parsed.username = process.env.PRE_SYNC_BACKUP_GIT_USERNAME || "oauth2";
parsed.password = token;
return parsed.toString();
}
function maskToken(text: string, token: string): string {
if (!token) return text;
return text.split(token).join("***");
}
async function runGit(args: string[], tokenToMask: string): Promise<void> {
const proc = Bun.spawn({
cmd: ["git", ...args],
stdout: "pipe",
stderr: "pipe",
});
const [stdout, stderr, exitCode] = await Promise.all([
new Response(proc.stdout).text(),
new Response(proc.stderr).text(),
proc.exited,
]);
if (exitCode !== 0) {
const details = [stdout, stderr].filter(Boolean).join("\n").trim();
const safeDetails = maskToken(details, tokenToMask);
throw new Error(`git command failed: ${safeDetails || "unknown git error"}`);
}
}
async function enforceRetention(repoBackupDir: string, keepCount: number): Promise<void> {
const entries = await readdir(repoBackupDir);
const bundleFiles = entries
.filter((name) => name.endsWith(".bundle"))
.map((name) => path.join(repoBackupDir, name));
if (bundleFiles.length <= keepCount) return;
const filesWithMtime = await Promise.all(
bundleFiles.map(async (filePath) => ({
filePath,
mtimeMs: (await stat(filePath)).mtimeMs,
}))
);
filesWithMtime.sort((a, b) => b.mtimeMs - a.mtimeMs);
const toDelete = filesWithMtime.slice(keepCount);
await Promise.all(toDelete.map((entry) => rm(entry.filePath, { force: true })));
}
export function isPreSyncBackupEnabled(): boolean {
return parseBoolean(process.env.PRE_SYNC_BACKUP_ENABLED, true);
}
export function shouldCreatePreSyncBackup(config: Partial<Config>): boolean {
const configSetting = config.giteaConfig?.backupBeforeSync;
const fallback = isPreSyncBackupEnabled();
return configSetting === undefined ? fallback : Boolean(configSetting);
}
export function shouldBlockSyncOnBackupFailure(config: Partial<Config>): boolean {
const configSetting = config.giteaConfig?.blockSyncOnBackupFailure;
return configSetting === undefined ? true : Boolean(configSetting);
}
// ---- Backup strategy resolver ----
const VALID_STRATEGIES = new Set<BackupStrategy>([
"disabled",
"always",
"on-force-push",
"block-on-force-push",
]);
/**
* Resolve the effective backup strategy from config, falling back through:
* 1. `backupStrategy` field (new)
* 2. `backupBeforeSync` boolean (deprecated, backward compat)
* 3. `PRE_SYNC_BACKUP_STRATEGY` env var
* 4. `PRE_SYNC_BACKUP_ENABLED` env var (legacy)
* 5. Default: `"on-force-push"`
*/
export function resolveBackupStrategy(config: Partial<Config>): BackupStrategy {
// 1. Explicit backupStrategy field
const explicit = config.giteaConfig?.backupStrategy;
if (explicit && VALID_STRATEGIES.has(explicit as BackupStrategy)) {
return explicit as BackupStrategy;
}
// 2. Legacy backupBeforeSync boolean → map to strategy
const legacy = config.giteaConfig?.backupBeforeSync;
if (legacy !== undefined) {
return legacy ? "always" : "disabled";
}
// 3. Env var (new)
const envStrategy = process.env.PRE_SYNC_BACKUP_STRATEGY?.trim().toLowerCase();
if (envStrategy && VALID_STRATEGIES.has(envStrategy as BackupStrategy)) {
return envStrategy as BackupStrategy;
}
// 4. Env var (legacy)
const envEnabled = process.env.PRE_SYNC_BACKUP_ENABLED;
if (envEnabled !== undefined) {
return parseBoolean(envEnabled, true) ? "always" : "disabled";
}
// 5. Default
return "on-force-push";
}
/**
* Determine whether a backup should be created for the given strategy and
* force-push detection result.
*/
export function shouldBackupForStrategy(
strategy: BackupStrategy,
forcePushDetected: boolean,
): boolean {
switch (strategy) {
case "disabled":
return false;
case "always":
return true;
case "on-force-push":
case "block-on-force-push":
return forcePushDetected;
default:
return false;
}
}
/**
* Determine whether sync should be blocked (requires manual approval).
* Only `block-on-force-push` with an actual detection blocks sync.
*/
export function shouldBlockSyncForStrategy(
strategy: BackupStrategy,
forcePushDetected: boolean,
): boolean {
return strategy === "block-on-force-push" && forcePushDetected;
}
/**
* Returns true when the strategy requires running force-push detection
* before deciding on backup / block behavior.
*/
export function strategyNeedsDetection(strategy: BackupStrategy): boolean {
return strategy === "on-force-push" || strategy === "block-on-force-push";
}
export function resolveBackupPaths({
config,
owner,
repoName,
}: {
config: Partial<Config>;
owner: string;
repoName: string;
}): { backupRoot: string; repoBackupDir: string } {
let backupRoot =
config.giteaConfig?.backupDirectory?.trim() ||
process.env.PRE_SYNC_BACKUP_DIR?.trim() ||
path.join(process.cwd(), "data", "repo-backups");
// Ensure backupRoot is absolute - relative paths break git bundle creation
// because git runs with -C mirrorClonePath and interprets relative paths from there.
// Always use path.resolve() which guarantees an absolute path, rather than a
// conditional check that can miss edge cases (e.g., NixOS systemd services).
backupRoot = path.resolve(backupRoot);
const repoBackupDir = path.join(
backupRoot,
sanitizePathSegment(config.userId || "unknown-user"),
sanitizePathSegment(owner),
sanitizePathSegment(repoName)
);
return { backupRoot, repoBackupDir };
}
export async function createPreSyncBundleBackup({
config,
owner,
repoName,
cloneUrl,
force,
}: {
config: Partial<Config>;
owner: string;
repoName: string;
cloneUrl: string;
/** When true, skip the legacy shouldCreatePreSyncBackup check.
* Used by the strategy-driven path which has already decided to backup. */
force?: boolean;
}): Promise<{ bundlePath: string }> {
if (!force && !shouldCreatePreSyncBackup(config)) {
throw new Error("Pre-sync backup is disabled.");
}
if (!config.giteaConfig?.token) {
throw new Error("Gitea token is required for pre-sync backup.");
}
const decryptedConfig = decryptConfigTokens(config as Config);
const giteaToken = decryptedConfig.giteaConfig?.token;
if (!giteaToken) {
throw new Error("Decrypted Gitea token is required for pre-sync backup.");
}
const { repoBackupDir } = resolveBackupPaths({ config, owner, repoName });
const retention = Math.max(
1,
Number.isFinite(config.giteaConfig?.backupRetentionCount)
? Number(config.giteaConfig?.backupRetentionCount)
: parsePositiveInt(process.env.PRE_SYNC_BACKUP_KEEP_COUNT, 20)
);
await mkdir(repoBackupDir, { recursive: true });
const tmpDir = await mkdtemp(path.join(os.tmpdir(), "gitea-mirror-backup-"));
const mirrorClonePath = path.join(tmpDir, "repo.git");
// path.resolve guarantees an absolute path, critical because git -C changes
// the working directory and would misinterpret a relative bundlePath
const bundlePath = path.resolve(repoBackupDir, `${buildTimestamp()}.bundle`);
try {
const authCloneUrl = buildAuthenticatedCloneUrl(cloneUrl, giteaToken);
await runGit(["clone", "--mirror", authCloneUrl, mirrorClonePath], giteaToken);
await runGit(["-C", mirrorClonePath, "bundle", "create", bundlePath, "--all"], giteaToken);
await enforceRetention(repoBackupDir, retention);
return { bundlePath };
} finally {
await rm(tmpDir, { recursive: true, force: true });
}
}
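Taken together, `shouldBackupForStrategy` and `shouldBlockSyncForStrategy` reduce to a small truth table over the strategy and the force-push detection result. A self-contained sketch (`decide` is an illustrative composition, not part of the module's API):

```typescript
type Strategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";

// Sketch of the combined decision: back up when the strategy says so for
// the given detection result; block sync only for block-on-force-push
// with an actual detection.
function decide(strategy: Strategy, forcePushDetected: boolean) {
  const backup =
    strategy === "always" ||
    ((strategy === "on-force-push" || strategy === "block-on-force-push") &&
      forcePushDetected);
  const block = strategy === "block-on-force-push" && forcePushDetected;
  return { backup, block };
}
```

Note that `block` implies `backup`: a blocked sync always has a bundle to fall back on.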


@@ -0,0 +1,17 @@
import { describe, expect, it } from "bun:test";
import { isMirrorableGitHubRepo } from "@/lib/repo-eligibility";
describe("isMirrorableGitHubRepo", () => {
it("returns false for disabled repos", () => {
expect(isMirrorableGitHubRepo({ isDisabled: true })).toBe(false);
});
it("returns true for enabled repos", () => {
expect(isMirrorableGitHubRepo({ isDisabled: false })).toBe(true);
});
it("returns true when disabled flag is absent", () => {
expect(isMirrorableGitHubRepo({})).toBe(true);
});
});


@@ -0,0 +1,6 @@
import type { GitRepo } from "@/types/Repository";
export function isMirrorableGitHubRepo(repo: Pick<GitRepo, "isDisabled">): boolean {
return repo.isDisabled !== true;
}
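The predicate is deliberately a plain function so it plugs straight into `Array.filter` in the scheduler and cleanup service. A minimal usage sketch (the predicate is re-stated locally here for self-containment; repo shapes are illustrative):

```typescript
// Re-statement of the eligibility rule for illustration: a repo is
// mirrorable unless GitHub has explicitly flagged it as disabled.
function isMirrorable(repo: { isDisabled?: boolean }): boolean {
  return repo.isDisabled !== true;
}

const repos = [
  { fullName: "acme/active" },
  { fullName: "acme/dmca-disabled", isDisabled: true },
];
const mirrorable = repos.filter(isMirrorable);
```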


@@ -10,6 +10,7 @@ import { createGitHubClient, getGithubRepositories, getGithubStarredRepositories
import { createGiteaClient, deleteGiteaRepo, archiveGiteaRepo, getGiteaRepoOwnerAsync, checkRepoLocation } from '@/lib/gitea';
import { getDecryptedGitHubToken, getDecryptedGiteaToken } from '@/lib/utils/config-encryption';
import { publishEvent } from '@/lib/events';
import { isMirrorableGitHubRepo } from '@/lib/repo-eligibility';
let cleanupInterval: NodeJS.Timeout | null = null;
let isCleanupRunning = false;
@@ -59,7 +60,9 @@ async function identifyOrphanedRepositories(config: any): Promise<any[]> {
return [];
}
const githubReposByFullName = new Map(
allGithubRepos.map((repo) => [repo.fullName, repo] as const)
);
// Get all repositories from our database
const dbRepos = await db
@@ -70,18 +73,30 @@ async function identifyOrphanedRepositories(config: any): Promise<any[]> {
// Only identify repositories as orphaned if we successfully accessed GitHub
// This prevents false positives when GitHub is down or account is inaccessible
const orphanedRepos = dbRepos.filter(repo => {
// Skip repositories we've already archived/preserved
if (repo.status === 'archived' || repo.isArchived) {
console.log(`[Repository Cleanup] Skipping ${repo.fullName} - already archived`);
return false;
}
// If starred repos are not being fetched from GitHub, we can't determine
// if a starred repo is orphaned - skip it to prevent data loss
if (repo.isStarred && !config.githubConfig?.includeStarred) {
console.log(`[Repository Cleanup] Skipping starred repo ${repo.fullName} - starred repos not being fetched from GitHub`);
return false;
}
const githubRepo = githubReposByFullName.get(repo.fullName);
if (!githubRepo) {
return true;
}
if (!isMirrorableGitHubRepo(githubRepo)) {
console.log(`[Repository Cleanup] Preserving ${repo.fullName} - repository is disabled on GitHub`);
return false;
}
return false;
});
if (orphanedRepos.length > 0) {


@@ -12,6 +12,8 @@ import { parseInterval, formatDuration } from '@/lib/utils/duration-parser';
import type { Repository } from '@/lib/db/schema';
import { repoStatusEnum, repositoryVisibilityEnum } from '@/types/Repository';
import { mergeGitReposPreferStarred, normalizeGitRepoToInsert, calcBatchSizeForInsert } from '@/lib/repo-utils';
import { isMirrorableGitHubRepo } from '@/lib/repo-eligibility';
import { createMirrorJob } from '@/lib/helpers';
let schedulerInterval: NodeJS.Timeout | null = null;
let isSchedulerRunning = false;
@@ -96,6 +98,7 @@ async function runScheduledSync(config: any): Promise<void> {
: Promise.resolve([]),
]);
const allGithubRepos = mergeGitReposPreferStarred(basicAndForkedRepos, starredRepos);
const mirrorableGithubRepos = allGithubRepos.filter(isMirrorableGitHubRepo);
// Check for new repositories
const existingRepos = await db
@@ -104,7 +107,7 @@ async function runScheduledSync(config: any): Promise<void> {
.where(eq(repositories.userId, userId));
const existingRepoNames = new Set(existingRepos.map(r => r.normalizedFullName));
const newRepos = mirrorableGithubRepos.filter(r => !existingRepoNames.has(r.fullName.toLowerCase()));
if (newRepos.length > 0) {
console.log(`[Scheduler] Found ${newRepos.length} new repositories for user ${userId}`);
@@ -126,9 +129,26 @@ async function runScheduledSync(config: any): Promise<void> {
.onConflictDoNothing({ target: [repositories.userId, repositories.normalizedFullName] });
}
console.log(`[Scheduler] Successfully imported ${newRepos.length} new repositories for user ${userId}`);
// Log activity for each newly imported repo
for (const repo of newRepos) {
const sourceLabel = repo.isStarred ? 'starred' : 'owned';
await createMirrorJob({
userId,
repositoryName: repo.fullName,
message: `Auto-imported ${sourceLabel} repository: ${repo.fullName}`,
details: `Repository ${repo.fullName} was discovered and imported during scheduled sync.`,
status: 'imported',
skipDuplicateEvent: true,
});
}
} else {
console.log(`[Scheduler] No new repositories found for user ${userId}`);
}
const skippedDisabledCount = allGithubRepos.length - mirrorableGithubRepos.length;
if (skippedDisabledCount > 0) {
console.log(`[Scheduler] Skipped ${skippedDisabledCount} disabled GitHub repositories for user ${userId}`);
}
} catch (error) {
console.error(`[Scheduler] Failed to auto-import repositories for user ${userId}:`, error);
}
@@ -170,7 +190,7 @@ async function runScheduledSync(config: any): Promise<void> {
if (scheduleConfig.autoMirror) {
try {
console.log(`[Scheduler] Auto-mirror enabled - checking for repositories to mirror for user ${userId}...`);
let reposNeedingMirror = await db
.select()
.from(repositories)
.where(
@@ -184,6 +204,19 @@ async function runScheduledSync(config: any): Promise<void> {
)
);
// Filter out starred repos from auto-mirror when autoMirrorStarred is disabled
if (!config.githubConfig?.autoMirrorStarred) {
const githubOwner = config.githubConfig?.owner || '';
const beforeCount = reposNeedingMirror.length;
reposNeedingMirror = reposNeedingMirror.filter(
repo => !repo.isStarred || repo.owner === githubOwner
);
const skippedCount = beforeCount - reposNeedingMirror.length;
if (skippedCount > 0) {
console.log(`[Scheduler] Skipped ${skippedCount} starred repositories from auto-mirror (autoMirrorStarred is disabled)`);
}
}
if (reposNeedingMirror.length > 0) {
console.log(`[Scheduler] Found ${reposNeedingMirror.length} repositories that need initial mirroring`);
@@ -274,11 +307,29 @@ async function runScheduledSync(config: any): Promise<void> {
});
}
// Log pending-approval repos that are excluded from sync
try {
const pendingApprovalRepos = await db
.select({ id: repositories.id })
.from(repositories)
.where(
and(
eq(repositories.userId, userId),
eq(repositories.status, 'pending-approval')
)
);
if (pendingApprovalRepos.length > 0) {
console.log(`[Scheduler] ${pendingApprovalRepos.length} repositories pending approval (force-push detected) for user ${userId} — skipping sync for those`);
}
} catch {
// Non-critical logging, ignore errors
}
if (reposToSync.length === 0) {
console.log(`[Scheduler] No repositories to sync for user ${userId}`);
return;
}
console.log(`[Scheduler] Syncing ${reposToSync.length} repositories for user ${userId}`);
// Process repositories in batches
@@ -429,6 +480,7 @@ async function performInitialAutoStart(): Promise<void> {
: Promise.resolve([]),
]);
const allGithubRepos = mergeGitReposPreferStarred(basicAndForkedRepos, starredRepos);
const mirrorableGithubRepos = allGithubRepos.filter(isMirrorableGitHubRepo);
// Check for new repositories
const existingRepos = await db
@@ -437,7 +489,7 @@ async function performInitialAutoStart(): Promise<void> {
.where(eq(repositories.userId, config.userId));
const existingRepoNames = new Set(existingRepos.map(r => r.normalizedFullName));
const reposToImport = mirrorableGithubRepos.filter(r => !existingRepoNames.has(r.fullName.toLowerCase()));
if (reposToImport.length > 0) {
console.log(`[Scheduler] Importing ${reposToImport.length} repositories for user ${config.userId}...`);
@@ -459,10 +511,27 @@ async function performInitialAutoStart(): Promise<void> {
.onConflictDoNothing({ target: [repositories.userId, repositories.normalizedFullName] });
}
console.log(`[Scheduler] Successfully imported ${reposToImport.length} repositories`);
// Log activity for each newly imported repo
for (const repo of reposToImport) {
const sourceLabel = repo.isStarred ? 'starred' : 'owned';
await createMirrorJob({
userId: config.userId,
repositoryName: repo.fullName,
message: `Auto-imported ${sourceLabel} repository: ${repo.fullName}`,
details: `Repository ${repo.fullName} was discovered and imported during auto-start.`,
status: 'imported',
skipDuplicateEvent: true,
});
}
} else {
console.log(`[Scheduler] No new repositories to import for user ${config.userId}`);
}
const skippedDisabledCount = allGithubRepos.length - mirrorableGithubRepos.length;
if (skippedDisabledCount > 0) {
console.log(`[Scheduler] Skipped ${skippedDisabledCount} disabled GitHub repositories for user ${config.userId}`);
}
// Check if we already have mirrored repositories (indicating this isn't first run)
const mirroredRepos = await db
.select()
@@ -505,8 +574,34 @@ async function performInitialAutoStart(): Promise<void> {
}
// Step 2: Trigger mirror for all repositories that need mirroring
// Only auto-mirror if autoMirror is enabled in schedule config
if (!config.scheduleConfig?.autoMirror) {
console.log(`[Scheduler] Step 2: Skipping initial mirror - autoMirror is disabled for user ${config.userId}`);
// Still update schedule config timestamps
const currentTime2 = new Date();
const intervalSource2 = config.scheduleConfig?.interval ||
config.giteaConfig?.mirrorInterval ||
'8h';
const interval2 = parseScheduleInterval(intervalSource2);
const nextRun2 = new Date(currentTime2.getTime() + interval2);
await db.update(configs).set({
scheduleConfig: {
...config.scheduleConfig,
enabled: true,
lastRun: currentTime2,
nextRun: nextRun2,
},
updatedAt: currentTime2,
}).where(eq(configs.id, config.id));
console.log(`[Scheduler] Scheduling enabled for user ${config.userId}, next sync at ${nextRun2.toISOString()}`);
continue;
}
console.log(`[Scheduler] Step 2: Triggering mirror for repositories that need mirroring...`);
let reposNeedingMirror = await db
.select()
.from(repositories)
.where(
@@ -519,7 +614,20 @@ async function performInitialAutoStart(): Promise<void> {
)
)
);
// Filter out starred repos from auto-mirror when autoMirrorStarred is disabled
if (!config.githubConfig?.autoMirrorStarred) {
const githubOwner = config.githubConfig?.owner || '';
const beforeCount = reposNeedingMirror.length;
reposNeedingMirror = reposNeedingMirror.filter(
repo => !repo.isStarred || repo.owner === githubOwner
);
const skippedCount = beforeCount - reposNeedingMirror.length;
if (skippedCount > 0) {
console.log(`[Scheduler] Skipped ${skippedCount} starred repositories from initial auto-mirror (autoMirrorStarred is disabled)`);
}
}
if (reposNeedingMirror.length > 0) {
console.log(`[Scheduler] Found ${reposNeedingMirror.length} repositories that need mirroring`);
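The `autoMirrorStarred` gate used in both scheduler paths above can be sketched in isolation: when the toggle is off, starred repos are dropped from auto-mirror unless they belong to the configured GitHub owner. A hedged standalone sketch (`filterForAutoMirror` and the repo shape are illustrative, not the scheduler's actual types):

```typescript
// Sketch of the autoMirrorStarred filter: starred repos are excluded from
// automatic mirroring unless the toggle is on or the repo is owned by the
// user's own GitHub account.
function filterForAutoMirror(
  repos: { fullName: string; owner: string; isStarred: boolean }[],
  autoMirrorStarred: boolean,
  githubOwner: string,
) {
  if (autoMirrorStarred) return repos;
  return repos.filter((repo) => !repo.isStarred || repo.owner === githubOwner);
}
```

Manual per-repo mirror actions bypass this filter entirely; it only governs the scheduled and first-boot paths.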


@@ -169,4 +169,31 @@ describe("parseErrorMessage", () => {
expect(result.description).toBeUndefined();
expect(result.isStructured).toBe(false);
});
test("adds trusted origins guidance for invalid origin errors", () => {
const errorMessage = "Invalid Origin: https://mirror.example.com";
const result = parseErrorMessage(errorMessage);
expect(result.title).toBe("Invalid Origin");
expect(result.description).toContain("BETTER_AUTH_TRUSTED_ORIGINS");
expect(result.description).toContain("https://mirror.example.com");
expect(result.isStructured).toBe(true);
});
});
describe("showErrorToast", () => {
test("shows invalid origin guidance in toast description", () => {
const calls: any[] = [];
const toast = {
error: (...args: any[]) => calls.push(args),
};
showErrorToast("Invalid Origin: http://10.10.20.45:4321", toast);
expect(calls).toHaveLength(1);
expect(calls[0][0]).toBe("Invalid Origin");
expect(calls[0][1].description).toContain("BETTER_AUTH_TRUSTED_ORIGINS");
expect(calls[0][1].description).toContain("http://10.10.20.45:4321");
});
});

View File

@@ -86,6 +86,30 @@ export interface ParsedErrorMessage {
isStructured: boolean;
}
function getInvalidOriginGuidance(title: string, description?: string): ParsedErrorMessage | null {
const fullMessage = `${title} ${description ?? ""}`.trim();
if (!/invalid origin/i.test(fullMessage)) {
return null;
}
const urlMatch = fullMessage.match(/https?:\/\/[^\s'")]+/i);
let originHint = "this URL";
if (urlMatch) {
try {
originHint = new URL(urlMatch[0]).origin;
} catch {
originHint = urlMatch[0];
}
}
return {
title: "Invalid Origin",
description: `Add ${originHint} to BETTER_AUTH_TRUSTED_ORIGINS and restart the app.`,
isStructured: true,
};
}
export function parseErrorMessage(error: unknown): ParsedErrorMessage {
// Handle Error objects
if (error instanceof Error) {
@@ -102,29 +126,32 @@ export function parseErrorMessage(error: unknown): ParsedErrorMessage {
if (typeof parsed === "object" && parsed !== null) {
// Format 1: { error: "message", errorType: "type", troubleshooting: "info" }
if (parsed.error) {
return {
const formatted = {
title: parsed.error,
description: parsed.troubleshooting || parsed.errorType || undefined,
isStructured: true,
};
return getInvalidOriginGuidance(formatted.title, formatted.description) || formatted;
}
// Format 2: { title: "title", description: "desc" }
if (parsed.title) {
return {
const formatted = {
title: parsed.title,
description: parsed.description || undefined,
isStructured: true,
};
return getInvalidOriginGuidance(formatted.title, formatted.description) || formatted;
}
// Format 3: { message: "msg", details: "details" }
if (parsed.message) {
return {
const formatted = {
title: parsed.message,
description: parsed.details || undefined,
isStructured: true,
};
return getInvalidOriginGuidance(formatted.title, formatted.description) || formatted;
}
}
} catch {
@@ -132,11 +159,12 @@ export function parseErrorMessage(error: unknown): ParsedErrorMessage {
}
// Plain string message
return {
const formatted = {
title: error,
description: undefined,
isStructured: false,
};
return getInvalidOriginGuidance(formatted.title, formatted.description) || formatted;
}
// Handle objects directly
@@ -144,36 +172,40 @@ export function parseErrorMessage(error: unknown): ParsedErrorMessage {
const errorObj = error as any;
if (errorObj.error) {
return {
const formatted = {
title: errorObj.error,
description: errorObj.troubleshooting || errorObj.errorType || undefined,
isStructured: true,
};
return getInvalidOriginGuidance(formatted.title, formatted.description) || formatted;
}
if (errorObj.title) {
return {
const formatted = {
title: errorObj.title,
description: errorObj.description || undefined,
isStructured: true,
};
return getInvalidOriginGuidance(formatted.title, formatted.description) || formatted;
}
if (errorObj.message) {
return {
const formatted = {
title: errorObj.message,
description: errorObj.details || undefined,
isStructured: true,
};
return getInvalidOriginGuidance(formatted.title, formatted.description) || formatted;
}
}
// Fallback for unknown types
return {
const fallback = {
title: String(error),
description: undefined,
isStructured: false,
};
return getInvalidOriginGuidance(fallback.title, fallback.description) || fallback;
}
// Enhanced toast helper that parses structured error messages
@@ -248,6 +280,8 @@ export const getStatusColor = (status: string): string => {
return "bg-orange-500"; // Deleting
case "deleted":
return "bg-gray-600"; // Deleted
case "pending-approval":
return "bg-amber-500"; // Needs manual approval
default:
return "bg-gray-400"; // Unknown/neutral
}

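The origin-extraction step inside `getInvalidOriginGuidance` above is worth isolating: it finds the first URL in the error text and normalizes it via `URL#origin` so paths and query strings don't leak into the guidance. A standalone sketch of just that step:

```typescript
// Extract the first http(s) URL from an error message and reduce it
// to its origin (scheme://host[:port]); fall back to the raw match if
// URL parsing fails, or to a generic hint if no URL is present.
function extractOriginHint(message: string): string {
  const urlMatch = message.match(/https?:\/\/[^\s'")]+/i);
  if (!urlMatch) return "this URL";
  try {
    // URL#origin drops any path and query, keeping only the origin
    return new URL(urlMatch[0]).origin;
  } catch {
    return urlMatch[0];
  }
}
```

This is why the toast tests above can assert that `http://10.10.20.45:4321` appears verbatim: a non-default port is part of the origin and survives normalization.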
View File

@@ -93,6 +93,11 @@ export async function createDefaultConfig({ userId, envOverrides = {} }: Default
forkStrategy: "reference",
issueConcurrency: 3,
pullRequestConcurrency: 5,
backupStrategy: "on-force-push",
backupBeforeSync: true, // Deprecated: kept for backward compat
backupRetentionCount: 20,
backupDirectory: "data/repo-backups",
blockSyncOnBackupFailure: true,
},
include: [],
exclude: [],

View File

@@ -56,6 +56,7 @@ export function mapUiToDbConfig(
// Advanced options
starredCodeOnly: advancedOptions.starredCodeOnly,
autoMirrorStarred: advancedOptions.autoMirrorStarred ?? false,
};
// Map Gitea config to match database schema
@@ -100,6 +101,11 @@ export function mapUiToDbConfig(
mirrorPullRequests: mirrorOptions.mirrorMetadata && mirrorOptions.metadataComponents.pullRequests,
mirrorLabels: mirrorOptions.mirrorMetadata && mirrorOptions.metadataComponents.labels,
mirrorMilestones: mirrorOptions.mirrorMetadata && mirrorOptions.metadataComponents.milestones,
backupStrategy: giteaConfig.backupStrategy,
backupBeforeSync: giteaConfig.backupBeforeSync ?? true,
backupRetentionCount: giteaConfig.backupRetentionCount ?? 20,
backupDirectory: giteaConfig.backupDirectory?.trim() || undefined,
blockSyncOnBackupFailure: giteaConfig.blockSyncOnBackupFailure ?? true,
};
return {
@@ -140,6 +146,11 @@ export function mapDbToUiConfig(dbConfig: any): {
personalReposOrg: undefined, // Not stored in current schema
issueConcurrency: dbConfig.giteaConfig?.issueConcurrency ?? 3,
pullRequestConcurrency: dbConfig.giteaConfig?.pullRequestConcurrency ?? 5,
backupStrategy: dbConfig.giteaConfig?.backupStrategy || undefined,
backupBeforeSync: dbConfig.giteaConfig?.backupBeforeSync ?? true,
backupRetentionCount: dbConfig.giteaConfig?.backupRetentionCount ?? 20,
backupDirectory: dbConfig.giteaConfig?.backupDirectory || "data/repo-backups",
blockSyncOnBackupFailure: dbConfig.giteaConfig?.blockSyncOnBackupFailure ?? true,
};
// Map mirror options from various database fields
@@ -162,6 +173,7 @@ export function mapDbToUiConfig(dbConfig: any): {
skipForks: !(dbConfig.githubConfig?.includeForks ?? true), // Invert includeForks to get skipForks
// Support both old (skipStarredIssues) and new (starredCodeOnly) field names for backward compatibility
starredCodeOnly: dbConfig.githubConfig?.starredCodeOnly ?? (dbConfig.githubConfig as any)?.skipStarredIssues ?? false,
autoMirrorStarred: dbConfig.githubConfig?.autoMirrorStarred ?? false,
};
return {

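The mapper hunks above mix two defaulting styles deliberately: `??` for booleans (so an explicit `false` survives) and `|| undefined` after `trim()` for paths (so a blank string collapses to unset). A small sketch of that distinction, using just the backup fields:

```typescript
// Sketch of the two defaulting styles used in the config mapper above.
function mapBackupOptions(ui: {
  backupBeforeSync?: boolean;
  backupDirectory?: string;
}) {
  return {
    // `??` only falls back on null/undefined, so `false` is preserved
    backupBeforeSync: ui.backupBeforeSync ?? true,
    // `||` also treats "" as unset, so a whitespace-only path collapses
    backupDirectory: ui.backupDirectory?.trim() || undefined,
  };
}
```

Using `??` for the directory instead would store an empty string as a real value; using `||` for the boolean would silently flip an explicit `false` back to `true`.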
View File

@@ -0,0 +1,319 @@
import { describe, expect, it, mock } from "bun:test";
import {
detectForcePush,
fetchGitHubBranches,
checkAncestry,
type BranchInfo,
} from "./force-push-detection";
// ---- Helpers ----
function makeOctokit(overrides: Record<string, any> = {}) {
return {
repos: {
listBranches: mock(() => Promise.resolve({ data: [] })),
compareCommits: mock(() =>
Promise.resolve({ data: { status: "ahead" } }),
),
...overrides.repos,
},
paginate: mock(async (_method: any, params: any) => {
// Default: return whatever the test wired into _githubBranches
return overrides._githubBranches ?? [];
}),
...overrides,
} as any;
}
// ---- fetchGitHubBranches ----
describe("fetchGitHubBranches", () => {
it("maps Octokit paginated response to BranchInfo[]", async () => {
const octokit = makeOctokit({
_githubBranches: [
{ name: "main", commit: { sha: "aaa" } },
{ name: "dev", commit: { sha: "bbb" } },
],
});
const result = await fetchGitHubBranches({
octokit,
owner: "user",
repo: "repo",
});
expect(result).toEqual([
{ name: "main", sha: "aaa" },
{ name: "dev", sha: "bbb" },
]);
});
});
// ---- checkAncestry ----
describe("checkAncestry", () => {
it("returns true for fast-forward (ahead)", async () => {
const octokit = makeOctokit({
repos: {
compareCommits: mock(() =>
Promise.resolve({ data: { status: "ahead" } }),
),
},
});
const result = await checkAncestry({
octokit,
owner: "user",
repo: "repo",
baseSha: "old",
headSha: "new",
});
expect(result).toBe(true);
});
it("returns true for identical", async () => {
const octokit = makeOctokit({
repos: {
compareCommits: mock(() =>
Promise.resolve({ data: { status: "identical" } }),
),
},
});
const result = await checkAncestry({
octokit,
owner: "user",
repo: "repo",
baseSha: "same",
headSha: "same",
});
expect(result).toBe(true);
});
it("returns false for diverged", async () => {
const octokit = makeOctokit({
repos: {
compareCommits: mock(() =>
Promise.resolve({ data: { status: "diverged" } }),
),
},
});
const result = await checkAncestry({
octokit,
owner: "user",
repo: "repo",
baseSha: "old",
headSha: "new",
});
expect(result).toBe(false);
});
it("returns false when API returns 404 (old SHA gone)", async () => {
const error404 = Object.assign(new Error("Not Found"), { status: 404 });
const octokit = makeOctokit({
repos: {
compareCommits: mock(() => Promise.reject(error404)),
},
});
const result = await checkAncestry({
octokit,
owner: "user",
repo: "repo",
baseSha: "gone",
headSha: "new",
});
expect(result).toBe(false);
});
it("throws on transient errors (fail-open for caller)", async () => {
const error500 = Object.assign(new Error("Internal Server Error"), { status: 500 });
const octokit = makeOctokit({
repos: {
compareCommits: mock(() => Promise.reject(error500)),
},
});
expect(
checkAncestry({
octokit,
owner: "user",
repo: "repo",
baseSha: "old",
headSha: "new",
}),
).rejects.toThrow("Internal Server Error");
});
});
// ---- detectForcePush ----
// Uses _deps injection to avoid fragile global fetch mocking.
describe("detectForcePush", () => {
const baseArgs = {
giteaUrl: "https://gitea.example.com",
giteaToken: "tok",
giteaOwner: "org",
giteaRepo: "repo",
githubOwner: "user",
githubRepo: "repo",
};
function makeDeps(overrides: {
giteaBranches?: BranchInfo[] | Error;
githubBranches?: BranchInfo[] | Error;
ancestryResult?: boolean;
} = {}) {
return {
fetchGiteaBranches: mock(async () => {
if (overrides.giteaBranches instanceof Error) throw overrides.giteaBranches;
return overrides.giteaBranches ?? [];
}) as any,
fetchGitHubBranches: mock(async () => {
if (overrides.githubBranches instanceof Error) throw overrides.githubBranches;
return overrides.githubBranches ?? [];
}) as any,
checkAncestry: mock(async () => overrides.ancestryResult ?? true) as any,
};
}
const dummyOctokit = {} as any;
it("skips when Gitea has no branches (first mirror)", async () => {
const deps = makeDeps({ giteaBranches: [] });
const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });
expect(result.detected).toBe(false);
expect(result.skipped).toBe(true);
expect(result.skipReason).toContain("No Gitea branches");
});
it("returns no detection when all SHAs match", async () => {
const deps = makeDeps({
giteaBranches: [
{ name: "main", sha: "aaa" },
{ name: "dev", sha: "bbb" },
],
githubBranches: [
{ name: "main", sha: "aaa" },
{ name: "dev", sha: "bbb" },
],
});
const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });
expect(result.detected).toBe(false);
expect(result.skipped).toBe(false);
expect(result.affectedBranches).toHaveLength(0);
});
it("detects deleted branch", async () => {
const deps = makeDeps({
giteaBranches: [
{ name: "main", sha: "aaa" },
{ name: "old-branch", sha: "ccc" },
],
githubBranches: [{ name: "main", sha: "aaa" }],
});
const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });
expect(result.detected).toBe(true);
expect(result.affectedBranches).toHaveLength(1);
expect(result.affectedBranches[0]).toEqual({
name: "old-branch",
reason: "deleted",
giteaSha: "ccc",
githubSha: null,
});
});
it("returns no detection for fast-forward", async () => {
const deps = makeDeps({
giteaBranches: [{ name: "main", sha: "old-sha" }],
githubBranches: [{ name: "main", sha: "new-sha" }],
ancestryResult: true, // fast-forward
});
const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });
expect(result.detected).toBe(false);
expect(result.affectedBranches).toHaveLength(0);
});
it("detects diverged branch", async () => {
const deps = makeDeps({
giteaBranches: [{ name: "main", sha: "old-sha" }],
githubBranches: [{ name: "main", sha: "rewritten-sha" }],
ancestryResult: false, // diverged
});
const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });
expect(result.detected).toBe(true);
expect(result.affectedBranches).toHaveLength(1);
expect(result.affectedBranches[0]).toEqual({
name: "main",
reason: "diverged",
giteaSha: "old-sha",
githubSha: "rewritten-sha",
});
});
it("detects force-push when ancestry check fails (old SHA gone)", async () => {
const deps = makeDeps({
giteaBranches: [{ name: "main", sha: "old-sha" }],
githubBranches: [{ name: "main", sha: "new-sha" }],
ancestryResult: false, // checkAncestry returns false on error
});
const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });
expect(result.detected).toBe(true);
expect(result.affectedBranches).toHaveLength(1);
expect(result.affectedBranches[0].reason).toBe("diverged");
});
it("skips when Gitea API returns 404", async () => {
const { HttpError } = await import("@/lib/http-client");
const deps = makeDeps({
giteaBranches: new HttpError("not found", 404, "Not Found"),
});
const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });
expect(result.detected).toBe(false);
expect(result.skipped).toBe(true);
expect(result.skipReason).toContain("not found");
});
it("skips when Gitea API returns server error", async () => {
const deps = makeDeps({
giteaBranches: new Error("HTTP 500: internal error"),
});
const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });
expect(result.detected).toBe(false);
expect(result.skipped).toBe(true);
expect(result.skipReason).toContain("Failed to fetch Gitea branches");
});
it("skips when GitHub API fails", async () => {
const deps = makeDeps({
giteaBranches: [{ name: "main", sha: "aaa" }],
githubBranches: new Error("rate limited"),
});
const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });
expect(result.detected).toBe(false);
expect(result.skipped).toBe(true);
expect(result.skipReason).toContain("Failed to fetch GitHub branches");
});
});

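The tests above rely on the `_deps` injection seam rather than mocking globals or module imports. The pattern reduces to: a function takes an optional deps object, defaults to the real implementations, and tests pass stubs. A minimal sketch (names here are illustrative, not the real module's API):

```typescript
// Illustrative deps shape — tests supply stubs, production omits _deps.
interface Deps {
  fetchBranches: (repo: string) => string[];
}

const realDeps: Deps = {
  fetchBranches: () => {
    // Real network call elided in this sketch.
    throw new Error("network not available in this sketch");
  },
};

// Production callers pass nothing; tests inject deterministic stubs,
// so no global fetch or module-level mocking is needed.
function countBranches(repo: string, _deps?: Deps): number {
  const deps = _deps ?? realDeps;
  return deps.fetchBranches(repo).length;
}
```

This is why the tests can exercise 404, 500, and rate-limit paths without touching `global.fetch`.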
View File

@@ -0,0 +1,286 @@
/**
* Force-push detection module.
*
* Compares branch SHAs between a Gitea mirror and GitHub source to detect
* branches that were deleted, rewritten, or force-pushed.
*
* **Fail-open**: If detection itself fails (API errors, rate limits, etc.),
* the result indicates no force-push so sync proceeds normally. Detection
* should never block sync due to its own failure.
*/
import type { Octokit } from "@octokit/rest";
import { httpGet, HttpError } from "@/lib/http-client";
// ---- Types ----
export interface BranchInfo {
name: string;
sha: string;
}
export type ForcePushReason = "deleted" | "diverged" | "non-fast-forward";
export interface AffectedBranch {
name: string;
reason: ForcePushReason;
giteaSha: string;
githubSha: string | null; // null when branch was deleted
}
export interface ForcePushDetectionResult {
detected: boolean;
affectedBranches: AffectedBranch[];
/** True when detection could not run (API error, etc.) */
skipped: boolean;
skipReason?: string;
}
const NO_FORCE_PUSH: ForcePushDetectionResult = {
detected: false,
affectedBranches: [],
skipped: false,
};
function skippedResult(reason: string): ForcePushDetectionResult {
return {
detected: false,
affectedBranches: [],
skipped: true,
skipReason: reason,
};
}
// ---- Branch fetching ----
/**
* Fetch all branches from a Gitea repository (paginated).
*/
export async function fetchGiteaBranches({
giteaUrl,
giteaToken,
owner,
repo,
}: {
giteaUrl: string;
giteaToken: string;
owner: string;
repo: string;
}): Promise<BranchInfo[]> {
const branches: BranchInfo[] = [];
let page = 1;
const perPage = 50;
while (true) {
const url = `${giteaUrl}/api/v1/repos/${owner}/${repo}/branches?page=${page}&limit=${perPage}`;
const response = await httpGet<Array<{ name: string; commit: { id: string } }>>(
url,
{ Authorization: `token ${giteaToken}` },
);
if (!Array.isArray(response.data) || response.data.length === 0) break;
for (const b of response.data) {
branches.push({ name: b.name, sha: b.commit.id });
}
if (response.data.length < perPage) break;
page++;
}
return branches;
}
/**
* Fetch all branches from a GitHub repository (paginated via Octokit).
*/
export async function fetchGitHubBranches({
octokit,
owner,
repo,
}: {
octokit: Octokit;
owner: string;
repo: string;
}): Promise<BranchInfo[]> {
const data = await octokit.paginate(octokit.repos.listBranches, {
owner,
repo,
per_page: 100,
});
return data.map((b) => ({ name: b.name, sha: b.commit.sha }));
}
/**
* Check whether the transition from `baseSha` to `headSha` on the same branch
* is a fast-forward (i.e. `baseSha` is an ancestor of `headSha`).
*
* Returns `true` when the change is safe (fast-forward) and `false` when it
 * is a confirmed force-push (404/422 = old SHA no longer reachable on GitHub).
*
* Throws on transient errors (rate limits, network issues) so the caller
* can decide how to handle them (fail-open: skip that branch).
*/
export async function checkAncestry({
octokit,
owner,
repo,
baseSha,
headSha,
}: {
octokit: Octokit;
owner: string;
repo: string;
baseSha: string;
headSha: string;
}): Promise<boolean> {
try {
const { data } = await octokit.repos.compareCommits({
owner,
repo,
base: baseSha,
head: headSha,
});
// "ahead" means headSha is strictly ahead of baseSha → fast-forward.
// "behind" or "diverged" means the branch was rewritten.
return data.status === "ahead" || data.status === "identical";
} catch (error: any) {
// 404 / 422 = old SHA no longer exists on GitHub → confirmed force-push.
if (error?.status === 404 || error?.status === 422) {
return false;
}
// Any other error (rate limit, network) → rethrow so caller can
// handle it as fail-open (skip branch) rather than false-positive.
throw error;
}
}
// ---- Main detection ----
/**
* Compare branch SHAs between Gitea and GitHub to detect force-pushes.
*
* The function is intentionally fail-open: any error during detection returns
* a "skipped" result so that sync can proceed normally.
*/
export async function detectForcePush({
giteaUrl,
giteaToken,
giteaOwner,
giteaRepo,
octokit,
githubOwner,
githubRepo,
_deps,
}: {
giteaUrl: string;
giteaToken: string;
giteaOwner: string;
giteaRepo: string;
octokit: Octokit;
githubOwner: string;
githubRepo: string;
/** @internal — test-only dependency injection */
_deps?: {
fetchGiteaBranches: typeof fetchGiteaBranches;
fetchGitHubBranches: typeof fetchGitHubBranches;
checkAncestry: typeof checkAncestry;
};
}): Promise<ForcePushDetectionResult> {
const deps = _deps ?? { fetchGiteaBranches, fetchGitHubBranches, checkAncestry };
// 1. Fetch Gitea branches
let giteaBranches: BranchInfo[];
try {
giteaBranches = await deps.fetchGiteaBranches({
giteaUrl,
giteaToken,
owner: giteaOwner,
repo: giteaRepo,
});
} catch (error) {
// Gitea 404 = repo not yet mirrored, skip detection
if (error instanceof HttpError && error.status === 404) {
return skippedResult("Gitea repository not found (first mirror?)");
}
return skippedResult(
`Failed to fetch Gitea branches: ${error instanceof Error ? error.message : String(error)}`,
);
}
// First-time mirror: no Gitea branches → nothing to compare
if (giteaBranches.length === 0) {
return skippedResult("No Gitea branches found (first mirror?)");
}
// 2. Fetch GitHub branches
let githubBranches: BranchInfo[];
try {
githubBranches = await deps.fetchGitHubBranches({
octokit,
owner: githubOwner,
repo: githubRepo,
});
} catch (error) {
return skippedResult(
`Failed to fetch GitHub branches: ${error instanceof Error ? error.message : String(error)}`,
);
}
const githubBranchMap = new Map(githubBranches.map((b) => [b.name, b.sha]));
// 3. Compare each Gitea branch against GitHub
const affected: AffectedBranch[] = [];
for (const giteaBranch of giteaBranches) {
const githubSha = githubBranchMap.get(giteaBranch.name);
if (githubSha === undefined) {
// Branch was deleted on GitHub
affected.push({
name: giteaBranch.name,
reason: "deleted",
giteaSha: giteaBranch.sha,
githubSha: null,
});
continue;
}
// Same SHA → no change
if (githubSha === giteaBranch.sha) continue;
// SHAs differ → check if it's a fast-forward
try {
const isFastForward = await deps.checkAncestry({
octokit,
owner: githubOwner,
repo: githubRepo,
baseSha: giteaBranch.sha,
headSha: githubSha,
});
if (!isFastForward) {
affected.push({
name: giteaBranch.name,
reason: "diverged",
giteaSha: giteaBranch.sha,
githubSha,
});
}
} catch {
// Individual branch check failure → skip that branch (fail-open)
continue;
}
}
if (affected.length === 0) {
return NO_FORCE_PUSH;
}
return {
detected: true,
affectedBranches: affected,
skipped: false,
};
}

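The per-branch loop in `detectForcePush` above implements a small decision table: missing on GitHub means deleted, equal SHAs mean unchanged, and differing SHAs are split by the ancestry check into fast-forward versus diverged. A pure sketch of that classification, with the ancestry check passed in as a predicate so it stays synchronous and self-contained:

```typescript
type BranchState = "deleted" | "unchanged" | "fast-forward" | "diverged";

// Classify one Gitea branch against its GitHub counterpart.
// `githubSha` is undefined when the branch no longer exists on GitHub.
function classifyBranch(
  giteaSha: string,
  githubSha: string | undefined,
  isAncestor: (base: string, head: string) => boolean,
): BranchState {
  if (githubSha === undefined) return "deleted";   // branch gone on GitHub
  if (githubSha === giteaSha) return "unchanged";  // nothing to do
  // SHAs differ: a fast-forward is safe; anything else was rewritten
  return isAncestor(giteaSha, githubSha) ? "fast-forward" : "diverged";
}
```

Only `deleted` and `diverged` end up in `affectedBranches`; `unchanged` and `fast-forward` are the normal sync cases.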
View File

@@ -2,28 +2,13 @@ import type { APIRoute } from "astro";
import { db, mirrorJobs, events } from "@/lib/db";
import { eq, count } from "drizzle-orm";
import { createSecureErrorResponse } from "@/lib/utils";
import { requireAuthenticatedUserId } from "@/lib/auth-guards";
export const POST: APIRoute = async ({ request }) => {
export const POST: APIRoute = async ({ request, locals }) => {
try {
let body;
try {
body = await request.json();
} catch (jsonError) {
console.error("Invalid JSON in request body:", jsonError);
return new Response(
JSON.stringify({ error: "Invalid JSON in request body." }),
{ status: 400, headers: { "Content-Type": "application/json" } }
);
}
const { userId } = body || {};
if (!userId) {
return new Response(
JSON.stringify({ error: "Missing 'userId' in request body." }),
{ status: 400, headers: { "Content-Type": "application/json" } }
);
}
const authResult = await requireAuthenticatedUserId({ request, locals });
if ("response" in authResult) return authResult.response;
const userId = authResult.userId;
// Start a transaction to ensure all operations succeed or fail together
const result = await db.transaction(async (tx) => {

View File

@@ -1,21 +1,16 @@
import type { APIRoute } from "astro";
import { db, mirrorJobs, configs } from "@/lib/db";
import { db, mirrorJobs } from "@/lib/db";
import { eq, sql } from "drizzle-orm";
import { createSecureErrorResponse } from "@/lib/utils";
import type { MirrorJob } from "@/lib/db/schema";
import { repoStatusEnum } from "@/types/Repository";
import { requireAuthenticatedUserId } from "@/lib/auth-guards";
export const GET: APIRoute = async ({ url }) => {
export const GET: APIRoute = async ({ request, locals }) => {
try {
const searchParams = new URL(url).searchParams;
const userId = searchParams.get("userId");
if (!userId) {
return new Response(
JSON.stringify({ error: "Missing 'userId' in query parameters." }),
{ status: 400, headers: { "Content-Type": "application/json" } }
);
}
const authResult = await requireAuthenticatedUserId({ request, locals });
if ("response" in authResult) return authResult.response;
const userId = authResult.userId;
// Fetch mirror jobs associated with the user
const jobs = await db

View File

@@ -2,7 +2,6 @@ import type { APIRoute } from "astro";
import { db, configs, users } from "@/lib/db";
import { v4 as uuidv4 } from "uuid";
import { eq } from "drizzle-orm";
import { calculateCleanupInterval } from "@/lib/cleanup-service";
import { createSecureErrorResponse } from "@/lib/utils";
import {
mapUiToDbConfig,
@@ -12,20 +11,25 @@ import {
mapDbScheduleToUi,
mapDbCleanupToUi
} from "@/lib/utils/config-mapper";
import { encrypt, decrypt, migrateToken } from "@/lib/utils/encryption";
import { encrypt, decrypt } from "@/lib/utils/encryption";
import { createDefaultConfig } from "@/lib/utils/config-defaults";
import { requireAuthenticatedUserId } from "@/lib/auth-guards";
export const POST: APIRoute = async ({ request }) => {
export const POST: APIRoute = async ({ request, locals }) => {
try {
const body = await request.json();
const { userId, githubConfig, giteaConfig, scheduleConfig, cleanupConfig, mirrorOptions, advancedOptions } = body;
const authResult = await requireAuthenticatedUserId({ request, locals });
if ("response" in authResult) return authResult.response;
const userId = authResult.userId;
if (!userId || !githubConfig || !giteaConfig || !scheduleConfig || !cleanupConfig || !mirrorOptions || !advancedOptions) {
const body = await request.json();
const { githubConfig, giteaConfig, scheduleConfig, cleanupConfig, mirrorOptions, advancedOptions } = body;
if (!githubConfig || !giteaConfig || !scheduleConfig || !cleanupConfig || !mirrorOptions || !advancedOptions) {
return new Response(
JSON.stringify({
success: false,
message:
"userId, githubConfig, giteaConfig, scheduleConfig, cleanupConfig, mirrorOptions, and advancedOptions are required.",
"githubConfig, giteaConfig, scheduleConfig, cleanupConfig, mirrorOptions, and advancedOptions are required.",
}),
{
status: 400,
@@ -172,17 +176,11 @@ export const POST: APIRoute = async ({ request }) => {
}
};
export const GET: APIRoute = async ({ request }) => {
export const GET: APIRoute = async ({ request, locals }) => {
try {
const url = new URL(request.url);
const userId = url.searchParams.get("userId");
if (!userId) {
return new Response(JSON.stringify({ error: "User ID is required" }), {
status: 400,
headers: { "Content-Type": "application/json" },
});
}
const authResult = await requireAuthenticatedUserId({ request, locals });
if ("response" in authResult) return authResult.response;
const userId = authResult.userId;
// Fetch the configuration for the user
const config = await db

View File

@@ -3,24 +3,14 @@ import { db, repositories, organizations, mirrorJobs, configs } from "@/lib/db";
import { eq, count, and, sql, or } from "drizzle-orm";
import { jsonResponse, createSecureErrorResponse } from "@/lib/utils";
import type { DashboardApiResponse } from "@/types/dashboard";
import { repositoryVisibilityEnum, repoStatusEnum } from "@/types/Repository";
import { membershipRoleEnum } from "@/types/organizations";
export const GET: APIRoute = async ({ request }) => {
const url = new URL(request.url);
const userId = url.searchParams.get("userId");
if (!userId) {
return jsonResponse({
data: {
success: false,
error: "Missing userId",
},
status: 400,
});
}
import { requireAuthenticatedUserId } from "@/lib/auth-guards";
export const GET: APIRoute = async ({ request, locals }) => {
try {
const authResult = await requireAuthenticatedUserId({ request, locals });
if ("response" in authResult) return authResult.response;
const userId = authResult.userId;
const [
userRepos,
userOrgs,

View File

@@ -1,13 +1,11 @@
import type { APIRoute } from "astro";
import { getNewEvents } from "@/lib/events";
import { requireAuthenticatedUserId } from "@/lib/auth-guards";
export const GET: APIRoute = async ({ request }) => {
const url = new URL(request.url);
const userId = url.searchParams.get("userId");
if (!userId) {
return new Response("Missing userId", { status: 400 });
}
export const GET: APIRoute = async ({ request, locals }) => {
const authResult = await requireAuthenticatedUserId({ request, locals });
if ("response" in authResult) return authResult.response;
const userId = authResult.userId;
// Create a new ReadableStream for SSE
const stream = new ReadableStream({
@@ -66,4 +64,4 @@ export const GET: APIRoute = async ({ request }) => {
"X-Accel-Buffering": "no", // Disable nginx buffering
},
});
};
};

View File

@@ -9,22 +9,14 @@ import {
import type { Organization } from "@/lib/db/schema";
import { repoStatusEnum } from "@/types/Repository";
import { jsonResponse, createSecureErrorResponse } from "@/lib/utils";
import { requireAuthenticatedUserId } from "@/lib/auth-guards";
export const GET: APIRoute = async ({ request }) => {
const url = new URL(request.url);
const userId = url.searchParams.get("userId");
if (!userId) {
return jsonResponse({
data: {
success: false,
error: "Missing userId",
},
status: 400,
});
}
export const GET: APIRoute = async ({ request, locals }) => {
try {
const authResult = await requireAuthenticatedUserId({ request, locals });
if ("response" in authResult) return authResult.response;
const userId = authResult.userId;
// Fetch the user's active configuration to respect filtering settings
const [config] = await db
.select()

View File

@@ -7,19 +7,14 @@ import {
type RepositoryApiResponse,
} from "@/types/Repository";
import { jsonResponse, createSecureErrorResponse } from "@/lib/utils";
import { requireAuthenticatedUserId } from "@/lib/auth-guards";
export const GET: APIRoute = async ({ request }) => {
const url = new URL(request.url);
const userId = url.searchParams.get("userId");
if (!userId) {
return jsonResponse({
data: { success: false, error: "Missing userId" },
status: 400,
});
}
export const GET: APIRoute = async ({ request, locals }) => {
try {
const authResult = await requireAuthenticatedUserId({ request, locals });
if ("response" in authResult) return authResult.response;
const userId = authResult.userId;
// Fetch the user's active configuration
const [config] = await db
.select()

View File

@@ -0,0 +1,202 @@
import type { APIRoute } from "astro";
import { db, configs, repositories } from "@/lib/db";
import { and, eq, inArray } from "drizzle-orm";
import { repositoryVisibilityEnum, repoStatusEnum } from "@/types/Repository";
import { syncGiteaRepoEnhanced } from "@/lib/gitea-enhanced";
import { createSecureErrorResponse } from "@/lib/utils";
import { requireAuthenticatedUserId } from "@/lib/auth-guards";
import { createPreSyncBundleBackup } from "@/lib/repo-backup";
import { decryptConfigTokens } from "@/lib/utils/config-encryption";
import type { Config } from "@/types/config";
import { createMirrorJob } from "@/lib/helpers";
interface ApproveSyncRequest {
repositoryIds: string[];
action: "approve" | "dismiss";
}
export const POST: APIRoute = async ({ request, locals }) => {
try {
const authResult = await requireAuthenticatedUserId({ request, locals });
if ("response" in authResult) return authResult.response;
const userId = authResult.userId;
const body: ApproveSyncRequest = await request.json();
const { repositoryIds, action } = body;
    if (!repositoryIds || !Array.isArray(repositoryIds) || repositoryIds.length === 0) {
      return new Response(
        JSON.stringify({ success: false, message: "repositoryIds are required." }),
        { status: 400, headers: { "Content-Type": "application/json" } },
      );
    }

    if (action !== "approve" && action !== "dismiss") {
      return new Response(
        JSON.stringify({ success: false, message: "action must be 'approve' or 'dismiss'." }),
        { status: 400, headers: { "Content-Type": "application/json" } },
      );
    }

    // Fetch config
    const configResult = await db
      .select()
      .from(configs)
      .where(eq(configs.userId, userId))
      .limit(1);
    const config = configResult[0];

    if (!config) {
      return new Response(
        JSON.stringify({ success: false, message: "No configuration found." }),
        { status: 400, headers: { "Content-Type": "application/json" } },
      );
    }

    // Fetch repos — only those in pending-approval status
    const repos = await db
      .select()
      .from(repositories)
      .where(
        and(
          eq(repositories.userId, userId),
          eq(repositories.status, "pending-approval"),
          inArray(repositories.id, repositoryIds),
        ),
      );

    if (!repos.length) {
      return new Response(
        JSON.stringify({ success: false, message: "No pending-approval repositories found for the given IDs." }),
        { status: 404, headers: { "Content-Type": "application/json" } },
      );
    }

    if (action === "dismiss") {
      // Reset status to "synced" so repos resume normal schedule
      for (const repo of repos) {
        await db
          .update(repositories)
          .set({
            status: "synced",
            errorMessage: null,
            updatedAt: new Date(),
          })
          .where(eq(repositories.id, repo.id));

        await createMirrorJob({
          userId,
          repositoryId: repo.id,
          repositoryName: repo.name,
          message: `Force-push alert dismissed for ${repo.name}`,
          details: "User dismissed the force-push alert. Repository will resume normal sync schedule.",
          status: "synced",
        });
      }

      return new Response(
        JSON.stringify({
          success: true,
          message: `Dismissed ${repos.length} repository alert(s).`,
          repositories: repos.map((repo) => ({
            ...repo,
            status: "synced",
            errorMessage: null,
          })),
        }),
        { status: 200, headers: { "Content-Type": "application/json" } },
      );
    }

    // action === "approve": create backup first (safety), then trigger sync
    const decryptedConfig = decryptConfigTokens(config as unknown as Config);

    // Process in background
    setTimeout(async () => {
      for (const repo of repos) {
        try {
          const { getGiteaRepoOwnerAsync } = await import("@/lib/gitea");
          const repoOwner = await getGiteaRepoOwnerAsync({ config, repository: repo });

          // Always create a backup before approved sync for safety
          const cloneUrl = `${config.giteaConfig.url.replace(/\/$/, "")}/${repoOwner}/${repo.name}.git`;
          try {
            const backupResult = await createPreSyncBundleBackup({
              config,
              owner: repoOwner,
              repoName: repo.name,
              cloneUrl,
              force: true, // Bypass legacy gate — approval implies backup
            });
            await createMirrorJob({
              userId,
              repositoryId: repo.id,
              repositoryName: repo.name,
              message: `Safety snapshot created for ${repo.name}`,
              details: `Pre-approval snapshot at ${backupResult.bundlePath}.`,
              status: "syncing",
            });
          } catch (backupError) {
            console.warn(
              `[ApproveSync] Backup failed for ${repo.name}, proceeding with sync: ${
                backupError instanceof Error ? backupError.message : String(backupError)
              }`,
            );
          }

          // Trigger sync — skip detection to avoid re-blocking
          const repoData = {
            ...repo,
            status: repoStatusEnum.parse("syncing"),
            organization: repo.organization ?? undefined,
            lastMirrored: repo.lastMirrored ?? undefined,
            errorMessage: repo.errorMessage ?? undefined,
            forkedFrom: repo.forkedFrom ?? undefined,
            visibility: repositoryVisibilityEnum.parse(repo.visibility),
            mirroredLocation: repo.mirroredLocation || "",
          };

          await syncGiteaRepoEnhanced({
            config,
            repository: repoData,
            skipForcePushDetection: true,
          });

          console.log(`[ApproveSync] Sync completed for approved repository: ${repo.name}`);
        } catch (error) {
          console.error(
            `[ApproveSync] Failed to sync approved repository ${repo.name}:`,
            error,
          );
        }
      }
    }, 0);

    // Immediately update status to syncing for responsiveness
    for (const repo of repos) {
      await db
        .update(repositories)
        .set({
          status: "syncing",
          errorMessage: null,
          updatedAt: new Date(),
        })
        .where(eq(repositories.id, repo.id));
    }

    return new Response(
      JSON.stringify({
        success: true,
        message: `Approved sync for ${repos.length} repository(ies). Backup + sync started.`,
        repositories: repos.map((repo) => ({
          ...repo,
          status: "syncing",
          errorMessage: null,
        })),
      }),
      { status: 200, headers: { "Content-Type": "application/json" } },
    );
  } catch (error) {
    return createSecureErrorResponse(error, "approve-sync", 500);
  }
};

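The handler above performs two body checks before touching the database. The same validation, extracted as a pure function for illustration (the field names come from the handler; the function itself is hypothetical, not part of the codebase):

```typescript
// Hypothetical sketch of the approve-sync request-body validation,
// mirroring the two 400 checks in the handler above.
type ApproveSyncBody = { repositoryIds?: unknown; action?: unknown };

function validateApproveSyncBody(body: ApproveSyncBody): string | null {
  const { repositoryIds, action } = body;
  if (!repositoryIds || !Array.isArray(repositoryIds) || repositoryIds.length === 0) {
    return "repositoryIds are required.";
  }
  if (action !== "approve" && action !== "dismiss") {
    return "action must be 'approve' or 'dismiss'.";
  }
  return null; // body is acceptable; the handler proceeds to fetch config
}

console.log(validateApproveSyncBody({ repositoryIds: [], action: "approve" }));
console.log(validateApproveSyncBody({ repositoryIds: ["id-1"], action: "dismiss" }));
```

Note that "dismiss" deliberately resets the repo to "synced" rather than deleting it, so a dismissed force-push alert simply re-enters the normal schedule.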
View File

@@ -1,7 +1,7 @@
 import type { APIRoute } from "astro";
 import type { MirrorOrgRequest, MirrorOrgResponse } from "@/types/mirror";
 import { db, configs, organizations } from "@/lib/db";
-import { eq, inArray } from "drizzle-orm";
+import { and, eq, inArray } from "drizzle-orm";
 import { createGitHubClient } from "@/lib/github";
 import { mirrorGitHubOrgToGitea } from "@/lib/gitea";
 import { repoStatusEnum } from "@/types/Repository";
@@ -10,17 +10,22 @@ import { createSecureErrorResponse } from "@/lib/utils";
 import { processWithResilience } from "@/lib/utils/concurrency";
 import { v4 as uuidv4 } from "uuid";
 import { getDecryptedGitHubToken } from "@/lib/utils/config-encryption";
+import { requireAuthenticatedUserId } from "@/lib/auth-guards";
-export const POST: APIRoute = async ({ request }) => {
+export const POST: APIRoute = async ({ request, locals }) => {
   try {
-    const body: MirrorOrgRequest = await request.json();
-    const { userId, organizationIds } = body;
+    const authResult = await requireAuthenticatedUserId({ request, locals });
+    if ("response" in authResult) return authResult.response;
+    const userId = authResult.userId;
-    if (!userId || !organizationIds || !Array.isArray(organizationIds)) {
+    const body: MirrorOrgRequest = await request.json();
+    const { organizationIds } = body;
+    if (!organizationIds || !Array.isArray(organizationIds)) {
       return new Response(
         JSON.stringify({
           success: false,
-          message: "userId and organizationIds are required.",
+          message: "organizationIds are required.",
         }),
         { status: 400, headers: { "Content-Type": "application/json" } }
       );
@@ -56,7 +61,12 @@ export const POST: APIRoute = async ({ request }) => {
     const orgs = await db
       .select()
       .from(organizations)
-      .where(inArray(organizations.id, organizationIds));
+      .where(
+        and(
+          eq(organizations.userId, userId),
+          inArray(organizations.id, organizationIds)
+        )
+      );
     if (!orgs.length) {
       return new Response(
View File

@@ -62,7 +62,13 @@ const mockRepositories = {};
 mock.module("@/lib/db", () => ({
   db: mockDb,
   configs: mockConfigs,
-  repositories: mockRepositories
+  repositories: mockRepositories,
+  users: {},
+  organizations: {},
+  mirrorJobs: {},
+  events: {},
+  accounts: {},
+  sessions: {}
 }));
 // Mock the gitea module
@@ -71,7 +77,10 @@ const mockMirrorGitHubOrgRepoToGiteaOrg = mock(() => Promise.resolve());
 mock.module("@/lib/gitea", () => ({
   mirrorGithubRepoToGitea: mockMirrorGithubRepoToGitea,
-  mirrorGitHubOrgRepoToGiteaOrg: mockMirrorGitHubOrgRepoToGiteaOrg
+  mirrorGitHubOrgRepoToGiteaOrg: mockMirrorGitHubOrgRepoToGiteaOrg,
+  getGiteaRepoOwnerAsync: mock(() => Promise.resolve("test-owner")),
+  isRepoPresentInGitea: mock(() => Promise.resolve(true)),
+  syncGiteaRepo: mock(() => Promise.resolve({ success: true })),
 }));
 // Mock the github module
@@ -90,6 +99,7 @@ mock.module("@/lib/utils/concurrency", () => ({
 // Mock drizzle-orm
 mock.module("drizzle-orm", () => ({
+  and: mock(() => ({})),
   eq: mock(() => ({})),
   inArray: mock(() => ({}))
 }));
@@ -121,7 +131,7 @@ describe("Repository Mirroring API", () => {
     console.error = originalConsoleError;
   });
-  test("returns 400 if userId is missing", async () => {
+  test("returns 401 when request is unauthenticated", async () => {
     const request = new Request("http://localhost/api/job/mirror-repo", {
       method: "POST",
       headers: {
@@ -134,11 +144,11 @@ describe("Repository Mirroring API", () => {
     const response = await POST({ request } as any);
-    expect(response.status).toBe(400);
+    expect(response.status).toBe(401);
     const data = await response.json();
     expect(data.success).toBe(false);
-    expect(data.message).toBe("userId and repositoryIds are required.");
+    expect(data.error).toBe("Unauthorized");
   });
   test("returns 400 if repositoryIds is missing", async () => {
@@ -152,13 +162,18 @@ describe("Repository Mirroring API", () => {
       })
     });
-    const response = await POST({ request } as any);
+    const response = await POST({
+      request,
+      locals: {
+        session: { userId: "user-id" },
+      },
+    } as any);
     expect(response.status).toBe(400);
     const data = await response.json();
     expect(data.success).toBe(false);
-    expect(data.message).toBe("userId and repositoryIds are required.");
+    expect(data.message).toBe("repositoryIds are required.");
   });
   test("returns 200 and starts mirroring repositories", async () => {
@@ -173,7 +188,12 @@ describe("Repository Mirroring API", () => {
       })
     });
-    const response = await POST({ request } as any);
+    const response = await POST({
+      request,
+      locals: {
+        session: { userId: "user-id" },
+      },
+    } as any);
     expect(response.status).toBe(200);

View File

@@ -1,7 +1,7 @@
 import type { APIRoute } from "astro";
 import type { MirrorRepoRequest, MirrorRepoResponse } from "@/types/mirror";
 import { db, configs, repositories } from "@/lib/db";
-import { eq, inArray } from "drizzle-orm";
+import { and, eq, inArray } from "drizzle-orm";
 import { repositoryVisibilityEnum, repoStatusEnum } from "@/types/Repository";
 import {
   mirrorGithubRepoToGitea,
@@ -12,17 +12,22 @@ import { createGitHubClient } from "@/lib/github";
 import { getDecryptedGitHubToken } from "@/lib/utils/config-encryption";
 import { processWithResilience } from "@/lib/utils/concurrency";
 import { createSecureErrorResponse } from "@/lib/utils";
+import { requireAuthenticatedUserId } from "@/lib/auth-guards";
-export const POST: APIRoute = async ({ request }) => {
+export const POST: APIRoute = async ({ request, locals }) => {
   try {
-    const body: MirrorRepoRequest = await request.json();
-    const { userId, repositoryIds } = body;
+    const authResult = await requireAuthenticatedUserId({ request, locals });
+    if ("response" in authResult) return authResult.response;
+    const userId = authResult.userId;
-    if (!userId || !repositoryIds || !Array.isArray(repositoryIds)) {
+    const body: MirrorRepoRequest = await request.json();
+    const { repositoryIds } = body;
+    if (!repositoryIds || !Array.isArray(repositoryIds)) {
       return new Response(
         JSON.stringify({
           success: false,
-          message: "userId and repositoryIds are required.",
+          message: "repositoryIds are required.",
         }),
         { status: 400, headers: { "Content-Type": "application/json" } }
      );
@@ -58,7 +63,12 @@ export const POST: APIRoute = async ({ request }) => {
     const repos = await db
       .select()
       .from(repositories)
-      .where(inArray(repositories.id, repositoryIds));
+      .where(
+        and(
+          eq(repositories.userId, userId),
+          inArray(repositories.id, repositoryIds)
+        )
+      );
     if (!repos.length) {
       return new Response(

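Every route in this change adopts the same guard pattern: call `requireAuthenticatedUserId`, bail out early if it returned a `response`, and otherwise trust its `userId` instead of a client-supplied one. A minimal sketch of the discriminated-union contract such a guard implies (the real implementation lives in `@/lib/auth-guards` and may differ; this stand-in is an assumption):

```typescript
// Illustrative stand-in for the auth-guard shape used by these routes.
// The real requireAuthenticatedUserId is in "@/lib/auth-guards"; this
// sketch only demonstrates the { userId } | { response } contract.
type GuardResult = { userId: string } | { response: Response };

function guardFromSession(session: { userId?: string } | null): GuardResult {
  if (!session?.userId) {
    // 401, not 400: the request is unauthenticated, not malformed
    return {
      response: new Response(JSON.stringify({ error: "Unauthorized" }), {
        status: 401,
        headers: { "Content-Type": "application/json" },
      }),
    };
  }
  return { userId: session.userId };
}

const ok = guardFromSession({ userId: "user-id" });
const denied = guardFromSession(null);
console.log("response" in ok, "response" in denied);
```

The `"response" in authResult` check narrows the union, which is why the handlers can use `authResult.userId` without a cast after the early return.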
View File

@@ -4,17 +4,22 @@ import { db, configs, repositories } from "@/lib/db";
 import { repositoryVisibilityEnum, repoStatusEnum } from "@/types/Repository";
 import type { ResetMetadataRequest, ResetMetadataResponse } from "@/types/reset-metadata";
 import { createSecureErrorResponse } from "@/lib/utils";
+import { requireAuthenticatedUserId } from "@/lib/auth-guards";
-export const POST: APIRoute = async ({ request }) => {
+export const POST: APIRoute = async ({ request, locals }) => {
   try {
-    const body: ResetMetadataRequest = await request.json();
-    const { userId, repositoryIds } = body;
+    const authResult = await requireAuthenticatedUserId({ request, locals });
+    if ("response" in authResult) return authResult.response;
+    const userId = authResult.userId;
-    if (!userId || !repositoryIds || !Array.isArray(repositoryIds)) {
+    const body: ResetMetadataRequest = await request.json();
+    const { repositoryIds } = body;
+    if (!repositoryIds || !Array.isArray(repositoryIds)) {
       return new Response(
         JSON.stringify({
           success: false,
-          message: "userId and repositoryIds are required.",
+          message: "repositoryIds are required.",
         }),
         { status: 400, headers: { "Content-Type": "application/json" } }
       );

View File

@@ -1,6 +1,6 @@
 import type { APIRoute } from "astro";
 import { db, configs, repositories } from "@/lib/db";
-import { eq, inArray } from "drizzle-orm";
+import { and, eq, inArray } from "drizzle-orm";
 import { getGiteaRepoOwnerAsync, isRepoPresentInGitea } from "@/lib/gitea";
 import {
   mirrorGithubRepoToGitea,
@@ -14,17 +14,22 @@ import { processWithRetry } from "@/lib/utils/concurrency";
 import { createMirrorJob } from "@/lib/helpers";
 import { createSecureErrorResponse } from "@/lib/utils";
 import { getDecryptedGitHubToken } from "@/lib/utils/config-encryption";
+import { requireAuthenticatedUserId } from "@/lib/auth-guards";
-export const POST: APIRoute = async ({ request }) => {
+export const POST: APIRoute = async ({ request, locals }) => {
   try {
-    const body: RetryRepoRequest = await request.json();
-    const { userId, repositoryIds } = body;
+    const authResult = await requireAuthenticatedUserId({ request, locals });
+    if ("response" in authResult) return authResult.response;
+    const userId = authResult.userId;
-    if (!userId || !repositoryIds || !Array.isArray(repositoryIds)) {
+    const body: RetryRepoRequest = await request.json();
+    const { repositoryIds } = body;
+    if (!repositoryIds || !Array.isArray(repositoryIds)) {
       return new Response(
         JSON.stringify({
           success: false,
-          message: "userId and repositoryIds are required.",
+          message: "repositoryIds are required.",
         }),
         { status: 400, headers: { "Content-Type": "application/json" } }
       );
@@ -60,7 +65,12 @@ export const POST: APIRoute = async ({ request }) => {
     const repos = await db
       .select()
       .from(repositories)
-      .where(inArray(repositories.id, repositoryIds));
+      .where(
+        and(
+          eq(repositories.userId, userId),
+          inArray(repositories.id, repositoryIds)
+        )
+      );
     if (!repos.length) {
       return new Response(

View File

@@ -1,6 +1,6 @@
 import type { APIRoute } from "astro";
 import { db, configs, repositories } from "@/lib/db";
-import { eq, or } from "drizzle-orm";
+import { and, eq, or } from "drizzle-orm";
 import { repoStatusEnum, repositoryVisibilityEnum } from "@/types/Repository";
 import { isRepoPresentInGitea, syncGiteaRepo } from "@/lib/gitea";
 import type {
@@ -9,22 +9,15 @@ import type {
 } from "@/types/sync";
 import { createSecureErrorResponse } from "@/lib/utils";
 import { parseInterval } from "@/lib/utils/duration-parser";
+import { requireAuthenticatedUserId } from "@/lib/auth-guards";
-export const POST: APIRoute = async ({ request }) => {
+export const POST: APIRoute = async ({ request, locals }) => {
   try {
-    const body: ScheduleSyncRepoRequest = await request.json();
-    const { userId } = body;
+    const authResult = await requireAuthenticatedUserId({ request, locals });
+    if ("response" in authResult) return authResult.response;
+    const userId = authResult.userId;
-    if (!userId) {
-      return new Response(
-        JSON.stringify({
-          success: false,
-          error: "Missing userId in request body.",
-          repositories: [],
-        } satisfies ScheduleSyncRepoResponse),
-        { status: 400, headers: { "Content-Type": "application/json" } }
-      );
-    }
+    await request.json().catch(() => ({} as ScheduleSyncRepoRequest));
     // Fetch config for the user
     const configResult = await db
@@ -51,12 +44,14 @@ export const POST: APIRoute = async ({ request }) => {
       .select()
       .from(repositories)
       .where(
-        eq(repositories.userId, userId) &&
+        and(
+          eq(repositories.userId, userId),
           or(
             eq(repositories.status, "mirrored"),
             eq(repositories.status, "synced"),
             eq(repositories.status, "failed")
           )
+        )
       );
     if (!repos.length) {

View File

@@ -1,23 +1,28 @@
 import type { APIRoute } from "astro";
 import type { MirrorRepoRequest } from "@/types/mirror";
 import { db, configs, repositories } from "@/lib/db";
-import { eq, inArray } from "drizzle-orm";
+import { and, eq, inArray } from "drizzle-orm";
 import { repositoryVisibilityEnum, repoStatusEnum } from "@/types/Repository";
 import { syncGiteaRepo } from "@/lib/gitea";
 import type { SyncRepoResponse } from "@/types/sync";
 import { processWithResilience } from "@/lib/utils/concurrency";
 import { createSecureErrorResponse } from "@/lib/utils";
+import { requireAuthenticatedUserId } from "@/lib/auth-guards";
-export const POST: APIRoute = async ({ request }) => {
+export const POST: APIRoute = async ({ request, locals }) => {
   try {
-    const body: MirrorRepoRequest = await request.json();
-    const { userId, repositoryIds } = body;
+    const authResult = await requireAuthenticatedUserId({ request, locals });
+    if ("response" in authResult) return authResult.response;
+    const userId = authResult.userId;
-    if (!userId || !repositoryIds || !Array.isArray(repositoryIds)) {
+    const body: MirrorRepoRequest = await request.json();
+    const { repositoryIds } = body;
+    if (!repositoryIds || !Array.isArray(repositoryIds)) {
       return new Response(
         JSON.stringify({
           success: false,
-          message: "userId and repositoryIds are required.",
+          message: "repositoryIds are required.",
         }),
         { status: 400, headers: { "Content-Type": "application/json" } }
       );
@@ -53,7 +58,12 @@ export const POST: APIRoute = async ({ request }) => {
     const repos = await db
       .select()
       .from(repositories)
-      .where(inArray(repositories.id, repositoryIds));
+      .where(
+        and(
+          eq(repositories.userId, userId),
+          inArray(repositories.id, repositoryIds)
+        )
+      );
     if (!repos.length) {
       return new Response(

View File

@@ -2,18 +2,23 @@ import type { APIContext } from "astro";
 import { db, organizations } from "@/lib/db";
 import { eq, and } from "drizzle-orm";
 import { createSecureErrorResponse } from "@/lib/utils";
+import { requireAuthenticatedUserId } from "@/lib/auth-guards";
-export async function PATCH({ params, request }: APIContext) {
+export async function PATCH({ params, request, locals }: APIContext) {
   try {
+    const authResult = await requireAuthenticatedUserId({ request, locals });
+    if ("response" in authResult) return authResult.response;
+    const userId = authResult.userId;
     const { id } = params;
     const body = await request.json();
-    const { status, userId } = body;
+    const { status } = body;
-    if (!id || !userId) {
+    if (!id) {
       return new Response(
         JSON.stringify({
           success: false,
-          error: "Organization ID and User ID are required",
+          error: "Organization ID is required",
         }),
         {
           status: 400,
@@ -78,4 +83,4 @@ export async function PATCH({ params, request }: APIContext) {
   } catch (error) {
     return createSecureErrorResponse(error);
   }
-}
+}

View File

@@ -6,19 +6,16 @@ import { RateLimitManager } from "@/lib/rate-limit-manager";
 import { createGitHubClient } from "@/lib/github";
 import { getDecryptedGitHubToken } from "@/lib/utils/config-encryption";
 import { configs } from "@/lib/db";
+import { requireAuthenticatedUserId } from "@/lib/auth-guards";
+export const GET: APIRoute = async ({ request, locals }) => {
+  const authResult = await requireAuthenticatedUserId({ request, locals });
+  if ("response" in authResult) return authResult.response;
+  const userId = authResult.userId;
-export const GET: APIRoute = async ({ request }) => {
   const url = new URL(request.url);
-  const userId = url.searchParams.get("userId");
   const refresh = url.searchParams.get("refresh") === "true";
-  if (!userId) {
-    return jsonResponse({
-      data: { error: "Missing userId" },
-      status: 400,
-    });
-  }
   try {
     // If refresh is requested, fetch current rate limit from GitHub
     if (refresh) {
@@ -101,4 +98,4 @@ export const GET: APIRoute = async ({ request }) => {
   } catch (error) {
     return createSecureErrorResponse(error, "rate limit check", 500);
   }
-};
+};

View File

@@ -3,18 +3,23 @@ import { db, repositories } from "@/lib/db";
 import { eq, and } from "drizzle-orm";
 import { createSecureErrorResponse } from "@/lib/utils";
 import { repoStatusEnum } from "@/types/Repository";
+import { requireAuthenticatedUserId } from "@/lib/auth-guards";
-export async function PATCH({ params, request }: APIContext) {
+export async function PATCH({ params, request, locals }: APIContext) {
   try {
+    const authResult = await requireAuthenticatedUserId({ request, locals });
+    if ("response" in authResult) return authResult.response;
+    const userId = authResult.userId;
     const { id } = params;
     const body = await request.json();
-    const { status, userId } = body;
+    const { status } = body;
-    if (!id || !userId) {
+    if (!id) {
       return new Response(
         JSON.stringify({
           success: false,
-          error: "Repository ID and User ID are required",
+          error: "Repository ID is required",
         }),
         {
           status: 400,
@@ -79,4 +84,4 @@ export async function PATCH({ params, request }: APIContext) {
   } catch (error) {
     return createSecureErrorResponse(error);
   }
-}
+}

View File

@@ -1,13 +1,11 @@
 import type { APIRoute } from "astro";
 import { getNewEvents } from "@/lib/events";
+import { requireAuthenticatedUserId } from "@/lib/auth-guards";
-export const GET: APIRoute = async ({ request }) => {
-  const url = new URL(request.url);
-  const userId = url.searchParams.get("userId");
-  if (!userId) {
-    return new Response("Missing userId", { status: 400 });
-  }
+export const GET: APIRoute = async ({ request, locals }) => {
+  const authResult = await requireAuthenticatedUserId({ request, locals });
+  if ("response" in authResult) return authResult.response;
+  const userId = authResult.userId;
   const channel = `mirror-status:${userId}`;
   let isClosed = false;

View File

@@ -12,14 +12,13 @@ import {
 import { jsonResponse, createSecureErrorResponse } from "@/lib/utils";
 import { mergeGitReposPreferStarred, calcBatchSizeForInsert } from "@/lib/repo-utils";
 import { getDecryptedGitHubToken } from "@/lib/utils/config-encryption";
+import { requireAuthenticatedUserId } from "@/lib/auth-guards";
 import { isMirrorableGitHubRepo } from "@/lib/repo-eligibility";
-export const POST: APIRoute = async ({ request }) => {
-  const url = new URL(request.url);
-  const userId = url.searchParams.get("userId");
-  if (!userId) {
-    return jsonResponse({ data: { error: "Missing userId" }, status: 400 });
-  }
+export const POST: APIRoute = async ({ request, locals }) => {
+  const authResult = await requireAuthenticatedUserId({ request, locals });
+  if ("response" in authResult) return authResult.response;
+  const userId = authResult.userId;
   try {
     const [config] = await db
@@ -58,9 +57,10 @@ export const POST: APIRoute = async ({ request }) => {
     // Merge and de-duplicate by fullName, preferring starred variant when duplicated
     const allGithubRepos = mergeGitReposPreferStarred(basicAndForkedRepos, starredRepos);
+    const mirrorableGithubRepos = allGithubRepos.filter(isMirrorableGitHubRepo);
     // Prepare full list of repos and orgs
-    const newRepos = allGithubRepos.map((repo) => ({
+    const newRepos = mirrorableGithubRepos.map((repo) => ({
       id: uuidv4(),
       userId,
       configId: config.id,
@@ -188,6 +188,7 @@ export const POST: APIRoute = async ({ request }) => {
         message: "Repositories and organizations synced successfully",
         newRepositories: insertedRepos.length,
         newOrganizations: insertedOrgs.length,
+        skippedDisabledRepositories: allGithubRepos.length - mirrorableGithubRepos.length,
       },
     });
   } catch (error) {

View File

@@ -10,15 +10,20 @@ import type { RepositoryVisibility, RepoStatus } from "@/types/Repository";
 import { v4 as uuidv4 } from "uuid";
 import { decryptConfigTokens } from "@/lib/utils/config-encryption";
 import { createGitHubClient } from "@/lib/github";
+import { requireAuthenticatedUserId } from "@/lib/auth-guards";
-export const POST: APIRoute = async ({ request }) => {
+export const POST: APIRoute = async ({ request, locals }) => {
   try {
-    const body: AddOrganizationApiRequest = await request.json();
-    const { role, org, userId, force = false } = body;
+    const authResult = await requireAuthenticatedUserId({ request, locals });
+    if ("response" in authResult) return authResult.response;
+    const userId = authResult.userId;
-    if (!org || !userId || !role) {
+    const body: AddOrganizationApiRequest = await request.json();
+    const { role, org, force = false } = body;
+    if (!org || !role) {
       return jsonResponse({
-        data: { success: false, error: "Missing org, role or userId" },
+        data: { success: false, error: "Missing org or role" },
         status: 400,
       });
     }
@@ -145,9 +150,10 @@ export const POST: APIRoute = async ({ request }) => {
     const existingIds = new Set(allRepos.map(r => r.id));
     const uniqueMemberRepos = memberRepos.filter(r => !existingIds.has(r.id));
     allRepos.push(...uniqueMemberRepos);
+    const mirrorableRepos = allRepos.filter((repo) => !repo.disabled);
     // Insert repositories
-    const repoRecords = allRepos.map((repo) => {
+    const repoRecords = mirrorableRepos.map((repo) => {
       const normalizedOwner = repo.owner.login.trim().toLowerCase();
       const normalizedRepoName = repo.name.trim().toLowerCase();

View File

@@ -11,17 +11,22 @@ import type {
   RepositoryVisibility,
 } from "@/types/Repository";
 import { createMirrorJob } from "@/lib/helpers";
+import { requireAuthenticatedUserId } from "@/lib/auth-guards";
-export const POST: APIRoute = async ({ request }) => {
+export const POST: APIRoute = async ({ request, locals }) => {
   try {
-    const body: AddRepositoriesApiRequest = await request.json();
-    const { owner, repo, userId, force = false } = body;
+    const authResult = await requireAuthenticatedUserId({ request, locals });
+    if ("response" in authResult) return authResult.response;
+    const userId = authResult.userId;
-    if (!owner || !repo || !userId) {
+    const body: AddRepositoriesApiRequest = await request.json();
+    const { owner, repo, force = false, destinationOrg } = body;
+    if (!owner || !repo) {
       return new Response(
         JSON.stringify({
           success: false,
-          error: "Missing owner, repo, or userId",
+          error: "Missing owner or repo",
         }),
         { status: 400 }
       );
@@ -34,7 +39,7 @@ export const POST: APIRoute = async ({ request }) => {
       return jsonResponse({
         data: {
           success: false,
-          error: "Missing owner, repo, or userId",
+          error: "Missing owner or repo",
         },
         status: 400,
       });
@@ -117,7 +122,7 @@ export const POST: APIRoute = async ({ request }) => {
       lastMirrored: existingRepo?.lastMirrored ?? null,
       errorMessage: existingRepo?.errorMessage ?? null,
       mirroredLocation: existingRepo?.mirroredLocation ?? "",
-      destinationOrg: existingRepo?.destinationOrg ?? null,
+      destinationOrg: destinationOrg?.trim() || existingRepo?.destinationOrg || null,
       updatedAt: repoData.updated_at
         ? new Date(repoData.updated_at)
         : new Date(),

View File

@@ -2,16 +2,21 @@ import type { APIRoute } from "astro";
 import { publishEvent } from "@/lib/events";
 import { v4 as uuidv4 } from "uuid";
 import { createSecureErrorResponse } from "@/lib/utils";
+import { requireAuthenticatedUserId } from "@/lib/auth-guards";
-export const POST: APIRoute = async ({ request }) => {
+export const POST: APIRoute = async ({ request, locals }) => {
   try {
-    const body = await request.json();
-    const { userId, message, status } = body;
+    const authResult = await requireAuthenticatedUserId({ request, locals });
+    if ("response" in authResult) return authResult.response;
+    const userId = authResult.userId;
-    if (!userId || !message || !status) {
+    const body = await request.json();
+    const { message, status } = body;
+    if (!message || !status) {
       return new Response(
         JSON.stringify({
-          error: "Missing required fields: userId, message, status",
+          error: "Missing required fields: message, status",
         }),
         { status: 400 }
       );

View File

@@ -13,6 +13,7 @@ export const repoStatusEnum = z.enum([
   "syncing",
   "synced",
   "archived",
+  "pending-approval", // Blocked by force-push detection, needs manual approval
 ]);
 export type RepoStatus = z.infer<typeof repoStatusEnum>;
@@ -70,6 +71,7 @@ export interface GitRepo {
   visibility: RepositoryVisibility;
   status: RepoStatus;
+  isDisabled?: boolean;
   lastMirrored?: Date;
   errorMessage?: string;
@@ -82,6 +84,7 @@ export interface AddRepositoriesApiRequest {
   repo: string;
   owner: string;
   force?: boolean;
+  destinationOrg?: string;
 }
 export interface AddRepositoriesApiResponse {
export interface AddRepositoriesApiResponse {

View File

@@ -3,6 +3,7 @@ import { type Config as ConfigType } from "@/lib/db/schema";
 export type GiteaOrgVisibility = "public" | "private" | "limited";
 export type MirrorStrategy = "preserve" | "single-org" | "flat-user" | "mixed";
 export type StarredReposMode = "dedicated-org" | "preserve-owner";
+export type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";
 export interface GiteaConfig {
   url: string;
@@ -18,6 +19,11 @@ export interface GiteaConfig {
   personalReposOrg?: string; // Override destination for personal repos
   issueConcurrency?: number;
   pullRequestConcurrency?: number;
+  backupStrategy?: BackupStrategy;
+  backupBeforeSync?: boolean; // Deprecated: kept for backward compat, use backupStrategy
+  backupRetentionCount?: number;
+  backupDirectory?: string;
+  blockSyncOnBackupFailure?: boolean;
 }
 export interface ScheduleConfig {
@@ -69,6 +75,7 @@ export interface MirrorOptions {
 export interface AdvancedOptions {
   skipForks: boolean;
   starredCodeOnly: boolean;
+  autoMirrorStarred?: boolean;
 }
 export interface SaveConfigApiRequest {

View File
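The new backup fields keep the deprecated `backupBeforeSync` flag around for old configs alongside the richer `backupStrategy` enum. One plausible way to collapse the two into a single effective strategy (illustrative only; the actual resolution logic is not shown in this diff, and the precedence below is an assumption):

```typescript
// Illustrative: resolve the deprecated boolean against the new enum.
// Not taken from the codebase; the precedence is an assumption.
type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";

function effectiveBackupStrategy(cfg: {
  backupStrategy?: BackupStrategy;
  backupBeforeSync?: boolean; // deprecated legacy flag
}): BackupStrategy {
  if (cfg.backupStrategy) return cfg.backupStrategy; // new field wins outright
  if (cfg.backupBeforeSync) return "always";         // legacy opt-in maps to "always"
  return "disabled";
}

console.log(effectiveBackupStrategy({ backupBeforeSync: true }));
console.log(effectiveBackupStrategy({ backupStrategy: "on-force-push", backupBeforeSync: true }));
```

Keeping the legacy flag as a fallback rather than a peer avoids ambiguity when both fields are set: a saved `backupStrategy` always describes current intent.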

@@ -0,0 +1,77 @@
/**
 * 01 Service health checks.
 *
 * Quick smoke tests that confirm every service required by the E2E suite is
 * reachable before the heavier workflow tests run.
 */
import { test, expect } from "@playwright/test";
import {
  APP_URL,
  GITEA_URL,
  FAKE_GITHUB_URL,
  GIT_SERVER_URL,
  waitFor,
} from "./helpers";

test.describe("E2E: Service health checks", () => {
  test("Fake GitHub API is running", async ({ request }) => {
    const resp = await request.get(`${FAKE_GITHUB_URL}/___mgmt/health`);
    expect(resp.ok()).toBeTruthy();
    const data = await resp.json();
    expect(data.status).toBe("ok");
    expect(data.repos).toBeGreaterThan(0);
    console.log(
      `[Health] Fake GitHub: ${data.repos} repos, ${data.orgs} orgs, clone base: ${data.gitCloneBaseUrl ?? "default"}`,
    );
  });

  test("Git HTTP server is running (serves test repos)", async ({
    request,
  }) => {
    const resp = await request.get(`${GIT_SERVER_URL}/manifest.json`, {
      failOnStatusCode: false,
    });
    expect(resp.ok(), "Git server should serve manifest.json").toBeTruthy();
    const manifest = await resp.json();
    expect(manifest.repos).toBeDefined();
    expect(manifest.repos.length).toBeGreaterThan(0);
    console.log(`[Health] Git server: serving ${manifest.repos.length} repos`);
    for (const r of manifest.repos) {
      console.log(`[Health] • ${r.owner}/${r.name}: ${r.description}`);
    }
  });

  test("Gitea instance is running", async ({ request }) => {
    await waitFor(
      async () => {
        const resp = await request.get(`${GITEA_URL}/api/v1/version`, {
          failOnStatusCode: false,
        });
        return resp.ok();
      },
      { timeout: 30_000, interval: 2_000, label: "Gitea healthy" },
    );
    const resp = await request.get(`${GITEA_URL}/api/v1/version`);
    const data = await resp.json();
    console.log(`[Health] Gitea version: ${data.version}`);
    expect(data.version).toBeTruthy();
  });

  test("gitea-mirror app is running", async ({ request }) => {
    await waitFor(
      async () => {
        const resp = await request.get(`${APP_URL}/`, {
          failOnStatusCode: false,
        });
        return resp.status() < 500;
      },
      { timeout: 60_000, interval: 2_000, label: "App healthy" },
    );
    const resp = await request.get(`${APP_URL}/`, {
      failOnStatusCode: false,
    });
    console.log(`[Health] App status: ${resp.status()}`);
    expect(resp.status()).toBeLessThan(500);
  });
});

View File

@@ -0,0 +1,344 @@
/**
 * 02 Main mirror workflow.
 *
 * Walks through the full first-time user journey:
 * 1. Create Gitea admin user + API token
 * 2. Create the mirror target organization
 * 3. Register / sign-in to the gitea-mirror app
 * 4. Save GitHub + Gitea configuration
 * 5. Trigger a GitHub data sync (pull repo list from fake GitHub)
 * 6. Trigger mirror jobs (push repos into Gitea)
 * 7. Verify repos actually appeared in Gitea with real content
 * 8. Verify mirror job activity and app state
 */
import { test, expect } from "@playwright/test";
import {
  APP_URL,
  GITEA_URL,
  GITEA_MIRROR_ORG,
  GiteaAPI,
  getAppSessionCookies,
  saveConfig,
  waitFor,
  getRepositoryIds,
  triggerMirrorJobs,
} from "./helpers";

test.describe("E2E: Mirror workflow", () => {
  let giteaApi: GiteaAPI;
  let appCookies = "";

  test.beforeAll(async () => {
    giteaApi = new GiteaAPI(GITEA_URL);
  });

  test.afterAll(async () => {
    await giteaApi.dispose();
  });

  test("Step 1: Setup Gitea admin user and token", async () => {
    await giteaApi.ensureAdminUser();
    const token = await giteaApi.createToken();
    expect(token).toBeTruthy();
    expect(token.length).toBeGreaterThan(10);
    console.log(`[Setup] Gitea token acquired (length: ${token.length})`);
  });

  test("Step 2: Create mirror organization in Gitea", async () => {
    await giteaApi.ensureOrg(GITEA_MIRROR_ORG);
    const repos = await giteaApi.listOrgRepos(GITEA_MIRROR_ORG);
    expect(Array.isArray(repos)).toBeTruthy();
    console.log(
      `[Setup] Org ${GITEA_MIRROR_ORG} exists with ${repos.length} repos`,
    );
  });

  test("Step 3: Register and sign in to gitea-mirror app", async ({
    request,
  }) => {
    appCookies = await getAppSessionCookies(request);
    expect(appCookies).toBeTruthy();
    console.log(
      `[Auth] Session cookies acquired (length: ${appCookies.length})`,
    );
    const whoami = await request.get(`${APP_URL}/api/config`, {
      headers: { Cookie: appCookies },
      failOnStatusCode: false,
    });
    expect(
      whoami.status(),
      `Auth check returned ${whoami.status()}; cookies may be invalid`,
    ).not.toBe(401);
    console.log(`[Auth] Auth check status: ${whoami.status()}`);
  });

  test("Step 4: Configure mirrors via API (backup disabled)", async ({
    request,
  }) => {
    if (!appCookies) {
      appCookies = await getAppSessionCookies(request);
    }
    const giteaToken = giteaApi.getTokenValue();
    expect(giteaToken, "Gitea token should be set from Step 1").toBeTruthy();
    await saveConfig(request, giteaToken, appCookies, {
      giteaConfig: {
        backupBeforeSync: false,
        blockSyncOnBackupFailure: false,
      },
    });
    console.log("[Config] Configuration saved (backup disabled)");
  });

  test("Step 5: Trigger GitHub data sync (fetch repos from fake GitHub)", async ({
    request,
  }) => {
    if (!appCookies) {
      appCookies = await getAppSessionCookies(request);
    }
    const syncResp = await request.post(`${APP_URL}/api/sync`, {
      headers: {
        "Content-Type": "application/json",
        Cookie: appCookies,
      },
      failOnStatusCode: false,
    });
    const status = syncResp.status();
    console.log(`[Sync] GitHub sync response: ${status}`);
    if (status >= 400) {
      const body = await syncResp.text();
      console.log(`[Sync] Error body: ${body}`);
    }
    expect(status, "Sync should not be unauthorized").not.toBe(401);
    expect(status, "Sync should not return server error").toBeLessThan(500);
    if (syncResp.ok()) {
      const data = await syncResp.json();
      console.log(
        `[Sync] New repos: ${data.newRepositories ?? "?"}, new orgs: ${data.newOrganizations ?? "?"}`,
      );
    }
  });

  test("Step 6: Trigger mirror jobs (push repos to Gitea)", async ({
    request,
  }) => {
    if (!appCookies) {
      appCookies = await getAppSessionCookies(request);
    }
    // Fetch repository IDs from the dashboard API
    const { ids: repositoryIds, repos } = await getRepositoryIds(
      request,
      appCookies,
    );
    console.log(
      `[Mirror] Found ${repositoryIds.length} repos to mirror: ${repos.map((r: any) => r.name).join(", ")}`,
    );
    if (repositoryIds.length === 0) {
      // Fallback: try the github/repositories endpoint
      const repoResp = await request.get(
        `${APP_URL}/api/github/repositories`,
        {
          headers: { Cookie: appCookies },
          failOnStatusCode: false,
        },
      );
      if (repoResp.ok()) {
        const repoData = await repoResp.json();
        const fallbackRepos: any[] = Array.isArray(repoData)
          ? repoData
          : (repoData.repositories ?? []);
        repositoryIds.push(...fallbackRepos.map((r: any) => r.id));
        console.log(
          `[Mirror] Fallback: found ${repositoryIds.length} repos`,
        );
      }
    }
    expect(
      repositoryIds.length,
      "Should have at least one repository to mirror",
    ).toBeGreaterThan(0);
    const status = await triggerMirrorJobs(
      request,
      appCookies,
      repositoryIds,
      30_000,
    );
    console.log(`[Mirror] Mirror job response: ${status}`);
    expect(status, "Mirror job should not be unauthorized").not.toBe(401);
    expect(status, "Mirror job should not return server error").toBeLessThan(
      500,
    );
  });

  test("Step 7: Verify repos were actually mirrored to Gitea", async ({
    request,
  }) => {
    if (!appCookies) {
      appCookies = await getAppSessionCookies(request);
    }
    // Wait for mirror jobs to finish processing
    await waitFor(
      async () => {
        const orgRepos = await giteaApi.listOrgRepos(GITEA_MIRROR_ORG);
        console.log(
          `[Verify] Gitea org repos so far: ${orgRepos.length} (${orgRepos.map((r: any) => r.name).join(", ")})`,
        );
        // We expect at least 3 repos (my-project, dotfiles, notes)
        return orgRepos.length >= 3;
      },
      {
        timeout: 90_000,
        interval: 5_000,
        label: "repos appear in Gitea",
},
);
const orgRepos = await giteaApi.listOrgRepos(GITEA_MIRROR_ORG);
const orgRepoNames = orgRepos.map((r: any) => r.name);
console.log(
`[Verify] Gitea org repos: ${orgRepoNames.join(", ")} (total: ${orgRepos.length})`,
);
// Check that at least the 3 personal repos are mirrored
for (const repoName of ["my-project", "dotfiles", "notes"]) {
expect(
orgRepoNames,
`Expected repo "${repoName}" to be mirrored into org ${GITEA_MIRROR_ORG}`,
).toContain(repoName);
}
// Verify my-project has actual content (branches, commits)
const myProjectBranches = await giteaApi.listBranches(
GITEA_MIRROR_ORG,
"my-project",
);
const branchNames = myProjectBranches.map((b: any) => b.name);
console.log(`[Verify] my-project branches: ${branchNames.join(", ")}`);
expect(branchNames, "main branch should exist").toContain("main");
// Verify we can read actual file content
const readmeContent = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"README.md",
);
expect(readmeContent, "README.md should have content").toBeTruthy();
expect(readmeContent).toContain("My Project");
console.log(
`[Verify] my-project README.md starts with: ${readmeContent?.substring(0, 50)}...`,
);
// Verify tags were mirrored
const tags = await giteaApi.listTags(GITEA_MIRROR_ORG, "my-project");
const tagNames = tags.map((t: any) => t.name);
console.log(`[Verify] my-project tags: ${tagNames.join(", ")}`);
if (tagNames.length > 0) {
expect(tagNames).toContain("v1.0.0");
}
// Verify commits exist
const commits = await giteaApi.listCommits(
GITEA_MIRROR_ORG,
"my-project",
);
console.log(`[Verify] my-project commits: ${commits.length}`);
expect(commits.length, "Should have multiple commits").toBeGreaterThan(0);
// Verify dotfiles repo has content
const bashrc = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"dotfiles",
".bashrc",
);
expect(bashrc, "dotfiles should contain .bashrc").toBeTruthy();
console.log("[Verify] dotfiles .bashrc verified");
});
test("Step 8: Verify mirror jobs and app state", async ({ request }) => {
if (!appCookies) {
appCookies = await getAppSessionCookies(request);
}
// Check activity log
const activitiesResp = await request.get(`${APP_URL}/api/activities`, {
headers: { Cookie: appCookies },
failOnStatusCode: false,
});
if (activitiesResp.ok()) {
const activities = await activitiesResp.json();
const jobs: any[] = Array.isArray(activities)
? activities
: (activities.jobs ?? activities.activities ?? []);
console.log(`[State] Activity/job records: ${jobs.length}`);
const mirrorJobs = jobs.filter(
(j: any) =>
j.status === "mirroring" ||
j.status === "failed" ||
j.status === "success" ||
j.status === "mirrored" ||
j.message?.includes("mirror") ||
j.message?.includes("Mirror"),
);
console.log(`[State] Mirror-related jobs: ${mirrorJobs.length}`);
for (const j of mirrorJobs.slice(0, 5)) {
console.log(
`[State] • ${j.repositoryName ?? "?"}: ${j.status} ${j.message ?? ""}`,
);
}
}
// Check dashboard repos
const dashResp = await request.get(`${APP_URL}/api/dashboard`, {
headers: { Cookie: appCookies },
failOnStatusCode: false,
});
if (dashResp.ok()) {
const dashData = await dashResp.json();
const repos: any[] = dashData.repositories ?? [];
console.log(`[State] Dashboard repos: ${repos.length}`);
for (const r of repos) {
console.log(
`[State] • ${r.name}: status=${r.status}, mirrored=${r.mirroredLocation ?? "none"}`,
);
}
expect(repos.length, "Repos should exist in DB").toBeGreaterThan(0);
const succeeded = repos.filter(
(r: any) => r.status === "mirrored" || r.status === "success",
);
console.log(
`[State] Successfully mirrored repos: ${succeeded.length}/${repos.length}`,
);
}
// App should still be running
const healthResp = await request.get(`${APP_URL}/`, {
failOnStatusCode: false,
});
expect(
healthResp.status(),
"App should still be running after mirror attempts",
).toBeLessThan(500);
console.log(`[State] App health: ${healthResp.status()}`);
});
});

tests/e2e/03-backup.spec.ts
/**
* 03 Backup configuration tests.
*
* Exercises the pre-sync backup system by toggling config flags through
* the app API and triggering re-syncs on repos that were already mirrored
* by the 02-mirror-workflow suite.
*
* What is tested:
* B1. Enable backupStrategy: "always" in config
* B2. Confirm mirrored repos exist in Gitea (precondition)
* B3. Trigger a re-sync with backup enabled — verify the backup code path
* runs (snapshot activity entries appear in the activity log)
* B4. Inspect activity log for snapshot-related entries
* B5. Enable blockSyncOnBackupFailure and verify the flag is persisted
* B6. Disable backup (backupStrategy: "disabled") and verify config resets cleanly
*/
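The backup flags toggled in B1, B5, and B6 map onto a `giteaConfig` fragment passed to `saveConfig`. A hedged sketch of the three payload shapes this suite persists (field names are the ones used in the tests below; the real config schema may carry additional keys):

```typescript
// Sketch only: the giteaConfig fragments this suite saves via saveConfig().
const backupAlways = {
  backupStrategy: "always", // snapshot before every sync
  blockSyncOnBackupFailure: false, // sync proceeds even if the snapshot fails
  backupRetentionCount: 5, // keep the five most recent bundles
  backupDirectory: "data/repo-backups",
};
const backupBlocking = { ...backupAlways, blockSyncOnBackupFailure: true };
const backupDisabled = {
  backupStrategy: "disabled",
  blockSyncOnBackupFailure: false,
};
```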
import { test, expect } from "@playwright/test";
import {
APP_URL,
GITEA_URL,
GITEA_MIRROR_ORG,
GiteaAPI,
getAppSessionCookies,
saveConfig,
getRepositoryIds,
triggerSyncRepo,
} from "./helpers";
test.describe("E2E: Backup configuration", () => {
let giteaApi: GiteaAPI;
let appCookies = "";
test.beforeAll(async () => {
giteaApi = new GiteaAPI(GITEA_URL);
try {
await giteaApi.createToken();
} catch {
console.log(
"[Backup] Could not create Gitea token; tests may be limited",
);
}
});
test.afterAll(async () => {
await giteaApi.dispose();
});
// ── B1 ─────────────────────────────────────────────────────────────────────
test("Step B1: Enable backup in config", async ({ request }) => {
appCookies = await getAppSessionCookies(request);
const giteaToken = giteaApi.getTokenValue();
expect(giteaToken, "Gitea token required").toBeTruthy();
// Save config with backup strategy set to "always"
await saveConfig(request, giteaToken, appCookies, {
giteaConfig: {
backupStrategy: "always",
blockSyncOnBackupFailure: false,
backupRetentionCount: 5,
backupDirectory: "data/repo-backups",
},
});
// Verify config was saved
const configResp = await request.get(`${APP_URL}/api/config`, {
headers: { Cookie: appCookies },
failOnStatusCode: false,
});
expect(configResp.status()).toBeLessThan(500);
if (configResp.ok()) {
const configData = await configResp.json();
const giteaCfg = configData.giteaConfig ?? configData.gitea ?? {};
console.log(
`[Backup] Config saved: backupStrategy=${giteaCfg.backupStrategy}, blockOnFailure=${giteaCfg.blockSyncOnBackupFailure}`,
);
}
});
// ── B2 ─────────────────────────────────────────────────────────────────────
test("Step B2: Verify mirrored repos exist in Gitea before backup test", async () => {
// We need repos to already be mirrored from the 02-mirror-workflow suite
const orgRepos = await giteaApi.listOrgRepos(GITEA_MIRROR_ORG);
console.log(
`[Backup] Repos in ${GITEA_MIRROR_ORG}: ${orgRepos.length} (${orgRepos.map((r: any) => r.name).join(", ")})`,
);
if (orgRepos.length === 0) {
console.log(
"[Backup] WARNING: No repos in Gitea yet. Backup test will verify " +
"job creation but not bundle creation.",
);
}
});
// ── B3 ─────────────────────────────────────────────────────────────────────
test("Step B3: Trigger re-sync with backup enabled", async ({ request }) => {
if (!appCookies) {
appCookies = await getAppSessionCookies(request);
}
// Fetch mirrored repository IDs (sync-repo requires them)
const { ids: repositoryIds, repos } = await getRepositoryIds(
request,
appCookies,
{ status: "mirrored" },
);
// Also include repos with "success" status
if (repositoryIds.length === 0) {
const { ids: successIds } = await getRepositoryIds(
request,
appCookies,
{ status: "success" },
);
repositoryIds.push(...successIds);
}
// Fall back to all repos if no mirrored/success repos
if (repositoryIds.length === 0) {
const { ids: allIds } = await getRepositoryIds(request, appCookies);
repositoryIds.push(...allIds);
}
console.log(
`[Backup] Found ${repositoryIds.length} repos to re-sync: ` +
repos.map((r: any) => r.name).join(", "),
);
expect(
repositoryIds.length,
"Need at least one repo to test backup",
).toBeGreaterThan(0);
// Trigger sync-repo — this calls syncGiteaRepoEnhanced which checks
// shouldCreatePreSyncBackup and creates bundles before syncing
const status = await triggerSyncRepo(
request,
appCookies,
repositoryIds,
25_000,
);
console.log(`[Backup] Sync-repo response: ${status}`);
expect(status, "Sync-repo should accept request").toBeLessThan(500);
});
// ── B4 ─────────────────────────────────────────────────────────────────────
test("Step B4: Verify backup-related activity in logs", async ({
request,
}) => {
if (!appCookies) {
appCookies = await getAppSessionCookies(request);
}
const activitiesResp = await request.get(`${APP_URL}/api/activities`, {
headers: { Cookie: appCookies },
failOnStatusCode: false,
});
if (!activitiesResp.ok()) {
console.log(
`[Backup] Could not fetch activities: ${activitiesResp.status()}`,
);
return;
}
const activities = await activitiesResp.json();
const jobs: any[] = Array.isArray(activities)
? activities
: (activities.jobs ?? activities.activities ?? []);
// Look for backup / snapshot related messages
const backupJobs = jobs.filter(
(j: any) =>
j.message?.toLowerCase().includes("snapshot") ||
j.message?.toLowerCase().includes("backup") ||
j.details?.toLowerCase().includes("snapshot") ||
j.details?.toLowerCase().includes("backup") ||
j.details?.toLowerCase().includes("bundle"),
);
console.log(
`[Backup] Backup-related activity entries: ${backupJobs.length}`,
);
for (const j of backupJobs.slice(0, 10)) {
console.log(
`[Backup] • ${j.repositoryName ?? "?"}: ${j.status} ${j.message ?? ""} | ${(j.details ?? "").substring(0, 120)}`,
);
}
// We expect at least some backup-related entries if repos were mirrored
const orgRepos = await giteaApi.listOrgRepos(GITEA_MIRROR_ORG);
if (orgRepos.length > 0) {
// With repos in Gitea, the backup system should have tried to create
// snapshots. All snapshots should succeed.
expect(
backupJobs.length,
"Expected at least one backup/snapshot activity entry when " +
"backupStrategy is 'always' and repos exist in Gitea",
).toBeGreaterThan(0);
// Check for any failed backups
const failedBackups = backupJobs.filter(
(j: any) =>
j.status === "failed" &&
(j.message?.toLowerCase().includes("snapshot") ||
j.details?.toLowerCase().includes("snapshot")),
);
expect(
failedBackups.length,
`Expected all backups to succeed, but ${failedBackups.length} backup(s) failed. ` +
`Failed: ${failedBackups.map((j: any) => `${j.repositoryName}: ${j.details?.substring(0, 100)}`).join("; ")}`,
).toBe(0);
console.log(
`[Backup] Confirmed: backup system was invoked for ${backupJobs.length} repos`,
);
}
// Dump all recent jobs for debugging visibility
console.log(`[Backup] All recent jobs (last 20):`);
for (const j of jobs.slice(0, 20)) {
console.log(
`[Backup] - [${j.status}] ${j.repositoryName ?? "?"}: ${j.message ?? ""} ` +
`${j.details ? `(${j.details.substring(0, 80)})` : ""}`,
);
}
});
// ── B5 ─────────────────────────────────────────────────────────────────────
test("Step B5: Enable blockSyncOnBackupFailure and verify behavior", async ({
request,
}) => {
if (!appCookies) {
appCookies = await getAppSessionCookies(request);
}
const giteaToken = giteaApi.getTokenValue();
// Update config to block sync on backup failure
await saveConfig(request, giteaToken, appCookies, {
giteaConfig: {
backupStrategy: "always",
blockSyncOnBackupFailure: true,
backupRetentionCount: 5,
backupDirectory: "data/repo-backups",
},
});
console.log("[Backup] Config updated: blockSyncOnBackupFailure=true");
// Verify the flag persisted
const configResp = await request.get(`${APP_URL}/api/config`, {
headers: { Cookie: appCookies },
failOnStatusCode: false,
});
if (configResp.ok()) {
const configData = await configResp.json();
const giteaCfg = configData.giteaConfig ?? configData.gitea ?? {};
expect(giteaCfg.blockSyncOnBackupFailure).toBe(true);
console.log(
`[Backup] Verified: blockSyncOnBackupFailure=${giteaCfg.blockSyncOnBackupFailure}`,
);
}
});
// ── B6 ─────────────────────────────────────────────────────────────────────
test("Step B6: Disable backup and verify config resets", async ({
request,
}) => {
if (!appCookies) {
appCookies = await getAppSessionCookies(request);
}
const giteaToken = giteaApi.getTokenValue();
// Disable backup
await saveConfig(request, giteaToken, appCookies, {
giteaConfig: {
backupStrategy: "disabled",
blockSyncOnBackupFailure: false,
},
});
const configResp = await request.get(`${APP_URL}/api/config`, {
headers: { Cookie: appCookies },
failOnStatusCode: false,
});
if (configResp.ok()) {
const configData = await configResp.json();
const giteaCfg = configData.giteaConfig ?? configData.gitea ?? {};
console.log(
`[Backup] After disable: backupStrategy=${giteaCfg.backupStrategy}`,
);
}
console.log("[Backup] Backup configuration test complete");
});
});

/**
* 04 Force-push simulation and backup verification.
*
* This is the critical test that proves data loss can happen from a
* force-push on the source repo, and verifies that the backup system
* (when enabled) preserves the old state.
*
* Scenario:
* 1. Confirm my-project is already mirrored with known commits / content
* 2. Record the pre-force-push state (branch SHAs, commit messages, file content)
* 3. Rewrite history in the source bare repo (simulate a force-push)
* 4. Trigger Gitea mirror-sync WITHOUT backup
* 5. Verify Gitea now reflects the rewritten history — old commits are GONE
* 6. Restore the source repo, re-mirror, then enable backup
* 7. Force-push again and sync WITH backup enabled
* 8. Verify backup activity was recorded (snapshot attempted before sync)
*
* The source bare repos live on the host filesystem at
* tests/e2e/git-repos/<owner>/<name>.git and are served read-only into the
* git-server container. Because the bind-mount is :ro in docker-compose,
* we modify the repos on the host and Gitea's dumb-HTTP clone picks up
* the changes on the next fetch.
*
* Prerequisites: 02-mirror-workflow.spec.ts must have run first so that
* my-project is already mirrored into Gitea.
*/
import { execSync } from "node:child_process";
import { existsSync, mkdirSync, rmSync, writeFileSync } from "node:fs";
import { join, resolve, dirname } from "node:path";
import { fileURLToPath } from "node:url";
import { test, expect } from "@playwright/test";
import {
APP_URL,
GITEA_URL,
GITEA_MIRROR_ORG,
GiteaAPI,
getAppSessionCookies,
saveConfig,
waitFor,
getRepositoryIds,
triggerSyncRepo,
} from "./helpers";
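This suite (like 02 and 03) polls Gitea through the `waitFor` helper imported from `./helpers`. A hypothetical sketch of the polling contract assumed by the call sites (the real helper may differ in details):

```typescript
// Hypothetical sketch of the waitFor() contract used throughout these suites:
// re-evaluate an async condition every `interval` ms until it returns true,
// or fail with the supplied label once `timeout` ms have elapsed.
async function waitFor(
  check: () => Promise<boolean>,
  opts: { timeout: number; interval: number; label: string },
): Promise<void> {
  const deadline = Date.now() + opts.timeout;
  while (Date.now() < deadline) {
    if (await check()) return;
    await new Promise((r) => setTimeout(r, opts.interval));
  }
  throw new Error(`Timed out waiting for: ${opts.label}`);
}
```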
// ─── Paths ───────────────────────────────────────────────────────────────────
const E2E_DIR = resolve(dirname(fileURLToPath(import.meta.url)));
const GIT_REPOS_DIR = join(E2E_DIR, "git-repos");
const MY_PROJECT_BARE = join(GIT_REPOS_DIR, "e2e-test-user", "my-project.git");
// ─── Git helpers ─────────────────────────────────────────────────────────────
/** Run a git command in a given directory. */
function git(args: string, cwd: string): string {
try {
return execSync(`git ${args}`, {
cwd,
encoding: "utf-8",
stdio: ["pipe", "pipe", "pipe"],
env: {
...process.env,
GIT_AUTHOR_NAME: "Force Push Bot",
GIT_AUTHOR_EMAIL: "force-push@test.local",
GIT_COMMITTER_NAME: "Force Push Bot",
GIT_COMMITTER_EMAIL: "force-push@test.local",
},
}).trim();
} catch (err: any) {
const stderr = err.stderr?.toString() ?? "";
const stdout = err.stdout?.toString() ?? "";
throw new Error(
`git ${args} failed in ${cwd}:\n${stderr || stdout || err.message}`,
);
}
}
/**
* Get the SHA of a ref in a bare repository.
* Uses `git rev-parse` so it works for branches and tags.
*/
function getRefSha(bareRepo: string, ref: string): string {
return git(`rev-parse ${ref}`, bareRepo);
}
/**
* Clone the bare repo to a temporary working copy, execute a callback that
* mutates the working copy, then force-push back to the bare repo and
* update server-info for dumb-HTTP serving.
*/
function mutateSourceRepo(
bareRepo: string,
tmpName: string,
mutate: (workDir: string) => void,
): void {
const tmpDir = join(GIT_REPOS_DIR, ".work-force-push", tmpName);
rmSync(tmpDir, { recursive: true, force: true });
mkdirSync(join(GIT_REPOS_DIR, ".work-force-push"), { recursive: true });
try {
// Clone from the bare repo
git(`clone "${bareRepo}" "${tmpDir}"`, GIT_REPOS_DIR);
git("config user.name 'Force Push Bot'", tmpDir);
git("config user.email 'force-push@test.local'", tmpDir);
// Let the caller rewrite history
mutate(tmpDir);
// Force-push all refs back to the bare repo
git(`push --force --all "${bareRepo}"`, tmpDir);
git(`push --force --tags "${bareRepo}"`, tmpDir);
// Update server-info so the dumb-HTTP server picks up the new refs
git("update-server-info", bareRepo);
} finally {
rmSync(tmpDir, { recursive: true, force: true });
}
}
/** Helper to clean up the temporary working directory. */
function cleanupWorkDir(): void {
const workDir = join(GIT_REPOS_DIR, ".work-force-push");
rmSync(workDir, { recursive: true, force: true });
}
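The `update-server-info` step in `mutateSourceRepo` matters because dumb-HTTP clients fetch static metadata files rather than talking to a smart server. A minimal sketch of what that call generates (assumes `git` is on PATH; paths are throwaway):

```shell
# Sketch: why mutateSourceRepo runs `git update-server-info` after pushing.
# Dumb-HTTP serving reads static files under info/ and objects/info/;
# update-server-info regenerates them so new refs become visible.
repo="$(mktemp -d)/demo.git"
git init --bare "$repo"
git -C "$repo" update-server-info
# The refs listing a dumb-HTTP client fetches first:
ls "$repo/info/refs"
```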
// ─── Tests ───────────────────────────────────────────────────────────────────
test.describe("E2E: Force-push simulation", () => {
let giteaApi: GiteaAPI;
let appCookies = "";
/** SHA of the main branch BEFORE we force-push. */
let originalMainSha = "";
/** The commit message of the HEAD commit before force-push. */
let originalHeadMessage = "";
/** Content of README.md before force-push. */
let originalReadmeContent = "";
/** Number of commits on main before force-push. */
let originalCommitCount = 0;
test.beforeAll(async () => {
giteaApi = new GiteaAPI(GITEA_URL);
try {
await giteaApi.createToken();
} catch {
console.log("[ForcePush] Could not create Gitea token");
}
});
test.afterAll(async () => {
cleanupWorkDir();
await giteaApi.dispose();
});
// ── F0: Preconditions ────────────────────────────────────────────────────
test("F0: Confirm my-project is mirrored and record its state", async ({
request,
}) => {
// Verify the source bare repo exists on the host
expect(
existsSync(MY_PROJECT_BARE),
`Bare repo should exist at ${MY_PROJECT_BARE}`,
).toBeTruthy();
// Verify it is mirrored in Gitea
const repo = await giteaApi.getRepo(GITEA_MIRROR_ORG, "my-project");
expect(repo, "my-project should exist in Gitea").toBeTruthy();
console.log(
`[ForcePush] my-project in Gitea: mirror=${repo.mirror}, ` +
`default_branch=${repo.default_branch}`,
);
// Record the current state of main in Gitea
const mainBranch = await giteaApi.getBranch(
GITEA_MIRROR_ORG,
"my-project",
"main",
);
expect(mainBranch, "main branch should exist").toBeTruthy();
originalMainSha = mainBranch.commit.id;
originalHeadMessage =
mainBranch.commit.message?.trim() ?? "(unknown message)";
console.log(
`[ForcePush] Original main HEAD: ${originalMainSha.substring(0, 12)} ` +
`"${originalHeadMessage}"`,
);
// Record commit count
const commits = await giteaApi.listCommits(GITEA_MIRROR_ORG, "my-project", {
limit: 50,
});
originalCommitCount = commits.length;
console.log(
`[ForcePush] Original commit count on main: ${originalCommitCount}`,
);
// Record README content
const readme = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"README.md",
);
originalReadmeContent = readme ?? "";
expect(originalReadmeContent).toContain("My Project");
console.log(
`[ForcePush] Original README length: ${originalReadmeContent.length} chars`,
);
// Also verify the source bare repo matches
const sourceSha = getRefSha(MY_PROJECT_BARE, "refs/heads/main");
console.log(
`[ForcePush] Source bare main SHA: ${sourceSha.substring(0, 12)}`,
);
// They may differ slightly if Gitea hasn't synced the very latest, but
// the important thing is that both exist.
});
// ── F1: Rewrite history on the source repo ───────────────────────────────
test("F1: Force-push rewritten history to source repo", async () => {
const shaBeforeRewrite = getRefSha(MY_PROJECT_BARE, "refs/heads/main");
console.log(
`[ForcePush] Source main before rewrite: ${shaBeforeRewrite.substring(0, 12)}`,
);
mutateSourceRepo(MY_PROJECT_BARE, "my-project-rewrite", (workDir) => {
// We're on the main branch.
// Rewrite history: remove the last commit (the LICENSE commit) via
// reset --hard HEAD~1, then add a completely different commit.
git("checkout main", workDir);
// Record what HEAD is for logging
const headBefore = git("log --oneline -1", workDir);
console.log(`[ForcePush] Working copy HEAD before reset: ${headBefore}`);
// Hard reset to remove the last commit (this drops "Add MIT license")
git("reset --hard HEAD~1", workDir);
const headAfterReset = git("log --oneline -1", workDir);
console.log(`[ForcePush] After reset HEAD~1: ${headAfterReset}`);
// Write a replacement commit with different content (simulates someone
// rewriting history with different changes)
writeFileSync(
join(workDir, "README.md"),
"# My Project\n\nThis README was FORCE-PUSHED.\n\nOriginal history has been rewritten.\n",
);
writeFileSync(
join(workDir, "FORCE_PUSH_MARKER.txt"),
`Force-pushed at ${new Date().toISOString()}\n`,
);
git("add -A", workDir);
execSync('git commit -m "FORCE PUSH: Rewritten history"', {
cwd: workDir,
encoding: "utf-8",
stdio: ["pipe", "pipe", "pipe"],
env: {
...process.env,
GIT_AUTHOR_NAME: "Force Push Bot",
GIT_AUTHOR_EMAIL: "force-push@test.local",
GIT_AUTHOR_DATE: "2024-06-15T12:00:00+00:00",
GIT_COMMITTER_NAME: "Force Push Bot",
GIT_COMMITTER_EMAIL: "force-push@test.local",
GIT_COMMITTER_DATE: "2024-06-15T12:00:00+00:00",
},
});
const headAfterRewrite = git("log --oneline -3", workDir);
console.log(`[ForcePush] After rewrite (last 3):\n${headAfterRewrite}`);
});
const shaAfterRewrite = getRefSha(MY_PROJECT_BARE, "refs/heads/main");
console.log(
`[ForcePush] Source main after rewrite: ${shaAfterRewrite.substring(0, 12)}`,
);
// The SHA must have changed — this proves the force-push happened
expect(
shaAfterRewrite,
"Source repo main SHA should change after force-push",
).not.toBe(shaBeforeRewrite);
// Verify the rewritten history: the replacement commit is present and the
// dropped "Add MIT license" commit is no longer reachable from main
const logOutput = git("log --oneline main", MY_PROJECT_BARE);
expect(
logOutput,
"Rewritten history should contain the replacement commit",
).toContain("FORCE PUSH");
expect(
logOutput,
"Rewritten history should NOT contain the dropped commit",
).not.toContain("Add MIT license");
});
// ── F2: Sync to Gitea WITHOUT backup ─────────────────────────────────────
test("F2: Disable backup and sync force-pushed repo to Gitea", async ({
request,
}) => {
appCookies = await getAppSessionCookies(request);
const giteaToken = giteaApi.getTokenValue();
expect(giteaToken).toBeTruthy();
// Ensure backup is disabled for this test
await saveConfig(request, giteaToken, appCookies, {
giteaConfig: {
backupStrategy: "disabled",
blockSyncOnBackupFailure: false,
},
});
console.log("[ForcePush] Backup disabled for unprotected sync test");
// Trigger Gitea's mirror-sync directly via the Gitea API.
// This is more reliable than going through the app for this test because
// the app's sync-repo endpoint involves extra processing. We want to test
// the raw effect of Gitea pulling the rewritten refs.
const synced = await giteaApi.triggerMirrorSync(
GITEA_MIRROR_ORG,
"my-project",
);
console.log(`[ForcePush] Gitea mirror-sync triggered: ${synced}`);
// Wait for Gitea to pull the new refs from the git-server
console.log("[ForcePush] Waiting for Gitea to pull rewritten refs...");
await new Promise((r) => setTimeout(r, 15_000));
});
// ── F3: Verify Gitea reflects the rewritten history ──────────────────────
test("F3: Verify Gitea has the force-pushed content (old history GONE)", async () => {
// Poll until Gitea picks up the new HEAD
await waitFor(
async () => {
const branch = await giteaApi.getBranch(
GITEA_MIRROR_ORG,
"my-project",
"main",
);
if (!branch) return false;
return branch.commit.id !== originalMainSha;
},
{
timeout: 60_000,
interval: 5_000,
label: "Gitea main branch updates to new SHA",
},
);
// Read the new state
const newMainBranch = await giteaApi.getBranch(
GITEA_MIRROR_ORG,
"my-project",
"main",
);
expect(newMainBranch).toBeTruthy();
const newSha = newMainBranch.commit.id;
const newMsg = newMainBranch.commit.message?.trim() ?? "";
console.log(
`[ForcePush] New main HEAD: ${newSha.substring(0, 12)} "${newMsg}"`,
);
// The SHA MUST be different from the original
expect(
newSha,
"Gitea main SHA should have changed after force-push sync",
).not.toBe(originalMainSha);
// The new commit message should be the force-pushed one
expect(newMsg).toContain("FORCE PUSH");
// Verify the force-push marker file now exists in Gitea
const markerContent = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"FORCE_PUSH_MARKER.txt",
);
expect(
markerContent,
"FORCE_PUSH_MARKER.txt should appear after sync",
).toBeTruthy();
console.log(
`[ForcePush] Marker file present: ${markerContent?.substring(0, 40)}...`,
);
// Verify the README was overwritten
const newReadme = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"README.md",
);
expect(newReadme).toContain("FORCE-PUSHED");
expect(newReadme).not.toBe(originalReadmeContent);
console.log("[ForcePush] README.md confirms overwritten content");
// Verify the LICENSE file is GONE (it was in the dropped commit)
const licenseContent = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"LICENSE",
);
expect(
licenseContent,
"LICENSE should be GONE after force-push removed that commit",
).toBeNull();
console.log("[ForcePush] ✗ LICENSE file is GONE — data loss confirmed");
// Verify the old commit SHA is no longer accessible
const oldCommit = await giteaApi.getCommit(
GITEA_MIRROR_ORG,
"my-project",
originalMainSha,
);
// Gitea may or may not GC the unreachable commit immediately, so this
// is informational rather than a hard assertion.
if (oldCommit) {
console.log(
`[ForcePush] Old commit ${originalMainSha.substring(0, 12)} is ` +
`still in Gitea's object store (not yet GC'd)`,
);
} else {
console.log(
`[ForcePush] Old commit ${originalMainSha.substring(0, 12)} is ` +
`no longer accessible — data loss complete`,
);
}
// Check commit count changed
const newCommits = await giteaApi.listCommits(
GITEA_MIRROR_ORG,
"my-project",
{ limit: 50 },
);
console.log(
`[ForcePush] Commit count: was ${originalCommitCount}, now ${newCommits.length}`,
);
// The rewrite dropped one commit and added one, so the count should differ
// or at minimum the commit list should not contain the old head message.
const commitMessages = newCommits.map(
(c: any) => c.commit?.message?.trim() ?? "",
);
expect(
commitMessages.some((m: string) => m.includes("FORCE PUSH")),
"New commit list should contain the force-pushed commit",
).toBeTruthy();
console.log(
"\n[ForcePush] ════════════════════════════════════════════════════",
);
console.log(
"[ForcePush] CONFIRMED: Force-push without backup = DATA LOSS",
);
console.log(
"[ForcePush] The LICENSE file and original HEAD commit are gone.",
);
console.log(
"[ForcePush] ════════════════════════════════════════════════════\n",
);
});
// ── F4: Restore source, re-mirror, then test WITH backup ─────────────────
test("F4: Restore source repo to a good state and re-mirror", async ({
request,
}) => {
// To test the backup path we need a clean slate. Re-create the original
// my-project content in the source repo so it has known good history.
mutateSourceRepo(MY_PROJECT_BARE, "my-project-restore", (workDir) => {
git("checkout main", workDir);
// Remove the force-push marker; force: true tolerates a missing file
// and avoids shelling out to `rm`
rmSync(join(workDir, "FORCE_PUSH_MARKER.txt"), { force: true });
// Restore README
writeFileSync(
join(workDir, "README.md"),
"# My Project\n\nA sample project for E2E testing.\n\n" +
"## Features\n- Greeting module\n- Math utilities\n",
);
// Restore LICENSE
writeFileSync(
join(workDir, "LICENSE"),
"MIT License\n\nCopyright (c) 2024 E2E Test\n",
);
git("add -A", workDir);
execSync(
'git commit -m "Restore original content after force-push test"',
{
cwd: workDir,
encoding: "utf-8",
stdio: ["pipe", "pipe", "pipe"],
env: {
...process.env,
GIT_AUTHOR_NAME: "E2E Test Bot",
GIT_AUTHOR_EMAIL: "e2e-bot@test.local",
GIT_COMMITTER_NAME: "E2E Test Bot",
GIT_COMMITTER_EMAIL: "e2e-bot@test.local",
},
},
);
const newHead = git("log --oneline -1", workDir);
console.log(`[ForcePush] Restored source HEAD: ${newHead}`);
});
// Sync Gitea to pick up the restored state
const synced = await giteaApi.triggerMirrorSync(
GITEA_MIRROR_ORG,
"my-project",
);
console.log(`[ForcePush] Gitea mirror-sync for restore: ${synced}`);
await new Promise((r) => setTimeout(r, 15_000));
// Verify Gitea has the restored content
await waitFor(
async () => {
const readme = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"README.md",
);
return readme !== null && readme.includes("Features");
},
{
timeout: 60_000,
interval: 5_000,
label: "Gitea picks up restored content",
},
);
const license = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"LICENSE",
);
expect(license, "LICENSE should be restored").toBeTruthy();
console.log("[ForcePush] Gitea restored to good state");
// Record the new "good" SHA for the next force-push test
const restoredBranch = await giteaApi.getBranch(
GITEA_MIRROR_ORG,
"my-project",
"main",
);
originalMainSha = restoredBranch.commit.id;
console.log(
`[ForcePush] Restored main SHA: ${originalMainSha.substring(0, 12)}`,
);
});
// ── F5: Force-push AGAIN, this time with backup enabled ──────────────────
test("F5: Enable backup, force-push, and sync", async ({ request }) => {
if (!appCookies) {
appCookies = await getAppSessionCookies(request);
}
const giteaToken = giteaApi.getTokenValue();
// Enable backup with "always" strategy
await saveConfig(request, giteaToken, appCookies, {
giteaConfig: {
backupStrategy: "always",
blockSyncOnBackupFailure: false, // don't block — we want to see both backup + sync happen
backupRetentionCount: 5,
backupDirectory: "data/repo-backups",
},
});
console.log("[ForcePush] Backup enabled (strategy=always) for protected sync test");
// Force-push again
mutateSourceRepo(MY_PROJECT_BARE, "my-project-rewrite2", (workDir) => {
git("checkout main", workDir);
writeFileSync(
join(workDir, "README.md"),
"# My Project\n\nSECOND FORCE-PUSH — backup should have preserved old state.\n",
);
writeFileSync(
join(workDir, "SECOND_FORCE_PUSH.txt"),
`Second force-push at ${new Date().toISOString()}\n`,
);
// Remove LICENSE again to simulate a destructive rewrite; force: true
// tolerates a missing file
rmSync(join(workDir, "LICENSE"), { force: true });
git("add -A", workDir);
execSync('git commit -m "SECOND FORCE PUSH: backup should catch this"', {
cwd: workDir,
encoding: "utf-8",
stdio: ["pipe", "pipe", "pipe"],
env: {
...process.env,
GIT_AUTHOR_NAME: "Force Push Bot",
GIT_AUTHOR_EMAIL: "force-push@test.local",
GIT_COMMITTER_NAME: "Force Push Bot",
GIT_COMMITTER_EMAIL: "force-push@test.local",
},
});
});
console.log("[ForcePush] Second force-push applied to source repo");
// Use the app's sync-repo to trigger the sync (this goes through
// syncGiteaRepoEnhanced which runs the backup code path)
const { ids: repoIds } = await getRepositoryIds(request, appCookies);
// Find the my-project repo ID
const dashResp = await request.get(`${APP_URL}/api/dashboard`, {
headers: { Cookie: appCookies },
failOnStatusCode: false,
});
let myProjectId = "";
if (dashResp.ok()) {
const data = await dashResp.json();
const repos: any[] = data.repositories ?? [];
const myProj = repos.find((r: any) => r.name === "my-project");
if (myProj) myProjectId = myProj.id;
}
if (myProjectId) {
console.log(
`[ForcePush] Triggering app sync-repo for my-project (${myProjectId})`,
);
const status = await triggerSyncRepo(
request,
appCookies,
[myProjectId],
25_000,
);
console.log(`[ForcePush] App sync-repo response: ${status}`);
} else {
// Fallback: trigger via Gitea API directly
console.log(
"[ForcePush] Could not find my-project ID, using Gitea API directly",
);
await giteaApi.triggerMirrorSync(GITEA_MIRROR_ORG, "my-project");
await new Promise((r) => setTimeout(r, 15_000));
}
});
// ── F6: Verify Gitea picked up the second force-push ─────────────────────
test("F6: Verify Gitea reflects second force-push", async () => {
await waitFor(
async () => {
const branch = await giteaApi.getBranch(
GITEA_MIRROR_ORG,
"my-project",
"main",
);
if (!branch) return false;
return branch.commit.id !== originalMainSha;
},
{
timeout: 60_000,
interval: 5_000,
label: "Gitea main branch updates after second force-push",
},
);
const newBranch = await giteaApi.getBranch(
GITEA_MIRROR_ORG,
"my-project",
"main",
);
const newSha = newBranch.commit.id;
console.log(
`[ForcePush] After 2nd force-push: main=${newSha.substring(0, 12)}, ` +
`msg="${newBranch.commit.message?.trim()}"`,
);
expect(newSha).not.toBe(originalMainSha);
// Verify the second force-push marker
const marker = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"SECOND_FORCE_PUSH.txt",
);
expect(marker, "Second force-push marker should exist").toBeTruthy();
// LICENSE should be gone again
const license = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"LICENSE",
);
expect(license, "LICENSE gone again after 2nd force-push").toBeNull();
console.log("[ForcePush] Second force-push verified in Gitea");
});
// ── F7: Verify backup activity was logged for the second force-push ──────
test("F7: Verify backup activity was recorded for protected sync", async ({
request,
}) => {
if (!appCookies) {
appCookies = await getAppSessionCookies(request);
}
const activitiesResp = await request.get(`${APP_URL}/api/activities`, {
headers: { Cookie: appCookies },
failOnStatusCode: false,
});
if (!activitiesResp.ok()) {
console.log(
`[ForcePush] Could not fetch activities: ${activitiesResp.status()}`,
);
return;
}
const activities = await activitiesResp.json();
const jobs: any[] = Array.isArray(activities)
? activities
: (activities.jobs ?? activities.activities ?? []);
// Filter to backup/snapshot entries for my-project
const backupJobs = jobs.filter(
(j: any) =>
j.repositoryName === "my-project" &&
(j.message?.toLowerCase().includes("snapshot") ||
j.message?.toLowerCase().includes("backup") ||
j.details?.toLowerCase().includes("snapshot") ||
j.details?.toLowerCase().includes("backup") ||
j.details?.toLowerCase().includes("bundle")),
);
console.log(
`[ForcePush] Backup activity for my-project: ${backupJobs.length} entries`,
);
for (const j of backupJobs) {
console.log(
`[ForcePush] • [${j.status}] ${j.message ?? ""} | ${(j.details ?? "").substring(0, 100)}`,
);
}
// The backup system should have been invoked and must succeed.
expect(
backupJobs.length,
"At least one backup/snapshot activity should exist for my-project " +
"when backupStrategy is 'always'",
).toBeGreaterThan(0);
// Check whether any backups actually succeeded
const successfulBackups = backupJobs.filter(
(j: any) =>
j.status === "syncing" ||
j.message?.includes("Snapshot created") ||
j.details?.includes("Pre-sync snapshot created"),
);
const failedBackups = backupJobs.filter(
(j: any) =>
j.status === "failed" &&
(j.message?.includes("Snapshot failed") ||
j.details?.includes("snapshot failed")),
);
if (successfulBackups.length > 0) {
console.log(
`[ForcePush] ✓ ${successfulBackups.length} backup(s) SUCCEEDED — ` +
`old state was preserved in bundle`,
);
}
if (failedBackups.length > 0) {
console.log(
`[ForcePush] ⚠ ${failedBackups.length} backup(s) FAILED`,
);
// Extract and log the first failure reason for visibility
const firstFailure = failedBackups[0];
console.log(
`[ForcePush] Failure reason: ${firstFailure.details?.substring(0, 200)}`,
);
}
console.log(
"[ForcePush] ════════════════════════════════════════════════════",
);
if (successfulBackups.length > 0) {
console.log(
"[ForcePush] RESULT: Backup system PROTECTED against force-push",
);
} else {
console.log("[ForcePush] RESULT: Backup system was INVOKED but FAILED.");
}
console.log(
"[ForcePush] ════════════════════════════════════════════════════\n",
);
// Fail the test if any backups failed
expect(
failedBackups.length,
`Expected all backups to succeed, but ${failedBackups.length} backup(s) failed. ` +
`First failure: ${failedBackups[0]?.details || "unknown error"}`,
).toBe(0);
});
// ── F8: Restore source repo for subsequent test suites ───────────────────
test("F8: Restore source repo to clean state for other tests", async () => {
mutateSourceRepo(MY_PROJECT_BARE, "my-project-final-restore", (workDir) => {
git("checkout main", workDir);
// Remove force-push artifacts
try {
execSync("rm -f FORCE_PUSH_MARKER.txt SECOND_FORCE_PUSH.txt", {
cwd: workDir,
});
} catch {
// ignore
}
// Restore content
writeFileSync(
join(workDir, "README.md"),
"# My Project\n\nA sample project for E2E testing.\n\n" +
"## Features\n- Greeting module\n- Math utilities\n",
);
writeFileSync(
join(workDir, "LICENSE"),
"MIT License\n\nCopyright (c) 2024 E2E Test\n",
);
git("add -A", workDir);
execSync(
'git commit --allow-empty -m "Final restore after force-push tests"',
{
cwd: workDir,
encoding: "utf-8",
stdio: ["pipe", "pipe", "pipe"],
env: {
...process.env,
GIT_AUTHOR_NAME: "E2E Test Bot",
GIT_AUTHOR_EMAIL: "e2e-bot@test.local",
GIT_COMMITTER_NAME: "E2E Test Bot",
GIT_COMMITTER_EMAIL: "e2e-bot@test.local",
},
},
);
});
// Sync Gitea
await giteaApi.triggerMirrorSync(GITEA_MIRROR_ORG, "my-project");
await new Promise((r) => setTimeout(r, 10_000));
// Verify restoration
const license = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"LICENSE",
);
if (license) {
console.log("[ForcePush] Source repo restored for subsequent tests");
} else {
console.log(
"[ForcePush] Warning: restoration may not have synced yet (Gitea async)",
);
}
});
});
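The force-push suites above lean heavily on a `waitFor` helper imported from `./helpers`, whose implementation is not shown in this diff. A minimal polling sketch consistent with how it is called (an async predicate plus `timeout`/`interval`/`label` options) might look like this; the body below is an assumption, not the actual helper:

```typescript
// Hypothetical sketch of the waitFor polling helper (the real one lives in
// ./helpers): re-evaluate an async predicate every `interval` ms until it
// returns true, failing with the `label` once `timeout` ms have elapsed.
async function waitFor(
  predicate: () => Promise<boolean>,
  opts: { timeout: number; interval: number; label?: string },
): Promise<void> {
  const deadline = Date.now() + opts.timeout;
  for (;;) {
    if (await predicate()) return;
    if (Date.now() >= deadline) {
      throw new Error(`Timed out waiting for: ${opts.label ?? "condition"}`);
    }
    await new Promise((r) => setTimeout(r, opts.interval));
  }
}
```

Called as in F6 above: `await waitFor(async () => branch.commit.id !== originalMainSha, { timeout: 60_000, interval: 5_000, label: "..." })`.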

@@ -0,0 +1,342 @@
/**
* 05 Sync verification and cleanup.
*
* Exercises the dynamic aspects of the sync pipeline:
* • Adding a repo to the fake GitHub at runtime and verifying the app
* discovers it on the next sync
* • Deep content-integrity checks on repos mirrored during earlier suites
* • Resetting the fake GitHub store to its defaults
*
* Prerequisites: 02-mirror-workflow.spec.ts must have run so that repos
* already exist in Gitea.
*/
import { test, expect } from "@playwright/test";
import {
APP_URL,
GITEA_URL,
FAKE_GITHUB_URL,
GITEA_MIRROR_ORG,
GiteaAPI,
getAppSessionCookies,
} from "./helpers";
test.describe("E2E: Sync verification", () => {
let giteaApi: GiteaAPI;
let appCookies = "";
test.beforeAll(async () => {
giteaApi = new GiteaAPI(GITEA_URL);
try {
await giteaApi.createToken();
} catch {
console.log("[SyncVerify] Could not create Gitea token; tests may skip");
}
});
test.afterAll(async () => {
await giteaApi.dispose();
});
// ── Dynamic repo addition ────────────────────────────────────────────────
test("Verify fake GitHub management API can add repos dynamically", async ({
request,
}) => {
const addResp = await request.post(`${FAKE_GITHUB_URL}/___mgmt/add-repo`, {
data: {
name: "dynamic-repo",
owner_login: "e2e-test-user",
description: "Dynamically added for E2E testing",
language: "Rust",
},
});
expect(addResp.ok()).toBeTruthy();
const repoResp = await request.get(
`${FAKE_GITHUB_URL}/repos/e2e-test-user/dynamic-repo`,
);
expect(repoResp.ok()).toBeTruthy();
const repo = await repoResp.json();
expect(repo.name).toBe("dynamic-repo");
expect(repo.language).toBe("Rust");
console.log("[DynamicRepo] Successfully added and verified dynamic repo");
});
test("Newly added fake GitHub repo gets picked up by sync", async ({
request,
}) => {
appCookies = await getAppSessionCookies(request);
const syncResp = await request.post(`${APP_URL}/api/sync`, {
headers: {
"Content-Type": "application/json",
Cookie: appCookies,
},
failOnStatusCode: false,
});
const status = syncResp.status();
console.log(`[DynamicSync] Sync response: ${status}`);
expect(status).toBeLessThan(500);
if (syncResp.ok()) {
const data = await syncResp.json();
console.log(
`[DynamicSync] New repos discovered: ${data.newRepositories ?? "?"}`,
);
if (data.newRepositories !== undefined) {
expect(data.newRepositories).toBeGreaterThanOrEqual(0);
}
}
});
// ── Content integrity ────────────────────────────────────────────────────
test("Verify repo content integrity after mirror", async () => {
// Check repos in the mirror org
const orgRepos = await giteaApi.listOrgRepos(GITEA_MIRROR_ORG);
const orgRepoNames = orgRepos.map((r: any) => r.name);
console.log(
`[Integrity] Repos in ${GITEA_MIRROR_ORG}: ${orgRepoNames.join(", ")}`,
);
// Check github-stars org for starred repos
const starsRepos = await giteaApi.listOrgRepos("github-stars");
const starsRepoNames = starsRepos.map((r: any) => r.name);
console.log(
`[Integrity] Repos in github-stars: ${starsRepoNames.join(", ")}`,
);
// ── notes repo (minimal single-commit repo) ──────────────────────────
if (orgRepoNames.includes("notes")) {
const notesReadme = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"notes",
"README.md",
);
if (notesReadme) {
expect(notesReadme).toContain("Notes");
console.log("[Integrity] notes/README.md verified");
}
const ideas = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"notes",
"ideas.md",
);
if (ideas) {
expect(ideas).toContain("Ideas");
console.log("[Integrity] notes/ideas.md verified");
}
const todo = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"notes",
"todo.md",
);
if (todo) {
expect(todo).toContain("TODO");
console.log("[Integrity] notes/todo.md verified");
}
}
// ── dotfiles repo ────────────────────────────────────────────────────
if (orgRepoNames.includes("dotfiles")) {
const vimrc = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"dotfiles",
".vimrc",
);
if (vimrc) {
expect(vimrc).toContain("set number");
console.log("[Integrity] dotfiles/.vimrc verified");
}
const gitconfig = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"dotfiles",
".gitconfig",
);
if (gitconfig) {
expect(gitconfig).toContain("[user]");
console.log("[Integrity] dotfiles/.gitconfig verified");
}
// Verify commit count (dotfiles has 2 commits)
const commits = await giteaApi.listCommits(
GITEA_MIRROR_ORG,
"dotfiles",
);
console.log(`[Integrity] dotfiles commit count: ${commits.length}`);
expect(
commits.length,
"dotfiles should have at least 2 commits",
).toBeGreaterThanOrEqual(2);
}
// ── popular-lib (starred repo from other-user) ───────────────────────
// In single-org strategy it goes to the starredReposOrg ("github-stars")
if (starsRepoNames.includes("popular-lib")) {
const readme = await giteaApi.getFileContent(
"github-stars",
"popular-lib",
"README.md",
);
if (readme) {
expect(readme).toContain("Popular Lib");
console.log("[Integrity] popular-lib/README.md verified");
}
const pkg = await giteaApi.getFileContent(
"github-stars",
"popular-lib",
"package.json",
);
if (pkg) {
const parsed = JSON.parse(pkg);
expect(parsed.name).toBe("popular-lib");
expect(parsed.version).toBe("2.5.0");
console.log("[Integrity] popular-lib/package.json verified");
}
const tags = await giteaApi.listTags("github-stars", "popular-lib");
const tagNames = tags.map((t: any) => t.name);
console.log(
`[Integrity] popular-lib tags: ${tagNames.join(", ") || "(none)"}`,
);
if (tagNames.length > 0) {
expect(tagNames).toContain("v2.5.0");
}
} else {
console.log(
"[Integrity] popular-lib not found in github-stars " +
"(may be in mirror org or not yet mirrored)",
);
}
// ── org-tool (organization repo) ─────────────────────────────────────
// org-tool may be in the mirror org or a separate org depending on
// the mirror strategy — check several possible locations.
const orgToolOwners = [GITEA_MIRROR_ORG, "test-org"];
let foundOrgTool = false;
for (const owner of orgToolOwners) {
const repo = await giteaApi.getRepo(owner, "org-tool");
if (repo) {
foundOrgTool = true;
console.log(`[Integrity] org-tool found in ${owner}`);
const readme = await giteaApi.getFileContent(
owner,
"org-tool",
"README.md",
);
if (readme) {
expect(readme).toContain("Org Tool");
console.log("[Integrity] org-tool/README.md verified");
}
const mainGo = await giteaApi.getFileContent(
owner,
"org-tool",
"main.go",
);
if (mainGo) {
expect(mainGo).toContain("package main");
console.log("[Integrity] org-tool/main.go verified");
}
// Check branches
const branches = await giteaApi.listBranches(owner, "org-tool");
const branchNames = branches.map((b: any) => b.name);
console.log(
`[Integrity] org-tool branches: ${branchNames.join(", ")}`,
);
if (branchNames.length > 0) {
expect(branchNames).toContain("main");
}
// Check tags
const tags = await giteaApi.listTags(owner, "org-tool");
const tagNames = tags.map((t: any) => t.name);
console.log(
`[Integrity] org-tool tags: ${tagNames.join(", ") || "(none)"}`,
);
break;
}
}
if (!foundOrgTool) {
console.log(
"[Integrity] org-tool not found in Gitea " +
"(may not have been mirrored in single-org strategy)",
);
}
});
// ── my-project deep check ────────────────────────────────────────────────
test("Verify my-project branch and tag structure", async () => {
const branches = await giteaApi.listBranches(
GITEA_MIRROR_ORG,
"my-project",
);
const branchNames = branches.map((b: any) => b.name);
console.log(
`[Integrity] my-project branches: ${branchNames.join(", ")}`,
);
// The source repo had main, develop, and feature/add-tests
expect(branchNames, "main branch should exist").toContain("main");
// develop and feature/add-tests may or may not survive force-push tests
// depending on test ordering, so just log them
for (const expected of ["develop", "feature/add-tests"]) {
if (branchNames.includes(expected)) {
console.log(`[Integrity] ✓ Branch "${expected}" present`);
} else {
console.log(`[Integrity] ⊘ Branch "${expected}" not present (may have been affected by force-push tests)`);
}
}
const tags = await giteaApi.listTags(GITEA_MIRROR_ORG, "my-project");
const tagNames = tags.map((t: any) => t.name);
console.log(
`[Integrity] my-project tags: ${tagNames.join(", ") || "(none)"}`,
);
// Verify package.json exists and is valid JSON
const pkg = await giteaApi.getFileContent(
GITEA_MIRROR_ORG,
"my-project",
"package.json",
);
if (pkg) {
const parsed = JSON.parse(pkg);
expect(parsed.name).toBe("my-project");
console.log("[Integrity] my-project/package.json verified");
}
});
});
// ─── Fake GitHub reset ───────────────────────────────────────────────────────
test.describe("E2E: Fake GitHub reset", () => {
test("Can reset fake GitHub to default state", async ({ request }) => {
const resp = await request.post(`${FAKE_GITHUB_URL}/___mgmt/reset`);
expect(resp.ok()).toBeTruthy();
const data = await resp.json();
expect(data.message).toContain("reset");
console.log("[Reset] Fake GitHub reset to defaults");
const health = await request.get(`${FAKE_GITHUB_URL}/___mgmt/health`);
const healthData = await health.json();
expect(healthData.repos).toBeGreaterThan(0);
console.log(
`[Reset] After reset: ${healthData.repos} repos, ${healthData.orgs} orgs`,
);
});
});
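The content-integrity checks above call `giteaApi.getFileContent`, defined in `./helpers` and not shown in this diff. Assuming it wraps Gitea's `GET /api/v1/repos/{owner}/{repo}/contents/{path}` endpoint, which returns file contents base64-encoded, the decode step might be sketched as follows (`decodeContents` is a hypothetical name, not the helper's API):

```typescript
// Hypothetical decode step for Gitea's contents API response. For regular
// files the endpoint returns { content: "<base64>", encoding: "base64" };
// a missing `content` field (or an upstream 404) maps to null here, which
// is what the integrity tests above treat as "file not present".
function decodeContents(body: {
  content?: string;
  encoding?: string;
}): string | null {
  if (!body.content) return null;
  if (body.encoding && body.encoding !== "base64") {
    throw new Error(`Unexpected contents encoding: ${body.encoding}`);
  }
  return Buffer.from(body.content, "base64").toString("utf-8");
}
```

This is why the tests can assert `readme.includes("Features")` directly on the returned string, and why a deleted file (like LICENSE after a force-push) shows up as `null`.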

tests/e2e/cleanup.sh Executable file

@@ -0,0 +1,141 @@
#!/usr/bin/env bash
# ────────────────────────────────────────────────────────────────────────────────
# E2E Cleanup Script
# Removes all temporary data from previous E2E test runs.
#
# Usage:
# ./tests/e2e/cleanup.sh # cleanup everything
# ./tests/e2e/cleanup.sh --soft # keep container images, only remove volumes/data
# ────────────────────────────────────────────────────────────────────────────────
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"
COMPOSE_FILE="$SCRIPT_DIR/docker-compose.e2e.yml"
SOFT_CLEAN=false
if [[ "${1:-}" == "--soft" ]]; then
SOFT_CLEAN=true
fi
# Detect container runtime (podman or docker)
if command -v podman-compose &>/dev/null; then
COMPOSE_CMD="podman-compose"
CONTAINER_CMD="podman"
elif command -v docker-compose &>/dev/null; then
COMPOSE_CMD="docker-compose"
CONTAINER_CMD="docker"
elif command -v docker &>/dev/null && docker compose version &>/dev/null; then
COMPOSE_CMD="docker compose"
CONTAINER_CMD="docker"
else
echo "[cleanup] WARNING: No container compose tool found. Skipping container cleanup."
COMPOSE_CMD=""
CONTAINER_CMD=""
fi
echo "╔══════════════════════════════════════════════════════════════╗"
echo "║ E2E Test Cleanup ║"
echo "╚══════════════════════════════════════════════════════════════╝"
echo ""
# ── 1. Stop and remove containers ─────────────────────────────────────────────
if [[ -n "$COMPOSE_CMD" ]] && [[ -f "$COMPOSE_FILE" ]]; then
echo "[cleanup] Stopping E2E containers..."
$COMPOSE_CMD -f "$COMPOSE_FILE" down --volumes --remove-orphans 2>/dev/null || true
echo "[cleanup] ✓ Containers stopped and removed"
else
echo "[cleanup] ⊘ No compose file or runtime found, skipping container teardown"
fi
# ── 2. Remove named volumes created by E2E compose ───────────────────────────
if [[ -n "$CONTAINER_CMD" ]]; then
for vol in e2e-gitea-data; do
full_vol_name="e2e_${vol}"
# Try both with and without the project prefix
for candidate in "$vol" "$full_vol_name" "tests_e2e_${vol}"; do
if $CONTAINER_CMD volume inspect "$candidate" &>/dev/null; then
echo "[cleanup] Removing volume: $candidate"
$CONTAINER_CMD volume rm -f "$candidate" 2>/dev/null || true
fi
done
done
echo "[cleanup] ✓ Named volumes cleaned"
fi
# ── 3. Kill leftover background processes from previous runs ──────────────────
echo "[cleanup] Checking for leftover processes..."
# Kill fake GitHub server
if pgrep -f "fake-github-server" &>/dev/null; then
echo "[cleanup] Killing leftover fake-github-server process(es)..."
pkill -f "fake-github-server" 2>/dev/null || true
fi
# Kill any stray node/tsx processes on our E2E ports (including git-server on 4590)
for port in 4580 4590 4321 3333; do
pid=$(lsof -ti :"$port" 2>/dev/null || true)
if [[ -n "$pid" ]]; then
echo "[cleanup] Killing process on port $port (PID: $pid)..."
kill -9 $pid 2>/dev/null || true
fi
done
echo "[cleanup] ✓ Leftover processes cleaned"
# ── 4. Remove E2E database and data files ─────────────────────────────────────
echo "[cleanup] Removing E2E data files..."
# Remove test databases
rm -f "$PROJECT_ROOT/gitea-mirror.db" 2>/dev/null || true
rm -f "$PROJECT_ROOT/data/gitea-mirror.db" 2>/dev/null || true
rm -f "$PROJECT_ROOT/e2e-gitea-mirror.db" 2>/dev/null || true
# Remove test backup data
rm -rf "$PROJECT_ROOT/data/repo-backups"* 2>/dev/null || true
# Remove programmatically created test git repositories
if [[ -d "$SCRIPT_DIR/git-repos" ]]; then
echo "[cleanup] Removing test git repos..."
rm -rf "$SCRIPT_DIR/git-repos" 2>/dev/null || true
echo "[cleanup] ✓ Test git repos removed"
fi
# Remove Playwright state/artifacts from previous runs
rm -rf "$SCRIPT_DIR/test-results" 2>/dev/null || true
rm -rf "$SCRIPT_DIR/playwright-report" 2>/dev/null || true
rm -rf "$SCRIPT_DIR/.auth" 2>/dev/null || true
rm -f "$SCRIPT_DIR/e2e-storage-state.json" 2>/dev/null || true
# Remove any PID files we might have created
rm -f "$SCRIPT_DIR/.fake-github.pid" 2>/dev/null || true
rm -f "$SCRIPT_DIR/.app.pid" 2>/dev/null || true
echo "[cleanup] ✓ Data files cleaned"
# ── 5. Remove temp directories ────────────────────────────────────────────────
echo "[cleanup] Removing temp directories..."
rm -rf /tmp/gitea-mirror-backup-* 2>/dev/null || true
rm -rf /tmp/e2e-gitea-mirror-* 2>/dev/null || true
echo "[cleanup] ✓ Temp directories cleaned"
# ── 6. Optionally remove container images ─────────────────────────────────────
if [[ "$SOFT_CLEAN" == false ]] && [[ -n "$CONTAINER_CMD" ]]; then
echo "[cleanup] Pruning dangling images..."
$CONTAINER_CMD image prune -f 2>/dev/null || true
echo "[cleanup] ✓ Dangling images pruned"
else
echo "[cleanup] ⊘ Skipping image cleanup (soft mode or no container runtime)"
fi
# ── 7. Remove node_modules/.cache artifacts from E2E ──────────────────────────
if [[ -d "$PROJECT_ROOT/node_modules/.cache/playwright" ]]; then
echo "[cleanup] Removing Playwright cache..."
rm -rf "$PROJECT_ROOT/node_modules/.cache/playwright" 2>/dev/null || true
echo "[cleanup] ✓ Playwright cache removed"
fi
echo ""
echo "═══════════════════════════════════════════════════════════════"
echo " ✅ E2E cleanup complete"
echo "═══════════════════════════════════════════════════════════════"


@@ -0,0 +1,522 @@
#!/usr/bin/env bun
/**
* create-test-repos.ts
*
* Programmatically creates bare git repositories with real commits, branches,
* and tags so that Gitea can actually clone them during E2E testing.
*
* Repos are created under <outputDir>/<owner>/<name>.git as bare repositories.
* After creation, `git update-server-info` is run on each so they can be served
* via the "dumb HTTP" protocol by any static file server (nginx, darkhttpd, etc.).
*
* Usage:
* bun run tests/e2e/create-test-repos.ts [--output-dir tests/e2e/git-repos]
*
* The script creates the following repositories matching the fake GitHub server's
* default store:
*
* e2e-test-user/my-project.git repo with commits, branches, tags, README
* e2e-test-user/dotfiles.git simple repo with a few config files
* e2e-test-user/notes.git minimal repo with one commit
* other-user/popular-lib.git starred repo from another user
* test-org/org-tool.git organization repository
*/
import { execSync } from "node:child_process";
import { mkdirSync, rmSync, writeFileSync, existsSync } from "node:fs";
import { join, resolve } from "node:path";
// ─── Configuration ───────────────────────────────────────────────────────────
const DEFAULT_OUTPUT_DIR = join(import.meta.dir, "git-repos");
const outputDir = (() => {
const idx = process.argv.indexOf("--output-dir");
if (idx !== -1 && process.argv[idx + 1]) {
return resolve(process.argv[idx + 1]);
}
return DEFAULT_OUTPUT_DIR;
})();
// ─── Helpers ─────────────────────────────────────────────────────────────────
function git(args: string, cwd: string): string {
try {
return execSync(`git ${args}`, {
cwd,
encoding: "utf-8",
stdio: ["pipe", "pipe", "pipe"],
env: {
...process.env,
// Deterministic committer for reproducible repos
GIT_AUTHOR_NAME: "E2E Test Bot",
GIT_AUTHOR_EMAIL: "e2e-bot@test.local",
GIT_AUTHOR_DATE: "2024-01-15T10:00:00+00:00",
GIT_COMMITTER_NAME: "E2E Test Bot",
GIT_COMMITTER_EMAIL: "e2e-bot@test.local",
GIT_COMMITTER_DATE: "2024-01-15T10:00:00+00:00",
},
}).trim();
} catch (err: any) {
const stderr = err.stderr?.toString() ?? "";
const stdout = err.stdout?.toString() ?? "";
throw new Error(
`git ${args} failed in ${cwd}:\n${stderr || stdout || err.message}`,
);
}
}
/** Increment the fake date for each commit so they have unique timestamps */
let commitCounter = 0;
function gitCommit(msg: string, cwd: string): void {
commitCounter++;
const date = `2024-01-15T${String(10 + Math.floor(commitCounter / 60)).padStart(2, "0")}:${String(commitCounter % 60).padStart(2, "0")}:00+00:00`;
execSync(`git commit -m "${msg}"`, {
cwd,
encoding: "utf-8",
stdio: ["pipe", "pipe", "pipe"],
env: {
...process.env,
GIT_AUTHOR_NAME: "E2E Test Bot",
GIT_AUTHOR_EMAIL: "e2e-bot@test.local",
GIT_AUTHOR_DATE: date,
GIT_COMMITTER_NAME: "E2E Test Bot",
GIT_COMMITTER_EMAIL: "e2e-bot@test.local",
GIT_COMMITTER_DATE: date,
},
});
}
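The timestamp arithmetic inside `gitCommit` can be read as a pure function: commit N is stamped 2024-01-15 10:00 UTC plus N minutes, with minute overflow carried into the hour so every commit gets a unique, ordered, reproducible date. A hypothetical extraction (`commitDate` is not part of the script):

```typescript
// Hypothetical pure form of gitCommit's timestamp scheme: commit `counter`
// lands `counter` minutes after 2024-01-15T10:00:00Z, carrying overflow
// minutes into the hour field.
function commitDate(counter: number): string {
  const hour = String(10 + Math.floor(counter / 60)).padStart(2, "0");
  const minute = String(counter % 60).padStart(2, "0");
  return `2024-01-15T${hour}:${minute}:00+00:00`;
}
```

Keeping both author and committer dates deterministic means the generated bare repos hash identically across runs, which makes mirror-sync comparisons stable.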
function writeFile(repoDir: string, relPath: string, content: string): void {
const fullPath = join(repoDir, relPath);
const dir = fullPath.substring(0, fullPath.lastIndexOf("/"));
if (dir && !existsSync(dir)) {
mkdirSync(dir, { recursive: true });
}
writeFileSync(fullPath, content, "utf-8");
}
interface RepoSpec {
owner: string;
name: string;
description: string;
/** Function that populates the working repo with commits/branches/tags */
populate: (workDir: string) => void;
}
/**
* Creates a bare repo at <outputDir>/<owner>/<name>.git
* by first building a working repo, then cloning it as bare.
*/
function createBareRepo(spec: RepoSpec): string {
const barePath = join(outputDir, spec.owner, `${spec.name}.git`);
const workPath = join(outputDir, ".work", spec.owner, spec.name);
// Clean previous
rmSync(barePath, { recursive: true, force: true });
rmSync(workPath, { recursive: true, force: true });
// Create working repo
mkdirSync(workPath, { recursive: true });
git("init -b main", workPath);
git("config user.name 'E2E Test Bot'", workPath);
git("config user.email 'e2e-bot@test.local'", workPath);
// Populate with content
spec.populate(workPath);
// Clone as bare
mkdirSync(join(outputDir, spec.owner), { recursive: true });
git(`clone --bare "${workPath}" "${barePath}"`, outputDir);
// Enable dumb HTTP protocol support
git("update-server-info", barePath);
// Also enable the post-update hook so update-server-info runs on push
const hookPath = join(barePath, "hooks", "post-update");
mkdirSync(join(barePath, "hooks"), { recursive: true });
writeFileSync(hookPath, "#!/bin/sh\nexec git update-server-info\n", {
mode: 0o755,
});
return barePath;
}
// ─── Repository Definitions ──────────────────────────────────────────────────
const repos: RepoSpec[] = [
// ── my-project: feature-rich repo ────────────────────────────────────────
{
owner: "e2e-test-user",
name: "my-project",
description: "A test project with branches, tags, and multiple commits",
populate(dir) {
// Initial commit
writeFile(
dir,
"README.md",
"# My Project\n\nA sample project for E2E testing.\n",
);
writeFile(
dir,
"package.json",
JSON.stringify(
{
name: "my-project",
version: "1.0.0",
description: "E2E test project",
main: "index.js",
},
null,
2,
) + "\n",
);
writeFile(
dir,
"index.js",
'// Main entry point\nconsole.log("Hello from my-project");\n',
);
writeFile(dir, ".gitignore", "node_modules/\ndist/\n.env\n");
git("add -A", dir);
gitCommit("Initial commit", dir);
// Second commit
writeFile(
dir,
"src/lib.js",
"export function greet(name) {\n return `Hello, ${name}!`;\n}\n",
);
writeFile(
dir,
"src/utils.js",
"export function sum(a, b) {\n return a + b;\n}\n",
);
git("add -A", dir);
gitCommit("Add library modules", dir);
// Tag v1.0.0
git("tag -a v1.0.0 -m 'Initial release'", dir);
// Create develop branch
git("checkout -b develop", dir);
writeFile(
dir,
"src/feature.js",
"export function newFeature() {\n return 'coming soon';\n}\n",
);
git("add -A", dir);
gitCommit("Add new feature placeholder", dir);
// Create feature branch from develop
git("checkout -b feature/add-tests", dir);
writeFile(
dir,
"tests/lib.test.js",
`import { greet } from '../src/lib.js';
import { sum } from '../src/utils.js';
console.assert(greet('World') === 'Hello, World!');
console.assert(sum(2, 3) === 5);
console.log('All tests passed');
`,
);
git("add -A", dir);
gitCommit("Add unit tests", dir);
// Go back to main and add another commit
git("checkout main", dir);
writeFile(
dir,
"README.md",
"# My Project\n\nA sample project for E2E testing.\n\n## Features\n- Greeting module\n- Math utilities\n",
);
git("add -A", dir);
gitCommit("Update README with features list", dir);
// Tag v1.1.0
git("tag -a v1.1.0 -m 'Feature update'", dir);
// Third commit on main for more history
writeFile(dir, "LICENSE", "MIT License\n\nCopyright (c) 2024 E2E Test\n");
git("add -A", dir);
gitCommit("Add MIT license", dir);
},
},
// ── dotfiles: simple config repo ─────────────────────────────────────────
{
owner: "e2e-test-user",
name: "dotfiles",
description: "Personal configuration files",
populate(dir) {
writeFile(
dir,
".bashrc",
"# Bash configuration\nalias ll='ls -la'\nalias gs='git status'\nexport EDITOR=vim\n",
);
writeFile(
dir,
".vimrc",
'" Vim configuration\nset number\nset tabstop=2\nset shiftwidth=2\nset expandtab\nsyntax on\n',
);
writeFile(
dir,
".gitconfig",
"[user]\n name = E2E Test User\n email = e2e@test.local\n[alias]\n co = checkout\n br = branch\n st = status\n",
);
git("add -A", dir);
gitCommit("Add dotfiles", dir);
writeFile(
dir,
".tmux.conf",
"# Tmux configuration\nset -g mouse on\nset -g default-terminal 'screen-256color'\n",
);
writeFile(
dir,
"install.sh",
'#!/bin/bash\n# Symlink dotfiles to home\nfor f in .bashrc .vimrc .gitconfig .tmux.conf; do\n ln -sf "$(pwd)/$f" "$HOME/$f"\ndone\necho \'Dotfiles installed!\'\n',
);
git("add -A", dir);
gitCommit("Add tmux config and install script", dir);
},
},
// ── notes: minimal single-commit repo ────────────────────────────────────
{
owner: "e2e-test-user",
name: "notes",
description: "Personal notes and documentation",
populate(dir) {
writeFile(
dir,
"README.md",
"# Notes\n\nA collection of personal notes.\n",
);
writeFile(
dir,
"ideas.md",
"# Ideas\n\n- Build a mirror tool\n- Automate backups\n- Learn Rust\n",
);
writeFile(
dir,
"todo.md",
"# TODO\n\n- [x] Set up repository\n- [ ] Add more notes\n- [ ] Organize by topic\n",
);
git("add -A", dir);
gitCommit("Initial notes", dir);
},
},
// ── popular-lib: starred repo from another user ──────────────────────────
{
owner: "other-user",
name: "popular-lib",
description: "A popular library that we starred",
populate(dir) {
writeFile(
dir,
"README.md",
"# Popular Lib\n\nA widely-used utility library.\n\n## Installation\n\n```bash\nnpm install popular-lib\n```\n",
);
writeFile(
dir,
"package.json",
JSON.stringify(
{
name: "popular-lib",
version: "2.5.0",
description: "A widely-used utility library",
main: "dist/index.js",
license: "Apache-2.0",
},
null,
2,
) + "\n",
);
writeFile(
dir,
"src/index.ts",
`/**
* Popular Lib - utility functions
*/
export function capitalize(str: string): string {
return str.charAt(0).toUpperCase() + str.slice(1);
}
export function slugify(str: string): string {
return str.toLowerCase().replace(/\\s+/g, '-').replace(/[^a-z0-9-]/g, '');
}
export function truncate(str: string, len: number): string {
if (str.length <= len) return str;
return str.slice(0, len) + '...';
}
`,
);
git("add -A", dir);
gitCommit("Initial release of popular-lib", dir);
git("tag -a v2.5.0 -m 'Stable release 2.5.0'", dir);
// Add a second commit
writeFile(
dir,
"CHANGELOG.md",
"# Changelog\n\n## 2.5.0\n- Added capitalize, slugify, truncate\n\n## 2.4.0\n- Bug fixes\n",
);
git("add -A", dir);
gitCommit("Add changelog", dir);
},
},
// ── org-tool: organization repo ──────────────────────────────────────────
{
owner: "test-org",
name: "org-tool",
description: "Internal organization tooling",
populate(dir) {
writeFile(
dir,
"README.md",
"# Org Tool\n\nInternal tooling for test-org.\n\n## Usage\n\n```bash\norg-tool run <command>\n```\n",
);
writeFile(
dir,
"main.go",
`package main
import "fmt"
func main() {
\tfmt.Println("org-tool v0.1.0")
}
`,
);
writeFile(
dir,
"go.mod",
"module github.com/test-org/org-tool\n\ngo 1.21\n",
);
writeFile(
dir,
"Makefile",
"build:\n\tgo build -o org-tool .\n\ntest:\n\tgo test ./...\n\nclean:\n\trm -f org-tool\n",
);
git("add -A", dir);
gitCommit("Initial org tool", dir);
// Add a release branch
git("checkout -b release/v0.1", dir);
writeFile(dir, "VERSION", "0.1.0\n");
git("add -A", dir);
gitCommit("Pin version for release", dir);
git("tag -a v0.1.0 -m 'Release v0.1.0'", dir);
// Back to main with more work
git("checkout main", dir);
writeFile(
dir,
"cmd/serve.go",
`package cmd
import "fmt"
func Serve() {
\tfmt.Println("Starting server on :8080")
}
`,
);
git("add -A", dir);
gitCommit("Add serve command", dir);
},
},
];
// ─── Main ────────────────────────────────────────────────────────────────────
function main() {
console.log(
"╔══════════════════════════════════════════════════════════════╗",
);
console.log(
"║ Create E2E Test Git Repositories ║",
);
console.log(
"╠══════════════════════════════════════════════════════════════╣",
);
console.log(`║ Output directory: ${outputDir}`);
console.log(`║ Repositories: ${repos.length}`);
console.log(
"╚══════════════════════════════════════════════════════════════╝",
);
console.log("");
// Verify git is available
try {
const version = execSync("git --version", { encoding: "utf-8" }).trim();
console.log(`[setup] Git version: ${version}`);
} catch {
console.error("ERROR: git is not installed or not in PATH");
process.exit(1);
}
// Clean output directory (preserve the directory itself)
if (existsSync(outputDir)) {
console.log("[setup] Cleaning previous repos...");
rmSync(outputDir, { recursive: true, force: true });
}
mkdirSync(outputDir, { recursive: true });
// Create each repository
const created: string[] = [];
for (const spec of repos) {
const label = `${spec.owner}/${spec.name}`;
console.log(`\n[repo] Creating ${label} ...`);
try {
const barePath = createBareRepo(spec);
console.log(`[repo] ✓ ${label} -> ${barePath}`);
created.push(label);
} catch (err) {
console.error(`[repo] ✗ ${label} FAILED:`, err);
process.exit(1);
}
}
// Cleanup working directories
const workDir = join(outputDir, ".work");
if (existsSync(workDir)) {
rmSync(workDir, { recursive: true, force: true });
}
// Write a manifest file so other scripts know what repos exist
const manifest = {
createdAt: new Date().toISOString(),
outputDir,
repos: repos.map((r) => ({
owner: r.owner,
name: r.name,
description: r.description,
barePath: `${r.owner}/${r.name}.git`,
})),
};
writeFileSync(
join(outputDir, "manifest.json"),
JSON.stringify(manifest, null, 2) + "\n",
"utf-8",
);
console.log(
"\n═══════════════════════════════════════════════════════════════",
);
console.log(` ✅ Created ${created.length} bare repositories:`);
for (const name of created) {
console.log(`${name}.git`);
}
console.log(`\n Manifest: ${join(outputDir, "manifest.json")}`);
console.log(
"═══════════════════════════════════════════════════════════════",
);
}
main();
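As a sanity check on what this script produces, the same shape (a bare repo with an annotated tag and two commits, laid out as `<owner>/<name>.git`) can be reproduced with plain git. Paths below are hypothetical scratch directories, and `git init -b main` assumes git >= 2.28:

```shell
set -e
work=$(mktemp -d)
# Build a working repo the way populate() does: commit, tag, commit again.
git init -q -b main "$work/src"
git -C "$work/src" -c user.email=e2e@test.local -c user.name=e2e \
  commit -q --allow-empty -m "Initial release of popular-lib"
git -C "$work/src" tag -a v2.5.0 -m "Stable release 2.5.0"
git -C "$work/src" -c user.email=e2e@test.local -c user.name=e2e \
  commit -q --allow-empty -m "Add changelog"
# Mirror into the bare <owner>/<name>.git layout the git-server serves.
mkdir -p "$work/test-user"
git clone -q --bare "$work/src" "$work/test-user/popular-lib.git"
git --git-dir="$work/test-user/popular-lib.git" tag                   # lists: v2.5.0
git --git-dir="$work/test-user/popular-lib.git" rev-list --count main # prints: 2
```

A bare clone carries over all refs, so both the branch history and the annotated tag survive into the served repository.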


@@ -0,0 +1,105 @@
# E2E testing environment
# Spins up a Gitea instance and a git HTTP server for integration testing.
#
# The git-server container serves bare git repositories created by
# create-test-repos.ts via the "dumb HTTP" protocol so that Gitea can
# actually clone them during mirror operations.
#
# Usage: podman-compose -f tests/e2e/docker-compose.e2e.yml up -d
services:
gitea-e2e:
image: docker.io/gitea/gitea:1.22
container_name: gitea-e2e
environment:
- USER_UID=1000
- USER_GID=1000
- GITEA__database__DB_TYPE=sqlite3
- GITEA__database__PATH=/data/gitea/gitea.db
- GITEA__server__DOMAIN=localhost
- GITEA__server__ROOT_URL=http://localhost:3333/
- GITEA__server__HTTP_PORT=3000
- GITEA__server__SSH_DOMAIN=localhost
- GITEA__server__START_SSH_SERVER=false
- GITEA__security__INSTALL_LOCK=true
- GITEA__service__DISABLE_REGISTRATION=false
- GITEA__service__REQUIRE_SIGNIN_VIEW=false
- GITEA__api__ENABLE_SWAGGER=false
- GITEA__log__MODE=console
- GITEA__log__LEVEL=Warn
- GITEA__mirror__ENABLED=true
- GITEA__mirror__DEFAULT_INTERVAL=1m
- GITEA__mirror__MIN_INTERVAL=1m
# Allow migrations from any domain including the git-server container
- GITEA__migrations__ALLOWED_DOMAINS=*
- GITEA__migrations__ALLOW_LOCAL_NETWORKS=true
- GITEA__migrations__SKIP_TLS_VERIFY=true
ports:
- "3333:3000"
volumes:
- e2e-gitea-data:/data
depends_on:
git-server:
condition: service_started
healthcheck:
test:
[
"CMD",
"wget",
"--no-verbose",
"--tries=1",
"--spider",
"http://localhost:3000/",
]
interval: 5s
timeout: 5s
retries: 30
start_period: 10s
tmpfs:
- /tmp
networks:
- e2e-net
# Lightweight HTTP server that serves bare git repositories.
# Repos are created on the host by create-test-repos.ts and bind-mounted
# into this container. Gitea clones from http://git-server/<owner>/<name>.git
# using the "dumb HTTP" protocol (each repo has had git update-server-info run).
git-server:
image: docker.io/alpine:3.19
container_name: git-server
command:
- sh
- -c
- |
apk add --no-cache darkhttpd >/dev/null 2>&1
echo "[git-server] Serving repos from /repos on port 80"
ls -la /repos/ 2>/dev/null || echo "[git-server] WARNING: /repos is empty"
exec darkhttpd /repos --port 80 --no-listing --log /dev/stdout
volumes:
- ./git-repos:/repos:ro
ports:
- "4590:80"
healthcheck:
test:
[
"CMD",
"wget",
"--no-verbose",
"--tries=1",
"--spider",
"http://localhost:80/manifest.json",
]
interval: 3s
timeout: 3s
retries: 15
start_period: 5s
networks:
- e2e-net
networks:
e2e-net:
driver: bridge
volumes:
e2e-gitea-data:
driver: local
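The dumb HTTP protocol relied on above only works if each bare repo has its `info/refs` file generated; `create-test-repos.ts` is assumed to do this via `git update-server-info`, which can be sketched as (hypothetical paths):

```shell
set -e
repos=$(mktemp -d)
mkdir -p "$repos/test-user"
git init -q --bare "$repos/test-user/demo.git"
# Writes info/refs (and objects/info/packs) so a plain static file
# server like darkhttpd is enough for `git clone` to discover refs.
git --git-dir="$repos/test-user/demo.git" update-server-info
ls "$repos/test-user/demo.git/info/refs"
```

Without this step, a smart-protocol-unaware server returns 404 for ref discovery and Gitea's mirror clone fails.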

File diff suppressed because it is too large

tests/e2e/helpers.ts Normal file

@@ -0,0 +1,666 @@
/**
* Shared helpers for E2E tests.
*
* Exports constants, the GiteaAPI wrapper, auth helpers (sign-up / sign-in),
* the saveConfig helper, and a generic waitFor polling utility.
*/
import {
expect,
request as playwrightRequest,
type Page,
type APIRequestContext,
} from "@playwright/test";
// ─── Constants ───────────────────────────────────────────────────────────────
export const APP_URL = process.env.APP_URL || "http://localhost:4321";
export const GITEA_URL = process.env.GITEA_URL || "http://localhost:3333";
export const FAKE_GITHUB_URL =
process.env.FAKE_GITHUB_URL || "http://localhost:4580";
export const GIT_SERVER_URL =
process.env.GIT_SERVER_URL || "http://localhost:4590";
export const GITEA_ADMIN_USER = "e2e_admin";
export const GITEA_ADMIN_PASS = "e2eAdminPass123!";
export const GITEA_ADMIN_EMAIL = "admin@e2e-test.local";
export const APP_USER_EMAIL = "e2e@test.local";
export const APP_USER_PASS = "E2eTestPass123!";
export const APP_USER_NAME = "e2e-tester";
export const GITEA_MIRROR_ORG = "github-mirrors";
// ─── waitFor ─────────────────────────────────────────────────────────────────
/** Retry a function until it returns truthy or timeout is reached. */
export async function waitFor(
fn: () => Promise<boolean>,
{
timeout = 60_000,
interval = 2_000,
label = "condition",
}: { timeout?: number; interval?: number; label?: string } = {},
): Promise<void> {
const deadline = Date.now() + timeout;
let lastErr: Error | undefined;
while (Date.now() < deadline) {
try {
if (await fn()) return;
} catch (e) {
lastErr = e instanceof Error ? e : new Error(String(e));
}
await new Promise((r) => setTimeout(r, interval));
}
throw new Error(
`waitFor("${label}") timed out after ${timeout}ms` +
(lastErr ? `: ${lastErr.message}` : ""),
);
}
// ─── GiteaAPI ────────────────────────────────────────────────────────────────
/**
* Direct HTTP helper for talking to Gitea's API.
*
* Uses a manually-created APIRequestContext so it can be shared across
* beforeAll / afterAll / individual tests without hitting Playwright's
* "fixture from beforeAll cannot be reused" restriction.
*/
export class GiteaAPI {
private token = "";
private ctx: APIRequestContext | null = null;
constructor(private baseUrl: string) {}
/** Lazily create (and cache) a Playwright APIRequestContext. */
private async getCtx(): Promise<APIRequestContext> {
if (!this.ctx) {
this.ctx = await playwrightRequest.newContext({
baseURL: this.baseUrl,
});
}
return this.ctx;
}
/** Dispose of the underlying context; call in afterAll. */
async dispose(): Promise<void> {
if (this.ctx) {
await this.ctx.dispose();
this.ctx = null;
}
}
/** Create the admin user via Gitea's sign-up form (first user becomes admin). */
async ensureAdminUser(): Promise<void> {
const ctx = await this.getCtx();
// Check if admin already exists by trying basic-auth
try {
const resp = await ctx.get(`/api/v1/user`, {
headers: {
Authorization: `Basic ${btoa(`${GITEA_ADMIN_USER}:${GITEA_ADMIN_PASS}`)}`,
},
failOnStatusCode: false,
});
if (resp.ok()) {
console.log("[GiteaAPI] Admin user already exists");
return;
}
} catch {
// Expected on first run
}
// Register through the sign-up form; the first user automatically becomes admin
console.log("[GiteaAPI] Creating admin via sign-up form...");
const signUpResp = await ctx.post(`/user/sign_up`, {
form: {
user_name: GITEA_ADMIN_USER,
password: GITEA_ADMIN_PASS,
retype: GITEA_ADMIN_PASS,
email: GITEA_ADMIN_EMAIL,
},
failOnStatusCode: false,
maxRedirects: 5,
});
console.log(`[GiteaAPI] Sign-up response status: ${signUpResp.status()}`);
// Verify
const check = await ctx.get(`/api/v1/user`, {
headers: {
Authorization: `Basic ${btoa(`${GITEA_ADMIN_USER}:${GITEA_ADMIN_PASS}`)}`,
},
failOnStatusCode: false,
});
if (!check.ok()) {
throw new Error(
`Failed to verify admin user after creation (status ${check.status()})`,
);
}
console.log("[GiteaAPI] Admin user verified");
}
/** Generate a Gitea API token for the admin user. */
async createToken(): Promise<string> {
if (this.token) return this.token;
const ctx = await this.getCtx();
const tokenName = `e2e-token-${Date.now()}`;
const resp = await ctx.post(`/api/v1/users/${GITEA_ADMIN_USER}/tokens`, {
headers: {
Authorization: `Basic ${btoa(`${GITEA_ADMIN_USER}:${GITEA_ADMIN_PASS}`)}`,
"Content-Type": "application/json",
},
data: {
name: tokenName,
scopes: [
"read:user",
"write:user",
"read:organization",
"write:organization",
"read:repository",
"write:repository",
"read:issue",
"write:issue",
"read:misc",
"write:misc",
"read:admin",
"write:admin",
],
},
});
expect(
resp.ok(),
`Failed to create Gitea token: ${resp.status()}`,
).toBeTruthy();
const data = await resp.json();
this.token = data.sha1 || data.token;
console.log(`[GiteaAPI] Created token: ${tokenName}`);
return this.token;
}
/** Create an organization in Gitea. */
async ensureOrg(orgName: string): Promise<void> {
const ctx = await this.getCtx();
const token = await this.createToken();
// Check if org exists
const check = await ctx.get(`/api/v1/orgs/${orgName}`, {
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
});
if (check.ok()) {
console.log(`[GiteaAPI] Org ${orgName} already exists`);
return;
}
const resp = await ctx.post(`/api/v1/orgs`, {
headers: {
Authorization: `token ${token}`,
"Content-Type": "application/json",
},
data: {
username: orgName,
full_name: orgName,
description: "E2E test mirror organization",
visibility: "public",
},
});
expect(resp.ok(), `Failed to create org: ${resp.status()}`).toBeTruthy();
console.log(`[GiteaAPI] Created org: ${orgName}`);
}
/** List repos in a Gitea org. */
async listOrgRepos(orgName: string): Promise<any[]> {
const ctx = await this.getCtx();
const token = await this.createToken();
const resp = await ctx.get(`/api/v1/orgs/${orgName}/repos`, {
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
});
if (!resp.ok()) return [];
return resp.json();
}
/** List repos for the admin user. */
async listUserRepos(): Promise<any[]> {
const ctx = await this.getCtx();
const token = await this.createToken();
const resp = await ctx.get(`/api/v1/users/${GITEA_ADMIN_USER}/repos`, {
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
});
if (!resp.ok()) return [];
return resp.json();
}
/** Get a specific repo. */
async getRepo(owner: string, name: string): Promise<any | null> {
const ctx = await this.getCtx();
const token = await this.createToken();
const resp = await ctx.get(`/api/v1/repos/${owner}/${name}`, {
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
});
if (!resp.ok()) return null;
return resp.json();
}
/** List branches for a repo. */
async listBranches(owner: string, name: string): Promise<any[]> {
const ctx = await this.getCtx();
const token = await this.createToken();
const resp = await ctx.get(`/api/v1/repos/${owner}/${name}/branches`, {
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
});
if (!resp.ok()) return [];
return resp.json();
}
/** List tags for a repo. */
async listTags(owner: string, name: string): Promise<any[]> {
const ctx = await this.getCtx();
const token = await this.createToken();
const resp = await ctx.get(`/api/v1/repos/${owner}/${name}/tags`, {
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
});
if (!resp.ok()) return [];
return resp.json();
}
/** List commits for a repo (on default branch). */
async listCommits(
owner: string,
name: string,
opts?: { sha?: string; limit?: number },
): Promise<any[]> {
const ctx = await this.getCtx();
const token = await this.createToken();
const params = new URLSearchParams();
if (opts?.sha) params.set("sha", opts.sha);
if (opts?.limit) params.set("limit", String(opts.limit));
const qs = params.toString() ? `?${params.toString()}` : "";
const resp = await ctx.get(
`/api/v1/repos/${owner}/${name}/commits${qs}`,
{
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
},
);
if (!resp.ok()) return [];
return resp.json();
}
/** Get a single branch (includes the commit SHA). */
async getBranch(
owner: string,
name: string,
branch: string,
): Promise<any | null> {
const ctx = await this.getCtx();
const token = await this.createToken();
const resp = await ctx.get(
`/api/v1/repos/${owner}/${name}/branches/${branch}`,
{
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
},
);
if (!resp.ok()) return null;
return resp.json();
}
/** Get file content from a repo. */
async getFileContent(
owner: string,
name: string,
filePath: string,
ref?: string,
): Promise<string | null> {
const ctx = await this.getCtx();
const token = await this.createToken();
const refQuery = ref ? `?ref=${encodeURIComponent(ref)}` : "";
const resp = await ctx.get(
`/api/v1/repos/${owner}/${name}/raw/${filePath}${refQuery}`,
{
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
},
);
if (!resp.ok()) return null;
return resp.text();
}
/** Get a commit by SHA. */
async getCommit(
owner: string,
name: string,
sha: string,
): Promise<any | null> {
const ctx = await this.getCtx();
const token = await this.createToken();
const resp = await ctx.get(
`/api/v1/repos/${owner}/${name}/git/commits/${sha}`,
{
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
},
);
if (!resp.ok()) return null;
return resp.json();
}
/** Trigger mirror sync for a repo via the Gitea API directly. */
async triggerMirrorSync(owner: string, name: string): Promise<boolean> {
const ctx = await this.getCtx();
const token = await this.createToken();
const resp = await ctx.post(
`/api/v1/repos/${owner}/${name}/mirror-sync`,
{
headers: { Authorization: `token ${token}` },
failOnStatusCode: false,
},
);
return resp.ok();
}
getTokenValue(): string {
return this.token;
}
}
// ─── App auth helpers ────────────────────────────────────────────────────────
/**
* Sign up + sign in to the gitea-mirror app using the Better Auth REST API
* and return the session cookie string.
*/
export async function getAppSessionCookies(
request: APIRequestContext,
): Promise<string> {
// 1. Try sign-in first (user may already exist from a previous test / run)
const signInResp = await request.post(`${APP_URL}/api/auth/sign-in/email`, {
data: { email: APP_USER_EMAIL, password: APP_USER_PASS },
failOnStatusCode: false,
});
if (signInResp.ok()) {
const cookies = extractSetCookies(signInResp);
if (cookies) {
console.log("[App] Signed in (existing user)");
return cookies;
}
}
// 2. Register
const signUpResp = await request.post(`${APP_URL}/api/auth/sign-up/email`, {
data: {
name: APP_USER_NAME,
email: APP_USER_EMAIL,
password: APP_USER_PASS,
},
failOnStatusCode: false,
});
const signUpStatus = signUpResp.status();
console.log(`[App] Sign-up response: ${signUpStatus}`);
// After sign-up Better Auth may already set a session cookie
const signUpCookies = extractSetCookies(signUpResp);
if (signUpCookies) {
console.log("[App] Got session from sign-up response");
return signUpCookies;
}
// 3. Sign in after registration
const postRegSignIn = await request.post(
`${APP_URL}/api/auth/sign-in/email`,
{
data: { email: APP_USER_EMAIL, password: APP_USER_PASS },
failOnStatusCode: false,
},
);
if (!postRegSignIn.ok()) {
const body = await postRegSignIn.text();
throw new Error(
`Sign-in after registration failed (${postRegSignIn.status()}): ${body}`,
);
}
const cookies = extractSetCookies(postRegSignIn);
if (!cookies) {
throw new Error("Sign-in succeeded but no session cookie was returned");
}
console.log("[App] Signed in (after registration)");
return cookies;
}
/**
* Extract session cookies from a response's `set-cookie` headers.
*/
export function extractSetCookies(
resp: Awaited<ReturnType<APIRequestContext["post"]>>,
): string {
const raw = resp
.headersArray()
.filter((h) => h.name.toLowerCase() === "set-cookie");
if (raw.length === 0) return "";
const pairs: string[] = [];
for (const header of raw) {
const nv = header.value.split(";")[0].trim();
if (nv) pairs.push(nv);
}
return pairs.join("; ");
}
/**
* Sign in via the browser UI so the browser context gets session cookies.
*/
export async function signInViaBrowser(page: Page): Promise<string> {
const signInResp = await page.request.post(
`${APP_URL}/api/auth/sign-in/email`,
{
data: { email: APP_USER_EMAIL, password: APP_USER_PASS },
failOnStatusCode: false,
},
);
if (!signInResp.ok()) {
const signUpResp = await page.request.post(
`${APP_URL}/api/auth/sign-up/email`,
{
data: {
name: APP_USER_NAME,
email: APP_USER_EMAIL,
password: APP_USER_PASS,
},
failOnStatusCode: false,
},
);
console.log(`[Browser] Sign-up status: ${signUpResp.status()}`);
const retryResp = await page.request.post(
`${APP_URL}/api/auth/sign-in/email`,
{
data: { email: APP_USER_EMAIL, password: APP_USER_PASS },
failOnStatusCode: false,
},
);
if (!retryResp.ok()) {
console.log(`[Browser] Sign-in retry failed: ${retryResp.status()}`);
}
}
await page.goto(`${APP_URL}/`);
await page.waitForLoadState("networkidle");
const url = page.url();
console.log(`[Browser] After sign-in, URL: ${url}`);
const cookies = await page.context().cookies();
return cookies.map((c) => `${c.name}=${c.value}`).join("; ");
}
// ─── Config helper ───────────────────────────────────────────────────────────
/** Save app config via the API. */
export async function saveConfig(
request: APIRequestContext,
giteaToken: string,
cookies: string,
overrides: Record<string, any> = {},
): Promise<void> {
const giteaConfigDefaults = {
url: GITEA_URL,
username: GITEA_ADMIN_USER,
token: giteaToken,
organization: GITEA_MIRROR_ORG,
visibility: "public",
starredReposOrg: "github-stars",
preserveOrgStructure: false,
mirrorStrategy: "single-org",
backupStrategy: "disabled",
blockSyncOnBackupFailure: false,
};
const configPayload = {
githubConfig: {
username: "e2e-test-user",
token: "fake-github-token-for-e2e",
privateRepositories: false,
mirrorStarred: true,
},
giteaConfig: { ...giteaConfigDefaults, ...(overrides.giteaConfig || {}) },
scheduleConfig: {
enabled: false,
interval: 3600,
},
cleanupConfig: {
enabled: false,
retentionDays: 86400,
deleteIfNotInGitHub: false,
orphanedRepoAction: "skip",
dryRun: true,
},
mirrorOptions: {
mirrorReleases: false,
mirrorLFS: false,
mirrorMetadata: false,
metadataComponents: {
issues: false,
pullRequests: false,
labels: false,
milestones: false,
wiki: false,
},
},
advancedOptions: {
skipForks: false,
starredCodeOnly: false,
},
};
const resp = await request.post(`${APP_URL}/api/config`, {
data: configPayload,
headers: {
"Content-Type": "application/json",
Cookie: cookies,
},
failOnStatusCode: false,
});
const status = resp.status();
console.log(`[App] Save config response: ${status}`);
if (status >= 400) {
const body = await resp.text();
console.log(`[App] Config error body: ${body}`);
}
expect(status, "Config save should not return server error").toBeLessThan(
500,
);
}
// ─── Dashboard / repo helpers ────────────────────────────────────────────────
/**
* Fetch the list of repository IDs from the app's dashboard API.
* Optionally filter to repos with a given status.
*/
export async function getRepositoryIds(
request: APIRequestContext,
cookies: string,
opts?: { status?: string },
): Promise<{ ids: string[]; repos: any[] }> {
const dashResp = await request.get(`${APP_URL}/api/dashboard`, {
headers: { Cookie: cookies },
failOnStatusCode: false,
});
if (!dashResp.ok()) return { ids: [], repos: [] };
const dashData = await dashResp.json();
const repos: any[] = dashData.repositories ?? dashData.repos ?? [];
const filtered = opts?.status
? repos.filter((r: any) => r.status === opts.status)
: repos;
return {
ids: filtered.map((r: any) => r.id),
repos: filtered,
};
}
/**
* Trigger mirror jobs for the given repository IDs via the app API,
* then wait for a specified delay for async processing.
*/
export async function triggerMirrorJobs(
request: APIRequestContext,
cookies: string,
repositoryIds: string[],
waitMs = 30_000,
): Promise<number> {
const mirrorResp = await request.post(`${APP_URL}/api/job/mirror-repo`, {
headers: {
"Content-Type": "application/json",
Cookie: cookies,
},
data: { repositoryIds },
failOnStatusCode: false,
});
const status = mirrorResp.status();
if (waitMs > 0) {
await new Promise((r) => setTimeout(r, waitMs));
}
return status;
}
/**
* Trigger sync-repo (re-sync already-mirrored repos) for the given
* repository IDs, then wait for processing.
*/
export async function triggerSyncRepo(
request: APIRequestContext,
cookies: string,
repositoryIds: string[],
waitMs = 25_000,
): Promise<number> {
const syncResp = await request.post(`${APP_URL}/api/job/sync-repo`, {
headers: {
"Content-Type": "application/json",
Cookie: cookies,
},
data: { repositoryIds },
failOnStatusCode: false,
});
const status = syncResp.status();
if (waitMs > 0) {
await new Promise((r) => setTimeout(r, waitMs));
}
return status;
}
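The `waitFor` helper above is the backbone of every polling step in the specs. A minimal usage sketch (the helper is restated inline so the snippet is self-contained; the condition is hypothetical):

```typescript
// Inline restatement of the helpers' waitFor contract.
async function waitFor(
  fn: () => Promise<boolean>,
  { timeout = 60_000, interval = 2_000, label = "condition" } = {},
): Promise<void> {
  const deadline = Date.now() + timeout;
  while (Date.now() < deadline) {
    if (await fn()) return;
    await new Promise((r) => setTimeout(r, interval));
  }
  throw new Error(`waitFor("${label}") timed out after ${timeout}ms`);
}

// Usage: poll until a condition flips true, as the E2E tests do
// when waiting for a repo's mirror status to settle.
async function demo(): Promise<number> {
  let attempts = 0;
  await waitFor(async () => ++attempts >= 3, {
    timeout: 1_000,
    interval: 10,
    label: "three attempts",
  });
  return attempts;
}
```

Swallowing errors inside the loop (as the real helper does) lets flaky intermediate states resolve themselves; only the final timeout surfaces the last error.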


@@ -0,0 +1,98 @@
import { defineConfig, devices } from "@playwright/test";
/**
* Playwright configuration for gitea-mirror E2E tests.
*
* Expected services (started by run-e2e.sh before Playwright launches):
* - Fake GitHub API server on http://localhost:4580
* - Git HTTP server on http://localhost:4590
* - Gitea instance on http://localhost:3333
* - gitea-mirror app on http://localhost:4321
*
* Test files are numbered to enforce execution order (they share state
* via a single Gitea + app instance):
*   01-health.spec.ts            - service smoke tests
*   02-mirror-workflow.spec.ts   - full first-mirror journey
*   03-backup.spec.ts            - backup config toggling
*   04-force-push.spec.ts        - force-push simulation & backup verification
*   05-sync-verification.spec.ts - dynamic repos, content integrity, reset
*/
export default defineConfig({
testDir: ".",
testMatch: /\d+-.*\.spec\.ts$/,
/* Fail the build on CI if test.only is left in source */
forbidOnly: !!process.env.CI,
/* Retry once on CI to absorb flakiness from container startup races */
retries: process.env.CI ? 1 : 0,
/* Limit parallelism: the tests share a single Gitea + app instance */
workers: 1,
fullyParallel: false,
/* Generous timeout: mirrors involve real HTTP round-trips to Gitea */
timeout: 120_000,
expect: { timeout: 15_000 },
/* Reporter */
reporter: process.env.CI
? [
["github"],
["html", { open: "never", outputFolder: "playwright-report" }],
]
: [
["list"],
["html", { open: "on-failure", outputFolder: "playwright-report" }],
],
outputDir: "test-results",
use: {
/* Base URL of the gitea-mirror app */
baseURL: process.env.APP_URL || "http://localhost:4321",
/* Collect traces on first retry so CI failures are debuggable */
trace: "on-first-retry",
screenshot: "only-on-failure",
video: "retain-on-failure",
/* Extra HTTP headers aren't strictly needed, but keep Accept consistent */
extraHTTPHeaders: {
Accept: "application/json, text/html, */*",
},
},
projects: [
{
name: "chromium",
use: { ...devices["Desktop Chrome"] },
},
],
/* We do NOT use webServer here because run-e2e.sh manages all services.
* On CI the GitHub Action workflow starts them before invoking Playwright.
* Locally, run-e2e.sh does the same.
*
* If you want Playwright to start the app for you during local dev, uncomment:
*
* webServer: [
* {
* command: "npx tsx tests/e2e/fake-github-server.ts",
* port: 4580,
* reuseExistingServer: true,
* timeout: 10_000,
* },
* {
* command: "bun run dev",
* port: 4321,
* reuseExistingServer: true,
* timeout: 30_000,
* env: {
* GITHUB_API_URL: "http://localhost:4580",
* BETTER_AUTH_SECRET: "e2e-test-secret",
* },
* },
* ],
*/
});
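The `testMatch` pattern in the config above is what enforces the numbered execution order: only digit-prefixed spec files are collected, and with `workers: 1` they run in lexical order. A quick sketch of what the regex admits (file names are illustrative):

```typescript
// The config's testMatch pattern: only numbered spec files are collected.
const testMatch = /\d+-.*\.spec\.ts$/;

const files = [
  "01-health.spec.ts",
  "02-mirror-workflow.spec.ts",
  "helpers.ts",          // shared helpers: not a spec, excluded
  "fixtures.spec.ts",    // spec without a numeric prefix: excluded
];
const selected = files.filter((f) => testMatch.test(f));
// selected: ["01-health.spec.ts", "02-mirror-workflow.spec.ts"]
```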

tests/e2e/run-e2e.sh Executable file

@@ -0,0 +1,455 @@
#!/usr/bin/env bash
# ────────────────────────────────────────────────────────────────────────────────
# E2E Test Orchestrator
#
# Starts all required services, runs Playwright E2E tests, and tears down.
#
# Services managed:
# 1. Gitea instance (Docker/Podman on port 3333)
# 2. Fake GitHub API (Node.js on port 4580)
# 3. gitea-mirror app (Astro dev server on port 4321)
#
# Usage:
# ./tests/e2e/run-e2e.sh # full run (cleanup → start → test → teardown)
# ./tests/e2e/run-e2e.sh --no-build # skip the Astro build step
# ./tests/e2e/run-e2e.sh --keep # don't tear down services after tests
# ./tests/e2e/run-e2e.sh --ci # CI-friendly mode (stricter, no --keep)
#
# Environment variables:
# GITEA_PORT (default: 3333)
# FAKE_GITHUB_PORT (default: 4580)
# APP_PORT (default: 4321)
# SKIP_CLEANUP (default: false) set "true" to skip initial cleanup
# BUN_CMD (default: auto-detected bun or "npx --yes bun")
# ────────────────────────────────────────────────────────────────────────────────
set -euo pipefail
# ─── Resolve paths ────────────────────────────────────────────────────────────
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"
COMPOSE_FILE="$SCRIPT_DIR/docker-compose.e2e.yml"
# ─── Configuration ────────────────────────────────────────────────────────────
GITEA_PORT="${GITEA_PORT:-3333}"
FAKE_GITHUB_PORT="${FAKE_GITHUB_PORT:-4580}"
APP_PORT="${APP_PORT:-4321}"
GIT_SERVER_PORT="${GIT_SERVER_PORT:-4590}"
GITEA_URL="http://localhost:${GITEA_PORT}"
FAKE_GITHUB_URL="http://localhost:${FAKE_GITHUB_PORT}"
APP_URL="http://localhost:${APP_PORT}"
GIT_SERVER_URL="http://localhost:${GIT_SERVER_PORT}"
# URL that Gitea (inside Docker) uses to reach the git-server container
GIT_SERVER_INTERNAL_URL="http://git-server"
NO_BUILD=false
KEEP_RUNNING=false
CI_MODE=false
for arg in "$@"; do
case "$arg" in
--no-build) NO_BUILD=true ;;
--keep) KEEP_RUNNING=true ;;
--ci) CI_MODE=true ;;
--help|-h)
echo "Usage: $0 [--no-build] [--keep] [--ci]"
exit 0
;;
esac
done
# ─── Detect tools ─────────────────────────────────────────────────────────────
# Container runtime
COMPOSE_CMD=""
CONTAINER_CMD=""
if command -v podman-compose &>/dev/null; then
COMPOSE_CMD="podman-compose"
CONTAINER_CMD="podman"
elif command -v docker-compose &>/dev/null; then
COMPOSE_CMD="docker-compose"
CONTAINER_CMD="docker"
elif command -v docker &>/dev/null && docker compose version &>/dev/null; then
COMPOSE_CMD="docker compose"
CONTAINER_CMD="docker"
else
echo "ERROR: No container compose tool found. Install docker-compose or podman-compose."
exit 1
fi
# Bun or fallback
if command -v bun &>/dev/null; then
BUN_CMD="${BUN_CMD:-bun}"
elif command -v npx &>/dev/null; then
# Use npx to run bun commands; works on CI with the setup-bun action
BUN_CMD="${BUN_CMD:-npx --yes bun}"
else
echo "ERROR: Neither bun nor npx found."
exit 1
fi
# Node/tsx for the fake GitHub server
if command -v tsx &>/dev/null; then
TSX_CMD="tsx"
elif command -v npx &>/dev/null; then
TSX_CMD="npx --yes tsx"
else
echo "ERROR: Neither tsx nor npx found."
exit 1
fi
echo "╔══════════════════════════════════════════════════════════════╗"
echo "║ E2E Test Orchestrator ║"
echo "╠══════════════════════════════════════════════════════════════╣"
echo "║ Container runtime : $COMPOSE_CMD"
echo "║ Bun command : $BUN_CMD"
echo "║ TSX command : $TSX_CMD"
echo "║ Gitea URL : $GITEA_URL"
echo "║ Fake GitHub URL : $FAKE_GITHUB_URL"
echo "║ App URL : $APP_URL"
echo "║ Git Server URL : $GIT_SERVER_URL"
echo "║ Git Server (int) : $GIT_SERVER_INTERNAL_URL"
echo "║ CI mode : $CI_MODE"
echo "╚══════════════════════════════════════════════════════════════╝"
echo ""
# ─── PID tracking for cleanup ─────────────────────────────────────────────────
FAKE_GITHUB_PID=""
APP_PID=""
EXIT_CODE=0
cleanup_on_exit() {
local code=$?
echo ""
echo "────────────────────────────────────────────────────────────────"
echo "[teardown] Cleaning up..."
# Kill fake GitHub server
if [[ -n "$FAKE_GITHUB_PID" ]] && kill -0 "$FAKE_GITHUB_PID" 2>/dev/null; then
echo "[teardown] Stopping fake GitHub server (PID $FAKE_GITHUB_PID)..."
kill "$FAKE_GITHUB_PID" 2>/dev/null || true
wait "$FAKE_GITHUB_PID" 2>/dev/null || true
fi
rm -f "$SCRIPT_DIR/.fake-github.pid"
# Kill app server
if [[ -n "$APP_PID" ]] && kill -0 "$APP_PID" 2>/dev/null; then
echo "[teardown] Stopping gitea-mirror app (PID $APP_PID)..."
kill "$APP_PID" 2>/dev/null || true
wait "$APP_PID" 2>/dev/null || true
fi
rm -f "$SCRIPT_DIR/.app.pid"
# Stop containers (unless --keep)
if [[ "$KEEP_RUNNING" == false ]]; then
if [[ -n "$COMPOSE_CMD" ]] && [[ -f "$COMPOSE_FILE" ]]; then
echo "[teardown] Stopping Gitea container..."
$COMPOSE_CMD -f "$COMPOSE_FILE" down --volumes --remove-orphans 2>/dev/null || true
fi
else
echo "[teardown] --keep flag set, leaving services running"
fi
echo "[teardown] Done."
# Use the test exit code, not the cleanup exit code
if [[ $EXIT_CODE -ne 0 ]]; then
exit $EXIT_CODE
fi
exit $code
}
trap cleanup_on_exit EXIT INT TERM
# ─── Step 0: Cleanup previous run ────────────────────────────────────────────
if [[ "${SKIP_CLEANUP:-false}" != "true" ]]; then
echo "┌──────────────────────────────────────────────────────────────┐"
echo "│ Step 0: Cleanup previous E2E run │"
echo "└──────────────────────────────────────────────────────────────┘"
bash "$SCRIPT_DIR/cleanup.sh" --soft 2>/dev/null || true
echo ""
fi
# ─── Step 1: Install dependencies ────────────────────────────────────────────
echo "┌──────────────────────────────────────────────────────────────┐"
echo "│ Step 1: Install dependencies │"
echo "└──────────────────────────────────────────────────────────────┘"
cd "$PROJECT_ROOT"
$BUN_CMD install 2>&1 | tail -5
echo "[deps] ✓ Dependencies installed"
# Install Playwright browsers if needed
if ! npx playwright install --dry-run chromium &>/dev/null; then
echo "[deps] Installing Playwright browsers..."
npx playwright install chromium 2>&1 | tail -3
fi
# Always ensure system deps are available (needed in CI/fresh environments)
if [[ "$CI_MODE" == true ]]; then
echo "[deps] Installing Playwright system dependencies..."
npx playwright install-deps chromium 2>&1 | tail -5 || true
fi
echo "[deps] ✓ Playwright ready"
echo ""
# ─── Step 1.5: Create test git repositories ─────────────────────────────────
echo "┌──────────────────────────────────────────────────────────────┐"
echo "│ Step 1.5: Create test git repositories │"
echo "└──────────────────────────────────────────────────────────────┘"
GIT_REPOS_DIR="$SCRIPT_DIR/git-repos"
echo "[git-repos] Creating bare git repos in $GIT_REPOS_DIR ..."
$BUN_CMD run "$SCRIPT_DIR/create-test-repos.ts" --output-dir "$GIT_REPOS_DIR" 2>&1
if [[ ! -f "$GIT_REPOS_DIR/manifest.json" ]]; then
echo "ERROR: Test git repos were not created (manifest.json missing)"
EXIT_CODE=1
exit 1
fi
echo "[git-repos] ✓ Test repositories created"
echo ""
# ─── Step 2: Build the app ──────────────────────────────────────────────────
if [[ "$NO_BUILD" == false ]]; then
  echo "┌──────────────────────────────────────────────────────────────┐"
  echo "│ Step 2: Build gitea-mirror │"
  echo "└──────────────────────────────────────────────────────────────┘"
  cd "$PROJECT_ROOT"

  # Initialize the database
  echo "[build] Initializing database..."
  $BUN_CMD run manage-db init 2>&1 | tail -3 || true

  # Build the Astro project
  echo "[build] Building Astro project..."
  GITHUB_API_URL="$FAKE_GITHUB_URL" \
    BETTER_AUTH_SECRET="e2e-test-secret" \
    $BUN_CMD run build 2>&1 | tail -10

  echo "[build] ✓ Build complete"
  echo ""
else
  echo "[build] Skipped (--no-build flag)"
  echo ""
fi
# ─── Step 3: Start Gitea container ──────────────────────────────────────────
echo "┌──────────────────────────────────────────────────────────────┐"
echo "│ Step 3: Start Gitea container │"
echo "└──────────────────────────────────────────────────────────────┘"
$COMPOSE_CMD -f "$COMPOSE_FILE" up -d 2>&1
# Wait for git-server to be healthy first (Gitea depends on it)
echo "[git-server] Waiting for git HTTP server..."
GIT_SERVER_READY=false
for i in $(seq 1 30); do
  if curl -sf "${GIT_SERVER_URL}/manifest.json" &>/dev/null; then
    GIT_SERVER_READY=true
    break
  fi
  printf "."
  sleep 1
done
echo ""
if [[ "$GIT_SERVER_READY" != true ]]; then
  echo "ERROR: Git HTTP server did not start within 30 seconds"
  echo "[git-server] Container logs:"
  $COMPOSE_CMD -f "$COMPOSE_FILE" logs git-server --tail=20 2>/dev/null || true
  EXIT_CODE=1
  exit 1
fi
echo "[git-server] ✓ Git HTTP server is ready on $GIT_SERVER_URL"
echo "[gitea] Waiting for Gitea to become healthy..."
GITEA_READY=false
for i in $(seq 1 60); do
  if curl -sf "${GITEA_URL}/api/v1/version" &>/dev/null; then
    GITEA_READY=true
    break
  fi
  printf "."
  sleep 2
done
echo ""
if [[ "$GITEA_READY" != true ]]; then
  echo "ERROR: Gitea did not become healthy within 120 seconds"
  echo "[gitea] Container logs:"
  $COMPOSE_CMD -f "$COMPOSE_FILE" logs gitea-e2e --tail=30 2>/dev/null || true
  EXIT_CODE=1
  exit 1
fi
GITEA_VERSION=$(curl -sf "${GITEA_URL}/api/v1/version" | grep -o '"version":"[^"]*"' | cut -d'"' -f4)
echo "[gitea] ✓ Gitea is ready (version: ${GITEA_VERSION:-unknown})"
echo ""
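The GITEA_VERSION extraction above avoids a jq dependency by parsing the JSON body with grep and cut; the same pipeline can be exercised in isolation:

```shell
# Same pattern as the GITEA_VERSION extraction above: pull the "version"
# field out of a JSON body using only grep and cut (no jq dependency).
parse_version() {
  grep -o '"version":"[^"]*"' | cut -d'"' -f4
}

GITEA_VERSION=$(echo '{"version":"1.24.3"}' | parse_version)
echo "version: $GITEA_VERSION"   # → version: 1.24.3
```

This is fine for the single flat field used here; for anything nested or containing escaped quotes, a real JSON parser is the safer choice.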
# ─── Step 4: Start fake GitHub API ──────────────────────────────────────────
echo "┌──────────────────────────────────────────────────────────────┐"
echo "│ Step 4: Start fake GitHub API server │"
echo "└──────────────────────────────────────────────────────────────┘"
PORT=$FAKE_GITHUB_PORT GIT_SERVER_URL="$GIT_SERVER_INTERNAL_URL" \
$TSX_CMD "$SCRIPT_DIR/fake-github-server.ts" &
FAKE_GITHUB_PID=$!
echo "$FAKE_GITHUB_PID" > "$SCRIPT_DIR/.fake-github.pid"
echo "[fake-github] Started (PID: $FAKE_GITHUB_PID)"
echo "[fake-github] Waiting for server to be ready..."
FAKE_READY=false
for i in $(seq 1 30); do
  if curl -sf "${FAKE_GITHUB_URL}/___mgmt/health" &>/dev/null; then
    FAKE_READY=true
    break
  fi
  # Bail out early if the server process died
  if ! kill -0 "$FAKE_GITHUB_PID" 2>/dev/null; then
    echo "ERROR: Fake GitHub server process died"
    EXIT_CODE=1
    exit 1
  fi
  printf "."
  sleep 1
done
echo ""
if [[ "$FAKE_READY" != true ]]; then
  echo "ERROR: Fake GitHub server did not start within 30 seconds"
  EXIT_CODE=1
  exit 1
fi
echo "[fake-github] ✓ Fake GitHub API is ready on $FAKE_GITHUB_URL"
# Tell the fake GitHub server to use the git-server container URL for clone_url
# (This updates existing repos in the store so Gitea can actually clone them)
echo "[fake-github] Setting clone URL base to $GIT_SERVER_INTERNAL_URL ..."
curl -sf -X POST "${FAKE_GITHUB_URL}/___mgmt/set-clone-url" \
  -H "Content-Type: application/json" \
  -d "{\"url\": \"${GIT_SERVER_INTERNAL_URL}\"}" || true
echo "[fake-github] ✓ Clone URLs configured"
echo ""
# ─── Step 5: Start gitea-mirror app ────────────────────────────────────────
echo "┌──────────────────────────────────────────────────────────────┐"
echo "│ Step 5: Start gitea-mirror application │"
echo "└──────────────────────────────────────────────────────────────┘"
cd "$PROJECT_ROOT"
# Reinitialize the database in case build step reset it
$BUN_CMD run manage-db init 2>&1 | tail -2 || true
# Start the app with E2E environment
GITHUB_API_URL="$FAKE_GITHUB_URL" \
  BETTER_AUTH_SECRET="e2e-test-secret" \
  BETTER_AUTH_URL="$APP_URL" \
  DATABASE_URL="file:data/gitea-mirror.db" \
  HOST="0.0.0.0" \
  PORT="$APP_PORT" \
  NODE_ENV="production" \
  PRE_SYNC_BACKUP_ENABLED="false" \
  ENCRYPTION_SECRET="e2e-encryption-secret-32char!!" \
  $BUN_CMD run start &
APP_PID=$!
echo "$APP_PID" > "$SCRIPT_DIR/.app.pid"
echo "[app] Started (PID: $APP_PID)"
echo "[app] Waiting for app to be ready..."
APP_READY=false
for i in $(seq 1 90); do
  # Try the health endpoint first, then fall back to the root page
  if curl -sf "${APP_URL}/api/health" &>/dev/null || \
     curl -sf -o /dev/null -w "%{http_code}" "${APP_URL}/" 2>/dev/null | grep -q "^[23]"; then
    APP_READY=true
    break
  fi
  # Bail out early if the app process died
  if ! kill -0 "$APP_PID" 2>/dev/null; then
    echo ""
    echo "ERROR: gitea-mirror app process died"
    EXIT_CODE=1
    exit 1
  fi
  printf "."
  sleep 2
done
echo ""
if [[ "$APP_READY" != true ]]; then
  echo "ERROR: gitea-mirror app did not start within 180 seconds"
  EXIT_CODE=1
  exit 1
fi
echo "[app] ✓ gitea-mirror app is ready on $APP_URL"
echo ""
# ─── Step 6: Run Playwright E2E tests ──────────────────────────────────────
echo "┌──────────────────────────────────────────────────────────────┐"
echo "│ Step 6: Run Playwright E2E tests │"
echo "└──────────────────────────────────────────────────────────────┘"
cd "$PROJECT_ROOT"
# Ensure test-results directory exists
mkdir -p "$SCRIPT_DIR/test-results"
# Run Playwright
set +e
APP_URL="$APP_URL" \
  GITEA_URL="$GITEA_URL" \
  FAKE_GITHUB_URL="$FAKE_GITHUB_URL" \
  npx playwright test \
    --config "$SCRIPT_DIR/playwright.config.ts" \
    --reporter=list
PLAYWRIGHT_EXIT=$?
set -e
echo ""
if [[ $PLAYWRIGHT_EXIT -eq 0 ]]; then
  echo "═══════════════════════════════════════════════════════════════"
  echo " ✅ E2E tests PASSED"
  echo "═══════════════════════════════════════════════════════════════"
else
  echo "═══════════════════════════════════════════════════════════════"
  echo " ❌ E2E tests FAILED (exit code: $PLAYWRIGHT_EXIT)"
  echo "═══════════════════════════════════════════════════════════════"

  # On failure, dump some diagnostic info
  echo ""
  echo "[diag] Gitea container status:"
  $COMPOSE_CMD -f "$COMPOSE_FILE" ps 2>/dev/null || true
  echo ""
  echo "[diag] Gitea container logs (last 20 lines):"
  $COMPOSE_CMD -f "$COMPOSE_FILE" logs gitea-e2e --tail=20 2>/dev/null || true
  echo ""
  echo "[diag] Git server logs (last 10 lines):"
  $COMPOSE_CMD -f "$COMPOSE_FILE" logs git-server --tail=10 2>/dev/null || true
  echo ""
  echo "[diag] Git server health:"
  curl -sf "${GIT_SERVER_URL}/manifest.json" 2>/dev/null || echo "(unreachable)"
  echo ""
  echo "[diag] Fake GitHub health:"
  curl -sf "${FAKE_GITHUB_URL}/___mgmt/health" 2>/dev/null || echo "(unreachable)"
  echo ""
  echo "[diag] App health:"
  curl -sf "${APP_URL}/api/health" 2>/dev/null || echo "(unreachable)"
  echo ""

  # Point to the HTML report
  if [[ -d "$SCRIPT_DIR/playwright-report" ]]; then
    echo "[diag] HTML report: $SCRIPT_DIR/playwright-report/index.html"
    echo "       Run: npx playwright show-report $SCRIPT_DIR/playwright-report"
  fi

  EXIT_CODE=$PLAYWRIGHT_EXIT
fi
# EXIT_CODE is used by the trap handler
exit $EXIT_CODE
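The script above repeats the same poll-until-ready loop four times (git-server, Gitea, fake GitHub, app). A hypothetical refactor could factor that pattern into one helper; a sketch, with label, attempt count, delay, and probe command as parameters:

```shell
# Hypothetical helper factoring out the poll-until-ready pattern used by
# the runner script. Runs an arbitrary probe command up to $2 times,
# sleeping $3 seconds between attempts; returns non-zero on timeout.
wait_for() {
  local label="$1" attempts="$2" delay="$3"
  shift 3
  local i
  for i in $(seq 1 "$attempts"); do
    if "$@" &>/dev/null; then
      echo "[$label] ✓ ready"
      return 0
    fi
    printf "."
    sleep "$delay"
  done
  echo "[$label] ERROR: not ready after $((attempts * delay))s" >&2
  return 1
}

# Usage mirroring the Gitea poll above (60 attempts, 2s apart):
#   wait_for gitea 60 2 curl -sf "${GITEA_URL}/api/v1/version"
```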


@@ -11,7 +11,6 @@
   "dependencies": {
     "@astrojs/mdx": "^4.3.13",
     "@astrojs/react": "^4.4.2",
-    "@radix-ui/react-icons": "^1.3.2",
     "@radix-ui/react-slot": "^1.2.4",
     "@splinetool/react-spline": "^4.1.0",
     "@splinetool/runtime": "^1.12.60",

www/pnpm-lock.yaml generated

@@ -14,9 +14,6 @@ importers:
     '@astrojs/react':
       specifier: ^4.4.2
       version: 4.4.2(@types/node@24.7.1)(@types/react-dom@19.2.3(@types/react@19.2.14))(@types/react@19.2.14)(jiti@2.6.1)(lightningcss@1.31.1)(react-dom@19.2.4(react@19.2.4))(react@19.2.4)
-    '@radix-ui/react-icons':
-      specifier: ^1.3.2
-      version: 1.3.2(react@19.2.4)
     '@radix-ui/react-slot':
       specifier: ^1.2.4
       version: 1.2.4(@types/react@19.2.14)(react@19.2.4)
@@ -674,11 +671,6 @@ packages:
       '@types/react':
         optional: true
-  '@radix-ui/react-icons@1.3.2':
-    resolution: {integrity: sha512-fyQIhGDhzfc9pK2kH6Pl9c4BDJGfMkPqkyIgYDthyNYoNg3wVhoJMMh19WS4Up/1KMPFVpNsT2q3WmXn2N1m6g==}
-    peerDependencies:
-      react: ^16.x || ^17.x || ^18.x || ^19.0.0 || ^19.0.0-rc
   '@radix-ui/react-slot@1.2.4':
     resolution: {integrity: sha512-Jl+bCv8HxKnlTLVrcDE8zTMJ09R9/ukw4qBs/oZClOfoQk/cOTbDn+NceXfV7j09YPVQUryJPHurafcSg6EVKA==}
     peerDependencies:
@@ -1951,8 +1943,8 @@ packages:
     engines: {node: '>=18.0.0', npm: '>=8.0.0'}
     hasBin: true
-  sax@1.4.4:
-    resolution: {integrity: sha512-1n3r/tGXO6b6VXMdFT54SHzT9ytu9yr7TaELowdYpMqY/Ao7EnlQGmAQ1+RatX7Tkkdm6hONI2owqNx2aZj5Sw==}
+  sax@1.5.0:
+    resolution: {integrity: sha512-21IYA3Q5cQf089Z6tgaUTr7lDAyzoTPx5HRtbhsME8Udispad8dC/+sziTNugOEx54ilvatQ9YCzl4KQLPcRHA==}
     engines: {node: '>=11.0.0'}
scheduler@0.27.0:
@@ -2020,8 +2012,8 @@ packages:
   style-to-object@1.0.14:
     resolution: {integrity: sha512-LIN7rULI0jBscWQYaSswptyderlarFkjQ+t79nzty8tcIAceVomEVlLzH5VP4Cmsv6MtKhs7qaAiwlcp+Mgaxw==}
-  svgo@4.0.0:
-    resolution: {integrity: sha512-VvrHQ+9uniE+Mvx3+C9IEe/lWasXCU0nXMY2kZeLrHNICuRiC8uMPyM14UEaMOFA5mhyQqEkB02VoQ16n3DLaw==}
+  svgo@4.0.1:
+    resolution: {integrity: sha512-XDpWUOPC6FEibaLzjfe0ucaV0YrOjYotGJO1WpF0Zd+n6ZGEQUsSugaoLq9QkEZtAfQIxT42UChcssDVPP3+/w==}
     engines: {node: '>=16'}
     hasBin: true
@@ -2828,10 +2820,6 @@ snapshots:
     optionalDependencies:
       '@types/react': 19.2.14
-  '@radix-ui/react-icons@1.3.2(react@19.2.4)':
-    dependencies:
-      react: 19.2.4
   '@radix-ui/react-slot@1.2.4(@types/react@19.2.14)(react@19.2.4)':
     dependencies:
       '@radix-ui/react-compose-refs': 1.1.2(@types/react@19.2.14)(react@19.2.4)
@@ -3197,7 +3185,7 @@ snapshots:
       semver: 7.7.4
       shiki: 3.22.0
       smol-toml: 1.6.0
-      svgo: 4.0.0
+      svgo: 4.0.1
       tinyexec: 1.0.2
       tinyglobby: 0.2.15
       tsconfck: 3.1.6(typescript@5.8.3)
@@ -4564,7 +4552,7 @@ snapshots:
       '@rollup/rollup-win32-x64-msvc': 4.59.0
       fsevents: 2.3.3
-  sax@1.4.4: {}
+  sax@1.5.0: {}
   scheduler@0.27.0: {}
@@ -4660,7 +4648,7 @@ snapshots:
     dependencies:
       inline-style-parser: 0.2.7
-  svgo@4.0.0:
+  svgo@4.0.1:
     dependencies:
       commander: 11.1.0
       css-select: 5.2.2
@@ -4668,7 +4656,7 @@ snapshots:
       css-what: 6.2.2
       csso: 5.0.5
       picocolors: 1.1.1
-      sax: 1.4.4
+      sax: 1.5.0
   tailwind-merge@3.5.0: {}


@@ -1,11 +1,11 @@
 ---
 import {
   RefreshCw,
-  Building2,
-  FolderTree,
+  FileText,
+  ShieldCheck,
   Activity,
   Lock,
-  Heart,
+  HardDrive,
 } from 'lucide-react';

 const features = [
@@ -17,37 +17,37 @@ const features = [
     iconColor: "text-primary"
   },
   {
-    title: "Bulk Operations",
-    description: "Mirror entire organizations or user accounts with a single configuration.",
-    icon: Building2,
+    title: "Metadata Preservation",
+    description: "Mirror issues, pull requests, releases, labels, milestones, and wiki pages alongside your code.",
+    icon: FileText,
     gradient: "from-accent/10 to-accent-teal/10",
     iconColor: "text-accent"
   },
   {
-    title: "Preserve Structure",
-    description: "Maintain your GitHub organization structure or customize how repos are organized.",
-    icon: FolderTree,
+    title: "Force-Push Protection",
+    description: "Detect upstream force-pushes and automatically snapshot repos before destructive changes.",
+    icon: ShieldCheck,
     gradient: "from-accent-teal/10 to-primary/10",
     iconColor: "text-accent-teal"
   },
   {
-    title: "Real-time Status",
-    description: "Monitor mirror progress with live updates and detailed activity logs.",
+    title: "Real-time Dashboard",
+    description: "Monitor mirror progress with live updates, activity logs, and per-repo status tracking.",
     icon: Activity,
     gradient: "from-accent-coral/10 to-primary/10",
     iconColor: "text-accent-coral"
   },
   {
-    title: "Secure & Private",
-    description: "Self-hosted solution keeps your code on your infrastructure with full control.",
+    title: "Secure & Self-Hosted",
+    description: "Tokens encrypted at rest with AES-256-GCM. Your code stays on your infrastructure.",
     icon: Lock,
     gradient: "from-accent-purple/10 to-primary/10",
     iconColor: "text-accent-purple"
   },
   {
-    title: "Open Source",
-    description: "Free, transparent, and community-driven development. Contribute and customize.",
-    icon: Heart,
+    title: "Git LFS Support",
+    description: "Mirror large files and binary assets alongside your repositories with full LFS support.",
+    icon: HardDrive,
     gradient: "from-primary/10 to-accent-purple/10",
     iconColor: "text-primary"
   }


@@ -1,6 +1,5 @@
 import { Button } from "./ui/button";
 import { ArrowRight, Shield, RefreshCw, HardDrive } from "lucide-react";
-import { GitHubLogoIcon } from "@radix-ui/react-icons";
 import React, { Suspense } from 'react';

 const Spline = React.lazy(() => import('@splinetool/react-spline'));


@@ -1,8 +1,8 @@
 import React, { useState } from 'react';
 import { Button } from './ui/button';
-import { Copy, Check, Terminal, Container, Cloud } from 'lucide-react';
+import { Copy, Check, Terminal, Container, Cloud, Ship, Snowflake } from 'lucide-react';

-type InstallMethod = 'docker' | 'manual' | 'proxmox';
+type InstallMethod = 'docker' | 'helm' | 'nix' | 'manual' | 'proxmox';

 export function Installation() {
   const [activeMethod, setActiveMethod] = useState<InstallMethod>('docker');
@@ -37,6 +37,50 @@ export function Installation() {
       }
     ]
   },
+  helm: {
+    icon: Ship,
+    title: "Helm",
+    description: "Deploy to Kubernetes",
+    steps: [
+      {
+        title: "Clone the repository",
+        command: "git clone https://github.com/RayLabsHQ/gitea-mirror.git && cd gitea-mirror",
+        id: "helm-clone"
+      },
+      {
+        title: "Install the chart",
+        command: "helm upgrade --install gitea-mirror ./helm/gitea-mirror \\\n  --namespace gitea-mirror --create-namespace",
+        id: "helm-install"
+      },
+      {
+        title: "Access the application",
+        command: "kubectl port-forward svc/gitea-mirror 4321:4321 -n gitea-mirror",
+        id: "helm-access"
+      }
+    ]
+  },
+  nix: {
+    icon: Snowflake,
+    title: "Nix",
+    description: "Zero-config with Nix flakes",
+    steps: [
+      {
+        title: "Run directly with Nix",
+        command: "nix run github:RayLabsHQ/gitea-mirror",
+        id: "nix-run"
+      },
+      {
+        title: "Or install to your profile",
+        command: "nix profile install github:RayLabsHQ/gitea-mirror",
+        id: "nix-install"
+      },
+      {
+        title: "Access the application",
+        command: "# Open http://localhost:4321 in your browser",
+        id: "nix-access"
+      }
+    ]
+  },
   manual: {
     icon: Terminal,
     title: "Manual",


@@ -39,7 +39,7 @@ const structuredData = {
     name: "RayLabs",
     url: "https://github.com/RayLabsHQ",
   },
-  softwareVersion: "3.9.2",
+  softwareVersion: "3.11.0",
   screenshot: [
     `${siteUrl}/assets/dashboard.png`,
     `${siteUrl}/assets/repositories.png`,
@@ -49,8 +49,9 @@ const structuredData = {
     "Automated scheduled backups",
     "Self-hosted (full data ownership)",
     "Metadata preservation (issues, PRs, releases, wiki)",
-    "Docker support",
-    "Multi-repository backup",
+    "Force-push protection with smart detection",
+    "Docker, Helm, Nix, and Proxmox support",
+    "Multi-repository and organization backup",
+    "Git LFS support",
     "Free and open source",
   ],