diff --git a/.github/screenshots/backup-strategy-ui.png b/.github/screenshots/backup-strategy-ui.png new file mode 100644 index 0000000..9969daf Binary files /dev/null and b/.github/screenshots/backup-strategy-ui.png differ diff --git a/README.md b/README.md index ea2d0b4..ad72df2 100644 --- a/README.md +++ b/README.md @@ -40,6 +40,7 @@ First user signup becomes admin. Configure GitHub and Gitea through the web inte - ๐Ÿ”„ **Auto-discovery** - Automatically import new GitHub repositories (v3.4.0+) - ๐Ÿงน **Repository cleanup** - Auto-remove repos deleted from GitHub (v3.4.0+) - ๐ŸŽฏ **Proper mirror intervals** - Respects configured sync intervals (v3.4.0+) +- ๐Ÿ›ก๏ธ **[Force-push protection](docs/FORCE_PUSH_PROTECTION.md)** - Smart detection with backup-on-demand or block-and-approve modes (Beta) - ๐Ÿ—‘๏ธ Automatic database cleanup with configurable retention - ๐Ÿณ Dockerized with multi-arch support (AMD64/ARM64) @@ -499,6 +500,7 @@ GNU Affero General Public License v3.0 (AGPL-3.0) - see [LICENSE](LICENSE) file - ๐Ÿ“– [Documentation](https://github.com/RayLabsHQ/gitea-mirror/tree/main/docs) - ๐Ÿ” [Environment Variables](docs/ENVIRONMENT_VARIABLES.md) +- ๐Ÿ›ก๏ธ [Force-Push Protection](docs/FORCE_PUSH_PROTECTION.md) - ๐Ÿ› [Report Issues](https://github.com/RayLabsHQ/gitea-mirror/issues) - ๐Ÿ’ฌ [Discussions](https://github.com/RayLabsHQ/gitea-mirror/discussions) - ๐Ÿ”ง [Proxmox VE Script](https://community-scripts.github.io/ProxmoxVE/scripts?id=gitea-mirror) diff --git a/docs/FORCE_PUSH_PROTECTION.md b/docs/FORCE_PUSH_PROTECTION.md new file mode 100644 index 0000000..f15b1d2 --- /dev/null +++ b/docs/FORCE_PUSH_PROTECTION.md @@ -0,0 +1,179 @@ +# Force-Push Protection + +This document describes the smart force-push protection system introduced in gitea-mirror v3.11.0+. + +## The Problem + +GitHub repositories can be force-pushed at any time โ€” rewriting history, deleting branches, or replacing commits entirely. 
When gitea-mirror syncs a force-pushed repo, the old history in Gitea is silently overwritten. Files, commits, and branches disappear with no way to recover them.
+
+The original workaround (`backupBeforeSync: true`) created a full git bundle backup before **every** sync. This doesn't scale — a user with 100+ GiB of mirrors would need up to 2 TB of backup storage with default retention settings, even though force-pushes are rare.
+
+## Solution: Smart Detection
+
+Instead of backing up everything every time, the system detects force-pushes **before** the old history is overwritten, and only acts when needed.
+
+### How Detection Works
+
+Before each sync, the app compares branch SHAs between Gitea (the mirror) and GitHub (the source):
+
+1. **Fetch branches from both sides** — lightweight API calls to get branch names and their latest commit SHAs
+2. **Compare each branch**:
+   - SHAs match → nothing changed, no action needed
+   - SHAs differ → check whether the change is a normal push or a force-push
+3. **Ancestry check** — for branches with different SHAs, call GitHub's compare API to determine whether the new SHA is a descendant of the old one:
+   - **Fast-forward** (new SHA descends from old) → normal push, safe to sync
+   - **Diverged** (histories split) → force-push detected
+   - **404** (old SHA doesn't exist on GitHub anymore) → history was rewritten, force-push detected
+   - **Branch deleted on GitHub** → flagged as destructive change
+
+### What Happens on Detection
+
+What happens next depends on the configured strategy (see below):
+- **Backup strategies** (`always`, `on-force-push`): create a git bundle snapshot, then sync
+- **Block strategy** (`block-on-force-push`): halt the sync, mark the repo as `pending-approval`, and wait for user action
+
+### Fail-Open Design
+
+If detection itself fails (GitHub rate limits, network errors, API outages), sync proceeds normally. Detection never blocks a sync due to its own failure.
Individual branch check failures are skipped — one flaky branch doesn't affect the others.
+
+## Backup Strategies
+
+Configure via **Settings → GitHub Configuration → Destructive Update Protection**.
+
+| Strategy | What It Does | Storage Cost | Best For |
+|---|---|---|---|
+| **Disabled** | No detection, no backups | Zero | Repos you don't care about losing |
+| **Always Backup** | Snapshot before every sync (original behavior) | High | Small mirror sets, maximum safety |
+| **Smart** (default) | Detect force-pushes, backup only when found | Near-zero normally | Most users — efficient protection |
+| **Block & Approve** | Detect force-pushes, block sync until approved | Zero | Critical repos needing manual review |
+
+### Strategy Details
+
+#### Disabled
+
+Syncs proceed without any detection or backup. If a force-push happens on GitHub, the mirror is silently overwritten.
+
+#### Always Backup
+
+Creates a git bundle snapshot before every sync, regardless of whether a force-push occurred. This is the legacy behavior (equivalent to the old `backupBeforeSync: true`). Safe but expensive for large mirror sets.
+
+#### Smart (`on-force-push`) — Recommended
+
+Runs force-push detection before each sync. On normal days (no force-pushes), syncs proceed without any backup overhead. When a force-push is detected, a snapshot is created before the sync runs.
+
+This gives you protection when it matters, with near-zero cost when it doesn't.
+
+#### Block & Approve (`block-on-force-push`)
+
+Runs detection and, when a force-push is found, **blocks the sync entirely**. The repository is marked as `pending-approval` and excluded from future scheduled syncs until you take action:
+
+- **Approve**: creates a backup first, then syncs (safe)
+- **Dismiss**: clears the flag and resumes normal syncing (no backup)
+
+Use this for repos where you want manual control over destructive changes.
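Taken together, the four strategies reduce to two small per-sync decisions. Here is a simplified sketch of the decision logic, mirroring the `shouldBackupForStrategy` and `shouldBlockSyncForStrategy` helpers in `src/lib/repo-backup.ts` (the shortened function names are for illustration):

```typescript
type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";

// Should a git bundle snapshot be created before this sync?
function shouldBackup(strategy: BackupStrategy, forcePushDetected: boolean): boolean {
  switch (strategy) {
    case "disabled":
      return false; // never snapshot
    case "always":
      return true; // snapshot on every sync (legacy behavior)
    case "on-force-push":
    case "block-on-force-push":
      // Detection-based strategies only pay the storage cost
      // when a force-push was actually found.
      return forcePushDetected;
    default:
      return false;
  }
}

// Should the sync halt and wait for manual approval?
function shouldBlockSync(strategy: BackupStrategy, forcePushDetected: boolean): boolean {
  // Only Block & Approve blocks, and only on a real detection.
  return strategy === "block-on-force-push" && forcePushDetected;
}
```

Note that both detection-based strategies back up on detection; they differ only in whether the sync then proceeds automatically or waits for approval.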
+
+## Additional Settings
+
+These appear when any non-disabled strategy is selected:
+
+### Snapshot Retention Count
+
+How many backup snapshots to keep per repository. Oldest snapshots are deleted when this limit is exceeded. Default: **20**.
+
+### Snapshot Directory
+
+Where git bundle backups are stored. Default: **`data/repo-backups`**. Bundles are organized as `///.bundle`.
+
+### Block Sync on Snapshot Failure
+
+Available for **Always Backup** and **Smart** strategies. When enabled, if the snapshot creation fails (disk full, permissions error, etc.), the sync is also blocked. When disabled, sync continues even if the snapshot couldn't be created.
+
+Recommended: **enabled** if you rely on backups for recovery.
+
+## Backward Compatibility
+
+The old `backupBeforeSync` boolean is still recognized:
+
+| Old Setting | New Equivalent |
+|---|---|
+| `backupBeforeSync: true` | `backupStrategy: "always"` |
+| `backupBeforeSync: false` | `backupStrategy: "disabled"` |
+| Neither set | `backupStrategy: "on-force-push"` (new default) |
+
+Existing configurations are automatically mapped. The old field is deprecated but will continue to work.
+
+## Environment Variables
+
+No new environment variables are required. The backup strategy is configured through the web UI and stored in the database alongside other config.
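The mapping above can be sketched as a small resolver. This is a simplified version of `resolveBackupStrategy` in `src/lib/repo-backup.ts`; the full implementation also consults the optional `PRE_SYNC_BACKUP_STRATEGY` and legacy `PRE_SYNC_BACKUP_ENABLED` environment variables before falling back to the default:

```typescript
type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";

interface GiteaBackupConfig {
  backupStrategy?: BackupStrategy; // new field
  backupBeforeSync?: boolean;      // deprecated, still honored
}

function resolveStrategy(cfg: GiteaBackupConfig): BackupStrategy {
  // 1. An explicit new-style strategy always wins.
  if (cfg.backupStrategy !== undefined) {
    return cfg.backupStrategy;
  }
  // 2. The deprecated boolean maps onto the two legacy behaviors.
  if (cfg.backupBeforeSync !== undefined) {
    return cfg.backupBeforeSync ? "always" : "disabled";
  }
  // 3. Nothing configured: the new smart default.
  return "on-force-push";
}
```

Because the explicit field is checked first, a migrated config that still carries the old boolean behaves according to the new strategy, not the legacy flag.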
+ +## API + +### Approve/Dismiss Blocked Repos + +When using the `block-on-force-push` strategy, repos that are blocked can be managed via the API: + +```bash +# Approve sync (creates backup first, then syncs) +curl -X POST http://localhost:4321/api/job/approve-sync \ + -H "Content-Type: application/json" \ + -H "Cookie: " \ + -d '{"repositoryIds": [""], "action": "approve"}' + +# Dismiss (clear the block, resume normal syncing) +curl -X POST http://localhost:4321/api/job/approve-sync \ + -H "Content-Type: application/json" \ + -H "Cookie: " \ + -d '{"repositoryIds": [""], "action": "dismiss"}' +``` + +Blocked repos also show an **Approve** / **Dismiss** button in the repository table UI. + +## Architecture + +### Key Files + +| File | Purpose | +|---|---| +| `src/lib/utils/force-push-detection.ts` | Core detection: fetch branches, compare SHAs, check ancestry | +| `src/lib/repo-backup.ts` | Strategy resolver, backup decision logic, bundle creation | +| `src/lib/gitea-enhanced.ts` | Sync flow integration (calls detection + backup before mirror-sync) | +| `src/pages/api/job/approve-sync.ts` | Approve/dismiss API endpoint | +| `src/components/config/GitHubConfigForm.tsx` | Strategy selector UI | +| `src/components/repositories/RepositoryTable.tsx` | Pending-approval badge + action buttons | + +### Detection Flow + +``` +syncGiteaRepoEnhanced() + โ”‚ + โ”œโ”€ Resolve backup strategy (config โ†’ backupStrategy โ†’ backupBeforeSync โ†’ default) + โ”‚ + โ”œโ”€ If strategy needs detection ("on-force-push" or "block-on-force-push"): + โ”‚ โ”‚ + โ”‚ โ”œโ”€ fetchGiteaBranches() โ€” GET /api/v1/repos/{owner}/{repo}/branches + โ”‚ โ”œโ”€ fetchGitHubBranches() โ€” octokit.paginate(repos.listBranches) + โ”‚ โ”‚ + โ”‚ โ””โ”€ For each Gitea branch where SHA differs: + โ”‚ โ””โ”€ checkAncestry() โ€” octokit.repos.compareCommits() + โ”‚ โ”œโ”€ "ahead" or "identical" โ†’ fast-forward (safe) + โ”‚ โ”œโ”€ "diverged" or "behind" โ†’ force-push detected + โ”‚ โ””โ”€ 404/422 โ†’ old SHA 
gone โ†’ force-push detected + โ”‚ + โ”œโ”€ If "block-on-force-push" + detected: + โ”‚ โ””โ”€ Set repo status to "pending-approval", return early + โ”‚ + โ”œโ”€ If backup needed (always, or on-force-push + detected): + โ”‚ โ””โ”€ Create git bundle snapshot + โ”‚ + โ””โ”€ Proceed to mirror-sync +``` + +## Troubleshooting + +**Repos stuck in "pending-approval"**: Use the Approve or Dismiss buttons in the repository table, or call the approve-sync API endpoint. + +**Detection always skipped**: Check the activity log for skip reasons. Common causes: Gitea repo not yet mirrored (first sync), GitHub API rate limits, network errors. All are fail-open by design. + +**Backups consuming too much space**: Lower the retention count, or switch from "Always Backup" to "Smart" which only creates backups on actual force-pushes. + +**False positives**: The detection compares branch-by-branch. A rebase (which is a force-push) will correctly trigger detection. If you routinely rebase branches, consider using "Smart" instead of "Block & Approve" to avoid constant approval prompts. diff --git a/src/components/config/ConfigTabs.tsx b/src/components/config/ConfigTabs.tsx index f64ad3f..899a8d8 100644 --- a/src/components/config/ConfigTabs.tsx +++ b/src/components/config/ConfigTabs.tsx @@ -50,7 +50,7 @@ export function ConfigTabs() { starredReposOrg: 'starred', starredReposMode: 'dedicated-org', preserveOrgStructure: false, - backupBeforeSync: true, + backupStrategy: "on-force-push", backupRetentionCount: 20, backupDirectory: 'data/repo-backups', blockSyncOnBackupFailure: true, @@ -660,9 +660,20 @@ export function ConfigTabs() { : update, })) } + giteaConfig={config.giteaConfig} + setGiteaConfig={update => + setConfig(prev => ({ + ...prev, + giteaConfig: + typeof update === 'function' + ? 
update(prev.giteaConfig) + : update, + })) + } onAutoSave={autoSaveGitHubConfig} onMirrorOptionsAutoSave={autoSaveMirrorOptions} onAdvancedOptionsAutoSave={autoSaveAdvancedOptions} + onGiteaAutoSave={autoSaveGiteaConfig} isAutoSaving={isAutoSavingGitHub} /> >; advancedOptions: AdvancedOptions; setAdvancedOptions: React.Dispatch>; + giteaConfig?: GiteaConfig; + setGiteaConfig?: React.Dispatch>; onAutoSave?: (githubConfig: GitHubConfig) => Promise; onMirrorOptionsAutoSave?: (mirrorOptions: MirrorOptions) => Promise; onAdvancedOptionsAutoSave?: (advancedOptions: AdvancedOptions) => Promise; + onGiteaAutoSave?: (giteaConfig: GiteaConfig) => Promise; isAutoSaving?: boolean; } export function GitHubConfigForm({ - config, - setConfig, + config, + setConfig, mirrorOptions, setMirrorOptions, advancedOptions, setAdvancedOptions, - onAutoSave, + giteaConfig, + setGiteaConfig, + onAutoSave, onMirrorOptionsAutoSave, onAdvancedOptionsAutoSave, - isAutoSaving + onGiteaAutoSave, + isAutoSaving }: GitHubConfigFormProps) { const [isLoading, setIsLoading] = useState(false); @@ -202,7 +209,139 @@ export function GitHubConfigForm({ if (onAdvancedOptionsAutoSave) onAdvancedOptionsAutoSave(newOptions); }} /> - + + {giteaConfig && setGiteaConfig && ( + <> + + +
+

+ + Destructive Update Protection + BETA +

+

+ Choose how to handle force-pushes or rewritten upstream history on GitHub. +

+ +
+ {([ + { + value: "disabled", + label: "Disabled", + desc: "No detection or backups", + }, + { + value: "always", + label: "Always Backup", + desc: "Snapshot before every sync", + }, + { + value: "on-force-push", + label: "Smart", + desc: "Backup only on force-push", + }, + { + value: "block-on-force-push", + label: "Block & Approve", + desc: "Require approval on force-push", + }, + ] as const).map((opt) => { + const isSelected = (giteaConfig.backupStrategy ?? "on-force-push") === opt.value; + return ( + + ); + })} +
+ + {(giteaConfig.backupStrategy ?? "on-force-push") !== "disabled" && ( + <> +
+
+ + { + const newConfig = { + ...giteaConfig, + backupRetentionCount: Math.max(1, Number.parseInt(e.target.value, 10) || 20), + }; + setGiteaConfig(newConfig); + if (onGiteaAutoSave) onGiteaAutoSave(newConfig); + }} + className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring" + /> +
+
+ + { + const newConfig = { ...giteaConfig, backupDirectory: e.target.value }; + setGiteaConfig(newConfig); + if (onGiteaAutoSave) onGiteaAutoSave(newConfig); + }} + className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring" + placeholder="data/repo-backups" + /> +
+
+ + {((giteaConfig.backupStrategy ?? "on-force-push") === "always" || + (giteaConfig.backupStrategy ?? "on-force-push") === "on-force-push") && ( + + )} + + )} +
+ + )} + {/* Mobile: Show button at bottom */} )} - + {repo.status === "pending-approval" && ( +
+ + +
+ )} + {/* Ignore/Include button */} {repo.status === "ignored" ? ( + + + ); + } + // For ignored repos, show an "Include" action if (repo.status === "ignored") { return ( diff --git a/src/lib/db/schema.ts b/src/lib/db/schema.ts index 67b3f22..00a0ba5 100644 --- a/src/lib/db/schema.ts +++ b/src/lib/db/schema.ts @@ -33,6 +33,13 @@ export const githubConfigSchema = z.object({ starredDuplicateStrategy: z.enum(["suffix", "prefix", "owner-org"]).default("suffix").optional(), }); +export const backupStrategyEnum = z.enum([ + "disabled", + "always", + "on-force-push", + "block-on-force-push", +]); + export const giteaConfigSchema = z.object({ url: z.url(), externalUrl: z.url().optional(), @@ -65,7 +72,8 @@ export const giteaConfigSchema = z.object({ mirrorPullRequests: z.boolean().default(false), mirrorLabels: z.boolean().default(false), mirrorMilestones: z.boolean().default(false), - backupBeforeSync: z.boolean().default(true), + backupStrategy: backupStrategyEnum.default("on-force-push"), + backupBeforeSync: z.boolean().default(true), // Deprecated: kept for backward compat, use backupStrategy backupRetentionCount: z.number().int().min(1).default(20), backupDirectory: z.string().optional(), blockSyncOnBackupFailure: z.boolean().default(true), @@ -165,6 +173,7 @@ export const repositorySchema = z.object({ "syncing", "synced", "archived", + "pending-approval", // Blocked by force-push detection, needs manual approval ]) .default("imported"), lastMirrored: z.coerce.date().optional().nullable(), @@ -196,6 +205,7 @@ export const mirrorJobSchema = z.object({ "syncing", "synced", "archived", + "pending-approval", ]) .default("imported"), message: z.string(), diff --git a/src/lib/gitea-enhanced.ts b/src/lib/gitea-enhanced.ts index 9dc723d..7f133af 100644 --- a/src/lib/gitea-enhanced.ts +++ b/src/lib/gitea-enhanced.ts @@ -19,7 +19,12 @@ import { createPreSyncBundleBackup, shouldCreatePreSyncBackup, shouldBlockSyncOnBackupFailure, + resolveBackupStrategy, + 
shouldBackupForStrategy, + shouldBlockSyncForStrategy, + strategyNeedsDetection, } from "./repo-backup"; +import { detectForcePush } from "./utils/force-push-detection"; import { parseRepositoryMetadataState, serializeRepositoryMetadataState, @@ -255,9 +260,12 @@ export async function getOrCreateGiteaOrgEnhanced({ export async function syncGiteaRepoEnhanced({ config, repository, + skipForcePushDetection, }: { config: Partial; repository: Repository; + /** When true, skip force-push detection and blocking (used by approve-sync). */ + skipForcePushDetection?: boolean; }, deps?: SyncDependencies): Promise { try { if (!config.userId || !config.giteaConfig?.url || !config.giteaConfig?.token) { @@ -318,58 +326,138 @@ export async function syncGiteaRepoEnhanced({ throw new Error(`Repository ${repository.name} is not a mirror. Cannot sync.`); } - if (shouldCreatePreSyncBackup(config)) { - const cloneUrl = - repoInfo.clone_url || - `${config.giteaConfig.url.replace(/\/$/, "")}/${repoOwner}/${repository.name}.git`; + // ---- Smart backup strategy with force-push detection ---- + const backupStrategy = resolveBackupStrategy(config); + let forcePushDetected = false; - try { - const backupResult = await createPreSyncBundleBackup({ - config, - owner: repoOwner, - repoName: repository.name, - cloneUrl, - }); + if (backupStrategy !== "disabled") { + // Run force-push detection if the strategy requires it + // (skip when called from approve-sync to avoid re-blocking) + if (strategyNeedsDetection(backupStrategy) && !skipForcePushDetection) { + try { + const decryptedGithubToken = decryptedConfig.githubConfig?.token; + if (decryptedGithubToken) { + const fpOctokit = new Octokit({ auth: decryptedGithubToken }); + const detectionResult = await detectForcePush({ + giteaUrl: config.giteaConfig.url, + giteaToken: decryptedConfig.giteaConfig.token, + giteaOwner: repoOwner, + giteaRepo: repository.name, + octokit: fpOctokit, + githubOwner: repository.owner, + githubRepo: repository.name, + 
}); - await createMirrorJob({ - userId: config.userId, - repositoryId: repository.id, - repositoryName: repository.name, - message: `Snapshot created for ${repository.name}`, - details: `Pre-sync snapshot created at ${backupResult.bundlePath}.`, - status: "syncing", - }); - } catch (backupError) { - const errorMessage = - backupError instanceof Error ? backupError.message : String(backupError); + forcePushDetected = detectionResult.detected; - await createMirrorJob({ - userId: config.userId, - repositoryId: repository.id, - repositoryName: repository.name, - message: `Snapshot failed for ${repository.name}`, - details: `Pre-sync snapshot failed: ${errorMessage}`, - status: "failed", - }); - - if (shouldBlockSyncOnBackupFailure(config)) { - await db - .update(repositories) - .set({ - status: repoStatusEnum.parse("failed"), - updatedAt: new Date(), - errorMessage: `Snapshot failed; sync blocked to protect history. ${errorMessage}`, - }) - .where(eq(repositories.id, repository.id!)); - - throw new Error( - `Snapshot failed; sync blocked to protect history. ${errorMessage}` + if (detectionResult.skipped) { + console.log( + `[Sync] Force-push detection skipped for ${repository.name}: ${detectionResult.skipReason}`, + ); + } else if (forcePushDetected) { + const branchNames = detectionResult.affectedBranches + .map((b) => `${b.name} (${b.reason})`) + .join(", "); + console.warn( + `[Sync] Force-push detected on ${repository.name}: ${branchNames}`, + ); + } + } else { + console.log( + `[Sync] Skipping force-push detection for ${repository.name}: no GitHub token`, + ); + } + } catch (detectionError) { + // Fail-open: detection errors should never block sync + console.warn( + `[Sync] Force-push detection failed for ${repository.name}, proceeding with sync: ${ + detectionError instanceof Error ? 
detectionError.message : String(detectionError) + }`, ); } + } - console.warn( - `[Sync] Snapshot failed for ${repository.name}, continuing because blockSyncOnBackupFailure=false: ${errorMessage}` - ); + // Check if sync should be blocked (block-on-force-push mode) + if (shouldBlockSyncForStrategy(backupStrategy, forcePushDetected)) { + const branchInfo = `Force-push detected; sync blocked for manual approval.`; + + await db + .update(repositories) + .set({ + status: "pending-approval", + updatedAt: new Date(), + errorMessage: branchInfo, + }) + .where(eq(repositories.id, repository.id!)); + + await createMirrorJob({ + userId: config.userId, + repositoryId: repository.id, + repositoryName: repository.name, + message: `Sync blocked for ${repository.name}: force-push detected`, + details: branchInfo, + status: "pending-approval", + }); + + console.warn(`[Sync] Sync blocked for ${repository.name}: pending manual approval`); + return { blocked: true, reason: branchInfo }; + } + + // Create backup if strategy says so + if (shouldBackupForStrategy(backupStrategy, forcePushDetected)) { + const cloneUrl = + repoInfo.clone_url || + `${config.giteaConfig.url.replace(/\/$/, "")}/${repoOwner}/${repository.name}.git`; + + try { + const backupResult = await createPreSyncBundleBackup({ + config, + owner: repoOwner, + repoName: repository.name, + cloneUrl, + force: true, // Strategy already decided to backup; skip legacy gate + }); + + await createMirrorJob({ + userId: config.userId, + repositoryId: repository.id, + repositoryName: repository.name, + message: `Snapshot created for ${repository.name}`, + details: `Pre-sync snapshot created at ${backupResult.bundlePath}.`, + status: "syncing", + }); + } catch (backupError) { + const errorMessage = + backupError instanceof Error ? 
backupError.message : String(backupError); + + await createMirrorJob({ + userId: config.userId, + repositoryId: repository.id, + repositoryName: repository.name, + message: `Snapshot failed for ${repository.name}`, + details: `Pre-sync snapshot failed: ${errorMessage}`, + status: "failed", + }); + + if (shouldBlockSyncOnBackupFailure(config)) { + await db + .update(repositories) + .set({ + status: repoStatusEnum.parse("failed"), + updatedAt: new Date(), + errorMessage: `Snapshot failed; sync blocked to protect history. ${errorMessage}`, + }) + .where(eq(repositories.id, repository.id!)); + + throw new Error( + `Snapshot failed; sync blocked to protect history. ${errorMessage}`, + ); + } + + console.warn( + `[Sync] Snapshot failed for ${repository.name}, continuing because blockSyncOnBackupFailure=false: ${errorMessage}`, + ); + } } } diff --git a/src/lib/repo-backup.test.ts b/src/lib/repo-backup.test.ts index 5d0e498..d491bd6 100644 --- a/src/lib/repo-backup.test.ts +++ b/src/lib/repo-backup.test.ts @@ -1,7 +1,13 @@ import path from "node:path"; import { afterEach, beforeEach, describe, expect, test } from "bun:test"; import type { Config } from "@/types/config"; -import { resolveBackupPaths } from "@/lib/repo-backup"; +import { + resolveBackupPaths, + resolveBackupStrategy, + shouldBackupForStrategy, + shouldBlockSyncForStrategy, + strategyNeedsDetection, +} from "@/lib/repo-backup"; describe("resolveBackupPaths", () => { let originalBackupDirEnv: string | undefined; @@ -113,3 +119,130 @@ describe("resolveBackupPaths", () => { ); }); }); + +// ---- Backup strategy resolver tests ---- + +function makeConfig(overrides: Record = {}): Partial { + return { + giteaConfig: { + url: "https://gitea.example.com", + token: "tok", + ...overrides, + }, + } as Partial; +} + +const envKeysToClean = ["PRE_SYNC_BACKUP_STRATEGY", "PRE_SYNC_BACKUP_ENABLED"]; + +describe("resolveBackupStrategy", () => { + let savedEnv: Record = {}; + + beforeEach(() => { + savedEnv = {}; + for (const 
key of envKeysToClean) { + savedEnv[key] = process.env[key]; + delete process.env[key]; + } + }); + + afterEach(() => { + for (const [key, value] of Object.entries(savedEnv)) { + if (value === undefined) { + delete process.env[key]; + } else { + process.env[key] = value; + } + } + }); + + test("returns explicit backupStrategy when set", () => { + expect(resolveBackupStrategy(makeConfig({ backupStrategy: "always" }))).toBe("always"); + expect(resolveBackupStrategy(makeConfig({ backupStrategy: "disabled" }))).toBe("disabled"); + expect(resolveBackupStrategy(makeConfig({ backupStrategy: "on-force-push" }))).toBe("on-force-push"); + expect(resolveBackupStrategy(makeConfig({ backupStrategy: "block-on-force-push" }))).toBe("block-on-force-push"); + }); + + test("maps backupBeforeSync: true โ†’ 'always' (backward compat)", () => { + expect(resolveBackupStrategy(makeConfig({ backupBeforeSync: true }))).toBe("always"); + }); + + test("maps backupBeforeSync: false โ†’ 'disabled' (backward compat)", () => { + expect(resolveBackupStrategy(makeConfig({ backupBeforeSync: false }))).toBe("disabled"); + }); + + test("prefers explicit backupStrategy over backupBeforeSync", () => { + expect( + resolveBackupStrategy( + makeConfig({ backupStrategy: "on-force-push", backupBeforeSync: true }), + ), + ).toBe("on-force-push"); + }); + + test("falls back to PRE_SYNC_BACKUP_STRATEGY env var", () => { + process.env.PRE_SYNC_BACKUP_STRATEGY = "block-on-force-push"; + expect(resolveBackupStrategy(makeConfig({}))).toBe("block-on-force-push"); + }); + + test("falls back to PRE_SYNC_BACKUP_ENABLED env var (legacy)", () => { + process.env.PRE_SYNC_BACKUP_ENABLED = "false"; + expect(resolveBackupStrategy(makeConfig({}))).toBe("disabled"); + }); + + test("defaults to 'on-force-push' when nothing is configured", () => { + expect(resolveBackupStrategy(makeConfig({}))).toBe("on-force-push"); + }); + + test("handles empty giteaConfig gracefully", () => { + 
expect(resolveBackupStrategy({})).toBe("on-force-push"); + }); +}); + +describe("shouldBackupForStrategy", () => { + test("disabled โ†’ never backup", () => { + expect(shouldBackupForStrategy("disabled", false)).toBe(false); + expect(shouldBackupForStrategy("disabled", true)).toBe(false); + }); + + test("always โ†’ always backup", () => { + expect(shouldBackupForStrategy("always", false)).toBe(true); + expect(shouldBackupForStrategy("always", true)).toBe(true); + }); + + test("on-force-push โ†’ backup only when detected", () => { + expect(shouldBackupForStrategy("on-force-push", false)).toBe(false); + expect(shouldBackupForStrategy("on-force-push", true)).toBe(true); + }); + + test("block-on-force-push โ†’ backup only when detected", () => { + expect(shouldBackupForStrategy("block-on-force-push", false)).toBe(false); + expect(shouldBackupForStrategy("block-on-force-push", true)).toBe(true); + }); +}); + +describe("shouldBlockSyncForStrategy", () => { + test("only block-on-force-push + detected returns true", () => { + expect(shouldBlockSyncForStrategy("block-on-force-push", true)).toBe(true); + }); + + test("block-on-force-push without detection does not block", () => { + expect(shouldBlockSyncForStrategy("block-on-force-push", false)).toBe(false); + }); + + test("other strategies never block", () => { + expect(shouldBlockSyncForStrategy("disabled", true)).toBe(false); + expect(shouldBlockSyncForStrategy("always", true)).toBe(false); + expect(shouldBlockSyncForStrategy("on-force-push", true)).toBe(false); + }); +}); + +describe("strategyNeedsDetection", () => { + test("returns true for detection-based strategies", () => { + expect(strategyNeedsDetection("on-force-push")).toBe(true); + expect(strategyNeedsDetection("block-on-force-push")).toBe(true); + }); + + test("returns false for non-detection strategies", () => { + expect(strategyNeedsDetection("disabled")).toBe(false); + expect(strategyNeedsDetection("always")).toBe(false); + }); +}); diff --git 
a/src/lib/repo-backup.ts b/src/lib/repo-backup.ts index f84b8bb..0fa4463 100644 --- a/src/lib/repo-backup.ts +++ b/src/lib/repo-backup.ts @@ -1,7 +1,7 @@ import { mkdir, mkdtemp, readdir, rm, stat } from "node:fs/promises"; import os from "node:os"; import path from "node:path"; -import type { Config } from "@/types/config"; +import type { Config, BackupStrategy } from "@/types/config"; import { decryptConfigTokens } from "./utils/config-encryption"; const TRUE_VALUES = new Set(["1", "true", "yes", "on"]); @@ -101,6 +101,92 @@ export function shouldBlockSyncOnBackupFailure(config: Partial): boolean return configSetting === undefined ? true : Boolean(configSetting); } +// ---- Backup strategy resolver ---- + +const VALID_STRATEGIES = new Set([ + "disabled", + "always", + "on-force-push", + "block-on-force-push", +]); + +/** + * Resolve the effective backup strategy from config, falling back through: + * 1. `backupStrategy` field (new) + * 2. `backupBeforeSync` boolean (deprecated, backward compat) + * 3. `PRE_SYNC_BACKUP_STRATEGY` env var + * 4. `PRE_SYNC_BACKUP_ENABLED` env var (legacy) + * 5. Default: `"on-force-push"` + */ +export function resolveBackupStrategy(config: Partial): BackupStrategy { + // 1. Explicit backupStrategy field + const explicit = config.giteaConfig?.backupStrategy; + if (explicit && VALID_STRATEGIES.has(explicit as BackupStrategy)) { + return explicit as BackupStrategy; + } + + // 2. Legacy backupBeforeSync boolean โ†’ map to strategy + const legacy = config.giteaConfig?.backupBeforeSync; + if (legacy !== undefined) { + return legacy ? "always" : "disabled"; + } + + // 3. Env var (new) + const envStrategy = process.env.PRE_SYNC_BACKUP_STRATEGY?.trim().toLowerCase(); + if (envStrategy && VALID_STRATEGIES.has(envStrategy as BackupStrategy)) { + return envStrategy as BackupStrategy; + } + + // 4. Env var (legacy) + const envEnabled = process.env.PRE_SYNC_BACKUP_ENABLED; + if (envEnabled !== undefined) { + return parseBoolean(envEnabled, true) ? 
"always" : "disabled"; + } + + // 5. Default + return "on-force-push"; +} + +/** + * Determine whether a backup should be created for the given strategy and + * force-push detection result. + */ +export function shouldBackupForStrategy( + strategy: BackupStrategy, + forcePushDetected: boolean, +): boolean { + switch (strategy) { + case "disabled": + return false; + case "always": + return true; + case "on-force-push": + case "block-on-force-push": + return forcePushDetected; + default: + return false; + } +} + +/** + * Determine whether sync should be blocked (requires manual approval). + * Only `block-on-force-push` with an actual detection blocks sync. + */ +export function shouldBlockSyncForStrategy( + strategy: BackupStrategy, + forcePushDetected: boolean, +): boolean { + return strategy === "block-on-force-push" && forcePushDetected; +} + +/** + * Returns true when the strategy requires running force-push detection + * before deciding on backup / block behavior. + */ +export function strategyNeedsDetection(strategy: BackupStrategy): boolean { + return strategy === "on-force-push" || strategy === "block-on-force-push"; +} + export function resolveBackupPaths({ config, owner, @@ -136,13 +222,17 @@ export async function createPreSyncBundleBackup({ owner, repoName, cloneUrl, + force, }: { config: Partial; owner: string; repoName: string; cloneUrl: string; + /** When true, skip the legacy shouldCreatePreSyncBackup check. + * Used by the strategy-driven path which has already decided to backup. 
*/ + force?: boolean; }): Promise<{ bundlePath: string }> { - if (!shouldCreatePreSyncBackup(config)) { + if (!force && !shouldCreatePreSyncBackup(config)) { throw new Error("Pre-sync backup is disabled."); } diff --git a/src/lib/scheduler-service.ts b/src/lib/scheduler-service.ts index 8fe45d2..8e0413d 100644 --- a/src/lib/scheduler-service.ts +++ b/src/lib/scheduler-service.ts @@ -280,11 +280,29 @@ async function runScheduledSync(config: any): Promise { }); } + // Log pending-approval repos that are excluded from sync + try { + const pendingApprovalRepos = await db + .select({ id: repositories.id }) + .from(repositories) + .where( + and( + eq(repositories.userId, userId), + eq(repositories.status, 'pending-approval') + ) + ); + if (pendingApprovalRepos.length > 0) { + console.log(`[Scheduler] ${pendingApprovalRepos.length} repositories pending approval (force-push detected) for user ${userId} โ€” skipping sync for those`); + } + } catch { + // Non-critical logging, ignore errors + } + if (reposToSync.length === 0) { console.log(`[Scheduler] No repositories to sync for user ${userId}`); return; } - + console.log(`[Scheduler] Syncing ${reposToSync.length} repositories for user ${userId}`); // Process repositories in batches diff --git a/src/lib/utils.ts b/src/lib/utils.ts index 77e27c0..c47fb36 100644 --- a/src/lib/utils.ts +++ b/src/lib/utils.ts @@ -280,6 +280,8 @@ export const getStatusColor = (status: string): string => { return "bg-orange-500"; // Deleting case "deleted": return "bg-gray-600"; // Deleted + case "pending-approval": + return "bg-amber-500"; // Needs manual approval default: return "bg-gray-400"; // Unknown/neutral } diff --git a/src/lib/utils/config-defaults.ts b/src/lib/utils/config-defaults.ts index 32693b3..9c95173 100644 --- a/src/lib/utils/config-defaults.ts +++ b/src/lib/utils/config-defaults.ts @@ -93,7 +93,8 @@ export async function createDefaultConfig({ userId, envOverrides = {} }: Default forkStrategy: "reference", issueConcurrency: 3, 
pullRequestConcurrency: 5, - backupBeforeSync: true, + backupStrategy: "on-force-push", + backupBeforeSync: true, // Deprecated: kept for backward compat backupRetentionCount: 20, backupDirectory: "data/repo-backups", blockSyncOnBackupFailure: true, diff --git a/src/lib/utils/config-mapper.ts b/src/lib/utils/config-mapper.ts index 422bbbd..4c2779f 100644 --- a/src/lib/utils/config-mapper.ts +++ b/src/lib/utils/config-mapper.ts @@ -100,6 +100,7 @@ export function mapUiToDbConfig( mirrorPullRequests: mirrorOptions.mirrorMetadata && mirrorOptions.metadataComponents.pullRequests, mirrorLabels: mirrorOptions.mirrorMetadata && mirrorOptions.metadataComponents.labels, mirrorMilestones: mirrorOptions.mirrorMetadata && mirrorOptions.metadataComponents.milestones, + backupStrategy: giteaConfig.backupStrategy, backupBeforeSync: giteaConfig.backupBeforeSync ?? true, backupRetentionCount: giteaConfig.backupRetentionCount ?? 20, backupDirectory: giteaConfig.backupDirectory?.trim() || undefined, @@ -144,6 +145,7 @@ export function mapDbToUiConfig(dbConfig: any): { personalReposOrg: undefined, // Not stored in current schema issueConcurrency: dbConfig.giteaConfig?.issueConcurrency ?? 3, pullRequestConcurrency: dbConfig.giteaConfig?.pullRequestConcurrency ?? 5, + backupStrategy: dbConfig.giteaConfig?.backupStrategy || undefined, backupBeforeSync: dbConfig.giteaConfig?.backupBeforeSync ?? true, backupRetentionCount: dbConfig.giteaConfig?.backupRetentionCount ?? 
20, backupDirectory: dbConfig.giteaConfig?.backupDirectory || "data/repo-backups", diff --git a/src/lib/utils/force-push-detection.test.ts b/src/lib/utils/force-push-detection.test.ts new file mode 100644 index 0000000..5a686c6 --- /dev/null +++ b/src/lib/utils/force-push-detection.test.ts @@ -0,0 +1,319 @@ +import { describe, expect, it, mock } from "bun:test"; +import { + detectForcePush, + fetchGitHubBranches, + checkAncestry, + type BranchInfo, +} from "./force-push-detection"; + +// ---- Helpers ---- + +function makeOctokit(overrides: Record<string, any> = {}) { + return { + repos: { + listBranches: mock(() => Promise.resolve({ data: [] })), + compareCommits: mock(() => + Promise.resolve({ data: { status: "ahead" } }), + ), + ...overrides.repos, + }, + paginate: mock(async (_method: any, _params: any) => { + // Default: return whatever the test wired into _githubBranches + return overrides._githubBranches ?? []; + }), + ...overrides, + } as any; +} + +// ---- fetchGitHubBranches ---- + +describe("fetchGitHubBranches", () => { + it("maps Octokit paginated response to BranchInfo[]", async () => { + const octokit = makeOctokit({ + _githubBranches: [ + { name: "main", commit: { sha: "aaa" } }, + { name: "dev", commit: { sha: "bbb" } }, + ], + }); + + const result = await fetchGitHubBranches({ + octokit, + owner: "user", + repo: "repo", + }); + + expect(result).toEqual([ + { name: "main", sha: "aaa" }, + { name: "dev", sha: "bbb" }, + ]); + }); +}); + +// ---- checkAncestry ---- + +describe("checkAncestry", () => { + it("returns true for fast-forward (ahead)", async () => { + const octokit = makeOctokit({ + repos: { + compareCommits: mock(() => + Promise.resolve({ data: { status: "ahead" } }), + ), + }, + }); + + const result = await checkAncestry({ + octokit, + owner: "user", + repo: "repo", + baseSha: "old", + headSha: "new", + }); + + expect(result).toBe(true); + }); + + it("returns true for identical", async () => { + const octokit = makeOctokit({ + repos: { +
compareCommits: mock(() => + Promise.resolve({ data: { status: "identical" } }), + ), + }, + }); + + const result = await checkAncestry({ + octokit, + owner: "user", + repo: "repo", + baseSha: "same", + headSha: "same", + }); + + expect(result).toBe(true); + }); + + it("returns false for diverged", async () => { + const octokit = makeOctokit({ + repos: { + compareCommits: mock(() => + Promise.resolve({ data: { status: "diverged" } }), + ), + }, + }); + + const result = await checkAncestry({ + octokit, + owner: "user", + repo: "repo", + baseSha: "old", + headSha: "new", + }); + + expect(result).toBe(false); + }); + + it("returns false when API returns 404 (old SHA gone)", async () => { + const error404 = Object.assign(new Error("Not Found"), { status: 404 }); + const octokit = makeOctokit({ + repos: { + compareCommits: mock(() => Promise.reject(error404)), + }, + }); + + const result = await checkAncestry({ + octokit, + owner: "user", + repo: "repo", + baseSha: "gone", + headSha: "new", + }); + + expect(result).toBe(false); + }); + + it("throws on transient errors (fail-open for caller)", async () => { + const error500 = Object.assign(new Error("Internal Server Error"), { status: 500 }); + const octokit = makeOctokit({ + repos: { + compareCommits: mock(() => Promise.reject(error500)), + }, + }); + + await expect( + checkAncestry({ + octokit, + owner: "user", + repo: "repo", + baseSha: "old", + headSha: "new", + }), + ).rejects.toThrow("Internal Server Error"); + }); +}); + +// ---- detectForcePush ---- +// Uses _deps injection to avoid fragile global fetch mocking.
+ +describe("detectForcePush", () => { + const baseArgs = { + giteaUrl: "https://gitea.example.com", + giteaToken: "tok", + giteaOwner: "org", + giteaRepo: "repo", + githubOwner: "user", + githubRepo: "repo", + }; + + function makeDeps(overrides: { + giteaBranches?: BranchInfo[] | Error; + githubBranches?: BranchInfo[] | Error; + ancestryResult?: boolean; + } = {}) { + return { + fetchGiteaBranches: mock(async () => { + if (overrides.giteaBranches instanceof Error) throw overrides.giteaBranches; + return overrides.giteaBranches ?? []; + }) as any, + fetchGitHubBranches: mock(async () => { + if (overrides.githubBranches instanceof Error) throw overrides.githubBranches; + return overrides.githubBranches ?? []; + }) as any, + checkAncestry: mock(async () => overrides.ancestryResult ?? true) as any, + }; + } + + const dummyOctokit = {} as any; + + it("skips when Gitea has no branches (first mirror)", async () => { + const deps = makeDeps({ giteaBranches: [] }); + const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps }); + + expect(result.detected).toBe(false); + expect(result.skipped).toBe(true); + expect(result.skipReason).toContain("No Gitea branches"); + }); + + it("returns no detection when all SHAs match", async () => { + const deps = makeDeps({ + giteaBranches: [ + { name: "main", sha: "aaa" }, + { name: "dev", sha: "bbb" }, + ], + githubBranches: [ + { name: "main", sha: "aaa" }, + { name: "dev", sha: "bbb" }, + ], + }); + + const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps }); + + expect(result.detected).toBe(false); + expect(result.skipped).toBe(false); + expect(result.affectedBranches).toHaveLength(0); + }); + + it("detects deleted branch", async () => { + const deps = makeDeps({ + giteaBranches: [ + { name: "main", sha: "aaa" }, + { name: "old-branch", sha: "ccc" }, + ], + githubBranches: [{ name: "main", sha: "aaa" }], + }); + + const result = await detectForcePush({ ...baseArgs, octokit: 
dummyOctokit, _deps: deps }); + + expect(result.detected).toBe(true); + expect(result.affectedBranches).toHaveLength(1); + expect(result.affectedBranches[0]).toEqual({ + name: "old-branch", + reason: "deleted", + giteaSha: "ccc", + githubSha: null, + }); + }); + + it("returns no detection for fast-forward", async () => { + const deps = makeDeps({ + giteaBranches: [{ name: "main", sha: "old-sha" }], + githubBranches: [{ name: "main", sha: "new-sha" }], + ancestryResult: true, // fast-forward + }); + + const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps }); + + expect(result.detected).toBe(false); + expect(result.affectedBranches).toHaveLength(0); + }); + + it("detects diverged branch", async () => { + const deps = makeDeps({ + giteaBranches: [{ name: "main", sha: "old-sha" }], + githubBranches: [{ name: "main", sha: "rewritten-sha" }], + ancestryResult: false, // diverged + }); + + const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps }); + + expect(result.detected).toBe(true); + expect(result.affectedBranches).toHaveLength(1); + expect(result.affectedBranches[0]).toEqual({ + name: "main", + reason: "diverged", + giteaSha: "old-sha", + githubSha: "rewritten-sha", + }); + }); + + it("detects force-push when ancestry check fails (old SHA gone)", async () => { + const deps = makeDeps({ + giteaBranches: [{ name: "main", sha: "old-sha" }], + githubBranches: [{ name: "main", sha: "new-sha" }], + ancestryResult: false, // checkAncestry returns false on error + }); + + const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps }); + + expect(result.detected).toBe(true); + expect(result.affectedBranches).toHaveLength(1); + expect(result.affectedBranches[0].reason).toBe("diverged"); + }); + + it("skips when Gitea API returns 404", async () => { + const { HttpError } = await import("@/lib/http-client"); + const deps = makeDeps({ + giteaBranches: new HttpError("not found", 404, "Not 
Found"), + }); + + const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps }); + + expect(result.detected).toBe(false); + expect(result.skipped).toBe(true); + expect(result.skipReason).toContain("not found"); + }); + + it("skips when Gitea API returns server error", async () => { + const deps = makeDeps({ + giteaBranches: new Error("HTTP 500: internal error"), + }); + + const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps }); + + expect(result.detected).toBe(false); + expect(result.skipped).toBe(true); + expect(result.skipReason).toContain("Failed to fetch Gitea branches"); + }); + + it("skips when GitHub API fails", async () => { + const deps = makeDeps({ + giteaBranches: [{ name: "main", sha: "aaa" }], + githubBranches: new Error("rate limited"), + }); + + const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps }); + + expect(result.detected).toBe(false); + expect(result.skipped).toBe(true); + expect(result.skipReason).toContain("Failed to fetch GitHub branches"); + }); +}); diff --git a/src/lib/utils/force-push-detection.ts b/src/lib/utils/force-push-detection.ts new file mode 100644 index 0000000..a6f65df --- /dev/null +++ b/src/lib/utils/force-push-detection.ts @@ -0,0 +1,286 @@ +/** + * Force-push detection module. + * + * Compares branch SHAs between a Gitea mirror and GitHub source to detect + * branches that were deleted, rewritten, or force-pushed. + * + * **Fail-open**: If detection itself fails (API errors, rate limits, etc.), + * the result indicates no force-push so sync proceeds normally. Detection + * should never block sync due to its own failure. 
+ */ + +import type { Octokit } from "@octokit/rest"; +import { httpGet, HttpError } from "@/lib/http-client"; + +// ---- Types ---- + +export interface BranchInfo { + name: string; + sha: string; +} + +export type ForcePushReason = "deleted" | "diverged" | "non-fast-forward"; + +export interface AffectedBranch { + name: string; + reason: ForcePushReason; + giteaSha: string; + githubSha: string | null; // null when branch was deleted +} + +export interface ForcePushDetectionResult { + detected: boolean; + affectedBranches: AffectedBranch[]; + /** True when detection could not run (API error, etc.) */ + skipped: boolean; + skipReason?: string; +} + +const NO_FORCE_PUSH: ForcePushDetectionResult = { + detected: false, + affectedBranches: [], + skipped: false, +}; + +function skippedResult(reason: string): ForcePushDetectionResult { + return { + detected: false, + affectedBranches: [], + skipped: true, + skipReason: reason, + }; } + +// ---- Branch fetching ---- + +/** + * Fetch all branches from a Gitea repository (paginated). + */ +export async function fetchGiteaBranches({ + giteaUrl, + giteaToken, + owner, + repo, +}: { + giteaUrl: string; + giteaToken: string; + owner: string; + repo: string; +}): Promise<BranchInfo[]> { + const branches: BranchInfo[] = []; + let page = 1; + const perPage = 50; + + while (true) { + const url = `${giteaUrl}/api/v1/repos/${owner}/${repo}/branches?page=${page}&limit=${perPage}`; + const response = await httpGet<Array<{ name: string; commit: { id: string } }>>( + url, + { Authorization: `token ${giteaToken}` }, + ); + + if (!Array.isArray(response.data) || response.data.length === 0) break; + + for (const b of response.data) { + branches.push({ name: b.name, sha: b.commit.id }); + } + + if (response.data.length < perPage) break; + page++; + } + + return branches; +} + +/** + * Fetch all branches from a GitHub repository (paginated via Octokit).
+ */ +export async function fetchGitHubBranches({ + octokit, + owner, + repo, +}: { + octokit: Octokit; + owner: string; + repo: string; +}): Promise<BranchInfo[]> { + const data = await octokit.paginate(octokit.repos.listBranches, { + owner, + repo, + per_page: 100, + }); + + return data.map((b) => ({ name: b.name, sha: b.commit.sha })); +} + +/** + * Check whether the transition from `baseSha` to `headSha` on the same branch + * is a fast-forward (i.e. `baseSha` is an ancestor of `headSha`). + * + * Returns `true` when the change is safe (fast-forward) and `false` when it + * is a confirmed force-push (404 = old SHA garbage-collected from GitHub). + * + * Throws on transient errors (rate limits, network issues) so the caller + * can decide how to handle them (fail-open: skip that branch). + */ +export async function checkAncestry({ + octokit, + owner, + repo, + baseSha, + headSha, +}: { + octokit: Octokit; + owner: string; + repo: string; + baseSha: string; + headSha: string; +}): Promise<boolean> { + try { + const { data } = await octokit.repos.compareCommits({ + owner, + repo, + base: baseSha, + head: headSha, + }); + // "ahead" means headSha is strictly ahead of baseSha → fast-forward. + // "behind" or "diverged" means the branch was rewritten. + return data.status === "ahead" || data.status === "identical"; + } catch (error: any) { + // 404 / 422 = old SHA no longer exists on GitHub → confirmed force-push. + if (error?.status === 404 || error?.status === 422) { + return false; + } + // Any other error (rate limit, network) → rethrow so caller can + // handle it as fail-open (skip branch) rather than false-positive. + throw error; + } +} + +// ---- Main detection ---- + +/** + * Compare branch SHAs between Gitea and GitHub to detect force-pushes. + * + * The function is intentionally fail-open: any error during detection returns + * a "skipped" result so that sync can proceed normally.
+ */ +export async function detectForcePush({ + giteaUrl, + giteaToken, + giteaOwner, + giteaRepo, + octokit, + githubOwner, + githubRepo, + _deps, +}: { + giteaUrl: string; + giteaToken: string; + giteaOwner: string; + giteaRepo: string; + octokit: Octokit; + githubOwner: string; + githubRepo: string; + /** @internal — test-only dependency injection */ + _deps?: { + fetchGiteaBranches: typeof fetchGiteaBranches; + fetchGitHubBranches: typeof fetchGitHubBranches; + checkAncestry: typeof checkAncestry; + }; +}): Promise<ForcePushDetectionResult> { + const deps = _deps ?? { fetchGiteaBranches, fetchGitHubBranches, checkAncestry }; + + // 1. Fetch Gitea branches + let giteaBranches: BranchInfo[]; + try { + giteaBranches = await deps.fetchGiteaBranches({ + giteaUrl, + giteaToken, + owner: giteaOwner, + repo: giteaRepo, + }); + } catch (error) { + // Gitea 404 = repo not yet mirrored, skip detection + if (error instanceof HttpError && error.status === 404) { + return skippedResult("Gitea repository not found (first mirror?)"); + } + return skippedResult( + `Failed to fetch Gitea branches: ${error instanceof Error ? error.message : String(error)}`, + ); + } + + // First-time mirror: no Gitea branches → nothing to compare + if (giteaBranches.length === 0) { + return skippedResult("No Gitea branches found (first mirror?)"); + } + + // 2. Fetch GitHub branches + let githubBranches: BranchInfo[]; + try { + githubBranches = await deps.fetchGitHubBranches({ + octokit, + owner: githubOwner, + repo: githubRepo, + }); + } catch (error) { + return skippedResult( + `Failed to fetch GitHub branches: ${error instanceof Error ? error.message : String(error)}`, + ); + } + + const githubBranchMap = new Map(githubBranches.map((b) => [b.name, b.sha])); + + // 3.
Compare each Gitea branch against GitHub + const affected: AffectedBranch[] = []; + + for (const giteaBranch of giteaBranches) { + const githubSha = githubBranchMap.get(giteaBranch.name); + + if (githubSha === undefined) { + // Branch was deleted on GitHub + affected.push({ + name: giteaBranch.name, + reason: "deleted", + giteaSha: giteaBranch.sha, + githubSha: null, + }); + continue; + } + + // Same SHA → no change + if (githubSha === giteaBranch.sha) continue; + + // SHAs differ → check if it's a fast-forward + try { + const isFastForward = await deps.checkAncestry({ + octokit, + owner: githubOwner, + repo: githubRepo, + baseSha: giteaBranch.sha, + headSha: githubSha, + }); + + if (!isFastForward) { + affected.push({ + name: giteaBranch.name, + reason: "diverged", + giteaSha: giteaBranch.sha, + githubSha, + }); + } + } catch { + // Individual branch check failure → skip that branch (fail-open) + continue; + } + } + + if (affected.length === 0) { + return NO_FORCE_PUSH; + } + + return { + detected: true, + affectedBranches: affected, + skipped: false, + }; +} diff --git a/src/pages/api/job/approve-sync.ts b/src/pages/api/job/approve-sync.ts new file mode 100644 index 0000000..14cec9b --- /dev/null +++ b/src/pages/api/job/approve-sync.ts @@ -0,0 +1,202 @@ +import type { APIRoute } from "astro"; +import { db, configs, repositories } from "@/lib/db"; +import { and, eq, inArray } from "drizzle-orm"; +import { repositoryVisibilityEnum, repoStatusEnum } from "@/types/Repository"; +import { syncGiteaRepoEnhanced } from "@/lib/gitea-enhanced"; +import { createSecureErrorResponse } from "@/lib/utils"; +import { requireAuthenticatedUserId } from "@/lib/auth-guards"; +import { createPreSyncBundleBackup } from "@/lib/repo-backup"; +import { decryptConfigTokens } from "@/lib/utils/config-encryption"; +import type { Config } from "@/types/config"; +import { createMirrorJob } from "@/lib/helpers"; + +interface ApproveSyncRequest { + repositoryIds: string[]; + action:
"approve" | "dismiss"; +} + +export const POST: APIRoute = async ({ request, locals }) => { + try { + const authResult = await requireAuthenticatedUserId({ request, locals }); + if ("response" in authResult) return authResult.response; + const userId = authResult.userId; + + const body: ApproveSyncRequest = await request.json(); + const { repositoryIds, action } = body; + + if (!repositoryIds || !Array.isArray(repositoryIds) || repositoryIds.length === 0) { + return new Response( + JSON.stringify({ success: false, message: "repositoryIds are required." }), + { status: 400, headers: { "Content-Type": "application/json" } }, + ); + } + + if (action !== "approve" && action !== "dismiss") { + return new Response( + JSON.stringify({ success: false, message: "action must be 'approve' or 'dismiss'." }), + { status: 400, headers: { "Content-Type": "application/json" } }, + ); + } + + // Fetch config + const configResult = await db + .select() + .from(configs) + .where(eq(configs.userId, userId)) + .limit(1); + + const config = configResult[0]; + if (!config) { + return new Response( + JSON.stringify({ success: false, message: "No configuration found." }), + { status: 400, headers: { "Content-Type": "application/json" } }, + ); + } + + // Fetch repos โ€” only those in pending-approval status + const repos = await db + .select() + .from(repositories) + .where( + and( + eq(repositories.userId, userId), + eq(repositories.status, "pending-approval"), + inArray(repositories.id, repositoryIds), + ), + ); + + if (!repos.length) { + return new Response( + JSON.stringify({ success: false, message: "No pending-approval repositories found for the given IDs." 
}), + { status: 404, headers: { "Content-Type": "application/json" } }, + ); + } + + if (action === "dismiss") { + // Reset status to "synced" so repos resume normal schedule + for (const repo of repos) { + await db + .update(repositories) + .set({ + status: "synced", + errorMessage: null, + updatedAt: new Date(), + }) + .where(eq(repositories.id, repo.id)); + + await createMirrorJob({ + userId, + repositoryId: repo.id, + repositoryName: repo.name, + message: `Force-push alert dismissed for ${repo.name}`, + details: "User dismissed the force-push alert. Repository will resume normal sync schedule.", + status: "synced", + }); + } + + return new Response( + JSON.stringify({ + success: true, + message: `Dismissed ${repos.length} repository alert(s).`, + repositories: repos.map((repo) => ({ + ...repo, + status: "synced", + errorMessage: null, + })), + }), + { status: 200, headers: { "Content-Type": "application/json" } }, + ); + } + + // action === "approve": create backup first (safety), then trigger sync + const decryptedConfig = decryptConfigTokens(config as unknown as Config); + + // Process in background + setTimeout(async () => { + for (const repo of repos) { + try { + const { getGiteaRepoOwnerAsync } = await import("@/lib/gitea"); + const repoOwner = await getGiteaRepoOwnerAsync({ config, repository: repo }); + + // Always create a backup before approved sync for safety + const cloneUrl = `${config.giteaConfig.url.replace(/\/$/, "")}/${repoOwner}/${repo.name}.git`; + try { + const backupResult = await createPreSyncBundleBackup({ + config, + owner: repoOwner, + repoName: repo.name, + cloneUrl, + force: true, // Bypass legacy gate — approval implies backup + }); + + await createMirrorJob({ + userId, + repositoryId: repo.id, + repositoryName: repo.name, + message: `Safety snapshot created for ${repo.name}`, + details: `Pre-approval snapshot at ${backupResult.bundlePath}.`, + status: "syncing", + }); + } catch (backupError) { + console.warn( + `[ApproveSync]
Backup failed for ${repo.name}, proceeding with sync: ${ + backupError instanceof Error ? backupError.message : String(backupError) + }`, + ); + } + + // Trigger sync — skip detection to avoid re-blocking + const repoData = { + ...repo, + status: repoStatusEnum.parse("syncing"), + organization: repo.organization ?? undefined, + lastMirrored: repo.lastMirrored ?? undefined, + errorMessage: repo.errorMessage ?? undefined, + forkedFrom: repo.forkedFrom ?? undefined, + visibility: repositoryVisibilityEnum.parse(repo.visibility), + mirroredLocation: repo.mirroredLocation || "", + }; + + await syncGiteaRepoEnhanced({ + config, + repository: repoData, + skipForcePushDetection: true, + }); + console.log(`[ApproveSync] Sync completed for approved repository: ${repo.name}`); + } catch (error) { + console.error( + `[ApproveSync] Failed to sync approved repository ${repo.name}:`, + error, + ); + } + } + }, 0); + + // Immediately update status to syncing for responsiveness + for (const repo of repos) { + await db + .update(repositories) + .set({ + status: "syncing", + errorMessage: null, + updatedAt: new Date(), + }) + .where(eq(repositories.id, repo.id)); + } + + return new Response( + JSON.stringify({ + success: true, + message: `Approved sync for ${repos.length} repository(ies).
Backup + sync started.`, + repositories: repos.map((repo) => ({ + ...repo, + status: "syncing", + errorMessage: null, + })), + }), + { status: 200, headers: { "Content-Type": "application/json" } }, + ); + } catch (error) { + return createSecureErrorResponse(error, "approve-sync", 500); + } +}; diff --git a/src/types/Repository.ts b/src/types/Repository.ts index 30cc169..9272c3b 100644 --- a/src/types/Repository.ts +++ b/src/types/Repository.ts @@ -13,6 +13,7 @@ export const repoStatusEnum = z.enum([ "syncing", "synced", "archived", + "pending-approval", // Blocked by force-push detection, needs manual approval ]); export type RepoStatus = z.infer<typeof repoStatusEnum>; diff --git a/src/types/config.ts b/src/types/config.ts index 3101316..ca25e5d 100644 --- a/src/types/config.ts +++ b/src/types/config.ts @@ -3,6 +3,7 @@ import { type Config as ConfigType } from "@/lib/db/schema"; export type GiteaOrgVisibility = "public" | "private" | "limited"; export type MirrorStrategy = "preserve" | "single-org" | "flat-user" | "mixed"; export type StarredReposMode = "dedicated-org" | "preserve-owner"; +export type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push"; export interface GiteaConfig { url: string; @@ -18,7 +19,8 @@ export interface GiteaConfig { personalReposOrg?: string; // Override destination for personal repos issueConcurrency?: number; pullRequestConcurrency?: number; - backupBeforeSync?: boolean; + backupStrategy?: BackupStrategy; + backupBeforeSync?: boolean; // Deprecated: kept for backward compat, use backupStrategy backupRetentionCount?: number; backupDirectory?: string; blockSyncOnBackupFailure?: boolean; diff --git a/tests/e2e/03-backup.spec.ts b/tests/e2e/03-backup.spec.ts index 35a5c5e..ca6f1df 100644 --- a/tests/e2e/03-backup.spec.ts +++ b/tests/e2e/03-backup.spec.ts @@ -6,13 +6,13 @@ * by the 02-mirror-workflow suite. * * What is tested: - * B1. Enable backupBeforeSync in config + * B1. Enable backupStrategy: "always" in config * B2.
Confirm mirrored repos exist in Gitea (precondition) * B3. Trigger a re-sync with backup enabled — verify the backup code path * runs (snapshot activity entries appear in the activity log) * B4. Inspect activity log for snapshot-related entries * B5. Enable blockSyncOnBackupFailure and verify the flag is persisted - * B6. Disable backup and verify config resets cleanly + * B6. Disable backup (backupStrategy: "disabled") and verify config resets cleanly */ import { test, expect } from "@playwright/test"; @@ -54,10 +54,10 @@ test.describe("E2E: Backup configuration", () => { const giteaToken = giteaApi.getTokenValue(); expect(giteaToken, "Gitea token required").toBeTruthy(); - // Save config with backup enabled + // Save config with backup strategy set to "always" await saveConfig(request, giteaToken, appCookies, { giteaConfig: { - backupBeforeSync: true, + backupStrategy: "always", blockSyncOnBackupFailure: false, backupRetentionCount: 5, backupDirectory: "data/repo-backups", @@ -75,7 +75,7 @@ test.describe("E2E: Backup configuration", () => { const configData = await configResp.json(); const giteaCfg = configData.giteaConfig ?? configData.gitea ??
{}; console.log( - `[Backup] Config saved: backupBeforeSync=${giteaCfg.backupBeforeSync}, blockOnFailure=${giteaCfg.blockSyncOnBackupFailure}`, + `[Backup] Config saved: backupStrategy=${giteaCfg.backupStrategy}, blockOnFailure=${giteaCfg.blockSyncOnBackupFailure}`, ); } }); @@ -202,7 +202,7 @@ test.describe("E2E: Backup configuration", () => { expect( backupJobs.length, "Expected at least one backup/snapshot activity entry when " + - "backupBeforeSync is enabled and repos exist in Gitea", + "backupStrategy is 'always' and repos exist in Gitea", ).toBeGreaterThan(0); // Check for any failed backups @@ -247,7 +247,7 @@ test.describe("E2E: Backup configuration", () => { // Update config to block sync on backup failure await saveConfig(request, giteaToken, appCookies, { giteaConfig: { - backupBeforeSync: true, + backupStrategy: "always", blockSyncOnBackupFailure: true, backupRetentionCount: 5, backupDirectory: "data/repo-backups", @@ -284,7 +284,7 @@ test.describe("E2E: Backup configuration", () => { // Disable backup await saveConfig(request, giteaToken, appCookies, { giteaConfig: { - backupBeforeSync: false, + backupStrategy: "disabled", blockSyncOnBackupFailure: false, }, }); @@ -297,7 +297,7 @@ test.describe("E2E: Backup configuration", () => { const configData = await configResp.json(); const giteaCfg = configData.giteaConfig ?? configData.gitea ?? 
{}; console.log( - `[Backup] After disable: backupBeforeSync=${giteaCfg.backupBeforeSync}`, + `[Backup] After disable: backupStrategy=${giteaCfg.backupStrategy}`, ); } console.log("[Backup] Backup configuration test complete"); diff --git a/tests/e2e/04-force-push.spec.ts b/tests/e2e/04-force-push.spec.ts index c974e0d..c1a850f 100644 --- a/tests/e2e/04-force-push.spec.ts +++ b/tests/e2e/04-force-push.spec.ts @@ -302,7 +302,7 @@ test.describe("E2E: Force-push simulation", () => { // Ensure backup is disabled for this test await saveConfig(request, giteaToken, appCookies, { giteaConfig: { - backupBeforeSync: false, + backupStrategy: "disabled", blockSyncOnBackupFailure: false, }, }); @@ -560,16 +560,16 @@ test.describe("E2E: Force-push simulation", () => { const giteaToken = giteaApi.getTokenValue(); - // Enable backup + // Enable backup with "always" strategy await saveConfig(request, giteaToken, appCookies, { giteaConfig: { - backupBeforeSync: true, + backupStrategy: "always", blockSyncOnBackupFailure: false, // don't block — we want to see both backup + sync happen backupRetentionCount: 5, backupDirectory: "data/repo-backups", }, }); - console.log("[ForcePush] Backup enabled for protected sync test"); + console.log("[ForcePush] Backup enabled (strategy=always) for protected sync test"); // Force-push again mutateSourceRepo(MY_PROJECT_BARE, "my-project-rewrite2", (workDir) => { @@ -744,7 +744,7 @@ test.describe("E2E: Force-push simulation", () => { expect( backupJobs.length, "At least one backup/snapshot activity should exist for my-project " + - "when backupBeforeSync is enabled", + "when backupStrategy is 'always'", ).toBeGreaterThan(0); // Check whether any backups actually succeeded diff --git a/tests/e2e/helpers.ts b/tests/e2e/helpers.ts index 1a6d310..3fc2bb3 100644 --- a/tests/e2e/helpers.ts +++ b/tests/e2e/helpers.ts @@ -520,7 +520,7 @@ export async function saveConfig( starredReposOrg: "github-stars", preserveOrgStructure: false, mirrorStrategy:
"single-org", - backupBeforeSync: false, + backupStrategy: "disabled", blockSyncOnBackupFailure: false, };