Compare commits

...

4 Commits

Author SHA1 Message Date
ARUNAVO RAY
98da7065e0 feat: smart force-push protection with backup strategies (#206)
* feat: smart force-push protection with backup strategies (#187)

Replace blunt `backupBeforeSync` boolean with `backupStrategy` enum
offering four modes: disabled, always, on-force-push (default), and
block-on-force-push. This dramatically reduces backup storage for large
mirror collections by only creating snapshots when force-pushes are
actually detected.

Detection works by comparing branch SHAs between Gitea and GitHub APIs
before each sync — no git cloning required. Fail-open design ensures
detection errors never block sync.

Key changes:
- Add force-push detection module (branch SHA comparison via APIs)
- Add backup strategy resolver with backward-compat migration
- Add pending-approval repo status with approve/dismiss UI + API
- Add block-on-force-push mode requiring manual approval
- Fix checkAncestry to only treat 404 as confirmed force-push
  (transient errors skip branch instead of false-positive blocking)
- Fix approve-sync to bypass detection gate (skipForcePushDetection)
- Fix backup execution to not be hard-gated by deprecated flag
- Persist backupStrategy through config-mapper round-trip

* fix: resolve four bugs in smart force-push protection

P0: Approve flow re-blocks itself — approve-sync now calls
syncGiteaRepoEnhanced with skipForcePushDetection: true so the
detection+block gate is bypassed on approved syncs.

P1: backupStrategy not persisted — added to both directions of the
config-mapper. Don't inject a default in the mapper; let
resolveBackupStrategy handle fallback so legacy backupBeforeSync
still works for E2E tests and existing configs.

P1: Backup hard-gated by deprecated backupBeforeSync — added force
flag to createPreSyncBundleBackup; strategy-driven callers and
approve-sync pass force: true to bypass the legacy guard.

P1: checkAncestry false positives — now only returns false for
404/422 (confirmed force-push). Transient errors (rate limits, 500s)
are rethrown so detectForcePush skips that branch (fail-open).

* test(e2e): migrate backup tests from backupBeforeSync to backupStrategy

Update E2E tests to use the new backupStrategy enum ("always",
"disabled") instead of the deprecated backupBeforeSync boolean.

* docs: add backup strategy UI screenshot

* refactor(ui): move Destructive Update Protection to GitHub config tab

Relocates the backup strategy section from GiteaConfigForm to
GitHubConfigForm since it protects against GitHub-side force-pushes.
Adds ShieldAlert icon to match other section header patterns.

* docs: add force-push protection documentation and Beta badge

Add docs/FORCE_PUSH_PROTECTION.md covering detection mechanism,
backup strategies, API usage, and troubleshooting. Link it from
README features list and support section. Mark the feature as Beta
in the UI with an outline badge.

* fix(ui): match Beta badge style to Git LFS badge
2026-03-02 15:48:59 +05:30
ARUNAVO RAY
58e0194aa6 fix(nix): ensure absolute bundle path in pre-sync backup (#204)
* fix(nix): ensure absolute bundle path in pre-sync backup (#203)

Use path.resolve() instead of conditional path.isAbsolute() check to
guarantee bundlePath is always absolute before passing to git -C. On
NixOS, relative paths were interpreted relative to the temp mirror
clone directory, causing "No such file or directory" errors.

Closes #203

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(nix): ensure absolute bundle path in pre-sync backup (#203)

Use path.resolve() instead of conditional path.isAbsolute() check to
guarantee bundlePath is always absolute before passing to git -C. On
NixOS, relative paths were interpreted relative to the temp mirror
clone directory, causing "No such file or directory" errors.

Extract resolveBackupPaths() for testability. Bump version to 3.10.1.

Closes #203

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* ci: drop macos matrix and only run nix build on main/tags

- Remove macos-latest from Nix CI matrix (ubuntu-only)
- Only run `nix build` on main branch and version tags, skip on PRs
- `nix flake check` still runs on all PRs for validation

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 08:37:18 +05:30
Arunavo Ray
7864c46279 unused file 2026-03-01 08:06:11 +05:30
Arunavo Ray
e3970e53e1 chore: release v3.10.0
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 08:01:02 +05:30
27 changed files with 1863 additions and 342 deletions

Binary file not shown (new screenshot, 34 KiB).


@@ -13,10 +13,7 @@ permissions:
 jobs:
   check:
-    strategy:
-      matrix:
-        os: [ubuntu-latest, macos-latest]
-    runs-on: ${{ matrix.os }}
+    runs-on: ubuntu-latest
     steps:
       - uses: actions/checkout@v4
@@ -34,4 +31,5 @@ jobs:
         run: nix flake show
       - name: Build package
+        if: github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/tags/v')
         run: nix build --print-build-logs


@@ -1,169 +0,0 @@
# Nix Distribution - Ready to Use!
## Current Status: WORKS NOW
Your Nix package is **already distributable**! Users can run it directly from GitHub without any additional setup on your end.
## How Users Will Use It
### Simple: Just Run From GitHub
```bash
nix run --extra-experimental-features 'nix-command flakes' github:RayLabsHQ/gitea-mirror
```
That's it! No releases, no CI, no infrastructure needed. It works right now.
---
## What Happens When They Run This?
1. **Nix fetches** your repo from GitHub
2. **Nix reads** `flake.nix` and `flake.lock`
3. **Nix builds** the package on their machine
4. **Nix runs** the application
5. **Result cached** in `/nix/store` for reuse
---
## Do You Need CI or Releases?
### For Basic Usage: **NO**
Users can already use it from GitHub. No CI or releases required.
### For CI Validation: **Already Set Up**
GitHub Actions validates builds on every push with Magic Nix Cache (free, no setup).
---
## Next Steps (Optional)
### Option 1: Release Versioning (2 minutes)
**Why:** Users can pin to specific versions
**How:**
```bash
# When ready to release
git tag v3.8.11
git push origin v3.8.11
# Users can then pin to this version
nix run github:RayLabsHQ/gitea-mirror/v3.8.11
```
No additional CI needed - tags work automatically with flakes!
### Option 2: Submit to nixpkgs (Long Term)
**Why:** Maximum discoverability and trust
**When:** After package is stable and well-tested
**How:** Submit PR to https://github.com/NixOS/nixpkgs
---
## Files Created
### Essential (Already Working)
- `flake.nix` - Package definition
- `flake.lock` - Dependency lock file
- `.envrc` - direnv integration
### Documentation
- `NIX.md` - Quick reference for users
- `docs/NIX_DEPLOYMENT.md` - Complete deployment guide
- `docs/NIX_DISTRIBUTION.md` - Distribution guide for you (maintainer)
- `README.md` - Updated with Nix instructions
### CI (Already Set Up)
- `.github/workflows/nix-build.yml` - Builds and validates on Linux + macOS
### Updated
- `.gitignore` - Added Nix artifacts
---
## Comparison: Your Distribution Options
| Setup | Time | User Experience | What You Need |
|-------|------|----------------|---------------|
| **Direct GitHub** | 0 min | Slow (build from source) | Nothing! Works now |
| **+ Git Tags** | 2 min | Versionable | Just push tags |
| **+ nixpkgs** | Hours | Official/Trusted | PR review process |
**Recommendation:** Direct GitHub works now. Add git tags for versioning. Consider nixpkgs submission once stable.
---
## Testing Your Distribution
You can test it right now:
```bash
# Test direct GitHub usage
nix run --extra-experimental-features 'nix-command flakes' github:RayLabsHQ/gitea-mirror
# Test with specific commit
nix run github:RayLabsHQ/gitea-mirror/$(git rev-parse HEAD)
# Validate flake
nix flake check
```
---
## User Documentation Locations
Users will find instructions in:
1. **README.md** - Installation section (already updated)
2. **NIX.md** - Quick reference
3. **docs/NIX_DEPLOYMENT.md** - Detailed guide
All docs include the correct commands with experimental features flags.
---
## When to Release New Versions
### For Git Tag Releases:
```bash
# 1. Update version in package.json
vim package.json
# 2. Update version in flake.nix (line 17)
vim flake.nix # version = "3.8.12";
# 3. Commit and tag
git add package.json flake.nix
git commit -m "chore: bump version to v3.8.12"
git tag v3.8.12
git push origin main
git push origin v3.8.12
```
Users can then use: `nix run github:RayLabsHQ/gitea-mirror/v3.8.12`
### No Release Needed For:
- Bug fixes
- Small changes
- Continuous updates
Users can always use latest from main: `nix run github:RayLabsHQ/gitea-mirror`
---
## Summary
**Ready to distribute RIGHT NOW**
- Just commit and push your `flake.nix`
- Users can run directly from GitHub
- CI validates builds automatically
**Optional: Submit to nixpkgs**
- Maximum discoverability
- Official Nix repository
- Do this once package is stable
See `docs/NIX_DISTRIBUTION.md` for complete details!


@@ -40,6 +40,7 @@ First user signup becomes admin. Configure GitHub and Gitea through the web inte
 - 🔄 **Auto-discovery** - Automatically import new GitHub repositories (v3.4.0+)
 - 🧹 **Repository cleanup** - Auto-remove repos deleted from GitHub (v3.4.0+)
 - 🎯 **Proper mirror intervals** - Respects configured sync intervals (v3.4.0+)
+- 🛡️ **[Force-push protection](docs/FORCE_PUSH_PROTECTION.md)** - Smart detection with backup-on-demand or block-and-approve modes (Beta)
 - 🗑️ Automatic database cleanup with configurable retention
 - 🐳 Dockerized with multi-arch support (AMD64/ARM64)
@@ -499,6 +500,7 @@ GNU Affero General Public License v3.0 (AGPL-3.0) - see [LICENSE](LICENSE) file
 - 📖 [Documentation](https://github.com/RayLabsHQ/gitea-mirror/tree/main/docs)
 - 🔐 [Environment Variables](docs/ENVIRONMENT_VARIABLES.md)
+- 🛡️ [Force-Push Protection](docs/FORCE_PUSH_PROTECTION.md)
 - 🐛 [Report Issues](https://github.com/RayLabsHQ/gitea-mirror/issues)
 - 💬 [Discussions](https://github.com/RayLabsHQ/gitea-mirror/discussions)
 - 🔧 [Proxmox VE Script](https://community-scripts.github.io/ProxmoxVE/scripts?id=gitea-mirror)


@@ -0,0 +1,179 @@
# Force-Push Protection
This document describes the smart force-push protection system introduced in gitea-mirror v3.11.0+.
## The Problem
GitHub repositories can be force-pushed at any time — rewriting history, deleting branches, or replacing commits entirely. When gitea-mirror syncs a force-pushed repo, the old history in Gitea is silently overwritten. Files, commits, and branches disappear with no way to recover them.
The original workaround (`backupBeforeSync: true`) created a full git bundle backup before **every** sync. This doesn't scale — a user with 100+ GiB of mirrors would need up to 2 TB of backup storage with default retention settings, even though force-pushes are rare.
## Solution: Smart Detection
Instead of backing up everything every time, the system detects force-pushes **before** they happen and only acts when needed.
### How Detection Works
Before each sync, the app compares branch SHAs between Gitea (the mirror) and GitHub (the source):
1. **Fetch branches from both sides** — lightweight API calls to get branch names and their latest commit SHAs
2. **Compare each branch**:
- SHAs match → nothing changed, no action needed
- SHAs differ → check if the change is a normal push or a force-push
3. **Ancestry check** — for branches with different SHAs, call GitHub's compare API to determine if the new SHA is a descendant of the old one:
- **Fast-forward** (new SHA descends from old) → normal push, safe to sync
- **Diverged** (histories split) → force-push detected
- **404/422** (old SHA no longer exists on GitHub) → history was rewritten, force-push detected
- **Branch deleted on GitHub** → flagged as destructive change
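The ancestry outcomes above can be summarized as a small classifier. This is an illustrative TypeScript sketch under assumed names, not the project's actual `checkAncestry` code:

```typescript
// Hypothetical classifier for the ancestry-check outcomes described above.
// "retry" means a transient error: the branch is skipped (fail-open).
type CompareStatus = "ahead" | "identical" | "behind" | "diverged";
type Verdict = "fast-forward" | "force-push" | "retry";

function classifyComparison(status?: CompareStatus, httpStatus?: number): Verdict {
  // 404/422: the old SHA no longer exists on GitHub, so history was rewritten.
  if (httpStatus === 404 || httpStatus === 422) return "force-push";
  // Any other error (rate limit, 500) is transient: skip this branch.
  if (httpStatus !== undefined) return "retry";
  // Fast-forward: the new SHA descends from (or equals) the old one.
  if (status === "ahead" || status === "identical") return "fast-forward";
  // "diverged" or "behind": histories split or the branch was reset.
  return "force-push";
}
```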
### What Happens on Detection
What happens next depends on the configured strategy (see below):
- **Backup strategies** (`always`, `on-force-push`): create a git bundle snapshot, then sync
- **Block strategy** (`block-on-force-push`): halt the sync, mark the repo as `pending-approval`, wait for user action
### Fail-Open Design
If detection itself fails (GitHub rate limits, network errors, API outages), sync proceeds normally. Detection never blocks a sync due to its own failure. Individual branch check failures are skipped — one flaky branch doesn't affect the others.
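The per-branch, fail-open loop can be sketched as follows. This is a minimal illustration, not the actual implementation; here `detectForcePush` takes an injected ancestry checker rather than the project's real signature:

```typescript
// Illustrative fail-open detection loop: unchanged branches are skipped,
// diverged branches are flagged, and transient errors never flag a branch.
async function detectForcePush(
  branches: { name: string; giteaSha: string; githubSha: string }[],
  // Returns true when the new SHA is a fast-forward of the old one.
  checkAncestry: (oldSha: string, newSha: string) => Promise<boolean>,
): Promise<string[]> {
  const forcePushed: string[] = [];
  for (const b of branches) {
    if (b.giteaSha === b.githubSha) continue; // SHAs match: nothing changed
    try {
      if (!(await checkAncestry(b.giteaSha, b.githubSha))) {
        forcePushed.push(b.name); // diverged: force-push detected
      }
    } catch {
      // Transient error (rate limit, 500): skip this branch, fail open.
    }
  }
  return forcePushed;
}
```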
## Backup Strategies
Configure via **Settings → GitHub Configuration → Destructive Update Protection**.
| Strategy | What It Does | Storage Cost | Best For |
|---|---|---|---|
| **Disabled** | No detection, no backups | Zero | Repos you don't care about losing |
| **Always Backup** | Snapshot before every sync (original behavior) | High | Small mirror sets, maximum safety |
| **Smart** (default) | Detect force-pushes, backup only when found | Near-zero normally | Most users — efficient protection |
| **Block & Approve** | Detect force-pushes, block sync until approved | Zero | Critical repos needing manual review |
### Strategy Details
#### Disabled
Syncs proceed without any detection or backup. If a force-push happens on GitHub, the mirror is silently overwritten.
#### Always Backup
Creates a git bundle snapshot before every sync regardless of whether a force-push occurred. This is the legacy behavior (equivalent to the old `backupBeforeSync: true`). Safe but expensive for large mirror sets.
#### Smart (`on-force-push`) — Recommended
Runs the force-push detection before each sync. On normal days (no force-pushes), syncs proceed without any backup overhead. When a force-push is detected, a snapshot is created before the sync runs.
This gives you protection when it matters with near-zero cost when it doesn't.
#### Block & Approve (`block-on-force-push`)
Runs detection and, when a force-push is found, **blocks the sync entirely**. The repository is marked as `pending-approval` and excluded from future scheduled syncs until you take action:
- **Approve**: creates a backup first, then syncs (safe)
- **Dismiss**: clears the flag and resumes normal syncing (no backup)
Use this for repos where you want manual control over destructive changes.
## Additional Settings
These appear when any non-disabled strategy is selected:
### Snapshot Retention Count
How many backup snapshots to keep per repository. Oldest snapshots are deleted when this limit is exceeded. Default: **20**.
### Snapshot Directory
Where git bundle backups are stored. Default: **`data/repo-backups`**. Bundles are organized as `<directory>/<owner>/<repo>/<timestamp>.bundle`.
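For illustration, the bundle path layout above (combined with the absolute-path guarantee added in the v3.10.1 NixOS fix) might be built like this; `bundlePath` is a hypothetical helper, not the project's actual code:

```typescript
import path from "node:path";

// Hypothetical sketch of <directory>/<owner>/<repo>/<timestamp>.bundle.
// path.resolve() guarantees an absolute path even when the configured
// directory is relative, so `git -C` never misinterprets it.
function bundlePath(baseDir: string, owner: string, repo: string, timestamp: string): string {
  return path.resolve(baseDir, owner, repo, `${timestamp}.bundle`);
}
```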
### Block Sync on Snapshot Failure
Available for **Always Backup** and **Smart** strategies. When enabled, if the snapshot creation fails (disk full, permissions error, etc.), the sync is also blocked. When disabled, sync continues even if the snapshot couldn't be created.
Recommended: **enabled** if you rely on backups for recovery.
## Backward Compatibility
The old `backupBeforeSync` boolean is still recognized:
| Old Setting | New Equivalent |
|---|---|
| `backupBeforeSync: true` | `backupStrategy: "always"` |
| `backupBeforeSync: false` | `backupStrategy: "disabled"` |
| Neither set | `backupStrategy: "on-force-push"` (new default) |
Existing configurations are automatically mapped. The old field is deprecated but will continue to work.
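The fallback order in the table can be sketched as a resolver. This is an illustrative version only; the real resolver lives in `src/lib/repo-backup.ts` and may differ:

```typescript
// Illustrative resolution order: new field wins, then the deprecated
// boolean, then the new "on-force-push" default.
type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";

function resolveBackupStrategy(cfg: {
  backupStrategy?: BackupStrategy;
  backupBeforeSync?: boolean;
}): BackupStrategy {
  if (cfg.backupStrategy) return cfg.backupStrategy; // explicit new field
  if (cfg.backupBeforeSync === true) return "always"; // legacy true
  if (cfg.backupBeforeSync === false) return "disabled"; // legacy false
  return "on-force-push"; // neither set: new default
}
```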
## Environment Variables
No new environment variables are required. The backup strategy is configured through the web UI and stored in the database alongside other config.
## API
### Approve/Dismiss Blocked Repos
When using the `block-on-force-push` strategy, repos that are blocked can be managed via the API:
```bash
# Approve sync (creates backup first, then syncs)
curl -X POST http://localhost:4321/api/job/approve-sync \
-H "Content-Type: application/json" \
-H "Cookie: <session>" \
-d '{"repositoryIds": ["<id>"], "action": "approve"}'
# Dismiss (clear the block, resume normal syncing)
curl -X POST http://localhost:4321/api/job/approve-sync \
-H "Content-Type: application/json" \
-H "Cookie: <session>" \
-d '{"repositoryIds": ["<id>"], "action": "dismiss"}'
```
Blocked repos also show an **Approve** / **Dismiss** button in the repository table UI.
## Architecture
### Key Files
| File | Purpose |
|---|---|
| `src/lib/utils/force-push-detection.ts` | Core detection: fetch branches, compare SHAs, check ancestry |
| `src/lib/repo-backup.ts` | Strategy resolver, backup decision logic, bundle creation |
| `src/lib/gitea-enhanced.ts` | Sync flow integration (calls detection + backup before mirror-sync) |
| `src/pages/api/job/approve-sync.ts` | Approve/dismiss API endpoint |
| `src/components/config/GitHubConfigForm.tsx` | Strategy selector UI |
| `src/components/repositories/RepositoryTable.tsx` | Pending-approval badge + action buttons |
### Detection Flow
```
syncGiteaRepoEnhanced()
├─ Resolve backup strategy (config → backupStrategy → backupBeforeSync → default)
├─ If strategy needs detection ("on-force-push" or "block-on-force-push"):
│ │
│ ├─ fetchGiteaBranches() — GET /api/v1/repos/{owner}/{repo}/branches
│ ├─ fetchGitHubBranches() — octokit.paginate(repos.listBranches)
│ │
│ └─ For each Gitea branch where SHA differs:
│ └─ checkAncestry() — octokit.repos.compareCommits()
│ ├─ "ahead" or "identical" → fast-forward (safe)
│ ├─ "diverged" or "behind" → force-push detected
│ └─ 404/422 → old SHA gone → force-push detected
├─ If "block-on-force-push" + detected:
│ └─ Set repo status to "pending-approval", return early
├─ If backup needed (always, or on-force-push + detected):
│ └─ Create git bundle snapshot
└─ Proceed to mirror-sync
```
## Troubleshooting
**Repos stuck in "pending-approval"**: Use the Approve or Dismiss buttons in the repository table, or call the approve-sync API endpoint.
**Detection always skipped**: Check the activity log for skip reasons. Common causes: Gitea repo not yet mirrored (first sync), GitHub API rate limits, network errors. All are fail-open by design.
**Backups consuming too much space**: Lower the retention count, or switch from "Always Backup" to "Smart" which only creates backups on actual force-pushes.
**False positives**: The detection compares branch-by-branch. A rebase (which is a force-push) will correctly trigger detection. If you routinely rebase branches, consider using "Smart" instead of "Block & Approve" to avoid constant approval prompts.


@@ -1,7 +1,7 @@
 {
   "name": "gitea-mirror",
   "type": "module",
-  "version": "3.9.6",
+  "version": "3.10.1",
   "engines": {
     "bun": ">=1.2.9"
   },


@@ -50,7 +50,7 @@ export function ConfigTabs() {
       starredReposOrg: 'starred',
       starredReposMode: 'dedicated-org',
       preserveOrgStructure: false,
-      backupBeforeSync: true,
+      backupStrategy: "on-force-push",
       backupRetentionCount: 20,
       backupDirectory: 'data/repo-backups',
       blockSyncOnBackupFailure: true,
@@ -660,9 +660,20 @@ export function ConfigTabs() {
               : update,
           }))
         }
+        giteaConfig={config.giteaConfig}
+        setGiteaConfig={update =>
+          setConfig(prev => ({
+            ...prev,
+            giteaConfig:
+              typeof update === 'function'
+                ? update(prev.giteaConfig)
+                : update,
+          }))
+        }
         onAutoSave={autoSaveGitHubConfig}
         onMirrorOptionsAutoSave={autoSaveMirrorOptions}
         onAdvancedOptionsAutoSave={autoSaveAdvancedOptions}
+        onGiteaAutoSave={autoSaveGiteaConfig}
         isAutoSaving={isAutoSavingGitHub}
       />
       <GiteaConfigForm


@@ -7,10 +7,11 @@ import {
   CardTitle,
 } from "@/components/ui/card";
 import { githubApi } from "@/lib/api";
-import type { GitHubConfig, MirrorOptions, AdvancedOptions } from "@/types/config";
+import type { GitHubConfig, MirrorOptions, AdvancedOptions, GiteaConfig, BackupStrategy } from "@/types/config";
 import { Input } from "../ui/input";
 import { toast } from "sonner";
-import { Info } from "lucide-react";
+import { Info, ShieldAlert } from "lucide-react";
+import { Badge } from "@/components/ui/badge";
 import { GitHubMirrorSettings } from "./GitHubMirrorSettings";
 import { Separator } from "../ui/separator";
 import {
import {
@@ -26,23 +27,29 @@ interface GitHubConfigFormProps {
   setMirrorOptions: React.Dispatch<React.SetStateAction<MirrorOptions>>;
   advancedOptions: AdvancedOptions;
   setAdvancedOptions: React.Dispatch<React.SetStateAction<AdvancedOptions>>;
+  giteaConfig?: GiteaConfig;
+  setGiteaConfig?: React.Dispatch<React.SetStateAction<GiteaConfig>>;
   onAutoSave?: (githubConfig: GitHubConfig) => Promise<void>;
   onMirrorOptionsAutoSave?: (mirrorOptions: MirrorOptions) => Promise<void>;
   onAdvancedOptionsAutoSave?: (advancedOptions: AdvancedOptions) => Promise<void>;
+  onGiteaAutoSave?: (giteaConfig: GiteaConfig) => Promise<void>;
   isAutoSaving?: boolean;
 }
 
 export function GitHubConfigForm({
-  config,
-  setConfig,
+  config,
+  setConfig,
   mirrorOptions,
   setMirrorOptions,
   advancedOptions,
   setAdvancedOptions,
-  onAutoSave,
+  giteaConfig,
+  setGiteaConfig,
+  onAutoSave,
   onMirrorOptionsAutoSave,
   onAdvancedOptionsAutoSave,
-  isAutoSaving
+  onGiteaAutoSave,
+  isAutoSaving
 }: GitHubConfigFormProps) {
 
   const [isLoading, setIsLoading] = useState(false);
@@ -202,7 +209,139 @@ export function GitHubConfigForm({
if (onAdvancedOptionsAutoSave) onAdvancedOptionsAutoSave(newOptions);
}}
/>
{giteaConfig && setGiteaConfig && (
<>
<Separator />
<div className="space-y-4">
<h3 className="text-sm font-medium flex items-center gap-2">
<ShieldAlert className="h-4 w-4 text-primary" />
Destructive Update Protection
<Badge variant="secondary" className="ml-2 text-[10px] px-1.5 py-0">BETA</Badge>
</h3>
<p className="text-xs text-muted-foreground">
Choose how to handle force-pushes or rewritten upstream history on GitHub.
</p>
<div className="grid grid-cols-2 md:grid-cols-4 gap-2">
{([
{
value: "disabled",
label: "Disabled",
desc: "No detection or backups",
},
{
value: "always",
label: "Always Backup",
desc: "Snapshot before every sync",
},
{
value: "on-force-push",
label: "Smart",
desc: "Backup only on force-push",
},
{
value: "block-on-force-push",
label: "Block & Approve",
desc: "Require approval on force-push",
},
] as const).map((opt) => {
const isSelected = (giteaConfig.backupStrategy ?? "on-force-push") === opt.value;
return (
<button
key={opt.value}
type="button"
onClick={() => {
const newConfig = { ...giteaConfig, backupStrategy: opt.value as BackupStrategy };
setGiteaConfig(newConfig);
if (onGiteaAutoSave) onGiteaAutoSave(newConfig);
}}
className={`flex flex-col items-start gap-1 rounded-lg border p-3 text-left text-sm transition-colors ${
isSelected
? "border-primary bg-primary/5 ring-1 ring-primary"
: "border-input hover:bg-accent hover:text-accent-foreground"
}`}
>
<span className="font-medium">{opt.label}</span>
<span className="text-xs text-muted-foreground">{opt.desc}</span>
</button>
);
})}
</div>
{(giteaConfig.backupStrategy ?? "on-force-push") !== "disabled" && (
<>
<div className="grid grid-cols-1 md:grid-cols-2 gap-4">
<div>
<label htmlFor="backup-retention" className="block text-sm font-medium mb-1.5">
Snapshot retention count
</label>
<input
id="backup-retention"
name="backupRetentionCount"
type="number"
min={1}
value={giteaConfig.backupRetentionCount ?? 20}
onChange={(e) => {
const newConfig = {
...giteaConfig,
backupRetentionCount: Math.max(1, Number.parseInt(e.target.value, 10) || 20),
};
setGiteaConfig(newConfig);
if (onGiteaAutoSave) onGiteaAutoSave(newConfig);
}}
className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring"
/>
</div>
<div>
<label htmlFor="backup-directory" className="block text-sm font-medium mb-1.5">
Snapshot directory
</label>
<input
id="backup-directory"
name="backupDirectory"
type="text"
value={giteaConfig.backupDirectory || "data/repo-backups"}
onChange={(e) => {
const newConfig = { ...giteaConfig, backupDirectory: e.target.value };
setGiteaConfig(newConfig);
if (onGiteaAutoSave) onGiteaAutoSave(newConfig);
}}
className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring"
placeholder="data/repo-backups"
/>
</div>
</div>
{((giteaConfig.backupStrategy ?? "on-force-push") === "always" ||
(giteaConfig.backupStrategy ?? "on-force-push") === "on-force-push") && (
<label className="flex items-start gap-3 text-sm">
<input
name="blockSyncOnBackupFailure"
type="checkbox"
checked={Boolean(giteaConfig.blockSyncOnBackupFailure)}
onChange={(e) => {
const newConfig = { ...giteaConfig, blockSyncOnBackupFailure: e.target.checked };
setGiteaConfig(newConfig);
if (onGiteaAutoSave) onGiteaAutoSave(newConfig);
}}
className="mt-0.5 rounded border-input"
/>
<span>
Block sync when snapshot fails
<p className="text-xs text-muted-foreground">
Recommended for backup-first behavior. If disabled, sync continues even when snapshot creation fails.
</p>
</span>
</label>
)}
</>
)}
</div>
</>
)}
{/* Mobile: Show button at bottom */}
<Button
type="button"


@@ -103,9 +103,7 @@ export function GiteaConfigForm({ config, setConfig, onAutoSave, isAutoSaving, g
     const normalizedValue =
       type === "checkbox"
         ? checked
-        : name === "backupRetentionCount"
-          ? Math.max(1, Number.parseInt(value, 10) || 20)
-          : value;
+        : value;
 
     const newConfig = {
       ...config,
@@ -294,76 +292,6 @@ export function GiteaConfigForm({ config, setConfig, onAutoSave, isAutoSaving, g
}}
/>
<Separator />
<div className="space-y-4">
<h3 className="text-sm font-semibold">Destructive Update Protection</h3>
<label className="flex items-start gap-3 text-sm">
<input
name="backupBeforeSync"
type="checkbox"
checked={Boolean(config.backupBeforeSync)}
onChange={handleChange}
className="mt-0.5 rounded border-input"
/>
<span>
Create snapshot before each sync
<p className="text-xs text-muted-foreground">
Saves a restore point so force-pushes or rewritten upstream history can be recovered.
</p>
</span>
</label>
{config.backupBeforeSync && (
<div className="grid grid-cols-1 md:grid-cols-2 gap-4">
<div>
<label htmlFor="gitea-backup-retention" className="block text-sm font-medium mb-1.5">
Snapshot retention count
</label>
<input
id="gitea-backup-retention"
name="backupRetentionCount"
type="number"
min={1}
value={config.backupRetentionCount ?? 20}
onChange={handleChange}
className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring"
/>
</div>
<div>
<label htmlFor="gitea-backup-directory" className="block text-sm font-medium mb-1.5">
Snapshot directory
</label>
<input
id="gitea-backup-directory"
name="backupDirectory"
type="text"
value={config.backupDirectory || "data/repo-backups"}
onChange={handleChange}
className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring"
placeholder="data/repo-backups"
/>
</div>
</div>
)}
<label className="flex items-start gap-3 text-sm">
<input
name="blockSyncOnBackupFailure"
type="checkbox"
checked={Boolean(config.blockSyncOnBackupFailure)}
onChange={handleChange}
className="mt-0.5 rounded border-input"
/>
<span>
Block sync when snapshot fails
<p className="text-xs text-muted-foreground">
Recommended for backup-first behavior. If disabled, sync continues even when snapshot creation fails.
</p>
</span>
</label>
</div>
{/* Mobile: Show button at bottom */}
<Button
type="button"


@@ -694,6 +694,80 @@ export default function Repository() {
}
};
const handleApproveSyncAction = async ({ repoId }: { repoId: string }) => {
try {
if (!user || !user.id) return;
setLoadingRepoIds((prev) => new Set(prev).add(repoId));
const response = await apiRequest<{
success: boolean;
message?: string;
error?: string;
repositories: Repository[];
}>("/job/approve-sync", {
method: "POST",
data: { repositoryIds: [repoId], action: "approve" },
});
if (response.success) {
toast.success("Sync approved — backup + sync started");
setRepositories((prevRepos) =>
prevRepos.map((repo) => {
const updated = response.repositories.find((r) => r.id === repo.id);
return updated ? updated : repo;
}),
);
} else {
showErrorToast(response.error || "Error approving sync", toast);
}
} catch (error) {
showErrorToast(error, toast);
} finally {
setLoadingRepoIds((prev) => {
const newSet = new Set(prev);
newSet.delete(repoId);
return newSet;
});
}
};
const handleDismissSyncAction = async ({ repoId }: { repoId: string }) => {
try {
if (!user || !user.id) return;
setLoadingRepoIds((prev) => new Set(prev).add(repoId));
const response = await apiRequest<{
success: boolean;
message?: string;
error?: string;
repositories: Repository[];
}>("/job/approve-sync", {
method: "POST",
data: { repositoryIds: [repoId], action: "dismiss" },
});
if (response.success) {
toast.success("Force-push alert dismissed");
setRepositories((prevRepos) =>
prevRepos.map((repo) => {
const updated = response.repositories.find((r) => r.id === repo.id);
return updated ? updated : repo;
}),
);
} else {
showErrorToast(response.error || "Error dismissing alert", toast);
}
} catch (error) {
showErrorToast(error, toast);
} finally {
setLoadingRepoIds((prev) => {
const newSet = new Set(prev);
newSet.delete(repoId);
return newSet;
});
}
};
const handleAddRepository = async ({
repo,
owner,
@@ -1409,6 +1483,8 @@ export default function Repository() {
await fetchRepositories(false);
}}
onDelete={handleRequestDeleteRepository}
onApproveSync={handleApproveSyncAction}
onDismissSync={handleDismissSyncAction}
/>
)}


@@ -1,7 +1,7 @@
import { useMemo, useRef } from "react";
import Fuse from "fuse.js";
import { useVirtualizer } from "@tanstack/react-virtual";
import { FlipHorizontal, GitFork, RefreshCw, RotateCcw, Star, Lock, Ban, Check, ChevronDown, Trash2 } from "lucide-react";
import { FlipHorizontal, GitFork, RefreshCw, RotateCcw, Star, Lock, Ban, Check, ChevronDown, Trash2, X } from "lucide-react";
import { SiGithub, SiGitea } from "react-icons/si";
import type { Repository } from "@/lib/db/schema";
import { Button } from "@/components/ui/button";
@@ -42,6 +42,8 @@ interface RepositoryTableProps {
onSelectionChange: (selectedIds: Set<string>) => void;
onRefresh?: () => Promise<void>;
onDelete?: (repoId: string) => void;
onApproveSync?: ({ repoId }: { repoId: string }) => Promise<void>;
onDismissSync?: ({ repoId }: { repoId: string }) => Promise<void>;
}
export default function RepositoryTable({
@@ -59,6 +61,8 @@ export default function RepositoryTable({
onSelectionChange,
onRefresh,
onDelete,
onApproveSync,
onDismissSync,
}: RepositoryTableProps) {
const tableParentRef = useRef<HTMLDivElement>(null);
const { giteaConfig } = useGiteaConfig();
@@ -239,6 +243,7 @@ export default function RepositoryTable({
repo.status === 'failed' ? 'bg-red-500/10 text-red-600 hover:bg-red-500/20 dark:text-red-400' :
repo.status === 'ignored' ? 'bg-gray-500/10 text-gray-600 hover:bg-gray-500/20 dark:text-gray-400' :
repo.status === 'skipped' ? 'bg-orange-500/10 text-orange-600 hover:bg-orange-500/20 dark:text-orange-400' :
repo.status === 'pending-approval' ? 'bg-amber-500/10 text-amber-600 hover:bg-amber-500/20 dark:text-amber-400' :
'bg-muted hover:bg-muted/80'}`}
variant="secondary"
>
@@ -316,7 +321,40 @@ export default function RepositoryTable({
)}
</Button>
)}
{repo.status === "pending-approval" && (
<div className="flex gap-2 w-full">
<Button
size="default"
variant="default"
onClick={() => repo.id && onApproveSync?.({ repoId: repo.id })}
disabled={isLoading}
className="flex-1 h-10"
>
{isLoading ? (
<>
<Check className="h-4 w-4 mr-2 animate-spin" />
Approving...
</>
) : (
<>
<Check className="h-4 w-4 mr-2" />
Approve Sync
</>
)}
</Button>
<Button
size="default"
variant="outline"
onClick={() => repo.id && onDismissSync?.({ repoId: repo.id })}
disabled={isLoading}
className="flex-1 h-10"
>
<X className="h-4 w-4 mr-2" />
Dismiss
</Button>
</div>
)}
{/* Ignore/Include button */}
{repo.status === "ignored" ? (
<Button
@@ -663,6 +701,7 @@ export default function RepositoryTable({
repo.status === 'failed' ? 'bg-red-500/10 text-red-600 hover:bg-red-500/20 dark:text-red-400' :
repo.status === 'ignored' ? 'bg-gray-500/10 text-gray-600 hover:bg-gray-500/20 dark:text-gray-400' :
repo.status === 'skipped' ? 'bg-orange-500/10 text-orange-600 hover:bg-orange-500/20 dark:text-orange-400' :
repo.status === 'pending-approval' ? 'bg-amber-500/10 text-amber-600 hover:bg-amber-500/20 dark:text-amber-400' :
'bg-muted hover:bg-muted/80'}`}
variant="secondary"
>
@@ -680,6 +719,8 @@ export default function RepositoryTable({
onRetry={() => onRetry({ repoId: repo.id ?? "" })}
onSkip={(skip) => onSkip({ repoId: repo.id ?? "", skip })}
onDelete={onDelete && repo.id ? () => onDelete(repo.id as string) : undefined}
onApproveSync={onApproveSync ? () => onApproveSync({ repoId: repo.id ?? "" }) : undefined}
onDismissSync={onDismissSync ? () => onDismissSync({ repoId: repo.id ?? "" }) : undefined}
/>
</div>
{/* Links */}
@@ -791,6 +832,8 @@ function RepoActionButton({
onRetry,
onSkip,
onDelete,
onApproveSync,
onDismissSync,
}: {
repo: { id: string; status: string };
isLoading: boolean;
@@ -799,7 +842,36 @@ function RepoActionButton({
onRetry: () => void;
onSkip: (skip: boolean) => void;
onDelete?: () => void;
onApproveSync?: () => void;
onDismissSync?: () => void;
}) {
// For pending-approval repos, show approve/dismiss actions
if (repo.status === "pending-approval") {
return (
<div className="flex gap-1">
<Button
variant="default"
size="sm"
disabled={isLoading}
onClick={onApproveSync}
className="min-w-[70px]"
>
<Check className="h-4 w-4 mr-1" />
Approve
</Button>
<Button
variant="outline"
size="sm"
disabled={isLoading}
onClick={onDismissSync}
>
<X className="h-4 w-4 mr-1" />
Dismiss
</Button>
</div>
);
}
// For ignored repos, show an "Include" action
if (repo.status === "ignored") {
return (


@@ -33,6 +33,13 @@ export const githubConfigSchema = z.object({
starredDuplicateStrategy: z.enum(["suffix", "prefix", "owner-org"]).default("suffix").optional(),
});
export const backupStrategyEnum = z.enum([
"disabled",
"always",
"on-force-push",
"block-on-force-push",
]);
export const giteaConfigSchema = z.object({
url: z.url(),
externalUrl: z.url().optional(),
@@ -65,7 +72,8 @@ export const giteaConfigSchema = z.object({
mirrorPullRequests: z.boolean().default(false),
mirrorLabels: z.boolean().default(false),
mirrorMilestones: z.boolean().default(false),
backupBeforeSync: z.boolean().default(true),
backupStrategy: backupStrategyEnum.default("on-force-push"),
backupBeforeSync: z.boolean().default(true), // Deprecated: kept for backward compat, use backupStrategy
backupRetentionCount: z.number().int().min(1).default(20),
backupDirectory: z.string().optional(),
blockSyncOnBackupFailure: z.boolean().default(true),
@@ -165,6 +173,7 @@ export const repositorySchema = z.object({
"syncing",
"synced",
"archived",
"pending-approval", // Blocked by force-push detection, needs manual approval
])
.default("imported"),
lastMirrored: z.coerce.date().optional().nullable(),
@@ -196,6 +205,7 @@ export const mirrorJobSchema = z.object({
"syncing",
"synced",
"archived",
"pending-approval",
])
.default("imported"),
message: z.string(),


@@ -19,7 +19,12 @@ import {
createPreSyncBundleBackup,
shouldCreatePreSyncBackup,
shouldBlockSyncOnBackupFailure,
resolveBackupStrategy,
shouldBackupForStrategy,
shouldBlockSyncForStrategy,
strategyNeedsDetection,
} from "./repo-backup";
import { detectForcePush } from "./utils/force-push-detection";
import {
parseRepositoryMetadataState,
serializeRepositoryMetadataState,
@@ -255,9 +260,12 @@ export async function getOrCreateGiteaOrgEnhanced({
export async function syncGiteaRepoEnhanced({
config,
repository,
skipForcePushDetection,
}: {
config: Partial<Config>;
repository: Repository;
/** When true, skip force-push detection and blocking (used by approve-sync). */
skipForcePushDetection?: boolean;
}, deps?: SyncDependencies): Promise<any> {
try {
if (!config.userId || !config.giteaConfig?.url || !config.giteaConfig?.token) {
@@ -318,58 +326,138 @@ export async function syncGiteaRepoEnhanced({
throw new Error(`Repository ${repository.name} is not a mirror. Cannot sync.`);
}
// ---- Smart backup strategy with force-push detection ----
const backupStrategy = resolveBackupStrategy(config);
let forcePushDetected = false;
if (backupStrategy !== "disabled") {
// Run force-push detection if the strategy requires it
// (skip when called from approve-sync to avoid re-blocking)
if (strategyNeedsDetection(backupStrategy) && !skipForcePushDetection) {
try {
const decryptedGithubToken = decryptedConfig.githubConfig?.token;
if (decryptedGithubToken) {
const fpOctokit = new Octokit({ auth: decryptedGithubToken });
const detectionResult = await detectForcePush({
giteaUrl: config.giteaConfig.url,
giteaToken: decryptedConfig.giteaConfig.token,
giteaOwner: repoOwner,
giteaRepo: repository.name,
octokit: fpOctokit,
githubOwner: repository.owner,
githubRepo: repository.name,
});
forcePushDetected = detectionResult.detected;
if (detectionResult.skipped) {
console.log(
`[Sync] Force-push detection skipped for ${repository.name}: ${detectionResult.skipReason}`,
);
} else if (forcePushDetected) {
const branchNames = detectionResult.affectedBranches
.map((b) => `${b.name} (${b.reason})`)
.join(", ");
console.warn(
`[Sync] Force-push detected on ${repository.name}: ${branchNames}`,
);
}
} else {
console.log(
`[Sync] Skipping force-push detection for ${repository.name}: no GitHub token`,
);
}
} catch (detectionError) {
// Fail-open: detection errors should never block sync
console.warn(
`[Sync] Force-push detection failed for ${repository.name}, proceeding with sync: ${
detectionError instanceof Error ? detectionError.message : String(detectionError)
}`,
);
}
}
// Check if sync should be blocked (block-on-force-push mode)
if (shouldBlockSyncForStrategy(backupStrategy, forcePushDetected)) {
const branchInfo = `Force-push detected; sync blocked for manual approval.`;
await db
.update(repositories)
.set({
status: "pending-approval",
updatedAt: new Date(),
errorMessage: branchInfo,
})
.where(eq(repositories.id, repository.id!));
await createMirrorJob({
userId: config.userId,
repositoryId: repository.id,
repositoryName: repository.name,
message: `Sync blocked for ${repository.name}: force-push detected`,
details: branchInfo,
status: "pending-approval",
});
console.warn(`[Sync] Sync blocked for ${repository.name}: pending manual approval`);
return { blocked: true, reason: branchInfo };
}
// Create backup if strategy says so
if (shouldBackupForStrategy(backupStrategy, forcePushDetected)) {
const cloneUrl =
repoInfo.clone_url ||
`${config.giteaConfig.url.replace(/\/$/, "")}/${repoOwner}/${repository.name}.git`;
try {
const backupResult = await createPreSyncBundleBackup({
config,
owner: repoOwner,
repoName: repository.name,
cloneUrl,
force: true, // Strategy already decided to backup; skip legacy gate
});
await createMirrorJob({
userId: config.userId,
repositoryId: repository.id,
repositoryName: repository.name,
message: `Snapshot created for ${repository.name}`,
details: `Pre-sync snapshot created at ${backupResult.bundlePath}.`,
status: "syncing",
});
} catch (backupError) {
const errorMessage =
backupError instanceof Error ? backupError.message : String(backupError);
await createMirrorJob({
userId: config.userId,
repositoryId: repository.id,
repositoryName: repository.name,
message: `Snapshot failed for ${repository.name}`,
details: `Pre-sync snapshot failed: ${errorMessage}`,
status: "failed",
});
if (shouldBlockSyncOnBackupFailure(config)) {
await db
.update(repositories)
.set({
status: repoStatusEnum.parse("failed"),
updatedAt: new Date(),
errorMessage: `Snapshot failed; sync blocked to protect history. ${errorMessage}`,
})
.where(eq(repositories.id, repository.id!));
throw new Error(
`Snapshot failed; sync blocked to protect history. ${errorMessage}`,
);
}
console.warn(
`[Sync] Snapshot failed for ${repository.name}, continuing because blockSyncOnBackupFailure=false: ${errorMessage}`,
);
}
}
}

src/lib/repo-backup.test.ts

@@ -0,0 +1,248 @@
import path from "node:path";
import { afterEach, beforeEach, describe, expect, test } from "bun:test";
import type { Config } from "@/types/config";
import {
resolveBackupPaths,
resolveBackupStrategy,
shouldBackupForStrategy,
shouldBlockSyncForStrategy,
strategyNeedsDetection,
} from "@/lib/repo-backup";
describe("resolveBackupPaths", () => {
let originalBackupDirEnv: string | undefined;
beforeEach(() => {
originalBackupDirEnv = process.env.PRE_SYNC_BACKUP_DIR;
delete process.env.PRE_SYNC_BACKUP_DIR;
});
afterEach(() => {
if (originalBackupDirEnv === undefined) {
delete process.env.PRE_SYNC_BACKUP_DIR;
} else {
process.env.PRE_SYNC_BACKUP_DIR = originalBackupDirEnv;
}
});
test("returns absolute paths when backupDirectory is relative", () => {
const config: Partial<Config> = {
userId: "user-123",
giteaConfig: {
backupDirectory: "data/repo-backups",
} as Config["giteaConfig"],
};
const { backupRoot, repoBackupDir } = resolveBackupPaths({
config,
owner: "RayLabsHQ",
repoName: "gitea-mirror",
});
expect(path.isAbsolute(backupRoot)).toBe(true);
expect(path.isAbsolute(repoBackupDir)).toBe(true);
expect(repoBackupDir).toBe(
path.join(backupRoot, "user-123", "RayLabsHQ", "gitea-mirror")
);
});
test("returns absolute paths when backupDirectory is already absolute", () => {
const config: Partial<Config> = {
userId: "user-123",
giteaConfig: {
backupDirectory: "/data/repo-backups",
} as Config["giteaConfig"],
};
const { backupRoot, repoBackupDir } = resolveBackupPaths({
config,
owner: "owner",
repoName: "repo",
});
expect(backupRoot).toBe("/data/repo-backups");
expect(path.isAbsolute(repoBackupDir)).toBe(true);
});
test("falls back to cwd-based path when no backupDirectory is set", () => {
const config: Partial<Config> = {
userId: "user-123",
giteaConfig: {} as Config["giteaConfig"],
};
const { backupRoot } = resolveBackupPaths({
config,
owner: "owner",
repoName: "repo",
});
expect(path.isAbsolute(backupRoot)).toBe(true);
expect(backupRoot).toBe(
path.resolve(process.cwd(), "data", "repo-backups")
);
});
test("uses PRE_SYNC_BACKUP_DIR env var when config has no backupDirectory", () => {
process.env.PRE_SYNC_BACKUP_DIR = "custom/backup/path";
const config: Partial<Config> = {
userId: "user-123",
giteaConfig: {} as Config["giteaConfig"],
};
const { backupRoot } = resolveBackupPaths({
config,
owner: "owner",
repoName: "repo",
});
expect(path.isAbsolute(backupRoot)).toBe(true);
expect(backupRoot).toBe(path.resolve("custom/backup/path"));
});
test("sanitizes owner and repoName in path segments", () => {
const config: Partial<Config> = {
userId: "user-123",
giteaConfig: {
backupDirectory: "/backups",
} as Config["giteaConfig"],
};
const { repoBackupDir } = resolveBackupPaths({
config,
owner: "org/with-slash",
repoName: "repo name!",
});
expect(repoBackupDir).toBe(
path.join("/backups", "user-123", "org_with-slash", "repo_name_")
);
});
});
// ---- Backup strategy resolver tests ----
function makeConfig(overrides: Record<string, any> = {}): Partial<Config> {
return {
giteaConfig: {
url: "https://gitea.example.com",
token: "tok",
...overrides,
},
} as Partial<Config>;
}
const envKeysToClean = ["PRE_SYNC_BACKUP_STRATEGY", "PRE_SYNC_BACKUP_ENABLED"];
describe("resolveBackupStrategy", () => {
let savedEnv: Record<string, string | undefined> = {};
beforeEach(() => {
savedEnv = {};
for (const key of envKeysToClean) {
savedEnv[key] = process.env[key];
delete process.env[key];
}
});
afterEach(() => {
for (const [key, value] of Object.entries(savedEnv)) {
if (value === undefined) {
delete process.env[key];
} else {
process.env[key] = value;
}
}
});
test("returns explicit backupStrategy when set", () => {
expect(resolveBackupStrategy(makeConfig({ backupStrategy: "always" }))).toBe("always");
expect(resolveBackupStrategy(makeConfig({ backupStrategy: "disabled" }))).toBe("disabled");
expect(resolveBackupStrategy(makeConfig({ backupStrategy: "on-force-push" }))).toBe("on-force-push");
expect(resolveBackupStrategy(makeConfig({ backupStrategy: "block-on-force-push" }))).toBe("block-on-force-push");
});
test("maps backupBeforeSync: true → 'always' (backward compat)", () => {
expect(resolveBackupStrategy(makeConfig({ backupBeforeSync: true }))).toBe("always");
});
test("maps backupBeforeSync: false → 'disabled' (backward compat)", () => {
expect(resolveBackupStrategy(makeConfig({ backupBeforeSync: false }))).toBe("disabled");
});
test("prefers explicit backupStrategy over backupBeforeSync", () => {
expect(
resolveBackupStrategy(
makeConfig({ backupStrategy: "on-force-push", backupBeforeSync: true }),
),
).toBe("on-force-push");
});
test("falls back to PRE_SYNC_BACKUP_STRATEGY env var", () => {
process.env.PRE_SYNC_BACKUP_STRATEGY = "block-on-force-push";
expect(resolveBackupStrategy(makeConfig({}))).toBe("block-on-force-push");
});
test("falls back to PRE_SYNC_BACKUP_ENABLED env var (legacy)", () => {
process.env.PRE_SYNC_BACKUP_ENABLED = "false";
expect(resolveBackupStrategy(makeConfig({}))).toBe("disabled");
});
test("defaults to 'on-force-push' when nothing is configured", () => {
expect(resolveBackupStrategy(makeConfig({}))).toBe("on-force-push");
});
test("handles empty giteaConfig gracefully", () => {
expect(resolveBackupStrategy({})).toBe("on-force-push");
});
});
describe("shouldBackupForStrategy", () => {
test("disabled → never backup", () => {
expect(shouldBackupForStrategy("disabled", false)).toBe(false);
expect(shouldBackupForStrategy("disabled", true)).toBe(false);
});
test("always → always backup", () => {
expect(shouldBackupForStrategy("always", false)).toBe(true);
expect(shouldBackupForStrategy("always", true)).toBe(true);
});
test("on-force-push → backup only when detected", () => {
expect(shouldBackupForStrategy("on-force-push", false)).toBe(false);
expect(shouldBackupForStrategy("on-force-push", true)).toBe(true);
});
test("block-on-force-push → backup only when detected", () => {
expect(shouldBackupForStrategy("block-on-force-push", false)).toBe(false);
expect(shouldBackupForStrategy("block-on-force-push", true)).toBe(true);
});
});
describe("shouldBlockSyncForStrategy", () => {
test("only block-on-force-push + detected returns true", () => {
expect(shouldBlockSyncForStrategy("block-on-force-push", true)).toBe(true);
});
test("block-on-force-push without detection does not block", () => {
expect(shouldBlockSyncForStrategy("block-on-force-push", false)).toBe(false);
});
test("other strategies never block", () => {
expect(shouldBlockSyncForStrategy("disabled", true)).toBe(false);
expect(shouldBlockSyncForStrategy("always", true)).toBe(false);
expect(shouldBlockSyncForStrategy("on-force-push", true)).toBe(false);
});
});
describe("strategyNeedsDetection", () => {
test("returns true for detection-based strategies", () => {
expect(strategyNeedsDetection("on-force-push")).toBe(true);
expect(strategyNeedsDetection("block-on-force-push")).toBe(true);
});
test("returns false for non-detection strategies", () => {
expect(strategyNeedsDetection("disabled")).toBe(false);
expect(strategyNeedsDetection("always")).toBe(false);
});
});


@@ -1,7 +1,7 @@
import { mkdir, mkdtemp, readdir, rm, stat } from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import type { Config } from "@/types/config";
import type { Config, BackupStrategy } from "@/types/config";
import { decryptConfigTokens } from "./utils/config-encryption";
const TRUE_VALUES = new Set(["1", "true", "yes", "on"]);
@@ -101,18 +101,138 @@ export function shouldBlockSyncOnBackupFailure(config: Partial<Config>): boolean
return configSetting === undefined ? true : Boolean(configSetting);
}
// ---- Backup strategy resolver ----
const VALID_STRATEGIES = new Set<BackupStrategy>([
"disabled",
"always",
"on-force-push",
"block-on-force-push",
]);
/**
* Resolve the effective backup strategy from config, falling back through:
* 1. `backupStrategy` field (new)
* 2. `backupBeforeSync` boolean (deprecated, backward compat)
* 3. `PRE_SYNC_BACKUP_STRATEGY` env var
* 4. `PRE_SYNC_BACKUP_ENABLED` env var (legacy)
* 5. Default: `"on-force-push"`
*/
export function resolveBackupStrategy(config: Partial<Config>): BackupStrategy {
// 1. Explicit backupStrategy field
const explicit = config.giteaConfig?.backupStrategy;
if (explicit && VALID_STRATEGIES.has(explicit as BackupStrategy)) {
return explicit as BackupStrategy;
}
// 2. Legacy backupBeforeSync boolean → map to strategy
const legacy = config.giteaConfig?.backupBeforeSync;
if (legacy !== undefined) {
return legacy ? "always" : "disabled";
}
// 3. Env var (new)
const envStrategy = process.env.PRE_SYNC_BACKUP_STRATEGY?.trim().toLowerCase();
if (envStrategy && VALID_STRATEGIES.has(envStrategy as BackupStrategy)) {
return envStrategy as BackupStrategy;
}
// 4. Env var (legacy)
const envEnabled = process.env.PRE_SYNC_BACKUP_ENABLED;
if (envEnabled !== undefined) {
return parseBoolean(envEnabled, true) ? "always" : "disabled";
}
// 5. Default
return "on-force-push";
}
/**
* Determine whether a backup should be created for the given strategy and
* force-push detection result.
*/
export function shouldBackupForStrategy(
strategy: BackupStrategy,
forcePushDetected: boolean,
): boolean {
switch (strategy) {
case "disabled":
return false;
case "always":
return true;
case "on-force-push":
case "block-on-force-push":
return forcePushDetected;
default:
return false;
}
}
/**
* Determine whether sync should be blocked (requires manual approval).
* Only `block-on-force-push` with an actual detection blocks sync.
*/
export function shouldBlockSyncForStrategy(
strategy: BackupStrategy,
forcePushDetected: boolean,
): boolean {
return strategy === "block-on-force-push" && forcePushDetected;
}
/**
* Returns true when the strategy requires running force-push detection
* before deciding on backup / block behavior.
*/
export function strategyNeedsDetection(strategy: BackupStrategy): boolean {
return strategy === "on-force-push" || strategy === "block-on-force-push";
}
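Taken together, the three predicates above form a small decision matrix. A self-contained sketch for side-by-side comparison (the `decide` helper is hypothetical, used only for this illustration; it is not part of the codebase):

```typescript
// Restates the matrix implemented by shouldBackupForStrategy,
// shouldBlockSyncForStrategy, and strategyNeedsDetection above.
type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";

function decide(strategy: BackupStrategy, forcePushDetected: boolean) {
  const needsDetection = strategy === "on-force-push" || strategy === "block-on-force-push";
  const backup = strategy === "always" || (needsDetection && forcePushDetected);
  const block = strategy === "block-on-force-push" && forcePushDetected;
  return { needsDetection, backup, block };
}

// e.g. the default mode backs up only when a force-push was detected:
console.log(decide("on-force-push", true));  // { needsDetection: true, backup: true, block: false }
console.log(decide("on-force-push", false)); // { needsDetection: true, backup: false, block: false }
```

Note that `block-on-force-push` implies a backup on detection as well, so an approved sync still has a snapshot to fall back on.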
export function resolveBackupPaths({
config,
owner,
repoName,
}: {
config: Partial<Config>;
owner: string;
repoName: string;
}): { backupRoot: string; repoBackupDir: string } {
let backupRoot =
config.giteaConfig?.backupDirectory?.trim() ||
process.env.PRE_SYNC_BACKUP_DIR?.trim() ||
path.join(process.cwd(), "data", "repo-backups");
// Ensure backupRoot is absolute - relative paths break git bundle creation
// because git runs with -C mirrorClonePath and interprets relative paths from there.
// Always use path.resolve() which guarantees an absolute path, rather than a
// conditional check that can miss edge cases (e.g., NixOS systemd services).
backupRoot = path.resolve(backupRoot);
const repoBackupDir = path.join(
backupRoot,
sanitizePathSegment(config.userId || "unknown-user"),
sanitizePathSegment(owner),
sanitizePathSegment(repoName)
);
return { backupRoot, repoBackupDir };
}
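The unconditional `path.resolve` call is the load-bearing detail here. A minimal illustration (assuming a POSIX filesystem): it leaves absolute paths untouched and anchors relative ones to `process.cwd()`, so git, which later runs with `-C` in a temp directory, always receives an absolute path.

```typescript
import path from "node:path";

// Absolute input comes back unchanged...
const absolute = path.resolve("/data/repo-backups");
console.log(absolute); // "/data/repo-backups"

// ...while a relative input is anchored to the current working directory.
const relative = path.resolve("data/repo-backups");
console.log(path.isAbsolute(relative)); // true
```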
export async function createPreSyncBundleBackup({
config,
owner,
repoName,
cloneUrl,
force,
}: {
config: Partial<Config>;
owner: string;
repoName: string;
cloneUrl: string;
/** When true, skip the legacy shouldCreatePreSyncBackup check.
* Used by the strategy-driven path which has already decided to backup. */
force?: boolean;
}): Promise<{ bundlePath: string }> {
if (!shouldCreatePreSyncBackup(config)) {
if (!force && !shouldCreatePreSyncBackup(config)) {
throw new Error("Pre-sync backup is disabled.");
}
@@ -126,16 +246,7 @@ export async function createPreSyncBundleBackup({
throw new Error("Decrypted Gitea token is required for pre-sync backup.");
}
let backupRoot =
config.giteaConfig?.backupDirectory?.trim() ||
process.env.PRE_SYNC_BACKUP_DIR?.trim() ||
path.join(process.cwd(), "data", "repo-backups");
// Ensure backupRoot is absolute - relative paths break git bundle creation
// because git runs with -C mirrorClonePath and interprets relative paths from there
if (!path.isAbsolute(backupRoot)) {
backupRoot = path.resolve(process.cwd(), backupRoot);
}
const { repoBackupDir } = resolveBackupPaths({ config, owner, repoName });
const retention = Math.max(
1,
Number.isFinite(config.giteaConfig?.backupRetentionCount)
@@ -143,18 +254,13 @@ export async function createPreSyncBundleBackup({
: parsePositiveInt(process.env.PRE_SYNC_BACKUP_KEEP_COUNT, 20)
);
const repoBackupDir = path.join(
backupRoot,
sanitizePathSegment(config.userId || "unknown-user"),
sanitizePathSegment(owner),
sanitizePathSegment(repoName)
);
await mkdir(repoBackupDir, { recursive: true });
const tmpDir = await mkdtemp(path.join(os.tmpdir(), "gitea-mirror-backup-"));
const mirrorClonePath = path.join(tmpDir, "repo.git");
const bundlePath = path.join(repoBackupDir, `${buildTimestamp()}.bundle`);
// path.resolve guarantees an absolute path, critical because git -C changes
// the working directory and would misinterpret a relative bundlePath
const bundlePath = path.resolve(repoBackupDir, `${buildTimestamp()}.bundle`);
try {
const authCloneUrl = buildAuthenticatedCloneUrl(cloneUrl, giteaToken);


@@ -280,11 +280,29 @@ async function runScheduledSync(config: any): Promise<void> {
});
}
// Log pending-approval repos that are excluded from sync
try {
const pendingApprovalRepos = await db
.select({ id: repositories.id })
.from(repositories)
.where(
and(
eq(repositories.userId, userId),
eq(repositories.status, 'pending-approval')
)
);
if (pendingApprovalRepos.length > 0) {
console.log(`[Scheduler] ${pendingApprovalRepos.length} repositories pending approval (force-push detected) for user ${userId} — skipping sync for those`);
}
} catch {
// Non-critical logging, ignore errors
}
if (reposToSync.length === 0) {
console.log(`[Scheduler] No repositories to sync for user ${userId}`);
return;
}
console.log(`[Scheduler] Syncing ${reposToSync.length} repositories for user ${userId}`);
// Process repositories in batches


@@ -280,6 +280,8 @@ export const getStatusColor = (status: string): string => {
return "bg-orange-500"; // Deleting
case "deleted":
return "bg-gray-600"; // Deleted
case "pending-approval":
return "bg-amber-500"; // Needs manual approval
default:
return "bg-gray-400"; // Unknown/neutral
}


@@ -93,7 +93,8 @@ export async function createDefaultConfig({ userId, envOverrides = {} }: Default
forkStrategy: "reference",
issueConcurrency: 3,
pullRequestConcurrency: 5,
backupBeforeSync: true,
backupStrategy: "on-force-push",
backupBeforeSync: true, // Deprecated: kept for backward compat
backupRetentionCount: 20,
backupDirectory: "data/repo-backups",
blockSyncOnBackupFailure: true,


@@ -100,6 +100,7 @@ export function mapUiToDbConfig(
mirrorPullRequests: mirrorOptions.mirrorMetadata && mirrorOptions.metadataComponents.pullRequests,
mirrorLabels: mirrorOptions.mirrorMetadata && mirrorOptions.metadataComponents.labels,
mirrorMilestones: mirrorOptions.mirrorMetadata && mirrorOptions.metadataComponents.milestones,
backupStrategy: giteaConfig.backupStrategy,
backupBeforeSync: giteaConfig.backupBeforeSync ?? true,
backupRetentionCount: giteaConfig.backupRetentionCount ?? 20,
backupDirectory: giteaConfig.backupDirectory?.trim() || undefined,
@@ -144,6 +145,7 @@ export function mapDbToUiConfig(dbConfig: any): {
personalReposOrg: undefined, // Not stored in current schema
issueConcurrency: dbConfig.giteaConfig?.issueConcurrency ?? 3,
pullRequestConcurrency: dbConfig.giteaConfig?.pullRequestConcurrency ?? 5,
backupStrategy: dbConfig.giteaConfig?.backupStrategy || undefined,
backupBeforeSync: dbConfig.giteaConfig?.backupBeforeSync ?? true,
backupRetentionCount: dbConfig.giteaConfig?.backupRetentionCount ?? 20,
backupDirectory: dbConfig.giteaConfig?.backupDirectory || "data/repo-backups",


@@ -0,0 +1,319 @@
import { describe, expect, it, mock } from "bun:test";
import {
detectForcePush,
fetchGitHubBranches,
checkAncestry,
type BranchInfo,
} from "./force-push-detection";
// ---- Helpers ----
function makeOctokit(overrides: Record<string, any> = {}) {
return {
repos: {
listBranches: mock(() => Promise.resolve({ data: [] })),
compareCommits: mock(() =>
Promise.resolve({ data: { status: "ahead" } }),
),
...overrides.repos,
},
paginate: mock(async (_method: any, _params: any) => {
// Default: return whatever the test wired into _githubBranches
return overrides._githubBranches ?? [];
}),
...overrides,
} as any;
}
// ---- fetchGitHubBranches ----
describe("fetchGitHubBranches", () => {
it("maps Octokit paginated response to BranchInfo[]", async () => {
const octokit = makeOctokit({
_githubBranches: [
{ name: "main", commit: { sha: "aaa" } },
{ name: "dev", commit: { sha: "bbb" } },
],
});
const result = await fetchGitHubBranches({
octokit,
owner: "user",
repo: "repo",
});
expect(result).toEqual([
{ name: "main", sha: "aaa" },
{ name: "dev", sha: "bbb" },
]);
});
});
// ---- checkAncestry ----
describe("checkAncestry", () => {
it("returns true for fast-forward (ahead)", async () => {
const octokit = makeOctokit({
repos: {
compareCommits: mock(() =>
Promise.resolve({ data: { status: "ahead" } }),
),
},
});
const result = await checkAncestry({
octokit,
owner: "user",
repo: "repo",
baseSha: "old",
headSha: "new",
});
expect(result).toBe(true);
});
it("returns true for identical", async () => {
const octokit = makeOctokit({
repos: {
compareCommits: mock(() =>
Promise.resolve({ data: { status: "identical" } }),
),
},
});
const result = await checkAncestry({
octokit,
owner: "user",
repo: "repo",
baseSha: "same",
headSha: "same",
});
expect(result).toBe(true);
});
it("returns false for diverged", async () => {
const octokit = makeOctokit({
repos: {
compareCommits: mock(() =>
Promise.resolve({ data: { status: "diverged" } }),
),
},
});
const result = await checkAncestry({
octokit,
owner: "user",
repo: "repo",
baseSha: "old",
headSha: "new",
});
expect(result).toBe(false);
});
it("returns false when API returns 404 (old SHA gone)", async () => {
const error404 = Object.assign(new Error("Not Found"), { status: 404 });
const octokit = makeOctokit({
repos: {
compareCommits: mock(() => Promise.reject(error404)),
},
});
const result = await checkAncestry({
octokit,
owner: "user",
repo: "repo",
baseSha: "gone",
headSha: "new",
});
expect(result).toBe(false);
});
it("throws on transient errors (fail-open for caller)", async () => {
const error500 = Object.assign(new Error("Internal Server Error"), { status: 500 });
const octokit = makeOctokit({
repos: {
compareCommits: mock(() => Promise.reject(error500)),
},
});
await expect(
checkAncestry({
octokit,
owner: "user",
repo: "repo",
baseSha: "old",
headSha: "new",
}),
).rejects.toThrow("Internal Server Error");
});
});
// ---- detectForcePush ----
// Uses _deps injection to avoid fragile global fetch mocking.
describe("detectForcePush", () => {
const baseArgs = {
giteaUrl: "https://gitea.example.com",
giteaToken: "tok",
giteaOwner: "org",
giteaRepo: "repo",
githubOwner: "user",
githubRepo: "repo",
};
function makeDeps(overrides: {
giteaBranches?: BranchInfo[] | Error;
githubBranches?: BranchInfo[] | Error;
ancestryResult?: boolean;
} = {}) {
return {
fetchGiteaBranches: mock(async () => {
if (overrides.giteaBranches instanceof Error) throw overrides.giteaBranches;
return overrides.giteaBranches ?? [];
}) as any,
fetchGitHubBranches: mock(async () => {
if (overrides.githubBranches instanceof Error) throw overrides.githubBranches;
return overrides.githubBranches ?? [];
}) as any,
checkAncestry: mock(async () => overrides.ancestryResult ?? true) as any,
};
}
const dummyOctokit = {} as any;
it("skips when Gitea has no branches (first mirror)", async () => {
const deps = makeDeps({ giteaBranches: [] });
const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });
expect(result.detected).toBe(false);
expect(result.skipped).toBe(true);
expect(result.skipReason).toContain("No Gitea branches");
});
it("returns no detection when all SHAs match", async () => {
const deps = makeDeps({
giteaBranches: [
{ name: "main", sha: "aaa" },
{ name: "dev", sha: "bbb" },
],
githubBranches: [
{ name: "main", sha: "aaa" },
{ name: "dev", sha: "bbb" },
],
});
const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });
expect(result.detected).toBe(false);
expect(result.skipped).toBe(false);
expect(result.affectedBranches).toHaveLength(0);
});
it("detects deleted branch", async () => {
const deps = makeDeps({
giteaBranches: [
{ name: "main", sha: "aaa" },
{ name: "old-branch", sha: "ccc" },
],
githubBranches: [{ name: "main", sha: "aaa" }],
});
const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });
expect(result.detected).toBe(true);
expect(result.affectedBranches).toHaveLength(1);
expect(result.affectedBranches[0]).toEqual({
name: "old-branch",
reason: "deleted",
giteaSha: "ccc",
githubSha: null,
});
});
it("returns no detection for fast-forward", async () => {
const deps = makeDeps({
giteaBranches: [{ name: "main", sha: "old-sha" }],
githubBranches: [{ name: "main", sha: "new-sha" }],
ancestryResult: true, // fast-forward
});
const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });
expect(result.detected).toBe(false);
expect(result.affectedBranches).toHaveLength(0);
});
it("detects diverged branch", async () => {
const deps = makeDeps({
giteaBranches: [{ name: "main", sha: "old-sha" }],
githubBranches: [{ name: "main", sha: "rewritten-sha" }],
ancestryResult: false, // diverged
});
const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });
expect(result.detected).toBe(true);
expect(result.affectedBranches).toHaveLength(1);
expect(result.affectedBranches[0]).toEqual({
name: "main",
reason: "diverged",
giteaSha: "old-sha",
githubSha: "rewritten-sha",
});
});
it("detects force-push when ancestry check fails (old SHA gone)", async () => {
const deps = makeDeps({
giteaBranches: [{ name: "main", sha: "old-sha" }],
githubBranches: [{ name: "main", sha: "new-sha" }],
ancestryResult: false, // checkAncestry returned false (confirmed 404: old SHA gone)
});
const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });
expect(result.detected).toBe(true);
expect(result.affectedBranches).toHaveLength(1);
expect(result.affectedBranches[0].reason).toBe("diverged");
});
it("skips when Gitea API returns 404", async () => {
const { HttpError } = await import("@/lib/http-client");
const deps = makeDeps({
giteaBranches: new HttpError("not found", 404, "Not Found"),
});
const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });
expect(result.detected).toBe(false);
expect(result.skipped).toBe(true);
expect(result.skipReason).toContain("not found");
});
it("skips when Gitea API returns server error", async () => {
const deps = makeDeps({
giteaBranches: new Error("HTTP 500: internal error"),
});
const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });
expect(result.detected).toBe(false);
expect(result.skipped).toBe(true);
expect(result.skipReason).toContain("Failed to fetch Gitea branches");
});
it("skips when GitHub API fails", async () => {
const deps = makeDeps({
giteaBranches: [{ name: "main", sha: "aaa" }],
githubBranches: new Error("rate limited"),
});
const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });
expect(result.detected).toBe(false);
expect(result.skipped).toBe(true);
expect(result.skipReason).toContain("Failed to fetch GitHub branches");
});
});
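The tests above drive `detectForcePush` through injected dependencies. The sync-side gate itself lives in `syncGiteaRepoEnhanced` and is not shown in this diff, but the behavior the commits describe — fail-open detection combined with the `block-on-force-push` strategy — can be sketched as follows (`shouldBlockSync` and its shape are assumptions, not the repository's actual API):

```typescript
// Minimal sketch, not the repo's actual gate: a skipped detection never
// blocks (fail-open), and a positive detection only blocks under the
// "block-on-force-push" strategy described in the commit message.
interface DetectionSummary {
  detected: boolean;
  skipped: boolean;
}

function shouldBlockSync(result: DetectionSummary, strategy: string): boolean {
  if (result.skipped) return false; // detection errors never block sync
  return result.detected && strategy === "block-on-force-push";
}

console.log(shouldBlockSync({ detected: true, skipped: false }, "block-on-force-push")); // true
console.log(shouldBlockSync({ detected: true, skipped: true }, "block-on-force-push"));  // false
console.log(shouldBlockSync({ detected: true, skipped: false }, "on-force-push"));       // false
```

Under `on-force-push`, a positive detection triggers a backup rather than a block, which is why the strategy never appears in this predicate.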


@@ -0,0 +1,286 @@
/**
* Force-push detection module.
*
* Compares branch SHAs between a Gitea mirror and GitHub source to detect
* branches that were deleted, rewritten, or force-pushed.
*
* **Fail-open**: If detection itself fails (API errors, rate limits, etc.),
* the result indicates no force-push so sync proceeds normally. Detection
* should never block sync due to its own failure.
*/
import type { Octokit } from "@octokit/rest";
import { httpGet, HttpError } from "@/lib/http-client";
// ---- Types ----
export interface BranchInfo {
name: string;
sha: string;
}
export type ForcePushReason = "deleted" | "diverged" | "non-fast-forward";
export interface AffectedBranch {
name: string;
reason: ForcePushReason;
giteaSha: string;
githubSha: string | null; // null when branch was deleted
}
export interface ForcePushDetectionResult {
detected: boolean;
affectedBranches: AffectedBranch[];
/** True when detection could not run (API error, etc.) */
skipped: boolean;
skipReason?: string;
}
const NO_FORCE_PUSH: ForcePushDetectionResult = {
detected: false,
affectedBranches: [],
skipped: false,
};
function skippedResult(reason: string): ForcePushDetectionResult {
return {
detected: false,
affectedBranches: [],
skipped: true,
skipReason: reason,
};
}
// ---- Branch fetching ----
/**
* Fetch all branches from a Gitea repository (paginated).
*/
export async function fetchGiteaBranches({
giteaUrl,
giteaToken,
owner,
repo,
}: {
giteaUrl: string;
giteaToken: string;
owner: string;
repo: string;
}): Promise<BranchInfo[]> {
const branches: BranchInfo[] = [];
let page = 1;
const perPage = 50;
while (true) {
const url = `${giteaUrl}/api/v1/repos/${owner}/${repo}/branches?page=${page}&limit=${perPage}`;
const response = await httpGet<Array<{ name: string; commit: { id: string } }>>(
url,
{ Authorization: `token ${giteaToken}` },
);
if (!Array.isArray(response.data) || response.data.length === 0) break;
for (const b of response.data) {
branches.push({ name: b.name, sha: b.commit.id });
}
if (response.data.length < perPage) break;
page++;
}
return branches;
}
/**
* Fetch all branches from a GitHub repository (paginated via Octokit).
*/
export async function fetchGitHubBranches({
octokit,
owner,
repo,
}: {
octokit: Octokit;
owner: string;
repo: string;
}): Promise<BranchInfo[]> {
const data = await octokit.paginate(octokit.repos.listBranches, {
owner,
repo,
per_page: 100,
});
return data.map((b) => ({ name: b.name, sha: b.commit.sha }));
}
/**
* Check whether the transition from `baseSha` to `headSha` on the same branch
* is a fast-forward (i.e. `baseSha` is an ancestor of `headSha`).
*
* Returns `true` when the change is safe (fast-forward or identical) and
* `false` when it is a confirmed force-push (404/422 = the old SHA no
* longer exists on GitHub).
*
* Throws on transient errors (rate limits, network issues) so the caller
* can decide how to handle them (fail-open: skip that branch).
*/
export async function checkAncestry({
octokit,
owner,
repo,
baseSha,
headSha,
}: {
octokit: Octokit;
owner: string;
repo: string;
baseSha: string;
headSha: string;
}): Promise<boolean> {
try {
const { data } = await octokit.repos.compareCommits({
owner,
repo,
base: baseSha,
head: headSha,
});
// "ahead" means headSha is strictly ahead of baseSha → fast-forward.
// "behind" or "diverged" means the branch was rewritten.
return data.status === "ahead" || data.status === "identical";
} catch (error: any) {
// 404 / 422 = old SHA no longer exists on GitHub → confirmed force-push.
if (error?.status === 404 || error?.status === 422) {
return false;
}
// Any other error (rate limit, network) → rethrow so caller can
// handle it as fail-open (skip branch) rather than false-positive.
throw error;
}
}
// ---- Main detection ----
/**
* Compare branch SHAs between Gitea and GitHub to detect force-pushes.
*
* The function is intentionally fail-open: any error during detection returns
* a "skipped" result so that sync can proceed normally.
*/
export async function detectForcePush({
giteaUrl,
giteaToken,
giteaOwner,
giteaRepo,
octokit,
githubOwner,
githubRepo,
_deps,
}: {
giteaUrl: string;
giteaToken: string;
giteaOwner: string;
giteaRepo: string;
octokit: Octokit;
githubOwner: string;
githubRepo: string;
/** @internal — test-only dependency injection */
_deps?: {
fetchGiteaBranches: typeof fetchGiteaBranches;
fetchGitHubBranches: typeof fetchGitHubBranches;
checkAncestry: typeof checkAncestry;
};
}): Promise<ForcePushDetectionResult> {
const deps = _deps ?? { fetchGiteaBranches, fetchGitHubBranches, checkAncestry };
// 1. Fetch Gitea branches
let giteaBranches: BranchInfo[];
try {
giteaBranches = await deps.fetchGiteaBranches({
giteaUrl,
giteaToken,
owner: giteaOwner,
repo: giteaRepo,
});
} catch (error) {
// Gitea 404 = repo not yet mirrored, skip detection
if (error instanceof HttpError && error.status === 404) {
return skippedResult("Gitea repository not found (first mirror?)");
}
return skippedResult(
`Failed to fetch Gitea branches: ${error instanceof Error ? error.message : String(error)}`,
);
}
// First-time mirror: no Gitea branches → nothing to compare
if (giteaBranches.length === 0) {
return skippedResult("No Gitea branches found (first mirror?)");
}
// 2. Fetch GitHub branches
let githubBranches: BranchInfo[];
try {
githubBranches = await deps.fetchGitHubBranches({
octokit,
owner: githubOwner,
repo: githubRepo,
});
} catch (error) {
return skippedResult(
`Failed to fetch GitHub branches: ${error instanceof Error ? error.message : String(error)}`,
);
}
const githubBranchMap = new Map(githubBranches.map((b) => [b.name, b.sha]));
// 3. Compare each Gitea branch against GitHub
const affected: AffectedBranch[] = [];
for (const giteaBranch of giteaBranches) {
const githubSha = githubBranchMap.get(giteaBranch.name);
if (githubSha === undefined) {
// Branch was deleted on GitHub
affected.push({
name: giteaBranch.name,
reason: "deleted",
giteaSha: giteaBranch.sha,
githubSha: null,
});
continue;
}
// Same SHA → no change
if (githubSha === giteaBranch.sha) continue;
// SHAs differ → check if it's a fast-forward
try {
const isFastForward = await deps.checkAncestry({
octokit,
owner: githubOwner,
repo: githubRepo,
baseSha: giteaBranch.sha,
headSha: githubSha,
});
if (!isFastForward) {
affected.push({
name: giteaBranch.name,
reason: "diverged",
giteaSha: giteaBranch.sha,
githubSha,
});
}
} catch {
// Individual branch check failure → skip that branch (fail-open)
continue;
}
}
if (affected.length === 0) {
return NO_FORCE_PUSH;
}
return {
detected: true,
affectedBranches: affected,
skipped: false,
};
}
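The commit message references a `resolveBackupStrategy` helper that is not included in this diff. Based on the described behavior — an explicit `backupStrategy` wins, the deprecated `backupBeforeSync` boolean is mapped for backward compatibility, and `on-force-push` is the default — a sketch might look like this (the function's actual signature is an assumption):

```typescript
type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";

// Sketch only — the real resolver lives elsewhere in the repo. Assumed
// precedence from the commit message: explicit enum > legacy boolean >
// documented default of "on-force-push".
function resolveBackupStrategy(cfg: {
  backupStrategy?: BackupStrategy;
  backupBeforeSync?: boolean;
}): BackupStrategy {
  if (cfg.backupStrategy) return cfg.backupStrategy;
  if (cfg.backupBeforeSync === true) return "always";   // legacy "on"
  if (cfg.backupBeforeSync === false) return "disabled"; // legacy "off"
  return "on-force-push";
}

console.log(resolveBackupStrategy({}));                                              // "on-force-push"
console.log(resolveBackupStrategy({ backupBeforeSync: true }));                      // "always"
console.log(resolveBackupStrategy({ backupStrategy: "disabled", backupBeforeSync: true })); // "disabled"
```

Keeping the default out of the config-mapper and inside the resolver, as the commits note, is what lets legacy `backupBeforeSync` configs keep working unchanged.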


@@ -0,0 +1,202 @@
import type { APIRoute } from "astro";
import { db, configs, repositories } from "@/lib/db";
import { and, eq, inArray } from "drizzle-orm";
import { repositoryVisibilityEnum, repoStatusEnum } from "@/types/Repository";
import { syncGiteaRepoEnhanced } from "@/lib/gitea-enhanced";
import { createSecureErrorResponse } from "@/lib/utils";
import { requireAuthenticatedUserId } from "@/lib/auth-guards";
import { createPreSyncBundleBackup } from "@/lib/repo-backup";
import { decryptConfigTokens } from "@/lib/utils/config-encryption";
import type { Config } from "@/types/config";
import { createMirrorJob } from "@/lib/helpers";
interface ApproveSyncRequest {
repositoryIds: string[];
action: "approve" | "dismiss";
}
export const POST: APIRoute = async ({ request, locals }) => {
try {
const authResult = await requireAuthenticatedUserId({ request, locals });
if ("response" in authResult) return authResult.response;
const userId = authResult.userId;
const body: ApproveSyncRequest = await request.json();
const { repositoryIds, action } = body;
if (!repositoryIds || !Array.isArray(repositoryIds) || repositoryIds.length === 0) {
return new Response(
JSON.stringify({ success: false, message: "repositoryIds are required." }),
{ status: 400, headers: { "Content-Type": "application/json" } },
);
}
if (action !== "approve" && action !== "dismiss") {
return new Response(
JSON.stringify({ success: false, message: "action must be 'approve' or 'dismiss'." }),
{ status: 400, headers: { "Content-Type": "application/json" } },
);
}
// Fetch config
const configResult = await db
.select()
.from(configs)
.where(eq(configs.userId, userId))
.limit(1);
const config = configResult[0];
if (!config) {
return new Response(
JSON.stringify({ success: false, message: "No configuration found." }),
{ status: 400, headers: { "Content-Type": "application/json" } },
);
}
// Fetch repos — only those in pending-approval status
const repos = await db
.select()
.from(repositories)
.where(
and(
eq(repositories.userId, userId),
eq(repositories.status, "pending-approval"),
inArray(repositories.id, repositoryIds),
),
);
if (!repos.length) {
return new Response(
JSON.stringify({ success: false, message: "No pending-approval repositories found for the given IDs." }),
{ status: 404, headers: { "Content-Type": "application/json" } },
);
}
if (action === "dismiss") {
// Reset status to "synced" so repos resume normal schedule
for (const repo of repos) {
await db
.update(repositories)
.set({
status: "synced",
errorMessage: null,
updatedAt: new Date(),
})
.where(eq(repositories.id, repo.id));
await createMirrorJob({
userId,
repositoryId: repo.id,
repositoryName: repo.name,
message: `Force-push alert dismissed for ${repo.name}`,
details: "User dismissed the force-push alert. Repository will resume normal sync schedule.",
status: "synced",
});
}
return new Response(
JSON.stringify({
success: true,
message: `Dismissed ${repos.length} repository alert(s).`,
repositories: repos.map((repo) => ({
...repo,
status: "synced",
errorMessage: null,
})),
}),
{ status: 200, headers: { "Content-Type": "application/json" } },
);
}
// action === "approve": create backup first (safety), then trigger sync
const decryptedConfig = decryptConfigTokens(config as unknown as Config);
// Process in background
setTimeout(async () => {
for (const repo of repos) {
try {
const { getGiteaRepoOwnerAsync } = await import("@/lib/gitea");
const repoOwner = await getGiteaRepoOwnerAsync({ config, repository: repo });
// Always create a backup before approved sync for safety
const cloneUrl = `${config.giteaConfig.url.replace(/\/$/, "")}/${repoOwner}/${repo.name}.git`;
try {
const backupResult = await createPreSyncBundleBackup({
config: decryptedConfig, // pass decrypted tokens, not the raw DB row
owner: repoOwner,
repoName: repo.name,
cloneUrl,
force: true, // Bypass legacy gate — approval implies backup
});
await createMirrorJob({
userId,
repositoryId: repo.id,
repositoryName: repo.name,
message: `Safety snapshot created for ${repo.name}`,
details: `Pre-approval snapshot at ${backupResult.bundlePath}.`,
status: "syncing",
});
} catch (backupError) {
console.warn(
`[ApproveSync] Backup failed for ${repo.name}, proceeding with sync: ${
backupError instanceof Error ? backupError.message : String(backupError)
}`,
);
}
// Trigger sync — skip detection to avoid re-blocking
const repoData = {
...repo,
status: repoStatusEnum.parse("syncing"),
organization: repo.organization ?? undefined,
lastMirrored: repo.lastMirrored ?? undefined,
errorMessage: repo.errorMessage ?? undefined,
forkedFrom: repo.forkedFrom ?? undefined,
visibility: repositoryVisibilityEnum.parse(repo.visibility),
mirroredLocation: repo.mirroredLocation || "",
};
await syncGiteaRepoEnhanced({
config: decryptedConfig, // pass decrypted tokens, not the raw DB row
repository: repoData,
skipForcePushDetection: true,
});
console.log(`[ApproveSync] Sync completed for approved repository: ${repo.name}`);
} catch (error) {
console.error(
`[ApproveSync] Failed to sync approved repository ${repo.name}:`,
error,
);
}
}
}, 0);
// Immediately update status to syncing for responsiveness
for (const repo of repos) {
await db
.update(repositories)
.set({
status: "syncing",
errorMessage: null,
updatedAt: new Date(),
})
.where(eq(repositories.id, repo.id));
}
return new Response(
JSON.stringify({
success: true,
message: `Approved sync for ${repos.length} repository(ies). Backup + sync started.`,
repositories: repos.map((repo) => ({
...repo,
status: "syncing",
errorMessage: null,
})),
}),
{ status: 200, headers: { "Content-Type": "application/json" } },
);
} catch (error) {
return createSecureErrorResponse(error, "approve-sync", 500);
}
};
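A client calling this route must satisfy the two input checks at the top of the handler. A small client-side mirror of that validation could look like this (the helper name is hypothetical; the `ApproveSyncRequest` shape comes from the route above):

```typescript
// Mirrors the route's checks: repositoryIds must be a non-empty array
// and action must be exactly "approve" or "dismiss".
interface ApproveSyncRequest {
  repositoryIds: string[];
  action: "approve" | "dismiss";
}

function isValidApproveSyncRequest(body: unknown): body is ApproveSyncRequest {
  const b = body as Partial<ApproveSyncRequest> | null;
  return (
    Array.isArray(b?.repositoryIds) &&
    b!.repositoryIds.length > 0 &&
    (b!.action === "approve" || b!.action === "dismiss")
  );
}

console.log(isValidApproveSyncRequest({ repositoryIds: ["r1"], action: "approve" })); // true
console.log(isValidApproveSyncRequest({ repositoryIds: [], action: "approve" }));     // false
console.log(isValidApproveSyncRequest({ repositoryIds: ["r1"], action: "retry" }));   // false
```

Validating before the request avoids a round-trip just to receive the route's 400 response.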


@@ -13,6 +13,7 @@ export const repoStatusEnum = z.enum([
"syncing",
"synced",
"archived",
"pending-approval", // Blocked by force-push detection, needs manual approval
]);
export type RepoStatus = z.infer<typeof repoStatusEnum>;


@@ -3,6 +3,7 @@ import { type Config as ConfigType } from "@/lib/db/schema";
export type GiteaOrgVisibility = "public" | "private" | "limited";
export type MirrorStrategy = "preserve" | "single-org" | "flat-user" | "mixed";
export type StarredReposMode = "dedicated-org" | "preserve-owner";
export type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";
export interface GiteaConfig {
url: string;
@@ -18,7 +19,8 @@ export interface GiteaConfig {
personalReposOrg?: string; // Override destination for personal repos
issueConcurrency?: number;
pullRequestConcurrency?: number;
backupBeforeSync?: boolean;
backupStrategy?: BackupStrategy;
backupBeforeSync?: boolean; // Deprecated: kept for backward compat, use backupStrategy
backupRetentionCount?: number;
backupDirectory?: string;
blockSyncOnBackupFailure?: boolean;
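A new-style config fragment using the fields above might look like the following (values are illustrative examples, not defaults, except where noted):

```typescript
// Illustrative GiteaConfig fragment for the new backup fields.
const giteaConfig = {
  url: "https://gitea.example.com",
  backupStrategy: "on-force-push" as const, // per the commits, this is the default
  // backupBeforeSync is deprecated — omit it in new configs; the resolver
  // described in the commit message maps any legacy value.
  backupRetentionCount: 5,
  backupDirectory: "data/repo-backups",
  blockSyncOnBackupFailure: false,
};

console.log(giteaConfig.backupStrategy); // "on-force-push"
```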


@@ -6,13 +6,13 @@
* by the 02-mirror-workflow suite.
*
* What is tested:
* B1. Enable backupBeforeSync in config
* B1. Enable backupStrategy: "always" in config
* B2. Confirm mirrored repos exist in Gitea (precondition)
* B3. Trigger a re-sync with backup enabled — verify the backup code path
* runs (snapshot activity entries appear in the activity log)
* B4. Inspect activity log for snapshot-related entries
* B5. Enable blockSyncOnBackupFailure and verify the flag is persisted
* B6. Disable backup and verify config resets cleanly
* B6. Disable backup (backupStrategy: "disabled") and verify config resets cleanly
*/
import { test, expect } from "@playwright/test";
@@ -54,10 +54,10 @@ test.describe("E2E: Backup configuration", () => {
const giteaToken = giteaApi.getTokenValue();
expect(giteaToken, "Gitea token required").toBeTruthy();
// Save config with backup enabled
// Save config with backup strategy set to "always"
await saveConfig(request, giteaToken, appCookies, {
giteaConfig: {
backupBeforeSync: true,
backupStrategy: "always",
blockSyncOnBackupFailure: false,
backupRetentionCount: 5,
backupDirectory: "data/repo-backups",
@@ -75,7 +75,7 @@ test.describe("E2E: Backup configuration", () => {
const configData = await configResp.json();
const giteaCfg = configData.giteaConfig ?? configData.gitea ?? {};
console.log(
`[Backup] Config saved: backupBeforeSync=${giteaCfg.backupBeforeSync}, blockOnFailure=${giteaCfg.blockSyncOnBackupFailure}`,
`[Backup] Config saved: backupStrategy=${giteaCfg.backupStrategy}, blockOnFailure=${giteaCfg.blockSyncOnBackupFailure}`,
);
}
});
@@ -202,7 +202,7 @@ test.describe("E2E: Backup configuration", () => {
expect(
backupJobs.length,
"Expected at least one backup/snapshot activity entry when " +
"backupBeforeSync is enabled and repos exist in Gitea",
"backupStrategy is 'always' and repos exist in Gitea",
).toBeGreaterThan(0);
// Check for any failed backups
@@ -247,7 +247,7 @@ test.describe("E2E: Backup configuration", () => {
// Update config to block sync on backup failure
await saveConfig(request, giteaToken, appCookies, {
giteaConfig: {
backupBeforeSync: true,
backupStrategy: "always",
blockSyncOnBackupFailure: true,
backupRetentionCount: 5,
backupDirectory: "data/repo-backups",
@@ -284,7 +284,7 @@ test.describe("E2E: Backup configuration", () => {
// Disable backup
await saveConfig(request, giteaToken, appCookies, {
giteaConfig: {
backupBeforeSync: false,
backupStrategy: "disabled",
blockSyncOnBackupFailure: false,
},
});
@@ -297,7 +297,7 @@ test.describe("E2E: Backup configuration", () => {
const configData = await configResp.json();
const giteaCfg = configData.giteaConfig ?? configData.gitea ?? {};
console.log(
`[Backup] After disable: backupBeforeSync=${giteaCfg.backupBeforeSync}`,
`[Backup] After disable: backupStrategy=${giteaCfg.backupStrategy}`,
);
}
console.log("[Backup] Backup configuration test complete");


@@ -302,7 +302,7 @@ test.describe("E2E: Force-push simulation", () => {
// Ensure backup is disabled for this test
await saveConfig(request, giteaToken, appCookies, {
giteaConfig: {
backupBeforeSync: false,
backupStrategy: "disabled",
blockSyncOnBackupFailure: false,
},
});
@@ -560,16 +560,16 @@ test.describe("E2E: Force-push simulation", () => {
const giteaToken = giteaApi.getTokenValue();
// Enable backup
// Enable backup with "always" strategy
await saveConfig(request, giteaToken, appCookies, {
giteaConfig: {
backupBeforeSync: true,
backupStrategy: "always",
blockSyncOnBackupFailure: false, // don't block — we want to see both backup + sync happen
backupRetentionCount: 5,
backupDirectory: "data/repo-backups",
},
});
console.log("[ForcePush] Backup enabled for protected sync test");
console.log("[ForcePush] Backup enabled (strategy=always) for protected sync test");
// Force-push again
mutateSourceRepo(MY_PROJECT_BARE, "my-project-rewrite2", (workDir) => {
@@ -744,7 +744,7 @@ test.describe("E2E: Force-push simulation", () => {
expect(
backupJobs.length,
"At least one backup/snapshot activity should exist for my-project " +
"when backupBeforeSync is enabled",
"when backupStrategy is 'always'",
).toBeGreaterThan(0);
// Check whether any backups actually succeeded


@@ -520,7 +520,7 @@ export async function saveConfig(
starredReposOrg: "github-stars",
preserveOrgStructure: false,
mirrorStrategy: "single-org",
backupBeforeSync: false,
backupStrategy: "disabled",
blockSyncOnBackupFailure: false,
};