mirror of https://github.com/RayLabsHQ/gitea-mirror.git (synced 2026-03-14 14:32:54 +03:00)

# Compare commits (5 commits)
| Author | SHA1 | Date |
|---|---|---|
| | `d0693206c3` | |
| | `b079070c30` | |
| | `e68e9c38a8` | |
| | `534150ecf9` | |
| | `98da7065e0` | |
### .github/screenshots/backup-strategy-ui.png (new binary file, vendored)

Binary file not shown. Size: 34 KiB.
### .github/workflows/astro-build-test.yml (5 changes, vendored)

```diff
@@ -6,11 +6,15 @@ on:
     paths-ignore:
       - 'README.md'
       - 'docs/**'
+      - 'www/**'
+      - 'helm/**'
   pull_request:
     branches: [ '*' ]
     paths-ignore:
       - 'README.md'
       - 'docs/**'
+      - 'www/**'
+      - 'helm/**'

 permissions:
   contents: read
@@ -20,6 +24,7 @@ jobs:
   build-and-test:
     name: Build and Test Astro Project
     runs-on: ubuntu-latest
+    timeout-minutes: 10

     steps:
       - name: Checkout repository
```
### .github/workflows/docker-build.yml (1 change, vendored)

```diff
@@ -36,6 +36,7 @@ env:
 jobs:
   docker:
     runs-on: ubuntu-latest
+    timeout-minutes: 10

     permissions:
       contents: write
```
### .github/workflows/e2e-tests.yml (6 changes, vendored)

```diff
@@ -8,6 +8,8 @@ on:
       - "docs/**"
       - "CHANGELOG.md"
       - "LICENSE"
+      - "www/**"
+      - "helm/**"
   pull_request:
     branches: ["*"]
     paths-ignore:
@@ -15,6 +17,8 @@ on:
       - "docs/**"
       - "CHANGELOG.md"
       - "LICENSE"
+      - "www/**"
+      - "helm/**"
   workflow_dispatch:
     inputs:
       debug_enabled:
@@ -42,7 +46,7 @@ jobs:
   e2e-tests:
     name: E2E Integration Tests
     runs-on: ubuntu-latest
-    timeout-minutes: 25
+    timeout-minutes: 10

     steps:
       - name: Checkout repository
```
### .github/workflows/helm-test.yml (2 changes, vendored)

```diff
@@ -21,6 +21,7 @@ jobs:
   yamllint:
     name: Lint YAML
     runs-on: ubuntu-latest
+    timeout-minutes: 10
     steps:
       - uses: actions/checkout@v4
       - uses: actions/setup-python@v5
@@ -35,6 +36,7 @@ jobs:
   helm-template:
     name: Helm lint & template
     runs-on: ubuntu-latest
+    timeout-minutes: 10
     steps:
       - uses: actions/checkout@v4
       - name: Setup Helm
```
### .github/workflows/nix-build.yml (11 changes, vendored)

```diff
@@ -5,8 +5,18 @@ on:
     branches: [main, nix]
     tags:
       - 'v*'
+    paths-ignore:
+      - 'README.md'
+      - 'docs/**'
+      - 'www/**'
+      - 'helm/**'
   pull_request:
     branches: [main]
+    paths-ignore:
+      - 'README.md'
+      - 'docs/**'
+      - 'www/**'
+      - 'helm/**'

 permissions:
   contents: read
@@ -14,6 +24,7 @@ permissions:
 jobs:
   check:
     runs-on: ubuntu-latest
+    timeout-minutes: 10

     steps:
       - uses: actions/checkout@v4
```
### README.md

```diff
@@ -40,6 +40,7 @@ First user signup becomes admin. Configure GitHub and Gitea through the web interface
 - 🔄 **Auto-discovery** - Automatically import new GitHub repositories (v3.4.0+)
 - 🧹 **Repository cleanup** - Auto-remove repos deleted from GitHub (v3.4.0+)
 - 🎯 **Proper mirror intervals** - Respects configured sync intervals (v3.4.0+)
+- 🛡️ **[Force-push protection](docs/FORCE_PUSH_PROTECTION.md)** - Smart detection with backup-on-demand or block-and-approve modes (Beta)
 - 🗑️ Automatic database cleanup with configurable retention
 - 🐳 Dockerized with multi-arch support (AMD64/ARM64)

@@ -499,6 +500,7 @@ GNU Affero General Public License v3.0 (AGPL-3.0) - see [LICENSE](LICENSE) file
 - 📖 [Documentation](https://github.com/RayLabsHQ/gitea-mirror/tree/main/docs)
 - 🔐 [Environment Variables](docs/ENVIRONMENT_VARIABLES.md)
+- 🛡️ [Force-Push Protection](docs/FORCE_PUSH_PROTECTION.md)
 - 🐛 [Report Issues](https://github.com/RayLabsHQ/gitea-mirror/issues)
 - 💬 [Discussions](https://github.com/RayLabsHQ/gitea-mirror/discussions)
 - 🔧 [Proxmox VE Script](https://community-scripts.github.io/ProxmoxVE/scripts?id=gitea-mirror)
```
### docs/ENVIRONMENT_VARIABLES.md

```diff
@@ -78,6 +78,7 @@ Settings for connecting to and configuring GitHub repository sources.
 | Variable | Description | Default | Options |
 |----------|-------------|---------|---------|
 | `SKIP_STARRED_ISSUES` | Enable lightweight mode for starred repos (skip issues) | `false` | `true`, `false` |
+| `AUTO_MIRROR_STARRED` | Automatically mirror starred repos during scheduled syncs and "Mirror All". When `false`, starred repos are imported for browsing but must be mirrored individually. | `false` | `true`, `false` |

 ## Gitea Configuration
```
### docs/FORCE_PUSH_PROTECTION.md (new file, +179 lines)
# Force-Push Protection

This document describes the smart force-push protection system introduced in gitea-mirror v3.11.0+.

## The Problem

GitHub repositories can be force-pushed at any time, rewriting history, deleting branches, or replacing commits entirely. When gitea-mirror syncs a force-pushed repo, the old history in Gitea is silently overwritten. Files, commits, and branches disappear with no way to recover them.

The original workaround (`backupBeforeSync: true`) created a full git bundle backup before **every** sync. This doesn't scale: a user with 100+ GiB of mirrors would need up to 2 TB of backup storage with default retention settings, even though force-pushes are rare.

## Solution: Smart Detection

Instead of backing up everything every time, the system detects force-pushes **before** they happen and only acts when needed.

### How Detection Works

Before each sync, the app compares branch SHAs between Gitea (the mirror) and GitHub (the source):

1. **Fetch branches from both sides**: lightweight API calls get branch names and their latest commit SHAs.
2. **Compare each branch**:
   - SHAs match → nothing changed, no action needed
   - SHAs differ → check whether the change is a normal push or a force-push
3. **Ancestry check**: for branches with different SHAs, call GitHub's compare API to determine whether the new SHA is a descendant of the old one:
   - **Fast-forward** (new SHA descends from old) → normal push, safe to sync
   - **Diverged** (histories split) → force-push detected
   - **404** (old SHA no longer exists on GitHub) → history was rewritten, force-push detected
   - **Branch deleted on GitHub** → flagged as destructive change
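The per-branch decision above can be sketched as a small pure function. The names and shapes here are illustrative assumptions, not the actual implementation (which lives in `src/lib/utils/force-push-detection.ts` and works against live API responses):

```typescript
// Sketch of the per-branch classification described above.
// `compare.status` mirrors GitHub's compare API ("ahead" | "behind" |
// "diverged" | "identical"); `compare.httpStatus` is the response code.
type BranchVerdict = "safe" | "force-push" | "deleted";

function classifyBranch(
  giteaSha: string,
  githubSha: string | undefined, // undefined → branch gone on GitHub
  compare?: { status: string; httpStatus: number },
): BranchVerdict {
  if (githubSha === undefined) return "deleted"; // destructive change
  if (giteaSha === githubSha) return "safe"; // nothing changed
  if (!compare) return "safe"; // fail-open when compare data is missing
  if (compare.httpStatus === 404 || compare.httpStatus === 422) {
    return "force-push"; // old SHA rewritten away
  }
  if (compare.status === "ahead" || compare.status === "identical") {
    return "safe"; // fast-forward
  }
  return "force-push"; // diverged or behind
}
```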
### What Happens on Detection

The response depends on the configured strategy (see below):

- **Backup strategies** (`always`, `on-force-push`): create a git bundle snapshot, then sync
- **Block strategy** (`block-on-force-push`): halt the sync, mark the repo as `pending-approval`, and wait for user action

### Fail-Open Design

If detection itself fails (GitHub rate limits, network errors, API outages), the sync proceeds normally. Detection never blocks a sync because of its own failure. Individual branch-check failures are skipped, so one flaky branch doesn't affect the others.
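The fail-open contract can be sketched as a wrapper with hypothetical names; the point is only that a thrown detection error maps to "no force-push found" rather than a blocked sync:

```typescript
// Fail-open guard: if detection throws (rate limit, network error,
// API outage), report "no force-push" so the sync proceeds anyway.
async function detectSafely(
  detect: () => Promise<boolean>,
): Promise<{ forcePush: boolean; skipped: boolean }> {
  try {
    return { forcePush: await detect(), skipped: false };
  } catch {
    // A detection failure must never block a sync.
    return { forcePush: false, skipped: true };
  }
}
```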
## Backup Strategies

Configure via **Settings → GitHub Configuration → Destructive Update Protection**.

| Strategy | What It Does | Storage Cost | Best For |
|---|---|---|---|
| **Disabled** | No detection, no backups | Zero | Repos you don't care about losing |
| **Always Backup** | Snapshot before every sync (original behavior) | High | Small mirror sets, maximum safety |
| **Smart** (default) | Detect force-pushes, back up only when one is found | Near-zero normally | Most users; efficient protection |
| **Block & Approve** | Detect force-pushes, block sync until approved | Zero | Critical repos needing manual review |

### Strategy Details

#### Disabled

Syncs proceed without any detection or backup. If a force-push happens on GitHub, the mirror is silently overwritten.

#### Always Backup

Creates a git bundle snapshot before every sync, regardless of whether a force-push occurred. This is the legacy behavior (equivalent to the old `backupBeforeSync: true`). Safe but expensive for large mirror sets.

#### Smart (`on-force-push`), Recommended

Runs force-push detection before each sync. On normal days (no force-pushes), syncs proceed without any backup overhead. When a force-push is detected, a snapshot is created before the sync runs.

This gives you protection when it matters, at near-zero cost when it doesn't.

#### Block & Approve (`block-on-force-push`)

Runs detection and, when a force-push is found, **blocks the sync entirely**. The repository is marked as `pending-approval` and excluded from future scheduled syncs until you take action:

- **Approve**: creates a backup first, then syncs (safe)
- **Dismiss**: clears the flag and resumes normal syncing (no backup)

Use this for repos where you want manual control over destructive changes.

## Additional Settings

These settings appear when any non-disabled strategy is selected:

### Snapshot Retention Count

How many backup snapshots to keep per repository. The oldest snapshots are deleted when this limit is exceeded. Default: **20**.
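Retention pruning amounts to "keep the newest N, delete the rest". A minimal sketch, assuming snapshot filenames start with sortable timestamps (the helper name is hypothetical):

```typescript
// Given snapshot filenames and a retention count, return the files to
// delete: everything except the newest `keep` snapshots.
function snapshotsToDelete(snapshots: string[], keep: number): string[] {
  // ISO-style timestamps sort correctly as plain strings, oldest first.
  const sorted = [...snapshots].sort();
  return sorted.slice(0, Math.max(0, sorted.length - keep));
}
```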
### Snapshot Directory

Where git bundle backups are stored. Default: **`data/repo-backups`**. Bundles are organized as `<directory>/<owner>/<repo>/<timestamp>.bundle`.
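The layout above can be sketched as a path builder. The exact timestamp format is an assumption (here, an ISO timestamp made filesystem-safe); only the `<directory>/<owner>/<repo>/<timestamp>.bundle` shape comes from the document:

```typescript
// Builds a bundle path following the documented layout.
// The timestamp encoding is illustrative, not the real implementation.
function bundlePath(dir: string, owner: string, repo: string, when: Date): string {
  // Replace ":" and "." so the timestamp is safe as a filename.
  const ts = when.toISOString().replace(/[:.]/g, "-");
  return `${dir}/${owner}/${repo}/${ts}.bundle`;
}
```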
### Block Sync on Snapshot Failure

Available for the **Always Backup** and **Smart** strategies. When enabled, if snapshot creation fails (disk full, permissions error, etc.), the sync is blocked as well. When disabled, the sync continues even if the snapshot couldn't be created.

Recommended: **enabled** if you rely on backups for recovery.

## Backward Compatibility

The old `backupBeforeSync` boolean is still recognized:

| Old Setting | New Equivalent |
|---|---|
| `backupBeforeSync: true` | `backupStrategy: "always"` |
| `backupBeforeSync: false` | `backupStrategy: "disabled"` |
| Neither set | `backupStrategy: "on-force-push"` (new default) |

Existing configurations are mapped automatically. The old field is deprecated but will continue to work.
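The mapping table above implies a simple resolution order: an explicit `backupStrategy` wins, then the legacy boolean, then the default. A sketch (function name is an assumption; the real resolver lives in `src/lib/repo-backup.ts`):

```typescript
type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";

// Resolve the effective strategy from old and new config fields.
function resolveBackupStrategy(config: {
  backupStrategy?: BackupStrategy;
  backupBeforeSync?: boolean;
}): BackupStrategy {
  if (config.backupStrategy) return config.backupStrategy; // new field wins
  if (config.backupBeforeSync === true) return "always"; // legacy opt-in
  if (config.backupBeforeSync === false) return "disabled"; // legacy opt-out
  return "on-force-push"; // new default
}
```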
## Environment Variables

No new environment variables are required. The backup strategy is configured through the web UI and stored in the database alongside the rest of the config.

## API

### Approve/Dismiss Blocked Repos

When using the `block-on-force-push` strategy, blocked repos can be managed via the API:

```bash
# Approve sync (creates backup first, then syncs)
curl -X POST http://localhost:4321/api/job/approve-sync \
  -H "Content-Type: application/json" \
  -H "Cookie: <session>" \
  -d '{"repositoryIds": ["<id>"], "action": "approve"}'

# Dismiss (clear the block, resume normal syncing)
curl -X POST http://localhost:4321/api/job/approve-sync \
  -H "Content-Type: application/json" \
  -H "Cookie: <session>" \
  -d '{"repositoryIds": ["<id>"], "action": "dismiss"}'
```

Blocked repos also show **Approve** / **Dismiss** buttons in the repository table UI.
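The same request can be built from code. The endpoint path and body fields come from the curl examples above; the helper itself is an illustrative sketch (you would still attach your session cookie and call `fetch` yourself):

```typescript
// Build the approve-sync request shown in the curl examples.
function approveSyncRequest(ids: string[], action: "approve" | "dismiss") {
  return {
    url: "/api/job/approve-sync",
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ repositoryIds: ids, action }),
    },
  };
}
```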
## Architecture

### Key Files

| File | Purpose |
|---|---|
| `src/lib/utils/force-push-detection.ts` | Core detection: fetch branches, compare SHAs, check ancestry |
| `src/lib/repo-backup.ts` | Strategy resolver, backup decision logic, bundle creation |
| `src/lib/gitea-enhanced.ts` | Sync flow integration (calls detection + backup before mirror-sync) |
| `src/pages/api/job/approve-sync.ts` | Approve/dismiss API endpoint |
| `src/components/config/GitHubConfigForm.tsx` | Strategy selector UI |
| `src/components/repositories/RepositoryTable.tsx` | Pending-approval badge + action buttons |

### Detection Flow

```
syncGiteaRepoEnhanced()
│
├─ Resolve backup strategy (config → backupStrategy → backupBeforeSync → default)
│
├─ If strategy needs detection ("on-force-push" or "block-on-force-push"):
│   │
│   ├─ fetchGiteaBranches() — GET /api/v1/repos/{owner}/{repo}/branches
│   ├─ fetchGitHubBranches() — octokit.paginate(repos.listBranches)
│   │
│   └─ For each Gitea branch where SHA differs:
│       └─ checkAncestry() — octokit.repos.compareCommits()
│           ├─ "ahead" or "identical" → fast-forward (safe)
│           ├─ "diverged" or "behind" → force-push detected
│           └─ 404/422 → old SHA gone → force-push detected
│
├─ If "block-on-force-push" + detected:
│   └─ Set repo status to "pending-approval", return early
│
├─ If backup needed (always, or on-force-push + detected):
│   └─ Create git bundle snapshot
│
└─ Proceed to mirror-sync
```

## Troubleshooting

**Repos stuck in "pending-approval"**: Use the Approve or Dismiss buttons in the repository table, or call the approve-sync API endpoint.

**Detection always skipped**: Check the activity log for skip reasons. Common causes: the Gitea repo is not yet mirrored (first sync), GitHub API rate limits, or network errors. All of these are fail-open by design.

**Backups consuming too much space**: Lower the retention count, or switch from "Always Backup" to "Smart", which only creates backups on actual force-pushes.

**False positives**: Detection compares branch by branch. A rebase (which is a force-push) will correctly trigger detection. If you routinely rebase branches, consider "Smart" instead of "Block & Approve" to avoid constant approval prompts.
### src/components/config/ConfigTabs.tsx

```diff
@@ -50,7 +50,7 @@ export function ConfigTabs() {
     starredReposOrg: 'starred',
     starredReposMode: 'dedicated-org',
     preserveOrgStructure: false,
-    backupBeforeSync: true,
+    backupStrategy: "on-force-push",
     backupRetentionCount: 20,
     backupDirectory: 'data/repo-backups',
     blockSyncOnBackupFailure: true,
```
```diff
@@ -83,6 +83,7 @@ export function ConfigTabs() {
     advancedOptions: {
       skipForks: false,
       starredCodeOnly: false,
+      autoMirrorStarred: false,
     },
   });
   const { user } = useAuth();
```
```diff
@@ -660,9 +661,20 @@ export function ConfigTabs() {
               : update,
           }))
         }
+        giteaConfig={config.giteaConfig}
+        setGiteaConfig={update =>
+          setConfig(prev => ({
+            ...prev,
+            giteaConfig:
+              typeof update === 'function'
+                ? update(prev.giteaConfig)
+                : update,
+          }))
+        }
         onAutoSave={autoSaveGitHubConfig}
         onMirrorOptionsAutoSave={autoSaveMirrorOptions}
         onAdvancedOptionsAutoSave={autoSaveAdvancedOptions}
+        onGiteaAutoSave={autoSaveGiteaConfig}
         isAutoSaving={isAutoSavingGitHub}
       />
       <GiteaConfigForm
```
### src/components/config/GitHubConfigForm.tsx

```diff
@@ -7,10 +7,11 @@ import {
   CardTitle,
 } from "@/components/ui/card";
 import { githubApi } from "@/lib/api";
-import type { GitHubConfig, MirrorOptions, AdvancedOptions } from "@/types/config";
+import type { GitHubConfig, MirrorOptions, AdvancedOptions, GiteaConfig, BackupStrategy } from "@/types/config";
 import { Input } from "../ui/input";
 import { toast } from "sonner";
-import { Info } from "lucide-react";
+import { Info, ShieldAlert } from "lucide-react";
+import { Badge } from "@/components/ui/badge";
 import { GitHubMirrorSettings } from "./GitHubMirrorSettings";
 import { Separator } from "../ui/separator";
 import {
@@ -26,23 +27,29 @@ interface GitHubConfigFormProps {
   setMirrorOptions: React.Dispatch<React.SetStateAction<MirrorOptions>>;
   advancedOptions: AdvancedOptions;
   setAdvancedOptions: React.Dispatch<React.SetStateAction<AdvancedOptions>>;
+  giteaConfig?: GiteaConfig;
+  setGiteaConfig?: React.Dispatch<React.SetStateAction<GiteaConfig>>;
   onAutoSave?: (githubConfig: GitHubConfig) => Promise<void>;
   onMirrorOptionsAutoSave?: (mirrorOptions: MirrorOptions) => Promise<void>;
   onAdvancedOptionsAutoSave?: (advancedOptions: AdvancedOptions) => Promise<void>;
+  onGiteaAutoSave?: (giteaConfig: GiteaConfig) => Promise<void>;
   isAutoSaving?: boolean;
 }

 export function GitHubConfigForm({
   config,
   setConfig,
   mirrorOptions,
   setMirrorOptions,
   advancedOptions,
   setAdvancedOptions,
+  giteaConfig,
+  setGiteaConfig,
   onAutoSave,
   onMirrorOptionsAutoSave,
   onAdvancedOptionsAutoSave,
+  onGiteaAutoSave,
   isAutoSaving
 }: GitHubConfigFormProps) {
   const [isLoading, setIsLoading] = useState(false);
```
```diff
@@ -202,7 +209,139 @@ export function GitHubConfigForm({
             if (onAdvancedOptionsAutoSave) onAdvancedOptionsAutoSave(newOptions);
           }}
         />

+        {giteaConfig && setGiteaConfig && (
+          <>
+            <Separator />
+
+            <div className="space-y-4">
+              <h3 className="text-sm font-medium flex items-center gap-2">
+                <ShieldAlert className="h-4 w-4 text-primary" />
+                Destructive Update Protection
+                <Badge variant="secondary" className="ml-2 text-[10px] px-1.5 py-0">BETA</Badge>
+              </h3>
+              <p className="text-xs text-muted-foreground">
+                Choose how to handle force-pushes or rewritten upstream history on GitHub.
+              </p>
+
+              <div className="grid grid-cols-2 md:grid-cols-4 gap-2">
+                {([
+                  {
+                    value: "disabled",
+                    label: "Disabled",
+                    desc: "No detection or backups",
+                  },
+                  {
+                    value: "always",
+                    label: "Always Backup",
+                    desc: "Snapshot before every sync",
+                  },
+                  {
+                    value: "on-force-push",
+                    label: "Smart",
+                    desc: "Backup only on force-push",
+                  },
+                  {
+                    value: "block-on-force-push",
+                    label: "Block & Approve",
+                    desc: "Require approval on force-push",
+                  },
+                ] as const).map((opt) => {
+                  const isSelected = (giteaConfig.backupStrategy ?? "on-force-push") === opt.value;
+                  return (
+                    <button
+                      key={opt.value}
+                      type="button"
+                      onClick={() => {
+                        const newConfig = { ...giteaConfig, backupStrategy: opt.value as BackupStrategy };
+                        setGiteaConfig(newConfig);
+                        if (onGiteaAutoSave) onGiteaAutoSave(newConfig);
+                      }}
+                      className={`flex flex-col items-start gap-1 rounded-lg border p-3 text-left text-sm transition-colors ${
+                        isSelected
+                          ? "border-primary bg-primary/5 ring-1 ring-primary"
+                          : "border-input hover:bg-accent hover:text-accent-foreground"
+                      }`}
+                    >
+                      <span className="font-medium">{opt.label}</span>
+                      <span className="text-xs text-muted-foreground">{opt.desc}</span>
+                    </button>
+                  );
+                })}
+              </div>
+
+              {(giteaConfig.backupStrategy ?? "on-force-push") !== "disabled" && (
+                <>
+                  <div className="grid grid-cols-1 md:grid-cols-2 gap-4">
+                    <div>
+                      <label htmlFor="backup-retention" className="block text-sm font-medium mb-1.5">
+                        Snapshot retention count
+                      </label>
+                      <input
+                        id="backup-retention"
+                        name="backupRetentionCount"
+                        type="number"
+                        min={1}
+                        value={giteaConfig.backupRetentionCount ?? 20}
+                        onChange={(e) => {
+                          const newConfig = {
+                            ...giteaConfig,
+                            backupRetentionCount: Math.max(1, Number.parseInt(e.target.value, 10) || 20),
+                          };
+                          setGiteaConfig(newConfig);
+                          if (onGiteaAutoSave) onGiteaAutoSave(newConfig);
+                        }}
+                        className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring"
+                      />
+                    </div>
+                    <div>
+                      <label htmlFor="backup-directory" className="block text-sm font-medium mb-1.5">
+                        Snapshot directory
+                      </label>
+                      <input
+                        id="backup-directory"
+                        name="backupDirectory"
+                        type="text"
+                        value={giteaConfig.backupDirectory || "data/repo-backups"}
+                        onChange={(e) => {
+                          const newConfig = { ...giteaConfig, backupDirectory: e.target.value };
+                          setGiteaConfig(newConfig);
+                          if (onGiteaAutoSave) onGiteaAutoSave(newConfig);
+                        }}
+                        className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring"
+                        placeholder="data/repo-backups"
+                      />
+                    </div>
+                  </div>
+
+                  {((giteaConfig.backupStrategy ?? "on-force-push") === "always" ||
+                    (giteaConfig.backupStrategy ?? "on-force-push") === "on-force-push") && (
+                    <label className="flex items-start gap-3 text-sm">
+                      <input
+                        name="blockSyncOnBackupFailure"
+                        type="checkbox"
+                        checked={Boolean(giteaConfig.blockSyncOnBackupFailure)}
+                        onChange={(e) => {
+                          const newConfig = { ...giteaConfig, blockSyncOnBackupFailure: e.target.checked };
+                          setGiteaConfig(newConfig);
+                          if (onGiteaAutoSave) onGiteaAutoSave(newConfig);
+                        }}
+                        className="mt-0.5 rounded border-input"
+                      />
+                      <span>
+                        Block sync when snapshot fails
+                        <p className="text-xs text-muted-foreground">
+                          Recommended for backup-first behavior. If disabled, sync continues even when snapshot creation fails.
+                        </p>
+                      </span>
+                    </label>
+                  )}
+                </>
+              )}
+            </div>
+          </>
+        )}

       {/* Mobile: Show button at bottom */}
       <Button
         type="button"
```
### src/components/config/GitHubMirrorSettings.tsx

```diff
@@ -287,6 +287,31 @@ export function GitHubMirrorSettings({
         </div>
       </div>

+      {/* Auto-mirror starred repos toggle */}
+      {githubConfig.mirrorStarred && (
+        <div className="mt-4">
+          <div className="flex items-start space-x-3">
+            <Checkbox
+              id="auto-mirror-starred"
+              checked={advancedOptions.autoMirrorStarred ?? false}
+              onCheckedChange={(checked) => handleAdvancedChange('autoMirrorStarred', !!checked)}
+            />
+            <div className="space-y-0.5 flex-1">
+              <Label
+                htmlFor="auto-mirror-starred"
+                className="text-sm font-normal cursor-pointer flex items-center gap-2"
+              >
+                <Star className="h-3.5 w-3.5" />
+                Auto-mirror new starred repositories
+              </Label>
+              <p className="text-xs text-muted-foreground">
+                When disabled, starred repos are imported for browsing but not automatically mirrored. You can still mirror individual repos manually.
+              </p>
+            </div>
+          </div>
+        </div>
+      )}

       {/* Duplicate name handling for starred repos */}
       {githubConfig.mirrorStarred && (
         <div className="mt-4 space-y-2">
```
### src/components/config/GiteaConfigForm.tsx

```diff
@@ -103,9 +103,7 @@ export function GiteaConfigForm({ config, setConfig, onAutoSave, isAutoSaving, g
     const normalizedValue =
       type === "checkbox"
         ? checked
-        : name === "backupRetentionCount"
-          ? Math.max(1, Number.parseInt(value, 10) || 20)
-          : value;
+        : value;

     const newConfig = {
       ...config,
```
```diff
@@ -294,76 +292,6 @@ export function GiteaConfigForm({ config, setConfig, onAutoSave, isAutoSaving, g
         }}
       />

-      <Separator />
-
-      <div className="space-y-4">
-        <h3 className="text-sm font-semibold">Destructive Update Protection</h3>
-        <label className="flex items-start gap-3 text-sm">
-          <input
-            name="backupBeforeSync"
-            type="checkbox"
-            checked={Boolean(config.backupBeforeSync)}
-            onChange={handleChange}
-            className="mt-0.5 rounded border-input"
-          />
-          <span>
-            Create snapshot before each sync
-            <p className="text-xs text-muted-foreground">
-              Saves a restore point so force-pushes or rewritten upstream history can be recovered.
-            </p>
-          </span>
-        </label>
-
-        {config.backupBeforeSync && (
-          <div className="grid grid-cols-1 md:grid-cols-2 gap-4">
-            <div>
-              <label htmlFor="gitea-backup-retention" className="block text-sm font-medium mb-1.5">
-                Snapshot retention count
-              </label>
-              <input
-                id="gitea-backup-retention"
-                name="backupRetentionCount"
-                type="number"
-                min={1}
-                value={config.backupRetentionCount ?? 20}
-                onChange={handleChange}
-                className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring"
-              />
-            </div>
-            <div>
-              <label htmlFor="gitea-backup-directory" className="block text-sm font-medium mb-1.5">
-                Snapshot directory
-              </label>
-              <input
-                id="gitea-backup-directory"
-                name="backupDirectory"
-                type="text"
-                value={config.backupDirectory || "data/repo-backups"}
-                onChange={handleChange}
-                className="w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring"
-                placeholder="data/repo-backups"
-              />
-            </div>
-          </div>
-        )}
-
-        <label className="flex items-start gap-3 text-sm">
-          <input
-            name="blockSyncOnBackupFailure"
-            type="checkbox"
-            checked={Boolean(config.blockSyncOnBackupFailure)}
-            onChange={handleChange}
-            className="mt-0.5 rounded border-input"
-          />
-          <span>
-            Block sync when snapshot fails
-            <p className="text-xs text-muted-foreground">
-              Recommended for backup-first behavior. If disabled, sync continues even when snapshot creation fails.
-            </p>
-          </span>
-        </label>
-      </div>

       {/* Mobile: Show button at bottom */}
       <Button
         type="button"
```
@@ -56,7 +56,7 @@ export default function Repository() {
|
||||
const [isInitialLoading, setIsInitialLoading] = useState(true);
|
||||
const { user } = useAuth();
|
||||
const { registerRefreshCallback, isLiveEnabled } = useLiveRefresh();
|
||||
const { isGitHubConfigured, isFullyConfigured } = useConfigStatus();
|
||||
const { isGitHubConfigured, isFullyConfigured, autoMirrorStarred, githubOwner } = useConfigStatus();
|
||||
const { navigationKey } = useNavigation();
|
||||
const { filter, setFilter } = useFilterParams({
|
||||
searchTerm: "",
|
||||
@@ -233,10 +233,12 @@ export default function Repository() {
|
||||
// Filter out repositories that are already mirroring, mirrored, or ignored
|
||||
const eligibleRepos = repositories.filter(
|
||||
(repo) =>
|
||||
repo.status !== "mirroring" &&
|
||||
repo.status !== "mirrored" &&
|
||||
repo.status !== "mirroring" &&
|
||||
repo.status !== "mirrored" &&
|
||||
repo.status !== "ignored" && // Skip ignored repositories
|
||||
repo.id
|
||||
repo.id &&
|
||||
// Skip starred repos from other owners when autoMirrorStarred is disabled
|
||||
!(repo.isStarred && !autoMirrorStarred && repo.owner !== githubOwner)
|
||||
);
|
||||
|
||||
if (eligibleRepos.length === 0) {
|
||||
@@ -292,7 +294,7 @@ export default function Repository() {
|
||||
|
||||
const selectedRepos = repositories.filter(repo => repo.id && selectedRepoIds.has(repo.id));
|
||||
const eligibleRepos = selectedRepos.filter(
|
||||
repo => repo.status === "imported" || repo.status === "failed"
|
||||
repo => repo.status === "imported" || repo.status === "failed" || repo.status === "pending-approval"
|
||||
);
|
||||
|
||||
if (eligibleRepos.length === 0) {
|
||||
@@ -301,7 +303,7 @@ export default function Repository() {
|
||||
}
|
||||
|
||||
const repoIds = eligibleRepos.map(repo => repo.id as string);
|
||||
|
||||
|
||||
setLoadingRepoIds(prev => {
|
||||
const newSet = new Set(prev);
|
||||
repoIds.forEach(id => newSet.add(id));
|
||||
@@ -694,6 +696,80 @@ export default function Repository() {
|
||||
}
|
||||
};
|
||||
|
||||
const handleApproveSyncAction = async ({ repoId }: { repoId: string }) => {
|
||||
try {
|
||||
if (!user || !user.id) return;
|
||||
setLoadingRepoIds((prev) => new Set(prev).add(repoId));
|
||||
|
||||
const response = await apiRequest<{
|
||||
success: boolean;
|
||||
message?: string;
|
||||
error?: string;
|
||||
repositories: Repository[];
|
||||
}>("/job/approve-sync", {
|
||||
method: "POST",
|
||||
data: { repositoryIds: [repoId], action: "approve" },
|
||||
});
|
||||
|
||||
if (response.success) {
|
||||
toast.success("Sync approved — backup + sync started");
|
||||
setRepositories((prevRepos) =>
|
||||
prevRepos.map((repo) => {
|
||||
const updated = response.repositories.find((r) => r.id === repo.id);
|
||||
return updated ? updated : repo;
|
||||
}),
|
||||
);
|
||||
} else {
|
||||
showErrorToast(response.error || "Error approving sync", toast);
|
||||
}
|
||||
} catch (error) {
|
||||
showErrorToast(error, toast);
|
||||
} finally {
|
||||
setLoadingRepoIds((prev) => {
|
||||
const newSet = new Set(prev);
|
||||
newSet.delete(repoId);
|
||||
return newSet;
|
||||
});
|
||||
}
|
||||
};
|
||||
|
||||
const handleDismissSyncAction = async ({ repoId }: { repoId: string }) => {
|
||||
try {
|
||||
if (!user || !user.id) return;
|
||||
setLoadingRepoIds((prev) => new Set(prev).add(repoId));
|
||||
|
||||
const response = await apiRequest<{
|
||||
success: boolean;
|
||||
message?: string;
|
||||
error?: string;
|
||||
repositories: Repository[];
|
||||
}>("/job/approve-sync", {
|
||||
method: "POST",
|
||||
data: { repositoryIds: [repoId], action: "dismiss" },
|
||||
});
|
||||
|
||||
if (response.success) {
|
||||
toast.success("Force-push alert dismissed");
|
||||
setRepositories((prevRepos) =>
|
||||
prevRepos.map((repo) => {
|
||||
const updated = response.repositories.find((r) => r.id === repo.id);
|
||||
return updated ? updated : repo;
|
||||
}),
|
||||
);
|
||||
} else {
|
||||
showErrorToast(response.error || "Error dismissing alert", toast);
|
||||
}
|
||||
} catch (error) {
|
||||
showErrorToast(error, toast);
|
||||
} finally {
|
||||
setLoadingRepoIds((prev) => {
|
||||
const newSet = new Set(prev);
|
||||
newSet.delete(repoId);
|
||||
return newSet;
|
||||
});
|
||||
}
|
||||
};
|
||||
|
||||
const handleAddRepository = async ({
|
||||
repo,
|
||||
owner,
|
||||
@@ -863,7 +939,7 @@ export default function Repository() {
     const actions = [];
 
     // Check if any selected repos can be mirrored
-    if (selectedRepos.some(repo => repo.status === "imported" || repo.status === "failed")) {
+    if (selectedRepos.some(repo => repo.status === "imported" || repo.status === "failed" || repo.status === "pending-approval")) {
       actions.push('mirror');
     }
 
@@ -901,7 +977,7 @@ export default function Repository() {
     const selectedRepos = repositories.filter(repo => repo.id && selectedRepoIds.has(repo.id));
 
     return {
-      mirror: selectedRepos.filter(repo => repo.status === "imported" || repo.status === "failed").length,
+      mirror: selectedRepos.filter(repo => repo.status === "imported" || repo.status === "failed" || repo.status === "pending-approval").length,
       sync: selectedRepos.filter(repo => repo.status === "mirrored" || repo.status === "synced").length,
       rerunMetadata: selectedRepos.filter(repo => ["mirrored", "synced", "archived"].includes(repo.status)).length,
       retry: selectedRepos.filter(repo => repo.status === "failed").length,
@@ -1409,6 +1485,8 @@ export default function Repository() {
           await fetchRepositories(false);
         }}
         onDelete={handleRequestDeleteRepository}
+        onApproveSync={handleApproveSyncAction}
+        onDismissSync={handleDismissSyncAction}
       />
     )}
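The status-based eligibility behind the bulk-action counts above can be condensed into a single predicate. The following is an illustrative sketch only (the helper name `eligibleActions` is hypothetical; the status strings are the ones used in the diff):

```typescript
// Hypothetical sketch: which bulk actions a repository's status makes it
// eligible for, mirroring the filters in the diff above. Note that
// "pending-approval" now counts as mirrorable.
type Status =
  | "imported" | "failed" | "pending-approval"
  | "mirrored" | "synced" | "archived";

function eligibleActions(status: Status): string[] {
  const actions: string[] = [];
  if (["imported", "failed", "pending-approval"].includes(status)) actions.push("mirror");
  if (["mirrored", "synced"].includes(status)) actions.push("sync");
  if (["mirrored", "synced", "archived"].includes(status)) actions.push("rerunMetadata");
  if (status === "failed") actions.push("retry");
  return actions;
}
```

A "failed" repo is the only status eligible for both mirror and retry, which matches the overlapping filters in the component.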
@@ -1,7 +1,7 @@
 import { useMemo, useRef } from "react";
 import Fuse from "fuse.js";
 import { useVirtualizer } from "@tanstack/react-virtual";
-import { FlipHorizontal, GitFork, RefreshCw, RotateCcw, Star, Lock, Ban, Check, ChevronDown, Trash2 } from "lucide-react";
+import { FlipHorizontal, GitFork, RefreshCw, RotateCcw, Star, Lock, Ban, Check, ChevronDown, Trash2, X } from "lucide-react";
 import { SiGithub, SiGitea } from "react-icons/si";
 import type { Repository } from "@/lib/db/schema";
 import { Button } from "@/components/ui/button";
@@ -42,6 +42,8 @@ interface RepositoryTableProps {
   onSelectionChange: (selectedIds: Set<string>) => void;
   onRefresh?: () => Promise<void>;
   onDelete?: (repoId: string) => void;
+  onApproveSync?: ({ repoId }: { repoId: string }) => Promise<void>;
+  onDismissSync?: ({ repoId }: { repoId: string }) => Promise<void>;
 }
 
 export default function RepositoryTable({
@@ -59,6 +61,8 @@ export default function RepositoryTable({
   onSelectionChange,
   onRefresh,
   onDelete,
+  onApproveSync,
+  onDismissSync,
 }: RepositoryTableProps) {
   const tableParentRef = useRef<HTMLDivElement>(null);
   const { giteaConfig } = useGiteaConfig();
@@ -239,6 +243,7 @@ export default function RepositoryTable({
                     repo.status === 'failed' ? 'bg-red-500/10 text-red-600 hover:bg-red-500/20 dark:text-red-400' :
                     repo.status === 'ignored' ? 'bg-gray-500/10 text-gray-600 hover:bg-gray-500/20 dark:text-gray-400' :
                     repo.status === 'skipped' ? 'bg-orange-500/10 text-orange-600 hover:bg-orange-500/20 dark:text-orange-400' :
+                    repo.status === 'pending-approval' ? 'bg-amber-500/10 text-amber-600 hover:bg-amber-500/20 dark:text-amber-400' :
                     'bg-muted hover:bg-muted/80'}`}
                   variant="secondary"
                 >
@@ -316,7 +321,40 @@ export default function RepositoryTable({
                   )}
                 </Button>
               )}
 
+              {repo.status === "pending-approval" && (
+                <div className="flex gap-2 w-full">
+                  <Button
+                    size="default"
+                    variant="default"
+                    onClick={() => repo.id && onApproveSync?.({ repoId: repo.id })}
+                    disabled={isLoading}
+                    className="flex-1 h-10"
+                  >
+                    {isLoading ? (
+                      <>
+                        <Check className="h-4 w-4 mr-2 animate-spin" />
+                        Approving...
+                      </>
+                    ) : (
+                      <>
+                        <Check className="h-4 w-4 mr-2" />
+                        Approve Sync
+                      </>
+                    )}
+                  </Button>
+                  <Button
+                    size="default"
+                    variant="outline"
+                    onClick={() => repo.id && onDismissSync?.({ repoId: repo.id })}
+                    disabled={isLoading}
+                    className="flex-1 h-10"
+                  >
+                    <X className="h-4 w-4 mr-2" />
+                    Dismiss
+                  </Button>
+                </div>
+              )}
 
               {/* Ignore/Include button */}
               {repo.status === "ignored" ? (
                 <Button
@@ -663,6 +701,7 @@ export default function RepositoryTable({
                       repo.status === 'failed' ? 'bg-red-500/10 text-red-600 hover:bg-red-500/20 dark:text-red-400' :
                       repo.status === 'ignored' ? 'bg-gray-500/10 text-gray-600 hover:bg-gray-500/20 dark:text-gray-400' :
                       repo.status === 'skipped' ? 'bg-orange-500/10 text-orange-600 hover:bg-orange-500/20 dark:text-orange-400' :
+                      repo.status === 'pending-approval' ? 'bg-amber-500/10 text-amber-600 hover:bg-amber-500/20 dark:text-amber-400' :
                       'bg-muted hover:bg-muted/80'}`}
                     variant="secondary"
                   >
@@ -680,6 +719,8 @@ export default function RepositoryTable({
                   onRetry={() => onRetry({ repoId: repo.id ?? "" })}
                   onSkip={(skip) => onSkip({ repoId: repo.id ?? "", skip })}
                   onDelete={onDelete && repo.id ? () => onDelete(repo.id as string) : undefined}
+                  onApproveSync={onApproveSync ? () => onApproveSync({ repoId: repo.id ?? "" }) : undefined}
+                  onDismissSync={onDismissSync ? () => onDismissSync({ repoId: repo.id ?? "" }) : undefined}
                 />
               </div>
               {/* Links */}
@@ -791,6 +832,8 @@ function RepoActionButton({
   onRetry,
   onSkip,
   onDelete,
+  onApproveSync,
+  onDismissSync,
 }: {
   repo: { id: string; status: string };
   isLoading: boolean;
@@ -799,7 +842,36 @@ function RepoActionButton({
   onRetry: () => void;
   onSkip: (skip: boolean) => void;
   onDelete?: () => void;
+  onApproveSync?: () => void;
+  onDismissSync?: () => void;
 }) {
+  // For pending-approval repos, show approve/dismiss actions
+  if (repo.status === "pending-approval") {
+    return (
+      <div className="flex gap-1">
+        <Button
+          variant="default"
+          size="sm"
+          disabled={isLoading}
+          onClick={onApproveSync}
+          className="min-w-[70px]"
+        >
+          <Check className="h-4 w-4 mr-1" />
+          Approve
+        </Button>
+        <Button
+          variant="outline"
+          size="sm"
+          disabled={isLoading}
+          onClick={onDismissSync}
+        >
+          <X className="h-4 w-4 mr-1" />
+          Dismiss
+        </Button>
+      </div>
+    );
+  }
+
   // For ignored repos, show an "Include" action
   if (repo.status === "ignored") {
     return (
@@ -9,6 +9,8 @@ interface ConfigStatus {
   isFullyConfigured: boolean;
   isLoading: boolean;
   error: string | null;
+  autoMirrorStarred: boolean;
+  githubOwner: string;
 }
 
 // Cache to prevent duplicate API calls across components
@@ -33,6 +35,8 @@ export function useConfigStatus(): ConfigStatus {
     isFullyConfigured: false,
     isLoading: true,
     error: null,
+    autoMirrorStarred: false,
+    githubOwner: '',
   });
 
   // Track if this hook has already checked config to prevent multiple calls
@@ -46,6 +50,8 @@ export function useConfigStatus(): ConfigStatus {
         isFullyConfigured: false,
         isLoading: false,
         error: 'No user found',
+        autoMirrorStarred: false,
+        githubOwner: '',
       });
       return;
     }
@@ -78,6 +84,8 @@ export function useConfigStatus(): ConfigStatus {
           isFullyConfigured,
           isLoading: false,
           error: null,
+          autoMirrorStarred: configResponse?.advancedOptions?.autoMirrorStarred ?? false,
+          githubOwner: configResponse?.githubConfig?.username ?? '',
         });
         return;
       }
@@ -119,6 +127,8 @@ export function useConfigStatus(): ConfigStatus {
         isFullyConfigured,
         isLoading: false,
         error: null,
+        autoMirrorStarred: configResponse?.advancedOptions?.autoMirrorStarred ?? false,
+        githubOwner: configResponse?.githubConfig?.username ?? '',
       });
 
       hasCheckedRef.current = true;
@@ -129,6 +139,8 @@ export function useConfigStatus(): ConfigStatus {
         isFullyConfigured: false,
         isLoading: false,
         error: error instanceof Error ? error.message : 'Failed to check configuration',
+        autoMirrorStarred: false,
+        githubOwner: '',
       });
       hasCheckedRef.current = true;
     }
@@ -29,10 +29,18 @@ export const githubConfigSchema = z.object({
   mirrorStrategy: z.enum(["preserve", "single-org", "flat-user", "mixed"]).default("preserve"),
   defaultOrg: z.string().optional(),
   starredCodeOnly: z.boolean().default(false),
+  autoMirrorStarred: z.boolean().default(false),
   skipStarredIssues: z.boolean().optional(), // Deprecated: kept for backward compatibility, use starredCodeOnly instead
   starredDuplicateStrategy: z.enum(["suffix", "prefix", "owner-org"]).default("suffix").optional(),
 });
 
+export const backupStrategyEnum = z.enum([
+  "disabled",
+  "always",
+  "on-force-push",
+  "block-on-force-push",
+]);
+
 export const giteaConfigSchema = z.object({
   url: z.url(),
   externalUrl: z.url().optional(),
@@ -65,7 +73,8 @@ export const giteaConfigSchema = z.object({
   mirrorPullRequests: z.boolean().default(false),
   mirrorLabels: z.boolean().default(false),
   mirrorMilestones: z.boolean().default(false),
-  backupBeforeSync: z.boolean().default(true),
+  backupStrategy: backupStrategyEnum.default("on-force-push"),
+  backupBeforeSync: z.boolean().default(true), // Deprecated: kept for backward compat, use backupStrategy
   backupRetentionCount: z.number().int().min(1).default(20),
   backupDirectory: z.string().optional(),
   blockSyncOnBackupFailure: z.boolean().default(true),
@@ -165,6 +174,7 @@ export const repositorySchema = z.object({
       "syncing",
       "synced",
       "archived",
+      "pending-approval", // Blocked by force-push detection, needs manual approval
     ])
     .default("imported"),
   lastMirrored: z.coerce.date().optional().nullable(),
@@ -196,6 +206,7 @@ export const mirrorJobSchema = z.object({
       "syncing",
       "synced",
       "archived",
+      "pending-approval",
     ])
     .default("imported"),
   message: z.string(),
@@ -22,6 +22,7 @@ interface EnvConfig {
   preserveOrgStructure?: boolean;
   onlyMirrorOrgs?: boolean;
   starredCodeOnly?: boolean;
+  autoMirrorStarred?: boolean;
   starredReposOrg?: string;
   starredReposMode?: 'dedicated-org' | 'preserve-owner';
   mirrorStrategy?: 'preserve' | 'single-org' | 'flat-user' | 'mixed';
@@ -113,6 +114,7 @@ function parseEnvConfig(): EnvConfig {
     preserveOrgStructure: process.env.PRESERVE_ORG_STRUCTURE === 'true',
     onlyMirrorOrgs: process.env.ONLY_MIRROR_ORGS === 'true',
     starredCodeOnly: process.env.SKIP_STARRED_ISSUES === 'true',
+    autoMirrorStarred: process.env.AUTO_MIRROR_STARRED === 'true',
     starredReposOrg: process.env.STARRED_REPOS_ORG,
     starredReposMode: process.env.STARRED_REPOS_MODE as 'dedicated-org' | 'preserve-owner',
     mirrorStrategy: process.env.MIRROR_STRATEGY as 'preserve' | 'single-org' | 'flat-user' | 'mixed',
@@ -264,6 +266,7 @@ export async function initializeConfigFromEnv(): Promise<void> {
       mirrorStrategy,
       defaultOrg: envConfig.gitea.organization || existingConfig?.[0]?.githubConfig?.defaultOrg || 'github-mirrors',
       starredCodeOnly: envConfig.github.starredCodeOnly ?? existingConfig?.[0]?.githubConfig?.starredCodeOnly ?? false,
+      autoMirrorStarred: envConfig.github.autoMirrorStarred ?? existingConfig?.[0]?.githubConfig?.autoMirrorStarred ?? false,
     };
 
     // Build Gitea config
@@ -19,7 +19,12 @@ import {
   createPreSyncBundleBackup,
   shouldCreatePreSyncBackup,
   shouldBlockSyncOnBackupFailure,
+  resolveBackupStrategy,
+  shouldBackupForStrategy,
+  shouldBlockSyncForStrategy,
+  strategyNeedsDetection,
 } from "./repo-backup";
+import { detectForcePush } from "./utils/force-push-detection";
 import {
   parseRepositoryMetadataState,
   serializeRepositoryMetadataState,
@@ -255,9 +260,12 @@ export async function getOrCreateGiteaOrgEnhanced({
 export async function syncGiteaRepoEnhanced({
   config,
   repository,
+  skipForcePushDetection,
 }: {
   config: Partial<Config>;
   repository: Repository;
+  /** When true, skip force-push detection and blocking (used by approve-sync). */
+  skipForcePushDetection?: boolean;
 }, deps?: SyncDependencies): Promise<any> {
   try {
     if (!config.userId || !config.giteaConfig?.url || !config.giteaConfig?.token) {
@@ -318,58 +326,138 @@ export async function syncGiteaRepoEnhanced({
       throw new Error(`Repository ${repository.name} is not a mirror. Cannot sync.`);
     }
 
-    if (shouldCreatePreSyncBackup(config)) {
-      const cloneUrl =
-        repoInfo.clone_url ||
-        `${config.giteaConfig.url.replace(/\/$/, "")}/${repoOwner}/${repository.name}.git`;
-
-      try {
-        const backupResult = await createPreSyncBundleBackup({
-          config,
-          owner: repoOwner,
-          repoName: repository.name,
-          cloneUrl,
-        });
-
-        await createMirrorJob({
-          userId: config.userId,
-          repositoryId: repository.id,
-          repositoryName: repository.name,
-          message: `Snapshot created for ${repository.name}`,
-          details: `Pre-sync snapshot created at ${backupResult.bundlePath}.`,
-          status: "syncing",
-        });
-      } catch (backupError) {
-        const errorMessage =
-          backupError instanceof Error ? backupError.message : String(backupError);
-
-        await createMirrorJob({
-          userId: config.userId,
-          repositoryId: repository.id,
-          repositoryName: repository.name,
-          message: `Snapshot failed for ${repository.name}`,
-          details: `Pre-sync snapshot failed: ${errorMessage}`,
-          status: "failed",
-        });
-
-        if (shouldBlockSyncOnBackupFailure(config)) {
-          await db
-            .update(repositories)
-            .set({
-              status: repoStatusEnum.parse("failed"),
-              updatedAt: new Date(),
-              errorMessage: `Snapshot failed; sync blocked to protect history. ${errorMessage}`,
-            })
-            .where(eq(repositories.id, repository.id!));
-
-          throw new Error(
-            `Snapshot failed; sync blocked to protect history. ${errorMessage}`
-          );
-        }
-
-        console.warn(
-          `[Sync] Snapshot failed for ${repository.name}, continuing because blockSyncOnBackupFailure=false: ${errorMessage}`
-        );
-      }
-    }
+    // ---- Smart backup strategy with force-push detection ----
+    const backupStrategy = resolveBackupStrategy(config);
+    let forcePushDetected = false;
+
+    if (backupStrategy !== "disabled") {
+      // Run force-push detection if the strategy requires it
+      // (skip when called from approve-sync to avoid re-blocking)
+      if (strategyNeedsDetection(backupStrategy) && !skipForcePushDetection) {
+        try {
+          const decryptedGithubToken = decryptedConfig.githubConfig?.token;
+          if (decryptedGithubToken) {
+            const fpOctokit = new Octokit({ auth: decryptedGithubToken });
+            const detectionResult = await detectForcePush({
+              giteaUrl: config.giteaConfig.url,
+              giteaToken: decryptedConfig.giteaConfig.token,
+              giteaOwner: repoOwner,
+              giteaRepo: repository.name,
+              octokit: fpOctokit,
+              githubOwner: repository.owner,
+              githubRepo: repository.name,
+            });
+
+            forcePushDetected = detectionResult.detected;
+
+            if (detectionResult.skipped) {
+              console.log(
+                `[Sync] Force-push detection skipped for ${repository.name}: ${detectionResult.skipReason}`,
+              );
+            } else if (forcePushDetected) {
+              const branchNames = detectionResult.affectedBranches
+                .map((b) => `${b.name} (${b.reason})`)
+                .join(", ");
+              console.warn(
+                `[Sync] Force-push detected on ${repository.name}: ${branchNames}`,
+              );
+            }
+          } else {
+            console.log(
+              `[Sync] Skipping force-push detection for ${repository.name}: no GitHub token`,
+            );
+          }
+        } catch (detectionError) {
+          // Fail-open: detection errors should never block sync
+          console.warn(
+            `[Sync] Force-push detection failed for ${repository.name}, proceeding with sync: ${
+              detectionError instanceof Error ? detectionError.message : String(detectionError)
+            }`,
+          );
+        }
+      }
+
+      // Check if sync should be blocked (block-on-force-push mode)
+      if (shouldBlockSyncForStrategy(backupStrategy, forcePushDetected)) {
+        const branchInfo = `Force-push detected; sync blocked for manual approval.`;
+
+        await db
+          .update(repositories)
+          .set({
+            status: "pending-approval",
+            updatedAt: new Date(),
+            errorMessage: branchInfo,
+          })
+          .where(eq(repositories.id, repository.id!));
+
+        await createMirrorJob({
+          userId: config.userId,
+          repositoryId: repository.id,
+          repositoryName: repository.name,
+          message: `Sync blocked for ${repository.name}: force-push detected`,
+          details: branchInfo,
+          status: "pending-approval",
+        });
+
+        console.warn(`[Sync] Sync blocked for ${repository.name}: pending manual approval`);
+        return { blocked: true, reason: branchInfo };
+      }
+
+      // Create backup if strategy says so
+      if (shouldBackupForStrategy(backupStrategy, forcePushDetected)) {
+        const cloneUrl =
+          repoInfo.clone_url ||
+          `${config.giteaConfig.url.replace(/\/$/, "")}/${repoOwner}/${repository.name}.git`;
+
+        try {
+          const backupResult = await createPreSyncBundleBackup({
+            config,
+            owner: repoOwner,
+            repoName: repository.name,
+            cloneUrl,
+            force: true, // Strategy already decided to backup; skip legacy gate
+          });
+
+          await createMirrorJob({
+            userId: config.userId,
+            repositoryId: repository.id,
+            repositoryName: repository.name,
+            message: `Snapshot created for ${repository.name}`,
+            details: `Pre-sync snapshot created at ${backupResult.bundlePath}.`,
+            status: "syncing",
+          });
+        } catch (backupError) {
+          const errorMessage =
+            backupError instanceof Error ? backupError.message : String(backupError);
+
+          await createMirrorJob({
+            userId: config.userId,
+            repositoryId: repository.id,
+            repositoryName: repository.name,
+            message: `Snapshot failed for ${repository.name}`,
+            details: `Pre-sync snapshot failed: ${errorMessage}`,
+            status: "failed",
+          });
+
+          if (shouldBlockSyncOnBackupFailure(config)) {
+            await db
+              .update(repositories)
+              .set({
+                status: repoStatusEnum.parse("failed"),
+                updatedAt: new Date(),
+                errorMessage: `Snapshot failed; sync blocked to protect history. ${errorMessage}`,
+              })
+              .where(eq(repositories.id, repository.id!));
+
+            throw new Error(
+              `Snapshot failed; sync blocked to protect history. ${errorMessage}`,
+            );
+          }
+
+          console.warn(
+            `[Sync] Snapshot failed for ${repository.name}, continuing because blockSyncOnBackupFailure=false: ${errorMessage}`,
+          );
+        }
+      }
+    }
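The control flow of the new sync path above reduces to three decisions: run detection, block for approval, or take a backup. The following is an illustrative condensation only (the `planSync` helper is hypothetical and not part of the diff; note that a blocked sync returns before any backup is taken):

```typescript
// Hypothetical sketch of the sync-time decision flow: given the resolved
// strategy and the force-push detection result, decide what to do.
type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";

interface SyncPlan {
  runDetection: boolean; // only detection-based strategies probe GitHub
  block: boolean;        // flip repo to "pending-approval" and stop
  backup: boolean;       // take a pre-sync bundle backup
}

function planSync(strategy: BackupStrategy, forcePushDetected: boolean): SyncPlan {
  const runDetection =
    strategy === "on-force-push" || strategy === "block-on-force-push";
  const block = strategy === "block-on-force-push" && forcePushDetected;
  // In the real code the blocked branch returns early, so no backup happens.
  const backup =
    !block && (strategy === "always" || (runDetection && forcePushDetected));
  return { runDetection, block, backup };
}
```

Under this reading, `block-on-force-push` with a detected rewrite blocks without backing up, while `on-force-push` backs up and proceeds.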
@@ -1,7 +1,13 @@
 import path from "node:path";
 import { afterEach, beforeEach, describe, expect, test } from "bun:test";
 import type { Config } from "@/types/config";
-import { resolveBackupPaths } from "@/lib/repo-backup";
+import {
+  resolveBackupPaths,
+  resolveBackupStrategy,
+  shouldBackupForStrategy,
+  shouldBlockSyncForStrategy,
+  strategyNeedsDetection,
+} from "@/lib/repo-backup";
 
 describe("resolveBackupPaths", () => {
   let originalBackupDirEnv: string | undefined;
@@ -113,3 +119,130 @@ describe("resolveBackupPaths", () => {
     );
   });
 });
+
+// ---- Backup strategy resolver tests ----
+
+function makeConfig(overrides: Record<string, any> = {}): Partial<Config> {
+  return {
+    giteaConfig: {
+      url: "https://gitea.example.com",
+      token: "tok",
+      ...overrides,
+    },
+  } as Partial<Config>;
+}
+
+const envKeysToClean = ["PRE_SYNC_BACKUP_STRATEGY", "PRE_SYNC_BACKUP_ENABLED"];
+
+describe("resolveBackupStrategy", () => {
+  let savedEnv: Record<string, string | undefined> = {};
+
+  beforeEach(() => {
+    savedEnv = {};
+    for (const key of envKeysToClean) {
+      savedEnv[key] = process.env[key];
+      delete process.env[key];
+    }
+  });
+
+  afterEach(() => {
+    for (const [key, value] of Object.entries(savedEnv)) {
+      if (value === undefined) {
+        delete process.env[key];
+      } else {
+        process.env[key] = value;
+      }
+    }
+  });
+
+  test("returns explicit backupStrategy when set", () => {
+    expect(resolveBackupStrategy(makeConfig({ backupStrategy: "always" }))).toBe("always");
+    expect(resolveBackupStrategy(makeConfig({ backupStrategy: "disabled" }))).toBe("disabled");
+    expect(resolveBackupStrategy(makeConfig({ backupStrategy: "on-force-push" }))).toBe("on-force-push");
+    expect(resolveBackupStrategy(makeConfig({ backupStrategy: "block-on-force-push" }))).toBe("block-on-force-push");
+  });
+
+  test("maps backupBeforeSync: true → 'always' (backward compat)", () => {
+    expect(resolveBackupStrategy(makeConfig({ backupBeforeSync: true }))).toBe("always");
+  });
+
+  test("maps backupBeforeSync: false → 'disabled' (backward compat)", () => {
+    expect(resolveBackupStrategy(makeConfig({ backupBeforeSync: false }))).toBe("disabled");
+  });
+
+  test("prefers explicit backupStrategy over backupBeforeSync", () => {
+    expect(
+      resolveBackupStrategy(
+        makeConfig({ backupStrategy: "on-force-push", backupBeforeSync: true }),
+      ),
+    ).toBe("on-force-push");
+  });
+
+  test("falls back to PRE_SYNC_BACKUP_STRATEGY env var", () => {
+    process.env.PRE_SYNC_BACKUP_STRATEGY = "block-on-force-push";
+    expect(resolveBackupStrategy(makeConfig({}))).toBe("block-on-force-push");
+  });
+
+  test("falls back to PRE_SYNC_BACKUP_ENABLED env var (legacy)", () => {
+    process.env.PRE_SYNC_BACKUP_ENABLED = "false";
+    expect(resolveBackupStrategy(makeConfig({}))).toBe("disabled");
+  });
+
+  test("defaults to 'on-force-push' when nothing is configured", () => {
+    expect(resolveBackupStrategy(makeConfig({}))).toBe("on-force-push");
+  });
+
+  test("handles empty giteaConfig gracefully", () => {
+    expect(resolveBackupStrategy({})).toBe("on-force-push");
+  });
+});
+
+describe("shouldBackupForStrategy", () => {
+  test("disabled → never backup", () => {
+    expect(shouldBackupForStrategy("disabled", false)).toBe(false);
+    expect(shouldBackupForStrategy("disabled", true)).toBe(false);
+  });
+
+  test("always → always backup", () => {
+    expect(shouldBackupForStrategy("always", false)).toBe(true);
+    expect(shouldBackupForStrategy("always", true)).toBe(true);
+  });
+
+  test("on-force-push → backup only when detected", () => {
+    expect(shouldBackupForStrategy("on-force-push", false)).toBe(false);
+    expect(shouldBackupForStrategy("on-force-push", true)).toBe(true);
+  });
+
+  test("block-on-force-push → backup only when detected", () => {
+    expect(shouldBackupForStrategy("block-on-force-push", false)).toBe(false);
+    expect(shouldBackupForStrategy("block-on-force-push", true)).toBe(true);
+  });
+});
+
+describe("shouldBlockSyncForStrategy", () => {
+  test("only block-on-force-push + detected returns true", () => {
+    expect(shouldBlockSyncForStrategy("block-on-force-push", true)).toBe(true);
+  });
+
+  test("block-on-force-push without detection does not block", () => {
+    expect(shouldBlockSyncForStrategy("block-on-force-push", false)).toBe(false);
+  });
+
+  test("other strategies never block", () => {
+    expect(shouldBlockSyncForStrategy("disabled", true)).toBe(false);
+    expect(shouldBlockSyncForStrategy("always", true)).toBe(false);
+    expect(shouldBlockSyncForStrategy("on-force-push", true)).toBe(false);
+  });
+});
+
+describe("strategyNeedsDetection", () => {
+  test("returns true for detection-based strategies", () => {
+    expect(strategyNeedsDetection("on-force-push")).toBe(true);
+    expect(strategyNeedsDetection("block-on-force-push")).toBe(true);
+  });
+
+  test("returns false for non-detection strategies", () => {
+    expect(strategyNeedsDetection("disabled")).toBe(false);
+    expect(strategyNeedsDetection("always")).toBe(false);
+  });
+});
@@ -1,7 +1,7 @@
 import { mkdir, mkdtemp, readdir, rm, stat } from "node:fs/promises";
 import os from "node:os";
 import path from "node:path";
-import type { Config } from "@/types/config";
+import type { Config, BackupStrategy } from "@/types/config";
 import { decryptConfigTokens } from "./utils/config-encryption";
 
 const TRUE_VALUES = new Set(["1", "true", "yes", "on"]);
@@ -101,6 +101,92 @@ export function shouldBlockSyncOnBackupFailure(config: Partial<Config>): boolean
   return configSetting === undefined ? true : Boolean(configSetting);
 }
 
+// ---- Backup strategy resolver ----
+
+const VALID_STRATEGIES = new Set<BackupStrategy>([
+  "disabled",
+  "always",
+  "on-force-push",
+  "block-on-force-push",
+]);
+
+/**
+ * Resolve the effective backup strategy from config, falling back through:
+ * 1. `backupStrategy` field (new)
+ * 2. `backupBeforeSync` boolean (deprecated, backward compat)
+ * 3. `PRE_SYNC_BACKUP_STRATEGY` env var
+ * 4. `PRE_SYNC_BACKUP_ENABLED` env var (legacy)
+ * 5. Default: `"on-force-push"`
+ */
+export function resolveBackupStrategy(config: Partial<Config>): BackupStrategy {
+  // 1. Explicit backupStrategy field
+  const explicit = config.giteaConfig?.backupStrategy;
+  if (explicit && VALID_STRATEGIES.has(explicit as BackupStrategy)) {
+    return explicit as BackupStrategy;
+  }
+
+  // 2. Legacy backupBeforeSync boolean → map to strategy
+  const legacy = config.giteaConfig?.backupBeforeSync;
+  if (legacy !== undefined) {
+    return legacy ? "always" : "disabled";
+  }
+
+  // 3. Env var (new)
+  const envStrategy = process.env.PRE_SYNC_BACKUP_STRATEGY?.trim().toLowerCase();
+  if (envStrategy && VALID_STRATEGIES.has(envStrategy as BackupStrategy)) {
+    return envStrategy as BackupStrategy;
+  }
+
+  // 4. Env var (legacy)
+  const envEnabled = process.env.PRE_SYNC_BACKUP_ENABLED;
+  if (envEnabled !== undefined) {
+    return parseBoolean(envEnabled, true) ? "always" : "disabled";
+  }
+
+  // 5. Default
+  return "on-force-push";
+}
+
+/**
+ * Determine whether a backup should be created for the given strategy and
+ * force-push detection result.
+ */
+export function shouldBackupForStrategy(
+  strategy: BackupStrategy,
+  forcePushDetected: boolean,
+): boolean {
+  switch (strategy) {
+    case "disabled":
+      return false;
+    case "always":
+      return true;
+    case "on-force-push":
+    case "block-on-force-push":
+      return forcePushDetected;
+    default:
+      return false;
+  }
+}
+
+/**
+ * Determine whether sync should be blocked (requires manual approval).
+ * Only `block-on-force-push` with an actual detection blocks sync.
+ */
+export function shouldBlockSyncForStrategy(
+  strategy: BackupStrategy,
+  forcePushDetected: boolean,
+): boolean {
+  return strategy === "block-on-force-push" && forcePushDetected;
+}
+
+/**
+ * Returns true when the strategy requires running force-push detection
+ * before deciding on backup / block behavior.
+ */
+export function strategyNeedsDetection(strategy: BackupStrategy): boolean {
+  return strategy === "on-force-push" || strategy === "block-on-force-push";
+}
+
 export function resolveBackupPaths({
   config,
   owner,
@@ -136,13 +222,17 @@ export async function createPreSyncBundleBackup({
   owner,
   repoName,
   cloneUrl,
+  force,
 }: {
   config: Partial<Config>;
   owner: string;
   repoName: string;
   cloneUrl: string;
+  /** When true, skip the legacy shouldCreatePreSyncBackup check.
+   * Used by the strategy-driven path which has already decided to backup. */
+  force?: boolean;
 }): Promise<{ bundlePath: string }> {
-  if (!shouldCreatePreSyncBackup(config)) {
+  if (!force && !shouldCreatePreSyncBackup(config)) {
     throw new Error("Pre-sync backup is disabled.");
   }
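The five-step fallback order documented on `resolveBackupStrategy` above can be exercised in isolation. This is a self-contained sketch, not the real implementation: `resolve` takes an explicit `env` map instead of reading `process.env`, and the truthy-string set is inlined from the file's `TRUE_VALUES`:

```typescript
// Hypothetical standalone sketch of the fallback chain:
// config field → legacy boolean → new env var → legacy env var → default.
type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";
const VALID = new Set(["disabled", "always", "on-force-push", "block-on-force-push"]);

function resolve(
  cfg: { backupStrategy?: string; backupBeforeSync?: boolean },
  env: Record<string, string | undefined>,
): BackupStrategy {
  if (cfg.backupStrategy && VALID.has(cfg.backupStrategy)) {
    return cfg.backupStrategy as BackupStrategy;
  }
  if (cfg.backupBeforeSync !== undefined) {
    return cfg.backupBeforeSync ? "always" : "disabled";
  }
  const s = env.PRE_SYNC_BACKUP_STRATEGY?.trim().toLowerCase();
  if (s && VALID.has(s)) return s as BackupStrategy;
  if (env.PRE_SYNC_BACKUP_ENABLED !== undefined) {
    return ["1", "true", "yes", "on"].includes(env.PRE_SYNC_BACKUP_ENABLED.toLowerCase())
      ? "always"
      : "disabled";
  }
  return "on-force-push";
}
```

One consequence worth noting: a persisted legacy `backupBeforeSync` value shadows the environment variables, so existing installs keep their old behavior until the new field is set.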
@@ -79,6 +79,13 @@ async function identifyOrphanedRepositories(config: any): Promise<any[]> {
         return false;
       }
 
+      // If starred repos are not being fetched from GitHub, we can't determine
+      // if a starred repo is orphaned - skip it to prevent data loss
+      if (repo.isStarred && !config.githubConfig?.includeStarred) {
+        console.log(`[Repository Cleanup] Skipping starred repo ${repo.fullName} - starred repos not being fetched from GitHub`);
+        return false;
+      }
+
       const githubRepo = githubReposByFullName.get(repo.fullName);
       if (!githubRepo) {
         return true;
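The cleanup guard added above boils down to one predicate: absence from the fetched GitHub list only proves orphanhood when that list could actually contain the repo. A sketch with assumed field names (`isOrphaned` is hypothetical, not the project's function):

```typescript
// Hypothetical sketch: a starred repo only counts as orphaned when starred
// repos are actually being fetched, so absence from the GitHub list is meaningful.
interface RepoRow { fullName: string; isStarred: boolean; }

function isOrphaned(
  repo: RepoRow,
  githubFullNames: Set<string>,
  includeStarred: boolean,
): boolean {
  if (repo.isStarred && !includeStarred) return false; // can't tell; avoid data loss
  return !githubFullNames.has(repo.fullName);
}
```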
@@ -13,6 +13,7 @@ import type { Repository } from '@/lib/db/schema';
 import { repoStatusEnum, repositoryVisibilityEnum } from '@/types/Repository';
 import { mergeGitReposPreferStarred, normalizeGitRepoToInsert, calcBatchSizeForInsert } from '@/lib/repo-utils';
 import { isMirrorableGitHubRepo } from '@/lib/repo-eligibility';
+import { createMirrorJob } from '@/lib/helpers';
 
 let schedulerInterval: NodeJS.Timeout | null = null;
 let isSchedulerRunning = false;
@@ -128,6 +129,19 @@ async function runScheduledSync(config: any): Promise<void> {
           .onConflictDoNothing({ target: [repositories.userId, repositories.normalizedFullName] });
       }
       console.log(`[Scheduler] Successfully imported ${newRepos.length} new repositories for user ${userId}`);
+
+      // Log activity for each newly imported repo
+      for (const repo of newRepos) {
+        const sourceLabel = repo.isStarred ? 'starred' : 'owned';
+        await createMirrorJob({
+          userId,
+          repositoryName: repo.fullName,
+          message: `Auto-imported ${sourceLabel} repository: ${repo.fullName}`,
+          details: `Repository ${repo.fullName} was discovered and imported during scheduled sync.`,
+          status: 'imported',
+          skipDuplicateEvent: true,
+        });
+      }
     } else {
       console.log(`[Scheduler] No new repositories found for user ${userId}`);
     }
@@ -176,7 +190,7 @@ async function runScheduledSync(config: any): Promise<void> {
     if (scheduleConfig.autoMirror) {
       try {
         console.log(`[Scheduler] Auto-mirror enabled - checking for repositories to mirror for user ${userId}...`);
-        const reposNeedingMirror = await db
+        let reposNeedingMirror = await db
           .select()
           .from(repositories)
           .where(
@@ -190,6 +204,19 @@ async function runScheduledSync(config: any): Promise<void> {
             )
           );
 
+        // Filter out starred repos from auto-mirror when autoMirrorStarred is disabled
+        if (!config.githubConfig?.autoMirrorStarred) {
+          const githubOwner = config.githubConfig?.owner || '';
+          const beforeCount = reposNeedingMirror.length;
+          reposNeedingMirror = reposNeedingMirror.filter(
+            repo => !repo.isStarred || repo.owner === githubOwner
+          );
+          const skippedCount = beforeCount - reposNeedingMirror.length;
+          if (skippedCount > 0) {
+            console.log(`[Scheduler] Skipped ${skippedCount} starred repositories from auto-mirror (autoMirrorStarred is disabled)`);
+          }
+        }
+
         if (reposNeedingMirror.length > 0) {
           console.log(`[Scheduler] Found ${reposNeedingMirror.length} repositories that need initial mirroring`);
 
@@ -280,11 +307,29 @@ async function runScheduledSync(config: any): Promise<void> {
         });
       }
 
+      // Log pending-approval repos that are excluded from sync
+      try {
+        const pendingApprovalRepos = await db
+          .select({ id: repositories.id })
+          .from(repositories)
+          .where(
+            and(
+              eq(repositories.userId, userId),
+              eq(repositories.status, 'pending-approval')
+            )
+          );
+        if (pendingApprovalRepos.length > 0) {
+          console.log(`[Scheduler] ${pendingApprovalRepos.length} repositories pending approval (force-push detected) for user ${userId} — skipping sync for those`);
+        }
+      } catch {
+        // Non-critical logging, ignore errors
+      }
+
if (reposToSync.length === 0) {
|
||||
console.log(`[Scheduler] No repositories to sync for user ${userId}`);
|
||||
return;
|
||||
}
|
||||
|
||||
|
||||
console.log(`[Scheduler] Syncing ${reposToSync.length} repositories for user ${userId}`);
|
||||
|
||||
// Process repositories in batches
|
||||
@@ -466,6 +511,19 @@ async function performInitialAutoStart(): Promise<void> {
|
||||
.onConflictDoNothing({ target: [repositories.userId, repositories.normalizedFullName] });
|
||||
}
|
||||
console.log(`[Scheduler] Successfully imported ${reposToImport.length} repositories`);
|
||||
|
||||
// Log activity for each newly imported repo
|
||||
for (const repo of reposToImport) {
|
||||
const sourceLabel = repo.isStarred ? 'starred' : 'owned';
|
||||
await createMirrorJob({
|
||||
userId: config.userId,
|
||||
repositoryName: repo.fullName,
|
||||
message: `Auto-imported ${sourceLabel} repository: ${repo.fullName}`,
|
||||
details: `Repository ${repo.fullName} was discovered and imported during auto-start.`,
|
||||
status: 'imported',
|
||||
skipDuplicateEvent: true,
|
||||
});
|
||||
}
|
||||
} else {
|
||||
console.log(`[Scheduler] No new repositories to import for user ${config.userId}`);
|
||||
}
|
||||
@@ -473,7 +531,7 @@ async function performInitialAutoStart(): Promise<void> {
|
||||
if (skippedDisabledCount > 0) {
|
||||
console.log(`[Scheduler] Skipped ${skippedDisabledCount} disabled GitHub repositories for user ${config.userId}`);
|
||||
}
|
||||
|
||||
|
||||
// Check if we already have mirrored repositories (indicating this isn't first run)
|
||||
const mirroredRepos = await db
|
||||
.select()
|
||||
@@ -516,8 +574,34 @@ async function performInitialAutoStart(): Promise<void> {
|
||||
}
|
||||
|
||||
// Step 2: Trigger mirror for all repositories that need mirroring
|
||||
// Only auto-mirror if autoMirror is enabled in schedule config
|
||||
if (!config.scheduleConfig?.autoMirror) {
|
||||
console.log(`[Scheduler] Step 2: Skipping initial mirror - autoMirror is disabled for user ${config.userId}`);
|
||||
|
||||
// Still update schedule config timestamps
|
||||
const currentTime2 = new Date();
|
||||
const intervalSource2 = config.scheduleConfig?.interval ||
|
||||
config.giteaConfig?.mirrorInterval ||
|
||||
'8h';
|
||||
const interval2 = parseScheduleInterval(intervalSource2);
|
||||
const nextRun2 = new Date(currentTime2.getTime() + interval2);
|
||||
|
||||
await db.update(configs).set({
|
||||
scheduleConfig: {
|
||||
...config.scheduleConfig,
|
||||
enabled: true,
|
||||
lastRun: currentTime2,
|
||||
nextRun: nextRun2,
|
||||
},
|
||||
updatedAt: currentTime2,
|
||||
}).where(eq(configs.id, config.id));
|
||||
|
||||
console.log(`[Scheduler] Scheduling enabled for user ${config.userId}, next sync at ${nextRun2.toISOString()}`);
|
||||
continue;
|
||||
}
|
||||
|
||||
console.log(`[Scheduler] Step 2: Triggering mirror for repositories that need mirroring...`);
|
||||
const reposNeedingMirror = await db
|
||||
let reposNeedingMirror = await db
|
||||
.select()
|
||||
.from(repositories)
|
||||
.where(
|
||||
@@ -530,7 +614,20 @@ async function performInitialAutoStart(): Promise<void> {
|
||||
)
|
||||
)
|
||||
);
|
||||
|
||||
|
||||
// Filter out starred repos from auto-mirror when autoMirrorStarred is disabled
|
||||
if (!config.githubConfig?.autoMirrorStarred) {
|
||||
const githubOwner = config.githubConfig?.owner || '';
|
||||
const beforeCount = reposNeedingMirror.length;
|
||||
reposNeedingMirror = reposNeedingMirror.filter(
|
||||
repo => !repo.isStarred || repo.owner === githubOwner
|
||||
);
|
||||
const skippedCount = beforeCount - reposNeedingMirror.length;
|
||||
if (skippedCount > 0) {
|
||||
console.log(`[Scheduler] Skipped ${skippedCount} starred repositories from initial auto-mirror (autoMirrorStarred is disabled)`);
|
||||
}
|
||||
}
|
||||
|
||||
if (reposNeedingMirror.length > 0) {
|
||||
console.log(`[Scheduler] Found ${reposNeedingMirror.length} repositories that need mirroring`);
|
||||
|
||||
|
||||
@@ -280,6 +280,8 @@ export const getStatusColor = (status: string): string => {
|
||||
return "bg-orange-500"; // Deleting
|
||||
case "deleted":
|
||||
return "bg-gray-600"; // Deleted
|
||||
case "pending-approval":
|
||||
return "bg-amber-500"; // Needs manual approval
|
||||
default:
|
||||
return "bg-gray-400"; // Unknown/neutral
|
||||
}
|
||||
|
||||
@@ -93,7 +93,8 @@ export async function createDefaultConfig({ userId, envOverrides = {} }: Default
|
||||
forkStrategy: "reference",
|
||||
issueConcurrency: 3,
|
||||
pullRequestConcurrency: 5,
|
||||
backupBeforeSync: true,
|
||||
backupStrategy: "on-force-push",
|
||||
backupBeforeSync: true, // Deprecated: kept for backward compat
|
||||
backupRetentionCount: 20,
|
||||
backupDirectory: "data/repo-backups",
|
||||
blockSyncOnBackupFailure: true,
|
||||
|
||||
@@ -56,6 +56,7 @@ export function mapUiToDbConfig(
|
||||
|
||||
// Advanced options
|
||||
starredCodeOnly: advancedOptions.starredCodeOnly,
|
||||
autoMirrorStarred: advancedOptions.autoMirrorStarred ?? false,
|
||||
};
|
||||
|
||||
// Map Gitea config to match database schema
|
||||
@@ -100,6 +101,7 @@ export function mapUiToDbConfig(
|
||||
mirrorPullRequests: mirrorOptions.mirrorMetadata && mirrorOptions.metadataComponents.pullRequests,
|
||||
mirrorLabels: mirrorOptions.mirrorMetadata && mirrorOptions.metadataComponents.labels,
|
||||
mirrorMilestones: mirrorOptions.mirrorMetadata && mirrorOptions.metadataComponents.milestones,
|
||||
backupStrategy: giteaConfig.backupStrategy,
|
||||
backupBeforeSync: giteaConfig.backupBeforeSync ?? true,
|
||||
backupRetentionCount: giteaConfig.backupRetentionCount ?? 20,
|
||||
backupDirectory: giteaConfig.backupDirectory?.trim() || undefined,
|
||||
@@ -144,6 +146,7 @@ export function mapDbToUiConfig(dbConfig: any): {
|
||||
personalReposOrg: undefined, // Not stored in current schema
|
||||
issueConcurrency: dbConfig.giteaConfig?.issueConcurrency ?? 3,
|
||||
pullRequestConcurrency: dbConfig.giteaConfig?.pullRequestConcurrency ?? 5,
|
||||
backupStrategy: dbConfig.giteaConfig?.backupStrategy || undefined,
|
||||
backupBeforeSync: dbConfig.giteaConfig?.backupBeforeSync ?? true,
|
||||
backupRetentionCount: dbConfig.giteaConfig?.backupRetentionCount ?? 20,
|
||||
backupDirectory: dbConfig.giteaConfig?.backupDirectory || "data/repo-backups",
|
||||
@@ -170,6 +173,7 @@ export function mapDbToUiConfig(dbConfig: any): {
|
||||
skipForks: !(dbConfig.githubConfig?.includeForks ?? true), // Invert includeForks to get skipForks
|
||||
// Support both old (skipStarredIssues) and new (starredCodeOnly) field names for backward compatibility
|
||||
starredCodeOnly: dbConfig.githubConfig?.starredCodeOnly ?? (dbConfig.githubConfig as any)?.skipStarredIssues ?? false,
|
||||
autoMirrorStarred: dbConfig.githubConfig?.autoMirrorStarred ?? false,
|
||||
};
|
||||
|
||||
return {
|
||||
|
||||
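The starred-repo filter added in both hunks above is a pure predicate over repository rows. A minimal standalone sketch of the same rule (the `RepoLite` shape and `filterAutoMirror` name are illustrative, not part of the codebase, which filters Drizzle rows inline):

```typescript
// Illustrative shape; the scheduler operates on full repository rows.
interface RepoLite {
  fullName: string;
  owner: string;
  isStarred: boolean;
}

// Keep non-starred repos unconditionally; keep starred repos only when they
// belong to the configured GitHub owner. This mirrors the filter the
// scheduler applies when autoMirrorStarred is disabled.
function filterAutoMirror(repos: RepoLite[], githubOwner: string): RepoLite[] {
  return repos.filter((r) => !r.isStarred || r.owner === githubOwner);
}
```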
319 src/lib/utils/force-push-detection.test.ts Normal file
@@ -0,0 +1,319 @@
import { describe, expect, it, mock } from "bun:test";
import {
  detectForcePush,
  fetchGitHubBranches,
  checkAncestry,
  type BranchInfo,
} from "./force-push-detection";

// ---- Helpers ----

function makeOctokit(overrides: Record<string, any> = {}) {
  return {
    repos: {
      listBranches: mock(() => Promise.resolve({ data: [] })),
      compareCommits: mock(() =>
        Promise.resolve({ data: { status: "ahead" } }),
      ),
      ...overrides.repos,
    },
    paginate: mock(async (_method: any, _params: any) => {
      // Default: return whatever the test wired into _githubBranches
      return overrides._githubBranches ?? [];
    }),
    ...overrides,
  } as any;
}

// ---- fetchGitHubBranches ----

describe("fetchGitHubBranches", () => {
  it("maps Octokit paginated response to BranchInfo[]", async () => {
    const octokit = makeOctokit({
      _githubBranches: [
        { name: "main", commit: { sha: "aaa" } },
        { name: "dev", commit: { sha: "bbb" } },
      ],
    });

    const result = await fetchGitHubBranches({
      octokit,
      owner: "user",
      repo: "repo",
    });

    expect(result).toEqual([
      { name: "main", sha: "aaa" },
      { name: "dev", sha: "bbb" },
    ]);
  });
});

// ---- checkAncestry ----

describe("checkAncestry", () => {
  it("returns true for fast-forward (ahead)", async () => {
    const octokit = makeOctokit({
      repos: {
        compareCommits: mock(() =>
          Promise.resolve({ data: { status: "ahead" } }),
        ),
      },
    });

    const result = await checkAncestry({
      octokit,
      owner: "user",
      repo: "repo",
      baseSha: "old",
      headSha: "new",
    });

    expect(result).toBe(true);
  });

  it("returns true for identical", async () => {
    const octokit = makeOctokit({
      repos: {
        compareCommits: mock(() =>
          Promise.resolve({ data: { status: "identical" } }),
        ),
      },
    });

    const result = await checkAncestry({
      octokit,
      owner: "user",
      repo: "repo",
      baseSha: "same",
      headSha: "same",
    });

    expect(result).toBe(true);
  });

  it("returns false for diverged", async () => {
    const octokit = makeOctokit({
      repos: {
        compareCommits: mock(() =>
          Promise.resolve({ data: { status: "diverged" } }),
        ),
      },
    });

    const result = await checkAncestry({
      octokit,
      owner: "user",
      repo: "repo",
      baseSha: "old",
      headSha: "new",
    });

    expect(result).toBe(false);
  });

  it("returns false when API returns 404 (old SHA gone)", async () => {
    const error404 = Object.assign(new Error("Not Found"), { status: 404 });
    const octokit = makeOctokit({
      repos: {
        compareCommits: mock(() => Promise.reject(error404)),
      },
    });

    const result = await checkAncestry({
      octokit,
      owner: "user",
      repo: "repo",
      baseSha: "gone",
      headSha: "new",
    });

    expect(result).toBe(false);
  });

  it("throws on transient errors (fail-open for caller)", async () => {
    const error500 = Object.assign(new Error("Internal Server Error"), { status: 500 });
    const octokit = makeOctokit({
      repos: {
        compareCommits: mock(() => Promise.reject(error500)),
      },
    });

    await expect(
      checkAncestry({
        octokit,
        owner: "user",
        repo: "repo",
        baseSha: "old",
        headSha: "new",
      }),
    ).rejects.toThrow("Internal Server Error");
  });
});

// ---- detectForcePush ----
// Uses _deps injection to avoid fragile global fetch mocking.

describe("detectForcePush", () => {
  const baseArgs = {
    giteaUrl: "https://gitea.example.com",
    giteaToken: "tok",
    giteaOwner: "org",
    giteaRepo: "repo",
    githubOwner: "user",
    githubRepo: "repo",
  };

  function makeDeps(overrides: {
    giteaBranches?: BranchInfo[] | Error;
    githubBranches?: BranchInfo[] | Error;
    ancestryResult?: boolean;
  } = {}) {
    return {
      fetchGiteaBranches: mock(async () => {
        if (overrides.giteaBranches instanceof Error) throw overrides.giteaBranches;
        return overrides.giteaBranches ?? [];
      }) as any,
      fetchGitHubBranches: mock(async () => {
        if (overrides.githubBranches instanceof Error) throw overrides.githubBranches;
        return overrides.githubBranches ?? [];
      }) as any,
      checkAncestry: mock(async () => overrides.ancestryResult ?? true) as any,
    };
  }

  const dummyOctokit = {} as any;

  it("skips when Gitea has no branches (first mirror)", async () => {
    const deps = makeDeps({ giteaBranches: [] });
    const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });

    expect(result.detected).toBe(false);
    expect(result.skipped).toBe(true);
    expect(result.skipReason).toContain("No Gitea branches");
  });

  it("returns no detection when all SHAs match", async () => {
    const deps = makeDeps({
      giteaBranches: [
        { name: "main", sha: "aaa" },
        { name: "dev", sha: "bbb" },
      ],
      githubBranches: [
        { name: "main", sha: "aaa" },
        { name: "dev", sha: "bbb" },
      ],
    });

    const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });

    expect(result.detected).toBe(false);
    expect(result.skipped).toBe(false);
    expect(result.affectedBranches).toHaveLength(0);
  });

  it("detects deleted branch", async () => {
    const deps = makeDeps({
      giteaBranches: [
        { name: "main", sha: "aaa" },
        { name: "old-branch", sha: "ccc" },
      ],
      githubBranches: [{ name: "main", sha: "aaa" }],
    });

    const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });

    expect(result.detected).toBe(true);
    expect(result.affectedBranches).toHaveLength(1);
    expect(result.affectedBranches[0]).toEqual({
      name: "old-branch",
      reason: "deleted",
      giteaSha: "ccc",
      githubSha: null,
    });
  });

  it("returns no detection for fast-forward", async () => {
    const deps = makeDeps({
      giteaBranches: [{ name: "main", sha: "old-sha" }],
      githubBranches: [{ name: "main", sha: "new-sha" }],
      ancestryResult: true, // fast-forward
    });

    const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });

    expect(result.detected).toBe(false);
    expect(result.affectedBranches).toHaveLength(0);
  });

  it("detects diverged branch", async () => {
    const deps = makeDeps({
      giteaBranches: [{ name: "main", sha: "old-sha" }],
      githubBranches: [{ name: "main", sha: "rewritten-sha" }],
      ancestryResult: false, // diverged
    });

    const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });

    expect(result.detected).toBe(true);
    expect(result.affectedBranches).toHaveLength(1);
    expect(result.affectedBranches[0]).toEqual({
      name: "main",
      reason: "diverged",
      giteaSha: "old-sha",
      githubSha: "rewritten-sha",
    });
  });

  it("detects force-push when ancestry check fails (old SHA gone)", async () => {
    const deps = makeDeps({
      giteaBranches: [{ name: "main", sha: "old-sha" }],
      githubBranches: [{ name: "main", sha: "new-sha" }],
      ancestryResult: false, // checkAncestry returns false on error
    });

    const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });

    expect(result.detected).toBe(true);
    expect(result.affectedBranches).toHaveLength(1);
    expect(result.affectedBranches[0].reason).toBe("diverged");
  });

  it("skips when Gitea API returns 404", async () => {
    const { HttpError } = await import("@/lib/http-client");
    const deps = makeDeps({
      giteaBranches: new HttpError("not found", 404, "Not Found"),
    });

    const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });

    expect(result.detected).toBe(false);
    expect(result.skipped).toBe(true);
    expect(result.skipReason).toContain("not found");
  });

  it("skips when Gitea API returns server error", async () => {
    const deps = makeDeps({
      giteaBranches: new Error("HTTP 500: internal error"),
    });

    const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });

    expect(result.detected).toBe(false);
    expect(result.skipped).toBe(true);
    expect(result.skipReason).toContain("Failed to fetch Gitea branches");
  });

  it("skips when GitHub API fails", async () => {
    const deps = makeDeps({
      giteaBranches: [{ name: "main", sha: "aaa" }],
      githubBranches: new Error("rate limited"),
    });

    const result = await detectForcePush({ ...baseArgs, octokit: dummyOctokit, _deps: deps });

    expect(result.detected).toBe(false);
    expect(result.skipped).toBe(true);
    expect(result.skipReason).toContain("Failed to fetch GitHub branches");
  });
});
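The tests above rely on the `_deps` parameter to swap in fakes instead of patching globals. A hypothetical, self-contained sketch of that injection pattern (the `listBranchNames` function and `BranchFetcher` type are illustrative, not from this module):

```typescript
// A production function defaults to its real collaborators; tests pass a
// `_deps` bag of fakes instead of mocking module internals or globals.
type BranchFetcher = () => Promise<string[]>;

async function listBranchNames(_deps?: { fetch: BranchFetcher }): Promise<string[]> {
  // In production callers omit _deps and the default (real) fetcher runs.
  const deps = _deps ?? { fetch: async () => ["main"] };
  return deps.fetch();
}
```

A test then injects a stub with `listBranchNames({ fetch: async () => ["feature-x"] })`, keeping the production call sites unchanged.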
286 src/lib/utils/force-push-detection.ts Normal file
@@ -0,0 +1,286 @@
/**
 * Force-push detection module.
 *
 * Compares branch SHAs between a Gitea mirror and GitHub source to detect
 * branches that were deleted, rewritten, or force-pushed.
 *
 * **Fail-open**: If detection itself fails (API errors, rate limits, etc.),
 * the result indicates no force-push so sync proceeds normally. Detection
 * should never block sync due to its own failure.
 */

import type { Octokit } from "@octokit/rest";
import { httpGet, HttpError } from "@/lib/http-client";

// ---- Types ----

export interface BranchInfo {
  name: string;
  sha: string;
}

export type ForcePushReason = "deleted" | "diverged" | "non-fast-forward";

export interface AffectedBranch {
  name: string;
  reason: ForcePushReason;
  giteaSha: string;
  githubSha: string | null; // null when branch was deleted
}

export interface ForcePushDetectionResult {
  detected: boolean;
  affectedBranches: AffectedBranch[];
  /** True when detection could not run (API error, etc.) */
  skipped: boolean;
  skipReason?: string;
}

const NO_FORCE_PUSH: ForcePushDetectionResult = {
  detected: false,
  affectedBranches: [],
  skipped: false,
};

function skippedResult(reason: string): ForcePushDetectionResult {
  return {
    detected: false,
    affectedBranches: [],
    skipped: true,
    skipReason: reason,
  };
}

// ---- Branch fetching ----

/**
 * Fetch all branches from a Gitea repository (paginated).
 */
export async function fetchGiteaBranches({
  giteaUrl,
  giteaToken,
  owner,
  repo,
}: {
  giteaUrl: string;
  giteaToken: string;
  owner: string;
  repo: string;
}): Promise<BranchInfo[]> {
  const branches: BranchInfo[] = [];
  let page = 1;
  const perPage = 50;

  while (true) {
    const url = `${giteaUrl}/api/v1/repos/${owner}/${repo}/branches?page=${page}&limit=${perPage}`;
    const response = await httpGet<Array<{ name: string; commit: { id: string } }>>(
      url,
      { Authorization: `token ${giteaToken}` },
    );

    if (!Array.isArray(response.data) || response.data.length === 0) break;

    for (const b of response.data) {
      branches.push({ name: b.name, sha: b.commit.id });
    }

    if (response.data.length < perPage) break;
    page++;
  }

  return branches;
}

/**
 * Fetch all branches from a GitHub repository (paginated via Octokit).
 */
export async function fetchGitHubBranches({
  octokit,
  owner,
  repo,
}: {
  octokit: Octokit;
  owner: string;
  repo: string;
}): Promise<BranchInfo[]> {
  const data = await octokit.paginate(octokit.repos.listBranches, {
    owner,
    repo,
    per_page: 100,
  });

  return data.map((b) => ({ name: b.name, sha: b.commit.sha }));
}

/**
 * Check whether the transition from `baseSha` to `headSha` on the same branch
 * is a fast-forward (i.e. `baseSha` is an ancestor of `headSha`).
 *
 * Returns `true` when the change is safe (fast-forward) and `false` when it
 * is a confirmed force-push (404 = old SHA garbage-collected from GitHub).
 *
 * Throws on transient errors (rate limits, network issues) so the caller
 * can decide how to handle them (fail-open: skip that branch).
 */
export async function checkAncestry({
  octokit,
  owner,
  repo,
  baseSha,
  headSha,
}: {
  octokit: Octokit;
  owner: string;
  repo: string;
  baseSha: string;
  headSha: string;
}): Promise<boolean> {
  try {
    const { data } = await octokit.repos.compareCommits({
      owner,
      repo,
      base: baseSha,
      head: headSha,
    });
    // "ahead" means headSha is strictly ahead of baseSha → fast-forward.
    // "behind" or "diverged" means the branch was rewritten.
    return data.status === "ahead" || data.status === "identical";
  } catch (error: any) {
    // 404 / 422 = old SHA no longer exists on GitHub → confirmed force-push.
    if (error?.status === 404 || error?.status === 422) {
      return false;
    }
    // Any other error (rate limit, network) → rethrow so caller can
    // handle it as fail-open (skip branch) rather than false-positive.
    throw error;
  }
}

// ---- Main detection ----

/**
 * Compare branch SHAs between Gitea and GitHub to detect force-pushes.
 *
 * The function is intentionally fail-open: any error during detection returns
 * a "skipped" result so that sync can proceed normally.
 */
export async function detectForcePush({
  giteaUrl,
  giteaToken,
  giteaOwner,
  giteaRepo,
  octokit,
  githubOwner,
  githubRepo,
  _deps,
}: {
  giteaUrl: string;
  giteaToken: string;
  giteaOwner: string;
  giteaRepo: string;
  octokit: Octokit;
  githubOwner: string;
  githubRepo: string;
  /** @internal — test-only dependency injection */
  _deps?: {
    fetchGiteaBranches: typeof fetchGiteaBranches;
    fetchGitHubBranches: typeof fetchGitHubBranches;
    checkAncestry: typeof checkAncestry;
  };
}): Promise<ForcePushDetectionResult> {
  const deps = _deps ?? { fetchGiteaBranches, fetchGitHubBranches, checkAncestry };

  // 1. Fetch Gitea branches
  let giteaBranches: BranchInfo[];
  try {
    giteaBranches = await deps.fetchGiteaBranches({
      giteaUrl,
      giteaToken,
      owner: giteaOwner,
      repo: giteaRepo,
    });
  } catch (error) {
    // Gitea 404 = repo not yet mirrored, skip detection
    if (error instanceof HttpError && error.status === 404) {
      return skippedResult("Gitea repository not found (first mirror?)");
    }
    return skippedResult(
      `Failed to fetch Gitea branches: ${error instanceof Error ? error.message : String(error)}`,
    );
  }

  // First-time mirror: no Gitea branches → nothing to compare
  if (giteaBranches.length === 0) {
    return skippedResult("No Gitea branches found (first mirror?)");
  }

  // 2. Fetch GitHub branches
  let githubBranches: BranchInfo[];
  try {
    githubBranches = await deps.fetchGitHubBranches({
      octokit,
      owner: githubOwner,
      repo: githubRepo,
    });
  } catch (error) {
    return skippedResult(
      `Failed to fetch GitHub branches: ${error instanceof Error ? error.message : String(error)}`,
    );
  }

  const githubBranchMap = new Map(githubBranches.map((b) => [b.name, b.sha]));

  // 3. Compare each Gitea branch against GitHub
  const affected: AffectedBranch[] = [];

  for (const giteaBranch of giteaBranches) {
    const githubSha = githubBranchMap.get(giteaBranch.name);

    if (githubSha === undefined) {
      // Branch was deleted on GitHub
      affected.push({
        name: giteaBranch.name,
        reason: "deleted",
        giteaSha: giteaBranch.sha,
        githubSha: null,
      });
      continue;
    }

    // Same SHA → no change
    if (githubSha === giteaBranch.sha) continue;

    // SHAs differ → check if it's a fast-forward
    try {
      const isFastForward = await deps.checkAncestry({
        octokit,
        owner: githubOwner,
        repo: githubRepo,
        baseSha: giteaBranch.sha,
        headSha: githubSha,
      });

      if (!isFastForward) {
        affected.push({
          name: giteaBranch.name,
          reason: "diverged",
          giteaSha: giteaBranch.sha,
          githubSha,
        });
      }
    } catch {
      // Individual branch check failure → skip that branch (fail-open)
      continue;
    }
  }

  if (affected.length === 0) {
    return NO_FORCE_PUSH;
  }

  return {
    detected: true,
    affectedBranches: affected,
    skipped: false,
  };
}
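The ancestry check in the module above reduces GitHub's compare status to a single "is this a fast-forward?" boolean. That mapping can be sketched in isolation (the `isFastForwardStatus` helper is illustrative, not an export of the module):

```typescript
// The status values GitHub's compare API reports between two commits.
type CompareStatus = "ahead" | "identical" | "behind" | "diverged";

// "ahead" / "identical" → the old SHA is an ancestor of the new one, so the
// update is a safe fast-forward. "behind" / "diverged" → history was
// rewritten, which is treated as a force-push.
function isFastForwardStatus(status: CompareStatus): boolean {
  return status === "ahead" || status === "identical";
}
```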
202 src/pages/api/job/approve-sync.ts Normal file
@@ -0,0 +1,202 @@
import type { APIRoute } from "astro";
import { db, configs, repositories } from "@/lib/db";
import { and, eq, inArray } from "drizzle-orm";
import { repositoryVisibilityEnum, repoStatusEnum } from "@/types/Repository";
import { syncGiteaRepoEnhanced } from "@/lib/gitea-enhanced";
import { createSecureErrorResponse } from "@/lib/utils";
import { requireAuthenticatedUserId } from "@/lib/auth-guards";
import { createPreSyncBundleBackup } from "@/lib/repo-backup";
import { decryptConfigTokens } from "@/lib/utils/config-encryption";
import type { Config } from "@/types/config";
import { createMirrorJob } from "@/lib/helpers";

interface ApproveSyncRequest {
  repositoryIds: string[];
  action: "approve" | "dismiss";
}

export const POST: APIRoute = async ({ request, locals }) => {
  try {
    const authResult = await requireAuthenticatedUserId({ request, locals });
    if ("response" in authResult) return authResult.response;
    const userId = authResult.userId;

    const body: ApproveSyncRequest = await request.json();
    const { repositoryIds, action } = body;

    if (!repositoryIds || !Array.isArray(repositoryIds) || repositoryIds.length === 0) {
      return new Response(
        JSON.stringify({ success: false, message: "repositoryIds are required." }),
        { status: 400, headers: { "Content-Type": "application/json" } },
      );
    }

    if (action !== "approve" && action !== "dismiss") {
      return new Response(
        JSON.stringify({ success: false, message: "action must be 'approve' or 'dismiss'." }),
        { status: 400, headers: { "Content-Type": "application/json" } },
      );
    }

    // Fetch config
    const configResult = await db
      .select()
      .from(configs)
      .where(eq(configs.userId, userId))
      .limit(1);

    const config = configResult[0];
    if (!config) {
      return new Response(
        JSON.stringify({ success: false, message: "No configuration found." }),
        { status: 400, headers: { "Content-Type": "application/json" } },
      );
    }

    // Fetch repos — only those in pending-approval status
    const repos = await db
      .select()
      .from(repositories)
      .where(
        and(
          eq(repositories.userId, userId),
          eq(repositories.status, "pending-approval"),
          inArray(repositories.id, repositoryIds),
        ),
      );

    if (!repos.length) {
      return new Response(
        JSON.stringify({ success: false, message: "No pending-approval repositories found for the given IDs." }),
        { status: 404, headers: { "Content-Type": "application/json" } },
      );
    }

    if (action === "dismiss") {
      // Reset status to "synced" so repos resume normal schedule
      for (const repo of repos) {
        await db
          .update(repositories)
          .set({
            status: "synced",
            errorMessage: null,
            updatedAt: new Date(),
          })
          .where(eq(repositories.id, repo.id));

        await createMirrorJob({
          userId,
          repositoryId: repo.id,
          repositoryName: repo.name,
          message: `Force-push alert dismissed for ${repo.name}`,
          details: "User dismissed the force-push alert. Repository will resume normal sync schedule.",
          status: "synced",
        });
      }

      return new Response(
        JSON.stringify({
          success: true,
          message: `Dismissed ${repos.length} repository alert(s).`,
          repositories: repos.map((repo) => ({
            ...repo,
            status: "synced",
            errorMessage: null,
          })),
        }),
        { status: 200, headers: { "Content-Type": "application/json" } },
      );
    }

    // action === "approve": create backup first (safety), then trigger sync
    const decryptedConfig = decryptConfigTokens(config as unknown as Config);

    // Process in background
    setTimeout(async () => {
      for (const repo of repos) {
        try {
          const { getGiteaRepoOwnerAsync } = await import("@/lib/gitea");
          const repoOwner = await getGiteaRepoOwnerAsync({ config, repository: repo });

          // Always create a backup before approved sync for safety
          const cloneUrl = `${config.giteaConfig.url.replace(/\/$/, "")}/${repoOwner}/${repo.name}.git`;
          try {
            const backupResult = await createPreSyncBundleBackup({
              config,
              owner: repoOwner,
              repoName: repo.name,
              cloneUrl,
              force: true, // Bypass legacy gate — approval implies backup
            });

            await createMirrorJob({
              userId,
              repositoryId: repo.id,
              repositoryName: repo.name,
              message: `Safety snapshot created for ${repo.name}`,
              details: `Pre-approval snapshot at ${backupResult.bundlePath}.`,
              status: "syncing",
            });
          } catch (backupError) {
            console.warn(
              `[ApproveSync] Backup failed for ${repo.name}, proceeding with sync: ${
                backupError instanceof Error ? backupError.message : String(backupError)
              }`,
            );
          }

          // Trigger sync — skip detection to avoid re-blocking
          const repoData = {
            ...repo,
            status: repoStatusEnum.parse("syncing"),
            organization: repo.organization ?? undefined,
            lastMirrored: repo.lastMirrored ?? undefined,
            errorMessage: repo.errorMessage ?? undefined,
            forkedFrom: repo.forkedFrom ?? undefined,
            visibility: repositoryVisibilityEnum.parse(repo.visibility),
            mirroredLocation: repo.mirroredLocation || "",
          };

          await syncGiteaRepoEnhanced({
            config,
            repository: repoData,
            skipForcePushDetection: true,
          });
          console.log(`[ApproveSync] Sync completed for approved repository: ${repo.name}`);
        } catch (error) {
          console.error(
            `[ApproveSync] Failed to sync approved repository ${repo.name}:`,
            error,
          );
        }
      }
    }, 0);

    // Immediately update status to syncing for responsiveness
    for (const repo of repos) {
      await db
        .update(repositories)
        .set({
          status: "syncing",
          errorMessage: null,
          updatedAt: new Date(),
        })
        .where(eq(repositories.id, repo.id));
    }

    return new Response(
      JSON.stringify({
        success: true,
        message: `Approved sync for ${repos.length} repository(ies). Backup + sync started.`,
repositories: repos.map((repo) => ({
|
||||
...repo,
|
||||
status: "syncing",
|
||||
errorMessage: null,
|
||||
})),
|
||||
}),
|
||||
{ status: 200, headers: { "Content-Type": "application/json" } },
|
||||
);
|
||||
} catch (error) {
|
||||
return createSecureErrorResponse(error, "approve-sync", 500);
|
||||
}
|
||||
};
|
||||
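The handler resolves each approval action to a repository status before responding. That mapping can be sketched as follows (nextStatus is a hypothetical helper name for illustration, not part of the endpoint):

```typescript
// Sketch of the status transitions implemented by the endpoint above.
// "dismiss" returns a repo to its normal schedule; "approve" marks it
// "syncing" while a backup + forced sync runs in the background.
type ApprovalAction = "dismiss" | "approve";

function nextStatus(action: ApprovalAction): "synced" | "syncing" {
  return action === "dismiss" ? "synced" : "syncing";
}

console.log(nextStatus("dismiss")); // "synced"
console.log(nextStatus("approve")); // "syncing"
```

Either way the response is returned immediately; the approve path defers the actual work via `setTimeout(..., 0)` so the API stays responsive.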
@@ -13,6 +13,7 @@ export const repoStatusEnum = z.enum([
   "syncing",
   "synced",
   "archived",
+  "pending-approval", // Blocked by force-push detection, needs manual approval
 ]);

 export type RepoStatus = z.infer<typeof repoStatusEnum>;
@@ -3,6 +3,7 @@ import { type Config as ConfigType } from "@/lib/db/schema";
 export type GiteaOrgVisibility = "public" | "private" | "limited";
 export type MirrorStrategy = "preserve" | "single-org" | "flat-user" | "mixed";
 export type StarredReposMode = "dedicated-org" | "preserve-owner";
+export type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";

 export interface GiteaConfig {
   url: string;
@@ -18,7 +19,8 @@ export interface GiteaConfig {
   personalReposOrg?: string; // Override destination for personal repos
   issueConcurrency?: number;
   pullRequestConcurrency?: number;
-  backupBeforeSync?: boolean;
+  backupStrategy?: BackupStrategy;
+  backupBeforeSync?: boolean; // Deprecated: kept for backward compat, use backupStrategy
   backupRetentionCount?: number;
   backupDirectory?: string;
   blockSyncOnBackupFailure?: boolean;
@@ -73,6 +75,7 @@ export interface MirrorOptions {
 export interface AdvancedOptions {
   skipForks: boolean;
   starredCodeOnly: boolean;
+  autoMirrorStarred?: boolean;
 }

 export interface SaveConfigApiRequest {
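Because the deprecated backupBeforeSync flag is kept alongside the new backupStrategy field, a consumer needs a resolution rule. A minimal sketch, assuming the new field wins and the legacy boolean maps to "always"/"disabled" (resolveBackupStrategy is a hypothetical helper for illustration, not part of this diff):

```typescript
type BackupStrategy = "disabled" | "always" | "on-force-push" | "block-on-force-push";

interface BackupConfig {
  backupStrategy?: BackupStrategy;
  backupBeforeSync?: boolean; // deprecated legacy flag
}

// Prefer the explicit strategy; otherwise fall back to the legacy boolean.
function resolveBackupStrategy(cfg: BackupConfig): BackupStrategy {
  if (cfg.backupStrategy) return cfg.backupStrategy;
  return cfg.backupBeforeSync ? "always" : "disabled";
}
```

Keeping the old field optional rather than deleting it lets existing saved configs deserialize unchanged while new writes use backupStrategy.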
@@ -6,13 +6,13 @@
  * by the 02-mirror-workflow suite.
  *
  * What is tested:
- * B1. Enable backupBeforeSync in config
+ * B1. Enable backupStrategy: "always" in config
  * B2. Confirm mirrored repos exist in Gitea (precondition)
  * B3. Trigger a re-sync with backup enabled — verify the backup code path
  *     runs (snapshot activity entries appear in the activity log)
  * B4. Inspect activity log for snapshot-related entries
  * B5. Enable blockSyncOnBackupFailure and verify the flag is persisted
- * B6. Disable backup and verify config resets cleanly
+ * B6. Disable backup (backupStrategy: "disabled") and verify config resets cleanly
  */

 import { test, expect } from "@playwright/test";

@@ -54,10 +54,10 @@ test.describe("E2E: Backup configuration", () => {
     const giteaToken = giteaApi.getTokenValue();
     expect(giteaToken, "Gitea token required").toBeTruthy();

-    // Save config with backup enabled
+    // Save config with backup strategy set to "always"
     await saveConfig(request, giteaToken, appCookies, {
       giteaConfig: {
-        backupBeforeSync: true,
+        backupStrategy: "always",
         blockSyncOnBackupFailure: false,
         backupRetentionCount: 5,
         backupDirectory: "data/repo-backups",

@@ -75,7 +75,7 @@ test.describe("E2E: Backup configuration", () => {
       const configData = await configResp.json();
       const giteaCfg = configData.giteaConfig ?? configData.gitea ?? {};
       console.log(
-        `[Backup] Config saved: backupBeforeSync=${giteaCfg.backupBeforeSync}, blockOnFailure=${giteaCfg.blockSyncOnBackupFailure}`,
+        `[Backup] Config saved: backupStrategy=${giteaCfg.backupStrategy}, blockOnFailure=${giteaCfg.blockSyncOnBackupFailure}`,
       );
     }
   });

@@ -202,7 +202,7 @@ test.describe("E2E: Backup configuration", () => {
     expect(
       backupJobs.length,
       "Expected at least one backup/snapshot activity entry when " +
-        "backupBeforeSync is enabled and repos exist in Gitea",
+        "backupStrategy is 'always' and repos exist in Gitea",
     ).toBeGreaterThan(0);

     // Check for any failed backups

@@ -247,7 +247,7 @@ test.describe("E2E: Backup configuration", () => {
     // Update config to block sync on backup failure
     await saveConfig(request, giteaToken, appCookies, {
       giteaConfig: {
-        backupBeforeSync: true,
+        backupStrategy: "always",
         blockSyncOnBackupFailure: true,
         backupRetentionCount: 5,
         backupDirectory: "data/repo-backups",

@@ -284,7 +284,7 @@ test.describe("E2E: Backup configuration", () => {
     // Disable backup
     await saveConfig(request, giteaToken, appCookies, {
       giteaConfig: {
-        backupBeforeSync: false,
+        backupStrategy: "disabled",
         blockSyncOnBackupFailure: false,
       },
     });

@@ -297,7 +297,7 @@ test.describe("E2E: Backup configuration", () => {
       const configData = await configResp.json();
       const giteaCfg = configData.giteaConfig ?? configData.gitea ?? {};
       console.log(
-        `[Backup] After disable: backupBeforeSync=${giteaCfg.backupBeforeSync}`,
+        `[Backup] After disable: backupStrategy=${giteaCfg.backupStrategy}`,
       );
     }
     console.log("[Backup] Backup configuration test complete");

@@ -302,7 +302,7 @@ test.describe("E2E: Force-push simulation", () => {
     // Ensure backup is disabled for this test
     await saveConfig(request, giteaToken, appCookies, {
       giteaConfig: {
-        backupBeforeSync: false,
+        backupStrategy: "disabled",
         blockSyncOnBackupFailure: false,
       },
     });

@@ -560,16 +560,16 @@ test.describe("E2E: Force-push simulation", () => {

     const giteaToken = giteaApi.getTokenValue();

-    // Enable backup
+    // Enable backup with "always" strategy
     await saveConfig(request, giteaToken, appCookies, {
       giteaConfig: {
-        backupBeforeSync: true,
+        backupStrategy: "always",
         blockSyncOnBackupFailure: false, // don't block — we want to see both backup + sync happen
         backupRetentionCount: 5,
         backupDirectory: "data/repo-backups",
       },
     });
-    console.log("[ForcePush] Backup enabled for protected sync test");
+    console.log("[ForcePush] Backup enabled (strategy=always) for protected sync test");

     // Force-push again
     mutateSourceRepo(MY_PROJECT_BARE, "my-project-rewrite2", (workDir) => {

@@ -744,7 +744,7 @@ test.describe("E2E: Force-push simulation", () => {
     expect(
       backupJobs.length,
       "At least one backup/snapshot activity should exist for my-project " +
-        "when backupBeforeSync is enabled",
+        "when backupStrategy is 'always'",
     ).toBeGreaterThan(0);

     // Check whether any backups actually succeeded

@@ -520,7 +520,7 @@ export async function saveConfig(
     starredReposOrg: "github-stars",
     preserveOrgStructure: false,
     mirrorStrategy: "single-org",
-    backupBeforeSync: false,
+    backupStrategy: "disabled",
     blockSyncOnBackupFailure: false,
   };

@@ -11,7 +11,6 @@
   "dependencies": {
     "@astrojs/mdx": "^4.3.13",
     "@astrojs/react": "^4.4.2",
-    "@radix-ui/react-icons": "^1.3.2",
     "@radix-ui/react-slot": "^1.2.4",
     "@splinetool/react-spline": "^4.1.0",
     "@splinetool/runtime": "^1.12.60",

www/pnpm-lock.yaml (generated, 12 lines changed)
@@ -14,9 +14,6 @@ importers:
       '@astrojs/react':
         specifier: ^4.4.2
         version: 4.4.2(@types/node@24.7.1)(@types/react-dom@19.2.3(@types/react@19.2.14))(@types/react@19.2.14)(jiti@2.6.1)(lightningcss@1.31.1)(react-dom@19.2.4(react@19.2.4))(react@19.2.4)
-      '@radix-ui/react-icons':
-        specifier: ^1.3.2
-        version: 1.3.2(react@19.2.4)
       '@radix-ui/react-slot':
         specifier: ^1.2.4
         version: 1.2.4(@types/react@19.2.14)(react@19.2.4)
@@ -674,11 +671,6 @@ packages:
     '@types/react':
       optional: true

-  '@radix-ui/react-icons@1.3.2':
-    resolution: {integrity: sha512-fyQIhGDhzfc9pK2kH6Pl9c4BDJGfMkPqkyIgYDthyNYoNg3wVhoJMMh19WS4Up/1KMPFVpNsT2q3WmXn2N1m6g==}
-    peerDependencies:
-      react: ^16.x || ^17.x || ^18.x || ^19.0.0 || ^19.0.0-rc
-
   '@radix-ui/react-slot@1.2.4':
     resolution: {integrity: sha512-Jl+bCv8HxKnlTLVrcDE8zTMJ09R9/ukw4qBs/oZClOfoQk/cOTbDn+NceXfV7j09YPVQUryJPHurafcSg6EVKA==}
     peerDependencies:
@@ -2828,10 +2820,6 @@ snapshots:
     optionalDependencies:
       '@types/react': 19.2.14

-  '@radix-ui/react-icons@1.3.2(react@19.2.4)':
-    dependencies:
-      react: 19.2.4
-
   '@radix-ui/react-slot@1.2.4(@types/react@19.2.14)(react@19.2.4)':
     dependencies:
       '@radix-ui/react-compose-refs': 1.1.2(@types/react@19.2.14)(react@19.2.4)

@@ -1,11 +1,11 @@
 ---
-import {
-  RefreshCw,
-  Building2,
-  FolderTree,
-  Activity,
-  Lock,
-  Heart,
+import {
+  RefreshCw,
+  FileText,
+  ShieldCheck,
+  Activity,
+  Lock,
+  HardDrive,
 } from 'lucide-react';

 const features = [
@@ -17,37 +17,37 @@ const features = [
     iconColor: "text-primary"
   },
   {
-    title: "Bulk Operations",
-    description: "Mirror entire organizations or user accounts with a single configuration.",
-    icon: Building2,
+    title: "Metadata Preservation",
+    description: "Mirror issues, pull requests, releases, labels, milestones, and wiki pages alongside your code.",
+    icon: FileText,
     gradient: "from-accent/10 to-accent-teal/10",
     iconColor: "text-accent"
   },
   {
-    title: "Preserve Structure",
-    description: "Maintain your GitHub organization structure or customize how repos are organized.",
-    icon: FolderTree,
+    title: "Force-Push Protection",
+    description: "Detect upstream force-pushes and automatically snapshot repos before destructive changes.",
+    icon: ShieldCheck,
     gradient: "from-accent-teal/10 to-primary/10",
     iconColor: "text-accent-teal"
   },
   {
-    title: "Real-time Status",
-    description: "Monitor mirror progress with live updates and detailed activity logs.",
+    title: "Real-time Dashboard",
+    description: "Monitor mirror progress with live updates, activity logs, and per-repo status tracking.",
     icon: Activity,
     gradient: "from-accent-coral/10 to-primary/10",
     iconColor: "text-accent-coral"
   },
   {
-    title: "Secure & Private",
-    description: "Self-hosted solution keeps your code on your infrastructure with full control.",
+    title: "Secure & Self-Hosted",
+    description: "Tokens encrypted at rest with AES-256-GCM. Your code stays on your infrastructure.",
     icon: Lock,
     gradient: "from-accent-purple/10 to-primary/10",
     iconColor: "text-accent-purple"
   },
   {
-    title: "Open Source",
-    description: "Free, transparent, and community-driven development. Contribute and customize.",
-    icon: Heart,
+    title: "Git LFS Support",
+    description: "Mirror large files and binary assets alongside your repositories with full LFS support.",
+    icon: HardDrive,
     gradient: "from-primary/10 to-accent-purple/10",
     iconColor: "text-primary"
   }

@@ -1,6 +1,5 @@
 import { Button } from "./ui/button";
 import { ArrowRight, Shield, RefreshCw, HardDrive } from "lucide-react";
-import { GitHubLogoIcon } from "@radix-ui/react-icons";
 import React, { Suspense } from 'react';

 const Spline = React.lazy(() => import('@splinetool/react-spline'));

@@ -1,8 +1,8 @@
 import React, { useState } from 'react';
 import { Button } from './ui/button';
-import { Copy, Check, Terminal, Container, Cloud } from 'lucide-react';
+import { Copy, Check, Terminal, Container, Cloud, Ship, Snowflake } from 'lucide-react';

-type InstallMethod = 'docker' | 'manual' | 'proxmox';
+type InstallMethod = 'docker' | 'helm' | 'nix' | 'manual' | 'proxmox';

 export function Installation() {
   const [activeMethod, setActiveMethod] = useState<InstallMethod>('docker');
@@ -37,6 +37,50 @@ export function Installation() {
       }
     ]
   },
+  helm: {
+    icon: Ship,
+    title: "Helm",
+    description: "Deploy to Kubernetes",
+    steps: [
+      {
+        title: "Clone the repository",
+        command: "git clone https://github.com/RayLabsHQ/gitea-mirror.git && cd gitea-mirror",
+        id: "helm-clone"
+      },
+      {
+        title: "Install the chart",
+        command: "helm upgrade --install gitea-mirror ./helm/gitea-mirror \\\n --namespace gitea-mirror --create-namespace",
+        id: "helm-install"
+      },
+      {
+        title: "Access the application",
+        command: "kubectl port-forward svc/gitea-mirror 4321:4321 -n gitea-mirror",
+        id: "helm-access"
+      }
+    ]
+  },
+  nix: {
+    icon: Snowflake,
+    title: "Nix",
+    description: "Zero-config with Nix flakes",
+    steps: [
+      {
+        title: "Run directly with Nix",
+        command: "nix run github:RayLabsHQ/gitea-mirror",
+        id: "nix-run"
+      },
+      {
+        title: "Or install to your profile",
+        command: "nix profile install github:RayLabsHQ/gitea-mirror",
+        id: "nix-install"
+      },
+      {
+        title: "Access the application",
+        command: "# Open http://localhost:4321 in your browser",
+        id: "nix-access"
+      }
+    ]
+  },
   manual: {
     icon: Terminal,
     title: "Manual",

@@ -39,7 +39,7 @@ const structuredData = {
     name: "RayLabs",
     url: "https://github.com/RayLabsHQ",
   },
-  softwareVersion: "3.9.2",
+  softwareVersion: "3.11.0",
   screenshot: [
     `${siteUrl}/assets/dashboard.png`,
     `${siteUrl}/assets/repositories.png`,
@@ -49,8 +49,9 @@
     "Automated scheduled backups",
     "Self-hosted (full data ownership)",
     "Metadata preservation (issues, PRs, releases, wiki)",
-    "Docker support",
-    "Multi-repository backup",
+    "Force-push protection with smart detection",
+    "Docker, Helm, Nix, and Proxmox support",
+    "Multi-repository and organization backup",
     "Git LFS support",
     "Free and open source",
   ],
