Compare commits

..

29 Commits

Author SHA1 Message Date
Arunavo Ray
fcc92b8f27 added security audit 2025-11-08 07:49:33 +05:30
Arunavo Ray
749ad4a694 v3.9.0 2025-11-06 07:47:23 +05:30
ARUNAVO RAY
0f752acae5 Merge pull request #146 from RayLabsHQ/fix/issue-129-release-sort-order
fix: Ensure proper release ordering in Gitea mirrors (#129)
2025-11-06 07:44:12 +05:30
Arunavo Ray
652bd220c2 added missing favicon 2025-11-06 07:43:14 +05:30
Arunavo Ray
9f2eaaf04e fix: Ensure proper release ordering in Gitea mirrors (#129)
- Add 1-second delays between release creations to ensure distinct timestamps
- Prepend GitHub original publication date to release notes
- Improve logging to show chronological processing order
- Addresses Gitea API limitation where created_unix is always set to current time

Fixes #129
2025-11-05 20:57:33 +05:30
ARUNAVO RAY
63d3f0e86c Merge pull request #145 from z0xca/main
fix website install guide command
2025-11-05 11:53:22 +05:30
z0x
25e7d234ba Update clone command to use '&&' for chaining 2025-11-04 22:58:23 -05:00
ARUNAVO RAY
7a3f734728 Merge pull request #142 from RayLabsHQ/fix/issue-141-duplicate-issues-on-sync
fix: add metadata field to repositories table to prevent duplicate issues on sync
2025-10-31 08:51:34 +05:30
Arunavo Ray
d59a07a8c5 fix: add metadata field to repositories table to prevent duplicate issues on sync
Fixes #141

The repository metadata field was missing from the database schema, which
caused the metadata sync state (issues, PRs, releases, etc.) to not persist.
This resulted in duplicate issues being created every time a repository was
synced because the system couldn't track what had already been mirrored.

Changes:
- Added metadata text field to repositories table in schema
- Added metadata field to repositorySchema Zod validation
- Generated database migration 0008_serious_thena.sql

Root cause analysis:
1. Code tried to read/write repository.metadata to track mirrored components
2. The metadata field didn't exist in the database schema
3. On sync, metadataState.components.issues was always false
4. This triggered re-mirroring of all issues, creating duplicates

The fix ensures metadata state persists between mirrors and syncs, preventing
duplicate metadata (issues, PRs, releases) from being created in Gitea.
2025-10-30 10:58:48 +05:30
Arunavo Ray
5a77ae5084 v3.8.10 2025-10-30 10:54:56 +05:30
ARUNAVO RAY
dcb5bd80e3 Merge pull request #138 from RayLabsHQ/issue-132-org-repo-duplicates 2025-10-30 07:11:34 +05:30
Arunavo Ray
3b8fc99f06 workaround to get rid of unknown/unknown in OS arch 2025-10-29 22:01:40 +05:30
Arunavo Ray
bda8d10f10 ci: build arm64 images in PR pipeline 2025-10-29 21:51:37 +05:30
Arunavo Ray
0fe7b433d6 added missing hero_logo.png 2025-10-27 19:43:00 +05:30
Arunavo Ray
8d96e176b4 fix: prevent duplicate orgs and repos 2025-10-27 08:44:45 +05:30
Arunavo Ray
af9bc861cf fixed: Sort order in releases #129 2025-10-27 07:54:38 +05:30
ARUNAVO RAY
ab4bbea9fd Merge pull request #136 from RayLabsHQ/fix/metadata-sync-config-change
fix: sync metadata after config toggles
2025-10-27 07:45:12 +05:30
ARUNAVO RAY
fbd4b3739e Merge pull request #137 from RayLabsHQ/docs/authentik-oidc-notes
Added basic docs on SSO/OIDC
2025-10-26 19:54:53 +05:30
Arunavo Ray
395e71164f Added basic docs on SSO/OIDC 2025-10-26 19:52:44 +05:30
Arunavo Ray
99c277e2ee v3.8.10 | Fixed SSO issues 2025-10-26 19:06:36 +05:30
ARUNAVO RAY
9287e0d29b Merge pull request #135 from RayLabsHQ/fix/authentik-issuer-mismatch
auth: preserve issuer formatting for OIDC
2025-10-26 19:05:54 +05:30
Arunavo Ray
f2f2bafc39 "better-auth": "1.4.0-beta.13" 2025-10-26 18:37:06 +05:30
Arunavo Ray
5876198b5e Added missing DB fields 2025-10-26 18:36:20 +05:30
Arunavo Ray
e46bf381c7 auth: trust email verification from sso providers 2025-10-26 08:45:47 +05:30
Arunavo Ray
3bf0ccf207 fix: sync metadata after config toggles 2025-10-26 08:41:28 +05:30
Arunavo Ray
e41b4ffc56 auth: preserve issuer formatting for OIDC 2025-10-26 07:49:42 +05:30
Arunavo Ray
a9dd646573 v3.8.9 2025-10-25 09:04:14 +05:30
ARUNAVO RAY
e2160aabcd Merge pull request #130 from bwees/main 2025-10-25 07:24:41 +05:30
Brandon Wees
5d085e02bf fix: rename repo count in dashboard 2025-10-24 15:45:29 -05:00
51 changed files with 8170 additions and 272 deletions

View File

@@ -101,26 +101,30 @@ jobs:
# Build and push Docker image
- name: Build and push Docker image
id: build-and-push
uses: docker/build-push-action@v5
uses: docker/build-push-action@v6
with:
context: .
platforms: ${{ github.event_name == 'pull_request' && 'linux/amd64' || 'linux/amd64,linux/arm64' }}
platforms: linux/amd64,linux/arm64
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=gha
cache-to: type=gha,mode=max
provenance: false # Disable provenance to avoid unknown/unknown
sbom: false # Disable sbom to avoid unknown/unknown
# Load image locally for security scanning (PRs only)
- name: Load image for scanning
if: github.event_name == 'pull_request'
uses: docker/build-push-action@v5
uses: docker/build-push-action@v6
with:
context: .
platforms: linux/amd64
load: true
tags: gitea-mirror:scan
cache-from: type=gha
provenance: false # Disable provenance to avoid unknown/unknown
sbom: false # Disable sbom to avoid unknown/unknown
# Wait for image to be available in registry
- name: Wait for image availability
@@ -169,8 +173,8 @@ jobs:
- BETTER_AUTH_TRUSTED_ORIGINS=http://localhost:4321
\`\`\`
> 💡 **Note:** PR images are tagged as \`pr-<number>\` and only built for \`linux/amd64\` to speed up CI.
> Production images (\`latest\`, version tags) are multi-platform (\`linux/amd64\`, \`linux/arm64\`).
> 💡 **Note:** PR images are tagged as \`pr-<number>\` and built for both \`linux/amd64\` and \`linux/arm64\`.
> Production images (\`latest\`, version tags) use the same multi-platform set.
---
📦 View in [GitHub Packages](https://github.com/${{ github.repository }}/pkgs/container/gitea-mirror)`;

View File

@@ -326,6 +326,8 @@ Enable users to sign in with external identity providers like Google, Azure AD,
https://your-domain.com/api/auth/sso/callback/{provider-id}
```
Need help? The [SSO & OIDC guide](docs/SSO-OIDC-SETUP.md) now includes a working Authentik walkthrough plus troubleshooting tips. If you upgraded from a version earlier than v3.8.10 and see `TypeError … url.startsWith` after the callback, delete the old provider and add it again using the Discover button (see [#73](https://github.com/RayLabsHQ/gitea-mirror/issues/73) and [#122](https://github.com/RayLabsHQ/gitea-mirror/issues/122)).
### 3. Header Authentication (Reverse Proxy)
Perfect for automatic authentication when using reverse proxies like Authentik, Authelia, or Traefik Forward Auth.

Security-Audit.md Normal file (496 lines)
View File

@@ -0,0 +1,496 @@
Date: 2025-11-07
Application: Gitea Mirror v3.9.0
Branch: bun-v1.3.1
Scope: Full application security review
---
Executive Summary
A comprehensive security audit was conducted across all components of the Gitea Mirror application, including authentication, authorization,
encryption, input validation, API endpoints, database operations, UI components, and external integrations. The review identified 23 security
findings across multiple severity levels.
Critical Findings: 2
High Severity: 9
Medium Severity: 10
Low Severity: 2
Overall Security Posture: The application demonstrates strong foundational security practices (AES-256-GCM encryption, Drizzle ORM parameterized
queries, React auto-escaping) but has critical authorization flaws that require immediate remediation.
---
CRITICAL SEVERITY VULNERABILITIES
Vuln 1: Broken Object Level Authorization (BOLA) - Multiple Endpoints
Severity: CRITICAL | Confidence: 1.0 | Category: Authorization Bypass
Affected Files:
- src/pages/api/config/index.ts:18
- src/pages/api/job/mirror-repo.ts:16
- src/pages/api/github/repositories.ts:11
- src/pages/api/dashboard/index.ts:9
- src/pages/api/activities/index.ts:8
- src/pages/api/sync/repository.ts:15
- src/pages/api/sync/organization.ts:14
- src/pages/api/job/mirror-org.ts:14
- src/pages/api/job/retry-repo.ts:18
- src/pages/api/job/sync-repo.ts:11
- src/pages/api/job/schedule-sync-repo.ts:13
Description: Multiple API endpoints accept a userId parameter from client requests without validating that the authenticated user matches the
requested userId. This allows any authenticated user to access and manipulate any other user's data.
Exploit Scenario:
POST /api/config/index
Content-Type: application/json
Cookie: better-auth-session=attacker-session
{
"userId": "victim-user-id",
"giteaConfig": {
"token": "attacker-controlled-token"
}
}
Result: Attacker modifies victim's configuration, steals encrypted tokens, triggers mirror operations, or accesses activity logs.
Recommendation:
export const POST: APIRoute = async (context) => {
// 1. Validate authentication
const { user, response } = await requireAuth(context);
if (response) return response;
// 2. Use authenticated userId from session, NOT from request body
const authenticatedUserId = user!.id;
// 3. Don't accept userId from client
const body = await context.request.json();
const { githubConfig, giteaConfig } = body;
// 4. Use session userId for all operations
const config = await db
.select()
.from(configs)
.where(eq(configs.userId, authenticatedUserId));
}
---
Vuln 2: Header Authentication Spoofing - src/lib/auth-header.ts:48
Severity: CRITICAL | Confidence: 0.95 | Category: Authentication Bypass
Description: The header authentication mechanism trusts HTTP headers (X-Authentik-Username, X-Authentik-Email) without proper validation. If the
application is accessible without a properly configured reverse proxy, attackers can send these headers directly to impersonate any user.
Exploit Scenario:
GET /api/config/index?userId=admin
Host: gitea-mirror.example.com
X-Authentik-Username: admin
X-Authentik-Email: admin@example.com
Result: Attacker gains admin access without authentication.
Recommendation:
// 1. Add trusted proxy IP validation
const TRUSTED_PROXY_IPS = process.env.TRUSTED_PROXY_IPS?.split(',') || [];
export function extractUserFromHeaders(headers: Headers, remoteIp: string): UserInfo | null {
// Validate request comes from trusted reverse proxy
if (TRUSTED_PROXY_IPS.length > 0 && !TRUSTED_PROXY_IPS.includes(remoteIp)) {
console.warn(`Header auth rejected: untrusted IP ${remoteIp}`);
return null;
}
// 2. Add shared secret validation
const authSecret = headers.get('X-Auth-Secret');
if (authSecret !== process.env.HEADER_AUTH_SECRET) {
console.warn('Header auth rejected: invalid secret');
return null;
}
// Continue with header extraction...
}
Additional Requirements:
- Document that header auth MUST only be used behind a reverse proxy
- Reverse proxy MUST strip X-Authentik-* headers from untrusted requests
- Direct access MUST be blocked via firewall rules
---
HIGH SEVERITY VULNERABILITIES
Vuln 3: Missing Authentication on Critical Endpoints
Severity: HIGH | Confidence: 1.0 | Category: Authentication Bypass
Affected Endpoints:
- /api/github/test-connection - Tests GitHub connection with user's token
- /api/gitea/test-connection - Tests Gitea connection with user's token
- /api/rate-limit/index - Exposes rate limit info
- /api/events/index - SSE endpoint for events
- /api/activities/cleanup - Deletes activities
Description: Several sensitive endpoints lack authentication checks entirely.
Exploit Scenario:
POST /api/github/test-connection
{"token": "stolen-token", "userId": "victim"}
Recommendation: Add requireAuth to all sensitive endpoints.
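A minimal sketch of that guard, reusing the requireAuth helper shown in the Vuln 1 recommendation (the import path and response shape here are assumptions, not the project's actual code):
// Sketch only: protect a previously unauthenticated endpoint (import path assumed)
import type { APIRoute } from "astro";
import { requireAuth } from "@/lib/auth-utils";
export const POST: APIRoute = async (context) => {
  // Reject unauthenticated callers before any token or connection handling
  const { user, response } = await requireAuth(context);
  if (response) return response;
  const { url, token } = await context.request.json();
  // ...run the connection test on behalf of user!.id only, never a client-supplied userId...
  return new Response(JSON.stringify({ success: true }), { status: 200 });
};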
---
Vuln 4: Token Exposure Through Config API - src/pages/api/config/index.ts:220
Severity: HIGH | Confidence: 0.9 | Category: Sensitive Data Exposure
Description: The GET /api/config/index endpoint decrypts and returns GitHub/Gitea API tokens in plaintext. Combined with BOLA, any user can retrieve
any other user's tokens.
Exploit Scenario:
1. Attacker authenticates as User A
2. Calls /api/config/index?userId=admin
3. Receives admin's decrypted GitHub Personal Access Token
4. Uses stolen token to access admin's repositories
Recommendation:
// Never send full tokens - use masked versions
if (githubConfig.token) {
const decrypted = decrypt(githubConfig.token);
githubConfig.token = maskToken(decrypted); // "ghp_****...last4"
githubConfig.tokenSet = true;
}
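maskToken is not shown in the audited code; a minimal sketch of such a helper (name and output format are suggestions) could be:
// Hypothetical helper: keep only the token prefix and the last four characters
export function maskToken(token: string): string {
  if (token.length <= 8) return "****";
  const prefix = token.slice(0, 4); // e.g. "ghp_" for GitHub PATs
  const last4 = token.slice(-4);
  return `${prefix}****${last4}`;
}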
---
Vuln 5: Static Salt in Key Derivation - src/lib/utils/encryption.ts:21
Severity: HIGH | Confidence: 0.95 | Category: Cryptographic Weakness
Description: The key derivation function uses a static, hardcoded salt that reduces protection against precomputation attacks.
Exploit Scenario: If ENCRYPTION_SECRET is weak or compromised, the static salt makes brute-force attacks more efficient. Attackers can build targeted
rainbow tables.
Recommendation:
// Generate and store unique salt per installation
const SALT_FILE = '/app/data/.encryption_salt';
let installationSalt: Buffer;
if (fs.existsSync(SALT_FILE)) {
installationSalt = Buffer.from(fs.readFileSync(SALT_FILE, 'utf8'), 'hex');
} else {
installationSalt = crypto.randomBytes(32);
fs.writeFileSync(SALT_FILE, installationSalt.toString('hex'));
fs.chmodSync(SALT_FILE, 0o600);
}
return crypto.pbkdf2Sync(secret, installationSalt, ITERATIONS, KEY_LENGTH, 'sha256');
---
Vuln 6: Weak Default Secret - src/lib/config.ts:22
Severity: HIGH | Confidence: 0.90 | Category: Hardcoded Credentials
Description: The application uses a hardcoded default for BETTER_AUTH_SECRET when not set.
Exploit Scenario: Non-Docker deployments may use the weak default. Attackers can forge authentication tokens using the known default secret.
Recommendation:
BETTER_AUTH_SECRET: (() => {
const secret = process.env.BETTER_AUTH_SECRET;
const weakDefaults = ["your-secret-key-change-this-in-production"];
if (!secret || weakDefaults.includes(secret)) {
if (process.env.NODE_ENV === 'production') {
throw new Error("BETTER_AUTH_SECRET required. Generate with: openssl rand -base64 32");
}
return "dev-secret-minimum-32-chars";
}
if (secret.length < 32) {
throw new Error("BETTER_AUTH_SECRET must be at least 32 characters");
}
return secret;
})()
---
Vuln 7: Unencrypted OAuth Client Secrets - src/lib/db/schema.ts:569
Severity: HIGH | Confidence: 0.92 | Category: Sensitive Data Exposure
Description: OAuth application client secrets and SSO provider credentials are stored as plaintext, inconsistent with encrypted GitHub/Gitea tokens.
Exploit Scenario: Database leak exposes OAuth client secrets, allowing attackers to impersonate the application to external OAuth providers.
Recommendation:
// Encrypt before storage
import { encrypt } from "@/lib/utils/encryption";
await db.insert(oauthApplications).values({
clientSecret: encrypt(clientSecret),
// ...
});
// Add decryption helper
export function decryptOAuthClientSecret(encrypted: string): string {
return decrypt(encrypted);
}
---
Vuln 8: SSRF in OIDC Discovery - src/pages/api/sso/discover.ts:48
Severity: HIGH | Confidence: 0.95 | Category: Server-Side Request Forgery
Description: The endpoint accepts user-provided issuer URL and fetches ${issuer}/.well-known/openid-configuration without validating against internal
networks.
Exploit Scenario:
POST /api/sso/discover
{"issuer": "http://169.254.169.254/latest/meta-data"}
Result: Access to AWS metadata service, internal APIs, or port scanning.
Recommendation:
// Block private IPs and localhost
const ALLOWED_PROTOCOLS = ['https:'];
const parsedIssuer = new URL(issuer); // parse the user-supplied issuer before validating it
const hostname = parsedIssuer.hostname.toLowerCase();
if (
!ALLOWED_PROTOCOLS.includes(parsedIssuer.protocol) ||
hostname === 'localhost' ||
hostname.match(/^10\./) ||
hostname.match(/^192\.168\./) ||
hostname.match(/^172\.(1[6-9]|2[0-9]|3[0-1])\./) ||
hostname.match(/^169\.254\./) ||
hostname.match(/^127\./)
) {
return new Response(JSON.stringify({ error: "Invalid issuer" }), { status: 400 });
}
---
Vuln 9: SSRF in Gitea Connection Test - src/pages/api/gitea/test-connection.ts:29
Severity: HIGH | Confidence: 0.95 | Category: SSRF / Input Validation
Description: Accepts user-provided url and makes HTTP request without validation.
Exploit Scenario: Same as Vuln 8 - internal network scanning, metadata service access.
Recommendation: Apply same URL validation as Vuln 8.
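One way to apply it consistently is a shared guard used by both the SSO discovery and Gitea test-connection endpoints. The helper below is a sketch (name and location are assumptions), not existing code:
// Shared validation for user-supplied URLs (hypothetical helper)
const PRIVATE_HOST_PATTERNS = [
  /^localhost$/,
  /^127\./,
  /^10\./,
  /^192\.168\./,
  /^172\.(1[6-9]|2[0-9]|3[0-1])\./,
  /^169\.254\./,
];
export function assertPublicHttpsUrl(raw: string): URL {
  const parsed = new URL(raw); // throws on malformed input
  const hostname = parsed.hostname.toLowerCase();
  if (parsed.protocol !== "https:" || PRIVATE_HOST_PATTERNS.some((p) => p.test(hostname))) {
    throw new Error("URL uses a disallowed protocol or targets a private network");
  }
  return parsed;
}
Pattern checks alone do not cover hostnames that resolve to private addresses; resolving the hostname and re-checking the IP closes that gap.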
---
Vuln 10: Command Injection in Docker Build - scripts/build-docker.sh:72
Severity: HIGH | Confidence: 0.95 | Category: Command Injection
Description: Uses eval with dynamically constructed Docker command including environment variables.
Exploit Scenario:
DOCKER_IMAGE='test; rm -rf /; #'
./scripts/build-docker.sh
Recommendation:
# Replace eval with direct execution
if docker buildx build --platform linux/amd64,linux/arm64 \
-t "$FULL_IMAGE_NAME" \
${LOAD:+--load} \
${PUSH:+--push} \
.; then
echo "Success"
fi
---
Vuln 11: Path Traversal in Gitea API - src/lib/gitea.ts:193
Severity: MEDIUM-HIGH | Confidence: 0.85 | Category: Path Traversal / SSRF
Description: Repository owner/name interpolated into URLs without encoding.
Exploit Scenario:
owner = "../../admin"
repoName = "users/../../../config"
// Results in: /api/v1/repos/../../admin/users/../../../config
Recommendation:
const response = await fetch(
`${config.url}/api/v1/repos/${encodeURIComponent(owner)}/${encodeURIComponent(repoName)}`,
// ...
);
Apply to all URL constructions in gitea.ts and gitea-enhanced.ts.
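A small helper keeps that encoding consistent across call sites; the function name below is a suggestion rather than existing code:
// Build Gitea repo URLs with both path segments encoded (hypothetical helper)
export function giteaRepoUrl(baseUrl: string, owner: string, repoName: string): string {
  return `${baseUrl}/api/v1/repos/${encodeURIComponent(owner)}/${encodeURIComponent(repoName)}`;
}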
---
MEDIUM SEVERITY VULNERABILITIES
Vuln 12: Weak Session Configuration - src/lib/auth.ts:105
Severity: MEDIUM | Confidence: 0.85
Description: 30-day session expiration is excessive; missing explicit security flags.
Recommendation:
session: {
expiresIn: 60 * 60 * 24 * 7, // 7 days max
cookieOptions: {
secure: process.env.NODE_ENV === 'production',
sameSite: 'lax',
httpOnly: true,
},
}
---
Vuln 13: Missing CSRF Protection
Severity: MEDIUM | Confidence: 0.80
Description: No visible CSRF token validation on state-changing operations.
Recommendation: Verify Better Auth CSRF protection is enabled; implement explicit tokens if needed.
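If explicit protection turns out to be necessary, one option that does not depend on Better Auth internals is an Origin check on state-changing requests in an Astro middleware. This is only a sketch; sourcing the allow-list from BETTER_AUTH_TRUSTED_ORIGINS is an assumption:
// src/middleware.ts (sketch): reject cross-origin mutating requests
import { defineMiddleware } from "astro:middleware";
const trustedOrigins = (process.env.BETTER_AUTH_TRUSTED_ORIGINS ?? "")
  .split(",")
  .map((o) => o.trim())
  .filter(Boolean);
export const onRequest = defineMiddleware(async (context, next) => {
  const { request } = context;
  if (!["GET", "HEAD", "OPTIONS"].includes(request.method)) {
    const origin = request.headers.get("origin");
    if (origin && trustedOrigins.length > 0 && !trustedOrigins.includes(origin)) {
      return new Response("Forbidden: untrusted origin", { status: 403 });
    }
  }
  return next();
});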
---
Vuln 14: Weak Random Number Generation - src/lib/utils.ts:12
Severity: MEDIUM | Confidence: 0.85
Description: generateRandomString() uses Math.random() (not cryptographically secure) for OAuth client credentials.
Recommendation:
// Use existing secure function
const clientId = `client_${generateSecureToken(16)}`;
const clientSecret = `secret_${generateSecureToken(24)}`;
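For reference, an equivalent secure generator looks like the sketch below; the project's existing generateSecureToken may differ in output encoding:
import crypto from "node:crypto";
// Cryptographically secure token generator (sketch; hex output assumed)
export function generateSecureToken(byteLength: number): string {
  return crypto.randomBytes(byteLength).toString("hex");
}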
---
Vuln 15-18: Input Validation Issues
Severity: MEDIUM | Confidence: 0.85
Affected:
- src/pages/api/repositories/[id].ts:24 - destinationOrg not validated
- src/pages/api/organizations/[id].ts:24 - destinationOrg not validated
- src/pages/api/job/mirror-repo.ts:19 - repositoryIds array not validated
- src/lib/helpers.ts:49 - Repository names in messages not sanitized
Recommendation: Use Zod schemas for all user inputs:
const updateRepoSchema = z.object({
destinationOrg: z.string()
.min(1)
.max(100)
.regex(/^[a-zA-Z0-9\-_\.]+$/)
.nullable()
.optional()
});
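For the repositoryIds case, a similar sketch could look like the following; the array bounds and ID shape are assumptions:
// Sketch: validate the mirror-repo payload before processing
import type { APIRoute } from "astro";
import { z } from "zod";
const mirrorRepoSchema = z.object({
  repositoryIds: z.array(z.string().min(1)).min(1).max(500),
});
export const POST: APIRoute = async (context) => {
  const parsed = mirrorRepoSchema.safeParse(await context.request.json());
  if (!parsed.success) {
    return new Response(JSON.stringify({ error: "Invalid request body" }), { status: 400 });
  }
  // ...continue with parsed.data.repositoryIds...
  return new Response(JSON.stringify({ success: true }), { status: 200 });
};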
---
LOW SEVERITY VULNERABILITIES
Vuln 19: XSS via Astro set:html - src/pages/docs/quickstart.astro:102
Severity: LOW | Confidence: 0.85
Description: Uses set:html with static data; risky pattern if copied with dynamic content.
Recommendation:
<p class="text-sm" set:text={item.text}></p>
---
Vuln 20: Open Redirect in OAuth - src/components/oauth/ConsentPage.tsx:118
Severity: LOW | Confidence: 0.90
Description: OAuth redirect uses window.location.href without explicit protocol validation.
Recommendation:
// Parse the redirect target first (redirectUri stands for the value currently passed to window.location.href)
const url = new URL(redirectUri, window.location.origin);
if (!['http:', 'https:'].includes(url.protocol)) {
  throw new Error('Invalid protocol');
}
window.location.assign(url.toString());
---
POSITIVE SECURITY FINDINGS ✅
The codebase demonstrates excellent security practices:
1. SQL Injection Protection - Drizzle ORM with parameterized queries throughout
2. Token Encryption - AES-256-GCM for GitHub/Gitea tokens
3. XSS Protection - React auto-escaping, no dangerouslySetInnerHTML found
4. Password Hashing - bcrypt via Better Auth
5. Secure Random - crypto.randomBytes() in encryption functions
6. Input Validation - Comprehensive Zod schemas
7. Safe Path Operations - path.join() with proper base directories
8. No Command Execution - TypeScript codebase doesn't use child_process
9. Authentication Framework - Better Auth with session management
10. Type Safety - TypeScript strict mode
---
REMEDIATION PRIORITY
Phase 1: EMERGENCY (Deploy Today)
1. Fix BOLA vulnerabilities - Add authentication and userId validation (Vuln 1)
2. Disable or secure header auth - Add IP whitelist validation (Vuln 2)
3. Add authentication checks - Protect unprotected endpoints (Vuln 3)
Phase 2: URGENT (This Week)
1. Implement token masking - Don't return decrypted tokens (Vuln 4)
2. Fix SSRF vulnerabilities - Add URL validation (Vuln 8, 9)
3. Fix command injection - Remove eval from build script (Vuln 10)
4. Add URL encoding - Path traversal fix (Vuln 11)
Phase 3: HIGH (This Sprint)
1. Fix KDF salt - Generate unique salt per installation (Vuln 5)
2. Enforce strong secrets - Fail if weak defaults used (Vuln 6)
3. Encrypt OAuth secrets - Match GitHub/Gitea token encryption (Vuln 7)
4. Reduce session expiration - 7 days maximum (Vuln 12)
5. Add CSRF protection - Explicit tokens (Vuln 13)
Phase 4: MEDIUM (Next Sprint)
1. Input validation - Zod schemas for all endpoints (Vuln 15-18)
2. Fix weak RNG - Use crypto.randomBytes (Vuln 14)
Phase 5: LOW (Backlog)
1. Remove set:html - Use safer alternatives (Vuln 19)
2. OAuth redirect validation - Add protocol check (Vuln 20)
---
TESTING RECOMMENDATIONS
Proof of Concept Tests
# Test BOLA
curl -X POST http://localhost:4321/api/config/index \
-H "Content-Type: application/json" \
-H "Cookie: better-auth-session=<attacker-session>" \
-d '{"userId":"victim-id","githubConfig":{...}}'
# Test header spoofing
curl http://localhost:4321/api/config/index?userId=admin \
-H "X-Authentik-Username: admin" \
-H "X-Authentik-Email: admin@example.com"
# Test SSRF
curl -X POST http://localhost:4321/api/sso/discover \
-H "Content-Type: application/json" \
-d '{"issuer":"http://169.254.169.254/latest/meta-data"}'
---
SECURITY GRADE
Overall Security Grade: C+
- Would be B+ after Phase 1-2 fixes
- Would be A- after all fixes
Critical Issues: 2 (authorization bypass, auth spoofing)
Risk Level: HIGH due to BOLA allowing complete account takeover
---
CONCLUSION
The Gitea Mirror application has a strong security foundation with proper encryption, ORM usage, and XSS protection. However, critical authorization
flaws in API endpoints allow authenticated users to access any other user's data. These must be fixed immediately before production deployment.
The development team has clearly prioritized security (encryption, input validation, type safety), and the identified issues are fixable with
moderate effort. After remediation, this will be a secure, production-ready application.

View File

@@ -36,7 +36,7 @@
"@types/react-dom": "^19.2.2",
"astro": "^5.14.8",
"bcryptjs": "^3.0.2",
"better-auth": "1.4.0-beta.12",
"better-auth": "1.4.0-beta.13",
"buffer": "^6.0.3",
"canvas-confetti": "^1.9.3",
"class-variance-authority": "^0.7.1",
@@ -150,11 +150,11 @@
"@babel/types": ["@babel/types@7.28.4", "", { "dependencies": { "@babel/helper-string-parser": "^7.27.1", "@babel/helper-validator-identifier": "^7.27.1" } }, "sha512-bkFqkLhh3pMBUQQkpVgWDWq/lqzc2678eUyDlTBhRqhCHFguYYGM0Efga7tYk4TogG/3x0EEl66/OQ+WGbWB/Q=="],
"@better-auth/core": ["@better-auth/core@1.4.0-beta.12", "", { "dependencies": { "zod": "^4.1.5" }, "peerDependencies": { "@better-auth/utils": "0.3.0", "@better-fetch/fetch": "1.1.18", "better-call": "1.0.24", "better-sqlite3": "^12.4.1", "jose": "^6.1.0", "kysely": "^0.28.5", "nanostores": "^1.0.1" } }, "sha512-2GisAGuSVZS4gtnwP5Owk3RyC6GevZe9zcODTrtbwRCvBTrHUmu0j6bcklK9uNG8DaWDmzCK1+VGA5qIHzg5Pw=="],
"@better-auth/core": ["@better-auth/core@1.4.0-beta.13", "", { "dependencies": { "zod": "^4.1.5" }, "peerDependencies": { "@better-auth/utils": "0.3.0", "@better-fetch/fetch": "1.1.18", "better-call": "1.0.24", "jose": "^6.1.0", "kysely": "^0.28.5", "nanostores": "^1.0.1" } }, "sha512-EGySsNv6HQYnlRQDIa7otIMrwFoC0gGLxBum9lC6C3wAsF4l4pn/ECcdIriFpc9ewLb8mGkeMSpvjVBUBND6ew=="],
"@better-auth/sso": ["@better-auth/sso@1.4.0-beta.12", "", { "dependencies": { "@better-fetch/fetch": "1.1.18", "fast-xml-parser": "^5.2.5", "jose": "^6.1.0", "oauth2-mock-server": "^7.2.1", "samlify": "^2.10.1", "zod": "^4.1.5" }, "peerDependencies": { "better-auth": "1.4.0-beta.12" } }, "sha512-iuRuy59J3yXQihZJ34rqYClWyuVjSkxuBkdFblccKbOhNy7pmRO1lfmBMpyeth3ET5Cp0PDVV/z1XBbDcQp0LA=="],
"@better-auth/telemetry": ["@better-auth/telemetry@1.4.0-beta.12", "", { "dependencies": { "@better-auth/utils": "0.3.0", "@better-fetch/fetch": "1.1.18" }, "peerDependencies": { "@better-auth/core": "1.4.0-beta.12" } }, "sha512-pQ5HITRGXMHQPcPCDnz0xlxFqqxvpD4kQMvY6cdt1vDsPVePHAj9R3S318XEfaw3NAgtw3af/wCN6eBt2u4Kew=="],
"@better-auth/telemetry": ["@better-auth/telemetry@1.4.0-beta.13", "", { "dependencies": { "@better-auth/utils": "0.3.0", "@better-fetch/fetch": "1.1.18" }, "peerDependencies": { "@better-auth/core": "1.4.0-beta.13" } }, "sha512-910f+APALhhD79TiujzXp85Pnd2M3TlcTgBfiYF+mk3ouIkBJkl2N6D2ElcgwfiNTg50cFuTkP3AFPYioz8Arw=="],
"@better-auth/utils": ["@better-auth/utils@0.3.0", "", {}, "sha512-W+Adw6ZA6mgvnSnhOki270rwJ42t4XzSK6YWGF//BbVXL6SwCLWfyzBc1lN2m/4RM28KubdBKQ4X5VMoLRNPQw=="],
@@ -698,7 +698,7 @@
"before-after-hook": ["before-after-hook@4.0.0", "", {}, "sha512-q6tR3RPqIB1pMiTRMFcZwuG5T8vwp+vUvEG0vuI6B+Rikh5BfPp2fQ82c925FOs+b0lcFQ8CFrL+KbilfZFhOQ=="],
"better-auth": ["better-auth@1.4.0-beta.12", "", { "dependencies": { "@better-auth/core": "1.4.0-beta.12", "@better-auth/telemetry": "1.4.0-beta.12", "@better-auth/utils": "0.3.0", "@better-fetch/fetch": "1.1.18", "@noble/ciphers": "^2.0.0", "@noble/hashes": "^2.0.0", "@simplewebauthn/browser": "^13.1.2", "@simplewebauthn/server": "^13.1.2", "better-call": "1.0.24", "defu": "^6.1.4", "jose": "^6.1.0", "kysely": "^0.28.5", "nanostores": "^1.0.1", "zod": "^4.1.5" } }, "sha512-IvrSBmQkHgOinDh6JyJCoKwbMPmHpkmt98/0hBU9Nc0s7Y7u72AOx1Z35J2dRQxxX4SzvFQ9pHqlV6wPnm72Ww=="],
"better-auth": ["better-auth@1.4.0-beta.13", "", { "dependencies": { "@better-auth/core": "1.4.0-beta.13", "@better-auth/telemetry": "1.4.0-beta.13", "@better-auth/utils": "0.3.0", "@better-fetch/fetch": "1.1.18", "@noble/ciphers": "^2.0.0", "@noble/hashes": "^2.0.0", "@simplewebauthn/browser": "^13.1.2", "@simplewebauthn/server": "^13.1.2", "better-call": "1.0.24", "defu": "^6.1.4", "jose": "^6.1.0", "kysely": "^0.28.5", "nanostores": "^1.0.1", "zod": "^4.1.5" } }, "sha512-VOzbsCldupk2AdNfzDmpCVajX83nwITX8S9I8TdEUURgr3kB/CDVrsN6S8t0AClMnGgB4XaeKiXUNN30CCG4aA=="],
"better-call": ["better-call@1.0.24", "", { "dependencies": { "@better-auth/utils": "^0.3.0", "@better-fetch/fetch": "^1.1.4", "rou3": "^0.5.1", "set-cookie-parser": "^2.7.1", "uncrypto": "^0.1.3" } }, "sha512-iGqL29cstPp4xLD2MjKL1EmyAqQHjYS+cBMt4W27rPs3vf+kuqkVPA0NYaf7JciBOzVsJdNj4cbZWXC5TardWQ=="],

View File

@@ -81,6 +81,26 @@ Replace `{provider-id}` with your chosen Provider ID.
- Client Secret: [Your Okta Client Secret]
- Click "Discover" to auto-fill endpoints
### Example: Authentik SSO Setup
Working Authentik deployments (see [#134](https://github.com/RayLabsHQ/gitea-mirror/issues/134)) follow these steps:
1. In Authentik, create a new **Application** and OIDC **Provider** (implicit flow works well for testing).
2. Start creating an SSO provider inside Gitea Mirror so you can copy the redirect URL shown (`https://your-domain.com/api/auth/sso/callback/authentik` if you pick `authentik` as your Provider ID).
3. Paste that redirect URL into the Authentik Provider configuration and finish creating the provider.
4. Copy the Authentik issuer URL, client ID, and client secret.
5. Back in Gitea Mirror:
- Issuer URL: the exact value from Authentik (keep any trailing slash Authentik shows).
- Provider ID: match the one you used in step 2.
- Click **Discover** so Gitea Mirror stores the authorization, token, and JWKS endpoints (Authentik publishes them via discovery).
- Domain: enter the email domain you expect to match (e.g. `example.com`).
6. Save the provider and test the login flow.
Notes:
- Make sure `BETTER_AUTH_URL` and (if you serve the UI from multiple origins) `BETTER_AUTH_TRUSTED_ORIGINS` point at the public URL users reach. A mismatch can surface as 500 errors after redirect.
- Authentik must report the user's email as verified (default behavior) so Gitea Mirror can auto-link accounts.
- If you created an Authentik provider before v3.8.10 you should delete it and re-add it after upgrading; older versions saved incomplete endpoint data which leads to the `url.startsWith` error explained in the Troubleshooting section.
## Setting up OIDC Provider
The OIDC Provider feature allows other applications to use Gitea Mirror as their authentication provider.
@@ -165,6 +185,7 @@ When an application requests authentication:
1. **"Invalid origin" error**: Check that your Gitea Mirror URL matches the configured redirect URI
2. **"Provider not found" error**: Ensure the provider is properly configured and enabled
3. **Redirect loop**: Verify the redirect URI in both Gitea Mirror and the SSO provider match exactly
4. **`TypeError: undefined is not an object (evaluating 'url.startsWith')`**: This indicates the stored provider configuration is missing OIDC endpoints. Delete the provider from Gitea Mirror and re-register it using the **Discover** button so authorization/token URLs are saved (see [#73](https://github.com/RayLabsHQ/gitea-mirror/issues/73) and [#122](https://github.com/RayLabsHQ/gitea-mirror/issues/122) for examples).
### OIDC Provider Issues
@@ -202,4 +223,4 @@ This immediately prevents the application from authenticating new users.
If migrating from the previous JWT-based authentication:
- Existing users remain unaffected
- Users can continue using email/password authentication
- SSO can be added as an additional authentication method
- SSO can be added as an additional authentication method

View File

@@ -0,0 +1,4 @@
ALTER TABLE `accounts` ADD `id_token` text;--> statement-breakpoint
ALTER TABLE `accounts` ADD `access_token_expires_at` integer;--> statement-breakpoint
ALTER TABLE `accounts` ADD `refresh_token_expires_at` integer;--> statement-breakpoint
ALTER TABLE `accounts` ADD `scope` text;

View File

@@ -0,0 +1,18 @@
ALTER TABLE `organizations` ADD `normalized_name` text NOT NULL DEFAULT '';--> statement-breakpoint
UPDATE `organizations` SET `normalized_name` = lower(trim(`name`));--> statement-breakpoint
DELETE FROM `organizations`
WHERE rowid NOT IN (
SELECT MIN(rowid)
FROM `organizations`
GROUP BY `user_id`, `normalized_name`
);--> statement-breakpoint
CREATE UNIQUE INDEX `uniq_organizations_user_normalized_name` ON `organizations` (`user_id`,`normalized_name`);--> statement-breakpoint
ALTER TABLE `repositories` ADD `normalized_full_name` text NOT NULL DEFAULT '';--> statement-breakpoint
UPDATE `repositories` SET `normalized_full_name` = lower(trim(`full_name`));--> statement-breakpoint
DELETE FROM `repositories`
WHERE rowid NOT IN (
SELECT MIN(rowid)
FROM `repositories`
GROUP BY `user_id`, `normalized_full_name`
);--> statement-breakpoint
CREATE UNIQUE INDEX `uniq_repositories_user_normalized_full_name` ON `repositories` (`user_id`,`normalized_full_name`);

View File

@@ -0,0 +1 @@
ALTER TABLE `repositories` ADD `metadata` text;

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -43,6 +43,27 @@
"when": 1757786449446,
"tag": "0005_polite_preak",
"breakpoints": true
},
{
"idx": 6,
"version": "6",
"when": 1761483928546,
"tag": "0006_military_la_nuit",
"breakpoints": true
},
{
"idx": 7,
"version": "6",
"when": 1761534391115,
"tag": "0007_whole_hellion",
"breakpoints": true
},
{
"idx": 8,
"version": "6",
"when": 1761802056073,
"tag": "0008_serious_thena",
"breakpoints": true
}
]
}

View File

@@ -1,7 +1,7 @@
{
"name": "gitea-mirror",
"type": "module",
"version": "3.8.7",
"version": "3.9.0",
"engines": {
"bun": ">=1.2.9"
},
@@ -75,7 +75,7 @@
"astro": "^5.14.8",
"bcryptjs": "^3.0.2",
"buffer": "^6.0.3",
"better-auth": "1.4.0-beta.12",
"better-auth": "1.4.0-beta.13",
"canvas-confetti": "^1.9.3",
"class-variance-authority": "^0.7.1",
"clsx": "^2.1.1",

public/apple-touch-icon.png: new binary file (31 KiB); preview not shown.

Binary image replaced (15 KiB before, 15 KiB after); file name not shown in this view; preview not shown.

public/favicon.svg: new file (3 lines, 216 KiB); diff suppressed because one or more lines are too long.

View File

@@ -114,10 +114,10 @@ EOF
echo "======================================"
echo "1. Access Authentik at http://localhost:9000"
echo "2. Login with akadmin / admin-password"
echo "3. Create OAuth2 Provider for Gitea Mirror:"
echo "3. Create an Authentik OIDC Provider for Gitea Mirror:"
echo " - Name: gitea-mirror"
echo " - Redirect URIs:"
echo " http://localhost:4321/api/auth/callback/sso-provider"
echo " - Redirect URI:"
echo " http://localhost:4321/api/auth/sso/callback/authentik"
echo " - Scopes: openid, profile, email"
echo ""
echo "4. Create Application:"
@@ -131,10 +131,14 @@ EOF
echo "6. Configure SSO in Gitea Mirror:"
echo " - Go to Settings → Authentication & SSO"
echo " - Add provider with:"
echo " - Provider ID: authentik"
echo " - Issuer URL: http://localhost:9000/application/o/gitea-mirror/"
echo " - Click Discover to pull Authentik endpoints"
echo " - Client ID: (from Authentik provider)"
echo " - Client Secret: (from Authentik provider)"
echo ""
echo "If you previously registered this provider on a version earlier than v3.8.10, delete it and re-add it after upgrading to avoid missing endpoint data."
echo ""
;;
stop)
@@ -177,4 +181,4 @@ EOF
echo " status - Show service status"
exit 1
;;
esac
esac

View File

@@ -306,7 +306,7 @@ export function Dashboard() {
title="Repositories"
value={repoCount}
icon={<GitFork className="h-4 w-4" />}
description="Total in mirror queue"
description="Total imported repositories"
/>
<StatusCard
title="Mirrored"

View File

@@ -1,5 +1,5 @@
import * as React from "react";
import { useState } from "react";
import { useEffect, useState } from "react";
import { Button } from "@/components/ui/button";
import {
Dialog,
@@ -20,9 +20,11 @@ interface AddOrganizationDialogProps {
onAddOrganization: ({
org,
role,
force,
}: {
org: string;
role: MembershipRole;
force?: boolean;
}) => Promise<void>;
}
@@ -36,6 +38,14 @@ export default function AddOrganizationDialog({
const [isLoading, setIsLoading] = useState<boolean>(false);
const [error, setError] = useState<string>("");
useEffect(() => {
if (!isDialogOpen) {
setError("");
setOrg("");
setRole("member");
}
}, [isDialogOpen]);
const handleSubmit = async (e: React.FormEvent) => {
e.preventDefault();
@@ -54,7 +64,7 @@ export default function AddOrganizationDialog({
setRole("member");
setIsDialogOpen(false);
} catch (err: any) {
setError(err?.message || "Failed to add repository.");
setError(err?.message || "Failed to add organization.");
} finally {
setIsLoading(false);
}
@@ -139,7 +149,7 @@ export default function AddOrganizationDialog({
{isLoading ? (
<LoaderCircle className="h-4 w-4 animate-spin" />
) : (
"Add Repository"
"Add Organization"
)}
</Button>
</div>

View File

@@ -1,6 +1,6 @@
import { useCallback, useEffect, useState } from "react";
import { Button } from "@/components/ui/button";
import { Search, RefreshCw, FlipHorizontal, Filter } from "lucide-react";
import { Search, RefreshCw, FlipHorizontal, Filter, LoaderCircle, Trash2 } from "lucide-react";
import type { MirrorJob, Organization } from "@/lib/db/schema";
import { OrganizationList } from "./OrganizationsList";
import AddOrganizationDialog from "./AddOrganizationDialog";
@@ -37,6 +37,14 @@ import {
DrawerTitle,
DrawerTrigger,
} from "@/components/ui/drawer";
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from "@/components/ui/dialog";
export function Organization() {
const [organizations, setOrganizations] = useState<Organization[]>([]);
@@ -52,6 +60,15 @@ export function Organization() {
status: "",
});
const [loadingOrgIds, setLoadingOrgIds] = useState<Set<string>>(new Set()); // this is used when the api actions are performed
const [duplicateOrgCandidate, setDuplicateOrgCandidate] = useState<{
org: string;
role: MembershipRole;
} | null>(null);
const [isDuplicateOrgDialogOpen, setIsDuplicateOrgDialogOpen] = useState(false);
const [isProcessingDuplicateOrg, setIsProcessingDuplicateOrg] = useState(false);
const [orgToDelete, setOrgToDelete] = useState<Organization | null>(null);
const [isDeleteOrgDialogOpen, setIsDeleteOrgDialogOpen] = useState(false);
const [isDeletingOrg, setIsDeletingOrg] = useState(false);
// Create a stable callback using useCallback
const handleNewMessage = useCallback((data: MirrorJob) => {
@@ -256,19 +273,45 @@ export function Organization() {
const handleAddOrganization = async ({
org,
role,
force = false,
}: {
org: string;
role: MembershipRole;
force?: boolean;
}) => {
try {
if (!user || !user.id) {
return;
if (!user || !user.id) {
return;
}
const trimmedOrg = org.trim();
const normalizedOrg = trimmedOrg.toLowerCase();
if (!trimmedOrg) {
toast.error("Please enter a valid organization name.");
throw new Error("Invalid organization name");
}
if (!force) {
const alreadyExists = organizations.some(
(existing) => existing.name?.trim().toLowerCase() === normalizedOrg
);
if (alreadyExists) {
toast.warning("Organization already exists.");
setDuplicateOrgCandidate({ org: trimmedOrg, role });
setIsDuplicateOrgDialogOpen(true);
throw new Error("Organization already exists");
}
}
try {
setIsLoading(true);
const reqPayload: AddOrganizationApiRequest = {
userId: user.id,
org,
org: trimmedOrg,
role,
force,
};
const response = await apiRequest<AddOrganizationApiResponse>(
@@ -280,25 +323,100 @@ export function Organization() {
);
if (response.success) {
toast.success(`Organization added successfully`);
setOrganizations((prev) => [...prev, response.organization]);
const message = force
? "Organization already exists; using existing entry."
: "Organization added successfully";
toast.success(message);
await fetchOrganizations();
await fetchOrganizations(false);
setFilter((prev) => ({
...prev,
searchTerm: org,
searchTerm: trimmedOrg,
}));
if (force) {
setIsDuplicateOrgDialogOpen(false);
setDuplicateOrgCandidate(null);
}
} else {
showErrorToast(response.error || "Error adding organization", toast);
}
} catch (error) {
showErrorToast(error, toast);
throw error;
} finally {
setIsLoading(false);
}
};
const handleConfirmDuplicateOrganization = async () => {
if (!duplicateOrgCandidate) {
return;
}
setIsProcessingDuplicateOrg(true);
try {
await handleAddOrganization({
org: duplicateOrgCandidate.org,
role: duplicateOrgCandidate.role,
force: true,
});
setIsDialogOpen(false);
setDuplicateOrgCandidate(null);
setIsDuplicateOrgDialogOpen(false);
} catch (error) {
// Error already surfaced via toast
} finally {
setIsProcessingDuplicateOrg(false);
}
};
const handleCancelDuplicateOrganization = () => {
setIsDuplicateOrgDialogOpen(false);
setDuplicateOrgCandidate(null);
};
const handleRequestDeleteOrganization = (orgId: string) => {
const org = organizations.find((item) => item.id === orgId);
if (!org) {
toast.error("Organization not found");
return;
}
setOrgToDelete(org);
setIsDeleteOrgDialogOpen(true);
};
const handleDeleteOrganization = async () => {
if (!user || !user.id || !orgToDelete) {
return;
}
setIsDeletingOrg(true);
try {
const response = await apiRequest<{ success: boolean; error?: string }>(
`/organizations/${orgToDelete.id}`,
{
method: "DELETE",
}
);
if (response.success) {
toast.success(`Removed ${orgToDelete.name} from Gitea Mirror.`);
await fetchOrganizations(false);
} else {
showErrorToast(response.error || "Failed to delete organization", toast);
}
} catch (error) {
showErrorToast(error, toast);
} finally {
setIsDeletingOrg(false);
setIsDeleteOrgDialogOpen(false);
setOrgToDelete(null);
}
};
const handleMirrorAllOrgs = async () => {
try {
if (!user || !user.id || organizations.length === 0) {
@@ -711,6 +829,7 @@ export function Organization() {
onMirror={handleMirrorOrg}
onIgnore={handleIgnoreOrg}
onAddOrganization={() => setIsDialogOpen(true)}
onDelete={handleRequestDeleteOrganization}
onRefresh={async () => {
await fetchOrganizations(false);
}}
@@ -721,6 +840,68 @@ export function Organization() {
isDialogOpen={isDialogOpen}
setIsDialogOpen={setIsDialogOpen}
/>
<Dialog open={isDuplicateOrgDialogOpen} onOpenChange={(open) => {
if (!open) {
handleCancelDuplicateOrganization();
}
}}>
<DialogContent>
<DialogHeader>
<DialogTitle>Organization already exists</DialogTitle>
<DialogDescription>
{duplicateOrgCandidate?.org ?? "This organization"} is already synced in Gitea Mirror.
Continuing will reuse the existing entry without creating a duplicate. You can remove it later if needed.
</DialogDescription>
</DialogHeader>
<DialogFooter>
<Button variant="outline" onClick={handleCancelDuplicateOrganization} disabled={isProcessingDuplicateOrg}>
Cancel
</Button>
<Button onClick={handleConfirmDuplicateOrganization} disabled={isProcessingDuplicateOrg}>
{isProcessingDuplicateOrg ? (
<LoaderCircle className="h-4 w-4 animate-spin" />
) : (
"Continue"
)}
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
<Dialog open={isDeleteOrgDialogOpen} onOpenChange={(open) => {
if (!open) {
setIsDeleteOrgDialogOpen(false);
setOrgToDelete(null);
}
}}>
<DialogContent>
<DialogHeader>
<DialogTitle>Remove organization from Gitea Mirror?</DialogTitle>
<DialogDescription>
{orgToDelete?.name ?? "This organization"} will be deleted from Gitea Mirror only. Nothing will be removed from Gitea; you will need to clean it up manually in Gitea if desired.
</DialogDescription>
</DialogHeader>
<DialogFooter>
<Button variant="outline" onClick={() => {
setIsDeleteOrgDialogOpen(false);
setOrgToDelete(null);
}} disabled={isDeletingOrg}>
Cancel
</Button>
<Button variant="destructive" onClick={handleDeleteOrganization} disabled={isDeletingOrg}>
{isDeletingOrg ? (
<LoaderCircle className="h-4 w-4 animate-spin" />
) : (
<span className="flex items-center gap-2">
<Trash2 className="h-4 w-4" />
Delete
</span>
)}
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
</div>
);
}

View File

@@ -2,7 +2,7 @@ import { useMemo } from "react";
import { Card } from "@/components/ui/card";
import { Button } from "@/components/ui/button";
import { Badge } from "@/components/ui/badge";
import { Plus, RefreshCw, Building2, Check, AlertCircle, Clock, MoreVertical, Ban } from "lucide-react";
import { Plus, RefreshCw, Building2, Check, AlertCircle, Clock, MoreVertical, Ban, Trash2 } from "lucide-react";
import { SiGithub, SiGitea } from "react-icons/si";
import type { Organization } from "@/lib/db/schema";
import type { FilterParams } from "@/types/filter";
@@ -30,6 +30,7 @@ interface OrganizationListProps {
loadingOrgIds: Set<string>;
onAddOrganization?: () => void;
onRefresh?: () => Promise<void>;
onDelete?: (orgId: string) => void;
}
// Helper function to get status badge variant and icon
@@ -60,6 +61,7 @@ export function OrganizationList({
loadingOrgIds,
onAddOrganization,
onRefresh,
onDelete,
}: OrganizationListProps) {
const { giteaConfig } = useGiteaConfig();
@@ -414,7 +416,7 @@ export function OrganizationList({
)}
{/* Dropdown menu for additional actions */}
{org.status !== "ignored" && org.status !== "mirroring" && (
{org.status !== "mirroring" && (
<DropdownMenu>
<DropdownMenuTrigger asChild>
<Button variant="ghost" size="icon" disabled={isLoading} className="h-10 w-10">
@@ -422,12 +424,26 @@ export function OrganizationList({
</Button>
</DropdownMenuTrigger>
<DropdownMenuContent align="end">
<DropdownMenuItem
onClick={() => org.id && onIgnore && onIgnore({ orgId: org.id, ignore: true })}
>
<Ban className="h-4 w-4 mr-2" />
Ignore Organization
</DropdownMenuItem>
{org.status !== "ignored" && (
<DropdownMenuItem
onClick={() => org.id && onIgnore && onIgnore({ orgId: org.id, ignore: true })}
>
<Ban className="h-4 w-4 mr-2" />
Ignore Organization
</DropdownMenuItem>
)}
{onDelete && (
<>
{org.status !== "ignored" && <DropdownMenuSeparator />}
<DropdownMenuItem
className="text-destructive focus:text-destructive"
onClick={() => org.id && onDelete(org.id)}
>
<Trash2 className="h-4 w-4 mr-2" />
Delete from Mirror
</DropdownMenuItem>
</>
)}
</DropdownMenuContent>
</DropdownMenu>
)}
@@ -561,7 +577,7 @@ export function OrganizationList({
)}
{/* Dropdown menu for additional actions */}
{org.status !== "ignored" && org.status !== "mirroring" && (
{org.status !== "mirroring" && (
<DropdownMenu>
<DropdownMenuTrigger asChild>
<Button variant="ghost" size="icon" disabled={isLoading}>
@@ -569,12 +585,26 @@ export function OrganizationList({
</Button>
</DropdownMenuTrigger>
<DropdownMenuContent align="end">
<DropdownMenuItem
onClick={() => org.id && onIgnore && onIgnore({ orgId: org.id, ignore: true })}
>
<Ban className="h-4 w-4 mr-2" />
Ignore Organization
</DropdownMenuItem>
{org.status !== "ignored" && (
<DropdownMenuItem
onClick={() => org.id && onIgnore && onIgnore({ orgId: org.id, ignore: true })}
>
<Ban className="h-4 w-4 mr-2" />
Ignore Organization
</DropdownMenuItem>
)}
{onDelete && (
<>
{org.status !== "ignored" && <DropdownMenuSeparator />}
<DropdownMenuItem
className="text-destructive focus:text-destructive"
onClick={() => org.id && onDelete(org.id)}
>
<Trash2 className="h-4 w-4 mr-2" />
Delete from Mirror
</DropdownMenuItem>
</>
)}
</DropdownMenuContent>
</DropdownMenu>
)}

View File

@@ -1,5 +1,5 @@
import * as React from "react";
import { useState } from "react";
import { useEffect, useState } from "react";
import { Button } from "@/components/ui/button";
import {
Dialog,
@@ -17,9 +17,11 @@ interface AddRepositoryDialogProps {
onAddRepository: ({
repo,
owner,
force,
}: {
repo: string;
owner: string;
force?: boolean;
}) => Promise<void>;
}
@@ -33,6 +35,14 @@ export default function AddRepositoryDialog({
const [isLoading, setIsLoading] = useState<boolean>(false);
const [error, setError] = useState<string>("");
useEffect(() => {
if (!isDialogOpen) {
setError("");
setRepo("");
setOwner("");
}
}, [isDialogOpen]);
const handleSubmit = async (e: React.FormEvent) => {
e.preventDefault();

View File

@@ -18,7 +18,7 @@ import {
SelectValue,
} from "../ui/select";
import { Button } from "@/components/ui/button";
import { Search, RefreshCw, FlipHorizontal, RotateCcw, X, Filter, Ban, Check } from "lucide-react";
import { Search, RefreshCw, FlipHorizontal, RotateCcw, X, Filter, Ban, Check, LoaderCircle, Trash2 } from "lucide-react";
import type { MirrorRepoRequest, MirrorRepoResponse } from "@/types/mirror";
import {
Drawer,
@@ -30,6 +30,14 @@ import {
DrawerTitle,
DrawerTrigger,
} from "@/components/ui/drawer";
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from "@/components/ui/dialog";
import { useSSE } from "@/hooks/useSEE";
import { useFilterParams } from "@/hooks/useFilterParams";
import { toast } from "sonner";
@@ -69,6 +77,15 @@ export default function Repository() {
}, [setFilter]);
const [loadingRepoIds, setLoadingRepoIds] = useState<Set<string>>(new Set()); // this is used when the api actions are performed
const [duplicateRepoCandidate, setDuplicateRepoCandidate] = useState<{
owner: string;
repo: string;
} | null>(null);
const [isDuplicateRepoDialogOpen, setIsDuplicateRepoDialogOpen] = useState(false);
const [isProcessingDuplicateRepo, setIsProcessingDuplicateRepo] = useState(false);
const [repoToDelete, setRepoToDelete] = useState<Repository | null>(null);
const [isDeleteRepoDialogOpen, setIsDeleteRepoDialogOpen] = useState(false);
const [isDeletingRepo, setIsDeletingRepo] = useState(false);
// Create a stable callback using useCallback
const handleNewMessage = useCallback((data: MirrorJob) => {
@@ -618,19 +635,45 @@ export default function Repository() {
const handleAddRepository = async ({
repo,
owner,
force = false,
}: {
repo: string;
owner: string;
force?: boolean;
}) => {
try {
if (!user || !user.id) {
return;
}
if (!user || !user.id) {
return;
}
const trimmedRepo = repo.trim();
const trimmedOwner = owner.trim();
if (!trimmedRepo || !trimmedOwner) {
toast.error("Please provide both owner and repository name.");
throw new Error("Invalid repository details");
}
const normalizedFullName = `${trimmedOwner}/${trimmedRepo}`.toLowerCase();
if (!force) {
const duplicateRepo = repositories.find(
(existing) => existing.normalizedFullName?.toLowerCase() === normalizedFullName
);
if (duplicateRepo) {
toast.warning("Repository already exists.");
setDuplicateRepoCandidate({ repo: trimmedRepo, owner: trimmedOwner });
setIsDuplicateRepoDialogOpen(true);
throw new Error("Repository already exists");
}
}
try {
const reqPayload: AddRepositoriesApiRequest = {
userId: user.id,
repo,
owner,
repo: trimmedRepo,
owner: trimmedOwner,
force,
};
const response = await apiRequest<AddRepositoriesApiResponse>(
@@ -642,20 +685,28 @@ export default function Repository() {
);
if (response.success) {
toast.success(`Repository added successfully`);
setRepositories((prevRepos) => [...prevRepos, response.repository]);
const message = force
? "Repository already exists; metadata refreshed."
: "Repository added successfully";
toast.success(message);
await fetchRepositories(false); // Manual refresh after adding repository
await fetchRepositories(false);
setFilter((prev) => ({
...prev,
searchTerm: repo,
searchTerm: trimmedRepo,
}));
if (force) {
setDuplicateRepoCandidate(null);
setIsDuplicateRepoDialogOpen(false);
}
} else {
showErrorToast(response.error || "Error adding repository", toast);
}
} catch (error) {
showErrorToast(error, toast);
throw error;
}
};
@@ -673,6 +724,71 @@ export default function Repository() {
)
).sort();
const handleConfirmDuplicateRepository = async () => {
if (!duplicateRepoCandidate) {
return;
}
setIsProcessingDuplicateRepo(true);
try {
await handleAddRepository({
repo: duplicateRepoCandidate.repo,
owner: duplicateRepoCandidate.owner,
force: true,
});
setIsDialogOpen(false);
} catch (error) {
// Error already shown
} finally {
setIsProcessingDuplicateRepo(false);
}
};
const handleCancelDuplicateRepository = () => {
setDuplicateRepoCandidate(null);
setIsDuplicateRepoDialogOpen(false);
};
const handleRequestDeleteRepository = (repoId: string) => {
const repo = repositories.find((item) => item.id === repoId);
if (!repo) {
toast.error("Repository not found");
return;
}
setRepoToDelete(repo);
setIsDeleteRepoDialogOpen(true);
};
const handleDeleteRepository = async () => {
if (!user || !user.id || !repoToDelete) {
return;
}
setIsDeletingRepo(true);
try {
const response = await apiRequest<{ success: boolean; error?: string }>(
`/repositories/${repoToDelete.id}`,
{
method: "DELETE",
}
);
if (response.success) {
toast.success(`Removed ${repoToDelete.fullName} from Gitea Mirror.`);
await fetchRepositories(false);
} else {
showErrorToast(response.error || "Failed to delete repository", toast);
}
} catch (error) {
showErrorToast(error, toast);
} finally {
setIsDeletingRepo(false);
setIsDeleteRepoDialogOpen(false);
setRepoToDelete(null);
}
};
// Determine what actions are available for selected repositories
const getAvailableActions = () => {
if (selectedRepoIds.size === 0) return [];
@@ -1198,6 +1314,7 @@ export default function Repository() {
onRefresh={async () => {
await fetchRepositories(false);
}}
onDelete={handleRequestDeleteRepository}
/>
)}
@@ -1206,6 +1323,77 @@ export default function Repository() {
isDialogOpen={isDialogOpen}
setIsDialogOpen={setIsDialogOpen}
/>
<Dialog
open={isDuplicateRepoDialogOpen}
onOpenChange={(open) => {
if (!open) {
handleCancelDuplicateRepository();
}
}}
>
<DialogContent>
<DialogHeader>
<DialogTitle>Repository already exists</DialogTitle>
<DialogDescription>
{duplicateRepoCandidate ? `${duplicateRepoCandidate.owner}/${duplicateRepoCandidate.repo}` : "This repository"} is already tracked in Gitea Mirror. Continuing will refresh the existing entry without creating a duplicate.
</DialogDescription>
</DialogHeader>
<DialogFooter>
<Button variant="outline" onClick={handleCancelDuplicateRepository} disabled={isProcessingDuplicateRepo}>
Cancel
</Button>
<Button onClick={handleConfirmDuplicateRepository} disabled={isProcessingDuplicateRepo}>
{isProcessingDuplicateRepo ? (
<LoaderCircle className="h-4 w-4 animate-spin" />
) : (
"Continue"
)}
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
<Dialog
open={isDeleteRepoDialogOpen}
onOpenChange={(open) => {
if (!open) {
setIsDeleteRepoDialogOpen(false);
setRepoToDelete(null);
}
}}
>
<DialogContent>
<DialogHeader>
<DialogTitle>Remove repository from Gitea Mirror?</DialogTitle>
<DialogDescription>
{repoToDelete?.fullName ?? "This repository"} will be deleted from Gitea Mirror only. The mirror on Gitea will remain untouched; remove it manually in Gitea if needed.
</DialogDescription>
</DialogHeader>
<DialogFooter>
<Button
variant="outline"
onClick={() => {
setIsDeleteRepoDialogOpen(false);
setRepoToDelete(null);
}}
disabled={isDeletingRepo}
>
Cancel
</Button>
<Button variant="destructive" onClick={handleDeleteRepository} disabled={isDeletingRepo}>
{isDeletingRepo ? (
<LoaderCircle className="h-4 w-4 animate-spin" />
) : (
<span className="flex items-center gap-2">
<Trash2 className="h-4 w-4" />
Delete
</span>
)}
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
</div>
);
}

View File

@@ -1,7 +1,7 @@
import { useMemo, useRef } from "react";
import Fuse from "fuse.js";
import { useVirtualizer } from "@tanstack/react-virtual";
import { FlipHorizontal, GitFork, RefreshCw, RotateCcw, Star, Lock, Ban, Check, ChevronDown } from "lucide-react";
import { FlipHorizontal, GitFork, RefreshCw, RotateCcw, Star, Lock, Ban, Check, ChevronDown, Trash2 } from "lucide-react";
import { SiGithub, SiGitea } from "react-icons/si";
import type { Repository } from "@/lib/db/schema";
import { Button } from "@/components/ui/button";
@@ -23,6 +23,7 @@ import {
DropdownMenu,
DropdownMenuContent,
DropdownMenuItem,
DropdownMenuSeparator,
DropdownMenuTrigger,
} from "@/components/ui/dropdown-menu";
@@ -40,6 +41,7 @@ interface RepositoryTableProps {
selectedRepoIds: Set<string>;
onSelectionChange: (selectedIds: Set<string>) => void;
onRefresh?: () => Promise<void>;
onDelete?: (repoId: string) => void;
}
export default function RepositoryTable({
@@ -56,6 +58,7 @@ export default function RepositoryTable({
selectedRepoIds,
onSelectionChange,
onRefresh,
onDelete,
}: RepositoryTableProps) {
const tableParentRef = useRef<HTMLDivElement>(null);
const { giteaConfig } = useGiteaConfig();
@@ -676,6 +679,7 @@ export default function RepositoryTable({
onSync={() => onSync({ repoId: repo.id ?? "" })}
onRetry={() => onRetry({ repoId: repo.id ?? "" })}
onSkip={(skip) => onSkip({ repoId: repo.id ?? "", skip })}
onDelete={onDelete && repo.id ? () => onDelete(repo.id as string) : undefined}
/>
</div>
{/* Links */}
@@ -786,6 +790,7 @@ function RepoActionButton({
onSync,
onRetry,
onSkip,
onDelete,
}: {
repo: { id: string; status: string };
isLoading: boolean;
@@ -793,6 +798,7 @@ function RepoActionButton({
onSync: () => void;
onRetry: () => void;
onSkip: (skip: boolean) => void;
onDelete?: () => void;
}) {
// For ignored repos, show an "Include" action
if (repo.status === "ignored") {
@@ -849,7 +855,7 @@ function RepoActionButton({
);
}
// Show primary action with dropdown for skip option
// Show primary action with dropdown for additional actions
return (
<DropdownMenu>
<div className="flex">
@@ -886,6 +892,18 @@ function RepoActionButton({
<Ban className="h-4 w-4 mr-2" />
Ignore Repository
</DropdownMenuItem>
{onDelete && (
<>
<DropdownMenuSeparator />
<DropdownMenuItem
className="text-destructive focus:text-destructive"
onClick={onDelete}
>
<Trash2 className="h-4 w-4 mr-2" />
Delete from Mirror
</DropdownMenuItem>
</>
)}
</DropdownMenuContent>
</DropdownMenu>
);

View File

@@ -166,6 +166,8 @@ export const auth = betterAuth({
defaultOverrideUserInfo: true,
// Allow implicit sign up for new users
disableImplicitSignUp: false,
// Trust email_verified claims from the upstream provider so we can link by matching email
trustEmailVerified: true,
}),
],
});

View File

@@ -127,6 +127,7 @@ export const repositorySchema = z.object({
configId: z.string(),
name: z.string(),
fullName: z.string(),
normalizedFullName: z.string(),
url: z.url(),
cloneUrl: z.url(),
owner: z.string(),
@@ -163,6 +164,7 @@ export const repositorySchema = z.object({
lastMirrored: z.coerce.date().optional().nullable(),
errorMessage: z.string().optional().nullable(),
destinationOrg: z.string().optional().nullable(),
metadata: z.string().optional().nullable(), // JSON string for metadata sync state
createdAt: z.coerce.date(),
updatedAt: z.coerce.date(),
});
@@ -209,6 +211,7 @@ export const organizationSchema = z.object({
userId: z.string(),
configId: z.string(),
name: z.string(),
normalizedName: z.string(),
avatarUrl: z.string(),
membershipRole: z.enum(["member", "admin", "owner", "billing_manager"]).default("member"),
isIncluded: z.boolean().default(true),
@@ -334,6 +337,7 @@ export const repositories = sqliteTable("repositories", {
.references(() => configs.id),
name: text("name").notNull(),
fullName: text("full_name").notNull(),
normalizedFullName: text("normalized_full_name").notNull(),
url: text("url").notNull(),
cloneUrl: text("clone_url").notNull(),
owner: text("owner").notNull(),
@@ -373,6 +377,8 @@ export const repositories = sqliteTable("repositories", {
destinationOrg: text("destination_org"),
metadata: text("metadata"), // JSON string storing metadata sync state (issues, PRs, releases, etc.)
createdAt: integer("created_at", { mode: "timestamp" })
.notNull()
.default(sql`(unixepoch())`),
@@ -388,6 +394,7 @@ export const repositories = sqliteTable("repositories", {
index("idx_repositories_is_fork").on(table.isForked),
index("idx_repositories_is_starred").on(table.isStarred),
uniqueIndex("uniq_repositories_user_full_name").on(table.userId, table.fullName),
uniqueIndex("uniq_repositories_user_normalized_full_name").on(table.userId, table.normalizedFullName),
]);
export const mirrorJobs = sqliteTable("mirror_jobs", {
@@ -438,6 +445,7 @@ export const organizations = sqliteTable("organizations", {
.notNull()
.references(() => configs.id),
name: text("name").notNull(),
normalizedName: text("normalized_name").notNull(),
avatarUrl: text("avatar_url").notNull(),
@@ -469,6 +477,7 @@ export const organizations = sqliteTable("organizations", {
index("idx_organizations_config_id").on(table.configId),
index("idx_organizations_status").on(table.status),
index("idx_organizations_is_included").on(table.isIncluded),
uniqueIndex("uniq_organizations_user_normalized_name").on(table.userId, table.normalizedName),
]);
// ===== Better Auth Tables =====
@@ -502,6 +511,10 @@ export const accounts = sqliteTable("accounts", {
providerUserId: text("provider_user_id"), // Make nullable for email/password auth
accessToken: text("access_token"),
refreshToken: text("refresh_token"),
idToken: text("id_token"),
accessTokenExpiresAt: integer("access_token_expires_at", { mode: "timestamp" }),
refreshTokenExpiresAt: integer("refresh_token_expires_at", { mode: "timestamp" }),
scope: text("scope"),
expiresAt: integer("expires_at", { mode: "timestamp" }),
password: text("password"), // For credential provider
createdAt: integer("created_at", { mode: "timestamp" })
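A brief sketch of why the normalized columns and unique indexes matter: both sides of the comparison are lower-cased, so case-only variants of the same name collapse onto one row, and inserts that target the normalized unique index (as done elsewhere in this change) become no-ops instead of duplicates. The sample values below are illustrative and omit the other required columns.

// Sketch: case-insensitive de-duplication via normalizedFullName (values abbreviated).
const first = { fullName: "Octocat/Hello-World", normalizedFullName: "Octocat/Hello-World".toLowerCase() };
const second = { fullName: "octocat/hello-world", normalizedFullName: "octocat/hello-world".toLowerCase() };

// first.normalizedFullName === second.normalizedFullName ("octocat/hello-world"), so the
// uniq_repositories_user_normalized_full_name index treats them as the same repository for a user,
// and inserts that use
//   .onConflictDoNothing({ target: [repositories.userId, repositories.normalizedFullName] })
// silently skip the second row instead of creating a duplicate. The same idea applies to
// organizations via normalizedName and uniq_organizations_user_normalized_name.
console.log(first.normalizedFullName === second.normalizedFullName); // true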

View File

@@ -8,6 +8,10 @@ mock.module("@/lib/helpers", () => ({
}));
const mockMirrorGitHubReleasesToGitea = mock(() => Promise.resolve());
const mockMirrorGitRepoIssuesToGitea = mock(() => Promise.resolve());
const mockMirrorGitRepoPullRequestsToGitea = mock(() => Promise.resolve());
const mockMirrorGitRepoLabelsToGitea = mock(() => Promise.resolve());
const mockMirrorGitRepoMilestonesToGitea = mock(() => Promise.resolve());
const mockGetGiteaRepoOwnerAsync = mock(() => Promise.resolve("starred"));
// Mock the database module
@@ -128,6 +132,36 @@ const mockHttpGet = mock(async (url: string, headers?: any) => {
headers: new Headers()
};
}
if (url.includes("/api/v1/repos/starred/metadata-repo")) {
return {
data: {
id: 790,
name: "metadata-repo",
mirror: true,
owner: { login: "starred" },
mirror_interval: "8h",
private: false,
},
status: 200,
statusText: "OK",
headers: new Headers(),
};
}
if (url.includes("/api/v1/repos/starred/already-synced-repo")) {
return {
data: {
id: 791,
name: "already-synced-repo",
mirror: true,
owner: { login: "starred" },
mirror_interval: "8h",
private: false,
},
status: 200,
statusText: "OK",
headers: new Headers(),
};
}
if (url.includes("/api/v1/repos/")) {
throw new MockHttpError("Not Found", 404, "Not Found");
}
@@ -224,6 +258,10 @@ describe("Enhanced Gitea Operations", () => {
mockDb.insert.mockClear();
mockDb.update.mockClear();
mockMirrorGitHubReleasesToGitea.mockClear();
mockMirrorGitRepoIssuesToGitea.mockClear();
mockMirrorGitRepoPullRequestsToGitea.mockClear();
mockMirrorGitRepoLabelsToGitea.mockClear();
mockMirrorGitRepoMilestonesToGitea.mockClear();
mockGetGiteaRepoOwnerAsync.mockClear();
mockGetGiteaRepoOwnerAsync.mockImplementation(() => Promise.resolve("starred"));
// Reset tracking variables
@@ -426,6 +464,10 @@ describe("Enhanced Gitea Operations", () => {
{
getGiteaRepoOwnerAsync: mockGetGiteaRepoOwnerAsync,
mirrorGitHubReleasesToGitea: mockMirrorGitHubReleasesToGitea,
mirrorGitRepoIssuesToGitea: mockMirrorGitRepoIssuesToGitea,
mirrorGitRepoPullRequestsToGitea: mockMirrorGitRepoPullRequestsToGitea,
mirrorGitRepoLabelsToGitea: mockMirrorGitRepoLabelsToGitea,
mirrorGitRepoMilestonesToGitea: mockMirrorGitRepoMilestonesToGitea,
}
)
).rejects.toThrow("Repository non-mirror-repo is not a mirror. Cannot sync.");
@@ -470,6 +512,10 @@ describe("Enhanced Gitea Operations", () => {
{
getGiteaRepoOwnerAsync: mockGetGiteaRepoOwnerAsync,
mirrorGitHubReleasesToGitea: mockMirrorGitHubReleasesToGitea,
mirrorGitRepoIssuesToGitea: mockMirrorGitRepoIssuesToGitea,
mirrorGitRepoPullRequestsToGitea: mockMirrorGitRepoPullRequestsToGitea,
mirrorGitRepoLabelsToGitea: mockMirrorGitRepoLabelsToGitea,
mirrorGitRepoMilestonesToGitea: mockMirrorGitRepoMilestonesToGitea,
}
);
@@ -482,6 +528,130 @@ describe("Enhanced Gitea Operations", () => {
expect(releaseCall.config.githubConfig?.token).toBe("github-token");
expect(releaseCall.octokit).toBeDefined();
});
test("mirrors metadata components when enabled and not previously synced", async () => {
const config: Partial<Config> = {
userId: "user123",
githubConfig: {
username: "testuser",
token: "github-token",
privateRepositories: true,
mirrorStarred: false,
},
giteaConfig: {
url: "https://gitea.example.com",
token: "encrypted-token",
defaultOwner: "testuser",
mirrorReleases: true,
mirrorMetadata: true,
mirrorIssues: true,
mirrorPullRequests: true,
mirrorLabels: true,
mirrorMilestones: true,
},
};
const repository: Repository = {
id: "repo789",
name: "metadata-repo",
fullName: "user/metadata-repo",
owner: "user",
cloneUrl: "https://github.com/user/metadata-repo.git",
isPrivate: false,
isStarred: false,
status: repoStatusEnum.parse("mirrored"),
visibility: "public",
userId: "user123",
createdAt: new Date(),
updatedAt: new Date(),
metadata: null,
};
await syncGiteaRepoEnhanced(
{ config, repository },
{
getGiteaRepoOwnerAsync: mockGetGiteaRepoOwnerAsync,
mirrorGitHubReleasesToGitea: mockMirrorGitHubReleasesToGitea,
mirrorGitRepoIssuesToGitea: mockMirrorGitRepoIssuesToGitea,
mirrorGitRepoPullRequestsToGitea: mockMirrorGitRepoPullRequestsToGitea,
mirrorGitRepoLabelsToGitea: mockMirrorGitRepoLabelsToGitea,
mirrorGitRepoMilestonesToGitea: mockMirrorGitRepoMilestonesToGitea,
}
);
expect(mockMirrorGitHubReleasesToGitea).toHaveBeenCalledTimes(1);
expect(mockMirrorGitRepoIssuesToGitea).toHaveBeenCalledTimes(1);
expect(mockMirrorGitRepoPullRequestsToGitea).toHaveBeenCalledTimes(1);
expect(mockMirrorGitRepoMilestonesToGitea).toHaveBeenCalledTimes(1);
// Labels should be skipped because the issue mirroring step already imports them

expect(mockMirrorGitRepoLabelsToGitea).not.toHaveBeenCalled();
});
test("skips metadata mirroring when components already synced", async () => {
const config: Partial<Config> = {
userId: "user123",
githubConfig: {
username: "testuser",
token: "github-token",
privateRepositories: true,
mirrorStarred: false,
},
giteaConfig: {
url: "https://gitea.example.com",
token: "encrypted-token",
defaultOwner: "testuser",
mirrorReleases: false,
mirrorMetadata: true,
mirrorIssues: true,
mirrorPullRequests: true,
mirrorLabels: true,
mirrorMilestones: true,
},
};
const repository: Repository = {
id: "repo790",
name: "already-synced-repo",
fullName: "user/already-synced-repo",
owner: "user",
cloneUrl: "https://github.com/user/already-synced-repo.git",
isPrivate: false,
isStarred: false,
status: repoStatusEnum.parse("mirrored"),
visibility: "public",
userId: "user123",
createdAt: new Date(),
updatedAt: new Date(),
metadata: JSON.stringify({
components: {
releases: true,
issues: true,
pullRequests: true,
labels: true,
milestones: true,
},
lastSyncedAt: new Date().toISOString(),
}),
};
await syncGiteaRepoEnhanced(
{ config, repository },
{
getGiteaRepoOwnerAsync: mockGetGiteaRepoOwnerAsync,
mirrorGitHubReleasesToGitea: mockMirrorGitHubReleasesToGitea,
mirrorGitRepoIssuesToGitea: mockMirrorGitRepoIssuesToGitea,
mirrorGitRepoPullRequestsToGitea: mockMirrorGitRepoPullRequestsToGitea,
mirrorGitRepoLabelsToGitea: mockMirrorGitRepoLabelsToGitea,
mirrorGitRepoMilestonesToGitea: mockMirrorGitRepoMilestonesToGitea,
}
);
expect(mockMirrorGitHubReleasesToGitea).not.toHaveBeenCalled();
expect(mockMirrorGitRepoIssuesToGitea).not.toHaveBeenCalled();
expect(mockMirrorGitRepoPullRequestsToGitea).not.toHaveBeenCalled();
expect(mockMirrorGitRepoLabelsToGitea).not.toHaveBeenCalled();
expect(mockMirrorGitRepoMilestonesToGitea).not.toHaveBeenCalled();
});
});
describe("handleExistingNonMirrorRepo", () => {

View File

@@ -15,10 +15,18 @@ import { httpPost, httpGet, httpPatch, HttpError } from "./http-client";
import { db, repositories } from "./db";
import { eq } from "drizzle-orm";
import { repoStatusEnum } from "@/types/Repository";
import {
parseRepositoryMetadataState,
serializeRepositoryMetadataState,
} from "./metadata-state";
type SyncDependencies = {
getGiteaRepoOwnerAsync: typeof import("./gitea")["getGiteaRepoOwnerAsync"];
mirrorGitHubReleasesToGitea: typeof import("./gitea")["mirrorGitHubReleasesToGitea"];
mirrorGitRepoIssuesToGitea: typeof import("./gitea")["mirrorGitRepoIssuesToGitea"];
mirrorGitRepoPullRequestsToGitea: typeof import("./gitea")["mirrorGitRepoPullRequestsToGitea"];
mirrorGitRepoLabelsToGitea: typeof import("./gitea")["mirrorGitRepoLabelsToGitea"];
mirrorGitRepoMilestonesToGitea: typeof import("./gitea")["mirrorGitRepoMilestonesToGitea"];
};
/**
@@ -330,36 +338,236 @@ export async function syncGiteaRepoEnhanced({
Authorization: `token ${decryptedConfig.giteaConfig.token}`,
});
const metadataState = parseRepositoryMetadataState(repository.metadata);
let metadataUpdated = false;
const skipMetadataForStarred =
repository.isStarred && config.githubConfig?.starredCodeOnly;
let metadataOctokit: Octokit | null = null;
const ensureOctokit = (): Octokit | null => {
if (metadataOctokit) {
return metadataOctokit;
}
if (!decryptedConfig.githubConfig?.token) {
return null;
}
metadataOctokit = new Octokit({
auth: decryptedConfig.githubConfig.token,
});
return metadataOctokit;
};
const shouldMirrorReleases =
decryptedConfig.giteaConfig?.mirrorReleases &&
!(repository.isStarred && decryptedConfig.githubConfig?.starredCodeOnly);
!!config.giteaConfig?.mirrorReleases && !skipMetadataForStarred;
const shouldMirrorIssuesThisRun =
!!config.giteaConfig?.mirrorIssues &&
!skipMetadataForStarred &&
!metadataState.components.issues;
const shouldMirrorPullRequests =
!!config.giteaConfig?.mirrorPullRequests &&
!skipMetadataForStarred &&
!metadataState.components.pullRequests;
const shouldMirrorLabels =
!!config.giteaConfig?.mirrorLabels &&
!skipMetadataForStarred &&
!shouldMirrorIssuesThisRun &&
!metadataState.components.labels;
const shouldMirrorMilestones =
!!config.giteaConfig?.mirrorMilestones &&
!skipMetadataForStarred &&
!metadataState.components.milestones;
if (shouldMirrorReleases) {
if (!decryptedConfig.githubConfig?.token) {
const octokit = ensureOctokit();
if (!octokit) {
console.warn(
`[Sync] Skipping release mirroring for ${repository.name}: Missing GitHub token`
);
} else {
try {
const octokit = new Octokit({ auth: decryptedConfig.githubConfig.token });
await dependencies.mirrorGitHubReleasesToGitea({
config: decryptedConfig,
config,
octokit,
repository,
giteaOwner: repoOwner,
giteaRepoName: repository.name,
});
console.log(`[Sync] Mirrored releases for ${repository.name} after sync`);
metadataState.components.releases = true;
metadataUpdated = true;
console.log(
`[Sync] Mirrored releases for ${repository.name} after sync`
);
} catch (releaseError) {
console.error(
`[Sync] Failed to mirror releases for ${repository.name}: ${
releaseError instanceof Error ? releaseError.message : String(releaseError)
releaseError instanceof Error
? releaseError.message
: String(releaseError)
}`
);
}
}
}
if (shouldMirrorIssuesThisRun) {
const octokit = ensureOctokit();
if (!octokit) {
console.warn(
`[Sync] Skipping issue mirroring for ${repository.name}: Missing GitHub token`
);
} else {
try {
await dependencies.mirrorGitRepoIssuesToGitea({
config,
octokit,
repository,
giteaOwner: repoOwner,
giteaRepoName: repository.name,
});
metadataState.components.issues = true;
metadataState.components.labels = true;
metadataUpdated = true;
console.log(
`[Sync] Mirrored issues for ${repository.name} after sync`
);
} catch (issueError) {
console.error(
`[Sync] Failed to mirror issues for ${repository.name}: ${
issueError instanceof Error
? issueError.message
: String(issueError)
}`
);
}
}
} else if (
config.giteaConfig?.mirrorIssues &&
metadataState.components.issues
) {
console.log(
`[Sync] Issues already mirrored for ${repository.name}; skipping`
);
}
if (shouldMirrorPullRequests) {
const octokit = ensureOctokit();
if (!octokit) {
console.warn(
`[Sync] Skipping pull request mirroring for ${repository.name}: Missing GitHub token`
);
} else {
try {
await dependencies.mirrorGitRepoPullRequestsToGitea({
config,
octokit,
repository,
giteaOwner: repoOwner,
giteaRepoName: repository.name,
});
metadataState.components.pullRequests = true;
metadataUpdated = true;
console.log(
`[Sync] Mirrored pull requests for ${repository.name} after sync`
);
} catch (prError) {
console.error(
`[Sync] Failed to mirror pull requests for ${repository.name}: ${
prError instanceof Error ? prError.message : String(prError)
}`
);
}
}
} else if (
config.giteaConfig?.mirrorPullRequests &&
metadataState.components.pullRequests
) {
console.log(
`[Sync] Pull requests already mirrored for ${repository.name}; skipping`
);
}
if (shouldMirrorLabels) {
const octokit = ensureOctokit();
if (!octokit) {
console.warn(
`[Sync] Skipping label mirroring for ${repository.name}: Missing GitHub token`
);
} else {
try {
await dependencies.mirrorGitRepoLabelsToGitea({
config,
octokit,
repository,
giteaOwner: repoOwner,
giteaRepoName: repository.name,
});
metadataState.components.labels = true;
metadataUpdated = true;
console.log(
`[Sync] Mirrored labels for ${repository.name} after sync`
);
} catch (labelError) {
console.error(
`[Sync] Failed to mirror labels for ${repository.name}: ${
labelError instanceof Error
? labelError.message
: String(labelError)
}`
);
}
}
} else if (
config.giteaConfig?.mirrorLabels &&
metadataState.components.labels
) {
console.log(
`[Sync] Labels already mirrored for ${repository.name}; skipping`
);
}
if (shouldMirrorMilestones) {
const octokit = ensureOctokit();
if (!octokit) {
console.warn(
`[Sync] Skipping milestone mirroring for ${repository.name}: Missing GitHub token`
);
} else {
try {
await dependencies.mirrorGitRepoMilestonesToGitea({
config,
octokit,
repository,
giteaOwner: repoOwner,
giteaRepoName: repository.name,
});
metadataState.components.milestones = true;
metadataUpdated = true;
console.log(
`[Sync] Mirrored milestones for ${repository.name} after sync`
);
} catch (milestoneError) {
console.error(
`[Sync] Failed to mirror milestones for ${repository.name}: ${
milestoneError instanceof Error
? milestoneError.message
: String(milestoneError)
}`
);
}
}
} else if (
config.giteaConfig?.mirrorMilestones &&
metadataState.components.milestones
) {
console.log(
`[Sync] Milestones already mirrored for ${repository.name}; skipping`
);
}
if (metadataUpdated) {
metadataState.lastSyncedAt = new Date().toISOString();
}
// Mark repo as "synced" in DB
await db
.update(repositories)
@@ -369,6 +577,9 @@ export async function syncGiteaRepoEnhanced({
lastMirrored: new Date(),
errorMessage: null,
mirroredLocation: `${repoOwner}/${repository.name}`,
metadata: metadataUpdated
? serializeRepositoryMetadataState(metadataState)
: repository.metadata ?? null,
})
.where(eq(repositories.id, repository.id!));
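The per-component checks above all follow the same shape, with releases as the one exception (they are always allowed to rerun so changelog updates propagate). A condensed sketch of that rule, where shouldRunComponent and GateInput are illustrative names rather than part of the change:

// Sketch of the gating rule used for issues, pull requests, labels and milestones above.
interface GateInput {
  enabled: boolean;                // config.giteaConfig?.mirror<Component>
  skipForStarred: boolean;         // repository.isStarred && config.githubConfig?.starredCodeOnly
  alreadySynced: boolean;          // metadataState.components.<component>
  issuesRunningThisPass?: boolean; // only relevant for labels, which the issue pass already imports
}

function shouldRunComponent({
  enabled,
  skipForStarred,
  alreadySynced,
  issuesRunningThisPass = false,
}: GateInput): boolean {
  return enabled && !skipForStarred && !alreadySynced && !issuesRunningThisPass;
}

// Example: labels are skipped when the issue pass runs, because that pass also mirrors labels.
console.log(
  shouldRunComponent({ enabled: true, skipForStarred: false, alreadySynced: false, issuesRunningThisPass: true })
); // false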

View File

@@ -13,6 +13,10 @@ import { db, organizations, repositories } from "./db";
import { eq, and } from "drizzle-orm";
import { decryptConfigTokens } from "./utils/config-encryption";
import { formatDateShort } from "./utils";
import {
parseRepositoryMetadataState,
serializeRepositoryMetadataState,
} from "./metadata-state";
/**
* Helper function to get organization configuration including destination override
@@ -587,12 +591,18 @@ export const mirrorGithubRepoToGitea = async ({
}
);
//mirror releases
// Skip releases for starred repos if starredCodeOnly is enabled
const shouldMirrorReleases = config.giteaConfig?.mirrorReleases &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
const metadataState = parseRepositoryMetadataState(repository.metadata);
let metadataUpdated = false;
const skipMetadataForStarred =
repository.isStarred && config.githubConfig?.starredCodeOnly;
console.log(`[Metadata] Release mirroring check: mirrorReleases=${config.giteaConfig?.mirrorReleases}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorReleases=${shouldMirrorReleases}`);
// Mirror releases if enabled (always allowed to rerun for updates)
const shouldMirrorReleases =
!!config.giteaConfig?.mirrorReleases && !skipMetadataForStarred;
console.log(
`[Metadata] Release mirroring check: mirrorReleases=${config.giteaConfig?.mirrorReleases}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorReleases=${shouldMirrorReleases}`
);
if (shouldMirrorReleases) {
try {
@@ -603,21 +613,32 @@ export const mirrorGithubRepoToGitea = async ({
giteaOwner: repoOwner,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored releases for ${repository.name}`);
metadataState.components.releases = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored releases for ${repository.name}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror releases for ${repository.name}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror releases for ${repository.name}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other operations even if releases fail
}
}
// clone issues
// Skip issues for starred repos if starredCodeOnly is enabled
const shouldMirrorIssues = config.giteaConfig?.mirrorIssues &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
console.log(`[Metadata] Issue mirroring check: mirrorIssues=${config.giteaConfig?.mirrorIssues}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorIssues=${shouldMirrorIssues}`);
if (shouldMirrorIssues) {
// Determine metadata operations to avoid duplicates
const shouldMirrorIssuesThisRun =
!!config.giteaConfig?.mirrorIssues &&
!skipMetadataForStarred &&
!metadataState.components.issues;
console.log(
`[Metadata] Issue mirroring check: mirrorIssues=${config.giteaConfig?.mirrorIssues}, alreadyMirrored=${metadataState.components.issues}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorIssues=${shouldMirrorIssuesThisRun}`
);
if (shouldMirrorIssuesThisRun) {
try {
await mirrorGitRepoIssuesToGitea({
config,
@@ -626,19 +647,34 @@ export const mirrorGithubRepoToGitea = async ({
giteaOwner: repoOwner,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored issues for ${repository.name}`);
metadataState.components.issues = true;
metadataState.components.labels = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored issues for ${repository.name}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror issues for ${repository.name}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror issues for ${repository.name}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other metadata operations even if issues fail
}
} else if (config.giteaConfig?.mirrorIssues && metadataState.components.issues) {
console.log(
`[Metadata] Issues already mirrored for ${repository.name}; skipping to avoid duplicates`
);
}
// Mirror pull requests if enabled
// Skip pull requests for starred repos if starredCodeOnly is enabled
const shouldMirrorPullRequests = config.giteaConfig?.mirrorPullRequests &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
const shouldMirrorPullRequests =
!!config.giteaConfig?.mirrorPullRequests &&
!skipMetadataForStarred &&
!metadataState.components.pullRequests;
console.log(`[Metadata] Pull request mirroring check: mirrorPullRequests=${config.giteaConfig?.mirrorPullRequests}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorPullRequests=${shouldMirrorPullRequests}`);
console.log(
`[Metadata] Pull request mirroring check: mirrorPullRequests=${config.giteaConfig?.mirrorPullRequests}, alreadyMirrored=${metadataState.components.pullRequests}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorPullRequests=${shouldMirrorPullRequests}`
);
if (shouldMirrorPullRequests) {
try {
@@ -649,19 +685,37 @@ export const mirrorGithubRepoToGitea = async ({
giteaOwner: repoOwner,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored pull requests for ${repository.name}`);
metadataState.components.pullRequests = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored pull requests for ${repository.name}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror pull requests for ${repository.name}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror pull requests for ${repository.name}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other metadata operations even if PRs fail
}
} else if (
config.giteaConfig?.mirrorPullRequests &&
metadataState.components.pullRequests
) {
console.log(
`[Metadata] Pull requests already mirrored for ${repository.name}; skipping`
);
}
// Mirror labels if enabled (and not already done via issues)
// Skip labels for starred repos if starredCodeOnly is enabled
const shouldMirrorLabels = config.giteaConfig?.mirrorLabels && !shouldMirrorIssues &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
const shouldMirrorLabels =
!!config.giteaConfig?.mirrorLabels &&
!skipMetadataForStarred &&
!shouldMirrorIssuesThisRun &&
!metadataState.components.labels;
console.log(`[Metadata] Label mirroring check: mirrorLabels=${config.giteaConfig?.mirrorLabels}, shouldMirrorIssues=${shouldMirrorIssues}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorLabels=${shouldMirrorLabels}`);
console.log(
`[Metadata] Label mirroring check: mirrorLabels=${config.giteaConfig?.mirrorLabels}, alreadyMirrored=${metadataState.components.labels}, issuesRunning=${shouldMirrorIssuesThisRun}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorLabels=${shouldMirrorLabels}`
);
if (shouldMirrorLabels) {
try {
@@ -672,19 +726,33 @@ export const mirrorGithubRepoToGitea = async ({
giteaOwner: repoOwner,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored labels for ${repository.name}`);
metadataState.components.labels = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored labels for ${repository.name}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror labels for ${repository.name}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror labels for ${repository.name}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other metadata operations even if labels fail
}
} else if (config.giteaConfig?.mirrorLabels && metadataState.components.labels) {
console.log(
`[Metadata] Labels already mirrored for ${repository.name}; skipping`
);
}
// Mirror milestones if enabled
// Skip milestones for starred repos if starredCodeOnly is enabled
const shouldMirrorMilestones = config.giteaConfig?.mirrorMilestones &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
const shouldMirrorMilestones =
!!config.giteaConfig?.mirrorMilestones &&
!skipMetadataForStarred &&
!metadataState.components.milestones;
console.log(`[Metadata] Milestone mirroring check: mirrorMilestones=${config.giteaConfig?.mirrorMilestones}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorMilestones=${shouldMirrorMilestones}`);
console.log(
`[Metadata] Milestone mirroring check: mirrorMilestones=${config.giteaConfig?.mirrorMilestones}, alreadyMirrored=${metadataState.components.milestones}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorMilestones=${shouldMirrorMilestones}`
);
if (shouldMirrorMilestones) {
try {
@@ -695,11 +763,30 @@ export const mirrorGithubRepoToGitea = async ({
giteaOwner: repoOwner,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored milestones for ${repository.name}`);
metadataState.components.milestones = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored milestones for ${repository.name}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror milestones for ${repository.name}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror milestones for ${repository.name}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other metadata operations even if milestones fail
}
} else if (
config.giteaConfig?.mirrorMilestones &&
metadataState.components.milestones
) {
console.log(
`[Metadata] Milestones already mirrored for ${repository.name}; skipping`
);
}
if (metadataUpdated) {
metadataState.lastSyncedAt = new Date().toISOString();
}
console.log(`Repository ${repository.name} mirrored successfully as ${targetRepoName}`);
@@ -713,6 +800,9 @@ export const mirrorGithubRepoToGitea = async ({
lastMirrored: new Date(),
errorMessage: null,
mirroredLocation: `${repoOwner}/${targetRepoName}`,
metadata: metadataUpdated
? serializeRepositoryMetadataState(metadataState)
: repository.metadata ?? null,
})
.where(eq(repositories.id, repository.id!));
@@ -1053,12 +1143,17 @@ export async function mirrorGitHubRepoToGiteaOrg({
}
);
//mirror releases
// Skip releases for starred repos if starredCodeOnly is enabled
const shouldMirrorReleases = config.giteaConfig?.mirrorReleases &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
const metadataState = parseRepositoryMetadataState(repository.metadata);
let metadataUpdated = false;
const skipMetadataForStarred =
repository.isStarred && config.githubConfig?.starredCodeOnly;
console.log(`[Metadata] Release mirroring check: mirrorReleases=${config.giteaConfig?.mirrorReleases}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorReleases=${shouldMirrorReleases}`);
const shouldMirrorReleases =
!!config.giteaConfig?.mirrorReleases && !skipMetadataForStarred;
console.log(
`[Metadata] Release mirroring check: mirrorReleases=${config.giteaConfig?.mirrorReleases}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorReleases=${shouldMirrorReleases}`
);
if (shouldMirrorReleases) {
try {
@@ -1069,21 +1164,31 @@ export async function mirrorGitHubRepoToGiteaOrg({
giteaOwner: orgName,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored releases for ${repository.name}`);
metadataState.components.releases = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored releases for ${repository.name}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror releases for ${repository.name}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror releases for ${repository.name}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other operations even if releases fail
}
}
// Clone issues
// Skip issues for starred repos if starredCodeOnly is enabled
const shouldMirrorIssues = config.giteaConfig?.mirrorIssues &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
console.log(`[Metadata] Issue mirroring check: mirrorIssues=${config.giteaConfig?.mirrorIssues}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorIssues=${shouldMirrorIssues}`);
if (shouldMirrorIssues) {
const shouldMirrorIssuesThisRun =
!!config.giteaConfig?.mirrorIssues &&
!skipMetadataForStarred &&
!metadataState.components.issues;
console.log(
`[Metadata] Issue mirroring check: mirrorIssues=${config.giteaConfig?.mirrorIssues}, alreadyMirrored=${metadataState.components.issues}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorIssues=${shouldMirrorIssuesThisRun}`
);
if (shouldMirrorIssuesThisRun) {
try {
await mirrorGitRepoIssuesToGitea({
config,
@@ -1092,19 +1197,37 @@ export async function mirrorGitHubRepoToGiteaOrg({
giteaOwner: orgName,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored issues for ${repository.name} to org ${orgName}/${targetRepoName}`);
metadataState.components.issues = true;
metadataState.components.labels = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored issues for ${repository.name} to org ${orgName}/${targetRepoName}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror issues for ${repository.name} to org ${orgName}/${targetRepoName}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror issues for ${repository.name} to org ${orgName}/${targetRepoName}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other metadata operations even if issues fail
}
} else if (
config.giteaConfig?.mirrorIssues &&
metadataState.components.issues
) {
console.log(
`[Metadata] Issues already mirrored for ${repository.name}; skipping`
);
}
// Mirror pull requests if enabled
// Skip pull requests for starred repos if starredCodeOnly is enabled
const shouldMirrorPullRequests = config.giteaConfig?.mirrorPullRequests &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
const shouldMirrorPullRequests =
!!config.giteaConfig?.mirrorPullRequests &&
!skipMetadataForStarred &&
!metadataState.components.pullRequests;
console.log(`[Metadata] Pull request mirroring check: mirrorPullRequests=${config.giteaConfig?.mirrorPullRequests}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorPullRequests=${shouldMirrorPullRequests}`);
console.log(
`[Metadata] Pull request mirroring check: mirrorPullRequests=${config.giteaConfig?.mirrorPullRequests}, alreadyMirrored=${metadataState.components.pullRequests}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorPullRequests=${shouldMirrorPullRequests}`
);
if (shouldMirrorPullRequests) {
try {
@@ -1115,19 +1238,37 @@ export async function mirrorGitHubRepoToGiteaOrg({
giteaOwner: orgName,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored pull requests for ${repository.name} to org ${orgName}/${targetRepoName}`);
metadataState.components.pullRequests = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored pull requests for ${repository.name} to org ${orgName}/${targetRepoName}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror pull requests for ${repository.name} to org ${orgName}/${targetRepoName}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror pull requests for ${repository.name} to org ${orgName}/${targetRepoName}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other metadata operations even if PRs fail
}
} else if (
config.giteaConfig?.mirrorPullRequests &&
metadataState.components.pullRequests
) {
console.log(
`[Metadata] Pull requests already mirrored for ${repository.name}; skipping`
);
}
// Mirror labels if enabled (and not already done via issues)
// Skip labels for starred repos if starredCodeOnly is enabled
const shouldMirrorLabels = config.giteaConfig?.mirrorLabels && !shouldMirrorIssues &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
const shouldMirrorLabels =
!!config.giteaConfig?.mirrorLabels &&
!skipMetadataForStarred &&
!shouldMirrorIssuesThisRun &&
!metadataState.components.labels;
console.log(`[Metadata] Label mirroring check: mirrorLabels=${config.giteaConfig?.mirrorLabels}, shouldMirrorIssues=${shouldMirrorIssues}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorLabels=${shouldMirrorLabels}`);
console.log(
`[Metadata] Label mirroring check: mirrorLabels=${config.giteaConfig?.mirrorLabels}, alreadyMirrored=${metadataState.components.labels}, issuesRunning=${shouldMirrorIssuesThisRun}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorLabels=${shouldMirrorLabels}`
);
if (shouldMirrorLabels) {
try {
@@ -1138,19 +1279,36 @@ export async function mirrorGitHubRepoToGiteaOrg({
giteaOwner: orgName,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored labels for ${repository.name} to org ${orgName}/${targetRepoName}`);
metadataState.components.labels = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored labels for ${repository.name} to org ${orgName}/${targetRepoName}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror labels for ${repository.name} to org ${orgName}/${targetRepoName}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror labels for ${repository.name} to org ${orgName}/${targetRepoName}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other metadata operations even if labels fail
}
} else if (
config.giteaConfig?.mirrorLabels &&
metadataState.components.labels
) {
console.log(
`[Metadata] Labels already mirrored for ${repository.name}; skipping`
);
}
// Mirror milestones if enabled
// Skip milestones for starred repos if starredCodeOnly is enabled
const shouldMirrorMilestones = config.giteaConfig?.mirrorMilestones &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
const shouldMirrorMilestones =
!!config.giteaConfig?.mirrorMilestones &&
!skipMetadataForStarred &&
!metadataState.components.milestones;
console.log(`[Metadata] Milestone mirroring check: mirrorMilestones=${config.giteaConfig?.mirrorMilestones}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorMilestones=${shouldMirrorMilestones}`);
console.log(
`[Metadata] Milestone mirroring check: mirrorMilestones=${config.giteaConfig?.mirrorMilestones}, alreadyMirrored=${metadataState.components.milestones}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorMilestones=${shouldMirrorMilestones}`
);
if (shouldMirrorMilestones) {
try {
@@ -1161,11 +1319,30 @@ export async function mirrorGitHubRepoToGiteaOrg({
giteaOwner: orgName,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored milestones for ${repository.name} to org ${orgName}/${targetRepoName}`);
metadataState.components.milestones = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored milestones for ${repository.name} to org ${orgName}/${targetRepoName}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror milestones for ${repository.name} to org ${orgName}/${targetRepoName}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror milestones for ${repository.name} to org ${orgName}/${targetRepoName}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other metadata operations even if milestones fail
}
} else if (
config.giteaConfig?.mirrorMilestones &&
metadataState.components.milestones
) {
console.log(
`[Metadata] Milestones already mirrored for ${repository.name}; skipping`
);
}
if (metadataUpdated) {
metadataState.lastSyncedAt = new Date().toISOString();
}
console.log(
@@ -1181,6 +1358,9 @@ export async function mirrorGitHubRepoToGiteaOrg({
lastMirrored: new Date(),
errorMessage: null,
mirroredLocation: `${orgName}/${targetRepoName}`,
metadata: metadataUpdated
? serializeRepositoryMetadataState(metadataState)
: repository.metadata ?? null,
})
.where(eq(repositories.id, repository.id!));
@@ -1812,12 +1992,26 @@ export async function mirrorGitHubReleasesToGitea({
let mirroredCount = 0;
let skippedCount = 0;
// Sort releases by created_at to ensure we get the most recent ones
const sortedReleases = releases.data.sort((a, b) =>
new Date(b.created_at).getTime() - new Date(a.created_at).getTime()
).slice(0, releaseLimit);
const getReleaseTimestamp = (release: typeof releases.data[number]) => {
const sourceDate = release.created_at ?? release.published_at ?? "";
const timestamp = sourceDate ? new Date(sourceDate).getTime() : 0;
return Number.isFinite(timestamp) ? timestamp : 0;
};
for (const release of sortedReleases) {
// Capture the latest releases, then process them oldest-to-newest so Gitea mirrors keep chronological order
const releasesToProcess = releases.data
.slice()
.sort((a, b) => getReleaseTimestamp(b) - getReleaseTimestamp(a))
.slice(0, releaseLimit)
.sort((a, b) => getReleaseTimestamp(a) - getReleaseTimestamp(b));
console.log(`[Releases] Processing ${releasesToProcess.length} releases in chronological order (oldest to newest)`);
releasesToProcess.forEach((rel, idx) => {
const date = new Date(rel.published_at || rel.created_at);
console.log(`[Releases] ${idx + 1}. ${rel.tag_name} - Originally published: ${date.toISOString()}`);
});
for (const release of releasesToProcess) {
try {
// Check if release already exists
const existingReleasesResponse = await httpGet(
@@ -1827,8 +2021,14 @@ export async function mirrorGitHubReleasesToGitea({
}
).catch(() => null);
const releaseNote = release.body || "";
// Prepend the original GitHub publication date as a header on the release body
const githubPublishedDate = release.published_at || release.created_at;
const githubDateHeader = githubPublishedDate
? `> 📅 **Originally published on GitHub:** ${new Date(githubPublishedDate).toUTCString()}\n\n`
: '';
const originalReleaseNote = release.body || "";
const releaseNote = githubDateHeader + originalReleaseNote;
if (existingReleasesResponse) {
// Update existing release if the changelog/body differs
const existingRelease = existingReleasesResponse.data;
@@ -1851,9 +2051,11 @@ export async function mirrorGitHubReleasesToGitea({
Authorization: `token ${decryptedConfig.giteaConfig.token}`,
}
);
if (releaseNote) {
console.log(`[Releases] Updated changelog for ${release.tag_name} (${releaseNote.length} characters)`);
if (originalReleaseNote) {
console.log(`[Releases] Updated changelog for ${release.tag_name} (${originalReleaseNote.length} characters + GitHub date header)`);
} else {
console.log(`[Releases] Updated release ${release.tag_name} with GitHub date header`);
}
mirroredCount++;
} else {
@@ -1863,9 +2065,11 @@ export async function mirrorGitHubReleasesToGitea({
continue;
}
// Create new release with changelog/body content
if (releaseNote) {
console.log(`[Releases] Including changelog for ${release.tag_name} (${releaseNote.length} characters)`);
// Create new release with changelog/body content (includes GitHub date header)
if (originalReleaseNote) {
console.log(`[Releases] Including changelog for ${release.tag_name} (${originalReleaseNote.length} characters + GitHub date header)`);
} else {
console.log(`[Releases] Creating release ${release.tag_name} with GitHub date header (no changelog)`);
}
const createReleaseResponse = await httpPost(
@@ -1933,8 +2137,14 @@ export async function mirrorGitHubReleasesToGitea({
}
mirroredCount++;
const noteInfo = releaseNote ? ` with ${releaseNote.length} character changelog` : " without changelog";
const noteInfo = originalReleaseNote ? ` with ${originalReleaseNote.length} character changelog` : " without changelog";
console.log(`[Releases] Successfully mirrored release: ${release.tag_name}${noteInfo}`);
// Add delay to ensure proper timestamp ordering in Gitea
// Gitea sorts releases by created_unix DESC, and all releases created in quick succession
// will have nearly identical timestamps. The 1-second delay ensures proper chronological order.
console.log(`[Releases] Waiting 1 second to ensure proper timestamp ordering in Gitea...`);
await new Promise(resolve => setTimeout(resolve, 1000));
} catch (error) {
console.error(`[Releases] Failed to mirror release ${release.tag_name}: ${error instanceof Error ? error.message : String(error)}`);
}
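To make the two-pass ordering above concrete, a small standalone sketch: keep the newest releaseLimit entries, then emit them oldest-first so Gitea's created_unix-based sorting ends up chronological, and prepend the original GitHub date header to each body. The sample data and helper names are illustrative.

// Sketch of the release ordering and date-header logic with sample data.
type ReleaseLike = { tag_name: string; created_at: string | null; published_at: string | null; body?: string };

const sample: ReleaseLike[] = [
  { tag_name: "v1.0.0", created_at: "2024-01-01T00:00:00Z", published_at: "2024-01-01T00:00:00Z" },
  { tag_name: "v1.2.0", created_at: "2024-03-01T00:00:00Z", published_at: "2024-03-01T00:00:00Z" },
  { tag_name: "v1.1.0", created_at: "2024-02-01T00:00:00Z", published_at: "2024-02-01T00:00:00Z" },
];

const ts = (r: ReleaseLike) => new Date(r.created_at ?? r.published_at ?? 0).getTime();
const releaseLimit = 2;

const toProcess = sample
  .slice()
  .sort((a, b) => ts(b) - ts(a))  // newest first, so the slice keeps the latest releases
  .slice(0, releaseLimit)
  .sort((a, b) => ts(a) - ts(b)); // then oldest first, so Gitea receives them chronologically

// toProcess is v1.1.0 followed by v1.2.0; v1.0.0 falls outside the limit.
const header = (r: ReleaseLike) =>
  `> 📅 **Originally published on GitHub:** ${new Date(r.published_at ?? r.created_at ?? 0).toUTCString()}\n\n`;
console.log(toProcess.map((r) => header(r) + (r.body ?? "")).join("\n---\n"));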

src/lib/metadata-state.ts (new file, 75 lines)
View File

@@ -0,0 +1,75 @@
interface MetadataComponentsState {
releases: boolean;
issues: boolean;
pullRequests: boolean;
labels: boolean;
milestones: boolean;
}
export interface RepositoryMetadataState {
components: MetadataComponentsState;
lastSyncedAt?: string;
}
const defaultComponents: MetadataComponentsState = {
releases: false,
issues: false,
pullRequests: false,
labels: false,
milestones: false,
};
export function createDefaultMetadataState(): RepositoryMetadataState {
return {
components: { ...defaultComponents },
};
}
export function parseRepositoryMetadataState(
raw: unknown
): RepositoryMetadataState {
const base = createDefaultMetadataState();
if (!raw) {
return base;
}
let parsed: any = raw;
if (typeof raw === "string") {
try {
parsed = JSON.parse(raw);
} catch {
return base;
}
}
if (!parsed || typeof parsed !== "object") {
return base;
}
if (parsed.components && typeof parsed.components === "object") {
base.components = {
...base.components,
releases: Boolean(parsed.components.releases),
issues: Boolean(parsed.components.issues),
pullRequests: Boolean(parsed.components.pullRequests),
labels: Boolean(parsed.components.labels),
milestones: Boolean(parsed.components.milestones),
};
}
if (typeof parsed.lastSyncedAt === "string") {
base.lastSyncedAt = parsed.lastSyncedAt;
} else if (typeof parsed.lastMetadataSync === "string") {
base.lastSyncedAt = parsed.lastMetadataSync;
}
return base;
}
export function serializeRepositoryMetadataState(
state: RepositoryMetadataState
): string {
return JSON.stringify(state);
}
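A short round-trip example using the module's own exports: parse whatever is stored in repositories.metadata (null and malformed JSON fall back to the defaults), flip the components that were just mirrored, stamp lastSyncedAt, and serialize back to the JSON string the schema's metadata column expects.

import {
  parseRepositoryMetadataState,
  serializeRepositoryMetadataState,
} from "./metadata-state";

// Whatever is currently stored in repositories.metadata for this repo.
const stored: string | null = null;

const state = parseRepositoryMetadataState(stored);
state.components.issues = true;
state.components.labels = true; // the issue pass also imports labels
state.lastSyncedAt = new Date().toISOString();

const toPersist = serializeRepositoryMetadataState(state);
// e.g. passed as { metadata: toPersist } in the db.update(repositories).set(...) calls above
console.log(toPersist);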

View File

@@ -62,6 +62,7 @@ describe('normalizeGitRepoToInsert', () => {
expect(insert.description).toBeNull();
expect(insert.lastMirrored).toBeNull();
expect(insert.errorMessage).toBeNull();
expect(insert.normalizedFullName).toBe(repo.fullName.toLowerCase());
});
});
@@ -72,4 +73,3 @@ describe('calcBatchSizeForInsert', () => {
expect(batch * 29).toBeLessThanOrEqual(999);
});
});

View File

@@ -33,6 +33,7 @@ export function normalizeGitRepoToInsert(
configId,
name: repo.name,
fullName: repo.fullName,
normalizedFullName: repo.fullName.toLowerCase(),
url: repo.url,
cloneUrl: repo.cloneUrl,
owner: repo.owner,
@@ -68,4 +69,3 @@ export function calcBatchSizeForInsert(columnCount: number, maxParams = 999): nu
const effectiveMax = Math.max(1, maxParams - safety);
return Math.max(1, Math.floor(effectiveMax / columnCount));
}
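A worked example of the parameter-budget arithmetic the test above exercises: SQLite's default limit of 999 bound parameters per statement, divided by the columns bound per row, caps the batch size.

// Worked example with the values from the test above.
const maxParams = 999;  // SQLite's default bound-parameter limit
const columnCount = 29; // columns bound per repository row in that test

const upperBound = Math.floor(maxParams / columnCount);
console.log(upperBound, upperBound * columnCount); // 34, 986 (<= 999)

// calcBatchSizeForInsert additionally reserves a small safety margin before dividing,
// so the batch size it returns is at most this upper bound.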

View File

@@ -99,12 +99,12 @@ async function runScheduledSync(config: any): Promise<void> {
// Check for new repositories
const existingRepos = await db
.select({ fullName: repositories.fullName })
.select({ normalizedFullName: repositories.normalizedFullName })
.from(repositories)
.where(eq(repositories.userId, userId));
const existingRepoNames = new Set(existingRepos.map(r => r.fullName));
const newRepos = allGithubRepos.filter(r => !existingRepoNames.has(r.fullName));
const existingRepoNames = new Set(existingRepos.map(r => r.normalizedFullName));
const newRepos = allGithubRepos.filter(r => !existingRepoNames.has(r.fullName.toLowerCase()));
if (newRepos.length > 0) {
console.log(`[Scheduler] Found ${newRepos.length} new repositories for user ${userId}`);
@@ -123,7 +123,7 @@ async function runScheduledSync(config: any): Promise<void> {
await db
.insert(repositories)
.values(batch)
.onConflictDoNothing({ target: [repositories.userId, repositories.fullName] });
.onConflictDoNothing({ target: [repositories.userId, repositories.normalizedFullName] });
}
console.log(`[Scheduler] Successfully imported ${newRepos.length} new repositories for user ${userId}`);
} else {
@@ -432,12 +432,12 @@ async function performInitialAutoStart(): Promise<void> {
// Check for new repositories
const existingRepos = await db
.select({ fullName: repositories.fullName })
.select({ normalizedFullName: repositories.normalizedFullName })
.from(repositories)
.where(eq(repositories.userId, config.userId));
const existingRepoNames = new Set(existingRepos.map(r => r.fullName));
const reposToImport = allGithubRepos.filter(r => !existingRepoNames.has(r.fullName));
const existingRepoNames = new Set(existingRepos.map(r => r.normalizedFullName));
const reposToImport = allGithubRepos.filter(r => !existingRepoNames.has(r.fullName.toLowerCase()));
if (reposToImport.length > 0) {
console.log(`[Scheduler] Importing ${reposToImport.length} repositories for user ${config.userId}...`);
@@ -456,7 +456,7 @@ async function performInitialAutoStart(): Promise<void> {
await db
.insert(repositories)
.values(batch)
.onConflictDoNothing({ target: [repositories.userId, repositories.fullName] });
.onConflictDoNothing({ target: [repositories.userId, repositories.normalizedFullName] });
}
console.log(`[Scheduler] Successfully imported ${reposToImport.length} repositories`);
} else {

View File

@@ -24,6 +24,7 @@ describe("normalizeOidcProviderConfig", () => {
expect(result.oidcConfig.userInfoEndpoint).toBe("https://auth.example.com/userinfo");
expect(result.oidcConfig.scopes).toEqual(["openid", "email"]);
expect(result.oidcConfig.pkce).toBe(false);
expect(result.oidcConfig.discoveryEndpoint).toBe("https://auth.example.com/.well-known/openid-configuration");
});
it("derives missing fields from discovery", async () => {
@@ -46,6 +47,24 @@ describe("normalizeOidcProviderConfig", () => {
expect(result.oidcConfig.jwksEndpoint).toBe("https://auth.example.com/jwks");
expect(result.oidcConfig.userInfoEndpoint).toBe("https://auth.example.com/userinfo");
expect(result.oidcConfig.scopes).toEqual(["openid", "email", "profile"]);
expect(result.oidcConfig.discoveryEndpoint).toBe("https://auth.example.com/.well-known/openid-configuration");
});
it("preserves trailing slash issuers when building discovery endpoints", async () => {
const trailingIssuer = "https://auth.example.com/application/o/example/";
const requestedUrls: string[] = [];
const fetchMock: typeof fetch = async (url) => {
requestedUrls.push(typeof url === "string" ? url : url.url);
return new Response(JSON.stringify({
authorization_endpoint: "https://auth.example.com/application/o/example/auth",
token_endpoint: "https://auth.example.com/application/o/example/token",
}));
};
const result = await normalizeOidcProviderConfig(trailingIssuer, {}, fetchMock);
expect(requestedUrls[0]).toBe("https://auth.example.com/application/o/example/.well-known/openid-configuration");
expect(result.oidcConfig.discoveryEndpoint).toBe("https://auth.example.com/application/o/example/.well-known/openid-configuration");
});
it("throws for invalid issuer URL", async () => {

View File

@@ -131,18 +131,21 @@ export async function normalizeOidcProviderConfig(
throw new OidcConfigError("Issuer is required");
}
let normalizedIssuer: string;
const trimmedIssuer = issuer.trim();
try {
const issuerUrl = new URL(issuer.trim());
normalizedIssuer = issuerUrl.toString().replace(/\/$/, "");
// Validate issuer but keep caller-provided formatting so we don't break provider expectations
new URL(trimmedIssuer);
} catch {
throw new OidcConfigError(`Invalid issuer URL: ${issuer}`);
}
const issuerForDiscovery = trimmedIssuer.replace(/\/$/, "");
const discoveryEndpoint = cleanUrl(
rawConfig.discoveryEndpoint,
"discovery endpoint",
) ?? `${normalizedIssuer}/.well-known/openid-configuration`;
) ?? `${issuerForDiscovery}/.well-known/openid-configuration`;
const authorizationEndpoint = cleanUrl(rawConfig.authorizationEndpoint, "authorization endpoint");
const tokenEndpoint = cleanUrl(rawConfig.tokenEndpoint, "token endpoint");
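To illustrate the trailing-slash handling: the issuer is validated and kept exactly as the caller supplied it (Authentik-style issuers end in a slash), and only the derived discovery URL strips the slash so the path does not contain a double slash. A minimal sketch with a sample issuer:

// Sketch: preserve the issuer verbatim, strip the slash only when building the discovery URL.
const issuer = "https://auth.example.com/application/o/example/"; // sample Authentik-style issuer

new URL(issuer); // throws on malformed input, mirroring the validation above

const issuerForDiscovery = issuer.replace(/\/$/, "");
const discoveryEndpoint = `${issuerForDiscovery}/.well-known/openid-configuration`;

console.log(discoveryEndpoint);
// https://auth.example.com/application/o/example/.well-known/openid-configuration
// `issuer` itself is stored unchanged, so the provider's issuer claim still matches at token validation.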

View File

@@ -29,12 +29,13 @@ export async function POST(context: APIContext) {
);
}
// Validate issuer URL format
// Validate issuer URL format while preserving trailing slash when provided
let validatedIssuer = issuer;
if (issuer && typeof issuer === 'string' && issuer.trim() !== '') {
try {
const issuerUrl = new URL(issuer.trim());
validatedIssuer = issuerUrl.toString().replace(/\/$/, ''); // Remove trailing slash
const trimmedIssuer = issuer.trim();
new URL(trimmedIssuer);
validatedIssuer = trimmedIssuer;
} catch (e) {
return new Response(
JSON.stringify({ error: `Invalid issuer URL format: ${issuer}` }),

View File

@@ -1,5 +1,5 @@
import type { APIRoute } from "astro";
import { db, organizations } from "@/lib/db";
import { db, organizations, repositories } from "@/lib/db";
import { eq, and } from "drizzle-orm";
import { createSecureErrorResponse } from "@/lib/utils";
import { requireAuth } from "@/lib/utils/auth-helpers";
@@ -61,3 +61,60 @@ export const PATCH: APIRoute = async (context) => {
return createSecureErrorResponse(error, "Update organization destination", 500);
}
};
export const DELETE: APIRoute = async (context) => {
try {
const { user, response } = await requireAuth(context);
if (response) return response;
const userId = user!.id;
const orgId = context.params.id;
if (!orgId) {
return new Response(
JSON.stringify({ error: "Organization ID is required" }),
{
status: 400,
headers: { "Content-Type": "application/json" },
}
);
}
const [existingOrg] = await db
.select()
.from(organizations)
.where(and(eq(organizations.id, orgId), eq(organizations.userId, userId)))
.limit(1);
if (!existingOrg) {
return new Response(
JSON.stringify({ error: "Organization not found" }),
{
status: 404,
headers: { "Content-Type": "application/json" },
}
);
}
await db.delete(repositories).where(
and(
eq(repositories.userId, userId),
eq(repositories.organization, existingOrg.name)
)
);
await db
.delete(organizations)
.where(and(eq(organizations.id, orgId), eq(organizations.userId, userId)));
return new Response(
JSON.stringify({ success: true }),
{
status: 200,
headers: { "Content-Type": "application/json" },
}
);
} catch (error) {
return createSecureErrorResponse(error, "Delete organization", 500);
}
};

View File

@@ -1,5 +1,5 @@
import type { APIRoute } from "astro";
import { db, repositories } from "@/lib/db";
import { db, repositories, mirrorJobs } from "@/lib/db";
import { eq, and } from "drizzle-orm";
import { createSecureErrorResponse } from "@/lib/utils";
import { requireAuth } from "@/lib/utils/auth-helpers";
@@ -60,4 +60,55 @@ export const PATCH: APIRoute = async (context) => {
} catch (error) {
return createSecureErrorResponse(error, "Update repository destination", 500);
}
};
};
export const DELETE: APIRoute = async (context) => {
try {
const { user, response } = await requireAuth(context);
if (response) return response;
const userId = user!.id;
const repoId = context.params.id;
if (!repoId) {
return new Response(JSON.stringify({ error: "Repository ID is required" }), {
status: 400,
headers: { "Content-Type": "application/json" },
});
}
const [existingRepo] = await db
.select()
.from(repositories)
.where(and(eq(repositories.id, repoId), eq(repositories.userId, userId)))
.limit(1);
if (!existingRepo) {
return new Response(
JSON.stringify({ error: "Repository not found" }),
{
status: 404,
headers: { "Content-Type": "application/json" },
}
);
}
await db
.delete(repositories)
.where(and(eq(repositories.id, repoId), eq(repositories.userId, userId)));
await db
.delete(mirrorJobs)
.where(and(eq(mirrorJobs.repositoryId, repoId), eq(mirrorJobs.userId, userId)));
return new Response(
JSON.stringify({ success: true }),
{
status: 200,
headers: { "Content-Type": "application/json" },
}
);
} catch (error) {
return createSecureErrorResponse(error, "Delete repository", 500);
}
};
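For completeness, a hedged sketch of how the table's onDelete handler could call this route from the client. The /api/repositories/[id] path is an assumption based on the file layout, not something this diff states; adjust it to the route's actual mount point.

// Hypothetical client helper for the DELETE handler above (route path is an assumption).
async function deleteRepository(repoId: string): Promise<void> {
  const res = await fetch(`/api/repositories/${encodeURIComponent(repoId)}`, {
    method: "DELETE",
  });
  if (!res.ok) {
    const body = (await res.json().catch(() => ({}))) as { error?: string };
    throw new Error(body.error ?? `Delete failed with status ${res.status}`);
  }
  // The handler also removes the repository's mirror jobs, so no extra cleanup call is needed here.
}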

View File

@@ -17,11 +17,11 @@ export async function POST(context: APIContext) {
});
}
// Validate issuer URL format
let cleanIssuer: string;
// Validate issuer URL format while keeping trailing slash if provided
const trimmedIssuer = issuer.trim();
let parsedIssuer: URL;
try {
const issuerUrl = new URL(issuer.trim());
cleanIssuer = issuerUrl.toString().replace(/\/$/, ""); // Remove trailing slash
parsedIssuer = new URL(trimmedIssuer);
} catch (e) {
return new Response(
JSON.stringify({
@@ -35,7 +35,8 @@ export async function POST(context: APIContext) {
);
}
const discoveryUrl = `${cleanIssuer}/.well-known/openid-configuration`;
const issuerForDiscovery = trimmedIssuer.replace(/\/$/, "");
const discoveryUrl = `${issuerForDiscovery}/.well-known/openid-configuration`;
try {
// Fetch OIDC discovery document with timeout
@@ -52,9 +53,9 @@ export async function POST(context: APIContext) {
});
} catch (fetchError) {
if (fetchError instanceof Error && fetchError.name === 'AbortError') {
throw new Error(`Request timeout: The OIDC provider at ${cleanIssuer} did not respond within 10 seconds`);
throw new Error(`Request timeout: The OIDC provider at ${trimmedIssuer} did not respond within 10 seconds`);
}
throw new Error(`Network error: Could not connect to ${cleanIssuer}. Please verify the URL is correct and accessible.`);
throw new Error(`Network error: Could not connect to ${trimmedIssuer}. Please verify the URL is correct and accessible.`);
} finally {
clearTimeout(timeoutId);
}
@@ -63,7 +64,7 @@ export async function POST(context: APIContext) {
if (response.status === 404) {
throw new Error(`OIDC discovery document not found at ${discoveryUrl}. For Authentik, ensure you're using the correct application slug in the URL.`);
} else if (response.status >= 500) {
throw new Error(`OIDC provider error (${response.status}): The server at ${cleanIssuer} returned an error.`);
throw new Error(`OIDC provider error (${response.status}): The server at ${trimmedIssuer} returned an error.`);
} else {
throw new Error(`Failed to fetch discovery document (${response.status}): ${response.statusText}`);
}
@@ -73,12 +74,12 @@ export async function POST(context: APIContext) {
try {
config = await response.json();
} catch (parseError) {
throw new Error(`Invalid response: The discovery document from ${cleanIssuer} is not valid JSON.`);
throw new Error(`Invalid response: The discovery document from ${trimmedIssuer} is not valid JSON.`);
}
// Extract the essential endpoints
const discoveredConfig = {
issuer: config.issuer || cleanIssuer,
issuer: config.issuer || trimmedIssuer,
authorizationEndpoint: config.authorization_endpoint,
tokenEndpoint: config.token_endpoint,
userInfoEndpoint: config.userinfo_endpoint,
@@ -88,7 +89,7 @@ export async function POST(context: APIContext) {
responseTypes: config.response_types_supported || ["code"],
grantTypes: config.grant_types_supported || ["authorization_code"],
// Suggested domain from issuer
suggestedDomain: new URL(cleanIssuer).hostname.replace("www.", ""),
suggestedDomain: parsedIssuer.hostname.replace("www.", ""),
};
return new Response(JSON.stringify(discoveredConfig), {
@@ -111,4 +112,4 @@ export async function POST(context: APIContext) {
} catch (error) {
return createSecureErrorResponse(error, "SSO discover API");
}
}
}

View File

@@ -82,11 +82,10 @@ export async function POST(context: APIContext) {
);
}
// Clean issuer URL (remove trailing slash); validate format
let cleanIssuer = issuer;
// Validate issuer URL format but keep trailing slash if provided
const trimmedIssuer = issuer.toString().trim();
try {
const issuerUrl = new URL(issuer.toString().trim());
cleanIssuer = issuerUrl.toString().replace(/\/$/, "");
new URL(trimmedIssuer);
} catch {
return new Response(
JSON.stringify({ error: `Invalid issuer URL format: ${issuer}` }),
@@ -99,7 +98,7 @@ export async function POST(context: APIContext) {
let normalized;
try {
normalized = await normalizeOidcProviderConfig(cleanIssuer, {
normalized = await normalizeOidcProviderConfig(trimmedIssuer, {
clientId,
clientSecret,
authorizationEndpoint,
@@ -134,7 +133,7 @@ export async function POST(context: APIContext) {
.insert(ssoProviders)
.values({
id: nanoid(),
issuer: cleanIssuer,
issuer: trimmedIssuer,
domain,
oidcConfig: JSON.stringify(storedOidcConfig),
userId: user.id,
@@ -213,12 +212,10 @@ export async function PUT(context: APIContext) {
// Parse existing config
const existingConfig = JSON.parse(existingProvider.oidcConfig);
const effectiveIssuer = issuer || existingProvider.issuer;
const effectiveIssuer = issuer?.toString().trim() || existingProvider.issuer;
let cleanIssuer = effectiveIssuer;
try {
const issuerUrl = new URL(effectiveIssuer.toString().trim());
cleanIssuer = issuerUrl.toString().replace(/\/$/, "");
new URL(effectiveIssuer);
} catch {
return new Response(
JSON.stringify({ error: `Invalid issuer URL format: ${effectiveIssuer}` }),
@@ -244,7 +241,7 @@ export async function PUT(context: APIContext) {
let normalized;
try {
normalized = await normalizeOidcProviderConfig(cleanIssuer, mergedConfig);
normalized = await normalizeOidcProviderConfig(effectiveIssuer, mergedConfig);
} catch (error) {
if (error instanceof OidcConfigError) {
return new Response(
@@ -266,7 +263,7 @@ export async function PUT(context: APIContext) {
const [updatedProvider] = await db
.update(ssoProviders)
.set({
issuer: cleanIssuer,
issuer: effectiveIssuer,
domain: domain || existingProvider.domain,
oidcConfig: JSON.stringify(storedOidcConfig),
organizationId: organizationId !== undefined ? organizationId : existingProvider.organizationId,
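
The net effect of these hunks is that the stored issuer keeps whatever trailing slash the administrator entered. That matters because OIDC treats the issuer as an exact string: the discovery document's issuer and the ID token's iss claim must match it character for character, so stripping a slash that the provider (Authentik, for example) includes breaks validation. A minimal validation helper along those lines, as an illustrative sketch:

// Validate the issuer URL without normalizing it (illustrative sketch).
function validateIssuer(raw: string): string {
  const trimmedIssuer = raw.trim();
  try {
    new URL(trimmedIssuer); // only confirms it parses as a URL
  } catch {
    throw new Error(`Invalid issuer URL format: ${raw}`);
  }
  // Deliberately no .replace(/\/$/, ""): the trailing slash is part of the
  // issuer identifier and must round-trip into the database unchanged.
  return trimmedIssuer;
}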


@@ -66,6 +66,7 @@ export const POST: APIRoute = async ({ request }) => {
configId: config.id,
name: repo.name,
fullName: repo.fullName,
normalizedFullName: repo.fullName.toLowerCase(),
url: repo.url,
cloneUrl: repo.cloneUrl,
owner: repo.owner,
@@ -97,6 +98,7 @@ export const POST: APIRoute = async ({ request }) => {
userId,
configId: config.id,
name: org.name,
normalizedName: org.name.toLowerCase(),
avatarUrl: org.avatarUrl,
membershipRole: org.membershipRole,
isIncluded: false,
@@ -113,22 +115,22 @@ export const POST: APIRoute = async ({ request }) => {
await db.transaction(async (tx) => {
const [existingRepos, existingOrgs] = await Promise.all([
tx
.select({ fullName: repositories.fullName })
.select({ normalizedFullName: repositories.normalizedFullName })
.from(repositories)
.where(eq(repositories.userId, userId)),
tx
.select({ name: organizations.name })
.select({ normalizedName: organizations.normalizedName })
.from(organizations)
.where(eq(organizations.userId, userId)),
]);
const existingRepoNames = new Set(existingRepos.map((r) => r.fullName));
const existingOrgNames = new Set(existingOrgs.map((o) => o.name));
const existingRepoNames = new Set(existingRepos.map((r) => r.normalizedFullName));
const existingOrgNames = new Set(existingOrgs.map((o) => o.normalizedName));
insertedRepos = newRepos.filter(
(r) => !existingRepoNames.has(r.fullName)
(r) => !existingRepoNames.has(r.normalizedFullName)
);
insertedOrgs = newOrgs.filter((o) => !existingOrgNames.has(o.name));
insertedOrgs = newOrgs.filter((o) => !existingOrgNames.has(o.normalizedName));
// Batch insert repositories to avoid SQLite parameter limit (dynamic by column count)
const sample = newRepos[0];
@@ -140,7 +142,7 @@ export const POST: APIRoute = async ({ request }) => {
await tx
.insert(repositories)
.values(batch)
.onConflictDoNothing({ target: [repositories.userId, repositories.fullName] });
.onConflictDoNothing({ target: [repositories.userId, repositories.normalizedFullName] });
}
}
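
These hunks move duplicate detection from the raw names to the precomputed lowercase columns. A small self-contained sketch of the pattern (types and sample values are illustrative):

// Case-insensitive de-duplication before a bulk insert (sketch).
interface RepoCandidate { fullName: string; normalizedFullName: string }

function filterNewRepos(
  candidates: RepoCandidate[],
  existing: { normalizedFullName: string }[]
): RepoCandidate[] {
  const seen = new Set(existing.map((r) => r.normalizedFullName));
  return candidates.filter((r) => !seen.has(r.normalizedFullName));
}

// "RayLabsHQ/Gitea-Mirror" and "raylabshq/gitea-mirror" collapse to one key,
// and the conflict target on (userId, normalizedFullName) backstops any race.
filterNewRepos(
  [{ fullName: "RayLabsHQ/Gitea-Mirror", normalizedFullName: "raylabshq/gitea-mirror" }],
  [{ normalizedFullName: "raylabshq/gitea-mirror" }]
); // -> []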


@@ -1,5 +1,4 @@
import type { APIRoute } from "astro";
import { Octokit } from "@octokit/rest";
import { configs, db, organizations, repositories } from "@/lib/db";
import { and, eq } from "drizzle-orm";
import { jsonResponse, createSecureErrorResponse } from "@/lib/utils";
@@ -15,7 +14,7 @@ import { createGitHubClient } from "@/lib/github";
export const POST: APIRoute = async ({ request }) => {
try {
const body: AddOrganizationApiRequest = await request.json();
const { role, org, userId } = body;
const { role, org, userId, force = false } = body;
if (!org || !userId || !role) {
return jsonResponse({
@@ -24,21 +23,58 @@ export const POST: APIRoute = async ({ request }) => {
});
}
// Check if org already exists
const existingOrg = await db
const trimmedOrg = org.trim();
const normalizedOrg = trimmedOrg.toLowerCase();
// Check if org already exists (case-insensitive)
const [existingOrg] = await db
.select()
.from(organizations)
.where(
and(eq(organizations.name, org), eq(organizations.userId, userId))
);
and(
eq(organizations.userId, userId),
eq(organizations.normalizedName, normalizedOrg)
)
)
.limit(1);
if (existingOrg.length > 0) {
if (existingOrg && !force) {
return jsonResponse({
data: {
success: false,
error: "Organization already exists for this user",
},
status: 400,
status: 409,
});
}
if (existingOrg && force) {
const [updatedOrg] = await db
.update(organizations)
.set({
membershipRole: role,
normalizedName: normalizedOrg,
updatedAt: new Date(),
})
.where(eq(organizations.id, existingOrg.id))
.returning();
const resPayload: AddOrganizationApiResponse = {
success: true,
organization: updatedOrg ?? existingOrg,
message: "Organization already exists; using existing record.",
};
return jsonResponse({ data: resPayload, status: 200 });
}
if (existingOrg) {
return jsonResponse({
data: {
success: false,
error: "Organization already exists for this user",
},
status: 409,
});
}
@@ -71,17 +107,21 @@ export const POST: APIRoute = async ({ request }) => {
// Create authenticated Octokit instance with rate limit tracking
const githubUsername = decryptedConfig.githubConfig?.owner || undefined;
const octokit = createGitHubClient(decryptedConfig.githubConfig.token, userId, githubUsername);
const octokit = createGitHubClient(
decryptedConfig.githubConfig.token,
userId,
githubUsername
);
// Fetch org metadata
const { data: orgData } = await octokit.orgs.get({ org });
const { data: orgData } = await octokit.orgs.get({ org: trimmedOrg });
// Fetch repos based on config settings
const allRepos = [];
// Fetch all repos (public, private, and member) to show in UI
const publicRepos = await octokit.paginate(octokit.repos.listForOrg, {
org,
org: trimmedOrg,
type: "public",
per_page: 100,
});
@@ -89,7 +129,7 @@ export const POST: APIRoute = async ({ request }) => {
// Always fetch private repos to show them in the UI
const privateRepos = await octokit.paginate(octokit.repos.listForOrg, {
org,
org: trimmedOrg,
type: "private",
per_page: 100,
});
@@ -97,7 +137,7 @@ export const POST: APIRoute = async ({ request }) => {
// Also fetch member repos (includes private repos the user has access to)
const memberRepos = await octokit.paginate(octokit.repos.listForOrg, {
org,
org: trimmedOrg,
type: "member",
per_page: 100,
});
@@ -107,38 +147,44 @@ export const POST: APIRoute = async ({ request }) => {
allRepos.push(...uniqueMemberRepos);
// Insert repositories
const repoRecords = allRepos.map((repo) => ({
id: uuidv4(),
userId,
configId,
name: repo.name,
fullName: repo.full_name,
url: repo.html_url,
cloneUrl: repo.clone_url ?? "",
owner: repo.owner.login,
organization:
repo.owner.type === "Organization" ? repo.owner.login : null,
mirroredLocation: "",
destinationOrg: null,
isPrivate: repo.private,
isForked: repo.fork,
forkedFrom: null,
hasIssues: repo.has_issues,
isStarred: false,
isArchived: repo.archived,
size: repo.size,
hasLFS: false,
hasSubmodules: false,
language: repo.language ?? null,
description: repo.description ?? null,
defaultBranch: repo.default_branch ?? "main",
visibility: (repo.visibility ?? "public") as RepositoryVisibility,
status: "imported" as RepoStatus,
lastMirrored: null,
errorMessage: null,
createdAt: repo.created_at ? new Date(repo.created_at) : new Date(),
updatedAt: repo.updated_at ? new Date(repo.updated_at) : new Date(),
}));
const repoRecords = allRepos.map((repo) => {
const normalizedOwner = repo.owner.login.trim().toLowerCase();
const normalizedRepoName = repo.name.trim().toLowerCase();
return {
id: uuidv4(),
userId,
configId,
name: repo.name,
fullName: repo.full_name,
normalizedFullName: `${normalizedOwner}/${normalizedRepoName}`,
url: repo.html_url,
cloneUrl: repo.clone_url ?? "",
owner: repo.owner.login,
organization:
repo.owner.type === "Organization" ? repo.owner.login : null,
mirroredLocation: "",
destinationOrg: null,
isPrivate: repo.private,
isForked: repo.fork,
forkedFrom: null,
hasIssues: repo.has_issues,
isStarred: false,
isArchived: repo.archived,
size: repo.size,
hasLFS: false,
hasSubmodules: false,
language: repo.language ?? null,
description: repo.description ?? null,
defaultBranch: repo.default_branch ?? "main",
visibility: (repo.visibility ?? "public") as RepositoryVisibility,
status: "imported" as RepoStatus,
lastMirrored: null,
errorMessage: null,
createdAt: repo.created_at ? new Date(repo.created_at) : new Date(),
updatedAt: repo.updated_at ? new Date(repo.updated_at) : new Date(),
};
});
// Batch insert repositories to avoid SQLite parameter limit
// Compute batch size based on column count
@@ -150,7 +196,7 @@ export const POST: APIRoute = async ({ request }) => {
await db
.insert(repositories)
.values(batch)
.onConflictDoNothing({ target: [repositories.userId, repositories.fullName] });
.onConflictDoNothing({ target: [repositories.userId, repositories.normalizedFullName] });
}
// Insert organization metadata
@@ -159,6 +205,7 @@ export const POST: APIRoute = async ({ request }) => {
userId,
configId,
name: orgData.login,
normalizedName: normalizedOrg,
avatarUrl: orgData.avatar_url,
membershipRole: role,
isIncluded: false,
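
The "batch insert" comments in this file refer to SQLite's cap on bound parameters: every column of every row in a single INSERT consumes one placeholder, so the safe batch size shrinks as the column count grows. A hedged sketch of that computation; 999 is the conservative historical default for SQLITE_MAX_VARIABLE_NUMBER, and newer SQLite builds allow far more:

// Chunk rows so each INSERT stays under the bound-parameter limit (sketch).
function chunkForSqlite<T extends Record<string, unknown>>(
  rows: T[],
  maxParams = 999
): T[][] {
  if (rows.length === 0) return [];
  const columnsPerRow = Object.keys(rows[0]).length; // same idea as sampling newRepos[0]
  const batchSize = Math.max(1, Math.floor(maxParams / columnsPerRow));
  const batches: T[][] = [];
  for (let i = 0; i < rows.length; i += batchSize) {
    batches.push(rows.slice(i, i + batchSize));
  }
  return batches;
}

// With roughly 30 columns per repository row this yields batches of ~33 rows:
// for (const batch of chunkForSqlite(repoRecords)) { await db.insert(repositories).values(batch)...; }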


@@ -15,7 +15,7 @@ import { createMirrorJob } from "@/lib/helpers";
export const POST: APIRoute = async ({ request }) => {
try {
const body: AddRepositoriesApiRequest = await request.json();
const { owner, repo, userId } = body;
const { owner, repo, userId, force = false } = body;
if (!owner || !repo || !userId) {
return new Response(
@@ -27,26 +27,43 @@ export const POST: APIRoute = async ({ request }) => {
);
}
const trimmedOwner = owner.trim();
const trimmedRepo = repo.trim();
if (!trimmedOwner || !trimmedRepo) {
return jsonResponse({
data: {
success: false,
error: "Missing owner, repo, or userId",
},
status: 400,
});
}
const normalizedOwner = trimmedOwner.toLowerCase();
const normalizedRepo = trimmedRepo.toLowerCase();
const normalizedFullName = `${normalizedOwner}/${normalizedRepo}`;
// Check if repository with the same owner, name, and userId already exists
const existingRepo = await db
const [existingRepo] = await db
.select()
.from(repositories)
.where(
and(
eq(repositories.owner, owner),
eq(repositories.name, repo),
eq(repositories.userId, userId)
eq(repositories.userId, userId),
eq(repositories.normalizedFullName, normalizedFullName)
)
);
)
.limit(1);
if (existingRepo.length > 0) {
if (existingRepo && !force) {
return jsonResponse({
data: {
success: false,
error:
"Repository with this name and owner already exists for this user",
},
status: 400,
status: 409,
});
}
@@ -68,14 +85,17 @@ export const POST: APIRoute = async ({ request }) => {
const octokit = new Octokit(); // No auth for public repos
const { data: repoData } = await octokit.rest.repos.get({ owner, repo });
const { data: repoData } = await octokit.rest.repos.get({
owner: trimmedOwner,
repo: trimmedRepo,
});
const metadata = {
id: uuidv4(),
const baseMetadata = {
userId,
configId,
name: repoData.name,
fullName: repoData.full_name,
normalizedFullName,
url: repoData.html_url,
cloneUrl: repoData.clone_url,
owner: repoData.owner.login,
@@ -94,6 +114,37 @@ export const POST: APIRoute = async ({ request }) => {
description: repoData.description ?? null,
defaultBranch: repoData.default_branch,
visibility: (repoData.visibility ?? "public") as RepositoryVisibility,
lastMirrored: existingRepo?.lastMirrored ?? null,
errorMessage: existingRepo?.errorMessage ?? null,
mirroredLocation: existingRepo?.mirroredLocation ?? "",
destinationOrg: existingRepo?.destinationOrg ?? null,
updatedAt: repoData.updated_at
? new Date(repoData.updated_at)
: new Date(),
};
if (existingRepo && force) {
const [updatedRepo] = await db
.update(repositories)
.set({
...baseMetadata,
normalizedFullName,
configId,
})
.where(eq(repositories.id, existingRepo.id))
.returning();
const resPayload: AddRepositoriesApiResponse = {
success: true,
repository: updatedRepo ?? existingRepo,
message: "Repository already exists; metadata refreshed.",
};
return jsonResponse({ data: resPayload, status: 200 });
}
const metadata = {
id: uuidv4(),
status: "imported" as Repository["status"],
lastMirrored: null,
errorMessage: null,
@@ -102,15 +153,13 @@ export const POST: APIRoute = async ({ request }) => {
createdAt: repoData.created_at
? new Date(repoData.created_at)
: new Date(),
updatedAt: repoData.updated_at
? new Date(repoData.updated_at)
: new Date(),
};
...baseMetadata,
} satisfies Repository;
await db
.insert(repositories)
.values(metadata)
.onConflictDoNothing({ target: [repositories.userId, repositories.fullName] });
.onConflictDoNothing({ target: [repositories.userId, repositories.normalizedFullName] });
createMirrorJob({
userId,
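
Taken together, the add-repository hunks give the endpoint three outcomes: conflict, in-place refresh, or fresh insert. A compressed sketch of just that decision, with the database work stubbed out:

// Three-way outcome when adding a repository that may already exist (sketch only).
type AddDecision = "insert" | "refresh-existing" | "conflict";

function decideAdd(existing: { id: string } | undefined, force: boolean): AddDecision {
  if (existing && !force) return "conflict";         // respond 409, write nothing
  if (existing && force) return "refresh-existing";  // update the row in place, respond 200
  return "insert";                                    // new row; onConflictDoNothing is the final guard
}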


@@ -81,11 +81,12 @@ export interface AddRepositoriesApiRequest {
userId: string;
repo: string;
owner: string;
force?: boolean;
}
export interface AddRepositoriesApiResponse {
success: boolean;
message: string;
repository: Repository;
repository?: Repository;
error?: string;
}


@@ -45,11 +45,12 @@ export interface AddOrganizationApiRequest {
userId: string;
org: string;
role: MembershipRole;
force?: boolean;
}
export interface AddOrganizationApiResponse {
success: boolean;
message: string;
organization: Organization;
organization?: Organization;
error?: string;
}
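
Because force is optional and the repository/organization fields in the responses are now optional too, callers need to branch on success and status. A hypothetical client call is sketched below; the route path and the retry-on-409 behaviour are assumptions, not something this diff specifies.

// Hypothetical client-side use of the force flag; adjust the path to the real route.
async function addRepository(
  userId: string,
  owner: string,
  repo: string,
  force = false
): Promise<Repository | undefined> {
  const res = await fetch("/api/repositories/add", { // assumed path
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ userId, owner, repo, force } satisfies AddRepositoriesApiRequest),
  });
  const payload = (await res.json()) as AddRepositoriesApiResponse;
  if (res.status === 409 && !force) {
    // Already tracked: opt in to a metadata refresh instead of failing outright.
    return addRepository(userId, owner, repo, true);
  }
  if (!payload.success) {
    throw new Error(payload.error ?? `Unexpected status ${res.status}`);
  }
  return payload.repository; // may be undefined when only success is reported
}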

Binary and image assets (previews omitted): two new images added (31 KiB and 128 KiB), one image updated (15 KiB before and after), new file www/public/favicon.svg (3 lines, text diff suppressed, 216 KiB), and one image updated (338 KiB to 330 KiB).


@@ -22,7 +22,7 @@ export function Installation() {
steps: [
{
title: "Clone the repository",
command: "git clone https://github.com/RayLabsHQ/gitea-mirror.git\ncd gitea-mirror",
command: "git clone https://github.com/RayLabsHQ/gitea-mirror.git && cd gitea-mirror",
id: "docker-clone"
},
{
@@ -166,4 +166,4 @@ export function Installation() {
</div>
</section>
);
}
}