Compare commits


3 Commits

Author SHA1 Message Date
Arunavo Ray
a5b4482c8a working on a fix for SSO issue 2025-09-14 10:18:37 +05:30
Arunavo Ray
5add8766a4 fix(scheduler,config): preserve ENV schedule; add AUTO_MIRROR_REPOS auto-mirroring
- Prevent Automation UI from overriding schedule:
      - mapDbScheduleToUi now parses intervals robustly (cron/duration/seconds) via parseInterval
      - mapUiScheduleToDb merges with existing config and stores interval as seconds (no lossy cron conversion)
      - /api/config passes existing scheduleConfig to preserve ENV-sourced values
      - schedule-sync endpoint uses parseInterval for nextRun calculation
  - Add AUTO_MIRROR_REPOS support and scheduled auto-mirror phase:
      - scheduleConfig schema includes autoImport and autoMirror
      - env-config-loader reads AUTO_MIRROR_REPOS and carries through to DB
      - scheduler auto-mirrors imported/pending/failed repos when autoMirror is enabled before regular sync
      - docker-compose and ENV docs updated with AUTO_MIRROR_REPOS
  - Tests pass and build succeeds
2025-09-14 08:31:31 +05:30
Arunavo Ray
6ce70bb5bf chore(version): bump to 3.7.1

cleanup: attempt fix for orphaned repo archiving (refs #84)
- Sanitize mirror rename to satisfy AlphaDashDot; timestamped fallback
- Resolve Gitea owner robustly via mirroredLocation/strategy; verify presence
- Add 'archived' status to Zod enums; set isArchived on archive
- Update CHANGELOG entry without closing keyword
2025-09-14 07:53:36 +05:30
28 changed files with 34279 additions and 144 deletions
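
Side note on commit 5add8766a4: the schedule interval is now stored as seconds, and whatever is already persisted (plain seconds, a duration string, or a legacy cron expression) is parsed back rather than converted to cron. A minimal sketch of that round trip, using a simplified stand-in for the project's `parseInterval` helper from `@/lib/utils/duration-parser` (the real helper likely covers more formats):

```ts
// Simplified stand-in for parseInterval (assumption: the real helper supports
// more formats). Accepts seconds, duration strings, or simple cron; returns ms.
function parseInterval(value: number | string): number {
  if (typeof value === "number") return value * 1000;            // plain seconds
  const asNumber = Number(value);
  if (!Number.isNaN(asNumber)) return asNumber * 1000;           // "3600" -> seconds
  const duration = value.match(/^(\d+)\s*(s|m|h|d)$/i);          // "8h", "30m", ...
  if (duration) {
    const unit = { s: 1, m: 60, h: 3600, d: 86400 }[duration[2].toLowerCase() as "s" | "m" | "h" | "d"];
    return Number(duration[1]) * unit * 1000;
  }
  const cron = value.match(/^0 \*\/(\d+) \* \* \*$/);             // "0 */8 * * *" -> every 8h
  if (cron) return Number(cron[1]) * 3600 * 1000;
  throw new Error(`Unparsable interval: ${value}`);
}

// Mirrors the fallback behaviour added to mapDbScheduleToUi / schedule-sync:
// an unparsable value degrades to a default instead of breaking the UI.
function intervalToSeconds(interval: number | string | undefined, fallbackSeconds = 86400): number {
  if (interval === undefined) return fallbackSeconds;
  try {
    return Math.max(1, Math.floor(parseInterval(interval) / 1000));
  } catch {
    return fallbackSeconds;
  }
}

const nextRun = new Date(Date.now() + intervalToSeconds("8h", 3600) * 1000);
console.log(intervalToSeconds(3600), intervalToSeconds("0 */8 * * *"), nextRun.toISOString());
```

The same fallback feeds the `nextRun` calculation in the schedule-sync endpoint, so an unparsable value falls back to the hourly default instead of producing an invalid date.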

View File

@@ -58,6 +58,23 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
- Updated README with new features
- Enhanced CLAUDE.md with repository status definitions
## [3.7.1] - 2025-09-14
### Fixed
- Cleanup archiving for mirror repositories now works reliably (refs #84; awaiting user confirmation).
- Gitea rejects names violating the AlphaDashDot rule; archiving a mirror now uses a sanitized rename strategy (`archived-<name>`), with a timestamped fallback on conflicts or validation errors.
- Owner resolution during cleanup no longer uses the GitHub owner by mistake. It prefers `mirroredLocation`, falls back to the Gitea owner computed from configuration, and verifies the location with a presence check to avoid `GetUserByName` 404s.
- Repositories UI crash resolved when cleanup marked repos as archived.
- Added `"archived"` to repository/job status enums, fixing Zod validation errors on the Repositories page.
### Changed
- Archiving logic for mirror repos is non-destructive by design: data is preserved, the repo is renamed with an archive marker, and the mirror interval is reduced (best-effort) to minimize sync attempts.
- Cleanup service updates DB to `status: "archived"` and `isArchived: true` on successful archive path.
### Notes
- This release addresses the scenario where a GitHub source disappears (deleted/banned), ensuring Gitea backups are preserved even when using `CLEANUP_DELETE_IF_NOT_IN_GITHUB=true` with `CLEANUP_ORPHANED_REPO_ACTION=archive`.
- No database migration required.
## [3.2.6] - 2025-08-09
### Fixed

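The rename strategy behind the 3.7.1 entry above is compact enough to sketch; this mirrors the helper added to `archiveGiteaRepo` further down in this diff:

```ts
// Sketch of the archive rename strategy (see the gitea.ts hunk below):
// Gitea's AlphaDashDot rule only allows letters, digits, '.' and '-' in repo names.
function sanitizeRepoNameAlphaDashDot(name: string): string {
  const base = name.replace(/[^A-Za-z0-9.-]+/g, "-").replace(/-+/g, "-");
  // Trim leading/trailing separators and dots for safety
  return base.replace(/^[.-]+/, "").replace(/[.-]+$/, "");
}

// First attempt uses "archived-<sanitized-name>"; on a 422 or name conflict the
// code retries once with a timestamped variant so the rename cannot collide.
function archivedName(currentName: string, withTimestamp = false): string {
  const safe = sanitizeRepoNameAlphaDashDot(currentName);
  if (!withTimestamp) return `archived-${safe}`;
  const ts = new Date().toISOString().replace(/[-:T.]/g, "").slice(0, 14); // UTC yyyymmddhhmmss
  return `archived-${ts}-${safe}`;
}

console.log(archivedName("[ARCHIVED] my repo!"));  // archived-ARCHIVED-my-repo
console.log(archivedName("my repo!", true));       // archived-<timestamp>-my-repo
```

A failed rename is logged rather than thrown, so the mirror data is never put at risk by the cleanup pass.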
View File

@@ -57,6 +57,7 @@ services:
- SCHEDULE_ENABLED=${SCHEDULE_ENABLED:-false}
- GITEA_MIRROR_INTERVAL=${GITEA_MIRROR_INTERVAL:-8h}
- AUTO_IMPORT_REPOS=${AUTO_IMPORT_REPOS:-true}
- AUTO_MIRROR_REPOS=${AUTO_MIRROR_REPOS:-false}
# Repository Cleanup Configuration
- CLEANUP_DELETE_IF_NOT_IN_GITHUB=${CLEANUP_DELETE_IF_NOT_IN_GITHUB:-false}
- CLEANUP_ORPHANED_REPO_ACTION=${CLEANUP_ORPHANED_REPO_ACTION:-archive}

View File

@@ -195,6 +195,7 @@ Configure automatic scheduled mirroring.
| Variable | Description | Default | Options |
|----------|-------------|---------|---------|
| `AUTO_IMPORT_REPOS` | Automatically discover and import new GitHub repositories during scheduled syncs | `true` | `true`, `false` |
| `AUTO_MIRROR_REPOS` | Automatically mirror newly imported repositories during scheduled syncs (no manual “Mirror All” required) | `false` | `true`, `false` |
| `SCHEDULE_ONLY_MIRROR_UPDATED` | Only mirror repos with updates | `false` | `true`, `false` |
| `SCHEDULE_UPDATE_INTERVAL` | Check for updates interval (milliseconds) | `86400000` | Number |
| `SCHEDULE_SKIP_RECENTLY_MIRRORED` | Skip recently mirrored repos | `true` | `true`, `false` |
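
As a quick reference, the two auto flags are read with opposite defaults, matching the env-config-loader hunk later in this diff; a short sketch:

```ts
// Flag parsing as added in env-config-loader:
// AUTO_IMPORT_REPOS stays enabled unless explicitly set to "false",
// AUTO_MIRROR_REPOS stays disabled unless explicitly set to "true".
const schedule = {
  autoImport: process.env.AUTO_IMPORT_REPOS !== "false",
  autoMirror: process.env.AUTO_MIRROR_REPOS === "true",
};

console.log(schedule); // with neither variable set: { autoImport: true, autoMirror: false }
```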

View File

@@ -29,6 +29,8 @@ This guide explains how to test SSO authentication locally with Gitea Mirror.
- Client Secret: (from Google Console)
- Save the provider
Note: Provider creation uses Better Auth's SSO registration under the hood. Do not call the legacy `POST /api/sso/providers` endpoint directly; it is deprecated and reserved for internal mirroring. Use the UI or Better Auth client/server registration APIs instead.
## Option 2: Using Keycloak (Local Identity Provider)
### Setup with Docker:
@@ -113,8 +115,8 @@ npm start
2. **Provider not showing in login**
- Check browser console for errors
- Verify provider was saved successfully
- Check `/api/sso/providers` returns your providers
- Verify provider was saved successfully (via UI)
- Check `/api/sso/providers` (or `/api/sso/providers/public`) returns your providers. This list mirrors what was registered with Better Auth.
3. **Redirect URI mismatch**
- Ensure the redirect URI in your OAuth app matches exactly:

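The Better Auth registration mentioned in the note above looks roughly like the call the settings UI now issues (sketch only; every concrete value below is a placeholder, and the `as any` cast matches what the component does):

```ts
import { authClient } from "@/lib/auth-client";

// OIDC provider registration via the Better Auth SSO client, as the SSOSettings
// component now does. All concrete values here are illustrative placeholders.
await authClient.sso.register({
  providerId: "google",
  issuer: "https://accounts.google.com",
  domain: "example.com",
  oidcConfig: {
    clientId: "<client-id>",
    clientSecret: "<client-secret>",
    discoveryEndpoint: "https://accounts.google.com/.well-known/openid-configuration",
    scopes: ["openid", "email", "profile"],
    pkce: true,
  },
  mapping: {
    id: "sub",
    email: "email",
    emailVerified: "email_verified",
    name: "name",
    image: "picture",
  },
} as any);
```

After registration, the UI re-fetches `/api/sso/providers`, which lists the locally mirrored copy of what Better Auth holds.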
docs/better-auth-docs.md (new file, 31756 additions)

File diff suppressed because it is too large

View File

@@ -0,0 +1 @@
ALTER TABLE `sso_providers` ADD `saml_config` text;

File diff suppressed because it is too large

View File

@@ -43,6 +43,13 @@
"when": 1757786449446,
"tag": "0005_polite_preak",
"breakpoints": true
},
{
"idx": 6,
"version": "6",
"when": 1757825311459,
"tag": "0006_illegal_spyke",
"breakpoints": true
}
]
}

View File

@@ -1,7 +1,7 @@
{
"name": "gitea-mirror",
"type": "module",
"version": "3.7.0",
"version": "3.7.1",
"engines": {
"bun": ">=1.2.9"
},

View File

@@ -70,6 +70,8 @@ export function LoginForm() {
domain: domain,
providerId: providerId,
callbackURL: `${baseURL}/`,
errorCallbackURL: `${baseURL}/auth-error`,
newUserCallbackURL: `${baseURL}/`,
scopes: ['openid', 'email', 'profile'], // TODO: This is not being respected by the SSO plugin.
});
} catch (error) {

View File

@@ -14,6 +14,7 @@ import { Badge } from '../ui/badge';
import { Tabs, TabsContent, TabsList, TabsTrigger } from '@/components/ui/tabs';
import { Textarea } from '@/components/ui/textarea';
import { MultiSelect } from '@/components/ui/multi-select';
import { authClient } from '@/lib/auth-client';
function isTrustedIssuer(issuer: string, allowedHosts: string[]): boolean {
try {
@@ -158,50 +159,146 @@ export function SSOSettings() {
const createProvider = async () => {
setAddingProvider(true);
try {
const requestData: any = {
if (editingProvider) {
// Delete and recreate to align with Better Auth docs
try {
await apiRequest(`/sso/providers?id=${editingProvider.id}`, { method: 'DELETE' });
} catch (e) {
// Continue even if local delete fails; registration will mirror latest
console.warn('Failed to delete local provider before recreate', e);
}
// Recreate via Better Auth registration
try {
if (providerType === 'oidc') {
await authClient.sso.register({
providerId: providerForm.providerId,
issuer: providerForm.issuer,
domain: providerForm.domain,
organizationId: providerForm.organizationId || undefined,
providerType,
};
if (providerType === 'oidc') {
requestData.clientId = providerForm.clientId;
requestData.clientSecret = providerForm.clientSecret;
requestData.authorizationEndpoint = providerForm.authorizationEndpoint;
requestData.tokenEndpoint = providerForm.tokenEndpoint;
requestData.jwksEndpoint = providerForm.jwksEndpoint;
requestData.userInfoEndpoint = providerForm.userInfoEndpoint;
requestData.discoveryEndpoint = providerForm.discoveryEndpoint;
requestData.scopes = providerForm.scopes;
requestData.pkce = providerForm.pkce;
oidcConfig: {
clientId: providerForm.clientId || undefined,
clientSecret: providerForm.clientSecret || undefined,
authorizationEndpoint: providerForm.authorizationEndpoint || undefined,
tokenEndpoint: providerForm.tokenEndpoint || undefined,
jwksEndpoint: providerForm.jwksEndpoint || undefined,
userInfoEndpoint: providerForm.userInfoEndpoint || undefined,
discoveryEndpoint: providerForm.discoveryEndpoint || undefined,
scopes: providerForm.scopes,
pkce: providerForm.pkce,
},
mapping: {
id: 'sub',
email: 'email',
emailVerified: 'email_verified',
name: 'name',
image: 'picture',
},
} as any);
} else {
requestData.entryPoint = providerForm.entryPoint;
requestData.cert = providerForm.cert;
requestData.callbackUrl = providerForm.callbackUrl || `${window.location.origin}/api/auth/sso/saml2/callback/${providerForm.providerId}`;
requestData.audience = providerForm.audience || window.location.origin;
requestData.wantAssertionsSigned = providerForm.wantAssertionsSigned;
requestData.signatureAlgorithm = providerForm.signatureAlgorithm;
requestData.digestAlgorithm = providerForm.digestAlgorithm;
requestData.identifierFormat = providerForm.identifierFormat;
await authClient.sso.register({
providerId: providerForm.providerId,
issuer: providerForm.issuer,
domain: providerForm.domain,
organizationId: providerForm.organizationId || undefined,
samlConfig: {
entryPoint: providerForm.entryPoint,
cert: providerForm.cert,
callbackUrl:
providerForm.callbackUrl ||
`${window.location.origin}/api/auth/sso/saml2/callback/${providerForm.providerId}`,
audience: providerForm.audience || window.location.origin,
wantAssertionsSigned: providerForm.wantAssertionsSigned,
signatureAlgorithm: providerForm.signatureAlgorithm,
digestAlgorithm: providerForm.digestAlgorithm,
identifierFormat: providerForm.identifierFormat,
},
mapping: {
id: 'nameID',
email: 'email',
name: 'displayName',
firstName: 'givenName',
lastName: 'surname',
},
} as any);
}
toast.success('SSO provider recreated');
} catch (e: any) {
console.error('Recreate failed', e);
const msg = typeof e?.message === 'string' ? e.message : String(e);
// Common case: providerId already exists in Better Auth
if (msg.toLowerCase().includes('already exists')) {
toast.error('Provider ID already exists in auth server. Choose a new Provider ID and try again.');
} else {
showErrorToast(e, toast);
}
}
if (editingProvider) {
// Update existing provider
const updatedProvider = await apiRequest<SSOProvider>(`/sso/providers?id=${editingProvider.id}`, {
method: 'PUT',
data: requestData,
});
setProviders(providers.map(p => p.id === editingProvider.id ? updatedProvider : p));
toast.success('SSO provider updated successfully');
// Refresh providers from our API after registration mirrors into DB
const refreshed = await apiRequest<SSOProvider[] | { providers: SSOProvider[] }>(
'/sso/providers'
);
setProviders(Array.isArray(refreshed) ? refreshed : refreshed?.providers || []);
} else {
// Create new provider
const newProvider = await apiRequest<SSOProvider>('/sso/providers', {
method: 'POST',
data: requestData,
});
setProviders([...providers, newProvider]);
// Create new provider - follow Better Auth docs using the SSO client
if (providerType === 'oidc') {
await authClient.sso.register({
providerId: providerForm.providerId,
issuer: providerForm.issuer,
domain: providerForm.domain,
organizationId: providerForm.organizationId || undefined,
oidcConfig: {
clientId: providerForm.clientId || undefined,
clientSecret: providerForm.clientSecret || undefined,
authorizationEndpoint: providerForm.authorizationEndpoint || undefined,
tokenEndpoint: providerForm.tokenEndpoint || undefined,
jwksEndpoint: providerForm.jwksEndpoint || undefined,
userInfoEndpoint: providerForm.userInfoEndpoint || undefined,
discoveryEndpoint: providerForm.discoveryEndpoint || undefined,
scopes: providerForm.scopes,
pkce: providerForm.pkce,
},
mapping: {
id: 'sub',
email: 'email',
emailVerified: 'email_verified',
name: 'name',
image: 'picture',
},
} as any);
} else {
await authClient.sso.register({
providerId: providerForm.providerId,
issuer: providerForm.issuer,
domain: providerForm.domain,
organizationId: providerForm.organizationId || undefined,
samlConfig: {
entryPoint: providerForm.entryPoint,
cert: providerForm.cert,
callbackUrl:
providerForm.callbackUrl ||
`${window.location.origin}/api/auth/sso/saml2/callback/${providerForm.providerId}`,
audience: providerForm.audience || window.location.origin,
wantAssertionsSigned: providerForm.wantAssertionsSigned,
signatureAlgorithm: providerForm.signatureAlgorithm,
digestAlgorithm: providerForm.digestAlgorithm,
identifierFormat: providerForm.identifierFormat,
},
mapping: {
id: 'nameID',
email: 'email',
name: 'displayName',
firstName: 'givenName',
lastName: 'surname',
},
} as any);
}
// Refresh providers from our API after registration mirrors into DB
const refreshed = await apiRequest<SSOProvider[] | { providers: SSOProvider[] }>(
'/sso/providers'
);
setProviders(Array.isArray(refreshed) ? refreshed : refreshed?.providers || []);
toast.success('SSO provider created successfully');
}

View File

@@ -81,6 +81,8 @@ export const scheduleConfigSchema = z.object({
updateInterval: z.number().default(86400000),
skipRecentlyMirrored: z.boolean().default(true),
recentThreshold: z.number().default(3600000),
autoImport: z.boolean().default(true),
autoMirror: z.boolean().default(false),
lastRun: z.coerce.date().optional(),
nextRun: z.coerce.date().optional(),
});
@@ -152,6 +154,7 @@ export const repositorySchema = z.object({
"deleted",
"syncing",
"synced",
"archived",
])
.default("imported"),
lastMirrored: z.coerce.date().optional().nullable(),
@@ -181,6 +184,7 @@ export const mirrorJobSchema = z.object({
"deleted",
"syncing",
"synced",
"archived",
])
.default("imported"),
message: z.string(),
@@ -613,6 +617,7 @@ export const ssoProviders = sqliteTable("sso_providers", {
issuer: text("issuer").notNull(),
domain: text("domain").notNull(),
oidcConfig: text("oidc_config").notNull(), // JSON string with OIDC configuration
samlConfig: text("saml_config"), // JSON string with SAML configuration (optional)
userId: text("user_id").notNull(), // Admin who created this provider
providerId: text("provider_id").notNull().unique(), // Unique identifier for the provider
organizationId: text("organization_id"), // Optional - if provider is linked to an organization

View File

@@ -69,6 +69,8 @@ interface EnvConfig {
updateInterval?: number;
skipRecentlyMirrored?: boolean;
recentThreshold?: number;
autoImport?: boolean;
autoMirror?: boolean;
};
cleanup: {
enabled?: boolean;
@@ -157,6 +159,8 @@ function parseEnvConfig(): EnvConfig {
updateInterval: process.env.SCHEDULE_UPDATE_INTERVAL ? parseInt(process.env.SCHEDULE_UPDATE_INTERVAL, 10) : undefined,
skipRecentlyMirrored: process.env.SCHEDULE_SKIP_RECENTLY_MIRRORED === 'true',
recentThreshold: process.env.SCHEDULE_RECENT_THRESHOLD ? parseInt(process.env.SCHEDULE_RECENT_THRESHOLD, 10) : undefined,
autoImport: process.env.AUTO_IMPORT_REPOS !== 'false',
autoMirror: process.env.AUTO_MIRROR_REPOS === 'true',
},
cleanup: {
enabled: process.env.CLEANUP_ENABLED === 'true' ||
@@ -301,7 +305,8 @@ export async function initializeConfigFromEnv(): Promise<void> {
updateInterval: envConfig.schedule.updateInterval ?? existingConfig?.[0]?.scheduleConfig?.updateInterval ?? 86400000,
skipRecentlyMirrored: envConfig.schedule.skipRecentlyMirrored ?? existingConfig?.[0]?.scheduleConfig?.skipRecentlyMirrored ?? true,
recentThreshold: envConfig.schedule.recentThreshold ?? existingConfig?.[0]?.scheduleConfig?.recentThreshold ?? 3600000,
autoImport: process.env.AUTO_IMPORT_REPOS !== 'false', // New field for auto-importing new repositories
autoImport: envConfig.schedule.autoImport ?? existingConfig?.[0]?.scheduleConfig?.autoImport ?? true,
autoMirror: envConfig.schedule.autoMirror ?? existingConfig?.[0]?.scheduleConfig?.autoMirror ?? false,
lastRun: existingConfig?.[0]?.scheduleConfig?.lastRun || undefined,
nextRun: existingConfig?.[0]?.scheduleConfig?.nextRun || undefined,
};

View File

@@ -2176,6 +2176,14 @@ export async function archiveGiteaRepo(
repo: string
): Promise<void> {
try {
// Helper: sanitize to Gitea's AlphaDashDot rule
const sanitizeRepoNameAlphaDashDot = (name: string): string => {
// Replace anything that's not [A-Za-z0-9.-] with '-'
const base = name.replace(/[^A-Za-z0-9.-]+/g, "-").replace(/-+/g, "-");
// Trim leading/trailing separators and dots for safety
return base.replace(/^[.-]+/, "").replace(/[.-]+$/, "");
};
// First, check if this is a mirror repository
const repoResponse = await httpGet(
`${client.url}/api/v1/repos/${owner}/${repo}`,
@@ -2207,7 +2215,8 @@ export async function archiveGiteaRepo(
return;
}
const archivedName = `[ARCHIVED] ${currentName}`;
// Use a safe prefix and sanitize the name to satisfy AlphaDashDot rule
let archivedName = `archived-${sanitizeRepoNameAlphaDashDot(currentName)}`;
const currentDesc = repoResponse.data.description || '';
const archiveNotice = `\n\n⚠ ARCHIVED: Original GitHub repository no longer exists. Preserved as backup on ${new Date().toISOString()}`;
@@ -2216,7 +2225,8 @@ export async function archiveGiteaRepo(
? currentDesc
: currentDesc + archiveNotice;
const renameResponse = await httpPatch(
try {
await httpPatch(
`${client.url}/api/v1/repos/${owner}/${repo}`,
{
name: archivedName,
@@ -2227,13 +2237,29 @@ export async function archiveGiteaRepo(
'Content-Type': 'application/json',
}
);
if (renameResponse.status >= 400) {
// If rename fails, log but don't throw - data is still preserved
console.error(`[Archive] Failed to rename mirror repository ${owner}/${repo}: ${renameResponse.status}`);
} catch (e: any) {
// If rename fails (e.g., 422 AlphaDashDot or name conflict), attempt a timestamped fallback
const ts = new Date().toISOString().replace(/[-:T.]/g, "").slice(0, 14);
archivedName = `archived-${ts}-${sanitizeRepoNameAlphaDashDot(currentName)}`;
try {
await httpPatch(
`${client.url}/api/v1/repos/${owner}/${repo}`,
{
name: archivedName,
description: newDescription,
},
{
Authorization: `token ${client.token}`,
'Content-Type': 'application/json',
}
);
} catch (e2) {
// If this also fails, log but don't throw - data remains preserved
console.error(`[Archive] Failed to rename mirror repository ${owner}/${repo}:`, e2);
console.log(`[Archive] Repository ${owner}/${repo} remains accessible but not marked as archived`);
return;
}
}
console.log(`[Archive] Successfully marked mirror repository ${owner}/${repo} as archived (renamed to ${archivedName})`);

View File

@@ -7,7 +7,7 @@
import { db, configs, repositories } from '@/lib/db';
import { eq, and, or, sql, not, inArray } from 'drizzle-orm';
import { createGitHubClient, getGithubRepositories, getGithubStarredRepositories } from '@/lib/github';
import { createGiteaClient, deleteGiteaRepo, archiveGiteaRepo } from '@/lib/gitea';
import { createGiteaClient, deleteGiteaRepo, archiveGiteaRepo, getGiteaRepoOwnerAsync, checkRepoLocation } from '@/lib/gitea';
import { getDecryptedGitHubToken, getDecryptedGiteaToken } from '@/lib/utils/config-encryption';
import { publishEvent } from '@/lib/events';
@@ -109,26 +109,46 @@ async function handleOrphanedRepository(
const giteaToken = getDecryptedGiteaToken(config);
const giteaClient = createGiteaClient(config.giteaConfig.url, giteaToken);
// Determine the Gitea owner and repo name
const mirroredLocation = repo.mirroredLocation || '';
let giteaOwner = repo.owner;
let giteaRepoName = repo.name;
// Determine the Gitea owner and repo name more robustly
const mirroredLocation = (repo.mirroredLocation || '').trim();
let giteaOwner: string;
let giteaRepoName: string;
if (mirroredLocation) {
const parts = mirroredLocation.split('/');
if (parts.length >= 2) {
giteaOwner = parts[parts.length - 2];
giteaRepoName = parts[parts.length - 1];
}
if (mirroredLocation && mirroredLocation.includes('/')) {
const [ownerPart, namePart] = mirroredLocation.split('/');
giteaOwner = ownerPart;
giteaRepoName = namePart;
} else {
// Fall back to expected owner based on config and repo flags (starred/org overrides)
giteaOwner = await getGiteaRepoOwnerAsync({ config, repository: repo });
giteaRepoName = repo.name;
}
// Normalize the owner (trim whitespace) to avoid GetUserByName issues on some Gitea setups
giteaOwner = giteaOwner.trim();
if (action === 'archive') {
console.log(`[Repository Cleanup] Archiving orphaned repository ${repoFullName} in Gitea`);
// Best-effort check to validate actual location; falls back gracefully
try {
const { present, actualOwner } = await checkRepoLocation({
config,
repository: repo,
expectedOwner: giteaOwner,
});
if (present) {
giteaOwner = actualOwner;
}
} catch {
// Non-fatal; continue with best guess
}
await archiveGiteaRepo(giteaClient, giteaOwner, giteaRepoName);
// Update database status
await db.update(repositories).set({
status: 'archived',
isArchived: true,
errorMessage: 'Repository archived - no longer in GitHub',
updatedAt: new Date(),
}).where(eq(repositories.id, repo.id));

View File

@@ -166,6 +166,75 @@ async function runScheduledSync(config: any): Promise<void> {
}
}
// Auto-mirror: Mirror imported/pending/failed repositories if enabled
if (scheduleConfig.autoMirror) {
try {
console.log(`[Scheduler] Auto-mirror enabled - checking for repositories to mirror for user ${userId}...`);
const reposNeedingMirror = await db
.select()
.from(repositories)
.where(
and(
eq(repositories.userId, userId),
or(
eq(repositories.status, 'imported'),
eq(repositories.status, 'pending'),
eq(repositories.status, 'failed')
)
)
);
if (reposNeedingMirror.length > 0) {
console.log(`[Scheduler] Found ${reposNeedingMirror.length} repositories that need initial mirroring`);
// Prepare Octokit client
const decryptedToken = getDecryptedGitHubToken(config);
const { Octokit } = await import('@octokit/rest');
const octokit = new Octokit({ auth: decryptedToken });
// Process repositories in batches
const batchSize = scheduleConfig.batchSize || 10;
const pauseBetweenBatches = scheduleConfig.pauseBetweenBatches || 2000;
for (let i = 0; i < reposNeedingMirror.length; i += batchSize) {
const batch = reposNeedingMirror.slice(i, Math.min(i + batchSize, reposNeedingMirror.length));
console.log(`[Scheduler] Auto-mirror batch ${Math.floor(i / batchSize) + 1} of ${Math.ceil(reposNeedingMirror.length / batchSize)} (${batch.length} repos)`);
await Promise.all(
batch.map(async (repo) => {
try {
const repository: Repository = {
...repo,
status: repoStatusEnum.parse(repo.status),
organization: repo.organization ?? undefined,
lastMirrored: repo.lastMirrored ?? undefined,
errorMessage: repo.errorMessage ?? undefined,
mirroredLocation: repo.mirroredLocation || '',
forkedFrom: repo.forkedFrom ?? undefined,
visibility: repositoryVisibilityEnum.parse(repo.visibility),
};
await mirrorGithubRepoToGitea({ octokit, repository, config });
console.log(`[Scheduler] Auto-mirrored repository: ${repo.fullName}`);
} catch (error) {
console.error(`[Scheduler] Failed to auto-mirror repository ${repo.fullName}:`, error);
}
})
);
// Pause between batches if configured
if (i + batchSize < reposNeedingMirror.length) {
console.log(`[Scheduler] Pausing for ${pauseBetweenBatches}ms before next auto-mirror batch...`);
await new Promise(resolve => setTimeout(resolve, pauseBetweenBatches));
}
}
} else {
console.log(`[Scheduler] No repositories need initial mirroring`);
}
} catch (mirrorError) {
console.error(`[Scheduler] Error during auto-mirror phase for user ${userId}:`, mirrorError);
}
}
// Get repositories to sync
let reposToSync = await db
.select()

View File

@@ -11,6 +11,7 @@ import type {
} from "@/types/config";
import { z } from "zod";
import { githubConfigSchema, giteaConfigSchema, scheduleConfigSchema, cleanupConfigSchema } from "@/lib/db/schema";
import { parseInterval } from "@/lib/utils/duration-parser";
// Use the actual database schema types
type DbGitHubConfig = z.infer<typeof githubConfigSchema>;
@@ -165,27 +166,22 @@ export function mapDbToUiConfig(dbConfig: any): {
/**
* Maps UI schedule config to database schema
*/
export function mapUiScheduleToDb(uiSchedule: any): DbScheduleConfig {
export function mapUiScheduleToDb(uiSchedule: any, existing?: DbScheduleConfig): DbScheduleConfig {
// Preserve existing schedule config and only update fields controlled by the UI
const base: DbScheduleConfig = existing
? { ...(existing as unknown as DbScheduleConfig) }
: (scheduleConfigSchema.parse({}) as unknown as DbScheduleConfig);
// Store interval as seconds string to avoid lossy cron conversion
const intervalSeconds = typeof uiSchedule.interval === 'number' && uiSchedule.interval > 0
? String(uiSchedule.interval)
: (typeof base.interval === 'string' ? base.interval : String(86400));
return {
enabled: uiSchedule.enabled || false,
interval: uiSchedule.interval ? `0 */${Math.floor(uiSchedule.interval / 3600)} * * *` : "0 2 * * *", // Convert seconds to cron expression
concurrent: false,
batchSize: 10,
pauseBetweenBatches: 5000,
retryAttempts: 3,
retryDelay: 60000,
timeout: 3600000,
autoRetry: true,
cleanupBeforeMirror: false,
notifyOnFailure: true,
notifyOnSuccess: false,
logLevel: "info",
timezone: "UTC",
onlyMirrorUpdated: false,
updateInterval: 86400000,
skipRecentlyMirrored: true,
recentThreshold: 3600000,
};
...base,
enabled: !!uiSchedule.enabled,
interval: intervalSeconds,
} as DbScheduleConfig;
}
/**
@@ -202,24 +198,19 @@ export function mapDbScheduleToUi(dbSchedule: DbScheduleConfig): any {
};
}
// Extract hours from cron expression if possible
// Parse interval supporting numbers (seconds), duration strings, and cron
let intervalSeconds = 86400; // Default to daily (24 hours)
if (dbSchedule.interval) {
// Check if it's already a number (seconds), use it directly
if (typeof dbSchedule.interval === 'number') {
intervalSeconds = dbSchedule.interval;
} else if (typeof dbSchedule.interval === 'string') {
// Check if it's a cron expression
const cronMatch = dbSchedule.interval.match(/0 \*\/(\d+) \* \* \*/);
if (cronMatch) {
intervalSeconds = parseInt(cronMatch[1]) * 3600;
} else if (dbSchedule.interval === "0 2 * * *") {
// Daily at 2 AM
try {
const ms = parseInterval(
typeof dbSchedule.interval === 'number'
? dbSchedule.interval
: (dbSchedule.interval as unknown as string)
);
intervalSeconds = Math.max(1, Math.floor(ms / 1000));
} catch (_e) {
// Fallback to default if unparsable
intervalSeconds = 86400;
}
}
}
return {
enabled: dbSchedule.enabled || false,

View File

@@ -2,6 +2,9 @@ import type { APIContext } from "astro";
import { createSecureErrorResponse } from "@/lib/utils";
import { requireAuth } from "@/lib/utils/auth-helpers";
import { auth } from "@/lib/auth";
import { db, ssoProviders } from "@/lib/db";
import { eq } from "drizzle-orm";
import { nanoid } from "nanoid";
// POST /api/auth/sso/register - Register a new SSO provider using Better Auth
export async function POST(context: APIContext) {
@@ -169,6 +172,46 @@ export async function POST(context: APIContext) {
const result = await response.json();
// Mirror provider into our local sso_providers table for UI listing
try {
const existing = await db
.select()
.from(ssoProviders)
.where(eq(ssoProviders.providerId, providerId))
.limit(1);
const values: any = {
issuer: registrationBody.issuer,
domain: registrationBody.domain,
organizationId: registrationBody.organizationId,
updatedAt: new Date(),
};
if (registrationBody.oidcConfig) {
values.oidcConfig = JSON.stringify(registrationBody.oidcConfig);
}
if (registrationBody.samlConfig) {
values.samlConfig = JSON.stringify(registrationBody.samlConfig);
}
if (existing.length > 0) {
await db.update(ssoProviders).set(values).where(eq(ssoProviders.id, existing[0].id));
} else {
await db.insert(ssoProviders).values({
id: nanoid(),
issuer: registrationBody.issuer,
domain: registrationBody.domain,
oidcConfig: JSON.stringify(registrationBody.oidcConfig || {}),
samlConfig: registrationBody.samlConfig ? JSON.stringify(registrationBody.samlConfig) : undefined,
userId: user.id,
providerId: registrationBody.providerId,
organizationId: registrationBody.organizationId,
});
}
} catch (e) {
// Do not fail the main request if mirroring to local table fails
console.warn("Failed to mirror SSO provider to local DB:", e);
}
return new Response(JSON.stringify(result), {
status: 201,
headers: { "Content-Type": "application/json" },

View File

@@ -87,7 +87,10 @@ export const POST: APIRoute = async ({ request }) => {
}
// Map schedule and cleanup configs to database schema
const processedScheduleConfig = mapUiScheduleToDb(scheduleConfig);
const processedScheduleConfig = mapUiScheduleToDb(
scheduleConfig,
existingConfig ? existingConfig.scheduleConfig : undefined
);
const processedCleanupConfig = mapUiCleanupToDb(cleanupConfig);
if (existingConfig) {

View File

@@ -8,6 +8,7 @@ import type {
ScheduleSyncRepoResponse,
} from "@/types/sync";
import { createSecureErrorResponse } from "@/lib/utils";
import { parseInterval } from "@/lib/utils/duration-parser";
export const POST: APIRoute = async ({ request }) => {
try {
@@ -72,8 +73,17 @@ export const POST: APIRoute = async ({ request }) => {
// Calculate nextRun and update lastRun and nextRun in the config
const currentTime = new Date();
const interval = config.scheduleConfig?.interval ?? 3600;
const nextRun = new Date(currentTime.getTime() + interval * 1000);
let intervalMs = 3600 * 1000;
try {
intervalMs = parseInterval(
typeof config.scheduleConfig?.interval === 'number'
? config.scheduleConfig.interval
: (config.scheduleConfig?.interval as unknown as string) || '3600'
);
} catch {
intervalMs = 3600 * 1000;
}
const nextRun = new Date(currentTime.getTime() + intervalMs);
// Update the full giteaConfig object
await db

View File

@@ -4,6 +4,7 @@ import { requireAuth } from "@/lib/utils/auth-helpers";
import { db, ssoProviders } from "@/lib/db";
import { nanoid } from "nanoid";
import { eq } from "drizzle-orm";
import { auth } from "@/lib/auth";
// GET /api/sso/providers - List all SSO providers
export async function GET(context: APIContext) {
@@ -29,7 +30,11 @@ export async function GET(context: APIContext) {
}
}
// POST /api/sso/providers - Create a new SSO provider
// POST /api/sso/providers - DEPRECATED legacy create (use Better Auth registration)
// This route remains for backward-compatibility only. Preferred flow:
// - Client/UI calls authClient.sso.register(...) to register with Better Auth
// - Server mirrors provider into local DB for listing
// Creation via this route is discouraged and may be removed in a future version.
export async function POST(context: APIContext) {
try {
const { user, response } = await requireAuth(context);
@@ -45,10 +50,12 @@ export async function POST(context: APIContext) {
tokenEndpoint,
jwksEndpoint,
userInfoEndpoint,
discoveryEndpoint,
mapping,
providerId,
organizationId,
scopes,
pkce,
} = body;
// Validate required fields
@@ -62,6 +69,32 @@ export async function POST(context: APIContext) {
);
}
// Clean issuer URL (remove trailing slash); validate URL format
let cleanIssuer = issuer;
try {
const issuerUrl = new URL(issuer.toString().trim());
cleanIssuer = issuerUrl.toString().replace(/\/$/, "");
} catch {
return new Response(
JSON.stringify({ error: `Invalid issuer URL format: ${issuer}` }),
{ status: 400, headers: { "Content-Type": "application/json" } }
);
}
// Validate OIDC endpoints: require discoveryEndpoint or at least authorization+token
const hasDiscovery = typeof discoveryEndpoint === 'string' && discoveryEndpoint.trim() !== '';
const hasCoreEndpoints = typeof authorizationEndpoint === 'string' && authorizationEndpoint.trim() !== ''
&& typeof tokenEndpoint === 'string' && tokenEndpoint.trim() !== '';
if (!hasDiscovery && !hasCoreEndpoints) {
return new Response(
JSON.stringify({
error: "Invalid OIDC configuration",
details: "Provide discoveryEndpoint, or both authorizationEndpoint and tokenEndpoint."
}),
{ status: 400, headers: { "Content-Type": "application/json" } }
);
}
// Check if provider ID already exists
const existing = await db
.select()
@@ -79,15 +112,27 @@ export async function POST(context: APIContext) {
);
}
// Create OIDC config object
// Helper to validate and normalize URL strings (optional fields allowed)
const validateUrl = (value?: string) => {
if (!value || typeof value !== 'string' || value.trim() === '') return undefined;
try {
return new URL(value.trim()).toString();
} catch {
return undefined;
}
};
// Create OIDC config object (store as-is for UI and for Better Auth registration)
const oidcConfig = {
clientId,
clientSecret,
authorizationEndpoint,
tokenEndpoint,
jwksEndpoint,
userInfoEndpoint,
authorizationEndpoint: validateUrl(authorizationEndpoint),
tokenEndpoint: validateUrl(tokenEndpoint),
jwksEndpoint: validateUrl(jwksEndpoint),
userInfoEndpoint: validateUrl(userInfoEndpoint),
discoveryEndpoint: validateUrl(discoveryEndpoint),
scopes: scopes || ["openid", "email", "profile"],
pkce: pkce !== false,
mapping: mapping || {
id: "sub",
email: "email",
@@ -97,12 +142,55 @@ export async function POST(context: APIContext) {
},
};
// First, register with Better Auth so the SSO plugin has the provider
try {
const headers = new Headers();
const cookieHeader = context.request.headers.get("cookie");
if (cookieHeader) headers.set("cookie", cookieHeader);
const res = await auth.api.registerSSOProvider({
body: {
providerId,
issuer: cleanIssuer,
domain,
organizationId,
oidcConfig: {
clientId: oidcConfig.clientId,
clientSecret: oidcConfig.clientSecret,
authorizationEndpoint: oidcConfig.authorizationEndpoint,
tokenEndpoint: oidcConfig.tokenEndpoint,
jwksEndpoint: oidcConfig.jwksEndpoint,
discoveryEndpoint: oidcConfig.discoveryEndpoint,
userInfoEndpoint: oidcConfig.userInfoEndpoint,
scopes: oidcConfig.scopes,
pkce: oidcConfig.pkce,
},
mapping: oidcConfig.mapping,
},
headers,
});
if (!res.ok) {
const errText = await res.text();
return new Response(
JSON.stringify({ error: `Failed to register SSO provider: ${errText}` }),
{ status: res.status || 500, headers: { "Content-Type": "application/json" } }
);
}
} catch (err) {
const message = err instanceof Error ? err.message : String(err);
return new Response(
JSON.stringify({ error: `Better Auth registration failed: ${message}` }),
{ status: 500, headers: { "Content-Type": "application/json" } }
);
}
// Insert new provider
const [newProvider] = await db
.insert(ssoProviders)
.values({
id: nanoid(),
issuer,
issuer: cleanIssuer,
domain,
oidcConfig: JSON.stringify(oidcConfig),
userId: user.id,

View File

@@ -52,7 +52,9 @@ import MainLayout from '../../layouts/main.astro';
{ var: 'PORT', desc: 'Server port', default: '4321' },
{ var: 'HOST', desc: 'Server host', default: '0.0.0.0' },
{ var: 'BETTER_AUTH_SECRET', desc: 'Authentication secret key', default: 'Auto-generated' },
{ var: 'BETTER_AUTH_URL', desc: 'Authentication base URL', default: 'http://localhost:4321' },
{ var: 'BETTER_AUTH_URL', desc: 'Authentication base URL (public origin)', default: 'http://localhost:4321' },
{ var: 'PUBLIC_BETTER_AUTH_URL', desc: 'Optional: public URL used by the client', default: 'Unset' },
{ var: 'BETTER_AUTH_TRUSTED_ORIGINS', desc: 'Comma-separated list of additional trusted origins', default: 'Unset' },
{ var: 'NODE_EXTRA_CA_CERTS', desc: 'Path to CA certificate file', default: 'None' },
{ var: 'DATABASE_URL', desc: 'SQLite database path', default: './data/gitea-mirror.db' },
].map((item, i) => (

View File

@@ -216,7 +216,7 @@ import MainLayout from '../../layouts/main.astro';
<div class="space-y-3">
{[
'User accounts and authentication data (Better Auth)',
'OAuth applications and SSO provider configurations',
'OAuth applications and SSO provider configurations (providers registered via Better Auth; mirrored locally for UI)',
'GitHub and Gitea configuration',
'Repository and organization information',
'Mirroring job history and status',

View File

@@ -107,6 +107,23 @@ import MainLayout from '../../layouts/main.astro';
</p>
<h3 class="text-xl font-semibold mb-4">Adding an SSO Provider</h3>
<div class="bg-blue-500/10 border border-blue-500/20 rounded-lg p-4 mb-6">
<div class="flex gap-3">
<div class="text-blue-600 dark:text-blue-500">
<svg class="w-5 h-5 mt-0.5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M13 16h-1v-4h-1m1-4h.01M21 12a9 9 0 11-18 0 9 9 0 0118 0z"/>
</svg>
</div>
<div>
<p class="font-semibold text-blue-600 dark:text-blue-500 mb-1">Better Auth Registration</p>
<p class="text-sm">
Provider creation uses Better Auth's SSO registration under the hood. The legacy API route
<code class="bg-muted px-1 rounded">POST /api/sso/providers</code> is deprecated and not required for setup.
To change Provider ID or endpoints, delete and recreate the provider from the UI.
</p>
</div>
</div>
</div>
<div class="bg-card rounded-lg border border-border p-6 mb-6">
<h4 class="font-semibold mb-4">Required Information</h4>

View File

@@ -4,7 +4,10 @@ import MainLayout from '../../layouts/main.astro';
const envVars = [
{ name: 'NODE_ENV', desc: 'Runtime environment', default: 'development', example: 'production' },
{ name: 'DATABASE_URL', desc: 'SQLite database URL', default: 'file:data/gitea-mirror.db', example: 'file:path/to/database.db' },
{ name: 'JWT_SECRET', desc: 'Secret key for JWT auth', default: 'Auto-generated', example: 'your-secure-string' },
{ name: 'BETTER_AUTH_SECRET', desc: 'Authentication secret key', default: 'Auto-generated', example: 'generate a strong random string' },
{ name: 'BETTER_AUTH_URL', desc: 'Authentication base URL (public origin)', default: 'http://localhost:4321', example: 'https://gitea-mirror.example.com' },
{ name: 'PUBLIC_BETTER_AUTH_URL', desc: 'Optional: public URL used by the client', default: 'Unset', example: 'https://gitea-mirror.example.com' },
{ name: 'BETTER_AUTH_TRUSTED_ORIGINS', desc: 'Comma-separated list of additional trusted origins', default: 'Unset', example: 'https://gitea-mirror.example.com,https://alt.example.com' },
{ name: 'HOST', desc: 'Server host', default: 'localhost', example: '0.0.0.0' },
{ name: 'PORT', desc: 'Server port', default: '4321', example: '8080' }
];

View File

@@ -260,6 +260,16 @@ bun run start</code></pre>
},
{
num: '3',
title: 'Optional: Configure SSO',
items: [
'Open Configuration → Authentication',
'Click “Add Provider” and enter your OIDC details',
'Use redirect URL: https://<your-domain>/api/auth/sso/callback/{provider-id}',
'Edits are handled as delete & recreate (Better Auth registration)'
]
},
{
num: '4',
title: 'Configure Gitea Connection',
items: [
'Enter your Gitea server URL',
@@ -269,7 +279,7 @@ bun run start</code></pre>
]
},
{
num: '4',
num: '5',
title: 'Set Up Scheduling',
items: [
'Enable automatic mirroring',

View File

@@ -12,6 +12,7 @@ export const repoStatusEnum = z.enum([
"deleted",
"syncing",
"synced",
"archived",
]);
export type RepoStatus = z.infer<typeof repoStatusEnum>;

View File

@@ -0,0 +1,10 @@
import { describe, expect, it } from "bun:test";
import { repoStatusEnum } from "@/types/Repository";
describe("repoStatusEnum", () => {
it("includes archived status", () => {
const res = repoStatusEnum.safeParse("archived");
expect(res.success).toBe(true);
});
});