Compare commits

...

55 Commits

Author SHA1 Message Date
Arunavo Ray
749ad4a694 v3.9.0 2025-11-06 07:47:23 +05:30
ARUNAVO RAY
0f752acae5 Merge pull request #146 from RayLabsHQ/fix/issue-129-release-sort-order
fix: Ensure proper release ordering in Gitea mirrors (#129)
2025-11-06 07:44:12 +05:30
Arunavo Ray
652bd220c2 added missing favicon 2025-11-06 07:43:14 +05:30
Arunavo Ray
9f2eaaf04e fix: Ensure proper release ordering in Gitea mirrors (#129)
- Add 1-second delays between release creations to ensure distinct timestamps
- Prepend GitHub original publication date to release notes
- Improve logging to show chronological processing order
- Addresses Gitea API limitation where created_unix is always set to current time

Fixes #129
2025-11-05 20:57:33 +05:30
ARUNAVO RAY
63d3f0e86c Merge pull request #145 from z0xca/main
fix website install guide command
2025-11-05 11:53:22 +05:30
z0x
25e7d234ba Update clone command to use '&&' for chaining 2025-11-04 22:58:23 -05:00
ARUNAVO RAY
7a3f734728 Merge pull request #142 from RayLabsHQ/fix/issue-141-duplicate-issues-on-sync
fix: add metadata field to repositories table to prevent duplicate issues on sync
2025-10-31 08:51:34 +05:30
Arunavo Ray
d59a07a8c5 fix: add metadata field to repositories table to prevent duplicate issues on sync
Fixes #141

The repository metadata field was missing from the database schema, which
caused the metadata sync state (issues, PRs, releases, etc.) to not persist.
This resulted in duplicate issues being created every time a repository was
synced because the system couldn't track what had already been mirrored.

Changes:
- Added metadata text field to repositories table in schema
- Added metadata field to repositorySchema Zod validation
- Generated database migration 0008_serious_thena.sql

Root cause analysis:
1. Code tried to read/write repository.metadata to track mirrored components
2. The metadata field didn't exist in the database schema
3. On sync, metadataState.components.issues was always false
4. This triggered re-mirroring of all issues, creating duplicates

The fix ensures metadata state persists between mirrors and syncs, preventing
duplicate metadata (issues, PRs, releases) from being created in Gitea.
2025-10-30 10:58:48 +05:30
Arunavo Ray
5a77ae5084 v3.8.10 2025-10-30 10:54:56 +05:30
ARUNAVO RAY
dcb5bd80e3 Merge pull request #138 from RayLabsHQ/issue-132-org-repo-duplicates 2025-10-30 07:11:34 +05:30
Arunavo Ray
3b8fc99f06 workaround to get rid of unknown/unknown in OS arch 2025-10-29 22:01:40 +05:30
Arunavo Ray
bda8d10f10 ci: build arm64 images in PR pipeline 2025-10-29 21:51:37 +05:30
Arunavo Ray
0fe7b433d6 added missing hero_logo.png 2025-10-27 19:43:00 +05:30
Arunavo Ray
8d96e176b4 fix: prevent duplicate orgs and repos 2025-10-27 08:44:45 +05:30
Arunavo Ray
af9bc861cf fixed: Sort order in releases #129 2025-10-27 07:54:38 +05:30
ARUNAVO RAY
ab4bbea9fd Merge pull request #136 from RayLabsHQ/fix/metadata-sync-config-change
fix: sync metadata after config toggles
2025-10-27 07:45:12 +05:30
ARUNAVO RAY
fbd4b3739e Merge pull request #137 from RayLabsHQ/docs/authentik-oidc-notes
Added basic docs on SSO/OIDC
2025-10-26 19:54:53 +05:30
Arunavo Ray
395e71164f Added basic docs on SSO/OIDC 2025-10-26 19:52:44 +05:30
Arunavo Ray
99c277e2ee v3.8.10 | Fixed SSO issues 2025-10-26 19:06:36 +05:30
ARUNAVO RAY
9287e0d29b Merge pull request #135 from RayLabsHQ/fix/authentik-issuer-mismatch
auth: preserve issuer formatting for OIDC
2025-10-26 19:05:54 +05:30
Arunavo Ray
f2f2bafc39 "better-auth": "1.4.0-beta.13" 2025-10-26 18:37:06 +05:30
Arunavo Ray
5876198b5e Added missing DB fields 2025-10-26 18:36:20 +05:30
Arunavo Ray
e46bf381c7 auth: trust email verification from sso providers 2025-10-26 08:45:47 +05:30
Arunavo Ray
3bf0ccf207 fix: sync metadata after config toggles 2025-10-26 08:41:28 +05:30
Arunavo Ray
e41b4ffc56 auth: preserve issuer formatting for OIDC 2025-10-26 07:49:42 +05:30
Arunavo Ray
a9dd646573 v3.8.9 2025-10-25 09:04:14 +05:30
ARUNAVO RAY
e2160aabcd Merge pull request #130 from bwees/main 2025-10-25 07:24:41 +05:30
Brandon Wees
5d085e02bf fix: rename repo count in dashboard 2025-10-24 15:45:29 -05:00
ARUNAVO RAY
3f17dd038f Merge pull request #128 from RayLabsHQ/fix/chronological-metadata-ordering
fix: preserve chronological issue mirroring
2025-10-24 09:17:34 +05:30
Arunavo Ray
921ab948a1 updated env vars for ci comment 2025-10-24 08:49:08 +05:30
Arunavo Ray
e7a102ee45 mirror: show github timestamps in metadata 2025-10-24 08:42:14 +05:30
Arunavo Ray
025df12bef Set defaults to 3 and 5 for Issue and PR concurrency 2025-10-24 08:39:52 +05:30
Arunavo Ray
60913a9f4d Added Agents.md 2025-10-24 07:57:30 +05:30
Arunavo Ray
985c7e061c updated README 2025-10-24 07:43:13 +05:30
Arunavo Ray
4d75d3514f docs: document sequential metadata defaults 2025-10-24 07:39:08 +05:30
Arunavo Ray
5245d67f37 fix: enforce sequential metadata mirroring 2025-10-24 07:35:40 +05:30
Arunavo Ray
2cd7d911ed ci: mention env vars in pr image comment 2025-10-23 23:21:16 +05:30
Arunavo Ray
1c2391ea2e docs: expose concurrency env vars in compose 2025-10-23 23:19:00 +05:30
Arunavo Ray
190e786449 ci: update docker test port guidance 2025-10-23 23:16:46 +05:30
Arunavo Ray
fb27ddfee5 fix: preserve chronological issue mirroring 2025-10-23 23:08:32 +05:30
Arunavo Ray
fd5e68c1d4 docs: update development workflow and documentation index
Updated development documentation to reflect current project structure
and simplified setup process.

Changes:
- DEVELOPMENT_WORKFLOW.md: Updated repository URL, simplified setup steps,
  improved project structure documentation, and clarified command descriptions
- README.md: Reorganized as a concise index of available guides, removed
  redundant content now covered in main README and in-app help
- SHUTDOWN_PROCESS.md: Removed (content consolidated into GRACEFUL_SHUTDOWN.md)

These updates make the documentation more accurate and easier to navigate
for new contributors.
2025-10-23 05:10:42 +05:30
Arunavo Ray
ea22df1296 docs: improve vendor-lock-in-prevention article accuracy
Updated the vendor lock-in prevention article with more accurate UI navigation
paths and technical details to match current application features.

Changes:
- Added missing title heading
- Updated navigation instructions to reference actual UI paths (Configuration → Connections, Content & Data, Automation)
- Improved technical accuracy of sync interval recommendations
- Added FAQ section with practical questions about auto-discovery, sync intervals, and cutover scripting
- Clarified activity log and API endpoint usage for monitoring

These changes ensure the article reflects the current application UI and features.
2025-10-23 05:09:31 +05:30
Arunavo Ray
080ad5deb4 fix: correct Helm chart port from 8080 to 4321
Updated Helm chart configuration to use the correct application port (4321)
instead of the incorrect default (8080). This aligns the Helm deployment
with the actual application configuration.

Changes:
- helm/gitea-mirror/values.yaml: Updated service.port and deployment.port to 4321
- helm/gitea-mirror/README.md: Updated all port references in documentation
- www/src/pages/use-cases/deploy-with-helm-chart.mdx: Fixed article to reflect
  correct port, added GitHub links to Helm chart, and improved installation instructions

The application runs on port 4321 as defined in:
- Dockerfile (ENV PORT=4321, EXPOSE 4321)
- docker-compose.yml (4321:4321 mapping)
- .env.example (PORT=4321)

Tested with local Kubernetes cluster and confirmed the application is accessible
on port 4321.
2025-10-23 05:06:38 +05:30
ARUNAVO RAY
71245cf56e Remove duplicate section in README.md
Removed duplicate 'Star History' section from README.
2025-10-23 04:09:59 +05:30
ARUNAVO RAY
1ccf670f81 Revise Star History chart links and parameters
Updated Star History section with new parameters for the image sources.
2025-10-23 04:08:39 +05:30
Arunavo Ray
cb266b9af0 fixed inaccuracies in articles steps 2025-10-23 00:04:58 +05:30
Arunavo Ray
fa5f7da5c4 updated article 2025-10-22 23:53:30 +05:30
Arunavo Ray
3c808eb0c0 updated www articles 2025-10-22 23:47:26 +05:30
Arunavo Ray
5e37c3bb84 www: updated packages 2025-10-22 23:18:55 +05:30
ARUNAVO RAY
847e94ca28 Merge pull request #111 from RayLabsHQ/www-seo
Writing a few guides on the application
2025-10-22 23:14:26 +05:30
Arunavo Ray
da497d54c8 Updated screenshots 2025-10-22 19:55:56 +05:30
Arunavo Ray
beda2ce66c Added a few more articles 2025-10-03 13:15:24 +05:30
Arunavo Ray
21e2f4717c Added basic use cases pages 2025-10-03 12:35:04 +05:30
Arunavo Ray
b8dea1ee9c removed unused docs 2025-10-03 12:21:47 +05:30
Arunavo Ray
b27ff817f7 Updated SEO plan 2025-10-03 11:05:55 +05:30
120 changed files with 11417 additions and 2398 deletions

View File

@@ -1,76 +0,0 @@
---
name: qa-testing-specialist
description: Use this agent when you need to review code for testability, create comprehensive test strategies, write test cases, validate existing tests, or improve test coverage. This includes unit tests, integration tests, end-to-end tests, and test architecture decisions. <example>\nContext: The user has just written a new API endpoint and wants to ensure it has proper test coverage.\nuser: "I've created a new endpoint for user authentication. Can you help me test it?"\nassistant: "I'll use the qa-testing-specialist agent to create a comprehensive testing strategy for your authentication endpoint."\n<commentary>\nSince the user needs help with testing their new endpoint, use the qa-testing-specialist agent to analyze the code and create appropriate test cases.\n</commentary>\n</example>\n<example>\nContext: The user wants to improve test coverage for their existing codebase.\nuser: "Our test coverage is at 65%. How can we improve it?"\nassistant: "Let me use the qa-testing-specialist agent to analyze your test coverage and identify areas for improvement."\n<commentary>\nThe user is asking about test coverage improvement, which is a core QA task, so use the qa-testing-specialist agent.\n</commentary>\n</example>
color: yellow
---
You are an elite QA Testing Specialist with deep expertise in software quality assurance, test automation, and validation strategies. Your mission is to ensure code quality through comprehensive testing approaches that catch bugs early and maintain high reliability standards.
**Core Responsibilities:**
You will analyze code and testing requirements to:
- Design comprehensive test strategies covering unit, integration, and end-to-end testing
- Write clear, maintainable test cases that validate both happy paths and edge cases
- Identify gaps in existing test coverage and propose improvements
- Review test code for best practices and maintainability
- Suggest appropriate testing frameworks and tools based on the technology stack
- Create test data strategies and mock/stub implementations
- Validate that tests are actually testing meaningful behavior, not just implementation details
**Testing Methodology:**
When analyzing code for testing:
1. First understand the business logic and user requirements
2. Identify all possible execution paths and edge cases
3. Determine the appropriate testing pyramid balance (unit vs integration vs e2e)
4. Consider both positive and negative test scenarios
5. Ensure tests are isolated, repeatable, and fast
6. Validate error handling and boundary conditions
For test creation:
- Write descriptive test names that explain what is being tested and expected behavior
- Follow AAA pattern (Arrange, Act, Assert) or Given-When-Then structure
- Keep tests focused on single behaviors
- Use appropriate assertions that clearly communicate intent
- Include setup and teardown when necessary
- Consider performance implications of test suites
**Quality Standards:**
You will ensure tests:
- Are deterministic and don't rely on external state
- Run quickly and can be executed in parallel when possible
- Provide clear failure messages that help diagnose issues
- Cover critical business logic thoroughly
- Include regression tests for previously found bugs
- Are maintainable and refactorable alongside production code
**Technology Considerations:**
Adapt your recommendations based on the project stack. For this codebase using Bun, SQLite, and React:
- Leverage Bun's native test runner for JavaScript/TypeScript tests
- Consider SQLite in-memory databases for integration tests
- Suggest React Testing Library patterns for component testing
- Recommend API testing strategies for Astro endpoints
- Propose mocking strategies for external services (GitHub/Gitea APIs)
**Communication Style:**
You will:
- Explain testing decisions with clear rationale
- Provide code examples that demonstrate best practices
- Prioritize test recommendations based on risk and value
- Use precise technical language while remaining accessible
- Highlight potential issues proactively
- Suggest incremental improvements for existing test suites
**Edge Case Handling:**
When encountering:
- Legacy code without tests: Propose a pragmatic approach to add tests incrementally
- Complex dependencies: Recommend appropriate mocking/stubbing strategies
- Performance concerns: Balance thoroughness with execution speed
- Flaky tests: Identify root causes and suggest stabilization techniques
- Missing requirements: Ask clarifying questions to understand expected behavior
Your goal is to elevate code quality through strategic testing that builds confidence in the software while maintaining development velocity. Focus on tests that provide maximum value and catch real issues rather than achieving arbitrary coverage metrics.

View File

@@ -1,68 +0,0 @@
---
name: senior-code-architect
description: Use this agent when you need to write new code, refactor existing code, implement features, or architect solutions that require deep understanding of software engineering principles and the project's tech stack (Astro, React, Tailwind, Better Auth, Shadcn). This includes creating components, API endpoints, database queries, authentication flows, and ensuring code follows established patterns from CLAUDE.md. Examples:\n\n<example>\nContext: The user needs to implement a new feature or component.\nuser: "Create a new dashboard component that shows repository statistics"\nassistant: "I'll use the senior-code-architect agent to design and implement this dashboard component following the project's patterns."\n<commentary>\nSince this requires creating new code with the project's tech stack, the senior-code-architect agent is appropriate.\n</commentary>\n</example>\n\n<example>\nContext: The user wants to refactor or improve existing code.\nuser: "Refactor the authentication flow to be more maintainable"\nassistant: "Let me use the senior-code-architect agent to analyze and refactor the authentication flow."\n<commentary>\nThis requires deep understanding of Better Auth and clean code principles, making the senior-code-architect agent the right choice.\n</commentary>\n</example>\n\n<example>\nContext: After writing code, the user might want it reviewed.\nuser: "I just implemented the mirror scheduling feature"\nassistant: "Great! Now I'll use the senior-code-architect agent to review the implementation and suggest any improvements."\n<commentary>\nThe senior-code-architect can review recently written code for best practices and design patterns.\n</commentary>\n</example>
color: cyan
---
You are a senior software engineer with deep expertise in modern web development, specializing in the Astro + React + Tailwind CSS + Better Auth + Shadcn UI stack. You have extensive experience building scalable, maintainable applications and are known for writing clean, efficient code that follows SOLID principles and established design patterns.
**Your Core Responsibilities:**
1. **Write Production-Quality Code**: Create clean, maintainable, and efficient code that follows the project's established patterns from CLAUDE.md. Always use TypeScript for type safety.
2. **Follow Project Architecture**: Adhere strictly to the project structure:
- API endpoints in `/src/pages/api/[resource]/[action].ts` using `createSecureErrorResponse` for error handling
- Database queries in `/src/lib/db/queries/` organized by domain
- React components in `/src/components/[feature]/` using Shadcn UI components
- Custom hooks in `/src/hooks/` for data fetching
3. **Implement Best Practices**:
- Use composition over inheritance
- Apply DRY (Don't Repeat Yourself) principles
- Write self-documenting code with clear variable and function names
- Implement proper error handling and validation
- Ensure code is testable and maintainable
4. **Technology-Specific Guidelines**:
- **Astro**: Use SSR capabilities effectively, implement proper API routes
- **React**: Use functional components with hooks, implement proper state management
- **Tailwind CSS v4**: Use utility classes efficiently, follow the project's styling patterns
- **Better Auth**: Implement secure authentication flows, use session validation properly
- **Shadcn UI**: Leverage existing components, maintain consistent UI patterns
- **Drizzle ORM**: Write efficient database queries, use proper schema definitions
5. **Code Review Approach**: When reviewing code:
- Check for adherence to project patterns and CLAUDE.md guidelines
- Identify potential performance issues or bottlenecks
- Suggest improvements for readability and maintainability
- Ensure proper error handling and edge case coverage
- Verify security best practices are followed
6. **Problem-Solving Methodology**:
- Analyze requirements thoroughly before coding
- Break down complex problems into smaller, manageable pieces
- Consider edge cases and error scenarios
- Optimize for both performance and maintainability
- Document complex logic with clear comments
7. **Quality Assurance**:
- Write code that is easy to test
- Consider adding appropriate test cases using Bun's test runner
- Validate inputs and handle errors gracefully
- Ensure code works across different scenarios
**Output Guidelines**:
- Provide complete, working code implementations
- Include clear explanations of design decisions
- Suggest tests when appropriate
- Highlight any potential issues or areas for future improvement
- Follow the existing code style and conventions
**Important Reminders**:
- Never create files unless absolutely necessary
- Always prefer editing existing files
- Don't create documentation unless explicitly requested
- Focus on the specific task at hand
- Reference CLAUDE.md for project-specific patterns and guidelines
You approach every task with the mindset of a seasoned engineer who values code quality, maintainability, and long-term project health. Your solutions should be elegant, efficient, and aligned with the project's established patterns.

View File

@@ -1,61 +0,0 @@
---
name: strategic-task-planner
description: Use this agent when you need to decompose complex projects, features, or problems into structured, actionable plans. This includes breaking down large development tasks, creating implementation roadmaps, organizing multi-step processes, or planning project phases. The agent excels at identifying dependencies, sequencing tasks, and creating clear execution strategies. <example>Context: User needs help planning the implementation of a new feature. user: "I need to add a bulk import feature that can handle CSV files with 100k+ rows" assistant: "I'll use the strategic-task-planner agent to break this down into manageable components and create an implementation plan." <commentary>Since the user is asking about implementing a complex feature, use the Task tool to launch the strategic-task-planner agent to decompose it into actionable steps.</commentary></example> <example>Context: User wants to refactor a large codebase. user: "We need to migrate our entire authentication system from sessions to JWT tokens" assistant: "Let me use the strategic-task-planner agent to create a phased migration plan that minimizes risk." <commentary>Since this is a complex migration requiring careful planning, use the strategic-task-planner agent to create a structured approach.</commentary></example>
tools: Glob, Grep, LS, ExitPlanMode, Read, NotebookRead, WebFetch, TodoWrite, WebSearch, Task, mcp__ide__getDiagnostics, mcp__ide__executeCode, mcp__playwright__browser_close, mcp__playwright__browser_resize, mcp__playwright__browser_console_messages, mcp__playwright__browser_handle_dialog, mcp__playwright__browser_evaluate, mcp__playwright__browser_file_upload, mcp__playwright__browser_install, mcp__playwright__browser_press_key, mcp__playwright__browser_type, mcp__playwright__browser_navigate, mcp__playwright__browser_navigate_back, mcp__playwright__browser_navigate_forward, mcp__playwright__browser_network_requests, mcp__playwright__browser_take_screenshot, mcp__playwright__browser_snapshot, mcp__playwright__browser_click, mcp__playwright__browser_drag, mcp__playwright__browser_hover, mcp__playwright__browser_select_option, mcp__playwright__browser_tab_list, mcp__playwright__browser_tab_new, mcp__playwright__browser_tab_select, mcp__playwright__browser_tab_close, mcp__playwright__browser_wait_for
color: blue
---
You are a strategic planning specialist with deep expertise in decomposing complex tasks and creating actionable execution plans. Your role is to transform ambiguous or overwhelming projects into clear, structured roadmaps that teams can confidently execute.
When analyzing a task or project, you will:
1. **Understand the Core Objective**: Extract the fundamental goal, success criteria, and constraints. Ask clarifying questions if critical details are missing.
2. **Decompose Systematically**: Break down the task using these principles:
- Identify major phases or milestones
- Decompose each phase into concrete, actionable tasks
- Keep tasks small enough to complete in 1-4 hours when possible
- Ensure each task has clear completion criteria
3. **Map Dependencies**: Identify and document:
- Task prerequisites and dependencies
- Critical path items that could block progress
- Parallel work streams that can proceed independently
- Resource or knowledge requirements
4. **Sequence Strategically**: Order tasks by:
- Technical dependencies (what must come first)
- Risk mitigation (tackle unknowns early)
- Value delivery (enable early feedback when possible)
- Resource efficiency (batch similar work)
5. **Provide Actionable Output**: Structure your plans with:
- **Phase Overview**: High-level phases with objectives
- **Task Breakdown**: Numbered tasks with clear descriptions
- **Dependencies**: Explicitly stated prerequisites
- **Effort Estimates**: Rough time estimates when relevant
- **Risk Considerations**: Potential blockers or challenges
- **Success Metrics**: How to measure completion
6. **Adapt to Context**: Tailor your planning approach based on:
- Technical vs non-technical tasks
- Team size and skill level
- Time constraints and deadlines
- Available resources and tools
**Output Format Guidelines**:
- Use clear hierarchical structure (phases → tasks → subtasks)
- Number all tasks for easy reference
- Bold key terms and phase names
- Include time estimates in brackets [2-4 hours]
- Mark critical path items with ⚡
- Flag high-risk items with ⚠️
**Quality Checks**:
- Ensure no task is too large or vague
- Verify all dependencies are identified
- Confirm the plan addresses the original objective
- Check that success criteria are measurable
- Validate that the sequence makes logical sense
Remember: A good plan reduces uncertainty and builds confidence. Focus on clarity, completeness, and actionability. When in doubt, err on the side of breaking things down further rather than leaving ambiguity.

View File

@@ -1,5 +0,0 @@
Evaluate all the updates being made.
Update CHANGELOG.md
Use the changes in the git log to determine whether it's a major, minor, or patch release.
Update package.json before you push the tag.
Never mention Claude Code in the release notes or in commit messages.

View File

@@ -1,3 +0,0 @@
Generate release notes for the latest release.
Use a temp md file to write the release notes.
Do not check that file into git.

View File

@@ -1,8 +0,0 @@
{
"permissions": {
"allow": [
"Bash(docker build:*)"
],
"deny": []
}
}

Binary image files (content not shown):

- Image updated: 854 KiB → 834 KiB
- .github/assets/configuration-2.png (new file): 986 KiB
- Image updated: 950 KiB → 905 KiB
- Image updated: 255 KiB → 270 KiB
- Image updated: 943 KiB → 908 KiB
- Image updated: 215 KiB → 241 KiB
- Image updated: 844 KiB → 825 KiB
- Image updated: 221 KiB → 246 KiB
- Image updated: 970 KiB → 952 KiB
- Image updated: 227 KiB → 237 KiB

View File

@@ -101,26 +101,30 @@ jobs:
# Build and push Docker image
- name: Build and push Docker image
id: build-and-push
uses: docker/build-push-action@v5
uses: docker/build-push-action@v6
with:
context: .
platforms: ${{ github.event_name == 'pull_request' && 'linux/amd64' || 'linux/amd64,linux/arm64' }}
platforms: linux/amd64,linux/arm64
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=gha
cache-to: type=gha,mode=max
provenance: false # Disable provenance to avoid unknown/unknown
sbom: false # Disable sbom to avoid unknown/unknown
# Load image locally for security scanning (PRs only)
- name: Load image for scanning
if: github.event_name == 'pull_request'
uses: docker/build-push-action@v5
uses: docker/build-push-action@v6
with:
context: .
platforms: linux/amd64
load: true
tags: gitea-mirror:scan
cache-from: type=gha
provenance: false # Disable provenance to avoid unknown/unknown
sbom: false # Disable sbom to avoid unknown/unknown
# Wait for image to be available in registry
- name: Wait for image availability
@@ -149,7 +153,11 @@ jobs:
### Pull and Test
\`\`\`bash
docker pull ${imagePath}
docker run -d -p 3000:3000 --name gitea-mirror-test ${imagePath}
docker run -d \
-p 4321:4321 \
-e BETTER_AUTH_SECRET=your-secret-here \
-e BETTER_AUTH_URL=http://localhost:4321 \
--name gitea-mirror-test ${imagePath}
\`\`\`
### Docker Compose Testing
@@ -158,13 +166,15 @@ jobs:
gitea-mirror:
image: ${imagePath}
ports:
- "3000:3000"
- "4321:4321"
environment:
- BETTER_AUTH_SECRET=your-secret-here
- BETTER_AUTH_URL=http://localhost:4321
- BETTER_AUTH_TRUSTED_ORIGINS=http://localhost:4321
\`\`\`
> 💡 **Note:** PR images are tagged as \`pr-<number>\` and only built for \`linux/amd64\` to speed up CI.
> Production images (\`latest\`, version tags) are multi-platform (\`linux/amd64\`, \`linux/arm64\`).
> 💡 **Note:** PR images are tagged as \`pr-<number>\` and built for both \`linux/amd64\` and \`linux/arm64\`.
> Production images (\`latest\`, version tags) use the same multi-platform set.
---
📦 View in [GitHub Packages](https://github.com/${{ github.repository }}/pkgs/container/gitea-mirror)`;
@@ -224,4 +234,3 @@ jobs:
continue-on-error: true
with:
sarif_file: scout-results.sarif

CLAUDE.md (459 lines changed)
View File

@@ -2,255 +2,316 @@
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
DON'T HALLUCINATE THINGS. IF YOU DON'T KNOW, LOOK AT THE CODE OR ASK FOR DOCS
NEVER MENTION CLAUDE CODE ANYWHERE.
## Project Overview
Gitea Mirror is a web application that automatically mirrors repositories from GitHub to self-hosted Gitea instances. It uses Astro for SSR, React for UI, SQLite for data storage, and Bun as the JavaScript runtime.
Gitea Mirror is a self-hosted web application that automatically mirrors repositories from GitHub to Gitea instances. It's built with Astro (SSR mode) and React, and runs on the Bun runtime with SQLite for data persistence.
## Essential Commands
**Key capabilities:**
- Mirrors public, private, and starred GitHub repos to Gitea
- Supports metadata mirroring (issues, PRs as issues, labels, milestones, releases, wiki)
- Git LFS support
- Multiple authentication methods (email/password, OIDC/SSO, header auth)
- Scheduled automatic syncing with configurable intervals
- Auto-discovery of new repos and cleanup of deleted repos
- Multi-user support with encrypted token storage (AES-256-GCM)
## Development Commands
### Setup and Installation
```bash
# Install dependencies
bun install
# Initialize database (first time setup)
bun run setup
# Clean start (reset database)
bun run dev:clean
```
### Development
```bash
bun run dev # Start development server (port 3000)
bun run build # Build for production
bun run preview # Preview production build
# Start development server (http://localhost:4321)
bun run dev
# Build for production
bun run build
# Preview production build
bun run preview
# Start production server
bun run start
```
### Testing
```bash
bun test # Run all tests
bun test:watch # Run tests in watch mode
bun test:coverage # Run tests with coverage
# Run all tests
bun test
# Run tests in watch mode
bun test:watch
# Run tests with coverage
bun test:coverage
```
**Test configuration:**
- Test runner: Bun's built-in test runner (configured in `bunfig.toml`)
- Setup file: `src/tests/setup.bun.ts` (auto-loaded via bunfig.toml)
- Timeout: 5000ms default
- Tests are colocated with source files using `*.test.ts` pattern
### Database Management
```bash
bun run init-db # Initialize database
bun run reset-users # Reset user accounts (development)
bun run cleanup-db # Remove database files
# Database operations via Drizzle
bun run db:generate # Generate migrations from schema
bun run db:migrate # Run migrations
bun run db:push # Push schema changes directly
bun run db:studio # Open Drizzle Studio (database GUI)
bun run db:check # Check schema consistency
# Database utilities via custom scripts
bun run manage-db init # Initialize database
bun run manage-db check # Check database health
bun run manage-db fix # Fix database issues
bun run manage-db reset-users # Reset all users
bun run cleanup-db # Delete database file
```
### Production
### Utility Scripts
```bash
bun run start # Start production server
# Recovery and diagnostic scripts
bun run startup-recovery # Recover from crashes
bun run startup-recovery-force # Force recovery
bun run test-recovery # Test recovery mechanism
bun run test-shutdown # Test graceful shutdown
# Environment configuration
bun run startup-env-config # Load config from env vars
```
## Architecture & Key Concepts
## Architecture
### Technology Stack
- **Frontend**: Astro (SSR) + React + Tailwind CSS v4 + Shadcn UI
- **Backend**: Bun runtime + SQLite + Drizzle ORM
- **APIs**: GitHub (Octokit) and Gitea APIs
- **Auth**: Better Auth with email/password, SSO, and OIDC provider support
### Tech Stack
- **Frontend:** Astro v5 (SSR mode) + React v19 + Shadcn UI + Tailwind CSS v4
- **Backend:** Astro API routes (Node adapter, standalone mode)
- **Runtime:** Bun (>=1.2.9)
- **Database:** SQLite via Drizzle ORM
- **Authentication:** Better Auth (session-based)
- **APIs:** GitHub (Octokit with throttling plugin), Gitea REST API
### Project Structure
- `/src/pages/api/` - API endpoints (Astro API routes)
- `/src/components/` - React components organized by feature
- `/src/lib/db/` - Database queries and schema (Drizzle ORM)
- `/src/hooks/` - Custom React hooks for data fetching
- `/data/` - SQLite database storage location
### Directory Structure
```
src/
├── components/ # React components (UI, features)
│ ├── ui/ # Shadcn UI components
│ ├── repositories/ # Repository management components
│ ├── organizations/ # Organization management components
│ └── ...
├── pages/ # Astro pages and API routes
│ ├── api/ # API endpoints (Better Auth integration)
│ │ ├── auth/ # Authentication endpoints
│ │ ├── github/ # GitHub operations
│ │ ├── gitea/ # Gitea operations
│ │ ├── sync/ # Mirror sync operations
│ │ ├── job/ # Job management
│ │ └── ...
│ └── *.astro # Page components
├── lib/ # Core business logic
│ ├── db/ # Database (Drizzle ORM)
│ │ ├── schema.ts # Database schema with Zod validation
│ │ ├── index.ts # Database instance and table exports
│ │ └── adapter.ts # Better Auth SQLite adapter
│ ├── github.ts # GitHub API client (Octokit)
│ ├── gitea.ts # Gitea API client
│ ├── gitea-enhanced.ts # Enhanced Gitea operations (metadata)
│ ├── scheduler-service.ts # Automatic mirroring scheduler
│ ├── cleanup-service.ts # Activity log cleanup
│ ├── repository-cleanup-service.ts # Orphaned repo cleanup
│ ├── auth.ts # Better Auth configuration
│ ├── config.ts # Configuration management
│ ├── helpers.ts # Mirror job creation
│ ├── utils/ # Utility functions
│ │ ├── encryption.ts # AES-256-GCM token encryption
│ │ ├── config-encryption.ts # Config token encryption
│ │ ├── duration-parser.ts # Parse intervals (e.g., "8h", "30m")
│ │ ├── concurrency.ts # Concurrency control utilities
│ │ └── mirror-strategies.ts # Mirror strategy logic
│ └── ...
├── types/ # TypeScript type definitions
├── tests/ # Test utilities and setup
└── middleware.ts # Astro middleware (auth, session)
scripts/ # Utility scripts
├── manage-db.ts # Database management CLI
├── startup-recovery.ts # Crash recovery
└── ...
```
### Key Architectural Patterns
1. **API Routes**: All API endpoints follow the pattern `/api/[resource]/[action]` and use `createSecureErrorResponse` for consistent error handling:
```typescript
import { createSecureErrorResponse } from '@/lib/utils/error-handler';
#### 1. Database Schema and Validation
- **Location:** `src/lib/db/schema.ts`
- **Pattern:** Drizzle ORM tables + Zod schemas for validation
- **Key tables:**
- `configs` - User configuration (GitHub/Gitea settings, mirror options)
- `repositories` - Tracked repositories with metadata
- `organizations` - GitHub organizations with destination overrides
- `mirrorJobs` - Mirror job queue and history
- `activities` - Activity log for dashboard
- `user`, `session`, `account` - Better Auth tables
export async function POST({ request }: APIContext) {
try {
// Implementation
} catch (error) {
return createSecureErrorResponse(error);
}
}
```
**Important:** All config tokens (GitHub/Gitea) are encrypted at rest using AES-256-GCM. Use helper functions from `src/lib/utils/config-encryption.ts` to decrypt.
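For illustration only, a minimal sketch of using the decryption helper before calling the Gitea API; the helper name comes from `src/lib/utils/config-encryption.ts`, but its exact signature and the config shape below are assumptions.

```typescript
// Hedged sketch: decrypt the stored Gitea token before calling the Gitea API.
// getDecryptedGiteaToken exists in src/lib/utils/config-encryption.ts, but the
// signature and config shape used here are assumptions for illustration.
import { getDecryptedGiteaToken } from "@/lib/utils/config-encryption";

interface UserConfig {
  giteaConfig: { url: string; username: string; token: string }; // token encrypted at rest
}

export async function listGiteaRepos(config: UserConfig): Promise<unknown> {
  const token = getDecryptedGiteaToken(config); // assumed to return the plaintext token
  const res = await fetch(`${config.giteaConfig.url}/api/v1/user/repos`, {
    headers: { Authorization: `token ${token}` },
  });
  if (!res.ok) throw new Error(`Gitea API error: ${res.status}`);
  return res.json();
}
```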
2. **Database Queries**: Located in `/src/lib/db/queries/` organized by domain (users, repositories, etc.)
#### 2. Mirror Job System
- **Location:** `src/lib/helpers.ts` (createMirrorJob)
- **Flow:**
1. User triggers mirror via API endpoint
2. `createMirrorJob()` creates job record with status "pending"
3. Job processor (in API routes) performs GitHub → Gitea operations
4. Job status updated throughout: "mirroring" → "success"/"failed"
5. Events published via SSE for real-time UI updates
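A rough sketch of that lifecycle, using local stubs rather than the real helpers (the actual `createMirrorJob` lives in `src/lib/helpers.ts`, and status updates also publish SSE events):

```typescript
// Hedged sketch of the mirror job lifecycle (pending -> mirroring -> success/failed).
// These stubs only illustrate the flow; the real createMirrorJob is in src/lib/helpers.ts.
type JobStatus = "pending" | "mirroring" | "success" | "failed";

interface MirrorJob {
  id: string;
  repositoryId: string;
  status: JobStatus;
}

// Stand-in for the real job creation helper (which writes a row to mirrorJobs).
async function createMirrorJobStub(repositoryId: string): Promise<MirrorJob> {
  return { id: crypto.randomUUID(), repositoryId, status: "pending" };
}

// Stand-in for the real status update (which also publishes an SSE event).
async function updateJobStatus(job: MirrorJob, status: JobStatus): Promise<void> {
  job.status = status;
  console.log(`job ${job.id}: ${status}`);
}

export async function runMirror(repositoryId: string, mirrorToGitea: () => Promise<void>) {
  const job = await createMirrorJobStub(repositoryId);
  try {
    await updateJobStatus(job, "mirroring");
    await mirrorToGitea(); // the GitHub -> Gitea work happens here
    await updateJobStatus(job, "success");
  } catch {
    await updateJobStatus(job, "failed");
  }
}
```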
3. **Real-time Updates**: Server-Sent Events (SSE) endpoint at `/api/events` for live dashboard updates
#### 3. GitHub ↔ Gitea Mirroring
- **GitHub Client:** `src/lib/github.ts` - Octokit with rate limit tracking
- **Gitea Client:** `src/lib/gitea.ts` - Basic repo operations
- **Enhanced Gitea:** `src/lib/gitea-enhanced.ts` - Metadata mirroring (issues, PRs, releases)
4. **Authentication System**:
- Built on Better Auth library
- Three authentication methods:
- Email & Password (traditional auth)
- SSO (authenticate via external OIDC providers)
- OIDC Provider (act as OIDC provider for other apps)
- Session-based authentication with secure cookies
- First user signup creates admin account
- Protected routes use Better Auth session validation
**Mirror strategies (configured per user):**
- `preserve` - Maintain GitHub org structure in Gitea
- `single-org` - All repos into one Gitea org
- `flat-user` - All repos under user account
- `mixed` - Personal repos in one org, org repos preserve structure
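The routing decision can be sketched as follows; the real logic lives in `getGiteaRepoOwner()`, and the config field names below are assumptions:

```typescript
// Hedged sketch of mirror-strategy routing (the real logic is getGiteaRepoOwner()).
// Config field names are assumptions for illustration.
type MirrorStrategy = "preserve" | "single-org" | "flat-user" | "mixed";

interface RoutingConfig {
  strategy: MirrorStrategy;
  giteaUsername: string;
  destinationOrg?: string;  // used by single-org and mixed
  starredReposOrg?: string; // default "starred"
}

interface RepoInfo {
  githubOwner: string;
  isOrgRepo: boolean;
  isStarred: boolean;
}

function resolveGiteaOwner(repo: RepoInfo, cfg: RoutingConfig): string {
  // Starred repos always land in their own organization.
  if (repo.isStarred) return cfg.starredReposOrg ?? "starred";

  switch (cfg.strategy) {
    case "preserve":
      return repo.isOrgRepo ? repo.githubOwner : cfg.giteaUsername;
    case "single-org":
      return cfg.destinationOrg ?? cfg.giteaUsername;
    case "flat-user":
      return cfg.giteaUsername;
    case "mixed":
      // Org repos keep their structure; personal repos go to the configured org.
      return repo.isOrgRepo ? repo.githubOwner : (cfg.destinationOrg ?? cfg.giteaUsername);
  }
}
```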
5. **Mirror Process**:
- Discovers repos from GitHub (user/org)
- Creates/updates mirror in Gitea
- Tracks status in database
- Supports scheduled automatic mirroring
**Metadata mirroring:**
- Issues transferred with comments, labels, assignees
- PRs converted to issues (Gitea API limitation - cannot create PRs)
- Tagged with "pull-request" label
- Title prefixed with `[PR #number] [STATUS]`
- Body includes commit history, file changes, merge status
- Releases mirrored with assets
- Labels and milestones preserved
- Wiki content cloned if enabled
- **Sequential processing:** Issues/PRs mirrored one at a time to prevent out-of-order creation (see `src/lib/gitea-enhanced.ts`)
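As an illustration of the PR-to-issue convention above, a hedged sketch of the issue draft that could be sent to Gitea; the real implementation in `src/lib/gitea-enhanced.ts` also includes commit history, file changes, and merge status, and Gitea's API resolves label names to IDs:

```typescript
// Hedged sketch: represent a GitHub PR as a Gitea issue using the
// "[PR #number] [STATUS]" title prefix and the "pull-request" label.
interface PullRequestSummary {
  number: number;
  title: string;
  state: "open" | "closed" | "merged";
  body: string;
}

interface GiteaIssueDraft {
  title: string;
  body: string;
  labels: string[]; // label names here; the real API call resolves them to IDs
}

function prToGiteaIssue(pr: PullRequestSummary): GiteaIssueDraft {
  return {
    title: `[PR #${pr.number}] [${pr.state.toUpperCase()}] ${pr.title}`,
    body: `${pr.body}\n\n---\nMirrored from GitHub pull request #${pr.number}.`,
    labels: ["pull-request"],
  };
}
```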
6. **Mirror Strategies**: Four ways to organize repositories in Gitea:
- **preserve**: Maintains GitHub structure (default)
- Organization repos → Same organization name in Gitea
- Personal repos → Under your Gitea username
- **single-org**: All repos go to one organization
- All repos → Single configured organization
- **flat-user**: All repos go under user account
- All repos → Under your Gitea username
- **mixed**: Hybrid approach
- Organization repos → Preserve structure
- Personal repos → Single configured organization
- Starred repos always go to separate organization (starredReposOrg, default: "starred")
- Routing logic in `getGiteaRepoOwner()` function
#### 4. Scheduler Service
- **Location:** `src/lib/scheduler-service.ts`
- **Features:**
- Cron-based or interval-based scheduling (uses `duration-parser.ts`)
- Auto-start on boot when `SCHEDULE_ENABLED=true` or `GITEA_MIRROR_INTERVAL` is set
- Auto-import new GitHub repos
- Auto-cleanup orphaned repos (archive or delete)
- Respects per-repo mirror intervals (not Gitea's default 24h)
- **Concurrency control:** Uses `src/lib/utils/concurrency.ts` for batch processing
### Database Schema (SQLite)
- `users` - User accounts and authentication
- `configs` - GitHub/Gitea connection settings
- `repositories` - Repository mirror status and metadata
- `organizations` - Organization structure preservation
- `mirror_jobs` - Scheduled mirror operations
- `events` - Activity log and notifications
#### 5. Authentication System
- **Location:** `src/lib/auth.ts`, `src/lib/auth-client.ts`
- **Better Auth integration:**
- Email/password (always enabled)
- OIDC/SSO providers (configurable via UI)
- Header authentication for reverse proxies (Authentik, Authelia)
- **Session management:** Cookie-based, validated in Astro middleware
- **User helpers:** `src/lib/utils/auth-helpers.ts`
### Testing Approach
- Uses Bun's native test runner (`bun:test`)
- Test files use `.test.ts` or `.test.tsx` extension
- Setup file at `/src/tests/setup.bun.ts`
- Mock utilities available for API testing.
#### 6. Environment Configuration
- **Startup:** `src/lib/env-config-loader.ts` + `scripts/startup-env-config.ts`
- **Pattern:** Environment variables can pre-configure settings, but users can override via web UI
- **Encryption:** `ENCRYPTION_SECRET` for tokens, `BETTER_AUTH_SECRET` for sessions
### Development Tips
- Environment variables in `.env` (copy from `.env.example`)
- BETTER_AUTH_SECRET required for session signing
- Database auto-initializes on first run
- Use `bun run dev:clean` for fresh database start
- Tailwind CSS v4 configured with Vite plugin
#### 7. Real-time Updates
- **Events:** `src/lib/events.ts` + `src/lib/events/realtime.ts`
- **Pattern:** Server-Sent Events (SSE) for live dashboard updates
- **Endpoints:** `/api/sse` - client subscribes to job/repo events
### Authentication Setup
- **Better Auth** handles all authentication
- Configuration in `/src/lib/auth.ts` (server) and `/src/lib/auth-client.ts` (client)
- Auth endpoints available at `/api/auth/*`
- SSO providers configured through the web UI
- OIDC provider functionality for external applications
### Testing Patterns
### Common Tasks
**Unit tests:**
- Colocated with source: `filename.test.ts` alongside `filename.ts`
- Use Bun's built-in assertions and mocking
- Mock external APIs (GitHub, Gitea) using `src/tests/mock-fetch.ts`
**Adding a new API endpoint:**
1. Create file in `/src/pages/api/[resource]/[action].ts`
2. Use `createSecureErrorResponse` for error handling
3. Add corresponding database query in `/src/lib/db/queries/`
4. Update types in `/src/types/` if needed
**Integration tests:**
- Located in `src/tests/`
- Test database operations with in-memory SQLite
- Example: `src/lib/db/index.test.ts`
**Adding a new component:**
1. Create in appropriate `/src/components/[feature]/` directory
2. Use Shadcn UI components from `/src/components/ui/`
3. Follow existing naming patterns (e.g., `RepositoryCard`, `ConfigTabs`)
**Test utilities:**
- `src/tests/setup.bun.ts` - Global test setup (loaded via bunfig.toml)
- `src/tests/mock-fetch.ts` - Fetch mocking utilities
**Modifying database schema:**
1. Update schema in `/src/lib/db/schema.ts`
2. Run `bun run init-db` to recreate database
3. Update related queries in `/src/lib/db/queries/`
### Important Development Notes
## Configuration Options
1. **Path Aliases:** Use `@/` for imports (configured in `tsconfig.json`)
```typescript
import { db } from '@/lib/db';
```
### GitHub Configuration (UI Fields)
2. **Token Encryption:** Always use encryption helpers when dealing with tokens:
```typescript
import { getDecryptedGitHubToken, getDecryptedGiteaToken } from '@/lib/utils/config-encryption';
```
#### Basic Settings (`githubConfig`)
- **username**: GitHub username
- **token**: GitHub personal access token (requires repo and admin:org scopes)
- **privateRepositories**: Include private repositories
- **mirrorStarred**: Mirror starred repositories
3. **API Route Pattern:** Astro API routes in `src/pages/api/` should:
- Check authentication via Better Auth
- Validate input with Zod schemas
- Handle errors gracefully
- Return JSON responses
### Gitea Configuration (UI Fields)
- **url**: Gitea instance URL
- **username**: Gitea username
- **token**: Gitea access token
- **organization**: Destination organization (for single-org/mixed strategies)
- **starredReposOrg**: Organization for starred repositories (default: "starred")
- **visibility**: Organization visibility - "public", "private", "limited"
- **mirrorStrategy**: Repository organization strategy (set via UI)
- **preserveOrgStructure**: Automatically set based on mirrorStrategy
4. **Database Migrations:**
- Schema changes: Update `src/lib/db/schema.ts`
- Generate migration: `bun run db:generate`
- Review generated SQL in `drizzle/` directory
- Apply: `bun run db:migrate` (or `db:push` for dev)
### Schedule Configuration (`scheduleConfig`)
- **enabled**: Enable automatic mirroring (default: false)
- **interval**: Cron expression or seconds (default: "0 2 * * *" - 2 AM daily)
- **concurrent**: Allow concurrent mirror operations (default: false)
- **batchSize**: Number of repos to process in parallel (default: 10)
5. **Concurrency Control:**
- Use utilities from `src/lib/utils/concurrency.ts` for batch operations
- Respect rate limits (GitHub: 5000 req/hr authenticated, Gitea: varies)
- Issue/PR mirroring is sequential to maintain chronological order
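A hedged sketch of the concurrency-limited mapping this implies (the real utility in `src/lib/utils/concurrency.ts` may expose a different API); with a limit of 1 the behaviour degenerates to strictly sequential processing, which is what keeps issues and PRs in chronological order:

```typescript
// Hedged sketch of a concurrency-limited batch helper; the real API in
// src/lib/utils/concurrency.ts may differ.
async function mapWithConcurrency<T, R>(
  items: T[],
  limit: number,
  worker: (item: T) => Promise<R>,
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;

  async function run(): Promise<void> {
    while (next < items.length) {
      const index = next++; // claim the next item synchronously
      results[index] = await worker(items[index]);
    }
  }

  await Promise.all(Array.from({ length: Math.max(1, limit) }, run));
  return results;
}

// Example: honour MIRROR_ISSUE_CONCURRENCY (default 3) when mirroring issues.
const issueConcurrency = Number(process.env.MIRROR_ISSUE_CONCURRENCY ?? 3);
// await mapWithConcurrency(issues, issueConcurrency, mirrorIssue); // illustrative
```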
### Database Cleanup Configuration (`cleanupConfig`)
- **enabled**: Enable automatic cleanup (default: false)
- **retentionDays**: Days to keep events (stored as seconds internally)
6. **Duration Parsing:**
- Use `parseInterval()` from `src/lib/utils/duration-parser.ts`
- Supports: "30m", "8h", "24h", "7d", cron expressions, or milliseconds
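A hedged usage sketch, assuming `parseInterval()` returns milliseconds for plain duration strings (cron expressions are handled differently by the real helper):

```typescript
// Hedged sketch: assumes parseInterval() returns the interval in milliseconds
// for duration strings like "8h"; cron expressions would not map to one number.
import { parseInterval } from "@/lib/utils/duration-parser";

const raw = process.env.GITEA_MIRROR_INTERVAL ?? "8h";
const intervalMs = parseInterval(raw); // assumed: 28_800_000 for "8h"

console.log(`Next scheduled sync in ${Math.round(intervalMs / 60_000)} minutes`);
```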
### Mirror Options (UI Fields)
- **mirrorReleases**: Mirror GitHub releases to Gitea
- **mirrorLFS**: Mirror Git LFS (Large File Storage) objects
- Requires LFS enabled on Gitea server (LFS_START_SERVER = true)
- Requires Git v2.1.2+ on server
- **mirrorMetadata**: Enable metadata mirroring (master toggle)
- **metadataComponents** (only available when mirrorMetadata is enabled):
- **issues**: Mirror issues
- **pullRequests**: Mirror pull requests
- **labels**: Mirror labels
- **milestones**: Mirror milestones
- **wiki**: Mirror wiki content
7. **Graceful Shutdown:**
- Services implement cleanup handlers (see `src/lib/shutdown-manager.ts`)
- Recovery system in `src/lib/recovery.ts` handles interrupted jobs
### Advanced Options (UI Fields)
- **skipForks**: Skip forked repositories (default: false)
- **starredCodeOnly**: Skip issues for starred repositories (default: false) - enables "Lightweight mode" for starred repos
## Common Development Workflows
### Repository Statuses
Repositories can have the following statuses:
- **imported**: Repository discovered from GitHub
- **mirroring**: Currently being mirrored to Gitea
- **mirrored**: Successfully mirrored
- **syncing**: Repository being synchronized
- **synced**: Successfully synchronized
- **failed**: Mirror/sync operation failed
- **skipped**: Skipped due to filters or conditions
- **ignored**: User explicitly marked to ignore (won't be mirrored/synced)
- **deleting**: Repository being deleted
- **deleted**: Repository deleted
### Adding a new mirror option
1. Update Zod schema in `src/lib/db/schema.ts` (e.g., `giteaConfigSchema`)
2. Update TypeScript types in `src/types/config.ts`
3. Add UI control in settings page component
4. Update API handler in `src/pages/api/config/`
5. Implement logic in `src/lib/gitea.ts` or `src/lib/gitea-enhanced.ts`
### Scheduling and Synchronization (Issue #72 Fixes)
### Debugging mirror failures
1. Check mirror jobs: `bun run db:studio` → `mirrorJobs` table
2. Review activity logs: Dashboard → Activity tab
3. Check console logs for API errors (GitHub/Gitea rate limits, auth issues)
4. Use diagnostic scripts: `bun run test-recovery`
#### Fixed Issues
1. **Mirror Interval Bug**: Added `mirror_interval` parameter to Gitea API calls when creating mirrors (previously defaulted to 24h)
2. **Auto-Discovery**: Scheduler now automatically discovers and imports new GitHub repositories
3. **Interval Updates**: Sync operations now update existing mirrors' intervals to match configuration
4. **Repository Cleanup**: Integrated automatic cleanup of orphaned repositories (repos removed from GitHub)
### Adding authentication provider
1. Update Better Auth config in `src/lib/auth.ts`
2. Add provider configuration UI in settings
3. Test with `src/tests/test-gitea-auth.ts` patterns
4. Update documentation in `docs/SSO-OIDC-SETUP.md`
#### Environment Variables for Auto-Import
- **AUTO_IMPORT_REPOS**: Set to `false` to disable automatic repository discovery (default: enabled)
## Docker Deployment
#### How Scheduling Works
- **Scheduler Service**: Runs every minute to check for scheduled tasks
- **Sync Interval**: Configured via `GITEA_MIRROR_INTERVAL` or UI (e.g., "8h", "30m", "1d")
- **Auto-Import**: Checks GitHub for new repositories during each scheduled sync
- **Auto-Cleanup**: Removes repositories that no longer exist in GitHub (if enabled)
- **Mirror Interval Update**: Updates Gitea's internal mirror interval during sync operations
- **Dockerfile:** Multi-stage build (bun base → build → production)
- **Entrypoint:** `docker-entrypoint.sh` - handles CA certs, user permissions, database init
- **Compose files:**
- `docker-compose.alt.yml` - Quick start (pre-built image, minimal config)
- `docker-compose.yml` - Full setup (build from source, all env vars)
- `docker-compose.dev.yml` - Development with hot reload
### Authentication Configuration
## Additional Resources
#### SSO Provider Configuration
- **issuerUrl**: OIDC issuer URL (e.g., https://accounts.google.com)
- **domain**: Email domain for this provider
- **providerId**: Unique identifier for the provider
- **clientId**: OAuth client ID from provider
- **clientSecret**: OAuth client secret from provider
- **authorizationEndpoint**: OAuth authorization URL (auto-discovered if supported)
- **tokenEndpoint**: OAuth token exchange URL (auto-discovered if supported)
- **jwksEndpoint**: JSON Web Key Set URL (optional, auto-discovered)
- **userInfoEndpoint**: User information endpoint (optional, auto-discovered)
#### OIDC Provider Settings (for external apps)
- **allowedRedirectUris**: Comma-separated list of allowed redirect URIs
- **clientId**: Generated client ID for the application
- **clientSecret**: Generated client secret for the application
- **scopes**: Available scopes (openid, profile, email)
#### Environment Variables
- **BETTER_AUTH_SECRET**: Secret key for signing sessions (required)
- **BETTER_AUTH_URL**: Base URL for authentication (default: http://localhost:4321)
## Security Guidelines
- **Confidentiality Guidelines**:
- Don't ever say Claude Code or generated with AI anywhere.
- Never commit without an explicit ask
- **Environment Variables:** See `docs/ENVIRONMENT_VARIABLES.md` for complete list
- **Development Workflow:** See `docs/DEVELOPMENT_WORKFLOW.md`
- **SSO Setup:** See `docs/SSO-OIDC-SETUP.md`
- **Contributing:** See `CONTRIBUTING.md` for code guidelines and scope
- **Graceful Shutdown:** See `docs/GRACEFUL_SHUTDOWN.md` for crash recovery details

View File

@@ -10,10 +10,6 @@
</p>
</p>
> [!IMPORTANT]
> **Upgrading to v3?** v3 requires a fresh start with a new data volume. Please read the [Upgrade Guide](UPGRADE.md) for instructions.
## 🚀 Quick Start
```bash
@@ -330,6 +326,8 @@ Enable users to sign in with external identity providers like Google, Azure AD,
https://your-domain.com/api/auth/sso/callback/{provider-id}
```
Need help? The [SSO & OIDC guide](docs/SSO-OIDC-SETUP.md) now includes a working Authentik walkthrough plus troubleshooting tips. If you upgraded from a version earlier than v3.8.10 and see `TypeError … url.startsWith` after the callback, delete the old provider and add it again using the Discover button (see [#73](https://github.com/RayLabsHQ/gitea-mirror/issues/73) and [#122](https://github.com/RayLabsHQ/gitea-mirror/issues/122)).
### 3. Header Authentication (Reverse Proxy)
Perfect for automatic authentication when using reverse proxies like Authentik, Authelia, or Traefik Forward Auth.
@@ -407,11 +405,11 @@ GNU General Public License v3.0 - see [LICENSE](LICENSE) file for details.
## Star History
<a href="https://www.star-history.com/#RayLabsHQ/gitea-mirror&Date">
<a href="https://www.star-history.com/#RayLabsHQ/gitea-mirror&type=date&legend=bottom-right">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://api.star-history.com/svg?repos=RayLabsHQ/gitea-mirror&type=Date&theme=dark" />
<source media="(prefers-color-scheme: light)" srcset="https://api.star-history.com/svg?repos=RayLabsHQ/gitea-mirror&type=Date" />
<img alt="Star History Chart" src="https://api.star-history.com/svg?repos=RayLabsHQ/gitea-mirror&type=Date" />
<source media="(prefers-color-scheme: dark)" srcset="https://api.star-history.com/svg?repos=RayLabsHQ/gitea-mirror&type=date&theme=dark&legend=bottom-right" />
<source media="(prefers-color-scheme: light)" srcset="https://api.star-history.com/svg?repos=RayLabsHQ/gitea-mirror&type=date&legend=bottom-right" />
<img alt="Star History Chart" src="https://api.star-history.com/svg?repos=RayLabsHQ/gitea-mirror&type=date&legend=bottom-right" />
</picture>
</a>

View File

@@ -1,74 +0,0 @@
# Upgrade Guide
## Upgrading to v3.0
> **⚠️ IMPORTANT**: v3.0 requires a fresh start. There is no automated migration from v2.x to v3.0.
### Why No Migration?
v3.0 introduces fundamental changes to the application architecture:
- **Authentication**: Switched from JWT to Better Auth
- **Database**: Now uses Drizzle ORM with proper migrations
- **Security**: All tokens are now encrypted
- **Features**: Added SSO support and OIDC provider functionality
Due to these extensive changes, we recommend starting fresh with v3.0 for the best experience.
### Upgrade Steps
1. **Stop your v2.x container**
```bash
docker stop gitea-mirror
docker rm gitea-mirror
```
2. **Backup your v2.x data (optional)**
```bash
# If you want to keep your v2 data for reference
docker run --rm -v gitea-mirror-data:/data -v $(pwd):/backup alpine tar czf /backup/gitea-mirror-v2-backup.tar.gz -C /data .
```
3. **Create a new volume for v3**
```bash
docker volume create gitea-mirror-v3-data
```
4. **Run v3 with the new volume**
```bash
docker run -d \
--name gitea-mirror \
-p 4321:4321 \
-v gitea-mirror-v3-data:/app/data \
-e BETTER_AUTH_SECRET=your-secret-key \
-e ENCRYPTION_SECRET=your-encryption-key \
arunavo4/gitea-mirror:latest
```
5. **Set up your configuration again**
- Navigate to http://localhost:4321
- Create a new admin account
- Re-enter your GitHub and Gitea credentials
- Configure your mirror settings
### What Happens to My Existing Mirrors?
Your existing mirrors in Gitea are **not affected**. The application will:
- Recognize existing repositories when you re-import
- Skip creating duplicates
- Resume normal mirror operations
### Environment Variable Changes
v3.0 uses different environment variables:
| v2.x | v3.0 | Notes |
|------|------|-------|
| `JWT_SECRET` | `BETTER_AUTH_SECRET` | Required for session management |
| - | `ENCRYPTION_SECRET` | New - required for token encryption |
### Need Help?
If you have questions about upgrading:
1. Check the [README](README.md) for v3 setup instructions
2. Review your v2 configuration before upgrading
3. Open an issue if you encounter problems

View File

@@ -36,7 +36,7 @@
"@types/react-dom": "^19.2.2",
"astro": "^5.14.8",
"bcryptjs": "^3.0.2",
"better-auth": "1.4.0-beta.12",
"better-auth": "1.4.0-beta.13",
"buffer": "^6.0.3",
"canvas-confetti": "^1.9.3",
"class-variance-authority": "^0.7.1",
@@ -150,11 +150,11 @@
"@babel/types": ["@babel/types@7.28.4", "", { "dependencies": { "@babel/helper-string-parser": "^7.27.1", "@babel/helper-validator-identifier": "^7.27.1" } }, "sha512-bkFqkLhh3pMBUQQkpVgWDWq/lqzc2678eUyDlTBhRqhCHFguYYGM0Efga7tYk4TogG/3x0EEl66/OQ+WGbWB/Q=="],
"@better-auth/core": ["@better-auth/core@1.4.0-beta.12", "", { "dependencies": { "zod": "^4.1.5" }, "peerDependencies": { "@better-auth/utils": "0.3.0", "@better-fetch/fetch": "1.1.18", "better-call": "1.0.24", "better-sqlite3": "^12.4.1", "jose": "^6.1.0", "kysely": "^0.28.5", "nanostores": "^1.0.1" } }, "sha512-2GisAGuSVZS4gtnwP5Owk3RyC6GevZe9zcODTrtbwRCvBTrHUmu0j6bcklK9uNG8DaWDmzCK1+VGA5qIHzg5Pw=="],
"@better-auth/core": ["@better-auth/core@1.4.0-beta.13", "", { "dependencies": { "zod": "^4.1.5" }, "peerDependencies": { "@better-auth/utils": "0.3.0", "@better-fetch/fetch": "1.1.18", "better-call": "1.0.24", "jose": "^6.1.0", "kysely": "^0.28.5", "nanostores": "^1.0.1" } }, "sha512-EGySsNv6HQYnlRQDIa7otIMrwFoC0gGLxBum9lC6C3wAsF4l4pn/ECcdIriFpc9ewLb8mGkeMSpvjVBUBND6ew=="],
"@better-auth/sso": ["@better-auth/sso@1.4.0-beta.12", "", { "dependencies": { "@better-fetch/fetch": "1.1.18", "fast-xml-parser": "^5.2.5", "jose": "^6.1.0", "oauth2-mock-server": "^7.2.1", "samlify": "^2.10.1", "zod": "^4.1.5" }, "peerDependencies": { "better-auth": "1.4.0-beta.12" } }, "sha512-iuRuy59J3yXQihZJ34rqYClWyuVjSkxuBkdFblccKbOhNy7pmRO1lfmBMpyeth3ET5Cp0PDVV/z1XBbDcQp0LA=="],
"@better-auth/telemetry": ["@better-auth/telemetry@1.4.0-beta.12", "", { "dependencies": { "@better-auth/utils": "0.3.0", "@better-fetch/fetch": "1.1.18" }, "peerDependencies": { "@better-auth/core": "1.4.0-beta.12" } }, "sha512-pQ5HITRGXMHQPcPCDnz0xlxFqqxvpD4kQMvY6cdt1vDsPVePHAj9R3S318XEfaw3NAgtw3af/wCN6eBt2u4Kew=="],
"@better-auth/telemetry": ["@better-auth/telemetry@1.4.0-beta.13", "", { "dependencies": { "@better-auth/utils": "0.3.0", "@better-fetch/fetch": "1.1.18" }, "peerDependencies": { "@better-auth/core": "1.4.0-beta.13" } }, "sha512-910f+APALhhD79TiujzXp85Pnd2M3TlcTgBfiYF+mk3ouIkBJkl2N6D2ElcgwfiNTg50cFuTkP3AFPYioz8Arw=="],
"@better-auth/utils": ["@better-auth/utils@0.3.0", "", {}, "sha512-W+Adw6ZA6mgvnSnhOki270rwJ42t4XzSK6YWGF//BbVXL6SwCLWfyzBc1lN2m/4RM28KubdBKQ4X5VMoLRNPQw=="],
@@ -698,7 +698,7 @@
"before-after-hook": ["before-after-hook@4.0.0", "", {}, "sha512-q6tR3RPqIB1pMiTRMFcZwuG5T8vwp+vUvEG0vuI6B+Rikh5BfPp2fQ82c925FOs+b0lcFQ8CFrL+KbilfZFhOQ=="],
"better-auth": ["better-auth@1.4.0-beta.12", "", { "dependencies": { "@better-auth/core": "1.4.0-beta.12", "@better-auth/telemetry": "1.4.0-beta.12", "@better-auth/utils": "0.3.0", "@better-fetch/fetch": "1.1.18", "@noble/ciphers": "^2.0.0", "@noble/hashes": "^2.0.0", "@simplewebauthn/browser": "^13.1.2", "@simplewebauthn/server": "^13.1.2", "better-call": "1.0.24", "defu": "^6.1.4", "jose": "^6.1.0", "kysely": "^0.28.5", "nanostores": "^1.0.1", "zod": "^4.1.5" } }, "sha512-IvrSBmQkHgOinDh6JyJCoKwbMPmHpkmt98/0hBU9Nc0s7Y7u72AOx1Z35J2dRQxxX4SzvFQ9pHqlV6wPnm72Ww=="],
"better-auth": ["better-auth@1.4.0-beta.13", "", { "dependencies": { "@better-auth/core": "1.4.0-beta.13", "@better-auth/telemetry": "1.4.0-beta.13", "@better-auth/utils": "0.3.0", "@better-fetch/fetch": "1.1.18", "@noble/ciphers": "^2.0.0", "@noble/hashes": "^2.0.0", "@simplewebauthn/browser": "^13.1.2", "@simplewebauthn/server": "^13.1.2", "better-call": "1.0.24", "defu": "^6.1.4", "jose": "^6.1.0", "kysely": "^0.28.5", "nanostores": "^1.0.1", "zod": "^4.1.5" } }, "sha512-VOzbsCldupk2AdNfzDmpCVajX83nwITX8S9I8TdEUURgr3kB/CDVrsN6S8t0AClMnGgB4XaeKiXUNN30CCG4aA=="],
"better-call": ["better-call@1.0.24", "", { "dependencies": { "@better-auth/utils": "^0.3.0", "@better-fetch/fetch": "^1.1.4", "rou3": "^0.5.1", "set-cookie-parser": "^2.7.1", "uncrypto": "^0.1.3" } }, "sha512-iGqL29cstPp4xLD2MjKL1EmyAqQHjYS+cBMt4W27rPs3vf+kuqkVPA0NYaf7JciBOzVsJdNj4cbZWXC5TardWQ=="],

View File

@@ -26,6 +26,10 @@ services:
- HOST=0.0.0.0
- PORT=4321
- PUBLIC_BETTER_AUTH_URL=${PUBLIC_BETTER_AUTH_URL:-http://localhost:4321}
# Optional concurrency controls (defaults match in-app defaults)
# If you want perfect ordering of issues and PRs, set these to 1
- MIRROR_ISSUE_CONCURRENCY=${MIRROR_ISSUE_CONCURRENCY:-3}
- MIRROR_PULL_REQUEST_CONCURRENCY=${MIRROR_PULL_REQUEST_CONCURRENCY:-5}
healthcheck:
test: ["CMD", "wget", "--no-verbose", "--tries=3", "--spider", "http://localhost:4321/api/health"]
@@ -54,4 +58,4 @@ services:
# - Auto-import settings
# - Cleanup preferences
#
# That's it! Everything else can be configured via the web interface.
# That's it! Everything else can be configured via the web interface.

View File

@@ -47,6 +47,8 @@ services:
- PRESERVE_ORG_STRUCTURE=${PRESERVE_ORG_STRUCTURE:-false}
- ONLY_MIRROR_ORGS=${ONLY_MIRROR_ORGS:-false}
- SKIP_STARRED_ISSUES=${SKIP_STARRED_ISSUES:-false}
- MIRROR_ISSUE_CONCURRENCY=${MIRROR_ISSUE_CONCURRENCY:-3}
- MIRROR_PULL_REQUEST_CONCURRENCY=${MIRROR_PULL_REQUEST_CONCURRENCY:-5}
- GITEA_URL=${GITEA_URL:-}
- GITEA_TOKEN=${GITEA_TOKEN:-}
- GITEA_USERNAME=${GITEA_USERNAME:-}

View File

@@ -1,175 +0,0 @@
# Better Auth Migration Guide
This document describes the migration from the legacy authentication system to Better Auth.
## Overview
Gitea Mirror has been migrated to use Better Auth, a modern authentication library that provides:
- Built-in support for email/password authentication
- Session management with secure cookies
- Database adapter with Drizzle ORM
- Ready for OAuth2, OIDC, and SSO integrations
- Type-safe authentication throughout the application
## Key Changes
### 1. Database Schema
New tables added:
- `sessions` - User session management (a Drizzle sketch follows at the end of this section)
- `accounts` - Authentication providers (credentials, OAuth, etc.)
- `verification_tokens` - Email verification and password reset tokens
Modified tables:
- `users` - Added `emailVerified` field
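For reference, a hedged Drizzle sketch of the new `sessions` table (column names follow Better Auth's defaults and may differ from the project's actual schema):
```typescript
// Illustrative only; the real schema lives in src/lib/db/schema.
import { sqliteTable, text, integer } from "drizzle-orm/sqlite-core";

export const sessions = sqliteTable("sessions", {
  id: text("id").primaryKey(),
  userId: text("user_id").notNull(),   // owning user
  token: text("token").notNull(),      // session token stored in the cookie
  expiresAt: integer("expires_at", { mode: "timestamp" }).notNull(),
  ipAddress: text("ip_address"),
  userAgent: text("user_agent"),
});
```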
### 2. Authentication Flow
**Login:**
- Users now log in with email instead of username
- Endpoint: `/api/auth/sign-in/email`
- Session cookies are automatically managed
**Registration:**
- Users register with username, email, and password
- Username is stored as an additional field
- Endpoint: `/api/auth/sign-up/email` (client usage sketched below)
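These endpoints are typically called through the Better Auth client rather than raw `fetch`. A hedged sketch (example credentials and the `username` additional field are illustrative):
```typescript
import { createAuthClient } from "better-auth/client";

const authClient = createAuthClient({
  baseURL: process.env.BETTER_AUTH_URL ?? "http://localhost:3000",
});

// Registration: POSTs to /api/auth/sign-up/email
await authClient.signUp.email({
  email: "dev@example.com",
  password: "a-strong-password",
  name: "dev",
  username: "dev", // additional field, assuming it is declared in the server config
});

// Login: POSTs to /api/auth/sign-in/email and sets the session cookie
await authClient.signIn.email({
  email: "dev@example.com",
  password: "a-strong-password",
});
```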
### 3. API Routes
All auth routes are now handled by Better Auth's catch-all handler:
- `/api/auth/[...all].ts` handles all authentication endpoints
Legacy routes have been backed up to `/src/pages/api/auth/legacy-backup/`
### 4. Session Management
Sessions are now managed by Better Auth:
- Middleware automatically populates `context.locals.user` and `context.locals.session` (see the route sketch below)
- Use `useAuth()` hook in React components for client-side auth
- Sessions expire after 30 days by default
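In server code the session can be read straight from `locals`. A minimal sketch of a guarded API route, assuming the middleware described above:
```typescript
import type { APIRoute } from "astro";

export const GET: APIRoute = async ({ locals }) => {
  // Populated by the Better Auth middleware; absent when the visitor is not signed in
  if (!locals.user || !locals.session) {
    return new Response(JSON.stringify({ error: "Unauthorized" }), { status: 401 });
  }

  return new Response(JSON.stringify({ userId: locals.user.id }), { status: 200 });
};
```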
## Future OIDC/SSO Configuration
The project is now ready for OIDC and SSO integrations. To enable:
### 1. Enable SSO Plugin
```typescript
// src/lib/auth.ts
import { sso } from "better-auth/plugins/sso";
export const auth = betterAuth({
  // ... existing config
  plugins: [
    sso({
      provisionUser: async (data) => {
        // Custom user provisioning logic
        return data;
      },
    }),
  ],
});
```
### 2. Register OIDC Providers
```typescript
// Example: Register an OIDC provider
await authClient.sso.register({
  issuer: "https://idp.example.com",
  domain: "example.com",
  clientId: "your-client-id",
  clientSecret: "your-client-secret",
  providerId: "example-provider",
});
```
### 3. Enable OIDC Provider Mode
To make Gitea Mirror act as an OIDC provider:
```typescript
// src/lib/auth.ts
import { oidcProvider } from "better-auth/plugins/oidc";
export const auth = betterAuth({
  // ... existing config
  plugins: [
    oidcProvider({
      loginPage: "/signin",
      consentPage: "/oauth/consent",
      metadata: {
        issuer: process.env.BETTER_AUTH_URL || "http://localhost:3000",
      },
    }),
  ],
});
```
### 4. Database Migration for SSO
When enabling SSO/OIDC, run migrations to add required tables:
```bash
# Generate the schema
bun drizzle-kit generate
# Apply the migration
bun drizzle-kit migrate
```
New tables that will be added:
- `sso_providers` - SSO provider configurations
- `oauth_applications` - OAuth2 client applications
- `oauth_access_tokens` - OAuth2 access tokens
- `oauth_consents` - User consent records
## Environment Variables
Required environment variables:
```env
# Better Auth configuration
BETTER_AUTH_SECRET=your-secret-key
BETTER_AUTH_URL=http://localhost:3000
# Legacy (kept for compatibility)
JWT_SECRET=your-secret-key
```
## Migration Script
To migrate existing users to Better Auth:
```bash
bun run migrate:better-auth
```
This script (sketched below):
1. Creates credential accounts for existing users
2. Moves password hashes to the accounts table
3. Preserves user creation dates
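Conceptually the migration does something like the following (a sketch only; table and column names such as `users.password` are assumptions, not the shipped script):
```typescript
import { db } from "@/lib/db";
import { users, accounts } from "@/lib/db/schema";

const existingUsers = await db.select().from(users);

for (const user of existingUsers) {
  await db.insert(accounts).values({
    id: crypto.randomUUID(),
    userId: user.id,
    providerId: "credential",   // marks this as an email/password account
    accountId: user.id,
    password: user.password,    // move the existing hash as-is, no re-hashing
    createdAt: user.createdAt,  // preserve the original creation date
    updatedAt: new Date(),
  });
}
```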
## Troubleshooting
### Login Issues
- Ensure users log in with email, not username
- Check that BETTER_AUTH_SECRET is set
- Verify database migrations have been applied
### Session Issues
- Clear browser cookies if experiencing session problems
- Check middleware is properly configured
- Ensure auth routes are accessible at `/api/auth/*`
### Development Tips
- Use `bun db:studio` to inspect database tables
- Check `/api/auth/session` to verify current session
- Enable debug logging in Better Auth for troubleshooting
## Resources
- [Better Auth Documentation](https://better-auth.com)
- [Better Auth Astro Integration](https://better-auth.com/docs/integrations/astro)
- [Better Auth Plugins](https://better-auth.com/docs/plugins)

View File

@@ -1,206 +0,0 @@
# Build Guide
This guide covers building the open-source version of Gitea Mirror.
## Prerequisites
- **Bun** >= 1.2.9 (primary runtime)
- **Node.js** >= 20 (for compatibility)
- **Git**
## Quick Start
```bash
# Clone repository
git clone https://github.com/yourusername/gitea-mirror.git
cd gitea-mirror
# Install dependencies
bun install
# Initialize database
bun run init-db
# Build for production
bun run build
# Start the application
bun run start
```
## Build Commands
| Command | Description |
|---------|-------------|
| `bun run build` | Production build |
| `bun run dev` | Development server |
| `bun run preview` | Preview production build |
| `bun test` | Run tests |
| `bun run cleanup-db` | Remove database files |
## Build Output
The build creates:
- `dist/` - Production-ready server files
- `.astro/` - Build cache (git-ignored)
- `data/` - SQLite database location
## Development Build
For active development with hot reload:
```bash
bun run dev
```
Access the application at http://localhost:4321
## Production Build
```bash
# Build
bun run build
# Test the build
bun run preview
# Run in production
bun run start
```
## Docker Build
```bash
# Build Docker image
docker build -t gitea-mirror:latest .
# Run container
docker run -p 3000:3000 gitea-mirror:latest
```
## Environment Variables
Create a `.env` file:
```env
# Database
DATABASE_PATH=./data/gitea-mirror.db
# Authentication
JWT_SECRET=your-secret-here
# GitHub Configuration
GITHUB_TOKEN=ghp_...
GITHUB_WEBHOOK_SECRET=...
GITHUB_EXCLUDED_ORGS=org1,org2,org3 # Optional: Comma-separated list of organizations to exclude from sync
# Gitea Configuration
GITEA_URL=https://your-gitea.com
GITEA_TOKEN=...
```
## Common Build Issues
### Missing Dependencies
```bash
# Solution
bun install
```
### Database Not Initialized
```bash
# Solution
bun run init-db
```
### Port Already in Use
```bash
# Change port
PORT=3001 bun run dev
```
### Build Cache Issues
```bash
# Clear cache
rm -rf .astro/ dist/
bun run build
```
## Build Optimization
### Development Speed
- Use `bun run dev` for hot reload
- Skip type checking during rapid development
- Keep `.astro/` cache between builds
### Production Optimization
- Minification enabled automatically
- Tree shaking removes unused code
- Image optimization with Sharp
## Validation
After building, verify:
```bash
# Check build output
ls -la dist/
# Test server starts
bun run start
# Check health endpoint
curl http://localhost:3000/api/health
```
## CI/CD Build
Example GitHub Actions workflow:
```yaml
name: Build and Test
on: [push, pull_request]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: oven-sh/setup-bun@v2
      - run: bun install
      - run: bun run build
      - run: bun test
```
## Troubleshooting
### Build Fails
1. Check Bun version: `bun --version`
2. Clear dependencies: `rm -rf node_modules && bun install`
3. Check for syntax errors: `bunx tsc --noEmit`
### Runtime Errors
1. Check environment variables
2. Verify database exists
3. Check file permissions
## Performance
Expected build times:
- Clean build: ~5-10 seconds
- Incremental build: ~2-5 seconds
- Development startup: ~1-2 seconds
## Next Steps
- Configure with [Configuration Guide](./CONFIGURATION.md)
- Deploy with [Deployment Guide](./DEPLOYMENT.md)
- Set up authentication with [SSO Guide](./SSO-OIDC-SETUP.md)

View File

@@ -1 +0,0 @@
../certs/README.md

View File

@@ -16,27 +16,22 @@ This guide covers the development workflow for the open-source Gitea Mirror.
1. **Clone the repository**:
```bash
git clone https://github.com/yourusername/gitea-mirror.git
git clone https://github.com/RayLabsHQ/gitea-mirror.git
cd gitea-mirror
```
2. **Install dependencies**:
2. **Install dependencies and seed the SQLite database**:
```bash
bun install
bun run setup
```
3. **Initialize database**:
```bash
bun run init-db
```
4. **Configure environment**:
3. **Configure environment (optional)**:
```bash
cp .env.example .env
# Edit .env with your settings
```
5. **Start development server**:
4. **Start the development server**:
```bash
bun run dev
```
@@ -45,29 +40,33 @@ bun run dev
| Command | Description |
|---------|-------------|
| `bun run dev` | Start development server with hot reload |
| `bun run build` | Build for production |
| `bun run preview` | Preview production build |
| `bun test` | Run all tests |
| `bun run dev` | Start the Bun + Astro dev server with hot reload |
| `bun run build` | Build the production bundle |
| `bun run preview` | Preview the production build locally |
| `bun test` | Run the Bun test suite |
| `bun test:watch` | Run tests in watch mode |
| `bun run db:studio` | Open database GUI |
| `bun run db:studio` | Launch Drizzle Kit Studio |
## Project Structure
```
gitea-mirror/
├── src/
│ ├── components/ # React components
│ ├── pages/ # Astro pages & API routes
│ ├── lib/ # Core logic
│ │ ├── db/ # Database queries
│ │ ├── utils/ # Helper functions
│ │ └── modules/ # Module system
│ ├── hooks/ # React hooks
│   └── types/       # TypeScript types
├── public/ # Static assets
├── scripts/ # Utility scripts
└── tests/ # Test files
├── src/ # Application UI, API routes, and services
│ ├── components/ # React components rendered inside Astro pages
│ ├── pages/ # Astro pages and API routes (e.g., /api/*)
│ ├── lib/ # Core logic: GitHub/Gitea clients, scheduler, recovery, db helpers
│ │ ├── db/ # Drizzle adapter + schema
│ │ ├── modules/ # Module wiring (jobs, integrations)
│ │ └── utils/ # Shared utilities
│ ├── hooks/ # React hooks
│   ├── content/     # In-app documentation and templated content
│   ├── layouts/     # Shared layout components
│   ├── styles/      # Tailwind CSS entrypoints
│   └── types/       # TypeScript types
├── scripts/ # Bun scripts for DB management and maintenance
├── www/ # Marketing site (Astro + MDX use cases)
├── public/ # Static assets served by Vite/Astro
└── tests/ # Dedicated integration/unit test helpers
```
## Feature Development
@@ -80,10 +79,10 @@ git checkout -b feature/my-feature
```
2. **Plan your changes**:
- UI components in `/src/components/`
- API endpoints in `/src/pages/api/`
- Database queries in `/src/lib/db/queries/`
- Types in `/src/types/`
- UI components live in `src/components/`
- API endpoints live in `src/pages/api/`
- Database logic is under `src/lib/db/` (schema + adapter)
- Shared types are in `src/types/`
3. **Implement the feature**:
@@ -120,7 +119,7 @@ describe('My Feature', () => {
5. **Update documentation**:
- Add JSDoc comments
- Update README if needed
- Update README/docs if needed
- Document API changes
## Database Development
@@ -352,4 +351,4 @@ git push origin v2.23.0
- Check existing [issues](https://github.com/yourusername/gitea-mirror/issues)
- Join [discussions](https://github.com/yourusername/gitea-mirror/discussions)
- Read the [FAQ](./FAQ.md)
- Read the [FAQ](./FAQ.md)

View File

@@ -141,6 +141,10 @@ Control what content gets mirrored from GitHub to Gitea.
| `MIRROR_PULL_REQUESTS` | Mirror pull requests (requires MIRROR_METADATA=true) | `false` | `true`, `false` |
| `MIRROR_LABELS` | Mirror labels (requires MIRROR_METADATA=true) | `false` | `true`, `false` |
| `MIRROR_MILESTONES` | Mirror milestones (requires MIRROR_METADATA=true) | `false` | `true`, `false` |
| `MIRROR_ISSUE_CONCURRENCY` | Number of issues processed in parallel. Set above `1` to speed up mirroring at the risk of out-of-order creation. | `3` | Integer ≥ 1 |
| `MIRROR_PULL_REQUEST_CONCURRENCY` | Number of pull requests processed in parallel. Values above `1` may cause ordering differences. | `5` | Integer ≥ 1 |
> **Ordering vs Throughput:** Metadata now mirrors sequentially by default to preserve chronology. Increase the concurrency variables only if you can tolerate minor out-of-order entries.
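The effect of these settings can be pictured as a small worker pool (an illustrative sketch, not the project's actual mirroring code):
```typescript
// Process items with a bounded number of concurrent workers.
async function processWithConcurrency<T>(
  items: T[],
  limit: number,
  worker: (item: T) => Promise<void>,
): Promise<void> {
  const queue = [...items];
  const runners = Array.from({ length: Math.max(1, limit) }, async () => {
    while (queue.length > 0) {
      const next = queue.shift();
      if (next !== undefined) await worker(next);
    }
  });
  await Promise.all(runners);
}

// With a limit of 1 items are handled strictly in order; higher values trade
// ordering guarantees for throughput.
const issueLimit = Number(process.env.MIRROR_ISSUE_CONCURRENCY ?? "3");
```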
## Automation Configuration

View File

@@ -1,77 +0,0 @@
# Extending Gitea Mirror
Gitea Mirror is designed with extensibility in mind through a module system.
## Module System
The application provides a module interface that allows extending functionality:
```typescript
export interface Module {
  name: string;
  version: string;
  init(app: AppContext): Promise<void>;
  cleanup?(): Promise<void>;
}
```
## Creating Custom Modules
You can create custom modules to add features:
```typescript
// my-module.ts
export class MyModule implements Module {
  name = 'my-module';
  version = '1.0.0';

  async init(app: AppContext) {
    // Add your functionality
    app.addRoute('/api/my-endpoint', this.handler);
  }

  async handler(context) {
    return new Response('Hello from my module!');
  }
}
```
## Module Context
Modules receive an `AppContext` (see the illustrative sketch after this list) with:
- Database access
- Event system
- Route registration
- Configuration
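As an illustration, such a context could look roughly like this (field names are assumptions; the real definition lives in `/src/lib/modules/`):
```typescript
// Hypothetical shape only.
interface AppContext {
  db: unknown;                        // database access (Drizzle instance)
  config: Record<string, unknown>;    // application configuration
  addRoute(
    path: string,
    handler: (ctx: unknown) => Response | Promise<Response>,
  ): void;
  events: {
    on(event: string, listener: (payload: unknown) => void): void;
    emit(event: string, payload?: unknown): void;
  };
}
```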
## Private Extensions
If you're developing private extensions:
1. Create a separate package/repository
2. Implement the module interface
3. Use Bun's linking feature for development:
```bash
# In your extension
bun link
# In gitea-mirror
bun link your-extension
```
## Best Practices
- Keep modules focused on a single feature
- Use TypeScript for type safety
- Handle errors gracefully
- Clean up resources in `cleanup()`
- Document your module's API
## Community Modules
Share your modules with the community:
- Create a GitHub repository
- Tag it with `gitea-mirror-module`
- Submit a PR to list it in our docs
For more details on the module system, see the source code in `/src/lib/modules/`.

View File

@@ -1,118 +1,39 @@
# Gitea Mirror Documentation
Welcome to the Gitea Mirror documentation. This guide covers everything you need to know about developing, building, and deploying the open-source version of Gitea Mirror.
This folder contains engineering and operations references for the open-source Gitea Mirror project. Each guide focuses on the parts of the system that still require bespoke explanation beyond the in-app help and the main `README.md`.
## Documentation Overview
## Available Guides
### Getting Started
### Core workflow
- **[DEVELOPMENT_WORKFLOW.md](./DEVELOPMENT_WORKFLOW.md)** - Set up a local environment, run scripts, and understand the repo layout (app + marketing site).
- **[ENVIRONMENT_VARIABLES.md](./ENVIRONMENT_VARIABLES.md)** - Complete reference for every configuration flag supported by the app and Docker images.
- **[Development Workflow](./DEVELOPMENT_WORKFLOW.md)** - Set up your development environment and start contributing
- **[Build Guide](./BUILD_GUIDE.md)** - Build Gitea Mirror from source
- **[Configuration Guide](./CONFIGURATION.md)** - Configure all available options
### Reliability & recovery
- **[GRACEFUL_SHUTDOWN.md](./GRACEFUL_SHUTDOWN.md)** - How signal handling, shutdown coordination, and job persistence work in v3.
- **[RECOVERY_IMPROVEMENTS.md](./RECOVERY_IMPROVEMENTS.md)** - Deep dive into the startup recovery workflow and supporting scripts.
### Deployment
### Authentication
- **[SSO-OIDC-SETUP.md](./SSO-OIDC-SETUP.md)** - Configure OIDC/SSO providers through the admin UI.
- **[SSO_TESTING.md](./SSO_TESTING.md)** - Recipes for local and staging SSO testing (Google, Keycloak, mock providers).
- **[Deployment Guide](./DEPLOYMENT.md)** - Deploy to production environments
- **[Docker Guide](./DOCKER.md)** - Container-based deployment
- **[Reverse Proxy Setup](./REVERSE_PROXY.md)** - Configure with nginx/Caddy
If you are looking for customer-facing playbooks, see the MDX use cases under `www/src/pages/use-cases/`.
### Features
## Quick start for local development
- **[SSO/OIDC Setup](./SSO-OIDC-SETUP.md)** - Configure authentication providers
- **[Sponsor Integration](./SPONSOR_INTEGRATION.md)** - GitHub Sponsors integration
- **[Webhook Configuration](./WEBHOOKS.md)** - Set up GitHub webhooks
### Architecture
- **[Architecture Overview](./ARCHITECTURE.md)** - System design and components
- **[API Documentation](./API.md)** - REST API endpoints
- **[Database Schema](./DATABASE.md)** - SQLite structure
### Maintenance
- **[Migration Guide](../MIGRATION_GUIDE.md)** - Upgrade from previous versions
- **[Better Auth Migration](./BETTER_AUTH_MIGRATION.md)** - Migrate authentication system
- **[Troubleshooting](./TROUBLESHOOTING.md)** - Common issues and solutions
- **[Backup & Restore](./BACKUP.md)** - Data management
## Quick Start
1. **Clone and install**:
```bash
git clone https://github.com/yourusername/gitea-mirror.git
git clone https://github.com/RayLabsHQ/gitea-mirror.git
cd gitea-mirror
bun install
bun run setup # installs deps and seeds the SQLite DB
bun run dev # starts the Astro/Bun app on http://localhost:4321
```
2. **Configure**:
```bash
cp .env.example .env
# Edit .env with your GitHub and Gitea tokens
```
The first user you create locally becomes the administrator. All other configuration—GitHub owners, Gitea targets, scheduling, cleanup—is done through the **Configuration** screen in the UI.
3. **Initialize and run**:
```bash
bun run init-db
bun run dev
```
## Contributing & support
4. **Access**: Open http://localhost:4321
- 🎯 Contribution guide: [../CONTRIBUTING.md](../CONTRIBUTING.md)
- 📘 Code of conduct: [../CODE_OF_CONDUCT.md](../CODE_OF_CONDUCT.md)
- 🐞 Issues & feature requests: <https://github.com/RayLabsHQ/gitea-mirror/issues>
- 💬 Discussions: <https://github.com/RayLabsHQ/gitea-mirror/discussions>
## Key Features
- 🔄 **Automatic Syncing** - Keep repositories synchronized
- 🗂️ **Organization Support** - Mirror entire organizations
- **Starred Repos** - Mirror your starred repositories
- 🔐 **Self-Hosted** - Full control over your data
- 🚀 **Fast** - Built with Bun for optimal performance
- 🔒 **Secure** - JWT authentication, encrypted tokens
## Technology Stack
- **Runtime**: Bun
- **Framework**: Astro with React
- **Database**: SQLite with Drizzle ORM
- **Styling**: Tailwind CSS v4
- **Authentication**: Better Auth
## System Requirements
- Bun >= 1.2.9
- Node.js >= 20 (optional, for compatibility)
- SQLite 3
- 512MB RAM minimum
- 1GB disk space
## Contributing
We welcome contributions! Please see our [Contributing Guide](../CONTRIBUTING.md) for details.
### Development Setup
1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests
5. Submit a pull request
### Code of Conduct
Please read our [Code of Conduct](../CODE_OF_CONDUCT.md) before contributing.
## Support
- **Issues**: [GitHub Issues](https://github.com/yourusername/gitea-mirror/issues)
- **Discussions**: [GitHub Discussions](https://github.com/yourusername/gitea-mirror/discussions)
- **Wiki**: [GitHub Wiki](https://github.com/yourusername/gitea-mirror/wiki)
## Security
For security issues, please see [SECURITY.md](../SECURITY.md).
## License
Gitea Mirror is open source software licensed under the [MIT License](../LICENSE).
---
For detailed information on any topic, please refer to the specific documentation guides listed above.
Security disclosures should follow the process in [../SECURITY.md](../SECURITY.md).

View File

@@ -1,236 +0,0 @@
# Graceful Shutdown Process
This document details how the gitea-mirror application handles graceful shutdown during active mirroring operations, with specific focus on job interruption and recovery.
## Overview
The graceful shutdown system is designed for **fast, clean termination** without waiting for long-running jobs to complete. It prioritizes **quick shutdown times** (under 30 seconds) while **preserving all progress** for seamless recovery.
## Key Principle
**The application does NOT wait for jobs to finish before shutting down.** Instead, it saves the current state and resumes after restart.
## Shutdown Scenario Example
### Initial State
- **Job**: Mirror 500 repositories
- **Progress**: 200 repositories completed
- **Remaining**: 300 repositories pending
- **Action**: User initiates shutdown (SIGTERM, Ctrl+C, Docker stop)
### Shutdown Process (Under 30 seconds)
#### Step 1: Signal Detection (Immediate)
```
📡 Received SIGTERM signal
🛑 Graceful shutdown initiated by signal: SIGTERM
📊 Shutdown status: 1 active jobs, 2 callbacks
```
#### Step 2: Job State Saving (1-10 seconds)
```
📝 Step 1: Saving active job states...
Saving state for job abc-123...
✅ Saved state for job abc-123
```
**What gets saved:**
- `inProgress: false` - Mark job as not currently running
- `completedItems: 200` - Number of repos successfully mirrored
- `totalItems: 500` - Total repos in the job
- `completedItemIds: [repo1, repo2, ..., repo200]` - List of completed repos
- `itemIds: [repo1, repo2, ..., repo500]` - Full list of repos
- `lastCheckpoint: 2025-05-24T17:30:00Z` - Exact shutdown time
- `message: "Job interrupted by application shutdown - will resume on restart"`
- `status: "imported"` - Keeps status as resumable (not "failed")
#### Step 3: Service Cleanup (1-5 seconds)
```
🔧 Step 2: Executing shutdown callbacks...
🛑 Shutting down cleanup service...
✅ Cleanup service stopped
✅ Shutdown callback 1 completed
```
#### Step 4: Clean Exit (Immediate)
```
💾 Step 3: Closing database connections...
✅ Graceful shutdown completed successfully
```
**Total shutdown time: ~15 seconds** (well under the 30-second limit)
## What Happens to the Remaining 300 Repos?
### During Shutdown
- **NOT processed** - The remaining 300 repos are not mirrored
- **NOT lost** - Their IDs are preserved in the job state
- **NOT marked as failed** - Job status remains "imported" for recovery
### After Restart
The recovery system automatically (see the sketch after this list):
1. **Detects interrupted job** during startup
2. **Calculates remaining work**: 500 - 200 = 300 repos
3. **Extracts remaining repo IDs**: repos 201-500 from the original list
4. **Resumes processing** from exactly where it left off
5. **Continues until completion** of all 500 repos
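The resume calculation itself is simple set arithmetic over the saved job state (an illustrative sketch; field names follow the description above):
```typescript
interface InterruptedJob {
  itemIds: string[];          // full list of repo IDs in the job (e.g. 500 entries)
  completedItemIds: string[]; // repos already mirrored before shutdown (e.g. 200 entries)
}

function remainingItems(job: InterruptedJob): string[] {
  const done = new Set(job.completedItemIds);
  // 500 total - 200 completed => the 300 repos still to process, in original order
  return job.itemIds.filter((id) => !done.has(id));
}
```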
## Timeout Configuration
### Shutdown Timeouts
```typescript
const SHUTDOWN_TIMEOUT = 30000; // 30 seconds max shutdown time
const JOB_SAVE_TIMEOUT = 10000; // 10 seconds to save job state
```
### Timeout Behavior
- **Normal case**: Shutdown completes in 10-20 seconds
- **Slow database**: Up to 30 seconds allowed
- **Timeout exceeded**: Force exit with code 1
- **Container kill**: Orchestrator should allow 45+ seconds grace period
## Job State Persistence
### Database Schema
The `mirror_jobs` table stores complete job state:
```sql
-- Job identification
id TEXT PRIMARY KEY,
user_id TEXT NOT NULL,
job_type TEXT NOT NULL DEFAULT 'mirror',
-- Progress tracking
total_items INTEGER,
completed_items INTEGER DEFAULT 0,
item_ids TEXT, -- JSON array of all repo IDs
completed_item_ids TEXT DEFAULT '[]', -- JSON array of completed repo IDs
-- State management
in_progress INTEGER NOT NULL DEFAULT 0, -- Boolean: currently running
started_at TIMESTAMP,
completed_at TIMESTAMP,
last_checkpoint TIMESTAMP, -- Last progress save
-- Status and messaging
status TEXT NOT NULL DEFAULT 'imported',
message TEXT NOT NULL
```
### Recovery Query
The recovery system finds interrupted jobs:
```sql
SELECT * FROM mirror_jobs
WHERE in_progress = 0
AND status = 'imported'
AND completed_at IS NULL
AND total_items > completed_items;
```
## Shutdown-Aware Processing
### Concurrency Check
During job execution, each repo processing checks for shutdown:
```typescript
// Before processing each repository
if (isShuttingDown()) {
  throw new Error('Processing interrupted by application shutdown');
}
```
### Checkpoint Intervals
Jobs save progress periodically (every 10 repos by default):
```typescript
checkpointInterval: 10, // Save progress every 10 repositories
```
This ensures minimal work loss even if shutdown occurs between checkpoints.
## Container Integration
### Docker Entrypoint
The Docker entrypoint properly forwards signals:
```bash
# Set up signal handlers
trap 'shutdown_handler' TERM INT HUP
# Start application in background
bun ./dist/server/entry.mjs &
APP_PID=$!
# Wait for application to finish
wait "$APP_PID"
```
### Kubernetes Configuration
Recommended pod configuration:
```yaml
apiVersion: v1
kind: Pod
spec:
  terminationGracePeriodSeconds: 45  # Allow time for graceful shutdown
  containers:
    - name: gitea-mirror
      # ... other configuration
```
## Monitoring and Logging
### Shutdown Logs
```
🛑 Graceful shutdown initiated by signal: SIGTERM
📊 Shutdown status: 1 active jobs, 2 callbacks
📝 Step 1: Saving active job states...
Saving state for 1 active jobs...
✅ Completed saving all active jobs
🔧 Step 2: Executing shutdown callbacks...
✅ Completed all shutdown callbacks
💾 Step 3: Closing database connections...
✅ Graceful shutdown completed successfully
```
### Recovery Logs
```
⚠️ Jobs found that need recovery. Starting recovery process...
Resuming job abc-123 with 300 remaining items...
✅ Recovery completed successfully
```
## Best Practices
### For Operations
1. **Monitor shutdown times** - Should complete under 30 seconds
2. **Check recovery logs** - Verify jobs resume correctly after restart
3. **Set appropriate grace periods** - Allow 45+ seconds in orchestrators
4. **Plan maintenance windows** - Jobs will resume but may take time to complete
### For Development
1. **Test shutdown scenarios** - Use `bun run test-shutdown`
2. **Monitor job progress** - Check checkpoint frequency and timing
3. **Verify recovery** - Ensure interrupted jobs resume correctly
4. **Handle edge cases** - Test shutdown during different job phases
## Troubleshooting
### Shutdown Takes Too Long
- **Check**: Database performance during job state saving
- **Solution**: Increase `SHUTDOWN_TIMEOUT` environment variable
- **Monitor**: Job complexity and checkpoint frequency
### Jobs Don't Resume
- **Check**: Recovery logs for errors during startup
- **Verify**: Database contains interrupted jobs with correct status
- **Test**: Run `bun run startup-recovery` manually
### Container Force-Killed
- **Check**: Container orchestrator termination grace period
- **Increase**: Grace period to 45+ seconds
- **Monitor**: Application shutdown completion time
This design ensures **production-ready graceful shutdown** with **zero data loss** and **fast recovery times** suitable for modern containerized deployments.

View File

@@ -1,91 +0,0 @@
# GitHub Sponsors Integration
This guide shows how GitHub Sponsors is integrated into the open-source version of Gitea Mirror.
## Components
### GitHubSponsors Card
A card component that displays in the sidebar or dashboard:
```tsx
import { GitHubSponsors } from '@/components/sponsors/GitHubSponsors';
// In your layout or dashboard
<GitHubSponsors />
```
### SponsorButton
A smaller button for headers or navigation:
```tsx
import { SponsorButton } from '@/components/sponsors/GitHubSponsors';
// In your header
<SponsorButton />
```
## Integration Points
### 1. Dashboard Sidebar
Add the sponsor card to the dashboard sidebar for visibility:
```tsx
// src/components/layout/DashboardLayout.tsx
<aside>
  {/* Other sidebar content */}
  <GitHubSponsors />
</aside>
```
### 2. Header Navigation
Add the sponsor button to the main navigation:
```tsx
// src/components/layout/Header.tsx
<nav>
  {/* Other nav items */}
  <SponsorButton />
</nav>
```
### 3. Settings Page
Add a support section in settings:
```tsx
// src/components/settings/SupportSection.tsx
<Card>
  <CardHeader>
    <CardTitle>Support Development</CardTitle>
  </CardHeader>
  <CardContent>
    <GitHubSponsors />
  </CardContent>
</Card>
```
## Behavior
- **Only appears in self-hosted mode**: The components automatically hide in hosted mode
- **Non-intrusive**: Designed to be helpful without being annoying
- **Multiple options**: GitHub Sponsors, Buy Me a Coffee, and starring the repo
## Customization
You can customize the sponsor components by:
1. Updating the GitHub Sponsors URL
2. Adding/removing donation platforms
3. Changing the styling to match your theme
4. Adjusting the placement based on user feedback
## Best Practices
1. **Don't be pushy**: Show sponsor options tastefully
2. **Provide value first**: Ensure the tool is useful before asking for support
3. **Be transparent**: Explain how sponsorships help the project
4. **Thank sponsors**: Acknowledge supporters in README or releases

View File

@@ -81,6 +81,26 @@ Replace `{provider-id}` with your chosen Provider ID.
- Client Secret: [Your Okta Client Secret]
- Click "Discover" to auto-fill endpoints
### Example: Authentik SSO Setup
Working Authentik deployments (see [#134](https://github.com/RayLabsHQ/gitea-mirror/issues/134)) follow these steps:
1. In Authentik, create a new **Application** and OIDC **Provider** (implicit flow works well for testing).
2. Start creating an SSO provider inside Gitea Mirror so you can copy the redirect URL shown (`https://your-domain.com/api/auth/sso/callback/authentik` if you pick `authentik` as your Provider ID).
3. Paste that redirect URL into the Authentik Provider configuration and finish creating the provider.
4. Copy the Authentik issuer URL, client ID, and client secret.
5. Back in Gitea Mirror:
- Issuer URL: the exact value from Authentik (keep any trailing slash Authentik shows).
- Provider ID: match the one you used in step 2.
- Click **Discover** so Gitea Mirror stores the authorization, token, and JWKS endpoints (Authentik publishes them via discovery).
- Domain: enter the email domain you expect to match (e.g. `example.com`).
6. Save the provider and test the login flow.
Notes:
- Make sure `BETTER_AUTH_URL` and (if you serve the UI from multiple origins) `BETTER_AUTH_TRUSTED_ORIGINS` point at the public URL users reach. A mismatch can surface as 500 errors after redirect (see the configuration sketch after these notes).
- Authentik must report the user's email as verified (default behavior) so Gitea Mirror can auto-link accounts.
- If you created an Authentik provider before v3.8.10 you should delete it and re-add it after upgrading; older versions saved incomplete endpoint data which leads to the `url.startsWith` error explained in the Troubleshooting section.
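A hedged sketch of how those variables are typically wired into the Better Auth configuration (the exact code in `src/lib/auth.ts` may differ):
```typescript
import { betterAuth } from "better-auth";

export const auth = betterAuth({
  // Must match the public URL users actually reach, or the redirect can fail with a 500
  baseURL: process.env.BETTER_AUTH_URL,
  trustedOrigins: (process.env.BETTER_AUTH_TRUSTED_ORIGINS ?? "")
    .split(",")
    .map((origin) => origin.trim())
    .filter(Boolean),
  // ... existing providers and plugins
});
```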
## Setting up OIDC Provider
The OIDC Provider feature allows other applications to use Gitea Mirror as their authentication provider.
@@ -165,6 +185,7 @@ When an application requests authentication:
1. **"Invalid origin" error**: Check that your Gitea Mirror URL matches the configured redirect URI
2. **"Provider not found" error**: Ensure the provider is properly configured and enabled
3. **Redirect loop**: Verify the redirect URI in both Gitea Mirror and the SSO provider match exactly
4. **`TypeError: undefined is not an object (evaluating 'url.startsWith')`**: This indicates the stored provider configuration is missing OIDC endpoints. Delete the provider from Gitea Mirror and re-register it using the **Discover** button so authorization/token URLs are saved (see [#73](https://github.com/RayLabsHQ/gitea-mirror/issues/73) and [#122](https://github.com/RayLabsHQ/gitea-mirror/issues/122) for examples).
### OIDC Provider Issues
@@ -202,4 +223,4 @@ This immediately prevents the application from authenticating new users.
If migrating from the previous JWT-based authentication:
- Existing users remain unaffected
- Users can continue using email/password authentication
- SSO can be added as an additional authentication method
- SSO can be added as an additional authentication method

View File

@@ -1,127 +0,0 @@
# Testing in Gitea Mirror
This document provides guidance on testing in the Gitea Mirror project.
## Current Status
The project now uses Bun's built-in test runner, which is Jest-compatible and provides a fast, reliable testing experience. We've migrated away from Vitest due to compatibility issues with Bun.
## Running Tests
To run tests, use the following commands:
```bash
# Run all tests
bun test
# Run tests in watch mode (automatically re-run when files change)
bun test --watch
# Run tests with coverage reporting
bun test --coverage
```
## Test File Naming Conventions
Bun's test runner automatically discovers test files that match the following patterns:
- `*.test.{js|jsx|ts|tsx}`
- `*_test.{js|jsx|ts|tsx}`
- `*.spec.{js|jsx|ts|tsx}`
- `*_spec.{js|jsx|ts|tsx}`
## Writing Tests
The project uses Bun's test runner with a Jest-compatible API. Here's an example test:
```typescript
// example.test.ts
import { describe, test, expect } from "bun:test";
describe("Example Test", () => {
test("should pass", () => {
expect(true).toBe(true);
});
});
```
### Testing React Components
For testing React components, we use React Testing Library:
```typescript
// component.test.tsx
import { describe, test, expect } from "bun:test";
import { render, screen } from "@testing-library/react";
import MyComponent from "../components/MyComponent";
describe("MyComponent", () => {
test("renders correctly", () => {
render(<MyComponent />);
expect(screen.getByText("Hello World")).toBeInTheDocument();
});
});
```
## Test Setup
The test setup is defined in `src/tests/setup.bun.ts` and includes:
- Automatic cleanup after each test
- Setup for any global test environment needs (see the sketch below)
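A minimal sketch of what that setup file can look like (illustrative; the real file may include additional globals):
```typescript
// src/tests/setup.bun.ts
import { afterEach } from "bun:test";
import { cleanup } from "@testing-library/react";

// Unmount rendered React trees between tests so state never leaks across cases
afterEach(() => {
  cleanup();
});
```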
## Mocking
Bun's test runner provides built-in mocking capabilities:
```typescript
import { test, expect, mock } from "bun:test";
// Create a mock function
const mockFn = mock(() => "mocked value");
test("mock function", () => {
  const result = mockFn();
  expect(result).toBe("mocked value");
  expect(mockFn).toHaveBeenCalled();
});
// Mock a module
mock.module("./some-module", () => {
  return {
    someFunction: () => "mocked module function"
  };
});
```
## CI Integration
The CI workflow has been updated to use Bun's test runner. Tests are automatically run as part of the CI pipeline.
## Test Coverage
To generate test coverage reports, run:
```bash
bun test --coverage
```
This will generate a coverage report in the `coverage` directory.
## Types of Tests
The project includes several types of tests:
1. **Unit Tests**: Testing individual functions and utilities
2. **API Tests**: Testing API endpoints
3. **Component Tests**: Testing React components
4. **Integration Tests**: Testing how components work together
## Future Improvements
When expanding the test suite, consider:
1. Adding more comprehensive API endpoint tests
2. Increasing component test coverage
3. Setting up end-to-end tests with a tool like Playwright
4. Adding performance tests for critical paths

View File

@@ -0,0 +1,4 @@
ALTER TABLE `accounts` ADD `id_token` text;--> statement-breakpoint
ALTER TABLE `accounts` ADD `access_token_expires_at` integer;--> statement-breakpoint
ALTER TABLE `accounts` ADD `refresh_token_expires_at` integer;--> statement-breakpoint
ALTER TABLE `accounts` ADD `scope` text;

View File

@@ -0,0 +1,18 @@
ALTER TABLE `organizations` ADD `normalized_name` text NOT NULL DEFAULT '';--> statement-breakpoint
UPDATE `organizations` SET `normalized_name` = lower(trim(`name`));--> statement-breakpoint
DELETE FROM `organizations`
WHERE rowid NOT IN (
SELECT MIN(rowid)
FROM `organizations`
GROUP BY `user_id`, `normalized_name`
);--> statement-breakpoint
CREATE UNIQUE INDEX `uniq_organizations_user_normalized_name` ON `organizations` (`user_id`,`normalized_name`);--> statement-breakpoint
ALTER TABLE `repositories` ADD `normalized_full_name` text NOT NULL DEFAULT '';--> statement-breakpoint
UPDATE `repositories` SET `normalized_full_name` = lower(trim(`full_name`));--> statement-breakpoint
DELETE FROM `repositories`
WHERE rowid NOT IN (
SELECT MIN(rowid)
FROM `repositories`
GROUP BY `user_id`, `normalized_full_name`
);--> statement-breakpoint
CREATE UNIQUE INDEX `uniq_repositories_user_normalized_full_name` ON `repositories` (`user_id`,`normalized_full_name`);

View File

@@ -0,0 +1 @@
ALTER TABLE `repositories` ADD `metadata` text;

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -43,6 +43,27 @@
"when": 1757786449446,
"tag": "0005_polite_preak",
"breakpoints": true
},
{
"idx": 6,
"version": "6",
"when": 1761483928546,
"tag": "0006_military_la_nuit",
"breakpoints": true
},
{
"idx": 7,
"version": "6",
"when": 1761534391115,
"tag": "0007_whole_hellion",
"breakpoints": true
},
{
"idx": 8,
"version": "6",
"when": 1761802056073,
"tag": "0008_serious_thena",
"breakpoints": true
}
]
}

View File

@@ -29,7 +29,7 @@ kubectl create namespace gitea-mirror
helm upgrade --install gitea-mirror ./helm/gitea-mirror --namespace gitea-mirror --set "gitea-mirror.github.username=<your-gh-username>" --set "gitea-mirror.github.token=<your-gh-token>" --set "gitea-mirror.gitea.url=https://gitea.example.com" --set "gitea-mirror.gitea.token=<your-gitea-token>"
```
The default Service is `ClusterIP` on port `8080`. You can expose it via Ingress or Gateway API; see below.
The default Service is `ClusterIP` on port `4321`. You can expose it via Ingress or Gateway API; see below.
---
@@ -78,7 +78,7 @@ If you enabled persistence with a PVC the data may persist; delete the PVC manua
| Key | Type | Default | Description |
| --- | --- | --- | --- |
| `deployment.port` | int | `8080` | Container port & named `http` port. |
| `deployment.port` | int | `4321` | Container port & named `http` port. |
| `deployment.strategy.type` | string | `Recreate` | Update strategy (`Recreate` or `RollingUpdate`). |
| `deployment.strategy.rollingUpdate.maxUnavailable/maxSurge` | string/int | — | Used when `type=RollingUpdate`. |
| `deployment.env` | list | `[]` | Extra environment variables. |
@@ -95,7 +95,7 @@ If you enabled persistence with a PVC the data may persist; delete the PVC manua
| Key | Type | Default | Description |
| --- | --- | --- | --- |
| `service.type` | string | `ClusterIP` | Service type. |
| `service.port` | int | `8080` | Service port. |
| `service.port` | int | `4321` | Service port. |
| `service.clusterIP` | string | `None` | ClusterIP (only when `type=ClusterIP`). |
| `service.externalTrafficPolicy` | string | `""` | External traffic policy (LB). |
| `service.loadBalancerIP` | string | `""` | LoadBalancer IP. |
@@ -228,7 +228,7 @@ ingress:
- mirror.example.com
```
This creates an Ingress routing `/` to the service on port `8080`.
This creates an Ingress routing `/` to the service on port `4321`.
### Using Gateway API (HTTPRoute)
@@ -257,7 +257,7 @@ By default, the chart provisions a PVC named `gitea-mirror-storage` with `1Gi` a
## Environment & health endpoints
The container listens on `PORT` (defaults to `deployment.port` = `8080`) and exposes `GET /api/health` for liveness/readiness/startup probes.
The container listens on `PORT` (defaults to `deployment.port` = `4321`) and exposes `GET /api/health` for liveness/readiness/startup probes.
---

View File

@@ -46,7 +46,7 @@ route:
service:
type: ClusterIP
port: 8080
port: 4321
clusterIP: None
annotations: {}
externalTrafficPolicy:
@@ -55,7 +55,7 @@ service:
loadBalancerClass:
deployment:
port: 8080
port: 4321
strategy:
type: Recreate
env: []

View File

@@ -1,7 +1,7 @@
{
"name": "gitea-mirror",
"type": "module",
"version": "3.8.7",
"version": "3.9.0",
"engines": {
"bun": ">=1.2.9"
},
@@ -75,7 +75,7 @@
"astro": "^5.14.8",
"bcryptjs": "^3.0.2",
"buffer": "^6.0.3",
"better-auth": "1.4.0-beta.12",
"better-auth": "1.4.0-beta.13",
"canvas-confetti": "^1.9.3",
"class-variance-authority": "^0.7.1",
"clsx": "^2.1.1",

public/apple-touch-icon.png (new binary file, 31 KiB; image not shown)

(binary image modified: 15 KiB before, 15 KiB after; image not shown)

public/favicon.svg (new file, 216 KiB; diff suppressed because one or more lines are too long)
View File

@@ -114,10 +114,10 @@ EOF
echo "======================================"
echo "1. Access Authentik at http://localhost:9000"
echo "2. Login with akadmin / admin-password"
echo "3. Create OAuth2 Provider for Gitea Mirror:"
echo "3. Create an Authentik OIDC Provider for Gitea Mirror:"
echo " - Name: gitea-mirror"
echo " - Redirect URIs:"
echo " http://localhost:4321/api/auth/callback/sso-provider"
echo " - Redirect URI:"
echo " http://localhost:4321/api/auth/sso/callback/authentik"
echo " - Scopes: openid, profile, email"
echo ""
echo "4. Create Application:"
@@ -131,10 +131,14 @@ EOF
echo "6. Configure SSO in Gitea Mirror:"
echo " - Go to Settings → Authentication & SSO"
echo " - Add provider with:"
echo " - Provider ID: authentik"
echo " - Issuer URL: http://localhost:9000/application/o/gitea-mirror/"
echo " - Click Discover to pull Authentik endpoints"
echo " - Client ID: (from Authentik provider)"
echo " - Client Secret: (from Authentik provider)"
echo ""
echo "If you previously registered this provider on a version earlier than v3.8.10, delete it and re-add it after upgrading to avoid missing endpoint data."
echo ""
;;
stop)
@@ -177,4 +181,4 @@ EOF
echo " status - Show service status"
exit 1
;;
esac
esac

View File

@@ -306,7 +306,7 @@ export function Dashboard() {
title="Repositories"
value={repoCount}
icon={<GitFork className="h-4 w-4" />}
description="Total in mirror queue"
description="Total imported repositories"
/>
<StatusCard
title="Mirrored"

View File

@@ -1,5 +1,5 @@
import * as React from "react";
import { useState } from "react";
import { useEffect, useState } from "react";
import { Button } from "@/components/ui/button";
import {
Dialog,
@@ -20,9 +20,11 @@ interface AddOrganizationDialogProps {
onAddOrganization: ({
org,
role,
force,
}: {
org: string;
role: MembershipRole;
force?: boolean;
}) => Promise<void>;
}
@@ -36,6 +38,14 @@ export default function AddOrganizationDialog({
const [isLoading, setIsLoading] = useState<boolean>(false);
const [error, setError] = useState<string>("");
useEffect(() => {
if (!isDialogOpen) {
setError("");
setOrg("");
setRole("member");
}
}, [isDialogOpen]);
const handleSubmit = async (e: React.FormEvent) => {
e.preventDefault();
@@ -54,7 +64,7 @@ export default function AddOrganizationDialog({
setRole("member");
setIsDialogOpen(false);
} catch (err: any) {
setError(err?.message || "Failed to add repository.");
setError(err?.message || "Failed to add organization.");
} finally {
setIsLoading(false);
}
@@ -139,7 +149,7 @@ export default function AddOrganizationDialog({
{isLoading ? (
<LoaderCircle className="h-4 w-4 animate-spin" />
) : (
"Add Repository"
"Add Organization"
)}
</Button>
</div>

View File

@@ -1,6 +1,6 @@
import { useCallback, useEffect, useState } from "react";
import { Button } from "@/components/ui/button";
import { Search, RefreshCw, FlipHorizontal, Filter } from "lucide-react";
import { Search, RefreshCw, FlipHorizontal, Filter, LoaderCircle, Trash2 } from "lucide-react";
import type { MirrorJob, Organization } from "@/lib/db/schema";
import { OrganizationList } from "./OrganizationsList";
import AddOrganizationDialog from "./AddOrganizationDialog";
@@ -37,6 +37,14 @@ import {
DrawerTitle,
DrawerTrigger,
} from "@/components/ui/drawer";
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from "@/components/ui/dialog";
export function Organization() {
const [organizations, setOrganizations] = useState<Organization[]>([]);
@@ -52,6 +60,15 @@ export function Organization() {
status: "",
});
const [loadingOrgIds, setLoadingOrgIds] = useState<Set<string>>(new Set()); // this is used when the api actions are performed
const [duplicateOrgCandidate, setDuplicateOrgCandidate] = useState<{
org: string;
role: MembershipRole;
} | null>(null);
const [isDuplicateOrgDialogOpen, setIsDuplicateOrgDialogOpen] = useState(false);
const [isProcessingDuplicateOrg, setIsProcessingDuplicateOrg] = useState(false);
const [orgToDelete, setOrgToDelete] = useState<Organization | null>(null);
const [isDeleteOrgDialogOpen, setIsDeleteOrgDialogOpen] = useState(false);
const [isDeletingOrg, setIsDeletingOrg] = useState(false);
// Create a stable callback using useCallback
const handleNewMessage = useCallback((data: MirrorJob) => {
@@ -256,19 +273,45 @@ export function Organization() {
const handleAddOrganization = async ({
org,
role,
force = false,
}: {
org: string;
role: MembershipRole;
force?: boolean;
}) => {
try {
if (!user || !user.id) {
return;
if (!user || !user.id) {
return;
}
const trimmedOrg = org.trim();
const normalizedOrg = trimmedOrg.toLowerCase();
if (!trimmedOrg) {
toast.error("Please enter a valid organization name.");
throw new Error("Invalid organization name");
}
if (!force) {
const alreadyExists = organizations.some(
(existing) => existing.name?.trim().toLowerCase() === normalizedOrg
);
if (alreadyExists) {
toast.warning("Organization already exists.");
setDuplicateOrgCandidate({ org: trimmedOrg, role });
setIsDuplicateOrgDialogOpen(true);
throw new Error("Organization already exists");
}
}
try {
setIsLoading(true);
const reqPayload: AddOrganizationApiRequest = {
userId: user.id,
org,
org: trimmedOrg,
role,
force,
};
const response = await apiRequest<AddOrganizationApiResponse>(
@@ -280,25 +323,100 @@ export function Organization() {
);
if (response.success) {
toast.success(`Organization added successfully`);
setOrganizations((prev) => [...prev, response.organization]);
const message = force
? "Organization already exists; using existing entry."
: "Organization added successfully";
toast.success(message);
await fetchOrganizations();
await fetchOrganizations(false);
setFilter((prev) => ({
...prev,
searchTerm: org,
searchTerm: trimmedOrg,
}));
if (force) {
setIsDuplicateOrgDialogOpen(false);
setDuplicateOrgCandidate(null);
}
} else {
showErrorToast(response.error || "Error adding organization", toast);
}
} catch (error) {
showErrorToast(error, toast);
throw error;
} finally {
setIsLoading(false);
}
};
const handleConfirmDuplicateOrganization = async () => {
if (!duplicateOrgCandidate) {
return;
}
setIsProcessingDuplicateOrg(true);
try {
await handleAddOrganization({
org: duplicateOrgCandidate.org,
role: duplicateOrgCandidate.role,
force: true,
});
setIsDialogOpen(false);
setDuplicateOrgCandidate(null);
setIsDuplicateOrgDialogOpen(false);
} catch (error) {
// Error already surfaced via toast
} finally {
setIsProcessingDuplicateOrg(false);
}
};
const handleCancelDuplicateOrganization = () => {
setIsDuplicateOrgDialogOpen(false);
setDuplicateOrgCandidate(null);
};
const handleRequestDeleteOrganization = (orgId: string) => {
const org = organizations.find((item) => item.id === orgId);
if (!org) {
toast.error("Organization not found");
return;
}
setOrgToDelete(org);
setIsDeleteOrgDialogOpen(true);
};
const handleDeleteOrganization = async () => {
if (!user || !user.id || !orgToDelete) {
return;
}
setIsDeletingOrg(true);
try {
const response = await apiRequest<{ success: boolean; error?: string }>(
`/organizations/${orgToDelete.id}`,
{
method: "DELETE",
}
);
if (response.success) {
toast.success(`Removed ${orgToDelete.name} from Gitea Mirror.`);
await fetchOrganizations(false);
} else {
showErrorToast(response.error || "Failed to delete organization", toast);
}
} catch (error) {
showErrorToast(error, toast);
} finally {
setIsDeletingOrg(false);
setIsDeleteOrgDialogOpen(false);
setOrgToDelete(null);
}
};
const handleMirrorAllOrgs = async () => {
try {
if (!user || !user.id || organizations.length === 0) {
@@ -711,6 +829,7 @@ export function Organization() {
onMirror={handleMirrorOrg}
onIgnore={handleIgnoreOrg}
onAddOrganization={() => setIsDialogOpen(true)}
onDelete={handleRequestDeleteOrganization}
onRefresh={async () => {
await fetchOrganizations(false);
}}
@@ -721,6 +840,68 @@ export function Organization() {
isDialogOpen={isDialogOpen}
setIsDialogOpen={setIsDialogOpen}
/>
<Dialog open={isDuplicateOrgDialogOpen} onOpenChange={(open) => {
if (!open) {
handleCancelDuplicateOrganization();
}
}}>
<DialogContent>
<DialogHeader>
<DialogTitle>Organization already exists</DialogTitle>
<DialogDescription>
{duplicateOrgCandidate?.org ?? "This organization"} is already synced in Gitea Mirror.
Continuing will reuse the existing entry without creating a duplicate. You can remove it later if needed.
</DialogDescription>
</DialogHeader>
<DialogFooter>
<Button variant="outline" onClick={handleCancelDuplicateOrganization} disabled={isProcessingDuplicateOrg}>
Cancel
</Button>
<Button onClick={handleConfirmDuplicateOrganization} disabled={isProcessingDuplicateOrg}>
{isProcessingDuplicateOrg ? (
<LoaderCircle className="h-4 w-4 animate-spin" />
) : (
"Continue"
)}
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
<Dialog open={isDeleteOrgDialogOpen} onOpenChange={(open) => {
if (!open) {
setIsDeleteOrgDialogOpen(false);
setOrgToDelete(null);
}
}}>
<DialogContent>
<DialogHeader>
<DialogTitle>Remove organization from Gitea Mirror?</DialogTitle>
<DialogDescription>
{orgToDelete?.name ?? "This organization"} will be deleted from Gitea Mirror only. Nothing will be removed from Gitea; you will need to clean it up manually in Gitea if desired.
</DialogDescription>
</DialogHeader>
<DialogFooter>
<Button variant="outline" onClick={() => {
setIsDeleteOrgDialogOpen(false);
setOrgToDelete(null);
}} disabled={isDeletingOrg}>
Cancel
</Button>
<Button variant="destructive" onClick={handleDeleteOrganization} disabled={isDeletingOrg}>
{isDeletingOrg ? (
<LoaderCircle className="h-4 w-4 animate-spin" />
) : (
<span className="flex items-center gap-2">
<Trash2 className="h-4 w-4" />
Delete
</span>
)}
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
</div>
);
}

View File

@@ -2,7 +2,7 @@ import { useMemo } from "react";
import { Card } from "@/components/ui/card";
import { Button } from "@/components/ui/button";
import { Badge } from "@/components/ui/badge";
import { Plus, RefreshCw, Building2, Check, AlertCircle, Clock, MoreVertical, Ban } from "lucide-react";
import { Plus, RefreshCw, Building2, Check, AlertCircle, Clock, MoreVertical, Ban, Trash2 } from "lucide-react";
import { SiGithub, SiGitea } from "react-icons/si";
import type { Organization } from "@/lib/db/schema";
import type { FilterParams } from "@/types/filter";
@@ -30,6 +30,7 @@ interface OrganizationListProps {
loadingOrgIds: Set<string>;
onAddOrganization?: () => void;
onRefresh?: () => Promise<void>;
onDelete?: (orgId: string) => void;
}
// Helper function to get status badge variant and icon
@@ -60,6 +61,7 @@ export function OrganizationList({
loadingOrgIds,
onAddOrganization,
onRefresh,
onDelete,
}: OrganizationListProps) {
const { giteaConfig } = useGiteaConfig();
@@ -414,7 +416,7 @@ export function OrganizationList({
)}
{/* Dropdown menu for additional actions */}
{org.status !== "ignored" && org.status !== "mirroring" && (
{org.status !== "mirroring" && (
<DropdownMenu>
<DropdownMenuTrigger asChild>
<Button variant="ghost" size="icon" disabled={isLoading} className="h-10 w-10">
@@ -422,12 +424,26 @@ export function OrganizationList({
</Button>
</DropdownMenuTrigger>
<DropdownMenuContent align="end">
<DropdownMenuItem
onClick={() => org.id && onIgnore && onIgnore({ orgId: org.id, ignore: true })}
>
<Ban className="h-4 w-4 mr-2" />
Ignore Organization
</DropdownMenuItem>
{org.status !== "ignored" && (
<DropdownMenuItem
onClick={() => org.id && onIgnore && onIgnore({ orgId: org.id, ignore: true })}
>
<Ban className="h-4 w-4 mr-2" />
Ignore Organization
</DropdownMenuItem>
)}
{onDelete && (
<>
{org.status !== "ignored" && <DropdownMenuSeparator />}
<DropdownMenuItem
className="text-destructive focus:text-destructive"
onClick={() => org.id && onDelete(org.id)}
>
<Trash2 className="h-4 w-4 mr-2" />
Delete from Mirror
</DropdownMenuItem>
</>
)}
</DropdownMenuContent>
</DropdownMenu>
)}
@@ -561,7 +577,7 @@ export function OrganizationList({
)}
{/* Dropdown menu for additional actions */}
{org.status !== "ignored" && org.status !== "mirroring" && (
{org.status !== "mirroring" && (
<DropdownMenu>
<DropdownMenuTrigger asChild>
<Button variant="ghost" size="icon" disabled={isLoading}>
@@ -569,12 +585,26 @@ export function OrganizationList({
</Button>
</DropdownMenuTrigger>
<DropdownMenuContent align="end">
<DropdownMenuItem
onClick={() => org.id && onIgnore && onIgnore({ orgId: org.id, ignore: true })}
>
<Ban className="h-4 w-4 mr-2" />
Ignore Organization
</DropdownMenuItem>
{org.status !== "ignored" && (
<DropdownMenuItem
onClick={() => org.id && onIgnore && onIgnore({ orgId: org.id, ignore: true })}
>
<Ban className="h-4 w-4 mr-2" />
Ignore Organization
</DropdownMenuItem>
)}
{onDelete && (
<>
{org.status !== "ignored" && <DropdownMenuSeparator />}
<DropdownMenuItem
className="text-destructive focus:text-destructive"
onClick={() => org.id && onDelete(org.id)}
>
<Trash2 className="h-4 w-4 mr-2" />
Delete from Mirror
</DropdownMenuItem>
</>
)}
</DropdownMenuContent>
</DropdownMenu>
)}

View File

@@ -1,5 +1,5 @@
import * as React from "react";
import { useState } from "react";
import { useEffect, useState } from "react";
import { Button } from "@/components/ui/button";
import {
Dialog,
@@ -17,9 +17,11 @@ interface AddRepositoryDialogProps {
onAddRepository: ({
repo,
owner,
force,
}: {
repo: string;
owner: string;
force?: boolean;
}) => Promise<void>;
}
@@ -33,6 +35,14 @@ export default function AddRepositoryDialog({
const [isLoading, setIsLoading] = useState<boolean>(false);
const [error, setError] = useState<string>("");
useEffect(() => {
if (!isDialogOpen) {
setError("");
setRepo("");
setOwner("");
}
}, [isDialogOpen]);
const handleSubmit = async (e: React.FormEvent) => {
e.preventDefault();

View File

@@ -18,7 +18,7 @@ import {
SelectValue,
} from "../ui/select";
import { Button } from "@/components/ui/button";
import { Search, RefreshCw, FlipHorizontal, RotateCcw, X, Filter, Ban, Check } from "lucide-react";
import { Search, RefreshCw, FlipHorizontal, RotateCcw, X, Filter, Ban, Check, LoaderCircle, Trash2 } from "lucide-react";
import type { MirrorRepoRequest, MirrorRepoResponse } from "@/types/mirror";
import {
Drawer,
@@ -30,6 +30,14 @@ import {
DrawerTitle,
DrawerTrigger,
} from "@/components/ui/drawer";
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from "@/components/ui/dialog";
import { useSSE } from "@/hooks/useSEE";
import { useFilterParams } from "@/hooks/useFilterParams";
import { toast } from "sonner";
@@ -69,6 +77,15 @@ export default function Repository() {
}, [setFilter]);
const [loadingRepoIds, setLoadingRepoIds] = useState<Set<string>>(new Set()); // this is used when the api actions are performed
const [duplicateRepoCandidate, setDuplicateRepoCandidate] = useState<{
owner: string;
repo: string;
} | null>(null);
const [isDuplicateRepoDialogOpen, setIsDuplicateRepoDialogOpen] = useState(false);
const [isProcessingDuplicateRepo, setIsProcessingDuplicateRepo] = useState(false);
const [repoToDelete, setRepoToDelete] = useState<Repository | null>(null);
const [isDeleteRepoDialogOpen, setIsDeleteRepoDialogOpen] = useState(false);
const [isDeletingRepo, setIsDeletingRepo] = useState(false);
// Create a stable callback using useCallback
const handleNewMessage = useCallback((data: MirrorJob) => {
@@ -618,19 +635,45 @@ export default function Repository() {
const handleAddRepository = async ({
repo,
owner,
force = false,
}: {
repo: string;
owner: string;
force?: boolean;
}) => {
try {
if (!user || !user.id) {
return;
}
if (!user || !user.id) {
return;
}
const trimmedRepo = repo.trim();
const trimmedOwner = owner.trim();
if (!trimmedRepo || !trimmedOwner) {
toast.error("Please provide both owner and repository name.");
throw new Error("Invalid repository details");
}
const normalizedFullName = `${trimmedOwner}/${trimmedRepo}`.toLowerCase();
if (!force) {
const duplicateRepo = repositories.find(
(existing) => existing.normalizedFullName?.toLowerCase() === normalizedFullName
);
if (duplicateRepo) {
toast.warning("Repository already exists.");
setDuplicateRepoCandidate({ repo: trimmedRepo, owner: trimmedOwner });
setIsDuplicateRepoDialogOpen(true);
throw new Error("Repository already exists");
}
}
try {
const reqPayload: AddRepositoriesApiRequest = {
userId: user.id,
repo,
owner,
repo: trimmedRepo,
owner: trimmedOwner,
force,
};
const response = await apiRequest<AddRepositoriesApiResponse>(
@@ -642,20 +685,28 @@ export default function Repository() {
);
if (response.success) {
toast.success(`Repository added successfully`);
setRepositories((prevRepos) => [...prevRepos, response.repository]);
const message = force
? "Repository already exists; metadata refreshed."
: "Repository added successfully";
toast.success(message);
await fetchRepositories(false); // Manual refresh after adding repository
await fetchRepositories(false);
setFilter((prev) => ({
...prev,
searchTerm: repo,
searchTerm: trimmedRepo,
}));
if (force) {
setDuplicateRepoCandidate(null);
setIsDuplicateRepoDialogOpen(false);
}
} else {
showErrorToast(response.error || "Error adding repository", toast);
}
} catch (error) {
showErrorToast(error, toast);
throw error;
}
};
@@ -673,6 +724,71 @@ export default function Repository() {
)
).sort();
const handleConfirmDuplicateRepository = async () => {
if (!duplicateRepoCandidate) {
return;
}
setIsProcessingDuplicateRepo(true);
try {
await handleAddRepository({
repo: duplicateRepoCandidate.repo,
owner: duplicateRepoCandidate.owner,
force: true,
});
setIsDialogOpen(false);
} catch (error) {
// Error already shown
} finally {
setIsProcessingDuplicateRepo(false);
}
};
const handleCancelDuplicateRepository = () => {
setDuplicateRepoCandidate(null);
setIsDuplicateRepoDialogOpen(false);
};
const handleRequestDeleteRepository = (repoId: string) => {
const repo = repositories.find((item) => item.id === repoId);
if (!repo) {
toast.error("Repository not found");
return;
}
setRepoToDelete(repo);
setIsDeleteRepoDialogOpen(true);
};
const handleDeleteRepository = async () => {
if (!user || !user.id || !repoToDelete) {
return;
}
setIsDeletingRepo(true);
try {
const response = await apiRequest<{ success: boolean; error?: string }>(
`/repositories/${repoToDelete.id}`,
{
method: "DELETE",
}
);
if (response.success) {
toast.success(`Removed ${repoToDelete.fullName} from Gitea Mirror.`);
await fetchRepositories(false);
} else {
showErrorToast(response.error || "Failed to delete repository", toast);
}
} catch (error) {
showErrorToast(error, toast);
} finally {
setIsDeletingRepo(false);
setIsDeleteRepoDialogOpen(false);
setRepoToDelete(null);
}
};
// Determine what actions are available for selected repositories
const getAvailableActions = () => {
if (selectedRepoIds.size === 0) return [];
@@ -1198,6 +1314,7 @@ export default function Repository() {
onRefresh={async () => {
await fetchRepositories(false);
}}
onDelete={handleRequestDeleteRepository}
/>
)}
@@ -1206,6 +1323,77 @@ export default function Repository() {
isDialogOpen={isDialogOpen}
setIsDialogOpen={setIsDialogOpen}
/>
<Dialog
open={isDuplicateRepoDialogOpen}
onOpenChange={(open) => {
if (!open) {
handleCancelDuplicateRepository();
}
}}
>
<DialogContent>
<DialogHeader>
<DialogTitle>Repository already exists</DialogTitle>
<DialogDescription>
{duplicateRepoCandidate ? `${duplicateRepoCandidate.owner}/${duplicateRepoCandidate.repo}` : "This repository"} is already tracked in Gitea Mirror. Continuing will refresh the existing entry without creating a duplicate.
</DialogDescription>
</DialogHeader>
<DialogFooter>
<Button variant="outline" onClick={handleCancelDuplicateRepository} disabled={isProcessingDuplicateRepo}>
Cancel
</Button>
<Button onClick={handleConfirmDuplicateRepository} disabled={isProcessingDuplicateRepo}>
{isProcessingDuplicateRepo ? (
<LoaderCircle className="h-4 w-4 animate-spin" />
) : (
"Continue"
)}
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
<Dialog
open={isDeleteRepoDialogOpen}
onOpenChange={(open) => {
if (!open) {
setIsDeleteRepoDialogOpen(false);
setRepoToDelete(null);
}
}}
>
<DialogContent>
<DialogHeader>
<DialogTitle>Remove repository from Gitea Mirror?</DialogTitle>
<DialogDescription>
{repoToDelete?.fullName ?? "This repository"} will be deleted from Gitea Mirror only. The mirror on Gitea will remain untouched; remove it manually in Gitea if needed.
</DialogDescription>
</DialogHeader>
<DialogFooter>
<Button
variant="outline"
onClick={() => {
setIsDeleteRepoDialogOpen(false);
setRepoToDelete(null);
}}
disabled={isDeletingRepo}
>
Cancel
</Button>
<Button variant="destructive" onClick={handleDeleteRepository} disabled={isDeletingRepo}>
{isDeletingRepo ? (
<LoaderCircle className="h-4 w-4 animate-spin" />
) : (
<span className="flex items-center gap-2">
<Trash2 className="h-4 w-4" />
Delete
</span>
)}
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
</div>
);
}
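
For reference, the duplicate guard added to handleAddRepository boils down to a case-insensitive owner/repo comparison against the stored normalizedFullName values, with the force flag letting the confirmation dialog refresh an existing entry instead. A minimal standalone sketch of that check (the helper name and the trimmed-down repository shape are illustrative, not part of the codebase):

// Hypothetical helper mirroring the duplicate check in handleAddRepository.
interface KnownRepo {
  normalizedFullName?: string | null;
}

function isDuplicateRepo(owner: string, repo: string, existing: KnownRepo[]): boolean {
  const normalized = `${owner.trim()}/${repo.trim()}`.toLowerCase();
  // A match means the repository is already tracked; the UI then offers a "force"
  // refresh of the existing entry rather than inserting a duplicate row.
  return existing.some((item) => item.normalizedFullName?.toLowerCase() === normalized);
}

// isDuplicateRepo("RayLabsHQ", "Gitea-Mirror", [{ normalizedFullName: "raylabshq/gitea-mirror" }]) === true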


@@ -1,7 +1,7 @@
import { useMemo, useRef } from "react";
import Fuse from "fuse.js";
import { useVirtualizer } from "@tanstack/react-virtual";
import { FlipHorizontal, GitFork, RefreshCw, RotateCcw, Star, Lock, Ban, Check, ChevronDown } from "lucide-react";
import { FlipHorizontal, GitFork, RefreshCw, RotateCcw, Star, Lock, Ban, Check, ChevronDown, Trash2 } from "lucide-react";
import { SiGithub, SiGitea } from "react-icons/si";
import type { Repository } from "@/lib/db/schema";
import { Button } from "@/components/ui/button";
@@ -23,6 +23,7 @@ import {
DropdownMenu,
DropdownMenuContent,
DropdownMenuItem,
DropdownMenuSeparator,
DropdownMenuTrigger,
} from "@/components/ui/dropdown-menu";
@@ -40,6 +41,7 @@ interface RepositoryTableProps {
selectedRepoIds: Set<string>;
onSelectionChange: (selectedIds: Set<string>) => void;
onRefresh?: () => Promise<void>;
onDelete?: (repoId: string) => void;
}
export default function RepositoryTable({
@@ -56,6 +58,7 @@ export default function RepositoryTable({
selectedRepoIds,
onSelectionChange,
onRefresh,
onDelete,
}: RepositoryTableProps) {
const tableParentRef = useRef<HTMLDivElement>(null);
const { giteaConfig } = useGiteaConfig();
@@ -676,6 +679,7 @@ export default function RepositoryTable({
onSync={() => onSync({ repoId: repo.id ?? "" })}
onRetry={() => onRetry({ repoId: repo.id ?? "" })}
onSkip={(skip) => onSkip({ repoId: repo.id ?? "", skip })}
onDelete={onDelete && repo.id ? () => onDelete(repo.id as string) : undefined}
/>
</div>
{/* Links */}
@@ -786,6 +790,7 @@ function RepoActionButton({
onSync,
onRetry,
onSkip,
onDelete,
}: {
repo: { id: string; status: string };
isLoading: boolean;
@@ -793,6 +798,7 @@ function RepoActionButton({
onSync: () => void;
onRetry: () => void;
onSkip: (skip: boolean) => void;
onDelete?: () => void;
}) {
// For ignored repos, show an "Include" action
if (repo.status === "ignored") {
@@ -849,7 +855,7 @@ function RepoActionButton({
);
}
// Show primary action with dropdown for skip option
// Show primary action with dropdown for additional actions
return (
<DropdownMenu>
<div className="flex">
@@ -886,6 +892,18 @@ function RepoActionButton({
<Ban className="h-4 w-4 mr-2" />
Ignore Repository
</DropdownMenuItem>
{onDelete && (
<>
<DropdownMenuSeparator />
<DropdownMenuItem
className="text-destructive focus:text-destructive"
onClick={onDelete}
>
<Trash2 className="h-4 w-4 mr-2" />
Delete from Mirror
</DropdownMenuItem>
</>
)}
</DropdownMenuContent>
</DropdownMenu>
);


@@ -166,6 +166,8 @@ export const auth = betterAuth({
defaultOverrideUserInfo: true,
// Allow implicit sign up for new users
disableImplicitSignUp: false,
// Trust email_verified claims from the upstream provider so we can link by matching email
trustEmailVerified: true,
}),
],
});


@@ -54,6 +54,8 @@ export const giteaConfigSchema = z.object({
.enum(["skip", "reference", "full-copy"])
.default("reference"),
// Mirror options
issueConcurrency: z.number().int().min(1).default(3),
pullRequestConcurrency: z.number().int().min(1).default(5),
mirrorReleases: z.boolean().default(false),
releaseLimit: z.number().default(10),
mirrorMetadata: z.boolean().default(false),
@@ -125,6 +127,7 @@ export const repositorySchema = z.object({
configId: z.string(),
name: z.string(),
fullName: z.string(),
normalizedFullName: z.string(),
url: z.url(),
cloneUrl: z.url(),
owner: z.string(),
@@ -161,6 +164,7 @@ export const repositorySchema = z.object({
lastMirrored: z.coerce.date().optional().nullable(),
errorMessage: z.string().optional().nullable(),
destinationOrg: z.string().optional().nullable(),
metadata: z.string().optional().nullable(), // JSON string for metadata sync state
createdAt: z.coerce.date(),
updatedAt: z.coerce.date(),
});
@@ -207,6 +211,7 @@ export const organizationSchema = z.object({
userId: z.string(),
configId: z.string(),
name: z.string(),
normalizedName: z.string(),
avatarUrl: z.string(),
membershipRole: z.enum(["member", "admin", "owner", "billing_manager"]).default("member"),
isIncluded: z.boolean().default(true),
@@ -332,6 +337,7 @@ export const repositories = sqliteTable("repositories", {
.references(() => configs.id),
name: text("name").notNull(),
fullName: text("full_name").notNull(),
normalizedFullName: text("normalized_full_name").notNull(),
url: text("url").notNull(),
cloneUrl: text("clone_url").notNull(),
owner: text("owner").notNull(),
@@ -371,6 +377,8 @@ export const repositories = sqliteTable("repositories", {
destinationOrg: text("destination_org"),
metadata: text("metadata"), // JSON string storing metadata sync state (issues, PRs, releases, etc.)
createdAt: integer("created_at", { mode: "timestamp" })
.notNull()
.default(sql`(unixepoch())`),
@@ -386,6 +394,7 @@ export const repositories = sqliteTable("repositories", {
index("idx_repositories_is_fork").on(table.isForked),
index("idx_repositories_is_starred").on(table.isStarred),
uniqueIndex("uniq_repositories_user_full_name").on(table.userId, table.fullName),
uniqueIndex("uniq_repositories_user_normalized_full_name").on(table.userId, table.normalizedFullName),
]);
export const mirrorJobs = sqliteTable("mirror_jobs", {
@@ -436,6 +445,7 @@ export const organizations = sqliteTable("organizations", {
.notNull()
.references(() => configs.id),
name: text("name").notNull(),
normalizedName: text("normalized_name").notNull(),
avatarUrl: text("avatar_url").notNull(),
@@ -467,6 +477,7 @@ export const organizations = sqliteTable("organizations", {
index("idx_organizations_config_id").on(table.configId),
index("idx_organizations_status").on(table.status),
index("idx_organizations_is_included").on(table.isIncluded),
uniqueIndex("uniq_organizations_user_normalized_name").on(table.userId, table.normalizedName),
]);
// ===== Better Auth Tables =====
@@ -500,6 +511,10 @@ export const accounts = sqliteTable("accounts", {
providerUserId: text("provider_user_id"), // Make nullable for email/password auth
accessToken: text("access_token"),
refreshToken: text("refresh_token"),
idToken: text("id_token"),
accessTokenExpiresAt: integer("access_token_expires_at", { mode: "timestamp" }),
refreshTokenExpiresAt: integer("refresh_token_expires_at", { mode: "timestamp" }),
scope: text("scope"),
expiresAt: integer("expires_at", { mode: "timestamp" }),
password: text("password"), // For credential provider
createdAt: integer("created_at", { mode: "timestamp" })
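
The new normalizedFullName and normalizedName columns are paired with unique indexes, so case-insensitive duplicates can be rejected at the database layer rather than only in application code. A sketch of a conflict-safe batch insert with Drizzle, assuming the db and repositories exports above and the project's "@/lib" path alias:

import { db, repositories } from "@/lib/db";
import type { InferInsertModel } from "drizzle-orm";

type RepositoryInsert = InferInsertModel<typeof repositories>;

// Rows whose (user_id, normalized_full_name) pair already exists are silently skipped,
// so re-importing the same GitHub repository under a different letter case cannot
// create a duplicate row.
async function insertRepositoriesDeduped(batch: RepositoryInsert[]): Promise<void> {
  await db
    .insert(repositories)
    .values(batch)
    .onConflictDoNothing({
      target: [repositories.userId, repositories.normalizedFullName],
    });
}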


@@ -49,6 +49,9 @@ interface EnvConfig {
mirrorLabels?: boolean;
mirrorMilestones?: boolean;
mirrorMetadata?: boolean;
releaseLimit?: number;
issueConcurrency?: number;
pullRequestConcurrency?: number;
};
schedule: {
enabled?: boolean;
@@ -136,6 +139,8 @@ function parseEnvConfig(): EnvConfig {
mirrorMilestones: process.env.MIRROR_MILESTONES === 'true',
mirrorMetadata: process.env.MIRROR_METADATA === 'true',
releaseLimit: process.env.RELEASE_LIMIT ? parseInt(process.env.RELEASE_LIMIT, 10) : undefined,
issueConcurrency: process.env.MIRROR_ISSUE_CONCURRENCY ? parseInt(process.env.MIRROR_ISSUE_CONCURRENCY, 10) : undefined,
pullRequestConcurrency: process.env.MIRROR_PULL_REQUEST_CONCURRENCY ? parseInt(process.env.MIRROR_PULL_REQUEST_CONCURRENCY, 10) : undefined,
},
schedule: {
enabled: process.env.SCHEDULE_ENABLED === 'true' ||
@@ -277,6 +282,12 @@ export async function initializeConfigFromEnv(): Promise<void> {
// Mirror metadata options
mirrorReleases: envConfig.mirror.mirrorReleases ?? existingConfig?.[0]?.giteaConfig?.mirrorReleases ?? false,
releaseLimit: envConfig.mirror.releaseLimit ?? existingConfig?.[0]?.giteaConfig?.releaseLimit ?? 10,
issueConcurrency: envConfig.mirror.issueConcurrency && envConfig.mirror.issueConcurrency > 0
? envConfig.mirror.issueConcurrency
: existingConfig?.[0]?.giteaConfig?.issueConcurrency ?? 3,
pullRequestConcurrency: envConfig.mirror.pullRequestConcurrency && envConfig.mirror.pullRequestConcurrency > 0
? envConfig.mirror.pullRequestConcurrency
: existingConfig?.[0]?.giteaConfig?.pullRequestConcurrency ?? 5,
mirrorMetadata: envConfig.mirror.mirrorMetadata ?? (envConfig.mirror.mirrorIssues || envConfig.mirror.mirrorPullRequests || envConfig.mirror.mirrorLabels || envConfig.mirror.mirrorMilestones) ?? existingConfig?.[0]?.giteaConfig?.mirrorMetadata ?? false,
mirrorIssues: envConfig.mirror.mirrorIssues ?? existingConfig?.[0]?.giteaConfig?.mirrorIssues ?? false,
mirrorPullRequests: envConfig.mirror.mirrorPullRequests ?? existingConfig?.[0]?.giteaConfig?.mirrorPullRequests ?? false,
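
The two concurrency settings follow the same precedence as the other mirror options: a positive value from the environment wins, otherwise the previously stored config value, otherwise the default (3 for issues, 5 for pull requests). A small sketch of that resolution rule; the helper name is illustrative:

// Hypothetical helper capturing the precedence used in initializeConfigFromEnv.
function resolveConcurrency(
  envValue: string | undefined,
  existing: number | undefined,
  fallback: number
): number {
  const parsed = envValue ? parseInt(envValue, 10) : NaN;
  // Only a positive, finite value from the environment overrides what is already stored.
  if (Number.isFinite(parsed) && parsed > 0) {
    return Math.floor(parsed);
  }
  return existing ?? fallback;
}

// e.g. resolveConcurrency(process.env.MIRROR_ISSUE_CONCURRENCY, storedConfig?.issueConcurrency, 3)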


@@ -8,6 +8,10 @@ mock.module("@/lib/helpers", () => ({
}));
const mockMirrorGitHubReleasesToGitea = mock(() => Promise.resolve());
const mockMirrorGitRepoIssuesToGitea = mock(() => Promise.resolve());
const mockMirrorGitRepoPullRequestsToGitea = mock(() => Promise.resolve());
const mockMirrorGitRepoLabelsToGitea = mock(() => Promise.resolve());
const mockMirrorGitRepoMilestonesToGitea = mock(() => Promise.resolve());
const mockGetGiteaRepoOwnerAsync = mock(() => Promise.resolve("starred"));
// Mock the database module
@@ -128,6 +132,36 @@ const mockHttpGet = mock(async (url: string, headers?: any) => {
headers: new Headers()
};
}
if (url.includes("/api/v1/repos/starred/metadata-repo")) {
return {
data: {
id: 790,
name: "metadata-repo",
mirror: true,
owner: { login: "starred" },
mirror_interval: "8h",
private: false,
},
status: 200,
statusText: "OK",
headers: new Headers(),
};
}
if (url.includes("/api/v1/repos/starred/already-synced-repo")) {
return {
data: {
id: 791,
name: "already-synced-repo",
mirror: true,
owner: { login: "starred" },
mirror_interval: "8h",
private: false,
},
status: 200,
statusText: "OK",
headers: new Headers(),
};
}
if (url.includes("/api/v1/repos/")) {
throw new MockHttpError("Not Found", 404, "Not Found");
}
@@ -224,6 +258,10 @@ describe("Enhanced Gitea Operations", () => {
mockDb.insert.mockClear();
mockDb.update.mockClear();
mockMirrorGitHubReleasesToGitea.mockClear();
mockMirrorGitRepoIssuesToGitea.mockClear();
mockMirrorGitRepoPullRequestsToGitea.mockClear();
mockMirrorGitRepoLabelsToGitea.mockClear();
mockMirrorGitRepoMilestonesToGitea.mockClear();
mockGetGiteaRepoOwnerAsync.mockClear();
mockGetGiteaRepoOwnerAsync.mockImplementation(() => Promise.resolve("starred"));
// Reset tracking variables
@@ -426,6 +464,10 @@ describe("Enhanced Gitea Operations", () => {
{
getGiteaRepoOwnerAsync: mockGetGiteaRepoOwnerAsync,
mirrorGitHubReleasesToGitea: mockMirrorGitHubReleasesToGitea,
mirrorGitRepoIssuesToGitea: mockMirrorGitRepoIssuesToGitea,
mirrorGitRepoPullRequestsToGitea: mockMirrorGitRepoPullRequestsToGitea,
mirrorGitRepoLabelsToGitea: mockMirrorGitRepoLabelsToGitea,
mirrorGitRepoMilestonesToGitea: mockMirrorGitRepoMilestonesToGitea,
}
)
).rejects.toThrow("Repository non-mirror-repo is not a mirror. Cannot sync.");
@@ -470,6 +512,10 @@ describe("Enhanced Gitea Operations", () => {
{
getGiteaRepoOwnerAsync: mockGetGiteaRepoOwnerAsync,
mirrorGitHubReleasesToGitea: mockMirrorGitHubReleasesToGitea,
mirrorGitRepoIssuesToGitea: mockMirrorGitRepoIssuesToGitea,
mirrorGitRepoPullRequestsToGitea: mockMirrorGitRepoPullRequestsToGitea,
mirrorGitRepoLabelsToGitea: mockMirrorGitRepoLabelsToGitea,
mirrorGitRepoMilestonesToGitea: mockMirrorGitRepoMilestonesToGitea,
}
);
@@ -482,6 +528,130 @@ describe("Enhanced Gitea Operations", () => {
expect(releaseCall.config.githubConfig?.token).toBe("github-token");
expect(releaseCall.octokit).toBeDefined();
});
test("mirrors metadata components when enabled and not previously synced", async () => {
const config: Partial<Config> = {
userId: "user123",
githubConfig: {
username: "testuser",
token: "github-token",
privateRepositories: true,
mirrorStarred: false,
},
giteaConfig: {
url: "https://gitea.example.com",
token: "encrypted-token",
defaultOwner: "testuser",
mirrorReleases: true,
mirrorMetadata: true,
mirrorIssues: true,
mirrorPullRequests: true,
mirrorLabels: true,
mirrorMilestones: true,
},
};
const repository: Repository = {
id: "repo789",
name: "metadata-repo",
fullName: "user/metadata-repo",
owner: "user",
cloneUrl: "https://github.com/user/metadata-repo.git",
isPrivate: false,
isStarred: false,
status: repoStatusEnum.parse("mirrored"),
visibility: "public",
userId: "user123",
createdAt: new Date(),
updatedAt: new Date(),
metadata: null,
};
await syncGiteaRepoEnhanced(
{ config, repository },
{
getGiteaRepoOwnerAsync: mockGetGiteaRepoOwnerAsync,
mirrorGitHubReleasesToGitea: mockMirrorGitHubReleasesToGitea,
mirrorGitRepoIssuesToGitea: mockMirrorGitRepoIssuesToGitea,
mirrorGitRepoPullRequestsToGitea: mockMirrorGitRepoPullRequestsToGitea,
mirrorGitRepoLabelsToGitea: mockMirrorGitRepoLabelsToGitea,
mirrorGitRepoMilestonesToGitea: mockMirrorGitRepoMilestonesToGitea,
}
);
expect(mockMirrorGitHubReleasesToGitea).toHaveBeenCalledTimes(1);
expect(mockMirrorGitRepoIssuesToGitea).toHaveBeenCalledTimes(1);
expect(mockMirrorGitRepoPullRequestsToGitea).toHaveBeenCalledTimes(1);
expect(mockMirrorGitRepoMilestonesToGitea).toHaveBeenCalledTimes(1);
// Labels should be skipped because issues already import them
expect(mockMirrorGitRepoLabelsToGitea).not.toHaveBeenCalled();
});
test("skips metadata mirroring when components already synced", async () => {
const config: Partial<Config> = {
userId: "user123",
githubConfig: {
username: "testuser",
token: "github-token",
privateRepositories: true,
mirrorStarred: false,
},
giteaConfig: {
url: "https://gitea.example.com",
token: "encrypted-token",
defaultOwner: "testuser",
mirrorReleases: false,
mirrorMetadata: true,
mirrorIssues: true,
mirrorPullRequests: true,
mirrorLabels: true,
mirrorMilestones: true,
},
};
const repository: Repository = {
id: "repo790",
name: "already-synced-repo",
fullName: "user/already-synced-repo",
owner: "user",
cloneUrl: "https://github.com/user/already-synced-repo.git",
isPrivate: false,
isStarred: false,
status: repoStatusEnum.parse("mirrored"),
visibility: "public",
userId: "user123",
createdAt: new Date(),
updatedAt: new Date(),
metadata: JSON.stringify({
components: {
releases: true,
issues: true,
pullRequests: true,
labels: true,
milestones: true,
},
lastSyncedAt: new Date().toISOString(),
}),
};
await syncGiteaRepoEnhanced(
{ config, repository },
{
getGiteaRepoOwnerAsync: mockGetGiteaRepoOwnerAsync,
mirrorGitHubReleasesToGitea: mockMirrorGitHubReleasesToGitea,
mirrorGitRepoIssuesToGitea: mockMirrorGitRepoIssuesToGitea,
mirrorGitRepoPullRequestsToGitea: mockMirrorGitRepoPullRequestsToGitea,
mirrorGitRepoLabelsToGitea: mockMirrorGitRepoLabelsToGitea,
mirrorGitRepoMilestonesToGitea: mockMirrorGitRepoMilestonesToGitea,
}
);
expect(mockMirrorGitHubReleasesToGitea).not.toHaveBeenCalled();
expect(mockMirrorGitRepoIssuesToGitea).not.toHaveBeenCalled();
expect(mockMirrorGitRepoPullRequestsToGitea).not.toHaveBeenCalled();
expect(mockMirrorGitRepoLabelsToGitea).not.toHaveBeenCalled();
expect(mockMirrorGitRepoMilestonesToGitea).not.toHaveBeenCalled();
});
});
describe("handleExistingNonMirrorRepo", () => {


@@ -15,10 +15,18 @@ import { httpPost, httpGet, httpPatch, HttpError } from "./http-client";
import { db, repositories } from "./db";
import { eq } from "drizzle-orm";
import { repoStatusEnum } from "@/types/Repository";
import {
parseRepositoryMetadataState,
serializeRepositoryMetadataState,
} from "./metadata-state";
type SyncDependencies = {
getGiteaRepoOwnerAsync: typeof import("./gitea")["getGiteaRepoOwnerAsync"];
mirrorGitHubReleasesToGitea: typeof import("./gitea")["mirrorGitHubReleasesToGitea"];
mirrorGitRepoIssuesToGitea: typeof import("./gitea")["mirrorGitRepoIssuesToGitea"];
mirrorGitRepoPullRequestsToGitea: typeof import("./gitea")["mirrorGitRepoPullRequestsToGitea"];
mirrorGitRepoLabelsToGitea: typeof import("./gitea")["mirrorGitRepoLabelsToGitea"];
mirrorGitRepoMilestonesToGitea: typeof import("./gitea")["mirrorGitRepoMilestonesToGitea"];
};
/**
@@ -330,36 +338,236 @@ export async function syncGiteaRepoEnhanced({
Authorization: `token ${decryptedConfig.giteaConfig.token}`,
});
const metadataState = parseRepositoryMetadataState(repository.metadata);
let metadataUpdated = false;
const skipMetadataForStarred =
repository.isStarred && config.githubConfig?.starredCodeOnly;
let metadataOctokit: Octokit | null = null;
const ensureOctokit = (): Octokit | null => {
if (metadataOctokit) {
return metadataOctokit;
}
if (!decryptedConfig.githubConfig?.token) {
return null;
}
metadataOctokit = new Octokit({
auth: decryptedConfig.githubConfig.token,
});
return metadataOctokit;
};
const shouldMirrorReleases =
decryptedConfig.giteaConfig?.mirrorReleases &&
!(repository.isStarred && decryptedConfig.githubConfig?.starredCodeOnly);
!!config.giteaConfig?.mirrorReleases && !skipMetadataForStarred;
const shouldMirrorIssuesThisRun =
!!config.giteaConfig?.mirrorIssues &&
!skipMetadataForStarred &&
!metadataState.components.issues;
const shouldMirrorPullRequests =
!!config.giteaConfig?.mirrorPullRequests &&
!skipMetadataForStarred &&
!metadataState.components.pullRequests;
const shouldMirrorLabels =
!!config.giteaConfig?.mirrorLabels &&
!skipMetadataForStarred &&
!shouldMirrorIssuesThisRun &&
!metadataState.components.labels;
const shouldMirrorMilestones =
!!config.giteaConfig?.mirrorMilestones &&
!skipMetadataForStarred &&
!metadataState.components.milestones;
if (shouldMirrorReleases) {
if (!decryptedConfig.githubConfig?.token) {
const octokit = ensureOctokit();
if (!octokit) {
console.warn(
`[Sync] Skipping release mirroring for ${repository.name}: Missing GitHub token`
);
} else {
try {
const octokit = new Octokit({ auth: decryptedConfig.githubConfig.token });
await dependencies.mirrorGitHubReleasesToGitea({
config: decryptedConfig,
config,
octokit,
repository,
giteaOwner: repoOwner,
giteaRepoName: repository.name,
});
console.log(`[Sync] Mirrored releases for ${repository.name} after sync`);
metadataState.components.releases = true;
metadataUpdated = true;
console.log(
`[Sync] Mirrored releases for ${repository.name} after sync`
);
} catch (releaseError) {
console.error(
`[Sync] Failed to mirror releases for ${repository.name}: ${
releaseError instanceof Error ? releaseError.message : String(releaseError)
releaseError instanceof Error
? releaseError.message
: String(releaseError)
}`
);
}
}
}
if (shouldMirrorIssuesThisRun) {
const octokit = ensureOctokit();
if (!octokit) {
console.warn(
`[Sync] Skipping issue mirroring for ${repository.name}: Missing GitHub token`
);
} else {
try {
await dependencies.mirrorGitRepoIssuesToGitea({
config,
octokit,
repository,
giteaOwner: repoOwner,
giteaRepoName: repository.name,
});
metadataState.components.issues = true;
metadataState.components.labels = true;
metadataUpdated = true;
console.log(
`[Sync] Mirrored issues for ${repository.name} after sync`
);
} catch (issueError) {
console.error(
`[Sync] Failed to mirror issues for ${repository.name}: ${
issueError instanceof Error
? issueError.message
: String(issueError)
}`
);
}
}
} else if (
config.giteaConfig?.mirrorIssues &&
metadataState.components.issues
) {
console.log(
`[Sync] Issues already mirrored for ${repository.name}; skipping`
);
}
if (shouldMirrorPullRequests) {
const octokit = ensureOctokit();
if (!octokit) {
console.warn(
`[Sync] Skipping pull request mirroring for ${repository.name}: Missing GitHub token`
);
} else {
try {
await dependencies.mirrorGitRepoPullRequestsToGitea({
config,
octokit,
repository,
giteaOwner: repoOwner,
giteaRepoName: repository.name,
});
metadataState.components.pullRequests = true;
metadataUpdated = true;
console.log(
`[Sync] Mirrored pull requests for ${repository.name} after sync`
);
} catch (prError) {
console.error(
`[Sync] Failed to mirror pull requests for ${repository.name}: ${
prError instanceof Error ? prError.message : String(prError)
}`
);
}
}
} else if (
config.giteaConfig?.mirrorPullRequests &&
metadataState.components.pullRequests
) {
console.log(
`[Sync] Pull requests already mirrored for ${repository.name}; skipping`
);
}
if (shouldMirrorLabels) {
const octokit = ensureOctokit();
if (!octokit) {
console.warn(
`[Sync] Skipping label mirroring for ${repository.name}: Missing GitHub token`
);
} else {
try {
await dependencies.mirrorGitRepoLabelsToGitea({
config,
octokit,
repository,
giteaOwner: repoOwner,
giteaRepoName: repository.name,
});
metadataState.components.labels = true;
metadataUpdated = true;
console.log(
`[Sync] Mirrored labels for ${repository.name} after sync`
);
} catch (labelError) {
console.error(
`[Sync] Failed to mirror labels for ${repository.name}: ${
labelError instanceof Error
? labelError.message
: String(labelError)
}`
);
}
}
} else if (
config.giteaConfig?.mirrorLabels &&
metadataState.components.labels
) {
console.log(
`[Sync] Labels already mirrored for ${repository.name}; skipping`
);
}
if (shouldMirrorMilestones) {
const octokit = ensureOctokit();
if (!octokit) {
console.warn(
`[Sync] Skipping milestone mirroring for ${repository.name}: Missing GitHub token`
);
} else {
try {
await dependencies.mirrorGitRepoMilestonesToGitea({
config,
octokit,
repository,
giteaOwner: repoOwner,
giteaRepoName: repository.name,
});
metadataState.components.milestones = true;
metadataUpdated = true;
console.log(
`[Sync] Mirrored milestones for ${repository.name} after sync`
);
} catch (milestoneError) {
console.error(
`[Sync] Failed to mirror milestones for ${repository.name}: ${
milestoneError instanceof Error
? milestoneError.message
: String(milestoneError)
}`
);
}
}
} else if (
config.giteaConfig?.mirrorMilestones &&
metadataState.components.milestones
) {
console.log(
`[Sync] Milestones already mirrored for ${repository.name}; skipping`
);
}
if (metadataUpdated) {
metadataState.lastSyncedAt = new Date().toISOString();
}
// Mark repo as "synced" in DB
await db
.update(repositories)
@@ -369,6 +577,9 @@ export async function syncGiteaRepoEnhanced({
lastMirrored: new Date(),
errorMessage: null,
mirroredLocation: `${repoOwner}/${repository.name}`,
metadata: metadataUpdated
? serializeRepositoryMetadataState(metadataState)
: repository.metadata ?? null,
})
.where(eq(repositories.id, repository.id!));
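
The SyncDependencies type makes every metadata mirroring function injectable, which is what the test file earlier in this diff exploits with mocks. In production the same bundle would presumably be built from the real ./gitea exports; a sketch of that wiring (only the dependency object is shown, and the call at the end is left as a comment because config and repository come from the database):

import {
  getGiteaRepoOwnerAsync,
  mirrorGitHubReleasesToGitea,
  mirrorGitRepoIssuesToGitea,
  mirrorGitRepoPullRequestsToGitea,
  mirrorGitRepoLabelsToGitea,
  mirrorGitRepoMilestonesToGitea,
} from "./gitea";

// Live dependency bundle shaped like SyncDependencies; tests substitute a mock for each entry.
const liveSyncDependencies = {
  getGiteaRepoOwnerAsync,
  mirrorGitHubReleasesToGitea,
  mirrorGitRepoIssuesToGitea,
  mirrorGitRepoPullRequestsToGitea,
  mirrorGitRepoLabelsToGitea,
  mirrorGitRepoMilestonesToGitea,
};

// await syncGiteaRepoEnhanced({ config, repository }, liveSyncDependencies);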


@@ -12,6 +12,11 @@ import { createMirrorJob } from "./helpers";
import { db, organizations, repositories } from "./db";
import { eq, and } from "drizzle-orm";
import { decryptConfigTokens } from "./utils/config-encryption";
import { formatDateShort } from "./utils";
import {
parseRepositoryMetadataState,
serializeRepositoryMetadataState,
} from "./metadata-state";
/**
* Helper function to get organization configuration including destination override
@@ -586,12 +591,18 @@ export const mirrorGithubRepoToGitea = async ({
}
);
//mirror releases
// Skip releases for starred repos if starredCodeOnly is enabled
const shouldMirrorReleases = config.giteaConfig?.mirrorReleases &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
const metadataState = parseRepositoryMetadataState(repository.metadata);
let metadataUpdated = false;
const skipMetadataForStarred =
repository.isStarred && config.githubConfig?.starredCodeOnly;
console.log(`[Metadata] Release mirroring check: mirrorReleases=${config.giteaConfig?.mirrorReleases}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorReleases=${shouldMirrorReleases}`);
// Mirror releases if enabled (always allowed to rerun for updates)
const shouldMirrorReleases =
!!config.giteaConfig?.mirrorReleases && !skipMetadataForStarred;
console.log(
`[Metadata] Release mirroring check: mirrorReleases=${config.giteaConfig?.mirrorReleases}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorReleases=${shouldMirrorReleases}`
);
if (shouldMirrorReleases) {
try {
@@ -602,21 +613,32 @@ export const mirrorGithubRepoToGitea = async ({
giteaOwner: repoOwner,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored releases for ${repository.name}`);
metadataState.components.releases = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored releases for ${repository.name}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror releases for ${repository.name}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror releases for ${repository.name}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other operations even if releases fail
}
}
// clone issues
// Skip issues for starred repos if starredCodeOnly is enabled
const shouldMirrorIssues = config.giteaConfig?.mirrorIssues &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
console.log(`[Metadata] Issue mirroring check: mirrorIssues=${config.giteaConfig?.mirrorIssues}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorIssues=${shouldMirrorIssues}`);
if (shouldMirrorIssues) {
// Determine metadata operations to avoid duplicates
const shouldMirrorIssuesThisRun =
!!config.giteaConfig?.mirrorIssues &&
!skipMetadataForStarred &&
!metadataState.components.issues;
console.log(
`[Metadata] Issue mirroring check: mirrorIssues=${config.giteaConfig?.mirrorIssues}, alreadyMirrored=${metadataState.components.issues}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorIssues=${shouldMirrorIssuesThisRun}`
);
if (shouldMirrorIssuesThisRun) {
try {
await mirrorGitRepoIssuesToGitea({
config,
@@ -625,19 +647,34 @@ export const mirrorGithubRepoToGitea = async ({
giteaOwner: repoOwner,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored issues for ${repository.name}`);
metadataState.components.issues = true;
metadataState.components.labels = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored issues for ${repository.name}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror issues for ${repository.name}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror issues for ${repository.name}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other metadata operations even if issues fail
}
} else if (config.giteaConfig?.mirrorIssues && metadataState.components.issues) {
console.log(
`[Metadata] Issues already mirrored for ${repository.name}; skipping to avoid duplicates`
);
}
// Mirror pull requests if enabled
// Skip pull requests for starred repos if starredCodeOnly is enabled
const shouldMirrorPullRequests = config.giteaConfig?.mirrorPullRequests &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
const shouldMirrorPullRequests =
!!config.giteaConfig?.mirrorPullRequests &&
!skipMetadataForStarred &&
!metadataState.components.pullRequests;
console.log(`[Metadata] Pull request mirroring check: mirrorPullRequests=${config.giteaConfig?.mirrorPullRequests}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorPullRequests=${shouldMirrorPullRequests}`);
console.log(
`[Metadata] Pull request mirroring check: mirrorPullRequests=${config.giteaConfig?.mirrorPullRequests}, alreadyMirrored=${metadataState.components.pullRequests}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorPullRequests=${shouldMirrorPullRequests}`
);
if (shouldMirrorPullRequests) {
try {
@@ -648,19 +685,37 @@ export const mirrorGithubRepoToGitea = async ({
giteaOwner: repoOwner,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored pull requests for ${repository.name}`);
metadataState.components.pullRequests = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored pull requests for ${repository.name}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror pull requests for ${repository.name}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror pull requests for ${repository.name}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other metadata operations even if PRs fail
}
} else if (
config.giteaConfig?.mirrorPullRequests &&
metadataState.components.pullRequests
) {
console.log(
`[Metadata] Pull requests already mirrored for ${repository.name}; skipping`
);
}
// Mirror labels if enabled (and not already done via issues)
// Skip labels for starred repos if starredCodeOnly is enabled
const shouldMirrorLabels = config.giteaConfig?.mirrorLabels && !shouldMirrorIssues &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
const shouldMirrorLabels =
!!config.giteaConfig?.mirrorLabels &&
!skipMetadataForStarred &&
!shouldMirrorIssuesThisRun &&
!metadataState.components.labels;
console.log(`[Metadata] Label mirroring check: mirrorLabels=${config.giteaConfig?.mirrorLabels}, shouldMirrorIssues=${shouldMirrorIssues}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorLabels=${shouldMirrorLabels}`);
console.log(
`[Metadata] Label mirroring check: mirrorLabels=${config.giteaConfig?.mirrorLabels}, alreadyMirrored=${metadataState.components.labels}, issuesRunning=${shouldMirrorIssuesThisRun}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorLabels=${shouldMirrorLabels}`
);
if (shouldMirrorLabels) {
try {
@@ -671,19 +726,33 @@ export const mirrorGithubRepoToGitea = async ({
giteaOwner: repoOwner,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored labels for ${repository.name}`);
metadataState.components.labels = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored labels for ${repository.name}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror labels for ${repository.name}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror labels for ${repository.name}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other metadata operations even if labels fail
}
} else if (config.giteaConfig?.mirrorLabels && metadataState.components.labels) {
console.log(
`[Metadata] Labels already mirrored for ${repository.name}; skipping`
);
}
// Mirror milestones if enabled
// Skip milestones for starred repos if starredCodeOnly is enabled
const shouldMirrorMilestones = config.giteaConfig?.mirrorMilestones &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
const shouldMirrorMilestones =
!!config.giteaConfig?.mirrorMilestones &&
!skipMetadataForStarred &&
!metadataState.components.milestones;
console.log(`[Metadata] Milestone mirroring check: mirrorMilestones=${config.giteaConfig?.mirrorMilestones}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorMilestones=${shouldMirrorMilestones}`);
console.log(
`[Metadata] Milestone mirroring check: mirrorMilestones=${config.giteaConfig?.mirrorMilestones}, alreadyMirrored=${metadataState.components.milestones}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorMilestones=${shouldMirrorMilestones}`
);
if (shouldMirrorMilestones) {
try {
@@ -694,11 +763,30 @@ export const mirrorGithubRepoToGitea = async ({
giteaOwner: repoOwner,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored milestones for ${repository.name}`);
metadataState.components.milestones = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored milestones for ${repository.name}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror milestones for ${repository.name}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror milestones for ${repository.name}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other metadata operations even if milestones fail
}
} else if (
config.giteaConfig?.mirrorMilestones &&
metadataState.components.milestones
) {
console.log(
`[Metadata] Milestones already mirrored for ${repository.name}; skipping`
);
}
if (metadataUpdated) {
metadataState.lastSyncedAt = new Date().toISOString();
}
console.log(`Repository ${repository.name} mirrored successfully as ${targetRepoName}`);
@@ -712,6 +800,9 @@ export const mirrorGithubRepoToGitea = async ({
lastMirrored: new Date(),
errorMessage: null,
mirroredLocation: `${repoOwner}/${targetRepoName}`,
metadata: metadataUpdated
? serializeRepositoryMetadataState(metadataState)
: repository.metadata ?? null,
})
.where(eq(repositories.id, repository.id!));
@@ -1052,12 +1143,17 @@ export async function mirrorGitHubRepoToGiteaOrg({
}
);
//mirror releases
// Skip releases for starred repos if starredCodeOnly is enabled
const shouldMirrorReleases = config.giteaConfig?.mirrorReleases &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
const metadataState = parseRepositoryMetadataState(repository.metadata);
let metadataUpdated = false;
const skipMetadataForStarred =
repository.isStarred && config.githubConfig?.starredCodeOnly;
console.log(`[Metadata] Release mirroring check: mirrorReleases=${config.giteaConfig?.mirrorReleases}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorReleases=${shouldMirrorReleases}`);
const shouldMirrorReleases =
!!config.giteaConfig?.mirrorReleases && !skipMetadataForStarred;
console.log(
`[Metadata] Release mirroring check: mirrorReleases=${config.giteaConfig?.mirrorReleases}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorReleases=${shouldMirrorReleases}`
);
if (shouldMirrorReleases) {
try {
@@ -1068,21 +1164,31 @@ export async function mirrorGitHubRepoToGiteaOrg({
giteaOwner: orgName,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored releases for ${repository.name}`);
metadataState.components.releases = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored releases for ${repository.name}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror releases for ${repository.name}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror releases for ${repository.name}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other operations even if releases fail
}
}
// Clone issues
// Skip issues for starred repos if starredCodeOnly is enabled
const shouldMirrorIssues = config.giteaConfig?.mirrorIssues &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
console.log(`[Metadata] Issue mirroring check: mirrorIssues=${config.giteaConfig?.mirrorIssues}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorIssues=${shouldMirrorIssues}`);
if (shouldMirrorIssues) {
const shouldMirrorIssuesThisRun =
!!config.giteaConfig?.mirrorIssues &&
!skipMetadataForStarred &&
!metadataState.components.issues;
console.log(
`[Metadata] Issue mirroring check: mirrorIssues=${config.giteaConfig?.mirrorIssues}, alreadyMirrored=${metadataState.components.issues}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorIssues=${shouldMirrorIssuesThisRun}`
);
if (shouldMirrorIssuesThisRun) {
try {
await mirrorGitRepoIssuesToGitea({
config,
@@ -1091,19 +1197,37 @@ export async function mirrorGitHubRepoToGiteaOrg({
giteaOwner: orgName,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored issues for ${repository.name} to org ${orgName}/${targetRepoName}`);
metadataState.components.issues = true;
metadataState.components.labels = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored issues for ${repository.name} to org ${orgName}/${targetRepoName}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror issues for ${repository.name} to org ${orgName}/${targetRepoName}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror issues for ${repository.name} to org ${orgName}/${targetRepoName}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other metadata operations even if issues fail
}
} else if (
config.giteaConfig?.mirrorIssues &&
metadataState.components.issues
) {
console.log(
`[Metadata] Issues already mirrored for ${repository.name}; skipping`
);
}
// Mirror pull requests if enabled
// Skip pull requests for starred repos if starredCodeOnly is enabled
const shouldMirrorPullRequests = config.giteaConfig?.mirrorPullRequests &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
const shouldMirrorPullRequests =
!!config.giteaConfig?.mirrorPullRequests &&
!skipMetadataForStarred &&
!metadataState.components.pullRequests;
console.log(`[Metadata] Pull request mirroring check: mirrorPullRequests=${config.giteaConfig?.mirrorPullRequests}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorPullRequests=${shouldMirrorPullRequests}`);
console.log(
`[Metadata] Pull request mirroring check: mirrorPullRequests=${config.giteaConfig?.mirrorPullRequests}, alreadyMirrored=${metadataState.components.pullRequests}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorPullRequests=${shouldMirrorPullRequests}`
);
if (shouldMirrorPullRequests) {
try {
@@ -1114,19 +1238,37 @@ export async function mirrorGitHubRepoToGiteaOrg({
giteaOwner: orgName,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored pull requests for ${repository.name} to org ${orgName}/${targetRepoName}`);
metadataState.components.pullRequests = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored pull requests for ${repository.name} to org ${orgName}/${targetRepoName}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror pull requests for ${repository.name} to org ${orgName}/${targetRepoName}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror pull requests for ${repository.name} to org ${orgName}/${targetRepoName}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other metadata operations even if PRs fail
}
} else if (
config.giteaConfig?.mirrorPullRequests &&
metadataState.components.pullRequests
) {
console.log(
`[Metadata] Pull requests already mirrored for ${repository.name}; skipping`
);
}
// Mirror labels if enabled (and not already done via issues)
// Skip labels for starred repos if starredCodeOnly is enabled
const shouldMirrorLabels = config.giteaConfig?.mirrorLabels && !shouldMirrorIssues &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
const shouldMirrorLabels =
!!config.giteaConfig?.mirrorLabels &&
!skipMetadataForStarred &&
!shouldMirrorIssuesThisRun &&
!metadataState.components.labels;
console.log(`[Metadata] Label mirroring check: mirrorLabels=${config.giteaConfig?.mirrorLabels}, shouldMirrorIssues=${shouldMirrorIssues}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorLabels=${shouldMirrorLabels}`);
console.log(
`[Metadata] Label mirroring check: mirrorLabels=${config.giteaConfig?.mirrorLabels}, alreadyMirrored=${metadataState.components.labels}, issuesRunning=${shouldMirrorIssuesThisRun}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorLabels=${shouldMirrorLabels}`
);
if (shouldMirrorLabels) {
try {
@@ -1137,19 +1279,36 @@ export async function mirrorGitHubRepoToGiteaOrg({
giteaOwner: orgName,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored labels for ${repository.name} to org ${orgName}/${targetRepoName}`);
metadataState.components.labels = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored labels for ${repository.name} to org ${orgName}/${targetRepoName}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror labels for ${repository.name} to org ${orgName}/${targetRepoName}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror labels for ${repository.name} to org ${orgName}/${targetRepoName}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other metadata operations even if labels fail
}
} else if (
config.giteaConfig?.mirrorLabels &&
metadataState.components.labels
) {
console.log(
`[Metadata] Labels already mirrored for ${repository.name}; skipping`
);
}
// Mirror milestones if enabled
// Skip milestones for starred repos if starredCodeOnly is enabled
const shouldMirrorMilestones = config.giteaConfig?.mirrorMilestones &&
!(repository.isStarred && config.githubConfig?.starredCodeOnly);
const shouldMirrorMilestones =
!!config.giteaConfig?.mirrorMilestones &&
!skipMetadataForStarred &&
!metadataState.components.milestones;
console.log(`[Metadata] Milestone mirroring check: mirrorMilestones=${config.giteaConfig?.mirrorMilestones}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorMilestones=${shouldMirrorMilestones}`);
console.log(
`[Metadata] Milestone mirroring check: mirrorMilestones=${config.giteaConfig?.mirrorMilestones}, alreadyMirrored=${metadataState.components.milestones}, isStarred=${repository.isStarred}, starredCodeOnly=${config.githubConfig?.starredCodeOnly}, shouldMirrorMilestones=${shouldMirrorMilestones}`
);
if (shouldMirrorMilestones) {
try {
@@ -1160,11 +1319,30 @@ export async function mirrorGitHubRepoToGiteaOrg({
giteaOwner: orgName,
giteaRepoName: targetRepoName,
});
console.log(`[Metadata] Successfully mirrored milestones for ${repository.name} to org ${orgName}/${targetRepoName}`);
metadataState.components.milestones = true;
metadataUpdated = true;
console.log(
`[Metadata] Successfully mirrored milestones for ${repository.name} to org ${orgName}/${targetRepoName}`
);
} catch (error) {
console.error(`[Metadata] Failed to mirror milestones for ${repository.name} to org ${orgName}/${targetRepoName}: ${error instanceof Error ? error.message : String(error)}`);
console.error(
`[Metadata] Failed to mirror milestones for ${repository.name} to org ${orgName}/${targetRepoName}: ${
error instanceof Error ? error.message : String(error)
}`
);
// Continue with other metadata operations even if milestones fail
}
} else if (
config.giteaConfig?.mirrorMilestones &&
metadataState.components.milestones
) {
console.log(
`[Metadata] Milestones already mirrored for ${repository.name}; skipping`
);
}
if (metadataUpdated) {
metadataState.lastSyncedAt = new Date().toISOString();
}
console.log(
@@ -1180,6 +1358,9 @@ export async function mirrorGitHubRepoToGiteaOrg({
lastMirrored: new Date(),
errorMessage: null,
mirroredLocation: `${orgName}/${targetRepoName}`,
metadata: metadataUpdated
? serializeRepositoryMetadataState(metadataState)
: repository.metadata ?? null,
})
.where(eq(repositories.id, repository.id!));
@@ -1558,6 +1739,8 @@ export const mirrorGitRepoIssuesToGitea = async ({
repo,
state: "all",
per_page: 100,
sort: "created",
direction: "asc",
},
(res) => res.data
);
@@ -1590,6 +1773,18 @@ export const mirrorGitRepoIssuesToGitea = async ({
// Import the processWithRetry function
const { processWithRetry } = await import("@/lib/utils/concurrency");
const rawIssueConcurrency = config.giteaConfig?.issueConcurrency ?? 3;
const issueConcurrencyLimit =
Number.isFinite(rawIssueConcurrency)
? Math.max(1, Math.floor(rawIssueConcurrency))
: 1;
if (issueConcurrencyLimit > 1) {
console.warn(
`[Issues] Concurrency is set to ${issueConcurrencyLimit}. This may lead to out-of-order issue creation in Gitea but is faster.`
);
}
// Process issues in parallel with concurrency control
await processWithRetry(
filteredIssues,
@@ -1632,11 +1827,15 @@ export const mirrorGitRepoIssuesToGitea = async ({
.join(", ")} on GitHub.`
: "";
const issueAuthor = issue.user?.login ?? "unknown";
const issueCreatedOn = formatDateShort(issue.created_at);
const issueOriginHeader = `Originally created by @${issueAuthor} on GitHub${
issueCreatedOn ? ` (${issueCreatedOn})` : ""
}.`;
const issuePayload: any = {
title: issue.title,
body: `Originally created by @${
issue.user?.login
} on GitHub.${originalAssignees}\n\n${issue.body || ""}`,
body: `${issueOriginHeader}${originalAssignees}\n\n${issue.body ?? ""}`,
closed: issue.state === "closed",
labels: giteaLabelIds,
};
@@ -1662,15 +1861,30 @@ export const mirrorGitRepoIssuesToGitea = async ({
(res) => res.data
);
// Process comments in parallel with concurrency control
if (comments.length > 0) {
// Ensure comments are applied in chronological order to preserve discussion flow
const sortedComments = comments
.slice()
.sort(
(a, b) =>
new Date(a.created_at || 0).getTime() -
new Date(b.created_at || 0).getTime()
);
// Process comments sequentially to preserve historical ordering
if (sortedComments.length > 0) {
await processWithRetry(
comments,
sortedComments,
async (comment) => {
const commenter = comment.user?.login ?? "unknown";
const commentDate = formatDateShort(comment.created_at);
const commentHeader = `@${commenter} commented on GitHub${
commentDate ? ` (${commentDate})` : ""
}:`;
await httpPost(
`${config.giteaConfig!.url}/api/v1/repos/${giteaOwner}/${repoName}/issues/${createdIssue.data.number}/comments`,
{
body: `@${comment.user?.login} commented on GitHub:\n\n${comment.body}`,
body: `${commentHeader}\n\n${comment.body ?? ""}`,
},
{
Authorization: `token ${decryptedConfig.giteaConfig!.token}`,
@@ -1679,7 +1893,7 @@ export const mirrorGitRepoIssuesToGitea = async ({
return comment;
},
{
concurrencyLimit: 5,
concurrencyLimit: 1,
maxRetries: 2,
retryDelay: 1000,
onRetry: (_comment, error, attempt) => {
@@ -1694,7 +1908,7 @@ export const mirrorGitRepoIssuesToGitea = async ({
return issue;
},
{
concurrencyLimit: 3, // Process 3 issues at a time
concurrencyLimit: issueConcurrencyLimit,
maxRetries: 2,
retryDelay: 2000,
onProgress: (completed, total, result) => {
@@ -1778,12 +1992,26 @@ export async function mirrorGitHubReleasesToGitea({
let mirroredCount = 0;
let skippedCount = 0;
// Sort releases by created_at to ensure we get the most recent ones
const sortedReleases = releases.data.sort((a, b) =>
new Date(b.created_at).getTime() - new Date(a.created_at).getTime()
).slice(0, releaseLimit);
const getReleaseTimestamp = (release: typeof releases.data[number]) => {
const sourceDate = release.created_at ?? release.published_at ?? "";
const timestamp = sourceDate ? new Date(sourceDate).getTime() : 0;
return Number.isFinite(timestamp) ? timestamp : 0;
};
for (const release of sortedReleases) {
// Capture the latest releases, then process them oldest-to-newest so Gitea mirrors keep chronological order
const releasesToProcess = releases.data
.slice()
.sort((a, b) => getReleaseTimestamp(b) - getReleaseTimestamp(a))
.slice(0, releaseLimit)
.sort((a, b) => getReleaseTimestamp(a) - getReleaseTimestamp(b));
console.log(`[Releases] Processing ${releasesToProcess.length} releases in chronological order (oldest to newest)`);
releasesToProcess.forEach((rel, idx) => {
const date = new Date(rel.published_at || rel.created_at);
console.log(`[Releases] ${idx + 1}. ${rel.tag_name} - Originally published: ${date.toISOString()}`);
});
for (const release of releasesToProcess) {
try {
// Check if release already exists
const existingReleasesResponse = await httpGet(
@@ -1793,8 +2021,14 @@ export async function mirrorGitHubReleasesToGitea({
}
).catch(() => null);
const releaseNote = release.body || "";
// Prepare release body with GitHub original date header
const githubPublishedDate = release.published_at || release.created_at;
const githubDateHeader = githubPublishedDate
? `> 📅 **Originally published on GitHub:** ${new Date(githubPublishedDate).toUTCString()}\n\n`
: '';
const originalReleaseNote = release.body || "";
const releaseNote = githubDateHeader + originalReleaseNote;
if (existingReleasesResponse) {
// Update existing release if the changelog/body differs
const existingRelease = existingReleasesResponse.data;
@@ -1817,9 +2051,11 @@ export async function mirrorGitHubReleasesToGitea({
Authorization: `token ${decryptedConfig.giteaConfig.token}`,
}
);
if (releaseNote) {
console.log(`[Releases] Updated changelog for ${release.tag_name} (${releaseNote.length} characters)`);
if (originalReleaseNote) {
console.log(`[Releases] Updated changelog for ${release.tag_name} (${originalReleaseNote.length} characters + GitHub date header)`);
} else {
console.log(`[Releases] Updated release ${release.tag_name} with GitHub date header`);
}
mirroredCount++;
} else {
@@ -1829,9 +2065,11 @@ export async function mirrorGitHubReleasesToGitea({
continue;
}
// Create new release with changelog/body content
if (releaseNote) {
console.log(`[Releases] Including changelog for ${release.tag_name} (${releaseNote.length} characters)`);
// Create new release with changelog/body content (includes GitHub date header)
if (originalReleaseNote) {
console.log(`[Releases] Including changelog for ${release.tag_name} (${originalReleaseNote.length} characters + GitHub date header)`);
} else {
console.log(`[Releases] Creating release ${release.tag_name} with GitHub date header (no changelog)`);
}
const createReleaseResponse = await httpPost(
@@ -1899,8 +2137,14 @@ export async function mirrorGitHubReleasesToGitea({
}
mirroredCount++;
const noteInfo = releaseNote ? ` with ${releaseNote.length} character changelog` : " without changelog";
const noteInfo = originalReleaseNote ? ` with ${originalReleaseNote.length} character changelog` : " without changelog";
console.log(`[Releases] Successfully mirrored release: ${release.tag_name}${noteInfo}`);
// Add delay to ensure proper timestamp ordering in Gitea
// Gitea sorts releases by created_unix DESC, and all releases created in quick succession
// will have nearly identical timestamps. The 1-second delay ensures proper chronological order.
console.log(`[Releases] Waiting 1 second to ensure proper timestamp ordering in Gitea...`);
await new Promise(resolve => setTimeout(resolve, 1000));
} catch (error) {
console.error(`[Releases] Failed to mirror release ${release.tag_name}: ${error instanceof Error ? error.message : String(error)}`);
}
@@ -1966,6 +2210,8 @@ export async function mirrorGitRepoPullRequestsToGitea({
repo,
state: "all",
per_page: 100,
sort: "created",
direction: "asc",
},
(res) => res.data
);
@@ -2022,6 +2268,18 @@ export async function mirrorGitRepoPullRequestsToGitea({
const { processWithRetry } = await import("@/lib/utils/concurrency");
const rawPullConcurrency = config.giteaConfig?.pullRequestConcurrency ?? 5;
const pullRequestConcurrencyLimit =
Number.isFinite(rawPullConcurrency)
? Math.max(1, Math.floor(rawPullConcurrency))
: 1;
if (pullRequestConcurrencyLimit > 1) {
console.warn(
`[Pull Requests] Concurrency is set to ${pullRequestConcurrencyLimit}. This may lead to out-of-order pull request mirroring in Gitea.`
);
}
let successCount = 0;
let failedCount = 0;
@@ -2144,7 +2402,7 @@ export async function mirrorGitRepoPullRequestsToGitea({
}
},
{
concurrencyLimit: 5,
concurrencyLimit: pullRequestConcurrencyLimit,
maxRetries: 3,
retryDelay: 1000,
}
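
The release-ordering change is essentially a two-pass sort: select the newest releaseLimit entries, then mirror them oldest-first so that, combined with the 1-second pause between creations, Gitea's created_unix ordering comes out chronological. An isolated sketch of that selection step:

// Illustrative standalone version of the selection used in mirrorGitHubReleasesToGitea.
interface ReleaseLike {
  tag_name: string;
  created_at?: string | null;
  published_at?: string | null;
}

function orderReleasesForMirroring<T extends ReleaseLike>(releases: T[], limit: number): T[] {
  const timestamp = (release: ReleaseLike): number => {
    const source = release.created_at ?? release.published_at ?? "";
    const value = source ? new Date(source).getTime() : 0;
    return Number.isFinite(value) ? value : 0;
  };
  return releases
    .slice()
    .sort((a, b) => timestamp(b) - timestamp(a)) // newest first
    .slice(0, limit) // keep only the most recent `limit` releases
    .sort((a, b) => timestamp(a) - timestamp(b)); // then mirror oldest to newest
}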

src/lib/metadata-state.ts Normal file

@@ -0,0 +1,75 @@
interface MetadataComponentsState {
releases: boolean;
issues: boolean;
pullRequests: boolean;
labels: boolean;
milestones: boolean;
}
export interface RepositoryMetadataState {
components: MetadataComponentsState;
lastSyncedAt?: string;
}
const defaultComponents: MetadataComponentsState = {
releases: false,
issues: false,
pullRequests: false,
labels: false,
milestones: false,
};
export function createDefaultMetadataState(): RepositoryMetadataState {
return {
components: { ...defaultComponents },
};
}
export function parseRepositoryMetadataState(
raw: unknown
): RepositoryMetadataState {
const base = createDefaultMetadataState();
if (!raw) {
return base;
}
let parsed: any = raw;
if (typeof raw === "string") {
try {
parsed = JSON.parse(raw);
} catch {
return base;
}
}
if (!parsed || typeof parsed !== "object") {
return base;
}
if (parsed.components && typeof parsed.components === "object") {
base.components = {
...base.components,
releases: Boolean(parsed.components.releases),
issues: Boolean(parsed.components.issues),
pullRequests: Boolean(parsed.components.pullRequests),
labels: Boolean(parsed.components.labels),
milestones: Boolean(parsed.components.milestones),
};
}
if (typeof parsed.lastSyncedAt === "string") {
base.lastSyncedAt = parsed.lastSyncedAt;
} else if (typeof parsed.lastMetadataSync === "string") {
base.lastSyncedAt = parsed.lastMetadataSync;
}
return base;
}
export function serializeRepositoryMetadataState(
state: RepositoryMetadataState
): string {
return JSON.stringify(state);
}
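A minimal usage sketch (not part of the diff) of how these helpers might round-trip the sync state stored in the repositories `metadata` column; the `repository` row shape shown here is an assumption:

```ts
import {
  parseRepositoryMetadataState,
  serializeRepositoryMetadataState,
} from "@/lib/metadata-state";

// Hypothetical row shape: the metadata column is a nullable JSON string.
declare const repository: { metadata: string | null };

// Parsing tolerates null, plain objects, and malformed JSON by falling back
// to the default "nothing mirrored yet" state.
const state = parseRepositoryMetadataState(repository.metadata);

// After mirroring issues, mark the component as done so the next sync
// does not re-create duplicates.
state.components.issues = true;
state.lastSyncedAt = new Date().toISOString();

// Serialize back to a string before writing it to the text column.
const updatedMetadata = serializeRepositoryMetadataState(state);
```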

View File

@@ -62,6 +62,7 @@ describe('normalizeGitRepoToInsert', () => {
expect(insert.description).toBeNull();
expect(insert.lastMirrored).toBeNull();
expect(insert.errorMessage).toBeNull();
expect(insert.normalizedFullName).toBe(repo.fullName.toLowerCase());
});
});
@@ -72,4 +73,3 @@ describe('calcBatchSizeForInsert', () => {
expect(batch * 29).toBeLessThanOrEqual(999);
});
});

View File

@@ -33,6 +33,7 @@ export function normalizeGitRepoToInsert(
configId,
name: repo.name,
fullName: repo.fullName,
normalizedFullName: repo.fullName.toLowerCase(),
url: repo.url,
cloneUrl: repo.cloneUrl,
owner: repo.owner,
@@ -68,4 +69,3 @@ export function calcBatchSizeForInsert(columnCount: number, maxParams = 999): nu
const effectiveMax = Math.max(1, maxParams - safety);
return Math.max(1, Math.floor(effectiveMax / columnCount));
}

View File

@@ -99,12 +99,12 @@ async function runScheduledSync(config: any): Promise<void> {
// Check for new repositories
const existingRepos = await db
.select({ fullName: repositories.fullName })
.select({ normalizedFullName: repositories.normalizedFullName })
.from(repositories)
.where(eq(repositories.userId, userId));
const existingRepoNames = new Set(existingRepos.map(r => r.fullName));
const newRepos = allGithubRepos.filter(r => !existingRepoNames.has(r.fullName));
const existingRepoNames = new Set(existingRepos.map(r => r.normalizedFullName));
const newRepos = allGithubRepos.filter(r => !existingRepoNames.has(r.fullName.toLowerCase()));
if (newRepos.length > 0) {
console.log(`[Scheduler] Found ${newRepos.length} new repositories for user ${userId}`);
@@ -123,7 +123,7 @@ async function runScheduledSync(config: any): Promise<void> {
await db
.insert(repositories)
.values(batch)
.onConflictDoNothing({ target: [repositories.userId, repositories.fullName] });
.onConflictDoNothing({ target: [repositories.userId, repositories.normalizedFullName] });
}
console.log(`[Scheduler] Successfully imported ${newRepos.length} new repositories for user ${userId}`);
} else {
@@ -432,12 +432,12 @@ async function performInitialAutoStart(): Promise<void> {
// Check for new repositories
const existingRepos = await db
.select({ fullName: repositories.fullName })
.select({ normalizedFullName: repositories.normalizedFullName })
.from(repositories)
.where(eq(repositories.userId, config.userId));
const existingRepoNames = new Set(existingRepos.map(r => r.fullName));
const reposToImport = allGithubRepos.filter(r => !existingRepoNames.has(r.fullName));
const existingRepoNames = new Set(existingRepos.map(r => r.normalizedFullName));
const reposToImport = allGithubRepos.filter(r => !existingRepoNames.has(r.fullName.toLowerCase()));
if (reposToImport.length > 0) {
console.log(`[Scheduler] Importing ${reposToImport.length} repositories for user ${config.userId}...`);
@@ -456,7 +456,7 @@ async function performInitialAutoStart(): Promise<void> {
await db
.insert(repositories)
.values(batch)
.onConflictDoNothing({ target: [repositories.userId, repositories.fullName] });
.onConflictDoNothing({ target: [repositories.userId, repositories.normalizedFullName] });
}
console.log(`[Scheduler] Successfully imported ${reposToImport.length} repositories`);
} else {

View File

@@ -24,6 +24,7 @@ describe("normalizeOidcProviderConfig", () => {
expect(result.oidcConfig.userInfoEndpoint).toBe("https://auth.example.com/userinfo");
expect(result.oidcConfig.scopes).toEqual(["openid", "email"]);
expect(result.oidcConfig.pkce).toBe(false);
expect(result.oidcConfig.discoveryEndpoint).toBe("https://auth.example.com/.well-known/openid-configuration");
});
it("derives missing fields from discovery", async () => {
@@ -46,6 +47,24 @@ describe("normalizeOidcProviderConfig", () => {
expect(result.oidcConfig.jwksEndpoint).toBe("https://auth.example.com/jwks");
expect(result.oidcConfig.userInfoEndpoint).toBe("https://auth.example.com/userinfo");
expect(result.oidcConfig.scopes).toEqual(["openid", "email", "profile"]);
expect(result.oidcConfig.discoveryEndpoint).toBe("https://auth.example.com/.well-known/openid-configuration");
});
it("preserves trailing slash issuers when building discovery endpoints", async () => {
const trailingIssuer = "https://auth.example.com/application/o/example/";
const requestedUrls: string[] = [];
const fetchMock: typeof fetch = async (url) => {
requestedUrls.push(typeof url === "string" ? url : url.url);
return new Response(JSON.stringify({
authorization_endpoint: "https://auth.example.com/application/o/example/auth",
token_endpoint: "https://auth.example.com/application/o/example/token",
}));
};
const result = await normalizeOidcProviderConfig(trailingIssuer, {}, fetchMock);
expect(requestedUrls[0]).toBe("https://auth.example.com/application/o/example/.well-known/openid-configuration");
expect(result.oidcConfig.discoveryEndpoint).toBe("https://auth.example.com/application/o/example/.well-known/openid-configuration");
});
it("throws for invalid issuer URL", async () => {

View File

@@ -131,18 +131,21 @@ export async function normalizeOidcProviderConfig(
throw new OidcConfigError("Issuer is required");
}
let normalizedIssuer: string;
const trimmedIssuer = issuer.trim();
try {
const issuerUrl = new URL(issuer.trim());
normalizedIssuer = issuerUrl.toString().replace(/\/$/, "");
// Validate issuer but keep caller-provided formatting so we don't break provider expectations
new URL(trimmedIssuer);
} catch {
throw new OidcConfigError(`Invalid issuer URL: ${issuer}`);
}
const issuerForDiscovery = trimmedIssuer.replace(/\/$/, "");
const discoveryEndpoint = cleanUrl(
rawConfig.discoveryEndpoint,
"discovery endpoint",
) ?? `${normalizedIssuer}/.well-known/openid-configuration`;
) ?? `${issuerForDiscovery}/.well-known/openid-configuration`;
const authorizationEndpoint = cleanUrl(rawConfig.authorizationEndpoint, "authorization endpoint");
const tokenEndpoint = cleanUrl(rawConfig.tokenEndpoint, "token endpoint");

View File

@@ -1,5 +1,5 @@
import { describe, test, expect } from "bun:test";
import { jsonResponse, formatDate, truncate, safeParse, parseErrorMessage, showErrorToast } from "./utils";
import { jsonResponse, formatDate, formatDateShort, truncate, safeParse, parseErrorMessage, showErrorToast } from "./utils";
describe("jsonResponse", () => {
test("creates a Response with JSON content", () => {
@@ -65,6 +65,18 @@ describe("formatDate", () => {
});
});
describe("formatDateShort", () => {
test("returns formatted date when input is provided", () => {
const formatted = formatDateShort("2014-10-20T15:32:10Z");
expect(formatted).toBe("Oct 20, 2014");
});
test("returns undefined when date is missing", () => {
expect(formatDateShort(null)).toBeUndefined();
expect(formatDateShort(undefined)).toBeUndefined();
});
});
describe("truncate", () => {
test("truncates a string that exceeds the length", () => {
const str = "This is a long string that needs truncation";

View File

@@ -29,6 +29,15 @@ export function formatDate(date?: Date | string | null): string {
}).format(new Date(date));
}
export function formatDateShort(date?: Date | string | null): string | undefined {
if (!date) return undefined;
return new Intl.DateTimeFormat("en-US", {
year: "numeric",
month: "short",
day: "numeric",
}).format(new Date(date));
}
export function formatLastSyncTime(date: Date | string | null): string {
if (!date) return "Never";

View File

@@ -86,6 +86,8 @@ export async function createDefaultConfig({ userId, envOverrides = {} }: Default
addTopics: true,
preserveVisibility: false,
forkStrategy: "reference",
issueConcurrency: 3,
pullRequestConcurrency: 5,
},
include: [],
exclude: [],

View File

@@ -89,6 +89,8 @@ export function mapUiToDbConfig(
forkStrategy: advancedOptions.skipForks ? "skip" : "reference",
// Mirror options from UI
issueConcurrency: giteaConfig.issueConcurrency ?? 3,
pullRequestConcurrency: giteaConfig.pullRequestConcurrency ?? 5,
mirrorReleases: mirrorOptions.mirrorReleases,
releaseLimit: mirrorOptions.releaseLimit || 10,
mirrorMetadata: mirrorOptions.mirrorMetadata,
@@ -132,6 +134,8 @@ export function mapDbToUiConfig(dbConfig: any): {
preserveOrgStructure: dbConfig.giteaConfig?.preserveVisibility || false, // Map preserveVisibility
mirrorStrategy: dbConfig.githubConfig?.mirrorStrategy || "preserve", // Get from GitHub config
personalReposOrg: undefined, // Not stored in current schema
issueConcurrency: dbConfig.giteaConfig?.issueConcurrency ?? 3,
pullRequestConcurrency: dbConfig.giteaConfig?.pullRequestConcurrency ?? 5,
};
// Map mirror options from various database fields

View File

@@ -29,12 +29,13 @@ export async function POST(context: APIContext) {
);
}
// Validate issuer URL format
// Validate issuer URL format while preserving trailing slash when provided
let validatedIssuer = issuer;
if (issuer && typeof issuer === 'string' && issuer.trim() !== '') {
try {
const issuerUrl = new URL(issuer.trim());
validatedIssuer = issuerUrl.toString().replace(/\/$/, ''); // Remove trailing slash
const trimmedIssuer = issuer.trim();
new URL(trimmedIssuer);
validatedIssuer = trimmedIssuer;
} catch (e) {
return new Response(
JSON.stringify({ error: `Invalid issuer URL format: ${issuer}` }),

View File

@@ -1,5 +1,5 @@
import type { APIRoute } from "astro";
import { db, organizations } from "@/lib/db";
import { db, organizations, repositories } from "@/lib/db";
import { eq, and } from "drizzle-orm";
import { createSecureErrorResponse } from "@/lib/utils";
import { requireAuth } from "@/lib/utils/auth-helpers";
@@ -61,3 +61,60 @@ export const PATCH: APIRoute = async (context) => {
return createSecureErrorResponse(error, "Update organization destination", 500);
}
};
export const DELETE: APIRoute = async (context) => {
try {
const { user, response } = await requireAuth(context);
if (response) return response;
const userId = user!.id;
const orgId = context.params.id;
if (!orgId) {
return new Response(
JSON.stringify({ error: "Organization ID is required" }),
{
status: 400,
headers: { "Content-Type": "application/json" },
}
);
}
const [existingOrg] = await db
.select()
.from(organizations)
.where(and(eq(organizations.id, orgId), eq(organizations.userId, userId)))
.limit(1);
if (!existingOrg) {
return new Response(
JSON.stringify({ error: "Organization not found" }),
{
status: 404,
headers: { "Content-Type": "application/json" },
}
);
}
await db.delete(repositories).where(
and(
eq(repositories.userId, userId),
eq(repositories.organization, existingOrg.name)
)
);
await db
.delete(organizations)
.where(and(eq(organizations.id, orgId), eq(organizations.userId, userId)));
return new Response(
JSON.stringify({ success: true }),
{
status: 200,
headers: { "Content-Type": "application/json" },
}
);
} catch (error) {
return createSecureErrorResponse(error, "Delete organization", 500);
}
};

View File

@@ -1,5 +1,5 @@
import type { APIRoute } from "astro";
import { db, repositories } from "@/lib/db";
import { db, repositories, mirrorJobs } from "@/lib/db";
import { eq, and } from "drizzle-orm";
import { createSecureErrorResponse } from "@/lib/utils";
import { requireAuth } from "@/lib/utils/auth-helpers";
@@ -60,4 +60,55 @@ export const PATCH: APIRoute = async (context) => {
} catch (error) {
return createSecureErrorResponse(error, "Update repository destination", 500);
}
};
};
export const DELETE: APIRoute = async (context) => {
try {
const { user, response } = await requireAuth(context);
if (response) return response;
const userId = user!.id;
const repoId = context.params.id;
if (!repoId) {
return new Response(JSON.stringify({ error: "Repository ID is required" }), {
status: 400,
headers: { "Content-Type": "application/json" },
});
}
const [existingRepo] = await db
.select()
.from(repositories)
.where(and(eq(repositories.id, repoId), eq(repositories.userId, userId)))
.limit(1);
if (!existingRepo) {
return new Response(
JSON.stringify({ error: "Repository not found" }),
{
status: 404,
headers: { "Content-Type": "application/json" },
}
);
}
await db
.delete(repositories)
.where(and(eq(repositories.id, repoId), eq(repositories.userId, userId)));
await db
.delete(mirrorJobs)
.where(and(eq(mirrorJobs.repositoryId, repoId), eq(mirrorJobs.userId, userId)));
return new Response(
JSON.stringify({ success: true }),
{
status: 200,
headers: { "Content-Type": "application/json" },
}
);
} catch (error) {
return createSecureErrorResponse(error, "Delete repository", 500);
}
};

View File

@@ -17,11 +17,11 @@ export async function POST(context: APIContext) {
});
}
// Validate issuer URL format
let cleanIssuer: string;
// Validate issuer URL format while keeping trailing slash if provided
const trimmedIssuer = issuer.trim();
let parsedIssuer: URL;
try {
const issuerUrl = new URL(issuer.trim());
cleanIssuer = issuerUrl.toString().replace(/\/$/, ""); // Remove trailing slash
parsedIssuer = new URL(trimmedIssuer);
} catch (e) {
return new Response(
JSON.stringify({
@@ -35,7 +35,8 @@ export async function POST(context: APIContext) {
);
}
const discoveryUrl = `${cleanIssuer}/.well-known/openid-configuration`;
const issuerForDiscovery = trimmedIssuer.replace(/\/$/, "");
const discoveryUrl = `${issuerForDiscovery}/.well-known/openid-configuration`;
try {
// Fetch OIDC discovery document with timeout
@@ -52,9 +53,9 @@ export async function POST(context: APIContext) {
});
} catch (fetchError) {
if (fetchError instanceof Error && fetchError.name === 'AbortError') {
throw new Error(`Request timeout: The OIDC provider at ${cleanIssuer} did not respond within 10 seconds`);
throw new Error(`Request timeout: The OIDC provider at ${trimmedIssuer} did not respond within 10 seconds`);
}
throw new Error(`Network error: Could not connect to ${cleanIssuer}. Please verify the URL is correct and accessible.`);
throw new Error(`Network error: Could not connect to ${trimmedIssuer}. Please verify the URL is correct and accessible.`);
} finally {
clearTimeout(timeoutId);
}
@@ -63,7 +64,7 @@ export async function POST(context: APIContext) {
if (response.status === 404) {
throw new Error(`OIDC discovery document not found at ${discoveryUrl}. For Authentik, ensure you're using the correct application slug in the URL.`);
} else if (response.status >= 500) {
throw new Error(`OIDC provider error (${response.status}): The server at ${cleanIssuer} returned an error.`);
throw new Error(`OIDC provider error (${response.status}): The server at ${trimmedIssuer} returned an error.`);
} else {
throw new Error(`Failed to fetch discovery document (${response.status}): ${response.statusText}`);
}
@@ -73,12 +74,12 @@ export async function POST(context: APIContext) {
try {
config = await response.json();
} catch (parseError) {
throw new Error(`Invalid response: The discovery document from ${cleanIssuer} is not valid JSON.`);
throw new Error(`Invalid response: The discovery document from ${trimmedIssuer} is not valid JSON.`);
}
// Extract the essential endpoints
const discoveredConfig = {
issuer: config.issuer || cleanIssuer,
issuer: config.issuer || trimmedIssuer,
authorizationEndpoint: config.authorization_endpoint,
tokenEndpoint: config.token_endpoint,
userInfoEndpoint: config.userinfo_endpoint,
@@ -88,7 +89,7 @@ export async function POST(context: APIContext) {
responseTypes: config.response_types_supported || ["code"],
grantTypes: config.grant_types_supported || ["authorization_code"],
// Suggested domain from issuer
suggestedDomain: new URL(cleanIssuer).hostname.replace("www.", ""),
suggestedDomain: parsedIssuer.hostname.replace("www.", ""),
};
return new Response(JSON.stringify(discoveredConfig), {
@@ -111,4 +112,4 @@ export async function POST(context: APIContext) {
} catch (error) {
return createSecureErrorResponse(error, "SSO discover API");
}
}
}

View File

@@ -82,11 +82,10 @@ export async function POST(context: APIContext) {
);
}
// Clean issuer URL (remove trailing slash); validate format
let cleanIssuer = issuer;
// Validate issuer URL format but keep trailing slash if provided
const trimmedIssuer = issuer.toString().trim();
try {
const issuerUrl = new URL(issuer.toString().trim());
cleanIssuer = issuerUrl.toString().replace(/\/$/, "");
new URL(trimmedIssuer);
} catch {
return new Response(
JSON.stringify({ error: `Invalid issuer URL format: ${issuer}` }),
@@ -99,7 +98,7 @@ export async function POST(context: APIContext) {
let normalized;
try {
normalized = await normalizeOidcProviderConfig(cleanIssuer, {
normalized = await normalizeOidcProviderConfig(trimmedIssuer, {
clientId,
clientSecret,
authorizationEndpoint,
@@ -134,7 +133,7 @@ export async function POST(context: APIContext) {
.insert(ssoProviders)
.values({
id: nanoid(),
issuer: cleanIssuer,
issuer: trimmedIssuer,
domain,
oidcConfig: JSON.stringify(storedOidcConfig),
userId: user.id,
@@ -213,12 +212,10 @@ export async function PUT(context: APIContext) {
// Parse existing config
const existingConfig = JSON.parse(existingProvider.oidcConfig);
const effectiveIssuer = issuer || existingProvider.issuer;
const effectiveIssuer = issuer?.toString().trim() || existingProvider.issuer;
let cleanIssuer = effectiveIssuer;
try {
const issuerUrl = new URL(effectiveIssuer.toString().trim());
cleanIssuer = issuerUrl.toString().replace(/\/$/, "");
new URL(effectiveIssuer);
} catch {
return new Response(
JSON.stringify({ error: `Invalid issuer URL format: ${effectiveIssuer}` }),
@@ -244,7 +241,7 @@ export async function PUT(context: APIContext) {
let normalized;
try {
normalized = await normalizeOidcProviderConfig(cleanIssuer, mergedConfig);
normalized = await normalizeOidcProviderConfig(effectiveIssuer, mergedConfig);
} catch (error) {
if (error instanceof OidcConfigError) {
return new Response(
@@ -266,7 +263,7 @@ export async function PUT(context: APIContext) {
const [updatedProvider] = await db
.update(ssoProviders)
.set({
issuer: cleanIssuer,
issuer: effectiveIssuer,
domain: domain || existingProvider.domain,
oidcConfig: JSON.stringify(storedOidcConfig),
organizationId: organizationId !== undefined ? organizationId : existingProvider.organizationId,

View File

@@ -66,6 +66,7 @@ export const POST: APIRoute = async ({ request }) => {
configId: config.id,
name: repo.name,
fullName: repo.fullName,
normalizedFullName: repo.fullName.toLowerCase(),
url: repo.url,
cloneUrl: repo.cloneUrl,
owner: repo.owner,
@@ -97,6 +98,7 @@ export const POST: APIRoute = async ({ request }) => {
userId,
configId: config.id,
name: org.name,
normalizedName: org.name.toLowerCase(),
avatarUrl: org.avatarUrl,
membershipRole: org.membershipRole,
isIncluded: false,
@@ -113,22 +115,22 @@ export const POST: APIRoute = async ({ request }) => {
await db.transaction(async (tx) => {
const [existingRepos, existingOrgs] = await Promise.all([
tx
.select({ fullName: repositories.fullName })
.select({ normalizedFullName: repositories.normalizedFullName })
.from(repositories)
.where(eq(repositories.userId, userId)),
tx
.select({ name: organizations.name })
.select({ normalizedName: organizations.normalizedName })
.from(organizations)
.where(eq(organizations.userId, userId)),
]);
const existingRepoNames = new Set(existingRepos.map((r) => r.fullName));
const existingOrgNames = new Set(existingOrgs.map((o) => o.name));
const existingRepoNames = new Set(existingRepos.map((r) => r.normalizedFullName));
const existingOrgNames = new Set(existingOrgs.map((o) => o.normalizedName));
insertedRepos = newRepos.filter(
(r) => !existingRepoNames.has(r.fullName)
(r) => !existingRepoNames.has(r.normalizedFullName)
);
insertedOrgs = newOrgs.filter((o) => !existingOrgNames.has(o.name));
insertedOrgs = newOrgs.filter((o) => !existingOrgNames.has(o.normalizedName));
// Batch insert repositories to avoid SQLite parameter limit (dynamic by column count)
const sample = newRepos[0];
@@ -140,7 +142,7 @@ export const POST: APIRoute = async ({ request }) => {
await tx
.insert(repositories)
.values(batch)
.onConflictDoNothing({ target: [repositories.userId, repositories.fullName] });
.onConflictDoNothing({ target: [repositories.userId, repositories.normalizedFullName] });
}
}

View File

@@ -1,5 +1,4 @@
import type { APIRoute } from "astro";
import { Octokit } from "@octokit/rest";
import { configs, db, organizations, repositories } from "@/lib/db";
import { and, eq } from "drizzle-orm";
import { jsonResponse, createSecureErrorResponse } from "@/lib/utils";
@@ -15,7 +14,7 @@ import { createGitHubClient } from "@/lib/github";
export const POST: APIRoute = async ({ request }) => {
try {
const body: AddOrganizationApiRequest = await request.json();
const { role, org, userId } = body;
const { role, org, userId, force = false } = body;
if (!org || !userId || !role) {
return jsonResponse({
@@ -24,21 +23,58 @@ export const POST: APIRoute = async ({ request }) => {
});
}
// Check if org already exists
const existingOrg = await db
const trimmedOrg = org.trim();
const normalizedOrg = trimmedOrg.toLowerCase();
// Check if org already exists (case-insensitive)
const [existingOrg] = await db
.select()
.from(organizations)
.where(
and(eq(organizations.name, org), eq(organizations.userId, userId))
);
and(
eq(organizations.userId, userId),
eq(organizations.normalizedName, normalizedOrg)
)
)
.limit(1);
if (existingOrg.length > 0) {
if (existingOrg && !force) {
return jsonResponse({
data: {
success: false,
error: "Organization already exists for this user",
},
status: 400,
status: 409,
});
}
if (existingOrg && force) {
const [updatedOrg] = await db
.update(organizations)
.set({
membershipRole: role,
normalizedName: normalizedOrg,
updatedAt: new Date(),
})
.where(eq(organizations.id, existingOrg.id))
.returning();
const resPayload: AddOrganizationApiResponse = {
success: true,
organization: updatedOrg ?? existingOrg,
message: "Organization already exists; using existing record.",
};
return jsonResponse({ data: resPayload, status: 200 });
}
if (existingOrg) {
return jsonResponse({
data: {
success: false,
error: "Organization already exists for this user",
},
status: 409,
});
}
@@ -71,17 +107,21 @@ export const POST: APIRoute = async ({ request }) => {
// Create authenticated Octokit instance with rate limit tracking
const githubUsername = decryptedConfig.githubConfig?.owner || undefined;
const octokit = createGitHubClient(decryptedConfig.githubConfig.token, userId, githubUsername);
const octokit = createGitHubClient(
decryptedConfig.githubConfig.token,
userId,
githubUsername
);
// Fetch org metadata
const { data: orgData } = await octokit.orgs.get({ org });
const { data: orgData } = await octokit.orgs.get({ org: trimmedOrg });
// Fetch repos based on config settings
const allRepos = [];
// Fetch all repos (public, private, and member) to show in UI
const publicRepos = await octokit.paginate(octokit.repos.listForOrg, {
org,
org: trimmedOrg,
type: "public",
per_page: 100,
});
@@ -89,7 +129,7 @@ export const POST: APIRoute = async ({ request }) => {
// Always fetch private repos to show them in the UI
const privateRepos = await octokit.paginate(octokit.repos.listForOrg, {
org,
org: trimmedOrg,
type: "private",
per_page: 100,
});
@@ -97,7 +137,7 @@ export const POST: APIRoute = async ({ request }) => {
// Also fetch member repos (includes private repos the user has access to)
const memberRepos = await octokit.paginate(octokit.repos.listForOrg, {
org,
org: trimmedOrg,
type: "member",
per_page: 100,
});
@@ -107,38 +147,44 @@ export const POST: APIRoute = async ({ request }) => {
allRepos.push(...uniqueMemberRepos);
// Insert repositories
const repoRecords = allRepos.map((repo) => ({
id: uuidv4(),
userId,
configId,
name: repo.name,
fullName: repo.full_name,
url: repo.html_url,
cloneUrl: repo.clone_url ?? "",
owner: repo.owner.login,
organization:
repo.owner.type === "Organization" ? repo.owner.login : null,
mirroredLocation: "",
destinationOrg: null,
isPrivate: repo.private,
isForked: repo.fork,
forkedFrom: null,
hasIssues: repo.has_issues,
isStarred: false,
isArchived: repo.archived,
size: repo.size,
hasLFS: false,
hasSubmodules: false,
language: repo.language ?? null,
description: repo.description ?? null,
defaultBranch: repo.default_branch ?? "main",
visibility: (repo.visibility ?? "public") as RepositoryVisibility,
status: "imported" as RepoStatus,
lastMirrored: null,
errorMessage: null,
createdAt: repo.created_at ? new Date(repo.created_at) : new Date(),
updatedAt: repo.updated_at ? new Date(repo.updated_at) : new Date(),
}));
const repoRecords = allRepos.map((repo) => {
const normalizedOwner = repo.owner.login.trim().toLowerCase();
const normalizedRepoName = repo.name.trim().toLowerCase();
return {
id: uuidv4(),
userId,
configId,
name: repo.name,
fullName: repo.full_name,
normalizedFullName: `${normalizedOwner}/${normalizedRepoName}`,
url: repo.html_url,
cloneUrl: repo.clone_url ?? "",
owner: repo.owner.login,
organization:
repo.owner.type === "Organization" ? repo.owner.login : null,
mirroredLocation: "",
destinationOrg: null,
isPrivate: repo.private,
isForked: repo.fork,
forkedFrom: null,
hasIssues: repo.has_issues,
isStarred: false,
isArchived: repo.archived,
size: repo.size,
hasLFS: false,
hasSubmodules: false,
language: repo.language ?? null,
description: repo.description ?? null,
defaultBranch: repo.default_branch ?? "main",
visibility: (repo.visibility ?? "public") as RepositoryVisibility,
status: "imported" as RepoStatus,
lastMirrored: null,
errorMessage: null,
createdAt: repo.created_at ? new Date(repo.created_at) : new Date(),
updatedAt: repo.updated_at ? new Date(repo.updated_at) : new Date(),
};
});
// Batch insert repositories to avoid SQLite parameter limit
// Compute batch size based on column count
@@ -150,7 +196,7 @@ export const POST: APIRoute = async ({ request }) => {
await db
.insert(repositories)
.values(batch)
.onConflictDoNothing({ target: [repositories.userId, repositories.fullName] });
.onConflictDoNothing({ target: [repositories.userId, repositories.normalizedFullName] });
}
// Insert organization metadata
@@ -159,6 +205,7 @@ export const POST: APIRoute = async ({ request }) => {
userId,
configId,
name: orgData.login,
normalizedName: normalizedOrg,
avatarUrl: orgData.avatar_url,
membershipRole: role,
isIncluded: false,

View File

@@ -15,7 +15,7 @@ import { createMirrorJob } from "@/lib/helpers";
export const POST: APIRoute = async ({ request }) => {
try {
const body: AddRepositoriesApiRequest = await request.json();
const { owner, repo, userId } = body;
const { owner, repo, userId, force = false } = body;
if (!owner || !repo || !userId) {
return new Response(
@@ -27,26 +27,43 @@ export const POST: APIRoute = async ({ request }) => {
);
}
const trimmedOwner = owner.trim();
const trimmedRepo = repo.trim();
if (!trimmedOwner || !trimmedRepo) {
return jsonResponse({
data: {
success: false,
error: "Missing owner, repo, or userId",
},
status: 400,
});
}
const normalizedOwner = trimmedOwner.toLowerCase();
const normalizedRepo = trimmedRepo.toLowerCase();
const normalizedFullName = `${normalizedOwner}/${normalizedRepo}`;
// Check if repository with the same owner, name, and userId already exists
const existingRepo = await db
const [existingRepo] = await db
.select()
.from(repositories)
.where(
and(
eq(repositories.owner, owner),
eq(repositories.name, repo),
eq(repositories.userId, userId)
eq(repositories.userId, userId),
eq(repositories.normalizedFullName, normalizedFullName)
)
);
)
.limit(1);
if (existingRepo.length > 0) {
if (existingRepo && !force) {
return jsonResponse({
data: {
success: false,
error:
"Repository with this name and owner already exists for this user",
},
status: 400,
status: 409,
});
}
@@ -68,14 +85,17 @@ export const POST: APIRoute = async ({ request }) => {
const octokit = new Octokit(); // No auth for public repos
const { data: repoData } = await octokit.rest.repos.get({ owner, repo });
const { data: repoData } = await octokit.rest.repos.get({
owner: trimmedOwner,
repo: trimmedRepo,
});
const metadata = {
id: uuidv4(),
const baseMetadata = {
userId,
configId,
name: repoData.name,
fullName: repoData.full_name,
normalizedFullName,
url: repoData.html_url,
cloneUrl: repoData.clone_url,
owner: repoData.owner.login,
@@ -94,6 +114,37 @@ export const POST: APIRoute = async ({ request }) => {
description: repoData.description ?? null,
defaultBranch: repoData.default_branch,
visibility: (repoData.visibility ?? "public") as RepositoryVisibility,
lastMirrored: existingRepo?.lastMirrored ?? null,
errorMessage: existingRepo?.errorMessage ?? null,
mirroredLocation: existingRepo?.mirroredLocation ?? "",
destinationOrg: existingRepo?.destinationOrg ?? null,
updatedAt: repoData.updated_at
? new Date(repoData.updated_at)
: new Date(),
};
if (existingRepo && force) {
const [updatedRepo] = await db
.update(repositories)
.set({
...baseMetadata,
normalizedFullName,
configId,
})
.where(eq(repositories.id, existingRepo.id))
.returning();
const resPayload: AddRepositoriesApiResponse = {
success: true,
repository: updatedRepo ?? existingRepo,
message: "Repository already exists; metadata refreshed.",
};
return jsonResponse({ data: resPayload, status: 200 });
}
const metadata = {
id: uuidv4(),
status: "imported" as Repository["status"],
lastMirrored: null,
errorMessage: null,
@@ -102,15 +153,13 @@ export const POST: APIRoute = async ({ request }) => {
createdAt: repoData.created_at
? new Date(repoData.created_at)
: new Date(),
updatedAt: repoData.updated_at
? new Date(repoData.updated_at)
: new Date(),
};
...baseMetadata,
} satisfies Repository;
await db
.insert(repositories)
.values(metadata)
.onConflictDoNothing({ target: [repositories.userId, repositories.fullName] });
.onConflictDoNothing({ target: [repositories.userId, repositories.normalizedFullName] });
createMirrorJob({
userId,

View File

@@ -81,11 +81,12 @@ export interface AddRepositoriesApiRequest {
userId: string;
repo: string;
owner: string;
force?: boolean;
}
export interface AddRepositoriesApiResponse {
success: boolean;
message: string;
repository: Repository;
repository?: Repository;
error?: string;
}

View File

@@ -13,6 +13,8 @@ export interface GiteaConfig {
preserveOrgStructure: boolean;
mirrorStrategy?: MirrorStrategy; // New field for the strategy
personalReposOrg?: string; // Override destination for personal repos
issueConcurrency?: number;
pullRequestConcurrency?: number;
}
export interface ScheduleConfig {

View File

@@ -45,11 +45,12 @@ export interface AddOrganizationApiRequest {
userId: string;
org: string;
role: MembershipRole;
force?: boolean;
}
export interface AddOrganizationApiResponse {
success: boolean;
message: string;
organization: Organization;
organization?: Organization;
error?: string;
}

View File

@@ -1,13 +1,30 @@
# Astro with Tailwind
# Gitea Mirror Marketing Site
```sh
bun create astro@latest -- --template with-tailwindcss
This Astro workspace powers the public marketing experience for Gitea Mirror. It includes the landing page, screenshots, call-to-action components, and the new use case library that highlights real-world workflows.
## Developing Locally
```bash
bun install
bun run dev
```
[![Open in StackBlitz](https://developer.stackblitz.com/img/open_in_stackblitz.svg)](https://stackblitz.com/github/withastro/astro/tree/latest/examples/with-tailwindcss)
[![Open with CodeSandbox](https://assets.codesandbox.io/github/button-edit-lime.svg)](https://codesandbox.io/p/sandbox/github/withastro/astro/tree/latest/examples/with-tailwindcss)
[![Open in GitHub Codespaces](https://github.com/codespaces/badge.svg)](https://codespaces.new/withastro/astro?devcontainer_path=.devcontainer/with-tailwindcss/devcontainer.json)
The site is available at `http://localhost:4321`. Tailwind CSS v4 handles styling; classes can be used directly inside Astro, MDX, and React components.
Astro comes with [Tailwind](https://tailwindcss.com) support out of the box. This example showcases how to style your Astro project with Tailwind.
## Project Structure
For complete setup instructions, please see our [Tailwind Integration Guide](https://docs.astro.build/en/guides/integrations-guide/tailwind).
- `src/pages/index.astro`: Main landing page
- `src/components/`: Reusable UI (Header, Hero, Features, UseCases, etc.)
- `src/lib/use-cases.ts`: Central data source for use case titles, summaries, and tags
- `src/pages/use-cases/`: MDX guides for each use case, rendered with `UseCaseLayout`
- `src/layouts/UseCaseLayout.astro`: Shared layout that injects the header, shader background, and footer into MDX guides
## Authoring Use Case Guides
1. Add or update a record in `src/lib/use-cases.ts` (a sketch of a new record is shown after this list). This keeps the landing page and library listing in sync.
2. Create a new MDX file in `src/pages/use-cases/<slug>.mdx` with the `UseCaseLayout` layout and descriptive frontmatter.
3. Run `bun run dev` to preview the layout and ensure the new guide inherits global styles.
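As referenced in step 1, a minimal sketch of what a new use case record might look like; the field names (`slug`, `title`, `summary`, `tags`) are assumptions inferred from the description above, so match them to the existing entries in `src/lib/use-cases.ts`:

```ts
// src/lib/use-cases.ts (hypothetical shape; align with the existing entries)
export interface UseCase {
  slug: string; // becomes the MDX filename: src/pages/use-cases/<slug>.mdx
  title: string;
  summary: string;
  tags: string[];
}

export const useCases: UseCase[] = [
  // ...existing entries
  {
    slug: "backup-github-repositories",
    title: "Backup GitHub Repositories",
    summary: "Schedule automated GitHub-to-Gitea backups with Gitea Mirror.",
    tags: ["backup", "automation"],
  },
];
```

Keeping the record's slug and the MDX filename in sync is what ties the landing page listing to the guide itself.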
## Deployment
The marketing site is built with the standard Astro pipeline. Use `bun run build` to generate a production build before deploying.

View File

@@ -1,8 +1,8 @@
// @ts-check
import { defineConfig } from 'astro/config';
import tailwindcss from '@tailwindcss/vite';
import react from '@astrojs/react';
import mdx from '@astrojs/mdx';
// https://astro.build/config
export default defineConfig({
@@ -10,5 +10,5 @@ export default defineConfig({
plugins: [tailwindcss()]
},
integrations: [react()]
});
integrations: [react(), mdx()]
});

1037
www/bun.lock Normal file

File diff suppressed because it is too large

View File

@@ -1,236 +1,628 @@
# SEO Keywords & Content Strategy for Gitea Mirror
# SEO Keywords & Programmatic Content Strategy for Gitea Mirror
## Target Audience & Pain Points
### Primary Audience
- DevOps engineers managing GitHub repositories
- Companies looking to backup GitHub data
- Self-hosting enthusiasts
- Organizations migrating from GitHub to self-hosted solutions
- Developers needing GitHub disaster recovery
### Key Pain Points
- Manual GitHub to Gitea migration is time-consuming
- No automated backup solution for GitHub organizations
- Difficulty preserving repository structure during migration
- Need for scheduled, automatic synchronization
- Complex authentication setup for self-hosted Git services
## Keyword Categories & Opportunities
### 1. Problem-Solving Keywords (High Intent)
- **"github to gitea migration"** - Core functionality keyword
- **"mirror github repository to gitea"** - Direct search intent
- **"sync github gitea automatically"** - Automation focus
- **"backup github to self hosted"** - Backup use case
- **"github organization mirror tool"** - Organization-specific
- **"gitea import from github"** - Alternative phrasing
- **"migrate starred github repos"** - Specific feature
### 2. Comparison & Alternative Keywords
- **"github vs gitea migration"** - Comparison content
- **"gitea mirror alternatives"** - Competitor analysis
- **"self hosted github backup solutions"** - Solution category
- **"github repository sync tools"** - Tool category
- **"gitea github integration"** - Integration focus
- **"github backup automation"** - Automation emphasis
### 3. How-To & Tutorial Keywords
- **"how to mirror github to gitea"** - Tutorial intent
- **"setup gitea mirror docker"** - Installation guide
- **"gitea github sync tutorial"** - Step-by-step content
- **"automate github backup gitea"** - Automation tutorial
- **"mirror private github repos gitea"** - Private repos guide
- **"gitea import github wiki"** - Feature-specific tutorial
### 4. Feature-Specific Keywords
- **"gitea sso authentication setup"** - Auth feature
- **"gitea oidc provider configuration"** - OIDC setup
- **"gitea better auth integration"** - Specific tech stack
- **"gitea scheduled mirror"** - Scheduling feature
- **"gitea bulk repository import"** - Bulk operations
- **"gitea preserve organization structure"** - Organization feature
### 5. Platform & Deployment Keywords
- **"gitea mirror proxmox"** - Platform-specific
- **"gitea mirror docker compose"** - Docker deployment
- **"gitea mirror arm64"** - Architecture-specific
- **"gitea mirror reverse proxy"** - Infrastructure setup
- **"gitea authentik integration"** - Auth provider integration
### 6. Use Case Keywords
- **"self host github backup"** - Backup use case
- **"enterprise github migration gitea"** - Enterprise focus
- **"github disaster recovery gitea"** - DR use case
- **"github archive self hosted"** - Archival use case
- **"github organization backup automation"** - Org backup
### 7. Long-Tail Problem Keywords
- **"mirror github issues to gitea"** - Specific feature
- **"sync github releases gitea automatically"** - Release sync
- **"gitea mirror multiple organizations"** - Multi-org
- **"github starred repositories backup"** - Starred repos
- **"gitea mirror skip forks"** - Fork handling
### 8. Technical Integration Keywords
- **"gitea github api integration"** - API focus
- **"gitea webhook github sync"** - Webhook integration
- **"gitea ci/cd github mirror"** - CI/CD integration
- **"gitea github actions migration"** - Actions migration
## Blog Post Ideas & Content Strategy
### High-Priority Blog Posts
1. **"Complete Guide to Migrating from GitHub to Gitea in 2025"**
- **Target Keywords**: github to gitea migration, gitea import from github
- **Content**: Comprehensive migration guide with screenshots
- **Length**: 2,500-3,000 words
- **Include**: Step-by-step instructions, troubleshooting, best practices
2. **"How to Automatically Backup Your GitHub Repositories to Self-Hosted Gitea"**
- **Target Keywords**: backup github to self hosted, github backup automation
- **Content**: Focus on automation and scheduling features
- **Length**: 1,800-2,200 words
- **Include**: Docker setup, cron scheduling, backup strategies
3. **"Gitea Mirror vs Manual Migration: Which GitHub Migration Method is Best?"**
- **Target Keywords**: gitea mirror alternatives, github repository sync tools
- **Content**: Comparison post with pros/cons, feature matrix
- **Length**: 1,500-2,000 words
- **Include**: Comparison table, use case recommendations
4. **"Setting Up Enterprise GitHub Backup with Gitea Mirror and Docker"**
- **Target Keywords**: enterprise github migration gitea, github organization backup automation
- **Content**: Enterprise-focused guide with security considerations
- **Length**: 2,000-2,500 words
- **Include**: Multi-user setup, permission management, scaling
5. **"Mirror GitHub Organizations to Gitea While Preserving Structure"**
- **Target Keywords**: github organization mirror tool, gitea preserve organization structure
- **Content**: Deep dive into organization mirroring strategies
- **Length**: 1,500-1,800 words
- **Include**: Strategy explanations, configuration examples
6. **"Gitea SSO Setup: Complete Authentication Guide with Examples"**
- **Target Keywords**: gitea sso authentication setup, gitea oidc provider configuration
- **Content**: Cover all auth methods including header auth
- **Length**: 2,000-2,500 words
- **Include**: Provider examples (Google, Azure, Authentik)
7. **"How to Mirror Private GitHub Repositories to Your Gitea Instance"**
- **Target Keywords**: mirror private github repos gitea, gitea github api integration
- **Content**: Security-focused content with token management
- **Length**: 1,500-1,800 words
- **Include**: Token permissions, security best practices
8. **"Gitea Mirror on Proxmox: Ultimate Self-Hosting Guide"**
- **Target Keywords**: gitea mirror proxmox, self host github backup
- **Content**: LXC container setup tutorial
- **Length**: 1,800-2,200 words
- **Include**: Proxmox setup, resource allocation, networking
## Landing Page Optimization
### Title Tag Options
- "Gitea Mirror - Automated GitHub to Gitea Migration & Backup Tool"
- "GitHub to Gitea Mirror - Sync, Backup & Migrate Repositories Automatically"
- "Gitea Mirror - Self-Hosted GitHub Repository Backup & Sync Solution"
### Meta Description Options
- "Automatically mirror and backup your GitHub repositories to self-hosted Gitea. Support for organizations, private repos, scheduled sync, and SSO authentication. Docker & Proxmox ready."
- "The easiest way to migrate from GitHub to Gitea. Mirror repositories, organizations, issues, and releases automatically. Self-hosted backup solution with enterprise features."
### H1 Options
- "Automatically Mirror GitHub Repositories to Your Gitea Instance"
- "Self-Hosted GitHub Backup & Migration Tool for Gitea"
- "The Complete GitHub to Gitea Migration Solution"
### Key Landing Page Sections to Optimize
1. **Hero Section**
- Include primary keywords naturally
- Clear value proposition
- Quick start CTA
2. **Features Section**
- Target feature-specific keywords
- Use semantic variations
- Include comparison points
3. **Use Cases Section**
- Target use case keywords
- Include customer scenarios
- Enterprise focus subsection
4. **Installation Section**
- Target platform keywords
- Docker, Proxmox, manual options
- Quick start emphasis
5. **FAQ Section**
- Target long-tail keywords
- Common migration questions
- Technical integration queries
## Content Calendar Suggestions
### Month 1
- Week 1-2: "Complete Guide to Migrating from GitHub to Gitea"
- Week 3-4: "How to Automatically Backup Your GitHub Repositories"
### Month 2
- Week 1-2: "Gitea Mirror vs Manual Migration"
- Week 3-4: "Enterprise GitHub Backup Guide"
### Month 3
- Week 1-2: "Mirror GitHub Organizations Guide"
- Week 3-4: "Gitea SSO Setup Guide"
### Month 4
- Week 1-2: "Private Repository Mirroring"
- Week 3-4: "Gitea Mirror on Proxmox"
## SEO Research Tips
### When Using Ahrefs
1. **Search Volume**: Target 100-1,000 monthly searches initially
2. **Keyword Difficulty**: Aim for KD < 30 for new content
3. **SERP Analysis**: Check competitor content depth
4. **Parent Topics**: Find broader topics to target
5. **Featured Snippets**: Look for snippet opportunities
### Content Optimization
1. Include target keyword in:
- Title tag
- H1 (once)
- First 100 words
- At least one H2
- URL slug
- Meta description
2. Use semantic variations throughout
3. Include related keywords naturally
4. Optimize for search intent
5. Add schema markup for tutorials
## Tracking & Updates
### KPIs to Monitor
- Organic traffic growth
- Keyword rankings
- Click-through rates
- Conversion rates (signups/downloads)
- Time on page
### Regular Updates
- Review keyword performance monthly
- Update content quarterly
- Add new keywords based on search console data
- Monitor competitor content
- Track feature releases for new keyword opportunities
> **Goal**: Generate 5,000-15,000 organic visits/month within 6-12 months
> **Strategy**: Low-effort, high-intent pages targeting long-tail keywords
> **Focus**: Problem-solving content over generic tool descriptions
---
*Last Updated: [Current Date]*
*Next Review: [Date + 3 months]*
## 🎯 LOW-HANGING FRUIT: Quick Wins (Start This Week)
### Tier 1: Ultra Low-Effort, High-Intent Pages (1-2 hours each)
These are **simple template pages** with **minimal content** but **high search volume** and **buyer intent**.
| Page | Keyword | Monthly Searches | Difficulty | Effort | Priority |
|------|---------|-----------------|------------|--------|----------|
| `/use-cases/backup-github-repositories` | "backup github repositories" | 500-1K | Low (15) | 1h | ⭐⭐⭐⭐⭐ |
| `/use-cases/migrate-github-to-gitea` | "migrate github to gitea" | 300-800 | Low (10) | 1h | ⭐⭐⭐⭐⭐ |
| `/solutions/github-disaster-recovery` | "github disaster recovery" | 200-500 | Low (12) | 1h | ⭐⭐⭐⭐⭐ |
| `/vs/manual-vs-automated-github-migration` | "automated github migration" | 150-400 | Very Low (8) | 1.5h | ⭐⭐⭐⭐ |
| `/guides/setup-gitea-mirror-docker` | "gitea mirror docker setup" | 100-300 | Very Low (5) | 2h | ⭐⭐⭐⭐ |
**Why these work:**
- Specific, actionable queries ("how to backup", "migrate to")
- Low competition (KD < 15)
- High commercial intent (ready to install)
- Can reuse existing docs content
**Template for these pages:** 400-600 words, 30 minutes to write each
---
## 📊 KEYWORD STRATEGY: 3-Tier Approach
### Tier 1: Problem-Solving Keywords (HIGHEST PRIORITY)
**Intent**: "I have this specific problem"
**Effort**: Low (template-based)
**Pages needed**: 15
| Primary Keyword | Secondary Keywords | Est. Traffic | Page URL |
|----------------|-------------------|--------------|----------|
| backup github repositories | github backup tool, automated github backup | 500/mo | `/use-cases/backup-github-repositories` |
| migrate github to gitea | github gitea migration, import github to gitea | 400/mo | `/use-cases/migrate-github-to-gitea` |
| github disaster recovery | backup github organization, github downtime backup | 250/mo | `/solutions/github-disaster-recovery` |
| sync github to self-hosted | self-hosted github alternative, github to gitea sync | 200/mo | `/use-cases/sync-github-to-self-hosted-gitea` |
| preserve github history | github history backup, archive github repos | 180/mo | `/use-cases/preserve-github-history` |
| github vendor lock-in | avoid github lock-in, github alternatives | 150/mo | `/solutions/avoid-vendor-lock-in` |
| github backup automation | automate github mirror, scheduled github backup | 140/mo | `/use-cases/github-backup-automation` |
| mirror starred repositories | backup starred repos, export github stars | 120/mo | `/use-cases/starred-repos-collection` |
| github offline access | offline git mirror, air-gapped github | 100/mo | `/solutions/need-offline-git-access` |
| github rate limits | bypass github api limits, github api alternatives | 90/mo | `/solutions/github-rate-limits` |
**Total Tier 1 Traffic Potential**: ~2,500 visits/month
---
### Tier 2: Feature-Specific Keywords (MEDIUM PRIORITY)
**Intent**: "I want to do this specific thing"
**Effort**: Medium (requires explaining features)
**Pages needed**: 12
| Primary Keyword | Est. Traffic | Page URL |
|----------------|--------------|----------|
| mirror github issues | 80/mo | `/features/github-issues-migration` |
| sync github releases | 70/mo | `/features/github-releases-sync` |
| mirror github wiki | 60/mo | `/features/wiki-migration` |
| preserve github organization structure | 50/mo | `/features/organization-structure-preservation` |
| mirror private github repos | 180/mo | `/features/private-repository-mirroring` |
| github metadata migration | 45/mo | `/features/metadata-migration` |
| scheduled github sync | 120/mo | `/features/scheduled-synchronization` |
| batch github migration | 40/mo | `/features/batch-repository-processing` |
| github pull request migration | 35/mo | `/features/pull-request-mirroring` |
| git lfs mirror | 30/mo | `/features/git-lfs-support` |
**Total Tier 2 Traffic Potential**: ~1,200 visits/month
---
### Tier 3: Comparison Keywords (HIGH CONVERSION)
**Intent**: "Evaluating options"
**Effort**: Medium-High (research required)
**Pages needed**: 8
| Primary Keyword | Est. Traffic | Conversion Potential | Page URL |
|----------------|--------------|---------------------|----------|
| github backup tools comparison | 250/mo | Very High | `/vs/github-backup-solutions` |
| gitea vs github | 800/mo | Medium | `/vs/github-vs-gitea` |
| manual vs automated migration | 60/mo | High | `/vs/manual-vs-automated-migration` |
| git clone vs mirror | 45/mo | Medium | `/vs/git-clone-vs-automated-sync` |
| gitea alternatives | 150/mo | Medium | `/alternatives` |
| self-hosted git servers | 400/mo | Low | `/vs/self-hosted-vs-cloud-git` |
**Total Tier 3 Traffic Potential**: ~1,700 visits/month
---
## 🚀 IMPLEMENTATION ROADMAP: 4-Week Sprint
### Week 1: Foundation (5 pages)
**Goal**: Get first pages indexed, establish content structure
**Day 1-2: Setup** (4 hours)
- [ ] Create Astro content collections (`src/content/config.ts`)
- [ ] Build page templates (use-cases, features, solutions)
- [ ] Setup SEO component with structured data
- [ ] Create sitemap generator (see the config sketch below)
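A minimal sketch of the sitemap piece, assuming `@astrojs/sitemap` is added as a dependency; the `site` URL is a placeholder, and the same two lines would go into the existing `astro.config.mjs`:

```ts
// Astro config sketch: `site` is required for sitemap generation.
import { defineConfig } from "astro/config";
import sitemap from "@astrojs/sitemap";

export default defineConfig({
  site: "https://example.com", // replace with the production marketing domain
  integrations: [sitemap()],
});
```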
**Day 3-5: Core Content** (8 hours)
- [ ] `/use-cases/backup-github-repositories` - 600 words
- [ ] `/use-cases/migrate-github-to-gitea` - 600 words
- [ ] `/solutions/github-disaster-recovery` - 500 words
- [ ] `/features/automatic-github-mirroring` - 700 words
- [ ] `/vs/manual-vs-automated-migration` - 800 words
**Day 6-7: Technical Setup** (3 hours)
- [ ] Submit sitemap to Google Search Console
- [ ] Setup Google Analytics 4
- [ ] Add schema.org markup
- [ ] Create robots.txt
- [ ] Setup canonical URLs
**Week 1 Target**: 5 pages live, indexed by Google
---
### Week 2: Scale Content (10 pages)
**Goal**: Batch create similar pages using templates
**Use Case Pages** (5 pages, 1 hour each):
- [ ] `/use-cases/sync-github-to-self-hosted-gitea`
- [ ] `/use-cases/preserve-github-history`
- [ ] `/use-cases/github-backup-automation`
- [ ] `/use-cases/starred-repos-collection`
- [ ] `/use-cases/vendor-lock-in-prevention`
**Feature Pages** (5 pages, 1.5 hours each):
- [ ] `/features/private-repository-mirroring`
- [ ] `/features/scheduled-synchronization`
- [ ] `/features/github-issues-migration`
- [ ] `/features/github-releases-sync`
- [ ] `/features/metadata-migration`
**Week 2 Target**: 15 total pages, monitor first impressions in GSC
---
### Week 3: Problem-Solution Focus (8 pages)
**Goal**: Target high-intent problem queries
**Solution Pages** (6 pages, 45 min each):
- [ ] `/solutions/avoid-vendor-lock-in`
- [ ] `/solutions/need-offline-git-access`
- [ ] `/solutions/github-rate-limits`
- [ ] `/solutions/github-pricing-too-expensive`
- [ ] `/solutions/comply-with-data-regulations`
- [ ] `/solutions/preserve-deleted-github-repos`
**Guide Pages** (2 pages, 2 hours each):
- [ ] `/guides/setup-gitea-mirror-docker`
- [ ] `/guides/migrate-github-organization-to-gitea`
**Week 3 Target**: 23 total pages, start seeing traffic
---
### Week 4: Comparison & Polish (7 pages + optimization)
**Goal**: High-conversion comparison content + optimization
**Comparison Pages** (4 pages, 2 hours each):
- [ ] `/vs/github-backup-solutions`
- [ ] `/vs/github-vs-gitea`
- [ ] `/vs/self-hosted-vs-cloud-git`
- [ ] `/alternatives`
**Integration Pages** (3 pages, 1 hour each):
- [ ] `/integrations/docker-compose`
- [ ] `/integrations/kubernetes`
- [ ] `/integrations/helm-charts`
**Optimization** (8 hours):
- [ ] Add internal linking between all pages
- [ ] Optimize images (WebP, alt text)
- [ ] Add FAQ sections to top 10 pages
- [ ] Create content calendar for Month 2
**Week 4 Target**: 30 total pages, 50-100 visitors/week
---
## 📝 CONTENT TEMPLATES
### Template 1: Use Case Page (400-600 words, 30 min)
```markdown
# [Use Case Title] - Gitea Mirror
> **In this guide**: Learn how to [solve specific problem] using Gitea Mirror's automated [feature].
## The Problem
[2-3 sentences describing the pain point]
**Common challenges:**
- Challenge 1
- Challenge 2
- Challenge 3
## How Gitea Mirror Solves This
[3-4 sentences explaining the solution]
**Key capabilities:**
- ✅ Capability 1
- ✅ Capability 2
- ✅ Capability 3
## Quick Start (5 Minutes)
\`\`\`bash
# Step 1: Pull the Docker image
docker pull giteamirror/gitea-mirror:latest
# Step 2: Run with environment variables
docker run -d \\
-e GITHUB_TOKEN=your_token \\
-e GITEA_URL=https://gitea.example.com \\
giteamirror/gitea-mirror
\`\`\`
[2 sentences on what happens next]
## Real-World Example
[Short scenario: "A DevOps team needed to..."]
## Related Features
- [Link to feature 1]
- [Link to feature 2]
## Get Started
[CTA button/link to GitHub repo]
---
**Keywords**: [primary], [secondary], [tertiary]
**Last Updated**: [Date]
```
**Why this works:**
- Answers search query immediately
- Shows code (high engagement)
- Internal links (SEO juice)
- Clear CTA
- **Total time: 30 minutes**
---
### Template 2: Feature Page (500-700 words, 45 min)
```markdown
# [Feature Name] - Gitea Mirror
> Automatically [feature benefit] from GitHub to Gitea with zero manual work.
## What Is [Feature Name]?
[2-3 sentences explaining the feature]
## Why You Need This
**Without Gitea Mirror:**
- ❌ Manual problem 1
- ❌ Manual problem 2
- ❌ Manual problem 3
**With Gitea Mirror:**
- ✅ Automated solution 1
- ✅ Automated solution 2
- ✅ Automated solution 3
## How It Works
1. **Step 1**: [Action]
2. **Step 2**: [Action]
3. **Step 3**: [Result]
## Configuration
\`\`\`yaml
# Example configuration
feature_enabled: true
option1: value
option2: value
\`\`\`
## Use Cases
### Use Case 1
[Scenario where this feature helps]
### Use Case 2
[Another scenario]
## Best Practices
- Tip 1
- Tip 2
- Tip 3
## See It In Action
[Screenshot or GIF]
## Get Started
[CTA]
---
**Related**:
- [Use case page]
- [Guide page]
```
---
### Template 3: Solution Page (300-500 words, 20 min)
```markdown
# [Problem Statement] - Solved
> **The Problem**: [One sentence problem]
> **The Solution**: Gitea Mirror's automated [approach]
## Why This Problem Matters
[2 sentences on impact]
**Consequences of not solving:**
1. Consequence 1
2. Consequence 2
3. Consequence 3
## How Gitea Mirror Fixes This
[Explain the solution in 3-4 sentences]
## Implementation
\`\`\`bash
# 2-3 line code snippet
\`\`\`
## Success Story
"[Quote or short anecdote]"
## Next Steps
1. [Link to getting started]
2. [Link to relevant feature]
[CTA button]
```
**Total time: 20 minutes**
---
## 🎨 SEO OPTIMIZATION CHECKLIST
### On-Page SEO (Per Page)
```
✅ Title tag: [Keyword] - Gitea Mirror (50-60 chars)
✅ Meta description with CTA (150-160 chars)
✅ H1 includes primary keyword
✅ URL slug = primary keyword
✅ First paragraph mentions keyword
✅ H2s include semantic variations
✅ Image alt text descriptive
✅ Internal links (3-5 per page)
✅ External links (1-2 authoritative sources)
✅ Schema.org markup (SoftwareApplication)
✅ Canonical URL set
✅ Mobile responsive
✅ Page speed < 3s
```
### Content Quality Checks
```
✅ Answers search intent completely
✅ 400-1500 word count (based on competition)
✅ Code examples where relevant
✅ Screenshots/visuals
✅ Updated date visible
✅ Clear CTA
✅ Related content links
✅ No keyword stuffing (1-2% density)
```
---
## 📈 TRACKING & METRICS
### Week 1-2 KPIs
- [ ] All pages indexed in Google (check GSC)
- [ ] 0 technical SEO errors (Screaming Frog)
- [ ] < 3s page load time
- [ ] Mobile usability 100/100
### Week 3-4 KPIs
- [ ] 10+ impressions/day in GSC
- [ ] 3+ clicks/day from organic
- [ ] 1+ page ranking in top 50
### Month 2 Goals
- [ ] 100+ impressions/day
- [ ] 20+ clicks/day
- [ ] 10+ keywords in top 50
- [ ] 5+ keywords in top 20
### Month 3 Goals
- [ ] 500+ impressions/day
- [ ] 50+ clicks/day
- [ ] 20+ keywords in top 20
- [ ] 10+ keywords in top 10
---
## 🔗 INTERNAL LINKING STRATEGY
**Hub & Spoke Model**
### Hub Pages (Link FROM these everywhere)
1. Homepage
2. `/use-cases/migrate-github-to-gitea` (main use case)
3. `/features/automatic-github-mirroring` (main feature)
### Spoke Pages (Link TO hubs + related spokes)
- Use case pages link to: Related features, guides, solutions
- Feature pages link to: Use cases, guides
- Solution pages link to: Use cases, features
- Guide pages link to: Features, use cases
**Example**:
```
/use-cases/backup-github-repositories
→ Links to:
- /features/scheduled-synchronization
- /features/automatic-github-mirroring
- /guides/setup-gitea-mirror-docker
- /solutions/github-disaster-recovery
```
---
## 💡 CONTENT HACKS: Work Smarter
### 1. Batch Similar Pages (2x faster)
Write all "use case" pages in one session using the template. Copy structure, change specifics.
### 2. Reuse Existing Content
- Main repo README → Use case pages
- Docker docs → Guide pages
- GitHub issues → Problem pages
### 3. AI-Assisted Expansion
- Write 200-word outline manually
- Expand with AI to 600 words
- Edit for accuracy (10 min)
- **Time saved: 50%**
### 4. Screenshot Once, Use Everywhere
Create a `/public/screenshots/` library:
- Dashboard view
- Configuration screen
- Migration in progress
- Results page
Reuse across all pages.
### 5. Schema Markup Template
Create one JSON-LD template, reuse with variable substitution:
```json
{
"@context": "https://schema.org",
"@type": "SoftwareApplication",
"name": "Gitea Mirror",
"description": "[PAGE_DESCRIPTION]",
"url": "[PAGE_URL]"
}
```
---
## 🎯 MONTH 2-3 EXPANSION PLAN
### Month 2: Depth Over Breadth
**Goal**: Make existing pages rank higher
**Activities**:
- [ ] Add 200 words to each existing page
- [ ] Add FAQ sections (5 Q&As per page)
- [ ] Create 10 more guide pages (tutorials)
- [ ] Add video embeds (YouTube shorts)
- [ ] Guest post on Dev.to (backlinks)
**New Pages** (10):
- 5 more use case pages
- 5 advanced guides
### Month 3: Authority Building
**Goal**: Establish Gitea Mirror as THE GitHub migration resource
**Activities**:
- [ ] Ultimate Guide: "Complete GitHub to Gitea Migration Guide" (3,000 words)
- [ ] Comparison matrix: All GitHub backup tools
- [ ] Interactive tool: "Migration time calculator"
- [ ] Video tutorials (5-10 minutes each)
- [ ] Community: Add testimonials/case studies
**New Pages** (15):
- 5 integration pages
- 5 technical spec pages
- 5 advanced solution pages
---
## 🏆 SUCCESS METRICS (6 Months)
### Conservative Target
- **Pages**: 50 indexed
- **Traffic**: 5,000 visits/month
- **Keywords**: 30 in top 20
- **Backlinks**: 15-20
- **GitHub Stars**: +50 from organic
### Optimistic Target
- **Pages**: 80 indexed
- **Traffic**: 12,000 visits/month
- **Keywords**: 50 in top 20, 20 in top 10
- **Backlinks**: 40-50
- **GitHub Stars**: +200 from organic
---
## 🔧 TECHNICAL SETUP (Do Once)
### Astro Content Collections
```typescript
// src/content/config.ts
import { defineCollection, z } from 'astro:content';

const useCases = defineCollection({
  type: 'content',
  schema: z.object({
    title: z.string(),
    description: z.string(),
    keywords: z.array(z.string()),
    problem: z.string(),
    solution: z.string(),
    difficulty: z.enum(['beginner', 'intermediate', 'advanced']),
    timeToRead: z.number(),
    relatedPages: z.array(z.string()).optional(),
  }),
});

export const collections = {
  'use-cases': useCases,
  'features': defineCollection({ /* ... */ }),
  'guides': defineCollection({ /* ... */ }),
  'solutions': defineCollection({ /* ... */ }),
  'vs': defineCollection({ /* ... */ }),
};
```
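For reference, frontmatter that satisfies this schema might look like the following, expressed here as a TypeScript object (in the actual `.md`/`.mdx` file it would be YAML frontmatter; all strings are illustrative placeholders, not final page copy):
```typescript
// Illustrative frontmatter shape validated by the use-cases schema above.
export const exampleUseCaseFrontmatter = {
  title: 'Backup GitHub Repositories to Gitea',
  description: 'Keep a self-hosted, automatically synced backup of every GitHub repository.',
  keywords: ['backup github repositories', 'github backup'],
  problem: 'GitHub is a single point of failure for source code.',
  solution: 'Mirror every repository to a self-hosted Gitea instance on a schedule.',
  difficulty: 'beginner' as const,
  timeToRead: 5,
  relatedPages: ['setup-gitea-mirror-docker'],
};
```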
### Dynamic Route Template
```astro
---
// src/pages/use-cases/[...slug].astro
import { getCollection } from 'astro:content';
import Layout from '../../layouts/Layout.astro'; // adjust to your layout's actual path

export async function getStaticPaths() {
  const useCases = await getCollection('use-cases');
  return useCases.map(entry => ({
    params: { slug: entry.slug },
    props: { entry },
  }));
}

const { entry } = Astro.props;
const { Content } = await entry.render();
---
<Layout title={entry.data.title} description={entry.data.description}>
  <article>
    <h1>{entry.data.title}</h1>
    <Content />
  </article>
</Layout>
```
---
## 📋 QUICK ACTION CHECKLIST
**Today:**
- [ ] Create content collections structure
- [ ] Write first use case page (1 hour)
- [ ] Set up Google Search Console
**This Week:**
- [ ] Complete 5 high-priority pages
- [ ] Submit sitemap
- [ ] Add schema markup
**This Month:**
- [ ] 30 pages live
- [ ] Internal linking complete
- [ ] First organic traffic
---
**Last Updated**: January 2025
**Next Review**: February 2025
**Owner**: [Your Team]

www/package.json

@@ -1,7 +1,7 @@
 {
   "name": "www",
   "type": "module",
-  "version": "1.0.1",
+  "version": "1.1.0",
   "scripts": {
     "dev": "astro dev",
     "build": "astro build",
@@ -9,28 +9,28 @@
     "astro": "astro"
   },
   "dependencies": {
-    "@astrojs/mdx": "^4.3.6",
+    "@astrojs/mdx": "^4.3.7",
     "@astrojs/react": "^4.4.0",
     "@radix-ui/react-icons": "^1.3.2",
     "@radix-ui/react-slot": "^1.2.3",
     "@splinetool/react-spline": "^4.1.0",
-    "@splinetool/runtime": "^1.10.73",
-    "@tailwindcss/vite": "^4.1.14",
+    "@splinetool/runtime": "^1.10.85",
+    "@tailwindcss/vite": "^4.1.15",
     "@types/canvas-confetti": "^1.9.0",
-    "@types/react": "^19.2.0",
-    "@types/react-dom": "^19.2.0",
-    "astro": "^5.14.3",
+    "@types/react": "^19.2.2",
+    "@types/react-dom": "^19.2.2",
+    "astro": "^5.14.8",
     "canvas-confetti": "^1.9.3",
     "class-variance-authority": "^0.7.1",
     "clsx": "^2.1.1",
-    "lucide-react": "^0.544.0",
+    "lucide-react": "^0.546.0",
     "react": "^19.2.0",
     "react-dom": "^19.2.0",
     "tailwind-merge": "^3.3.1",
-    "tailwindcss": "^4.1.14"
+    "tailwindcss": "^4.1.15"
   },
   "devDependencies": {
     "tw-animate-css": "^1.4.0"
   },
-  "packageManager": "pnpm@10.18.0"
+  "packageManager": "pnpm@10.18.3"
 }

www/pnpm-lock.yaml (generated)

@@ -9,38 +9,38 @@ importers:
.:
dependencies:
'@astrojs/mdx':
specifier: ^4.3.6
version: 4.3.6(astro@5.14.3(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.1)(rollup@4.52.4)(typescript@5.8.3))
specifier: ^4.3.7
version: 4.3.7(astro@5.14.8(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.2)(rollup@4.52.4)(typescript@5.8.3))
'@astrojs/react':
specifier: ^4.4.0
version: 4.4.0(@types/node@24.7.1)(@types/react-dom@19.2.0(@types/react@19.2.0))(@types/react@19.2.0)(jiti@2.6.1)(lightningcss@1.30.1)(react-dom@19.2.0(react@19.2.0))(react@19.2.0)
version: 4.4.0(@types/node@24.7.1)(@types/react-dom@19.2.2(@types/react@19.2.2))(@types/react@19.2.2)(jiti@2.6.1)(lightningcss@1.30.2)(react-dom@19.2.0(react@19.2.0))(react@19.2.0)
'@radix-ui/react-icons':
specifier: ^1.3.2
version: 1.3.2(react@19.2.0)
'@radix-ui/react-slot':
specifier: ^1.2.3
version: 1.2.3(@types/react@19.2.0)(react@19.2.0)
version: 1.2.3(@types/react@19.2.2)(react@19.2.0)
'@splinetool/react-spline':
specifier: ^4.1.0
version: 4.1.0(@splinetool/runtime@1.10.73)(react-dom@19.2.0(react@19.2.0))(react@19.2.0)
version: 4.1.0(@splinetool/runtime@1.10.85)(react-dom@19.2.0(react@19.2.0))(react@19.2.0)
'@splinetool/runtime':
specifier: ^1.10.73
version: 1.10.73
specifier: ^1.10.85
version: 1.10.85
'@tailwindcss/vite':
specifier: ^4.1.14
version: 4.1.14(vite@6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.1))
specifier: ^4.1.15
version: 4.1.15(vite@6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.2))
'@types/canvas-confetti':
specifier: ^1.9.0
version: 1.9.0
'@types/react':
specifier: ^19.2.0
version: 19.2.0
specifier: ^19.2.2
version: 19.2.2
'@types/react-dom':
specifier: ^19.2.0
version: 19.2.0(@types/react@19.2.0)
specifier: ^19.2.2
version: 19.2.2(@types/react@19.2.2)
astro:
specifier: ^5.14.3
version: 5.14.3(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.1)(rollup@4.52.4)(typescript@5.8.3)
specifier: ^5.14.8
version: 5.14.8(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.2)(rollup@4.52.4)(typescript@5.8.3)
canvas-confetti:
specifier: ^1.9.3
version: 1.9.3
@@ -51,8 +51,8 @@ importers:
specifier: ^2.1.1
version: 2.1.1
lucide-react:
specifier: ^0.544.0
version: 0.544.0(react@19.2.0)
specifier: ^0.546.0
version: 0.546.0(react@19.2.0)
react:
specifier: ^19.2.0
version: 19.2.0
@@ -63,8 +63,8 @@ importers:
specifier: ^3.3.1
version: 3.3.1
tailwindcss:
specifier: ^4.1.14
version: 4.1.14
specifier: ^4.1.15
version: 4.1.15
devDependencies:
tw-animate-css:
specifier: ^1.4.0
@@ -79,20 +79,14 @@ packages:
'@astrojs/compiler@2.13.0':
resolution: {integrity: sha512-mqVORhUJViA28fwHYaWmsXSzLO9osbdZ5ImUfxBarqsYdMlPbqAqGJCxsNzvppp1BEzc1mJNjOVvQqeDN8Vspw==}
'@astrojs/internal-helpers@0.7.3':
resolution: {integrity: sha512-6Pl0bQEIChuW5wqN7jdKrzWfCscW2rG/Cz+fzt4PhSQX2ivBpnhXgFUCs0M3DCYvjYHnPVG2W36X5rmFjZ62sw==}
'@astrojs/internal-helpers@0.7.4':
resolution: {integrity: sha512-lDA9MqE8WGi7T/t2BMi+EAXhs4Vcvr94Gqx3q15cFEz8oFZMO4/SFBqYr/UcmNlvW+35alowkVj+w9VhLvs5Cw==}
'@astrojs/markdown-remark@6.3.7':
resolution: {integrity: sha512-KXGdq6/BC18doBCYXp08alHlWChH0hdD2B1qv9wIyOHbvwI5K6I7FhSta8dq1hBQNdun8YkKPR013D/Hm8xd0g==}
'@astrojs/markdown-remark@6.3.8':
resolution: {integrity: sha512-uFNyFWadnULWK2cOw4n0hLKeu+xaVWeuECdP10cQ3K2fkybtTlhb7J7TcScdjmS8Yps7oje9S/ehYMfZrhrgCg==}
'@astrojs/mdx@4.3.6':
resolution: {integrity: sha512-jH04tYgaqLfq3To42+z1oEcXrXUzo3BxZ4fTkb+7BEmOJkQ9/c3iIixFEC+x0GgE8lJb4SuEDGldpAv7+1yY8A==}
'@astrojs/mdx@4.3.7':
resolution: {integrity: sha512-5SRmvMyT/UMWaU2eoD+htnXtE2mUZZEH2K/nEzhuEy+iCsOSuS/DUry59WuKUJRQETi1mgJFdNR4dZLJHYVuRA==}
engines: {node: 18.20.8 || ^20.3.0 || >=22.0.0}
peerDependencies:
astro: ^5.0.0
@@ -486,10 +480,6 @@ packages:
cpu: [x64]
os: [win32]
'@isaacs/fs-minipass@4.0.1':
resolution: {integrity: sha512-wgm9Ehl2jpeqP3zw/7mo3kRHFp5MEDhqAdwy1fTGkHAwnkGOVsgpvQhL8B5n1qlb01jV3n/bI0ZfZp5lWA1k4w==}
engines: {node: '>=18.0.0'}
'@jridgewell/gen-mapping@0.3.12':
resolution: {integrity: sha512-OuLGC46TjB5BbN1dH8JULVVZY4WTdkF7tV9Ys6wLL1rubZnCMstOhNHueU5bLCrnRuDhKPDM4g6sw4Bel5Gzqg==}
@@ -689,71 +679,71 @@ packages:
next:
optional: true
'@splinetool/runtime@1.10.73':
resolution: {integrity: sha512-VirKiO226oHuhKYl5SH1ymBS/qvxqErpP8yVy8j3Svd3NKSPIDoNB4tOz2SjVPzPtVUr9JNy4TSpuCu/4QCWwg==}
'@splinetool/runtime@1.10.85':
resolution: {integrity: sha512-dpfsXTq/IF7pu98al6HX+tAkdM9+CU3l1TpGtkqZ81GHuirl0/UINAwHQLspvrwIstaC2yqN0OOQXd3cYLFYDA==}
'@swc/helpers@0.5.17':
resolution: {integrity: sha512-5IKx/Y13RsYd+sauPb2x+U/xZikHjolzfuDgTAl/Tdf3Q8rslRvC19NKDLgAJQ6wsqADk10ntlv08nPFw/gO/A==}
'@tailwindcss/node@4.1.14':
resolution: {integrity: sha512-hpz+8vFk3Ic2xssIA3e01R6jkmsAhvkQdXlEbRTk6S10xDAtiQiM3FyvZVGsucefq764euO/b8WUW9ysLdThHw==}
'@tailwindcss/node@4.1.15':
resolution: {integrity: sha512-HF4+7QxATZWY3Jr8OlZrBSXmwT3Watj0OogeDvdUY/ByXJHQ+LBtqA2brDb3sBxYslIFx6UP94BJ4X6a4L9Bmw==}
'@tailwindcss/oxide-android-arm64@4.1.14':
resolution: {integrity: sha512-a94ifZrGwMvbdeAxWoSuGcIl6/DOP5cdxagid7xJv6bwFp3oebp7y2ImYsnZBMTwjn5Ev5xESvS3FFYUGgPODQ==}
'@tailwindcss/oxide-android-arm64@4.1.15':
resolution: {integrity: sha512-TkUkUgAw8At4cBjCeVCRMc/guVLKOU1D+sBPrHt5uVcGhlbVKxrCaCW9OKUIBv1oWkjh4GbunD/u/Mf0ql6kEA==}
engines: {node: '>= 10'}
cpu: [arm64]
os: [android]
'@tailwindcss/oxide-darwin-arm64@4.1.14':
resolution: {integrity: sha512-HkFP/CqfSh09xCnrPJA7jud7hij5ahKyWomrC3oiO2U9i0UjP17o9pJbxUN0IJ471GTQQmzwhp0DEcpbp4MZTA==}
'@tailwindcss/oxide-darwin-arm64@4.1.15':
resolution: {integrity: sha512-xt5XEJpn2piMSfvd1UFN6jrWXyaKCwikP4Pidcf+yfHTSzSpYhG3dcMktjNkQO3JiLCp+0bG0HoWGvz97K162w==}
engines: {node: '>= 10'}
cpu: [arm64]
os: [darwin]
'@tailwindcss/oxide-darwin-x64@4.1.14':
resolution: {integrity: sha512-eVNaWmCgdLf5iv6Qd3s7JI5SEFBFRtfm6W0mphJYXgvnDEAZ5sZzqmI06bK6xo0IErDHdTA5/t7d4eTfWbWOFw==}
'@tailwindcss/oxide-darwin-x64@4.1.15':
resolution: {integrity: sha512-TnWaxP6Bx2CojZEXAV2M01Yl13nYPpp0EtGpUrY+LMciKfIXiLL2r/SiSRpagE5Fp2gX+rflp/Os1VJDAyqymg==}
engines: {node: '>= 10'}
cpu: [x64]
os: [darwin]
'@tailwindcss/oxide-freebsd-x64@4.1.14':
resolution: {integrity: sha512-QWLoRXNikEuqtNb0dhQN6wsSVVjX6dmUFzuuiL09ZeXju25dsei2uIPl71y2Ic6QbNBsB4scwBoFnlBfabHkEw==}
'@tailwindcss/oxide-freebsd-x64@4.1.15':
resolution: {integrity: sha512-quISQDWqiB6Cqhjc3iWptXVZHNVENsWoI77L1qgGEHNIdLDLFnw3/AfY7DidAiiCIkGX/MjIdB3bbBZR/G2aJg==}
engines: {node: '>= 10'}
cpu: [x64]
os: [freebsd]
'@tailwindcss/oxide-linux-arm-gnueabihf@4.1.14':
resolution: {integrity: sha512-VB4gjQni9+F0VCASU+L8zSIyjrLLsy03sjcR3bM0V2g4SNamo0FakZFKyUQ96ZVwGK4CaJsc9zd/obQy74o0Fw==}
'@tailwindcss/oxide-linux-arm-gnueabihf@4.1.15':
resolution: {integrity: sha512-ObG76+vPlab65xzVUQbExmDU9FIeYLQ5k2LrQdR2Ud6hboR+ZobXpDoKEYXf/uOezOfIYmy2Ta3w0ejkTg9yxg==}
engines: {node: '>= 10'}
cpu: [arm]
os: [linux]
'@tailwindcss/oxide-linux-arm64-gnu@4.1.14':
resolution: {integrity: sha512-qaEy0dIZ6d9vyLnmeg24yzA8XuEAD9WjpM5nIM1sUgQ/Zv7cVkharPDQcmm/t/TvXoKo/0knI3me3AGfdx6w1w==}
'@tailwindcss/oxide-linux-arm64-gnu@4.1.15':
resolution: {integrity: sha512-4WbBacRmk43pkb8/xts3wnOZMDKsPFyEH/oisCm2q3aLZND25ufvJKcDUpAu0cS+CBOL05dYa8D4U5OWECuH/Q==}
engines: {node: '>= 10'}
cpu: [arm64]
os: [linux]
'@tailwindcss/oxide-linux-arm64-musl@4.1.14':
resolution: {integrity: sha512-ISZjT44s59O8xKsPEIesiIydMG/sCXoMBCqsphDm/WcbnuWLxxb+GcvSIIA5NjUw6F8Tex7s5/LM2yDy8RqYBQ==}
'@tailwindcss/oxide-linux-arm64-musl@4.1.15':
resolution: {integrity: sha512-AbvmEiteEj1nf42nE8skdHv73NoR+EwXVSgPY6l39X12Ex8pzOwwfi3Kc8GAmjsnsaDEbk+aj9NyL3UeyHcTLg==}
engines: {node: '>= 10'}
cpu: [arm64]
os: [linux]
'@tailwindcss/oxide-linux-x64-gnu@4.1.14':
resolution: {integrity: sha512-02c6JhLPJj10L2caH4U0zF8Hji4dOeahmuMl23stk0MU1wfd1OraE7rOloidSF8W5JTHkFdVo/O7uRUJJnUAJg==}
'@tailwindcss/oxide-linux-x64-gnu@4.1.15':
resolution: {integrity: sha512-+rzMVlvVgrXtFiS+ES78yWgKqpThgV19ISKD58Ck+YO5pO5KjyxLt7AWKsWMbY0R9yBDC82w6QVGz837AKQcHg==}
engines: {node: '>= 10'}
cpu: [x64]
os: [linux]
'@tailwindcss/oxide-linux-x64-musl@4.1.14':
resolution: {integrity: sha512-TNGeLiN1XS66kQhxHG/7wMeQDOoL0S33x9BgmydbrWAb9Qw0KYdd8o1ifx4HOGDWhVmJ+Ul+JQ7lyknQFilO3Q==}
'@tailwindcss/oxide-linux-x64-musl@4.1.15':
resolution: {integrity: sha512-fPdEy7a8eQN9qOIK3Em9D3TO1z41JScJn8yxl/76mp4sAXFDfV4YXxsiptJcOwy6bGR+70ZSwFIZhTXzQeqwQg==}
engines: {node: '>= 10'}
cpu: [x64]
os: [linux]
'@tailwindcss/oxide-wasm32-wasi@4.1.14':
resolution: {integrity: sha512-uZYAsaW/jS/IYkd6EWPJKW/NlPNSkWkBlaeVBi/WsFQNP05/bzkebUL8FH1pdsqx4f2fH/bWFcUABOM9nfiJkQ==}
'@tailwindcss/oxide-wasm32-wasi@4.1.15':
resolution: {integrity: sha512-sJ4yd6iXXdlgIMfIBXuVGp/NvmviEoMVWMOAGxtxhzLPp9LOj5k0pMEMZdjeMCl4C6Up+RM8T3Zgk+BMQ0bGcQ==}
engines: {node: '>=14.0.0'}
cpu: [wasm32]
bundledDependencies:
@@ -764,24 +754,24 @@ packages:
- '@emnapi/wasi-threads'
- tslib
'@tailwindcss/oxide-win32-arm64-msvc@4.1.14':
resolution: {integrity: sha512-Az0RnnkcvRqsuoLH2Z4n3JfAef0wElgzHD5Aky/e+0tBUxUhIeIqFBTMNQvmMRSP15fWwmvjBxZ3Q8RhsDnxAA==}
'@tailwindcss/oxide-win32-arm64-msvc@4.1.15':
resolution: {integrity: sha512-sJGE5faXnNQ1iXeqmRin7Ds/ru2fgCiaQZQQz3ZGIDtvbkeV85rAZ0QJFMDg0FrqsffZG96H1U9AQlNBRLsHVg==}
engines: {node: '>= 10'}
cpu: [arm64]
os: [win32]
'@tailwindcss/oxide-win32-x64-msvc@4.1.14':
resolution: {integrity: sha512-ttblVGHgf68kEE4om1n/n44I0yGPkCPbLsqzjvybhpwa6mKKtgFfAzy6btc3HRmuW7nHe0OOrSeNP9sQmmH9XA==}
'@tailwindcss/oxide-win32-x64-msvc@4.1.15':
resolution: {integrity: sha512-NLeHE7jUV6HcFKS504bpOohyi01zPXi2PXmjFfkzTph8xRxDdxkRsXm/xDO5uV5K3brrE1cCwbUYmFUSHR3u1w==}
engines: {node: '>= 10'}
cpu: [x64]
os: [win32]
'@tailwindcss/oxide@4.1.14':
resolution: {integrity: sha512-23yx+VUbBwCg2x5XWdB8+1lkPajzLmALEfMb51zZUBYaYVPDQvBSD/WYDqiVyBIo2BZFa3yw1Rpy3G2Jp+K0dw==}
'@tailwindcss/oxide@4.1.15':
resolution: {integrity: sha512-krhX+UOOgnsUuks2SR7hFafXmLQrKxB4YyRTERuCE59JlYL+FawgaAlSkOYmDRJdf1Q+IFNDMl9iRnBW7QBDfQ==}
engines: {node: '>= 10'}
'@tailwindcss/vite@4.1.14':
resolution: {integrity: sha512-BoFUoU0XqgCUS1UXWhmDJroKKhNXeDzD7/XwabjkDIAbMnc4ULn5e2FuEuBbhZ6ENZoSYzKlzvZ44Yr6EUDUSA==}
'@tailwindcss/vite@4.1.15':
resolution: {integrity: sha512-B6s60MZRTUil+xKoZoGe6i0Iar5VuW+pmcGlda2FX+guDuQ1G1sjiIy1W0frneVpeL/ZjZ4KEgWZHNrIm++2qA==}
peerDependencies:
vite: ^5.2.0 || ^6 || ^7
@@ -830,13 +820,13 @@ packages:
'@types/node@24.7.1':
resolution: {integrity: sha512-CmyhGZanP88uuC5GpWU9q+fI61j2SkhO3UGMUdfYRE6Bcy0ccyzn1Rqj9YAB/ZY4kOXmNf0ocah5GtphmLMP6Q==}
'@types/react-dom@19.2.0':
resolution: {integrity: sha512-brtBs0MnE9SMx7px208g39lRmC5uHZs96caOJfTjFcYSLHNamvaSMfJNagChVNkup2SdtOxKX1FDBkRSJe1ZAg==}
'@types/react-dom@19.2.2':
resolution: {integrity: sha512-9KQPoO6mZCi7jcIStSnlOWn2nEF3mNmyr3rIAsGnAbQKYbRLyqmeSc39EVgtxXVia+LMT8j3knZLAZAh+xLmrw==}
peerDependencies:
'@types/react': ^19.2.0
'@types/react@19.2.0':
resolution: {integrity: sha512-1LOH8xovvsKsCBq1wnT4ntDUdCJKmnEakhsuoUSy6ExlHCkGP2hqnatagYTgFk6oeL0VU31u7SNjunPN+GchtA==}
'@types/react@19.2.2':
resolution: {integrity: sha512-6mDvHUFSjyT2B2yeNx2nUgMxh9LtOWvkhIU3uePn2I2oyNymUAX1NIsdgviM4CH+JSrp2D2hsMvJOkxY+0wNRA==}
'@types/unist@2.0.11':
resolution: {integrity: sha512-CmBKiL6NNo/OqgmMn95Fk9Whlp2mtvIv+KNpQKN2F4SjvrEesubTRWGYSg+BnWZOnlCaSTU1sMpsBOzgbYhnsA==}
@@ -896,8 +886,8 @@ packages:
resolution: {integrity: sha512-LElXdjswlqjWrPpJFg1Fx4wpkOCxj1TDHlSV4PlaRxHGWko024xICaa97ZkMfs6DRKlCguiAI+rbXv5GWwXIkg==}
hasBin: true
astro@5.14.3:
resolution: {integrity: sha512-iRvl3eEYYdSYA195eNREjh43hqMMwKY1uoHYiKfLCB9G+bjFtaBtDe8R0ip7AbTD69wyOKgUCOtMad+lkOnT/w==}
astro@5.14.8:
resolution: {integrity: sha512-nKqCLs7BFvGQL9QWQOUqxHhlHtV0UMLXz1ANJygozvjcexBWS7FYkWI2LzRPMNYmbW4msIWNWnX2RvLdvI5Cnw==}
engines: {node: 18.20.8 || ^20.3.0 || >=22.0.0, npm: '>=9.6.5', pnpm: '>=7.1.0'}
hasBin: true
@@ -962,10 +952,6 @@ packages:
resolution: {integrity: sha512-Qgzu8kfBvo+cA4962jnP1KkS6Dop5NS6g7R5LFYJr4b8Ub94PPQXUksCw9PvXoeXPRRddRNC5C1JQUR2SMGtnA==}
engines: {node: '>= 14.16.0'}
chownr@3.0.0:
resolution: {integrity: sha512-+IxzY9BZOQd/XuYPRmrvEVjF/nqj5kgT4kEq7VofrDoM1MxoRjEWkrCC3EtLi59TVawxTAn+orJwFQcrqEN1+g==}
engines: {node: '>=18'}
ci-info@4.3.1:
resolution: {integrity: sha512-Wdy2Igu8OcBpI2pZePZ5oWjPC38tmDVx5WKUXKwlLYkA0ozo85sLsLvkBbBn/sZaSCMFOGZJ14fvW9t5/d7kdA==}
engines: {node: '>=8'}
@@ -1041,10 +1027,6 @@ packages:
destr@2.0.5:
resolution: {integrity: sha512-ugFTXCtDZunbzasqBxrK93Ik/DRYsO6S/fedkWEMKqt04xZ4csmnmwGDBAb07QWNaGMAmnTIemsYZCksjATwsA==}
detect-libc@2.0.4:
resolution: {integrity: sha512-3UDv+G9CsCKO1WKMGw9fwq/SWJYbI0c5Y7LU1AXYoDdbhE2AHQ6N6Nb34sG8Fj7T5APy8qXDCKuuIHd1BR0tVA==}
engines: {node: '>=8'}
detect-libc@2.1.2:
resolution: {integrity: sha512-Btj2BOOO83o3WyH59e8MgXsxEQVcarkUOpEYrubB0urwnN10yQ364rsiByU11nZlqWYZm05i/of7io4mzihBtQ==}
engines: {node: '>=8'}
@@ -1303,68 +1285,74 @@ packages:
resolution: {integrity: sha512-o+NO+8WrRiQEE4/7nwRJhN1HWpVmJm511pBHUxPLtp0BUISzlBplORYSmTclCnJvQq2tKu/sgl3xVpkc7ZWuQQ==}
engines: {node: '>=6'}
lightningcss-darwin-arm64@1.30.1:
resolution: {integrity: sha512-c8JK7hyE65X1MHMN+Viq9n11RRC7hgin3HhYKhrMyaXflk5GVplZ60IxyoVtzILeKr+xAJwg6zK6sjTBJ0FKYQ==}
lightningcss-android-arm64@1.30.2:
resolution: {integrity: sha512-BH9sEdOCahSgmkVhBLeU7Hc9DWeZ1Eb6wNS6Da8igvUwAe0sqROHddIlvU06q3WyXVEOYDZ6ykBZQnjTbmo4+A==}
engines: {node: '>= 12.0.0'}
cpu: [arm64]
os: [android]
lightningcss-darwin-arm64@1.30.2:
resolution: {integrity: sha512-ylTcDJBN3Hp21TdhRT5zBOIi73P6/W0qwvlFEk22fkdXchtNTOU4Qc37SkzV+EKYxLouZ6M4LG9NfZ1qkhhBWA==}
engines: {node: '>= 12.0.0'}
cpu: [arm64]
os: [darwin]
lightningcss-darwin-x64@1.30.1:
resolution: {integrity: sha512-k1EvjakfumAQoTfcXUcHQZhSpLlkAuEkdMBsI/ivWw9hL+7FtilQc0Cy3hrx0AAQrVtQAbMI7YjCgYgvn37PzA==}
lightningcss-darwin-x64@1.30.2:
resolution: {integrity: sha512-oBZgKchomuDYxr7ilwLcyms6BCyLn0z8J0+ZZmfpjwg9fRVZIR5/GMXd7r9RH94iDhld3UmSjBM6nXWM2TfZTQ==}
engines: {node: '>= 12.0.0'}
cpu: [x64]
os: [darwin]
lightningcss-freebsd-x64@1.30.1:
resolution: {integrity: sha512-kmW6UGCGg2PcyUE59K5r0kWfKPAVy4SltVeut+umLCFoJ53RdCUWxcRDzO1eTaxf/7Q2H7LTquFHPL5R+Gjyig==}
lightningcss-freebsd-x64@1.30.2:
resolution: {integrity: sha512-c2bH6xTrf4BDpK8MoGG4Bd6zAMZDAXS569UxCAGcA7IKbHNMlhGQ89eRmvpIUGfKWNVdbhSbkQaWhEoMGmGslA==}
engines: {node: '>= 12.0.0'}
cpu: [x64]
os: [freebsd]
lightningcss-linux-arm-gnueabihf@1.30.1:
resolution: {integrity: sha512-MjxUShl1v8pit+6D/zSPq9S9dQ2NPFSQwGvxBCYaBYLPlCWuPh9/t1MRS8iUaR8i+a6w7aps+B4N0S1TYP/R+Q==}
lightningcss-linux-arm-gnueabihf@1.30.2:
resolution: {integrity: sha512-eVdpxh4wYcm0PofJIZVuYuLiqBIakQ9uFZmipf6LF/HRj5Bgm0eb3qL/mr1smyXIS1twwOxNWndd8z0E374hiA==}
engines: {node: '>= 12.0.0'}
cpu: [arm]
os: [linux]
lightningcss-linux-arm64-gnu@1.30.1:
resolution: {integrity: sha512-gB72maP8rmrKsnKYy8XUuXi/4OctJiuQjcuqWNlJQ6jZiWqtPvqFziskH3hnajfvKB27ynbVCucKSm2rkQp4Bw==}
lightningcss-linux-arm64-gnu@1.30.2:
resolution: {integrity: sha512-UK65WJAbwIJbiBFXpxrbTNArtfuznvxAJw4Q2ZGlU8kPeDIWEX1dg3rn2veBVUylA2Ezg89ktszWbaQnxD/e3A==}
engines: {node: '>= 12.0.0'}
cpu: [arm64]
os: [linux]
lightningcss-linux-arm64-musl@1.30.1:
resolution: {integrity: sha512-jmUQVx4331m6LIX+0wUhBbmMX7TCfjF5FoOH6SD1CttzuYlGNVpA7QnrmLxrsub43ClTINfGSYyHe2HWeLl5CQ==}
lightningcss-linux-arm64-musl@1.30.2:
resolution: {integrity: sha512-5Vh9dGeblpTxWHpOx8iauV02popZDsCYMPIgiuw97OJ5uaDsL86cnqSFs5LZkG3ghHoX5isLgWzMs+eD1YzrnA==}
engines: {node: '>= 12.0.0'}
cpu: [arm64]
os: [linux]
lightningcss-linux-x64-gnu@1.30.1:
resolution: {integrity: sha512-piWx3z4wN8J8z3+O5kO74+yr6ze/dKmPnI7vLqfSqI8bccaTGY5xiSGVIJBDd5K5BHlvVLpUB3S2YCfelyJ1bw==}
lightningcss-linux-x64-gnu@1.30.2:
resolution: {integrity: sha512-Cfd46gdmj1vQ+lR6VRTTadNHu6ALuw2pKR9lYq4FnhvgBc4zWY1EtZcAc6EffShbb1MFrIPfLDXD6Xprbnni4w==}
engines: {node: '>= 12.0.0'}
cpu: [x64]
os: [linux]
lightningcss-linux-x64-musl@1.30.1:
resolution: {integrity: sha512-rRomAK7eIkL+tHY0YPxbc5Dra2gXlI63HL+v1Pdi1a3sC+tJTcFrHX+E86sulgAXeI7rSzDYhPSeHHjqFhqfeQ==}
lightningcss-linux-x64-musl@1.30.2:
resolution: {integrity: sha512-XJaLUUFXb6/QG2lGIW6aIk6jKdtjtcffUT0NKvIqhSBY3hh9Ch+1LCeH80dR9q9LBjG3ewbDjnumefsLsP6aiA==}
engines: {node: '>= 12.0.0'}
cpu: [x64]
os: [linux]
lightningcss-win32-arm64-msvc@1.30.1:
resolution: {integrity: sha512-mSL4rqPi4iXq5YVqzSsJgMVFENoa4nGTT/GjO2c0Yl9OuQfPsIfncvLrEW6RbbB24WtZ3xP/2CCmI3tNkNV4oA==}
lightningcss-win32-arm64-msvc@1.30.2:
resolution: {integrity: sha512-FZn+vaj7zLv//D/192WFFVA0RgHawIcHqLX9xuWiQt7P0PtdFEVaxgF9rjM/IRYHQXNnk61/H/gb2Ei+kUQ4xQ==}
engines: {node: '>= 12.0.0'}
cpu: [arm64]
os: [win32]
lightningcss-win32-x64-msvc@1.30.1:
resolution: {integrity: sha512-PVqXh48wh4T53F/1CCu8PIPCxLzWyCnn/9T5W1Jpmdy5h9Cwd+0YQS6/LwhHXSafuc61/xg9Lv5OrCby6a++jg==}
lightningcss-win32-x64-msvc@1.30.2:
resolution: {integrity: sha512-5g1yc73p+iAkid5phb4oVFMB45417DkRevRbt/El/gKXJk4jid+vPFF/AXbxn05Aky8PapwzZrdJShv5C0avjw==}
engines: {node: '>= 12.0.0'}
cpu: [x64]
os: [win32]
lightningcss@1.30.1:
resolution: {integrity: sha512-xi6IyHML+c9+Q3W0S4fCQJOym42pyurFiJUHEcEyHS0CeKzia4yZDEsLlqOFykxOdHpNy0NmvVO31vcSqAxJCg==}
lightningcss@1.30.2:
resolution: {integrity: sha512-utfs7Pr5uJyyvDETitgsaqSyjCb2qNRAtuqUeWIAKztsOYdcACf2KtARYXg2pSvhkt+9NfoaNY7fxjl6nuMjIQ==}
engines: {node: '>= 12.0.0'}
lodash.debounce@4.0.8:
@@ -1379,8 +1367,8 @@ packages:
lru-cache@5.1.1:
resolution: {integrity: sha512-KpNARQA3Iwv+jTA0utUVVbrh+Jlrr1Fv0e56GGzAFOXN7dk/FviaDW8LHmK52DlcH4WP2n6gI8vN1aesBFgo9w==}
lucide-react@0.544.0:
resolution: {integrity: sha512-t5tS44bqd825zAW45UQxpG2CvcC4urOwn2TrwSH8u+MjeE+1NnWl6QqeQ/6NdjMqdOygyiT9p3Ev0p1NJykxjw==}
lucide-react@0.546.0:
resolution: {integrity: sha512-Z94u6fKT43lKeYHiVyvyR8fT7pwCzDu7RyMPpTvh054+xahSgj4HFQ+NmflvzdXsoAjYGdCguGaFKYuvq0ThCQ==}
peerDependencies:
react: ^16.5.1 || ^17.0.0 || ^18.0.0 || ^19.0.0
@@ -1556,14 +1544,6 @@ packages:
micromark@4.0.2:
resolution: {integrity: sha512-zpe98Q6kvavpCr1NPVSCMebCKfD7CA2NqZ+rykeNhONIJBpc1tFKt9hucLGwha3jNTNI8lHpctWJWoimVF4PfA==}
minipass@7.1.2:
resolution: {integrity: sha512-qOOzS1cBTWYF4BH8fVePDBOO9iptMnGUEZwNc/cMWnTV2nVLZ7VoNWEPHkYczZA0pdoA7dl6e7FL659nX9S2aw==}
engines: {node: '>=16 || 14 >=14.17'}
minizlib@3.1.0:
resolution: {integrity: sha512-KZxYo1BUkWD2TVFLr0MQoM8vUUigWD3LlD83a/75BqC+4qE0Hb1Vo5v1FgcfaNXvfXzr+5EhQ6ing/CaBijTlw==}
engines: {node: '>= 18'}
mrmime@2.0.1:
resolution: {integrity: sha512-Y3wQdFg2Va6etvQ5I82yUhGdsKrcYox6p7FfL1LbK2J4V01F9TGlepTIhnK24t7koZibmg82KGglhA1XK5IsLQ==}
engines: {node: '>=10'}
@@ -1834,17 +1814,13 @@ packages:
tailwind-merge@3.3.1:
resolution: {integrity: sha512-gBXpgUm/3rp1lMZZrM/w7D8GKqshif0zAymAhbCyIt8KMe+0v9DQ7cdYLR4FHH/cKpdTXb+A/tKKU3eolfsI+g==}
tailwindcss@4.1.14:
resolution: {integrity: sha512-b7pCxjGO98LnxVkKjaZSDeNuljC4ueKUddjENJOADtubtdo8llTaJy7HwBMeLNSSo2N5QIAgklslK1+Ir8r6CA==}
tailwindcss@4.1.15:
resolution: {integrity: sha512-k2WLnWkYFkdpRv+Oby3EBXIyQC8/s1HOFMBUViwtAh6Z5uAozeUSMQlIsn/c6Q2iJzqG6aJT3wdPaRNj70iYxQ==}
tapable@2.2.2:
resolution: {integrity: sha512-Re10+NauLTMCudc7T5WLFLAwDhQ0JWdrMK+9B2M8zR5hRExKmsRDCBA7/aV/pNJFltmBFO5BAMlQFi/vq3nKOg==}
engines: {node: '>=6'}
tar@7.5.1:
resolution: {integrity: sha512-nlGpxf+hv0v7GkWBK2V9spgactGOp0qvfWRxUMjqHyzrt3SgwE48DIv/FhqPHJYLHpgW1opq3nERbz5Anq7n1g==}
engines: {node: '>=18'}
thumbhash@0.1.1:
resolution: {integrity: sha512-kH5pKeIIBPQXAOni2AiY/Cu/NKdkFREdpH+TLdM0g6WA7RriCv0kPLgP731ady67MhTAqrVG/4mnEeibVuCJcg==}
@@ -2014,9 +1990,6 @@ packages:
vfile-location@5.0.3:
resolution: {integrity: sha512-5yXvWDEgqeiYiBe1lbxYF7UMAIm/IcopxMHrMQDq3nvKcjPKIhZklUKL+AE7J7uApI4kwe2snsK+eI6UTj9EHg==}
vfile-message@4.0.2:
resolution: {integrity: sha512-jRDZ1IMLttGj41KcZvlrYAaI3CfqpLpfpf+Mfig13viT6NKvRzWZ+lXz0Y5D60w6uJIBAOGq9mSHf0gktF0duw==}
vfile-message@4.0.3:
resolution: {integrity: sha512-QTHzsGd1EhbZs4AsQ20JX1rC3cOlt/IWJruk893DfLRr57lcnOeMaWG4K0JrRta4mIJZKth2Au3mM3u03/JWKw==}
@@ -2092,10 +2065,6 @@ packages:
yallist@3.1.1:
resolution: {integrity: sha512-a4UGQaWPH59mOXUYnAG2ewncQS4i4F43Tv3JoAM+s2VDAmS9NsK8GpDMLrCHPksFT7h3K6TOoUNn2pb7RoXx4g==}
yallist@5.0.0:
resolution: {integrity: sha512-YgvUTfwqyc7UXVMrB+SImsVYSmTS8X/tSrtdNZMImM+n7+QTriRXyXim0mBrTXNeqzVF0KWGgHPeiyViFFrNDw==}
engines: {node: '>=18'}
yargs-parser@21.1.1:
resolution: {integrity: sha512-tVpsJW7DdjecAiFpbIB1e3qxIQsE6NoPc5/eTdrbbIC4h0LVsWhnoa3g+m2HclBIujHzsxZ4VJVA+GUuc2/LBw==}
engines: {node: '>=12'}
@@ -2138,36 +2107,8 @@ snapshots:
'@astrojs/compiler@2.13.0': {}
'@astrojs/internal-helpers@0.7.3': {}
'@astrojs/internal-helpers@0.7.4': {}
'@astrojs/markdown-remark@6.3.7':
dependencies:
'@astrojs/internal-helpers': 0.7.3
'@astrojs/prism': 3.3.0
github-slugger: 2.0.0
hast-util-from-html: 2.0.3
hast-util-to-text: 4.0.2
import-meta-resolve: 4.2.0
js-yaml: 4.1.0
mdast-util-definitions: 6.0.0
rehype-raw: 7.0.0
rehype-stringify: 10.0.1
remark-gfm: 4.0.1
remark-parse: 11.0.0
remark-rehype: 11.1.2
remark-smartypants: 3.0.2
shiki: 3.13.0
smol-toml: 1.4.2
unified: 11.0.5
unist-util-remove-position: 5.0.0
unist-util-visit: 5.0.0
unist-util-visit-parents: 6.0.1
vfile: 6.0.3
transitivePeerDependencies:
- supports-color
'@astrojs/markdown-remark@6.3.8':
dependencies:
'@astrojs/internal-helpers': 0.7.4
@@ -2194,12 +2135,12 @@ snapshots:
transitivePeerDependencies:
- supports-color
'@astrojs/mdx@4.3.6(astro@5.14.3(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.1)(rollup@4.52.4)(typescript@5.8.3))':
'@astrojs/mdx@4.3.7(astro@5.14.8(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.2)(rollup@4.52.4)(typescript@5.8.3))':
dependencies:
'@astrojs/markdown-remark': 6.3.7
'@astrojs/markdown-remark': 6.3.8
'@mdx-js/mdx': 3.1.1
acorn: 8.15.0
astro: 5.14.3(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.1)(rollup@4.52.4)(typescript@5.8.3)
astro: 5.14.8(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.2)(rollup@4.52.4)(typescript@5.8.3)
es-module-lexer: 1.7.0
estree-util-visit: 2.0.0
hast-util-to-html: 9.0.5
@@ -2217,15 +2158,15 @@ snapshots:
dependencies:
prismjs: 1.30.0
'@astrojs/react@4.4.0(@types/node@24.7.1)(@types/react-dom@19.2.0(@types/react@19.2.0))(@types/react@19.2.0)(jiti@2.6.1)(lightningcss@1.30.1)(react-dom@19.2.0(react@19.2.0))(react@19.2.0)':
'@astrojs/react@4.4.0(@types/node@24.7.1)(@types/react-dom@19.2.2(@types/react@19.2.2))(@types/react@19.2.2)(jiti@2.6.1)(lightningcss@1.30.2)(react-dom@19.2.0(react@19.2.0))(react@19.2.0)':
dependencies:
'@types/react': 19.2.0
'@types/react-dom': 19.2.0(@types/react@19.2.0)
'@vitejs/plugin-react': 4.7.0(vite@6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.1))
'@types/react': 19.2.2
'@types/react-dom': 19.2.2(@types/react@19.2.2)
'@vitejs/plugin-react': 4.7.0(vite@6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.2))
react: 19.2.0
react-dom: 19.2.0(react@19.2.0)
ultrahtml: 1.6.0
vite: 6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.1)
vite: 6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.2)
transitivePeerDependencies:
- '@types/node'
- jiti
@@ -2540,10 +2481,6 @@ snapshots:
'@img/sharp-win32-x64@0.34.4':
optional: true
'@isaacs/fs-minipass@4.0.1':
dependencies:
minipass: 7.1.2
'@jridgewell/gen-mapping@0.3.12':
dependencies:
'@jridgewell/sourcemap-codec': 1.5.5
@@ -2595,22 +2532,22 @@ snapshots:
'@oslojs/encoding@1.1.0': {}
'@radix-ui/react-compose-refs@1.1.2(@types/react@19.2.0)(react@19.2.0)':
'@radix-ui/react-compose-refs@1.1.2(@types/react@19.2.2)(react@19.2.0)':
dependencies:
react: 19.2.0
optionalDependencies:
'@types/react': 19.2.0
'@types/react': 19.2.2
'@radix-ui/react-icons@1.3.2(react@19.2.0)':
dependencies:
react: 19.2.0
'@radix-ui/react-slot@1.2.3(@types/react@19.2.0)(react@19.2.0)':
'@radix-ui/react-slot@1.2.3(@types/react@19.2.2)(react@19.2.0)':
dependencies:
'@radix-ui/react-compose-refs': 1.1.2(@types/react@19.2.0)(react@19.2.0)
'@radix-ui/react-compose-refs': 1.1.2(@types/react@19.2.2)(react@19.2.0)
react: 19.2.0
optionalDependencies:
'@types/react': 19.2.0
'@types/react': 19.2.2
'@rolldown/pluginutils@1.0.0-beta.27': {}
@@ -2721,9 +2658,9 @@ snapshots:
'@shikijs/vscode-textmate@10.0.2': {}
'@splinetool/react-spline@4.1.0(@splinetool/runtime@1.10.73)(react-dom@19.2.0(react@19.2.0))(react@19.2.0)':
'@splinetool/react-spline@4.1.0(@splinetool/runtime@1.10.85)(react-dom@19.2.0(react@19.2.0))(react@19.2.0)':
dependencies:
'@splinetool/runtime': 1.10.73
'@splinetool/runtime': 1.10.85
blurhash: 2.0.5
lodash.debounce: 4.0.8
react: 19.2.0
@@ -2731,7 +2668,7 @@ snapshots:
react-merge-refs: 2.1.1
thumbhash: 0.1.1
'@splinetool/runtime@1.10.73':
'@splinetool/runtime@1.10.85':
dependencies:
on-change: 4.0.2
semver-compare: 1.0.0
@@ -2740,76 +2677,73 @@ snapshots:
dependencies:
tslib: 2.8.1
'@tailwindcss/node@4.1.14':
'@tailwindcss/node@4.1.15':
dependencies:
'@jridgewell/remapping': 2.3.5
enhanced-resolve: 5.18.3
jiti: 2.6.1
lightningcss: 1.30.1
lightningcss: 1.30.2
magic-string: 0.30.19
source-map-js: 1.2.1
tailwindcss: 4.1.14
tailwindcss: 4.1.15
'@tailwindcss/oxide-android-arm64@4.1.14':
'@tailwindcss/oxide-android-arm64@4.1.15':
optional: true
'@tailwindcss/oxide-darwin-arm64@4.1.14':
'@tailwindcss/oxide-darwin-arm64@4.1.15':
optional: true
'@tailwindcss/oxide-darwin-x64@4.1.14':
'@tailwindcss/oxide-darwin-x64@4.1.15':
optional: true
'@tailwindcss/oxide-freebsd-x64@4.1.14':
'@tailwindcss/oxide-freebsd-x64@4.1.15':
optional: true
'@tailwindcss/oxide-linux-arm-gnueabihf@4.1.14':
'@tailwindcss/oxide-linux-arm-gnueabihf@4.1.15':
optional: true
'@tailwindcss/oxide-linux-arm64-gnu@4.1.14':
'@tailwindcss/oxide-linux-arm64-gnu@4.1.15':
optional: true
'@tailwindcss/oxide-linux-arm64-musl@4.1.14':
'@tailwindcss/oxide-linux-arm64-musl@4.1.15':
optional: true
'@tailwindcss/oxide-linux-x64-gnu@4.1.14':
'@tailwindcss/oxide-linux-x64-gnu@4.1.15':
optional: true
'@tailwindcss/oxide-linux-x64-musl@4.1.14':
'@tailwindcss/oxide-linux-x64-musl@4.1.15':
optional: true
'@tailwindcss/oxide-wasm32-wasi@4.1.14':
'@tailwindcss/oxide-wasm32-wasi@4.1.15':
optional: true
'@tailwindcss/oxide-win32-arm64-msvc@4.1.14':
'@tailwindcss/oxide-win32-arm64-msvc@4.1.15':
optional: true
'@tailwindcss/oxide-win32-x64-msvc@4.1.14':
'@tailwindcss/oxide-win32-x64-msvc@4.1.15':
optional: true
'@tailwindcss/oxide@4.1.14':
dependencies:
detect-libc: 2.0.4
tar: 7.5.1
'@tailwindcss/oxide@4.1.15':
optionalDependencies:
'@tailwindcss/oxide-android-arm64': 4.1.14
'@tailwindcss/oxide-darwin-arm64': 4.1.14
'@tailwindcss/oxide-darwin-x64': 4.1.14
'@tailwindcss/oxide-freebsd-x64': 4.1.14
'@tailwindcss/oxide-linux-arm-gnueabihf': 4.1.14
'@tailwindcss/oxide-linux-arm64-gnu': 4.1.14
'@tailwindcss/oxide-linux-arm64-musl': 4.1.14
'@tailwindcss/oxide-linux-x64-gnu': 4.1.14
'@tailwindcss/oxide-linux-x64-musl': 4.1.14
'@tailwindcss/oxide-wasm32-wasi': 4.1.14
'@tailwindcss/oxide-win32-arm64-msvc': 4.1.14
'@tailwindcss/oxide-win32-x64-msvc': 4.1.14
'@tailwindcss/oxide-android-arm64': 4.1.15
'@tailwindcss/oxide-darwin-arm64': 4.1.15
'@tailwindcss/oxide-darwin-x64': 4.1.15
'@tailwindcss/oxide-freebsd-x64': 4.1.15
'@tailwindcss/oxide-linux-arm-gnueabihf': 4.1.15
'@tailwindcss/oxide-linux-arm64-gnu': 4.1.15
'@tailwindcss/oxide-linux-arm64-musl': 4.1.15
'@tailwindcss/oxide-linux-x64-gnu': 4.1.15
'@tailwindcss/oxide-linux-x64-musl': 4.1.15
'@tailwindcss/oxide-wasm32-wasi': 4.1.15
'@tailwindcss/oxide-win32-arm64-msvc': 4.1.15
'@tailwindcss/oxide-win32-x64-msvc': 4.1.15
'@tailwindcss/vite@4.1.14(vite@6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.1))':
'@tailwindcss/vite@4.1.15(vite@6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.2))':
dependencies:
'@tailwindcss/node': 4.1.14
'@tailwindcss/oxide': 4.1.14
tailwindcss: 4.1.14
vite: 6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.1)
'@tailwindcss/node': 4.1.15
'@tailwindcss/oxide': 4.1.15
tailwindcss: 4.1.15
vite: 6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.2)
'@types/babel__core@7.20.5':
dependencies:
@@ -2868,11 +2802,11 @@ snapshots:
dependencies:
undici-types: 7.14.0
'@types/react-dom@19.2.0(@types/react@19.2.0)':
'@types/react-dom@19.2.2(@types/react@19.2.2)':
dependencies:
'@types/react': 19.2.0
'@types/react': 19.2.2
'@types/react@19.2.0':
'@types/react@19.2.2':
dependencies:
csstype: 3.1.3
@@ -2882,7 +2816,7 @@ snapshots:
'@ungap/structured-clone@1.3.0': {}
'@vitejs/plugin-react@4.7.0(vite@6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.1))':
'@vitejs/plugin-react@4.7.0(vite@6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.2))':
dependencies:
'@babel/core': 7.28.0
'@babel/plugin-transform-react-jsx-self': 7.27.1(@babel/core@7.28.0)
@@ -2890,7 +2824,7 @@ snapshots:
'@rolldown/pluginutils': 1.0.0-beta.27
'@types/babel__core': 7.20.5
react-refresh: 0.17.0
vite: 6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.1)
vite: 6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.2)
transitivePeerDependencies:
- supports-color
@@ -2923,7 +2857,7 @@ snapshots:
astring@1.9.0: {}
astro@5.14.3(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.1)(rollup@4.52.4)(typescript@5.8.3):
astro@5.14.8(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.2)(rollup@4.52.4)(typescript@5.8.3):
dependencies:
'@astrojs/compiler': 2.13.0
'@astrojs/internal-helpers': 0.7.4
@@ -2979,8 +2913,8 @@ snapshots:
unist-util-visit: 5.0.0
unstorage: 1.17.1
vfile: 6.0.3
vite: 6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.1)
vitefu: 1.1.1(vite@6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.1))
vite: 6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.2)
vitefu: 1.1.1(vite@6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.2))
xxhash-wasm: 1.1.0
yargs-parser: 21.1.1
yocto-spinner: 0.2.3
@@ -3078,8 +3012,6 @@ snapshots:
dependencies:
readdirp: 4.1.2
chownr@3.0.0: {}
ci-info@4.3.1: {}
class-variance-authority@0.7.1:
@@ -3131,8 +3063,6 @@ snapshots:
destr@2.0.5: {}
detect-libc@2.0.4: {}
detect-libc@2.1.2: {}
deterministic-object-hash@2.0.2:
@@ -3309,7 +3239,7 @@ snapshots:
hast-util-from-parse5: 8.0.3
parse5: 7.3.0
vfile: 6.0.3
vfile-message: 4.0.2
vfile-message: 4.0.3
hast-util-from-parse5@8.0.3:
dependencies:
@@ -3397,7 +3327,7 @@ snapshots:
space-separated-tokens: 2.0.2
style-to-js: 1.1.17
unist-util-position: 5.0.0
vfile-message: 4.0.2
vfile-message: 4.0.3
transitivePeerDependencies:
- supports-color
@@ -3483,50 +3413,54 @@ snapshots:
kleur@4.1.5: {}
lightningcss-darwin-arm64@1.30.1:
lightningcss-android-arm64@1.30.2:
optional: true
lightningcss-darwin-x64@1.30.1:
lightningcss-darwin-arm64@1.30.2:
optional: true
lightningcss-freebsd-x64@1.30.1:
lightningcss-darwin-x64@1.30.2:
optional: true
lightningcss-linux-arm-gnueabihf@1.30.1:
lightningcss-freebsd-x64@1.30.2:
optional: true
lightningcss-linux-arm64-gnu@1.30.1:
lightningcss-linux-arm-gnueabihf@1.30.2:
optional: true
lightningcss-linux-arm64-musl@1.30.1:
lightningcss-linux-arm64-gnu@1.30.2:
optional: true
lightningcss-linux-x64-gnu@1.30.1:
lightningcss-linux-arm64-musl@1.30.2:
optional: true
lightningcss-linux-x64-musl@1.30.1:
lightningcss-linux-x64-gnu@1.30.2:
optional: true
lightningcss-win32-arm64-msvc@1.30.1:
lightningcss-linux-x64-musl@1.30.2:
optional: true
lightningcss-win32-x64-msvc@1.30.1:
lightningcss-win32-arm64-msvc@1.30.2:
optional: true
lightningcss@1.30.1:
lightningcss-win32-x64-msvc@1.30.2:
optional: true
lightningcss@1.30.2:
dependencies:
detect-libc: 2.1.2
optionalDependencies:
lightningcss-darwin-arm64: 1.30.1
lightningcss-darwin-x64: 1.30.1
lightningcss-freebsd-x64: 1.30.1
lightningcss-linux-arm-gnueabihf: 1.30.1
lightningcss-linux-arm64-gnu: 1.30.1
lightningcss-linux-arm64-musl: 1.30.1
lightningcss-linux-x64-gnu: 1.30.1
lightningcss-linux-x64-musl: 1.30.1
lightningcss-win32-arm64-msvc: 1.30.1
lightningcss-win32-x64-msvc: 1.30.1
lightningcss-android-arm64: 1.30.2
lightningcss-darwin-arm64: 1.30.2
lightningcss-darwin-x64: 1.30.2
lightningcss-freebsd-x64: 1.30.2
lightningcss-linux-arm-gnueabihf: 1.30.2
lightningcss-linux-arm64-gnu: 1.30.2
lightningcss-linux-arm64-musl: 1.30.2
lightningcss-linux-x64-gnu: 1.30.2
lightningcss-linux-x64-musl: 1.30.2
lightningcss-win32-arm64-msvc: 1.30.2
lightningcss-win32-x64-msvc: 1.30.2
lodash.debounce@4.0.8: {}
@@ -3538,7 +3472,7 @@ snapshots:
dependencies:
yallist: 3.1.1
lucide-react@0.544.0(react@19.2.0):
lucide-react@0.546.0(react@19.2.0):
dependencies:
react: 19.2.0
@@ -3667,7 +3601,7 @@ snapshots:
parse-entities: 4.0.2
stringify-entities: 4.0.4
unist-util-stringify-position: 4.0.0
vfile-message: 4.0.2
vfile-message: 4.0.3
transitivePeerDependencies:
- supports-color
@@ -3991,12 +3925,6 @@ snapshots:
transitivePeerDependencies:
- supports-color
minipass@7.1.2: {}
minizlib@3.1.0:
dependencies:
minipass: 7.1.2
mrmime@2.0.1: {}
ms@2.1.3: {}
@@ -4381,18 +4309,10 @@ snapshots:
tailwind-merge@3.3.1: {}
tailwindcss@4.1.14: {}
tailwindcss@4.1.15: {}
tapable@2.2.2: {}
tar@7.5.1:
dependencies:
'@isaacs/fs-minipass': 4.0.1
chownr: 3.0.0
minipass: 7.1.2
minizlib: 3.1.0
yallist: 5.0.0
thumbhash@0.1.1: {}
tiny-inflate@1.0.3: {}
@@ -4522,11 +4442,6 @@ snapshots:
'@types/unist': 3.0.3
vfile: 6.0.3
vfile-message@4.0.2:
dependencies:
'@types/unist': 3.0.3
unist-util-stringify-position: 4.0.0
vfile-message@4.0.3:
dependencies:
'@types/unist': 3.0.3
@@ -4537,7 +4452,7 @@ snapshots:
'@types/unist': 3.0.3
vfile-message: 4.0.3
vite@6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.1):
vite@6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.2):
dependencies:
esbuild: 0.25.10
fdir: 6.5.0(picomatch@4.0.3)
@@ -4549,11 +4464,11 @@ snapshots:
'@types/node': 24.7.1
fsevents: 2.3.3
jiti: 2.6.1
lightningcss: 1.30.1
lightningcss: 1.30.2
vitefu@1.1.1(vite@6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.1)):
vitefu@1.1.1(vite@6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.2)):
optionalDependencies:
vite: 6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.1)
vite: 6.3.6(@types/node@24.7.1)(jiti@2.6.1)(lightningcss@1.30.2)
web-namespaces@2.0.1: {}
@@ -4573,8 +4488,6 @@ snapshots:
yallist@3.1.1: {}
yallist@5.0.0: {}
yargs-parser@21.1.1: {}
yocto-queue@1.2.1: {}

Binary files not shown: 12 image assets changed in this diff (3 added, 9 updated; sizes range from 31 KiB to 986 KiB).

Some files were not shown because too many files have changed in this diff.