Mirror of https://github.com/astral-sh/setup-uv.git, synced 2026-03-14 17:14:58 +00:00

Compare commits

6 Commits
| Author | SHA1 | Date |
|---|---|---|
| | fd8f376b22 | |
| | f9070de1ea | |
| | cadb67bdc9 | |
| | e06108dd0a | |
| | 0f6ec07aaf | |
| | 821e5c9815 | |
48 .agents/skills/dependabot-pr-rollup/SKILL.md Normal file
@@ -0,0 +1,48 @@
---
name: dependabot-pr-rollup
description: Find open Dependabot PRs for the current GitHub repo, compare each PR head to its base branch, replay only the net dependency changes in a fresh worktree and branch, run npm validation, and optionally commit, push, and open a PR. Use when you want to batch or manually replicate active Dependabot updates.
license: MIT
compatibility: Requires git, git worktree, an authenticated gh CLI, npm, and a GitHub repo with an origin remote.
---

# Dependabot PR Rollup

## When to use

Use this skill when the user wants to:

- find all open Dependabot PRs in the current repo
- reproduce their net effect in one local branch
- validate the result with the repo's standard npm checks
- optionally commit, push, and open a PR

## Workflow

1. Inspect the current checkout state, but do not reuse a dirty worktree.
2. List open Dependabot PRs with `gh pr list --state open --author app/dependabot`.
3. For each PR, collect the title, base branch, head branch, changed files, and relevant diffs.
4. Compare each PR head against `origin/<base>` instead of trusting the PR title. Dependabot PRs can already be partially merged, superseded by newer versions, or have no remaining net effect.
5. Create a new worktree and branch from `origin/<base>`.
6. Reproduce only the remaining dependency changes in the new worktree.
   - Inspect `package.json` before editing.
   - Run `npm ci --ignore-scripts` before applying updates.
   - Use `npm install ... --ignore-scripts` for direct dependency changes so `package-lock.json` stays in sync.
7. Run `npm run all`.
8. If requested, commit the changed source, lockfile, and generated artifacts, then push and open a PR.
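The head-vs-base comparison in step 4 boils down to diffing two dependency maps and keeping only the entries that still differ. A minimal TypeScript sketch; the package names and version numbers below are invented for illustration, not taken from a real PR:

```typescript
// A dependency map as it appears in package.json ("dependencies" section).
type DepMap = Record<string, string>;

interface NetChange {
  name: string;
  from?: string; // version on the base branch, if present there
  to?: string; // version on the PR head, if present there
}

// Return only the dependencies whose version differs between base and head.
function netDependencyChanges(base: DepMap, head: DepMap): NetChange[] {
  const changes: NetChange[] = [];
  const names = new Set([...Object.keys(base), ...Object.keys(head)]);
  for (const name of names) {
    if (base[name] !== head[name]) {
      changes.push({ name, from: base[name], to: head[name] });
    }
  }
  return changes;
}

// A bump that was already merged to the base branch produces no entry,
// which is how "no remaining net effect" shows up.
const base: DepMap = { "@actions/core": "1.11.1", undici: "6.21.0" };
const head: DepMap = { "@actions/core": "1.11.1", undici: "6.21.1" };
console.log(netDependencyChanges(base, head)); // only the undici bump remains
```

An empty result means the PR needs no replaying at all, which is exactly the "no remaining net changes" case reported back at the end.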

## Repo-specific notes

- Use `gh` for GitHub operations.
- Keep the user's original checkout untouched by working in a separate worktree.
- In this repo, `npm run all` is the safest validation command because it runs build, check, package, and test.
- If dependency changes affect bundled output, include the regenerated `dist/` files.

## Report back

Always report:

- open Dependabot PRs found
- which PRs required no net changes
- new branch name
- new worktree path
- files changed
- `npm run all` result
- if applicable, commit SHA and PR URL
263 .github/copilot-instructions.md vendored
@@ -1,263 +0,0 @@
# Copilot Instructions for setup-uv

This document provides essential information for GitHub Copilot coding agents working on the `astral-sh/setup-uv` repository.

## Repository Overview

**setup-uv** is a GitHub Action that sets up the [uv](https://docs.astral.sh/uv/)
Python package installer in GitHub Actions workflows.
It's a TypeScript-based action that downloads uv binaries, manages caching, handles version resolution,
and configures the environment for subsequent workflow steps.

### Key Features

- Downloads and installs specific uv versions from GitHub releases
- Supports version resolution from config files (pyproject.toml, uv.toml, .tool-versions)
- Implements intelligent caching for both uv cache and Python installations
- Provides cross-platform support (Linux, macOS, Windows, including ARM architectures)
- Includes problem matchers for Python error reporting
- Supports environment activation and custom tool directories

## Repository Structure

**Size**: Small-medium repository (~50 source files, ~400 total files including dependencies)
**Languages**: TypeScript (primary), JavaScript (compiled output), JSON (configuration)
**Runtime**: Node.js 24 (GitHub Actions runtime)
**Key Dependencies**: @actions/core, @actions/cache, @actions/tool-cache, @octokit/core

### Core Architecture

```
src/
├── setup-uv.ts               # Main entry point and orchestration
├── save-cache.ts             # Post-action cache saving logic
├── update-known-versions.ts  # Maintenance script for version updates
├── cache/                    # Cache management functionality
├── download/                 # Version resolution and binary downloading
├── utils/                    # Input parsing, platform detection, configuration
└── version/                  # Version resolution from various file formats
```

### Key Files and Locations

- **Action Definition**: `action.yml` - Defines all inputs/outputs and entry points
- **Main Source**: `src/setup-uv.ts` - Primary action logic
- **Configuration**: `biome.json` (linting), `tsconfig.json` (TypeScript), `jest.config.js` (testing)
- **Compiled Output**: `dist/` - Contains compiled Node.js bundles (auto-generated, committed)
- **Test Fixtures**: `__tests__/fixtures/` - Sample projects for different configuration scenarios
- **Workflows**: `.github/workflows/test.yml` - Comprehensive CI/CD pipeline

## Build and Development Process

### Prerequisites

- Node.js 24+ (matches GitHub Actions runtime)
- npm (included with Node.js)

### Essential Commands (ALWAYS run in this order)

#### 1. Install Dependencies

```bash
npm ci --ignore-scripts
```

**Timing**: ~20-30 seconds
**Note**: Always run this first after cloning or when package.json changes

#### 2. Build TypeScript

```bash
npm run build
```

**Timing**: ~5-10 seconds
**Purpose**: Compiles TypeScript source to JavaScript in `lib/` directory

#### 3. Lint and Format Code

```bash
npm run check
```

**Timing**: ~2-5 seconds
**Tool**: Biome (replaces ESLint/Prettier)
**Auto-fixes**: Formatting, import organization, basic linting issues

#### 4. Package for Distribution

```bash
npm run package
```

**Timing**: ~20-30 seconds
**Purpose**: Creates bundled distributions in `dist/` using @vercel/ncc
**Critical**: This step MUST be run before committing - the `dist/` files are used by GitHub Actions

#### 5. Run Tests

```bash
npm test
```

**Timing**: ~10-15 seconds
**Framework**: Jest with TypeScript support
**Coverage**: Unit tests for version resolution, input parsing, checksum validation

#### 6. Complete Validation (Recommended)

```bash
npm run all
```

**Timing**: ~60-90 seconds
**Purpose**: Runs build → check → package → test in sequence
**Use**: Before making pull requests or when unsure about build state

### Important Build Notes

**CRITICAL**: Always run `npm run package` after making code changes. The `dist/` directory contains the compiled bundles that GitHub Actions actually executes. Forgetting this step will cause your changes to have no effect.

**TypeScript Warnings**: You may see ts-jest warnings about "isolatedModules" - these are harmless and don't affect functionality.

**Biome**: This project uses Biome instead of ESLint/Prettier. Run `npm run check` to fix formatting and linting issues automatically.

## Testing Strategy

### Unit Tests

- **Location**: `__tests__/` directory
- **Framework**: Jest with ts-jest transformer
- **Coverage**: Version resolution, input parsing, checksum validation, utility functions

### Integration Tests

- **Location**: `.github/workflows/test.yml`
- **Scope**: Full end-to-end testing across multiple platforms and scenarios
- **Key Test Categories**:
  - Version installation (specific, latest, semver ranges)
  - Cache behavior (setup, restore, invalidation)
  - Cross-platform compatibility (Ubuntu, macOS, Windows, ARM)
  - Configuration file parsing (pyproject.toml, uv.toml, requirements.txt)
  - Error handling and edge cases

### Test Fixtures

Located in `__tests__/fixtures/`, these provide sample projects with different configurations:

- `pyproject-toml-project/` - Standard Python project with uv version specification
- `uv-toml-project/` - Project using uv.toml configuration
- `requirements-txt-project/` - Legacy requirements.txt format
- `cache-dir-defined-project/` - Custom cache directory configuration

## Continuous Integration

### GitHub Workflows

#### Primary Test Suite (`.github/workflows/test.yml`)

- **Triggers**: PRs, pushes to main, manual dispatch
- **Matrix**: Multiple OS (Ubuntu, macOS, Windows), architecture (x64, ARM), and configuration combinations
- **Duration**: ~5 minutes for full matrix
- **Key Validations**:
  - Cross-platform installation and functionality
  - Cache behavior and performance
  - Version resolution from various sources
  - Tool directory configurations
  - Problem matcher functionality

#### Maintenance Workflows

- **CodeQL Analysis**: Security scanning on pushes/PRs
- **Update Known Versions**: Daily job to sync with latest uv releases
- **Dependabot**: Automated dependency updates

### Pre-commit Validation

The CI runs these checks that you should run locally:

1. `npm run all` - Complete build and test suite
2. ActionLint - GitHub Actions workflow validation
3. Change detection - Ensures no uncommitted build artifacts

## Key Configuration Files

### Action Configuration (`action.yml`)

Defines 20+ inputs including version specifications,
cache settings, tool directories, and environment options.
This file is the authoritative source for understanding available action parameters.

### TypeScript Configuration (`tsconfig.json`)

- Target: ES2024
- Module: nodenext (Node.js modules)
- Strict mode enabled
- Output directory: `lib/`

### Linting Configuration (`biome.json`)

- Formatter and linter combined
- Enforces consistent code style
- Automatically organizes imports and sorts object keys

## Common Development Patterns

### Making Code Changes

1. Edit TypeScript source files in `src/`
2. Run `npm run build` to compile
3. Run `npm run check` to format and lint
4. Run `npm run package` to update distribution bundles
5. Run `npm test` to verify functionality
6. Commit all changes including `dist/` files

### Adding New Features

- Follow existing patterns in `src/utils/inputs.ts` for new action inputs
- Update `action.yml` to declare new inputs/outputs
- Add corresponding tests in `__tests__/`
- Add a test in `.github/workflows/test.yml` if it affects integration
- Update README.md with usage examples
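As background for new inputs: the Actions runner exposes each `action.yml` input to the process as an `INPUT_<NAME>` environment variable (uppercased, spaces replaced with underscores), which is what `core.getInput` reads under the hood. A minimal stand-in sketch of that mechanism; this is not the action's actual `src/utils/inputs.ts` code, and the input names are illustrative:

```typescript
// Minimal stand-in for @actions/core's getInput: read the INPUT_<NAME>
// environment variable the runner sets for each action.yml input.
function getInput(name: string): string {
  const key = `INPUT_${name.replace(/ /g, "_").toUpperCase()}`;
  return (process.env[key] ?? "").trim();
}

// Boolean inputs arrive as strings; treat only "true" as truthy.
function getBooleanInput(name: string): boolean {
  return getInput(name).toLowerCase() === "true";
}

// Simulate what the runner would set for `enable-cache: "true"`:
process.env["INPUT_ENABLE-CACHE"] = "true";
console.log(getBooleanInput("enable-cache")); // true
```

In the real action, prefer wrapping `core.getInput` rather than reading the environment directly, so defaults and required-input errors behave consistently.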

### Cache-Related Changes

- Cache logic is complex and affects performance significantly
- Test with multiple cache scenarios (hit, miss, invalidation)
- Consider impact on both GitHub-hosted and self-hosted runners
- Validate cache key generation and dependency detection
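A common way such keys are derived is by hashing the dependency files, so any content change invalidates the key. A sketch of that idea using Node's built-in crypto; the key prefix and inputs are illustrative, not the action's real key format:

```typescript
import { createHash } from "node:crypto";

// Derive a cache key from the contents of dependency files: identical
// contents always yield the same key, any change yields a new one.
function cacheKey(prefix: string, depFileContents: string[]): string {
  const hash = createHash("sha256");
  for (const content of depFileContents) {
    hash.update(content);
  }
  return `${prefix}-${hash.digest("hex").slice(0, 16)}`;
}

// Hypothetical key for one runner/arch combination and one pyproject.toml:
const key = cacheKey("setup-uv-ubuntu-x86_64", ['requires-python = ">=3.12"']);
console.log(key); // stable until a dependency file changes
```

The important property to preserve when touching cache code is determinism: the same platform and dependency inputs must always reproduce the same key.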

### Version Resolution Changes

- Version resolution supports multiple file formats and precedence rules
- Test with fixtures in `__tests__/fixtures/`
- Consider backward compatibility with existing projects
- Validate semver and PEP 440 specification handling
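The core of range handling is picking the highest known version that satisfies a spec. A hand-rolled sketch standing in for the `semver` package the project actually uses; it understands only exact versions and a simplified caret range (real caret semantics for 0.x versions are stricter), which is enough to illustrate the shape of the logic:

```typescript
// Compare two dotted versions numerically, part by part.
function compareSemver(a: string, b: string): number {
  const pa = a.split(".").map(Number);
  const pb = b.split(".").map(Number);
  for (let i = 0; i < 3; i++) {
    if ((pa[i] ?? 0) !== (pb[i] ?? 0)) return (pa[i] ?? 0) - (pb[i] ?? 0);
  }
  return 0;
}

// Pick the highest version matching an exact spec or a simplified ^range
// (same major component and at least the range's base version).
function maxSatisfying(versions: string[], range: string): string | undefined {
  const satisfies = (v: string): boolean => {
    if (range.startsWith("^")) {
      const base = range.slice(1);
      return (
        v.split(".")[0] === base.split(".")[0] && compareSemver(v, base) >= 0
      );
    }
    return v === range;
  };
  return versions.filter(satisfies).sort(compareSemver).pop();
}

console.log(maxSatisfying(["0.9.25", "0.9.26", "0.8.0"], "^0.9.0")); // 0.9.26
```

When changing the real resolution code, keep the "highest satisfying version wins" property and cover it with fixture-driven tests, since both semver ranges and PEP 440 specifiers funnel into this selection step.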

## Troubleshooting

### Build Failures

- **"Module not found"**: Run `npm ci --ignore-scripts` to ensure dependencies are installed
- **TypeScript errors**: Check `tsconfig.json` and ensure all imports are valid
- **Test failures**: Check if test fixtures have been modified or if logic changes broke assumptions

### Action Failures in Workflows

- **Changes not taking effect**: Ensure `npm run package` was run and `dist/` files committed
- **Version resolution issues**: Check version specification format and file existence
- **Cache problems**: Verify cache key generation and dependency glob patterns

### Common Gotchas

- **Forgetting to package**: Code changes won't work without running `npm run package`
- **Platform differences**: Windows paths use backslashes, test cross-platform behavior
- **Cache invalidation**: Cache keys are sensitive to dependency file changes
- **Tool directory permissions**: Some platforms require specific directory setups

## Trust These Instructions

These instructions are comprehensive and current. Only search for additional information if:

- You encounter specific error messages not covered here
- You need to understand implementation details of specific functions
- The instructions appear outdated (check repository commit history)

For most development tasks, following the build process and development patterns outlined above will be sufficient.
2 .github/release-drafter.yml vendored
@@ -19,7 +19,7 @@ categories:
    labels:
      - "maintenance"
      - "ci"
-     - "update-known-versions"
+     - "update-known-checksums"
  - title: "📚 Documentation"
    labels:
      - "documentation"
2 .github/workflows/test.yml vendored
@@ -38,7 +38,7 @@ jobs:
          npm run all
      - name: Check all jobs are in all-tests-passed.needs
        run: |
-         tsc check-all-tests-passed-needs.ts
+         tsc --module nodenext --moduleResolution nodenext --target es2022 check-all-tests-passed-needs.ts
          node check-all-tests-passed-needs.js
        working-directory: .github/scripts
      - name: Make sure no changes from linters are detected
@@ -1,4 +1,4 @@
-name: "Update known versions"
+name: "Update known checksums"
 on:
   workflow_dispatch:
   schedule:
@@ -20,14 +20,13 @@ jobs:
          persist-credentials: true
      - uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
        with:
-         node-version: "20"
-     - name: Update known versions
-       id: update-known-versions
+         node-version-file: .nvmrc
+         cache: npm
+     - name: Update known checksums
+       id: update-known-checksums
        run:
-         node dist/update-known-versions/index.js
+         node dist/update-known-checksums/index.cjs
          src/download/checksum/known-checksums.ts
-         version-manifest.json
-         ${{ secrets.GITHUB_TOKEN }}
      - name: Check for changes
        id: changes-exist
        run: |
@@ -48,10 +47,10 @@ jobs:
          git config user.name "$GITHUB_ACTOR"
          git config user.email "$GITHUB_ACTOR@users.noreply.github.com"
          git add .
-         git commit -m "chore: update known versions for $LATEST_VERSION"
+         git commit -m "chore: update known checksums for $LATEST_VERSION"
          git push origin HEAD:refs/heads/main
        env:
-         LATEST_VERSION: ${{ steps.update-known-versions.outputs.latest-version }}
+         LATEST_VERSION: ${{ steps.update-known-checksums.outputs.latest-version }}

      - name: Create Pull Request
        if: ${{ steps.changes-exist.outputs.changes-exist == 'true' && steps.commit-and-push.outcome != 'success' }}
@@ -60,11 +59,11 @@ jobs:
          commit-message: "chore: update known checksums"
          title:
            "chore: update known checksums for ${{
-           steps.update-known-versions.outputs.latest-version }}"
+           steps.update-known-checksums.outputs.latest-version }}"
          body:
            "chore: update known checksums for ${{
-           steps.update-known-versions.outputs.latest-version }}"
+           steps.update-known-checksums.outputs.latest-version }}"
          base: main
-         labels: "automated-pr,update-known-versions"
+         labels: "automated-pr,update-known-checksums"
-         branch: update-known-versions-pr
+         branch: update-known-checksums-pr
          delete-branch: true
18 AGENTS.md Normal file
@@ -0,0 +1,18 @@
# setup-uv agent notes

This repository is a TypeScript-based GitHub Action for installing `uv` in GitHub Actions workflows. It also supports restoring/saving the `uv` cache and optional managed-Python caching.

- The published action runs the committed bundles in `dist/`, not the TypeScript in `src/`. After any code change, run `npm run package` and commit the resulting `dist/` updates.
- Standard local validation is:
  1. `npm ci --ignore-scripts`
  2. `npm run all`
- `npm run check` uses Biome (not ESLint/Prettier) and rewrites files in place.
- User-facing changes are usually multi-file changes. If you add or change inputs, outputs, or behavior, update `action.yml`, the implementation in `src/`, tests in `__tests__/`, relevant docs/README, and then re-package.
- The easiest areas to regress are version resolution and caching. When touching them, add or update tests for precedence, cache invalidation, and cross-platform path behavior.
- Workflow edits have extra CI-only checks (`actionlint` and `zizmor`); `npm run all` does not cover them.
- Source is authored with bundler-friendly TypeScript, but published action artifacts in `dist/` are bundled as CommonJS for maximum GitHub Actions runtime compatibility with `@actions/*` dependencies.
- Keep these concerns separate when changing module formats:
  - `src/` and tests may use modern ESM-friendly TypeScript patterns.
  - `dist/` should prioritize runtime reliability over format purity.
  - Do not switch published bundles to ESM without validating the actual committed artifacts under the target Node runtime.
- Before finishing, make sure validation does not leave generated or formatting-only diffs behind.
14 README.md
@@ -68,7 +68,7 @@ Have a look under [Advanced Configuration](#advanced-configuration) for detailed
      # The checksum of the uv version to install
      checksum: ""

-     # Used to increase the rate limit when retrieving versions and downloading uv
+     # Used when downloading uv from GitHub releases
      github-token: ${{ github.token }}

      # Enable uploading of the uv cache: true, false, or auto (enabled on GitHub-hosted runners, disabled on self-hosted runners)
@@ -114,7 +114,7 @@ Have a look under [Advanced Configuration](#advanced-configuration) for detailed
      # Custom path to set UV_TOOL_BIN_DIR to
      tool-bin-dir: ""

-     # URL to the manifest file containing available versions and download URLs
+     # URL to a custom manifest file (NDJSON preferred, legacy JSON array is deprecated)
      manifest-file: ""

      # Add problem matchers
@@ -190,10 +190,12 @@ For more advanced configuration options, see our detailed documentation:

## How it works

-This action downloads uv from the uv repo's official
-[GitHub Releases](https://github.com/astral-sh/uv) and uses the
-[GitHub Actions Toolkit](https://github.com/actions/toolkit) to cache it as a tool to speed up
-consecutive runs on self-hosted runners.
+By default, this action resolves uv versions from
+[`astral-sh/versions`](https://github.com/astral-sh/versions) (NDJSON) and downloads uv from the
+official [GitHub Releases](https://github.com/astral-sh/uv).
+
+It then uses the [GitHub Actions Toolkit](https://github.com/actions/toolkit) to cache uv as a
+tool to speed up consecutive runs on self-hosted runners.

The installed version of uv is then added to the runner PATH, enabling later steps to invoke it
by name (`uv`).
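NDJSON manifests like the one referenced above carry one JSON object per line. A sketch of that parsing step; the record shape here is an assumption for illustration, not the published schema of `astral-sh/versions`:

```typescript
// Hypothetical shape of one manifest record; the real schema may carry
// more fields (artifact URLs, checksums, platforms).
interface VersionRecord {
  version: string;
}

// NDJSON: split on newlines, skip blanks, parse each line independently.
function parseNdjson(text: string): VersionRecord[] {
  return text
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as VersionRecord);
}

const manifest = '{"version":"0.9.26"}\n{"version":"0.9.25"}\n';
console.log(parseNdjson(manifest).map((r) => r.version)); // [ '0.9.26', '0.9.25' ]
```

Line-at-a-time parsing is what makes NDJSON appendable: new releases can be added to the manifest without rewriting or revalidating the whole document.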
@@ -4,10 +4,11 @@ import {
   validateChecksum,
 } from "../../../src/download/checksum/checksum";

-test("checksum should match", async () => {
-  const validChecksum =
-    "f3da96ec7e995debee7f5d52ecd034dfb7074309a1da42f76429ecb814d813a3";
-  const filePath = "__tests__/fixtures/checksumfile";
+const validChecksum =
+  "f3da96ec7e995debee7f5d52ecd034dfb7074309a1da42f76429ecb814d813a3";
+const filePath = "__tests__/fixtures/checksumfile";
+
+test("checksum should match", async () => {
   // string params don't matter only test the checksum mechanism, not known checksums
   await validateChecksum(
     validChecksum,
@@ -18,6 +19,16 @@ test("checksum should match", async () => {
   );
 });

+test("provided checksum beats known checksums", async () => {
+  await validateChecksum(
+    validChecksum,
+    filePath,
+    "x86_64",
+    "unknown-linux-gnu",
+    "0.3.0",
+  );
+});
+
 type KnownVersionFixture = { version: string; known: boolean };

 it.each<KnownVersionFixture>([
254 __tests__/download/download-version.test.ts Normal file
@@ -0,0 +1,254 @@
import { beforeEach, describe, expect, it, jest } from "@jest/globals";
import * as semver from "semver";

const mockInfo = jest.fn();
const mockWarning = jest.fn();

jest.unstable_mockModule("@actions/core", () => ({
  debug: jest.fn(),
  info: mockInfo,
  warning: mockWarning,
}));

// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockDownloadTool = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockExtractTar = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockExtractZip = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockCacheDir = jest.fn<any>();

jest.unstable_mockModule("@actions/tool-cache", () => ({
  cacheDir: mockCacheDir,
  downloadTool: mockDownloadTool,
  evaluateVersions: (versions: string[], range: string) =>
    semver.maxSatisfying(versions, range) ?? "",
  extractTar: mockExtractTar,
  extractZip: mockExtractZip,
  find: () => "",
  findAllVersions: () => [],
  isExplicitVersion: (version: string) => semver.valid(version) !== null,
}));

// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetLatestVersionFromNdjson = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetAllVersionsFromNdjson = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetArtifactFromNdjson = jest.fn<any>();

jest.unstable_mockModule("../../src/download/versions-client", () => ({
  getAllVersions: mockGetAllVersionsFromNdjson,
  getArtifact: mockGetArtifactFromNdjson,
  getLatestVersion: mockGetLatestVersionFromNdjson,
}));

// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetAllManifestVersions = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetLatestVersionInManifest = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetManifestArtifact = jest.fn<any>();

jest.unstable_mockModule("../../src/download/version-manifest", () => ({
  getAllVersions: mockGetAllManifestVersions,
  getLatestKnownVersion: mockGetLatestVersionInManifest,
  getManifestArtifact: mockGetManifestArtifact,
}));

// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockValidateChecksum = jest.fn<any>();

jest.unstable_mockModule("../../src/download/checksum/checksum", () => ({
  validateChecksum: mockValidateChecksum,
}));

const {
  downloadVersionFromManifest,
  downloadVersionFromNdjson,
  resolveVersion,
} = await import("../../src/download/download-version");

describe("download-version", () => {
  beforeEach(() => {
    mockInfo.mockReset();
    mockWarning.mockReset();
    mockDownloadTool.mockReset();
    mockExtractTar.mockReset();
    mockExtractZip.mockReset();
    mockCacheDir.mockReset();
    mockGetLatestVersionFromNdjson.mockReset();
    mockGetAllVersionsFromNdjson.mockReset();
    mockGetArtifactFromNdjson.mockReset();
    mockGetAllManifestVersions.mockReset();
    mockGetLatestVersionInManifest.mockReset();
    mockGetManifestArtifact.mockReset();
    mockValidateChecksum.mockReset();

    mockDownloadTool.mockResolvedValue("/tmp/downloaded");
    mockExtractTar.mockResolvedValue("/tmp/extracted");
    mockExtractZip.mockResolvedValue("/tmp/extracted");
    mockCacheDir.mockResolvedValue("/tmp/cached");
  });

  describe("resolveVersion", () => {
    it("uses astral-sh/versions to resolve latest", async () => {
      mockGetLatestVersionFromNdjson.mockResolvedValue("0.9.26");

      const version = await resolveVersion("latest", undefined);

      expect(version).toBe("0.9.26");
      expect(mockGetLatestVersionFromNdjson).toHaveBeenCalledTimes(1);
    });

    it("uses astral-sh/versions to resolve available versions", async () => {
      mockGetAllVersionsFromNdjson.mockResolvedValue(["0.9.26", "0.9.25"]);

      const version = await resolveVersion("^0.9.0", undefined);

      expect(version).toBe("0.9.26");
      expect(mockGetAllVersionsFromNdjson).toHaveBeenCalledTimes(1);
    });

    it("does not fall back when astral-sh/versions fails", async () => {
      mockGetLatestVersionFromNdjson.mockRejectedValue(
        new Error("NDJSON unavailable"),
      );

      await expect(resolveVersion("latest", undefined)).rejects.toThrow(
        "NDJSON unavailable",
      );
    });

    it("uses manifest-file when provided", async () => {
      mockGetAllManifestVersions.mockResolvedValue(["0.9.26", "0.9.25"]);

      const version = await resolveVersion(
        "^0.9.0",
        "https://example.com/custom.ndjson",
      );

      expect(version).toBe("0.9.26");
      expect(mockGetAllManifestVersions).toHaveBeenCalledWith(
        "https://example.com/custom.ndjson",
      );
    });
  });

  describe("downloadVersionFromNdjson", () => {
    it("fails when NDJSON metadata lookup fails", async () => {
      mockGetArtifactFromNdjson.mockRejectedValue(
        new Error("NDJSON unavailable"),
      );

      await expect(
        downloadVersionFromNdjson(
          "unknown-linux-gnu",
          "x86_64",
          "0.9.26",
          undefined,
          "token",
        ),
      ).rejects.toThrow("NDJSON unavailable");

      expect(mockDownloadTool).not.toHaveBeenCalled();
      expect(mockValidateChecksum).not.toHaveBeenCalled();
    });

    it("fails when no matching artifact exists in NDJSON metadata", async () => {
      mockGetArtifactFromNdjson.mockResolvedValue(undefined);

      await expect(
        downloadVersionFromNdjson(
          "unknown-linux-gnu",
          "x86_64",
          "0.9.26",
          undefined,
          "token",
        ),
      ).rejects.toThrow(
        "Could not find artifact for version 0.9.26, arch x86_64, platform unknown-linux-gnu in https://raw.githubusercontent.com/astral-sh/versions/main/v1/uv.ndjson .",
      );

      expect(mockDownloadTool).not.toHaveBeenCalled();
      expect(mockValidateChecksum).not.toHaveBeenCalled();
    });

    it("uses built-in checksums for default NDJSON downloads", async () => {
      mockGetArtifactFromNdjson.mockResolvedValue({
        archiveFormat: "tar.gz",
        sha256: "ndjson-checksum-that-should-be-ignored",
        url: "https://example.com/uv.tar.gz",
      });

      await downloadVersionFromNdjson(
        "unknown-linux-gnu",
        "x86_64",
        "0.9.26",
        undefined,
        "token",
      );

      expect(mockValidateChecksum).toHaveBeenCalledWith(
        undefined,
        "/tmp/downloaded",
        "x86_64",
        "unknown-linux-gnu",
        "0.9.26",
      );
    });
  });
|
||||||
|
|
||||||
|
describe("downloadVersionFromManifest", () => {
|
||||||
|
it("uses manifest-file checksum metadata when checksum input is unset", async () => {
|
||||||
|
mockGetManifestArtifact.mockResolvedValue({
|
||||||
|
archiveFormat: "tar.gz",
|
||||||
|
checksum: "manifest-checksum",
|
||||||
|
downloadUrl: "https://example.com/custom-uv.tar.gz",
|
||||||
|
});
|
||||||
|
|
||||||
|
await downloadVersionFromManifest(
|
||||||
|
"https://example.com/custom.ndjson",
|
||||||
|
"unknown-linux-gnu",
|
||||||
|
"x86_64",
|
||||||
|
"0.9.26",
|
||||||
|
"",
|
||||||
|
"token",
|
||||||
|
);
|
||||||
|
|
||||||
|
expect(mockValidateChecksum).toHaveBeenCalledWith(
|
||||||
|
"manifest-checksum",
|
||||||
|
"/tmp/downloaded",
|
||||||
|
"x86_64",
|
||||||
|
"unknown-linux-gnu",
|
||||||
|
"0.9.26",
|
||||||
|
);
|
||||||
|
});
|
||||||
|
|
||||||
|
it("prefers checksum input over manifest-file checksum metadata", async () => {
|
||||||
|
mockGetManifestArtifact.mockResolvedValue({
|
||||||
|
archiveFormat: "tar.gz",
|
||||||
|
checksum: "manifest-checksum",
|
||||||
|
downloadUrl: "https://example.com/custom-uv.tar.gz",
|
||||||
|
});
|
||||||
|
|
||||||
|
await downloadVersionFromManifest(
|
||||||
|
"https://example.com/custom.ndjson",
|
||||||
|
"unknown-linux-gnu",
|
||||||
|
"x86_64",
|
||||||
|
"0.9.26",
|
||||||
|
"user-checksum",
|
||||||
|
"token",
|
||||||
|
);
|
||||||
|
|
||||||
|
expect(mockValidateChecksum).toHaveBeenCalledWith(
|
||||||
|
"user-checksum",
|
||||||
|
"/tmp/downloaded",
|
||||||
|
"x86_64",
|
||||||
|
"unknown-linux-gnu",
|
||||||
|
"0.9.26",
|
||||||
|
);
|
||||||
|
});
|
||||||
|
});
|
||||||
|
});
136 __tests__/download/version-manifest.test.ts Normal file
@@ -0,0 +1,136 @@
import { beforeEach, describe, expect, it, jest } from "@jest/globals";

const mockWarning = jest.fn();

jest.unstable_mockModule("@actions/core", () => ({
  debug: jest.fn(),
  info: jest.fn(),
  warning: mockWarning,
}));

// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockFetch = jest.fn<any>();
jest.unstable_mockModule("../../src/utils/fetch", () => ({
  fetch: mockFetch,
}));

const {
  clearManifestCache,
  getAllVersions,
  getLatestKnownVersion,
  getManifestArtifact,
} = await import("../../src/download/version-manifest");

const legacyManifestResponse = JSON.stringify([
  {
    arch: "x86_64",
    artifactName: "uv-x86_64-unknown-linux-gnu.tar.gz",
    downloadUrl:
      "https://example.com/releases/download/0.7.12-alpha.1/uv-x86_64-unknown-linux-gnu.tar.gz",
    platform: "unknown-linux-gnu",
    version: "0.7.12-alpha.1",
  },
  {
    arch: "x86_64",
    artifactName: "uv-x86_64-unknown-linux-gnu.tar.gz",
    downloadUrl:
      "https://example.com/releases/download/0.7.13/uv-x86_64-unknown-linux-gnu.tar.gz",
    platform: "unknown-linux-gnu",
    version: "0.7.13",
  },
]);

const ndjsonManifestResponse = `{"version":"0.10.0","artifacts":[{"platform":"x86_64-unknown-linux-gnu","variant":"default","url":"https://example.com/releases/download/0.10.0/uv-x86_64-unknown-linux-gnu.tar.gz","archive_format":"tar.gz","sha256":"checksum-100"}]}
{"version":"0.9.30","artifacts":[{"platform":"x86_64-unknown-linux-gnu","variant":"default","url":"https://example.com/releases/download/0.9.30/uv-x86_64-unknown-linux-gnu.tar.gz","archive_format":"tar.gz","sha256":"checksum-0930"}]}`;

const multiVariantManifestResponse = `{"version":"0.10.0","artifacts":[{"platform":"x86_64-unknown-linux-gnu","variant":"managed-python","url":"https://example.com/releases/download/0.10.0/uv-x86_64-unknown-linux-gnu-managed-python.tar.gz","archive_format":"tar.gz","sha256":"checksum-managed"},{"platform":"x86_64-unknown-linux-gnu","variant":"default","url":"https://example.com/releases/download/0.10.0/uv-x86_64-unknown-linux-gnu-default.zip","archive_format":"zip","sha256":"checksum-default"}]}`;

function createMockResponse(
  ok: boolean,
  status: number,
  statusText: string,
  data: string,
) {
  return {
    ok,
    status,
    statusText,
    text: async () => data,
  };
}

describe("version-manifest", () => {
  beforeEach(() => {
    clearManifestCache();
    mockFetch.mockReset();
    mockWarning.mockReset();
  });

  it("supports the legacy JSON manifest format", async () => {
    mockFetch.mockResolvedValue(
      createMockResponse(true, 200, "OK", legacyManifestResponse),
    );

    const latest = await getLatestKnownVersion(
      "https://example.com/legacy.json",
    );
    const artifact = await getManifestArtifact(
      "https://example.com/legacy.json",
      "0.7.13",
      "x86_64",
      "unknown-linux-gnu",
    );

    expect(latest).toBe("0.7.13");
    expect(artifact).toEqual({
      archiveFormat: undefined,
      checksum: undefined,
      downloadUrl:
        "https://example.com/releases/download/0.7.13/uv-x86_64-unknown-linux-gnu.tar.gz",
    });
    expect(mockWarning).toHaveBeenCalledTimes(1);
  });

  it("supports NDJSON manifests", async () => {
    mockFetch.mockResolvedValue(
      createMockResponse(true, 200, "OK", ndjsonManifestResponse),
    );

    const versions = await getAllVersions("https://example.com/custom.ndjson");
    const artifact = await getManifestArtifact(
      "https://example.com/custom.ndjson",
      "0.10.0",
      "x86_64",
      "unknown-linux-gnu",
    );

    expect(versions).toEqual(["0.10.0", "0.9.30"]);
    expect(artifact).toEqual({
      archiveFormat: "tar.gz",
      checksum: "checksum-100",
      downloadUrl:
        "https://example.com/releases/download/0.10.0/uv-x86_64-unknown-linux-gnu.tar.gz",
    });
    expect(mockWarning).not.toHaveBeenCalled();
  });

  it("prefers the default variant when a manifest contains multiple variants", async () => {
    mockFetch.mockResolvedValue(
      createMockResponse(true, 200, "OK", multiVariantManifestResponse),
    );

    const artifact = await getManifestArtifact(
      "https://example.com/multi-variant.ndjson",
      "0.10.0",
      "x86_64",
      "unknown-linux-gnu",
    );

    expect(artifact).toEqual({
      archiveFormat: "zip",
      checksum: "checksum-default",
      downloadUrl:
        "https://example.com/releases/download/0.10.0/uv-x86_64-unknown-linux-gnu-default.zip",
    });
  });
});
170 __tests__/download/versions-client.test.ts Normal file
@@ -0,0 +1,170 @@
import { beforeEach, describe, expect, it, jest } from "@jest/globals";

// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockFetch = jest.fn<any>();

jest.unstable_mockModule("../../src/utils/fetch", () => ({
  fetch: mockFetch,
}));

const {
  clearCache,
  fetchVersionData,
  getAllVersions,
  getArtifact,
  getLatestVersion,
  parseVersionData,
} = await import("../../src/download/versions-client");

const sampleNdjsonResponse = `{"version":"0.9.26","artifacts":[{"platform":"aarch64-apple-darwin","variant":"default","url":"https://github.com/astral-sh/uv/releases/download/0.9.26/uv-aarch64-apple-darwin.tar.gz","archive_format":"tar.gz","sha256":"fcf0a9ea6599c6ae28a4c854ac6da76f2c889354d7c36ce136ef071f7ab9721f"},{"platform":"x86_64-pc-windows-msvc","variant":"default","url":"https://github.com/astral-sh/uv/releases/download/0.9.26/uv-x86_64-pc-windows-msvc.zip","archive_format":"zip","sha256":"eb02fd95d8e0eed462b4a67ecdd320d865b38c560bffcda9a0b87ec944bdf036"}]}
{"version":"0.9.25","artifacts":[{"platform":"aarch64-apple-darwin","variant":"default","url":"https://github.com/astral-sh/uv/releases/download/0.9.25/uv-aarch64-apple-darwin.tar.gz","archive_format":"tar.gz","sha256":"606b3c6949d971709f2526fa0d9f0fd23ccf60e09f117999b406b424af18a6a6"}]}`;

const multiVariantNdjsonResponse = `{"version":"0.9.26","artifacts":[{"platform":"aarch64-apple-darwin","variant":"python-managed","url":"https://github.com/astral-sh/uv/releases/download/0.9.26/uv-aarch64-apple-darwin-managed.tar.gz","archive_format":"tar.gz","sha256":"managed-checksum"},{"platform":"aarch64-apple-darwin","variant":"default","url":"https://github.com/astral-sh/uv/releases/download/0.9.26/uv-aarch64-apple-darwin.zip","archive_format":"zip","sha256":"default-checksum"}]}`;

function createMockResponse(
  ok: boolean,
  status: number,
  statusText: string,
  data: string,
) {
  return {
    ok,
    status,
    statusText,
    text: async () => data,
  };
}

describe("versions-client", () => {
  beforeEach(() => {
    clearCache();
    mockFetch.mockReset();
  });

  describe("fetchVersionData", () => {
    it("should fetch and parse NDJSON data", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(true, 200, "OK", sampleNdjsonResponse),
      );

      const versions = await fetchVersionData();

      expect(versions).toHaveLength(2);
      expect(versions[0].version).toBe("0.9.26");
      expect(versions[1].version).toBe("0.9.25");
    });

    it("should throw error on failed fetch", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(false, 500, "Internal Server Error", ""),
      );

      await expect(fetchVersionData()).rejects.toThrow(
        "Failed to fetch version data: 500 Internal Server Error",
      );
    });

    it("should cache results", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(true, 200, "OK", sampleNdjsonResponse),
      );

      await fetchVersionData();
      await fetchVersionData();

      expect(mockFetch).toHaveBeenCalledTimes(1);
    });
  });

  describe("getLatestVersion", () => {
    it("should return the first version (newest)", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(true, 200, "OK", sampleNdjsonResponse),
      );

      const latest = await getLatestVersion();

      expect(latest).toBe("0.9.26");
    });
  });

  describe("getAllVersions", () => {
    it("should return all version strings", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(true, 200, "OK", sampleNdjsonResponse),
      );

      const versions = await getAllVersions();

      expect(versions).toEqual(["0.9.26", "0.9.25"]);
    });
  });

  describe("getArtifact", () => {
    beforeEach(() => {
      mockFetch.mockResolvedValue(
        createMockResponse(true, 200, "OK", sampleNdjsonResponse),
      );
    });

    it("should find artifact by version and platform", async () => {
      const artifact = await getArtifact("0.9.26", "aarch64", "apple-darwin");

      expect(artifact).toEqual({
        archiveFormat: "tar.gz",
        sha256:
          "fcf0a9ea6599c6ae28a4c854ac6da76f2c889354d7c36ce136ef071f7ab9721f",
        url: "https://github.com/astral-sh/uv/releases/download/0.9.26/uv-aarch64-apple-darwin.tar.gz",
      });
    });

    it("should find windows artifact", async () => {
      const artifact = await getArtifact("0.9.26", "x86_64", "pc-windows-msvc");

      expect(artifact).toEqual({
        archiveFormat: "zip",
        sha256:
          "eb02fd95d8e0eed462b4a67ecdd320d865b38c560bffcda9a0b87ec944bdf036",
        url: "https://github.com/astral-sh/uv/releases/download/0.9.26/uv-x86_64-pc-windows-msvc.zip",
      });
    });

    it("should prefer the default variant when multiple artifacts share a platform", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(true, 200, "OK", multiVariantNdjsonResponse),
      );

      const artifact = await getArtifact("0.9.26", "aarch64", "apple-darwin");

      expect(artifact).toEqual({
        archiveFormat: "zip",
        sha256: "default-checksum",
        url: "https://github.com/astral-sh/uv/releases/download/0.9.26/uv-aarch64-apple-darwin.zip",
      });
    });

    it("should return undefined for unknown version", async () => {
      const artifact = await getArtifact("0.0.1", "aarch64", "apple-darwin");

      expect(artifact).toBeUndefined();
    });

    it("should return undefined for unknown platform", async () => {
      const artifact = await getArtifact(
        "0.9.26",
        "aarch64",
        "unknown-linux-musl",
      );

      expect(artifact).toBeUndefined();
    });
  });

  describe("parseVersionData", () => {
    it("should throw for malformed NDJSON", () => {
      expect(() =>
        parseVersionData('{"version":"0.1.0"', "test-source"),
      ).toThrow("Failed to parse version data from test-source");
    });
  });
});
@@ -1,14 +1,3 @@
-jest.mock("@actions/core", () => {
-  return {
-    debug: jest.fn(),
-    getBooleanInput: jest.fn(
-      (name: string) => (mockInputs[name] ?? "") === "true",
-    ),
-    getInput: jest.fn((name: string) => mockInputs[name] ?? ""),
-    warning: jest.fn(),
-  };
-});
-
 import {
   afterEach,
   beforeEach,
@@ -22,6 +11,26 @@ import {
 let mockInputs: Record<string, string> = {};
 const ORIGINAL_HOME = process.env.HOME;
 
+const mockDebug = jest.fn();
+const mockGetBooleanInput = jest.fn(
+  (name: string) => (mockInputs[name] ?? "") === "true",
+);
+const mockGetInput = jest.fn((name: string) => mockInputs[name] ?? "");
+const mockInfo = jest.fn();
+const mockWarning = jest.fn();
+
+jest.unstable_mockModule("@actions/core", () => ({
+  debug: mockDebug,
+  getBooleanInput: mockGetBooleanInput,
+  getInput: mockGetInput,
+  info: mockInfo,
+  warning: mockWarning,
+}));
+
+async function importInputsModule() {
+  return await import("../../src/utils/inputs");
+}
+
 describe("cacheDependencyGlob", () => {
   beforeEach(() => {
     jest.resetModules();
@@ -36,21 +45,21 @@ describe("cacheDependencyGlob", () => {
 
   it("returns empty string when input not provided", async () => {
     mockInputs["working-directory"] = "/workspace";
-    const { cacheDependencyGlob } = await import("../../src/utils/inputs");
+    const { cacheDependencyGlob } = await importInputsModule();
     expect(cacheDependencyGlob).toBe("");
   });
 
   it("resolves a single relative path", async () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["cache-dependency-glob"] = "requirements.txt";
-    const { cacheDependencyGlob } = await import("../../src/utils/inputs");
+    const { cacheDependencyGlob } = await importInputsModule();
     expect(cacheDependencyGlob).toBe("/workspace/requirements.txt");
   });
 
   it("strips leading ./ from relative path", async () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["cache-dependency-glob"] = "./uv.lock";
-    const { cacheDependencyGlob } = await import("../../src/utils/inputs");
+    const { cacheDependencyGlob } = await importInputsModule();
     expect(cacheDependencyGlob).toBe("/workspace/uv.lock");
   });
 
@@ -58,7 +67,7 @@ describe("cacheDependencyGlob", () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["cache-dependency-glob"] =
       " ~/.cache/file1\n ./rel/file2 \nfile3.txt";
-    const { cacheDependencyGlob } = await import("../../src/utils/inputs");
+    const { cacheDependencyGlob } = await importInputsModule();
     expect(cacheDependencyGlob).toBe(
       [
         "/home/testuser/.cache/file1", // expanded tilde, absolute path unchanged
@@ -71,7 +80,7 @@ describe("cacheDependencyGlob", () => {
   it("keeps absolute path unchanged in multiline input", async () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["cache-dependency-glob"] = "/abs/path.lock\nrelative.lock";
-    const { cacheDependencyGlob } = await import("../../src/utils/inputs");
+    const { cacheDependencyGlob } = await importInputsModule();
     expect(cacheDependencyGlob).toBe(
       ["/abs/path.lock", "/workspace/relative.lock"].join("\n"),
     );
@@ -80,7 +89,7 @@ describe("cacheDependencyGlob", () => {
   it("handles exclusions in relative paths correct", async () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["cache-dependency-glob"] = "!/abs/path.lock\n!relative.lock";
-    const { cacheDependencyGlob } = await import("../../src/utils/inputs");
+    const { cacheDependencyGlob } = await importInputsModule();
     expect(cacheDependencyGlob).toBe(
       ["!/abs/path.lock", "!/workspace/relative.lock"].join("\n"),
     );
@@ -104,7 +113,7 @@ describe("tool directories", () => {
     mockInputs["tool-bin-dir"] = "~/tool-bin-dir";
     mockInputs["tool-dir"] = "~/tool-dir";
 
-    const { toolBinDir, toolDir } = await import("../../src/utils/inputs");
+    const { toolBinDir, toolDir } = await importInputsModule();
 
     expect(toolBinDir).toBe("/home/testuser/tool-bin-dir");
     expect(toolDir).toBe("/home/testuser/tool-dir");
@@ -127,9 +136,7 @@ describe("cacheLocalPath", () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["cache-local-path"] = "~/uv-cache/cache-local-path";
 
-    const { CacheLocalSource, cacheLocalPath } = await import(
-      "../../src/utils/inputs"
-    );
+    const { CacheLocalSource, cacheLocalPath } = await importInputsModule();
 
     expect(cacheLocalPath).toEqual({
       path: "/home/testuser/uv-cache/cache-local-path",
@@ -152,7 +159,7 @@ describe("venvPath", () => {
 
   it("defaults to .venv in the working directory", async () => {
     mockInputs["working-directory"] = "/workspace";
-    const { venvPath } = await import("../../src/utils/inputs");
+    const { venvPath } = await importInputsModule();
     expect(venvPath).toBe("/workspace/.venv");
   });
 
@@ -160,7 +167,7 @@ describe("venvPath", () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["activate-environment"] = "true";
     mockInputs["venv-path"] = "custom-venv";
-    const { venvPath } = await import("../../src/utils/inputs");
+    const { venvPath } = await importInputsModule();
     expect(venvPath).toBe("/workspace/custom-venv");
   });
 
@@ -168,7 +175,7 @@ describe("venvPath", () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["activate-environment"] = "true";
     mockInputs["venv-path"] = "custom-venv/";
-    const { venvPath } = await import("../../src/utils/inputs");
+    const { venvPath } = await importInputsModule();
     expect(venvPath).toBe("/workspace/custom-venv");
   });
 
@@ -176,7 +183,7 @@ describe("venvPath", () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["activate-environment"] = "true";
     mockInputs["venv-path"] = "/tmp/custom-venv";
-    const { venvPath } = await import("../../src/utils/inputs");
+    const { venvPath } = await importInputsModule();
     expect(venvPath).toBe("/tmp/custom-venv");
   });
 
@@ -184,7 +191,7 @@ describe("venvPath", () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["activate-environment"] = "true";
     mockInputs["venv-path"] = "~/.venv";
-    const { venvPath } = await import("../../src/utils/inputs");
+    const { venvPath } = await importInputsModule();
     expect(venvPath).toBe("/home/testuser/.venv");
   });
 
@@ -192,18 +199,11 @@ describe("venvPath", () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["venv-path"] = "custom-venv";
 
-    const { activateEnvironment, venvPath } = await import(
-      "../../src/utils/inputs"
-    );
+    const { activateEnvironment, venvPath } = await importInputsModule();
 
     expect(activateEnvironment).toBe(false);
     expect(venvPath).toBe("/workspace/custom-venv");
-
-    const mockedCore = jest.requireMock("@actions/core") as {
-      warning: jest.Mock;
-    };
-
-    expect(mockedCore.warning).toHaveBeenCalledWith(
+    expect(mockWarning).toHaveBeenCalledWith(
       "venv-path is only used when activate-environment is true",
     );
   });
@@ -1,113 +1,121 @@
-jest.mock("node:fs");
-jest.mock("@actions/core", () => ({
-  warning: jest.fn(),
+import { beforeEach, describe, expect, it, jest } from "@jest/globals";
+
+const mockReadFileSync = jest.fn();
+const mockWarning = jest.fn();
+
+jest.unstable_mockModule("node:fs", () => ({
+  default: {
+    readFileSync: mockReadFileSync,
+  },
 }));
 
-import fs from "node:fs";
-import * as core from "@actions/core";
-import { beforeEach, describe, expect, it, jest } from "@jest/globals";
-import { getUvVersionFromToolVersions } from "../../src/version/tool-versions-file";
-
-const mockedFs = fs as jest.Mocked<typeof fs>;
-const mockedCore = core as jest.Mocked<typeof core>;
+jest.unstable_mockModule("@actions/core", () => ({
+  warning: mockWarning,
+}));
+
+async function getVersionFromToolVersions(filePath: string) {
+  const { getUvVersionFromToolVersions } = await import(
+    "../../src/version/tool-versions-file"
+  );
+
+  return getUvVersionFromToolVersions(filePath);
+}
 
 describe("getUvVersionFromToolVersions", () => {
   beforeEach(() => {
+    jest.resetModules();
     jest.clearAllMocks();
   });
 
-  it("should return undefined for non-.tool-versions files", () => {
-    const result = getUvVersionFromToolVersions("package.json");
+  it("should return undefined for non-.tool-versions files", async () => {
+    const result = await getVersionFromToolVersions("package.json");
     expect(result).toBeUndefined();
-    expect(mockedFs.readFileSync).not.toHaveBeenCalled();
+    expect(mockReadFileSync).not.toHaveBeenCalled();
   });
 
-  it("should return version for valid uv entry", () => {
+  it("should return version for valid uv entry", async () => {
     const fileContent = "python 3.11.0\nuv 0.1.0\nnodejs 18.0.0";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBe("0.1.0");
-    expect(mockedFs.readFileSync).toHaveBeenCalledWith(
-      ".tool-versions",
-      "utf8",
-    );
+    expect(mockReadFileSync).toHaveBeenCalledWith(".tool-versions", "utf8");
   });
 
-  it("should return version for uv entry with v prefix", () => {
+  it("should return version for uv entry with v prefix", async () => {
     const fileContent = "uv v0.2.0";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBe("0.2.0");
   });
 
-  it("should handle whitespace around uv entry", () => {
+  it("should handle whitespace around uv entry", async () => {
     const fileContent = " uv 0.3.0 ";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBe("0.3.0");
   });
 
-  it("should skip commented lines", () => {
+  it("should skip commented lines", async () => {
     const fileContent = "# uv 0.1.0\npython 3.11.0\nuv 0.2.0";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBe("0.2.0");
   });
 
-  it("should return first matching uv version", () => {
+  it("should return first matching uv version", async () => {
     const fileContent = "uv 0.1.0\npython 3.11.0\nuv 0.2.0";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBe("0.1.0");
   });
 
-  it("should return undefined when no uv entry found", () => {
+  it("should return undefined when no uv entry found", async () => {
     const fileContent = "python 3.11.0\nnodejs 18.0.0";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
||||||
|
|
||||||
expect(result).toBeUndefined();
|
expect(result).toBeUndefined();
|
||||||
});
|
});
|
||||||
|
|
||||||
it("should return undefined for empty file", () => {
|
it("should return undefined for empty file", async () => {
|
||||||
mockedFs.readFileSync.mockReturnValue("");
|
mockReadFileSync.mockReturnValue("");
|
||||||
|
|
||||||
const result = getUvVersionFromToolVersions(".tool-versions");
|
const result = await getVersionFromToolVersions(".tool-versions");
|
||||||
|
|
||||||
expect(result).toBeUndefined();
|
expect(result).toBeUndefined();
|
||||||
});
|
});
|
||||||
|
|
||||||
it("should warn and return undefined for ref syntax", () => {
|
it("should warn and return undefined for ref syntax", async () => {
|
||||||
const fileContent = "uv ref:main";
|
const fileContent = "uv ref:main";
|
||||||
mockedFs.readFileSync.mockReturnValue(fileContent);
|
mockReadFileSync.mockReturnValue(fileContent);
|
||||||
|
|
||||||
const result = getUvVersionFromToolVersions(".tool-versions");
|
const result = await getVersionFromToolVersions(".tool-versions");
|
||||||
|
|
||||||
expect(result).toBeUndefined();
|
expect(result).toBeUndefined();
|
||||||
expect(mockedCore.warning).toHaveBeenCalledWith(
|
expect(mockWarning).toHaveBeenCalledWith(
|
||||||
"The ref syntax of .tool-versions is not supported. Please use a released version instead.",
|
"The ref syntax of .tool-versions is not supported. Please use a released version instead.",
|
||||||
);
|
);
|
||||||
});
|
});
|
||||||
|
|
||||||
it("should handle file path with .tool-versions extension", () => {
|
it("should handle file path with .tool-versions extension", async () => {
|
||||||
const fileContent = "uv 0.1.0";
|
const fileContent = "uv 0.1.0";
|
||||||
mockedFs.readFileSync.mockReturnValue(fileContent);
|
mockReadFileSync.mockReturnValue(fileContent);
|
||||||
|
|
||||||
const result = getUvVersionFromToolVersions("path/to/.tool-versions");
|
const result = await getVersionFromToolVersions("path/to/.tool-versions");
|
||||||
|
|
||||||
expect(result).toBe("0.1.0");
|
expect(result).toBe("0.1.0");
|
||||||
expect(mockedFs.readFileSync).toHaveBeenCalledWith(
|
expect(mockReadFileSync).toHaveBeenCalledWith(
|
||||||
"path/to/.tool-versions",
|
"path/to/.tool-versions",
|
||||||
"utf8",
|
"utf8",
|
||||||
);
|
);
|
||||||
|
|||||||
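The tests above pin down the `.tool-versions` parsing rules: commented lines are skipped, an optional `v` prefix is stripped, surrounding whitespace is tolerated, the first matching `uv` entry wins, and the `ref:` syntax is rejected. A minimal standalone sketch of that behavior follows; the function name and regex are illustrative assumptions, not the repo's actual implementation (which also reads the file and warns via `@actions/core`):

```typescript
// Hypothetical sketch of the parsing rules exercised by the tests above.
function parseUvVersion(fileContent: string): string | undefined {
  for (const rawLine of fileContent.split("\n")) {
    const line = rawLine.trim();
    if (line.startsWith("#")) {
      continue; // commented lines are skipped
    }
    // "uv 0.1.0", "uv v0.2.0", "  uv   0.3.0  " all match after trimming.
    const match = line.match(/^uv\s+v?(\S+)$/);
    if (!match) {
      continue;
    }
    if (match[1].startsWith("ref:")) {
      // The real code warns that the ref syntax is unsupported.
      return undefined;
    }
    return match[1]; // first matching uv entry wins
  }
  return undefined;
}
```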
@@ -26,7 +26,7 @@ inputs:
     required: false
   github-token:
     description:
-      "Used to increase the rate limit when retrieving versions and downloading uv."
+      "Used when downloading uv from GitHub releases."
     required: false
     default: ${{ github.token }}
   enable-cache:
@@ -75,7 +75,7 @@ inputs:
     description: "Custom path to set UV_TOOL_BIN_DIR to."
     required: false
   manifest-file:
-    description: "URL to the manifest file containing available versions and download URLs."
+    description: "URL to a custom manifest file. Supports the astral-sh/versions NDJSON format and the legacy JSON array format (deprecated)."
     required: false
   add-problem-matchers:
     description: "Add problem matchers."
@@ -102,8 +102,8 @@ outputs:
     description: "A boolean value to indicate the Python cache entry was found"
 runs:
   using: "node24"
-  main: "dist/setup/index.js"
+  main: "dist/setup/index.cjs"
-  post: "dist/save-cache/index.js"
+  post: "dist/save-cache/index.cjs"
   post-if: success()
 branding:
   icon: "package"

@@ -1,5 +1,5 @@
 {
-  "$schema": "https://biomejs.dev/schemas/2.3.7/schema.json",
+  "$schema": "https://biomejs.dev/schemas/2.4.7/schema.json",
   "assist": {
     "actions": {
       "source": {

dist/save-cache/index.cjs (generated, vendored, new file; 63325 lines): diff suppressed because one or more lines are too long
dist/save-cache/index.js (generated, vendored; 94305 lines): diff suppressed because one or more lines are too long
dist/setup/index.cjs (generated, vendored, new file; 97175 lines): diff suppressed because one or more lines are too long
dist/setup/index.js (generated, vendored; 104639 lines): diff suppressed because one or more lines are too long
dist/update-known-checksums/index.cjs (generated, vendored, new file; 49537 lines): diff suppressed because one or more lines are too long
dist/update-known-versions/index.js (generated, vendored; 39068 lines): diff suppressed because one or more lines are too long

@@ -18,12 +18,29 @@ are automatically verified by this action. The sha256 hashes can be found on the

 ## Manifest file

-The `manifest-file` input allows you to specify a JSON manifest that lists available uv versions,
-architectures, and their download URLs. By default, this action uses the manifest file contained
-in this repository, which is automatically updated with each release of uv.
+By default, setup-uv reads version metadata from
+[`astral-sh/versions`](https://github.com/astral-sh/versions) (NDJSON format).

-The manifest file contains an array of objects, each describing a version,
-architecture, platform, and the corresponding download URL. For example:
+The `manifest-file` input lets you override that source with your own URL, for example to test
+custom uv builds or alternate download locations.
+
+### Format
+
+The manifest file must be in NDJSON format, where each line is a JSON object representing a version and its artifacts. For example:
+
+```json
+{"version":"0.10.7","artifacts":[{"platform":"x86_64-unknown-linux-gnu","variant":"default","url":"https://example.com/uv-x86_64-unknown-linux-gnu.tar.gz","archive_format":"tar.gz","sha256":"..."}]}
+{"version":"0.10.6","artifacts":[{"platform":"x86_64-unknown-linux-gnu","variant":"default","url":"https://example.com/uv-x86_64-unknown-linux-gnu.tar.gz","archive_format":"tar.gz","sha256":"..."}]}
+```
+
+setup-uv currently only supports `default` as the `variant`.
+
+The `archive_format` field is currently ignored.
+
+### Legacy format: JSON array (deprecated)
+
+The previous JSON array format is still supported for compatibility, but deprecated and will be
+removed in a future major release.

 ```json
 [
@@ -33,26 +50,20 @@ architecture, platform, and the corresponding download URL. For example:
     "arch": "aarch64",
     "platform": "apple-darwin",
     "downloadUrl": "https://github.com/astral-sh/uv/releases/download/0.7.13/uv-aarch64-apple-darwin.tar.gz"
-  },
-  ...
+  }
 ]
 ```

-You can supply a custom manifest file URL to define additional versions,
-architectures, or different download URLs.
-This is useful if you maintain your own uv builds or want to override the default sources.
-
 ```yaml
 - name: Use a custom manifest file
   uses: astral-sh/setup-uv@v7
   with:
-    manifest-file: "https://example.com/my-custom-manifest.json"
+    manifest-file: "https://example.com/my-custom-manifest.ndjson"
 ```

 > [!NOTE]
-> When you use a custom manifest file and do not set the `version` input, its default value is `latest`.
-> This means the action will install the latest version available in the custom manifest file.
-> This is different from the default behavior of installing the latest version from the official uv releases.
+> When you use a custom manifest file and do not set the `version` input, setup-uv installs the
+> latest version from that custom manifest.

 ## Add problem matchers

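The NDJSON format documented above can be consumed one line at a time. A minimal sketch, assuming only the fields shown in the example lines (the interface names here are hypothetical, not types from the repo):

```typescript
// Illustrative types matching the fields in the documented NDJSON example.
interface ManifestArtifact {
  platform: string;
  variant: string;
  url: string;
  archive_format: string;
  sha256: string;
}

interface ManifestEntry {
  version: string;
  artifacts: ManifestArtifact[];
}

// Parse NDJSON: one JSON object per non-empty line.
function parseNdjsonManifest(text: string): ManifestEntry[] {
  return text
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line.length > 0)
    .map((line) => JSON.parse(line) as ManifestEntry);
}
```

This line-at-a-time shape is what makes NDJSON convenient for an append-only versions feed: new releases are prepended or appended without rewriting the rest of the file.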
@@ -38,9 +38,12 @@ You can customize the venv location with `venv-path`, for example to place it in

 ## GitHub authentication token

-This action uses the GitHub API to fetch the uv release artifacts. To avoid hitting the GitHub API
-rate limit too quickly, an authentication token can be provided via the `github-token` input. By
-default, the `GITHUB_TOKEN` secret is used, which is automatically provided by GitHub Actions.
+By default, this action resolves available uv versions from
+[`astral-sh/versions`](https://github.com/astral-sh/versions), then downloads uv artifacts from
+GitHub Releases.
+
+You can provide a token via `github-token` to authenticate those downloads. By default, the
+`GITHUB_TOKEN` secret is used, which is automatically provided by GitHub Actions.

 If the default
 [permissions for the GitHub token](https://docs.github.com/en/actions/security-for-github-actions/security-guides/automatic-token-authentication#permissions-for-the-github_token)

@@ -1,9 +0,0 @@
-module.exports = {
-  clearMocks: true,
-  moduleFileExtensions: ["js", "ts"],
-  testMatch: ["**/*.test.ts"],
-  transform: {
-    "^.+\\.ts$": "ts-jest",
-  },
-  verbose: true,
-};

jest.config.mjs (new file; 14 lines):
@@ -0,0 +1,14 @@
+import { createDefaultEsmPreset } from "ts-jest";
+
+const esmPreset = createDefaultEsmPreset({
+  tsconfig: "./tsconfig.json",
+});
+
+export default {
+  ...esmPreset,
+  clearMocks: true,
+  moduleFileExtensions: ["js", "mjs", "ts"],
+  testEnvironment: "node",
+  testMatch: ["**/*.test.ts"],
+  verbose: true,
+};

package-lock.json (generated; 4416 lines): diff suppressed because it is too large
package.json (44 lines):
@@ -2,16 +2,18 @@
   "name": "setup-uv",
   "version": "1.0.0",
   "private": true,
+  "type": "module",
   "description": "Set up your GitHub Actions workflow with a specific version of uv",
-  "main": "dist/index.js",
+  "main": "dist/setup/index.cjs",
   "scripts": {
-    "build": "tsc",
+    "build": "tsc --noEmit",
     "check": "biome check --write",
-    "package": "ncc build -o dist/setup src/setup-uv.ts && ncc build -o dist/save-cache src/save-cache.ts && ncc build -o dist/update-known-versions src/update-known-versions.ts",
+    "package": "node scripts/build-dist.mjs",
-    "test": "jest",
+    "test:unit": "node --experimental-vm-modules ./node_modules/jest/bin/jest.js",
+    "test": "npm run build && npm run test:unit",
     "act": "act pull_request -W .github/workflows/test.yml --container-architecture linux/amd64 -s GITHUB_TOKEN=\"$(gh auth token)\"",
-    "update-known-versions": "RUNNER_TEMP=known_versions node dist/update-known-versions/index.js src/download/checksum/known-versions.ts \"$(gh auth token)\"",
+    "update-known-checksums": "RUNNER_TEMP=known_versions node dist/update-known-checksums/index.cjs src/download/checksum/known-checksums.ts",
-    "all": "npm run build && npm run check && npm run package && npm test"
+    "all": "npm run build && npm run check && npm run package && npm run test:unit"
   },
   "repository": {
     "type": "git",
@@ -26,28 +28,26 @@
   "author": "@eifinger",
   "license": "MIT",
   "dependencies": {
-    "@actions/cache": "^4.1.0",
+    "@actions/cache": "^6.0.0",
-    "@actions/core": "^1.11.1",
+    "@actions/core": "^3.0.0",
-    "@actions/exec": "^1.1.1",
+    "@actions/exec": "^3.0.0",
-    "@actions/glob": "^0.5.0",
+    "@actions/glob": "^0.6.1",
-    "@actions/io": "^1.1.3",
+    "@actions/io": "^3.0.2",
-    "@actions/tool-cache": "^2.0.2",
+    "@actions/tool-cache": "^4.0.0",
-    "@octokit/core": "^7.0.6",
-    "@octokit/plugin-paginate-rest": "^14.0.0",
-    "@octokit/plugin-rest-endpoint-methods": "^17.0.0",
-    "@renovatebot/pep440": "^4.2.1",
+    "@renovatebot/pep440": "^4.2.2",
     "smol-toml": "^1.6.0",
-    "undici": "5.28.5"
+    "undici": "^7.24.2"
   },
   "devDependencies": {
-    "@biomejs/biome": "2.3.8",
+    "@biomejs/biome": "^2.4.7",
     "@types/js-yaml": "^4.0.9",
-    "@types/node": "^24.10.1",
+    "@types/node": "^25.5.0",
     "@types/semver": "^7.7.1",
     "@vercel/ncc": "^0.38.4",
+    "esbuild": "^0.27.4",
-    "jest": "^30.2.0",
+    "jest": "^30.3.0",
-    "js-yaml": "^4.1.0",
+    "js-yaml": "^4.1.1",
-    "ts-jest": "^29.4.5",
+    "ts-jest": "^29.4.6",
     "typescript": "^5.9.3"
   }
 }

scripts/build-dist.mjs (new file; 33 lines):
@@ -0,0 +1,33 @@
+import { rm } from "node:fs/promises";
+import { build } from "esbuild";
+
+const builds = [
+  {
+    entryPoints: ["src/setup-uv.ts"],
+    outfile: "dist/setup/index.cjs",
+    staleOutfiles: ["dist/setup/index.mjs"],
+  },
+  {
+    entryPoints: ["src/save-cache.ts"],
+    outfile: "dist/save-cache/index.cjs",
+    staleOutfiles: ["dist/save-cache/index.mjs"],
+  },
+  {
+    entryPoints: ["src/update-known-checksums.ts"],
+    outfile: "dist/update-known-checksums/index.cjs",
+    staleOutfiles: ["dist/update-known-checksums/index.mjs"],
+  },
+];
+
+for (const { staleOutfiles, ...options } of builds) {
+  await Promise.all(
+    staleOutfiles.map((outfile) => rm(outfile, { force: true })),
+  );
+  await build({
+    bundle: true,
+    format: "cjs",
+    platform: "node",
+    target: "node24",
+    ...options,
+  });
+}

@@ -6,33 +6,35 @@ import type { Architecture, Platform } from "../../utils/platforms";
 import { KNOWN_CHECKSUMS } from "./known-checksums";

 export async function validateChecksum(
-  checkSum: string | undefined,
+  checksum: string | undefined,
   downloadPath: string,
   arch: Architecture,
   platform: Platform,
   version: string,
 ): Promise<void> {
-  let isValid: boolean | undefined;
-  if (checkSum !== undefined && checkSum !== "") {
-    isValid = await validateFileCheckSum(downloadPath, checkSum);
-  } else {
-    core.debug("Checksum not provided. Checking known checksums.");
   const key = `${arch}-${platform}-${version}`;
-    if (key in KNOWN_CHECKSUMS) {
-      const knownChecksum = KNOWN_CHECKSUMS[`${arch}-${platform}-${version}`];
-      core.debug(`Checking checksum for ${arch}-${platform}-${version}.`);
-      isValid = await validateFileCheckSum(downloadPath, knownChecksum);
-    } else {
-      core.debug(`No known checksum found for ${key}.`);
-    }
-  }
+  const hasProvidedChecksum = checksum !== undefined && checksum !== "";
+  const checksumToUse = hasProvidedChecksum ? checksum : KNOWN_CHECKSUMS[key];
+
+  if (checksumToUse === undefined) {
+    core.debug(`No checksum found for ${key}.`);
+    return;
+  }

-  if (isValid === false) {
-    throw new Error(`Checksum for ${downloadPath} did not match ${checkSum}.`);
-  }
-  if (isValid === true) {
-    core.debug(`Checksum for ${downloadPath} is valid.`);
-  }
+  const checksumSource = hasProvidedChecksum
+    ? "provided checksum"
+    : `KNOWN_CHECKSUMS entry for ${key}`;
+
+  core.debug(`Validating checksum using ${checksumSource}.`);
+  const isValid = await validateFileCheckSum(downloadPath, checksumToUse);
+
+  if (!isValid) {
+    throw new Error(
+      `Checksum for ${downloadPath} did not match ${checksumToUse}.`,
+    );
+  }
+  core.debug(`Checksum for ${downloadPath} is valid.`);
 }

 async function validateFileCheckSum(

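The diff above only decides *which* checksum to compare against; the actual comparison happens in `validateFileCheckSum`, whose body is not shown in this diff. A plausible sketch of such a helper using Node's `crypto` module follows; this is an assumption about its shape, not the repo's code, and it hashes an in-memory buffer rather than a file for simplicity:

```typescript
import { createHash } from "node:crypto";

// Hypothetical checksum comparison: hash the archive bytes and compare the
// hex digest against the expected value (case-insensitive).
function sha256Matches(data: Uint8Array, expected: string): boolean {
  const digest = createHash("sha256").update(data).digest("hex");
  return digest === expected.toLowerCase();
}
```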
@@ -1,5 +1,39 @@
 // AUTOGENERATED_DO_NOT_EDIT
 export const KNOWN_CHECKSUMS: { [key: string]: string } = {
+  "aarch64-apple-darwin-0.10.10":
+    "8a09f0ef51ee7f7170731b4cb8bde5bf9ba6da5304f49a7df6cdab42a1f37b5d",
+  "aarch64-pc-windows-msvc-0.10.10":
+    "2c6fe113f14574bc27f085751c68d3485589fcc3c3c64ed85dd1eecc2f87cffc",
+  "aarch64-unknown-linux-gnu-0.10.10":
+    "2b80457b950deda12e8d5dc3b9b7494ac143eae47f1fb11b1c6e5a8495a6421e",
+  "aarch64-unknown-linux-musl-0.10.10":
+    "d08c08b82cdcaf2bd3d928ffe844d3558dda53f90066db6ef9174157cc763252",
+  "arm-unknown-linux-musleabihf-0.10.10":
+    "ccc3c4dd5eeea4b2be829ef9bc0b8d9882389c0f303f7ec5ba668065d57e2673",
+  "armv7-unknown-linux-gnueabihf-0.10.10":
+    "032786622b52f8d0232b5ad16e25342a64f9e43576652db7bf607231021902f3",
+  "armv7-unknown-linux-musleabihf-0.10.10":
+    "f6f67b190eb28b473917c97210f89fd11d9b9393d774acd093ea738fcee68864",
+  "i686-pc-windows-msvc-0.10.10":
+    "980d7ea368cc4883f572bb85c285a647eddfc23539064d2bfaf8fbfefcc2112b",
+  "i686-unknown-linux-gnu-0.10.10":
+    "5260fbef838f8cfec44697064a5cfae08a27c6ab7ed7feab7fc946827e896952",
+  "i686-unknown-linux-musl-0.10.10":
+    "a6683ade964f8d8623098ca0c96b4311d8388b44a56a386cd795974f39fb5bd2",
+  "powerpc64le-unknown-linux-gnu-0.10.10":
+    "78939dc4fc905aca8af4be19b6c6ecc306f04c6ca9f98d144372595d9397fd0d",
+  "riscv64gc-unknown-linux-gnu-0.10.10":
+    "5eff670bf80fce9d9e50df5b4d46c415a9c0324eadf7059d97c76f89ffc33c3f",
+  "s390x-unknown-linux-gnu-0.10.10":
+    "a32d2be5600f7f42f82596ffe9d3115f020974ca7fb4f15251c5625c5481ea5e",
+  "x86_64-apple-darwin-0.10.10":
+    "dd18420591d625f9b4ca2b57a7a6fe3cce43910f02e02d90e47a4101428de14a",
+  "x86_64-pc-windows-msvc-0.10.10":
+    "d31a30f1dfb96e630a08d5a9b3f3f551254b7ed6e9b7e495f46a4232661c7252",
+  "x86_64-unknown-linux-gnu-0.10.10":
+    "3e1027f26ce8c7e4c32e2277a7fed2cb410f2f1f9320d3df97653d40e21f415b",
+  "x86_64-unknown-linux-musl-0.10.10":
+    "74544e8755fbc27559e22e29fd561bdc48f91b8bd8323e760a1130f32433bea4",
   "aarch64-apple-darwin-0.10.9":
     "a92f61e9ac9b0f29668c15f56152e4a60143fca148ff5bfadb86718472c3f376",
   "aarch64-pc-windows-msvc-0.10.9":

@@ -1,59 +1,34 @@
 import { promises as fs } from "node:fs";
-import * as tc from "@actions/tool-cache";
-import { KNOWN_CHECKSUMS } from "./known-checksums";
+
+export interface ChecksumEntry {
+  key: string;
+  checksum: string;
+}

 export async function updateChecksums(
   filePath: string,
-  downloadUrls: string[],
+  checksumEntries: ChecksumEntry[],
 ): Promise<void> {
-  await fs.rm(filePath);
-  await fs.appendFile(
-    filePath,
-    "// AUTOGENERATED_DO_NOT_EDIT\nexport const KNOWN_CHECKSUMS: { [key: string]: string } = {\n",
-  );
-  let firstLine = true;
-  for (const downloadUrl of downloadUrls) {
-    const key = getKey(downloadUrl);
-    if (key === undefined) {
+  const deduplicatedEntries = new Map<string, string>();
+
+  for (const entry of checksumEntries) {
+    if (deduplicatedEntries.has(entry.key)) {
       continue;
     }
-    const checksum = await getOrDownloadChecksum(key, downloadUrl);
-    if (!firstLine) {
-      await fs.appendFile(filePath, ",\n");
-    }
-    await fs.appendFile(filePath, `  "${key}":\n    "${checksum}"`);
-    firstLine = false;
-  }
-  await fs.appendFile(filePath, ",\n};\n");
-}
-
-function getKey(downloadUrl: string): string | undefined {
-  // https://github.com/astral-sh/uv/releases/download/0.3.2/uv-aarch64-apple-darwin.tar.gz.sha256
-  const parts = downloadUrl.split("/");
-  const fileName = parts[parts.length - 1];
-  if (fileName.startsWith("source")) {
-    return undefined;
-  }
-  const name = fileName.split(".")[0].split("uv-")[1];
-  const version = parts[parts.length - 2];
-  return `${name}-${version}`;
-}
-
-async function getOrDownloadChecksum(
-  key: string,
-  downloadUrl: string,
-): Promise<string> {
-  let checksum = "";
-  if (key in KNOWN_CHECKSUMS) {
-    checksum = KNOWN_CHECKSUMS[key];
-  } else {
-    const content = await downloadAssetContent(downloadUrl);
-    checksum = content.split(" ")[0].trim();
-  }
-  return checksum;
-}
-
-async function downloadAssetContent(downloadUrl: string): Promise<string> {
-  const downloadPath = await tc.downloadTool(downloadUrl);
-  const content = await fs.readFile(downloadPath, "utf8");
-  return content;
+    deduplicatedEntries.set(entry.key, entry.checksum);
+  }
+
+  const body = [...deduplicatedEntries.entries()]
+    .map(([key, checksum]) => `  "${key}":\n    "${checksum}"`)
+    .join(",\n");
+
+  const content =
+    "// AUTOGENERATED_DO_NOT_EDIT\n" +
+    "export const KNOWN_CHECKSUMS: { [key: string]: string } = {\n" +
+    body +
+    (body === "" ? "" : ",\n") +
+    "};\n";
+
+  await fs.writeFile(filePath, content);
 }

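The rewritten `updateChecksums` has two useful properties worth isolating: the first entry per key wins during deduplication, and the trailing comma is emitted only when the body is non-empty. A standalone sketch of just the rendering step, mirroring the logic in the diff (the function name `renderKnownChecksums` is invented for illustration):

```typescript
interface ChecksumEntry {
  key: string;
  checksum: string;
}

// Pure rendering step mirroring the diff: dedupe (first entry wins),
// then build the whole file content in memory.
function renderKnownChecksums(entries: ChecksumEntry[]): string {
  const deduplicated = new Map<string, string>();
  for (const entry of entries) {
    if (!deduplicated.has(entry.key)) {
      deduplicated.set(entry.key, entry.checksum);
    }
  }
  const body = [...deduplicated.entries()]
    .map(([key, checksum]) => `  "${key}":\n    "${checksum}"`)
    .join(",\n");
  return (
    "// AUTOGENERATED_DO_NOT_EDIT\n" +
    "export const KNOWN_CHECKSUMS: { [key: string]: string } = {\n" +
    body +
    (body === "" ? "" : ",\n") +
    "};\n"
  );
}
```

Building the content in memory and writing it once (instead of the old `rm` + repeated `appendFile` calls) also means a crash mid-run can no longer leave a half-written file behind.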
@@ -2,20 +2,21 @@ import { promises as fs } from "node:fs";
 import * as path from "node:path";
 import * as core from "@actions/core";
 import * as tc from "@actions/tool-cache";
-import type { Endpoints } from "@octokit/types";
 import * as pep440 from "@renovatebot/pep440";
 import * as semver from "semver";
-import { OWNER, REPO, TOOL_CACHE_NAME } from "../utils/constants";
-import { Octokit } from "../utils/octokit";
+import { TOOL_CACHE_NAME, VERSIONS_NDJSON_URL } from "../utils/constants";
 import type { Architecture, Platform } from "../utils/platforms";
 import { validateChecksum } from "./checksum/checksum";
 import {
-  getDownloadUrl,
+  getAllVersions as getAllManifestVersions,
   getLatestKnownVersion as getLatestVersionInManifest,
+  getManifestArtifact,
 } from "./version-manifest";
-
-type Release =
-  Endpoints["GET /repos/{owner}/{repo}/releases"]["response"]["data"][number];
+import {
+  getAllVersions as getAllVersionsFromNdjson,
+  getArtifact as getArtifactFromNdjson,
+  getLatestVersion as getLatestVersionFromNdjson,
+} from "./versions-client";

 export function tryGetFromToolCache(
   arch: Architecture,
@@ -32,19 +33,26 @@ export function tryGetFromToolCache(
   return { installedPath, version: resolvedVersion };
 }

-export async function downloadVersionFromGithub(
+export async function downloadVersionFromNdjson(
   platform: Platform,
   arch: Architecture,
   version: string,
   checkSum: string | undefined,
   githubToken: string,
 ): Promise<{ version: string; cachedToolDir: string }> {
-  const artifact = `uv-${arch}-${platform}`;
-  const extension = getExtension(platform);
-  const downloadUrl = `https://github.com/${OWNER}/${REPO}/releases/download/${version}/${artifact}${extension}`;
+  const artifact = await getArtifactFromNdjson(version, arch, platform);
+  if (!artifact) {
+    throw new Error(
+      `Could not find artifact for version ${version}, arch ${arch}, platform ${platform} in ${VERSIONS_NDJSON_URL} .`,
+    );
+  }
+
+  // For the default astral-sh/versions source, checksum validation relies on
+  // user input or the built-in KNOWN_CHECKSUMS table, not NDJSON sha256 values.
   return await downloadVersion(
-    downloadUrl,
-    artifact,
+    artifact.url,
+    `uv-${arch}-${platform}`,
     platform,
     arch,
     version,
@@ -54,38 +62,32 @@
 }

 export async function downloadVersionFromManifest(
-  manifestUrl: string | undefined,
+  manifestUrl: string,
   platform: Platform,
   arch: Architecture,
   version: string,
   checkSum: string | undefined,
   githubToken: string,
 ): Promise<{ version: string; cachedToolDir: string }> {
-  const downloadUrl = await getDownloadUrl(
+  const artifact = await getManifestArtifact(
     manifestUrl,
     version,
     arch,
     platform,
   );
-  if (!downloadUrl) {
-    core.info(
-      `manifest-file does not contain version ${version}, arch ${arch}, platform ${platform}. Falling back to GitHub releases.`,
-    );
-    return await downloadVersionFromGithub(
-      platform,
-      arch,
-      version,
-      checkSum,
-      githubToken,
-    );
+  if (!artifact) {
+    throw new Error(
+      `manifest-file does not contain version ${version}, arch ${arch}, platform ${platform}.`,
+    );
   }

   return await downloadVersion(
-    downloadUrl,
+    artifact.downloadUrl,
     `uv-${arch}-${platform}`,
     platform,
     arch,
     version,
-    checkSum,
+    resolveChecksum(checkSum, artifact.checksum),
     githubToken,
   );
 }
@@ -96,7 +98,7 @@ async function downloadVersion(
   platform: Platform,
   arch: Architecture,
   version: string,
-  checkSum: string | undefined,
+  checksum: string | undefined,
   githubToken: string,
 ): Promise<{ version: string; cachedToolDir: string }> {
   core.info(`Downloading uv from "${downloadUrl}" ...`);
@@ -105,14 +107,14 @@
     undefined,
     githubToken,
   );
-  await validateChecksum(checkSum, downloadPath, arch, platform, version);
+  await validateChecksum(checksum, downloadPath, arch, platform, version);

   let uvDir: string;
   if (platform === "pc-windows-msvc") {
-    // On windows extracting the zip does not create an intermediate directory
+    // On windows extracting the zip does not create an intermediate directory.
     try {
       // Try tar first as it's much faster, but only bsdtar supports zip files,
-      // so this my fail if another tar, like gnu tar, ends up being used.
+      // so this may fail if another tar, like gnu tar, ends up being used.
       uvDir = await tc.extractTar(downloadPath, undefined, "x");
     } catch (err) {
       core.info(
@@ -127,6 +129,7 @@
       const extractedDir = await tc.extractTar(downloadPath);
       uvDir = path.join(extractedDir, artifactName);
     }
+
     const cachedToolDir = await tc.cacheDir(
       uvDir,
       TOOL_CACHE_NAME,
@@ -136,14 +139,22 @@
@@ -136,14 +139,22 @@ async function downloadVersion(
|
|||||||
return { cachedToolDir, version: version };
|
return { cachedToolDir, version: version };
|
||||||
}
|
}
|
||||||
|
|
||||||
|
function resolveChecksum(
|
||||||
|
checkSum: string | undefined,
|
||||||
|
manifestChecksum?: string,
|
||||||
|
): string | undefined {
|
||||||
|
return checkSum !== undefined && checkSum !== ""
|
||||||
|
? checkSum
|
||||||
|
: manifestChecksum;
|
||||||
|
}
|
||||||
|
|
||||||
function getExtension(platform: Platform): string {
|
function getExtension(platform: Platform): string {
|
||||||
return platform === "pc-windows-msvc" ? ".zip" : ".tar.gz";
|
return platform === "pc-windows-msvc" ? ".zip" : ".tar.gz";
|
||||||
}
|
}
|
||||||
|
|
||||||
export async function resolveVersion(
|
export async function resolveVersion(
|
||||||
versionInput: string,
|
versionInput: string,
|
||||||
manifestFile: string | undefined,
|
manifestUrl: string | undefined,
|
||||||
githubToken: string,
|
|
||||||
resolutionStrategy: "highest" | "lowest" = "highest",
|
resolutionStrategy: "highest" | "lowest" = "highest",
|
||||||
): Promise<string> {
|
): Promise<string> {
|
||||||
core.debug(`Resolving version: ${versionInput}`);
|
core.debug(`Resolving version: ${versionInput}`);
|
||||||
@@ -155,15 +166,15 @@ export async function resolveVersion(
|
|||||||
if (resolveVersionSpecifierToLatest) {
|
if (resolveVersionSpecifierToLatest) {
|
||||||
core.info("Found minimum version specifier, using latest version");
|
core.info("Found minimum version specifier, using latest version");
|
||||||
}
|
}
|
||||||
if (manifestFile) {
|
if (manifestUrl !== undefined) {
|
||||||
version =
|
version =
|
||||||
versionInput === "latest" || resolveVersionSpecifierToLatest
|
versionInput === "latest" || resolveVersionSpecifierToLatest
|
||||||
? await getLatestVersionInManifest(manifestFile)
|
? await getLatestVersionInManifest(manifestUrl)
|
||||||
: versionInput;
|
: versionInput;
|
||||||
} else {
|
} else {
|
||||||
version =
|
version =
|
||||||
versionInput === "latest" || resolveVersionSpecifierToLatest
|
versionInput === "latest" || resolveVersionSpecifierToLatest
|
||||||
? await getLatestVersion(githubToken)
|
? await getLatestVersionFromNdjson()
|
||||||
: versionInput;
|
: versionInput;
|
||||||
}
|
}
|
||||||
if (tc.isExplicitVersion(version)) {
|
if (tc.isExplicitVersion(version)) {
|
||||||
@@ -175,91 +186,33 @@ export async function resolveVersion(
|
|||||||
}
|
}
|
||||||
return version;
|
return version;
|
||||||
}
|
}
|
||||||
const availableVersions = await getAvailableVersions(githubToken);
|
|
||||||
|
const availableVersions = await getAvailableVersions(manifestUrl);
|
||||||
core.debug(`Available versions: ${availableVersions}`);
|
core.debug(`Available versions: ${availableVersions}`);
|
||||||
const resolvedVersion =
|
const resolvedVersion =
|
||||||
resolutionStrategy === "lowest"
|
resolutionStrategy === "lowest"
|
||||||
? minSatisfying(availableVersions, version)
|
? minSatisfying(availableVersions, version)
|
||||||
: maxSatisfying(availableVersions, version);
|
: maxSatisfying(availableVersions, version);
|
||||||
|
|
||||||
if (resolvedVersion === undefined) {
|
if (resolvedVersion === undefined) {
|
||||||
throw new Error(`No version found for ${version}`);
|
throw new Error(`No version found for ${version}`);
|
||||||
}
|
}
|
||||||
|
|
||||||
return resolvedVersion;
|
return resolvedVersion;
|
||||||
}
|
}
|
||||||
|
|
||||||
async function getAvailableVersions(githubToken: string): Promise<string[]> {
|
async function getAvailableVersions(
|
||||||
core.info("Getting available versions from GitHub API...");
|
manifestUrl: string | undefined,
|
||||||
try {
|
): Promise<string[]> {
|
||||||
const octokit = new Octokit({
|
if (manifestUrl !== undefined) {
|
||||||
auth: githubToken,
|
|
||||||
});
|
|
||||||
return await getReleaseTagNames(octokit);
|
|
||||||
} catch (err) {
|
|
||||||
if ((err as Error).message.includes("Bad credentials")) {
|
|
||||||
core.info(
|
core.info(
|
||||||
"No (valid) GitHub token provided. Falling back to anonymous. Requests might be rate limited.",
|
`Getting available versions from manifest-file ${manifestUrl} ...`,
|
||||||
);
|
);
|
||||||
const octokit = new Octokit();
|
return await getAllManifestVersions(manifestUrl);
|
||||||
return await getReleaseTagNames(octokit);
|
|
||||||
}
|
|
||||||
throw err;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
async function getReleaseTagNames(octokit: Octokit): Promise<string[]> {
|
|
||||||
const response: Release[] = await octokit.paginate(
|
|
||||||
octokit.rest.repos.listReleases,
|
|
||||||
{
|
|
||||||
owner: OWNER,
|
|
||||||
repo: REPO,
|
|
||||||
},
|
|
||||||
);
|
|
||||||
const releaseTagNames = response.map((release) => release.tag_name);
|
|
||||||
if (releaseTagNames.length === 0) {
|
|
||||||
throw Error(
|
|
||||||
"Github API request failed while getting releases. Check the GitHub status page for outages. Try again later.",
|
|
||||||
);
|
|
||||||
}
|
|
||||||
return releaseTagNames;
|
|
||||||
}
|
|
||||||
|
|
||||||
async function getLatestVersion(githubToken: string) {
|
|
||||||
core.info("Getting latest version from GitHub API...");
|
|
||||||
const octokit = new Octokit({
|
|
||||||
auth: githubToken,
|
|
||||||
});
|
|
||||||
|
|
||||||
let latestRelease: { tag_name: string } | undefined;
|
|
||||||
try {
|
|
||||||
latestRelease = await getLatestRelease(octokit);
|
|
||||||
} catch (err) {
|
|
||||||
if ((err as Error).message.includes("Bad credentials")) {
|
|
||||||
core.info(
|
|
||||||
"No (valid) GitHub token provided. Falling back to anonymous. Requests might be rate limited.",
|
|
||||||
);
|
|
||||||
const octokit = new Octokit();
|
|
||||||
latestRelease = await getLatestRelease(octokit);
|
|
||||||
} else {
|
|
||||||
core.error(
|
|
||||||
"Github API request failed while getting latest release. Check the GitHub status page for outages. Try again later.",
|
|
||||||
);
|
|
||||||
throw err;
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
if (!latestRelease) {
|
core.info(`Getting available versions from ${VERSIONS_NDJSON_URL} ...`);
|
||||||
throw new Error("Could not determine latest release.");
|
return await getAllVersionsFromNdjson();
|
||||||
}
|
|
||||||
core.debug(`Latest version: ${latestRelease.tag_name}`);
|
|
||||||
return latestRelease.tag_name;
|
|
||||||
}
|
|
||||||
|
|
||||||
async function getLatestRelease(octokit: Octokit) {
|
|
||||||
const { data: latestRelease } = await octokit.rest.repos.getLatestRelease({
|
|
||||||
owner: OWNER,
|
|
||||||
repo: REPO,
|
|
||||||
});
|
|
||||||
return latestRelease;
|
|
||||||
}
|
}
|
||||||
|
|
||||||
function maxSatisfying(
|
function maxSatisfying(
|
||||||
|
src/download/legacy-version-manifest.ts (new file, 80 lines)
@@ -0,0 +1,80 @@
+import * as core from "@actions/core";
+
+export interface ManifestEntry {
+  arch: string;
+  platform: string;
+  version: string;
+  downloadUrl: string;
+  checksum?: string;
+  variant?: string;
+  archiveFormat?: string;
+}
+
+interface LegacyManifestEntry {
+  arch: string;
+  platform: string;
+  version: string;
+  downloadUrl: string;
+  checksum?: string;
+}
+
+const warnedLegacyManifestUrls = new Set<string>();
+
+export function parseLegacyManifestEntries(
+  parsedEntries: unknown[],
+  manifestUrl: string,
+): ManifestEntry[] {
+  warnAboutLegacyManifestFormat(manifestUrl);
+
+  return parsedEntries.map((entry, index) => {
+    if (!isLegacyManifestEntry(entry)) {
+      throw new Error(
+        `Invalid legacy manifest-file entry at index ${index} in ${manifestUrl}.`,
+      );
+    }
+
+    return {
+      arch: entry.arch,
+      checksum: entry.checksum,
+      downloadUrl: entry.downloadUrl,
+      platform: entry.platform,
+      version: entry.version,
+    };
+  });
+}
+
+export function clearLegacyManifestWarnings(): void {
+  warnedLegacyManifestUrls.clear();
+}
+
+function warnAboutLegacyManifestFormat(manifestUrl: string): void {
+  if (warnedLegacyManifestUrls.has(manifestUrl)) {
+    return;
+  }
+
+  warnedLegacyManifestUrls.add(manifestUrl);
+  core.warning(
+    `manifest-file ${manifestUrl} uses the legacy JSON array format, which is deprecated. Please migrate to the astral-sh/versions NDJSON format before the next major release.`,
+  );
+}
+
+function isLegacyManifestEntry(value: unknown): value is LegacyManifestEntry {
+  if (!isRecord(value)) {
+    return false;
+  }
+
+  const checksumIsValid =
+    typeof value.checksum === "string" || value.checksum === undefined;
+
+  return (
+    typeof value.arch === "string" &&
+    checksumIsValid &&
+    typeof value.downloadUrl === "string" &&
+    typeof value.platform === "string" &&
+    typeof value.version === "string"
+  );
+}
+
+function isRecord(value: unknown): value is Record<string, unknown> {
+  return typeof value === "object" && value !== null;
+}
src/download/variant-selection.ts (new file, 39 lines)
@@ -0,0 +1,39 @@
+interface VariantAwareEntry {
+  variant?: string;
+}
+
+export function selectDefaultVariant<T extends VariantAwareEntry>(
+  entries: T[],
+  duplicateEntryDescription: string,
+): T {
+  const firstEntry = entries[0];
+  if (firstEntry === undefined) {
+    throw new Error("selectDefaultVariant requires at least one candidate.");
+  }
+
+  if (entries.length === 1) {
+    return firstEntry;
+  }
+
+  const defaultEntries = entries.filter((entry) =>
+    isDefaultVariant(entry.variant),
+  );
+  if (defaultEntries.length === 1) {
+    return defaultEntries[0];
+  }
+
+  throw new Error(
+    `${duplicateEntryDescription} with variants ${formatVariants(entries)}. setup-uv currently requires a single default variant for duplicate platform entries.`,
+  );
+}
+
+function isDefaultVariant(variant: string | undefined): boolean {
+  return variant === undefined || variant === "default";
+}
+
+function formatVariants<T extends VariantAwareEntry>(entries: T[]): string {
+  return entries
+    .map((entry) => entry.variant ?? "default")
+    .sort((left, right) => left.localeCompare(right))
+    .join(", ");
+}
@@ -1,49 +1,78 @@
-import { promises as fs } from "node:fs";
-import { join } from "node:path";
 import * as core from "@actions/core";
 import * as semver from "semver";
 import { fetch } from "../utils/fetch";
+import {
+  clearLegacyManifestWarnings,
+  type ManifestEntry,
+  parseLegacyManifestEntries,
+} from "./legacy-version-manifest";
+import { selectDefaultVariant } from "./variant-selection";
+import { type NdjsonVersion, parseVersionData } from "./versions-client";

-const localManifestFile = join(__dirname, "..", "..", "version-manifest.json");
-
-interface ManifestEntry {
-  version: string;
-  artifactName: string;
-  arch: string;
-  platform: string;
-  downloadUrl: string;
+export interface ManifestArtifact {
+  downloadUrl: string;
+  checksum?: string;
+  archiveFormat?: string;
 }

+const cachedManifestEntries = new Map<string, ManifestEntry[]>();
+
 export async function getLatestKnownVersion(
-  manifestUrl: string | undefined,
+  manifestUrl: string,
 ): Promise<string> {
-  const manifestEntries = await getManifestEntries(manifestUrl);
-  return manifestEntries.reduce((a, b) =>
-    semver.gt(a.version, b.version) ? a : b,
-  ).version;
+  const versions = await getAllVersions(manifestUrl);
+  const latestVersion = versions.reduce((latest, current) =>
+    semver.gt(current, latest) ? current : latest,
+  );
+
+  return latestVersion;
 }

-export async function getDownloadUrl(
-  manifestUrl: string | undefined,
+export async function getAllVersions(manifestUrl: string): Promise<string[]> {
+  const manifestEntries = await getManifestEntries(manifestUrl);
+  return [...new Set(manifestEntries.map((entry) => entry.version))];
+}
+
+export async function getManifestArtifact(
+  manifestUrl: string,
   version: string,
   arch: string,
   platform: string,
-): Promise<string | undefined> {
+): Promise<ManifestArtifact | undefined> {
   const manifestEntries = await getManifestEntries(manifestUrl);
-  const entry = manifestEntries.find(
-    (entry) =>
-      entry.version === version &&
-      entry.arch === arch &&
-      entry.platform === platform,
+  const entry = selectManifestEntry(
+    manifestEntries,
+    manifestUrl,
+    version,
+    arch,
+    platform,
   );
-  return entry ? entry.downloadUrl : undefined;
+
+  if (!entry) {
+    return undefined;
+  }
+
+  return {
+    archiveFormat: entry.archiveFormat,
+    checksum: entry.checksum,
+    downloadUrl: entry.downloadUrl,
+  };
+}
+
+export function clearManifestCache(): void {
+  cachedManifestEntries.clear();
+  clearLegacyManifestWarnings();
 }

 async function getManifestEntries(
-  manifestUrl: string | undefined,
+  manifestUrl: string,
 ): Promise<ManifestEntry[]> {
-  let data: string;
-  if (manifestUrl !== undefined) {
+  const cachedEntries = cachedManifestEntries.get(manifestUrl);
+  if (cachedEntries !== undefined) {
+    core.debug(`Using cached manifest-file from: ${manifestUrl}`);
+    return cachedEntries;
+  }
+
   core.info(`Fetching manifest-file from: ${manifestUrl}`);
   const response = await fetch(manifestUrl, {});
   if (!response.ok) {
@@ -51,41 +80,90 @@ async function getManifestEntries(
       `Failed to fetch manifest-file: ${response.status} ${response.statusText}`,
     );
   }
-    data = await response.text();
-  } else {
-    core.info("manifest-file not provided, reading from local file.");
-    const fileContent = await fs.readFile(localManifestFile);
-    data = fileContent.toString();
-  }

-  return JSON.parse(data);
+  const data = await response.text();
+  const parsedEntries = parseManifestEntries(data, manifestUrl);
+  cachedManifestEntries.set(manifestUrl, parsedEntries);
+
+  return parsedEntries;
 }

-export async function updateVersionManifest(
+function parseManifestEntries(
+  data: string,
   manifestUrl: string,
-  downloadUrls: string[],
-): Promise<void> {
-  const manifest: ManifestEntry[] = [];
-
-  for (const downloadUrl of downloadUrls) {
-    const urlParts = downloadUrl.split("/");
-    const version = urlParts[urlParts.length - 2];
-    const artifactName = urlParts[urlParts.length - 1];
-    if (!artifactName.startsWith("uv-")) {
-      continue;
-    }
-    if (artifactName.startsWith("uv-installer")) {
-      continue;
-    }
-    const artifactParts = artifactName.split(".")[0].split("-");
-    manifest.push({
-      arch: artifactParts[1],
-      artifactName: artifactName,
-      downloadUrl: downloadUrl,
-      platform: artifactName.split(`uv-${artifactParts[1]}-`)[1].split(".")[0],
-      version: version,
-    });
-  }
-  core.debug(`Updating manifest-file: ${JSON.stringify(manifest)}`);
-  await fs.writeFile(manifestUrl, JSON.stringify(manifest));
+): ManifestEntry[] {
+  const trimmed = data.trim();
+  if (trimmed === "") {
+    throw new Error(`manifest-file at ${manifestUrl} is empty.`);
+  }
+
+  const parsedAsJson = tryParseJson(trimmed);
+  if (Array.isArray(parsedAsJson)) {
+    return parseLegacyManifestEntries(parsedAsJson, manifestUrl);
+  }
+
+  const versions = parseVersionData(trimmed, manifestUrl);
+  return mapNdjsonVersionsToManifestEntries(versions, manifestUrl);
+}
+
+function mapNdjsonVersionsToManifestEntries(
+  versions: NdjsonVersion[],
+  manifestUrl: string,
+): ManifestEntry[] {
+  const manifestEntries: ManifestEntry[] = [];
+
+  for (const versionData of versions) {
+    for (const artifact of versionData.artifacts) {
+      const [arch, ...platformParts] = artifact.platform.split("-");
+      if (arch === undefined || platformParts.length === 0) {
+        throw new Error(
+          `Invalid artifact platform '${artifact.platform}' in manifest-file ${manifestUrl}.`,
+        );
+      }
+
+      manifestEntries.push({
+        arch,
+        archiveFormat: artifact.archive_format,
+        checksum: artifact.sha256,
+        downloadUrl: artifact.url,
+        platform: platformParts.join("-"),
+        variant: artifact.variant,
+        version: versionData.version,
+      });
+    }
+  }
+
+  return manifestEntries;
+}
+
+function selectManifestEntry(
+  manifestEntries: ManifestEntry[],
+  manifestUrl: string,
+  version: string,
+  arch: string,
+  platform: string,
+): ManifestEntry | undefined {
+  const matches = manifestEntries.filter(
+    (candidate) =>
+      candidate.version === version &&
+      candidate.arch === arch &&
+      candidate.platform === platform,
+  );
+
+  if (matches.length === 0) {
+    return undefined;
+  }
+
+  return selectDefaultVariant(
+    matches,
+    `manifest-file ${manifestUrl} contains multiple artifacts for version ${version}, arch ${arch}, platform ${platform}`,
+  );
+}
+
+function tryParseJson(value: string): unknown {
+  try {
+    return JSON.parse(value);
+  } catch {
+    return undefined;
+  }
 }
src/download/versions-client.ts (new file, 191 lines)
@@ -0,0 +1,191 @@
+import * as core from "@actions/core";
+import { VERSIONS_NDJSON_URL } from "../utils/constants";
+import { fetch } from "../utils/fetch";
+import { selectDefaultVariant } from "./variant-selection";
+
+export interface NdjsonArtifact {
+  platform: string;
+  variant?: string;
+  url: string;
+  archive_format: string;
+  sha256: string;
+}
+
+export interface NdjsonVersion {
+  version: string;
+  artifacts: NdjsonArtifact[];
+}
+
+export interface ArtifactResult {
+  url: string;
+  sha256: string;
+  archiveFormat: string;
+}
+
+const cachedVersionData = new Map<string, NdjsonVersion[]>();
+
+export async function fetchVersionData(
+  url: string = VERSIONS_NDJSON_URL,
+): Promise<NdjsonVersion[]> {
+  const cachedVersions = cachedVersionData.get(url);
+  if (cachedVersions !== undefined) {
+    core.debug(`Using cached NDJSON version data from ${url}`);
+    return cachedVersions;
+  }
+
+  core.info(`Fetching version data from ${url} ...`);
+  const response = await fetch(url, {});
+  if (!response.ok) {
+    throw new Error(
+      `Failed to fetch version data: ${response.status} ${response.statusText}`,
+    );
+  }
+
+  const body = await response.text();
+  const versions = parseVersionData(body, url);
+  cachedVersionData.set(url, versions);
+  return versions;
+}
+
+export function parseVersionData(
+  data: string,
+  sourceDescription: string,
+): NdjsonVersion[] {
+  const versions: NdjsonVersion[] = [];
+
+  for (const [index, line] of data.split("\n").entries()) {
+    const trimmed = line.trim();
+    if (trimmed === "") {
+      continue;
+    }
+
+    let parsed: unknown;
+    try {
+      parsed = JSON.parse(trimmed);
+    } catch (error) {
+      throw new Error(
+        `Failed to parse version data from ${sourceDescription} at line ${index + 1}: ${(error as Error).message}`,
+      );
+    }
+
+    if (!isNdjsonVersion(parsed)) {
+      throw new Error(
+        `Invalid NDJSON record in ${sourceDescription} at line ${index + 1}.`,
+      );
+    }
+
+    versions.push(parsed);
+  }
+
+  if (versions.length === 0) {
+    throw new Error(`No version data found in ${sourceDescription}.`);
+  }
+
+  return versions;
+}
+
+export async function getLatestVersion(): Promise<string> {
+  const versions = await fetchVersionData();
+  const latestVersion = versions[0]?.version;
+  if (!latestVersion) {
+    throw new Error("No versions found in NDJSON data");
+  }
+
+  core.debug(`Latest version from NDJSON: ${latestVersion}`);
+  return latestVersion;
+}
+
+export async function getAllVersions(): Promise<string[]> {
+  const versions = await fetchVersionData();
+  return versions.map((versionData) => versionData.version);
+}
+
+export async function getArtifact(
+  version: string,
+  arch: string,
+  platform: string,
+): Promise<ArtifactResult | undefined> {
+  const versions = await fetchVersionData();
+  const versionData = versions.find(
+    (candidate) => candidate.version === version,
+  );
+  if (!versionData) {
+    core.debug(`Version ${version} not found in NDJSON data`);
+    return undefined;
+  }
+
+  const targetPlatform = `${arch}-${platform}`;
+  const matchingArtifacts = versionData.artifacts.filter(
+    (candidate) => candidate.platform === targetPlatform,
+  );
+
+  if (matchingArtifacts.length === 0) {
+    core.debug(
+      `Artifact for ${targetPlatform} not found in version ${version}. Available platforms: ${versionData.artifacts
+        .map((candidate) => candidate.platform)
+        .join(", ")}`,
+    );
+    return undefined;
+  }
+
+  const artifact = selectArtifact(matchingArtifacts, version, targetPlatform);
+
+  return {
+    archiveFormat: artifact.archive_format,
+    sha256: artifact.sha256,
+    url: artifact.url,
+  };
+}
+
+export function clearCache(url?: string): void {
+  if (url === undefined) {
+    cachedVersionData.clear();
+    return;
+  }
+
+  cachedVersionData.delete(url);
+}
+
+function selectArtifact(
+  artifacts: NdjsonArtifact[],
+  version: string,
+  targetPlatform: string,
+): NdjsonArtifact {
+  return selectDefaultVariant(
+    artifacts,
+    `Multiple artifacts found for ${targetPlatform} in version ${version}`,
+  );
+}
+
+function isNdjsonVersion(value: unknown): value is NdjsonVersion {
+  if (!isRecord(value)) {
+    return false;
+  }
+
+  if (typeof value.version !== "string" || !Array.isArray(value.artifacts)) {
+    return false;
+  }
+
+  return value.artifacts.every(isNdjsonArtifact);
+}
+
+function isNdjsonArtifact(value: unknown): value is NdjsonArtifact {
+  if (!isRecord(value)) {
+    return false;
+  }
+
+  const variantIsValid =
+    typeof value.variant === "string" || value.variant === undefined;
+
+  return (
+    typeof value.archive_format === "string" &&
+    typeof value.platform === "string" &&
+    typeof value.sha256 === "string" &&
+    typeof value.url === "string" &&
+    variantIsValid
+  );
+}
+
+function isRecord(value: unknown): value is Record<string, unknown> {
+  return typeof value === "object" && value !== null;
+}
@@ -5,6 +5,7 @@ import * as exec from "@actions/exec";
 import { restoreCache } from "./cache/restore-cache";
 import {
   downloadVersionFromManifest,
+  downloadVersionFromNdjson,
   resolveVersion,
   tryGetFromToolCache,
 } from "./download/download-version";
@@ -37,6 +38,8 @@ import {
 } from "./utils/platforms";
 import { getUvVersionFromFile } from "./version/resolve";

+const sourceDir = __dirname;
+
 async function getPythonVersion(): Promise<string> {
   if (pythonVersion !== "") {
     return pythonVersion;
@@ -139,13 +142,22 @@
     };
   }

-  const downloadVersionResult = await downloadVersionFromManifest(
-    manifestFile,
-    platform,
-    arch,
-    resolvedVersion,
-    checkSum,
-    githubToken,
-  );
+  const downloadVersionResult =
+    manifestFile !== undefined
+      ? await downloadVersionFromManifest(
+          manifestFile,
+          platform,
+          arch,
+          resolvedVersion,
+          checkSum,
+          githubToken,
+        )
+      : await downloadVersionFromNdjson(
+          platform,
+          arch,
+          resolvedVersion,
+          checkSum,
+          githubToken,
+        );

   return {
@@ -158,12 +170,7 @@ async function determineVersion(
   manifestFile: string | undefined,
 ): Promise<string> {
   if (versionInput !== "") {
-    return await resolveVersion(
-      versionInput,
-      manifestFile,
-      githubToken,
-      resolutionStrategy,
-    );
+    return await resolveVersion(versionInput, manifestFile, resolutionStrategy);
   }
   if (versionFileInput !== "") {
     const versionFromFile = getUvVersionFromFile(versionFileInput);
@@ -175,7 +182,6 @@ async function determineVersion(
     return await resolveVersion(
       versionFromFile,
       manifestFile,
-      githubToken,
       resolutionStrategy,
     );
   }
@@ -193,7 +199,6 @@ async function determineVersion(
     return await resolveVersion(
       versionFromUvToml || versionFromPyproject || "latest",
       manifestFile,
-      githubToken,
       resolutionStrategy,
     );
   }
@@ -305,7 +310,7 @@ function setCacheDir(): void {

 function addMatchers(): void {
   if (addProblemMatchers) {
-    const matchersPath = path.join(__dirname, `..${path.sep}..`, ".github");
+    const matchersPath = path.join(sourceDir, "..", "..", ".github");
     core.info(`##[add-matcher]${path.join(matchersPath, "python.json")}`);
   }
 }
|||||||
src/update-known-checksums.ts (new file, 81 lines)

```diff
@@ -0,0 +1,81 @@
+import * as core from "@actions/core";
+import * as semver from "semver";
+import { KNOWN_CHECKSUMS } from "./download/checksum/known-checksums";
+import {
+  type ChecksumEntry,
+  updateChecksums,
+} from "./download/checksum/update-known-checksums";
+import {
+  fetchVersionData,
+  getLatestVersion,
+  type NdjsonVersion,
+} from "./download/versions-client";
+
+const VERSION_IN_CHECKSUM_KEY_PATTERN =
+  /-(\d+\.\d+\.\d+(?:[-+][0-9A-Za-z.-]+)?)$/;
+
+async function run(): Promise<void> {
+  const checksumFilePath = process.argv.slice(2)[0];
+  if (!checksumFilePath) {
+    throw new Error(
+      "Missing checksum file path. Usage: node dist/update-known-checksums/index.cjs <checksum-file-path>",
+    );
+  }
+
+  const latestVersion = await getLatestVersion();
+  const latestKnownVersion = getLatestKnownVersionFromChecksums();
+
+  if (semver.lte(latestVersion, latestKnownVersion)) {
+    core.info(
+      `Latest release (${latestVersion}) is not newer than the latest known version (${latestKnownVersion}). Skipping update.`,
+    );
+    return;
+  }
+
+  const versions = await fetchVersionData();
+  const checksumEntries = extractChecksumsFromNdjson(versions);
+  await updateChecksums(checksumFilePath, checksumEntries);
+
+  core.setOutput("latest-version", latestVersion);
+}
+
+function getLatestKnownVersionFromChecksums(): string {
+  const versions = new Set<string>();
+
+  for (const key of Object.keys(KNOWN_CHECKSUMS)) {
+    const version = extractVersionFromChecksumKey(key);
+    if (version !== undefined) {
+      versions.add(version);
+    }
+  }
+
+  const latestVersion = [...versions].sort(semver.rcompare)[0];
+  if (!latestVersion) {
+    throw new Error("Could not determine latest known version from checksums.");
+  }
+
+  return latestVersion;
+}
+
+function extractVersionFromChecksumKey(key: string): string | undefined {
+  return key.match(VERSION_IN_CHECKSUM_KEY_PATTERN)?.[1];
+}
+
+function extractChecksumsFromNdjson(
+  versions: NdjsonVersion[],
+): ChecksumEntry[] {
+  const checksums: ChecksumEntry[] = [];
+
+  for (const version of versions) {
+    for (const artifact of version.artifacts) {
+      checksums.push({
+        checksum: artifact.sha256,
+        key: `${artifact.platform}-${version.version}`,
+      });
+    }
+  }
+
+  return checksums;
+}
+
+run();
```
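For reference, the version-extraction helper above can be exercised in isolation. This sketch copies the regex and helper verbatim from the diff; the sample keys are invented, but follow the `${artifact.platform}-${version.version}` key shape used in `extractChecksumsFromNdjson`:

```typescript
// Same pattern as in update-known-checksums.ts: a semver (with optional
// pre-release/build suffix) anchored at the end of a checksum key.
const VERSION_IN_CHECKSUM_KEY_PATTERN =
  /-(\d+\.\d+\.\d+(?:[-+][0-9A-Za-z.-]+)?)$/;

function extractVersionFromChecksumKey(key: string): string | undefined {
  return key.match(VERSION_IN_CHECKSUM_KEY_PATTERN)?.[1];
}

// Hypothetical keys shaped like `${artifact.platform}-${version.version}`:
const v1 = extractVersionFromChecksumKey("x86_64-unknown-linux-gnu-0.5.1");
const v2 = extractVersionFromChecksumKey("aarch64-apple-darwin-0.5.2-rc.1");
const v3 = extractVersionFromChecksumKey("no-version-here");
console.log(v1, v2, v3); // 0.5.1 0.5.2-rc.1 undefined
```

Because the platform prefix itself contains hyphens, anchoring the pattern with `$` is what keeps the match on the trailing version rather than somewhere inside the platform triple.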
Deleted file (63 lines, the previous GitHub-Releases-based update script)

```diff
@@ -1,63 +0,0 @@
-import * as core from "@actions/core";
-import type { Endpoints } from "@octokit/types";
-import * as semver from "semver";
-import { updateChecksums } from "./download/checksum/update-known-checksums";
-import {
-  getLatestKnownVersion,
-  updateVersionManifest,
-} from "./download/version-manifest";
-import { OWNER, REPO } from "./utils/constants";
-import { Octokit } from "./utils/octokit";
-
-type Release =
-  Endpoints["GET /repos/{owner}/{repo}/releases"]["response"]["data"][number];
-
-async function run(): Promise<void> {
-  const checksumFilePath = process.argv.slice(2)[0];
-  const versionsManifestFile = process.argv.slice(2)[1];
-  const githubToken = process.argv.slice(2)[2];
-
-  const octokit = new Octokit({
-    auth: githubToken,
-  });
-
-  const { data: latestRelease } = await octokit.rest.repos.getLatestRelease({
-    owner: OWNER,
-    repo: REPO,
-  });
-
-  const latestKnownVersion = await getLatestKnownVersion(undefined);
-
-  if (semver.lte(latestRelease.tag_name, latestKnownVersion)) {
-    core.info(
-      `Latest release (${latestRelease.tag_name}) is not newer than the latest known version (${latestKnownVersion}). Skipping update.`,
-    );
-    return;
-  }
-
-  const releases: Release[] = await octokit.paginate(
-    octokit.rest.repos.listReleases,
-    {
-      owner: OWNER,
-      repo: REPO,
-    },
-  );
-  const checksumDownloadUrls: string[] = releases.flatMap((release) =>
-    release.assets
-      .filter((asset) => asset.name.endsWith(".sha256"))
-      .map((asset) => asset.browser_download_url),
-  );
-  await updateChecksums(checksumFilePath, checksumDownloadUrls);
-
-  const artifactDownloadUrls: string[] = releases.flatMap((release) =>
-    release.assets
-      .filter((asset) => !asset.name.endsWith(".sha256"))
-      .map((asset) => asset.browser_download_url),
-  );
-
-  await updateVersionManifest(versionsManifestFile, artifactDownloadUrls);
-
-  core.setOutput("latest-version", latestRelease.tag_name);
-}
-
-run();
```
```diff
@@ -1,5 +1,5 @@
-export const REPO = "uv";
-export const OWNER = "astral-sh";
 export const TOOL_CACHE_NAME = "uv";
 export const STATE_UV_PATH = "uv-path";
 export const STATE_UV_VERSION = "uv-version";
+export const VERSIONS_NDJSON_URL =
+  "https://raw.githubusercontent.com/astral-sh/versions/main/v1/uv.ndjson";
```
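The new `VERSIONS_NDJSON_URL` constant points at an NDJSON feed: one self-contained JSON document per line. A minimal sketch of parsing such a feed — the `version` and `artifacts` field names follow their usage in `extractChecksumsFromNdjson` in this diff, but the `parseNdjson` helper and the sample payload are illustrative, not the action's actual versions client:

```typescript
interface NdjsonVersion {
  version: string;
  artifacts: { platform: string; sha256: string }[];
}

// NDJSON: split on newlines, skip blank lines, parse each line independently.
function parseNdjson(body: string): NdjsonVersion[] {
  return body
    .split("\n")
    .filter((line) => line.trim() !== "")
    .map((line) => JSON.parse(line) as NdjsonVersion);
}

// Invented two-line payload mimicking the uv.ndjson shape:
const sample =
  '{"version":"0.5.0","artifacts":[{"platform":"x86_64-unknown-linux-gnu","sha256":"abc"}]}\n' +
  '{"version":"0.5.1","artifacts":[]}\n';
const versions = parseNdjson(sample);
console.log(versions.length, versions[0].version); // 2 0.5.0
```

Line-by-line parsing is the point of the format: a client can stream the feed and stop early, rather than buffering one huge JSON array.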
Deleted file (34 lines, the custom Octokit client)

```diff
@@ -1,34 +0,0 @@
-import type { OctokitOptions } from "@octokit/core";
-import { Octokit as Core } from "@octokit/core";
-import {
-  type PaginateInterface,
-  paginateRest,
-} from "@octokit/plugin-paginate-rest";
-import { legacyRestEndpointMethods } from "@octokit/plugin-rest-endpoint-methods";
-import { fetch as customFetch } from "./fetch";
-
-export type { RestEndpointMethodTypes } from "@octokit/plugin-rest-endpoint-methods";
-
-const DEFAULTS = {
-  baseUrl: "https://api.github.com",
-  userAgent: "setup-uv",
-};
-
-const OctokitWithPlugins = Core.plugin(paginateRest, legacyRestEndpointMethods);
-
-export const Octokit = OctokitWithPlugins.defaults(function buildDefaults(
-  options: OctokitOptions,
-): OctokitOptions {
-  return {
-    ...DEFAULTS,
-    ...options,
-    request: {
-      fetch: customFetch,
-      ...options.request,
-    },
-  };
-});
-
-export type Octokit = InstanceType<typeof OctokitWithPlugins> & {
-  paginate: PaginateInterface;
-};
```
```diff
@@ -1,12 +1,12 @@
 {
   "compilerOptions": {
-    "esModuleInterop": true /* Enables emit interoperability between CommonJS and ES Modules via creation of namespace objects for all imports. Implies 'allowSyntheticDefaultImports'. */,
-    "module": "nodenext" /* Specify module code generation: 'none', 'commonjs', 'amd', 'system', 'umd', 'es2015', or 'ESNext'. */,
-    "noImplicitAny": true /* Raise error on expressions and declarations with an implied 'any' type. */,
-    "outDir": "./lib" /* Redirect output structure to the directory. */,
-    "rootDir": "./src" /* Specify the root directory of input files. Use to control the output directory structure with --outDir. */,
-    "strict": true /* Enable all strict type-checking options. */,
-    "target": "ES2022" /* Specify ECMAScript target version: 'ES3' (default), 'ES5', 'ES2015', 'ES2016', 'ES2017', 'ES2018', 'ES2019' or 'ESNEXT'. */
+    "esModuleInterop": true,
+    "isolatedModules": true,
+    "module": "esnext",
+    "moduleResolution": "bundler",
+    "noImplicitAny": true,
+    "strict": true,
+    "target": "ES2022"
   },
-  "exclude": ["node_modules", "**/*.test.ts"]
+  "include": ["src/**/*.ts"]
 }
```
version-manifest.json: diff suppressed because it is too large (30872 lines changed).