Mirror of https://github.com/astral-sh/setup-uv.git (synced 2026-03-14 17:14:58 +00:00)

Compare commits: zsol/jj-uq...speed-up-v (44 commits)
| SHA1 |
|---|
| 01149c4575 |
| fd8f376b22 |
| f9070de1ea |
| cadb67bdc9 |
| e06108dd0a |
| 0f6ec07aaf |
| 821e5c9815 |
| 6ee6290f1c |
| 9f332a133a |
| 0acf9708ce |
| fe3617d6e9 |
| 2ff70eebcc |
| 5ba8a7e5d0 |
| 4bc8fabc0c |
| 950b623541 |
| 09ff6fe0ae |
| bd870193dd |
| f8858e6756 |
| 5a095e7a20 |
| b12532f27f |
| 0098a7571c |
| 2e7ed0e2bb |
| 04224aa8ca |
| 2bc602ff89 |
| dd9d748439 |
| 14eede1834 |
| c452423b2c |
| eac588ad8d |
| a97c6cbe9c |
| 02182fa02a |
| a3b3eaea92 |
| 78cebeceac |
| b6b8e2cd6a |
| e31bec8546 |
| db2b65ebae |
| 3511ff7054 |
| 99b0f0474b |
| db4d6bf3d6 |
| 98e1309028 |
| 5ed2ede620 |
| 5fca386933 |
| 803947b9bd |
| 24553ac46d |
| 085087a5d3 |
48
.agents/skills/dependabot-pr-rollup/SKILL.md
Normal file
@@ -0,0 +1,48 @@
---
name: dependabot-pr-rollup
description: Find open Dependabot PRs for the current GitHub repo, compare each PR head to its base branch, replay only the net dependency changes in a fresh worktree and branch, run npm validation, and optionally commit, push, and open a PR. Use when you want to batch or manually replicate active Dependabot updates.
license: MIT
compatibility: Requires git, git worktree, gh CLI auth, npm, and a GitHub repo with an origin remote.
---

# Dependabot PR Rollup

## When to use

Use this skill when the user wants to:

- find all open Dependabot PRs in the current repo
- reproduce their net effect in one local branch
- validate the result with the repo's standard npm checks
- optionally commit, push, and open a PR

## Workflow

1. Inspect the current checkout state, but do not reuse a dirty worktree.
2. List open Dependabot PRs with `gh pr list --state open --author app/dependabot`.
3. For each PR, collect the title, base branch, head branch, changed files, and relevant diffs.
4. Compare each PR head against `origin/<base>` instead of trusting the PR title. Dependabot PRs can already be partially merged, superseded by newer versions, or have no remaining net effect.
5. Create a new worktree and branch from `origin/<base>`.
6. Reproduce only the remaining dependency changes in the new worktree.
   - Inspect `package.json` before editing.
   - Run `npm ci --ignore-scripts` before applying updates.
   - Use `npm install ... --ignore-scripts` for direct dependency changes so `package-lock.json` stays in sync.
7. Run `npm run all`.
8. If requested, commit the changed source, lockfile, and generated artifacts, then push and open a PR.
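The discovery-and-replay steps above can be sketched as a dry run. The branch and worktree naming scheme here is hypothetical, and each command is echoed rather than executed so the sketch is safe to run anywhere:

```shell
#!/usr/bin/env bash
# Dry-run sketch of the rollup flow; prints each command instead of running it.
set -euo pipefail

# Hypothetical naming scheme for the rollup branch and worktree.
branch="dependabot-rollup-$(date +%Y%m%d)"
worktree="../${branch}"
base="main"

run() { echo "+ $*"; }

run gh pr list --state open --author app/dependabot            # discover open Dependabot PRs
run git fetch origin                                           # refresh origin/<base>
run git worktree add "$worktree" -b "$branch" "origin/$base"   # fresh worktree and branch
run npm ci --ignore-scripts                                    # clean install before edits
run npm run all                                                # validate the rollup
```

In a real run the `run` wrapper would be dropped, and each PR's net dependency diff would be replayed between the install and validation steps.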
## Repo-specific notes

- Use `gh` for GitHub operations.
- Keep the user's original checkout untouched by working in a separate worktree.
- In this repo, `npm run all` is the safest validation command because it runs build, check, package, and test.
- If dependency changes affect bundled output, include the regenerated `dist/` files.

## Report back

Always report:

- open Dependabot PRs found
- which PRs required no net changes
- new branch name
- new worktree path
- files changed
- `npm run all` result
- if applicable, commit SHA and PR URL
263
.github/copilot-instructions.md
vendored
@@ -1,263 +0,0 @@
# Copilot Instructions for setup-uv

This document provides essential information for GitHub Copilot coding agents working on the `astral-sh/setup-uv` repository.

## Repository Overview

**setup-uv** is a GitHub Action that sets up the [uv](https://docs.astral.sh/uv/) Python package installer in GitHub Actions workflows. It's a TypeScript-based action that downloads uv binaries, manages caching, handles version resolution, and configures the environment for subsequent workflow steps.

### Key Features

- Downloads and installs specific uv versions from GitHub releases
- Supports version resolution from config files (pyproject.toml, uv.toml, .tool-versions)
- Implements intelligent caching for both uv cache and Python installations
- Provides cross-platform support (Linux, macOS, Windows, including ARM architectures)
- Includes problem matchers for Python error reporting
- Supports environment activation and custom tool directories
## Repository Structure

**Size**: Small-medium repository (~50 source files, ~400 total files including dependencies)
**Languages**: TypeScript (primary), JavaScript (compiled output), JSON (configuration)
**Runtime**: Node.js 24 (GitHub Actions runtime)
**Key Dependencies**: @actions/core, @actions/cache, @actions/tool-cache, @octokit/core

### Core Architecture

```
src/
├── setup-uv.ts               # Main entry point and orchestration
├── save-cache.ts             # Post-action cache saving logic
├── update-known-versions.ts  # Maintenance script for version updates
├── cache/                    # Cache management functionality
├── download/                 # Version resolution and binary downloading
├── utils/                    # Input parsing, platform detection, configuration
└── version/                  # Version resolution from various file formats
```

### Key Files and Locations

- **Action Definition**: `action.yml` - Defines all inputs/outputs and entry points
- **Main Source**: `src/setup-uv.ts` - Primary action logic
- **Configuration**: `biome.json` (linting), `tsconfig.json` (TypeScript), `jest.config.js` (testing)
- **Compiled Output**: `dist/` - Contains compiled Node.js bundles (auto-generated, committed)
- **Test Fixtures**: `__tests__/fixtures/` - Sample projects for different configuration scenarios
- **Workflows**: `.github/workflows/test.yml` - Comprehensive CI/CD pipeline
## Build and Development Process

### Prerequisites

- Node.js 24+ (matches GitHub Actions runtime)
- npm (included with Node.js)

### Essential Commands (ALWAYS run in this order)

#### 1. Install Dependencies

```bash
npm ci --ignore-scripts
```

**Timing**: ~20-30 seconds
**Note**: Always run this first after cloning or when package.json changes

#### 2. Build TypeScript

```bash
npm run build
```

**Timing**: ~5-10 seconds
**Purpose**: Compiles TypeScript source to JavaScript in the `lib/` directory

#### 3. Lint and Format Code

```bash
npm run check
```

**Timing**: ~2-5 seconds
**Tool**: Biome (replaces ESLint/Prettier)
**Auto-fixes**: Formatting, import organization, basic linting issues

#### 4. Package for Distribution

```bash
npm run package
```

**Timing**: ~20-30 seconds
**Purpose**: Creates bundled distributions in `dist/` using @vercel/ncc
**Critical**: This step MUST be run before committing - the `dist/` files are used by GitHub Actions

#### 5. Run Tests

```bash
npm test
```

**Timing**: ~10-15 seconds
**Framework**: Jest with TypeScript support
**Coverage**: Unit tests for version resolution, input parsing, checksum validation

#### 6. Complete Validation (Recommended)

```bash
npm run all
```

**Timing**: ~60-90 seconds
**Purpose**: Runs build → check → package → test in sequence
**Use**: Before making pull requests or when unsure about build state

### Important Build Notes

**CRITICAL**: Always run `npm run package` after making code changes. The `dist/` directory contains the compiled bundles that GitHub Actions actually executes. Forgetting this step will cause your changes to have no effect.

**TypeScript Warnings**: You may see ts-jest warnings about "isolatedModules" - these are harmless and don't affect functionality.

**Biome**: This project uses Biome instead of ESLint/Prettier. Run `npm run check` to fix formatting and linting issues automatically.
## Testing Strategy

### Unit Tests

- **Location**: `__tests__/` directory
- **Framework**: Jest with ts-jest transformer
- **Coverage**: Version resolution, input parsing, checksum validation, utility functions

### Integration Tests

- **Location**: `.github/workflows/test.yml`
- **Scope**: Full end-to-end testing across multiple platforms and scenarios
- **Key Test Categories**:
  - Version installation (specific, latest, semver ranges)
  - Cache behavior (setup, restore, invalidation)
  - Cross-platform compatibility (Ubuntu, macOS, Windows, ARM)
  - Configuration file parsing (pyproject.toml, uv.toml, requirements.txt)
  - Error handling and edge cases

### Test Fixtures

Located in `__tests__/fixtures/`, these provide sample projects with different configurations:

- `pyproject-toml-project/` - Standard Python project with uv version specification
- `uv-toml-project/` - Project using uv.toml configuration
- `requirements-txt-project/` - Legacy requirements.txt format
- `cache-dir-defined-project/` - Custom cache directory configuration
## Continuous Integration

### GitHub Workflows

#### Primary Test Suite (`.github/workflows/test.yml`)

- **Triggers**: PRs, pushes to main, manual dispatch
- **Matrix**: Multiple OS (Ubuntu, macOS, Windows), architecture (x64, ARM), and configuration combinations
- **Duration**: ~5 minutes for the full matrix
- **Key Validations**:
  - Cross-platform installation and functionality
  - Cache behavior and performance
  - Version resolution from various sources
  - Tool directory configurations
  - Problem matcher functionality

#### Maintenance Workflows

- **CodeQL Analysis**: Security scanning on pushes/PRs
- **Update Known Versions**: Daily job to sync with latest uv releases
- **Dependabot**: Automated dependency updates

### Pre-commit Validation

The CI runs these checks, which you should also run locally:

1. `npm run all` - Complete build and test suite
2. ActionLint - GitHub Actions workflow validation
3. Change detection - Ensures no uncommitted build artifacts
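The three local checks above can be sketched as one gate. This is a dry run that only prints the commands, since the exact `actionlint` invocation CI uses is an assumption:

```shell
#!/usr/bin/env bash
# Dry-run sketch of the local pre-commit gate; commands are echoed, not run.
set -euo pipefail

run() { echo "+ $*"; }

run npm run all            # 1. complete build and test suite
run actionlint             # 2. workflow validation (assumed invocation)
run git diff --exit-code   # 3. change detection: fail on uncommitted artifacts
```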
## Key Configuration Files

### Action Configuration (`action.yml`)

Defines 20+ inputs including version specifications, cache settings, tool directories, and environment options. This file is the authoritative source for understanding available action parameters.

### TypeScript Configuration (`tsconfig.json`)

- Target: ES2024
- Module: nodenext (Node.js modules)
- Strict mode enabled
- Output directory: `lib/`

### Linting Configuration (`biome.json`)

- Formatter and linter combined
- Enforces consistent code style
- Automatically organizes imports and sorts object keys
## Common Development Patterns

### Making Code Changes

1. Edit TypeScript source files in `src/`
2. Run `npm run build` to compile
3. Run `npm run check` to format and lint
4. Run `npm run package` to update distribution bundles
5. Run `npm test` to verify functionality
6. Commit all changes including `dist/` files
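A hedged local version of the guard behind step 6: fail when `dist/` differs from what is committed, which usually means `npm run package` was forgotten. The exact check CI performs is an assumption; this sketch treats "not a git checkout" as clean so it runs anywhere:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical pre-commit guard mirroring CI's change detection: complain
# when dist/ has uncommitted differences (a forgotten `npm run package`).
check_dist_clean() {
  local changes
  # Outside a git checkout, `git status` fails; treat that as clean here.
  changes="$(git status --porcelain dist/ 2>/dev/null || true)"
  if [ -n "$changes" ]; then
    echo "dist/ is stale: run 'npm run package' and commit the result" >&2
    return 1
  fi
  echo "dist/ is clean"
}

check_dist_clean || true
```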
### Adding New Features

- Follow existing patterns in `src/utils/inputs.ts` for new action inputs
- Update `action.yml` to declare new inputs/outputs
- Add corresponding tests in `__tests__/`
- Add a test in `.github/workflows/test.yml` if it affects integration
- Update README.md with usage examples

### Cache-Related Changes

- Cache logic is complex and significantly affects performance
- Test with multiple cache scenarios (hit, miss, invalidation)
- Consider the impact on both GitHub-hosted and self-hosted runners
- Validate cache key generation and dependency detection

### Version Resolution Changes

- Version resolution supports multiple file formats and precedence rules
- Test with the fixtures in `__tests__/fixtures/`
- Consider backward compatibility with existing projects
- Validate semver and PEP 440 specification handling
## Troubleshooting

### Build Failures

- **"Module not found"**: Run `npm ci --ignore-scripts` to ensure dependencies are installed
- **TypeScript errors**: Check `tsconfig.json` and ensure all imports are valid
- **Test failures**: Check whether test fixtures have been modified or logic changes broke assumptions

### Action Failures in Workflows

- **Changes not taking effect**: Ensure `npm run package` was run and the `dist/` files committed
- **Version resolution issues**: Check the version specification format and file existence
- **Cache problems**: Verify cache key generation and dependency glob patterns

### Common Gotchas

- **Forgetting to package**: Code changes won't take effect without running `npm run package`
- **Platform differences**: Windows paths use backslashes; test cross-platform behavior
- **Cache invalidation**: Cache keys are sensitive to dependency file changes
- **Tool directory permissions**: Some platforms require specific directory setups
## Trust These Instructions

These instructions are comprehensive and current. Only search for additional information if:

- You encounter specific error messages not covered here
- You need to understand implementation details of specific functions
- The instructions appear outdated (check the repository commit history)

For most development tasks, following the build process and development patterns outlined above will be sufficient.
2
.github/release-drafter.yml
vendored
@@ -19,7 +19,7 @@ categories:
      labels:
        - "maintenance"
        - "ci"
-       - "update-known-versions"
+       - "update-known-checksums"
    - title: "📚 Documentation"
      labels:
        - "documentation"
8
.github/workflows/codeql-analysis.yml
vendored
@@ -41,13 +41,13 @@ jobs:

    steps:
      - name: Checkout repository
-       uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+       uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          persist-credentials: false

      # Initializes the CodeQL tools for scanning.
      - name: Initialize CodeQL
-       uses: github/codeql-action/init@5d4e8d1aca955e8d8589aabd499c5cae939e33c7 # v4.31.9
+       uses: github/codeql-action/init@45cbd0c69e560cd9e7cd7f8c32362050c9b7ded2 # v4.32.2
        with:
          languages: ${{ matrix.language }}
          source-root: src
@@ -59,7 +59,7 @@ jobs:
      # Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
      # If this step fails, then you should remove it and run the build manually (see below)
      - name: Autobuild
-       uses: github/codeql-action/autobuild@5d4e8d1aca955e8d8589aabd499c5cae939e33c7 # v4.31.9
+       uses: github/codeql-action/autobuild@45cbd0c69e560cd9e7cd7f8c32362050c9b7ded2 # v4.32.2

      # ℹ️ Command-line programs to run using the OS shell.
      # 📚 https://git.io/JvXDl
@@ -73,4 +73,4 @@ jobs:
      # make release

      - name: Perform CodeQL Analysis
-       uses: github/codeql-action/analyze@5d4e8d1aca955e8d8589aabd499c5cae939e33c7 # v4.31.9
+       uses: github/codeql-action/analyze@45cbd0c69e560cd9e7cd7f8c32362050c9b7ded2 # v4.32.2
2
.github/workflows/release-drafter.yml
vendored
@@ -19,6 +19,6 @@ jobs:
      pull-requests: read
    steps:
      - name: 🚀 Run Release Drafter
-       uses: release-drafter/release-drafter@b1476f6e6eb133afa41ed8589daba6dc69b4d3f5 # v6.1.0
+       uses: release-drafter/release-drafter@6db134d15f3909ccc9eefd369f02bd1e9cffdf97 # v6.2.0
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
311
.github/workflows/test.yml
vendored
@@ -21,14 +21,14 @@ jobs:
    permissions:
      security-events: write # for zizmor
    steps:
-     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+     - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          persist-credentials: false
      - name: Actionlint
-       uses: eifinger/actionlint-action@213860089b7cf97d640aa67567898fabeb132746 # v1.9.3
+       uses: eifinger/actionlint-action@7802e0cc3ab3f81cbffb36fb0bf1a3621d994b89 # v1.10.1
      - name: Run zizmor
-       uses: zizmorcore/zizmor-action@e639db99335bc9038abc0e066dfcd72e23d26fb4 # v0.3.0
-     - uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
+       uses: zizmorcore/zizmor-action@0dce2577a4760a2749d8cfb7a84b7d5585ebcb7d # v0.5.0
+     - uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
        with:
          node-version-file: .nvmrc
          cache: npm
@@ -38,7 +38,7 @@ jobs:
          npm run all
      - name: Check all jobs are in all-tests-passed.needs
        run: |
-         tsc check-all-tests-passed-needs.ts
+         tsc --module nodenext --moduleResolution nodenext --target es2022 check-all-tests-passed-needs.ts
          node check-all-tests-passed-needs.js
        working-directory: .github/scripts
      - name: Make sure no changes from linters are detected
@@ -51,7 +51,7 @@
      matrix:
        os: [ubuntu-latest, macos-latest, macos-14, windows-latest]
    steps:
-     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+     - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          persist-credentials: false
      - name: Install latest version
@@ -76,7 +76,7 @@
  test-uv-no-modify-path:
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+     - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          persist-credentials: false
      - name: Install with UV_NO_MODIFY_PATH set
@@ -125,7 +125,7 @@
          expected-version: "0.1.0"
          resolution-strategy: "lowest"
    steps:
-     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+     - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          persist-credentials: false
      - name: Install version ${{ matrix.input.version-input }} with strategy ${{ matrix.input.resolution-strategy || 'highest' }}
@@ -154,7 +154,7 @@
      matrix:
        version-input: ["latest", ">=0.8"]
    steps:
-     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+     - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          persist-credentials: false
      - name: Install version ${{ matrix.version-input }}
@@ -182,7 +182,7 @@
        - working-directory: "__tests__/fixtures/uv-toml-project"
          expected-version: "0.5.15"
    steps:
-     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+     - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          persist-credentials: false
      - name: Install version from ${{ matrix.input.working-directory }}
@@ -208,7 +208,7 @@
        - version-file: "__tests__/fixtures/.tool-versions"
          expected-version: "0.5.15"
    steps:
-     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+     - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          persist-credentials: false
      - name: Install version from ${{ matrix.input.version-file }}
@@ -225,7 +225,7 @@
  test-malformed-pyproject-file-fallback:
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+     - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          persist-credentials: false
      - name: Install using malformed pyproject.toml
@@ -245,7 +245,7 @@
        - os: macos-latest
          checksum: "a70cbfbf3bb5c08b2f84963b4f12c94e08fbb2468ba418a3bfe1066fbe9e7218"
    steps:
-     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+     - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          persist-credentials: false
      - name: Checksum matches expected
@@ -259,7 +259,7 @@
  test-with-explicit-token:
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+     - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          persist-credentials: false
      - name: Install default version
@@ -272,7 +272,7 @@
  test-uvx:
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+     - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          persist-credentials: false
      - name: Install default version
@@ -285,7 +285,7 @@
      matrix:
        os: [ubuntu-latest, macos-latest, macos-14, windows-latest]
    steps:
-     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+     - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          persist-credentials: false
      - name: Install default version
@@ -293,35 +293,13 @@
      - run: uv tool install ruff
      - run: ruff --version

-  test-tilde-expansion-tool-dirs:
-    runs-on: selfhosted-ubuntu-arm64
-    steps:
-      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
-        with:
-          persist-credentials: false
-      - name: Setup with cache
-        uses: ./
-        with:
-          tool-bin-dir: "~/tool-bin-dir"
-          tool-dir: "~/tool-dir"
-      - name: "Check if tool dirs are expanded"
-        run: |
-          if ! echo "$PATH" | grep -q "/home/ubuntu/tool-bin-dir"; then
-            echo "PATH does not contain /home/ubuntu/tool-bin-dir: $PATH"
-            exit 1
-          fi
-          if [ "$UV_TOOL_DIR" != "/home/ubuntu/tool-dir" ]; then
-            echo "UV_TOOL_DIR does not contain /home/ubuntu/tool-dir: $UV_TOOL_DIR"
-            exit 1
-          fi
-
  test-python-version:
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
    steps:
-     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+     - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          persist-credentials: false
      - name: Install latest version
@@ -353,7 +331,7 @@
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
    steps:
-     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+     - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          persist-credentials: false
      - name: Install latest version
@@ -386,11 +364,79 @@
        env:
          UV_VENV: ${{ steps.setup-uv.outputs.venv }}

+  test-activate-environment-custom-path:
+    runs-on: ${{ matrix.os }}
+    strategy:
+      matrix:
+        os: [ubuntu-latest, macos-latest, windows-latest]
+    steps:
+      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
+        with:
+          persist-credentials: false
+      - name: Install latest version
+        id: setup-uv
+        uses: ./
+        with:
+          python-version: 3.13.1t
+          activate-environment: true
+          venv-path: ${{ runner.temp }}/custom-venv
+      - name: Verify VIRTUAL_ENV matches output
+        run: |
+          if [ "$VIRTUAL_ENV" != "$UV_VENV" ]; then
+            echo "VIRTUAL_ENV does not match venv output: $VIRTUAL_ENV vs $UV_VENV"
+            exit 1
+          fi
+        shell: bash
+        env:
+          UV_VENV: ${{ steps.setup-uv.outputs.venv }}
+      - name: Verify venv location is runner.temp/custom-venv
+        run: |
+          python - <<'PY'
+          import os
+          from pathlib import Path
+
+          venv = Path(os.environ["VIRTUAL_ENV"]).resolve()
+          temp = Path(os.environ["RUNNER_TEMP"]).resolve()
+
+          if venv.name != "custom-venv":
+              raise SystemExit(f"Expected venv name 'custom-venv', got: {venv}")
+          if venv.parent != temp:
+              raise SystemExit(f"Expected venv under {temp}, got: {venv}")
+          if not venv.is_dir():
+              raise SystemExit(f"Venv directory does not exist: {venv}")
+          PY
+        shell: bash
+      - name: Verify packages can be installed
+        run: uv pip install pip
+        shell: bash
+      - name: Verify python runs from custom venv
+        run: |
+          python - <<'PY'
+          import sys
+          if "custom-venv" not in sys.executable:
+              raise SystemExit(f"Python is not running from custom venv: {sys.executable}")
+          PY
+        shell: bash
+
+  test-debian-unstable:
+    runs-on: ubuntu-latest
+    container: debian:unstable
+    steps:
+      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
+        with:
+          persist-credentials: false
+      - name: Install latest version
+        uses: ./
+        with:
+          enable-cache: true
+      - run: uv sync
+        working-directory: __tests__/fixtures/uv-project
+
  test-musl:
    runs-on: ubuntu-latest
    container: alpine
    steps:
-     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
+     - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          persist-credentials: false
      - name: Install latest version
@@ -429,7 +475,7 @@ jobs:
|
||||
- os: windows-2025
|
||||
expected-os: "windows-2025"
|
||||
steps:
|
||||
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
with:
|
||||
persist-credentials: false
|
||||
- name: Setup uv
|
||||
@@ -453,9 +499,9 @@ jobs:
|
||||
strategy:
|
||||
matrix:
|
||||
enable-cache: ["true", "false", "auto"]
|
||||
os: ["ubuntu-latest", "selfhosted-ubuntu-arm64", "windows-latest"]
|
||||
os: ["ubuntu-latest", "windows-latest"]
|
||||
steps:
|
||||
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
with:
|
||||
persist-credentials: false
|
||||
- name: Setup with cache
|
||||
@@ -471,10 +517,10 @@ jobs:
|
||||
strategy:
|
||||
matrix:
|
||||
enable-cache: ["true", "false", "auto"]
|
||||
os: ["ubuntu-latest", "selfhosted-ubuntu-arm64", "windows-latest"]
|
||||
os: ["ubuntu-latest", "windows-latest"]
|
||||
needs: test-setup-cache
|
||||
steps:
|
||||
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
with:
|
||||
persist-credentials: false
|
||||
- name: Restore with cache
|
||||
@@ -493,7 +539,7 @@ jobs:
|
||||
CACHE_HIT: ${{ steps.restore.outputs.cache-hit }}
|
||||
shell: bash
|
||||
- name: Cache was not hit
|
||||
if: ${{ matrix.enable-cache == 'false' || (matrix.enable-cache == 'auto' && matrix.os == 'selfhosted-ubuntu-arm64') }}
|
||||
if: ${{ matrix.enable-cache == 'false' }}
|
||||
run: |
|
||||
if [ "$CACHE_HIT" == "true" ]; then
|
||||
exit 1
|
||||
@@ -508,7 +554,7 @@ jobs:
|
||||
test-setup-cache-requirements-txt:
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
with:
|
||||
persist-credentials: false
|
||||
- name: Setup with cache
|
||||
@@ -524,7 +570,7 @@ jobs:
|
||||
runs-on: ubuntu-latest
|
||||
needs: test-setup-cache-requirements-txt
|
||||
steps:
|
||||
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
|
||||
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
|
||||
with:
|
||||
persist-credentials: false
|
||||
- name: Restore with cache
|
||||
@@ -548,7 +594,7 @@ jobs:
|
||||
test-setup-cache-dependency-glob:
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Setup with cache
@@ -565,7 +611,7 @@ jobs:
runs-on: ubuntu-latest
needs: test-setup-cache-dependency-glob
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Change pyproject.toml
@@ -593,7 +639,7 @@ jobs:
test-setup-cache-save-cache-false:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Setup with cache
@@ -609,7 +655,7 @@ jobs:
runs-on: ubuntu-latest
needs: test-setup-cache-save-cache-false
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Restore with cache
@@ -629,7 +675,7 @@ jobs:
test-setup-cache-restore-cache-false:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Setup with cache
@@ -644,7 +690,7 @@ jobs:
runs-on: ubuntu-latest
needs: test-setup-cache-restore-cache-false
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Restore with cache
@@ -670,11 +716,9 @@ jobs:
expected-cache-dir: "/home/runner/work/_temp/setup-uv-cache"
- os: windows-latest
expected-cache-dir: "D:\\a\\_temp\\setup-uv-cache"
- os: selfhosted-ubuntu-arm64
expected-cache-dir: "/home/ubuntu/.cache/uv"
runs-on: ${{ matrix.inputs.os }}
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Setup with cache
@@ -692,7 +736,7 @@ jobs:
test-cache-local-cache-disabled:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Setup without cache
@@ -711,7 +755,7 @@ jobs:
test-cache-local-cache-disabled-but-explicit-path:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Setup without cache
@@ -728,99 +772,10 @@ jobs:
fi
shell: bash

test-setup-cache-local:
runs-on: selfhosted-ubuntu-arm64
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
persist-credentials: false
- name: Setup with cache
uses: ./
with:
enable-cache: true
cache-suffix: ${{ github.run_id }}-${{ github.run_attempt }}-test-setup-cache-local
cache-local-path: /tmp/uv-cache
- run: uv sync
working-directory: __tests__/fixtures/uv-project
test-restore-cache-local:
runs-on: selfhosted-ubuntu-arm64
needs: test-setup-cache-local
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
persist-credentials: false
- name: Restore with cache
id: restore
uses: ./
with:
enable-cache: true
cache-suffix: ${{ github.run_id }}-${{ github.run_attempt }}-test-setup-cache-local
cache-local-path: /tmp/uv-cache
- name: Cache was hit
run: |
if [ "$CACHE_HIT" != "true" ]; then
exit 1
fi
env:
CACHE_HIT: ${{ steps.restore.outputs.cache-hit }}
- run: uv sync
working-directory: __tests__/fixtures/uv-project

test-tilde-expansion-cache-local-path:
runs-on: selfhosted-ubuntu-arm64
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
persist-credentials: false
- name: Create cache directory
run: mkdir -p ~/uv-cache
shell: bash
- name: Setup with cache
uses: ./
with:
cache-local-path: ~/uv-cache/cache-local-path
- run: uv sync
working-directory: __tests__/fixtures/uv-project

test-tilde-expansion-cache-dependency-glob:
runs-on: selfhosted-ubuntu-arm64
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
persist-credentials: false
- name: Create cache directory
run: mkdir -p ~/uv-cache
shell: bash
- name: Create cache dependency glob file
run: touch ~/uv-cache.glob
shell: bash
- name: Setup with cache
uses: ./
with:
enable-cache: true
cache-local-path: ~/uv-cache/cache-dependency-glob
cache-dependency-glob: "~/uv-cache.glob"
- run: uv sync
working-directory: __tests__/fixtures/uv-project

cleanup-tilde-expansion-tests:
needs:
- test-tilde-expansion-cache-local-path
- test-tilde-expansion-cache-dependency-glob
if: always()
runs-on: selfhosted-ubuntu-arm64
steps:
- name: Remove cache directory
run: rm -rf ~/uv-cache
shell: bash
- name: Remove cache dependency glob file
run: rm -f ~/uv-cache.glob
shell: bash

test-no-python-version:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Fake pyproject.toml at root
@@ -835,7 +790,7 @@ jobs:
test-custom-manifest-file:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Install from custom manifest file
@@ -854,7 +809,7 @@ jobs:
test-absolute-path:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Create requirements.txt
@@ -874,7 +829,7 @@ jobs:
test-relative-path:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: mkdir
@@ -898,7 +853,7 @@ jobs:
test-cache-prune-force:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Setup uv
@@ -915,7 +870,7 @@ jobs:
test-cache-dir-from-file:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Verify uv cache dir is not populated
@@ -937,10 +892,33 @@ jobs:
exit 1
fi

test-cache-python-missing-managed-install-dir:
runs-on: ubuntu-latest
env:
UV_PYTHON_INSTALL_DIR: /tmp/missing-uv-python
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Setup uv with cache and python cache enabled
uses: ./
with:
enable-cache: true
cache-python: true
python-version: "3.12"
cache-local-path: /tmp/setup-uv-cache
cache-suffix: ${{ github.run_id }}-${{ github.run_attempt }}-test-cache-python-missing-managed-install-dir
- name: Ensure uv cache dir exists so only python-cache behavior is tested
run: uv sync
working-directory: __tests__/fixtures/uv-project
shell: bash
- name: Ensure managed Python install dir does not exist and this does not break caching
run: rm -rf "$UV_PYTHON_INSTALL_DIR"

test-cache-python-installs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Verify Python install dir is not populated
@@ -967,7 +945,7 @@ jobs:
runs-on: ubuntu-latest
needs: test-cache-python-installs
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Verify Python install dir does not exist
@@ -1007,11 +985,9 @@ jobs:
expected-python-dir: "/home/runner/work/_temp/uv-python-dir"
- os: windows-latest
expected-python-dir: "D:\\a\\_temp\\uv-python-dir"
- os: selfhosted-ubuntu-arm64
expected-python-dir: "/home/ubuntu/.local/share/uv/python"
runs-on: ${{ matrix.inputs.os }}
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Install latest version
@@ -1030,7 +1006,7 @@ jobs:
test-act:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Install act
@@ -1045,11 +1021,11 @@ jobs:
validate-typings:
runs-on: "ubuntu-latest"
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: false
- name: Validate typings
uses: typesafegithub/github-actions-typing@184d97003b1300f6a10e286eb98c191e416ff02b # v2.2.1
uses: typesafegithub/github-actions-typing@9ddf35b71a482be7d8922b28e8d00df16b77e315 # v2.2.2

all-tests-passed:
runs-on: ubuntu-latest
@@ -1066,9 +1042,10 @@ jobs:
- test-with-explicit-token
- test-uvx
- test-tool-install
- test-tilde-expansion-tool-dirs
- test-python-version
- test-activate-environment
- test-activate-environment-custom-path
- test-debian-unstable
- test-musl
- test-cache-key-os-version
- test-cache-local
@@ -1084,17 +1061,13 @@ jobs:
- test-restore-cache-save-cache-false
- test-setup-cache-restore-cache-false
- test-restore-cache-restore-cache-false
- test-setup-cache-local
- test-restore-cache-local
- test-tilde-expansion-cache-local-path
- test-tilde-expansion-cache-dependency-glob
- cleanup-tilde-expansion-tests
- test-no-python-version
- test-custom-manifest-file
- test-absolute-path
- test-relative-path
- test-cache-prune-force
- test-cache-dir-from-file
- test-cache-python-missing-managed-install-dir
- test-cache-python-installs
- test-restore-python-installs
- test-python-install-dir

@@ -1,4 +1,4 @@
name: "Update known versions"
name: "Update known checksums"
on:
workflow_dispatch:
schedule:
@@ -15,19 +15,18 @@ jobs:
contents: write
pull-requests: write
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: true
- uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
- uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
with:
node-version: "20"
- name: Update known versions
id: update-known-versions
node-version-file: .nvmrc
cache: npm
- name: Update known checksums
id: update-known-checksums
run:
node dist/update-known-versions/index.js
node dist/update-known-checksums/index.cjs
src/download/checksum/known-checksums.ts
version-manifest.json
${{ secrets.GITHUB_TOKEN }}
- name: Check for changes
id: changes-exist
run: |
@@ -48,23 +47,23 @@ jobs:
git config user.name "$GITHUB_ACTOR"
git config user.email "$GITHUB_ACTOR@users.noreply.github.com"
git add .
git commit -m "chore: update known versions for $LATEST_VERSION"
git commit -m "chore: update known checksums for $LATEST_VERSION"
git push origin HEAD:refs/heads/main
env:
LATEST_VERSION: ${{ steps.update-known-versions.outputs.latest-version }}
LATEST_VERSION: ${{ steps.update-known-checksums.outputs.latest-version }}

- name: Create Pull Request
if: ${{ steps.changes-exist.outputs.changes-exist == 'true' && steps.commit-and-push.outcome != 'success' }}
uses: peter-evans/create-pull-request@98357b18bf14b5342f975ff684046ec3b2a07725 # v8.0.0
uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8.1.0
with:
commit-message: "chore: update known checksums"
title:
"chore: update known checksums for ${{
steps.update-known-versions.outputs.latest-version }}"
steps.update-known-checksums.outputs.latest-version }}"
body:
"chore: update known checksums for ${{
steps.update-known-versions.outputs.latest-version }}"
steps.update-known-checksums.outputs.latest-version }}"
base: main
labels: "automated-pr,update-known-versions"
branch: update-known-versions-pr
labels: "automated-pr,update-known-checksums"
branch: update-known-checksums-pr
delete-branch: true
@@ -17,7 +17,7 @@ jobs:
permissions:
contents: write
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
persist-credentials: true # needed for git push below
- name: Update Major Minor Tags

18
AGENTS.md
Normal file
@@ -0,0 +1,18 @@
# setup-uv agent notes

This repository is a TypeScript-based GitHub Action for installing `uv` in GitHub Actions workflows. It also supports restoring/saving the `uv` cache and optional managed-Python caching.

- The published action runs the committed bundles in `dist/`, not the TypeScript in `src/`. After any code change, run `npm run package` and commit the resulting `dist/` updates.
- Standard local validation is:
  1. `npm ci --ignore-scripts`
  2. `npm run all`
- `npm run check` uses Biome (not ESLint/Prettier) and rewrites files in place.
- User-facing changes are usually multi-file changes. If you add or change inputs, outputs, or behavior, update `action.yml`, the implementation in `src/`, tests in `__tests__/`, relevant docs/README, and then re-package.
- The easiest areas to regress are version resolution and caching. When touching them, add or update tests for precedence, cache invalidation, and cross-platform path behavior.
- Workflow edits have extra CI-only checks (`actionlint` and `zizmor`); `npm run all` does not cover them.
- Source is authored with bundler-friendly TypeScript, but published action artifacts in `dist/` are bundled as CommonJS for maximum GitHub Actions runtime compatibility with `@actions/*` dependencies.
- Keep these concerns separate when changing module formats:
  - `src/` and tests may use modern ESM-friendly TypeScript patterns.
  - `dist/` should prioritize runtime reliability over format purity.
  - Do not switch published bundles to ESM without validating the actual committed artifacts under the target Node runtime.
- Before finishing, make sure validation does not leave generated or formatting-only diffs behind.
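The validation flow described in these notes can be mirrored in CI. A minimal sketch, assuming a hypothetical job name and standard action tags; the commands themselves are the ones listed above:

```yaml
# Hypothetical CI job mirroring the local validation steps above.
jobs:
  validate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v6
      - uses: actions/setup-node@v6
        with:
          node-version-file: .nvmrc
          cache: npm
      - run: npm ci --ignore-scripts
      - run: npm run all
      # The published action runs dist/, so re-package and fail on drift.
      - run: npm run package
      - run: git diff --exit-code dist/
```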
23
README.md
@@ -59,13 +59,16 @@ Have a look under [Advanced Configuration](#advanced-configuration) for detailed
# Use uv venv to activate a venv ready to be used by later steps
activate-environment: "false"

# Custom path for the virtual environment when using activate-environment (default: .venv in the working directory)
venv-path: ""

# The directory to execute all commands in and look for files such as pyproject.toml
working-directory: ""

# The checksum of the uv version to install
checksum: ""

# Used to increase the rate limit when retrieving versions and downloading uv
# Used when downloading uv from GitHub releases
github-token: ${{ github.token }}

# Enable uploading of the uv cache: true, false, or auto (enabled on GitHub-hosted runners, disabled on self-hosted runners)
@@ -111,7 +114,7 @@ Have a look under [Advanced Configuration](#advanced-configuration) for detailed
# Custom path to set UV_TOOL_BIN_DIR to
tool-bin-dir: ""

# URL to the manifest file containing available versions and download URLs
# URL to a custom manifest file (NDJSON preferred, legacy JSON array is deprecated)
manifest-file: ""

# Add problem matchers
@@ -167,7 +170,7 @@ You can set the working directory with the `working-directory` input.
This controls where we look for `pyproject.toml`, `uv.toml` and `.python-version` files
which are used to determine the version of uv and python to install.

It also controls where [the venv gets created](#activate-environment).
It also controls where [the venv gets created](#activate-environment), unless `venv-path` is set.

```yaml
- name: Install uv based on the config files in the working-directory
@@ -187,10 +190,12 @@ For more advanced configuration options, see our detailed documentation:

## How it works

This action downloads uv from the uv repo's official
[GitHub Releases](https://github.com/astral-sh/uv) and uses the
[GitHub Actions Toolkit](https://github.com/actions/toolkit) to cache it as a tool to speed up
consecutive runs on self-hosted runners.
By default, this action resolves uv versions from
[`astral-sh/versions`](https://github.com/astral-sh/versions) (NDJSON) and downloads uv from the
official [GitHub Releases](https://github.com/astral-sh/uv).

It then uses the [GitHub Actions Toolkit](https://github.com/actions/toolkit) to cache uv as a
tool to speed up consecutive runs on self-hosted runners.

The installed version of uv is then added to the runner PATH, enabling later steps to invoke it
by name (`uv`).
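A minimal usage sketch of that resolution flow; the major tag and the version value here are illustrative, not pinned recommendations:

```yaml
# Illustrative usage; the action tag and version are assumptions.
- name: Install uv
  uses: astral-sh/setup-uv@v7
  with:
    version: "0.9.26" # resolved against astral-sh/versions (NDJSON)
- run: uv --version # uv is on PATH for all later steps
```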
@@ -276,7 +281,7 @@ the cache will not be found and the warning `No GitHub Actions cache found for k
While this might be irritating at first, it is expected behaviour and the cache will be created
and reused in later workflows.

The reason for the warning is, that we have no way to know if this is the first run of a new
The reason for the warning is that we have no way to know if this is the first run of a new
cache key or the user accidentally misconfigured the cache-dependency-glob
or cache-suffix (see [Caching documentation](docs/caching.md)) and the cache never gets used.

@@ -289,7 +294,7 @@ Running `actions/checkout` after `setup-uv` **is not supported**.

### Does `setup-uv` also install my project or its dependencies automatically?

No, `setup-uv` alone wont install any libraries from your `pyproject.toml` or `requirements.txt`, it only sets up `uv`.
No, `setup-uv` alone won't install any libraries from your `pyproject.toml` or `requirements.txt`, it only sets up `uv`.
You should run `uv sync` or `uv pip install .` separately, or use `uv run ...` to ensure necessary dependencies are installed.
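A sketch of that pattern (the action tag is illustrative):

```yaml
- uses: astral-sh/setup-uv@v7 # tag illustrative
# setup-uv only installs uv; install the project explicitly afterwards:
- run: uv sync
# or: uv pip install .  /  uv run pytest
```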

### Why is a changed cache not detected and not the full cache uploaded?

@@ -4,10 +4,11 @@ import {
validateChecksum,
} from "../../../src/download/checksum/checksum";

const validChecksum =
  "f3da96ec7e995debee7f5d52ecd034dfb7074309a1da42f76429ecb814d813a3";
const filePath = "__tests__/fixtures/checksumfile";

test("checksum should match", async () => {
  const validChecksum =
    "f3da96ec7e995debee7f5d52ecd034dfb7074309a1da42f76429ecb814d813a3";
  const filePath = "__tests__/fixtures/checksumfile";
  // string params don't matter only test the checksum mechanism, not known checksums
  await validateChecksum(
    validChecksum,
@@ -18,6 +19,16 @@ test("checksum should match", async () => {
  );
});

test("provided checksum beats known checksums", async () => {
  await validateChecksum(
    validChecksum,
    filePath,
    "x86_64",
    "unknown-linux-gnu",
    "0.3.0",
  );
});

type KnownVersionFixture = { version: string; known: boolean };

it.each<KnownVersionFixture>([

271
__tests__/download/download-version.test.ts
Normal file
@@ -0,0 +1,271 @@
import { beforeEach, describe, expect, it, jest } from "@jest/globals";
import * as semver from "semver";

const mockInfo = jest.fn();
const mockWarning = jest.fn();

jest.unstable_mockModule("@actions/core", () => ({
  debug: jest.fn(),
  info: mockInfo,
  warning: mockWarning,
}));

// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockDownloadTool = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockExtractTar = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockExtractZip = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockCacheDir = jest.fn<any>();

jest.unstable_mockModule("@actions/tool-cache", () => ({
  cacheDir: mockCacheDir,
  downloadTool: mockDownloadTool,
  evaluateVersions: (versions: string[], range: string) =>
    semver.maxSatisfying(versions, range) ?? "",
  extractTar: mockExtractTar,
  extractZip: mockExtractZip,
  find: () => "",
  findAllVersions: () => [],
  isExplicitVersion: (version: string) => semver.valid(version) !== null,
}));

// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetLatestVersionFromNdjson = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetAllVersionsFromNdjson = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetArtifactFromNdjson = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetHighestSatisfyingVersionFromNdjson = jest.fn<any>();

jest.unstable_mockModule("../../src/download/versions-client", () => ({
  getAllVersions: mockGetAllVersionsFromNdjson,
  getArtifact: mockGetArtifactFromNdjson,
  getHighestSatisfyingVersion: mockGetHighestSatisfyingVersionFromNdjson,
  getLatestVersion: mockGetLatestVersionFromNdjson,
}));

// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetAllManifestVersions = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetLatestVersionInManifest = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetManifestArtifact = jest.fn<any>();

jest.unstable_mockModule("../../src/download/version-manifest", () => ({
  getAllVersions: mockGetAllManifestVersions,
  getLatestKnownVersion: mockGetLatestVersionInManifest,
  getManifestArtifact: mockGetManifestArtifact,
}));

// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockValidateChecksum = jest.fn<any>();

jest.unstable_mockModule("../../src/download/checksum/checksum", () => ({
  validateChecksum: mockValidateChecksum,
}));

const {
  downloadVersionFromManifest,
  downloadVersionFromNdjson,
  resolveVersion,
} = await import("../../src/download/download-version");

describe("download-version", () => {
  beforeEach(() => {
    mockInfo.mockReset();
    mockWarning.mockReset();
    mockDownloadTool.mockReset();
    mockExtractTar.mockReset();
    mockExtractZip.mockReset();
    mockCacheDir.mockReset();
    mockGetLatestVersionFromNdjson.mockReset();
    mockGetAllVersionsFromNdjson.mockReset();
    mockGetArtifactFromNdjson.mockReset();
    mockGetHighestSatisfyingVersionFromNdjson.mockReset();
    mockGetAllManifestVersions.mockReset();
    mockGetLatestVersionInManifest.mockReset();
    mockGetManifestArtifact.mockReset();
    mockValidateChecksum.mockReset();

    mockDownloadTool.mockResolvedValue("/tmp/downloaded");
    mockExtractTar.mockResolvedValue("/tmp/extracted");
    mockExtractZip.mockResolvedValue("/tmp/extracted");
    mockCacheDir.mockResolvedValue("/tmp/cached");
  });

  describe("resolveVersion", () => {
    it("uses astral-sh/versions to resolve latest", async () => {
      mockGetLatestVersionFromNdjson.mockResolvedValue("0.9.26");

      const version = await resolveVersion("latest", undefined);

      expect(version).toBe("0.9.26");
      expect(mockGetLatestVersionFromNdjson).toHaveBeenCalledTimes(1);
    });

    it("streams astral-sh/versions to resolve the highest matching version", async () => {
      mockGetHighestSatisfyingVersionFromNdjson.mockResolvedValue("0.9.26");

      const version = await resolveVersion("^0.9.0", undefined);

      expect(version).toBe("0.9.26");
      expect(mockGetHighestSatisfyingVersionFromNdjson).toHaveBeenCalledWith(
        "^0.9.0",
      );
      expect(mockGetAllVersionsFromNdjson).not.toHaveBeenCalled();
    });

    it("still loads all versions when resolving the lowest matching version", async () => {
      mockGetAllVersionsFromNdjson.mockResolvedValue(["0.9.26", "0.9.25"]);

      const version = await resolveVersion("^0.9.0", undefined, "lowest");

      expect(version).toBe("0.9.25");
      expect(mockGetAllVersionsFromNdjson).toHaveBeenCalledTimes(1);
      expect(mockGetHighestSatisfyingVersionFromNdjson).not.toHaveBeenCalled();
    });

    it("does not fall back when astral-sh/versions fails", async () => {
      mockGetLatestVersionFromNdjson.mockRejectedValue(
        new Error("NDJSON unavailable"),
      );

      await expect(resolveVersion("latest", undefined)).rejects.toThrow(
        "NDJSON unavailable",
      );
    });

    it("uses manifest-file when provided", async () => {
      mockGetAllManifestVersions.mockResolvedValue(["0.9.26", "0.9.25"]);

      const version = await resolveVersion(
        "^0.9.0",
        "https://example.com/custom.ndjson",
      );

      expect(version).toBe("0.9.26");
      expect(mockGetAllManifestVersions).toHaveBeenCalledWith(
        "https://example.com/custom.ndjson",
      );
    });
  });

  describe("downloadVersionFromNdjson", () => {
    it("fails when NDJSON metadata lookup fails", async () => {
      mockGetArtifactFromNdjson.mockRejectedValue(
        new Error("NDJSON unavailable"),
      );

      await expect(
        downloadVersionFromNdjson(
          "unknown-linux-gnu",
          "x86_64",
          "0.9.26",
          undefined,
          "token",
        ),
      ).rejects.toThrow("NDJSON unavailable");

      expect(mockDownloadTool).not.toHaveBeenCalled();
      expect(mockValidateChecksum).not.toHaveBeenCalled();
    });

    it("fails when no matching artifact exists in NDJSON metadata", async () => {
      mockGetArtifactFromNdjson.mockResolvedValue(undefined);

      await expect(
        downloadVersionFromNdjson(
          "unknown-linux-gnu",
          "x86_64",
          "0.9.26",
          undefined,
          "token",
        ),
      ).rejects.toThrow(
        "Could not find artifact for version 0.9.26, arch x86_64, platform unknown-linux-gnu in https://raw.githubusercontent.com/astral-sh/versions/main/v1/uv.ndjson .",
      );

      expect(mockDownloadTool).not.toHaveBeenCalled();
      expect(mockValidateChecksum).not.toHaveBeenCalled();
    });

    it("uses built-in checksums for default NDJSON downloads", async () => {
      mockGetArtifactFromNdjson.mockResolvedValue({
        archiveFormat: "tar.gz",
        sha256: "ndjson-checksum-that-should-be-ignored",
        url: "https://example.com/uv.tar.gz",
      });

      await downloadVersionFromNdjson(
        "unknown-linux-gnu",
        "x86_64",
        "0.9.26",
        undefined,
        "token",
      );

      expect(mockValidateChecksum).toHaveBeenCalledWith(
        undefined,
        "/tmp/downloaded",
        "x86_64",
        "unknown-linux-gnu",
        "0.9.26",
      );
    });
  });

  describe("downloadVersionFromManifest", () => {
    it("uses manifest-file checksum metadata when checksum input is unset", async () => {
      mockGetManifestArtifact.mockResolvedValue({
        archiveFormat: "tar.gz",
        checksum: "manifest-checksum",
        downloadUrl: "https://example.com/custom-uv.tar.gz",
      });

      await downloadVersionFromManifest(
        "https://example.com/custom.ndjson",
        "unknown-linux-gnu",
        "x86_64",
        "0.9.26",
        "",
        "token",
      );

      expect(mockValidateChecksum).toHaveBeenCalledWith(
        "manifest-checksum",
        "/tmp/downloaded",
        "x86_64",
        "unknown-linux-gnu",
        "0.9.26",
      );
    });

    it("prefers checksum input over manifest-file checksum metadata", async () => {
      mockGetManifestArtifact.mockResolvedValue({
        archiveFormat: "tar.gz",
        checksum: "manifest-checksum",
        downloadUrl: "https://example.com/custom-uv.tar.gz",
      });

      await downloadVersionFromManifest(
        "https://example.com/custom.ndjson",
        "unknown-linux-gnu",
        "x86_64",
        "0.9.26",
        "user-checksum",
        "token",
      );

      expect(mockValidateChecksum).toHaveBeenCalledWith(
        "user-checksum",
        "/tmp/downloaded",
        "x86_64",
        "unknown-linux-gnu",
        "0.9.26",
      );
    });
  });
});
136
__tests__/download/version-manifest.test.ts
Normal file
136
__tests__/download/version-manifest.test.ts
Normal file
@@ -0,0 +1,136 @@
import { beforeEach, describe, expect, it, jest } from "@jest/globals";

const mockWarning = jest.fn();

jest.unstable_mockModule("@actions/core", () => ({
  debug: jest.fn(),
  info: jest.fn(),
  warning: mockWarning,
}));

// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockFetch = jest.fn<any>();
jest.unstable_mockModule("../../src/utils/fetch", () => ({
  fetch: mockFetch,
}));

const {
  clearManifestCache,
  getAllVersions,
  getLatestKnownVersion,
  getManifestArtifact,
} = await import("../../src/download/version-manifest");

const legacyManifestResponse = JSON.stringify([
  {
    arch: "x86_64",
    artifactName: "uv-x86_64-unknown-linux-gnu.tar.gz",
    downloadUrl:
      "https://example.com/releases/download/0.7.12-alpha.1/uv-x86_64-unknown-linux-gnu.tar.gz",
    platform: "unknown-linux-gnu",
    version: "0.7.12-alpha.1",
  },
  {
    arch: "x86_64",
    artifactName: "uv-x86_64-unknown-linux-gnu.tar.gz",
    downloadUrl:
      "https://example.com/releases/download/0.7.13/uv-x86_64-unknown-linux-gnu.tar.gz",
    platform: "unknown-linux-gnu",
    version: "0.7.13",
  },
]);

const ndjsonManifestResponse = `{"version":"0.10.0","artifacts":[{"platform":"x86_64-unknown-linux-gnu","variant":"default","url":"https://example.com/releases/download/0.10.0/uv-x86_64-unknown-linux-gnu.tar.gz","archive_format":"tar.gz","sha256":"checksum-100"}]}
{"version":"0.9.30","artifacts":[{"platform":"x86_64-unknown-linux-gnu","variant":"default","url":"https://example.com/releases/download/0.9.30/uv-x86_64-unknown-linux-gnu.tar.gz","archive_format":"tar.gz","sha256":"checksum-0930"}]}`;

const multiVariantManifestResponse = `{"version":"0.10.0","artifacts":[{"platform":"x86_64-unknown-linux-gnu","variant":"managed-python","url":"https://example.com/releases/download/0.10.0/uv-x86_64-unknown-linux-gnu-managed-python.tar.gz","archive_format":"tar.gz","sha256":"checksum-managed"},{"platform":"x86_64-unknown-linux-gnu","variant":"default","url":"https://example.com/releases/download/0.10.0/uv-x86_64-unknown-linux-gnu-default.zip","archive_format":"zip","sha256":"checksum-default"}]}`;

function createMockResponse(
  ok: boolean,
  status: number,
  statusText: string,
  data: string,
) {
  return {
    ok,
    status,
    statusText,
    text: async () => data,
  };
}

describe("version-manifest", () => {
  beforeEach(() => {
    clearManifestCache();
    mockFetch.mockReset();
    mockWarning.mockReset();
  });

  it("supports the legacy JSON manifest format", async () => {
    mockFetch.mockResolvedValue(
      createMockResponse(true, 200, "OK", legacyManifestResponse),
    );

    const latest = await getLatestKnownVersion(
      "https://example.com/legacy.json",
    );
    const artifact = await getManifestArtifact(
      "https://example.com/legacy.json",
      "0.7.13",
      "x86_64",
      "unknown-linux-gnu",
    );

    expect(latest).toBe("0.7.13");
    expect(artifact).toEqual({
      archiveFormat: undefined,
      checksum: undefined,
      downloadUrl:
        "https://example.com/releases/download/0.7.13/uv-x86_64-unknown-linux-gnu.tar.gz",
    });
    expect(mockWarning).toHaveBeenCalledTimes(1);
  });

  it("supports NDJSON manifests", async () => {
    mockFetch.mockResolvedValue(
      createMockResponse(true, 200, "OK", ndjsonManifestResponse),
    );

    const versions = await getAllVersions("https://example.com/custom.ndjson");
    const artifact = await getManifestArtifact(
      "https://example.com/custom.ndjson",
      "0.10.0",
      "x86_64",
      "unknown-linux-gnu",
    );

    expect(versions).toEqual(["0.10.0", "0.9.30"]);
    expect(artifact).toEqual({
      archiveFormat: "tar.gz",
      checksum: "checksum-100",
      downloadUrl:
        "https://example.com/releases/download/0.10.0/uv-x86_64-unknown-linux-gnu.tar.gz",
    });
    expect(mockWarning).not.toHaveBeenCalled();
  });

  it("prefers the default variant when a manifest contains multiple variants", async () => {
    mockFetch.mockResolvedValue(
      createMockResponse(true, 200, "OK", multiVariantManifestResponse),
    );

    const artifact = await getManifestArtifact(
      "https://example.com/multi-variant.ndjson",
      "0.10.0",
      "x86_64",
      "unknown-linux-gnu",
    );

    expect(artifact).toEqual({
      archiveFormat: "zip",
      checksum: "checksum-default",
      downloadUrl:
        "https://example.com/releases/download/0.10.0/uv-x86_64-unknown-linux-gnu-default.zip",
    });
  });
});
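The tests above exercise both manifest formats through the same entry points. As an illustrative sketch only (the action's real detection logic is not shown in this diff, so the function name and heuristic here are assumptions): a legacy manifest is a single JSON array, so its body starts with `[`, while an NDJSON manifest starts with a `{` record per line.

```typescript
// Hypothetical format detection; the leading-"[" heuristic is an assumption,
// not necessarily what src/download/version-manifest.ts does internally.
function detectManifestFormat(body: string): "legacy-json" | "ndjson" {
  // A legacy manifest is one JSON array; NDJSON is one object per line.
  return body.trimStart().startsWith("[") ? "legacy-json" : "ndjson";
}

const legacy = detectManifestFormat('[{"version":"0.7.13"}]');
const ndjson = detectManifestFormat('{"version":"0.10.0","artifacts":[]}');
```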
241
__tests__/download/versions-client.test.ts
Normal file
@@ -0,0 +1,241 @@
import { beforeEach, describe, expect, it, jest } from "@jest/globals";

// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockFetch = jest.fn<any>();

jest.unstable_mockModule("../../src/utils/fetch", () => ({
  fetch: mockFetch,
}));

const {
  clearCache,
  fetchVersionData,
  getAllVersions,
  getArtifact,
  getHighestSatisfyingVersion,
  getLatestVersion,
  parseVersionData,
} = await import("../../src/download/versions-client");

const sampleNdjsonResponse = `{"version":"0.9.26","artifacts":[{"platform":"aarch64-apple-darwin","variant":"default","url":"https://github.com/astral-sh/uv/releases/download/0.9.26/uv-aarch64-apple-darwin.tar.gz","archive_format":"tar.gz","sha256":"fcf0a9ea6599c6ae28a4c854ac6da76f2c889354d7c36ce136ef071f7ab9721f"},{"platform":"x86_64-pc-windows-msvc","variant":"default","url":"https://github.com/astral-sh/uv/releases/download/0.9.26/uv-x86_64-pc-windows-msvc.zip","archive_format":"zip","sha256":"eb02fd95d8e0eed462b4a67ecdd320d865b38c560bffcda9a0b87ec944bdf036"}]}
{"version":"0.9.25","artifacts":[{"platform":"aarch64-apple-darwin","variant":"default","url":"https://github.com/astral-sh/uv/releases/download/0.9.25/uv-aarch64-apple-darwin.tar.gz","archive_format":"tar.gz","sha256":"606b3c6949d971709f2526fa0d9f0fd23ccf60e09f117999b406b424af18a6a6"}]}`;

const multiVariantNdjsonResponse = `{"version":"0.9.26","artifacts":[{"platform":"aarch64-apple-darwin","variant":"python-managed","url":"https://github.com/astral-sh/uv/releases/download/0.9.26/uv-aarch64-apple-darwin-managed.tar.gz","archive_format":"tar.gz","sha256":"managed-checksum"},{"platform":"aarch64-apple-darwin","variant":"default","url":"https://github.com/astral-sh/uv/releases/download/0.9.26/uv-aarch64-apple-darwin.zip","archive_format":"zip","sha256":"default-checksum"}]}`;

function createMockStream(chunks: string[]): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();

  return new ReadableStream<Uint8Array>({
    start(controller) {
      for (const chunk of chunks) {
        controller.enqueue(encoder.encode(chunk));
      }
      controller.close();
    },
  });
}

function createMockResponse(
  ok: boolean,
  status: number,
  statusText: string,
  data: string,
  chunks: string[] = [data],
) {
  return {
    body: createMockStream(chunks),
    ok,
    status,
    statusText,
    text: async () => data,
  };
}

describe("versions-client", () => {
  beforeEach(() => {
    clearCache();
    mockFetch.mockReset();
  });

  describe("fetchVersionData", () => {
    it("should fetch and parse NDJSON data", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(true, 200, "OK", sampleNdjsonResponse),
      );

      const versions = await fetchVersionData();

      expect(versions).toHaveLength(2);
      expect(versions[0].version).toBe("0.9.26");
      expect(versions[1].version).toBe("0.9.25");
    });

    it("should throw error on failed fetch", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(false, 500, "Internal Server Error", ""),
      );

      await expect(fetchVersionData()).rejects.toThrow(
        "Failed to fetch version data: 500 Internal Server Error",
      );
    });

    it("should cache results", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(true, 200, "OK", sampleNdjsonResponse),
      );

      await fetchVersionData();
      await fetchVersionData();

      expect(mockFetch).toHaveBeenCalledTimes(1);
    });
  });

  describe("getLatestVersion", () => {
    it("should return the first version (newest)", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(true, 200, "OK", sampleNdjsonResponse),
      );

      const latest = await getLatestVersion();

      expect(latest).toBe("0.9.26");
    });

    it("should stop after the first record when resolving latest", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(
          true,
          200,
          "OK",
          `${sampleNdjsonResponse}\n{"version":`,
          [`${sampleNdjsonResponse.split("\n")[0]}\n`, '{"version":'],
        ),
      );

      const latest = await getLatestVersion();

      expect(latest).toBe("0.9.26");
    });
  });

  describe("getAllVersions", () => {
    it("should return all version strings", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(true, 200, "OK", sampleNdjsonResponse),
      );

      const versions = await getAllVersions();

      expect(versions).toEqual(["0.9.26", "0.9.25"]);
    });
  });

  describe("getHighestSatisfyingVersion", () => {
    it("should return the first matching version from the stream", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(
          true,
          200,
          "OK",
          `${sampleNdjsonResponse}\n{"version":`,
          [`${sampleNdjsonResponse.split("\n")[0]}\n`, '{"version":'],
        ),
      );

      const version = await getHighestSatisfyingVersion("^0.9.0");

      expect(version).toBe("0.9.26");
    });
  });

  describe("getArtifact", () => {
    beforeEach(() => {
      mockFetch.mockResolvedValue(
        createMockResponse(true, 200, "OK", sampleNdjsonResponse),
      );
    });

    it("should find artifact by version and platform", async () => {
      const artifact = await getArtifact("0.9.26", "aarch64", "apple-darwin");

      expect(artifact).toEqual({
        archiveFormat: "tar.gz",
        sha256:
          "fcf0a9ea6599c6ae28a4c854ac6da76f2c889354d7c36ce136ef071f7ab9721f",
        url: "https://github.com/astral-sh/uv/releases/download/0.9.26/uv-aarch64-apple-darwin.tar.gz",
      });
    });

    it("should stop once the requested version is found", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(
          true,
          200,
          "OK",
          `${sampleNdjsonResponse}\n{"version":`,
          [`${sampleNdjsonResponse.split("\n")[0]}\n`, '{"version":'],
        ),
      );

      const artifact = await getArtifact("0.9.26", "aarch64", "apple-darwin");

      expect(artifact).toEqual({
        archiveFormat: "tar.gz",
        sha256:
          "fcf0a9ea6599c6ae28a4c854ac6da76f2c889354d7c36ce136ef071f7ab9721f",
        url: "https://github.com/astral-sh/uv/releases/download/0.9.26/uv-aarch64-apple-darwin.tar.gz",
      });
    });

    it("should find windows artifact", async () => {
      const artifact = await getArtifact("0.9.26", "x86_64", "pc-windows-msvc");

      expect(artifact).toEqual({
        archiveFormat: "zip",
        sha256:
          "eb02fd95d8e0eed462b4a67ecdd320d865b38c560bffcda9a0b87ec944bdf036",
        url: "https://github.com/astral-sh/uv/releases/download/0.9.26/uv-x86_64-pc-windows-msvc.zip",
      });
    });

    it("should prefer the default variant when multiple artifacts share a platform", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(true, 200, "OK", multiVariantNdjsonResponse),
      );

      const artifact = await getArtifact("0.9.26", "aarch64", "apple-darwin");

      expect(artifact).toEqual({
        archiveFormat: "zip",
        sha256: "default-checksum",
        url: "https://github.com/astral-sh/uv/releases/download/0.9.26/uv-aarch64-apple-darwin.zip",
      });
    });

    it("should return undefined for unknown version", async () => {
      const artifact = await getArtifact("0.0.1", "aarch64", "apple-darwin");

      expect(artifact).toBeUndefined();
    });

    it("should return undefined for unknown platform", async () => {
      const artifact = await getArtifact(
        "0.9.26",
        "aarch64",
        "unknown-linux-musl",
      );

      expect(artifact).toBeUndefined();
    });
  });

  describe("parseVersionData", () => {
    it("should throw for malformed NDJSON", () => {
      expect(() =>
        parseVersionData('{"version":"0.1.0"', "test-source"),
      ).toThrow("Failed to parse version data from test-source");
    });
  });
});
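The chunked-stream tests above deliberately end the response with a truncated record (`{"version":`) to show that the client stops reading once it has what it needs. A minimal sketch of that streaming behavior, under the assumption that records are newline-delimited (the function name here is illustrative, not the repo's actual helper):

```typescript
// Hypothetical sketch: decode chunks incrementally and return the version of
// the first complete NDJSON record, never parsing later (possibly truncated)
// lines. Uses only web-standard ReadableStream/TextDecoder (Node 18+).
async function firstRecordVersion(
  stream: ReadableStream<Uint8Array>,
): Promise<string | undefined> {
  const decoder = new TextDecoder();
  const reader = stream.getReader();
  let buffer = "";
  try {
    while (true) {
      const { done, value } = await reader.read();
      if (value) buffer += decoder.decode(value, { stream: true });
      const newline = buffer.indexOf("\n");
      if (newline !== -1) {
        // A full line is available; the rest of the stream is never parsed.
        return (JSON.parse(buffer.slice(0, newline)) as { version: string })
          .version;
      }
      if (done) {
        return buffer.trim()
          ? (JSON.parse(buffer) as { version: string }).version
          : undefined;
      }
    }
  } finally {
    reader.releaseLock();
  }
}
```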
@@ -1,13 +1,3 @@
-jest.mock("@actions/core", () => {
-  return {
-    debug: jest.fn(),
-    getBooleanInput: jest.fn(
-      (name: string) => (mockInputs[name] ?? "") === "true",
-    ),
-    getInput: jest.fn((name: string) => mockInputs[name] ?? ""),
-  };
-});
-
 import {
   afterEach,
   beforeEach,
@@ -21,9 +11,30 @@ import {
 let mockInputs: Record<string, string> = {};
 const ORIGINAL_HOME = process.env.HOME;
+
+const mockDebug = jest.fn();
+const mockGetBooleanInput = jest.fn(
+  (name: string) => (mockInputs[name] ?? "") === "true",
+);
+const mockGetInput = jest.fn((name: string) => mockInputs[name] ?? "");
+const mockInfo = jest.fn();
+const mockWarning = jest.fn();
+
+jest.unstable_mockModule("@actions/core", () => ({
+  debug: mockDebug,
+  getBooleanInput: mockGetBooleanInput,
+  getInput: mockGetInput,
+  info: mockInfo,
+  warning: mockWarning,
+}));
+
+async function importInputsModule() {
+  return await import("../../src/utils/inputs");
+}
 
 describe("cacheDependencyGlob", () => {
   beforeEach(() => {
     jest.resetModules();
     jest.clearAllMocks();
     mockInputs = {};
     process.env.HOME = "/home/testuser";
   });
@@ -34,21 +45,21 @@ describe("cacheDependencyGlob", () => {
 
   it("returns empty string when input not provided", async () => {
     mockInputs["working-directory"] = "/workspace";
-    const { cacheDependencyGlob } = await import("../../src/utils/inputs");
+    const { cacheDependencyGlob } = await importInputsModule();
     expect(cacheDependencyGlob).toBe("");
   });
 
   it("resolves a single relative path", async () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["cache-dependency-glob"] = "requirements.txt";
-    const { cacheDependencyGlob } = await import("../../src/utils/inputs");
+    const { cacheDependencyGlob } = await importInputsModule();
     expect(cacheDependencyGlob).toBe("/workspace/requirements.txt");
   });
 
   it("strips leading ./ from relative path", async () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["cache-dependency-glob"] = "./uv.lock";
-    const { cacheDependencyGlob } = await import("../../src/utils/inputs");
+    const { cacheDependencyGlob } = await importInputsModule();
     expect(cacheDependencyGlob).toBe("/workspace/uv.lock");
   });
 
@@ -56,7 +67,7 @@ describe("cacheDependencyGlob", () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["cache-dependency-glob"] =
       " ~/.cache/file1\n ./rel/file2 \nfile3.txt";
-    const { cacheDependencyGlob } = await import("../../src/utils/inputs");
+    const { cacheDependencyGlob } = await importInputsModule();
     expect(cacheDependencyGlob).toBe(
       [
         "/home/testuser/.cache/file1", // expanded tilde, absolute path unchanged
@@ -69,7 +80,7 @@ describe("cacheDependencyGlob", () => {
   it("keeps absolute path unchanged in multiline input", async () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["cache-dependency-glob"] = "/abs/path.lock\nrelative.lock";
-    const { cacheDependencyGlob } = await import("../../src/utils/inputs");
+    const { cacheDependencyGlob } = await importInputsModule();
     expect(cacheDependencyGlob).toBe(
       ["/abs/path.lock", "/workspace/relative.lock"].join("\n"),
     );
@@ -78,9 +89,122 @@ describe("cacheDependencyGlob", () => {
   it("handles exclusions in relative paths correct", async () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["cache-dependency-glob"] = "!/abs/path.lock\n!relative.lock";
-    const { cacheDependencyGlob } = await import("../../src/utils/inputs");
+    const { cacheDependencyGlob } = await importInputsModule();
     expect(cacheDependencyGlob).toBe(
       ["!/abs/path.lock", "!/workspace/relative.lock"].join("\n"),
     );
   });
 });
+
+describe("tool directories", () => {
+  beforeEach(() => {
+    jest.resetModules();
+    jest.clearAllMocks();
+    mockInputs = {};
+    process.env.HOME = "/home/testuser";
+  });
+
+  afterEach(() => {
+    process.env.HOME = ORIGINAL_HOME;
+  });
+
+  it("expands tilde for tool-bin-dir and tool-dir", async () => {
+    mockInputs["working-directory"] = "/workspace";
+    mockInputs["tool-bin-dir"] = "~/tool-bin-dir";
+    mockInputs["tool-dir"] = "~/tool-dir";
+
+    const { toolBinDir, toolDir } = await importInputsModule();
+
+    expect(toolBinDir).toBe("/home/testuser/tool-bin-dir");
+    expect(toolDir).toBe("/home/testuser/tool-dir");
+  });
+});
+
+describe("cacheLocalPath", () => {
+  beforeEach(() => {
+    jest.resetModules();
+    jest.clearAllMocks();
+    mockInputs = {};
+    process.env.HOME = "/home/testuser";
+  });
+
+  afterEach(() => {
+    process.env.HOME = ORIGINAL_HOME;
+  });
+
+  it("expands tilde in cache-local-path", async () => {
+    mockInputs["working-directory"] = "/workspace";
+    mockInputs["cache-local-path"] = "~/uv-cache/cache-local-path";
+
+    const { CacheLocalSource, cacheLocalPath } = await importInputsModule();
+
+    expect(cacheLocalPath).toEqual({
+      path: "/home/testuser/uv-cache/cache-local-path",
+      source: CacheLocalSource.Input,
+    });
+  });
+});
+
+describe("venvPath", () => {
+  beforeEach(() => {
+    jest.resetModules();
+    jest.clearAllMocks();
+    mockInputs = {};
+    process.env.HOME = "/home/testuser";
+  });
+
+  afterEach(() => {
+    process.env.HOME = ORIGINAL_HOME;
+  });
+
+  it("defaults to .venv in the working directory", async () => {
+    mockInputs["working-directory"] = "/workspace";
+    const { venvPath } = await importInputsModule();
+    expect(venvPath).toBe("/workspace/.venv");
+  });
+
+  it("resolves a relative venv-path", async () => {
+    mockInputs["working-directory"] = "/workspace";
+    mockInputs["activate-environment"] = "true";
+    mockInputs["venv-path"] = "custom-venv";
+    const { venvPath } = await importInputsModule();
+    expect(venvPath).toBe("/workspace/custom-venv");
+  });
+
+  it("normalizes venv-path with trailing slash", async () => {
+    mockInputs["working-directory"] = "/workspace";
+    mockInputs["activate-environment"] = "true";
+    mockInputs["venv-path"] = "custom-venv/";
+    const { venvPath } = await importInputsModule();
+    expect(venvPath).toBe("/workspace/custom-venv");
+  });
+
+  it("keeps an absolute venv-path unchanged", async () => {
+    mockInputs["working-directory"] = "/workspace";
+    mockInputs["activate-environment"] = "true";
+    mockInputs["venv-path"] = "/tmp/custom-venv";
+    const { venvPath } = await importInputsModule();
+    expect(venvPath).toBe("/tmp/custom-venv");
+  });
+
+  it("expands tilde in venv-path", async () => {
+    mockInputs["working-directory"] = "/workspace";
+    mockInputs["activate-environment"] = "true";
+    mockInputs["venv-path"] = "~/.venv";
+    const { venvPath } = await importInputsModule();
+    expect(venvPath).toBe("/home/testuser/.venv");
+  });
+
+  it("warns when venv-path is set but activate-environment is false", async () => {
+    mockInputs["working-directory"] = "/workspace";
+    mockInputs["venv-path"] = "custom-venv";
+
+    const { activateEnvironment, venvPath } = await importInputsModule();
+
+    expect(activateEnvironment).toBe(false);
+    expect(venvPath).toBe("/workspace/custom-venv");
+    expect(mockWarning).toHaveBeenCalledWith(
+      "venv-path is only used when activate-environment is true",
+    );
+  });
+});
@@ -1,113 +1,121 @@
-jest.mock("node:fs");
-jest.mock("@actions/core", () => ({
-  warning: jest.fn(),
-}));
+import { beforeEach, describe, expect, it, jest } from "@jest/globals";
+
+const mockReadFileSync = jest.fn();
+const mockWarning = jest.fn();
+
+jest.unstable_mockModule("node:fs", () => ({
+  default: {
+    readFileSync: mockReadFileSync,
+  },
+}));
 
-import fs from "node:fs";
-import * as core from "@actions/core";
-import { beforeEach, describe, expect, it, jest } from "@jest/globals";
-import { getUvVersionFromToolVersions } from "../../src/version/tool-versions-file";
+jest.unstable_mockModule("@actions/core", () => ({
+  warning: mockWarning,
+}));
 
-const mockedFs = fs as jest.Mocked<typeof fs>;
-const mockedCore = core as jest.Mocked<typeof core>;
+async function getVersionFromToolVersions(filePath: string) {
+  const { getUvVersionFromToolVersions } = await import(
+    "../../src/version/tool-versions-file"
+  );
+
+  return getUvVersionFromToolVersions(filePath);
+}
 
 describe("getUvVersionFromToolVersions", () => {
   beforeEach(() => {
     jest.resetModules();
     jest.clearAllMocks();
   });
 
-  it("should return undefined for non-.tool-versions files", () => {
-    const result = getUvVersionFromToolVersions("package.json");
+  it("should return undefined for non-.tool-versions files", async () => {
+    const result = await getVersionFromToolVersions("package.json");
     expect(result).toBeUndefined();
-    expect(mockedFs.readFileSync).not.toHaveBeenCalled();
+    expect(mockReadFileSync).not.toHaveBeenCalled();
   });
 
-  it("should return version for valid uv entry", () => {
+  it("should return version for valid uv entry", async () => {
     const fileContent = "python 3.11.0\nuv 0.1.0\nnodejs 18.0.0";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBe("0.1.0");
-    expect(mockedFs.readFileSync).toHaveBeenCalledWith(
-      ".tool-versions",
-      "utf8",
-    );
+    expect(mockReadFileSync).toHaveBeenCalledWith(".tool-versions", "utf8");
   });
 
-  it("should return version for uv entry with v prefix", () => {
+  it("should return version for uv entry with v prefix", async () => {
     const fileContent = "uv v0.2.0";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBe("0.2.0");
   });
 
-  it("should handle whitespace around uv entry", () => {
+  it("should handle whitespace around uv entry", async () => {
     const fileContent = " uv 0.3.0 ";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBe("0.3.0");
   });
 
-  it("should skip commented lines", () => {
+  it("should skip commented lines", async () => {
     const fileContent = "# uv 0.1.0\npython 3.11.0\nuv 0.2.0";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBe("0.2.0");
   });
 
-  it("should return first matching uv version", () => {
+  it("should return first matching uv version", async () => {
     const fileContent = "uv 0.1.0\npython 3.11.0\nuv 0.2.0";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBe("0.1.0");
   });
 
-  it("should return undefined when no uv entry found", () => {
+  it("should return undefined when no uv entry found", async () => {
     const fileContent = "python 3.11.0\nnodejs 18.0.0";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBeUndefined();
   });
 
-  it("should return undefined for empty file", () => {
-    mockedFs.readFileSync.mockReturnValue("");
+  it("should return undefined for empty file", async () => {
+    mockReadFileSync.mockReturnValue("");
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBeUndefined();
   });
 
-  it("should warn and return undefined for ref syntax", () => {
+  it("should warn and return undefined for ref syntax", async () => {
     const fileContent = "uv ref:main";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBeUndefined();
-    expect(mockedCore.warning).toHaveBeenCalledWith(
+    expect(mockWarning).toHaveBeenCalledWith(
       "The ref syntax of .tool-versions is not supported. Please use a released version instead.",
     );
   });
 
-  it("should handle file path with .tool-versions extension", () => {
+  it("should handle file path with .tool-versions extension", async () => {
     const fileContent = "uv 0.1.0";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions("path/to/.tool-versions");
+    const result = await getVersionFromToolVersions("path/to/.tool-versions");
 
     expect(result).toBe("0.1.0");
-    expect(mockedFs.readFileSync).toHaveBeenCalledWith(
+    expect(mockReadFileSync).toHaveBeenCalledWith(
       "path/to/.tool-versions",
       "utf8",
     );
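The `.tool-versions` parsing rules these tests pin down (comments skipped, optional `v` prefix stripped, first match wins, `ref:` rejected) can be sketched roughly as follows. This is a hedged re-implementation for illustration; the real helper lives in `src/version/tool-versions-file` and the function name here is made up.

```typescript
// Illustrative sketch of the parsing behavior exercised by the tests above;
// not the repo's actual implementation.
function uvVersionFromToolVersionsContent(content: string): string | undefined {
  for (const rawLine of content.split("\n")) {
    const line = rawLine.trim();
    if (line.startsWith("#")) continue; // commented lines are skipped
    const match = line.match(/^uv\s+(.+)$/);
    if (match === null) continue;
    const value = match[1].trim();
    if (value.startsWith("ref:")) {
      // the action warns and ignores the unsupported ref syntax
      return undefined;
    }
    // an optional leading "v" is stripped: "v0.2.0" -> "0.2.0"
    return value.startsWith("v") ? value.slice(1) : value;
  }
  return undefined;
}
```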
@@ -9,6 +9,8 @@ inputs:
     type: string
   activate-environment:
     type: boolean
+  venv-path:
+    type: string
   working-directory:
     type: string
   checksum:
11
action.yml
@@ -15,6 +15,9 @@ inputs:
   activate-environment:
     description: "Use uv venv to activate a venv ready to be used by later steps. "
     default: "false"
+  venv-path:
+    description: "Custom path for the virtual environment when using activate-environment. Defaults to '.venv' in the working directory."
+    default: ""
   working-directory:
     description: "The directory to execute all commands in and look for files such as pyproject.toml"
     default: ${{ github.workspace }}
@@ -23,7 +26,7 @@ inputs:
     required: false
   github-token:
     description:
-      "Used to increase the rate limit when retrieving versions and downloading uv."
+      "Used when downloading uv from GitHub releases."
     required: false
     default: ${{ github.token }}
   enable-cache:
@@ -72,7 +75,7 @@ inputs:
     description: "Custom path to set UV_TOOL_BIN_DIR to."
     required: false
   manifest-file:
-    description: "URL to the manifest file containing available versions and download URLs."
+    description: "URL to a custom manifest file. Supports the astral-sh/versions NDJSON format and the legacy JSON array format (deprecated)."
     required: false
   add-problem-matchers:
     description: "Add problem matchers."
@@ -99,8 +102,8 @@ outputs:
     description: "A boolean value to indicate the Python cache entry was found"
 runs:
   using: "node24"
-  main: "dist/setup/index.js"
-  post: "dist/save-cache/index.js"
+  main: "dist/setup/index.cjs"
+  post: "dist/save-cache/index.cjs"
   post-if: success()
 branding:
   icon: "package"
@@ -1,5 +1,5 @@
 {
-  "$schema": "https://biomejs.dev/schemas/2.3.7/schema.json",
+  "$schema": "https://biomejs.dev/schemas/2.4.7/schema.json",
   "assist": {
     "actions": {
       "source": {
63325
dist/save-cache/index.cjs
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
94291
dist/save-cache/index.js
generated
vendored
File diff suppressed because one or more lines are too long
97307
dist/setup/index.cjs
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
104378
dist/setup/index.js
generated
vendored
File diff suppressed because one or more lines are too long
50290
dist/update-known-checksums/index.cjs
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
38826
dist/update-known-versions/index.js
generated
vendored
File diff suppressed because one or more lines are too long
@@ -199,6 +199,10 @@ By default, the Python install dir (`uv python dir` / `UV_PYTHON_INSTALL_DIR`) i
 for the same reason that the dependency cache is pruned.
 If you want to cache Python installs along with your dependencies, set the `cache-python` input to `true`.
 
+Note that this only caches Python versions that uv actually installs into `UV_PYTHON_INSTALL_DIR`
+(i.e. managed Python installs). If uv uses a system Python, there may be nothing to cache.
+To force managed Python installs, set `UV_PYTHON_PREFERENCE=only-managed`.
+
 ```yaml
 - name: Cache Python installs
   uses: astral-sh/setup-uv@v7
@@ -18,12 +18,29 @@ are automatically verified by this action. The sha256 hashes can be found on the

 ## Manifest file

-The `manifest-file` input allows you to specify a JSON manifest that lists available uv versions,
-architectures, and their download URLs. By default, this action uses the manifest file contained
-in this repository, which is automatically updated with each release of uv.
+By default, setup-uv reads version metadata from
+[`astral-sh/versions`](https://github.com/astral-sh/versions) (NDJSON format).

-The manifest file contains an array of objects, each describing a version,
-architecture, platform, and the corresponding download URL. For example:
+The `manifest-file` input lets you override that source with your own URL, for example to test
+custom uv builds or alternate download locations.
+
+### Format
+
+The manifest file must be in NDJSON format, where each line is a JSON object representing a version and its artifacts. For example:
+
+```json
+{"version":"0.10.7","artifacts":[{"platform":"x86_64-unknown-linux-gnu","variant":"default","url":"https://example.com/uv-x86_64-unknown-linux-gnu.tar.gz","archive_format":"tar.gz","sha256":"..."}]}
+{"version":"0.10.6","artifacts":[{"platform":"x86_64-unknown-linux-gnu","variant":"default","url":"https://example.com/uv-x86_64-unknown-linux-gnu.tar.gz","archive_format":"tar.gz","sha256":"..."}]}
+```
+
+setup-uv currently only supports `default` as the `variant`.
+
+The `archive_format` field is currently ignored.
+
+### Legacy format: JSON array (deprecated)
+
+The previous JSON array format is still supported for compatibility, but deprecated and will be
+removed in a future major release.

 ```json
 [
@@ -33,26 +50,20 @@ architecture, platform, and the corresponding download URL. For example:
       "arch": "aarch64",
       "platform": "apple-darwin",
       "downloadUrl": "https://github.com/astral-sh/uv/releases/download/0.7.13/uv-aarch64-apple-darwin.tar.gz"
     },
     ...
   }
 ]
 ```

-You can supply a custom manifest file URL to define additional versions,
-architectures, or different download URLs.
-This is useful if you maintain your own uv builds or want to override the default sources.
-
 ```yaml
 - name: Use a custom manifest file
   uses: astral-sh/setup-uv@v7
   with:
-    manifest-file: "https://example.com/my-custom-manifest.json"
+    manifest-file: "https://example.com/my-custom-manifest.ndjson"
 ```

 > [!NOTE]
-> When you use a custom manifest file and do not set the `version` input, its default value is `latest`.
-> This means the action will install the latest version available in the custom manifest file.
-> This is different from the default behavior of installing the latest version from the official uv releases.
+> When you use a custom manifest file and do not set the `version` input, setup-uv installs the
+> latest version from that custom manifest.

 ## Add problem matchers
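The NDJSON manifest described above can be parsed without any streaming machinery by splitting on newlines; a minimal sketch in plain Node (the two inline records are illustrative sample data, not real manifest contents):

```javascript
// Minimal NDJSON manifest parser: one JSON object per non-empty line.
// The sample records below are illustrative, not real manifest data.
const ndjson = [
  '{"version":"0.10.7","artifacts":[{"platform":"x86_64-unknown-linux-gnu","variant":"default","url":"https://example.com/uv.tar.gz","archive_format":"tar.gz","sha256":"..."}]}',
  '{"version":"0.10.6","artifacts":[]}',
].join("\n");

function parseManifest(text) {
  return text
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line !== "")
    .map((line) => JSON.parse(line));
}

const versions = parseManifest(ndjson);
console.log(versions.length); // 2
console.log(versions[0].version); // "0.10.7"
```

Because each record is a complete JSON document on its own line, a consumer can stop reading as soon as it finds a matching version, which is the property the legacy JSON-array format lacked.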
@@ -15,6 +15,17 @@ This allows directly using it in later steps:
 - run: uv pip install pip
 ```

+By default, the venv is created at `.venv` inside the `working-directory`.
+
+You can customize the venv location with `venv-path`, for example to place it in the runner temp directory:
+
+```yaml
+- uses: astral-sh/setup-uv@v7
+  with:
+    activate-environment: true
+    venv-path: ${{ runner.temp }}/custom-venv
+```
+
 > [!WARNING]
 >
 > Activating the environment adds your dependencies to the `PATH`, which could break some workflows.
@@ -27,9 +38,12 @@ This allows directly using it in later steps:

 ## GitHub authentication token

-This action uses the GitHub API to fetch the uv release artifacts. To avoid hitting the GitHub API
-rate limit too quickly, an authentication token can be provided via the `github-token` input. By
-default, the `GITHUB_TOKEN` secret is used, which is automatically provided by GitHub Actions.
+By default, this action resolves available uv versions from
+[`astral-sh/versions`](https://github.com/astral-sh/versions), then downloads uv artifacts from
+GitHub Releases.
+
+You can provide a token via `github-token` to authenticate those downloads. By default, the
+`GITHUB_TOKEN` secret is used, which is automatically provided by GitHub Actions.

 If the default
 [permissions for the GitHub token](https://docs.github.com/en/actions/security-for-github-actions/security-guides/automatic-token-authentication#permissions-for-the-github_token)
jest.config.js (deleted, 9 lines)

@@ -1,9 +0,0 @@
module.exports = {
  clearMocks: true,
  moduleFileExtensions: ["js", "ts"],
  testMatch: ["**/*.test.ts"],
  transform: {
    "^.+\\.ts$": "ts-jest",
  },
  verbose: true,
};
jest.config.mjs (new file, 14 lines)

@@ -0,0 +1,14 @@
import { createDefaultEsmPreset } from "ts-jest";

const esmPreset = createDefaultEsmPreset({
  tsconfig: "./tsconfig.json",
});

export default {
  ...esmPreset,
  clearMocks: true,
  moduleFileExtensions: ["js", "mjs", "ts"],
  testEnvironment: "node",
  testMatch: ["**/*.test.ts"],
  verbose: true,
};
package-lock.json (generated, 4456 lines) — diff suppressed because it is too large
package.json (47 lines)

@@ -2,16 +2,19 @@
   "name": "setup-uv",
   "version": "1.0.0",
   "private": true,
+  "type": "module",
   "description": "Set up your GitHub Actions workflow with a specific version of uv",
-  "main": "dist/index.js",
+  "main": "dist/setup/index.cjs",
   "scripts": {
-    "build": "tsc",
+    "build": "tsc --noEmit",
     "check": "biome check --write",
-    "package": "ncc build -o dist/setup src/setup-uv.ts && ncc build -o dist/save-cache src/save-cache.ts && ncc build -o dist/update-known-versions src/update-known-versions.ts",
-    "test": "jest",
+    "package": "node scripts/build-dist.mjs",
+    "bench:versions": "node scripts/bench-versions-client.mjs",
+    "test:unit": "node --experimental-vm-modules ./node_modules/jest/bin/jest.js",
+    "test": "npm run build && npm run test:unit",
     "act": "act pull_request -W .github/workflows/test.yml --container-architecture linux/amd64 -s GITHUB_TOKEN=\"$(gh auth token)\"",
-    "update-known-versions": "RUNNER_TEMP=known_versions node dist/update-known-versions/index.js src/download/checksum/known-versions.ts \"$(gh auth token)\"",
-    "all": "npm run build && npm run check && npm run package && npm test"
+    "update-known-checksums": "RUNNER_TEMP=known_versions node dist/update-known-checksums/index.cjs src/download/checksum/known-checksums.ts",
+    "all": "npm run build && npm run check && npm run package && npm run test:unit"
   },
   "repository": {
     "type": "git",
@@ -26,28 +29,26 @@
   "author": "@eifinger",
   "license": "MIT",
   "dependencies": {
-    "@actions/cache": "^4.1.0",
-    "@actions/core": "^1.11.1",
-    "@actions/exec": "^1.1.1",
-    "@actions/glob": "^0.5.0",
-    "@actions/io": "^1.1.3",
-    "@actions/tool-cache": "^2.0.2",
-    "@octokit/core": "^7.0.6",
-    "@octokit/plugin-paginate-rest": "^14.0.0",
-    "@octokit/plugin-rest-endpoint-methods": "^17.0.0",
-    "@renovatebot/pep440": "^4.2.1",
-    "smol-toml": "^1.4.2",
-    "undici": "5.28.5"
+    "@actions/cache": "^6.0.0",
+    "@actions/core": "^3.0.0",
+    "@actions/exec": "^3.0.0",
+    "@actions/glob": "^0.6.1",
+    "@actions/io": "^3.0.2",
+    "@actions/tool-cache": "^4.0.0",
+    "@renovatebot/pep440": "^4.2.2",
+    "smol-toml": "^1.6.0",
+    "undici": "^7.24.2"
   },
   "devDependencies": {
-    "@biomejs/biome": "2.3.7",
+    "@biomejs/biome": "^2.4.7",
     "@types/js-yaml": "^4.0.9",
-    "@types/node": "^24.10.1",
+    "@types/node": "^25.5.0",
     "@types/semver": "^7.7.1",
-    "@vercel/ncc": "^0.38.4",
-    "jest": "^30.2.0",
-    "js-yaml": "^4.1.0",
-    "ts-jest": "^29.4.5",
+    "esbuild": "^0.27.4",
+    "jest": "^30.3.0",
+    "js-yaml": "^4.1.1",
+    "ts-jest": "^29.4.6",
     "typescript": "^5.9.3"
   }
 }
scripts/bench-versions-client.mjs (new file, 483 lines)

@@ -0,0 +1,483 @@
import { performance } from "node:perf_hooks";
import * as pep440 from "@renovatebot/pep440";
import * as semver from "semver";
import { ProxyAgent, fetch as undiciFetch } from "undici";

const DEFAULT_URL =
  "https://raw.githubusercontent.com/astral-sh/versions/main/v1/uv.ndjson";
const DEFAULT_ITERATIONS = 100;
const DEFAULT_ARCH = "aarch64";
const DEFAULT_PLATFORM = "apple-darwin";

function getProxyAgent() {
  const httpProxy = process.env.HTTP_PROXY || process.env.http_proxy;
  if (httpProxy) {
    return new ProxyAgent(httpProxy);
  }

  const httpsProxy = process.env.HTTPS_PROXY || process.env.https_proxy;
  if (httpsProxy) {
    return new ProxyAgent(httpsProxy);
  }

  return undefined;
}

async function fetch(url) {
  return await undiciFetch(url, {
    dispatcher: getProxyAgent(),
  });
}

function parseArgs(argv) {
  const options = {
    arch: DEFAULT_ARCH,
    iterations: DEFAULT_ITERATIONS,
    platform: DEFAULT_PLATFORM,
    url: DEFAULT_URL,
  };

  for (let index = 0; index < argv.length; index += 1) {
    const arg = argv[index];
    const next = argv[index + 1];

    if (arg === "--iterations" && next !== undefined) {
      options.iterations = Number.parseInt(next, 10);
      index += 1;
      continue;
    }

    if (arg === "--url" && next !== undefined) {
      options.url = next;
      index += 1;
      continue;
    }

    if (arg === "--arch" && next !== undefined) {
      options.arch = next;
      index += 1;
      continue;
    }

    if (arg === "--platform" && next !== undefined) {
      options.platform = next;
      index += 1;
    }
  }

  if (!Number.isInteger(options.iterations) || options.iterations <= 0) {
    throw new Error("--iterations must be a positive integer");
  }

  return options;
}

function parseVersionLine(line, sourceDescription, lineNumber) {
  let parsed;
  try {
    parsed = JSON.parse(line);
  } catch (error) {
    throw new Error(
      `Failed to parse version data from ${sourceDescription} at line ${lineNumber}: ${error.message}`,
    );
  }

  if (
    typeof parsed !== "object" ||
    parsed === null ||
    typeof parsed.version !== "string" ||
    !Array.isArray(parsed.artifacts)
  ) {
    throw new Error(
      `Invalid NDJSON record in ${sourceDescription} at line ${lineNumber}.`,
    );
  }

  return parsed;
}

function parseVersionData(data, sourceDescription) {
  const versions = [];

  for (const [index, line] of data.split("\n").entries()) {
    const trimmed = line.trim();
    if (trimmed === "") {
      continue;
    }

    versions.push(parseVersionLine(trimmed, sourceDescription, index + 1));
  }

  if (versions.length === 0) {
    throw new Error(`No version data found in ${sourceDescription}.`);
  }

  return versions;
}

async function readEntireResponse(response) {
  if (response.body === null) {
    const text = await response.text();
    return {
      bytesRead: Buffer.byteLength(text, "utf8"),
      text,
    };
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  const chunks = [];
  let bytesRead = 0;

  while (true) {
    const { done, value } = await reader.read();
    if (done) {
      chunks.push(decoder.decode());
      break;
    }

    bytesRead += value.byteLength;
    chunks.push(decoder.decode(value, { stream: true }));
  }

  return {
    bytesRead,
    text: chunks.join(""),
  };
}

async function fetchAllVersions(url) {
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(
      `Failed to fetch version data: ${response.status} ${response.statusText}`,
    );
  }

  const { bytesRead, text } = await readEntireResponse(response);
  return {
    bytesRead,
    versions: parseVersionData(text, url),
  };
}

async function streamUntil(url, predicate) {
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(
      `Failed to fetch version data: ${response.status} ${response.statusText}`,
    );
  }

  if (response.body === null) {
    const { bytesRead, versions } = await fetchAllVersions(url);
    return {
      bytesRead,
      matchedVersion: versions.find(predicate),
    };
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let bytesRead = 0;
  let buffer = "";
  let lineNumber = 0;

  while (true) {
    const { done, value } = await reader.read();
    if (done) {
      buffer += decoder.decode();
      break;
    }

    bytesRead += value.byteLength;
    buffer += decoder.decode(value, { stream: true });

    let newlineIndex = buffer.indexOf("\n");
    while (newlineIndex !== -1) {
      const line = buffer.slice(0, newlineIndex);
      buffer = buffer.slice(newlineIndex + 1);
      const trimmed = line.trim();

      if (trimmed !== "") {
        lineNumber += 1;
        const versionData = parseVersionLine(trimmed, url, lineNumber);
        if (predicate(versionData)) {
          await reader.cancel();
          return { bytesRead, matchedVersion: versionData };
        }
      }

      newlineIndex = buffer.indexOf("\n");
    }
  }

  if (buffer.trim() !== "") {
    lineNumber += 1;
    const versionData = parseVersionLine(buffer.trim(), url, lineNumber);
    if (predicate(versionData)) {
      return { bytesRead, matchedVersion: versionData };
    }
  }

  return { bytesRead, matchedVersion: undefined };
}

function versionSatisfies(version, versionSpecifier) {
  return (
    semver.satisfies(version, versionSpecifier) ||
    pep440.satisfies(version, versionSpecifier)
  );
}

function maxSatisfying(versions, versionSpecifier) {
  const semverMatch = semver.maxSatisfying(versions, versionSpecifier);
  if (semverMatch !== null) {
    return semverMatch;
  }

  return pep440.maxSatisfying(versions, versionSpecifier) ?? undefined;
}

function selectArtifact(artifacts) {
  if (artifacts.length === 1) {
    return artifacts[0];
  }

  const defaultVariant = artifacts.find(
    (candidate) => candidate.variant === "default",
  );
  if (defaultVariant !== undefined) {
    return defaultVariant;
  }

  return artifacts[0];
}

async function benchmarkCase(name, expected, implementations, iterations) {
  const results = {
    name,
    new: [],
    old: [],
  };

  for (let iteration = 0; iteration < iterations; iteration += 1) {
    const order = iteration % 2 === 0 ? ["old", "new"] : ["new", "old"];

    for (const label of order) {
      const implementation = implementations[label];
      const startedAt = performance.now();
      const outcome = await implementation.run();
      const durationMs = performance.now() - startedAt;

      if (outcome.value !== expected) {
        throw new Error(
          `${name} ${label} produced ${JSON.stringify(outcome.value)}; expected ${JSON.stringify(expected)}`,
        );
      }

      results[label].push({
        bytesRead: outcome.bytesRead,
        durationMs,
      });
    }
  }

  return results;
}

function summarize(samples) {
  const durations = samples
    .map((sample) => sample.durationMs)
    .sort((left, right) => left - right);
  const bytes = samples
    .map((sample) => sample.bytesRead)
    .sort((left, right) => left - right);

  const sum = (values) => values.reduce((total, value) => total + value, 0);
  const percentile = (values, ratio) => {
    const index = Math.min(
      values.length - 1,
      Math.max(0, Math.ceil(values.length * ratio) - 1),
    );
    return values[index];
  };

  return {
    avgBytes: sum(bytes) / bytes.length,
    avgMs: sum(durations) / durations.length,
    maxMs: durations[durations.length - 1],
    medianMs: percentile(durations, 0.5),
    minMs: durations[0],
    p95Ms: percentile(durations, 0.95),
  };
}

function formatNumber(value, digits = 2) {
  return value.toFixed(digits);
}

function formatSummary(name, oldSummary, newSummary) {
  const speedup = oldSummary.avgMs / newSummary.avgMs;
  const timeReduction =
    ((oldSummary.avgMs - newSummary.avgMs) / oldSummary.avgMs) * 100;
  const byteReduction =
    ((oldSummary.avgBytes - newSummary.avgBytes) / oldSummary.avgBytes) * 100;

  return [
    `Scenario: ${name}`,
    `  old avg: ${formatNumber(oldSummary.avgMs)} ms | median: ${formatNumber(oldSummary.medianMs)} ms | p95: ${formatNumber(oldSummary.p95Ms)} ms | avg bytes: ${Math.round(oldSummary.avgBytes)}`,
    `  new avg: ${formatNumber(newSummary.avgMs)} ms | median: ${formatNumber(newSummary.medianMs)} ms | p95: ${formatNumber(newSummary.p95Ms)} ms | avg bytes: ${Math.round(newSummary.avgBytes)}`,
    `  delta: ${formatNumber(timeReduction)}% faster | ${formatNumber(speedup)}x speedup | ${formatNumber(byteReduction)}% fewer bytes read`,
  ].join("\n");
}

async function main() {
  const options = parseArgs(process.argv.slice(2));
  console.log(`Preparing benchmark data from ${options.url}`);
  const baseline = await fetchAllVersions(options.url);
  const latestVersion = baseline.versions[0]?.version;
  if (!latestVersion) {
    throw new Error("No versions found in NDJSON data");
  }

  const latestArtifact = selectArtifact(
    baseline.versions[0].artifacts.filter(
      (candidate) =>
        candidate.platform === `${options.arch}-${options.platform}`,
    ),
  );
  if (!latestArtifact) {
    throw new Error(
      `No artifact found for ${options.arch}-${options.platform} in ${latestVersion}`,
    );
  }

  const rangeSpecifier = `^${latestVersion.split(".")[0]}.${latestVersion.split(".")[1]}.0`;

  console.log(
    `Running ${options.iterations} iterations per scenario against ${options.url}`,
  );
  console.log(`Latest version: ${latestVersion}`);
  console.log(`Range benchmark: ${rangeSpecifier}`);
  console.log(`Artifact benchmark: ${options.arch}-${options.platform}`);
  console.log("");

  const scenarios = [
    await benchmarkCase(
      "latest version",
      latestVersion,
      {
        new: {
          run: async () => {
            const { bytesRead, matchedVersion } = await streamUntil(
              options.url,
              () => true,
            );
            return {
              bytesRead,
              value: matchedVersion?.version,
            };
          },
        },
        old: {
          run: async () => {
            const { bytesRead, versions } = await fetchAllVersions(options.url);
            return {
              bytesRead,
              value: versions[0]?.version,
            };
          },
        },
      },
      options.iterations,
    ),
    await benchmarkCase(
      "highest satisfying range",
      latestVersion,
      {
        new: {
          run: async () => {
            const { bytesRead, matchedVersion } = await streamUntil(
              options.url,
              (candidate) =>
                versionSatisfies(candidate.version, rangeSpecifier),
            );
            return {
              bytesRead,
              value: matchedVersion?.version,
            };
          },
        },
        old: {
          run: async () => {
            const { bytesRead, versions } = await fetchAllVersions(options.url);
            return {
              bytesRead,
              value: maxSatisfying(
                versions.map((versionData) => versionData.version),
                rangeSpecifier,
              ),
            };
          },
        },
      },
      options.iterations,
    ),
    await benchmarkCase(
      "exact version artifact",
      latestArtifact.url,
      {
        new: {
          run: async () => {
            const { bytesRead, matchedVersion } = await streamUntil(
              options.url,
              (candidate) => candidate.version === latestVersion,
            );
            const artifact = matchedVersion
              ? selectArtifact(
                  matchedVersion.artifacts.filter(
                    (candidate) =>
                      candidate.platform ===
                      `${options.arch}-${options.platform}`,
                  ),
                )
              : undefined;
            return {
              bytesRead,
              value: artifact?.url,
            };
          },
        },
        old: {
          run: async () => {
            const { bytesRead, versions } = await fetchAllVersions(options.url);
            const versionData = versions.find(
              (candidate) => candidate.version === latestVersion,
            );
            const artifact = selectArtifact(
              versionData.artifacts.filter(
                (candidate) =>
                  candidate.platform === `${options.arch}-${options.platform}`,
              ),
            );
            return {
              bytesRead,
              value: artifact?.url,
            };
          },
        },
      },
      options.iterations,
    ),
  ];

  for (const scenario of scenarios) {
    const oldSummary = summarize(scenario.old);
    const newSummary = summarize(scenario.new);
    console.log(formatSummary(scenario.name, oldSummary, newSummary));
    console.log("");
  }
}

await main();
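The core of the speedup measured by `streamUntil` above is its newline-delimited buffer scan, which stops reading as soon as a record matches. That logic can be exercised without a network; a small sketch (the chunk boundaries below are arbitrary and chosen to show a record split mid-JSON):

```javascript
// Feed chunks (as a network stream would deliver them) into a
// newline-delimited buffer, returning the first record that matches
// the predicate. Remaining chunks stay unread, mirroring streamUntil.
function scanChunks(chunks, predicate) {
  let buffer = "";
  let bytesRead = 0;
  for (const chunk of chunks) {
    bytesRead += chunk.length;
    buffer += chunk;
    let idx = buffer.indexOf("\n");
    while (idx !== -1) {
      const line = buffer.slice(0, idx).trim();
      buffer = buffer.slice(idx + 1);
      if (line !== "") {
        const record = JSON.parse(line);
        if (predicate(record)) {
          return { bytesRead, matched: record }; // early stop
        }
      }
      idx = buffer.indexOf("\n");
    }
  }
  return { bytesRead, matched: undefined };
}

// A record deliberately split across two chunks:
const chunks = ['{"version":"0.10', '.7"}\n{"version":"0.10.6"}\n'];
const result = scanChunks(chunks, (r) => r.version === "0.10.7");
console.log(result.matched.version); // "0.10.7"
```

The buffering is what makes split records safe: a partial JSON fragment simply waits in `buffer` until the chunk containing its newline arrives.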
scripts/build-dist.mjs (new file, 33 lines)

@@ -0,0 +1,33 @@
import { rm } from "node:fs/promises";
import { build } from "esbuild";

const builds = [
  {
    entryPoints: ["src/setup-uv.ts"],
    outfile: "dist/setup/index.cjs",
    staleOutfiles: ["dist/setup/index.mjs"],
  },
  {
    entryPoints: ["src/save-cache.ts"],
    outfile: "dist/save-cache/index.cjs",
    staleOutfiles: ["dist/save-cache/index.mjs"],
  },
  {
    entryPoints: ["src/update-known-checksums.ts"],
    outfile: "dist/update-known-checksums/index.cjs",
    staleOutfiles: ["dist/update-known-checksums/index.mjs"],
  },
];

for (const { staleOutfiles, ...options } of builds) {
  await Promise.all(
    staleOutfiles.map((outfile) => rm(outfile, { force: true })),
  );
  await build({
    bundle: true,
    format: "cjs",
    platform: "node",
    target: "node24",
    ...options,
  });
}
@@ -6,33 +6,35 @@ import type { Architecture, Platform } from "../../utils/platforms";
 import { KNOWN_CHECKSUMS } from "./known-checksums";

 export async function validateChecksum(
-  checkSum: string | undefined,
+  checksum: string | undefined,
   downloadPath: string,
   arch: Architecture,
   platform: Platform,
   version: string,
 ): Promise<void> {
-  let isValid: boolean | undefined;
-  if (checkSum !== undefined && checkSum !== "") {
-    isValid = await validateFileCheckSum(downloadPath, checkSum);
-  } else {
-    core.debug("Checksum not provided. Checking known checksums.");
-    const key = `${arch}-${platform}-${version}`;
-    if (key in KNOWN_CHECKSUMS) {
-      const knownChecksum = KNOWN_CHECKSUMS[`${arch}-${platform}-${version}`];
-      core.debug(`Checking checksum for ${arch}-${platform}-${version}.`);
-      isValid = await validateFileCheckSum(downloadPath, knownChecksum);
-    } else {
-      core.debug(`No known checksum found for ${key}.`);
-    }
+  const key = `${arch}-${platform}-${version}`;
+  const hasProvidedChecksum = checksum !== undefined && checksum !== "";
+  const checksumToUse = hasProvidedChecksum ? checksum : KNOWN_CHECKSUMS[key];
+
+  if (checksumToUse === undefined) {
+    core.debug(`No checksum found for ${key}.`);
+    return;
   }

-  if (isValid === false) {
-    throw new Error(`Checksum for ${downloadPath} did not match ${checkSum}.`);
-  }
-  if (isValid === true) {
-    core.debug(`Checksum for ${downloadPath} is valid.`);
-  }
+  const checksumSource = hasProvidedChecksum
+    ? "provided checksum"
+    : `KNOWN_CHECKSUMS entry for ${key}`;
+
+  core.debug(`Validating checksum using ${checksumSource}.`);
+  const isValid = await validateFileCheckSum(downloadPath, checksumToUse);
+
+  if (!isValid) {
+    throw new Error(
+      `Checksum for ${downloadPath} did not match ${checksumToUse}.`,
+    );
+  }
+
+  core.debug(`Checksum for ${downloadPath} is valid.`);
 }

 async function validateFileCheckSum(
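The refactor above first resolves a single checksum (provided or known), then hashes the downloaded file once and compares. The hashing-and-compare step can be sketched with Node's `node:crypto` module (the inline string stands in for file contents; the function names here are illustrative, not the action's actual helpers):

```javascript
import { createHash } from "node:crypto";

// Compute a sha256 hex digest and compare it to an expected value,
// mirroring the resolve-then-validate-then-throw flow above.
function sha256Hex(data) {
  return createHash("sha256").update(data).digest("hex");
}

function assertChecksum(data, expected) {
  const actual = sha256Hex(data);
  if (actual !== expected) {
    throw new Error(`Checksum mismatch: got ${actual}, expected ${expected}`);
  }
}

// sha256("hello") is a well-known digest:
assertChecksum(
  "hello",
  "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824",
);
console.log("checksum ok");
```

In the action itself the input is a downloaded archive streamed from disk rather than an in-memory string, but the digest comparison is the same.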
@@ -1,5 +1,523 @@
// AUTOGENERATED_DO_NOT_EDIT
export const KNOWN_CHECKSUMS: { [key: string]: string } = {
  "aarch64-apple-darwin-0.10.10":
    "8a09f0ef51ee7f7170731b4cb8bde5bf9ba6da5304f49a7df6cdab42a1f37b5d",
  "aarch64-pc-windows-msvc-0.10.10":
    "2c6fe113f14574bc27f085751c68d3485589fcc3c3c64ed85dd1eecc2f87cffc",
  "aarch64-unknown-linux-gnu-0.10.10":
    "2b80457b950deda12e8d5dc3b9b7494ac143eae47f1fb11b1c6e5a8495a6421e",
  "aarch64-unknown-linux-musl-0.10.10":
    "d08c08b82cdcaf2bd3d928ffe844d3558dda53f90066db6ef9174157cc763252",
  "arm-unknown-linux-musleabihf-0.10.10":
    "ccc3c4dd5eeea4b2be829ef9bc0b8d9882389c0f303f7ec5ba668065d57e2673",
  "armv7-unknown-linux-gnueabihf-0.10.10":
    "032786622b52f8d0232b5ad16e25342a64f9e43576652db7bf607231021902f3",
  "armv7-unknown-linux-musleabihf-0.10.10":
    "f6f67b190eb28b473917c97210f89fd11d9b9393d774acd093ea738fcee68864",
  "i686-pc-windows-msvc-0.10.10":
    "980d7ea368cc4883f572bb85c285a647eddfc23539064d2bfaf8fbfefcc2112b",
  "i686-unknown-linux-gnu-0.10.10":
    "5260fbef838f8cfec44697064a5cfae08a27c6ab7ed7feab7fc946827e896952",
  "i686-unknown-linux-musl-0.10.10":
    "a6683ade964f8d8623098ca0c96b4311d8388b44a56a386cd795974f39fb5bd2",
  "powerpc64le-unknown-linux-gnu-0.10.10":
    "78939dc4fc905aca8af4be19b6c6ecc306f04c6ca9f98d144372595d9397fd0d",
  "riscv64gc-unknown-linux-gnu-0.10.10":
    "5eff670bf80fce9d9e50df5b4d46c415a9c0324eadf7059d97c76f89ffc33c3f",
  "s390x-unknown-linux-gnu-0.10.10":
    "a32d2be5600f7f42f82596ffe9d3115f020974ca7fb4f15251c5625c5481ea5e",
  "x86_64-apple-darwin-0.10.10":
    "dd18420591d625f9b4ca2b57a7a6fe3cce43910f02e02d90e47a4101428de14a",
  "x86_64-pc-windows-msvc-0.10.10":
    "d31a30f1dfb96e630a08d5a9b3f3f551254b7ed6e9b7e495f46a4232661c7252",
  "x86_64-unknown-linux-gnu-0.10.10":
    "3e1027f26ce8c7e4c32e2277a7fed2cb410f2f1f9320d3df97653d40e21f415b",
  "x86_64-unknown-linux-musl-0.10.10":
    "74544e8755fbc27559e22e29fd561bdc48f91b8bd8323e760a1130f32433bea4",
  "aarch64-apple-darwin-0.10.9":
    "a92f61e9ac9b0f29668c15f56152e4a60143fca148ff5bfadb86718472c3f376",
  "aarch64-pc-windows-msvc-0.10.9":
    "5c2526844acf978eab784161c21604343141aa6c9ed22c237ae2f315648f049d",
  "aarch64-unknown-linux-gnu-0.10.9":
    "cc0c5a8573e7d6d78aecb954e0a62b5c0d18217bb81f1e19363b428c57a9962a",
  "aarch64-unknown-linux-musl-0.10.9":
    "05b0d3087e913ebe11756365a90dd47c05d6728752fdbe129ad4c3ccd769826d",
  "arm-unknown-linux-musleabihf-0.10.9":
    "6220fa3eb5f8212cae4ec3a5053060914aaa829549cf706dde9f9cc344f75f61",
  "armv7-unknown-linux-gnueabihf-0.10.9":
    "0076eac165c2f7129627e2297478e7ffbb9465d9ae6a8961b2f53dcbd807473d",
  "armv7-unknown-linux-musleabihf-0.10.9":
    "f702e821b80e371e14987a886d58ee103c5948b7b096fa49a552624c24d7e073",
  "i686-pc-windows-msvc-0.10.9":
    "034bf6b91390b9adc5f41a5946fdb618ebc8cef1574f3d95af9c12fe2bf9aaf3",
  "i686-unknown-linux-gnu-0.10.9":
    "90d9168a4e7900463f9fd79a32eb1890081fb1e238d803404f6e17b2dcdcca7b",
  "i686-unknown-linux-musl-0.10.9":
    "1d42b0d0a037b3d658b11ec889154686db3ab269ba2b789bdbc45d36e3549f34",
  "powerpc64le-unknown-linux-gnu-0.10.9":
    "e804f4a7d0659e09ef806365f04bdd33c940603fab903e925402748d05dd109a",
  "riscv64gc-unknown-linux-gnu-0.10.9":
    "1541596da45855e34202130027a613a2ace7d441e04d747cb4dd9f2590461c9a",
  "s390x-unknown-linux-gnu-0.10.9":
    "a589d4a8930c82fa7225daec19c632651b3c84f50f770efe758056b387e5f0dd",
  "x86_64-apple-darwin-0.10.9":
    "9cc2de7d195fa157f98b306a8a1cb151ded93f488939b93363cebc8b9d598c28",
  "x86_64-pc-windows-msvc-0.10.9":
    "f58dc40896000229db7c52b8bdd931394040ef2ad59abd1eda841f6d70b13d7a",
  "x86_64-unknown-linux-gnu-0.10.9":
    "20d79708222611fa540b5c9ed84f352bcd3937740e51aacc0f8b15b271c57594",
  "x86_64-unknown-linux-musl-0.10.9":
    "433e56874739e92c7cfd661ba9e5f287b376ca612c08c8194a41a98a13158aea",
  "aarch64-apple-darwin-0.10.8":
    "c3a6fff5b6b4abddff863117878194e35dbc6b0267d61ad259ab9896f9b8dcbb",
  "aarch64-pc-windows-msvc-0.10.8":
    "20db25dc446f9a75d1cfde0a5f4b021e1b2eb266e600a610d32c7ca5d7ff83bf",
  "aarch64-unknown-linux-gnu-0.10.8":
    "661860e954f87dcd823251191866af3486484d1a9df60eed56f4586ed7559e3d",
  "aarch64-unknown-linux-musl-0.10.8":
    "2ef0d0489e9e2a32f134ca80097fa36be4b486c4ab004706a1d6d0d57980ff07",
  "arm-unknown-linux-musleabihf-0.10.8":
    "f6dfca333c566024f6feaef19adf7ce06675a1bc2fcadc2de640dd805112a518",
  "armv7-unknown-linux-gnueabihf-0.10.8":
    "1bee8f88a7129f7922c43b0e091a7065d4e13a2934e599aa8a48f162cf9739aa",
  "armv7-unknown-linux-musleabihf-0.10.8":
    "ad0ca78991518fde1c4c42f8590e86f29db1f746cedb637f9dac1bb7de2e28da",
  "i686-pc-windows-msvc-0.10.8":
    "db40952a0c16eb647cb3a06c8cc13712b72e5b6a2501bc080c7e00c0f0e4ad88",
  "i686-unknown-linux-gnu-0.10.8":
    "3a78c54ffedce8eafd59a19a32eaec538924169fa4bf9d28d2d5841a7f604210",
  "i686-unknown-linux-musl-0.10.8":
    "25cf70c12abded06c4c18db8fdba253776bc115ce28f849af6f6ef771e67d730",
  "powerpc64le-unknown-linux-gnu-0.10.8":
    "3a4a158e645d04825872eb59ca60dd5026529e4f9fe5dd88987a45478301724d",
  "riscv64gc-unknown-linux-gnu-0.10.8":
    "2349e786d2de14fbd72386f42ed9f398cad52f47f6cdd78e05f338a1faf1321c",
  "s390x-unknown-linux-gnu-0.10.8":
    "21de0f86838b06e6ebcc3cb6a079d49d3d3886e5b49822ae58e5758eb08a6710",
  "x86_64-apple-darwin-0.10.8":
    "e0a1b22b039f8155765f5bc8c13df03a5f994a901901179791572e8e5f053281",
  "x86_64-pc-windows-msvc-0.10.8":
    "2e70ecd22196cbd9d14eefb700814bcafc5b75a0d8275b52e8402e5fe256d928",
  "x86_64-unknown-linux-gnu-0.10.8":
    "f0c566b55683395a62fefb9261a060fa09824914b5682c3b9629fa154762ae2f",
  "x86_64-unknown-linux-musl-0.10.8":
    "a4e6ad1aecac61077de548d2cc9ccf2c2f1848863312b3b59fb0d2eb8d8a043c",
  "aarch64-apple-darwin-0.10.7":
    "1eb4dcc5e0fc8669fa0b33cf1151b64ba3b8c26b60dceff4f7a686129e2af22b",
  "aarch64-pc-windows-msvc-0.10.7":
    "45ba7b72a7435343d650c73d21d65d2e8bdda47f6bd39af00e37f3cb70aa79ef",
  "aarch64-unknown-linux-gnu-0.10.7":
    "20efc27d946860093650bcf26096a016b10fdaf03b13c33b75fbde02962beea9",
  "aarch64-unknown-linux-musl-0.10.7":
    "115291f9943531a3b63db3a2eabda8b74b8da4831551679382cb309c9debd9f7",
  "arm-unknown-linux-musleabihf-0.10.7":
    "3ea331cd68f28235e13639d5400341a3893d0455f2473a74a9926b7d62cb739c",
  "armv7-unknown-linux-gnueabihf-0.10.7":
    "2e2f88cc5a7b49282c9aa05cfe03e3b8b0a044e90981062fbeb60a7aeba188ca",
  "armv7-unknown-linux-musleabihf-0.10.7":
    "27319e842d802c5c73be52f3774999d79d0f28f37984090998560fd925133375",
  "i686-pc-windows-msvc-0.10.7":
    "a7960473a473ee5907a55fccb8c645e24c1da7d39076aaef652b819e3a26a28b",
  "i686-unknown-linux-gnu-0.10.7":
    "1a22aa0d2268a9a6fb2e5f092ca3d1ef7c14f96c3b4fd546226814f376e59d73",
  "i686-unknown-linux-musl-0.10.7":
    "75c2cc60675fb6f846b394c3f7b51f77c08f0981abf5cfcb5e27cfbb2f5837e0",
  "powerpc64le-unknown-linux-gnu-0.10.7":
    "7398686962b966959c32e7fbfd2868fbac38491ff0d86033d7c8bbb826a04026",
  "riscv64gc-unknown-linux-gnu-0.10.7":
    "39abc60403fdcf5c681b63c967059d42aea58a81ffb092d6dda767390222a4b0",
  "s390x-unknown-linux-gnu-0.10.7":
    "281ae4c1343e0c5f9775358690d40e00edbf63ca788b4d8b6574a0b5cba624f4",
"x86_64-apple-darwin-0.10.7":
|
||||
"4fed9d4f4608fb3850db714ee37244436f850a2b6e485bc510795679c2d08866",
|
||||
"x86_64-pc-windows-msvc-0.10.7":
|
||||
"8881afb877996a1373a12e816395122a8d39a3ac06cd066272acdb49510cf0fe",
|
||||
"x86_64-unknown-linux-gnu-0.10.7":
|
||||
"9ac6cee4e379a5abfca06e78a777b26b7ba1f81cb7935b97054d80d85ac00774",
|
||||
"x86_64-unknown-linux-musl-0.10.7":
|
||||
"992529add6024e67135b1c80617abd2eca7be2cf0b99b3911f923de815bd8dc1",
|
||||
"aarch64-apple-darwin-0.10.6":
|
||||
"3993249d8f51deaf34cfce037e57e294e82267ff1f9dc45b7983a17afaf065b4",
|
||||
"aarch64-pc-windows-msvc-0.10.6":
|
||||
"e431c9a4f8d66e872f6640500cbbf1af20418720b78ac01404399ac810ef2e46",
|
||||
"aarch64-unknown-linux-gnu-0.10.6":
|
||||
"9380705294a85e3e634570abddd5b2577900c1873c29b790c7abc56a81dce4bc",
|
||||
"aarch64-unknown-linux-musl-0.10.6":
|
||||
"7de7aa836fd54ff930fa5e63bc04da35e2fbd72889d6258e153479c44d08b863",
|
||||
"arm-unknown-linux-musleabihf-0.10.6":
|
||||
"9d0b55a3b0aff97884f49e15739a9936eb33a1b59a5bf1b3c7ce4d9e517d4d76",
|
||||
"armv7-unknown-linux-gnueabihf-0.10.6":
|
||||
"165400192202ee2487bcee4429a5e5a2fddfe8fef8985fb548e2a89fda6b2376",
|
||||
"armv7-unknown-linux-musleabihf-0.10.6":
|
||||
"1cf58447f2003122f83b1a34aee94429cb2686010c3502bfa21c8116e09d5bdf",
|
||||
"i686-pc-windows-msvc-0.10.6":
|
||||
"ec189db03b89262e6089e4fb895af6116b964234cf4166b330e258aaf7f999b4",
|
||||
"i686-unknown-linux-gnu-0.10.6":
|
||||
"f72a88d489fc424aca69c1cbf175bb5aeae649aa8c55b092628e5e553b481dd5",
|
||||
"i686-unknown-linux-musl-0.10.6":
|
||||
"94471f51aedbfaceb495949d5ce37d44352b2dfea45b61399870c39a881681fc",
|
||||
"powerpc64le-unknown-linux-gnu-0.10.6":
|
||||
"72d504553fc7150177bbf57b585c850cb4d695ddd848b9ba1416ac122eb88293",
|
||||
"riscv64gc-unknown-linux-gnu-0.10.6":
|
||||
"8f8a966d1f911f39334581a933805a30cdec5a7c1d4f580e03973ff45bf9b6ad",
|
||||
"s390x-unknown-linux-gnu-0.10.6":
|
||||
"5ed60237762862b05561d02b7f095268897d0561e87dca5986b04319265bc2cf",
|
||||
"x86_64-apple-darwin-0.10.6":
|
||||
"d7647571fb17a5107d4d23cc190418039c157fd7361ddb59bc6f8127a49e3eac",
|
||||
"x86_64-pc-windows-msvc-0.10.6":
|
||||
"b27eb789f281e398a82197477de727fc8faf08605152115686da2c3cba0d25f7",
|
||||
"x86_64-unknown-linux-gnu-0.10.6":
|
||||
"aaa402e19d14a6b9a4267fcf4ec35380f804c68923525cea67cd6ee05bb4e930",
|
||||
"x86_64-unknown-linux-musl-0.10.6":
|
||||
"01d6ce770da88ce6445acb0a8764c8b1634c9f69c728dca68b19fc7a893f72b9",
|
||||
"aarch64-apple-darwin-0.10.5":
|
||||
"796c2d264c6aba3e1179249438a9fa2fe64140748f0e5b6681e38218ab6238f1",
|
||||
"aarch64-pc-windows-msvc-0.10.5":
|
||||
"7f88f279e271cd76a6e07fe1ad711cbdf15374206ab79f55adadb818ebbd8e43",
|
||||
"aarch64-unknown-linux-gnu-0.10.5":
|
||||
"dfa82b047456c646c50ba769af81a6b7ba20aaf5feee96e61554861db8db5809",
|
||||
"aarch64-unknown-linux-musl-0.10.5":
|
||||
"cf01a960442b9aff4cadc4d27c691086151e9289b5b9fbd0dc41ecfcff1db872",
|
||||
"arm-unknown-linux-musleabihf-0.10.5":
|
||||
"abe18becc57fe3c3bf55e62b4b7be0231cb4dbb941fdb3f4f9132703b1f4868c",
|
||||
"armv7-unknown-linux-gnueabihf-0.10.5":
|
||||
"46d79f64e88cb339160cf90f6df51ea14795960840fb4fca8aa61af8cddd8187",
|
||||
"armv7-unknown-linux-musleabihf-0.10.5":
|
||||
"13444ea0cc650551c4c455af73ac27a77185064275475b2999c627158b7455f4",
|
||||
"i686-pc-windows-msvc-0.10.5":
|
||||
"67d96bae5ef30b9f1e201622505591601b936996ceea84c36fce5e577db5a442",
|
||||
"i686-unknown-linux-gnu-0.10.5":
|
||||
"56eb897036b8607bb7516349388bef6c83004ae05e694ec34e1bae69f3a0f237",
|
||||
"i686-unknown-linux-musl-0.10.5":
|
||||
"b0be10f5c16a987294a806dfd3927348456fca8b465377c99e0d167792b842dc",
|
||||
"powerpc64le-unknown-linux-gnu-0.10.5":
|
||||
"c7f4049b7e26a43107351808f7748c3bc0dfdf118c29f4b1470b69be15fef45b",
|
||||
"riscv64gc-unknown-linux-gnu-0.10.5":
|
||||
"756c43f4844953a2241c4254d268335b3bd35ca81856e8e06c7d4826466e87ce",
|
||||
"s390x-unknown-linux-gnu-0.10.5":
|
||||
"fbccde48aec139fc99558bd022ec3cab15f607b9b5e0efc0279c6145ab5ecaf7",
|
||||
"x86_64-apple-darwin-0.10.5":
|
||||
"84c4ce2902e2e840a54a75360b00f06ceffc6c26894bc5e73151a2c55d5fd043",
|
||||
"x86_64-pc-windows-msvc-0.10.5":
|
||||
"d5b3b04127eb6fb41ffca60c0da655124133b62b4b58e29cfc5435469a176e06",
|
||||
"x86_64-unknown-linux-gnu-0.10.5":
|
||||
"bcb127225873baa5ebd23cf09f29996cc97c1091830c9933e2e320bf1429a584",
|
||||
"x86_64-unknown-linux-musl-0.10.5":
|
||||
"88aeea39c77b6b796ca6b19c0216a577b18095dc450972dac7872a307bb1e160",
|
||||
"aarch64-apple-darwin-0.10.4":
|
||||
"a6852e4dc565c8fedcf5adcdf09fca7caf5347739bed512bd95b15dada36db51",
|
||||
"aarch64-pc-windows-msvc-0.10.4":
|
||||
"77f859cfc26181bdfb94087ce42336d9e2d9e0700bc42f6668445cde517198ce",
|
||||
"aarch64-unknown-linux-gnu-0.10.4":
|
||||
"c84a6e6405715caa6e2f5ef8e5f29a5d0bc558a954e9f1b5c082b9d4708c222e",
|
||||
"aarch64-unknown-linux-musl-0.10.4":
|
||||
"82fc461031dafb130af761e7dbec1bcc51b826c2e664f5bf8bc4e4f8330320cd",
|
||||
"arm-unknown-linux-musleabihf-0.10.4":
|
||||
"2050d9037a63975dafed987bdc7d2960a3b82345951c14193060fce20f9d31d8",
|
||||
"armv7-unknown-linux-gnueabihf-0.10.4":
|
||||
"d1824ed14f3ad0e7cb7835b46bc0299859cd8141d039a66274a135ca9797bf9c",
|
||||
"armv7-unknown-linux-musleabihf-0.10.4":
|
||||
"3038fdf153a722941424c28ae76996d60589f7f626c2000eb6567b3c301100dd",
|
||||
"i686-pc-windows-msvc-0.10.4":
|
||||
"b42379a65e9cec5863a22cf81810aec57281b08d426e70cc3b90320b996d84a7",
|
||||
"i686-unknown-linux-gnu-0.10.4":
|
||||
"79821b1d6c035aa8dc32a45d41551a4f010b8e357c98df48c95c5cb5ec18a743",
|
||||
"i686-unknown-linux-musl-0.10.4":
|
||||
"459315d7dba39b0297f44104fad1c93fa5cf866f91b533bba02d58f1e54129ad",
|
||||
"powerpc64le-unknown-linux-gnu-0.10.4":
|
||||
"7b315d9580ef574a1d0ff2023c16e5ac8a164feb1e998f33ed144dfd4c4fc125",
|
||||
"riscv64gc-unknown-linux-gnu-0.10.4":
|
||||
"101a71c072986929c410d4839babf66851563fd855b36c1dd7ffbbf5fbedce36",
|
||||
"s390x-unknown-linux-gnu-0.10.4":
|
||||
"59a50f14892c82de8f3e7a1a63ebc0ef98778085e4bb35ec99323f5009232fe2",
|
||||
"x86_64-apple-darwin-0.10.4":
|
||||
"df6dd1c3ebeab4369a098c516c15c233c62bf789a40a4864b30dad1d38d7604e",
|
||||
"x86_64-pc-windows-msvc-0.10.4":
|
||||
"0f0e22d7507633bfb38d9b42fb6a0341f1f74b8e80b070a31231c354812432a3",
|
||||
"x86_64-unknown-linux-gnu-0.10.4":
|
||||
"6b52a47358deea1c5e173278bf46b2b489747a59ae31f2a4362ed5c6c1c269f7",
|
||||
"x86_64-unknown-linux-musl-0.10.4":
|
||||
"18adf097cea30a165ba086c1e72659fec3c5aca056a560e7c39e0164ac871196",
|
||||
"aarch64-apple-darwin-0.10.3":
|
||||
"ed2a08079527dafae4943fee80162ed750286657901e642eba4c9de928706df8",
|
||||
"aarch64-pc-windows-msvc-0.10.3":
|
||||
"48243b8acbb31d0081e00878ee3b28535ed9f28ab8b27960b88aed8e1d6dd16a",
|
||||
"aarch64-unknown-linux-gnu-0.10.3":
|
||||
"cce7d1e4c34e22955cd647b256409b6504f4ae72acf190a6f26189efefbc9a9d",
|
||||
"aarch64-unknown-linux-musl-0.10.3":
|
||||
"a98f8decf21204d40acb512b0e08a803ed718c640a97f3c095864967463d5b15",
|
||||
"arm-unknown-linux-musleabihf-0.10.3":
|
||||
"e4b3c6dc59cd65125eda09e6c24b97fca71647df979f8963662807dc6a53e165",
|
||||
"armv7-unknown-linux-gnueabihf-0.10.3":
|
||||
"1d453ef56127d3aab3ea7f383b27765840e0bdc0b683347191e4cbc26272de2e",
|
||||
"armv7-unknown-linux-musleabihf-0.10.3":
|
||||
"d2484df75c9ba4c7e9750da00c4c4276b65c088d8b551b63717d5d9aa227ffa5",
|
||||
"i686-pc-windows-msvc-0.10.3":
|
||||
"51f745bcab5f77fe75e6f221e3e55a4bddf54824e634ac6f229132880506ce7e",
|
||||
"i686-unknown-linux-gnu-0.10.3":
|
||||
"e82e76ced718091d946eed30880728cf39f05b85f4f82c483a7dbf95f1663531",
|
||||
"i686-unknown-linux-musl-0.10.3":
|
||||
"0baca51f61729c6911d1d055c2e6dee5d11d88f6abbcd1ff801460f46880dc8d",
|
||||
"powerpc64le-unknown-linux-gnu-0.10.3":
|
||||
"cf4969ba97af3a53d1e4dc8a28441b79e78a8d9a9d41854e88b425f6b6fc6179",
|
||||
"riscv64gc-unknown-linux-gnu-0.10.3":
|
||||
"79b6b362e48c80e5b7d251fb96546d8ee52dd3458e01518cef969f757b59502b",
|
||||
"s390x-unknown-linux-gnu-0.10.3":
|
||||
"fc969d6011e4ffd0752abb5d812fc453649a7394c3f08a11556c9960891e359c",
|
||||
"x86_64-apple-darwin-0.10.3":
|
||||
"e8071cedb9986724ca3d70020b4460a85a274394b378c0e8eb1e8f9e33402ff9",
|
||||
"x86_64-pc-windows-msvc-0.10.3":
|
||||
"d029201a3eebaa8a0001fa762ee44ca14a9cb3cae4d59fc3fd69857da03a6f8c",
|
||||
"x86_64-unknown-linux-gnu-0.10.3":
|
||||
"c60b9956a0e6727f0ddd881c303a706c6408b2047f3a8fa4d1454a826338ccdc",
|
||||
"x86_64-unknown-linux-musl-0.10.3":
|
||||
"126496b606129eda426dac502af0d910d895f3db81da28efc49b18edf5557741",
|
||||
"aarch64-apple-darwin-0.10.2":
|
||||
"3828b2de196687f60e9d199aea8b504299629300831eea0935ff3fe339903d0a",
|
||||
"aarch64-pc-windows-msvc-0.10.2":
|
||||
"826e4ee3a03ec245e54c449e272fdf8aab749e039cc49c950ad43cc13702221f",
|
||||
"aarch64-unknown-linux-gnu-0.10.2":
|
||||
"4998f545234d52fc6f1280827d392f00a9278295050d59c53a776546dbf0124d",
|
||||
"aarch64-unknown-linux-musl-0.10.2":
|
||||
"685e47f8f88b6845a9fc2ca27c3d246c0f53af8c017daf8e98ac0a97fe20365b",
|
||||
"arm-unknown-linux-musleabihf-0.10.2":
|
||||
"1c51ebc67e8e492fa549167a96e40bb21a2c2ccde8a8b440f9c8bc0e07f3d4a8",
|
||||
"armv7-unknown-linux-gnueabihf-0.10.2":
|
||||
"45243fed8f587f11002f175216894c9c75e2f402324627b7e0855e670557ec14",
|
||||
"armv7-unknown-linux-musleabihf-0.10.2":
|
||||
"45b3d7eee7a3af2e4309b0bbe4886c6640b773f6500f0e0b662d84f4a5466f67",
|
||||
"i686-pc-windows-msvc-0.10.2":
|
||||
"a828ee0a2f42d1384f79acd3edaf01956000e1ec5d18d9992d79e17d70d9aa6c",
|
||||
"i686-unknown-linux-gnu-0.10.2":
|
||||
"7f64628a8a0869185eed24de4a02f4c8d19c99dec7363f383050ccb7474a76e9",
|
||||
"i686-unknown-linux-musl-0.10.2":
|
||||
"8d1978ecfa37d2d71cbb0e2e75262e65c184d040130fe2dc331f25e044ed97b4",
|
||||
"powerpc64le-unknown-linux-gnu-0.10.2":
|
||||
"9b7f8e3ced416276a9e6321369f69234552d9cbf39d68d96a67e85cee4cd611f",
|
||||
"riscv64gc-unknown-linux-gnu-0.10.2":
|
||||
"1ad005a361293175170f3c193b50d5a5c7f1da631649236cd857721ce8c9cbde",
|
||||
"s390x-unknown-linux-gnu-0.10.2":
|
||||
"d4832c85f3e8e17f7ae4ced90059dc2b6927939a47fea3e92e5712e7148b9c09",
|
||||
"x86_64-apple-darwin-0.10.2":
|
||||
"3cdbd038333cfe861ce04f3d91678547bf2e726224acf5f42d3f0affa6740e19",
|
||||
"x86_64-pc-windows-msvc-0.10.2":
|
||||
"493ebbe0e06128d6ee4905e1ed5e2a433fb0f7cfc08b0eaca9fab4ca76778ae1",
|
||||
"x86_64-unknown-linux-gnu-0.10.2":
|
||||
"6aa4576c31f791c0b9d4739e256d07358d45e7535695287fec03cf6839e25512",
|
||||
"x86_64-unknown-linux-musl-0.10.2":
|
||||
"c162182ba7dd692794362d76dd183990d6e51553217954106da19bdb6ced211b",
|
||||
"aarch64-apple-darwin-0.10.1":
|
||||
"37c101cd8a745a43d69bc3832c41866ab721467a1d58881f57b73b705abc2851",
|
||||
"aarch64-pc-windows-msvc-0.10.1":
|
||||
"9644d0e37c41c19aa65137a928bf6fad78dc887f820202c0cfcf010cceb416a0",
|
||||
"aarch64-unknown-linux-gnu-0.10.1":
|
||||
"3731e98805ea6789188edec0dd97e673da195bf976a72db38f325f7c51cf5cdd",
|
||||
"aarch64-unknown-linux-musl-0.10.1":
|
||||
"ae9ae536be5b4d1cf7a6560d52a20711f267e7b21e23ee6cc538a4afa236b757",
|
||||
"arm-unknown-linux-musleabihf-0.10.1":
|
||||
"af7994b58553156fb4acdac40b3f7b1b43260a76de96ca7123bdf861351675d4",
|
||||
"armv7-unknown-linux-gnueabihf-0.10.1":
|
||||
"4f8857a779df69e2aa9df8ff35b6c34ef3ce45c13d2d4a0ae3957b0e68d322cc",
|
||||
"armv7-unknown-linux-musleabihf-0.10.1":
|
||||
"79d978b0e829cab83de4c78e80bd014f3210cf0a1a653d880d0aa6760baeaf80",
|
||||
"i686-pc-windows-msvc-0.10.1":
|
||||
"c4e989d479f9fc229302345a64f272be3c249d5fff4a2e722aa3d73c381fb303",
|
||||
"i686-unknown-linux-gnu-0.10.1":
|
||||
"0c4a17893df6e11991483277c5f0bee06d8ea60b6e11b349a9849bfe13a8c5cf",
|
||||
"i686-unknown-linux-musl-0.10.1":
|
||||
"7219a96adde5316489886c0d74749b7248c2c4070170b8e153d9d3f8f9fdfa5e",
|
||||
"powerpc64le-unknown-linux-gnu-0.10.1":
|
||||
"aa2ed9587a9ad5127662da9ceccaa747b941f37cbd9e6d9334c7c6c3286c9587",
|
||||
"riscv64gc-unknown-linux-gnu-0.10.1":
|
||||
"bda96a9ff8be79f780ff4711a2515061fe80d6f135ba55a47c41e1c6739d048e",
|
||||
"s390x-unknown-linux-gnu-0.10.1":
|
||||
"091eeeecfcdb15a954f2488be6b89d8709709003ada81d215d6ca88145826049",
|
||||
"x86_64-apple-darwin-0.10.1":
|
||||
"f61f1122193698a53fc2d4cc6fb5a5849b283817509778ac8f1a7d2a36a218de",
|
||||
"x86_64-pc-windows-msvc-0.10.1":
|
||||
"64c297ef1cd8e3a50966dee20cbe039564cd59e41186e0d1dd38fa4e627fc285",
|
||||
"x86_64-unknown-linux-gnu-0.10.1":
|
||||
"8b5af2d678da1bdae80a5107c934f6ab010c6cdeb2de5b8e07568031d9486051",
|
||||
"x86_64-unknown-linux-musl-0.10.1":
|
||||
"d1a3b08dd9abf9e500541cadd0e2f4b144c99b9265fb00e500c2b5c82a3b4ee8",
|
||||
"aarch64-apple-darwin-0.10.0":
|
||||
"82d4b99dc6ea686695b5ee142ceba03dd3e3eda2b414e94215ab7bce94972fbb",
|
||||
"aarch64-pc-windows-msvc-0.10.0":
|
||||
"614dd3c409d7fb5a98b516d532c98db9b7799a23fb450150e3784338a9ebd903",
|
||||
"aarch64-unknown-linux-gnu-0.10.0":
|
||||
"c300afd5f2d31df039fe6a26a2d68a76b62832098c272a43e1e74ab9efd4fbd7",
|
||||
"aarch64-unknown-linux-musl-0.10.0":
|
||||
"edf1adb1d183730302f87eef9b71bc4e47b4b8058832c3393b0fbcd86f270510",
|
||||
"arm-unknown-linux-musleabihf-0.10.0":
|
||||
"fea6d45bce1e7172192b4a7d3feb9f37c4198c243be1c573c8dacae765a32c53",
|
||||
"armv7-unknown-linux-gnueabihf-0.10.0":
|
||||
"3e8ab76a515884c29c773e01360acb6da61a1351c630377b54ba58918d9673af",
|
||||
"armv7-unknown-linux-musleabihf-0.10.0":
|
||||
"85423cda078ed0313f993ddea6ac897e469885539ce156643ace982bbffb8109",
|
||||
"i686-pc-windows-msvc-0.10.0":
|
||||
"b71bca0987dd12ea09ac6a0e52fdfa89f53601b6074be38366d0592b181f3001",
|
||||
"i686-unknown-linux-gnu-0.10.0":
|
||||
"dbac897653b0d60fb863288587dbacb30140f9725a42718f2c017df7b2d2b3c3",
|
||||
"i686-unknown-linux-musl-0.10.0":
|
||||
"56a211155275dd33731cbbb33aa915d3e7efa59d4436502edaca39ba436c157a",
|
||||
"powerpc64le-unknown-linux-gnu-0.10.0":
|
||||
"677a414608c61e2ecd751364dae9209cc5b76019481968b99b5d5ad7258d2d77",
|
||||
"riscv64gc-unknown-linux-gnu-0.10.0":
|
||||
"9da4019ecfd3440a5d0a0a957d8d5e4c6534ac1e3a10636d55266a22ab4135f8",
|
||||
"s390x-unknown-linux-gnu-0.10.0":
|
||||
"a1b9aa45c1a6b69066179e8d7e3f6e122e0f433ef2ad4e91c0acd1433a083c31",
|
||||
"x86_64-apple-darwin-0.10.0":
|
||||
"664aed584c276f8d79cdc3b7685cd48f5d64657bd6840b06b4b2b0db731b9c99",
|
||||
"x86_64-pc-windows-msvc-0.10.0":
|
||||
"4037b444541f695cd2eb93188a9346de3e334af562381411deade0a31c7bf898",
|
||||
"x86_64-unknown-linux-gnu-0.10.0":
|
||||
"230e328948c92dd1ebad83949c4d56e83813dfe9c6362a4c519e6a227973f1ae",
|
||||
"x86_64-unknown-linux-musl-0.10.0":
|
||||
"312d37f31b6f2c3bfc65668ba0efea9f1f9eaf7bc3209fe1a109e5cf861b95fa",
|
||||
"aarch64-apple-darwin-0.9.30":
|
||||
"03a5d9ec7f7d588446b2ec226d13ff6300055e55365eca8f3fab39f342b0e805",
|
||||
"aarch64-pc-windows-msvc-0.9.30":
|
||||
"cfbc40baf1da11c55eff92ee008f5af3cdbb4c24c40ddb0bbd489b983fadf43f",
|
||||
"aarch64-unknown-linux-gnu-0.9.30":
|
||||
"6aadf3c71600d594e16dabf382cc15282ead4c5ca768599b6bcb43c5004d9aa8",
|
||||
"aarch64-unknown-linux-musl-0.9.30":
|
||||
"b658b56957bceea742ca14f3ef28fb3542adbcedfb8bd5bd718ae255394ccd09",
|
||||
"arm-unknown-linux-musleabihf-0.9.30":
|
||||
"5a7f4cd306363b734dba2d86eb760812cb1211254d36ace01860f9e783df1900",
|
||||
"armv7-unknown-linux-gnueabihf-0.9.30":
|
||||
"bf8d9c2f1b4d0eee9bfb689b5483b1bd4b0b76acbeaaa4d0d68b132574c606ff",
|
||||
"armv7-unknown-linux-musleabihf-0.9.30":
|
||||
"8715a9da643d9e6cb984c2d3e00480849f93f11251d1474cd382cc9d7faeab84",
|
||||
"i686-pc-windows-msvc-0.9.30":
|
||||
"218b7ec0d052836d7ee395d5e0592e5dac7578fd618f439a5d09c1ad36466399",
|
||||
"i686-unknown-linux-gnu-0.9.30":
|
||||
"1bab147179887ebcb5c31e016e9ac9987f687e79f92fd2f0ff9bcedf927b8228",
|
||||
"i686-unknown-linux-musl-0.9.30":
|
||||
"14d8b2e2caa0b470418e551e027f3a8283aa8d09eae79206e7dbcd23a8ffa027",
|
||||
"powerpc64-unknown-linux-gnu-0.9.30":
|
||||
"ac4cd1a021462885932f6023b005a4835cca4c72bb60dec186ee2be4b60dca6f",
|
||||
"powerpc64le-unknown-linux-gnu-0.9.30":
|
||||
"73b8cbc560c6b2fa205358365d4e174abdf50cfcf57dc36a447572c56eba5ae4",
|
||||
"riscv64gc-unknown-linux-gnu-0.9.30":
|
||||
"5e0453d9252aab874a3658a039d4ffdde79dba4096974fcdc945498697dc81cf",
|
||||
"s390x-unknown-linux-gnu-0.9.30":
|
||||
"b35975bb9e5c2c418b428d0316cc6e3c7a6eff710c69212be14005c192f54516",
|
||||
"x86_64-apple-darwin-0.9.30":
|
||||
"ce069bf750567e9a4a31d6e285d1eae75d444d8a281409b641235903943b7681",
|
||||
"x86_64-pc-windows-msvc-0.9.30":
|
||||
"875981be7908295937dee09532bb66d576986d4f223259e171b0c767c885897a",
|
||||
"x86_64-unknown-linux-gnu-0.9.30":
|
||||
"8b3762374972daa7a74bbc6896cc73229ca69a07403dd9f9ea3805a51ffd7582",
|
||||
"x86_64-unknown-linux-musl-0.9.30":
|
||||
"1caf8fe092e2005dd4c134ba515c1aa3eea3d3c143f8a1903bcb58fcdf169365",
|
||||
"aarch64-apple-darwin-0.9.29":
|
||||
"0729ddd5c02df33669b03627aa5d9ac7cde4421657f808d54585e3cda944bb55",
|
||||
"aarch64-pc-windows-msvc-0.9.29":
|
||||
"39f7dce0d2993cd18d67980c012945ea678a99aef199f7afcea522b5bd70ecf7",
|
||||
"aarch64-unknown-linux-gnu-0.9.29":
|
||||
"935b35542b7e25493a551dcb3487af23b72ad284ee8ac6a488a97d02ce2d84ec",
|
||||
"aarch64-unknown-linux-musl-0.9.29":
|
||||
"b1edc94f5d6c36bb28a20f8c8afb400e55a428fcf396b03bf78cb7394f75077c",
|
||||
"arm-unknown-linux-musleabihf-0.9.29":
|
||||
"c72ae74c04668d4cf3143fb11ad5bbd1c9e9a80aaa439cb3e43208c127249202",
|
||||
"armv7-unknown-linux-gnueabihf-0.9.29":
|
||||
"e263645c9ab44e3f7e732b0317da775082f077bb86933be662395eeab97fb3d2",
|
||||
"armv7-unknown-linux-musleabihf-0.9.29":
|
||||
"98ab47dcb345d746b230a359d72a96444b1be21cf24026c653d5c7848c680beb",
|
||||
"i686-pc-windows-msvc-0.9.29":
|
||||
"049a929882a3f4a2d054c9dc44848d2c24175079696e131a57d60d9ab62df81a",
|
||||
"i686-unknown-linux-gnu-0.9.29":
|
||||
"9415828fc2fdacadb56263382a27da6661a89a4bb3a6683d6d864d5c013b7c6a",
|
||||
"i686-unknown-linux-musl-0.9.29":
|
||||
"3ac91c9cccc85c07c0950afc4f45b3e14f2a3e9484f4940366ebab72e71fa8dc",
|
||||
"powerpc64-unknown-linux-gnu-0.9.29":
|
||||
"7feb1fb35fe66b4f83d3bc7776810f708c6609c9be48ceed6ec024b15733101d",
|
||||
"powerpc64le-unknown-linux-gnu-0.9.29":
|
||||
"1f4e1f859868abcf3557afe78b8b7525a938921af745945deef737927a017d82",
|
||||
"riscv64gc-unknown-linux-gnu-0.9.29":
|
||||
"18dc2d3b513c4bfe0fc4b3a67a80f62ce32077f84db343a1f0eb8003ab276732",
|
||||
"s390x-unknown-linux-gnu-0.9.29":
|
||||
"10e6d5dcd72bf99daee6678f6b508d1056e9f1670f6d76c1cfdf02b7560bcb4a",
|
||||
"x86_64-apple-darwin-0.9.29":
|
||||
"d251e48db2a962272a2efeb2771c82c02e40f473193a255e8e5c05eb61112139",
|
||||
"x86_64-pc-windows-msvc-0.9.29":
|
||||
"9825b1a5955d8a432b664e56660641aac8886ed30cd9c59a94aacc68ae9116ce",
|
||||
"x86_64-unknown-linux-gnu-0.9.29":
|
||||
"1ce5212f8f42dc7427a1bd3db4168d6d1abcf81b38d8c82a5b9d0ddc54ceebfc",
|
||||
"x86_64-unknown-linux-musl-0.9.29":
|
||||
"44c93c73e8870e003bda17ab50d433e27d201d0cb28d2bb75351ef1497ffa9db",
|
||||
"aarch64-apple-darwin-0.9.28":
|
||||
"12163fe09eb292d3ad1ea0f132a84485c902e2ff360d57562bf676e6615fcba0",
|
||||
"aarch64-pc-windows-msvc-0.9.28":
|
||||
"081703fa19ae05a49f486f97468f7792e1cdacda403a091b151af7f5bd6f4595",
|
||||
"aarch64-unknown-linux-gnu-0.9.28":
|
||||
"382c342735ff29f8ba4574d88e39bca798bcbac50bff6742710ca9cd8143e7d2",
|
||||
"aarch64-unknown-linux-musl-0.9.28":
|
||||
"eec3249254efac972d2555ff858f8ed20f05b40fbb38ac83b15cf0a2ccc86749",
|
||||
"arm-unknown-linux-musleabihf-0.9.28":
|
||||
"d0df2a9e7db464a567038bd560dc5007e488542c073989334a4a293b8957e1e1",
|
||||
"armv7-unknown-linux-gnueabihf-0.9.28":
|
||||
"6ddf1979609a3f5bdf897965ed6984dacce860ce57c579596bdc4b514c19320b",
|
||||
"armv7-unknown-linux-musleabihf-0.9.28":
|
||||
"e391ba4cc05a3a1096f1ab6cd82fcbed059d048a6ba108b4cb18da311a07c4d5",
|
||||
"i686-pc-windows-msvc-0.9.28":
|
||||
"fb5015efd0db178268312a7a7dcde7b0d3b7d7e0eccd0372a4b6f1dcfc075472",
|
||||
"i686-unknown-linux-gnu-0.9.28":
|
||||
"c0d34d92cb11925530fbc313de7536da3e1d097a442f54668417d241697fb3a2",
|
||||
"i686-unknown-linux-musl-0.9.28":
|
||||
"be1ad4f30d97c95af5105405fc38329d66375cde3de18cd0f9fe73b4581155c7",
|
||||
"powerpc64-unknown-linux-gnu-0.9.28":
|
||||
"6f23bfca0febb001792e7124d0c2ba41ddcfe01d6c030f4a8668ed634a5a582b",
|
||||
"powerpc64le-unknown-linux-gnu-0.9.28":
|
||||
"894ac114f076cffbf041e55e1ad0df759f7bc9dba1291158690781baad38001e",
|
||||
"riscv64gc-unknown-linux-gnu-0.9.28":
|
||||
"e61fa014a0b77acd17f9f366a55cbc0e67b377c4eff13629021a4242cc71eabb",
|
||||
"s390x-unknown-linux-gnu-0.9.28":
|
||||
"af15dc54893b2caecc3604ac68104914b155a8bbf821f667996549e777919a90",
|
||||
"x86_64-apple-darwin-0.9.28":
|
||||
"3a8030881d13b824e5168f5e4d060e715e40753249766bda3d52d6771d93b169",
|
||||
"x86_64-pc-windows-msvc-0.9.28":
|
||||
"9cb567fcd92f31431220ce620787043b946c30b9bb46ca213780e5ef471453be",
|
||||
"x86_64-unknown-linux-gnu-0.9.28":
|
||||
"66ad1822dd9cf96694b95c24f25bc05cff417a65351464da01682a91796d1f2b",
|
||||
"x86_64-unknown-linux-musl-0.9.28":
|
||||
"83cd032167b6b97ac94830608efe11159b3d485654e39fdb0bf84718ef236afe",
|
||||
"aarch64-apple-darwin-0.9.27":
|
||||
"1359538ed8664d172692cf4719ee0933a4a3bfb22fc91b0be1e19e7bdd8f5ef3",
|
||||
"aarch64-pc-windows-msvc-0.9.27":
|
||||
"b448ab228f5d1165b8497e8ca10346af6f652eb8ad4e75e47fa55e8cdb5b60d7",
|
||||
"aarch64-unknown-linux-gnu-0.9.27":
|
||||
"a58b3b77a25620ae15ff3587049b755c7cbf3eaa7df187620b3e6c3dbf71daa0",
|
||||
"aarch64-unknown-linux-musl-0.9.27":
|
||||
"f80e97e1154a06e42143a173831289336ca9e34a67096ab070346958153e8e52",
|
||||
"arm-unknown-linux-musleabihf-0.9.27":
|
||||
"b80f4db9254b9ddec4b576190bdf15723e948f37f648d9b273be2e153d05f820",
|
||||
"armv7-unknown-linux-gnueabihf-0.9.27":
|
||||
"03b45c99ca940739c2a093f6a514da3dd858b3bc1e8c957c16c1832e30b30c28",
|
||||
"armv7-unknown-linux-musleabihf-0.9.27":
|
||||
"da43ee6e2f17b4646e35e2d55ce6a021fdf47c06601a6ae8b827de7bb7b3b02f",
|
||||
"i686-pc-windows-msvc-0.9.27":
|
||||
"f47831a97b8a1bc7c7211905c1e517cc2f4ef84df877f2a283c49609275db0fa",
|
||||
"i686-unknown-linux-gnu-0.9.27":
|
||||
"fdf3067e0c05d39b849ad48fbbc2b58919f70a686a40506c643d32688ceba1a9",
|
||||
"i686-unknown-linux-musl-0.9.27":
|
||||
"3c1f8c2b148ebf884311558aaff32b9fb5b68fe4f4242e3e3765381bb594386a",
|
||||
"powerpc64-unknown-linux-gnu-0.9.27":
|
||||
"c3cbda5118b06f2261d32f4802adfdc71f618f808df0c6a3184695a6ffecb88a",
|
||||
"powerpc64le-unknown-linux-gnu-0.9.27":
|
||||
"9011f6085cee3921c9fce82ce03041ca97aacc8cab86b7a5791faa71fa5f2712",
|
||||
"riscv64gc-unknown-linux-gnu-0.9.27":
|
||||
"7193628620c2c50c2d6632ea8e53a4ab5313f7e8003ddedd9e999f48b6d2c222",
|
||||
"s390x-unknown-linux-gnu-0.9.27":
|
||||
"5b055f02f2c8e5086ae1d05cf70d32d66982d27d8469ed896a65067fac2001d2",
|
||||
"x86_64-apple-darwin-0.9.27":
|
||||
"3977309c5c79984c13c55d2d1cd7aa114a718eb29436c5bdb4bdfa08bf243438",
|
||||
"x86_64-pc-windows-msvc-0.9.27":
|
||||
"c3bf465d5f2b93c836f369aec9f3fa8350843f24abd5f710bb74e72440b82898",
|
||||
"x86_64-unknown-linux-gnu-0.9.27":
|
||||
"8636e693ea0e05f5f4294b161f816c4d8df065267fdb0405cfb84c8e326991fa",
|
||||
"x86_64-unknown-linux-musl-0.9.27":
|
||||
"9f269bfb9c2e80808c373902af6a4af6cd5f4b4668b28c44aa09639cfed925c5",
|
||||
"aarch64-apple-darwin-0.9.26":
|
||||
"fcf0a9ea6599c6ae28a4c854ac6da76f2c889354d7c36ce136ef071f7ab9721f",
|
||||
"aarch64-pc-windows-msvc-0.9.26":
|
||||
|
||||
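Each key in the table above combines a target triple with a uv version. These keys are derived from uv release-asset URLs of the form `.../download/<version>/uv-<target>.tar.gz.sha256`. A self-contained sketch of that derivation, mirroring the `getKey` helper in this repository's checksum-update script (the URL below is the example from the script's own comment):

```typescript
// Derive a "<target>-<version>" key from a uv release-asset URL, e.g.
// https://github.com/astral-sh/uv/releases/download/0.3.2/uv-aarch64-apple-darwin.tar.gz.sha256
// maps to "aarch64-apple-darwin-0.3.2". Source tarballs carry no target
// triple, so they yield no key.
function getKey(downloadUrl: string): string | undefined {
  const parts = downloadUrl.split("/");
  const fileName = parts[parts.length - 1];
  if (fileName.startsWith("source")) {
    return undefined;
  }
  const name = fileName.split(".")[0].split("uv-")[1];
  const version = parts[parts.length - 2];
  return `${name}-${version}`;
}
```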
@@ -1,59 +1,34 @@
import { promises as fs } from "node:fs";
import * as tc from "@actions/tool-cache";
import { KNOWN_CHECKSUMS } from "./known-checksums";

export interface ChecksumEntry {
key: string;
checksum: string;
}

export async function updateChecksums(
filePath: string,
downloadUrls: string[],
checksumEntries: ChecksumEntry[],
): Promise<void> {
await fs.rm(filePath);
await fs.appendFile(
filePath,
"// AUTOGENERATED_DO_NOT_EDIT\nexport const KNOWN_CHECKSUMS: { [key: string]: string } = {\n",
);
let firstLine = true;
for (const downloadUrl of downloadUrls) {
const key = getKey(downloadUrl);
if (key === undefined) {
const deduplicatedEntries = new Map<string, string>();

for (const entry of checksumEntries) {
if (deduplicatedEntries.has(entry.key)) {
continue;
}
const checksum = await getOrDownloadChecksum(key, downloadUrl);
if (!firstLine) {
await fs.appendFile(filePath, ",\n");
}
await fs.appendFile(filePath, `  "${key}":\n    "${checksum}"`);
firstLine = false;
}
await fs.appendFile(filePath, ",\n};\n");
}

function getKey(downloadUrl: string): string | undefined {
// https://github.com/astral-sh/uv/releases/download/0.3.2/uv-aarch64-apple-darwin.tar.gz.sha256
const parts = downloadUrl.split("/");
const fileName = parts[parts.length - 1];
if (fileName.startsWith("source")) {
return undefined;
deduplicatedEntries.set(entry.key, entry.checksum);
}
const name = fileName.split(".")[0].split("uv-")[1];
const version = parts[parts.length - 2];
return `${name}-${version}`;
}

async function getOrDownloadChecksum(
key: string,
downloadUrl: string,
): Promise<string> {
let checksum = "";
if (key in KNOWN_CHECKSUMS) {
checksum = KNOWN_CHECKSUMS[key];
} else {
const content = await downloadAssetContent(downloadUrl);
checksum = content.split(" ")[0].trim();
}
return checksum;
}
const body = [...deduplicatedEntries.entries()]
.map(([key, checksum]) => `  "${key}":\n    "${checksum}"`)
.join(",\n");

async function downloadAssetContent(downloadUrl: string): Promise<string> {
const downloadPath = await tc.downloadTool(downloadUrl);
const content = await fs.readFile(downloadPath, "utf8");
return content;
const content =
"// AUTOGENERATED_DO_NOT_EDIT\n" +
"export const KNOWN_CHECKSUMS: { [key: string]: string } = {\n" +
body +
(body === "" ? "" : ",\n") +
"};\n";

await fs.writeFile(filePath, content);
}
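The net effect of the rewrite above is that the generated module is no longer built with one `appendFile` call per entry; instead, entries are deduplicated by key (first occurrence wins), the body is rendered in memory, and the file is written once. A minimal sketch of that approach, assuming the `ChecksumEntry` shape from the diff (the function name `renderKnownChecksums` is illustrative, not the action's actual API):

```typescript
interface ChecksumEntry {
  key: string; // e.g. "aarch64-apple-darwin-0.10.9"
  checksum: string; // hex-encoded sha256
}

// Deduplicate entries by key (first occurrence wins), then render the
// autogenerated module body in a single pass.
function renderKnownChecksums(entries: ChecksumEntry[]): string {
  const deduplicated = new Map<string, string>();
  for (const entry of entries) {
    if (!deduplicated.has(entry.key)) {
      deduplicated.set(entry.key, entry.checksum);
    }
  }
  const body = [...deduplicated.entries()]
    .map(([key, checksum]) => `  "${key}":\n    "${checksum}"`)
    .join(",\n");
  return (
    "// AUTOGENERATED_DO_NOT_EDIT\n" +
    "export const KNOWN_CHECKSUMS: { [key: string]: string } = {\n" +
    body +
    (body === "" ? "" : ",\n") +
    "};\n"
  );
}
```

Building the string in memory and calling `writeFile` once also avoids the partially written file a failure mid-append could leave behind.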
@@ -2,20 +2,22 @@ import { promises as fs } from "node:fs";
import * as path from "node:path";
import * as core from "@actions/core";
import * as tc from "@actions/tool-cache";
import type { Endpoints } from "@octokit/types";
import * as pep440 from "@renovatebot/pep440";
import * as semver from "semver";
import { OWNER, REPO, TOOL_CACHE_NAME } from "../utils/constants";
import { Octokit } from "../utils/octokit";
import { TOOL_CACHE_NAME, VERSIONS_NDJSON_URL } from "../utils/constants";
import type { Architecture, Platform } from "../utils/platforms";
import { validateChecksum } from "./checksum/checksum";
import {
getDownloadUrl,
getAllVersions as getAllManifestVersions,
getLatestKnownVersion as getLatestVersionInManifest,
getManifestArtifact,
} from "./version-manifest";

type Release =
Endpoints["GET /repos/{owner}/{repo}/releases"]["response"]["data"][number];
import {
getAllVersions as getAllVersionsFromNdjson,
getArtifact as getArtifactFromNdjson,
getHighestSatisfyingVersion as getHighestSatisfyingVersionFromNdjson,
getLatestVersion as getLatestVersionFromNdjson,
} from "./versions-client";

export function tryGetFromToolCache(
arch: Architecture,
@@ -32,19 +34,26 @@ export function tryGetFromToolCache(
return { installedPath, version: resolvedVersion };
}

export async function downloadVersionFromGithub(
export async function downloadVersionFromNdjson(
platform: Platform,
arch: Architecture,
version: string,
checkSum: string | undefined,
githubToken: string,
): Promise<{ version: string; cachedToolDir: string }> {
const artifact = `uv-${arch}-${platform}`;
const extension = getExtension(platform);
const downloadUrl = `https://github.com/${OWNER}/${REPO}/releases/download/${version}/${artifact}${extension}`;
const artifact = await getArtifactFromNdjson(version, arch, platform);

if (!artifact) {
throw new Error(
`Could not find artifact for version ${version}, arch ${arch}, platform ${platform} in ${VERSIONS_NDJSON_URL} .`,
);
}

// For the default astral-sh/versions source, checksum validation relies on
// user input or the built-in KNOWN_CHECKSUMS table, not NDJSON sha256 values.
return await downloadVersion(
downloadUrl,
artifact,
artifact.url,
`uv-${arch}-${platform}`,
platform,
arch,
version,
@@ -54,38 +63,32 @@
}

export async function downloadVersionFromManifest(
manifestUrl: string | undefined,
manifestUrl: string,
platform: Platform,
arch: Architecture,
version: string,
checkSum: string | undefined,
githubToken: string,
): Promise<{ version: string; cachedToolDir: string }> {
const downloadUrl = await getDownloadUrl(
const artifact = await getManifestArtifact(
manifestUrl,
version,
arch,
platform,
);
if (!downloadUrl) {
core.info(
`manifest-file does not contain version ${version}, arch ${arch}, platform ${platform}. Falling back to GitHub releases.`,
);
return await downloadVersionFromGithub(
platform,
arch,
version,
checkSum,
githubToken,
if (!artifact) {
throw new Error(
`manifest-file does not contain version ${version}, arch ${arch}, platform ${platform}.`,
);
}

return await downloadVersion(
downloadUrl,
artifact.downloadUrl,
`uv-${arch}-${platform}`,
platform,
arch,
version,
checkSum,
resolveChecksum(checkSum, artifact.checksum),
githubToken,
);
}
@@ -96,7 +99,7 @@ async function downloadVersion(
platform: Platform,
arch: Architecture,
version: string,
checkSum: string | undefined,
checksum: string | undefined,
githubToken: string,
): Promise<{ version: string; cachedToolDir: string }> {
core.info(`Downloading uv from "${downloadUrl}" ...`);
@@ -105,14 +108,14 @@ async function downloadVersion(
undefined,
githubToken,
);
await validateChecksum(checkSum, downloadPath, arch, platform, version);
await validateChecksum(checksum, downloadPath, arch, platform, version);

let uvDir: string;
if (platform === "pc-windows-msvc") {
// On windows extracting the zip does not create an intermediate directory
// On windows extracting the zip does not create an intermediate directory.
try {
// Try tar first as it's much faster, but only bsdtar supports zip files,
// so this my fail if another tar, like gnu tar, ends up being used.
// so this may fail if another tar, like gnu tar, ends up being used.
uvDir = await tc.extractTar(downloadPath, undefined, "x");
} catch (err) {
core.info(
@@ -127,6 +130,7 @@ async function downloadVersion(
const extractedDir = await tc.extractTar(downloadPath);
uvDir = path.join(extractedDir, artifactName);
}

const cachedToolDir = await tc.cacheDir(
uvDir,
TOOL_CACHE_NAME,
@@ -136,14 +140,22 @@ async function downloadVersion(
return { cachedToolDir, version: version };
}

function resolveChecksum(
checkSum: string | undefined,
manifestChecksum?: string,
): string | undefined {
return checkSum !== undefined && checkSum !== ""
? checkSum
: manifestChecksum;
}

function getExtension(platform: Platform): string {
return platform === "pc-windows-msvc" ? ".zip" : ".tar.gz";
|
||||
}
|
||||
|
||||
export async function resolveVersion(
|
||||
versionInput: string,
|
||||
manifestFile: string | undefined,
|
||||
githubToken: string,
|
||||
manifestUrl: string | undefined,
|
||||
resolutionStrategy: "highest" | "lowest" = "highest",
|
||||
): Promise<string> {
|
||||
core.debug(`Resolving version: ${versionInput}`);
|
||||
@@ -155,15 +167,15 @@ export async function resolveVersion(
|
||||
if (resolveVersionSpecifierToLatest) {
|
||||
core.info("Found minimum version specifier, using latest version");
|
||||
}
|
||||
if (manifestFile) {
|
||||
if (manifestUrl !== undefined) {
|
||||
version =
|
||||
versionInput === "latest" || resolveVersionSpecifierToLatest
|
||||
? await getLatestVersionInManifest(manifestFile)
|
||||
? await getLatestVersionInManifest(manifestUrl)
|
||||
: versionInput;
|
||||
} else {
|
||||
version =
|
||||
versionInput === "latest" || resolveVersionSpecifierToLatest
|
||||
? await getLatestVersion(githubToken)
|
||||
? await getLatestVersionFromNdjson()
|
||||
: versionInput;
|
||||
}
|
||||
if (tc.isExplicitVersion(version)) {
|
||||
@@ -175,91 +187,44 @@ export async function resolveVersion(
|
||||
}
|
||||
return version;
|
||||
}
|
||||
const availableVersions = await getAvailableVersions(githubToken);
|
||||
|
||||
if (manifestUrl === undefined && resolutionStrategy === "highest") {
|
||||
const resolvedVersion =
|
||||
await getHighestSatisfyingVersionFromNdjson(version);
|
||||
if (resolvedVersion !== undefined) {
|
||||
core.debug(`Resolved version from NDJSON stream: ${resolvedVersion}`);
|
||||
return resolvedVersion;
|
||||
}
|
||||
|
||||
throw new Error(`No version found for ${version}`);
|
||||
}
|
||||
|
||||
const availableVersions = await getAvailableVersions(manifestUrl);
|
||||
core.debug(`Available versions: ${availableVersions}`);
|
||||
const resolvedVersion =
|
||||
resolutionStrategy === "lowest"
|
||||
? minSatisfying(availableVersions, version)
|
||||
: maxSatisfying(availableVersions, version);
|
||||
|
||||
if (resolvedVersion === undefined) {
|
||||
throw new Error(`No version found for ${version}`);
|
||||
}
|
||||
|
||||
return resolvedVersion;
|
||||
}
|
||||
|
||||
async function getAvailableVersions(githubToken: string): Promise<string[]> {
|
||||
core.info("Getting available versions from GitHub API...");
|
||||
try {
|
||||
const octokit = new Octokit({
|
||||
auth: githubToken,
|
||||
});
|
||||
return await getReleaseTagNames(octokit);
|
||||
} catch (err) {
|
||||
if ((err as Error).message.includes("Bad credentials")) {
|
||||
core.info(
|
||||
"No (valid) GitHub token provided. Falling back to anonymous. Requests might be rate limited.",
|
||||
);
|
||||
const octokit = new Octokit();
|
||||
return await getReleaseTagNames(octokit);
|
||||
}
|
||||
throw err;
|
||||
}
|
||||
}
|
||||
|
||||
async function getReleaseTagNames(octokit: Octokit): Promise<string[]> {
|
||||
const response: Release[] = await octokit.paginate(
|
||||
octokit.rest.repos.listReleases,
|
||||
{
|
||||
owner: OWNER,
|
||||
repo: REPO,
|
||||
},
|
||||
);
|
||||
const releaseTagNames = response.map((release) => release.tag_name);
|
||||
if (releaseTagNames.length === 0) {
|
||||
throw Error(
|
||||
"Github API request failed while getting releases. Check the GitHub status page for outages. Try again later.",
|
||||
async function getAvailableVersions(
|
||||
manifestUrl: string | undefined,
|
||||
): Promise<string[]> {
|
||||
if (manifestUrl !== undefined) {
|
||||
core.info(
|
||||
`Getting available versions from manifest-file ${manifestUrl} ...`,
|
||||
);
|
||||
}
|
||||
return releaseTagNames;
|
||||
}
|
||||
|
||||
async function getLatestVersion(githubToken: string) {
|
||||
core.info("Getting latest version from GitHub API...");
|
||||
const octokit = new Octokit({
|
||||
auth: githubToken,
|
||||
});
|
||||
|
||||
let latestRelease: { tag_name: string } | undefined;
|
||||
try {
|
||||
latestRelease = await getLatestRelease(octokit);
|
||||
} catch (err) {
|
||||
if ((err as Error).message.includes("Bad credentials")) {
|
||||
core.info(
|
||||
"No (valid) GitHub token provided. Falling back to anonymous. Requests might be rate limited.",
|
||||
);
|
||||
const octokit = new Octokit();
|
||||
latestRelease = await getLatestRelease(octokit);
|
||||
} else {
|
||||
core.error(
|
||||
"Github API request failed while getting latest release. Check the GitHub status page for outages. Try again later.",
|
||||
);
|
||||
throw err;
|
||||
}
|
||||
return await getAllManifestVersions(manifestUrl);
|
||||
}
|
||||
|
||||
if (!latestRelease) {
|
||||
throw new Error("Could not determine latest release.");
|
||||
}
|
||||
core.debug(`Latest version: ${latestRelease.tag_name}`);
|
||||
return latestRelease.tag_name;
|
||||
}
|
||||
|
||||
async function getLatestRelease(octokit: Octokit) {
|
||||
const { data: latestRelease } = await octokit.rest.repos.getLatestRelease({
|
||||
owner: OWNER,
|
||||
repo: REPO,
|
||||
});
|
||||
return latestRelease;
|
||||
core.info(`Getting available versions from ${VERSIONS_NDJSON_URL} ...`);
|
||||
return await getAllVersionsFromNdjson();
|
||||
}
|
||||
|
||||
function maxSatisfying(
|
||||
|
||||
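The `resolveChecksum` helper introduced in this diff encodes a small precedence rule: an explicit, non-empty user-supplied checksum wins; only then does the manifest-provided checksum apply. A standalone sketch of that rule:

```typescript
// Precedence rule from the diff above: a non-empty user-supplied checksum
// takes priority; otherwise fall back to the manifest checksum (if any).
function resolveChecksum(
  checkSum: string | undefined,
  manifestChecksum?: string,
): string | undefined {
  return checkSum !== undefined && checkSum !== ""
    ? checkSum
    : manifestChecksum;
}
```

So `validateChecksum` receives the user's value when one is set, and the manifest's `sha256` only as a fallback.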
80  src/download/legacy-version-manifest.ts  Normal file
@@ -0,0 +1,80 @@
import * as core from "@actions/core";

export interface ManifestEntry {
  arch: string;
  platform: string;
  version: string;
  downloadUrl: string;
  checksum?: string;
  variant?: string;
  archiveFormat?: string;
}

interface LegacyManifestEntry {
  arch: string;
  platform: string;
  version: string;
  downloadUrl: string;
  checksum?: string;
}

const warnedLegacyManifestUrls = new Set<string>();

export function parseLegacyManifestEntries(
  parsedEntries: unknown[],
  manifestUrl: string,
): ManifestEntry[] {
  warnAboutLegacyManifestFormat(manifestUrl);

  return parsedEntries.map((entry, index) => {
    if (!isLegacyManifestEntry(entry)) {
      throw new Error(
        `Invalid legacy manifest-file entry at index ${index} in ${manifestUrl}.`,
      );
    }

    return {
      arch: entry.arch,
      checksum: entry.checksum,
      downloadUrl: entry.downloadUrl,
      platform: entry.platform,
      version: entry.version,
    };
  });
}

export function clearLegacyManifestWarnings(): void {
  warnedLegacyManifestUrls.clear();
}

function warnAboutLegacyManifestFormat(manifestUrl: string): void {
  if (warnedLegacyManifestUrls.has(manifestUrl)) {
    return;
  }

  warnedLegacyManifestUrls.add(manifestUrl);
  core.warning(
    `manifest-file ${manifestUrl} uses the legacy JSON array format, which is deprecated. Please migrate to the astral-sh/versions NDJSON format before the next major release.`,
  );
}

function isLegacyManifestEntry(value: unknown): value is LegacyManifestEntry {
  if (!isRecord(value)) {
    return false;
  }

  const checksumIsValid =
    typeof value.checksum === "string" || value.checksum === undefined;

  return (
    typeof value.arch === "string" &&
    checksumIsValid &&
    typeof value.downloadUrl === "string" &&
    typeof value.platform === "string" &&
    typeof value.version === "string"
  );
}

function isRecord(value: unknown): value is Record<string, unknown> {
  return typeof value === "object" && value !== null;
}
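The legacy type guard above follows a common pattern for narrowing `unknown` JSON: first confirm the value is a non-null object, then check each field's type, with optional fields also accepting `undefined`. A condensed sketch of that pattern (names shortened for illustration):

```typescript
// Minimal sketch of the field-by-field narrowing used by
// isLegacyManifestEntry: required fields must be strings,
// optional fields may also be undefined.
interface LegacyEntry {
  arch: string;
  platform: string;
  version: string;
  downloadUrl: string;
  checksum?: string;
}

function isRecord(value: unknown): value is Record<string, unknown> {
  return typeof value === "object" && value !== null;
}

function isLegacyEntry(value: unknown): value is LegacyEntry {
  if (!isRecord(value)) {
    return false;
  }
  const checksumIsValid =
    typeof value.checksum === "string" || value.checksum === undefined;
  return (
    typeof value.arch === "string" &&
    checksumIsValid &&
    typeof value.downloadUrl === "string" &&
    typeof value.platform === "string" &&
    typeof value.version === "string"
  );
}
```

After the guard returns `true`, TypeScript treats the value as a fully typed entry, so the mapping in `parseLegacyManifestEntries` needs no casts.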
39  src/download/variant-selection.ts  Normal file
@@ -0,0 +1,39 @@
interface VariantAwareEntry {
  variant?: string;
}

export function selectDefaultVariant<T extends VariantAwareEntry>(
  entries: T[],
  duplicateEntryDescription: string,
): T {
  const firstEntry = entries[0];
  if (firstEntry === undefined) {
    throw new Error("selectDefaultVariant requires at least one candidate.");
  }

  if (entries.length === 1) {
    return firstEntry;
  }

  const defaultEntries = entries.filter((entry) =>
    isDefaultVariant(entry.variant),
  );
  if (defaultEntries.length === 1) {
    return defaultEntries[0];
  }

  throw new Error(
    `${duplicateEntryDescription} with variants ${formatVariants(entries)}. setup-uv currently requires a single default variant for duplicate platform entries.`,
  );
}

function isDefaultVariant(variant: string | undefined): boolean {
  return variant === undefined || variant === "default";
}

function formatVariants<T extends VariantAwareEntry>(entries: T[]): string {
  return entries
    .map((entry) => entry.variant ?? "default")
    .sort((left, right) => left.localeCompare(right))
    .join(", ");
}
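The selection rule above can be exercised in isolation. This simplified version (the error-description argument of the real helper is omitted here) shows the three cases: a single candidate is returned as-is, among duplicates exactly one default or variant-less entry must exist, and anything else is rejected:

```typescript
// Simplified sketch of the selectDefaultVariant rule defined above.
interface VariantAwareEntry {
  variant?: string;
}

function pickDefaultVariant<T extends VariantAwareEntry>(entries: T[]): T {
  if (entries.length === 0) {
    throw new Error("requires at least one candidate");
  }
  // A single candidate needs no disambiguation.
  if (entries.length === 1) {
    return entries[0];
  }
  // Among duplicates, only the default (or variant-less) entry qualifies.
  const defaults = entries.filter(
    (entry) => entry.variant === undefined || entry.variant === "default",
  );
  if (defaults.length === 1) {
    return defaults[0];
  }
  throw new Error("ambiguous or missing default variant");
}
```

Requiring exactly one default keeps the action deterministic when the versions feed publishes several builds (e.g. `musl` plus `default`) for the same platform.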
@@ -1,91 +1,169 @@
import { promises as fs } from "node:fs";
import { join } from "node:path";
import * as core from "@actions/core";
import * as semver from "semver";
import { fetch } from "../utils/fetch";
import {
  clearLegacyManifestWarnings,
  type ManifestEntry,
  parseLegacyManifestEntries,
} from "./legacy-version-manifest";
import { selectDefaultVariant } from "./variant-selection";
import { type NdjsonVersion, parseVersionData } from "./versions-client";

const localManifestFile = join(__dirname, "..", "..", "version-manifest.json");

interface ManifestEntry {
  version: string;
  artifactName: string;
  arch: string;
  platform: string;
export interface ManifestArtifact {
  downloadUrl: string;
  checksum?: string;
  archiveFormat?: string;
}

const cachedManifestEntries = new Map<string, ManifestEntry[]>();

export async function getLatestKnownVersion(
  manifestUrl: string | undefined,
  manifestUrl: string,
): Promise<string> {
  const manifestEntries = await getManifestEntries(manifestUrl);
  return manifestEntries.reduce((a, b) =>
    semver.gt(a.version, b.version) ? a : b,
  ).version;
  const versions = await getAllVersions(manifestUrl);
  const latestVersion = versions.reduce((latest, current) =>
    semver.gt(current, latest) ? current : latest,
  );

  return latestVersion;
}

export async function getDownloadUrl(
  manifestUrl: string | undefined,
export async function getAllVersions(manifestUrl: string): Promise<string[]> {
  const manifestEntries = await getManifestEntries(manifestUrl);
  return [...new Set(manifestEntries.map((entry) => entry.version))];
}

export async function getManifestArtifact(
  manifestUrl: string,
  version: string,
  arch: string,
  platform: string,
): Promise<string | undefined> {
): Promise<ManifestArtifact | undefined> {
  const manifestEntries = await getManifestEntries(manifestUrl);
  const entry = manifestEntries.find(
    (entry) =>
      entry.version === version &&
      entry.arch === arch &&
      entry.platform === platform,
  const entry = selectManifestEntry(
    manifestEntries,
    manifestUrl,
    version,
    arch,
    platform,
  );
  return entry ? entry.downloadUrl : undefined;

  if (!entry) {
    return undefined;
  }

  return {
    archiveFormat: entry.archiveFormat,
    checksum: entry.checksum,
    downloadUrl: entry.downloadUrl,
  };
}

export function clearManifestCache(): void {
  cachedManifestEntries.clear();
  clearLegacyManifestWarnings();
}

async function getManifestEntries(
  manifestUrl: string | undefined,
): Promise<ManifestEntry[]> {
  let data: string;
  if (manifestUrl !== undefined) {
    core.info(`Fetching manifest-file from: ${manifestUrl}`);
    const response = await fetch(manifestUrl, {});
    if (!response.ok) {
      throw new Error(
        `Failed to fetch manifest-file: ${response.status} ${response.statusText}`,
      );
    }
    data = await response.text();
  } else {
    core.info("manifest-file not provided, reading from local file.");
    const fileContent = await fs.readFile(localManifestFile);
    data = fileContent.toString();
  }

  return JSON.parse(data);
}

export async function updateVersionManifest(
  manifestUrl: string,
  downloadUrls: string[],
): Promise<void> {
  const manifest: ManifestEntry[] = [];

  for (const downloadUrl of downloadUrls) {
    const urlParts = downloadUrl.split("/");
    const version = urlParts[urlParts.length - 2];
    const artifactName = urlParts[urlParts.length - 1];
    if (!artifactName.startsWith("uv-")) {
      continue;
    }
    if (artifactName.startsWith("uv-installer")) {
      continue;
    }
    const artifactParts = artifactName.split(".")[0].split("-");
    manifest.push({
      arch: artifactParts[1],
      artifactName: artifactName,
      downloadUrl: downloadUrl,
      platform: artifactName.split(`uv-${artifactParts[1]}-`)[1].split(".")[0],
      version: version,
    });
): Promise<ManifestEntry[]> {
  const cachedEntries = cachedManifestEntries.get(manifestUrl);
  if (cachedEntries !== undefined) {
    core.debug(`Using cached manifest-file from: ${manifestUrl}`);
    return cachedEntries;
  }

  core.info(`Fetching manifest-file from: ${manifestUrl}`);
  const response = await fetch(manifestUrl, {});
  if (!response.ok) {
    throw new Error(
      `Failed to fetch manifest-file: ${response.status} ${response.statusText}`,
    );
  }

  const data = await response.text();
  const parsedEntries = parseManifestEntries(data, manifestUrl);
  cachedManifestEntries.set(manifestUrl, parsedEntries);

  return parsedEntries;
}

function parseManifestEntries(
  data: string,
  manifestUrl: string,
): ManifestEntry[] {
  const trimmed = data.trim();
  if (trimmed === "") {
    throw new Error(`manifest-file at ${manifestUrl} is empty.`);
  }

  const parsedAsJson = tryParseJson(trimmed);
  if (Array.isArray(parsedAsJson)) {
    return parseLegacyManifestEntries(parsedAsJson, manifestUrl);
  }

  const versions = parseVersionData(trimmed, manifestUrl);
  return mapNdjsonVersionsToManifestEntries(versions, manifestUrl);
}

function mapNdjsonVersionsToManifestEntries(
  versions: NdjsonVersion[],
  manifestUrl: string,
): ManifestEntry[] {
  const manifestEntries: ManifestEntry[] = [];

  for (const versionData of versions) {
    for (const artifact of versionData.artifacts) {
      const [arch, ...platformParts] = artifact.platform.split("-");
      if (arch === undefined || platformParts.length === 0) {
        throw new Error(
          `Invalid artifact platform '${artifact.platform}' in manifest-file ${manifestUrl}.`,
        );
      }

      manifestEntries.push({
        arch,
        archiveFormat: artifact.archive_format,
        checksum: artifact.sha256,
        downloadUrl: artifact.url,
        platform: platformParts.join("-"),
        variant: artifact.variant,
        version: versionData.version,
      });
    }
  }

  return manifestEntries;
}

function selectManifestEntry(
  manifestEntries: ManifestEntry[],
  manifestUrl: string,
  version: string,
  arch: string,
  platform: string,
): ManifestEntry | undefined {
  const matches = manifestEntries.filter(
    (candidate) =>
      candidate.version === version &&
      candidate.arch === arch &&
      candidate.platform === platform,
  );

  if (matches.length === 0) {
    return undefined;
  }

  return selectDefaultVariant(
    matches,
    `manifest-file ${manifestUrl} contains multiple artifacts for version ${version}, arch ${arch}, platform ${platform}`,
  );
}

function tryParseJson(value: string): unknown {
  try {
    return JSON.parse(value);
  } catch {
    return undefined;
  }
  core.debug(`Updating manifest-file: ${JSON.stringify(manifest)}`);
  await fs.writeFile(manifestUrl, JSON.stringify(manifest));
}
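When NDJSON versions are mapped to manifest entries above, each artifact's `platform` string is split on the first dash: the leading segment becomes the architecture and the remainder the platform triple. A standalone sketch of that split (the function name here is illustrative, not from the diff):

```typescript
// Sketch of the platform-string split used in
// mapNdjsonVersionsToManifestEntries: "x86_64-unknown-linux-gnu"
// becomes arch "x86_64" and platform "unknown-linux-gnu".
function splitArtifactPlatform(platform: string): {
  arch: string;
  platform: string;
} {
  const [arch, ...platformParts] = platform.split("-");
  // A valid identifier needs a non-empty arch and at least one more segment.
  if (!arch || platformParts.length === 0) {
    throw new Error(`Invalid artifact platform '${platform}'`);
  }
  return { arch, platform: platformParts.join("-") };
}
```

Joining the remainder with `"-"` is what keeps multi-segment triples like `unknown-linux-gnu` intact.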
380  src/download/versions-client.ts  Normal file
@@ -0,0 +1,380 @@
import * as core from "@actions/core";
import * as pep440 from "@renovatebot/pep440";
import * as semver from "semver";
import { VERSIONS_NDJSON_URL } from "../utils/constants";
import { fetch } from "../utils/fetch";
import { selectDefaultVariant } from "./variant-selection";

export interface NdjsonArtifact {
  platform: string;
  variant?: string;
  url: string;
  archive_format: string;
  sha256: string;
}

export interface NdjsonVersion {
  version: string;
  artifacts: NdjsonArtifact[];
}

export interface ArtifactResult {
  url: string;
  sha256: string;
  archiveFormat: string;
}

const cachedVersionData = new Map<string, NdjsonVersion[]>();
const cachedLatestVersionData = new Map<string, NdjsonVersion>();
const cachedVersionLookup = new Map<string, Map<string, NdjsonVersion>>();

export async function fetchVersionData(
  url: string = VERSIONS_NDJSON_URL,
): Promise<NdjsonVersion[]> {
  const cachedVersions = cachedVersionData.get(url);
  if (cachedVersions !== undefined) {
    core.debug(`Using cached NDJSON version data from ${url}`);
    return cachedVersions;
  }

  core.info(`Fetching version data from ${url} ...`);
  const { versions } = await readVersionData(url);
  cacheCompleteVersionData(url, versions);
  return versions;
}

export function parseVersionData(
  data: string,
  sourceDescription: string,
): NdjsonVersion[] {
  const versions: NdjsonVersion[] = [];

  for (const [index, line] of data.split("\n").entries()) {
    const trimmed = line.trim();
    if (trimmed === "") {
      continue;
    }

    versions.push(parseVersionLine(trimmed, sourceDescription, index + 1));
  }

  if (versions.length === 0) {
    throw new Error(`No version data found in ${sourceDescription}.`);
  }

  return versions;
}

export async function getLatestVersion(): Promise<string> {
  const cachedVersions = cachedVersionData.get(VERSIONS_NDJSON_URL);
  const cachedLatestVersion =
    cachedVersions?.[0] ?? cachedLatestVersionData.get(VERSIONS_NDJSON_URL);
  if (cachedLatestVersion !== undefined) {
    core.debug(
      `Latest version from NDJSON cache: ${cachedLatestVersion.version}`,
    );
    return cachedLatestVersion.version;
  }

  const latestVersion = await findVersionData(() => true);
  if (!latestVersion) {
    throw new Error("No versions found in NDJSON data");
  }

  core.debug(`Latest version from NDJSON: ${latestVersion.version}`);
  return latestVersion.version;
}

export async function getAllVersions(): Promise<string[]> {
  const versions = await fetchVersionData();
  return versions.map((versionData) => versionData.version);
}

export async function getHighestSatisfyingVersion(
  versionSpecifier: string,
  url: string = VERSIONS_NDJSON_URL,
): Promise<string | undefined> {
  const matchedVersion = await findVersionData(
    (candidate) => versionSatisfies(candidate.version, versionSpecifier),
    url,
  );

  return matchedVersion?.version;
}

export async function getArtifact(
  version: string,
  arch: string,
  platform: string,
): Promise<ArtifactResult | undefined> {
  const versionData = await getVersionData(version);
  if (!versionData) {
    core.debug(`Version ${version} not found in NDJSON data`);
    return undefined;
  }

  const targetPlatform = `${arch}-${platform}`;
  const matchingArtifacts = versionData.artifacts.filter(
    (candidate) => candidate.platform === targetPlatform,
  );

  if (matchingArtifacts.length === 0) {
    core.debug(
      `Artifact for ${targetPlatform} not found in version ${version}. Available platforms: ${versionData.artifacts
        .map((candidate) => candidate.platform)
        .join(", ")}`,
    );
    return undefined;
  }

  const artifact = selectArtifact(matchingArtifacts, version, targetPlatform);

  return {
    archiveFormat: artifact.archive_format,
    sha256: artifact.sha256,
    url: artifact.url,
  };
}

export function clearCache(url?: string): void {
  if (url === undefined) {
    cachedVersionData.clear();
    cachedLatestVersionData.clear();
    cachedVersionLookup.clear();
    return;
  }

  cachedVersionData.delete(url);
  cachedLatestVersionData.delete(url);
  cachedVersionLookup.delete(url);
}

function selectArtifact(
  artifacts: NdjsonArtifact[],
  version: string,
  targetPlatform: string,
): NdjsonArtifact {
  return selectDefaultVariant(
    artifacts,
    `Multiple artifacts found for ${targetPlatform} in version ${version}`,
  );
}

async function getVersionData(
  version: string,
  url: string = VERSIONS_NDJSON_URL,
): Promise<NdjsonVersion | undefined> {
  const cachedVersions = cachedVersionData.get(url);
  if (cachedVersions !== undefined) {
    return cachedVersions.find((candidate) => candidate.version === version);
  }

  const cachedVersion = cachedVersionLookup.get(url)?.get(version);
  if (cachedVersion !== undefined) {
    return cachedVersion;
  }

  return await findVersionData(
    (candidate) => candidate.version === version,
    url,
  );
}

async function findVersionData(
  predicate: (versionData: NdjsonVersion) => boolean,
  url: string = VERSIONS_NDJSON_URL,
): Promise<NdjsonVersion | undefined> {
  const cachedVersions = cachedVersionData.get(url);
  if (cachedVersions !== undefined) {
    return cachedVersions.find(predicate);
  }

  const { matchedVersion, versions, complete } = await readVersionData(
    url,
    predicate,
  );

  if (complete) {
    cacheCompleteVersionData(url, versions);
  }

  return matchedVersion;
}

async function readVersionData(
  url: string,
  stopWhen?: (versionData: NdjsonVersion) => boolean,
): Promise<{
  complete: boolean;
  matchedVersion: NdjsonVersion | undefined;
  versions: NdjsonVersion[];
}> {
  const response = await fetch(url, {});
  if (!response.ok) {
    throw new Error(
      `Failed to fetch version data: ${response.status} ${response.statusText}`,
    );
  }

  if (response.body === null) {
    const body = await response.text();
    const versions = parseVersionData(body, url);
    const matchedVersion = stopWhen
      ? versions.find((candidate) => stopWhen(candidate))
      : undefined;
    return { complete: true, matchedVersion, versions };
  }

  const versions: NdjsonVersion[] = [];
  let lineNumber = 0;
  let matchedVersion: NdjsonVersion | undefined;
  let buffer = "";
  const decoder = new TextDecoder();
  const reader = response.body.getReader();

  const processLine = (line: string): boolean => {
    const trimmed = line.trim();
    if (trimmed === "") {
      return false;
    }

    lineNumber += 1;
    const versionData = parseVersionLine(trimmed, url, lineNumber);
    if (versions.length === 0) {
      cachedLatestVersionData.set(url, versionData);
    }

    versions.push(versionData);
    cacheVersion(url, versionData);

    if (stopWhen?.(versionData) === true) {
      matchedVersion = versionData;
      return true;
    }

    return false;
  };

  while (true) {
    const { done, value } = await reader.read();
    if (done) {
      buffer += decoder.decode();
      break;
    }

    buffer += decoder.decode(value, { stream: true });
    let newlineIndex = buffer.indexOf("\n");
    while (newlineIndex !== -1) {
      const line = buffer.slice(0, newlineIndex);
      buffer = buffer.slice(newlineIndex + 1);

      if (processLine(line)) {
        await reader.cancel();
        return { complete: false, matchedVersion, versions };
      }

      newlineIndex = buffer.indexOf("\n");
    }
  }

  if (buffer.trim() !== "" && processLine(buffer)) {
    return { complete: true, matchedVersion, versions };
  }

  if (versions.length === 0) {
    throw new Error(`No version data found in ${url}.`);
  }

  return { complete: true, matchedVersion, versions };
}

function cacheCompleteVersionData(
  url: string,
  versions: NdjsonVersion[],
): void {
  cachedVersionData.set(url, versions);

  if (versions[0] !== undefined) {
    cachedLatestVersionData.set(url, versions[0]);
  }

  const versionLookup = new Map<string, NdjsonVersion>();
  for (const versionData of versions) {
    versionLookup.set(versionData.version, versionData);
  }

  cachedVersionLookup.set(url, versionLookup);
}

function cacheVersion(url: string, versionData: NdjsonVersion): void {
  let versionLookup = cachedVersionLookup.get(url);
  if (versionLookup === undefined) {
    versionLookup = new Map<string, NdjsonVersion>();
    cachedVersionLookup.set(url, versionLookup);
  }

  versionLookup.set(versionData.version, versionData);
}

function parseVersionLine(
  line: string,
  sourceDescription: string,
  lineNumber: number,
): NdjsonVersion {
  let parsed: unknown;
  try {
    parsed = JSON.parse(line);
  } catch (error) {
    throw new Error(
      `Failed to parse version data from ${sourceDescription} at line ${lineNumber}: ${(error as Error).message}`,
    );
  }

  if (!isNdjsonVersion(parsed)) {
    throw new Error(
      `Invalid NDJSON record in ${sourceDescription} at line ${lineNumber}.`,
    );
  }

  return parsed;
}

function versionSatisfies(version: string, versionSpecifier: string): boolean {
  return (
    semver.satisfies(version, versionSpecifier) ||
    pep440.satisfies(version, versionSpecifier)
  );
}

function isNdjsonVersion(value: unknown): value is NdjsonVersion {
  if (!isRecord(value)) {
    return false;
  }

  if (typeof value.version !== "string" || !Array.isArray(value.artifacts)) {
    return false;
  }

  return value.artifacts.every(isNdjsonArtifact);
}

function isNdjsonArtifact(value: unknown): value is NdjsonArtifact {
  if (!isRecord(value)) {
    return false;
  }

  const variantIsValid =
    typeof value.variant === "string" || value.variant === undefined;

  return (
    typeof value.archive_format === "string" &&
    typeof value.platform === "string" &&
    typeof value.sha256 === "string" &&
    typeof value.url === "string" &&
    variantIsValid
  );
}

function isRecord(value: unknown): value is Record<string, unknown> {
  return typeof value === "object" && value !== null;
}
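The streaming loop in `readVersionData` relies on a classic chunk-buffering scheme: network chunks arrive at arbitrary boundaries, so a line is only processed once its terminating newline has been seen, and the trailing partial line is flushed at the end. A self-contained sketch of just that buffering logic (the function name is illustrative):

```typescript
// Sketch of the newline-buffering used by readVersionData: chunks are
// appended to a buffer, complete lines are emitted as newlines appear,
// and a non-empty trailing fragment is flushed after the last chunk.
function splitStreamedLines(chunks: string[]): string[] {
  const lines: string[] = [];
  let buffer = "";
  for (const chunk of chunks) {
    buffer += chunk;
    let newlineIndex = buffer.indexOf("\n");
    while (newlineIndex !== -1) {
      lines.push(buffer.slice(0, newlineIndex));
      buffer = buffer.slice(newlineIndex + 1);
      newlineIndex = buffer.indexOf("\n");
    }
  }
  if (buffer.trim() !== "") {
    lines.push(buffer);
  }
  return lines;
}
```

Because each NDJSON record is one line, this lets the client stop reading (and cancel the response body) as soon as `stopWhen` matches, instead of downloading the whole feed.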
@@ -59,23 +59,40 @@ async function saveCache(): Promise<void> {
     }
 
     const actualCachePath = getUvCachePath();
-    await saveCacheToKey(
-      cacheKey,
-      actualCachePath,
-      STATE_CACHE_MATCHED_KEY,
-      "uv cache",
-      `Cache path ${actualCachePath} does not exist on disk. This likely indicates that there are no dependencies to cache. Consider disabling the cache input if it is not needed.`,
-    );
+    if (!fs.existsSync(actualCachePath)) {
+      if (ignoreNothingToCache) {
+        core.info(
+          "No cacheable uv cache paths were found. Ignoring because ignore-nothing-to-cache is enabled.",
+        );
+      } else {
+        throw new Error(
+          `Cache path ${actualCachePath} does not exist on disk. This likely indicates that there are no dependencies to cache. Consider disabling the cache input if it is not needed.`,
+        );
+      }
+    } else {
+      await saveCacheToKey(
+        cacheKey,
+        actualCachePath,
+        STATE_CACHE_MATCHED_KEY,
+        "uv cache",
+      );
+    }
   }
 
   if (cachePython) {
+    if (!fs.existsSync(pythonDir)) {
+      core.warning(
+        `Python cache path ${pythonDir} does not exist on disk. Skipping Python cache save because no managed Python installation was found. If you want uv to install managed Python instead of using a system interpreter, set UV_PYTHON_PREFERENCE=only-managed.`,
+      );
+      return;
+    }
+
     const pythonCacheKey = `${cacheKey}-python`;
     await saveCacheToKey(
       pythonCacheKey,
       pythonDir,
       STATE_PYTHON_CACHE_MATCHED_KEY,
       "Python cache",
-      `Python cache path ${pythonDir} does not exist on disk. This likely indicates that there are no Python installations to cache. Consider disabling the cache input if it is not needed.`,
     );
   }
 }
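The control flow this hunk introduces (log-and-skip versus throw when the cache path is missing) can be sketched in isolation. `decideCacheSave` is a hypothetical helper, not part of the action; the real code uses `fs.existsSync` and `@actions/cache` directly:

```typescript
// Sketch of the save decision: if the cache path is missing, either
// skip quietly (ignore-nothing-to-cache) or fail the step; otherwise save.
function decideCacheSave(
  pathExists: boolean,
  ignoreNothingToCache: boolean,
): "save" | "skip" {
  if (!pathExists) {
    if (ignoreNothingToCache) {
      return "skip"; // the action logs an info message and continues
    }
    throw new Error("Cache path does not exist on disk.");
  }
  return "save";
}

console.log(decideCacheSave(true, false)); // "save"
console.log(decideCacheSave(false, true)); // "skip"
```

The key change over the old version is that the existence check now happens *before* calling `saveCacheToKey`, so the error-message parameter could be dropped from its signature (see the next hunk).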
@@ -119,7 +136,6 @@ async function saveCacheToKey(
   cachePath: string,
   stateKey: string,
   cacheName: string,
-  pathNotExistErrorMessage: string,
 ): Promise<void> {
   const matchedKey = core.getState(stateKey);
 
@@ -131,26 +147,8 @@ async function saveCacheToKey(
   }
 
   core.info(`Including ${cacheName} path: ${cachePath}`);
-  if (!fs.existsSync(cachePath) && !ignoreNothingToCache) {
-    throw new Error(pathNotExistErrorMessage);
-  }
-
-  try {
-    await cache.saveCache([cachePath], cacheKey);
-    core.info(`${cacheName} saved with key: ${cacheKey}`);
-  } catch (e) {
-    if (
-      e instanceof Error &&
-      e.message ===
-        "Path Validation Error: Path(s) specified in the action for caching do(es) not exist, hence no cache is being saved."
-    ) {
-      core.info(
-        `No cacheable ${cacheName} paths were found. Ignoring because ignore-nothing-to-save is enabled.`,
-      );
-    } else {
-      throw e;
-    }
-  }
+  await cache.saveCache([cachePath], cacheKey);
+  core.info(`${cacheName} saved with key: ${cacheKey}`);
 }
 
 run();
@@ -5,6 +5,7 @@ import * as exec from "@actions/exec";
 import { restoreCache } from "./cache/restore-cache";
 import {
   downloadVersionFromManifest,
+  downloadVersionFromNdjson,
   resolveVersion,
   tryGetFromToolCache,
 } from "./download/download-version";
@@ -24,6 +25,7 @@ import {
   resolutionStrategy,
   toolBinDir,
   toolDir,
+  venvPath,
   versionFile as versionFileInput,
   version as versionInput,
   workingDirectory,
@@ -36,6 +38,8 @@ import {
 } from "./utils/platforms";
 import { getUvVersionFromFile } from "./version/resolve";
 
+const sourceDir = __dirname;
+
 async function getPythonVersion(): Promise<string> {
   if (pythonVersion !== "") {
     return pythonVersion;
@@ -138,14 +142,23 @@ async function setupUv(
     };
   }
 
-  const downloadVersionResult = await downloadVersionFromManifest(
-    manifestFile,
-    platform,
-    arch,
-    resolvedVersion,
-    checkSum,
-    githubToken,
-  );
+  const downloadVersionResult =
+    manifestFile !== undefined
+      ? await downloadVersionFromManifest(
+          manifestFile,
+          platform,
+          arch,
+          resolvedVersion,
+          checkSum,
+          githubToken,
+        )
+      : await downloadVersionFromNdjson(
+          platform,
+          arch,
+          resolvedVersion,
+          checkSum,
+          githubToken,
+        );
 
   return {
     uvDir: downloadVersionResult.cachedToolDir,
@@ -157,12 +170,7 @@ async function determineVersion(
   manifestFile: string | undefined,
 ): Promise<string> {
   if (versionInput !== "") {
-    return await resolveVersion(
-      versionInput,
-      manifestFile,
-      githubToken,
-      resolutionStrategy,
-    );
+    return await resolveVersion(versionInput, manifestFile, resolutionStrategy);
   }
   if (versionFileInput !== "") {
     const versionFromFile = getUvVersionFromFile(versionFileInput);
@@ -174,7 +182,6 @@ async function determineVersion(
     return await resolveVersion(
       versionFromFile,
       manifestFile,
-      githubToken,
       resolutionStrategy,
     );
   }
@@ -192,7 +199,6 @@ async function determineVersion(
   return await resolveVersion(
     versionFromUvToml || versionFromPyproject || "latest",
     manifestFile,
-    githubToken,
     resolutionStrategy,
   );
 }
@@ -269,12 +275,16 @@ async function activateEnvironment(): Promise<void> {
       "UV_NO_MODIFY_PATH and activate-environment cannot be used together.",
     );
   }
-  const execArgs = ["venv", ".venv", "--directory", workingDirectory];
 
-  core.info("Activating python venv...");
-  await exec.exec("uv", execArgs);
+  core.info(`Creating and activating python venv at ${venvPath}...`);
+  await exec.exec("uv", [
+    "venv",
+    venvPath,
+    "--directory",
+    workingDirectory,
+    "--clear",
+  ]);
 
-  const venvPath = path.resolve(`${workingDirectory}${path.sep}.venv`);
   let venvBinPath = `${venvPath}${path.sep}bin`;
   if (process.platform === "win32") {
     venvBinPath = `${venvPath}${path.sep}Scripts`;
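The bin-directory selection at the end of this hunk follows the standard venv layout: executables land in `bin` on POSIX and in `Scripts` on Windows. A sketch with a hypothetical helper name:

```typescript
import * as path from "node:path";

// Hypothetical helper mirroring the logic above: pick the venv's
// executable directory by platform string (as in process.platform).
function venvBinDir(venvPath: string, platform: string): string {
  const subdir = platform === "win32" ? "Scripts" : "bin";
  return path.join(venvPath, subdir);
}

console.log(venvBinDir("/work/.venv", "linux")); // "/work/.venv/bin" on POSIX hosts
console.log(venvBinDir("/work/.venv", "win32")); // ends with "Scripts"
```

The action appends this directory to `PATH` so that `python` and installed console scripts resolve inside the venv for subsequent workflow steps.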
@@ -300,7 +310,7 @@ function setCacheDir(): void {
 
 function addMatchers(): void {
   if (addProblemMatchers) {
-    const matchersPath = path.join(__dirname, `..${path.sep}..`, ".github");
+    const matchersPath = path.join(sourceDir, "..", "..", ".github");
     core.info(`##[add-matcher]${path.join(matchersPath, "python.json")}`);
   }
 }
81  src/update-known-checksums.ts  Normal file
@@ -0,0 +1,81 @@
+import * as core from "@actions/core";
+import * as semver from "semver";
+import { KNOWN_CHECKSUMS } from "./download/checksum/known-checksums";
+import {
+  type ChecksumEntry,
+  updateChecksums,
+} from "./download/checksum/update-known-checksums";
+import {
+  fetchVersionData,
+  getLatestVersion,
+  type NdjsonVersion,
+} from "./download/versions-client";
+
+const VERSION_IN_CHECKSUM_KEY_PATTERN =
+  /-(\d+\.\d+\.\d+(?:[-+][0-9A-Za-z.-]+)?)$/;
+
+async function run(): Promise<void> {
+  const checksumFilePath = process.argv.slice(2)[0];
+  if (!checksumFilePath) {
+    throw new Error(
+      "Missing checksum file path. Usage: node dist/update-known-checksums/index.cjs <checksum-file-path>",
+    );
+  }
+
+  const latestVersion = await getLatestVersion();
+  const latestKnownVersion = getLatestKnownVersionFromChecksums();
+
+  if (semver.lte(latestVersion, latestKnownVersion)) {
+    core.info(
+      `Latest release (${latestVersion}) is not newer than the latest known version (${latestKnownVersion}). Skipping update.`,
+    );
+    return;
+  }
+
+  const versions = await fetchVersionData();
+  const checksumEntries = extractChecksumsFromNdjson(versions);
+  await updateChecksums(checksumFilePath, checksumEntries);
+
+  core.setOutput("latest-version", latestVersion);
+}
+
+function getLatestKnownVersionFromChecksums(): string {
+  const versions = new Set<string>();
+
+  for (const key of Object.keys(KNOWN_CHECKSUMS)) {
+    const version = extractVersionFromChecksumKey(key);
+    if (version !== undefined) {
+      versions.add(version);
+    }
+  }
+
+  const latestVersion = [...versions].sort(semver.rcompare)[0];
+  if (!latestVersion) {
+    throw new Error("Could not determine latest known version from checksums.");
+  }
+
+  return latestVersion;
+}
+
+function extractVersionFromChecksumKey(key: string): string | undefined {
+  return key.match(VERSION_IN_CHECKSUM_KEY_PATTERN)?.[1];
+}
+
+function extractChecksumsFromNdjson(
+  versions: NdjsonVersion[],
+): ChecksumEntry[] {
+  const checksums: ChecksumEntry[] = [];
+
+  for (const version of versions) {
+    for (const artifact of version.artifacts) {
+      checksums.push({
+        checksum: artifact.sha256,
+        key: `${artifact.platform}-${version.version}`,
+      });
+    }
+  }
+
+  return checksums;
+}
+
+run();
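The checksum keys built by `extractChecksumsFromNdjson` have the shape `<platform>-<version>`, which is exactly what `VERSION_IN_CHECKSUM_KEY_PATTERN` unpacks again when looking for the latest known version. A standalone sketch (the sample keys are made up):

```typescript
// Same pattern as in src/update-known-checksums.ts: capture a trailing
// semver (with optional pre-release/build suffix) after a hyphen.
const VERSION_IN_CHECKSUM_KEY_PATTERN =
  /-(\d+\.\d+\.\d+(?:[-+][0-9A-Za-z.-]+)?)$/;

function extractVersionFromChecksumKey(key: string): string | undefined {
  return key.match(VERSION_IN_CHECKSUM_KEY_PATTERN)?.[1];
}

console.log(extractVersionFromChecksumKey("x86_64-unknown-linux-gnu-0.5.1")); // "0.5.1"
console.log(extractVersionFromChecksumKey("aarch64-apple-darwin-1.2.3-rc.1")); // "1.2.3-rc.1"
console.log(extractVersionFromChecksumKey("no-version-here")); // undefined
```

Anchoring with `$` means only the version at the very end of the key is captured, so hyphens inside the platform triple never confuse the match.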
@@ -1,63 +0,0 @@
-import * as core from "@actions/core";
-import type { Endpoints } from "@octokit/types";
-import * as semver from "semver";
-import { updateChecksums } from "./download/checksum/update-known-checksums";
-import {
-  getLatestKnownVersion,
-  updateVersionManifest,
-} from "./download/version-manifest";
-import { OWNER, REPO } from "./utils/constants";
-import { Octokit } from "./utils/octokit";
-
-type Release =
-  Endpoints["GET /repos/{owner}/{repo}/releases"]["response"]["data"][number];
-
-async function run(): Promise<void> {
-  const checksumFilePath = process.argv.slice(2)[0];
-  const versionsManifestFile = process.argv.slice(2)[1];
-  const githubToken = process.argv.slice(2)[2];
-
-  const octokit = new Octokit({
-    auth: githubToken,
-  });
-
-  const { data: latestRelease } = await octokit.rest.repos.getLatestRelease({
-    owner: OWNER,
-    repo: REPO,
-  });
-
-  const latestKnownVersion = await getLatestKnownVersion(undefined);
-
-  if (semver.lte(latestRelease.tag_name, latestKnownVersion)) {
-    core.info(
-      `Latest release (${latestRelease.tag_name}) is not newer than the latest known version (${latestKnownVersion}). Skipping update.`,
-    );
-    return;
-  }
-
-  const releases: Release[] = await octokit.paginate(
-    octokit.rest.repos.listReleases,
-    {
-      owner: OWNER,
-      repo: REPO,
-    },
-  );
-  const checksumDownloadUrls: string[] = releases.flatMap((release) =>
-    release.assets
-      .filter((asset) => asset.name.endsWith(".sha256"))
-      .map((asset) => asset.browser_download_url),
-  );
-  await updateChecksums(checksumFilePath, checksumDownloadUrls);
-
-  const artifactDownloadUrls: string[] = releases.flatMap((release) =>
-    release.assets
-      .filter((asset) => !asset.name.endsWith(".sha256"))
-      .map((asset) => asset.browser_download_url),
-  );
-
-  await updateVersionManifest(versionsManifestFile, artifactDownloadUrls);
-
-  core.setOutput("latest-version", latestRelease.tag_name);
-}
-
-run();
@@ -1,5 +1,5 @@
 export const REPO = "uv";
 export const OWNER = "astral-sh";
 export const TOOL_CACHE_NAME = "uv";
-export const STATE_UV_PATH = "uv-path";
-export const STATE_UV_VERSION = "uv-version";
+export const VERSIONS_NDJSON_URL =
+  "https://raw.githubusercontent.com/astral-sh/versions/main/v1/uv.ndjson";
@@ -14,6 +14,7 @@ export const version = core.getInput("version");
 export const versionFile = getVersionFile();
 export const pythonVersion = core.getInput("python-version");
 export const activateEnvironment = core.getBooleanInput("activate-environment");
+export const venvPath = getVenvPath();
 export const checkSum = core.getInput("checksum");
 export const enableCache = getEnableCache();
 export const restoreCache = core.getInput("restore-cache") === "true";
@@ -45,6 +46,18 @@ function getVersionFile(): string {
   return versionFileInput;
 }
 
+function getVenvPath(): string {
+  const venvPathInput = core.getInput("venv-path");
+  if (venvPathInput !== "") {
+    if (!activateEnvironment) {
+      core.warning("venv-path is only used when activate-environment is true");
+    }
+    const tildeExpanded = expandTilde(venvPathInput);
+    return normalizePath(resolveRelativePath(tildeExpanded));
+  }
+  return normalizePath(resolveRelativePath(".venv"));
+}
+
 function getEnableCache(): boolean {
   const enableCacheInput = core.getInput("enable-cache");
   if (enableCacheInput === "auto") {
@@ -194,6 +207,19 @@ function expandTilde(input: string): string {
   return input;
 }
 
+function normalizePath(inputPath: string): string {
+  const normalized = path.normalize(inputPath);
+  const root = path.parse(normalized).root;
+
+  // Remove any trailing path separators, except when the whole path is the root.
+  let trimmed = normalized;
+  while (trimmed.length > root.length && trimmed.endsWith(path.sep)) {
+    trimmed = trimmed.slice(0, -1);
+  }
+
+  return trimmed;
+}
+
 function resolveRelativePath(inputPath: string): string {
   const hasNegation = inputPath.startsWith("!");
   const pathWithoutNegation = hasNegation ? inputPath.substring(1) : inputPath;
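The `normalizePath` helper added above trims the trailing separator that `path.normalize` can leave in place, while never reducing a bare root path. A self-contained copy for experimentation (POSIX paths assumed in the examples; on Windows the same logic applies with `\` and drive roots):

```typescript
import * as path from "node:path";

// Same logic as the getVenvPath helper above: normalize, then strip
// trailing separators unless the whole path IS the root.
function normalizePath(inputPath: string): string {
  const normalized = path.normalize(inputPath);
  const root = path.parse(normalized).root;

  let trimmed = normalized;
  while (trimmed.length > root.length && trimmed.endsWith(path.sep)) {
    trimmed = trimmed.slice(0, -1);
  }

  return trimmed;
}

console.log(normalizePath("/tmp/venv///")); // "/tmp/venv" on POSIX
console.log(normalizePath("/")); // "/" (root is preserved)
```

Guarding on `trimmed.length > root.length` is what keeps `"/"` (or `"C:\\"` on Windows) intact while still trimming `"/tmp/venv/"`.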
@@ -1,34 +0,0 @@
-import type { OctokitOptions } from "@octokit/core";
-import { Octokit as Core } from "@octokit/core";
-import {
-  type PaginateInterface,
-  paginateRest,
-} from "@octokit/plugin-paginate-rest";
-import { legacyRestEndpointMethods } from "@octokit/plugin-rest-endpoint-methods";
-import { fetch as customFetch } from "./fetch";
-
-export type { RestEndpointMethodTypes } from "@octokit/plugin-rest-endpoint-methods";
-
-const DEFAULTS = {
-  baseUrl: "https://api.github.com",
-  userAgent: "setup-uv",
-};
-
-const OctokitWithPlugins = Core.plugin(paginateRest, legacyRestEndpointMethods);
-
-export const Octokit = OctokitWithPlugins.defaults(function buildDefaults(
-  options: OctokitOptions,
-): OctokitOptions {
-  return {
-    ...DEFAULTS,
-    ...options,
-    request: {
-      fetch: customFetch,
-      ...options.request,
-    },
-  };
-});
-
-export type Octokit = InstanceType<typeof OctokitWithPlugins> & {
-  paginate: PaginateInterface;
-};
@@ -13,6 +13,7 @@ export type Architecture =
   | "x86_64"
   | "aarch64"
   | "s390x"
+  | "riscv64gc"
   | "powerpc64le";
 
 export function getArch(): Architecture | undefined {
@@ -21,6 +22,7 @@ export function getArch(): Architecture | undefined {
   arm64: "aarch64",
   ia32: "i686",
   ppc64: "powerpc64le",
+  riscv64: "riscv64gc",
   s390x: "s390x",
   x64: "x86_64",
 };
@@ -106,10 +108,16 @@ function getLinuxOSNameVersion(): string {
   const content = fs.readFileSync(file, "utf8");
   const id = parseOsReleaseValue(content, "ID");
   const versionId = parseOsReleaseValue(content, "VERSION_ID");
+  // Fallback for rolling releases (debian:unstable/testing, arch, etc.)
+  // that don't have VERSION_ID but have VERSION_CODENAME
+  const versionCodename = parseOsReleaseValue(content, "VERSION_CODENAME");
 
   if (id && versionId) {
     return `${id}-${versionId}`;
   }
+  if (id && versionCodename) {
+    return `${id}-${versionCodename}`;
+  }
 } catch {
   // Try next file
 }
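The fallback added here prefers `ID` + `VERSION_ID` and only then `ID` + `VERSION_CODENAME`. A sketch with a simplified stand-in parser (the real `parseOsReleaseValue` is defined elsewhere in the action; the quoting rules below are an assumption, good enough for typical `/etc/os-release` contents):

```typescript
// Simplified stand-in: read KEY=value or KEY="value" from os-release text.
function parseOsReleaseValue(content: string, key: string): string | undefined {
  const match = content.match(new RegExp(`^${key}="?([^"\\n]*)"?$`, "m"));
  return match?.[1];
}

// Mirrors the fallback order above: VERSION_ID first, then VERSION_CODENAME.
function osNameVersion(content: string): string | undefined {
  const id = parseOsReleaseValue(content, "ID");
  const versionId = parseOsReleaseValue(content, "VERSION_ID");
  const codename = parseOsReleaseValue(content, "VERSION_CODENAME");
  if (id && versionId) return `${id}-${versionId}`;
  if (id && codename) return `${id}-${codename}`;
  return undefined;
}

console.log(osNameVersion('ID=ubuntu\nVERSION_ID="24.04"\n')); // "ubuntu-24.04"
console.log(osNameVersion('ID=debian\nVERSION_CODENAME=trixie\n')); // "debian-trixie"
```

This OS name-version string feeds into the cache key, so a rolling release without `VERSION_ID` still gets a stable, distinguishable key.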
@@ -1,12 +1,12 @@
 {
   "compilerOptions": {
-    "esModuleInterop": true /* Enables emit interoperability between CommonJS and ES Modules via creation of namespace objects for all imports. Implies 'allowSyntheticDefaultImports'. */,
-    "module": "nodenext" /* Specify module code generation: 'none', 'commonjs', 'amd', 'system', 'umd', 'es2015', or 'ESNext'. */,
-    "noImplicitAny": true /* Raise error on expressions and declarations with an implied 'any' type. */,
-    "outDir": "./lib" /* Redirect output structure to the directory. */,
-    "rootDir": "./src" /* Specify the root directory of input files. Use to control the output directory structure with --outDir. */,
-    "strict": true /* Enable all strict type-checking options. */,
-    "target": "ES2022" /* Specify ECMAScript target version: 'ES3' (default), 'ES5', 'ES2015', 'ES2016', 'ES2017', 'ES2018', 'ES2019' or 'ESNEXT'. */
+    "esModuleInterop": true,
+    "isolatedModules": true,
+    "module": "esnext",
+    "moduleResolution": "bundler",
+    "noImplicitAny": true,
+    "strict": true,
+    "target": "ES2022"
   },
-  "exclude": ["node_modules", "**/*.test.ts"]
+  "include": ["src/**/*.ts"]
 }
29178  version-manifest.json
File diff suppressed because it is too large