Mirror of https://github.com/astral-sh/setup-uv.git (synced 2026-03-14 17:14:58 +00:00)

Compare commits: 7 commits, v7.4...speed-up-v

Commits:

- 01149c4575
- fd8f376b22
- f9070de1ea
- cadb67bdc9
- e06108dd0a
- 0f6ec07aaf
- 821e5c9815
.agents/skills/dependabot-pr-rollup/SKILL.md (new file, 48 lines)
@@ -0,0 +1,48 @@

---
name: dependabot-pr-rollup
description: Find open Dependabot PRs for the current GitHub repo, compare each PR head to its base branch, replay only the net dependency changes in a fresh worktree and branch, run npm validation, and optionally commit, push, and open a PR. Use when you want to batch or manually replicate active Dependabot updates.
license: MIT
compatibility: Requires git, git worktree, gh CLI auth, npm, and a GitHub repo with an origin remote.
---

# Dependabot PR Rollup

## When to use

Use this skill when the user wants to:

- find all open Dependabot PRs in the current repo
- reproduce their net effect in one local branch
- validate the result with the repo's standard npm checks
- optionally commit, push, and open a PR

## Workflow

1. Inspect the current checkout state, but do not reuse a dirty worktree.
2. List open Dependabot PRs with `gh pr list --state open --author app/dependabot`.
3. For each PR, collect the title, base branch, head branch, changed files, and relevant diffs.
4. Compare each PR head against `origin/<base>` instead of trusting the PR title. Dependabot PRs can already be partially merged, superseded by newer versions, or have no remaining net effect.
5. Create a new worktree and branch from `origin/<base>`.
6. Reproduce only the remaining dependency changes in the new worktree.
   - Inspect `package.json` before editing.
   - Run `npm ci --ignore-scripts` before applying updates.
   - Use `npm install ... --ignore-scripts` for direct dependency changes so `package-lock.json` stays in sync.
7. Run `npm run all`.
8. If requested, commit the changed source, lockfile, and generated artifacts, then push and open a PR.

## Repo-specific notes

- Use `gh` for GitHub operations.
- Keep the user's original checkout untouched by working in a separate worktree.
- In this repo, `npm run all` is the safest validation command because it runs build, check, package, and test.
- If dependency changes affect bundled output, include the regenerated `dist/` files.

## Report back

Always report:

- open Dependabot PRs found
- which PRs required no net changes
- new branch name
- new worktree path
- files changed
- `npm run all` result
- if applicable, commit SHA and PR URL
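The isolation idea in the workflow (steps 1 and 5: a fresh worktree and branch so the original checkout stays untouched) can be sketched against a throwaway repo. All repo, path, and branch names below are invented for the demo; the skill itself branches from `origin/<base>`:

```shell
# Demo of the worktree isolation step; names here are made up.
set -eu
tmp=$(mktemp -d)
cd "$tmp"
git init -q base
cd base
git config user.email "demo@example.com"
git config user.name "demo"
echo '{}' > package.json
git add package.json
git commit -qm "init"
# New worktree and branch from the current ref (the skill uses origin/<base>).
git worktree add ../rollup-worktree -b dependabot-rollup HEAD
# The original checkout is untouched; edits happen only in ../rollup-worktree.
git -C ../rollup-worktree rev-parse --abbrev-ref HEAD   # prints dependabot-rollup
```

Because the worktree shares the object store with the original checkout, the rollup branch can later be pushed from either location.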
.github/copilot-instructions.md (vendored, 263 lines)
@@ -1,263 +0,0 @@

# Copilot Instructions for setup-uv

This document provides essential information for GitHub Copilot coding agents working on the `astral-sh/setup-uv` repository.

## Repository Overview

**setup-uv** is a GitHub Action that sets up the [uv](https://docs.astral.sh/uv/) Python package installer in GitHub Actions workflows. It's a TypeScript-based action that downloads uv binaries, manages caching, handles version resolution, and configures the environment for subsequent workflow steps.

### Key Features

- Downloads and installs specific uv versions from GitHub releases
- Supports version resolution from config files (pyproject.toml, uv.toml, .tool-versions)
- Implements intelligent caching for both uv cache and Python installations
- Provides cross-platform support (Linux, macOS, Windows, including ARM architectures)
- Includes problem matchers for Python error reporting
- Supports environment activation and custom tool directories

## Repository Structure

**Size**: Small-medium repository (~50 source files, ~400 total files including dependencies)
**Languages**: TypeScript (primary), JavaScript (compiled output), JSON (configuration)
**Runtime**: Node.js 24 (GitHub Actions runtime)
**Key Dependencies**: @actions/core, @actions/cache, @actions/tool-cache, @octokit/core

### Core Architecture

```
src/
├── setup-uv.ts               # Main entry point and orchestration
├── save-cache.ts             # Post-action cache saving logic
├── update-known-versions.ts  # Maintenance script for version updates
├── cache/                    # Cache management functionality
├── download/                 # Version resolution and binary downloading
├── utils/                    # Input parsing, platform detection, configuration
└── version/                  # Version resolution from various file formats
```

### Key Files and Locations

- **Action Definition**: `action.yml` - Defines all inputs/outputs and entry points
- **Main Source**: `src/setup-uv.ts` - Primary action logic
- **Configuration**: `biome.json` (linting), `tsconfig.json` (TypeScript), `jest.config.js` (testing)
- **Compiled Output**: `dist/` - Contains compiled Node.js bundles (auto-generated, committed)
- **Test Fixtures**: `__tests__/fixtures/` - Sample projects for different configuration scenarios
- **Workflows**: `.github/workflows/test.yml` - Comprehensive CI/CD pipeline

## Build and Development Process

### Prerequisites

- Node.js 24+ (matches GitHub Actions runtime)
- npm (included with Node.js)

### Essential Commands (ALWAYS run in this order)

#### 1. Install Dependencies

```bash
npm ci --ignore-scripts
```

**Timing**: ~20-30 seconds
**Note**: Always run this first after cloning or when package.json changes

#### 2. Build TypeScript

```bash
npm run build
```

**Timing**: ~5-10 seconds
**Purpose**: Compiles TypeScript source to JavaScript in the `lib/` directory

#### 3. Lint and Format Code

```bash
npm run check
```

**Timing**: ~2-5 seconds
**Tool**: Biome (replaces ESLint/Prettier)
**Auto-fixes**: Formatting, import organization, basic linting issues

#### 4. Package for Distribution

```bash
npm run package
```

**Timing**: ~20-30 seconds
**Purpose**: Creates bundled distributions in `dist/` using @vercel/ncc
**Critical**: This step MUST be run before committing - the `dist/` files are used by GitHub Actions

#### 5. Run Tests

```bash
npm test
```

**Timing**: ~10-15 seconds
**Framework**: Jest with TypeScript support
**Coverage**: Unit tests for version resolution, input parsing, checksum validation

#### 6. Complete Validation (Recommended)

```bash
npm run all
```

**Timing**: ~60-90 seconds
**Purpose**: Runs build → check → package → test in sequence
**Use**: Before making pull requests or when unsure about build state

### Important Build Notes

**CRITICAL**: Always run `npm run package` after making code changes. The `dist/` directory contains the compiled bundles that GitHub Actions actually executes. Forgetting this step will cause your changes to have no effect.

**TypeScript Warnings**: You may see ts-jest warnings about "isolatedModules" - these are harmless and don't affect functionality.

**Biome**: This project uses Biome instead of ESLint/Prettier. Run `npm run check` to fix formatting and linting issues automatically.

## Testing Strategy

### Unit Tests

- **Location**: `__tests__/` directory
- **Framework**: Jest with ts-jest transformer
- **Coverage**: Version resolution, input parsing, checksum validation, utility functions

### Integration Tests

- **Location**: `.github/workflows/test.yml`
- **Scope**: Full end-to-end testing across multiple platforms and scenarios
- **Key Test Categories**:
  - Version installation (specific, latest, semver ranges)
  - Cache behavior (setup, restore, invalidation)
  - Cross-platform compatibility (Ubuntu, macOS, Windows, ARM)
  - Configuration file parsing (pyproject.toml, uv.toml, requirements.txt)
  - Error handling and edge cases

### Test Fixtures

Located in `__tests__/fixtures/`, these provide sample projects with different configurations:

- `pyproject-toml-project/` - Standard Python project with uv version specification
- `uv-toml-project/` - Project using uv.toml configuration
- `requirements-txt-project/` - Legacy requirements.txt format
- `cache-dir-defined-project/` - Custom cache directory configuration

## Continuous Integration

### GitHub Workflows

#### Primary Test Suite (`.github/workflows/test.yml`)

- **Triggers**: PRs, pushes to main, manual dispatch
- **Matrix**: Multiple OS (Ubuntu, macOS, Windows), architecture (x64, ARM), and configuration combinations
- **Duration**: ~5 minutes for full matrix
- **Key Validations**:
  - Cross-platform installation and functionality
  - Cache behavior and performance
  - Version resolution from various sources
  - Tool directory configurations
  - Problem matcher functionality

#### Maintenance Workflows

- **CodeQL Analysis**: Security scanning on pushes/PRs
- **Update Known Versions**: Daily job to sync with latest uv releases
- **Dependabot**: Automated dependency updates

### Pre-commit Validation

The CI runs these checks that you should run locally:

1. `npm run all` - Complete build and test suite
2. ActionLint - GitHub Actions workflow validation
3. Change detection - Ensures no uncommitted build artifacts

## Key Configuration Files

### Action Configuration (`action.yml`)

Defines 20+ inputs including version specifications, cache settings, tool directories, and environment options. This file is the authoritative source for understanding available action parameters.

### TypeScript Configuration (`tsconfig.json`)

- Target: ES2024
- Module: nodenext (Node.js modules)
- Strict mode enabled
- Output directory: `lib/`

### Linting Configuration (`biome.json`)

- Formatter and linter combined
- Enforces consistent code style
- Automatically organizes imports and sorts object keys

## Common Development Patterns

### Making Code Changes

1. Edit TypeScript source files in `src/`
2. Run `npm run build` to compile
3. Run `npm run check` to format and lint
4. Run `npm run package` to update distribution bundles
5. Run `npm test` to verify functionality
6. Commit all changes including `dist/` files

### Adding New Features

- Follow existing patterns in `src/utils/inputs.ts` for new action inputs
- Update `action.yml` to declare new inputs/outputs
- Add corresponding tests in `__tests__/`
- Add a test in `.github/workflows/test.yml` if it affects integration
- Update README.md with usage examples

### Cache-Related Changes

- Cache logic is complex and affects performance significantly
- Test with multiple cache scenarios (hit, miss, invalidation)
- Consider impact on both GitHub-hosted and self-hosted runners
- Validate cache key generation and dependency detection

### Version Resolution Changes

- Version resolution supports multiple file formats and precedence rules
- Test with fixtures in `__tests__/fixtures/`
- Consider backward compatibility with existing projects
- Validate semver and PEP 440 specification handling

## Troubleshooting

### Build Failures

- **"Module not found"**: Run `npm ci --ignore-scripts` to ensure dependencies are installed
- **TypeScript errors**: Check `tsconfig.json` and ensure all imports are valid
- **Test failures**: Check if test fixtures have been modified or if logic changes broke assumptions

### Action Failures in Workflows

- **Changes not taking effect**: Ensure `npm run package` was run and `dist/` files committed
- **Version resolution issues**: Check version specification format and file existence
- **Cache problems**: Verify cache key generation and dependency glob patterns

### Common Gotchas

- **Forgetting to package**: Code changes won't work without running `npm run package`
- **Platform differences**: Windows paths use backslashes, test cross-platform behavior
- **Cache invalidation**: Cache keys are sensitive to dependency file changes
- **Tool directory permissions**: Some platforms require specific directory setups

## Trust These Instructions

These instructions are comprehensive and current. Only search for additional information if:

- You encounter specific error messages not covered here
- You need to understand implementation details of specific functions
- The instructions appear outdated (check repository commit history)

For most development tasks, following the build process and development patterns outlined above will be sufficient.
.github/release-drafter.yml (vendored, 2 lines)
@@ -19,7 +19,7 @@ categories:
     labels:
       - "maintenance"
       - "ci"
-      - "update-known-versions"
+      - "update-known-checksums"
   - title: "📚 Documentation"
     labels:
       - "documentation"
.github/workflows/test.yml (vendored, 2 lines)
@@ -38,7 +38,7 @@ jobs:
           npm run all
       - name: Check all jobs are in all-tests-passed.needs
         run: |
-          tsc check-all-tests-passed-needs.ts
+          tsc --module nodenext --moduleResolution nodenext --target es2022 check-all-tests-passed-needs.ts
           node check-all-tests-passed-needs.js
         working-directory: .github/scripts
       - name: Make sure no changes from linters are detected
@@ -1,4 +1,4 @@
-name: "Update known versions"
+name: "Update known checksums"
 on:
   workflow_dispatch:
   schedule:
@@ -20,14 +20,13 @@ jobs:
           persist-credentials: true
       - uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
         with:
-          node-version: "20"
-      - name: Update known versions
-        id: update-known-versions
+          node-version-file: .nvmrc
+          cache: npm
+      - name: Update known checksums
+        id: update-known-checksums
         run:
-          node dist/update-known-versions/index.js
+          node dist/update-known-checksums/index.cjs
           src/download/checksum/known-checksums.ts
           version-manifest.json
           ${{ secrets.GITHUB_TOKEN }}
       - name: Check for changes
         id: changes-exist
         run: |
@@ -48,10 +47,10 @@ jobs:
           git config user.name "$GITHUB_ACTOR"
           git config user.email "$GITHUB_ACTOR@users.noreply.github.com"
           git add .
-          git commit -m "chore: update known versions for $LATEST_VERSION"
+          git commit -m "chore: update known checksums for $LATEST_VERSION"
           git push origin HEAD:refs/heads/main
         env:
-          LATEST_VERSION: ${{ steps.update-known-versions.outputs.latest-version }}
+          LATEST_VERSION: ${{ steps.update-known-checksums.outputs.latest-version }}

       - name: Create Pull Request
         if: ${{ steps.changes-exist.outputs.changes-exist == 'true' && steps.commit-and-push.outcome != 'success' }}
@@ -60,11 +59,11 @@
           commit-message: "chore: update known checksums"
           title:
             "chore: update known checksums for ${{
-            steps.update-known-versions.outputs.latest-version }}"
+            steps.update-known-checksums.outputs.latest-version }}"
           body:
             "chore: update known checksums for ${{
-            steps.update-known-versions.outputs.latest-version }}"
+            steps.update-known-checksums.outputs.latest-version }}"
           base: main
-          labels: "automated-pr,update-known-versions"
-          branch: update-known-versions-pr
+          labels: "automated-pr,update-known-checksums"
+          branch: update-known-checksums-pr
           delete-branch: true
AGENTS.md (new file, 18 lines)
@@ -0,0 +1,18 @@

# setup-uv agent notes

This repository is a TypeScript-based GitHub Action for installing `uv` in GitHub Actions workflows. It also supports restoring/saving the `uv` cache and optional managed-Python caching.

- The published action runs the committed bundles in `dist/`, not the TypeScript in `src/`. After any code change, run `npm run package` and commit the resulting `dist/` updates.
- Standard local validation is:
  1. `npm ci --ignore-scripts`
  2. `npm run all`
- `npm run check` uses Biome (not ESLint/Prettier) and rewrites files in place.
- User-facing changes are usually multi-file changes. If you add or change inputs, outputs, or behavior, update `action.yml`, the implementation in `src/`, tests in `__tests__/`, relevant docs/README, and then re-package.
- The easiest areas to regress are version resolution and caching. When touching them, add or update tests for precedence, cache invalidation, and cross-platform path behavior.
- Workflow edits have extra CI-only checks (`actionlint` and `zizmor`); `npm run all` does not cover them.
- Source is authored with bundler-friendly TypeScript, but published action artifacts in `dist/` are bundled as CommonJS for maximum GitHub Actions runtime compatibility with `@actions/*` dependencies.
- Keep these concerns separate when changing module formats:
  - `src/` and tests may use modern ESM-friendly TypeScript patterns.
  - `dist/` should prioritize runtime reliability over format purity.
  - Do not switch published bundles to ESM without validating the actual committed artifacts under the target Node runtime.
- Before finishing, make sure validation does not leave generated or formatting-only diffs behind.
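The CommonJS point in the notes above can be seen with a tiny demo: a `.cjs` file keeps `require()` semantics even inside a package declared as ESM, which is why a CJS bundle stays loadable regardless of the surrounding `package.json`. All file and directory names below are invented for illustration:

```shell
# Demo: a .cjs entry runs under plain `node` even when the package is ESM.
set -eu
tmp=$(mktemp -d)
printf '{ "type": "module" }\n' > "$tmp/package.json"   # package itself is ESM
cat > "$tmp/index.cjs" <<'EOF'
// The .cjs extension forces CommonJS semantics, so require() keeps working.
const path = require("node:path");
console.log(path.basename(__filename));
EOF
node "$tmp/index.cjs"   # prints index.cjs
```

Renaming the same file to `.js` under `"type": "module"` would make `require` fail, which is the regression the note warns about when switching bundle formats.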
README.md (14 lines)
@@ -68,7 +68,7 @@ Have a look under [Advanced Configuration](#advanced-configuration) for detailed
   # The checksum of the uv version to install
   checksum: ""

-  # Used to increase the rate limit when retrieving versions and downloading uv
+  # Used when downloading uv from GitHub releases
   github-token: ${{ github.token }}

   # Enable uploading of the uv cache: true, false, or auto (enabled on GitHub-hosted runners, disabled on self-hosted runners)
@@ -114,7 +114,7 @@ Have a look under [Advanced Configuration](#advanced-configuration) for detailed
   # Custom path to set UV_TOOL_BIN_DIR to
   tool-bin-dir: ""

-  # URL to the manifest file containing available versions and download URLs
+  # URL to a custom manifest file (NDJSON preferred, legacy JSON array is deprecated)
   manifest-file: ""

   # Add problem matchers
@@ -190,10 +190,12 @@ For more advanced configuration options, see our detailed documentation:

 ## How it works

-This action downloads uv from the uv repo's official
-[GitHub Releases](https://github.com/astral-sh/uv) and uses the
-[GitHub Actions Toolkit](https://github.com/actions/toolkit) to cache it as a tool to speed up
-consecutive runs on self-hosted runners.
+By default, this action resolves uv versions from
+[`astral-sh/versions`](https://github.com/astral-sh/versions) (NDJSON) and downloads uv from the
+official [GitHub Releases](https://github.com/astral-sh/uv).
+
+It then uses the [GitHub Actions Toolkit](https://github.com/actions/toolkit) to cache uv as a
+tool to speed up consecutive runs on self-hosted runners.

 The installed version of uv is then added to the runner PATH, enabling later steps to invoke it
 by name (`uv`).
@@ -4,10 +4,11 @@ import {
   validateChecksum,
 } from "../../../src/download/checksum/checksum";

-test("checksum should match", async () => {
-  const validChecksum =
-    "f3da96ec7e995debee7f5d52ecd034dfb7074309a1da42f76429ecb814d813a3";
-  const filePath = "__tests__/fixtures/checksumfile";
+const validChecksum =
+  "f3da96ec7e995debee7f5d52ecd034dfb7074309a1da42f76429ecb814d813a3";
+const filePath = "__tests__/fixtures/checksumfile";
+
+test("checksum should match", async () => {
   // string params don't matter only test the checksum mechanism, not known checksums
   await validateChecksum(
     validChecksum,
@@ -18,6 +19,16 @@ test("checksum should match", async () => {
   );
 });

+test("provided checksum beats known checksums", async () => {
+  await validateChecksum(
+    validChecksum,
+    filePath,
+    "x86_64",
+    "unknown-linux-gnu",
+    "0.3.0",
+  );
+});
+
 type KnownVersionFixture = { version: string; known: boolean };

 it.each<KnownVersionFixture>([
__tests__/download/download-version.test.ts (new file, 271 lines)
@@ -0,0 +1,271 @@

import { beforeEach, describe, expect, it, jest } from "@jest/globals";
import * as semver from "semver";

const mockInfo = jest.fn();
const mockWarning = jest.fn();

jest.unstable_mockModule("@actions/core", () => ({
  debug: jest.fn(),
  info: mockInfo,
  warning: mockWarning,
}));

// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockDownloadTool = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockExtractTar = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockExtractZip = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockCacheDir = jest.fn<any>();

jest.unstable_mockModule("@actions/tool-cache", () => ({
  cacheDir: mockCacheDir,
  downloadTool: mockDownloadTool,
  evaluateVersions: (versions: string[], range: string) =>
    semver.maxSatisfying(versions, range) ?? "",
  extractTar: mockExtractTar,
  extractZip: mockExtractZip,
  find: () => "",
  findAllVersions: () => [],
  isExplicitVersion: (version: string) => semver.valid(version) !== null,
}));

// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetLatestVersionFromNdjson = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetAllVersionsFromNdjson = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetArtifactFromNdjson = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetHighestSatisfyingVersionFromNdjson = jest.fn<any>();

jest.unstable_mockModule("../../src/download/versions-client", () => ({
  getAllVersions: mockGetAllVersionsFromNdjson,
  getArtifact: mockGetArtifactFromNdjson,
  getHighestSatisfyingVersion: mockGetHighestSatisfyingVersionFromNdjson,
  getLatestVersion: mockGetLatestVersionFromNdjson,
}));

// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetAllManifestVersions = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetLatestVersionInManifest = jest.fn<any>();
// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockGetManifestArtifact = jest.fn<any>();

jest.unstable_mockModule("../../src/download/version-manifest", () => ({
  getAllVersions: mockGetAllManifestVersions,
  getLatestKnownVersion: mockGetLatestVersionInManifest,
  getManifestArtifact: mockGetManifestArtifact,
}));

// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockValidateChecksum = jest.fn<any>();

jest.unstable_mockModule("../../src/download/checksum/checksum", () => ({
  validateChecksum: mockValidateChecksum,
}));

const {
  downloadVersionFromManifest,
  downloadVersionFromNdjson,
  resolveVersion,
} = await import("../../src/download/download-version");

describe("download-version", () => {
  beforeEach(() => {
    mockInfo.mockReset();
    mockWarning.mockReset();
    mockDownloadTool.mockReset();
    mockExtractTar.mockReset();
    mockExtractZip.mockReset();
    mockCacheDir.mockReset();
    mockGetLatestVersionFromNdjson.mockReset();
    mockGetAllVersionsFromNdjson.mockReset();
    mockGetArtifactFromNdjson.mockReset();
    mockGetHighestSatisfyingVersionFromNdjson.mockReset();
    mockGetAllManifestVersions.mockReset();
    mockGetLatestVersionInManifest.mockReset();
    mockGetManifestArtifact.mockReset();
    mockValidateChecksum.mockReset();

    mockDownloadTool.mockResolvedValue("/tmp/downloaded");
    mockExtractTar.mockResolvedValue("/tmp/extracted");
    mockExtractZip.mockResolvedValue("/tmp/extracted");
    mockCacheDir.mockResolvedValue("/tmp/cached");
  });

  describe("resolveVersion", () => {
    it("uses astral-sh/versions to resolve latest", async () => {
      mockGetLatestVersionFromNdjson.mockResolvedValue("0.9.26");

      const version = await resolveVersion("latest", undefined);

      expect(version).toBe("0.9.26");
      expect(mockGetLatestVersionFromNdjson).toHaveBeenCalledTimes(1);
    });

    it("streams astral-sh/versions to resolve the highest matching version", async () => {
      mockGetHighestSatisfyingVersionFromNdjson.mockResolvedValue("0.9.26");

      const version = await resolveVersion("^0.9.0", undefined);

      expect(version).toBe("0.9.26");
      expect(mockGetHighestSatisfyingVersionFromNdjson).toHaveBeenCalledWith(
        "^0.9.0",
      );
      expect(mockGetAllVersionsFromNdjson).not.toHaveBeenCalled();
    });

    it("still loads all versions when resolving the lowest matching version", async () => {
      mockGetAllVersionsFromNdjson.mockResolvedValue(["0.9.26", "0.9.25"]);

      const version = await resolveVersion("^0.9.0", undefined, "lowest");

      expect(version).toBe("0.9.25");
      expect(mockGetAllVersionsFromNdjson).toHaveBeenCalledTimes(1);
      expect(mockGetHighestSatisfyingVersionFromNdjson).not.toHaveBeenCalled();
    });

    it("does not fall back when astral-sh/versions fails", async () => {
      mockGetLatestVersionFromNdjson.mockRejectedValue(
        new Error("NDJSON unavailable"),
      );

      await expect(resolveVersion("latest", undefined)).rejects.toThrow(
        "NDJSON unavailable",
      );
    });

    it("uses manifest-file when provided", async () => {
      mockGetAllManifestVersions.mockResolvedValue(["0.9.26", "0.9.25"]);

      const version = await resolveVersion(
        "^0.9.0",
        "https://example.com/custom.ndjson",
      );

      expect(version).toBe("0.9.26");
      expect(mockGetAllManifestVersions).toHaveBeenCalledWith(
        "https://example.com/custom.ndjson",
      );
    });
  });

  describe("downloadVersionFromNdjson", () => {
    it("fails when NDJSON metadata lookup fails", async () => {
      mockGetArtifactFromNdjson.mockRejectedValue(
        new Error("NDJSON unavailable"),
      );

      await expect(
        downloadVersionFromNdjson(
          "unknown-linux-gnu",
          "x86_64",
          "0.9.26",
          undefined,
          "token",
        ),
      ).rejects.toThrow("NDJSON unavailable");

      expect(mockDownloadTool).not.toHaveBeenCalled();
      expect(mockValidateChecksum).not.toHaveBeenCalled();
    });

    it("fails when no matching artifact exists in NDJSON metadata", async () => {
      mockGetArtifactFromNdjson.mockResolvedValue(undefined);

      await expect(
        downloadVersionFromNdjson(
          "unknown-linux-gnu",
          "x86_64",
          "0.9.26",
          undefined,
          "token",
        ),
      ).rejects.toThrow(
        "Could not find artifact for version 0.9.26, arch x86_64, platform unknown-linux-gnu in https://raw.githubusercontent.com/astral-sh/versions/main/v1/uv.ndjson .",
      );

      expect(mockDownloadTool).not.toHaveBeenCalled();
      expect(mockValidateChecksum).not.toHaveBeenCalled();
    });

    it("uses built-in checksums for default NDJSON downloads", async () => {
      mockGetArtifactFromNdjson.mockResolvedValue({
        archiveFormat: "tar.gz",
        sha256: "ndjson-checksum-that-should-be-ignored",
        url: "https://example.com/uv.tar.gz",
      });

      await downloadVersionFromNdjson(
        "unknown-linux-gnu",
        "x86_64",
        "0.9.26",
        undefined,
        "token",
      );

      expect(mockValidateChecksum).toHaveBeenCalledWith(
        undefined,
        "/tmp/downloaded",
        "x86_64",
        "unknown-linux-gnu",
        "0.9.26",
      );
    });
  });

  describe("downloadVersionFromManifest", () => {
    it("uses manifest-file checksum metadata when checksum input is unset", async () => {
      mockGetManifestArtifact.mockResolvedValue({
        archiveFormat: "tar.gz",
        checksum: "manifest-checksum",
        downloadUrl: "https://example.com/custom-uv.tar.gz",
      });

      await downloadVersionFromManifest(
        "https://example.com/custom.ndjson",
        "unknown-linux-gnu",
        "x86_64",
        "0.9.26",
        "",
        "token",
      );

      expect(mockValidateChecksum).toHaveBeenCalledWith(
        "manifest-checksum",
        "/tmp/downloaded",
        "x86_64",
        "unknown-linux-gnu",
        "0.9.26",
      );
    });

    it("prefers checksum input over manifest-file checksum metadata", async () => {
      mockGetManifestArtifact.mockResolvedValue({
        archiveFormat: "tar.gz",
        checksum: "manifest-checksum",
        downloadUrl: "https://example.com/custom-uv.tar.gz",
      });

      await downloadVersionFromManifest(
        "https://example.com/custom.ndjson",
        "unknown-linux-gnu",
        "x86_64",
        "0.9.26",
        "user-checksum",
        "token",
      );

      expect(mockValidateChecksum).toHaveBeenCalledWith(
        "user-checksum",
        "/tmp/downloaded",
        "x86_64",
        "unknown-linux-gnu",
        "0.9.26",
      );
    });
  });
});
136
__tests__/download/version-manifest.test.ts
Normal file
@@ -0,0 +1,136 @@
import { beforeEach, describe, expect, it, jest } from "@jest/globals";

const mockWarning = jest.fn();

jest.unstable_mockModule("@actions/core", () => ({
  debug: jest.fn(),
  info: jest.fn(),
  warning: mockWarning,
}));

// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockFetch = jest.fn<any>();
jest.unstable_mockModule("../../src/utils/fetch", () => ({
  fetch: mockFetch,
}));

const {
  clearManifestCache,
  getAllVersions,
  getLatestKnownVersion,
  getManifestArtifact,
} = await import("../../src/download/version-manifest");

const legacyManifestResponse = JSON.stringify([
  {
    arch: "x86_64",
    artifactName: "uv-x86_64-unknown-linux-gnu.tar.gz",
    downloadUrl:
      "https://example.com/releases/download/0.7.12-alpha.1/uv-x86_64-unknown-linux-gnu.tar.gz",
    platform: "unknown-linux-gnu",
    version: "0.7.12-alpha.1",
  },
  {
    arch: "x86_64",
    artifactName: "uv-x86_64-unknown-linux-gnu.tar.gz",
    downloadUrl:
      "https://example.com/releases/download/0.7.13/uv-x86_64-unknown-linux-gnu.tar.gz",
    platform: "unknown-linux-gnu",
    version: "0.7.13",
  },
]);

const ndjsonManifestResponse = `{"version":"0.10.0","artifacts":[{"platform":"x86_64-unknown-linux-gnu","variant":"default","url":"https://example.com/releases/download/0.10.0/uv-x86_64-unknown-linux-gnu.tar.gz","archive_format":"tar.gz","sha256":"checksum-100"}]}
{"version":"0.9.30","artifacts":[{"platform":"x86_64-unknown-linux-gnu","variant":"default","url":"https://example.com/releases/download/0.9.30/uv-x86_64-unknown-linux-gnu.tar.gz","archive_format":"tar.gz","sha256":"checksum-0930"}]}`;

const multiVariantManifestResponse = `{"version":"0.10.0","artifacts":[{"platform":"x86_64-unknown-linux-gnu","variant":"managed-python","url":"https://example.com/releases/download/0.10.0/uv-x86_64-unknown-linux-gnu-managed-python.tar.gz","archive_format":"tar.gz","sha256":"checksum-managed"},{"platform":"x86_64-unknown-linux-gnu","variant":"default","url":"https://example.com/releases/download/0.10.0/uv-x86_64-unknown-linux-gnu-default.zip","archive_format":"zip","sha256":"checksum-default"}]}`;

function createMockResponse(
  ok: boolean,
  status: number,
  statusText: string,
  data: string,
) {
  return {
    ok,
    status,
    statusText,
    text: async () => data,
  };
}

describe("version-manifest", () => {
  beforeEach(() => {
    clearManifestCache();
    mockFetch.mockReset();
    mockWarning.mockReset();
  });

  it("supports the legacy JSON manifest format", async () => {
    mockFetch.mockResolvedValue(
      createMockResponse(true, 200, "OK", legacyManifestResponse),
    );

    const latest = await getLatestKnownVersion(
      "https://example.com/legacy.json",
    );
    const artifact = await getManifestArtifact(
      "https://example.com/legacy.json",
      "0.7.13",
      "x86_64",
      "unknown-linux-gnu",
    );

    expect(latest).toBe("0.7.13");
    expect(artifact).toEqual({
      archiveFormat: undefined,
      checksum: undefined,
      downloadUrl:
        "https://example.com/releases/download/0.7.13/uv-x86_64-unknown-linux-gnu.tar.gz",
    });
    expect(mockWarning).toHaveBeenCalledTimes(1);
  });

  it("supports NDJSON manifests", async () => {
    mockFetch.mockResolvedValue(
      createMockResponse(true, 200, "OK", ndjsonManifestResponse),
    );

    const versions = await getAllVersions("https://example.com/custom.ndjson");
    const artifact = await getManifestArtifact(
      "https://example.com/custom.ndjson",
      "0.10.0",
      "x86_64",
      "unknown-linux-gnu",
    );

    expect(versions).toEqual(["0.10.0", "0.9.30"]);
    expect(artifact).toEqual({
      archiveFormat: "tar.gz",
      checksum: "checksum-100",
      downloadUrl:
        "https://example.com/releases/download/0.10.0/uv-x86_64-unknown-linux-gnu.tar.gz",
    });
    expect(mockWarning).not.toHaveBeenCalled();
  });

  it("prefers the default variant when a manifest contains multiple variants", async () => {
    mockFetch.mockResolvedValue(
      createMockResponse(true, 200, "OK", multiVariantManifestResponse),
    );

    const artifact = await getManifestArtifact(
      "https://example.com/multi-variant.ndjson",
      "0.10.0",
      "x86_64",
      "unknown-linux-gnu",
    );

    expect(artifact).toEqual({
      archiveFormat: "zip",
      checksum: "checksum-default",
      downloadUrl:
        "https://example.com/releases/download/0.10.0/uv-x86_64-unknown-linux-gnu-default.zip",
    });
  });
});
241
__tests__/download/versions-client.test.ts
Normal file
@@ -0,0 +1,241 @@
import { beforeEach, describe, expect, it, jest } from "@jest/globals";

// biome-ignore lint/suspicious/noExplicitAny: Mock requires flexible typing in tests.
const mockFetch = jest.fn<any>();

jest.unstable_mockModule("../../src/utils/fetch", () => ({
  fetch: mockFetch,
}));

const {
  clearCache,
  fetchVersionData,
  getAllVersions,
  getArtifact,
  getHighestSatisfyingVersion,
  getLatestVersion,
  parseVersionData,
} = await import("../../src/download/versions-client");

const sampleNdjsonResponse = `{"version":"0.9.26","artifacts":[{"platform":"aarch64-apple-darwin","variant":"default","url":"https://github.com/astral-sh/uv/releases/download/0.9.26/uv-aarch64-apple-darwin.tar.gz","archive_format":"tar.gz","sha256":"fcf0a9ea6599c6ae28a4c854ac6da76f2c889354d7c36ce136ef071f7ab9721f"},{"platform":"x86_64-pc-windows-msvc","variant":"default","url":"https://github.com/astral-sh/uv/releases/download/0.9.26/uv-x86_64-pc-windows-msvc.zip","archive_format":"zip","sha256":"eb02fd95d8e0eed462b4a67ecdd320d865b38c560bffcda9a0b87ec944bdf036"}]}
{"version":"0.9.25","artifacts":[{"platform":"aarch64-apple-darwin","variant":"default","url":"https://github.com/astral-sh/uv/releases/download/0.9.25/uv-aarch64-apple-darwin.tar.gz","archive_format":"tar.gz","sha256":"606b3c6949d971709f2526fa0d9f0fd23ccf60e09f117999b406b424af18a6a6"}]}`;

const multiVariantNdjsonResponse = `{"version":"0.9.26","artifacts":[{"platform":"aarch64-apple-darwin","variant":"python-managed","url":"https://github.com/astral-sh/uv/releases/download/0.9.26/uv-aarch64-apple-darwin-managed.tar.gz","archive_format":"tar.gz","sha256":"managed-checksum"},{"platform":"aarch64-apple-darwin","variant":"default","url":"https://github.com/astral-sh/uv/releases/download/0.9.26/uv-aarch64-apple-darwin.zip","archive_format":"zip","sha256":"default-checksum"}]}`;

function createMockStream(chunks: string[]): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();

  return new ReadableStream<Uint8Array>({
    start(controller) {
      for (const chunk of chunks) {
        controller.enqueue(encoder.encode(chunk));
      }
      controller.close();
    },
  });
}

function createMockResponse(
  ok: boolean,
  status: number,
  statusText: string,
  data: string,
  chunks: string[] = [data],
) {
  return {
    body: createMockStream(chunks),
    ok,
    status,
    statusText,
    text: async () => data,
  };
}

describe("versions-client", () => {
  beforeEach(() => {
    clearCache();
    mockFetch.mockReset();
  });

  describe("fetchVersionData", () => {
    it("should fetch and parse NDJSON data", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(true, 200, "OK", sampleNdjsonResponse),
      );

      const versions = await fetchVersionData();

      expect(versions).toHaveLength(2);
      expect(versions[0].version).toBe("0.9.26");
      expect(versions[1].version).toBe("0.9.25");
    });

    it("should throw error on failed fetch", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(false, 500, "Internal Server Error", ""),
      );

      await expect(fetchVersionData()).rejects.toThrow(
        "Failed to fetch version data: 500 Internal Server Error",
      );
    });

    it("should cache results", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(true, 200, "OK", sampleNdjsonResponse),
      );

      await fetchVersionData();
      await fetchVersionData();

      expect(mockFetch).toHaveBeenCalledTimes(1);
    });
  });

  describe("getLatestVersion", () => {
    it("should return the first version (newest)", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(true, 200, "OK", sampleNdjsonResponse),
      );

      const latest = await getLatestVersion();

      expect(latest).toBe("0.9.26");
    });

    it("should stop after the first record when resolving latest", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(
          true,
          200,
          "OK",
          `${sampleNdjsonResponse}\n{"version":`,
          [`${sampleNdjsonResponse.split("\n")[0]}\n`, '{"version":'],
        ),
      );

      const latest = await getLatestVersion();

      expect(latest).toBe("0.9.26");
    });
  });

  describe("getAllVersions", () => {
    it("should return all version strings", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(true, 200, "OK", sampleNdjsonResponse),
      );

      const versions = await getAllVersions();

      expect(versions).toEqual(["0.9.26", "0.9.25"]);
    });
  });

  describe("getHighestSatisfyingVersion", () => {
    it("should return the first matching version from the stream", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(
          true,
          200,
          "OK",
          `${sampleNdjsonResponse}\n{"version":`,
          [`${sampleNdjsonResponse.split("\n")[0]}\n`, '{"version":'],
        ),
      );

      const version = await getHighestSatisfyingVersion("^0.9.0");

      expect(version).toBe("0.9.26");
    });
  });

  describe("getArtifact", () => {
    beforeEach(() => {
      mockFetch.mockResolvedValue(
        createMockResponse(true, 200, "OK", sampleNdjsonResponse),
      );
    });

    it("should find artifact by version and platform", async () => {
      const artifact = await getArtifact("0.9.26", "aarch64", "apple-darwin");

      expect(artifact).toEqual({
        archiveFormat: "tar.gz",
        sha256:
          "fcf0a9ea6599c6ae28a4c854ac6da76f2c889354d7c36ce136ef071f7ab9721f",
        url: "https://github.com/astral-sh/uv/releases/download/0.9.26/uv-aarch64-apple-darwin.tar.gz",
      });
    });

    it("should stop once the requested version is found", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(
          true,
          200,
          "OK",
          `${sampleNdjsonResponse}\n{"version":`,
          [`${sampleNdjsonResponse.split("\n")[0]}\n`, '{"version":'],
        ),
      );

      const artifact = await getArtifact("0.9.26", "aarch64", "apple-darwin");

      expect(artifact).toEqual({
        archiveFormat: "tar.gz",
        sha256:
          "fcf0a9ea6599c6ae28a4c854ac6da76f2c889354d7c36ce136ef071f7ab9721f",
        url: "https://github.com/astral-sh/uv/releases/download/0.9.26/uv-aarch64-apple-darwin.tar.gz",
      });
    });

    it("should find windows artifact", async () => {
      const artifact = await getArtifact("0.9.26", "x86_64", "pc-windows-msvc");

      expect(artifact).toEqual({
        archiveFormat: "zip",
        sha256:
          "eb02fd95d8e0eed462b4a67ecdd320d865b38c560bffcda9a0b87ec944bdf036",
        url: "https://github.com/astral-sh/uv/releases/download/0.9.26/uv-x86_64-pc-windows-msvc.zip",
      });
    });

    it("should prefer the default variant when multiple artifacts share a platform", async () => {
      mockFetch.mockResolvedValue(
        createMockResponse(true, 200, "OK", multiVariantNdjsonResponse),
      );

      const artifact = await getArtifact("0.9.26", "aarch64", "apple-darwin");

      expect(artifact).toEqual({
        archiveFormat: "zip",
        sha256: "default-checksum",
        url: "https://github.com/astral-sh/uv/releases/download/0.9.26/uv-aarch64-apple-darwin.zip",
      });
    });

    it("should return undefined for unknown version", async () => {
      const artifact = await getArtifact("0.0.1", "aarch64", "apple-darwin");

      expect(artifact).toBeUndefined();
    });

    it("should return undefined for unknown platform", async () => {
      const artifact = await getArtifact(
        "0.9.26",
        "aarch64",
        "unknown-linux-musl",
      );

      expect(artifact).toBeUndefined();
    });
  });

  describe("parseVersionData", () => {
    it("should throw for malformed NDJSON", () => {
      expect(() =>
        parseVersionData('{"version":"0.1.0"', "test-source"),
      ).toThrow("Failed to parse version data from test-source");
    });
  });
});
@@ -1,14 +1,3 @@
-jest.mock("@actions/core", () => {
-  return {
-    debug: jest.fn(),
-    getBooleanInput: jest.fn(
-      (name: string) => (mockInputs[name] ?? "") === "true",
-    ),
-    getInput: jest.fn((name: string) => mockInputs[name] ?? ""),
-    warning: jest.fn(),
-  };
-});
-
 import {
   afterEach,
   beforeEach,
@@ -22,6 +11,26 @@ import {
 let mockInputs: Record<string, string> = {};
 const ORIGINAL_HOME = process.env.HOME;
 
+const mockDebug = jest.fn();
+const mockGetBooleanInput = jest.fn(
+  (name: string) => (mockInputs[name] ?? "") === "true",
+);
+const mockGetInput = jest.fn((name: string) => mockInputs[name] ?? "");
+const mockInfo = jest.fn();
+const mockWarning = jest.fn();
+
+jest.unstable_mockModule("@actions/core", () => ({
+  debug: mockDebug,
+  getBooleanInput: mockGetBooleanInput,
+  getInput: mockGetInput,
+  info: mockInfo,
+  warning: mockWarning,
+}));
+
+async function importInputsModule() {
+  return await import("../../src/utils/inputs");
+}
+
 describe("cacheDependencyGlob", () => {
   beforeEach(() => {
     jest.resetModules();
@@ -36,21 +45,21 @@ describe("cacheDependencyGlob", () => {
 
   it("returns empty string when input not provided", async () => {
     mockInputs["working-directory"] = "/workspace";
-    const { cacheDependencyGlob } = await import("../../src/utils/inputs");
+    const { cacheDependencyGlob } = await importInputsModule();
     expect(cacheDependencyGlob).toBe("");
   });
 
   it("resolves a single relative path", async () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["cache-dependency-glob"] = "requirements.txt";
-    const { cacheDependencyGlob } = await import("../../src/utils/inputs");
+    const { cacheDependencyGlob } = await importInputsModule();
     expect(cacheDependencyGlob).toBe("/workspace/requirements.txt");
   });
 
   it("strips leading ./ from relative path", async () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["cache-dependency-glob"] = "./uv.lock";
-    const { cacheDependencyGlob } = await import("../../src/utils/inputs");
+    const { cacheDependencyGlob } = await importInputsModule();
     expect(cacheDependencyGlob).toBe("/workspace/uv.lock");
   });
 
@@ -58,7 +67,7 @@ describe("cacheDependencyGlob", () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["cache-dependency-glob"] =
       " ~/.cache/file1\n ./rel/file2 \nfile3.txt";
-    const { cacheDependencyGlob } = await import("../../src/utils/inputs");
+    const { cacheDependencyGlob } = await importInputsModule();
     expect(cacheDependencyGlob).toBe(
       [
         "/home/testuser/.cache/file1", // expanded tilde, absolute path unchanged
@@ -71,7 +80,7 @@ describe("cacheDependencyGlob", () => {
   it("keeps absolute path unchanged in multiline input", async () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["cache-dependency-glob"] = "/abs/path.lock\nrelative.lock";
-    const { cacheDependencyGlob } = await import("../../src/utils/inputs");
+    const { cacheDependencyGlob } = await importInputsModule();
     expect(cacheDependencyGlob).toBe(
       ["/abs/path.lock", "/workspace/relative.lock"].join("\n"),
     );
@@ -80,7 +89,7 @@ describe("cacheDependencyGlob", () => {
   it("handles exclusions in relative paths correct", async () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["cache-dependency-glob"] = "!/abs/path.lock\n!relative.lock";
-    const { cacheDependencyGlob } = await import("../../src/utils/inputs");
+    const { cacheDependencyGlob } = await importInputsModule();
     expect(cacheDependencyGlob).toBe(
       ["!/abs/path.lock", "!/workspace/relative.lock"].join("\n"),
     );
@@ -104,7 +113,7 @@ describe("tool directories", () => {
     mockInputs["tool-bin-dir"] = "~/tool-bin-dir";
     mockInputs["tool-dir"] = "~/tool-dir";
 
-    const { toolBinDir, toolDir } = await import("../../src/utils/inputs");
+    const { toolBinDir, toolDir } = await importInputsModule();
 
     expect(toolBinDir).toBe("/home/testuser/tool-bin-dir");
     expect(toolDir).toBe("/home/testuser/tool-dir");
@@ -127,9 +136,7 @@ describe("cacheLocalPath", () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["cache-local-path"] = "~/uv-cache/cache-local-path";
 
-    const { CacheLocalSource, cacheLocalPath } = await import(
-      "../../src/utils/inputs"
-    );
+    const { CacheLocalSource, cacheLocalPath } = await importInputsModule();
 
     expect(cacheLocalPath).toEqual({
       path: "/home/testuser/uv-cache/cache-local-path",
@@ -152,7 +159,7 @@ describe("venvPath", () => {
 
   it("defaults to .venv in the working directory", async () => {
     mockInputs["working-directory"] = "/workspace";
-    const { venvPath } = await import("../../src/utils/inputs");
+    const { venvPath } = await importInputsModule();
     expect(venvPath).toBe("/workspace/.venv");
   });
 
@@ -160,7 +167,7 @@ describe("venvPath", () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["activate-environment"] = "true";
     mockInputs["venv-path"] = "custom-venv";
-    const { venvPath } = await import("../../src/utils/inputs");
+    const { venvPath } = await importInputsModule();
     expect(venvPath).toBe("/workspace/custom-venv");
   });
 
@@ -168,7 +175,7 @@ describe("venvPath", () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["activate-environment"] = "true";
     mockInputs["venv-path"] = "custom-venv/";
-    const { venvPath } = await import("../../src/utils/inputs");
+    const { venvPath } = await importInputsModule();
     expect(venvPath).toBe("/workspace/custom-venv");
   });
 
@@ -176,7 +183,7 @@ describe("venvPath", () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["activate-environment"] = "true";
     mockInputs["venv-path"] = "/tmp/custom-venv";
-    const { venvPath } = await import("../../src/utils/inputs");
+    const { venvPath } = await importInputsModule();
     expect(venvPath).toBe("/tmp/custom-venv");
   });
 
@@ -184,7 +191,7 @@ describe("venvPath", () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["activate-environment"] = "true";
     mockInputs["venv-path"] = "~/.venv";
-    const { venvPath } = await import("../../src/utils/inputs");
+    const { venvPath } = await importInputsModule();
     expect(venvPath).toBe("/home/testuser/.venv");
   });
 
@@ -192,18 +199,11 @@ describe("venvPath", () => {
     mockInputs["working-directory"] = "/workspace";
     mockInputs["venv-path"] = "custom-venv";
 
-    const { activateEnvironment, venvPath } = await import(
-      "../../src/utils/inputs"
-    );
+    const { activateEnvironment, venvPath } = await importInputsModule();
 
     expect(activateEnvironment).toBe(false);
     expect(venvPath).toBe("/workspace/custom-venv");
 
-    const mockedCore = jest.requireMock("@actions/core") as {
-      warning: jest.Mock;
-    };
-
-    expect(mockedCore.warning).toHaveBeenCalledWith(
+    expect(mockWarning).toHaveBeenCalledWith(
       "venv-path is only used when activate-environment is true",
     );
   });
@@ -1,113 +1,121 @@
-jest.mock("node:fs");
-jest.mock("@actions/core", () => ({
-  warning: jest.fn(),
-}));
-
-import fs from "node:fs";
-import * as core from "@actions/core";
-import { beforeEach, describe, expect, it, jest } from "@jest/globals";
-import { getUvVersionFromToolVersions } from "../../src/version/tool-versions-file";
-
-const mockedFs = fs as jest.Mocked<typeof fs>;
-const mockedCore = core as jest.Mocked<typeof core>;
+import { beforeEach, describe, expect, it, jest } from "@jest/globals";
+
+const mockReadFileSync = jest.fn();
+const mockWarning = jest.fn();
+
+jest.unstable_mockModule("node:fs", () => ({
+  default: {
+    readFileSync: mockReadFileSync,
+  },
+}));
+
+jest.unstable_mockModule("@actions/core", () => ({
+  warning: mockWarning,
+}));
+
+async function getVersionFromToolVersions(filePath: string) {
+  const { getUvVersionFromToolVersions } = await import(
+    "../../src/version/tool-versions-file"
+  );
+
+  return getUvVersionFromToolVersions(filePath);
+}
 
 describe("getUvVersionFromToolVersions", () => {
   beforeEach(() => {
     jest.resetModules();
     jest.clearAllMocks();
   });
 
-  it("should return undefined for non-.tool-versions files", () => {
-    const result = getUvVersionFromToolVersions("package.json");
+  it("should return undefined for non-.tool-versions files", async () => {
+    const result = await getVersionFromToolVersions("package.json");
     expect(result).toBeUndefined();
-    expect(mockedFs.readFileSync).not.toHaveBeenCalled();
+    expect(mockReadFileSync).not.toHaveBeenCalled();
   });
 
-  it("should return version for valid uv entry", () => {
+  it("should return version for valid uv entry", async () => {
     const fileContent = "python 3.11.0\nuv 0.1.0\nnodejs 18.0.0";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBe("0.1.0");
-    expect(mockedFs.readFileSync).toHaveBeenCalledWith(
-      ".tool-versions",
-      "utf8",
-    );
+    expect(mockReadFileSync).toHaveBeenCalledWith(".tool-versions", "utf8");
   });
 
-  it("should return version for uv entry with v prefix", () => {
+  it("should return version for uv entry with v prefix", async () => {
     const fileContent = "uv v0.2.0";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBe("0.2.0");
   });
 
-  it("should handle whitespace around uv entry", () => {
+  it("should handle whitespace around uv entry", async () => {
     const fileContent = " uv 0.3.0 ";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBe("0.3.0");
   });
 
-  it("should skip commented lines", () => {
+  it("should skip commented lines", async () => {
     const fileContent = "# uv 0.1.0\npython 3.11.0\nuv 0.2.0";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBe("0.2.0");
   });
 
-  it("should return first matching uv version", () => {
+  it("should return first matching uv version", async () => {
     const fileContent = "uv 0.1.0\npython 3.11.0\nuv 0.2.0";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBe("0.1.0");
   });
 
-  it("should return undefined when no uv entry found", () => {
+  it("should return undefined when no uv entry found", async () => {
     const fileContent = "python 3.11.0\nnodejs 18.0.0";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBeUndefined();
   });
 
-  it("should return undefined for empty file", () => {
-    mockedFs.readFileSync.mockReturnValue("");
+  it("should return undefined for empty file", async () => {
+    mockReadFileSync.mockReturnValue("");
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBeUndefined();
   });
 
-  it("should warn and return undefined for ref syntax", () => {
+  it("should warn and return undefined for ref syntax", async () => {
     const fileContent = "uv ref:main";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions(".tool-versions");
+    const result = await getVersionFromToolVersions(".tool-versions");
 
     expect(result).toBeUndefined();
-    expect(mockedCore.warning).toHaveBeenCalledWith(
+    expect(mockWarning).toHaveBeenCalledWith(
      "The ref syntax of .tool-versions is not supported. Please use a released version instead.",
    );
   });
 
-  it("should handle file path with .tool-versions extension", () => {
+  it("should handle file path with .tool-versions extension", async () => {
     const fileContent = "uv 0.1.0";
-    mockedFs.readFileSync.mockReturnValue(fileContent);
+    mockReadFileSync.mockReturnValue(fileContent);
 
-    const result = getUvVersionFromToolVersions("path/to/.tool-versions");
+    const result = await getVersionFromToolVersions("path/to/.tool-versions");
 
     expect(result).toBe("0.1.0");
-    expect(mockedFs.readFileSync).toHaveBeenCalledWith(
+    expect(mockReadFileSync).toHaveBeenCalledWith(
       "path/to/.tool-versions",
       "utf8",
     );
@@ -26,7 +26,7 @@ inputs:
     required: false
   github-token:
     description:
-      "Used to increase the rate limit when retrieving versions and downloading uv."
+      "Used when downloading uv from GitHub releases."
     required: false
     default: ${{ github.token }}
   enable-cache:
@@ -75,7 +75,7 @@ inputs:
     description: "Custom path to set UV_TOOL_BIN_DIR to."
     required: false
   manifest-file:
-    description: "URL to the manifest file containing available versions and download URLs."
+    description: "URL to a custom manifest file. Supports the astral-sh/versions NDJSON format and the legacy JSON array format (deprecated)."
     required: false
   add-problem-matchers:
     description: "Add problem matchers."
@@ -102,8 +102,8 @@ outputs:
     description: "A boolean value to indicate the Python cache entry was found"
 runs:
   using: "node24"
-  main: "dist/setup/index.js"
-  post: "dist/save-cache/index.js"
+  main: "dist/setup/index.cjs"
+  post: "dist/save-cache/index.cjs"
   post-if: success()
 branding:
   icon: "package"
@@ -1,5 +1,5 @@
 {
-  "$schema": "https://biomejs.dev/schemas/2.3.7/schema.json",
+  "$schema": "https://biomejs.dev/schemas/2.4.7/schema.json",
   "assist": {
     "actions": {
       "source": {
63325
dist/save-cache/index.cjs
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
94305
dist/save-cache/index.js
generated
vendored
File diff suppressed because one or more lines are too long
97307
dist/setup/index.cjs
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
104639
dist/setup/index.js
generated
vendored
File diff suppressed because one or more lines are too long
50290
dist/update-known-checksums/index.cjs
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
39068
dist/update-known-versions/index.js
generated
vendored
File diff suppressed because one or more lines are too long
@@ -18,12 +18,29 @@ are automatically verified by this action. The sha256 hashes can be found on the
 
 ## Manifest file
 
-The `manifest-file` input allows you to specify a JSON manifest that lists available uv versions,
-architectures, and their download URLs. By default, this action uses the manifest file contained
-in this repository, which is automatically updated with each release of uv.
+By default, setup-uv reads version metadata from
+[`astral-sh/versions`](https://github.com/astral-sh/versions) (NDJSON format).
 
-The manifest file contains an array of objects, each describing a version,
-architecture, platform, and the corresponding download URL. For example:
+The `manifest-file` input lets you override that source with your own URL, for example to test
+custom uv builds or alternate download locations.
 
+### Format
+
+The manifest file must be in NDJSON format, where each line is a JSON object representing a version and its artifacts. For example:
+
+```json
+{"version":"0.10.7","artifacts":[{"platform":"x86_64-unknown-linux-gnu","variant":"default","url":"https://example.com/uv-x86_64-unknown-linux-gnu.tar.gz","archive_format":"tar.gz","sha256":"..."}]}
+{"version":"0.10.6","artifacts":[{"platform":"x86_64-unknown-linux-gnu","variant":"default","url":"https://example.com/uv-x86_64-unknown-linux-gnu.tar.gz","archive_format":"tar.gz","sha256":"..."}]}
+```
+
+setup-uv currently only supports `default` as the `variant`.
+
+The `archive_format` field is currently ignored.
 
### Legacy format: JSON array (deprecated)
|
||||
|
||||
The previous JSON array format is still supported for compatibility, but deprecated and will be
|
||||
removed in a future major release.
|
||||
|
||||
```json
|
||||
[
|
||||
@@ -33,26 +50,20 @@ architecture, platform, and the corresponding download URL. For example:
       "arch": "aarch64",
       "platform": "apple-darwin",
       "downloadUrl": "https://github.com/astral-sh/uv/releases/download/0.7.13/uv-aarch64-apple-darwin.tar.gz"
     },
     ...
   }
 ]
 ```

-You can supply a custom manifest file URL to define additional versions,
-architectures, or different download URLs.
-This is useful if you maintain your own uv builds or want to override the default sources.
-
 ```yaml
 - name: Use a custom manifest file
   uses: astral-sh/setup-uv@v7
   with:
-    manifest-file: "https://example.com/my-custom-manifest.json"
+    manifest-file: "https://example.com/my-custom-manifest.ndjson"
 ```

 > [!NOTE]
-> When you use a custom manifest file and do not set the `version` input, its default value is `latest`.
-> This means the action will install the latest version available in the custom manifest file.
-> This is different from the default behavior of installing the latest version from the official uv releases.
+> When you use a custom manifest file and do not set the `version` input, setup-uv installs the
+> latest version from that custom manifest.

 ## Add problem matchers
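The NDJSON manifest described above is line-oriented, so a consumer can parse it one record at a time. A minimal sketch of how such a manifest might be parsed and queried (plain JavaScript; the function names are illustrative, not part of this PR's API):

```javascript
// Parse an NDJSON manifest: one JSON object per line, skipping blank lines.
function parseManifest(ndjson) {
  return ndjson
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line !== "")
    .map((line) => JSON.parse(line));
}

// Find the artifact for a version and platform, requiring the "default" variant
// (the only variant setup-uv currently supports).
function findArtifact(versions, version, platform) {
  const record = versions.find((v) => v.version === version);
  if (!record) return undefined;
  return record.artifacts.find(
    (a) => a.platform === platform && a.variant === "default",
  );
}

const manifest = [
  '{"version":"0.10.7","artifacts":[{"platform":"x86_64-unknown-linux-gnu","variant":"default","url":"https://example.com/uv-0.10.7.tar.gz","archive_format":"tar.gz","sha256":"abc"}]}',
  '{"version":"0.10.6","artifacts":[]}',
].join("\n");

const versions = parseManifest(manifest);
console.log(findArtifact(versions, "0.10.7", "x86_64-unknown-linux-gnu").url);
```

Because each line is a complete record, a client can also stop reading as soon as it finds a match, which is the basis of the streaming client benchmarked later in this PR.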
@@ -38,9 +38,12 @@ You can customize the venv location with `venv-path`, for example to place it in

 ## GitHub authentication token

-This action uses the GitHub API to fetch the uv release artifacts. To avoid hitting the GitHub API
-rate limit too quickly, an authentication token can be provided via the `github-token` input. By
-default, the `GITHUB_TOKEN` secret is used, which is automatically provided by GitHub Actions.
+By default, this action resolves available uv versions from
+[`astral-sh/versions`](https://github.com/astral-sh/versions), then downloads uv artifacts from
+GitHub Releases.
+
+You can provide a token via `github-token` to authenticate those downloads. By default, the
+`GITHUB_TOKEN` secret is used, which is automatically provided by GitHub Actions.

 If the default
 [permissions for the GitHub token](https://docs.github.com/en/actions/security-for-github-actions/security-guides/automatic-token-authentication#permissions-for-the-github_token)
@@ -1,9 +0,0 @@
-module.exports = {
-  clearMocks: true,
-  moduleFileExtensions: ["js", "ts"],
-  testMatch: ["**/*.test.ts"],
-  transform: {
-    "^.+\\.ts$": "ts-jest",
-  },
-  verbose: true,
-};

   14  jest.config.mjs (new file)
@@ -0,0 +1,14 @@
import { createDefaultEsmPreset } from "ts-jest";

const esmPreset = createDefaultEsmPreset({
  tsconfig: "./tsconfig.json",
});

export default {
  ...esmPreset,
  clearMocks: true,
  moduleFileExtensions: ["js", "mjs", "ts"],
  testEnvironment: "node",
  testMatch: ["**/*.test.ts"],
  verbose: true,
};
 4416  package-lock.json (generated): diff suppressed because it is too large
   45  package.json
@@ -2,16 +2,19 @@
   "name": "setup-uv",
   "version": "1.0.0",
   "private": true,
+  "type": "module",
   "description": "Set up your GitHub Actions workflow with a specific version of uv",
-  "main": "dist/index.js",
+  "main": "dist/setup/index.cjs",
   "scripts": {
-    "build": "tsc",
+    "build": "tsc --noEmit",
     "check": "biome check --write",
-    "package": "ncc build -o dist/setup src/setup-uv.ts && ncc build -o dist/save-cache src/save-cache.ts && ncc build -o dist/update-known-versions src/update-known-versions.ts",
-    "test": "jest",
+    "package": "node scripts/build-dist.mjs",
+    "bench:versions": "node scripts/bench-versions-client.mjs",
+    "test:unit": "node --experimental-vm-modules ./node_modules/jest/bin/jest.js",
+    "test": "npm run build && npm run test:unit",
     "act": "act pull_request -W .github/workflows/test.yml --container-architecture linux/amd64 -s GITHUB_TOKEN=\"$(gh auth token)\"",
-    "update-known-versions": "RUNNER_TEMP=known_versions node dist/update-known-versions/index.js src/download/checksum/known-versions.ts \"$(gh auth token)\"",
-    "all": "npm run build && npm run check && npm run package && npm test"
+    "update-known-checksums": "RUNNER_TEMP=known_versions node dist/update-known-checksums/index.cjs src/download/checksum/known-checksums.ts",
+    "all": "npm run build && npm run check && npm run package && npm run test:unit"
   },
   "repository": {
     "type": "git",
@@ -26,28 +29,26 @@
   "author": "@eifinger",
   "license": "MIT",
   "dependencies": {
-    "@actions/cache": "^4.1.0",
-    "@actions/core": "^1.11.1",
-    "@actions/exec": "^1.1.1",
-    "@actions/glob": "^0.5.0",
-    "@actions/io": "^1.1.3",
-    "@actions/tool-cache": "^2.0.2",
-    "@octokit/core": "^7.0.6",
-    "@octokit/plugin-paginate-rest": "^14.0.0",
-    "@octokit/plugin-rest-endpoint-methods": "^17.0.0",
-    "@renovatebot/pep440": "^4.2.1",
+    "@actions/cache": "^6.0.0",
+    "@actions/core": "^3.0.0",
+    "@actions/exec": "^3.0.0",
+    "@actions/glob": "^0.6.1",
+    "@actions/io": "^3.0.2",
+    "@actions/tool-cache": "^4.0.0",
+    "@renovatebot/pep440": "^4.2.2",
     "smol-toml": "^1.6.0",
-    "undici": "5.28.5"
+    "undici": "^7.24.2"
   },
   "devDependencies": {
-    "@biomejs/biome": "2.3.8",
+    "@biomejs/biome": "^2.4.7",
     "@types/js-yaml": "^4.0.9",
-    "@types/node": "^24.10.1",
+    "@types/node": "^25.5.0",
     "@types/semver": "^7.7.1",
-    "@vercel/ncc": "^0.38.4",
-    "jest": "^30.2.0",
-    "js-yaml": "^4.1.0",
-    "ts-jest": "^29.4.5",
+    "esbuild": "^0.27.4",
+    "jest": "^30.3.0",
+    "js-yaml": "^4.1.1",
+    "ts-jest": "^29.4.6",
     "typescript": "^5.9.3"
   }
 }
  483  scripts/bench-versions-client.mjs (new file)
@@ -0,0 +1,483 @@
import { performance } from "node:perf_hooks";
import * as pep440 from "@renovatebot/pep440";
import * as semver from "semver";
import { ProxyAgent, fetch as undiciFetch } from "undici";

const DEFAULT_URL =
  "https://raw.githubusercontent.com/astral-sh/versions/main/v1/uv.ndjson";
const DEFAULT_ITERATIONS = 100;
const DEFAULT_ARCH = "aarch64";
const DEFAULT_PLATFORM = "apple-darwin";

function getProxyAgent() {
  const httpProxy = process.env.HTTP_PROXY || process.env.http_proxy;
  if (httpProxy) {
    return new ProxyAgent(httpProxy);
  }

  const httpsProxy = process.env.HTTPS_PROXY || process.env.https_proxy;
  if (httpsProxy) {
    return new ProxyAgent(httpsProxy);
  }

  return undefined;
}

async function fetch(url) {
  return await undiciFetch(url, {
    dispatcher: getProxyAgent(),
  });
}

function parseArgs(argv) {
  const options = {
    arch: DEFAULT_ARCH,
    iterations: DEFAULT_ITERATIONS,
    platform: DEFAULT_PLATFORM,
    url: DEFAULT_URL,
  };

  for (let index = 0; index < argv.length; index += 1) {
    const arg = argv[index];
    const next = argv[index + 1];

    if (arg === "--iterations" && next !== undefined) {
      options.iterations = Number.parseInt(next, 10);
      index += 1;
      continue;
    }

    if (arg === "--url" && next !== undefined) {
      options.url = next;
      index += 1;
      continue;
    }

    if (arg === "--arch" && next !== undefined) {
      options.arch = next;
      index += 1;
      continue;
    }

    if (arg === "--platform" && next !== undefined) {
      options.platform = next;
      index += 1;
    }
  }

  if (!Number.isInteger(options.iterations) || options.iterations <= 0) {
    throw new Error("--iterations must be a positive integer");
  }

  return options;
}
function parseVersionLine(line, sourceDescription, lineNumber) {
  let parsed;
  try {
    parsed = JSON.parse(line);
  } catch (error) {
    throw new Error(
      `Failed to parse version data from ${sourceDescription} at line ${lineNumber}: ${error.message}`,
    );
  }

  if (
    typeof parsed !== "object" ||
    parsed === null ||
    typeof parsed.version !== "string" ||
    !Array.isArray(parsed.artifacts)
  ) {
    throw new Error(
      `Invalid NDJSON record in ${sourceDescription} at line ${lineNumber}.`,
    );
  }

  return parsed;
}

function parseVersionData(data, sourceDescription) {
  const versions = [];

  for (const [index, line] of data.split("\n").entries()) {
    const trimmed = line.trim();
    if (trimmed === "") {
      continue;
    }

    versions.push(parseVersionLine(trimmed, sourceDescription, index + 1));
  }

  if (versions.length === 0) {
    throw new Error(`No version data found in ${sourceDescription}.`);
  }

  return versions;
}

async function readEntireResponse(response) {
  if (response.body === null) {
    const text = await response.text();
    return {
      bytesRead: Buffer.byteLength(text, "utf8"),
      text,
    };
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  const chunks = [];
  let bytesRead = 0;

  while (true) {
    const { done, value } = await reader.read();
    if (done) {
      chunks.push(decoder.decode());
      break;
    }

    bytesRead += value.byteLength;
    chunks.push(decoder.decode(value, { stream: true }));
  }

  return {
    bytesRead,
    text: chunks.join(""),
  };
}

async function fetchAllVersions(url) {
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(
      `Failed to fetch version data: ${response.status} ${response.statusText}`,
    );
  }

  const { bytesRead, text } = await readEntireResponse(response);
  return {
    bytesRead,
    versions: parseVersionData(text, url),
  };
}
async function streamUntil(url, predicate) {
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(
      `Failed to fetch version data: ${response.status} ${response.statusText}`,
    );
  }

  if (response.body === null) {
    const { bytesRead, versions } = await fetchAllVersions(url);
    return {
      bytesRead,
      matchedVersion: versions.find(predicate),
    };
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let bytesRead = 0;
  let buffer = "";
  let lineNumber = 0;

  while (true) {
    const { done, value } = await reader.read();
    if (done) {
      buffer += decoder.decode();
      break;
    }

    bytesRead += value.byteLength;
    buffer += decoder.decode(value, { stream: true });

    let newlineIndex = buffer.indexOf("\n");
    while (newlineIndex !== -1) {
      const line = buffer.slice(0, newlineIndex);
      buffer = buffer.slice(newlineIndex + 1);
      const trimmed = line.trim();

      if (trimmed !== "") {
        lineNumber += 1;
        const versionData = parseVersionLine(trimmed, url, lineNumber);
        if (predicate(versionData)) {
          await reader.cancel();
          return { bytesRead, matchedVersion: versionData };
        }
      }

      newlineIndex = buffer.indexOf("\n");
    }
  }

  if (buffer.trim() !== "") {
    lineNumber += 1;
    const versionData = parseVersionLine(buffer.trim(), url, lineNumber);
    if (predicate(versionData)) {
      return { bytesRead, matchedVersion: versionData };
    }
  }

  return { bytesRead, matchedVersion: undefined };
}

function versionSatisfies(version, versionSpecifier) {
  return (
    semver.satisfies(version, versionSpecifier) ||
    pep440.satisfies(version, versionSpecifier)
  );
}

function maxSatisfying(versions, versionSpecifier) {
  const semverMatch = semver.maxSatisfying(versions, versionSpecifier);
  if (semverMatch !== null) {
    return semverMatch;
  }

  return pep440.maxSatisfying(versions, versionSpecifier) ?? undefined;
}

function selectArtifact(artifacts) {
  if (artifacts.length === 1) {
    return artifacts[0];
  }

  const defaultVariant = artifacts.find(
    (candidate) => candidate.variant === "default",
  );
  if (defaultVariant !== undefined) {
    return defaultVariant;
  }

  return artifacts[0];
}
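The core trick in `streamUntil` above is buffering decoded chunks and splitting on newlines, so each NDJSON record can be tested as soon as its line is complete and the download canceled on the first match. That buffering logic can be exercised without any network; a minimal sketch with string chunks standing in for the response stream (the helper name is illustrative):

```javascript
// Feed string chunks through the same newline-buffer logic streamUntil uses,
// stopping at the first complete line whose parsed record satisfies predicate.
function scanChunks(chunks, predicate) {
  let buffer = "";
  let linesSeen = 0;
  for (const chunk of chunks) {
    buffer += chunk;
    let newlineIndex = buffer.indexOf("\n");
    while (newlineIndex !== -1) {
      const line = buffer.slice(0, newlineIndex);
      buffer = buffer.slice(newlineIndex + 1);
      if (line.trim() !== "") {
        linesSeen += 1;
        const record = JSON.parse(line.trim());
        if (predicate(record)) {
          // Early exit: remaining chunks are never parsed.
          return { record, linesSeen };
        }
      }
      newlineIndex = buffer.indexOf("\n");
    }
  }
  return { record: undefined, linesSeen };
}

// A record split across chunk boundaries is only parsed once its newline arrives.
const result = scanChunks(
  ['{"version":"0.10.7"}\n{"ver', 'sion":"0.10.6"}\n{"version":"0.10.5"}\n'],
  (r) => r.version === "0.10.6",
);
console.log(result.record.version, result.linesSeen); // 0.10.6 2
```

This is why the "latest version" and "exact version artifact" scenarios below read far fewer bytes with the streaming client: the predicate usually matches within the first few lines.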
async function benchmarkCase(name, expected, implementations, iterations) {
  const results = {
    name,
    new: [],
    old: [],
  };

  for (let iteration = 0; iteration < iterations; iteration += 1) {
    const order = iteration % 2 === 0 ? ["old", "new"] : ["new", "old"];

    for (const label of order) {
      const implementation = implementations[label];
      const startedAt = performance.now();
      const outcome = await implementation.run();
      const durationMs = performance.now() - startedAt;

      if (outcome.value !== expected) {
        throw new Error(
          `${name} ${label} produced ${JSON.stringify(outcome.value)}; expected ${JSON.stringify(expected)}`,
        );
      }

      results[label].push({
        bytesRead: outcome.bytesRead,
        durationMs,
      });
    }
  }

  return results;
}

function summarize(samples) {
  const durations = samples
    .map((sample) => sample.durationMs)
    .sort((left, right) => left - right);
  const bytes = samples
    .map((sample) => sample.bytesRead)
    .sort((left, right) => left - right);

  const sum = (values) => values.reduce((total, value) => total + value, 0);
  const percentile = (values, ratio) => {
    const index = Math.min(
      values.length - 1,
      Math.max(0, Math.ceil(values.length * ratio) - 1),
    );
    return values[index];
  };

  return {
    avgBytes: sum(bytes) / bytes.length,
    avgMs: sum(durations) / durations.length,
    maxMs: durations[durations.length - 1],
    medianMs: percentile(durations, 0.5),
    minMs: durations[0],
    p95Ms: percentile(durations, 0.95),
  };
}

function formatNumber(value, digits = 2) {
  return value.toFixed(digits);
}

function formatSummary(name, oldSummary, newSummary) {
  const speedup = oldSummary.avgMs / newSummary.avgMs;
  const timeReduction =
    ((oldSummary.avgMs - newSummary.avgMs) / oldSummary.avgMs) * 100;
  const byteReduction =
    ((oldSummary.avgBytes - newSummary.avgBytes) / oldSummary.avgBytes) * 100;

  return [
    `Scenario: ${name}`,
    `  old avg: ${formatNumber(oldSummary.avgMs)} ms | median: ${formatNumber(oldSummary.medianMs)} ms | p95: ${formatNumber(oldSummary.p95Ms)} ms | avg bytes: ${Math.round(oldSummary.avgBytes)}`,
    `  new avg: ${formatNumber(newSummary.avgMs)} ms | median: ${formatNumber(newSummary.medianMs)} ms | p95: ${formatNumber(newSummary.p95Ms)} ms | avg bytes: ${Math.round(newSummary.avgBytes)}`,
    `  delta: ${formatNumber(timeReduction)}% faster | ${formatNumber(speedup)}x speedup | ${formatNumber(byteReduction)}% fewer bytes read`,
  ].join("\n");
}
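The `percentile` helper inside `summarize` uses a nearest-rank rule over the sorted samples: take the ceiling of `n * ratio`, subtract one for zero-based indexing, and clamp to the array bounds. Checking it in isolation on a small sorted sample:

```javascript
// Nearest-rank percentile over a sorted array, same rule as the benchmark's helper.
function percentile(values, ratio) {
  const index = Math.min(
    values.length - 1,
    Math.max(0, Math.ceil(values.length * ratio) - 1),
  );
  return values[index];
}

const sorted = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
console.log(percentile(sorted, 0.5)); // 5  (ceil(10 * 0.5) - 1 = 4)
console.log(percentile(sorted, 0.95)); // 10 (ceil(9.5) - 1 = 9)
console.log(percentile(sorted, 1.0)); // 10 (clamped to the last element)
```

The clamping also makes the helper safe for single-element samples, where every ratio returns that one value.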
async function main() {
  const options = parseArgs(process.argv.slice(2));
  console.log(`Preparing benchmark data from ${options.url}`);
  const baseline = await fetchAllVersions(options.url);
  const latestVersion = baseline.versions[0]?.version;
  if (!latestVersion) {
    throw new Error("No versions found in NDJSON data");
  }

  const latestArtifact = selectArtifact(
    baseline.versions[0].artifacts.filter(
      (candidate) =>
        candidate.platform === `${options.arch}-${options.platform}`,
    ),
  );
  if (!latestArtifact) {
    throw new Error(
      `No artifact found for ${options.arch}-${options.platform} in ${latestVersion}`,
    );
  }

  const rangeSpecifier = `^${latestVersion.split(".")[0]}.${latestVersion.split(".")[1]}.0`;

  console.log(
    `Running ${options.iterations} iterations per scenario against ${options.url}`,
  );
  console.log(`Latest version: ${latestVersion}`);
  console.log(`Range benchmark: ${rangeSpecifier}`);
  console.log(`Artifact benchmark: ${options.arch}-${options.platform}`);
  console.log("");

  const scenarios = [
    await benchmarkCase(
      "latest version",
      latestVersion,
      {
        new: {
          run: async () => {
            const { bytesRead, matchedVersion } = await streamUntil(
              options.url,
              () => true,
            );
            return {
              bytesRead,
              value: matchedVersion?.version,
            };
          },
        },
        old: {
          run: async () => {
            const { bytesRead, versions } = await fetchAllVersions(options.url);
            return {
              bytesRead,
              value: versions[0]?.version,
            };
          },
        },
      },
      options.iterations,
    ),
    await benchmarkCase(
      "highest satisfying range",
      latestVersion,
      {
        new: {
          run: async () => {
            const { bytesRead, matchedVersion } = await streamUntil(
              options.url,
              (candidate) =>
                versionSatisfies(candidate.version, rangeSpecifier),
            );
            return {
              bytesRead,
              value: matchedVersion?.version,
            };
          },
        },
        old: {
          run: async () => {
            const { bytesRead, versions } = await fetchAllVersions(options.url);
            return {
              bytesRead,
              value: maxSatisfying(
                versions.map((versionData) => versionData.version),
                rangeSpecifier,
              ),
            };
          },
        },
      },
      options.iterations,
    ),
    await benchmarkCase(
      "exact version artifact",
      latestArtifact.url,
      {
        new: {
          run: async () => {
            const { bytesRead, matchedVersion } = await streamUntil(
              options.url,
              (candidate) => candidate.version === latestVersion,
            );
            const artifact = matchedVersion
              ? selectArtifact(
                  matchedVersion.artifacts.filter(
                    (candidate) =>
                      candidate.platform ===
                      `${options.arch}-${options.platform}`,
                  ),
                )
              : undefined;
            return {
              bytesRead,
              value: artifact?.url,
            };
          },
        },
        old: {
          run: async () => {
            const { bytesRead, versions } = await fetchAllVersions(options.url);
            const versionData = versions.find(
              (candidate) => candidate.version === latestVersion,
            );
            const artifact = selectArtifact(
              versionData.artifacts.filter(
                (candidate) =>
                  candidate.platform === `${options.arch}-${options.platform}`,
              ),
            );
            return {
              bytesRead,
              value: artifact?.url,
            };
          },
        },
      },
      options.iterations,
    ),
  ];

  for (const scenario of scenarios) {
    const oldSummary = summarize(scenario.old);
    const newSummary = summarize(scenario.new);
    console.log(formatSummary(scenario.name, oldSummary, newSummary));
    console.log("");
  }
}

await main();
   33  scripts/build-dist.mjs (new file)
@@ -0,0 +1,33 @@
import { rm } from "node:fs/promises";
import { build } from "esbuild";

const builds = [
  {
    entryPoints: ["src/setup-uv.ts"],
    outfile: "dist/setup/index.cjs",
    staleOutfiles: ["dist/setup/index.mjs"],
  },
  {
    entryPoints: ["src/save-cache.ts"],
    outfile: "dist/save-cache/index.cjs",
    staleOutfiles: ["dist/save-cache/index.mjs"],
  },
  {
    entryPoints: ["src/update-known-checksums.ts"],
    outfile: "dist/update-known-checksums/index.cjs",
    staleOutfiles: ["dist/update-known-checksums/index.mjs"],
  },
];

for (const { staleOutfiles, ...options } of builds) {
  await Promise.all(
    staleOutfiles.map((outfile) => rm(outfile, { force: true })),
  );
  await build({
    bundle: true,
    format: "cjs",
    platform: "node",
    target: "node24",
    ...options,
  });
}
@@ -6,33 +6,35 @@ import type { Architecture, Platform } from "../../utils/platforms";
 import { KNOWN_CHECKSUMS } from "./known-checksums";

 export async function validateChecksum(
-  checkSum: string | undefined,
+  checksum: string | undefined,
   downloadPath: string,
   arch: Architecture,
   platform: Platform,
   version: string,
 ): Promise<void> {
-  let isValid: boolean | undefined;
-  if (checkSum !== undefined && checkSum !== "") {
-    isValid = await validateFileCheckSum(downloadPath, checkSum);
-  } else {
-    core.debug("Checksum not provided. Checking known checksums.");
-    const key = `${arch}-${platform}-${version}`;
-    if (key in KNOWN_CHECKSUMS) {
-      const knownChecksum = KNOWN_CHECKSUMS[`${arch}-${platform}-${version}`];
-      core.debug(`Checking checksum for ${arch}-${platform}-${version}.`);
-      isValid = await validateFileCheckSum(downloadPath, knownChecksum);
-    } else {
-      core.debug(`No known checksum found for ${key}.`);
-    }
-  }
+  const key = `${arch}-${platform}-${version}`;
+  const hasProvidedChecksum = checksum !== undefined && checksum !== "";
+  const checksumToUse = hasProvidedChecksum ? checksum : KNOWN_CHECKSUMS[key];
+
+  if (checksumToUse === undefined) {
+    core.debug(`No checksum found for ${key}.`);
+    return;
+  }

-  if (isValid === false) {
-    throw new Error(`Checksum for ${downloadPath} did not match ${checkSum}.`);
-  }
+  const checksumSource = hasProvidedChecksum
+    ? "provided checksum"
+    : `KNOWN_CHECKSUMS entry for ${key}`;
+
+  core.debug(`Validating checksum using ${checksumSource}.`);
+  const isValid = await validateFileCheckSum(downloadPath, checksumToUse);
+
+  if (!isValid) {
+    throw new Error(
+      `Checksum for ${downloadPath} did not match ${checksumToUse}.`,
+    );
+  }

-  if (isValid === true) {
-    core.debug(`Checksum for ${downloadPath} is valid.`);
-  }
+  core.debug(`Checksum for ${downloadPath} is valid.`);
 }

 async function validateFileCheckSum(
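The rewritten `validateChecksum` reduces to a single precedence rule: a non-empty caller-provided checksum wins, otherwise the known-checksums table is consulted by key, and if neither yields a value, validation is skipped. That selection step can be sketched in isolation (plain JavaScript; the function name and table contents are illustrative):

```javascript
// Pick which checksum to validate against: the provided input first,
// then the known-checksums table; undefined means "skip validation".
function resolveExpectedChecksum(provided, knownChecksums, key) {
  const hasProvided = provided !== undefined && provided !== "";
  return hasProvided ? provided : knownChecksums[key];
}

const known = { "aarch64-apple-darwin-0.10.10": "8a09f0ef" };

console.log(resolveExpectedChecksum("deadbeef", known, "aarch64-apple-darwin-0.10.10")); // deadbeef
console.log(resolveExpectedChecksum("", known, "aarch64-apple-darwin-0.10.10")); // 8a09f0ef
console.log(resolveExpectedChecksum(undefined, known, "missing-key")); // undefined
```

Folding both sources into one `checksumToUse` value is what lets the rewrite validate and raise in one place instead of tracking a three-state `isValid`.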
@@ -1,5 +1,39 @@
 // AUTOGENERATED_DO_NOT_EDIT
 export const KNOWN_CHECKSUMS: { [key: string]: string } = {
+  "aarch64-apple-darwin-0.10.10":
+    "8a09f0ef51ee7f7170731b4cb8bde5bf9ba6da5304f49a7df6cdab42a1f37b5d",
+  "aarch64-pc-windows-msvc-0.10.10":
+    "2c6fe113f14574bc27f085751c68d3485589fcc3c3c64ed85dd1eecc2f87cffc",
+  "aarch64-unknown-linux-gnu-0.10.10":
+    "2b80457b950deda12e8d5dc3b9b7494ac143eae47f1fb11b1c6e5a8495a6421e",
+  "aarch64-unknown-linux-musl-0.10.10":
+    "d08c08b82cdcaf2bd3d928ffe844d3558dda53f90066db6ef9174157cc763252",
+  "arm-unknown-linux-musleabihf-0.10.10":
+    "ccc3c4dd5eeea4b2be829ef9bc0b8d9882389c0f303f7ec5ba668065d57e2673",
+  "armv7-unknown-linux-gnueabihf-0.10.10":
+    "032786622b52f8d0232b5ad16e25342a64f9e43576652db7bf607231021902f3",
+  "armv7-unknown-linux-musleabihf-0.10.10":
+    "f6f67b190eb28b473917c97210f89fd11d9b9393d774acd093ea738fcee68864",
+  "i686-pc-windows-msvc-0.10.10":
+    "980d7ea368cc4883f572bb85c285a647eddfc23539064d2bfaf8fbfefcc2112b",
+  "i686-unknown-linux-gnu-0.10.10":
+    "5260fbef838f8cfec44697064a5cfae08a27c6ab7ed7feab7fc946827e896952",
+  "i686-unknown-linux-musl-0.10.10":
+    "a6683ade964f8d8623098ca0c96b4311d8388b44a56a386cd795974f39fb5bd2",
+  "powerpc64le-unknown-linux-gnu-0.10.10":
+    "78939dc4fc905aca8af4be19b6c6ecc306f04c6ca9f98d144372595d9397fd0d",
+  "riscv64gc-unknown-linux-gnu-0.10.10":
+    "5eff670bf80fce9d9e50df5b4d46c415a9c0324eadf7059d97c76f89ffc33c3f",
+  "s390x-unknown-linux-gnu-0.10.10":
+    "a32d2be5600f7f42f82596ffe9d3115f020974ca7fb4f15251c5625c5481ea5e",
+  "x86_64-apple-darwin-0.10.10":
+    "dd18420591d625f9b4ca2b57a7a6fe3cce43910f02e02d90e47a4101428de14a",
+  "x86_64-pc-windows-msvc-0.10.10":
+    "d31a30f1dfb96e630a08d5a9b3f3f551254b7ed6e9b7e495f46a4232661c7252",
+  "x86_64-unknown-linux-gnu-0.10.10":
+    "3e1027f26ce8c7e4c32e2277a7fed2cb410f2f1f9320d3df97653d40e21f415b",
+  "x86_64-unknown-linux-musl-0.10.10":
+    "74544e8755fbc27559e22e29fd561bdc48f91b8bd8323e760a1130f32433bea4",
   "aarch64-apple-darwin-0.10.9":
     "a92f61e9ac9b0f29668c15f56152e4a60143fca148ff5bfadb86718472c3f376",
   "aarch64-pc-windows-msvc-0.10.9":
@@ -1,59 +1,34 @@
 import { promises as fs } from "node:fs";
-import * as tc from "@actions/tool-cache";
-import { KNOWN_CHECKSUMS } from "./known-checksums";
+
+export interface ChecksumEntry {
+  key: string;
+  checksum: string;
+}

 export async function updateChecksums(
   filePath: string,
-  downloadUrls: string[],
+  checksumEntries: ChecksumEntry[],
 ): Promise<void> {
-  await fs.rm(filePath);
-  await fs.appendFile(
-    filePath,
-    "// AUTOGENERATED_DO_NOT_EDIT\nexport const KNOWN_CHECKSUMS: { [key: string]: string } = {\n",
-  );
-  let firstLine = true;
-  for (const downloadUrl of downloadUrls) {
-    const key = getKey(downloadUrl);
-    if (key === undefined) {
+  const deduplicatedEntries = new Map<string, string>();
+
+  for (const entry of checksumEntries) {
+    if (deduplicatedEntries.has(entry.key)) {
       continue;
     }
-    const checksum = await getOrDownloadChecksum(key, downloadUrl);
-    if (!firstLine) {
-      await fs.appendFile(filePath, ",\n");
-    }
-    await fs.appendFile(filePath, `  "${key}":\n    "${checksum}"`);
-    firstLine = false;
-  }
-  await fs.appendFile(filePath, ",\n};\n");
-}
-
-function getKey(downloadUrl: string): string | undefined {
-  // https://github.com/astral-sh/uv/releases/download/0.3.2/uv-aarch64-apple-darwin.tar.gz.sha256
-  const parts = downloadUrl.split("/");
-  const fileName = parts[parts.length - 1];
-  if (fileName.startsWith("source")) {
-    return undefined;
+    deduplicatedEntries.set(entry.key, entry.checksum);
   }
-  const name = fileName.split(".")[0].split("uv-")[1];
-  const version = parts[parts.length - 2];
-  return `${name}-${version}`;
-}
-
-async function getOrDownloadChecksum(
-  key: string,
-  downloadUrl: string,
-): Promise<string> {
-  let checksum = "";
-  if (key in KNOWN_CHECKSUMS) {
-    checksum = KNOWN_CHECKSUMS[key];
-  } else {
-    const content = await downloadAssetContent(downloadUrl);
-    checksum = content.split(" ")[0].trim();
-  }
-  return checksum;
-}
+  const body = [...deduplicatedEntries.entries()]
+    .map(([key, checksum]) => `  "${key}":\n    "${checksum}"`)
+    .join(",\n");

-async function downloadAssetContent(downloadUrl: string): Promise<string> {
-  const downloadPath = await tc.downloadTool(downloadUrl);
-  const content = await fs.readFile(downloadPath, "utf8");
-  return content;
+  const content =
+    "// AUTOGENERATED_DO_NOT_EDIT\n" +
+    "export const KNOWN_CHECKSUMS: { [key: string]: string } = {\n" +
+    body +
+    (body === "" ? "" : ",\n") +
+    "};\n";
+
+  await fs.writeFile(filePath, content);
 }
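The new `updateChecksums` builds the entire generated module as one string and writes it with a single `writeFile`, instead of issuing many `appendFile` calls, and it deduplicates entries by key with the first occurrence winning. A minimal sketch of that rendering step (plain JavaScript; the function name is illustrative):

```javascript
// Deduplicate checksum entries by key (first occurrence wins) and render
// the generated TypeScript module as a single string, as the rewrite does.
function renderKnownChecksums(entries) {
  const deduplicated = new Map();
  for (const entry of entries) {
    if (!deduplicated.has(entry.key)) {
      deduplicated.set(entry.key, entry.checksum);
    }
  }
  const body = [...deduplicated.entries()]
    .map(([key, checksum]) => `  "${key}":\n    "${checksum}"`)
    .join(",\n");
  return (
    "// AUTOGENERATED_DO_NOT_EDIT\n" +
    "export const KNOWN_CHECKSUMS: { [key: string]: string } = {\n" +
    body +
    (body === "" ? "" : ",\n") +
    "};\n"
  );
}

const rendered = renderKnownChecksums([
  { key: "a-1.0", checksum: "aaa" },
  { key: "a-1.0", checksum: "duplicate-is-ignored" },
  { key: "b-1.0", checksum: "bbb" },
]);
console.log(rendered.includes('"aaa"'), rendered.includes("duplicate-is-ignored")); // true false
```

Rendering in memory also makes the empty-input case explicit: with no entries, the conditional trailing comma is skipped and the output is just the header and an empty object literal.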
@@ -2,20 +2,22 @@ import { promises as fs } from "node:fs";
 import * as path from "node:path";
 import * as core from "@actions/core";
 import * as tc from "@actions/tool-cache";
-import type { Endpoints } from "@octokit/types";
 import * as pep440 from "@renovatebot/pep440";
 import * as semver from "semver";
-import { OWNER, REPO, TOOL_CACHE_NAME } from "../utils/constants";
-import { Octokit } from "../utils/octokit";
+import { TOOL_CACHE_NAME, VERSIONS_NDJSON_URL } from "../utils/constants";
 import type { Architecture, Platform } from "../utils/platforms";
 import { validateChecksum } from "./checksum/checksum";
 import {
-  getDownloadUrl,
+  getAllVersions as getAllManifestVersions,
+  getLatestKnownVersion as getLatestVersionInManifest,
+  getManifestArtifact,
 } from "./version-manifest";
-
-type Release =
-  Endpoints["GET /repos/{owner}/{repo}/releases"]["response"]["data"][number];
+import {
+  getAllVersions as getAllVersionsFromNdjson,
+  getArtifact as getArtifactFromNdjson,
+  getHighestSatisfyingVersion as getHighestSatisfyingVersionFromNdjson,
+  getLatestVersion as getLatestVersionFromNdjson,
+} from "./versions-client";

 export function tryGetFromToolCache(
   arch: Architecture,
@@ -32,19 +34,26 @@ export function tryGetFromToolCache(
   return { installedPath, version: resolvedVersion };
 }

-export async function downloadVersionFromGithub(
+export async function downloadVersionFromNdjson(
   platform: Platform,
   arch: Architecture,
   version: string,
   checkSum: string | undefined,
   githubToken: string,
 ): Promise<{ version: string; cachedToolDir: string }> {
-  const artifact = `uv-${arch}-${platform}`;
-  const extension = getExtension(platform);
-  const downloadUrl = `https://github.com/${OWNER}/${REPO}/releases/download/${version}/${artifact}${extension}`;
+  const artifact = await getArtifactFromNdjson(version, arch, platform);
+
+  if (!artifact) {
+    throw new Error(
+      `Could not find artifact for version ${version}, arch ${arch}, platform ${platform} in ${VERSIONS_NDJSON_URL} .`,
+    );
+  }
+
+  // For the default astral-sh/versions source, checksum validation relies on
+  // user input or the built-in KNOWN_CHECKSUMS table, not NDJSON sha256 values.
   return await downloadVersion(
-    downloadUrl,
-    artifact,
+    artifact.url,
+    `uv-${arch}-${platform}`,
     platform,
     arch,
     version,
@@ -54,38 +63,32 @@ export async function downloadVersionFromGithub(
|
||||
}
|
||||
|
||||
export async function downloadVersionFromManifest(
|
||||
manifestUrl: string | undefined,
|
||||
manifestUrl: string,
|
||||
platform: Platform,
|
||||
arch: Architecture,
|
||||
version: string,
|
||||
checkSum: string | undefined,
|
||||
githubToken: string,
|
||||
): Promise<{ version: string; cachedToolDir: string }> {
|
||||
const downloadUrl = await getDownloadUrl(
|
||||
const artifact = await getManifestArtifact(
|
||||
manifestUrl,
|
||||
version,
|
||||
arch,
|
||||
platform,
|
||||
);
|
||||
if (!downloadUrl) {
|
||||
core.info(
|
||||
`manifest-file does not contain version ${version}, arch ${arch}, platform ${platform}. Falling back to GitHub releases.`,
|
||||
);
|
||||
return await downloadVersionFromGithub(
|
||||
platform,
|
||||
arch,
|
||||
version,
|
||||
checkSum,
|
||||
githubToken,
|
||||
if (!artifact) {
|
||||
throw new Error(
|
||||
`manifest-file does not contain version ${version}, arch ${arch}, platform ${platform}.`,
|
||||
);
|
||||
}
|
||||
|
||||
return await downloadVersion(
|
||||
downloadUrl,
|
||||
artifact.downloadUrl,
|
||||
`uv-${arch}-${platform}`,
|
||||
platform,
|
||||
arch,
|
||||
version,
|
||||
checkSum,
|
||||
resolveChecksum(checkSum, artifact.checksum),
|
||||
githubToken,
|
||||
);
|
||||
}
|
||||
@@ -96,7 +99,7 @@ async function downloadVersion(
|
||||
platform: Platform,
|
||||
arch: Architecture,
|
||||
version: string,
|
||||
checkSum: string | undefined,
|
||||
checksum: string | undefined,
|
||||
githubToken: string,
|
||||
): Promise<{ version: string; cachedToolDir: string }> {
|
||||
core.info(`Downloading uv from "${downloadUrl}" ...`);
|
||||
@@ -105,14 +108,14 @@ async function downloadVersion(
|
||||
undefined,
|
||||
githubToken,
|
||||
);
|
||||
await validateChecksum(checkSum, downloadPath, arch, platform, version);
|
||||
await validateChecksum(checksum, downloadPath, arch, platform, version);
|
||||
|
||||
let uvDir: string;
|
||||
if (platform === "pc-windows-msvc") {
|
||||
// On windows extracting the zip does not create an intermediate directory
|
||||
// On windows extracting the zip does not create an intermediate directory.
|
||||
try {
|
||||
// Try tar first as it's much faster, but only bsdtar supports zip files,
|
||||
// so this my fail if another tar, like gnu tar, ends up being used.
|
||||
// so this may fail if another tar, like gnu tar, ends up being used.
|
||||
uvDir = await tc.extractTar(downloadPath, undefined, "x");
|
||||
} catch (err) {
|
||||
core.info(
|
||||
@@ -127,6 +130,7 @@ async function downloadVersion(
|
||||
const extractedDir = await tc.extractTar(downloadPath);
|
||||
uvDir = path.join(extractedDir, artifactName);
|
||||
}
|
||||
|
||||
const cachedToolDir = await tc.cacheDir(
|
||||
uvDir,
|
||||
TOOL_CACHE_NAME,
|
||||
@@ -136,14 +140,22 @@ async function downloadVersion(
|
||||
return { cachedToolDir, version: version };
|
||||
}
|
||||
|
||||
function resolveChecksum(
|
||||
checkSum: string | undefined,
|
||||
manifestChecksum?: string,
|
||||
): string | undefined {
|
||||
return checkSum !== undefined && checkSum !== ""
|
||||
? checkSum
|
||||
: manifestChecksum;
|
||||
}
|
||||
|
||||
function getExtension(platform: Platform): string {
|
||||
return platform === "pc-windows-msvc" ? ".zip" : ".tar.gz";
|
||||
}
|
||||
|
||||
export async function resolveVersion(
|
||||
versionInput: string,
|
||||
manifestFile: string | undefined,
|
||||
githubToken: string,
|
||||
manifestUrl: string | undefined,
|
||||
resolutionStrategy: "highest" | "lowest" = "highest",
|
||||
): Promise<string> {
|
||||
core.debug(`Resolving version: ${versionInput}`);
|
||||
@@ -155,15 +167,15 @@ export async function resolveVersion(
|
||||
if (resolveVersionSpecifierToLatest) {
|
||||
core.info("Found minimum version specifier, using latest version");
|
||||
}
|
||||
if (manifestFile) {
|
||||
if (manifestUrl !== undefined) {
|
||||
version =
|
||||
versionInput === "latest" || resolveVersionSpecifierToLatest
|
||||
? await getLatestVersionInManifest(manifestFile)
|
||||
? await getLatestVersionInManifest(manifestUrl)
|
||||
: versionInput;
|
||||
} else {
|
||||
version =
|
||||
versionInput === "latest" || resolveVersionSpecifierToLatest
|
||||
? await getLatestVersion(githubToken)
|
||||
? await getLatestVersionFromNdjson()
|
||||
: versionInput;
|
||||
}
|
||||
if (tc.isExplicitVersion(version)) {
|
||||
@@ -175,91 +187,44 @@ export async function resolveVersion(
|
||||
}
|
||||
return version;
|
||||
}
|
||||
const availableVersions = await getAvailableVersions(githubToken);
|
||||
|
||||
if (manifestUrl === undefined && resolutionStrategy === "highest") {
|
||||
const resolvedVersion =
|
||||
await getHighestSatisfyingVersionFromNdjson(version);
|
||||
if (resolvedVersion !== undefined) {
|
||||
core.debug(`Resolved version from NDJSON stream: ${resolvedVersion}`);
|
||||
return resolvedVersion;
|
||||
}
|
||||
|
||||
throw new Error(`No version found for ${version}`);
|
||||
}
|
||||
|
||||
const availableVersions = await getAvailableVersions(manifestUrl);
|
||||
core.debug(`Available versions: ${availableVersions}`);
|
||||
const resolvedVersion =
|
||||
resolutionStrategy === "lowest"
|
||||
? minSatisfying(availableVersions, version)
|
||||
: maxSatisfying(availableVersions, version);
|
||||
|
||||
if (resolvedVersion === undefined) {
|
||||
throw new Error(`No version found for ${version}`);
|
||||
}
|
||||
|
||||
return resolvedVersion;
|
||||
}
|
||||
|
||||
async function getAvailableVersions(githubToken: string): Promise<string[]> {
|
||||
core.info("Getting available versions from GitHub API...");
|
||||
try {
|
||||
const octokit = new Octokit({
|
||||
auth: githubToken,
|
||||
});
|
||||
return await getReleaseTagNames(octokit);
|
||||
} catch (err) {
|
||||
if ((err as Error).message.includes("Bad credentials")) {
|
||||
async function getAvailableVersions(
|
||||
manifestUrl: string | undefined,
|
||||
): Promise<string[]> {
|
||||
if (manifestUrl !== undefined) {
|
||||
core.info(
|
||||
"No (valid) GitHub token provided. Falling back to anonymous. Requests might be rate limited.",
|
||||
`Getting available versions from manifest-file ${manifestUrl} ...`,
|
||||
);
|
||||
const octokit = new Octokit();
|
||||
return await getReleaseTagNames(octokit);
|
||||
}
|
||||
throw err;
|
||||
}
|
||||
}
|
||||
|
||||
async function getReleaseTagNames(octokit: Octokit): Promise<string[]> {
|
||||
const response: Release[] = await octokit.paginate(
|
||||
octokit.rest.repos.listReleases,
|
||||
{
|
||||
owner: OWNER,
|
||||
repo: REPO,
|
||||
},
|
||||
);
|
||||
const releaseTagNames = response.map((release) => release.tag_name);
|
||||
if (releaseTagNames.length === 0) {
|
||||
throw Error(
|
||||
"Github API request failed while getting releases. Check the GitHub status page for outages. Try again later.",
|
||||
);
|
||||
}
|
||||
return releaseTagNames;
|
||||
}
|
||||
|
||||
async function getLatestVersion(githubToken: string) {
|
||||
core.info("Getting latest version from GitHub API...");
|
||||
const octokit = new Octokit({
|
||||
auth: githubToken,
|
||||
});
|
||||
|
||||
let latestRelease: { tag_name: string } | undefined;
|
||||
try {
|
||||
latestRelease = await getLatestRelease(octokit);
|
||||
} catch (err) {
|
||||
if ((err as Error).message.includes("Bad credentials")) {
|
||||
core.info(
|
||||
"No (valid) GitHub token provided. Falling back to anonymous. Requests might be rate limited.",
|
||||
);
|
||||
const octokit = new Octokit();
|
||||
latestRelease = await getLatestRelease(octokit);
|
||||
} else {
|
||||
core.error(
|
||||
"Github API request failed while getting latest release. Check the GitHub status page for outages. Try again later.",
|
||||
);
|
||||
throw err;
|
||||
}
|
||||
return await getAllManifestVersions(manifestUrl);
|
||||
}
|
||||
|
||||
if (!latestRelease) {
|
||||
throw new Error("Could not determine latest release.");
|
||||
}
|
||||
core.debug(`Latest version: ${latestRelease.tag_name}`);
|
||||
return latestRelease.tag_name;
|
||||
}
|
||||
|
||||
async function getLatestRelease(octokit: Octokit) {
|
||||
const { data: latestRelease } = await octokit.rest.repos.getLatestRelease({
|
||||
owner: OWNER,
|
||||
repo: REPO,
|
||||
});
|
||||
return latestRelease;
|
||||
core.info(`Getting available versions from ${VERSIONS_NDJSON_URL} ...`);
|
||||
return await getAllVersionsFromNdjson();
|
||||
}
|
||||
|
||||
function maxSatisfying(
|
||||
|
||||
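The checksum-precedence rule added by `resolveChecksum` in the hunk above can be exercised in isolation. This is a standalone copy for illustration only (the real helper lives in `src/download/download-version.ts`): an explicit, non-empty user-supplied checksum wins, and anything else falls back to the manifest-provided checksum.

```typescript
// Standalone sketch of the resolveChecksum precedence rule shown in the
// diff above: user input wins only when it is a non-empty string.
function resolveChecksum(
  checkSum: string | undefined,
  manifestChecksum?: string,
): string | undefined {
  return checkSum !== undefined && checkSum !== ""
    ? checkSum
    : manifestChecksum;
}

console.log(resolveChecksum("abc", "def")); // "abc" (user input wins)
console.log(resolveChecksum("", "def")); // "def" (empty input falls through)
console.log(resolveChecksum(undefined, undefined)); // undefined
```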
80  src/download/legacy-version-manifest.ts  Normal file
@@ -0,0 +1,80 @@
import * as core from "@actions/core";

export interface ManifestEntry {
  arch: string;
  platform: string;
  version: string;
  downloadUrl: string;
  checksum?: string;
  variant?: string;
  archiveFormat?: string;
}

interface LegacyManifestEntry {
  arch: string;
  platform: string;
  version: string;
  downloadUrl: string;
  checksum?: string;
}

const warnedLegacyManifestUrls = new Set<string>();

export function parseLegacyManifestEntries(
  parsedEntries: unknown[],
  manifestUrl: string,
): ManifestEntry[] {
  warnAboutLegacyManifestFormat(manifestUrl);

  return parsedEntries.map((entry, index) => {
    if (!isLegacyManifestEntry(entry)) {
      throw new Error(
        `Invalid legacy manifest-file entry at index ${index} in ${manifestUrl}.`,
      );
    }

    return {
      arch: entry.arch,
      checksum: entry.checksum,
      downloadUrl: entry.downloadUrl,
      platform: entry.platform,
      version: entry.version,
    };
  });
}

export function clearLegacyManifestWarnings(): void {
  warnedLegacyManifestUrls.clear();
}

function warnAboutLegacyManifestFormat(manifestUrl: string): void {
  if (warnedLegacyManifestUrls.has(manifestUrl)) {
    return;
  }

  warnedLegacyManifestUrls.add(manifestUrl);
  core.warning(
    `manifest-file ${manifestUrl} uses the legacy JSON array format, which is deprecated. Please migrate to the astral-sh/versions NDJSON format before the next major release.`,
  );
}

function isLegacyManifestEntry(value: unknown): value is LegacyManifestEntry {
  if (!isRecord(value)) {
    return false;
  }

  const checksumIsValid =
    typeof value.checksum === "string" || value.checksum === undefined;

  return (
    typeof value.arch === "string" &&
    checksumIsValid &&
    typeof value.downloadUrl === "string" &&
    typeof value.platform === "string" &&
    typeof value.version === "string"
  );
}

function isRecord(value: unknown): value is Record<string, unknown> {
  return typeof value === "object" && value !== null;
}
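The structural validation used by `isLegacyManifestEntry` above can be sketched on its own. This is an illustrative reimplementation, not the module itself: a legacy entry must be an object with string `arch`, `platform`, `version`, and `downloadUrl` fields and an optional string `checksum`.

```typescript
// Standalone sketch of the legacy-entry type guard: narrow `unknown` to a
// record, then check each required field's type.
function isLegacyManifestEntry(value: unknown): boolean {
  if (typeof value !== "object" || value === null) return false;
  const record = value as Record<string, unknown>;
  return (
    typeof record.arch === "string" &&
    typeof record.platform === "string" &&
    typeof record.version === "string" &&
    typeof record.downloadUrl === "string" &&
    (record.checksum === undefined || typeof record.checksum === "string")
  );
}

// A complete entry passes; a partial one is rejected.
console.log(
  isLegacyManifestEntry({
    arch: "x86_64",
    platform: "unknown-linux-gnu",
    version: "0.9.0",
    downloadUrl: "https://example.invalid/uv.tar.gz",
  }),
); // true
console.log(isLegacyManifestEntry({ arch: "x86_64" })); // false
```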
39  src/download/variant-selection.ts  Normal file
@@ -0,0 +1,39 @@
interface VariantAwareEntry {
  variant?: string;
}

export function selectDefaultVariant<T extends VariantAwareEntry>(
  entries: T[],
  duplicateEntryDescription: string,
): T {
  const firstEntry = entries[0];
  if (firstEntry === undefined) {
    throw new Error("selectDefaultVariant requires at least one candidate.");
  }

  if (entries.length === 1) {
    return firstEntry;
  }

  const defaultEntries = entries.filter((entry) =>
    isDefaultVariant(entry.variant),
  );
  if (defaultEntries.length === 1) {
    return defaultEntries[0];
  }

  throw new Error(
    `${duplicateEntryDescription} with variants ${formatVariants(entries)}. setup-uv currently requires a single default variant for duplicate platform entries.`,
  );
}

function isDefaultVariant(variant: string | undefined): boolean {
  return variant === undefined || variant === "default";
}

function formatVariants<T extends VariantAwareEntry>(entries: T[]): string {
  return entries
    .map((entry) => entry.variant ?? "default")
    .sort((left, right) => left.localeCompare(right))
    .join(", ");
}
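The selection behavior of the new helper above can be demonstrated with a standalone copy (reimplemented here for illustration; the real code lives in `src/download/variant-selection.ts`): a single candidate is returned as-is, and among duplicates exactly one default variant must exist.

```typescript
interface VariantAwareEntry {
  variant?: string;
}

// Standalone sketch of selectDefaultVariant: among duplicate entries for the
// same platform, pick the one whose variant is missing or "default".
function selectDefaultVariant<T extends VariantAwareEntry>(
  entries: T[],
  description: string,
): T {
  if (entries.length === 0) {
    throw new Error("selectDefaultVariant requires at least one candidate.");
  }
  if (entries.length === 1) {
    return entries[0];
  }
  const defaults = entries.filter(
    (entry) => entry.variant === undefined || entry.variant === "default",
  );
  if (defaults.length === 1) {
    return defaults[0];
  }
  throw new Error(`${description}: expected exactly one default variant.`);
}

// Among duplicate platform entries, the "default" variant wins.
const picked = selectDefaultVariant(
  [
    { variant: "musl", url: "https://example.invalid/uv-musl.tar.gz" },
    { variant: "default", url: "https://example.invalid/uv.tar.gz" },
  ],
  "example duplicates",
);
console.log(picked.url); // "https://example.invalid/uv.tar.gz"
```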
@@ -1,49 +1,78 @@
import { promises as fs } from "node:fs";
import { join } from "node:path";
import * as core from "@actions/core";
import * as semver from "semver";
import { fetch } from "../utils/fetch";
import {
  clearLegacyManifestWarnings,
  type ManifestEntry,
  parseLegacyManifestEntries,
} from "./legacy-version-manifest";
import { selectDefaultVariant } from "./variant-selection";
import { type NdjsonVersion, parseVersionData } from "./versions-client";

const localManifestFile = join(__dirname, "..", "..", "version-manifest.json");

interface ManifestEntry {
  version: string;
  artifactName: string;
  arch: string;
  platform: string;
export interface ManifestArtifact {
  downloadUrl: string;
  checksum?: string;
  archiveFormat?: string;
}

const cachedManifestEntries = new Map<string, ManifestEntry[]>();

export async function getLatestKnownVersion(
  manifestUrl: string | undefined,
  manifestUrl: string,
): Promise<string> {
  const manifestEntries = await getManifestEntries(manifestUrl);
  return manifestEntries.reduce((a, b) =>
    semver.gt(a.version, b.version) ? a : b,
  ).version;
  const versions = await getAllVersions(manifestUrl);
  const latestVersion = versions.reduce((latest, current) =>
    semver.gt(current, latest) ? current : latest,
  );

  return latestVersion;
}

export async function getDownloadUrl(
  manifestUrl: string | undefined,
export async function getAllVersions(manifestUrl: string): Promise<string[]> {
  const manifestEntries = await getManifestEntries(manifestUrl);
  return [...new Set(manifestEntries.map((entry) => entry.version))];
}

export async function getManifestArtifact(
  manifestUrl: string,
  version: string,
  arch: string,
  platform: string,
): Promise<string | undefined> {
): Promise<ManifestArtifact | undefined> {
  const manifestEntries = await getManifestEntries(manifestUrl);
  const entry = manifestEntries.find(
    (entry) =>
      entry.version === version &&
      entry.arch === arch &&
      entry.platform === platform,
  const entry = selectManifestEntry(
    manifestEntries,
    manifestUrl,
    version,
    arch,
    platform,
  );
  return entry ? entry.downloadUrl : undefined;

  if (!entry) {
    return undefined;
  }

  return {
    archiveFormat: entry.archiveFormat,
    checksum: entry.checksum,
    downloadUrl: entry.downloadUrl,
  };
}

export function clearManifestCache(): void {
  cachedManifestEntries.clear();
  clearLegacyManifestWarnings();
}

async function getManifestEntries(
  manifestUrl: string | undefined,
  manifestUrl: string,
): Promise<ManifestEntry[]> {
  let data: string;
  if (manifestUrl !== undefined) {
  const cachedEntries = cachedManifestEntries.get(manifestUrl);
  if (cachedEntries !== undefined) {
    core.debug(`Using cached manifest-file from: ${manifestUrl}`);
    return cachedEntries;
  }

  core.info(`Fetching manifest-file from: ${manifestUrl}`);
  const response = await fetch(manifestUrl, {});
  if (!response.ok) {
@@ -51,41 +80,90 @@ async function getManifestEntries(
      `Failed to fetch manifest-file: ${response.status} ${response.statusText}`,
    );
  }
    data = await response.text();
  } else {
    core.info("manifest-file not provided, reading from local file.");
    const fileContent = await fs.readFile(localManifestFile);
    data = fileContent.toString();
  }

  return JSON.parse(data);
  const data = await response.text();
  const parsedEntries = parseManifestEntries(data, manifestUrl);
  cachedManifestEntries.set(manifestUrl, parsedEntries);

  return parsedEntries;
}

export async function updateVersionManifest(
function parseManifestEntries(
  data: string,
  manifestUrl: string,
  downloadUrls: string[],
): Promise<void> {
  const manifest: ManifestEntry[] = [];
): ManifestEntry[] {
  const trimmed = data.trim();
  if (trimmed === "") {
    throw new Error(`manifest-file at ${manifestUrl} is empty.`);
  }

  for (const downloadUrl of downloadUrls) {
    const urlParts = downloadUrl.split("/");
    const version = urlParts[urlParts.length - 2];
    const artifactName = urlParts[urlParts.length - 1];
    if (!artifactName.startsWith("uv-")) {
      continue;
  const parsedAsJson = tryParseJson(trimmed);
  if (Array.isArray(parsedAsJson)) {
    return parseLegacyManifestEntries(parsedAsJson, manifestUrl);
  }
    if (artifactName.startsWith("uv-installer")) {
      continue;

  const versions = parseVersionData(trimmed, manifestUrl);
  return mapNdjsonVersionsToManifestEntries(versions, manifestUrl);
}

function mapNdjsonVersionsToManifestEntries(
  versions: NdjsonVersion[],
  manifestUrl: string,
): ManifestEntry[] {
  const manifestEntries: ManifestEntry[] = [];

  for (const versionData of versions) {
    for (const artifact of versionData.artifacts) {
      const [arch, ...platformParts] = artifact.platform.split("-");
      if (arch === undefined || platformParts.length === 0) {
        throw new Error(
          `Invalid artifact platform '${artifact.platform}' in manifest-file ${manifestUrl}.`,
        );
      }
    const artifactParts = artifactName.split(".")[0].split("-");
    manifest.push({
      arch: artifactParts[1],
      artifactName: artifactName,
      downloadUrl: downloadUrl,
      platform: artifactName.split(`uv-${artifactParts[1]}-`)[1].split(".")[0],
      version: version,

      manifestEntries.push({
        arch,
        archiveFormat: artifact.archive_format,
        checksum: artifact.sha256,
        downloadUrl: artifact.url,
        platform: platformParts.join("-"),
        variant: artifact.variant,
        version: versionData.version,
      });
    }
  core.debug(`Updating manifest-file: ${JSON.stringify(manifest)}`);
  await fs.writeFile(manifestUrl, JSON.stringify(manifest));
  }

  return manifestEntries;
}

function selectManifestEntry(
  manifestEntries: ManifestEntry[],
  manifestUrl: string,
  version: string,
  arch: string,
  platform: string,
): ManifestEntry | undefined {
  const matches = manifestEntries.filter(
    (candidate) =>
      candidate.version === version &&
      candidate.arch === arch &&
      candidate.platform === platform,
  );

  if (matches.length === 0) {
    return undefined;
  }

  return selectDefaultVariant(
    matches,
    `manifest-file ${manifestUrl} contains multiple artifacts for version ${version}, arch ${arch}, platform ${platform}`,
  );
}

function tryParseJson(value: string): unknown {
  try {
    return JSON.parse(value);
  } catch {
    return undefined;
  }
}
380  src/download/versions-client.ts  Normal file
@@ -0,0 +1,380 @@
import * as core from "@actions/core";
import * as pep440 from "@renovatebot/pep440";
import * as semver from "semver";
import { VERSIONS_NDJSON_URL } from "../utils/constants";
import { fetch } from "../utils/fetch";
import { selectDefaultVariant } from "./variant-selection";

export interface NdjsonArtifact {
  platform: string;
  variant?: string;
  url: string;
  archive_format: string;
  sha256: string;
}

export interface NdjsonVersion {
  version: string;
  artifacts: NdjsonArtifact[];
}

export interface ArtifactResult {
  url: string;
  sha256: string;
  archiveFormat: string;
}

const cachedVersionData = new Map<string, NdjsonVersion[]>();
const cachedLatestVersionData = new Map<string, NdjsonVersion>();
const cachedVersionLookup = new Map<string, Map<string, NdjsonVersion>>();

export async function fetchVersionData(
  url: string = VERSIONS_NDJSON_URL,
): Promise<NdjsonVersion[]> {
  const cachedVersions = cachedVersionData.get(url);
  if (cachedVersions !== undefined) {
    core.debug(`Using cached NDJSON version data from ${url}`);
    return cachedVersions;
  }

  core.info(`Fetching version data from ${url} ...`);
  const { versions } = await readVersionData(url);
  cacheCompleteVersionData(url, versions);
  return versions;
}

export function parseVersionData(
  data: string,
  sourceDescription: string,
): NdjsonVersion[] {
  const versions: NdjsonVersion[] = [];

  for (const [index, line] of data.split("\n").entries()) {
    const trimmed = line.trim();
    if (trimmed === "") {
      continue;
    }

    versions.push(parseVersionLine(trimmed, sourceDescription, index + 1));
  }

  if (versions.length === 0) {
    throw new Error(`No version data found in ${sourceDescription}.`);
  }

  return versions;
}

export async function getLatestVersion(): Promise<string> {
  const cachedVersions = cachedVersionData.get(VERSIONS_NDJSON_URL);
  const cachedLatestVersion =
    cachedVersions?.[0] ?? cachedLatestVersionData.get(VERSIONS_NDJSON_URL);
  if (cachedLatestVersion !== undefined) {
    core.debug(
      `Latest version from NDJSON cache: ${cachedLatestVersion.version}`,
    );
    return cachedLatestVersion.version;
  }

  const latestVersion = await findVersionData(() => true);
  if (!latestVersion) {
    throw new Error("No versions found in NDJSON data");
  }

  core.debug(`Latest version from NDJSON: ${latestVersion.version}`);
  return latestVersion.version;
}

export async function getAllVersions(): Promise<string[]> {
  const versions = await fetchVersionData();
  return versions.map((versionData) => versionData.version);
}

export async function getHighestSatisfyingVersion(
  versionSpecifier: string,
  url: string = VERSIONS_NDJSON_URL,
): Promise<string | undefined> {
  const matchedVersion = await findVersionData(
    (candidate) => versionSatisfies(candidate.version, versionSpecifier),
    url,
  );

  return matchedVersion?.version;
}

export async function getArtifact(
  version: string,
  arch: string,
  platform: string,
): Promise<ArtifactResult | undefined> {
  const versionData = await getVersionData(version);
  if (!versionData) {
    core.debug(`Version ${version} not found in NDJSON data`);
    return undefined;
  }

  const targetPlatform = `${arch}-${platform}`;
  const matchingArtifacts = versionData.artifacts.filter(
    (candidate) => candidate.platform === targetPlatform,
  );

  if (matchingArtifacts.length === 0) {
    core.debug(
      `Artifact for ${targetPlatform} not found in version ${version}. Available platforms: ${versionData.artifacts
        .map((candidate) => candidate.platform)
        .join(", ")}`,
    );
    return undefined;
  }

  const artifact = selectArtifact(matchingArtifacts, version, targetPlatform);

  return {
    archiveFormat: artifact.archive_format,
    sha256: artifact.sha256,
    url: artifact.url,
  };
}

export function clearCache(url?: string): void {
  if (url === undefined) {
    cachedVersionData.clear();
    cachedLatestVersionData.clear();
    cachedVersionLookup.clear();
    return;
  }

  cachedVersionData.delete(url);
  cachedLatestVersionData.delete(url);
  cachedVersionLookup.delete(url);
}

function selectArtifact(
  artifacts: NdjsonArtifact[],
  version: string,
  targetPlatform: string,
): NdjsonArtifact {
  return selectDefaultVariant(
    artifacts,
    `Multiple artifacts found for ${targetPlatform} in version ${version}`,
  );
}

async function getVersionData(
  version: string,
  url: string = VERSIONS_NDJSON_URL,
): Promise<NdjsonVersion | undefined> {
  const cachedVersions = cachedVersionData.get(url);
  if (cachedVersions !== undefined) {
    return cachedVersions.find((candidate) => candidate.version === version);
  }

  const cachedVersion = cachedVersionLookup.get(url)?.get(version);
  if (cachedVersion !== undefined) {
    return cachedVersion;
  }

  return await findVersionData(
    (candidate) => candidate.version === version,
    url,
  );
}

async function findVersionData(
  predicate: (versionData: NdjsonVersion) => boolean,
  url: string = VERSIONS_NDJSON_URL,
): Promise<NdjsonVersion | undefined> {
  const cachedVersions = cachedVersionData.get(url);
  if (cachedVersions !== undefined) {
    return cachedVersions.find(predicate);
  }

  const { matchedVersion, versions, complete } = await readVersionData(
    url,
    predicate,
  );

  if (complete) {
    cacheCompleteVersionData(url, versions);
  }

  return matchedVersion;
}

async function readVersionData(
  url: string,
  stopWhen?: (versionData: NdjsonVersion) => boolean,
): Promise<{
  complete: boolean;
  matchedVersion: NdjsonVersion | undefined;
  versions: NdjsonVersion[];
}> {
  const response = await fetch(url, {});
  if (!response.ok) {
    throw new Error(
      `Failed to fetch version data: ${response.status} ${response.statusText}`,
    );
  }

  if (response.body === null) {
    const body = await response.text();
    const versions = parseVersionData(body, url);
    const matchedVersion = stopWhen
      ? versions.find((candidate) => stopWhen(candidate))
      : undefined;
    return { complete: true, matchedVersion, versions };
  }

  const versions: NdjsonVersion[] = [];
  let lineNumber = 0;
  let matchedVersion: NdjsonVersion | undefined;
  let buffer = "";
  const decoder = new TextDecoder();
  const reader = response.body.getReader();

  const processLine = (line: string): boolean => {
    const trimmed = line.trim();
    if (trimmed === "") {
      return false;
    }

    lineNumber += 1;
    const versionData = parseVersionLine(trimmed, url, lineNumber);
    if (versions.length === 0) {
      cachedLatestVersionData.set(url, versionData);
    }

    versions.push(versionData);
    cacheVersion(url, versionData);

    if (stopWhen?.(versionData) === true) {
      matchedVersion = versionData;
      return true;
    }

    return false;
  };

  while (true) {
    const { done, value } = await reader.read();
    if (done) {
      buffer += decoder.decode();
      break;
    }

    buffer += decoder.decode(value, { stream: true });
    let newlineIndex = buffer.indexOf("\n");
    while (newlineIndex !== -1) {
      const line = buffer.slice(0, newlineIndex);
      buffer = buffer.slice(newlineIndex + 1);

      if (processLine(line)) {
        await reader.cancel();
        return { complete: false, matchedVersion, versions };
      }

      newlineIndex = buffer.indexOf("\n");
    }
  }

  if (buffer.trim() !== "" && processLine(buffer)) {
    return { complete: true, matchedVersion, versions };
  }

  if (versions.length === 0) {
    throw new Error(`No version data found in ${url}.`);
  }

  return { complete: true, matchedVersion, versions };
}

function cacheCompleteVersionData(
  url: string,
  versions: NdjsonVersion[],
): void {
  cachedVersionData.set(url, versions);

  if (versions[0] !== undefined) {
    cachedLatestVersionData.set(url, versions[0]);
  }

  const versionLookup = new Map<string, NdjsonVersion>();
  for (const versionData of versions) {
    versionLookup.set(versionData.version, versionData);
  }

  cachedVersionLookup.set(url, versionLookup);
}

function cacheVersion(url: string, versionData: NdjsonVersion): void {
  let versionLookup = cachedVersionLookup.get(url);
  if (versionLookup === undefined) {
    versionLookup = new Map<string, NdjsonVersion>();
    cachedVersionLookup.set(url, versionLookup);
  }

  versionLookup.set(versionData.version, versionData);
}

function parseVersionLine(
  line: string,
  sourceDescription: string,
  lineNumber: number,
): NdjsonVersion {
  let parsed: unknown;
  try {
    parsed = JSON.parse(line);
  } catch (error) {
    throw new Error(
      `Failed to parse version data from ${sourceDescription} at line ${lineNumber}: ${(error as Error).message}`,
    );
  }

  if (!isNdjsonVersion(parsed)) {
    throw new Error(
      `Invalid NDJSON record in ${sourceDescription} at line ${lineNumber}.`,
    );
  }

  return parsed;
}

function versionSatisfies(version: string, versionSpecifier: string): boolean {
  return (
    semver.satisfies(version, versionSpecifier) ||
    pep440.satisfies(version, versionSpecifier)
  );
}

function isNdjsonVersion(value: unknown): value is NdjsonVersion {
  if (!isRecord(value)) {
    return false;
  }

  if (typeof value.version !== "string" || !Array.isArray(value.artifacts)) {
    return false;
  }

  return value.artifacts.every(isNdjsonArtifact);
}

function isNdjsonArtifact(value: unknown): value is NdjsonArtifact {
  if (!isRecord(value)) {
    return false;
  }

  const variantIsValid =
    typeof value.variant === "string" || value.variant === undefined;

  return (
    typeof value.archive_format === "string" &&
    typeof value.platform === "string" &&
    typeof value.sha256 === "string" &&
    typeof value.url === "string" &&
    variantIsValid
  );
}

function isRecord(value: unknown): value is Record<string, unknown> {
  return typeof value === "object" && value !== null;
}
@@ -5,6 +5,7 @@ import * as exec from "@actions/exec";
 import { restoreCache } from "./cache/restore-cache";
 import {
   downloadVersionFromManifest,
+  downloadVersionFromNdjson,
   resolveVersion,
   tryGetFromToolCache,
 } from "./download/download-version";
@@ -37,6 +38,8 @@ import {
 } from "./utils/platforms";
 import { getUvVersionFromFile } from "./version/resolve";

+const sourceDir = __dirname;
+
 async function getPythonVersion(): Promise<string> {
   if (pythonVersion !== "") {
     return pythonVersion;
@@ -139,13 +142,22 @@ async function setupUv(
     };
   }

-  const downloadVersionResult = await downloadVersionFromManifest(
-    manifestFile,
-    platform,
-    arch,
-    resolvedVersion,
-    checkSum,
-    githubToken,
-  );
+  const downloadVersionResult =
+    manifestFile !== undefined
+      ? await downloadVersionFromManifest(
+          manifestFile,
+          platform,
+          arch,
+          resolvedVersion,
+          checkSum,
+          githubToken,
+        )
+      : await downloadVersionFromNdjson(
+          platform,
+          arch,
+          resolvedVersion,
+          checkSum,
+          githubToken,
+        );

   return {
@@ -158,12 +170,7 @@ async function determineVersion(
   manifestFile: string | undefined,
 ): Promise<string> {
   if (versionInput !== "") {
-    return await resolveVersion(
-      versionInput,
-      manifestFile,
-      githubToken,
-      resolutionStrategy,
-    );
+    return await resolveVersion(versionInput, manifestFile, resolutionStrategy);
   }
   if (versionFileInput !== "") {
     const versionFromFile = getUvVersionFromFile(versionFileInput);
@@ -175,7 +182,6 @@ async function determineVersion(
     return await resolveVersion(
       versionFromFile,
       manifestFile,
-      githubToken,
       resolutionStrategy,
     );
   }
@@ -193,7 +199,6 @@ async function determineVersion(
   return await resolveVersion(
     versionFromUvToml || versionFromPyproject || "latest",
     manifestFile,
-    githubToken,
     resolutionStrategy,
   );
 }
@@ -305,7 +310,7 @@ function setCacheDir(): void {

 function addMatchers(): void {
   if (addProblemMatchers) {
-    const matchersPath = path.join(__dirname, `..${path.sep}..`, ".github");
+    const matchersPath = path.join(sourceDir, "..", "..", ".github");
     core.info(`##[add-matcher]${path.join(matchersPath, "python.json")}`);
   }
 }
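The `manifestFile !== undefined ? … : …` change above is a plain source-selection pattern: prefer an explicitly supplied manifest, otherwise fall back to the NDJSON feed. A dependency-free sketch of the same shape, using hypothetical stand-in loaders rather than the Action's real `downloadVersionFromManifest` / `downloadVersionFromNdjson` functions:

```typescript
type Download = { version: string; source: string };

// Illustrative stand-ins; the real loaders fetch and verify archives.
async function fromManifest(
  manifestFile: string,
  version: string,
): Promise<Download> {
  return { version, source: `manifest:${manifestFile}` };
}

async function fromNdjson(version: string): Promise<Download> {
  return { version, source: "ndjson" };
}

async function download(
  version: string,
  manifestFile?: string,
): Promise<Download> {
  // Prefer an explicitly supplied manifest; fall back to the NDJSON feed.
  return manifestFile !== undefined
    ? await fromManifest(manifestFile, version)
    : await fromNdjson(version);
}
```

Keeping the branch at the call site (instead of inside one loader) means each download path stays independently testable, which matches how the diff splits the two functions.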
src/update-known-checksums.ts (new file, 81 lines)
@@ -0,0 +1,81 @@
import * as core from "@actions/core";
import * as semver from "semver";
import { KNOWN_CHECKSUMS } from "./download/checksum/known-checksums";
import {
  type ChecksumEntry,
  updateChecksums,
} from "./download/checksum/update-known-checksums";
import {
  fetchVersionData,
  getLatestVersion,
  type NdjsonVersion,
} from "./download/versions-client";

const VERSION_IN_CHECKSUM_KEY_PATTERN =
  /-(\d+\.\d+\.\d+(?:[-+][0-9A-Za-z.-]+)?)$/;

async function run(): Promise<void> {
  const checksumFilePath = process.argv.slice(2)[0];
  if (!checksumFilePath) {
    throw new Error(
      "Missing checksum file path. Usage: node dist/update-known-checksums/index.cjs <checksum-file-path>",
    );
  }

  const latestVersion = await getLatestVersion();
  const latestKnownVersion = getLatestKnownVersionFromChecksums();

  if (semver.lte(latestVersion, latestKnownVersion)) {
    core.info(
      `Latest release (${latestVersion}) is not newer than the latest known version (${latestKnownVersion}). Skipping update.`,
    );
    return;
  }

  const versions = await fetchVersionData();
  const checksumEntries = extractChecksumsFromNdjson(versions);
  await updateChecksums(checksumFilePath, checksumEntries);

  core.setOutput("latest-version", latestVersion);
}

function getLatestKnownVersionFromChecksums(): string {
  const versions = new Set<string>();

  for (const key of Object.keys(KNOWN_CHECKSUMS)) {
    const version = extractVersionFromChecksumKey(key);
    if (version !== undefined) {
      versions.add(version);
    }
  }

  const latestVersion = [...versions].sort(semver.rcompare)[0];
  if (!latestVersion) {
    throw new Error("Could not determine latest known version from checksums.");
  }

  return latestVersion;
}

function extractVersionFromChecksumKey(key: string): string | undefined {
  return key.match(VERSION_IN_CHECKSUM_KEY_PATTERN)?.[1];
}

function extractChecksumsFromNdjson(
  versions: NdjsonVersion[],
): ChecksumEntry[] {
  const checksums: ChecksumEntry[] = [];

  for (const version of versions) {
    for (const artifact of version.artifacts) {
      checksums.push({
        checksum: artifact.sha256,
        key: `${artifact.platform}-${version.version}`,
      });
    }
  }

  return checksums;
}

run();
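The checksum-key regex and the `sort(semver.rcompare)` selection in the new script are easy to check in isolation. This sketch reuses the same pattern literal but substitutes a naive numeric comparator for `semver.rcompare` so it needs no packages; the sample keys are illustrative, not real entries from KNOWN_CHECKSUMS:

```typescript
const VERSION_IN_KEY = /-(\d+\.\d+\.\d+(?:[-+][0-9A-Za-z.-]+)?)$/;

// Pull the trailing "x.y.z[-pre][+build]" out of a "<platform>-<version>" key.
function extractVersion(key: string): string | undefined {
  return key.match(VERSION_IN_KEY)?.[1];
}

// Naive descending compare on the numeric x.y.z parts -- a stand-in for
// semver.rcompare that ignores pre-release/build metadata.
function rcompareNaive(a: string, b: string): number {
  const pa = a.split(/[-+]/)[0].split(".").map(Number);
  const pb = b.split(/[-+]/)[0].split(".").map(Number);
  for (let i = 0; i < 3; i++) {
    if (pa[i] !== pb[i]) return pb[i] - pa[i];
  }
  return 0;
}

// Hypothetical checksum keys; many platforms can share one version.
const keys = [
  "uv-x86_64-unknown-linux-gnu-0.4.30",
  "uv-x86_64-unknown-linux-gnu-0.5.1",
  "uv-aarch64-apple-darwin-0.5.1",
];
const versions = [
  ...new Set(
    keys.map(extractVersion).filter((v): v is string => v !== undefined),
  ),
];
const latest = versions.sort(rcompareNaive)[0];
```

Deduplicating through a `Set` before sorting mirrors `getLatestKnownVersionFromChecksums`: each platform contributes the same version string, so only distinct versions need comparing.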
@@ -1,63 +0,0 @@
import * as core from "@actions/core";
import type { Endpoints } from "@octokit/types";
import * as semver from "semver";
import { updateChecksums } from "./download/checksum/update-known-checksums";
import {
  getLatestKnownVersion,
  updateVersionManifest,
} from "./download/version-manifest";
import { OWNER, REPO } from "./utils/constants";
import { Octokit } from "./utils/octokit";

type Release =
  Endpoints["GET /repos/{owner}/{repo}/releases"]["response"]["data"][number];

async function run(): Promise<void> {
  const checksumFilePath = process.argv.slice(2)[0];
  const versionsManifestFile = process.argv.slice(2)[1];
  const githubToken = process.argv.slice(2)[2];

  const octokit = new Octokit({
    auth: githubToken,
  });

  const { data: latestRelease } = await octokit.rest.repos.getLatestRelease({
    owner: OWNER,
    repo: REPO,
  });

  const latestKnownVersion = await getLatestKnownVersion(undefined);

  if (semver.lte(latestRelease.tag_name, latestKnownVersion)) {
    core.info(
      `Latest release (${latestRelease.tag_name}) is not newer than the latest known version (${latestKnownVersion}). Skipping update.`,
    );
    return;
  }

  const releases: Release[] = await octokit.paginate(
    octokit.rest.repos.listReleases,
    {
      owner: OWNER,
      repo: REPO,
    },
  );
  const checksumDownloadUrls: string[] = releases.flatMap((release) =>
    release.assets
      .filter((asset) => asset.name.endsWith(".sha256"))
      .map((asset) => asset.browser_download_url),
  );
  await updateChecksums(checksumFilePath, checksumDownloadUrls);

  const artifactDownloadUrls: string[] = releases.flatMap((release) =>
    release.assets
      .filter((asset) => !asset.name.endsWith(".sha256"))
      .map((asset) => asset.browser_download_url),
  );

  await updateVersionManifest(versionsManifestFile, artifactDownloadUrls);

  core.setOutput("latest-version", latestRelease.tag_name);
}

run();
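The deleted script's asset partitioning (checksum sidecars vs. real artifacts) is a small `flatMap`/`filter` pattern worth noting, since the NDJSON feed now delivers both in one record. A self-contained sketch with hypothetical release data standing in for the GitHub releases API response:

```typescript
interface Asset {
  name: string;
  browser_download_url: string;
}
interface Release {
  assets: Asset[];
}

// Fake data shaped like the releases API payload (illustrative only).
const releases: Release[] = [
  {
    assets: [
      { name: "uv.tar.gz", browser_download_url: "https://example.invalid/uv.tar.gz" },
      { name: "uv.tar.gz.sha256", browser_download_url: "https://example.invalid/uv.tar.gz.sha256" },
    ],
  },
];

// Split every release's assets into checksum URLs and artifact URLs.
const checksumUrls = releases.flatMap((r) =>
  r.assets
    .filter((a) => a.name.endsWith(".sha256"))
    .map((a) => a.browser_download_url),
);
const artifactUrls = releases.flatMap((r) =>
  r.assets
    .filter((a) => !a.name.endsWith(".sha256"))
    .map((a) => a.browser_download_url),
);
```

With the NDJSON migration this two-pass split disappears: each version record already pairs every artifact URL with its `sha256`.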
@@ -1,5 +1,5 @@
export const REPO = "uv";
export const OWNER = "astral-sh";
export const TOOL_CACHE_NAME = "uv";
export const STATE_UV_PATH = "uv-path";
export const STATE_UV_VERSION = "uv-version";
export const VERSIONS_NDJSON_URL =
  "https://raw.githubusercontent.com/astral-sh/versions/main/v1/uv.ndjson";
@@ -1,34 +0,0 @@
import type { OctokitOptions } from "@octokit/core";
import { Octokit as Core } from "@octokit/core";
import {
  type PaginateInterface,
  paginateRest,
} from "@octokit/plugin-paginate-rest";
import { legacyRestEndpointMethods } from "@octokit/plugin-rest-endpoint-methods";
import { fetch as customFetch } from "./fetch";

export type { RestEndpointMethodTypes } from "@octokit/plugin-rest-endpoint-methods";

const DEFAULTS = {
  baseUrl: "https://api.github.com",
  userAgent: "setup-uv",
};

const OctokitWithPlugins = Core.plugin(paginateRest, legacyRestEndpointMethods);

export const Octokit = OctokitWithPlugins.defaults(function buildDefaults(
  options: OctokitOptions,
): OctokitOptions {
  return {
    ...DEFAULTS,
    ...options,
    request: {
      fetch: customFetch,
      ...options.request,
    },
  };
});

export type Octokit = InstanceType<typeof OctokitWithPlugins> & {
  paginate: PaginateInterface;
};
@@ -1,12 +1,12 @@
 {
   "compilerOptions": {
-    "esModuleInterop": true /* Enables emit interoperability between CommonJS and ES Modules via creation of namespace objects for all imports. Implies 'allowSyntheticDefaultImports'. */,
-    "module": "nodenext" /* Specify module code generation: 'none', 'commonjs', 'amd', 'system', 'umd', 'es2015', or 'ESNext'. */,
-    "noImplicitAny": true /* Raise error on expressions and declarations with an implied 'any' type. */,
-    "outDir": "./lib" /* Redirect output structure to the directory. */,
-    "rootDir": "./src" /* Specify the root directory of input files. Use to control the output directory structure with --outDir. */,
-    "strict": true /* Enable all strict type-checking options. */,
-    "target": "ES2022" /* Specify ECMAScript target version: 'ES3' (default), 'ES5', 'ES2015', 'ES2016', 'ES2017', 'ES2018', 'ES2019' or 'ESNEXT'. */
+    "esModuleInterop": true,
+    "isolatedModules": true,
+    "module": "esnext",
+    "moduleResolution": "bundler",
+    "noImplicitAny": true,
+    "strict": true,
+    "target": "ES2022"
   },
-  "exclude": ["node_modules", "**/*.test.ts"]
+  "include": ["src/**/*.ts"]
 }
version-manifest.json (30872 lines): file diff suppressed because it is too large.