-
-**Fast:** Install and run any JS tool quickly and seamlessly! Volta is built in Rust and ships as a snappy static binary.
-
-**Reliable:** Ensure everyone in your project has the same tools—without interfering with their workflow.
-
-**Universal:** No matter the package manager, Node runtime, or OS, one command is all you need: `volta install`.
-
-## Features
-
-- Speed 🚀
-- Seamless, per-project version switching
-- Cross-platform support, including Windows and all Unix shells
-- Support for multiple package managers
-- Stable tool installation—no reinstalling on every Node upgrade!
-- Extensibility hooks for site-specific customization
-
-## Installing Volta
-
-Read the [Getting Started Guide](https://docs.volta.sh/guide/getting-started) on our website for detailed instructions on how to install Volta.
-
-## Using Volta
-
-Read the [Understanding Volta Guide](https://docs.volta.sh/guide/understanding) on our website for detailed instructions on how to use Volta.
-
-## Contributing to Volta
-
-Contributions are always welcome, no matter how large or small. Substantial feature ideas should be proposed as an [RFC](https://github.com/volta-cli/rfcs). Before contributing, please read the [code of conduct](CODE_OF_CONDUCT.md).
-
-See the [Contributing Guide](https://docs.volta.sh/contributing/) on our website for detailed instructions on how to contribute to Volta.
-
-## Who is using Volta?
-
-
-
-See [here](https://sourcegraph.com/search?q=context:global+%22volta%22+file:package.json&patternType=literal) for more Volta users.
diff --git a/RELEASES.md b/RELEASES.md
deleted file mode 100644
index dfcc9e740..000000000
--- a/RELEASES.md
+++ /dev/null
@@ -1,334 +0,0 @@
-# Version 2.0.2
-
-- Dependency updates
-- Improvements to header handling for HTTP requests (#1822, #1877)
-
-# Version 2.0.1
-
-- Improved accuracy of Node download progress bar on Windows (#1833)
-- You should no longer run into errors about needing the VC++ Runtime on Windows (#1844)
-- The data provided when installing a new Node version is now more relevant and accurate (#1846, #1848)
-- Increased performance to make Volta even more responsive in typical use (#1849)
-- `volta run` will now correctly handle flags in more situations (#1857)
-
-# Version 2.0.0
-
-- 🚨 (BREAKING) 🚨 We upgraded the version of Rust used to build Volta, which drops support for older versions of glibc & Linux kernel. See [the Rust announcement from August 2022](https://blog.rust-lang.org/2022/08/01/Increasing-glibc-kernel-requirements.html) for details about the supported versions. Notably, this means that we no longer support CentOS 6 (#1611)
-- 🚨 (BREAKING) 🚨 Due to costs and changes in the code signing process, we have dropped the code signing for the Windows installer. We now recommend using `winget` to install Volta on Windows (#1650)
-- 🎉 (NEW) 🎉 We now ship a pre-built binary for ARM Linux & ARM Windows (#1696, #1801)
-- Volta no longer requires Developer Mode to be enabled on Windows (#1755)
-- `volta uninstall` now provides better help & error messages to describe its use and limitations (#1628, #1786)
-- Volta will now use a universal binary on Mac, rather than separate Intel- & ARM-specific builds (#1635)
-- Switched to installing profile scripts into `.zshenv` by default, rather than `.zshrc` (#1657)
-- Added a default shim for the `yarnpkg` command, which is an alias of `yarn` (#1670)
-- Added a new `--very-verbose` flag to enable even more logging (note: we haven't yet implemented much additional logging) (#1815)
-- Simplified the fetching process to remove an extra network request and resolve hangs (#1812)
-- Several dependency upgrades and clean-up refactors from @tottoto
-
-# Version 1.1.1
-
-- Experimental support for pnpm (requires `VOLTA_FEATURE_PNPM` environment variable) (#1273)
-- Fix to correctly import native root certificates (#1375)
-- Better detection of executables provided by `yarn` (#1388, #1393)
-
-# Version 1.1.0
-
-- Added support for pinning / installing Yarn 3+ (#1305)
-- Improved portability and installer effectiveness by removing dependency on OpenSSL (#1214)
-
-# Version 1.0.8
-
-- Fix for malformed `bin` entries when installing global packages (#997)
-- Dependency updates
-
-# Version 1.0.7
-
-- Added build for Linux distros with OpenSSL 3.0 (#1211)
-
-# Version 1.0.6
-
-- Fixed panic when `stdout` is closed (#1058)
-- Disabled global package interception when `--prefix` is provided (#1171)
-- Numerous dependency updates
-
-# Version 1.0.5
-
-- Added error when attempting to install Node using `nvm` syntax (#1020)
-- Avoid modifying shell config if the environment is already correct (#990)
-- Prevent trying to read OS-generated files as package configs (#981)
-
-# Version 1.0.4
-
-- Fetch native Apple silicon versions of Node when available (#974)
-
-# Version 1.0.3
-
-- Fix pinning of `npm@bundled` when there is a custom default npm version (#957)
-- Use correct binary name for scoped packages with a string `bin` entry in `package.json` (#969)
-
-# Version 1.0.2
-
-- Fix issues where `volta list` wasn't showing the correct information in all cases (#778, #926)
-- Make detection of tool name case-insensitive on Windows (#941)
-- Fix problem with `npm link` in a scoped package under npm 7 (#945)
-
-# Version 1.0.1
-
-- Create Native build for Apple Silicon machines (#915, #917)
-
-# Version 1.0.0
-
-- Support for `npm link` (#888, #889, #891)
-- Support for `npm update -g` and `yarn global upgrade` (#895)
-- Improvements in the handling of `npm` and `yarn` commands (#886, #887)
-
-# Version 0.9.3
-
-- Various fixes to event plugin logic (#892, #894, #897)
-
-# Version 0.9.2
-
-- Correctly detect Volta binary installation directory (#864)
-
-# Version 0.9.1
-
-- Fix an issue with installing globals using npm 7 (#858)
-
-# Version 0.9.0
-
-- Support Proxies through environment variables (#809, #851)
-- Avoid unnecessary `exists` calls for files (#834)
-- Rework package installs to allow for directly calling package manager (#848, #849)
-- **Breaking Change**: Remove support for `packages` hooks (#817)
-
-# Version 0.8.7
-
-- Support fetching older versions of Yarn (#771)
-- Correctly detect `zsh` environment with `ZDOTDIR` variable (#799)
-- Prevent race conditions when installing tools (#684, #796)
-
-# Version 0.8.6
-
-- Improve parsing of `engines` when installing a package (#791, #792)
-
-# Version 0.8.5
-
-- Improve the stability of installing tools on systems with virus scanning software (#784)
-- Make `volta uninstall` work correctly when the original install had an issue (#787)
-
-# Version 0.8.4
-
-- Add `{{filename}}` and `{{ext}}` (extension) replacements for `template` hooks (#774)
-- Show better error when running `volta install yarn` without a Node version available (#763)
-
-# Version 0.8.3
-
-- Fix bug preventing custom `npm` versions from launching on Windows (#777)
-- Fix for completions in `zsh` for `volta list` (#772)
-
-# Version 0.8.2
-
-- Add support for workspaces through the `extends` key in `package.json` (#755)
-- Improve `volta setup` to make profile scripts more shareable across machines (#756)
-
-# Version 0.8.1
-
-- Fix panic when running `volta completions zsh` (#746)
-- Improve startup latency by reducing binary size (#732, #733, #734, #735)
-
-# Version 0.8.0
-
-- Support for pinning / installing custom versions of `npm` (#691)
-- New command: `volta run` which will let you run one-off commands using custom versions of Node / Yarn / npm (#713)
-- Added default pretty formatter for `volta list` (#697)
-- Improved setup of Volta environment to make it work in more scenarios (#666, #725)
-- Bug fixes and performance improvements (#683, #701, #703, #704, #707, #717)
-
-# Version 0.7.2
-
-- Added `npm.cmd`, `npx.cmd`, and `yarn.cmd` on Windows to support tools that look for CMD files specifically (#663)
-- Updated `volta setup` to also ensure that the shim symlinks are set up correctly (#662)
-
-# Version 0.7.1
-
-- Added warning when attempting to `volta uninstall` a package you don't have installed (#638)
-- Added informational message about pinned project version when running `volta install` (#646)
-- `volta completions` will attempt to create the output directory if it doesn't exist (#647)
-- `volta install` will correctly handle script files that have CRLF as the line ending (#644)
-
-# Version 0.7.0
-
-- Removed deprecated commands `volta activate`, `volta deactivate`, and `volta current` (#620, #559)
-- Simplified installer behavior and added data directory migration support (#619)
-- Removed reliance on UNC paths when executing node scripts (#637)
-
-# Version 0.6.8
-
-- You can now use tagged versions when installing a tool with `volta install` (#604)
-- `volta install ` will now prefer LTS Node when pinning a version (#604)
-
-# Version 0.6.7
-
-- `volta pin` will no longer remove a closing newline from `package.json` (#603)
-- New environment variable `VOLTA_BYPASS` will allow you to temporarily disable Volta shims (#603)
-
-# Version 0.6.6
-
-- Node and Yarn can now both be pinned in the same command `volta pin node yarn` (#593)
-- Windows installer will now work on minimal Windows installs (e.g. Windows Sandbox) (#592)
-
-# Version 0.6.5
-
-- `volta list` now always outputs to stdout, regardless of how it is called (#581)
-- DEPRECATION: `volta activate` and `volta deactivate` are deprecated and will be removed in a future version (#571)
-
-# Version 0.6.4
-
-- `volta install` now works for installing packages from a private, authenticated registry (#554)
-- `volta install` now has better diagnostic messages when things go wrong (#548)
-
-# Version 0.6.3
-
-- `volta install` will no longer error when installing a scoped binary package (#537)
-
-# Version 0.6.2
-
-- Added `volta list` command for inspecting the available tools and versions (#461)
-
-# Version 0.6.1
-
-- Windows users will see a spinner instead of a � when Volta is loading data (#511)
-- Interrupting a tool with Ctrl+C will correctly wait for the tool to exit (#513)
-
-# Version 0.6.0
-
-- Allow installing 3rd-party binaries from private registries (#469)
-
-# Version 0.5.7
-
-- Prevent corrupting local cache by downloading tools to temp directory (#498)
-
-# Version 0.5.6
-
-- Improve expected behavior with Yarn in projects (#470)
-- Suppress an erroneous "toolchain" key warning message (#486)
-
-# Version 0.5.5
-
-- Proper support for relative paths in Bin hooks (#468)
-- Diagnostic messages for shims with `VOLTA_LOGLEVEL=debug` (#466)
-- Preserve user order for multiple tool installs (#479)
-
-# Version 0.5.4
-
-- Show additional diagnostic messages when run with `--verbose` (#455)
-
-# Version 0.5.3
-
-- Prevent unnecessary warning output when not running interactively (#451)
-- Fix a bug in load script for fish shell on Linux (#456)
-- Improve wrapping behavior for warning messages (#453)
-
-# Version 0.5.2
-
-- Improve error messages when running a project-local binary fails (#426)
-- Fix execution of user binaries on Windows (#445)
-
-# Version 0.5.1
-
-- Add per-project hooks configuration in `/.volta/hooks.json` (#411)
-- Support backwards compatibility with `toolchain` key in `package.json` (#434)
-
-# Version 0.5.0
-
-- Rename to Volta: The JavaScript Launcher ⚡️
-- Change `package.json` key to `volta` from `toolchain` (#413)
-- Update `volta completions` behavior to be more usable (#416)
-- Improve `volta which` to correctly find user tools (#419)
-- Remove unneeded lookups of `package.json` files (#420)
-- Cleanup of error messages and extraneous output (#421, #422)
-
-# Version 0.4.1
-
-- Allow tool executions to pass through to the system if no Notion platform exists (#372)
-- Improve installer support for varied Linux distros
-
-# Version 0.4.0
-
-- Update `notion install` to use `tool@version` formatting for specifying a tool (#383, #403)
-- Further error message improvements (#344, #395, #399, #400)
-- Clean up bugs around installing and running packages (#368, #390, #394, #396)
-- Include success messages when running `notion install` and `notion pin` (#397)
-
-# Version 0.3.0
-
-- Support `lts` pseudo-version for Node (#331)
-- Error message improvements
-- Add `notion install` and `notion uninstall` for package binaries
-- Remove autoshimming
-
-# Version 0.2.2
-
-- Add `notion which` command (#293)
-- Show progress when fetching Notion installer (#279)
-- Improved styling for usage information (#283)
-- Support for `fish` shell (#266, #290)
-- Consolidate binaries, for a ~2/3 size reduction of Notion installer (#274)
-
-# Version 0.2.1
-
-- Move preventing globals behind a feature flag (#273)
-
-# Version 0.2.0
-
-- Add support for OpenSSL 1.1.1 (#267)
-- Fix: ensure temp files are on the same volume (#257)
-- Intercept global package installations (#248)
-- Fix: make npx compatible with prerelease versions of npm (#239)
-- Fix: make `notion deactivate` work infallibly, without loading any files (#237)
-- Fix: make `"npm"` key optional in `package.json` (#233)
-- Fix: publish latest Notion version via self-hosted endpoint (#230)
-- Fix: eliminate excessive fetching and scanning for exact versions (#227)
-- Rename `notion use` to `notion pin` (#226)
-- Base filesystem isolation on `NOTION_HOME` env var (#224)
-- Fix: robust progress bar logic (#221)
-- Use JSON for internal state files (#220)
-- Support for npm and npx (#205)
-- Changes to directory layout (#181)
-
-# Version 0.1.5
-
-- Autoshimming! (#163)
-- `notion deactivate` also unsets `NOTION_HOME` (#195)
-- Implemented `notion activate` (#201)
-- Fix for Yarn over-fetching bug (#203)
-
-# Version 0.1.4
-
-- Fix for `package.json` parsing bug (#156)
-
-# Version 0.1.3
-
-- Fix for Yarn path bug (#153)
-
-# Version 0.1.2
-
-- Correct logic for computing `latest` version of Node (#144)
-- Don't crash if cache dir was deleted (#138)
-- Improved tests (#135)
-
-# Version 0.1.1
-
-- Support for specifying `latest` as a version specifier (#133)
-- Suppress scary-looking symlink warnings on reinstall (#132)
-- Clearer error message for not-yet-implemented `notion install somebin` (#131)
-- Support optional `v` prefix to version specifiers (#130)
-
-# Version 0.1.0
-
-First pre-release, supporting:
-
-- macOS and Linux (bash-only)
-- `notion install` (Node and Yarn only, no package binaries)
-- `notion use`
-- Proof-of-concept plugin API
diff --git a/ci/build-linux.sh b/ci/build-linux.sh
deleted file mode 100755
index 2c39aeb28..000000000
--- a/ci/build-linux.sh
+++ /dev/null
@@ -1,16 +0,0 @@
-#!/bin/bash
-
-set -e
-
-# Activate the upgraded versions of GCC and binutils
-# See https://linux.web.cern.ch/centos7/docs/softwarecollections/#inst
-source /opt/rh/devtoolset-8/enable
-
-echo "Building Volta"
-
-cargo build --release
-
-echo "Packaging Binaries"
-
-cd target/release
-tar -zcvf "$1.tar.gz" volta volta-shim volta-migrate
diff --git a/ci/build-macos.sh b/ci/build-macos.sh
deleted file mode 100755
index 1976b2cd9..000000000
--- a/ci/build-macos.sh
+++ /dev/null
@@ -1,21 +0,0 @@
-#!/bin/bash
-
-set -e
-
-echo "Building Volta"
-
-MACOSX_DEPLOYMENT_TARGET=11.0 cargo build --release --target=aarch64-apple-darwin
-MACOSX_DEPLOYMENT_TARGET=11.0 cargo build --release --target=x86_64-apple-darwin
-
-echo "Packaging Binaries"
-
-mkdir -p target/universal-apple-darwin/release
-
-for exe in volta volta-shim volta-migrate
-do
- lipo -create -output target/universal-apple-darwin/release/$exe target/x86_64-apple-darwin/release/$exe target/aarch64-apple-darwin/release/$exe
-done
-
-cd target/universal-apple-darwin/release
-
-tar -zcvf "$1.tar.gz" volta volta-shim volta-migrate
diff --git a/ci/docker/Dockerfile b/ci/docker/Dockerfile
deleted file mode 100644
index 5764763bc..000000000
--- a/ci/docker/Dockerfile
+++ /dev/null
@@ -1,16 +0,0 @@
-FROM cern/cc7-base
-
-# This repo file references a URL that is no longer valid. It also isn't used by the build
-# toolchain, so we can safely remove it entirely
-RUN rm /etc/yum.repos.d/epel.repo
-
-# https://linux.web.cern.ch/centos7/docs/softwarecollections/#inst
-# Tools needed for the build and setup process
-RUN yum -y install wget tar
-# Fetch the repo information for the devtoolset repo
-RUN yum install -y centos-release-scl
-# Install more recent GCC and binutils, to allow us to compile
-RUN yum install -y devtoolset-8
-
-RUN curl https://sh.rustup.rs -sSf | sh -s -- -y
-ENV PATH="/root/.cargo/bin:${PATH}"
diff --git a/ci/volta.manifest b/ci/volta.manifest
deleted file mode 100644
index 6390adc4c..000000000
--- a/ci/volta.manifest
+++ /dev/null
@@ -1,3 +0,0 @@
-volta
-volta-shim
-volta-migrate
diff --git a/crates/archive/Cargo.toml b/crates/archive/Cargo.toml
deleted file mode 100644
index 46dd5dab8..000000000
--- a/crates/archive/Cargo.toml
+++ /dev/null
@@ -1,23 +0,0 @@
-[package]
-name = "archive"
-version = "0.1.0"
-authors = ["David Herman "]
-edition = "2021"
-
-[dependencies]
-flate2 = "1.0"
-tar = "0.4.13"
-# Set features manually to drop usage of `time` crate: we do not rely on that
-# set of capabilities, and it has a vulnerability. We also don't need to use
-# every single compression algorithm feature since we are only downloading
-# Node as a zip file
-zip_rs = { version = "=2.1.6", package = "zip", default-features = false, features = ["deflate", "bzip2"] }
-tee = "0.1.0"
-fs-utils = { path = "../fs-utils" }
-progress-read = { path = "../progress-read" }
-verbatim = "0.1"
-cfg-if = "1.0"
-headers = "0.4"
-thiserror = "2.0.0"
-attohttpc = { version = "0.28", default-features = false, features = ["json", "compress", "tls-rustls-native-roots"] }
-log = { version = "0.4", features = ["std"] }
diff --git a/crates/archive/fixtures/tarballs/test-file.tar.gz b/crates/archive/fixtures/tarballs/test-file.tar.gz
deleted file mode 100644
index 960b61f9d..000000000
Binary files a/crates/archive/fixtures/tarballs/test-file.tar.gz and /dev/null differ
diff --git a/crates/archive/fixtures/zips/test-file.zip b/crates/archive/fixtures/zips/test-file.zip
deleted file mode 100644
index ca6079330..000000000
Binary files a/crates/archive/fixtures/zips/test-file.zip and /dev/null differ
diff --git a/crates/archive/src/lib.rs b/crates/archive/src/lib.rs
deleted file mode 100644
index 9619920db..000000000
--- a/crates/archive/src/lib.rs
+++ /dev/null
@@ -1,105 +0,0 @@
-//! This crate provides types for fetching and unpacking compressed
-//! archives in tarball or zip format.
-use std::fs::File;
-use std::path::Path;
-
-use attohttpc::header::HeaderMap;
-use headers::{ContentLength, Header, HeaderMapExt};
-use thiserror::Error;
-
-mod tarball;
-mod zip;
-
-pub use crate::tarball::Tarball;
-pub use crate::zip::Zip;
-
-/// Error type for this crate
-#[derive(Error, Debug)]
-pub enum ArchiveError {
- #[error("HTTP failure ({0})")]
- HttpError(attohttpc::StatusCode),
-
- #[error("HTTP header '{0}' not found")]
- MissingHeaderError(&'static attohttpc::header::HeaderName),
-
- #[error("unexpected content length in HTTP response: {0}")]
- UnexpectedContentLengthError(u64),
-
- #[error("{0}")]
- IoError(#[from] std::io::Error),
-
- #[error("{0}")]
- AttohttpcError(#[from] attohttpc::Error),
-
- #[error("{0}")]
- ZipError(#[from] zip_rs::result::ZipError),
-}
-
-/// Metadata describing whether an archive comes from a local or remote origin.
-#[derive(Copy, Clone)]
-pub enum Origin {
- Local,
- Remote,
-}
-
-pub trait Archive {
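- /// Returns the total size of the compressed archive, in bytes.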
- fn compressed_size(&self) -> u64;
-
- /// Unpacks the archive to the specified destination folder.
- fn unpack(
- self: Box<Self>,
- dest: &Path,
- progress: &mut dyn FnMut(&(), usize),
- ) -> Result<(), ArchiveError>;
-
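- /// Reports whether this archive was loaded from a local file or fetched from a remote URL.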
- fn origin(&self) -> Origin;
-}
-
-cfg_if::cfg_if! {
- if #[cfg(unix)] {
- /// Load an archive in the native OS-preferred format from the specified file.
- ///
- /// On Windows, the preferred format is zip. On Unixes, the preferred format
- /// is tarball.
- pub fn load_native(source: File) -> Result<Box<dyn Archive>, ArchiveError> {
- Tarball::load(source)
- }
-
- /// Fetch a remote archive in the native OS-preferred format from the specified
- /// URL and store its results at the specified file path.
- ///
- /// On Windows, the preferred format is zip. On Unixes, the preferred format
- /// is tarball.
- pub fn fetch_native(url: &str, cache_file: &Path) -> Result<Box<dyn Archive>, ArchiveError> {
- Tarball::fetch(url, cache_file)
- }
- } else if #[cfg(windows)] {
- /// Load an archive in the native OS-preferred format from the specified file.
- ///
- /// On Windows, the preferred format is zip. On Unixes, the preferred format
- /// is tarball.
- pub fn load_native(source: File) -> Result<Box<dyn Archive>, ArchiveError> {
- Zip::load(source)
- }
-
- /// Fetch a remote archive in the native OS-preferred format from the specified
- /// URL and store its results at the specified file path.
- ///
- /// On Windows, the preferred format is zip. On Unixes, the preferred format
- /// is tarball.
- pub fn fetch_native(url: &str, cache_file: &Path) -> Result<Box<dyn Archive>, ArchiveError> {
- Zip::fetch(url, cache_file)
- }
- } else {
- compile_error!("Unsupported OS (expected 'unix' or 'windows').");
- }
-}
-
-/// Determines the length of an HTTP response's content in bytes, using
-/// the HTTP `"Content-Length"` header.
-fn content_length(headers: &HeaderMap) -> Result<u64, ArchiveError> {
- headers
- .typed_get()
- .map(|ContentLength(v)| v)
- .ok_or_else(|| ArchiveError::MissingHeaderError(ContentLength::name()))
-}
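-
-// Illustrative usage sketch: fetch an archive and unpack it while reporting
-// progress. The URL and paths below are placeholders, not values used by Volta.
-#[allow(dead_code)]
-fn example_fetch_and_unpack() -> Result<(), ArchiveError> {
-    // `fetch_native` picks a tarball on Unix and a zip on Windows, teeing the
-    // download into the given cache file as it streams.
-    let archive = fetch_native(
-        "https://nodejs.org/dist/v20.0.0/node-v20.0.0-linux-x64.tar.gz",
-        Path::new("/tmp/node.tar.gz"),
-    )?;
-    let total = archive.compressed_size();
-    // The callback receives the unit accumulator and the number of bytes read
-    // since the previous call; a real caller would drive a progress bar here.
-    archive.unpack(Path::new("/tmp/node"), &mut |_, read| {
-        let _ = (total, read);
-    })?;
-    Ok(())
-}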
diff --git a/crates/archive/src/tarball.rs b/crates/archive/src/tarball.rs
deleted file mode 100644
index 0f713cbab..000000000
--- a/crates/archive/src/tarball.rs
+++ /dev/null
@@ -1,98 +0,0 @@
-//! Provides types and functions for fetching and unpacking a Node installation
-//! tarball in Unix operating systems.
-
-use std::fs::File;
-use std::io::Read;
-use std::path::Path;
-
-use super::{content_length, Archive, ArchiveError, Origin};
-use flate2::read::GzDecoder;
-use fs_utils::ensure_containing_dir_exists;
-use progress_read::ProgressRead;
-use tee::TeeReader;
-
-/// A Node installation tarball.
-pub struct Tarball {
- compressed_size: u64,
- data: Box<dyn Read>,
- origin: Origin,
-}
-
-impl Tarball {
- /// Loads a tarball from the specified file.
- pub fn load(source: File) -> Result<Box<Tarball>, ArchiveError> {
- let compressed_size = source.metadata()?.len();
- Ok(Box::new(Tarball {
- compressed_size,
- data: Box::new(source),
- origin: Origin::Local,
- }))
- }
-
- /// Initiate fetching of a tarball from the given URL, returning a
- /// tarball that can be streamed (and that tees its data to a local
- /// file as it streams).
- pub fn fetch(url: &str, cache_file: &Path) -> Result<Box<Tarball>, ArchiveError> {
- let (status, headers, response) = attohttpc::get(url).send()?.split();
-
- if !status.is_success() {
- return Err(ArchiveError::HttpError(status));
- }
-
- let compressed_size = content_length(&headers)?;
-
- ensure_containing_dir_exists(&cache_file)?;
- let file = File::create(cache_file)?;
- let data = Box::new(TeeReader::new(response, file));
-
- Ok(Box::new(Tarball {
- compressed_size,
- data,
- origin: Origin::Remote,
- }))
- }
-}
-
-impl Archive for Tarball {
- fn compressed_size(&self) -> u64 {
- self.compressed_size
- }
- fn unpack(
- self: Box<Self>,
- dest: &Path,
- progress: &mut dyn FnMut(&(), usize),
- ) -> Result<(), ArchiveError> {
- let decoded = GzDecoder::new(ProgressRead::new(self.data, (), progress));
- let mut tarball = tar::Archive::new(decoded);
- tarball.unpack(dest)?;
- Ok(())
- }
- fn origin(&self) -> Origin {
- self.origin
- }
-}
-
-#[cfg(test)]
-pub mod tests {
-
- use crate::tarball::Tarball;
- use std::fs::File;
- use std::path::PathBuf;
-
- fn fixture_path(fixture_dir: &str) -> PathBuf {
- let mut cargo_manifest_dir = PathBuf::from(env!("CARGO_MANIFEST_DIR"));
- cargo_manifest_dir.push("fixtures");
- cargo_manifest_dir.push(fixture_dir);
- cargo_manifest_dir
- }
-
- #[test]
- fn test_load() {
- let mut test_file_path = fixture_path("tarballs");
- test_file_path.push("test-file.tar.gz");
- let test_file = File::open(test_file_path).expect("Couldn't open test file");
- let tarball = Tarball::load(test_file).expect("Failed to load tarball");
-
- assert_eq!(tarball.compressed_size(), 402);
- }
-}
diff --git a/crates/archive/src/zip.rs b/crates/archive/src/zip.rs
deleted file mode 100644
index 476d73f50..000000000
--- a/crates/archive/src/zip.rs
+++ /dev/null
@@ -1,102 +0,0 @@
-//! Provides types and functions for fetching and unpacking a Node installation
-//! zip file in Windows operating systems.
-
-use std::fs::File;
-use std::io::Read;
-use std::path::Path;
-
-use super::{content_length, ArchiveError};
-use fs_utils::ensure_containing_dir_exists;
-use progress_read::ProgressRead;
-use tee::TeeReader;
-use verbatim::PathExt;
-use zip_rs::unstable::stream::ZipStreamReader;
-
-use super::Archive;
-use super::Origin;
-
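-/// A Node installation zip archive.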
-pub struct Zip {
- compressed_size: u64,
- data: Box<dyn Read>,
- origin: Origin,
-}
-
-impl Zip {
- /// Loads a cached Node zip archive from the specified file.
- pub fn load(source: File) -> Result<Box<Zip>, ArchiveError> {
- let compressed_size = source.metadata()?.len();
-
- Ok(Box::new(Zip {
- compressed_size,
- data: Box::new(source),
- origin: Origin::Local,
- }))
- }
-
- /// Initiate fetching of a Node zip archive from the given URL, returning
- /// a `Remote` data source.
- pub fn fetch(url: &str, cache_file: &Path) -> Result<Box<Zip>, ArchiveError> {
- let (status, headers, response) = attohttpc::get(url).send()?.split();
-
- if !status.is_success() {
- return Err(ArchiveError::HttpError(status));
- }
-
- let compressed_size = content_length(&headers)?;
-
- ensure_containing_dir_exists(&cache_file)?;
- let file = File::create(cache_file)?;
- let data = Box::new(TeeReader::new(response, file));
-
- Ok(Box::new(Zip {
- compressed_size,
- data,
- origin: Origin::Remote,
- }))
- }
-}
-
-impl Archive for Zip {
- fn compressed_size(&self) -> u64 {
- self.compressed_size
- }
- fn unpack(
- self: Box<Self>,
- dest: &Path,
- progress: &mut dyn FnMut(&(), usize),
- ) -> Result<(), ArchiveError> {
- // Use a verbatim path to avoid the legacy Windows 260 byte path limit.
- let dest: &Path = &dest.to_verbatim();
- let zip = ZipStreamReader::new(ProgressRead::new(self.data, (), progress));
- zip.extract(dest)?;
- Ok(())
- }
- fn origin(&self) -> Origin {
- self.origin
- }
-}
-
-#[cfg(test)]
-pub mod tests {
-
- use crate::zip::Zip;
- use std::fs::File;
- use std::path::PathBuf;
-
- fn fixture_path(fixture_dir: &str) -> PathBuf {
- let mut cargo_manifest_dir = PathBuf::from(env!("CARGO_MANIFEST_DIR"));
- cargo_manifest_dir.push("fixtures");
- cargo_manifest_dir.push(fixture_dir);
- cargo_manifest_dir
- }
-
- #[test]
- fn test_load() {
- let mut test_file_path = fixture_path("zips");
- test_file_path.push("test-file.zip");
- let test_file = File::open(test_file_path).expect("Couldn't open test file");
- let zip = Zip::load(test_file).expect("Failed to load zip file");
-
- assert_eq!(zip.compressed_size(), 214);
- }
-}
diff --git a/crates/fs-utils/Cargo.toml b/crates/fs-utils/Cargo.toml
deleted file mode 100644
index 9d2964429..000000000
--- a/crates/fs-utils/Cargo.toml
+++ /dev/null
@@ -1,7 +0,0 @@
-[package]
-name = "fs-utils"
-version = "0.1.0"
-authors = ["Michael Stewart "]
-edition = "2021"
-
-[dependencies]
diff --git a/crates/fs-utils/src/lib.rs b/crates/fs-utils/src/lib.rs
deleted file mode 100644
index afbb80556..000000000
--- a/crates/fs-utils/src/lib.rs
+++ /dev/null
@@ -1,21 +0,0 @@
-//! This crate provides utilities for operating on the filesystem.
-
-use std::fs;
-use std::io;
-use std::path::Path;
-
-/// This creates the parent directory of the input path, assuming the input path is a file.
-pub fn ensure_containing_dir_exists<P: AsRef<Path>>(path: &P) -> io::Result<()> {
- path.as_ref()
- .parent()
- .ok_or_else(|| {
- io::Error::new(
- io::ErrorKind::NotFound,
- format!(
- "Could not determine directory information for {}",
- path.as_ref().display()
- ),
- )
- })
- .and_then(fs::create_dir_all)
-}
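-
-// Illustrative usage sketch (the path below is a placeholder): make sure the
-// parent directory exists before writing a cache file.
-#[allow(dead_code)]
-fn example_prepare_cache_file() -> io::Result<()> {
-    let cache_file = Path::new("/tmp/volta/cache/node/index.json");
-    ensure_containing_dir_exists(&cache_file)?;
-    fs::write(cache_file, b"{}")
-}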
diff --git a/crates/progress-read/Cargo.toml b/crates/progress-read/Cargo.toml
deleted file mode 100644
index 6dafc1110..000000000
--- a/crates/progress-read/Cargo.toml
+++ /dev/null
@@ -1,7 +0,0 @@
-[package]
-name = "progress-read"
-version = "0.1.0"
-authors = ["David Herman "]
-edition = "2021"
-
-[dependencies]
diff --git a/crates/progress-read/src/lib.rs b/crates/progress-read/src/lib.rs
deleted file mode 100644
index cf1e49a9f..000000000
--- a/crates/progress-read/src/lib.rs
+++ /dev/null
@@ -1,47 +0,0 @@
-//! This crate provides an adapter for the `std::io::Read` trait to
-//! allow reporting incremental progress to a callback function.
-
-use std::io::{self, Read, Seek, SeekFrom};
-
-/// A reader that reports incremental progress while reading.
-pub struct ProgressRead<R: Read, T, F: FnMut(&T, usize) -> T> {
- source: R,
- accumulator: T,
- progress: F,
-}
-
-impl<R: Read, T, F: FnMut(&T, usize) -> T> Read for ProgressRead<R, T, F> {
- /// Read some bytes from the underlying reader into the specified buffer,
- /// and report progress to the progress callback. The progress callback is
- /// passed the current value of the accumulator as its first argument and
- /// the number of bytes read as its second argument. The result of the
- /// progress callback is stored as the updated value of the accumulator,
- /// to be passed to the next invocation of the callback.
- fn read(&mut self, buf: &mut [u8]) -> io::Result<usize> {
- let len = self.source.read(buf)?;
- let new_accumulator = {
- let progress = &mut self.progress;
- progress(&self.accumulator, len)
- };
- self.accumulator = new_accumulator;
- Ok(len)
- }
-}
-
-impl<R: Read, T, F: FnMut(&T, usize) -> T> ProgressRead<R, T, F> {
- /// Construct a new progress reader with the specified underlying reader,
- /// initial value for an accumulator, and progress callback.
- pub fn new(source: R, init: T, progress: F) -> ProgressRead<R, T, F> {
- ProgressRead {
- source,
- accumulator: init,
- progress,
- }
- }
-}
-
-impl<R: Read + Seek, T, F: FnMut(&T, usize) -> T> Seek for ProgressRead<R, T, F> {
- fn seek(&mut self, pos: SeekFrom) -> io::Result<u64> {
- self.source.seek(pos)
- }
-}
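-
-// Illustrative usage sketch: wrap an in-memory reader and log how many bytes
-// each `read` call consumed, using a unit accumulator (names are examples only).
-#[allow(dead_code)]
-fn example_progress_read() -> io::Result<u64> {
-    let data = [0u8; 1024];
-    let mut reader = ProgressRead::new(&data[..], (), |_, len| {
-        // Side-effecting callback; the accumulator stays `()`.
-        eprintln!("read {} bytes", len);
-    });
-    let mut sink = Vec::new();
-    io::copy(&mut reader, &mut sink)
-}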
diff --git a/crates/test-support/Cargo.toml b/crates/test-support/Cargo.toml
deleted file mode 100644
index e95a81d3f..000000000
--- a/crates/test-support/Cargo.toml
+++ /dev/null
@@ -1,10 +0,0 @@
-[package]
-name = "test-support"
-version = "0.1.0"
-authors = ["David Herman "]
-edition = "2021"
-
-[dependencies]
-hamcrest2 = "0.3.0"
-serde_json = { version = "1.0.135" }
-thiserror = "2.0.9"
diff --git a/crates/test-support/src/lib.rs b/crates/test-support/src/lib.rs
deleted file mode 100644
index bf1f5e662..000000000
--- a/crates/test-support/src/lib.rs
+++ /dev/null
@@ -1,15 +0,0 @@
-//! Utilities to use with acceptance tests in Volta.
-
-#[macro_export]
-macro_rules! ok_or_panic {
- { $e:expr } => {
- match $e {
- Ok(x) => x,
- Err(err) => panic!("{} failed with {}", stringify!($e), err),
- }
- };
-}
-
-pub mod matchers;
-pub mod paths;
-pub mod process;
diff --git a/crates/test-support/src/matchers.rs b/crates/test-support/src/matchers.rs
deleted file mode 100644
index c534c4ef0..000000000
--- a/crates/test-support/src/matchers.rs
+++ /dev/null
@@ -1,719 +0,0 @@
-use std::fmt;
-use std::process::Output;
-use std::str;
-
-use crate::process::ProcessBuilder;
-
-use hamcrest2::core::{MatchResult, Matcher};
-use serde_json::{self, Value};
-
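-/// Builder describing the expected output of a process execution; the
-/// expectations are checked by the `Matcher` implementations below.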
-#[derive(Clone)]
-pub struct Execs {
- expect_stdout: Option<String>,
- expect_stderr: Option<String>,
- expect_exit_code: Option<i32>,
- expect_stdout_contains: Vec<String>,
- expect_stderr_contains: Vec<String>,
- expect_either_contains: Vec<String>,
- expect_stdout_contains_n: Vec<(String, usize)>,
- expect_stdout_not_contains: Vec<String>,
- expect_stderr_not_contains: Vec<String>,
- expect_stderr_unordered: Vec<String>,
- expect_neither_contains: Vec<String>,
- expect_json: Option<Vec<Value>>,
-}
-
-impl Execs {
- /// Verify that stdout is equal to the given lines.
- /// See `lines_match` for supported patterns.
- pub fn with_stdout<S: ToString>(mut self, expected: S) -> Execs {
- self.expect_stdout = Some(expected.to_string());
- self
- }
-
- /// Verify that stderr is equal to the given lines.
- /// See `lines_match` for supported patterns.
- pub fn with_stderr<S: ToString>(mut self, expected: S) -> Execs {
- self._with_stderr(&expected);
- self
- }
-
- fn _with_stderr(&mut self, expected: &dyn ToString) {
- self.expect_stderr = Some(expected.to_string());
- }
-
- /// Verify the exit code from the process.
- pub fn with_status(mut self, expected: i32) -> Execs {
- self.expect_exit_code = Some(expected);
- self
- }
-
- /// Verify that stdout contains the given contiguous lines somewhere in
- /// its output.
- /// See `lines_match` for supported patterns.
- pub fn with_stdout_contains<S: ToString>(mut self, expected: S) -> Execs {
- self.expect_stdout_contains.push(expected.to_string());
- self
- }
-
- /// Verify that stderr contains the given contiguous lines somewhere in
- /// its output.
- /// See `lines_match` for supported patterns.
- pub fn with_stderr_contains<S: ToString>(mut self, expected: S) -> Execs {
- self.expect_stderr_contains.push(expected.to_string());
- self
- }
-
- /// Verify that either stdout or stderr contains the given contiguous
- /// lines somewhere in its output.
- /// See `lines_match` for supported patterns.
- pub fn with_either_contains<S: ToString>(mut self, expected: S) -> Execs {
- self.expect_either_contains.push(expected.to_string());
- self
- }
-
- /// Verify that stdout contains the given contiguous lines somewhere in
- /// its output, and should be repeated `number` times.
- /// See `lines_match` for supported patterns.
- pub fn with_stdout_contains_n<S: ToString>(mut self, expected: S, number: usize) -> Execs {
- self.expect_stdout_contains_n
- .push((expected.to_string(), number));
- self
- }
-
- /// Verify that stdout does not contain the given contiguous lines.
- /// See `lines_match` for supported patterns.
- /// See note on `with_stderr_does_not_contain`.
- pub fn with_stdout_does_not_contain<S: ToString>(mut self, expected: S) -> Execs {
- self.expect_stdout_not_contains.push(expected.to_string());
- self
- }
-
- /// Verify that stderr does not contain the given contiguous lines.
- /// See `lines_match` for supported patterns.
- ///
- /// Care should be taken when using this method because there is a
- /// limitless number of possible things that *won't* appear. A typo means
- /// your test will pass without verifying the correct behavior. If
- /// possible, write the test first so that it fails, and then implement
- /// your fix/feature to make it pass.
- pub fn with_stderr_does_not_contain<S: ToString>(mut self, expected: S) -> Execs {
- self.expect_stderr_not_contains.push(expected.to_string());
- self
- }
-
- /// Verify that all of the stderr output is equal to the given lines,
- /// ignoring the order of the lines.
- /// See `lines_match` for supported patterns.
- /// This is useful when checking the output of `cargo build -v` since
- /// the order of the output is not always deterministic.
- /// Prefer `with_stderr_contains` unless you really want to
- /// check *every* line of output.
- ///
- /// Be careful when using patterns such as `[..]`, because you may end up
- /// with multiple lines that might match, and this is not smart enough to
- /// do anything like longest-match. For example, avoid something like:
- /// [RUNNING] `rustc [..]
- /// [RUNNING] `rustc --crate-name foo [..]
- /// This will randomly fail if the other crate name is `bar`, and the
- /// order changes.
- pub fn with_stderr_unordered<S: ToString>(mut self, expected: S) -> Execs {
- self.expect_stderr_unordered.push(expected.to_string());
- self
- }
-
- /// Verify the JSON output matches the given JSON.
- /// Typically used when testing cargo commands that emit JSON.
- /// Each separate JSON object should be separated by a blank line.
- /// Example:
- /// assert_that(
- /// p.cargo("metadata"),
- /// execs().with_json(r#"
- /// {"example": "abc"}
- /// {"example": "def"}
- /// "#)
- /// );
- /// Objects should match in the order given.
- /// The order of arrays is ignored.
- /// Strings support patterns described in `lines_match`.
- /// Use `{...}` to match any object.
- pub fn with_json(mut self, expected: &str) -> Execs {
- self.expect_json = Some(
- expected
- .split("\n\n")
- .map(|obj| obj.parse().unwrap())
- .collect(),
- );
- self
- }
-
- fn match_output(&self, actual: &Output) -> MatchResult {
- self.match_status(actual)
- .and(self.match_stdout(actual))
- .and(self.match_stderr(actual))
- }
-
- fn match_status(&self, actual: &Output) -> MatchResult {
- match self.expect_exit_code {
- None => Ok(()),
- Some(code) if actual.status.code() == Some(code) => Ok(()),
- Some(_) => Err(format!(
- "exited with {}\n--- stdout\n{}\n--- stderr\n{}",
- actual.status,
- String::from_utf8_lossy(&actual.stdout),
- String::from_utf8_lossy(&actual.stderr)
- )),
- }
- }
-
- fn match_stdout(&self, actual: &Output) -> MatchResult {
- self.match_std(
- self.expect_stdout.as_ref(),
- &actual.stdout,
- "stdout",
- &actual.stderr,
- MatchKind::Exact,
- )?;
- for expect in self.expect_stdout_contains.iter() {
- self.match_std(
- Some(expect),
- &actual.stdout,
- "stdout",
- &actual.stderr,
- MatchKind::Partial,
- )?;
- }
- for expect in self.expect_stderr_contains.iter() {
- self.match_std(
- Some(expect),
- &actual.stderr,
- "stderr",
- &actual.stdout,
- MatchKind::Partial,
- )?;
- }
- for &(ref expect, number) in self.expect_stdout_contains_n.iter() {
- self.match_std(
- Some(expect),
- &actual.stdout,
- "stdout",
- &actual.stderr,
- MatchKind::PartialN(number),
- )?;
- }
- for expect in self.expect_stdout_not_contains.iter() {
- self.match_std(
- Some(expect),
- &actual.stdout,
- "stdout",
- &actual.stderr,
- MatchKind::NotPresent,
- )?;
- }
- for expect in self.expect_stderr_not_contains.iter() {
- self.match_std(
- Some(expect),
- &actual.stderr,
- "stderr",
- &actual.stdout,
- MatchKind::NotPresent,
- )?;
- }
- for expect in self.expect_stderr_unordered.iter() {
- self.match_std(
- Some(expect),
- &actual.stderr,
- "stderr",
- &actual.stdout,
- MatchKind::Unordered,
- )?;
- }
- for expect in self.expect_neither_contains.iter() {
- self.match_std(
- Some(expect),
- &actual.stdout,
- "stdout",
- &actual.stdout,
- MatchKind::NotPresent,
- )?;
-
- self.match_std(
- Some(expect),
- &actual.stderr,
- "stderr",
- &actual.stderr,
- MatchKind::NotPresent,
- )?;
- }
-
- for expect in self.expect_either_contains.iter() {
- let match_std = self.match_std(
- Some(expect),
- &actual.stdout,
- "stdout",
- &actual.stdout,
- MatchKind::Partial,
- );
- let match_err = self.match_std(
- Some(expect),
- &actual.stderr,
- "stderr",
- &actual.stderr,
- MatchKind::Partial,
- );
-
- if let (Err(_), Err(_)) = (match_std, match_err) {
- return Err(format!(
- "expected to find:\n\
- {}\n\n\
- did not find in either output.",
- expect
- ));
- }
- }
-
- if let Some(ref objects) = self.expect_json {
- let stdout = str::from_utf8(&actual.stdout)
- .map_err(|_| "stdout was not utf8 encoded".to_owned())?;
- let lines = stdout
- .lines()
- .filter(|line| line.starts_with('{'))
- .collect::<Vec<_>>();
- if lines.len() != objects.len() {
- return Err(format!(
- "expected {} json lines, got {}, stdout:\n{}",
- objects.len(),
- lines.len(),
- stdout
- ));
- }
- for (obj, line) in objects.iter().zip(lines) {
- self.match_json(obj, line)?;
- }
- }
- Ok(())
- }
-
- fn match_stderr(&self, actual: &Output) -> MatchResult {
- self.match_std(
- self.expect_stderr.as_ref(),
- &actual.stderr,
- "stderr",
- &actual.stdout,
- MatchKind::Exact,
- )
- }
-
- fn match_std(
- &self,
- expected: Option<&String>,
- actual: &[u8],
- description: &str,
- extra: &[u8],
- kind: MatchKind,
- ) -> MatchResult {
- let out = match expected {
- Some(out) => out,
- None => return Ok(()),
- };
- let actual = match str::from_utf8(actual) {
- Err(..) => return Err(format!("{} was not utf8 encoded", description)),
- Ok(actual) => actual,
- };
- // Let's not deal with \r\n vs \n on windows...
- let actual = actual.replace('\r', "");
- let actual = actual.replace('\t', "");
-
- match kind {
- MatchKind::Exact => {
- let a = actual.lines();
- let e = out.lines();
-
- let diffs = self.diff_lines(a, e, false);
- if diffs.is_empty() {
- Ok(())
- } else {
- Err(format!(
- "differences:\n\
- {}\n\n\
- other output:\n\
- `{}`",
- diffs.join("\n"),
- String::from_utf8_lossy(extra)
- ))
- }
- }
- MatchKind::Partial => {
- let mut a = actual.lines();
- let e = out.lines();
-
- let mut diffs = self.diff_lines(a.clone(), e.clone(), true);
- #[allow(clippy::while_let_on_iterator)]
- while let Some(..) = a.next() {
- let a = self.diff_lines(a.clone(), e.clone(), true);
- if a.len() < diffs.len() {
- diffs = a;
- }
- }
- if diffs.is_empty() {
- Ok(())
- } else {
- Err(format!(
- "expected to find:\n\
- {}\n\n\
- did not find in output:\n\
- {}",
- out, actual
- ))
- }
- }
- MatchKind::PartialN(number) => {
- let mut a = actual.lines();
- let e = out.lines();
-
- let mut matches = 0;
-
- loop {
- if self.diff_lines(a.clone(), e.clone(), true).is_empty() {
- matches += 1;
- }
-
- if a.next().is_none() {
- break;
- }
- }
-
- if matches == number {
- Ok(())
- } else {
- Err(format!(
- "expected to find {} occurrences:\n\
- {}\n\n\
- did not find in output:\n\
- {}",
- number, out, actual
- ))
- }
- }
- MatchKind::NotPresent => {
- let mut a = actual.lines();
- let e = out.lines();
-
- let mut diffs = self.diff_lines(a.clone(), e.clone(), true);
- #[allow(clippy::while_let_on_iterator)]
- while let Some(..) = a.next() {
- let a = self.diff_lines(a.clone(), e.clone(), true);
- if a.len() < diffs.len() {
- diffs = a;
- }
- }
- if diffs.is_empty() {
- Err(format!(
- "expected not to find:\n\
- {}\n\n\
- but found in output:\n\
- {}",
- out, actual
- ))
- } else {
- Ok(())
- }
- }
- MatchKind::Unordered => {
- let mut a = actual.lines().collect::<Vec<_>>();
- let e = out.lines();
-
- for e_line in e {
- match a.iter().position(|a_line| lines_match(e_line, a_line)) {
- Some(index) => a.remove(index),
- None => {
- return Err(format!(
- "Did not find expected line:\n\
- {}\n\
- Remaining available output:\n\
- {}\n",
- e_line,
- a.join("\n")
- ));
- }
- };
- }
- if !a.is_empty() {
- Err(format!(
- "Output included extra lines:\n\
- {}\n",
- a.join("\n")
- ))
- } else {
- Ok(())
- }
- }
- }
- }
-
- fn match_json(&self, expected: &Value, line: &str) -> MatchResult {
- let actual = match line.parse() {
- Err(e) => return Err(format!("invalid json, {}:\n`{}`", e, line)),
- Ok(actual) => actual,
- };
-
- match find_mismatch(expected, &actual) {
- Some((expected_part, actual_part)) => Err(format!(
- "JSON mismatch\nExpected:\n{}\nWas:\n{}\nExpected part:\n{}\nActual part:\n{}\n",
- serde_json::to_string_pretty(expected).unwrap(),
- serde_json::to_string_pretty(&actual).unwrap(),
- serde_json::to_string_pretty(expected_part).unwrap(),
- serde_json::to_string_pretty(actual_part).unwrap(),
- )),
- None => Ok(()),
- }
- }
-
- fn diff_lines<'a>(
- &self,
- actual: str::Lines<'a>,
- expected: str::Lines<'a>,
- partial: bool,
- ) -> Vec<String> {
- let actual = actual.take(if partial {
- expected.clone().count()
- } else {
- usize::MAX
- });
- zip_all(actual, expected)
- .enumerate()
- .filter_map(|(i, (a, e))| match (a, e) {
- (Some(a), Some(e)) => {
- if lines_match(e, a) {
- None
- } else {
- Some(format!("{:3} - |{}|\n + |{}|\n", i, e, a))
- }
- }
- (Some(a), None) => Some(format!("{:3} -\n + |{}|\n", i, a)),
- (None, Some(e)) => Some(format!("{:3} - |{}|\n +\n", i, e)),
- (None, None) => panic!("Cannot get here"),
- })
- .collect()
- }
-}
-
-#[derive(Debug, PartialEq, Eq, Clone, Copy)]
-enum MatchKind {
- Exact,
- Partial,
- PartialN(usize),
- NotPresent,
- Unordered,
-}
-
-/// Compare a line with an expected pattern.
-/// - Use `[..]` as a wildcard to match 0 or more characters on the same line
-/// (similar to `.*` in a regex).
-/// - Use `[EXE]` to optionally add `.exe` on Windows (empty string on other
-/// platforms).
-/// - There is a wide range of macros (such as `[COMPILING]` or `[WARNING]`)
-/// to match cargo's "status" output and allows you to ignore the alignment.
-/// See `substitute_macros` for a complete list of macros.
-pub fn lines_match(expected: &str, actual: &str) -> bool {
- // Let's not deal with / vs \ (windows...)
- let expected = expected.replace('\\', "/");
- let mut actual: &str = &actual.replace('\\', "/");
- let expected = substitute_macros(&expected);
- for (i, part) in expected.split("[..]").enumerate() {
- match actual.find(part) {
- Some(j) => {
- if i == 0 && j != 0 {
- return false;
- }
- actual = &actual[j + part.len()..];
- }
- None => return false,
- }
- }
- actual.is_empty() || expected.ends_with("[..]")
-}
-
-#[test]
-fn lines_match_works() {
- assert!(lines_match("a b", "a b"));
- assert!(lines_match("a[..]b", "a b"));
- assert!(lines_match("a[..]", "a b"));
- assert!(lines_match("[..]", "a b"));
- assert!(lines_match("[..]b", "a b"));
-
- assert!(!lines_match("[..]b", "c"));
- assert!(!lines_match("b", "c"));
- assert!(!lines_match("b", "cb"));
-}
-
-// Compares JSON object for approximate equality.
-// You can use `[..]` wildcard in strings (useful for OS dependent things such
-// as paths). You can use a `"{...}"` string literal as a wildcard for
-// arbitrary nested JSON (useful for parts of object emitted by other programs
-// (e.g. rustc) rather than Cargo itself). Arrays are sorted before comparison.
-fn find_mismatch<'a>(expected: &'a Value, actual: &'a Value) -> Option<(&'a Value, &'a Value)> {
- use serde_json::Value::*;
- match (expected, actual) {
- (Number(l), Number(r)) if l == r => None,
- (Bool(l), Bool(r)) if l == r => None,
- (String(l), String(r)) if lines_match(l, r) => None,
- (Array(l), Array(r)) => {
- if l.len() != r.len() {
- return Some((expected, actual));
- }
-
- let mut l = l.iter().collect::<Vec<_>>();
- let mut r = r.iter().collect::<Vec<_>>();
-
- l.retain(
- |l| match r.iter().position(|r| find_mismatch(l, r).is_none()) {
- Some(i) => {
- r.remove(i);
- false
- }
- None => true,
- },
- );
-
- if !l.is_empty() {
- assert!(!r.is_empty());
- Some((l[0], r[0]))
- } else {
- assert_eq!(r.len(), 0);
- None
- }
- }
- (Object(l), Object(r)) => {
- let same_keys = l.len() == r.len() && l.keys().all(|k| r.contains_key(k));
- if !same_keys {
- return Some((expected, actual));
- }
-
- l.values()
- .zip(r.values())
- .find_map(|(l, r)| find_mismatch(l, r))
- }
- (Null, Null) => None,
- // magic string literal "{...}" acts as wildcard for any sub-JSON
- (String(l), _) if l == "{...}" => None,
- _ => Some((expected, actual)),
- }
-}
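-
-#[test]
-fn find_mismatch_accepts_wildcards() {
-    // Illustrative test of the comparison rules described above: string values
-    // may use the `[..]` wildcard, and the literal "{...}" matches any sub-JSON.
-    let expected = serde_json::json!({"path": "[..]/bin/node", "meta": "{...}"});
-    let actual = serde_json::json!({"path": "/home/user/bin/node", "meta": {"arch": "x64"}});
-    assert!(find_mismatch(&expected, &actual).is_none());
-}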
-
-struct ZipAll<I1: Iterator, I2: Iterator> {
- first: I1,
- second: I2,
-}
-
-impl<T, I1: Iterator<Item = T>, I2: Iterator<Item = T>> Iterator for ZipAll<I1, I2> {
- type Item = (Option<T>, Option<T>);
- fn next(&mut self) -> Option<(Option<T>, Option<T>)> {
- let first = self.first.next();
- let second = self.second.next();
-
- match (first, second) {
- (None, None) => None,
- (a, b) => Some((a, b)),
- }
- }
-}
-
-fn zip_all<T, I1: Iterator<Item = T>, I2: Iterator<Item = T>>(a: I1, b: I2) -> ZipAll<I1, I2> {
- ZipAll {
- first: a,
- second: b,
- }
-}
-
-impl fmt::Display for Execs {
- fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
- write!(f, "execs")
- }
-}
-
-impl fmt::Debug for Execs {
- fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
- write!(f, "execs")
- }
-}
-
-impl Matcher<ProcessBuilder> for Execs {
- fn matches(&self, mut process: ProcessBuilder) -> MatchResult {
- self.matches(&mut process)
- }
-}
-
-impl<'a> Matcher<&'a mut ProcessBuilder> for Execs {
- fn matches(&self, process: &'a mut ProcessBuilder) -> MatchResult {
- println!("running {}", process);
- let res = process.exec_with_output();
-
- match res {
- Ok(out) => self.match_output(&out),
- Err(err) => {
- if let Some(out) = &err.output {
- return self.match_output(out);
- }
- Err(format!("could not exec process {}: {}", process, err))
- }
- }
- }
-}
-
-impl Matcher