From df673a440c7afb8c7dcd87b7411a4d33d63b6ec6 Mon Sep 17 00:00:00 2001 From: Remi Bernotavicius Date: Sun, 14 Jan 2024 09:50:23 -0800 Subject: [PATCH] Revert "Rename cargo-metest to cargo-maelstrom." This reverts commit 717c44238d7076f754fe94908e2abe329a03a502. This commit meant to rename cargo-metest, but instead it ended up accidentally deleting it. I'm reverting it so we can try again. --- CHANGELOG.md | 14 +- Cargo.lock | 2 +- README.md | 2 +- crates/cargo-metest/Cargo.toml | 41 + .../cargo-metest/examples/cargo-metest.toml | 29 + crates/cargo-metest/src/artifacts.rs | 149 ++ crates/cargo-metest/src/cargo.rs | 94 ++ crates/cargo-metest/src/config.rs | 56 + crates/cargo-metest/src/lib.rs | 734 +++++++++ crates/cargo-metest/src/main.rs | 244 +++ crates/cargo-metest/src/metadata.rs | 1021 ++++++++++++ crates/cargo-metest/src/metadata/directive.rs | 963 +++++++++++ .../cargo-metest/src/pattern/interpreter.rs | 493 ++++++ crates/cargo-metest/src/pattern/mod.rs | 5 + crates/cargo-metest/src/pattern/parser.rs | 763 +++++++++ crates/cargo-metest/src/progress.rs | 87 + crates/cargo-metest/src/progress/driver.rs | 81 + .../src/progress/multiple_progress_bars.rs | 129 ++ crates/cargo-metest/src/progress/no_bar.rs | 29 + .../cargo-metest/src/progress/quiet_no_bar.rs | 29 + .../src/progress/quiet_progress_bar.rs | 35 + .../cargo-metest/src/progress/test_listing.rs | 78 + crates/cargo-metest/src/test_listing.rs | 179 ++ crates/cargo-metest/src/visitor.rs | 224 +++ crates/cargo-metest/tests/integration_test.rs | 1438 +++++++++++++++++ crates/maelstrom-client/src/spec.rs | 2 +- meticulous-test.toml | 2 +- site/src/SUMMARY.md | 24 +- site/src/cargo_metest.md | 8 +- site/src/cargo_metest/filtering_tests.md | 2 +- .../cargo_metest/include_and_exclude_flags.md | 8 +- site/src/cargo_metest/running_tests.md | 24 +- site/src/cargo_metest/test_pattern_dsl.md | 4 +- site/src/install/cargo_metest.md | 6 +- site/src/installation.md | 4 +- site/src/introduction.md | 6 +- 36 files changed, 
6955 insertions(+), 54 deletions(-) create mode 100644 crates/cargo-metest/Cargo.toml create mode 100644 crates/cargo-metest/examples/cargo-metest.toml create mode 100644 crates/cargo-metest/src/artifacts.rs create mode 100644 crates/cargo-metest/src/cargo.rs create mode 100644 crates/cargo-metest/src/config.rs create mode 100644 crates/cargo-metest/src/lib.rs create mode 100644 crates/cargo-metest/src/main.rs create mode 100644 crates/cargo-metest/src/metadata.rs create mode 100644 crates/cargo-metest/src/metadata/directive.rs create mode 100644 crates/cargo-metest/src/pattern/interpreter.rs create mode 100644 crates/cargo-metest/src/pattern/mod.rs create mode 100644 crates/cargo-metest/src/pattern/parser.rs create mode 100644 crates/cargo-metest/src/progress.rs create mode 100644 crates/cargo-metest/src/progress/driver.rs create mode 100644 crates/cargo-metest/src/progress/multiple_progress_bars.rs create mode 100644 crates/cargo-metest/src/progress/no_bar.rs create mode 100644 crates/cargo-metest/src/progress/quiet_no_bar.rs create mode 100644 crates/cargo-metest/src/progress/quiet_progress_bar.rs create mode 100644 crates/cargo-metest/src/progress/test_listing.rs create mode 100644 crates/cargo-metest/src/test_listing.rs create mode 100644 crates/cargo-metest/src/visitor.rs create mode 100644 crates/cargo-metest/tests/integration_test.rs diff --git a/CHANGELOG.md b/CHANGELOG.md index b4445a44..682e6893 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -7,7 +7,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 ## [Unreleased] -### `cargo-maelstrom` +### `cargo-metest` #### Added - The `--include` and `--exclude` long option names for `-i` and `-x`. @@ -30,7 +30,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 flags. `list` also takes an optional positional argument to indicate what type of artifact to list. The `-v`, `-h`, and `-b` flags remain at the top level. 
   The `quiet` configuration option also has moved into a new sub-section
-  for the run command in `cargo-maelstrom.toml`.
+  for the run command in `cargo-metest.toml`.
   [Issue #118](https://github.com/meticulous-software/meticulous/issues/118)

 #### Fixed
@@ -51,7 +51,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 - Ability to create a build environment using [`nix`](https://nixos.org).
 - Renamed `maelstrom-client` binary to `maelstrom-client-cli`.

-### `cargo-maelstrom`
+### `cargo-metest`

 #### Added
@@ -77,7 +77,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

 #### Changed

 - The JSON language used to specify jobs. This now has all of the features of
-  the language used by `cargo-maelstrom`, including specifying parts of an image
+  the language used by `cargo-metest`, including specifying parts of an image
   to use.
   [Issue #103](https://github.com/meticulous-software/meticulous/issues/103)
@@ -99,7 +99,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
   `maelstrom-client`.
   [Issue #70](https://github.com/meticulous-software/meticulous/issues/70).

-### `cargo-maelstrom`
+### `cargo-metest`

 #### Added
@@ -119,7 +119,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
   worked on. There is a new, "pop-up", status bar when tar files are being
   generated. Plus other small improvements.
 - Initial progress bar accuracy by remembering how many tests there were the
-  last time `cargo-maelstrom` was run.
+  last time `cargo-metest` was run.

 #### Changed
@@ -137,7 +137,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 - Binaries for the clustered job runner: `maelstrom-worker`,
   `maelstrom-broker`, and `maelstrom-client`.
 - Client library for communicating with the broker: `maelstrom-client`.
-- A Rust test runner that uses the clustered job runner: `cargo-maelstrom`.
+- A Rust test runner that uses the clustered job runner: `cargo-metest`. - A bunch of other library packages that are used internally. [unreleased]: https://github.com/meticulous-software/meticulous/compare/v0.3.0...HEAD diff --git a/Cargo.lock b/Cargo.lock index 3caaa021..27684b1a 100644 --- a/Cargo.lock +++ b/Cargo.lock @@ -658,7 +658,7 @@ dependencies = [ ] [[package]] -name = "cargo-maelstrom" +name = "cargo-metest" version = "0.4.0-dev" dependencies = [ "anyhow", diff --git a/README.md b/README.md index e96fec21..be1d7cd7 100644 --- a/README.md +++ b/README.md @@ -46,7 +46,7 @@ this repository and use `cargo install`. Something like this should work: ```sh -for i in meticulous-{broker,worker,client} cargo-maelstrom; do +for i in meticulous-{broker,worker,client} cargo-metest; do cargo install --git https://github.com/meticulous-software/meticulous.git $i done ``` diff --git a/crates/cargo-metest/Cargo.toml b/crates/cargo-metest/Cargo.toml new file mode 100644 index 00000000..cfe1b004 --- /dev/null +++ b/crates/cargo-metest/Cargo.toml @@ -0,0 +1,41 @@ +[package] +name = "cargo-metest" +version.workspace = true +edition.workspace = true + +[dependencies] +anyhow.workspace = true +cargo_metadata.workspace = true +clap.workspace = true +colored.workspace = true +combine.workspace = true +console.workspace = true +derive_more.workspace = true +enumset.workspace = true +figment.workspace = true +globset.workspace = true +indicatif.workspace = true +lddtree.workspace = true +maelstrom-base.workspace = true +maelstrom-client.workspace = true +maelstrom-util.workspace = true +regex.workspace = true +regex-macro.workspace = true +serde.workspace = true +serde_json.workspace = true +serde_repr.workspace = true +serde_with.workspace = true +tar.workspace = true +toml.workspace = true +unicode-truncate.workspace = true +unicode-width.workspace = true + +[dev-dependencies] +assert_matches.workspace = true +bincode.workspace = true +enum-map.workspace = true 
+serde.workspace = true +tempfile.workspace = true +maelstrom-broker.workspace = true +maelstrom-test.workspace = true +maelstrom-worker.workspace = true diff --git a/crates/cargo-metest/examples/cargo-metest.toml b/crates/cargo-metest/examples/cargo-metest.toml new file mode 100644 index 00000000..220d414b --- /dev/null +++ b/crates/cargo-metest/examples/cargo-metest.toml @@ -0,0 +1,29 @@ +# Example config for cargo metest. +# +# Save this as /.cache/cargo-metest.toml or wherever you +# specify the config file using the --config-file or -c option. + +# The host and port of the broker. +# +# Can also be specified via the `--broker` or `-b` command-line options, or via +# the CARGO_METEST_BROKER environment variable. +# +# There is no default. This option must be specified. +# +# Examples: +# broker = "localhost:1234" +# broker = "127.0.0.1:1234" +broker = "[::1]:1234" + +# Options that apply to the `run` subcommand go in this section. +[run] + +# Whether the output should be "quiet". If this is true, then only a single +# status bar will be output. 
+# +# Can also be specified via the `--quiet` or `-q` command-line options, or via +# the CARGO_METEST_RUN environment variable like this: CARGO_METEST_RUN={quiet = true} +# +# Default: false +# Examples: +# quiet = true diff --git a/crates/cargo-metest/src/artifacts.rs b/crates/cargo-metest/src/artifacts.rs new file mode 100644 index 00000000..62a511b3 --- /dev/null +++ b/crates/cargo-metest/src/artifacts.rs @@ -0,0 +1,149 @@ +use crate::progress::ProgressIndicator; +use anyhow::Result; +use indicatif::ProgressBar; +use maelstrom_base::Sha256Digest; +use maelstrom_client::Client; +use maelstrom_util::fs::Fs; +use std::{ + collections::{BTreeSet, HashMap}, + path::{Path, PathBuf}, + sync::Mutex, +}; +use tar::Header; + +fn create_artifact_for_binary(binary_path: &Path, prog: Option) -> Result { + let prog = prog.unwrap_or_else(ProgressBar::hidden); + + let fs = Fs::new(); + let binary = fs.open_file(binary_path)?; + + let mut tar_path = PathBuf::from(binary_path); + assert!(tar_path.set_extension("tar")); + + if fs.exists(&tar_path) { + let binary_mtime = binary.metadata()?.modified()?; + let tar_mtime = fs.metadata(&tar_path)?.modified()?; + if binary_mtime < tar_mtime { + return Ok(tar_path); + } + } + + let tar_file = fs.create_file(&tar_path)?; + let mut a = tar::Builder::new(tar_file); + + let binary_path_in_tar = Path::new("./").join(binary_path.file_name().unwrap()); + let mut header = Header::new_gnu(); + let meta = binary.metadata()?; + prog.set_length(meta.len()); + header.set_metadata(&meta.into_inner()); + a.append_data(&mut header, binary_path_in_tar, prog.wrap_read(binary))?; + a.finish()?; + + Ok(tar_path) +} + +fn create_artifact_for_binary_deps( + binary_path: &Path, + prog: Option, +) -> Result { + let prog = prog.unwrap_or_else(ProgressBar::hidden); + let fs = Fs::new(); + + let mut tar_path = PathBuf::from(binary_path); + assert!(tar_path.set_extension("deps.tar")); + + if fs.exists(&tar_path) { + let binary_mtime = 
fs.metadata(binary_path)?.modified()?;
+        let tar_mtime = fs.metadata(&tar_path)?.modified()?;
+
+        if binary_mtime < tar_mtime {
+            return Ok(tar_path);
+        }
+    }
+
+    let dep_tree = lddtree::DependencyAnalyzer::new("/".into());
+    let deps = dep_tree.analyze(binary_path)?;
+
+    let mut paths = BTreeSet::new();
+    if let Some(p) = deps.interpreter {
+        if let Some(lib) = deps.libraries.get(&p) {
+            paths.insert(lib.path.clone());
+        }
+    }
+
+    fn walk_deps(
+        deps: &[String],
+        libraries: &HashMap<String, lddtree::Library>,
+        paths: &mut BTreeSet<PathBuf>,
+    ) {
+        for dep in deps {
+            if let Some(lib) = libraries.get(dep) {
+                paths.insert(lib.path.clone());
+                walk_deps(&lib.needed, libraries, paths);
+            }
+        }
+    }
+    walk_deps(&deps.needed, &deps.libraries, &mut paths);
+
+    fn remove_root(path: &Path) -> PathBuf {
+        path.components().skip(1).collect()
+    }
+
+    let tar_file = fs.create_file(&tar_path)?;
+    let mut a = tar::Builder::new(tar_file);
+
+    let files = paths
+        .iter()
+        .map(|p| fs.open_file(p))
+        .collect::<Result<Vec<_>>>()?;
+
+    let metas = files
+        .iter()
+        .map(|f| f.metadata())
+        .collect::<Result<Vec<_>>>()?;
+
+    let total_size = metas.iter().map(|m| m.len()).sum();
+    prog.set_length(total_size);
+
+    for ((path, file), meta) in paths
+        .into_iter()
+        .zip(files.into_iter())
+        .zip(metas.into_iter())
+    {
+        let mut header = Header::new_gnu();
+        header.set_metadata(&meta.into_inner());
+        a.append_data(&mut header, &remove_root(&path), prog.wrap_read(file))?;
+    }
+
+    a.finish()?;
+
+    Ok(tar_path)
+}
+
+pub struct GeneratedArtifacts {
+    pub binary: Sha256Digest,
+    pub deps: Sha256Digest,
+}
+
+pub fn add_generated_artifacts(
+    client: &Mutex<Client>,
+    binary_path: &Path,
+    ind: &impl ProgressIndicator,
+) -> Result<GeneratedArtifacts> {
+    let prog = ind.new_side_progress("tar");
+    let binary_artifact = client
+        .lock()
+        .unwrap()
+        .add_artifact(&create_artifact_for_binary(binary_path, prog)?)?;
+    let prog = ind.new_side_progress("tar");
+    let deps_artifact = client
+        .lock()
+        .unwrap()
+
.add_artifact(&create_artifact_for_binary_deps(binary_path, prog)?)?; + Ok(GeneratedArtifacts { + binary: binary_artifact, + deps: deps_artifact, + }) +} diff --git a/crates/cargo-metest/src/cargo.rs b/crates/cargo-metest/src/cargo.rs new file mode 100644 index 00000000..619c0cd2 --- /dev/null +++ b/crates/cargo-metest/src/cargo.rs @@ -0,0 +1,94 @@ +use anyhow::{Error, Result}; +use cargo_metadata::{ + Artifact as CargoArtifact, Message as CargoMessage, MessageIter as CargoMessageIter, +}; +use regex::Regex; +use std::{ + io, + io::BufReader, + path::Path, + process::{Child, ChildStdout, Command, Stdio}, + str, +}; + +pub struct CargoBuild { + child: Child, +} + +impl CargoBuild { + pub fn new(program: &str, color: bool, packages: Vec) -> Result { + let mut cmd = Command::new(program); + cmd.arg("test") + .arg("--no-run") + .arg("--message-format=json-render-diagnostics") + .arg(&format!( + "--color={}", + if color { "always" } else { "never" } + )) + .stdout(Stdio::piped()) + .stderr(Stdio::piped()); + + for package in packages { + cmd.arg("--package").arg(package); + } + + let child = cmd.spawn()?; + + Ok(Self { child }) + } + + pub fn artifact_stream(&mut self) -> TestArtifactStream { + TestArtifactStream { + stream: Some(CargoMessage::parse_stream(BufReader::new( + self.child.stdout.take().unwrap(), + ))), + } + } + + pub fn check_status(mut self, mut stderr: impl io::Write) -> Result<()> { + let exit_status = self.child.wait()?; + if !exit_status.success() { + std::io::copy(self.child.stderr.as_mut().unwrap(), &mut stderr)?; + return Err(Error::msg("build failure".to_string())); + } + + Ok(()) + } +} + +#[derive(Default)] +pub struct TestArtifactStream { + stream: Option>>, +} + +impl Iterator for TestArtifactStream { + type Item = Result; + + fn next(&mut self) -> Option { + while let Some(stream) = &mut self.stream { + match stream.next()? 
{ + Err(e) => return Some(Err(e.into())), + Ok(CargoMessage::CompilerArtifact(artifact)) => { + if artifact.executable.is_some() && artifact.profile.test { + return Some(Ok(artifact)); + } + } + _ => continue, + } + } + None + } +} + +pub fn get_cases_from_binary(binary: &Path, filter: &Option) -> Result> { + let mut cmd = Command::new(binary); + cmd.arg("--list").arg("--format").arg("terse"); + if let Some(filter) = filter { + cmd.arg(filter); + } + let output = cmd.output()?; + Ok(Regex::new(r"\b([^ ]*): test")? + .captures_iter(str::from_utf8(&output.stdout)?) + .map(|capture| capture.get(1).unwrap().as_str().trim().to_string()) + .collect()) +} diff --git a/crates/cargo-metest/src/config.rs b/crates/cargo-metest/src/config.rs new file mode 100644 index 00000000..a76f39fe --- /dev/null +++ b/crates/cargo-metest/src/config.rs @@ -0,0 +1,56 @@ +use derive_more::From; +use maelstrom_util::config::BrokerAddr; +use serde::{Deserialize, Serialize}; +use serde_with::skip_serializing_none; +use std::fmt::{self, Debug, Formatter}; + +#[derive(Clone, Deserialize, From)] +#[serde(transparent)] +pub struct Quiet(bool); + +impl Quiet { + pub fn into_inner(self) -> bool { + self.0 + } +} + +impl Debug for Quiet { + fn fmt(&self, f: &mut Formatter<'_>) -> Result<(), fmt::Error> { + self.0.fmt(f) + } +} + +#[derive(Debug, Deserialize)] +#[serde(deny_unknown_fields)] +pub struct Config { + pub broker: BrokerAddr, + pub run: RunConfig, +} + +#[derive(Debug, Deserialize)] +#[serde(deny_unknown_fields)] +pub struct RunConfig { + pub quiet: Quiet, +} + +#[skip_serializing_none] +#[derive(Serialize)] +pub struct ConfigOptions { + pub broker: Option, + pub run: RunConfigOptions, +} + +#[skip_serializing_none] +#[derive(Serialize)] +pub struct RunConfigOptions { + pub quiet: Option, +} + +impl Default for ConfigOptions { + fn default() -> Self { + ConfigOptions { + broker: None, + run: RunConfigOptions { quiet: Some(false) }, + } + } +} diff --git a/crates/cargo-metest/src/lib.rs 
b/crates/cargo-metest/src/lib.rs new file mode 100644 index 00000000..878a9606 --- /dev/null +++ b/crates/cargo-metest/src/lib.rs @@ -0,0 +1,734 @@ +pub mod artifacts; +pub mod cargo; +pub mod config; +pub mod metadata; +pub mod pattern; +pub mod progress; +pub mod test_listing; +pub mod visitor; + +use anyhow::Result; +use artifacts::GeneratedArtifacts; +use cargo::{get_cases_from_binary, CargoBuild, TestArtifactStream}; +use cargo_metadata::{Artifact as CargoArtifact, Package as CargoPackage}; +use config::Quiet; +use indicatif::{ProgressBar, TermLike}; +use maelstrom_base::{JobSpec, NonEmpty, Sha256Digest}; +use maelstrom_client::{spec::ImageConfig, Client, ClientDriver}; +use maelstrom_util::{config::BrokerAddr, process::ExitCode}; +use metadata::{AllMetadata, TestMetadata}; +use progress::{ + MultipleProgressBars, NoBar, ProgressDriver, ProgressIndicator, QuietNoBar, QuietProgressBar, + TestListingProgress, TestListingProgressNoSpinner, +}; +use std::{ + collections::HashSet, + io, + path::{Path, PathBuf}, + str, + sync::{ + atomic::{AtomicU64, Ordering}, + Arc, Mutex, + }, +}; +use test_listing::{load_test_listing, write_test_listing, TestListing, LAST_TEST_LISTING_NAME}; +use visitor::{JobStatusTracker, JobStatusVisitor}; + +pub enum ListAction { + ListTests, + ListBinaries, + ListPackages, +} + +/// Returns `true` if the given `CargoPackage` matches the given pattern +fn filter_package(package: &CargoPackage, p: &pattern::Pattern) -> bool { + let c = pattern::Context { + package: package.name.clone(), + artifact: None, + case: None, + }; + pattern::interpret_pattern(p, &c).unwrap_or(true) +} + +/// Returns `true` if the given `CargoArtifact` and case matches the given pattern +fn filter_case(artifact: &CargoArtifact, case: &str, p: &pattern::Pattern) -> bool { + let package_name = artifact.package_id.repr.split(' ').next().unwrap().into(); + let c = pattern::Context { + package: package_name, + artifact: 
Some(pattern::Artifact::from_target(&artifact.target)),
+        case: Some(pattern::Case { name: case.into() }),
+    };
+    pattern::interpret_pattern(p, &c).expect("case is provided")
+}
+
+/// A collection of objects that are used while enqueuing jobs. This is useful as a separate object
+/// since it can contain things which live longer than the scoped threads and thus can be shared
+/// among them.
+///
+/// This object is separate from `MainAppDeps` because it is lent to `JobQueuing`
+struct JobQueuingDeps<StdErrT> {
+    cargo: String,
+    packages: Vec<CargoPackage>,
+    filter: pattern::Pattern,
+    stderr: Mutex<StdErrT>,
+    stderr_color: bool,
+    tracker: Arc<JobStatusTracker>,
+    jobs_queued: AtomicU64,
+    test_metadata: AllMetadata,
+    expected_job_count: u64,
+    test_listing: Mutex<TestListing>,
+    list_action: Option<ListAction>,
+}
+
+impl<StdErrT> JobQueuingDeps<StdErrT> {
+    #[allow(clippy::too_many_arguments)]
+    fn new(
+        cargo: String,
+        packages: Vec<CargoPackage>,
+        filter: pattern::Pattern,
+        stderr: StdErrT,
+        stderr_color: bool,
+        test_metadata: AllMetadata,
+        test_listing: TestListing,
+        list_action: Option<ListAction>,
+    ) -> Self {
+        let expected_job_count = test_listing.expected_job_count(&filter);
+
+        Self {
+            cargo,
+            packages,
+            filter,
+            stderr: Mutex::new(stderr),
+            stderr_color,
+            tracker: Arc::new(JobStatusTracker::default()),
+            jobs_queued: AtomicU64::new(0),
+            test_metadata,
+            expected_job_count,
+            test_listing: Mutex::new(test_listing),
+            list_action,
+        }
+    }
+}
+
+type StringIter = <Vec<String> as IntoIterator>::IntoIter;
+
+/// Enqueues test cases as jobs in the given client from the given `CargoArtifact`
+///
+/// This object is like an iterator: it maintains a position in the test listing and enqueues the
+/// next thing when asked.
+///
+/// This object is stored inside `JobQueuing` and is used to keep track of which artifact it is
+/// currently enqueuing from.
+struct ArtifactQueuing<'a, StdErrT, ProgressIndicatorT> { + queuing_deps: &'a JobQueuingDeps, + client: &'a Mutex, + width: usize, + ind: ProgressIndicatorT, + artifact: CargoArtifact, + binary: PathBuf, + generated_artifacts: Option, + ignored_cases: HashSet, + package_name: String, + cases: StringIter, +} + +#[derive(Default)] +struct TestListingResult { + cases: Vec, + ignored_cases: HashSet, +} + +fn list_test_cases( + queuing_deps: &JobQueuingDeps, + ind: &ProgressIndicatorT, + artifact: &CargoArtifact, + package_name: &str, +) -> Result +where + ProgressIndicatorT: ProgressIndicator, +{ + let binary = PathBuf::from(artifact.executable.clone().unwrap()); + let ignored_cases: HashSet<_> = get_cases_from_binary(&binary, &Some("--ignored".into()))? + .into_iter() + .collect(); + + ind.update_enqueue_status(format!("processing {package_name}")); + let mut cases = get_cases_from_binary(&binary, &None)?; + + let mut listing = queuing_deps.test_listing.lock().unwrap(); + listing.add_cases(artifact, &cases[..]); + + cases.retain(|c| filter_case(artifact, c, &queuing_deps.filter)); + Ok(TestListingResult { + cases, + ignored_cases, + }) +} + +fn generate_artifacts( + client: &Mutex, + ind: &ProgressIndicatorT, + artifact: &CargoArtifact, + package_name: &str, +) -> Result +where + ProgressIndicatorT: ProgressIndicator, +{ + let binary = PathBuf::from(artifact.executable.clone().unwrap()); + ind.update_enqueue_status(format!("tar {package_name}")); + artifacts::add_generated_artifacts(client, &binary, ind) +} + +impl<'a, StdErrT, ProgressIndicatorT> ArtifactQueuing<'a, StdErrT, ProgressIndicatorT> +where + ProgressIndicatorT: ProgressIndicator, +{ + fn new( + queuing_deps: &'a JobQueuingDeps, + client: &'a Mutex, + width: usize, + ind: ProgressIndicatorT, + artifact: CargoArtifact, + package_name: String, + ) -> Result { + let binary = PathBuf::from(artifact.executable.clone().unwrap()); + + ind.update_enqueue_status(format!("processing {package_name}")); + let 
running_tests = queuing_deps.list_action.is_none(); + + let listing = list_test_cases(queuing_deps, &ind, &artifact, &package_name)?; + let generated_artifacts = running_tests + .then(|| generate_artifacts(client, &ind, &artifact, &package_name)) + .transpose()?; + + Ok(Self { + queuing_deps, + client, + width, + ind, + artifact, + binary, + generated_artifacts, + ignored_cases: listing.ignored_cases, + package_name, + cases: listing.cases.into_iter(), + }) + } + + fn calculate_job_layers( + &mut self, + test_metadata: &TestMetadata, + ) -> Result> { + let mut layers = test_metadata + .layers + .iter() + .map(|layer| { + self.client + .lock() + .unwrap() + .add_artifact(PathBuf::from(layer).as_path()) + }) + .collect::>>()?; + let artifacts = self.generated_artifacts.as_ref().unwrap(); + if test_metadata.include_shared_libraries() { + layers.push(artifacts.deps.clone()); + } + layers.push(artifacts.binary.clone()); + + Ok(NonEmpty::try_from(layers).unwrap()) + } + + fn queue_job_from_case(&mut self, case: &str) -> Result { + let case_str = format!("{} {case}", &self.package_name); + self.ind + .update_enqueue_status(format!("processing {case_str}")); + + if self.queuing_deps.list_action.is_some() { + self.ind.println(case_str); + return Ok(EnqueueResult::Listed); + } + + let image_lookup = |image: &str| { + let (image, version) = image.split_once(':').unwrap_or((image, "latest")); + let prog = self + .ind + .new_side_progress(format!("downloading image {image}")) + .unwrap_or_else(ProgressBar::hidden); + let mut client = self.client.lock().unwrap(); + let container_image_depot = client.container_image_depot_mut(); + let image = container_image_depot.get_container_image(image, version, prog)?; + Ok(ImageConfig { + layers: image.layers.clone(), + environment: image.env().cloned(), + working_directory: image.working_dir().map(From::from), + }) + }; + + let filter_context = pattern::Context { + package: self.package_name.clone(), + artifact: 
Some(pattern::Artifact::from_target(&self.artifact.target)), + case: Some(pattern::Case { name: case.into() }), + }; + + let test_metadata = self + .queuing_deps + .test_metadata + .get_metadata_for_test_with_env(&filter_context, image_lookup)?; + let layers = self.calculate_job_layers(&test_metadata)?; + + // N.B. Must do this before we enqueue the job, but after we know we can't fail + let count = self.queuing_deps.jobs_queued.fetch_add(1, Ordering::AcqRel); + self.ind.update_length(std::cmp::max( + self.queuing_deps.expected_job_count, + count + 1, + )); + + let visitor = JobStatusVisitor::new( + self.queuing_deps.tracker.clone(), + case_str, + self.width, + self.ind.clone(), + ); + + if self.ignored_cases.contains(case) { + visitor.job_ignored(); + return Ok(EnqueueResult::Ignored); + } + + let binary_name = self.binary.file_name().unwrap().to_str().unwrap(); + self.client.lock().unwrap().add_job( + JobSpec { + program: format!("/{binary_name}").into(), + arguments: vec!["--exact".into(), "--nocapture".into(), case.into()], + environment: test_metadata.environment(), + layers, + devices: test_metadata.devices, + mounts: test_metadata.mounts, + enable_loopback: test_metadata.enable_loopback, + enable_writable_file_system: test_metadata.enable_writable_file_system, + working_directory: test_metadata.working_directory, + user: test_metadata.user, + group: test_metadata.group, + }, + Box::new(move |cjid, result| visitor.job_finished(cjid, result)), + ); + + Ok(EnqueueResult::Enqueued { + package_name: self.package_name.clone(), + case: case.into(), + }) + } + + /// Attempt to enqueue the next test as a job in the client + /// + /// Returns an `EnqueueResult` describing what happened. Meant to be called until it returns + /// `EnqueueResult::Done` + fn enqueue_one(&mut self) -> Result { + let Some(case) = self.cases.next() else { + return Ok(EnqueueResult::Done); + }; + self.queue_job_from_case(&case) + } +} + +/// Enqueues tests as jobs in the given client. 
+///
+/// This object is like an iterator: it maintains a position in the test listing and enqueues the
+/// next thing when asked.
+struct JobQueuing<'a, StdErrT, ProgressIndicatorT> {
+    queuing_deps: &'a JobQueuingDeps<StdErrT>,
+    client: &'a Mutex<Client>,
+    width: usize,
+    ind: ProgressIndicatorT,
+    cargo_build: Option<CargoBuild>,
+    package_match: bool,
+    artifacts: TestArtifactStream,
+    artifact_queuing: Option<ArtifactQueuing<'a, StdErrT, ProgressIndicatorT>>,
+}
+
+impl<'a, StdErrT, ProgressIndicatorT: ProgressIndicator> JobQueuing<'a, StdErrT, ProgressIndicatorT>
+where
+    ProgressIndicatorT: ProgressIndicator,
+    StdErrT: io::Write,
+{
+    fn new(
+        queuing_deps: &'a JobQueuingDeps<StdErrT>,
+        client: &'a Mutex<Client>,
+        width: usize,
+        ind: ProgressIndicatorT,
+    ) -> Result<Self> {
+        let package_names: Vec<_> = queuing_deps
+            .packages
+            .iter()
+            .map(|p| p.name.clone())
+            .collect();
+
+        let building_tests = !package_names.is_empty()
+            && matches!(queuing_deps.list_action, None | Some(ListAction::ListTests));
+        let mut cargo_build = building_tests
+            .then(|| {
+                CargoBuild::new(
+                    &queuing_deps.cargo,
+                    queuing_deps.stderr_color,
+                    package_names,
+                )
+            })
+            .transpose()?;
+
+        Ok(Self {
+            queuing_deps,
+            client,
+            width,
+            ind,
+            package_match: false,
+            artifacts: cargo_build
+                .as_mut()
+                .map(|c| c.artifact_stream())
+                .unwrap_or_default(),
+            artifact_queuing: None,
+            cargo_build,
+        })
+    }
+
+    fn start_queuing_from_artifact(&mut self) -> Result<bool> {
+        self.ind.update_enqueue_status("building artifacts...");
+
+        let Some(artifact) = self.artifacts.next() else {
+            return Ok(false);
+        };
+        let artifact = artifact?;
+
+        let package_name = artifact.package_id.repr.split(' ').next().unwrap().into();
+        self.artifact_queuing = Some(ArtifactQueuing::new(
+            self.queuing_deps,
+            self.client,
+            self.width,
+            self.ind.clone(),
+            artifact,
+            package_name,
+        )?);
+
+        Ok(true)
+    }
+
+    /// Meant to be called when the user has enqueued all the jobs they want.
Checks for deferred
+    /// errors from cargo or elsewhere.
+    fn finish(&mut self) -> Result<()> {
+        if let Some(cb) = self.cargo_build.take() {
+            cb.check_status(&mut *self.queuing_deps.stderr.lock().unwrap())?;
+        }
+
+        Ok(())
+    }
+
+    /// Attempt to enqueue the next test as a job in the client
+    ///
+    /// Returns an `EnqueueResult` describing what happened. Meant to be called until it returns
+    /// `EnqueueResult::Done`
+    fn enqueue_one(&mut self) -> Result<EnqueueResult> {
+        if self.artifact_queuing.is_none() && !self.start_queuing_from_artifact()? {
+            self.finish()?;
+            return Ok(EnqueueResult::Done);
+        }
+        self.package_match = true;
+
+        let res = self.artifact_queuing.as_mut().unwrap().enqueue_one()?;
+        if res.is_done() {
+            self.artifact_queuing = None;
+            return self.enqueue_one();
+        }
+
+        Ok(res)
+    }
+}
+
+/// A collection of objects that are used to run the MainApp. This is useful as a separate object
+/// since it can contain things which live longer than scoped threads and thus can be shared among
+/// them.
+pub struct MainAppDeps<StdErrT> {
+    pub client: Mutex<Client>,
+    queuing_deps: JobQueuingDeps<StdErrT>,
+    cache_dir: PathBuf,
+}
+
+impl<StdErrT> MainAppDeps<StdErrT> {
+    /// Creates a new `MainAppDeps`
+    ///
+    /// `cargo`: the command to run when invoking cargo
+    /// `include_filter`: tests which match any of the patterns in this filter are run
+    /// `exclude_filter`: tests which match any of the patterns in this filter are not run
+    /// `list_action`: if some, tests aren't run, instead tests or other things are listed
+    /// `stderr`: is written to for error output
+    /// `stderr_color`: should terminal color codes be written to `stderr` or not
+    /// `workspace_root`: the path to the root of the workspace
+    /// `workspace_packages`: a listing of the packages in the workspace
+    /// `broker_addr`: the network address of the broker which we connect to
+    /// `client_driver`: an object which drives the background work of the `Client`
+    #[allow(clippy::too_many_arguments)]
+    pub fn new(
+        cargo: String,
+        include_filter: Vec<String>,
exclude_filter: Vec, + list_action: Option, + stderr: StdErrT, + stderr_color: bool, + workspace_root: &impl AsRef, + workspace_packages: &[&CargoPackage], + broker_addr: BrokerAddr, + client_driver: impl ClientDriver + Send + Sync + 'static, + ) -> Result { + let cache_dir = workspace_root.as_ref().join("target"); + let client = Mutex::new(Client::new( + client_driver, + broker_addr, + workspace_root, + cache_dir.clone(), + )?); + let test_metadata = AllMetadata::load(workspace_root)?; + let mut test_listing = + load_test_listing(&cache_dir.join(LAST_TEST_LISTING_NAME))?.unwrap_or_default(); + test_listing.retain_packages(workspace_packages); + + let filter = pattern::compile_filter(&include_filter, &exclude_filter)?; + let selected_packages = workspace_packages + .iter() + .filter(|p| filter_package(p, &filter)) + .map(|&p| p.clone()) + .collect(); + + Ok(Self { + client, + queuing_deps: JobQueuingDeps::new( + cargo, + selected_packages, + filter, + stderr, + stderr_color, + test_metadata, + test_listing, + list_action, + ), + cache_dir, + }) + } +} + +/// The `MainApp` enqueues tests as jobs. With each attempted job enqueued this object is returned +/// and describes what happened. +pub enum EnqueueResult { + /// A job successfully enqueued with the following information + Enqueued { package_name: String, case: String }, + /// No job was enqueued, instead the test that would have been enqueued has been ignored + /// because it has been marked as `#[ignored]` + Ignored, + /// No job was enqueued, we have run out of tests to run + Done, + /// No job was enqueued, we listed the test case instead + Listed, +} + +impl EnqueueResult { + /// Is this `EnqueueResult` the `Done` variant + pub fn is_done(&self) -> bool { + matches!(self, Self::Done) + } + + /// Is this `EnqueueResult` the `Ignored` variant + pub fn is_ignored(&self) -> bool { + matches!(self, Self::Ignored) + } +} + +/// This is the public API for the MainApp +/// +/// N.B. 
This API is a trait only for type-erasure purposes
+pub trait MainApp {
+    /// Enqueue one test as a job on the `Client`. This is meant to be called repeatedly until
+    /// `EnqueueResult::Done` is returned, or an error is encountered.
+    fn enqueue_one(&mut self) -> Result<EnqueueResult>;
+
+    /// Indicates that we have finished enqueuing jobs and starts tearing things down
+    fn drain(&mut self) -> Result<()>;
+
+    /// Waits for all outstanding jobs to finish, displays a summary, and obtains an `ExitCode`
+    fn finish(&mut self) -> Result<ExitCode>;
+}
+
+struct MainAppImpl<'deps, StdErrT, TermT, ProgressIndicatorT, ProgressDriverT> {
+    deps: &'deps MainAppDeps<StdErrT>,
+    queuing: JobQueuing<'deps, StdErrT, ProgressIndicatorT>,
+    prog_driver: ProgressDriverT,
+    prog: ProgressIndicatorT,
+    term: TermT,
+}
+
+impl<'deps, StdErrT, TermT, ProgressIndicatorT, ProgressDriverT>
+    MainAppImpl<'deps, StdErrT, TermT, ProgressIndicatorT, ProgressDriverT>
+{
+    fn new(
+        deps: &'deps MainAppDeps<StdErrT>,
+        queuing: JobQueuing<'deps, StdErrT, ProgressIndicatorT>,
+        prog_driver: ProgressDriverT,
+        prog: ProgressIndicatorT,
+        term: TermT,
+    ) -> Self {
+        Self {
+            deps,
+            queuing,
+            prog_driver,
+            prog,
+            term,
+        }
+    }
+}
+
+impl<'deps, 'scope, StdErrT, TermT, ProgressIndicatorT, ProgressDriverT> MainApp
+    for MainAppImpl<'deps, StdErrT, TermT, ProgressIndicatorT, ProgressDriverT>
+where
+    StdErrT: io::Write + Send,
+    ProgressIndicatorT: ProgressIndicator,
+    TermT: TermLike + Clone + 'static,
+    ProgressDriverT: ProgressDriver<'scope>,
+{
+    fn enqueue_one(&mut self) -> Result<EnqueueResult> {
+        self.queuing.enqueue_one()
+    }
+
+    fn drain(&mut self) -> Result<()> {
+        self.prog
+            .update_length(self.deps.queuing_deps.jobs_queued.load(Ordering::Acquire));
+        self.prog.done_queuing_jobs();
+        self.prog_driver.stop()?;
+        self.deps.client.lock().unwrap().stop_accepting()?;
+        Ok(())
+    }
+
+    fn finish(&mut self) -> Result<ExitCode> {
+        self.deps
+            .client
+            .lock()
+            .unwrap()
+            .wait_for_outstanding_jobs()?;
+        self.prog.finished()?;
+
+        if
self.deps.queuing_deps.list_action.is_none() {
+            let width = self.term.width() as usize;
+            self.deps
+                .queuing_deps
+                .tracker
+                .print_summary(width, self.term.clone())?;
+        }
+
+        write_test_listing(
+            &self.deps.cache_dir.join(LAST_TEST_LISTING_NAME),
+            &self.deps.queuing_deps.test_listing.lock().unwrap(),
+        )?;
+
+        Ok(self.deps.queuing_deps.tracker.exit_code())
+    }
+}
+
+fn list_packages<ProgressIndicatorT>(ind: &ProgressIndicatorT, packages: &[CargoPackage])
+where
+    ProgressIndicatorT: ProgressIndicator,
+{
+    for pkg in packages {
+        ind.println(format!("package {}", &pkg.name));
+    }
+}
+
+fn list_binaries<ProgressIndicatorT>(ind: &ProgressIndicatorT, packages: &[CargoPackage])
+where
+    ProgressIndicatorT: ProgressIndicator,
+{
+    for pkg in packages {
+        for tgt in &pkg.targets {
+            if tgt.test {
+                let pkg_kind = pattern::ArtifactKind::from_target(tgt);
+                let mut binary_name = String::new();
+                if tgt.name != pkg.name {
+                    binary_name += " ";
+                    binary_name += &tgt.name;
+                }
+                ind.println(format!(
+                    "binary {}{} ({})",
+                    &pkg.name, binary_name, pkg_kind
+                ));
+            }
+        }
+    }
+}
+
+fn new_helper<'deps, 'scope, StdErrT, ProgressIndicatorT, TermT>(
+    deps: &'deps MainAppDeps<StdErrT>,
+    prog_factory: impl FnOnce(TermT) -> ProgressIndicatorT,
+    term: TermT,
+    mut prog_driver: impl ProgressDriver<'scope> + 'scope,
+) -> Result<Box<dyn MainApp + 'scope>>
+where
+    StdErrT: io::Write + Send,
+    ProgressIndicatorT: ProgressIndicator,
+    TermT: TermLike + Clone + 'static,
+    'deps: 'scope,
+{
+    let width = term.width() as usize;
+    let prog = prog_factory(term.clone());
+
+    prog_driver.drive(&deps.client, prog.clone());
+    prog.update_length(deps.queuing_deps.expected_job_count);
+
+    match deps.queuing_deps.list_action {
+        Some(ListAction::ListPackages) => list_packages(&prog, &deps.queuing_deps.packages),
+
+        Some(ListAction::ListBinaries) => list_binaries(&prog, &deps.queuing_deps.packages),
+        _ => {}
+    }
+
+    let queuing = JobQueuing::new(&deps.queuing_deps, &deps.client, width, prog.clone())?;
+    Ok(Box::new(MainAppImpl::new(
+        deps,
+        queuing, +
prog_driver,
+        prog,
+        term,
+    )))
+}
+
+/// Construct a `MainApp`
+///
+/// `deps`: a collection of dependencies
+/// `stdout_tty`: should terminal color codes be printed to stdout (provided via `term`)
+/// `quiet`: indicates whether quiet mode should be used or not
+/// `term`: represents the terminal
+/// `driver`: drives the background work needed for updating the progress bars
+pub fn main_app_new<'deps, 'scope, TermT, StdErrT>(
+    deps: &'deps MainAppDeps<StdErrT>,
+    stdout_tty: bool,
+    quiet: Quiet,
+    term: TermT,
+    driver: impl ProgressDriver<'scope> + 'scope,
+) -> Result<Box<dyn MainApp + 'scope>>
+where
+    StdErrT: io::Write + Send,
+    TermT: TermLike + Clone + Send + Sync + 'static,
+    'deps: 'scope,
+{
+    if deps.queuing_deps.list_action.is_some() {
+        return if stdout_tty {
+            Ok(new_helper(deps, TestListingProgress::new, term, driver)?)
+        } else {
+            Ok(new_helper(
+                deps,
+                TestListingProgressNoSpinner::new,
+                term,
+                driver,
+            )?)
+        };
+    }
+
+    match (stdout_tty, quiet.into_inner()) {
+        (true, true) => Ok(new_helper(deps, QuietProgressBar::new, term, driver)?),
+        (true, false) => Ok(new_helper(deps, MultipleProgressBars::new, term, driver)?),
+        (false, true) => Ok(new_helper(deps, QuietNoBar::new, term, driver)?),
+        (false, false) => Ok(new_helper(deps, NoBar::new, term, driver)?),
+    }
+}
diff --git a/crates/cargo-metest/src/main.rs b/crates/cargo-metest/src/main.rs
new file mode 100644
index 00000000..8bb4df5f
--- /dev/null
+++ b/crates/cargo-metest/src/main.rs
@@ -0,0 +1,244 @@
+use anyhow::{Context as _, Result};
+use cargo_metadata::Metadata as CargoMetadata;
+use cargo_metest::{
+    config::{Config, ConfigOptions, RunConfigOptions},
+    main_app_new,
+    progress::DefaultProgressDriver,
+    ListAction, MainAppDeps,
+};
+use clap::{Args, Parser, Subcommand};
+use console::Term;
+use figment::{
+    error::Kind,
+    providers::{Env, Format, Serialized, Toml},
+    Figment,
+};
+use maelstrom_client::DefaultClientDriver;
+use maelstrom_util::process::ExitCode;
+use std::{
+    env,
+    io::IsTerminal as _, +
path::{Path, PathBuf},
+    process::Command,
+};
+
+/// The meticulous client. This process sends work to the broker to be executed by workers.
+#[derive(Parser, Debug)]
+#[command(version)]
+#[command(bin_name = "cargo metest")]
+struct CliOptions {
+    /// Configuration file. Values set in the configuration file will be overridden by values set
+    /// through environment variables and values set on the command line. If not set, the file
+    /// .config/cargo-metest.toml in the workspace root will be used, if it exists.
+    #[arg(short = 'c', long)]
+    config_file: Option<PathBuf>,
+
+    /// Socket address of broker. Examples: 127.0.0.1:5000, host.example.com:2000.
+    #[arg(short = 'b', long)]
+    broker: Option<String>,
+
+    #[command(subcommand)]
+    command: CliCommand,
+}
+
+#[derive(Debug, Subcommand)]
+enum CliCommand {
+    /// Run tests
+    Run(CliRun),
+
+    /// List tests, binaries, or packages
+    List(CliList),
+}
+
+#[derive(Args, Debug)]
+struct CliRun {
+    /// Print configuration and exit
+    #[arg(short = 'P', long)]
+    print_config: bool,
+
+    /// Don't output information about the tests being run
+    #[arg(short = 'q', long)]
+    quiet: bool,
+
+    /// Only run tests which match the given filter. Can be specified multiple times
+    #[arg(
+        short = 'i',
+        long,
+        value_name = "FILTER_EXPRESSION",
+        default_value = "all"
+    )]
+    include: Vec<String>,
+
+    /// Only run tests which don't match the given filter. Can be specified multiple times
+    #[arg(short = 'x', long, value_name = "FILTER_EXPRESSION")]
+    exclude: Vec<String>,
+}
+
+#[derive(Args, Debug)]
+struct CliList {
+    #[command(subcommand)]
+    what: Option<CliListType>,
+
+    /// Print configuration and exit
+    #[arg(short = 'P', long)]
+    print_config: bool,
+
+    /// Only list artifacts which match the given filter. Can be specified multiple times
+    #[arg(
+        short = 'i',
+        long,
+        value_name = "FILTER_EXPRESSION",
+        default_value = "all"
+    )]
+    include: Vec<String>,
+
+    /// Only list artifacts which don't match the given filter.
Can be specified multiple times
+    #[arg(short = 'x', long, value_name = "FILTER_EXPRESSION")]
+    exclude: Vec<String>,
+}
+
+#[derive(Debug, Subcommand)]
+enum CliListType {
+    /// Only list the tests that would be run, don't actually run them. This is the default
+    Tests,
+
+    /// Only list the binaries that would be built, don't actually build them or run tests
+    Binaries,
+
+    /// Only list the packages that exist, don't build anything or run any tests
+    Packages,
+}
+
+fn config(config_file: impl AsRef<Path>, cli_options: ConfigOptions) -> Result<Config> {
+    Figment::new()
+        .merge(Serialized::defaults(ConfigOptions::default()))
+        .merge(Toml::file(config_file))
+        .merge(Env::prefixed("CARGO_METEST_"))
+        .merge(Serialized::globals(cli_options))
+        .extract()
+        .map_err(|mut e| {
+            if let Kind::MissingField(field) = &e.kind {
+                e.kind = Kind::Message(format!("configuration value \"{field}\" was not provided"));
+                e
+            } else {
+                e
+            }
+        })
+        .context("reading configuration")
+}
+
+/// The main function for the client. This should be called on a task of its own. It will return
+/// when a signal is received or when all work has been processed by the broker.
+pub fn main() -> Result<ExitCode> {
+    let mut args = Vec::from_iter(env::args());
+    if args.len() > 1 && args[0].ends_with(format!("cargo-{}", args[1]).as_str()) {
+        args.remove(1);
+    }
+    let cli_options = CliOptions::parse_from(args);
+
+    let cargo_metadata = Command::new("cargo")
+        .args(["metadata", "--format-version=1"])
+        .output()
+        .context("getting cargo metadata")?;
+    let cargo_metadata: CargoMetadata =
+        serde_json::from_slice(&cargo_metadata.stdout).context("parsing cargo metadata")?;
+
+    let config_file = match &cli_options.config_file {
+        Some(path) => {
+            if !path.exists() {
+                eprintln!("warning: config file {} not found", path.display());
+            }
+            path.clone()
+        }
+        None => cargo_metadata
+            .workspace_root
+            .join(".config")
+            .join("cargo-metest.toml")
+            .into(),
+    };
+
+    let (config, include, exclude, list_action) = match cli_options.command {
+        CliCommand::List(CliList {
+            what,
+            include,
+            exclude,
+            print_config,
+        }) => {
+            let config = config(
+                config_file,
+                ConfigOptions {
+                    broker: cli_options.broker,
+                    run: RunConfigOptions { quiet: None },
+                },
+            )?;
+            if print_config {
+                println!("{config:#?}");
+                return Ok(ExitCode::SUCCESS);
+            }
+            (
+                config,
+                include,
+                exclude,
+                Some(match what {
+                    None | Some(CliListType::Tests) => ListAction::ListTests,
+                    Some(CliListType::Binaries) => ListAction::ListBinaries,
+                    Some(CliListType::Packages) => ListAction::ListPackages,
+                }),
+            )
+        }
+        CliCommand::Run(CliRun {
+            include,
+            exclude,
+            print_config,
+            quiet,
+        }) => {
+            let config = config(
+                config_file,
+                ConfigOptions {
+                    broker: cli_options.broker,
+                    run: RunConfigOptions {
+                        quiet: quiet.then_some(true),
+                    },
+                },
+            )?;
+            if print_config {
+                println!("{config:#?}");
+                return Ok(ExitCode::SUCCESS);
+            }
+            (config, include, exclude, None)
+        }
+    };
+
+    let deps = MainAppDeps::new(
+        "cargo".into(),
+        include,
+        exclude,
+        list_action,
+        std::io::stderr(),
+        std::io::stderr().is_terminal(),
+        &cargo_metadata.workspace_root, +
&cargo_metadata.workspace_packages(),
+        config.broker,
+        DefaultClientDriver::default(),
+    )?;
+
+    let stdout_tty = std::io::stdout().is_terminal();
+    std::thread::scope(|scope| {
+        let mut app = main_app_new(
+            &deps,
+            stdout_tty,
+            config.run.quiet,
+            Term::buffered_stdout(),
+            DefaultProgressDriver::new(scope),
+        )?;
+        while !app.enqueue_one()?.is_done() {}
+        app.drain()?;
+        app.finish()
+    })
+}
+
+#[test]
+fn test_cli() {
+    use clap::CommandFactory;
+    CliOptions::command().debug_assert()
+}
diff --git a/crates/cargo-metest/src/metadata.rs b/crates/cargo-metest/src/metadata.rs
new file mode 100644
index 00000000..4515dec7
--- /dev/null
+++ b/crates/cargo-metest/src/metadata.rs
@@ -0,0 +1,1021 @@
+mod directive;
+
+use crate::pattern;
+use anyhow::{Context as _, Error, Result};
+use directive::TestDirective;
+use maelstrom_base::Utf8PathBuf;
+use maelstrom_base::{EnumSet, GroupId, JobDevice, JobMount, UserId};
+use maelstrom_client::spec::{self, substitute, ImageConfig, ImageOption, PossiblyImage};
+use maelstrom_util::fs::Fs;
+use serde::Deserialize;
+use std::{collections::BTreeMap, path::Path, str};
+
+#[derive(PartialEq, Eq, Debug, Deserialize, Default)]
+#[serde(deny_unknown_fields)]
+pub struct AllMetadata {
+    directives: Vec<TestDirective>,
+}
+
+#[derive(Debug, Eq, PartialEq)]
+pub struct TestMetadata {
+    include_shared_libraries: Option<bool>,
+    pub enable_loopback: bool,
+    pub enable_writable_file_system: bool,
+    pub working_directory: Utf8PathBuf,
+    pub user: UserId,
+    pub group: GroupId,
+    pub layers: Vec<String>,
+    environment: BTreeMap<String, String>,
+    pub mounts: Vec<JobMount>,
+    pub devices: EnumSet<JobDevice>,
+}
+
+impl Default for TestMetadata {
+    fn default() -> Self {
+        Self {
+            include_shared_libraries: Default::default(),
+            enable_loopback: Default::default(),
+            enable_writable_file_system: Default::default(),
+            working_directory: Utf8PathBuf::from("/"),
+            user: UserId::from(0),
+            group: GroupId::from(0),
+            layers: Default::default(),
+            environment: Default::default(),
+            mounts: Default::default(), +
devices: Default::default(),
+        }
+    }
+}
+
+impl TestMetadata {
+    /// Return whether to include a layer of shared library dependencies.
+    ///
+    /// The logic here is that if they explicitly set the value to something, we should return
+    /// that. Otherwise, we should see if they set any layers. If they explicitly added layers,
+    /// they probably don't want us pushing shared libraries on those layers.
+    pub fn include_shared_libraries(&self) -> bool {
+        match self.include_shared_libraries {
+            Some(val) => val,
+            None => self.layers.is_empty(),
+        }
+    }
+
+    pub fn environment(&self) -> Vec<String> {
+        self.environment
+            .iter()
+            .map(|(k, v)| format!("{k}={v}"))
+            .collect()
+    }
+
+    fn try_fold(
+        mut self,
+        directive: &TestDirective,
+        env_lookup: impl Fn(&str) -> Result<Option<String>>,
+        image_lookup: impl FnMut(&str) -> Result<ImageConfig>,
+    ) -> Result<Self> {
+        let image = ImageOption::new(&directive.image, image_lookup)?;
+
+        if directive.include_shared_libraries.is_some() {
+            self.include_shared_libraries = directive.include_shared_libraries;
+        }
+
+        if let Some(enable_loopback) = directive.enable_loopback {
+            self.enable_loopback = enable_loopback;
+        }
+
+        if let Some(enable_writable_file_system) = directive.enable_writable_file_system {
+            self.enable_writable_file_system = enable_writable_file_system;
+        }
+
+        match &directive.working_directory {
+            Some(PossiblyImage::Explicit(working_directory)) => {
+                self.working_directory = working_directory.clone();
+            }
+            Some(PossiblyImage::Image) => {
+                self.working_directory = image.working_directory()?;
+            }
+            None => {}
+        }
+
+        if let Some(user) = directive.user {
+            self.user = user;
+        }
+
+        if let Some(group) = directive.group {
+            self.group = group;
+        }
+
+        match &directive.layers {
+            Some(PossiblyImage::Explicit(layers)) => {
+                self.layers = layers.to_vec();
+            }
+            Some(PossiblyImage::Image) => {
+                self.layers = image.layers()?.collect();
+            }
+            None => {}
+        }
+        self.layers.extend(directive.added_layers.iter().cloned());
+
+        fn substitute_environment( +
env_lookup: impl Fn(&str) -> Result<Option<String>>,
+            prev: &BTreeMap<String, String>,
+            new: &BTreeMap<String, String>,
+        ) -> Result<Vec<(String, String)>> {
+            new.iter()
+                .map(|(k, v)| {
+                    substitute::substitute(v, &env_lookup, |var| prev.get(var).map(String::as_str))
+                        .map(|v| (k.clone(), String::from(v)))
+                        .map_err(Error::new)
+                })
+                .collect()
+        }
+
+        match &directive.environment {
+            Some(PossiblyImage::Explicit(environment)) => {
+                self.environment =
+                    substitute_environment(&env_lookup, &self.environment, environment)?
+                        .into_iter()
+                        .collect();
+            }
+            Some(PossiblyImage::Image) => {
+                self.environment = image.environment()?;
+            }
+            None => {}
+        }
+        self.environment.extend(substitute_environment(
+            &env_lookup,
+            &self.environment,
+            &directive.added_environment,
+        )?);
+
+        if let Some(mounts) = &directive.mounts {
+            self.mounts = mounts.to_vec();
+        }
+        self.mounts.extend(directive.added_mounts.iter().cloned());
+
+        if let Some(devices) = directive.devices {
+            self.devices = devices;
+        }
+        self.devices = self.devices.union(directive.added_devices);
+
+        Ok(self)
+    }
+}
+
+fn pattern_match(filter: &pattern::Pattern, context: &pattern::Context) -> bool {
+    pattern::interpret_pattern(filter, context).expect("context should have case")
+}
+
+impl AllMetadata {
+    fn get_metadata_for_test(
+        &self,
+        context: &pattern::Context,
+        env_lookup: impl Fn(&str) -> Result<Option<String>>,
+        mut image_lookup: impl FnMut(&str) -> Result<ImageConfig>,
+    ) -> Result<TestMetadata> {
+        self.directives
+            .iter()
+            .filter(|directive| match directive {
+                TestDirective {
+                    filter: Some(filter),
+                    ..
+                } => pattern_match(filter, context),
+                TestDirective { filter: None, .. } => true,
+            })
+            .try_fold(TestMetadata::default(), |m, d| {
+                m.try_fold(d, &env_lookup, &mut image_lookup)
+            })
+    }
+
+    pub fn get_metadata_for_test_with_env(
+        &self,
+        context: &pattern::Context,
+        image_lookup: impl FnMut(&str) -> Result<ImageConfig>,
+    ) -> Result<TestMetadata> {
+        self.get_metadata_for_test(context, spec::std_env_lookup, image_lookup)
+    }
+
+    fn from_str(contents: &str) -> Result<Self> {
+        Ok(toml::from_str(contents)?)
+    }
+
+    pub fn load(workspace_root: &impl AsRef<Path>) -> Result<Self> {
+        let path = workspace_root.as_ref().join("maelstrom-test.toml");
+
+        Ok(Fs::new()
+            .read_to_string_if_exists(&path)?
+            .map(|c| Self::from_str(&c).with_context(|| format!("parsing {}", path.display())))
+            .transpose()?
+            .unwrap_or_default())
+    }
+}
+
+#[cfg(test)]
+mod test {
+    use super::*;
+    use maelstrom_base::{enum_set, JobMountFsType};
+    use maelstrom_test::{path_buf_vec, string, string_vec, utf8_path_buf};
+    use toml::de::Error as TomlError;
+
+    fn test_ctx(package: &str, test: &str) -> pattern::Context {
+        pattern::Context {
+            package: package.into(),
+            artifact: Some(pattern::Artifact {
+                name: package.into(),
+                kind: pattern::ArtifactKind::Library,
+            }),
+            case: Some(pattern::Case { name: test.into() }),
+        }
+    }
+
+    fn empty_env(_: &str) -> Result<Option<String>> {
+        Ok(None)
+    }
+
+    fn no_containers(_: &str) -> Result<ImageConfig> {
+        panic!()
+    }
+
+    #[test]
+    fn default() {
+        assert_eq!(
+            AllMetadata { directives: vec![] }
+                .get_metadata_for_test(&test_ctx("mod", "foo"), empty_env, no_containers)
+                .unwrap(),
+            TestMetadata::default(),
+        );
+    }
+
+    #[test]
+    fn include_shared_libraries_defaults() {
+        let all = AllMetadata::from_str(
+            r#"
+            [[directives]]
+            filter = "package.equals(package1)"
+            layers = ["layer1"]
+
+            [[directives]]
+            filter = "package.equals(package1) && name.equals(test1)"
+            layers = []
+            "#,
+        )
+        .unwrap();
+        assert_eq!(
+            all.get_metadata_for_test(&test_ctx("package1", "test1"), empty_env, no_containers)
+                .unwrap()
+                .include_shared_libraries(),
+            true
+        );
+        assert_eq!(
+            all.get_metadata_for_test(&test_ctx("package1", "test2"), empty_env, no_containers)
+                .unwrap()
+                .include_shared_libraries(),
+            false
+        );
+        assert_eq!(
+            all.get_metadata_for_test(&test_ctx("package2", "test1"), empty_env, no_containers)
+                .unwrap()
+                .include_shared_libraries(),
+            true
+        );
+    }
+
+    #[test]
+    fn include_shared_libraries() {
+        let all = AllMetadata::from_str(
+            r#"
+            [[directives]] +
include_shared_libraries = false + + [[directives]] + filter = "package.equals(package1)" + include_shared_libraries = true + layers = ["layer1"] + + [[directives]] + filter = "package.equals(package1) && name.equals(test1)" + layers = [] + "#, + ) + .unwrap(); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test1"), empty_env, no_containers) + .unwrap() + .include_shared_libraries(), + true + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test2"), empty_env, no_containers) + .unwrap() + .include_shared_libraries(), + true + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package2", "test1"), empty_env, no_containers) + .unwrap() + .include_shared_libraries(), + false + ); + } + + #[test] + fn enable_loopback() { + let all = AllMetadata::from_str( + r#" + [[directives]] + filter = "package.equals(package1)" + enable_loopback = true + + [[directives]] + filter = "package.equals(package1) && name.equals(test1)" + enable_loopback = false + "#, + ) + .unwrap(); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test1"), empty_env, no_containers) + .unwrap() + .enable_loopback, + false + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test2"), empty_env, no_containers) + .unwrap() + .enable_loopback, + true + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package2", "test1"), empty_env, no_containers) + .unwrap() + .enable_loopback, + false + ); + } + + #[test] + fn enable_writable_file_system() { + let all = AllMetadata::from_str( + r#" + [[directives]] + filter = "package.equals(package1)" + enable_writable_file_system = true + + [[directives]] + filter = "package.equals(package1) && name.equals(test1)" + enable_writable_file_system = false + "#, + ) + .unwrap(); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test1"), empty_env, no_containers) + .unwrap() + .enable_writable_file_system, + false + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", 
"test2"), empty_env, no_containers) + .unwrap() + .enable_writable_file_system, + true + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package2", "test1"), empty_env, no_containers) + .unwrap() + .enable_writable_file_system, + false + ); + } + + #[test] + fn working_directory() { + let image_lookup = |name: &_| match name { + "rust" => Ok(ImageConfig { + working_directory: Some(utf8_path_buf!("/foo")), + ..Default::default() + }), + "no-working-directory" => Ok(Default::default()), + _ => panic!(), + }; + let all = AllMetadata::from_str( + r#" + [[directives]] + include_shared_libraries = false + + [[directives]] + filter = "package.equals(package1)" + image.name = "rust" + image.use = ["working_directory"] + + [[directives]] + filter = "package.equals(package1) && name.equals(test1)" + working_directory = "/bar" + + [[directives]] + filter = "package.equals(package3)" + image.name = "no-working-directory" + image.use = ["working_directory"] + "#, + ) + .unwrap(); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test1"), empty_env, image_lookup) + .unwrap() + .working_directory, + utf8_path_buf!("/bar") + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test2"), empty_env, image_lookup) + .unwrap() + .working_directory, + utf8_path_buf!("/foo") + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package2", "test1"), empty_env, image_lookup) + .unwrap() + .working_directory, + utf8_path_buf!("/") + ); + } + + #[test] + fn user() { + let all = AllMetadata::from_str( + r#" + [[directives]] + filter = "package.equals(package1)" + user = 101 + + [[directives]] + filter = "package.equals(package1) && name.equals(test1)" + user = 202 + "#, + ) + .unwrap(); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test1"), empty_env, no_containers) + .unwrap() + .user, + UserId::from(202) + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test2"), empty_env, no_containers) + .unwrap() + .user, + 
UserId::from(101) + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package2", "test1"), empty_env, no_containers) + .unwrap() + .user, + UserId::from(0) + ); + } + + #[test] + fn group() { + let all = AllMetadata::from_str( + r#" + [[directives]] + filter = "package.equals(package1)" + group = 101 + + [[directives]] + filter = "package.equals(package1) && name.equals(test1)" + group = 202 + "#, + ) + .unwrap(); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test1"), empty_env, no_containers) + .unwrap() + .group, + GroupId::from(202) + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test2"), empty_env, no_containers) + .unwrap() + .group, + GroupId::from(101) + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package2", "test1"), empty_env, no_containers) + .unwrap() + .group, + GroupId::from(0) + ); + } + + #[test] + fn layers() { + let image_lookup = |name: &_| match name { + "image1" => Ok(ImageConfig { + layers: path_buf_vec!["layer11", "layer12"], + ..Default::default() + }), + "image2" => Ok(ImageConfig { + layers: path_buf_vec!["layer21", "layer22"], + ..Default::default() + }), + "empty-layers" => Ok(Default::default()), + _ => panic!(), + }; + let all = AllMetadata::from_str( + r#" + [[directives]] + layers = ["layer1", "layer2"] + + [[directives]] + filter = "package.equals(package1)" + image.name = "image2" + image.use = [ "layers" ] + + [[directives]] + filter = "package.equals(package1) && name.equals(test1)" + image.name = "image1" + image.use = [ "layers" ] + + [[directives]] + filter = "package.equals(package1) && name.equals(test2)" + layers = ["layer3", "layer4"] + + [[directives]] + filter = "package.equals(package1) && name.equals(test3)" + image.name = "empty-layers" + image.use = [ "layers" ] + "#, + ) + .unwrap(); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test1"), empty_env, image_lookup) + .unwrap() + .layers, + string_vec!["layer11", "layer12"], + ); + assert_eq!( 
+ all.get_metadata_for_test(&test_ctx("package1", "test2"), empty_env, image_lookup) + .unwrap() + .layers, + string_vec!["layer3", "layer4"], + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test3"), empty_env, image_lookup) + .unwrap() + .layers, + Vec::::default(), + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test4"), empty_env, image_lookup) + .unwrap() + .layers, + string_vec!["layer21", "layer22"], + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package2", "test1"), empty_env, image_lookup) + .unwrap() + .layers, + string_vec!["layer1", "layer2"], + ); + } + + #[test] + fn added_layers() { + let all = AllMetadata::from_str( + r#" + [[directives]] + added_layers = ["added-layer1", "added-layer2"] + + [[directives]] + filter = "package.equals(package1)" + layers = ["layer1", "layer2"] + added_layers = ["added-layer3", "added-layer4"] + + [[directives]] + filter = "package.equals(package1) && name.equals(test1)" + added_layers = ["added-layer5", "added-layer6"] + "#, + ) + .unwrap(); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test1"), empty_env, no_containers) + .unwrap() + .layers, + string_vec![ + "layer1", + "layer2", + "added-layer3", + "added-layer4", + "added-layer5", + "added-layer6", + ], + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test2"), empty_env, no_containers) + .unwrap() + .layers, + string_vec!["layer1", "layer2", "added-layer3", "added-layer4",], + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package2", "test1"), empty_env, no_containers) + .unwrap() + .layers, + string_vec!["added-layer1", "added-layer2"], + ); + } + + #[test] + fn environment() { + let env = |key: &_| { + Ok(Some(match key { + "FOO" => string!("env-foo"), + "BAR" => string!("env-bar"), + _ => panic!(), + })) + }; + let images = |name: &_| match name { + "image1" => Ok(ImageConfig { + environment: Some(vec![string!("FOO=image-foo"), string!("FROB=image-frob")]), + 
..Default::default() + }), + "no-environment" => Ok(Default::default()), + "bad-environment" => Ok(ImageConfig { + environment: Some(string_vec!["FOO"]), + ..Default::default() + }), + _ => panic!(), + }; + let all = AllMetadata::from_str( + r#" + [[directives]] + environment = { FOO = "$env{FOO}", BAR = "bar", BAZ = "$prev{FOO:-no-prev-foo}" } + + [[directives]] + filter = "package.equals(package1)" + image.name = "image1" + image.use = ["environment"] + + [[directives]] + filter = "package.equals(package1) && name.equals(test1)" + environment = { FOO = "$prev{FOO}", BAR = "$env{BAR}", BAZ = "$prev{BAZ:-no-prev-baz}" } + + [[directives]] + filter = "package.equals(package3)" + image.name = "no-environment" + image.use = ["environment"] + + [[directives]] + filter = "package.equals(package4)" + image.name = "bad-environment" + image.use = ["environment"] + "#, + ) + .unwrap(); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test1"), env, images) + .unwrap() + .environment(), + string_vec!["BAR=env-bar", "BAZ=no-prev-baz", "FOO=image-foo",], + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test2"), env, images) + .unwrap() + .environment(), + string_vec!["FOO=image-foo", "FROB=image-frob"], + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package2", "test1"), env, images) + .unwrap() + .environment(), + string_vec!["BAR=bar", "BAZ=no-prev-foo", "FOO=env-foo",], + ); + } + + #[test] + fn added_environment() { + let env = |key: &_| { + Ok(Some(match key { + "FOO" => string!("env-foo"), + "BAR" => string!("env-bar"), + _ => panic!(), + })) + }; + let images = |name: &_| match name { + "image1" => Ok(ImageConfig { + environment: Some(string_vec!["FOO=image-foo", "FROB=image-frob",]), + ..Default::default() + }), + _ => panic!(), + }; + let all = AllMetadata::from_str( + r#" + [[directives]] + environment = { FOO = "foo", BAR = "bar" } + added_environment = { FOO = "prev-$prev{FOO}", BAZ = "$prev{BAZ:-no-prev-baz}" } + + 
[[directives]] + filter = "package.equals(package1)" + image.name = "image1" + image.use = ["environment"] + added_environment = { FOO = "$prev{FOO}", BAZ = "$prev{BAZ:-no-prev-baz}" } + + [[directives]] + filter = "package.equals(package1) && name.equals(test1)" + added_environment = { FOO = "prev-$prev{FOO}", BAR = "bar" } + + [[directives]] + filter = "package.equals(package1) && name.equals(test2)" + environment = { FOO = "prev-$prev{FOO}" } + added_environment = { FOO = "prev-$prev{FOO}", BAR = "$env{BAR}" } + "#, + ) + .unwrap(); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test1"), env, images) + .unwrap() + .environment(), + string_vec![ + "BAR=bar", + "BAZ=no-prev-baz", + "FOO=prev-image-foo", + "FROB=image-frob", + ], + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test2"), env, images) + .unwrap() + .environment(), + string_vec!["BAR=env-bar", "FOO=prev-prev-image-foo",], + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test3"), env, images) + .unwrap() + .environment(), + string_vec!["BAZ=no-prev-baz", "FOO=image-foo", "FROB=image-frob",], + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package2", "test1"), env, images) + .unwrap() + .environment(), + string_vec!["BAR=bar", "BAZ=no-prev-baz", "FOO=prev-foo",], + ); + } + + #[test] + fn mounts() { + let all = AllMetadata::from_str( + r#" + [[directives]] + filter = "package.equals(package1)" + mounts = [ { fs_type = "proc", mount_point = "/proc" } ] + + [[directives]] + filter = "package.equals(package1) && name.equals(test1)" + mounts = [ + { fs_type = "tmp", mount_point = "/tmp" }, + { fs_type = "sys", mount_point = "/sys" }, + ] + "#, + ) + .unwrap(); + + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test1"), empty_env, no_containers) + .unwrap() + .mounts, + vec![ + JobMount { + fs_type: JobMountFsType::Tmp, + mount_point: utf8_path_buf!("/tmp"), + }, + JobMount { + fs_type: JobMountFsType::Sys, + mount_point: 
utf8_path_buf!("/sys"), + }, + ], + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test2"), empty_env, no_containers) + .unwrap() + .mounts, + vec![JobMount { + fs_type: JobMountFsType::Proc, + mount_point: utf8_path_buf!("/proc"), + },], + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package2", "test1"), empty_env, no_containers) + .unwrap() + .mounts, + vec![], + ); + } + + #[test] + fn added_mounts() { + let all = AllMetadata::from_str( + r#" + [[directives]] + added_mounts = [ { fs_type = "tmp", mount_point = "/tmp" } ] + + [[directives]] + filter = "package.equals(package1)" + mounts = [ + { fs_type = "proc", mount_point = "/proc" }, + ] + added_mounts = [ + { fs_type = "sys", mount_point = "/sys" }, + ] + + [[directives]] + filter = "package.equals(package1) && name.equals(test1)" + added_mounts = [ + { fs_type = "tmp", mount_point = "/tmp" }, + ] + + [[directives]] + filter = "package.equals(package1) && name.equals(test2)" + added_mounts = [ + { fs_type = "tmp", mount_point = "/tmp" }, + { fs_type = "proc", mount_point = "/proc" }, + ] + + [[directives]] + filter = "package.equals(package1) && name.equals(test3)" + mounts = [] + added_mounts = [ + { fs_type = "tmp", mount_point = "/tmp" }, + ] + "#, + ) + .unwrap(); + + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test1"), empty_env, no_containers) + .unwrap() + .mounts, + vec![ + JobMount { + fs_type: JobMountFsType::Proc, + mount_point: utf8_path_buf!("/proc"), + }, + JobMount { + fs_type: JobMountFsType::Sys, + mount_point: utf8_path_buf!("/sys"), + }, + JobMount { + fs_type: JobMountFsType::Tmp, + mount_point: utf8_path_buf!("/tmp"), + }, + ], + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test2"), empty_env, no_containers) + .unwrap() + .mounts, + vec![ + JobMount { + fs_type: JobMountFsType::Proc, + mount_point: utf8_path_buf!("/proc"), + }, + JobMount { + fs_type: JobMountFsType::Sys, + mount_point: utf8_path_buf!("/sys"), + }, 
+ JobMount { + fs_type: JobMountFsType::Tmp, + mount_point: utf8_path_buf!("/tmp"), + }, + JobMount { + fs_type: JobMountFsType::Proc, + mount_point: utf8_path_buf!("/proc"), + }, + ], + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test3"), empty_env, no_containers) + .unwrap() + .mounts, + vec![JobMount { + fs_type: JobMountFsType::Tmp, + mount_point: utf8_path_buf!("/tmp"), + },], + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package2", "test1"), empty_env, no_containers) + .unwrap() + .mounts, + vec![JobMount { + fs_type: JobMountFsType::Tmp, + mount_point: utf8_path_buf!("/tmp"), + },], + ); + } + + #[test] + fn devices() { + let all = AllMetadata::from_str( + r#" + [[directives]] + filter = "package.equals(package1)" + devices = [ "null" ] + + [[directives]] + filter = "package.equals(package1) && name.equals(test1)" + devices = [ "zero", "tty" ] + "#, + ) + .unwrap(); + + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test1"), empty_env, no_containers) + .unwrap() + .devices, + enum_set! {JobDevice::Zero | JobDevice::Tty}, + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test2"), empty_env, no_containers) + .unwrap() + .devices, + enum_set! 
{JobDevice::Null}, + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package2", "test1"), empty_env, no_containers) + .unwrap() + .devices, + EnumSet::EMPTY, + ); + } + + #[test] + fn added_devices() { + let all = AllMetadata::from_str( + r#" + [[directives]] + added_devices = [ "tty" ] + + [[directives]] + filter = "package.equals(package1)" + devices = [ "zero", "null" ] + added_devices = [ "full" ] + + [[directives]] + filter = "package.equals(package1) && name.equals(test1)" + added_devices = [ "random" ] + + [[directives]] + filter = "package.equals(package1) && name.equals(test2)" + added_devices = [ "urandom" ] + + [[directives]] + filter = "package.equals(package1) && name.equals(test3)" + devices = [] + added_devices = [ "zero" ] + "#, + ) + .unwrap(); + + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test1"), empty_env, no_containers) + .unwrap() + .devices, + enum_set! {JobDevice::Zero | JobDevice::Null | JobDevice::Full | JobDevice::Random}, + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test2"), empty_env, no_containers) + .unwrap() + .devices, + enum_set! {JobDevice::Zero | JobDevice::Null | JobDevice::Full | JobDevice::Urandom}, + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package1", "test3"), empty_env, no_containers) + .unwrap() + .devices, + enum_set! {JobDevice::Zero}, + ); + assert_eq!( + all.get_metadata_for_test(&test_ctx("package2", "test1"), empty_env, no_containers) + .unwrap() + .devices, + enum_set! 
{JobDevice::Tty}, + ); + } + + fn assert_toml_error(err: Error, expected: &str) { + let err = err.downcast_ref::<toml::de::Error>().unwrap(); + let message = err.message(); + assert!(message.starts_with(expected), "message: {message}"); + } + + #[test] + fn bad_field_in_all_metadata() { + assert_toml_error( + AllMetadata::from_str( + r#" + [not_a_field] + foo = "three" + "#, + ) + .unwrap_err(), + "unknown field `not_a_field`, expected `directives`", + ); + } +} diff --git a/crates/cargo-metest/src/metadata/directive.rs b/crates/cargo-metest/src/metadata/directive.rs new file mode 100644 index 00000000..ea377551 --- /dev/null +++ b/crates/cargo-metest/src/metadata/directive.rs @@ -0,0 +1,963 @@ +use crate::pattern; +use anyhow::Result; +use maelstrom_base::{ + EnumSet, GroupId, JobDevice, JobDeviceListDeserialize, JobMount, UserId, Utf8PathBuf, +}; +use maelstrom_client::spec::{incompatible, Image, ImageUse, PossiblyImage}; +use serde::{de, Deserialize, Deserializer}; +use std::{collections::BTreeMap, str}; + +#[derive(PartialEq, Eq, Debug, Default)] +pub struct TestDirective { + pub filter: Option<pattern::Pattern>, + // This will be Some if any of the other fields are Some(PossiblyImage::Image).
+ pub image: Option<String>, + pub include_shared_libraries: Option<bool>, + pub enable_loopback: Option<bool>, + pub enable_writable_file_system: Option<bool>, + pub user: Option<UserId>, + pub group: Option<GroupId>, + pub layers: Option<PossiblyImage<Vec<String>>>, + pub added_layers: Vec<String>, + pub mounts: Option<Vec<JobMount>>, + pub added_mounts: Vec<JobMount>, + pub devices: Option<EnumSet<JobDevice>>, + pub added_devices: EnumSet<JobDevice>, + pub environment: Option<PossiblyImage<BTreeMap<String, String>>>, + pub added_environment: BTreeMap<String, String>, + pub working_directory: Option<PossiblyImage<Utf8PathBuf>>, +} + +#[derive(Deserialize)] +#[serde(field_identifier, rename_all = "snake_case")] +enum DirectiveField { + Filter, + IncludeSharedLibraries, + EnableLoopback, + EnableWritableFileSystem, + User, + Group, + Mounts, + AddedMounts, + Devices, + AddedDevices, + Image, + WorkingDirectory, + Layers, + AddedLayers, + Environment, + AddedEnvironment, +} + +struct DirectiveVisitor; + +impl<'de> de::Visitor<'de> for DirectiveVisitor { + type Value = TestDirective; + + fn expecting(&self, formatter: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { + write!(formatter, "TestDirective") + } + + fn visit_map<A>(self, mut map: A) -> Result<Self::Value, A::Error> + where + A: de::MapAccess<'de>, + { + let mut filter = None; + let mut include_shared_libraries = None; + let mut enable_loopback = None; + let mut enable_writable_file_system = None; + let mut user = None; + let mut group = None; + let mut mounts = None; + let mut added_mounts = None; + let mut devices = None; + let mut added_devices = None; + let mut image = None; + let mut working_directory = None; + let mut layers = None; + let mut added_layers = None; + let mut environment = None; + let mut added_environment = None; + while let Some(key) = map.next_key()? { + match key { + DirectiveField::Filter => { + filter = Some( + map.next_value::<String>()?
+ .parse() + .map_err(serde::de::Error::custom)?, + ); + } + DirectiveField::IncludeSharedLibraries => { + include_shared_libraries = Some(map.next_value()?); + } + DirectiveField::EnableLoopback => { + enable_loopback = Some(map.next_value()?); + } + DirectiveField::EnableWritableFileSystem => { + enable_writable_file_system = Some(map.next_value()?); + } + DirectiveField::User => { + user = Some(map.next_value()?); + } + DirectiveField::Group => { + group = Some(map.next_value()?); + } + DirectiveField::Mounts => { + incompatible( + &added_mounts, + "field `mounts` cannot be set after `added_mounts`", + )?; + mounts = Some(map.next_value()?); + } + DirectiveField::AddedMounts => { + added_mounts = Some(map.next_value()?); + } + DirectiveField::Devices => { + incompatible( + &added_devices, + "field `devices` cannot be set after `added_devices`", + )?; + let d = map.next_value::<Vec<JobDeviceListDeserialize>>()?; + devices = Some(d.into_iter().map(JobDevice::from).collect()); + } + DirectiveField::AddedDevices => { + let d = map.next_value::<Vec<JobDeviceListDeserialize>>()?; + added_devices = Some(d.into_iter().map(JobDevice::from).collect()); + } + DirectiveField::Image => { + let i = map.next_value::<Image>()?; + image = Some(i.name); + for use_ in i.use_ { + match use_ { + ImageUse::WorkingDirectory => { + incompatible( + &working_directory, + "field `image` cannot use `working_directory` if field `working_directory` is also set", + )?; + working_directory = Some(PossiblyImage::Image); + } + ImageUse::Layers => { + incompatible( + &layers, + "field `image` cannot use `layers` if field `layers` is also set", + )?; + incompatible( + &added_layers, + "field `image` that uses `layers` cannot be set after `added_layers`", + )?; + layers = Some(PossiblyImage::Image); + } + ImageUse::Environment => { + incompatible( + &environment, + "field `image` cannot use `environment` if field `environment` is also set", + )?; + incompatible( + &added_environment, + "field `image` that uses `environment` cannot be set after
`added_environment`", + )?; + environment = Some(PossiblyImage::Image); + } + } + } + } + DirectiveField::WorkingDirectory => { + incompatible( + &working_directory, + "field `working_directory` cannot be set after `image` field that uses `working_directory`", + )?; + working_directory = Some(PossiblyImage::Explicit(map.next_value()?)); + } + DirectiveField::Layers => { + incompatible( + &layers, + "field `layers` cannot be set after `image` field that uses `layers`", + )?; + incompatible( + &added_layers, + "field `layers` cannot be set after `added_layers`", + )?; + layers = Some(PossiblyImage::Explicit(map.next_value()?)); + } + DirectiveField::AddedLayers => { + added_layers = Some(map.next_value()?); + } + DirectiveField::Environment => { + incompatible( + &environment, + "field `environment` cannot be set after `image` field that uses `environment`", + )?; + incompatible( + &added_environment, + "field `environment` cannot be set after `added_environment`", + )?; + environment = Some(PossiblyImage::Explicit(map.next_value()?)); + } + DirectiveField::AddedEnvironment => { + added_environment = Some(map.next_value()?); + } + } + } + Ok(TestDirective { + filter, + include_shared_libraries, + enable_loopback, + enable_writable_file_system, + user, + group, + layers, + added_layers: added_layers.unwrap_or_default(), + mounts, + added_mounts: added_mounts.unwrap_or_default(), + image, + working_directory, + devices, + added_devices: added_devices.unwrap_or_default(), + environment, + added_environment: added_environment.unwrap_or_default(), + }) + } +} + +impl<'de> de::Deserialize<'de> for TestDirective { + fn deserialize<D>(deserializer: D) -> Result<Self, D::Error> + where + D: Deserializer<'de>, + { + deserializer.deserialize_any(DirectiveVisitor) + } +} + +#[cfg(test)] +mod test { + use super::*; + use anyhow::Error; + use maelstrom_base::{enum_set, JobMountFsType}; + use maelstrom_test::{string, string_vec, utf8_path_buf}; + use toml::de::Error as TomlError; + + fn
parse_test_directive(file: &str) -> Result<TestDirective> { + toml::from_str(file).map_err(Error::new) + } + + fn assert_toml_error(err: Error, expected: &str) { + let err = err.downcast_ref::<TomlError>().unwrap(); + let message = err.message(); + assert!(message.starts_with(expected), "message: {message}"); + } + + #[test] + fn empty() { + assert_eq!(parse_test_directive("").unwrap(), TestDirective::default(),); + } + + #[test] + fn unknown_field() { + assert_toml_error( + parse_test_directive( + r#" + unknown = "foo" + "#, + ) + .unwrap_err(), + "unknown field `unknown`, expected one of", + ); + } + + #[test] + fn duplicate_field() { + assert_toml_error( + parse_test_directive( + r#" + filter = "all" + filter = "any" + "#, + ) + .unwrap_err(), + "duplicate key `filter`", + ); + } + + #[test] + fn simple_fields() { + assert_eq!( + parse_test_directive( + r#" + filter = "package.equals(package1) && test.equals(test1)" + include_shared_libraries = true + enable_loopback = false + enable_writable_file_system = true + user = 101 + group = 202 + "# + ) + .unwrap(), + TestDirective { + filter: Some( + "package.equals(package1) && test.equals(test1)" + .parse() + .unwrap() + ), + include_shared_libraries: Some(true), + enable_loopback: Some(false), + enable_writable_file_system: Some(true), + user: Some(UserId::from(101)), + group: Some(GroupId::from(202)), + ..Default::default() + } + ); + } + + #[test] + fn mounts() { + assert_eq!( + parse_test_directive( + r#" + mounts = [ { fs_type = "proc", mount_point = "/proc" } ] + "# + ) + .unwrap(), + TestDirective { + mounts: Some(vec![JobMount { + fs_type: JobMountFsType::Proc, + mount_point: utf8_path_buf!("/proc"), + }]), + ..Default::default() + } + ); + } + + #[test] + fn added_mounts() { + assert_eq!( + parse_test_directive( + r#" + added_mounts = [ { fs_type = "proc", mount_point = "/proc" } ] + "# + ) + .unwrap(), + TestDirective { + added_mounts: vec![JobMount { + fs_type: JobMountFsType::Proc, + mount_point: utf8_path_buf!("/proc"), + }], + 
..Default::default() + } + ); + } + + #[test] + fn mounts_before_added_mounts() { + assert_eq!( + parse_test_directive( + r#" + mounts = [ { fs_type = "proc", mount_point = "/proc" } ] + added_mounts = [ { fs_type = "tmp", mount_point = "/tmp" } ] + "# + ) + .unwrap(), + TestDirective { + mounts: Some(vec![JobMount { + fs_type: JobMountFsType::Proc, + mount_point: utf8_path_buf!("/proc"), + }]), + added_mounts: vec![JobMount { + fs_type: JobMountFsType::Tmp, + mount_point: utf8_path_buf!("/tmp"), + }], + ..Default::default() + } + ); + } + + #[test] + fn mounts_after_added_mounts() { + assert_toml_error( + parse_test_directive( + r#" + added_mounts = [ { fs_type = "tmp", mount_point = "/tmp" } ] + mounts = [ { fs_type = "proc", mount_point = "/proc" } ] + "#, + ) + .unwrap_err(), + "field `mounts` cannot be set after `added_mounts`", + ); + } + + #[test] + fn unknown_field_in_mount() { + assert_toml_error( + parse_test_directive( + r#" + mounts = [ { fs_type = "proc", mount_point = "/proc", unknown = "true" } ] + "#, + ) + .unwrap_err(), + "unknown field `unknown`, expected", + ); + } + + #[test] + fn missing_field_in_mount() { + assert_toml_error( + parse_test_directive( + r#" + mounts = [ { fs_type = "proc" } ] + "#, + ) + .unwrap_err(), + "missing field `mount_point`", + ); + } + + #[test] + fn devices() { + assert_eq!( + parse_test_directive( + r#" + devices = [ "null", "zero" ] + "# + ) + .unwrap(), + TestDirective { + devices: Some(enum_set!(JobDevice::Null | JobDevice::Zero)), + ..Default::default() + } + ); + } + + #[test] + fn added_devices() { + assert_eq!( + parse_test_directive( + r#" + added_devices = [ "null", "zero" ] + "# + ) + .unwrap(), + TestDirective { + added_devices: enum_set!(JobDevice::Null | JobDevice::Zero), + ..Default::default() + } + ); + } + + #[test] + fn devices_before_added_devices() { + assert_eq!( + parse_test_directive( + r#" + devices = [ "null", "zero" ] + added_devices = [ "full", "tty" ] + "# + ) + .unwrap(), + TestDirective 
{ + devices: Some(enum_set!(JobDevice::Null | JobDevice::Zero)), + added_devices: enum_set!(JobDevice::Full | JobDevice::Tty), + ..Default::default() + } + ); + } + + #[test] + fn devices_after_added_devices() { + assert_toml_error( + parse_test_directive( + r#" + added_devices = [ "full", "tty" ] + devices = [ "null", "zero" ] + "#, + ) + .unwrap_err(), + "field `devices` cannot be set after `added_devices`", + ); + } + + #[test] + fn unknown_devices_type() { + assert_toml_error( + parse_test_directive( + r#" + devices = ["unknown"] + "#, + ) + .unwrap_err(), + "unknown variant `unknown`, expected one of", + ); + } + + #[test] + fn working_directory() { + assert_eq!( + parse_test_directive( + r#" + working_directory = "/foo" + "# + ) + .unwrap(), + TestDirective { + working_directory: Some(PossiblyImage::Explicit("/foo".into())), + ..Default::default() + } + ); + } + + #[test] + fn image_with_working_directory() { + assert_eq!( + parse_test_directive( + r#" + image = { name = "rust", use = ["layers", "working_directory"] } + "# + ) + .unwrap(), + TestDirective { + image: Some(string!("rust")), + working_directory: Some(PossiblyImage::Image), + layers: Some(PossiblyImage::Image), + ..Default::default() + } + ); + } + + #[test] + fn working_directory_after_image_without_working_directory() { + assert_eq!( + parse_test_directive( + r#" + image = { name = "rust", use = ["layers"] } + working_directory = "/foo" + "# + ) + .unwrap(), + TestDirective { + image: Some(string!("rust")), + working_directory: Some(PossiblyImage::Explicit("/foo".into())), + layers: Some(PossiblyImage::Image), + ..Default::default() + } + ); + } + + #[test] + fn image_without_working_directory_after_working_directory() { + assert_eq!( + parse_test_directive( + r#" + working_directory = "/foo" + image = { name = "rust", use = ["layers"] } + "# + ) + .unwrap(), + TestDirective { + image: Some(string!("rust")), + working_directory: Some(PossiblyImage::Explicit("/foo".into())), + layers: 
Some(PossiblyImage::Image), + ..Default::default() + } + ); + } + + #[test] + fn working_directory_after_image_with_working_directory() { + assert_toml_error( + parse_test_directive( + r#" + image = { name = "rust", use = ["layers", "working_directory"] } + working_directory = "/foo" + "# + ) + .unwrap_err(), + "field `working_directory` cannot be set after `image` field that uses `working_directory`" + ); + } + + #[test] + fn image_with_working_directory_after_working_directory() { + assert_toml_error( + parse_test_directive( + r#" + working_directory = "/foo" + image = { name = "rust", use = ["layers", "working_directory"] } + "#, + ) + .unwrap_err(), + "field `image` cannot use `working_directory` if field `working_directory` is also set", + ); + } + + #[test] + fn layers() { + assert_eq!( + parse_test_directive( + r#" + layers = ["foo.tar"] + "# + ) + .unwrap(), + TestDirective { + layers: Some(PossiblyImage::Explicit(string_vec!["foo.tar"])), + ..Default::default() + } + ); + } + + #[test] + fn image_with_layers() { + assert_eq!( + parse_test_directive( + r#" + image = { name = "rust", use = ["layers", "working_directory"] } + "# + ) + .unwrap(), + TestDirective { + image: Some(string!("rust")), + working_directory: Some(PossiblyImage::Image), + layers: Some(PossiblyImage::Image), + ..Default::default() + } + ); + } + + #[test] + fn layers_after_image_without_layers() { + assert_eq!( + parse_test_directive( + r#" + image = { name = "rust", use = ["working_directory"] } + layers = ["foo.tar"] + "# + ) + .unwrap(), + TestDirective { + image: Some(string!("rust")), + working_directory: Some(PossiblyImage::Image), + layers: Some(PossiblyImage::Explicit(string_vec!["foo.tar"])), + ..Default::default() + } + ); + } + + #[test] + fn image_without_layers_after_layers() { + assert_eq!( + parse_test_directive( + r#" + layers = ["foo.tar"] + image = { name = "rust", use = ["working_directory"] } + "# + ) + .unwrap(), + TestDirective { + image: Some(string!("rust")), + 
working_directory: Some(PossiblyImage::Image), + layers: Some(PossiblyImage::Explicit(string_vec!["foo.tar"])), + ..Default::default() + } + ); + } + + #[test] + fn layers_after_image_with_layers() { + assert_toml_error( + parse_test_directive( + r#" + image = { name = "rust", use = ["layers", "working_directory"] } + layers = ["foo.tar"] + "#, + ) + .unwrap_err(), + "field `layers` cannot be set after `image` field that uses `layers`", + ) + } + + #[test] + fn image_with_layers_after_layers() { + assert_toml_error( + parse_test_directive( + r#" + layers = ["foo.tar"] + image = { name = "rust", use = ["layers", "working_directory"] } + "#, + ) + .unwrap_err(), + "field `image` cannot use `layers` if field `layers` is also set", + ) + } + + #[test] + fn added_layers() { + assert_eq!( + parse_test_directive( + r#" + added_layers = ["foo.tar"] + "# + ) + .unwrap(), + TestDirective { + added_layers: string_vec!["foo.tar"], + ..Default::default() + } + ); + } + + #[test] + fn added_layers_after_layers() { + assert_eq!( + parse_test_directive( + r#" + layers = ["foo.tar"] + added_layers = ["bar.tar"] + "# + ) + .unwrap(), + TestDirective { + layers: Some(PossiblyImage::Explicit(string_vec!["foo.tar"])), + added_layers: string_vec!["bar.tar"], + ..Default::default() + } + ); + } + + #[test] + fn added_layers_after_image_with_layers() { + assert_eq!( + parse_test_directive( + r#" + image = { name = "rust", use = ["layers"] } + added_layers = ["foo.tar"] + "# + ) + .unwrap(), + TestDirective { + image: Some(string!("rust")), + layers: Some(PossiblyImage::Image), + added_layers: string_vec!["foo.tar"], + ..Default::default() + } + ); + } + + #[test] + fn layers_after_added_layers() { + assert_toml_error( + parse_test_directive( + r#" + added_layers = ["bar.tar"] + layers = ["foo.tar"] + "#, + ) + .unwrap_err(), + "field `layers` cannot be set after `added_layers`", + ); + } + + #[test] + fn image_with_layers_after_added_layers() { + assert_toml_error( + parse_test_directive( 
+ r#" + added_layers = ["bar.tar"] + image = { name = "rust", use = ["layers"] } + "#, + ) + .unwrap_err(), + "field `image` that uses `layers` cannot be set after `added_layers`", + ); + } + + #[test] + fn environment() { + assert_eq!( + parse_test_directive( + r#" + environment = { FOO = "foo" } + "# + ) + .unwrap(), + TestDirective { + environment: Some(PossiblyImage::Explicit(BTreeMap::from([( + string!("FOO"), + string!("foo") + )]))), + ..Default::default() + } + ); + } + + #[test] + fn image_with_environment() { + assert_eq!( + parse_test_directive( + r#" + image = { name = "rust", use = ["environment", "working_directory"] } + "# + ) + .unwrap(), + TestDirective { + image: Some(string!("rust")), + working_directory: Some(PossiblyImage::Image), + environment: Some(PossiblyImage::Image), + ..Default::default() + } + ); + } + + #[test] + fn environment_after_image_without_environment() { + assert_eq!( + parse_test_directive( + r#" + image = { name = "rust", use = ["working_directory"] } + environment = { FOO = "foo" } + "# + ) + .unwrap(), + TestDirective { + image: Some(string!("rust")), + working_directory: Some(PossiblyImage::Image), + environment: Some(PossiblyImage::Explicit(BTreeMap::from([( + string!("FOO"), + string!("foo") + )]))), + ..Default::default() + } + ); + } + + #[test] + fn image_without_environment_after_environment() { + assert_eq!( + parse_test_directive( + r#" + environment = { FOO = "foo" } + image = { name = "rust", use = ["working_directory"] } + "# + ) + .unwrap(), + TestDirective { + image: Some(string!("rust")), + working_directory: Some(PossiblyImage::Image), + environment: Some(PossiblyImage::Explicit(BTreeMap::from([( + string!("FOO"), + string!("foo") + )]))), + ..Default::default() + } + ); + } + + #[test] + fn environment_after_image_with_environment() { + assert_toml_error( + parse_test_directive( + r#" + image = { name = "rust", use = ["environment", "working_directory"] } + environment = { FOO = "foo" } + "#, + ) + 
.unwrap_err(), + "field `environment` cannot be set after `image` field that uses `environment`", + ) + } + + #[test] + fn image_with_environment_after_environment() { + assert_toml_error( + parse_test_directive( + r#" + environment = { FOO = "foo" } + image = { name = "rust", use = ["environment", "working_directory"] } + "#, + ) + .unwrap_err(), + "field `image` cannot use `environment` if field `environment` is also set", + ) + } + + #[test] + fn added_environment() { + assert_eq!( + parse_test_directive( + r#" + added_environment = { BAR = "bar" } + "# + ) + .unwrap(), + TestDirective { + added_environment: BTreeMap::from([(string!("BAR"), string!("bar"))]), + ..Default::default() + } + ); + } + + #[test] + fn added_environment_after_environment() { + assert_eq!( + parse_test_directive( + r#" + environment = { FOO = "foo" } + added_environment = { BAR = "bar" } + "# + ) + .unwrap(), + TestDirective { + environment: Some(PossiblyImage::Explicit(BTreeMap::from([( + string!("FOO"), + string!("foo") + )]))), + added_environment: BTreeMap::from([(string!("BAR"), string!("bar"))]), + ..Default::default() + } + ); + } + + #[test] + fn added_environment_after_image_with_environment() { + assert_eq!( + parse_test_directive( + r#" + image = { name = "rust", use = ["environment"] } + added_environment = { BAR = "bar" } + "# + ) + .unwrap(), + TestDirective { + image: Some(string!("rust")), + environment: Some(PossiblyImage::Image), + added_environment: BTreeMap::from([(string!("BAR"), string!("bar"))]), + ..Default::default() + } + ); + } + + #[test] + fn environment_after_added_environment() { + assert_toml_error( + parse_test_directive( + r#" + added_environment = { BAR = "bar" } + environment = { FOO = "foo" } + "#, + ) + .unwrap_err(), + "field `environment` cannot be set after `added_environment`", + ); + } + + #[test] + fn image_with_environment_after_added_environment() { + assert_toml_error( + parse_test_directive( + r#" + added_environment = { BAR = "bar" } + 
image = { name = "rust", use = ["environment"] } + "#, + ) + .unwrap_err(), + "field `image` that uses `environment` cannot be set after `added_environment`", + ); + } +} diff --git a/crates/cargo-metest/src/pattern/interpreter.rs b/crates/cargo-metest/src/pattern/interpreter.rs new file mode 100644 index 00000000..e90cf529 --- /dev/null +++ b/crates/cargo-metest/src/pattern/interpreter.rs @@ -0,0 +1,493 @@ +use crate::pattern::parser::*; +use cargo_metadata::Target as CargoTarget; +use serde::{Deserialize, Serialize}; +use std::fmt; + +#[cfg(test)] +use crate::parse_str; + +#[derive(Copy, Clone, Hash, PartialOrd, Ord, PartialEq, Eq, Debug, Serialize, Deserialize)] +pub enum ArtifactKind { + Library, + Binary, + Test, + Benchmark, + Example, +} + +impl fmt::Display for ArtifactKind { + fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result { + format!("{:?}", self).to_lowercase().fmt(f) + } +} + +impl ArtifactKind { + pub fn from_target(t: &CargoTarget) -> Self { + if t.is_bin() { + Self::Binary + } else if t.is_test() { + Self::Test + } else if t.is_example() { + Self::Example + } else if t.is_bench() { + Self::Benchmark + } else { + Self::Library + } + } +} + +#[derive(Clone, PartialEq, Eq, Debug)] +pub struct Artifact { + pub kind: ArtifactKind, + pub name: String, +} + +impl Artifact { + pub fn from_target(t: &CargoTarget) -> Self { + Self { + name: t.name.clone(), + kind: ArtifactKind::from_target(t), + } + } +} + +#[derive(Clone, PartialEq, Eq, Debug)] +pub struct Case { + pub name: String, +} + +#[derive(Clone, PartialEq, Eq, Debug)] +pub struct Context { + pub package: String, + pub artifact: Option<Artifact>, + pub case: Option<Case>, +} + +impl Context { + fn case(&self) -> Option<&Case> { + self.case.as_ref() + } + + fn artifact(&self) -> Option<&Artifact> { + self.artifact.as_ref() + } +} + +pub fn maybe_not(a: Option<bool>) -> Option<bool> { + a.map(|v| !v) +} + +pub fn maybe_and(a: Option<bool>, b: Option<bool>) -> Option<bool> { + match (a, b) { + (Some(a), Some(b)) => Some(a && b), + (None, 
Some(true)) => None, + (None, Some(false)) => Some(false), + (Some(true), None) => None, + (Some(false), None) => Some(false), + (None, None) => None, + } +} + +pub fn maybe_or(a: Option<bool>, b: Option<bool>) -> Option<bool> { + match (a, b) { + (Some(a), Some(b)) => Some(a || b), + (None, Some(true)) => Some(true), + (None, Some(false)) => None, + (Some(true), None) => Some(true), + (Some(false), None) => None, + (None, None) => None, + } +} + +pub fn interpret_simple_selector(s: &SimpleSelector, c: &Context) -> Option<bool> { + use CompoundSelectorName::*; + use SimpleSelectorName::*; + Some(match s.name { + All | Any | True => true, + None | False => false, + Library => matches!(c.artifact()?.kind, ArtifactKind::Library), + Compound(Binary) => matches!(c.artifact()?.kind, ArtifactKind::Binary), + Compound(Benchmark) => matches!(c.artifact()?.kind, ArtifactKind::Benchmark), + Compound(Test) => matches!(c.artifact()?.kind, ArtifactKind::Test), + Compound(Example) => matches!(c.artifact()?.kind, ArtifactKind::Example), + Compound(Name) => unreachable!("should be parser error"), + Compound(Package) => unreachable!("should be parser error"), + }) +} + +fn interpret_matcher(s: &str, matcher: &Matcher) -> bool { + use Matcher::*; + match matcher { + Equals(a) => s == a.0, + Contains(a) => s.contains(&a.0), + StartsWith(a) => s.starts_with(&a.0), + EndsWith(a) => s.ends_with(&a.0), + Matches(a) => a.0.is_match(s), + Globs(a) => a.0.is_match(s), + } +} + +pub fn interpret_compound_selector(s: &CompoundSelector, c: &Context) -> Option<bool> { + use CompoundSelectorName::*; + Some(match s.name { + Name => interpret_matcher(&c.case()?.name, &s.matcher), + Package => interpret_matcher(&c.package, &s.matcher), + Binary => { + matches!(&c.artifact()?.kind, ArtifactKind::Binary) + && interpret_matcher(&c.artifact()?.name, &s.matcher) + } + Benchmark => { + matches!(&c.artifact()?.kind, ArtifactKind::Benchmark) + && interpret_matcher(&c.artifact()?.name, &s.matcher) + } + Example => { + 
matches!(&c.artifact()?.kind, ArtifactKind::Example) + && interpret_matcher(&c.artifact()?.name, &s.matcher) + } + Test => { + matches!(&c.artifact()?.kind, ArtifactKind::Test) + && interpret_matcher(&c.artifact()?.name, &s.matcher) + } + }) +} + +fn interpret_not_expression(n: &NotExpression, c: &Context) -> Option<bool> { + use NotExpression::*; + match n { + Not(n) => maybe_not(interpret_not_expression(n, c)), + Simple(s) => interpret_simple_expression(s, c), + } +} + +fn interpret_and_expression(a: &AndExpression, c: &Context) -> Option<bool> { + use AndExpression::*; + match a { + And(n, a) => maybe_and( + interpret_not_expression(n, c), + interpret_and_expression(a, c), + ), + Diff(n, a) => maybe_and( + interpret_not_expression(n, c), + maybe_not(interpret_and_expression(a, c)), + ), + Not(n) => interpret_not_expression(n, c), + } +} + +fn interpret_or_expression(o: &OrExpression, c: &Context) -> Option<bool> { + use OrExpression::*; + match o { + Or(a, o) => maybe_or( + interpret_and_expression(a, c), + interpret_or_expression(o, c), + ), + And(a) => interpret_and_expression(a, c), + } +} + +pub fn interpret_simple_expression(s: &SimpleExpression, c: &Context) -> Option<bool> { + use SimpleExpression::*; + match s { + Or(o) => interpret_or_expression(o, c), + SimpleSelector(s) => interpret_simple_selector(s, c), + CompoundSelector(s) => interpret_compound_selector(s, c), + } +} + +pub fn interpret_pattern(s: &Pattern, c: &Context) -> Option<bool> { + interpret_or_expression(&s.0, c) +} + +#[test] +fn simple_expression_simple_selector() { + use ArtifactKind::*; + + fn test_it(s: &str, artifact: Option<ArtifactKind>, expected: Option<bool>) { + let c = Context { + package: "foo".into(), + artifact: artifact.map(|kind| Artifact { + kind, + name: "foo.bin".into(), + }), + case: None, + }; + let actual = interpret_simple_expression(&parse_str!(SimpleExpression, s).unwrap(), &c); + assert_eq!(actual, expected); + } + + // for all inputs, these expressions evaluate to true + for w in ["all", "any", "true"] { + for 
a in [Library, Binary, Test, Benchmark, Example] { + test_it(w, Some(a), Some(true)); + } + test_it(w, None, Some(true)); + } + + // for all inputs, these expressions evaluate to false + for w in ["none", "false"] { + for a in [Library, Binary, Test, Benchmark, Example] { + test_it(w, Some(a), Some(false)); + } + test_it(w, None, Some(false)); + } + + test_it("library", Some(Library), Some(true)); + test_it("library", Some(Binary), Some(false)); + test_it("library", None, None); + + test_it("binary", Some(Library), Some(false)); + test_it("binary", Some(Binary), Some(true)); + test_it("binary", None, None); + + test_it("benchmark", Some(Library), Some(false)); + test_it("benchmark", Some(Benchmark), Some(true)); + test_it("benchmark", None, None); + + test_it("test", Some(Library), Some(false)); + test_it("test", Some(Test), Some(true)); + test_it("test", None, None); + + test_it("example", Some(Library), Some(false)); + test_it("example", Some(Example), Some(true)); + test_it("example", None, None); +} + +#[cfg(test)] +fn test_compound_sel( + s: &str, + artifact: Option<ArtifactKind>, + name: impl Into<String>, + expected: Option<bool>, +) { + let c = Context { + package: "foo".into(), + artifact: artifact.map(|kind| Artifact { + kind, + name: name.into(), + }), + case: None, + }; + let actual = interpret_simple_expression(&parse_str!(SimpleExpression, s).unwrap(), &c); + assert_eq!(actual, expected); +} + +#[test] +fn simple_expression_compound_selector_starts_with() { + use ArtifactKind::*; + + let p = "binary.starts_with(bar)"; + test_compound_sel(p, Some(Binary), "barbaz", Some(true)); + test_compound_sel(p, Some(Binary), "bazbar", Some(false)); + test_compound_sel(p, None, "bazbar", None); +} + +#[test] +fn simple_expression_compound_selector_ends_with() { + use ArtifactKind::*; + + let p = "binary.ends_with(bar)"; + test_compound_sel(p, Some(Binary), "bazbar", Some(true)); + test_compound_sel(p, Some(Binary), "barbaz", Some(false)); + test_compound_sel(p, None, "bazbar", None); +} + 
+#[test] +fn simple_expression_compound_selector_equals() { + use ArtifactKind::*; + + let p = "binary.equals(bar)"; + test_compound_sel(p, Some(Binary), "bar", Some(true)); + test_compound_sel(p, Some(Binary), "baz", Some(false)); + test_compound_sel(p, None, "bazbar", None); +} + +#[test] +fn simple_expression_compound_selector_contains() { + use ArtifactKind::*; + + let p = "binary.contains(bar)"; + test_compound_sel(p, Some(Binary), "bazbarbin", Some(true)); + test_compound_sel(p, Some(Binary), "bazbin", Some(false)); + test_compound_sel(p, None, "bazbar", None); +} + +#[test] +fn simple_expression_compound_selector_matches() { + use ArtifactKind::*; + + let p = "binary.matches(^[a-z]*$)"; + test_compound_sel(p, Some(Binary), "bazbarbin", Some(true)); + test_compound_sel(p, Some(Binary), "baz-bin", Some(false)); + test_compound_sel(p, None, "bazbar", None); +} + +#[test] +fn simple_expression_compound_selector_globs() { + use ArtifactKind::*; + + let p = "binary.globs(baz*)"; + test_compound_sel(p, Some(Binary), "bazbarbin", Some(true)); + test_compound_sel(p, Some(Binary), "binbaz", Some(false)); + test_compound_sel(p, None, "bazbar", None); +} + +#[cfg(test)] +fn test_compound_sel_case( + s: &str, + kind: Option<ArtifactKind>, + package: impl Into<String>, + artifact_name: impl Into<String>, + case_name: impl Into<String>, + expected: Option<bool>, +) { + let c = Context { + package: package.into(), + artifact: kind.map(|kind| Artifact { + kind, + name: artifact_name.into(), + }), + case: Some(Case { + name: case_name.into(), + }), + }; + let actual = interpret_simple_expression(&parse_str!(SimpleExpression, s).unwrap(), &c); + assert_eq!(actual, expected); +} + +#[test] +fn simple_expression_compound_selector_package() { + use ArtifactKind::*; + + let p = "package.matches(^[a-z]*$)"; + for k in [Library, Binary, Test, Benchmark, Example] { + test_compound_sel_case(p, Some(k), "bazbarbin", "", "", Some(true)); + test_compound_sel_case(p, Some(k), "baz-bin", "", "", Some(false)); + } + 
test_compound_sel_case(p, None, "baz-bin", "", "", Some(false)); +} + +#[test] +fn simple_expression_compound_selector_name() { + use ArtifactKind::*; + + let p = "name.matches(^[a-z]*$)"; + for k in [Library, Binary, Test, Benchmark, Example] { + test_compound_sel_case(p, Some(k), "", "", "bazbarbin", Some(true)); + test_compound_sel_case(p, Some(k), "", "", "baz-bin", Some(false)); + } +} + +#[test] +fn simple_expression_compound_selector_binary() { + use ArtifactKind::*; + + let p = "binary.matches(^[a-z]*$)"; + test_compound_sel_case(p, Some(Binary), "", "bazbarbin", "", Some(true)); + test_compound_sel_case(p, Some(Binary), "", "baz-bin", "", Some(false)); + test_compound_sel_case(p, Some(Test), "", "bazbarbin", "", Some(false)); +} + +#[test] +fn simple_expression_compound_selector_benchmark() { + use ArtifactKind::*; + + let p = "benchmark.matches(^[a-z]*$)"; + test_compound_sel_case(p, Some(Benchmark), "", "bazbarbin", "", Some(true)); + test_compound_sel_case(p, Some(Benchmark), "", "baz-bin", "", Some(false)); + test_compound_sel_case(p, Some(Test), "", "bazbarbin", "", Some(false)); +} + +#[test] +fn simple_expression_compound_selector_example() { + use ArtifactKind::*; + + let p = "example.matches(^[a-z]*$)"; + test_compound_sel_case(p, Some(Example), "", "bazbarbin", "", Some(true)); + test_compound_sel_case(p, Some(Example), "", "baz-bin", "", Some(false)); + test_compound_sel_case(p, Some(Test), "", "bazbarbin", "", Some(false)); +} + +#[test] +fn simple_expression_compound_selector_test() { + use ArtifactKind::*; + + let p = "test.matches(^[a-z]*$)"; + test_compound_sel_case(p, Some(Test), "", "bazbarbin", "", Some(true)); + test_compound_sel_case(p, Some(Test), "", "baz-bin", "", Some(false)); + test_compound_sel_case(p, Some(Binary), "", "bazbarbin", "", Some(false)); +} + +#[test] +fn and_or_not_diff_expressions() { + fn test_it(s: &str, expected: bool) { + let c = Context { + package: "foo".into(), + artifact: Some(Artifact { + kind: 
ArtifactKind::Library,
+                name: "foo_bin".into(),
+            }),
+            case: Some(Case {
+                name: "foo_test".into(),
+            }),
+        };
+        let actual = interpret_pattern(&parse_str!(Pattern, s).unwrap(), &c);
+        assert_eq!(actual, Some(expected));
+    }
+
+    test_it(
+        "(package.equals(foo) || package.equals(bar)) && name.equals(foo_test)",
+        true,
+    );
+    test_it("package.equals(foo) && name.equals(foo_test)", true);
+    test_it("package.equals(foo) || name.equals(foo_test)", true);
+    test_it("package.equals(foo) || name.equals(bar_test)", true);
+    test_it("package.equals(foo) && !name.equals(bar_test)", true);
+    test_it("package.equals(foo) - name.equals(bar_test)", true);
+
+    test_it("package.equals(foo) && name.equals(bar_test)", false);
+    test_it("package.equals(bar) || name.equals(bar_test)", false);
+    test_it("package.equals(bar) || !name.equals(foo_test)", false);
+    test_it("package.equals(foo) - name.equals(foo_test)", false);
+}
+
+#[test]
+fn and_or_not_diff_maybe_expressions() {
+    fn test_it(s: &str, expected: Option<bool>) {
+        let c = Context {
+            package: "foo".into(),
+            artifact: Some(Artifact {
+                kind: ArtifactKind::Library,
+                name: "foo_bin".into(),
+            }),
+            case: None,
+        };
+        let actual = interpret_pattern(&parse_str!(Pattern, s).unwrap(), &c);
+        assert_eq!(actual, expected);
+    }
+
+    test_it(
+        "(package.equals(foo) || package.equals(bar)) && name.equals(foo_test)",
+        None,
+    );
+    test_it("package.equals(foo) && name.equals(foo_test)", None);
+    test_it("name.equals(foo_test) && name.equals(bar_test)", None);
+    test_it("name.equals(foo_test) && package.equals(foo)", None);
+    test_it("package.equals(foo) && name.equals(bar_test)", None);
+    test_it("package.equals(foo) && !name.equals(bar_test)", None);
+
+    test_it("name.equals(foo_test) && package.equals(bar)", Some(false));
+    test_it("package.equals(bar) && name.equals(foo_test)", Some(false));
+
+    test_it("name.equals(foo_test) || name.equals(bar_test)", None);
+    test_it("name.equals(foo_test) || package.equals(bar)", None);
+    test_it("package.equals(bar) || name.equals(bar_test)", None);
+    test_it("package.equals(bar) || !name.equals(foo_test)", None);
+
+    test_it("name.equals(foo_test) || package.equals(foo)", Some(true));
+    test_it("package.equals(foo) || name.equals(foo_test)", Some(true));
+    test_it("package.equals(foo) || name.equals(bar_test)", Some(true));
+
+    test_it("package.equals(foo) - name.equals(bar_test)", None);
+    test_it("package.equals(foo) - name.equals(foo_test)", None);
+}
diff --git a/crates/cargo-metest/src/pattern/mod.rs b/crates/cargo-metest/src/pattern/mod.rs
new file mode 100644
index 00000000..86cf2ab2
--- /dev/null
+++ b/crates/cargo-metest/src/pattern/mod.rs
@@ -0,0 +1,5 @@
+pub mod interpreter;
+pub mod parser;
+
+pub use interpreter::{interpret_pattern, Artifact, ArtifactKind, Case, Context};
+pub use parser::{compile_filter, Pattern};
diff --git a/crates/cargo-metest/src/pattern/parser.rs b/crates/cargo-metest/src/pattern/parser.rs
new file mode 100644
index 00000000..ceedf0cd
--- /dev/null
+++ b/crates/cargo-metest/src/pattern/parser.rs
@@ -0,0 +1,763 @@
+use crate::parse_str;
+use anyhow::{anyhow, Error, Result};
+use combine::{
+    attempt, between, choice, many, many1, optional, parser,
+    parser::{
+        char::{space, spaces, string},
+        combinator::{lazy, no_partial},
+    },
+    satisfy, token, Parser, Stream,
+};
+use derive_more::From;
+use globset::{Glob, GlobMatcher};
+use regex::Regex;
+use std::str::FromStr;
+
+#[cfg(test)]
+use regex_macro::regex;
+
+#[derive(From, Clone, Debug, PartialEq, Eq)]
+#[from(forward)]
+pub struct MatcherParameter(pub String);
+
+impl MatcherParameter {
+    pub fn parser<InputT: Stream<Token = char>>() -> impl Parser<InputT, Output = Self> {
+        parser(|input| {
+            let (open, committed) =
+                choice((token('('), token('['), token('{'), token('<'), token('/')))
+                    .parse_stream(input)
+                    .into_result()?;
+            let close = match open {
+                '(' => ')',
+                '[' => ']',
+                '{' => '}',
+                '<' => '>',
+                '/' => '/',
+                _ => unreachable!(),
+            };
+            let mut count = 1;
+            let mut contents = String::new();
+            'outer: loop {
+                let (chunk, _): (String, _) = many(satisfy(|c| c != open && c != close))
+                    .parse_stream(input)
+                    .into_result()?;
+                contents += &chunk;
+
+                while attempt(token(close)).parse_stream(input).is_ok() {
+                    count -= 1;
+                    if count == 0 {
+                        break 'outer;
+                    } else {
+                        contents.push(close);
+                    }
+                }
+                count += 1;
+                token(open).parse_stream(input).into_result()?;
+                contents.push(open);
+            }
+
+            Ok((contents, committed))
+        })
+        .map(Self)
+    }
+}
+
+#[test]
+fn matcher_parameter_test() {
+    fn test_it(a: &str, b: &str) {
+        assert_eq!(
+            parse_str!(MatcherParameter, a),
+            Ok(MatcherParameter(b.into()))
+        );
+    }
+    test_it("[abc]", "abc");
+    test_it("{abc}", "abc");
+    test_it("<abc>", "abc");
+    test_it("[(hello)]", "(hello)");
+    test_it("((hello))", "(hello)");
+    test_it("(([hello]))", "([hello])");
+    test_it("(he[llo)", "he[llo");
+    test_it("()", "");
+    test_it("((()))", "(())");
+    test_it("((a)(b))", "(a)(b)");
+
+    fn test_err(a: &str) {
+        assert!(matches!(parse_str!(MatcherParameter, a), Err(_)));
+    }
+    test_err("[1)");
+    test_err("(((hello))");
+}
+
+pub fn err_construct<
+    RetT,
+    ErrorT: std::error::Error + Send + Sync + 'static,
+    InputT: Stream<Token = char>,
+>(
+    mut inner: impl Parser<InputT, Output = String>,
+    mut con: impl FnMut(&str) -> std::result::Result<RetT, ErrorT>,
+) -> impl Parser<InputT, Output = RetT> {
+    use combine::{
+        error::{Commit, StreamError},
+        ParseError,
+    };
+    parser(move |input: &mut InputT| {
+        let position = input.position();
+        let (s, committed) = inner.parse_stream(input).into_result()?;
+        match con(&s) {
+            Ok(r) => Ok((r, committed)),
+            Err(e) => {
+                let mut parse_error = InputT::Error::empty(position);
+                parse_error.add(StreamError::other(e));
+                Err(Commit::Commit(parse_error.into()))
+            }
+        }
+    })
+}
+
+#[derive(Clone, Debug)]
+pub struct GlobMatcherParameter(pub GlobMatcher);
+
+impl PartialEq for GlobMatcherParameter {
+    fn eq(&self, other: &Self) -> bool {
+        self.0.glob() == other.0.glob()
+    }
+}
+
+impl Eq for GlobMatcherParameter {}
+
+impl GlobMatcherParameter {
+    pub fn parser<InputT: Stream<Token = char>>() -> impl Parser<InputT, Output = Self> {
+ err_construct(MatcherParameter::parser().map(|v| v.0), Glob::new) + .map(|g| Self(g.compile_matcher())) + } +} + +#[derive(Clone, Debug)] +pub struct RegexMatcherParameter(pub Regex); + +impl From<&Regex> for RegexMatcherParameter { + fn from(r: &Regex) -> Self { + Self(r.clone()) + } +} + +impl PartialEq for RegexMatcherParameter { + fn eq(&self, other: &Self) -> bool { + self.0.as_str() == other.0.as_str() + } +} + +impl Eq for RegexMatcherParameter {} + +impl RegexMatcherParameter { + pub fn parser>() -> impl Parser { + err_construct(MatcherParameter::parser().map(|v| v.0), Regex::new).map(Self) + } +} + +#[test] +fn regex_parser_test() { + parse_str!(RegexMatcherParameter, "/[a-z]/").unwrap(); + parse_str!(RegexMatcherParameter, "/*/").unwrap_err(); +} + +#[derive(Clone, Debug, PartialEq, Eq)] +pub enum Matcher { + Equals(MatcherParameter), + Contains(MatcherParameter), + StartsWith(MatcherParameter), + EndsWith(MatcherParameter), + Matches(RegexMatcherParameter), + Globs(GlobMatcherParameter), +} + +fn prefix>( + s: &'static str, + min_len: usize, +) -> impl Parser { + if s.len() == min_len { + no_partial(lazy(move || string(s))).boxed() + } else { + no_partial(lazy(move || { + attempt(string(s)).or(prefix(&s[..s.len() - 1], min_len)) + })) + .boxed() + } +} + +impl Matcher { + pub fn parser>() -> impl Parser { + let arg = || MatcherParameter::parser(); + let regex = || RegexMatcherParameter::parser(); + let glob = || GlobMatcherParameter::parser(); + choice(( + attempt(prefix("equals", 2).with(arg())).map(Self::Equals), + attempt(prefix("contains", 1).with(arg())).map(Self::Contains), + attempt(prefix("starts_with", 1).with(arg())).map(Self::StartsWith), + attempt(prefix("ends_with", 2).with(arg())).map(Self::EndsWith), + attempt(prefix("matches", 1).with(regex())).map(Self::Matches), + prefix("globs", 1).with(glob()).map(Self::Globs), + )) + } +} + +#[derive(Clone, Debug, PartialEq, Eq)] +pub enum CompoundSelectorName { + Name, + Binary, + Benchmark, + 
Example, + Test, + Package, +} + +impl CompoundSelectorName { + pub fn parser>() -> impl Parser { + choice(( + attempt(prefix("name", 1)).map(|_| Self::Name), + attempt(prefix("package", 1)).map(|_| Self::Package), + Self::parser_for_simple_selector(), + )) + } + + pub fn parser_for_simple_selector>( + ) -> impl Parser { + choice(( + attempt(prefix("binary", 2)).map(|_| Self::Binary), + attempt(prefix("benchmark", 2)).map(|_| Self::Benchmark), + attempt(prefix("example", 1)).map(|_| Self::Example), + prefix("test", 2).map(|_| Self::Test), + )) + } +} + +#[derive(Clone, Debug, PartialEq, Eq)] +pub struct CompoundSelector { + pub name: CompoundSelectorName, + pub matcher: Matcher, +} + +impl CompoundSelector { + pub fn parser>() -> impl Parser { + ( + CompoundSelectorName::parser().skip(token('.')), + Matcher::parser(), + ) + .map(|(name, matcher)| Self { name, matcher }) + } +} + +#[derive(Clone, Debug, PartialEq, Eq, From)] +pub enum SimpleSelectorName { + All, + Any, + True, + None, + False, + Library, + #[from] + Compound(CompoundSelectorName), +} + +impl SimpleSelectorName { + pub fn parser>() -> impl Parser { + choice(( + attempt(prefix("all", 2)).map(|_| Self::All), + attempt(prefix("any", 2)).map(|_| Self::Any), + attempt(prefix("true", 2)).map(|_| Self::True), + attempt(prefix("none", 1)).map(|_| Self::None), + attempt(prefix("false", 1)).map(|_| Self::False), + attempt(prefix("library", 1)).map(|_| Self::Library), + CompoundSelectorName::parser_for_simple_selector().map(Self::Compound), + )) + } +} + +#[derive(Clone, Debug, PartialEq, Eq, From)] +#[from(types(CompoundSelectorName))] +pub struct SimpleSelector { + pub name: SimpleSelectorName, +} + +impl SimpleSelector { + pub fn parser>() -> impl Parser { + SimpleSelectorName::parser() + .skip(optional(string("()"))) + .map(|name| Self { name }) + } +} + +#[derive(Clone, Debug, PartialEq, Eq, From)] +pub enum SimpleExpression { + #[from(types(OrExpression))] + Or(Box), + #[from(types(SimpleSelectorName, 
CompoundSelectorName))] + SimpleSelector(SimpleSelector), + #[from] + CompoundSelector(CompoundSelector), +} + +impl From for SimpleExpression { + fn from(a: AndExpression) -> Self { + OrExpression::from(a).into() + } +} + +impl SimpleExpression { + pub fn parser>() -> impl Parser { + let or_parser = || no_partial(lazy(|| OrExpression::parser())).boxed(); + choice(( + attempt(between( + token('(').skip(spaces()), + spaces().with(token(')')), + or_parser(), + )) + .map(|o| Self::Or(Box::new(o))), + attempt(CompoundSelector::parser().map(Self::CompoundSelector)), + attempt(SimpleSelector::parser().map(Self::SimpleSelector)), + )) + } +} + +fn not_operator>() -> impl Parser { + choice((string("!"), string("~"), string("not").skip(spaces1()))) +} + +#[derive(Clone, Debug, PartialEq, Eq, From)] +pub enum NotExpression { + Not(Box), + #[from(types(SimpleSelector, SimpleSelectorName, CompoundSelector, OrExpression))] + Simple(SimpleExpression), +} + +impl NotExpression { + pub fn parser>() -> impl Parser { + let self_parser = || no_partial(lazy(|| Self::parser())).boxed(); + choice(( + attempt(not_operator().with(self_parser().map(|e| Self::Not(Box::new(e))))), + SimpleExpression::parser().map(Self::Simple), + )) + } +} + +fn spaces1>() -> impl Parser { + many1(space()) +} + +fn and_operator>() -> impl Parser { + attempt(between( + spaces(), + spaces(), + choice((attempt(string("&&")), string("&"), string("+"))), + )) + .or(spaces1().with(string("and")).skip(spaces1())) +} + +fn diff_operator>() -> impl Parser { + attempt(between( + spaces(), + spaces(), + choice((string("\\"), string("-"))), + )) + .or(spaces1().with(string("minus")).skip(spaces1())) +} + +#[derive(Clone, Debug, PartialEq, Eq, From)] +pub enum AndExpression { + And(NotExpression, Box), + Diff(NotExpression, Box), + #[from(types( + OrExpression, + SimpleExpression, + SimpleSelector, + SimpleSelectorName, + CompoundSelector + ))] + Not(NotExpression), +} + +impl AndExpression { + pub fn parser>() -> impl 
Parser { + let self_parser = || no_partial(lazy(|| Self::parser())).boxed(); + choice(( + attempt((NotExpression::parser(), and_operator(), self_parser())) + .map(|(n, _, a)| Self::And(n, Box::new(a))), + attempt((NotExpression::parser(), diff_operator(), self_parser())) + .map(|(n, _, a)| Self::Diff(n, Box::new(a))), + NotExpression::parser().map(Self::Not), + )) + } +} + +fn or_operator>() -> impl Parser { + attempt(between( + spaces(), + spaces(), + choice((attempt(string("||")), string("|"))), + )) + .or(spaces1().with(string("or")).skip(spaces1())) +} + +#[derive(Clone, Debug, PartialEq, Eq, From)] +pub enum OrExpression { + Or(AndExpression, Box), + #[from(types( + NotExpression, + SimpleExpression, + SimpleSelector, + SimpleSelectorName, + CompoundSelector + ))] + And(AndExpression), +} + +impl OrExpression { + pub fn parser>() -> impl Parser { + let self_parser = || no_partial(lazy(|| Self::parser())).boxed(); + choice(( + attempt((AndExpression::parser(), or_operator(), self_parser())) + .map(|(a, _, o)| Self::Or(a, Box::new(o))), + AndExpression::parser().map(Self::And), + )) + } +} + +#[derive(Debug, PartialEq, Eq, From)] +#[from(types(NotExpression, AndExpression))] +pub struct Pattern(pub OrExpression); + +impl Pattern { + pub fn parser>() -> impl Parser { + OrExpression::parser().map(Self) + } +} + +impl FromStr for Pattern { + type Err = Error; + fn from_str(s: &str) -> Result { + parse_str!(Self, s).map_err(|e| anyhow!("Failed to parse pattern: {e}")) + } +} + +#[macro_export] +macro_rules! 
parse_str { + ($ty:ty, $input:expr) => {{ + use combine::{EasyParser as _, Parser as _}; + <$ty>::parser() + .skip(combine::eof()) + .easy_parse(combine::stream::position::Stream::new($input)) + .map(|x| x.0) + }}; +} + +fn compile_filter_or(filters: &[String]) -> Result { + filters + .iter() + .try_fold(SimpleSelectorName::False.into(), |e, item| { + Ok(OrExpression::Or( + AndExpression::from(e), + Box::new(Pattern::from_str(item.as_str())?.0), + )) + }) +} + +pub fn compile_filter(include_filter: &[String], exclude_filter: &[String]) -> Result { + let include = compile_filter_or(include_filter)?; + let exclude = compile_filter_or(exclude_filter)?; + Ok(AndExpression::Diff(include.into(), Box::new(exclude.into())).into()) +} + +#[test] +fn simple_expr() { + use CompoundSelectorName::*; + use SimpleSelectorName::*; + + fn test_it(a: &str, s: impl Into) { + assert_eq!(parse_str!(SimpleExpression, a), Ok(s.into())); + } + test_it("all", All); + test_it("all()", All); + test_it("any", Any); + test_it("any()", Any); + test_it("true", True); + test_it("true()", True); + test_it("none", None); + test_it("none()", None); + test_it("false", False); + test_it("false()", False); + test_it("library", Library); + test_it("library()", Library); + + test_it("binary", Binary); + test_it("binary()", Binary); + test_it("benchmark", Benchmark); + test_it("benchmark()", Benchmark); + test_it("example", Example); + test_it("example()", Example); + test_it("test", Test); + test_it("test()", Test); + + fn test_it_err(a: &str) { + assert!(parse_str!(SimpleExpression, a).is_err()); + } + test_it_err("name"); + test_it_err("name()"); + test_it_err("package"); + test_it_err("package()"); +} + +#[test] +fn simple_expr_prefix() { + use CompoundSelectorName::*; + use SimpleSelectorName::*; + + fn test_it(a: &str, min: usize, s: impl Into) { + let expected = s.into(); + for i in min..=a.len() { + assert_eq!(parse_str!(SimpleExpression, &a[..i]), Ok(expected.clone())); + } + } + + test_it("all", 
2, All); + test_it("any", 2, Any); + test_it("true", 2, True); + test_it("none", 1, None); + test_it("false", 1, False); + test_it("library", 1, Library); + + test_it("binary", 2, Binary); + test_it("benchmark", 2, Benchmark); + test_it("example", 1, Example); + test_it("test", 2, Test); +} + +#[test] +fn simple_expr_compound() { + use CompoundSelectorName::*; + use Matcher::*; + + fn test_it(a: &str, name: CompoundSelectorName, matcher: Matcher) { + assert_eq!( + parse_str!(SimpleExpression, a), + Ok(CompoundSelector { name, matcher }.into()) + ); + } + test_it("name.matches", Name, Matches(regex!("foo").into())); + test_it("test.equals([a-z].*)", Test, Equals("[a-z].*".into())); + test_it( + "binary.starts_with<(hi)>", + Binary, + StartsWith("(hi)".into()), + ); + test_it( + "benchmark.ends_with[hey?]", + Benchmark, + EndsWith("hey?".into()), + ); + test_it( + "example.contains{s(oi)l}", + Example, + Contains("s(oi)l".into()), + ); +} + +#[test] +fn matcher_prefixes() { + use CompoundSelectorName::*; + use Matcher::*; + + fn test_it(matcher_name: &str, min: usize, matcher: Matcher) { + for i in min..=matcher_name.len() { + let e = format!("name.{}", &matcher_name[..i]); + assert_eq!( + parse_str!(SimpleExpression, e.as_str()), + Ok(CompoundSelector { + name: Name, + matcher: matcher.clone() + } + .into()) + ); + } + } + + test_it("matches", 1, Matches(regex!("foo").into())); + test_it("equals", 2, Equals("foo".into())); + test_it("starts_with", 1, StartsWith("foo".into())); + test_it("ends_with", 2, EndsWith("foo".into())); + test_it("contains", 1, Contains("foo".into())); +} + +#[test] +fn pattern_simple_boolean_expr() { + fn test_it(a: &str, pattern: impl Into) { + assert_eq!(parse_str!(Pattern, a), Ok(pattern.into())); + } + test_it( + "!all", + NotExpression::Not(Box::new(SimpleSelectorName::All.into())), + ); + test_it( + "all && any", + AndExpression::And( + SimpleSelectorName::All.into(), + Box::new(SimpleSelectorName::Any.into()), + ), + ); + test_it( + 
"all || any", + OrExpression::Or( + SimpleSelectorName::All.into(), + Box::new(SimpleSelectorName::Any.into()), + ), + ); +} + +#[test] +fn pattern_longer_boolean_expr() { + fn test_it(a: &str, pattern: impl Into) { + assert_eq!(parse_str!(Pattern, a), Ok(pattern.into())); + } + test_it( + "all || any || none", + OrExpression::Or( + SimpleSelectorName::All.into(), + Box::new( + OrExpression::Or( + SimpleSelectorName::Any.into(), + Box::new(SimpleSelectorName::None.into()), + ) + .into(), + ), + ), + ); + test_it( + "all || any && none", + OrExpression::Or( + SimpleSelectorName::All.into(), + Box::new( + AndExpression::And( + SimpleSelectorName::Any.into(), + Box::new(SimpleSelectorName::None.into()), + ) + .into(), + ), + ), + ); + test_it( + "all && any || none", + OrExpression::Or( + AndExpression::And( + SimpleSelectorName::All.into(), + Box::new(SimpleSelectorName::Any.into()), + ), + Box::new(SimpleSelectorName::None.into()), + ), + ); +} + +#[test] +fn pattern_complicated_boolean_expr() { + fn test_it(a: &str, pattern: impl Into) { + assert_eq!(parse_str!(Pattern, a), Ok(pattern.into())); + } + test_it( + "( all || any ) && none - library", + AndExpression::And( + OrExpression::Or( + SimpleSelectorName::All.into(), + Box::new(SimpleSelectorName::Any.into()), + ) + .into(), + Box::new(AndExpression::Diff( + SimpleSelectorName::None.into(), + Box::new(SimpleSelectorName::Library.into()), + )), + ), + ); + test_it( + "!( all || any ) && none", + AndExpression::And( + NotExpression::Not(Box::new( + OrExpression::Or( + SimpleSelectorName::All.into(), + Box::new(SimpleSelectorName::Any.into()), + ) + .into(), + )), + Box::new(SimpleSelectorName::None.into()), + ), + ); + + test_it( + "not ( all or any ) and none minus library", + AndExpression::And( + NotExpression::Not(Box::new( + OrExpression::Or( + SimpleSelectorName::All.into(), + Box::new(SimpleSelectorName::Any.into()), + ) + .into(), + )), + Box::new(AndExpression::Diff( + SimpleSelectorName::None.into(), + 
Box::new(SimpleSelectorName::Library.into()), + )), + ), + ); +} + +#[test] +fn pattern_complicated_boolean_expr_compound() { + fn test_it(a: &str, pattern: impl Into) { + assert_eq!(parse_str!(Pattern, a), Ok(pattern.into())); + } + + test_it( + "binary.starts_with(hi) && name.matches/([a-z]+::)*[a-z]+/", + AndExpression::And( + CompoundSelector { + name: CompoundSelectorName::Binary, + matcher: Matcher::StartsWith("hi".into()), + } + .into(), + Box::new( + CompoundSelector { + name: CompoundSelectorName::Name, + matcher: Matcher::Matches(regex!("([a-z]+::)*[a-z]+").into()), + } + .into(), + ), + ), + ); + + test_it( + "( binary.starts_with(hi) && name.matches/([a-z]+::)*[a-z]+/ ) || benchmark.ends_with(jo)", + OrExpression::Or( + NotExpression::Simple( + AndExpression::And( + CompoundSelector { + name: CompoundSelectorName::Binary, + matcher: Matcher::StartsWith("hi".into()), + } + .into(), + Box::new( + CompoundSelector { + name: CompoundSelectorName::Name, + matcher: Matcher::Matches(regex!("([a-z]+::)*[a-z]+").into()), + } + .into(), + ), + ) + .into(), + ) + .into(), + Box::new( + CompoundSelector { + name: CompoundSelectorName::Benchmark, + matcher: Matcher::EndsWith("jo".into()), + } + .into(), + ), + ), + ); +} diff --git a/crates/cargo-metest/src/progress.rs b/crates/cargo-metest/src/progress.rs new file mode 100644 index 00000000..0ada0777 --- /dev/null +++ b/crates/cargo-metest/src/progress.rs @@ -0,0 +1,87 @@ +mod driver; +mod multiple_progress_bars; +mod no_bar; +mod quiet_no_bar; +mod quiet_progress_bar; +mod test_listing; + +use anyhow::Result; +use colored::Colorize as _; +pub use driver::{DefaultProgressDriver, ProgressDriver}; +use indicatif::{ProgressBar, ProgressStyle}; +use maelstrom_base::stats::JobStateCounts; +pub use multiple_progress_bars::MultipleProgressBars; +pub use no_bar::NoBar; +pub use quiet_no_bar::QuietNoBar; +pub use quiet_progress_bar::QuietProgressBar; +pub use test_listing::{TestListingProgress, 
TestListingProgressNoSpinner};
+
+pub trait ProgressIndicator: Clone + Send + Sync + 'static {
+    /// Prints a line to stdout while not interfering with any progress bars
+    fn println(&self, msg: String);
+
+    /// Prints a line to stdout while not interfering with any progress bars, marking it as
+    /// stderr output
+    fn eprintln(&self, msg: impl AsRef<str>) {
+        for line in msg.as_ref().lines() {
+            self.println(format!("{} {line}", "stderr:".red()))
+        }
+    }
+
+    /// Meant to be called when a job is complete; it updates the completed-jobs bar
+    fn job_finished(&self) {}
+
+    /// Update the number of pending jobs indicated
+    fn update_length(&self, _new_length: u64) {}
+
+    /// Add another progress bar which is meant to show progress of some sub-task, like downloading
+    /// an image or uploading an artifact
+    fn new_side_progress(&self, _msg: impl Into<String>) -> Option<ProgressBar> {
+        None
+    }
+
+    /// Update any information pertaining to the states of jobs. Should be called repeatedly until
+    /// it returns false
+    fn update_job_states(&self, _counts: JobStateCounts) -> Result<bool> {
+        Ok(false)
+    }
+
+    /// Tick any spinners
+    fn tick(&self) -> bool {
+        false
+    }
+
+    /// Update the message for the spinner which indicates jobs are being enqueued
+    fn update_enqueue_status(&self, _msg: impl Into<String>) {}
+
+    /// Called when all jobs are running
+    fn done_queuing_jobs(&self) {}
+
+    /// Called when all jobs are done
+    fn finished(&self) -> Result<()> {
+        Ok(())
+    }
+}
+
+// waiting for artifacts, pending, running, complete
+const COLORS: [&str; 4] = ["red", "yellow", "blue", "green"];
+
+fn make_progress_bar(
+    color: &str,
+    message: impl Into<String>,
+    msg_len: usize,
+    bytes: bool,
+) -> ProgressBar {
+    let prog_line = if bytes {
+        "{bytes}/{total_bytes}"
+    } else {
+        "{pos}/{len}"
+    };
+    ProgressBar::new(0).with_message(message.into()).with_style(
+        ProgressStyle::with_template(&format!(
+            "{{wide_bar:.{color}}} {prog_line} {{msg:{msg_len}}}"
+        ))
+        .unwrap()
+        .progress_chars("##-"),
+    )
+}
diff
--git a/crates/cargo-metest/src/progress/driver.rs b/crates/cargo-metest/src/progress/driver.rs new file mode 100644 index 00000000..fe6331c0 --- /dev/null +++ b/crates/cargo-metest/src/progress/driver.rs @@ -0,0 +1,81 @@ +use super::ProgressIndicator; +use anyhow::Result; +use maelstrom_client::Client; +use std::{ + sync::{ + atomic::{AtomicBool, Ordering}, + Arc, Mutex, + }, + thread, + time::Duration, +}; + +pub trait ProgressDriver<'scope> { + fn drive<'dep, ProgressIndicatorT>( + &mut self, + client: &'dep Mutex, + ind: ProgressIndicatorT, + ) where + ProgressIndicatorT: ProgressIndicator, + 'dep: 'scope; + + fn stop(&mut self) -> Result<()>; +} + +pub struct DefaultProgressDriver<'scope, 'env> { + scope: &'scope thread::Scope<'scope, 'env>, + handle: Option>>, + canceled: Arc, +} + +impl<'scope, 'env> DefaultProgressDriver<'scope, 'env> { + pub fn new(scope: &'scope thread::Scope<'scope, 'env>) -> Self { + Self { + scope, + handle: None, + canceled: Default::default(), + } + } +} + +impl<'scope, 'env> Drop for DefaultProgressDriver<'scope, 'env> { + fn drop(&mut self) { + self.canceled.store(true, Ordering::Release); + } +} + +impl<'scope, 'env> ProgressDriver<'scope> for DefaultProgressDriver<'scope, 'env> { + fn drive<'dep, ProgressIndicatorT>( + &mut self, + client: &'dep Mutex, + ind: ProgressIndicatorT, + ) where + ProgressIndicatorT: ProgressIndicator, + 'dep: 'scope, + { + let canceled = self.canceled.clone(); + self.handle = Some(self.scope.spawn(move || { + thread::scope(|scope| { + scope.spawn(|| { + while ind.tick() && !canceled.load(Ordering::Acquire) { + thread::sleep(Duration::from_millis(500)) + } + }); + while !canceled.load(Ordering::Acquire) { + let counts = client.lock().unwrap().get_job_state_counts_async()?; + if !ind.update_job_states(counts.recv()?)? 
{ + break; + } + + // Don't hammer server with requests + thread::sleep(Duration::from_millis(500)); + } + Ok(()) + }) + })); + } + + fn stop(&mut self) -> Result<()> { + self.handle.take().unwrap().join().unwrap() + } +} diff --git a/crates/cargo-metest/src/progress/multiple_progress_bars.rs b/crates/cargo-metest/src/progress/multiple_progress_bars.rs new file mode 100644 index 00000000..ca6c8084 --- /dev/null +++ b/crates/cargo-metest/src/progress/multiple_progress_bars.rs @@ -0,0 +1,129 @@ +use super::{ProgressIndicator, COLORS}; +use anyhow::Result; +use indicatif::{MultiProgress, ProgressBar, ProgressDrawTarget, TermLike}; +use maelstrom_base::stats::{JobState, JobStateCounts}; +use std::{ + cmp::max, + collections::HashMap, + sync::{Arc, Mutex}, +}; + +#[derive(Default)] +struct State { + done_queuing_jobs: bool, + length: u64, + finished: u64, +} + +#[derive(Clone)] +pub struct MultipleProgressBars { + multi_bar: MultiProgress, + bars: HashMap, + enqueue_spinner: ProgressBar, + state: Arc>, +} + +impl MultipleProgressBars { + pub fn new(term: impl TermLike + 'static) -> Self { + let multi_bar = MultiProgress::new(); + multi_bar.set_draw_target(ProgressDrawTarget::term_like_with_hz(Box::new(term), 20)); + let enqueue_spinner = + multi_bar.add(ProgressBar::new_spinner().with_message("building artifacts...")); + + let mut bars = HashMap::new(); + for (state, color) in JobState::iter().zip(COLORS) { + let bar = multi_bar.add(super::make_progress_bar( + color, + state.to_string(), + 21, + false, + )); + bars.insert(state, bar); + } + Self { + multi_bar, + bars, + enqueue_spinner, + state: Default::default(), + } + } +} + +impl ProgressIndicator for MultipleProgressBars { + fn println(&self, msg: String) { + let com = self.bars.get(&JobState::Complete).unwrap(); + com.println(msg); + } + + fn job_finished(&self) { + let mut state = self.state.lock().unwrap(); + state.finished += 1; + + for bar in self.bars.values() { + let pos = max(bar.position(), 
state.finished); + bar.set_position(pos); + } + } + + fn update_length(&self, new_length: u64) { + let mut state = self.state.lock().unwrap(); + state.length = new_length; + + for bar in self.bars.values() { + bar.set_length(new_length); + } + } + + fn new_side_progress(&self, msg: impl Into) -> Option { + Some( + self.multi_bar + .insert(1, super::make_progress_bar("white", msg, 21, true)), + ) + } + + fn update_enqueue_status(&self, msg: impl Into) { + self.enqueue_spinner.set_message(msg.into()); + } + + fn update_job_states(&self, counts: JobStateCounts) -> Result { + let state = self.state.lock().unwrap(); + + for job_state in JobState::iter().filter(|s| s != &JobState::Complete) { + let jobs = JobState::iter() + .filter(|s| s >= &job_state) + .map(|s| counts[s]) + .sum(); + let bar = self.bars.get(&job_state).unwrap(); + let pos = max(jobs, state.finished); + bar.set_position(pos); + } + + let finished = state.done_queuing_jobs && state.finished >= state.length; + Ok(!finished) + } + + fn tick(&self) -> bool { + let state = self.state.lock().unwrap(); + + if state.done_queuing_jobs { + return false; + } + + self.enqueue_spinner.tick(); + true + } + + fn done_queuing_jobs(&self) { + let mut state = self.state.lock().unwrap(); + state.done_queuing_jobs = true; + + self.enqueue_spinner.finish_and_clear(); + } + + fn finished(&self) -> Result<()> { + for bar in self.bars.values() { + bar.finish_and_clear(); + } + Ok(()) + } +} diff --git a/crates/cargo-metest/src/progress/no_bar.rs b/crates/cargo-metest/src/progress/no_bar.rs new file mode 100644 index 00000000..8d9c3251 --- /dev/null +++ b/crates/cargo-metest/src/progress/no_bar.rs @@ -0,0 +1,29 @@ +use super::ProgressIndicator; +use anyhow::Result; +use indicatif::TermLike; + +#[derive(Clone)] +pub struct NoBar { + term: TermT, +} + +impl NoBar { + pub fn new(term: TermT) -> Self { + Self { term } + } +} + +impl ProgressIndicator for NoBar +where + TermT: TermLike + Clone + Send + Sync + 'static, +{ + fn 
println(&self, msg: String) {
+        self.term.write_line(&msg).ok();
+    }
+
+    fn finished(&self) -> Result<()> {
+        self.term.write_line("all jobs completed")?;
+        self.term.flush()?;
+        Ok(())
+    }
+}
diff --git a/crates/cargo-metest/src/progress/quiet_no_bar.rs b/crates/cargo-metest/src/progress/quiet_no_bar.rs
new file mode 100644
index 00000000..bf0975dd
--- /dev/null
+++ b/crates/cargo-metest/src/progress/quiet_no_bar.rs
@@ -0,0 +1,29 @@
+use super::ProgressIndicator;
+use anyhow::Result;
+use indicatif::TermLike;
+
+#[derive(Clone)]
+pub struct QuietNoBar<TermT> {
+    term: TermT,
+}
+
+impl<TermT> QuietNoBar<TermT> {
+    pub fn new(term: TermT) -> Self {
+        Self { term }
+    }
+}
+
+impl<TermT> ProgressIndicator for QuietNoBar<TermT>
+where
+    TermT: TermLike + Clone + Send + Sync + 'static,
+{
+    fn println(&self, _msg: String) {
+        // quiet mode doesn't print anything
+    }
+
+    fn finished(&self) -> Result<()> {
+        self.term.write_line("all jobs completed")?;
+        self.term.flush()?;
+        Ok(())
+    }
+}
diff --git a/crates/cargo-metest/src/progress/quiet_progress_bar.rs b/crates/cargo-metest/src/progress/quiet_progress_bar.rs
new file mode 100644
index 00000000..f6df52fc
--- /dev/null
+++ b/crates/cargo-metest/src/progress/quiet_progress_bar.rs
@@ -0,0 +1,35 @@
+use super::ProgressIndicator;
+use anyhow::Result;
+use indicatif::{ProgressBar, ProgressDrawTarget, TermLike};
+
+#[derive(Clone)]
+pub struct QuietProgressBar {
+    bar: ProgressBar,
+}
+
+impl QuietProgressBar {
+    pub fn new(term: impl TermLike + 'static) -> Self {
+        let bar = super::make_progress_bar("white", "jobs", 4, false);
+        bar.set_draw_target(ProgressDrawTarget::term_like_with_hz(Box::new(term), 20));
+        Self { bar }
+    }
+}
+
+impl ProgressIndicator for QuietProgressBar {
+    fn println(&self, _msg: String) {
+        // quiet mode doesn't print anything
+    }
+
+    fn job_finished(&self) {
+        self.bar.inc(1);
+    }
+
+    fn update_length(&self, new_length: u64) {
+        self.bar.set_length(new_length);
+    }
+
+    fn finished(&self) -> Result<()> {
+        self.bar.finish_and_clear();
+ Ok(()) + } +} diff --git a/crates/cargo-metest/src/progress/test_listing.rs b/crates/cargo-metest/src/progress/test_listing.rs new file mode 100644 index 00000000..b648586e --- /dev/null +++ b/crates/cargo-metest/src/progress/test_listing.rs @@ -0,0 +1,78 @@ +use super::ProgressIndicator; +use anyhow::Result; +use indicatif::{ProgressBar, TermLike}; +use std::sync::{Arc, Mutex}; + +#[derive(Default)] +struct State { + done_queuing_jobs: bool, +} + +#[derive(Clone)] +pub struct TestListingProgress { + enqueue_spinner: ProgressBar, + state: Arc<Mutex<State>>, +} + +impl TestListingProgress { + pub fn new(_term: impl TermLike + 'static) -> Self { + let enqueue_spinner = ProgressBar::new_spinner().with_message("building artifacts..."); + Self { + enqueue_spinner, + state: Default::default(), + } + } +} + +impl ProgressIndicator for TestListingProgress { + fn println(&self, msg: String) { + self.enqueue_spinner.println(msg); + } + + fn update_enqueue_status(&self, msg: impl Into<String>) { + self.enqueue_spinner.set_message(msg.into()); + } + + fn tick(&self) -> bool { + let state = self.state.lock().unwrap(); + + if state.done_queuing_jobs { + return false; + } + + self.enqueue_spinner.tick(); + true + } + + fn done_queuing_jobs(&self) { + let mut state = self.state.lock().unwrap(); + state.done_queuing_jobs = true; + + self.enqueue_spinner.finish_and_clear(); + } +} + +#[derive(Clone)] +pub struct TestListingProgressNoSpinner<TermT> { + term: TermT, +} + +impl<TermT> TestListingProgressNoSpinner<TermT> { + pub fn new(term: TermT) -> Self { + Self { term } + } +} + +impl<TermT> ProgressIndicator for TestListingProgressNoSpinner<TermT> +where + TermT: TermLike + Clone + Send + Sync + 'static, +{ + fn println(&self, msg: String) { + self.term.write_line(&msg).ok(); + } + + fn finished(&self) -> Result<()> { + self.term.flush()?; + Ok(()) + } +} diff --git a/crates/cargo-metest/src/test_listing.rs b/crates/cargo-metest/src/test_listing.rs new file mode 100644 index 00000000..56b9affc --- /dev/null +++
b/crates/cargo-metest/src/test_listing.rs @@ -0,0 +1,179 @@ +use crate::pattern; +use anyhow::{anyhow, Result}; +use cargo_metadata::{Artifact as CargoArtifact, Package as CargoPackage, Target as CargoTarget}; +use maelstrom_util::fs::Fs; +use serde::{Deserialize, Serialize}; +use serde_repr::{Deserialize_repr, Serialize_repr}; +use serde_with::{serde_as, FromInto}; +use std::collections::{BTreeMap, HashMap, HashSet}; +use std::path::Path; + +pub use crate::pattern::ArtifactKind; + +#[derive(Default, Copy, Clone, Debug, PartialEq, Eq, Serialize_repr, Deserialize_repr)] +#[repr(u32)] +pub enum TestListingVersion { + V0 = 0, + #[default] + V1 = 1, +} + +#[derive(Clone, Debug, Deserialize, Eq, Hash, Ord, PartialEq, PartialOrd, Serialize)] +pub struct ArtifactKey { + pub name: String, + pub kind: ArtifactKind, +} + +impl ArtifactKey { + fn from_target(target: &CargoTarget) -> Self { + Self { + name: target.name.clone(), + kind: ArtifactKind::from_target(target), + } + } +} + +#[derive(Debug, Clone, PartialEq, Eq, Default, Serialize, Deserialize)] +pub struct ArtifactCases { + pub cases: Vec<String>, +} + +#[derive(Debug, Clone, Serialize, Deserialize)] +struct Artifact { + #[serde(flatten)] + key: ArtifactKey, + #[serde(flatten)] + value: ArtifactCases, +} + +#[derive(Debug, Clone, Default, Serialize, Deserialize)] +#[serde(transparent)] +struct ArtifactVec(Vec<Artifact>); + +impl From<ArtifactVec> for BTreeMap<ArtifactKey, ArtifactCases> { + fn from(v: ArtifactVec) -> Self { + v.0.into_iter().map(|a| (a.key, a.value)).collect() + } +} + +impl From<BTreeMap<ArtifactKey, ArtifactCases>> for ArtifactVec { + fn from(m: BTreeMap<ArtifactKey, ArtifactCases>) -> Self { + Self( + m.into_iter() + .map(|(key, value)| Artifact { key, value }) + .collect(), + ) + } +} + +#[serde_as] +#[derive(Debug, Default, PartialEq, Eq, Serialize, Deserialize)] +pub struct Package { + #[serde_as(as = "FromInto<ArtifactVec>")] + pub artifacts: BTreeMap<ArtifactKey, ArtifactCases>, +} + +#[derive(Debug, Default, PartialEq, Eq, Serialize, Deserialize)] +pub struct TestListing { + pub version: TestListingVersion, + #[serde(flatten)] + pub packages: BTreeMap<String, Package>,
+} + +fn filter_case( + package: &str, + artifact: &ArtifactKey, + case: &str, + filter: &pattern::Pattern, +) -> bool { + let c = pattern::Context { + package: package.into(), + artifact: Some(pattern::Artifact { + name: artifact.name.clone(), + kind: artifact.kind, + }), + case: Some(pattern::Case { name: case.into() }), + }; + pattern::interpret_pattern(filter, &c).expect("case is provided") +} + +impl TestListing { + pub fn add_cases(&mut self, artifact: &CargoArtifact, cases: &[String]) { + let package_name = artifact.package_id.repr.split(' ').next().unwrap().into(); + let artifact_key = ArtifactKey::from_target(&artifact.target); + let package = self.packages.entry(package_name).or_default(); + package.artifacts.insert( + artifact_key, + ArtifactCases { + cases: cases.to_vec(), + }, + ); + } + + pub fn remove_package(&mut self, package: &str) { + self.packages.remove(package); + } + + pub fn expected_job_count(&self, filter: &pattern::Pattern) -> u64 { + self.packages + .iter() + .flat_map(|(p, a)| { + a.artifacts + .iter() + .flat_map(move |(a, c)| c.cases.iter().map(move |c| (p, a, c))) + }) + .filter(|(p, a, c)| filter_case(p, a, c, filter)) + .count() as u64 + } + + pub fn retain_packages(&mut self, existing_packages_slice: &[&CargoPackage]) { + let existing_packages: HashMap<&String, &CargoPackage> = existing_packages_slice + .iter() + .map(|p| (&p.name, *p)) + .collect(); + let mut new_packages = BTreeMap::new(); + for (name, mut pkg) in std::mem::take(&mut self.packages).into_iter() { + let Some(existing_package) = existing_packages.get(&name) else { + continue; + }; + let artifact_keys: HashSet<_> = existing_package + .targets + .iter() + .map(ArtifactKey::from_target) + .collect(); + pkg.artifacts.retain(|key, _| artifact_keys.contains(key)); + new_packages.insert(name, pkg); + } + + self.packages = new_packages; + } +} + +pub const LAST_TEST_LISTING_NAME: &str = "maelstrom-test-listing.toml"; + +pub fn load_test_listing(path: &Path) -> Result<Option<TestListing>> {
let fs = Fs::new(); + if let Some(contents) = fs.read_to_string_if_exists(path)? { + let mut table: toml::Table = toml::from_str(&contents)?; + let version: TestListingVersion = table + .remove("version") + .ok_or(anyhow!("missing version"))? + .try_into()?; + if version != TestListingVersion::default() { + Ok(None) + } else { + Ok(toml::from_str(&contents)?) + } + } else { + Ok(None) + } +} + +pub fn write_test_listing(path: &Path, job_listing: &TestListing) -> Result<()> { + let fs = Fs::new(); + if let Some(parent) = path.parent() { + fs.create_dir_all(parent)?; + } + fs.write(path, toml::to_string_pretty(job_listing)?)?; + Ok(()) +} diff --git a/crates/cargo-metest/src/visitor.rs b/crates/cargo-metest/src/visitor.rs new file mode 100644 index 00000000..7eef7cf4 --- /dev/null +++ b/crates/cargo-metest/src/visitor.rs @@ -0,0 +1,224 @@ +use crate::ProgressIndicator; +use anyhow::Result; +use colored::{ColoredString, Colorize as _}; +use indicatif::TermLike; +use maelstrom_base::{ + ClientJobId, JobError, JobOutputResult, JobStatus, JobStringResult, JobSuccess, +}; +use maelstrom_util::process::{ExitCode, ExitCodeAccumulator}; +use std::sync::{Arc, Mutex}; +use unicode_truncate::UnicodeTruncateStr as _; +use unicode_width::UnicodeWidthStr as _; + +enum CaseResult { + Ignored, + Ran(ExitCode), +} + +#[derive(Default)] +pub struct JobStatusTracker { + statuses: Mutex<Vec<(String, CaseResult)>>, + exit_code: ExitCodeAccumulator, +} + +impl JobStatusTracker { + pub fn job_exited(&self, case: String, exit_code: ExitCode) { + let mut statuses = self.statuses.lock().unwrap(); + statuses.push((case, CaseResult::Ran(exit_code))); + self.exit_code.add(exit_code); + } + + pub fn job_ignored(&self, case: String) { + let mut statuses = self.statuses.lock().unwrap(); + statuses.push((case, CaseResult::Ignored)); + } + + pub fn print_summary(&self, width: usize, term: impl TermLike) -> Result<()> { + term.write_line("")?; + + let heading = " Test Summary "; + let equal_width = (width - heading.width()) /
2; + term.write_line(&format!( + "{empty:= 0 { + column1_width = std::cmp::max(column1_width, ignore.width()); + } + + term.write_line(&format!( + "{:max_digits$}", + success.green(), + ))?; + term.write_line(&format!( + "{:max_digits$}", + failure.red(), + ))?; + let failed_width = failed.clone().map(|(n, _)| n.width()).max().unwrap_or(0); + for (failed, _) in failed { + term.write_line(&format!(" {failed: 0 { + term.write_line(&format!( + "{:max_digits$}", + ignore.yellow(), + ))?; + let failed_width = ignored.clone().map(|(n, _)| n.width()).max().unwrap_or(0); + for (ignored, _) in ignored { + term.write_line(&format!( + " {ignored: ExitCode { + self.exit_code.get() + } +} + +pub struct JobStatusVisitor { + tracker: Arc, + case: String, + width: usize, + ind: ProgressIndicatorT, +} + +impl JobStatusVisitor { + pub fn new( + tracker: Arc, + case: String, + width: usize, + ind: ProgressIndicatorT, + ) -> Self { + Self { + tracker, + case, + width, + ind, + } + } +} + +impl JobStatusVisitor { + fn print_job_result(&self, result_str: ColoredString) { + if self.width > 10 { + let case_width = self.case.width(); + let result_width = result_str.width(); + if case_width + result_width < self.width { + let dots_width = self.width - result_width - case_width; + let case = self.case.bold(); + self.ind.println(format!( + "{case}{empty:. Result<()> { + let result_str: ColoredString; + let mut result_details: Option = None; + let mut test_output_lines: Vec = vec![]; + match result { + Ok(JobSuccess { status, stderr, .. 
}) => { + let mut job_failed = true; + match status { + JobStatus::Exited(code) => { + result_str = if code == 0 { + job_failed = false; + "OK".green() + } else { + "FAIL".red() + }; + self.tracker + .job_exited(self.case.clone(), ExitCode::from(code)); + } + JobStatus::Signaled(signo) => { + result_str = "FAIL".red(); + result_details = Some(format!("killed by signal {signo}")); + self.tracker + .job_exited(self.case.clone(), ExitCode::FAILURE); + } + }; + if job_failed { + match stderr { + JobOutputResult::None => {} + JobOutputResult::Inline(bytes) => { + test_output_lines.push(String::from_utf8_lossy(&bytes).into()); + } + JobOutputResult::Truncated { first, truncated } => { + test_output_lines.push(String::from_utf8_lossy(&first).into()); + test_output_lines.push(format!( + "job {cjid}: stderr truncated, {truncated} bytes lost" + )); + } + } + } + } + Err(JobError::Execution(err)) => { + result_str = "ERR".yellow(); + result_details = Some(format!("execution error: {err}")); + self.tracker + .job_exited(self.case.clone(), ExitCode::FAILURE); + } + Err(JobError::System(err)) => { + result_str = "ERR".yellow(); + result_details = Some(format!("system error: {err}")); + self.tracker + .job_exited(self.case.clone(), ExitCode::FAILURE); + } + } + self.print_job_result(result_str); + + if let Some(details_str) = result_details { + self.ind.println(details_str); + } + for line in test_output_lines { + self.ind.eprintln(line); + } + self.ind.job_finished(); + Ok(()) + } + + pub fn job_ignored(&self) { + self.print_job_result("IGNORED".yellow()); + self.tracker.job_ignored(self.case.clone()); + self.ind.job_finished(); + } +} diff --git a/crates/cargo-metest/tests/integration_test.rs b/crates/cargo-metest/tests/integration_test.rs new file mode 100644 index 00000000..f96636b8 --- /dev/null +++ b/crates/cargo-metest/tests/integration_test.rs @@ -0,0 +1,1438 @@ +use anyhow::Result; +use assert_matches::assert_matches; +use cargo_metest::test_listing::{ + 
load_test_listing, ArtifactCases, ArtifactKey, ArtifactKind, Package, TestListing, + LAST_TEST_LISTING_NAME, +}; +use cargo_metest::{ + config::Quiet, + main_app_new, + progress::{ProgressDriver, ProgressIndicator}, + EnqueueResult, ListAction, MainAppDeps, +}; +use indicatif::InMemoryTerm; +use maelstrom_base::{ + proto::{BrokerToClient, ClientToBroker, Hello}, + stats::{JobState, JobStateCounts}, + JobOutputResult, JobSpec, JobStatus, JobStringResult, JobSuccess, +}; +use maelstrom_client::{Client, ClientDeps, ClientDriver}; +use maelstrom_util::{config::BrokerAddr, fs::Fs}; +use serde::{de::DeserializeOwned, Serialize}; +use std::{ + cell::RefCell, + collections::HashMap, + io::{self, Read as _, Write as _}, + net::{Ipv6Addr, SocketAddrV6, TcpListener, TcpStream}, + os::unix::fs::PermissionsExt as _, + path::Path, + rc::Rc, + sync::{Arc, Mutex}, +}; +use tempfile::{tempdir, TempDir}; + +fn path_file_name(path: &Path) -> String { + path.file_name().unwrap().to_str().unwrap().to_owned() +} + +fn put_file(fs: &Fs, path: &Path, contents: &str) { + let mut f = fs.create_file(path).unwrap(); + f.write_all(contents.as_bytes()).unwrap(); +} + +fn put_script(fs: &Fs, path: &Path, contents: &str) { + let mut f = fs.create_file(path).unwrap(); + f.write_all( + format!( + "#!/bin/bash + set -e + set -o pipefail + {contents} + " + ) + .as_bytes(), + ) + .unwrap(); + + let mut perms = f.metadata().unwrap().permissions(); + perms.set_mode(0o777); + f.set_permissions(perms).unwrap(); +} + +fn generate_cargo_project(tmp_dir: &TempDir, fake_tests: &FakeTests) -> String { + let fs = Fs::new(); + let workspace_dir = tmp_dir.path().join("workspace"); + fs.create_dir(&workspace_dir).unwrap(); + let cargo_path = workspace_dir.join("cargo"); + put_script( + &fs, + &cargo_path, + &format!( + "\ + cd {workspace_dir:?}\n\ + cargo $@ | sort\n\ + " + ), + ); + put_file( + &fs, + &workspace_dir.join("Cargo.toml"), + "\ + [workspace]\n\ + members = [ \"crates/*\"] + ", + ); + let crates_dir = 
workspace_dir.join("crates"); + fs.create_dir(&crates_dir).unwrap(); + for binary in &fake_tests.test_binaries { + let crate_name = &binary.name; + let project_dir = crates_dir.join(&crate_name); + fs.create_dir(&project_dir).unwrap(); + put_file( + &fs, + &project_dir.join("Cargo.toml"), + &format!( + "\ + [package]\n\ + name = \"{crate_name}\"\n\ + version = \"0.1.0\"\n\ + [lib]\n\ + ", + ), + ); + let src_dir = project_dir.join("src"); + fs.create_dir(&src_dir).unwrap(); + let mut test_src = String::new(); + for test_case in &binary.tests { + let test_name = &test_case.name; + let ignored = if test_case.ignored { "#[ignore]" } else { "" }; + test_src += &format!( + "\ + #[test]\n\ + {ignored}\ + fn {test_name}() {{}}\n\ + ", + ); + } + put_file(&fs, &src_dir.join("lib.rs"), &test_src); + } + + cargo_path.display().to_string() +} + +struct MessageStream { + stream: TcpStream, +} + +impl MessageStream { + fn next<RetT: DeserializeOwned>(&mut self) -> io::Result<RetT> { + let mut msg_len: [u8; 4] = [0; 4]; + self.stream.read_exact(&mut msg_len)?; + let mut buf = vec![0; u32::from_le_bytes(msg_len) as usize]; + self.stream.read_exact(&mut buf).unwrap(); + Ok(bincode::deserialize_from(&buf[..]).unwrap()) + } +} + +fn send_message(mut stream: &TcpStream, msg: &impl Serialize) { + let buf = bincode::serialize(msg).unwrap(); + stream.write_all(&(buf.len() as u32).to_le_bytes()).unwrap(); + stream.write_all(&buf[..]).unwrap(); +} + +fn test_path(spec: &JobSpec) -> TestPath { + let binary = spec + .program + .as_str() + .split("/") + .last() + .unwrap() + .split("-") + .next() + .unwrap() + .into(); + let test_name = spec + .arguments + .iter() + .filter(|a| !a.starts_with("-")) + .next() + .unwrap() + .clone(); + TestPath { binary, test_name } +} + +struct FakeBroker { + #[allow(dead_code)] + listener: TcpListener, + state: BrokerState, + address: BrokerAddr, +} + +struct FakeBrokerConnection { + messages: MessageStream, + state: BrokerState, +} + +impl FakeBroker { + fn new(state: BrokerState) ->
Self { + let listener = + TcpListener::bind(SocketAddrV6::new(Ipv6Addr::UNSPECIFIED, 0, 0, 0)).unwrap(); + let address = BrokerAddr::new(listener.local_addr().unwrap()); + + Self { + listener, + state, + address, + } + } + + fn accept(&mut self) -> FakeBrokerConnection { + let (stream, _) = self.listener.accept().unwrap(); + let mut messages = MessageStream { stream }; + + let msg: Hello = messages.next().unwrap(); + assert_matches!(msg, Hello::Client); + + FakeBrokerConnection { + messages, + state: self.state.clone(), + } + } +} + +impl FakeBrokerConnection { + fn process(&mut self, count: usize) { + for _ in 0..count { + let msg = self.messages.next::<ClientToBroker>().unwrap(); + match msg { + ClientToBroker::JobRequest(id, spec) => { + let test_path = test_path(&spec); + match self.state.job_responses.remove(&test_path).unwrap() { + JobAction::Respond(res) => send_message( + &self.messages.stream, + &BrokerToClient::JobResponse(id, res), + ), + JobAction::Ignore => (), + } + } + ClientToBroker::JobStateCountsRequest => send_message( + &self.messages.stream, + &BrokerToClient::JobStateCountsResponse(self.state.job_states.clone()), + ), + + _ => (), + } + } + } +} + +#[derive(Clone)] +enum JobAction { + Ignore, + Respond(JobStringResult), +} + +#[derive(Default, Clone)] +struct BrokerState { + job_responses: HashMap<TestPath, JobAction>, + job_states: JobStateCounts, +} + +#[derive(Clone)] +struct FakeTestCase { + name: String, + ignored: bool, + desired_state: JobState, +} + +impl Default for FakeTestCase { + fn default() -> Self { + Self { + name: "".into(), + ignored: false, + desired_state: JobState::Complete, + } + } +} + +#[derive(Clone, Default)] +struct FakeTestBinary { + name: String, + tests: Vec<FakeTestCase>, +} + +#[derive(Clone)] +struct FakeTests { + test_binaries: Vec<FakeTestBinary>, +} + +impl FakeTests { + fn listing(&self) -> TestListing { + TestListing { + version: Default::default(), + packages: self + .test_binaries + .iter() + .map(|b| { + ( + b.name.clone(), + Package { + artifacts: [( + ArtifactKey {
name: b.name.clone(), + kind: ArtifactKind::Library, + }, + ArtifactCases { + cases: b.tests.iter().map(|t| t.name.clone()).collect(), + }, + )] + .into_iter() + .collect(), + }, + ) + }) + .collect(), + } + } +} + +#[derive(Clone, Hash, PartialOrd, Ord, PartialEq, Eq)] +struct TestPath { + binary: String, + test_name: String, +} + +impl FakeTests { + fn all_test_paths(&self) -> impl Iterator<Item = (&FakeTestCase, TestPath)> + '_ { + self.test_binaries + .iter() + .map(|b| { + b.tests.iter().filter_map(|t| { + (!t.ignored).then(|| { + ( + t, + TestPath { + binary: b.name.clone(), + test_name: t.name.clone(), + }, + ) + }) + }) + }) + .flatten() + } + + fn get(&self, package_name: &str, case: &str) -> &FakeTestCase { + let binary = self + .test_binaries + .iter() + .find(|b| b.name == package_name) + .unwrap(); + binary.tests.iter().find(|t| t.name == case).unwrap() + } +} + +#[derive(Default, Clone)] +struct TestClientDriver { + deps: Arc<Mutex<Option<ClientDeps>>>, +} + +impl ClientDriver for TestClientDriver { + fn drive(&mut self, deps: ClientDeps) { + *self.deps.lock().unwrap() = Some(deps); + } + + fn stop(&mut self) -> Result<()> { + Ok(()) + } +} + +impl TestClientDriver { + fn process_broker_msg(&self, count: usize) { + let mut locked_deps = self.deps.lock().unwrap(); + let deps = locked_deps.as_mut().unwrap(); + + for _ in 0..count { + deps.socket_reader.process_one(); + deps.dispatcher.try_process_one().unwrap(); + } + } + + fn process_client_messages(&self) { + let mut locked_deps = self.deps.lock().unwrap(); + let deps = locked_deps.as_mut().unwrap(); + while deps.dispatcher.try_process_one().is_ok() {} + } +} + +#[derive(Default, Clone)] +struct TestProgressDriver<'scope> { + update_func: Rc<RefCell<Option<Box<dyn FnMut(JobStateCounts) -> Result<bool> + 'scope>>>>, +} + +impl<'scope> ProgressDriver<'scope> for TestProgressDriver<'scope> { + fn drive<'dep, ProgressIndicatorT>( + &mut self, + _client: &'dep Mutex<Client>, + ind: ProgressIndicatorT, + ) where + ProgressIndicatorT: ProgressIndicator, + 'dep: 'scope, + { + *self.update_func.borrow_mut() = Some(Box::new(move
|state| ind.update_job_states(state))); + } + + fn stop(&mut self) -> Result<()> { + Ok(()) + } +} + +impl<'scope> TestProgressDriver<'scope> { + fn update(&self, states: JobStateCounts) -> Result<bool> { + (self.update_func.borrow_mut().as_mut().unwrap())(states) + } +} + +fn run_app( + term: InMemoryTerm, + fake_tests: FakeTests, + workspace_root: &Path, + state: BrokerState, + cargo: String, + stdout_tty: bool, + quiet: Quiet, + include_filter: Vec<String>, + exclude_filter: Vec<String>, + list: Option<ListAction>, + finish: bool, +) -> String { + let cargo_metadata = cargo_metadata::MetadataCommand::new() + .manifest_path(workspace_root.join("Cargo.toml")) + .exec() + .unwrap(); + + let mut stderr = vec![]; + let mut b = FakeBroker::new(state); + let client_driver = TestClientDriver::default(); + + let deps = MainAppDeps::new( + cargo, + include_filter, + exclude_filter, + list, + &mut stderr, + false, // stderr_color + &workspace_root, + &cargo_metadata.workspace_packages(), + b.address.clone(), + client_driver.clone(), + ) + .unwrap(); + let prog_driver = TestProgressDriver::default(); + let mut app = + main_app_new(&deps, stdout_tty, quiet, term.clone(), prog_driver.clone()).unwrap(); + + let mut b_conn = b.accept(); + + loop { + let res = app.enqueue_one().unwrap(); + let (package_name, case) = match res { + EnqueueResult::Done => break, + EnqueueResult::Ignored | EnqueueResult::Listed => continue, + EnqueueResult::Enqueued { package_name, case } => (package_name, case), + }; + let test = fake_tests.get(&package_name, &case); + + // process job enqueuing + client_driver.process_client_messages(); + b_conn.process(1); + if test.desired_state == JobState::Complete { + client_driver.process_broker_msg(1); + } + + let counts = deps + .client + .lock() + .unwrap() + .get_job_state_counts_async() + .unwrap(); + client_driver.process_client_messages(); + + // process job state request + b_conn.process(1); + client_driver.process_broker_msg(1); + +
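The fake broker above exchanges bincode messages framed with a u32 little-endian length prefix (see `MessageStream::next` and `send_message`). The framing itself can be sketched std-only, with raw byte payloads standing in for the bincode-serialized messages:

```rust
use std::io::{self, Read, Write};

// Write one frame: 4-byte little-endian payload length, then the payload,
// mirroring send_message.
fn send_frame(mut w: impl Write, payload: &[u8]) -> io::Result<()> {
    w.write_all(&(payload.len() as u32).to_le_bytes())?;
    w.write_all(payload)
}

// Read exactly one frame back, mirroring MessageStream::next.
fn read_frame(mut r: impl Read) -> io::Result<Vec<u8>> {
    let mut len = [0u8; 4];
    r.read_exact(&mut len)?;
    let mut buf = vec![0u8; u32::from_le_bytes(len) as usize];
    r.read_exact(&mut buf)?;
    Ok(buf)
}

fn main() -> io::Result<()> {
    let mut wire = Vec::new();
    send_frame(&mut wire, b"hello")?;
    send_frame(&mut wire, b"")?; // zero-length payloads round-trip too
    let mut cursor = &wire[..];
    assert_eq!(read_frame(&mut cursor)?, b"hello");
    assert_eq!(read_frame(&mut cursor)?, b"");
    assert!(cursor.is_empty());
    Ok(())
}
```

Because the length is read with `read_exact`, a short read mid-frame surfaces as an `io::Error` instead of silently yielding a truncated message.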
prog_driver.update(counts.try_recv().unwrap()).unwrap(); + } + + app.drain().unwrap(); + client_driver.process_client_messages(); + + if finish { + app.finish().unwrap(); + } + + term.contents() +} + +fn run_or_list_all_tests_sync( + tmp_dir: &TempDir, + fake_tests: FakeTests, + quiet: Quiet, + include_filter: Vec<String>, + exclude_filter: Vec<String>, + list: Option<ListAction>, +) -> String { + let mut state = BrokerState::default(); + for (_, test_path) in fake_tests.all_test_paths() { + state.job_responses.insert( + test_path, + JobAction::Respond(Ok(JobSuccess { + status: JobStatus::Exited(0), + stdout: JobOutputResult::None, + stderr: JobOutputResult::Inline(Box::new(*b"this output should be ignored")), + })), + ); + } + + let workspace = tmp_dir.path().join("workspace"); + if !workspace.exists() { + generate_cargo_project(tmp_dir, &fake_tests); + } + let cargo = workspace.join("cargo").to_str().unwrap().into(); + + let term = InMemoryTerm::new(50, 50); + run_app( + term.clone(), + fake_tests, + &workspace, + state, + cargo, + false, // stdout_tty + quiet, + include_filter, + exclude_filter, + list, + true, // finish + ) +} + +fn run_all_tests_sync( + tmp_dir: &TempDir, + fake_tests: FakeTests, + quiet: Quiet, + include_filter: Vec<String>, + exclude_filter: Vec<String>, +) -> String { + run_or_list_all_tests_sync( + tmp_dir, + fake_tests, + quiet, + include_filter, + exclude_filter, + None, + ) +} + +fn list_all_tests_sync( + tmp_dir: &TempDir, + fake_tests: FakeTests, + quiet: Quiet, + include_filter: Vec<String>, + exclude_filter: Vec<String>, + expected_packages: &str, + expected_binaries: &str, + expected_tests: &str, +) { + let listing = run_or_list_all_tests_sync( + tmp_dir, + fake_tests.clone(), + quiet.clone(), + include_filter.clone(), + exclude_filter.clone(), + Some(ListAction::ListTests), + ); + assert_eq!(listing, expected_tests); + + let listing = run_or_list_all_tests_sync( + tmp_dir, + fake_tests.clone(), + quiet.clone(), + include_filter.clone(), + exclude_filter.clone(), +
Some(ListAction::ListBinaries), + ); + assert_eq!(listing, expected_binaries); + + let listing = run_or_list_all_tests_sync( + tmp_dir, + fake_tests.clone(), + quiet.clone(), + include_filter.clone(), + exclude_filter.clone(), + Some(ListAction::ListPackages), + ); + assert_eq!(listing, expected_packages); +} + +#[test] +fn no_tests_all_tests_sync() { + let tmp_dir = tempdir().unwrap(); + let fake_tests = FakeTests { + test_binaries: vec![FakeTestBinary { + name: "foo".into(), + tests: vec![], + }], + }; + assert_eq!( + run_all_tests_sync( + &tmp_dir, + fake_tests, + false.into(), + vec!["all".into()], + vec![] + ), + "\ + all jobs completed\n\ + \n\ + ================== Test Summary ==================\n\ + Successful Tests: 0\n\ + Failed Tests : 0\ + " + ); +} + +#[test] +fn no_tests_all_tests_sync_listing() { + let tmp_dir = tempdir().unwrap(); + let fake_tests = FakeTests { + test_binaries: vec![FakeTestBinary { + name: "foo".into(), + tests: vec![], + }], + }; + list_all_tests_sync( + &tmp_dir, + fake_tests, + false.into(), + vec!["all".into()], + vec![], + "package foo", + "binary foo (library)", + "", + ); +} + +#[test] +fn two_tests_all_tests_sync() { + let tmp_dir = tempdir().unwrap(); + let fake_tests = FakeTests { + test_binaries: vec![ + FakeTestBinary { + name: "foo".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + FakeTestBinary { + name: "bar".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + ], + }; + assert_eq!( + run_all_tests_sync( + &tmp_dir, + fake_tests, + false.into(), + vec!["all".into()], + vec![] + ), + "\ + bar test_it.....................................OK\n\ + foo test_it.....................................OK\n\ + all jobs completed\n\ + \n\ + ================== Test Summary ==================\n\ + Successful Tests: 2\n\ + Failed Tests : 0\ + " + ); +} + +#[test] +fn two_tests_all_tests_sync_listing() { + let tmp_dir = tempdir().unwrap(); 
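The expected strings in these tests ("foo test_it.....OK", the "= Test Summary =" rule, the padded count columns) all come from `format!` width parameters captured from local variables. A std-only illustration of the two shapes involved (the helper names here are illustrative, not functions from the crate):

```rust
// Build a "name.....STATUS" line padded to `width` columns, the way
// print_job_result fills the gap between test name and result with dots.
fn dotted(name: &str, status: &str, width: usize) -> String {
    let dots = width.saturating_sub(name.len() + status.len());
    format!("{name}{empty:.<dots$}{status}", empty = "")
}

// Left-pad a label to a fixed column and right-align a count,
// as in the summary's "Failed Tests    : 0" lines.
fn summary_line(label: &str, count: usize, col: usize, digits: usize) -> String {
    format!("{label:<col$}: {count:>digits$}")
}

fn main() {
    assert_eq!(dotted("foo test_it", "OK", 20), "foo test_it.......OK");
    assert_eq!(summary_line("Successful Tests", 2, 16, 1), "Successful Tests: 2");
    assert_eq!(summary_line("Failed Tests", 0, 16, 1), "Failed Tests    : 0");
}
```

The `$`-suffixed names (`dots$`, `col$`, `digits$`) make the pad widths runtime values, which is what lets the real code fit its output to the terminal width.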
+ let fake_tests = FakeTests { + test_binaries: vec![ + FakeTestBinary { + name: "foo".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + FakeTestBinary { + name: "bar".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + ], + }; + list_all_tests_sync( + &tmp_dir, + fake_tests, + false.into(), + vec!["all".into()], + vec![], + "\ + package bar\n\ + package foo\ + ", + "\ + binary bar (library)\n\ + binary foo (library)\ + ", + "\ + bar test_it\n\ + foo test_it\ + ", + ); +} + +#[test] +fn four_tests_filtered_sync() { + let tmp_dir = tempdir().unwrap(); + let fake_tests = FakeTests { + test_binaries: vec![ + FakeTestBinary { + name: "foo".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + FakeTestBinary { + name: "bar".into(), + tests: vec![FakeTestCase { + name: "test_it2".into(), + ..Default::default() + }], + }, + FakeTestBinary { + name: "baz".into(), + tests: vec![FakeTestCase { + name: "testy".into(), + ..Default::default() + }], + }, + FakeTestBinary { + name: "bin".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + ], + }; + assert_eq!( + run_all_tests_sync( + &tmp_dir, + fake_tests, + false.into(), + vec![ + "name.equals(test_it)".into(), + "name.equals(test_it2)".into() + ], + vec!["package.equals(bin)".into()] + ), + "\ + bar test_it2....................................OK\n\ + foo test_it.....................................OK\n\ + all jobs completed\n\ + \n\ + ================== Test Summary ==================\n\ + Successful Tests: 2\n\ + Failed Tests : 0\ + " + ); +} + +#[test] +fn four_tests_filtered_sync_listing() { + let tmp_dir = tempdir().unwrap(); + let fake_tests = FakeTests { + test_binaries: vec![ + FakeTestBinary { + name: "foo".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + FakeTestBinary { + name: 
"bar".into(), + tests: vec![FakeTestCase { + name: "test_it2".into(), + ..Default::default() + }], + }, + FakeTestBinary { + name: "baz".into(), + tests: vec![FakeTestCase { + name: "testy".into(), + ..Default::default() + }], + }, + FakeTestBinary { + name: "bin".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + ], + }; + list_all_tests_sync( + &tmp_dir, + fake_tests, + false.into(), + vec![ + "name.equals(test_it)".into(), + "name.equals(test_it2)".into(), + ], + vec!["package.equals(bin)".into()], + "\ + package bar\n\ + package baz\n\ + package foo\ + ", + "\ + binary bar (library)\n\ + binary baz (library)\n\ + binary foo (library)\ + ", + "\ + bar test_it2\n\ + foo test_it\ + ", + ); +} + +#[test] +fn three_tests_single_package_sync() { + let tmp_dir = tempdir().unwrap(); + let fake_tests = FakeTests { + test_binaries: vec![ + FakeTestBinary { + name: "foo".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + FakeTestBinary { + name: "bar".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + FakeTestBinary { + name: "baz".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + ], + }; + assert_eq!( + run_all_tests_sync( + &tmp_dir, + fake_tests, + false.into(), + vec!["package.equals(foo)".into()], + vec![] + ), + "\ + foo test_it.....................................OK\n\ + all jobs completed\n\ + \n\ + ================== Test Summary ==================\n\ + Successful Tests: 1\n\ + Failed Tests : 0\ + " + ); +} + +#[test] +fn three_tests_single_package_filtered_sync() { + let tmp_dir = tempdir().unwrap(); + let fake_tests = FakeTests { + test_binaries: vec![ + FakeTestBinary { + name: "foo".into(), + tests: vec![ + FakeTestCase { + name: "test_it".into(), + ..Default::default() + }, + FakeTestCase { + name: "testy".into(), + ..Default::default() + }, + ], + }, + FakeTestBinary 
{ + name: "bar".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + FakeTestBinary { + name: "baz".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + ], + }; + assert_eq!( + run_all_tests_sync( + &tmp_dir, + fake_tests, + false.into(), + vec!["package.equals(foo) && name.equals(test_it)".into()], + vec![] + ), + "\ + foo test_it.....................................OK\n\ + all jobs completed\n\ + \n\ + ================== Test Summary ==================\n\ + Successful Tests: 1\n\ + Failed Tests : 0\ + " + ); +} + +#[test] +fn ignored_test_sync() { + let tmp_dir = tempdir().unwrap(); + let fake_tests = FakeTests { + test_binaries: vec![ + FakeTestBinary { + name: "foo".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ignored: true, + ..Default::default() + }], + }, + FakeTestBinary { + name: "bar".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + FakeTestBinary { + name: "baz".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + ], + }; + assert_eq!( + run_all_tests_sync( + &tmp_dir, + fake_tests, + false.into(), + vec!["all".into()], + vec![] + ), + "\ + bar test_it.....................................OK\n\ + baz test_it.....................................OK\n\ + foo test_it................................IGNORED\n\ + all jobs completed\n\ + \n\ + ================== Test Summary ==================\n\ + Successful Tests: 2\n\ + Failed Tests : 0\n\ + Ignored Tests : 1\n\ + \x20\x20\x20\x20foo test_it: ignored\ + " + ); +} + +#[test] +fn two_tests_all_tests_sync_quiet() { + let tmp_dir = tempdir().unwrap(); + let fake_tests = FakeTests { + test_binaries: vec![ + FakeTestBinary { + name: "foo".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + FakeTestBinary { + name: "bar".into(), + tests: vec![FakeTestCase { + name: 
"test_it".into(), + ..Default::default() + }], + }, + ], + }; + assert_eq!( + run_all_tests_sync( + &tmp_dir, + fake_tests, + true.into(), + vec!["all".into()], + vec![] + ), + "\ + all jobs completed\n\ + \n\ + ================== Test Summary ==================\n\ + Successful Tests: 2\n\ + Failed Tests : 0\ + " + ); +} + +fn run_failed_tests(fake_tests: FakeTests) -> String { + let tmp_dir = tempdir().unwrap(); + + let mut state = BrokerState::default(); + for (_, test_path) in fake_tests.all_test_paths() { + state.job_responses.insert( + test_path, + JobAction::Respond(Ok(JobSuccess { + status: JobStatus::Exited(1), + stdout: JobOutputResult::None, + stderr: JobOutputResult::Inline(Box::new(*b"error output")), + })), + ); + } + + let cargo = generate_cargo_project(&tmp_dir, &fake_tests); + let term = InMemoryTerm::new(50, 50); + run_app( + term.clone(), + fake_tests, + &tmp_dir.path().join("workspace"), + state, + cargo, + false, // stdout_tty + Quiet::from(false), + vec!["all".into()], + vec![], + None, + true, // finish + ); + + term.contents() +} + +#[test] +fn failed_tests() { + let fake_tests = FakeTests { + test_binaries: vec![ + FakeTestBinary { + name: "foo".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + FakeTestBinary { + name: "bar".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + ], + }; + assert_eq!( + run_failed_tests(fake_tests), + "\ + bar test_it...................................FAIL\n\ + stderr: error output\n\ + foo test_it...................................FAIL\n\ + stderr: error output\n\ + all jobs completed\n\ + \n\ + ================== Test Summary ==================\n\ + Successful Tests: 0\n\ + Failed Tests : 2\n\ + \x20\x20\x20\x20bar test_it: failure\n\ + \x20\x20\x20\x20foo test_it: failure\ + " + ); +} + +fn run_in_progress_test(fake_tests: FakeTests, quiet: Quiet, expected_output: &str) { + let tmp_dir = tempdir().unwrap(); + + let 
mut state = BrokerState::default(); + for (test, test_path) in fake_tests.all_test_paths() { + if test.desired_state == JobState::Complete { + state.job_responses.insert( + test_path, + JobAction::Respond(Ok(JobSuccess { + status: JobStatus::Exited(0), + stdout: JobOutputResult::None, + stderr: JobOutputResult::None, + })), + ); + } else { + state.job_responses.insert(test_path, JobAction::Ignore); + } + state.job_states[test.desired_state] += 1; + } + + let cargo = generate_cargo_project(&tmp_dir, &fake_tests); + let term = InMemoryTerm::new(50, 50); + let term_clone = term.clone(); + let contents = run_app( + term_clone, + fake_tests, + &tmp_dir.path().join("workspace"), + state, + cargo, + true, // stdout_tty + quiet, + vec!["all".into()], + vec![], + None, + false, // finish + ); + assert_eq!(contents, expected_output); +} + +#[test] +fn waiting_for_artifacts() { + let fake_tests = FakeTests { + test_binaries: vec![ + FakeTestBinary { + name: "foo".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + desired_state: JobState::WaitingForArtifacts, + ..Default::default() + }], + }, + FakeTestBinary { + name: "bar".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + desired_state: JobState::WaitingForArtifacts, + ..Default::default() + }], + }, + ], + }; + run_in_progress_test( + fake_tests, + false.into(), + "\ + ######################## 2/2 waiting for artifacts\n\ + ------------------------ 0/2 pending\n\ + ------------------------ 0/2 running\n\ + ------------------------ 0/2 complete\ + ", + ); +} + +#[test] +fn pending() { + let fake_tests = FakeTests { + test_binaries: vec![ + FakeTestBinary { + name: "foo".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + desired_state: JobState::Pending, + ..Default::default() + }], + }, + FakeTestBinary { + name: "bar".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + desired_state: JobState::Pending, + ..Default::default() + }], + }, + ], + }; + run_in_progress_test( 
+ fake_tests, + false.into(), + "\ + ######################## 2/2 waiting for artifacts\n\ + ######################## 2/2 pending\n\ + ------------------------ 0/2 running\n\ + ------------------------ 0/2 complete\ + ", + ); +} + +#[test] +fn running() { + let fake_tests = FakeTests { + test_binaries: vec![ + FakeTestBinary { + name: "foo".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + desired_state: JobState::Running, + ..Default::default() + }], + }, + FakeTestBinary { + name: "bar".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + desired_state: JobState::Running, + ..Default::default() + }], + }, + ], + }; + run_in_progress_test( + fake_tests, + false.into(), + "\ + ######################## 2/2 waiting for artifacts\n\ + ######################## 2/2 pending\n\ + ######################## 2/2 running\n\ + ------------------------ 0/2 complete\ + ", + ); +} + +#[test] +fn complete() { + let fake_tests = FakeTests { + test_binaries: vec![ + FakeTestBinary { + name: "foo".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + desired_state: JobState::Complete, + ..Default::default() + }], + }, + FakeTestBinary { + name: "bar".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + desired_state: JobState::Running, + ..Default::default() + }], + }, + ], + }; + run_in_progress_test( + fake_tests, + false.into(), + "\ + foo test_it.....................................OK\n\ + ######################## 2/2 waiting for artifacts\n\ + ######################## 2/2 pending\n\ + ######################## 2/2 running\n\ + #############----------- 1/2 complete\ + ", + ); +} + +#[test] +fn complete_quiet() { + let fake_tests = FakeTests { + test_binaries: vec![ + FakeTestBinary { + name: "foo".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + desired_state: JobState::Complete, + ..Default::default() + }], + }, + FakeTestBinary { + name: "bar".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + 
desired_state: JobState::Running, + ..Default::default() + }], + }, + ], + }; + run_in_progress_test( + fake_tests, + true.into(), + "#####################-------------------- 1/2 jobs", + ); +} + +#[test] +fn expected_count_updates_packages() { + let tmp_dir = tempdir().unwrap(); + let fake_tests = FakeTests { + test_binaries: vec![ + FakeTestBinary { + name: "foo".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + FakeTestBinary { + name: "bar".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }, + ], + }; + run_all_tests_sync( + &tmp_dir, + fake_tests.clone(), + false.into(), + vec!["all".into()], + vec![], + ); + + let path = tmp_dir + .path() + .join("workspace/target") + .join(LAST_TEST_LISTING_NAME); + let listing: TestListing = load_test_listing(&path).unwrap().unwrap(); + assert_eq!(listing, fake_tests.listing()); + + // remove bar + let fake_tests = FakeTests { + test_binaries: vec![FakeTestBinary { + name: "foo".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }], + }; + let fs = Fs::new(); + fs.remove_dir_all(tmp_dir.path().join("workspace/crates/bar")) + .unwrap(); + + run_all_tests_sync( + &tmp_dir, + fake_tests.clone(), + false.into(), + vec!["all".into()], + vec![], + ); + + // new listing should match + let listing: TestListing = load_test_listing(&path).unwrap().unwrap(); + assert_eq!(listing, fake_tests.listing()); +} + +#[test] +fn expected_count_updates_cases() { + let tmp_dir = tempdir().unwrap(); + let fake_tests = FakeTests { + test_binaries: vec![FakeTestBinary { + name: "foo".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }], + }; + run_all_tests_sync( + &tmp_dir, + fake_tests.clone(), + false.into(), + vec!["all".into()], + vec![], + ); + + let path = tmp_dir + .path() + .join("workspace/target") + .join(LAST_TEST_LISTING_NAME); + let listing: TestListing = 
load_test_listing(&path).unwrap().unwrap(); + assert_eq!(listing, fake_tests.listing()); + + // remove the test + let fake_tests = FakeTests { + test_binaries: vec![FakeTestBinary { + name: "foo".into(), + tests: vec![], + }], + }; + let fs = Fs::new(); + fs.write(tmp_dir.path().join("workspace/crates/foo/src/lib.rs"), "") + .unwrap(); + + run_all_tests_sync( + &tmp_dir, + fake_tests.clone(), + false.into(), + vec!["all".into()], + vec![], + ); + + // new listing should match + let listing: TestListing = load_test_listing(&path).unwrap().unwrap(); + assert_eq!(listing, fake_tests.listing()); +} + +#[test] +fn filtering_none_does_not_build() { + let tmp_dir = tempdir().unwrap(); + let fake_tests = FakeTests { + test_binaries: vec![FakeTestBinary { + name: "foo".into(), + tests: vec![FakeTestCase { + name: "test_it".into(), + ..Default::default() + }], + }], + }; + run_all_tests_sync( + &tmp_dir, + fake_tests.clone(), + false.into(), + vec!["none".into()], + vec![], + ); + + let fs = Fs::new(); + let target_dir = tmp_dir.path().join("workspace/target"); + let entries: Vec<_> = fs + .read_dir(target_dir) + .unwrap() + .map(|e| path_file_name(&e.unwrap().path())) + .collect(); + assert_eq!(entries, vec![LAST_TEST_LISTING_NAME.to_owned()]); +} diff --git a/crates/maelstrom-client/src/spec.rs b/crates/maelstrom-client/src/spec.rs index 53822b58..15260bb0 100644 --- a/crates/maelstrom-client/src/spec.rs +++ b/crates/maelstrom-client/src/spec.rs @@ -1,6 +1,6 @@ //! Provide utilities for evaluating job specification directives. //! -//! The job specification directives for `cargo-maelstrom` and the CLI differ in a number of ways, but +//! The job specification directives for `cargo-metest` and the CLI differ in a number of ways, but //! also have a number of similar constructs. This module includes utilities for those similar //! constructs. 
diff --git a/meticulous-test.toml b/meticulous-test.toml index f3fce552..815ddea3 100644 --- a/meticulous-test.toml +++ b/meticulous-test.toml @@ -12,7 +12,7 @@ mounts = [ devices = ["full", "null", "random", "tty", "urandom", "zero"] [[directives]] -filter = "package.equals(cargo-maelstrom)" +filter = "package.equals(cargo-metest)" image.name = "rust" image.use = ["layers", "environment"] enable_loopback = true diff --git a/site/src/SUMMARY.md b/site/src/SUMMARY.md index 83c3a203..c3ff01a3 100644 --- a/site/src/SUMMARY.md +++ b/site/src/SUMMARY.md @@ -3,17 +3,17 @@ - [Introduction](./introduction.md) - [Installation](./installation.md) - [Installing Clustered Job Runner](./install/clustered_job_runner.md) - - [Installing cargo-maelstrom](./install/cargo_maelstrom.md) -- [cargo-maelstrom](./cargo_maelstrom.md) - - [Running Tests](./cargo_maelstrom/running_tests.md) - - [Filtering Tests](./cargo_maelstrom/filtering_tests.md) - - [Test Pattern DSL](./cargo_maelstrom/test_pattern_dsl.md) - - [Test Pattern DSL BNF](./cargo_maelstrom/test_pattern_dsl/bnf.md) - - [`--include` and `--exclude` Flags](./cargo_maelstrom/include_and_exclude_flags.md) - - [Listing](./cargo_maelstrom/listing.md) - - [Configuration](./cargo_maelstrom/configuration.md) - - [Execution Environment](./cargo_maelstrom/execution_environment.md) - - [Including Files](./cargo_maelstrom/including_files.md) - - [Using Container Images](./cargo_maelstrom/using_container_images.md) + - [Installing cargo-metest](./install/cargo_metest.md) +- [cargo-metest](./cargo_metest.md) + - [Running Tests](./cargo_metest/running_tests.md) + - [Filtering Tests](./cargo_metest/filtering_tests.md) + - [Test Pattern DSL](./cargo_metest/test_pattern_dsl.md) + - [Test Pattern DSL BNF](./cargo_metest/test_pattern_dsl/bnf.md) + - [`--include` and `--exclude` Flags](./cargo_metest/include_and_exclude_flags.md) + - [Listing](./cargo_metest/listing.md) + - [Configuration](./cargo_metest/configuration.md) + - [Execution 
Environment](./cargo_metest/execution_environment.md) + - [Including Files](./cargo_metest/including_files.md) + - [Using Container Images](./cargo_metest/using_container_images.md) - [Clustered Job Runner Management](./clustered_job_runner_management.md) - [Job States](./clustered_job_runner_management/job_states.md) diff --git a/site/src/cargo_metest.md b/site/src/cargo_metest.md index 1beb30af..028d8e99 100644 --- a/site/src/cargo_metest.md +++ b/site/src/cargo_metest.md @@ -1,11 +1,11 @@ -# cargo-maelstrom +# cargo-metest -cargo-maelstrom is a replacement for `cargo test` which will run tests as jobs on a +cargo-metest is a replacement for `cargo test` which will run tests as jobs on a distributed clustered job runner. Each test runs in a lightweight container where it is isolated from computer it is running on and other tests. Running your tests using it can be as simple as running `cargo metest` instead -of `cargo test`, (see [Running Tests](./cargo_maelstrom/running_tests.md) for +of `cargo test`, (see [Running Tests](./cargo_metest/running_tests.md) for details) but due to the tests running in a very isolated environment by default, there can be some configuration required to make all your tests pass (see -[Configuration](./cargo_maelstrom/configuration.md).) +[Configuration](./cargo_metest/configuration.md).) diff --git a/site/src/cargo_metest/filtering_tests.md b/site/src/cargo_metest/filtering_tests.md index a7361ed2..f61a6fda 100644 --- a/site/src/cargo_metest/filtering_tests.md +++ b/site/src/cargo_metest/filtering_tests.md @@ -1,6 +1,6 @@ # Filtering Tests -When running `cargo-maelstrom` without any arguments it runs all the tests it finds +When running `cargo-metest` without any arguments it runs all the tests it finds as part of your project. If you wish to run only a subset of tests a filter can be applied via the command line. 
diff --git a/site/src/cargo_metest/include_and_exclude_flags.md b/site/src/cargo_metest/include_and_exclude_flags.md index cb1e6471..a0b1feef 100644 --- a/site/src/cargo_metest/include_and_exclude_flags.md +++ b/site/src/cargo_metest/include_and_exclude_flags.md @@ -1,6 +1,6 @@ # `--include` and `--exclude` Flags -These flags are about filtering which tests `cargo-maelstrom` runs. +These flags are about filtering which tests `cargo-metest` runs. The `--include` and `--exclude` flags (shorted as `-i` and `-x`) accept a snippet of the [Test Pattern DSL](./test_pattern_dsl.md). The `-i` flag includes @@ -16,7 +16,7 @@ this more explicitly it is something like ``` ## Working with Workspaces -When you specify a filter a package, `cargo-maelstrom` will only build the matching +When you specify a filter a package, `cargo-metest` will only build the matching packages. This can be a useful tip to remember when trying to run a single test. If we were to run something like @@ -24,7 +24,7 @@ If we were to run something like cargo metest -i "name.equals(foobar)" ``` -`cargo-maelstrom` would run any test which has the name "foobar". A test with this +`cargo-metest` would run any test which has the name "foobar". A test with this name could be found in any of the packages in the workspace, so it is forced to build all of them. But if I happened to know that only one package has this test, the `baz` package, I would be better off running the following instead. @@ -33,5 +33,5 @@ test, the `baz` package, I would be better off running the following instead. cargo metest -i "package.equals(baz) && name.equals(foobar)" ``` -Now since I specified that I only care about the "baz" package, `cargo-maelstrom` +Now since I specified that I only care about the "baz" package, `cargo-metest` will only bother to build that package. 
diff --git a/site/src/cargo_metest/running_tests.md b/site/src/cargo_metest/running_tests.md index 1f4656cf..4bde619f 100644 --- a/site/src/cargo_metest/running_tests.md +++ b/site/src/cargo_metest/running_tests.md @@ -4,14 +4,14 @@ In order to run tests we will need to have a clustered job runner running somewhere first. (See [Installing Clustered Job Runner](../install/clustered_job_runner.md)) -Also ensure you've installed `cargo-maelstrom` (See [Installing -cargo-maelstrom](../install/cargo_maelstrom.md).) +Also ensure you've installed `cargo-metest` (See [Installing +cargo-metest](../install/cargo_metest.md).) -We need to provide the address of the broker to cargo-maelstrom. This can be done +We need to provide the address of the broker to cargo-metest. This can be done via the command line by passing `--broker`, but since you have to provide it every time it is easer to provide it via the configuration file. -Create a file in `.config/cargo-maelstrom.toml` and put the following contents +Create a file in `.config/cargo-metest.toml` and put the following contents ```toml broker = ":" @@ -31,7 +31,7 @@ This should be improved with subsequent invocations. See [Test Listing](#test-listing) # Terminal Output -By default `cargo-maelstrom` prints the name of the tests that have been completed +By default `cargo-metest` prints the name of the tests that have been completed to stdout. It also displays four progress bars indicating the state of jobs. See [Job States](../clustered_job_runner_management/job_states.md). @@ -41,9 +41,9 @@ Before the progress bars there is a "spinner" which provides insight into what the test job enqueuing thread is doing. It disappears once all tests are enqueued. -Before running tests, `cargo-maelstrom` always runs invokes `cargo` to build the +Before running tests, `cargo-metest` always runs invokes `cargo` to build the tests before running them. This happens in parallel with enqueuing the tests to -run. 
If the build fails for some reason, `cargo-maelstrom` will abruptly stop and +run. If the build fails for some reason, `cargo-metest` will abruptly stop and print out the build output. Providing the `--quiet` flag will show only a single progress bar and provide no @@ -53,25 +53,25 @@ If stdout isn't a TTY, no progress bars are displayed, and color is disabled. # Caching -`cargo-maelstrom` caches some things in the `target/` directory, these things are +`cargo-metest` caches some things in the `target/` directory, these things are covered below. It also caches some things related to containers (not in the `target/` directory) and that is covered in [Using Container -Images](../cargo_maelstrom/using_container_images.md). +Images](../cargo_metest/using_container_images.md). ## `.tar` files -As part of running tests as jobs in the clustered job runner, `cargo-maelstrom` +As part of running tests as jobs in the clustered job runner, `cargo-metest` must create certain temporary `.tar` files. It stores these files alongside the build artifacts that `cargo` produces. At the moment nothing cleans up stale ones. ## Test Listing -A listing of all the tests in your project that `cargo-maelstrom` found is stored +A listing of all the tests in your project that `cargo-metest` found is stored in `target/maelstrom-test-listing.toml` file. This listing is used to predict the amount of tests that will be run with subsequent invocations. ## File Digests Files uploaded to the broker are identified via a hash of the file contents. -Calculating these hashes can be time consuming so `cargo-maelstrom` caches this +Calculating these hashes can be time consuming so `cargo-metest` caches this information. This can be found in `target/meticulous-cached-digests.toml`. The cache works by recording the hash and the mtime of the file from when the hash was calculated. 
If the file has a different mtime than what is recorded, the diff --git a/site/src/cargo_metest/test_pattern_dsl.md b/site/src/cargo_metest/test_pattern_dsl.md index 7b8e3e2a..b65cebdb 100644 --- a/site/src/cargo_metest/test_pattern_dsl.md +++ b/site/src/cargo_metest/test_pattern_dsl.md @@ -1,6 +1,6 @@ # Test Pattern DSL -This domain-specific language has been designed specifically for `cargo-maelstrom` +This domain-specific language has been designed specifically for `cargo-metest` to let users easily describe a set of tests to run. If you are a fan of formal explanations check out the @@ -109,7 +109,7 @@ These other expressions are useful occasionally - **`true`, `all`, `any`**: selects all tests - **`false`, `none`**: selects no tests -When you provide no filter to `cargo-maelstrom` it acts as if typed `cargo metest +When you provide no filter to `cargo-metest` it acts as if typed `cargo metest -i all` ## Abbreviations diff --git a/site/src/install/cargo_metest.md b/site/src/install/cargo_metest.md index 50a07419..839e1b26 100644 --- a/site/src/install/cargo_metest.md +++ b/site/src/install/cargo_metest.md @@ -1,6 +1,6 @@ -# Installing cargo-maelstrom +# Installing cargo-metest -We're going to install cargo-maelstrom using cargo. It only works on Linux. +We're going to install cargo-metest using cargo. It only works on Linux. First make sure you've installed [Rust](https://www.rust-lang.org/tools/install). @@ -8,7 +8,7 @@ Then install it by doing ```bash export METICULOUS_GITHUB="https://github.com/meticulous-software/meticulous.git" -cargo install --git $METICULOUS_GITHUB cargo-maelstrom +cargo install --git $METICULOUS_GITHUB cargo-metest ``` You should now be able to invoke it by running `cargo metest` diff --git a/site/src/installation.md b/site/src/installation.md index 794cda09..21a6f97f 100644 --- a/site/src/installation.md +++ b/site/src/installation.md @@ -1,6 +1,6 @@ # Installation This is split up into two different installations processes. 
One for the -clustered job runner, and another for cargo-maelstrom +clustered job runner, and another for cargo-metest - [Clustered Job Runner](./install/clustered_job_runner.md) -- [cargo-maelstrom](./install/cargo_maelstrom.md) +- [cargo-metest](./install/cargo_metest.md) diff --git a/site/src/introduction.md b/site/src/introduction.md index cd5c406d..0bc2ea5d 100644 --- a/site/src/introduction.md +++ b/site/src/introduction.md @@ -17,15 +17,15 @@ Meticulous itself is split up into a few different pieces of software. the actual job (or test.) - **The Client**. There are one or many instances of these. This is what connects to the broker and submits jobs. -- **cargo-maelstrom**. This is our cargo replacement which submits tests as jobs by +- **cargo-metest**. This is our cargo replacement which submits tests as jobs by acting as a client. # What will this book cover? This guide will attempt to cover the following topics: - Basic Install. How do you install and configure this for your own projects. - Both setting up the clustered job runner and using cargo-maelstrom -- cargo-maelstrom configuration. Sometimes extra configuration is needed to make + Both setting up the clustered job runner and using cargo-metest +- cargo-metest configuration. Sometimes extra configuration is needed to make tests run successfully, this will cover how to do that. - Clustered job runner management. How clustered job runner works and how to get insight into the job running process.