Note that the commands and scripts in the following will manipulate the permissions of `/mydata`. The machine-level setup installs
Docker-CE and gives permission to the current user to connect to the container daemon. Other container runtimes compatible with the docker CLI should work too.
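
For reference, here is a rough sketch of what that machine-level setup amounts to; this is an assumption based on the description above, not the actual contents of the setup script:

```
# install Docker CE (assumes Docker's apt repository is already configured)
sudo apt-get update && sudo apt-get install -y docker-ce

# allow the current user to talk to the container daemon
sudo usermod -aG docker "$USER"
```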

#### 1. Clone artifact repository, set up container environment.

Clone the repository
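
As a minimal sketch of this step, assuming the artifact is checked out under `/mydata` as the rest of the guide expects (the repository URL is not reproduced in this excerpt and is left as a placeholder):

```
cd /mydata
git clone <repository-url> verus-sosp24-artifact
cd verus-sosp24-artifact
```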

Then run the machine-level setup script:

```
sudo bash setup/cloudlab-1.sh $USER
```

Log out and log in again to ensure the current user is part of the `docker` group.

#### 2. Run the macrobenchmark verification statistics (Figure 8).

**TODO** describe hand-tuned numbers and hard-coded baselines.

Then copy the results to your local machine:

```
scp '<username>@<node>.cloudlab.us:/mydata/verus-sosp24-artifact/macro-stats/res
scp '<username>@<node>.cloudlab.us:/mydata/verus-sosp24-artifact/macro-stats/results/macro-table.pdf' .
```
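
These `scp` commands are run from your local machine; once they finish, the generated table can be opened with any PDF viewer, for example:

```
# run locally, after the scp commands above have completed
xdg-open macro-table.pdf   # on macOS: open macro-table.pdf
```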

#### 4. Run the page table benchmark

**TODO.**

#### 5. Run the mimalloc benchmark suite

Navigate to the directory **(TODO where?)**

in the intermediate output. The end will summarize the results in tabular form.
The last table, formatted in LaTeX, only contains the benchmarks that succeeded.
The output should resemble Figure 12.


## Set 2

### Claims
Initialize the git submodules:

```
git submodule update --init
```
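
Equivalently, for a fresh checkout (a sketch; the repository URL below is a placeholder), the submodules can be fetched in a single step at clone time:

```
git clone --recurse-submodules <repository-url>
```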

The repository should now be ready, and we can build the binaries and run the benchmark.

#### 3. Running the Benchmark

To run the benchmarks, navigate into the `benchmarks` directory and execute the `run_benchmarks.sh`
script:
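
A minimal sketch of that invocation, assuming the script is executable and needs no arguments:

```
cd benchmarks
./run_benchmarks.sh
```
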
Note that the benchmarks will change the following CPU settings:
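
The specific settings are not reproduced in this excerpt. As a generic illustration only (not necessarily what the benchmark script does), settings of this kind, such as the frequency scaling governor, can be inspected and changed like so:

```
# show the current scaling governor for every core
cat /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor

# example: set the governor to "performance" (requires the cpupower tool)
sudo cpupower frequency-set -g performance
```
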
Also note that this will pull in the dependencies for building Linear Dafny automatically.


#### 4. Obtaining the Results

You can view the results of the benchmark by opening the automatically generated plots:
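
The plot file paths are not reproduced in this excerpt; as a generic fallback (assuming the plots are written somewhere inside the repository checkout), they can be located with:

```
# list plot files generated under the current checkout
find . -name '*.pdf' -o -name '*.png'
```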

