# Update README #153

Merged 3 commits on Jan 26, 2024.
Changes to `README.md`: 27 additions, 60 deletions.

## RTL Simulation Quickstart

Let's start by spinning up the `snitch-toolchain` Docker container and mounting
a clone of this repo inside it at `/src`:

```shell
git clone --recursive https://github.com/opencompl/riscv-paper-experiments.git
docker run --rm -ti --volume $PWD/riscv-paper-experiments:/src ghcr.io/nazavode/snitch-toolchain:latest bash
```

Alternatively, the whole flow can be run end to end with:

```shell
sudo docker run --rm -ti --volume $PWD/riscv-paper-experiments:/src ghcr.io/nazavode/snitch-toolchain:latest bash
cd /src/xdsl
python3 -m venv venv
source venv/bin/activate
pip install -e .
pip install pandas
cd /src/kernels
make pivoted.csv
```

This creates a Python virtual environment for xDSL and the Python scripts used in this repo,
builds the kernels, executes them with Verilator, processes the traces from these runs, and plots the results.
The overall results are collated in the `pivoted.csv` file.
Individual CSV files per kernel directory contain the result in cycles for each version of the kernel.
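
As a quick sanity check, `pivoted.csv` can be inspected with pandas, which the steps above install into the virtual environment. A minimal sketch — the path and the column layout of the file are assumptions here, not something this README pins down:

```python
# Minimal sketch: peek at the collated results from inside the container.
# The path and the column layout of pivoted.csv are assumptions.
import pandas as pd

df = pd.read_csv("/src/kernels/pivoted.csv")
print(df.head())            # first few rows
print(df.columns.tolist())  # discover the actual column names
```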

*Note: `opencompl` members seem not to have enough rights to push packages to the organization's
package registry. The image built from [`snitch/docker/Dockerfile`](snitch/docker/Dockerfile) is […]*

*[…] it's likely that your `docker run` command will complain about the image being built for a different platform.
Add the following option to explicitly ask for a specific platform:*

```shell
docker run --platform linux/amd64 ...
```

To build a RISC-V executable, start from one of the kernels:

```shell
cd /src/kernels/saxpy/64xf32/
make linalg.x
ls *.x # linalg.x should exist if all went OK
```

The [Makefile](kernels/saxpy/64xf32/Makefile) performs the following steps:

[…]

Once the ELF executable is ready, we can simulate its execution on a Snitch
cluster via the RTL simulator generated by Verilator:

```shell
make run_linalg.x
/opt/snitch-rtl/bin/snitch_cluster.vlt linalg.x
```

should produce the following output upon execution:

```shell
Warning: Failed to write binary name to logs/.rtlbinary
Wrote 36 bytes of bootrom to 0x1000
Wrote entry point 0x80000000 to bootloader slot 0x1020
...
Wrote 38 bytes of bootdata to 0x1024
[Tracer] Logging Hart 5 to logs/trace_hart_00000005.dasm
[Tracer] Logging Hart 6 to logs/trace_hart_00000006.dasm
[Tracer] Logging Hart 7 to logs/trace_hart_00000007.dasm
```

A correct run exits with status `0`, which you can verify by issuing `echo $?`.

*Note: while the `main` function is run by all the compute cores in the cluster,
the current startup code **returns the integer return value of core no. 0 only**;
**return values of all other cores are discarded**.*

Human-readable traces can be produced from the simulation logs with:

```shell
$ make traces
$ ls linalg.x.logs/
linalg.x.logs/trace_hart_00000000.trace.txt # decoded trace
linalg.x.logs/trace_hart_00000000.trace.json # json performance data per section
```
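
The JSON side of a trace can be inspected directly as well. A minimal sketch, assuming each file holds a list of per-section records carrying a `cycles` field (the field harvested into the CSVs later on); adjust to the actual schema:

```python
# Illustrative only: dump per-section cycle counts from a decoded trace.
# Assumes a list of per-section records with a "cycles" field.
import json

with open("linalg.x.logs/trace_hart_00000000.trace.json") as f:
    sections = json.load(f)

for i, section in enumerate(sections):
    print(f"section {i}: {section.get('cycles', '?')} cycles")
```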

*Note: the current version of `spike-dasm` included in the Docker image doesn't support […]*

[…]

We can count, for instance, the `fmul.s` instructions that appear in a given hart's trace:

```shell
$ grep fmul\.s linalg.x.logs/trace_hart_00000001.trace.txt | wc -l
```

The core (a.k.a. *hart* in RISC-V jargon) no. 0 was the only one actually
executing the kernel, while all of the other cores did nothing, as they return
early from the `main` function.
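
To repeat that check across every hart in one go, here is a rough Python equivalent of the grep above; it only assumes the trace file naming shown earlier:

```python
# Count fmul.s instructions in each hart's decoded trace to confirm
# that only hart 0 actually executed the kernel body.
from pathlib import Path

for trace in sorted(Path("linalg.x.logs").glob("trace_hart_*.trace.txt")):
    count = sum("fmul.s" in line for line in trace.open())
    print(f"{trace.name}: {count} fmul.s")
```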

## Scripts

### Setup

Set up and activate a Python virtual environment for the scripts:

```shell
$ python3 -m venv venv
$ source venv/bin/activate
$ pip install -r requirements.txt
```

### Collecting Results

We can collect results from the Verilator JSON trace logs into a CSV file with:

```shell
$ scripts/harvest_results.py -s kernels/ -f kernel size version cycles -e cycles -o output[.csv | .json]
```

- `-f` is a list of strings used as the header row of the CSV file.
- `-e` is a list of strings naming the fields to extract from the JSON trace.

The script assumes the following directory structure under the root search directory:

`[KERNEL]/[SIZE]/[EXECUTABLE_NAME].x.logs/`

where `KERNEL` is the microkernel name and `SIZE` is its dimensions.

Only log files named `trace_hart_00000000.trace.json` are used for now.
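
For illustration only, the traversal over that layout might look like the sketch below; the names and the glob pattern here are hypothetical, not `harvest_results.py`'s actual implementation:

```python
# Hypothetical sketch of walking [KERNEL]/[SIZE]/*.x.logs/ directories;
# not the actual implementation of harvest_results.py.
import json
from pathlib import Path

rows = []
for trace in Path("kernels").glob("*/*/*.x.logs/trace_hart_00000000.trace.json"):
    kernel, size = trace.parts[-4], trace.parts[-3]  # [KERNEL], [SIZE]
    rows.append((kernel, size, json.loads(trace.read_text())))

print(rows[:3])
```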

### Plotting Results

We can plot a grouped barchart with Matplotlib from the CSV file extracted in the previous step:

```shell
$ scripts/plotting/plot_barchart.py -f output.csv -s scripts/plotting/configs/cycles/barchart.mplstyle -c scripts/plotting/configs/cycles/barchart.json
$ ls output.pdf
```

The `.mplstyle` file controls stylistic plotting options.
Since Matplotlib style files cannot control all aspects of a plot, we also include a JSON config file.
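
For reference, applying a style file together with a JSON config boils down to something like the following sketch; the `ylabel` config key is hypothetical and not necessarily one that `plot_barchart.py` actually reads:

```python
# Minimal sketch of a grouped barchart driven by a .mplstyle file plus a
# JSON config; the "ylabel" key is a hypothetical example of such a config.
import json
import matplotlib.pyplot as plt
import pandas as pd

plt.style.use("scripts/plotting/configs/cycles/barchart.mplstyle")
with open("scripts/plotting/configs/cycles/barchart.json") as f:
    cfg = json.load(f)

df = pd.read_csv("output.csv")
# Aggregate over sizes (mean) for a simple grouped view per kernel/version.
df.pivot_table(index="kernel", columns="version", values="cycles").plot.bar()
plt.ylabel(cfg.get("ylabel", "cycles"))
plt.savefig("output.pdf")
```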