#386: Refactored ttrt code to better fit API designs and singleton pattern (#387)
tapspatel authored Aug 15, 2024
1 parent d187e56 commit a288b13
Showing 5 changed files with 1,842 additions and 966 deletions.
128 changes: 108 additions & 20 deletions docs/src/ttrt.md
@@ -1,8 +1,6 @@
# `ttrt`

This tool is intended to be a swiss army knife for working with flatbuffers generated by the compiler. Its primary role is to inspect and run flatbuffer files. It enables the running of flatbuffer files without a front-end runtime.

## Building

@@ -19,15 +17,26 @@ See the [ttmlir-opt](./ttmlir-opt.md) documentation for more information on how
## APIs
```bash
ttrt --help
ttrt read
ttrt run
ttrt query
ttrt perf (coming soon)
ttrt check (coming soon)
```

## Command Line

There are different ways you can use the APIs under ttrt. The first is via the command line, as follows. By default, all logging is saved in the `ttrt.log` file. All artifacts are saved in the `ttrt-artifacts` folder under the directory given by the `TT_MLIR_HOME` environment variable.
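
For example, a minimal session might look like the following (the checkout path and flatbuffer name are illustrative, not prescriptive):

```bash
# ttrt resolves its artifact directory relative to TT_MLIR_HOME
export TT_MLIR_HOME=/path/to/tt-mlir

# logging for these invocations accumulates in ttrt.log; with --save-artifacts,
# outputs are written under $TT_MLIR_HOME/ttrt-artifacts
ttrt query --save-artifacts
ttrt read --section version out.ttnn
```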

### read
Read sections of a binary file

```bash
ttrt read --help
ttrt read --section mlir out.ttnn
ttrt read --section cpp out.ttnn
ttrt read --section version out.ttnn
ttrt read --section system-desc out.ttnn
ttrt read --section system_desc out.ttnn
ttrt read --section inputs out.ttnn
ttrt read --section outputs out.ttnn
ttrt read --section all out.ttnn
@@ -37,12 +46,16 @@ ttrt read --section all /dir/of/flatbuffers
```

### run
Run a binary file or a directory of binary files
Note: it's required to be on a system with silicon and to have a runtime-enabled build (`-DTTMLIR_ENABLE_RUNTIME=ON`).

```bash
ttrt run --help
ttrt run out.ttnn
ttrt run out.ttnn --seed 0
ttrt run out.ttnn --init arange
ttrt run out.ttnn --identity
ttrt run out.ttnn --identity --rtol 1 --atol 1
ttrt run out.ttnn --clean-artifacts
ttrt run out.ttnn --save-artifacts
ttrt run out.ttnn --loops 10
@@ -53,21 +66,18 @@ ttrt run /dir/of/flatbuffers --loops 10
```

### query
Query information about the current system
Note: it's required to be on a system with silicon and to have a runtime-enabled build (`-DTTMLIR_ENABLE_RUNTIME=ON`).

```bash
ttrt query --help
ttrt query --system-desc
ttrt query --system-desc-as-json
ttrt query --system-desc-as-dict
ttrt query --save-artifacts
ttrt query --clean-artifacts
```

### perf (coming soon)
Run performance mode of a binary file or a directory of binary files
Note: it's required to be on a system with silicon and to have a runtime-enabled build (`-DTTMLIR_ENABLE_RUNTIME=ON`), as well as a perf-enabled build (`-DTT_RUNTIME_ENABLE_PERF_TRACE=ON`) with `export ENABLE_TRACY=1`.
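
As a rough sketch, a perf-capable build might be configured as follows (the build directory, generator, and any additional flags your environment needs are assumptions; see the Building section above for the authoritative steps):

```bash
# enable Tracy-based tracing and configure a runtime + perf-trace build
export ENABLE_TRACY=1
cmake -B build -DTTMLIR_ENABLE_RUNTIME=ON -DTT_RUNTIME_ENABLE_PERF_TRACE=ON
cmake --build build
```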

```bash
ttrt perf --help
@@ -83,15 +93,93 @@ ttrt perf /dir/of/flatbuffers
ttrt perf /dir/of/flatbuffers --loops 10
```

## ttrt as a python package

The other way to use the APIs under ttrt is to import it as a library. This allows you to use it in custom Python scripts.

### Import ttrt as a python package
```python
from ttrt.common.api import API
```

### Setup API and register all features
```python
API.initialize_apis()
```

### Setup arguments
You can specify certain arguments to pass to each API, or use the default arguments provided

#### args
This can be a dictionary of values to set inside your API instance. These are the same options as found on the command line. You can get the full list of supported arguments via the `API.registered_args` dictionary. Any argument not provided will be set to its default.
```python
custom_args = API.Query.registered_args
custom_args["clean-artifacts"] = True
query_instance = API.Query(args=custom_args)

custom_args = { "clean-artifacts": True }
query_instance = API.Query(args=custom_args)
```
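If you are unsure which keys an API accepts, the registered-args dictionary can be inspected directly; a small sketch, using `API.Query` as above:

```python
from ttrt.common.api import API

API.initialize_apis()

# the keys of registered_args are the supported options -- the same ones the
# command line accepts; values you do not override keep their defaults
print(sorted(API.Query.registered_args.keys()))
```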
#### logging
You can specify a logging module to set inside your API instance. The rationale behind this is to support different instances of different APIs, each logging to its own file.
```python
from ttrt.common.util import Logger

log_file_name = "some_file_name.log"
custom_logger = Logger(log_file_name)
read_instance = API.Read(logging=custom_logger)
```
#### artifacts
You can specify an artifacts directory to store all the metadata generated during the execution of an API run. This allows you to use different artifact directories for different API instances if you wish.
```python
from ttrt.common.util import Artifacts
log_file_name = "some_file_name.log"
artifacts_folder_path = "/opt/folder"
custom_logger = Logger(log_file_name)
custom_artifacts = Artifacts(logging=custom_logger, artifacts_folder_path=artifacts_folder_path)
run_instance = API.Run(artifacts=custom_artifacts)
```
### Execute API
Once all the arguments are set up, you can run your API instance with all your provided arguments. Note that APIs are stateless, so subsequent calls to the same API instance will not preserve previous call artifacts. If you wish to call the APIs multiple times, you can, for example, generate a new artifacts directory for each subsequent run.
```python
query_instance()
read_instance()
run_instance()
```
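For example, since each call is independent, you can give every invocation its own logger and artifacts directory so repeated runs do not overwrite each other. A sketch assembled only from the constructors shown above (paths and file names are illustrative):

```python
from ttrt.common.api import API
from ttrt.common.util import Logger
from ttrt.common.util import Artifacts

API.initialize_apis()

for i in range(2):
    # a dedicated log file and artifacts folder per invocation
    logger = Logger(f"run_{i}.log")
    artifacts = Artifacts(logging=logger, artifacts_folder_path=f"/opt/folder/run_{i}")

    run_instance = API.Run(
        args={"binary": "/path/to/subtract.ttnn", "save-artifacts": True},
        logging=logger,
        artifacts=artifacts,
    )
    run_instance()
```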
### Putting it all together
You can do interesting stuff when combining all the above features into your python script
```python
from ttrt.common.api import API
from ttrt.common.util import Logger
from ttrt.common.util import Artifacts

API.initialize_apis()

custom_args = API.Run.registered_args
custom_args["clean-artifacts"] = True
custom_args["save-artifacts"] = True
custom_args["loops"] = 10
custom_args["init"] = "randn"
custom_args["binary"] = "/path/to/subtract.ttnn"

log_file_name = "some_file_name.log"
custom_logger = Logger(log_file_name)

artifacts_folder_path = "/opt/folder"
custom_artifacts = Artifacts(logging=custom_logger, artifacts_folder_path=artifacts_folder_path)

run_instance = API.Run(args=custom_args, logging=custom_logger, artifacts=custom_artifacts)
run_instance()
```
## bonus
- artifacts are saved in the `ttrt-artifacts` directory if the `--save-artifacts` option is provided
- you can specify `SYSTEM_DESC_PATH` with the path to your ttsys file, and lit will automatically generate all the flatbuffer binaries for that system
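
A sketch of the second bullet (the descriptor path is a placeholder; use wherever your ttsys file actually lives, for example one serialized by `ttrt query --system-desc --save-artifacts`):

```bash
# point the test infrastructure at a serialized system descriptor
export SYSTEM_DESC_PATH=/path/to/your/system_desc.ttsys
# then run the project's lit test suite so the flatbuffer binaries are
# regenerated for that system
```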
170 changes: 7 additions & 163 deletions runtime/tools/python/ttrt/__init__.py
@@ -17,12 +17,9 @@
import shutil

import ttrt.binary
from ttrt.common.api import read, run, query, perf, init_fns
from ttrt.common.util import read_actions
from ttrt.common.api import API


#######################################################################################
#######################################**MAIN**########################################
#######################################################################################
def main():
import argparse

@@ -31,171 +28,18 @@ def main():
)
subparsers = parser.add_subparsers(required=True)

"""
API: read
"""
read_parser = subparsers.add_parser(
"read", help="read information from flatbuffer binary"
)
read_parser.add_argument(
"--section",
default="all",
choices=sorted(list(read_actions.keys())),
help="output sections of the fb",
)
read_parser.add_argument(
"--clean-artifacts",
action="store_true",
help="clean all artifacts from previous runs",
)
read_parser.add_argument(
"--save-artifacts",
action="store_true",
help="save all artifacts during run",
)
read_parser.add_argument("binary", help="flatbuffer binary file")
read_parser.set_defaults(func=read)

"""
API: run
"""
run_parser = subparsers.add_parser("run", help="run a flatbuffer binary")
run_parser.add_argument(
"--program-index",
default="all",
help="the program inside the fbb to run",
)
run_parser.add_argument(
"--clean-artifacts",
action="store_true",
help="clean all artifacts from previous runs",
)
run_parser.add_argument(
"--loops",
default=1,
help="number of loops",
)
run_parser.add_argument(
"--save-artifacts",
action="store_true",
help="save all artifacts during run",
)
run_parser.add_argument(
"--init",
default="randn",
choices=init_fns,
help="Function to initialize tensors with",
)
run_parser.add_argument(
"--identity",
action="store_true",
help="Do a golden identity test on the output tensors",
)
run_parser.add_argument(
"--rtol",
default=1e-05,
type=float,
help="rtol for golden test",
)
run_parser.add_argument(
"--atol",
default=1e-08,
type=float,
help="atol for golden test",
)
run_parser.add_argument(
"--seed",
default=0,
help="Seed for random number generator",
)
run_parser.add_argument("binary", help="flatbuffer binary file")
run_parser.set_defaults(func=run)

"""
API: query
"""
query_parser = subparsers.add_parser(
"query", help="query information about the current system"
)
query_parser.add_argument(
"--system-desc",
action="store_true",
help="serialize a system desc for the current system to a file",
)
query_parser.add_argument(
"--system-desc-as-json",
action="store_true",
help="print the system desc as json",
)
query_parser.add_argument(
"--system-desc-as-dict",
action="store_true",
help="print the system desc as python dict",
)
query_parser.add_argument(
"--clean-artifacts",
action="store_true",
help="clean all artifacts from previous runs",
)
query_parser.add_argument(
"--save-artifacts",
action="store_true",
help="save all artifacts during run",
)
query_parser.set_defaults(func=query)

"""
API: perf
"""
perf_parser = subparsers.add_parser(
"perf", help="run performance trace and collect performance data"
)
perf_parser.add_argument(
"--program-index",
default="all",
help="the program inside the fbb to run",
)
perf_parser.add_argument(
"--device",
action="store_true",
help="collect performance trace on both host and device",
)
perf_parser.add_argument(
"--generate-params",
action="store_true",
help="generate json file of model parameters based off of perf csv file",
)
perf_parser.add_argument(
"--perf-csv",
default="",
help="perf csv file generated from performance run",
)
perf_parser.add_argument(
"--clean-artifacts",
action="store_true",
help="clean all artifacts from previous runs",
)
perf_parser.add_argument(
"--loops",
default=1,
help="number of loops",
)
perf_parser.add_argument(
"--save-artifacts",
action="store_true",
help="save all artifacts during run",
)
perf_parser.add_argument("binary", help="flatbuffer binary file")
perf_parser.set_defaults(func=perf)
API.initialize_apis()
for api_name, api_class in API.registered_apis.items():
api_class.generate_subparser(subparsers)

try:
args = parser.parse_args()
except:
parser.print_help()
return

# run command
args.func(args)
request_api = args.api(args)
request_api()


if __name__ == "__main__":