Updates to 3.5 documentation post release (#1158)
* Post 3.5 launch fixes

* Integrate filtered HTML Model Zoo
quentonh authored and GitHub Enterprise committed Jul 3, 2023
1 parent 98f972d commit 5880ef4
Showing 290 changed files with 2,128 additions and 19,910 deletions.
2 changes: 1 addition & 1 deletion docs/.buildinfo
@@ -1,4 +1,4 @@
# Sphinx build info version 1
# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
config: 69378c3e4e70920506b6c832d7ea1ffb
config: e8809926e287b8b0d9656262fa0b07bc
tags: 645f666f9bcd5a90fca523b33c5a78b7
Binary file added docs/_images/VEK280_Top_img.png
2 changes: 1 addition & 1 deletion docs/_sources/docs/install/China_Ubuntu_servers.rst.txt
@@ -9,7 +9,7 @@ Vitis |trade| AI Docker images leverage Ubuntu 20.04. In your Ubuntu installatio
deb http://us.archive.ubuntu.com/ubuntu/ focal universe
You can see that the hostname “archive.ubuntu.com” resolves to servers located within the United States. When building the Vitis AI Docker image, whether for `CPU-only <https://github.com/Xilinx/Vitis-AI/blob/master/docker/dockerfiles/vitis-ai-cpu.Dockerfile>`__ or `GPU <https://github.com/Xilinx/Vitis-AI/blob/master/docker/dockerfiles/vitis-ai-gpu.Dockerfile>`__ applications Docker will attempt to pull from US servers. As a result, users accessing from China will generally experience slow download speeds.
You can see that the hostname “archive.ubuntu.com” resolves to servers located within the United States. When building the Vitis AI Docker image, whether for CPU-only or GPU-accelerated containers, Docker will attempt to pull from US servers. As a result, users accessing from China will generally experience slow download speeds.

Prior to building the Vitis AI Docker image, it is recommended that you modify **/etc/apt/sources.list** and the vitis-ai-gpu.Dockerfile.
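A minimal sketch of that substitution is shown below; the Tsinghua University mirror used here is only one possible China-hosted mirror and is an assumption, not a requirement.

.. code-block:: Bash

# On the host, before building the Vitis AI Docker image. Back up the original list,
# then point apt at a China-hosted mirror (assumed example: mirrors.tuna.tsinghua.edu.cn).
sudo cp /etc/apt/sources.list /etc/apt/sources.list.bak
sudo sed -i 's|http://.*archive.ubuntu.com/ubuntu/|https://mirrors.tuna.tsinghua.edu.cn/ubuntu/|g' /etc/apt/sources.list
# Apply the same substitution to the sources.list entries inside the Dockerfile you intend to build (for example, vitis-ai-gpu.Dockerfile).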

9 changes: 5 additions & 4 deletions docs/_sources/docs/quickstart/v70.rst.txt
@@ -2,7 +2,7 @@
Quick Start Guide for Alveo V70
###############################

The AMD **DPUCV2DX8G** for the Alveo |trade| V70 is a configurable computation engine dedicated to convolutional neural networks. It supports a highly optimized instruction set, enabling the deployment of most convolutional neural networks. The following instructions will help you to install the software and packages required to support V70.
The AMD **DPUCV2DX8G** for the Alveo |trade| V70 is a configurable computation engine dedicated to convolutional neural networks. It supports a highly optimized instruction set, enabling the deployment of most convolutional neural networks. The following instructions will help you install the software and packages required to support V70.

.. image:: ../reference/images/V70.PNG
:width: 1300
@@ -114,7 +114,7 @@ From inside the docker container, execute one of the following commands to set t
Vitis-AI Model Zoo
==================

You can now select a model from the Vitis AI Model Zoo `Vitis AI Model Zoo <../workflow-model-zoo.html>`__. Navigate to the `model-list subdirectory <https://github.com/Xilinx/Vitis-AI/tree/master/model_zoo/model-list>`__ and select the model that you wish to test. For each model, a YAML file provides key details of the model. In the YAML file there are separate hyperlinks to download the model for each supported target. Choose the correct link for your target platform and download the model.
You can now select a model from the `Vitis AI Model Zoo <../workflow-model-zoo.html>`__. Navigate to the `model-list subdirectory <https://github.com/Xilinx/Vitis-AI/tree/master/model_zoo/model-list>`__ and select the model that you wish to test. For each model, a YAML file provides key details of the model. In the YAML file there are separate hyperlinks to download the model for each supported target. Choose the correct link for your target platform and download the model.

- Take the ResNet50 model as an example.

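A hedged sketch of the download step follows; the link placeholder must be replaced with the exact URL taken from the model's YAML file, and the archive name shown here is hypothetical.

.. code-block:: Bash

# Hypothetical placeholders: substitute the download link and archive name given in the model's YAML file.
[Docker] $ cd /workspace
[Docker] $ wget -O resnet50_model.tar.gz "<download-link-from-yaml>"
[Docker] $ tar -xzvf resnet50_model.tar.gz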
@@ -144,7 +144,7 @@ Run the Vitis AI Examples

.. code-block:: Bash
[Docker] $ tar -xzvf vitis_ai_runtime_r3.5.0_image_video.tar.gz -C /w/examples/vai_runtime
[Docker] $ tar -xzvf vitis_ai_runtime_r3.5.0_image_video.tar.gz -C /workspace/examples/vai_runtime
3. Navigate to the example directory.

@@ -157,6 +157,7 @@ Run the Vitis AI Examples

.. code-block:: Bash
[Docker] $ sudo chmod u+r+x build.sh
[Docker] $ bash -x build.sh
5. Run the example.
@@ -412,7 +413,7 @@ contain test images and videos that can be leveraged to evaluate our quantized m

.. code-block:: Bash
[Docker] $ ./test_jpeg_classification resnet18_pt /workspace/examples/vai_library/samples/classification/images/002.jpg
[Docker] $ ./test_jpeg_classification resnet18_pt /workspace/examples/vai_library/samples/classification/images/001.jpg
If you wish to do so, you can review the `result.jpg` file. OpenCV function calls have been used to overlay the predictions.

17 changes: 12 additions & 5 deletions docs/_sources/docs/quickstart/vek280.rst.txt
@@ -2,7 +2,12 @@
Quick Start Guide for Versal |trade| AI Edge VEK280
###################################################

The AMD **DPUCV2DX** for Versal |trade| AI Edge is a configurable computation engine dedicated to convolutional neural networks. It supports a highly optimized instruction set, enabling the deployment of most convolutional neural networks. The following instructions will help you to install the software and packages required to support VEK280.
The AMD **DPUCV2DX8G** for Versal |trade| AI Edge is a configurable computation engine dedicated to convolutional neural networks. It supports a highly optimized instruction set, enabling the deployment of most convolutional neural networks. The following instructions will help you to install the software and packages required to support VEK280.

.. image:: ../reference/images/VEK280_Top_img.png
:width: 400
:align: center



*************
@@ -28,7 +33,7 @@ WSL
This is an optional step intended to enable Windows users to evaluate Vitis |trade| AI.


Although this is not a fully tested and supported flow, in most cases users will be able to execute this basic tutorial on Windows. The Windows Subsystem for Linux (WSL) can be installed from the command line. Open a Powershell prompt as an Administrator and execute the following command:
Although this is not a fully supported flow, in most cases users will be able to execute this basic tutorial on Windows. The Windows Subsystem for Linux (WSL) can be installed from the command line. Open a Powershell prompt as an Administrator and execute the following command:

.. code-block:: Bash
@@ -138,6 +143,7 @@ Cross compile the ``resnet50_pt`` example.
.. code-block:: Bash
[Docker] $ cd examples/vai_runtime/resnet50_pt
[Docker] $ sudo chmod u+r+x build.sh
[Docker] $ bash -x build.sh
If the compilation process does not report an error and the executable file ``resnet50_pt`` is generated, then the host environment is installed correctly. If an error is reported, double-check that you executed the ``source ~/petalinux....`` command.
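As a quick, hedged sanity check (assuming the ``file`` utility is available in the container), you can confirm that the binary was cross-compiled for the Arm target rather than the x86 host:

.. code-block:: Bash

# The executable should report as an AArch64 ELF binary; an x86-64 result would suggest the cross-compilation environment was not sourced.
[Docker] $ file resnet50_pt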
@@ -216,7 +222,7 @@ If you are using a point-to-point connection or DHCP is not available, you can m
Vitis-AI Model Zoo
==================

You can now select a model from the Vitis AI Model Zoo `Vitis AI Model Zoo <../workflow-model-zoo.html>`__. Navigate to the `model-list subdirectory <https://github.com/Xilinx/Vitis-AI/tree/master/model_zoo/model-list>`__ and select the model that you wish to test. For each model, a YAML file provides key details of the model. In the YAML file there are separate hyperlinks to download the model for each supported target. Choose the correct link for your target platform and download the model.
You can now select a model from the `Vitis AI Model Zoo <../workflow-model-zoo.html>`__. Navigate to the `model-list subdirectory <https://github.com/Xilinx/Vitis-AI/tree/master/model_zoo/model-list>`__ and select the model that you wish to test. For each model, a YAML file provides key details of the model. In the YAML file there are separate hyperlinks to download the model for each supported target. Choose the correct link for your target platform and download the model.

1. Take the ResNet50 model as an example.

@@ -453,7 +459,7 @@ The Vitis AI Compiler compiles the graph operators as a set of micro-coded instr
.. code-block:: Bash
[Docker] $ cd /workspace/resnet18
[Docker] $ vai_c_xir -x quantize_result/ResNet_int.xmodel -a /opt/vitis_ai/compiler/arch/DPUCV2DX/VEK280/arch.json -o resnet18_pt -n resnet18_pt
[Docker] $ vai_c_xir -x quantize_result/ResNet_int.xmodel -a /opt/vitis_ai/compiler/arch/DPUCV2DX8G/VEK280/arch.json -o resnet18_pt -n resnet18_pt
- If compilation is successful, the ``resnet18_pt.xmodel`` file should be generated according to the specified DPU architecture.

@@ -527,6 +533,7 @@ contain test images and videos that can be leveraged to evaluate our quantized m
.. code-block:: Bash
[Target] $ cd ~/Vitis-AI/examples/vai_library/samples/classification
[Target] $ chmod u+r+x build.sh
[Target] $ ./build.sh
4. Execute the single-image test application.
@@ -537,7 +544,7 @@ contain test images and videos that can be leveraged to evaluate our quantized m
If you wish to do so, you can copy the `result.jpg` file back to your host and review the output. OpenCV function calls have been used to overlay the predictions.

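A hedged sketch of moving the files over the network is shown below; the target IP address is a placeholder, and the result path is assumed to be the classification sample directory used above.

.. code-block:: Bash

# Run on the host. Placeholder IP address: substitute the address of your VEK280 target.
scp root@192.168.1.100:Vitis-AI/examples/vai_library/samples/classification/result.jpg .
# Likewise, a video clip of your own (webm or raw format) can be pushed to the target for the next step.
scp my_clip.webm root@192.168.1.100:Vitis-AI/examples/vai_library/samples/classification/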
5. To run the video example, run the following command. To keep this simple we will use one of the Vitis AI video samples, but you should scp your own video clip to the target (webm / raw formats).
5. To run the video example, run the following command. To keep this simple we will use one of the Vitis AI video samples, but users should scp their own video clip to the target in a webm or raw format.

.. code-block:: Bash
9 changes: 6 additions & 3 deletions docs/_sources/docs/reference/release_notes.rst.txt
@@ -81,11 +81,14 @@ TensorFlow 1 CNN Quantizer
- Support for setting the opset version when exporting to ONNX format.

Bug Fixed:
1. Fixed a bug where the AddV2 operation is misunderstood as a BiasAdd.
1. Fixed a bug where the AddV2 operation is misinterpreted as a BiasAdd.

Compiler
--------
- Release notes to be announced ASAP
- New operators supported: Broadcast add/mul, Bilinear downsample, Trilinear downsample, Group conv2d, Strided-slice
- Performance improved on XV2DPU
- Error messages improved
- Compilation time reduced

PyTorch Optimizer
-----------------
@@ -124,7 +127,7 @@ Library

Model Inspector
---------------
- Release notes to be announced ASAP
- Added support for DPUCV2DX8G

Profiler
--------
2 changes: 1 addition & 1 deletion docs/_sources/docs/reference/version_compatibility.rst.txt
@@ -84,7 +84,7 @@ Zynq |trade| Ultrascale+ |trade|
- 5.15

* - v2.0
- 3.5
- 3.4
- Vivado / Vitis / PetaLinux 2021.2
- 5.10

11 changes: 4 additions & 7 deletions docs/_sources/docs/workflow-model-development.rst.txt
@@ -17,7 +17,7 @@ In the early phases of development, it is highly recommended that the developer

For more information on the Model Inspector, see the following resources:

- When you are ready to get started with the Vitis AI Model Inspector, refer to the examples provided for both `PyTorch <https://github.com/Xilinx/Vitis-AI/tree/v3.5/examples/vai_quantizer/pytorch/inspector_tutorial.ipynb>`__ and `TensorFlow <https://github.com/Xilinx/Vitis-AI/tree/v3.5/src/vai_quantizer/vai_q_tensorflow2.x/README.md#inspecting-vai_q_tensorflow2>`__.
- When you are ready to get started with the Vitis AI Model Inspector, refer to the examples provided for both `PyTorch <https://github.com/Xilinx/Vitis-AI/tree/v3.5/src/vai_quantizer/vai_q_pytorch/example/jupyter_notebook/inspector/inspector_tutorial.ipynb>`__ and `TensorFlow <https://github.com/Xilinx/Vitis-AI/tree/v3.5/src/vai_quantizer/vai_q_tensorflow2.x/README.md#inspecting-vai_q_tensorflow2>`__.

- If your graph uses operators that are not natively supported by your specific DPU target, see the :ref:`Operator Support <operator-support>` section.

@@ -129,13 +129,10 @@ Quantization Related Resources

- For additional details on the Vitis AI Quantizer, refer the "Quantizing the Model" chapter in the `Vitis AI User Guide <https://docs.xilinx.com/access/sources/dita/map?isLatest=true&ft:locale=en-US&url=ug1414-vitis-ai>`__.

- TensorFlow 2.x examples are available as follows:
- `TF2 Post-Training Quantization <https://github.com/Xilinx/Vitis-AI/tree/v3.5/src/vai_quantizer/vai_q_tensorflow2.x/tensorflow_model_optimization/g3doc/guide/quantization/post_training.md>`__
- `TF2 Quantization Aware Training <https://github.com/Xilinx/Vitis-AI/tree/v3.5/src/vai_quantizer/vai_q_tensorflow2.x/tensorflow_model_optimization/g3doc/guide/quantization/training.md>`__
- TensorFlow 2.x examples are available `here <https://github.com/Xilinx/Vitis-AI/tree/v3.5/examples/vai_quantizer/tensorflow2x>`__

- PyTorch examples are available `here <https://github.com/Xilinx/Vitis-AI/tree/v3.5/src/vai_quantizer/vai_q_pytorch/example>`__

- PyTorch examples are available as follows:
- `PT Post-Training Quantization <https://github.com/Xilinx/Vitis-AI/tree/v3.5/src/vai_quantizer/vai_q_pytorch/example/resnet18_quant.py>`__
- `PT Quantization Aware Training <https://github.com/Xilinx/Vitis-AI/tree/v3.5/src/vai_quantizer/vai_q_pytorch/example/resnet18_qat.py>`__

.. _model-compilation:

7 changes: 6 additions & 1 deletion docs/_sources/docs/workflow-model-zoo.rst.txt
@@ -18,11 +18,16 @@ All the models in the Model Zoo are deployed on AMD adaptable hardware with `Vit

To make the job of using the Model Zoo a little easier, we have provided a downloadable spreadsheet and an online table that incorporates key data about the Model Zoo models. The spreadsheet and tables include comprehensive information about all models, including links to the original papers and datasets, source framework, input size, computational cost (GOPs), and float and quantized accuracy. **You can download the spreadsheet** :download:`here <reference/ModelZoo_Github.xlsx>`.

.. The below is functional (remove the .. comment on the second line) but has formatting issues that are currently unresolved.
.. raw:: html
.. :file: reference/ModelZoo_Github.htm
.. For now we will just do this:
.. raw:: html

<a href="reference/ModelZoo_Github_web.htm"><h4>Click here to view the Model Zoo Details & Performance table online.</h4></a><br><br>

.. note:: Please note that if the models are marked as "Non-Commercial Use Only", users must comply with this `AMD license agreement <https://github.com/Xilinx/Vitis-AI/blob/master/model_zoo/Xilinx-license-agreement-for-non-commercial-models.md>`__
.. note:: Please note that if the models are marked as "Non-Commercial Use Only", users must comply with this `AMD license agreement <https://github.com/Xilinx/Vitis-AI/blob/master/model_zoo/AMD-license-agreement-for-non-commercial-models.md>`__


.. note:: The model performance benchmarks listed in these tables are verified using Vitis AI v3.5 and Vitis AI Library v3.5. For each platform, specific DPU configurations are used and highlighted in the table's header. Vitis AI and the Vitis AI Library are available as free downloads from `Vitis AI Github <https://github.com/Xilinx/Vitis-AI>`__ and `Vitis AI Library Github <https://github.com/Xilinx/Vitis-AI/tree/v3.5/examples/vai_library>`__.
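For reference, a minimal sketch of fetching both from GitHub at the v3.5 tag (the clone location is arbitrary; the Vitis AI Library sources live under ``examples/vai_library`` in the same repository):

.. code-block:: Bash

git clone https://github.com/Xilinx/Vitis-AI.git
cd Vitis-AI
git checkout v3.5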
10 changes: 5 additions & 5 deletions docs/_sources/docs/workflow-system-integration.rst.txt
@@ -130,8 +130,8 @@ IP and Reference Designs
* - DPUCZDX8G `PG338 <https://docs.xilinx.com/r/en-US/pg338-dpu>`__
- MPSoC & Kria K26
- 3.0
- `Download <https://www.xilinx.com/bin/public/openDownload?filename=DPUCZDX8G_VAI_v3.5.tar.gz>`__
- `Get IP <https://www.xilinx.com/bin/public/openDownload?filename=DPUCZDX8G_ip_repo_VAI_v3.5.tar.gz>`__
- `Download <https://www.xilinx.com/bin/public/openDownload?filename=DPUCZDX8G_VAI_v3.0.tar.gz>`__
- `Get IP <https://www.xilinx.com/bin/public/openDownload?filename=DPUCZDX8G_ip_repo_VAI_v3.0.tar.gz>`__

* - DPUCVDX8G `PG389 <https://docs.xilinx.com/r/en-US/pg389-dpu>`__
- VCK190
@@ -174,7 +174,7 @@ Vitis Integration

The Vitis |trade| workflow specifically targets developers with a software-centric approach to AMD SoC system development. Vitis AI is differentiated from traditional FPGA flows, enabling you to build FPGA acceleration into your applications without developing RTL kernels.

The Vitis workflow enables the integration of the DPU IP as an acceleration kernel that is loaded at runtime in the form of an ``xclbin`` file. To provide developers with a reference platform that can be used as a starting point, the Vitis AI repository includes several `reference designs <https://github.com/Xilinx/Vitis-AI/tree/v3.5/dpu>`__ for the different DPU architectures and target platforms.
The Vitis workflow enables the integration of the DPU IP as an acceleration kernel that is loaded at runtime in the form of an ``xclbin`` file. To provide developers with a reference platform that can be used as a starting point, please refer to the VEK280 reference design included in this release for the DPUCV2DX8G. For MPSoC and Versal AI Core (non AIE-ML) devices, please refer to the /dpu subdirectory in the Vitis AI 3.0 Github repository.

In addition, a Vitis tutorial is available which provides the `end-to-end workflow <https://github.com/Xilinx/Vitis-Tutorials/tree/2023.1/Vitis_Platform_Creation/Design_Tutorials/02-Edge-AI-ZCU104>`__ for creating a Vitis Platform for ZCU104 targets.

@@ -213,7 +213,7 @@ There are two ways to integrate the Vitis |trade| AI Library and Runtime in a cu

- Build the Linux image using Petalinux, incorporating the necessary recipes.

- Install Vitis AI 3.5 to the target leveraging a pre-built package at run time. For details of this procedure, please see :ref:`Vitis AI Online Installation <vart_vail_online_install>`
- Install Vitis AI 3.5 to the target leveraging a pre-built package at run time. For details of this procedure, please see the instructions in the Vitis AI Online Installation section below.


.. _vart_vail_online_install:
@@ -345,7 +345,7 @@ Run the following commands to upgrade PetaLinux.
Following this upgrade, you will find ``vitis-ai-library_3.5.bb`` recipe in ``<petalinux project>/components/yocto/layers/meta-vitis-ai``.

For details about this process, refer to `Petalinux Upgrade <https://docs.xilinx.com/r/en-US/ug1144-petalinux-tools-reference-guide/petalinux-upgrade-Option>`__.
For details about this process, refer to `Petalinux Upgrade <https://docs.xilinx.com/r/en-US/ug1144-petalinux-tools-reference-guide/petalinux-upgrade>`__.

.. note:: ``2023.1_update1`` will be released approximately one month after the Vitis AI 3.5 release. The name of ``2023.1_update1`` may change; modify the commands accordingly.
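A hedged illustration of the upgrade command is shown below; the update URL is a hypothetical placeholder patterned on earlier releases, and the platform list should match your target architecture.

.. code-block:: Bash

# Hypothetical invocation: -u points at the published sdkupdate location, -p selects the platform(s).
petalinux-upgrade -u <url-of-2023.1_update1-sdkupdate> -p "aarch64"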

2 changes: 1 addition & 1 deletion docs/docs/install/Alveo_X11.html
@@ -160,7 +160,7 @@ <h1>X11 Support for Running Vitis AI Docker with Alveo<a class="headerlink" href

<div role="contentinfo">
<p>&#169; Copyright 2022-2023, Advanced Micro Devices, Inc.
<span class="lastupdated">Last updated on June 29, 2023.
<span class="lastupdated">Last updated on July 2, 2023.
</span></p>
</div>
