
Dynamic MVAU integration #1242

Draft · wants to merge 25 commits into base: dev
Conversation

azizb-xlnx (Collaborator) commented Nov 29, 2024

To minimize changes, we overload the weight input of the existing MVAU superclass to serve as the second input for a dynamic MVAU. The specialized layer transform will instantiate the appropriate RTL subclass based on the presence of a weight initializer.
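The dispatch rule described above can be sketched in plain Python. The `Graph`/`Node` classes below are toy stand-ins for the ONNX model wrapper, and the subclass names (`MVAU_rtl`, `MVAU_rtl_dynamic`) are illustrative assumptions, not the PR's actual identifiers; `inp_B` is the renamed weight input mentioned in the commits.

```python
# Toy sketch of the specialize-layers decision: a weight initializer on
# the second input selects the static RTL MVAU; its absence means the
# weights stream in at runtime, so the dynamic variant is instantiated.
# Graph/Node are stand-ins for FINN's ONNX wrappers; the subclass names
# MVAU_rtl / MVAU_rtl_dynamic are assumptions for illustration.
from dataclasses import dataclass, field


@dataclass
class Node:
    op_type: str
    inputs: list  # tensor names; inputs[1] is the overloaded weight/inp_B slot


@dataclass
class Graph:
    nodes: list
    initializers: dict = field(default_factory=dict)  # tensor name -> weights


def specialize_mvau(graph: Graph, node: Node) -> str:
    """Pick the RTL subclass from the presence of a weight initializer
    on the node's second input."""
    has_weight_init = node.inputs[1] in graph.initializers
    return "MVAU_rtl" if has_weight_init else "MVAU_rtl_dynamic"


# Static MVAU: second input is backed by a weight initializer.
static_graph = Graph(
    nodes=[Node("MVAU", ["inp_A", "weights"])],
    initializers={"weights": [[1, 0], [0, 1]]},
)
# Dynamic MVAU: second input (inp_B) is a runtime stream, no initializer.
dynamic_graph = Graph(nodes=[Node("MVAU", ["inp_A", "inp_B"])])
```

With these toy graphs, `specialize_mvau(static_graph, static_graph.nodes[0])` yields `"MVAU_rtl"`, while the initializer-less graph yields `"MVAU_rtl_dynamic"`.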

dkorolij and others added 25 commits November 21, 2024 12:29
- Remove weights when using dynamic MM
- Rename weight input to inp_B
- Use WDT as a flag to test dynamic MM
- Rename weight to inp_B
- Make behavior conditional on weight initializer
- Rename test node inputs for dynamic MM
- Explicitly use the weightDataType field to decide if this is a dynamic MVAU
- If the MVAU node does not provide WDT, assume it has two inputs
- These functions are specific to dynamic MM
- Assume that the weight tensor is the same as input tensor B
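The WDT-as-flag convention from the commits above could look roughly like this; the attribute dictionary, default value, and helper names are assumptions about the node representation, not the PR's actual code.

```python
# Hypothetical check for the weightDataType (WDT) flag convention:
# a node that does not provide a WDT is treated as a dynamic MVAU
# that consumes its weights as a second runtime input.
def is_dynamic_mvau(node_attrs: dict) -> bool:
    """Return True when the node carries no weightDataType attribute,
    i.e. its weights arrive on the second input (inp_B) at runtime."""
    return not node_attrs.get("weightDataType")


def num_runtime_inputs(node_attrs: dict) -> int:
    """Dynamic MVAUs consume two runtime streams (inp_A and inp_B);
    static ones consume one, with weights baked into the design."""
    return 2 if is_dynamic_mvau(node_attrs) else 1
```

For example, `num_runtime_inputs({"weightDataType": "INT8"})` gives 1, while an attribute dict without a WDT entry gives 2.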
@azizb-xlnx azizb-xlnx changed the base branch from main to dev November 29, 2024 17:35