How are we supposed to transform the ReLU layers into a form that FINN likes? I am having trouble with the following code (the model was exported using export_qonnx):
At this point, the model displays fine. I can see Conv, MaxPool, Relu, and Quant layers.
I don't notice much of a difference here.
I can see the shapes look good, and layer names are fine.
Here's the part I've been playing with and trying to debug. I know I need the 3D-to-4D transform, since the tensors in this model are 1D, which FINN does not like by default. However, it does not like ReLU either.
The QuantReluHandler transform is a FINN function, but I think it needs to be called via model.transform(); however, it wants nodes and node indices as inputs, so I was throwing together something custom. My current code is, at this point, garbage. I am pretty sure I am just doing the ReLU handling wrong, but I can't find an example that does it properly.
Does anyone know of one? Change3Dto4DTensors will yell at me if I try to run a model with ReLU layers through it, so I was trying to fix them beforehand. I've also tried different placements of the QONNX-to-FINN conversion (it used to be much earlier in the code).
Below is the warning I get, which is likely what causes the build to fail.