Hi! I'm interested in extracting the attention weights on each variable (I have 75 different variables) to determine the relative importance of each variable in predicting the output. How would I go about extracting these weights?
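For what it's worth, one generic way to pull such scores out, assuming the model is a PyTorch module and the scores are computed inside an identifiable submodule (both assumptions; the repo's actual layer names may differ), is to register a forward hook on that submodule. A minimal, self-contained sketch with a hypothetical toy gated model standing in for the real one:

```python
import torch
import torch.nn as nn

# Toy stand-in for the real model: a per-variable sigmoid gate
# followed by a linear head. Replace with the actual model.
class ToyGatedModel(nn.Module):
    def __init__(self, n_vars: int = 75):
        super().__init__()
        self.gate = nn.Sequential(nn.Linear(n_vars, n_vars), nn.Sigmoid())
        self.head = nn.Linear(n_vars, 1)

    def forward(self, x):
        g = self.gate(x)          # per-variable gate/"attention" scores
        return self.head(x * g)   # gated features -> prediction

model = ToyGatedModel()
captured = {}

def save_scores(module, inputs, output):
    # Detach so the hook does not keep the autograd graph alive.
    captured["scores"] = output.detach()

hook = model.gate.register_forward_hook(save_scores)

with torch.no_grad():
    model(torch.randn(32, 75))  # any forward pass fills `captured`

hook.remove()

# Average over the batch to get one score per variable.
per_variable = captured["scores"].mean(dim=0)
print(per_variable.shape)  # torch.Size([75])
```

Replace `ToyGatedModel` and the `gate` attribute with the actual model and whichever submodule emits the scores; the hook pattern itself carries over unchanged.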
I have read the paper, and it is very confusing; simple things are made complicated.
There is no "relative importance of each variable" to extract, because the model uses a sigmoid function instead of a softmax.
The word "attention" implies a softmax, since a softmax assigns relative importance across the data: its weights compete and sum to one. A sigmoid does not do that; it squashes each score independently.
With a sigmoid, the block is just a gated residual block (skip connection), as the quick check below illustrates.
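The difference is easy to verify numerically (plain PyTorch, not tied to this repo's code): softmax weights are normalized to sum to one, so they can be read as relative shares, while sigmoid outputs are independent and can all be large at once.

```python
import torch

scores = torch.tensor([2.0, 1.0, 0.1])

soft = torch.softmax(scores, dim=0)
sig = torch.sigmoid(scores)

print(soft, soft.sum())  # tensor([0.6590, 0.2424, 0.0986]), sum = 1.0
print(sig, sig.sum())    # tensor([0.8808, 0.7311, 0.5250]), sum ~ 2.14

# Softmax weights compete: raising one score lowers the others' shares.
# Sigmoid gates are independent: all of them can sit near 1 at the same
# time, so they cannot be read as "relative importance".
```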
In summary, another example of scientific theater.
There are well-defined case studies for comparing results, such as the raw temporal data of the UCI HAR dataset.
It is very easy to make a standard model give bad results and your own model give nice results when you are the one defining the temporal windows. Choose a well-known, well-defined case study if you want to claim that your model gives state-of-the-art results.
Make your code public, and hold that well-known, well-defined case study out as test data for us to verify.