As noted in issue #119, one of the questions that often comes up is how to get feature importances, or more generally, how to understand what the model bases its decisions on.
There is a lot to say about this topic, so I think we should add a call-out box about it, including useful links to resources. Episode 4 seems like a good place for this call-out box.
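The call-out box could also include a small code example. As a starting point, here is a minimal sketch of permutation feature importance using scikit-learn's `permutation_importance` (one common model-agnostic approach; the dataset and model below are illustrative placeholders, not the ones used in the lesson):

```python
# Minimal sketch: permutation feature importance with scikit-learn.
# The dataset and classifier are placeholders for illustration only.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Shuffle each feature in turn and measure the drop in test-set score;
# a large drop means the model relies heavily on that feature.
result = permutation_importance(model, X_test, y_test,
                                n_repeats=10, random_state=0)

# Print the five most important features.
ranked = sorted(zip(X.columns, result.importances_mean),
                key=lambda pair: -pair[1])
for name, importance in ranked[:5]:
    print(f"{name}: {importance:.3f}")
```

Permutation importance has the advantage of working with any fitted model, which makes it a reasonable default to show before pointing to more specialized XAI methods.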
In DIANNA, we are currently working on adding time-series datasets for the XAI methods, and one of the datasets we plan to use is the weather dataset from the lesson (in a slightly different setting: predicting seasons). Once the DIANNA dashboard is ready (probably Q2/Q3 2023), we could add a link to it in the lesson to give a visual overview of explainable AI and what it shows.
@svenvanderburg, unfortunately the dashboard is not ready yet. As I mentioned, in terms of time frame this will only be possible in Q3 of next year. I will open an issue once the dashboard is live and I can provide some images/links.