model explainability #750

Open
alitirmizi23 opened this issue Apr 24, 2023 · 3 comments
Labels
enhancement (New feature or request) · question (Further information is requested)

Comments

@alitirmizi23

I was checking out one of the utilities for model explanations and see two functions (grad_cam and feat_attribution). Is this attribution related to SHAP in any way? It doesn't appear to be. Would a SHAP-like implementation be helpful here for local explainability of predictions on multivariate time series input? I can try to look into it and add the feature.
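For context, a minimal sketch of what model-agnostic SHAP values could look like for (samples, variables, steps) input, assuming the shap package and a trained tsai learner; learn and X are placeholder names, not existing tsai API:

```python
import numpy as np
import shap  # pip install shap

# Assumes `learn` is a trained tsai learner and X is an array of shape (n_samples, n_vars, n_steps).
n_samples, n_vars, n_steps = X.shape

def predict_flat(x_flat):
    # KernelExplainer passes 2D arrays, so restore the (batch, n_vars, n_steps) shape.
    x = x_flat.reshape(-1, n_vars, n_steps).astype(np.float32)
    probas, *_ = learn.get_X_preds(x)  # tsai inference helper; class probabilities come first
    return np.asarray(probas)

# A small background set keeps KernelExplainer's runtime manageable.
background = X[:50].reshape(50, -1)
explainer = shap.KernelExplainer(predict_flat, background)

# Local explanation of one sample: one SHAP value per (variable, step) cell and class.
shap_values = explainer.shap_values(X[:1].reshape(1, -1), nsamples=200)
```

KernelExplainer is slow but model-agnostic; a gradient-based explainer would be much faster for deep models.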

@vrodriguezf
Contributor

Back in the day there was a SHAP wrapper for fastai models (https://github.com/nestordemeure/fastshap), but as far as I know, there's nothing like that today.

@oguiza added the question (Further information is requested) and answered? labels on May 4, 2023
@oguiza added the enhancement (New feature or request) label and removed the answered? label on Sep 3, 2023
@oguiza
Contributor

oguiza commented Sep 3, 2023

Hi @alitirmizi23,
I'm sorry, I misinterpreted your description earlier. Yes, it'd be good to investigate whether SHAP-like functionality could be added to tsai. Please let me know if you are still interested and whether you need any help.
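A gradient-based variant might be a good starting point for that investigation: the sketch below applies shap.DeepExplainer directly to the learner's underlying PyTorch module, keeping the (variables, steps) shape intact. This is only an assumption-laden sketch (learn and X are placeholders, and DeepExplainer supports many but not all PyTorch ops):

```python
import shap   # pip install shap
import torch

# Assumes `learn` is a trained tsai learner (its PyTorch module is learn.model)
# and X is a float array of shape (n_samples, n_vars, n_steps).
model = learn.model.eval().cpu()

background = torch.from_numpy(X[:50]).float()    # reference distribution
samples    = torch.from_numpy(X[50:55]).float()  # samples to explain

# DeepExplainer attributes each prediction to every (variable, step) cell.
explainer = shap.DeepExplainer(model, background)
shap_values = explainer.shap_values(samples)  # one (batch, n_vars, n_steps) array per class
```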

@lisu579

lisu579 commented Oct 9, 2023

> Hi @alitirmizi23, I'm sorry, I misinterpreted your description earlier. Yes, it'd be good to investigate whether SHAP-like functionality could be added to tsai. Please let me know if you are still interested and whether you need any help.

Hi! I have some questions:
1. Can we use permutation methods to calculate the feature importance of each variable at every step? (See the sketch below.)
2. Can we use a SHAP method in tsai to derive the local variation of feature importance for each feature?
3. I'm also trying to figure out whether window_len in 'applying_sliding_window' is the length of one window. For example, if I have 100 samples and window_len is set to 30, does that mean there are 30 samples in one window, or that the 100 samples are split into 30 windows?
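Regarding question 1, permutation importance is easy to compute by hand: shuffle one variable (channel) across samples and measure how much a chosen metric degrades. A minimal sketch, where predict, metric, X and y are placeholder names:

```python
import numpy as np

def channel_permutation_importance(predict, X, y, metric, n_repeats=5, seed=0):
    """Importance of each variable: mean drop in `metric` after shuffling that channel."""
    rng = np.random.default_rng(seed)
    baseline = metric(y, predict(X))          # X has shape (n_samples, n_vars, n_steps)
    importances = np.zeros(X.shape[1])
    for v in range(X.shape[1]):
        drops = []
        for _ in range(n_repeats):
            Xp = X.copy()
            Xp[:, v, :] = Xp[rng.permutation(len(Xp)), v, :]  # shuffle channel v across samples
            drops.append(baseline - metric(y, predict(Xp)))
        importances[v] = np.mean(drops)       # a large drop means the variable matters
    return importances
```

The same idea works per time step by shuffling X[:, :, t] instead of a channel, and I believe recent tsai versions ship Learner.feature_importance and Learner.step_importance helpers along these lines (worth checking your version). Regarding question 3, as I understand tsai's SlidingWindow, window_len is the length of each window: 100 samples with window_len=30 and stride=1 yield 71 windows of 30 time steps each, not 30 windows.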
