
The final sparsity is smaller than preset. #22

Open
ChenMnZ opened this issue Mar 5, 2021 · 2 comments

Comments

ChenMnZ commented Mar 5, 2021

Hi, thank you for your great work.
Today I wanted to run an ablation experiment on your work, so I modified the momentum_growth function
from
y, idx = torch.sort(torch.abs(grad).flatten(), descending=True)
to
y, idx = torch.sort(torch.abs(grad).flatten(), descending=False)
I ran the experiment with the command:
python main.py --growth momentum --prune magnitude --redistribution momentum --prune-rate 0.2 --density 0.1 --data cifar10 --model vgg-c
I found that the final sparsity drops to 0.073. Reading the source code, I found that the momentum_growth function can't grow enough weights because it doesn't check whether the mask was 0 before growth. You handle this problem with adjusted_growth, so I wonder why that method works in your original function but not in my ablation experiment.
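The undercount described above can be reproduced with a toy example: if a growth routine simply takes the first total_regrowth sorted indices, every candidate that is already active in the mask wastes a growth slot, so fewer new weights are actually regrown. A minimal pure-Python sketch with made-up numbers (not the repo's code):

```python
mask = [1, 0, 0, 1, 0, 1, 0, 0]   # 1 = weight currently active
idx = [5, 0, 3, 1, 7, 2, 6, 4]    # candidate order, e.g. indices sorted by |grad|
total_regrowth = 3

# Naive: take the first total_regrowth candidates regardless of mask state.
naive_new = sum(1 for i in idx[:total_regrowth] if mask[i] == 0)

# Correct: keep scanning candidates, counting only positions that are
# currently inactive, until enough genuinely new weights are found.
grown, new_count = [], 0
for i in idx:
    if mask[i] == 0:
        grown.append(i)
        new_count += 1
    if new_count == total_regrowth:
        break

print(naive_new, new_count)   # → 0 3: the naive version grows nothing here
```

In this toy setup the first three candidates all happen to be active already, so the naive selection grows zero new weights while the filtered scan grows the requested three.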

ChenMnZ (Author) commented Mar 5, 2021

I solved the problem with the following code:

    temp = new_mask.flatten()
    regrown = 0
    for index, m in enumerate(idx):
        if not temp[m]:            # only count positions that are not yet active
            regrown += 1
        if regrown == total_regrowth:
            break
    # include the candidate at which the loop stopped
    new_mask.data.view(-1)[idx[:index + 1]] = 1.0

TimDettmers (Owner) commented Jun 6, 2022

Thanks for posting the solution to the problem! I don't currently quite understand what was going on. Is the code you provided a general improvement that does the same thing, or is it only useful for your ablation experiments?
