#0: Fix double deallocate
yieldthought committed Nov 12, 2024
1 parent 274f58a commit b16ff97
Showing 1 changed file with 0 additions and 1 deletion.
models/demos/llama3/tt/llama_attention.py

@@ -351,7 +351,6 @@ def forward_decode(
     dense_out_sharded, ttnn.L1_MEMORY_CONFIG
 ) # TODO: remove as soon as we have sharded support in for all CCL

-ttnn.deallocate(attn_output_cat)
 ttnn.deallocate(dense_out_sharded)

 # All reduce
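For readers unfamiliar with the bug class: ttnn.deallocate frees a tensor's device buffer, so calling it a second time on the same tensor is the "double deallocate" in the title. The sketch below is a minimal, self-contained Python analogue of the failure mode, not the real ttnn API; the assumption (labeled in the comments) is that attn_output_cat is already deallocated on another path in forward_decode, so the removed line freed the same buffer twice.

# Plain-Python analogue of the bug; Tensor and deallocate here are
# illustrative stand-ins, not ttnn types.

class Tensor:
    def __init__(self, name: str):
        self.name = name
        self.allocated = True

def deallocate(t: Tensor) -> None:
    # Mirrors the assumed contract of ttnn.deallocate: a buffer may
    # only be freed once.
    if not t.allocated:
        raise RuntimeError(f"double deallocate of {t.name}")
    t.allocated = False

attn_output_cat = Tensor("attn_output_cat")
dense_out_sharded = Tensor("dense_out_sharded")

# Pre-commit behaviour: the extra deallocate removed by this commit...
deallocate(attn_output_cat)
deallocate(dense_out_sharded)

# ...collides with the (assumed) other deallocate of the same tensor
# elsewhere in forward_decode:
try:
    deallocate(attn_output_cat)
except RuntimeError as err:
    print(err)  # -> double deallocate of attn_output_cat

Deleting the earlier call leaves a single deallocate per tensor, which is the whole fix.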
