Are there any benefits to running BERTopic using a TPU on Google Colab? #1012
Replies: 2 comments 1 reply
Hm, I tested this on Google Colab Pro+ with a 500k-tweet dataset. Somehow even loading the data (the step that logs something about "batches") takes much longer on the TPU (GPU: 15 min vs TPU: 2 hours), and then crunching the data is faster on the GPU as well (GPU: 25 min vs TPU: 32 min). Maybe there is some benefit for very large datasets, but for a 500k-tweet dataset the GPU Premium runtime works much faster than the TPU overall (GPU: 15 min loading + 25 min crunching = 40 min vs TPU: 120 min loading + 32 min crunching = 152 min). This could also be related to the fact that GPU Premium gives us an A100 GPU, which outperforms whatever optimisations the TPU has...
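For reference, the totals above follow directly from the per-stage timings reported in the comment. A minimal sketch of that arithmetic (the stage names and variable names are just illustrative labels, not BERTopic API):

```python
# Reported stage timings in minutes, taken from the benchmark comment above.
gpu = {"loading": 15, "crunching": 25}   # Colab Pro+ GPU Premium (A100)
tpu = {"loading": 120, "crunching": 32}  # Colab Pro+ TPU; 2 hours = 120 min

gpu_total = sum(gpu.values())  # 15 + 25 = 40 min
tpu_total = sum(tpu.values())  # 120 + 32 = 152 min

print(f"GPU total: {gpu_total} min, TPU total: {tpu_total} min")
print(f"TPU was {tpu_total / gpu_total:.1f}x slower end to end")
```

So on this particular 500k-tweet run the TPU was roughly 3.8x slower overall, driven mostly by the data-loading stage.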
Are there any benefits to running BERTopic on a TPU in Google Colab rather than on a GPU instance? Thanks!