.merge_models requires pre-trained models to have the same embedding model? #1727
-
When attempting to merge two pre-trained models (MaartenGr/BERTopic_Wikipedia and MaartenGr/BERTopic_ArXiv), I get an error. Even when initializing each BERTopic model with a sentence transformer that produces embeddings of the same size, as described in the documentation, I still run into the same error.
Has anyone figured out a workaround other than retraining the models?
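For reference, the failing call looks roughly like the sketch below (a minimal sketch, assuming BERTopic >= 0.16 where merge_models is available; the specific sentence transformer name is a placeholder, and passing embedding_model at load time is my reading of what was tried):

```python
from bertopic import BERTopic
from sentence_transformers import SentenceTransformer

# Placeholder sentence transformer producing same-size embeddings;
# attaching it at load time does not change the topic embeddings
# already stored in the pre-trained models.
embedding_model = SentenceTransformer("all-mpnet-base-v2")

wiki_model = BERTopic.load("MaartenGr/BERTopic_Wikipedia", embedding_model=embedding_model)
arxiv_model = BERTopic.load("MaartenGr/BERTopic_ArXiv", embedding_model=embedding_model)

# This is the call that errors out, since the two models were
# originally trained with different embedding models.
merged_model = BERTopic.merge_models([wiki_model, arxiv_model])
```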
Replies: 1 comment
-
This is what is currently expected of the .merge_models functionality. Embeddings are created based on the average embeddings of a cluster, which cannot be easily generalized to another embedding model.
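In other words, the workaround is to retrain the models with a single shared embedding model before merging. A minimal sketch of that route, where the corpus, the sentence transformer name, and the min_similarity value are illustrative assumptions rather than part of this discussion:

```python
from bertopic import BERTopic
from sentence_transformers import SentenceTransformer
from sklearn.datasets import fetch_20newsgroups

# Placeholder corpora standing in for the two document collections.
docs = fetch_20newsgroups(subset="all", remove=("headers", "footers", "quotes"))["data"]
docs_a, docs_b = docs[:5000], docs[5000:10000]

# Train both models with the *same* embedding model so that their
# cluster-average topic embeddings live in the same vector space.
embedding_model = SentenceTransformer("all-MiniLM-L6-v2")
topic_model_a = BERTopic(embedding_model=embedding_model).fit(docs_a)
topic_model_b = BERTopic(embedding_model=embedding_model).fit(docs_b)

# Topics from the second model that are not similar enough to any
# topic in the first are appended as new topics in the merged model.
merged_model = BERTopic.merge_models([topic_model_a, topic_model_b], min_similarity=0.7)
```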