
Access LLM [Keywords] (or just use multi-aspect approach) #1706

Answered by MaartenGr
thundergore asked this question in Q&A

I'm wondering: if I want to do both, can I access the [keywords] passed to OpenAI in the prompt? Or do I need to run a multi-aspect approach, generating those representation terms and then separately getting an OpenAI label?

You can access the keywords passed to the OpenAI prompt through the `.prompts_` variable of your OpenAI representation model. Another option would indeed be to run the representation separately, which makes it easier to access the keywords that are passed to the OpenAI representation. The only thing you would have to do is something like this:

representation_model = {"OpenAI": OpenAI(client, model="gpt-3.5-turbo", chat=True)}

and pass that to BERTopic.
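
Putting the pieces together, a minimal sketch of that flow might look like the following. It assumes the `openai>=1.0` client and that `docs` is your own list of documents; the `"OpenAI"` dictionary key is just the aspect name, and the default c-TF-IDF keyword representation remains available alongside it.

```python
import openai
from bertopic import BERTopic
from bertopic.representation import OpenAI

client = openai.OpenAI(api_key="sk-...")  # placeholder key

# Add an OpenAI-generated label as an extra aspect; the default
# keyword-based representation is kept as the main aspect.
representation_model = {"OpenAI": OpenAI(client, model="gpt-3.5-turbo", chat=True)}

topic_model = BERTopic(representation_model=representation_model)
topics, probs = topic_model.fit_transform(docs)  # docs: your list of documents

# Inspect the prompts that were sent to OpenAI, including the keywords
# that were filled into the [KEYWORDS] placeholder:
print(representation_model["OpenAI"].prompts_)

# View the representations side by side (keyword aspect and OpenAI label):
topic_model.get_topic_info()
```

This way you get both the keyword representation and the OpenAI label from a single fit, and `.prompts_` shows exactly which keywords ended up in each prompt.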

Answer selected by thundergore