Paper QA Settings for AWS Bedrock #667
The Paper QA repository does not provide specific examples or documentation for integrating AWS Bedrock models directly. However, you can configure custom language models using the `Settings` object.
This approach follows the general method for configuring custom models in Paper QA described in the repository's README [1]. Adjust the configuration to fit the specifics of your AWS Bedrock setup.
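As a rough illustration of that approach (a sketch only: the model IDs follow LiteLLM's `bedrock/...` naming convention, and the `Settings` field names shown in the comments are assumptions based on the paper-qa README, not confirmed in this thread):

```python
# Sketch of a Bedrock-only Paper QA configuration via LiteLLM model strings.
# The model IDs below are illustrative assumptions; substitute the Bedrock
# models actually enabled in your AWS account.
llm_name = "bedrock/anthropic.claude-3-sonnet-20240229-v1:0"
embedding_name = "bedrock/amazon.titan-embed-text-v1"

# Extra kwargs forwarded to litellm. AWS credentials/region come from the
# usual boto3 credential chain (env vars, ~/.aws/credentials, IAM role).
llm_config = {
    "model_list": [
        {"model_name": llm_name, "litellm_params": {"model": llm_name}}
    ]
}

# With paper-qa installed, these would be passed along the lines of:
#   from paperqa import Settings, ask
#   settings = Settings(llm=llm_name, llm_config=llm_config,
#                       embedding=embedding_name)
#   answer = ask("your question here", settings=settings)
print(llm_config["model_list"][0]["litellm_params"]["model"])
```

The point is that no Bedrock-specific client code is needed: everything routes through litellm's provider-prefixed model strings.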
Hi @istranic, can you post your stack trace(s)? Also, check these:
Hi @jamesbraza, I made progress and solved some of the issues. I can now manually add and query docs, with all models hosted locally.
However, when I add docs using the method above, it's not doing any of the advanced parsing, correct? In order to benefit from advanced parsing, I need to use something like:
However, this doesn't work because …
So I can understand: what do you mean by "advanced parsing"?
Yeah, when using … And for setting the … So you won't be directly instantiating a …
I've seen the example for changing embedding models, and instantiating it via `Settings` makes sense in principle. However, the example provided doesn't highlight how to specify the embedding configuration for a custom model. I've tried various permutations of …

To rephrase my first question: is there any logical difference between adding documents manually and running …?
An issue in this line is that the `LiteLLMEmbeddingModel` class only passes … In my working code that specifies the embedding settings using the …
Hi @jamesbraza, just wondering if you've had a chance to consider the potential bug above.
To answer this one:
So they ultimately invoke the same method. In other words, you can either:
I can try to update the README wording a bit to clarify this.
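The two equivalent routes can be sketched as follows (the function and class names in the comments are assumptions about the paper-qa API shape, based on its README, not confirmed by this thread):

```python
# Two routes into the same add/query machinery (assumed paper-qa API shapes):
#
# Route 1 -- high level: let ask() build and manage the Docs object for you.
#   from paperqa import Settings, ask
#   answer = ask("What does the paper conclude?", settings=settings)
#
# Route 2 -- manual: build the Docs object yourself, then query it.
#   from paperqa import Docs
#   docs = Docs()
#   await docs.aadd("paper.pdf", settings=settings)   # same parsing path
#   answer = await docs.aquery("What does the paper conclude?",
#                              settings=settings)
#
# Either way, the same Settings object drives model selection and parsing,
# which is why the two routes ultimately invoke the same method.
routes = {"high_level": "ask()", "manual": "Docs.aadd() + Docs.aquery()"}
print(routes)
```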
As far as …
So I actually think you're using it as intended. Your example:
You have bypassed the … `Settings(embedding_config=embedding_config)` … Does that clear things up a bit?
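A minimal sketch of the `embedding_config` route (the field names and the region kwarg here are illustrative assumptions; check the paper-qa README for the exact schema your version expects):

```python
# Sketch: passing embedding kwargs through Settings instead of
# instantiating LiteLLMEmbeddingModel yourself. The "kwargs" key and the
# AWS region name are assumptions for illustration.
embedding_name = "bedrock/amazon.titan-embed-text-v1"  # illustrative model ID
embedding_config = {
    # Extra kwargs to forward to litellm's embedding call:
    "kwargs": {"aws_region_name": "us-east-1"},
}

# With paper-qa installed, this would be wired up roughly as:
#   from paperqa import Settings
#   settings = Settings(embedding=embedding_name,
#                       embedding_config=embedding_config)
print(sorted(embedding_config))
```

The design intent, as described above, is that you never touch the embedding model class directly: `Settings` constructs it for you from the name plus config dict.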
Thanks for the clarification @jamesbraza. I'm now able to run … When I add data using … On another note, when I use …

Local Llama 70B model: …

Bedrock model: …
The table data, what is the file type? Let's say the table is in a …
We don't really support table data that well yet in …
We don't pass …
This seems to suggest you are either (a) passing arguments incorrectly somewhere, or (b) AWS Bedrock doesn't support tool calling. https://docs.litellm.ai/docs/providers/bedrock#usage---function-calling implies …
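For reference, a tool-calling request to a Bedrock model through litellm looks roughly like this (the tool name and schema are made up for illustration; `litellm.completion` with a `tools` argument is the real entry point documented at the link above):

```python
# An OpenAI-style tool schema, as accepted by litellm.completion for Bedrock
# models that support function calling. The tool itself is illustrative.
tools = [{
    "type": "function",
    "function": {
        "name": "get_paper_metadata",
        "description": "Look up metadata for a paper by title.",
        "parameters": {
            "type": "object",
            "properties": {"title": {"type": "string"}},
            "required": ["title"],
        },
    },
}]

# With AWS credentials configured, the call would be:
#   import litellm
#   resp = litellm.completion(
#       model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
#       messages=[{"role": "user",
#                  "content": "Find 'Attention Is All You Need'"}],
#       tools=tools,
#   )
print(tools[0]["function"]["name"])
```

If a model refuses this shape, that points at (b): the specific Bedrock model (not Bedrock as a whole) lacking tool-calling support.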
Hi @jamesbraza, regarding our tables: they are within PDFs. Here's an example. It seems like most RAG parsing tools (LlamaParse etc.) create summaries of tables, images, figures, etc., so that would be an awesome feature request for our use cases. I'll consider contributing in this regard as well. Here's the code I'm running. It doesn't look like I'm tweaking the tool settings, so I don't know how it ends up as …
Yes, sounds good.
No, I don't see anything there explaining why … An aside: consider upgrading to … Also, consider using …
Hi, do you have any guidance or examples for the `Settings` parameter in order to use Paper QA with entirely AWS Bedrock models? I'm able to access Bedrock's LLM and embedding models using `boto3` and `litellm`, but I can't figure out how to make it work in the Paper QA Python API, and my errors are very generic and non-specific.