Support for AWS Bedrock models #160
Replies: 4 comments
-
Hi @MartinRistov, I just checked and there is a contribution in progress here. I'll ping you once it is merged so you can review it!
-
@MartinRistov here is a temporary workaround that goes through LangChain:

import { BaseMessage } from "bee-agent-framework/llms/primitives/message";
import { LangChainChatLLM } from "bee-agent-framework/adapters/langchain/llms/chat";
// requires installing the @langchain/community package first
import { BedrockChat } from "@langchain/community/chat_models/bedrock";

console.info("===CHAT===");
const llm = new LangChainChatLLM(
  new BedrockChat({
    model: "anthropic.claude-v2",
    region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
    // endpointUrl: "custom.amazonaws.com",
    credentials: {
      accessKeyId: process.env.BEDROCK_AWS_ACCESS_KEY_ID,
      secretAccessKey: process.env.BEDROCK_AWS_SECRET_ACCESS_KEY,
    },
    temperature: 0,
    maxTokens: undefined,
    maxRetries: 2,
    // other params...
  }),
);
const response = await llm.generate([
  BaseMessage.of({
    role: "user",
    text: "Hello world!",
  }),
]);

Related code: https://github.com/i-am-bee/bee-agent-framework/blob/main/examples/llms/providers/langchain.ts
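The snippet above stops at generate(); to actually see the reply you can print it, assuming the LangChainChatLLM output exposes getTextContent() like the framework's other chat adapters (if not, iterating over response.messages works too):

```ts
// Print the model's reply; getTextContent() is assumed to be available on the
// chat output, matching the other provider examples in the repository.
console.info(response.getTextContent());
```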
-
AWS Bedrock is now supported as an inference provider (thanks to @abughali!): https://github.com/i-am-bee/bee-agent-framework/blob/main/examples/llms/providers/bedrock.ts Looking forward to your feedback on this, @MartinRistov!
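For reference, a minimal sketch of what the native adapter might look like in use; the import path, class name (BedrockChatLLM), and constructor options here are assumptions modelled on the framework's other providers, so treat the linked bedrock.ts example as the source of truth:

```ts
import { BaseMessage } from "bee-agent-framework/llms/primitives/message";
// Assumed import path and class name; check examples/llms/providers/bedrock.ts for the actual API.
import { BedrockChatLLM } from "bee-agent-framework/adapters/bedrock/chat";

// Credentials are resolved by the AWS SDK (env vars, shared config, or an IAM role).
const llm = new BedrockChatLLM({
  region: process.env.AWS_REGION ?? "us-east-1",
  modelId: "anthropic.claude-v2", // hypothetical model id, reused from the workaround above
});

const response = await llm.generate([
  BaseMessage.of({ role: "user", text: "Hello world!" }),
]);
console.info(response.getTextContent());
```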
-
Released in v0.0.41
-
Happy to collaborate and help build this out if you have some guidance! Cheers