ChatBedrock
This doc will help you get started with AWS Bedrock chat models. Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Stability AI, and Amazon via a single API, along with a broad set of capabilities you need to build generative AI applications with security, privacy, and responsible AI. Using Amazon Bedrock, you can easily experiment with and evaluate top FMs for your use case, privately customize them with your data using techniques such as fine-tuning and Retrieval Augmented Generation (RAG), and build agents that execute tasks using your enterprise systems and data sources. Since Amazon Bedrock is serverless, you don't have to manage any infrastructure, and you can securely integrate and deploy generative AI capabilities into your applications using the AWS services you are already familiar with.
AWS Bedrock maintains a Converse API which provides a unified conversational interface for Bedrock models. This API does not yet support custom models. You can see a list of all models that are supported here.
We recommend the Converse API for users who do not need to use custom models. It can be accessed using ChatBedrockConverse.
For detailed documentation of all Bedrock features and configurations head to the API reference.
Overview
Integration details
| Class | Package | Local | Serializable | JS support | Package downloads | Package latest |
| --- | --- | --- | --- | --- | --- | --- |
| ChatBedrock | langchain-aws | ❌ | beta | ✅ | | |
| ChatBedrockConverse | langchain-aws | ❌ | beta | ✅ | | |
Model features
The features below apply to both ChatBedrock and ChatBedrockConverse.
Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
---|---|---|---|---|---|---|---|---|---|
✅ | ✅ | ❌ | ✅ | ❌ | ❌ | ✅ | ❌ | ✅ | ❌ |
Setup
To access Bedrock models you'll need to create an AWS account, set up the Bedrock API service, get an access key ID and secret key, and install the langchain-aws integration package.
Credentials
Head to the AWS docs to sign up for AWS and set up your credentials.
Alternatively, ChatBedrockConverse will read from the following environment variables by default:
```python
import os

# os.environ["AWS_ACCESS_KEY_ID"] = "..."
# os.environ["AWS_SECRET_ACCESS_KEY"] = "..."

# Not required unless using temporary credentials:
# os.environ["AWS_SESSION_TOKEN"] = "..."
```
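As a sketch, you can also set these programmatically before creating the model. The values below are placeholders, not real credentials; `setdefault` is used so that credentials already configured (for example via an AWS profile) are not overwritten:

```python
import os

# Placeholder values -- replace with your real AWS credentials.
# setdefault only fills in a variable if it is not already set.
os.environ.setdefault("AWS_ACCESS_KEY_ID", "your-access-key-id")
os.environ.setdefault("AWS_SECRET_ACCESS_KEY", "your-secret-access-key")

# Only needed when using temporary credentials:
# os.environ.setdefault("AWS_SESSION_TOKEN", "your-session-token")
```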
You'll also need to turn on model access for your account, which you can do by following these instructions.
To enable automated tracing of your model calls, set your LangSmith API key:
```python
import getpass
import os

# os.environ["LANGSMITH_API_KEY"] = getpass.getpass("Enter your LangSmith API key: ")
# os.environ["LANGSMITH_TRACING"] = "true"
```
Installation
The LangChain Bedrock integration lives in the langchain-aws package:

```python
%pip install -qU langchain-aws
```
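With the package installed and credentials configured, a minimal usage sketch looks like the following. The model ID shown is an example; substitute any Bedrock model that is enabled for your account. This requires live AWS credentials, so treat it as illustrative rather than a guaranteed-runnable snippet:

```python
from langchain_aws import ChatBedrockConverse

# Example model ID -- use any model enabled for your AWS account.
llm = ChatBedrockConverse(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",
    temperature=0,
)

# Invoke the model with a simple string prompt.
response = llm.invoke("Translate 'I love programming' into French.")
print(response.content)
```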