AWS Bedrock Converse API: Use same code with different models, works best with Claude 3

AWS has simplified the coding of conversational applications for its Bedrock generative AI service with the introduction of the Converse API, a single API that works with all models that send and receive messages, while still making it possible to pass inference parameters that are unique to a particular model.

Bedrock is an AWS service that supports a variety of foundation models, including Anthropic Claude, AI21 Labs’ Jurassic-2, Stability AI’s Stable Diffusion, Cohere’s Command and Embed, Meta’s Llama 2, models from Mistral AI, and Amazon Titan. The service features model customization using private data, and managed agents that run multi-step tasks.

One of the key ideas in Bedrock is that developers can experiment with different models to see which works best or is most cost-effective. A snag, though, is that each model has its own inference parameters. Claude, for example, takes parameters including the Anthropic version, the maximum number of tokens to generate, the temperature (the amount of randomness), the input messages, the probability cut-off, and optional tools, which are external functions the model can invoke.
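
For illustration, here is a minimal sketch (not from the article) of the model-specific route for Claude 3 using InvokeModel with the boto3 bedrock-runtime client; the region and prompt are arbitrary, and the request body follows Anthropic’s Messages API format, which other Bedrock models do not share.

```python
import json

import boto3

# Minimal sketch of the model-specific route: InvokeModel with Claude 3 Sonnet.
# The request body follows Anthropic's Messages API format; other Bedrock
# models expect different bodies, so switching models means rewriting this payload.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 512,
    "temperature": 0.5,
    "top_p": 0.9,
    "messages": [
        {"role": "user", "content": [{"type": "text", "text": "What does Amazon Bedrock do?"}]}
    ],
}

response = client.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps(body),
)
print(json.loads(response["body"].read())["content"][0]["text"])
```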

The Converse API provides a consistent interface that works across multiple Bedrock models. Instead of the older InvokeModel or InvokeModelWithResponseStream actions, developers can now use Converse or ConverseStream. There is a base set of inference parameters common to all supported models, and model-specific parameters can be passed in an additionalModelRequestFields field if needed.
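
Roughly the same call through Converse looks like the sketch below, again assuming boto3’s bedrock-runtime client: messages and inferenceConfig are the model-agnostic fields, while the Claude-specific top_k setting travels in additionalModelRequestFields.

```python
import boto3

# Sketch of the unified route: Converse with the same Claude 3 model.
# messages and inferenceConfig are common to all supported models;
# top_k is Claude-specific, so it goes in additionalModelRequestFields.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[
        {"role": "user", "content": [{"text": "What does Amazon Bedrock do?"}]}
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.5, "topP": 0.9},
    additionalModelRequestFields={"top_k": 200},
)
print(response["output"]["message"]["content"][0]["text"])
```

Trying a different model is then largely a matter of swapping the modelId; the streaming equivalent in boto3 is converse_stream.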

A Converse API example from the tutorial.

There are limitations when using the Converse API. User chat is supported by most Bedrock models, but image chat only works with Claude 3. The full table of supported features across models shows that Claude 3 is the best-supported model, which is not a surprise given that AWS has invested $4 billion in Anthropic.

Tool use is supported when using the Converse API with Claude 3, Mistral Large, and Cohere’s Command R and R+.
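
The shape of a tool-use request through Converse is sketched below; the get_weather tool and its schema are hypothetical, used only to show the toolConfig format and the toolUse block the model returns when it decides to call a tool.

```python
import boto3

# Sketch of tool use via Converse. The get_weather tool is hypothetical;
# toolConfig carries the tool's name, description and JSON input schema.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

tool_config = {
    "tools": [
        {
            "toolSpec": {
                "name": "get_weather",
                "description": "Return the current weather for a city.",
                "inputSchema": {
                    "json": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    }
                },
            }
        }
    ]
}

response = client.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{"role": "user", "content": [{"text": "What's the weather in Seattle?"}]}],
    toolConfig=tool_config,
)

# When the model chooses to call the tool, the reply contains a toolUse block
# with the tool name and the JSON input the model generated.
for block in response["output"]["message"]["content"]:
    if "toolUse" in block:
        print(block["toolUse"]["name"], block["toolUse"]["input"])
```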

Last month AWS also previewed Bedrock Studio, which provides a visual user interface for experimenting with foundation models and large language models. Bedrock Studio automatically deploys the AWS resources it needs, removing complexity, though care is needed to avoid unexpected costs. Bedrock Studio also enables collaboration via shared projects.