When using [Async]AnthropicBedrock I have to provide the full inference-profile ARN "arn:aws:bedrock:us-east-1:0000000000:inference-profile/us.anthropic.claude-3-7-sonnet-20250219-v1:0" as model_name= instead of just anthropic.claude-3-7-sonnet-20250219-v1:0.
If I provide only the bare model ID, I get:
Error code: 400 - {'message': 'Invocation of model ID anthropic.claude-3-7-sonnet-20250219-v1:0 with on-demand throughput isn’t supported. Retry your request with the ID or ARN of an inference profile that contains this model.'}
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'https://bedrock-runtime.us-east-1.amazonaws.com/model/anthropic.claude-3-7-sonnet-20250219-v1:0/invoke'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400
Neither the Anthropic nor the AWS docs make this obvious; both suggest passing the model ID directly.
Maybe just needs a docs update?
Repro
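The original repro snippet wasn't captured above, so here is a minimal sketch of the failing vs. working call using the anthropic Python SDK's AnthropicBedrock client. Assumptions: AWS credentials come from the environment, the account ID in the ARN is a placeholder, and the parameter is named model= in messages.create (the report above refers to it as model_name=).

```python
# Minimal sketch, not the original repro.
from anthropic import AnthropicBedrock

client = AnthropicBedrock(aws_region="us-east-1")

# Fails with the 400 above: bare model ID, on-demand throughput not supported.
# model = "anthropic.claude-3-7-sonnet-20250219-v1:0"

# Works: full inference-profile ARN (account ID is a placeholder).
model = "arn:aws:bedrock:us-east-1:0000000000:inference-profile/us.anthropic.claude-3-7-sonnet-20250219-v1:0"

message = client.messages.create(
    model=model,
    max_tokens=256,
    messages=[{"role": "user", "content": "Hello"}],
)
print(message.content)
```

Per the error message's suggestion to "retry your request with the ID or ARN of an inference profile", the shorter cross-region profile ID (us.anthropic.claude-3-7-sonnet-20250219-v1:0) may also be accepted, though that isn't verified here.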