diff --git a/docs/api/providers.md b/docs/api/providers.md
index 5cd4c81bb5..e92e0ecfaa 100644
--- a/docs/api/providers.md
+++ b/docs/api/providers.md
@@ -4,6 +4,8 @@
::: pydantic_ai.providers.gateway.gateway_provider
+::: pydantic_ai.providers.anthropic.AnthropicProvider
+
::: pydantic_ai.providers.google
::: pydantic_ai.providers.openai
diff --git a/docs/models/anthropic.md b/docs/models/anthropic.md
index 81ed245de6..37253b8777 100644
--- a/docs/models/anthropic.md
+++ b/docs/models/anthropic.md
@@ -78,6 +78,106 @@ agent = Agent(model)
...
```
+## Cloud Platform Integrations
+
+You can use Anthropic's Claude models through several cloud platforms. For AWS Bedrock and Google Vertex AI, pass a custom client to [`AnthropicProvider`][pydantic_ai.providers.anthropic.AnthropicProvider]; for Microsoft Azure, use an OpenAI-compatible client as shown below.
+
+### AWS Bedrock
+
+To use Claude models via [AWS Bedrock](https://aws.amazon.com/bedrock/claude/):
+
+=== "With Pydantic AI Gateway"
+
+ ```python {title="Learn about Gateway" test="skip"}
+ from pydantic_ai import Agent
+
+ agent = Agent('gateway/bedrock:us.anthropic.claude-haiku-4-5-20251001-v1:0')
+ ...
+ ```
+
+=== "Directly to Provider API"
+
+ Use the [`AsyncAnthropicBedrock`](https://docs.anthropic.com/en/api/claude-on-amazon-bedrock) client from the `anthropic` package:
+
+ ```python {test="skip"}
+ from anthropic import AsyncAnthropicBedrock
+
+ from pydantic_ai import Agent
+ from pydantic_ai.models.anthropic import AnthropicModel
+ from pydantic_ai.providers.anthropic import AnthropicProvider
+
+    bedrock_client = AsyncAnthropicBedrock()  # Uses AWS credentials and region from the environment
+ provider = AnthropicProvider(anthropic_client=bedrock_client)
+    model = AnthropicModel('us.anthropic.claude-haiku-4-5-20251001-v1:0', provider=provider)  # Bedrock model ID
+ agent = Agent(model)
+ ...
+ ```
+
+    !!! note "AnthropicModel vs BedrockConverseModel"
+        This approach uses Anthropic's SDK with AWS Bedrock credentials. For an alternative that uses the AWS SDK (boto3) directly, see [`BedrockConverseModel`](bedrock.md), sketched below.
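+
+    If you prefer to go through the AWS SDK (boto3) directly, a minimal sketch (assuming AWS credentials and region are configured in your environment, and reusing the Bedrock model ID from the Gateway example above) could look like this:
+
+    ```python {test="skip"}
+    from pydantic_ai import Agent
+    from pydantic_ai.models.bedrock import BedrockConverseModel
+
+    # boto3 resolves AWS credentials and region from the environment
+    model = BedrockConverseModel('us.anthropic.claude-haiku-4-5-20251001-v1:0')
+    agent = Agent(model)
+    ...
+    ```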
+
+### Google Vertex AI
+
+To use Claude models via [Google Cloud Vertex AI](https://cloud.google.com/vertex-ai/generative-ai/docs/partner-models/use-claude):
+
+=== "With Pydantic AI Gateway"
+
+ ```python {title="Learn about Gateway" test="skip"}
+ from pydantic_ai import Agent
+
+    agent = Agent('gateway/google-vertex:claude-sonnet-4-5@20250929')
+ ...
+ ```
+
+=== "Directly to Provider API"
+
+ Use the [`AsyncAnthropicVertex`](https://docs.anthropic.com/en/api/claude-on-vertex-ai) client from the `anthropic` package:
+
+ ```python {test="skip"}
+ from anthropic import AsyncAnthropicVertex
+
+ from pydantic_ai import Agent
+ from pydantic_ai.models.anthropic import AnthropicModel
+ from pydantic_ai.providers.anthropic import AnthropicProvider
+
+ vertex_client = AsyncAnthropicVertex(region='us-east5', project_id='your-project-id')
+ provider = AnthropicProvider(anthropic_client=vertex_client)
+    model = AnthropicModel('claude-sonnet-4-5@20250929', provider=provider)  # Vertex AI model ID
+ agent = Agent(model)
+ ...
+ ```
+
+### Microsoft Azure
+
+Azure offers Claude models through its "Models as a Service" offering, exposed via an OpenAI-compatible API. Use [`OpenAIModel`][pydantic_ai.models.openai.OpenAIModel] with an Azure-configured client:
+
+```python {test="skip"}
+from azure.identity import DefaultAzureCredential, get_bearer_token_provider
+from openai import AsyncAzureOpenAI
+
+from pydantic_ai import Agent
+from pydantic_ai.models.openai import OpenAIModel
+from pydantic_ai.providers.openai import OpenAIProvider
+
+token_provider = get_bearer_token_provider(
+ DefaultAzureCredential(),
+ 'https://cognitiveservices.azure.com/.default',
+)
+
+azure_client = AsyncAzureOpenAI(
+ azure_ad_token_provider=token_provider,
+ azure_endpoint='https://your-resource.services.ai.azure.com/api/projects/your-project',
+ api_version='2025-01-01-preview',
+)
+
+provider = OpenAIProvider(openai_client=azure_client)
+model = OpenAIModel('claude-sonnet-4-5', provider=provider)
+agent = Agent(model)
+...
+```
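+
+If key-based authentication is enabled on your Azure resource, you can pass an API key instead of the Azure AD token provider shown above. A minimal sketch, assuming a placeholder key and the same endpoint and API version as above:
+
+```python {test="skip"}
+from openai import AsyncAzureOpenAI
+
+from pydantic_ai import Agent
+from pydantic_ai.models.openai import OpenAIModel
+from pydantic_ai.providers.openai import OpenAIProvider
+
+azure_client = AsyncAzureOpenAI(
+    api_key='your-api-key',  # placeholder: a key from your Azure resource
+    azure_endpoint='https://your-resource.services.ai.azure.com/api/projects/your-project',
+    api_version='2025-01-01-preview',
+)
+
+model = OpenAIModel('claude-sonnet-4-5', provider=OpenAIProvider(openai_client=azure_client))
+agent = Agent(model)
+...
+```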
+
+See [Azure's Claude documentation](https://learn.microsoft.com/en-us/azure/ai-foundry/foundry-models/how-to/use-foundry-models-claude) for setup instructions.
+
## Prompt Caching
Anthropic supports [prompt caching](https://docs.anthropic.com/en/docs/build-with-claude/prompt-caching) to reduce costs by caching parts of your prompts. Pydantic AI provides four ways to use prompt caching:
diff --git a/pydantic_ai_slim/pydantic_ai/providers/anthropic.py b/pydantic_ai_slim/pydantic_ai/providers/anthropic.py
index 2afdbeeef6..5d86a8eec2 100644
--- a/pydantic_ai_slim/pydantic_ai/providers/anthropic.py
+++ b/pydantic_ai_slim/pydantic_ai/providers/anthropic.py
@@ -67,8 +67,11 @@ def __init__(
api_key: The API key to use for authentication, if not provided, the `ANTHROPIC_API_KEY` environment variable
will be used if available.
base_url: The base URL to use for the Anthropic API.
- anthropic_client: An existing [`AsyncAnthropic`](https://github.com/anthropics/anthropic-sdk-python)
- client to use. If provided, the `api_key` and `http_client` arguments will be ignored.
+ anthropic_client: An existing Anthropic client to use. Accepts
+ [`AsyncAnthropic`](https://github.com/anthropics/anthropic-sdk-python),
+ [`AsyncAnthropicBedrock`](https://docs.anthropic.com/en/api/claude-on-amazon-bedrock), or
+ [`AsyncAnthropicVertex`](https://docs.anthropic.com/en/api/claude-on-vertex-ai).
+ If provided, the `api_key` and `http_client` arguments will be ignored.
http_client: An existing `httpx.AsyncClient` to use for making HTTP requests.
"""
if anthropic_client is not None: