Build an AI Contact Center: Amazon Connect + Bedrock Agents (2026)
Stand up a production AI contact center: Amazon Connect contact flow, a Bedrock Agent with Knowledge Bases, Lex V2 fallback, and a Lambda for tool execution. Real CDK + JSON.
TL;DR — Amazon Connect now ships with a first-class Bedrock Agents block in contact flows. Drop a Bedrock Agent into a Connect flow, point it at a Knowledge Base on S3, and add a Lambda action group for tool calls. No Lex required for the happy path; keep Lex V2 as a typed-slot fallback.
What you'll build
A working Amazon Connect inbound number that routes callers into a Bedrock Agent named support-agent with a Knowledge Base over your help-center S3 bucket. The agent can answer FAQs from the KB, transfer to a queue when the caller asks for a human, and call a Lambda function to look up order status. Real audio handled by Connect + Polly generative; reasoning by Claude 4.7 Sonnet.
Prerequisites
- Amazon Connect instance with a claimed phone number.
- Bedrock model access for anthropic.claude-sonnet-4-7-20250620-v1:0 in the same region as Connect.
- S3 bucket with PDFs/markdown to vectorize for the Knowledge Base.
- AWS CDK v2 or Terraform for the boilerplate; a small Lambda for tool calls.
Architecture
```mermaid
flowchart TD
  PSTN[Caller PSTN] --> CONNECT[Amazon Connect]
  CONNECT -->|contact flow block| BA[Bedrock Agent]
  BA -->|retrieve| KB[Bedrock Knowledge Base]
  KB --> S3[(S3 docs)]
  KB --> OS[(OpenSearch Serverless)]
  BA -->|action group| LAM[Lambda lookup_order]
  LAM --> DDB[(DynamoDB)]
  BA -->|response| CONNECT
  CONNECT -->|Polly TTS| PSTN
```
Step 1 — Create the Knowledge Base
In the Bedrock console, create a Knowledge Base support-kb backed by S3 (sync your help-center docs there) and OpenSearch Serverless as the vector store. Use amazon.titan-embed-text-v2:0 for embeddings — fast and cheap.
```bash
aws bedrock-agent create-knowledge-base \
  --name support-kb \
  --role-arn arn:aws:iam::123:role/AmazonBedrockExecutionRoleForKnowledgeBase \
  --knowledge-base-configuration '{"type":"VECTOR","vectorKnowledgeBaseConfiguration":{"embeddingModelArn":"arn:aws:bedrock:us-east-1::foundation-model/amazon.titan-embed-text-v2:0"}}' \
  --storage-configuration file://oss-config.json
```
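The --storage-configuration file points the KB at your OpenSearch Serverless collection. A minimal oss-config.json sketch (the collection ARN, index name, and field names below are placeholders to swap for your own):

```json
{
  "type": "OPENSEARCH_SERVERLESS",
  "opensearchServerlessConfiguration": {
    "collectionArn": "arn:aws:aoss:us-east-1:123:collection/abc123",
    "vectorIndexName": "support-kb-index",
    "fieldMapping": {
      "vectorField": "embedding",
      "textField": "chunk",
      "metadataField": "metadata"
    }
  }
}
```

The index and its vector field must exist in the collection before the first ingestion sync, with a dimension matching the Titan v2 embedding model.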
Step 2 — Define the Lambda action group
```python
# lookup_order.py: Bedrock Agents action-group handler
import boto3
import json

ddb = boto3.client("dynamodb")

def handler(event, _):
    # Bedrock passes tool parameters as a list of {name, type, value} dicts
    order_id = event["parameters"][0]["value"]
    item = ddb.get_item(
        TableName="orders", Key={"id": {"S": order_id}}
    ).get("Item", {})
    body = {
        "status": item.get("status", {}).get("S", "unknown"),
        "eta": item.get("eta", {}).get("S"),
    }
    # Response must echo actionGroup/function in the messageVersion 1.0 shape
    return {
        "messageVersion": "1.0",
        "response": {
            "actionGroup": event["actionGroup"],
            "function": event["function"],
            "functionResponse": {
                "responseBody": {"TEXT": {"body": json.dumps(body)}}
            },
        },
    }
```
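For local testing, it helps to know the shape of the event a function-schema action group receives. A sketch (values are illustrative; the top-level field names follow the Bedrock Agents Lambda input event format):

```python
# Illustrative input event for a function-schema action group.
event = {
    "messageVersion": "1.0",
    "actionGroup": "orders",
    "function": "lookup_order",
    "parameters": [
        {"name": "order_id", "type": "string", "value": "A-1001"},
    ],
}

# The handler above reads parameters[0]; a name-keyed lookup is safer
# once the function gains more than one parameter.
order_id = next(p["value"] for p in event["parameters"] if p["name"] == "order_id")
print(order_id)  # A-1001
```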
Hear it before you finish reading
Talk to a live CallSphere AI voice agent in your browser — 60 seconds, no signup.
Step 3 — Create the Bedrock Agent
```bash
aws bedrock-agent create-agent \
  --agent-name support-agent \
  --foundation-model anthropic.claude-sonnet-4-7-20250620-v1:0 \
  --instruction "You are a friendly support voice agent. Always check the knowledge base first. If the user asks about an order, use the lookup_order tool. Keep replies under 2 sentences." \
  --agent-resource-role-arn arn:aws:iam::123:role/AmazonBedrockAgentRole
```
Then attach the KB and the Lambda action group:
```bash
aws bedrock-agent associate-agent-knowledge-base \
  --agent-id $AID --agent-version DRAFT \
  --knowledge-base-id $KBID --description "Help center"

aws bedrock-agent create-agent-action-group \
  --agent-id $AID --agent-version DRAFT \
  --action-group-name orders \
  --action-group-executor lambda=arn:aws:lambda:us-east-1:123:function:lookup_order \
  --function-schema file://schema.json
```
schema.json lists the lookup_order function with its parameter order_id.
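A minimal schema.json sketch in the Bedrock function-schema format (the description strings are illustrative; only the name and parameter key must match what the handler expects):

```json
{
  "functions": [
    {
      "name": "lookup_order",
      "description": "Look up an order's current status and ETA by order ID",
      "parameters": {
        "order_id": {
          "type": "string",
          "description": "The customer's order ID",
          "required": true
        }
      }
    }
  ]
}
```

Good descriptions matter: the model decides when to call the tool based on them, not on the code.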
Step 4 — Wire the Bedrock Agent block in Connect
In the Connect contact-flow editor (Amazon Connect → Routing → Flows), drop the Amazon Q in Connect or Bedrock Agent block (the latter is GA as of Q1 2026). Set:
- Agent: support-agent, alias PROD
- Input audio: caller media stream (default)
- Output: Polly Ruth (generative)
- Fallback: route to queue tier-1-humans
The block handles bidirectional streaming, partial transcription, barge-in, and turn-taking automatically.
Step 5 — Test from a real number
Claim a number, attach the contact flow, dial in. Connect's CloudWatch logs show every turn with token counts and KB citations — verify the agent is answering from the KB and not hallucinating.
Still reading? Stop comparing — try CallSphere live.
CallSphere ships complete AI voice agents per industry — 14 tools for healthcare, 10 agents for real estate, 4 specialists for salons. See how it actually handles a call before you book a demo.
Step 6 — Add escalation and analytics
Enable Contact Lens for post-call sentiment + categorization. Add a "transfer to human" intent in the agent instruction; on transfer, Connect attaches the agent's transcript as a contact attribute so the human picks up with full context.
Pitfalls
- Bedrock Agents region drift: your Agent, KB, OSS, and Lambda must be in the same region. Cross-region calls will silently fail.
- KB sync lag: StartIngestionJob is async — re-vectorizing 10k docs takes ~15 min. Don't ship without polling.
- Lambda timeout: action-group Lambdas have a hard 30s cap, but voice budgets want <2s. Pre-warm with provisioned concurrency.
- Polly generative cost ($30/M chars) plus Connect's $0.018/min plus Bedrock per-token can hit $0.50/min if unchecked. Use cached input on Bedrock and batch greetings.
- Contact Lens vs HIPAA: turn off transcript storage in Contact Lens for HIPAA tenants; sign a BAA with AWS first.
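The ingestion-polling pitfall above reduces to a small loop. A sketch with the status fetch stubbed out so it runs locally (the real call would be boto3's bedrock-agent get_ingestion_job, as noted in the comment):

```python
import time

def wait_for_ingestion(fetch_status, poll_s=30, max_polls=60):
    """Poll until the ingestion job reaches a terminal state."""
    for _ in range(max_polls):
        status = fetch_status()
        if status in ("COMPLETE", "FAILED"):
            return status
        time.sleep(poll_s)
    raise TimeoutError("ingestion job did not finish in time")

# Stub standing in for the real status fetch, which would look like:
#   client = boto3.client("bedrock-agent")
#   client.get_ingestion_job(knowledgeBaseId=..., dataSourceId=...,
#       ingestionJobId=...)["ingestionJob"]["status"]
statuses = iter(["STARTING", "IN_PROGRESS", "COMPLETE"])
result = wait_for_ingestion(lambda: next(statuses), poll_s=0)
print(result)  # COMPLETE
```

Treat FAILED as a deploy blocker: an agent pointed at a half-synced KB will confidently answer from stale chunks.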
How CallSphere does this in production
CallSphere ships its own contact-center surface that competes with Amazon Connect at $149/$499/$1499 (vs ~$0.018/min + add-ons on Connect). We run 37 voice agents and 90+ tools across 6 verticals on FastAPI :8084 with OpenAI Realtime, falling back to Bedrock Claude for HIPAA-locked tenants. The OneRoof multi-family vertical uses Pion Go + NATS for SIP fan-out at scale. 115+ Postgres tables back the entire orchestration; 14-day trial, 22% affiliate.
FAQ
Q: Connect vs CallSphere — when does Connect make sense? If you already run thousands of agents on Connect and just want AI overlays, stay there. If you're starting from zero, a managed product like CallSphere with prebuilt verticals beats six weeks of Connect contact-flow engineering.
Q: Do I need Lex V2? No — Bedrock Agents handle NLU. Keep Lex only for slot-filling tasks where you need deterministic regex extraction.
Q: How do I handle warm transfers?
Use the Transfer to flow block downstream of the Bedrock block; pass agent transcript via contact attributes.
Q: Can the agent use a CRM tool? Yes — point a Lambda action group at your CRM SDK. Salesforce, HubSpot, Zendesk all have Python SDKs that fit in a Lambda layer.
Q: What's the all-in cost per minute? Connect telephony $0.018 + Bedrock Claude ~$0.05 + Polly generative ~$0.02 = ~$0.10/min, plus KB OSS at-rest costs.
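The same arithmetic as a sketch, using the rates assumed in this FAQ (your token mix and talk ratio will move these numbers):

```python
# Rough per-minute cost from the rates assumed above.
connect_telephony = 0.018   # Connect inbound, $/min
bedrock_claude = 0.05       # approximate Claude token spend per spoken minute
polly_generative = 0.02     # ~650 output chars/min at $30 per 1M chars

total = connect_telephony + bedrock_claude + polly_generative
print(f"${total:.3f}/min")  # $0.088/min
```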
Try CallSphere AI Voice Agents
See how AI voice agents work for your industry. Live demo available -- no signup required.