Top announcements from What's Next with AWS, 2026
AWS just wrapped up its What's Next event with three major announcements that signal where enterprise AI is heading. If you're building on AWS, or planning to, these updates deserve your attention because they're reshaping how teams integrate AI into their workflows and operations.
The headline announcement is Amazon Quick, a new AI assistant designed specifically for work. Think of it as AWS's answer to the enterprise AI assistant problem: it's built to understand context across your AWS environment, internal tools, and documentation. What makes it technically interesting is the desktop-app approach combined with expanded integrations. Rather than forcing everything through a chat interface, Quick can connect natively to your existing tools and APIs. For a development team, that means asking Quick to help debug CloudFormation templates, explain your architecture decisions, or scaffold boilerplate code without bouncing between windows. Under the hood, this likely leverages foundation models through Amazon Bedrock with retrieval-augmented generation (RAG) to pull context from your actual infrastructure and documents. The practical win is reduced context switching and faster onboarding for teams learning your specific setup.
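To make the RAG pattern concrete, here is a minimal, self-contained sketch of the retrieve-then-prompt step: rank a handful of internal docs by a toy keyword-overlap score, then inline the top matches above the user's question. The documents, scoring function, and prompt template are illustrative assumptions, not Amazon Quick's actual implementation.

```python
# Minimal RAG sketch: retrieve the most relevant internal docs for a query,
# then assemble a prompt that grounds the model in that context.

def score(query: str, doc: str) -> int:
    """Toy relevance score: count of shared lowercase words."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the top-k docs by keyword overlap with the query."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Inline retrieved context above the user question."""
    context = "\n---\n".join(retrieve(query, docs))
    return f"Use only the context below to answer.\n\n{context}\n\nQuestion: {query}"

# Hypothetical internal docs standing in for real infrastructure notes.
docs = [
    "Our VPC uses three private subnets behind a NAT gateway.",
    "Deploys run through CodePipeline with a manual approval stage.",
    "The billing alarm pages on-call when spend exceeds budget.",
]
prompt = build_prompt("How do deploys reach the private subnets?", docs)
```

A production assistant would swap the keyword score for vector embeddings and pass the assembled prompt to a Bedrock model, but the shape of the pipeline (score, retrieve, ground) is the same.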
The second wave of announcements expanded Amazon Connect with four purpose-built agentic AI solutions, and this is where things get interesting from an automation perspective. Amazon Connect has evolved from a contact center tool into a platform for building autonomous agents that handle supply chain coordination, hiring workflows, customer service, and healthcare operations. An agentic approach is fundamentally different from a simple chatbot: these agents can take actions, integrate with backend systems, and make decisions without human intervention for routine tasks. Imagine a hiring agent that screens resumes, schedules interviews, and updates your HRIS automatically, or a supply chain agent that monitors inventory, detects anomalies, and coordinates with vendors. Technically, these agents sit on top of Connect's existing API framework and foundation models, orchestrating multi-step workflows across your AWS services and third-party systems. For organizations managing these domains, this means moving from reactive monitoring to proactive automation that scales without additional headcount.
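The "take actions without human intervention" idea can be sketched as a small event-driven loop: an agent receives an inventory event, checks a policy threshold, and either acts or does nothing. The class name, threshold, and purchase-order logic below are stand-ins for real backend integrations (Connect APIs, vendor systems), not AWS's actual agent framework.

```python
# Sketch of an agentic workflow: routine decisions run autonomously,
# and each step is an auditable action the agent logs.
from dataclasses import dataclass, field

@dataclass
class SupplyChainAgent:
    reorder_threshold: int = 20          # assumed policy: reorder below this level
    log: list[str] = field(default_factory=list)

    def check_inventory(self, sku: str, on_hand: int) -> bool:
        low = on_hand < self.reorder_threshold
        self.log.append(f"checked {sku}: on_hand={on_hand}, low={low}")
        return low

    def create_purchase_order(self, sku: str, qty: int) -> str:
        po = f"PO-{sku}-{qty}"           # a real agent would call a vendor API here
        self.log.append(f"created {po}")
        return po

    def handle_event(self, sku: str, on_hand: int):
        """Act only when policy says to; healthy stock means no action."""
        if self.check_inventory(sku, on_hand):
            return self.create_purchase_order(sku, qty=self.reorder_threshold * 2)
        return None

agent = SupplyChainAgent()
po = agent.handle_event("WIDGET-1", on_hand=5)   # low stock triggers a reorder
```

The action log is the important design choice: autonomous agents in supply chain or hiring contexts need an audit trail for every decision they make on their own.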
The partnership expansion with OpenAI brings GPT-5.5, Codex, and Managed Agents into Amazon Bedrock's model catalog. This matters because it consolidates your options: instead of managing separate API keys and integrations with OpenAI directly, you access their models through Bedrock's unified API. From a practical standpoint, your infrastructure code stays the same; you just switch model IDs in your Bedrock calls to try GPT-5.5 for tasks where it outperforms other options, or use Codex specifically for code generation. The Managed Agents capability is particularly useful if you're building complex workflows that chain multiple steps: AWS handles orchestration and error handling so you can focus on defining what actions your agent should take. This also keeps usage within your AWS billing and security boundaries, which matters for regulated industries. The limited preview means early access is restricted, but if your workloads would benefit from OpenAI's models, this is the path to watch.
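The "switch model IDs, keep the code" point can be shown with Bedrock's Converse-style request shape, where the same payload works across providers and only the model identifier changes. This sketch just builds the request body; the OpenAI model ID below is a hypothetical placeholder, not a confirmed identifier.

```python
# Sketch: the request shape stays constant; only modelId varies by provider.
def build_converse_request(model_id: str, user_text: str) -> dict:
    """Assemble a Converse-style request body for any Bedrock model."""
    return {
        "modelId": model_id,
        "messages": [
            {"role": "user", "content": [{"text": user_text}]},
        ],
        "inferenceConfig": {"maxTokens": 512, "temperature": 0.2},
    }

# Same call site, different model: only the ID changes.
claude_req = build_converse_request(
    "anthropic.claude-3-5-sonnet-20240620-v1:0",
    "Summarize this CloudFormation diff.")
gpt_req = build_converse_request(
    "openai.gpt-5.5-v1:0",    # hypothetical ID for the preview models
    "Summarize this CloudFormation diff.")
```

In real code you would pass these fields to the `bedrock-runtime` client's `converse` call via boto3; the point is that trying a new model is a one-line change, not a new integration.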