From Benchmarking to Orchestration: Why 2026 is UX’s Moment to Lead
And how UX Evolved's insightful guests predicted the #1 idea on LinkedIn's list
The tech industry loves a good benchmark. For the past two years, we’ve obsessed over which AI model scores highest on reasoning tests, which chatbot sounds most human, and which image generator produces the most realistic outputs. It’s been comforting, in a way—a familiar measuring stick in unfamiliar territory.
But 2026 is shaping up differently. According to industry analysts and venture capitalists tracking the AI landscape, the conversation is shifting from “which model is best” to “how do we orchestrate AI to drive actual outcomes.” It’s a fundamental reframe that puts UX professionals—designers and researchers who understand human needs, business objectives, and the messy reality of implementation—right at the center of where AI creates value.
My guests have been saying this all along.
What “Orchestration” Actually Means
The LinkedIn Big Ideas 2026 report describes a transition from human-speed workflows to “agent-speed” systems where a single goal triggers thousands of coordinated sub-tasks. Microsoft Research talks about “agentic media” and “context engineering” becoming essential skills. Capgemini predicts new roles like “AI workflow designers” and “agent supervisors” responsible for orchestrating fleets of digital workers.
Translation: Someone needs to design how all these AI pieces work together to achieve something humans actually care about. That’s orchestration—and it’s fundamentally a design problem, not a technical one.
As Adam Howell told me in our inaugural episode: “Perhaps that’s what design is going to be as it’s going to be geared less towards the how many hashes can we remove and more towards the what outcome are we actually trying to solve for here.”
The Pattern Across Every Episode
When I review the conversations from 2025, I see this orchestration theme running through every one of them, even when we weren’t explicitly talking about it.
Strategic Decision-Making Over Pattern Recognition
Miwako Zosel, UX researcher, articulated why orchestration can’t be automated away: “AI might be able to execute on research plans and kind of complete a checklist, but it can’t navigate the political landscape of a company to determine which research questions actually matter.”
She continued: “That’s where kind of the strategic nature of UX research comes in and, you know, that’s maybe where AI is lacking.”
This is orchestration in practice. It’s not about running the study—it’s about knowing which study to run, when, and for whom. It’s understanding that the VP of Product and the engineering lead need different insights from the same research. AI can’t make those judgment calls because it doesn’t understand organizational dynamics, power structures, or what’s politically feasible versus technically possible.
Human-in-the-Loop as Orchestration Layer
Amit Popat, who’s building AI-powered revenue management for hotels, explained their approach: “There’s this term called human in the loop, which is quite common now, and that’s about actually making sure for those high risk decisions, things like, for example, generating an email marketing campaign. We are never going to just automatically send off marketing campaigns because the risk there to your brand is so huge.”
The human-in-the-loop isn’t a safety net—it’s the conductor. Amit’s team uses AI to analyze pricing patterns, competitor behavior, and demand signals. But the orchestration layer—deciding which recommendations to act on, when to override the AI, how to balance short-term revenue against brand perception—that’s where human judgment creates value.
He also touched on multi-agent orchestration directly: “Having a UX agent, having a tester agent, having an infrastructure agent, the orchestration of those agents is a new metaphor in and of itself.”
Value Through Context, Not Execution
Whitney Tolley, Design Director at Cloudbeds, described how AI changes the designer’s role: “It’s just thinking about where do you provide value and how can you provide that value faster.”
She gave a concrete example: After running a collaborative workshop, she used AI to quickly synthesize outcomes and create compelling communications. “If you can use AI to be like, help me write a compelling Slack message that recaps what we just did and gets people excited about how we spent that time and what we’re doing next. I think it can really help with those things that we’re just not doing because we don’t have time to do. And there’s value in doing those things.”
This is orchestration at the micro level—knowing what needs to be communicated, to whom, and why. The AI executes; the designer orchestrates the information flow that keeps teams aligned and moving forward.
Validation as Critical Orchestration Function
Dan Zola, UX researcher at Google, raised perhaps the most important orchestration question: “Products are continuously increasingly using AI to make decisions about user experience. But then who validates that the AI actually understands the user needs correctly?”
He emphasized: “It really helps if the value of research is seen top down... determine if you can hitch your wagon to a star, hitch your wagon to a place where they see the value of and believe in and support UX research.”
In an orchestrated AI system, someone needs to validate outputs, ensure quality, and catch failures before they reach users. That’s not a technical skill—it’s judgment developed through years of seeing what works and what doesn’t with real humans.
What This Means for UX Professionals in 2026
If 2025 was about learning AI tools, 2026 is about learning to conduct the orchestra. Here’s what that looks like:
1. You’re Not Losing Relevance—You’re Gaining Leverage
The 2026 trend reports describe new roles: AI workflow designers, agent supervisors, governance leads. These aren’t technical roles—they’re design and research roles by another name. Capgemini notes that “the new currency of expertise will lie in systems thinking, AI and agents orchestration, and managing complex, autonomous process and tool chains.”
As Miwako observed: “I do think that the researcher role is becoming more strategic and not diminished. One of the things that we do, and it’s kind of the nature of the role, is acting as the bridge connector across these different teams that may not be talking to one another.”
That bridge-building? That’s orchestration.
2. Business Outcomes Over Pixel Perfection
Adam Howell reflected on how UX has evolved: “One of the things that I think is the most intriguing about UX, which is that humanistic scientific method aspect of who, what are we trying to do for whom, under what circumstances and where does value lie?”
In an orchestrated AI system, this becomes even more critical. When AI can generate 50 interface variations in minutes, the skill isn’t making the variations—it’s knowing which problem you’re actually solving and for whom.
Whitney emphasized: “We’re going to need designers to help push the envelope a little bit and be strategic about choosing those areas in which they should push the envelope. Pick the things that are strategic.”
3. Cross-Functional Fluency Becomes Essential
Amit described how his team bridges product, design, engineering, and data science: “The minute the data scientist comes in, they’re looking for the data that they need to be able to start making their decisions, or then you need a data engineer... you often need more data engineers than you need data scientists because they’re the ones doing all the plumbing.”
Understanding how these pieces fit together—not just conceptually, but operationally—that’s the orchestration skill. You don’t need to code the data pipeline, but you need to understand why it matters and how it affects what you can design.
Initial Education: What New Designers Need
If you’re entering UX in 2025-2026, the baseline has changed:
Systems Thinking Over Tool Mastery: Learn to map workflows, understand dependencies, and see how changes ripple through organizations. Tools will change; systems thinking won’t.
Business Acumen: Understand how decisions get made in organizations. Miwako’s point about navigating “the political landscape of a company to determine which research questions actually matter” isn’t cynicism—it’s craft.
Comfort with Ambiguity: Whitney noted that new skills include “comfort with not having a defined role.” When you’re orchestrating, your job is whatever needs doing to achieve the outcome.
Technical Literacy Without Technical Execution: As Amit demonstrated, you need to understand enough about data engineering, ML systems, and agent architectures to have informed conversations—but your value isn’t in building them.
Continuing Education: What Veterans Need
For those of us mid-career, the adjustment is different:
Embrace Your Strategic Value: Dan’s reminder that it really helps if the value of research is seen top-down applies equally to design. If you’ve been justifying your existence through execution speed, shift to emphasizing orchestration and judgment.
Experiment with Agent Coordination: Amit’s experience with “having a UX agent, having a tester agent, having an infrastructure agent” shows where practice is heading. Start small—coordinate two AI tools to accomplish a workflow you currently do manually.
Document Your Decision-Making: The most valuable thing you do is often invisible—the “why” behind choices, the context that shaped decisions, the alternatives you rejected. In an AI-orchestrated world, that context becomes training data for better systems.
Build Cross-Functional Relationships: Miwako’s role as “bridge connector across these different teams” becomes more valuable when those teams are partly human, partly AI. Your network is your orchestration layer.
The Throughline
Looking back at every conversation from 2025, I see a consistent theme that we didn’t have language for yet: orchestration.
Adam talked about focusing on outcomes over execution.
Miwako described strategic research that navigates organizational complexity.
Whitney demonstrated using AI to amplify human decision-making.
Dan emphasized validation and quality control.
Amit showed multi-agent coordination in practice.
They were all describing orchestration—the fundamentally human work of deciding what to do, coordinating how it gets done, and validating that it actually worked.
Why This Matters Now
The industry is moving from “let’s try AI and see what happens” to “let’s build AI systems that achieve specific business outcomes.” That transition requires orchestration at every level—strategic, tactical, and operational.
And here’s what makes this UX’s moment: we’ve been doing orchestration all along. We just called it “facilitating stakeholder alignment” or “translating business requirements into user needs” or “coordinating across design, product, and engineering.”
The tools have changed. The fundamental work hasn’t.
So in 2026, when someone asks you what you do, you might say: “I orchestrate AI systems to achieve human outcomes.” And they’ll understand exactly what you mean—because by then, everyone will be trying to figure out how to do exactly that.
The question isn’t whether UX is relevant in an AI world. It’s whether we recognize that orchestration has always been our core skill, and AI just made that skill exponentially more valuable.
What are you orchestrating? I’d love to hear how you’re thinking about your role in 2026.