The Agentic CMS: What Adobe Summit Reveals About the Future of Content Management

Adobe Summit 2026 opened today. One of the first sessions to go live was OS336 - "Adobe Experience Manager Sites Innovations in the AI Era" with Haresh Kumar (Strategy and Product Marketing for AEM and LLM Optimizer) and Cedric Huesler (Lead Product Manager for AEM Sites).

The session covered a question that keeps surfacing in conversations with enterprise teams: does anyone still need a CMS when AI can generate content on the fly? The short answer is yes. But the reason has changed.

The Dual Audience Reality

The Website as Intelligence Layer

From Traditional CMS to Agentic CMS

The Brand Visibility Flywheel

LLM Optimizer: Agentic Traffic and Off-Site Optimization

Figma to AEM in Minutes

The Dual Audience Reality

The fundamental shift is that websites now serve two audiences simultaneously.

Humans have changed how they interact with brands. AI handles research, comparison shopping, and basic information retrieval. By the time a human arrives at your site, they have already had their basic questions answered by an LLM. They expect something different: conversational engagement, personalization, assistance. Not static product pages. Not dense FAQs. Not choose-your-own-adventure navigation.

The expectation is now "what would you recommend for someone like me?" — not "let me click through five product pages and compare features."

Agents are the second audience. When someone asks an LLM "what is the best option in this category," an AI system evaluates your brand in real time. It scans your structured content, product data, claims, and authoritative signals. It does not look at your hero image or homepage design. It looks for content it can read, understand, and represent accurately.

If it cannot, it falls back on whatever it thinks it knows. That is the moment brands lose control of their own narrative.

This creates the "dual audience reality." Optimizing for one audience without the other creates imbalance.

Both audiences must be served simultaneously. That is the new operating condition for digital.

The Website as Intelligence Layer

The key reframing from the session: your website is no longer just a destination. It is the source of truth for AI agents.

This is a structural point, not a marketing one. AI agents learn from your brand content: your website is what trained the LLMs on your brand, and it is what grounds the inferences agents draw about it. Your whole business sits on this content.

Content has become the most important data on which AI is built. And yet most websites are not built for this dual role. They lack the structure, context, and adaptability that agents require.

From Traditional CMS to Agentic CMS

The session outlined the evolution from what a CMS has historically delivered to what it needs to deliver now. The agentic CMS keeps the traditional outcomes and layers new ones on top, measured by new metrics alongside the familiar ones.

This is a meaningful expansion. The CMS role is not shrinking. It is growing to include a set of responsibilities that did not exist two years ago.

The Brand Visibility Flywheel

The session presented a framework called the Brand Visibility Flywheel, built on four continuous loops:

1. Brand Visibility. Is the content readable and available to both human and agentic audiences? Can agents access it? Can they parse it?

2. Brand Association. Is the brand correctly represented? Are the right questions being answered? Does the content reflect what the brand actually stands for?

3. Brand Engagement. Are customers deeply engaged across channels and modalities? Not just clicks, but conversational interactions, assisted experiences, personalized journeys.

4. Proactive Iteration. Is the system learning? Are journeys being tracked as they shift? Is the next engagement better than the last based on what was learned from all previous interactions?

Each loop feeds the next. Visibility enables association. Association drives engagement. Engagement generates data for iteration. Iteration improves visibility.

This connects directly to the three-layer model I covered in a previous article. MCP provides the tool layer for agent access. AEM agents handle the content operations. Agent Orchestrator coordinates across products. The Brand Visibility Flywheel is the business framework that sits on top of that technical architecture.

LLM Optimizer: Agentic Traffic and Off-Site Optimization

The session also demonstrated Adobe LLM Optimizer, which I have covered in earlier posts. Two of the capabilities shown were new or significantly expanded.

Agentic Traffic Dashboard. LLM Optimizer now tracks how many AI agents visit your site, what content they consume, and what they cannot read. This is the first metric in the new Agentic CMS model — are you even visible to the agents?
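The underlying signal here is not exotic: well-behaved AI crawlers and assistants declare themselves in the User-Agent header, so agentic traffic can be separated from human traffic in ordinary access logs. A minimal sketch of that classification idea (the signature list and log shape are illustrative, not how LLM Optimizer works internally):

```python
# Classify requests as agentic vs. human traffic by User-Agent substring.
# These are real, self-declared AI crawler identifiers; a production list
# would need ongoing maintenance as new agents appear.
AGENT_SIGNATURES = ("GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended", "CCBot")

def is_agentic(user_agent: str) -> bool:
    """Return True if the request looks like it came from an AI agent."""
    ua = user_agent.lower()
    return any(sig.lower() in ua for sig in AGENT_SIGNATURES)

def summarize(log_entries: list[dict]) -> dict:
    """Count agentic vs. human requests and record which URLs agents consumed."""
    summary = {"agentic": 0, "human": 0, "agent_urls": set()}
    for entry in log_entries:
        if is_agentic(entry["user_agent"]):
            summary["agentic"] += 1
            summary["agent_urls"].add(entry["url"])
        else:
            summary["human"] += 1
    return summary
```

Even this crude split answers the dashboard's first question (are agents visiting, and what are they reading?), though it says nothing yet about what they failed to parse.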

The dashboard surfaces concrete, actionable tasks. For example, it analyzes specific URLs and recommends where FAQs should be added. The rationale: agents look for succinct summaries they can cite. If your content is buried in long-form prose, agents skip it. FAQs in structured markup give agents exactly what they need to quote accurately.
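The "structured markup" in question is typically schema.org's FAQPage type, embedded in the page as JSON-LD. As an illustration of the shape agents look for (the question/answer pairs are placeholders; the `mainEntity`/`acceptedAnswer` structure is defined by schema.org):

```python
import json

def faq_jsonld(faqs: list[tuple[str, str]]) -> str:
    """Render question/answer pairs as schema.org FAQPage JSON-LD."""
    doc = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in faqs
        ],
    }
    # The result is embedded on the page inside
    # <script type="application/ld+json"> ... </script>.
    return json.dumps(doc, indent=2)
```

Each answer is exactly the kind of succinct, citable summary described above, and the markup tells the agent unambiguously which text answers which question.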

Wikipedia Footprint Analysis. LLM Optimizer now includes a section that analyzes your brand's presence within Wikipedia — one of the most authoritative sources for both humans and AI agents. It shows which sections of your Wikipedia content get cited by LLMs and where to focus updates.

This is off-site optimization for the agentic web. Managing your Wikipedia presence is not new advice, but having a tool that shows exactly which sections drive LLM citations makes it actionable instead of theoretical.

Figma to AEM in Minutes

The second demo was a workflow I had not seen before.

The problem: A designer creates a banner in Figma. Getting that design into AEM as structured, managed content typically requires a developer to implement it as a component, create a content fragment model, and build a rendering template.

The new workflow:

  1. Select the element in Figma
  2. Copy the link to selection
  3. Open AEM's AI Assistant
  4. Paste the Figma URL with a prompt ("import this")
  5. AEM authenticates with Figma (one-time setup), retrieves the design, analyzes the selection
  6. The AI Assistant asks clarifying questions, then generates a content fragment with a visualization template

The result is not a flattened image. It is a fully managed content fragment where every field — heading, body text, CTA, image — is individually editable, translatable, and distributable. The visualization template renders it exactly as it appeared in Figma.
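To make the difference from a flattened image concrete, here is an illustrative sketch of a banner fragment as structured data. The field names are my own stand-ins, not AEM's actual content fragment model API:

```python
from dataclasses import dataclass, field

@dataclass
class BannerFragment:
    """Illustrative stand-in for a structured banner fragment: every field
    is individually addressable, unlike a flattened image export."""
    heading: str
    body: str
    cta_label: str
    cta_url: str
    image_ref: str
    translations: dict = field(default_factory=dict)

    def translate(self, locale: str, heading: str, body: str, cta_label: str) -> None:
        """Store a locale-specific variant of the editable text fields."""
        self.translations[locale] = {
            "heading": heading,
            "body": body,
            "cta_label": cta_label,
        }
```

A flattened image would force a round trip to design for every copy change or locale; structured fields make those operations routine content edits.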

That content fragment can then be edited, translated, and distributed like any other managed content. Design to production-ready, multi-channel content in minutes, with full content management capabilities from the start.

What This Means

The CMS is not dying. It is becoming the intelligence layer for an increasingly agentic web.

For teams evaluating their AEM strategy, this session surfaced three concrete implications:

1. Audit your content for agent readability. If you are on AEM as a Cloud Service, LLM Optimizer gives you the tools to measure this. If you are not, the principle still applies: structured content with clear summaries, proper schema markup, and authoritative claims is now a requirement, not a nice-to-have.

2. The metrics are expanding. Clicks and page views are not going away, but they are no longer sufficient. Teams need to add AI readability, generative engine visibility, and citation tracking to their measurement frameworks. LLM Optimizer provides this for AEM customers.

3. Design-to-content workflows are compressing. The Figma-to-AEM pipeline shown in the demo eliminates a significant amount of implementation work. For teams with high-volume content production needs, this changes the economics of content operations.
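The audit in point 1 can begin with simple heuristics before any tooling is involved: check whether a page exposes the structural signals agents rely on. A rough sketch using only the standard library (the checks are my own illustrative heuristics, not LLM Optimizer's algorithm):

```python
from html.parser import HTMLParser

class ReadabilitySignals(HTMLParser):
    """Collect rough agent-readability signals from a page's HTML."""
    def __init__(self):
        super().__init__()
        self.has_jsonld = False
        self.has_meta_description = False
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and attrs.get("type") == "application/ld+json":
            self.has_jsonld = True      # structured data present
        if tag == "meta" and attrs.get("name") == "description":
            self.has_meta_description = True  # citable summary present
        if tag == "h1":
            self.h1_count += 1

def audit(html: str) -> dict:
    parser = ReadabilitySignals()
    parser.feed(html)
    return {
        "structured_data": parser.has_jsonld,
        "meta_description": parser.has_meta_description,
        "single_h1": parser.h1_count == 1,
    }
```

Any page failing these basic checks is unlikely to fare well against the deeper analysis a dedicated tool performs.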

The brands that treat their website as just a destination will lose visibility to the ones that treat it as the source of truth for both humans and machines. The Agentic CMS is how that shift gets operationalized.

Viktor Lazar

Director of Engineering