Monday, May 4, 2026
Marketo Ops Radar · Curated insights from the Marketo field
Platform & Integrations

Adobe Summit: Marketo Data Streams Replace the API Poll Grind

Adobe Summit 2026 · Session OS220 · MARKETO · Watch on Adobe.com

If your team is still burning through REST API quotas to keep downstream systems current, Marketo's Data Streams feature is the direct answer. This session with Harman Bola from Adobe and Philippe Cherfils, Director of Global Marketing Automation and Data Management at Workday, covers the architecture, the two available stream types, and what Workday learned standing it up at enterprise scale.

Why Polling and Batch Exports Break at Scale

Timestamps: 2:07 · 4:05

The core problem Harman Bola lays out isn't new, but Workday's experience makes it concrete: high-frequency polling hammers API quotas, and when you hit those limits, you don't just stop — you create a backlog that compounds the delay. Philippe Cherfils describes this as the 'number one motivation' for moving to Data Streams, noting the issue predated their current AI initiatives and only got more acute as transaction volume grew.

The batch export alternative isn't cleaner. Manual exports and UI monitoring add ops overhead and stretch campaign cycle times. By the time a form fill or program change surfaces through a nightly sync, the actionable window has often closed. For teams feeding machine learning pipelines or real-time personalization engines, stale data isn't just inconvenient — it breaks the use case entirely.

Bola also flags a third failure mode: custom integration work. Connecting Marketo activity to analytics tools or ML engines via REST typically requires bespoke development that's both time-consuming and brittle. Data Streams are designed to eliminate that layer.

How Data Streams Actually Move Events Out of Marketo

Timestamps: 8:55 · 10:15 · 11:11

The delivery architecture is straightforward: an event fires in Marketo (form fill, program clone, token update), it gets routed through Adobe IO Events, and from there you choose one or more delivery methods. Webhooks push events in real-time to a system you control. Adobe Runtime lets you execute serverless logic inline as events arrive. Amazon EventBridge sends directly into your AWS event infrastructure.
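To make the webhook path concrete, here is a minimal sketch in Python of a receiver that dispatches incoming events by activity type. The payload shape and the activity-type names (`form_fill`, `program_status_change`) are illustrative assumptions, not the documented Data Streams schema.

```python
import json

# Hypothetical activity-type names; the real Data Streams schema may differ.
HANDLERS = {}

def on(activity_type):
    """Register a handler for one activity type."""
    def register(fn):
        HANDLERS[activity_type] = fn
        return fn
    return register

@on("form_fill")
def handle_form_fill(event):
    return f"scored lead {event['leadId']}"

@on("program_status_change")
def handle_program_change(event):
    return f"synced program {event['programId']}"

def handle_webhook(body: str) -> list[str]:
    """Entry point a webhook endpoint would call with the raw request body.

    Unsubscribed activity types are ignored rather than treated as errors,
    so newly introduced stream events don't break the receiver.
    """
    results = []
    for event in json.loads(body):
        fn = HANDLERS.get(event.get("type"))
        if fn:
            results.append(fn(event))
    return results
```

The registry pattern keeps each activity type's logic isolated, which matters once you subscribe to more event types than you had planned for on day one.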

Journaling is on by default and retains a seven-day event history. This isn't a nice-to-have — it's the fallback mechanism. Philippe confirms Workday uses journaling specifically for retry on failed webhook attempts, which is the recommended pattern Bola describes: webhooks for real-time delivery, journaling as the safety net.
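The journaling-as-safety-net pattern amounts to a reconciliation pass: compare the event IDs your webhook handler committed against the seven-day journal and replay the gap. How you fetch the journal is out of scope here; this sketch assumes it arrives as a list of event dicts, oldest first.

```python
def find_missed_events(journal: list[dict], processed_ids: set[str]) -> list[dict]:
    """Return journal events that never arrived (or failed) via webhook.

    `journal` is the last seven days of events, oldest first; `processed_ids`
    is the set of event IDs your webhook handler committed successfully.
    """
    return [e for e in journal if e["id"] not in processed_ids]

def replay(journal, processed_ids, handler):
    """Re-run the normal event handler over every missed event, in order."""
    replayed = []
    for event in find_missed_events(journal, processed_ids):
        handler(event)
        replayed.append(event["id"])
    return replayed
```

Because the replay path calls the same handler as the live webhook path, handlers should be idempotent: replaying an event that actually did get processed must be harmless.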

Critically, Data Streams run independently of your REST API allocation. You're not trading quota for throughput — the two operate on separate tracks. For teams that have architected around polling, this is a meaningful operational shift, not just a performance tweak.

Lead Activity vs. User Audit: Two Streams, Two Use Cases

Timestamps: 11:44 · 13:38

Both stream types are available today for Prime and Ultimate customers. Lead Activity captures behavioral signals — form fills, email opens and clicks, list additions and removals, custom activity, program status changes. These are the events that feed engagement scoring, AI enrichment pipelines, and downstream CRM triggers. Workday is focusing here first, using the stream to identify bot traffic and flag fake records before the data propagates to the rest of the organization.
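Workday's bot-screening use case can be illustrated with a simple filter over incoming form-fill events. The heuristics below (disposable email domains, sub-second form completion) and the field names are made-up examples for illustration, not what Workday actually runs.

```python
DISPOSABLE_DOMAINS = {"mailinator.com", "tempmail.io"}  # illustrative list

def looks_like_bot(event: dict) -> bool:
    """Flag a form-fill event before it propagates downstream."""
    email = event.get("email", "")
    domain = email.rsplit("@", 1)[-1].lower()
    if domain in DISPOSABLE_DOMAINS:
        return True
    # A human can't complete a multi-field form in under a second.
    if event.get("fill_seconds", 999) < 1.0:
        return True
    return False

def screen(events):
    """Split a batch into (clean, flagged) before CRM sync."""
    clean, flagged = [], []
    for e in events:
        (flagged if looks_like_bot(e) else clean).append(e)
    return clean, flagged
```

The point is placement, not the heuristics: running this on the live stream catches fakes before they reach the CRM, where a nightly batch job would only find them after the fact.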

User Audit covers operational changes inside Marketo itself: token updates, email and landing page modifications, smart campaign changes, program and folder creation. The primary use case Bola surfaces is governance — enterprises with security and regulatory requirements that need an auditable log of who changed what and when. Philippe has User Audit on the roadmap but is taking a crawl-walk-run approach, building maturity on Lead Activity before layering in the governance stream.

More stream types are in development, though no specifics were shared on timeline or scope.

The Implementation Mindset Shift Workday Didn't Anticipate

Timestamps: 15:15 · 16:19 · 17:02

Philippe's implementation advice is the most useful part of this session for teams about to start. The paradigm shift from REST polling to event streaming sounds simple but isn't. With an API call, you write a query, you get the data you asked for. With Data Streams, you're receiving every activity and need a pre-defined plan for what to capture, what to filter, and how to route it into your data warehouse before you turn it on.

Workday required buy-in and active involvement from their BT (business technology) partner team alongside marketing ops to work through data mapping and filtering decisions. The customizability of Data Streams — you can subscribe to only the activity types you need — is a feature, but it also means you can't skip the planning phase. Going in without that groundwork leads to schema confusion and downstream data quality issues.

The payoff, per Philippe: the ability to handle millions of transactions per hour with no API ceiling anxiety, and a live data feed that makes previously impractical AI applications viable. He's direct that it's more complex than a REST integration and equally direct that it's worth it.


Key takeaways

  • Data Streams operate independently of your REST API quota — enabling them doesn't consume existing API allocation, it adds a separate event delivery layer.
  • Journaling is on by default with a seven-day history; use it as your retry mechanism for failed webhook deliveries, not just a passive log.
  • Lead Activity and User Audit streams are available now for Prime and Ultimate customers — if you're on either tier, there's no feature gate blocking you.
  • Do data mapping and filtering decisions before you enable the stream. Workday required a cross-functional effort between marketing ops and their BT team to define which activity types mattered and how to structure the incoming data.
  • The bot detection and fake record identification use case Workday is running requires real-time data — it's a concrete example of an AI application that simply doesn't work on batch exports.

Bottom line

Data Streams solve a real and well-documented pain point, and Philippe Cherfils' implementation candor — particularly the paradigm shift warning and the need for BT partnership — makes this more useful than a typical product marketing session. If you're on Prime or Ultimate and still polling via REST, the planning conversation with your data engineering team should have started yesterday.
