5 Sessions Worth Your Time at Current New Orleans 2025

Stéphane Derosiaux · October 27, 2025

Current kicks off in New Orleans during Halloween week. Frenchmen Street, costumes, brass bands, chaos. Perfect timing.

This year's agenda splits into two themes:

  • Apache Flink leads with almost a fifth of all sessions: performance tuning, SQL features, AI integrations.
  • AI and agent systems form the other big theme: multi-agent orchestration, event-driven AI pipelines, real-time GenAI architectures. The ecosystem is moving from data in motion to decisions in motion.

The rest covers Kafka's core: KRaft, rebalance protocols, the Kafka protocol itself, Iceberg, Tableflow, and governance topics showing how streaming is fusing with the lakehouse world.

Current has become the moment of truth for the streaming world. It's where the state and future direction of streaming are defined (on top of meeting all your streaming friends).

Below are five sessions I'm most excited about. Each captures a key direction streaming is taking: AI meeting Flink, governance meeting lakehouses, enterprises treating data in motion as a core product.

Adam Richardson, OpenAI

How does a leading AI company architect its streaming infrastructure? This talk is a rare chance to see behind the curtain.

I talked with Jigar Bhati, one of OpenAI's technical staff, after their Kafka Summit London session on simplifying Kafka consumption for their teams. We went deep into how they govern internal Kafka usage, why proxies are key to balancing flexibility and control, and how onboarding works at that scale.

StreamLink feels like the logical next step: moving from managing users to managing the data itself. OpenAI connecting Flink, Iceberg, and Kafka to power AI workflows will be a masterclass in how data engineering evolves when AI becomes the customer.

The Kafka Protocol Deconstructed: Building a Broker from a TCP Socket

Mateo Rojas, LittleHorse

This one is for anyone who wants to understand how Kafka works under the hood. Mateo Rojas will rebuild a Kafka broker from scratch, starting with a raw TCP socket, walking through every layer: wire protocol, batching, replication, offsets, ISR, idempotence.
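To give a feel for the starting point of that exercise, here is a minimal sketch of the first layer Mateo will hit: parsing a Kafka request header off the wire. This is my own simplified illustration, not code from the talk; it assumes the 4-byte size prefix that frames every request on the socket has already been stripped, and covers only the fixed header fields plus the nullable client-id string.

```python
import struct

def parse_request_header(frame: bytes):
    """Parse the fixed portion of a Kafka request header.

    Wire layout (big-endian): api_key INT16, api_version INT16,
    correlation_id INT32, then a nullable client_id string with an
    INT16 length prefix (-1 means null).
    """
    api_key, api_version, correlation_id = struct.unpack(">hhi", frame[:8])
    (client_id_len,) = struct.unpack(">h", frame[8:10])
    client_id = None
    if client_id_len >= 0:
        client_id = frame[10:10 + client_id_len].decode("utf-8")
    return api_key, api_version, correlation_id, client_id

# Build a sample frame: ApiVersions (api_key 18), version 0,
# correlation id 42, client id "demo".
frame = struct.pack(">hhih", 18, 0, 42, 4) + b"demo"
print(parse_request_header(frame))  # (18, 0, 42, 'demo')
```

Everything past this point (batching, replication, ISR, idempotence) builds on the same length-prefixed, big-endian framing, which is why starting from a raw socket is such an instructive way in.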

I've followed Colt (CEO) and the LittleHorse team for a few years. Their work on durable execution and workflow orchestration fascinates me. The premise: if your app is critical, it should be distributed and resilient by default, and Kafka is the perfect backbone. LittleHorse competes with Temporal, Restate, and Golem, but they're building it with a unique vision. Seeing them go deep on the Kafka protocol connects everything: durable workflows, streaming, and the core foundations that make this ecosystem possible.

How Notion Processes Billions of Events Daily

Adam Hudson, Notion

I'm an early adopter of Notion. It's where I put my whole life. Hearing how they manage (my!) data at scale is mandatory. They process billions of events every day using Kafka, Snowpipe Streaming, and Apache Pinot for real-time analytics.

I want to see how my #1 software uses my #1 technology: what kind of data they move, what tradeoffs they face, and how they keep the product fast and reliable for millions of users.

KIP-1163 Deep Dive: Diskless Topics and Kafka's Future Architecture

Greg Harris, Aiven

This session covers one of the boldest architectural proposals in Kafka's history: KIP-1163, which introduces Diskless Topics. It rethinks how Kafka stores and replicates data in hyperscale cloud environments. The promise: lower cost, higher flexibility. Moving away from traditional broker storage reshapes how Kafka delivers durability, availability, and efficiency.

I admire how Aiven pushes against the status quo in streaming. I talked with Filip Yonov (Head of Streaming) during one of our webinars about where the ecosystem was heading. Cost, governance, and automation kept coming back. This session feels like a direct answer: how far can Kafka be re-engineered to fully embrace modern cloud-native infrastructure?

JPMorgan Chase: Migrating Mainframes to an Event-Driven Architecture

Matthew Walker, JPMorgan Chase

When a bank with over 300,000 employees talks about event-driven architectures, you pay attention. Matthew will outline how they're modernizing decades of mainframe data systems into a streaming platform built on Kafka. A financial giant moving toward an automated, data-driven backbone.

We often talk about streaming in tech companies. Seeing it reshape one of the most traditional industries changes the perspective. How do they make streaming work within global finance's rigor and regulation? It's where compliance, AI, and real-time data meet: a front-row seat to the future of enterprise data.

Multi-Agent Chess: Scaling LLM Coordination with Kafka Queues

Stéphane Derosiaux, Conduktor

A more relaxed talk. I'll demo Multiverse Chess, where different AI opponents compete in parallel against one player. It shows how streaming platforms handle complex AI orchestration and the coordination challenges of agentic systems.
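The coordination pattern underneath can be sketched without Kafka at all. In the toy version below, a stdlib queue stands in for a Kafka queue: each task is delivered to exactly one of the competing consumers, which is the property that lets many agents each work a board in parallel without stepping on one another. This is my own illustration of the pattern, not the talk's code; agent names and the echoed "moves" are placeholders for real LLM calls.

```python
import queue
import threading

tasks: queue.Queue = queue.Queue()
results: queue.Queue = queue.Queue()

def agent(name: str) -> None:
    # Each agent pulls whatever board state is next; the queue
    # guarantees no two agents ever receive the same task.
    while True:
        board = tasks.get()
        if board is None:  # sentinel: shut down
            tasks.task_done()
            return
        # A real agent would call an LLM here; we just echo a move.
        results.put((name, f"move-for-{board}"))
        tasks.task_done()

workers = [threading.Thread(target=agent, args=(f"agent-{i}",)) for i in range(3)]
for w in workers:
    w.start()
for board in ["game-1", "game-2", "game-3", "game-4"]:
    tasks.put(board)
tasks.join()  # wait until every board has been handled
for _ in workers:
    tasks.put(None)
for w in workers:
    w.join()

moves = sorted(r for _, r in (results.get() for _ in range(4)))
print(moves)
```

Swap the stdlib queue for a Kafka queue and the agents become independent processes that can scale out, fail, and rejoin, which is exactly the operational story the demo is built to show.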

Chess was my world growing up. I competed for years. This talk is where that passion meets AI. It will be fun and chaotic, but also demonstrates something serious: the next generation of AI systems will rely on streaming to operate at scale.