Apache Kafka and the Data Mesh

Data Mesh is a relatively recent term for a set of principles that well-designed modern data systems uphold: a kind of ‘microservices’ for the data-centric world. While Data Mesh as a pattern is not tied to any particular technology, systems that adopt and implement its principles have a relatively long history under different guises. In this talk, we share our picks of what every developer should know about building a streaming data mesh. We first introduce the four tenets of the data mesh pattern: Domain-driven Design (DDD), Data as a Product, Self-service, and Governance. We then cover topics such as how working with event streams differs from centralized or virtualized approaches. We’ll examine how to model data properly as events, put data contracts in place, deal with incomplete or corrupt data, and take a product-centric view of data sources and the data sets they share.
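To make the “model data as events, with a data contract” idea concrete, here is a minimal sketch, not taken from the talk itself: a Java producer that publishes a domain event to Kafka, with an Avro schema in Confluent Schema Registry acting as the contract between the producing and consuming domains. The topic name, schema fields, and broker/registry URLs are hypothetical.

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class OrderEventProducer {

  // Hypothetical data contract: an Avro schema for an "order placed" event.
  // Registering it in Schema Registry lets producers and consumers share
  // and evolve the contract in a controlled, compatibility-checked way.
  private static final String ORDER_SCHEMA = """
      {
        "type": "record",
        "name": "OrderPlaced",
        "namespace": "com.example.orders",
        "fields": [
          {"name": "orderId",     "type": "string"},
          {"name": "customerId",  "type": "string"},
          {"name": "amountCents", "type": "long"}
        ]
      }""";

  public static void main(String[] args) {
    Properties props = new Properties();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
        "org.apache.kafka.common.serialization.StringSerializer");
    // The Avro serializer validates every event against the registered
    // schema, enforcing the data contract at produce time.
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
        "io.confluent.kafka.serializers.KafkaAvroSerializer");
    props.put("schema.registry.url", "http://localhost:8081");

    Schema schema = new Schema.Parser().parse(ORDER_SCHEMA);
    GenericRecord event = new GenericData.Record(schema);
    event.put("orderId", "o-123");
    event.put("customerId", "c-456");
    event.put("amountCents", 2599L);

    try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
      // Keying by orderId sends all events for one order to the same
      // partition, preserving their relative order for consumers.
      producer.send(new ProducerRecord<>("orders.order-placed", "o-123", event));
      producer.flush();
    }
  }
}
```

In a data mesh reading of this sketch, the topic plus its registered schema is the product interface the owning domain publishes; events that violate the contract are rejected at the producer rather than surfacing downstream as corrupt data.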

Speakers
Ben Stopford
Lead Technologist, Office of the CTO, Confluent
Michael Noll
Lead Technologist, Office of the CTO, Confluent