Building an Event-Driven Data Mesh: Patterns for Designing and Building Event-Driven Architectures
Adam Bellemare

The exponential growth of data, combined with the need to derive real-time business value from it, is a critical challenge today. An event-driven data mesh can power real-time operational and analytical workloads, all from a single set of data product streams. With practical real-world examples, this book shows you how to successfully design and build an event-driven data mesh.
Building an Event-Driven Data Mesh covers the following chapters:
Table of Contents
Chapter 1. Event-Driven Data Communication
Chapter 2. Data Mesh
Chapter 3. Event Streams for Data Mesh
Chapter 4. Federated Governance
Chapter 5. Self-Service Data Platform
Chapter 6. Event Schemas
Chapter 7. Designing Events
Chapter 8. Bootstrapping Data Products
Chapter 9. Integrating Event-Driven Data into Data at Rest
Chapter 10. Eventual Consistency
Chapter 11. Bringing It All Together
Data mesh is a fundamental shift in the way we think about, create, share, and use data. We promote data to a first-class citizen by carefully curating and crafting it into data products, supported with the same level of care and commitment as any other business product. Consumers can discover and select the data products they need for their own use cases, relying upon the commitment of the data product producer to maintain and support it. At its heart, data mesh is as much about technological reorganization as it is about the renegotiation of social contracts, responsibilities, and expectations.
Back when I wrote Building Event-Driven Microservices (O’Reilly), I made reference to (and rather vaguely defined) a data communication layer: a concept very similar to data mesh, yet not nearly as well thought out. The principles of the data communication layer were simple enough: treat data as a first-class citizen, make it reliable and trustworthy, and produce it through event streams so that it can power both operational and analytical applications.
The beauty of data mesh is that it’s not a big-bang total revision of everything we know about data. In fact, it’s really an affirmation of best practices, both social and technical, based on the collective hard work and experiences of countless people. It provides the framework necessary to discuss how to go about creating, communicating, and using data, acting as a lingua franca for the data world.
Zhamak Dehghani has done a phenomenal job of bringing data mesh to the world. I remember being blown away by her initial 2019 article on Martin Fowler’s blog. She eloquently described the problems my team was facing at that very moment and identified the principles we would need to adopt to work toward a solution. Her work strongly influenced my thinking on the need for a well-defined data communication layer to make sharing and using data reliable and easy. Dehghani’s data mesh is precisely the sociotechnical framework we need to build a better data world.
Events and event streams play a critical role in a data mesh, as your business problems can only ever be solved as fast as your slowest data source. Classic analytical use cases, such as computing a monthly sales report, may be satisfied by a data product that updates just once a day. But many of your most important business use cases, such as fulfilling a sale, computing inventory, and ensuring prompt shipment, require real-time data. An event-driven data mesh provides the capabilities to power both operational and analytical use cases, in both real time and batch.
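To make the real-time side of this concrete, here is a minimal sketch (an illustration under assumed names, not an example from the book) of an operational service consuming a hypothetical "sales" data product stream with the plain Apache Kafka Java client. The broker address, topic name, and consumer group id are all illustrative assumptions:

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;

    public class SalesStreamConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // illustrative broker
            props.put("group.id", "shipment-service");        // hypothetical operational consumer
            props.put("key.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("sales")); // hypothetical data product stream
                while (true) {
                    // React to each sale as it happens; a batch job could read
                    // this same stream into a data lake for analytical use.
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.printf("fulfilling order %s: %s%n", record.key(), record.value());
                    }
                }
            }
        }
    }

The same stream could just as easily be loaded into a table or data lake on a schedule, which is what allows a single data product to serve both real-time operational consumers and batch analytical ones.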
There is real value in adopting a data mesh. It streamlines discovery, consumption, processing, and application of data across your entire organization. But one of the best features of data mesh is that you can start applying it wherever you are today. It is not an all-or-nothing proposition. You can take the pieces, principles, and concepts that work for improving your situation, and leave the rest until you’re ready to adopt those next.
I’m quite excited about data mesh. It provides us with a principled social and technological framework for building out our own data meshes, but just as importantly, the language to talk about and solve data problems with all of our colleagues. I hope you’ll enjoy reading this book as much as I did writing it.
Adam Bellemare is a Staff Technologist in the Office of the CTO at Confluent. He was previously a Staff Engineer on the Data Platform team at Shopify, and before that worked at Flipp from 2014, first as a Senior Developer and later as a Staff Engineer. He has also held positions in embedded software development and quality assurance. His expertise spans DevOps (programmatically building, scaling, and tearing down Kafka, Spark, Mesos, and ZooKeeper clusters); technical leadership (bringing end-to-end Avro formatting to data pipelines, championing Kafka as the event-driven microservice bus, prototyping JRuby, Scala, and Java Kafka clients, and removing technical impediments to product delivery); software development (building microservices in Java and Scala using Spark and Kafka libraries); and data engineering (reshaping the way behavioral data is collected from user devices and shared with machine learning, billing, and analytics teams). He is the author of Building Event-Driven Microservices (O’Reilly, 2020).