Do your event streams involve connected-data domains such as fraud detection, live logistics routing, or predicting network outages? How can you analyze those connections and act on them in real time? Graph databases differ from traditional tabular databases in that they treat connections between data as first-class citizens. This means they are optimized for detecting and understanding relationships, providing insight at speed and at scale. By combining event streams from Kafka with the power of the Neo4j graph database for interrogating and investigating connections, you can make real-time, event-driven intelligent insight a reality.

Neo4j Streams integrates Neo4j with Apache Kafka, letting the graph act either as a source of events (for instance, via Change Data Capture) or as a sink that ingests any kind of Kafka event into your graph. In this session we'll show you how to get up and running with Neo4j Streams, and how to sink and source data between graphs and streams.
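As a rough illustration of the sink-and-source setup described above, the Neo4j Streams plugin is driven by entries in `neo4j.conf`. The topic names (`purchases`, `user-changes`) and the Cypher statement below are illustrative placeholders, not part of the session material; consult the Neo4j Streams documentation for the exact options supported by your version. A minimal sketch might look like:

```properties
# Connect the plugin to the Kafka cluster (placeholder address)
kafka.bootstrap.servers=localhost:9092

# --- Sink: ingest Kafka events into the graph ---
streams.sink.enabled=true
# For each event on the hypothetical "purchases" topic, run a Cypher
# template; fields of the event payload are available via `event`
streams.sink.topic.cypher.purchases=MERGE (u:User {id: event.userId}) MERGE (p:Product {id: event.productId}) MERGE (u)-[:BOUGHT]->(p)

# --- Source: publish graph changes (CDC) to Kafka ---
streams.source.enabled=true
# Publish changes to User nodes (all properties) on a hypothetical topic
streams.source.topic.nodes.user-changes=User{*}
```

With a configuration along these lines, events flowing through Kafka are merged into the graph as nodes and relationships, while changes made in Neo4j are emitted back onto a Kafka topic for downstream consumers.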