Legacy applications developed in bygone days may appear close to unsalvageable. In reality, these applications are still running in production, carrying out important day-to-day missions for their companies. After all, companies have invested considerable time and money in developing them, and despite their imperfections, these applications keep their companies in operation. What if we redesigned the system, identified pieces of the complex business functionality in the legacy system that could be "recycled", and retrofitted them into a new system that leverages the power of a reactive data flow pipeline? This presentation will be a lively discussion with hands-on coding illustrating how to construct a reactive, event-driven data flow pipeline composed of different library implementations of the Reactive Streams specification: Akka Streams reading from a Kafka data source, then connecting to a Java Kafka client that will in turn write to a Kafka destination.
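To give a flavor of the kind of pipeline the talk builds, here is a minimal sketch of a Kafka-to-Kafka flow using Alpakka Kafka (the `akka-stream-kafka` connector). This is an illustrative assumption, not the presenter's actual code: the broker address, consumer group, and the topic names `legacy-events` and `recycled-events` are hypothetical, and running it requires a Kafka broker plus the Akka Streams and Alpakka Kafka dependencies.

```scala
import akka.actor.ActorSystem
import akka.kafka.{ConsumerSettings, ProducerSettings, Subscriptions}
import akka.kafka.scaladsl.{Consumer, Producer}
import org.apache.kafka.clients.consumer.ConsumerConfig
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.{StringDeserializer, StringSerializer}

object PipelineSketch extends App {
  implicit val system: ActorSystem = ActorSystem("pipeline")

  // Consumer side: read events emitted by the legacy system (topic name is hypothetical).
  val consumerSettings =
    ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
      .withBootstrapServers("localhost:9092")
      .withGroupId("legacy-pipeline")
      .withProperty(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest")

  // Producer side: write transformed events to the new system's topic.
  val producerSettings =
    ProducerSettings(system, new StringSerializer, new StringSerializer)
      .withBootstrapServers("localhost:9092")

  // A backpressured stream: Kafka source -> transformation stage -> Kafka sink.
  Consumer
    .plainSource(consumerSettings, Subscriptions.topics("legacy-events"))
    .map(record =>
      new ProducerRecord[String, String]("recycled-events", record.key, record.value))
    .runWith(Producer.plainSink(producerSettings))
}
```

Because every stage implements the Reactive Streams specification, backpressure propagates from the producer sink all the way back to the Kafka consumer, so a slow destination automatically throttles reads from the source.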