Building a Codeless Log Pipeline with Confluent Sink Connectors

Logging data drives engineering decisions: production debugging, A/B-test experimentation, fraud and risk detection, and data analytics. Loggi is one of the Brazilian unicorns transforming logistics through technology. At Loggi, we've made logging events a first-class citizen, splitting them into three main categories: unstructured, semi-structured, and structured. Around 300 GB per day are ingested into a search engine, and all structured logs must also reach our data warehouse, where the data analytics team uses them to drive business decisions.

Building an ETL the traditional way means writing a lot of code and standing up a large infrastructure; pipelines break and demand more code to fix and to prevent new issues, producers change data schemas without notice and break things again, and the cycle repeats. We'd like to show how we've addressed this by building a robust and reliable pipeline using Confluent Sink connectors with almost no code.
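To give a flavor of what "almost no code" means in practice, the sketch below registers a sink connector through the Kafka Connect REST API. This is a minimal illustration, not Loggi's actual setup: the connector class shown is Confluent's JDBC sink, and the topic name, connection URL, and other values are placeholders.

```python
import json
import requests

# Kafka Connect REST endpoint (placeholder host).
CONNECT_URL = "http://localhost:8083/connectors"

# Illustrative sink connector config: a JDBC sink draining a
# structured-log topic into a warehouse table. The class name and
# config keys follow Confluent's JDBC sink connector; the values
# are placeholders.
connector = {
    "name": "structured-logs-sink",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "tasks.max": "3",
        "topics": "structured-logs",
        "connection.url": "jdbc:postgresql://warehouse:5432/analytics",
        "auto.create": "true",   # create target tables from record schemas
        "insert.mode": "insert",
    },
}

# Registering the connector is a single HTTP call; from here, Kafka
# Connect handles offsets, retries, and scaling across workers.
resp = requests.post(
    CONNECT_URL,
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector),
)
resp.raise_for_status()
print(resp.json())
```

Pairing a connector like this with Schema Registry and schema-aware converters is the usual guard against producers changing schemas unannounced, since incompatible changes are rejected at produce time instead of breaking the pipeline downstream.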

Speakers
Pollyanna Rigon Valente
Site Reliability Engineer, Loggi
Daniel Sousa
Site Reliability Engineer, Loggi