Smart cities are complex and challenging environments. They generate overwhelming amounts of data that must be ingested, transferred, prepared, and stored even before you can think about analyzing it or training a model. Based on a fictitious but concrete scenario set in London, this presentation will show how to leverage different data engineering patterns to create an entire edge-to-core data pipeline. We will review technologies and techniques such as computer vision, Kafka, object storage, data aggregation, and real-time and batch analysis, all in a fully cloud-native, containerized environment.
In this session, you will gain a better understanding of how to get your models into production, how to automate data engineering workflows, and how to build a resilient, production-grade data pipeline infrastructure.
Associate Principal Solutions Architect, Red Hat®
Principal Specialist Solutions Architect, Red Hat
Time: 12:00 PM - 1:00 PM ET
Date: December 13, 2022
Any questions? Please email lvillanu@redhat.com