Designing a streaming application that processes data from one or two streams is easy: any streaming framework that provides scalability, high throughput, and fault tolerance will do. But when the number of streams grows into the hundreds or thousands, managing them becomes daunting. How do you share resources among thousands of streams, all running 24×7? How do you manage their state, apply advanced streaming operations, and add or delete streams without restarting? This talk explains common scenarios and shows techniques that can handle thousands of streams using Spark Structured Streaming.
I am a Lead Consultant at Knoldus Software LLP. For the last four years I have been building reactive products with Spark, Scala, and Akka, and I have developed complex machine-learning solutions for the media and retail industries in Scala and Spark. I am a technology enthusiast, blog frequently about the Scala ecosystem, and train engineers on Spark and Akka.