In this session you will learn how H&M has created a reference architecture for deploying its machine learning models on Azure using Databricks, following DevOps principles. The architecture is currently used in production and has been iterated on multiple times to resolve discovered pain points. The presenting team is responsible for ensuring that best practices are applied across all H&M use cases, covering hundreds of models across the entire H&M group.
This architecture not only lets data scientists use notebooks for exploration and modeling, but also gives engineers a way to build robust, production-grade code for deployment. The session will also cover topics such as lifecycle management, traceability, automation, scalability and version control.
With 10+ years of experience across a wide variety of industries, Errol has reached an expert level in working with and extracting value from data, both hands-on in the data and from a strategic perspective by creating data products and leading large teams. In most cases he does so by leveraging open source technologies such as Spark, R, Python, TensorFlow and more. Currently Errol is the Lead Data Scientist at H&M, where he manages the data science and ML engineering teams delivering models into production.
With 15+ years of experience, Keven has become a specialist in AI and data. Besides his hands-on experience, Keven has also taken on various technical leadership roles, helping different organizations build AI and data capabilities and establish their technology foundations. Currently Keven is a competence lead and AI architect in the H&M group, where he manages a group of machine learning engineers and is responsible for engineering and architecture.