
Digitalizing solar for operational excellence



CLOUD: Azure

“Thanks to the scalability of the lakehouse architecture, we are now able to harness the insights within all our data in an effort to proactively inform customers of potential performance issues before they are directly impacted.”

— Steffen Mangold, CTO, Solytic

Solar equipment monitoring and analytics company Solytic has one goal: to make it easy for anyone to use solar energy. Their solutions are designed to help stakeholders make smart, data-powered decisions in real time, using AI to automate processes, optimize production and cut costs throughout the asset lifecycle. However, with rising data volumes, Solytic’s legacy systems didn’t provide the data reliability or scalability needed for large-scale analytical workloads. The Databricks Data Intelligence Platform enables their analytics teams to monitor solar equipment performance and help mitigate issues before they impact customers.

When rapid growth meets legacy technology

These days, businesses and individuals alike are becoming as energy conscious as they are health conscious. This has been great news for companies like Solytic, which specializes in solar equipment monitoring and analytics for customers ranging from small residential consumers to the largest industrial solar sites. But it has also made it tricky to navigate such an explosion of interest without the proper tools.

Due to the onboarding of several large customers, Solytic was facing a 1,500% increase in data volumes generated by IoT sensors embedded within solar panels. Their existing solutions weren’t scalable, didn’t allow collaboration between teams, couldn’t retain data beyond limited retention windows and couldn’t support large-scale analytical workloads, and costs that grew linearly with data volume made them extraordinarily expensive to maintain.

With more than 1 billion events per day streaming in, the company needed to begin their transition to a modern platform by decoupling data processing from storage in a data lake, so that compute could grow without scaling up storage. In addition, 1TB of historical data needed to be migrated to the new platform. Solytic turned to Databricks for help.

The connected power of unifying data and leveraging AI

In order to provide customers with quick, accurate insights into their photovoltaic plants, Solytic needed to address multiple challenges. First, they required a way to clean, enrich and aggregate incoming data streams to enable rapid querying and analytics. With more than 10,000 events per second arriving from the IoT sensors in solar panels, along with a wide spread of granularity, varying data quality and delayed data, the company needed a unified solution that could resolve all these issues on one platform.
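The clean-enrich-aggregate pattern described here runs on Spark Structured Streaming in Solytic’s case, but the core idea can be sketched in plain Python. All field names, device IDs and values below are hypothetical, not Solytic’s actual schema; the point is that keying on event time (rather than arrival order) lets late-arriving readings land in the correct window.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical raw IoT events -- illustrative only, not Solytic's schema.
RAW_EVENTS = [
    {"device_id": "inv-01", "ts": "2023-05-01T10:00:12Z", "power_w": 4100.0},
    {"device_id": "inv-01", "ts": "2023-05-01T10:00:42Z", "power_w": 4180.0},
    {"device_id": "inv-02", "ts": "2023-05-01T10:00:30Z", "power_w": None},    # failed reading
    {"device_id": "inv-01", "ts": "2023-05-01T09:59:55Z", "power_w": 4050.0},  # late arrival
]

SITE_METADATA = {"inv-01": {"site": "Berlin-A"}, "inv-02": {"site": "Berlin-B"}}

def clean_enrich_aggregate(events):
    """Drop invalid readings, attach site metadata, and average power
    per device per 1-minute event-time window. Because windows are
    derived from the event timestamp, late events are still bucketed
    into the window they actually belong to."""
    buckets = defaultdict(list)
    for e in events:
        if e["power_w"] is None:                      # clean: skip failed sensor readings
            continue
        ts = datetime.fromisoformat(e["ts"].replace("Z", "+00:00"))
        window = ts.replace(second=0, microsecond=0)  # 1-minute event-time window
        site = SITE_METADATA[e["device_id"]]["site"]  # enrich with plant metadata
        buckets[(e["device_id"], site, window)].append(e["power_w"])
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}
```

Here the four raw events collapse into two windowed rows: the two on-time `inv-01` readings average into the 10:00 window, the late one lands in 09:59, and the failed `inv-02` reading is discarded.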

The Databricks Data Intelligence Platform on Azure did just that — quickly and efficiently. While running one big query on their old database would have completely overwhelmed the system, Delta Lake has allowed Solytic to unify all their data, scale without disruption, and build reliable and performant ETL pipelines to feed their analytics needs. Delta Lake is an open format storage layer that delivers reliability, security and performance to a data lake — for both streaming and batch operations.

A second goal at Solytic was to find a way for the data engineers and analytics teams to work together more efficiently. With Databricks’ fully managed cloud infrastructure, each team is now able to spin up and run their own clusters without tripping over each other, boosting their ability to collaborate. Even better? It took barely any time at all to implement.

“When we began migrating to the Databricks Data Intelligence Platform, we were concerned with having to create a working prototype in less than three months with new technology and an inexperienced team,” said Helge Brügner, Data Engineer at Solytic. “But Databricks stepped up and helped us to fully understand the architecture and how to avoid common pitfalls, and we succeeded without issue.”

Working with Databricks notebooks makes the environment even more collaborative. Even team members who don’t code extensively can use their SQL knowledge to query the data directly and get quick insights. And in the collaborative notebooks, one team member can easily draw a plot, compare it with others’ and quickly identify the pros and cons of various approaches.

Collaborative productivity, stability and significant cost savings

The implementation of Databricks has enabled Solytic to expand from 20,000 to more than 300,000 monitored devices in less than one year — a growth rate of 15x. Furthermore, by pre-aggregating all the streaming data generated by thousands of IoT sensors, the overall cost of their analytics infrastructure decreased by 90% in the same period.
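Pre-aggregation cuts cost because it trades raw granularity for far fewer rows to store and scan, while keeping the summary statistics analysts actually need. A minimal sketch of the idea, with hypothetical readings (not Solytic’s actual figures):

```python
from collections import defaultdict

def hourly_rollup(readings):
    """Collapse (hour, watts) readings into one summary row per hour,
    keeping min/max/mean/count so downstream analytics can still spot
    dips and outliers without scanning every raw event."""
    by_hour = defaultdict(list)
    for hour, watts in readings:
        by_hour[hour].append(watts)
    return {
        hour: {"min": min(v), "max": max(v), "mean": sum(v) / len(v), "n": len(v)}
        for hour, v in by_hour.items()
    }

# Four hypothetical raw readings become two summary rows.
readings = [(10, 4000.0), (10, 4200.0), (10, 4100.0), (11, 3900.0)]
summary = hourly_rollup(readings)
```

A device reporting every few seconds produces tens of thousands of rows per day; rolled up like this, each hour is one row, which is where a large share of the infrastructure savings comes from.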

Helge added, “Today, leveraging lakehouse architecture enables us to make more than 40TB of data accessible for analytical workloads. We are able to serve the data to various teams, and the analysts and data engineers can work and scale their workloads independently. Access control helps us to restrict the data that each team can work with, without the risk of compromising any data.”
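The access control Helge describes is enforced on Databricks with table access controls (SQL grants). The shape of the policy can be sketched in Python; the table names and team grants below are purely illustrative, not Solytic’s actual setup.

```python
# Hypothetical grants: engineers see every layer of the pipeline,
# analysts only the curated tables. Names are illustrative only.
TEAM_GRANTS = {
    "analytics": {"gold.device_metrics"},
    "data_engineering": {"bronze.raw_events", "silver.clean_events", "gold.device_metrics"},
}

def can_read(team, table):
    """True if the team has been granted read access to the table;
    unknown teams get nothing by default."""
    return table in TEAM_GRANTS.get(team, set())
```

The deny-by-default lookup is the important design choice: a team can only work with data it was explicitly granted, which is what lets teams scale their workloads independently without compromising data.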

Within the next 12 months, Solytic wants to grow their data analytics team and expand their analytical capabilities into machine learning use cases such as advanced predictive maintenance, detection of device outages and smart alerts. With their lakehouse now fully accessible, the data analytics teams at Solytic will be able to explore more innovative options in the future and reduce the workload on operators when monitoring their plants.
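The simplest form of the outage detection and smart alerts mentioned above is a heartbeat check: flag any device whose most recent reading is older than a timeout. The sketch below uses hypothetical device IDs, timestamps and a 15-minute timeout; the model-driven predictive maintenance Solytic is planning would refine this baseline.

```python
def detect_outages(last_seen, now, heartbeat_timeout=900):
    """Return device IDs whose most recent reading is older than
    heartbeat_timeout seconds -- a baseline 'smart alert' that
    predictive models would later refine."""
    return sorted(dev for dev, ts in last_seen.items() if now - ts > heartbeat_timeout)

# Hypothetical Unix-second timestamps: inv-02 last reported 1,600s ago.
last_seen = {"inv-01": 1_000_000, "inv-02": 999_000}
stale = detect_outages(last_seen, now=1_000_600)  # flags only inv-02
```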