Inside the Engine Room: Building a Real-Time Data Hub for Energy Operations

Transforming energy operations in Victoria with a unified data hub using Databricks, Kafka, and real-time analytics.

In today’s energy sector, data is more than just numbers - it’s the backbone of how we manage, plan, and optimise operations. Over the past year, I had the opportunity to lead a cross-functional team on a transformative data initiative for a major energy provider in Victoria. Our mission was ambitious: to centralise critical data streams into a single, intelligent platform that would not only meet regulatory and operational requirements but also empower teams across the organisation with real-time insights.

The Vision

Energy networks are inherently complex. Between SCADA systems, GIS layers, SAP HANA, and customer smart meters, we had a wealth of data - just not in the right place, at the right time, or in the right format. The vision was to unify these systems into a real-time, governed, and analytics-ready data hub hosted on AWS.

Our North Star? Build an architecture that supported near real-time analytics for power quality, outage planning, asset optimisation, and customer service - all while complying with Australian regulatory standards.

Architecting the Platform

The solution architecture was built with Databricks at its core, running in a dedicated AWS VPC. We leveraged:

  • Confluent Kafka to ingest streaming data from power quality (PQ) meters, SAP, and Zepben EWB.

  • Delta Lake with Medallion Architecture to organise raw, curated, and business-ready datasets.

  • Databricks Notebooks and Delta Live Tables for scalable and maintainable ETL pipelines.

  • TimescaleDB for storing and analysing time-series PQ data (e.g. voltage sags, harmonics, flicker).

Data landed in AWS S3, was processed via Spark jobs, and was published into secure Delta tables. From there, dashboards, APIs, and notebooks let everyone from field planners to analytics teams interact with the data.
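To give a flavour of the medallion hop from raw to curated data, here is a hypothetical, pure-Python version of the kind of validation rule applied when promoting bronze records to the silver layer. The real pipelines used PySpark and Delta Live Tables; the field names and thresholds below are invented for illustration.

```python
from datetime import datetime, timezone

# Hypothetical bronze -> silver promotion for raw PQ readings:
# drop malformed records, reject implausible voltages, deduplicate
# replayed Kafka messages on (meter_id, timestamp), normalise units.
def bronze_to_silver(raw_readings):
    seen = set()
    curated = []
    for r in raw_readings:
        # Reject records missing mandatory fields
        if not r.get("meter_id") or r.get("voltage_v") is None:
            continue
        # Sanity band for an LV network reading (illustrative only)
        if not (180.0 <= r["voltage_v"] <= 280.0):
            continue
        key = (r["meter_id"], r["ts"])
        if key in seen:  # duplicate message replayed from the stream
            continue
        seen.add(key)
        curated.append({
            "meter_id": r["meter_id"],
            "ts": datetime.fromtimestamp(r["ts"], tz=timezone.utc).isoformat(),
            "voltage_v": round(r["voltage_v"], 2),
        })
    return curated

readings = [
    {"meter_id": "PQ-001", "ts": 1718409600, "voltage_v": 239.94},
    {"meter_id": "PQ-001", "ts": 1718409600, "voltage_v": 239.94},  # duplicate
    {"meter_id": "PQ-002", "ts": 1718409600, "voltage_v": 512.0},   # implausible
    {"meter_id": None, "ts": 1718409600, "voltage_v": 230.0},       # malformed
]
print(bronze_to_silver(readings))  # only the first PQ-001 record survives
```

In the real platform this logic ran as streaming Delta Live Tables expectations rather than a batch function, but the cleansing intent is the same.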

Key Integrations

  • SAP HANA → S3 for customer and meter metadata

  • SIQ → Kafka → TimescaleDB → Databricks for PQ telemetry

  • Zepben EWB model converted into dynamic network views

  • ADMS polling for hourly switch state updates

We implemented Change Data Capture (CDC) for reconciliations, and used Unity Catalog for managing schema, lineage, and RBAC-compliant access controls.
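The CDC reconciliation idea can be reduced to plain Python for illustration. The production version merged change feeds into Delta tables; the event shape and sequence-number scheme below are hypothetical.

```python
# Apply a stream of change events (insert/update/delete) to a snapshot,
# keyed by record id, keeping only the latest event per key by sequence
# number so that out-of-order replays cannot clobber newer state.
def apply_cdc(snapshot, events):
    state = dict(snapshot)   # current materialised view
    latest = {}              # key -> highest sequence number applied
    for ev in sorted(events, key=lambda e: e["seq"]):
        key = ev["key"]
        if ev["seq"] <= latest.get(key, -1):
            continue         # stale replay: ignore
        latest[key] = ev["seq"]
        if ev["op"] == "delete":
            state.pop(key, None)
        else:                # "insert" and "update" both upsert
            state[key] = ev["value"]
    return state

snapshot = {"M1": {"tariff": "A"}, "M2": {"tariff": "B"}}
events = [
    {"key": "M2", "op": "update", "value": {"tariff": "C"}, "seq": 2},
    {"key": "M3", "op": "insert", "value": {"tariff": "A"}, "seq": 1},
    {"key": "M1", "op": "delete", "value": None, "seq": 3},
]
print(apply_cdc(snapshot, events))
```

On Databricks the same upsert/delete semantics come from a `MERGE INTO` against the Delta table, with Unity Catalog governing who can read the result.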

The Human Element

This wasn’t just a tech project - it was a cultural shift. Operations planners, PQ analysts, data engineers, and security architects collaborated like never before. We co-designed interfaces that answered real business questions: Which circuits can take extra load during an outage? Is a customer complaint due to a real PQ issue? How has load shifted after switching?

Dashboards like Low Voltage Analytics (LVA) and tools like Phase Balancing transformed manual assessments into data-backed workflows. Analysts could triage complaints from their desk, and planners could simulate switch operations with confidence.
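To give a flavour of what "data-backed phase balancing" means in practice, here is a hypothetical calculation of current unbalance across the three phases of a feeder. The real tool worked off the Zepben network model and smart-meter data; this sketch uses the simple NEMA-style definition (maximum deviation from the mean, as a percentage of the mean).

```python
# Percent current unbalance for a three-phase feeder:
# 100 * max(|I_phase - I_mean|) / I_mean
def phase_unbalance_pct(i_a, i_b, i_c):
    mean = (i_a + i_b + i_c) / 3.0
    if mean == 0:
        return 0.0
    max_dev = max(abs(i_a - mean), abs(i_b - mean), abs(i_c - mean))
    return 100.0 * max_dev / mean

# Feeder with phase A heavily loaded relative to B and C (amps):
print(round(phase_unbalance_pct(120.0, 80.0, 85.0), 1))
```

A planner seeing a high figure like this would consider moving single-phase customers from phase A to B or C, and could then re-run the number to confirm the improvement before any field work.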

Compliance and Security

Working in the Australian utilities context, we had to ensure strict compliance with data protection and resilience standards such as:

  • Australian Privacy Principles (APPs)

  • CPS 234 (APRA's information security standard)

  • Secure by Design principles in cloud infrastructure

This meant encrypted data flows, VPC isolation, fine-grained IAM, and audit-ready data access patterns.

The Outcomes

  • Faster outage planning by simulating load transfers in real time

  • Reduced field visits through desktop diagnostics of PQ issues

  • Data-driven phase balancing to improve network reliability

  • One-click access to trusted data for over 100 users

What’s Next?

The foundation is laid. Next, we’re moving into predictive analytics, digital twin modelling, and intelligent asset planning. By integrating AI models and expanding our orchestration layer, we aim to automate even more of the grid management lifecycle.

Final Thoughts

This journey has shown me that great platforms are not just built with the best tools - they’re shaped by the people who use them. When data engineers, field techs, and planners share a vision, technology becomes an enabler, not a barrier. For others looking to modernise their energy operations, my advice is simple: start with user pain points, build iteratively, and make governance and performance first-class citizens in your design. And above all, remember - data should empower decisions, not complicate them.

Databricks, Journey, Australia, Energy, Innovation
4 min read
Jun 15, 2025
By Vishnu Devarajan