by Tony Kelly, former Managing Director of Yarra Valley Water
When I left Yarra Valley Water in 2014, I believed, and still do, that within ten years every utility will have real-time monitoring of its infrastructure networks, fully integrated with data analytics and decision-making systems.
The days of utilities waiting for their customers to call and tell them there is a burst water main, a pressure or a water quality problem are rapidly coming to an end. Customers, regulators and the media now expect a water utility to know exactly what’s going on, in real-time. Soon it will be the utilities that are proactively calling their customers to assure them the problem is known, is being addressed and that the system will be back to normal within X minutes.
Also heading for the exit is the utility’s dependence on local knowledge. We all know the salt-of-the-earth operators who have worked on the network for thirty or even forty years and know every valve on a first-name basis. These operators have served the industry extremely well, but new technologies will dramatically increase the volume of data coming into the utility, and manual methods will not be able to keep up. Utilities need to build a whole new level of capability.
It is in this context that everyone at every water industry conference these days seems to be talking about evolving into a digital utility. This evolution raises many challenges that the CEO of a utility needs to face:
- While the technologies are seductive, how do I build an effective business case?
- Which technologies do we buy?
- How do I future-proof our utility and not get locked into proprietary systems with short half-lives?
- How do I start this process in a way that does not compromise the bigger picture and gives us the flexibility to build a workable, integrated digital architecture gradually and sensibly?
- The cost of communications and network sensors is coming down and the business case is getting better by the year, but when do we jump in – now, or do we wait?
- More and more data will be created, but how do we turn all that data into useful actions and knowledge?
- How do I ensure the organisational culture allows us to exploit the potential offered by these technologies?
While there is a lot of debate about exactly what a digital utility looks like, everyone is likely to agree it must include real-time, automatic monitoring of the utility’s network allied with smart analytics.
A cloud-based Central Event Management (CEM) system built on data analytics and machine learning can be a sensible way to start. Such a system enables early detection of network events and incidents such as leaks, bursts, faulty assets, telemetry and data issues, changes in demand and operational failures. By aggregating different data types from several sources and learning from previous events, the CEM software continuously improves its predictions.
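To make the underlying idea concrete, here is a minimal sketch of statistical event detection on flow data. It is an illustration only, not TaKaDu’s proprietary algorithms: it flags a reading that deviates sharply from a baseline learned for the same hour of the week, and the zone data, thresholds and figures are hypothetical.

```python
# Illustrative sketch only: flag an unusual flow reading in a metered zone by
# comparing it to a baseline learned from the same hour of the week.
# All data, names and thresholds here are hypothetical.
import random
from statistics import mean, stdev

def build_baseline(history):
    """history: list of (hour_of_week, flow_litres_per_sec) tuples."""
    by_hour = {}
    for hour, flow in history:
        by_hour.setdefault(hour, []).append(flow)
    return {hour: (mean(flows), stdev(flows))
            for hour, flows in by_hour.items() if len(flows) >= 2}

def detect_event(baseline, hour_of_week, flow, threshold=3.0):
    """Return a possible event if the reading deviates strongly from the baseline."""
    if hour_of_week not in baseline:
        return None  # a 'bare spot' in the data: no baseline to compare against
    mu, sigma = baseline[hour_of_week]
    if sigma == 0:
        return None
    z = (flow - mu) / sigma
    if abs(z) > threshold:
        return {"hour": hour_of_week, "flow": flow, "expected": round(mu, 1), "z_score": round(z, 1)}
    return None

# Hypothetical usage: four weeks of synthetic readings, then a suspicious night-time spike.
random.seed(0)
history = [(h % 168, 10.0 + 5.0 * ((h % 24) > 6) + random.gauss(0, 0.5))
           for h in range(168 * 4)]
baseline = build_baseline(history)
print(detect_event(baseline, hour_of_week=3, flow=45.0))  # likely flagged as an event
```

A production system would of course use far richer models and many more data sources, but the principle is the same: learn what normal looks like, then surface the exceptions.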
The first operational example of this is TaKaDu’s Central Event Management, which is used by numerous utilities in Australia (as well as in the US, Europe and Latin America).
The system acts as the central management layer for all network events, integrating with any modern IT architecture and with other systems such as enterprise asset management, work order management, GIS, CRM (call centres) and acoustic leak detection.
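At an architectural level, this kind of integration is essentially structured events flowing from the central layer to the surrounding systems. The sketch below is a hypothetical illustration of that pattern; the field names, system targets and routing rules are assumptions for the example, not TaKaDu’s actual interfaces.

```python
# Hypothetical sketch of routing a single CEM event to downstream systems.
# Field names, system names and routing rules are illustrative assumptions only.
import json
from datetime import datetime, timezone

cem_event = {
    "event_id": "EVT-1042",
    "type": "suspected_burst",
    "zone": "DMA-17",                      # district metered area (hypothetical ID)
    "detected_at": datetime(2024, 1, 15, 2, 30, tzinfo=timezone.utc).isoformat(),
    "estimated_flow_loss_lps": 12.4,
    "gis_asset_id": "MAIN-88231",
}

def to_work_order(event):
    """Translate a CEM event into a work-order request for the asset management system."""
    return {
        "asset_id": event["gis_asset_id"],
        "priority": "high" if event["type"] == "suspected_burst" else "normal",
        "description": f"{event['type']} in {event['zone']} ({event['event_id']})",
    }

def to_customer_notice(event):
    """Translate the same event into a proactive message for the CRM / call centre."""
    return {
        "zone": event["zone"],
        "message": "We are aware of a network issue in your area and crews are responding.",
    }

print(json.dumps(to_work_order(cem_event), indent=2))
print(json.dumps(to_customer_notice(cem_event), indent=2))
```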
TaKaDu’s CEM bridges organisational silos, giving a utility the opportunity to improve its levels of customer service while reducing costs, the holy grail of any strategy. With greater visibility, the utility can prioritise jobs more effectively and respond more quickly, know immediately if there is a change in the configuration of the system or a rapid change in demand, and monitor pressure and the behaviour of pressure-relief valves more closely.
The utility can also detect when and where a leak has occurred, estimate how much water has been lost and monitor the integrity of its pressure districts. With a better understanding of the relationship between supply and demand, the utility can optimise the capacity of its system over time. Combined with newly available water quality sensors, it will be the first, rather than one of the last, to know if there is a drinking water quality problem.
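To give a feel for how “how much water has been lost” can be estimated once a leak’s start time is known, a simple back-of-the-envelope approach accumulates the gap between metered and expected inflow. The figures below are made up for the example; real estimates depend on meter accuracy and the quality of the baseline.

```python
# Illustrative only: estimate water lost to a leak as the cumulative difference
# between metered inflow and the expected (baseline) inflow for each interval.
# All figures are hypothetical.

INTERVAL_SECONDS = 15 * 60   # 15-minute meter readings

def estimate_loss(observed_lps, expected_lps):
    """Both arguments are lists of average flow (litres/second) per interval."""
    lost_litres = 0.0
    for observed, expected in zip(observed_lps, expected_lps):
        excess = observed - expected
        if excess > 0:
            lost_litres += excess * INTERVAL_SECONDS
    return lost_litres

# Hypothetical night-time readings for a zone with a suspected leak.
expected = [9.8, 9.9, 10.1, 10.0, 9.9, 10.2]
observed = [9.9, 14.3, 14.6, 14.2, 14.5, 14.1]
print(f"Estimated loss: {estimate_loss(observed, expected) / 1000:.1f} kilolitres")
```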
While many utilities are struggling to work out a pathway into the digital future, a data-driven CEM system provides a low-cost, no-regrets entry point that is easy to implement. It allows the utility to step into this minefield carefully and efficiently, without ‘betting the farm’ on a big-bang transformation.
Utilities don’t need to wait until they have all the data – they can start with what they have and add the necessary detail. The right system will help the utility identify the ‘bare’ spots in the data and pinpoint where the data needs to be enhanced.
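As a rough illustration of what identifying the ‘bare’ spots can mean in practice, a utility can simply measure how complete each zone’s telemetry record actually is and target data-collection effort accordingly. The zone names, reading counts and the 90% completeness threshold below are assumptions for the example.

```python
# Illustrative sketch: rank metered zones by telemetry completeness so that
# data-collection effort goes where it matters most. All values are hypothetical.

EXPECTED_READINGS_PER_DAY = 96          # one reading every 15 minutes

def coverage_report(readings_per_zone, days=30, threshold=0.9):
    """readings_per_zone: dict of zone -> number of readings received over `days`."""
    expected = EXPECTED_READINGS_PER_DAY * days
    report = []
    for zone, received in readings_per_zone.items():
        coverage = received / expected
        report.append((zone, round(coverage * 100, 1), coverage < threshold))
    # Worst-covered zones first: these are the 'bare spots' to address first.
    return sorted(report, key=lambda row: row[1])

zones = {"DMA-01": 2850, "DMA-02": 1400, "DMA-03": 2790, "DMA-04": 600}
for zone, pct, needs_attention in coverage_report(zones):
    flag = "  <-- bare spot" if needs_attention else ""
    print(f"{zone}: {pct}% of expected readings{flag}")
```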
And it doesn’t stop there – utilities have a larger role to play in making cities more efficient, more sustainable and more liveable. For example, utilities in the US are already partnering with popular SatNav systems to inform commuters of traffic disruptions caused by infrastructure failures. A CEM system prepares utilities for their role in supporting smart cities and enables them to respond much more effectively in the event of a major natural disaster.
In summary, CEM systems can improve a utility’s operational efficiency, foster collaboration across the organisation and improve levels of customer service. Looking ahead, data-driven CEM systems have the potential to make a quantum leap in the levels of customer service delivered by water utility networks. Can your utility afford not to have one?
Tony Kelly, a civil engineer, has spent over 40 years in the water industry. During this time he held a number of diverse executive roles covering all areas of engineering, billing, strategic planning and pricing. Tony was Managing Director of Yarra Valley Water from 2003 to 2014 and since then has been working with a number of organisations in an advisory capacity, including TaKaDu, where he is a member of its Advisory Board.