Energy and Utilities Asset Optimisation through Digital Twin technology

In this presentation, XMPro CEO Pieter Van Schalkwyk discusses Energy and Utilities Asset Optimisation through Digital Twin technology.

Looking to learn more about energy and utilities asset optimization using digital twin technology? This informative video presentation provides engineers in the energy and utilities industries with a comprehensive understanding of the basics of digital twin technology, as well as its applications and benefits in asset optimization.

In this video, Pieter answers the most commonly asked questions about digital twins, including what a digital twin is, why it matters for asset optimization, and how it can be implemented in your organization.

With a focus on the unique needs of engineers in the energy and utilities industries, this presentation provides an in-depth analysis of the advantages of using digital twin technology in asset optimization. From increased efficiency and reduced costs to improved safety and productivity, digital twin technology offers a range of benefits that can revolutionize the way assets are managed in these industries.

Whether you're an engineer looking to stay ahead of the curve in energy and utilities asset optimization or simply curious about the possibilities of digital twin technology, this video is a must-watch. So sit back, relax, and get ready to learn all about energy and utilities asset optimization through digital twin technology.

Tags: Energy, Utilities, Digital Twin, Asset Optimization, Engineering, Efficiency, Safety, Productivity, Cost Reduction

Transcript

I'm Pieter, and I will run you through how energy and utilities companies do asset optimization using digital twin technology.

When we speak to executives in energy and utilities, specifically around asset optimization using digital twins, we tend to hear the same three questions: what is a digital twin, why should I care, and how do I get started?

So I'll start with what a digital twin is. We were an early member of the Digital Twin Consortium, and that consortium of around 250 organizations came up with a definition: a digital twin is a virtual representation of a real-world entity or process that is synchronized at a specified frequency and fidelity. The key is that it synchronizes the virtual representation of an entity, which could be a physical entity or something like a business process.

It is underpinned by three things: it needs to improve understanding, decision making, and effective action; and it needs to use real-time and historical data to help you analyze what happened in the past, what's happening right now, and what's likely to happen in the future.

Digital twins should be driven from a business perspective, around outcomes. We create them around specific use cases or applications. They are powered by integration, with data as a key aspect, and guided by specialists who understand the domain where they operate. Typically they are implemented in industrial environments like energy and utilities through IT and OT systems.

If we explain this in a picture: on the left-hand side we have the physical entity, or the physical twin, and on the right-hand side we have the digital twin, which consists of a model and data that is synchronized at a certain twinning rate. During that synchronization we create an instance of, or instantiate, the digital twin based on the model; I can have one model and a thousand pumps in this example. Sometimes the digital twin will feed information back to the equipment, but that is not required from a digital twin definition perspective; it does, however, need synchronization from the physical to the virtual side. So that's the 50,000-foot view of what it is.

Now, you also get very simple, discrete digital twins, like a transformer (not that it's a simple piece of equipment, but you can create a simple digital twin around it), which could be part of a composite digital twin, for example a substation. That substation is part of a larger network, which then becomes a system-of-systems challenge, and that's how the scope and scale of digital twins, and the interoperability challenge with them, grows.
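To make the model-versus-instance, twinning-rate, and composite-twin ideas concrete, here is a minimal, hypothetical Python sketch (not XMPro code; all names and values are invented): one twin model is instantiated for many physical assets, each instance is synchronized from its physical counterpart at a defined twinning rate, and discrete twins can be grouped into a composite twin such as a substation.

```python
import time
from dataclasses import dataclass, field

@dataclass
class TwinModel:
    """One model (e.g. 'transformer' or 'pump') shared by many instances."""
    name: str
    attributes: tuple  # telemetry fields this twin synchronizes

@dataclass
class DigitalTwinInstance:
    """A virtual representation of one real-world entity."""
    model: TwinModel
    asset_id: str
    twinning_rate_s: float          # how often we synchronize (frequency)
    state: dict = field(default_factory=dict)
    last_sync: float = 0.0

    def synchronize(self, read_physical):
        """Pull fresh values from the physical side if the twinning rate allows."""
        now = time.time()
        if now - self.last_sync >= self.twinning_rate_s:
            self.state = {a: read_physical(self.asset_id, a) for a in self.model.attributes}
            self.last_sync = now

@dataclass
class CompositeTwin:
    """A composite twin (e.g. a substation) built from discrete twins."""
    name: str
    members: list

    def synchronize(self, read_physical):
        for twin in self.members:
            twin.synchronize(read_physical)

# Usage: one 'transformer' model, many instances, grouped into a substation.
def fake_read(asset_id, attribute):
    return 42.0  # stand-in for a real sensor or historian read

transformer = TwinModel("transformer", ("oil_temp_c", "load_kva"))
substation = CompositeTwin(
    "substation-01",
    [DigitalTwinInstance(transformer, f"TX-{i}", twinning_rate_s=5.0) for i in range(3)],
)
substation.synchronize(fake_read)
print(substation.members[0].state)
```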

What does life look like right now, without digital twins? Well, operations, maintenance, safety, and all these other functions have their own little systems on the right-hand side, where they sometimes duplicate data, use different systems, and keep their own capabilities in different silos. What a digital twin brings is really a proxy that allows you to use common capabilities in the middle and then lets the different areas of the business create use cases. We'll get back to these six core capabilities, as you see them there, when we talk a little later about how to build these things, so just remember the paper clip; we'll come back to that.

The second question that we get is: why should I care? Again, in energy and utilities, some of the examples here are really around measurable ROI. This is how Michael Grieves talks about the impediments to digital twin adoption: one of them is that you have to have value-based use cases, because "accountants are the killers of joy," according to him. And if you didn't know, he originated the term digital twin.

Looking at energy and utilities, and specifically at the asset side: for predictive maintenance, according to a McKinsey study, you see anywhere between a 10 and 20 percent reduction in maintenance costs, as well as an increase in asset availability. A 10 to 20 percent reduction, if you put that in the context of your organization, is massive. Likewise with asset performance management: how do we improve utilization and asset productivity? Again, you can see improvements of up to five percent. If we move to the grid operations side of things, on the transmission side, Navigant Research reports anywhere from a 10 to 20 percent reduction in grid-related outage times. If we can start integrating some of these new-generation or alternative energy sources, and do that in a structured way, then again the impact is anywhere from three to five percent improvement in getting those online. And lastly, disaster recovery: things like natural disasters, how do we recover from them, and how do we recover quicker? These numbers are from the Electric Power Research Institute.

There has also been a shift in power systems. Traditionally we had generation units and load units, and it was a continuous optimization problem. Where we are moving to now, we have generators that are also consumers, and it becomes much more of a balancing act; we have to sense, decide, and act in a different way. It requires collaboration, orchestration, and a lot more flexibility compared to where we came from historically. In order to do that, and this is from Gartner, we are moving from where we are right now, with a limited amount of renewables and everything centralized, even the decision support and the very monolithic applications and structures we had, into intelligent, distributed organizations with composable decisions. The way to do that is, one, adopting digital twins in the context of today's discussion, and two, composable capabilities and how we deliver them.

Now again, all of this needs executive support, and we need to drive ROI. The levers that executives have, and where digital twins can support the return on investment, are really around helping these assets run more often, produce more or enhance output while they are operating, improving the integration of distributed assets, and minimizing cost in the process, all within a framework of enhanced safety and improved ESG. These span four key drivers or threads: business performance, process optimization, ESG monitoring and compliance, and asset performance. This is a set of strategic initiatives, and to drive it a lot of organizations are creating different initiatives underneath to address it. But the real trick is in moving from the strategic to the tactical and operational side, and thinking about what the decision support and automation requirements are at each level of the triangle, to turn strategy into execution at both the tactical and operational levels.

When we look at this from a digital twin perspective in a bit more detail, you'll see on the left-hand side we have strategic, tactical, and operational. Right at the top, from a strategic perspective, I want to see all those ROI drivers, and I potentially even want to see them across different sites, which may also have a tactical implication. Then I go into asset health, process health, and operational health with the equipment, again at the operational and tactical levels. This is what the digital twin can help you do, and one of the key things is that you can create metrics at every level to see how good you are at doing that.

Now, a different perspective on this: the role of a digital twin is to create that common operating picture for operational awareness and response, and the whole idea is to change from reactive to more prescriptive operations. Many organizations in power and utilities and other asset-intensive, complex industries have a whole array of complex assets where there are already SCADA systems, PLCs, sensors, and everything else, plus a whole bunch of different applications inside the organization: ERP, GIS, you name it, ML. And then we're trying to use people, processes, and automation to respond to events that happen in all of this context.

There are signals going from the assets into these systems, and transactions in terms of what people need to do, and we have subject matter experts with a deep understanding of these assets and how they operate. What we are trying to do is, first of all, connect to all of those signals and data and create some event intelligence. In XMPro we do that through what we call our data streams: a visual way of connecting and handling the integration to all of these complex things around a specific use case or application we are trying to address. It's a visual way of connecting the data so that we can create visualizations, looking at the same data at all the different levels but from a different perspective or lens. Operationally I see information, and this is what we refer to as event intelligence: we are now connected to these real-time data streams, and they feed our common operating picture from the same data sources but through a different lens in each perspective. At the strategic level, at the planning or tactical level, what's my view for the next two weeks, a month, a quarter, versus what's happening right now at the operational level and what I need to do right now? This gives us operations intelligence; we've now moved from event intelligence to being able to operate in a better way.
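As an illustration of "same data, different lens", here is a small, hypothetical Python sketch (the field names and records are invented) that takes one stream of asset events and produces an operational view (what needs action now), a tactical view (the next month of planned work), and a strategic roll-up by site.

```python
from collections import Counter
from datetime import date, timedelta

# One shared event stream feeding all three lenses (hypothetical records).
events = [
    {"site": "WF-North", "asset": "WTG-07", "severity": "high", "due": date.today()},
    {"site": "WF-North", "asset": "WTG-12", "severity": "low", "due": date.today() + timedelta(days=10)},
    {"site": "WF-South", "asset": "WTG-03", "severity": "medium", "due": date.today() + timedelta(days=40)},
]

# Operational lens: what do I need to act on right now?
operational = [e for e in events if e["severity"] == "high" or e["due"] <= date.today()]

# Tactical lens: what is planned for the next two weeks to a month?
horizon = date.today() + timedelta(days=30)
tactical = [e for e in events if date.today() < e["due"] <= horizon]

# Strategic lens: portfolio roll-up across sites.
strategic = Counter(e["site"] for e in events)

print("now:", [e["asset"] for e in operational])
print("next 30 days:", [e["asset"] for e in tactical])
print("by site:", dict(strategic))
```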

Adding some more capability to this is being able to create recommendations that you can consolidate from all of these different places, with a consistent way of presenting how people respond to the different events that happen, again using a similar structure, whether at the strategic level, the tactical level, or even bringing it up from the underlying systems that sit at the bottom. For us, that's the Holy Grail of a common operating picture: not just seeing the picture, but knowing what to do, with prescriptive recommendations. That allows your smartest people to pull those ROI levers, because they know how to manipulate them, which reduces the risk of being blindsided by key events that are likely to happen or are already happening. It also improves accountability and closes the feedback loop, which provides visibility and an opportunity for learning. And this is a whole new business process.

But if I look at digital twins in terms of business processes, this is not new. This is from Gartner, looking for example at an asset digital twin for asset performance management: when something is wrong, we want to raise a ticket for maintenance; we want to make sure we have the people and the spares; we want to schedule the maintenance effort and create tasks and work lists for technicians to go out and do the work; and at that stage we can take the asset offline for maintenance. So once vibration data gives us an alert, it goes full circle. In a way that's a familiar description, but this is just a business process with a different way of actuating and responding: instead of human-to-human workflows, it is now initiated through machines, IoT, and sensors. There's still an analysis process, there's still a work-planning process, and there's still the execution of making sure it's done. The value is in improving the yield and, in terms of the business outcomes, making the operation more profitable.
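A minimal sketch of that machine-initiated loop, assuming a made-up vibration threshold and a stand-in ticketing function rather than any specific EAM product: an alert from the data, not a person, starts the same analyse-plan-execute process.

```python
VIBRATION_ALARM_MM_S = 7.1  # hypothetical alarm threshold for this asset class

def raise_maintenance_ticket(asset_id, finding):
    """Stand-in for creating a notification/work request in an EAM system."""
    print(f"ticket raised for {asset_id}: {finding}")
    return {"asset": asset_id, "finding": finding, "status": "awaiting planning"}

def plan_work(ticket, people_available, spares_available):
    """Only schedule once people and spares are confirmed, then build the task list."""
    if not (people_available and spares_available):
        return None
    ticket.update(status="scheduled", tasks=["isolate asset", "replace bearing", "test run"])
    return ticket

def on_vibration_sample(asset_id, vibration_mm_s):
    """Machine-initiated trigger: the sensor reading starts the process."""
    if vibration_mm_s >= VIBRATION_ALARM_MM_S:
        ticket = raise_maintenance_ticket(asset_id, f"vibration {vibration_mm_s} mm/s over alarm")
        return plan_work(ticket, people_available=True, spares_available=True)
    return None

work_order = on_vibration_sample("PUMP-114", 8.3)
print(work_order)
```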

From an XMPro point of view, with a common operating picture it's the same: it's about integrating all of this heterogeneous data in a drag-and-drop way, and then being able to combine off-the-shelf analytics with perhaps more advanced analytics developed by your own SMEs in-house; getting that into the systems so you can create work orders and the like in ERP, EAM, and some of the other systems; and then creating an interactive user experience. As we move into AR, VR, and some of the others, how do we support those so we can help that technician do the best job, and then verify and check that the work has been done, so we can close the loop on this?

The next question that we get is: this is all well and good, but how do I get started?

The way that we get started is based on the concept of composable architecture. There are many different pictures; I personally like this one from Gartner, where it talks about all these packaged business capabilities that sit at the center. How can we create these reusable blocks, almost like Lego blocks, and then allow my subject matter experts, where all the data and everything comes from all the existing systems that we have, to compose new applications? This composition platform can handle the integration, orchestration, operation, and governance as we go, and then we just build these applications on top.

We've taken that same approach and applied it to the different types of data and information that we see in industrial, and specifically power and utilities, environments: physics-based models, analytics, IoT and temporal data, transactional data, visual models, and master data. You can see all the different types that live in there. Through that we can now create packaged business capabilities: we can have leak detection, we can do current monitoring, and so on, as reusable little blocks of capability that we compose together to be able to grow digital twins.
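A sketch of that "Lego block" idea, with invented capability functions and an invented interface: each packaged capability does one job (leak detection, current monitoring) and exposes the same simple signature, so a new use case is just a different composition of existing blocks rather than a new integration.

```python
# Each packaged business capability is a reusable block with one common signature:
# it takes a reading dict and returns a list of findings (hypothetical interface).

def leak_detection(reading):
    if reading.get("flow_in_l_s", 0) - reading.get("flow_out_l_s", 0) > 2.0:
        return ["possible leak: flow imbalance"]
    return []

def current_monitoring(reading):
    if reading.get("current_a", 0) > reading.get("rated_current_a", float("inf")):
        return ["overcurrent above rated value"]
    return []

def compose(*capabilities):
    """Compose reusable blocks into one application-specific check."""
    def application(reading):
        findings = []
        for capability in capabilities:
            findings.extend(capability(reading))
        return findings
    return application

# A new use case is just a different composition of the same blocks.
pump_station_checks = compose(leak_detection, current_monitoring)
print(pump_station_checks(
    {"flow_in_l_s": 10.0, "flow_out_l_s": 7.0, "current_a": 40, "rated_current_a": 35}
))
```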

That's where XMPro sees itself: as a digital twin composition platform that is really good at handling the integration, composition, orchestration, development, and management of all of this, as well as the UX part that sits on top, while all of it is integrated into legacy business systems, IT systems, IoT systems, and the modern data fabrics that we see a lot of organizations implementing right now. That enables us to build digital twins for performance management, fault detection, automation, emissions, and more, and we can reuse a lot of what we've done, because the connections and capabilities have been packaged together. From an XMPro perspective, the way we do that: on the left-hand side we've got our data streams, which are how we connect, integrate, compose, and orchestrate; on the right-hand side is more the UX, with UX development in our Application Designer.

designer so I mentioned earlier in terms

of the little paper clip and those six

capabilities that we saw inside digital

twin Consortium I had the opportunity to

lead a group that created the

capabilities of a periodic table and we

categorized all the capabilities of a

digital 20 into six main groups naming

uh data services integration

intelligence user experience management

and trustworthiness as the key

categories for these and inside that for

example with data services we have data

streaming data transformation so a whole

bunch of of core capabilities there's

actually 62 so in integration on

Intelligence on ux on how to manage all

of this and also from a trustworthiness

this is available on the digital twin

Consortium website

um so I'm not going to draw down into

much more detail you can get some

information there but there but you can

compose any digital twin by using some

of these blocks

Here's an example of condition monitoring for a wind farm: in this one I only use these capabilities. If I want to create energy prediction, then suddenly I have machine learning, artificial intelligence, and some other things I need to add. This also helps us start a conversation around what capabilities we have in the organization and which of them we should look at building, partnering for, or acquiring. And it helps us not focus only on the smart technical things, because this is not a technology and it is not an architecture; it is a set of core capabilities, and some of them relate to trustworthiness, like security and safety, and to how we handle things like event logging and managing all of this at scale.
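A small sketch of how the capabilities periodic table can drive that conversation, with illustrative capability names only (the full table of 62 capabilities is on the Digital Twin Consortium website): list what each use case needs by category, list what the organization already has, and the difference is what you build, partner for, or acquire.

```python
# Illustrative capability sets keyed by the periodic-table categories.
use_cases = {
    "wind farm condition monitoring": {
        "Data Services": {"data streaming", "data transformation"},
        "Integration": {"OT connectivity"},
        "User Experience": {"dashboards"},
    },
    "energy prediction": {
        "Data Services": {"data streaming", "data transformation"},
        "Intelligence": {"machine learning", "prediction"},
        "User Experience": {"dashboards"},
    },
}

# What the organization already has in place (also illustrative).
existing = {"data streaming", "OT connectivity", "dashboards"}

for use_case, categories in use_cases.items():
    required = set().union(*categories.values())
    gap = required - existing
    print(f"{use_case}: build/partner/acquire -> {sorted(gap) or 'nothing, ready to compose'}")
```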

So what does it look like in a typical example? Here is that wind farm, and this is a remote operations center. I can see my overall portfolio of wind farms, solar farms, and all sorts of other assets. At the moment they're all green, so I'm pretty happy with those. I can also see key outcomes, and I'll drill into much more detail on each of these, but it's really around how we make sure that we are outcomes-focused, that we provide contextual metadata and context for decision makers, how we focus on the asset itself and asset performance, how we provide advanced analysis as well as collaboration, and how we provide access to knowledge, all in a common operating picture.

With that, I'll get into a little more detail on how we compose digital twins. For us it's a three-step process: the first step is orchestrating all of this data in a visual way; the second part is creating the visual experience, in this instance a desktop app, though it could be mobile, AR or VR, or any of those; and then, how do we create recommendations? That's a key thing, because the real outcome you're looking for is the actions that come from recommendations.

If I touch on the first part, in terms of how we build this orchestration: what you see here is typically how we do it at XMPro. We've got a visual, drag-and-drop data stream designer, and with it I can bring in information from the rotor, the gearbox, the power, the yaw, the pitch, and all of that telemetry, using MQTT. These are all draggable blocks under listeners; there's a whole library, and it's extensible. I can drag these on, and I can bring in context like make, model, and so on from Maximo in this instance. I can then transform that data; there's a whole range of blocks around transformations, calculations, cleaning and wrangling data, and all of that. The next step is applying some AI to it: you can see I'm running a Python model to predict likelihood of failure, there's anomaly detection at the top right, and I'm also storing some data at the same time in InfluxDB as an action, via an action agent. On the right-hand side we then run a recommendation, which again comes in different variations, all blocks that I can drag on. So this is a visual way to build, and then the data flows at the frequency that I specify. What we've added now is the ability to bring in Jupyter notebooks, so we have our AI Designer, which is embedded in the product as well, and I can run advanced logic, models, and simulations in Jupyter notebooks as part of that data stream, which is a key part of bringing intelligence to digital twins.
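To show the shape of such a data stream in plain code, here is a hedged sketch, not XMPro's implementation: stand-in functions play the roles of the listener (where the MQTT telemetry would arrive), a transformation block, a crude anomaly score standing in for the trained failure-prediction model, an action agent (where the InfluxDB write would happen), and a recommendation block, all executed on one tick of the stream.

```python
import statistics

# --- listener block: stand-in for MQTT telemetry from rotor/gearbox/yaw/pitch ---
def listen():
    return {"asset": "WTG-07", "gearbox_temp_c": 78.0, "power_kw": 2150.0, "pitch_deg": 4.2}

# --- transformation block: derived/cleaned values ---
def transform(reading):
    reading["power_mw"] = round(reading["power_kw"] / 1000.0, 3)
    return reading

# --- AI block: a z-score standing in for a trained likelihood-of-failure model ---
HISTORY = [68.0, 70.5, 69.0, 71.2, 70.1]  # recent gearbox temperatures (illustrative)

def anomaly_score(temp_c):
    mean, stdev = statistics.mean(HISTORY), statistics.stdev(HISTORY)
    return abs(temp_c - mean) / stdev  # scores above 3 are treated as anomalous here

# --- action agent: stand-in for writing the enriched record to a historian/InfluxDB ---
def store(record):
    print("stored:", record)

# --- recommendation block: turn the score into an instruction for a person ---
def recommend(record, score):
    if score > 3:
        return f"Inspect {record['asset']} gearbox cooling; {record['gearbox_temp_c']} C is anomalous."
    return None

# One tick of the stream (in a real stream this runs at the configured frequency).
reading = transform(listen())
score = anomaly_score(reading["gearbox_temp_c"])
store({**reading, "anomaly_score": round(score, 2)})
print(recommend(reading, score) or "no action needed")
```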

The next part is to make it look pretty: the visualizations. As I mentioned, you can bring in objects from external things like GIS systems, Esri for example, and also recommendations from the recommendation engine showing me where there are potential challenges. This is outcomes-focused, so you can see core metrics: am I doing better, am I getting worse, what's my current view on open work orders, and I can see the health across my different assets pretty easily. If I get into the asset itself, I can start looking at the time profile, some of the calculations around effective utilization, the power it's generating, current real-time yaw and pitch, and all of that. I can see some of the other live metrics and the recommendations specifically for this one. This could be a 2D graph, a Unity model, or an Omniverse visualization. I also have my maintenance records from my work order history and everything else, again as a common operating picture for someone who needs to make a decision right now. I may want to do further analysis, compare different turbines in the same wind farm to each other, or get contextual information about what's happening right now in the environment. I may need to speak to someone from the supplier of these assets, or someone in our operations centers, about issues that I'm seeing, and I may have to get documentation and supporting information around this.

Where this is all heading, as you've seen with things like ChatGPT and what we are baking in, is the ability to use that to interrogate the data: for example, what are the failure modes, and what could potentially be causing what I'm seeing here? That's where this is heading, and it is all part of the visualization part in the middle. The next part is being able to create these recommendations. In this instance I can see some of the real-time data around what happened, and then I can create a work order; this will automatically create the work order back in my existing system, whether that's SAP or Maximo or whatever it may be.
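As a rough illustration of that hand-off, and not a real SAP or Maximo integration, here is a sketch that builds a work-order payload from a recommendation and would post it to a placeholder REST endpoint; the URL, field names, and lack of authentication are all hypothetical.

```python
import json
import urllib.request

EAM_URL = "https://eam.example.com/api/work-orders"  # placeholder endpoint, not a real system

def build_work_order(recommendation):
    """Map a digital-twin recommendation onto a (hypothetical) EAM work-order payload."""
    return {
        "assetId": recommendation["asset"],
        "description": recommendation["text"],
        "priority": "high" if recommendation["score"] > 3 else "medium",
        "source": "digital-twin-recommendation",
    }

def submit_work_order(payload, dry_run=True):
    """POST the work order; dry_run avoids calling the placeholder endpoint."""
    if dry_run:
        print("would POST to", EAM_URL, "->", payload)
        return None
    body = json.dumps(payload).encode()
    req = urllib.request.Request(EAM_URL, data=body, headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(req)

recommendation = {"asset": "WTG-07", "text": "Inspect gearbox cooling", "score": 6.3}
submit_work_order(build_work_order(recommendation))
```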

I can also do some analysis on how many times we've seen this across this equipment. There's a lot more to this, and I'm skimming over it pretty quickly, but that's the other core element of an interactive digital twin. Which leads me to where this is all going: what is the future?

Again, I had the opportunity to work with Dr Michael Grieves at the Digital Twin Consortium, and he has this view of intelligent digital twins. For me, what we need is an integrated, intelligent, interactive, composable environment. On the integrated side, we need standards-based APIs and models, as I showed earlier. What we're seeing is executable AI, as I've shown you, where we've got Jupyter notebooks and embedded Python running inside, so I can get the smarts of my engineers into those data streams, and simulation AI, being able to run things like front-running simulations, similar to what we have in Formula One, where I can speed up the environment and see what's likely to happen. Again, this is all possible right now.
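The front-running idea can be sketched very simply, with an invented first-order thermal model and invented parameters: run the same model that tracks the asset, but step simulated time much faster than wall-clock time, so you can see where the current trend lands before it happens.

```python
def step_temperature(temp_c, load_factor, dt_s):
    """Toy thermal model: heats with load, cools toward a 40 C ambient (illustrative only)."""
    heating = 0.02 * load_factor * dt_s
    cooling = 0.0005 * (temp_c - 40.0) * dt_s
    return temp_c + heating - cooling

def front_run(current_temp_c, load_factor, horizon_s, speedup=600):
    """Project 'horizon_s' of real time ahead by stepping the model at 'speedup' x real time."""
    temp, simulated = current_temp_c, 0
    dt = 1.0 * speedup                      # each loop iteration covers 'speedup' seconds
    while simulated < horizon_s:
        temp = step_temperature(temp, load_factor, dt)
        simulated += dt
    return temp

# Where does the gearbox temperature end up one hour from now at the current load?
print(round(front_run(current_temp_c=65.0, load_factor=0.9, horizon_s=3600), 1), "C projected")
```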

And then augmented AI, for making the digital twins smarter: putting AI over the data that we're collecting and seeing how we can improve them, so we can get some machine intelligence out of this. From an interactive point of view, we're looking at making them more AI-enabled, so that they provide recommendations and prescriptive analytics, and at creating multi-user experiences that are more collaborative and generative. And then, we don't know where the industrial metaverse is going to end up or what it's going to look like, but digital twins will be the core building blocks for the industrial metaverse. And all of this in a composable way, so we can reuse what we're doing.
