End-To-End Use Case
This Use Case assumes the XMPro platform is installed and configured, or you are using the Free Trial that has everything set up for you.
This step-by-step tutorial is meant to be an introduction to using the XMPro platform. Completing it will give you a solid foundation for understanding the more advanced concepts and detailed how-to guides. This tutorial explains how to create and design a Data Stream and configure Stream Objects to ingest, analyze, transform, and perform actions on data. You will also learn how to set up a Recommendation that generates alerts based on rule logic, create and design an App, create Data Sources and Connections, and configure a simple Data Grid and Chart.
Please note that the XMPro platform requires third-party cookies to be enabled on your browser.
Let's assume there is a power plant that uses a heat exchanger to keep the turbine cool and at the optimum temperature. Water is circulated between the cooling tower and the heat exchanger to dissipate heat. To maintain proper circulation, three pumps (A, B, and C) are installed. Each pump has a sensor that provides live Flow Rate (L/m) and Temperature (°C) data over MQTT.
Unless a pump is under maintenance, its Flow Rate should be above 15000 L/m and its water temperature should be below 130°C.
Engineers should be alerted if the average flow rate falls below 250 L/s. If the average temperature rises above 130°C, a critical-level alert should be raised.
Engineers should be provided a view to check the history of pump telemetry, maintenance records, and reservoir level to enable them to take necessary action.
The Use Case requires that we gather the Flow Rate and Temperature data from three pumps constantly, and pass it on to be analyzed and have actions performed on the data. We will achieve this with the use of Data Streams. A Data Stream is a visual representation of a flow of data. It is created through the Data Stream Designer.
To access the Data Stream Designer, log into your XMPro account, click the "Waffle" button (App Launcher) in the top-left corner of the screen, and click the Data Stream Designer item.
A Data Stream has four components:
Ingesting data through Listener Agents
Contextualizing sensor data/telemetry through Context Provider Agents
Analyzing and transforming data through Transformation and AI Agents
Performing actions or outputting data to other integrations through Recommendation and Action Agents
We will follow those four steps below.
In this section, we will simulate reading data from pump sensors and a metadata store, and combine the data together into a single flow.
See the Data Stream Concept article for more information on Data Streams.
To begin, we will need to create a new Data Stream. To create a Data Stream, follow the steps below:
Open the New Data Stream page from the left-hand menu.
Give the Data Stream a name. For example, "Pump Condition Monitoring"
Select the Type "Streaming". Data Streams of the Streaming type run polling Agents at a set interval, for instance, every 10 seconds, whereas Recurrent Data Streams run on a customizable schedule, for instance, once a day at 12am. The Recurrent type only applies to polling-based Stream Objects, which we won't use in this example.
Select the category under which the Data Stream is to be added.
Feel free to load a suitable icon. If you do not, the default icon will be used. Sample icons can be found in the Icon Library.
Select a collection that will be used to publish the Data Stream.
Enter a description that best describes the Data Stream.
Click on "Save".
In a production environment, Data Streams would integrate with external data emitters through Agents like OSIsoft PI or MQTT Listeners. However, for the sake of keeping the example simple, we won't be using any Agents that require an environment to be set up. Instead, we will be simulating the data with the Event Simulator, Calculated Field, and CSV Context Provider Agents.
To simulate the telemetry from the pumps, follow the steps below:
Drag one of each of the following Agents into the canvas: Event Simulator, Calculated Field, CSV, and Join.
Refer to How to Upload an Agent to Data Stream Designer if you are not able to find the Agents in the toolbox or the correct versions.
You can search for the Agent in the search bar, and click and drag the Agent into the canvas to add it. An instance of an Agent added to the canvas is referred to as a Stream Object.
Once you have all four Stream Objects in the Data Stream canvas, rename them as follows:
Event Simulator as "Simulate Pump Data"
Calculated Field as "Add Pump Identifier"
CSV as "Simulate Context Data for Assets"
Join as "Contextualize Data"
To change the name of a Stream Object, click the text and edit it.
Your stream should end up looking like this:
Once you have renamed all four Stream Objects, connect them with arrows as follows:
"Simulate Pump Data" to "Add Pump Identifier"
"Add Pump Identifier" to "Contextualize Data" (first input)
"Simulate Context Data for Assets" to "Contextualize Data" (second input)
To connect two Stream Objects, click and drag the green rectangle (Output) at the bottom edge of the first Stream Object and release it on the green rectangle (Input) on the left edge of the second Stream Object.
Your connected Stream Objects should look like this:
Now we will configure the added Stream Objects. Save your Data Stream now and after every change to propagate the changes throughout the Data Stream.
See the article on how to configure Stream Objects for more information.
We will need to simulate ingesting data about flow rate and temperature from sensors in the pumps. We can achieve this with the Event Simulator Agent. The "Simulate Pump Data" Event Simulator will constantly generate data defined by the Event Definitions at a rate defined by the Events per Second property.
To edit the configuration of a Stream Object, either double-click it or click it once to select it and click the "Configure" button on the canvas header.
Edit the "Simulate Pump Data" Stream Object and click the + button to the right of the Event Definition grid to add event definitions.
Add two event definitions as follows:
Name: WaterTemperature
Type: Range
Minimum Value: 100
Maximum Value: 160
Spike Value: 0

Name: FlowRate
Type: Range
Minimum Value: 14000
Maximum Value: 16000
Spike Value: 0
Ignore the Spike Value and Generate Spike options, as they are not relevant to the current scenario.
Change the Events per Second to 1. Click "Apply" on the Simulate Pump Data configuration page. Then click "Save" on the Data Stream page.
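To make the configuration concrete, here is a minimal Python sketch of what the "Simulate Pump Data" Event Simulator produces: each Range definition yields a random value between its Minimum Value and Maximum Value, once per second at 1 event per second. The function name is hypothetical; only the field names and ranges come from the tutorial.

```python
import random

def simulate_pump_event():
    """Mimic one event from the "Simulate Pump Data" Event Simulator.

    Each Range event definition produces a random value between its
    Minimum Value and Maximum Value.
    """
    return {
        "WaterTemperature": random.uniform(100, 160),  # °C
        "FlowRate": random.uniform(14000, 16000),      # L/m
    }

event = simulate_pump_event()
```

At 1 event per second, one such row enters the stream every second.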
We need to add a way to simulate having three different pumps. At the moment the data is not identified, so we will need to add an identifier to each row. This can be achieved with the Calculated Field Agent. The "Add Pump Identifier" Calculated Field will add a "PumpId" field to the generated data, with a value of "A", "B", or "C" derived from the reading number.
To configure the Stream Object, double-click on "Add Pump Identifier" to open its configuration. Alternatively, you can highlight the Stream Object and click on the "Configure" option at the top of the Data Stream.
Keep "Append to Current" as the "Results Returned As" value. This will add the value calculated by the expression to each row instead of creating a new row with the identifier.
Click the + button to the right of the Expressions grid to add the following expression:
Calculated Field: "PumpId" - The field won't exist yet in the dropdown, so you must enter it yourself.
Expression: ReadingNo % 2 == 0 ? "A" : ReadingNo % 3 == 0 ? "B" : "C"
Data Type: String
Press "Apply" on the PumpId expression and the Add Pump Identifier configuration pages, and press "Save" on the Data Stream page.
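The ternary expression above can be translated directly into Python to see which PumpId each reading gets. Note that it is not a strict A, B, C round-robin: even reading numbers map to "A", odd multiples of 3 map to "B", and the remaining odd numbers map to "C".

```python
def pump_id(reading_no: int) -> str:
    # Direct translation of the Calculated Field expression:
    # ReadingNo % 2 == 0 ? "A" : ReadingNo % 3 == 0 ? "B" : "C"
    return "A" if reading_no % 2 == 0 else ("B" if reading_no % 3 == 0 else "C")

ids = [pump_id(n) for n in range(6)]  # → ["A", "C", "A", "B", "A", "C"]
```

All three pump identifiers still occur regularly, which is all this simulation needs.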
There is often metadata associated with assets that is not part of the live data from the sensors. In this case, metadata includes whether the pump is currently under maintenance, the manufacturer, and the last service date. We must retrieve this data from elsewhere. In a production environment, this might be an SAP EAM system, but for this example, we can achieve this through the CSV Context Provider Agent.
Double-click on the "Simulate Context Data for Assets" Stream Object to open the configuration menu. You can also highlight the Stream Object and click on the "Configure" option at the top of the Data Stream.
Download the provided file. The contents of the file are below the download link.
Then, under Data, check the "Use Uploaded File?" checkbox and upload the file into the CSV Context Provider. The CSV Definition will be automatically detected and filled in.
Change UnderMaintenance from a String to a Boolean using the options in the dropdown. Also change ServiceDate from a String to a DateTime using the dropdown. Leave the Limit Rows, Filter Criteria, and Sort by properties at their default values.
When completed, press the "Apply" button at the top of the configuration, and then save the Data Stream.
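The type changes above are the important part: the CSV file stores everything as text, and the CSV Definition tells the Stream Object how to interpret each column. A small Python sketch of the same conversion, using a hypothetical row (the real values come from the downloaded file, and the actual date format may differ):

```python
from datetime import datetime

# Hypothetical context row with the columns described in the tutorial.
raw = {"PumpId": "B", "UnderMaintenance": "true", "ServiceDate": "2021-06-01"}

typed = {
    "PumpId": raw["PumpId"],
    # String -> Boolean, as set in the CSV Definition dropdown
    "UnderMaintenance": raw["UnderMaintenance"].lower() == "true",
    # String -> DateTime; "%Y-%m-%d" is an assumed format for illustration
    "ServiceDate": datetime.strptime(raw["ServiceDate"], "%Y-%m-%d"),
}
```

Without this step, downstream Stream Objects (such as the Filter we add later) would compare against the string "true" rather than a real Boolean.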
The metadata about each pump needs to be appended to each row of sensor data received from the pumps. This can be achieved with the Join Agent.
The "Contextualize Data" Join will join together the data from the CSV Context Provider and the Calculated Field using the PumpId as the common field. Configure it as follows:
Behavior: Context - we want to join some context data to our row.
Context Endpoint: Right - we must tell the Stream Object which input has Context data. The Context Data is received by the Right endpoint, as shown in the image below.
Select List: all fields except R_PumpId (as the same data will be in L_PumpId).
Join Type: Inner Join.
On: L_PumpId = R_PumpId
You may need to maximize the page to see the grid properly. You can do this by pressing the "Maximize" button in the top-right corner of the page. Press the "Restore" button in the top-right corner to return it to the regular size.
Press "Apply" on the Contextualize Data configuration page, and press "Save" on the Data Stream page.
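Conceptually, the "Contextualize Data" Join performs an inner join on PumpId, prefixing left (telemetry) fields with L_ and right (context) fields with R_, and dropping R_PumpId because it duplicates L_PumpId. A minimal Python sketch of that behavior (the function is illustrative, not XMPro API):

```python
def contextualize(telemetry_rows, context_rows):
    """Inner join telemetry to context data on PumpId, with L_/R_ prefixes."""
    context_by_id = {row["PumpId"]: row for row in context_rows}
    joined = []
    for left in telemetry_rows:
        right = context_by_id.get(left["PumpId"])
        if right is None:
            continue  # inner join: unmatched telemetry rows are dropped
        row = {f"L_{k}": v for k, v in left.items()}
        # R_PumpId is excluded, matching the Select List in the tutorial
        row.update({f"R_{k}": v for k, v in right.items() if k != "PumpId"})
        joined.append(row)
    return joined

telemetry = [{"PumpId": "A", "FlowRate": 15200}, {"PumpId": "D", "FlowRate": 15000}]
context = [{"PumpId": "A", "UnderMaintenance": False}]
rows = contextualize(telemetry, context)
```

Because the join is Inner, telemetry rows with no matching context row (like "D" above) are discarded rather than passed through with empty fields.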
In this section, we will add analytics and calculations that find exceptions, transform the units of the data, and compute averages across 5-second windows.
We want to only pass data onward in the Stream if the current pump is not under maintenance. This can be achieved with the Filter Agent. To do this, drag in a Filter Agent and connect the "Contextualize Data" Join endpoint to the Filter.
Rename the Filter to "Ignore Pumps Under Maintenance", and save. Double-click on the Stream Object to open the configuration menu. Click on the + symbol to add a new rule for the filter. Select "Add Condition", and configure the Filter to have the logic R_UnderMaintenance Equals false.
The configuration and Data Stream should look like this:
The Filter Stream Object has two green outputs: data that satisfies the filter condition is emitted from the left (True) output, and data that does not is emitted from the right (False) output. Connect the left True Output to the next Stream Object.
Press "Apply" on the Ignore Pumps Under Maintenance configuration page, and press "Save" on the Data Stream page.
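The Filter's routing behavior can be sketched in a few lines of Python. Rows satisfying R_UnderMaintenance Equals false go to the True (left) output; the rest go to the False (right) output (function and variable names are hypothetical):

```python
def filter_under_maintenance(rows):
    """Split rows like the "Ignore Pumps Under Maintenance" Filter."""
    true_output = [r for r in rows if not r["R_UnderMaintenance"]]   # left output
    false_output = [r for r in rows if r["R_UnderMaintenance"]]      # right output
    return true_output, false_output

rows = [
    {"L_PumpId": "A", "R_UnderMaintenance": False},
    {"L_PumpId": "B", "R_UnderMaintenance": True},
]
active, ignored = filter_under_maintenance(rows)
```

Only the `active` rows (pumps not under maintenance) continue down the stream; nothing is connected to the False output in this tutorial, so those rows simply stop here.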
The Pump Data is measured in L/m, but we want the unit to be L/s. This can be solved with the Calculated Field Agent. To transform the data, drag in a Calculated Field Agent and rename it to "Change Unit to L/s".
Connect the "Ignore Pumps Under Maintenance" Filter endpoint to the Calculated Field and save. Make sure you connect the left True Output of the "Ignore Pumps Under Maintenance" Stream Object to the Calculated Field's input.
Configure the Calculated Field as follows:
Calculated Field: L_FlowRate
Expression: L_FlowRate / 60
Data Type: Double
This will divide the flow rate by 60 to make the value in L/s instead of L/m.
Press "Apply" on the Change Unit to L/s configuration page, and press "Save" on the Data Stream page.
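The conversion is worth checking against the use case numbers: the minimum healthy flow rate of 15000 L/m becomes exactly the 250 L/s alert threshold used later.

```python
def to_litres_per_second(flow_rate_lpm: float) -> float:
    # One minute is 60 seconds, so L/m divided by 60 gives L/s.
    # In the stream this overwrites L_FlowRate in place.
    return flow_rate_lpm / 60

threshold = to_litres_per_second(15000)  # → 250.0 L/s
```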
The Use Case requires that engineers be alerted if the flow rate averaged over 5 seconds falls below 250 L/s, and that a critical-level alert be raised if the temperature averaged over 5 seconds rises above 130°C. This can be achieved with the Aggregate Agent.
To calculate the average temperature and flow rate over 5 seconds, drag in the Aggregate Agent and name it "Average across 5 seconds". Connect the "Change Unit to L/s" Calculated Field endpoint to the Aggregate Agent and save.
You may need to maximize the page to see the grid properly. You can do this by pressing the "Maximize" button in the top-right corner of the page. Press the "Restore" button in the top-right corner to return it to the regular size.
Configure the Aggregate Agent as follows:
Attributes to group on: L_PumpId
Aggregate:
Average (of) L_FlowRate (as) FlowRateAvg
Average (of) L_WaterTemperature (as) CoolantTemperatureAvg
Unit: Second
Size: 5
Press "Apply" on the Average across 5 seconds configuration page, and press "Save" on the Data Stream page.
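A Python sketch of what the Aggregate does to one 5-second window of rows: group by L_PumpId and emit one averaged row per pump, using the output names configured above. The windowing itself (collecting 5 seconds of rows) is assumed to happen upstream; the function is illustrative only.

```python
from collections import defaultdict
from statistics import mean

def aggregate_window(rows):
    """Average one 5-second window of rows, grouped by L_PumpId."""
    groups = defaultdict(list)
    for r in rows:
        groups[r["L_PumpId"]].append(r)
    return [
        {
            "L_PumpId": pid,
            "FlowRateAvg": mean(r["L_FlowRate"] for r in rs),
            "CoolantTemperatureAvg": mean(r["L_WaterTemperature"] for r in rs),
        }
        for pid, rs in groups.items()
    ]

window = [
    {"L_PumpId": "A", "L_FlowRate": 240.0, "L_WaterTemperature": 132.0},
    {"L_PumpId": "A", "L_FlowRate": 260.0, "L_WaterTemperature": 128.0},
]
averages = aggregate_window(window)
```

Grouping on L_PumpId matters: without it, readings from all three pumps would be averaged together and a single failing pump could be masked by the other two.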
We want the data for the average flow rate to be in integer format to display it more easily. This can be achieved through the Data Conversion Agent. To do this, drag in a Data Conversion Agent and rename it to "Data Conversion". Connect the "Average across 5 seconds" endpoint to the Data Conversion Agent and press "Save" on the Data Stream page.
Configure the Data Conversion Agent with the following two rows (You may need to maximize the page again). Click on the + symbol to add each row:
First row:
Input Column: "FlowRateAvg"
Output Alias: "FlowRateAvg"
Data Type: "Int64"
Second row:
Input Column: "CoolantTemperatureAvg"
Output Alias: "CoolantTemperatureAvg"
Data Type: "Int64"
The input columns should already have FlowRateAvg and CoolantTemperatureAvg as options listed in the dropdown menu.
This will replace the FlowRateAvg and CoolantTemperatureAvg with values converted to Integer format.
Press "Apply" on the Data Conversion configuration page, and press "Save" on the Data Stream page.
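In Python terms, the Data Conversion step replaces the two averages with integer values. Whether the real Agent truncates or rounds is not specified in this tutorial, so plain truncation is assumed in this sketch:

```python
def convert_row(row):
    """Replace FlowRateAvg and CoolantTemperatureAvg with Int64-style values."""
    out = dict(row)  # leave the incoming row untouched
    out["FlowRateAvg"] = int(out["FlowRateAvg"])                    # truncation assumed
    out["CoolantTemperatureAvg"] = int(out["CoolantTemperatureAvg"])
    return out

row = {"L_PumpId": "A", "FlowRateAvg": 249.6, "CoolantTemperatureAvg": 131.2}
converted = convert_row(row)
```

Because the Output Alias matches the Input Column for both rows, the converted values overwrite the originals rather than appearing as new columns.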
In this section, we will integrate our Data Stream with the App Designer to trigger Recommendations and send data to Apps.
First, we want to trigger Recommendations with the data from the Data Stream. This can be achieved with the Run Recommendation Agent. To do this, drag in a Run Recommendation Agent and rename it to "Run Recommendation". Connect the "Data Conversion" endpoint to the Run Recommendation Agent and press "Save" on the Data Stream page.
Configure the Run Recommendation Stream Object as follows:
Url: the URL of the App Designer site
Key: the App Designer Key.
Output on first occurrence Only?: true
Entity Identifier: L_PumpId - this is for the Recommendation to create separate Alerts for each Entity
Columns To Return: Leave empty (Return all columns)
It is highly recommended that you use Connection Variables if you already have variables that store the URL or Key. If you are using the Free Trial, you may use the variables that have already been set up.
Otherwise, the App Designer URL and Key can be found by following these steps:
Open the App Designer in a new tab by clicking the "Waffle" button (a.k.a. "App Launcher") in the top-left corner of the page and clicking "App Designer".
Copy the App Designer URL from the browser's address bar and paste it into the Url field in the Run Recommendation configuration.
Click the "Settings" button in the top bar and click the "Copy" button to the right of the Integration Key and paste it in the Key field in the Run Recommendation configuration. You will only be able to see this if you have Admin access. If you do not have Admin access, you can ask an Admin to share the key with you.
If you are configuring the URL and Integration Key without using variables, make sure you uncheck the "Use Connection Variables" checkbox option first.
This is how your Data Stream and configuration should look:
Press "Apply" on the Run Recommendation configuration page, and press "Save" on the Data Stream page.
We want to send data to an App to be displayed as a decision support dashboard for the engineers. This can be achieved through the XMPro App Agent.
Drag two XMPro App Agents onto the Data Stream and name them "Post Pump Overview" and "Post Pump Specifics". One will send an overview of the data for all pumps, and the other will send a larger cached set of data for each pump.
Now we run into a problem: we want to connect multiple Stream Objects to the same data. To solve this, drag a Broadcast Agent into the Stream and rename it to "Broadcast". Disconnect the arrow going into "Ignore Pumps Under Maintenance" and connect it to the new Broadcast Stream Object instead. You can disconnect the arrow by highlighting the arrow itself and clicking on the "Delete" button at the top of the Data Stream.
Alternatively, you can click on the green rectangle (Input) on the "Ignore Pumps Under Maintenance" Stream Object and drag the arrow to the green rectangle (Input) of the "Broadcast" Stream Object. Connect the Broadcast endpoints to the two XMPro App Stream Objects and the "Ignore Pumps Under Maintenance" Filter, as shown in the video below:
Press the "Save" button at the top of the Data Stream. Your Data Stream should now look like this:
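Conceptually, the Broadcast Stream Object just fans each incoming row out unchanged to every connected output. A minimal sketch (the function and output names are hypothetical, not XMPro API):

```python
def broadcast(row, *outputs):
    """Send an independent copy of the row to every connected output."""
    for out in outputs:
        out.append(dict(row))  # copy, so downstream edits don't interfere

pump_overview, pump_specifics, filter_input = [], [], []
broadcast({"L_PumpId": "A"}, pump_overview, pump_specifics, filter_input)
```

This is why the Broadcast sits upstream of the Filter here: both XMPro App Stream Objects receive all rows, including those for pumps under maintenance, while the Recommendation branch still only sees filtered data.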
Configure the "Post Pump Overview" Stream Object to store in cache and output only one row per pump as follows: (Follow the steps given for the Run Recommendation Agent above to get the Url and Key.)
Url: the URL of the App Designer site
Key: the App Designer Key.
Cache Size: 1
Replace Cache: false
Cache Per Entity: true
Entity Identifier: L_PumpId
Primary Key: L_PumpId
It is highly recommended that you use Connection Variables if you already have variables that store the URL or Key. If you are using the Free Trial, you may use the variables that have already been set up.
If you are configuring the URL and Integration Key without using variables, make sure you uncheck the "Use Connection Variables" checkbox option first.
Press "Apply" on the Post Pump Overview configuration page and press "Save" on the Data Stream page.
Configure the "Post Pump Specifics" Stream Object to cache and output 20 rows per pump as follows:
Url: the URL of the App Designer site
Key: the App Designer Key. For more detail on how to find the key in the Site Settings, see this article.
Cache Size: 20
Replace Cache: false
Cache Per Entity: true
Entity Identifier: L_PumpId
Primary Key: L_PumpId and L_ReadingNo
It is highly recommended that you use Connection Variables if you already have variables that store the URL or Key. If you are using the Free Trial, you may use the variables that have already been set up.
If you are configuring the URL and Integration Key without using variables, make sure you uncheck the "Use Connection Variables" checkbox option first.
Press "Apply" on the Post Pump Specifics configuration page and press "Save" on the Data Stream page.
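The "Cache Per Entity" and "Cache Size" settings behave like a bounded rolling window kept separately for each pump. A Python sketch of that idea (the class is illustrative only; with Replace Cache set to false, new rows are appended and the oldest row is evicted once the limit is reached):

```python
from collections import defaultdict, deque

class PerEntityCache:
    """Rolling per-entity cache, like Cache Per Entity with a Cache Size."""
    def __init__(self, cache_size: int):
        # deque(maxlen=...) evicts the oldest row automatically
        self.caches = defaultdict(lambda: deque(maxlen=cache_size))

    def add(self, row):
        self.caches[row["L_PumpId"]].append(row)

    def rows_for(self, pump_id):
        return list(self.caches[pump_id])

cache = PerEntityCache(cache_size=20)
for n in range(25):
    cache.add({"L_PumpId": "A", "L_ReadingNo": n})
```

So "Post Pump Overview" (Cache Size 1) always holds the single latest row per pump, while "Post Pump Specifics" (Cache Size 20) holds the 20 most recent rows per pump for history views.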
Your Data Stream is now complete. To start the stream, click on the "Publish" button. To see the data flow at each Stream Object, press the "Live View" button and select all the Stream Objects. Alternatively, you can also select specific Stream Objects. For example, if you just want to see the data flowing through to the XMPro App, select the XMPro App Stream Object.
To see if data is flowing properly within the Data Stream, you will need to Publish the Data Stream. Before publishing, you want to make sure there are no errors in the configurations of the Stream Objects. Click on the "Integrity Check" option at the top of the Data Stream. If any errors are present, the Stream Object with the errors will turn red. Hovering over the Stream Object will show you the list of errors. Once these errors are fixed, you will need to run the Integrity Check again.
To read more about Integrity Checks, read the Verifying Stream Integrity article.
Once all Stream Objects have passed the Integrity Check, you can click on "Publish", then "Live View", on the top of the Data Stream. The Live Data will open on the side, and you can then click on "Select Views" to click on the Stream Objects you want to troubleshoot.
If data is displayed for a Stream Object, that Stream Object should be working correctly. If not, recheck your configuration values for the Stream Object. You can also check whether you have a Stream Host running. There are other ways to troubleshoot Data Streams as well.
For more ways on how you can troubleshoot a Data Stream, read the Troubleshoot a Data Stream article.
The Use Case requires that engineers be alerted if the flow rate averaged over 5 seconds falls below 250 L/s, and that a critical-level alert be raised if the temperature averaged over 5 seconds rises above 130°C. To achieve this we will use Recommendations.
Recommendations can be found in the App Designer. Open the App Designer in a new tab by clicking the button in the top left corner of the page and clicking "App Designer".
To access the Recommendation management section of the App Designer, click on the "Recommendations" button in the left menu and press the "Manage Recommendations" button on the page.
To trigger the required Alerts, we will be creating a Recommendation with two Rules. First, create a Recommendation called "Pump Flow Threshold". The Data Stream should be the same "Pump Condition Monitoring" stream we created previously.
To create a new Recommendation, follow the steps below after navigating to the Recommendation management page:
Click "New".
Specify a name and category for your new Recommendation.
Choose a Data Stream to receive data from.
Click "Save".
Make sure to click the "Manage Access" command and give at least yourself Run Access; otherwise, you won't be able to see any Recommendation Alerts generated by this Recommendation.
This Rule will notify Engineers when the Flow Rate is lower than 250 L/s and give them instructions and resources to help resolve the issue.
Select the Enable Execution Order checkbox, since we want the more critical rule to override the medium rule. Create a Rule by pressing the + button to the right of the Rules list.
Give the Rule the following properties:
The tags for the Alert Heading and Alert Description fields will not work if they are copied and pasted into the field. You will need to select the tags yourself by adding an @ symbol and selecting from the tags in the list.
| Name | Value | Description |
| --- | --- | --- |
| Rule Name | Medium - Flow rate falling | The Rule Name is for identification only and will not be shown to the user in the Recommendation Alerts grid or detailed view. |
| Alert Headline | Warning | This refers to the headline that the Recommendation Alert will be created with. Any tag (starting with @) will be replaced with the value output from the Data Stream. Add tags by typing @ and selecting the item. |
| Alert Description | Flow rate is reported to be falling, danger of plant overheating and shutdown. Flow Rate: | This refers to the description that the Recommendation Alert will be created with. Any tag (starting with @) will be replaced with the value output from the Data Stream. |
| Alert Ranking | Medium | The priority level that the Recommendation Alert will be created with. Priority level determines the order in which the Alerts will be displayed. |
| Icon | An icon of your choice (hover your mouse over the default icon to upload a different one - sample icons can be found in the Icon Library) | The icon that will be displayed on the Recommendation Alert in the grid and in detailed view. |
| Impact Metric | Prefix: $ Value: 15 Unit of Measurement: K | The impact that the Recommendation Alert will have. For example, if the value for this was $15K, that means that the cost of the condition causing the Alert would be $15K (or $15000). This will be shown on Alerts in the Recommendation Block in App Pages. |
| Rule Logic | | Data sent from the selected Data Stream is passed through the Rule Logic, and if the conditions created are met by the data (and if Recurrence is set to All Occurrences, or to First Occurrence and no Pending Alert exists), a new Recommendation Alert will be created. You can add new conditions or groups by clicking the + button. Groups can be nested within each other to create advanced logic. In an "And" group, all the conditions must be true, and in an "Or" group, only one of the conditions must be true to trigger an alert. |
The Rule Logic determines whether a Recommendation Alert will be created on receiving data from the Data Stream.
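As a sketch, the two rules from the use case can be expressed as ordered threshold checks. Only the Medium rule is configured above; the High rule's name and logic are assumed here from the use case description (critical alert above 130°C), with the more critical rule checked first to mirror Enable Execution Order:

```python
def evaluate_rules(row):
    """Return the name of the first matching rule, or None if no alert fires."""
    if row["CoolantTemperatureAvg"] > 130:      # assumed High rule from the use case
        return "High - Temperature rising"      # critical-level alert
    if row["FlowRateAvg"] < 250:                # Medium rule configured above
        return "Medium - Flow rate falling"     # warning alert
    return None
```

In the real Recommendation, matching a rule creates a Recommendation Alert for the row's Entity (pump) rather than returning a value, subject to the Recurrence and Resolution settings below.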
There should be some instructions for the engineers to follow to help resolve the issue when it occurs. This can be provided through the Triage Instructions.
Continue creating the Rule with the following properties:
| Name | Value | Description |
| --- | --- | --- |
| Enable Form | false | A flag that determines whether the Recommendation Alert will be created with a Form. |
| Additional Recommendation Management Column | - | An additional column in the Recommendation Alerts grid. |
| Resolution | Manual | Resolution determines whether new data from the Data Stream will automatically resolve the Recommendation Alert if the Rule Logic is no longer true. Manual Resolution: A user must manually resolve each recommendation. Automatic Resolution: The Recommendation auto-resolves when the trigger conditions are no longer true. This may impact performance. |
| Recurrence | First Occurrence | Recurrence determines whether new data from the Data Stream will create new Recommendation Alerts if a Pending Recommendation Alert already exists and the Rule Logic is true. Recurrence will create an Alert for each unique Entity selected by the Data Stream. For example, since there are three pumps (A, B, and C), each pump will generate its own Alert when something goes wrong, and it will need to be resolved before new Alerts for that pump are created. First Occurrence: The current Recommendation Alert must be resolved before others can be triggered for the same rule. All Occurrences: A new Recommendation Alert is created every time the Rule Logic conditions are met. |