Just as big data is the key to many improvements in our daily lives, like driving directions on our smartphones, it’s also key to improving pipeline operations. However, pipeline operators often find themselves drinking from a fire hose of process, instrumentation, and historical data:
- Operating process parameters for drive trains, pumps, and compressors (temperatures, pressures, flow rates)
- Instrumentation parameters for meters, transmitters and analytical tools
- Short-term historical data collected for incident investigations
- Long-term historical data collected for comparative analysis and identification of trends
- Event data, such as equipment failures and human interventions on pipeline equipment, including replacements and preventive maintenance
And unfortunately, the quality of this data can be questionable. You’ve likely faced some (or all) of these pipeline-specific challenges related to collecting quality data:
- Validity: For example, are the pressure transmitter, the PLC, and the data-collection equipment all operating correctly? And is the pressure transmitter connected to the operating pipeline, or is it isolated for some reason?
- Centralization: It’s very challenging to centralize data, both for analysis and for archiving, given the low-bandwidth communications used at remote pipeline sites
- Availability: It’s tough to analyze data and look for trends when your data has gaps
So how do you take your company’s data quality to the next level?
Here are four ideas:
Improve Asset Management
Keeping track of hundreds (or thousands) of assets across great geographic distances can be a challenge. Asset inventory software can help verify that all of your assets are accounted for.
Asset inventory software should allow you to automatically discover assets in the field, and also track the status of all connected devices, network switches and workstation computers across your entire operation. In addition to verifying all of your assets are online, asset management software can also track other pertinent information like software running, firmware version, vendor name and IP address.
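To make the idea concrete, here is a minimal sketch of what an asset inventory record and an "all accounted for" check might look like. The field names, tags, vendor, and IP addresses are illustrative assumptions, not the schema of any particular product:

```python
# Minimal sketch of an asset inventory and a status check.
# All tags, vendors, and addresses below are hypothetical examples.
assets = [
    {"tag": "PT-101", "type": "pressure transmitter", "vendor": "ExampleCo",
     "firmware": "2.4.1", "ip": "10.0.4.11", "online": True},
    {"tag": "SW-07", "type": "network switch", "vendor": "ExampleCo",
     "firmware": "1.9.0", "ip": "10.0.4.2", "online": False},
]

def offline_assets(inventory):
    """Return the tags of any assets that failed their last status poll."""
    return [a["tag"] for a in inventory if not a["online"]]

print(offline_assets(assets))  # ['SW-07']
```

In practice, asset inventory software populates and refreshes records like these automatically via network discovery, rather than by hand.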
Implement an Analytical Watchdog
Once you’ve got tabs on all of your assets, you need a way to monitor the health and performance of each, as well as the health of the entire system. Analytics software allows you to monitor and analyze conditions across multiple devices on your network to determine patterns. Identifying data gaps early and addressing issues allows you to obtain better data quality. Your analytics software should:
- Alert you via your computer, tablet, or phone when a situation occurs that the system determines needs attention
- Learn from your actions which issues are important to you
- Perform analytics on detected smart devices to identify health and maintenance issues that could impact performance
- Take a system-wide look at asset interactions and performance
- Display dashboards with contextualized information to notify you when issues are detected
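One of the simplest watchdog checks, spotting data gaps early, can be sketched in a few lines. This assumes a hypothetical fixed polling interval; the function name and tolerance factor are illustrative, not from any specific analytics product:

```python
from datetime import datetime, timedelta

def find_gaps(timestamps, expected_interval, tolerance=1.5):
    """Flag any interval between consecutive samples that exceeds
    `tolerance` times the expected polling interval."""
    gaps = []
    for earlier, later in zip(timestamps, timestamps[1:]):
        if later - earlier > expected_interval * tolerance:
            gaps.append((earlier, later))
    return gaps

# Hypothetical one-minute polling with a dropout between 00:02 and 00:10
ts = [datetime(2024, 1, 1, 0, m) for m in (0, 1, 2, 10, 11)]
for start, end in find_gaps(ts, timedelta(minutes=1)):
    print(f"data gap: {start} -> {end}")
```

A real watchdog would run checks like this continuously across many tags and feed the results into the alerting and dashboard features described above.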
Consolidate Your Data
It’s inevitable that internet access will fail at some of your more remote locations. One solution is to send only the data operators require to the SCADA system. Collect the rest of your data locally, where you can store it at a higher sampling rate, then consolidate it and send it straight to your central repository. Consolidation can include:
- Sending data at a lower resolution (larger time interval).
- Using modern, real-time data historians that provide sophisticated ways of compressing data. These include reporting a value only when it has changed from the last reported value by a configured percentage or amount, monitoring for deviation from the trend established by preceding values, and other compression algorithms.
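The first of these compression techniques, reporting a value only when it moves a configured amount from the last reported value (often called deadband compression), can be sketched as follows. The function name and sample values are illustrative:

```python
def deadband_compress(samples, deadband):
    """Keep only samples that differ from the last *reported* value
    by at least `deadband`; always keep the first sample."""
    if not samples:
        return []
    reported = [samples[0]]
    for value in samples[1:]:
        if abs(value - reported[-1]) >= deadband:
            reported.append(value)
    return reported

# Seven raw pressure readings collapse to three reported values
raw = [100.0, 100.2, 100.1, 101.5, 101.6, 99.0, 99.1]
print(deadband_compress(raw, deadband=1.0))  # [100.0, 101.5, 99.0]
```

Because steady-state readings barely change, a scheme like this can cut stored and transmitted data dramatically while preserving every meaningful excursion.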
Organize Your Data
What are your operators truly looking for when they analyze operating, historical, and event data? And how can you serve up data to help operators most easily locate the information they need?
Ultimately, pipeline operators need to know how each piece of equipment is functioning. A key feature of your analytics software should be the ability to group data from various sources by the equipment that data ‘serves’. When data is grouped by piece of equipment, operators can quickly and easily locate all of the information needed to solve any problems that arise.
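Grouping by equipment is straightforward to illustrate. In this sketch, records from three hypothetical sources (SCADA, a historian, and an event log) each carry an equipment tag, and a simple index pulls everything for one asset together; the tags and field names are invented for the example:

```python
from collections import defaultdict

# Hypothetical records from different sources, each tagged with the
# equipment it serves.
records = [
    {"equipment": "PUMP-3", "source": "SCADA", "point": "discharge_pressure"},
    {"equipment": "PUMP-3", "source": "historian", "point": "motor_temp"},
    {"equipment": "METER-1", "source": "event_log", "point": "calibration"},
    {"equipment": "PUMP-3", "source": "event_log", "point": "seal_replacement"},
]

by_equipment = defaultdict(list)
for rec in records:
    by_equipment[rec["equipment"]].append(rec)

# An operator troubleshooting PUMP-3 now sees its data from every source:
for rec in by_equipment["PUMP-3"]:
    print(rec["source"], rec["point"])
```

The same grouping principle applies whether the index lives in analytics software, a historian’s asset model, or a database schema.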
And, grouping data by piece of equipment better positions you for future implementation of artificial intelligence (AI) analysis tools. But that’s a topic for another blog post.
For more information about how to take your company’s data collection skills to the next level, check out our whitepaper: “Pipeline Optimization: Key Opportunities for Smart Technologies.”
And to learn more about the products and services Rockwell Automation provides to support Pipeline Operators, visit our pipeline automation web page.