Just as big data is the key to many improvements in our daily lives, like driving directions on our smartphones, it’s also key to improving pipeline operations. However, pipeline operators often find themselves drinking from a fire hose of process, instrumentation, and historical data:
And unfortunately, the quality of this data can be questionable. You’ve likely faced some (or all) of these pipeline-specific challenges related to collecting quality data:
So how do you take your company’s data quality to the next level?
Here are four ideas:
Keeping track of hundreds (or thousands) of assets across great geographic distances can be a challenge. Asset inventory software can help verify that all of your assets are accounted for.
Asset inventory software should allow you to automatically discover assets in the field and track the status of all connected devices, network switches, and workstation computers across your entire operation. In addition to verifying that all of your assets are online, asset management software can also track other pertinent information, such as installed software, firmware version, vendor name, and IP address.
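To make the idea concrete, here is a minimal sketch of the kind of record an asset inventory might keep per device. All names (asset IDs, vendors, addresses) are hypothetical, and a real asset inventory product would populate these records via automatic network discovery rather than by hand:

```python
from dataclasses import dataclass


@dataclass
class Asset:
    """One field asset; attribute names are illustrative, not a product schema."""
    asset_id: str
    vendor: str
    ip_address: str
    firmware: str
    online: bool = True


class AssetInventory:
    """Tiny in-memory inventory: register assets and report any that go offline."""

    def __init__(self):
        self._assets = {}

    def register(self, asset: Asset):
        self._assets[asset.asset_id] = asset

    def mark_offline(self, asset_id: str):
        self._assets[asset_id].online = False

    def offline_assets(self):
        # The IDs operators would want flagged for follow-up
        return [a.asset_id for a in self._assets.values() if not a.online]


inv = AssetInventory()
inv.register(Asset("PLC-01", "VendorA", "10.0.0.5", "2.1.0"))
inv.register(Asset("SW-07", "VendorB", "10.0.0.9", "1.4.2"))
inv.mark_offline("SW-07")
print(inv.offline_assets())  # prints ['SW-07']
```

Even this toy version shows the payoff: once every device carries a consistent record, questions like "which assets are dark right now?" become one-line queries instead of manual audits.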
Once you’re keeping tabs on all of your assets, you need a way to monitor the health and performance of each one, as well as the health of the entire system. Analytics software allows you to monitor and analyze conditions across multiple devices on your network to identify patterns. Identifying data gaps early and addressing issues promptly leads to better data quality. Your analytics software should:
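One of the simplest health checks mentioned above, gap detection, can be sketched in a few lines. This is an illustrative stand-in for what analytics software would do continuously across many devices: scan a device's sample timestamps and flag any interval longer than expected (the function name and threshold are assumptions for the example):

```python
def find_gaps(timestamps, max_interval):
    """Return (start, end) pairs where consecutive samples are farther
    apart than max_interval (same time units as the timestamps)."""
    gaps = []
    for prev, cur in zip(timestamps, timestamps[1:]):
        if cur - prev > max_interval:
            gaps.append((prev, cur))
    return gaps


# Samples arriving every ~10 seconds, with one 75-second dropout
readings = [0, 10, 20, 95, 105]
print(find_gaps(readings, max_interval=30))  # prints [(20, 95)]
```

Flagging the dropout window itself, rather than just "device unhealthy," tells operators exactly which stretch of data to treat with suspicion.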
It’s inevitable that internet access will fail at some of your more remote locations. One way to address this is to send only the data operators require to the SCADA system. Collect the rest of your data locally so you can store it at a higher rate, then consolidate it and send it straight to your central repository. Consolidation can include:
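One common form of consolidation is downsampling: averaging high-rate local samples into larger windows before forwarding them upstream. The sketch below is a hedged illustration of that idea only (the window size and sample values are made up), not a description of any particular SCADA product's behavior:

```python
from statistics import mean


def consolidate(samples, window):
    """Average high-rate local samples into window-sized batches,
    reducing the volume sent over an unreliable link."""
    return [mean(samples[i:i + window]) for i in range(0, len(samples), window)]


# Six locally buffered pressure readings, consolidated 3-to-1
local_buffer = [50.1, 50.3, 50.2, 49.9, 50.0, 50.4]
batches = consolidate(local_buffer, window=3)
print([round(b, 1) for b in batches])  # prints [50.2, 50.1]
```

In practice you would also keep the full-rate buffer locally until the central repository acknowledges receipt, so a link failure never costs you data, only latency.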
What are your operators truly looking for when they analyze operating, historical, and event data? And how can you serve up data to help operators most easily locate the information they need?
Ultimately, pipeline operators need to know how each piece of equipment is functioning. A key feature of your analytics software should be the ability to group data from various sources by the equipment that data ‘serves’. When data is grouped by piece of equipment, operators can quickly and easily locate all of the information needed to solve any problems that arise.
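The grouping idea is straightforward to picture in code. Here is a minimal sketch, assuming flat records tagged with an equipment name (the equipment IDs, tag names, and source labels are all hypothetical): data arriving from different sources is pivoted so each piece of equipment carries everything known about it:

```python
from collections import defaultdict

# Flat records from different sources; field names are illustrative
records = [
    {"equipment": "Pump-A", "source": "SCADA", "tag": "discharge_psi", "value": 615},
    {"equipment": "Pump-A", "source": "historian", "tag": "vibration_mm_s", "value": 2.1},
    {"equipment": "Valve-3", "source": "SCADA", "tag": "position_pct", "value": 100},
]

# Regroup by equipment so operators see one consolidated view per asset
by_equipment = defaultdict(list)
for rec in records:
    by_equipment[rec["equipment"]].append((rec["source"], rec["tag"], rec["value"]))

print(sorted(by_equipment))  # prints ['Pump-A', 'Valve-3']
```

With this shape, "show me everything about Pump-A" is a single lookup across every source, which is exactly the view an operator wants when troubleshooting.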
And, grouping data by piece of equipment better positions you for future implementation of artificial intelligence (AI) analysis tools. But that’s a topic for another blog post.
For more information about how to take your company’s data collection skills to the next level, check out our whitepaper: “Pipeline Optimization: Key Opportunities for Smart Technologies.”
And to learn more about the products and services Rockwell Automation provides to support pipeline operators, visit our pipeline automation web page.