February 2014, Vol. 241 No. 2

Overcoming Data Deluge With Operations Management

Bob Ell, Oil & Gas Business Development, Rockwell Automation

Our increasingly digital world – with all of the texting, emailing, tweeting, posting, downloading and streaming that’s become commonplace – is creating more data than ever before. Pipeline operators face a similar phenomenon when it comes to information overload. Intelligent devices – for safety, fire and gas detection, instrumentation, motor control and condition-based monitoring – each offer their own types of information, along with related alarms.

As the amount of asset-related information swells, operations personnel find themselves overwhelmed. Maintenance needs can be overlooked and equipment can continue to malfunction, leading to increased downtime and rising production costs. In a worst-case scenario, unheeded alarms can escalate into full-scale emergencies.

An operations management system offers an alternative to this data deluge by creating real-time, secure and standards-based collaborative environments where remotely located subject matter experts make decisions and trained operators apply their recommendations. Using this approach, subject matter experts, such as those responsible for process operations and rotating assets, can be easily accessed and dispatched to deal with warnings and alarms from intelligent devices.

By diverting distracting information from on-site operators to off-site experts, operations management systems allow operators to focus on dashboard information that’s critical to smooth on-site operation. Operators who once suffered from a sort of paralysis by analysis can concentrate on information they can act upon. The days of sifting through inconsequential information are gone: no more missed critical alarms, no more overlooked maintenance, no more unplanned downtime caused by irrelevant data.
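
As a rough illustration of how such routing might work, the Python sketch below assigns alarms from intelligent devices to remote expert groups by device category while keeping critical alarms on the operator’s dashboard. The device categories, expert group names and alarm fields are illustrative assumptions, not any particular vendor’s implementation.

    from dataclasses import dataclass

    @dataclass
    class Alarm:
        device_id: str
        category: str   # e.g. "fire_and_gas", "motor_control", "instrumentation"
        priority: int   # 1 = critical ... 5 = informational

    # Hypothetical mapping of device categories to remote expert groups.
    EXPERT_GROUPS = {
        "fire_and_gas": "safety_engineering",
        "motor_control": "rotating_equipment",
        "instrumentation": "process_operations",
    }

    def route_alarm(alarm: Alarm) -> str:
        """Critical alarms stay on the on-site operator's dashboard;
        everything else is forwarded to the matching expert group."""
        if alarm.priority == 1:
            return "onsite_operator"
        return EXPERT_GROUPS.get(alarm.category, "onsite_operator")

    print(route_alarm(Alarm("P-101-M", "motor_control", 3)))  # rotating_equipment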

Easing The Search For The Truth
Shifting to a model that moves the supervisory horizon by presenting remote subject matter experts with live data can make life easier for operators and engineers alike. However, the data can still be overwhelming for these individuals. Two factors contribute to this: the time spent in search of “the truth,” and the manual processing required to shape “the truth” into a meaningful format that facilities and maintenance engineers can act upon.

The search for the truth is a labor-intensive process that often requires long hours mining historians and databases to extract data pertaining to a specific production challenge, such as reviewing the performance of a large rotating asset to predict when preventive maintenance should be undertaken.

Engineers often find themselves relying on a combination of databases to find the answer they’re seeking. In many cases, however, these sources of information aren’t accessible from a single computer, so gathering all the requisite data can be time-consuming. In the large rotating asset example, a subject matter expert would need to access CMMS packages, production data, motor performance data and process data to truly understand how the device was performing against design standards. Typically, the engineer would access each of these databases, extract the data and build a spreadsheet, consuming large amounts of non-productive time.

With an operations management approach, workflow automation helps eliminate these manual processes. Well-test automation, production allocation and pump efficiency calculations are all examples of workflows accomplished today in many organizations by subject matter experts mining multiple databases, manually running calculations or manually keying data into a spreadsheet. Implementing tools that automate these non-productive processes saves time by allowing experts to focus on the challenge they’re solving instead of getting muddled in data.
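
A minimal sketch of one such workflow, assuming a hypothetical read_tag() historian helper and illustrative tag names, shows how a pump efficiency calculation that would otherwise be rebuilt by hand in a spreadsheet can be automated (hydraulic power delivered to the fluid divided by shaft power):

    RHO = 1000.0   # fluid density, kg/m3 (water assumed)
    G = 9.81       # gravitational acceleration, m/s2

    def read_tag(tag: str) -> float:
        """Placeholder for a historian query; returns the latest value for a tag."""
        sample = {"P101.FLOW_M3S": 0.12, "P101.HEAD_M": 85.0, "P101.SHAFT_KW": 130.0}
        return sample[tag]

    def pump_efficiency(flow_m3s: float, head_m: float, shaft_kw: float) -> float:
        """Hydraulic power (rho * g * Q * H, converted to kW) divided by shaft power."""
        hydraulic_kw = RHO * G * flow_m3s * head_m / 1000.0
        return hydraulic_kw / shaft_kw

    eff = pump_efficiency(read_tag("P101.FLOW_M3S"),
                          read_tag("P101.HEAD_M"),
                          read_tag("P101.SHAFT_KW"))
    print(f"Pump P-101 efficiency: {eff:.1%}")   # about 77%

Scheduling this calculation against live historian data, rather than rebuilding a spreadsheet for each review, is the kind of non-productive step that workflow automation removes.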

Dashboarding technology – when paired with historian functionality and, most importantly, the ability to contextualize and organize all that data – can automatically aggregate production data, calculate key performance indicators, and present easy-to-understand dashboards and role-based displays via a secure web browser.
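
As a simple, hypothetical example of the aggregation behind such a dashboard, the sketch below rolls hourly production readings up into a throughput KPI and a plan-attainment figure; the sample values and the planned rate are assumptions for illustration only.

    from statistics import mean

    hourly_throughput_bbl = [410, 395, 402, 417, 388, 405]   # last six hourly readings
    planned_rate_bbl_per_hr = 400

    avg_rate = mean(hourly_throughput_bbl)
    kpi = {
        "avg_rate_bbl_per_hr": round(avg_rate, 1),
        "plan_attainment_pct": round(100 * avg_rate / planned_rate_bbl_per_hr, 1),
        "projected_daily_bbl": round(avg_rate * 24),
    }
    print(kpi)   # {'avg_rate_bbl_per_hr': 402.8, 'plan_attainment_pct': 100.7, 'projected_daily_bbl': 9668}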

This new model changes the meaning of supervisory control and creates an environment where more people have access to the same information. It also requires some additional considerations, especially as they relate to data integrity. Assurance comes in the form of a versioning system, and the complexity of the environments in which energy transportation and distribution companies operate means that version tracking must be automated; manual tracking is simply not a viable option.

Funding The Investment
Automation systems can require both significant capital investment to build and ongoing operating expenses to maintain the infrastructure.

An operations management system may be delivered as a traditional software package running on an organization’s own server infrastructure, or as Software as a Service (SaaS) that takes advantage of secure data management via cloud computing. Built on cloud computing foundations, the applications reside within the cloud infrastructure and are provided as a service.

Purchasing operations management tools as a service avoids drawing on capital budgets and specifically circumvents issues of funding across assets. Collaborating in this fashion is possible within the constraints of operating budgets using a fund-as-built model. In most cases, the cost of the tools required to gain efficiency benefits is significantly lower than the cost of building traditional systems.

Utilizing cloud computing to share operating condition data creates an environment for operational improvement. ‘Private clouds’ leveraging the same technologies can be deployed if an enterprise has considerations – such as regulatory or investor compliance – that require all of the data to remain within the enterprise.

Implementing operations management as a cloud solution means that operators are no longer the only ones who can view and act upon asset data. An effective operations management solution enables the sharing of information with anyone in an organization who has permission to access it, effectively empowering key stakeholders at every level of the organization.
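
A minimal sketch of such permission-based sharing might look like the following; the role and dashboard names are illustrative, and in practice this enforcement would live in the cloud platform’s identity and access layer.

    # Hypothetical mapping of dashboards to the roles allowed to view them.
    DASHBOARD_PERMISSIONS = {
        "station_overview":  {"operator", "engineer", "manager", "executive"},
        "asset_diagnostics": {"engineer", "manager"},
        "financial_summary": {"manager", "executive"},
    }

    def can_view(role: str, dashboard: str) -> bool:
        """True if the given role has permission to access the dashboard."""
        return role in DASHBOARD_PERMISSIONS.get(dashboard, set())

    print(can_view("engineer", "asset_diagnostics"))   # True
    print(can_view("operator", "financial_summary"))   # False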

Operators, engineers, managers and executives all have access to dashboards in the cloud, creating a system of checks and balances that ensures no critical detail will be overlooked. An important alarm missed by an operator can be flagged by an engineer who has access to the same information. Production intelligence passes through multiple sets of eyes, meaning that one person’s oversight does not impact the entire operation.

Finding A Path Forward

By moving the supervisory horizon and leveraging cloud technology, best-in-class organizations have the ability to visualize their operations with live data specific to the role of the individual working the task. As a result, these organizations benefit from an accelerated ability to find problems, identify the path to a solution and ultimately reduce downtime.

The foundation for achieving these goals begins with properly planning, designing and applying automation systems. Applying those systems in a way that reduces operator fatigue and empowers operators to perform the required tasks is the beginning of the path toward improved system reliability. Driving to world-class performance standards requires moving the supervisory horizon and establishing an environment that allows subject matter experts, maintenance personnel and operations personnel to collaborate, utilizing the data available from the asset. Cloud computing should be considered as a means of achieving these goals quickly.

Bob Ell is regional director for Oil & Gas Business Development for Rockwell Automation.
