Updated: Nov 27, 2019
How do you get to the prize?
So you have decided to start a Digital Transformation Program for your brownfield asset, with the end goals of decreasing unplanned downtime, reducing costs through increased efficiency, enhancing your revenues and improving your safety record.
In addition, you must ensure that any plan is compliant at both Enterprise and Government levels and is able to mitigate environmental risk.
This will be a challenge, but it is doable, subject to budget constraints as well as alignment and willingness to change at both corporate and operational levels. We at Digital Plant Specialists (DPS) are tasked by our clients to help them achieve these goals, and as such, this paper outlines our phased approach to ensuring that not only is the implementation a success, but that the company meets the goals it has set, including a rapid ROI.
Phase 1 – Where are you now and where do you want to be at the end of this process?
A recent article in the Harvard Business Review by Mike Sutcliff, the Group CEO of Accenture Digital, notes that “If top managers aren’t on the same page, it makes it difficult for their direct reports to agree on what to prioritize and how to measure progress. The remedy: Define and articulate not only the opportunity but also the problem it solves, and how the company will build the organization around the desired solution before investing.” In other words, if you are not in alignment from the top down, the chance of project success is very low.
To address this, all our DT4.0 projects begin with a series of workshops to define the key elements through a gap analysis. Key topics discussed include:
What are your goals regarding DT4.0?
What are your Key Process Models at an Operational, Corporate and Governance level?
What are the KPIs driving your business and what needs to be captured and visualized?
Where is all your legacy and near/real time data?
What data is important for improving quality, safety and performance?
What data model/hierarchy/standards are going to apply?
How do I handle “Big Data”?
What is your budget? A tricky question that needs to be determined up front to ensure that the final bespoke solution brings an ROI within an agreed time frame.
The value of running this phase is that you start the project with an aligned and focused team, with SMART goals, standards and budgetary plans that will drive the future success of the investment.
Phase 2 – Capturing the history and genesis of the asset
I am sure that everyone has seen the cover of the Economist from May 6th, 2017, with the headline “The world’s most valuable resource”, which is no longer oil but data. The dive into the digital pool does not begin with placing condition monitoring sensors on all of your critical and non-critical process circuits and equipment, nor is it about purchasing ground-breaking software, at least not at this stage.
The most critical element, and one that tends to get forgotten, is the enormous amount of data you already have in an existing operating plant. This is the history of the successes and failures, modifications and maintenance that have gone into getting the plant where it is today. Company data comes in many forms: static and dynamic, structured and unstructured, 1D, 2D, 3D, and the list goes on. If you are serious about preparing to go “live” with a predictive analytical outcome, this data will build the training database for your machine learning, smart-algorithm, AI-driven end goal.
This in itself can seem a daunting task; in the past it would have taken person-years’ worth of work to collate, verify, document and source all forms of data into a centralised database, cross-referenced and accessible to all who require it.
Today, whilst it is still a daunting task, by exploiting the latest machine learning, natural language processing, computer vision and AI, there is a solution that will take months to accomplish rather than years. Human interaction is still critical for validation and for creating the initial training data sets for the smart algorithms to learn from, but in the end you will have a validated EAM with all documents and data linked through metadata tags in a rapidly accessible data lake format. The value from this phase is that you now have a single source of truth, accurately representing your asset and accessible to all levels of the company. This is critical both for prescriptive analysis, such as RBI and FMECA studies, and for the predictive analytics to be implemented in the final stages of the DT4.0 journey.
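To make the metadata-tagging idea concrete, here is a minimal sketch of how legacy documents might be auto-tagged so they can be cross-referenced in a central store. The document IDs, tag names and keyword rules are illustrative assumptions, not a real client taxonomy, and a production system would use trained NLP models rather than keyword matching:

```python
# Minimal sketch: auto-tagging legacy documents with metadata tags.
# Tags and keyword rules are illustrative assumptions only.

TAG_RULES = {
    "maintenance": ["work order", "repair", "overhaul"],
    "safety": ["incident", "permit", "hazard"],
    "design": ["datasheet", "p&id", "isometric"],
}

def tag_document(text):
    """Return the set of metadata tags whose keywords appear in the text."""
    lowered = text.lower()
    return {tag for tag, keywords in TAG_RULES.items()
            if any(keyword in lowered for keyword in keywords)}

def build_index(documents):
    """Map each document id to its tags, forming a searchable metadata index."""
    return {doc_id: tag_document(text) for doc_id, text in documents.items()}

docs = {
    "DOC-001": "Work order: pump P-101 impeller repair completed.",
    "DOC-002": "Hazard review and hot-work permit for exchanger E-204.",
}
index = build_index(docs)
```

The human-in-the-loop validation described above would review and correct these machine-proposed tags, with the corrections feeding back as training data.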
Phase 3 – Visualising where you are today!
So you have a validated and truthful database, but to get the best out of it, you should build a coupled 3D static model of your asset. With the advances in 3D modelling, we are now able to create an interactive and accurate model of the asset which can be interrogated down to an individual equipment piece or circuit level. By coupling this model with the Phase 2 data lake, anyone with the correct permissions can not only bring up the history of an individual item but also access the technical specifications, plans and safety records, as shown below:
The model is created by combining the current 3D model data with new surveying and 3D laser scanning, ensuring the model is no longer just “as built” but is “as is today”.
The power of visualisation can provide potential ‘step-change’ productivity enhancements by replacing slow and costly manual processes spread across multiple systems. As an example, traditional analytical techniques focusing on safety, such as RBI and FMECA, generate results using as much data as was available and report in a spreadsheet format. With a fully coupled 3D static model, your results can be linked and displayed in relative space, giving them more meaning and impact when planning ahead.
Source: Antea’s Palladio Asset Visualisation Software – anteash.com
With this 3D visualisation capability and a coupled model, decision making is optimised by getting the right data to the right people at the right time, driving real transformation.
Phase 4 – Looking into the future
One of the main goals for the DT4.0 journey is to reach a stage whereby you are able to analyse all of your data, both historical (Phase 2) and real/near time, to predict outcomes in the future. These outcomes or objectives may include:
Predictive Spare Parts / Supplies Management
Predictive Consequence Modeling
Linked Models between you and your customer
Predictive Mud Weights for well planning
3D Hazard mapping and modelling
As we have already discussed, predictive analytics, by definition, uses historical data to predict future events. Typically, historical data is used to build a mathematical model that captures important trends. That predictive model is then used on current data to predict what will happen next, or to suggest actions to take for optimal outcomes.
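The build-on-history, predict-the-next-step idea described above can be sketched in a few lines. The sensor readings, the linear-trend model and the one-step forecast are illustrative assumptions; real predictive models would use far richer algorithms and data:

```python
# Sketch of predictive analytics in miniature: fit a trend to historical
# sensor readings, then extrapolate to forecast the next value.
# The vibration readings below are invented for illustration.

def fit_linear_trend(values):
    """Ordinary least-squares fit of y = a + b*t over t = 0..n-1."""
    n = len(values)
    t_mean = (n - 1) / 2
    y_mean = sum(values) / n
    numerator = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(values))
    denominator = sum((t - t_mean) ** 2 for t in range(n))
    b = numerator / denominator
    a = y_mean - b * t_mean
    return a, b

def forecast(values, steps_ahead=1):
    """Extrapolate the fitted trend steps_ahead beyond the last reading."""
    a, b = fit_linear_trend(values)
    return a + b * (len(values) - 1 + steps_ahead)

vibration = [2.0, 2.1, 2.3, 2.4, 2.6]   # mm/s, historical readings
next_reading = forecast(vibration)       # predicted next reading
```

The same pattern scales up: the historical data set trains the model, and current data is fed through it to suggest what happens next.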
For predictive maintenance to be carried out on an industrial asset, the following base components are required:
Sensors – real/near time data-collecting sensors installed in the physical product or machine
Data communication – the communication system that allows data to securely flow between the monitored asset and the central data store
Central data store – the central data hub in which asset data (from OT systems), and business data (from IT systems) are stored, processed and analyzed; either on-premise or on-cloud
Predictive analytics – predictive analytics algorithms applied to the aggregated data to recognize patterns and generate insights in the form of dashboards and alerts
Root cause analysis – data analysis tools used by maintenance and process engineers to investigate the insights and determine the corrective action to be performed
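As a toy illustration of how the components above chain together, the following sketch ingests sensor readings into a central store, scans for anomalous values, and raises alerts for engineers to investigate. The tag names, readings and simple threshold rule are assumptions for illustration, not a specific product’s behaviour:

```python
# Illustrative chain: sensor feed -> central data store -> analytics -> alert.
# Tag names, readings and the threshold rule are invented for this sketch.

class CentralDataStore:
    """Toy central hub holding time-ordered readings per asset tag."""
    def __init__(self):
        self.readings = {}

    def ingest(self, tag, value):
        self.readings.setdefault(tag, []).append(value)

def detect_alerts(store, limits):
    """Flag tags whose latest reading exceeds its configured limit."""
    alerts = []
    for tag, values in store.readings.items():
        limit = limits.get(tag)
        if limit is not None and values and values[-1] > limit:
            alerts.append((tag, values[-1]))
    return alerts

store = CentralDataStore()
for value in (71.0, 74.5, 82.3):        # simulated temperature feed
    store.ingest("PUMP-101/temp", value)
alerts = detect_alerts(store, {"PUMP-101/temp": 80.0})
```

In a real deployment the threshold rule would be replaced by the predictive analytics layer, and each alert would feed the root cause analysis step.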
All of these steps are unique to each company implementing the solution, as the operations team’s knowledge of the plant will guide the sensor locations, and on-the-floor insights from the hands-on team are just as valuable (and often overlooked) as the legacy data verified and cleaned up in Phase 2.
Many of our clients already have some form of smart sensor system, communications systems and protocols, and alert control panels in place, and these should be integrated into any final package to minimise cost wherever possible. Your central data store could be your EAM or a bespoke solution, onsite or cloud based, linked to all the data repositories currently in place, again to minimise the project spend. These steps should be carried out up front, in Phase 1, when the goals, KPIs, budgets and deliverables are being agreed by executive, operational and IT management.
So now for the smart algorithms that drive the predictions that, if implemented correctly, will optimise the plant processes. As we have mentioned in a separate paper, machine learning is, in essence, an algorithm that, with the help of an accurate historical data set, will learn trends, seek patterns and begin to collate these legacy insights with real-time or near-time data, enabling a predictive model to be built.
Our partners at Vroc Ai, one of the world’s leading predictive analytics companies, highlight two key learnings any company must take into consideration:
Firstly, during the “learning” phase of the process, interaction with the client’s technical team is essential to ensure that their knowledge of the asset, in combination with the legacy data, is captured.
Secondly, real-time data is excellent, but running a prediction is all about the future as opposed to the now. Live data is a key input, but there is always a time lag before the analytical results are available. Massive real-time data communications links, which are very costly and in some cases charged to the client by the MB, need to be planned up front, noting which key data needs to be live and which data can be near time. We have seen clients charged tens of thousands of dollars for a single hour’s data transfer when streaming live. Be aware that alarm alerts need this up-to-the-second data, but most of the information can be sent in bursts at regular intervals, thus saving costs.
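The burst-versus-live trade-off described above can be sketched as a small uplink class: alarm-level values are transmitted immediately, while routine readings are buffered and sent in periodic batches, cutting the number of (costly) transmissions. The alarm threshold and batch size are illustrative assumptions:

```python
# Sketch of burst-vs-live data transmission. Alarm readings go out at once;
# routine readings are batched. Threshold and batch size are assumptions.

class DataUplink:
    """Buffers routine readings; sends alarms immediately, batches the rest."""
    def __init__(self, alarm_threshold, batch_size):
        self.alarm_threshold = alarm_threshold
        self.batch_size = batch_size
        self.buffer = []
        self.transmissions = 0   # each send costs one communications call

    def send(self, payload):
        self.transmissions += 1
        return payload

    def record(self, value):
        if value >= self.alarm_threshold:
            self.send([value])            # alarms need up-to-the-second delivery
        else:
            self.buffer.append(value)
            if len(self.buffer) >= self.batch_size:
                self.send(self.buffer)    # one call covers a whole batch
                self.buffer = []

uplink = DataUplink(alarm_threshold=90.0, batch_size=4)
for reading in (10, 20, 95, 30, 40):
    uplink.record(reading)
# The 95 goes out immediately; 10, 20, 30, 40 fill one batch: 2 sends in total
```

Five readings cost only two transmissions here, which is the cost-saving the burst approach is after.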
Phase 5 – Visualise to create Action
Predictive Data analytics has proven its worth time and again by helping businesses examine structured and unstructured datasets and extract useful information so key stakeholders can make more-informed, more effective decisions.
However, it can only do so much. Endless columns and rows of alphanumeric data can be difficult to digest at scale. Depending on the level of detail that stakeholders need to draw actionable conclusions, as well as the need to interact with or drill down into the data, traditional data analytics might not be sufficient for businesses to excel in today’s competitive marketplace. Additional tools are needed to extract more timely, more nuanced and more interactive insights than data analysis alone can provide.
As such, data visualisation is critical to create actionable results based on the predictive analytics methodologies implemented in this paper. Data visualization takes the results of the queries and computations of data analysis and puts them into a more dynamic and human-friendly format. It summarizes and delivers complex ideas, correlations of intricate relationships, and the results of multiple tiers of variables to those that need them.
Data visualization allows real-time interaction, allowing users to drill down into the minute details of a chart right on their computers and mobile devices.
This vertical interaction allows stakeholders to select different data sets, view the results of different filters, and otherwise fine-tune their view of the data to answer specific questions they have in the moment.
Visually compelling images of data visualization, like charts, graphs, gauges, and maps, help businesses understand the story behind trends and statistics much more easily. They can often reveal patterns, trends, and correlations that would easily go undetected otherwise. We have already seen this in Phase 3 of this workflow, where your static 3D model can be used to highlight RBI, FMECA, component histories and so on. However, with real-time data and the predictive analytical information at hand, we can scale this up to create the “Digital Twin”: a 3D model that shows you where you are today and allows you to add predictive data to manage the future without disruption to ongoing operations.
So who will benefit?
Executive Management: Ability to monitor and influence key corporate performance indicators that matter to your company stakeholders at all levels
Operational Management: New operational data feeds into production and planning models, dictating pivotal strategic insights, recommendations and road maps.
Field Services/Technicians: Digital twins are leveraged to continuously aid in rapid monitoring for critical equipment while reducing downtime (planned and unplanned) and enable service-based business models.
Design: The real time 3D operational data and modelling streamlines design and testing costs in prototyping product stages.
Marketing & Sales: Equipped with knowledge of customers’ preferences and actual usage of their products, teams can tailor messaging to drive revenue.
Product Managers: Improve product insights and PLM systems in-place with digital twin integrations accelerating time-to-market.
IT Managers: Ability to build both cloud-based and local data transfer systems, enabling all pertinent data to be accessible by the right people at the right time – desk-based, control-room-based and tablet-based access
Source - Vroc Ai
Source - Antea's Palladio Suite
In summary, a properly scoped and executed digital transition program, utilising a phased approach to implementation, will bring significant outcomes to your current asset portfolio including:
• Cost Reduction through efficiency
• Revenue Enhancement through optimised asset utilisation
• Safety with predictive monitoring and human factors
• Compliance at Enterprise and Governmental levels
• Innovation by creating new business models
By combining our deep technical engineering expertise with class-leading partners, including Antea, Vroc and Cenozai, DPS is able to offer a bespoke Digital Transition workflow, from data to digital twin, in line with your desire to embrace Industry 4.0 through a proven phased approach.