The Journey to Autonomy - Part 2 - From Experience to Prescriptions

Read time: ~15min

In this series we will explore some potential paths from present day to autonomous operations, and the challenges and requirements of achieving autonomy(X).

 

Introduction

Autonomy at scale promises to unlock real increases in efficiency as we learn from and adapt operations and maintenance to the ever-changing geology and physical environment at mine sites. The adoption of autonomous equipment is only one part of the equation. With ever-increasing economic, environmental and social pressures, and rapidly falling levels of experience on site, autonomy in mining has quickly moved from a nice-to-have to a necessity for scaling and transformation. When faced with needing to do more with less at lower impact, autonomy will help scale productivity, reduce human exposure to hazardous environments, optimise resource extraction and minimise environmental impact. Achieving meaningful autonomy at scale, however, will require system-level changes to how we work.

Autonomous operations are idealised for many reasons, principally because we know there are lots of mistakes that are preventable, and we know we need to change as society and our incoming workforce changes. In other words, we know that we can improve how decisions are made, and we know that we will have to change how decisions are made as experience leaves our industry.

Autonomous operations need not only mean unstaffed, however, and we have previously proposed autonomy(X) as a proximate objective, where X = the number of years during which a majority of decisions are still made by humans.

This period of X we call the journey to autonomy. In this article, we describe the 5 major stages of the journey, and provide some context and examples to illustrate its challenges and enablers. We also show that the journey is unlikely to be linear and staged; organisations, and departments within organisations, will occupy multiple stages of the journey at any one time, resulting in sometimes radically disparate and even conflicting methods of decision-making.

The journey to autonomy

We define the journey to autonomy as being a spectrum that spans 5 stages, and as the journey progresses, the primacy and purpose of each stage changes too:

  • Experience

  • Rules

  • Analytics

  • Predictions

  • Prescriptions

Many organisations already encompass all 5 of these stages, to some extent or other, across their departments and operations.

In other words, the journey to autonomy is already, and will continue to be, characterised by co-existence, with localised replacement, widespread variation and continuous change.

From conventional to X

Working forwards from a conventional operation, let us examine what the idealised journey looks like, and some of the major challenges to be overcome.

The first step for many organisations is to invest in analytics and predictive models.

Today

Figure: bar graph illustrating the current emphasis on Experience and Rules over Analytics, Predictions and Prescriptions in today's operational decisions.

Today, we design, build, operate, maintain and upgrade our operations primarily by experience, supported by rule-driven systems (systems = processes, procedures, software, TARPs, etc). We hire people with specific experience to do specific roles, and we trust them to follow hard-coded rules. Analytics is present in nearly every part of our operations but is primarily used in arrears and on long feedback cycles to inform future investment or system decisions. Predictions occur inside some systems, and pilots are widespread as we try to understand how predictions can better inform experience. Where a prediction clashes with a rule, we follow the rule, and we consider how analytics can help improve predictions to reduce clashes. Prescriptions exist inside very strictly controlled systems, typically equipment control systems with highly constrained degrees of freedom.
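To make that precedence concrete, here is a minimal sketch, with an invented tag, thresholds and predictor, of a TARP-style rule being evaluated ahead of a model's suggestion; where the two clash, the rule wins.

```python
# Toy sketch: a hard-coded TARP-style rule takes precedence over a model prediction.
# The threshold, tag name and the predictor itself are hypothetical placeholders.

TARP_TRIGGER_C = 85.0  # bearing temperature at which the rule says "stop and inspect"

def predicted_action(bearing_temp_c: float) -> str:
    """Stand-in for a predictive model's suggestion."""
    return "continue" if bearing_temp_c < 95.0 else "stop"

def decide(bearing_temp_c: float) -> str:
    # The rule is evaluated first; where prediction and rule clash, the rule wins.
    if bearing_temp_c >= TARP_TRIGGER_C:
        return "stop"  # rule-driven outcome, regardless of the prediction
    return predicted_action(bearing_temp_c)

print(decide(88.0))  # -> "stop": the rule fires even though the model would continue
```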

Tomorrow

Figure: bar graph showing the anticipated shift towards greater reliance on Analytics and Predictions in future operations.

The rise of data science and ML drives increases in Analytics and Predictions. Often, these investments create siloed or local benefits in the short term, but over time their efficacy increases and correlates more closely with Experience. There are things we “know” (believe) to be true but cannot prove, and in the absence of proof, we create rules to avoid risk by increasing certainty. Predictions that align with beliefs (experience) are perceived as the most valuable, especially while experience is still the primary driver of decisions. Prescriptions remain tightly controlled, especially as many predictions are proved nonsensical or only narrowly true, based as they are on limited/narrow data models within tightly controlled environments (control systems) or logically driven environments (value drivers).

Challenge 1 - Experience vs Data

A common first challenge is the disagreement between data-driven assessments and experience. While Analytics increases with continuing investment, the primacy of Experience and Rules does not change, as these are the conventional modes of decision-making and risk mitigation, respectively. The result is an ever-increasing number of data products that are not used, are used inefficiently, or are deliberately limited in scope to areas where they will not conflict with rules.

Figure: bar graph comparing used vs. unused Analytics in decision-making, highlighting the challenge of integrating data-driven approaches into operations.

While this often does not play out as a “problem” in day-to-day management, it is a symptom of one of the key underlying challenges in the journey to autonomy: experience-led vs data-driven. The experience-led view (read: the entirety of human history) is to organically evaluate data collected via the 5 senses and decide. The data-driven view is that if we measure and analyse extensive data sets objectively, we will make better decisions, because we can bring far more data to increasingly complex models in measurable and repeatable ways.

Conventional organisational design further reinforces the separation; maintenance managers are typically engineers or, better yet in the eyes of Experience, trade-qualified people who have “worked their way up”. How many times have you heard someone celebrate, “the new maintenance manager is a fitter who worked their way up, so we have some real experience”? No one says, “the new maintenance manager is a data scientist and has never been underground, so we’ve reduced our bias”. Similarly, the data team does not celebrate the fitter who has been moved sideways to oversee them as a stepping stone to a more senior role, and of course, we do this so that we can give them “more experience” before promoting them.

Data may be the new oil, but experience is the strata it is locked up in.

The natural inclination of “data” people is to increase the quantity and quality (a dangerous word… a bit like beauty, it does not imply improved outcomes) of data products, and to show that data >> experience. They also attempt to drill through the strata, via business process changes, to extract their oil and bring it to the surface.

The response from “experienced” people is to close ranks on risk, and on their personal experience, and to rightly point out that they are responsible for keeping people safe. We don’t overrule human front-line decision-makers when it could cause deaths. However, the journey to autonomy involves exactly this – machines and data models will have to make life-and-death and business-critical decisions. And the data view is that data >> experience, and that the correct use of data will reduce deaths and improve business outcomes. The more politically correct view is that both are critical, that we need to use both, and that more of each will create better results.

Our position on this is that there is a future, at X, where 51% of day-to-day decisions are made by machines. At every point along the journey to X, there are decisions made on the basis of experience that would be better made with data. There are also decisions for which data models and machines cannot yet deliver sufficiently high accuracy and low risk, and these should still be made by humans. At all stages, every decision that can be informed by data, and that can be measured in order to improve data and decisions, is a benefit.

The journey, therefore, is not a road to drive along, leaving each stop behind, but rather a series of challenges that one must confront and decide how to move past. The decisions made at each challenge determine which specific decisions are made by humans and which by machines, and the governance of both, at every given point in time. While there are types of decisions we should be making far more often than we do today, the autonomous world can only come about if the percentage of data-driven decisions rises relative to experience-led decisions. While digital transformation and predictive analytics focus on increasing the absolute number of data-driven decisions, we should also consider the other side of the coin: the reduction required in the percentage of Experience- and Rules-led decisions.

Reducing Experience & Rules - Analytics

Figure: bar graph illustrating reduced reliance on Experience and Rules in favour of increased use of Analytics in decision-making.

Stacking up more decisions does not necessarily increase positive outcomes. In certain situations, moving a decision from weekly to daily to intra-shift can be of great benefit; Short Interval Control (SIC) is the obvious example. When we move from averaged-across-long-time-periods to frequent and data-driven, we address issues that are hidden in the averages. There are 2 steps to embracing Analytics in this way:

  • visualising critical raw data in a way that supports rapid decision-making; &

  • transforming data to show the impacts or causes behind the raw data, to support improved decision-making.

Both of these steps can bring Experience and Rules into conflict, though in a low-level way. Where data clearly shows that “the rule” is sub-optimal, we must still follow the Rule. Management of Change can help us to break or change a Rule, but in the SIC example, we are not going to initiate MoC intra-shift. We simply don’t have time, and MoC is bound by its own Rules, one of which is: don’t try to change too quickly.
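Returning to the first of those two steps – surfacing critical raw data at short intervals – here is a minimal sketch (using pandas, with invented tag names, rates and thresholds) of how re-cutting the same tonnage data from a shift average into short intervals surfaces a problem the average hides.

```python
# Minimal sketch: the same tonnage data seen as a whole-shift average vs
# 30-minute intervals. Tag names, rates and the 80%-of-plan flag are invented.
import pandas as pd

idx = pd.date_range("2024-01-01 06:00", periods=24, freq="30min")
tph = pd.Series([900] * 18 + [300] * 6, index=idx)   # rate collapses late in shift

plan_rate = 900
shift_average = tph.mean()                           # 750 t/h: looks "close enough"
below_plan = tph[tph < 0.8 * plan_rate]              # SIC-style short-interval flag

print(f"Shift average: {shift_average:.0f} t/h")
print(below_plan)  # the last three hours stand out, hidden inside the shift average
```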

In order to maximise the use of Analytics, the first “parts” of Experience and Rules must be changed or set aside, through conscious choice. Individual decision-makers must choose to use Analytics over Experience, or choose to change or use a different Rule. The decision still resides inside the human head; it is not disaggregated yet, and the human is not receiving a “decision” from a machine – it might just be a dashboard.

For the human decision maker, the choice is clear: if the Analytics “looks” right, and if I feel that I should make a change based on this, and if I can make a change within the Rules, then I should.

It’s a long chain of causality and feelings, for a data-driven decision.
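Written literally, that chain is just a conditional in which every input is a judgement rather than a measurement; the names below are illustrative.

```python
# The human "algorithm" behind an Analytics-informed decision, written literally.
# Every input is a judgement call rather than a measurement; names are illustrative.
def should_act(analytics_looks_right: bool,
               i_feel_a_change_is_needed: bool,
               change_fits_within_the_rules: bool) -> bool:
    return (analytics_looks_right
            and i_feel_a_change_is_needed
            and change_fits_within_the_rules)

print(should_act(True, True, False))  # a single Rule objection stops the change
```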

As the decisions are still made in people's heads, based on an undocumented mix of Experience, Rules and Analytics, the link between Analytics accuracy and exposure and improved performance or increased/decreased risk is very uncertain. The feedback is, therefore, mostly qualitative, invented in arrears inside the decision-maker's head, and subject, of course, to incredible outcome bias. A “bad” outcome is much more likely to be attributed to “others”, while a “good” outcome is much more likely to be claimed by ourselves. Measuring outcomes, measuring Analytics investments, and leaving the middle dark and captured in monthly SLT PowerPoints leads to… making decisions on investments into Analytics on the basis of our… Experience…

At any given organisation, the humans present (especially in leadership), the culture, and the centralised/decentralised philosophy prevalent at the time (which will be expressed in Rules) make the Analytics chain of causality more or less likely to be utilised.

In other words, the impact of Analytics on decision-making is heavily based on the Experience of the leaders and decision-makers.

Reducing Experience & Rules - Predictions

Figure: bar graph showing the increasing role of Analytics and Predictions in decision-making as reliance on Experience and Rules reduces.

Let us now look at the more uncertain realm of Predictions.

Predictions occur when data models use historic data to learn patterns against which current data is compared, producing outputs that describe probable futures. Predictions are not only ML/AI; statistical methods still dominate prediction by weight of numbers, though GenAI may blow this all out of the water with its stream of black-box outputs. Regardless of source, and accepting that not all predictions are equal, the same dynamic as Analytics starts to play out. Because Predictions follow Analytics, and are created by analytical processes, the Analytics utilisation dynamic is still playing out as Predictions are introduced, and in practice, many users would struggle to draw a clear line between the 2 data products (which is not the point of this discussion).

The difference we highlight is that Predictions imply knowledge of probable futures; Analytics explicitly describe the past.
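A minimal sketch of that distinction, using an invented vibration trend and threshold: the analytic summarises what has already happened, while a simple statistical prediction extrapolates when the threshold will probably be crossed.

```python
# Minimal sketch: Analytics describe the past; a Prediction extrapolates a probable
# future. The data, threshold and units are invented for illustration.
import numpy as np

days = np.arange(30)
vibration = 2.0 + 0.05 * days + np.random.default_rng(0).normal(0, 0.05, 30)

# Analytics: a statement about what has already happened
print(f"Mean vibration over the last 30 days: {vibration.mean():.2f} mm/s")

# Prediction: a simple linear fit, extrapolated to estimate when the (invented)
# alarm threshold of 4.0 mm/s will probably be crossed
slope, intercept = np.polyfit(days, vibration, 1)
days_to_threshold = (4.0 - intercept) / slope
print(f"Projected threshold crossing ~{days_to_threshold:.0f} days from day 0")
```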

For the human decision-maker, this is the first real leap into trusting the decisions of a machine. When we speak of autonomous control of vehicles or equipment, trust is a critical factor. We establish trust by observing and testing, and we primarily reduce risk by segregation. At low levels (e.g. automated valves), the data model is sufficiently transparent that the measurement of triggers and the tuning of PID loops are robust and well validated. Autonomous control of certain valves is a reality today, as are predictive suggestions through control systems. It is for this reason that we show Predictions and Prescriptions as being present in a conventional operation.
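For readers less familiar with such loops, a stripped-down PID controller looks something like the sketch below (the gains, setpoint and toy process are arbitrary); the point is that the entire “model” is a few lines of inspectable arithmetic, which is precisely why it can be validated and trusted.

```python
# Stripped-down PID loop: the whole "model" is a few lines of inspectable arithmetic.
# Gains, setpoint and the toy tank process are arbitrary illustrations.

def pid_step(error, integral, prev_error, dt, kp=2.0, ki=0.5, kd=0.1):
    integral += error * dt
    derivative = (error - prev_error) / dt
    output = kp * error + ki * integral + kd * derivative
    return output, integral

setpoint, level, integral, prev_error = 50.0, 20.0, 0.0, 0.0
for _ in range(100):
    error = setpoint - level
    valve, integral = pid_step(error, integral, prev_error, dt=1.0)
    prev_error = error
    level += 0.1 * valve - 0.5   # toy tank: inflow via valve, constant outflow
print(round(level, 1))           # settles at the 50.0 setpoint
```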

The system is predicting that one outcome will follow another and feeding this to the human decision-maker. The decision-maker now faces an additional challenge – uncertainty. The Prediction is a data product, but there is now an element of uncertainty. How was this prediction arrived at? What is its probability? What happens if I don’t follow it? What happens if I do, and something goes wrong? These are all very good questions, and the answers change across situations, use cases, analytics and data. The final component to this uncertainty is perceived variability: the same system can produce a radically different Prediction for seemingly the same situation. Of course, to the data scientist, this is the crux and value of Prediction – it wasn’t based on broad-brush Experience; it was based on 20 data points, 12 of which the decision-maker can’t see or even ingest in a sensible way, and so yes, of course the Prediction is different in this situation – your perceived “same” situation is in fact wholly different and unknowable by you with the Mk 1 eyeball – that’s the whole point.

In other words, Predictions are inherently uncertain inputs to decision-making until they are so robust that they may as well be “certainties”, at which point they are Analytics or should be used as Prescriptions.

In future stages of Prediction, we will have siloed/narrow Predictions feeding use case/wide Analytics. In other words, the chain of causality will extend, with specialised machine predictions feeding broad machine analytics feeding other machine predictions feeding human decision makers.

The ultimate stage (envisioned so far) is chained machine Analytics and Predictions loops feeding machine (AI) decision-makers. In most cases, we expect a human will remain in this loop, but we cannot say with any certainty that this will always be true. If Human-in-the-loop is the Rule, however, and the human experiences sufficient uncertainty/variability, then their default will be to overrule the Prediction and follow the Rule.
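A minimal sketch of such a chain, including the Human-in-the-loop fallback to the Rule described above; every function, threshold and number here is an illustrative placeholder.

```python
# Illustrative chain: specialised machine Predictions feed a broader machine
# Analytic, which feeds a decision - with a Human-in-the-loop fallback to the Rule
# whenever perceived uncertainty is too high. Everything here is a placeholder.

def predict_wear(energy_kwh: float) -> tuple[float, float]:
    """Narrow prediction: (predicted liner wear in mm, model uncertainty)."""
    return 0.02 * energy_kwh, 0.3

def fleet_analytic(wear_predictions: list[float]) -> float:
    """Broader analytic: fraction of the fleet predicted past the wear limit."""
    return sum(w > 40.0 for w in wear_predictions) / len(wear_predictions)

def decide(fraction_worn: float, uncertainty: float, human_trusts_model: bool) -> str:
    if uncertainty > 0.5 or not human_trusts_model:
        return "follow the Rule (scheduled maintenance plan)"   # HITL fallback
    return "bring forward maintenance" if fraction_worn > 0.25 else "continue"

wears, unc = zip(*[predict_wear(kwh) for kwh in (1500, 2200, 2600)])
print(decide(fleet_analytic(list(wears)), max(unc), human_trusts_model=True))
```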

Data-driven loops constrained by Experience will be the norm for a great many years in a great many situations.

Before the advent of GenAI, most would have dismissed this as irrelevant; we’ll jump to Prescriptions, and AI will “know” what to do when the AI proves it is ready to do so (we had no idea how we’d get to this, but it was sufficiently and appropriately deus ex to allow us to move past it). This is no longer true. The advent of generalisable narrow AI is already leading to GPTs being used as the final cog in chains of data transformation, with one or many black-box causality models along the way. Human decision-makers rightly fear following a Prediction of this sort, and a Statutory role blindly following a Prediction of this sort is unthinkable (fortunately, our Experience and Rule-based systems prevent this).

The stages of Prediction can be thought of as:

  • Predictive input to human decision-maker

  • Machines feeding Predictions and Analytics to human decision-makers

  • Machines feeding Predictions and Analytics to machine decision-makers, with or without a Human-in-the-loop.

For the human decision-maker at the end of (or on top of) the chain, we can extend the rationale to: if the Prediction “looks” right, or if it looks wrong but I sufficiently trust the Prediction system, and if I feel that I should make a change based on this, and if I can make a change within the Rules, then I should.

The critical difference between “accepting” Analytics and Predictions as inputs to decisions is the temporal factor: Analytics can challenge our view of the past, and what “really happened”. Predictions challenge our projection of what the future will look like. We have Rules (and Plans) to tell us what should happen in the future, and we base our projections of what will happen on our Experience. Predictions are a major blow to both, because now we are receiving suggestions to knowingly make decisions that will conflict with a Plan, and compliance to plan might be the primary Rule.

Thus, the adoption of Predictions is primarily based on the human decision-maker being willing and able to accept Predictive-informed decisions.

The willingness is driven heavily by uncertainty and variability relative to Experience, while the ability is driven by Rules. Predictions that are counterintuitive to Experience and run counter to Rules are very unlikely to be adopted, regardless of potential value. In fact, in some corporate cultures, bias in the form of training models to produce expected Predictions is a very real risk, and measuring the adoption of expected Predictions could be a very misleading North Star. In the same way that high utilisation is dangerous without grade control, high utilisation of biased or “shallow” Predictions can delay the journey to autonomy by leading the humans down a path from which very significant change management will be needed to return. For some humans, it is likely to be the very evidence required to justify their rejection of Predictions in the first place.

The final stage - Prescription

Figure: bar graph illustrating the final stage of the journey to autonomy, where Predictions are codified into Prescriptions and reliance on Experience and Rules falls further.

The Prescription stage takes sufficiently accurate Predictions and codifies them as Rules – follow the Prescription unless you know (believe) it will cause unacceptable Risk. The psychological angst is thus much lower than in the Prediction stage. Here, the human has been told that they must-unless follow the Prescription, as a Rule, and that their Experience is to be used to prevent unintended Risk. This is actually the ideal situation for some, as decision-making accountability is now multi-threaded through machine and human, and the human making the decisions is rarely (ever??) accountable or responsible for the training or testing of the Prescription machine.
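A minimal sketch of that must-unless pattern, with hypothetical actions and an assumed override-logging step: the Prescription is followed by default, and Experience intervenes only where it sees unacceptable Risk, with the reason recorded.

```python
# Minimal sketch of a must-unless Prescription gate: the prescribed action is
# followed by default; a human override requires a recorded, risk-based reason.
# The actions, responses and logging step are hypothetical.

def resolve(prescribed_action: str, override_reason: str = "") -> str:
    if override_reason:  # Experience intervenes only for unacceptable Risk
        print(f"Override logged: {override_reason}")   # hypothetical governance record
        return "escalate to supervisor"
    return prescribed_action  # the Rule: follow the Prescription

print(resolve("reduce feed rate to 80%"))
print(resolve("defer crusher liner change",
              override_reason="audible knocking from primary crusher gearbox"))
```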

Interestingly, this may be the stage where net Rules begin to creep back up. Rules now need to come into effect to enforce Prescription use, to set actions for Prescriptive incidents, and to create sufficient must-unless governance. Logically, while Experience will still be present at this stage, the number of FTEs on site will be falling, even if it did not fall at earlier stages. Net years of Experience on site now decline, and with the challenges of bringing young people into the industry, it is likely that Experience per FTE is also falling, especially if X is 10 years or more into the future. Thus, at the moment the reins are being handed to Prescriptions, Experience may be falling faster than ever, and net new Rules are being created with which no one has any Experience. Experience is, of course, contextual. The number of years breathing is a poor proxy for most applications (except CVs), and the number of years spent designing or implementing Prescriptive Rules is likely to be very small, for some time (hint hint, ambitious people).

To a conventional operation today, Prescriptions without magic AI wands are very likely to be underutilised. To a conventional operation at X=10 that has not gone through the pain and uncertainty of the Analytics and Predictions stages, Prescriptions are likely to arrive as a welcome and highly dangerous panacea. The visionary maintenance data strategist is unlikely to know what that funny knocking sound from the primary crusher gearbox means, assuming they are even allowed, or inclined, to visit the site.

If the data model doesn’t include audio, then the tree falling in the forest truly makes no sound.

The value of Experience and Rules

The value and nature of Experience and Rules will change over the next X years. Whereas in a conventional operation, they are the primary methods of decision-making, recruiting, promoting, collaboration, training humans, change management and planning, in an autonomous operation, they are the primary methods of risk management, Prediction control, machine training and initiation of change.

The 25 years of experience walking up and down process plants, crawling over conveyors, squinting into the sun, and making decisions based on garbled two-way radio calls is valuable. In fact, in a world where young people do not want to “do their time”, and/or are not allowed to “get their hands dirty”, this experience might be irreplaceable. But replace it we must, because the Earth continues to turn, and many of those walking the conveyor today will not be here at X.

Without being ageist, there is no amount of 25-year-old enthusiasm that can replace 25 years of walking the talk. Equally, there is no amount of swinging off a shifter that can replace a solid education in statistical methods. The journey to autonomy relies on both, but it relies on both changing as well. It is also not just that we need both in different proportions (e.g. fewer shifters / more statistics), but rather that we need to transfer the critical components of Experience to Analytics and Predictions, and we need Analytics and Predictive products to augment Experience. In the middle, we need Rules to change: to increase the freedom to break other Rules on the basis of data, to force Experience to yield to the right data, and to force data and experience to collaborate to improve how decisions are made.

The setting and changing of Rules is likely to have the most dramatic influence on how long and how risky the journey to autonomy is.

Summary

The journey to autonomy is not just automating a haul truck and setting it off on its circuit for the day. Nor is it changing a process control system from PID to MPC to AI. Autonomous operations are idealised for many reasons, principally that we know many preventable mistakes are made (we know a lot more than outsiders do and are much harsher on ourselves), and we know we need to change as society and our incoming workforce changes. Autonomous operations need not only mean unstaffed, and we propose autonomy(X) as a useful proximate objective, where X = the number of years during which a majority of decisions are still made by humans.

The journey to autonomy encompasses 5 stages, and as the journey progresses, the primacy and purpose of each stage may change, too:

  • Experience

  • Rules

  • Analytics

  • Predictions

  • Prescriptions

Early in the journey, Analytics provides the confidence to embrace Predictions, and Predictions highlight the Rules that will need to change and the Experience that needs to be validated or invalidated, and Experience and Rules keep everyone safe.

Later in the journey, Experience provides the know-how to create more accurate and usable Predictions, and Analytics proves which Predictions can become Prescriptions, and Rules mediate the interplay. Experience encounters severe shocks, and some Rules are proven to cause unintended consequences, which can now be avoided if Experience allows it.

At the end of the journey, and the beginning of the next one, Prescriptions begin to overturn Experience, and Analytics shows that most static Rules are inhibitors, not risk mitigators, and probabilistic action on predicted risks becomes the new paradigm to explore.

The journey is not a fait accompli, and AI is not the panacea. Prediction accuracy is one element, and a critical one, but many of today's operations will not complete the journey to autonomy before their end of life. They will start, and great change will ensue, but they need to hit their numbers every quarter, and they need to reduce OPEX to help pay for the increased ESG, safety and compliance expectations they were never designed or organised for. The people in these mines are products of an environment that has worked and continues to work, and they need help to change the way they operate, because they will make 90% of the decisions that take that operation through to the end, and every worker's life will rely on those decisions.

Understanding and embracing the journey to autonomy will help organisations and decision-makers face the challenges and embrace the opportunities, and hopefully move the conversation beyond autonomous trucks.

Why did we write this?

Perhaps it wasn’t clear, or meant to be clear, but we mostly wrote this to create clarity for ourselves on one of the Whys of Geminum. We created Geminum to accelerate the journey to autonomy. We see digital twins and AI as being critical enablers of this high leverage, high uncertainty, highly dynamic period leading to X. We think there’s a virtuous circle to augmenting humans and improving machine decisions – when one flows into the other, we will see the real hybrid decision making era take off, and that’s the space we choose to play in.

We hope you enjoyed this article, and congrats on seeing it through to the end. You must actually care about this topic – kudos, we do too! Stay tuned for another one…
