If you manage or oversee change delivery projects, you’ll be familiar with RAG reports. But how much can you really rely on them?
A RAG status can alert you to a problem, but without real context and understanding of what drives the metrics, project managers still have to rely on their own analysis, anecdotal insight and gut feeling.
Not only does that create unnecessary stress, it leaves PMs with little option but to deliver subjective, inaccurate reports. Even worse, if the reporting tools you use to maximise business value are inaccurate and unclear, the resulting decisions are likely to be inefficient or just plain wrong.
Introducing AI helps. But XAI is the game changer.
What makes XAI so much better than AI alone?
Well, we learned this the hard way. When we developed Sharktower, we created AI reporting tools to help users make better decisions. But then we found that some users weren’t using the tools, so we asked them why.
The answer was pretty clear: they didn’t fully understand what these scores meant, and they didn’t know what actions to take to improve them.
Back to the drawing board.
Early versions of Sharktower’s AI tools didn’t explain the full picture.
We knew if we wanted our users to come on this AI journey with us, we’d have to give them more trust in the tools we were asking them to use. Especially as we were asking them to radically alter their existing toolkit (including RAG status).
So we introduced XAI, to build clarity and understanding, and to empower users to take informed action. And it worked.
Here’s why:
More certainty, less risk.
Explainable AI (XAI) removes the uncertainty of non-standardised reporting by highlighting and explaining complex correlations and other hidden factors, supporting confident decision-making and a truly focused response.
It gives an unbiased view that can be applied across an entire portfolio, subjecting all projects to the same ‘marking criteria’ and revealing each project’s true status.
Sharktower’s XAI goes beyond RAG reports, giving project managers context and focus.
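To make the idea concrete, here’s a minimal sketch of how feature attribution can answer “why is this project’s score what it is?”. This isn’t Sharktower’s implementation: the project metrics, the model and the use of the SHAP library are all assumptions chosen purely for illustration.

```python
# Illustrative sketch only -- not Sharktower's model or data schema.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical project metrics (names are made up for the example)
X = pd.DataFrame({
    "open_risks":        rng.integers(0, 20, 200),
    "overdue_tasks":     rng.integers(0, 50, 200),
    "scope_changes":     rng.integers(0, 10, 200),
    "team_turnover_pct": rng.uniform(0, 30, 200),
})
# Synthetic "delay risk" score, just so there is something to train on
y = 0.4 * X["overdue_tasks"] + 2.0 * X["scope_changes"] + rng.normal(0, 2, 200)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# SHAP values answer "why this score?" for a single project
explainer = shap.TreeExplainer(model)
contributions = explainer.shap_values(X.iloc[[0]])[0]

for feature, contribution in zip(X.columns, contributions):
    print(f"{feature:>18}: {contribution:+.2f}")
```

The output shows how much each input pushed one project’s score up or down: the kind of ‘why’ a RAG colour on its own never gives you.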
Explainable AI and the role of the Project Manager
XAI doesn’t replace project managers. It’s an additional resource to support decisions by providing the ‘Why?’ behind machine learning model results. These insights can encourage, challenge and validate project decisions, but the final call remains with the PM.
XAI gives PMs:
- Data-driven machine learning insights that complement professional opinion
- Interpretable models that take the stress out of ‘gut-feel’ decisions
- Innovative tools that leverage modern analytics and deliver real results
- Evidence-based intervention where it is most needed
- An understanding of how sensitive a score is to its inputs, and of the work required to materially change it and deliver business value (see the sketch below)
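On that last point, a simple what-if analysis shows the idea. Again, this is an assumption-laden sketch, not Sharktower’s model: a toy linear model with two made-up inputs, used only to show how sensitivity translates into “how much work would actually move this score?”.

```python
# Illustrative what-if sketch only -- inputs and model are assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Two hypothetical inputs: overdue tasks and scope changes
X = rng.uniform(0, 50, size=(200, 2))
y = 0.4 * X[:, 0] + 2.0 * X[:, 1] + rng.normal(0, 2, 200)  # synthetic risk score

model = LinearRegression().fit(X, y)

project = np.array([[30.0, 4.0]])   # one project's current inputs
baseline = model.predict(project)[0]

# Sensitivity: score change per overdue task, and the clean-up needed
# to lower the score by 5 points
per_task = model.coef_[0]
tasks_to_clear = 5.0 / per_task

print(f"baseline score: {baseline:.1f}")
print(f"each overdue task adds ~{per_task:.2f} to the score")
print(f"clearing ~{tasks_to_clear:.0f} overdue tasks would cut the score by 5 points")
```

The point isn’t the toy model; it’s that a PM can see not just the score, but how much effort it would take to materially shift it.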
Supporting the shift to a data-driven culture
You can’t build a data-driven culture with unreliable, arbitrary data. But machine learning and XAI help to build trust, confidence and communication thanks to:
- Explainable data: a wider range of users can understand why or how a conclusion was reached
- Transparency: access to the data safeguards against bias and reveals hidden project dependencies
- Justifiable decisions: users can see the case that supports a particular outcome
- Contestable information: users have the information they need to challenge a decision or classification
Tell us what you think
Are you using AI tools to drive your decisions? To see Sharktower’s XAI in action, REQUEST A DEMO or email info@sharktower.com