Obstacles To Value: The Five Failure Modes Of Advanced Analytics

--

The following five data analytics challenges emerged as common themes that projects must overcome to avoid failure and deliver on their potential.


We frequently hear about advanced analytics (AA) success stories. There are high expectations — McKinsey predicts AA and AI will deliver between $9.5 trillion and $15.4 trillion in annual economic value — so it is only natural that many will want to shine a spotlight on progress whenever possible.

However, practitioners will be all too aware that it’s not all success in advanced analytics. For every impressive case study or exciting headline, there are dozens of projects that have failed to deliver on their potential. The exploratory, counterintuitive and technical nature of advanced analytics projects is typically cited as the reason for the challenges each project faces. What else lies behind these failures?

Over the last few months we have interrogated our own experience and spoken with advanced analytics leaders and practitioners across multiple industries, asking what causes analytics projects to fail. The following five cross-cutting challenges, agnostic to industry, emerged as common themes that projects must overcome to deliver their full potential.

1. PROBLEM — Ill-defined challenges

Poor problem definition is a significant challenge that analytics teams face. It is often very difficult to break down the broad challenge facing an organisation into solvable segments, and even more difficult to assess which segments will deliver the most impact if solved.

So what are the consequences for analytics projects when they focus on the wrong problem — or at least the wrong aspect of the right problem? When this issue manifests itself, projects end up:

  • Not addressing a clear business need
  • Misaligned with the overall business strategy
  • Lacking a clear path to delivering return on investment
  • Disconnected from the true drivers of business success
  • Focused on what was interesting instead of what delivered the most impact

2. DATA — Low quality, inconsistent or absent data

Any model is only as strong as the data it relies on. However, sourcing the right data — and enough of it — can prove difficult. Failure is likely to occur when:

  • The required data doesn’t exist
  • Data quality is not sufficient to proceed (see the quality-gate sketch after this list)
  • The project team does not have access to the necessary data
  • The data is too expensive to access
  • Data engineering is too expensive or time consuming to make the data usable
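
Several of these issues can be caught before a single model is built. As a minimal sketch, assuming a tabular extract loaded with pandas (the file name, column names and thresholds are illustrative, not drawn from any specific project), a lightweight quality gate can flag absent columns, excessive missingness and duplicated records while they are still cheap to fix:

```python
import pandas as pd

# Hypothetical thresholds; each project should set its own bars.
MAX_MISSING_FRACTION = 0.05    # tolerate at most 5% missing values per column
MAX_DUPLICATE_FRACTION = 0.01  # tolerate at most 1% fully duplicated rows

def quality_gate(df: pd.DataFrame, required_columns: list) -> list:
    """Return a list of human-readable issues; an empty list means the gate passed."""
    issues = []

    # Absent data: required fields missing from the extract entirely.
    for col in required_columns:
        if col not in df.columns:
            issues.append(f"required column missing: {col}")

    # Low-quality data: excessive missingness per column.
    for col, frac in df.isna().mean().items():
        if frac > MAX_MISSING_FRACTION:
            issues.append(f"{col}: {frac:.1%} missing")

    # Inconsistent data: duplicated records inflating apparent volume.
    dup_frac = df.duplicated().mean()
    if dup_frac > MAX_DUPLICATE_FRACTION:
        issues.append(f"{dup_frac:.1%} of rows are duplicates")

    return issues

# "extract.csv" and the column names are placeholders for a real source.
issues = quality_gate(pd.read_csv("extract.csv"), ["customer_id", "event_date"])
if issues:
    raise ValueError("data not sufficient to proceed:\n" + "\n".join(issues))
```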

3. EXECUTION — Technical approach misaligned with the problem at hand

Unfortunately, identifying the right business problem for analytics to address and having the data required to address it is not enough to build a model that delivers business outcomes. Even when you get the first two steps right, a team may fail to complete a working model due to:

  • A lack of suitable technical talent or domain experts involved throughout the process
  • Over-scoping the project and trying to achieve too much at once
  • Using the wrong technology, algorithm or approach when developing the solution
  • Not building a model accurate enough to be predictive (see the baseline check after this list)
  • Insufficient resources available to deliver to the quality or scope necessary to make impact
  • Delivery taking longer than anticipated, leaving insufficient budget to complete the model
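
On the accuracy point, one inexpensive safeguard is to benchmark every candidate model against a naive baseline before investing further effort. A minimal sketch, assuming a scikit-learn workflow; the synthetic dataset and the minimum-lift threshold below are illustrative stand-ins for a real business case:

```python
from sklearn.datasets import make_classification
from sklearn.dummy import DummyClassifier
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a real business dataset (illustrative only).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# A model is only "predictive" if it clearly beats the naive baseline.
baseline = cross_val_score(DummyClassifier(strategy="most_frequent"), X, y, cv=5).mean()
candidate = cross_val_score(GradientBoostingClassifier(random_state=0), X, y, cv=5).mean()

print(f"baseline accuracy: {baseline:.3f}, candidate accuracy: {candidate:.3f}")
if candidate - baseline < 0.05:  # hypothetical minimum lift; set per business case
    print("insufficient lift over baseline; revisit data, features or approach")
```

If the candidate barely beats the baseline, the failure usually lies upstream, in the problem framing or the data, rather than in the choice of algorithm.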

4. LAUNCH — Failure to account for the human element

Even if you deliver a working model, it may still fail if the intended users do not adopt it, or if it is not integrated into an existing technical or business process. While technology integration poses problems, user adoption is a much greater cause of failure in analytics projects. The best data science and most well-constructed models will deliver little impact if they are not easy to use and deployed to augment human decisions. Adoption and usability failures arise when:

  • Intended users are not engaged or actively resist adopting interventions
  • Operating procedures and incentives do not encourage users to incorporate the model into their ongoing behaviour
  • The model’s interface or interaction design is too difficult to use
  • The solution does not integrate easily into the existing technology stack or infrastructure, or the organisation lacks the necessary capabilities, such as data warehousing, cloud processing and storage

5. INDEPENDENCE — The “one-off” trap

While a model may thrive once initially adopted, it can still falter over the long term, whether through a lack of internal support or because it was never adapted after a significant change in the organisation it was built for. Long-term failure modes include:

  • Failure to adapt the model to align with changes in the organisation’s needs, business strategy or objectives
  • Model performance deterioration over time due to changes in environment, patterns or behaviours (see the drift-monitoring sketch after this list)
  • Not enough technical support to adjust for problems in the data pipeline, changes in source systems or APIs, etc.
  • Lack of long-term adoption, with end users reverting to old ways of working, creating new workarounds or using the system suboptimally
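
Several of these long-term failures are detectable before end users notice them. As a minimal sketch (the distributions and alert threshold are hypothetical), a scheduled job can compare a feature’s live distribution against its training distribution with a two-sample Kolmogorov-Smirnov test and alert the support team when drift appears:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

# Stand-ins for one feature as seen at training time vs. in production today.
training_values = rng.normal(loc=0.0, scale=1.0, size=5000)
production_values = rng.normal(loc=0.4, scale=1.2, size=1000)  # drifted

# A small p-value suggests the live distribution no longer matches
# what the model was trained on.
statistic, p_value = ks_2samp(training_values, production_values)

ALERT_P_VALUE = 0.01  # hypothetical threshold; tune to tolerate routine noise
if p_value < ALERT_P_VALUE:
    print(f"drift detected (KS statistic {statistic:.3f}, p-value {p_value:.2g}); "
          "retraining or pipeline review may be needed")
```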

While organisations around the world are at different stages of their analytics journeys, we have seen an overall growing maturity throughout the advanced analytics industry. The technical talent, accessibility of appropriate data and the thinking behind a model’s concept are often initially sound, so factors that contribute towards the problem, data and execution failure modes are generally less prevalent than they were even two years ago. The launch and independence phases highlighted above represent the latest bottleneck for many advanced analytics projects and largely depend on user adoption. Data science alone cannot address this issue. In our experience, human-centred design, beyond interface and data visualisation, plays an integral role in clearing this bottleneck and ensuring analytics projects deliver their full impact.

Integrating design from a project’s outset, as early as the problem definition phase, begins the change management process earlier and gets ahead of many of the issues that lead to failure. Unmet needs are considered and potential user problems are discovered far sooner. As one seasoned analytics executive put it, “Fifty percent of the impact from analytics is how good your model is, the other fifty percent is user adoption. One without the other doesn’t get you anywhere worthwhile.”

In an upcoming article we will explore why addressing this understanding gap requires practitioners from across data science and design to collaborate effectively. Ensuring that solutions are not just technically feasible but widely adopted is key in driving maximum value from advanced analytics, particularly at a time when the world is in such need of its problem-solving and predictive prowess.

Authored by Maksud Ibrahimov, Jr Principal Data Scientist, QuantumBlack, Melbourne; Dan Feldman, Design Director, McKinsey, Sydney; Justin Hevey, Expert Designer, McKinsey, Sydney; Cris Cunha, Analytics Expert Associate Partner, QuantumBlack, Perth

--

QuantumBlack, AI by McKinsey

An advanced analytics firm operating at the intersection of strategy, technology and design.