What are Data Maturity Assessments (DMA) and why should you use them?
Alex Leigh · Jan 8 · 5 min read
Introduction
Organisations talk endlessly about becoming “data‑driven”, but few consider the fundamental question underpinning that ambition: “What is the state of our data and the culture around it, and is it ready for everything we want to use it for?” The question goes unasked and unanswered, even though there is a tool purpose‑built for the job: the Data Maturity Assessment (DMA).
So why do we see so little priority given to these assessments? Reasons include a misunderstanding of their purpose, outputs that are never actioned, or the process being treated as a one‑off box‑ticking exercise. That’s before we get into political considerations, where the assessment “makes me look bad”, or consultancies deploying them to sell more of their own products and services.
Despite these hurdles, it’s reassuring to know that when done well, the assessment (and more importantly, what you do with its outputs) becomes a strategic asset that can shape culture, set priorities, and build a sustainable data capability.
So, let’s get into this. The goal is to make your DMA relevant, reusable, and compelling. Further, it needs to make the case for change and have lasting impact.
Make it about you.
Off‑the‑shelf maturity models may initially seem tempting. They’re neat, structured, and often comparable within one or more industries. However, in our experience, they’re also blunt instruments, because a generic model cannot reflect your organisation’s unique context. By context, we mean your strategy, constraints, culture, regulatory environment, and operational priorities.
Instead, you need to make the assessment feel authentic to your organisation. Assess what’s important in your context, and make pragmatic, organisationally aligned recommendations based on the findings.
It’s useful to think about this in four ways:
Use language authentic to your organisation. For example, if your teams don’t talk about “data stewardship” but do talk about “information ownership,” adapt the terminology. This may seem trivial, but if the assessment doesn’t pass the “not some generic thing” sniff test, you’ll lose vital engagement.
Reflect your operating model. A centralised data team will face different maturity challenges than a federated one. More widely, the assessment needs to reflect how data is actually used, which means being honest about how things are meant to work and how they are really organised.
Align with your organisation’s goals. If your organisation is prioritising automation, customer insight, or operational resilience, the assessment should explore those areas in greater depth.
Acknowledge your current issues. There’s no point assessing “AI readiness” if your data quality is inconsistent or your governance is invisible. And we’re seeing this a lot. The assessment should be a mandatory step before unleashing LLMs on your ungoverned data!
A tailored assessment doesn’t just measure maturity; it highlights the gaps where you need to improve. People engage more deeply and more often when the questions feel grounded in their world. If no one completes your survey, you’re not going to get very far!
Start where you mean to finish.
A maturity assessment is only as valuable as the decisions it informs. So before you run it for the first time, be crystal clear about its purpose by asking yourself these questions:
What decisions will this assessment help us make?
Who needs to see the results, and why?
What actions or investments might follow?
How will we prioritise improvements?
Without this clarity, you risk producing a glossy report that sits ignored on a shelf. With it, the assessment can act as a strategic data compass: an agreed set of data priorities that can guide roadmaps, shape capability development, and support and inform investment cases.
Publish and be damned!
Don’t think of this as producing a simple PDF; think of it instead as an opportunity to weave a compelling data story for your organisation. The way you communicate the findings determines whether people see the assessment as a pragmatic catalyst for change, or just that box‑ticking exercise.
Again, four points to help:
Know your audience. For example, top‑tier management expect a high‑level narrative, while operational teams need specifics.
Tone. Balance honesty with optimism. Highlight strengths as well as gaps.
Format. A slide deck, an interactive dashboard, a written report, or a short video update. Each of these has merit but should be tailored to the audience. Our experience suggests having something interactive really helps with engagement.
Timing. Align publication with organisational cadences, such as strategy refreshes, transformation programmes, or budget planning cycles.
The narrative should connect the assessment to what else is happening across the organisation. If you’re launching a new digital strategy, show how the maturity findings support it. If teams are struggling with data access, highlight how the assessment validates their experience and plots a doable path forward.
Sell the sizzle!
Visualisation is where the assessment comes alive. Done well, it transforms dry tables and abstract concepts into intuitive insights.
Here are some approaches we use. That’s not to say they are best practice; dovetail your visuals with what works for your organisation.
Radar charts showing capability across the organisation. Great for showing strengths and weaknesses across multiple dimensions at a glance, and they make comparisons over time easy and visually striking (see the sketch after this list).
Heatmaps to add detail. Assuming different departments and business units were assessed, heatmaps reveal patterns: for example, clusters of maturity, pockets of excellence, or areas needing targeted support.
Simple ladders or bars: where are we now, and where do we want to be? These help non‑technical audiences understand what each maturity level means and where the organisation currently sits.
As-is and to-be markers. Building on the ladders, these should be side‑by‑side visuals for the two states; a really compelling way to demonstrate progress and show the value of investment.
Self-service dashboards. Interactive dashboards combining charts with short explanatory text can guide readers through the story rather than leaving them to interpret raw visuals, tables or spreadsheets.
The human context. Data stories, quotes, and frustrations can make the findings feel grounded in real world experience rather than abstract scores.
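To make the radar chart and the as‑is/to‑be comparison concrete, here is a minimal sketch in Python using matplotlib. The dimensions and scores are illustrative placeholders, not recommendations; swap in your own assessment categories and results.

```python
# A minimal sketch of a maturity radar chart comparing as-is and to-be states.
# The dimensions and scores below are illustrative placeholders only.
import numpy as np
import matplotlib.pyplot as plt

dimensions = ["Governance", "Quality", "Skills", "Culture", "Tooling", "Strategy"]
baseline = [2.0, 1.5, 2.5, 2.0, 3.0, 1.5]   # as-is scores on a 1-5 scale
target   = [3.5, 3.0, 3.5, 3.0, 4.0, 3.0]   # agreed to-be scores

# One angle per dimension; repeat the first point to close each polygon.
angles = np.linspace(0, 2 * np.pi, len(dimensions), endpoint=False).tolist()
angles += angles[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for scores, label in [(baseline, "As-is"), (target, "To-be")]:
    values = scores + scores[:1]
    ax.plot(angles, values, label=label)
    ax.fill(angles, values, alpha=0.1)

ax.set_xticks(angles[:-1])
ax.set_xticklabels(dimensions)
ax.set_ylim(0, 5)
ax.legend(loc="upper right")
ax.set_title("Data maturity: as-is vs to-be")
plt.show()
```

A chart like this doubles as the side‑by‑side as‑is/to‑be visual described above, and it drops neatly into a dashboard or slide deck.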
Whatever format you choose, aim for clarity, simplicity, and authenticity. The goal is not just to inform, but to inspire action.
Not one and done.
A single assessment is merely a snapshot, whereas repeated assessments give you space to tell a story.
That means running the DMA annually, then blending the results into business as usual. Key steps to doing so are:
Establish a baseline with your first assessment. Make sure senior staff agree it is an accurate, evidence‑based picture.
Track progress over time. Accept scores can go up or down! Focus on targeted areas to show real improvement (see the sketch after this list).
Spot emerging trends, not just what you expect. As the expression goes, “a rising tide lifts all boats”: one well‑chosen intervention can improve several indicators at once.
Evaluate the impact of interventions. This is so important, and often not done. If it’s not working, don’t be afraid to change it.
Keep data maturity on the organisational agenda. Be ready to pivot to support changes and new initiatives.
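If you hold scores per dimension for each assessment, even a few lines of code can surface where you’ve improved and where you’ve slipped. A minimal sketch, assuming a simple 1–5 scoring scale and made‑up dimension names:

```python
# A minimal sketch of comparing repeat assessments against the baseline.
# Dimensions and scores are illustrative placeholders only.
baseline = {"Governance": 2.0, "Quality": 1.5, "Skills": 2.5, "Culture": 2.0}
latest   = {"Governance": 2.5, "Quality": 2.5, "Skills": 2.0, "Culture": 2.5}

for dim in baseline:
    delta = latest[dim] - baseline[dim]
    trend = "improved" if delta > 0 else "regressed" if delta < 0 else "held steady"
    print(f"{dim:<12} {baseline[dim]:.1f} -> {latest[dim]:.1f} ({trend}, {delta:+.1f})")
```

The point isn’t the code, it’s the discipline: a structured record of scores per dimension per assessment is what lets you show regressions honestly and evaluate whether interventions are working.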
This rhythm also reinforces the message that data maturity is not a destination but a journey. It evolves as your organisation changes.
Closing thoughts
A data maturity assessment is far more than a diagnostic tool. When designed properly and used strategically, it becomes a catalyst for cultural change, a roadmap for investment, and a narrative thread that ties together your organisation’s broader ambitions.
Make it relevant. Make it purposeful. Make it repeatable. And above all, make it compelling. We all have too much to read, so your challenge is to get the DMA to the top of the pile!


