We’ve built a foundational set of data capabilities over the last year – one metric we’re particularly proud of is the number of colleagues who use our analytics tools each day.
We followed four principles to build this foundation, detailed in this article:
Focus on the right behaviours, not just the right technology
Create a ritual of measuring performance and asking good questions
Build just enough, then use feedback to prioritise further work
Distribute data specialists and help teams work well together
We hope this is both a helpful guide and a good illustration of how we like to work at Mettle.
It’s easy to make the wrong decisions. Without data as an objective language, we may be more reliant on untested assumptions, or more easily swayed by cognitive biases. Data can help teams measure their performance with clarity and prioritise the opportunities most likely to achieve their goals.
Although there is plenty of hype around artificial intelligence, there are also clear opportunities for data science to create transformative value in financial services. For example, enriching credit scores with cash flow information can support better and more affordable lending decisions, classifying transactions can help small businesses simplify tax submissions and forecast their profits, and models that spot novel transaction patterns can more quickly alert us to new forms of financial crime.
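To make one of these use cases concrete, transaction classification can start from something as simple in spirit as mapping merchant descriptions to spending categories. The sketch below is a toy illustration only – the categories, keywords, and function are hypothetical, not our production model, which would use statistical methods trained on real data:

```python
# Toy rule-based transaction classifier. The categories and keywords
# below are invented for illustration - a real system would learn
# these mappings from labelled transaction data.

CATEGORY_KEYWORDS = {
    "travel": ["uber", "trainline", "tfl"],
    "software": ["aws", "github", "slack"],
    "office": ["staples", "wework"],
}

def classify_transaction(description: str) -> str:
    """Return the first category whose keywords appear in the description."""
    desc = description.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(keyword in desc for keyword in keywords):
            return category
    return "uncategorised"

print(classify_transaction("UBER *TRIP LONDON"))        # travel
print(classify_transaction("GITHUB.COM SUBSCRIPTION"))  # software
```

Even a rough categorisation like this is enough to group expenses for a tax view; the value of a proper model is in handling the long tail of merchants that no keyword list can cover.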
The fastest-growing technology companies in the world use data to make decisions that power their growth, and use data science to power their products. Applying these techniques to banking has become easier – driven by the growing digitisation of our financial lives, the availability and expansion of open banking APIs, and platforms that make it easier for data scientists to access data and build out new models. This means that while being data-driven is a competitive advantage for organisations like Mettle, it will become a competitive necessity in the future.
To ensure we harnessed data in the way we wanted, we followed the principles below:
focus on behaviour, not just technology
create rituals of asking the right questions
build just enough for feedback before iterating
embed specialists and help them work with teams
Principle 1. Focus on the right behaviours, not just the right technology
It’s easy to spot a data-driven organisation from the behaviour of the people in it. Data scientists and analysts are embedded into teams and work collaboratively with non-data specialists (such as product, marketing, risk, and operations managers) to solve problems.
Non-data specialists can easily self-serve simpler analysis and metrics, which encourages them to be curious and explore the data, and reduces the context loss and delay associated with relying on a specialist or a central data team. This also frees up data analysts and scientists to focus on more complex analyses.
While technology is a powerful and necessary enabler of data-driven behaviours, true behavioural change is hard. Merely making great analytics tools available will not guarantee they are used appropriately, or at all.
There are many ways to encourage behavioural change. An important enabler is commitment by leaders to measure performance with metrics, ask for data in major decision-making processes, and encourage teams to be curious and dig into data. Non-data specialists should be given the time and support to build their data literacy and capabilities. Examples of great self-service use, problem solving with data, or data science applications should be celebrated publicly as positive reinforcement of the desired behaviour.
Principle 2. Create a ritual of measuring performance and asking good questions
The first informational need of any organisation is to see its performance. Only then can it diagnose problems, prioritise opportunities and test interventions. An early step in our data journey was to create a Business Review meeting – where the CEO, leadership team and colleagues across Mettle use data to openly discuss the organisation’s performance. It’s a ceremony we borrowed from Amazon and is the best-attended meeting at Mettle outside our all-hands meetings.
This ceremony is valuable as it allows us to explore insights about the business and our customers while providing a forum to challenge each other and be curious as a group. But it also provides additional benefits. It demonstrates executive sponsorship of the right decision-making approach, and places accountability on business leads to explain changes or issues and to use data to justify their next steps.
The meeting also provides a feedback loop on our capabilities. For example, when we first started the Business Review we only had access to data via a handful of API endpoints, directly from third party systems, or by reading from G-sheets. While we could easily see our performance, we could only see it once a fortnight through a manual process. We had neither the time nor means to dig below the surface and diagnose changes in performance.
Since we’ve invested in our data warehouse and initial dashboards, we’ve gone from discussing our metrics once a fortnight, to responding to daily changes. As we’ve added data to our platform, we’ve been able to have evidence-backed debates about the root cause of changes. And as we’ve opened the platform to the business and provided training and support, we’ve found non-data specialists in marketing and product teams join the discussion and present their own findings.
Principle 3. Build just enough, then use feedback to prioritise further work
An analytics platform must do more than just deliver data. Data should ideally be complete and accurately reflect reality, remain secure and compliant with data privacy law, be easy for non-specialists to pick up and use, while still providing sufficient control for specialists to flexibly explore data and build models.
Delivering this for an entire business can feel like a monumental task. It can be tempting to spend lots of time planning, or to invest in flashy SaaS platforms or consultants that promise to do it all. Instead, the best approach is to launch a ‘minimum testable product’ fast and iterate (e.g. this article on using MVP thinking).
In this tradition, we constrained our initial work to some clear basics – setting up a data warehouse (we use BigQuery) and structuring a couple of raw Kafka topics (which reflect communications between our microservices) into useful analytics tables. We made our platform available for everyone in the company to use, to maximise feedback and learning.
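To illustrate what ‘structuring a raw topic into an analytics table’ can involve, here is a minimal sketch of flattening a nested event into a flat, warehouse-friendly row. The event shape, field names, and `to_analytics_row` function are hypothetical examples, not our actual schema or pipeline:

```python
import json

# Hypothetical raw Kafka event: nested JSON as a microservice might emit it.
raw_event = json.dumps({
    "event_type": "payment_completed",
    "occurred_at": "2021-03-01T12:00:00Z",
    "payload": {
        "account_id": "acc_123",
        "amount": {"value": 2500, "currency": "GBP"},
    },
})

def to_analytics_row(message: str) -> dict:
    """Flatten a nested event into a single flat row suitable for loading
    into a warehouse table: one column per field, amounts in minor units."""
    event = json.loads(message)
    payload = event["payload"]
    return {
        "event_type": event["event_type"],
        "occurred_at": event["occurred_at"],
        "account_id": payload["account_id"],
        "amount_minor_units": payload["amount"]["value"],
        "currency": payload["amount"]["currency"],
    }

row = to_analytics_row(raw_event)
```

The useful property of this shape is that analysts can query it with plain SQL – no knowledge of the original event structure required.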
This first platform was unpolished – it contained only a subset of available data, could only be accessed via SQL, and had no support for non-specialists. However, it allowed us to identify the biggest gaps in platform functionality, content and support, and understand the quality of the data for analytics, from the perspective of actual end users. We’ve taken a similar incremental approach to building our business intelligence – we started with a single dashboard of performance, and used feedback to gradually improve and extend.
An ‘agile’ approach can make people uncomfortable. Users of analytics must accept unpolished analytics and tools, and willingly participate in the process of improving them. Organisational stakeholders must accept that there can be no meaningful long-term ‘roadmap’ – and that work will instead flow from shorter-term feedback and gaps to the bigger vision. The advantages, however, are undeniable – a clear feedback loop enabled us to prioritise work that solved real user problems, instead of building tools we would have to later convince people to use.
Principle 4. Distribute data specialists and help teams work well together
After building the basics, we allocated data scientists and analysts across teams. A distributed analytics team is a great way to bring specialists close to the problem space and to prioritise work effectively with the business areas that have data requirements. However, data scientists still dedicate a portion of their time to ‘chapter’ activities such as supporting self-service, improving our platform and furthering our collective learning.
We talked about what good problem solving looks like in Principle 1. In well functioning teams, data specialists and non-data specialists share the workload of extracting information and using it.
This may not always happen. Non-data specialists who lack the confidence, time, or skill to use self-service tools may push this work to their more efficient specialist colleagues. Teams may struggle to align on the true value of an exploratory or data science project – or focus excessively on reporting and ‘interesting’ data exploration or data science over actionable diagnostic analyses and experimentation. New insights may not fit cleanly with decision-making processes, and teams may need to rethink the way they prioritise their backlog, build the right points of data collection, and work together.
The only solution to the above is to take time and go slowly. Non-data specialists need to feel comfortable to ask questions and learn. Data specialists need to be empowered to push back on poorly defined or low value problems. Teams should view their working style as a constant experiment and be open to improving how they work together – and new analytics tools and platforms can make this even easier.
We’re still on our journey of building out our own capabilities – and we’re still learning.
If you are a curious problem solver that likes getting their hands dirty with data, please get in touch – we’d love to have you on the team!