How can we find the Goldilocks Zone of our National Data Infrastructure?
(please note my disclosures at the end of this post)
This post contains (as always) personal opinions and thinking-in-progress (strong opinions, weakly held) as I navigate the maze(s) of data governance. A question that persists throughout is how we apply the Goldilocks principle to the balance between governance and innovation: how to remain flexible and avoid brittleness.
Given the conversations I’m in at the moment, I believe it is very important to debate this as openly as possible, as it impacts everyone.
Here’s a question:
How should the UK implement its National Data Infrastructure so that it provides consistent control, sharing, and security for data, much like other public infrastructures such as roads and utilities?
“Everything should be as simple as it can be, but not simpler”
[often attributed to Einstein]
Having initiated ‘data as infrastructure’ at a political level in the UK in 2013, I’ve been watching and learning about this for a long time. We’ve made remarkable progress in some areas, less so in others.
Let’s explore, starting with some lessons learned.
One immediate reflection: as over 60 countries moved to copy the Open Banking Standard, those that took only ‘parts’ of the system tended to experience worse outcomes, often having to add the missing pieces back in later to course-correct. Most notable are those that thought this was ‘just a tech problem’ and took only the open API principles.
We can look to past UK data initiatives like Midata and Open Banking and potentially infer that voluntary-only initiatives and centralisation do not work, and that the decentralised nature of Open Banking is what made it more successful. However, these are shallow reflections.
We can point to three factors, beyond ‘centralised vs decentralised’ or ‘voluntary vs mandated’, which depend on the use case:
- Market incentives: there must be an economic argument that policy can then amplify or mandate. If there is no financial incentive, there will be no movement (regardless of central vs decentralised, mandate or not — mandates are also often ignored). Midata was mostly tech-led, Open Banking was use-case led.
- Removal of friction: removing transactional friction may seem like something everyone wants, but not if your current business model relies on it. There must be “something in it” for everyone, or at least a path to cost reduction or a new business model. Removing friction can help everyone move together: this is never a ‘technology problem’ (e.g. the absence of a data ontology).
- Mandates: if the friction is ‘too high’, regulatory intervention is necessary to mandate participation and move the market. However, if the market can demonstrate self-initiation, then ‘endorsement’ from government can be sufficient to drive impact. Value creation must be visible in both cases.
I believe that we need to embrace these factors in our systems design.
We need to:
- work out where the lines are drawn between central, pre-competitive and competitive areas;
- ask what the role of regulation is, and at what granularity;
- query how things can be implemented in a way that humans (and machines) can and want to adopt;
- ensure those governing, operating and participating can be helped in their assessments of compatibility and interoperability (see the sketch after this list);
- work out how to prioritise, and create clear (and stable) roadmaps that enable investment to be made.
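To make the last two points concrete: compatibility assessments only scale if a sector’s requirements are published in a machine-readable form, so participants can check themselves before (and after) joining a scheme. Below is a minimal sketch of what such an automated check could look like; the profile format, field names and thresholds are all hypothetical illustrations, not any published standard.

```python
# A minimal, hypothetical sketch of a machine-checkable compatibility
# assessment. The profile format and every field name are invented for
# illustration; they do not reflect any published scheme or standard.

SECTOR_PROFILE = {  # what an (imaginary) sector scheme demands
    "licence": {"allowed": ["open", "shared"]},
    "security": {"min_level": 2},
    "endpoints": {"required": ["/accounts", "/consents"]},
}

PARTICIPANT = {  # what a prospective participant declares about itself
    "licence": "shared",
    "security_level": 2,
    "endpoints": ["/accounts"],
}

def assess(profile: dict, participant: dict) -> list[str]:
    """Return a list of human-readable compatibility gaps (empty = compatible)."""
    gaps = []
    if participant["licence"] not in profile["licence"]["allowed"]:
        gaps.append(f"licence '{participant['licence']}' is not permitted")
    if participant["security_level"] < profile["security"]["min_level"]:
        gaps.append("security level is below the sector minimum")
    for endpoint in profile["endpoints"]["required"]:
        if endpoint not in participant["endpoints"]:
            gaps.append(f"missing required endpoint {endpoint}")
    return gaps

for gap in assess(SECTOR_PROFILE, PARTICIPANT):
    print("gap:", gap)  # prints: gap: missing required endpoint /consents
```

The point is not the code but the design choice: if the rules are published in a form both humans and machines can evaluate, conformance becomes a continuous, self-service activity rather than a one-off audit.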
I believe that we can build on the success of sector-specific approaches:
The Open Banking model’s success lies precisely in its sector-specific, incremental approach. It focused on the unique needs of banking, where security, standardisation, and customer trust were paramount.
Copying the Open Banking model into other sectors (e.g., energy, transport, health) isn’t about duplication but about applying tested frameworks tailored to each industry.
Without considering the market incentives for participation, we risk techno-utopian thinking.
Rather than creating a complex, overarching architecture that risks being too broad, a sectoral approach allows for flexibility and adaptation to industry-specific challenges. This carries a risk of fragmentation, but equally the potential to mandate cross-sector interoperability without having to define exactly what that means in detail for everyone. The risk of pushback against over-reach is material, and catastrophic for adoption if it fails. Further, market incentives will be market-specific.
Avoiding Over-Standardisation
A National Data Architecture risks over-standardisation and could stifle innovation instead of enhancing it. Different sectors have wildly varying levels of data maturity and hugely variable data governance needs and challenges; a one-size-fits-all approach is highly likely to impose unnecessary constraints on sectors that are not yet ready for them.
For example, imposing on the industrial sector the level of data governance required in the financial sector is not (today) realistic. We are, however, on the frontline of that journey with ESG reporting, which highlights the complexity of bridging the real economy and the financial sector. Incremental improvements, like extending Open Banking principles to energy or transport, can allow industries to evolve organically while keeping governance controls and security as a priority.
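For readers who want a feel for what ‘extending Open Banking principles’ means in practice, here is a minimal sketch of the consent-then-access pattern transplanted to an imaginary energy API. Every URL, scope and field below is invented for illustration; none belongs to a real scheme or published specification.

```python
# A hypothetical consent-then-access flow in the Open Banking style,
# transplanted to an imaginary energy-sector API. All URLs, scopes and
# fields are invented for illustration.
import requests

AUTH_URL = "https://auth.example-energy-scheme.org/token"      # hypothetical
DATA_URL = "https://api.example-energy-supplier.com/v1/usage"  # hypothetical

def fetch_usage(client_id: str, client_secret: str, consent_id: str) -> dict:
    """Exchange credentials for a consent-scoped token, then read usage data."""
    # Step 1: the customer has already granted consent (consent_id) via their
    # supplier; the third party exchanges credentials for a scoped token.
    token_resp = requests.post(AUTH_URL, data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": f"energy:usage consent:{consent_id}",  # invented scope syntax
    }, timeout=10)
    token_resp.raise_for_status()
    access_token = token_resp.json()["access_token"]

    # Step 2: call the standardised read endpoint; the supplier enforces that
    # the token only unlocks the data the customer consented to share.
    data_resp = requests.get(
        DATA_URL,
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=10,
    )
    data_resp.raise_for_status()
    return data_resp.json()
```

The pattern, not the plumbing, is what transfers between sectors: consent is explicit and revocable, access is scoped and auditable, and the read API is standardised across providers so third parties can build once.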
Pragmatism of the Incremental Approach
The incremental, domain-by-domain approach may seem less ambitious, but it offers a practical pathway to innovation that can be adopted today. Trying to design a top-down, comprehensive national architecture from the outset risks paralysis by analysis and over-planning. Learning from Open Banking and rolling out smart data initiatives sector-by-sector allows for continuous improvement and responsiveness to real-world feedback, which may be more effective than a sweeping, coordinated architecture.
Interoperability Requires Industry Engagement
Interoperability relies on industry engagement, and this is best fostered through practical, sector-specific initiatives like Open Banking. Imposing an overarching system from above risks alienating key industry stakeholders, who may resist changes that threaten their business models (this already happens in every sector today). Open Banking’s success came from the balance of regulatory oversight and industry cooperation, a formula that can be replicated in other sectors.
Market Failures Can Be Addressed Incrementally
There are (always) risks of market failures such as data hoarding (‘castle and moat’ is still the prevalent investment model for data businesses). These issues should be tackled incrementally through sector-specific mandates rather than a top-down architecture, but they can be accelerated with top-down principles.
The competition fostered by Open Banking and similar initiatives already shows promise in tackling data monopolies. An incremental, practical approach allows policymakers to address market failures as they emerge, rather than trying to solve all potential problems with a one-time top-down design.
Trust is Earned, Not Imposed
Trust in data systems cannot be imposed from the top. While Open Banking was catalysed by regulation, it earned user trust through transparency, security, and gradual adoption, not through a top-down imposition. Similarly, rolling out frameworks sector by sector allows consumers, businesses and citizens to see the benefits and gradually gain confidence in the approach. Attempting to ‘mandate trust’ top-down could backfire, especially in sectors like health, where data privacy concerns are particularly sensitive.
At IB1, our Trust Frameworks are one tool (of many) that can help foster multilateral collaboration, building on Open Banking principles.
Decentralisation Encourages Innovation
The decentralised, minimal and pre-competitive nature of Open Banking has driven its adoption and the innovation around it. A large, centralised approach would have risked locking in certain standards or technological pathways that could hinder future innovations. Allowing sectors to develop their own interoperable standards fosters a more competitive, flexible landscape, where innovations in one sector can inform developments in another, without being dictated by a single framework.
Data governance should aim to balance impact on rights and security while minimising its own footprint and reach
While the idea of a unified, overarching data framework may sound appealing in theory, in practice it would risk being too rigid, slow to adapt, and detached from the unique needs of individual sectors. A more pragmatic approach — building on the lessons of Open Banking and extending these principles to other industries — may offer the best balance of innovation, security, and user control.
Please feel free to comment, or message me (via LinkedIn or directly).
Disclosure: I sat on the Midata energy sector board (mostly tearing my hair out over these points); co-chaired the creation of the Open Banking Standard (through which I learned enough lessons to fill a book or two); was founding CEO of the Open Data Institute; am co-Chair of the Smart Data Council; and run a non-profit (IB1.org) working on data governance at sector and national scale.
Thanks to Chris, Frank, Paul, Hadley and others for their feedback and inputs.
Useful links
Lessons learned from Gaia-X Data Spaces
https://www.sitra.fi/en/articles/eight-lessons-from-building-data-spaces/
ODI on Data Institutions
https://theodi.org/insights/projects/rd-data-institutions/
Icebreaker One on how it implements data sharing
https://ib1.org/what-we-do
Open Banking (implementation entity)
https://openbanking.org.uk
Additional narrative on cities
https://agentgav.medium.com/the-porous-city-92ae986cd43c