We take for granted that we can search a billion websites from around the world in under a second. That capability is the result of a very specific architectural approach, one based on globally adopted standards. The outcome is that any system that follows the rules can connect, enabling billions of connections on a continuous basis and unlocking vastly diverse applications.
This is a free-market, open, democratic approach that enables both commercial and non-commercial innovation.
Things like search engines exist as a direct consequence of this approach. Imagine if, instead, everyone had to ‘put’ their data into some global ‘portal’ … no one would. That doesn’t stop people trying, but it’s futile. Open markets win in the end, and when it looks like they won’t, governments and regulators step in (however clumsily, slowly or late) to address power imbalances.
This underlines that the web is not just a technological innovation. The world’s richest companies are who they are largely because they ‘got there first’.
We are all getting better at this, and free-market principles are still driving behaviours. While this does create monopolies, it also creates competition.
To fuel our data innovation we need to get better at opening up markets, making them accessible to many. It is already legally possible to search, find and discover copyright and other ‘restricted’ content by scanning the metadata that describes the underlying data. Accessing and using the data itself, however, typically requires a manual step: ‘signing a contract’ to permit its usage and/or pay for its use.
The next phase of web development will automate this step and enable the connection of data between machines at a rapid pace. Step one must be that we publish better, open metadata about what we hold. If no one can find you in the ocean, you will be assimilated.
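To make this concrete, here is a minimal sketch of what publishing ‘better, open metadata about what we hold’ could look like in practice: a machine-readable record that makes a restricted dataset discoverable without exposing the underlying data. The field names are loosely modelled on the W3C DCAT vocabulary, but all names, URLs and values below are illustrative assumptions, not a prescribed schema.

```python
import json

def make_metadata_record(title, description, publisher, licence_url, access_url):
    """Build a machine-readable metadata record for a dataset whose
    underlying data is restricted: findable by anyone, usable only
    after an access agreement. All fields are illustrative."""
    return {
        "@type": "Dataset",
        "title": title,
        "description": description,
        "publisher": {"name": publisher},
        "license": licence_url,        # terms a machine (or a lawyer) can inspect
        "accessRights": "restricted",  # discoverable, but access needs agreement
        "distribution": [{
            "@type": "Distribution",
            "accessURL": access_url,   # where to request or negotiate access
        }],
    }

# Hypothetical example: an energy dataset that is open to discovery
# but licensed for use (publisher and URLs are made up).
record = make_metadata_record(
    title="Half-hourly substation load readings",
    description="Operational energy data; access granted under licence.",
    publisher="Example Grid Operator",
    licence_url="https://example.org/licences/restricted-v1",
    access_url="https://example.org/data/substation-load",
)

print(json.dumps(record, indent=2))
```

A search engine or data catalogue can index a record like this and answer ‘who holds what, under which terms’ without ever touching the data itself; the automated contract step envisaged above would then negotiate against the `license` and `accessURL` fields by machine rather than by hand.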
The adoption of new technologies such as machine learning (ML) and artificial intelligence (AI) will accelerate the pace of change to the extent that many services will be automated.
At the same time, user needs are diverse and growing, touching millions of people, thousands of organisations and society as a whole.
These developments in and around data, including increasing automation and the use of both ML and AI, are relevant to environmental impact. Observing, monitoring and predicting both our environment and the impact industry has on it, relies heavily upon the collection and analysis of data.
Meanwhile, predictions and projections of the future state of those systems, both natural (weather, water and climate) and human-made (the creation and operation of energy, water, agriculture, transport and the built world), increasingly use ML and AI, especially on shorter timescales, to supplement the data produced by earth system models and digital twins.
We need to explore best practices for data sharing across sectors and anchor our concepts around data. Doing so will help determine an approach for strengthening value chains, embracing the expertise of humans and machines to better understand our complex challenges and how we can best address them.
Interoperability is the precursor to diversification
If institutions are to meet the needs of their users, they must address a rapidly evolving data ecosystem. Rather than continuing to build, maintain and deploy technologies, or supplying data against historic needs, the modern need is to connect, not collect.
At Icebreaker One, we aim to enable organisations to find and connect data around user needs, not just collect and distribute it (in fact, we will never hold the underlying data at all).
Instead, we are taking a project-based approach to initiate, strengthen and optimise value chains through public-private engagement, such as Open Energy. This approach puts industry and regulators around the same table to look at the whole market; to address business models, operational needs, the country’s objectives (e.g. Net Zero) and our societal needs; and, ultimately, to build an energy data ecosystem that works for everyone.