Making data work harder to deliver Net Zero — can we automate environmental reporting?

Gavin Starks
3 min read · Jun 6, 2022


Trusted data needs to flow between industry, finance and the environment, and this is achievable at web scale.

Pension funds, market investors and countries (via sovereign wealth funds) are allocating trillions of dollars to create a lower-carbon future. At the moment this is done using ‘financial economy’ modelling based on top-down economic models, Environmental, Social and Governance (ESG) ratings and related financial instruments that have a risk-based focus. While these may deliver a ‘lower carbon’ future, there are material data risks in the models, their inputs and their reported outputs.

Currently, there is a material absence of transparency, assurance, comparability and auditability around environmental reporting, including ‘ESG’ and ‘impact investing’, which many groups around the world are trying to address. Most reporting is ‘push’ (e.g. TCFD reporting, which is now mandatory for >50k companies in Europe).

While better reporting is a form of progress, it does not provide a causal link to materiality or ‘double materiality’. ESG ratings are very far from mature.

Governments are likely to mandate that such publishing be made ‘open’ (for some definitions of open), and XBRL has the mandate to help develop principles for machine-readable data reporting. One challenge is that everyone is trying to ‘build a portal’ rather than taking a web-scale approach; portals do not scale, and they raise barriers to access and sharing.

At Icebreaker One we’re making data work harder to deliver net zero. We’re promoting the idea that companies publish their reports, data, and metadata in a way that can be properly indexed.
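To make that concrete, a published report could be accompanied by a machine-readable metadata record that web crawlers and market actors can index. This is a minimal sketch only: the field names below are illustrative assumptions (loosely inspired by DCAT-style catalogue metadata), not Icebreaker One’s actual schema.

```python
import json

# Hypothetical metadata record for a published environmental report.
# Field names are illustrative assumptions, not a real Icebreaker One schema.
report_metadata = {
    "publisher": "Example Energy Ltd",
    "title": "2021 GHG Emissions Report",
    "issued": "2022-04-01",
    "licence": "https://creativecommons.org/licenses/by/4.0/",
    "format": "text/csv",
    "checksum_sha256": "…",  # digest of the data file, for integrity checks
}

# Serialised as JSON, the record can be fetched and indexed at web scale.
print(json.dumps(report_metadata, indent=2))
```

Because the record is plain JSON at a stable URL, it can be harvested by search engines and aggregators without anyone having to build (or log into) a portal.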

We’re building a Climate Finance ‘Trust Framework’ where organisations can legally verify that they are the publisher and enable auditors to better assure the outputs.

Step one is to invert the reporting model to one of verified publishing. The act of publishing then enables a secure, open market of actors to use the data in their own systems, portals, hubs, analytics and tech stacks.
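One way verified publishing could work is for the publisher to attest a report’s digest so that any downstream user can check who published it and that it hasn’t changed. The sketch below is an assumption-laden stand-in: a real trust framework would use public-key signatures and a legal registry, whereas this uses a stdlib HMAC with a hypothetical shared key purely for illustration.

```python
import hashlib
import hmac

# Stand-in for a registry-issued signing key. Hypothetical; a real
# framework would use asymmetric keys so verifiers need no secret.
REGISTRY_KEY = b"hypothetical-shared-secret"

def attest(report: bytes) -> str:
    """Publisher side: attest the SHA-256 digest of the report."""
    digest = hashlib.sha256(report).digest()
    return hmac.new(REGISTRY_KEY, digest, hashlib.sha256).hexdigest()

def verify(report: bytes, attestation: str) -> bool:
    """Auditor side: recompute the attestation and compare in constant time."""
    return hmac.compare_digest(attest(report), attestation)

report = b"2021 emissions: 1,200 tCO2e"
tag = attest(report)
print(verify(report, tag))        # True: untampered report verifies
print(verify(b"tampered", tag))   # False: any change breaks verification
```

The point of the design is that verification is cheap and automatic, so auditors and analysts can assure provenance without contacting the publisher.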

Step two is to start opening up the data value chains between the ‘real economy’ and the ‘financial economy’. This means, for example, enabling access to energy data (both kWh consumption and its carbon intensity at the time of use) for users in a way that they can share it through to carbon accounting, risk modelling (insurance), investors, ESG ratings providers, auditors and a range of other users (e.g. energy efficiency will become priced and tradable in emerging decentralised flexibility markets).
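The kWh-plus-intensity point matters because emissions depend on *when* energy is used, not just how much. A small worked example (all figures invented for illustration) shows the calculation that carbon accountants, insurers and ratings providers would run over such shared data:

```python
# Illustrative carbon accounting over hourly energy data: emissions are
# consumption (kWh) multiplied by the grid's carbon intensity
# (kgCO2e/kWh) at the time of use. All numbers are made up.
readings = [
    {"hour": "2022-06-06T00:00", "kwh": 120.0, "intensity": 0.18},
    {"hour": "2022-06-06T12:00", "kwh": 95.0,  "intensity": 0.07},  # midday solar: low intensity
    {"hour": "2022-06-06T18:00", "kwh": 140.0, "intensity": 0.25},  # evening peak: high intensity
]

emissions_kg = sum(r["kwh"] * r["intensity"] for r in readings)
print(round(emissions_kg, 2))  # 63.25 kgCO2e
```

Note that shifting the evening load to midday would cut emissions substantially at the same total kWh, which is exactly the signal that makes efficiency and flexibility priceable.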

We can use the trust framework to verify sources and connections, and to provide access control for non-public data. Copies of audit data can also provide certifiable provenance that supports assurance. Such certification would give the market and ratings higher confidence, and drive behaviour towards transparency, quantifiable impact and double materiality.
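Certifiable provenance can be sketched as a hash chain over the audit records: each record embeds the hash of its predecessor, so altering any earlier step breaks every later hash. The record contents and structure below are hypothetical, not a specification of the trust framework.

```python
import hashlib
import json

def chain(records):
    """Link audit records so each one commits to everything before it."""
    prev = "0" * 64  # genesis value
    out = []
    for rec in records:
        payload = json.dumps({"prev": prev, "data": rec}, sort_keys=True)
        prev = hashlib.sha256(payload.encode()).hexdigest()
        out.append({"data": rec, "hash": prev})
    return out

steps = ["meter read verified", "report published", "assurance signed"]
audit_trail = chain(steps)

# An auditor holding a copy of the records can recompute the chain;
# matching final hashes certify that nothing was altered.
print(audit_trail[-1]["hash"] == chain(steps)[-1]["hash"])  # True
```

This is the property that lets certification travel with the data: anyone with a copy can re-verify provenance without trusting the intermediary that handed it over.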

One way of thinking about this is: we could automate greenhouse gas (GHG) reporting within a trusted framework. This would facilitate capital flow to projects that demonstrate the greatest measurable progress towards a lower-carbon future and, if applied across economies, would let us work out in advance whether we are on target to hit net zero, rather than waiting for the climate data to tell us we’re not.


*The above diagram only addresses climate finance risk through the reporting frameworks. Unpacking the actual financial risk models themselves is yet another dimension, as are the climate and catastrophe risk models. As with all systems-modelling approaches, there is no ‘edge’ to the data flows.