Making data work harder to deliver Net Zero — can we automate environmental reporting?

Trusted data needs to flow between industry, finance and the environment, and this is achievable at web-scale.

Pension funds, investment managers and countries (via sovereign wealth funds) are allocating $trillions to create a lower-carbon future. At the moment this is done using ‘financial economy’ modelling based on top-down economic models, Environmental, Social and Governance (ESG) ratings and related financial instruments that have a risk-based focus. While these will deliver a ‘lower-carbon’ future, there are material risks in the data (in the models, inputs and reported outputs).

Currently, there is a material absence of transparency, assurance, comparability and auditability around environmental reporting (including ‘ESG’ and ‘impact investing’) that many groups around the world are trying to address. Most reporting is ‘push’ (e.g. to https://cdp.net or TCFD reporting, which is now mandatory for >50k companies in Europe).

While better reporting is a form of progress, it does not provide a causal link to materiality (or ‘double materiality’). ESG ratings are very far from mature.

Governments are likely to mandate that such publishing be made ‘open’ (for some definitions of open), and XBRL has the mandate to help develop principles of machine-readable data reporting. One challenge is that everyone is trying to ‘build a portal’ rather than taking a web-scale approach; this is not scalable and increases barriers to access and sharing.

At Icebreaker One we’re making data work harder to deliver net zero. We’re promoting the idea that companies publish their reports, data and metadata in a way that can be properly indexed.
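As a rough illustration of what ‘properly indexed’ could mean in practice, here is a minimal sketch in Python that emits a schema.org-style Dataset description as JSON-LD. The organisation, URLs and licence shown are placeholders, not Icebreaker One’s actual schema.

```python
import json

# Illustrative only: a schema.org-style Dataset description that a web
# crawler or search index could discover alongside a published report.
# The publisher name, URLs and licence are placeholder values.
metadata = {
    "@context": "https://schema.org",
    "@type": "Dataset",
    "name": "Annual GHG emissions report 2021",
    "publisher": {"@type": "Organization", "name": "Example Manufacturing Ltd"},
    "license": "https://creativecommons.org/licenses/by/4.0/",
    "temporalCoverage": "2021-01-01/2021-12-31",
    "distribution": [{
        "@type": "DataDownload",
        "encodingFormat": "application/json",
        "contentUrl": "https://example.com/reports/ghg-2021.json",
    }],
}

# Embedding this as JSON-LD in the report's landing page (or serving it from
# a well-known URL) lets any indexer find it without a central portal.
print(json.dumps(metadata, indent=2))
```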

We’re building a Climate Finance ‘Trust Framework’ where organisations can legally verify that they are the publisher and enable auditors to better assure the outputs.

Step one is to invert the reporting model to one of verified publishing, the act of which enables a secure, open market of actors to use the data in their own systems, portals, hubs, analytics and tech stacks.
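To make the idea of verified publishing concrete, here is a minimal Python sketch (not Icebreaker One’s actual mechanism): the publisher signs what it publishes, and any downstream consumer can verify the signature against a public key that the trust framework has bound to the legal entity. The report contents and key handling are illustrative only.

```python
# Sketch only: verified publishing via a detached signature. In practice a
# trust framework would bind the public key to the legal entity (e.g. via a
# registry); here we just generate a throwaway key pair.
# Requires the third-party 'cryptography' package.
from cryptography.hazmat.primitives.asymmetric import ed25519

report = b'{"org": "Example Manufacturing Ltd", "scope1_tCO2e": 1234.5}'

publisher_key = ed25519.Ed25519PrivateKey.generate()
signature = publisher_key.sign(report)   # published alongside the report

# Any downstream user verifies against the publisher's registered public key.
public_key = publisher_key.public_key()
public_key.verify(signature, report)     # raises InvalidSignature if tampered with
print("signature verified: report is as published")
```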

Step two is to start opening up the data value chains between the ‘real economy’ and the ‘financial economy’. This means, for example, enabling users to access energy data (both kWh consumption and its carbon intensity at the time of use) in a way they can share through to carbon accounting, risk modelling (insurance), investors, ESG ratings providers, auditors and a range of other users (e.g. energy efficiency will become priced/tradable in emerging decentralised flexibility markets).
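As a worked example of why pairing consumption with carbon intensity at the time of use matters, the sketch below (with made-up figures) computes emissions from half-hourly consumption and the grid intensity at each interval.

```python
# Illustrative numbers only: half-hourly consumption (kWh) paired with the
# grid carbon intensity at the time of use (gCO2/kWh). Time-of-use pairing
# is what makes the result meaningful for carbon accounting and flexibility.
half_hourly = [
    # (kWh consumed, grid intensity in gCO2 per kWh)
    (12.0, 180),   # overnight, low-carbon mix
    (45.5, 310),   # morning peak
    (38.2, 270),
    (20.1, 150),   # windy afternoon
]

emissions_kg = sum(kwh * intensity for kwh, intensity in half_hourly) / 1000
total_kwh = sum(kwh for kwh, _ in half_hourly)

print(f"consumption: {total_kwh:.1f} kWh")
print(f"location-based emissions: {emissions_kg:.2f} kgCO2")
# Averaging intensity over a year instead of at time of use would hide
# exactly the signal (load shifting, efficiency) that these markets price.
```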

We can use the trust framework to verify sources and connections, and to provide access control for non-public data. Copies of audit data can also be used to provide certifiable, auditable provenance, which leads to assurance. Such certification would give the market/ratings higher confidence and drive behaviours towards transparency, quantifiable impact and double materiality.
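One illustrative way to give copies of audit data certifiable provenance is to hash-chain the records, so any alteration or gap in a copied audit trail is detectable. The Python sketch below shows the idea; it is not the trust framework’s actual mechanism, and the field names are made up.

```python
# Sketch of hash-chained provenance records: each entry commits to the data
# it covers and to the previous entry, so an auditor can detect any gap or
# alteration in a copied audit trail. Field names are illustrative.
import hashlib
import json

def add_record(chain, source, payload):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {
        "source": source,
        "payload_sha256": hashlib.sha256(payload).hexdigest(),
        "prev": prev_hash,
    }
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(body)

chain = []
add_record(chain, "smart-meter/MPAN-1234", b"2021-06-01,12.0kWh")
add_record(chain, "carbon-intensity-api", b"2021-06-01,180gCO2/kWh")

# Verification: recompute each hash and check the links are unbroken.
for i, rec in enumerate(chain):
    expected_prev = chain[i - 1]["hash"] if i else "0" * 64
    body = {k: v for k, v in rec.items() if k != "hash"}
    recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    assert rec["prev"] == expected_prev and rec["hash"] == recomputed, "chain broken"
print("provenance chain intact")
```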

One way of thinking about this is: we could automate Greenhouse Gas (GHG) reporting within a trusted framework. This would facilitate capital flow to the projects that demonstrate the greatest (measurable) progress towards lower carbon and, if applied across economies, would let us work out in advance whether or not we are on target to hit net zero (rather than waiting for the climate data to tell us we’re not).

Slides (open to comment): https://bit.ly/IB1-NZDF

*The above diagram only addresses climate-finance risk through the reporting frameworks. Unpacking the actual financial risk models themselves is yet another dimension, as are the climate and catastrophe risk models. As with all systems-modelling approaches, there is no ‘edge’ to the data flows.
