If the built environment industry is to deliver on its net zero promise, there must be open access and sharing of decarbonisation data, write Atkins’ Isabelle Smith and Rachel Bell
As an industry we face a formidable challenge in decarbonising our built environment by 2050 to reach our net zero goals.
Since the government enshrined those targets in legislation in 2019, an enormous amount of road-mapping and planning work has been done. But we now need to focus on delivering our decarbonisation programmes and on how we are going to manage the volume of data generated by projects on the ground.
One of the biggest obstacles to progress, in our view, is ensuring businesses across the industry – from the smallest firms to global players – can access and share data effectively, building a common knowledge and understanding of how the industry as a whole approaches decarbonisation.
Using technology to enhance data integration
The most effective way of doing this is to leverage available technology to manage data and integrate enterprise models early in major programmes.
Much of this was simply not possible five or ten years ago because, although the technology was available, the processes and skills to manipulate, analyse and transfer data were not as mature in the industry as they are now. This, alongside well-documented advisory information on data processes, security and management, issued by bodies like the Infrastructure and Projects Authority (IPA), through the Construction Playbook, enables us to unlock the potential of that technology.
Enterprise models are most effective when you take an outcomes-based approach with the end result firmly in mind from the outset. That means that, whether it is carbon, safety, quality or social value, the value driver is locked into the data from the start.
That dataset then forms the backbone of the project, used by everyone involved – designers, contractors and future operators alike – ensuring that information moves seamlessly right through the value chain.
Data management on projects
We talk a lot about collaboration, but the reality can be far from straightforward. One of the blockers is data continuity, which must be managed across the whole supply chain. The reality is that many projects cannot afford the overheads associated with data management, which makes this approach harder for small and medium-sized firms to adopt.
At Atkins we operate enterprise models on a number of major schemes and large-scale programmes. In those cases, the data management challenge isn’t about scaling up; it’s about scaling down.
At the moment this does work at scale: major programmes can take advantage of the IPA’s direction and apply the Construction Playbook because there is a management layer – a delivery partner or an integrator – to make sure it actually happens. Embedding the same discipline at a smaller scale is probably the bigger challenge.
Don’t wait for a solution
The importance of data to a business – and to how organisations like Atkins operate – goes without saying.
There is a whole host of multi-layered information held at institute and government level. But the question we need to ask as an industry is how we take all that learning and experience and embed it back into the policies being developed by government, and into the direction given to designers, contractors and others, to ensure standards and targets are met.
“We need to improve data maturity across the supply chain and enable every project to be rooted in data-driven decision making”
What we see on major programmes is that clients can afford to share information amongst framework suppliers. There are systems in place to enable that on programmes such as HS2 or the nuclear new build programme, and information is shared highly effectively between all the participants within a programme. But it doesn’t get shared outside that programme, where it would benefit the industry more widely.
Therefore, in our view, two things need to happen. Firstly, we need to improve data maturity across the supply chain and enable every project to be rooted in data-driven decision making. At the same time, we need to scale up by sharing large-scale datasets across projects and the wider industry, including open-sourcing data to bring improvements to the industry as a whole.
It is only through better sharing of expertise and understanding that we will make real progress in rolling out our decarbonisation programmes and reaching net zero. And it is a problem the industry will have to solve itself: waiting for government to aggregate all that information and share it is never going to happen.