A study from Oak Ridge National Laboratory looked at which energy-saving measures would be the most effective across a swath of buildings in Chattanooga, Tennessee.
Residential and commercial buildings consume nearly three-quarters of U.S. electricity—during peak hours, that share reaches 80%. Simulating that energy use on a broad scale can help identify ways to reduce it, cutting greenhouse gas emissions in the process.
In a recent study, researchers from the U.S. Department of Energy’s (DOE) Oak Ridge National Laboratory (ORNL) assessed energy use across more than 178,000 buildings using supercomputing power at DOE’s Argonne National Laboratory. The effort is part of a larger goal to model all of the nation’s 129 million buildings.
If you were to tally up the energy bills from all of those buildings, the annual total would be around $403 billion.
“I look at that as a mortgage on our economy,” said Joshua Ryan New, a computer scientist at ORNL. “If we can find better ways to use energy more productively, we can accomplish more as a society.”
New and colleagues developed the Automatic Building Energy Modeling (AutoBEM) software, which is used to detect buildings, generate models and simulate building energy use for very large areas. Creating an energy picture of a large network of buildings, rather than looking at just one building or even several hundred, can illuminate areas of opportunity for planning the most effective energy-saving measures.
However, many efforts to model large numbers of buildings rely on prototypes of common commercial buildings such as offices, warehouses and schools. Gaps remain between the energy use a computer model predicts and the energy buildings actually consume. To narrow those gaps, models need to be validated against measured energy-use data.
For the study, New and colleagues partnered with a municipal utility to create a digital twin of 178,337 buildings in Chattanooga, Tennessee. To do this, they integrated the utility’s information on energy use for every building, down to 15-minute intervals, with satellite images, tax assessments and other data sources. Then they projected the effects of eight energy conservation measures on energy use, demand, cost and emissions. Those measures included roof insulation, lighting changes and improvements to heating and cooling efficiency.
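The integration step described above, joining interval meter data with per-building records, can be sketched in a few lines. This is a hypothetical illustration, not the AutoBEM pipeline: every ID, field name and value below is invented.

```python
# Hypothetical sketch of the data integration described above: joining
# 15-minute utility meter readings with per-building metadata such as
# tax-assessment records. All IDs, fields and values are invented.
meter_readings = [
    {"building_id": 101, "time": "2020-01-01T00:00", "kwh": 1.2},
    {"building_id": 101, "time": "2020-01-01T00:15", "kwh": 1.1},
    {"building_id": 102, "time": "2020-01-01T00:00", "kwh": 0.8},
]
metadata = {
    101: {"floor_area_sqft": 2400, "use_type": "residential"},
    102: {"floor_area_sqft": 18000, "use_type": "office"},
}

# Merge each reading with its building's attributes to form twin records.
twin = [{**reading, **metadata[reading["building_id"]]} for reading in meter_readings]
print(twin[0]["use_type"])  # residential
```

In practice each record would feed a per-building energy model, but the principle is the same: metered consumption keyed to building attributes.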
To run the simulations, the team used the Theta supercomputer at the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science User Facility. A building energy model has on average 3,000 inputs. One of those inputs could be an hourly lighting schedule of a single room with more than 8,000 values, New said, so it’s easy to imagine how modeling eight different energy-saving measures across more than 100,000 buildings could quickly become a data-intensive job.
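A quick back-of-the-envelope calculation, using only figures from the article, shows how the job balloons:

```python
# Back-of-the-envelope scale check using figures quoted in the article.
inputs_per_model = 3_000       # average inputs per building energy model
hourly_values = 24 * 365       # one annual hourly schedule holds 8,760 values
buildings = 178_337            # buildings in the Chattanooga digital twin
measures = 8                   # energy conservation measures evaluated

simulations = buildings * measures
print(f"{simulations:,} building-measure simulations")          # 1,426,696
print(f"{hourly_values:,} values in one annual hourly schedule") # 8,760
```

Eight measures across the full building set already means roughly 1.4 million annual simulations, each with thousands of inputs.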
“We’ve been able to scale up to running annual simulations of over a million buildings in one hour on Theta,” New said. “That really unlocks a lot of potential that you wouldn’t see otherwise.”
Theta’s configuration offered an advantage for AutoBEM, which uses EnergyPlus and OpenStudio, two DOE tools for building energy modeling. The distinction rests on the type of processing power: while many high-performance computers derive most of their power from graphics processing units (GPUs), which excel at running many identical operations in parallel, EnergyPlus runs primarily on central processing units (CPUs), which are designed to execute one task after another in quick succession.
“Theta is one of the most powerful supercomputers in the U.S. in terms of CPUs,” New said, noting that the code was initially run on ORNL’s Titan, which is a GPU-heavy machine. “When we got to Theta, AutoBEM scaled so beautifully. We regularly use over 80% of Theta.”
The ORNL team’s study was awarded time at the ALCF through DOE’s Advanced Scientific Computing Research Leadership Computing Challenge, a program that allocates national supercomputing facility resources with an emphasis on high-risk, high-payoff simulation projects.
“The fact that this study simulated a large agglomeration of buildings to reduce energy use is a novel problem for our resources and our allocation programs,” said Katherine Riley, ALCF’s director of science. “This work is really aligned with important energy questions going into the next few decades about how we can reduce the carbon footprint of the nation’s buildings.”
Historically in high-performance computing, one might run one big simulation, go away to analyze the data, and then run another simulation. Riley said that ALCF continues to build its systems for new types of workflows and science.
“If you’re trying to understand energy requirements over a lot of different scenarios in a city and wanted to look at all the knobs you could turn, that’s thousands of simulations—not one big one,” Riley said. “For a project like that, you need a system that’s capable of managing a very dynamic workflow. That’s what ALCF can support.”
The AutoBEM simulation of Chattanooga buildings found that 99% of them saw energy savings for the set of energy efficiency technologies evaluated. For example, increasing the efficiency of the heating, ventilation and air conditioning (HVAC) system by 7.5% saved $28,500 in annual energy costs, averaged across 177,307 buildings. A given conservation measure such as improved HVAC efficiency, space sealing, insulation or lighting has the potential to offset 500 to 3,000 pounds of carbon dioxide per building, the researchers concluded. Full details appear in a paper published in the journal Energies.
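To get a feel for what that per-building range means citywide, one can scale the reported 500-to-3,000-pound offset across the simulated building stock. This is a rough extrapolation from the article's numbers, not a result from the paper:

```python
# Rough extrapolation: scale the study's reported per-building CO2 offset
# range (500-3,000 lb for a single measure) across the simulated buildings.
# This aggregate is illustrative, not a figure from the paper itself.
LB_PER_METRIC_TON = 2204.62

buildings = 177_307            # buildings in the savings analysis
low_lb, high_lb = 500, 3_000   # per-building offset range for one measure

low_tons = buildings * low_lb / LB_PER_METRIC_TON
high_tons = buildings * high_lb / LB_PER_METRIC_TON
print(f"citywide offset per measure: "
      f"{low_tons:,.0f} to {high_tons:,.0f} metric tons CO2")
```

Even the low end of that range, on the order of tens of thousands of metric tons per measure, hints at why city-scale modeling interests planners.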
“What we do in buildings will have a long-lasting impact,” New said. “Creating a more sustainable and resilient building stock will have an impact that I might not see in my lifetime, but my grandchildren’s grandchildren will be thankful we got that right.”
Supercomputing resources power energy savings analysis (2021, August 12)