Smart grid: Modeling a year in the life of a power grid

January 20, 2016
Contact:
  • umichnews@umich.edu

ANN ARBOR—With a new $1.4 million grant, University of Michigan researchers will lead an effort to model a year in the life of a power grid, creating the most detailed, adaptable power grid simulation ever made.

Their model will inform the software at the heart of tomorrow’s smart electrical grid, which will need to shuffle electricity on the fly to constantly optimize the balance between supply and demand. That’s a radical departure from current practice, but experts say it will be essential to build a system that can handle the uncertainty the future will bring. Next-generation energy systems will rely more on renewable—but intermittent—sources like solar and wind.

Using real data from a French power grid operator, U-M researchers will create a highly detailed, flexible grid model that they and other researchers can then use to test algorithms and electrical grid software over long timeframes, on a scale hundreds of times larger than is currently possible.

The project also involves researchers from Los Alamos National Laboratory, the California Institute of Technology, Columbia University and Réseau de Transport d’Électricité (RTE) in France. It’s one of seven projects the Department of Energy funded through an $11 million Advanced Research Projects Agency-Energy program to develop models and data repositories that can transform the U.S. grid. The program is part of an ongoing effort that reflects the Obama administration’s commitment to improving the resiliency, reliability and security of the nation’s electricity delivery system, according to a DOE statement.

“The energy landscape in 10 years is going to look much different than it does today,” said Pascal Van Hentenryck, the Seth Bonder Collegiate Professor of Industrial and Operations Engineering at U-M, who leads the project. “We’re going to see a shift from fossil fuels to renewables, more frequent extreme weather and new demands from electrical customers who are generators as well as consumers of electricity. To adapt to those changes, we’re going to have to reconsider some of the basic assumptions about how an electrical grid should work.”

Van Hentenryck says that today, most electricity is generated by a few large sources—often coal or nuclear—that are scaled up or down to follow demand. Future grids, he says, will need to do the opposite—constantly matching demand with the generation capacity available from many small, fluctuating sources like wind and solar. Powerful new software will be required to optimize this process and keep things running smoothly when the unexpected happens.
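
To make that balancing task concrete, it can be framed as an optimization problem. The short Python sketch below shows a toy version, a simple economic dispatch that meets a fixed demand at lowest cost given each generator’s limits. The generators, costs and demand figures are invented for illustration and are not drawn from the U-M/RTE project.

    from scipy.optimize import linprog

    # Toy "economic dispatch": meet demand at lowest cost, subject to
    # each generator's capacity limits. All numbers are hypothetical.
    cost = [20.0, 35.0, 50.0]                  # $/MWh for three generators
    capacity = [(0, 300), (0, 400), (0, 200)]  # MW limits per generator
    demand = 650.0                             # MW of load to serve

    result = linprog(
        c=cost,                    # minimize total generation cost
        A_eq=[[1.0, 1.0, 1.0]],    # total generation ...
        b_eq=[demand],             # ... must equal demand
        bounds=capacity,
        method="highs",
    )

    print("dispatch (MW):", result.x)   # output assigned to each generator
    print("cost ($/h):", result.fun)

A real grid model adds transmission constraints, ramping limits and uncertainty in renewable output, which is what makes the full problem hard, but the basic shape of the question is the same.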

“Customers who use rooftop solar panels, for example, are much less predictable in their energy usage than traditional power consumers, since the power they generate sometimes offsets their usage,” Van Hentenryck said. “Renewable energy sources like wind can make it necessary to quickly ship power across the country from where it’s generated to where it’s needed. We need a grid that’s smart and efficient enough to accommodate that.”

One of the biggest obstacles to getting a smarter grid off the ground has been that there’s currently no reliable way to test new software to make sure it performs as intended in the real world. That’s largely because the operators that run the grid have been reluctant to release data about how their systems work. They worry that it could lead to a security breach or tip their hand to competitors. So researchers have had to rely on proprietary data that can’t be shared beyond their immediate project, or on hypothetical data that’s often inaccurate or incomplete.

The U-M project aims to change that by partnering with grid operator RTE France, which will provide real power system data that the team can use to design and test high-fidelity models. The U-M team will use sophisticated optimization techniques to build a model that’s accurate but doesn’t divulge sensitive information about RTE’s system. Van Hentenryck says the use of real data sets the project apart.

“We’re going to build what’s called a synthetic model. It will behave like a real system down to a very fine level of detail. But it won’t include any of the actual data that was used to build it,” he said. “It’s a very complex optimization problem and that’s where the fundamental science lies. But we have a very strong collaboration with our partner and that’s going to help us get it done.”
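
One way to picture the synthetic-model idea is the short, purely illustrative Python sketch below: it fits a simple statistical model to a confidential set of numbers and then publishes freshly sampled values with similar statistics but none of the originals. The data and the lognormal assumption are invented for the example; they stand in for the far more sophisticated optimization the team describes.

    import numpy as np

    rng = np.random.default_rng(seed=0)

    # Pretend these are confidential transmission-line capacities (MW).
    real_capacities = np.array([120, 240, 240, 400, 400, 630, 630, 630, 900])

    # Fit a simple lognormal model to the confidential data ...
    log_vals = np.log(real_capacities)
    mu, sigma = log_vals.mean(), log_vals.std()

    # ... then sample a synthetic set of the same size. The synthetic values
    # mimic the real data's overall statistics without repeating any entry.
    synthetic = rng.lognormal(mean=mu, sigma=sigma, size=real_capacities.size)

    print("real mean (MW):", round(real_capacities.mean(), 1))
    print("synthetic capacities (MW):", synthetic.round(0))

The project’s actual challenge is harder: the synthetic grid has to behave like the real system under simulation, not just match summary statistics, which is where the optimization science the team describes comes in.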

The goal is to create a tool that will be useful for years to come and can be adapted to large and small grids in the U.S. and elsewhere. The team plans to have a working model completed within the next two years.

“We have a big opportunity here, not just to improve our own grid, but to become leaders in technology that can be exported around the world,” Van Hentenryck said. “It makes a lot of sense from both an environmental and an economic standpoint.”
