New project to develop software for efficient computing in the age of nanoscale devices

October 19, 2010
Contact:
  • umichnews@umich.edu

ANN ARBOR—A University of Michigan electrical engineering researcher is part of a national team that has received a $10-million National Science Foundation (NSF) grant to study how software can make nanoscale computer components more efficient.

As semiconductor manufacturers build ever smaller components, their nanoscale circuits and chips become less reliable and more expensive to produce. The variability in their behavior from device to device and over their lifetimes—due to manufacturing, aging-related wear-out, and varying operating environments—is largely ignored by today’s mainstream computer systems.

In this five-year project, a team of computer scientists and electrical engineers from six universities proposes to rethink and enhance the role that software can play in a new class of adaptive, highly energy-efficient computing machines.

“As the transistors on their chips get smaller, semiconductor makers are experiencing lower yields and more variability. In other words, more circuits have to be thrown away because they don’t meet the timing-, power- and lifetime-related specifications,” said Dennis Sylvester, an associate professor in the U-M Department of Electrical Engineering and Computer Science. Sylvester is an expert in designing computer circuits in nanoscale technologies.

If left unaddressed, said Rajesh Gupta, project director and professor of computer science and engineering at the University of California, San Diego (UCSD), “this trend toward parts that scale in neither capability nor cost will cripple the computing and information technology industries. So we need to find a solution to the variability problem.

“We envision a world where system components—led by proactive software—routinely monitor, predict and adapt to the variability in manufactured computing systems,” Gupta said. “Changing the way software interacts with hardware offers the best hope for perpetuating the fundamental gains in computing performance at lower cost of the past 40 years.”

The research team seeks to develop computing systems that will be able to sense the nature and extent of variations in their hardware circuits, and expose these variations to compilers, operating systems, and applications to drive adaptations in the software stack.
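In concept, that feedback loop might resemble the following sketch, in which software reads per-core variability monitors and chooses where and how fast to run a task. This is purely illustrative Python; the monitor names, thresholds and adaptation policy are hypothetical and are not drawn from the project's actual design.

```python
import random

# Illustrative sketch only: the monitor names, thresholds, and adaptation
# policy below are hypothetical, not the project's published design.

def read_core_monitors(core_id):
    """Stand-in for on-chip variability monitors (delay/leakage sensors)."""
    random.seed(core_id)  # make per-core variation repeatable for the demo
    return {
        "timing_margin_ns": random.uniform(0.05, 0.30),  # slack before timing failure
        "leakage_mw": random.uniform(5.0, 20.0),          # static power draw
    }

def choose_frequency_mhz(margin_ns):
    # A core with more timing slack can safely run faster; a marginal core
    # is throttled by software instead of being discarded at test time.
    if margin_ns > 0.20:
        return 2000
    if margin_ns > 0.10:
        return 1500
    return 1000

def schedule_task(cores):
    # Prefer the core that currently offers the best speed per unit of leakage.
    readings = {c: read_core_monitors(c) for c in cores}
    best = max(cores, key=lambda c: choose_frequency_mhz(
        readings[c]["timing_margin_ns"]) / readings[c]["leakage_mw"])
    return best, choose_frequency_mhz(readings[best]["timing_margin_ns"])

if __name__ == "__main__":
    core, freq = schedule_task(cores=[0, 1, 2, 3])
    print(f"Dispatching task to core {core} at {freq} MHz")
```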

Variability-aware computing systems would benefit the entire spectrum of embedded, mobile, desktop and server-class applications by dramatically reducing hardware design and test costs while enhancing performance and energy efficiency, the researchers say. Many in-demand applications, including search engines and medical imaging, would also benefit. But the project's initial focus will be on wireless sensing, software radio and mobile platforms of all kinds, with the aim of transferring advances in these areas to the marketplace.

In addition to Sylvester and UCSD's Gupta, 11 computer scientists and electrical engineers from these institutions are involved: the University of California, Los Angeles; Stanford University; the University of California, Irvine (UCI); and the University of Illinois at Urbana-Champaign (UIUC). The project is based in the UCSD division of the California Institute for Telecommunications and Information Technology.
