U.C. Berkeley professor Jan Rabaey kicked off the 2010 International Symposium on Low-Power Electronics and Design (ISLPED) today in Austin with a challenge to programmers, hardware designers and their EDA tool providers: The deterministic Turing model has hit the power wall. If you want energy efficiency, forget accuracy. Consider statistical, even analog computing.

In his keynote talk—“Going Beyond Turing: Energy Efficiency in the Post-Moore Era”—Rabaey claimed that despite a decade of advances in energy efficiency—including dynamic and adaptive voltage scaling, architectural innovations and other clever power-management techniques—“waste has been eliminated and we’re basically running out of options” at smaller geometries. In fact, at 22 nm and below, energy savings won’t scale any more. Leakage is now the major problem, and scaling makes it worse. ISSCC 2011 will feature a general-session panel to brainstorm how to achieve the next major power reduction as geometries continue to shrink.

**It’s Hard to Determine**

All computers today are built on a deterministic model; with the same inputs you get the same outputs every time. But what if you don’t need complete accuracy—if “being in the ballpark” is good enough? According to Rabaey, “If you’re willing to back away a bit from accuracy, you can gain quite a bit in efficiency.” You can get along with a lot less computing power if you’re willing to accept a range of outputs.

Why not in fact do the computation in analog? The outcome will be not a single number but a distribution. Analog is inherently efficient for simple computations, but improving accuracy is expensive in terms of energy, since in analog circuits there is an exponential relationship between the signal-to-noise ratio (SNR) in decibels and power. Up to about a 30 dB SNR, analog does a good job; to get better SNR, the power requirement climbs fast. Plenty of applications do well with a low SNR. While not exactly an application, the basically analog human brain works very well at a low SNR. Add a room full of screaming 7-year-olds, and its efficiency falls off a cliff.
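To make the dB-to-power tradeoff concrete, here is a back-of-the-envelope sketch; it assumes a fixed noise floor, so signal power (and hence supply power) must track the *linear* SNR, which is exponential in the dB figure. The function name and the 30 dB reference point are illustrative choices, not from the talk.

```python
# Back-of-the-envelope sketch: with a fixed noise floor (e.g., thermal
# noise), signal power must grow linearly with the linear SNR, which is
# exponential in the dB value. Every extra 10 dB costs 10x the power.
def relative_power(snr_db: float, ref_db: float = 30.0) -> float:
    """Power needed to hit snr_db, relative to a 30 dB reference."""
    return 10 ** ((snr_db - ref_db) / 10)

for db in (30, 40, 50, 60):
    print(f"{db} dB SNR -> {relative_power(db):6.0f}x the 30 dB power")
```

Under this model, pushing an analog block from 30 dB to 60 dB of SNR costs a thousandfold increase in power, which is the cliff Rabaey was pointing at.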

Statistical computing doesn’t rely on probabilistic algorithms, and it’s not the same as Boolean networks. Its inputs are deterministic or stochastic variables, and its outputs are a range of numbers that follow a distribution curve. It relies on algorithms that remain resilient in the presence of uncertainty (“noise”) and can still make reasonable estimates—with varying degrees of certainty—within parameters determined by the uncertainty associated with the inputs.
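A toy sketch of what that means in practice (hypothetical code, not Rabaey’s formulation): a dot product whose multiplies are perturbed by Gaussian noise gives a slightly different answer on every run, but over many runs the answers form a distribution centered on the exact result.

```python
import random
import statistics

def noisy_dot(a, b, sigma=0.05):
    """Dot product where each multiply is perturbed by Gaussian noise,
    standing in for an unreliable (but cheap) arithmetic unit."""
    return sum((x * y) * random.gauss(1.0, sigma) for x, y in zip(a, b))

random.seed(42)
a = [1.0, 2.0, 3.0]
b = [4.0, 5.0, 6.0]          # exact dot product: 32.0

outputs = [noisy_dot(a, b) for _ in range(10_000)]
print(statistics.mean(outputs))   # close to 32, spread set by sigma
print(statistics.stdev(outputs))
```

Each individual output is “in the ballpark,” and the distribution tells you how much certainty you bought with the energy you spent.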

**ERSA and ANTs**

Rabaey cited work done at Stanford on the Error Resilient System Architecture (ERSA), an attempt to define a hardware/software architecture that supports statistical computing. ERSA has resulted in significant power savings in dealing with streaming video, adding little detectable noise in the process. Rabaey also cited experiments on algorithmic noise tolerance (ANT) that resulted in a 2.5x energy saving for a barely detectable increase in error rate.
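For flavor, a minimal sketch of the ANT principle as it is usually described in the literature: the main block runs at an aggressive, error-prone operating point, a cheap reduced-precision replica runs alongside it, and the replica’s estimate is used whenever the main output strays too far from it. The function name, threshold, and numbers below are illustrative, not from the cited experiments.

```python
# Sketch of the algorithmic-noise-tolerance (ANT) idea: accept the main
# block's result unless it disagrees wildly with a cheap low-precision
# estimate, in which case fall back to the estimate.
def ant_output(main_result: float, estimate: float, threshold: float) -> float:
    """Return main_result if it is within threshold of the estimate,
    otherwise return the estimate itself."""
    if abs(main_result - estimate) <= threshold:
        return main_result
    return estimate

print(ant_output(100.2, 100.0, threshold=4.0))  # small error: keep main result
print(ant_output(355.0, 100.0, threshold=4.0))  # gross error: use the estimate
```

The win is that both the error-prone main block and the crude estimator are cheap, while the combination bounds the worst-case error.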

Rabaey pointed out that some applications are well suited to statistical computing techniques while others are not. Adding a bit of noise to streaming video is a reasonable tradeoff if it results in a major power saving. In contrast, medical, military and other mission-critical computing tasks need to remain deterministic. “Some errors roll off smoothly,” he noted, “while some are catastrophic.” And in any application, changes to the least significant bit (LSB) of a byte may be insignificant, while changes to the most significant bit (MSB) clearly are not.
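The LSB/MSB point is easy to see in code. A toy example on an 8-bit pixel value (the names and numbers are mine, not Rabaey’s):

```python
# Flipping the LSB vs. the MSB of an 8-bit value: not all bit errors
# are equal, which is why approximate hardware can tolerate some bits
# being unreliable but not others.
def flip_bit(value: int, bit: int) -> int:
    """Flip one bit of an 8-bit unsigned value."""
    return (value ^ (1 << bit)) & 0xFF

pixel = 0b1001_0110          # 150
print(flip_bit(pixel, 0))    # LSB flip -> 151: off by 1, invisible
print(flip_bit(pixel, 7))    # MSB flip -> 22:  off by 128, glaring
```

A design that protects only the high-order bits can let the low-order arithmetic run cheap and noisy.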

Statistical computing may hold great promise, but designers and programmers need to change their thinking and EDA vendors need to deliver the tools. “We must start building statistics into all levels of the design process,” Rabaey concluded. “We have to break determinism. VHDL and Verilog are purely deterministic languages. We must spend time on error modeling of our hardware.” For this to happen, “The EDA community really needs to break out into the application space.”

Mentor, Synopsys, Cadence: the ball’s in your court, folks.