"Error is viewed, not as an extraneous and misdirected or misdirecting accident, but as an essential part of the process…. Our present treatment of error is unsatisfactory and ad hoc." - J. von Neumann (1956)
Well before 1956, John von Neumann recognized and repeatedly voiced his conviction that handling errors is a critical aspect of computing. For over 50 years, the field of computing has relied on powerful methods such as defect minimization, design margining, abstraction, redundancy, backtracking, and virtualization at the technology, circuit, architecture, and software levels to avoid errors and exploit the scaling enabled by Moore's Law. However, a drastic shift in the computing model may be due, as traditional technology scaling itself now poses risks to the continued advancement of computers: the memory gap is widening, power remains a limiting factor, and variation- and reliability-induced errors in nanoscale technologies are becoming more difficult to contain.
Recent research has been exploring new computing models as a solution. These models are particularly relevant to emerging applications, which are rapidly gaining prominence. Instead of exact computations on traditional input sources, increasingly diverse input sources, often based on ensembles (of sensors, users, or other computers), are enabling probabilistic or approximate models of computing aimed at high-level inference. These models have several implications, such as reduced power or latency, particularly when considered in the context of emerging technologies. So far, such computing models have generally been shown to be applicable only in specific cases, but broader benefits are also possible. In brain-inspired computing, for example, while there is substantial background, including decades of research in neural networks, recent advances focus on general-purpose computing and emerging machine-learning methods.
While it is possible to study these computing models independently of technology or applications, designing new computing models in a way that is explicitly driven by upcoming applications and emerging technologies would lead to more systematic advances. Further, it would also enable directed efforts at the technology and application levels that are influenced by these computing models. With these goals, we are holding a DAC Workshop where researchers from the alternative computing, system design, and novel technology research areas can share their insights and exchange ideas toward the common goal of improving system design in the nanoscale era.
52nd DAC General Chair Anne Cirkel will be the featured editor of the "52 Weeks of DAC" blog. Follow it for a behind-the-scenes view of how the conference is preparing for June 2015!
DAC 2015 will be held in San Francisco, California, at the Moscone Center. Get details about travel, hotels, and area attractions in one convenient spot.