Is Liquid Silicon the Next Generation of Computer Hardware?
The idea of “liquid silicon” conjures images from a Terminator film. Fittingly, it is a nascent ’80s computing concept brought to life with modern fabrication techniques, with the potential to alter the course of the future for computer hardware.
“Liquid Si,” with its delicate layers of mono-crystalline silicon and stacked transistors, has real-world implications in the post-Moore semiconductor landscape.
Building unified computer hardware that incorporates system memory, I/O logic, and disk storage into the same module represents a long-standing goal for microchip architects, and attainment is closer now than ever. Using a process called monolithic 3D integration, modern fabrication machines can execute chip designs with silicon and semiconductor circuitry layered on the bottom, solid-state memory arrays on top, and a dense metal-to-metal bus sandwiched in between.
The ‘liquid’ aspect refers to the fluidity of the new hardware: whether it acts as RAM, SSD, or CPU. With it, the age-old bottleneck between storage and compute shifts from a hardware problem to one that software solves, explains Jing Li, a University of Wisconsin assistant professor at the forefront of the new technology. “Liquid Si itself can be configured as storage class memory to replace NAND,” Li points out, which gives users the option to allocate compute, storage, or memory resources in real time to meet demands on the system.
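To make the idea concrete, the software-defined allocation Li describes can be imagined as a pool of fabric blocks that the system carves up on demand. The sketch below is purely illustrative: no Liquid-Si programming interface is public, and every name here (`LiquidFabric`, the role labels, the block counts) is a hypothetical stand-in, not the team's actual software.

```python
# Hypothetical sketch: modeling a reconfigurable "liquid" fabric as a pool
# of blocks that software assigns to compute, memory, or storage roles at
# run time. All names and numbers are illustrative assumptions.

class LiquidFabric:
    def __init__(self, total_blocks):
        self.free = total_blocks
        self.allocated = {"compute": 0, "memory": 0, "storage": 0}

    def allocate(self, role, blocks):
        # Claim free blocks for one of the three roles.
        if role not in self.allocated:
            raise ValueError(f"unknown role: {role}")
        if blocks > self.free:
            raise MemoryError("not enough free blocks on the fabric")
        self.free -= blocks
        self.allocated[role] += blocks

    def release(self, role, blocks):
        # Return blocks to the free pool so another role can claim them.
        blocks = min(blocks, self.allocated[role])
        self.allocated[role] -= blocks
        self.free += blocks

fabric = LiquidFabric(total_blocks=64)
fabric.allocate("memory", 32)   # act as RAM for a memory-hungry phase
fabric.allocate("storage", 16)  # act as storage-class memory, NAND-style
fabric.release("memory", 16)    # later, shift capacity back to the pool
fabric.allocate("compute", 24)  # repurpose it as compute logic
print(fabric.allocated, fabric.free)
```

The point of the sketch is the shape of the problem, not the mechanism: once the same physical blocks can serve any role, balancing compute against memory becomes an allocation decision made in software rather than a fixed property of the board.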
Li and a team of post-grad research assistants at the UW-Madison College of Engineering have taken additional strides, building on recent academic breakthroughs in 3D layered silicon chips. They have custom-built a computer system that automates the QA process for prototypes, and they are writing the compiler code for app development.
Even with Li's momentum running at full steam, there is a long road ahead for the technology.
The original concept of a multi-layer universal computer chip traces back to Madison, Wisconsin by way of Silicon Valley. Wafer-scale integration (WSI) ‘superchips’ of a similar type were first envisioned for mainframe computers by visionary IBM architect and University of Wisconsin alum Gene Amdahl. After raising $60 million in a 1983 IPO, Amdahl launched Trilogy Systems to produce the concept, but stopped after 18 months without a product to show for it.
The layered circuit design proved too difficult to fabricate back then, but academic experimentation with 3D layered multi-chips sparked Amdahl’s concept back to life several years ago.
Two prototypes of note—3D-MAPS developed by Georgia Institute of Technology, and University of Michigan’s Centip3De—demonstrated functional DRAM and multi-core logic in a single 3D layered silicon chip in 2011 and 2012.
For a CPU industry staring the end of miniaturization in the face, monolithic fabrication might be the logical next step. In a year full of famous departures, 2016 witnessed the death of Moore’s Law, a 51-year-old way to understand semiconductor size, evolution, and market value.
CPUs after the death of Moore’s Law
The law’s passing surprised few, but its effects rippled visibly across the industry. Intel delayed production of its 10 nm process indefinitely. The newest 14 nm Kaby Lake CPUs represent a “tock” performance boost where a “tick” die shrink should be. And we’re still awaiting word from AMD about its much-anticipated Zen products.
For AMD and Intel, increasing density, efficiency, and performance without moving to smaller dies and transistors means thinking in 3D. Going vertical already worked for NAND architecture in solid state drives, handing industry leadership to Samsung after three years on the market. Will it do the same for silicon logic?
Amdahl: I’ll Be Back.
One year after Amdahl’s passing in November 2015, a poetic “I’ll be back” might be in store for one of the greatest minds in computer science.
While Liquid-Si is likely several years from commercial implementation, Li’s immediate goal is to fit the new chips into existing motherboard sockets, PCIe slots, and DIMM slots. “We would like to keep it compatible with existing industry standard and bus protocols instead of reinventing new standard,” Li says.
UW liquid silicon research and development is backed by DARPA, the U.S. Defense Advanced Research Projects Agency, at $500,000 per year for two years.
Photos courtesy of University of Wisconsin