How Analog and Neuromorphic Chips Will Rule the Robotic Age
When it comes to new technologies and products, we tend to think of “digital” as synonymous with advanced, modern, and high-def, while “analog” is considered retrograde, outmoded, and low-resolution.
But if you think analog is dead, you’d be wrong. Analog processing not only remains at the heart of many vital systems we depend on today; it is also making its way into a new breed of computing and intelligent systems that will power some of the most exciting technologies of the future: artificial intelligence and robotics.
Before discussing the upcoming analog renaissance—and why engineers and innovators working on AI and robot applications should be paying close attention to it—it might be helpful to understand the significance and legacy of the Old Analog Age.
OUR LOVE FOR ANALOG
During World War II, analog circuits played a key role in enabling the first automatic anti-aircraft fire-control systems and, in the decades that followed, analog computers became an essential tool for calculating flight trajectories for rockets and spacecraft.
Analog also became prevalent in control and communication systems used in aircraft, ships, and power plants, and some of those systems are still in operation to this day. Until not long ago, analog circuits controlled vast portions of our telecommunications infrastructure (remember the rotary dial?), and they even ruled the office copy room, where early photocopier machines reproduced images without handling a single digital bit.
Our love for analog persisted for so long because this technology proved, again and again, to be accurate, simple, and fast. It steered our rockets, sailed our ships, recorded and played back our music and videos, and connected us to each other for many decades. And then, starting in the mid- to late-1960s, digital came along and rapidly took over.
DIGITAL DOMAIN
Why was analog abandoned for digital? The biggest weakness of analog is that it’s rigid, and it gets exponentially more complex when you try to make it flexible. That added complexity brings a disproportionate reduction in reliability. At the same time, engineers noticed that Moore’s Law was making dense digital compute extremely reliable and cheap.
Meanwhile, MEMS and microfabrication techniques commoditized sensors that capture physical signals and convert them to digital. Soon enough, operational amplifiers were abandoned for logic gates that got exponentially cheaper just as Moore’s Law predicted. Mechanical linkages became “fly-by-wire,” and designers pushed digitization further to the edge.