Serial entrepreneur Naveen Rao lifted the curtain a little this week on his latest startup, Unconventional AI, which aims to build a new analog chip that can move AI beyond the scaling limits of today's digital computers.
Word of Unconventional AI’s existence came to light in September, when Rao hinted in an X post that he was co-founding a new company to build a computer with “brain-scale efficiency.” We learned the company was backed by Andreessen Horowitz and that it was looking to raise $1 billion.
Fast forward two months, and Rao and his three co-founders (MIT Associate Professor Michael Carbin, Stanford University Assistant Professor Sara Achour, and former Google engineer MeeLan Lee) have formally announced in a blog post that they raised $475 million in seed funding at a $4.5 billion valuation.
Unconventional AI co-founders (from left): Sara Achour, Naveen Rao, MeeLan Lee, and Michael Carbin
The core problem Rao was looking to solve (he sold his AI startup MosaicML to Databricks for $1.3 billion in 2023, and before that sold Nervana Systems to Intel for a reported $400 million in 2016) is the relative inefficiency of scaling today's neural network-based AI workloads on traditional digital computers. While these systems are computationally powerful, that power comes at the cost of enormous energy consumption. At the current rate of growth, the co-founders say, there simply won't be enough energy in the world to sustain further growth in AI workloads.
Unconventional AI is looking to develop “a more efficient computational substrate specifically for AI.” The goal is to develop “a software interface to the inherent physics of the silicon” to enable “biology-scale energy efficiency,” a reference to the human brain’s capability to process data with an energy budget of about 20 watts, or the equivalent of a dim lightbulb.
While digital computers are synonymous with computing today, that wasn't always the case. Rao points out in a video interview with A16Z's Matt Bornstein that early analog computers, many of them built from vacuum tubes, actually ran quite well.
“Analog computers are actually some of the first computers,” Rao said. “They’re very efficient, but they couldn’t be scaled up because of manufacturing variability.”
Digital computers based on transistors took over because analog approaches couldn't be scaled, and they eventually became what we think of as computers today. Digital computing now defines the field of computer science, Rao said. However, digital computers based on the classic von Neumann architecture are running into a scaling problem of their own, power consumption, which is leading Rao and others to rethink analog systems.
“Analog still is inherently more efficient. ‘Analogous’ is actually the way to think about it,” the Unconventional AI CEO told Bornstein. “Can I build a physical system that is similar to the quantity I’m trying to express or compute over? You’re effectively using physics of the underlying medium to do the computation.”
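The textbook illustration of "using the physics of the medium to compute" (not necessarily what Unconventional AI will build, and the values below are purely illustrative) is the resistive crossbar: program a matrix into an array of conductances, apply input voltages, and Ohm's law plus Kirchhoff's current law produce a matrix-vector product as summed currents, with no arithmetic circuitry at all. A minimal numerical sketch of that physics:

```python
import numpy as np

# Hedged sketch of the classic analog crossbar idea. A matrix of weights is
# stored as conductances G (siemens); inputs arrive as voltages V (volts).
# Ohm's law gives each cell's current (I = G * V), and Kirchhoff's current
# law sums the currents along each row wire, so the physics itself computes
# the matrix-vector product I = G @ V.
G = np.array([[1.0e-6, 2.0e-6],
              [3.0e-6, 0.5e-6]])   # programmed conductances (illustrative)
V = np.array([0.2, 0.1])           # input voltages on the columns

I = G @ V                          # row currents: the "computed" output
print(I)                           # e.g. [4.0e-07, 6.5e-07] amperes
```

The multiply-accumulate that dominates neural network inference thus happens in one physical settling step, which is where the claimed efficiency comes from.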
Bornstein pressed Rao to make it concrete. What sort of substrate will Unconventional AI build? What kind of chip is it going to be?
Rao teased with his response:
“Analog computers can do lots of different things. Wind tunnels are a great example of an analog computer in a sense, where I have a race car … or an airplane and I want to understand how the wind moves around it. You can, in theory, solve those things computationally. The problem is you’re always going to be off. It’s very hard to know what the real system is going to look like. And doing things with computational fluid dynamics accurately is pretty hard. So people still build wind tunnels. That’s actually modeling that. That’s an analog computer.
“I think we still have lots of reasons to build these analogous type computers,” Rao continued. “Now, in the situation we’re talking about, we can actually build circuits in silicon to recapitulate behaviors of neural networks. So what we’re doing today is more specified than what we were doing 80 years ago, in a sense. Then we were trying to automate generic calculations, which were used to calculate artillery trajectories. It was used to calculate finances, maybe some physics problems like going into space, things like that. Those require determinism and specificity around these numbers and these computations.
“Intelligence is a different beast,” he said. “You can build it out of numbers, but is it naturally built on numbers? I don’t know. A neural network is actually a stochastic machine. And so why are we using the substrate that is highly precise and deterministic for something that’s actually stochastic and distributed in nature? We believe we can find the right isomorphism in electrical circuits that can subserve intelligence.”
According to Rao, the company is giving itself five years to build a new analog chip. The company will partner with TSMC, he said, and the first prototype is “going to be probably one of the larger, maybe the largest analog chip people have ever built.”
There is surely a risk in taking such an unconventional approach to building a new chip. But Rao is reassured by the fact that there are very smart engineers who have been mapping a range of algorithms to different physical substrates for quite a while.
“Those folks who understand energy-based models, flow models, gradient descent in different ways, this kind of thing is what we need there,” he said in the A16Z interview. “We need theorists who can think about different ways of building coupled systems, how I can characterize the richness of dynamical systems and relating that to neural networks. So there is a theory aspect of this. Then there’s folks who are kind of system architecture level … Here’s what the theory says. This is what I can really build. How do I bridge that gap? And then there’s the people actually physically building this stuff, like analog circuits and digital circuit people too.”
Matt Bornstein of A16Z interviews Unconventional AI CEO Naveen Rao in a video available on X
Unconventional AI isn’t the only company building analog computer chips. Mythic has been developing its analog processors since 2020, and today claims that its MM1076 chip delivers an order of magnitude better performance compared to traditional CPUs, GPUs, and TPUs. It’s being used in edge AI deployments, such as drones, robots, and smart city installations.
In October, we learned that a group of Chinese researchers had built an analog chip that they claimed could outperform Nvidia GPUs by a factor of 1,000, while using 100x less energy. The Peking University researchers built an analogue matrix computing (AMC) system that relies on resistive random-access memory (RRAM) technology and is able to solve matrix math problems in a single step, without the need for iterations.
“The in situ computing property of AMC with resistive memory can also help overcome the von Neumann bottleneck, thus enhancing computing throughput and energy efficiency,” the researchers write in a paper in the journal Nature.
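The "single step, without iterations" claim is easiest to appreciate next to what a conventional digital solver does. The hedged sketch below (a software analogy, not the AMC hardware) contrasts a one-shot solution of Ax = b with the Jacobi iteration a digital approach might run until convergence; the analog RRAM array effectively reaches the answer in one physical settling step.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
# Diagonally dominant matrix so the Jacobi iteration is guaranteed to converge
A = rng.standard_normal((n, n)) + n * np.eye(n)
b = rng.standard_normal(n)

# "One-step" solution: stands in for what the analog array settles to physically
x_direct = np.linalg.solve(A, b)

# Iterative Jacobi solve: stands in for the looping a digital solver performs
x = np.zeros(n)
D = np.diag(A)              # diagonal entries
R = A - np.diag(D)          # off-diagonal remainder
iterations = 0
while np.linalg.norm(A @ x - b) > 1e-10:
    x = (b - R @ x) / D     # one Jacobi refinement step
    iterations += 1

assert np.allclose(x, x_direct, atol=1e-8)
print(iterations)           # many refinement passes vs. one settling step
```

Each digital iteration costs a full matrix-vector product's worth of energy and memory traffic, which is the von Neumann overhead the researchers say in situ analog computing avoids.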
This article first appeared on HPCwire.
The post Unconventional AI Wants to Solve AI Scaling Crunch with Analog Chips. Will It Work? appeared first on AIwire.

