[–] yogthos@lemmygrad.ml 7 points 3 days ago (4 children)

The problem with digital computing is that a bit is just a light switch: on or off. To model a smooth curve, like the sigmoid functions all over AI, it has to approximate it with a massive series of discrete steps. That's clunky and burns a ton of energy just to trace a smooth line. An analog signal, on the other hand, can take any value in between. It doesn't have to calculate the gradient because its voltage just is the gradient. It's modeling the physics of the problem directly, which is why it's so much more efficient for this stuff.
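To make the staircase point concrete, here's a minimal Python sketch of how a digital system snaps a smooth sigmoid onto discrete quantisation levels (the bit widths and grid here are illustrative, not from the paper):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def quantize(y, bits):
    """Snap values in [0, 1] to the nearest of 2**bits discrete levels."""
    levels = 2 ** bits - 1
    return np.round(y * levels) / levels

x = np.linspace(-6, 6, 1000)
smooth = sigmoid(x)

# The smooth curve becomes a staircase; more bits = finer steps,
# but it's always an approximation built from discrete values.
for bits in (2, 4, 8):
    stepped = quantize(smooth, bits)
    max_err = np.max(np.abs(stepped - smooth))
    print(f"{bits}-bit staircase: max error {max_err:.4f}")
```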

Meanwhile, the five-orders-of-magnitude precision claim is actually the most important part. The whole reason we ditched analog in the 70s is that it was noisy and imprecise. Their breakthrough is basically that they've gotten an analog system precise enough to be practical.

And you're absolutely right that digital can do in-memory computing, but how it's done is the key. Digital in-memory compute is still moving 1s and 0s around through logic gates, just on the same chip. The analog RRAM approach instead uses the physical properties of the resistive material to do the math. You feed a set of input voltages into a grid of these memory cells, and the output currents that come out are the answer to a massive matrix multiplication, all in one shot. The computation is a physical event, not a sequence of steps.
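A toy simulation of that idea, assuming nothing about the actual device beyond Ohm's law and Kirchhoff's current law (the conductance range and noise figure are made-up numbers for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Each RRAM cell stores a conductance G[i][j]. Applying input voltages
# V to the rows gives column currents I_j = sum_i G[i][j] * V[i],
# i.e. I = G^T @ V -- the whole matrix-vector product "in one shot".
G = rng.uniform(1e-6, 1e-4, size=(4, 3))   # conductances in siemens
V = np.array([0.2, 0.5, 0.1, 0.8])         # input voltages in volts

I_ideal = G.T @ V                          # what the physics computes

# Real devices are noisy -- this is the precision problem the
# researchers are claiming to have addressed.
I_noisy = I_ideal * (1 + rng.normal(0, 0.01, size=I_ideal.shape))

print("ideal column currents:", I_ideal)
print("noisy column currents:", I_noisy)
```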

As for those 1,000x performance gains, that's not what you're likely to see in real-world applications. This isn't going to run your OS or a web browser faster. It's a specialized accelerator, just like a GPU is for graphics. You'd still have a normal CPU, but it would hand off specific tasks to this thing.

[–] sodium_nitride@hexbear.net 5 points 3 days ago (3 children)

Although in theory digital circuits have to deal with quantisation error, the imprecision of analog computing means it faces a similar problem. There is always going to be an error bar. If it turns out to be lower with analog computers, that's great.

As for the AI sigmoid curves, they don't actually need to be sigmoid curves. The point is to clip the output and add non-linearity. The discrete steps are not a problem; I've seen AI architectures that work even with 2-bit models.
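A quick sketch of that point: a piecewise-linear "hard sigmoid" clips and adds non-linearity without being a smooth curve at all, and a toy 2-bit quantiser shows that coarse discrete steps still preserve the shape (both illustrative, not taken from any specific architecture):

```python
import numpy as np

def hard_sigmoid(x):
    """Piecewise-linear clip: 0 below -3, 1 above 3, linear between."""
    return np.clip(x / 6 + 0.5, 0.0, 1.0)

def quantize_2bit(y):
    """Map activations in [0, 1] onto 4 levels, as a 2-bit model would."""
    return np.round(np.clip(y, 0, 1) * 3) / 3

x = np.linspace(-6, 6, 13)
print(hard_sigmoid(x))                 # non-linear, clipped, not smooth
print(quantize_2bit(hard_sigmoid(x)))  # same shape at 2-bit resolution
```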

But you are correct that digital circuits face fundamental limits on accuracy that could in theory be bypassed by analog circuits. I don't think analog circuits are bad, and I'm quite excited about them. I'm just wary of the overwhelming amount of clickbait from journalists.

> its voltage just is the gradient.

I'm not sure what setup the Peking University experiment uses, but voltage is proportional to the gradient of current only if you feed the current through an inductor, and the proportionality constant depends on the inductor and the surrounding components.
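For reference, that's just the ideal-inductor relation (any real circuit adds parasitics from the surrounding components):

```latex
% Ideal inductor: voltage is proportional to the time-derivative
% (gradient) of the current, with the inductance L as the constant.
V(t) = L \, \frac{\mathrm{d}I(t)}{\mathrm{d}t}
```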

I think I will try to read the paper when I'm done with exams, because I'm curious about the Peking researchers' technique.
