[–] sodium_nitride@hexbear.net 5 points 3 days ago (1 children)

> the breakthrough achieves analog computation accuracy comparable to digital systems, improving analog precision by five orders of magnitude — or nearly 100,000 times.

This statement needs much elaboration before it becomes meaningful.

> Unlike digital chips that move data back and forth between memory and processor cores (a process that consumes huge energy), RRAM executes computations directly within memory cells.

I'm 90% sure you can do in-memory computing with digital as well

> Throughput hundreds to thousands of times higher than top-tier GPUs. Energy efficiency improvements of up to 1,000x

This seems rather amazing and extravagant. I would say there are many obstacles before this type of thing could be put into the real world, but with China, they can do it. They probably won't get thousands of times the performance of digital GPUs outside lab conditions, or for all applications, but this could be a big leap forward.

[–] yogthos@lemmygrad.ml 7 points 3 days ago (1 children)

The problem with digital computing is that a bit is just a light switch that can be on or off. To model a smooth curve, like those sigmoid functions all over AI, it has to chop it into a massive series of discrete steps. It's clunky and burns a ton of energy just to approximate a smooth line. But an analog signal can take any value in between. It doesn't have to calculate the gradient because its voltage just is the gradient. It's modeling the physics of the problem directly, which is why it's so much more efficient for this stuff.
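
To make the discrete-steps point concrete, here's a toy Python sketch (my own illustration, nothing from the article): a digital system snaps the sigmoid's output onto a finite grid of levels, while an analog voltage would simply be the continuous value.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def quantized_sigmoid(x, bits=4):
    # A digital system stores the output on a finite grid of 2**bits levels.
    levels = 2 ** bits
    return round(sigmoid(x) * (levels - 1)) / (levels - 1)

for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    exact = sigmoid(x)
    approx = quantized_sigmoid(x)
    print(f"x={x:+.1f}  exact={exact:.4f}  4-bit={approx:.4f}  err={abs(exact - approx):.4f}")
```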

Meanwhile, the five orders of magnitude precision claim is actually the most important part. The whole reason we ditched analog in the 70s was that it was noisy and imprecise. Their breakthrough is basically that they've built an analog system with enough precision to be practical.

And you're absolutely right that digital can do in-memory computing, but how it's done is the key. Digital in-memory compute is still moving 1s and 0s around through logic gates, just on the same chip. Analog RRAM, meanwhile, uses the physical properties of the resistive material to do the math. You feed a bunch of input voltages into a grid of these memory cells, and the output current that comes out is the answer to a massive matrix multiplication, all in one shot. The computation is a physical event, not a sequence of steps.
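
A toy numerical model of that crossbar idea (my sketch under idealized assumptions, not the paper's actual setup): treat each cell's conductance as a matrix entry, apply Ohm's law per cell and Kirchhoff's current law per column, and the column currents are the matrix-vector product.

```python
import numpy as np

rng = np.random.default_rng(0)

# Conductances of the RRAM cells (siemens): this grid *is* the stored matrix.
G = rng.uniform(1e-6, 1e-4, size=(4, 3))  # 4 input rows, 3 output columns

# Voltages applied to the rows: the input vector.
V = np.array([0.10, 0.20, -0.10, 0.30])

# Ohm's law gives each cell's current (V_i * G_ij); Kirchhoff's current law
# sums the currents down each column. The column totals are the matvec,
# produced "in one shot" by the physics rather than by sequential steps.
I_out = V @ G

print(I_out)
print(np.allclose(I_out, G.T @ V))  # matches a conventional matrix-vector product
```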

As for those 1,000x performance gains, that's not what you're likely to see in real-world applications. This isn't going to run your OS or a web browser faster. It's a specialized accelerator, just like a GPU is for graphics. You'd still have a normal CPU, but it would hand off specific tasks to this thing.

[–] sodium_nitride@hexbear.net 5 points 3 days ago (2 children)

Although in theory digital circuits have to deal with quantisation error, analog computing's imprecision poses a similar problem: there is always going to be an error bar. If it turns out to be lower with analog computers, that's great.
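
To put rough numbers on those error bars (illustrative figures I'm assuming, not from the article):

```python
# Worst-case quantization error for an N-bit digital value spanning [0, 1]:
for bits in (2, 8, 16):
    step = 1.0 / (2 ** bits)
    print(f"{bits:2d}-bit: max quantization error = {step / 2:.1e}")

# An analog value instead carries its noise floor. E.g. an assumed 1 mV RMS
# of noise on a 1 V full-scale signal is ~1e-3 relative error, roughly
# the error bar of a 9-bit digital value.
```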

As for the AI sigmoid curves, they don't actually need to be sigmoid curves. The point is to clip the output and add non-linearity. The discrete steps are not a problem; I've seen AI architectures even for 2-bit models.
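
For instance (my sketch, not any particular paper's activation): a "hard sigmoid" gets the clipping and non-linearity with no smooth curve at all, and it survives even 2-bit quantization.

```python
def hard_sigmoid(x):
    # Piecewise-linear stand-in for a sigmoid: just clip a line,
    # no smooth curve required.
    return min(1.0, max(0.0, 0.25 * x + 0.5))

def two_bit(x):
    # Quantize the clipped output to 2**2 = 4 levels (2 bits).
    return round(hard_sigmoid(x) * 3) / 3

for x in (-3.0, -1.0, 0.0, 1.0, 3.0):
    print(f"x={x:+.1f}  hard={hard_sigmoid(x):.2f}  2-bit={two_bit(x):.2f}")
```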

But you are correct that digital circuits experience fundamental limitations of accuracy that could in theory be bypassed by analog circuits. I don't think analog circuits are bad, and am quite excited for them. I'm just wary of the overwhelming amounts of clickbait from journalists.

> its voltage just is the gradient.

I'm not sure what setup the Peking University experiment uses, but the voltage is proportional to the gradient of the current only if you feed current through an inductor, and the proportionality constant depends on the inductor and surrounding components.
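
For reference, the textbook relation being invoked: an ideal inductor relates voltage to the time-derivative of its current,

$$V = L\,\frac{dI}{dt},$$

with the proportionality constant being the inductance $L$ (and, in a real circuit, the surrounding components).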

I think I will try to read the paper when I'm done with exams, because I'm curious about the Peking researchers' technique.

[–] yogthos@lemmygrad.ml 5 points 2 days ago (1 children)
[–] sodium_nitride@hexbear.net 5 points 2 days ago

Oh wonderful!

[–] yogthos@lemmygrad.ml 4 points 3 days ago

Sure, analog circuits are imprecise, but that isn't necessarily a showstopper for applications where you're tuning something like a neural network. The human brain is also extremely noisy and imprecise, yet it's far more energy-efficient at many tasks than our digital computers. The key idea is that it's optimized for a different use case than a digital chip.

I agree that the article is sensationalist in nature, but the tech itself is useful, and if they've figured out how to get noise levels down enough to make these chips work, that really could be revolutionary. The paper will definitely have more interesting info; I was just too lazy to track it down.

[–] musicpostingonly@hexbear.net 4 points 3 days ago (1 children)
[–] Champoloo@hexbear.net 11 points 3 days ago (1 children)

Yeah, the article is talking about analog chips.

[–] musicpostingonly@hexbear.net 6 points 3 days ago* (last edited 3 days ago) (1 children)

I read the article. Analog chips haven't been around a hundred years. There is no 100-year barrier; it's a made-up term.

"World's most precise" is a relative term. Whichever one came before this was "the world's most precise" analog chip.

[–] vovchik_ilich@hexbear.net 5 points 3 days ago

You can argue about there being a 100-year barrier in analog computing, not in analog chips. Analog computing is old; it became obsolete for a number of reasons, some of them stated in the article, and as a consequence it hasn't really been used all that much in the past century. The claim of the article is that this may change, but I'm not knowledgeable on the topic.