this post was submitted on 25 Oct 2025

technology

[–] sodium_nitride@hexbear.net 5 points 2 days ago (2 children)

Although in theory digital circuits have to deal with quantisation error, the imprecision of analog computing means it faces a similar problem: there is always going to be an error bar. If that error bar turns out to be lower with analog computers, great.
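To make the comparison concrete, here's a hypothetical illustration (not from the paper) of the digital side's error bar: a uniform b-bit quantiser over a fixed range has a worst-case error of half a step, which shrinks exponentially with bit width.

```python
import numpy as np

def quantize(x, bits, lo=-1.0, hi=1.0):
    """Round x to the nearest of 2**bits evenly spaced levels in [lo, hi]."""
    levels = 2 ** bits
    q = (hi - lo) / (levels - 1)  # step size between adjacent levels
    return lo + np.round((x - lo) / q) * q

x = np.linspace(-1.0, 1.0, 1001)
for bits in (2, 8, 16):
    err = np.max(np.abs(quantize(x, bits) - x))
    print(f"{bits}-bit worst-case quantisation error: {err:.2e}")
```

An analog circuit has no quantisation step, but its thermal and component noise plays the same role as `err` here; whichever is smaller wins for a given task.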

As for the AI sigmoid curves, they don't need to be true sigmoids. The point is to clip the output and add non-linearity. The discrete steps aren't a problem either; I've seen AI architectures even for 2-bit models.
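A minimal sketch of that point (my own illustration, not a specific model's code): a piecewise-linear "hard sigmoid" clips and adds non-linearity just like the smooth curve does, and snapping its output to four levels gives a 2-bit activation.

```python
import numpy as np

def hard_sigmoid(x):
    """Piecewise-linear stand-in for a sigmoid: linear in the middle, clipped to [0, 1]."""
    return np.clip(0.25 * x + 0.5, 0.0, 1.0)

def quantize_2bit(y):
    """Snap activations in [0, 1] to 4 levels (2 bits): 0, 1/3, 2/3, 1."""
    return np.round(y * 3) / 3

x = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
print(hard_sigmoid(x))                  # clipped, non-linear output
print(quantize_2bit(hard_sigmoid(x)))   # same shape, on a 4-level grid
```

The clipping is what matters for trainability; the exact curve shape and step count are implementation details.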

But you are correct that digital circuits face fundamental accuracy limits that could, in theory, be bypassed by analog circuits. I don't think analog circuits are bad; I'm quite excited about them. I'm just wary of the overwhelming amount of clickbait from journalists.

> its voltage just is the gradient.

I'm not sure what setup the Peking University experiment uses, but voltage is proportional to the gradient of the current only if you feed the current through an inductor, and the proportionality constant depends on the inductor and the surrounding components.
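For reference, that's just the standard inductor relation from circuit theory (not something taken from the paper):

```latex
% Voltage across an ideal inductor is proportional to the
% time derivative (gradient) of the current through it,
% with the inductance L as the proportionality constant:
v(t) = L \, \frac{\mathrm{d}i(t)}{\mathrm{d}t}
```

So the "voltage is the gradient" trick only holds up to a component-dependent scale factor, which is why the surrounding circuitry matters.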

I think I'll try to read the paper once I'm done with exams, because I'm curious about the Peking researchers' technique.

[–] sodium_nitride@hexbear.net 5 points 2 days ago
[–] yogthos@lemmygrad.ml 4 points 2 days ago

Sure, analog circuits are imprecise, but that isn't necessarily a showstopper for applications where you're tuning something like a neural network. The human brain is also extremely noisy and imprecise, yet it's far more energy efficient than our digital computers at many tasks. The key idea is that it's optimized for a different use case than a digital chip.

I agree that the article is sensationalist, but the tech itself is useful, and if they've figured out how to get noise levels down enough to make these chips work, that really could be revolutionary. The paper will definitely have more interesting info; I was just too lazy to track it down.