Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec
(www.businessinsider.com)
Eh. I mean sure, the likes of A100s will invariably get cheaper because they're overpriced AF, but there isn't really that much engineering going into those things hardware-wise: accelerating massive chains of `fma`s is a much smaller challenge than designing a CPU or GPU. Meanwhile Moore's law is, well, maybe not dead but a zombie. In the past, advances in manufacturing meant a lower price per transistor; that hasn't been true for a while now, and the physics aren't exactly getting easier: fabs are now battling quantum uncertainty in the lithography process itself.

Where there might still be significant headway to be made is by switching to analogue, but, eeeh. Reliability. Neural networks are rather robust against small perturbations, but it's not like digital systems can't exploit that by reducing precision, and controlling precision is way harder in analogue. Everything is harder there; it's an arcane art.
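The precision point is easy to see for yourself. A toy NumPy sketch (my own illustration, not any vendor's kernel): run the same layer-style matrix product in float32 and float16 and compare. The relative error from halving the precision is tiny compared to the noise a trained network already shrugs off.

```python
import numpy as np

# A random "layer": weight matrix W applied to input x.
rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64)).astype(np.float32)
x = rng.standard_normal(64).astype(np.float32)

# Same dot products at full and half precision.
y32 = W @ x
y16 = (W.astype(np.float16) @ x.astype(np.float16)).astype(np.float32)

# Relative error introduced by dropping to float16.
rel_err = np.linalg.norm(y32 - y16) / np.linalg.norm(y32)
print(f"relative error from halving precision: {rel_err:.1e}")
```

That's the trick digital accelerators already exploit with fp16/bf16/int8 datapaths, and they get it with exactly reproducible results, which analogue can't promise.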
tl;dr: Don't expect large leaps, especially not multiple. This isn't a noughties "buy a PC twice as fast at half the price two years later" kind of situation; AI accelerators are silicon like any other, and they already benefit from all the progress we made back then.