this post was submitted on 25 Sep 2023
559 points (96.2% liked)

Technology

Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec

In an interview with Bloomberg, Dave Limp said that he "absolutely" believes that Amazon will soon start charging a subscription fee for Alexa.

[–] barsoap@lemm.ee 3 points 1 year ago* (last edited 1 year ago)

> but the hardware will continue to improve and get cheaper.

Eh. I mean sure, the likes of A100s will invariably get cheaper because they're overpriced AF, but there isn't really that much engineering going into those things hardware-wise: accelerating massive chains of FMAs (fused multiply-adds) is a much smaller challenge than designing a CPU or GPU. Meanwhile Moore's law is, well, maybe not dead but a zombie. In the past, advances in manufacturing meant a lower price per transistor; that hasn't been true for a while now, and the physics aren't exactly getting easier: fabs are now battling quantum uncertainty in the lithography process itself.
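
To make the FMA point concrete, a rough sketch (plain C, names purely illustrative, not anything Amazon or NVIDIA actually ship): the core of a dense neural-net layer boils down to nothing but a long chain of fused multiply-adds, and that narrow workload is basically all an accelerator has to speed up.

```c
/* Illustrative only: a dense layer, y[i] = sum_j W[i][j] * x[j] + b[i].
 * Every step of the inner loop is a single fused multiply-add; an AI
 * accelerator mostly just has to run huge numbers of these in parallel,
 * which is a far narrower design problem than a general-purpose CPU/GPU. */
#include <math.h>
#include <stddef.h>

void dense_layer(const float *W, const float *x, const float *b,
                 float *y, size_t rows, size_t cols)
{
    for (size_t i = 0; i < rows; ++i) {
        float acc = b[i];
        for (size_t j = 0; j < cols; ++j) {
            acc = fmaf(W[i * cols + j], x[j], acc); /* one FMA per weight */
        }
        y[i] = acc;
    }
}
```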

Where there might still be significant headway to be made is by switching to analogue, but, eeeh. Reliability. Neural networks are rather robust against small perturbations, but it's not like digital systems can't exploit that by reducing precision, and controlling precision is way harder in analogue. Everything is harder there; it's an arcane art.
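
For illustration (a minimal sketch, not from the comment): the digital way to cash in on that robustness is quantisation, e.g. squashing float32 weights down to int8 with a known scale, so the precision loss is controlled and bounded rather than being at the mercy of analogue noise.

```c
/* Illustrative sketch: symmetric per-tensor int8 quantisation.
 * w_q = round(w / scale), scale = max|w| / 127, so the rounding error
 * per weight is at most scale/2 -- a deliberate, reproducible loss of
 * precision, which is what makes it manageable in digital hardware. */
#include <math.h>
#include <stdint.h>
#include <stddef.h>

float quantize_int8(const float *w, int8_t *w_q, size_t n)
{
    float max_abs = 0.0f;
    for (size_t i = 0; i < n; ++i)
        if (fabsf(w[i]) > max_abs) max_abs = fabsf(w[i]);

    float scale = max_abs / 127.0f;
    if (scale == 0.0f) scale = 1.0f;        /* all-zero tensor: avoid div by zero */

    for (size_t i = 0; i < n; ++i)
        w_q[i] = (int8_t)lrintf(w[i] / scale);

    return scale;   /* keep the scale to dequantise later: w is roughly w_q * scale */
}
```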


tl;dr: Don't expect large leaps, especially not multiple ones. This isn't a noughties "buy a PC twice as fast at half the price two years later" kind of situation; AI accelerators are silicon like any other, and they already make use of all the progress we made back then.