this post was submitted on 15 Nov 2024
118 points (99.2% liked)
Futurology
Ok, so if you want to run your local LLM on your desktop, use your GPU. If you’re doing that on a laptop in a cafe, get a laptop with an NPU. If you don’t care about either, you don’t need to think about these AI PCs.
Or use a laptop with a GPU? An NPU seems to be just slightly upgraded onboard graphics.
It’s a power efficiency thing. According to the article, a GPU gets the job done, but uses more energy to get there. Probably not a big deal unless charging opportunities are scarce.
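The rule of thumb in this thread can be sketched as a tiny helper function. This is just an illustration of the trade-off being discussed (all names here are hypothetical, not any real library's API):

```python
def pick_backend(has_gpu: bool, has_npu: bool, on_battery: bool) -> str:
    """Choose where to run a local LLM: favor raw speed when plugged in,
    power efficiency when on battery (e.g. a laptop in a cafe)."""
    if on_battery and has_npu:
        return "npu"  # slower than a GPU, but far more power-efficient
    if has_gpu:
        return "gpu"  # fastest option when energy use isn't a concern
    return "cpu"      # fallback: works everywhere, just slow

# Desktop with a discrete GPU, plugged in:
print(pick_backend(has_gpu=True, has_npu=False, on_battery=False))  # gpu
# Laptop with both a GPU and an NPU, unplugged at a cafe:
print(pick_backend(has_gpu=True, has_npu=True, on_battery=True))    # npu
```

In other words: the GPU wins on speed, the NPU wins on energy per inference, and which one matters depends on whether you're near a wall socket.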