this post was submitted on 07 Jan 2025
Thank you so much, will check that as soon as I can
Edit: that was really useful; it turns out older CPUs don't pair well with Arc GPUs. Here's a summary from the comment section that I found simple and clear:
Yep, that's pretty much the gist of it. Driver overhead isn't anything new, but with the B580 it is so high that it becomes a massive problem in exactly the use case where the card would make the most sense.
Another albeit smaller issue is the idle power draw. Here is a chart (taken from this article)
Because for an honest value evaluation that also plays a role, especially for anyone planning to use the card for a long time. Peak power draw doesn't matter as much imo, since most of us won't push our systems to their limits for the majority of the time. But idle power draw does add up over time. It also, imo, kind of kills it as a product for the second niche use case besides budget-oriented gaming, which would be a homelab setting for tasks like video transcoding.
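To see why idle draw matters for the value equation, here's a rough back-of-the-envelope sketch. All figures are assumptions for illustration, not measured B580 numbers:

```python
# Rough yearly cost of GPU idle power draw.
# All inputs below are assumed example values, not measurements.
idle_watts = 35.0      # assumed idle draw in watts
hours_per_day = 8.0    # assumed daily uptime at idle
price_per_kwh = 0.30   # assumed electricity price per kWh

kwh_per_year = idle_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.1f} kWh/year, costing about {cost_per_year:.2f} per year")
```

With these example numbers the idle draw alone works out to roughly 100 kWh a year; over a multi-year ownership period, a card that idles 20-30 W higher than the competition quietly eats into whatever you saved on the purchase price.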
So as much as I am honestly rooting for Intel, and think they are actually making really good progress in entering such a difficult market, this isn't it yet. Maybe third time's the charm.