[–] BmeBenji@lemm.ee 136 points 9 months ago (4 children)

4K is overkill enough. 8K is a waste of energy. Let’s see optimization be the trend in the next generation of graphics hardware, not further waste.

[–] Zink@programming.dev 52 points 9 months ago (2 children)

Yeah. Once games are rendering 120 fps at a native 6K, downscaled to an amazing-looking 4K picture, then maybe you could convince me it was time to get an 8K TV.

Honestly most people sit far enough from the TV that 1080p is already good enough.
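The "far enough that 1080p is good enough" claim can be sanity-checked with the commonly cited rule of thumb that 20/20 vision resolves roughly 60 pixels per degree. A rough back-of-envelope sketch in Python (the function name, the 65-inch/10-foot scenario, and the 60 ppd threshold are illustrative assumptions, not from the thread):

```python
import math

def pixels_per_degree(h_res: int, diag_in: float, dist_ft: float,
                      aspect: float = 16 / 9) -> float:
    """Angular pixel density for a flat 16:9 screen viewed head-on."""
    width_in = diag_in * aspect / math.hypot(aspect, 1)  # screen width from diagonal
    pitch_in = width_in / h_res                          # width of a single pixel
    deg_per_px = 2 * math.degrees(math.atan(pitch_in / (2 * dist_ft * 12)))
    return 1 / deg_per_px

# 65-inch TV at 10 feet: even 1080p is past the ~60 ppd limit of 20/20 vision
for res in (1920, 3840, 7680):
    print(res, round(pixels_per_degree(res, 65, 10)))  # 71, 142, 284
```

By this estimate the extra pixels of 4K, let alone 8K, are only resolvable if you sit much closer than a typical couch.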

[–] frezik@midwest.social 12 points 9 months ago (2 children)

I find 4K is nice on computer monitors because you can shut off anti-aliasing entirely without leaving jagged edges behind. 1440p isn't quite enough to get there.

Also, there are some interesting ideas among emulator writers about using those extra pixels to create more accurate CRT-like effects.

[–] Zink@programming.dev 5 points 9 months ago

Oh yeah, I have read some very cool things about emulators being able to simulate the individual phosphors at 4K resolution. I have always been a sucker for clean, crisp pixels (that’s what I was trying to achieve on the shitty old CRT I had for my SNES), so I haven’t jumped into the latest CRT shaders myself.
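For a sense of what those phosphor-simulation shaders are doing, here is a toy sketch of the idea only; real shaders (CRT-Royale and friends) also model bloom, curvature, and beam falloff, and this function and its parameters are made up for illustration:

```python
import numpy as np

def aperture_grille(frame: np.ndarray, scale: int = 6) -> np.ndarray:
    """Toy CRT effect: nearest-neighbor upscale, then a repeating R/G/B
    vertical-stripe mask plus faked scanline gaps. `frame` is HxWx3,
    floats in [0, 1]; `scale` should be a multiple of 3."""
    big = frame.repeat(scale, axis=0).repeat(scale, axis=1)
    _, w, _ = big.shape
    # each output column mostly passes one channel, like a phosphor stripe
    mask = np.full((w, 3), 0.25)
    mask[np.arange(w), (np.arange(w) // (scale // 3)) % 3] = 1.0
    big = big * mask[None, :, :]
    # darken the last row of each scaled pixel block to suggest scanlines
    big[scale - 1 :: scale, :, :] *= 0.4
    return big.clip(0, 1)

# a 256x224 SNES frame at scale=6 becomes 1536x1344 -- enough output
# pixels per source pixel for the stripes to read as individual phosphors
```

The point of the 4K (or higher) output is in that last comment: the effect only works when each source pixel maps to enough physical pixels to draw the mask.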

[–] Holzkohlen@feddit.de 1 points 9 months ago (1 children)

But anti-aliasing needs far less performance. And you need to mess about with scaling on a 4K monitor, which is always a pain. 1440p for life, IMHO.

[–] frezik@midwest.social 2 points 9 months ago

Anti-aliasing also softens the image a bit. Image quality is better if you can leave it off.

[–] minibyte@sh.itjust.works 3 points 9 months ago* (last edited 9 months ago) (1 children)

I’m at THX spec, 10 feet from an 85-inch. I’m right in the middle of where 1440p and 4K are optimal, but with my eyes I see little difference between the two.

I’d settle for 4K @ 120 fps, locked.

[–] Zink@programming.dev 2 points 9 months ago

I’m 6-8 feet from a 65, depending on seating position and posture. It seems to be a pretty sweet spot for 4K (I have used the viewing distance calculators in the past, but not recently enough to remember the numbers). I do wear my glasses while watching TV too, so I see things pretty clearly.

With games that render at native 4K at 60 fps over an uncompressed signal, it is absolutely stunning. If I try to sit like 4 feet from the screen to get more immersion, it starts to look more like a computer monitor than a razor-sharp HDR picture painted right on the OLED.

There is a lot of quality yet to be packed into 4K. As long as “TV in the living room” stays in a format similar to now, I don’t think 8K will benefit people. It will be interesting to see whether all nice TVs just become 8K one day, like with 4K now, though.
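The "uncompressed signal" point has concrete numbers behind it. A back-of-envelope sketch (active pixels only; a real HDMI link adds blanking and line-encoding overhead on top, and the function name here is just for illustration):

```python
def raw_gbps(w: int, h: int, fps: int, bits_per_channel: int = 10) -> float:
    """Uncompressed RGB (4:4:4) video bandwidth, active pixels only."""
    return w * h * fps * bits_per_channel * 3 / 1e9

print(raw_gbps(3840, 2160, 60))   # ~14.9 Gbps: fine on HDMI 2.1 (48 Gbps)
print(raw_gbps(3840, 2160, 120))  # ~29.9 Gbps
print(raw_gbps(7680, 4320, 120))  # ~119.4 Gbps: needs DSC even on HDMI 2.1
```

Quadrupling the pixel count for 8K pushes high-refresh signals past what current links carry without compression, which supports the "a lot of quality yet to be packed into 4K" point.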

[–] FinalRemix@lemmy.world 28 points 9 months ago (1 children)

*monkey's paw curls*

Granted! Everything's just rendered internally at 25% scale with massive amounts of TAA.

[–] SailorMoss@sh.itjust.works 13 points 9 months ago

He said next-gen, not current-gen. :/

[–] flintheart_glomgold@lemmy.world 7 points 9 months ago

For TV manufacturers, the 1K/4K/8K nonsense is a marketing trap of their own making, but it also serves their interests.

TV makers DON'T WANT consumers to easily compare models or understand what makes a good TV. Manufacturers profit mightily by selling crap to misinformed consumers.

[–] bruhduh@lemmy.world 4 points 9 months ago* (last edited 9 months ago) (1 children)

Divide the resolution by 3, though; current-gen upscaling tech can give you that much: 4K = upscaled 720p and 8K = upscaled 1440p.
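That divide-by-3 figure matches DLSS's "Ultra Performance" preset, which renders one third of the output resolution per axis. Spelling out the arithmetic:

```python
# "divide by 3" is per axis, so it's really 9x fewer pixels rendered
targets = {"4K": (3840, 2160), "8K": (7680, 4320)}
for name, (w, h) in targets.items():
    print(f"{name}: internal render {w // 3}x{h // 3}")
# 4K: internal render 1280x720   (720p)
# 8K: internal render 2560x1440  (1440p)
```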

[–] AngryMob@lemmy.one 4 points 9 months ago (1 children)

Can doesn't mean should.

720p to 4K using DLSS is okay, but you start to see visual tradeoffs, strictly for the extra performance.

To me it really shines at 1080p to 4K, where it is basically indistinguishable from native for a still-large performance increase.

Or even 1440p to 4K, where it actually looks better than native with just a moderate performance increase.

For 8K the same tiers hold: go for better-than-native, or match native visuals. There's no real need to drop below native just to get more performance; at that point the hardware is mismatched.
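For reference, those tiers line up with DLSS's commonly documented render-scale presets. A quick sketch mapping them to a 4K output (the fractions are NVIDIA's published defaults; the quality judgments in the comments are the commenter's, not official claims):

```python
# DLSS preset render scales (fraction of output resolution per axis)
modes = {
    "Quality": 2 / 3,            # 1440p -> 4K: "better than native"
    "Balanced": 0.58,
    "Performance": 1 / 2,        # 1080p -> 4K: "basically indistinguishable"
    "Ultra Performance": 1 / 3,  # 720p  -> 4K: visible tradeoffs
}
out_w, out_h = 3840, 2160
for name, s in modes.items():
    print(f"{name:17} renders at {round(out_w * s)}x{round(out_h * s)}")
```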

[–] bruhduh@lemmy.world 1 points 9 months ago* (last edited 9 months ago)

Devs already use it instead of optimization. What makes you think bosses won't push it even further, given deadlines and quarterly profits? Immortals of Aveum is an example, and it's not even the end of the generation, only halfway. (I agree with you from a user standpoint, though.)