This is an automated archive made by the Lemmit Bot.
The original was posted on /r/linustechtips by /u/compound-interest on 2025-01-29 18:32:02+00:00.
The new MeganeX Superlight 8K VR headset has a resolution of 3552 x 3840 per eye (7104 x 3840 pixels across both eyes), before any supersampling. And it's not just this one headset: the 1.35-inch, 10-bit Micro OLED panel behind it is shaping up to be an extremely popular option in headsets coming out in 2025.
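To put that resolution in perspective, here's a quick back-of-the-envelope Python sketch comparing raw pixel throughput against a standard 4K monitor. The 144 Hz monitor refresh rate is my own assumption for the comparison, not a number from anywhere official:

```python
# Rough pixel-throughput arithmetic for the MeganeX Superlight 8K panel
# (3552 x 3840 per eye, i.e. 7104 x 3840 combined, at 90 Hz) versus a
# conventional 4K monitor. The 144 Hz figure for the monitor is an
# illustrative assumption.

def pixels_per_second(width: int, height: int, refresh_hz: int) -> int:
    """Raw pixels rendered per second, before any supersampling."""
    return width * height * refresh_hz

# Both eyes combined: 7104 x 3840 at 90 Hz
headset_px_per_frame = 7104 * 3840  # 27,279,360 pixels per frame
headset_throughput = pixels_per_second(7104, 3840, 90)

# A 4K monitor at an assumed 144 Hz for comparison
monitor_px_per_frame = 3840 * 2160  # 8,294,400 pixels per frame
monitor_throughput = pixels_per_second(3840, 2160, 144)

print(f"Headset: {headset_px_per_frame:,} px/frame, {headset_throughput:,} px/s")
print(f"4K@144:  {monitor_px_per_frame:,} px/frame, {monitor_throughput:,} px/s")
print(f"Per-frame ratio:  {headset_px_per_frame / monitor_px_per_frame:.2f}x")
print(f"Throughput ratio: {headset_throughput / monitor_throughput:.2f}x")
```

Even against a 4K monitor at 144 Hz, the headset pushes roughly double the raw pixels per second (and over 3x the pixels per frame), before accounting for the supersampling VR typically needs.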
Don't worry though, these benchmarks wouldn't only apply to VR, as I explain below.
One of the more popular uses for VR at the moment is modding games like Cyberpunk 2077, or using the UEVR injector on some of the games LTT already benchmarks. Since headsets containing this panel are coming out this year, I personally think it would highly benefit consumers to know what the differences between the 4090 and 5090 are at this ultra-high resolution at 90 Hz.
What makes the UEVR injector so interesting is that it injects VR into all Unreal Engine games in the exact same way, so the performance difference between flat and VR shouldn't vary much from game to game (each game isn't being handled by a different modder, so the code and performance impact are similar across the board).
Here's a list of counterpoints you may have and my responses to them.
But VR is still niche why would I want to know that?
It's not about VR performance. It's about stretching the card to its absolute limit. LTT used 4K as a benchmark resolution back when 4K panels were $2,000. It's about creating situations that really differentiate cards, which I feel 4K no longer does as well (and 8K monitors don't add much visible clarity, which is why they make less sense to use). These results should be of interest even to people who never plan to touch VR, because they say interesting things about the hardware itself.
How will these benchmarks affect my purchasing decisions?
Knowing how each card does in a VR mod (like UEVR, or the Luke Ross mods for games like Cyberpunk) at these extremely high resolutions would help predict how the cards will age in future titles. For example, comparing a game's flat ("pancake") benchmarks to the VR-modded version would show how two cards that perform similarly today might diverge once future games push higher-resolution textures. Display resolution and texture resolution aren't a perfect 1:1 comparison, but it still gives interesting insight into pushing the cards to their VRAM limits, and how doing so affects each card's compute capabilities.
These headsets aren't out yet. How would LTT even do this?
For one, I think at least one of the companies showcasing that new Micro OLED panel at CES would have a unit they could send LTT for early access. Alternatively, LTT could supersample a Bigscreen Beyond to reach a resolution very similar to the new Micro OLED panels coming out.
I would imagine the audience buying these new, expensive VR headsets overlaps heavily with the people buying $2k GPUs. Such benchmarks would give the audience unique insight into the gen-on-gen performance difference at a nearly 8K, 90 Hz resolution, using an application that actually benefits from the extreme resolution uplift.