You're still seeing ray tracing as a graphics option instead of what it actually is: something that makes game development considerably easier while dramatically improving lighting - provided it replaces rasterized graphics completely. Lighting levels the old-fashioned way is a royal pain in the butt: time- and labor-intensive, slow, and error-prone. The rendering pipelines required to pull it off convincingly are a rat's nest of shortcuts and arcane magic compared to the elegant simplicity of ray tracing.
In other words: It doesn't matter that you don't care about it, because in a few short years, the vast majority of 3D games will make use of it. The necessary install base of RT-capable GPUs and consoles is already there if you look at the Steam hardware survey, the PS5, Xbox Series and soon Switch 2. Hell, even phones are already shipping with GPUs that can do it at least a little.
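To make the "elegant simplicity" point concrete, here's a minimal sketch of the core idea - not taken from any engine, just a toy I'm making up for illustration: a single-sphere scene with one point light, rendered by tracing a camera ray per pixel and a shadow ray toward the light. The `intersect_sphere` and `normalize` helpers, the scene and all the numbers are hypothetical.

```python
# Hypothetical toy example: direct lighting and shadows both fall out of the same
# ray/sphere intersection test -- no shadow maps, baked lightmaps or other tricks.
import math

def intersect_sphere(origin, direction, center, radius):
    """Distance t along the ray to the sphere, or None if it misses."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c              # direction is unit length, so a == 1
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None      # small epsilon avoids self-intersection

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

sphere_center, sphere_radius = [0.0, 0.0, -3.0], 1.0
light_pos = [2.0, 2.0, 0.0]
width = height = 32                     # tiny ASCII "framebuffer"

for y in range(height):
    row = ""
    for x in range(width):
        # Camera ray through this pixel (pinhole camera at the origin).
        px = (x + 0.5) / width * 2.0 - 1.0
        py = 1.0 - (y + 0.5) / height * 2.0
        ray_dir = normalize([px, py, -1.0])
        t = intersect_sphere([0.0, 0.0, 0.0], ray_dir, sphere_center, sphere_radius)
        if t is None:
            row += " "
            continue
        hit = [t * d for d in ray_dir]  # camera sits at the origin
        normal = normalize([h - c for h, c in zip(hit, sphere_center)])
        to_light = normalize([l - h for l, h in zip(light_pos, hit)])
        # Shadow ray: the same intersection routine answers "can this point see the light?"
        shadowed = intersect_sphere(hit, to_light, sphere_center, sphere_radius) is not None
        diffuse = 0.0 if shadowed else max(0.0, sum(n * l for n, l in zip(normal, to_light)))
        row += ".:-=+*#%@"[min(8, int(diffuse * 8))]
    print(row)
```

Scale that same mental model up to triangle meshes, multiple bounces and importance sampling and you get a path tracer. The rasterized equivalent of even that one shadow ray is a whole extra shadow-map pass with its own resolution, bias and aliasing headaches - exactly the kind of shortcut I mean by "rat's nest".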
Game developers have been waiting for this tech for decades, as has anyone who has ever gotten a taste of actually working with or otherwise experiencing it since the 1980s.
My personal "this is the future" moment came with the groundbreaking real-time ray tracing demo heaven seven from the year 2000:
https://pouet.net/prod.php?which=5
I was expecting it to happen much sooner though, by the mid to late 2000s at the latest, but rasterized graphics and the hardware that ran them were improving at a much faster pace. This demo runs in software, entirely on the CPU, which obviously had its limitations. I got another delicious taste of near real-time RT with Nvidia's iRay rendering engine in the early 2010s, which could churn out complex scenes with PBR materials (instead of the simple, barely textured geometric shapes of heaven seven) at a rate of just a few seconds per frame on a decent GPU with CUDA, even in real time on a top-of-the-line card. Even running entirely on the CPU, this engine was as fast as a conventional CPU rasterizer. I would sometimes preach about how this was a stepping stone towards this tech appearing in games, but people rarely believed me back then.
I agree that it's the future, and once it becomes commonplace I'll definitely be interested. I don't think that will happen this generation, but I could be wrong.