The RTX Scam

In late 2018, NVIDIA launched RTX (Ray Traced Texel eXtreme) with their 20 series cards. RTX is supposed to be image rendering using real-time light path tracing, instead of the pre-baked shadows and lighting effects of rasterization, which had been the case until then (and actually still is, more on that later). NVIDIA's 20 series cards introduced what they called RT Cores, special cores designed to accelerate this computation; existing CUDA cores could not be reused for it. This meant anybody wanting to enable RT in their games would need to buy a new GPU from the Big Green.
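To make the distinction concrete, here is a minimal sketch (my own illustration, not NVIDIA's implementation) of the one operation RT cores accelerate in hardware: bouncing a ray off a surface and testing it against scene geometry. Rasterization never casts such rays; it approximates reflections with screen-space lookups or pre-baked cube maps.

```python
# The core operation RT hardware accelerates: ray-primitive intersection.
# Vectors are plain (x, y, z) tuples to keep the sketch dependency-free.

def reflect(d, n):
    """Mirror a direction d about a unit surface normal n: r = d - 2(d.n)n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2 * dot * ni for di, ni in zip(d, n))

def ray_sphere_hit(origin, direction, center, radius):
    """Return True if the ray hits the sphere (quadratic discriminant test)."""
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    return b * b - 4 * a * c >= 0

# A camera ray strikes a floor whose normal points up ...
bounce = reflect((0.0, -1.0, 1.0), (0.0, 1.0, 0.0))   # -> (0.0, 1.0, 1.0)
# ... and the bounced ray is then tested against scene geometry.
print(ray_sphere_hit((0.0, 0.0, 0.0), bounce, (0.0, 5.0, 5.0), 1.0))
```

A real renderer does this for millions of rays per frame against millions of triangles, which is exactly why dedicated intersection hardware was pitched as necessary.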

Almost 5 years later, what is the impact of Ray Tracing on a typical gamer's experience while actually playing? The answer to this question matters more than the number of games adopting the technology, which stands at about 80 odd titles among the thousands released since the first RTX card launched. (We can set aside the debate on gameplay, which arguably matters more than graphical fidelity to old-school gamers like me, because graphical fidelity is precisely what we are discussing here.)

Disclaimer: This article does not aim to discredit the research in path tracing to render images. This is to find out whether it makes sense in the current retail space of video games.

Before proceeding with my observations, let's first see what others have observed, as per one of several such YouTube videos:

Control


RT on simply makes reflections more pronounced compared to RT off, which renders the floor as a blurry mess. However, while actually playing, the player is more interested in finding the supernatural beings than in noticing the window's reflection on the floor. Further, some may feel the reflection in the right image is overdone (see the next example).

Watch Dogs Legion

Again, RT on seems to simply make reflections shinier, which older titles like GTA V or Red Dead Redemption 2 handled through reflection-quality settings and screen-space reflections. The RT implementation on the right actually seems overdone, with the road almost appearing to be made of glass.

Note these two examples from the above video which we will come back to later. Watch the video in full for more examples.

Saints Row the Third: Remastered

The first suspicion I personally had about RT's questionable improvement in visual fidelity was while playing Saints Row the Third Remastered, which was given away for free on the Epic Games Store at one point. I was surprised at how good the reflections looked in the rain-soaked streets of Steelport, while being completely rasterized. Especially impressive were the reflections in small puddles scattered across the ground, with neon lights perfectly mirrored. Though reflections are not all that Ray Tracing theoretically has to offer, most early demos were shiny and stuck to this context, e.g. Metro Exodus or Battlefield's European city level. Later, of course, Cyberpunk 2077 implemented path tracing going beyond simple reflections, and it might still be one of the best RT implementations yet, but that still does not absolve most of the 80 titles mentioned above, which have implemented RT only in the context of reflections, as is evident from the video linked above.

Hitman 2

The second suspicion came while playing Hitman 2 (2018). The Glacier engine has some really great reflections and visual fidelity. The Isle of Sgàil, Hawke's Bay, Mumbai, Paris, and Thailand levels in particular offer immersive fidelity while the player is finding ways to avoid detection and execute the targets for a Silent Assassin rank (or similar). Hitman is one of the few games where fidelity in the immediate surroundings, including objects and reflections, matters greatly for immersion, and IO Interactive delivered that even on low-end hardware. I ran it at 1440p 60 fps on Linux Steam/Proton with a 1070 Ti, and there is really not much difference if one plays with an older card not capable of RT. One would not miss out on anything that Hitman, as a genre-defining game, has to offer.

Ghostwire: Tokyo

My suspicion cemented further recently while playing Ghostwire: Tokyo. This is a 2022 game, which arguably should be too new for a 1070 Ti. But I was wrong! Below are some screenshots from playing the game on that card, which does not support RT, so this is purely rasterized rendering (though Ghostwire does support RT too).

Image 1 shows the shadows rendered around the phone booth. The others are more representative of reflection rendering in the rain-soaked Shibuya ward of Tokyo, with its neon boards and lights (this, along with Cyberpunk's Night City, is probably the best location to showcase any RT implementation).

Image 6 in particular includes the reflection of the neon billboard on the right, which displayed an animated pattern in the game, and the reflection stayed in sync with the billboard's motion.

Again, all the screenshots use rasterized rendering, without any RT. Not particularly related, but the gameplay is at 1440p 60 fps, which the GTX 1070 Ti is perfectly capable of rendering while undervolted to keep temperatures in check, with 8 GB VRAM, the same amount offered by a 3070 Ti two generations later.

Now let's go back to the two examples from YouTube above, Control and Watch Dogs. Compare their “RT on” screenshots with those of Ghostwire with RT off.

As we can see, Ghostwire: Tokyo achieves visual fidelity in rasterized mode similar to Control or Watch Dogs with RT on.

Of course, a scene using RT will have more accurate reflections and lighting (duh, we're not discounting the theory), but at what cost, and for what benefit? There is no denying that the visual fidelity above in Ghostwire is good enough for most gamers while playing the game, comparable to RT enabled on a capable card. A well-done rasterized game can give as immersive an experience as the shiny RT that needs more VRAM, consumes more power, runs hotter, and ultimately adds no tangible benefit to the game over the old technology. So if a game can offer this much fidelity with rasterization alone, why do we need RT, which is still far from efficient even 5 years after launch?

Is RT only so that NVIDIA can sell more cards?

And this is not generalising based on one game only. As mentioned, Saints Row the Third and Hitman, too, have great rasterized reflections. Suffice it to say that immersive visual fidelity does not depend on RT; it can be had with rasterization too, provided the devs use the right engine and design the lighting and effects well. My observations also cover RT rendering in Cyberpunk 2077 and Bright Memory: Infinite, two of the better examples in the RT domain, and even games like A Plague Tale: Requiem and Metro Exodus, where the setting does not leave much demand for RT.

Recently I came across another YouTube video discussing why no one is interested in new video cards. Based on my experience with a non-RT card from 6 years back in at least 3 titles, I think we know the answer. Companies like NVIDIA have no new hardware revolution going on that can considerably impact a gamer's immersion. The shortage in the last 2-3 years was due to crypto mining, not RT demand. Cards from 5-6 years back can play rasterized games at 1440p/1080p 60 fps just fine. NVIDIA probably saw this coming, so they tried to push RT as the next big revolution in gaming.

RT is more like creating a non-existent problem, then offering a solution.

5 years later, less than 2% of games launched use RT, and even those that do fare pretty well on rasterization (hardly any game requires RT to run, by the way, so RT remains a nice-to-have addition, not a requirement for gaming). Meanwhile, NVIDIA did not increase VRAM on the 70 series cards for 3 generations, until the recently launched 4070.

As for the 40 series, amusingly, NVIDIA is now pushing DLSS 3, a form of AI upscaling with frame generation, on these cards. Seems like they themselves have moved on from RT to more traditional problems in graphics rendering (outliers like Cyberpunk's path-traced “Overdrive” mode notwithstanding). They needed something “new” to sell the 40 series cards, but thankfully we do not see too many suckers this time.

There are two more “banes” for new graphics cards. The first is the 1080p-1440p “sweet spot” for gaming resolution. Most people don't need 4k, because 1440p gives good enough fidelity with much reduced aliasing, whether viewed on a 1440p or 1080p (even better) monitor at 2 feet, or on a 4k TV at 6-8 feet. Try playing a game on a 4k TV at 1080p and see if you can notice the difference at 6 ft compared to 4k or even 1440p. This is also why the heavily advertised “4k” consoles, the Series X and PS5, often dynamically drop the render resolution of even their “quality” modes to around 1440p. Gamers would rather have locked framerates like on the consoles of yore, so Microsoft and Sony are prioritising them (even a consistent 30 fps without a single frame drop may be better to some gamers than 60 fps with dips to 50).
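The viewing-distance claim is easy to sanity-check with some angular-resolution arithmetic. Pixels per degree (PPD) measures how much detail a screen actually presents to the eye, and roughly 60 PPD corresponds to 20/20 acuity. The 55-inch screen size below is my own illustrative assumption, not a figure from this article:

```python
# Back-of-the-envelope check of the resolution "sweet spot": angular
# pixel density (PPD) for a flat 16:9 screen viewed head-on.
import math

def pixels_per_degree(h_pixels, diagonal_in, distance_in, aspect=16 / 9):
    """PPD across the horizontal field of view at distance_in inches."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)   # screen width
    fov_deg = math.degrees(2 * math.atan(width_in / (2 * distance_in)))
    return h_pixels / fov_deg

# An assumed 55-inch TV viewed from 6 feet (72 inches):
for name, px in [("1080p", 1920), ("1440p", 2560), ("4k", 3840)]:
    print(f"{name}: {pixels_per_degree(px, 55, 72):.0f} PPD")
```

On these assumptions, 1080p already lands around 52 PPD from the couch, close to the 20/20 threshold, while 4k's ~104 PPD is detail the eye can barely resolve at that distance, which is consistent with the sweet-spot argument above.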

We have almost arrived at the final resolution in video games

Occasional frame drops at 60 fps will be with us in the near future too, because that number is heavily dependent on CPU cache, and only recently have we started exploring CPU architectures with larger caches (i.e. AMD's Ryzen X3D parts, though we are not quite there yet because the thermal challenges remain to be solved) and performance/efficiency cores (i.e. Intel Alder Lake onwards); such “better” CPUs have not yet penetrated the entry/mid market.

The second “bane” for new card sales is AMD's FSR, which also works on NVIDIA cards as old as the 10 series. If a game is not well optimised, FSR can do wonders. I have used FSR occasionally in Ghostwire with my 1070 Ti, and it drastically reduced power consumption without sacrificing too much fidelity (I play at 1440p on a 1080p monitor, which might help, as I don't need AA). Though, FSR is ultimately a stopgap for poorly optimised games. Considering the recent Resident Evil titles, or God of War, a well-designed and optimised game will look and play great without needing FSR, let alone RT.

We don't need RT. Immersion depends on the game's design, engine, and optimisation, not on NVIDIA's proprietary technology. So the next time someone tries to sell you an RT card, remember to ask what exactly RT offers that you would miss in rasterization during actual gameplay.
