What do gamers actually think about Ray-Tracing?

I'll keep saying what I've been saying since the 20th century: give me denser voxels and a completely interactive world.
Gameplay matters more than looks, and that's why Baldur's Gate 3 won the most awards and was waited on for more than 20 years, unlike Incoming, arguably the graphics king of the same year as the first Baldur's Gate.
I'd take destructible terrain over RT. Man alive, I used to love Red Faction haha.
 
I'd take destructible terrain over RT. Man alive, I used to love Red Faction haha.
In this specific case, I've heard a lot of excuses from devs that lighting is an issue with physics, destructible terrain and models, and that RT makes it much easier. Considering we've seen games in the past that could do both sensibly well without issues... I call that at least partially BS. :)
 
If it's a quick bite, fast food style, I don't care much beyond the food being decent - aka phone gaming. But for something more, I care about the location, how it looks, the people serving, etc.
Yes, but you don't ask the waiter for the exact recipe, do you? The same way almost all gamers don't go and ask devs exactly how they made this or that effect - it's all irrelevant to most people, as only the end result matters.
The point was that if graphics didn't matter, then devs wouldn't waste time on graphics above what the Switch or phones do. They do matter, and even console people argue amongst themselves over which one does it better.
Graphics matter, but not in the way you're describing. They're not making or breaking most games unless they're really bad - for example, if you can't see what you're supposed to be seeing, you can't play the game properly. Games are by definition entertainment - ergo, they have to be entertaining first and foremost. How that's done is on the devs. Usually it's achieved through engaging gameplay and design, with sound and graphics sprinkled on top by real artists to make it look and sound right. Without proper gameplay and design you get a tech demo at best, not a game. There are plenty of great games that don't use the latest and greatest graphics tech because it wasn't needed; they still sell well and hardly anyone complains - some of them even have very simple retro-style graphics. Example: look at an indie game like Vampire Survivors, which sold over 2 million copies and counting, even though it's pixel art with a simple idea, yet well designed and very entertaining. By some estimates it sold better than the newest Dragon Age.
Point being, graphics, just like other elements, do elevate the experience for me.
I don't argue this point, it's personal preference. But the moment we step into more general terms, I have a problem with that. :)
I figure it does for a lot of the "I don't care about graphics" bunch, it's just that it only does up to a certain point - which usually means whatever runs decently on the hardware they have.

You can't have a Dacia run like a BMW, Audi or Ferrari.
In the rare case of a game being fun and designed to use RT/PT - not for the free NVIDIA money and marketing, but to actually achieve something relevant (which, as per the HUB video and similar, is only a literal handful of games so far) - sure, people could wish for faster hardware so it works better, because they might only have the choice of good visuals vs horrible visuals (sadly not all games have proper scaling). But such games are still so rare they're pretty much irrelevant. Most games don't use that tech, or fall back to simplified software Lumen at best, or they use it but it adds nothing relevant aside from increasing hardware requirements (and that's besides the gameplay often just being bad).
You also don't print your photos on regular cheap recycled paper; you use photographic paper for a reason.
This is a bad take - I already described above what games are by definition, and it's not graphics. Photographs, on the other hand, you do need to print on photo paper, as that's part of the definition of a paper photograph. Otherwise you'd get, for example, a photocopy, not a photograph.
Anyway, bottom line: if graphics didn't matter, then people wouldn't waste time here or on graphics card forums. They'd just buy the cheapest card, set everything to low at the lowest resolution available and game... which doesn't happen.
It's not black and white. Again, most people don't care about the tech behind the graphics; it has to be functional and fit the game, that's it. That said, the number of enthusiasts like yourself on this forum is so minuscule it wouldn't even register as a blip compared to the number of gamers out there. Most of said gamers don't even know there's such a thing as forums, nor could they be bothered to read one (those who write are an even smaller number). :)
It doesn't matter either way. Stalker 2 is a killer for 8GB cards and yet... here it is.
It is if you care about full-detail graphics. If you cut down the details and play at 1080p, it works well enough - that's what the majority of gamers do. It actually proves my point: enough people don't care about graphics but love the gameplay, so they don't mind dropping the details to enjoy it even on their weaker GPUs.
But it's a killer only if you max out the settings. Since graphics don't matter, people should just lower the settings to their lowest and enjoy. It shouldn't go over 8GB from what I've played - at 1080p at least.
Exactly my point :D
 
This, all day long, and my point still stands - Nvidia has been watering down the GPU stack almost gen on gen for about a decade, and some high-end users who blame AMD users for RT negativity (despite AMD having a tiny share of the market) are blind to the fact that the majority of Nvidia users turn it off because it's not doable.

Case in point: an 8GB 4060 can't run Stalker 2 at 1440p + DLSS because it runs out of VRAM - Nvidia can't even run software RT on a $300 GPU!

8GB isn't enough. "But what do you expect, it's a 4060!" - well, no, the 3070/3070 Ti also have 8GB and can't run software RT, and how much was the street price on the 70s?

Most users in here paid £800-900+ for a 3070/Ti, as the FEs were almost never available in the 5-minute countdown sales.

Good watch on a whole range of GPUs here; earlier in the vid it shows the pitfalls of Lumen (remember - it saves devs time, folks) where raster would have been a better choice, and running both options to facilitate low-end GPUs would have been better imo.

HUB shows 8GB problems too.

Unless Nvidia go 16GB minimum from the 60/70 class and at least 20GB for the 80, RT is going to be a long, long way off for the mainstream. Good luck with the heavily Nvidia-funded titles, guys - maybe they'll bring out one or two more PT titles, and if you're lucky a few more to showcase the 5090 running RTX in all its glory at 4K@60fps. :thumbsup

VRAM does matter - more so for HWRT - and it's extremely cheap considering the fleecing Nvidia have been getting away with, despite VRAM always being the first thing to fall short. If Nvidia's VRAM truly was "plenty", VRAM wouldn't get talked about - at all!
8GB is 1080p territory, not 1440p. Drop some settings and you won't go over it in Stalker 2. I've tried it at 1080p, high settings, epic textures, no FG, and it should stay below that, but you can always mess around with the settings.
BTW, this is with the Steam version which seems to be better. Not sure why Game Pass is crappy, but it is what it is.

As for 16GB being the minimum... I don't know. DirectStorage is still MIA, and 1080p/1440p should need less than 4K. But 16GB on the 5080 could be a tad too little.
 
Yes, but you don't ask the waiter for the exact recipe, do you? The same way almost all gamers don't go and ask devs exactly how they made this or that effect - it's all irrelevant to most people, as only the end result matters.

I don't, but I see the ingredients, what I may be allergic to, etc. I can always look up the recipe online. Ergo, to some extent it matters.

Graphics matter, but not in the way you're describing. They're not making or breaking most games unless they're really bad - for example, if you can't see what you're supposed to be seeing, you can't play the game properly. Games are by definition entertainment - ergo, they have to be entertaining first and foremost. How that's done is on the devs. Usually it's achieved through engaging gameplay and design, with sound and graphics sprinkled on top by real artists to make it look and sound right. Without proper gameplay and design you get a tech demo at best, not a game. There are plenty of great games that don't use the latest and greatest graphics tech because it wasn't needed; they still sell well and hardly anyone complains - some of them even have very simple retro-style graphics. Example: look at an indie game like Vampire Survivors, which sold over 2 million copies and counting, even though it's pixel art with a simple idea, yet well designed and very entertaining. By some estimates it sold better than the newest Dragon Age.

I don't argue this point, it's personal preference. But the moment we step into more general terms, I have a problem with that. :)

Of course games will sell well if they're good. My point, in one of these long posts, was that it matters to me. Fair enough that it doesn't to you.
In general, there are different genres and approaches - different ways to skin a cat, as they say. Some rely on graphics, too, to better push the mood and feel. Not necessarily RT/PT, but other effects that complete the experience. While it varies to different degrees, if visuals had next to no importance, studios, especially big ones, wouldn't waste money investing in them - and this last point is critical. I don't think they're that stupid, do you? :)

In the rare case of a game being fun and designed to use RT/PT - not for the free NVIDIA money and marketing, but to actually achieve something relevant (which, as per the HUB video and similar, is only a literal handful of games so far) - sure, people could wish for faster hardware so it works better, because they might only have the choice of good visuals vs horrible visuals (sadly not all games have proper scaling). But such games are still so rare they're pretty much irrelevant. Most games don't use that tech, or fall back to simplified software Lumen at best, or they use it but it adds nothing relevant aside from increasing hardware requirements (and that's besides the gameplay often just being bad).

I'm not arguing that some games are less than they could be; that goes without saying (even for raster). CP2077 is less than it could be.

This is a bad take - I already described above what games are by definition, and it's not graphics. Photographs, on the other hand, you do need to print on photo paper, as that's part of the definition of a paper photograph. Otherwise you'd get, for example, a photocopy, not a photograph.

Well, as you say... it's not the process that matters, but the result. I want it cheap, so cheap paper! :)) Joking aside, this is the same as above: if the story alone mattered, big studios and cinematographers would only use iPhones, not hugely expensive gear. And so would wedding photographers - a cheap phone or a cheap entry-level camera with a kit lens and camera flash.

It's not black and white. Again, most people don't care about the tech behind the graphics; it has to be functional and fit the game, that's it. That said, the number of enthusiasts like yourself on this forum is so minuscule it wouldn't even register as a blip compared to the number of gamers out there. Most of said gamers don't even know there's such a thing as forums, nor could they be bothered to read one (those who write are an even smaller number).

I don't disagree with that, in general. However, there are still mods to improve the visuals; Skyrim is a prime example of that. It doesn't have to change a lot, but slightly better textures alone show there's demand in that direction.

It is if you care about full-detail graphics. If you cut down the details and play at 1080p, it works well enough - that's what the majority of gamers do. It actually proves my point: enough people don't care about graphics but love the gameplay, so they don't mind dropping the details to enjoy it even on their weaker GPUs.

Exactly my point :D

Cool, then people should stop complaining, get any card and play. 8GB is more than enough. :)
 
8GB is 1080p territory, not 1440p.

Nvidia are locking 8GB users to 1080p; 16GB 4060 users can run at 1440p.

What about this:
[image: 1dVCiUK.png]


Drop some settings and you won't go over it in Stalker 2.
The Ti should have had 12GB minimum. £339 to £390 GPUs running out of VRAM at 1080p, but just knock back the settings?

Again, the 3070/Ti ARE even faster 1080p/1440p GPUs, so dialling back settings to reduce VRAM use and degrade your experience is the answer?

That's indefensible. They shouldn't have to drop settings to fit into 8GB; the 3070/Ti's grunt is more than capable at 1080p/1440p.

I was one of the few Nvidia users who warned this was coming before the VRAM talk got banned.

Wasn't I talking **** when I warned about this happening - back when I complained that my 3070 and 3080 should have come with 16GB because they'd run out of VRAM, and got slaughtered for it, just for complaining about the long-term effects of Nvidia's low-VRAM tactic to make you upgrade? Was I wrong?

As for 16GB being the minimum... I don't know. DirectStorage is still MIA, and 1080p/1440p should need less than 4K. But 16GB on the 5080 could be a tad too little.

16GB isn't going to cut it on the 5080 - but hey, Nvidia don't give a ****, it'll be whatever they want it to be, and almost everyone will make VRAM excuses for Nvidia, like has happened since forever. :thumbsup
 
The Ti should have had 12GB minimum. £339 to £390 GPUs running out of VRAM at 1080p, but just knock back the settings? ...That's indefensible. They shouldn't have to drop settings to fit into 8GB; the 3070/Ti's grunt is more than capable at 1080p/1440p.

I was one of the few Nvidia users who warned this was coming before the VRAM talk got banned.

Yes, just turn down the settings, lol - I remember this well! The debate about the unspeakable word is now plastered all over the content/news space, and as you can clearly see, it is a thing. So, referencing the comments about there being loads of games that now do RT - well, the same number of games are out where the hardware spec (or lack of it) seems to be the pain point.
 
I'm in a curious position. I'm currently playing Cyberpunk, so I would say ray tracing is brilliant... but I'm also aware that I could count on one hand the games that utilise it that well. It's surprising how relatively niche it still is.
 
I'm in a curious position. I'm currently playing Cyberpunk, so I would say ray tracing is brilliant... but I'm also aware that I could count on one hand the games that utilise it that well. It's surprising how relatively niche it still is.

Devs are easing off it as they realise the Cyberpunk model scares more people off than it draws in - every slide says you need a $1,000 GPU to run it at 40 FPS, which is not a good look.
 
Basically, people insist on playing at the highest settings and aren't willing to play on medium or lower to get higher fps. Why should they miss out, right?

So devs have to cap the highest settings down from ultra to medium so mid-range cards can play maxed out too.

Oh, and they might as well do basic RT you barely notice, so mid-range and lower cards can deal with it.

Silly, imo - all they have to do is make sure lower presets are available and optimised.
 
Devs are easing off it as they realise the Cyberpunk model scares more people off than it draws in - every slide says you need a $1,000 GPU to run it at 40 FPS, which is not a good look.
I'm getting around 80fps on a 3080 12GB at 3440x1440 (DLSS is a godsend). But yeah, the performance cost is very high, and with PT it doesn't look like it's getting lower any time soon.

We have to hope the new cards bring that barrier to entry a little further down.
 
It's not just the games. Have you noticed how everyone is pushing bloatware, starting right from Windows, the operating system? The software world in general doesn't care about optimisation, games included. Just like people here keep saying memory is cheap and 16GB should be the minimum VRAM benchmark, I believe game devs are thinking along similar lines - next year the argument will be for 24GB because it's cheap.
 
Nvidia are locking 8GB users to 1080p; 16GB 4060 users can run at 1440p.

What about this:
[image: 1dVCiUK.png]

The Ti should have had 12GB minimum. £339 to £390 GPUs running out of VRAM at 1080p, but just knock back the settings?

Again, the 3070/Ti ARE even faster 1080p/1440p GPUs, so dialling back settings to reduce VRAM use and degrade your experience is the answer?

That's indefensible. They shouldn't have to drop settings to fit into 8GB; the 3070/Ti's grunt is more than capable at 1080p/1440p.

I was one of the few Nvidia users who warned this was coming before the VRAM talk got banned.

Wasn't I talking **** when I warned about this happening - back when I complained that my 3070 and 3080 should have come with 16GB because they'd run out of VRAM, and got slaughtered for it, just for complaining about the long-term effects of Nvidia's low-VRAM tactic to make you upgrade? Was I wrong?

16GB isn't going to cut it on the 5080 - but hey, Nvidia don't give a ****, it'll be whatever they want it to be, and almost everyone will make VRAM excuses for Nvidia, like has happened since forever. :thumbsup

It's a form of planned obsolescence on Nvidia's part.

But this is why we need AMD and Intel to challenge them at the top end so they can't do what they like with the market.
 
It's not just the games. Have you noticed how everyone is pushing bloatware, starting right from Windows, the operating system? The software world in general doesn't care about optimisation, games included. Just like people here keep saying memory is cheap and 16GB should be the minimum VRAM benchmark, I believe game devs are thinking along similar lines - next year the argument will be for 24GB because it's cheap.

I think this is a fair statement. We have seen Windows updates show healthy single-digit gains when optimisation has had some attention, and more than a year ago I recall Nvidia doing a driver update with similar gains, which shows there are healthy pickings if they want to revisit it. Where the distinction flares up, though, is when you start charging a sizeable markup on the products for the 'reasons' we are told - that markup should then cover this 'cheap' memory.
 
Nvidia are locking 8GB users to 1080p; 16GB 4060 users can run at 1440p.

What about this:
[image: 1dVCiUK.png]

The Ti should have had 12GB minimum. £339 to £390 GPUs running out of VRAM at 1080p, but just knock back the settings?

Again, the 3070/Ti ARE even faster 1080p/1440p GPUs, so dialling back settings to reduce VRAM use and degrade your experience is the answer?

That's indefensible. They shouldn't have to drop settings to fit into 8GB; the 3070/Ti's grunt is more than capable at 1080p/1440p.

I was one of the few Nvidia users who warned this was coming before the VRAM talk got banned.

Wasn't I talking **** when I warned about this happening - back when I complained that my 3070 and 3080 should have come with 16GB because they'd run out of VRAM, and got slaughtered for it, just for complaining about the long-term effects of Nvidia's low-VRAM tactic to make you upgrade? Was I wrong?

16GB isn't going to cut it on the 5080 - but hey, Nvidia don't give a ****, it'll be whatever they want it to be, and almost everyone will make VRAM excuses for Nvidia, like has happened since forever. :thumbsup

The lack of VRAM on current-gen cards, the prices and the upselling have been talked to death. From my perspective, seeing how some of these games manage VRAM, it seems the issue has been exacerbated by sloppy work and the lack of DirectStorage.

Coming back to Stalker 2, the claim that it can't run at 1440p with DLSS on 8GB cards is just false - at least for the Steam version with more than 16GB of system RAM. At 4K with medium settings and Performance DLSS it stays under 8GB 99.99% of the time, with the exception of a few spikes. The game does know to lower its usage and doesn't just throw everything into VRAM "because... why not?", as Far Cry did. With Ultra Performance it should stay pretty solidly under 8GB all around. I did another quick run afterwards with Epic settings at 4K Performance and it stayed under 10GB, so a 3080 10GB should be able to handle it from the VRAM perspective.

Of course, that's without FG, which takes extra VRAM.

Also, a patch is due to fix a memory leak... so yeah. 1440p or thereabouts should definitely not be an issue with settings set accordingly.

As for 16GB on the 5080... yeah, maybe it will be problematic, maybe not. Remains to be seen.
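
For anyone who'd rather measure than argue, here's a rough sketch of how you could log VRAM use during a play session and check the numbers above on your own card. It assumes the nvidia-ml-py package (imported as pynvml) and a single NVIDIA GPU at index 0; the file name and the one-second sample rate are just placeholders. Bear in mind it reports the total GPU memory allocated across all processes - the same sort of figure an overlay shows - not a per-game number.
[CODE]
# Rough sketch: log GPU memory use to a CSV once per second while you play.
# Assumes the nvidia-ml-py package (import name: pynvml) and one NVIDIA GPU.
import csv
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first (only) GPU

try:
    with open("stalker2_vram_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "vram_used_mib"])
        start = time.time()
        while True:
            mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values are in bytes
            writer.writerow([round(time.time() - start, 1), mem.used // (1024 * 1024)])
            f.flush()  # keep the log current even if the script is killed
            time.sleep(1.0)  # sample once per second during the play session
except KeyboardInterrupt:
    pass  # Ctrl+C ends the logging run
finally:
    pynvml.nvmlShutdown()
[/CODE]
Start it before loading the game, play through the areas you want to test, hit Ctrl+C when done, and graph the CSV to spot the spikes I mentioned.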
 
The lack of VRAM on current-gen cards, the prices and the upselling have been talked to death. From my perspective, seeing how some of these games manage VRAM, it seems the issue has been exacerbated by sloppy work and the lack of DirectStorage.

Coming back to Stalker 2, the claim that it can't run at 1440p with DLSS on 8GB cards is just false - at least for the Steam version with more than 16GB of system RAM. At 4K with medium settings and Performance DLSS it stays under 8GB 99.99% of the time, with the exception of a few spikes. The game does know to lower its usage and doesn't just throw everything into VRAM "because... why not?", as Far Cry did. With Ultra Performance it should stay pretty solidly under 8GB all around. I did another quick run afterwards with Epic settings at 4K Performance and it stayed under 10GB, so a 3080 10GB should be able to handle it from the VRAM perspective.

Of course, that's without FG, which takes extra VRAM.

Also, a patch is due to fix a memory leak... so yeah. 1440p or thereabouts should definitely not be an issue with settings set accordingly.

As for 16GB on the 5080... yeah, maybe it will be problematic, maybe not. Remains to be seen.

Bit contradictory, old bean - you say it's been talked to death and that sloppy dev work and storage techniques are the reason why.

Then you mention features like FG that take up VRAM, and say 16GB on the 5080 could be problematic.

It's pretty simple: when Nvidia are commanding high premiums for their products, the VRAM should be ample - not just about enough, or "it'll do" for most game scenarios...
 