
What do gamers actually think about Ray-Tracing?

RT was never going to run on low-end hardware; the absolute minimum is a PS5-class GPU, and in today's money, what was £300 then is not £300 now.
That's correct, but it means games still have to be optimised to run well on cheaper hardware or they simply won't sell. It's as simple as that. RT is fancy, but the average Joe isn't going to upgrade their machine to run it; it's not worth it for such gamers. And that is where the huge majority of the market resides, hence they decide whether a game is a success or a flop, not enthusiasts. A lot of them have money to spend on games but simply don't find it worthwhile to spend more, and that's that.
 
Also, this is a survey of viewers of an enthusiast PC hardware channel. People who care enough to watch hardware reviews are more likely to have better hardware and to be more invested in visuals.

The issue is that enthusiasts on forums are only a minuscule fraction of gamers overall. This very forum is one of the biggest tech forums in the world but is lucky to have 5,000 concurrent users online, and many of them are not talking about games or graphics. Even if you add all the tech forums, Reddit, etc., you are talking maybe tens of thousands of gamers. It's an echo chamber of people interested in hardware, talking about hardware and wanting to justify an upgrade. Yet we have over 100 million PC gamers on Steam alone.
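To put the proportion being described into numbers (the population figures are the rough estimates from the post above, so treat them as illustrative, not measured):

```python
# Rough, illustrative estimates taken from the post above.
forum_enthusiasts = 50_000      # "tens of thousands" across forums, Reddit, etc.
steam_pc_gamers = 100_000_000   # "over 100 million PC gamers on Steam alone"

share = forum_enthusiasts / steam_pc_gamers
print(f"enthusiast share of Steam's PC gamers: {share:.3%}")  # about 0.05%
```

Even if those guesses are off by an order of magnitude, the forum crowd remains a rounding error next to the overall market, which is the point being made.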

Look at games such as LoL or GTA Online: games which make billions and hardly seem to care about graphics. Yet how many in the graphics card sub-forum are waxing lyrical about playing such games? Probably none.
And yet we have games that aren't pure raster anymore and do ray tracing without letting you opt out.
 

Yet most of the games which make up most of the PC industry's revenue are not RT-only. They have optional RT. All you are doing is cherry-picking one or two AAA single-player titles which for the most part make peanuts in gaming revenue.

WoW, LoL, DOTA, ES:O, Fortnite, etc. are designed to run on relatively slow cards, and they make tens of billions of dollars between them. RT will at best be optional, not mandatory, for years.

All have cartoony styles which scale down to even integrated graphics.

I know far more people in the real world who play those games, who haven't a clue what the latest PT-enhanced game is. They probably spend more on those games than many who buy expensive PC rigs.

Most people I know who played Cyberpunk played it on mainstream cards, with either RT off or set to very low levels. I finished most of the game on a Pascal card with no RT on. Even with my RTX 3060 Ti, the game was playable with RT reflections on, but after the latest update it really can't keep up with even that one setting enabled. That's with a Ryzen 7 7800X3D, too.

Don't assume an enthusiast forum is representative of most gamers.

If people want RT/PT to be mandatory in the biggest games in the world, then the mainstream hardware simply has to be better. Blaming game devs for all this is not fair when Nvidia and AMD repeatedly think less than 20% improvements for years are fine.
 
Yet most of the games which make up most of the PC industry's revenue are not RT-only. They have optional RT.(...)

Why does anyone bother making PC games? Consoles make more money. Why does anyone bother making console games? Mobile makes more money... Agnes from down the street has a Chromebook with Steam installed, so surely game devs should just pack up and go home, since their game won't run on her laptop.

It's good that our technical industries have a healthy appetite for not milking the entirety of the proletariat, and continue to drive innovation forward. Otherwise we'd still be playing Pac-Man :rolleyes:
 
Yes, the numbers are plain to see: 26% were in between, enabling it only in some games. That doesn't exclude the possibility that they simply don't find it worth it in the games where they leave it off. Because they are on the fence and we don't have enough data to judge which way they lean, we can exclude them from both sides. We are left with 52% who don't find it worth it, 22% who do, and at best 26% undecided. Out of all the numbers, only 15% were actually able to run it on full settings. That's the proper interpretation of these numbers, imho.
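The bucket arithmetic above can be sketched out; the percentages are the ones quoted in this thread, and the "worth it / not worth it / undecided" grouping is this post's interpretation, not the survey's own categories:

```python
# Survey buckets as cited above (percent of respondents).
always_on = 22      # find RT worth enabling
never_on = 52       # don't find RT worth enabling
in_between = 26     # enable RT only in some games (treated as undecided here)
full_settings = 15  # able and willing to run RT at full settings

assert always_on + never_on + in_between == 100

# Dropping the ambiguous middle bucket from both sides, the split among
# the remaining "decided" respondents is:
decided = always_on + never_on
print(f"worth it:     {always_on / decided:.0%}")  # about 30%
print(f"not worth it: {never_on / decided:.0%}")   # about 70%
```

Whether that middle 26% can fairly be discarded is exactly what later replies in the thread dispute.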

I will go one better: of that percentage, how many of the people participating in said surveys own all of the RT games? I don't think the majority of gamers will have a library invested in all of these games. I have fewer than ten, I think, of the possible list of them all, and out of the full list I doubt double figures are a decent use/showcase of the implementation. All I see is the people saying it's great complaining that it needs more optimising or wasn't done well.
 
Why does anyone bother making PC games, consoles make more money.(...)

Does a game like Valheim or Enshrouded suck because it looks cartoony, then? Fallout: New Vegas looked crap even at launch, but it was a fantastic game.

The important parts of a game are the world-building and gameplay. Graphics are the cherry on top.

But graphics only move forward when the majority can run the effects. ATI brought tessellation to the PC gaming space in the early 2000s, but it was only when the Fermi/HD 5000 series could run such effects at a reasonable framerate that it entered the mainstream. Each generation improved a decent amount until most cards had no issues doing it, and it's now found in most games.

Companies need to make their money back by selling enough copies. Crytek tried pushing things too quickly and nearly went bankrupt. If they had launched Crysis a year or two later, things would have been different IMHO.

If we look at the improvements in the mainstream, they are simply not good enough. The RTX 2060 to RTX 4060 transition was only a 45% improvement over three generations, looking at TPU.
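As a sanity check on that figure: three generations of cards (2060, 3060, 4060) means two generational jumps, so a 45% cumulative uplift implies roughly the per-jump average below (a simple geometric mean; the 45% is the TPU relative-performance number cited above):

```python
# A 45% cumulative uplift over two generational jumps implies the
# average per-generation improvement below (geometric mean).
total_uplift = 1.45
jumps = 2  # RTX 2060 -> RTX 3060 -> RTX 4060

per_gen = total_uplift ** (1 / jumps)
print(f"average uplift per generation: {per_gen - 1:.0%}")  # about 20%
```

That lines up with the "less than 20% improvements for years" complaint made earlier in the thread.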

Looking at how my RTX 3060 Ti handles RT, the performance is not good enough. But I am looking to buy a much better card this year and will probably raise my budget a decent amount.

But many people will be upgrading to that level of performance or worse in 2025; it's really rubbish how the market is going now. The top 10 on Steam makes for rather depressing reading, IMHO.

Nvidia and AMD have to take some blame for this. There was an era when they really valued us as gamers and would give us their best. Now they only give us the scraps and charge the most they can get away with, compounded by Sony also taking the mickey on the console side.

So until they do bother, expect economy-level RT implementations for the immediate future.
 
So let's take someone who doesn't bother turning it on for SOTTR, because what's the point really, but turns it on for Metro Exodus, since there it is very much worth it. Where does that person lie in your "undecided" blanket rule? They don't. They are decided; they just turn it on where it truly is worth it. Speaking with the conviction you do comes across as naive.
That is NOT how statistical analysis works. If you have vague data like that, it's nothing more than noise and gets discarded. Otherwise it's just pure guessing, and you simply can't guess and massage it into what you want it to be; otherwise you'll be tainting the data with your own bias. Hence, that's out the window. The survey should be retaken with better-formed questions, but we can't do that, so we work with what we have. And what we have has one most important data point, which I mentioned: only 15% of the respondents can and are willing to run RT with full details. That's amongst enthusiasts, not even the wider population. Which is one of the reasons for the below, too.
Not even... find a source for a dev blaming RT please.
Dev? It's publishers losing monies on these, if anything, as such AAA games cost even hundreds of millions to make these days and need to sell a similar number of copies to make any income. And if they limit how well the game works to a handful of enthusiasts, that's what they get in return: not much at all. It's as simple as that, really. :)
 
I've been on the train since the ZX Spectrum, and never have I seen such an obsession with a now not-so-new technique that IMO is good, but hasn't had the impact that the hype 6+ years ago would have led me to believe. A game has RT, a few slightly nicer shadows, and the PC games thread is taken over for weeks with pages of screenshots and talk about said shadows :D, whilst great games like Nine Sols don't even have enough interest for their own thread. I suspect many "gamers" on enthusiast forums are very biased towards tech before gameplay, which is OK, but is very different from my non-tech-head gamer friends, for whom gameplay is king. These friends will happily play a game on Switch for convenience rather than stick it on their PS5.

I haven't been overly impressed with all the AI stuff and the direction it's taking us, but I love tech and have supported many new technologies to see where they go, so even though I'm not fully sold on these I will continue to support them and see what happens with all this FG/AI.
 
Does a game like Valheim or Enshrouded suck because they look cartoony then?(...)
It would seem so. Bold of you to assume some people don't play pixels (illuminated by rays) instead of actual games. :)

But graphics only move forward when the majority can run the effects.
It's worse than that: Hollywood in many cases has stagnated with CGI too, on their huge render farms, with a lot of films looking just... bad, often because of cost-cutting. And yet people imagine games can get better than that, it feels. They simply can't. They're games, not movies: designed to be played interactively, not watched like a film at 24 FPS. Something has to give. :)

(...)
Nvidia and AMD have to take some blame for this. There was an era where they really valued us as gamers and would give us the best. Now,they only give us the scraps and charge the most they can get away with.
NVIDIA's CEO likes to claim it's not them, it's physics, and they can't make GPUs cheaper or faster anymore: only AI counts now. ;) Yes, he's pushing a marketing narrative, but he's also right in the sense that it has become very expensive to design and produce these GPUs, as they have already gathered all the low-hanging fruit and there's no easy path to more performance anymore. AMD at least has workable and interesting new designs coming (not this generation, but sometime in the future) which should greatly speed up RT processing, but it's a few years away at least and likely won't be cheap, with expensive L1 cache added to each small cluster of RT cores just to achieve the required vRAM throughput of hundreds of TB/s. Simple GDDR7 is orders of magnitude too slow for that, currently.
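To put rough numbers on that bandwidth gap (the per-pin speed and bus width below are assumed, illustrative figures for a mainstream GDDR7 card, not the specs of any particular product, and the 300 TB/s is a midpoint guess at "hundreds of TB/s"):

```python
# Illustrative GDDR7 card: assumed 32 Gbit/s per pin on a 256-bit bus.
gbit_per_pin = 32
bus_width_bits = 256

bandwidth_gb_s = gbit_per_pin * bus_width_bits / 8  # gigabits -> gigabytes
print(f"card bandwidth: ~{bandwidth_gb_s / 1000:.1f} TB/s")  # ~1.0 TB/s

# Against a claimed need of "hundreds of TB/s" of per-cluster cache traffic:
required_tb_s = 300  # midpoint assumption
print(f"shortfall: ~{required_tb_s / (bandwidth_gb_s / 1000):.0f}x")
```

On those assumptions the external memory is a couple of orders of magnitude short, which is why such a design would lean on large per-cluster caches rather than raw vRAM bandwidth.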
 
And yet we have games that aren't pure raster anymore and do ray tracing without letting you opt out.
It's not about whether they use RT or not; the point is, can they run well on mainstream xx60 cards without too much quality degradation, stutter, artefacts, a horribly blurry image, etc.? If yes, that's fine. Plenty of games have RT effects that run very well even on a 3060, because they're well optimised. But there are also plenty of badly optimised games that just don't run well on mainstream hardware, or look so bad nobody wants to run them on such.

That said, the main reason, which devs very openly talk about, for games being pure RT with no opt-out is cost-cutting on the side of the devs. It's not done to push graphics forward, or for the good of gamers; it's just for the good of devs, who can save monies on development. Nothing to cheer on, from our side of things.
 
It would seem so. Bold of you to assume some people don't play pixels (illuminated by rays) instead of actual games. :)
A lot of my mates who are less interested in the tech side of things play these games, so I don't have a choice. But they are fun!
It's worse than that - Hollywood in many cases stagnated with CGI too(...)

There is much to be said for decent physical effects. Blade Runner 2049 looked amazing on IMAX screens and used a lot of miniatures from Weta Workshop. But the original Blade Runner still holds up.

NVIDIA CEO likes to claim it's not them, it's physics and they can't make GPUs cheaper nor faster anymore - only AI counts now.(...)

I am not against smarter rendering techniques, TBH, and the tech is interesting; how it works out is another question. But what I do have an issue with is this repeated shrinkflation in mainstream hardware (including, it seems, consoles now), which swallows up any potential improvements so they can sell less for more. For the typical enthusiast who spends huge amounts on cards, it doesn't matter as much.

What did it for me finally was when Nvidia literally rebranded the RTX 4060/RTX 4060 Ti as the RTX 4070, the RTX 4080 12GB stunt, and AMD followed suit by rebranding the RX 7700 XT as the RX 7800 XT and the RX 7800 as the RX 7900 XT.

The resulting mainstream cards were frankly a POS. But these very cards are probably the biggest selling cards for both companies.

It's not about them using RT or not, the point is can they run well on mainstream xx60 cards without too much of quality degradation, stutter, artefacts, horribly blurry image etc.?(...)

But even in those cases, the effects are relatively minimal, because the cards still need to render all the other parts. Many here want balls-to-the-walls RT/PT effects with permanent RT. It will happen eventually, but I think we would be there already if the mainstream hardware wasn't so rubbish.

That might have happened if the last two mainstream generations had each brought a 40% improvement, which would have led to a 2X improvement in performance, and probably a 3X improvement by the end of this year. Then, if you added upscaling, frame generation, etc. on top of this, devs could push forward quicker.
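The compounding in that claim checks out as a rough approximation:

```python
# Compounding a hypothetical 40% per-generation uplift:
per_gen = 1.40
print(f"after two generations: {per_gen ** 2:.2f}x")    # ~1.96x, roughly 2X
print(f"after three generations: {per_gen ** 3:.2f}x")  # ~2.74x, nearing 3X
```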

These companies always try to spin the cost of hardware, but strangely their margins seem to keep going up as they shrinkflate more and more.

But it seems Nvidia/AMD are more worried about what their shareholders are saying, so gamers can go and do one. Pretty much the same with some of these AAA gaming companies, but it seems gamers are wising up.
 
Yet most of the games which make up most of the PC industry's revenue are not RT-only. They have optional RT.(...)

WoW, LoL, DOTA, ES:O, Fortnite, etc. are designed to run on relatively slow cards and they make tens of billions of dollars between them.(...)

You're cherry-picking a genre of games yourself -> WoW, LoL, DOTA, ES:O, Fortnite, etc.

There are far more games than those, and I think some have RT to some degree as well.

It's not about them using RT or not, the point is can they run well on mainstream xx60 cards without too much of quality degradation, stutter, artefacts, horribly blurry image etc.?(...)

It's pretty much irrelevant how it runs on low(er)-end hardware from two generations ago, especially when you're going to launch 4-5 years from now, +/-, perhaps even with new consoles around.

"stutter, artefacts, horribly blurry image" ... yeah, like consoles had for plenty of time. I'm sure it was a real treat playing games at under 30fps with crappy upscalers. Or having either no reflections or those that disappear once you slightly move the camera.
 
There is much to be said for decent physical effects.(...)
One of the main reasons Hollywood lies to people about not using CGI and only using practical effects: sure, they use practical effects, but then change or improve them with CGI and special effects. Doesn't stop them lying all the time, though. :)
I am not against smarter rendering techniques TBH and the tech is interesting(...)
Same same, which could be partially fixed by devs optimising their stuff, but that costs monies too, so in effect it ends up as shrinkflation as well: we get fewer FPS for more monies.
What did it for me finally was when Nvidia literally rebranded the RTX4060/RTX4060TI as the RTX4070(...)

The resulting mainstream cards were frankly a POS. But these very cards are probably the biggest selling cards for both companies.
Well, according to Steam and known stores' stats, the main sellers were still the xx60 series, then the xx70s, and the rest sold weakly. The 4080 seems to have sold much less than even the 4090, which isn't that surprising.

But even in those cases,the effects are relatively minimal because the cards still need to render all the other parts.(...)
Even with full PT enabled, the card still has to process a lot of geometry, particle effects, texturing and other bits and bobs, ergo things like CUDA cores still matter a lot. And now we look at the 5000 series and... most cards have only a few % more CUDA cores and RT cores than the respective cards of the 4000 series, whilst having lower clocks too. All they have really improved is AI processing power. On top of that, NVIDIA compared them while ignoring the Super cards of the 4000 series, which is disingenuous marketing BS, as when compared to the Supers they look almost identical in most cases. The only real upgrade seems to be the 5090 over the 4090, but even that is weaker than expected (only 20% more raster expected). The shrinkflation continues, and the slightly lower pricing was frankly necessary, as they couldn't raise it or they would have been laughed at with such a non-upgrade, as seems to be happening so far. It very much reminds me of the 2000 series vs the 1000 series before RT became a thing in games: an expensive non-upgrade.

These companies always try to spin the cost of hardware,but strangely their margins seem to be going up as they shrinkflate more and more.(...)
Capitalism ftw? :) Shareholders are the real market pushers, not enthusiasts and certainly not gamers. They do forget it's our wallets that keep fattening them up, though, and eventually people just get tired of being taken for a ride; that's when things start changing for the better. But then people forget, and it begins again.
 
Is pretty much irrelevant how it runs on low(er) end hardware from 2 gens ago(...)
I am sure Ubisoft thought so too, and then started begging for monies to avoid going bankrupt (which they likely very soon will anyway). Same with many other recently closed studios that were releasing games which specifically avoided targeting the actual mainstream of the market (not just GPUs, but also types of gamers). That's how it ends in capitalism: either you sell a product people can and want to use, or you start another business. Only indie games can survive releasing niche products, not publishers spending hundreds of millions of USD on games that barely anyone buys later.

"stutter, artefacts, horribly blurry image" ... yeah, like consoles had for plenty of time. I'm sure it was a real treat playing games at under 30fps with crappy upscalers. Or having either no reflections or those that disappear once you slightly move the camera.
Games that sold well on consoles had gameplay going for them first, with graphics far further down the line. For many years now, graphics have not sold games, as the market proves over and over again. A hint: look at the weakest, worst 480-720p console, the Switch. It still rules the market, and good games for it sell like hot cakes.
 
That is NOT how statistical analysis works. If you have vague data like that, it's nothing more than noise and gets discarded.(...)

I gave you a quick off-the-cuff example of how your assumption could easily be disproven, and you choose to be condescending about my knowledge of stats and suggest I'm massaging "noise". You have quite the imagination when it comes to trying to prove yourself correct; it's a shame your imagination cannot extrapolate beyond your own experience, as you have fallen into a fallacy of composition.

Here are a few more distinct examples I can think of:
- On in Wukong for atmosphere, off in BF V for fps
- On in Doom Eternal because it's just fast af regardless of settings, off in Diablo 4 because it barely makes a difference to IQ
- On in Alan Wake 2 because that's the only way it should be played, off in Spider-Man because the game moves so fast they don't get a chance to appreciate the visuals

Lots of noise for you to hand-wave away there. Ultimately you don't know the motivations of the people in the survey, but it's a lot more logical to deduce that if they are playing with RT on at all, they think it IS worth it.
Again, are people who play with it on but think it is not worth it supposed to be some kind of masochists? The number is much higher than 15%, but because that is the only cast-iron, black-and-white metric that goes against your narrative, I am sure you won't budge from it. Ultimately, you said "people don't care", but you have at least conceded that 15% of people in a survey (more like 50%) do care.

Dev? It's publishers losing monies on these if anything(...)

Moving those goalposts again, I see. You have no proof that RT negatively affects game sales.
 
I am sure Ubisoft thought so too and then started begging for monies to not go bankrupt(...)

Games that sold well on consoles had gameplay going for them first and graphics far further down the line.(...)

Ubisoft, Sony and the rest were driven by political agenda or just incompetence. Probably they should open studios in China, hiring locally, since it appears that would give them sales regardless of a game's quality.
Graphics has nothing to do with it. Metro Exodus sold over 10 million copies, for instance. And as you say, performance hasn't mattered either.

If games sell well even at 480-720p and low fps, then there's no worry if next-gen games can't run fast on old, outdated hardware.

Ergo, studios should just go for the best visuals if you have the hardware, and whatever is easier for them. If the game is good, it will sell, no matter the lower visuals or its performance on inferior hardware.
 
Games that sold well on consoles had gameplay going for them first and graphics far further down the line.(...)

Graphics improvements have plateaued in the last few years. Ray tracing was the last Next Big Thing, but for a lot of (most?) people it's something they might use some of the time. It's not a must-have.

Ray tracing has been around for 5 or 6 years now and still smashes performance on the highest-end cards at high resolutions. To run it with good FPS, most people will also need to run a DLSS-type solution.

For a fairly mature technology it's still a lottery, even if you won the lottery and can afford a 5090.
 
Ubisoft, Sony and the rest were driven by political agenda or just incompetence.
No big companies like these pursue a "political agenda" for its own sake; it's always chasing after more monies. It seems they chose badly, ergo it's incompetence IMHO.
Probably they should open studios in China, hiring locally since it appears it will give them sells regardless of the game's quality.
Chinese would never let the western leftist agenda appear in their games, though - it's pretty much illegal there.
Graphics has nothing to do with it.
Debatable. It added to it: another incompetent/wrong choice on top of the above (as in, part of the same pattern of incompetence). Both things aimed at a tiny minority of the market, and that was the wrong choice when they were after more monies.
Metro Exodus sold over 10 million copies for instance. And as you say, performance hasn't mattered as well.
It's a well-scalable game that works on a variety of hardware, including the mainstream of its time, whilst not looking like an undercooked potato; most of all, it's actually a good game. Plus it's well optimised, as the devs knew what they were doing. Graphics by itself had very little to do with anything there, IMHO.
If games sell well even at 480-720, low fps, then no worry if next gen games can't run fast on old, outdated hardware.
Switch 2 is coming and... it's still going to be way behind all other handhelds in performance. Most Nintendo consoles were underpowered, yet most sold really well, and games on them still sell really well. Then we have CP2077, which still sold nearly 4 million copies on PS4, another badly outdated piece of hardware: people liked the game, and graphics didn't matter much to them as long as it was playable (and eventually it apparently became playable). As I said many times earlier, if the gameplay is good and the game is good, people will play it, as long as it works sensibly well on their hardware.
Ergo, studios should just go for best visuals if you have the hardware and what's easier for them. If the game is good, the game will sell, no matter the lower visuals or its performance on inferior hardware.
That's a backwards way of looking at it. :) First, nobody cares what's easier for the studio; it's their problem, not ours. They have budgets of hundreds of millions of USD too, and definitely do not lack resources. It's a different matter for indie studios, but they manage just fine. Most of all, studios should make a good game with good gameplay, then see if they can also dress it up nicely for the target audience. Not start with the best possible graphics and pray someone will buy it, like the AAA industry has for a while now, as that just doesn't work at all in capitalism.
 
I gave you a quick off the cuff example(...)
Which is pure guessing, full of your bias and completely irrelevant to my point. You haven't a shred of a clue what the responder meant by clicking that answer unless you're a deity and can read minds; hint: you aren't and you can't. Neither can I, hence that data can't be used for anything sensible.
The number is much higher than 15%
Not in the survey at hand, it's not. It's exactly 15%.
Moving those goalposts again I see. You have no proof that RT negatively affects games sales.
Please, don't hold back, show us the evidence of RT increasing sales. :)
 