
Poll: Do you care for Ray Tracing "now"?

Total voters: 294
Status
Not open for further replies.
Ray tracing looks very nice, but so do Ultra settings. For the last 20 years I've been using medium and high settings. I can see that Ultra looks very nice, but I would rather buy a cheaper GPU. It's the same with ray tracing: yes, it's very nice, but I'll go without and save the money.

I think everyone can see that ray tracing looks better, but most would rather save money and use a lower setting. That doesn't mean we're stupid or blind or unable to tell the difference.
 
Given time, all the benefits will come to gamers, but I don't see any need to chase them with my money, they'll drop into my lap as they mature in the natural upgrade cycle.

How will the technology mature without investment? If people like you, who realise its importance, aren't willing to support its development, then where's the incentive? Nvidia market hard on RT because they invest in innovation. With the RTX 20 series we saw price increases, but RT is now an industry standard, and the technology is now relatively cheaper to produce through adoption and investment. It's how the "free market" works in an ideal situation: enough PC gamers are chasing innovation, assuming that most people are free thinking.
 
How will the technology mature without investment? If people like you, who realise its importance, aren't willing to support its development, then where's the incentive? Nvidia market hard on RT because they invest in innovation. With the RTX 20 series we saw price increases, but RT is now an industry standard, and the technology is now relatively cheaper to produce through adoption and investment. It's how the "free market" works in an ideal situation: enough PC gamers are chasing innovation, assuming that most people are free thinking.
RT was a standard way before nVidia looked at it.
 
Nah, I've only got one title that supports it (CP2077), and there it's barely a tech demo (shiny puddles, wooh). DLSS is an easier sell, as that's worth something... for the few titles that support it. Almost all of mine are older, however.

I only got a 3090 because a) they were about the only thing in stock when I bought it, and b) I wanted the grunt for 4K. If I ever find someone who wants to swap for a 6950XT, I'd go for it.
 
I care for ray tracing now... but 99% of the games that support it so far mostly use it to patch over the bits where conventional techniques fall down, at a huge performance cost, without utilising what ray tracing can bring to the table. Often the overhead involved in bringing ray tracing to those titles is a large percentage of the performance cost of using ray tracing more widely, without the increase in visual quality to validate it.

We are also at least another generation of cards away from utilising ray tracing to best effect without compromises like visual noise.

People with a bitterly opposed position on it, usually due to fanboyism, are just idiots though. Properly done ray tracing in games will bring a significant increase in visual quality, and it needs these baby steps to get the technology rolling.

It's like HDR to me: I can take it or leave it.

I find the whole situation with HDR annoying, as unlike ray tracing there aren't the same significant technical hurdles; it's more a lack of willingness to innovate and push the technology forward.
 
OK, first up: I've not actually got a ray-tracing GPU, so all my ramblings are based on videos.

My view is much like HDR: when it's done well (and subtly) it can look good and add something more to the game, but when it's done badly it makes the experience worse.
 
Rasterisation is like a rickety Jenga tower right now. Layers upon layers. RT simplifies everything at the expense of computation. I’ve long said that the 20 series - even the 2080 ti - was no more than a demo and I expected it would take three generations for RT to work its way through to the mainstream. That means the 4060 GPU. But games are taking ever longer to develop so I don’t see a preponderance of RT games for some years.
 
I’ve long said that the 20 series - even the 2080 ti - was no more than a demo
I do find it funny that people who say the RX 6000 series is bad at RT never seem to mention that the 2000 series was just as bad at it.

I do think we're still at least one generation away from it being easy enough to run to be worth it. For me, unless I'm comparing screenshots, while playing the actual game I just don't notice it, so it makes no sense to me to use it at the cost of so much performance. Once it's trivial to run, then yeah, go nuts; till then it's not for me.
 
I do find it funny that people who say the RX 6000 series is bad at RT never seem to mention that the 2000 series was just as bad at it.

I called Turing out many times at launch, so much so that @Gregster called me an AMD fanboy and thought I 'hated' Nvidia. I then called out RDNA2 at launch for its similar garbage performance only to find myself labelled as a Nvidia fanboy :cry:
 
Interesting results so far, and kind of what I expected. Well, actually, I was expecting fewer votes for "yes" on this forum tbh.

Just thinking, a better poll would have been to split it by GPU brand owned, as that would have been a rather good insight imo, since generally I find it seems to be AMD owners who don't use nor care for RT, i.e. something like:

Yes (nvidia owner)
Yes (amd owner)
No (nvidia owner)
No (amd owner)
Not yet but in the future (nvidia owner)
Not yet but in the future (amd owner)

But I'll stick with the current format for now.

I called Turing out many times at launch, so much so that @Gregster called me an AMD fanboy and thought I 'hated' Nvidia. I then called out RDNA2 at launch for its similar garbage performance only to find myself labelled as a Nvidia fanboy :cry:
Exactly, I was the same too and also called out how utterly **** DLSS 1 was, but nope, "amd fanboi!!!" :cry:

I'll never understand the "but it's AMD's first go, this is Nvidia's second attempt" stance... Yes, that is true, but it doesn't matter: it's a big selling point for Nvidia, and the features are here to be used in many, many titles now. As a consumer, I couldn't care less who does it better etc., but I want to play said games with said features now, not in 1-2 years' time, which sadly only Nvidia offer atm.

Also, there was some talk of this in another thread recently, but it seems Turing is still ever so slightly better than RDNA 2 in RT games.
 
What are they going to call you when Intel release their GPUs? :D

Well, I've had a little of that already, from an anonymous mod (not an OCuk staff member) showing a little bias, and a couple of general forum members, when I suggested that the 12900K may be a better option than the 5950X for specific video-editing tasks :eek:

Totally ridiculous, as I was intending on a 5950X for a new dev box myself before AMD announced the 3D cache chips. I only went Intel when I found out the 3D cache would only be available on the 8-core part.
 
I called Turing out many times at launch, so much so that @Gregster called me an AMD fanboy and thought I 'hated' Nvidia. I then called out RDNA2 at launch for its similar garbage performance only to find myself labelled as a Nvidia fanboy :cry:
I sympathise. I have found myself being called an AMD fanboy and an Nvidia fanboy, or having it insinuated, many times over the years. The price you pay for trying not to be biased and buying whatever suits you best at the time, I guess.

Grim kept calling me an AMD fanboy because I thought DLSS 1.0 was crap. Dude, it was ******* crap, man, simple as that. I just say it how it is! DLSS 2.0 improved things a lot, it has kept getting better since, and now I like it. He also said I would never buy a 3070 or 3080 and was making it up, as I was too much of an AMD fanboy.

I have defended AMD driver issues a lot over the years, well, up until the black screen issue they had, which took them ages to fix and was a real issue, as I saw it first hand. I have also recommended many AMD cards when people would recommend Nvidia ones just because of the branding. All on this very forum. Not to mention I have owned more AMD GPUs and CPUs, but all that gets forgotten about when I have an opinion that is not pro-AMD, namely the 3080 10GB debate :cry:
 
I voted No.

I'm a very casual gamer now, and I mostly play older games that don't have RTX functionality. I currently have a GTX 1080 and I'm holding on to it as long as possible. It cost about £500 at the time, and that felt like too much to me. It has been a great card, so I don't really regret it, but I don't want to spend that sort of money again because I'm gaming a lot less now. I don't give a flying f about the latest bells and whistles. I don't care about AAA titles.

RTX actually puts me off buying a GPU because I know I'm paying a premium for a feature I don't care about. Cards are expensive enough without adding RTX Tax to the mix.

IMO they had it right at first when they offered GTX for people who didn't want RTX yet. That allowed people to opt-in so RTX can improve to a level where everyone else thinks it's worth it, but also allowed people to upgrade without feeling ripped off.
 