
Ray Tracing on an AMD Graphics Card beats Nvidia

That's what happens when game developers turn what should be fun to play, with an actual sense of accomplishment, into a means of forcing boring grinding chores on players in an attempt to make money on micro-transactions.

Any game that is built around micro-transactions is doomed to fail and has no chance of being a good game, as it is focused on taking away enjoyment and convenience from players to force them to spend. The reason Warframe is such a success is that it is a game first and foremost, and the micro-transactions are optional in the true sense of the word: essentially just an extra layer on top, rather than the centre or core focus of the game.

Publishers today just want undiscerning players who will happily chase their own tail and jump from game #1 to #2 to #3 on cue :|
 
Personally I find the whole thing a bit meh. Reflections are nice and all, but there is a whole lot more to ray tracing than that, and plenty of other lighting effects I'd rather have ahead of better reflections. Games like The Division already do reflections well enough for now.

Same here. In BFV they still need to supplement ray tracing with fake reflections, so do we really need it yet? I'd say no, especially when you look at the cost increase Nvidia has put on it. It wouldn't be so bad if there were alternative GPUs; there are, but no longer high end, so if you want to go high end, Nvidia has forced it on us along with the additional cost.
 
Come on, we all know that if this was REAL RT, performance would be crap.

It's more smoke and mirrors again.

I mean the 2000 series has dedicated hardware and yet performance takes a large hit. You really think this demo is using true RT and running lovely jubbly on both vendors?

Unlikely.
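Some back-of-the-envelope arithmetic supports this. The frame rate, bounce count and shadow-ray count below are purely illustrative assumptions, but they give a feel for how quickly "real" ray tracing eats through a ray budget at 4K:

```python
# Rough ray budget for fully ray-traced rendering at 4K/60.
# All the per-pixel figures here are illustrative assumptions.
pixels = 3840 * 2160                     # one 4K frame: 8,294,400 pixels
fps = 60
primary_rays_per_sec = pixels * fps      # one primary ray per pixel

# Assume 2 bounces per path, plus 1 shadow ray at each of the 3 hit points.
bounces = 2
shadow_rays_per_path = bounces + 1
rays_per_pixel = 1 + bounces + shadow_rays_per_path

total_rays_per_sec = primary_rays_per_sec * rays_per_pixel

print(f"{primary_rays_per_sec:,} primary rays/s")  # 497,664,000 primary rays/s
print(f"{total_rays_per_sec:,} total rays/s")      # 2,985,984,000 total rays/s
```

Even at a single sample per pixel with only two bounces, that is around 3 billion rays per second, a large slice of the roughly 10 gigarays/s Nvidia quotes for its top Turing card, which is why shipping games only ray trace one effect at a time.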
 
Come on, we all know that if this was REAL RT, performance would be crap.

It's more smoke and mirrors again.

I mean the 2000 series has dedicated hardware and yet performance takes a large hit. You really think this demo is using true RT and running lovely jubbly on both vendors?

Unlikely.
It doesn't need to be fully ray traced; it just needs to be good enough, or at least as good.

That's the point. If you can get to a stage that is indiscernible from Nvidia's solution by other means, why do you need the dedicated hardware and expense?
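One of those "other means" is screen-space reflection, where the engine marches the reflected ray across the depth buffer rather than through scene geometry. A minimal sketch, with a toy one-dimensional "depth buffer" and function names that are purely illustrative:

```python
# Toy screen-space reflection march: instead of tracing rays against scene
# geometry, step along the reflected ray in screen space and compare the ray's
# depth with the stored depth buffer. This is roughly how many current games
# fake reflections without any ray-tracing hardware.

def ssr_march(depth_buffer, start_x, start_depth, step_x, step_depth, max_steps=64):
    """March from (start_x, start_depth); return the first x where the
    scene surface is in front of the ray, or None on a miss."""
    x, d = float(start_x), start_depth
    for _ in range(max_steps):
        x += step_x
        d += step_depth
        ix = int(round(x))
        if not 0 <= ix < len(depth_buffer):
            return None                   # ray left the screen: no data to reflect
        if depth_buffer[ix] <= d:         # stored surface is closer: hit
            return ix
    return None

# A slanted wall occupying pixels 5..9, getting closer to the camera:
depth = [10.0] * 5 + [9.0, 8.0, 7.0, 6.0, 5.0]
print(ssr_march(depth, start_x=0, start_depth=4.0, step_x=1.0, step_depth=0.5))  # 7
```

The `return None` branches are also the technique's weakness: anything off-screen simply cannot appear in the reflection, which is exactly where true ray tracing still has the edge.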
 
The main benefit of RTX is that it cuts down game development time. Instead of a developer spending weeks adding lighting and shadows to a scene, he or she simply adds a light source and RTX just works.

The fact that RTX looks nice is just a positive side-effect, which is used to market the product to consumers.

I think people are losing sight of what RTX actually is when they compare RTX On to RTX Off and don't see much difference in how the game looks.

While this demo looks nice, we don't know how long it took the developer to implement.

We're not getting the full benefit of RTX yet because developers still have to add lighting using rasterisation for compatibility. However, several GPU generations down the line, when RTX filters its way down to budget cards and onboard graphics, we'll truly see the benefit of real-time ray tracing.
 
We're not getting the full benefit of RTX yet because developers still have to add lighting using rasterisation for compatibility. However, several GPU generations down the line, when RTX filters its way down to budget cards and onboard graphics, we'll truly see the benefit of real-time ray tracing.

How would that happen when Nvidia has no presence in onboard graphics?
 
This is the issue. No point in making them look nice when the gameplay is trash.

So far none of the ray-traced games have been all that great as games.

Metro Exodus is sensational. I didn't use RTX, though; I think the normal lighting model in it is different but just as good-looking. I also just found out my TV can do 1440p/120Hz, so I'd likely never want RTX effects turned on anyway.
 
How would that happen when Nvidia has no presence in onboard graphics?

Nvidia won't be the only GPU manufacturer implementing ray tracing; they're just the first to try.

It's the consoles that will make ray tracing popular. Nvidia and AMD implementing it now on PC is just the starting point. I expect AMD will win the next console GPU war, and I expect it will include some form of ray-tracing capability at 4K with at least 30fps, if not 60fps.

Ray tracing is really pretty and does make the scene around you more lifelike/immersive. It's underperforming for the moment, though.
 
I have just noticed that in CS: Source, on the Office map (3840x2160, everything except AA maxed out), when looking at the door handles I see very good reflections of the ceiling lights. When moving left and right in front of a handle, the reflection also changes position and shape.

Does CS: Source use ray tracing?
 
I have just noticed that in CS: Source, on the Office map (3840x2160, everything except AA maxed out), when looking at the door handles I see very good reflections of the ceiling lights. When moving left and right in front of a handle, the reflection also changes position and shape.

Does CS: Source use ray tracing?

No, it's basic environment mapping.
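For anyone curious why environment mapping looks so convincing: the engine reflects the view vector about the surface normal and uses the result to look up a pre-rendered cube map of the room, so no rays are ever traced against the scene. The reflection-vector formula is standard; the face-picking helper below is an illustrative sketch:

```python
# Environment-mapped reflections in a nutshell: reflect the view direction
# about the surface normal, then sample a pre-rendered cube map along the
# reflected direction. Moving the camera changes the reflected vector, which
# is why the reflection appears to shift even though nothing is ray traced.

def reflect(incident, normal):
    """R = I - 2(N.I)N, with N assumed to be unit length."""
    d = sum(i * n for i, n in zip(incident, normal))
    return tuple(i - 2 * d * n for i, n in zip(incident, normal))

def cubemap_face(direction):
    """Pick which of the six cube-map faces the reflected ray would sample."""
    axis = max(range(3), key=lambda k: abs(direction[k]))
    sign = "+" if direction[axis] >= 0 else "-"
    return sign + "xyz"[axis]

# Looking straight at a mirror-like handle whose surface faces the camera:
r = reflect((0.0, 0.0, -1.0), (0.0, 0.0, 1.0))
print(r, cubemap_face(r))  # (0.0, 0.0, 1.0) +z
```

Because the cube map is captured once from a fixed point, the reflection is only plausible, not correct: dynamic objects and the player never show up in it.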
 
Metro Exodus is sensational. I didn't use RTX though. I think the normal lighting model in it is different but just as good looking. I just found out my TV can do 1440p/120hz too though so I'd likely not ever want RTX effects turned on.


It sounds like I need to go back and give it another go. I played roughly three hours on release day but didn't think much of it, so I never went back. I'm really disappointed in myself for buying the game on the Epic store; I wish I'd waited a couple of years and grabbed it in a Steam sale. Maybe paying full price on a platform I didn't want coloured my opinion a bit.
 
It sounds like I need to go back and give it another go. I played roughly three hours on release day but didn't think much of it, so I never went back. I'm really disappointed in myself for buying the game on the Epic store; I wish I'd waited a couple of years and grabbed it in a Steam sale. Maybe paying full price on a platform I didn't want coloured my opinion a bit.

I did the same once the reviews turned out to be really high. A couple of new single-player DLCs are coming with new story content too, and luckily I bought the pass for them. In terms of atmosphere and getting to know a small group of characters you care about, this is as good as it gets. It helps that it has all the feel of a AAA title, plus the choice to stealth and show mercy for a good ending, or be ruthless and kill people for a different (but not bad) ending.
 
Same here. In BFV they still need to supplement ray tracing with fake reflections, so do we really need it yet? I'd say no, especially when you look at the cost increase Nvidia has put on it. It wouldn't be so bad if there were alternative GPUs; there are, but no longer high end, so if you want to go high end, Nvidia has forced it on us along with the additional cost.

Price, performance, and the image-quality cost of having to use DLSS. It's just not worth it.
 
Thread is clickbait :).
Only skimmed it, but unless you can play BFV or Metro Exodus with an AMD card and RT on, it's nonsense, as much as I'd personally like to see AMD get in on the RT action ASAP.
Dedicated hardware (cores) is required for RT and optimisation, IMO.
 
Thread is clickbait :).
Only skimmed it, but unless you can play BFV or Metro Exodus with an AMD card and RT on, it's nonsense, as much as I'd personally like to see AMD get in on the RT action ASAP.
Dedicated hardware (cores) is required for RT and optimisation, IMO.

You believe it's required because Nvidia would have you believe it's required. I have read nothing in any article, or in the DXR spec, that suggests it's impossible for a card with enough computational grunt. This whole thing reminds me of PhysX, and look what happened to dedicated PhysX hardware; besides, Havok was always vastly superior to PhysX and more commonly used anyway. As time progresses and architectures improve, dedicated RT hardware will also, IMO, die out.

Just to be clear, I'm not saying AMD will smash this out of the park; historically, async compute aside, you would assume the AMD card won't quite be up there with Nvidia's implementation. All this requiring x/y/z hardware seems a semi-decent way of extorting you out of your hard-earned cash, though.
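To put that claim concretely: the inner loop of any ray tracer is plain arithmetic like the standard Moller-Trumbore ray-triangle intersection below, which general-purpose compute shaders can run just fine. The open question is throughput, whether a GPU has the grunt to do billions of these per second, not whether fixed-function hardware is strictly required. A minimal sketch (the example ray and triangle are illustrative):

```python
# Moller-Trumbore ray-triangle intersection: the arithmetic core of ray
# tracing, expressible as ordinary compute with no dedicated RT hardware.

def ray_triangle(origin, dirn, v0, v1, v2, eps=1e-8):
    """Return distance t along the ray to the triangle, or None on a miss."""
    sub = lambda a, b: tuple(x - y for x, y in zip(a, b))
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    cross = lambda a, b: (a[1]*b[2] - a[2]*b[1],
                          a[2]*b[0] - a[0]*b[2],
                          a[0]*b[1] - a[1]*b[0])
    edge1, edge2 = sub(v1, v0), sub(v2, v0)
    pvec = cross(dirn, edge2)
    det = dot(edge1, pvec)
    if abs(det) < eps:
        return None                       # ray parallel to triangle plane
    inv_det = 1.0 / det
    tvec = sub(origin, v0)
    u = dot(tvec, pvec) * inv_det
    if u < 0.0 or u > 1.0:                # barycentric u outside triangle
        return None
    qvec = cross(tvec, edge1)
    v = dot(dirn, qvec) * inv_det
    if v < 0.0 or u + v > 1.0:            # barycentric v outside triangle
        return None
    t = dot(edge2, qvec) * inv_det
    return t if t > eps else None         # only hits in front of the ray

# A ray fired straight down at a unit triangle lying in the z=0 plane:
print(ray_triangle((0.25, 0.25, 1.0), (0.0, 0.0, -1.0),
                   (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)))  # 1.0
```

What the RT cores actually accelerate is doing this test (and the BVH traversal that feeds it) in fixed-function silicon instead of shader instructions; nothing in the maths demands it.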
 
You believe it's required because Nvidia would have you believe it's required. I have read nothing in any article, or in the DXR spec, that suggests it's impossible for a card with enough computational grunt. This whole thing reminds me of PhysX, and look what happened to dedicated PhysX hardware; besides, Havok was always vastly superior to PhysX and more commonly used anyway. As time progresses and architectures improve, dedicated RT hardware will also, IMO, die out.

Just to be clear, I'm not saying AMD will smash this out of the park; historically, async compute aside, you would assume the AMD card won't quite be up there with Nvidia's implementation. All this requiring x/y/z hardware seems a semi-decent way of extorting you out of your hard-earned cash, though.
You don't even have to look back that far, just as far back as G-Sync and FreeSync.
 
You don't even have to look back that far, just as far back as G-Sync and FreeSync.

Yeah, and that. There are many, many examples, but you're right on G-Sync: don't they need a silly expensive ASIC for HDR G-Sync, where FreeSync can simply do without?
 
Yeah, and that. There are many, many examples, but you're right on G-Sync: don't they need a silly expensive ASIC for HDR G-Sync, where FreeSync can simply do without?
I am running FreeSync on my 4K TV over HDMI at 60Hz with HDR enabled (in games that support HDR). Nvidia can't even do G-Sync or FreeSync over HDMI.
 
You believe it's required because Nvidia would have you believe it's required. I have read nothing in any article, or in the DXR spec, that suggests it's impossible for a card with enough computational grunt. This whole thing reminds me of PhysX, and look what happened to dedicated PhysX hardware; besides, Havok was always vastly superior to PhysX and more commonly used anyway. As time progresses and architectures improve, dedicated RT hardware will also, IMO, die out.

Just to be clear, I'm not saying AMD will smash this out of the park; historically, async compute aside, you would assume the AMD card won't quite be up there with Nvidia's implementation. All this requiring x/y/z hardware seems a semi-decent way of extorting you out of your hard-earned cash, though.
I don't disagree :). It's nothing to do with what Nvidia says. As you said, it would work if cards had enough computational grunt, but they don't. At the moment, having cores optimised for RT workloads is required. In the future we may not even need CPUs either.
 