
NVIDIA 4000 Series

Soldato
Joined
15 Oct 2019
Posts
11,712
Location
UK
£100 extra for RT/DLSS seems reasonable, but right now they are using it to cut die sizes and raster performance at the high end while trying to sell 60 Ti/70-class cards for double the price.
 
Soldato
Joined
16 Aug 2009
Posts
7,751
Nvidia: Let's pay the devs to put as much crap as they can in their game.
Look, Nvidia just added a thing on their 40 series called "thread ordering...something". In theory it should help run the RT faster. The problem is, you need custom code inserted in the game for it to work. So most likely it will work better in a future version of CP, or whatever games Nvidia sponsor, and it will not work at all in games that are not sponsored by Nvidia. I believe Intel has some thread-ordering-something feature too, most likely with their own custom code that they will insert in the games they sponsor (most likely fewer than Nvidia). So assuming their drivers work, you will see the big ARC running RT games worse than the 3050 or better than the 3080 or something, depending on the title. :)
And that is the problem: the custom API, instead of having both (or hopefully all three) card manufacturers using the same API and competing and innovating inside the same framework.
On one hand we complain there is no or very little competition in the graphics card market and that prices are going through the roof; on the other hand we don't understand that there can't be any competition if each manufacturer does its own thing. Nvidia has a big advantage here, having a huge market share and being able to control most of the PC gaming market, and tbh anyone who wants to play most of the future good PC games at the best performance should always buy Nvidia cards. But at least we can understand how we are being manipulated instead of acting like stupid cheerleaders. They don't put the RT inside the game because they want you to play a better-looking game. They put it in because it allows them to gain an unfair advantage over the competition.

I wonder how long until "sponsored by Nvidia" becomes "licensed by Nvidia" and studios become locked into Nvidia's infrastructure and hardware; Nvidia will then squeeze them till the pips hurt for every penny they're worth, I imagine.
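For context, the "thread ordering...something" described in the quote above is NVIDIA's Shader Execution Reordering (SER) on the 40 series. Roughly, the idea is to regroup in-flight rays by which hit shader they are about to run, so a SIMD group stays on one code path instead of diverging per thread. The real interface is an NVAPI HLSL extension (hence the per-game custom code the poster mentions); the C++ below is only a toy sketch of the concept, with made-up data, not NVIDIA's implementation.

#include <algorithm>
#include <cstdio>
#include <vector>

// Toy model of shader execution reordering: each ray is tagged with the
// material (hit shader) it will invoke, then the work is sorted so that
// a SIMD group shading the list front-to-back stays coherent.
struct Hit {
    int rayId;
    int materialId;
};

int main() {
    // Rays hit materials in arbitrary order -> divergent shading.
    std::vector<Hit> hits = {{0, 2}, {1, 0}, {2, 2}, {3, 1}, {4, 0}, {5, 1}};

    // The "reorder" step. On the 40 series this happens in hardware,
    // driven by a hint the game supplies; a plain sort expresses the
    // same idea in spirit.
    std::sort(hits.begin(), hits.end(), [](const Hit& a, const Hit& b) {
        return a.materialId < b.materialId;
    });

    for (const Hit& h : hits)
        std::printf("shade ray %d with material %d\n", h.rayId, h.materialId);
}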
 
Caporegime
Joined
12 Jul 2007
Posts
40,632
Location
United Kingdom
Nvidia: Let's pay the devs to put as much crap as they can in their game.
Also Nvidia: Let's find some ways to mitigate the perf penalty, because our cards are not good enough to run the features we paid for. :)
That is the problem with RT. Even if it may look better, it is still a marketing tool and not a feature the devs are pushing for. I am not sure there is any RT game released right now that wasn't sponsored by either Nvidia, AMD or Intel.
Then there's the fact that there is no fixed feature set, and the devs can lock features to a specific card if they want: CP's RT worked only on Nvidia, Godfall's RT worked only on AMD, and so on. That is a disgrace, and none of those games should be given attention. If you use your game to help sell some cards then I won't give you more respect than you give me.
Look, Nvidia just added a thing on their 40 series called "thread ordering...something". In theory it should help run the RT faster. The problem is, you need custom code inserted in the game for it to work. So most likely it will work better in a future version of CP, or whatever games Nvidia sponsor, and it will not work at all in games that are not sponsored by Nvidia. I believe Intel has some thread-ordering-something feature too, most likely with their own custom code that they will insert in the games they sponsor (most likely fewer than Nvidia). So assuming their drivers work, you will see the big ARC running RT games worse than the 3050 or better than the 3080 or something, depending on the title. :)
And that is the problem: the custom API, instead of having both (or hopefully all three) card manufacturers using the same API and competing and innovating inside the same framework.
On one hand we complain there is no or very little competition in the graphics card market and that prices are going through the roof; on the other hand we don't understand that there can't be any competition if each manufacturer does its own thing. Nvidia has a big advantage here, having a huge market share and being able to control most of the PC gaming market, and tbh anyone who wants to play most of the future good PC games at the best performance should always buy Nvidia cards. But at least we can understand how we are being manipulated instead of acting like stupid cheerleaders. They don't put the RT inside the game because they want you to play a better-looking game. They put it in because it allows them to gain an unfair advantage over the competition.
Welcome back, pretty damn good post bro.
 
Associate
Joined
19 Sep 2022
Posts
512
Location
Pyongyang
Would've been easier if the raytracing was an add-in card, like how raster started with adding a 3dfx card to run alongside your 2D video card, or PhysX on an Ageia card.
When Turing dropped, much of the sentiment was that the price increases were due to die area being used for raytracing etc.
I always jump on new tech, always have. But many of my enthusiast friends are more enthusiastic about getting as much performance as possible for the best price. Overclocking etc. got them into this hobby, especially when there really were big gains to be had from OC.
They adopted 3D a few gens in. With raytracing, they feel they are paying for something they don't value given how thinly it's spread across the games they actually play. If it was a separate card, they would maybe wait another gen.
I kinda like people being forced :D it will make the transition quicker. Sometimes a tech can really impress me but is never adopted by the masses and disappears due to many factors, sometimes timing. 3D Vision was amazing on OLED or a projector: no ghosting etc. Most people experienced it on slow LCDs, plus cheating on 3D effects was becoming more common, meaning more game effects needed to be turned off. Done well, 3D Vision was amazing. If only I could've forced everyone to buy an OLED screen or projector back then :D
Yeah man, I have a similar opinion on all that encoding hardware present on die... they'd be better off selling it as a separate add-on card and using the freed-up space to pack in more ALUs.

Though I would leave RT as-is.

Nvidia: Let's pay the devs to put as much crap as they can in their game. […]
And that is the problem: the custom API, instead of having both (or hopefully all three) card manufacturers using the same API and competing and innovating inside the same framework. […]

It's the DX12 paradigm though, shifting stuff away from the driver into dev code. Maybe DX13 will regress to a DX11 approach.
Also, I believe that's how standards evolve in most cases: you don't start with a consortium, you often compete on new standards till the best one is adopted. And there's always a premium on thought leadership.
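To make that driver-versus-dev-code shift concrete: under DX11 the driver tracked resource hazards for you, while DX12 makes the engine state transitions explicitly. The sketch below uses stand-in types, not the real D3D12 API, just to show where the responsibility now sits.

#include <cstdio>
#include <string>

// Stand-in for a GPU resource with an explicit state, as in D3D12.
struct Resource {
    std::string name;
    std::string state;
};

// In DX11 the driver inserted this transition implicitly on first use.
// In DX12, forgetting it is a bug the developer owns -- the "shift from
// driver to dev code" described above.
void transition(Resource& r, const std::string& to) {
    std::printf("barrier: %s  %s -> %s\n",
                r.name.c_str(), r.state.c_str(), to.c_str());
    r.state = to;
}

int main() {
    Resource rt{"scene_color", "RENDER_TARGET"};
    transition(rt, "PIXEL_SHADER_RESOURCE");  // required before sampling it
}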
 
Associate
Joined
27 Dec 2014
Posts
1,686
Location
Southampton
This is interesting; it's about the new CPUs, but maybe with some implications for the 40xx GPUs. Popcorn at the ready, people!

 
Caporegime
Joined
4 Jun 2009
Posts
31,117
Would've been easier if the raytracing was an add-in card […] Done well, 3D Vision was amazing. If only I could've forced everyone to buy an OLED screen or projector back then :D

Agree, I absolutely loved 3D; on OLED it was fantastic.

Although it's still not a great/valid comparison, as the difference is that "everyone", i.e. all hardware since Turing, is backing/supporting RT; 3D didn't have the same support, sadly :( You simply can't achieve the next-gen visuals shown by the RT tech demos with rasterization methods, and if you could, the hours/effort required just wouldn't be worth it when there is a solution out there that achieves it far quicker. I've been replaying RDR 2 recently and the lighting, shadows etc. look stunning, but it took R* years to get there, and if you know what to look for in terms of rasterization issues, the game is plagued by them, i.e. reflections distorting/cutting off (particularly in the swamp area) and lighting bleeding through walls.
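Those reflection cut-offs are the classic screen-space reflection limitation: the ray march can only sample what is already in the frame, so anything off-screen simply vanishes from the reflection, while a traced ray has the whole scene available. A toy sketch of the failure mode, with made-up depth values (illustrative only, not engine code):

#include <cstdio>
#include <vector>

int main() {
    // One scanline of a depth buffer (smaller = closer to the camera).
    // Anything past index 9 is outside the frame, so SSR has no data there.
    std::vector<float> depth = {5, 5, 5, 4, 4, 3, 3, 2, 2, 1};

    // March a reflected ray rightwards from pixel 6, heading towards the
    // camera (a common case for floor/water reflections). A hit happens
    // when the ray passes behind a visible surface.
    int x = 6;
    float rayDepth = 1.5f;
    for (;;) {
        ++x;
        rayDepth -= 0.3f;
        if (x >= static_cast<int>(depth.size())) {
            // The reflected object is off-screen: the reflection simply
            // cuts off -- the artifact described above.
            std::puts("ray left the screen: reflection missing");
            break;
        }
        if (rayDepth >= depth[x]) {
            std::printf("hit at pixel %d: reflection resolved\n", x);
            break;
        }
    }
}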

Nvidia: Let's pay the devs to put as much crap as they can in their game.
Also Nvidia: Let's find some ways to mitigate the perf penalty, because our cards are not good enough to run the features we paid for. :) […]

Again, why are you painting "ray tracing" as an "Nvidia" thing? :confused:

And why aren't developers pushing for it? Have you watched/read any of the content from the developers who have been using it? It's pretty obvious why they want to move to it.

There are plenty of games out there with RT that aren't sponsored by AMD, Intel or Nvidia. Just do a Google search to see this for yourself.

Is anyone forcing people to turn on/max out RT? Nope.

Also, CP uses Microsoft's DirectX Raytracing API. I can't comment on why we had to wait for RT support on AMD and consoles; IIRC there was a tweet where CDPR said they were working with AMD on it. Did Nvidia perhaps pay for a timed exclusive on it? More than likely, same as happened with Godfall's RT for Nvidia. But again, you're acting like this is the case with every RT game, when it's not.
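Worth making the shared-API point concrete: DXR support is exposed through the same D3D12 feature check whoever makes the GPU; there is no NVIDIA/AMD/Intel-specific entry point for it. A minimal sketch (Windows + D3D12 only, error handling trimmed):

#include <cstdio>
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    // Same call regardless of vendor: DXR is a D3D12 feature tier.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("no D3D12 device found");
        return 1;
    }
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                &opts5, sizeof(opts5));
    std::printf("DXR supported: %s\n",
                opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0 ? "yes"
                                                                  : "no");
}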

Again, see Metro EE, an RT-only title:

[attached image: OusRdRy.png]

For some reason, people don't seem to like pointing to Metro EE as an example of what a good RT implementation looks like, where it even runs very well on RDNA 2. Maybe because it's an Nvidia-sponsored title? :p

AMD is perfectly capable of running RT to some extent, and in fact they are even improving their performance; I believe there was an article the other day where a driver update brought a 10% improvement in Quake RTX? Nvidia are simply better from a hardware POV though, and given their extra experience/head start, it's no surprise Nvidia generally run better in all RT scenarios, even in AMD-sponsored ones.

Sure, but it's the same as G-Sync vs FreeSync IMO: the majority of people don't want to pay £100s for something with little discernible difference, they want good enough.

There's no question Nvidia RT is good, even great, but most people don't want to, or can't, pay £100s for a high-end solution. The majority of the market is in the middle, and the majority prefer to pay for something that's good enough.

Completely different to Free/G-Sync too, the same way DLSS vs FSR is completely different. In terms of Free/G-Sync:

- monitor manufacturers have to pay for the Nvidia module (developers/publishers don't have to pay for RT)
- monitor manufacturers have to change their monitor chassis design to accommodate the G-Sync module and its fan, which isn't a simple task (obviously developers have to change up their workflow and perhaps do things differently, or in ways they are not familiar with, and no doubt some will find the old way/rasterization better since they know it... but ask any developer and they will tell you that learning new tools/workflows is part of the job. It's all about finding better ways to do your job more efficiently, and it is expected of you too, especially if you have management breathing down your neck asking why things are taking so long :))

The G-Sync module at one point also only worked with Nvidia GPUs; AMD cards work with it now too, but I digress... the point is that RT is supported by every GPU released since Turing.

The only way RT dies now is if the hardware providers, i.e. Intel, AMD, Nvidia etc., all pull out of supporting it. BTW, IIRC, Qualcomm mobile chipsets also support RT now too.
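As for what the module (or FreeSync) actually buys you mechanically: on a fixed 60 Hz refresh, a late frame waits for the next scan-out tick, while VRR lets the display refresh the moment a frame is ready. A small sketch with made-up frame times (illustrative numbers only):

#include <cstdio>

int main() {
    // Fixed 60 Hz: a finished frame waits for the next scan-out tick.
    // VRR (G-Sync/FreeSync): the display refreshes when the frame is ready.
    const double tick = 1000.0 / 60.0;                        // ms per refresh
    const double frameMs[] = {14.0, 18.0, 15.5, 21.0, 16.0};  // render times

    double tFixed = 0.0, tVrr = 0.0;
    for (double ft : frameMs) {
        tVrr += ft;                  // VRR presents immediately
        double ready = tFixed + ft;  // rendering starts at the last present
        int ticks = static_cast<int>(ready / tick) + 1;
        tFixed = ticks * tick;       // fixed refresh waits for the next tick
        std::printf("frame %5.1f ms -> fixed: %6.1f ms, vrr: %6.1f ms\n",
                    ft, tFixed, tVrr);
    }
    std::printf("fixed refresh finished %.1f ms later overall\n",
                tFixed - tVrr);
}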


An old saying: "why spend 6 hours ray tracing something when you can rasterise it in 6 minutes".
axe to grind pointless rubbish as usual

And posts like this, where as usual you add nothing to the discussion, just prove how little you know and/or that you still have an axe to grind.

PS. FSR bashing? You mean highlighting valid cons of it, which were also highlighted by the majority of users on several other forums and subreddits AND by tech journalists? Hence why AMD released an update to solve some of the issues with the FSR 2 version, then released FSR 2.1 to fix/address the issues highlighted? But yes, it's "bashing" only when it comes from my viewpoint... :rolleyes: Also, are you still implying that RT is "proprietary tech"... :o
 
Soldato
Joined
7 Apr 2008
Posts
24,158
Location
Lorville - Hurston
With the pound weakening and energy prices sky-rocketing, I don't think these cards will sell well, here in the UK at least.
 
Caporegime
Joined
4 Jun 2009
Posts
31,117
With the pound weakening and energy prices sky-rocketing, I don't think these cards will sell well, here in the UK at least.

Given Nvidia's mindshare, they'll no doubt still sell out :( Look at all the people (gamers) who overpaid for Ampere and RDNA 2 :( I don't think they'll sell as well though, especially if RDNA 3 delivers.
 