
NVIDIA 4000 Series

To get 600W you need four PCIe 8-pin to 16-pin connections.
Two won't do it.

Not with that cable on a Corsair you don't. It shares the same pinouts on the PSU side as the 8-pin CPU EPS-12V, which can do over 300 watts per connector.
It doesn't use a PCIe adapter; it plugs directly into the PSU.

Some details here that have already been posted.


To simplify things, we’re introducing a dedicated 12VHPWR Type-4 cable that runs up to 600W of power directly from your Type-4 CORSAIR PSU. Giving you peace of mind that your graphics card is getting all the power it needs, while also simplifying cable management (we know that cable bulk adds up very quickly). Each of the two PSU-side connections is rated for 300W of power per connector, adding up to the 600W needed for the 12VHPWR side for your graphics card.
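A quick back-of-the-envelope check of the cable maths above (a minimal sketch in C++; the 150 W per standard PCIe 8-pin and 300 W per Type-4 PSU-side connector figures come from the posts/quote above, and the names are purely illustrative):

```cpp
#include <cstdio>

// Rough power-budget check for a 600 W 12VHPWR card, using the per-connector
// ratings discussed above. Names/ratings are illustrative, not a spec table.
struct Connector { const char* name; int watts; };

int main() {
    const Connector pcie8pin  {"PCIe 8-pin (adapter side)", 150}; // 150 W per standard PCIe 8-pin
    const Connector corsairT4 {"Corsair Type-4 PSU-side",   300}; // 300 W per connector per Corsair's description

    const int target = 600; // what the 12VHPWR end wants to deliver

    // Four 8-pin PCIe plugs on the adapter: 4 x 150 W = 600 W.
    printf("4 x %s = %d W (meets %d W: %s)\n", pcie8pin.name, 4 * pcie8pin.watts, target,
           4 * pcie8pin.watts >= target ? "yes" : "no");

    // Two Type-4 PSU-side plugs on the dedicated Corsair cable: 2 x 300 W = 600 W.
    printf("2 x %s = %d W (meets %d W: %s)\n", corsairT4.name, 2 * corsairT4.watts, target,
           2 * corsairT4.watts >= target ? "yes" : "no");
    return 0;
}
```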
 
Completely different to FreeSync/G-Sync too, the same way DLSS vs FSR is completely different. In terms of FreeSync/G-Sync:

- monitor manufacturers have to pay for the Nvidia module (developers/publishers don't have to pay for RT)
- monitor manufacturers have to change their monitor chassis design to accommodate the G-Sync module, which isn't a simple task (obviously developers have to change up their workflow and perhaps do things differently/that they are not familiar with, and no doubt some will find the old way/rasterization better since they know it... but ask any developer and they will tell you that learning new tools/workflows is part of the job. It's all about finding better ways to do your job more efficiently, and it's expected of you too, especially if you have management breathing down your neck asking why things are taking so long :))

The G-Sync module at one point also only worked with Nvidia GPUs; AMD cards now work with it too, but I digress... the point is that RT is supported by every GPU released since Turing.

The only way RT will die now is if the hardware vendors, i.e. Intel, AMD and Nvidia, all pull out of supporting it. BTW, ARM supports RT now too.
The monitor manufacturers just passed the cost on to customers, so that point is moot.

You also seem to be conflating Nvidia RTX with RT. Like I said, Nvidia RT (aka RTX) is good or even great, but people don't want to pay £100s for good or great; they want to pay nothing or very little for something that's good enough, something that doesn't add £100s to the cost of what they're used to.
 
Not with that cable on a Corsair you don't. It shares the same pinouts on the PSU side as the 8-pin CPU EPS-12V, which can do over 300 watts per connector.
It doesn't use a PCIe adapter; it plugs directly into the PSU.

Some details here that have already been posted.



Thanks.

They are grounding two sense pins, according to Overclock.net.
I hope no one buys that cable for another brand of PSU and just hopes it works.
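For context on what grounding the sense pins means, here is a sketch based on the commonly reported 12VHPWR sideband decode from the ATX 3.0 / PCIe CEM 5.0 era; treat the exact values as an assumption rather than quoted spec text:

```cpp
#include <cstdio>

// Commonly cited 12VHPWR sense-pin decode (ATX 3.0 era). "Grounded" = pulled low,
// otherwise floating/open. Values are as widely reported, not copied from the spec.
int maxPowerWatts(bool sense0Grounded, bool sense1Grounded) {
    if (sense0Grounded && sense1Grounded)  return 600; // both grounded -> full 600 W
    if (sense0Grounded && !sense1Grounded) return 450;
    if (!sense0Grounded && sense1Grounded) return 300;
    return 150;                                        // both open -> 150 W fallback
}

int main() {
    // A cable that hard-wires both sense pins to ground always advertises 600 W,
    // regardless of what the PSU behind it can actually supply.
    printf("Both sense pins grounded -> card assumes %d W available\n", maxPowerWatts(true, true));
    return 0;
}
```

A cable that permanently grounds both pins always tells the card 600 W is available, whether or not the PSU behind it can deliver that, which is why using it on a different brand's PSU is a gamble.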
 
The monitor manufacturers just passed the cost on to customers, so that point is moot.

You also seem to be conflating Nvidia RTX with RT. Like I said, Nvidia RT (aka RTX) is good or even great, but people don't want to pay £100s for good or great; they want to pay nothing or very little for something that's good enough, something that doesn't add £100s to the cost of what they're used to.

Well not really, as G-Sync adds to the cost of making the product, whereas RT doesn't add anything to the cost of making a game. The whole purpose of RT for developers/publishers is to save time, which saves money for the publishers. Also, it's not "proprietary" like G-Sync was/is.

RTX is just Nvidia's branding for "RT", hence why I think many people/AMD fans hate on it, i.e. they associate RT as being an Nvidia feature or/and favouring Nvidia (when the reality is, as you know, Nvidia is simply better at it).

You aren't paying anything extra for RT support (well, maybe it could be argued that for the 40xx series you now are... :cry: Time will tell on that front with Nvidia-sponsored titles). You don't have a choice of buying a non-RT GPU at all now, at least not if you're looking at anything released since Turing, so you're not directly paying for RT; it's standard, the same way you aren't paying for tessellation, adaptive sync etc. support. The only way you would be paying extra for it is if you were offered two GPUs: one which was cheaper and had no RT, and another with RT support which cost more.
 
Well not really, as G-Sync adds to the cost of making the product, whereas RT doesn't add anything to the cost of making a game. The whole purpose of RT for developers/publishers is to save time, which saves money for the publishers. Also, it's not "proprietary" like G-Sync was/is.

RTX is just Nvidia's branding for "RT", hence why I think many people/AMD fans hate on it, i.e. they associate RT as being an Nvidia feature or/and favouring Nvidia (when the reality is, as you know, Nvidia is simply better at it).

You aren't paying anything extra for RT support (well, maybe it could be argued that for the 40xx series you now are... :cry: Time will tell on that front with Nvidia-sponsored titles). You don't have a choice of buying a non-RT GPU at all now, at least not if you're looking at anything released since Turing, so you're not directly paying for RT; it's standard, the same way you aren't paying for tessellation, adaptive sync etc. support. The only way you would be paying extra for it is if you were offered two GPUs: one which was cheaper and had no RT, and another with RT support which cost more.
Again you seem to be moving the goal posts; no one said it adds to the cost of the game, they said it adds cost to the product, to the graphics card itself. I assume you're moving the goal posts because you don't want to accept that the majority of people don't want to pay £100s for great; they want to pay £10s, or better yet nothing, for good enough. Especially so when what you're paying extra for is only useful 5% of the time (if that).

And no, RTX is not just the branding. (Nvidia worked with Microsoft to integrate RTX support with Microsoft's DirectX Raytracing API (DXR))

The reason you don't have a choice but to buy an RT-capable GPU (again, you're conflating RTX with RT, aka DXR) is because DX12 supports DXR and, as above, Nvidia worked with MS to include RTX support in DXR. But DXR is more than just RTX, so if your GPU supports DX12 it supports DXR, but not all GPUs support RTX (because not all GPUs support NGX, USD and MDL, OptiX and CUDA).
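To make the RTX-vs-DXR point concrete, here is a minimal sketch of the vendor-neutral capability check a DX12 game performs; it uses the public D3D12 API (CheckFeatureSupport with D3D12_FEATURE_D3D12_OPTIONS5), nothing in it is Nvidia-specific, and it assumes you link against d3d12.lib:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

// Vendor-neutral DXR capability check: any DX12 GPU/driver can report a
// raytracing tier here, regardless of whether it carries RTX branding.
bool SupportsDxr(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

int main() {
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter (AMD, Intel or Nvidia - DXR doesn't care).
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device)))) {
        printf("No DX12 device available\n");
        return 1;
    }
    printf("DXR supported: %s\n", SupportsDxr(device.Get()) ? "yes" : "no");
    return 0;
}
```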
 
Again you seem to be moving the goal posts; no one said it adds to the cost of the game, they said it adds cost to the product, to the graphics card itself. I assume you're moving the goal posts because you don't want to accept that the majority of people don't want to pay £100s for great; they want to pay £10s, or better yet nothing, for good enough. Especially so when what you're paying extra for is only useful 5% of the time (if that).

And no, RTX is not just the branding. (Nvidia worked with Microsoft to integrate RTX support with Microsoft's DirectX Raytracing API (DXR))

The reason you don't have a choice but to buy an RT-capable GPU (again, you're conflating RTX with RT, aka DXR) is because DX12 supports DXR and, as above, Nvidia worked with MS to include RTX support in DXR. But DXR is more than just RTX, so if your GPU supports DX12 it supports DXR, but not all GPUs support RTX (because not all GPUs support NGX, USD and MDL, OptiX and CUDA).

Ah, good to see we're back to the usual: when someone is called out or a flaw in their logic is highlighted, it becomes "moving goal posts" :D :cry:

You stated this:
Sure, but it's the same as G-Sync vs FreeSync IMO; the majority of people don't want to pay £100s for something with little discernible difference, they want good enough.

There's no question Nvidia RT is good, even great, but people don't want to, or can't, pay £100s for a high-end solution. The majority of the market is in the middle; the majority prefer to pay for something that's good enough.

Then I pointed out how it is nothing like G-Sync vs FreeSync, to which you then stated:

The monitor manufacturers just passed the cost on to customers, so that point is moot.

You also seem to be conflating Nvidia RTX with RT. Like I said, Nvidia RT (aka RTX) is good or even great, but people don't want to pay £100s for good or great; they want to pay nothing or very little for something that's good enough, something that doesn't add £100s to the cost of what they're used to.

Show me a GPU of identical spec where one has RT support and the other doesn't... If you can do this and show that one is paying extra for the RT GPU, then you are 100% right: you are paying extra for RT support. As I said back when Turing first came out, you were 100% paying Nvidia for the RTX features, which included RT, DLSS and Reflex; this is no longer the case now, as "every" GPU released since then can enable and run the same RT effects, be that "RTX" or not.

As for 5% of the time, that's entirely subjective. As stated before, I personally have played far more RT titles than raster-only titles in the past two years, where the difference is noticeable throughout the game for far more than "5% of the time". Of course there will be people who have played far more raster titles or/and turn off RT, hence it being subjective.

The only way you could "potentially" argue you are paying for RT support is by saying you are paying for a better RT experience, e.g. 6800 XT @ £600 vs 3080 @ £650, i.e. you are paying £50 extra for that extra RT performance. But then there are far more factors contributing to that extra cost, i.e. DLSS, ShadowPlay/NVENC and whatever other advantages/pros Nvidia/the 3080 has over AMD/the 6800 XT to justify the extra £50.

Show me "RTX" games where ray tracing doesn't work on AMD/Intel hardware?

EDIT:

Whilst we're on the topic of RT, I just remembered that Sony filed a patent to accelerate their RT performance. Will be curious to see how/where this goes; they seem to value it in some of their exclusives, e.g. the Ratchet & Clank and Spider-Man titles.

 
Ah, good to see we're back to the usual: when someone is called out or a flaw in their logic is highlighted, it becomes "moving goal posts" :D :cry:
Or just, you know, pointing out how you're moving the goal posts. If you'd pointed out any flaws I'd happily discuss them, but you've not. All you've done is make some weak argument about monitor manufacturers paying for G-Sync, as if that cost was never passed on to customers, and then you made an even weaker argument about RTX not adding to the cost of games (which, given your previous metric of G-Sync adding to costs for monitor manufacturers, is itself laughable) when no one even said it added to the cost of games; they said it added cost to graphics cards.
You stated this:


Then I pointed out how it is nothing like G-Sync vs FreeSync, to which you then stated:
Is this like last time we had a discussion, where it took you tens of posts and multiple pages to get to the point? I know what I said and I know what you said; if it isn't obvious by me reading your post, I can actually read, ya' know. :rolleyes:

Show me a GPU of identical spec where one has RT support and the other doesn't... If you can do this and show that one is paying extra for the RT GPU, then you are 100% right: you are paying extra for RT support. As I said back when Turing first came out, you were 100% paying Nvidia for the RTX features, which included RT, DLSS and Reflex; this is no longer the case now, as "every" GPU released since then can enable and run the same RT effects, be that "RTX" or not.

As for 5% of the time, that's entirely subjective. As stated before, I personally have played far more RT titles than raster-only titles in the past two years, where the difference is noticeable throughout the game for far more than "5% of the time". Of course there will be people who have played far more raster titles or/and turn off RT, hence it being subjective.

The only way you could "potentially" argue you are paying for RT support is by saying you are paying for a better RT experience, e.g. 6800 XT @ £600 vs 3080 @ £650, i.e. you are paying £50 extra for that extra RT performance. But then there are far more factors contributing to that extra cost, i.e. DLSS, ShadowPlay/NVENC and whatever other advantages/pros Nvidia/the 3080 has over AMD/the 6800 XT to justify the extra £50.

Show me "RTX" games where ray tracing doesn't work on AMD/Intel hardware?
If I showed you a GPU with identical specs, I'd be showing you two of the same card; that question doesn't even make sense. Not only that, you are once again conflating RTX with DXR (what are you even referring to when you say RT? Do you mean RTX, do you mean DXR, do you mean the ray tracing I could do on my first ever CPU, a 486 DX2?). Do you even understand what ray tracing is?

I really CBA with the rest of what you've said, as this conversation is going the same way all your conversations go: with you not being able to form coherent sentences, conflating A with B, and taking multiple posts just to get to the bottom of what you're trying to say. I'm out.
 
Or just, you know, pointing out how you're moving the goal posts. If you'd pointed out any flaws I'd happily discuss them, but you've not. All you've done is make some weak argument about monitor manufacturers paying for G-Sync, as if that cost was never passed on to customers, and then you made an even weaker argument about RTX not adding to the cost of games (which, given your previous metric of G-Sync adding to costs for monitor manufacturers, is itself laughable) when no one even said it added to the cost of games; they said it added cost to graphics cards.

Is this like last time we had a discussion, where it took you tens of posts and multiple pages to get to the point? I know what I said and I know what you said; if it isn't obvious by me reading your post, I can actually read, ya' know. :rolleyes:


If I showed you a GPU with identical specs, I'd be showing you two of the same card; that question doesn't even make sense. Not only that, you are once again conflating RTX with DXR (what are you even referring to when you say RT? Do you mean RTX, do you mean DXR, do you mean the ray tracing I could do on my first ever CPU, a 486 DX2?). Do you even understand what ray tracing is?

I really CBA with the rest of what you've said, as this conversation is going the same way all your conversations go: with you not being able to form coherent sentences, conflating A with B, and taking multiple posts just to get to the bottom of what you're trying to say. I'm out.

Again, see the points you made with your comparison of G-Sync vs FreeSync being the same thing as RT, when in reality they are completely different, as I pointed out. I don't see how that is moving goal posts; you're the one who made that comparison, I simply explained why they are nothing alike. If you can't prove why they are alike, then don't make the statement in the first place, simples.

It makes perfect sense, you just aren't choosing to accept it for whatever reason, maybe because you don't want to admit it was a silly comparison or/and you are wrong? Hence why I said you are acting like people pay extra for other features such as adaptive sync and tessellation support on GPUs, when they are just "standard" features of the GPU now, the same way RT is just a standard feature supported on GPUs now. If you really wanted, this is where you could use the G-Sync vs FreeSync argument, i.e. people who buy a G-Sync version of the "same monitor" are paying extra for the "G-Sync" feature. But alas, because there are no GPUs out there offered both with and without RT support, the comparison is not valid, hence why people are not paying extra for RT like you believe.

Again, this really isn't rocket science.....

I get exactly what you are insinuating hence why I have simplified it to try and help you prove this point of yours:

- Show me "RTX" games where ray tracing doesn't work on AMD/Intel hardware (you seem to be implying that RT implemented via the RTX API/tools provided by Nvidia only works with Nvidia hardware?)
- Show me a GPU of identical spec where one has RT support and the other doesn't (this will prove that one is paying extra for "RT support")

Once you can do that, then your argument is completely valid; until then, it's all just the usual: you arguing for the sake of it.
 
- Show me "RTX" games where ray tracing doesn't work on AMD/Intel hardware (you seem to be implying that RT implemented via the RTX API/tools provided by Nvidia only works with Nvidia hardware?)
Quake 2 RTX and CP 2077 for a while. OK, Quake 2 used an Nvidia proprietary API, but CP 2077 was claimed to be DXR. How can you block some cards from running such a game? It's like saying you can lock some cards out of running a DX12 game. Same for AMD's huge blockbuster Godfall: for a while it didn't work on Nvidia hardware.
 
Again, why are you painting "ray tracing" as an "Nvidia" thing? :confused:

And why aren't developers pushing for it? Have you watched/read any of the content by the developers who have been using it? It's pretty obvious why they want to move to using it.

There are plenty of games out there that aren't sponsored by AMD, Intel or Nvidia which have RT too. Just do a Google search for yourself to see this.

Is anyone forcing people to turn on/max out RT? Nope.

Also, CP 2077 uses Microsoft's DirectX Raytracing API. I can't comment as to why we had to wait for RT support on both AMD and consoles; IIRC there was a tweet where CDPR said they were working with AMD on it. Did Nvidia perhaps pay for a timed exclusive on it? More than likely, same as happened with Godfall's RT for Nvidia, but again, you're acting like this is the case with every RT game, when it's not.

Again, see Metro Exodus Enhanced Edition, an RT-only title.


For some reason, people don't seem to like using Metro EE as a reference for what a good RT implementation looks like, where it even runs very well on RDNA 2. Maybe because it's an Nvidia-sponsored title? :p

AMD is perfectly capable of running RT to some extent and, in fact, they are even improving their performance; I believe there was an article the other day where a driver update brought a 10% improvement in Quake 2 RTX? Nvidia are simply better from a hardware POV though, and given their extra experience/head start, it's no surprise Nvidia generally run better in all RT scenarios, even in AMD's sponsored ones.
I singled out Nvidia in the first place because they are in a position to control the gaming market. They are not the only bad player out there; what AMD did with Godfall and Far Cry 6 was also wrong. It doesn't even make sense economically to block Nvidia features when they control 80% of the gaming market (and probably even more of the high-end PC gaming market); it means you are not making a game to sell it, you are making a game to promote a product. Again, Godfall and FC6 are good examples where some features were used to give an unfair advantage to a card manufacturer.
Think about this: imagine we live in a different universe and Nvidia, AMD and Intel are releasing equal-performance cards. Each of them sponsors a game, each of them puts their own custom code inside the game they are sponsoring, and each locks the other two out of adding their own custom code. Then you will see each of them being better than the others in benchmarks. :)
The market is not equal right now, and most of the time Nvidia will have the lion's share of the number and quality of games they sponsor. That is why, no matter what happens, Nvidia will have an advantage. I gave you an example with that thread ordering feature. How can Intel compete in CP 2077 when Nvidia will use custom code for their 4000 generation while Intel has to brute-force it, even if they also have their own thread ordering feature? It is a small example, but it gets worse every year as Nvidia's market share increases and they push more and more proprietary features.
 
Quake 2 RTX and CP 2077 for a while. OK, Quake 2 used an Nvidia proprietary API, but CP 2077 was claimed to be DXR. How can you block some cards from running such a game? It's like saying you can lock some cards out of running a DX12 game. Same for AMD's huge blockbuster Godfall: for a while it didn't work on Nvidia hardware.

CP 2077 is DXR. The developers tweeted they were working with AMD to get RT working. I don't doubt there were some kind of timed-exclusivity shenanigans going on for obvious reasons, or/and, since Turing was the first desktop GPU to enable/support RT and it was tied into the RTX aspect, the implementation didn't support other GPU architectures and had to be updated by Nvidia? In the same way, IIRC, Control also didn't have RT support for AMD at launch (when RDNA 2 launched), as Turing was the only architecture capable of RT at the time, i.e. the method tied to the implementation of RT was the problem, not RT itself?



As for Quake 2 RTX, probably similar to the above. Interesting article this, though:


Nvidia actually did some of the heavy lifting to bring ray tracing to Vulkan in the first place. There are already existing tools in place to assist with translating DX12 calls and HLSL (High Level Shader Language) code into Vulkan and SPIR-V, respectively. Nvidia’s specific contribution to the project was to add ray tracing support to Microsoft’s open source DirectXCompiler, which is commonly used to port HLSL code to Vulkan.
In other words, Nvidia’s open source work is a key part of why AMD GPUs can now run Quake II RTX. This kind of ‘coopetition’, if you will, is a key part of ensuring standards are widely supported and making certain gamers can expect certain features on a wide range of systems. In theory, developers that already have a Vulkan and a DX12 path could keep Nvidia RTX support for one API and support both Nvidia and AMD in Vulkan. So far, we haven’t heard much about whether or not RTX-enabled games will receive an update to allow AMD to use ray tracing, or how much additional optimization is required to make use of the feature on RDNA2 GPUs as opposed to Turing / Ampere.

i.e. as pointed out, it seems the lack of support was simply because the method of implementing RT in the first RT games didn't support anything outside of the Turing architecture? Obviously once RDNA 2 launched, updates had to be made to how RT was being implemented, i.e. to support more than just Turing. If so, I wonder what AMD's reasons were for Godfall ;) :p

@Rroff might be able to answer/confirm why AMD/RDNA 2 weren't able to get RT working at launch for titles like CP 2077 and Quake 2 RTX.
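On the Vulkan side of the article quoted above, the equivalent vendor-neutral check is simply asking each GPU whether it exposes the Khronos cross-vendor ray tracing extension; a minimal sketch (the extension name is the standard VK_KHR_ray_tracing_pipeline, everything else is illustrative):

```cpp
#include <vulkan/vulkan.h>
#include <cstring>
#include <cstdio>
#include <vector>

// List each GPU and whether it exposes the cross-vendor Khronos ray tracing
// pipeline extension (the path that eventually let Quake II RTX run on RDNA 2).
int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_2;
    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ici.pApplicationInfo = &app;
    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);

        uint32_t extCount = 0;
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, nullptr);
        std::vector<VkExtensionProperties> exts(extCount);
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, exts.data());

        bool rt = false;
        for (const auto& e : exts)
            if (std::strcmp(e.extensionName, VK_KHR_RAY_TRACING_PIPELINE_EXTENSION_NAME) == 0)
                rt = true;

        printf("%s: VK_KHR_ray_tracing_pipeline %s\n", props.deviceName, rt ? "present" : "absent");
    }
    vkDestroyInstance(instance, nullptr);
    return 0;
}
```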

I singled out Nvidia in the first place because they are in a position to control the gaming market. They are not the only bad player out there; what AMD did with Godfall and Far Cry 6 was also wrong. It doesn't even make sense economically to block Nvidia features when they control 80% of the gaming market (and probably even more of the high-end PC gaming market); it means you are not making a game to sell it, you are making a game to promote a product. Again, Godfall and FC6 are good examples where some features were used to give an unfair advantage to a card manufacturer.
Think about this: imagine we live in a different universe and Nvidia, AMD and Intel are releasing equal-performance cards. Each of them sponsors a game, each of them puts their own custom code inside the game they are sponsoring, and each locks the other two out of adding their own custom code. Then you will see each of them being better than the others in benchmarks. :)
The market is not equal right now, and most of the time Nvidia will have the lion's share of the number and quality of games they sponsor. That is why, no matter what happens, Nvidia will have an advantage. I gave you an example with that thread ordering feature. How can Intel compete in CP 2077 when Nvidia will use custom code for their 4000 generation while Intel has to brute-force it, even if they also have their own thread ordering feature? It is a small example, but it gets worse every year as Nvidia's market share increases and they push more and more proprietary features.

Valid points.

I don't disagree that it is concerning what will happen with Nvidia-sponsored titles now that the 40xx series is out with a lot of extra ray tracing capability (see all the specific improvements in Portal RTX and CP 2077 that appear to be tied to 40xx hardware only), but if it means pushing the rest to act in order to get to the RT era sooner rather than later, I'm OK with that. Again, no one is forcing gamers to enable/max out RT settings.
 