To get 600W you need the 4x PCIe 8-pin to 16-pin adapter.
2 connectors won't do it.
To simplify things, we’re introducing a dedicated 12VHPWR Type-4 cable that runs up to 600W of power directly from your Type-4 CORSAIR PSU. Giving you peace of mind that your graphics card is getting all the power it needs, while also simplifying cable management (we know that cable bulk adds up very quickly). Each of the two PSU-side connections is rated for 300W of power per connector, adding up to the 600W needed for the 12VHPWR side for your graphics card.
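(Rough numbers, assuming the usual connector ratings: a standard PCIe 8-pin is specced for 150W, so Nvidia's adapter needs 4 x 150W = 600W, while Corsair rates each Type-4 PSU-side connector at 300W, so the dedicated cable only needs 2 x 300W = 600W.)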
oops, sorry!
Nah, you're good. The video was released 9 hours ago. I doubt anyone was awake to post it.
The monitor manufacturers just passed the cost onto customers, so that point is moot.
Completely different to FreeSync vs G-Sync too, the same way DLSS vs FSR is completely different. In terms of FreeSync/G-Sync:
- monitor manufacturers have to pay for the nvidia module (developers/publishers don't have to pay for RT)
- monitor manufacturers have to change their monitor chassis design to accommodate the G-Sync module and its cooling, which isn't a simple task (obviously developers have to change up their workflow and perhaps do things differently, or do things they're not familiar with, and no doubt some will find the old rasterization way better since they know it... but ask any developer and they will tell you that learning new tools/workflows is part of the job. It's all about finding better ways to do your job more efficiently, and it's expected of you too, especially if you have management breathing down your neck about why things are taking so long)
The G-Sync module at one point also only worked with Nvidia GPUs; AMD cards work with it now too, but I digress... the point is that RT is supported by every GPU released since Turing.
The only way RT will die now is if the hardware providers, i.e. Intel, AMD, Nvidia, etc., all pull out of supporting it. BTW, ARM supports RT now too.
Not with that cable on a Corsair you don't. It shares the same pinout on the PSU side as the 8-pin CPU EPS-12V, which can do over 300 watts per connector.
It doesn't use a PCIe adapter; it plugs directly into the PSU.
Some details here that have already been posted.
Ready to Go Beyond Fast
NVIDIA GeForce RTX 40-Series graphics cards are coming and CORSAIR power supplies are ready. (forum.corsair.com)
The monitor manufacturers just passed the cost onto customers, so that point is moot.
You also seem to be conflating Nvidia RTX with RT. Like I said, Nvidia RT (aka RTX) is good, or even great, but people don't want to pay £100s for good or great; they want to pay nothing, or very little, for something that's good enough, something that doesn't add £100s to the cost of what they're used to.
Well, not really, as it adds to the cost of making the product; RT doesn't add anything to the cost of making a game. The whole purpose of RT for developers/publishers is to save time, which saves money for the publishers. Also, it's not "proprietary" like G-Sync was/is.
RTX is just Nvidia's branding name for "RT", hence why I think many people/AMD fans hate on it, i.e. they associate RT with being an Nvidia feature and/or favouring Nvidia (when the reality is, as you know, that Nvidia is simply better at it).
You aren't paying anything extra for RT support (well, maybe it could be argued that for the 40xx series you now are... time will tell on that front with Nvidia-sponsored titles). You don't have a choice of buying a non-RT GPU at all now, at least not in anything released since Turing, so you're not directly paying for RT; it's standard, the same way you aren't paying for tessellation or adaptive sync support. The only way you would be paying extra for it is if you were offered two GPUs, one cheaper with no RT and another with RT support that cost more.
Again you seem to be moving the goal posts: no one said it adds to the cost of the game, they said it adds cost to the product, to the graphics card itself. I assume you're moving the goal posts because you don't want to accept that the majority of people don't want to pay £100s for great; they want to pay £10s, or better yet nothing, for good enough. Especially when what you're paying extra for is only useful 5% of the time (if that).
And no, RTX is not just the branding. (Nvidia worked with Microsoft to integrate RTX support with Microsoft's DirectX Raytracing API (DXR))
The reason you don't have a choice in buying an RT GPU (again, you're conflating RTX with RT, aka DXR) is because DX12 supports DXR, and as above Nvidia worked with MS to include RTX support in DXR. But DXR is more than just RTX: if your GPU supports DX12 Ultimate it supports DXR, but not all GPUs support RTX (because not all GPUs support NGX, USD and MDL, OptiX and CUDA).
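To make the RTX-vs-DXR distinction concrete, here is a minimal sketch (my illustration, not from anyone's post) of how an engine queries DXR support through plain D3D12, assuming an already-created ID3D12Device5 named device; nothing vendor-specific is involved.

```cpp
// Minimal sketch: query vendor-neutral DXR support through plain D3D12.
// Assumes `device` is an already-created ID3D12Device5*.
#include <d3d12.h>

bool SupportsDXR(ID3D12Device5* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;

    // Any GPU reporting tier 1.0 or higher can run DXR ray tracing,
    // whether it comes from Nvidia, AMD or Intel; RTX is Nvidia's
    // hardware/software stack sitting underneath the same API.
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```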
Sure but it's the same as G-Sync vs FreeSync IMO; the majority of people don't want to pay £100s for something with little discernible difference, they want good enough.
There's no question Nvidia RT is good, even great, but people don't want to, or can't, pay £100s for a high-end solution. The majority of the market is in the middle, and the majority prefer to pay for something that's good enough.
Ah, good to see we're back to the usual: when called out, or when a flaw in one's logic is highlighted, it becomes "moving goal posts".
You stated this:
Sure but it's the same as G-Sync vs FreeSync IMO; the majority of people don't want to pay £100s for something with little discernible difference, they want good enough.
Then I pointed out how it is nothing like G-Sync vs FreeSync, to which you then stated:
The monitor manufacturers just passed the cost onto customers, so that point is moot.
Show me a GPU of identical spec where one has RT support and the other doesn't. If you can do that, and show that one is paying extra for the RT GPU, then you are 100% right: you are paying extra for RT support. As I said, back when Turing first came out you were 100% paying Nvidia for the RTX features, which included RT, DLSS and Reflex. That is no longer the case now, as "every" GPU released since then can enable and run the same RT effects, be that "RTX" or not.
As for the "5% of the time": that's entirely subjective. As stated before, I personally have played far more RT titles than raster-only titles in the past 2 years, where the difference is noticeable throughout the game for far more than "5% of the time". Of course there will be people who have played far more raster-only titles and/or turn RT off, hence it being subjective.
The only way you could "potentially" argue you are paying for RT support is by saying you are paying for a better RT experience, e.g. 6800 XT @ £600 vs 3080 @ £650, i.e. you are paying £50 extra for the extra RT performance. But then you have far more factors feeding into that extra cost, i.e. DLSS, ShadowPlay/NVENC and whatever other advantages/pros Nvidia/the 3080 has over AMD/the 6800 XT to justify the extra £50.
Show me "RTX" games where ray tracing doesn't work on AMD/intel hardware?
Or just, you know, pointing out how you're moving the goal posts. If you'd pointed out any flaws I'd happily discuss them, but you haven't. All you've done is make a weak argument about monitor manufacturers paying for G-Sync, as if that cost was never passed on to customers, and then an even weaker argument about RTX not adding to the cost of games (which, given your earlier metric of G-Sync adding to monitor manufacturers' costs, is laughable), when no one even said it added to the cost of games; they said it added cost to graphics cards.
Is this like last time we had a discussion, where it took you tens of posts and multiple pages to get to the point? I know what I said and I know what you said; if it isn't obvious by me reading your post, I can actually read, ya' know.
If I showed you a GPU with identical specs, I'd be showing you two of the same card; the question doesn't even make sense. Not only that, you are once again conflating RTX with DXR (what even are you referring to when you say RT: do you mean RTX, do you mean DXR, do you mean the ray tracing I could do on my first ever CPU, a 486 DX2?). Do you even understand what ray tracing is?
I really CBA with the rest of what you've said, as this conversation is going the same way all your conversations go: you not being able to form coherent sentences, conflating A with B, and multiple posts just to get to the bottom of what you're trying to say. I'm out.
Show me "RTX" games where ray tracing doesn't work on AMD/Intel hardware (you seem to be inferring that RT implemented with the RTX APIs/tools provided by Nvidia only works on Nvidia hardware?).
Quake 2 RTX, and CP 2077 for a while. OK, Quake 2 was an Nvidia proprietary API, but CP 2077 was claimed to be DXR. How can you block some cards from running such a game? It is like saying you can lock some cards out of running a DX12 game. Same for AMD's huge blockbuster Godfall: for a while it didn't work on Nvidia hardware.
Again, why are you putting "ray tracing" forward as an "Nvidia" thing?
I put Nvidia first because they are in a position to control the gaming market. They are not the only bad player out there; what AMD did with Godfall and Far Cry 6 was also wrong. It doesn't even make sense economically to block Nvidia features when they control 80% of the gaming market (and probably even more of the high-end PC gaming market); it means you are not making a game to sell it, you are making a game to promote a product. Again, Godfall and FC6 are good examples where features were used to give an unfair advantage to one card manufacturer.
And why aren't developers pushing for it? Have you watched/read any of the content by the developers who have been using it? It's pretty obvious why they want to move to using it.
There are plenty of games out there that aren't sponsored by either amd, intel or nvidia which have RT too. Just do a google for yourself to see this.
Is anyone forcing people to turn on/max out RT? Nope.
Also, CP uses Microsoft's DirectX Raytracing API. I can't comment on why we had to wait for RT support on both AMD and consoles; IIRC there was a tweet where CDPR said they were working with AMD on it. Did Nvidia perhaps pay for a timed exclusive on it? More than likely, same as happened with Godfall's RT for Nvidia. But again, you're acting like this is the case with every RT game, when it's not.
Again, see Metro Exodus Enhanced Edition, an RT-only title. For some reason people don't seem to like pointing to Metro EE as the reference for what a good RT implementation looks like, where it even runs very well on RDNA 2. Maybe because it's an Nvidia-sponsored title?
AMD is perfectly capable of running RT to some extent, and in fact they are even improving their performance; I believe there was an article the other day where a driver update brought a 10% improvement in Quake II RTX? Nvidia is simply better from a hardware point of view though, and given their extra experience/head start, it's no surprise Nvidia GPUs generally run better in all RT scenarios, even AMD-sponsored ones.
anybody know what time on the 12th they go live to purchase?
You are better off asking here.
Nvidia actually did some of the heavy lifting to bring ray tracing to Vulkan in the first place. There are already existing tools in place to assist with translating DX12 calls and HLSL (High Level Shader Language) code into Vulkan and SPIR-V, respectively. Nvidia’s specific contribution to the project was to add ray tracing support to Microsoft’s open source DirectXCompiler, which is commonly used to port HLSL code to Vulkan.
In other words, Nvidia’s open source work is a key part of why AMD GPUs can now run Quake II RTX. This kind of ‘coopetition’, if you will, is a key part of ensuring standards are widely supported and making certain gamers can expect certain features on a wide range of systems. In theory, developers that already have a Vulkan and a DX12 path could keep Nvidia RTX support for one API and support both Nvidia and AMD in Vulkan. So far, we haven’t heard much about whether or not RTX-enabled games will receive an update to allow AMD to use ray tracing, or how much additional optimization is required to make use of the feature on RDNA2 GPUs as opposed to Turing / Ampere.
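To make the quoted point concrete, here is a rough sketch (my illustration, not from the article) of the vendor-neutral check on the Vulkan side, assuming gpu is an already-enumerated VkPhysicalDevice: the same two cross-vendor KHR extensions are implemented by Nvidia, AMD and Intel drivers, which is what lets Quake II RTX run on RDNA2.

```cpp
// Sketch: detect the cross-vendor Vulkan ray tracing extensions.
// Assumes `gpu` is a valid VkPhysicalDevice; error handling trimmed.
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

bool SupportsVulkanRayTracing(VkPhysicalDevice gpu)
{
    // Enumerate every device extension the installed driver exposes.
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());

    bool pipeline = false, accel = false;
    for (const auto& e : exts) {
        if (std::strcmp(e.extensionName, VK_KHR_RAY_TRACING_PIPELINE_EXTENSION_NAME) == 0)
            pipeline = true;
        if (std::strcmp(e.extensionName, VK_KHR_ACCELERATION_STRUCTURE_EXTENSION_NAME) == 0)
            accel = true;
    }
    // Both extensions are Khronos standards, not Nvidia-specific ones.
    return pipeline && accel;
}
```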
Think about this: we live in a different universe, and Nvidia, AMD and Intel are releasing cards of equal performance. Each of them sponsors a game, each puts their own custom code inside the game they sponsor, and each locks the other two out of adding their own custom code. Then you will see each of them beating the others in benchmarks.
The market is not equal right now, and most of the time Nvidia will have the lion's share in the number and quality of the games they sponsor. That is why, no matter what happens, Nvidia will have an advantage. I gave you an example with that thread ordering feature: how can Intel compete in CP 2077 when Nvidia will use custom code for their 4000 generation while Intel has to brute-force it, even if they also have their own thread ordering feature? It is a small example, but it gets worse every year as Nvidia's market share increases and they push more and more proprietary features.