NVIDIA ‘Ampere’ 8nm Graphics Cards

HDR support in games will come with time. If the consoles push it then expect it to come sooner rather than later.

I enabled HDR in Death Stranding for example, and it looked awful. It's also a cumbersome technology as it needs to be disabled in Windows when the game is over else everything else is negatively affected.
 
You mean the 2 x 6-pin PCIE connectors we already have? 2 x 6 = erm, what's the answer?
Making a 12-pin socket for a 12-pin adapter to plug into the same PCIE power is utterly stupid and a waste of money, and sounds like someone is taking the mick.
I don't think it's that far-fetched. The reason they'd move to a single 12-pin adapter is to prevent people (like myself) using a loop-de-loop cable, which might not be rated to carry the full power draw required.

It effectively forces you to plug into 2 PCIE connectors on the PSU end instead of using one i.e. same as using two separate cables.
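For a rough sense of what each plug combination is allowed to deliver on paper, here's a quick back-of-envelope sketch in Python. It only uses the usual PCIe figures (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin); nothing in it is a confirmed spec for any Ampere card.

# Back-of-envelope PCIe power budget. The per-connector limits are the
# usual spec figures; the combinations are purely illustrative.
SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # per 6-pin connector
EIGHT_PIN_W = 150  # per 8-pin connector

def budget(six_pins, eight_pins):
    """Total board power available from the slot plus the given connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(budget(2, 0))  # 2 x 6-pin + slot = 225 W
print(budget(0, 2))  # 2 x 8-pin + slot = 375 W
# A daisy-chained ("loop-de-loop") cable asks one lead to carry the whole
# connector share on its own: 2 x 150 W = 300 W down a single cable.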
 
I enabled HDR in Death Stranding for example, and it looked awful. It's also a cumbersome technology as it needs to be disabled in Windows when the game is over else everything else is negatively affected.
I reckon one needs an OLED to get the proper HDR experience, the next best thing being those overpriced FALD monitors. The rest that are only 400 or 600 certified are not really proper HDR imo.
 
That is correct.

I have yet to play a game that looks better in SDR than HDR, but I have a screen that does proper HDR so that could be why I haven't run into those issues.
Yea. Playing PS4 exclusives on my PS4 Pro and OLED TV with HDR really helped make those games look graphically stunning. Spider-Man, God of War and Horizon Zero Dawn in particular.

Once I pick up an RTX 3070 I will take my PC downstairs and connect it to my OLED and test out some games. OLED + G-Sync + 4K 120Hz + Proper HDR :D
 
I don't think it's that far-fetched. The reason they'd move to a single 12-pin adapter is to prevent people (like myself) using a loop-de-loop cable, which might not be rated to carry the full power draw required.

It effectively forces you to plug into 2 PCIE connectors on the PSU end instead of using one i.e. same as using two separate cables.

I hope it's just a crappy rumour as I'm not buying another £100 PSU just to run an Nvidia graphics card. Maybe another reason to go with AMD if they keep the 8-pin.
 
I hope it's just a crappy rumour as I'm not buying another £100 PSU just to run an Nvidia graphics card. Maybe another reason to go with AMD if they keep the 8-pin.
I suspect it'll only affect the top-tier cards. If you have a fully-modular PSU, I don't think you have to be concerned - just order whatever cable becomes necessary from Corsair or whoever. If you have a non-modular PSU, then yes, you may need to get a new PSU. But honestly, if you're prepared to spend 3080Ti money and skimped on your PSU, you don't get my sympathy.

I really don't think nVidia would do this (if it's true) on a whim just to annoy people, but I can see it from an engineering, marketing and legal standpoint. They have to weigh up the risks of stupid people (I include myself in that category :D) just blindly plugging in the card, ignoring the warnings in the instructions, and using a single cable to bridge the two 8-pin connectors. It only takes a handful of reports of people's PCs smoking/melting cables or bursting into flame to cause a major PR shi..-storm and kill the product.
 
I suspect it'll only affect the top-tier cards. If you have a fully-modular PSU, I don't think you have to be concerned - just order whatever cable becomes necessary from Corsair or whoever. If you have a non-modular PSU, then yes, you may need to get a new PSU. But honestly, if you're prepared to spend 3080Ti money and skimped on your PSU, you don't get my sympathy.

I really don't think nVidia would do this (if it's true) on a whim just to annoy people, but I can see it from an engineering, marketing and legal standpoint. They have to weigh up the risks of stupid people (I include myself in that category :D) just blindly plugging in the card, ignoring the warnings in the instructions, and using a single cable to bridge the two 8-pin connectors. It only takes a handful of reports of people's PCs smoking/melting cables or bursting into flame to cause a major PR shi..-storm and kill the product.

There will be an adaptor. If you already have a decent PSU it most likely has 2 or more 6+2-pin connectors, and 16 pins into 12 pins should be plenty. A single 12-pin connector will be easier to insert into the graphics card, and the leads will give you room to fit the multiple connectors - fitting two 8-pin connectors is often a bit of a mare, so this is a good thing and makes it easier to fit the GPU.

There is absolutely no way that you will NEED to fit a new PSU; whoever says that is being really, really dim.
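If it helps picture the "16 pins into 12 pins" maths, here's a tiny sketch. It takes the standard 8-pin PCIe layout (three +12 V pins plus five ground/sense pins) as given and assumes the rumoured 12-pin is the obvious six +12 V / six ground arrangement, which is not confirmed.

# Pin-count sketch for a 2 x (6+2)-pin -> 12-pin adaptor.
# Assumption: the rumoured 12-pin uses six +12V and six ground pins.
eight_pin = {"+12V": 3, "GND": 5}   # standard 8-pin PCIe plug
plugs = 2                           # two 8-pin plugs feeding the adaptor

available = {rail: plugs * count for rail, count in eight_pin.items()}
required = {"+12V": 6, "GND": 6}    # assumed 12-pin layout

for rail in required:
    print(rail, available[rail], ">=", required[rail],
          available[rail] >= required[rail])
# Six +12V wires in, six +12V pins out - every live pin gets its own
# conductor, and there are spare grounds to consolidate.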
 
There will be an adaptor. If you already have a decent PSU it most likely has 2 or more 6+2-pin connectors, and 16 pins into 12 pins should be plenty. A single 12-pin connector will be easier to insert into the graphics card, and the leads will give you room to fit the multiple connectors - fitting two 8-pin connectors is often a bit of a mare, so this is a good thing and makes it easier to fit the GPU.

There is absolutely no way that you will NEED to fit a new PSU; whoever says that is being really, really dim.
spot on.
 
I suspect it'll only affect the top-tier cards. If you have a fully-modular PSU, I don't think you have to be concerned - just order whatever cable becomes necessary from Corsair or whoever. If you have a non-modular PSU, then yes, you may need to get a new PSU. But honestly, if you're prepared to spend 3080Ti money and skimped on your PSU, you don't get my sympathy.

I really don't think nVidia would do this (if it's true) on a whim just to annoy people, but I can see it from an engineering, marketing and legal standpoint. They have to weigh up the risks of stupid people (I include myself in that category :D) just blindly plugging in the card, ignoring the warnings in the instructions, and using a single cable to bridge the two 8-pin connectors. It only takes a handful of reports of people's PCs smoking/melting cables or bursting into flame to cause a major PR shi..-storm and kill the product.

There will be an adaptor. If you already have a decent PSU it most likely has 2 or more 6+2-pin connectors, and 16 pins into 12 pins should be plenty. A single 12-pin connector will be easier to insert into the graphics card, and the leads will give you room to fit the multiple connectors - fitting two 8-pin connectors is often a bit of a mare, so this is a good thing and makes it easier to fit the GPU.

There is absolutely no way that you will NEED to fit a new PSU; whoever says that is being really, really dim.

Yea I guess that's true, I do have a modular EVGA 650 80+ Gold so hopefully I'm fine.

I fancy a 3080Ti depending on the price but I'll probably end up staying in the £500 range, which I think is a good chunk of money for a GPU.

Probably save the rest for a PS5.
 
There will be an adaptor. If you already have a decent PSU it most likely has 2 or more 6+2-pin connectors, and 16 pins into 12 pins should be plenty. A single 12-pin connector will be easier to insert into the graphics card, and the leads will give you room to fit the multiple connectors - fitting two 8-pin connectors is often a bit of a mare, so this is a good thing and makes it easier to fit the GPU.

There is absolutely no way that you will NEED to fit a new PSU; whoever says that is being really, really dim.
I don't think it's about the plug or its awkwardness - it's about the cables. If everyone stuck to using two separate 6/8-pin cables, there most likely wouldn't be any issue. The problem is a lot of PSU manufacturers hand out the looped cables (2 6+2 connectors off a single cable) and many people use them for convenience. That's perfectly fine for anything up to a 2080Ti at the moment, but it probably cuts it very fine for what's required from something like a 3080Ti. Moving to a 12-pin connector solves that problem - you HAVE to use two 8-pin outputs on your PSU over a suitably-rated 12-pin cable. No shortcuts. No potential fire risk.
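To put rough numbers on the "cuts it very fine" point, here's a quick sketch comparing one looped lead against two dedicated leads. The 300 W figure and the three +12 V conductors per lead are illustrative assumptions, not measurements from any real card or cable.

# Current per conductor: one daisy-chained lead vs two dedicated leads.
# Figures are illustrative assumptions, not measured values.
draw_w = 300.0       # both 8-pin plugs at their nominal 150 W each
rail_v = 12.0        # +12 V rail
wires_per_lead = 3   # +12 V conductors in a typical PCIe lead

def amps_per_wire(leads):
    """Current each +12 V conductor carries when the draw is spread over `leads` cables."""
    return draw_w / rail_v / (leads * wires_per_lead)

print(f"single looped lead : {amps_per_wire(1):.1f} A per wire")  # ~8.3 A
print(f"two dedicated leads: {amps_per_wire(2):.1f} A per wire")  # ~4.2 A
# Doubling up on one cable doubles the current through every conductor and
# crimp, which is exactly the margin the 12-pin arrangement removes.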
 
Yea I guess that's true, I do have a modular EVGA 650 80+ Gold so hopefully I'm fine.

I fancy a 3080Ti depending on the price but I'll probably end up staying in the £500 range, which I think is a good chunk of money for a GPU.

Probably save the rest for a PS5.
You'll be fine :)

While this is all still just conjecture and theory, I'd be highly surprised if the 3080Ti came in at under £1000. £500 would probably net you a 3070 I suspect, which would most likely be well within your power budget (and probably not require a 12-pin cable!)
 
You'll be fine :)

While this is all still just conjecture and theory, I'd be highly surprised if the 3080Ti came in at under £1000. £500 would probably net you a 3070 I suspect, which would most likely be well within your power budget (and probably not require a 12-pin cable!)

Yea I've always had the 70 variant so it will more than likely be a 3070 for me again. £1000 is just nuts - I'd rather have a 3070, hoping it can do 4K at 60 fps, plus a PS5 for their exclusives, at the same price as a 3080Ti.
 
Considering every GPU will be paired with a monitor (unless it's mining), talk of displays is completely relevant; the display you choose will dictate the GPU you need.

All this talk of nits is Zzz... right enough. I'm following this thread because I need high refresh + G-Sync and my 1080Ti is giving me neither with my OLED. The main question for me is whether I need to aim as high as a 3080Ti or will one of the lower cards cut it? The 3070 should be enough but maybe my budget will push the ceiling a bit higher.

And RTX, I'm interested in how much this improves or will Nvidia ditch it? And HDR, is there a game that supports it properly? Does RTX negate the requirement for HDR?

So much to discuss... :)

You got an older OLED? As for RTX vs HDR, if it's a matter of putting more of the money towards a beefier GPU for better ray-tracing OR towards a better display for HDR, I'd pick HDR every time. But it's also the case that the display is (hopefully) going to be with you longer than a GPU, so it's more important to sort that one out first. Got my TV when I still had only an RX 480 but it was still the right choice. The picture quality boost is immense compared to just "higher settings".
 
You got an older OLED? As for RTX vs HDR, if it's a matter of putting more of the money towards a beefier GPU for better ray-tracing OR towards a better display for HDR, I'd pick HDR every time. But it's also the case that the display is (hopefully) going to be with you longer than a GPU, so it's more important to sort that one out first. Got my TV when I still had only an RX 480 but it was still the right choice. The picture quality boost is immense compared to just "higher settings".

It's a new model, LG-48CX. I think I need to play with the HDR settings going by the replies in this thread - I'll post in the dedicated monitor thread as it'll take things too far off topic in here. :)
 