NVIDIA ‘Ampere’ 8nm Graphics Cards

That's not the answer though, is it? Shops lose money because returned items are often sold as B-grade for less money. Essentially it's theft.

That law was put in place to help buyers who make a genuine and honest mistake.

It shouldn't be abused by impatient and selfish people.

Nothing wrong with exercising your consumer rights, especially given that some of these large online retailers avoid UK tax, which is essentially theft from the UK taxpayer.
 
AMD's fault mate.
I kindly disagree. It's the consumer's fault. It's a luxury item, not a necessity for most people, meaning if you don't pay the tax, the company will have to adjust. Same with the 5700 XT: they have been selling well at the £300-350 mark even though they were supposed to be around £250 (RX680).
 
I have an NZXT 630 (Ultra Tower) and removed the HDD cages, as there's no need for them, plus better airflow from the front 200mm fan.

Cannot stand the new cases of the past few years.

Exactly the same with my Corsair 750D. Without the HDD cages there’s loads of room. Probably 500mm or more of clearance for a GPU.
 
No. The RTX 2080S has 8GB and is still 20% faster than the RX 5700 XT.

RX 5700 XT has other bottlenecks well before VRAM hits its ceiling.

I'm maxing out my 8GB in Star Citizen at 1440p; I'm sure there are other games that do this too...

Dude, the R9 390 had 8GB for £250, and the RX 480 and then the RX 580 had 8GB... turn it up, for #### sake, 8GB on mid-level cards is so 2014.
 
Looking forward to everyone buying these so I can pick up a second hand bargain :D

A 2080ti for £400-500 would do me well for a good few years.
 
Well, maybe AMD will ride to the rescue with 12GB and 16GB cards at the 3070 and 3080 price points.

One of them, or both, need to do something, because for me it's not about GPU muscle anymore; I have plenty of that. It's my measly 8GB buffer that's holding me back. Any card faster than mine that doesn't have 12GB or more is useless to me, because when I turn the resolution up, texture streaming spills over and then chokes on my NVMe drive.
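
Purely as an illustration of that spill-over effect (this is not any real engine's streaming code, and every name in it is made up), here's a rough Python sketch: once the working set of textures exceeds the VRAM budget, an LRU-style streamer has to evict and re-read from the drive on every frame, so texture throughput collapses to storage speed.

```python
# Purely illustrative sketch of VRAM-budget thrashing; not any real
# engine's streaming code, and all names here are hypothetical.
from collections import OrderedDict

VRAM_BUDGET_MB = 8 * 1024  # hypothetical 8GB card

class TextureStreamer:
    """LRU-style texture residency: evict the oldest texture when over budget."""

    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # tex_id -> size_mb, in LRU order
        self.used_mb = 0
        self.disk_loads = 0

    def request(self, tex_id, size_mb):
        if tex_id in self.resident:
            # Hit: texture already in VRAM, just refresh its LRU position.
            self.resident.move_to_end(tex_id)
            return
        # Miss: must stream from the drive (the slow path complained about above).
        self.disk_loads += 1
        while self.used_mb + size_mb > self.budget_mb and self.resident:
            _, evicted_mb = self.resident.popitem(last=False)  # evict LRU texture
            self.used_mb -= evicted_mb
        self.resident[tex_id] = size_mb
        self.used_mb += size_mb

# A ~10GB working set of 64MB textures cycled through an 8GB budget:
# with LRU and a cyclic access pattern bigger than the cache, every
# request misses on every frame, so the drive never gets a rest.
streamer = TextureStreamer(VRAM_BUDGET_MB)
for frame in range(3):
    for tex_id in range(160):  # 160 x 64MB = 10240MB > 8192MB budget
        streamer.request(tex_id, 64)
print(streamer.disk_loads)  # 480: all 160 textures reload on all 3 frames
```

The point of the toy numbers: the working set only slightly exceeds the budget, yet the miss rate goes to 100%, which is why performance falls off a cliff rather than degrading gently.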
 
It’s really hard to see how much VRAM games require, considering there’s a load of caching going on; usage does not equal need.
It’s really obvious when you do hit the limit, though: performance drops through the floor.

Have we reached that point yet with 8-11GB at 3840 × 2160?
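
For anyone who wants to watch the allocated figure themselves, here's a minimal sketch using the nvidia-ml-py (pynvml) NVML bindings; it assumes an NVIDIA card and driver, and, per the point above, it reports what's allocated, not what the game actually needs.

```python
# Minimal sketch: poll allocated VRAM via NVML (pip install nvidia-ml-py).
# Assumes an NVIDIA GPU and driver. Remember this shows what's *allocated*,
# which, as noted above, is not the same as what the game *needs*.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # bytes: total/used/free
        print(f"VRAM allocated: {mem.used / 1024**3:.1f} / "
              f"{mem.total / 1024**3:.1f} GB")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```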
 
One of them, or both, need to do something, because for me it's not about GPU muscle anymore; I have plenty of that. It's my measly 8GB buffer that's holding me back. Any card faster than mine that doesn't have 12GB or more is useless to me, because when I turn the resolution up, texture streaming spills over and then chokes on my NVMe drive.

My 11GB on the 1080Ti fills up completely in a number of games. 16GB needs to be standard. I don't know if there's a performance hit when it hits that point, but there's no room to spare.
 
I can tell you a little bit about Intel's practices. Mostly because I witnessed it first hand.

The reason I stopped giving Intel my money was the high prices and tiny performance gains (I wasn't even aware of the shady behind-the-scenes stuff). Nvidia did the exact thing (with Turing) that turned me against Intel: tiny performance gains for big prices.

Complain about Nvidia all you like, that is up to you. I've done plenty of it myself. However, they do always deliver the goods.

No. Pascal "delivered the goods". Just about every generation before Turing delivered. Nvidia changed direction with Turing for the worse. If they do it again with Ampere, they go into that Intel category with me.

I'm not saying I'll never buy Nvidia while I dislike them. But they will have a much harder time selling me their stuff. If they pull another Turing, AMD won't have to beat them. Just get close enough for me to get a decent upgrade without giving Nvidia my money.
 
No. Pascal "delivered the goods". Just about every generation before Turing delivered. Nvidia changed direction with Turing for the worse. If they do it again with Ampere, they go into that Intel category with me.
They were probably planning to deliver another Turing, but then got spooked by AMD, so have now been forced to crank up the power usage to boost performance, as they can't go back and redesign the whole GPU.
 
The reason I stopped giving Intel my money was the high prices and tiny performance gains (I wasn't even aware of the shady behind-the-scenes stuff). Nvidia did the exact thing (with Turing) that turned me against Intel: tiny performance gains for big prices.



No. Pascal "delivered the goods". Just about every generation before Turing delivered. Nvidia changed direction with Turing for the worse. If they do it again with Ampere, they go into that Intel category with me.

I'm not saying I'll never buy Nvidia while I dislike them. But they will have a much harder time selling me their stuff. If they pull another Turing, AMD won't have to beat them. Just get close enough for me to get a decent upgrade without giving Nvidia my money.

I don't think Ampere is anything like Turing. At all.

As I said, technically Turing was just a slightly shrunk Pascal with some Tensor cores on. It was not Ampere, and was never even supposed to happen. However, you can't say it didn't deliver when everything else says otherwise.

Was it as impressive as Pascal? No, not in a raw-performance, "ignoring RTX and all of the new tech" way, it wasn't. However, it did deliver the goods: it delivered RT where we had none before, it (eventually!) delivered DLSS, which makes it much faster than Pascal, and far better encoding and streaming than Pascal had.

Those who bought it most likely bought it for RT. However, DLSS is, IMO, the best thing to come out of Turing and it will only get better. Those tensor cores most definitely do work.

It sold, though. You don't have to like that, but like I said, it definitely delivered. If you take a 1080Ti and a 2080Ti, run the same game at 4K and utilise DLSS, the 2080Ti will absolutely mangle the 1080Ti. DLSS is the best thing to happen in years, and it happened first on Turing.

They were probably planning to deliver another Turing, but then got spooked by AMD, so have now been forced to crank up the power usage to boost performance, as they can't go back and redesign the whole GPU.

That isn't the case either. We should have had Ampere over two years ago. The fact is that the original Samsung process it was supposed to use absolutely sucked, and Nvidia had to do something. So they stuck with TSMC, paid out the bum for it, and gave us Turing. However, as I maintain, it was never supposed to happen. They were supposed to jump from Pascal to Ampere on Samsung, but it just didn't happen. It took Samsung another process to get it even close to ready, and the power use will show us that even the revised node and technology are still far from perfect.

Would you honestly have preferred it if they had skipped Turing and you'd had to wait about four years for Ampere? Because in reality, that is what would have happened.
 
They were probably planning to deliver another Turing, but then got spooked by AMD, so have now been forced to crank up the power usage to boost performance, as they can't go back and redesign the whole GPU.


That's exactly what it is: they're shovelling coal on like there's no tomorrow. That's why the card is so huge, with a totally new fan design, and is triple-slot.

If this were AMD's forthcoming top card, it would be getting memed to **** over power usage and size.
 