
NVIDIA ‘Ampere’ 8nm Graphics Cards

R.I.P Nvidia! :p


EDIT: If it's an Nvidia proprietary connector, then surely they'll have to charge you to supply the adapter cables with the cards.
There, fixed that for you. This is Nvidia we're talking about. :D


Is it too early to start on the hot and loud memes?
Never!!!
Vegeta is highly surprised as he breaks his scouter. :p

But don't worry, I'm sure there'll be another news post in a few minutes saying it's still fake.
 
How long are these people's memories?

a 3080 should beat a 2080ti quite handily

The 2080 gave about the same performance as the 1080Ti, so Nvidia priced it the same as the 1080Ti. If Nvidia gives Ampere the Turing treatment, the 3080 would cost 20% more than a 2080Ti, because it's 20% faster than a 2080Ti.
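For illustration, the "Turing treatment" pricing logic above can be sketched as follows. The $999 figure is the 2080Ti launch price quoted later in the thread, and the 20% uplift is the hypothetical from this post, not anything official:

```python
# Hypothetical "Turing treatment" pricing: the new card is priced at the old
# flagship's price scaled by its performance uplift over that flagship.
# Figures are the ones quoted in this thread, not official pricing.
def turing_treatment_price(prev_flagship_price, perf_uplift):
    """Price the new card at the old flagship's price plus the uplift."""
    return prev_flagship_price * (1 + perf_uplift)

# A 3080 that's 20% faster than a $999 2080Ti would land around $1199.
print(turing_treatment_price(999, 0.20))
```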

I'm hopeful Turing isn't the new norm though.
 
Yeah, I'm not sure what the original price of a 2080Ti was, as a top-of-the-line component was beyond my budget, but my guess is it should be well south of £999. The $699 price of the 2080 Super seems more sensible, and the equivalent 3000-series product would be best placed at that point in pounds (£699).

The original price was $999.
 
When AMD have had the better flagship cards in the past (a long time ago), Nvidia have usually resorted to software measures to artificially lock AMD out of a feature or a performance mode, or to lowering image quality in drivers.

That said, I highly doubt AMD's top-end Big Navi card will outperform Nvidia's top end. I expect the 3080Ti to run rings around Big Navi in performance-per-watt terms. It's likely the card I'll get to power my CX48.

AMD has a much better chance of having the superior mid-to-high-end offering in terms of performance per dollar. It won't matter, though, thanks to the Nvidia influencer army, who'll convince the masses to get the more expensive 3060/3070 over the potentially faster AMD equivalent.

Having the fastest flagship is important in influencing/manipulating the minds of simpletons (who then go on to buy 3050/3060/3070 cards in huge numbers); it's something AMD needs to work on.

Nvidia influencer army? I presume Intel had one too. Didn't see the influencer army when the RX 580 was one of the most recommended cards on this and other forums. There was only one thing the 1060 did better, and that was VR. And Polaris sales were reflected in the market share: they sold well once the launch problems were fixed. The problem with that was it left the 1060 basically on its own for over 6 months. Navi has sold well and increased market share as well.

Maybe Nvidia need to hire a new influencer army.

As for Nvidia resorting to tricks, are you forgetting your GPU history? AMD/ATI got caught doing this several times too. Neither of these companies is innocent.

I asked you in another thread to list 10 of these GPUs that AMD had that were faster, cheaper and used less power, but you haven't answered me yet.

It should be easy as you said it has happened countless times.
 
Oh dear, it's coming, and to custom cards too, and it's a new pin layout, so 2x 6-pin won't work; it's a new 2x 6-pin to 12-pin adapter cable for all :/


Gamers Nexus says that it's OEM-only (for pre-built systems) and consumer cards will have standard 2x 8 pin connectors.
 

MSI used 2x 8-pin and 1x 6-pin on some of the 2080Ti cards, so they may go with that, but yeah, Steve said that when he spoke to vendors he was told only Nvidia reference cards will use the 12-pin and there would be an adapter, so no, you don't need a new PSU.

Though I might get a new PSU anyway - I'm on 750w for the moment and with a 250w CPU and 400w GPU, it's getting a little close.
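The headroom worry above can be made concrete with a quick sketch. The CPU and GPU wattages are the ones quoted in the post; the extra 75w for the rest of the system and the 20% headroom margin are rough assumptions, not vendor specs:

```python
# Quick PSU headroom check. CPU/GPU wattages are the figures from the post;
# the 20% spare-capacity margin is a common rule of thumb, not a spec.
def psu_headroom(psu_watts, component_watts, margin=0.20):
    """Return (total draw, watts spare, within margin?) for a parts dict."""
    total = sum(component_watts.values())
    budget = psu_watts * (1 - margin)  # leave `margin` of capacity unused
    return total, psu_watts - total, total <= budget

# 75w for board, drives and fans is a rough guess for illustration.
parts = {"cpu": 250, "gpu": 400, "board_and_rest": 75}
total, spare, ok = psu_headroom(750, parts)
print(f"draw {total}W, spare {spare}W, within 20% margin: {ok}")
```

On those numbers a 750w unit is down to about 25w of spare capacity at full load, which is why it feels "a little close", even if CPU and GPU rarely peak together outside of stress tests.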

 



I have a 750W one as well.
Looking around they want north of £240 for a 1000W :(
 

I have an 850w now; I also had an hx1000i, but I swapped it out as she's getting old. I haven't seen it come close, even with a 2080Ti. It's unlikely both the GPU and CPU are going to be fully loaded at the same time unless you're using it for mining etc. Gaming load on the CPU is next to nothing on my 3950X! We're talking 60w.

It actually uses less power flat out than my 9900k did, rofl.

Guess it's all down to the next-gen cards, but it will be a really sad day if next-gen Ampere requires 600w of power :(.
 
I am totally okay with 3-slot, 4-slot or even 5-slot GPUs. Look at the massive coolers we put on CPUs and compare them with the tiny coolers we expect on GPUs.
 
High power PSUs are in short supply at the moment, which is driving up the price. I'm not quite sure why, but I imagine it's supply chain disruption from the 'rona.


I'm seeing the same.

Stop buying at ocuk, they are the most expensive retailer around.

You can pick up an EVGA 1000w for less than £180 if you look around.

Thanks.

Off topic, but does anyone remember the site that showed you who the actual maker of a power supply was?
 