
NVIDIA ‘Ampere’ 8nm Graphics Cards

Not sure anyone buying a 3080 is going to regret it.

They will. The 3080 will prove to be a mid-tier card in less than 6 months. Nvidia are up to clever tricks again, obviously holding back 12GB and 16GB Ti models to combat anything AMD launch. Once that happens, buyer's remorse will hit home. The 3080 is a monster of a card, no doubt, but £650 is still a huge chunk of money, and Nvidia have brainwashed everyone into thinking this is a flagship card at an incredible price.
 
Agreed on this.

Usually I love Asus, but that is clear price manipulation there. They won't get my money this gen. MSI probably will; I've never had one of their cards fail on me.

Was actually considering the Strix non-OC when it was initially listed for £699 (still not decided on upgrading), then it magically jumped to £829 after a few hours. Might try another brand or the FE if I decide to upgrade.
 
Was actually considering the strix non oc when it was initially listed for £699 [...]
I believe if the FE offered a 5-year warranty it'd be what most people go for, myself included. I'm not even sure how returns work with Nvidia; where are they based?
 
I still don't understand factory OCs on GPUs. Don't they use GPU Boost anyway, and boost to the maximum speeds they're capable of based on thermals etc.?

So what's the point of a pre-OC'd card?
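For what it's worth, GPU Boost does opportunistically raise clocks until power, thermal, or voltage limits bind; a factory OC mostly just shifts the boost table up by a fixed offset. A toy sketch of that idea (all coefficients and clock figures below are illustrative assumptions, not real firmware behavior):

```python
def boost_clock(base_mhz, max_boost_mhz, power_headroom_w, thermal_headroom_c,
                mhz_per_watt=5, mhz_per_degree=10, factory_oc_mhz=0):
    """Toy model: GPU Boost raises clocks until power or thermals bind.

    A factory OC just adds a fixed offset on top; the limiting factor
    (power/thermal headroom) stays the same. Numbers are illustrative.
    """
    headroom = min(power_headroom_w * mhz_per_watt,
                   thermal_headroom_c * mhz_per_degree)
    return min(base_mhz + headroom, max_boost_mhz) + factory_oc_mhz

# Same cooler, same power limit: the factory OC only adds its fixed offset.
stock = boost_clock(1440, 1900, power_headroom_w=60, thermal_headroom_c=20)
oc = boost_clock(1440, 1900, power_headroom_w=60, thermal_headroom_c=20,
                 factory_oc_mhz=30)
print(stock, oc)  # 1640 vs 1670
```

Which is why, in this toy model at least, a better cooler (more thermal headroom) tends to matter more than the small advertised OC offset, and why stock cards often boost past their rated clocks anyway.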
 
Samsung's yield rates aren't that good right now.
I'll keep an eye on defect rates for the 3000 series. Anyone remember the release defect rate of the 2080 Ti? Wasn't the return rate in the double digits, or 50% in the first month, or something like that?
Some early FE boards had issues, and then other Tis had the memory chip failures.
 
They will. 3080 will prove to be a mid tier card in less than 6 months [...]

If they buy a graphics card as a status symbol, maybe.

The $700 price point went almost nowhere with Turing. Instead, Turing charged $1,200 for performance that should have come in around that $700 mark.

I don't care what they call it. I don't care what "tier" it is put in. I don't care if it's called "midrange" by the masses.

I only care about two things:

1) How fast is it?

2) How much does it cost?

It's a graphics card, not a hair piece.
 
GDDR6X numbers released by Micron; OK, now I see why it's a wiser move going 3080 than 3070. Also, since Micron aren't mass-producing 16Gb parts until next year, we probs won't see a Ti until some point next year. This also makes sense: given current 3080 performance, Nvidia probably have nothing to worry about?

https://www.pcgamer.com/uk/amp/nvidia-ampere-gddr6x-memory-micron/

So maybe that's the reason why we only have 8GB and 10GB cards then.

I do admit that, for the power these things are packing, and considering the 3080 is a 4K-capable card, 10GB feels like the bare minimum.
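The bandwidth gap between the two memory configurations is easy to check yourself: peak bandwidth is the per-pin data rate times the bus width, divided by 8 bits per byte. Using the published specs (19 Gbps GDDR6X on a 320-bit bus for the 3080, 14 Gbps GDDR6 on a 256-bit bus for the 3070):

```python
def memory_bandwidth_gb_s(data_rate_gbps_per_pin, bus_width_bits):
    """Peak memory bandwidth in GB/s: per-pin rate * bus width / 8 bits per byte."""
    return data_rate_gbps_per_pin * bus_width_bits / 8

rtx_3080 = memory_bandwidth_gb_s(19, 320)  # GDDR6X, 320-bit
rtx_3070 = memory_bandwidth_gb_s(14, 256)  # GDDR6, 256-bit
print(rtx_3080, rtx_3070)  # 760.0 vs 448.0 GB/s
```

So the 3080 has roughly 70% more memory bandwidth than the 3070, which goes a long way at 4K.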
 
They will. 3080 will prove to be a mid tier card in less than 6 months [...]
Not convinced we'll see much more from the 30 series; the power consumption is already as high as many people are happy with. A 400W+ GPU won't be popular. This series hasn't improved efficiency enough.
 
Based purely on clockspeed. If and when Ryzen can match Intel in that department it will be game over for Intel. The XT refresh of Zen 2 isn't a million miles away from Intel in terms of clockspeed.

It's not a matter of clock speed at all, it's a matter of latency. Monolithic designs will always have superior latency compared to chiplets, and therefore AMD can't really ever catch up at least for gaming. Of course, that's a double-edged sword, because while you have an unbeatable advantage in latency you lose a lot in scalability, so Intel will be forced to move to chiplets themselves.

In the here and now though I don't expect AMD to catch up in raw gaming performance unless they get an 8-core chiplet out, which is more or less the limit for significant multi-core scaling for games for the foreseeable future.
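The latency argument above can be sketched as a back-of-envelope average-memory-access model: cache hits are cheap, and on a chiplet design every cache miss pays an extra hop through the IO die on its way to DRAM. All nanosecond figures and the hit rate below are illustrative assumptions, not measured values:

```python
def avg_access_ns(l3_hit_rate, l3_ns, dram_ns, io_die_hop_ns=0.0):
    """Toy average memory-access latency: hits served from L3, misses pay
    DRAM latency plus any extra hop (e.g. through a chiplet's IO die)."""
    return l3_hit_rate * l3_ns + (1 - l3_hit_rate) * (dram_ns + io_die_hop_ns)

# Illustrative numbers only: chiplets pay an extra hop on every DRAM access.
monolithic = avg_access_ns(0.6, l3_ns=10, dram_ns=70)
chiplet = avg_access_ns(0.6, l3_ns=10, dram_ns=70, io_die_hop_ns=30)
```

In this toy model the chiplet part averages noticeably higher latency despite identical caches, which is the poster's point about gaming. It also illustrates why a single 8-core chiplet matters: keeping all of a game's threads on one die avoids cross-die hops for core-to-core traffic as well.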
 
Is it worth buying an NVMe drive for games, or won't I see much difference between that and my current 3.5" HDD via SATA 3? Purely for lag and load times.

Not sure I'd get a 1TB drive for games when I have 5.5TB lying around on a USB 3 drive and 1TB on a normal HDD ready.

More worried I'll max out my 16GB of RAM with the 3080.
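For load times specifically, a rough back-of-envelope helps: read time is roughly asset size over sequential throughput, except that many games are bottlenecked by CPU-side decompression rather than the drive, which is why NVMe often gains less over a SATA SSD than the raw specs suggest. All throughput figures below are ballpark assumptions, not benchmarks:

```python
def load_seconds(asset_gb, drive_mb_s, cpu_decompress_mb_s=None):
    """Rough load-time estimate: asset size over effective throughput,
    where CPU decompression caps the rate if it is slower than the drive."""
    effective_mb_s = drive_mb_s
    if cpu_decompress_mb_s is not None:
        effective_mb_s = min(drive_mb_s, cpu_decompress_mb_s)
    return asset_gb * 1024 / effective_mb_s

# Illustrative: 20 GB of assets; the NVMe drive ends up waiting on the CPU.
hdd = load_seconds(20, drive_mb_s=150)                             # ~137 s
nvme = load_seconds(20, drive_mb_s=3000, cpu_decompress_mb_s=800)  # ~26 s
```

So an NVMe drive should still be a big step up from a 3.5" HDD, whose seek times also hurt far more than this sequential-only sketch shows, but don't expect the full on-paper 20x.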
 