
NVIDIA ‘Ampere’ 8nm Graphics Cards

Maybe the "big bang" is a reference to what will happen to your PSU when you plug these in.

XD God I hope not. I have a 750W SF Platinum unit and hope it will be enough for the 3080, or whatever it's called. I hope that card will be the best balance between performance and price compared to the 2080 Ti; yes, the next card up will be on another planet, but the cost will likely be a fair bit more.

Also, with the "21 days, 21 years" skit, is Nvidia trying to make fun of Navi 21?
 
The 3070 and 3080 will decimate both 1440p and 2160p.

If they don't, then they are utter failures.

Realistically speaking, the 3070/3080 will not "decimate" 4K... unless by "decimate" you mean getting 30-60fps at full graphical fidelity with AA and RT enabled in modern titles.
 
Mate, my 1080 utterly blitzes 1440p. The 3070 and 3080 will decimate both 1440p and 2160p.

If they don't, then they are utter failures.

I think you make a good point here.

Dropping around £1,000 on a top-end card to then not have the performance to get the most out of your other hardware - consistently beating 4K @ 60fps or 1440p/1600p at 144fps with all the pretties turned to max* - would be taking the Michael out of customers a bit!

*I'll exclude RTX and pointlessly high AA from this. But RTX should be playable.
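Just to put rough numbers on those two targets: a quick pixel-throughput comparison. This is only a sketch (it ignores per-pixel shading cost and CPU limits), using the resolutions and refresh rates mentioned above:

```python
# Crude comparison of the two targets in raw pixels drawn per second.
# It ignores shading cost per pixel, but shows which target is "harder".
throughput_4k60 = 3840 * 2160 * 60        # 4K at 60 fps
throughput_1440p144 = 2560 * 1440 * 144   # 1440p at 144 fps

print(f"4K@60     : {throughput_4k60:,} px/s")      # 497,664,000 px/s
print(f"1440p@144 : {throughput_1440p144:,} px/s")  # 530,841,600 px/s
```

Interestingly, 1440p at 144fps actually pushes slightly more pixels per second than 4K at 60fps, so neither target is obviously the "easier" one.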
 
Guys what do you think...

The latest rumours are that we will see an x80 non-Ti use the "big chip" (the last time this happened was with the Kepler 700 series), AND that we will see two versions of each SKU with double the VRAM. Again, the last time this happened at the high end was with Kepler (GTX 780 6GB). Excluding midrange cards (960, 1050, etc.), I'm confident we'll see different memory configs for each high-end SKU. Me personally? I'm happy with a 10GB 3080 if it's reasonably priced: <£650 and >30% faster than the 2080 Ti.
 
Little table just to show how poor this current generation really was:

[Image: table comparing PassMark scores of recent GPUs, including the RX 480 and 2080 Ti]


I still have the RX 480 now, and the others are the more popular cards I would realistically have considered buying (not the 2080 Ti, though; that's too expensive, and it's just included to show how little it was ahead of the pack).

Any guesses what the new cards might pull off?

If the new 3070 or 3080, or AMD's new offering, only matches the 2080 Ti's score, I think that's a major disappointment, and I'll probably pick up a cheaper used model (against all my preferences, as I don't like buying used electronics).

I want triple the performance of my RX 480 from a new upgrade, so that's a PassMark score of 24,000, and I want it for <£500.
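For anyone who wants to sanity-check that target, here's a quick sketch. It assumes the RX 480 scores roughly 8,000 in PassMark (implied by triple = 24,000; exact scores vary by PassMark version and sample):

```python
# Hedged sketch of the upgrade target above.
# Assumption: RX 480 PassMark score ~8,000 (implied by 3x = 24,000).
rx480_score = 8_000
target_score = 3 * rx480_score   # 24,000, as stated
budget_gbp = 500

# "Value" a candidate card must hit to satisfy both constraints.
points_per_pound = target_score / budget_gbp
print(f"Target: {target_score} PassMark for <£{budget_gbp} "
      f"(= {points_per_pound:.0f} points/£)")  # 48 points/£
```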
 
Guys what do you think...

The latest rumours are that we will see an x80 non-Ti use the "big chip" (the last time this happened was with the Kepler 700 series), AND that we will see two versions of each SKU with double the VRAM. Again, the last time this happened at the high end was with Kepler (GTX 780 6GB). Excluding midrange cards (960, 1050, etc.), I'm confident we'll see different memory configs for each high-end SKU. Me personally? I'm happy with a 10GB 3080 if it's reasonably priced: <£650 and >30% faster than the 2080 Ti.

It'd be good if the 3080 came in at around £650, but given the rumours say we should see a 15-30% uplift over the 2080 Ti, I'd say the price will more likely be around the £800-£1k mark, and the 3080 Ti/3090/Titan card will be double that if not a bit more. The only hope we have as consumers is that AMD brings the heat to Nvidia with something comparable that costs a lot less; in that case the new Nvidia cards would have to drop back price-wise to remain competitive. Time will tell, of course, but I'd say the £800+ mark is more realistic at the moment.
 
Of course PPI makes a difference, but you should be sitting at a further distance whilst using a 40" display than someone sitting at a 27", therefore it shouldn't be a hugely noticeable difference.
Thanks, just tell me exactly how far away I "should" be sitting, since I've clearly no idea ;). I can see crawlies/jaggies from my most comfortable distance from the screen, therefore I need and use MSAA when possible. End of story; trying to spin it any other way isn't very productive. If and when you get a 40" or bigger monitor, try it and get back to me.
 
Mate, my 1080 utterly blitzes 1440p. The 3070 and 3080 will decimate both 1440p and 2160p.

If they don't, then they are utter failures.
"Utterly blitzes"? My 1080 Ti SLI certainly doesn't "utterly blitz" 2160p, and that's with about 10-20% more performance than a 2080 Ti, although I do get decent performance. It just shows some people have vastly different expectations/opinions of what counts as good enough performance! If a 3070 performs as well as a 2080 Ti, it will struggle in some games to deliver 4K 60fps, which isn't what I would define as "decimating" that resolution. I can't for the life of me understand why Kaapstad would have RTX Titan SLI if a single 2080 Ti is good enough for his needs.
 
Will we be getting a free game like we did with Turing (Space Invaders)?

I reckon Pac-Man will be interesting with Ampere: watching it gobble up all the money in your wallet. :eek:
 
Mate, my 1080 utterly blitzes 1440p. The 3070 and 3080 will decimate both 1440p and 2160p.

If they don't, then they are utter failures.


The 2080 Ti averages around 55fps at 4K Ultra in Horizon Zero Dawn.

Your 1080 averages 51fps at 1440p Ultra.

So they're not even blitzing a PS4 game at 4K or 1440p.

I do hope the 3070 and 3080 decimate those resolutions, but it's going to be a big ask, especially with new titles. I doubt they will run Cyberpunk at 4K Ultra 60fps+.
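A quick back-of-envelope on those numbers: scaling fps by raw pixel count (a crude assumption, since real games don't scale perfectly with resolution) shows why the 1080's 1440p figure translates to nowhere near 4K60:

```python
# Estimate 4K fps from 1440p fps by pixel count alone.
# Real scaling is looser than this (CPU limits, bandwidth, etc.).
px_1440p = 2560 * 1440   # 3,686,400 pixels
px_2160p = 3840 * 2160   # 8,294,400 pixels
scale = px_2160p / px_1440p  # 2.25x the pixels

fps_1440p = 51  # GTX 1080 at 1440p Ultra, figure quoted above
est_fps_2160p = fps_1440p / scale
print(f"Naive 4K estimate for the 1080: {est_fps_2160p:.0f} fps")  # ~23 fps
```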
 
The 2080 Ti averages around 55fps at 4K Ultra in Horizon Zero Dawn.

Your 1080 averages 51fps at 1440p Ultra.

So they're not even blitzing a PS4 game at 4K or 1440p.

I do hope the 3070 and 3080 decimate those resolutions, but it's going to be a big ask, especially with new titles. I doubt they will run Cyberpunk at 4K Ultra 60fps+.

I don't know what you are talking about. My 1080 Ti can play Oblivion at 4K maxed out, easy! Minesweeper too.
 
The 2080 Ti averages around 55fps at 4K Ultra in Horizon Zero Dawn.

Your 1080 averages 51fps at 1440p Ultra.

So they're not even blitzing a PS4 game at 4K or 1440p.

I do hope the 3070 and 3080 decimate those resolutions, but it's going to be a big ask, especially with new titles. I doubt they will run Cyberpunk at 4K Ultra 60fps+.

Choosing a badly optimised game as the benchmark isn't really a legitimate comparison.
 