NVIDIA ‘Ampere’ 8nm Graphics Cards

Would be nice, but that doesn't grow revenue for the company and shareholders, plus there is a limit to what can be produced. So if, say, you know you can get (for the sake of argument) 100k wafers per year, then what you print on them and how much you can charge becomes all-important. What they really want to sell is as many data centre cards as they can, and if push came to shove they'd stop producing consumer GPUs altogether, e.g. if production was hit by some major event.
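As a toy illustration of why the wafer sums push them towards data centre parts, here's a rough sketch; every number in it (die sizes, prices, yields) is made up purely for the sake of argument:

# Toy illustration of why the wafer sums favour data centre parts.
# ALL numbers below are made up for the sake of argument, not real figures.

WAFER_AREA_MM2 = 70_000  # usable area of a 300 mm wafer, roughly

def dies_per_wafer(die_area_mm2, yield_rate):
    # Crude estimate: usable area / die area, scaled by yield.
    return int(WAFER_AREA_MM2 / die_area_mm2 * yield_rate)

# Hypothetical products competing for the same fixed wafer supply
products = {
    "data centre accelerator": {"die_mm2": 800, "price": 9000, "yield": 0.6},
    "consumer GPU":            {"die_mm2": 450, "price": 700,  "yield": 0.8},
}

for name, p in products.items():
    dies = dies_per_wafer(p["die_mm2"], p["yield"])
    print(f"{name}: ~{dies} good dies/wafer, ~${dies * p['price']:,} revenue/wafer")

With numbers anything like those, the data centre part earns several times more per wafer, so with a fixed 100k wafers/year that's where the allocation goes.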

I've been seeing a lot of adverts lately on YouTube for the game streaming service Stadia. They appear to be making a big push on that.
 
Well, AMD are stuffing an RTX 2080 Super-level GPU, with ray tracing, into a game console.

If they can do that it looks promising for the rest of us.

Indeed, with pretty big thermal and power constraints on it. Hopefully it does mean a full-blown graphics card can actually be competitive with Ampere/Hopper, whichever turns out to be the next-gen desktop cards.
 
Yeah, that's the point. The performance is one thing, it puts my 5700XT to shame... it's an APU in a <5 litre case, passively cooled with just a 120mm exhaust fan at the top of said case!

Think what this architecture can do when they feed it more shaders and pump it with steroids.
 
To be honest you can downclock it to high heaven, run your card with low fan speeds at 80°C average, and get the same result. I can downclock/undervolt my 2080 Ti to 1750-1800MHz and it will consume nearly 50% less power than when overclocked while only losing around 5-7% performance.

Same with the CPU at stock, it uses next to nothing. Now put those together on a massive die and thermals + thermal transfer are even better.
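To put very rough numbers on the "massive die" point (die areas are approximate public figures, the 200 W totals are just assumptions for illustration):

# Rough power-density comparison: a big die at moderate power is easier to
# cool than a small one at similar power. Die areas are approximate public
# figures; the 200 W totals are assumptions for illustration only.

chips = {
    "Navi 10 (5700 XT), discrete GPU die": {"area_mm2": 251, "watts": 200},
    "XSX SoC (CPU + GPU on one die)":      {"area_mm2": 360, "watts": 200},
}

for name, c in chips.items():
    print(f"{name}: {c['watts'] / c['area_mm2']:.2f} W/mm^2")

The console die spreads similar heat over roughly 40% more silicon, so the cooling has an easier job even before any downclocking.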

Still a very serious console specs-wise. I just hope, as I've said before, that AMD don't release an equivalent card and price it at a stupid price. If they release an XSX-equivalent card and it costs more than the console, then it's a kick in the nuts as far as I see it.
 
I've shaved about 30 watts off my 200-watt 5700XT and maintained its performance, perhaps even gained margin-of-error levels of performance.

I don't know about Turing GPUs, but there is no way I'm turning a 200-watt RDNA GPU into a 100-watt GPU and only losing a few % performance...
 
Been playing recently so I thought I would show it. The Witcher 3, 4K max settings (left side of the Afterburner overlay: TEMP - GPU USAGE - CORE CLOCK - POWER USAGE - VOLTAGE).

Max OC, stock voltage, 2055MHz (it will do around 2070MHz but the result is the same):
fULQCaP.jpg

1750MHz, 0.8v:
TyhYejD.jpg

1750MHz, max memory:
azNouFu.jpg

134% down to 74% power is a pretty drastic reduction as far as I see it, for a total 9% reduction in performance. That was a quick test, and at 4K, where the Ti does its best. Chances are I could get that down to around 7% with the same 800mv.
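Putting the same figures into a quick back-of-envelope sum (treating the power-limit percentages and the ~9% drop as approximate):

# Back-of-envelope perf-per-watt change for the undervolt above.
# Inputs are the approximate figures from the post, not exact measurements.

power_oc = 1.34   # ~134% power limit when overclocked
power_uv = 0.74   # ~74% at 1750 MHz / 0.8 V
perf_oc  = 1.00   # overclocked performance as the baseline
perf_uv  = 0.91   # ~9% slower when undervolted

print(f"Power saved: {(1 - power_uv / power_oc) * 100:.0f}%")
print(f"Perf lost:   {(1 - perf_uv / perf_oc) * 100:.0f}%")
print(f"Perf/watt:   {((perf_uv / power_uv) / (perf_oc / power_oc) - 1) * 100:.0f}% better undervolted")

That works out to roughly 45% less power for 9% less performance, i.e. around 65% better performance per watt.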
 
Well done :) those screen grabs look nice too...

Unfortunately, to maintain about 1.9GHz I need to keep the volts at around 1.1v, down from 1.2v. If I lower the volts any more than that it crashes and I have to start reducing clocks. I got it down as far as 1.03v, which reduced power by about 50 watts to 150 watts, but with clocks only at around 1.8GHz.
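For what it's worth, those numbers line up roughly with the usual dynamic power rule of thumb (power scaling with frequency times voltage squared); a quick sketch, treating the figures above as approximate:

# Rough sanity check of those volts/clocks against the usual dynamic power
# rule of thumb, P ~ f * V^2. All figures are the approximate ones above.

def relative_dynamic_power(freq, volts, base_freq, base_volts):
    # Dynamic power relative to a baseline operating point.
    return (freq / base_freq) * (volts / base_volts) ** 2

# Baseline assumed here: ~1.9 GHz at 1.2 V on a ~200 W card
scale = relative_dynamic_power(1.8, 1.03, 1.9, 1.2)
print(f"Predicted board power: ~{scale * 200:.0f} W")  # ~140 W

The measured ~150 W is a bit higher than the ~140 W the rule of thumb predicts, since memory and other static power don't scale with core clocks and voltage, but the trend matches.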
 
Thanks :).

I don't run it at that daily (usually 2GHz at around 0.95v); I was just testing to show what can be done if you wanted an extremely quiet system, or just a quieter system in general, and were fine with losing a few % of performance.
 
So we've got about 3 months now until the 3000 series drops?

Nobody knows. All we know is that there will be a consumer version of Ampere, but NVIDIA haven't said anything about it other than that it shares some architecture with the data centre Ampere chips they revealed last week.

That's literally all we have that's in any way concrete, everything else is rumour and speculation.
 
I reckon we will get an official launch date in September, with cards being available around the time Cyberpunk comes out later that month. I will be ready to pull the trigger on an RTX 3070 as soon as it is available.

Hell, if they manage to price the 3080 right I may even get one of those, but it's doubtful. I just don't see the point in paying 50% extra for only 20% extra performance, which is typically the price-to-performance gap between the two cards. I'd rather wait 18-24 months and upgrade to an RTX 4070.
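Spelling out that value sum (using the 50%/20% figures above, with purely hypothetical prices):

# Quick cost-per-frame sum using the rough 50% price / 20% performance gap
# mentioned above. Prices are hypothetical placeholders, not real MSRPs.

cards = {
    "xx70 tier": {"price": 500, "perf": 1.0},
    "xx80 tier": {"price": 750, "perf": 1.2},   # +50% price, +20% performance
}

for name, c in cards.items():
    print(f"{name}: {c['price'] / c['perf']:.0f} per unit of performance")

500 vs 625: the bigger card costs about 25% more per unit of performance, which is the "not worth it" gap being described.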
 
I've applied to the Bank of England for quantitative easing of the next-gen GPUs, along with an EU subsidy; failing that, an unsecured loan from Loadsamoney :p
Lol :D

I would not do the latter; if you cannot pay in time he will go mad and start posting fake news about you all over the place. Oh, and he will keep calling you Del.
 