NVIDIA ‘Ampere’ 8nm Graphics Cards

Don't open that can of worms again :p

Too late!!

The response by gamer1988 made me chuckle. Some people are in for a surprise shortly.

The worst thing is AMD already has something like that with Polaris (Radeon SSG), but the guy said 8GB was soon to be low end, even below 4K.

It isn't (unless you have only 8gb or less)

The chap is lead programmer at id software. He literally said 8GB of VRAM would be "low end" even below 4K.

I think we know where the Super refreshes are going.
 

Indeed. It seems almost certain that there is a 3070 Ti with 16GB now, so there is almost sure to be a 3080 Super/Ti with 16 or 20GB.
 
Will my 750W PSU power a 3080? (Super Flower 80 Plus Gold if that makes any difference.) Thinking of upgrading if I don't need to add anything else.

Short answer: Yes
Long answer: Yeeeeeeeeeees
Not so fast there. It's never just about the wattage; all the wattage tells you is the overall capability of the power supply unit. Because Nvidia went to great lengths creating a new power plug, it suggests that it's not the wattage that is important but the amperage/load on the rail(s). Therefore, I would not recommend a 750W unit unless I knew what the amperage/load per rail is and how good the power delivery is per rail. Now I know why they call it Ampere.
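To put that rail point into numbers, here's a rough sketch (my illustration, not from any spec sheet); the rail rating and component draws below are hypothetical examples, and real units print their +12V amperage on the label:

```python
# Hypothetical example: watts alone don't say how the power is delivered;
# what matters is whether the +12V rail(s) can supply the current.

def rail_watts(volts: float, amps: float) -> float:
    """Power a rail can deliver: P = V * I."""
    return volts * amps

# Made-up single-rail 750W unit rated for 62A on +12V.
twelve_volt_capacity = rail_watts(12.0, 62.0)            # 744W

# Made-up load: ~320W for the GPU plus ~150W for the CPU, both on +12V.
gpu_draw, cpu_draw = 320.0, 150.0
headroom = twelve_volt_capacity - (gpu_draw + cpu_draw)  # 274W

print(f"+12V capacity: {twelve_volt_capacity:.0f}W, headroom: {headroom:.0f}W")
```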
 
So the question now. We know, at the very least, that if RDNA 2 beats both the 3070/3080, Nvidia will:
-Reduce prices
-Provide super duper variants at the same price point as the 3070/3080 but much faster/with more VRAM

Which would diminish the investment early adopters made just to say, "I got mine 1st!"

I think they’ll bring out S versions like they did when the 5700 came out. Just to push them down a tier and justify people spending more on the NV option. Even if that does mean shafting their early adopters. (For the record I have a 1080 that cost me 500 so I like an NV shafting as much as the next man. :D)
 
The response by gamer1988 made me chuckle. Some people are in for a surprise shortly.

Indeed. Sadly I've decided to make a bargain with the devil Nvidia so I'll get the 3080 and sell it off before the next wave hits, hopefully upgrading the vram without taking too much of a monetary hit but still getting to enjoy it for a few months/year.
 
Power Meters are less than £12 at xxxxx.
A wise investment for anyone with a P/S in the 500 - 700 watt range.
Keep in mind that power meters measure AC input and power supplies are rated for DC output.
So if your P/S is rated for 90% efficiency and whilst gaming it averages 500W at the wall socket, it's actually outputting 450W DC; the other 10% (50W) is lost to inefficiency.
So if you have a 650W P/S and it says 500W at the meter, as it's only outputting 450W you have 200W of headroom; 650 - 450.
That ten percent can be significant if you are getting close to the limit.
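A minimal sketch of that sum, assuming a flat 90% efficiency (real efficiency curves vary with load, so treat it as a rough guide):

```python
# Convert the AC reading at the wall into DC output, then compare it
# against the PSU's DC rating, following the arithmetic in the post above.

def dc_output(ac_watts: float, efficiency: float) -> float:
    """Approximate DC output for a given wall-socket (AC) reading."""
    return ac_watts * efficiency

def headroom(psu_rating: float, ac_watts: float, efficiency: float) -> float:
    """DC capacity left over on a PSU of the given rating."""
    return psu_rating - dc_output(ac_watts, efficiency)

print(dc_output(500, 0.90))      # 450.0 -> actual DC load
print(headroom(650, 500, 0.90))  # 200.0 -> watts of headroom on a 650W unit
```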
 
Will my 750W PSU power a 3080? (Super Flower 80 Plus Gold if that makes any difference.) Thinking of upgrading if I don't need to add anything else.

Should be OK with a 3080.

It's the people who are asking whether a 650W is enough for a 3080.

All I know is, I wouldn't want to run one with it, not when Nvidia recommends a 750W.
 
Yeh, I know it has to be there as it's the end of the board, but it still looks inelegant, especially with the new, short power connector.
I agree with you, it doesn't look great. I don't think it will look right in a case to have a cable going down the middle. People will probably end up routing it down the side of the card to get it out of the way.
 
You see a massive problem there, mate. Any air-cooled CPU is going to get all that hot air coming from the top of the 3000 series card.

Not really. All you have to do is reverse the fans on the CPU cooler so they draw cold air from the back of the PC. You could also have the fans at the front blowing outwards to exhaust the air.
 
Should be OK with a 3080.

It's the people who are asking whether a 650W is enough for a 3080.

All I know is, I wouldn't want to run one with it, not when Nvidia recommends a 750W.


The recommendation of 750W is to cover the whole range of 750W PSUs, and they factor high-end CPUs etc. into that, to cover themselves.

650W is close; however, 750W is preferable.
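For a sense of the margins, here's a back-of-envelope sketch; apart from Nvidia's 750W recommendation, the component figures are rough assumptions of mine, not official numbers:

```python
# Rough system power budget (assumed figures) showing why 650W is "close"
# while a 750W unit leaves a more comfortable margin.

components = {
    "GPU (3080, board power)": 320,      # roughly the commonly quoted figure
    "CPU under load": 150,               # assumed high-end CPU
    "Motherboard/RAM/drives/fans": 75,   # assumed
}

total = sum(components.values())
for name, watts in components.items():
    print(f"{name:30s} {watts:4d}W")
print(f"{'Estimated total':30s} {total:4d}W")

for psu in (650, 750):
    margin = psu - total
    print(f"{psu}W PSU -> {margin}W margin ({margin / psu:.0%} of rating)")
```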
 
Indeed. Sadly I've decided to make a bargain with the devil Nvidia so I'll get the 3080 and sell it off before the next wave hits, hopefully upgrading the vram without taking too much of a monetary hit but still getting to enjoy it for a few months/year.
How much warning/heads up did we get prior to the super line being launched?
 
Yeah, the Seasonic adapter is a much better idea; it goes all the way back to the PSU as a single lead.

Personally I'd prefer a monoblock adapter which exposes a pair of 8-pin sockets along the top edge, facing down the back of the card. That way you could use existing custom cables without it looking unsightly.
 