
NVIDIA ‘Ampere’ 8nm Graphics Cards

a06fe237110e6da70fefe36b99f3c681.gif

That is a good popcorn GIF.
 
It's easy to see what it actually should look like: the IO bracket should be exactly the same size, so shrink the 3090 by 26%. That is, unless the GPU doesn't belong to that IO bracket.

The problem is that as soon as you pull on anything else, the PCIE slot goes out of the window again. I could make the images line up perfectly, if you like egg-shaped fans. The bracket doesn't belong to it, no. That card is lying flat; there is no way in heck you could get the bracket to look like that from flat on at that angle (think about it). You would have to walk around the card at least 45°, and then everything else would skew too. That is why the PCIE bracket is massive and nothing lines up: because that is a 3D print of possibly a 3080 sample that some idiot has stuck a PCIE slot to (in the completely incorrect place) and a rear IO bracket in a completely false perspective.

It's fake. That particular image and scale comparison is 100% fake.
 
Yeah, the PCIe thing is wrong, but if you measure from the top of the IO bracket to the bottom along the close edge, you get 11.5cm for the 2080 Ti and 14.5cm for the 3090 (on my screen, 32" 1440p). They should be the same. To me the cooler looks normal relative to the IO bracket, so if you shrink the 3090 IO to 11.5cm and ignore the PCIe thing, it should be right.

The PCIe thing is definitely more fuzzy than the rest of the image. The IO bracket could also be fake, but it definitely looks like the cooler is attached to it: at the bottom there is a triangular cover between the IO and the cooler that looks like a cosmetic piece to cover up the PCB. You wouldn't fake something like that.
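Just as a sanity check on those measurements, here's the arithmetic in a quick Python sketch (the 11.5cm and 14.5cm figures are the on-screen measurements quoted above, not real card dimensions):

```python
# Sketch: how much the alleged 3090 image must shrink so its IO bracket
# matches the 2080 Ti's, using the on-screen measurements quoted above.
io_2080ti_cm = 11.5   # measured IO bracket height, 2080 Ti image
io_3090_cm = 14.5     # measured IO bracket height, alleged 3090 image

scale = io_2080ti_cm / io_3090_cm   # factor to scale the 3090 image by
shrink_pct = (1 - scale) * 100      # percentage reduction needed

print(f"shrink the 3090 image by {shrink_pct:.1f}%")  # ≈ 20.7%
```

Note this gives roughly a 21% shrink from those particular measurements, a bit less than the 26% figure mentioned earlier in the thread; the two posters evidently measured slightly different things.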
 
I do not fancy that fan sucking air in at the bottom and the other fan blowing air up into my CPU area, if that is even real; so far it seems madness (watched a long YT video on it weeks ago).

May sit it out till 2022, may even have a good monitor by then.
 

The PCIE bracket looks like it says Compaq lol.

Let's raise some more doubt. If that 3090 PCB image really is a 3090 then how would it work with the cooler from the 3080?

Y9lKBkp.jpg


And why on earth would you have a fan on the back of the GPU PCB blowing air into the board?

Like, the 3080 has a really weird PCB. It's two boards, IIRC, end to end. That 3090 PCB looks a bit more normal to me.

I don't doubt there is a 3090 coming at all, mostly because I believe the rumours about the 3080 being at very best 10-15% faster than the 2080Ti as a maximum. Why? Because if it were any faster they would be really stupid to release it at £800. I think, like 1080Ti to 2080, there won't be that much difference, at least in normal performance. Maybe the RT performance will be the 30% BS figure being thrown around? IDK. But I do know that Nvidia would never give you 30% more performance at 40% less than the price of the 2080Ti. It just isn't happening, not with a $100 cooler on.

However, I reckon they have inside information, and I reckon that the 3080 was simply not enough to lead with. What I mean is, for years now they have released the 80 first along with the 70 (670, 680, 970, 980, 1070, 1080, etc.) and did so with RTX too (the 2080 and 2070 came first). However, they were not expecting a "2080Ti-like surprise" from AMD at those times. In fact, AMD had blown away like a fart in the wind.

I reckon Jen is not like, scared, but I reckon he is making 100% sure that he retains the crown with the supposed 3090. Meaning, it could well be massive overkill. That is why I believe the rumours about power draw, the power connectors and so on. Could you imagine if he launched the 3080 first and AMD came and slapped him in the face with something faster? And what that would do to the share prices?
 
There are leaks saying that the 3090 (or whatever it is called) will target 350W peak power draw (PPD), BUT Nvidia will allow the user to remove the power limiter, so that one can increase boost clock speeds past 2GHz, resulting in 400-500W PPD. It makes perfect sense if Ampere is on Samsung's 8nm node (comparable in density to TSMC's 10nm). RDNA2 will boost to at least 2.23GHz (confirmed by Sony's PS5 specs) on a 7nm node, so Ampere on 8nm will need A LOT of power draw to match its frequency. I have no doubt Ampere's architecture would be much more power efficient than RDNA2's IF it were on 7nm, but Samsung's 8nm is a big problem for Nvidia. We will likely see top-tier Ampere GPUs migrating to 7nm as soon as TSMC gives Nvidia the needed production capacity (probably in 2021 or 2022, when AMD's Zen CPUs move to 5nm).

Just read that. Make of it what you will :)
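For what it's worth, here is a rough sketch of why uncapping the power limiter balloons the draw: dynamic power scales roughly with frequency times voltage squared, and voltage has to rise to sustain higher clocks. All the numbers below are purely illustrative assumptions, not from any leak:

```python
# Toy model of dynamic power scaling: P ∝ f * V^2.
# Baseline 350 W is the leaked target figure; the clocks and voltages
# here are made-up illustrative values, not real Ampere specs.

def dynamic_power(base_w, f_base_ghz, f_ghz, v_base, v):
    """Scale a baseline power figure by (f / f_base) * (V / V_base)^2."""
    return base_w * (f_ghz / f_base_ghz) * (v / v_base) ** 2

stock = dynamic_power(350, 1.9, 1.9, 1.00, 1.00)   # baseline: 350 W
pushed = dynamic_power(350, 1.9, 2.1, 1.00, 1.09)  # ~10% more clock, ~9% more volts
print(f"{stock:.0f} W -> {pushed:.0f} W")
```

So roughly a 10% clock bump can cost over 30% more power once the voltage needed to hold it is factored in, which is how a 350W card ends up in the 400-500W range.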
 
Only to sensible people. You forget the world consists of all manner of people, which may be low in % terms but scales real fast when you start talking about billion-person demographics :)

I always assumed we're all pretty sensible, until 10 seconds before we actually press the buy button.
 
https://www.youtube.com/watch?v=AjRog2bc3Bw

1:35.

Apparently the pic of the 3090 PCB is not a reference card. Reference cards WILL use the 12-pin. Aftermarket cheaper cards (lol) will use 3x 8-pin. Sounds like it is unlocked and can go 40% faster than the 2080Ti, meaning that it will not be 50% faster than the 3080. Maybe, as I guessed earlier, the 3080 in the real world will be 10% faster than the 2080Ti, with better RT performance, hence the £500 price drop from the 2080Ti.
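The arithmetic on those percentages actually holds up; a quick sketch (both ratios below are the claimed/guessed figures from this thread, not measured data):

```python
# Sanity check on the rumoured relative performance figures:
# if the 3090 is 40% faster than a 2080 Ti, and the 3080 is only
# 10% faster than a 2080 Ti, how big is the 3090-vs-3080 gap?
r_3090 = 1.40   # 3090 vs 2080 Ti (claimed ceiling)
r_3080 = 1.10   # 3080 vs 2080 Ti (the guess above)

gap = r_3090 / r_3080 - 1
print(f"3090 vs 3080: about {gap * 100:.0f}% faster")  # ~27%, well short of 50%
```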
 

Did it come from one of Moore's Law's sources (aka an alternate personality no longer suppressed by medicine)?
 

I always thought the whole "sub-£500 3070 beating the 2080 Ti" thing was just wishful thinking. I think Nvidia will just do something very similar to what you described.
 