That is a good popcorn GIF
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
It's easy to see what it should actually look like: the IO bracket should be exactly the same size, so shrink the 3090 by 26%. That is, unless the GPU doesn't belong to that IO bracket.
Yeah, the PCIe slot is wrong, but if you measure from the top of the IO bracket to the bottom along the close edge you get 11.5 cm for the 2080 Ti and 14.5 cm for the 3090 (on my screen, 32" 1440p). They should be the same; to me the cooler looks normal relative to the IO bracket, so if you shrink the 3090's IO to 11.5 cm and ignore the PCIe slot it should be right.

The problem is that as soon as you pull on anything else, the PCIe slot goes out of the window again. I could make the images line up perfectly, if you like egg-shaped fans. No, the bracket doesn't belong to it. That card is lying flat; there is no way in heck you could get the bracket to look like that from flat-on at that angle (think about it). You would have to walk around the card at least 45°, and then everything else would skew too. That is why the PCIe bracket is massive and nothing lines up: because that is a 3D print of possibly a 3080 sample that some idiot has stuck a PCIe slot onto (in the completely incorrect place), plus a rear IO bracket in completely false perspective.
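For what it's worth, the scaling argument above is just a ratio. A quick sketch using only the two on-screen bracket heights quoted in this post:

```python
# If both cards' rear IO brackets are the same physical height, the two images
# should be scaled so the brackets match on screen. Heights below are the
# on-screen measurements quoted above (32" 1440p screen).
h_2080ti = 11.5  # measured bracket height for the 2080 Ti, cm
h_3090 = 14.5    # measured bracket height for the 3090, cm

scale = h_2080ti / h_3090       # factor to apply to the 3090 image
shrink_pct = (1 - scale) * 100  # percentage reduction

print(f"scale the 3090 image by {scale:.3f} (shrink ~{shrink_pct:.0f}%)")
```

Those particular measurements work out to roughly a 21% shrink; the exact figure obviously depends on where on the bracket you measure.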
It's fake. That particular image and scale comparison is 100% fake.
Yeah, the PCIe slot is wrong, but if you measure from the top of the IO bracket to the bottom along the close edge you get 11.5 cm for the 2080 Ti and 14.5 cm for the 3090 (on my screen, 32" 1440p). They should be the same; to me the cooler looks normal relative to the IO bracket, so if you shrink the 3090's IO to 11.5 cm and ignore the PCIe slot it should be right.
The PCIe slot is definitely fuzzier than the rest of the image. The IO bracket could also be fake, but it definitely looks like the cooler is attached to it; at the bottom there is a triangular cover between the IO and the cooler that looks like a cosmetic piece to hide the PCB. You wouldn't fake something like that.
That, apparently, is the 3090.
https://www.techpowerup.com/270986/nvidia-geforce-rtx-3090-ampere-alleged-pcb-picture-surfaces
Looks more like a CPU on the board.
The first version of this pic that I saw had an Intel CPU photoshopped over the GPU chip to hide it.
Only to sensible people. You forget the world consists of all manner of people, which may be low in % terms but scales up real fast when you start talking about billion-person demographics.

Yeah, at 2x the price of the 3080, it has to be more than 50% faster, otherwise there's really no selling point.
I only use my PC for F1 or Rocket League nowadays, on the ultrawide monitor. I don't play any other competitive multiplayer any more, as I've not got the time to be 'great' at any game like I used to be.
Only to sensible people. You forget the world consists of all manner of people, which may be low in % terms but scales up real fast when you start talking about billion-person demographics.
There are leaks saying that the 3090 (or whatever it is called) will target 350 W peak power draw (PPD), BUT Nvidia will allow users to remove the power limiter, so that one can increase boost clock speeds past 2 GHz, resulting in 400-500 W PPD. It makes perfect sense if Ampere is on Samsung's 8 nm node (comparable in density to TSMC's 10 nm). RDNA2 will boost to at least 2.23 GHz (confirmed by Sony's PS5 specs) on the 7 nm node, so Ampere on 8 nm will need A LOT of power draw to match its frequency. I have no doubt Ampere's architecture would be much more power-efficient than RDNA2's IF it were on 7 nm, but Samsung's 8 nm is a big problem for Nvidia. We will likely see top-tier Ampere GPUs migrating to 7 nm as soon as TSMC gives Nvidia the needed production capacity (probably in 2021 or 2022, when AMD's Zen CPUs move to 5 nm).
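As a rough sanity check on those numbers: dynamic power scales roughly with frequency times voltage squared, and voltage tends to rise almost linearly with frequency near the top of the curve, so power grows roughly with the cube of clock speed. A back-of-the-envelope sketch (the ~1.9 GHz stock boost clock and the cubic exponent are my assumptions, not from the leak):

```python
# Rough estimate: P ~ f * V^2, with V rising ~linearly with f near the limit,
# so P scales roughly as f^3. The 350 W baseline is the rumored stock PPD;
# the 1.9 GHz stock boost clock is an assumed figure for illustration.
p_base = 350.0  # rumored stock peak power draw, W
f_base = 1.9    # assumed stock boost clock, GHz

for f in (2.0, 2.1, 2.23):
    p = p_base * (f / f_base) ** 3
    print(f"{f:.2f} GHz -> ~{p:.0f} W")
```

Under those assumptions, pushing past 2 GHz lands right in the rumored 400-500 W window, and chasing RDNA2-class 2.23 GHz clocks would blow well past it.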
Just read that. Make of it what you will.
https://www.youtube.com/watch?v=AjRog2bc3Bw
1:35.
Apparently the pic of the 3090 PCB is not a reference card. Reference cards WILL use the 12-pin. Aftermarket cheaper cards (lol) will use 3x 8-pin. Sounds like it is unlocked and can go 40% faster than the 2080 Ti, meaning that it will not be 50% faster than the 3080. Maybe, as I guessed earlier, the 3080 in the real world will be 10% faster than the 2080 Ti, with better RT performance, hence the £500 price drop from the 2080 Ti.
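Chaining those percentages together (both figures are the rumored/guessed ones from this thread, not confirmed):

```python
# Rumored/guessed relative performance, both expressed vs the 2080 Ti.
r_3090 = 1.40  # 3090 up to 40% faster than the 2080 Ti (rumor above)
r_3080 = 1.10  # 3080 ~10% faster than the 2080 Ti (the poster's guess)

r_3090_vs_3080 = r_3090 / r_3080
print(f"implied 3090 vs 3080: ~{(r_3090_vs_3080 - 1) * 100:.0f}% faster")
```

So those two figures together would put the 3090 roughly 27% ahead of the 3080, consistent with it being "not 50% faster".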