NVIDIA ‘Ampere’ 8nm Graphics Cards

Honestly guys, I can't help but feel the lacklustre monitor tech is a bit of a drawback for a lot of people looking at the 3090.
I mean, I wish we even had one 32" monitor capable of 4K at 144Hz at a reasonable price.

You're faced with the option of a way overpriced 27" 144Hz 4K from the likes of Asus or Acer's Predator line, or an obscenely oversized 43" from Asus.

I just don't know why Asus haven't released a 32" yet!

Plus the complete lack of (good) HDR bums me out a little. You've got 1440p/4K at good refresh rates now, so just add some good HDR please! It's the only thing I'd want from a new monitor going forwards.
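For what it's worth, here's a quick back-of-the-envelope sketch of why 4K 144Hz (especially with HDR's 10-bit colour) is hard on current display links. The payload figures below are nominal and blanking overhead is ignored, so treat it as a rough illustration rather than exact numbers:

Code:
# Rough sketch: uncompressed bandwidth needed for 4K 144Hz vs what
# common display links can actually carry. Blanking overhead is
# ignored, so real requirements are a few percent higher.

def required_gbps(width, height, hz, bits_per_pixel):
    """Uncompressed video bandwidth in Gbit/s."""
    return width * height * hz * bits_per_pixel / 1e9

# Nominal payload rates after line encoding overhead.
links = {
    "HDMI 2.0 (payload)": 14.4,
    "DP 1.4 HBR3 (payload)": 25.92,
    "HDMI 2.1 FRL (payload)": 42.67,
}

for bpp, label in [(24, "8-bit RGB"), (30, "10-bit RGB / HDR")]:
    need = required_gbps(3840, 2160, 144, bpp)
    print(f"4K 144Hz {label}: {need:.1f} Gbit/s")
    for name, cap in links.items():
        verdict = "fits" if cap >= need else "needs DSC/subsampling"
        print(f"  {name}: {cap:.1f} Gbit/s -> {verdict}")

That's roughly why the early 4K 144Hz panels leaned on chroma subsampling or DSC over DP 1.4, and why HDMI 2.1 matters for this generation of cards and monitors.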

This one will be very strong for HDR IMO, but I don't see the point when it will be uber expensive. Might as well just buy an OLED at that point.

https://www.tftcentral.co.uk/blog/a...44hz-refresh-and-576-zone-mini-led-backlight/
 
Rumours are that if it's 80 CUs, it's going to be 50% faster than a 2080 Ti, which would put it matching a 3080. It will all depend on price. There are rumours now that the AMD price will be $649.

If it's £649 for a 16GB card matching 3080 performance, then they will sell.

Yes, they will sell like hot cakes.

But I have a G-Sync panel, so I am only in the market for an Nvidia card.
 

This has already been debunked here:

"Will customers find a performance degradation on PCIE 3.0?

System performance is impacted by many factors and the impact varies between applications. The impact is typically less than a few percent going from a x16 PCIE 4.0 to x16 PCIE 3.0. CPU selection often has a larger impact on performance. We look forward to new platforms that can fully take advantage of Gen4 capabilities for potential performance increases."
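For context, here's a rough sketch of the theoretical one-way bandwidth of the links in question. These are peak payload figures; real transfers come in a bit lower due to protocol overhead (TLP headers, flow control):

Code:
# Theoretical one-direction PCIe bandwidth, to put the Gen3 vs Gen4
# debate in context. Both generations use 128b/130b line encoding.

def pcie_gbs(gt_per_s, encoding_ratio, lanes=16):
    """Peak payload bandwidth in GB/s for one direction."""
    return gt_per_s * encoding_ratio / 8 * lanes

gens = {
    "PCIe 3.0 x16": pcie_gbs(8.0, 128 / 130),   # ~15.8 GB/s
    "PCIe 4.0 x16": pcie_gbs(16.0, 128 / 130),  # ~31.5 GB/s
}
for name, bw in gens.items():
    print(f"{name}: {bw:.1f} GB/s")

Gen4 doubles the pipe on paper, but games rarely saturate even Gen3 x16 in steady state because assets mostly sit in VRAM, which is consistent with the "few percent" figure in the quote above.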
 
The benchmarks will show the reality. But I think AMD must pull Ryzen 4000 (Zen 3) forward so it can be benchmarked alongside the RTX 3000 series.

Because if reviewers benchmark on the slower-in-gaming Ryzen 3000, the difference won't be clear.
 
It's already been debunked by Nvidia's own FAQ (quoted above), but it keeps cropping up here and there...
 
Hedging bets or what :p

It is worth noting that the difference between PCIe Gen3 and Gen4 might not be all that significant, and there might be no performance difference at all. In that case, it will all come down to marketing and the fact that Intel will not be able to tick the PCIe 4.0 checkbox.

If there is a difference in performance, and the gap between PCIe Gen3 and Gen4 does matter, then Intel is going to be behind in the race.
 
I'm sure some Tis were £1300+, like the Asus, so a couple of years later you're getting a card with more VRAM that destroys it, for not much more in some cases.
The Strix, FTW3, Lightning and AMP Extreme were all north of £1400 at launch, without getting into the really silly models like the HOF, which was closer to two grand.
 
Classic ;)

Oh dear lol
 
The 3080 is a 320W card, the 3090 is 350W... I don't care how reasonable Nvidia's pricing was, those power numbers are awful. Potential buyers will more than likely end up having to make significant changes to their cooling and PSU with one of those.

I think AMD have scared Nvidia enough that no one should be pre-ordering or buying a 3000 series card just yet.
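As a rough illustration of the PSU point, here's a crude sizing sketch. The CPU and rest-of-system figures are illustrative assumptions rather than measurements, and the generous headroom factor is there because high-end GPUs can spike well above their rated TGP:

Code:
# Hypothetical PSU sizing sketch for the power figures being discussed.
# The cpu_tdp and rest_of_system values below are illustrative
# assumptions, not measured numbers.

def recommended_psu_watts(gpu_tgp, cpu_tdp, rest_of_system=75,
                          headroom=1.5):
    """Crude estimate: steady-state system draw times a safety margin."""
    return (gpu_tgp + cpu_tdp + rest_of_system) * headroom

for gpu, tgp in [("RTX 3080", 320), ("RTX 3090", 350)]:
    watts = recommended_psu_watts(tgp, cpu_tdp=125)
    print(f"{gpu}: ~{watts:.0f} W PSU recommended")

By that crude maths you land in the 750-850W ballpark, which is in the same territory as Nvidia's official 750W recommendation for both cards.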
 
Exactly. Look all over the net, people are having to buy new cases and PSUs for these cards. And the funny thing is, because these cards are already getting close to 400W, most people won't even be able to overclock, meanwhile my 2080 Ti overclocks like a beast.
 