There is no way the 4080 would be that close to the 4090. The card would be pointless then...
Why? Just look at Ampere, where there was about a 10% difference between the non-Ti and Ti cards.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
That's a poor comparison
Also, I love how this is cropping up in the RDNA3 thread.
Yeah, but the 4080 is close to the 4090 in certain games, because there is no CPU on the market that keeps up with the 4090 in every game (well, it's either the CPU or something else bottlenecking the 4090, but that's the reality - the 4080 is within 10% of the 4090 in some games, whether people like it or not).
I'd wait for proper benchmarks rather than some random guy on a Chinese site. How does he even have drivers when the card's not released yet and is still under NDA for another week, so not even the press drivers would be available?
Really? I always use it when available on a 3080 - more frames, more better, or higher settings at the same frames. The main pull for Nvidia here is that you can use DLSS/FSR/XeSS, but if you go AMD you instantly lose any DLSS-only improvements, so AMD needs to bring some beefier chops to the table to make up for that before I would jump ship.
True - and with DLSS your card uses a lot less power and runs cooler. I play at 4K and DLSS is a godsend.
Nvidia is worth it for DLSS alone. Very few games have AMD FSR.
Isn't the game count pretty similar between the two?
Also note the AMD slides state it's up to 1.5x-1.7x, not an average. The Nvidia card faces more driver overhead, so max fps is impacted as a result.
Lol, the 4090 is only ~12% faster than a 6950XT at 1080p, as both are CPU bound. Jump up to 4K and it is ~65% faster, because the bottleneck switches to the GPU. So cherry-picking some CPU-bound game or resolution and saying the 4080 is only 10% slower than a 4090 makes you look desperate, because even a 6950 or 3090Ti is “only” 10% slower than a 4090.
So no, the 4090 is not bottlenecked at 4K, and the 4080 will be about 35% slower given its massively cut-down specs (and based on Nvidia's own marketing).
That leaves the 7900XTX plenty of room to be ~15-20% faster than a 4080 and 15-20% slower than a 4090 in raster at 4K.
At 4K the 4080 is only 25ish% faster than a 3090Ti.
I will caveat this with “based on the Nvidia and AMD marketing slides”.
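Since all of that is just stacking relative percentages, here's a quick back-of-the-envelope script to sanity-check how the numbers fit together. The percentages are the marketing-slide estimates quoted above, not benchmark results, and the 4090 = 100 index is just an arbitrary baseline for illustration:
[CODE]
# Rough 4K relative-performance index, normalising the RTX 4090 to 100.
# All percentages are the estimates quoted in this thread (based on the
# Nvidia/AMD marketing slides), not measured results.

rtx_4090 = 100.0

# "the 4080 will be about 35% slower" than the 4090
rtx_4080 = rtx_4090 * (1 - 0.35)                    # 65

# "the 4080 is only 25ish% faster than a 3090Ti" -> divide, don't subtract
rtx_3090ti = rtx_4080 / 1.25                        # 52

# "7900XTX ... ~15-20% faster than a 4080"
xtx_from_4080 = (rtx_4080 * 1.15, rtx_4080 * 1.20)  # ~75-78

# "... and 15-20% slower than a 4090"
xtx_from_4090 = (rtx_4090 * 0.80, rtx_4090 * 0.85)  # 80-85

print(f"RTX 4090:                   {rtx_4090:.0f}")
print(f"RX 7900 XTX (vs 4080 est.): {xtx_from_4080[0]:.0f}-{xtx_from_4080[1]:.0f}")
print(f"RX 7900 XTX (vs 4090 est.): {xtx_from_4090[0]:.0f}-{xtx_from_4090[1]:.0f}")
print(f"RTX 4080 (est.):            {rtx_4080:.0f}")
print(f"RTX 3090 Ti (implied):      {rtx_3090ti:.0f}")
[/CODE]
The two ways of placing the XTX don't land on exactly the same number (roughly 75-78 vs 80-85), which is why I'd treat all of this as a ballpark rather than a prediction.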
Performance will be a bit all over the board with RDNA3 due to the organization of the ALUs/SPs. 1.5-1.7x is likely an average across AAA titles; there will be games that see a bigger delta, and others a smaller one. It's going to be interesting to see direct game-for-game performance comparisons and, similarly, how AMD marketing positions them. It will also be interesting, assuming they stick with this implementation, to see whether software development evolves to optimize here -- if the cart will drag the horse, so to speak -- across generations. My guess is that in console-originated games, it will.
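If the "up to" vs "average" wording matters to you, here's a trivial illustration of the difference. The per-game uplift numbers below are completely made up, purely to show how the same data can be described either way:
[CODE]
# Hypothetical per-game RDNA3-over-RDNA2 uplift factors. These are invented
# numbers for illustration only, NOT AMD's figures.
uplifts = {
    "Game A": 1.5,
    "Game B": 1.7,
    "Game C": 1.4,
    "Game D": 1.6,
}

average = sum(uplifts.values()) / len(uplifts)
peak = max(uplifts.values())

print(f"average uplift: {average:.2f}x")
print(f'"up to" uplift: {peak:.2f}x')
[/CODE]
With those made-up numbers the average works out to about 1.55x while the "up to" figure is 1.7x - whether AMD's 1.5x-1.7x is the peak or the average across their test suite changes where the card lands, which is exactly what this argument is about.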
The question becomes: will the PS5 Pro use an RDNA4 multi-chiplet GCD?
Likely, and the next Xbox will very likely follow suit. It's a prime architectural change for consoles: a notable performance uplift for small cost.
They also had charts with UP TO on RDNA2, which ended up being averages.
With RDNA 2 we could see their confidence, as they outright compared the 6900XT with the 3090 in a much wider variety of games. This time it's smoke and mirrors, with some numbers showing FSR, some without it, etc.
Some in the Nvidia camp really don't want it to be true, which I don't understand, because we should all want them to do well - it will only help prices in the long run.
You can mod FSR 2.0 into most games that have DLSS. The problem isn't the game count; it's that FSR looks worse than DLSS when the game is in motion.