
AMD RDNA3 unveiling event

Interesting comment I've just seen on good old Reddit. Potentially one of the reasons AMD has moved to double FP32 per stream processor is to increase performance in Nvidia-optimised titles. The evidence given was that AMD's slides showed a bigger performance increase in CP2077 (RT off) than in MW2 (1.7x vs 1.5x), and I recently learnt that MW2 is very well optimised for AMD cards.
I would say that while that may be the result, such a huge change was probably baked in years ago. GA102, in the form of the 3080, launched over two years ago, and I am not sure even that is enough time. It may be that doubling FP32 was one idea on the table two years ago, and once GA102 launched they picked it over the alternatives.

BTW, CB reports that the 4080 is now benching higher than expected:
Maybe it just likes Forza Horizon 5, or Nvidia has changed the clocks or bandwidth since the preview?
 
They've been using RT as an excuse for the last few years, and it doesn't stand up (it never did, IMO) given the insane price premium they're struggling to justify for going NV. I don't see what's wrong with just saying "I like Nvidia, I'm always going to buy Nvidia", but then let's not have tears about pricing, and let's not shift the blame as if NV's insane gouging for the last three generations is somehow AMD's fault for not being competitive (which is total nonsense anyway).

They want it both ways: they always want to buy NV, but they want AMD to make it cheaper, and when NV cards keep rising in price they don't want to blame their beloved Jensen, so it's all AMD's fault. Now all that mental gymnastics has finally become impossible to sustain.

I agree with most of what you said there.

I will give you my point of view, first in a gaming sense. I have always been more about image quality and the ability to add more realistic effects into games, so RT in one way does all of this for me, and I enjoy technology that does this. Nvidia are not the only ones doing it; the main GPU companies have been at it for many years now, but there is always one that does something a little better than the other, and then it comes down to the customer to decide whether that little bit better is worth the extra cost involved. The line between which company is better for a certain feature is shrinking by the day, and we then have to ask again whether a 50%+ price premium is really worth it for that difference. I'm slowly coming to the view that it isn't, and that soon it will just become normal to buy from either company for that feature, because both are more than good enough at that level.

Now I will give you my point of view regarding professional features: the ability to use professional creator applications for work or hobbies, and what hardware and driver/API support is available to them. Sadly, here it is an easy choice, and Nvidia still leads. Nvidia was very, very smart with CUDA and now OptiX too, and in one sense has forced pro users to buy their hardware, because the applications they use either only support CUDA/OptiX or support them in a way that delivers better, faster results on Nvidia hardware. In a professional environment, time is money, as is the quality of the results. AMD can't touch Nvidia there at the moment, and AMD really needs to work more closely with professional application makers to get their hardware and APIs supported better and producing better, faster results too.


So from the above you can take the following for me:

Gaming only: AMD are more than good enough for me, and I would buy an AMD GPU for a gaming-only rig without question, as it is the more sensible purchase for that use.

Professional/work use: Nvidia is the only option for a workstation.

For a rig that is a mixture of a gaming machine and a workstation at least 50% of the time, again Nvidia wins, as the machine is also making my income 50% of the time and basically pays for itself over time.


So again, it comes down to the main uses of the computer which GPU I buy and from which company. I feel we are slowly coming back to the era of the HD 4870 versus Nvidia's GTX 280/285. At that time I purchased the 4870, and it was the sensible purchase for me. Even when Fermi came out with the GTX 480, I was not excited to jump to Nvidia, and went with the HD 5870 instead. Sadly, not long after that I found that AMD driver updates broke my older software, from work apps to games like flight sims. All the latest apps and games worked great, but I needed my older apps and sims to work, so I gave Nvidia a chance with a GTX 580, and all my apps and sims worked again. However, I did notice an obvious downgrade in 2D visual quality, which I had to live with; the 3D image quality was fine and just as good as AMD's. It confused me why the 2D was so bad, but that was down to the differences in hardware used at the time. These days 2D and 3D quality is the same between them, but back then it wasn't, and as I said before, I'm a visual quality guy first, so that bothered me a lot with the GTX 580 compared to the 5870 and the Matrox cards around then. Matrox and AMD/ATI had the best visual quality, and Nvidia looked blurry and dull to my eyes.


Let's see how the 7900 series does, but I have a feeling AMD have pulled out a very good product for the price, and it's seriously worth considering this time, even more than the 6000 series. Looking back, that was a very good series for gaming too, and the prices on some of the cards were a no-brainer: the 6950 XT when it was £800, and the 6900 XT in the £650 range, are good value in my book.

Nvidia's pricing and products this time are a little insulting. The 4090 is a major downgrade to me without NVLink for my pro work, and has really turned into more of a gaming card this generation than a pro/gamer card. I would have let them get away with it a little if they had put more VRAM on the 4090, but they stuck with 24GB, and a lot of pro use requires more than that, which is why NVLink is sadly missed this time: now we can't pool VRAM or CUDA cores. Basically, the 4090 is neutered for me, at a more expensive price for us here in the UK; £300 more in reality when comparing the 3090 FE to the 4090 FE at MSRP, as the 3090 was £1,400 and the 4090 is £1,700, with a downgrade in features that matter to me. :(
 

CPU-limited benchmark

Check the benchmark: it says that for 57% of the five-benchmark run time, the CPU was limiting the 4090's performance. Without the CPU bottleneck the 4090 would be at around 150-160 fps.
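A rough sketch of the implied arithmetic: if the run was CPU-limited 57% of the time, you can split the average frame time into a CPU-bound share and a GPU-bound share and solve for the rate the GPU could sustain alone. The two input frame rates below are hypothetical placeholders (the thread doesn't give exact figures); only the 57% comes from the post above.

```python
# Rough model of the CPU-limited benchmark claim above. Split the average
# frame time into a CPU-bound share (57% of the run) and a GPU-bound share,
# then solve for the frame rate the GPU could sustain on its own.

def estimate_gpu_bound_fps(observed_fps, cpu_limited_fps, cpu_bound_fraction):
    t_avg = 1000.0 / observed_fps     # average frame time over the run, ms
    t_cpu = 1000.0 / cpu_limited_fps  # frame time while the CPU is the cap, ms
    # t_avg = f * t_cpu + (1 - f) * t_gpu  ->  solve for t_gpu
    t_gpu = (t_avg - cpu_bound_fraction * t_cpu) / (1.0 - cpu_bound_fraction)
    return 1000.0 / t_gpu

# Hypothetical inputs: ~128 fps measured, ~110 fps while CPU-capped.
print(f"~{estimate_gpu_bound_fps(128, 110, 0.57):.0f} fps")  # ~163 fps
```

With those made-up inputs the model lands in the same 150-160 fps ballpark the post suggests.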
 
I take it you didn't check out the Nvidia markup from the 3080 to the 4080 then (a 60+% price increase). Anything lower than the 4080 will be a sidegrade at best.

Strange points of reference there. None of them are your friend, I agree.

For marketing an x60 Ti card as an x80 card (the 4080 12GB), the green team do take the crown :)

Depends.

The chiplet designs should be giving them a manufacturing advantage. That should be good for them if they want to increase market share with aggressive pricing.

But prioritising opening with the high priced high end doesn't suggest they're looking to shift millions of cards any time soon.

Most likely they'll pocket the savings themselves like it was/is with Ryzen.

The difference last time, though, was that Nvidia had the 3080 for £649; now we've got the "4070" for £1,267.
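For what it's worth, the markup works out like this; a quick percent-change calculation using the US MSRPs ($699 for the 3080, $1,199 for the 4080 16GB) and the UK figures quoted above.

```python
# Percent increase in the xx80 slot: US MSRP ($699 -> $1,199) and the UK
# figures quoted above (649 -> 1,267 pounds).
def pct_increase(old, new):
    return (new - old) / old * 100

print(f"US MSRP: +{pct_increase(699, 1199):.0f}%")   # +72%
print(f"UK price: +{pct_increase(649, 1267):.0f}%")  # +95%
```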

Which proves my point. If Nvidia does Nvidia things, AMD (most likely) will follow as closely as possible.

Is that ignoring AMD's 6950 XT? That makes AMD's second card the 6900 XT, which was also $999, if you're going by naming scheme.

Also, we have no indication that rasterisation performance is positioned worse than the previous generation either, and the relative increases from AMD are still significantly better.

Not saying they're our friends, of course. However, yes, the 7900 XTX vs the 6900 XT is the same price, no matter how you cut it. The 7900 XT is just $100 too much; it should be $799, with room for a 7800 XT to still happily slot in at $649.
There was no 6950 XT at the launch of the 6000 series.
 

Indeed, the 6950 XT gets 99 fps at max settings at 4K. So if those numbers are true for the 4080, it is only 17% faster than the 6950 XT, which would be terrible.

[Image: Forza Horizon 5 benchmark chart, 3840x2160]
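For reference, the arithmetic behind that "only 17% faster" remark, with the 4080 figure (~116 fps) back-calculated from the claimed gap rather than taken from a published benchmark:

```python
# Arithmetic behind the "only 17% faster" remark. The 99 fps 6950 XT figure
# is from the chart above; the ~116 fps 4080 figure is back-calculated from
# the claimed gap (hypothetical, not a published number).
fps_6950xt, fps_4080 = 99, 116
gap = (fps_4080 / fps_6950xt - 1) * 100
print(f"4080 is ~{gap:.0f}% faster")  # ~17%
```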
 
If you're used to spending £650-ish on your component, I don't think anyone sees a doubling in cost as welcome. You can't really justify Nvidia: they unlaunched a product because it was "confusing" to have two cards with the same name, yet the next week they happily bring out more flavours of the 3060 series. "Just rejigging the naming to make it seem more palatable", no?

Nobody should be nailing their colours to the mast these days. However, one thing everyone understands is price. ;)

Yeah, what Nvidia have done is a complete xxxx-take, but AMD have rejigged their naming too. The 6800 XT used to compete with the 3080; now the 7900 XT competes with the 4080 but costs £350 more than the previous xx80 competitor. People would have been complaining like mad if they had called it a 7800 XTX and announced the price at $999.99, but because it's called a 7900 XTX, that's apparently fine, even though it's aimed at the 800-class tier. So, as I said, they have pulled an Nvidia, just not taken the xxxx quite as much.
 

The 4080's numbers sound about right to me, with it showing up as 30% higher than the 3090 Ti, which matches the difference in the leaked 3DMark Time Spy benchmarks released yesterday. It's just that the 4090 numbers are a bit low; they are low in that TPU benchmark as well, because they used a 5800X and DDR4 CL20 RAM to test with.
 
I agree, and I have commented on this previously, but it is down to Nvidia's **** take with the 4080 (both versions) leaving such a wide-open goal. These would have been the 7800 XT and 7800 had Nvidia priced the 4080s realistically. Instead they tried to foist the 4060 Ti on us for $900, before cancelling it because even their most ardent fans were laughing at the sheer audacity and arrogance. This is the first time in a long time (probably ever) that AMD got one over on Nvidia in the marketing stakes, and it wasn't because AMD executed perfectly; it's because Nvidia ****** up.

AMD are not our friends, but this is the difference between Nvidia offering you a kick in the balls while AMD are only giving you a punch in the face. Oooh, I'll have a punch in the face, please and thank you, AMD :D
 
There's a similar problem with Gotham Knights at 4K. The game is so hobbled and badly coded that it's also CPU limited.

The litmus test for legacy grunt is Total War: Warhammer 3. Awful engine, DX11, no upscaling support. Top, top game.

As previously mentioned, UE5 is where the future is, and Callisto Protocol should support all the goodies, being one of the first AAA UE5 titles.

4090 vs 7900 XTX on Callisto will show how big the gap truly is; only a month to wait for this shootout.
 


By Monica J. White, November 7, 2022, 6:30AM
Fine, AMD. You win. I’m jumping ship.

With the launch of the RX 7900 XTX and 7900 XT, this Nvidia fan was finally convinced to pick up an AMD graphics card as my next upgrade. I can’t believe I’m saying it, but for the first time ever, I couldn’t be more excited to be going Team Red.


I was never a fan of AMD

Yes, I admit — I was never that much of an AMD fan. Seeing as PC hardware has always been my thing, I kept up to date with AMD and its rivals in equal measure, but one bad experience with an AMD processor years ago put me off enough that I never really went back. Around 15 years have passed — ancient history, as far as computing is concerned — and beyond testing and building for others, I never owned an AMD CPU or GPU in my own personal build.


Over time, this reluctance toward AMD grew into a habit, and it was often justified — I picked Intel and Nvidia because I trusted them more and their hardware was simply better. This was years before the GPU shortage, when components were still affordable enough that I was OK with spending a little more if it meant I'd be putting good stuff into my new builds.


Of course, as time went by, AMD improved. With the launch of Ryzen CPUs and RDNA 2 GPUs, I was ready to acknowledge that AMD is solid again, but still not quite ready to cut the cord and say farewell to Nvidia.


So there I was, an Nvidia fan planning out my next build, until the last few weeks finally broke me. AMD’s launch of the RX 7900 XTX was the final nail in the coffin of my “no AMD” phase.


I tried to stick with Nvidia

Despite the soaring prices during the GPU shortage and the fact that AMD’s range was more affordable (even though none of it really was at the time), my upgrade plan has for months now involved an Nvidia card. I prepared different builds, ranging from an RTX 3070 Ti to an RTX 3090, and have been keeping my eye on the prices — still high in my area — until I could find a deal I’d consider worth it.


But my resolve was slowly melting. There I was, with AMD’s graphics cards within reach; perhaps not quite as good as Nvidia in ways like ray tracing, but still more than sufficient. Still, knowing that both manufacturers would be releasing new lineups this year, I made the common mistake of waiting to find out what we were getting instead of building my PC right away.


Cue the RTX 4090. It’s a real beast of a graphics card, with a pretty high power requirement and a much, much higher price. In our testing, the card proved to be pretty incredible in terms of performance, but in my mind, that still wasn’t enough to sway me to spend $1,600 on a graphics card. Not that I had the option to, anyway — despite the price tag, the GPU sold out in minutes, and I’m not going to be giving a few hundred dollars extra to a scalper just to be able to play Cyberpunk 2077 in seamless 4K.


Of course, I could wait for the RTX 4080 — the 16GB version, that is, because Nvidia promptly “unlaunched” the overpriced mistake of a card that was the RTX 4080 12GB. Unfortunately, the version with more memory didn’t convince me, either. Maybe I’m being cheap, or maybe I just want to pay reasonable prices for my hardware; either way, I wasn’t feeling up to it.


A steady decline

The last few weeks have been rough for Nvidia, even despite the initial success of its new Ada Lovelace generation of GPUs. First, the EVGA controversy — no matter how you spin it, it’s just not a good look. Then, the controversy surrounding the RTX 40-series GPUs started, and I was quickly running out of ways to defend my own choices.


Jensen Huang, Nvidia’s CEO, said it himself: “The idea that the chip is going to go down in price is a story of the past.” The timing of that statement could not have been worse, given the fact that many Nvidia enthusiasts, myself included, were pretty unhappy with the way Nvidia chose to price the next generation of graphics cards. Huang basically made it clear that things are not going to get any better in that regard.


Now, it turns out that the RTX 4090, and therefore also the RTX 4080, may have some melting issues due to the power adapter. A quick PSA: don’t bend your cables if you want to avoid a fire hazard. Don’t get me wrong — despite these problems, the RTX 4090 does seem pretty outstanding in a lot of ways, and in all likelihood, the RTX 4080 will also be a significant upgrade over the previous gen.


Somehow, that just doesn’t matter to me anymore. After 15 years, it’s time to give AMD another shot.


AMD couldn’t have picked a better time​

With the disappointment in Nvidia leaving a bitter taste in my mouth, I found myself getting increasingly excited about the announcement of RDNA 3 GPUs. I'd already toyed with the idea of picking up an AMD CPU for my next PC, and I was ready to make the same choice in terms of the graphics card.


Watching AMD’s announcement, I knew that I was on board. It’s sad that we’re at a time when a $1,000 GPU is a thrilling prospect, but it is — especially if we’re talking about a flagship that will likely become one of the best graphics cards of this generation.


The two new AMD flagships, the Radeon RX 7900 XTX and the RX 7900 XT, sound pretty great. We won't know their true performance until they land in the hands of eager reviewers, but AMD promises a 54% increase over RDNA 2 in performance per watt alongside being up to 1.7 times faster than the RX 6950 XT at 4K; access to DisplayPort 2.1 (and subsequently, 8K monitors, supposedly coming soon); and second-generation ray tracing that could help it catch up to Nvidia in that regard. AMD also claims that AI performance will be up to 2.7 times better than the previous generation of GPUs.
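A rough sanity check (my own arithmetic, not AMD's): if performance per watt is up 54% and board power rises from the 6950 XT's 335W (AMD's spec) to the 355W figure mentioned just below, the implied speedup is about 1.63x, in the same ballpark as the "up to 1.7x" claim.

```python
# Sanity check (my arithmetic, not AMD's): does +54% performance per watt
# line up with "up to 1.7x faster"? Assumes the 6950 XT's 335 W board power
# (AMD's spec) and the 355 W 7900 XTX figure mentioned below.
perf_per_watt_gain = 1.54
power_ratio = 355 / 335
print(f"implied speedup: ~{perf_per_watt_gain * power_ratio:.2f}x")  # ~1.63x
```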


AMD keeps the power requirements more conservative, maxing the TDP out at 355W for the 7900 XTX, and it’s not going to use Nvidia’s ill-fated 12VHPWR adapter, which, so far, seems to be the cause behind these melted RTX 4090s.


All of that is nice, but the best part is that AMD, unlike Nvidia, didn’t raise its prices. The flagship will cost $999 for the reference model, followed by $899 for the 7900 XT.


We don’t all need an RTX 4090​

Some readers may chime in here and tell me that there’s no way the RX 7900 XTX will keep up with the RTX 4090, and in all likelihood, they’d be right. However, the truth is that not all of us need an RTX 4090 — in fact, most of us don’t. There still aren’t many games that really need that kind of power, and even if they do, you can still run them on a cheaper GPU if you sacrifice a little bit of frame rate or take the settings down a notch.


Not many people really need an RTX 4090. Some do, but I am certainly not one of them; at least, not at that price.


I believe that the market needs more of what AMD is serving up, meaning semi-affordable hardware that’s more accessible to more users, and less of the ultra-high-end components that most gamers just can’t justify in their building budgets.


AMD’s flagships sound like the perfect middle ground between the expensive enthusiast-only sector and the mid-range segment where you have to compromise on some settings in certain games. They’re likely to run most AAA titles on max settings, but they’re still priced at a level I can get behind.


I’m ready, AMD. It’s going to be nice to see you again.
 
A company does its own tier naming; it has nothing to do with another company. AMD's 900-tier cards are around a grand, as with both the 6 series and the 7 series. AMD has nothing to do with Nvidia naming their 4080 at a higher price; if anything, AMD wants Nvidia to do this, and Nvidia shot themselves in the foot.
 
I am very interested in benchmarks, though. 2K is the sweet spot for competitive gaming nowadays, IMO. There is a point where people agree that once you get to a certain performance level at that resolution, enough is enough, once you factor in power consumption and price.

The law of the 3Ps always applies to GPUs (a toy scoring is sketched below):
Price
Performance
Power consumption
;)
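Something like this toy score, purely illustrative; all the card numbers are made-up placeholders, not benchmark results.

```python
# Toy "3Ps" score: fps per pound per watt, scaled for readability.
# All card numbers are made-up placeholders, not benchmark results.
cards = {
    "Card A": {"fps": 99,  "price": 800,  "watts": 335},
    "Card B": {"fps": 116, "price": 1267, "watts": 320},
}
for name, c in cards.items():
    score = c["fps"] / c["price"] / c["watts"] * 1e6
    print(f"{name}: {score:.0f}")  # higher is better
```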
 
There have been cases in the past where Nvidia or AMD were able to reposition a lower-tier GPU up a level (or even two), or where a higher-tier GPU had to be dropped a tier due to better-than-expected competition.

AMD were given a major opportunity by Nvidia's mess-up on the 4080s.

I would wager the 7900s were going to be 7800s, but AMD were able to rename them and even raise the pricing. AMD will be loving this: they get more profit and look like our saviours in the process.

I would also wager that the Nvidia 4080 16GB will be ~15+% slower in raster performance than the 7900 XTX, while costing a lot more. I don't see the RT performance saving it unless they do a price drop.

I believe (just a hunch) that Nvidia simply began to believe their own hype, that they can do no wrong. Yet here we are: they have cancelled a GPU and may be forced to drop the price of another a short while later.
 
I don't think so. Maybe the 7900 XT was going to be the 7800 XT, but not the 7900 XTX, and not at $999. There could be a 7950 XTX, making the 7900 XT the third card down, and that one could have been a 7800 XT (like the 6950, 6900 and 6800). But I think AMD wouldn't have changed their card naming for Nvidia, and not after the backlash.

Yes, I do think Nvidia started to believe their own hype, but can you blame them when people just keep on buying their cards at any price?
 

Good points on the 7900 naming. Though I do think Nvidia have had a rude awakening this time, as have many of even their most ardent fans. The fact that they cancelled the 4080 12GB, and that the 4080 16GB is double the price of the 3080, is not a good look and has not gone unnoticed by tech sites and buyers.
 