Allegedly. We need to see some game reviews, but it shouldn't be far off that.
I was on about the 5080, replying to bidley.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Really valid points in that article.
This is what Jack said in interviews months ago. So the objective seems to be there but will they stick to that?
My price limit was the 3080 FE, so £650. I said at the time I wouldn't pay more than that for a card, as it was expensive already, and my view hasn't changed. I'm playing a few games at 4K; I don't think I need a mortgage to do that in 2025 (I think Nvidia are losing the plot on pricing now).
If it actually can match the 7900XTX in half of the games I play and is priced around £600, then I think it's time to upgrade from the old 3080 FE. If the price is the same as or £50 cheaper than a 5070 Ti, then I'll wait for next gen. I have a price limit which I will not go over.
Using the settings in that vid:
7800XT MBA card, no driver tweaks.
Core Ultra 285K (Intel baseline profile); with no info on RAM speed and timings in the Chinese leak, I used a 6400 XMP profile, which is what reviewers were given as the baseline for Core Ultra CPU testing.
Win11 24H2, Intel PR5 fixes and BIOS updates in place.
Now, comparing this to the original Chinese leak on X and using a 7800XT to compare with (at the supposedly same settings), the 9070XT looks like a little monster considering the number of shaders it actually has.
It all boils down to street prices now, not imaginary MSRPs.
[benchmark screenshot]
OK... you can't have frame gen without FSR, so FSR has to be on; I'll put it to Quality.
A lot of CPU bottlenecking in this. Not entirely, but I would say for about 70% of the run my GPU was at around 80 to 90% utilisation, and a 285K is not much better at gaming than my 5800X. Not even joking, look at reviews... it's better, but not much.
I don't think we can get much from this. I think a lot of the performance we see in that result is the CPU, not the GPU. I mean, 1080p with FSR and Frame Gen... these are settings you choose for a CPU benchmark, not a GPU benchmark. It's crap.
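To put rough numbers on why those settings shift the load onto the CPU, here's a back-of-envelope sketch. The scale factor and frame-gen ratio are my own assumptions (FSR Quality's usual 1.5x per-axis upscale, and roughly one generated frame per rendered frame), not figures from the leak:

```python
# Rough sketch: why 1080p + FSR Quality + frame gen leans on the CPU.
# Assumptions (mine, not from the leak): FSR Quality upscales ~1.5x per axis,
# and frame generation shows roughly one generated frame per rendered frame.

OUTPUT_W, OUTPUT_H = 1920, 1080      # benchmark output resolution
FSR_QUALITY_SCALE = 1.5              # assumed per-axis upscale factor

render_w = round(OUTPUT_W / FSR_QUALITY_SCALE)   # ~1280
render_h = round(OUTPUT_H / FSR_QUALITY_SCALE)   # ~720
pixel_share = (render_w * render_h) / (OUTPUT_W * OUTPUT_H)
print(f"GPU renders ~{render_w}x{render_h}, about {pixel_share:.0%} of the native 1080p pixels")

# Frame gen inflates the displayed number whatever the bottleneck is;
# the CPU only has to simulate the frames that are actually rendered.
def rendered_fps(displayed_fps: float, generated_per_rendered: int = 1) -> float:
    """Rough estimate of the really rendered frame rate behind a frame-gen score."""
    return displayed_fps / (1 + generated_per_rendered)

print(rendered_fps(160.0))  # e.g. a ~160 fps headline score is only ~80 rendered fps
```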
Bone stock RX 7800 XT, not that this matters, could be anything....
Sorry for the weird Windows HDR colour on the screenshot.
[benchmark screenshot]
It's even worse than I thought.
You're using custom settings. Watch the vid above my post; I ran the benchmark using the settings he claims were in the original leak on X yesterday. If you want to compare, don't alter anything: FG enabled and Ultra settings, don't modify anything else.
This is what it looks like using the same settings as you:
Is it right this time?
158.29; the 5800X is still 3% better, or margin of error, sure. My CPU is 5 years old; it was £440 brand new.
£590
This is a work machine, I don't game on it.
I ran the benchmark as a comparison for people who saw the original leak on X and gave you all a run with a 7800XT.
Now, instead and as usual, you want to push your weird agenda.
I'm not going to get into your little games. Grow up!!!
Jesus wept. You spend that on a CPU and gurn about £600 on a GPU!
Got mine on MM for £100, but it will be a bottleneck; I'll replace it at some point and go all in on AM5.
lol at so much revisionist nonsense. ATI did tessellation, not Nvidia. AMD did multi-monitor gaming, and upscaling was a console tech long before Nvidia “invented it”.
VRR was a laptop battery saving tech before Gsync was “invented”
Lots of the tech you attributed to Nvidia “inventing” was actually pioneered long before you think it was, by companies other than Nvidia.
I’m happy to give credit where it’s due but you are literally convincing yourself Nvidia “invented” technologies that were copies of existing technology.
Nvidia take someone else’s idea = innovation.
AMD take someone else’s idea = reactionary.
AMD often seem too cautious or risk averse at times.
I think you misunderstood my whole post. I didn't say Nvidia invented all those technologies; I said they innovated new features. To innovate does not mean to invent. I am well aware of the origins of VRR and upscaling, although upscaling and AI upscaling are not the same thing. AMD had terrible tessellation performance for years, so if they did indeed invent it, they neglected it in hardware. I had AMD Eyefinity; that was a nice AMD feature, and it encouraged me to buy an AMD card and 3 new monitors, which is exactly my point about features. I forgot about Nvidia 3D Vision as well (yes, I know Nvidia didn't invent 3D).
Nvidia do innovate, take risks and bring new features, while AMD are currently reactionary and too safe; that seems clear to anyone who is paying attention. AMD were recently asked why they copied Apple with a gaming APU, and the AMD engineer responded: “We were building APUs [chips combining CPUs and Radeon graphics] while Apple was using discrete GPUs. They were using our discrete GPUs. So I don’t credit Apple with coming up with the idea.”
Joe Macri, AMD source
It took the success of Apple's mini PC to get the funding or confidence to make Ryzen AI Max, which tells you something about what it's like at AMD. It seemed obvious to many for years that there was a market for a powerful APU, but AMD would not take a risk; it took Apple to do that. Apple also didn't invent the smart music player (iPod) or the smartphone (iPhone), but they were the ones to take the risks. Almost every feature AMD has added to GPUs in years has been reactionary. I wouldn't be surprised if the delay in the RDNA4 launch is due to them wanting to add MFG 3x and 4x to FSR4.
I have used as many AMD cards as Nvidia cards and I have an AMD CPU; I don't care about brands, so this is not an attack on AMD. I just believe Nvidia has the better product for my needs at the moment. I truly wish AMD would innovate more and react less.