
Any news on the 7800 XT?

Reminds me of a time some guy came to my door because my German Shepherd, who was in my garden, was looking at him as he walked by.

Lol. That's what German Shepherds do. We have one, and I would think something was wrong with her if she wasn't permanently looking at people she thought were suspect (everyone). What on earth did he say to you?
 
Apologies, you are correct, I got that wrong. What I was trying to point out (badly) was that even if the 7900 XT had been $550 MSRP at release, it still would not have been a £550 GPU in the UK.

I agree that would be true, although I don't know why it's true. It's not taxation. It's not distribution costs either, because the difference is more than even the cost of shipping items one at a time at retail rates, let alone the actual bulk cost. So what's the reason? Or the excuse (with the real reason being more profit)?
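To put rough numbers on the tax part of that, here is a minimal sketch of what a US MSRP would translate to on a UK shelf. The exchange rate is an assumption purely for illustration, and the $550 is the hypothetical figure from the post above:

```python
# Rough sketch: what a US MSRP "should" become on a UK shelf once the
# exchange rate and VAT are accounted for. The exchange rate below is an
# assumed illustrative value, not a real quote.

def expected_uk_price(usd_msrp: float,
                      usd_per_gbp: float = 1.25,  # assumed exchange rate
                      vat_rate: float = 0.20) -> float:
    """US MSRPs exclude sales tax; UK shelf prices include 20% VAT."""
    pre_vat_gbp = usd_msrp / usd_per_gbp
    return pre_vat_gbp * (1 + vat_rate)

if __name__ == "__main__":
    msrp = 550.0  # the hypothetical $550 MSRP discussed above
    print(f"${msrp:.0f} MSRP -> ~£{expected_uk_price(msrp):.0f} including VAT")
```

Whatever gets charged above that converted figure comes down to margin and distribution rather than tax, which is the part that is hard to explain.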

On the plus side, the fact that I recently bought a 6700 XT after a couple of years of waiting to buy a graphics card improves the chances of the 7700some_name and/or the 7800some_name being better and cheaper than expected :)
 
What on earth did he say to you?
He literally said something along the lines of "your dog was staring at me when I walked by". It was around 18 years back so I can't remember it word for word, but it was obvious he was on something lol.
 
The 6700 XT is an excellent card for the price. I could have done without upgrading, but the tinkerer in me wanted to test the 7900 XT.

By running an older benchmark (Unigine Heaven) to compare directly with my 1070 Ti at max overclock from ~5 years ago (I ran it stock, after a few hours of overclocking for sport), I unintentionally tested for coil whine on my new 6700 XT when it hit 456.4 fps :) No coil whine. Or maybe coil whine at a frequency higher than my old ears can detect, which works equally well :)

I'd have preferred a 7900 XT, but at 1440p the difference wouldn't have mattered enough to be worth paying £830 rather than £300. I didn't notice the £700 7900 XT offer before buying the 6700 XT, but I'd probably have picked the 6700 XT even if I had. It's genuinely good value for £300, and "genuinely good value" isn't a phrase that normally comes to mind with graphics cards. Not any more.

I'll look at next gen, maybe get a higher model then. Or maybe Windows 11 will have gone online-only maximum-spyware edition by then and I'll change how I game when Win10 stops being supported. Maybe Intel will surprise everyone with Battlemage (and fully working software for it). Maybe AMD will get chiplets nailed down in the next gen (although the rumour is that they haven't). Maybe Nvidia will stop using their market dominance for deceptive mislabelling and price gouging. OK, that last one was too improbable. But regardless, a 6700 XT will certainly do me for the time being.
 
MLID says FSR 3 is launching with the 7700 XT.

So there is no need to worry about these new cards not performing well relative to the 6800 XT or 6750 XT, because FSR 3 will boost frames like DLSS 3.

 
When the solution is fake frames, we are all being had!

I have tried fake frames in MSFS and Witcher 3 and found they caused a disconnect in input lag. You would be getting 80 FPS, but the mouse camera movement felt like 30 FPS.

Where fake frames worked well was when you already had decent FPS to begin with, because the input lag and graphical corruption were less noticeable, even with low latency mode enabled.

Going from 30 FPS to 60+ FPS seems like a good thing, but the input lag disconnect felt wrong.

Going from 70 FPS to 120 is a good thing, of course, but at 70 FPS you already had arguably decent performance.

Ultimately, fake frames work best when your FPS is already good; they are a poor tool for increasing performance when your FPS is low.

Or to put it another way, it's a way to make Nvidia's benchmark graphs look good, especially for those review sites that never seem to mention the negative aspects of fake frames or upscaling.
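As a rough illustration of why the 80 FPS figure can still feel like 30: interpolation-style frame generation can only show a generated frame once the next real frame exists, so responsiveness still tracks the rendered frame rate. This is a toy model with simplified assumptions, not a measurement of DLSS 3 or FSR 3:

```python
# Toy model of interpolation-based frame generation. Simplified assumption:
# input latency is roughly two base frame times, because the newest real
# frame is held back until the next one exists before a generated frame can
# be inserted. Real pipelines add render queues, Reflex/Anti-Lag, display
# latency and so on.

def frame_generation_model(base_fps: float) -> tuple[float, float]:
    render_ms = 1000.0 / base_fps    # time to render one real frame
    displayed_fps = base_fps * 2     # one generated frame per real frame
    approx_input_latency_ms = 2 * render_ms
    return displayed_fps, approx_input_latency_ms

for base in (30, 70):
    shown, lag = frame_generation_model(base)
    print(f"{base} fps rendered -> ~{shown:.0f} fps shown, ~{lag:.0f} ms input latency")
```

Under this model, 30 fps rendered shows as ~60 fps but carries ~67 ms of input latency, while 70 fps rendered shows as ~140 fps at ~29 ms, which matches the feeling that it only works well when the base frame rate is already decent.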
 
Yes, another factor is that making high FPS look somewhat smoother means halo-card evangelists can sing its praises everywhere, so the masses buy x60 and x50 cards expecting the same experience. But like RT on an x50 or x60, they have been had, especially since they have often paid as much for their low-end cards as for an AMD card that is far faster at raster: once the shortage abated there were 3050s selling for nearly as much as a plain 6700, which is crazy.
 
MLID says FSR 3 is launching with the 7700 XT.


It basically confirms there is no high end from AMD with RDNA 4; it's too expensive to R&D them.

That's exactly what I said would happen. AMD don't have the market share to compete with Nvidia at the high end: the 4090 is 60% more expensive than AMD's top card, and Nvidia sell more of that card alone than AMD sell across their entire range.

YouTubers like HUB and Vex only see the BOM costs: "Oh, but AMD can sell them at half the price of Nvidia and still make money." Actually, it cost $500,000,000 to develop that GPU, and yeah... we can't do it any more.
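To put rough numbers on why BOM-only arguments mislead, here is a back-of-envelope amortisation sketch. The BOM and unit counts are hypothetical placeholders; only the $500,000,000 development figure comes from the argument above:

```python
# Back-of-envelope amortisation: each card sold carries a slice of the fixed
# R&D spend on top of its bill of materials. All figures except the quoted
# $500M development cost are hypothetical placeholders.

def per_unit_cost(bom: float, rnd_budget: float, units_sold: int) -> float:
    return bom + rnd_budget / units_sold

RND = 500_000_000  # development figure quoted above
BOM = 350.0        # hypothetical bill of materials per card

for units in (500_000, 2_000_000, 10_000_000):
    cost = per_unit_cost(BOM, RND, units)
    print(f"{units:>10,} units sold -> ${cost:,.0f} per card before any margin")
```

At large volumes the R&D slice almost vanishes; at small volumes it dominates, which is the part the "sell it at half the price" argument skips over.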
 
I have tried fake frames in MSFS and Witcher 3 and found they caused a disconnect in input lag.
I don't entirely disagree. That said, in RPGs etc. I find I tend to forget about the input lag soon enough but still get the benefit of the smoothness. It's still a bit better than native sub-60 overall, so it's still a win, just not as impressive as the graphs make it look.
 

That's my point. On paper it seems like a win for gamers, but the reality is different.
 
I think my perception is slightly different. It's not that it isn't a win at all; it's just a smaller win than advertised.

It's not a win when it becomes the new norm and these performance "gains" are sold as must-have features. You end up with next-gen GPUs that are very poor upgrades but show massive increases on paper.

A 4070 is an objectively terrible GPU and is slower than a 3080, but in a selection of these DLSS 3 games it will, on paper, destroy a 3080 or even a 3090 Ti.
 

Yeah, but now we're talking about a speculative future more than anything else. The here-and-now problem is really pricing, and they don't have to worry about that because of datacentre AI.

They know the 4070 isn't selling as well as it should, but instead of cutting prices they cut production, because they don't care. Why make 4070s when they could dedicate that wafer area to H100s that sell for $30k apiece with people queuing up for them? I don't think DLSS 3 is making that much difference in the grand scheme of things.
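For a rough sense of the wafer maths, here is a sketch of revenue per wafer for a consumer die versus a datacentre die. The die counts, yields and the consumer price are loose assumptions for illustration; only the roughly $30k datacentre price comes from the post above:

```python
# Opportunity-cost sketch: revenue from one wafer's worth of consumer GPU
# dies versus datacentre dies. Die counts, yields and the consumer price are
# assumptions; the ~$30k datacentre price is the figure quoted above.

def revenue_per_wafer(dies_per_wafer: int, yield_rate: float, unit_price: float) -> float:
    return dies_per_wafer * yield_rate * unit_price

consumer = revenue_per_wafer(dies_per_wafer=200, yield_rate=0.85, unit_price=600)
datacentre = revenue_per_wafer(dies_per_wafer=60, yield_rate=0.70, unit_price=30_000)

print(f"Consumer GPU wafer:   ~${consumer:,.0f}")
print(f"Datacentre GPU wafer: ~${datacentre:,.0f}")
```

Even with generous assumptions for the consumer part, the datacentre wafer brings in an order of magnitude more, which is why cutting 4070 production rather than prices is the obvious move for them.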
 