
AMD RDNA3 unveiling event

You make a good point, I guess it's because the 4080 benches aren't really out yet. Personally I would have put raster performance figures up against the 4090, but then I'd also have put the price comparison in big bold letters :D
 
Nvidia do sometimes retrofit features to work on older cards that don't have the dedicated hardware - ray tracing is a case in point: it's fully supported on Pascal cards even though they have no RT hardware, it's all done in shaders (and runs snail-slow). However, AMD have a better track record for this sort of thing.
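To illustrate what "done in shaders" means in practice, here's a toy ray-sphere intersection in Python - purely my own illustration, nothing to do with Nvidia's actual driver path. The point is that the hit test is just ordinary arithmetic that general-purpose shader cores can run; dedicated RT cores exist to accelerate exactly this kind of test (plus BVH traversal), which is why doing it all in shaders works but is painfully slow.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Classic ray-sphere intersection done with plain arithmetic.

    On GPUs without RT cores this kind of test runs in ordinary
    shader/compute code; dedicated RT hardware accelerates exactly
    these intersection and BVH-traversal steps.
    """
    # Vector from the ray origin to the sphere centre
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                      # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0.0 else None        # distance to the hit point

# A ray fired down -z at a unit sphere 5 units away hits at distance ~4
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))
```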
 
AMD releases most of this stuff freely across vendors, and I'm pretty sure I saw mention that open-source, software-based frame generation has been available for quite some time now; perhaps that's why AMD will be able to launch it in a relatively short timeframe.

Hopefully another kick in the balls for NV if it works on 3000 series.
 
Nvidia do sometimes retrofit features to work on older cards that don't have the dedicated hardware - ray tracing is a case in point: it's fully supported on Pascal cards even though they have no RT hardware, it's all done in shaders (and runs snail-slow). However, AMD have a better track record for this sort of thing.
Didn't they only enable it on Pascal to make Turing look better?

It was a safe bet because it was literally unusable on Pascal.
Unlike DLSS 3, which could probably run at an acceptable level on Ampere.
 
Although I'll probably get grief for mentioning it, AMD releases most of this stuff freely across vendors, and I'm pretty sure I saw mention that open-source, software-based frame generation has been available for quite some time now; perhaps that's why AMD will be able to launch it in a relatively short timeframe.

I think people are referring to AMD's Fluid Motion, which was for video. Whilst there might be some code they can re-use, I imagine it's a completely different ball game getting it working for games, not to mention getting good results on all levels of hardware.


Their method will probably be very similar to what we see with VR headsets.
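Something in the spirit of what VR runtimes do (synthesising an in-between frame from what has already been rendered) can be sketched very roughly like this - my own toy illustration in Python, not AMD's or Nvidia's actual method:

```python
import numpy as np

def interpolate_midframe(frame_a, frame_b, motion):
    """Toy generation of one in-between frame from two rendered frames.

    frame_a, frame_b : (H, W) float arrays, two consecutive rendered frames
    motion           : (H, W, 2) per-pixel motion vectors (dy, dx) from a to b

    Each pixel of frame_a is pushed half-way along its motion vector and
    blended 50/50 with frame_b. Real implementations also have to handle
    occlusion/disocclusion, HUD elements and the added latency, which is
    where most of the difficulty (and the artefacts) come from.
    """
    h, w = frame_a.shape
    mid = np.zeros_like(frame_a)
    for y in range(h):
        for x in range(w):
            dy, dx = motion[y, x]
            ny = int(round(min(max(y + dy * 0.5, 0), h - 1)))
            nx = int(round(min(max(x + dx * 0.5, 0), w - 1)))
            mid[ny, nx] = 0.5 * frame_a[y, x] + 0.5 * frame_b[ny, nx]
    return mid

# Two tiny 4x4 "frames" with zero motion give a simple 50/50 blend
a, b = np.zeros((4, 4)), np.ones((4, 4))
print(interpolate_midframe(a, b, np.zeros((4, 4, 2))))  # all 0.5
```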

Hopefully it's sooner rather than later, but going by history, expect the end of next year, especially if they want to get it working well on a wide range of hardware.
 
Obviously I'm no expert on the fake frame subject, but to me this seems similar to the G-Sync vs FreeSync debate. AMD goes with the open-source fake frame option (kinda like adaptive sync) while Nvidia have gone with their closed-source, hardware-generated fake frame implementation (like G-Sync), which I wouldn't be surprised to see turn out to be a ploy to sell more 4000 series cards. I guess we won't know fully until AMD's FSR 3 is released and we see it running on older-gen cards, and how latency and other aspects compare to DLSS 3.

FreeSync has proven that G-Sync isn't worth the cost increase, so time will tell on the fake frame implementations.
 
Obviously I'm no expert on the fake frame subject, but to me this seems similar to the G-Sync vs FreeSync debate. AMD goes with the open-source fake frame option (kinda like adaptive sync) while Nvidia have gone with their closed-source, hardware-generated fake frame implementation (like G-Sync), which I wouldn't be surprised to see turn out to be a ploy to sell more 4000 series cards. I guess we won't know fully until AMD's FSR 3 is released and we see it running on older-gen cards, and how latency and other aspects compare to DLSS 3.

FreeSync has proven that G-Sync isn't worth the cost increase, so time will tell on the fake frame implementations.

IIRC, adaptive sync/FreeSync was not possible at the time of release because Nvidia GPUs didn't have the required display hardware; I think that was the "main" reason Nvidia had to add a hardware module.

That, and the first iteration of adaptive sync (i.e. FreeSync) was not on par with the G-Sync module in many areas: black screens, flickering, poor FPS ranges, lack of low framerate compensation and lack of variable overdrive (which is still the main advantage of the G-Sync module for LCD-based displays).
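For anyone unfamiliar with low framerate compensation (LFC): when the game's frame rate drops below the panel's minimum VRR rate, the driver repeats each frame enough times that the effective refresh rate lands back inside the supported range. A rough sketch of the idea in Python (the 48-144 range and the function are just my example, not any vendor's actual logic):

```python
def lfc_refresh(fps, vrr_min=48, vrr_max=144):
    """Pick a refresh rate for a given frame rate on a VRR panel.

    Inside the panel's range the refresh rate simply tracks the frame
    rate. Below the minimum, each frame is repeated an integer number
    of times (that's low framerate compensation); without it the panel
    drops out of VRR entirely and you get tearing or judder instead.
    """
    if fps >= vrr_min:
        return min(fps, vrr_max), 1        # no frame repetition needed
    repeats = -(-vrr_min // fps)           # ceil(vrr_min / fps)
    return fps * repeats, repeats

print(lfc_refresh(90))   # (90, 1) - inside the range, no LFC
print(lfc_refresh(30))   # (60, 2) - each frame shown twice
print(lfc_refresh(20))   # (60, 3) - each frame shown three times
```

It also shows why narrow ranges are a problem: LFC only works when the panel's maximum refresh rate is at least roughly double its minimum.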

TFT Central have a very good article on where the G-Sync module differs from adaptive sync (FreeSync and G-Sync Compatible).



But I agree that for the last couple of years it hasn't been worth the premium, especially on OLED displays, where its main advantage (variable overdrive) is rendered useless.
 
IIRC, adaptive sync/FreeSync was not possible at the time of release because Nvidia GPUs didn't have the required display hardware; I think that was the "main" reason Nvidia had to add a hardware module.

That, and the first iteration of adaptive sync (i.e. FreeSync) was not on par with the G-Sync module in many areas: black screens, flickering, poor FPS ranges, lack of low framerate compensation and lack of variable overdrive (which is still the main advantage of the G-Sync module for LCD-based displays).

TFT Central have a very good article on where the G-Sync module differs from adaptive sync (FreeSync and G-Sync Compatible).

But I agree that for the last couple of years it hasn't been worth the premium, especially on OLED displays, where its main advantage (variable overdrive) is rendered useless.

The main issue with the first version of FreeSync wasn't really FreeSync itself but monitor manufacturers' implementations. You had many monitors claiming to be FreeSync but with a FreeSync range of some daft figure like 20 Hz.
If you did your homework and bought a good FreeSync monitor with a good range then you were pretty much getting the same experience as G-Sync without the extra cost, which is why I said it isn't worth the extra for G-Sync.
 
The main issue with the first version of FreeSync wasn't really FreeSync itself but monitor manufacturers' implementations. You had many monitors claiming to be FreeSync but with a FreeSync range of some daft figure like 20 Hz.
If you did your homework and bought a good FreeSync monitor with a good range then you were pretty much getting the same experience as G-Sync without the extra cost, which is why I said it isn't worth the extra for G-Sync.

Problem is there were very few "good" FreeSync monitors for quite a while. IIRC the first good one, which didn't suffer from issues, was the BenQ TN 1440p 144 Hz model, and its FreeSync range was 48-144 (tbh the range didn't bother me too much, as with either solution I still wouldn't want to be dropping below 60). So it kind of boils down to the usual question of how long people are willing to wait. Personally I don't mind waiting a couple or a few months, but not 1+ years, and certainly not if said product comes out below the quality of the competitor's offering.

Of the G-Sync module's advantages, I would say variable overdrive was, and is, the biggest, but then I'm incredibly sensitive to the motion ghosting/overshoot of LCD monitors. Alas, even that isn't worth £200+.

It's things like this, and FSR, where AMD need to be first and not give the impression of playing catch-up all the time. Sure, their reasoning of wanting an open-source solution available to the masses is perfectly good, but not many people care about that when they're spending hundreds or thousands on a product, where being first, or the best within one's budget, is everything.
 
The 3080 cost $700 back then though, so it wasn't like they were going to compare a $999 6900 XT to that. With the new 4080 coming in $200 higher than the 7900 XTX it makes sense to compare it to that card, but it's not out yet, so benchmarks aren't available.
Comparing it to the 4080 is a trap in the sense that we know the 4080 Ti is coming, will probably be dropped the moment this card is released, and will likely perform almost on par with the 4090.

Comparing it to the 4090 would have generated more interest in the product. Charts showing it walking all over the 4080 aren't really an achievement, as that's a terribly priced product due to be replaced soon.
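Just to put the prices being thrown around into one place (taking the launch MSRPs quoted in this thread at face value - this is only the arithmetic, nothing more):

```python
# Launch prices as quoted in this thread, taken at face value (USD MSRPs)
msrp = {
    "RTX 3080": 700,
    "RX 6900 XT": 999,
    "RX 7900 XTX": 999,
}
# "the new 4080 coming in $200 higher than the 7900XTX"
msrp["RTX 4080"] = msrp["RX 7900 XTX"] + 200   # 1199 - the ~$1200 mentioned later on

for gpu, price in sorted(msrp.items(), key=lambda kv: kv[1]):
    diff = price - msrp["RX 7900 XTX"]
    print(f"{gpu:<12} ${price:>4}  ({diff:+d} vs 7900 XTX)")
```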
 
If they started talking about competing with the 4090 and then, when reviews hit, people weren't happy with how close it got, it would be marketing suicide. A lot of people are already writing it off entirely based on theoretical RT performance.

Imagine the graphs showing it 10% behind the 4090 in reviews if they used it as a comparison. Nvidia fans would have a field day regardless of any other metric.
The 6900XT was 10% slower than the 3090, and AMD could still find games where the 6900XT was faster for their charts. I am sure Far Cry 6, COD MW2, Forza Horizon 5, Watch Dogs Legion and RE Village, for instance, would have the 7900 XTX outperforming the 4090.
 
Problem is there were very few "good" FreeSync monitors for quite a while. IIRC the first good one, which didn't suffer from issues, was the BenQ TN 1440p 144 Hz model, and its FreeSync range was 48-144 (tbh the range didn't bother me too much, as with either solution I still wouldn't want to be dropping below 60). So it kind of boils down to the usual question of how long people are willing to wait. Personally I don't mind waiting a couple or a few months, but not 1+ years, and certainly not if said product comes out below the quality of the competitor's offering.

Of the G-Sync module's advantages, I would say variable overdrive was, and is, the biggest, but then I'm incredibly sensitive to the motion ghosting/overshoot of LCD monitors. Alas, even that isn't worth £200+.

It's things like this, and FSR, where AMD need to be first and not give the impression of playing catch-up all the time. Sure, their reasoning of wanting an open-source solution available to the masses is perfectly good, but not many people care about that when they're spending hundreds or thousands on a product, where being first, or the best within one's budget, is everything.

Yeah, the BenQ XL2730Z had a range of 40-144 Hz. I know 'cos I'm still using one :p Been holding out for 16:9 OLEDs in the 27-32" range, which feels like forever now!
 
It's a 3rd party hack saying it's not working properly.

Does that mean it can't work properly or that they can't get it to work?

Nvidia has zero interest in letting it work on older cards so they can do a whole range of things to prevent it working on anything other than the newest cards.

They want it to be a new shiny upsell factor for 4000 series.
And if they did try and make it work on the older hardware and it didn't look as good, I am sure certain YT channels would pick that up and claim Nvidia was making older hardware look bad on purpose. There is no way they come out of this looking good.
 
The 6900XT was 10% slower than the 3090, and AMD could still find games where the 6900XT was faster for their charts. I am sure Far Cry 6, COD MW2, Forza Horizon 5, Watch Dogs Legion and RE Village, for instance, would have the 7900 XTX outperforming the 4090.

The 6900XT was never 10% slower than the 3090, and even then it still got slated and people bought the 3090 anyway.

I am referring to independent benchmarks though, the ones that come out on release. If AMD said "Look we compete with the 3090 for $999" and then reviews showed otherwise it would be a marketing disaster.
 
Comparing it to the 4080 is a trap in the sense that we know the 4080 Ti is coming, will probably be dropped the moment this card is released, and will likely perform almost on par with the 4090.

Comparing it to the 4090 would have generated more interest in the product. Charts showing it walking all over the 4080 aren't really an achievement, as that's a terribly priced product due to be replaced soon.
It's not really a trap since we don't yet know when the 4080 Ti is coming or how much it'll cost. For all we know, Nvidia could slot it in at $1400.
 
And if they did try and make it work on the older hardware and it didn't look as good, I am sure certain YT channels would pick that up and claim Nvidia was making older hardware look bad on purpose. There is no way they come out of this looking good.

All I know is that all this blurring and sharpening and image reconstruction is a funny way to get closer to "8k" gaming.
 
I think AMD absolutely want Nvidia to release the 4080 at $1200, so by not showing figures against the 4090 and playing down their own card's performance, they keep Nvidia guessing.

Had they shown slides of the 7900 XTX coming close to the 4090, Nvidia might have cut the launch price of the 4080, which would have meant that when the 7900 XT comes out the reviews wouldn't be as positive as they likely will be with the 4080 priced at $1200.

It would also cause a lot of bad press and bad feeling towards Nvidia, from the media and from people who went out and bought the cards for $1200+, if they had to cut the price after launch due to a large performance deficit.
 
If by tuning you mean making it non-existent.
You made a very specific claim which is what I responded to. Don't be shifty now.

goal-post-moving.gif
 
It's not really a trap since we don't yet know when the 4080 Ti is coming or how much it'll cost. For all we know, Nvidia could slot it in at $1400.
I do wonder: if I were to do a search for occurrences of "4080 Ti" on OCUK, I suspect I might find that 90% of them are from Shaz12 in this thread!

A rather strange argument to defend the 4080's position anyhow, but since it would presumably have to be AD102- rather than AD103-based, I would think Nvidia will wait and stockpile dies so they don't have to cut down too many that are perfectly able to hit 4090 specs.
 