Didn't they only enable it on Pascal to make Turing look better?

Nvidia do sometimes retrofit stuff to work on older cards that don't have the dedicated hardware - ray tracing is a case in point: it's fully supported on Pascal cards even though they have no RT hardware, it's all done in shaders (and runs snail-slow). However, AMD have a better track record for this sort of thing.
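For anyone curious how that looks from a game's point of view, here's a minimal sketch (my own illustration, not from this thread) using the standard D3D12 feature query. With a DXR-enabled driver, a Pascal card reports Tier 1.0 just like Turing does, even though the work runs in shaders, so the tier value alone doesn't tell you whether dedicated RT hardware exists:

```cpp
// Minimal sketch: query DXR support via the standard D3D12 feature check.
// Windows only; link against d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter (11_0 is the usual minimum feature level).
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::puts("No D3D12 device available.");
        return 1;
    }

    // OPTIONS5 carries the ray tracing tier.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
    {
        // A Pascal card with a DXR-enabled driver lands here too, even though
        // everything is emulated in compute shaders rather than on RT cores.
        std::puts("DXR is exposed by the driver (hardware or shader-based).");
    }
    else
    {
        std::puts("No DXR support reported.");
    }
    return 0;
}
```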
I'll probably get grief for mentioning that AMD releases most of this stuff freely across vendors, but I'm pretty sure I saw mention that open-source software frame generation has been available for quite some time now; perhaps that's why AMD will be able to launch it in a relatively quick timeframe.
Obviously I'm no expert on the fake frame subject, but to me this seems similar to the G-Sync vs FreeSync debate. AMD goes with the open-source fake frame option (kinda like Adaptive Sync) while Nvidia have gone with their closed-source, hardware-generated fake frame implementation (like G-Sync), which I wouldn't be surprised turns out to be a ploy to sell more 4000 series cards. I guess we won't know fully until AMD's FSR 3 is released and we see how it runs on older-gen cards and how latency and other aspects compare to DLSS 3.
FreeSync has proven that G-Sync isn't worth the cost increase, so time will tell on the fake frame implementations.
IIRC, adaptive sync/FreeSync wasn't possible on Nvidia GPUs at the time of G-Sync's release because they lacked the required hardware; I think that was the "main" reason Nvidia had to add a hardware module.
That, and the first iteration of adaptive sync (i.e. FreeSync) was not on par with the G-Sync module in many areas: black screens, flickering, poor FPS ranges, lack of low framerate compensation, and lack of variable overdrive (which is still the main advantage of the G-Sync module for LCD-based displays).
TFT Central have a very good article on how the G-Sync module differs from adaptive sync (FreeSync and G-Sync Compatible):
Variable Refresh Rates - G-sync and FreeSync - TFTCentral
A detailed look at variable refresh rates (VRR) including NVIDIA G-sync, AMD FreeSync and all the various versions and certifications that exist (tftcentral.co.uk)
But I agree that now, and for the last couple of years, it hasn't been worth the premium, especially on OLED displays where the module's main advantage is rendered useless.
The main issue with the first version of FreeSync wasn't really FreeSync itself but the monitor manufacturers' implementations. You had many monitors claiming to be FreeSync but with a FreeSync range of something daft like 20 Hz.
If you do your homework and buy a good FreeSync monitor with a decent range, then you're pretty much getting the same experience as G-Sync without the extra cost - that's why I said G-Sync isn't worth the extra.
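To illustrate why a narrow range is a problem (and the low framerate compensation mentioned a couple of posts up): LFC basically shows each frame enough times to keep the effective refresh inside the panel's VRR window, which is only possible when the maximum refresh is roughly at least double the minimum. A rough sketch of the idea (my own illustration, not taken from any driver):

```cpp
// Sketch of the low framerate compensation (LFC) idea: when the game's frame
// rate falls below the panel's minimum VRR refresh, repeat each frame so the
// effective refresh lands back inside the supported range.
#include <cstdio>

// Returns how many times each frame should be shown for a given frame rate and
// VRR range, or 0 if no multiple fits (e.g. fps above the range, or a range too
// narrow for frame doubling).
int lfcMultiplier(double fps, double minHz, double maxHz)
{
    if (fps >= minHz && fps <= maxHz)
        return 1; // already inside the range, no compensation needed
    for (int n = 2; n * fps <= maxHz; ++n)
        if (n * fps >= minHz)
            return n;
    return 0;
}

int main()
{
    // A 48-144 Hz panel running a game at 30 fps: each frame is shown twice,
    // so the panel refreshes at 60 Hz and stays within its VRR window.
    std::printf("30 fps on 48-144 Hz: x%d\n", lfcMultiplier(30.0, 48.0, 144.0));
    // A narrow range can't do this: 40 fps on a 48-75 Hz panel has no valid multiple.
    std::printf("40 fps on 48-75 Hz:  x%d\n", lfcMultiplier(40.0, 48.0, 75.0));
    return 0;
}
```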
Comparing it to the 4080 is a trap in the sense we know the 4080 Ti is coming and will probably be dropped the moment this card is released, and that will likely perform almost on par with the 4090.

The 3080 cost $700 back then though, so it wasn't like they were going to compare a $999 6900XT to that. With the new 4080 coming in $200 higher than the 7900XTX, it makes sense to compare it to that card, but it's not out yet so benchmarks aren't available.
The 6900XT was 10% slower than the 3090 and AMD could still find games where the 6900XT was faster for their charts. I am sure Far Cry 6, COD MW2, Forza Horizon 5, Watch Dogs Legion and RE Village, for instance, would have the 7900XTX outperforming the 4090.

If they started talking about competing with the 4090 and then, when reviews hit, people weren't happy with how close it got, it would be marketing suicide. A lot of people are already writing it off entirely based on theoretical RT performance.
Imagine the graphs showing it 10% behind the 4090 in reviews if they used it as a comparison. Nvidia fans would have a field day regardless of any other metric.
Problem is there were very few "good" FreeSync monitors for quite a while. IIRC the first good one that didn't suffer from those issues was the BenQ 1440p 144 Hz TN, and its FreeSync range was 48-144 Hz (tbh the range didn't bother me too much, as even with either solution I still wouldn't want to be dropping below 60). So it kind of boils down to the usual question of how long people are willing to wait. Personally I don't mind waiting a couple or a few months, but not 1+ years, and certainly not if said product comes out below the quality of the competitor's offering.
I would say variable overdrive was and is the biggest advantage of the G-Sync module over its other pros, but then I'm incredibly sensitive to motion and to the ghosting/overshoot of LCD monitors - alas, still not worth £200+.
It's things like this, and FSR, where AMD need to be first and not give the impression of playing catch-up all the time. Sure, their reasoning of wanting an open-source solution available to the masses is perfectly good, but not many people care about that when they're spending hundreds or thousands on a product, where being first/the best within one's budget is everything.
Once you go OLED, there's no going back!

Yeah, the BenQ XL2730Z - it had a range of 40-144 Hz. I know 'cos I'm still using one. Been holding out for 16:9 OLEDs in the 27-32" range, which feels like forever now!
And if they did try and make it work on the older hardware and it didn't look as good I am sure certain YT channels will pick that up and claim Nvidia is making older hardware look bad on purpose. There is no way they are coming out of this looking good.

It's a 3rd party hack saying it's not working properly.
Does that mean it can't work properly or that they can't get it to work?
Nvidia has zero interest in letting it work on older cards, so they'll do a whole range of things to prevent it working on anything other than the newest cards.
They want it to be a new shiny upsell factor for the 4000 series.
The 6900XT was 10% slower than 3090 and AMD could find games where the 6900XT was faster for their charts. I am sure Far Cry 6, COD MW2, Forza Horizon 5, Watch Dogs Legion, RE Village for instance would have the 7900XTX outperforming 4090.
It's not really a trap since we don't yet know when the 4080ti is coming or how much it'll cost. For all we know Nvidia could slot it in at $1400.

Comparing it to the 4080 is a trap in the sense we know the 4080 Ti is coming and will probably be dropped the moment this card is released and that will likely perform almost on par with 4090
Comparing it to the 4090 would have generated more interest in the product. Charts showing it walking over the 4080 isn't really an achievement as it's a terribly priced product due to be replaced soon.
And if they did try and make it work on the older hardware and it didn't look as good I am sure certain YT channels will pick that up and claim Nvidia is making older hardware look bad on purpose. There is no way they are coming out of this looking good.
You made a very specific claim which is what I responded to. Don't be shifty now.

If by tuning you mean making it non-existent.
I do wonder if I were to do a search for occurrences of 4080Ti on OCUK, I suspect I might find that 90% of those are from Shaz12 in this thread!

It's not really a trap since we don't yet know when the 4080ti is coming or how much it'll cost. For all we know Nvidia could slot it in at $1400.