On the ultra quality comparison?
On the quality mode.
Please remember that any mention of competitors, hinting at competitors, or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape, or form and you'll be OK.
Is it just me, or is history repeating itself here?
I am getting a FreeSync vs G-Sync vibe.
I remember G-Sync being the better choice because it came with a £100 expensive hardware module.
Well, look how that ship sailed.
If AMD can pull off FSR to look just like native, or thereabouts (because, like DLSS, I am sure it won't be perfect), then does it really matter what approach each graphics manufacturer chooses?
So long as all users get this upsampling feature, I couldn't care less that DLSS gives a slightly better image; you'd have to zoom in 10x to spot the detail in a leaf.
How did it sail? G-Sync is still better than FreeSync. Not sure what you mean.
FreeSync monitors: tons of issues, flickering, and crappy in general; also, VRR works only above a certain fps (see the LFC sketch below).
G-Sync: premium monitors, barely any issues, no flickering, and VRR working at super low fps compared to FreeSync.
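For context on the "VRR working only above a certain fps" complaint: below a panel's VRR floor, drivers fall back to Low Framerate Compensation (LFC), repeating each frame so the effective refresh lands back inside the supported range, and that only works when the range is wide enough. A minimal sketch of the idea, with hypothetical numbers and a made-up function name rather than any driver's real API:

```cpp
#include <cstdio>

// Smallest frame-repeat multiplier that puts the effective refresh
// (multiplier * fps) back inside [vrr_min_hz, vrr_max_hz].
// Returns 0 when the range is too narrow to compensate.
int lfc_multiplier(double fps, double vrr_min_hz, double vrr_max_hz) {
    if (fps >= vrr_min_hz) return 1;               // already in range
    for (int m = 2; m * fps <= vrr_max_hz; ++m)
        if (m * fps >= vrr_min_hz) return m;
    return 0;
}

int main() {
    // Wide 48-144 Hz range: 35 fps gets doubled to a 70 Hz refresh.
    printf("48-144 Hz at 35 fps: x%d\n", lfc_multiplier(35, 48, 144));
    // Narrow 48-60 Hz range: 2 x 35 = 70 Hz overshoots, so no LFC kicks in;
    // exactly the "only above a certain fps" behaviour described above.
    printf("48-60 Hz at 35 fps: x%d\n", lfc_multiplier(35, 48, 60));
    return 0;
}
```

Early FreeSync panels often shipped ranges too narrow for LFC, while the G-Sync module always handled low fps, which is where the complaint above comes from.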
Also, Sapphire AMD cards already have access to FSR, and have for several months now; it's called Trixx Boost.
It really depends on the developers and how much support AMD throws their way to make this seamless, but FSR does have the potential to kill DLSS, especially since it is used on consoles as well as PCs.
Being open source, FSR can only get better; anyone can improve it. So even if it looks as bad as in the 1060 screen captures today, things can only improve from here on.
I think you're still reading about issues from about 5 years ago.
FreeSync has come on massively, with high resolutions and wide FreeSync windows.
My new monitor, coming soon, is 4K, 40-160 Hz, HDR 600.
With FreeSync there are a lot of monitors to choose from, from the cheap end to the premium end; your choice will obviously have limitations if you go for the lower end.
It really depends on the developers and how much support AMD throws their way to make this seamless, but FSR does have the potential to kill DLSS especially since it is used on consoles as well as PCs.
P.S. I still don't like DLSS/FSR, before anyone thinks I have changed positions.
Sorry, an HDR 600 monitor is all I needed to hear.
That is anything but premium.
How many technologies in GPUOpen are widely used? That tells you all you need to know. Especially when you see the new generation of temporal AA and SR algorithms, such as in UE5 and Quake II RTX, and the continued growth of DLSS, which is now plug 'n' play for many developers. There will be many choices and engine-specific implementations.
I expect it will get better. Given some of the patents AMD have released and the state of the art, I expect AMD will be trying to keep their options open to allow temporal data in the future. This may not be possible on their current hardware as the ML models get much larger, but when AMD have their equivalent of Tensor Cores in RDNA3, they will then be in a position to do their own DLSS 1-to-2 type transition and ramp up the quality.
All the FidelityFX effects are being used. Just look at the game support list, and it wasn't released long ago.
Come on, stop that BS about tensor cores. They are useless and inherited from Nvidia's Pro line. Better to ask them to add more compute units and leave the tensor cores' job to the compute units... but that would also make the older generations DLSS-compatible.
What is the connection between temporal data and tensor cores? UE5 proves that you can have good TSR with no dedicated hardware and no ML (a toy version of the core idea is sketched below).
And to be honest, I am beginning to doubt that Nvidia is doing any ML; you wouldn't have ghosting effects if the machine knew what a car looks like. You don't need temporal data for that.
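For readers who want the mechanics behind the "TSR with no dedicated hardware and no ML" point: the core of such techniques is temporal accumulation with a neighbourhood clamp. The CUDA kernel below is a minimal single-channel toy with assumed names, not UE5's actual TSR; it skips motion-vector reprojection, jitter, and the upsampling step, but it shows the blend-and-clamp that rejects stale history (a clamp that is too loose is exactly where ghosting comes from):

```cpp
#include <cuda_runtime.h>

// Toy temporal accumulation: blend this frame with (already reprojected)
// history, after clamping history to the 3x3 neighbourhood of the
// current frame so stale samples cannot smear into ghosts.
__global__ void temporal_accumulate(const float* current,  // this frame
                                    const float* history,  // reprojected prev
                                    float* output,
                                    int width, int height,
                                    float alpha)            // e.g. 0.1f
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;
    int idx = y * width + x;

    // Min/max of the 3x3 neighbourhood in the current frame.
    float lo = 1e30f, hi = -1e30f;
    for (int dy = -1; dy <= 1; ++dy)
        for (int dx = -1; dx <= 1; ++dx) {
            int nx = min(max(x + dx, 0), width - 1);
            int ny = min(max(y + dy, 0), height - 1);
            float v = current[ny * width + nx];
            lo = fminf(lo, v);
            hi = fmaxf(hi, v);
        }

    // Clamp history into that range, then exponentially blend.
    float h = fminf(fmaxf(history[idx], lo), hi);
    output[idx] = alpha * current[idx] + (1.0f - alpha) * h;
}
```

All of this runs on ordinary compute units; nothing here needs matrix hardware, which is the point being argued.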
List of developers and games using effects from GPUOpen. Click me.
Ehh, tensor cores provide extremely efficient matrix operations, as well as hardware support for FP16 ops, etc.
It is not a choice between tensor cores or compute units; you can have both. The tensor cores take up about 4% of the die area, which is tiny. Simply adding compute units in place of the tensor cores doesn't automatically lead to any increased performance if those CUs are not properly utilized, which always tends to be a problem (see the WMMA sketch after this post).
I never said there was a connection between tensor cores and temporal data. I merely speculated that AMD might not be able to have a non-linear image reconstruction technique that can fully exploit the additional data and model complexity of added temporal data without significant computational costs that defeat the purpose.
And yes, you can do a lot with temporal data and no ML, as UE5 shows. That is why it is disappointing to see that FSR currently doesn't appear to leverage temporal data.
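To put the tensor-core disagreement in concrete terms, here is a minimal CUDA sketch of the warp-level WMMA API those units expose. It assumes an sm_70+ GPU, and the 16x16x16 FP16 tile with row-major layout is just the simplest textbook configuration, not anyone's production code. The single mma_sync call issues an entire 16x16x16 multiply-accumulate per warp, which general-purpose lanes would have to grind through as hundreds of scalar FMAs:

```cpp
#include <mma.h>
#include <cuda_fp16.h>
using namespace nvcuda;

// One warp computes C(16x16, FP32) += A(16x16, FP16) * B(16x16, FP16).
// Launch with exactly one warp: wmma_tile<<<1, 32>>>(dA, dB, dC);
__global__ void wmma_tile(const half* a, const half* b, float* c)
{
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> fa;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::row_major> fb;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> fc;

    wmma::fill_fragment(fc, 0.0f);         // C = 0
    wmma::load_matrix_sync(fa, a, 16);     // load A tile (leading dim 16)
    wmma::load_matrix_sync(fb, b, 16);     // load B tile
    wmma::mma_sync(fc, fa, fb, fc);        // C += A * B on tensor cores
    wmma::store_matrix_sync(c, fc, 16, wmma::mem_row_major);
}
```

This is also why "a compute unit will do them as fast" (see below) doesn't hold: the dedicated MMA path retires a whole tile per instruction, rather than one FMA per lane per cycle.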
Matrix operations are matrix operations. A compute unit will do them just as fast, as long as it supports the instructions.
Stopped right there, thanks for the laugh
You know, ever since MMX instructions they have all been matrix operations. And you don't need separate hardware; you just use the FPU for all of them.
Enjoy your "dedicated hardware"?
Beat me to it!