
AMD's FSR3 possibly next month?

So you are basically saying that fake frames are for console-playing peasants then!

Well, "fake visuals" was what PCMR used to say about consoles and their tricks. Now it seems PCMR opinion has turned 180°, and the masters with expensive GPUs get to experience real elite fakery!

I'd say FG is useful in scenarios like MS Flight Sim, where even with the highest-end CPU you are going to run into low GPU usage (a CPU bottleneck) more often than not. FG definitely helps smooth it out.
 
Has AMD confirmed FSR 3 is frame gen?
There was talk of DLSS 3 and FSR 3 before frame gen was announced, and it took a lot of people by surprise, IIRC.

According to this post, which goes into what AMD showed/talked about, it's either frame interpolation, generated frames, or they found a genie to grant them a wish of supercharging FSR.

In an Unreal Engine 5 demo, a Radeon RX 7000 GPU hits 60 FPS with FSR 2.0; however, once a new technology called FSR 3.0 is engaged, performance increases to 112 FPS. According to AMD, this new upscaling technology will improve performance by up to 2 times.
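A quick sanity check on those quoted figures (just the arithmetic on the numbers reported above, nothing more):

```python
# Sanity check on AMD's quoted UE5 demo figures: FSR 2 at 60 FPS,
# FSR 3 at 112 FPS, marketed as "up to 2x" performance.
fsr2_fps = 60
fsr3_fps = 112
speedup = fsr3_fps / fsr2_fps
print(f"speedup: {speedup:.2f}x")  # 1.87x, consistent with "up to 2x"
```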

 
Has AMD confirmed FSR 3 is frame gen?
From a PCGamer interview:

Not much was said on the specifics regarding FSR 3 during AMD's RX 7900 XTX and RX 7900 XT announcement stream, but we know it will incorporate what AMD's dubbed Fluid Motion Frames technology. According to Radeon GM Scott Herkelman, this may be in some way similar, at least in purpose if not in how it actually works, to what Nvidia calls DLSS Frame Generation. That technology infers information from motion vectors and rendered frames to generate entirely new frames and dramatically improve FPS in supported games.

"It's frame generation, that's for sure," Herkelman says during a Q&A.
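For anyone wondering what "infers information from motion vectors and rendered frames" boils down to, here is a toy sketch of naive frame interpolation. This is pure illustration under made-up names; the real DLSS FG and FSR 3 algorithms are far more sophisticated than a blend of two frames.

```python
import numpy as np

def interpolate_frame(prev_frame: np.ndarray, next_frame: np.ndarray,
                      t: float = 0.5) -> np.ndarray:
    """Naively blend two real frames to fake a frame at time t in (0, 1).

    Real frame generation warps pixels along motion vectors instead of
    blending in place; this only shows the basic in-betweening idea.
    """
    return (1.0 - t) * prev_frame + t * next_frame

# Two tiny 2x2 greyscale "frames": dark, then bright
prev_frame = np.zeros((2, 2), dtype=np.float32)
next_frame = np.full((2, 2), 100.0, dtype=np.float32)

generated = interpolate_frame(prev_frame, next_frame)
print(generated)  # every pixel is 50.0: halfway between the real frames
```

The obvious catch is that `next_frame` has to exist before the in-between frame can be shown, which is where the latency complaints come from.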

 
Fake frames war now!

 
Once AMD's version is out, fake frames will all of a sudden be championed... :D

Nah, we can have endless screenshots of people trying to spin that one company's fake frames are better than the other's, just like all the upscaling wars. Then all the things such as latency, etc. will be forgotten, just as ICDP said:

I heard AMD's version can do 4X the fake frames Nvidia can do:

I expect all the Nvidia guys who suddenly thought latency isn't a problem will then suddenly find latency is an issue, so 1X fake frames is so much better. That is until Nvidia does 5X fake frames, and suddenly it's buttery smooth and better than AMD, who only does 4X. Intel will then do 7X fake frames, and nobody will care... because Intel and no marketing budget! :cry:

KompuKare summed it up so well:

Well, fake visuals was what PCMR use to say about consoles and their tricks. Now it seems PCMR opinion has turned 180° and the masters with expensive GPUs get to experience real elite fakery!

Also, to add, TVs did frame insertion too. PC is so advanced it's like Back to the Future. The OLD is the new NEW!!


Back-to-the-GPU-future.jpg
 
If you'd just sit down at a PC and play without knowing about those "fake frames", you'd most likely not even notice. :)
 
If you'd just sit down at a PC and play without knowing about those "fake frames", you'd most likely not even notice. :)
People do notice it because of the latency.
ICDP explained it, and so have many others. Would people want to use this in a competitive, ranked FPS? If the prediction algorithm goes off, then your aim will be off! It is the same as when people said LCDs felt off compared to CRTs for years. Most said it was imagined. Then it was shown display latency was worse. Then we had AMD and Nvidia introducing anti-lag tech recently.
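To put rough numbers on that latency point (back-of-the-envelope arithmetic with assumed figures, not measurements of any actual game or driver):

```python
# Back-of-the-envelope latency cost of frame interpolation: to generate
# an in-between frame, the *next* real frame must already be rendered,
# so the pipeline holds frames back by roughly one real frame interval.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

real_fps = 60.0                       # what the GPU actually renders
displayed_fps = real_fps * 2          # one generated frame per real frame
extra_latency_ms = frame_time_ms(real_fps)  # roughly one real frame of delay

print(f"{displayed_fps:.0f} FPS shown, but ~{extra_latency_ms:.1f} ms extra latency")
```

So the counter shows double the frames, while the input feel corresponds to the original render rate plus that buffering delay, which is why the "smoothness vs. latency" argument keeps coming up.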

Just like when people said frametimes and frame pacing were not important, just average FPS. Have people forgotten when CrossFire and SLI were a thing? Nvidia went to great lengths to prove average FPS didn't matter, just frametimes, etc.

Now we have gone full circle: latency isn't important, frametimes are not important, lag isn't important, image quality isn't important; only average FPS is.

The reality is fake frames are just there for companies to sell you less for more, and PCMR falls for it. Plus, fake frame technology increases VRAM usage. Just keep VRAM amounts tight, and you can force an upgrade a bit earlier.

Hence why Nvidia was trying to push the RTX 4060 Ti performance figures using frame insertion. AMD will do the same. Less for more. More profits. This is the way.

I am sure when consoles do fake frames, suddenly the magnifying glasses will come out! PCMR suddenly cares about FPS, frametimes and quality of upscaling then. PCMR goes on about how consoles suck based on technical measurements, but let's not get too technical on PC, right?

If people were to sit down and just play, many would find a console at 30 to 60 FPS perfectly fine instead of a PC. Even games on their phones, which make more money than PC games.

console_pc_gaming_meme_21.jpg
:p
 
I’ve just about had enough of both companies’ shenanigans. I think my 7900 XT will probably be my last upgrade on PC. Once my hardware becomes outdated and AAA games unplayable, I’ll migrate over to console and keep the PC as a work/internet/word-processing machine.
 
If you'd just sit down at a PC and play without knowing about those "fake frames", you'd most likely not even notice. :)

And yet people like myself DO notice. The point isn’t even whether it’s good or not, because even I concede it has its uses. No, the issue is that it is used as a USP for why the 40x0 range is awesome and worth the giant price hikes over AMD and Nvidia’s previous gen.

These tend to be the same people that proclaimed the previous-gen 30x0 range was perfect for RT and that power consumption never mattered anyway. Now that AMD doesn’t do fake frames and is on par in RT with the 30x0 range, that level of RT suddenly becomes crap and was not enough in the first place. The same for power consumption once again being “a thing”, but only because Nvidia is perceived as being better.

It’s the simple fact that if fake frames are the only perceived advantage of Nvidia’s very expensive GPUs below the 4090, well then most of you really have been well and truly assimilated into Jensen’s mindless puppets. The fact some of them (not you, to be fair) see any anti-fake-frame posts as actually anti-Nvidia is proof of this.

I am not looking forward to fake frames from FSR 3, but at least if it is open to all GPUs I will concede that is better than not having it. I have it on my 4080 in a handful of games and find it meh. It is most certainly not a game-changing technology the way VRR is.
 

YES, this started with DLSS / FSR.

DLSS upscaling is not the same as native, but it's better than FSR, so better than AMD, and that has made it not just acceptable but desired. To such an extent that tech journos are trying to create a massive hoopla about AMD "stealing" your DLSS for your $500 game-console GPU. Like the useful idiots that they are, they are once again unwittingly creating a massive marketing campaign for Nvidia. These people are so stupid you couldn't ####### make it up for a Monty Python sketch.

Jensen thinks these people are stupid; it's too easy to manipulate them, every ##### time.

The real story is Nvidia selling you fake frames and lower image quality in place of real performance, and at a massive premium markup, and not one of these useful idiots has realised it. Yes, AMD sees this stupidity too and of course they try it on, but it's not as good as DLSS, so.....
 
Some Patreon supporter of HUB, please send that to them, ffs......
 
It's obvious: Nvidia released Turing V1 with stagnant price/performance and tried to use DLSS 1.0 to upsell it. PCMR for years mocked consoles for being "HD capable" when they were using upscaling too, and called it a cheat. Now PC companies are cheating and it's all fine. It seems these technologies are "upscaling" people's expectations of subpar products very well! :p

But what do you expect, when Apple got away with making non-replaceable batteries a norm, non-upgradeable PCs a norm, etc. Nvidia knows what they are doing; you can't fault JHH, who has been an excellent CEO with a great grasp of social marketing going back 20 years. A rarity, TBH, in the tech world.
 
Imagine it was AMD who had introduced DLSS and was still the leader at it.

Would we be having the same conversation? Serious question.
 