
AMD FSR 3.0 has exposed the ugly truth about most PC gamers

I know many will disagree with me, but I just can't help seeing upscaling and frame generation technologies as an excuse for both sides to get lazy on actual hardware advancement. Why keep pushing the hardware when you can create software to "improve performance" instead?

Part of the reason they've had to resort to this is that "Moore's law is dead" is actually somewhat true: without these software "tricks", you wouldn't be getting much, if any, more performance than if they just focused on hardware. And as stated, one of the biggest reasons to look at software methods now is that it's a far more efficient approach in a number of ways than always trying to brute force max settings at true 4K with RT/PT etc. Without such software tricks, real-time ray/path tracing would still be a decade off, probably, so it's making the impossible possible today.

And again, I'm not sure why people think making software is cheap/free and easy to do? It really isn't, as shown by the heap of **** optimised games throughout 2023 that magically became better after patches.....

It's not like Nvidia and AMD are just putting in less effort/money; they both increased their R&D budgets massively for 2023:

[Charts: Nvidia and AMD R&D spending, 2023]
 
A video I watched about 3-4 years ago included a talk with a seasoned computer engineer about hardware needing to become obscenely powerful gen on gen to keep making impressive jumps; otherwise it would just be iterative improvements that have to rely on software tricks... and well... here we are: DLSS, Frame Generation, FSR, XeSS etc....

Yup, it's probably why their R&D has jumped considerably: they're having to find alternatives to keep the performance improvements going, otherwise no one is going to buy/upgrade to anything new.

Like I've always said before, it's the experience and end result I care about: if xyz can achieve what I deem as being good, then it's good with me; if it isn't, well, I simply won't use/buy said product/feature. Most people are paying for streaming services, which are arguably software services, and buying into smartphones for their "software", so why not do the same for other products you use?
 
I don't like seeing the term 'lazy' thrown around when it comes to game development because the devs themselves are always under constant pressure from their publishers in the form of budgets and deadlines - it *is* however becoming clear that upscaling is increasingly being used as a crutch to push the performance of console games into 'playable' (i.e. 30fps) territory.

Recent releases on the Series S have run as low as 540p internal resolution (not far shy of what the PS2 was putting out), and there are several examples of PS5 and Series X games that run as low as 720p (before upscaling) in their performance modes in an attempt to hit 60fps. The end result is a smeary image that lacks the clarity of consoles just a generation or two earlier.
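To put those internal resolutions in context, here's a minimal sketch of the arithmetic, using the per-axis scale factors AMD publishes for FSR 2's quality modes (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x); the function name is just for illustration:

```python
# Per-axis render-scale divisors matching AMD's published FSR 2 quality modes.
FSR2_MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(output_w, output_h, mode):
    """Resolution the GPU actually renders at before upscaling to the output."""
    factor = FSR2_MODES[mode]
    return round(output_w / factor), round(output_h / factor)

# A 1080p output in Performance mode renders internally at just 960x540 --
# the same ballpark as the Series S figures quoted above.
print(internal_resolution(1920, 1080, "Performance"))        # (960, 540)
print(internal_resolution(3840, 2160, "Ultra Performance"))  # (1280, 720)
```

So a "540p internal" Series S game is simply a 1080p target running FSR's most aggressive half-resolution scaling.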

Sadly, there really is no solution to this, other than to not buy these games because if you do, you're telling the publishers that image quality doesn't matter to you.

I suspect it won't be long before we see our first console game using FSR frame generation for its 'performance mode' whilst still actually running at 20-30fps - and if the game sells well enough, other devs will soon follow suit.

Yup agree on the whole except for image quality bit ;) :p

Development in general across all industries is pretty **** now tbh, especially with AI in the picture. It's always about getting to market faster each time, with no thought/care for building a well-QC'd product; the stakeholders just want to get it over the line and out there with no care for what they are delivering (obviously there are a few exceptions, but that's becoming rare in the day of "on demand"). It's very much "deal with the consequences after".

There are practices which seek to improve overall quality and speed up testing and delivery, generally what DevOps engineers aim to build out. The problem is that this is often poorly implemented in most companies, and if the software engineers complain that such processes/systems are slowing down delivery of xyz features (because the code isn't passing tests, meaning they have to spend more time resolving issues before they can deliver), stakeholders will want those processes removed in order to meet the deadline, without any care for why they were put there in the first place.

Optimisation/quality is always the last thing to get looked at; I can't see this ever changing until there's a shift in how companies operate and what they value.
 
Exactly. Who cares how. End result is what matters. If I personally like the end result I don't care how it is done.

That said, I really do not like the idea of having to have a live Internet connection for ai to do some of the work elsewhere. Can see Nvidia releasing something like this next. I want everything done locally on my pc, otherwise might as well get a virtual gpu like you did with a 4080 Nexus :p
Don't worry, it will still be done locally, as long as you have an Nvidia GPU :p

I do recall seeing something about Nvidia GPUs being used as hardware acceleration for Windows Copilot, though, so it will probably become a thing in games too for any AI-specific workloads.
 
Not sure what that has to do with what you said about "it's always about getting to the market faster each time with no thought/care for building a well QC product" but OK. :rolleyes:

e: I guess for some people everything is always about Nvidia vs AMD.

My post was clearly in response to @Aegis' post about "game development" in general and the problems with poor releases....

And then see the part where I said this:

Yup agree on the whole except for image quality bit ;) :p

Development in general across all industries is pretty **** now tbh, especially with AI in the picture. It's always about getting to market faster each time, with no thought/care for building a well-QC'd product; the stakeholders just want to get it over the line and out there with no care for what they are delivering (obviously there are a few exceptions, but that's becoming rare in the day of "on demand"). It's very much "deal with the consequences after". There are practices which seek to improve overall quality and speed up testing and delivery, generally what DevOps engineers aim to build out, but the problem is that this is often poorly implemented in most companies, and if the software engineers complain that such processes/systems are slowing down delivery of xyz features, stakeholders will want those processes removed in order to meet the deadline, without any care for why they were put there in the first place.

Optimisation/quality is always the last thing to get looked at; I can't see this ever changing until there's a shift in how companies operate and what they value.

You took part of my post:

it's always about getting to the market faster each time with no thought/care for building a well QC product

To respond with this (which was my comment on another discussion topic with a different member about software methods nvidia/amd/intel are using):

It's the experience and end result I care about, if xyz can achieve what I deem as being good then it's good with me. :p

So yes, in the context of getting a good end result with regards to gaming experience, Nvidia's software solutions address the problems with rushed releases in today's games, i.e. they bypass ****** optimised games and, as evidenced, also deliver better IQ than native + TAA, whereas the likes of AMD's solutions could themselves be filed under "rushed release" products, as evidenced with FSR, FreeSync etc. Given you have taken snippets of my posts to try and Frankenstein some narrative, I'm not really sure what you're trying to put across though.
 
So what you're saying is that you think "the experience and end result is all I care about; if xyz can achieve what I deem as being good then it's good with me" does not apply to "getting to the market faster each time with no thought/care for building a well QC product".

That building a well-QC'd product only applies if it's not "the experience and end result I care about; if xyz can achieve what I deem as being good then it's good with me"... talk about proving my point without even realising it. :cry:

Your logic is extremely flawed here, like the ridiculous Mona Lisa comparison; it's like saying people are trying to take Borderlands and force it to be photorealistic :cry:

I really don't get what you're trying to put across and it is just coming across like you're trying to start some petty argument tbh.

Companies can release fast to the market AND also have a well-QC'd product, as shown by Nvidia.... and by good games like Avatar (although it did get delayed multiple times). AMD just happen to be an example of not just poor QC but also delivering slowly.

Maybe next time, don't read too much into things and stop with the petty selective quoting to suit narratives, as you're clearly getting lost in your own narrative now; selective quoting has been warned about before by the mods, as it is classed as baiting/trolling.
 
I mean, I thought I was pretty clear in my edit, which I made because you're clearly struggling, but seeing as you're intent on being deliberately obtuse, I'll leave you to it.

What's clear is that you deliberately selectively quoted part of my posts to fit some narrative of yours for whatever reason. Read the stickied post at the top of this sub-forum.
 
Let's look at where you took this quote:

It's the experience and end result I care about, if xyz can achieve what I deem as being good then it's good with me.
:p

from, and what said quote was in reference to:

A video I watched about 3-4 years ago included a talk with a seasoned computer engineer about hardware needing to become obscenely powerful gen on gen to keep making impressive jumps; otherwise it would just be iterative improvements that have to rely on software tricks... and well... here we are: DLSS, Frame Generation, FSR, XeSS etc....
Yup, it's probably why their R&D has jumped considerably: they're having to find alternatives to keep the performance improvements going, otherwise no one is going to buy/upgrade to anything new.

Like I've always said before, it's the experience and end result I care about: if xyz can achieve what I deem as being good, then it's good with me; if it isn't, well, I simply won't use/buy said product/feature. Most people are paying for streaming services, which are arguably software services, and buying into smartphones for their "software", so why not do the same for other products you use?

Hence why I replied to that comment of yours/mine with:

Hence why nvidias dlss and frame gen is so much better than amds ;)

So yeah, about those subforum rules......

Stanners has posted before in threads warning people about it.
 
I am not a fan of fake frames. My experience is that they're useless below the kind of FPS that most people find playable anyway.

If you are getting 30 - 50ish FPS enabling fake frames gives noticeable lag and in my opinion it is unplayable.
If you are getting 50-60+ FPS enabling fake frames is essentially pointless if you have a VRR monitor.

For example I get 70 FPS average in MSFS and adding fake frames brings it to 120+ FPS, but it doesn't feel faster. It just feels pointless on my VRR monitor.

People with low FPS are the very people who can benefit from fake frames, but they aren't really getting any benefit IMHO.
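The trade-off being described can be sketched with some back-of-the-envelope arithmetic. This is illustrative only: it assumes the generator interpolates between two real frames, so it has to hold the current real frame back roughly one real frame time before display (actual DLSS 3 / FSR 3 pipelines differ in detail), and the function name is made up for the example:

```python
# Rough model of interpolation-based frame generation:
# displayed FPS doubles, but input latency grows by roughly
# one real frame time (the wait for the next real frame).

def frame_gen_effect(base_fps):
    real_frame_ms = 1000.0 / base_fps
    displayed_fps = base_fps * 2       # one generated frame per real frame
    added_latency_ms = real_frame_ms   # must wait for the next real frame
    return displayed_fps, real_frame_ms, added_latency_ms

for fps in (30, 50, 70):
    shown, ft, lag = frame_gen_effect(fps)
    print(f"{fps} fps base -> {shown} fps shown, ~+{lag:.0f} ms latency")
```

Under these assumptions, a 30 FPS base pays roughly +33 ms of latency for its smoothness, while a 70 FPS base pays only ~14 ms, which lines up with the complaint above: the people who most need the extra frames are the ones who feel the added lag the most.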

As someone with a G-Sync Ultimate 175Hz monitor, I'd say this isn't correct. Higher FPS provides more motion fluidity and thus a smoother image in play, which is one of the main reasons to have high FPS in the first place; even in Hogwarts, where the base FPS is already around 80, jumping to 130+ is noticeably better in this regard.

MSFS is not the kind of game where you'll see a huge benefit, due to its slow, sim-like nature.
 
In my opinion if you are getting 80FPS enabling fake frames does nothing substantial to add to your experience. It is already good and enabling fake frames simply makes it appear smoother.

Where fake frames should be a game changer is making slower GPUs useable, but it actually arguably makes things worse. If your old GPU is giving 30FPS you would imagine fake frames giving you nearly 60 FPS would be a literal and figurative “game changer”. You would not need to pay the silly money for an even mediocre GPU. For such people the trade off for those fake frames is increased lag and more visible graphical artefacts. The artefacts are masked more the higher the base FPS gets, at which point the entire reason for fake frames becomes moot IMHO.

So if your main argument for how great fake frames is boils down to, “it makes my already 80 FPS smooth RTX games look smoother”. Well good for you, but I would argue that for the majority, that’s like reducing tax for the rich and calling it a “boon for the economy”.

Edit: Also there are plenty of scenarios in flight sims where you need high response times and accuracy. Air racing and stunt flying etc. MSFS is not just about flying a big sedate bus with wings. So it’s a bit condescending to assume anyone playing MSFS won’t see any benefit in real (ie not fake) FPS improvements.

"appear smoother" - it is smoother and more fluid, which is exactly the reason you want higher fps.... this is not opinion, it's fact as backed up by several reputable sources so the claim of it provides nothing is just false. This is what makes frame gen good, yes it doesn't improve latency to make it feel like the real xyz FPS but if you're wanting a smoother and more fluid game then frame gen precisely achieves this. If you aren't sensitive to motion clarity, fluidness then yes, it may not be noticeable to you, personally for me and as shown, many others, it is, especially since I'm gaming on a QD-OLED screen.

Daniel Owen covers it well:


Agree on the base FPS point for lower GPUs. Having said that, based on my experience testing the FSR 3 mod on my 3080 in Ark Survival and CP2077, despite the base FPS being 40-50, it feels and plays a **** ton better with frame gen on than off. But it also seems to come down to the game or/and the mod/FSR 3: in Dying Light 2, for example, even with a base FPS of 70 it's awful, though I'm pretty sure that's down to FSR 3/the mod or/and the game, as motion blur/clarity takes a complete nosedive in that particular title. Input lag when your base FPS is 60+ is pretty much a non-issue "for me" on my display, but I do notice the latency when the base FPS is, say, 40; I'll take that if it means a smoother-looking game (as long as artifacting is not too bad/noticeable).

Also, Blur Busters' take on fake frames:

 