
New GPU for MSFS

Associate · Joined: 22 Jul 2009 · Posts: 59 · Location: United Kingdom
Hello,

Looking for some advice please. I mainly play Microsoft Flight Simulator (both 2020 and 2024) and I'm looking to upgrade my graphics card. I currently have a 10GB 3080, so I'm looking to upgrade to something "newer".

I've been looking at AMD due to the insane pricing Nvidia now has, and the more recent driver issues don't look great either. MSFS does like to use VRAM, so I was wondering if it's still worth picking up a 7900 XTX in 2025 for the large amount of VRAM that card comes with, or whether to get a newer generation card (9070 XT) for the newer tech such as FSR4 support (MSFS 2024 currently uses FSR 3.1, I believe, with FSR4 support promised).

Either card would give me additional VRAM over my current 3080.

In terms of CPU I have a 9800X3D with 64GB of DDR5 RAM, and I run at 1440p.

Any advice is appreciated, thanks!
 
MSFS does like to use VRAM so I was wondering if it's still worth picking up a 7900 XTX in 2025, or a newer gen (9070 XT)...
I wouldn't want to get a 7900 XTX, unless it gets proper FSR4 support, which unfortunately seems unlikely at the moment.

There's a rumour that a 5080 Super is going to be released with 24GB of VRAM; that would be a card worth upgrading to.

9070 XT isn't really something I consider intended for 3080 owners, but it is ideal for upgrades from older cards (like a 2060 or 5700 XT).
 
I'd suggest a 9070 XT, I think; you'll have more future-proofing with an FSR4-capable card. If you want to stay green team, you can get 5070 Tis for RRP now if you're patient, but as you say, drivers have been problematic for months.
 
I play flight sims a lot, and you want something with frame gen: MSFS really benefits from it, with few of the downsides you get in faster-paced games. Anything with 16GB of VRAM will be fine.
 
9070 XT isn't really something I consider intended for 3080 owners, but it is ideal for upgrades from older cards (like a 2060 or 5700 XT).

This.

It's a pretty horrendous time for those on 3080 or similar performance levels to upgrade tbh.

Ideally I'd want something around the 5080 minimum in terms of performance, but certainly not at a grand. The 5070 Ti should have been firmly in that performance bracket for £600-700 as far as I'm concerned; the entire current Nvidia line-up is a joke unless you're up for buying the flagship every gen.
 
I don't get the 5080 argument. At 1440p, which is their requested resolution, the 5080 is 16% better than the 9070 XT, which in turn is 45% better than the 3080.

If you're struggling at 40 FPS, another 45% performance will get you to 60.

The 5080 in its cheapest available guise, at £980, is 45% more expensive than the 9070 XT, for 16% more FPS.
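For anyone who wants to sanity-check those figures, chaining the percentages (the post's own estimates, with £980 as the quoted 5080 price, not benchmarks of mine) works out roughly like this:

```python
# Chained uplift figures from the post above (estimates, not benchmarks).
base_3080 = 40.0                   # example: struggling at 40 FPS on a 3080
fps_9070xt = base_3080 * 1.45      # 9070 XT quoted as ~45% faster at 1440p
fps_5080 = fps_9070xt * 1.16       # 5080 quoted as ~16% faster again

price_5080 = 980                   # cheapest available 5080, per the post
price_9070xt = price_5080 / 1.45   # 5080 quoted as 45% dearer than 9070 XT

print(f"9070 XT: ~{fps_9070xt:.0f} FPS for ~£{price_9070xt:.0f}")
print(f"5080:    ~{fps_5080:.0f} FPS for £{price_5080}")
```

So under those assumptions the 9070 XT lands almost exactly on 60 FPS from a 40 FPS baseline, while the 5080's extra 16% costs roughly £300 more.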

 
I don't get the 5080 argument...


Because of Multi Frame Gen. And if someone can get good performance from Lossless Scaling, that's a cheap solution too.
 
I don't get the 5080 argument.

On my end it isn't an argument for a 5080, but rather about what counts as acceptable uplift from a 3080 and how much that costs five years after the card launched.

A 45-60% uplift from a half-decade-old high-tier card to a mid-range AMD card or Nvidia's second-highest tier is poor, but when you account for the prices involved (a grand for the 5080) it becomes absolutely shocking. There's only so far the inflation argument can go; it's flat-out god-awful value.

I'm sure "tariffs" will be the next excuse for god-awful price/performance.

The bloody 5060 Ti should be on the level of the 3080 by now as far as I'm concerned, and that's without MFG etc.
 
On my end it isn't an argument for a 5080, but rather acceptable uplift from a 3080 and how much that costs five years after the card launched...

The reason for all of this is......

Because of Multi frame gen, if someone can get good performance from lossless scaling, that's a cheap solution also.

There we go, you see, fake frames make it all worth it.
 
There we go, you see, fake frames make it all worth it.

The sad thing is, I'm not even necessarily against AI being used to help GPU performance. What I'm against is that they're pushing it too hard and too early for where it's at. I don't hate DLSS at 4K, for example, but I still think it needed a little more time in the oven.

Game engines like UE5 are near-reliant on the tech to boot, and until recently AMD's FSR was miles behind, to the point it seriously put me off going AMD even though they'd have been a better choice in almost every other respect.

MFG is an ungodly mess and not competent at any level in my mind; I do not care about it and can't even give it a semi-pass. This stuff is years too soon and we're paying a premium to be a testbed. On top of that we've got the atrocious forced 12VHPWR issues, which are still ongoing, and the worst drivers we've seen in a very long time from any consumer graphics company, not to mention missing ROPs.

I actually intended to upgrade this gen from my 4070 series, and I'm playing my backlog instead; with the summer I'll be out and about a lot more, so bugger Nvidia. I can't be too harsh toward AMD this round, but there isn't enough uplift there for me either. I feel genuinely sorry for people who want to upgrade to enjoy the hobby, especially those for whom it's their main one.
 
Oh I do hate it, I hate all of it. With Nvidia, and out of necessity AMD following close behind, you don't buy a GPU to render frames anymore, not as such; that's not really what you're paying for now. What you're paying for when "purchasing an upgrade" is re-branded temporal anti-aliasing and frame interpolation, which under normal circumstances are just the latest versions of decades-old image-enhancement technologies.

Why just have that when you can repackage it under marketable branding and sell it as its own complementary thing? That way you can sell GPUs that are barely different from previous generations while still claiming them to be something new, exciting and an upgrade in performance.

Why would consumers allow this? Consumers don't think like this. They might complain about the raw numbers, but that's quickly forgotten when they realise the other team doesn't fake it as well, and with that they immediately feel good about their purchase; the other guy is now forced to get good at faking it.
 
Why would consumers allow this? Consumers don't think like this...

It's comparable to the mindshare and marketing of the late 90's and early-mid 00's. Remember Intel vs AMD? AMD's Athlon Thunderbird range was faster than the P3 unless the Intel chip was running Rambus memory, which was incredibly expensive compared to regular SDRAM. Then we move on to the Athlon XP vs P4 era: early P4s were pigs that ran like crap compared to AMD, and it wasn't until Northwood that Intel really gained some footing. The same story continued into the A64 era. It hardly mattered; people bought Intel because it was in the public eye and Intel had money to burn on marketing, and on more backhanded things, to be frank.

Most gamers are not tech enthusiasts, they just aren't. Taking away that mindshare is very difficult; AMD managed it with Intel on the CPU front, but it took stagnation and complacency on Intel's part before it happened. You cannot look at places like this for the average view, and even on dedicated old-school tech forums you'll find hard bias toward brands/companies. Go to gaming forums where people buy pre-builds, or perhaps dabble in upgrades, and it's nothing but nonsense takes: people screaming about AMD being terrible no matter the era. This includes multiple times when Nvidia was objectively worse in hardware and software too, and in some cases was even releasing drivers that killed cards.

All companies do good and bad, but Nvidia has free rein to do whatever they want with the industry, and the only thing anyone else can do is follow at this point. Game engines are built around their ideas; they're not just the industry leader, they're so far ahead it's not even The Tortoise and the Hare anymore. We can all complain and hate it as much as we want, but Bob the Steam-forum/Reddit pleb read that AMD sucks, so he's happily spending £400-500 on a 5060 Ti and running all the dodgy tech it involves, DLSS/MFG etc., and he's happy with it on the horrendously poor-quality 1080p or 1440p monitor he got two years prior with a dodgy prebuild.
 
It's comparable to the mindshare and marketing of the late 90's and early-mid 00's...

You might be surprised....



Guess what Intel started doing around mid 2005? :D

OK, so I am one of those people who was pretty down on AMD's ability to take significant chunks of Nvidia's market share.
I take that back. The 9070 XT is in reality the same price as the 5070 Ti; in value for money it actually stacks up less favourably against Nvidia than RDNA 1, 2 and 3 did, and yet despite this it's doing better now than those previous generations. Just imagine how they would be doing if they were in continuous stock at under £600.

I think it's because they got glowing reviews, and they got glowing reviews primarily because AMD are better able to fake it now than they have been in the past.

So yes, I believe AMD can do much better against Nvidia in time; all they have to do is get on the same level of faking it.

------------------------------------------------

Just a question: does anyone actually use frame generation in their day-to-day gaming? Does anyone actually play their games in such a way where, with DLSS 4 and FSR / AFMF, only 1 in every 4 frames is real?

To put that into context, at 100 FPS that's 30 ms of added latency; 30 ms on its own is equivalent to a 33 FPS frame time, and the latency of the real frames comes on top of that. Why don't you play your games at 30 FPS? Is it more because it looks laggy, or because it feels laggy?
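That arithmetic can be sketched as a toy model (my own simplification, using the post's numbers, not a measurement of any real implementation):

```python
# Toy model of the frame-gen latency arithmetic: with x4 frame generation
# producing 100 FPS, only 1 displayed frame in 4 is actually rendered.
output_fps = 100
factor = 4                                        # x4 MFG: 1 real frame in 4
frame_time_ms = 1000 / output_fps                 # 10 ms per displayed frame
added_latency_ms = (factor - 1) * frame_time_ms   # 3 generated frames = 30 ms
equivalent_fps = 1000 / added_latency_ms          # 30 ms alone = a 33 FPS frame time

print(f"added latency: {added_latency_ms:.0f} ms (a {equivalent_fps:.0f} FPS frame time)")
```

The real frames' own render latency then sits on top of that 30 ms, which is the basis of the "why not just play at 30 FPS" comparison.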
 
Just a question: does anyone actually use frame generation in their day-to-day gaming?

I used it (original frame gen) when playing Robocop: Rogue City, which is a UE5 game.

It did feel smoother, although I'm sure there was a certain amount of added latency. That said, it's also single-player, so it's something you can adapt to. I have tried MFG and it felt horrific, and I can't see how either could ever be viable in any form of online gaming.

I've not used it since in anything, take that as you will. :cry:

You ideally need to be pushing decent frames already to benefit from frame gen, which begs the question of how useful it really is. It won't magically fix a card that's struggling; it'll just (potentially) help already-decent cards out in niche scenarios.
 
I used it (original frame gen) when playing Robocop: Rogue City... I've not used it since in anything.
When for every fake frame there is a real one it's not so bad, thanks to frame warping (Nvidia Reflex and the AMD equivalent), which is a method of synchronising the CPU and GPU so the CPU can communicate an input before the frame is rendered. You're actually only out by one frame on your input, which will feel pretty normal; used without frame gen it's kind of a "just-in-time input-to-frame-out" system. The problem with MFG is that this doesn't work, because it's stacking up three frames that are not real and therefore carry no input.
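The "only out by one frame" point can be illustrated with a toy input-staleness model (my own simplification of the argument above, not how Reflex or any real driver is implemented):

```python
# Toy model: inputs are only sampled on real frames, so the last generated
# frame before the next real one shows the stalest input.
def worst_case_input_age_ms(output_fps: int, factor: int) -> float:
    """Worst-case age (ms) of the input visible on screen when only
    1 frame in `factor` is real at the given output frame rate."""
    frame_ms = 1000 / output_fps
    return factor * frame_ms  # one full real-frame interval per real input

print(worst_case_input_age_ms(100, 2))  # 2x FG at 100 FPS: 20.0 ms
print(worst_case_input_age_ms(100, 4))  # 4x MFG at 100 FPS: 40.0 ms
```

Under this model x2 keeps input one displayed frame behind at worst, while x4 lets it go a full real-frame interval stale, which matches the complaint about stacked generated frames.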
 
When for every fake frame there is a real one it's not so bad...

I enjoyed the added smoothness in that one game, which honestly required it at the time (patches may have improved things), and being single-player I didn't really notice the latency differences that were likely there.

That's one UE5 game I've used it in; there are others where I absolutely didn't and wouldn't. Remnant II, for example, is on the same engine and is what I think many would describe as a Souls-like (-lite?) experience. I would not want my timing messed up in a Souls-style game; in fact I'd rather play at lower FPS in that scenario.

Anyhow, I think I should stop ranting about the subject in here, as it's some poor sod looking for upgrade advice.
 
I've used MFG x4 in:

Oblivion remastered
CP2077
Stalker 2
Indiana Jones
Half Life 2 RTX

It's been fine in all of them (as you can see, they are all single-player experiences), plus I guess it helps that I have higher base frame rates to start from, which makes the performance feel as good as it does with my setup.

Would I prefer native frames? Of course. But MFG works well when needed to run at fully maxed-out settings.

As the OP wants optimal performance for MSFS, that is the best solution, and a 5070 Ti may be good enough to get the performance he wants for a price close enough to a 9070 XT's.
 
Yep, frame gen works great in MSFS because it's slow-paced enough to get away with it and is normally CPU-limited, hence the recommendation of something with 16GB of VRAM and frame gen. I'd avoid DLSS in this game as it still(!) makes glass panels blurry.
 