***The Official Starfield Thread*** (As endorsed by TNA)

My 3900X and 3070 Ti are getting hammered, but I'm really liking the game. Only about 8 hours in, a mix of story/quests, and the mix is very interesting.
HUB's done some optimization guides that may (hopefully!) help:

 
I'd really like to see some actual controlled benches with RAM speed (i.e. a 13900 tested with varying speeds of DDR4 and DDR5).

Same...

I'm almost tempted to test mine at the default 5600MHz and then overclocked to 6200MHz and see the fps difference.
 
I'd really like to see some actual controlled benches with RAM speed (i.e. a 13900 tested with varying speeds of DDR4 and DDR5).
Yeah, that'd be interesting - someone should tweet (X?) at HUB or Gamers Nexus - sounds right up their street.
 
So for Starfield, buy faster memory for more performance, it seems. Makes me happy that I overclocked my RAM at this point :eek:
It will be interesting to see to what extent memory speed factors into this on the same CPU. The computerbase table lists all the CPUs with different RAM configs, so it's not a fair comparison in that sense.

I have DDR4 3600 CL18, for example, and am still getting high fps even in busy areas. If I got faster memory, say DDR4-4400, would I also see the big gains in the 1% lows that the computerbase table suggests? Obviously this would only apply to Starfield, as the engine appears to favour memory speed.

For me I think I slot into a hybrid situation: a 12th-gen CPU with DDR4 running at 3600MHz, but a GPU powerful enough to just brute-force its way to high framerates in a game that seems to demand memory bandwidth. Obviously RDNA cards are performing better than equivalent Nvidia cards, but CPU usage is going to be tied directly to memory bandwidth.
 
Ryzen 2700X and 7900 XT here; performance is good (as expected and shown by numerous sources).

As for misinformation, well, we can only go by what we see and what is reported, and as it stands everything points towards good AMD GPU performance, sub-par Nvidia performance, and memory bandwidth as the only significant influence on performance.

Which is understandable if the game was developed console-first; the Xbox Series X has around 560GB/s of memory bandwidth (on its fast 10GB pool).

A 7800X3D or an Intel 13900K has about 90GB/s, with the AMD CPU further hampered by an Infinity Fabric link that's stuck at about 64GB/s.
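For anyone wanting to sanity-check those numbers, here's a quick back-of-envelope sketch (purely illustrative; assumes dual channel and an 8-byte bus per channel, i.e. peak theoretical figures, not measured ones):

```python
def ddr_bandwidth_gbs(mt_per_s: int, channels: int = 2, bus_bytes: int = 8) -> float:
    """Peak theoretical DRAM bandwidth in GB/s.

    mt_per_s:  transfer rate in MT/s (e.g. 5600 for DDR5-5600)
    channels:  number of memory channels (2 for a typical desktop)
    bus_bytes: bus width per channel in bytes (64-bit = 8 bytes)
    """
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

print(ddr_bandwidth_gbs(5600))  # DDR5-5600 dual channel -> 89.6 GB/s
print(ddr_bandwidth_gbs(6200))  # DDR5-6200 dual channel -> 99.2 GB/s
print(ddr_bandwidth_gbs(3600))  # DDR4-3600 dual channel -> 57.6 GB/s
```

So DDR5-5600 dual channel works out to roughly the ~90GB/s quoted above, and a 5600-to-6200 overclock is only about a 10% bandwidth bump on paper.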
Nice.
I'd rather just try it for free on Steam and get a refund within the time period should it not run well for you, or should you not like it.

Versus being put off by the highly clickbaity moan-fest of attention-seeking YouTubers?
 
Performance is all over the place: in a good few areas I can get 100+ fps, but in an empty wasteland I get 50s, in that old western-style town (can't think of the name) it's 50s too, yet in Neon:

[screenshot: 6mFiUMJ.jpg]


Also, another good example of NPC behaviour: all the bad guys couldn't figure out how to open a door; that's their lasers shining through :cry:

[screenshot: Lt9XFTwh.png]



Really do think this game was originally intended for last-gen consoles, given things like VRAM usage being so low and so on. Probably a good thing it never came out there, as it would have been like Cyberpunk 2077 on the old-gen consoles at release :cry:

Last night, with nothing much around, I had 80%+ load on the CPU in New Atlantis, practically watching an empty park. Crazy.
 
It will be interesting to see to what extent memory speed factors into this on the same CPU. The computerbase table lists all the CPUs with different RAM configs, so it's not a fair comparison in that sense.

I have DDR4 3600 CL18, for example, and am still getting high fps even in busy areas. If I got faster memory, say DDR4-4400, would I also see the big gains in the 1% lows that the computerbase table suggests? Obviously this would only apply to Starfield, as the engine appears to favour memory speed.

For me I think I slot into a hybrid situation: a 12th-gen CPU with DDR4 running at 3600MHz, but a GPU powerful enough to just brute-force its way to high framerates in a game that seems to demand memory bandwidth. Obviously RDNA cards are performing better than equivalent Nvidia cards, but CPU usage is going to be tied directly to memory bandwidth.

Shame my board only supports up to 6400MHz overclocked, but I have seen people get more with a 13th-gen CPU. I'll give it a go when I upgrade to 14th gen and test out some different RAM speeds.
 
Playing this now with the DLSS 3 frame generation mod and getting framerates of 150fps to 230fps. The game has been perfect so far and I'm really enjoying it.

No wonder AMD blocked Nvidia from officially adding it at launch. With their inferior implementation and not even a release date for FSR3, that would've been very embarrassing :cry:
 
If it was one big open world, it would be somewhat impressive, but it's not. It's a series of separate locations linked by loading screens... Technically it's just not very impressive: each location is its own thing, some run quite well and look very good, while some of the larger NPC hubs look bland and run like crap.

The larger NPC hubs just about stayed above 60fps with 55% scaling at 3440x1440, but there was noticeable artifacting with FSR.
 
If it was one big open world, it would be somewhat impressive, but it's not. It's a series of separate locations linked by loading screens... Technically it's just not very impressive: each location is its own thing, some run quite well and look very good, while some of the larger NPC hubs look bland and run like crap.

The larger NPC hubs just about stayed above 60fps with 55% scaling at 3440x1440, but there was noticeable artifacting with FSR.

That's where you're going wrong. DLSS is where it's at :p
 
That's where you're going wrong. DLSS is where it's at :p
As funny as this is, it's actually true lol.

Was just chatting with a workmate on Skype while WFH; he mentioned he has the premium edition of the game but only just got good internet installed in his new house, so he's looking forward to playing it. He has a 2080 Ti and asked how it runs :o

Needless to say I did mention the various mods for DLSS etc so hopefully he is able to get some performance out of it using them.
 
As funny as this is, it's actually true lol.

Was just chatting with a workmate on Skype while WFH; he mentioned he has the premium edition of the game but only just got good internet installed in his new house, so he's looking forward to playing it. He has a 2080 Ti and asked how it runs :o

Needless to say I did mention the various mods for DLSS etc so hopefully he is able to get some performance out of it using them.

Yeah man. I give up on FSR now. It just can't match DLSS, unfortunately.

If AMD can't make sure a great version of it ships in a huge title like this, one they sponsored, then what hope is there?

It is understandable, to be fair; one uses dedicated hardware, the other does not.

Roll on FSR 3. Let's see what that's like :)
 
I'm still enjoying it, but the more I play, the more the loading screens and the pointless areas that serve no purpose are getting a bit tiresome. Tbh I think The Outer Worlds was a better game.

It really lacks that grip Fallout and Skyrim had for me. I'm not hooked and have found myself playing other games in between, like Baldur's Gate.
The Outer Worlds seemed to have been an alpha/beta version of Starfield imo
 
As funny as this is, it's actually true lol.

Was just chatting with a workmate on Skype while WFH; he mentioned he has the premium edition of the game but only just got good internet installed in his new house, so he's looking forward to playing it. He has a 2080 Ti and asked how it runs :o

Needless to say I did mention the various mods for DLSS etc so hopefully he is able to get some performance out of it using them.
This game runs decently on a Steam Deck btw.

It's not really a demanding game, considering it's using a 20-year-old engine and everything is behind loading screens.
 