***The Official Starfield Thread*** (As endorsed by TNA)

There is simply no excuse for this level of performance.

The game looks fairly average at best. It's doing nothing graphically or technically that's new or pushing any boundaries.

When something like Crysis came out, pushing hardware to the limit was fine as it genuinely was a leap forward graphically. There are tons of better looking and performing games than this out right now.

If I had my tinfoil hat on, it just seems like an intentional effort to push GPU sales....

Crysis was massively CPU limited. People didn't realise it at the time, but as faster hardware came out it was bottlenecked by the CPU.

They tested it in the heaviest-usage part of the game, which is the forest section.

Wish people actually watched it. This was like Crysis 3, where people were saying it wasn't CPU heavy and a dual core was fine, but they hadn't reached the later section, which had tons of grass. Lots of reviews tested earlier parts. That section pushed a heavy multi-threaded load. Computerbase were the first to point it out, then DF showed it too.

Or the Geothermal Valley section in ROTTR, which was very taxing on CPUs and dGPUs unlike other parts of the game.
 
That's literally all YouTube is. People posting clickbait titles with their stupid faces on the thumbnail, trying to sell whatever they've been sponsored to push.

I wasn't having fun until I DISCOVERED this mod.
*Insert stupid expression* and a 20-minute video for 30 seconds of info.

Sarah can **** off too. Rest 1hr to fill health, "hey there, nice long rest hey"
THIS! But the shills will fight you for it and waste HUNDREDS of pages on threads arguing why their card is better, basing it all on their not having Nvidia/VRAM, blah blah this, blah blah that...
And this is coming from an AMD guy myself who's chosen recently to try a 4070 because of the featureset. As with cars, it doesn't matter who makes it: if it works and I like how it drives and looks, I'll buy it/modify it regardless of the maker...

But people seem to be insanely biased now, all based on stuff they haven't ever done themselves... Yet will argue till they're blue in the face that they're right/quote some pleb's review/chart...
 
It's well worth taking the dream home trait. All the YT vids making out like the cost is high are rubbish; it's not expensive and the benefits are huge.

The main benefit is landing right by it: you can walk over when over-encumbered (which blocks fast travel etc.) and drop your stuff at a storage chest, which is so convenient.

At 500 credits per week, the mortgage costs no more than the sale value of a fraction of the loot from a single instance, so the cost is nothing.

Even paying it off fully early on, at 125k credits, is easily doable but not necessary.
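Just to put those numbers in perspective, here's the trivial sum, taking the post's figures of 500 credits/week and a 125k lump-sum payoff at face value (I haven't verified them in-game):

```python
# Sanity check on the dream-home numbers quoted above. The 500 credits/week
# mortgage and the 125,000-credit early payoff are the figures from the post,
# not verified in-game.

weekly_payment = 500
lump_sum_payoff = 125_000

# How many weekly payments would it take to equal the lump sum?
weeks = lump_sum_payoff // weekly_payment
print(weeks)  # 250 weeks of payments before the mortgage matches paying outright
```

In other words, you'd need roughly 250 in-game weeks of payments before the weekly route costs as much as the lump sum, which is why the weekly cost is a non-issue.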

I've not visited my house yet but glad I picked it. I figured that in games like this you always end up with more money than you know what to do with eventually anyway.
 

Even funnier when an adult makes a decision based on what some twonk on YouTube says.
Things that are based on fact or stats I get, but come on, even my 13-year-old nephew has learned critical thinking by now.
 
Is there a good reason to build a base/home in this, other than for storage I mean, or can everything in a base/home be done on your ship instead? I remember the "base" building in Fallout being rather pointless in the grand scheme of things, so hopefully it's something that's more useful than some chests and workbenches like in Fallout

You can fully decorate it how you choose (you place/rotate/align each item), plus all the equipment, crafting facilities, bounty clearance terminal, mission terminal etc.

 
HUB published an optimisation guide yesterday so I don't think they are pushing people to buy new hardware:

So you can get some decent improvements by tweaking settings, and they pointed out it was the most taxing area they could find.

But it doesn't change the fact that if we had a normal generational uplift, the RTX4070 would have been closer to a £400 RTX4060TI, and the RX7800XT would be close to that price as an RX7700XT. The RTX4070TI would be a £600 RTX4070 and the RX7900XT would be an RX7800XT for similar money.

It would also have pushed the prices of older cards down further. So even if they are pushing worst-case scenarios, some of these issues are down to general stagnation in hardware too.
He's not doing it directly, but these people are, by reverse psychology: constantly showcasing the rigs/setups they claim are their 'daily' setups, which magically get upgraded every day by the looks of it with most of these shills.

I haven't 'tweaked' jack shid m9. Just cranked EVERY setting to the highest it'd go, played natively at 1440p ultra, no dynamic resolution cheating, no nothing: 52-60 fps. Did the DLSS mod: 90-110 fps with the default 70% scaling setting the mod came with. Didn't even bother to touch the settings, just wanted to see again if I could even tell it was on, and nope I can't... Unlike when I tried FSR1/2 with my AMD card... Which looked awful!

The price doesn't mean jack to me mate. I was ready to buy a 7900xt, remember, till I saw the disgusting power draw and how badly RT knocks it. If I'm going to spend £730-830 on a GPU (at the time), I want all the bells and whistles, so for the price I paid for my 4070 it delivers exactly what it says on the tin featureset-wise, which genuinely blew my mind after being very anti RT/DLSS/AI frames etc. for ages! That's the difference with having a go yourself: at worst it's cost you a bit of time and the postage to return it and say it was rubbish... Which believe me I would do, then slate it hard online haha!

Don't forget, in 3 years' time the money I've saved in electricity, gaming 6 hours every day, means this card's cost me nothing, vs setting fire to that money using twice the electricity if I'd got a 6950xt/7900xt, whilst burning my leg/making my room a sauna/coil whine/fan noise etc...
 
Sorry, that's what I meant mate, as in has it been released? When I grabbed his 2.0 it hadn't been released yet - unless I was being a dumbass (more than likely). IDK if I want to pay for it haha. Frame gen is awesome; I've played with all these features and was very impressed by what my 4070 could do in reality - as I'm an AMD guy at heart but wanted something that'd consume barely any power/wasn't like sitting next to a furnace/silent/nice selection of next-gen features etc...

I have undervolted my GPU and run my 5700x in eco mode... Gone from the 4070's already impressive 200w TBP to 105-135w at 1440p native ultra (or the same at 4k DLSS), or 145-165w max at 1440p ultra with RT on in Control (the thirstiest I've seen any game push my GPU wattage).
My entire setup INCLUDING the monitor/amp/speakers uses 260-300w at the wall, measured with a plug-in power meter... That's with a 31.5" 1440p 165hz monitor too! Average is around 260w, and 300w with things like Control maxed out at 1440p native with RT on etc...

I got sick of how much electricity was costing, so I figured saving nearly £200 a year in electricity meant that by year 3 my 4070 was free, whereas if I'd got a 6800xt/6950xt/7900xt (originally wanted a 7900xt) the GPU alone would be using the same or more electricity than my entire setup! AND costing me twice as much a year to run the setup 6 hours a day, every day (which I do).

TLDR: vs just throwing that money away for the entirety of the ownership of the system, I may as well go for the 4070, as it becomes free in 3 years thanks to the aforementioned electricity saving and I can rinse the death out of it 365 days a year without a care! When it becomes incapable even with frame gen/DLSS/lower settings, my SFF second rig gets a bonus upgrade - and it'll fit in that case :D
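For anyone who wants to check the "nearly £200 a year" and "free in 3 years" sums, here's a minimal sketch. The ~300 W at-the-wall difference between the two setups, the £0.30/kWh tariff, and the ~£530 purchase price are my own illustrative assumptions, not figures from the post:

```python
# Back-of-envelope check of the "nearly £200 a year" saving and the
# "free in 3 years" claim. Assumed figures (illustrative, not measured):
# ~300 W difference in at-the-wall draw, a £0.30/kWh UK tariff, and a
# ~£530 card price.

def annual_saving(delta_watts, hours_per_day, price_per_kwh):
    """Yearly cost difference for an extra draw of delta_watts while gaming."""
    kwh_per_year = delta_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

saving = annual_saving(delta_watts=300, hours_per_day=6, price_per_kwh=0.30)
print(f"saving per year: £{saving:.0f}")    # ≈ £197
print(f"payback: {530 / saving:.1f} years")  # ≈ 2.7 years
```

Under those assumptions the saving does come out just under £200 a year, and the card price is recouped in a little under 3 years; with a smaller wattage gap or cheaper tariff the payback obviously stretches out.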

The thing I found interesting was how, at literally THE SAME res/settings, the AMD cards will use 2-3gb more VRAM than mine in any game me and my friends have compared... IDK if that's because games are written more for Nvidia, or whether, despite the recommended allocation being way higher and the cards not maxing out VRAM usage vs what the game thinks you'll use, the AMD cards just use more.
Weird right? But yeah, I seem to use 5.7-6.3GB in this with the DLSS mod for example, and 9.3gb in Resi 4 remake or TLOU at native 1440p maxed out... Then we use the same res/settings on a 6800xt for example and it's using 2-3gb more, but not maxing its allocated amount? Maybe the AMD cards just chug more to achieve the same fps/settings? Who knows.

But yeah, it means I shouldn't have to worry about the VRAM issues for a fair bit imho, as at worst I just use DLSS/frame gen/lower the settings as required, and as I say, when that doesn't work at 1440p I'll just chuck it in the SFF rig, and I've got my money back in electricity savings each year by that point, so who cares...

I wasn't actively looking for DLSS 3 at the time, but DLSS 2 was available on launch so I installed that and tested it for a bit before I looked for the DLSS 3 mod. The only issue I have with frame gen is during ship building, where it causes a lot of ghosting and smearing. I've done the same: I've undervolted the 4090 so it only draws around 200w while still boosting to around 2800mhz, so I get a bit better than stock performance for a lot less wattage and noise/heat.

I don't think this game has any VRAM issues; I've barely used more than 7gb of VRAM, whereas system RAM seems to take a bigger hit, around 15gb iirc.
 
There is simply no excuse for this level of performance.

The game looks fairly average at best. It's doing nothing graphically or technically that's new or pushing any boundaries.

When something like Crysis came out, pushing hardware to the limit was fine as it genuinely was a leap forward graphically. There are tons of better looking and performing games than this out right now.

If I had my tinfoil hat on, it just seems like an intentional effort to push GPU sales....
But in reality, as I've stated in previous posts above, the game runs at WAY higher fps/res/settings than his claimed test hardware chart says...

It's HIM that is pushing GPU sales, not the game. My 4070 uses like 5.7gb of VRAM at 1440p ultra, so as you say the game isn't doing anything remotely taxing, hence why in reality it doesn't run like crap. It's just the usual clickbait trolling being copycatted for revenue/reposts/"fight me on a forum, I have no social life in RL" type nonsense!
 
Crysis was massively CPU limited. People didn't realise it at the time, but as faster hardware came out it was bottlenecked by the CPU.

Wish people actually watched it. This was like Crysis 3, where people were saying it wasn't CPU heavy and a dual core was fine, but they hadn't reached the later section, which had tons of grass. Lots of reviews tested earlier parts. That section pushed a heavy multi-threaded load. Computerbase were the first to point it out, then DF showed it too.

Or the Geothermal Valley section in ROTTR, which was very taxing on CPUs and dGPUs unlike other parts of the game.

Same thing happened in GTA5: heavy grass areas would absolutely destroy fps. That's why moving from the city to the countryside saw fps drop massively until you turned the foliage/grass setting down.
 
I wasn't actively looking for DLSS 3 at the time, but DLSS 2 was available on launch so I installed that and tested it for a bit before I looked for the DLSS 3 mod. The only issue I have with frame gen is during ship building, where it causes a lot of ghosting and smearing. I've done the same: I've undervolted the 4090 so it only draws around 200w while still boosting to around 2800mhz, so I get a bit better than stock performance for a lot less wattage and noise/heat.

I don't think this game has any VRAM issues; I've barely used more than 7gb of VRAM, whereas system RAM seems to take a bigger hit, around 15gb iirc.
Yeah, these new cards are awesome in terms of performance per watt, aren't they? And I'll repeat, I'm an AMD guy normally, but as I said before, it doesn't matter who makes it: if it does what I want I'll buy it, if not I'll send it packing and replace it!
Yeah haha, I use like 5.7-6gb VRAM max at native ultra 1440p!

It's mad how bad the 6950xt/7900xt are, whether stock or undervolted. I asked people on here to give me stats, stock and undervolted, at 1080p ultra in TLOU/CP, and they were using the same power or more than me when I'm at ultra 1440p/4k DLSS... Ridiculous.

There's literally no excuse for it. Then pair it with laughable FSR/no RT? No thanks, I want what I pay for, not excuses for what's missing. I'll always use AMD for CPUs (unless they mess up somehow), but the recent things they're doing are making me very annoyed as a lover of their products...

I don't even care about paying a bit more. I just want it to do everything if it's 'next gen', not 'settle' cause it's a bit cheaper, or give the 'I'm not even bothered about DLSS/RT' speech all the blatant shills now spout... And again, this is coming from an AMD guy, so I'm as unbiased as you can be; as I said, I'd have returned it for a refund and run back to AMD with my tail between my legs if it was rubbish haha!
 
Crysis was massively CPU limited. People didn't realise it at the time, but as faster hardware came out it was bottlenecked by the CPU.

Wish people actually watched it. This was like Crysis 3, where people were saying it wasn't CPU heavy and a dual core was fine, but they hadn't reached the later section, which had tons of grass. Lots of reviews tested earlier parts. That section pushed a heavy multi-threaded load. Computerbase were the first to point it out, then DF showed it too.

Or the Geothermal Valley section in ROTTR, which was very taxing on CPUs and dGPUs unlike other parts of the game.

Yeh but what is Starfield actually doing to push anything forward?

It simply doesn't look any better than many, many games that perform massively better.

If a game comes out that only has half the performance of most others, then I want to see why.
 
My initial experience of this game has been quite underwhelming so far, but that's largely because I've barely played it. I mean barely.

Started it up when it launched and:

- no 3840x1600 resolution out of the box
- no FOV adjustment which is really needed for ultra widescreen
- tried to clear what appeared to be vaseline or smoke from my eyes but no it's meant to look that way
- the few cut scenes don't bode well, they aren't even in the same resolution as you play the game in (very canned)
- pushed straight in to a character creation screen which you can't skip (you can pick anything and change it later, granted)

It was at this point I alt-tabbed and force quit the game.

I've still not played it 2 days later. It just didn't grab me in the first 5 minutes like a lot of other games have. I'll admit I've not been able to give it the dedicated time I think it requires; life's busy and got in the way. So I'll sit down tonight, research some more to circumvent some of the issues listed above, and get stuck in; hopefully it'll enthral me enough to want to keep playing.

Forgive the replying to my own post but doing so for effect.

Having put some time into the game and got quite immersed, I've got to say I'm really enjoying it so far. The cut-scene factor doesn't bother me at all now; sometimes it's even welcome, and it's nice to step back and watch :D.

So I take it all back, this game has gripped me and I'll be playing it for a while I reckon!
 
Even funnier when an adult makes a decision based on what some twonk on YouTube says.
Things that are based on fact or stats I get, but come on, even my 13-year-old nephew has learned critical thinking by now.
EXACTLY mate!

But they'll fight to the death about it and mock people on here, yet never try it for themselves. There's a well-known troll that frequents the forums (and on here too) who previously got banned for being such a fanboy, and thought changing his name slightly would hide it. He's actively refreshing every forum, ready to slate anything he doesn't own at any opportunity. Yet he makes out that because he's owned a few things from rival brands in the past he's unbiased, when all he does is literally copy and paste shill troll YouTubers/reviewers/backhander journos.
 
You can fully decorate it how you choose (you place/rotate/align each item), plus all the equipment, crafting facilities, bounty clearance terminal, mission terminal etc.
Does your base get attacked by pirates and whatnot? Is there a sort of base-defence aspect to the game, where you need to build defensive walls and towers etc. and fight off attackers? Also, is there any element of terraforming in the game? Like, can you build underground areas in your base, or that sort of thing? (I get that might be stretching the possibilities of the game engine a bit, but worth asking anyway, more in hope than expectation :) )
 
Fly around on very hard difficulty. Space combat is very involving then.

I'm comparing this element to X4: Foundations (and the others in the X series), as that for me is the de facto single-player space sim with dogfighting combat.

Starfield requires a fair bit of finesse to survive, plus you can disable rather than destroy ships and capture them. Boarding is also a fun element.

It just doesn't work for me, but I'm so used to stuff like elite dangerous and DCS that it probably was never going to :p

One of the main things I don't like is that you jump into a system and the enemy ships are already pointing at you and shooting within 2 seconds; I just think it's cheesy and doesn't work.
 
He's not doing it directly, but these people are, by reverse psychology: constantly showcasing the rigs/setups they claim are their 'daily' setups, which magically get upgraded every day by the looks of it with most of these shills.

I haven't 'tweaked' jack shid m9. Just cranked EVERY setting to the highest it'd go, played natively at 1440p ultra, no dynamic resolution cheating, no nothing: 52-60 fps. Did the DLSS mod: 90-110 fps with the default 70% scaling setting the mod came with. Didn't even bother to touch the settings, just wanted to see again if I could even tell it was on, and nope I can't... Unlike when I tried FSR1/2 with my AMD card... Which looked awful!

The price doesn't mean jack to me mate. I was ready to buy a 7900xt, remember, till I saw the disgusting power draw and how badly RT knocks it. If I'm going to spend £730-830 on a GPU (at the time), I want all the bells and whistles, so for the price I paid for my 4070 it delivers exactly what it says on the tin featureset-wise, which genuinely blew my mind after being very anti RT/DLSS/AI frames etc. for ages! That's the difference with having a go yourself: at worst it's cost you a bit of time and the postage to return it and say it was rubbish... Which believe me I would do, then slate it hard online haha!

Don't forget, in 2 years' time the money I've saved in electricity, gaming 6 hours every day, means this card's cost me nothing, vs setting fire to that money using twice the electricity if I'd got a 6950xt/7900xt, whilst burning my leg/making my room a sauna/coil whine/fan noise etc...

THIS! But the shills will fight you for it and waste HUNDREDS of pages on threads arguing why their card is better, basing it all on their not having Nvidia/VRAM, blah blah this, blah blah that...
And this is coming from an AMD guy myself who's chosen recently to try a 4070 because of the featureset. As with cars, it doesn't matter who makes it: if it works and I like how it drives and looks, I'll buy it/modify it regardless of the maker...

But people seem to be insanely biased now, all based on stuff they haven't ever done themselves... Yet will argue till they're blue in the face that they're right/quote some pleb's review/chart...

As an RTX3060TI owner, I think the RTX4070 is a terrible money grab, because of what it did below £500. It offered the same 35% to 45% generational improvement that the RTX3060TI had over the RTX2060 Super (for £400), and the same improvement the RTX2060 Super had over the GTX1070, for similar money. So please stop trying to make it look good, when it was the main reason that the only new-generation choice I have for £400 is a useless RTX4060TI. The rest are older cards which are halfway through their driver support period.

It's so poorly priced that AMD upsold the RX7700 and RX7700XT as the £430 RX7700XT 12GB and £470-£480 RX7800XT 16GB, when they should be £50 cheaper. Yet even at those inflated prices, both the RX7700XT/RX7800XT are simply better value, and that comes from a person with Nvidia hardware currently. Plus they come with a full copy of this game, worth nearly £70, not a nonsense OW2 battlepass.

The power consumption argument never concerned me, even with a 12-litre NCase M1; otherwise I would have got an RX6600/RX6800 card. Even comparing months where I barely game to some of my heaviest months, I can't see a massive change in power consumption costs myself.

The RX6600/RX6800 were the most efficient cards of their generation, yet everyone suddenly started talking about power consumption since April 2023? Our energy prices have been going up since late 2021. Every reviewer should have been pushing the RX6600 and RX6800 in Europe until April 2023, surely? People would have saved lots by then.

You know why? The Nvidia review guidelines probably "suggested" putting in power figures. But "not" when they had tons of RTX3070 and RTX3080 cards to clear out. Yet when AMD was ahead the previous generation, how many of these review sites put those numbers in? Almost sounds like a way to sell trash like the RTX4060TI for £400 because muh power consumption. So as much as you say you are not influenced by reviews, you sort of have been. It happened with the GTX960 2GB: people were buying them over the R9 290 4GB for almost the same money (you could get the latter for as low as £170 with decent custom coolers) in full-sized gaming rigs. For all the power that was saved, what was the point? Small PC, sure, but huge ATX rigs?

I look forward to the £500 RTX5050TI 14GB in early 2025, which is 5% slower than an RTX4070 12GB, £50 cheaper, has 2GB more VRAM and consumes 20% less power. AMD can release the RX8600XT 18GB for £50 less, 2.5% faster than the RX7800XT 16GB it replaced and consuming 3% less power.
 
As an RTX3060TI owner, I think the RTX4070 is a terrible money grab. It offered the same 35% to 45% generational improvement that the RTX3060TI had over the RTX2060 Super (for £400), and the same improvement the RTX2060 Super had over the GTX1070, for similar money. So please stop trying to make it look good, when it was the main reason that the only choice I had for £400 is a useless RTX4060TI.

It's so poorly priced that AMD upsold the RX7700 and RX7700XT as the £430 RX7700XT 12GB and £470-£480 RX7800XT 16GB. Yet even at those inflated prices, both the RX7700XT/RX7800XT are simply better value, and that comes from a person with Nvidia hardware currently. The power consumption argument never concerned me, even with a 12-litre NCase M1; otherwise I would have got an RX6600/RX6800 card. Even comparing months where I barely game to some of my heaviest months, I can't see a massive change in power consumption costs myself.

The RX6600/RX6800 were the most efficient cards of their generation, yet everyone suddenly started talking about power consumption since April 2023? Our energy prices have been going up since late 2021.

You know why? The Nvidia review guidelines probably "suggested" putting in power figures. Yet when AMD was ahead the previous generation, how many of these review sites put those numbers in? Almost sounds like a way to sell trash like the RTX4060TI for £400 because muh power consumption. So as much as you say you are not influenced by reviews, you sort of have been.
Is DLSS 3/frame gen/low latency etc. supported on your card? If not, I would say it would be worth it with 35-45% better performance (if that's what you say it'll do? Sorry if I got confused if not).
As that means you can achieve ridiculous settings/res without compromise, or even being able to tell it's turned on, unlike FSR...

But yeah, I wouldn't say upgrade to a 5070 in 3 years unless I couldn't achieve what I want; I'd just buy whatever can, whoever makes it.

But again for me, to go from an RX 580 to this, with its featureset and ridiculously low overall system wattage, thanks mainly to this card being a monster at OC'ing/undervolting, I cannot fault it for my needs.

I don't see in what world a £480 7800xt is a better buy than a £520 4070.

As I've said to you in the past, I've ALWAYS been concerned about power consumption, ever since my Q66-stepping first-gen quad core would eat £12 every 5 days when left on 24/7, according to our meter card...

I loved the 6600xt; I missed the boat on that one. It's the reason I love the 4070: it IS literally the modern equivalent, bar the pricing, but that isn't a factor for me. I would have bought the 7900xt originally, remember, if it hadn't been for the aforementioned reasons in past posts. I also don't think, bar the VRAM, the 'jump' from my card is that big either.

Also, I think the reason it's highlighted more now is that people were sick of constantly needing stupidly powerful PSUs every few years; the fact we now have 1200-2000w PSUs flaunted about as if that's normal for a box to play games on is quite frankly ridiculous. That's like dailying a 500+bhp rotary and calling it a Prius!

Also, 'TDP' is the most misleading BS term ever; TBP is closer to the truth, and it really annoys me that people will publish nonsense like TDP, as it doesn't help you work out what you'll need PSU-wise or what you'll consume AT ALL, since it isn't accounting for everything... Whereas TBP is not far off the reality. I test stuff at the wall with a proper power meter and compare it to the readouts from Adrenalin/Afterburner etc... So I know what is real and what software/sensors claim.

It's strange you think that, as it was AMD cards like the RX 6600 XT that made me think years ago, "oh, someone else cares about wattage draw vs performance other than me then!" I thought I was in a minority... So I'm of the opposite opinion on that one mate, sorry; I think it's good that data is released, as it matters to me, like mpg vs bhp when I tune/build cars/engine swap/change gearbox and diff ratios etc...

I suppose I'm in the minority though: an AMD guy that had the balls to actually try the rival team's hardware purely based on what it claims it can do, rather than being loyal to a brand. As I've said before, it's like cars: I like all sorts, love some more than others, but if a rival made something better I'm definitely still having it, regardless of it not being the usual make I'd choose to drive...
 