***The Official Starfield Thread*** (As endorsed by TNA)

Yeah, I just redeemed one called 'Starfield - Preorder' about 15 mins ago and nothing?

I even tried restarting Steam.
In Steam settings > Account details > Licenses and subs you should see two entries for Starfield; if not, then your premium edition isn't activated correctly:

[screenshot: w92FlCL.png — Steam licence list showing two Starfield entries]
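If you'd rather check programmatically, here's a minimal sketch using the Steam Web API's IPlayerService/GetOwnedGames endpoint. It checks the owned-games list rather than the package licences shown in that screenshot, so treat it as a rough sanity check only; the Starfield app ID (1716740) and the key/SteamID placeholders are assumptions:

```python
# Minimal sketch: check whether Starfield shows up in a Steam account's
# owned games via the public Web API. Requires a Steam Web API key and a
# public profile. NOTE: this checks owned apps, not the package licences
# listed under "Licenses and subs", so it's only an approximate check.
import requests

API_KEY = "YOUR_STEAM_WEB_API_KEY"   # placeholder
STEAM_ID = "YOUR_64BIT_STEAM_ID"     # placeholder
STARFIELD_APPID = 1716740            # assumed Starfield app ID

resp = requests.get(
    "https://api.steampowered.com/IPlayerService/GetOwnedGames/v1/",
    params={"key": API_KEY, "steamid": STEAM_ID, "include_appinfo": 1},
    timeout=10,
)
resp.raise_for_status()
games = resp.json().get("response", {}).get("games", [])

owned = any(g["appid"] == STARFIELD_APPID for g in games)
print("Starfield found in library" if owned else "Starfield not found")
```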
 
But this should not be the case in 2023. After the last two years of absolute trash PC releases, why are people still expecting lacklustre PC launches just because a studio was famous for them before? Especially given that the studio head has been doing interviews and social media appearances highlighting how well it's all shaped up after such a long delay!

Why are people normalising bad development?

Edit:
And I am pretty confident that Alex at DF will highlight the vast majority of these issues in his video as soon as it goes live. It's thanks to DF that my eyes are now always drawn to these issues.

Todd was also very careful with what and how he answered questions. So far I've seen exactly what I expected from the game.
 
Quite funny seeing all the hate coming in now about washed-out graphics and the lack of HDR, **** blacks :p



Can't wait for the DF video; they will probably rip it a new one.
 

[image: lGcWA3X.png]


AMD seem to love having their sponsored games running better on their competitors' hardware :p

Was expecting better scaling from the 3D V-Cache CPUs tbph. Definitely going back to Intel next time.
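For anyone wanting to compare CPU scaling themselves, here's a minimal sketch that works out average FPS and 1% lows from a frame-time capture. It assumes a PresentMon-style CSV with an "MsBetweenPresents" column, and the file names are hypothetical:

```python
# Minimal sketch: average FPS and 1% lows from a frame-time capture.
# Assumes a PresentMon-style CSV with an "MsBetweenPresents" column;
# adjust the column name/paths for your capture tool (CapFrameX, etc.).
import csv
import statistics

def summarise(path: str, column: str = "MsBetweenPresents") -> None:
    with open(path, newline="") as f:
        frame_ms = [float(row[column]) for row in csv.DictReader(f)]

    avg_fps = 1000.0 / statistics.fmean(frame_ms)

    # 1% lows here = average FPS over the slowest 1% of frames.
    worst = sorted(frame_ms, reverse=True)
    slowest_1pct = worst[: max(1, len(worst) // 100)]
    low_1pct_fps = 1000.0 / statistics.fmean(slowest_1pct)

    print(f"{path}: avg {avg_fps:.1f} fps, 1% lows {low_1pct_fps:.1f} fps")

# Hypothetical captures of the same scene on two different CPUs:
summarise("7800x3d_newatlantis.csv")
summarise("13700k_newatlantis.csv")
```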
 
> But this should not be the case in 2023. After the last two years of absolute trash PC releases, why are people still expecting lacklustre PC launches just because a studio was famous for them before? Especially given that the studio head has been doing interviews and social media appearances highlighting how well it's all shaped up after such a long delay!
>
> Why are people normalising bad development?

Because people on forums are obsessed with graphics, when most people who buy these games simply don't care as much as people think. An indie game like Valheim looks terribad, but sold well over 10 million copies because it was fun. All the people on Reddit and Twitter who were going to boycott because it had no RT, no this or that, etc. would not have made a real difference.

The same was said about Fallout 4 being crap on forums because it looked like jank, but it was a fun game at its heart and had one of the most mod-friendly developers and dedicated fanbases of any game franchise. Skyrim looked dated at launch but was genuinely a great game to play, and Fallout: New Vegas is still one of the greatest RPGs ever created.

Hogwarts Legacy sold 15 million copies, and it didn't look or run that great either. But it is apparently fun to play.

Cyberpunk 2077 had masses of bugs, horrendous AI and significantly cut-back RPG elements (compared to The Witcher 3), etc., but got a pass from enthusiasts because it was very pretty and fun to explore. Yet it still sold well. Everyone is excited about Phantom Liberty because they want to mess around with graphics settings; nobody is asking whether the RPG elements are going to be up to the standard of The Witcher 3.
 

> [image: lGcWA3X.png]
>
> AMD seem to love having their sponsored games running better on their competitors' hardware :p
>
> Was expecting better scaling from the 3D V-Cache CPUs tbph. Definitely going back to Intel next time.
Those 13th gens literally drink extra power though, and then fart it out as pure heat :p

Fingers crossed 14th gen is more efficient, much like 12th gen was, just overall more powerful. It seems fairly pointless going 13th gen at a higher cost only to then have to manage the extra heat and power consumption. We've got 4090s that consume less power than a 3080 Ti, and run cooler too at the same resolution and settings, whilst being up to 3x faster. Why Intel don't sort their CPUs out to match what Nvidia have been doing with GPUs is my question.
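To put that efficiency point in numbers, here's a tiny perf-per-watt sketch; the wattages and the 3x speed-up are illustrative ballpark assumptions taken from the post above, not measurements:

```python
# Illustrative perf-per-watt comparison; all figures are assumed
# ballpark numbers for the sake of the arithmetic, not measurements.
def perf_per_watt(relative_perf: float, watts: float) -> float:
    return relative_perf / watts

rtx_3080_ti = perf_per_watt(1.0, 350.0)   # baseline: 1.0x perf at ~350 W
rtx_4090    = perf_per_watt(3.0, 320.0)   # assumed: 3x perf at ~320 W

print(f"3080 Ti: {rtx_3080_ti:.4f} perf/W")
print(f"4090:    {rtx_4090:.4f} perf/W")
print(f"Efficiency gain: {rtx_4090 / rtx_3080_ti:.1f}x")  # ~3.3x
```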
 
> Those 13th gens literally drink extra power though, and then fart it out as pure heat :p
>
> Fingers crossed 14th gen is more efficient, much like 12th gen was, just overall more powerful. It seems fairly pointless going 13th gen at a higher cost only to then have to manage the extra heat and power consumption. We've got 4090s that consume less power than a 3080 Ti, and run cooler too at the same resolution and settings, whilst being up to 3x faster. Why Intel don't sort their CPUs out to match what Nvidia have been doing with GPUs is my question.

Intel's power consumption problem goes as far back as Coffee Lake.

Leaks say 14th gen is just another refresh, so that's not going to fix it.
 