
Poll: AMD Screws Gamers: Sponsorships Likely Block DLSS

Are AMD out of order if they are found to be blocking DLSS on Starfield?

  • Yes

  • No


Cyberpunk was a terrible launch, but the game is actually in a good place now, and its expansion is out soon, which CDPR are calling its reboot. From what has been shown, Phantom Liberty looks awesome.

Cyberpunk looks great of course, but I still think it is a sub-par gaming experience. The NPCs are basically mindless zombies compared with the GTA series or even Far Cry 2. The poor AI breaks the immersion for me.
 

Ahh my friend but those games are all living proof that graphics do not a good game make.

FO3 - Game of the year.
FONV - Game of the year.
Skyrim? same.
FO4? people are still playing it 8 years on, still modding it and still streaming it.

FO76? is anus, but yeah....

I would say it is more important to have your name and logos on games like that (apart from FO76) than games that look pretty but are inevitably cack. I mean ffs I still play FO3. And I will play FO4 again by the end of the year too 'cause Fallout London is coming.

IDK what AMD are paying, but you are forgetting one small thing there. Remember, they are paying for every copy of these games they give away with a GPU. Well, I say that; it's not strictly true, YOU are (as the GPUs with free games always cost more than the vanilla ones), but AMD still have to pay upfront for those keys. They prolly get a discount, but yeah, that wouldn't be a cheap endeavour, I don't think.
Well, actually looking at Nexus and Wabbajack, I would say more people are still playing Skyrim. Or at least the ones who do are modding more?

Anyway, heavy modding is why I am a VRAM advocate. Even with new renderers (and I personally find ENB to look worse), what a Wabbajack modlist player really needs from a GPU is VRAM rather than grunt.
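
As a rough back-of-the-envelope illustration of why texture-heavy modlists chew through VRAM (the texture counts below are made-up assumptions; the per-texture maths is just standard block-compression arithmetic):

[CODE=cpp]
#include <cstdio>

// Rough VRAM estimate for a texture-heavy modlist.
// Assumptions (illustrative only): BC7-compressed textures at 1 byte per
// texel, with a full mip chain adding roughly a third on top.
int main() {
    const double mip_overhead = 4.0 / 3.0;                   // full mip chain ~= +33%
    const double bytes_4k = 4096.0 * 4096.0 * mip_overhead;  // ~21.3 MiB per 4K texture
    const double bytes_2k = 2048.0 * 2048.0 * mip_overhead;  // ~5.3 MiB per 2K texture

    // Hypothetical modlist: 400 unique 4K textures and 1500 2K textures
    // resident at once (numbers picked purely for illustration).
    const double total = 400 * bytes_4k + 1500 * bytes_2k;

    std::printf("Approx. texture VRAM: %.1f GiB\n",
                total / (1024.0 * 1024.0 * 1024.0));
    // Prints roughly 16 GiB, before the base game's own assets, shadow maps,
    // frame buffers, ENB buffers, etc. are even counted.
    return 0;
}
[/CODE]

That is why a heavily modded load order runs out of VRAM long before it runs out of shader grunt.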
 
Ahh my friend but those games are all living proof that graphics do not a good game make.

FO3 - Game of the year.
FONV - Game of the year.
Skyrim? same.
FO4? people are still playing it 8 years on, still modding it and still streaming it.

FO76? is anus, but yeah....

I would say it is more important to have your name and logos on games like that (apart from FO76) than games that look pretty but are inevitably cack. I mean ffs I still play FO3. And I will play FO4 again by the end of the year too 'cause Fallout London is coming.

IDK what AMD are paying, but you are forgetting one small thing there. Remember, they are paying for every copy of these games they give away with a GPU. Well, I say that; it's not strictly true, YOU are (as the GPUs with free games always cost more than the vanilla ones), but AMD still have to pay upfront for those keys. They prolly get a discount, but yeah, that wouldn't be a cheap endeavour, I don't think.

I suspect the majority of gamers have logged countless hours in these Bethesda games compared to the pinnacle of graphics that is Cyberpunk 2077. I know I have played Skyrim a whole lot more than CP2077, purely due to the better gameplay and atmosphere. The graphics were pretty top notch at the time of its release, and it still holds up well, especially with HD textures.

I know, so I find it ironic people are so concerned about graphics in a Bethesda Game Studios game. If graphics were that important, we wouldn't have tens of millions of sales of those games even today.

I have been telling people to worry more about the CPU requirements in the game, especially if you're wanting to expand the in-game bases, etc. Just check the Fallout 4 benchmark thread I run.
 
I wouldn't say AMD is out of order necessarily.
First and foremost, we have zero evidence of this.
Secondly, AMD is the only company with an open-source solution that works on any card. Nvidia doesn't even support their 10xx series and below, and is keen to further fragment their own products to push more sales, while XeSS has a gimped fallback mode for anything that isn't an Intel GPU, with poorer performance and image quality on top.

If you guys really cared about making upscaling available for everyone, you should be pressuring Nvidia and Intel to use more open-source solutions. It's better for the consumer and better for progress. Nvidia's Streamline proposal isn't a good solution either; it's just hiding the problem.

So far only AMD has introduced this, and the others would rather do their own thing.

The best thing would be if all three companies contributed to one single open-source solution. They could just add in specific optimisations for when specific hardware is detected, without gimping or blocking fallback modes, and with it being open source there would be no foul play.
However, that is a fantasy within a dream, and we have this nonsense instead.
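
For what it's worth, the idea itself isn't technically complicated. A minimal sketch of what one shared front end with optional vendor fast paths could look like (every name here is hypothetical and not from any real SDK):

[CODE=cpp]
#include <memory>

// Hypothetical sketch of a single, vendor-neutral upscaler API. None of these
// types exist in any real SDK; they only illustrate one shared front end with
// optional hardware-specific back ends and a fallback that is never blocked.

enum class Vendor { Nvidia, Amd, Intel, Unknown };

struct UpscaleParams {
    int render_width, render_height;   // internal resolution
    int output_width, output_height;   // display resolution
    // ...colour, depth, motion vectors and jitter would be passed alongside...
};

class IUpscalerBackend {
public:
    virtual ~IUpscalerBackend() = default;
    virtual void Upscale(const UpscaleParams& params) = 0;
};

// Generic path: plain compute shaders, works on any GPU.
class GenericBackend : public IUpscalerBackend {
public:
    void Upscale(const UpscaleParams&) override { /* shader-only path */ }
};

// Vendor-optimised path: same inputs and outputs, just faster on matching
// hardware (e.g. matrix units). Never required for correctness.
class NvidiaBackend : public IUpscalerBackend {
public:
    void Upscale(const UpscaleParams&) override { /* tensor-core path */ }
};

std::unique_ptr<IUpscalerBackend> CreateBackend(Vendor v) {
    switch (v) {
        case Vendor::Nvidia: return std::make_unique<NvidiaBackend>();
        // AMD- and Intel-optimised back ends would slot in here the same way.
        default:             return std::make_unique<GenericBackend>();
    }
}
[/CODE]

One set of inputs, one front end, and because the fallback is always there, no card gets locked out.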
 
The best thing would be if all three companies contributed to one single open-source solution. They could just add in specific optimisations for when specific hardware is detected, without gimping or blocking fallback modes, and with it being open source there would be no foul play.
However, that is a fantasy within a dream, and we have this nonsense instead.
It just seems like "implement all three of them" with extra steps.
This will either be a DX standard at some point (preferably hardware-backed, for best results) or each will do its own thing. It's not that difficult to support all of them anyway; there are no excuses not to do it other than laziness.
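
To the "not that difficult" point: once a game feeds one temporal upscaler its colour, depth and motion-vector inputs, wiring up the others is mostly plumbing. A rough sketch of the usual dispatch pattern (the Evaluate* wrappers are placeholders standing in for the respective SDK calls, not real function names):

[CODE=cpp]
// Game-side dispatch across the three upscalers. EvaluateDlss/EvaluateFsr2/
// EvaluateXess are placeholder wrappers, not real SDK entry points.

enum class UpscalerChoice { Native, Dlss, Fsr2, Xess };

struct FrameInputs {
    // Colour, depth, motion vectors and jitter offsets: all three temporal
    // upscalers consume essentially this same set of per-frame inputs.
};

inline void EvaluateDlss(const FrameInputs&) { /* vendor SDK call goes here */ }
inline void EvaluateFsr2(const FrameInputs&) { /* vendor SDK call goes here */ }
inline void EvaluateXess(const FrameInputs&) { /* vendor SDK call goes here */ }

void RunUpscaler(UpscalerChoice choice, const FrameInputs& frame) {
    switch (choice) {
        case UpscalerChoice::Dlss:   EvaluateDlss(frame); break;
        case UpscalerChoice::Fsr2:   EvaluateFsr2(frame); break;
        case UpscalerChoice::Xess:   EvaluateXess(frame); break;
        case UpscalerChoice::Native: break;  // no upscaling
    }
}
[/CODE]

The work is in producing good motion vectors and jitter once; after that, each additional upscaler is mostly integration glue.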
 
YouTube is full of visual updates for Bethesda's games, but somehow graphics don't matter... Well, just turn everything to low and enjoy. You probably won't even need FSR! :p

TBH none of them are ever going to look fantastic. Better? I guess, but the engine has always been the limiting factor. Even when it launched, it (FO4) never looked great. With the time these games need to be in dev for the quests to be written etc., none of them will ever be a stunner when launched. Otherwise they would just fall into the "Duke Nukem Forever" loop of infinity.

It's nice seeing a game that looks amazing, obviously. But I have never, ever cared about that.
 
I wouldn't say AMD is out of order necessarily.
First and foremost, we have zero evidence of this.
Secondly, AMD is the only company with an open-source solution that works on any card. Nvidia doesn't even support their 10xx series and below, and is keen to further fragment their own products to push more sales, while XeSS has a gimped fallback mode for anything that isn't an Intel GPU, with poorer performance and image quality on top.

If you guys really cared about making upscaling available for everyone, you should be pressuring Nvidia and Intel to use more open-source solutions. It's better for the consumer and better for progress. Nvidia's Streamline proposal isn't a good solution either; it's just hiding the problem.

So far only AMD has introduced this, and the others would rather do their own thing.

The best thing would be if all three companies contributed to one single open-source solution. They could just add in specific optimisations for when specific hardware is detected, without gimping or blocking fallback modes, and with it being open source there would be no foul play.
However, that is a fantasy within a dream, and we have this nonsense instead.
Good post. Zero proof of anyone blocking anything, no dev confirmation either. I wonder why? Probably because everyone knows there’s no truth to it.
 
The best thing would be if all three companies contributed to one single open-source solution. They could just add in specific optimisations for when specific hardware is detected, without gimping or blocking fallback modes, and with it being open source there would be no foul play.
However, that is a fantasy within a dream, and we have this nonsense instead.

You're missing the point: Nvidia want to sell you 4050-class cards disguised as 4060s that rely on DLSS to give you the performance they should have had as a proper 4060. Then, when the next generation comes along, DLSS 4 will not work on these cards, so you're forced to upgrade to the 5050 or 5060 to get the same performance in new titles. If you don't understand that, then you haven't understood Nvidia's intention at all.
 
Fair, so that’s two games out of how many Bethesda titles?

Looking at https://bethesda.net/en/games/home and going back to Ghostwire, all except Starfield support DLSS and FSR.

Boundary is an interesting one. It had DLSS and was repeatedly demo'd with it. Then it got AMD sponsorship and DLSS was removed completely. https://www.dsogaming.com/news/boun...ature-ray-tracing-ditches-dlss-over-fsr-xess/

It's an Unreal Engine 4 game as well, in which DLSS is legitimately a checkbox to implement, but hey, as you said, until companies break their NDAs and B2B relationships by sharing their contracts in the public sphere, it's not real.
 
Looking at https://bethesda.net/en/games/home and going back to Ghostwire, all except Starfield support DLSS and FSR.

Boundary is an interesting one. It had DLSS and was repeatedly demo'd with it. Then it got AMD sponsorship and DLSS was removed completely. https://www.dsogaming.com/news/boun...ature-ray-tracing-ditches-dlss-over-fsr-xess/

It's an Unreal Engine 4 game as well, in which DLSS is legitimately a checkbox to implement, but hey, as you said, until companies break their NDAs and B2B relationships by sharing their contracts in the public sphere, it's not real.
I see the developer doesn't indicate that AMD blocked them from adding DLSS; they say it was removed due to lack of development resources, so that won't fit your agenda, unfortunately, Robert.
“Unfortunately, we need to remove Ray Tracing and DLSS from the EA version. The main reason is that our development resources cannot support multiple technical features, especially pure technical features, which means that this feature will not bring substantial improvements to gameplay.”
Atomic Heart is Nvidia sponsored and removed RT too, Robert. Is that AMD's fault as well? :cry:
 
Where did I mention RT? Stay in scope.
You linked to an article that mentions removal of both features (RT and DLSS) as part of your claims that AMD is blocking DLSS from being added.

Sadly for you, the article says both features were removed due to development time costs, nothing to do with AMD telling them to remove DLSS.

It then references Atomic Heart as similar in this regard for having features pulled prior to launch, despite Atomic Heart being Nvidia sponsored and RTX videos showing RT enabled for this game all over Nvidia's YouTube channel.

So, get back to work, Robert, and see if you can find some confirmation from people who develop games that AMD stopped them from adding DLSS into their game.

I wish you the best of luck on your personal crusade my man. :)
 
I will admit the HDR on my screen was way off; it looked bloody awful out of the box. Thirty seconds with the Windows HDR Calibration tool fixed that, and it's gone from utter crap to gorgeous.

Yep, that's a very good point, and even on a G-Sync OLED I use that tool. Tone mapping is already very good, but the brightness was off.
 