
AMD's FSR3 possibly next month?

Soldato
Joined
6 Feb 2019
Posts
17,853
Last edited:
Caporegime
Joined
4 Jun 2009
Posts
31,309

LOL!

oh-no.gif
 
Last edited:
Soldato
Joined
9 Nov 2009
Posts
24,914
Location
Planet Earth
Well, there are no FSR files either.

So unless something changes, Starfield has no upscaling technology support.

So all the people moaning about DLSS means Nvidia, AMD and Intel owners get no upscaling at all? Todd got annoyed! Probably said the modders can handle that hot potato! :cry:

Looks like all you get to experience is the Creation Engine 2 resolution scaler now. It'll probably be at least competitive with one from 2015.

A new generation gets to experience Bethesda Game Studios not giving a damn about graphics. Wonder if people can moan more about the RT; they might drop that too.

john-jonah-jameson-lol.gif
 
Last edited:
Associate
Joined
31 Dec 2010
Posts
2,481
Location
Sussex
Spoken like a true Apple acolyte. Which is what Jensen is aiming for, so mission accomplished.
Especially since tensor cores were:
Firstly developed for AI.
Secondly developed for AI.
Thirdly developed for AI.

With some thought given to other uses on professional cards.


After all that... To justify them existing on dies sold to gamers... They made DLSS dependent on them?

Brilliantly marketed, but without DLSS those tensor cores on consumer GPUs were wasted silicon, a solution looking for a problem.
 
Caporegime
Joined
17 Mar 2012
Posts
48,320
Location
ARC-L1, Stanton System
Epic's TSR doesn't use tensor cores. XeSS has a fallback layer that works on all cards, so it doesn't need those cores. FSR does the same too. FSR3, even if it uses RDNA3 features, will have a fallback for RDNA2. So why did AMD and Intel make fallback layers and Nvidia didn't? Intel's XeSS only needs dedicated hardware support in its top tier.
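
As a rough illustration of why a fallback layer is cheap to offer, here is a hypothetical capability check in Python (made-up names and a made-up gpu object, not any vendor's actual API), mirroring how XeSS drops from its XMX path to DP4a and how FSR2 runs on plain compute shaders:

    def pick_upscaler(gpu):
        # Hypothetical selection logic: prefer the fastest path the hardware offers,
        # then fall back to progressively more generic compute paths.
        if gpu.has_xmx_units:        # Intel Arc matrix engines
            return "XeSS (XMX path)"
        if gpu.supports_dp4a:        # packed int8 dot product, common on modern GPUs
            return "XeSS (DP4a fallback)"
        return "FSR2 (generic compute shaders)"  # runs on practically anything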

Nvidia still sells the GTX 1660 and GTX 1650 series even now.

So you fully support Nvidia users being locked out of any DLSS usage. So basically your argument is that if you are a poor Nvidia gamer you can clear off, and "better visuals/performance for a large percentage of the player base" is not really what you care about. All those non-FSR games which have DLSS are basically saying that.

You don't even need to buy a new card to use the AMD feature, and still some of you are moaning. God forbid if you needed an AMD card! :cry:

So I expect you were one of those people who bought an Nvidia FX for HL2, and made the same argument that Valve was "locking out" the majority of consumers from a decent DX9 HL2 experience. The indignity of AMD sponsoring Starfield is the same as when ATI got HL2. Also, if you put DLSS in, don't you need to put the Nvidia logo in the game? I read that here.

Are we going to have a boycott now? Not even AMD users boycotted Cyberpunk 2077 despite their cards doing RT like crap in the game! :cry:

how-dare-you-greta-thunberg.gif



But DLSS is the upscaler of the people, unless you are one of the undesirables. Yes, you GTX 1660, GTX 1650 and GTX 1060 owners! The last two are in the top 3 on Steam.

Also AMD and Intel make cards? Never noticed.
Thank you CAT. Someone gets it....

Closed black-box proprietary crap that is designed to stuff only one's own coffers, vs open standards that lift everyone as a whole.

Both Intel and AMD have a history of not just supporting things that actually benefit everyone, but also spending their own time and resources on creating and developing them; contrast that with Nvidia, whose only contribution to open source technology is to use other people's work for their own benefit.

DLSS does nothing for anyone but Nvidia; to say AMD blocking it hurts everyone is just about the most brainwashed concept I've ever heard.
 
Last edited:
Soldato
Joined
9 Nov 2009
Posts
24,914
Location
Planet Earth
Especially since tensor cores were:
Firstly developed for AI.
Secondly developed for AI.
Thirdly developed for AI.

With some thought given to other uses on professional cards.


After all that... To justify them existing on dies sold to gamers... They made DLSS dependent on them?

Brilliantly marketed, but without DLSS those tensor cores on consumer GPUs were wasted silicon, a solution looking for a problem.
Also, TSR proved DLSS1 didn't need Tensor cores, and only with later versions did DLSS appear to look better (though TSR can get close to DLSS2 sometimes). So that makes the whole "Nvidia can't make a fallback layer" premise false.
Thank you CAT. Someone gets it....

Closed black-box proprietary crap that is designed to stuff only one's own coffers, vs open standards that lift everyone as a whole.

Both Intel and AMD have a history of not just supporting things that actually benefit everyone, but also spending their own time and resources on creating and developing them; contrast that with Nvidia, whose only contribution to open source technology is to use other people's work for their own benefit.

DLSS does nothing for anyone but Nvidia; to say AMD blocking it hurts everyone is just about the most brainwashed concept I've ever heard.
Because they are elitists. They were not moaning about DLSS1 not working on GTX series cards when most of the cards back then were not RTX-branded ones. Yet TSR seemed to compare favourably (and in some cases is close to DLSS2) yet didn't need Tensor cores. Nothing was said about that, despite Epic Games and Nvidia having a long relationship.

The fact is they don't want DLSS to work in any way on GTX series and non-Nvidia cards. They paid for that feature, so they don't want plebs or non-Nvidia owners to use it for "free".

It was the same with the FreeSync vs G-Sync debates here. Most people were happy about FreeSync because it would mean more gamers would get to experience Adaptive Sync. It was cheaper and easier to implement. But not on tech forums, where they almost wanted it to go away, and ironically more Nvidia users have VESA Adaptive Sync monitors than actual G-Sync ones.
 
Last edited:
Associate
Joined
22 Nov 2020
Posts
1,462
I just wish Cyberpunk would be deleted from the collective consciousness as a benchmark; it's such a biased game towards Nvidia and is pretty awful to play as well.
 
Last edited:
Caporegime
Joined
17 Mar 2012
Posts
48,320
Location
ARC-L1, Stanton System
Also, TSR proved DLSS1 didn't need Tensor cores, and only with later versions did DLSS appear to look better (though TSR can get close to DLSS2 sometimes). So that makes the whole "Nvidia can't make a fallback layer" premise false.


Because they are elitists. They were not moaning about DLSS1 not working on GTX series cards when most of the cards back then were not RTX-branded ones. Yet TSR seemed to compare favourably (and in some cases is close to DLSS2) yet didn't need Tensor cores. Nothing was said about that, despite Epic Games and Nvidia having a long relationship.

The fact is they don't want DLSS to work on GTX series and non-Nvidia cards. They paid for that feature, so they don't want plebs or non-Nvidia owners to use it.

While elitism explains some users complaining about this (then again, crying about the underdog thwarting the elite is not very elitist :D), it doesn't explain tech journos doing the same thing and then trying to justify it by appealing to "for the people" sentiment like some pound-shop Lenin.

Claiming to be the victim and have the moral high ground while you spit on those beneath you is the narcissism born out of the Twitter generation. DLSS is a problem in itself, and AMD blocking it hurts no one but Nvidia's profit margins on their disgustingly overpriced, crappy GPUs.
 
Soldato
Joined
9 Nov 2009
Posts
24,914
Location
Planet Earth
I just wish Cyberpunk would be deleted from the collective consciousness as a benchmark; it's such a biased game towards Nvidia and is pretty awful to play as well.

It's a fun enough game, but nobody cared that it had a ton of Nvidia-specific technologies which favoured one brand (or newer cards). Fallout 4 had them and also just runs better on Nvidia cards (I run the benchmarking thread). Nobody said anything, but with Starfield....

giphy.gif


While elitism explains some users complaining about this (then again, crying about the underdog thwarting the elite is not very elitist :D), it doesn't explain tech journos doing the same thing and then trying to justify it by appealing to "for the people" sentiment like some pound-shop Lenin.

Claiming to be the victim and have the moral high ground while you spit on those beneath you is the narcissism born out of the Twitter generation. DLSS is a problem in itself, and AMD blocking it hurts no one but Nvidia's profit margins on their disgustingly overpriced, crappy GPUs.

If this game only had DLSS3, and no DLSS2 and FSR2, I would expect zero "backlash".
 
Last edited:
Soldato
Joined
14 Aug 2009
Posts
2,883
People do notice it because of the latency.
ICDP explained it and so have many others. Would people want to use this in a competitive, ranked FPS? If the prediction algorithm goes off, then your aim will be off! It is the same as when people said LCDs were off compared to CRTs for years. Most said it was imagined. Then it was shown display latency was off. Then we had AMD and Nvidia introducing anti-lag tech recently.

It's just like when people said frametimes and frame pacing were not important, only average FPS. Have people forgotten when CrossFire and SLI were a thing? Nvidia went to great lengths to prove FPS didn't matter, just frametimes, etc.

Now we have gone full circle: latency isn't important, frametimes are not important, lag isn't important, image quality isn't important, only average FPS.

The reality is fake frames are just there for companies to sell you less for more, and PCMR falls for it. Plus, fake frame technology increases VRAM usage. Just limit those VRAM increases and you can force an upgrade a bit earlier.

Hence why Nvidia was trying to push the RTX 4060 Ti performance figures using frame insertion. AMD will do the same. Less for more. More profits. This is the way.

I am sure when consoles do fake frames, suddenly the magnifying glasses will come out! PCMR will suddenly care about FPS, frametimes and quality of upscaling then. But PCMR goes on about how consoles suck due to technical measurements. Let's not get too technical on PC, right?

If people were to sit down and just play, many would find a console is perfectly fine at 30 to 60 FPS instead of a PC. Even games on their phones, which make more money than PC games.
I did say in my previous post:
I'm mostly a 60Hz gamer and have played most of the time with V-Sync, ergo I'm rather used to latency. At times FG can feel pretty much the same as native, while in others I notice a bit of lag - that's around 60fps with FG on. Input lag can vary per scene/game/card. Some will notice it, some will not. Probably a lot will be happy to game with it on if that means smoother gameplay and higher details.

Plus, I think he was testing on DLSS Quality, which isn't enough as it drops too low. Normally DLSS Performance should keep it around 60fps, which is much better. Ideal? No. Just better. I would probably not use it in competitive events, and since I'm not interested in them, my care-o-meter is more or less at zero on that front.
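
For context on the Quality vs Performance trade-off: the modes mainly differ in internal render resolution, with Quality rendering at roughly two-thirds of the output resolution per axis and Performance at half. A quick Python sketch of that arithmetic (the scale factors are the commonly quoted ones, not pulled from any SDK):

    def internal_res(width, height, scale):
        # Internal render resolution before the upscaler reconstructs the output
        return round(width * scale), round(height * scale)

    print(internal_res(3840, 2160, 2 / 3))  # Quality at 4K     -> (2560, 1440)
    print(internal_res(3840, 2160, 1 / 2))  # Performance at 4K -> (1920, 1080)
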
My issue is not that the tech exists, it's when it's used to sell trash like an RTX 4060 Ti.

:cry:
And I don't care that much about how Nvidia tries to promote its underpowered cards (or AMD for that matter). I'm more interested in what the tech can do before I dismiss it.
Yup, FSR is "free", yet anyone who actually cares about IQ will be turning it off anyway..... :cry: I'd rather sacrifice graphical settings than use FSR, as the downgrade in IQ, especially in motion, is beyond bad with all that fizzling, shimmering/aliasing etc. But AMD know what is best for the community, they're the white knights after all :D

Wouldn't worry though, as someone will mod in DLSS and probably Nvidia's frame generation, and it will more than likely end up being better than the official implementation of FSR :cry: Funny to see them cutting out XeSS too though, given that it is open source (their reason for not allowing Nvidia stuff was always that it's not open source, even though they were called out on that with their Streamline approach :D). Essentially the only ones AMD are harming here are themselves now; PR will be doing overtime soon :p
Well, I find FSR to be a bit better than DLSS in CB77 at 4K with PT. Better to have all solutions available all around.
Firing 64 traces per pixel when 4 makes no visual difference.

Where have I heard something very similar before.....? Good grief, at least learn some original tactics.

Actually it can make a huge difference. There's a huge gap to close before we can say we can shoot enough rays. And most importantly, how many bounces they'll do.
 
Caporegime
Joined
17 Mar 2012
Posts
48,320
Location
ARC-L1, Stanton System
Actually it can make a huge difference. There's a huge gap to close before we can say we can shoot enough rays. And most importantly, how many bounces they'll do.

Per pixel? There is only so much accuracy you can gain from tracing every pixel multiple times; 2 is enough, it's the standard for most, and 4 is more than enough.
 
Last edited:
Soldato
Joined
14 Aug 2009
Posts
2,883
Per pixel? There is only so much accuracy you can gain from tracing every pixel multiple times; 2 is enough, it's the standard for most, and 4 is more than enough.

Well, I guess they are doing it wrong, tracing 50 path-traced samples per pixel for professional work instead of 2. :)


And yes, bounces are not the same as traces.
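
The sample-count disagreement is really about Monte Carlo noise: the error of an averaged estimate only shrinks with the square root of the sample count, which is why offline renderers throw dozens of samples at a pixel while real-time engines use a handful plus a denoiser. A toy Python sketch (a made-up pixel model, not any renderer's code) shows the trend:

    import random
    import statistics

    def noisy_pixel(spp, true_value=0.5):
        # Average of spp random samples whose expected value is the true pixel value
        samples = [random.uniform(0.0, 2 * true_value) for _ in range(spp)]
        return sum(samples) / spp

    for spp in (2, 4, 16, 64):
        errors = [abs(noisy_pixel(spp) - 0.5) for _ in range(2000)]
        print(spp, round(statistics.mean(errors), 4))  # error falls roughly as 1/sqrt(spp)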
 
Last edited:
Caporegime
Joined
17 Mar 2012
Posts
48,320
Location
ARC-L1, Stanton System
Well, I guess they are doing it wrong, tracing 50 path-traced samples per pixel for professional work instead of 2. :)


And yes, bounces are not the same as traces.

He's talking about "Path Tracing" as opposed to "Ray Tracing".

Ray tracing traces the scene relative to the camera only (shadows and light bounces); it's not tracing the light from the whole scene, and anything outside the camera's interest is rasterized. It's a hybrid approach that gives good enough accuracy to create the appearance of an accurately lit scene, even though it isn't truly accurate, and it is far better than rasterization alone while still being fast enough to run on consumer-grade GPUs for gaming.

Path tracing has no rasterization at all; the entire scene starts out black until the whole scene is path traced. This has true-to-life accuracy, and we are a very long way from that being fast enough for games. In ray tracing the scene is already rasterized, with a light (ray) traced overlay.
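
To make the samples vs bounces distinction concrete, the heart of a path tracer is two nested loops: one over samples per pixel and one over bounces along each path. A structural Python sketch with stubbed scene functions (nothing here is real engine code, the numbers are placeholders):

    import random

    def trace_ray(origin, direction):
        # Stub: pretend every ray hits a grey surface with a little emission
        return {"emitted": 0.2, "albedo": 0.5}

    def random_direction():
        # Stub standing in for a proper hemisphere sample
        return (random.random(), random.random(), random.random())

    def path_trace_pixel(spp=4, max_bounces=3):
        total = 0.0
        for _ in range(spp):                    # samples ("traces") per pixel
            radiance, throughput = 0.0, 1.0
            origin, direction = (0.0, 0.0, 0.0), random_direction()
            for _ in range(max_bounces):        # bounces along this one path
                hit = trace_ray(origin, direction)
                radiance += throughput * hit["emitted"]
                throughput *= hit["albedo"]
                direction = random_direction()  # scatter and continue the path
            total += radiance
        return total / spp                      # average the samples

    print(path_trace_pixel())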
 
Last edited:
Soldato
Joined
14 Aug 2009
Posts
2,883
He's talking about "Path Tracing" as opposed to "Ray Tracing".

Ray tracing traces the scene relative to the camera only (shadows and light bounces); it's not tracing the light from the whole scene, and anything outside the camera's interest is rasterized. It's a hybrid approach that gives good enough accuracy to create the appearance of an accurately lit scene, even though it isn't truly accurate, and it is far better than rasterization alone while still being fast enough to run on consumer-grade GPUs for gaming.

Path tracing has no rasterization at all; the entire scene starts out black until the whole scene is path traced. This has true-to-life accuracy, and we are a very long way from that being fast enough for games. In ray tracing the scene is already rasterized, with a light (ray) traced overlay.

Well, you didn't mention ray or path tracing in that post.
You can do path tracing now (as CB77 and others proved), with limitations, just as you do RT or simple rasterization.

RT is still limited. RTGI can use multiple frames to accumulate data, and even reflections are not really "true to life". Both are far off from the point where no more rays are needed or more rays make no significant improvement. At the end of the day, you can use a bit of everything to get the best image possible, given the limitations of each approach.
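
The multiple-frames point refers to temporal accumulation: real-time RTGI commonly blends each frame's noisy result into a history buffer, so the effective sample count builds up over time at the cost of some lag and ghosting. A minimal Python sketch of the idea (an exponential moving average with a made-up alpha, not any specific engine's denoiser):

    def temporal_accumulate(history, new_sample, alpha=0.1):
        # Blend this frame's noisy ray-traced result into the running history;
        # smaller alpha accumulates more frames but ghosts more under movement.
        return history * (1.0 - alpha) + new_sample * alpha

    value = 0.0
    for frame in range(30):
        noisy_result = 1.0                # pretend the converged answer is 1.0
        value = temporal_accumulate(value, noisy_result)
    print(round(value, 3))                # creeps towards 1.0 as frames accumulate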
 
Soldato
Joined
30 Dec 2011
Posts
5,545
Location
Belfast
Plus, I think he was testing on DLSS Quality, which isn't enough as it drops too low. Normally DLSS Performance should keep it around 60fps, which is much better. Ideal? No. Just better. I would probably not use it in competitive events, and since I'm not interested in them, my care-o-meter is more or less at zero on that front.

Apologies if I had not clarified. When I tested frame generation I kept lowering settings until I got to the point where latency was not a problem. This included RT and DLSS quality settings. Once I got to about 60-70 base FPS without fake frames enabled, latency with fake frames enabled became a non-issue. The thing is, at this speed the FPS and latency were already good enough without the need for fake frames. It basically did not allow me to increase the graphical settings, because doing so lowered the baseline FPS and increased lag.

It has its uses, but giving the flexibility to increase graphical settings to extreme RT etc. was not one of them, because the lag ruined the experience for me. The FPS said 50 to 60 but the lag said mid-20s.
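
"The FPS said 50 to 60 but the lag said mid-20s" is roughly what simple arithmetic predicts: with 2x frame generation only every other displayed frame comes from fresh input, so responsiveness tracks the base framerate (and the real pipeline adds some buffering on top of this rough model). A quick Python sketch:

    def feels_like(displayed_fps, fg_factor=2):
        # With 2x frame generation, input is only sampled at the base render rate,
        # so responsiveness follows displayed_fps / fg_factor, not displayed_fps.
        base_fps = displayed_fps / fg_factor
        return base_fps, 1000.0 / base_fps  # base fps and its frametime in ms

    print(feels_like(55))  # (27.5, ~36.4 ms): displayed 50-60, but it responds like mid-20s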
 
Last edited: