
AMD's FSR3 possibly next month?

MLID (Tom) is friends with Tim. Hopefully Tom has had a frank word with him and he's wiser now; maybe that's why they've gone silent on it, despite saying they would push it more if SF doesn't have DLSS.

Nvidia need to be seen by the tech press for the massive, untrustworthy trolls that they are.
 
I think they know, but their livelihood is linked to staying onside with these big companies. Even the more impartial outlets have to tread carefully. My attitude is that if you're not prepared to pay for independent reviews, then the reviewers will be beholden to the manufacturers.
 
This is not entirely accurate. The way Super Resolution works in Lightroom is similar to Topaz Labs' approach of reconstructing detail using AI, and if you have a RAW source image to work with, the results are at their best, as RAW retains a lot of information deep within the capture that isn't always visible at the surface.

Photoshop's Enhance Details feature is AI-powered too, but it simply sharpens the image using AI. Super Resolution in Lightroom uses AI to quadruple the resolution and rebuild detail that might otherwise have been there if the original source had been that same 104MP image taken with a higher-MP sensor. Take a look at the fine detail in the bride's dress lower in the image where the sunlight hits it: the mesh detail isn't clear on the 26MP version, but it is clear on the 104MP upscaled image, as it's been reconstructed accordingly and accurately. There is no sharpening applied here to simply enhance details and give the illusion of better detail.

Because, again, it's not recreating extra detail beyond what's in the original image, which in this case is the RAW file. You could always get a better upscaled image using old-fashioned upscaling from RAWs than by starting with a JPEG file. But even in your case, that is more to do with the JPEG engine in the camera being limited.

All the machine learning is doing is refining the algorithm. Just look at DxO DeepPRIME, which uses machine learning to clean up noise very well. It all has its uses, but what I have an issue with is people acting like it can make NEW detail.

You can't make new details out of the photo file you are basing the image on. All you can do is help extract 100% of the detail that's there, and machine learning can help with that. Beyond that, it's just an estimation of what the scene might look like if you used a better imaging device. The only way to resolve that scene with more detail is to use better equipment.
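
To put that point in concrete terms, here's a minimal, purely illustrative sketch in NumPy (the "image" is just a made-up array, nothing camera-specific): a classical 2x upscale quadruples the pixel count, but every new pixel is derived entirely from samples that were already captured.

```python
# Illustrative only: classical upscaling re-uses existing samples; it cannot
# invent detail the sensor never recorded. The "image" here is a random array.
import numpy as np

rng = np.random.default_rng(0)
low_res = rng.integers(0, 256, size=(1000, 1000), dtype=np.uint8)  # stand-in capture

# Nearest-neighbour 2x upscale: 4x the pixels (1000x1000 -> 2000x2000)...
upscaled = np.repeat(np.repeat(low_res, 2, axis=0), 2, axis=1)

# ...but no new information: every output pixel is a copy of an original one
# (bicubic/Lanczos would use weighted blends of the same samples instead).
print(low_res.size, upscaled.size)                   # 1000000 4000000
assert np.array_equal(upscaled[::2, ::2], low_res)   # originals fully recoverable
```

ML-based upscalers replace that copy/blend step with a learned guess at what the missing samples probably looked like, which is exactly the "estimation" being described above.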


CSI has a lot to answer for! :cry:

Or do you think all the professional photographers will just stick with a 16MP D7200 and never upgrade to a higher resolution system? They could pass everything through the upscale filter and never upgrade again. You know very well that won't happen! :p

Not even smartphone photographers would do that. More MP, more lenses, more everything!

Marketing bandies these terms around, and it actually annoyed people I knew who were computing guys. They always called it machine learning, so when marketing kept calling it AI, they got vexed repeatedly!
:cry:

This is why I think we are probably arguing two different things, so best to leave it at that!

My point is this: there are many variations of these technologies, and the way they are implemented matters the most. You can't take a low-quality source image (DLSS at 1080p) and expect a 1080p output reconstruction of the same quality as a native 1080p image, because the source internal render is way too low-res. But give the AI a suitably detailed source image and it can do much, much more, and actually generate a perfect image, as demonstrated in all these DLSS comparison videos being posted online.
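
To put rough numbers on why the source resolution matters so much, here's a quick sketch. The per-axis scale factors below are the commonly quoted ones for the DLSS quality modes (treat them as approximations, not an official spec), and the helper name is just made up for illustration.

```python
# Approximate internal render resolutions per DLSS mode (scale factors are the
# commonly quoted ones for DLSS 2.x, used here as rough approximations).
DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Rough internal render resolution fed to the upscaler for a given output."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for out_w, out_h in [(1920, 1080), (3840, 2160)]:
    for mode in DLSS_SCALE:
        w, h = internal_resolution(out_w, out_h, mode)
        print(f"{out_w}x{out_h} {mode:17} -> {w}x{h} ({w * h / 1e6:.2f} MP fed to the upscaler)")
```

Roughly speaking, 1080p Quality mode feeds the upscaler about a 720p frame (under 1 MP of data), while 4K Quality mode feeds it about a 1440p frame (around 3.7 MP), which is why the same technique holds up far better at higher output resolutions.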

In an ideal world everything would be native and run great in games, but we don't have that currently, and even then we'd still need a superior AA solution at native to combat jaggies. No in-game AA solution is really that good or efficient for today's games: they're either always too soft, or too wacky for temporal stability. DLSS appears to resolve the shortfalls of inefficient AA, so for that alone it's well worth implementing properly in a game, plus we all get free fps gains in the process, so what's not to like?

My big issue is Nvidia using it to upsell crap like the RTX 4060 Ti. If the RTX 4070 had been the RTX 4060 Ti we had all expected, at closer to £400, then DLSS, FG, etc. would be an additional selling point on top.

I dislike that it's becoming the main selling point.
 
It's interesting that AMD now have an FSR option that just enhances the native image quality, you know, like what Temporal AA was intended for...

It's come right back round to just being a TAA option, only under the image upscaling marketing umbrella :cry:

Nothing says scam better than that, and DLSS is the same thing.
 
That's so frustrating. At this point I'm so frustrated with it that I'm on the accelerationist side of it: make the 5080 $1500 for a 20% gain... at what point are people going to wake up and realise "I've been such an idiot"?

They're not, they're in too deep.
 
If that's true then we are all #######
It's always been the way, but people want everything for free; no such thing, of course. If you like the output and find it useful, chuck them a few quid. If everyone who watched these reviews paid £1/month, the big companies would have much less hold on the reviewers. I'd like all reviews to be done only on retail-purchased hardware, not units donated or lent out by the manufacturer. Hence they like preorders and building FOMO: get your cash before anyone knows the truth about the products ;) The marketing guys are a lot smarter than the average consumer!
 
If it's memory intensive then surely you'd expect memory to actually be used, though. The game uses less than 6GB of VRAM and 7GB of system RAM, so that would point to anything but a memory-intensive game? Unless it's specifically targeting keeping a certain amount in VRAM at all times and then pushing textures etc. through as fast as possible to maintain that set level of VRAM use, which would mean the bottleneck is a software one that Bethesda could solve in a patch or something.


It'll be the game engine keeping track of all the NPCs, objects, quest stages etc. There will be a lot of NPCs, each with their own set daily schedule and so on. I would assume that graphics rendering is a lower priority than making sure NPCs, objects and quests fire off correctly and are in the right places; I could be wrong though.
 
At the end of the day, you can all keep arguing about this, but no matter what people post, it will never change the facts below unless a mountain of evidence comes along to debunk these several sources and their evidence :)

Sadly, these ones will never accept it, despite EVERY single tech outlet (TPU, computerbase, pcgamershardware, DF, gamer nexus, HUB, oc3d.net etc.) and comparison out there showing exactly where, when and how upscaling (well, more so DLSS) can produce results as good as native or better than native. I have yet to see anyone who claims otherwise post some of their own substantial evidence to debunk all the material out there now...
 
Let's say it like it should be said. Products are not sold on performance alone but on "features" too; just look at the smartphone market. Whether people think DLSS or FG are good or not isn't relevant. It's the fact that AMD doesn't have something comparable, said a year ago that it would, and it's still not here.

So reviewers HAVE to show these features off, because the competition has it and AMD does not. Otherwise it's not really fair on Nvidia either, and Nvidia is going to wield it to win sales.

If it's memory intensive then surely you'd expect memory to actually be used, though. The game uses less than 6GB of VRAM and 7GB of system RAM, so that would point to anything but a memory-intensive game? Unless it's specifically targeting keeping a certain amount in VRAM at all times and then pushing textures etc. through as fast as possible to maintain that set level of VRAM use, which would mean the bottleneck is a software one that Bethesda could solve in a patch or something.

Memory bandwidth sensitive. Fallout 4 scaled with memory bandwidth too; I suspect there are one or two worker threads for scripting etc., and it's being constantly hammered.

Also, the consoles have massive memory bandwidth available to their CPUs, so I suspect the engine has been oriented around that. It runs relatively well on the consoles at its 30FPS cap, especially given the low-end Zen 2 performance they have (probably closer to a Zen+ CPU).
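
A quick way to see the capacity vs. bandwidth distinction being drawn here (purely an illustrative sketch, nothing to do with Starfield's actual engine): the toy loop below touches under a gigabyte of RAM, yet how long it takes is governed almost entirely by how fast that data can be streamed through memory.

```python
# Illustrative sketch: a workload can use very little memory *capacity* and
# still be limited by memory *bandwidth*.
import time
import numpy as np

N = 50_000_000                        # two ~400 MB float64 arrays - modest capacity
src = np.ones(N)
dst = np.empty_like(src)

start = time.perf_counter()
for _ in range(10):
    np.copyto(dst, src)               # pure streaming copy, no real compute
elapsed = time.perf_counter() - start

moved_gb = 10 * 2 * src.nbytes / 1e9  # each pass reads src and writes dst
print(f"effective memory bandwidth ~ {moved_gb / elapsed:.1f} GB/s")
```

If a game's hot loops look like that (lots of reads and writes over script and NPC state rather than heavy arithmetic), faster memory helps more than extra cores or clocks do, which would fit low reported VRAM/RAM usage alongside a memory-sensitive engine.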
 
Last edited:
Let's say it like it should be said. Products are not sold on performance alone but on "features" too; just look at the smartphone market. Whether people think DLSS or FG are good or not isn't relevant. It's the fact that AMD doesn't have something comparable, said a year ago that it would, and it's still not here.

So reviewers HAVE to show these features off, because the competition has it and AMD does not. Otherwise it's not really fair on Nvidia either, and Nvidia is going to wield it to win sales.

Exactly.

And this is where people don't quite grasp it: "software" and features, whether people like it or not, are sold as part of the "package" (the same goes for every single product on the market), and this is factored into the "final" cost of said item. If you think AMD don't charge for their features, then lol, better go and tell them to remove all the marketing for things like FreeSync, SAM, Chill etc. from their GPU boxes ;)

As it is, there will always be comparisons of these features while there is still a significant difference. The only time we'll stop seeing such comparisons is when both are on par and it's just become a standard, a bit like FreeSync vs G-Sync: still some comparisons, but nowhere near as many, because free/adaptive sync has more or less got as good as G-Sync on the whole. There are still some differences, but it's not worthwhile doing the same comparisons as back in the day.
 

Would it be fair to say then that if AMD had the same performance and features they should be valued the same as Nvidia?
 
Sssh I don't want AMD stuff costing as much as Nvidia ;)

The thing is, we all know it's nothing to do with performance or features; it's not as if AMD have never had feature and performance parity with Nvidia. At that point people just resort to saying "well, one is Nvidia and the other is not", which is what it's really about.

So with only 15% market share, and having to sell 10% cheaper just to maintain that, where is the incentive for AMD to spend the same amount of money as Nvidia on R&D for the same features?
 
In a word: Ryzen ;) They've done it before, and if they crack this chiplet approach it could really put Nvidia under pressure. Who would have thought a few years ago that Intel would have so much competition?
 

With CPUs it's not about software features, it's purely about performance.

Also, AMD have beaten Intel before and gained market share; they have never gained market share against Nvidia, no matter what.
 
So what you're saying is... Nvidia or gtfo? :p

Funny thing is, the most vocal ones about "poor AMD" have a history of buying Nvidia GPUs over AMD even when AMD are supposedly as good as or better than the Nvidia options... but obviously AMD isn't cutting it for them. As I have always said before, it's like people just want AMD to compete so they can buy Nvidia for cheaper :) I loved all the AMD GPUs I had, never any problems, and they absolutely demolished Nvidia in the past for bang per buck, but that changed with DLSS and ray tracing, and being able to buy a 3080 for MSRP sealed it. Right now I think AMD are in a good place; they have usable ray tracing now, and the only thing stopping me is the lacklustre FSR 2 quality (and, well, prices still being silly).
 