
FidelityFX Super Resolution 2.0

Looks much better, as expected. Hopefully we'll see some comparisons to DLSS soon.

Dishonored 2 is a very good showcase for the temporal improvements over FSR 1, as with FSR 1 in that game you can easily notice how FSR amplifies the native/TAA issues: edges, lines and wires disappearing, shimmering, etc. Hopefully they can get this into RDR 2 too, as that is probably the best showcase for temporal reconstruction; without it the game looks awful, even when using 8x MSAA.

Very curious to see how motion looks too. Given it is basically still TAA (specifically TAAU), I would expect some of the usual TAA motion issues, though perhaps somewhat improved if AMD have fine-tuned it...

Funny reading AMD's Reddit etc., with people raving about it being better than native, yet the same ones said/say DLSS isn't better than native :cry: The GPU fanboy wars never get old :D
 
NGL, the difference between DLSS and FSR seems minimal; it's all in the FPS. It's great that these technologies will probably get an extra year out of a card before someone would normally upgrade.
 
I've said it for years, ever since Turing was announced with DLSS: you don't really need ML for temporal accumulation to reconstruct detail very well, and The Division 2's implementation made it very obvious that it could absolutely rival DLSS at its best. At 4K and the like, even straight-up reductions in pixel count (a lower resolution scale) were OK because you already have so many pixels, and once you add sharpening it's excellent; that's also why FSR has been a fine alternative to DLSS at 4K. Now FSR 2.0 will bridge the gap for when you make a large jump, 720p/1080p -> 4K (and later on 1440p -> 8K). At the very least it's nice to have another form of AA to choose from.
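To illustrate what "temporal accumulation without ML" boils down to, here's a minimal, hypothetical sketch (the names and numbers are mine, nothing from AMD's actual FSR 2.0 code): render each frame with a sub-pixel jitter and blend it into a history buffer with an exponential moving average, and the history converges toward a supersampled result over time. Real TAAU/FSR 2-style upscalers add motion-vector reprojection and history clamping on top of this to handle movement.

```python
# Minimal, hypothetical sketch of ML-free temporal accumulation (the core idea behind
# TAAU/FSR 2-style reconstruction), on a toy 1-D "frame" for clarity. A real upscaler
# also reprojects the history with motion vectors and clamps it against the current
# frame's neighbourhood to reject stale samples.
import numpy as np

def accumulate(history, jittered_frame, blend=0.1):
    """Exponentially blend newly rendered (jittered) samples into the history buffer."""
    if history is None:
        return jittered_frame.copy()
    return (1.0 - blend) * history + blend * jittered_frame

rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0.0, 2.0 * np.pi, 64))   # stand-in for the "full-res" signal
history = None
for frame in range(64):
    # Per-frame noise stands in for sampling the scene at different sub-pixel offsets.
    jittered = truth + rng.normal(scale=0.2, size=truth.shape)
    history = accumulate(history, jittered, blend=0.1)

single_frame_err = float(np.mean(np.abs(jittered - truth)))
accumulated_err = float(np.mean(np.abs(history - truth)))
print(f"single frame error: {single_frame_err:.3f}, accumulated error: {accumulated_err:.3f}")
```

The point being: simple jitter-and-blend already recovers a lot of detail on its own, which is why a hand-tuned heuristic can get surprisingly close to an ML approach.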

Now all that's left is to see whether AMD can catch Nvidia in RT performance with RDNA 3, because to me that's been the biggest weakness of RDNA 2. It's not often that it really makes that much of a difference, but when you do want that extra RT power... oh man, it sure sucks to miss it. Not that Nvidia can really brag too much here either; in reality even the 3090 buckles, so we need more power across the board overall. I wanted to keep the 6800 for longer, but it's clear that no card available right now is really good enough for longevity w/ RT. Hoping this year's cards will meet that standard.
 
I agree @Poneros, and it ties in with our discussions on RT going back to the Turing era, where the grunt was nowhere near where it needed to be. Again, it's not quite there. Only leaks to go by... but the RT improvement in Lovelace is said to beat the rasterisation leap by 25-50%. As the latter is reportedly a ~75% boost, that looks like almost a 100% improvement to RT.
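A quick back-of-the-envelope using those leaked figures (purely illustrative, and assuming the "25-50% better" is multiplicative on top of the raster uplift rather than additive):

```python
# Illustrative arithmetic only; the inputs are rumoured figures, not confirmed specs.
raster_gain = 0.75                                   # ~75% rasterisation uplift (leak)
rt_extra = (0.25, 0.50)                              # RT said to beat raster by 25-50%
rt_gain = tuple(raster_gain * (1.0 + x) for x in rt_extra)
print(rt_gain)                                       # (0.9375, 1.125) -> roughly 94-113% RT uplift
```

Read additively instead (75% plus 25-50 percentage points) it would be 100-125%; either way, it points at roughly a doubling of RT performance.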

However, power consumption is going to be up.
 
Said what? That they hope Nvidia's proprietary technologies crawl off into a dark corner and die? Think you'll find people have been saying that for over a decade, not just when FSR came out. Fortunately, good has repeatedly triumphed over evil in that war. :D

PhysX
GameWorks
Proprietary G-Sync

DLSS will go the same way: not dead, plodding on in zombie form, but ultimately superseded by open technologies.
 
They certainly have changed their tune since the spat with Nvidia got resolved.

Yeah.....

They said the RX 6600 was overpriced at $329; they did it in a somewhat childish way and, yet again, with a hyperbolic video title and thumbnail.

Ok fine, $329 is quite expensive for a GPU that sits between its 5700 and 5700 XT predecessors; there was no need to treat people like they were 14 to make that point, but ok...

Around comes the RTX 3050: glowing review, nothing but high praise for what is frankly a (Male Bovine Manure) card! Yes, it's better than the RX 6500, but it's still objectively a "Bad Card".
They said that with the current situation this card will not sell at MSRP, it will be more expensive, and that with that in mind they believe it's worth buying at up to $450. Now compare that with what they said about the RX 6600, a much better card that was at the time widely in stock for $400.

Oh yes, they resolved their argument with Nvidia.

 
The methods of their testing, coupled with their opinion delivery, have been poor through 2021 and into 2022. I am hoping they get better over the year, as they were generally a solid source, like I said, up until Nvidia tried broadsiding them into favourable dictated information. Ironically they have actually swung to team green since then, like you say: recommending the 3050, panning the low-end AMD cards (the 6500 was a dog, though), and stating they are not basing their reviews on pricing while forgetting that most of the AIB Nvidia prices were much worse (which should have pushed them towards some measure like the $-per-frame they used before).

If you check out today's release of the 6700 vs 3070, they conveniently mention pricing... as the 3070 was way more expensive (like $200 more), that should be reflected in their summation.
 


I have written them off as Nvidia shills at this point and I don't think that will ever change.

I'm of the opinion Nvidia do buy these people off; they have been doing it for a decade. I don't blame the reviewers who take Nvidia money, I would, I absolutely would, who doesn't like lots of money?
But as a consumer I'm not stupid. It's very easy to make something bad look good and something good look bad without telling a word of a lie, and I'm sure these people delude themselves that for this reason it's a perfectly legitimate thing to do.
You as the consumer have to trust that these tech journalists aren't doing that, that they are unbiased. Hardware Unboxed clearly are not unbiased, and with that I don't trust anything that comes out of that channel; as I see it they are a marketing arm of Nvidia.

The options AMD have to tackle this (and they have to, or they will never gain any market share from Nvidia):
One is to get into bidding wars with Nvidia on paying tech journalists off; who do you think will win that?
Two is to make your GPUs so good it's impossible to review them badly, or at least so good it would be really, really obvious if someone did. But then again, it's really obvious what Hardware Unboxed are playing at and normies will not see it; you and I are the only ones pointing out the blindingly obvious contradictions in their publications.

The only real option is for this community to wise up and call them out on their Male Bovine Manure, relentlessly.
"But why should it matter to me? I like Nvidia, i don't give a poo about AMD i would never buy an AMD GPU"
If you are in that ^^^^ camp you are a monumental idiot and yes Jenson adores you.
 
Not got much confidence in Hub then? :p

Hub IMO are playing to their audience (~80% Nvidia?).

If you want to keep/grow your viewers you need to feed them more of what they want to hear, which probably isn't 'buy AMD', or they'll only go elsewhere.

Although their relationship with Nv has no doubt improved, I doubt they're getting paid anything beyond higher-priority access to hardware/info after they exposed Nv's bullying tactics.

If Nv are better at keeping Hub happy, it's only AMD that can change that, by getting more involved with Hub than Nv does.

Saying all that, at the end of the day there's still a chance Hub does have the best leather jackets in Oz.
 
:cry:

Makes me laugh: oh, they said a good thing about Nvidia, they must be Nvidia shills! No possible other explanation?!?! :D The part that tops it all off is that the same people who "call out" these shills and go on about how people should boycott Nvidia, Nvidia has too much mindshare etc., still purchase Nvidia hardware :p :cry: If AMD GPUs and the company are so pure and good, and Nvidia are evil with worse hardware, why not buy AMD?

HUB back up their points very well with "evidence", something many on this forum lack the ability to do... Loads called them out for disabling SAM/ReBAR (funnily enough, AMD's strongest fans were the most vocal); they posted a video purely on that, and it still wasn't good enough (they even have it enabled in that latest video of theirs)... Heck, if people watched the video, they would see that HUB recommended AMD over Nvidia because of the price difference in the "current" market (despite Nvidia having the "overall" lead across all resolutions and games)... HUB also disable RT in a lot of their benchmark videos, Nvidia's biggest advantage, but you don't see the same people bringing that up when complaining about them being "Nvidia shills" or about disabling SAM/ReBAR??? Funny, eh...



But back on topic: any comparisons of FSR 2 to DLSS yet? My initial thought is this won't be quite as good as DLSS; if it was, would AMD not have compared it to DLSS rather than just to native and FSR 1? Either way, even if it's not quite as good as DLSS, it's still a step up from FSR 1.
 
There ^^^^ is that TikTok audience they are chasing.

There's ^^^^^ the "AMD good, Nvidia bad" stance, yet the last 2 GPUs have been Nvidia... :cry:


Come on, get that thread posted on why HUB are "Nvidia shills", with all your concrete evidence to prove it; it will be a good read ;)

 