
Nvidia gimmicks / features pushing up Graphics card prices?

Well, it looks like Frame Generation is actually quite a clever technology:

Some impressive 1% lows in the Witcher 3 Next Gen. Noticeably improved vs frame gen off.

Particularly, the ability to reduce load on CPUs. I wonder how many games will support it?

I also wonder if it's basically cloning a lot of data from previous frames, and modifying it slightly.

Not necessarily a problem for most gamers, who don't need to scrutinise every frame.

My issue with technologies like this is that they just cover up hardware deficiencies, particularly with ray tracing cores.

But it could certainly be a nice thing to have for the RTX 4070 and 4070 Ti.
 
My issue with technologies like this is that they just cover up hardware deficiencies, particularly with ray tracing cores.

It's easy: since you see through it, don't buy. Only buy when you consider the hardware on offer worthy of your money.
 
Well, it looks like Frame Generation is actually quite a clever technology:

Some impressive 1% lows in the Witcher 3 Next Gen.

Particularly, the ability to reduce load on CPUs. I wonder how many games will support it?

I also wonder if it's basically cloning a lot of data from previous frames, and modifying it slightly.

My issue with technologies like this is that they just cover up hardware deficiencies, particularly with ray tracing cores.

But it could certainly be a nice thing to have for the RTX 4070 and 4070 Ti.

The guy is an Nvidia shill now though ;)

But yes, when you look past the over-analytical testing scenarios from the likes of HUB and DF, and use it in the appropriate scenarios, i.e. your base FPS is already somewhat decent (so input lag is already good) and you're just playing the game, frame generation is very good.

Also, it doesn't reduce load on the CPU; frame generation does nothing to help CPU issues here. It simply adds "fake" frames to increase the overall FPS and give smoother gameplay, essentially bypassing CPU limitations rather than fixing them. Basically, it takes the 1st and 3rd frames and inserts a fake AI-generated one in the middle of the 2 real frames, e.g.

[image: frame interpolation example]
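Purely for illustration of that "two real frames in, one fake frame in between" idea, here's a toy sketch in Python. It's just a naive pixel blend I've made up, not Nvidia's actual method, which relies on motion vectors and a hardware optical flow accelerator:

```python
# Toy sketch of frame interpolation (NOT Nvidia's actual algorithm): naively
# blend the 1st and 3rd real frames to synthesize a "fake" 2nd frame.
# The real thing uses motion vectors plus an optical flow accelerator.
import numpy as np

def interpolate_frame(frame_a, frame_b, t=0.5):
    """Blend two H x W x 3 uint8 frames at position t between them."""
    blended = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
    return blended.clip(0, 255).astype(np.uint8)

frame1 = np.full((1080, 1920, 3), 50, dtype=np.uint8)   # real frame 1
frame3 = np.full((1080, 1920, 3), 150, dtype=np.uint8)  # real frame 3
frame2_fake = interpolate_frame(frame1, frame3)          # generated middle frame
print(frame2_fake[0, 0])  # [100 100 100]
```

A blend like this also hints at why scene cuts look rough: if the 1st and 3rd frames show completely different scenes, the generated middle frame is a mush of both.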

This is why, when you switch to a completely different frame/scene/angle, the fake frame looks considerably worse, as in HUB's clickbait thumbnail for their video. When you look at what they did in order to get that:

the 1st real frame:

[image]

the fake 2nd frame, which is created from the 1st and 3rd frames:

[image]

the 3rd real frame:

[image]

But as has been stated by both HUB and DF etc., you wouldn't notice the "fake" frames without slowing footage down and/or even pausing to pick them out; I believe Tim said even he had a hard time doing this :cry:


EDIT:

As for how many games will support it, I imagine a good chunk will, as Nvidia are providing developers with their Streamline solution, which integrates DLSS, frame generation and Reflex in one go. I imagine uptake will be slow for the first couple/few months, though, until this trickles down and makes its way into developer roadmaps, given it was only revealed recently.
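I haven't used the SDK myself, so purely as a hypothetical sketch of the "integrate once, get DLSS, frame generation and Reflex together" idea (these names are invented, not the real Streamline API):

```python
# Hypothetical pseudo-code only; the class and method names are invented and
# are NOT the real Streamline API. It just shows why one integration point
# that exposes several features as plugins lowers the barrier for developers.
class Feature:
    def __init__(self, name):
        self.name = name

    def apply(self, frame):
        print(f"applying {self.name}")
        return frame

class FeaturePipeline:
    """Single hook in the renderer; individual features are just plugins."""
    def __init__(self):
        self.features = []

    def enable(self, feature):
        self.features.append(feature)

    def run(self, frame):
        for feature in self.features:
            frame = feature.apply(frame)
        return frame

pipeline = FeaturePipeline()
for name in ("super resolution", "frame generation", "low-latency mode"):
    pipeline.enable(Feature(name))
pipeline.run(frame="rendered_frame")
```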

 
AMD is adopting something very similar soon also. Looks like it could become quite a big deal.

Have there been any ETAs though? All they said in their show was later next year, which could mean December 2023 :p It will be interesting to see how AMD's solution does though, given:

- Nvidia have apparently been working on frame generation for the last 6 years (IIRC there are already some open-source solutions that AMD could use or base their solution on, such as those in VR headsets)
- Nvidia are using hardware in order to overcome the main issue associated with frame generation/inserting fake frames, i.e. latency (rough numbers below)
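On that latency point, here is a back-of-the-envelope illustration of my own, not a measurement: interpolation has to hold back the newest real frame, so you pay roughly one extra frame-time at the base framerate even though the displayed framerate doubles.

```python
# Back-of-the-envelope only (an assumption, not a measurement): holding back
# one real frame for interpolation costs roughly one frame-time of latency
# at the base framerate, while the displayed framerate doubles.
def frame_time_ms(fps):
    return 1000.0 / fps

for base_fps in (30, 60, 120):
    print(f"base {base_fps:>3} fps -> shown {base_fps * 2:>3} fps, "
          f"~{frame_time_ms(base_fps):.1f} ms extra latency")
# base  30 fps -> shown  60 fps, ~33.3 ms extra latency
# base  60 fps -> shown 120 fps, ~16.7 ms extra latency
# base 120 fps -> shown 240 fps, ~8.3 ms extra latency
```

This is also why a decent base FPS matters before switching frame generation on.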

FSR 1 was terrible and FSR 2 was poor too; it is only recently, with FSR 2.2+ and better implementations by devs, that it has become on par with DLSS 2+. So even if AMD were to release their solution, like FSR and their other competing technologies, give it another 6-7+ months to match Nvidia's solution.
 
AMD is adopting something very similar soon also. Looks like it could become quite a big deal.

Yea, I have no issue with it. I just see it as an extra option like DLSS/FSR. No one is forcing anyone to use it.

I wonder if it will be used in the next lot of consoles like the PlayStation 5 Pro or PS6.
 
Yea, I have no issue with it. I just see it as an extra option like DLSS/FSR. No one is forcing anyone to use it.

I wonder if it will be used in the next lot of consoles like the PlayStation 5 Pro or PS6.
As they are targeting 8K in those next consoles, they are going to need FSR with frame generation, so FSR 3.
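Quick pixel arithmetic on why an 8K target pretty much forces upscaling plus frame generation; this is just resolution maths, nothing console-specific:

```python
# Simple resolution arithmetic: 8K is 4x the pixels of 4K and 16x 1080p,
# which is why native rendering at 8K is such a heavy ask.
resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP ({pixels / base:.0f}x 1080p)")
# 1080p: 2.1 MP (1x 1080p)
# 4K: 8.3 MP (4x 1080p)
# 8K: 33.2 MP (16x 1080p)
```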
 
Fair points, but

I doubt AMD just picked it up last Tuesday.

TBH, it looks like they did ;) When they revealed it and said "next year", it didn't sound/look like they had anything really concrete in the works, and I bet their preview of it was faked. They learnt from RDNA 2 that staying silent about a DLSS competitor (which was a huge selling point for Nvidia for so long) harmed them, and they couldn't very well do the same all over again with frame generation, keeping everyone waiting for some announcement of "yes, we are working on it" rather than "we're looking into various methods to benefit the industry as a whole".

After all, how long did we have to wait for FSR 1, which was just a spatial upscaler (which even AMD said wasn't a competitor to DLSS)? Then how long did it take for FSR 2 to arrive, and then the better versions which are on par with DLSS now...

Yea, I have no issue with it. I just see it as an extra option like DLSS/FSR. No one is forcing anyone to use it.

I wonder if it will be used in the next lot of consoles like the PlayStation 5 Pro or PS6.

One thing I find surprising is how FSR still hasn't taken off with consoles. After all, as many like to remind us (and even I thought the same), surely having AMD hardware would see FSR exploding there... Maybe it's just down to AMD again and their stance on these things, with it being open source and them taking the over-the-fence approach with their solutions, i.e. here you go, do as you please :p
 
I just see it as an extra option like DLSS/FSR. No one is forcing anyone to use it.
This is the most important point. Techs have their positives and drawbacks, but ultimately just switch it off if you don't want it. Motion blur is in everything, and it goes off every time. I don't care who does it better.

Any new tech needs to be optional, certainly for the first few years. If games come out requiring a particular feature, it's going to have to be pretty amazing to get me to shell out 800 quid plus to play. Huge fuss in the news when consoles decide to release games at 60 quid instead of 50, but it seems acceptable to require a 4 figure graphics card to potentially play Avatar at decent settings. This is not right.
 
It's quite interesting to see the impact of frame gen. at native resolution in the Witcher 3.

Performance is pretty dire even at 1080p, with RT options enabled (frequently well below 60 FPS), even on the fastest card available (RTX 4090).

With frame gen on, the framerate appears smooth.

eh, if it can make ray tracing actually usable / performant in games, I'm all for it.
 
When they revealed it and said "next year", it didn't sound/look like they had anything really concrete in the works, and I bet their preview of it was faked. They learnt from RDNA 2 that staying silent about a DLSS competitor (which was a huge selling point for Nvidia for so long) harmed them, and they couldn't very well do the same all over again with frame generation, keeping everyone waiting for some announcement of "yes, we are working on it" rather than "we're looking into various methods to benefit the industry as a whole".
So... Nvidia features are pushing prices up? Seems that way, though I wouldn't necessarily use the word gimmick.

AMD providing features at a lower cost, but at a later date? Falls in line with the 'fine wine' idea.

Seems it's all as normal, just how things have been for years, with different tech at the centre. People are paying for the privilege of being at the bleeding edge, same as in any industry. The only question is one of value: is it worth 100-200 pounds (on top of an extremely high base price) to have a blurry image of Spiderman's **** generated 12 months before AMD get a slightly blurrier image of Spiderman's ****?

Not for me.
 
We are supposed to be reaching a transistor size limit with current technology. One way they can go is to increase the size of the chip, but that increases power. I doubt there are many generations left. The only thing they can do is sell new "firmware", and they seem to be practising this already. I think the future is more "gimmicks", as you put it.

Whether they use this as an excuse to push up prices, I am not sure. They will use it as an excuse to sell cards for sure.

I really don't know where they think they are going with prices. All I can say is that they seem to have priced out a lot of enthusiasts this time. If the early indications for the 4070 Ti are genuine, then they will push out even more people with that. Partner boards will be coming in at around £1,000, which is way, way too high imo. Well, they have certainly priced me out anyway. I will not pay that sort of money for a 4070 Ti.

Frankly, the future is bleak.
 
Agree, £1,000 is much too high, Ti or not.

The RTX 3070 was £470 for the FE, the RTX 2070 was priced around £400, and the GTX 1070 around £300.

So the value seems shot to hell.
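Putting those quoted launch prices, plus the ~£1,000 partner 4070 Ti boards mentioned above, into rough percentage jumps:

```python
# Rough generational price jumps based on the figures quoted in this thread
# (forum-quoted launch prices, not official MSRPs).
prices = [("GTX 1070", 300), ("RTX 2070", 400), ("RTX 3070 FE", 470),
          ("RTX 4070 Ti partner boards", 1000)]
for (prev_name, prev_price), (name, price) in zip(prices, prices[1:]):
    rise = 100 * (price - prev_price) / prev_price
    print(f"{prev_name} -> {name}: +{rise:.0f}%")
# GTX 1070 -> RTX 2070: +33%
# RTX 2070 -> RTX 3070 FE: +18%
# RTX 3070 FE -> RTX 4070 Ti partner boards: +113%
```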
 
So... Nvidia features are pushing prices up? Seems that way, though I wouldn't necessarily use the word gimmick.

AMD providing features at a lower cost, but at a later date? Falls in line with the 'fine wine' idea.

Seems it's all as normal, just how things have been for years, with different tech at the centre. People are paying for the privilege of being at the bleeding edge, same as in any industry. The only question is one of value: is it worth 100-200 pounds (on top of an extremely high base price) to have a blurry image of Spiderman's **** generated 12 months before AMD get a slightly blurrier image of Spiderman's ****?

Not for me.
But would AMD ever give you a blurry generated image of Spiderman's **** if Nvidia hadn't done it 12 months earlier?
 
But would AMD ever give you a blurry generated image of Spiderman's **** if Nvidia hadn't done it 12 months earlier?
Probably, eventually. This isn't a 6-month development project. I'd imagine (with no basis in fact) that Nvidia has a larger GPU R&D outlay. If they weren't first with tech, as a specialist GPU-only vendor, I'd be surprised.
 
Probably, eventually. This isn't a 6-month development project. I'd imagine (with no basis in fact) that Nvidia has a larger GPU R&D outlay. If they weren't first with tech, as a specialist GPU-only vendor, I'd be surprised.
I guess we won't know. I mean, I'm sure they get wind of what Nvidia are working on before we do, so it's not like their R&D starts after Nvidia release. It just seems odd that they release an equivalent to Nvidia's stuff just after Nvidia do it. It's not like the order is all jumbled up, e.g. Nvidia releasing G-Sync and AMD responding with FSR. FreeSync came after G-Sync, and FSR came after DLSS, etc.
Maybe it is just a massive coincidence that they were working on things in the same order (more or less).
 
Didn't Nvidia do a bit of a corporate reorientation towards AI years before the 2000 series even launched? It's easier to follow than to lead, but it's no surprise AMD are taking a while to catch up on that front.
 
My issue with technologies like this is that they just cover up hardware deficiencies, particularly with ray tracing cores.

Unless rasterization has reached perfection in terms of visual quality and performance, DLSS 2/3 will still be useful outside of ray tracing.
So... Nvidia features are pushing prices up? Seems that way, though I wouldn't necessarily use the word gimmick.

AMD providing features at a lower cost, but at a later date? Falls in line with the 'fine wine' idea.

Seems it's all as normal, just how things have been for years, with different tech at the centre. People are paying for the privilege of being at the bleeding edge, same as in any industry. The only question is one of value: is it worth 100-200 pounds (on top of an extremely high base price) to have a blurry image of Spiderman's **** generated 12 months before AMD get a slightly blurrier image of Spiderman's ****?

Not for me.

Is "fine wine" where the current product offers good performance at a good price. R290/x were about as fast as nVIDIA's $1000 Titan at half the price. They were a good, sensible buy without hoping for some future performance improvements. "Fine wine" came later through drivers and was an unexpected surprise. Those cards managed to keep up with nVIDIA's next generation up to some point.

However, that was only because AMD had poor drivers to start with, if that much performance was left on the table; and, in a way, both HW and SW were bad, since only through Mantle could they tap into that performance efficiently. If you want beta/incomplete drivers, then yes, it is "fine wine" and a prayer that it will actually happen and that you'll get some features working properly down the line (better not play your favorite games until then!). I'm done with that. If I'm happy with what I get now, fine; if not, there's always the competition or waiting for some future offers.

BTW, the 7900 XTX is about the same price as a 4080, so you're not saving anything by going with AMD. You have to like it strictly on its performance and features.
 
However, that was only because AMD had poor drivers to start with, if that much performance was left on the table; and, in a way, both HW and SW were bad, since only through Mantle could they tap into that performance efficiently. If you want beta/incomplete drivers, then yes, it is "fine wine" and a prayer that it will actually happen and that you'll get some features working properly down the line. I'm done with that. If I'm happy with what I get now, fine; if not, there's always the competition or waiting for some future offers.
Fair points. Just want to state that I'm not necessarily a believer in the fine wine idea (I haven't followed AMD long enough to comment), but it is an oft-repeated phrase.

You DEFINITELY should only pay for what a product offers at the time, IF you are happy with the price. My main issue is that people seem to base the future of GPUs on hope rather than on what experience has shown us. If you want features/gimmicks NOW, there is a premium to pay. If you're OK with waiting, you can make a long-term saving... but there is a gamble. If you're just after raster performance, the option is there for a cheaper, similar experience. Currently, it's what I'd recommend, with the heavy caveat that it depends on specific game or streaming requirements.

Ultimately, they are all too expensive at the moment in my opinion. But if the business is good enough for the producers? It's the right price.

And that is the part that truly sucks.
 