What do you think of the 4070Ti?

  • Thread starter: Deleted member 251651
Status: Not open for further replies.
OMG, I stopped reading that site years ago.

Shill central, not just for Nvidia... but everyone.

They never, ever call out a poor product. Everything they write has one objective in mind: to keep the free stuff flowing.

TechPowerUp are the same.
Hah. Someone here mentioned a few gens back that they would "congratulate a turd for having a racing stripe" if it came free to them.

We need more quality text-based reviewers. Most of what's left I wouldn't trust to mow my lawn. We need Anand and Kyle and Tom back, circa 2008.
 
Gamers Nexus was pretty scathing, but I dunno, I don't watch the channel often, and he was (in my opinion) way too negative about the AMD 5800X at launch, and his opinion went against his own benchmarks.

I agree with his comments on the 3070 Ti, though it's clear they have their own agenda.
 
It's very difficult to tell how good the 4070 Ti (and all 4xxx cards) really are compared to previous generations because of DLSS3. Nvidia are enabling DLSS3 in their marketing benchmarks, which invalidates the comparison: it is impossible to tell how much of the performance improvement is due to better silicon and how much is down to DLSS3.
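A minimal sketch of how one might back the DLSS3 inflation out of a marketing number, assuming frame generation inserts exactly one AI frame per rendered frame and ignoring its overhead (the fps figures below are invented, not from any slide):

```python
def silicon_uplift(old_fps: float, new_fps_with_fg: float) -> float:
    """Estimate the raw rendering uplift hidden inside an FG-on benchmark."""
    rendered_fps = new_fps_with_fg / 2  # strip the generated frames (1:1 assumption)
    return rendered_fps / old_fps

# e.g. a slide claiming "last gen: 60 fps -> this gen: 150 fps (DLSS3 on)"
print(f"raw uplift: {silicon_uplift(60, 150):.2f}x")  # ~1.25x, not 2.5x
```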

I think this is in preparation for the future. They are approaching a physical limit on how much more powerful they can make the hardware, so they know full well that in the future they are going to have to sell hardware based on the performance of the firmware. The way they are marketing DLSS3 seems like the first move towards that. Oh, and you can't have DLSS3 unless you buy new hardware! I expect that trend to continue, and marketing to shift much more towards features rather than fundamentally more powerful hardware.
 
Couldn't we get a cheaper version of the RTX 4070 TI, with GDDR6 VRAM instead?

If the RTX 3070 TI was anything to go by, there wasn't much difference in performance, and the power consumption would be lower.
 
I think people are right to discount DLSS3 when looking at this card, and certainly anything below it. The lower down the product stack, the less relevant DLSS3 surely is. Frame generation becomes more unpredictable the lower the frame rate, so on cards like the 4070, where you could be getting sub-40 fps rendered (at 4K), the "guess" on the intermediate frame will be less accurate, as more will have changed between rendered frames. We've all seen the mess you can get on the inserted frames when the scene changes completely between the rendered frames, and the lower the rendered frame rate, the more this will happen. In this regard, DLSS2's AI upscaling of rendered frames is a better solution imo.
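To put a number on that: the gap the interpolator has to guess across is simply the rendered frame time, which grows quickly as fps falls (plain arithmetic, assuming evenly spaced frames):

```python
# Gap between real frames that the interpolator must guess across.
for rendered_fps in (120, 60, 40, 30):
    gap_ms = 1000 / rendered_fps
    print(f"{rendered_fps:>3} fps rendered -> {gap_ms:4.1f} ms between real frames")
# 120 fps -> 8.3 ms, 30 fps -> 33.3 ms: four times more scene change to guess.
```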

You are completely correct. Tim from HWunboxed said that DLSS 3 worked really well when you had well over 100 fps, and did help with smoothness, but that it was really bad at anything around 60 fps and made the game worse. His vid was very informative and helped me forget about this marketing scam.
 
Does it matter? It will still be well overpriced.
Best case, it might be £100-£200 cheaper if there were a variant called RTX 4070 with GDDR6 instead.

The RTX 3070 TI was well overpriced for ages; even now it costs ~£100 more for a small (5-10%) performance benefit.
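As a back-of-envelope value check (both prices are hypothetical, and 7.5% is just the midpoint of the 5-10% range above):

```python
def price_per_perf(price_gbp: float, relative_perf: float) -> float:
    """Pounds paid per unit of relative performance; lower is better value."""
    return price_gbp / relative_perf

non_ti = price_per_perf(500, 1.000)  # hypothetical non-Ti card at baseline perf
ti     = price_per_perf(600, 1.075)  # +£100 for ~7.5% more performance
print(f"non-Ti: £{non_ti:.0f}/perf, Ti: £{ti:.0f}/perf")  # the Ti costs more per frame
```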

TIs are always overpriced in my opinion...
 
I think this is in preparation for the future. They are approaching a physical limit on how much more powerful they can make the hardware, so they know full well that in the future they are going to have to sell hardware based on the performance of the firmware.

I agree that there is very little gain to be had by further shrinking the lithography (e.g. going from an 8nm process to 5nm to 3nm, etc.) as we're reaching the limit of how small each transistor can be with current technology. There can still be absolute performance gains (by simply making the die larger: more transistors) but not so much in performance-per-watt (which depends on the node size, among other factors). That is, until inevitably a whole new technology comes along (e.g. quantum computing, or computing based around photons rather than electrons), but we're talking decades away for that.

So I'd expect to see performance, power consumption, and cost continue to rise over the next few years, but we are reaching the end of big improvements in terms of performance-per-watt as far as the silicon is concerned. Future advances in software/firmware may alleviate those limitations, as you suggest.
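A toy model of that silicon argument, assuming the standard dynamic-power relation (power scales with capacitance × voltage² × frequency) and treating throughput as units × clock; all numbers are illustrative only:

```python
def perf_per_watt(units: int, volts: float, freq_ghz: float) -> float:
    """Crude perf/watt under the dynamic-power model P ~ C * V^2 * f."""
    perf = units * freq_ghz               # throughput proxy: units x clock
    power = units * volts**2 * freq_ghz   # dynamic power, capacitance folded into units
    return perf / power                   # collapses to 1 / V^2

print(perf_per_watt(1000, 1.10, 2.0))  # baseline node
print(perf_per_watt(2000, 1.10, 2.0))  # bigger die, same voltage: no gain
print(perf_per_watt(1000, 0.90, 2.0))  # lower voltage at a new node: ~49% gain
```

In this model, adding transistors at the same voltage leaves perf-per-watt unchanged; it only improves when voltage drops, which is exactly what node shrinks are running out of.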

Also bear in mind that our goals and Nvidia's goals are not the same, and often oppose each other. We want faster hardware, better performance-per-watt and better value; all they want is more profit, and they don't particularly care how.
 
Couldn't we get a cheaper version of the RTX 4070 TI, with GDDR6 VRAM instead?

If the RTX 3070 TI was anything to go by, there wasn't much difference in performance, and the power consumption would be lower.

Yes you can: buy a 4080 laptop part and you have one. The problem is that because the 4070 Ti has the limited bus of a lower-tier card, it needs fast memory or it will lose performance in quite a few titles.
 
Yes you can: buy a 4080 laptop part and you have one. The problem is that because the 4070 Ti has the limited bus of a lower-tier card, it needs fast memory or it will lose performance in quite a few titles.
I thought memory bandwidth mattered less with the RTX 4000 series, due to the much larger L2 cache?

The 4070 Ti has 48 MB of L2 cache.
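For a rough sense of what the swap would give up in raw bandwidth: the 4070 Ti pairs its 192-bit bus with 21 Gbps GDDR6X, and 18 Gbps is assumed below as a typical GDDR6 speed (an assumption; module speeds vary):

```python
def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times per-pin rate."""
    return bus_width_bits / 8 * gbps_per_pin

gddr6x = bandwidth_gb_s(192, 21)  # ~504 GB/s, the shipping 4070 Ti
gddr6  = bandwidth_gb_s(192, 18)  # ~432 GB/s, roughly 14% less
print(f"GDDR6X: {gddr6x:.0f} GB/s, GDDR6: {gddr6:.0f} GB/s")
```

The large L2 can hide part of that ~14% deficit, but as noted above, not in every title.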
 
You are completely correct. Tim from HWunboxed said that DLSS 3 worked really well when you had well over 100 fps, and did help with smoothness, but that it was really bad at anything around 60 fps and made the game worse. His vid was very informative and helped me forget about this marketing scam.

Better not watch their latest thoughts on DLSS 3/FG :cry:


They came to the conclusion that it is subjective, but the general consensus seems to be that it's worth it if you're happy with your base fps, i.e. for most that will be 50/60 fps, for others 80/90+, and for some maybe even 30 fps, and only in games which aren't PvP FPS focussed. Most end-user comments (from people who aren't fanboys/loyal to any company) seem to find it worthwhile when they're just playing the game as they normally would. The funniest thing about Tim's original video review of FG was that he goes on slating it, but then proceeds to say "all in all, if you weren't slowing footage down or pausing it to pick out the fake frames... it is genuinely very hard to see the issues" :cry:

I have always found this fascinating when it comes to the hardware vs software debate and why you buy one product over another, going all the way back to when I was big into smartphones with the Google Nexus/Pixel vs the Samsung Galaxy: I always bought Pixel/Nexus for the software/Google pureness even though Samsung had far better hardware, and now I stick with Samsung because they have not only the best hardware but also the best software experience for Android too, imo. Heck, just look at Apple: their hardware isn't exactly top of the line, but the software and ecosystem are what people buy into, because they are very well polished; most people aren't buying iPhones for the hardware.

It's pretty obvious Nvidia are heavily investing in the ecosystem and software features to differentiate themselves further from the hardware rat race now (long gone are the days of it just being about hardware), and going by history, looking at other companies as well as what I do in my job, they are on the right path for themselves and for consumers. This is where AMD need to be careful; they have shown time and time again how they are always late to the party, and with inferior solutions for the first few months/years.
 
I agree that there is very little gain to be had by further shrinking the lithography (e.g. going from an 8nm process to 5nm to 3nm, etc.) as we're reaching the limit of how small each transistor can be with current technology. There can still be absolute performance gains (by simply making the die larger: more transistors) but not so much in performance-per-watt (which depends on the node size, among other factors). That is, until inevitably a whole new technology comes along (e.g. quantum computing, or computing based around photons rather than electrons), but we're talking decades away for that.

So I'd expect to see performance, power consumption, and cost continue to rise over the next few years, but we are reaching the end of big improvements in terms of performance-per-watt as far as the silicon is concerned. Future advances in software/firmware may alleviate those limitations, as you suggest.

Also bear in mind that our goals and Nvidia's goals are not the same, and often oppose each other. We want faster hardware, better performance-per-watt and better value; all they want is more profit, and they don't particularly care how.

Whilst in general you are correct, and performance-per-watt will eventually be very hard to improve, there are still huge improvements to be made, and some have been achieved in the recent past.

AMD's RX 5700 XT (RDNA) card was on TSMC 7nm, and when the 6000-series RDNA2 cards came out, also on 7nm, they massively improved performance-per-watt. AMD made huge claims about RDNA3 being 50% better in this regard; that was a flat-out lie, and the 7000 cards are no better than the 6000 cards here. Whilst there may be less to gain from node shrinks, there are still big gains to be made through architectural improvements.
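To make perf-per-watt comparisons like that concrete, it's just average fps over board power; the figures below are placeholders rather than measured results:

```python
def perf_per_watt(avg_fps: float, board_watts: float) -> float:
    """Review-style efficiency metric: frames per second per watt of board power."""
    return avg_fps / board_watts

rdna1 = perf_per_watt(60, 225)  # RX 5700 XT-class placeholder numbers
rdna2 = perf_per_watt(90, 230)  # same 7nm node, architectural gains only
print(f"uplift: {rdna2 / rdna1 - 1:.0%}")  # ~47% better perf/watt, no node shrink
```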
 
Good, shill video again.


Also, how do you see dislikes on YouTube? They removed the count a while back; are you using some browser add-on?
Started reading the written review earlier and gave up. Their analysis is usually good, but they just aren't willing to call a spade a spade.

And I can't be the only one who cringes every time I see middle-aged blokes say/type "Team Green" and "Team Red". They are billion-dollar companies making consumer products, not some sodding Avengers-esque wannabes.
 