What do you think of the 4070Ti?

Soldato
Joined
8 Nov 2006
Posts
23,018
Location
London
If it sold so well, why did the 3080 come out at £650? And please do tell me the performance uplift of the 3080 over the 2080 Ti.

Wasn't the 3070 pretty much on par with the 2080 Ti while costing £469?

I mean, you are smart. You know why. What dictates prices? Is it just performance and costs?

Who else might have a say in what Nvidia might be able to sell at?
 
Soldato
Joined
8 Nov 2006
Posts
23,018
Location
London
No, Moore's law.

My emphasis.

To more directly answer your "IC manufacturing says a 3070 or 2080 Ti owner can upgrade for free" question: no, because neither you nor I said anything about upgrading for free. You said...

And I replied by saying that you do: Moore's law and the last 50 years of IC manufacturing have shown that as density increases, the cost per transistor decreases. That lets you put more transistors into the same area, increasing performance for a negligible cost increase, or put the same number of transistors into a smaller area and decrease the cost.

E.g. when the 4004 was released it cost something like $400; if you were to make it today it probably wouldn't even cost $4. Or you could spend that same $400 and get a 10,000x or more increase in performance.
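
A toy back-of-envelope of that cost-per-transistor argument, with made-up numbers rather than real foundry pricing (a sketch only, assuming density doubles per node while the cost of a fixed die area stays flat):

    # Toy model, hypothetical numbers throughout: density doubles per node,
    # the cost of a fixed slice of die area stays flat, so the cost per
    # transistor halves each node.
    area_cost = 100.0          # assumed cost of a fixed die area, per node
    transistors = 1_000_000    # transistors fitting in that area at node 0

    for node in range(5):
        print(f"node {node}: {transistors:>10,} transistors, "
              f"${area_cost / transistors:.8f} per transistor")
        transistors *= 2       # double the density, same area, same assumed cost

Same budget, roughly twice the transistors each node - which is the "more performance for the same money, or the same performance for less" point being made above.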

So you decided to ignore my post and spend all this time arguing one sentence out of context. :cry:

Classic internet forum behaviour.

I'll repeat my post :cool:

If you have a 3070 and need better performance, you 'have to buy' a newer card. If you have a 2080 Ti and want better performance, you need to buy something.

You don't get performance for free.

If your logic is you don't need to buy something, then the price is completely irrelevant. You shouldn't be buying a new card which will then immediately depreciate if your current card was fine.

I was talking about a 3070 or 2080 Ti owner upgrading. Stop changing what I said.
 
Soldato
Joined
16 Sep 2018
Posts
12,728
What on earth, I literally addressed your key point. You said...
So last 50 years of IC manufacturing says a 3070 or 2080 TI owner can upgrade for free.
Did what I posted not address that point? When I told you that neither you nor I said anything about upgrading for free, that you said free performance, did that not address that point?

E: Oh, and before you say it, the reason I didn't address that specific point is because it's you moving the goalposts after someone took issue with what you said. That's you saying you don't get free performance and then, when someone proves you wrong, changing it to free upgrades rather than admitting you were wrong.
 
Soldato
Joined
28 Oct 2009
Posts
5,661
Location
Earth
Where did you get the following from: "The 2080 Ti for example was $1200 for FE cards which set a new ceiling. They sold pretty well, even for AIB cards which rarely sold close to the $999 MSRP."

If I search back, it all reads as Turing sales being poor - unless only the 2080 Ti sold well?
 
Soldato
Joined
28 Dec 2002
Posts
4,545
Location
West Midlands, UK
But this has nothing to do with my conversation with Twinz. We were talking about why people are buying cards, not why you don't want to.

If you need more performance then you will need to upgrade. The price that is set will appeal to some people and not to others.

If every CPU was £1000 from now on each gen, would you never upgrade? Keep your current one running for 10 years?

If CPUs started at £1000 from now on, I'd keep my current rig until it died and then quit PC for good.

Similarly, if these current price trends continue each generation from now on, it will stop me gaming on my PC once my hardware fails.

I've already started to diversify my hobbies away from PC and gaming in general because of the way things currently are.
 
Soldato
Joined
14 Aug 2009
Posts
2,946
In the end it sort of does, sadly. Because Nvidia will now only optimise any RT in newer games if that feature is active in hardware, hence reducing relative performance in future games on Ampere. They won't be bothered optimising for Ampere and Turing now. Even the tweet says "more efficient usage of shaders", meaning Ampere will need more driver work on a generation Nvidia won't be making money on. If it were simply a case of Ampere pushing ahead due to more RT resources it would be different, as the work could scale backwards.

For example, the optimisation work for Ampere most likely scaled back to Turing because AFAIK the feature set was pretty much the same.

But it won't, reading those tweets. This is my concern with RT: it won't be just the general increase in RT resources each generation, but also these new features popping up, which will make the performance gap grow beyond actual RT resources. It's quite clear you really need to buy at the start of each generation if you want to get the most out of RT performance.

I am talking about future-proofing of RT performance. It really looks like you need to buy at the start of each generation. It's one thing for RT performance to scale with additional hardware resources; it's another when the newer generation introduces new hardware features which make software optimisation easier on the newer generation. The issue with that is anyone who, say, buys a new card 12 months into a generation is only going to have 12 months of good optimisations, until the next generation changes something in the design, which does the same.
I don't know what the future will hold, but what I can say is this: in February it will be 4 years since I bought my RTX 2080, and in March about 4.5 years since its launch.

Gave it a spin in Cyberpunk 2077, as it's a bit heavier on RT. Some custom/tuned settings, RT maxed (including Lighting on Psycho):

1080p with DLSS Performance: 74fps

Surround (5780x1080), DLSS Performance, maxed RT: 35fps
Surround (5780x1080), DLSS Performance, RT Lighting on Medium (the rest of RT still on): 42fps
Surround (5780x1080), DLSS Ultra Performance, RT Lighting on Medium (the rest of RT still on): 61fps
1440p is probably still a 60fps experience with all RT settings enabled (although maybe Lighting only on Ultra or Medium) with DLSS Performance, which I'd say is great after 4+ years, and we're already two generations on from the 2xxx series.

Ultra Performance is not a looker atm, but: https://www.nvidia.com/en-us/geforce/news/gfecnt/20231/ces-2023-rtx-dlss-game-updates/

DLSS continues to improve through ongoing training on NVIDIA’s AI Supercomputer, and a new update delivers improvements to DLSS Super Resolution, NVIDIA DLAA, and DLSS Frame Generation. Since DLSS 3 games are backwards compatible with DLSS 2 technology, all GeForce RTX gamers will also benefit.


Upcoming DLSS improvements include:


  • AI network enhancements for DLSS Frame Generation that better take advantage of game engine data, improving UI stability and image quality during fast movement.
  • An updated AI network for DLSS Super Resolution Ultra Performance mode, improving fine detail stability and overall image quality.
  • An updated AI network for NVIDIA DLAA that improves image quality, reduces ghosting, and improves edge smoothness in high contrast scenarios.

Cyberpunk 2077, The Witcher 3: Wild Hunt, and Portal with RTX will be the first to update with these new features in upcoming game patches.

If this comes to fruition it will be beyond "fine wine", and whatever space the DLSS transistors take is well worth it. And they called Turing a turd! :p


So based on that experience, yeah, there's a decent chance the support will follow, but for how long is anyone's guess.
 
Soldato
Joined
9 Nov 2009
Posts
24,982
Location
Planet Earth

This is the problem when you don't follow the chain of the conversation. Ampere is basically Turing on steroids, so optimisations for Ampere help Turing, just like Pascal was Maxwell on steroids. Turing was the new-generation design, with Ampere being the optimisation. It wouldn't surprise me if Ampere was a stop-gap design because Nvidia couldn't get enough TSMC 7nm volume and so had to use Samsung 8nm, which is a 10nm-class node. AMD had a stroke of luck IMHO with RDNA2 being on a better node.

Ada Lovelace has RT cores that are significantly changed over Ampere/Turing and introduces SER, which makes it more efficient to code for and which is locked out of Ampere/Turing. People are buying Ampere right now and it's at the end of its optimisation run. So is Turing.
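
(Not Nvidia's actual API, just a rough Python illustration of the idea behind that kind of reordering, on the assumption that shading cost tracks how many distinct hit/material types end up in each warp-sized group:)

    import random

    # Toy model, not real SER: assume each group of 32 rays pays one
    # divergent shading branch per distinct material it contains.
    random.seed(0)
    WARP = 32
    rays = [random.randrange(8) for _ in range(4096)]  # 8 hypothetical material IDs

    def divergent_branches(order):
        groups = (order[i:i + WARP] for i in range(0, len(order), WARP))
        return sum(len(set(g)) for g in groups)

    print("unsorted: ", divergent_branches(rays))          # incoherent groups
    print("reordered:", divergent_branches(sorted(rays)))  # mostly uniform groups

The gap between the two counts is the sort of divergence the reordering is meant to claw back; how much of that shows up in real frame rates is a separate question.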

What do you think is going to happen once optimisations start mostly for Ada Lovelace? The difference in RT performance between Ada Lovelace and Ampere will grow worse and worse. I literally said that in the message you quoted, so you basically wasted your time trying to prove a point which wasn't being made. The same goes for the latest version of DLSS, locked out of Ampere, so Ampere is going to fall behind more and more, and probably not in the way Turing did when compared to Ampere.

This is why I was pointing out that you will ideally need to buy these cards at the start of each generation to maximise effective lifespan if you want RT-heavy effects. It's not just more hardware resources, but new features which will influence performance.

I am on Ampere BTW and I run games at QHD. That RTX 2080 was a £700+ dGPU. You are literally running it at 1080p/QHD, using the worst upscaling/image reconstruction setting Nvidia has. That means the internal resolution is very low - I tried it on my RTX 3060 Ti and I wouldn't touch DLSS Performance myself.

I have an RTX 3060 Ti, which is meant to be better at RT than your RTX 2080. I ran it with and without image reconstruction/upscaling at QHD - there are some very intensive areas which hammer dGPUs. With everything on Ultra the performance without DLSS was horrible. I had to use DLSS Quality to get playable framerates, and performance can still be problematic in areas, so I basically just kept reflections on as the main setting. That was a year ago - I can give it a try with the latest drivers to see if things have got better, if you are interested?

But this is also why I am a bit concerned with the upselling Nvidia and AMD are doing this generation. Both the "RTX 4070 Ti" and "RX 7900 XT" should be below £600.

Nvidia is cynically not reducing Ampere dGPU pricing either, by jacking up Ada Lovelace pricing, and AMD is joining in, so most Nvidia dGPUs bought now are probably Ampere based! Lots of people buy Nvidia dGPUs probably because of RT and are still paying RRP or above RRP for them (at least AMD cratered the price of many of their dGPUs). But Ampere is essentially at the end of its optimisation lifespan over the next year or so.

The mainstream dGPUs this generation look like they will use dies pushed up at least one tier, or worse! Whether it's relative RT performance or relative rasterised performance, both Nvidia and AMD are giving mainstream buyers relatively less than last generation.
 
Soldato
Joined
14 Aug 2009
Posts
2,946
The quote from Nvidia was meant to show that all RTX cards are supported where possible, and that they won't necessarily drop the improvements as you suggested. Whether DLSS 3 could run well on anything other than Ada we just don't know, just as we don't know why Turing didn't receive the upscaling for video playback - aka DLSS for video.
Moreover, the test was to show that the card can still perform decently after all these years and hasn't been gimped in preparation for Ada's launch.

SER, from my understanding, is a specific optimization for Ada. It doesn't mean you don't optimize your game, just that you can do a bit extra for a very THIN slice of potential gamers. For that small number of potential customers, I don't see a lot of studios lining up to show the middle finger to the rest of the market.

Performance mode is not the weakest, that's Ultra Performance. Performance is decent now compared to where it was at launch, and I think it's quite a bit better than Quality mode was in the Crysis remaster. :))
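
For reference, a quick sketch of the internal render resolutions behind those modes, using the commonly cited per-axis scale factors (roughly 0.67 Quality, 0.58 Balanced, 0.50 Performance, 0.33 Ultra Performance - treat the exact figures as approximate), at the output resolutions from the tests above:

    # Approximate DLSS per-axis scale factors; exact values may differ by version.
    modes = {"Quality": 0.667, "Balanced": 0.58,
             "Performance": 0.50, "Ultra Performance": 0.333}
    outputs = {"1080p": (1920, 1080), "1440p": (2560, 1440),
               "Surround": (5780, 1080)}

    for out_name, (w, h) in outputs.items():
        for mode, scale in modes.items():
            print(f"{out_name:9} {mode:17}: "
                  f"~{round(w * scale)}x{round(h * scale)} internal")

So Performance at 1440p is rendering roughly 1280x720 internally, and Ultra Performance drops to roughly a third of the output resolution per axis, which is where the "internal resolution is very low" point above comes from.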
 
Soldato
Joined
18 Oct 2002
Posts
4,343
Great thread about the 4070ti
To be fair, the thread has not gone off topic that much; people are giving their 50p on what they think of it.

The bickering earlier in the thread was akin to two away supporters walking into the home-end boozer after the game on match day and describing to everyone how their team beat the crap out of everyone else when it was in fact a 0-0.
 
Associate
Joined
19 Sep 2022
Posts
630
Location
Pyongyang
So looks smoother but feels slower. That was my finding with frame gen on my 4080 in Witcher 3 and MSFS.
I could see noticeable stutter with frame gen off in Witcher 3, so it seems to be a tradeoff of sorts. I am using an old processor though (5800X3D).
But I'm still interested in knowing the 4070 Ti experience.
I haven't actually ever played an MSFS game, so it's going to be a completely new experience for me.
 
Soldato
Joined
16 Sep 2018
Posts
12,728
yeah we need a separate thread for 4070 ti owners or something
Can't tell if you're being intentionally ironic or not. :)

Thinking a thread asking people what they think of the 4070 Ti is, or should be, about people who own them is pretty funny when Nvidia mistook it for a 4080. :D It's about as funny as reading posts moaning about there being no discussion of the 4070 Ti when that's pretty much all it's been for 80+ pages.
 