
What do you think of the 4070Ti?

  • Thread starter: Deleted member 251651
Status
Not open for further replies.
I don't think they're evenly matched in features, though, but talking about it only gets you murdered in this forum. I understand it's speculative to talk about this right now; however, what I read in the Ada whitepaper looks promising.

It gets you murdered because it's mostly marketing BS. You even said it yourself earlier: you should not buy a GPU on potential.

RT, DLSS, G-Sync, PhysX, etc. The truth is that, as it sits right now, either an Nvidia or AMD GPU will give a largely similar experience for the majority of gaming needs, and the "features" are largely marketing fluff.
 
Go look up the meaning of the word hyperbole, or exaggeration, or misrepresentation.
  • The 4070Ti is not miles better based on objective reviews and research. It is very evenly matched overall in both price/perf and features.
  • The 4070Ti is only up to 15% more expensive IF you cherry-pick your comparison. The actual price difference is less than ~7% here in the UK.
  • It doesn't matter who started quoting OCUK; you referenced OCUK in your price comparison but decided to compare the cheapest 4070Ti to a mid-priced 7900 XT.
  • You have been purposely posting that the 4070Ti is a far superior and far cheaper GPU, yet everyone else with an actual working brain can see your posts as pure nonsense based on biased data.
Put down the shovel.
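The "up to 15% vs under ~7%" disagreement is just a question of which listings you plug into the same formula. A minimal sketch (the two prices below are illustrative placeholders, not actual OCUK listings):

```python
def premium_pct(price_a: float, price_b: float) -> float:
    """Percentage premium of price_a over price_b."""
    return (price_a - price_b) / price_b * 100

# Illustrative placeholder prices, not real listings:
cheapest_4070ti = 799
cheapest_7900xt = 749

print(f"{premium_pct(cheapest_4070ti, cheapest_7900xt):.1f}%")  # prints 6.7%
```

Swap in the cheapest card on one side and a mid-priced card on the other and the same formula happily produces a much bigger number, which is the cherry-picking being complained about.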

The unlaunched/relaunched card is: a 60-class die, +£350 on the Ampere MSRP, and not much better than last gen's 80-class (which was £200 cheaper). Unless you're on a 2070/2080 or older, it is offering very little, which is why people are annoyed. Regardless of how apologist people want to be, the prices should be no more than 10% over last gen, because let's face it, there are no miners and scalpers to blame at this point. Pure greed that needs a sturdy wake-up slap.
 
It gets you murdered because it's mostly marketing BS. You even said it yourself earlier: you should not buy a GPU on potential.

RT, DLSS, G-Sync, PhysX, etc. The truth is that, as it sits right now, either an Nvidia or AMD GPU will give a largely similar experience for the majority of gaming needs, and the "features" are largely marketing fluff.

Replying to brostrodamus is just going down a rabbit hole of denial.

Edited for efficiency on another.
 
The unlaunched/relaunched card is: a 60-class die, +£350 on the Ampere MSRP, and not much better than last gen's 80-class (which was £200 cheaper). Unless you're on a 2070/2080 or older, it is offering very little, which is why people are annoyed. Regardless of how apologist people want to be, the prices should be no more than 10% over last gen, because let's face it, there are no miners and scalpers to blame at this point. Pure greed that needs a sturdy wake-up slap.
It's a 104 die, so similar to the 3070/3060 core.
 
It gets you murdered because it's mostly marketing BS. You even said it yourself earlier: you should not buy a GPU on potential.

RT, DLSS, G-Sync, PhysX, etc. The truth is that, as it sits right now, either an Nvidia or AMD GPU will give a largely similar experience for the majority of gaming needs, and the "features" are largely marketing fluff.

We've been conditioned by marketing into thinking 4K high refresh is essential.

I originally completed Cyberpunk 2077 with a Vega 64 @ 1440p, 45 fps and FSR (or whatever upscaling it was back then).

I have since had different playthroughs with a 3070, high-ish RT and DLSS Balanced, getting near 60 fps @ 1440p.

I wouldn't say the experience was significantly different despite the generational leap in GPU.

Both cards are excellent technically, and it is great we have a choice (Intel 14nm+++++ 5% improvement, anyone?). With the current economic climate, inflation etc., I think they are both overpriced and realistically should have been closer to the 3080 MSRP.
 
But you can sure see some glimpses of what the tech (DLSS 3) is capable of, achieving 106 fps in MSFS at 4K on a 4070 Ti. I don't have the card to sample the experience first-hand, but it should be a good test to ask around how the DLSS 3 experience in MSFS has been.
Nvidia has also implemented a reordering pipeline for ray traversal, which they call SER (Shader Execution Reordering), and it looks highly scalable.
There's potential here, but not much meat in terms of the number of games available that can leverage all that, and somehow the card is performing better on the DX12 Ultimate benchmark, Speed Way.
 
These are the only winners here. End of. :p
 
It gets you murdered because it's mostly marketing BS. You even said it yourself earlier: you should not buy a GPU on potential.

RT, DLSS, G-Sync, PhysX, etc. The truth is that, as it sits right now, either an Nvidia or AMD GPU will give a largely similar experience for the majority of gaming needs, and the "features" are largely marketing fluff.
That's the magic: AMD released a card so terrible that even people who don't care about RT, DLSS, G-Sync, PhysX etc. shouldn't buy it.
 
That's the magic: AMD released a card so terrible that even people who don't care about RT, DLSS, G-Sync, PhysX etc. shouldn't buy it.

I agree AMD's card is terrible, but Nvidia's card is just as terrible. Trying to make it sound any different is frankly just laughable.
 
But you can sure see some glimpses of what the tech (DLSS 3) is capable of, achieving 106 fps in MSFS at 4K on a 4070 Ti. I don't have the card to sample the experience first-hand, but it should be a good test to ask around how the DLSS 3 experience in MSFS has been.
Nvidia has also implemented a reordering pipeline for ray traversal, which they call SER (Shader Execution Reordering), and it looks highly scalable.
There's potential here, but not much meat in terms of the number of games available that can leverage all that, and somehow the card is performing better on the DX12 Ultimate benchmark, Speed Way.

I have tried the frame-generation lark in MSFS and Witcher 3. It does make things smoother, but laggier, with a weird sluggish feel, so I ended up turning it off in both games and had a better experience. 70 fps in MSFS at 4K with low latency on felt better than over 100 fps with frame generation on.
 
It's a 104 die, so similar to the 3070/3060 core.
It's actually worse than that. Last generation Nvidia made a GA103, but it only had a limited release in laptops.

So for desktop it was GA102 > GA104 > GA106.

This generation it is:

AD102 > AD103 > AD104.

So if we look at the relative tiers this generation:

RTX 3090/RTX 3090 Ti (GA102 @ 628 mm²) = RTX 4090 (AD102 @ 608 mm²)

Last generation the RTX 3080 used a cut-down GA102, so no real equivalent exists this generation yet.

RTX 4080 (AD103 @ 379 mm²) = RTX 3070 Ti/RTX 3070 (GA104 @ 392 mm²)

The RTX 4080 has 59.4% of the shaders and 71.1% of the memory bandwidth of the RTX 4090, with 2/3 the VRAM. The RTX 3070 Ti was 57.4% of the shaders and 60.3% of the memory bandwidth of the RTX 3090 Ti, with 1/3 the VRAM.

The RTX 3080 was 80.9% of the shaders and 75.4% of the memory bandwidth of the RTX 3090 Ti, with half the VRAM of the RTX 3090 Ti. So technically the RTX 4080 is in between an RTX 3070 Ti and an RTX 3080 in terms of Ampere positioning.

RTX 4070 Ti (AD104 @ 295 mm²) = RTX 3060 Ti/RTX 3060 (GA104 @ 392 mm² and GA106 @ 276 mm²)

The RTX 4070 Ti has 46.9% of the shaders and 50% of the memory bandwidth of an RTX 4090, with half the VRAM. The RTX 3060 Ti was 45.2% of the shaders of the RTX 3090 Ti (46.3% of an RTX 3090) and 44.4% of the memory bandwidth (60.3% for the RTX 3060 Ti GDDR6X version), with 1/3 the VRAM of the RTX 3090 Ti.

The RTX 3070 has 54.2% of the shaders of the RTX 3090 Ti and the same bandwidth and VRAM figures as the RTX 3060 Ti.

Even in terms of pure performance at 1440p, the RTX 3060 Ti and RTX 3070 were 60% and 69% of the performance of the RTX 3090 Ti, and 64% and 74% of an RTX 3090.

The RTX 4070 Ti is 71% of an RTX 4090 FE at 1440p.

So the RTX 4070 Ti, on pure specs alone, is more like a pre-overclocked RTX 3060 Ti GDDR6X with some more VRAM added on (with the die size of an RTX 3060), and with performance closer to an RTX 3070 (or an overclocked RTX 3060 Ti).

So even being charitable to Nvidia, this shouldn't cost more than an RTX 3070 FE for the cheapest models, i.e. under £500. Even then, the relative die size is significantly smaller, and GDDR6/GDDR6X prices have probably come down a lot too.

All the people making useless excuses like "specs don't matter" and so on: even if we look at relative performance from the TPU charts, they can't spin how overpriced this generation is so far (even with AMD). Relative specs compared to the top end are a good indicator of relative performance and positioning. Now the reason Ada Lovelace has huge coolers is obvious: Nvidia is clocking these as high as it possibly can, to eke out every few percent of performance to justify the price increase. Nvidia conservatively clocked Ampere dGPUs to make sure power didn't go too high on Samsung 8nm.
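The relative-tier percentages above fall straight out of Nvidia's published shader counts and memory bandwidths. A minimal sketch that reproduces the Ada figures quoted in the post (shader counts and GB/s as published at launch):

```python
# Published launch specs: (CUDA shader count, memory bandwidth in GB/s).
specs = {
    "RTX 4090":    (16384, 1008.0),
    "RTX 4080":    (9728,  716.8),
    "RTX 4070 Ti": (7680,  504.0),
}

top_shaders, top_bw = specs["RTX 4090"]
for name, (shaders, bw) in specs.items():
    print(f"{name}: {shaders / top_shaders:.1%} of the shaders, "
          f"{bw / top_bw:.1%} of the bandwidth")
```

Running this gives 59.4%/71.1% for the RTX 4080 and 46.9%/50.0% for the RTX 4070 Ti, matching the figures argued above; the same arithmetic against the Ampere line-up gives the last-gen percentages.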
 