RTX 4070 12GB, is it Worth it?

Before that, the equivalent of the Ti cards were the Ultras - though the GeForce 3 and 4 series did have Ti cards, including top-end cards like the GeForce 3 Ti 500.
True, but the Ti name was really used on the lower-end cards and, as you said, the Ultra cards were the top cards, normally with an extra overclock and maybe a little more VRAM. The chip in most Ultras was the same full chip, just a better bin that would overclock higher, with no extra cores/shaders. That's why Ultra cards also got a bad reputation: they were normally the same hardware with a factory overclock that most people could match by overclocking the cheaper card themselves.

It's actually funny how Nvidia has not used the Ultra name on cards since. The joke back then was that the top-end Ultra cards were an "Ultra rip-off"; I even remember Tom's Hardware, the review sites and the PC magazines of the time telling people to avoid them and get the card one tier down for way less. Again, it was Nvidia testing the market to see what people would pay for the so-called best. They have played this game many times as we know; the Titan cards are a perfect example of that.
 
Yep, the Titans were the beginning of the end but people couldn't see it, and here we are. Now maintaining full e-peen tumescence costs them 3-4x what it used to.
 
The whole Nvidia naming scheme has become a real mess and I really dislike the 90 class as we have it now. The 90 class was always used for dual-chip top-end single cards, not what we have now; today's single-chip 90 class cards are really the 80 class of before. Just Nvidia being Nvidia and trying to fool people with names. The 90 class made you go "wow, fast and expensive and lots of power needed" - well, that's still true now, but they are missing a whole GPU chip on the card, so it's a fake 90 class compared with what it originally was.

Anyway, anyone that's been into tech and GPUs (graphics cards/video cards) knows all this, but Nvidia is hoping to brainwash people or fool the new crowd with names. This is why I don't care what they call them; I look at the size of the chip and whether it is fully enabled or cut down. Nvidia can play their name games all day long...

Even the Titans stopped being real Titans in the end, as they were no different from the top Ti card, which only had half the VRAM. The original Titans had uncrippled double precision, which was their main selling point, but Nvidia quietly crippled that and was of course just selling you the same chip as the gaming cards... but "Titans were for work", they said... :rolleyes: Always read the full specs and what you are really paying for, because the names GPUs carry mean nothing with them anymore. GTX 680 anyone? More like a GTX 670 sold as an 80, because AMD had nothing to compete with even the 70 class chip at the time.

Honestly, out of all the tech companies, Nvidia annoys me the most with their deliberate naming games, followed by Intel when they also feel they have the market cornered and stagnate the whole CPU market, as they did with the never-ending four-core CPUs.

This is why I never became a fanboy of any company: they're all at it when they think they can get away with it.
 
GTX 680 anyone? More like a GTX 670 sold as an 80...

Kepler is really where this all started - the 680 was very much a mid-range spec sold as if it were a high-end part, and the 700 series showed the truth of it. The 4070 Ti really is an x60 Ti sold at twice the price it should be, even with today's crazy prices, and any x70 is realistically going to be an x50 class card relative to the 4090 - but people will be paying way over what should be x50 class prices.
 
My decision to buy the card will be based almost entirely on whether the FE price is £600 or lower, assuming that the performance is at least as good as an RTX 3080.

The card will have DLSS 3 / frame generation capability, but if it isn't as fast as an RTX 3080, I would consider that to be a regression in performance.

It's quite possible that they won't produce many RTX 4070 FE cards (or they may cancel the FE version); if that happens, I don't think there's much chance of this card being affordable.
 
This seems like a more plausible price for the RTX 4070:

I think I will only buy one though if it's at least as powerful as an RTX 3080.

A shader count of 5888 suggests it may not be, but I'm not sure they've got that right.
 
Yeah...

RTX 3080: 8704 shaders at 1710 MHz, mem bandwidth 760 GB/s
RTX 3070Ti: 6144 shaders at 1770 MHz, mem bandwidth 608 GB/s
RTX 4070: 5888 shaders at 2475 MHz, mem bandwidth 504 GB/s

RTX 4070Ti: 7680 shaders at 2610 MHz, mem bandwidth 504 GB/s

The 4070Ti is 17% faster than the 3080.
The 4070Ti has 30% more shaders than the 4070.
The 3080 is 22% faster than the 3070Ti.

IMO it will be 10% faster than the 3070Ti for 3070Ti money; that way, tech tubers can say "it's better value than the 3070Ti".
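
As a rough sanity check on those figures, here's a minimal back-of-the-envelope sketch (my own, not from any review) that compares the cards purely on shader count x boost clock, using the specs listed above. It's a crude proxy and the assumption is loudly stated in the code: it ignores memory bandwidth, cache and architectural differences, which is exactly why the 4070 can look roughly level with the 3080 on paper here while probably landing a bit below it in real raster benchmarks.

```python
# Back-of-the-envelope comparison using the specs quoted above.
# Assumption: raster performance scales roughly with shaders x boost clock.
# It doesn't really - bandwidth, cache and architecture all matter - so
# treat the output as a sketch, not a prediction.

cards = {
    "RTX 3080":   {"shaders": 8704, "clock_mhz": 1710, "bw_gb_s": 760},
    "RTX 3070Ti": {"shaders": 6144, "clock_mhz": 1770, "bw_gb_s": 608},
    "RTX 4070":   {"shaders": 5888, "clock_mhz": 2475, "bw_gb_s": 504},
    "RTX 4070Ti": {"shaders": 7680, "clock_mhz": 2610, "bw_gb_s": 504},
}

def throughput(card: dict) -> float:
    """Crude compute proxy: shader count multiplied by boost clock."""
    return card["shaders"] * card["clock_mhz"]

def relative(a: str, b: str) -> float:
    """Percentage by which card a's compute proxy exceeds card b's."""
    return (throughput(cards[a]) / throughput(cards[b]) - 1) * 100

for a, b in [("RTX 4070", "RTX 3070Ti"),
             ("RTX 4070", "RTX 3080"),
             ("RTX 4070Ti", "RTX 4070")]:
    print(f"{a} vs {b}: {relative(a, b):+.0f}% (shaders x clock only)")
```

Run as-is it prints roughly +34% over the 3070Ti, about -2% against the 3080 and +38% for the 4070Ti over the 4070 on raw shaders x clock; the big bandwidth drop (504 vs 760 GB/s) is the part this ignores.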
 
5888 shaders looks wrong to me, given that it would be the same number as what the RTX 3070 has, and Nvidia increased the shader count in previous generations (e.g. GTX 1070 > RTX 2070 > RTX 3070).
 
This seems like a more plausible price for the RTX 4070:

I think I will only buy one though if it's at least as powerful as an RTX 3080.

A shader count of 5888 suggests it may not be, but I'm not sure they've got that right.
If it's $600, then that would suggest the shader count is correct, as there needs to be a sizable performance gap to the $800 4070Ti; performance will probably end up similar to a 3080, maybe a bit slower in raster.
 
Yeah, I think it's right; it will sit between the 3070Ti and the 3080 for $100 less than the 3080, and reviewers will say it's good...

Job done.
 
4070 at only £600 - still not sure this is the right price; it appears to be too cut down and may be too "meh" to be worth it.
Will have to wait for the benchmarks.
 
5888 shaders looks wrong to me, given that it would be the same number as what the RTX 3070 has, and Nvidia increased the shader count in previous generations (e.g. GTX 1070 > RTX 2070 > RTX 3070).

Most likely correct, given that in the mobile space the 4070 has fewer shader units than the 3070 (46xx vs 5120). The mobile 4070 is likely what the 4060Ti will end up being... Great, considering the 4070(M) is barely quicker than the 3070Ti laptop GPU, which itself roughly matches a desktop 3070...

Now, I hope this is wrong, but given what we have seen so far for anything below the 4090, it wouldn't surprise me at all.
 