RTX 4070 12GB, is it worth it?

£500 at best, yes, but Nvidia will not go that low.

They're still trying to flog VRAM-gimped, obsolete 3060 Tis and 3070s for £400 and £500+ respectively... unbelievable, really. I wonder why they're still stuck on shelves? It's laughable, it really is: cards that are over two years old and now obsolete are still at the MSRP from over two years ago, and actually ABOVE it in some cases...

:D
 
Even Nvidia admits VRAM is a problem for them - look at the last line.

At least they cut the price; the 4070 is $599.
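For context on why £500 was never realistic: US MSRPs exclude sales tax, while UK retail prices include 20% VAT. A quick back-of-envelope sketch (the exchange rate is my assumption, roughly 1.25 USD per GBP, so treat the result as approximate):

```python
# Rough sanity check on where a $599 US MSRP typically lands in the UK.
# Assumptions (mine, not from the thread): US MSRP excludes sales tax,
# UK retail includes 20% VAT, exchange rate ~1.25 USD per GBP.
usd_msrp = 599
usd_per_gbp = 1.25   # assumed rate; check current rates
vat = 0.20           # UK VAT

pre_tax_gbp = usd_msrp / usd_per_gbp
uk_price = pre_tax_gbp * (1 + vat)
print(f"~£{pre_tax_gbp:.0f} ex-VAT -> ~£{uk_price:.0f} inc. VAT")
# ~£479 ex-VAT -> ~£575 inc. VAT, which is why £500 was never likely.
```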


Nice made-up sources.

No way AIBs didn't already know what the BOM cost was gonna be for them.

Also, as if someone from Nvidia would disparage their own company like that to some YouTuber. The wording is clearly made up; no one talks like that.
 
Last edited:
So, let's see. This is TLOU at 720p ultra:
[screenshot: 720p.jpg]

This is A Plague Tale at 4K ultra:
[screenshot: 4k-ultra.jpg]

One of them needs 5.4GB of VRAM and looks gorgeous; the other one requires 10-11 (?) and looks like crap. Clearly you need to buy GPUs with more VRAM so you'd end up with worse graphics regardless.
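If anyone wants to sanity-check these numbers themselves rather than trusting an in-game overlay, here's a minimal sketch that polls nvidia-smi (ships with the NVIDIA driver) for VRAM in use. Caveat: this reports total memory allocated on the GPU, including other apps, and what a game allocates isn't necessarily what it strictly needs:

```python
# Sample actual VRAM usage while a game runs, via nvidia-smi.
# The query flags below are standard nvidia-smi options;
# the 60-second polling window is arbitrary.
import subprocess
import time

def vram_used_mib() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])  # first GPU only

peak = 0
for _ in range(60):        # watch for roughly a minute
    peak = max(peak, vram_used_mib())
    time.sleep(1)
print(f"Peak VRAM observed: {peak} MiB")
```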
 
Last edited:
Well, the Ada architecture must really suck then, considering it's seeing a 25% performance deficit to a 628mm² Ampere chip, and it would be even worse at the same clock speeds.
Sorry, I was busy earlier; here's a proper answer.


Yeah, the 4070 is GIGANTIC and it's clearly not performing as it should. Maybe Ada is indeed terrible, dunno, but you can't accuse them of skimping on the chip; it's actually MASSIVE in transistor count, 20-25% larger than a 3090 Ti. For comparison's sake, the 3070 was smaller (although just barely) than a 2080 Ti. A big part of it is the cache, I assume, but performance is still underwhelming when you consider the actual size of that thing.

E.g. the 4070's cache is 6 times bigger than the 3090 Ti's, which is probably why the 4070 has a much higher transistor count.
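Quick back-of-envelope on those two claims, using the published spec figures as I remember them (so treat the numbers as approximate, not gospel):

```python
# Spec figures assumed from memory (approximate):
# AD104 (RTX 4070):    ~35.8B transistors, 36 MB L2 enabled
# GA102 (RTX 3090 Ti): ~28.3B transistors,  6 MB L2
ad104_transistors, ga102_transistors = 35.8e9, 28.3e9
ad104_l2_mb, ga102_l2_mb = 36, 6

print(f"Transistor ratio: {ad104_transistors / ga102_transistors:.2f}x")
print(f"L2 cache ratio:   {ad104_l2_mb / ga102_l2_mb:.0f}x")
# ~1.27x the transistors (roughly the 20-25% quoted above) and 6x the L2,
# so the fat cache plausibly eats a big chunk of the extra budget.
```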
 
Last edited:
I had a spare £900 voucher that needed to be spent at the retail chain named after Chicken Korma or something. They price matched me to a 4070 Ti at £850 and then promptly sold out of all video cards, so as the voucher was expiring I got a Sony Inzone M9 instead, which is okay. Probably swerved a bad buy, TBH. Given how broken all PC ports are these days, I'd expect 16GB of VRAM when spending more than £500 on a card. For pure AAA gaming you're better off getting a PlayStation at this point; it'll play games better.
 
E.g. the 4070's cache is 6 times bigger than the 3090 Ti's, which is probably why the 4070 has a much higher transistor count.

There are more transistors used for other stuff as well: bigger caches in general, and the tensor cores and RT hardware are beefed up compared to Ampere. While that might not contribute much to overall performance now, it will improve performance in certain things, especially once games start making heavier use of the newer features.
 
The alleged fourth source doesn't sound like anything Nvidia would say; it's so unprofessional. It's just MLID covering his **** yet again.

And playing to his base by talking about VRAM (or the lack of it).
You mean like they did with Ampere, when they sold FE models below what partners could afford to match? If you listen to what sources like Hardware Unboxed and Gamers Nexus have been saying in recent years, Nvidia and AMD have been taking more of the cut themselves, then making sure, especially in Nvidia's case, that the blame gets pushed onto AIB partners for being greedy. PCMR needs to stop giving Nvidia excuses for this. They tried the same over consoles with Microsoft on the original Xbox, and Microsoft never used them again. Sony stopped using them. When the bump issues happened they fell out with Apple too, because they apparently started passing the buck.

They did the same during Fermi, when they squeezed board partners at the last minute on margins and supply. IIRC, XFX went from Nvidia-only to ATI-only, BFG disappeared, and several well-known names started making ATI models for the first time too, and still do. Before that, when they suddenly dropped GTX 200 series prices by 25%, there were issues as well.

EVGA walked away despite over half their revenue coming from dGPUs, because even at record dGPU prices their PSUs and peripherals made far more profit. This Stockholm syndrome with Nvidia is amusing. They have long screwed over AIB partners with last-minute pricing changes, usually to maintain their own margins at their partners' expense. But since Nvidia held closer to 60% sales share even back in the day, most AIB partners were held over a barrel. It's even worse now.

Plus videocardz.com said this:

The FE models were originally meant to be a more premium, expensive option back in the Pascal days. Now the FE has become the cheapest model, made directly by Nvidia for its own benefit, undercutting its own AIB partners. So Nvidia suddenly cutting pricing at the last minute while not telling its partners sounds like exactly the sort of thing they would do.
 
Last edited:
You mean like they did with Ampere and sold FE models below what partners could afford to do themselves? If you listen to what sources like Hardware Unboxed and Gamersnexus have been saying in recent years, Nvidia/AMD have been taking more of the cut themselves. Then making sure especially in the case of Nvidia the blame gets pushed onto AIB partners for being greedy. PCMR needs to stop giving Nvidia excuses for this.

Exquisite. Could have ended the post there! Thing is, when a handful of us have been saying this for the best part of 4 years, clearly it isn't sinking in. All I still see is apologists on the defensive and confirmation of planned obsolescence bearing fruit.
 
The FE models were originally meant to be a more premium, expensive option back in the Pascal days. Now the FE has become the cheapest model, made directly by Nvidia for its own benefit, undercutting its own AIB partners. So Nvidia suddenly cutting pricing at the last minute while not telling its partners sounds like exactly the sort of thing they would do.

And again, it fits right in line with why EVGA kicked Nvidia to the curb.
 

Do we have any rough idea how this performs at 1440p? It will be interesting to see how big the gain is over last gen.
 
Last edited:
The FE models were originally meant to be a more premium, expensive option back in the Pascal days. Now the FE has become the cheapest model, made directly by Nvidia for its own benefit, undercutting its own AIB partners. So Nvidia suddenly cutting pricing at the last minute while not telling its partners sounds like exactly the sort of thing they would do.

Interesting, the part about the 103 die being the plan for the 3080; then Samsung rejects were thrown in, which is why it ended up on a 102, but at a fraction of the cost.
 