RTX 4070 12GB, is it Worth it?

Status
Not open for further replies.
What clock speeds can you get out of your 3080 at 200W? My FE was a bit of a lemon with regard to power draw, but I never went for low overall power when undervolting, instead trying to get the most out of stock limits. Sold it to a mate for cheap, but I still have the Gigabyte that can do ~1900MHz at 912mV, which is enough to still hit 300W+ at 2160p.
1800MHz @ 0.800V for a 5% perf drop at 100W less, or 1925MHz @ 0.850V for the same performance at 60W less.
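Those savings line up with the usual first-order rule that dynamic power scales roughly with frequency times voltage squared. A quick sketch of that arithmetic; the 1900MHz / 912mV / 300W reference point is taken from the post above, and the model is a rough approximation, not a measurement:

```python
# First-order model: dynamic GPU power scales with frequency * voltage^2.
def relative_power(f_new, v_new, f_ref, v_ref):
    """Estimated power of a clock/voltage point relative to a reference point."""
    return (f_new / f_ref) * (v_new / v_ref) ** 2

# Reference: ~1900 MHz at 912 mV drawing ~300 W (figures from the post above).
ref_watts = 300
for mhz, mv in [(1800, 800), (1925, 850)]:
    est = ref_watts * relative_power(mhz, mv, 1900, 912)
    print(f"{mhz} MHz @ {mv} mV -> roughly {est:.0f} W under this model")
```

The model ignores static leakage and board power, so treat the output as a ballpark; it does land close to the "100W less" figure for the 1800MHz point.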
 
Just to add, if Nvidia or board partners had reduced the 3080 to say £500 in the last 6 months, I'd have probably bought one.
Surely selling off those 30xx cards first would have been sensible.
But maybe I am a bit naive about the VRAM requirements of new games.
What @gpuerrilla said is probably close to the truth: Nvidia are willing to let AIBs, retailers and distributors absorb any losses. There is a reason why EVGA got out of the GPU business.

Yes, I'm not sure I'm buying the VRAM concerns when you've got consoles on a baseline of 12-13GB of RAM (which they have to use as both system and video RAM). No one's buying a £600 GPU expecting to run ultra settings at 4K, so I'd be very surprised if this GPU can't see out this console generation while comfortably maintaining better performance than what is possible on console.

The fact we're still looking at £600 for a midrange GPU sucks, but for people stuck on 1070/1080-class GPUs there comes a point where you can only wait so long to finally upgrade. I'm undecided, but I want to play with the new technologies I've been missing out on, like RT, DLSS and the non-gaming GPU-accelerated things.
12GB+ for a console which also has DMA storage with special hardware to decompress textures realistically means that even 16GB cards will soon be tight.

Plus, one of the main points of PC gaming is to go beyond consoles either by settings, or mods. All of which require extra VRAM.
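For a feel of the numbers, here's a back-of-envelope sketch of what a single 4K texture costs in VRAM. The bytes-per-texel figures are my own assumptions (4 for uncompressed RGBA, 1 for a block-compressed copy such as BC7), not anything from the thread:

```python
# Back-of-envelope VRAM cost of one square texture.
# Assumptions: 4 bytes/texel uncompressed RGBA, 1 byte/texel block-compressed
# (BC7-style); a full mip chain adds roughly one third on top.

def texture_mib(side, bytes_per_texel, mips=True):
    """Approximate size in MiB of a square texture, optionally with a mip chain."""
    size = side * side * bytes_per_texel
    if mips:
        size = size * 4 // 3  # geometric series of successively halved mip levels
    return size / (1024 * 1024)

print(f"4K RGBA with mips:  {texture_mib(4096, 4):.1f} MiB")
print(f"4K BC7-style mips:  {texture_mib(4096, 1):.1f} MiB")
```

Even compressed, a few hundred unique 4K textures resident at once adds up to several GB before geometry, render targets and the game itself, which is why the mod/ultra-settings point above eats into headroom quickly.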
 
Looking at an RX 6800 XT (not much price difference between this and a 6800) as a stop-gap card until next gen, but not too keen on second-hand due to no warranty and no idea how it's been used, and looking at the prices, even second-hand they are not far behind the 4070.

Unless I look to a 6750 XT, it seems I might as well buy a 4070 until the next gen. Have I fallen into Nvidia's cunning plan?

On a 1070 8GB at present.
There are a number of factors to consider when buying a new GPU, e.g. what resolution are you gaming at, will the new card be held back by the CPU, can the PSU handle the GPU, will the new card fit in the case, etc.

TBH you are in the situation that I often shop in, by which I mean that you have skipped a generation (the 2000 series) and now have a choice between end-of-life 3000/6000-series cards or the new 4000/7000 generation. In that situation I usually buy the EOL series. As your previous card was a 70-class card (cracking card, the 1070, when it came out), why not just upgrade to the 6700 XT? Last time I checked, the ASRock 6700 XT was £380. It will give you a good increase in performance and save you a couple of hundred quid ready for the 5000/8000 series, when hopefully cards will be better at ray tracing and DLSS 3 and FSR 3 will have had time to mature. If you truly want a 4070 then it is certainly a better buy than a 3070 Ti, which IMO was always a bad buy.

Don't forget that the now-£380 6700 XT performs not far off a 2080 Ti, a GPU that those who must have the halo product usually paid around £1,000 or more for new.
 
I must admit I was a little irked at being called a fool ;]
but I am comparing this 4070 to a £1200 3080 so it's got to be better than that!!
I have come to the conclusion that if I wait for "good prices" I'll be waiting forever.
 
I must admit I was a little irked at being called a fool ;]
but I am comparing this 4070 to a £1200 3080 so it's got to be better than that!!
I have come to the conclusion that if I wait for "good prices" I'll be waiting forever.
People should never have paid £1200 for a 3080; it was only because of the mining boom that prices got that high. I don't like the 4070 as it doesn't always beat the previous-gen card above it in the stack. People complain about Turing cards, but IMO they weren't all bad: my 2060 12GB usually outperforms a 1080, so a clear jump of two tiers up on the previous gen; it was cheaper at £270 new, has 50% more VRAM, DLSS and full DX12 features. The 4070 is nowhere near that sort of bump up compared to the 3080.

I think you are right in saying that in terms of buying a GPU for gaming there has been a lot worse value, such as getting a 3070 Ti. The 12GB of VRAM on the 4070 can manage in games by reducing settings and applying DLSS, and usually there is little visual difference between ultra and high settings, but things seem to be changing fast with VRAM usage, so it will be interesting to see how much the VRAM limits the 4070, both in fps and visual quality, in the coming years. I have seen footage of a 6800 using around 15GB in a game.
 
I must admit I was a little irked at being called a fool ;]
but I am comparing this 4070 to a £1200 3080 so it's got to be better than that!!
I have come to the conclusion that if I wait for "good prices" I'll be waiting forever.
Not so much waiting for better prices; I like to think of it rather as waiting for the "right deal". By all means I would be happy to drop £1000 on a GPU, it just has to be the right GPU, with the right specs and performance to warrant the outlay. Something can be £1000 and be significantly better value than something that costs £450; it just depends what you are actually getting for your hard-earned. Which is why the only card this gen that anyone actually felt comfortable with after buying was the 4090... everything else leaves you feeling used and abused and a little bit dirty... as it should, since Nvidia pulled your pants down and forgot the tub of lube.
 
People complain about Turing cards, but IMO they weren't all bad: my 2060 12GB usually outperforms a 1080, so a clear jump of two tiers up on the previous gen; it was cheaper at £270 new, has 50% more VRAM, DLSS and full DX12 features. The 4070 is nowhere near that sort of bump up compared to the 3080
I think using the 12GB 2060 as an example of Turing being decent is being just a little disingenuous. That launched in December 2021, a month shy of three years after the 2060's initial launch and well over a year after Ampere's launch. It wasn't available while Turing was a current architecture. The launch version of the 2060 was turbogimped with only 6GB of VRAM and might just about be able to run Minesweeper with RT enabled these days before exploding. There was nothing inherently terrible about Turing as an architecture, but there was plenty wrong with the products Nvidia made using it. The fact that the apparent shining example of it not being a complete disaster came via a mining craze cash-in product released three years later says it all really. There might even be a decent Ada product by December 2025...
 
It's just honestly so sad, because all of the 4070's specs, especially the low power draw and the size of the thing, scream that this was meant to be a 4060 Ti with a 4070 logo on it.

And as many have pointed out, that's the reason why, for the first time in almost a decade, the 4070 doesn't beat last gen's 80-series, and then there's the kicker of an extra £100 slapped on top as well!

Personally I'm just going to wait for AMD's new offering. I have been using Nvidia since the 580, so a good chunk of time, so this one really does sting, but Nvidia have genuinely lost the plot with this entire lineup besides the 4090, which is a great GPU.

And the whole VRAM thing becoming relevant again, with the way new games are going, just pushes me more towards AMD. I cannot fathom having to watch a brand-new £600 GPU downgrading textures to avoid stuttering because it's running at the VRAM limit.

Hopefully Nvidia will learn, because I'd like to think people are not so stupid when parting with that kind of money; you want it to last at least a few years. I'm still running my 1080 Ti from 2017, but she's showing her age now.
 
I think using the 12GB 2060 as an example of Turing being decent is being just a little disingenuous. That launched in December 2021, a month shy of three years after the 2060's initial launch and well over a year after Ampere's launch. It wasn't available while Turing was a current architecture. The launch version of the 2060 was turbogimped with only 6GB of VRAM and might just about be able to run Minesweeper with RT enabled these days before exploding. There was nothing inherently terrible about Turing as an architecture, but there was plenty wrong with the products Nvidia made using it. The fact that the apparent shining example of it not being a complete disaster came via a mining craze cash-in product released three years later says it all really. There might even be a decent Ada product by December 2025...
It is a true-life example. I had a 970 and waited to replace it at the right time. There is no need to rush out and buy a new-gen GPU as soon as it is released. I waited and got a much better card at a good price; as someone said above, some people didn't wait and paid £1100 or more for a 3080. I also waited and got a 680 for around £230 and a Vega 56 for £235. People just need to learn to wait for better deals to come along if they want them, while others don't want to wait and will pay top price. As I play mostly old games I hardly even use the 6800 I do have; it's not been installed yet this year, so bouncing between a PC with a 6500 XT and a 2060 12GB is usually ideal for me.
 
People complain about Turing cards, but IMO they weren't all bad: my 2060 12GB usually outperforms a 1080, so a clear jump of two tiers up on the previous gen; it was cheaper at £270 new, has 50% more VRAM, DLSS and full DX12 features.

Wait whaaaaat, there was a 12GB 2060?
Mind actually blown
 
It's just honestly so sad, because all of the 4070's specs, especially the low power draw and the size of the thing, scream that this was meant to be a 4060 Ti with a 4070 logo on it.

And as many have pointed out, that's the reason why, for the first time in almost a decade, the 4070 doesn't beat last gen's 80-series, and then there's the kicker of an extra £100 slapped on top as well!

Personally I'm just going to wait for AMD's new offering. I have been using Nvidia since the 580, so a good chunk of time, so this one really does sting, but Nvidia have genuinely lost the plot with this entire lineup besides the 4090, which is a great GPU.

And the whole VRAM thing becoming relevant again, with the way new games are going, just pushes me more towards AMD. I cannot fathom having to watch a brand-new £600 GPU downgrading textures to avoid stuttering because it's running at the VRAM limit.

Hopefully Nvidia will learn, because I'd like to think people are not so stupid when parting with that kind of money; you want it to last at least a few years. I'm still running my 1080 Ti from 2017, but she's showing her age now.

Yep.

No reason they couldn't have followed how they did the 3xxx series

4070 should be a 4060 Ti (matches the 3080, like the 3060 Ti matched the 2080 Super)
4070 Ti should be a 4070 (matches the 3090 Ti, like the 3070 matched the 2080 Ti)
4080 should be the 4080, I guess, but a lot cheaper. Although with the 4080 and 4090 parts they likely could have got a 4070 Ti, 4080, 4090 and 4090 Ti out of them at 3xxx-series-equivalent prices.
 
Not so much waiting for better prices, I like to think of it rather as waiting for the "right deal", by all means I would be happy to drop £1000 on a GPU, it just has to be the right gpu, with the right specs and performance to warrant the outlay. Something can be £1000 and be a significant better value than something that costs £450, just depends what you are actually getting for your hard earned. Which is why the only card this gen that anyone actually felt comfortable with after buying was the 4090...everything else leaves you feeling used and abused and a little bit dirty...as it should, since Nvidia pulled you're pants down and forgot the tub of lube......

Tend to agree about the 4090 being the best bang for buck, but I've no interest in 4K gaming myself at present; happy enough at 2K with my current monitor.
Would love the 4070 to have been £100 less, but I'm like that when buying anything.
 
I was always looking at FE prices, were they that much?
3080 was a great buy at the time of release I think

It was £650.

In hindsight yes. But at the time, most thought it was still too high for the standard xx80 card.

It was also amazing at mining so it went for hilariously stupid prices, as you would expect a literal money printer to do.
 
This card is probably gonna be the performance-per-$ king this generation. Can't see the 4060 or 4060 Ti beating this, even at 1080p.
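Performance-per-pound claims like this are easy to sanity-check once reviews land. A tiny sketch of the arithmetic; the fps and price figures below are placeholders I made up purely to show the calculation, not benchmark data:

```python
# Toy fps-per-pound comparison; swap the placeholder figures for real
# review numbers before drawing any conclusions.
cards = {
    "RTX 4070":    {"fps": 100, "price": 589},  # hypothetical figures
    "RTX 4060 Ti": {"fps": 75,  "price": 450},  # hypothetical figures
}

def fps_per_pound(card):
    """Average fps divided by price in pounds: higher is better value."""
    return card["fps"] / card["price"]

# Print cards from best to worst value under these made-up numbers.
for name, card in sorted(cards.items(), key=lambda kv: -fps_per_pound(kv[1])):
    print(f"{name}: {fps_per_pound(card):.3f} fps/£")
```

Averaging fps across a suite of games at your own resolution before dividing gives a far more honest ranking than any single title.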
 