NVIDIA 4000 Series

One of my ex-workmates buys every Nvidia halo product. If Nvidia released a 3090 Ti+ with a 0.5% performance increase over a "vanilla" 3090 Ti for £3,000, he'd make up some excuse as to why he needed it. He did the same thing with the 3090 Ti, and before that with the Titan V and last gen's RTX Titan, both mainly aimed at workstations, but nope, he had to have them.

Some people have more money than sense. "Must have the latest shiny no matter the cost", or "keeping up with the Joneses syndrome" as my dad calls it.

Yeah, you'll never talk any sense into people who just "want" things. Look at all the fools paying £600+ per month to rent a Range Rover just so they can park it on the drive and do the shopping/school run in it. The roads are clogged with mums who want to be WAGs.
 
3070s are around £400, with the 3080 £100 more but with better performance.

How much do you think the 3000 series will cost come October? I'm guessing the 4070 will be $499 with performance on par with a 3090 Ti. Surely that means now is a good time for anyone selling a 3070 or 3080; otherwise, come Nov/Dec, they'll be in the region of £200 max. But then who would buy a used 3070 that's probably been mined to death?
I doubt a 3080 will drop to £200, considering that's around the price point of the 6500 XT, which in turn is no faster than cards released at similar price points five years ago. So you can bet a 4050 or 7500 XT isn't coming close to 3080 performance.

I have a feeling a used 3080 will settle around £350, as it will probably just lose to a £400+ 4060 Ti but should beat a 4060.

Losing £300 of the card's resale value after two years of use is pretty reasonable. Compare that to a 3090, which will likely drop to around £450-500: you've lost £900-odd, which in itself would buy you a top-end 4080 model.
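Rough depreciation maths behind that comparison, as a sketch (the launch prices are assumed typical UK figures; the resale values are the guesses above):

```python
# Back-of-envelope resale-loss comparison. Launch prices are assumed
# typical UK figures; resale values are the estimates from the post above.
cards = {
    "RTX 3080": {"paid": 650, "resale": 350},
    "RTX 3090": {"paid": 1400, "resale": 475},  # midpoint of £450-500
}

for name, c in cards.items():
    loss = c["paid"] - c["resale"]
    print(f"{name}: £{loss} lost over 2 years (~£{loss / 2:.0f}/year)")
```

On those assumptions the 3080 owner is down about £300 and the 3090 owner about £925, which is roughly the price of a 4080.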
 
3070s are around £400, with the 3080 £100 more but with better performance.

How much do you think the 3000 series will cost come October? I'm guessing the 4070 will be $499 with performance on par with a 3090 Ti. Surely that means now is a good time for anyone selling a 3070 or 3080; otherwise, come Nov/Dec, they'll be in the region of £200 max. But then who would buy a used 3070 that's probably been mined to death?


Can't see a 4070 being on par with a 3090 Ti if past gens are anything to go by. A 4080? Yeah, quite likely.
 
The only thing concerning me is the wattage: the 4070 is 300 watts! The 3070 was 220 watts.

The choice is: should I wait and see what the actual benchmarks are? AMD cards are good, but the drivers are pretty dire and they run pretty hot.
 
Can't see a 4070 being on par with a 3090 Ti if past gens are anything to go by. A 4080? Yeah, quite likely.
The 3070 matches/beats a 2080 Ti.

The only thing concerning me is the wattage: the 4070 is 300 watts! The 3070 was 220 watts.

The choice is: should I wait and see what the actual benchmarks are? AMD cards are good, but the drivers are pretty dire and they run pretty hot.

If a 4070 can provide 3090/Ti performance with less power usage, that's good efficiency, not to mention if the 40-series undervolts as well as Ampere does (up to 100W less for the same performance). There's always a 4060 or a lesser card if power consumption is a worry.
 
Can't see a 4070 being on par with a 3090 Ti. A 4080? Yeah, quite likely.
Even during the no-competition days of Turing, with just the jump from TSMC 16nm to 12nm, the lacklustre 2070 managed to beat a 1080. So if a 4070 is only as fast as a 3080 despite Nvidia moving from Samsung 8nm to TSMC 5nm, then Nvidia would have failed hard and would most likely be well behind AMD.

Transistor densities of the nodes Nvidia has used (in MTr/mm²): TSMC 16nm = 28.2, TSMC 12nm = 33.8, Samsung 8nm = 61.2, TSMC 5nm = 171.3. As you can see, the jump to 5nm is substantial and should bring even larger gains than the move to Samsung 8nm, which already saw the 3070 matching the 2080 Ti.
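A quick ratio check on those figures makes the point (the numbers are just the MTr/mm² values listed above):

```python
# Node-to-node density jumps for the nodes Nvidia has used (MTr/mm^2).
densities = [
    ("TSMC 16nm", 28.2),
    ("TSMC 12nm", 33.8),
    ("Samsung 8nm", 61.2),
    ("TSMC 5nm", 171.3),
]

for (prev, d0), (cur, d1) in zip(densities, densities[1:]):
    print(f"{prev} -> {cur}: {d1 / d0:.2f}x density")
```

That prints roughly 1.20x, 1.81x and 2.80x, so the Samsung-8nm-to-TSMC-5nm jump is far bigger than the one that let the 3070 catch the 2080 Ti (though density is only one input to performance, of course).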
 
The only thing concerning me is the wattage: the 4070 is 300 watts! The 3070 was 220 watts.

The choice is: should I wait and see what the actual benchmarks are? AMD cards are good, but the drivers are pretty dire and they run pretty hot.
Nvidia are the ones running hot this gen (and probably next), aren't they? Also, I thought AMD's drivers were alright now. Fundamentally, I think this gen is going to boil down to ray tracing and how much you care about it, pricing, raster performance and heat (from Nvidia). Nobody wants a 350W "midrange" card; it's a space heater.
 
Nvidia are the ones running hot this gen (and probably next), aren't they? Also, I thought AMD's drivers were alright now. Fundamentally, I think this gen is going to boil down to ray tracing and how much you care about it, pricing, raster performance and heat (from Nvidia). Nobody wants a 350W "midrange" card; it's a space heater.
I don't think Nvidia's power draw is much different to AMD's, aside from the 3080 Ti and 3090. The 3080 is 320W vs the 6800 XT's 300W. If you built a 3080 with plain GDDR6, it would probably pull quite a bit less power than the 6800 XT, given the difference between the 3070 at 220W and the GDDR6X 3070 Ti at 290W.
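A very rough sketch of that GDDR6X argument (the board powers are the figures above; the share of the 3070-to-3070 Ti delta attributable to memory is pure guesswork, since the Ti also has more cores and higher clocks):

```python
# Board-power figures quoted above (watts).
tdp = {
    "3070 (GDDR6)": 220,
    "3070 Ti (GDDR6X)": 290,
    "3080 (GDDR6X)": 320,
    "6800 XT (GDDR6)": 300,
}

delta = tdp["3070 Ti (GDDR6X)"] - tdp["3070 (GDDR6)"]
print(f"3070 Ti vs 3070 delta: {delta} W")  # 70 W

# Assume (a guess) that half the delta is GDDR6X and the rest is the Ti's
# extra cores and clocks; a GDDR6 3080 would then land under the 6800 XT.
mem_share = 0.5
print(f"Hypothetical GDDR6 3080: ~{tdp['3080 (GDDR6X)'] - delta * mem_share:.0f} W")
```

Even pinning only half of the 70W gap on the memory, a hypothetical GDDR6 3080 comes out around 285W, under the 6800 XT's 300W.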
 
The 3070 matches/beats a 2080 Ti.

Not the correct match-up. Does a 3070 match or beat an RTX Titan? That would be last gen's 3090. Then you have to add about 10% more performance on top of the RTX Titan to get last gen's equivalent of a 3090 Ti.

From the videos I've watched, a 3070 just about matches an RTX Titan, which I'd say is impressive. So maybe a 4070 will match a 3090 or 3090 Ti, depending on various factors.
 
Well, I've been looking forward to the 4090 for a decent performance hike, but then I had a think about the price and wondered if a 4080 might do the job instead, and be considerably cheaper.

However, since coming back from my holiday I’ve not even turned my PC on, just played around with my Steam Deck instead.

I think I may actually be coming to the conclusion that unless there’s a monumental performance improvement with the next gen cards, I may just skip them completely, especially considering the price of electricity keeps going up and up.

Our monthly direct debit for electricity has gone up from £160 last year to £390 already, and it's going to go up again in a couple of months, and probably again in March.
 
I think I may actually be coming to the conclusion that unless there’s a monumental performance improvement with the next gen cards, I may just skip them completely, especially considering the price of electricity keeps going up and up.

Our monthly direct debit for electricity has gone up from £160 last year to £390 already, and it's going to go up again in a couple of months, and probably again in March.

According to the latest news, we could be paying around £2,000 extra per year for electricity alone, with a high possibility of that doubling in another year.

What we need is 12900KS and 3090 Ti performance at a quarter of the wattage. Now that would be impressive.
 
According to the latest news, we could be paying around £2,000 extra per year for electricity alone, with a high possibility of that doubling in another year.

What we need is 12900KS and 3090 Ti performance at a quarter of the wattage. Now that would be impressive.
My 3080 undervolted to around 180W still beats a 2080 Ti, so a 7600X and a 4080 with a large undervolt should get you under half the power, but you'll probably need to wait another gen to see it down to a quarter.
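To put rough numbers on what those wattages cost to run (the ~34p/kWh unit rate is an assumption based on the announced UK price cap, and the gaming hours are illustrative):

```python
# Rough yearly GPU running-cost sketch at an assumed UK unit rate.
RATE_GBP_PER_KWH = 0.34  # assumed price-cap unit rate, not a quoted figure
HOURS_PER_DAY = 3        # illustrative gaming time

def yearly_cost(watts: float) -> float:
    kwh_per_year = watts / 1000 * HOURS_PER_DAY * 365
    return kwh_per_year * RATE_GBP_PER_KWH

for label, watts in [("450 W (3090 Ti-class)", 450),
                     ("320 W (stock 3080)", 320),
                     ("180 W (undervolted 3080)", 180)]:
    print(f"{label}: ~£{yearly_cost(watts):.0f}/year")
```

On those assumptions the undervolt saves something like £50-100 a year over stock: real money, but small next to a £390/month bill.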
 
The trick is to undervolt and/or cap the FPS. I'm playing Final Fantasy 7 Remake with a PS5 controller at the moment and have set it to 60fps, even though I could easily run it locked at 120fps.

Power usage is way down at 60fps and you don't even hear the fans on the GPU. As I recall, the whole PC including the monitor uses less than 200W playing it like this. It feels perfectly smooth to me, and certainly much better than the 30fps I played it at on the PS4 Pro some years ago.

The way I see it, I'm happy with 60fps in most games, so newer GPUs will run at an even lower voltage than what I have now to hit 60fps. Then, when I play games that need the extra grunt, like Cyberpunk, the performance will be there.

I reckon the 4080 will be able to play every single game in my Steam library with ease at 60fps. Right now my 3070 already does with 99% or more of them anyway. But new tech is always nice, and I always like to get the latest gen :D
 
The trick is to undervolt and/or cap the FPS. I'm playing Final Fantasy 7 Remake with a PS5 controller at the moment and have set it to 60fps, even though I could easily run it locked at 120fps.

Power usage is way down at 60fps and you don't even hear the fans on the GPU. As I recall, the whole PC including the monitor uses less than 200W playing it like this. It feels perfectly smooth to me, and certainly much better than the 30fps I played it at on the PS4 Pro some years ago.

The way I see it, I'm happy with 60fps in most games, so newer GPUs will run at an even lower voltage than what I have now to hit 60fps. Then, when I play games that need the extra grunt, like Cyberpunk, the performance will be there.

I reckon the 4080 will be able to play every single game in my Steam library with ease at 60fps. Right now my 3070 already does with 99% or more of them anyway. But new tech is always nice, and I always like to get the latest gen :D
That 175Hz QD-OLED screen is wasted on you! :p
 
Not the correct match-up. Does a 3070 match or beat an RTX Titan? That would be last gen's 3090. Then you have to add about 10% more performance on top of the RTX Titan to get last gen's equivalent of a 3090 Ti.

From the videos I've watched, a 3070 just about matches an RTX Titan, which I'd say is impressive. So maybe a 4070 will match a 3090 or 3090 Ti, depending on various factors.

Well yeah, the 2080 Ti basically matched the RTX Titan. No doubt a 4070 will match a 3090/Ti, and most likely beat it in RT.

The key thing is pricing: a 3070 for £460 vs a 2080 Ti at £1,000+ was no contest as to which one to go for. The same will happen come the 4070 announcement/release, when 3090/Ti owners will take a huge loss, as it simply makes no sense to buy a 3090/Ti for more than what a 4070 will cost unless you "need" 24GB of VRAM...
 
That 175Hz QD-OLED screen is wasted on you! :p

You still get some of the benefits, as I recall reading, plus the OLED helps too. That said, I do play the occasional CS:GO, which I play at 175Hz/fps, though generally I have my monitor set to 144Hz and 10-bit. Some older games I just cap at 140fps and play like that; some I even play at 90fps. It all depends on the game in the end. But for Final Fantasy 7 Remake, and many games in general, I'm more than happy at 60fps. I am not a twitch gamer :)

The great thing is the option is there, so I can pick whatever suits the game I'm playing :D
 
Well yeah, the 2080 Ti basically matched the RTX Titan. No doubt a 4070 will match a 3090/Ti, and most likely beat it in RT.

The key thing is pricing: a 3070 for £460 vs a 2080 Ti at £1,000+ was no contest as to which one to go for. The same will happen come the 4070 announcement/release, when 3090/Ti owners will take a huge loss, as it simply makes no sense to buy a 3090/Ti for more than what a 4070 will cost unless you "need" 24GB of VRAM...

Well, I'll be skipping the 4000 series, as all being well I'll soon be getting a 3090 Ti from an old workmate who's selling everything off for roughly a quarter of what it cost new. He's moving country and has a baby on the way, so PC gaming is EOL for him.

I don't mind being one gen behind, especially as it means I get that gen's fully unlocked chip with a ton of memory. I'll be limiting everything to 60fps anyway, so Final Fantasy 14 should run smooth :D
 
Well, I'll be skipping the 4000 series, as all being well I'll soon be getting a 3090 Ti from an old workmate who's selling everything off for roughly a quarter of what it cost new. He's moving country and has a baby on the way, so PC gaming is EOL for him.

I don't mind being one gen behind, especially as it means I get that gen's fully unlocked chip with a ton of memory. I'll be limiting everything to 60fps anyway, so Final Fantasy 14 should run smooth :D
I'd flog the 3090 Ti, use the money in a couple of months to buy a 4080, and keep the change. It should be around 40-50% faster than a 3090 Ti and more power efficient. Skipping a generation with a massive leap in performance doesn't make much sense; that 3090 Ti with loads of memory won't look nearly as appealing when it's getting demolished by new GPUs and electricity prices are through the roof.
 