Sitting in the cold using PC as a heater

Lower your heating bill... with GAMING

What clock speeds can you get out of your 3080 at 200W? My FE was a bit of a lemon with regards to power draw, but I never went for low overall power when undervolting, instead trying to get the most out of stock limits. Sold it to a mate for cheap, but I still have the Gigabyte that can do ~1900MHz at 912mV, which is enough to still hit 300W+ at 2160p.

1800MHz @ 0.800V for a 5% perf drop at 100W less, or 1925MHz @ 0.850V for the same performance at 60W less.
Got a better deal than I did.
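For anyone wanting to sanity-check undervolting numbers like the ones a couple of posts up, here's a rough back-of-the-envelope sketch in Python. It assumes dynamic GPU power scales roughly with frequency times voltage squared, and the 320W / 1900MHz / 1000mV stock point is an illustrative guess rather than a measured figure, so treat the output as ballpark only:

# Rough undervolting estimate: dynamic power ~ frequency * voltage^2.
# The stock point below (320 W at 1900 MHz / 1000 mV) is an assumed example,
# not a measured value; static/idle power is ignored, so savings are rough.
def estimated_power(base_w, base_mhz, base_mv, new_mhz, new_mv):
    return base_w * (new_mhz / base_mhz) * (new_mv / base_mv) ** 2

stock_w, stock_mhz, stock_mv = 320, 1900, 1000
for label, mhz, mv in [("1800 MHz @ 800 mV", 1800, 800),
                       ("1925 MHz @ 850 mV", 1925, 850)]:
    p = estimated_power(stock_w, stock_mhz, stock_mv, mhz, mv)
    print(f"{label}: ~{p:.0f} W, roughly {stock_w - p:.0f} W below stock")

On those assumptions the two points quoted above land in the same region as the claimed 100W and 60W savings, which is about as much as a formula like this can tell you.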
Just to add, if Nvidia or board partners had reduced the 3080 to say £500 in the last 6 months, I'd have probably bought one.

What @gpuerrilla said is probably close to the truth: Nvidia are willing to let AIBs, retailers and distributors take any losses. There is a reason why EVGA went out of the GPU business.
Surely selling off those 30xx cards first would have been sensible.
But maybe I am a bit naive about the VRAM requirements of new games.
Yes, I'm not sure I'm buying the VRAM concerns when you've got consoles on a baseline of 12-13GB of RAM (which they have to use for both system and video RAM). No one's buying a £600 GPU expecting to run ultra settings at 4K, so I'd be very surprised if this GPU can't see out this console generation while comfortably maintaining better performance than what is possible on console. The fact we're still looking at £600 for a midrange GPU sucks, but for people stuck on 1070/1080-class GPUs there comes a point where you can only wait so long to finally upgrade. I'm undecided, but I want to play with these new technologies I've been missing out on, like RT, DLSS and the non-gaming GPU-accelerated things.

12GB+ for a console which also has DMA storage with special hardware to decompress textures realistically means that even 16GB cards will soon be tight.
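To put rough numbers on the console comparison above, here's a small illustrative sketch. The 16GB pool, the OS reservation and the CPU-side share are all assumptions for the sake of the arithmetic, not published figures:

# Illustrative console memory budget vs a discrete card's VRAM.
# Every figure here is an assumption used only to show the arithmetic.
console_pool_gb = 16.0      # shared memory pool on a current console
os_reserved_gb = 2.5        # assumed system/OS reservation
game_budget_gb = console_pool_gb - os_reserved_gb    # ~13.5 GB for the game
assumed_cpu_side_gb = 4.0   # assumed share the game uses as "system RAM"
graphics_share_gb = game_budget_gb - assumed_cpu_side_gb

print(f"Game budget: {game_budget_gb:.1f} GB, of which ~{graphics_share_gb:.1f} GB acts like VRAM")
print(f"Headroom on a 12 GB card: {12.0 - graphics_share_gb:+.1f} GB")

On those assumptions a 12GB card roughly matches the console's graphics share, but it has no equivalent of the console's DMA/decompression path, which is the point being made just above.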
Looking at an RX 6800 XT (not much in price between this and a 6800), not too keen on second-hand due to no warranty and no idea how it's been used, as a stopgap card until next gen, but looking at the prices, even second-hand they are not far behind the 4070.
Unless I look to a 6750 XT it seems I might as well buy a 4070 until the next gen, have I fallen into Nvidia's cunning plan?
On a 1070 8GB at present.

There are a number of factors that you need to consider when buying a new GPU, e.g. what resolution are you gaming at, will the new card be held back by the CPU, can the PSU handle the GPU, will the new card fit in the case, etc.
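The checks in that list can be written down as a quick script if it helps. The PSU wattage, power draws and card length below are placeholder numbers, so swap in your own; resolution targets and CPU bottlenecks don't reduce to a formula, so they're left as a reminder comment:

# Quick pre-purchase sanity checks for a GPU upgrade.
# All numbers are placeholders - substitute your own PSU, system and case figures.
# (Resolution targets and CPU bottlenecks still need benchmarks, not arithmetic.)
def upgrade_checks(psu_w, rest_of_system_w, gpu_board_power_w,
                   case_clearance_mm, card_length_mm):
    headroom_w = psu_w - (rest_of_system_w + gpu_board_power_w)
    return {
        "psu_headroom_ok": headroom_w >= 0.2 * psu_w,  # keep ~20% spare capacity
        "card_fits_case": card_length_mm <= case_clearance_mm,
    }

print(upgrade_checks(psu_w=650, rest_of_system_w=200, gpu_board_power_w=220,
                     case_clearance_mm=310, card_length_mm=244))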
I must admit I was a little irked at being called a fool ;] but I am comparing this 4070 to a £1200 3080, so it's got to be better than that!! I have come to the conclusion that if I wait for "good prices" I'll be waiting forever.

3080 was £750 ish.
People should never have paid £1200 for a 3080; it was only because of the mining boom that prices got to that level. I don't like the 4070 as it doesn't always beat the previous gen card above it in the stack. People complain about Turing cards but IMO they weren't all bad: my 2060 12GB usually outperforms a 1080, so that's a clear jump of two tiers up on the previous gen, it was cheaper at £270 new, has 50% more VRAM, DLSS and full DX12 features. The 4070 is nowhere near that sort of bump up compared to the 3080.
Not so much waiting for better prices, I like to think of it rather as waiting for the "right deal". By all means I would be happy to drop £1000 on a GPU; it just has to be the right GPU, with the right specs and performance to warrant the outlay. Something can be £1000 and be significantly better value than something that costs £450, it just depends what you are actually getting for your hard-earned. Which is why the only card this gen that anyone actually felt comfortable with after buying was the 4090... everything else leaves you feeling used and abused and a little bit dirty... as it should, since Nvidia pulled your pants down and forgot the tub of lube...
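Just to illustrate the "a £1000 card can be better value than a £450 one" point with arithmetic - both cards and their performance indices below are invented numbers, not real products:

# Made-up cost-per-performance comparison; prices and perf indices are invented.
cards = {
    "hypothetical £450 card":  {"price_gbp": 450,  "perf_index": 100},
    "hypothetical £1000 card": {"price_gbp": 1000, "perf_index": 260},
}
for name, c in cards.items():
    print(f"{name}: £{c['price_gbp'] / c['perf_index']:.2f} per performance point")

On those invented numbers the dearer card costs less per unit of performance, which is the sense of "value" being argued above.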
I think using the 12GB 2060 as an example of Turing being decent is a little disingenuous. That card launched in December 2021, a month shy of three years after the 2060's initial launch and well over a year after Ampere's launch; it wasn't available while Turing was a current architecture. The launch version of the 2060 was turbogimped with only 6GB of VRAM and might just about be able to run Minesweeper with RT enabled these days before exploding. There was nothing inherently terrible about Turing as an architecture, but there was plenty wrong with the products Nvidia made using it. The fact that the apparent shining example of it not being a complete disaster came via a mining-craze cash-in product released three years later says it all really. There might even be a decent Ada product by December 2025...
It is a real-life example. I had a 970 and waited to replace it at the right time. There is no need to rush out and buy a new gen GPU as soon as it is released. I waited and got a much better card at a good price; as someone said above, some people didn't wait and paid £1100 or more for a 3080. I also waited and got a 680 for around £230 and a Vega 56 for £235. People just need to learn to wait for better deals to come along if they want them, while others don't want to wait and will pay top price. As I play mostly old games I hardly even use the 6800 I do have (it's not been installed yet this year), so bouncing between a PC with a 6500 XT and a 2060 12GB is usually ideal for me.
Wait whaaaaat, there was a 12GB 2060?
Mind actually blown.

Comes in handy in TLOU I understand: https://youtu.be/5vR3-ituwCc?t=14
It's just honestly so sad, because all of the 4070's specs, especially the low power draw and the size of the thing, scream that this was meant to be a 4060 Ti with a 4070 logo on it.
And like many have pointed out, that's the reason why, for the first time in almost a decade, the 4070 doesn't beat last gen's 80-series card, and then there's the kicker of an extra £100 slapped on top as well!
Personally I'm just going to wait for AMD's new offering. I have been using Nvidia since the 580, so a good chunk of time, so this one really does sting, but Nvidia have genuinely lost the plot with this entire lineup besides the 4090, which is a great GPU.
And the whole VRAM thing becoming relevant again with the way new games are going just pushes me more towards AMD; I cannot fathom having to look at a brand new £600 GPU downgrading textures to avoid stuttering because it's running at the VRAM limit.
Nvidia hopefully will learn, because I'd like to think people are not so stupid when parting with that kind of money, and you want it to last at least a few years. I'm still running my 1080 Ti from 2017, but she's showing her age now.
I was always looking at FE prices, were they that much?
3080 FE was £649 MSRP.
3080 was a great buy at the time of release I think