"got 32gb gskill 6000 cl30 for £120 on amazon, worked out well with the £30 gift certificate balance i had sitting there to make it £90 instead"

Seen some Corsair equivalent RAM for under £100 now!
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
got 32gb gskill 6000 cl30 for £120 on amazon, worked out well with the £30 gift certificate balance i had sitting there to make it £90 instead
"Ha! I think I'd trust the model more than humans, even with some consumer quality silicon."

I knew it, they put Skynet in charge of the nukes, ain't they?
my 4090 has an ecc memory toggle

"Ha! I think I'd trust the model more than humans, even with some consumer quality silicon."
On a serious note, these huge clusters deal with bit flips due to SEUs (single-event upsets, usually from high-energy radiation from space) with ECC memory. That isn't a feature of the consumer GPUs. I'm unsure whether it is disabled in the BIOS for performance or whether the memory devices selected are different. But it's a big distinction that NV makes.
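As a toy illustration of what the ECC is doing under the hood, here's a minimal Hamming(7,4) single-bit corrector in Python (nothing like the real SECDED circuitry in server memory controllers, just the principle that a flipped bit can be located and repaired from parity):

```python
def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit codeword with 3 parity bits."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]  # bit positions 1..7

def hamming74_decode(c):
    """Recompute the parities; the syndrome gives the 1-based position
    of any single flipped bit, which we then correct."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1  # repair the flipped bit
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
word = hamming74_encode(data)
word[3] ^= 1  # simulate a single-event upset in storage
assert hamming74_decode(word) == data  # data recovered despite the flip
```

The same idea, scaled up to 64-bit words with extra parity for double-error detection, is what server-class GPUs and DIMMs carry in hardware.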
"Consoles GPU grew pretty big from the Xbox 360 era. That was only 181mm2 while the Series X is now at 360mm2. Keeping the same ratio between the Xbox 360 and 8800GT you'd need about a 644mm2 GPU (well, maybe less, depending on how 90nm vs 65nm scaled compared to 7nm vs 4nm now). £250 to £350, for such a big die, is probably impossible without some serious chiplet design (and even then it may be nothing more than a dream)."

But the issue is that a GTX1650 is weaker than even the £250 XBox Series S (a 20% slower RX5500XT), which also has an 8 core Zen2 CPU. That is still better than the average Steam PC. A £100 card should demolish it by now. Nvidia is still selling 5 year old cards, and the AMD/Intel equivalents are rebranding old performance.
Devs can't keep optimising for slow hardware forever, and even the XBox Series S was cheaper than the equivalent PC graphics card until the last year (though not than a whole system). Most gamers buy prebuilt systems, and it's almost three years later. Within two years of the Xbox 360 and PS3 we had the 8800GT and HD3850, which demolished the consoles for under $250. The consoles were $400 to $600.
The consoles have an RX6700/RX6700XT class dGPU with access to more than 8GB of VRAM and SSD VRAM cache. Where are the sub £350 dGPUs that demolish the consoles? After three years we still appear to be stuck on 8GB cards, and a lot of the top ten graphics cards on steam are not faster. The 8800GT was probably twice as fast as the console graphics cards.
Nvidia does not care and AMD is rebranding slightly better performance since 2019.
Most of the gamers I know buy mainstream hardware. It's simply not good enough now if we want more technical advancement. Even non-technical gamers are waking up to this.
Consoles GPU grew pretty big from the Xbox 360 era. That was only 181mm2 while the Series X is now at 360mm2. Keeping the same ratio between the Xbox 360 and 8800GT you'd need about a 644mm2 GPU (well, maybe less, depending on how 90nm vs 65nm scaled compared to 7nm vs 4nm now). £250 to £350, for such a big die, is probably impossible without some serious chiplet design (and even then it may be nothing more than a dream)
But, a 4070 probably demolishes the consoles in RT and PT, so that's that.
It's tough all around.
It reminds me of what Apple does.
It's exactly what Apple does and worse in some ways.
This generation of hardware is dead, from the CPUs to the motherboards and GPUs. They can all cry inflation (which is 10% at best) but they raise prices by double on GPUs and motherboards. They can keep them, really, and let them sweat on them and fire-sell them in the end. If not, they can pay to stick them in landfill.
Greed has got out of hand now in a lot of areas, from energy prices, food and general bills to PC electronics. All of it related to the greed created by the lockdowns, by wars that have nothing to do with us, and by fake prices because of a so-called energy war. Explain to me why we pay 10x more for the same energy compared to Poland and other European countries that are buying energy from Russia at a larger rate than we ever did. But excuses... Honestly the whole country has become corrupt regarding everything now and needs a change soon, before they make things worse and give us more fake excuses. All a huge scam that needs to end now, to bring back some form of normality to people's lives. Fed up with all the fake excuses now that make zero sense to anyone with half a brain for how the world works.
Rant over... sorry, fed up now with all this mess in this country.
"Exactly... all fake excuses that make no sense. I wish people would wake up to what is going on and how we are all being taken for a ride at OUR EXPENSE."

It's even worse when you consider the UK is one of the few European countries to have its own oil, gas and coal!
"Well not entirely. Whilst I agree that top end PC hardware is disgustingly expensive, you can't baseline it at PS5 levels and not include the licence fees for games that get passed on to consumers."

Your average PC is now more dependent on console ports than ever.
We can continue to justify £1600 GPUs whilst sucking on the teat of PS5 devs.
PCs need to start offering comparable gaming experiences for under £500 all in. That used to be doable in older generations, but now it's laughable. A PS5 is nearly 3 years old and the PC market can barely compete.
The RTX4070/RTX4070TI use a sub-300mm2 die, so they are not "expensive" dGPUs in reality, but Nvidia wants its mining margins, and AMD wants to follow. They are basically the RTX3060/RTX3060TI replacements. The RTX4080/RX7900XTX should have been £600~£700 at most.
The RTX4070 is just over 40% faster than the RTX3060TI. That is a normal generational uplift. If it was priced similarly to the RTX3060TI it would have given you a sub-£400 dGPU that would have easily gone past a console dGPU in all ways. Plus it would still be an upsell, because it is a heavily salvaged third-tier die pushed up another level.
It would also push a console equivalent dGPU closer to £200.
Instead greed was more important, so that is why the average PC on Steam has a GTX1650, and most of the top 10 dGPUs are worse or only marginally better than a console.
Now we are probably a year away from the consoles being refreshed, so the PS5/XBox Series X are old hat, even for consoles. After all, Sony broke even on PS5 costs last year.
So at this point, if that happens and the console dGPU gets a decent upgrade, we will be back to square one. This is not a technical limitation of what Nvidia/AMD can design; they clearly don't care about gamers like they did in the 8800GT and HD4870 days. 8GB still being a thing is just penny pinching when the companies making RAM are selling it on the cheap.
Yeah, I don't necessarily disagree with you that prices should be lower. I was just pointing out that the comparison of the old Xbox with the 8800GT doesn't hold that well anymore, since the GPUs in consoles grew quite big, and prices can't be that low for graphics cards considering the increase in complexity: 24GB and a 450W TDP for the 4090 vs 512MB and a 125W TDP for the 8800GT. At least I'm not sure it's possible with current tech, even with what AMD is doing with chiplets, to have a $350-$400 card like the 4090 or larger while still making a profit. I'd love to be proven wrong, as even a $500 4090 would be sweet!
The Xbox 360 GPU, as per TPU, is 181mm2 on 90nm and the 8800GT is 324mm2 on 65nm. 232M transistors vs 754M, so 3.25x more transistors for 1.79x more die area.
The Series X GPU is 360mm2 with 15,300M transistors, so an equivalent GPU (like the Xbox 360 to 8800GT jump) would need to have 49,725M transistors, a bit over the 4080's 45,900M, and by area about a 644mm2 die. The 4080 is 379mm2 and the 4090 is 608mm2, so a GPU would need to be even bigger than that.
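The ratios above can be sanity-checked in a few lines of Python (die areas and transistor counts as quoted from TPU):

```python
# Rough sanity check of the Xbox 360 -> 8800GT scaling argument.
# Die areas in mm2, transistor counts in millions (per TechPowerUp).
x360_area, x360_xtors = 181, 232        # Xbox 360 GPU, 90nm
gt8800_area, gt8800_xtors = 324, 754    # 8800GT, 65nm

xtor_ratio = gt8800_xtors / x360_xtors  # 3.25x more transistors
area_ratio = gt8800_area / x360_area    # ~1.79x more die area

seriesx_area, seriesx_xtors = 360, 15_300  # Series X GPU, 7nm

# A "console-demolishing" part by the same ratios:
needed_xtors = seriesx_xtors * xtor_ratio  # ~49,725M transistors
needed_area = seriesx_area * area_ratio    # ~644 mm2

print(f"needed: {needed_xtors:,.0f}M transistors, {needed_area:.0f}mm2")
# For comparison: AD103 (4080) is 45,900M / 379mm2; AD102 (4090) is 608mm2.
```

So by either metric the hypothetical card sits between a 4080 and a 4090, which is the crux of the "can't be cheap" argument.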
4070 should have been around $350, that would have been a good sell - and probably scalped pretty bad. Fingers crossed that prices will go down soon.
PS: Looking again at the original UE5 demo, with movie-quality assets from Quixel and per-pixel shadow accuracy, vs the crappy games that get launched these days... man, what are these studios doing?!
no one is scalping **** without crypto
It's motherboards and GPUs which seem to have their own inflation rates, 5-10x higher than any other electrical goods, while at the same time also experiencing shrinkflation: smaller dies and bus sizes on GPUs, and motherboards seeing features cut.

"This generation of hardware is dead, from the CPUs to the motherboards and GPUs. They can all cry inflation (which is 10% at best in the UK, and the countries of manufacture have single-digit inflation, so give it a rest with the fake excuses) but they all raise prices by double on GPUs and motherboards. They can keep them, really, and let them sweat on them and fire-sell them in the end. If not, they can pay to stick them in landfill."
The Series X and PS5 also have the CPU, I/O and other bits on the same die, so the graphics part would be smaller. Also, they use a 320 bit and 256 bit memory controller respectively. That also shows how tightarse AMD and Nvidia have become: a 256 bit memory controller is now "premium".
The AD103 in the RTX4080 is sub-400mm2, like the GA104 in the RTX3070/RTX3070TI. The RTX2070 Super used a salvaged 545mm2 12nm die. Nvidia is running at nearly 70% GAAP gross margins (higher than Apple).
So even the RTX4080 is probably an upsell if sold at £700~£800. The reality is, if this had been the Nvidia or AMD of over a decade ago, prices would be massively lower.
Two mining booms and a Pandemic since 2017 have made companies greedy.
Lack of VRAM and no movement on properly integrating NVMe SSDs into PC games is probably the reason why. The pitiful VRAM amounts are what get me: in 7 years we barely saw a 50% increase in mainstream VRAM amounts.
Not now, but they would have been scalpers if the prices were "correct".
I can see it now, Nvidia in 3 years tells people they only kept prices high because they wanted to stop scalpers.
They became the scalpers out of greed, just like the lockdown scalpers... pot, kettle... but worse, as it's being done at the source.
100% agree with you on this. While the cost per mm2 of die has gone up between 7nm and 5nm, the cost per transistor is still falling. There's evidence the 4070TI die costs a similar amount to the 3060TI die: https://hardwaresfera.com/en/noticias/hardware/precio-oblea-tsmc-5nm/ Also note these are 2020 prices, when semi shortages were rampant and fabs were upping their prices to lower demand.
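To put rough numbers on that, here's a back-of-the-envelope dies-per-wafer estimate. The wafer price below is an assumption for illustration (in the ballpark of the 2020-era figures the article above discusses), not a quoted TSMC number:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Classic dies-per-wafer approximation: usable wafer area divided by
    die area, minus an edge-loss term. Ignores scribe lines and defects."""
    r = wafer_diameter_mm / 2
    gross = math.pi * r * r / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

# AD104 (RTX4070/4070TI) is roughly 294mm2 on TSMC "4N".
dies = dies_per_wafer(294)

# Assumed ~$17,000 per 5nm-class wafer for illustration.
wafer_cost = 17_000
print(dies, "candidate dies,", f"~${wafer_cost / dies:.0f} per die before yield")
```

Even with pessimistic yields doubling that per-die figure, the bare silicon is a modest slice of a £600+ card's retail price, which is exactly the point being made above.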