This argument has always bamboozled me. IMO it is the ultimate in 'defending your favourite brand at any and all costs': some people argue for stagnation, built-in obsolescence, and even overpricing, when at a certain price point one should expect a level of design and engineering befitting the price. Everyone expects that of everything else, just not Nvidia GPUs.
Insanity.
So are we agreed then that a GPU costing more than £450 should have at least 12GB of VRAM?
Good..... Good. It only took 3 years and a lot of arguing, but at least we have now arrived at a sane place.
When the 3080 came out, 2GB GDDR6X modules were not available, so Nvidia's only options were: use a 384-bit bus with 12GB, which would have made it even closer to the 3090; put 16GB or 20GB on it and have chips on both sides of the card; or use 16GB of 2GB GDDR6, which would have meant a large bandwidth cut and resulted in the card losing to a 6800 XT by a 10% margin.
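To put that bandwidth trade-off in rough numbers, here's a back-of-the-envelope sketch. The 320-bit/19 Gbps figures match the 3080 as shipped; the 16 Gbps GDDR6 speed for the hypothetical 16GB build is an assumption:

```python
# Rough GPU memory bandwidth: (bus width in bits / 8) * per-pin data rate in Gbps
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# RTX 3080 as shipped: 10GB GDDR6X on a 320-bit bus at 19 Gbps
print(bandwidth_gb_s(320, 19.0))  # 760.0 GB/s

# Hypothetical 16GB of 2GB GDDR6 chips: 8 chips x 32-bit = 256-bit bus, ~16 Gbps
print(bandwidth_gb_s(256, 16.0))  # 512.0 GB/s, roughly a third less
```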
Resident Evil 4 runs great on my 4090, and no, it doesn't crash like it does on 8GB cards. But that isn't what you wanted to hear. A waste of my time even typing this response, since you're just going to go off on some tangent again. People aren't responding because they don't want to read your massive monologues about nothing. You should try making YouTube content, it's a better format - people will watch 10-minute videos before they read your 10-minute posts.
I don't play it and don't really care, but is Hogwarts Legacy a Lumen software RT game? If it is, then AMD GPUs do very well, and it is not until hardware RT is enabled that Nvidia get their mojo back.
This could just be the first of many instances of game producers aiming for the lower-tiered products to maximise sales. Why sell to the 1% and ignore the other 99%? Or it could just be a lazy Unreal Engine 5 game that uses software RT and gives an advantage to RDNA2.
If anyone was daft enough to think that an 8GB 3070 was going to be a better long-term card than a 16GB 6800, then I despair. The AMD 'fine wine' shizzle is true: if you want performance for 2 years buy Nvidia, if for 5+ buy AMD, IMHO.
8GB VRAM cards were only worth sub-£400 at best, IMHO.
Wow, 8GB can't even run 1080p.
People got scammed. Regardless of what coping excuse Nexus comes up with, if you paid good money for an 8GB card in the last couple of years, you got swindled.
What a total scam by NV, their 60-class card is NOW officially potato level.
This is the fault of both Nvidia and AMD when it comes to sub-£350 cards. The big problem isn't the cards themselves, it's where they are now being positioned in terms of price. They are pushing lower-tier/entry-level dGPUs up to mainstream/entry-level-enthusiast pricing, and these are made for laptops. At the same time they are cutting down the PCI-E bus width, which is going to start causing more problems as VRAM increases stagnate. This is really concerning for a gamer like me who buys dGPUs under £400. This is nothing new, though.
I had an EVGA 1070 with 8GB of VRAM, and even back in 2017 or so, I was running Fallout 4 with the high-resolution texture pack and a few mods and easily maxing out that 8GB. And that game is old by modern standards.
But Nvidia have been shipping cards without enough VRAM for a long time; they historically used to do two versions of a card, one with 50% of the VRAM, and those were horrendous. I forget which generation it was, but one shipped with 3GB and the other 6GB, and I remember still being on an AMD 6950, released years prior with 2GB of VRAM and still holding up, and thinking why on earth would you buy a card with only 3GB of VRAM.
I know the performance of the card means you cannot directly compare the amounts, so 8GB of VRAM on a 2016-era 1070 is NOT the equivalent of 8GB of VRAM on a modern GPU, but even so.....
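To put some illustrative numbers on why the "same" 8GB fills up faster today, here's a simplified texture-memory sketch. It assumes uncompressed RGBA8 textures with a full mip chain; real games use compressed formats, so treat the figures as relative rather than absolute:

```python
# Approximate VRAM for one uncompressed RGBA8 texture, full mip chain included
def texture_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    base = width * height * bytes_per_pixel
    return base * 4 / 3 / 2**20  # mipmaps add roughly one third on top

print(f"2048x2048: {texture_mib(2048, 2048):.1f} MiB")  # ~21.3 MiB
print(f"4096x4096: {texture_mib(4096, 4096):.1f} MiB")  # ~85.3 MiB, 4x per step
```

Each jump in texture resolution quadruples the footprint, which is why high-res texture packs chew through a fixed 8GB so much faster than the assets of the 1070's era did.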
Stopping buying this garbage is not going to work. NV etc. are already rolling out their next business model, with subscription access to 4080/4090 GPUs via GeForce Now. This is obviously where they want us to be, and it is way easier to generate profits when they can blame lag or low graphics fidelity on your own ISP.
Pricing GPUs out of the reach of the average consumer is a win-win for NV, and AMD do not care either, because they win on console sales...... I may be drunk, and this may be the weirdest thing I say in my life, but "Help me, Intel. Only you can save me now." We need real competition in the market, not just the good old boys running a monopoly to maximise profits.
I, you, or Dave next door may be able to afford the most up-to-date hardware, but we are seeing a clear indication in the sales figures and the Steam hardware surveys of slower uptake of the most modern GPUs. What this means for the progress of new games is unknown; if a games developer is making a game for under 10% of its target audience, is that a good idea?
More like £500, mark my words.
Stop buying this junk; the moment people stop buying it and stop defending it is the moment this insanity will stop.
Tech tubers who are either stupid or far too 'loyal' to Nvidia are at fault here for saying cards like the 3070 had enough, when clearly they don't and never did.
The problem is that it requires a decent internet connection, and it's another monthly cost in an era of people cutting back (why do you think Netflix is suffering?). Plus, on top of this, the US has awful internet in large parts of the country too. Most of this growth has been driven by cheap credit, due to 15 years of money printing by central banks, because morons keep borrowing money to buy too much stuff. Too many consumers and even companies have gotten comfortable with taking on debt.
The reality is that a reckoning is coming for a lot of these companies,because the tap has to be shut off at some point.
But but but..... "Quantitative Easing (printing money) is not the cause of inflation" < the lie told by every G20 nation engaged in money printing for decades.
Printing money devalues the money in your pocket; that's inflation, it's a stealth tax. If a government wants money, it literally prints it, and the population then pays for it through inflation.
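As a toy illustration of that mechanism (grossly simplified: it assumes output and the speed money changes hands stay fixed, which they never quite do):

```python
# Toy illustration: money supply grows, the goods it chases do not,
# so each unit of currency buys less (all else held equal).
money_supply = 1_000
goods = 1_000
price_level = money_supply / goods        # 1.00 per unit of goods

money_supply *= 1.10                      # "print" 10% more money
new_price_level = money_supply / goods    # 1.10 per unit of goods
print(f"Implied inflation: {new_price_level / price_level - 1:.0%}")  # 10%
```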
The problem is not only this.
If you look at the people specifically on this forum: this is a computer hardware forum, so anyone here has had to go to the effort of signing up to such a thing and by default (in general) will be a certain type of person, and even here the opinions are a little divided.
The problem is, you go outside of this and ask your average Joe, who maybe games on a PC but isn't really into hardware, and I guarantee when you ask them what graphics card is the best, nearly all of them will blindly say "Nvidia", and Nvidia know this.
Yes, people need to stop buying "that junk", I for one agree, but even on this forum opinions are divided; you have no hope when it comes to the general consumer base.
No surprise 8GB is struggling. LOL-worthy, the people who thought these were 4K gaming cards for the future. Of course, if you're happy to drop settings then that's another matter.
Still, the 3070 is a great card at the FE price, especially if you want to experience ray tracing and have access to DLSS.
Stuttering on 4090 even after shader compilation
There is one thing the Diablo 4 beta does clearly demonstrate: its hunger for memory. We're hearing numerous user reports complaining about stuttering, and our Afterburner log backs them up. Not surprising: the Diablo 4 beta allocated up to 23 GiB on our GeForce RTX 3090 Ti in our first test sessions with maximum details. Users of a GeForce RTX 4090 confirm this behaviour, sometimes accompanied by the first reloading hiccups. These observations point to a memory overflow caused by "hoarder" behaviour: the game doesn't seem to remove older data from memory as it progresses. What in the worst case leads to texture dropouts on a high-end graphics card with 24 GiB obviously causes problems for graphics cards with less VRAM. Gamers report crashes and heavy stuttering, which are only alleviated after reducing the texture details - even on a GeForce RTX 3080 with its 10 GiB, not just on the common 8 GiB models.
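For anyone wanting to spot that kind of "hoarding" themselves, a minimal sketch that polls VRAM usage while you play (it assumes an Nvidia card with nvidia-smi on the PATH; Afterburner's logging does the same thing with a GUI):

```python
import subprocess
import time

# Log used VRAM (MiB) once per second while the game runs; a steady climb
# that never falls back as you move through the world is the hoarding pattern.
while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        capture_output=True, text=True,
    )
    print(f"{time.strftime('%H:%M:%S')}  {out.stdout.strip()} MiB in use")
    time.sleep(1)
```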
But how did that happen?
Yep, and so is exporting all your important resource extraction/manufacturing (oil/gas/fertiliser) and growing all the food abroad, and then having to print more money to import it. So when a pandemic or war comes along and causes shortages, import costs go even higher, causing more inflation. Then we print even more money.
But going back to gaming, Nvidia trying to push everyone to streaming might not work. Most people who bought consoles didn't care that AMD provided the hardware. The same goes for streaming: if MS and Sony start investing more into it using custom AMD gaming servers, AMD can provide all of it, down to the CPU and motherboards.
End users won't care who provides the hardware, because a service is a service.
And there we have it, why Charlie is so disliked. Not the rumours, not the paywall, but that by exposing Nvidia's solder problems - and coming up with the term "bumpgate" because of how covered up it was - he didn't toe the "keep quiet or else" line. "Else" being access. Guess he can do without review samples. Killing the messenger only became necessary afterwards.

Nvidia knew how to market better, by using influencers on tech forums (who were later found out) and by sponsoring more games than ATI/AMD bothered to. There was also a tech press that could get cut off from review parts, so it tended to talk less about Nvidia's problems.
One of those was the famous solder bump problem, where you had Nvidia dGPUs failing in laptops (link). It might also be why Apple and Nvidia never worked together again. The tech press went very quiet about it.
Globalism: if we all make ourselves dependent on each other, then there are no more wars, because we are all friends. It's part of the reason for Germany's initial reluctance to condemn Russia properly; it's not just that they rely on them for their energy, as the reason for that reliance is the same reason for the reluctance: they thought they could bring Russia around to the globalist idea. The UK could also easily be energy independent; we used to be, and we chose not to be as a sacrifice for globalism.
Neither Russia, nor China, nor India, nor any of the Gulf nations, and so on, are interested in this global hegemony led ultimately by America; they are in fact now building a rival hegemony of their own.
Some people might not like the way your hegemony is run; who knew? Is this not why every socialist nation in history ends in tyranny? Because it's the only way social hegemony works. ###### unicorn airheads.
And back to gaming: I have no interest in game streaming. They all want us subscribing to their systems because that way they have their hand on the money taps; it's why people like Nvidia are so pig-headed about GeForce Now. No thank you.....
Yes, and many people on tech forums attacked him over it.