• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Close this thread please, it's old and tired :)

This argument has always bamboozled me. IMO it is the ultimate in 'defending your favourite brand at any and all costs': some people argue for stagnation, built-in obsolescence and even overpricing, when at a certain price point one should expect a level of design and engineering befitting that price. Everyone expects that of everything else, just not Nvidia GPUs.

Insanity.

And yet, the funny thing is, you will still buy Nvidia GPUs with limited VRAM like the others :cry: You never did answer the question of why you and all the others who constantly cry about Nvidia VRAM don't go to AMD GPUs instead???

So are we agreed then that a GPU costing more than £450 should have at least 12GB of VRAM?

Good..... Good. It only took 3 years and a lot of arguing, but at least we have now arrived at a sane place.

Yes, it finally seems like people are realising that bang per buck/value is actually a relevant argument in this topic. But no, spend the extra £750 just to future-proof yourself for another year or 2, even though you'll be reducing settings anyway due to lack of grunt.....

When the 3080 came out, 2GB GDDR6X modules were not available, so Nvidia's only options were: use a 384-bit bus with 12GB, which would have made it even closer to the 3090; put 16GB or 20GB on it with chips on both sides of the card; or use 16GB of 2GB GDDR6 modules, which would have meant a large bandwidth cut and the card losing to a 6800XT by a 10% margin.
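To put rough numbers on that trade-off, here's a quick sketch. The 19 Gbps GDDR6X and 16 Gbps GDDR6 data rates are assumed typical figures, and I'm assuming the 16GB GDDR6 option means 8x 2GB chips on a 256-bit bus; treat it as an illustration, not confirmed specs.

```python
# Peak memory bandwidth (GB/s) = data rate per pin (Gbit/s) * bus width (bits) / 8.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return data_rate_gbps * bus_width_bits / 8

options = {
    "3080 as shipped (10GB GDDR6X, 320-bit)": bandwidth_gbs(320, 19.0),
    "384-bit 12GB GDDR6X option":             bandwidth_gbs(384, 19.0),
    "256-bit 16GB GDDR6 option":              bandwidth_gbs(256, 16.0),
}
for name, bw in options.items():
    print(f"{name}: {bw:.0f} GB/s")
# 760, 912 and 512 GB/s respectively: the GDDR6 route gives up roughly
# a third of the bandwidth, i.e. the "large bandwidth cut" above.
```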

It has been pointed out so many times how completely and utterly wrong humbug is on this, but he refuses to acknowledge it. Or rather, he somehow knows better than companies worth billions.....

Resident Evil 4 runs great on my 4090 and no, it doesn't crash like it does on 8GB cards. But that isn't what you wanted to hear. A waste of my time even typing this response since you're just going to go off on some tangent again. People aren't responding because they don't want to read your massive monologues about nothing. You should try making YouTube content, it's a better format; people will watch 10-minute videos before they read your 10-minute posts.

There we have it @TNA :cry:

And yet there are people over on reddit saying it is crashing at times for them on their 3090s/4090s :D

I don't play and do not really care, but is Hogwarts Legacy a Lumen software RT game? If it is, then AMD GPUs do very well, and it is not until hardware RT is enabled that NV get their mojo back.

This could just be the first of many instances of game producers aiming for the lower-tiered products to maximise sales. Why sell to the 1% and ignore the other 99%? Or it could just be a lazy Unreal 5 engine game that uses software RT and gives an advantage to RDNA2.

If anyone was daft enough to think that an 8GB 3070 was going to be a better long-term card than a 16GB 6800 then I despair. The AMD fine wine shizzle is true: if you want performance for 2 years buy NV, if for 5+ buy AMD imho.

I'm not 100% sure how RT was implemented in Hogwarts, but all I know is the RT implementation in that game is ****...... Heck, use high settings instead and you get better graphics than ultra settings. But no, it's not the game, it's the hardware



:cry:
 
The RTX4060 series only has PCI-E 8X with 8GB of VRAM.

I'd like to see how these will perform on PCI-E 3.0 systems like mine (B450 motherboard). This is why I would only buy a PCI-E 5.0 system now, because it is quite clear that if things don't change, an RTX6060/RX8600XT will be PCI-E 5.0 8X.

Wow, 8GB can't even run 1080p

People got scammed. Regardless of what coping excuse Nexus comes up with, if you paid good money for an 8GB card in the last couple of years you got swindled.
8GB VRAM cards were only worth sub £400 at best IMHO.

Now think how an RTX4060 is going to fare in that game: less memory bandwidth, and when it has to page into system RAM it is stuck on an 8X connection.
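For a rough sense of what that 8X link costs, here's a quick sketch; the per-lane figures are the usual approximate post-encoding throughputs, one direction.

```python
# Approximate one-way PCI-E bandwidth per lane (GB/s) after encoding overhead.
PER_LANE_GBS = {"3.0": 0.985, "4.0": 1.969, "5.0": 3.938}

def link_bandwidth_gbs(gen: str, lanes: int) -> float:
    return PER_LANE_GBS[gen] * lanes

print(link_bandwidth_gbs("4.0", 8))   # ~15.8 GB/s: an 8X card on a PCI-E 4.0 board
print(link_bandwidth_gbs("3.0", 8))   # ~7.9 GB/s: the same card on a B450 (PCI-E 3.0) board
print(link_bandwidth_gbs("3.0", 16))  # ~15.8 GB/s: a full 16X card still gets this on 3.0
# Once textures spill into system RAM, every fetch crosses this link, so
# halving it makes the VRAM shortfall bite twice as hard.
```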
 
This is nothing new though.

I had an EVGA 1070 with 8GB of VRAM, and even back in 2017 I guess it would have been, I was running Fallout 4 with the high resolution texture pack and a few mods and easily maxing out that 8GB. And that game is old by modern standards.

But Nvidia have been shipping out cards with not enough VRAM for a long time. They historically used to do 2 versions of cards, one with 50% of the VRAM, and those were horrendous. I forget which generation it was, but one shipped with 3GB and the other 6GB, and I remember still being on an AMD 6950 released years prior with 2GB of VRAM, still holding up, and thinking why on earth would you buy a card with only 3GB of VRAM.

I know the performance of the card means you cannot directly compare the amounts, so 8GB of VRAM on a 2016-era 1070 is NOT the equivalent of 8GB of VRAM on a modern GPU, but even so.....
 
What a total scam by NV, their 60 class card is NOW officially potato level.

:rolleyes:


This is nothing new though.

I had an EVGA 1070 with 8GB of VRAM, and even back in 2017 I guess it would have been, I was running Fallout 4 with the high resolution texture pack and a few mods and easily maxing out that 8GB. And that game is old by modern standards.

But Nvidia have been shipping out cards with not enough VRAM for a long time. They historically used to do 2 versions of cards, one with 50% of the VRAM, and those were horrendous. I forget which generation it was, but one shipped with 3GB and the other 6GB, and I remember still being on an AMD 6950 released years prior with 2GB of VRAM, still holding up, and thinking why on earth would you buy a card with only 3GB of VRAM.

I know the performance of the card means you cannot directly compare the amounts, so 8GB of VRAM on a 2016-era 1070 is NOT the equivalent of 8GB of VRAM on a modern GPU, but even so.....
This is the fault of both Nvidia and AMD when it comes to sub-£350 cards. The big problem isn't the cards; it's where they are now being positioned in terms of price. They are pushing lower-tier/entry-level dGPUs up to mainstream/entry-level-enthusiast pricing, and these are made for laptops. At the same time they are cutting down the PCI-E bus width, which is going to start causing more problems as VRAM increases stagnate. This is really concerning for a gamer like me who buys dGPUs under £400.

Last generation, we saw Nvidia release the GA107 card as the RTX3050 8GB, the spiritual successor to the GP107-based GTX1050TI 4GB and the TU117-based GTX1650. But those were sub-£200 cards, and the RTX3050 8GB was literally priced at close to £300! It was also cut down to PCI-E 8X. But the performance barely moved on:

GTX1660TI/RTX2060 level performance for the same or more money. But an RX5700 8GB/RX5600XT 6GB was similar money and faster.

AMD then literally replaced Navi 10 (RX5600XT/RX5700/RX5700XT) with Navi 23, a smaller dGPU on the same 7nm process. The RX6600XT 8GB literally replaced the RX5700XT at a similar price point. Yet VRAM did not really increase, while memory bandwidth was slashed, and so was PCI-E bandwidth. The RX6700XT has 3X the Infinity Cache, and according to an AMD slide Navi 23 had a less than optimal amount of cache, but Navi 22 launched at over £400, because the RRP jumped 20% or thereabouts.

The RX5700 8GB non-X could be had for £250 at one point. The RX6600XT until recently was still close to the price you could get an RX5700XT for. It's a performance sidegrade at best, or in some scenarios it can be a downgrade.

Then you had Navi 24 (RX6500XT 4GB), which not only had less VRAM than the RX5500XT 8GB (at the same street price), but also not enough Infinity Cache, only a PCI-E 4X link (so it was much faster on PCI-E 4.0 systems, when the target market probably uses PCI-E 3.0 systems), and no video decode engine either. It was also a much smaller dGPU made on the same process node (2/3 the size).

The RX6500XT should have been sold as an RX6400 for around £100.
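To make the "less than optimal amount of cache" point concrete, here's a toy effective-bandwidth model. The 1 TB/s cache figure and the hit rates are illustrative assumptions, not AMD's numbers; only the 256 GB/s bus figure (128-bit at 16 Gbps) is standard arithmetic.

```python
# Traffic that hits the on-die Infinity Cache is served fast; misses fall
# through to the narrow GDDR6 bus. Effective bandwidth blends the two.
def effective_bandwidth_gbs(hit_rate: float, cache_gbs: float, dram_gbs: float) -> float:
    return hit_rate * cache_gbs + (1.0 - hit_rate) * dram_gbs

DRAM_GBS = 256.0    # 128-bit bus @ 16 Gbps GDDR6
CACHE_GBS = 1000.0  # hypothetical on-die cache bandwidth

for hit in (0.7, 0.5, 0.3):  # hit rate falls as resolution rises or the cache shrinks
    print(f"hit rate {hit:.0%}: ~{effective_bandwidth_gbs(hit, CACHE_GBS, DRAM_GBS):.0f} GB/s")
# A smaller cache means a lower hit rate, so more traffic lands on the
# narrow bus and the effective figure collapses towards 256 GB/s.
```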

Now you have the GTX1650 successor, first bumped up to over £200 as the RTX3050 8GB, then to a £300+ RTX4060 8GB. The RTX4050/Geforce MX is going to be over £200 and cut down even further.
 
I don't play and do not really care, but is Hogwarts Legacy a Lumen software RT game? If it is, then AMD GPUs do very well, and it is not until hardware RT is enabled that NV get their mojo back.

This could just be the first of many instances of game producers aiming for the lower-tiered products to maximise sales. Why sell to the 1% and ignore the other 99%? Or it could just be a lazy Unreal 5 engine game that uses software RT and gives an advantage to RDNA2.

If anyone was daft enough to think that an 8GB 3070 was going to be a better long-term card than a 16GB 6800 then I despair. The AMD fine wine shizzle is true: if you want performance for 2 years buy NV, if for 5+ buy AMD imho.

Don't dis Lumen :p I'm a huge fanboy of it.....

It provides real-time RT for a fraction of the performance cost of hardware RT. Frankly, Lumen is what RTX should have been; it's just better, much much better.

Stopping buying this garbage is not going to work. NV etc are already rolling out their next business model, with subscription access to 4080/4090 GPUs via GeForce Now. This is obviously where they want us to be, and it is way easier to generate profits when they can blame lag or low gfx fidelity on your own ISP.

Pricing GPUs out of the reach of the average consumer is a win-win for NV, and AMD do not care either because they win on console sales...... I may be drunk and this may be the weirdest thing I say in my life, but "Help me Intel, only you can save me now..." We need real competition in the market, not just the good old boys running a monopoly to maximise profits.

I, you, or Dave next door may be able to afford the most up-to-date hardware, but the sales figures and the Steam hardware surveys give a clear indication of slower uptake of the most modern GPUs. What this means for the progress of new games is unknown; if a games developer is making a game for under 10% of its target audience, is that a good idea?

I have no faith in Intel. They made their usual cringeworthy, slimy marketing spiel about how they are the largest GPU manufacturer and have the most money with the best team, while at the same time begging reviewers not to go hard on their underdeveloped, broken hardware. They then played the "things are expensive but I'm on your side" card while charging $450 for a card that could barely keep up with a competitor's $300 card.

AMD get a hard time for not trying hard enough to force Nvidia to drop prices, as if that's their only purpose, while Intel almost get a free pass for being worse than Nvidia, because excuses, excuses, and they are all so nice for stroking the egos of YouTube "influencers" < they have been the problem, the reason for this situation, all along.

AMD have learned that undercutting Nvidia by much just loses them revenue; that's our fault. Like those reviewers, we only want them to compete so we can have cheaper Nvidia GPUs. AMD know that if they play that game, the only thing they get for it is a hit on revenue.
 
Stopping buying this garbage is not going to work. NV etc are already rolling out their next business model, with subscription access to 4080/4090 GPUs via GeForce Now. This is obviously where they want us to be, and it is way easier to generate profits when they can blame lag or low gfx fidelity on your own ISP.

Pricing GPUs out of the reach of the average consumer is a win-win for NV, and AMD do not care either because they win on console sales...... I may be drunk and this may be the weirdest thing I say in my life, but "Help me Intel, only you can save me now..." We need real competition in the market, not just the good old boys running a monopoly to maximise profits.

I, you, or Dave next door may be able to afford the most up-to-date hardware, but the sales figures and the Steam hardware surveys give a clear indication of slower uptake of the most modern GPUs. What this means for the progress of new games is unknown; if a games developer is making a game for under 10% of its target audience, is that a good idea?

The problem is that it requires a decent internet connection, and is another monthly cost in an era of people cutting down (why do you think Netflix is suffering?). Plus, on top of this, the US has awful internet in large parts of the country too. Most of this growth has been driven by cheap credit, due to 15 years of money printing by central banks, because morons keep borrowing money to buy too much stuff. Too many consumers and even companies have gotten comfortable with taking on debt.

The reality is that a reckoning is coming for a lot of these companies, because the tap has to be shut off at some point.
 
More like £500, mark my words.

Stop buying this junk: the moment people stop buying this junk and stop defending it is the moment this insanity will stop.


Tech tubers who are either stupid or far too 'loyal' to Nvidia are at fault here for saying cards like the 3070 have enough VRAM, when clearly they don't and never did.

The problem is not only this.

If you look at the people specifically on this forum: this is a computer hardware forum, so the type of person here has to go to the effort of signing up to such a thing and by default (in general) will be a certain type of person, and even here the opinions are a little divided.

The problem is, you go outside of this and ask your average Joe, who maybe games on a PC but isn't really into hardware, and I guarantee when you ask them what graphics card is the best, they will nearly all blindly say "Nvidia", and Nvidia know this.

Yes, people need to stop buying "that junk", I for one agree, but if opinions are divided even on this forum, you have no hope when it comes to the general consumer base.
 
The problem is that it requires a decent internet connection, and is another monthly cost in an era of people cutting down (why do you think Netflix is suffering?). Plus, on top of this, the US has awful internet in large parts of the country too. Most of this growth has been driven by cheap credit, due to 15 years of money printing by central banks, because morons keep borrowing money to buy too much stuff. Too many consumers and even companies have gotten comfortable with taking on debt.

The reality is that a reckoning is coming for a lot of these companies, because the tap has to be shut off at some point.
But But But..... Quantitative Easing (Printing Money) is not the cause of inflation < The lie told by every G20 nation engaged in money printing for decades.

Printing money devalues the money in your pocket; that's inflation, it's a stealth tax. If a government wants more money they literally print it, and the population then pays for it through inflation.
 
But But But..... Quantitative Easing (Printing Money) is not the cause of inflation < The lie told by every G20 nation engaged in money printing for decades.

Printing money devalues the money in your pocket; that's inflation, it's a stealth tax. If a government wants more money they literally print it, and the population then pays for it through inflation.

Yep, and so is exporting all your important resource extraction/manufacturing (oil/gas/fertiliser) and growing all the food abroad, and then having to print more money to import it. So when a pandemic or war comes along and causes shortages, import costs go even higher, causing more inflation. Then we print even more money.

But going back to gaming, Nvidia trying to push everyone to streaming might not work. Most people who buy consoles didn't care that AMD provided the hardware. The same goes for streaming: if MS and Sony start investing more into it using custom AMD gaming servers, AMD can provide all of it, down to the CPU and motherboards.

End users won't care who provides the hardware, because a service is a service.
 
The problem is not only this.

If you look at the people specifically on this forum: this is a computer hardware forum, so the type of person here has to go to the effort of signing up to such a thing and by default (in general) will be a certain type of person, and even here the opinions are a little divided.

The problem is, you go outside of this and ask your average Joe, who maybe games on a PC but isn't really into hardware, and I guarantee when you ask them what graphics card is the best, they will nearly all blindly say "Nvidia", and Nvidia know this.

Yes, people need to stop buying "that junk", I for one agree, but if opinions are divided even on this forum, you have no hope when it comes to the general consumer base.

But how did that happen?
 
People got scammed. Regardless of what coping excuse Nexus comes up with, if you paid good money for an 8GB card in the last couple of years you got swindled.

Keep up, dear chap. I've never said 8GB was a good purchase decision going forward, more so if you were overpaying though...

No surprise 8GB is struggling. Lol worthy, the people who thought they were 4K gaming cards for the future; of course, if you're happy to drop settings then that's another matter.

Still, the 3070 is a great card at the FE price, especially if you want to experience ray tracing and have access to DLSS
:cool:

The only time I ever questioned supposed so-called issues with 8GB VRAM was when someone insisted they were having all kinds of issues purely because of VRAM in certain games, with no evidence to back this up and no reviewers mentioning or showing such issues.

10GB thread/story, well that's another topic.....





Also, lol worthy if you base VRAM/performance arguments on games well regarded as broken.....

Stuttering on 4090 even after shader compilation




Maybe throw another grand at your PC/GPU with even more VRAM and you might get a somewhat better experience......

Again, this is another classic example of people blaming hardware when it's a game/development fault. Given I see this every day in my work (with people who deal with "optimising" their applications), it's not surprising to see the average Joe on here being oblivious/ignorant to the real root issue.

Bit like the Diablo 4 beta, "zOMG 10GB not enough", yet let's read the article:


There is one thing Diablo 4 does identify as a beta: the hunger for memory. We're hearing numerous user reports complaining about stuttering, and we trust our Afterburner log. Not surprising: Diablo 4 (Beta) allocated up to 23 GiByte on our Geforce RTX 3090 Ti in our first test sessions with maximum details. Users of a Geforce RTX 4090 confirm this behavior, sometimes accompanied by the first reloading hiccups. These observations point to a memory overflow caused by "messie behavior": the game doesn't seem to remove older data from memory as it progresses. What, in the worst case, leads to texture dropouts on a high-end graphics card with 24 GiByte obviously causes problems for graphics cards with less VRAM. Gamers report crashes and strong jerks, which are only alleviated after reducing the texture details - even on a Geforce RTX 3080, which has 10 GiBytes, not just the common 8 GiByte models.

So ultimately, the fault is with how the game has been "optimised". It basically all comes down to consoles having a different approach to memory, i.e. VRAM and system memory are unified, plus obviously the direct storage functionality, so when games are ported to work with a different architecture it's no surprise we are seeing issues like this. Essentially, GPUs with large amounts of VRAM are brute-forcing their way through the issues, or rather avoiding the real issue here.
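To illustrate the "messie behavior" the article describes, here's a toy sketch of a streaming texture cache with and without an eviction budget; all names and sizes are hypothetical, not the game's actual code.

```python
from collections import OrderedDict

class TextureCache:
    """Toy texture streamer: budget_mb=None means it never evicts (the buggy case)."""
    def __init__(self, budget_mb: int | None):
        self.budget_mb = budget_mb
        self.entries: OrderedDict[str, int] = OrderedDict()

    def load(self, tex_id: str, size_mb: int) -> None:
        self.entries[tex_id] = size_mb
        self.entries.move_to_end(tex_id)          # mark as most recently used
        if self.budget_mb is not None:
            while sum(self.entries.values()) > self.budget_mb:
                self.entries.popitem(last=False)  # evict the least recently used

buggy, capped = TextureCache(None), TextureCache(8000)  # 8 GB budget
for zone in range(100):            # wander through 100 zones of the map
    for t in range(50):            # 50 textures per zone, 10 MB each
        for cache in (buggy, capped):
            cache.load(f"zone{zone}_tex{t}", 10)

print(sum(buggy.entries.values()))   # 50000 MB: blows past any real card's VRAM
print(sum(capped.entries.values()))  # <= 8000 MB: stays inside the budget
```

A card with 24GB simply takes longer to hit the overflow point, which is why the 3090 Ti/4090 can brute-force it for a while before the reloading hiccups start.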
 
But how did that happen?

Nvidia knew how to market better, by using influencers on tech forums (who were later found out) and by sponsoring more games than ATI/AMD bothered to do. There was also a tech press which could get cut off from review parts, so tended to talk less about Nvidia problems.

One of them was the famous bump problem, where you had Nvidia dGPUs failing in laptops (link). It might also be why Apple and Nvidia never worked together again. The tech press went very quiet about it.
 
Yep, and so is exporting all your important resource extraction/manufacturing (oil/gas/fertiliser) and growing all the food abroad, and then having to print more money to import it. So when a pandemic or war comes along and causes shortages, import costs go even higher, causing more inflation. Then we print even more money.

But going back to gaming, Nvidia trying to push everyone to streaming might not work. Most people who buy consoles didn't care that AMD provided the hardware. The same goes for streaming: if MS and Sony start investing more into it using custom AMD gaming servers, AMD can provide all of it, down to the CPU and motherboards.

End users won't care who provides the hardware, because a service is a service.

Globalism: if we all make ourselves dependent on each other then there are no more wars, because we are all friends. It's part of the reason for Germany's initial reluctance to condemn Russia properly; it's not just that they rely on them for their energy. The reason for that reliance is the same as the reason for the reluctance: they thought they could bring Russia around to the globalist idea. The UK could also easily be energy independent, we used to be; we chose not to be as a sacrifice to globalism.

Neither Russia, nor China, nor India, nor any of the Gulf nations... and so on, are interested in this global hegemony led ultimately by America; they are in fact now building a rival hegemony of their own.
Some people might not like the way your hegemony is run, who knew? Is this not why every socialist nation in history ends in tyranny? Because it's the only way social hegemony works. ###### unicorn air heads.

And back to gaming, I have no interest in game streaming. They all want us subscribing to their systems because that way they have their hand on the money taps; it's why people like Nvidia are so pig-headed about GeForce Now. No thank you.....
 
Nvidia knew how to market better, by using influencers on tech forums (who were later found out) and by sponsoring more games than ATI/AMD bothered to do. There was also a tech press which could get cut off from review parts, so tended to talk less about Nvidia problems.

One of them was the famous bump problem, where you had Nvidia dGPUs failing in laptops (link). It might also be why Apple and Nvidia never worked together again. The tech press went very quiet about it.
And there we have it, why Charlie is so disliked. Not the rumours, not the paywall, but that by exposing Nvidia's solder problems - and coining the term "bumpgate" due to how covered up it was - he didn't toe the "keep quiet or else" line. Else being access. Guess he can do without review samples. Killing the messenger only became necessary afterwards.
 
Globalism: if we all make ourselves dependent on each other then there are no more wars, because we are all friends. It's part of the reason for Germany's initial reluctance to condemn Russia properly; it's not just that they rely on them for their energy. The reason for that reliance is the same as the reason for the reluctance: they thought they could bring Russia around to the globalist idea. The UK could also easily be energy independent, we used to be; we chose not to be as a sacrifice to globalism.

Neither Russia, nor China, nor India, nor any of the Gulf nations... and so on, are interested in this global hegemony led ultimately by America; they are in fact now building a rival hegemony of their own.
Some people might not like the way your hegemony is run, who knew? Is this not why every socialist nation in history ends in tyranny? Because it's the only way social hegemony works. ###### unicorn air heads.

And back to gaming, I have no interest in game streaming. They all want us subscribing to their systems because that way they have their hand on the money taps; it's why people like Nvidia are so pig-headed about GeForce Now. No thank you.....

The big issue is that too many of the global systems assume the vast majority of countries are fine with being told what to do by a small number of countries, without having any real say over the consequences. But when those consequences manifest themselves in other countries, those countries are left to pick up the pieces (usually meaning their public has to endure hardships). If these systems were truly global, decisions would be taken in concert with representatives from most of the world. But sadly they are not. So what happens is that our global competitors exploit this chagrin to get smaller countries onto their own systems.

And there we have it, why Charlie is so disliked. Not the rumours, not the paywall, but that by exposing Nvidia's solder problems - and coining the term "bumpgate" due to how covered up it was - he didn't toe the "keep quiet or else" line. Else being access. Guess he can do without review samples. Killing the messenger only became necessary afterwards.
Yes, and many people on tech forums attacked him over it.
 
Wow, 8GB can't even run 1080p

People got scammed. Regardless of what coping excuse Nexus comes up with, if you paid good money for an 8GB card in the last couple of years you got swindled.

Not directly at you Grims, but in general

Using a game's day-of-release performance is poor evidence. But still people blame the gfx card. Or is this just a post that you know will make people bite? :cool:

Clearly obvious that if it's falling over with 8GB at 1080p then, as usual, it's a rushed release. This game doesn't have an excuse, seeing as it came to console years ago (10). HWUB just getting clicks for that. Revenue for them.

Looking at the Steam reviews for the game (mostly negative)...... so is it the game, or people blaming gfx cards because the game won't run?

Those who think it's the cards that are at fault are the ones who will be drawn into buying expensive cards, or through lack of intelligence be pushed to console.

Those who look at the game and the reviews will probably think to themselves: I won't pay for that, not at MSRP - probably be good in 6 months, when half price. LIKE ALL GAMES RELEASED OVER THE PAST 4 YEARS!

Dunno how hard PC gaming is, but some are so far up their chuffs in component knowledge, they are blinded to the simplest of things.

If I was a game dev, I'd look at the hardware available and what most people are using (Steam polls etc) to know what I need to make it run on, so people will buy it.

Apparently, you must upgrade hardware................... £50 for that game. Not worth £5 early access - maybe I'll spend another £2k for next-gen cards. The original on console is 10 years old.

If game code was as open as it used to be, some bod somewhere would have fixed it with ChatGPT in a week.

Even Steam's recommended specs only go up to 8GB of VRAM :cry: must be the cards though.................

It's not NV or AMD pulling the wool, it's the lack of nous applied when licking the backsides of global internet tech press sites, which know they just need clicks on whatever platform to keep revenue coming in. The internet was good until social media meant anyone has a platform. Too many people making lots of money from sheeple.

Who do you follow? Baaaaaaaaaaa

Every channel, tuber, website, whatever easy medium to reach people globally you choose, they all work on clicks - the more controversial, the more clicks - social media professional's rule No 1.

Baaaaaa where's my 5090, I NEEEEEEED to pony up for............................ or just don't buy that dribble of a port of a 10-year-old game. Hell, they managed to film 9hr episodes before this game was released.

THE LAST OF US................ recognise when it's the game - not the hardware. @Nexus18 @TNA
 