
QA Consultants concludes that AMD has the most stable graphics driver in the industry.

It's DDR4, not 3 - oh look, I'm defending it by pointing out an inaccuracy!

But yeah I agree they should not have used the same naming.

I didn't say that - I was talking from a global perspective, and about the Radeon R7 240 (Oland Pro).
These things go back to the ATi days - the Radeon X1600 Pro came in DDR2 and GDDR3 versions, too.
 
For the record, I think what Nvidia have done with the 1030 is terrible.

What Nvidia did with the 1050 4GB is reasonable. The 3GB will stop being produced, and the 4GB will be the same price and faster most of the time, occasionally slower, but it will have a longer life and you get 33% more VRAM.
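(For what it's worth, the 33% figure is just the relative step from 3GB to 4GB:)

```latex
\frac{4\,\mathrm{GB} - 3\,\mathrm{GB}}{3\,\mathrm{GB}} = \frac{1}{3} \approx 33\%
```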
What about the 970, what is your stance on that?

My understanding is Nvidia said it was a mixup with their marketing department, which is blatantly a lie. They tried it on and got caught.


So much petty arguing, not sure what it is about graphics cards that gets people so het up.
It’s all the sweet popcorn mate, all that sugar gets to some of us! :p
 
What about the 970, what is your stance on that?

My understanding is Nvidia said it was a mixup with their marketing department, which is blatantly a lie. They tried it on and got caught.



It’s all the sweet popcorn mate, all that sugar gets to some of us! :p

Lol, so at least there's an explanation :D ;)

On the subject of the 970, that WAS a scandal. But Nvidia was rightly called out on it. Thing is, all I care about with a GPU is that it runs my games as fast as it can; I don't give a monkey's who makes it. Just cannot get the brand loyalty some appear to have. I'm gonna keep this system for at least this year as it runs everything I throw at it, although I expect the G-Sync mitigates a lot of things about this old system; it's certainly made me think I'll also hang onto this 1070 for quite a while. I was running this lot with 8 Gigs of RAM until the other week. Crazy, because the extra 8 I added has breathed totally new life into this system - gaming is a MUCH smoother experience with 16GB as opposed to 8.

But were I to be shopping for a complete system now and wanted the best bang for my buck, it'd be a Ryzen, Vega 64 and Freesync monitor. That's where the sweet spot is for me right now if I was to be upgrading.

Still, bicker away to those enjoying it ;)
 
So much petty arguing, not sure what it is about graphics cards that gets people so het up.

Don't think it's so much the graphics cards. If you look at how people argue with flat earthers, it's like that, in that it's really difficult to understand why anyone would think that, lol. AMD drivers are a lot better than they have been in the past, that much is certainly true.

Personally haven't had any issues with NVIDIA drivers at all in the last 4 years that weren't related to overclocking.
 
Don't think it's so much the graphics cards. If you look at how people argue with flat earthers, it's like that, in that it's really difficult to understand why anyone would think that, lol. AMD drivers are a lot better than they have been in the past, that much is certainly true.

Speaking of how society thinks:
- Radeon drivers are very good, not bad like many would like to think;
- Frame-rate counts don't tell the whole story, whatever many would like to think;
- Radeon graphics should be available in a wider range of notebooks - right now AMD loses a lot of sales because of hardware retailers and consultants who bad-mouth them.

It's similar to the way people are expected to hate Russia. Have you seen how innocent, good and beautiful Russian women are?! Probably the best in the world.
So why do they deserve the whole world's negative energy ruining their society?!
 
Funny how some get hyper-defensive about Radeon drivers when people have been criticising the portrayal of nVidia drivers, not talking about AMD drivers.
 
It really surprises me when people say they don't see an IQ (image quality) difference between AMD and Nvidia. I have both my systems connected to the same screen and, without messing with anything, AMD looks more vibrant whereas Nvidia looks more muted. I've not noticed any rendering differences in games; I'm just noting that, colour-wise at least, AMD looks (to me) better at control panel defaults.
 
It really surprises me when people say they don't see an IQ (image quality) difference between AMD and Nvidia. I have both my systems connected to the same screen and, without messing with anything, AMD looks more vibrant whereas Nvidia looks more muted. I've not noticed any rendering differences in games; I'm just noting that, colour-wise at least, AMD looks (to me) better at control panel defaults.

There is a difference in quality: the AMD image is sharper, the colours are deeper and there seems to be a wider range.

I don't know why there is a difference, but using delta colour compression is one way of improving performance, and nVidia do a lot of that.
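(As an aside on delta colour compression: the vendors describe it as a lossless bandwidth-saving scheme, so on paper it shouldn't change the output colours at all. A toy Python sketch of the general idea, not Nvidia's actual algorithm:)

```python
# Toy illustration of delta encoding on one row of pixel values.
# Real GPU delta colour compression is a lossless, hardware-level scheme;
# this just shows why storing differences can take less space.
row = [118, 119, 119, 121, 122, 122, 123, 125]  # smoothly varying pixel values

reference = row[0]
deltas = [b - a for a, b in zip(row, row[1:])]   # small numbers, cheap to store

# Lossless reconstruction: adding the deltas back gives the original row.
reconstructed = [reference]
for d in deltas:
    reconstructed.append(reconstructed[-1] + d)

assert reconstructed == row  # no information lost, so image output is unaffected
print(reference, deltas)
```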
 
What about the 970, what is your stance on that?

My understanding is Nvidia said it was a mixup with their marketing department, which is blatantly a lie. They tried it on and got caught.

:p

Nvidia was definitely in the wrong with the 970, but I don't think it is the ultimate evil the AMD fanboys would have you believe.

The memory issue itself is pretty minor. The main issue is actually that it only had 56 ROPs and less L2 cache, which in itself is completely harmless - that is what cut-down cards do - but Nvidia seemed to indicate that the 980 and 970 were identical. Nvidia gave the specs and info for the full GM204 chip and implied the 970 was the same.

All Nvidia had to do was make it clear to reviewers that the provided GM204 specs were only for the 980, and that the 970 is a salvaged part with slightly reduced specs. Who actually cares how much L2 cache or how many ROPs a GPU really has, except some computer nerds? The main thing is the performance, which was clearly marked in reviews. There is no obligation to tell anyone the internal structure of the GPU, so Nvidia didn't have to tell anyone how the 970 looked in tech specs. But they absolutely should not have implied it was identical to the 980, and that is why they admitted their wrongdoing and paid out refunds.

The fact that 1/8th of the RAM is slower than the rest is not really a big issue, and I don't think Nvidia even had to make that explicit. Testing revealed it made no difference, because Windows likes to keep a few hundred MB anyway and the driver has some metadata. For all intents and purposes it behaved the same as if all 4GB were the same speed. And that is why people purchased it: because of the performance in benchmarks. Knowing that some memory was slower wouldn't have stopped people buying it, as it was a great performer for the money. Previous cards had a similar memory segmentation. The main issue is actually that the stated bandwidth in the initial reviews was wrong, so again, false advertising.
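To put some illustrative numbers on that, below is a rough back-of-the-envelope sketch in Python, using the per-segment figures widely reported at the time (the 3.5GB portion at roughly 196GB/s and the 0.5GB portion at roughly 28GB/s). It's a toy model of streaming an allocation once, not a claim about how the real hardware schedules accesses.

```python
# Toy model of the GTX 970's segmented VRAM. Segment sizes/bandwidths are
# the figures widely reported at the time; treat this as an illustration
# of why sub-3.5GB workloads see no penalty, not a hardware measurement.

FAST_GB, FAST_BW = 3.5, 196.0   # fast segment: size in GB, bandwidth in GB/s
SLOW_GB, SLOW_BW = 0.5, 28.0    # slow segment: size in GB, bandwidth in GB/s

def effective_bandwidth(used_gb):
    """Average bandwidth if 'used_gb' of data is streamed once, filling the
    fast segment before spilling into the slow one."""
    fast = min(used_gb, FAST_GB)
    slow = max(0.0, used_gb - FAST_GB)
    seconds = fast / FAST_BW + slow / SLOW_BW
    return used_gb / seconds

for gb in (2.0, 3.0, 3.5, 3.8, 4.0):
    print(f"{gb:.1f} GB streamed -> ~{effective_bandwidth(gb):.0f} GB/s effective")
```

Anything that stays under 3.5GB runs at full speed in this model, which lines up with the point that most real games never noticed; only allocations that genuinely spill into the last 0.5GB drag the average down.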


I don't know how intentional the errors in the 970 technical specs were. The tech specs were valid for the full GM204 GPU. The engineers were not the marketing people, so it is absolutely possible it was an accident. Some people claim it was blatant lying and misinformation to sell GPUs, but that doesn't make much sense. The GPUs would have sold anyway, as was clearly the case: the 970 went on to be the highest-selling mid-range GPU ever. At one point it made up something like 4% of the total Steam user base. People buy based on price, performance and ecosystem/brand, not how much L2 cache a GPU has. It is also obvious that the true specs would be discovered pretty quickly anyway, as was indeed the case.

Maybe Nvidia did blatantly try to scam customers, I don't know for sure, but the evidence doesn't exist and logically it doesn't make much sense. Nvidia was still guilty of false advertising, hence the class-action lawsuit which they settled.
 
It really surprises me when people say they don't see an IQ (image quality) difference between AMD and Nvidia. I have both my systems connected to the same screen and, without messing with anything, AMD looks more vibrant whereas Nvidia looks more muted. I've not noticed any rendering differences in games; I'm just noting that, colour-wise at least, AMD looks (to me) better at control panel defaults.


The key phrase there is "without messing with anything, AMD looks more vibrant".
AMD by default uses higher vibrancy; that isn't necessarily better image quality. If you are a professional working with images, you will be using much lower vibrancy, contrast and screen brightness.
You can tweak the Nvidia control panel to match AMD, but it is purely a personal preference.

It is like all the TVs in the showroom, with contrast and saturation ramped way up. Looks good to the layman, but anyone who really cares about image quality will be turning that right down. I have my monitors calibrated for photographic work. They look pretty dull, strangely dim with the brightness right down. But it makes editing photos much easier.
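(If anyone wants to see how much of "vibrant vs muted" is just a saturation setting, it's trivial to reproduce in software. A minimal sketch using Pillow; "photo.jpg" and the 1.4/0.8 factors are arbitrary placeholders:)

```python
# Demonstration that "more vibrant" is a saturation preference applied to the
# same underlying data, not extra image information. Requires Pillow.
from PIL import Image, ImageEnhance

img = Image.open("photo.jpg")          # placeholder: any photo will do

# Factor 1.0 leaves colour untouched; >1.0 boosts saturation much like a
# digital-vibrance slider, <1.0 pulls the image towards greyscale.
vibrant = ImageEnhance.Color(img).enhance(1.4)
muted = ImageEnhance.Color(img).enhance(0.8)

vibrant.save("photo_vibrant.jpg")
muted.save("photo_muted.jpg")
```

Both outputs come from exactly the same source file; a calibrated workflow targets the original values rather than either tweak.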
 
Funny how some get hyper-defensive about Radeon drivers when people have been criticising the portrayal of nVidia drivers, not talking about AMD drivers.

Let's not forget that what we are discussing here hasn't reached the people who don't read any discussion boards but who obviously do purchase computer hardware.
There is no way to inform this large population that it is safe to buy Ryzen and Radeon.
Their thinking has been shaped by all those years in which hardware sales consultants did only one thing: recommend Intel+nVidia combos.

It really surprises me when people say they don't see an IQ (image quality) difference between AMD and Nvidia. I have both my systems connected to the same screen and, without messing with anything, AMD looks more vibrant whereas Nvidia looks more muted. I've not noticed any rendering differences in games; I'm just noting that, colour-wise at least, AMD looks (to me) better at control panel defaults.

There should be a way to test this with calibration software. It just seems that Radeon's colour output, and the image quality in general, is closer to the content creators' intentions and to natural, true colours.

Nvidia was definitely in the wrong with the 970
The memory issue itself is pretty minor.
The fact that 1/8th of the RAM is slower than the rest is not really a big issue

Actually, the throttling appears at 3.2GB, not at 3.5GB. http://www.extremetech.com/extreme/198214-198214

Note that performance is constant between both cards until you hit the end of the memory pool in both cases. When the GTX 970 hits the 3.2GiB mark, performance craters, falling to less than 20% of its initial rating in some cases.
 
Actually, the throttling appears at 3.2GB, not at 3.5GB. http://www.extremetech.com/extreme/198214-198214

I often found that the 970 "4GB" behaved identically to my 780 3GB when you truly loaded it up with more actual data than there was VRAM - pretty much both would start to stutter or performance would crater at the same point. With some Skyrim mods (using a modified client), both cards would work fine up until 4GB of VRAM utilisation, and then even a few bytes past 4GB performance would drop off a cliff; other stuff would behave differently, i.e. the moment you went past ~3GB both cards would start to show signs of stutter, etc., depending on the nature of the VRAM utilisation, but in most cases if the 780 was fine the 970 was too, and vice versa.

Also, throttling doesn't automatically appear at 3.2GB; it depends on the actual utilisation. I played quite a few games where the 780 sat at 2.8GB and the 970 sat at 3.4-3.6GB, and both were perfectly smooth, as in most games it is just lazy garbage collection and "pre-caching", etc., rather than actually needing all that VRAM.
 
Let's not forget that what we are discussing here hasn't reached the people who don't read any discussion boards but who obviously do purchase computer hardware.
There is no way to inform this large population that it is safe to buy Ryzen and Radeon.
Their thinking has been shaped by all those years in which hardware sales consultants did only one thing: recommend Intel+nVidia combos.



There should be a way to test this with calibration software. It just seems that Radeon's colour output, and the image quality in general, is closer to the content creators' intentions and to natural, true colours.
If you used calibration software and calibrated the monitors, you would find there is zero difference. What you are claiming about images being closer to the content creators' intent is basically the exact opposite of reality. Nvidia don't use high vibrancy by default, precisely in order to keep colors more natural. It is just that many people are used to artificially over-saturated colors from their uncalibrated TVs.


Actually, the throttling appears at 3.2GB, not at 3.5GB. http://www.extremetech.com/extreme/198214-198214

Note that performance is constant between both cards until you hit the end of the memory pool in both cases. When the GTX 970 hits the 3.2GiB mark, performance craters, falling to less than 20% of its initial rating in some cases.

That is an artificial memory test. In actual games, even when loading up 4GB of memory, there was no difference between the 980 and the 970.
 
If you used calibration software and calibrated the monitors, you would find there is zero difference. What you are claiming about images being closer to the content creators' intent is basically the exact opposite of reality. Nvidia don't use high vibrancy by default, precisely in order to keep colors more natural. It is just that many people are used to artificially over-saturated colors from their uncalibrated TVs.

It comes from smartphones. +50% digital vibrance, the default setting in the NCP (Nvidia Control Panel), is too low and needs to be adjusted to +60%; otherwise you are closer to grey-scale imaging than to colourful imaging.
 