The GPU war is over.

IMO it's not over, not by a long shot.

Die shrinks are not as common as they once were, and I can't see another one coming for a long time after the next half-node shrink happens.

AMD will gain ground again technologically.

They just need to be smart for the next few years and just wait it out.
 
In my opinion the GPU war has just started. The RTG group, which needs time, has already made some improvements and will hopefully keep going in the right direction. Driver-wise, on a single card they look to be doing a good job. They have gained a decent chunk of market share back from Nvidia and maintained it. They will be in the next Xbox, as well as being in all current-gen consoles bar Nintendo's handheld. If Vega is good and the launch goes well, then I see them making further strides.

As has been said though, they need to handle their products well with no bad press. Quadrupling your share price in a year must show they have taken a step in the right direction. The road is a long one though, and they need to be able to rinse and repeat like Nvidia while bringing wanted features to their software suite.

But AMD needs to do more things like brand the consoles with some indication they are powering them. Nvidia seems to brand every little thing their chips are in with some indication that Nvidia is inside.

I suspect more people will know Nvidia is in the new NX than know AMD is in 4 blasted consoles!!

AMD assumes most gamers pay attention to what is said on websites; Nvidia does not, so they make sure they do an Apple on everything they do.
 
But AMD needs to do more things like brand the consoles with some indication they are powering them. Nvidia seems to brand every little thing their chips are in with some indication that Nvidia is inside.

I suspect more people will know Nvidia is in the new NX than know AMD is in 4 blasted consoles!!

AMD assumes most gamers pay attention to what is said on websites; Nvidia does not, so they make sure they do an Apple on everything they do.

I think both MS and Sony have something of a policy of not having branding (other than their own) on their consoles. I don't think AMD has a lot of say in that as such.
 
Was reading a transcript earlier from an interview with the AMD chief, and got the impression they're more excited by Zen than by the GPU side of things, though it was stated to be an important market. It also left me wondering if they could have some kind of competitive advantage by providing both the CPU and GPU, with some trickery going on that could give them some kind of performance advantage.
 
I think both MS and Sony have something of a policy of not having branding (other than their own) on their consoles. I don't think AMD has a lot of say in that as such.

Well, the thing is I expect Nvidia will try to do it with the NX, but ultimately it should at least be on the box somewhere.

Was reading a transcript earlier from an interview with the AMD chief, and got the impression they're more excited by Zen than by the GPU side of things, though it was stated to be an important market. It also left me wondering if they could have some kind of competitive advantage by providing both the CPU and GPU, with some trickery going on that could give them some kind of performance advantage.

Most of AMD's financial issues related to their CPU side (AMD has generally been profitable when it comes to GPUs anyway), so Zen doing well will no doubt have the biggest impact on them in years. After all, when AMD makes decent CPUs, they tend to have fewer issues selling them.

I do think that if Zen does OK, it will by extension help the GPU brand too, and no doubt a decent APU would be very nice in laptops. Imagine something closer to an RX 460 on the IGP side, but in a thin laptop with decent battery life. Or the said Zen APUs under DX12 could even be used for offloading some processing from the main graphics card for effects, etc. (a bit like what the PhysX cards did).
 
There seemed to be a lot of that around here, sure, but there were others pointing out that the 390 and 390X were better options in their opinion, so it kind of balanced out, because in every thread where the 970 was suggested, so was the 390.
And at the end of the day it was the better choice.

Yer, this!

Some recommended the 970 when specific GameWorks titles were a must or the price was cheaper, but most recommended the 390/390X (depending on budget) over the 970. The 970 I had competed nicely with the 290X I had, winning and losing a fair share, and both were the same price, so no wonder people were on the fence for either side. The 390X was dearer, but a better option than the 980, which was quite a bit dearer still.
 
Right. Fine.
You WILL find that at 1080p ultra the 970 WILL run out of VRAM in GTA V. That is fact and has been tested. Not to mention Shadow of Mordor and others.

Yep, and even when it doesn't quite run out, anything over 3.5GB cripples performance anyway due to the last 512MB being gimped :/
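To put rough numbers on that "gimped" last 512MB: after the 3.5GB controversy, Nvidia disclosed that the 970's memory is split into a fast 3.5GB segment and a slow 0.5GB segment. A minimal sketch using those published peak figures (real blended throughput depends on the access pattern, so treat this as an illustration, not a benchmark):

```python
# GTX 970 memory layout as disclosed by Nvidia. GB/s figures are peak
# bandwidths; real-world throughput depends on how accesses are spread.
FULL_BUS_BW = 224.0  # GB/s - nominal 256-bit bus at 7 Gbps GDDR5
FAST_SEG_BW = 196.0  # GB/s - the 3.5GB segment (7 of 8 memory channels)
SLOW_SEG_BW = 28.0   # GB/s - the last 0.5GB segment (1 channel)

slowdown = FAST_SEG_BW / SLOW_SEG_BW
print(f"The last 512MB is ~{slowdown:.0f}x slower than the main 3.5GB pool")
# → The last 512MB is ~7x slower than the main 3.5GB pool
```

So once a game's working set spills into that last half gigabyte, reads there crawl along at a seventh of the speed of the main pool, which is why framerates tank well before the nominal 4GB is exhausted.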
 
Right. Fine.
You WILL find that at 1080p ultra the 970 WILL run out of VRAM in GTA V. That is fact and has been tested. Not to mention Shadow of Mordor and others.

So do all other 4GB cards, like the Fury X and 980. Hardly surprising that if you go over 4GB, then cards with 4GB of VRAM struggle.
 
And when you need more than 4GB, do you tap into the magic reserves? Gimped or ungimped, you can't use what you don't have... Because GTA V on ultra will regularly go over 4GB.

GTA V aside (where it will actually use the VRAM, as shown in the menu when configuring settings), you have to be a bit careful when judging VRAM utilisation, and even then what the impact on performance is.

Before the 1070 I was using both a 970 and a 780 side by side for a bit, and quite often the 780 would be at ~2.7-2.8GB of VRAM in use and the 970 at ~3.4GB, but with zero impact on the 780's performance, because it was not running out of VRAM - the game was just only flushing data when it needed to (there might have been the very odd scenario where the 970 was slightly smoother due to having data it needed already "cached" in VRAM, but I was never able to nail that down).

On the other hand, when you had something that actually needed and used more VRAM than the cards had, the story wasn't straightforward either. Sometimes going even a couple of MB over would make the game feel heavy and stuttery and/or performance dived; other times you could go hundreds of MB over with minimal or even no noticeable impact, if the data swapping wasn't intensive enough to saturate the slower transfer rates. E.g. with Skyrim with ENB at 4K I could make it peak at 5GB of VRAM used, but both the 970 and 780 would be perfectly smooth all the way up to 4GB of VRAM usage and then fall apart completely just over 4GB - despite the 780 at that point being 1GB beyond its onboard VRAM and the 970 deep into its slower VRAM pool.
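That saturation point can be sketched with back-of-envelope arithmetic: data spilling past onboard VRAM has to be streamed over PCIe, which is an order of magnitude slower than GDDR5, so what matters is how much data must move per frame versus the frame budget. The figures below are assumptions (~12 GB/s practical PCIe 3.0 x16 throughput, a 60fps target), not measurements:

```python
# Rough model: stutter appears when the data swapped over PCIe per frame
# takes longer to transfer than the frame budget. Figures are assumptions.
PCIE_BW_GBS = 12.0            # practical PCIe 3.0 x16 throughput, GB/s
FRAME_BUDGET_MS = 1000 / 60   # ~16.7 ms per frame at 60fps

def swap_time_ms(mb_per_frame):
    """Time to stream mb_per_frame megabytes over PCIe, in milliseconds."""
    return mb_per_frame / 1024 / PCIE_BW_GBS * 1000

for mb in (50, 500):
    t = swap_time_ms(mb)
    verdict = "fine" if t < FRAME_BUDGET_MS else "stutter"
    print(f"{mb:>3} MB/frame -> {t:5.1f} ms ({verdict})")
```

Under these assumptions, swapping ~50MB a frame (~4ms) hides comfortably inside the 16.7ms budget, while ~500MB a frame (~41ms) blows it several times over - which matches the experience above of being hundreds of MB "over" with no visible impact in one game and falling apart in another.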
 
But AMD needs to do more things like brand the consoles with some indication they are powering them. Nvidia seems to brand every little thing their chips are in with some indication that Nvidia is inside.

I suspect more people will know Nvidia is in the new NX than know AMD is in 4 blasted consoles!!

AMD assumes most gamers pay attention to what is said on websites; Nvidia does not, so they make sure they do an Apple on everything they do.

Tbh I would not be surprised if being in the consoles put people off buying their hardware for PCs. Some PC users are under the impression that consoles are for peasants only, and AMD doesn't want to be seen in that light any more than they already are.

I know what you are saying though. They need to be taking a lot more credit and pushing the good things they do.
 
Tbh I would not be surprised if being in the consoles put people off buying their hardware for PCs. Some PC users are under the impression that consoles are for peasants only, and AMD doesn't want to be seen in that light any more than they already are.

I know what you are saying though. They need to be taking a lot more credit and pushing the good things they do.

Please don't put junk and stickers on the hardware. Thank goodness we're now getting to the stage where most laptops no longer come with Intel Inside, Nvidia and all the other stickers on them by default.

Have a small AMD/Nvidia logo on the cardboard box (maybe in the specs card on the side) and something small on the splash screen when the console turns on.
 
If AMD had stickers on all their consoles, it would probably associate AMD with gaming. Surely seeing the AMD logo over and over is going to create some sort of positive association?

No publicity is bad publicity.
 
Please don't put junk and stickers on the hardware. Thank goodness we're now getting to the stage where most laptops no longer come with Intel Inside, Nvidia and all the other stickers on them by default.

Have a small AMD/Nvidia logo on the cardboard box (maybe in the specs card on the side) and something small on the splash screen when the console turns on.

Yeah, I agree. On the boxes is fine.

If AMD had stickers on all their consoles, it would probably associate AMD with gaming. Surely seeing the AMD logo over and over is going to create some sort of positive association?

No publicity is bad publicity.

Yep, my original post was a bit of a laugh, but getting an AMD logo on there might do them some good.
 
I have personally owned a lot of cards, both AMD and Nvidia, including a 7950, GTX 760, 290, 290X 8GB, 970, 980, RX 480 and 1060, all overclocked... If you played with any of them without a frame rate monitor you probably wouldn't notice a huge difference between any of them. One thing is for sure though: all those AMD cards have been horribly hot compared to the Nvidia equivalents. Also, at the moment I can play some games maxed out, like Warcraft Legion and FIFA 17, without my card's fans even having to come on, and all under 60 Celsius - can AMD produce a card like that? I doubt it.

I really want to like AMD, and their performance is very good at times, but I can't stand how hot their products are, or how power hungry they are at their performance level.

In all honesty I feel mentally 'safe' choosing Nvidia these days.

Also, for reference to the above talk, I would like to see an overclocked 390X beat an overclocked 980 in more than a very small handful of games, as no way is it a faster card.
 
I've got nothing against AMD, but lots of people here say that AMD drivers are better than Nvidia's.
Well, I bought an RX 480 the day it was released and I've been having problems with it ever since, whereas my 980 Ti works perfectly fine every time.
How many driver updates have we had since the RX 480 was released? A lot, and stupid Wattman crashes every other time that I want to open it.
I bought the RX 480 for my HTPC to have HDMI 2.0 so I can display 4K/60 on my new OLED TV, and what I'm getting is not what I was hoping for.
I can't get HDMI Deep Colour enabled unless I set the refresh rate to a max of 30Hz, so I'm stuck with 8-bit 4:2:0 regardless of the cable I use.
Everything works fine with the Nvidia card.
I'm not complaining about the performance of the RX 480, as I bought it at £176 and flashed it to 8GB, but performance isn't everything.
Oh, and it's impossible to monitor GPU usage on the RX 480, as it will only show correct usage when it's at 0% or 100% load; anything in between shows constant jumps from 0% to 100%, and that just ****es me off.
 
I've got nothing against AMD, but lots of people here say that AMD drivers are better than Nvidia's.
Well, I bought an RX 480 the day it was released and I've been having problems with it ever since, whereas my 980 Ti works perfectly fine every time.
How many driver updates have we had since the RX 480 was released? A lot, and stupid Wattman crashes every other time that I want to open it.
I bought the RX 480 for my HTPC to have HDMI 2.0 so I can display 4K/60 on my new OLED TV, and what I'm getting is not what I was hoping for.
I can't get HDMI Deep Colour enabled unless I set the refresh rate to a max of 30Hz, so I'm stuck with 8-bit 4:2:0 regardless of the cable I use.
Everything works fine with the Nvidia card.
I'm not complaining about the performance of the RX 480, as I bought it at £176 and flashed it to 8GB, but performance isn't everything.
Oh, and it's impossible to monitor GPU usage on the RX 480, as it will only show correct usage when it's at 0% or 100% load; anything in between shows constant jumps from 0% to 100%, and that just ****es me off.

Emm, you do know about the problems with HDMI, right??


Article from Guru3D said:
There is an interesting read available today at Heise.de; it is a German-based website though, so allow me to relay their findings. As you all know, 10-bit HDR is one of the emerging technologies that you can, for example, enjoy on the new Polaris-based PlayStation and your RX 400 series based graphics card.

As it turns out (and really this is not an issue specific only to AMD), AMD is fighting the HDMI protocol and specification, as even HDMI 2.0 does not have enough bandwidth for 10-bit HDR over HDMI, specifically at 4K and a 60Hz refresh rate with 4:4:4 YCbCr sampling (2160p60 / 10bpc).



To stay within the bandwidth limits, it turns out that AMD is applying 4:2:2 or 4:2:0 sampling, and thus shares red and blue colour components to get to a lower bitrate over HDMI. The information itself is not exactly a secret; in fact AMD shared this information already during the Polaris launch. However, AMD claimed that they supported 10-bit HDR gaming as well, and that is not right. On HDMI 2.0a the colour depth is also lowered to 8-bit with dithering. Considering that the PlayStation 4 also is Polaris based, we can only assume the same happens there.

In a test at Heise they checked out Shadow Warrior 2 in HDR on a Radeon RX 480, which showed similar visual results to a GeForce GTX 1080. So it seems this is the case for Nvidia as well, and likely Nvidia is using a similar trick at 8-bit also. Nvidia has not yet shared info on this though. According to Heise, they did see a decrease in performance with Nvidia, whereas the RX 480 performance remained the same.

The solution, if you have a 10-bit compatible HDR monitor, is to use DisplayPort 1.4 (supported by Polaris), though these will only become available in volume early next year. At this time we are not sure what this entails and means for playback of HDR supported movies on an HDR compatible Ultra HD TV over HDMI 2.0.
 
That's weird, because I can get 10-bit at 60Hz with the 980 Ti but not with the RX 480.

Well, it's not working on the GTX 1080 either, as that German site tested it too.

http://www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx#146

That is from the HDMI website.

– 2160p, 10/12 bits, 24/25/30Hz, RGB/4:2:2/4:4:4
– 2160p, 10/12 bits, 50/60Hz, 4:2:0/4:2:2

That means you need chroma subsampling to get 60Hz 10-bit colour, since it is a bandwidth issue.

That is from the horse's mouth.

DisplayPort 1.4 has enough bandwidth.
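The arithmetic behind that limit is straightforward: HDMI 2.0 carries 18 Gbit/s of TMDS, of which 8b/10b encoding leaves roughly 14.4 Gbit/s for pixel data, and 4K60 needs a 594 MHz pixel clock once blanking is included. A simplified sketch of which modes fit (it ignores some details of how HDMI actually packs subsampled formats, but the fit/no-fit results match the HDMI.org table above):

```python
# HDMI 2.0: 18 Gbit/s TMDS minus 8b/10b encoding overhead.
HDMI20_DATA_GBPS = 18.0 * 8 / 10  # ~14.4 Gbit/s usable for pixel data
PIXEL_CLOCK_HZ = 594e6            # 4K60 incl. blanking (4400 x 2250 x 60)

# Average colour components per pixel: 4:4:4 keeps all three,
# 4:2:2 averages two, 4:2:0 averages one and a half.
def fits(bit_depth, components_per_pixel):
    gbps = PIXEL_CLOCK_HZ * bit_depth * components_per_pixel / 1e9
    return gbps <= HDMI20_DATA_GBPS

print("4K60  8-bit 4:4:4:", fits(8, 3.0))   # just squeezes in (~14.3 Gbit/s)
print("4K60 10-bit 4:4:4:", fits(10, 3.0))  # too big (~17.8 Gbit/s)
print("4K60 10-bit 4:2:2:", fits(10, 2.0))  # fits once chroma is subsampled
```

So 8-bit 4:4:4 at 4K60 only just fits, while 10-bit 4:4:4 would need ~17.8 Gbit/s, which is why the spec only lists 4:2:0/4:2:2 for 10/12-bit at 50/60Hz.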
 