
Poll: ** The AMD VEGA Thread **

On or off the hype train?

  • (off) Train has derailed

    Votes: 207 39.2%
  • (on) Overcrowding, standing room only

    Votes: 100 18.9%
  • (never ever got on) Chinese escalator

    Votes: 221 41.9%

  • Total voters: 528
Why do people voluntarily brand-lock themselves when buying Gsync/Freesync monitors? What's wrong with Vsync? It works fine for me. And I can avoid being like those with thousands of posts saying "I'd love to buy 'x' GPU but I've just spent £1000 on a Gsync/Freesync monitor so I have to buy an Nvidia/AMD GPU." Madness. Time and time again I see people posting that they want a certain GPU but can't buy it because of their monitor. Why do that to yourself?

Because vsync isn't even close to freesync/gsync?
 
Dunno. The way I do it is look at features (27" 144hz, 1ms response, FreeSync/Gsync) and then go off and find monitors in my price range around £500-600.

Then look at reviews; if they're generally thumbs up, purchase.

So for me, whether FreeSync or Gsync, it's the same price.

Maybe I'm missing something, but I did exactly the same and couldn't find any G-sync monitors anywhere close to the price of the FreeSync ones (my criteria were 34" 1440p ultrawide, IPS, with sync technology)

A quick example from OcUK alone:

All are 27", 1440p, IPS screens with a minimum of 144Hz:

Acer Freesync - £500 - https://www.overclockers.co.uk/acer...ing-widescreen-led-monitor-bla-mo-095-ac.html

Acer G-sync - £700 - https://www.overclockers.co.uk/acer...descreen-led-monitor-black-red-mo-098-ac.html

£200 saving by going freesync


Asus Freesync - £560 - https://www.overclockers.co.uk/asus...escreen-led-slim-bezel-monitor-mo-083-as.html

Asus G-sync - £900 - https://www.overclockers.co.uk/asus...descreen-led-monitor-black-red-mo-088-as.html

£340 saving by going freesync



Why do people voluntarily brand-lock themselves when buying Gsync/Freesync monitors? What's wrong with Vsync? It works fine for me. And I can avoid being like those with thousands of posts saying "I'd love to buy 'x' GPU but I've just spent £1000 on a Gsync/Freesync monitor so I have to buy an Nvidia/AMD GPU." Madness. Time and time again I see people posting that they want a certain GPU but can't buy it because of their monitor. Why do that to yourself?

For me after trying freesync and g-sync it's really really difficult to go back to just V-sync.

I'm having to use V-sync now via my nvidia gpu and freesync screen and the experience is very jolting coming from using a variable refresh monitor before.

Now I have to make sure the game has a constant frame rate locked at 60Hz; any drop from that and stuttering occurs. If I just cap the frame rate at 60Hz without V-sync I get tearing as well.

Before, when using free/g-sync monitors, any brief drops below the refresh rate weren't noticed; it was lovely and smooth whether the game was running at 50fps or 100fps, all the frames in between were perfectly smooth and I enjoyed the games so much more.
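To put rough numbers on that stutter, here's a quick sketch (plain Python, made-up frame times, hypothetical 30-144Hz panel) of how long each frame actually sits on screen under fixed 60Hz V-sync versus a variable refresh panel:

```python
# Rough illustration: how long each frame is held on screen when a game
# occasionally misses the 60Hz deadline, under fixed-refresh V-sync versus a
# variable-refresh (FreeSync/G-sync) panel. Frame times below are made up.
import math

REFRESH_MS = 1000 / 60                            # 16.7ms scanout interval at 60Hz
VRR_MIN_MS, VRR_MAX_MS = 1000 / 144, 1000 / 30    # hypothetical 30-144Hz panel

render_times_ms = [14, 15, 18, 16, 21, 15, 14, 19]  # a few frames miss 16.7ms

def vsync_hold(t):
    # V-sync waits for the next full refresh, so an 18ms frame is shown for 33.3ms
    return math.ceil(t / REFRESH_MS) * REFRESH_MS

def vrr_hold(t):
    # A VRR panel refreshes when the frame is ready, within its supported range
    return min(max(t, VRR_MIN_MS), VRR_MAX_MS)

for t in render_times_ms:
    print(f"render {t:5.1f}ms -> vsync hold {vsync_hold(t):5.1f}ms, vrr hold {vrr_hold(t):5.1f}ms")
```

Every frame that misses the 16.7ms deadline gets held for a full 33.3ms under V-sync, which is the judder described above; the VRR column just tracks the render time, so anywhere between 50fps and 100fps still paces evenly.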
 
Maybe I'm missing something, but I did exactly the same and couldn't find any G-sync monitors anywhere close to the price of the FreeSync ones (my criteria were 34" 1440p ultrawide, IPS, with sync technology)

A quick example from OcUK alone:

All are 27", 1440p, IPS screens with a minimum of 144Hz:

Acer Freesync - £500 - https://www.overclockers.co.uk/acer...ing-widescreen-led-monitor-bla-mo-095-ac.html

Acer G-sync - £700 - https://www.overclockers.co.uk/acer...descreen-led-monitor-black-red-mo-098-ac.html

£200 saving by going freesync


Asus Freesync - £560 - https://www.overclockers.co.uk/asus...escreen-led-slim-bezel-monitor-mo-083-as.html

Asus G-sync - £900 - https://www.overclockers.co.uk/asus...descreen-led-monitor-black-red-mo-088-as.html

£340 saving by going freesync

Look wider. ;)

My comparison isn't Apples to Apples but for me it's close enough as the features are the same. As long as price is in my bracket and reviews are good then it's a goer.
 
As a consumer, I bought a GTX 1070 for 380 dollars. I sold it in order to eventually get Vega and a Ryzen system. To now find that the Vega 56 is basically a GTX 1070 for the same money is very frustrating, when AMD have had 15 months or whatever to get to this place.
I feel you man. I also had a 1070 which I sold and after all this time all AMD can do is match it, which is very disappointing.

Unless Vega 56 turns out to be closer to 1080 performance, at least in AMD games, it will be disappointing at $399. AMD needed to hit the $299 price point with Vega 56 (assuming it is around 1070 performance) and they would have got a lot of praise, like they did with Ryzen.
 
I still have the 8800GT in a drawer, it's a great card. Perfect all round really, really low power too. It's like a 1050 now I guess, except it was quite top end in its day; literally runs Crysis :cool:

Yep I've seen that.
Are any of those newer specs on Vega likely to be pivotal? Remember the BF2 launch: would there be anything that makes people say 'I have to chuck my Pascal out now, it can't run this'? Not now obviously, but in the next 3 years, say.
No games need HBCC.

I'm hoping HBCC works so well this won't be written in a month's time. Would be nice if Vega had something, but it may also be nothing.
They should have called this card the Mirage; every time we seem to be getting close it's still in the distance, and it may not exist in any meaningful way at all.


AMD have always been first to market with features:

1. 2GB VRAM. My 5870 had it, which is why I bought it over the GTX 470.
2. Eyefinity, the first iteration of proper multi-monitor support.
3. DX10.1, again AMD first.
4. Vulkan, in the guise of Mantle, developed by AMD.
5. DX12 feature sets, with more supported by AMD than Nvidia.
6. Now we have HBCC and FP16 hardware support.

GDDR5 on the 4870. Turned out vastly better than the 2900XT, though it's still a hot one.
I only hope that's an omen for Vega with its 'new' memory.

Vega seems fine for max settings at 1440p at least. Can't see anyone being particularly unhappy with a Vega card.
Vega 64, yeah, but Vega 56 for 1440p seems wrong. The 56 is more for the mainstream; maybe the price sucks in that regard, but it's tilted towards that end. We'll see if it can cope or not, but 1080p is mainstream and that's where I see it probably being best paired.

The likely complaints for 'Vega 64 is a bad card' would be noise and heat. Price or performance is probably not going to be far off the mark. People can be very picky about the amount of noise a card produces. I don't think the liquid version is the one to go for, but the third-party cards may do it far better than the reference design, so it's about patience and waiting for a good review to show up.

I only care about the type of noise really; more than load noise, even the idle fan speed being silent would be good. Decibel figures don't really capture how intrusive a noise feels, and to me a particular noise can be far worse than others. I set up all my old fans and luckily they all still run basically silent; just a gentle whoosh is fine and I tune that out. I presume the delay to Vega is from HBM2, not driver problems; that might be the only other reason for genuine disappointment. Also not likely, as Frontier apparently runs OK, with just some concerns over 'gaming mode' not being implemented.
 
Yeah, colour calibration has been a "problem" for Nvidia for a long time for some reason; it's been pointed out quite often on screenshots. Now, is that because Nvidia set their colours to be more realistic, translating into something less vibrant? Don't know, but I know I'm not necessarily looking for "real life" colours in a game personally.
Nvidia set colours with less contrast and saturation, which presents a more accurate rendition. Easy enough to ramp up the vibrance.
 
Seems most logical to jump on Vega. It's got support for newer standards, stuff that Nvidia will bring when Volta launches. If I go 1080 I am going to be behind in terms of hardware support features.

Let's put it this way: if this was Volta that had just launched, with an AMD launch looming 6-9 months away, I wouldn't think twice about buying... but AMD seems to be much harder. Like I've said, my last ATI/AMD card was a 5870 and I moved over to Nvidia because I felt the driver support was much better (Game Ready) and things like PhysX appealed.

But I think this time round considering where we are in product launches and the fact I already have a Ryzen, Vega is most likely my next purchase.

If the 1080 had dropped in price, then maybe I could warrant the purchase but not at £500+.



The features and technology are there though: support for the full DX12 feature set, while the Nvidia cards are still full price or more.

Obviously Volta will be another leap in performance but that's just how it always goes.

I mean the developers and games AMD have got on board are already impressive right?



I would completely ignore all this rubbish about DX12 feature sets.



Pascal supports more DX12 features than Polaris/Fiji/Hawaii, and some of these features provably increase performance and/or visual quality:
https://developer.nvidia.com/content/dont-be-conservative-conservative-rasterization


Nvidia enjoyed 70-80% discrete GPU market share during this period, and Nvidia has unarguably by far the best developer relations and developer support. Yet how many games were released that actually made use of these additional features?



In a similar vein, Intel's IGPs have actually had by far the highest level of DX12 support, and those IGPs need as much help as possible. Accounting for both discrete and integrated graphics, Intel has 70% market share, but developers didn't jump on the opportunity to add a load of support.





This has always been the case with graphics cards and games. Games are developed around a set of common denominators, which for now is mostly DX11 and Windows 7. Developers don't like to spend time adding features and functionality that only some users can make use of, which is why things like SLI/CF seldom work well and the latest features are never used.
 
AMD have always been first to market with features:

1. 2GB VRAM. My 5870 had it, which is why I bought it over the GTX 470.
2. Eyefinity, the first iteration of proper multi-monitor support.
3. DX10.1, again AMD first.
4. Vulkan, in the guise of Mantle, developed by AMD.
5. DX12 feature sets, with more supported by AMD than Nvidia.
6. Now we have HBCC and FP16 hardware support.


While there is no denying AMD often do add features that will be useful in the future, or have a forward-looking architecture, as explained this is often a waste of time, because by the time those features are useful everything has moved on two generations, and then AMD never seems to catch up once the tech is utilised. For example, ATI introduced tessellation long before it became a DX requirement, and it was basically never used. Then, when tessellation was added to the DX spec, Nvidia produced a very good implementation while AMD struggled to get a performant version out.

If we look at your list:
1) AMD cards have mostly required additional bandwidth over Nvidia cards, because Nvidia GPUs have more advanced bandwidth-saving techniques while AMD relies on brute force. One way to get additional bandwidth is a wider bus with more chips, which naturally leads to higher VRAM amounts and adds cost (see the bandwidth sketch after this list). This is only ever important if you have games that max out VRAM, which just doesn't happen that often, and when it does it usually also taxes the GPU heavily. And here we are today, with Nvidia offering 11/12GB of VRAM on their top GPUs, which might come close to using that amount at 4K/SLI/ultra settings, while AMD's best consumer offering is 8GB. As some balance, I don't think the Nvidia 1060 3GB models are a great buy if you want to hold on to the card for a long time; the 6GB models or Polaris are better. But for a year or two the 3GB is likely fine.



4) Nvidia had Vulkan drivers certified before AMD. Nvidia had a load of developer support, for example open-source code, documentation, etc., all online a long time before AMD did. And the last time I checked, Nvidia were still well ahead on Vulkan developer support.

5) Nvidia has offered higher DX12 feature support in Pascal and Maxwell than the AMD competition. It is only with Vega that AMD has caught up and taken over. In 6 months Volta will likely be at the same level.


6) HBCC is designed for HPC, not gaming. It can only really help in gaming when the GPU runs out of VRAM, which doesn't typically happen. And then see point 1.

As for FP16, Nvidia actually had FP16 support way back on the FX 5 series cards, back when the lower precision was really important to get the most performance out of the new pixel shader technology. FP16 support was dropped because GPUs became powerful enough not to need to drop from FP32, and in fact the DX spec requires FP32 as a minimum. FP16 has only resurfaced now because it is useful for deep learning, which is why Nvidia added FP16 support to the Pascal GP100 released 16 months ago. Since AMD have yet to differentiate their compute and gaming cards, Vega happens to also have FP16 support for gaming. There are limited cases where FP16 is sufficient and will lead to minor performance increases in specific scenarios; this isn't some game-changing feature, and it certainly isn't some highly novel, ground-breaking technology that AMD are the first to develop. Nvidia sold GPUs with FP16 support 15 years ago.
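To be clear on why FP16 only helps in limited cases, here's a quick look at what the format itself actually gives you (plain NumPy, nothing GPU-specific):

```python
# FP16 carries roughly 3 decimal digits of precision and tops out around 65504,
# which is why it only suits workloads that can tolerate that (some shading,
# deep-learning inference) rather than replacing FP32 across the board.
import numpy as np

print(np.float32(1 / 3), np.float16(1 / 3))   # the same value at the two precisions
print(np.finfo(np.float16).max)               # 65504.0
print(np.float16(70000.0))                    # overflows straight to inf
print(np.float16(1.0) + np.float16(0.0001))   # the 0.0001 is lost entirely
```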
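And going back to point 1, the bandwidth sketch mentioned there: peak bandwidth is just bus width times effective data rate, so a wider bus with more chips buys bandwidth (and VRAM, and cost). The configurations below are illustrative only, not the specs of any particular card:

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate in GT/s.
# Illustrative configurations, not taken from any specific card's spec sheet.
def bandwidth_gbs(bus_bits, data_rate_gtps):
    return bus_bits / 8 * data_rate_gtps

configs = {
    "256-bit bus @ 8 GT/s GDDR5":   (256, 8.0),
    "384-bit bus @ 7 GT/s GDDR5":   (384, 7.0),   # wider bus, more chips, more VRAM, more cost
    "2048-bit bus @ 1.9 GT/s HBM2": (2048, 1.9),  # very wide, slower per pin, stacked memory
}
for name, (bits, rate) in configs.items():
    print(f"{name}: {bandwidth_gbs(bits, rate):.0f} GB/s")
```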
 
Source? From everything I've seen, there's not much in Volta for gamers. At the very least nVidia haven't said anything about it for the consumer side, it seems to be largely pro focused.

I just mean, if they are both on a two-year cycle, we are less than a year from Volta gaming cards. But if AMD are going to keep being a year late to the party, Nvidia are going to keep leapfrogging them, and AMD don't look to be improving the price/performance situation, so it's looking increasingly glum for people who are doggedly sticking with them, whether that's because of buying into FreeSync or whatever.
 
If this is correct, how do you explain the greater performance advance a RX480/580 has over the 1060 cards?

What performance "advance"?

https://www.techpowerup.com/reviews/Gigabyte/GTX_1060_Xtreme_Gaming/26.html

And even if we assume the RX480 is somehow significantly faster than the 1060, that has nothing to do with DX12 feature support, because these features just are not used in many games.


Pascal supporting higher DX12 feature levels than Polaris and Fiji is just a fact. It is meaningless to the performance of the card, just like Vega's extra DX12 feature support won't make any difference.
 
I do get the impression tho that overall Gsync is better than Freesync. Greater frequency range.

In what situations is it better? I thought G-Sync refresh rates range from 30Hz to 144Hz. The FreeSync spec is capable of refresh rates that range from 9Hz to 240Hz.

In the independent blind test, more gamers actually preferred the freesync experience to the gsync one.
 
That's not really true though is it? You have a Freesync monitor now and you need 1080 performance, you buy a Vega card to go with your Freesync monitor. It's going to be a hell of a lot more expensive to sell the freesync monitor and buy a gsync monitor. You are also going to be paying the higher prices from Nvidia too for those 5 years.

Volta could be another 6/7 months away. And if it follows the same release pattern as Pascal, then the Volta X80 cards will be at Ti prices for 6 months. So you could be waiting a year more to buy.

The narrative AMD are spinning is that they know their GPU prices suck, but their monitor pricing makes up for it... this presumes people don't currently have a FreeSync monitor... those are the people AMD are trying to woo into becoming locked into FreeSync... an in-depth look at the economics of GPU+monitor choice doesn't quite stand up to scrutiny though
 
In what situations is it better? I thought G-Sync refresh rates range from 30Hz to 144Hz. The FreeSync spec is capable of refresh rates that range from 9Hz to 240Hz.

In the independent blind test, more gamers actually preferred the freesync experience to the gsync one.

Gsync supports whatever the panel will support. There aren't any panels that support 9Hz, so saying FreeSync supports that is quite misleading. If anything, FreeSync monitors on a like-for-like basis currently have a narrower supported range than the equivalent Gsync monitors.
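The range matters mostly because of low framerate compensation: frame doubling below the panel's minimum only kicks in when the top of the range is sufficiently far above the bottom (the commonly quoted figure is around 2.5x; treat it as approximate). A quick check with some hypothetical ranges:

```python
# Low framerate compensation (frame doubling below the panel's minimum refresh)
# only works when the VRR range is wide enough. The max >= 2.5x min threshold
# used here is the commonly quoted figure; treat it as approximate.
LFC_RATIO = 2.5

def lfc_capable(min_hz, max_hz, ratio=LFC_RATIO):
    return max_hz >= ratio * min_hz

# Hypothetical ranges, not taken from any specific monitor:
for lo, hi in [(48, 75), (40, 144), (30, 144)]:
    print(f"{lo}-{hi}Hz range: LFC {'yes' if lfc_capable(lo, hi) else 'no'}")
```

That's why a narrow-range FreeSync panel falls back to tearing or V-sync behaviour below its minimum refresh, while a wide-range panel keeps things smooth.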

There have been several blind tests that have gone either way - in the one you are referencing they pitched a VA panel against an IPS panel, adjusted the IQ settings down and then asked everyone leading questions based purely on motion clarity (which VA panels are pretty famous for beating IPS on generally), with a single game sponsored by one of the vendors.
 
I do get the impression tho that overall Gsync is better than Freesync. Greater frequency range.
Nvidia actually developed Gsync as a superior alternative to Freesync (which they had been testing in their mobile solutions) so I would expect it to be better, however since AMD got involved with Freesync it has improved quite a bit I am told.
 
The narrative AMD are spinning is that they know their GPU prices suck, but their monitor pricing makes up for it... this presumes people don't currently have a FreeSync monitor... those are the people AMD are trying to woo into becoming locked into FreeSync... an in-depth look at the economics of GPU+monitor choice doesn't quite stand up to scrutiny though
Tbh all GPU prices suck just now.
 
In what situations is it better? I thought G-Sync refresh rates range from 30Hz to 144Hz. The FreeSync spec is capable of refresh rates that range from 9Hz to 240Hz.

In the independent blind test, more gamers actually preferred the freesync experience to the gsync one.



Gsync can support something like 1Hz to 1000Hz; in reality it supports whatever the monitor does, and in the case of Gsync most monitors offer a wider range than FreeSync.

The only detailed blind test with a sufficient sample size that I am aware of is from TomsHardware, and a majority of people preferred Gsync. That is quite old now and hopefully FreeSync has improved. I know HardOCP did a useless test with a very small sample size and no conclusive results, where most people said they couldn't tell a difference, which is bound to be the case when both GPUs would hit the max refresh rate of the monitor.
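On the sample size point, a rough back-of-the-envelope check (exact binomial test against a 50/50 coin flip, purely hypothetical vote counts) shows why a handful of testers can't settle anything:

```python
# Why a tiny blind test can't settle a preference question: an exact two-sided
# binomial test against a 50/50 coin flip, with purely hypothetical vote counts.
from math import comb

def binom_pvalue_two_sided(k, n, p=0.5):
    # probability of an outcome at least as unlikely as k successes out of n
    probs = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
    return sum(pr for pr in probs if pr <= probs[k] + 1e-12)

print(binom_pvalue_two_sided(6, 10))    # ~0.75  -- 6 of 10 preferring one setup tells you nothing
print(binom_pvalue_two_sided(60, 100))  # ~0.057 -- even 60 of 100 is only borderline
```

A 6-out-of-10 split is completely consistent with no preference at all, and even 60 out of 100 only just approaches significance.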
 