
Poll: The Vega Review Thread.

What do we think about Vega?

  • What has AMD been doing for the past 1-2 years?

  • It consumes how many watts and is how loud!!!

  • It is not that bad.

  • Want to buy but put off by pricing and warranty.

  • I will be buying one for sure (I own a Freesync monitor so have little choice).

  • Better red than dead.


Quite a lot of potential reasons - nVidia's drivers do quite a bit of work in software which will use more RAM, including hooking/intercepting DX API functions, forced threading models, etc. There is also some difference in the way shader caching (and probably other caching) works, where data is kept resident in memory even if it isn't being used, unless a low-memory situation is encountered, in which case it can be freed up to try and alleviate that.
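
(Purely as an illustration of that caching behaviour, not actual driver code - all the names below are made up: a cache that keeps compiled shader blobs resident and only frees the least recently used ones when it is told memory is low.)

```cpp
// Hypothetical sketch: entries stay resident after use and are only evicted
// (oldest first) when a low-memory notification arrives.
#include <cstddef>
#include <list>
#include <string>
#include <unordered_map>
#include <vector>

class ShaderCache {
public:
    // Store a compiled blob; it stays resident even if never used again.
    // (Assumes the key is not already cached.)
    void insert(const std::string& key, std::vector<std::byte> blob) {
        bytesUsed_ += blob.size();
        lru_.push_front(key);
        entries_[key] = {std::move(blob), lru_.begin()};
    }

    // Look up a blob and mark it as recently used.
    const std::vector<std::byte>* find(const std::string& key) {
        auto it = entries_.find(key);
        if (it == entries_.end()) return nullptr;
        lru_.splice(lru_.begin(), lru_, it->second.lruIt);  // move to front
        return &it->second.blob;
    }

    // Only under memory pressure do we free the least recently used entries.
    void onLowMemory(std::size_t targetBytes) {
        while (bytesUsed_ > targetBytes && !lru_.empty()) {
            auto it = entries_.find(lru_.back());
            bytesUsed_ -= it->second.blob.size();
            entries_.erase(it);
            lru_.pop_back();
        }
    }

    std::size_t bytesUsed() const { return bytesUsed_; }

private:
    struct Entry {
        std::vector<std::byte> blob;
        std::list<std::string>::iterator lruIt;
    };
    std::list<std::string> lru_;                      // front = most recently used
    std::unordered_map<std::string, Entry> entries_;
    std::size_t bytesUsed_ = 0;
};
```

Keeping everything resident trades RAM for avoiding recompiles, which fits the higher memory footprint being described.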


But wouldn't that be in VRAM rather than system RAM? And although primitive discard isn't enabled yet, primitive shaders apparently are. I saw one video running a 7700K with a Vega 64 vs a 1080 in The Witcher 3, and the RAM usage difference was around 1 GB, constantly.
 
No, a lot of the new stuff is - AMD didn't even bother to send him the 500 series, he had to buy those himself, whereas nVidia will happily throw GPUs at him. But who can say - it's not as if before every review he states: this I bought, this I was given, this I get to keep, this I have to send back.

Edit for the bit not quoted: I agree, a lot of it doesn't add up - like the claim that Vega makes his Threadripper PC worse. How, exactly?

Vega reviews can still work: this card does this, here is the spreadsheet, here are the FPS figures, the minimums, the maximums, the lows and the frame times; this is what it feels like, this is the info you need, do with it what you want. No price point stays the same - cards get old, new versions come out, AIBs go to war with each other and miners make prices fluctuate.
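
(As a side note, a minimal sketch of what "the mins, the max, the lows and the frame times" boil down to, using made-up frame-time numbers; note that "1% low" has several competing definitions - this one averages the worst 1% of frames.)

```cpp
// Illustrative only: deriving the usual review numbers (average FPS, minimum,
// maximum and "1% low") from a list of per-frame times in milliseconds.
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <functional>
#include <numeric>
#include <vector>

int main() {
    // Hypothetical capture of frame times (ms) from a benchmark run.
    std::vector<double> frameMs = {16.6, 16.9, 17.1, 16.7, 33.4, 16.8, 17.0, 16.5};

    const double totalMs = std::accumulate(frameMs.begin(), frameMs.end(), 0.0);
    const double avgFps  = 1000.0 * frameMs.size() / totalMs;

    // Sort slowest-first so the worst 1% of frames sit at the front.
    std::vector<double> sorted = frameMs;
    std::sort(sorted.begin(), sorted.end(), std::greater<double>());
    const std::size_t worstCount = std::max<std::size_t>(1, sorted.size() / 100);
    const double worstMs = std::accumulate(sorted.begin(),
                                           sorted.begin() + worstCount, 0.0) / worstCount;

    std::printf("avg: %.1f fps, min: %.1f fps, max: %.1f fps, 1%% low: %.1f fps\n",
                avgFps,
                1000.0 / sorted.front(),   // slowest frame -> minimum FPS
                1000.0 / sorted.back(),    // fastest frame -> maximum FPS
                1000.0 / worstMs);         // average of the worst 1% of frames
    return 0;
}
```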

If you had to redo your review every time a price changed, you would only ever review about three cards.

Someone on HardForum confirmed that AMD specifically asked reviewers to exclude 1080 Tis from the results.

I think Jayz realised that's too much BS to put up with just to save a bit of cash, and that he hardly ever publishes on NDA day anyway, so he may as well just buy the cards and not be under any kind of noose from AMD.
 
But wouldn't that be in VRAM rather than system RAM? And although primitive discard isn't enabled yet, primitive shaders apparently are. I saw one video running a 7700K with a Vega 64 vs a 1080 in The Witcher 3, and the RAM usage difference was around 1 GB, constantly.

A lot of the driver "assist" stuff lives in system RAM - shader caching will be split between data resident in VRAM and data in system memory.
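
(Rough sketch of that split, with invented types - the point being that the same cached shader can be counted once against VRAM and once against system RAM, so both pools grow.)

```cpp
// Sketch only (made-up types): the driver keeps a system-RAM backing copy
// alongside the VRAM copy so it can re-upload or patch data without asking
// the application, which is why both memory pools show extra usage.
#include <cstddef>
#include <cstdint>
#include <vector>

struct GpuAllocation {          // stand-in for a real VRAM allocation handle
    std::uint64_t deviceAddress = 0;
    std::size_t   bytes = 0;
};

struct CachedShader {
    GpuAllocation          vramCopy;     // counted against VRAM usage
    std::vector<std::byte> systemCopy;   // counted against system RAM usage
};

std::size_t vramBytes(const std::vector<CachedShader>& cache) {
    std::size_t total = 0;
    for (const auto& s : cache) total += s.vramCopy.bytes;
    return total;
}

std::size_t systemBytes(const std::vector<CachedShader>& cache) {
    std::size_t total = 0;
    for (const auto& s : cache) total += s.systemCopy.size();
    return total;
}
```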
 
Someone on HardForum confirmed that AMD specifically asked reviewers to exclude 1080 Tis from the results.

I think Jayz realised that's too much BS to put up with just to save a bit of cash, and that he hardly ever publishes on NDA day anyway, so he may as well just buy the cards and not be under any kind of noose from AMD.

Well, AMD can ask to exclude whatever they want; that doesn't mean you have to comply, and spreadsheets are fun. It's also a bit hard to swallow that you won't take AMD hardware so you aren't under a noose, yet you'll happily be sponsored by nVidia.

Oh, and let's not forget: he's not saying don't buy Vega at the inflated cost, he's saying do not buy Radeon, and he isn't sure about doing his Threadripper build - but Intel are just as bad. Intel, the company that has still not paid the fines imposed on it by the EU and the US federal courts (separate fines). Let that sink in.
 
A lot of ground to cover - these links will give a general overview, but diving deeper gets really complex.

http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/3

https://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Talks-DX12-DX11-Efficiency-Improvements

https://www.youtube.com/watch?v=nIoZB-cnjc0

EDIT: Stuff like this might have some implications as well - some of it due to bugs, some of it intended behaviour: https://devtalk.nvidia.com/default/...xcessive-amount-of-ram-used-by-glsl-programs/
 
A lot of ground to cover - these links will give a general overview, but diving deeper gets really complex.

http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/3

https://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Talks-DX12-DX11-Efficiency-Improvements

https://www.youtube.com/watch?v=nIoZB-cnjc0

EDIT: Stuff like this might have some implications as well - some of it due to bugs, some of it intended behaviour: https://devtalk.nvidia.com/default/...xcessive-amount-of-ram-used-by-glsl-programs/


Thank you greatly - if I could give this a thumbs-up I would. So it's work scheduling: the nVidia drivers are queueing the work up for the GPU to chomp through, hence the big increase in performance?
 
I was talking in a more general sense, in that nVidia have a lot of this going on in the background while AMD take a different approach, which is partly (though probably not the only) reason for the differences in system RAM and VRAM usage. Because the driver in some cases intercepts, and sometimes even overrides, application/API behaviour, it's likely there is also some duplication of assets so the driver can do its thing without changing where the application expects to find data, etc.
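
(A conceptual sketch of that kind of background work, not real driver code: the "API call" returns immediately on the application's thread and a worker thread drains the queued commands in batches. The queued copies are extra system RAM and the worker is extra CPU load, which is roughly the trade-off being described.)

```cpp
// Made-up illustration of deferred submission: the app thread records work,
// a background thread executes it in batches.
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <utility>

class DeferredSubmitQueue {
public:
    DeferredSubmitQueue() : worker_([this] { drain(); }) {}

    ~DeferredSubmitQueue() {
        {
            std::lock_guard<std::mutex> lock(m_);
            stop_ = true;
        }
        cv_.notify_one();
        worker_.join();
    }

    // Called on the application's thread: returns immediately, the actual
    // work is queued for the worker thread.
    void record(std::function<void()> cmd) {
        {
            std::lock_guard<std::mutex> lock(m_);
            pending_.push(std::move(cmd));
        }
        cv_.notify_one();
    }

private:
    void drain() {
        for (;;) {
            std::queue<std::function<void()>> batch;
            {
                std::unique_lock<std::mutex> lock(m_);
                cv_.wait(lock, [this] { return stop_ || !pending_.empty(); });
                if (stop_ && pending_.empty()) return;
                std::swap(batch, pending_);   // grab everything queued so far
            }
            while (!batch.empty()) {          // execute the batch off-thread
                batch.front()();
                batch.pop();
            }
        }
    }

    std::mutex m_;
    std::condition_variable cv_;
    std::queue<std::function<void()>> pending_;
    bool stop_ = false;
    std::thread worker_;
};
```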
 
Well, AMD can ask to exclude whatever they want; that doesn't mean you have to comply, and spreadsheets are fun. It's also a bit hard to swallow that you won't take AMD hardware so you aren't under a noose, yet you'll happily be sponsored by nVidia.

Oh, and let's not forget: he's not saying don't buy Vega at the inflated cost, he's saying do not buy Radeon, and he isn't sure about doing his Threadripper build - but Intel are just as bad. Intel, the company that has still not paid the fines imposed on it by the EU and the US federal courts (separate fines). Let that sink in.

The evidence points to reviewers following what AMD asked. They are drones serving marketing departments for the most part - and yes, of course this applies to nVidia, Intel etc. as well.

OC3D, it seems, didn't like the idea - to the point that he deleted his post and my reply to it. Guess he likes his free toys too much.

These are the same guys who all managed to miss the performance drop-off on the 970 above 3.5 GB, remember :) Then, after it had been pointed out to them by tons of users, as if by magic they could reproduce it in tests.

If you are a reviewer who is wealthy enough to buy the units you test, and you know it gains you followers and the like, then it should be a no-brainer to shun AMD on this - which I guess is how Jay came to that conclusion, as he is a wealthy guy.
 
Well, that's why I love Wendell from Level1Techs - he knows what he is talking about, understands the tech on a level I never will, yet has the ability to present that info in a way you can understand, and he does the tests, benchmarks etc. in a straightforward way: no bias and no chip on his shoulder, he just gives you the info. And if you remember when the whole "waa waa Ryzen ####" was kicking off at its release, he was one of the few who saw the potential of Ryzen and just gave the info straight.
 
Ah yeah, I like him, but I've not checked his RX Vega reviews and the follow-up situation; I will check to see what stance he has taken.

For reference, I agree with you - anyone who does shun them is likely to have made a calculated decision that it will still earn them revenue in the long run from exposure, extra content etc. It's all about #1 these days.

But make no mistake: the guides may not seem compulsory, but just as easily as a reviewer can ignore a guide, a marketing department can blacklist a reviewer. Ask TotalBiscuit about that with regard to his game reviews and his relationship with publishers.
 
These are the same guys who all managed to miss the performance drop-off on the 970 above 3.5 GB, remember.

You need to do a lot of testing to even get any idea there was a problem - what happens when you hit the slower VRAM or page out to other storage varies a lot from scenario to scenario. IIRC it only came to light because someone was testing for compute purposes.

For instance, when playing around with my old 780 GHz (3 GB) and a 970, using a modified Skyrim client + ENB I could push to nearly 5 GB of VRAM used before performance dropped off, while with other games I tested, in some cases they fell apart the moment you went over the VRAM limit or hit the slow VRAM.
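
(For what it's worth, the usual way to find that kind of cliff is a sweep: keep the amount of work fixed, grow the working set step by step and time each step. The sketch below uses ordinary system memory as a stand-in, since a real VRAM test would go through a graphics or compute API, but the shape of the method is the same - throughput stays flat until the set no longer fits the fast pool, then drops sharply.)

```cpp
// Conceptual sweep harness: fixed amount of random-access work per step,
// growing working set. Plain system memory stands in for the GPU case.
#include <chrono>
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <random>
#include <vector>

int main() {
    std::mt19937_64 rng(42);
    constexpr std::size_t kAccesses = 1'000'000;     // same work at every step

    for (std::size_t mib = 64; mib <= 1024; mib *= 2) {
        std::vector<std::uint8_t> buffer(mib * 1024 * 1024, 1);
        std::uniform_int_distribution<std::size_t> pick(0, buffer.size() - 1);

        std::uint64_t sum = 0;
        const auto start = std::chrono::steady_clock::now();
        for (std::size_t i = 0; i < kAccesses; ++i)
            sum += buffer[pick(rng)];                 // random touches across the set
        const auto end = std::chrono::steady_clock::now();

        const double ms = std::chrono::duration<double, std::milli>(end - start).count();
        std::printf("%5zu MiB: %8.2f ms (checksum %llu)\n",
                    mib, ms, static_cast<unsigned long long>(sum));
    }
    return 0;
}
```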
 
Someone on HardForum confirmed that AMD specifically asked reviewers to exclude 1080 Tis from the results.

I think Jayz realised that's too much BS to put up with just to save a bit of cash, and that he hardly ever publishes on NDA day anyway, so he may as well just buy the cards and not be under any kind of noose from AMD.

The Hexus review is a good example of this. We discussed it here. For the Titan Xp review they benched three different 1080 Tis, while for the Vega review they used one 1080 Ti (EVGA SC2, if I recall correctly), which was 10-15% slower than the other 1080 Tis in the previous review, even though the rest of the system was exactly the same: 6700K at 4.4 GHz, same RAM etc. Even the Fury X was 4% slower in the Vega review compared to all previous reviews.


------
Regarding the higher CPU, RAM & VRAM usage of the Nvidia cards: I posted about it here a few weeks ago, when someone benched numerous games.
The single-core usage with the NV cards (1080 and 1080 Ti) was always hitting 90%, with on average 10% higher overall CPU usage compared to the Vega FE. So I am not surprised if higher-clocked CPUs (4.9-5.4 GHz) show so much greater FPS performance compared to 4 GHz ones when NV cards are used.
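
(Back-of-the-envelope arithmetic on that, my numbers rather than anything from the benchmarks above: if one game thread is pinned near 90% on a 4.0 GHz core, then 4.9 GHz gives about 4.9 / 4.0 ≈ 1.22, i.e. roughly 22% more single-thread throughput, so a fully CPU-limited scene could gain on the order of 20% FPS, while a GPU-limited one would gain next to nothing.)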
 
The Hexus review is a good example of this. We discussed it here. For the Titan Xp review they benched three different 1080 Tis, while for the Vega review they used one 1080 Ti (EVGA SC2, if I recall correctly), which was 10-15% slower than the other 1080 Tis in the previous review, even though the rest of the system was exactly the same: 6700K at 4.4 GHz, same RAM etc. Even the Fury X was 4% slower in the Vega review compared to all previous reviews.


------
Regarding the higher CPU, RAM & VRAM usage of the Nvidia cards: I posted about it here a few weeks ago, when someone benched numerous games.
The single-core usage with the NV cards (1080 and 1080 Ti) was always hitting 90%, with on average 10% higher overall CPU usage compared to the Vega FE. So I am not surprised if higher-clocked CPUs (4.9-5.4 GHz) show so much greater FPS performance compared to 4 GHz ones when NV cards are used.

Yeah, it's very interesting - the more I have looked into it, the more I have found about what isn't even enabled on Vega: the new rasterisation path, primitive discard, HBCC. Makes me wonder what the AMD driver team is doing.
 
The Hexus review is a good example of this. We discussed it here. For the Titan Xp review they benched three different 1080 Tis, while for the Vega review they used one 1080 Ti (EVGA SC2, if I recall correctly), which was 10-15% slower than the other 1080 Tis in the previous review, even though the rest of the system was exactly the same: 6700K at 4.4 GHz, same RAM etc. Even the Fury X was 4% slower in the Vega review compared to all previous reviews.


------
Regarding the higher CPU, RAM & VRAM usage of the Nvidia cards: I posted about it here a few weeks ago, when someone benched numerous games.
The single-core usage with the NV cards (1080 and 1080 Ti) was always hitting 90%, with on average 10% higher overall CPU usage compared to the Vega FE. So I am not surprised if higher-clocked CPUs (4.9-5.4 GHz) show so much greater FPS performance compared to 4 GHz ones when NV cards are used.
I've noticed this too: in the Vega 64 and 56 review they didn't include the Gigabyte overclocked 980 Ti, which would have been interesting, as I still say a well-overclocked 980 Ti (1450-1500 MHz) can match or exceed Vega 56 in some instances whilst nipping at Vega 64's heels.
 
I've noticed this too: in the Vega 64 and 56 review they didn't include the Gigabyte overclocked 980 Ti, which would have been interesting, as I still say a well-overclocked 980 Ti (1450-1500 MHz) can match or exceed Vega 56 in some instances whilst nipping at Vega 64's heels.


Yeah, but that raises more questions than answers, because when you put a Fury X in the mix it's up there with Vega in some benchmarks, which goes back to my previous post: there is something very wrong with the drivers, with features not even being enabled on Vega, and the driver team need to pull their finger out.
 