• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Poll: The Vega Review Thread.

What do we think about Vega?

  • What has AMD been doing for the past 1-2 years?

  • It consumes how many watts and is how loud!!!

  • It is not that bad.

  • Want to buy but put off by pricing and warranty.

  • I will be buying one for sure (I own a Freesync monitor so have little choice).

  • Better red than dead.


Around $350 for 1080 Ti? What you smoking bro? The bias is strong with this one :p

Yeah, I am not kidding. I can't show you the competitor link, but I think I can show you a cropped picture of EVGA GTX 1080 Ti Founders Edition cards from a distributor: $350 each for a minimum of 3 pieces, and it's up to sellers and retailers to add on VAT, import charges, margins and markup.

 
The tl;dr on this is as follows.

Vega does not win any Nvidia customers over to AMD.

But AMD customers are thinking 'if it wasn't for being trapped by this Freesync monitor I'd seriously consider buying Nvidia.'

That's the situation in a nutshell.
 
The tl;dr on this is as follows.

Vega does not win any Nvidia customers over to AMD.

But AMD customers are thinking 'if it wasn't for being trapped by this Freesync monitor I'd seriously consider buying Nvidia.'

That's the situation in a nutshell.

Overgeneralise much? Your opinion does not = fact. I am happy with my Vega and not once have I felt "I'm trapped with Freesync".
 
The tl;dr on this is as follows.

Vega does not win any Nvidia customers over to AMD.

But AMD customers are thinking 'if it wasn't for being trapped by this Freesync monitor I'd seriously consider buying Nvidia.'

That's the situation in a nutshell.
Vega isn't bad by any means and when AIB cards start to appear and prices settle, things will look much better for Vega. A bit of a farcical launch indeed but Vega is a decent option for those who are interested :)
 
What I don't really understand is that, specs-wise, Vega 64 kills the 1080. Why do the raw performance numbers (12,665 GFLOPS vs 8,228 GFLOPS) not translate into better FPS? Is it all down to a lack of optimisation?
 
What I don't really understand is that, specs-wise, Vega 64 kills the 1080. Why do the raw performance numbers (12,665 GFLOPS vs 8,228 GFLOPS) not translate into better FPS? Is it all down to a lack of optimisation?
There are a lot of reasons why that's the case, but the main ones I can think of are draw calls/CPU overhead and shader utilisation, plus the geometry engine and the pixel engine. AMD have relied on hardware scheduling to handle draw calls for many years, which means that under DX11 draw call commands can be split over multiple CPU threads, but only if the developer has enabled it in the game code. Most games don't do this, so what happens is the main CPU thread gets clogged up with draw call commands and the GPU ends up waiting for the CPU to finish processing the command list. Nvidia got around this with Kepler (GTX 670/GTX 680) by using a hybrid software/hardware scheduler that breaks commands down across multiple CPU threads in the driver and frees up the main thread to handle other work. Nvidia's approach isn't ideal, as it requires a lot of driver updates and per-game support to work, and it's actually slower if the main CPU thread isn't being fully utilised.
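To make the 'enabled it in the game code' bit concrete, here's a rough DX11 sketch of recording draw calls on worker threads with deferred contexts and replaying them on the immediate context. This is just my own illustration (error handling omitted, DrawSceneChunk is a made-up placeholder for the engine's actual rendering code), not anything out of a real engine:

```cpp
// Sketch: multi-threaded DX11 draw-call recording via deferred contexts.
#include <d3d11.h>
#include <thread>
#include <vector>

// Hypothetical engine function that records one chunk of the scene's draw calls.
void DrawSceneChunk(ID3D11DeviceContext* ctx, int chunk);

void RecordChunk(ID3D11Device* device, ID3D11CommandList** outList, int chunk)
{
    ID3D11DeviceContext* deferredCtx = nullptr;
    device->CreateDeferredContext(0, &deferredCtx);    // one deferred context per worker thread

    DrawSceneChunk(deferredCtx, chunk);                // draw calls recorded here, not executed

    deferredCtx->FinishCommandList(FALSE, outList);    // bake the recorded calls into a command list
    deferredCtx->Release();
}

void SubmitFrame(ID3D11Device* device, ID3D11DeviceContext* immediateCtx, int numThreads)
{
    std::vector<ID3D11CommandList*> lists(numThreads, nullptr);
    std::vector<std::thread> workers;

    for (int i = 0; i < numThreads; ++i)               // record chunks in parallel
        workers.emplace_back(RecordChunk, device, &lists[i], i);
    for (auto& w : workers) w.join();

    for (ID3D11CommandList* list : lists) {            // playback stays on the immediate context
        immediateCtx->ExecuteCommandList(list, FALSE);
        list->Release();
    }
}
```

The catch, as above, is that this only helps if the game actually does it and the driver supports it properly; otherwise everything funnels back down one thread.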

Shader utilisation is a harder one to quantify. Some of it could be a result of the above, but effectively it means not all of the GPU's cores are being used to draw the frame, so they go to waste. That's where async shaders come in (a technology requested by Sony, of all companies): where there are gaps in the command queue, async shaders effectively fill those gaps so you don't waste GPU power. Where games have been enabled to use this tech you can see the difference when you compare cards like the 390 and 390X. Before, they would be almost on par with each other (when clocked at similar speeds) because the extra cores in the 390X weren't being used; with async compute the 390X can show what it can really do, with all of its cores in play.
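If it helps to picture what 'filling in the gaps' means at the API level, here's a minimal D3D12-style sketch of standing up a separate compute-only queue next to the graphics queue (my own illustration, error handling omitted); work submitted to the compute queue is what can be slotted into shader-core idle time:

```cpp
// Sketch: separate graphics and compute queues, the basic setup behind async compute.
#include <windows.h>
#include <d3d12.h>

void CreateQueues(ID3D12Device* device,
                  ID3D12CommandQueue** graphicsQueue,
                  ID3D12CommandQueue** computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;      // graphics + compute + copy work
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(graphicsQueue));

    D3D12_COMMAND_QUEUE_DESC compDesc = {};
    compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;    // compute-only queue
    device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(computeQueue));

    // Compute command lists submitted to computeQueue can run concurrently with
    // the graphics queue; that is how otherwise idle shader cores get fed.
}
```

On hardware that can't genuinely overlap the two queues the compute work just gets serialised behind the graphics work, which is why the gains show up on GCN rather than on older GeForce cards.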

Those are the big ones AFAIK, and the only way they can be fixed is with better support from developers. The RX 480/Polaris has a better hardware scheduler than earlier GCN cards, but it's not perfect (it can barely keep its nose in front of a GTX 1060, which is a smaller die with a lower TDP). There are other shortcomings too: geometry shading has proven to be weak on GCN, which has supposedly been addressed on Vega, and Nvidia have had the pixel engine technology that AMD are touting for RX Vega for a lot longer. The other issue with Vega is that apparently a lot of the new technology (HBCC, DSBR, primitive shaders etc.) is either not working fully or hasn't been enabled in the driver, so AMD have potentially left quite a bit of performance on the engineers' bench.

You also have to appreciate that Vega was built for the data centre first, plus this is the biggest update since GCN 1.0, so software support is going to lag behind the new hardware, and that's something we just have to accept (if AMD had more cash, maybe this would be different). But you're right to point out the TFLOPS disparity: on paper RX Vega should cream the 1080 and give the 1080 Ti a split lip.
 
I feel trapped by FreeSync, but the optimist in me thinks AMD will turn this around.

I felt the same about the first part of your statement, but not the second part :P

So I've sold my ultrawide FreeSync monitor after just 4 months (I bought it in anticipation of an upgrade to Vega). I hated using the FreeSync monitor for gaming on my existing Nvidia GPU, so now I'm back on my old 1080p G-Sync monitor, which is far more enjoyable due to the lack of stuttering and tearing.

I've given up on AMD GPUs for a while. I took a gamble buying a FreeSync screen and it backfired. I'm going to sit on what I've got for the time being.
 
I've given up on AMD GPUs for a while. I took a gamble buying a FreeSync screen and it backfired. I'm going to sit on what I've got for the time being.
Two of the reasons I went for FreeSync were that it was nearly free (in terms of the cost of the monitor compared to similar non-FreeSync monitors) and that I've never spent over £200 on a GPU anyway. As long as AMD is competitive in the mid-range, it's not a huge problem for me. Yes, I'd have liked to get a Vega 56, but right now an RX 480 is fortunately sufficient for the games I play on my monitor.
 
What I don't really understand is that, specs-wise, Vega 64 kills the 1080. Why do the raw performance numbers (12,665 GFLOPS vs 8,228 GFLOPS) not translate into better FPS? Is it all down to a lack of optimisation?

Actual GFLOPS figures reported for these GPUs are a very grey area. For example, the figure you quote for the 1080 is at its base clock of 1607 MHz, while the figure for Vega 64 is at its boost clock of 1546 MHz. In reality the GTX 1080 will boost to 1800+ MHz on the core, which is about 9.5 TFLOPS, while a reference Vega 64 will settle at around 1401 MHz on the core, which is about 11.5 TFLOPS. Obviously mileage may vary on different models with different coolers etc. So on paper Vega kills the GTX 1080, but in reality it's closer, though the comparison still favours Nvidia in terms of performance per FLOP.
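If anyone wants to sanity-check those numbers, the paper figure is just shader count x 2 FLOPs per clock (FMA) x clock speed. A quick sketch below; the 1850 MHz and 1401 MHz values are the rough real-world clocks discussed above, not official specs:

```cpp
// Back-of-the-envelope FP32 throughput: shaders * 2 FLOPs/clock * clock (MHz) / 1e6 = TFLOPS.
#include <cstdio>

double tflops(int shaders, double clockMHz)
{
    return shaders * 2.0 * clockMHz / 1e6;  // 2 FLOPs per shader per clock via FMA
}

int main()
{
    std::printf("GTX 1080 @ 1607 MHz (base spec):  %.2f TFLOPS\n", tflops(2560, 1607)); // ~8.2
    std::printf("GTX 1080 @ 1850 MHz (real boost): %.2f TFLOPS\n", tflops(2560, 1850)); // ~9.5
    std::printf("Vega 64  @ 1546 MHz (boost spec): %.2f TFLOPS\n", tflops(4096, 1546)); // ~12.7
    std::printf("Vega 64  @ 1401 MHz (sustained):  %.2f TFLOPS\n", tflops(4096, 1401)); // ~11.5
    return 0;
}
```

Which matches the 8,228 and 12,665 GFLOPS spec-sheet figures quoted above.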
 
So what was the verdict on the HBCC feature? It was massively hyped and sounds great (in theory) but does it work in games yet?

It won't do much in the titles we have right now...

I read something very cool the other day and it makes perfect sense!

Games right now are still built with 1080p assets, which is why file sizes are still under 100GB... future titles will start being built with 4K in mind, using 4K assets that are much bigger than 1080p ones, and those titles will be very VRAM-demanding. It's here that HBCC will help massively for Vega and future titles!!

When you run a game at 4K right now, you're not really playing a true 4K title; all you're doing is upscaling the image and graphics.

But sadly this is all we have right now! Far Cry 5 is supposed to be using HBCC; I'm not sure what they mean by that!
 