• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Poll: The Vega Review Thread.

What do we think about Vega?

  • What has AMD been doing for the past 1-2 years?

  • It consumes how many watts and is how loud!!!

  • It is not that bad.

  • Want to buy but put off by pricing and warranty.

  • I will be buying one for sure (I own a Freesync monitor so have little choice).

  • Better red than dead.


Results are only viewable after voting.
Part of the reason that never happened was the API. The PS4 and Xbox had their own unique APIs which developers had to port their code to, while the PC was stuck on the bloated pig that is DX11. Now that Scorpio will be using DX12 as its API, PC ports should be a lot more consistent, which will benefit GCN hardware. It will still take a few years, but in time I would fully expect AMD's cards to be consistently ahead, or around the same level, on a watt-for-watt basis.

Pretty sure the Xbox One API is what Microsoft offered up for Windows.
 
Pretty sure the Xbox One API is what Microsoft offered up for Windows.

I think the original XBONE API was a custom version of DX11. I very much doubt the XBONE was using the same DX11 that was being used on the PC Windows platform; it must have had some custom low-level extensions so developers could access the hardware and get the most out of its modest GPU.

Given DX12 is a wafer-thin layer anyway, I wonder just how much it differs from the PC version? It could be an interesting few years ahead, which could see AMD finally being able to show off what its GCN hardware is capable of and lay to rest the ghost of DX11.
 
I think the original XBONE API was a custom version of DX11. I very much doubt the XBONE was using the same DX11 that was being used on the PC Windows platform; it must have had some custom low-level extensions so developers could access the hardware and get the most out of its modest GPU.

Given DX12 is a wafer-thin layer anyway, I wonder just how much it differs from the PC version? It could be an interesting few years ahead, which could see AMD finally being able to show off what its GCN hardware is capable of and lay to rest the ghost of DX11.

I think the differences will come down to the hardware config. Yeah, DX11 needs to go away, and the faster the better.
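
For anyone wondering what that "wafer-thin layer" means from the programmer's side, here's a rough C++ sketch (untested, purely illustrative) of the bookkeeping D3D12 hands to the application - queue creation, command list recording, submission and fencing - all things the DX11 driver used to handle behind the scenes:

    // Minimal D3D12 submission skeleton (untested sketch; link d3d12.lib).
    #include <windows.h>
    #include <d3d12.h>
    #include <wrl/client.h>
    using Microsoft::WRL::ComPtr;

    int main() {
        // Create the device; in DX11 this plus an immediate context was
        // roughly all the setup an app did.
        ComPtr<ID3D12Device> device;
        D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

        // In DX12 you create and own the submission queue yourself.
        D3D12_COMMAND_QUEUE_DESC queueDesc = {};
        queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
        ComPtr<ID3D12CommandQueue> queue;
        device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

        // You allocate and record command lists yourself...
        ComPtr<ID3D12CommandAllocator> alloc;
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&alloc));
        ComPtr<ID3D12GraphicsCommandList> cmdList;
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT, alloc.Get(),
                                  nullptr, IID_PPV_ARGS(&cmdList));
        cmdList->Close(); // empty list - a real app records draws here

        // ...submit them yourself...
        ID3D12CommandList* lists[] = { cmdList.Get() };
        queue->ExecuteCommandLists(1, lists);

        // ...and synchronise CPU and GPU yourself with a fence.
        ComPtr<ID3D12Fence> fence;
        device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
        HANDLE done = CreateEvent(nullptr, FALSE, FALSE, nullptr);
        queue->Signal(fence.Get(), 1);
        fence->SetEventOnCompletion(1, done);
        WaitForSingleObject(done, INFINITE);
        CloseHandle(done);
        return 0;
    }

The console flavour presumably differs less in this surface API and more in what it can assume about the fixed hardware underneath.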
 
I imagine we'll see some console games using it on PC, but we'll also see Nvidia get involved with some of the big-name releases where it'll get stripped out, just like what happened with Rise of the Tomb Raider's async compute.
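
(For reference, the "async compute" that got stripped there is nothing exotic at the API level. In D3D12 it amounts to feeding the GPU from a second, compute-type queue alongside the graphics queue. A rough, untested sketch - MakeAsyncComputeQueue is just an illustrative name:)

    #include <d3d12.h>
    #include <wrl/client.h>
    using Microsoft::WRL::ComPtr;

    // "Async compute" = a second queue of type COMPUTE that the GPU can
    // service alongside the graphics (DIRECT) queue. Ports that strip
    // async compute typically submit the same work on the graphics queue
    // instead and never create a queue like this.
    ComPtr<ID3D12CommandQueue> MakeAsyncComputeQueue(ID3D12Device* device) {
        D3D12_COMMAND_QUEUE_DESC desc = {};
        desc.Type     = D3D12_COMMAND_LIST_TYPE_COMPUTE; // compute-only engine
        desc.Priority = D3D12_COMMAND_QUEUE_PRIORITY_NORMAL;
        ComPtr<ID3D12CommandQueue> queue;
        device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
        return queue;
    }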

Nvidia need to stop doing that, or the consoles will dominate the gaming market and the PC will be stale wine.
 
They don't do it a lot, but they do it strategically so they can maximise the publicity, showing how they perform better than the competition, like with ROTTR.

It just shows weakness. If you're a high-technology company and you have to disable tech to look good, then people should be getting sacked. People shouldn't buy the games if that's the case.

What games have Nvidia done this in?
 
I'm constantly looking at Vega listings on websites waiting for some sort of price change.
I need to buy a graphics card, like, now.

There's plenty of listings with stock. I wonder how many Vega 56 and 64 cards are currently selling.
 
I'm constantly looking at Vega listings on websites waiting for some sort of price change.
I need to buy a graphics card, like, now.

There's plenty of listings with stock. I wonder how many Vega 56 and 64 cards are currently selling.


I'm in a similar position, waiting on the AIB models.
I've got a Ryzen M-ATX build that my MSI 1080 Armor GPU can't fit in, so I need a longer, thinner three-fan Vega card like a Tri-X or Strix to replace it.
 
Part of the reason that never happened was the API. The PS4 and Xbox had their own unique APIs which developers had to port their code to, while the PC was stuck on the bloated pig that is DX11. Now that Scorpio will be using DX12 as its API, PC ports should be a lot more consistent, which will benefit GCN hardware. It will still take a few years, but in time I would fully expect AMD's cards to be consistently ahead, or around the same level, on a watt-for-watt basis.

One big difference is memory management - consoles typically have a very different setup to PCs, and with low-level APIs you pretty much have to do a ton of your own memory management, which means "straight ports" from console to PC aren't so simple.
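
To put some flesh on "a ton of your own memory management": in D3D12 an application typically reserves a big heap up front and places resources into it at offsets it tracks itself, where DX11 hid all of that inside the driver. A rough, untested C++ sketch (SubAllocateBuffers is just an illustrative name):

    #include <d3d12.h>
    #include <wrl/client.h>
    using Microsoft::WRL::ComPtr;

    void SubAllocateBuffers(ID3D12Device* device) {
        // Reserve one 64 MB arena of GPU-local memory up front.
        D3D12_HEAP_DESC heapDesc = {};
        heapDesc.SizeInBytes = 64 * 1024 * 1024;
        heapDesc.Properties.Type = D3D12_HEAP_TYPE_DEFAULT;
        heapDesc.Flags = D3D12_HEAP_FLAG_ALLOW_ONLY_BUFFERS;
        ComPtr<ID3D12Heap> heap;
        device->CreateHeap(&heapDesc, IID_PPV_ARGS(&heap));

        // Describe a plain 1 MB buffer.
        D3D12_RESOURCE_DESC bufDesc = {};
        bufDesc.Dimension = D3D12_RESOURCE_DIMENSION_BUFFER;
        bufDesc.Width = 1024 * 1024;
        bufDesc.Height = 1;
        bufDesc.DepthOrArraySize = 1;
        bufDesc.MipLevels = 1;
        bufDesc.Format = DXGI_FORMAT_UNKNOWN;
        bufDesc.SampleDesc.Count = 1;
        bufDesc.Layout = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;

        // Place two buffers at offsets the app chooses (64 KB aligned).
        // In DX11 each of these was one CreateBuffer call and the driver
        // decided where everything lived.
        ComPtr<ID3D12Resource> bufA, bufB;
        device->CreatePlacedResource(heap.Get(), 0, &bufDesc,
            D3D12_RESOURCE_STATE_COMMON, nullptr, IID_PPV_ARGS(&bufA));
        device->CreatePlacedResource(heap.Get(), 1024 * 1024, &bufDesc,
            D3D12_RESOURCE_STATE_COMMON, nullptr, IID_PPV_ARGS(&bufB));
    }

On a console the memory map is fixed and known in advance, so a straight port has to rebuild this kind of allocator around whatever the PC happens to have.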
 
As the developers won't bother with something that only a few people can make use of on the PC.
Is this in reference to AMD hardware? Which both the Xbox One and Xbox One X have - not to mention the PS4/Pro, though that's a different API. How many devs have to optimise their games for DX12 on consoles using AMD hardware? Doing the same on PC is hardly a big leap from that, is it? It's more of a leap doing it for Nvidia, though, as we all know Nvidia hardware isn't as DX12-friendly as AMD's. So if devs are building games around consoles, which they tend to do, and then porting over to PC, whose hardware are devs most likely to spend time on? AMD of course, not Nvidia, which would require more work. Probably the case with Turn 10 and Forza 7.
 
RIP 1070. No wonder the 1070 Ti is coming - Vega 56 is ripping it apart.
In fairness, Vega 64 is 13 months newer than the 1070 and about £120 more expensive.
If you compare the 1070 to a card that it's 13 months newer than - the 390X, for example - the 1070 is probably ripping that apart too.
Purely from a price point of view, the 1080 is probably closer to the Vega 56.

1070 vs Vega 56 is apples and oranges.
 