AMD VEGA confirmed for 2017 H1

Status
Not open for further replies.
I will speculate that Dawn of War 3, Prey and Dirt 4 will make mighty good benchmarking tools for Vega. Once you add in Hitman, Tomb Raider, Doom and AOTS you pretty much have a full suite of DX12/Vulkan titles to run a review against.
There's already a whole bunch of DX12 games.

You mean DX12 games that notably favor AMD, I'm guessing?

People would freak the hell out if people were asking for a full panel of Nvidia-biased games for reviews.
 
There are no truly built-from-the-ground-up DX12 games at the moment though, are there? I assume they're all really DX11 ports that have been adapted, since most game engines haven't moved fully onto DX12 yet?

I'm asking as I haven't followed the API updates for the last 9 months.

So isn't Besty suggesting those games noted are going DX12-only? Although I thought DoW3 was DX11 & DX12 as well?
 
Some of the Microsoft 1st party titles are DX12-only.

But yeah, most everything else is going to have DX11/OpenGL base paths for a while still. I don't think many devs are completely confident yet in DX12 matching or beating DX11, especially for Nvidia users, who make up the majority of the PC gaming base.
 
Yeah, I know people say that Gears, for instance, is DX12-only, but really it's still built on DX11 with specific code added to stop it running on anything other than DX12, which isn't the same thing.

So I don't think it's correct to suggest we have loads of DX12 titles for either AMD or Nvidia, and even the games suggested above aren't really going to show much beyond, hopefully, devs learning how to optimise a DX11 port to DX12 better.
 
I don't think I've heard anything about Gears being built on a DX11 base. It might have been, but as it started life as an XB1 game, I'd say that low-level coding was always part of the process, meaning it would have far more in common with DX12 than DX11.

As for whether we have loads of DX12 titles, DX12 is DX12. We can only go by what we have, but yes, some of it will come down to how well the devs translate a DX11 path to DX12. It's not the only factor, though. Every game is different, will have different bottlenecks and challenges, and some devs are genuinely bringing in positive enhancements that aren't explained simply by weaker DX11/OpenGL drivers.

We're in a fairly 'hybrid' state right now, kinda like the transition from DX9 into DX10/DX11, but with DX12 being a much more comprehensive and fundamental change. It's worth reviewing the differences because it'll probably be like this for a while. But I also think filling reviews with tons of DX12 benches that favor AMD cards is not a very 'balanced' way to do a GPU review.
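On the 'balanced suite' point: one common way reviewers limit the pull of outlier titles (whether AMD- or Nvidia-leaning) is to aggregate per-game results with a geometric mean of relative performance rather than a plain average of FPS. A minimal sketch in Python, with made-up numbers purely for illustration:

```python
import math

def geomean_relative(fps_a, fps_b):
    """Geometric mean of per-game FPS ratios (card A vs card B).

    Unlike an arithmetic mean of raw FPS, one title where a card
    dominates can't drag the overall score around nearly as much.
    """
    ratios = [a / b for a, b in zip(fps_a, fps_b)]
    return math.exp(sum(math.log(r) for r in ratios) / len(ratios))

# Hypothetical figures, not real benchmark results:
card_a = [60, 90, 140, 45]   # e.g. a DX12-heavy suite
card_b = [55, 80, 100, 50]
print(f"Card A is {geomean_relative(card_a, card_b):.2f}x card B overall")
```

It doesn't fix a biased game selection, but it does stop a single 40% outlier from defining the headline number.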
 
Only been skimming this thread; where are the leaked 580 benchmarks? Would love to see them. If the 580 is indeed as fast as the 1070, it could bode well for full Vega being a bit of a good card.
Yeah, me too.

Will have to see if I can track them down. I believe they're registered in the Ashes of the Singularity benchmark database; it seems three entries were registered over a month ago, one of them by an AMD employee. These were clearly under 1080 levels but sitting close to the 1070.

https://videocardz.com/66253/amd-radeon-rx-580-ashes-of-the-singularity-results-leaks-out
 
+1

Unless the 580 is small Vega or something. If it is a clock-bumped Polaris as rumoured, it won't come close to the 1070.

The RX 580 is Polaris 20 (20, not 10), not Vega. It's not only higher clocks but also a different manufacturing process (LPP) compared to Polaris 10, and it will have faster VRAM.
 
I bought my 670 over the 7970 mainly because of the BF3 performance; it was faster than the 7970, never mind the 7950, in most games in July 2012 tbh.

The GHz Edition launched in June was widely regarded in reviews against the 680 as the faster GPU, and the 7950's overclocking by 50% was legendary.

The 670 was a smashing card too; I had an AMP Extreme, but it ran out of VRAM while it still had plenty of power on the core when big textures were introduced further down the line (but still in the lifetime of the card), whereas the 7950 was still breezing through them.
 
With that info I would guess it is likely to be closer to a 390X than a 1070. At best, in between.
 
Only issue with an April launch for Vega is, where does this leave the 480/500 series refresh?

Same time? Something like a 570, 580, 590 launch, with the 570 & 580 being the 480 and a faster refresh of it, and the 590 being Vega?

Dunno... something doesn't make sense here.

To differentiate Vega they'd have to call it RX Vega or something.

It will be the RX Vega. It won't be a 590, as that would place it alongside cards which will be half as fast; Vega should be faster than the 1080. We should run a poll on that; it seems an easy guess to me, though if the drivers were no good it could be bodged up.
 
I was just thinking, with the introduction of Intel Optane, combining it together with Vega's HBCC may end up being a good combo in the future :)
 
No, proper DX12 games. Ones without a DX11 base.
I don't think I've heard anything about Gears being built on a DX11 base. It might have been, but as it started life as an XB1 game, I'd say that low-level coding was always part of the process, meaning it would have far more in common with DX12 than DX11.


But yeah, most everything else is going to have DX11/OpenGL base paths for a while still. I don't think many devs are completely confident yet in DX12 matching or beating DX11, especially for Nvidia users, who make up the majority of the PC gaming base.

Not only confidence, but backwards compatibility. Windows 7 still holds 50% market share; that's a whole lot of potential gamers who can't use DX12. MS's efforts to get people onto 10 haven't been stellar either. So DX12 is hitting a bit of a snag, to say the least.

At least Vulkan can run everywhere, but moving to it after years of DirectX development is even more effort, sadly.


Depends on the Gears game. Gears of War Ultimate used the 2006 Unreal Engine source code, with some DX12 features added on.

http://www.eurogamer.net/articles/digitalfoundry-2015-vs-gears-of-war-ultimate-edition

It starts with the game's beating heart - the game engine. While Gears of War 4 is in development using Unreal Engine 4, Gears Ultimate instead opts for more familiar ground - the original 2006 source code.
 
I was just thinking, with the introduction of Intel Optane, combining it together with Vega's HBCC may end up being a good combo in the future :)

Optane only works on Kaby Lake and newer; plus, Vega's HBC memory-sharing function seems far more geared towards enterprise and professional workloads than gaming, really.

System RAM, SSDs and the rest are all significantly slower than onboard VRAM, whether that's GDDR5/X or HBM2.
I can't see that helping in gaming much, unless you can only use system RAM, and even then it'll just mean people need to fork out more for high-speed RAM.
 
There's already a whole bunch of DX12 games.

You mean DX12 games that notably favor AMD, I'm guessing?

People would freak the hell out if people were asking for a full panel of Nvidia-biased games for reviews.

I would sooner frame it as 'AMD collaborations' pushing the envelope forward, which have resulted in a new set of really well-optimised games, rather than Nvidia's token DX12 efforts, which have been rather poor.

I have to hand it to AMD: their collaborations with software developers have been, generally speaking, much more successful than the Green team's, but AMD have to demonstrate that the raw compute power in Vega can actually be brought to bear through software.

If the card is released and the software base is not right then it's a lame-duck launch; given the implied cost of the Vega card, this would be more costly in the long run than getting it right first time.

And let us not forget, even putting these 3 games aside, potentially the biggest PC 'release' of the year will be Star Citizen Beta 3.0, and it will run on Vulkan. The assumption there being that it will also run great on Vega.

Bright times are almost here for AMD fans, but like I said yesterday, I expect the Vega hardware was ready to go last month and it's now just a waiting game to get all the software ducks lined up in a row. Launching now and using Doom as a poster child won't hold enough water.
 
The thing is, it's easy to see why AMD would put so much effort into DX12/Mantle/Vulkan. Disregarding Ryzen as it's new (but still multi-threaded and would benefit), their FX CPUs had poor IPC but could run more threads in parallel, so it was in AMD's interest to push the gaming ecosystem in this direction.

Whereas Nvidia didn't have any CPUs of their own to think about, other than maybe performance Intels, which generally speaking are 4-core/8-thread i7s, so they focused on their DX11 path, which is why they weren't really so bothered about DX12 and Vulkan; it wasn't in their interest.

However, I think Nvidia realise now that DX12 and Vulkan, at least superficially, are what the consumer wants, which is why they released their 'DX12' driver when they launched the 1080 Ti.
 
Optane only works on Kaby Lake and newer; plus, Vega's HBC memory-sharing function seems far more geared towards enterprise and professional workloads than gaming, really.

System RAM, SSDs and the rest are all significantly slower than onboard VRAM, whether that's GDDR5/X or HBM2.
I can't see that helping in gaming much, unless you can only use system RAM, and even then it'll just mean people need to fork out more for high-speed RAM.
Well, to be fair, I did say in the future, not the near future. Good point about it being Kaby Lake-only at the moment, but I am sure it will be supported on whichever motherboard I upgrade to in a year or so's time :)

I agree it does seem more geared towards enterprise and professional workloads, but hopefully there will be some benefit to gamers also. Otherwise it would seem the only reason they are mentioning it is to have another bullet point, rather than because it will have any benefit at all.

I understand that GDDR5/X or HBM2 is much faster; I think most people browsing this thread probably do (or should) :p The whole point of HBM2 and HBCC is that HBM2 is fast enough to swap things in and out. The question is: is it fast enough, and will it benefit all games, some games, or only ones coded to make use of it? We will see. Here is an interesting thought: look at what Nvidia did with the 970, with 3.5GB of fast VRAM and 0.5GB of slow. With HBCC it could be like that in a sense, where you would have 8GB of fast HBM2 and another 16GB or whatever from DRAM, or from Optane in the future.

Just some random thoughts above, really. But I want to be clear: I am not arguing that HBCC is better than just having more HBM2 in the first place, as I get the feeling people will assume this. It is an interesting new tech that may help provide extra performance more cheaply, along with the ability to use a lot more than 16GB of storage for the graphics card now, rather than in the future.
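To make the 970 analogy concrete, here is a toy model of a two-tier memory setup: a small fast pool (think HBM2) backed by a larger slow pool (system RAM, or Optane down the line), with least-recently-used pages spilled to the slow tier. This is purely my own illustration of the general paging idea, not AMD's actual HBCC design; the page counts and names are made up.

```python
from collections import OrderedDict

class TieredMemory:
    """Toy two-tier memory: recently used pages live in the fast tier,
    everything else spills to the slow tier (system RAM / Optane)."""

    def __init__(self, fast_pages=4):
        self.fast = OrderedDict()   # page -> resident, kept in LRU order
        self.slow = set()
        self.fast_pages = fast_pages

    def touch(self, page):
        """Access a page; returns 'fast' or 'slow' for where it was found."""
        if page in self.fast:
            self.fast.move_to_end(page)    # mark as most recently used
            return "fast"
        # Page not resident in the fast tier: promote it, evicting the
        # least recently used page if the fast tier is full.
        self.slow.discard(page)
        self.fast[page] = True
        if len(self.fast) > self.fast_pages:
            evicted, _ = self.fast.popitem(last=False)
            self.slow.add(evicted)
        return "slow"

tm = TieredMemory()
for frame in range(3):   # a working set of 3 pages fits in the fast tier
    hits = [tm.touch(p) for p in ("a", "b", "c")]
print(hits)              # after warm-up, every touch lands in the fast tier
```

The interesting case is the one the thread is discussing: if the per-frame working set fits in the fast tier, everything behaves as if you had only fast VRAM; only when it overflows does the slow tier's latency matter, and then it depends on how often pages actually cross the boundary.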
 