AMD VEGA confirmed for 2017 H1

Yes?

Their roadmap says it's on track. IBM originally designed the process; GloFo just bought the IP and are reusing a lot of their 14nm equipment for it.

And TSMC also say they're starting mass production of their 7nm by the end of 1H 2018.
You've seen the past experience with 20nm nodes, etc?

If 7nm comes in on time (no delays) and performs as expected then... it'll be very surprising :p And that's something of an understatement.

e: Intel reckon they're about 3 years ahead in terms of process tech. GF's "7nm" will be something similar to Intel's 10nm, from what I've read. And Intel had massive problems with 10nm, didn't they?
 
You've seen the past experience with 20nm nodes, etc?

If 7nm comes in on time (no delays) and performs as expected then... it'll be very surprising :p And that's something of an understatement.
+1

The smaller the nodes get, the harder it seems to be getting. More things not going to plan.
 
+1

The smaller the nodes get, the harder it seems to be getting. More things not going to plan.

I don't think being stuck on 14nm would be that big a deal if they can manage to optimize like we saw Nvidia do with Maxwell at the end of 28nm's life. A couple of years on 14nm wouldn't be a bad thing. If they keep shrinking down as quickly as this we don't get the best out of each node, and the fact is they're going to hit a wall at some point, sooner rather than later. Ten years from now, what's likely to be available then?
 
Yup, I will just hold off playing the game till it is all patched up; got Mafia 3 to complete now.

Once games properly support resolutions higher than 1080P, the difference will be even more noticeable.





Again, "4k" is not limited to 16.9... Aspect ratio and resolution, 2 different things :p



Yup, it can be a right PITA to adjust settings etc., but it is well worth it, as not only do you get a game that runs better, it can often look better too.



That video is spot on! I've been saying it for ages. Max settings don't tend to do much for IQ; high settings get you virtually the same IQ with much better frame rates. You can play with other settings and turn them down to medium with little to no effect on visual quality.

Like the guy said, consoles tend to be the baseline, which means low/medium tends to get you the same experience as a console (Xbox One / PS4).

But cranking up those settings doesn't always do much, and personally, if a setting offers what I see as nothing, it stays where it is. I prefer smoother gameplay over a tiny or minuscule visual quality increase where you have to focus on one area to notice the difference while standing still. Who plays games like that? In motion you will not notice the minor details, but high frame rates will benefit you and you will notice them, especially on higher refresh rate monitors.
 
So, I was digging up some old news and read this January article about Vega on PC Perspective:

On the subject of the new 'geometry primitive shader':

The new programmable geometry pipeline on Vega will offer up to 2x the peak throughput per clock compared to previous generations by utilizing a new “primitive shader.” This new shader combines the functions of vertex and geometry shader and, as AMD told it to me, “with the right knowledge” you can discard game based primitives at an incredible rate. This right knowledge though is the crucial component – it is something that has to be coded for directly and isn’t something that AMD or Vega will be able to do behind the scenes.

This primitive shader type could be implemented by developers by simply wrapping current vertex shader code that would speed up throughput (to that 2x rate) through recognition of the Vega 10 driver packages. Another way this could be utilized is with extensions to current APIs (Vulkan seems like an obvious choice) and the hope is that this kind of shader will be adopted and implemented officially by upcoming API revisions including the next DirectX. AMD views the primitive shader as the natural progression of the geometry engine and the end of standard vertex and geometry shaders. In the end, that will be the complication with this new feature (as well as others) – its benefit to consumers and game developers will be dependent on the integration and adoption rates from developers themselves. We have seen in the past that AMD can struggle with pushing its own standardized features on the industry (but in some cases has had success ala FreeSync).
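For anyone wondering what "discarding primitives at an incredible rate" actually looks like, here's a purely conceptual sketch in plain Python of the kind of cheap per-triangle test such a shader stage could run before the rest of the geometry pipeline does any work. None of this reflects AMD's actual shader ISA, driver interface or any real API; all the names and numbers are made up for illustration.

```python
# Purely conceptual sketch: throw triangles away before the rest of the
# geometry pipeline ever touches them. Not AMD's API; illustration only.
from dataclasses import dataclass

@dataclass
class Vertex:
    x: float  # normalised device coordinates, roughly -1..1 on screen
    y: float

def should_discard(v0: Vertex, v1: Vertex, v2: Vertex) -> bool:
    # 1. Back-face / zero-area cull: signed area of the projected triangle
    #    (assumes counter-clockwise winding means front-facing).
    area = (v1.x - v0.x) * (v2.y - v0.y) - (v2.x - v0.x) * (v1.y - v0.y)
    if area <= 0.0:
        return True
    # 2. Trivial reject: all three vertices off the same side of the screen.
    for axis in ("x", "y"):
        if all(getattr(v, axis) < -1.0 for v in (v0, v1, v2)):
            return True
        if all(getattr(v, axis) > 1.0 for v in (v0, v1, v2)):
            return True
    return False

# A front-facing on-screen triangle is kept; the same triangle wound
# backwards is discarded before it costs anything further.
a, b, c = Vertex(0.0, 0.0), Vertex(0.5, 0.0), Vertex(0.0, 0.5)
print(should_discard(a, b, c))  # False -> kept
print(should_discard(a, c, b))  # True  -> culled
```

The article's point about "the right knowledge" is the catch: only the engine knows its own winding order, bounds and so on, so this has to be coded for explicitly (or exposed via API extensions) rather than the driver doing it silently behind the scenes, which is why developer adoption matters so much.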
 
I'm a bit confused. Vega gaming edition is due to be showcased at Computex with a launch date later on in the year? Does that mean towards the end of 2017?

Also isn't Volta supposed to be coming before the end of 2017 too?
 
so what ha
That video is spot on! I've been saying it for ages. Max settings don't tend to do much for IQ; high settings get you virtually the same IQ with much better frame rates. You can play with other settings and turn them down to medium with little to no effect on visual quality.

Like the guy said, consoles tend to be the baseline, which means low/medium tends to get you the same experience as a console (Xbox One / PS4).

But cranking up those settings doesn't always do much, and personally, if a setting offers what I see as nothing, it stays where it is. I prefer smoother gameplay over a tiny or minuscule visual quality increase where you have to focus on one area to notice the difference while standing still. Who plays games like that? In motion you will not notice the minor details, but high frame rates will benefit you and you will notice them, especially on higher refresh rate monitors.

For me the difference between medium and high or ultra is day and night. There is no question about it. I prefer both smooth play AND quality visuals. Can't stand consoles, PCs are so much better ...
 
That video is spot on! I've been saying it for ages. Max settings don't tend to do much for IQ; high settings get you virtually the same IQ with much better frame rates. You can play with other settings and turn them down to medium with little to no effect on visual quality.

Like the guy said, consoles tend to be the baseline, which means low/medium tends to get you the same experience as a console (Xbox One / PS4).

But cranking up those settings doesn't always do much, and personally, if a setting offers what I see as nothing, it stays where it is. I prefer smoother gameplay over a tiny or minuscule visual quality increase where you have to focus on one area to notice the difference while standing still. Who plays games like that? In motion you will not notice the minor details, but high frame rates will benefit you and you will notice them, especially on higher refresh rate monitors.

True. Especially when it comes to AA, many in-game engines do not handle it well at all. It's better to override it via the GPU drivers.
There are many examples of current games (or older ones like WoT) where the in-game AA taxed the GPU for no reason and often made the screen blurry, while overriding the setting with the same one from the drivers provided the same if not better visuals at a much higher frame rate.
 
I'm a bit confused. Vega gaming edition is due to be showcased at Computex with a launch date later on in the year? Does that mean towards the end of 2017?

Also isn't Volta supposed to be coming before the end of 2017 too?

No, the card is coming out this side of the half-year mark :) And Volta is due in 2018, not 2017.
Unless NV pulls a fast one again, selling it with GDDR5X RAM instead of HBM or GDDR6 as planned.
 
No, the card is coming out this side of the half-year mark :) And Volta is due in 2018, not 2017.
Unless NV pulls a fast one again, selling it with GDDR5X RAM instead of HBM or GDDR6 as planned.

Ah right, there have been so many dates thrown about that it's hard to keep track of what's actually accurate.

I say that because if Vega is due towards the end of the year then I'd get a 1080 Ti right now, but if it's still on track to be released before the end of June then that's not too bad.
 
So, I was digging up some old news and read this January article about Vega on PC Perspective:

On the subject of the new 'geometry primitive shader':

The new programmable geometry pipeline on Vega will offer up to 2x the peak throughput per clock compared to previous generations by utilizing a new “primitive shader.” This new shader combines the functions of vertex and geometry shader and, as AMD told it to me, “with the right knowledge” you can discard game based primitives at an incredible rate. This right knowledge though is the crucial component – it is something that has to be coded for directly and isn’t something that AMD or Vega will be able to do behind the scenes.

This primitive shader type could be implemented by developers by simply wrapping current vertex shader code that would speed up throughput (to that 2x rate) through recognition of the Vega 10 driver packages. Another way this could be utilized is with extensions to current APIs (Vulkan seems like an obvious choice) and the hope is that this kind of shader will be adopted and implemented officially by upcoming API revisions including the next DirectX. AMD views the primitive shader as the natural progression of the geometry engine and the end of standard vertex and geometry shaders. In the end, that will be the complication with this new feature (as well as others) – its benefit to consumers and game developers will be dependent on the integration and adoption rates from developers themselves. We have seen in the past that AMD can struggle with pushing its own standardized features on the industry (but in some cases has had success ala FreeSync).

Raja mentioned that a lot of developers were working on drivers for Vega, so could it be that they are optimising a few games to utilise this new primitive shader?
 
so what ha


For me the difference between medium and high or ultra is day and night. There is no question about it. I prefer both smooth play AND quality visuals. Can't stand consoles, PCs are so much better ...
No one mentioned medium. Medium these days is like the 3rd or 4th setting in games. The point here is that these days there is hardly any difference at all between the highest setting and the one below; the main difference is your fps counter showing much smaller numbers. Lol.
 
Ah right, there have been so many dates thrown about that it's hard to keep track of what's actually accurate.

I say that because if Vega is due towards the end of the year then I'd get a 1080 Ti right now, but if it's still on track to be released before the end of June then that's not too bad.

That's my dilemma too. It's not just when Vega comes out, it's when it will be available for sale with a decent cooler. I've seen that it may be released with a 'better than a blower' twin-fan cooler, but the dragged-out uncertainty is starting to persuade me to go for a 1080 Ti. Another couple of weeks and I'll be tempted to go down the latter route...
 
Same issue. I'm wanting to buy a 34" super-wide monitor; the one I like is FreeSync, so I need a card to pair with it. Given the total lack of firm news, I'm now starting to veer towards Nvidia.
 