
AMD VEGA confirmed for 2017 H1

The only thing I can say about your post is that it is about 100% right. :)

Has it not occurred to anyone that the reason we don't have Vega yet is because it is going to be very different from, and improved over, Polaris?

Yeah, I can even give an example of this: the original GTX Titan.

People looked at the 7970 and quite rightly said "it uses the same power as the Titan and the Titan is 35% faster".

Following that came the "that's why there is no way AMD can catch the Titan, they are both using 250 watts" argument.

Along came the 290X, 40% faster than the 7970 while using 10% more power. It didn't just catch up with the Titan, it beat it, and the same card is now trading blows with the GTX 980 and has left the GTX Titan and the 780 Ti way behind.

For a start, adding 50% more shaders does not result in 50% more power. Quite apart from that, all those memory ICs use about 30 to 40 watts and various PCB components also draw power, none of which goes up just because there is a bigger chip on the PCB. What's more, shaders don't even scale 1:1 with power (see the rough sketch below).

Add to that, Vega is not Polaris. Polaris is Tonga (R9 380X) die-shrunk and tweaked; Vega is brand new from the ground up.
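A back-of-the-envelope sketch of the argument above, in Python. All the figures (baseline core power, overhead, the 0.8x shader scaling factor) are illustrative assumptions, not measured values; only the 30-40 W memory figure comes from the post itself.

```python
# Rough sketch: why 50% more shaders != 50% more board power.
# All figures below are illustrative assumptions, not measurements.

def board_power(core_w, memory_w, other_w):
    """Total board power = GPU core + memory ICs + VRM/fan/PCB overhead."""
    return core_w + memory_w + other_w

base = board_power(core_w=160, memory_w=35, other_w=25)   # ~220 W baseline

# Add 50% more shaders: only the core grows, and shader power itself
# rarely scales 1:1 (assume ~0.8x per added shader here).
scaled_core = 160 * (1 + 0.5 * 0.8)                       # 224 W core
bigger = board_power(core_w=scaled_core, memory_w=35, other_w=25)

print(f"baseline: {base:.0f} W, +50% shaders: {bigger:.0f} W "
      f"(+{(bigger / base - 1) * 100:.0f}%, not +50%)")
```

With these assumed numbers the board lands around +29% power for +50% shaders, which is the gist of the point being made.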
 
Fact of the matter is that at the high end, power limits performance regardless of opinion. Look at the power-hungry cards of the past: I'm talking the GTX 480, GTX 280, 2900 Pro, 290/390 and Fury, all high-end cards with very high power consumption, power limited at stock. You could overclock and get loads more performance out of them, but then power consumption would get to insane levels, which is why they were clocked as they were at stock.

In the midrange/low end you have much more freedom to push performance at the expense of power consumption.

Not really but I know what you are getting at.

Pascal Titan and GTX 1060 are both big overclockers from opposite ends of the spectrum, whereas the 1070 and 1080 are quite moderate.

I think the point you are making is that when cards get close to 270 watts TDP or more, it becomes difficult to cool them with a normal air cooler. :)
 
Yeah, I can even give an example of this: the original GTX Titan.

People looked at the 7970 and quite rightly said "it uses the same power as the Titan and the Titan is 35% faster".

Following that came the "that's why there is no way AMD can catch the Titan, they are both using 250 watts" argument.

Along came the 290X, 40% faster than the 7970 while using 10% more power. It didn't just catch up with the Titan, it beat it, and the same card is now trading blows with the GTX 980 and has left the GTX Titan and the 780 Ti way behind.


10% more power? That's outright incorrect:

https://www.techpowerup.com/reviews/AMD/R9_290X/25.html
http://www.anandtech.com/show/7457/the-radeon-r9-290x-review/19
 
The problem is a single RX 480 uses as much power as a single GTX 1070 (sometimes more), so unless Vega is much better on performance per watt on the same node (which can be done, looking at Kepler to Maxwell), they might end up being power limited trying to keep a high-end Vega to a sensible TDP (rough sketch below).

HBM2 will help in that regard, but it can only do so much going by HBM1 on the Fury X.
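A quick perf-per-watt comparison to put numbers on that worry. The relative performance figures, board powers, and the "2x RX 480 in 250 W" target are all ballpark assumptions for illustration, not measured data.

```python
# Ballpark perf-per-watt comparison; all figures are assumptions for illustration.
cards = {
    # name: (relative performance, typical board power in watts)
    "RX 480":   (1.00, 165),
    "GTX 1070": (1.50, 150),
}

for name, (perf, watts) in cards.items():
    print(f"{name:9s} perf/W: {perf / watts:.4f} (arbitrary units)")

# For a hypothetical big Vega to reach ~2x RX 480 performance inside a ~250 W
# budget, perf/W versus the RX 480 would need to improve by roughly:
target_perf, budget = 2.0, 250
improvement = (target_perf / budget) / (1.00 / 165) - 1
print(f"needed perf/W improvement over RX 480: ~{improvement * 100:.0f}%")
```

Under these assumptions Vega would need on the order of a 30% perf/W gain over Polaris just to stay inside a conventional high-end power budget.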

Well, as it seems, Vega will be quite different from Polaris. For instance, GCN4 (Polaris) uses the GCN3 ISA, while GCN5 (Vega) is a new ISA.
 
Don't get hung up on power usage.

Just pretend for a minute that the Vega cards use more watts than the 1080 but cost less and are faster; I think people will buy them without giving it a second thought.

Some people on forums will argue about whether a card uses 10 or 20 watts more or less than another but for most people it comes down to performance and cost.

My guess is Vega is going to be very energy efficient anyway.:)
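To put the "10 or 20 watts" argument in perspective, here is a small sketch of what such a difference costs in electricity over a year. The gaming hours and price per kWh are assumed values chosen purely for illustration.

```python
# What an extra 20 W actually costs over a year of gaming.
# Usage hours and electricity price are assumptions, not measured figures.
extra_watts   = 20     # difference between the two cards being argued over
hours_per_day = 3      # assumed gaming time per day
price_per_kwh = 0.15   # assumed electricity price per kWh

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.1f} kWh/year, roughly {cost_per_year:.2f} per year")
```

With those assumptions it works out to about 22 kWh, or a few units of currency a year, which is why most buyers weigh performance and price far more heavily.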
 
So will Nvidia ever support freesync?

I can't see it, as they won't get any money from it.

And you know this how?

Two RX480s based on Polaris beat a 1070 quite comfortably and sit just under the 1080.

No they don't, a single 1070 smashes them. I'll take one all day long over a pair of 480s, plus there's the sheer power two of them will need compared to what a 1070 does, given one on its own already needs more.
 
Don't get hung up on power usage.

Just pretend for a minute that the Vega cards use more watts than the 1080 but cost less and are faster; I think people will buy them without giving it a second thought.

Some people on forums will argue about whether a card uses 10 or 20 watts more or less than another but for most people it comes down to performance and cost.

My guess is Vega is going to be very energy efficient anyway.:)

Power usage does not bother me. It's all about bang for buck, and at the time this 290 was it. It's not known to sip power either, lol.
 
Hawaii was and still is a great card. Although I love the 970 I do also miss my 290; the 970 is better at tessellation and it is more efficient, and I like it a lot for that, but when push really came to shove the 970 folded up and caved in while the 290 just kept going like an unstoppable freight train.
 
I really like my Hawaii cards too, for mGPU use they are some of the very best AMD have produced.

They showed the 980s the door at 2160p from day one in mGPU.:)

And the cooler showed people the way to the nearest shop for earplugs :p

No idea how AMD thought that crappy cooler was acceptable.
 
In your TPU link:

7970 GHz = 209 watts
290X Uber = 236 watts (+13%)

As you well know, your power/performance claim was about the 7970, not the 7970 GHz:

7970 = 163 Watts
290X Uber = 236 Watts

The 7970 GHz is just an 8% clock bump (925 MHz to 1 GHz) over the normal card, yet has roughly 28% greater power consumption. This backs up what I was saying: on the high end, power is the limiting factor, as at that point pushing clocks higher, even if you can do it, makes power shoot up and efficiency drop (see the sketch below). This has always been the case for both AMD and Nvidia.
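A minimal sketch of why a small clock bump costs disproportionate power, using the usual dynamic-power rule of thumb that power scales roughly with frequency times voltage squared. The clock figures are from the post; the voltage bump is an assumed, illustrative value, not a measured one.

```python
# Dynamic power scales roughly with f * V^2 (leakage ignored for simplicity).
# Clocks are from the post; the voltage bump is an illustrative assumption.
f1, f2 = 925, 1000     # MHz: 7970 -> 7970 GHz Edition
v1, v2 = 1.10, 1.20    # volts: assumed stock voltages, not measured values

power_ratio = (f2 / f1) * (v2 / v1) ** 2
print(f"clock +{(f2 / f1 - 1) * 100:.0f}%, voltage +{(v2 / v1 - 1) * 100:.0f}% "
      f"-> power roughly +{(power_ratio - 1) * 100:.0f}%")
```

With an assumed ~9% voltage increase on top of the 8% clock bump, the model lands close to the ~28% power increase seen between the two cards, which is the efficiency cliff being described.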
 
So will Nvidia ever support freesync?

If Nvidia feel they are losing lots of sales to AMD from people who have a FreeSync monitor, then yes. However, as it stands Nvidia are still maintaining nearly 80% of discrete desktop GPU sales, so there is no sign of FreeSync owners really swinging the sales stats.
 
To be honest I think Nvidia would probably gain more out of adopting freesync than not adopting it.

The extra cost of the gsync module on top of what are expensive monitors in the first place probably put a lot of people off, myself included. Over a year ago I would have considered a 980Ti if it were not for the fact that I already had a freesync monitor and wanted to keep using variable refresh technology.

:)


The G-Sync module isn't that expensive to produce; don't mistake market value for production cost. The G-Sync premium is almost purely down to market demand and perceived value.
 
If FreeSync were an HDMI standard, would Nvidia have to implement it on any of their GPUs with HDMI output? I don't know myself, just thinking out loud.

So will Nvidia ever support freesync?


Arrrrggghhhh... God damn AMD marketing and Huddy and his big mouth. :(

FreeSync is not now, nor will it ever be, part of the HDMI standard, just as it is not part of the DisplayPort standard.

Adaptive Sync, on the other hand, is an optional part of the DisplayPort standard and looks like it will be added to the next HDMI standard, but again probably as an option.

Nvidia will never support FreeSync, because if they tried to, AMD would sue them to hell and back, FreeSync being proprietary AMD technology.

Now of course there is nothing stopping Nvidia supporting Adaptive Sync, and it is entirely possible they have already used it on the G-Sync labelled laptops, because those didn't have a G-Sync module in them.

AMD have pulled such a blinder with FreeSync, as the term is used all the time instead of the correct "Adaptive Sync" when talking about non-AMD products. Then Richard Huddy misspeaks (quite possibly deliberately) and drums up loads of hype for Adaptive Sync in HDTVs, only to retract it all 15 minutes later.

It would be like all GPU cores being known as CUDA cores, or all multi-GPU usage being called SLI; AMD supporters would be up in arms over it all the time, yet FreeSync seems to get a free pass.
 
Vega is already here! :eek::eek::eek::eek:

 