
Poll: ** The AMD VEGA Thread **

On or off the hype train?

  • (off) Train has derailed - 207 votes (39.2%)
  • (on) Overcrowding, standing room only - 100 votes (18.9%)
  • (never ever got on) Chinese escalator - 221 votes (41.9%)
  • Total voters: 528
Bottom line, it's a 1070 killer because it's not a 1080 killer. It's also lackluster in terms of power consumption. The price simply needs to reflect these things, as per the norm.

That's the saddest bit. The price doesn't reflect what the card is like; it reflects how AMD got it very wrong and can't afford to sell it based on what it's actually worth. I like to try to be a glass-half-full rather than half-empty type of guy, but AMD make it hard sometimes.
 
Have patience :D:D:D. A mate just moved from a 290X to a 1080 Ti and so far he isn't impressed with its 4K capability. He expected too much. I keep telling him it's a beast, yet 4K gaming is still not possible if you want to turn the sliders up. He's more impressed with just how good his nearly four-year-old card is. I'm much more impressed, as I follow the hardware side, and to me it's a really fast card. It just shows how much the performance leaps have slowed down, and with one card 4K is still not ideal. I really think he was expecting 4x the performance and not 2x. Hopefully AMD get back in the game soon so we can get some real performance upgrades.
He's being silly then. I've done the same move and the difference amazes me: gone from 40 fps medium/high in the Tomb Raider reboot to about 80 fps on ultra.
 
Obviously this will mainly be guesswork here, but do you chaps think the air-cooled Vega 64 will be able to achieve the same boost clocks as the Liquid 64 when put in a custom loop? Or do you think the Liquid 64 chips are completely cherry-picked to hit the higher boost clock?

Reason I ask: I'll probably just go for the air-cooled 64 as I'm already running a custom water loop for my 390 CrossFire setup. Pointless paying the extra for the liquid version, as I'll want to pick up a couple of goodies from EK anyway.

I don't think the liquid ones will be cherry-picked; the Nano version, which will come later, will be the cherry-picked one. If it were me and I already had a loop, I'd just buy the cheapest reference one I could find.

That's kinda what I was thinking.
 
That's the saddest bit. The price doesn't reflect what the card is like; it reflects how AMD got it very wrong and can't afford to sell it based on what it's actually worth. I like to try to be a glass-half-full rather than half-empty type of guy, but AMD make it hard sometimes.

Well, that's what they get for staying on a 4-wide geometry engine again and not scaling to 6.
Still, they could have expanded the CUs by 4 on each engine, to then range from 4608 to 5120 stream processors (rough arithmetic below). That would probably have been enough to keep the clocks and power down while sitting solidly between GP104 and GP102.
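
To put numbers on that, here's a minimal sketch of the stream-processor arithmetic, assuming the publicly quoted Vega 10 layout of 4 shader engines x 16 CUs x 64 stream processors; the per-engine CU counts behind the 4608 and 5120 figures are my own interpretation:

```python
# Stream-processor counts for Vega 10 and the expanded variants suggested above.
# Baseline (4 engines x 16 CUs x 64 SPs) is the shipping Vega 64 layout; the
# bigger configs are just one reading of the 4608-5120 range in the post.
def stream_processors(engines: int, cus_per_engine: int, sps_per_cu: int = 64) -> int:
    return engines * cus_per_engine * sps_per_cu

print(stream_processors(4, 16))  # 4096 - Vega 64 as shipped
print(stream_processors(4, 18))  # 4608 - two extra CUs per engine
print(stream_processors(4, 20))  # 5120 - four extra CUs per engine
```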
 
Obviously this will mainly be guesswork here, but do you chaps think the air-cooled Vega 64 will be able to achieve the same boost clocks as the Liquid 64 when put in a custom loop? Or do you think the Liquid 64 chips are completely cherry-picked to hit the higher boost clock?

Reason I ask: I'll probably just go for the air-cooled 64 as I'm already running a custom water loop for my 390 CrossFire setup. Pointless paying the extra for the liquid version, as I'll want to pick up a couple of goodies from EK anyway.

I think they will, tbh. Binning seems to be a thing of the past apart from the very top chips. I can see no evidence that AMD have ever done this in the past, as most reference chips under water were some of the best. As long as you have voltage control, which seems to be there, I would say it's just a case of the silicon lottery.
 
I think you're missing the actual point in the HDR we're talking about. For all intents and purposes, we're talking about maximising contrast, which is where full-array local dimming comes into things to enable proper HDR. I feel like you're misunderstanding something, as that isn't something you can achieve with just software. The hardware has to be able to physically produce the extended range from dark to bright that comes with the HDR spec.
I agree.
Please take a breath and look at what you're saying.

1 - a game developer uses 6-bit colour because that's compatible with every display
2 - owners of 8/10-bit panels have colours going unused as a result
3 - some other developer makes software to use the unused colours and, as a result, contrast is increased, there's more vibrancy, etc.

The hardware you talk of in these situations is already present, just not utilised without HDR.

There is no specific hardware needed for actual HDR processing; the hardware required is just hardware that can output the extended range of colours. So basically the only requirement for 10-bit HDR is a 10-bit panel.
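
For what it's worth, the gap between those bit depths is easy to put numbers on. This is just standard per-channel arithmetic, not anything taken from the HDR spec itself:

```python
# Colour counts per panel bit depth: levels per channel = 2**bits,
# total colours for an RGB panel = levels**3.
for bits in (6, 8, 10):
    levels = 2 ** bits
    print(f"{bits:>2}-bit: {levels:>4} levels/channel, {levels ** 3:>13,} colours")
#  6-bit:   64 levels/channel,       262,144 colours
#  8-bit:  256 levels/channel,    16,777,216 colours
# 10-bit: 1024 levels/channel, 1,073,741,824 colours
```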

Of course, if you are a commercial company wanting to maximise your revenues, you will want to make HDR a profitable feature, and so you convince people they need a new TV for it. You make deals with companies that output to TVs (such as console manufacturers) so that they will only enable HDR on HDR-marketed displays. This can be achieved by checking for the presence of a so-called HDR chip.

This is not complicated; it's happened numerous times in the past and will happen again.

It's a similar thing with 3D graphics as well.

A CPU can generate the same visuals as, say, a 1080 Ti; the difference is that it does it a "lot" slower.
Early 3D games made a software 3D mode available where the game was still 3D but with lower performance. Nowadays no such software mode is available; instead, games will just check for the presence of a GPU that can accelerate 3D graphics.

G-Sync is a very recent example as well.

Nvidia chose a path that required a chip, and display companies jumped on it as they could sell monitors at a premium. Now FreeSync has come out, which ironically also carries a premium, but a smaller one. Nvidia cannot just start supporting FreeSync, as they very likely have commercial agreements with the G-Sync display manufacturers to make sure demand is kept for those products.

It is all about money.
 
Nice, you're right I'd missed that. If I'd known Vega was going to take so long and be so underwhelming I would have snapped that up a year ago! Now I'd like something faster.

Well, don't expect the Nano to be faster, mate. They will need to cut down on the power section, MHz, etc. If it's a match for the 1070 I'll actually be shocked. Not to mention it will cost the same.
 
I still think yhat Vega disappointed in only 1 thing... Price!!! This late this power hungly should be 50-100 bucks beliw nv matching cards.

If NV would like to kill this vega crap they should just cut 100 bucks off all their cards. And we all know yhey xan with 0 problems xause their pascal cards is Cheaper in production yhsn vega!!
 
Radeon RX VEGA 64 Liquid

 
I still think yhat Vega disappointed in only 1 thing... Price!!! This late this power hungly should be 50-100 bucks beliw nv matching cards.

If NV would like to kill this vega crap they should just cut 100 bucks off all their cards. And we all know yhey xan with 0 problems xause their pascal cards is Cheaper in production yhsn vega!!

I would try remapping your x and y keys on your keyboard before typing.
 
I'm fairly confident Vega 64 will come in under the £500 1080 price point.

£450 to be precise.

Although I have no idea how much the Fury X launched for as I wasn't following AMD at the time. :p
 
The more I learn about the Vega arch, the clearer I see which market they were actually targeting.


Not really obvious.

They clearly were not going after compute and HPC, because Vega 10 only has 1:16 FP64 support, so it is essentially locked out of that market for most uses. It is not that obvious for professional software either, because the Vega FE doesn't have certified drivers and is no faster than $800 Nvidia Quadro cards that do. Moreover, GPUs for pro software typically have a lot of VRAM; a 32GB model would be nice, but that is not possible with HBM2 at the moment.
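
As a rough illustration of what a 1:16 rate means in practice, here's the usual peak-throughput arithmetic; the boost clock is an assumed ballpark figure for the air-cooled card, so treat the result as approximate:

```python
# Peak FP32/FP64 throughput estimate for Vega 10 at a 1:16 double-precision rate.
sps = 4096          # Vega 10 stream processors
boost_ghz = 1.55    # assumed air-cooled Vega 64 boost clock (ballpark)
fp32_tflops = sps * 2 * boost_ghz / 1000  # 2 FLOPs per SP per clock via FMA
fp64_tflops = fp32_tflops / 16            # the 1:16 rate discussed above
print(f"FP32 ~{fp32_tflops:.1f} TFLOPS, FP64 ~{fp64_tflops:.2f} TFLOPS")
# FP32 ~12.7 TFLOPS, FP64 ~0.79 TFLOPS
```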

For deep learning, the FP16 certainly comes in handy and Vega should do well here. But AMD have struggled to get traction in this market, and they have an uphill battle since all of the frameworks like TensorFlow, Caffe etc. are built around CUDA, so OpenCL support is either non-existent or limited/delayed/reduced in functionality. And from some of my colleagues who work closer to deep learning software, the opinion is that the performance advantage of CUDA being a low-level API tightly knit around the hardware will more than overcome the lack of FP16 on Nvidia's consumer hardware. Furthermore, a lot of them use Amazon AWS GPU instances with GP100 and full FP16 support, which combined with the CUDA interface blows Vega out of the water.

Most of AMD's marketing has not been that focused on compute (cf. Nvidia's Volta releases), except for HBCC. HBCC is a big thing for HPC, but without the 1:2 FP64 this is really just a pipe cleaner for Vega 20 with proper FP64 support.

The most obvious market for Vega is indeed gaming (all the DX12 feature levels jumped up from being somewhat behind Pascal to being well ahead). It just so happens the performance isn't there. Maybe Vega has some design flaw, maybe GF's process is terrible, maybe some unforeseen event happened. AMD say that most of the 3.9bn new transistors were spent on increasing clock speeds by lengthening pipeline stages, which can decrease IPC slightly. But it looks like they never managed to hit the clock speeds they envisioned, and even when the clock speed is bumped up there must be some other severe bottleneck. GCN just hasn't evolved that well IMO. AMD have scaled up the compute resources each iteration, but the efficiency goes down. GCN was never short of compute; the problem was properly addressing and balancing it. Vega is still stuck with the same 4 shader engines (even if AMD made big changes and called the compute units NCUs), each feeding roughly 1000 GCN cores, just like Fiji, and Fiji was terribly unbalanced. A 6x768 design would have led to better utilization, but I'm sure there are reasons why AMD stuck with the old arrangement.

If you look at what Nvidia have done: they had 192 CUDA cores per SM in Kepler, reduced that to 128 in Maxwell, and reduced it again in Pascal (at least on GP100) down to 64. More streaming multiprocessors with fewer cores per SM leads to better utilization and efficiency, keeping more of the cores at full load.
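
The same point, worked through with the usual big-die figures from each generation (core counts quoted from memory, so treat them as approximate):

```python
# Cores per SM across Nvidia's big dies: fewer cores per SM means more SMs
# (each with its own scheduler and register file) for a similar total core count.
chips = {
    "Kepler GK110":  (2880, 192),
    "Maxwell GM200": (3072, 128),
    "Pascal GP100":  (3584, 64),
}
for name, (cores, per_sm) in chips.items():
    print(f"{name}: {cores} cores / {per_sm} per SM = {cores // per_sm} SMs")
# Kepler GK110: 15 SMs, Maxwell GM200: 24 SMs, Pascal GP100: 56 SMs
```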
 
It's good to see AMD trying every arcane possibility:
http://oc.jagatreview.com/2017/08/f...n-performance-per-watt-profile-untuk-rx-vega/

Both the air-cooled and liquid-cooled Radeon RX VEGA 64 will have two BIOSes:

  • Primary BIOS (default setting)
  • Secondary BIOS (Power Save BIOS)
And the Wattman performance-per-watt profile can be used to configure the Power Profile settings on the GPU (most likely via the power limit configuration in the driver), with Power Save settings to save power and Turbo settings for extra performance.

But if I were AMD I would be trying to get the bloody card out into the market... what's the point of all that awesomeness if you can't buy it?
 
AMD's strength is also their weakness. Developing both CPUs and GPUs means they have a unified vision, but also that they have to split R&D budgets.

Quite a disadvantage when your competitors specialize in one thing.

Is there any benefit to them doing both CPUs and GPUs though? It's not as if their GPUs run faster on AMD CPUs; in fact, for years even AMD GPUs ran faster on Intel CPUs :)
 