• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Poll: ** The AMD VEGA Thread **

On or off the hype train?

  • (off) Train has derailed

    Votes: 207 39.2%
  • (on) Overcrowding, standing room only

    Votes: 100 18.9%
  • (never ever got on) Chinese escalator

    Votes: 221 41.9%

  • Total voters
    528
Status
Not open for further replies.
It's only cheaper if you don't take the FE card as the real RRP. Nvidia marketing at its finest: boost the price but claim a lower RRP. With the 1070, almost all decent coolers were priced above the FE card, which is the stock cooler. Genius again, raising prices on the sly.
I've been saying this over and over, but people don't want to hear it... The AIB price for the 1070 was around the FE price, not the RRP.

People just so keen to jump to defend Pascal's pricing. "Please Sir, I want to pay some more."
 
I think the power draw requirements for Vega have been blown out of proportion. Do we as PC gamers care that much about heat and power draw? (if it's nothing unreasonable)

Of course you want performance and an efficient architecture, but it doesn't matter too much as long as the card performs.

My worry, though, is that Nvidia and AMD are now diverging quite drastically on the technical front, meaning we are going to end up with two camps game-wise: Nvidia games where AMD owners will need to turn stuff off to run adequately, and AMD games where Nvidia cards will have to do the same because the game uses Rapid Packed Math, async compute or HBCC.
 
IPC is meaningless if you don't take clock speed into account.

The only metric that matters is instructions per second, IPS.


Artificially limiting clock speed is like taking a Bugatti Veyron, testing it at 50 mph against a Ford Fiesta, and claiming both cars are just as fast as each other. It is completely meaningless.
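The relationship being argued here is just IPS = IPC × clock. A minimal sketch; the chips and figures below are made up purely for illustration:

```python
def instructions_per_second(ipc: float, clock_hz: float) -> float:
    """Throughput is IPC times clock speed; neither number means much alone."""
    return ipc * clock_hz

# Hypothetical chips: these are not real IPC or clock figures.
chip_a = instructions_per_second(ipc=4.0, clock_hz=2.0e9)  # high IPC, low clock
chip_b = instructions_per_second(ipc=2.0, clock_hz=4.0e9)  # low IPC, high clock
print(chip_a == chip_b)  # both deliver 8e9 instructions per second -> True
```

Which is why comparing IPC at a pinned clock tells you nothing about which chip is actually faster at its shipping clocks.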

I'll borrow a few arguments from basic queueing theory. Say you have a system with one server that processes 50 tickets an hour, and another with two shared servers each doing 24 tickets an hour (purely from an IPS perspective, system 1 is 50 TPH while system 2 is 48 TPH). Tail wait times (something like max frame times in GPU parlance) in system 1 ought to be higher than the corresponding estimates for system 2, and even the overall queue characteristics of system 1 will be worse than system 2's.

So in summary, given the same or slightly lower IPS, I'd choose a system with more threads and better IPC. That's my line of thought, but yeah, I completely appreciate your perspective as well.
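That intuition can actually be tested with the textbook Erlang C formula for an M/M/c queue. A minimal sketch; the arrival rate (40 tickets/hour) is my assumption, since the post only gives service rates:

```python
from math import factorial

def mmc_sojourn_time(arrival_rate: float, service_rate: float, servers: int) -> float:
    """Mean time in system (queueing + service) for an M/M/c queue, in hours."""
    a = arrival_rate / service_rate      # offered load
    rho = a / servers                    # per-server utilisation
    assert rho < 1, "queue is unstable at this load"
    # Erlang C: probability that an arriving ticket has to wait
    top = a**servers / factorial(servers) / (1 - rho)
    bottom = sum(a**k / factorial(k) for k in range(servers)) + top
    p_wait = top / bottom
    mean_queue_wait = p_wait / (servers * service_rate - arrival_rate)
    return mean_queue_wait + 1 / service_rate

# Assumed: Poisson arrivals at 40 tickets/hour (not stated in the post)
single_fast = mmc_sojourn_time(40, 50, 1)  # one server at 50 tickets/hour
two_slow = mmc_sojourn_time(40, 24, 2)     # two servers at 24 tickets/hour each
print(f"one x 50/h: {single_fast*60:.1f} min, two x 24/h: {two_slow*60:.1f} min")
```

At this assumed load the single fast server actually comes out ahead on mean time in system (roughly 6 minutes versus 8 minutes), partly because its total capacity is higher (50 vs 48 TPH); running it at other arrival rates shows how the comparison shifts with load.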
 
How do you come to that conclusion? It is likely the good architecture in Pascal that allows for such high clocks....

Also, it still scores 2,000 more in the graphics score (the CPU difference makes the overall score look closer there)
No, I've made an assumption here that the RX drivers will be able to cover the 10% difference; it's reasonably within the achievable range.
 
I think the power draw requirements for Vega have been blown out of proportion. Do we as PC gamers care that much about heat and power draw? (if it's nothing unreasonable)

Of course you want performance and an efficient architecture, but it doesn't matter too much as long as the card performs.

My worry, though, is that Nvidia and AMD are now diverging quite drastically on the technical front, meaning we are going to end up with two camps game-wise: Nvidia games where AMD owners will need to turn stuff off to run adequately, and AMD games where Nvidia cards will have to do the same because the game uses Rapid Packed Math, async compute or HBCC.

Back in 2013, when the argument was about the power draw of the 290X (stock) versus the 780 Ti (stock) and the price difference between the two, we worked out that someone had to run the 780 Ti for 25 years before the lower power consumption made back the extra money, and the Titan Black 58 years, at 10 hours per day, 365 days a year, at constant 100% load.

And if overclocking was applied to both (290X and 780 Ti), the estimate ran to centuries, because the difference was minuscule.
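As a sketch of how that sort of payback estimate works; all the figures below are illustrative assumptions, not the actual 2013 numbers (which would stretch the payback far longer):

```python
def payback_years(price_premium_gbp: float, watts_saved: float,
                  pence_per_kwh: float, hours_per_day: float = 10) -> float:
    """Years of use before the electricity saved pays back the price premium."""
    kwh_saved_per_year = watts_saved / 1000 * hours_per_day * 365
    pounds_saved_per_year = kwh_saved_per_year * pence_per_kwh / 100
    return price_premium_gbp / pounds_saved_per_year

# Illustrative only: £120 premium, 60 W saved, 14p/kWh, 10 hours a day
print(f"{payback_years(120, 60, 14):.1f} years")  # ~3.9 years
```

Even with a generous 60 W saving the payback takes years; shrink the wattage gap or the usage hours and it quickly runs to decades, which is the point being made above.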
 
These have been getting slagged off for months and will continue to be after launch; then in a few months everyone will calm down and agree they are actually pretty good, in fact far better value than Nvidia, and that they will also improve over the next couple of years with new drivers, so again a solid choice.

Same as it always has been with AMD in here.

People are expecting too much. I think a lot are also stuck in the past. Last decade's rapidly advancing fab processes meant you got big, obvious performance jumps at traditional price points as whole product stacks came out in relatively quick succession.

Now a particular performance level gradually gets cheaper over a generational lifecycle, with new products being just a slight bump in price/performance at a particular tier. It's a more gradual gradient over time rather than large stepwise changes every 6-12 months. The community hasn't adjusted, hence folks get upset over EOL prices vs new product prices: "But we could already buy this performance for around this price; it's barely an improvement at all!"
 
I've been saying this over and over, but people don't want to hear it... The AIB price for the 1070 was around the FE price, not the RRP.

People just so keen to jump to defend Pascal's pricing. "Please Sir, I want to pay some more."

But it simply isn't true. What you are saying is factually incorrect. There were loads of AIB 1070s for sale at around the £380 mark after release, even after the pound plummeted to 1.3 and below (meaning that after VAT the dollar price was practically the same as the pound price). Also, before the mining craze, good AIB 1070s could be had very easily for around the £350 mark.
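The VAT point checks out arithmetically. A quick sketch, with the exchange rate and VAT assumed at roughly their mid-2016 values:

```python
def usd_to_gbp_shelf_price(usd_price: float, usd_per_gbp: float = 1.30,
                           vat: float = 0.20) -> float:
    """UK shelf price implied by a pre-tax US price: convert, then add VAT.
    (US prices are quoted before sales tax; UK prices include 20% VAT.)"""
    return usd_price / usd_per_gbp * (1 + vat)

print(f"£{usd_to_gbp_shelf_price(379):.0f}")  # the $379 RRP implies ~£350
```

So at ~$1.30/£, a $379 US price and a ~£350 UK price are essentially the same money once VAT is accounted for.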
 
There is a difference between the water- and air-cooled Vega 64, and that is a 70°C and 90°C temperature cap before throttling respectively. Not sure why they put 70°C on the water-cooled card, but it's probably just a safety feature.


I think it's to limit power draw. You can always flash the air-cooled BIOS, with its 90°C limit, onto it :)
 
But it simply isn't true. What you are saying is factually incorrect. There were loads of AIB 1070s for sale at around the £380 mark after release, even after the pound plummeted to 1.3 and below (meaning that after VAT the dollar price was practically the same as the pound price). Also, before the mining craze, good AIB 1070s could be had very easily for around the £350 mark.

On sale, yeah, but not at their normal prices. A one-week or weekend sale is not the same. Their normal pricing was above the FE card for the majority of the time. This mining business has not helped with prices either, but in the main the decent 1070s were above the FE, just like before the FE, when better coolers were priced higher than the reference card. There would be sales back then as well, where a good model would come on sale below reference pricing.

Anyhow, I totally forgot this was the Vega thread for a bit, so better get back on topic before the warning comes :D:D:D.
 
I'll still wait for full reviews before passing judgement, but so far the product doesn't look bad at all; disappointing, yes, but "terrible", not really. It has been let down big time by the following, though:

- the poor Volta PR poster
- the bundle deal being limited to certain parts of the world (and what makes it even more laughable is that the Samsung 34" monitor has bad problems with FreeSync flickering [unless that's been fixed]; why not offer £100/200 off any FreeSync monitor when buying Vega? :confused:)
- coming this late, it could/should be at least $50 cheaper
- power efficiency is looking disappointing, but again, I'll wait for the reviews on this...

I would still pick Vega over a 1080, as I'd be willing to bet that Vega will be far more future-proof down the line, especially as DX12/Vulkan becomes more common, and even more so if FP16 gets used a lot, which it should, as it will "supposedly" improve performance for Nvidia too (and having two pretty big AAA games already announced to be using it is very good; let's just hope it doesn't go the way of AMD's TrueAudio...)

Just when things could not get any worse for the Vega launch.....



https://www.techpowerup.com/235701/...l-features-for-titan-xp-through-driver-update

The fastest gaming card available just got better!!!!

Shame I haven't got any professional work for mine to do. :(

And this just goes to show:

1. why competition is needed
2. how greedy/nasty Nvidia really are

That GPU has been out for how long, and all of a sudden, overnight, it has received an update bringing 300% more performance? In other words, if Vega FE wasn't so good, Titan XP users would most likely never have seen this update... Reading on Reddit, supposedly the Maya performance regression has been a huge bug on Nvidia GTX GPUs (according to Nvidia) for over a year now, and Nvidia made no attempt to fix it until now. Funny timing, eh...

So yeah, have fun with the PC gaming industry should anything happen to AMD :D
 
On sale, yeah, but not at their normal prices. A one-week or weekend sale is not the same. Their normal pricing was above the FE card for the majority of the time. This mining business has not helped with prices either, but in the main the decent 1070s were above the FE, just like before the FE, when better coolers were priced higher than the reference card. There would be sales back then as well, where a good model would come on sale below reference pricing.

Anyhow, I totally forgot this was the Vega thread for a bit, so better get back on topic before the warning comes :D:D:D.

Not really. A lot of the average prices for good aftermarket 1070s are under the £400 mark on the camelcamelcamel tracker. I can't really go into detail and link to loads of things, as it's against the rules, but once supply was good, and before the mining stuff, you could always get an AIB 1070 for around the $379 RRP, often less.

Even Overclockers, who I find are generally a bit expensive on the whole, had a few of the cheaper coolers/brands for around the RRP (considering the exchange rate at the time). This was 4th July 2016:

https://web.archive.org/web/2016111...onents/graphics-cards/nvidia/geforce-gtx-1070

RRP is always the starting price. The premium coolers and brands always have been, and always will be, more.
 
That GPU has been out for how long, and all of a sudden, overnight, it has received an update bringing 300% more performance?
It's a £5,000 Quadro card with "only" 12GB VRAM and a bunch of features disabled, for £1,000; that's a pretty good deal, and now, due to competition, it has fewer features disabled. Hence, as you said, competition being good.

I think it's a bit lame that Nvidia locked out so much of the performance the TXP was capable of via drivers, but then AMD once edited the HD 7970 BIOS, unlocked a few features, changed the name to FirePro D700, and sold them to Mac Pro users as an optional upgrade for something like 300% of the 7970's RRP. So both companies have done some pretty dodgy driver stuff.
 
https://www.gigabyte.com/Graphics-Card/GV-RXVEGA64X-W-8GD-B#kf

Why is a 1000W PSU recommended for a 345W TDP video card? Surely a good 800W would run it fine?

I think it is quite clear from what we have seen that raising the clocks on Vega significantly increases power draw, to crazy levels, so Gigabyte are probably playing it safe.

Been answered a few times in this thread. A PSU is most efficient somewhere around 50% load. If you push your PSU to its max rated output it will get *very* hot and you will reduce its lifespan.

You always want to buy a PSU rated at least 1.5x your max power draw. 850W minimum for Vega 64, 1000W recommended.

This has been answered wrongly several times in this thread. It does not need a 1000W PSU, not even an 850W one. Kyle over on HardForum, who has used and tested one, said:

But I would suggest that a good quality 750W would be fine for a full blown enthusiast clocked system.

That's him taking measurements at the wall. He says if you just game and don't overclock, a 500W PSU is fine.

I did check at the wall loads while gaming. A 500w PSU would be just fine for gaming. That said, the 1800X was not overclocked
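Part of the disagreement here is that a wall reading includes the PSU's own conversion losses; the DC load the PSU actually supplies is lower than what the meter shows. A rough sketch; the efficiency, wall reading and headroom figures below are assumptions, not measurements:

```python
def dc_load_from_wall(wall_watts: float, efficiency: float = 0.90) -> float:
    """The PSU delivers wall power times its efficiency; the rest is heat."""
    return wall_watts * efficiency

def suggested_psu_rating(dc_load_watts: float, headroom: float = 1.5) -> float:
    """The rule of thumb quoted earlier: rate the PSU ~1.5x the sustained load."""
    return dc_load_watts * headroom

# Illustrative: a 450 W wall reading on a ~90%-efficient PSU
dc = dc_load_from_wall(450)  # ~405 W of actual DC load
print(f"DC load ~{dc:.0f} W; the 1.5x rule suggests ~{suggested_psu_rating(dc):.0f} W")
```

On those assumed numbers a quality 650-750W unit satisfies even the 1.5x headroom rule, which is roughly where both sides of this argument end up.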
 
I'll still wait for full reviews before passing judgement, but so far the product doesn't look bad at all; disappointing, yes, but "terrible", not really. It has been let down big time by the following, though:

- the poor Volta PR poster
- the bundle deal being limited to certain parts of the world (and what makes it even more laughable is that the Samsung 34" monitor has bad problems with FreeSync flickering [unless that's been fixed]; why not offer £100/200 off any FreeSync monitor when buying Vega? :confused:)
- coming this late, it could/should be at least $50 cheaper
- power efficiency is looking disappointing, but again, I'll wait for the reviews on this...

I would still pick Vega over a 1080, as I'd be willing to bet that Vega will be far more future-proof down the line, especially as DX12/Vulkan becomes more common, and even more so if FP16 gets used a lot, which it should, as it will "supposedly" improve performance for Nvidia too (and having two pretty big AAA games already announced to be using it is very good; let's just hope it doesn't go the way of AMD's TrueAudio...)



And this just goes to show:

1. why competition is needed
2. how greedy/nasty Nvidia really are

That GPU has been out for how long, and all of a sudden, overnight, it has received an update bringing 300% more performance? In other words, if Vega FE wasn't so good, Titan XP users would most likely never have seen this update... Reading on Reddit, supposedly the Maya performance regression has been a huge bug on Nvidia GTX GPUs (according to Nvidia) for over a year now, and Nvidia made no attempt to fix it until now. Funny timing, eh...

So yeah, have fun with the PC gaming industry should anything happen to AMD :D

I find it hilarious how people praise AMD when stuff like this happens. When a performance boost arrives after a while, everyone is on their knees saying how awesome the support is and how future-proof the cards are; when Nvidia does it, it just shows how greedy they are. If that's not bias, I don't know what is, lol! As for the second statement, do you know something the rest of us don't?
 
Not really. A lot of the average prices for good aftermarket 1070's are under the £400 mark on the camelcamelcamel tracker. Can't really go into detail and link to loads of things as it is against the rules, but once supply was good and before the mining stuff, you could always get an AIB 1070's for around the $379 rrp, often less.
Why do you keep shifting between $ and £ as though they were interchangeable? They weren't/aren't. It muddies the water.

Maxwell launch price was £270/$329 for the 970.

Pascal launch price was £400/~$449 for the 1070. AIB boards from Gigabyte and MSI were $439 and $429 (or vice versa).

We were saying that Brexit isn't the be-all and end-all of this price rise. The USD price did jump between Maxwell and Pascal; Brexit is not the smoking gun people keep saying it is. Nvidia pushed the price a tier higher than it had been during Maxwell.

Any one of us could just find an old thread from around Pascal's launch and see people complaining about how the price had shot up versus the 970. And that was before Brexit.

And yet everybody just wants to blame Brexit for literally everything now. If AMD or Nvidia price the 2070 or the Vega 20 mid-range part at $499 (USD), somehow that'll be the fault of Brexit...

It's an excuse people like to use to justify paying more. They want to pay more, but they also want to feel like they had no choice. That's the real problem: people want to spend, spend, spend, but feel like a victim while they do...
 
I find it hilarious how people praise AMD when stuff like this happens. When a performance boost arrives after a while, everyone is on their knees saying how awesome the support is and how future-proof the cards are; when Nvidia does it, it just shows how greedy they are. If that's not bias, I don't know what is, lol! As for the second statement, do you know something the rest of us don't?

Except there is a big difference between:

- 300% overnight
- 20-30% at "most", spread across six months to two years

And also, note the timing of said update...
 
This has been answered wrongly several times in this thread. It does not need a 1000W PSU, not even an 850W one. Kyle over on HardForum, who has used and tested one, said:



That's him taking measurements at the wall. He says if you just game and don't overclock, a 500W PSU is fine.
You clearly have no idea how PSUs work. Feel free to run yours at its maximum output. I sure as hell won't be doing that with mine.

Educate yourself and stop relying on whoever "Kyle at Hardforum" is to spoonfeed you.

e: Also, please buy a Vega 64 and try using it with a 500W PSU, an i7, a couple of disk drives, a few system fans... Please let us all know how long it runs before catching fire :p

You're saying you can run a system with a 400W GPU on a 500W PSU.

No one is taking *that* seriously. No one.
 