• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Poll: ** The AMD VEGA Thread **

On or off the hype train?

  • (off) Train has derailed

    Votes: 207 39.2%
  • (on) Overcrowding, standing room only

    Votes: 100 18.9%
  • (never ever got on) Chinese escalator

    Votes: 221 41.9%

  • Total voters
    528
Status
Not open for further replies.
Moar power?
You think this is a lot of power?
Just wait for Vega 20's 700 watt power draw...
This would be a Vega 20 owner going to play GTA V.
 
The 580 and 285 are difficult ones to judge as they were sort of refreshes/late in the architecture's life.

Fundamentally though, Titan being the exception, graphics card prices have not changed much at all.

I mean even in 2004 an ATI 9800Xt had an RRP of $499!

If you put £500 into an inflation calculator, you get ... £700

http://www.bankofengland.co.uk/education/Pages/resources/inflationtools/calculator/default.aspx
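As a rough sketch of what that calculator is doing - simple compounding at a flat average rate (the ~2.6% figure here is just an illustrative pick that lands near the £700 result, not the Bank of England's actual CPI series):

```python
def adjust_for_inflation(amount, years, annual_rate):
    """Compound a price forward by a flat annual inflation rate."""
    return amount * (1 + annual_rate) ** years

# A flat ~2.6% a year turns £500 in 2004 into roughly £700 by 2017.
print(round(adjust_for_inflation(500, 13, 0.026)))  # 698
```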

The Titan when it came out was massively more expensive than the GTX480 and GTX580 - the die sizes are what you need to look at, especially when a number of these Titan class cards are second or third level salvage. Look at the first Titan (Pascal), it was a second or third level salvage, and would have been more equivalent to a GTX470 or GTX570/GTX560 Ti 448 in the Fermi era.

It's something done in other areas - have a stupidly priced "high end" product and it pushes up the product stack.

With situations like the GTX480, GTX580, GTX285, GTX280, etc. the product stack below it was much more constrained.

Now look at what has happened at the lower end of the market - R9 290/GTX970/R9 290X/R9 390/GTX1060/RX470/RX480/RX570/RX580 - all comparable performance, and it's gotten worse under £150 where performance has really not hugely increased either.

Even $50 more on a high-end card does not sound like much, but on a lower-end card it means you are jumping price brackets.

That is the whole point of these kinds of launches, and it's been done elsewhere in other areas outside computers too.

The problem is that with AMD not really competing well in graphics, there is no real need for Nvidia to care so much, and things like the Titan will still sell anyway.
 
I was just reading this on reddit talking about FP16 : https://www.reddit.com/r/Amd/comments/6qovte/wolfenstein_2_will_use_fp16_on_vega_golemde/

Although this isn't the same scenario and might not be applicable today, it got me wondering: could it be possible that with AMD leveraging those 25 Tflops of FP16 in some games for certain things, it could hurt performance on GPUs that don't support double-rate half precision? I mean, if that is a possibility it would make Vega look better, but not because it's leveraging those 25 Tflops at times, rather because it's bringing down performance on unsupported GPUs?
Could this FP16 thing in games become a problem for other GPUs?

Personally I can't see many devs going to the bother of coding in things that are only in one segment of one GPU vendor's range, especially when it's the higher end and most people have 1060s/580s or below.
 
No it will make no difference. GPUs that don't support FP16 will simply use FP32 registers as usual.

Only very specific operations can get away with FP16, so if a developer really tried to make something compute heavy and rely on FP16 to reduce some of the workload it will still be limited by FP32 performance. Recently AMD GPUs have offered far more theoretical FP32 compute than Nvidia cards just to match performance. With Vega the difference in compute performance is much smaller, which explains the performance in games. A developer making a very compute heavy game will likely impact Vega more than Pascal at this stage.
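Not Vega-specific, but you can get a quick feel for why only certain operations can "get away with" FP16 using Python's `struct` module, which supports the IEEE half-precision format (`'e'`) - the 10-bit mantissa means precision runs out fast:

```python
import struct

def to_fp16(x):
    """Round-trip a float through IEEE half precision (struct's 'e' format)."""
    return struct.unpack('e', struct.pack('e', x))[0]

# FP16 has a 10-bit mantissa: at magnitude 2048 the spacing between
# representable values is 2, so adding 1 is simply lost to rounding.
print(to_fp16(2048.0 + 1.0))   # 2048.0
# Ordinary (double-precision) floats keep the sum exact.
print(2048.0 + 1.0)            # 2049.0
```

Harmless for something like a colour value; a disaster for accumulating sums, which is why devs only drop to FP16 for carefully chosen shader work.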
 
The 970 was a massive drop in price compared to the 770. The prices of Nvidia GPUs have not increased.

Indeed. The 670 RRP was $399 as well. Even the 470 in 2010 was $349!

The GTX 260 was $399 as well.

$399 in early/mid 2008 would have been ~£230 after VAT... Now it works out as £370...
 
Sounds like AMD are aligning themselves with the consoles.

Now if only Nvidia supported these features we'd all see some advancement and better games.

Seems like AMD and Nvidia are diverging even more on a technical front. Imagine a game that NEEDS the HBCC to run properly due to using massive amounts of data. It's gonna run like crap on Nvidia without some concessions.

Hmm maybe HBCC is useful in situations where you don't have enough Vram for the amounts of textures you want to use?

nVidia have had a unified memory architecture from Pascal onwards - the trick AMD has up its sleeve is probably the combination with IF, as nVidia will still be limited on non-NVLink systems by the PCI-e bus. But with the huge footprint of Intel CPUs and the market dominance of nVidia, game developers aren't going to be making use of HBCC-type stuff beyond what you can utilise via the limits of the PCI-e bus anyhow, so as not to cut off large parts of their market. Consoles generally use a very different (and largely shared) memory space anyhow.
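For what it's worth, HBCC is essentially demand paging for VRAM: keep the hot pages on the card and fetch the rest over the bus on demand. A toy sketch of that idea, with an LRU cache standing in for VRAM (all names here are made up, purely illustrative):

```python
from collections import OrderedDict

class ToyPageCache:
    """Toy LRU cache: a stand-in for VRAM holding the hottest pages,
    with everything else left in system memory."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()

    def access(self, page_id):
        if page_id in self.pages:
            self.pages.move_to_end(page_id)   # hit: mark as recently used
            return "hit"
        if len(self.pages) >= self.capacity:
            self.pages.popitem(last=False)    # evict the least recently used page
        self.pages[page_id] = True            # "page in" over the bus
        return "miss"

cache = ToyPageCache(capacity=2)
print([cache.access(p) for p in ["tex_a", "tex_b", "tex_a", "tex_c", "tex_b"]])
# tex_b gets evicted when tex_c pages in, so the last access misses again
```

The interesting part is exactly what's debated above: every "miss" is a trip across PCI-e, so how well this works depends on the bus, not just the cache.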
 
Well I just had a look through my box of odds and ends and I can't find another 8 pin PCI-E power connector.

I have one 6 and one 8 powering the 970 at the moment but if Vega needs two 8 pins....

Can I buy these PCI-E modular power connectors separately?

---

Seems like you can.
 
It didn't on release.

The AIB products were all more expensive than the FE, priced at $449. The $379 RRP was never anything more than a fantasy. AIBs didn't want their product being cheaper than the reference design.

I think it was the other way round at the very beginning, due to Nvidia claiming that the new Founders Edition was more expensive because of its quality. It didn't last long though; they soon pushed the prices of the aftermarket cards up and put the tacky blowers under the Founders price-wise.
Unless I'm having a brain fart, I remember the aftermarket EVGA 1080s being cheaper than the Founders version, but those that pre-ordered didn't get them for ages. I remember fighting the urge to order one (glad I didn't now).
 
That's what I want to know. Vega 56 will have AIB cards for sure, but no clue about the Nano and 64; if they do a repeat of Fiji that would be bad.

Pics of the Asus Vega 64 Strix have been released, so yeah, this time around we can get aftermarket models. Oddly enough Sapphire are only showing Vega reference cards on their site; I thought they'd be one of the first with aftermarket Vegas, just like with the Fury pro.
 
Realistically for the AIB cards we are looking at Sep. :(

At this rate Volta will be here.

Volta gaming cards should be coming in Q1 2018, or thereabouts.

There's a 384-bit GDDR6 card being made then, according to Hynix, so that must be Volta.

Point being, Vega should have ~6 months of availability before gaming Volta.
 
Difficult. I can't really buy into a platform (1080) which is 12 months old and very expensive, especially knowing Volta is only 6-9 months away.

I usually keep a GPU for 2 years. So that would mean I would always be buying when the last generation is on the way out. Not bad if prices dropped but they haven't.

On the flip side, makes sense to just go Vega. It's new, just released and I have a Ryzen already.

But I haven't had an AMD GPU since my 5870. Not easy to tear myself away from Nvidia.
 
Do you think we will see aftermarket cards by the middle of September? Also, do you think they will cost around the £400 mark for the RX Vega 56? That's what I intend on getting :)
 
Well, in this video posted by Gerard the guys do mention the use of HBCC for building 'big worlds with lots of textures that run really fast'.

Listen from 2:00 onwards.

 
I took those values from Gamers Nexus, so I'm not sure how accurate they are.

From watching the presentation, what I think they did was focus on getting themselves back into the data centre first. So Epyc and Vega for compute. Ryzen with its modular design seems to have paid off, but Vega... not sure yet.

Your numbers are accurate :)


(I dare not try to add images again, because I might get perma-banned if something goes wrong)

http://wccftech.com/amd-radeon-rx-vega-64-56-official-slide-performance-specs-price-leak/

Yes, it's from WCCFTech, but it's nothing AMD hasn't posted and it's their own presentation. It has both the Tflops (13.7 for the liquid-cooled card used in the tests, apparently) and the rest.
Also, a bit lower down it has the images comparing Vega to the Fury X in BF1, which is a good game to check AMD cards with (well optimised, apparently) and an indication of how pants the Vega performance is.
It's even worse than Polaris. It can't be explained: a card clocked 60% higher is barely getting 20% more FPS at 2560x1440 and 33% at 4K - a resolution (4K) and game (BF1) where the Fury X is more than likely running out of Vram in some places. I bet at 3440x1440, where the Fury X only runs out of Vram in Tomb Raider, the difference would again be 20-22% if not less (the FX is strong at 1440p+ when it's not running out of Vram). Consider how much a normal Fury X at 1190/600 could have closed the gap.
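Taking those figures at face value, the back-of-envelope arithmetic behind that argument looks like this (numbers straight from the post, purely illustrative):

```python
# Post's rough figures: Vega clocked ~60% higher than the Fury X,
# but only ~20% faster at 1440p and ~33% faster at 4K.
clock_uplift = 1.60

for res, fps_uplift in [("1440p", 1.20), ("4K", 1.33)]:
    per_clock = fps_uplift / clock_uplift
    print(f"{res}: {per_clock:.0%} of Fury X per-clock performance")
```

In other words, on these numbers Vega would be doing roughly 75-83% of Fiji's work per clock, which is the regression being complained about.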
 
I think the GTX 1070 will still be better (and cheaper)
Yup. I expect the Vega 56 will be roughly equal to a 1070.

As for price... sadly it completely depends on miners, and how good the Vega 56 is for mining. Sad :( I just wish this nonsense mining thing would implode.
 