
Poll: ** The AMD VEGA Thread **

On or off the hype train?

  • (off) Train has derailed: 207 votes (39.2%)
  • (on) Overcrowding, standing room only: 100 votes (18.9%)
  • (never ever got on) Chinese escalator: 221 votes (41.9%)

  • Total voters: 528
Status
Not open for further replies.
The 1070 starts at $379
It didn't on release.

The AIB products were all more expensive than the FE, priced at $449. The $379 RRP was never anything more than a fantasy. AIBs didn't want their product being cheaper than the reference design.

e: Found some launch reviews.

The MSI Gaming was $439, the Gigabyte G1 Gaming was $429. Slightly less than the FE but nowhere close to $379. I don't think any AIB 1070 was $379 in the first few months after launch.

Also, if you remember, stock levels were very, very limited, so retailers gouged the hell out of the prices. It wasn't uncommon for 1070s to be $499.
 
To sum it up: the Vega launch could have been a good launch if the cards were $50-70 cheaper.
But if you look at it, the Vega cards, although late, still bring good value for people looking to buy $300-500 cards. The Vega 56 should outperform the 1070, and the Vega 64 should trade blows with the 1080. AMD doesn't seem able to improve efficiency at high stream-processor counts yet, so I expect the 56 to be very close to the 64 at launch (maybe within ~10%), as the Fury was to the Fury X at launch.
That said, there are a lot of reasons for people to pick Vega instead of a 1070/1080 (unless Nvidia drops prices by $50): HBM2, HBCC, the minimum frame rates they are talking about, Vulkan/DX12 performance compared with DX11, better scaling as resolution goes up, and the price of FreeSync. Then there's power draw, which shouldn't really matter that much for gamers.
If I wanted to buy a $500 card today I would pick Vega instead of a 1080 or 1070, for all the reasons above; I also know that driver support will last longer for a new card than for one about to be replaced by a new lineup.
Still, another bad launch from the Radeon group. The product is OK and will probably get better, but they should have bitten the bullet on price and gone $50 lower. Now I just hope they allow AIBs to make Vega 64 and Nano versions.
 
It didn't on release.

The AIB products were all more expensive than the FE, priced at $449. The $379 RRP was never anything more than a fantasy. AIBs didn't want their product being cheaper than the reference design.

Blower version as cheap as £365 at launch

https://forums.overclockers.co.uk/threads/nvidia-gtx-1070-wanna-pre-order.18734903/

Also, before the mining nonsense, you could easily pick up a decent 1070 for low three hundreds. You only have to type 1070 into hotukdeals to see that.
 
The Titan is a vanity/halo product and admittedly is where Nvidia have taken the **** somewhat. Nothing new though, as seen with the 8800 Ultra.

For all intents and purposes the Ti versions recently have basically been the top card and, as posted, these were ~$649 even 10 years ago.

Yeah, but remember the GTX 580 was the full monty and that was $500. The GTX 480 was a second-level salvage part but a similar price. The GTX 280 was the full monty at $699, and the GTX 285 was $400.

But during that time ATI/AMD were competing really well with Nvidia, so prices were kept in check.

The Titan brand (and even the 8800 Ultra) only existed to push up the price of the cards under it - the same with what the Fury X did. R9 290 series card prices were in free fall before it launched. The combination of the R9 290 rejig and the Fury X actually pushed R9 390/390X prices up.

It's the midrange and lower end where performance improvements have stagnated, and it means that if you want a decent upgrade you're more and more likely to be looking further up the AMD and Nvidia stacks and spending more money.
 
I just turned down the opportunity to get a 1080Ti with an EVGA hybrid cooler for £520 on members market :p
 

The 580 and 285 are difficult ones to judge as they were sort of refreshes, late in the architecture's life.

Fundamentally though, Titan being the exception, graphics card prices have not changed much at all.

I mean, even in 2004 an ATI 9800 XT had an RRP of $499!

If you put £500 into an inflation calculator, you get ... £700

http://www.bankofengland.co.uk/education/Pages/resources/inflationtools/calculator/default.aspx


Massive graphics card price increases are a myth; prices have on the whole been similar all the way back to the early 2000s, especially accounting for inflation. We have only been feeling it more recently because the pound has taken such a hit.

Or US inflation - $647

http://www.usinflationcalculator.com/
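For what it's worth, those calculator results are just compound inflation, so they can be reproduced with a quick Python sketch. The average annual rates used here (roughly 2.6% UK, 2.0% US over 2004-2017) are my own rough approximations, not the calculators' actual data:

```python
def inflate(amount: float, annual_rate: float, years: int) -> float:
    """Compound a price forward by a constant average inflation rate."""
    return amount * (1 + annual_rate) ** years

# £500 in 2004 at ~2.6% average UK inflation over 13 years
print(round(inflate(500, 0.026, 13)))  # ~698, close to the £700 above

# $499 in 2004 at ~2.0% average US inflation over 13 years
print(round(inflate(499, 0.020, 13)))  # ~646, close to the $647 above
```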
 
I was just reading this on reddit talking about FP16 : https://www.reddit.com/r/Amd/comments/6qovte/wolfenstein_2_will_use_fp16_on_vega_golemde/

Because it offers a tangible benefit in most applications. GeForce FX supported FP16/FP32 back in 2003 (when the original DX9 specification mandated FP24) and there were visible anomalies when using the less precise path in Half-Life 2. Shader Model 3.0 then started mandating FP32 support back in 2004 and games have been using it ever since.
The reason that FP16 is now seeing a resurgence is that some modern rendering techniques don't need full precision. You shouldn't take this news to mean that the entire rendering pipeline will use FP16, because we've long since moved past that.
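The precision gap being discussed is easy to demonstrate outside of any GPU. A small sketch using the Python stdlib `struct` half-precision format to emulate FP16 rounding (purely illustrative, nothing to do with an actual rendering path):

```python
import struct

def to_fp16(x: float) -> float:
    """Round a Python float to IEEE 754 half precision (FP16) and back."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# FP16 has only ~3 decimal digits of precision, so even 0.1 is off:
print(f"{0.1:.10f}")           # 0.1000000000
print(f"{to_fp16(0.1):.10f}")  # 0.0999755859

# Worse: accumulating small values in FP16 stalls once the running sum's
# representable spacing (ulp) exceeds the addend, so the total falls far
# short of the true sum of 1.0.
total = 0.0
for _ in range(10000):
    total = to_fp16(total + to_fp16(1e-4))
print(total)  # far below 1.0
```

This kind of rounding accumulating across shader math is what produced the visible FP16-path anomalies back in the GeForce FX era, and it's why FP16 today is reserved for effects that can tolerate it.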

In view of ATI's recent Shader Day event and Gabe Newell's presentation of Half-Life 2, we asked John Carmack for his opinion. What can we expect from the GeForce FX family in future DirectX 9 games?

Hi John,

No doubt you heard about GeForce FX fiasco in Half-Life 2. In your opinion, are these results representative for future DX9 games (including Doom III) or is it just a special case of HL2 code preferring ATI features, as NVIDIA suggests?

Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec.

John Carmack

Although this isn't the same scenario and might not apply today, it got me wondering: if AMD leverages those 25 TFLOPS of FP16 in some games for certain things, could it hurt performance on GPUs that don't support double-rate half precision? If that is a possibility, it would make Vega look better, not because it's leveraging those 25 TFLOPS at times, but because it's bringing down performance on unsupported GPUs.
Could this FP16 thing in games become a problem for other GPUs?
 
Blower version as cheap as £365 at launch

https://forums.overclockers.co.uk/threads/nvidia-gtx-1070-wanna-pre-order.18734903/

Also, before the mining nonsense, you could easily pick up a decent 1070 for low three hundreds. You only have to type 1070 into hotukdeals to see that.
Don't switch between USD and £ prices, or the comparison is meaningless given the exchange rate before and after Brexit.

If you just stick to USD launch prices you can ignore Brexit altogether.

The 1070 was a massive jump in price from the 970. Moreover, the existence of the FE at $449 and incredibly low stock levels on launch meant that, as said, the $379 RRP for AIB 1070s was never actually met. It was a complete joke.

That KFA @ £365, btw, was one of what, 15 1070s in that thread? All the others were >£400. There must have been some special deal between KFA and OcUK on that one.
 
Multiple people in hospital with serious burns, most likely. In all seriousness, the power really ain't a problem: if it were faster than a 1080 Ti and priced below it, people would buy. It's GTX 1080 level, and that card really does sip juice in comparison.

Moar power?
 
The 1070 was a massive jump in price from the 970. Moreover, the existence of the FE at $449 and incredibly low stock levels on launch meant that, as said, the $379 RRP for AIB 1070s was never actually met.

So you are citing other factors affecting the RRP, which yes, obviously would have an effect.

Once supply was good, and before the mining nonsense happened, decent 1070s in the mid-to-low three hundreds were very commonplace and would match the $379 RRP after conversion.

As I said, look on hotukdeals. A good ~£350 (sometimes less) 1070 was easy to get.

The $379 RRP was fully realised before the mining craze.
 
Sounds like AMD are aligning themselves with the consoles.

Now if only Nvidia supported these features, we'd all see some advancement and better games.

Seems like AMD and Nvidia are diverging even more on the technical front. Imagine a game that NEEDS the HBCC to run properly due to using massive amounts of data. It's gonna run like crap on Nvidia without some concessions.

Hmm, maybe HBCC is useful in situations where you don't have enough VRAM for the amount of textures you want to use?
 