Radeon VII

Also, I wonder if they have announced anything about PCIe support.
IMO, real-world performance of this card will match or exceed the 2080 Ti / Titan in some titles, namely those which do, or can, lean very heavily on compute-based rendering.

Vega 64 is pretty much neck and neck with a 2080 in PUBG and Blackout once you fix the voltages, and both of those titles do the above. I also doubt it'll be behind in id Tech games.

Also, remember that compute-based rendering is the direction NVIDIA-sponsored games are taking now, since the 20xx-series cards finally caught up with AMD on compute and left NVIDIA's own older cards in the dust.

If some samples can really OC from the stock 1.8GHz to 2.4GHz, there's going to be a huge secondary market for high-clocking samples. Performance ought to be higher than anything else out there.
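
As a rough sanity check on how much a 2.4GHz sample would actually buy, here's a quick back-of-envelope calculation. It's a sketch only: it assumes the Radeon VII's 3840 stream processors and naive linear scaling with clock, which real games won't show because of memory-bandwidth and power limits.

```python
# Naive peak FP32 throughput estimate: 2 FLOPs (one FMA) per stream processor
# per clock. The 3840 stream-processor count is the Radeon VII spec; real-game
# performance will not scale linearly with clock, so treat these as rough
# upper bounds only.

STREAM_PROCESSORS = 3840

def peak_tflops(clock_ghz: float) -> float:
    """Theoretical peak FP32 TFLOPS at a given core clock."""
    return 2 * STREAM_PROCESSORS * clock_ghz * 1e9 / 1e12

for clock_ghz in (1.8, 2.4):
    print(f"{clock_ghz:.1f} GHz -> ~{peak_tflops(clock_ghz):.1f} TFLOPS peak FP32")
# 1.8 GHz -> ~13.8 TFLOPS; 2.4 GHz -> ~18.4 TFLOPS (if such a clock were ever reachable)
```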

Let's hope they have serious volume... though I fear, again, HBM2 / interposers will limit supply. If they do have volume, then NVIDIA will have no choice but to make large price cuts.

Unfortunately in the real world it does not work like that.

For compute work the Titan V is a very long way ahead of any other gaming card, but this does not help it very much in the situation you describe above. :)
 
Has there been anything said about power consumption? Is it likely to be much lower, given how much juice the original Vega cards used out of the box?


No, 300W, potentially slightly more.


Ah, that's a shame :( I was hoping that, given the clock speeds aren't that much higher and there are fewer compute units, we'd see less power draw on 7nm.


The 7nm process offers roughly 25% more performance at the same power consumption. AMD spent that headroom on higher performance rather than on lowering power usage.
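
To make that trade-off concrete, here's a rough back-of-envelope sketch. All the baseline numbers are placeholder assumptions, and real silicon doesn't scale this cleanly, but it shows the two ways a ~25% node gain could have been spent:

```python
# Illustrative only: two ways to spend a ~25% process-node gain.
# Baseline figures are placeholder assumptions, not official specs.

baseline_power_w = 295     # assumed previous-gen (Vega 64-class) board power
baseline_clock_mhz = 1550  # assumed previous-gen typical boost clock
node_gain = 1.25           # "~25% more performance at the same power"

# Option A (roughly what AMD chose): keep the ~300W budget and raise clocks.
clock_iso_power_mhz = baseline_clock_mhz * node_gain

# Option B: keep the old clocks and take the gain as lower power instead.
# To first order, dynamic power scales roughly linearly with frequency at a
# fixed voltage, so the same gain could cut power by ~1/1.25.
power_iso_perf_w = baseline_power_w / node_gain

print(f"Option A: ~{clock_iso_power_mhz:.0f} MHz at ~{baseline_power_w} W")
print(f"Option B: ~{baseline_clock_mhz} MHz at ~{power_iso_perf_w:.0f} W")
```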
 
The chips are old stock and can only run with 16GB at the lowest. I imagine they have excess HBM2 supplies and this is why they want to shift them. Either that, or the chip will only run with HBM2.


I doubt they have excess supplies. I think it is more likely that the binning is done after the HBM and GPU are mounted to the interposer; at that point there is no going back and removing a stack of memory.
 
The only saving grace is that it's worth buying over the 2080... it's just a shame it's the same performance, just with more RAM.

I would have preferred less RAM, say 12GB, and a retail price of £499; that would then make the 2080 obsolete.

Is it though? I am not sure what person with £700 burning a hole in their pocket would buy one over a 2080, other than die-hard AMD fans or those who just have an axe to grind with Nvidia for some reason. There are no RTX features, which, however useless they might be for now, at least stand a chance of being leveraged in a game you play down the line. The extra 8GB of RAM won't come into play for years. AMD have also lost their FreeSync advantage now, so that's something else to consider.


So this card has its place... and now all they need to do is improve the GPU architecture with Navi; they have already worked out the memory bandwidth constraints/limitations with HBM2.

Then, for cut-down versions, release one with GDDR6.

Point is... this card has its place, definitely.

You are right it has a place, but that place is somewhat niche. Unless they do something about the price (impossible it seems thanks to HBM2 cost), mainstream gamers are not going to be buying this card in their droves, no more than they have the 2080. The 'cut down' 8GB GDDR6 version you mentioned, priced right, would have been the card to do that. But this won't happen. Re-tooling and new memory controller etc. would make it far too costly... and such a card would by necessity have to come in cheap.


Lol. So true.

The PS5 will get a lot of PC gamers' attention I think (I know I am getting one for sure). With 120Hz VRR OLED TVs with proper HDR it will be an easy choice. Nvidia had better price their 3000 series much more competitively or they will see sales plummet.

100% agree there. Next-gen consoles, although well over a year away yet, are potentially going to give PCs a real run for their money if pricing doesn't stop running away with itself. Things can't continue like this forever or before long we'll have a four-figure starting price for a mid-range GPU!
 
Maybe that's the master plan (tin foil hat on): price everyone into consoles for gaming, especially with the whole keyboard-and-mouse thing going on for consoles now and in the future (tin foil hat off).
 
As someone pointed out earlier, this IS just an AMD 1080 Ti, so they are not just late to the latest-gen party, they are only now arriving at the last one. They are now behind Nvidia on speed AND features (and power draw), not just on power like with the 2017 Vega release.

My advice would be for them to make an open-source, ray-tracing-cored, AI-cored, PhysX-cored 2080 Ti beater, or just give up at this point. Nvidia have won, and the gap will just get bigger and bigger until Intel have a pop at it; if they fail, then it's all over.

P.S. I love AMD, just seeing the cold hard facts.
 
Intel drivers on release would be crap
 
Then I would strongly encourage you to give Lisa Su a ring with a copy of your CV, your degrees in semiconductor design, and your fortune-telling abilities, because clearly it's as easy as you say :rolleyes:
 
Or just stream it to your browser. The Project Stream demo was quite impressive.
Never going to happen for high-end gamers though, for two reasons:
1: It's only 1080p.
2: The latency will never be good enough for e-sports or fast FPS shooters.

I can see it being a replacement for low-end cards and for people with no money for a gaming rig or console (playing on mum's laptop/tablet or granny's desktop).
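
To put some rough numbers on the latency point, here's an illustrative sketch. Every component timing below is an assumption chosen for the sake of argument, not a measurement of Project Stream or any other service:

```python
# Illustrative input-to-photon latency budget: local rendering vs streaming.
# All component timings are assumptions for the sake of argument.

FRAME_MS = 1000 / 60  # ~16.7 ms per frame at 60 fps

local_ms = {
    "input + game + render": 2 * FRAME_MS,  # a couple of frames of pipeline
    "display": 5,
}

streamed_ms = {
    "input + game + render": 2 * FRAME_MS,
    "video encode": 8,         # server-side encode
    "network round trip": 30,  # home broadband to a nearby datacentre
    "video decode": 5,
    "display": 5,
}

print(f"local:    ~{sum(local_ms.values()):.0f} ms")
print(f"streamed: ~{sum(streamed_ms.values()):.0f} ms")
# The extra ~40-50 ms is several whole frames at 60 fps, which is why streaming
# is a hard sell for e-sports and fast shooters even if image quality improves.
```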
 
I'm not saying it's easy; I'm saying that if they can't compete with Nvidia on feature set, design and raw power, then they will never catch up, now or ever. They are effectively two generations behind now instead of one, even though they launched a "new" product.

Something magical must have happened at Nvidia between the 980 Ti and the 1080 Ti, because the AMD Fury X was on par with a 980 Ti and then it just fell apart for AMD after that.
 
It's going to be hard for them. I've no idea what they have in store with Navi, but by the time that sees the light of day, Nvidia will have their 7nm products on the horizon. It's a real shame. Despite only having 1080Ti power, there is absolutely a place in the market for a card which can perform to that level... but not at the £700 price point. Circa £500 and people would really be paying attention right now. That was the only move they could have made to win back major support near the top end, but it seems being locked in to HBM2 didn't give them any other choice. An 8GB HBM2 card wouldn't be much cheaper I suspect... it would have had to be GDDR6, but for reasons already discussed, that would have just been too costly for them to make.

I guess it's up to trusty old Intel now... we can surely rely on them to put out a top-end-performance, amazing-value GPU. :p

Ultimately, we do have Nvidia somewhat to blame for all this. Had they brought in Turing at more sensible prices... a 2080 around the £500-550 mark, and the 2080Ti around £700-750, then AMD wouldn't have even bothered bringing the Radeon VII to market. Or they would have had to drastically re-engineer things to make it happen at the right price.
 
I think it will do OK against the 2080, but that is not the point.

In six or nine months' time, RTX and DLSS will be more commonplace in games, which will give the 2080 the edge unless AMD cut their prices.

In six or nine months we'll probably have two more RTX games, if we're lucky xD

RTX is not a selling point for me at all; I think second-gen RTX will be more compelling. The whole RTX thing is very underwhelming, from the implementation (or lack of it) to the extra price, etc.

I needed a replacement for a dead Vega, and a 1080 Ti was more expensive than an RTX 2080, so the RTX 2080 was my best option (the 2080 Ti failure rate scares me).

The RTX 2080 is a beast of a card; it seems even faster than my old 1080 Ti. If AMD's card can compete and comes with decent games, I think it's a massive positive for AMD.

Honestly, for me at least, 1080 Ti / RTX 2080 level performance is enough. A future with 4K games running with RTX at max will be welcome, though. At the moment ray tracing just isn't there yet, nor are the cards to run it.
 
Give the new Port Royal benchmark a go to see Ray Tracing in action.
 
To be honest this feels like a PR product, nothing more.

The RX 590 felt like something to fulfil GloFo's wafer agreement and to give AIB partners something new to sell over Christmas, with a nine-month shelf life. Now, with the raster performance of RTX not being much higher than GTX 10-series, and ray-tracing performance utterly woeful, the Radeon VII cannibalises MI50 sales just to say "look, we can match Nvidia at the top end, honest", and at a price point that doesn't really offer much incentive to actually buy a Radeon VII over the RTX 2080.

I would've much preferred AMD stick their chest out and defiantly say "no, we have a plan and we're sticking to it. We're taking back the midrange with Navi. We're not chasing Nvidia's halo products right now. We're getting our GPU house in order and then we'll talk Arcturus in 2020".

Very disappointed in a $700 PR stunt.

It also brings into question that hint one of AdoredTV's sources dropped when they said they'd seen a "Big Navi" running and it was impressive. Given the RX 3080 as leaked is still a chunk of performance away from Radeon VII, could it be the "Big Navi" was actually just Vega 20 after all?

Roll on Computex when I think we'll see some Navi talk. I'm still excited at the prospect of Vega 64 + 15% for £250.
If this were Big Navi then it wouldn't have been described as impressive. I thought AdoredTV said Navi in general was impressive, not Big Navi in particular. We don't really know when a true high-end Navi will come out, although at this rate its competitor will be the 3080 Ti.
 