
Radeon VII

You mean like when gamers didn't need the 4GB of the 290X and the GTX780's 3GB was more than enough, or when gamers didn't need the 3GB of the HD7970 and the GTX680's 2GB was enough?

For reference in both those examples the cards with more VRAM lasted their owners longer.

And now 6GB is borderline. So soon the 1060s will be crippled or stuck on low detail settings while the 8GB 480s/580s/590s cruise on.

I'm betting Doom Eternal will eat more than 6GB, seeing as the 2016 one wanted 5GB for the highest texture settings. In which case, bad times for anyone who ordered a 2060.
 
I didn't see this coming. The price, and how it translates for us in the UK, will be the key factor.

Makes me wonder why they disabled some of the cores; if it's mainly a die shrink with added memory, I would have expected it to keep the same core count.

This looks more like what you'd expect from a Vega VII Pro.

Maybe a faster model will appear once yields improve?
 
TODAY'S NEWS: 700 clams for a hot 1080 Ti

The RTX 2080 was more expensive than a 1080 Ti on release day, with similar performance and less VRAM. I agree prices are a joke, but I keep repeating myself: the prices are a joke because Nvidia had no competition. AMD being competitive drops prices.
 
No one said it wasn't faster than Vega 64. You should perhaps take a month off and go back and read the posts.

The only comparison made was with a two-year-old 1080 Ti.

That's a laugh coming from you, Mr "fan of technology" (as long as it's green). The only things a 2080 has over this are DLSS and ray tracing, both features that need developer support to implement, and right now we have a grand total of ONE game supporting them. It's a step up no matter which way you try to slice it.
 
"For gamers, AMD Radeon VII enables maximum settings for extreme framerates at the highest resolutions."*

*by highest we mean 1440p
 
It needs a 20-series card though? Or will Pascal be able to use it?

DLSS is just supersampling, except it does it only to selected objects. When they say "increase performance", it's only an increase versus regular full-scene supersampling. But if you're playing at 1440p or 4K it's not going to be that useful.

Seeing through the marketing fluff, it's more of a workaround to make RT run less badly. Once tech moves on it will likely vanish.
 
Posted today. How wrong you were, once again. I bet if I could be bothered digging through threads, it was you who claimed AMD wouldn't release 7nm Vega for gaming.

I wouldn't pay too much attention to D.P.; he has a serious anti-AMD agenda to maintain on these forums, casting doubt on anything associated with them and their products, to the point of even claiming TSMC's 7nm node is 'rough'. He should get a job in Intel's or Nvidia's marketing dept. :D
 
I take it you missed the demo showing it playing at 4K? *looks at sig* Oh, now it makes sense.
If you consider 60fps AVERAGE at 4K in the latest games, sure, suit yourself. If that's what "extreme frame rates" means to you, all the more power to you, buddy. It doesn't change the fact that their marketing message is objectively wrong!
 
DLSS is just supersampling, except it does it only to selected objects. When they say "increase performance", it's only an increase versus regular full-scene supersampling. But if you're playing at 1440p or 4K it's not going to be that useful.

Seeing through the marketing fluff, it's more of a workaround to make RT run less badly. Once tech moves on it will likely vanish.

Cheers. I remember reading something a few days back about DLSS helping claw performance back when used alongside RTX, but I don't recall anything about whether DLSS was a 20-series-only feature. It didn't really interest me until the adaptive sync support announcement, but that's now tempting me to try RTX out.
 
If you consider 60fps AVERAGE at 4K in the latest games, sure, suit yourself. If that's what "extreme frame rates" means to you, all the more power to you, buddy. It doesn't change the fact that their marketing message is objectively wrong!

Yeah, I forgot the card has been reviewed already... oh wait. How about we wait till it's out before you post your little spiel about frame rates at 1440p?
 
If you consider 60fps AVERAGE at 4K in the latest games, sure, suit yourself. If that's what "extreme frame rates" means to you, all the more power to you, buddy. It doesn't change the fact that their marketing message is objectively wrong!

Well, Nvidia fans were singing the praises of the RTX cards doing 40fps at 1080p, so I don't see your point at all, unless it's just to badmouth AMD to fit whatever agenda you have going on here :P
 
You mean like when gamers didn't need the 4GB of the 290X and the GTX780's 3GB was more than enough, or when gamers didn't need the 3GB of the HD7970 and the GTX680's 2GB was enough?

For reference in both those examples the cards with more VRAM lasted their owners longer.
Oh come on!! You know that's not the same. 16GB is a huge amount of memory that even 4K is unlikely to get anywhere near in the next few years.
 