Possible Radeon 390X / 390 and 380X Spec / Benchmark (do not hotlink images!!!!!!)

Unfortunately I do get the feeling that tomorrow might not actually give us all the answers we are wanting, and I don't mean we will hear things that we don't want to.

I expect lots of blurb about the future of gaming, future VR tech, DirectX 12 and other such stuff that most of us here don't really care about right now.

A quick bit about the 300 series with tech details, and probably a quick glimpse of the Fury right at the end of the event.

Based on the timing and the supposed lack of reviewer cards, it sounds like it will be much like the Titan X 'launch', or the first time we heard about Mantle. As with most semi gamer-oriented events: a bunch of game demos, some fairly basic slides about upcoming hardware, a launch and maybe talk of availability for some of the lower-end cards, talk about DX12, talk about the features the cards support. Then a basic talk about Fury, and 1-2 weeks later you have proper reviews.

99% of gamers don't know the first thing about their graphics cards, and most people who go to E3 or an Apple event get told the basics, not the in-depth stuff. At a very tech-press-oriented event (less public, less gamer, more industry) or a developer forum you might get a very in-depth talk about products, but I don't expect much beyond how Nvidia first mentioned Titan X, Titan Z, or most of their products: a few slides, a bit of talking it up, but not a whole lot of real architecture or in-depth detail. Once it's talked about openly, though, the reviews and proper information will be 1-2 weeks away.
 
Oh no no no... HoloLens is a whole new ball game. That's like saying we've had graphics for years, so pick up your Game Boy and be happy.

What if I don't want to play Minecraft on a tabletop?

It seems like they panicked with everyone else getting their own VR headset and greenlit some random lab experiment from 2010.
 
It's a shame that Charlie doesn't realise that kaap always buys four of whatever the top-tier cards are from BOTH vendors; he's pretty well clued up on what problems both sides have.
 
There's more to it than Minecraft... though I'm puzzled why they chose that to show it off. I never got the whole Minecraft thing. Having read about some of the other stuff they're doing, though, it has amazing potential. It's not going to compete with VR; they're two very different things. AR doesn't have the immersion in the same way, so its applications will be different.

Back on to the Fury though, I wonder if AMD are being very smart here by keeping everything under wraps, letting everyone moan and slam them for the rebrands... then they'll amaze and saturate us all with the magic of Fury and all that other stuff will be forgotten.
 
After watching E3, the only game I really care about is FO4, and seeing as they're STILL using Gamebryo I'm not concerned about VRAM until 2016.

Depending on which armchair expert you believe, HBM2 will either be out this year or at the end of next.

(armchair expert) Love it :D
 
In two months' time there will be Hot Chips, so if they don't talk about the interposer at E3 they damn sure will then.

I'm hoping they will talk about near-future APU uses; just imagine a CPU and a 400mm² GPU on a silicon interposer with HBM. I also want to know exactly how big they can make them, and the dual-GPU implications.
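As a rough sense of scale (every figure below is an illustrative assumption, not a confirmed spec), a quick area budget shows how tight a big APU on one interposer would be, since interposers are still bounded by the lithography reticle:

Code:
    # Rough area budget for a hypothetical CPU + GPU + HBM interposer.
    # Every figure here is an assumption for illustration only.
    reticle_limit_mm2 = 830        # approximate single-exposure reticle limit
    gpu_mm2 = 400                  # the 400mm2 GPU mentioned above
    cpu_mm2 = 150                  # assumed CPU die
    hbm_stack_mm2 = 40             # assumed footprint per HBM stack
    hbm_stacks = 4

    used = gpu_mm2 + cpu_mm2 + hbm_stacks * hbm_stack_mm2
    print(f"dies + stacks: {used} of ~{reticle_limit_mm2} mm2")
    # -> 710 of ~830 mm2 before any die spacing or routing margin, so a
    # single-reticle interposer is already tight; a dual-GPU version
    # would need reticle stitching or a different packaging approach.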
 
So given that 80% of the "new" cards are rebrands again, I suspect AMD have avoided releasing drivers so that they can roll 8 months' worth of minor improvements into one new driver and brag that the new 300 series cards are 15% faster.

But wouldn't that mean that if they release a driver which puts the 390X ahead of the 290X, they must in fact be nerfing the 290X's performance?

If they can boost the 390X, why not the 290X?
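For what it's worth, the rumoured clocks alone could explain most of a gap like that without any driver games (treating the leaked 390X figures as assumptions):

Code:
    # How much of a 390X-over-290X gap could come from clocks alone?
    # The 390X clocks below are the rumoured ones, taken as assumptions.
    core_290x, core_390x = 1000, 1050    # MHz
    mem_290x, mem_390x = 5000, 6000      # MT/s effective GDDR5

    core_gain = core_390x / core_290x - 1
    mem_gain = mem_390x / mem_290x - 1
    print(f"core +{core_gain:.0%}, memory bandwidth +{mem_gain:.0%}")
    # -> core +5%, memory bandwidth +20%. A 5% core bump plus 20% more
    # bandwidth could plausibly average out to a ~10-15% gain, and none
    # of it would transfer to a 290X left at stock clocks.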
 
People are still banging on about drivers when there is another thread on Nvidia instability, another TDR issue (one that has been coming and going for well over two years, with a huge number of users affected).

Witcher 3, with an 'old' driver, works fine on all AMD architectures; performance is as expected and generally strong. Crossfire isn't great, but single GPU is perfect.

Nvidia: multiple drivers, most of them unstable for huge numbers of users, and entire generations of products drastically underperforming (on all drivers, new and old). SLI scaling, particularly on Kepler, was terrible. People who paid near enough 2 grand for Titan X SLI are reporting that, while single-GPU performance is lower, the game is smoother and more playable that way. The instability is present for all types of users: SLI, Kepler, Maxwell.

But AMD have a driver issue because Crossfire sucks in Witcher 3, even though SLI is highly problematic in Witcher 3 anyway. Despite it being an Nvidia game, it's more stable on AMD drivers released before the game launched. Damn those old AMD drivers; I'd far prefer to have 3-4 different Witcher 3-specific drivers, all of which have various problems and introduce crashing.

Having spent enough time with both, and recently moving to Nvidia... I can tell you AMD's problems with Crossfire go way further than Witcher 3, which was indeed appalling.

Every game I have tried has been INSTANTLY noticeably better.

Inquisition: no more flickering.

Advanced Warfare: multi-GPU actually works! I was told this was a game issue in the AMD thread!

Hawken: no more dodgy menu issues and flickering.

Witcher 3: obviously, it actually works.

And the game that finally pushed me over, Elite Dangerous, because half a year after release AMD have done **** all about it!

Nvidia are not perfect in their support and have some issues; that will always happen with either side.

At least I can feel a bit more sure that Nvidia will actually address their problems.

So even though I despise them, they offer the better experience, so AMD can **** off.

Have you even used both?

Edit: oh yeah, and that 'DisplayPort link failure' issue I kept having with AMD? Gone!
 
Advanced Warfare: multi-GPU actually works! I was told this was a game issue in the AMD thread!

I'm pretty sure they work with the developer (especially on COD, of all things), so "game issue" seems to mean "we can't fix it on our own, so expect a longer lead time". It doesn't mean it will never be fixed.
 
I have no issues with Witcher 3 using my 290X. The game runs flawlessly at full settings. Maybe I was lucky.
 
Then a basic talk about Fury, and 1-2 weeks later you have proper reviews.

This would be a bit of a disappointment, but I feel there will be a bit more meat on the bones tomorrow. I'm in the position of wanting a new GPU and a new G-Sync or FreeSync monitor. I would far rather buy a Fury at £550 and a 1440p 27" FreeSync monitor for £420 or £480 than spend £620 on a properly cooled 980 Ti and £600 for the privilege of owning an equivalent G-Sync monitor.

Fortunately I am blessed with patience, but the excitement generated by the 980 Ti shows that a lot of people aren't. If AMD's presentation is all 'jam tomorrow', it's going to cost them a lot of sales. And every G-Sync monitor sold is a future sale lost for AMD as well.
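Totting up the two paths with the prices quoted above:

Code:
    # Total cost of the two upgrade paths, using the prices quoted above.
    amd_low = 550 + 420      # Fury + cheaper 1440p FreeSync monitor
    amd_high = 550 + 480     # Fury + dearer FreeSync monitor
    nvidia = 620 + 600       # 980 Ti + equivalent G-Sync monitor
    print(amd_low, amd_high, nvidia)
    # -> 970 1030 1220, i.e. a saving of roughly £190-250 by going the
    # Fury + FreeSync route, if Fury actually lands at £550.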
 
So when AMD released the 8GB 290Xs in answer to Nvidia's 4GB 980, we were told repeatedly by AMDMatt that 4GB isn't enough (e.g. here: http://forums.overclockers.co.uk/showpost.php?p=27715057&postcount=1582).

Now that AMD only have 4GB on Fury, we all of a sudden find out (often from people with 1080p screens, it seems) that 4GB is fine for 4K.

Back when AMDMatt raised the issue it was a big thing, and going forward more and more games would need more than 4GB at 4K. Now that AMD have 4GB on Fury, 4GB will apparently be enough for 4K for the foreseeable future.

I notice most of the red team now defending 4GB @ 4K didn't argue with AMDMatt at the time.

So was the 8GB entirely unnecessary, just a money-grabbing manoeuvre by AMD to milk its customers for £150 they didn't need to spend (4GB 290Xs were about £206, 8GB 290Xs about £360)? Or is there a real need for more than 4GB, and AMD and the red team supporters are making excuses?
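One bit of arithmetic worth keeping separate from the point-scoring: the fixed-cost render targets at 4K are tiny, and it's textures and engine caching that actually fill VRAM. A quick back-of-envelope (the buffer format is an assumption):

Code:
    # Rough size of the fixed buffers at 3840x2160; format is assumed.
    w, h = 3840, 2160
    bytes_per_pixel = 4                     # e.g. an RGBA8 colour buffer
    buffer_mib = w * h * bytes_per_pixel / 2**20
    print(f"one colour buffer: {buffer_mib:.1f} MiB")
    # -> ~31.6 MiB. Even with triple buffering, depth and several
    # G-buffer targets you are only in the hundreds of MiB; the other
    # 3+ GB is textures, geometry and caching, which is why "is 4GB
    # enough at 4K" depends on the game's assets, not the resolution.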
 
Understand, as it has been said so often in this thread: 4GB of HBM is not 4GB of GDDR5. Come tomorrow we might find the difference is minimal, but we might also find it is a huge leap forward. The point is, your comment smacks of trying to draw a negative reaction by finger-pointing when there are no solid facts on which to base your comparison.
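Bandwidth is the one place where the "HBM is different" argument has hard numbers behind it. Using the published first-generation HBM spec and the 290X's stock memory configuration:

Code:
    # Peak bandwidth: 290X-style GDDR5 vs four stacks of first-gen HBM.
    gddr5_bus_bits, gddr5_mts = 512, 5000
    gddr5_gbps = gddr5_bus_bits / 8 * gddr5_mts / 1000         # GB/s

    hbm_stacks, hbm_bus_bits, hbm_mts = 4, 1024, 1000
    hbm_gbps = hbm_stacks * hbm_bus_bits / 8 * hbm_mts / 1000  # GB/s
    print(f"GDDR5: {gddr5_gbps:.0f} GB/s, HBM: {hbm_gbps:.0f} GB/s")
    # -> GDDR5: 320 GB/s, HBM: 512 GB/s. 60% more bandwidth per GB of
    # capacity; whether that makes 4GB "behave like" more than 4GB is
    # exactly the open question, since capacity is still capacity.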
 
I think most are looking at the fact that they are entirely different RAM structures and assuming the power is there in the new type, so that 4GB of HBM is enough.

Of course, no one knows how HBM is going to work out, so most sensible people aren't criticising or praising it yet.
 
I thought they were skipping 20nm because the yields weren't that great. I'm sure I read somewhere that Pascal will be 16nm, and AMD were teasing that their next flagship would be 20nm, aka the Fury, which isn't the case. So why are AMD banging on about 20nm next year?

AMD next year will be on Samsung/GloFo 14nm. But you are right: even last November they said they had many things planned on 20nm... I can imagine they really did plan a 20nm line, so the bad yields could hurt them badly if that's true.
 