Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
We've had AR for years. You can pick up your phone and use it now.
Unfortunately I do get the feeling that tomorrow might not actually give us all the answers we are wanting, and I don't mean we will hear things that we don't want to.
I expect lots of blurb about the future of gaming, future VR tech, DirectX 12 and other such stuff that most of us here don't really care about right now.
A quick bit about the 300 series with tech details and probably a quick glimpse of the Fury right at the end of the event.
Oh no no no... Hololens is a whole new ball game. That's like saying we've had graphics for years, pick up your Gameboy and be happy.
There's more to it than Minecraft... I'm puzzled why they chose this to show it off though. I never got the whole Minecraft thing. Having read some of the other stuff they're doing, it has amazing potential though. It's not going to compete with VR; they're two very different things. AR doesn't have the immersion in the same way, so its applications will be different.
What if I don't want to play Minecraft on a tabletop?
It seems like they panicked with everyone else getting their own VR headset and greenlit some random lab experiment from 2010.
After watching E3 the only game I really care about is FO4, and seeing as they're STILL using Gamebryo I'm not concerned about VRAM until 2016.
Depending on which armchair expert you believe, HBM2 will either be out this year or at the end of next.
Maybe new drivers!
So given the fact 80% of the "new" cards are rebrands again, I suspect that AMD have avoided releasing drivers so that they can release 8 months' worth of minor improvements in a new driver to brag that the new 300 series cards are 15% faster.
People are still banging on about drivers when there is another thread on Nvidia instability, another TDR issue (which has been going and coming back for well over two years with a huge number of users affected).
Witcher 3, with an 'old' driver, works dandy on all AMD architectures; performance is as expected and generally strong, and single GPU is perfect. Xfire isn't great, but single GPU is perfect.
Nvidia: multiple drivers, the most unstable for huge numbers of users, and entire generations of products drastically underperforming (on all drivers, new and old). SLI scaling, particularly on Kepler, was terrible. People who paid near enough 2 grand for Titan X SLI are complaining that, while performance is lower on a single GPU, the game is smoother and more playable. The instability is present for all types of users: SLI, Kepler, Maxwell.
But AMD have a driver issue because xfire sucks in Witcher 3, even though SLI is highly problematic in Witcher 3 anyway. Despite it being an Nvidia game, it's more stable on AMD drivers released before the game was launched. Damn those old AMD drivers; I'd far prefer to have 3-4 different versions of specific Witcher 3 drivers, all of which have various problems and introduce crashing.
Having spent enough time with both, recently moving to nvidia... I can tell you amd's problems with crossfire go way further than witcher 3, which was indeed appalling.
Every game I have tried has been INSTANTLY noticeably better.
Inquisition, no more flickering.
Advanced warfare, multi gpus actually work! Was told this was a game issue in the amd thread!
Hawken, no more dodgy menu issues and flickering.
Witcher 3, obviously, actually works.
And the game that pushed me over at last, elite dangerous, because after half a year since release, amd have done **** all about it!
Nvidia are not perfect in their support and have some issues. That will always happen with either side.
At least I can feel a bit more sure that nvidia will actually address their problems.
So even though I despise them, they offer the better experience, so amd can **** off.
Have you even used both?
Edit: oh yeah, and that 'displayport link failure' issue I kept having with amd? Gone!
I have no issues with Witcher 3 using my 290X. The game runs flawlessly at full settings. Maybe I was lucky.
Then they'll talk in basic terms about Fury, and in 1-2 weeks you'll have proper reviews.
So when AMD released the 8GB 290Xs in answer to Nvidia's 4GB 980, we were told repeatedly by AMDMatt that 4GB isn't enough (e.g. here: http://forums.overclockers.co.uk/showpost.php?p=27715057&postcount=1582).
Now that AMD only have 4GB on Fury, we all of a sudden find out (from people with 1080p screens it often seems) that 4GB is fine for 4K.
Back when AMDMatt raised the issue it was a big thing and going forward more and more games would need more than 4GB at 4K.
Now that AMD have 4GB on Fury, 4GB will be enough for 4K for the foreseeable future.
I notice most of the red team that are defending 4GB @ 4K now didn't argue with AMDMatt at the time.
So was the 8GB entirely unnecessary and just a money-grabbing manoeuvre by AMD to milk its customers for £150 they didn't need to spend (4GB 290Xs were about £206, 8GB 290Xs were about £360), or is there a need for more than 4GB and AMD and the red team supporters are trying to make excuses?
I think most are looking at the fact that they are entirely different RAM structures and assuming the power is there in the new type, so that 4GB of HBM is enough.
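For a sense of scale on the 4GB-at-4K argument above, here's a minimal back-of-envelope sketch in Python; the bytes-per-pixel and render-target count are my own assumed figures, not anything posted in the thread. The point is that the full-screen buffers themselves only add up to a few hundred MiB at 4K, so the disagreement is really about how much room is left for textures and other assets rather than the framebuffer itself.

```python
# Back-of-envelope VRAM sketch for 4K -- assumed figures, not measured data.
WIDTH, HEIGHT = 3840, 2160      # 4K UHD
BYTES_PER_PIXEL = 4             # assume 32-bit RGBA per render target
MIB = 1024 * 1024

def buffers_mib(count):
    """Approximate VRAM (MiB) used by `count` full-screen render targets."""
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * count / MIB

# Say a game keeps around 10 full-screen targets (colour, depth, G-buffer, post FX).
targets = buffers_mib(10)
print(f"10 full-screen 4K buffers: {targets:.0f} MiB")                      # ~316 MiB
print(f"Left for textures/assets on a 4GB card: {4096 - targets:.0f} MiB")  # ~3780 MiB
```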
I thought they were skipping 20nm because the yields weren't that great. I'm sure I read somewhere that Pascal will be 16nm, and AMD were teasing that their next flagship would be 20nm, aka the Fury, which isn't the case, so why are AMD banging on about 20nm next year?