Funny he refers to FO4, which ran fine on my 680 but has artifacting on the map and in other places on my new 390... which is on the latest drivers.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Here we go AGAIN...
Threads like this are not going to change anything and will only deepen the battle line and the trenches between the Holy Army of AMD and The Dark Nvidia empire.
We all know the start, the middle and the endgame of this thread; we should just skip it and close it.
They spend billions on R&D and years of trials, and await government approval on many medicines; only when they finally have one they can market can they charge a large amount at first, to recoup the costs already incurred.
errrr, that's why.
Not going to happen. If AMD disappear, the market will still dictate what they are prepared to pay and if Nvidia charge £2000 for a base card, they won't be selling any.
The video is not a bad watch. It lays out the evidence pretty well, and it's not just about AMD v Nvidia - it's about Nvidia v Nvidia's older-gen hardware. It's not only AMD users who don't like GameWorks; there are plenty of Nvidia users who are not keen on it either.
People will still buy them, as proven by the people who bought Titan cards.
Interesting...
I'd say Ian is BS'ing....
The thing about GameWorks is this: let's "suppose" it really doesn't "deliberately" cripple AMD hardware. Even so, if at any stage of adding GameWorks features to a game something "unfortunately" breaks along the way, they'd only fix what affects their own hardware, and wouldn't look into the mess they left for AMD - because I don't think they feel they should be responsible for fixing problems for AMD, even though the problem could well have been caused by their own two hands in the first place.

Exactly. We know Nvidia is gimping some games so they run badly on AMD hardware, but the video also shows the older-gen Nvidia cards running quite a bit slower than their AMD counterparts. I know I'd be pretty peeved if my 290X dropped 20% compared to a 970 or similar over time.
I tried to watch the video, but all I picked up on was "something about a kilt", and I'm sure he mentioned porridge a few times.
Running some of these cards IRL and owning a 780, as I've posted before, the 780 benchmark numbers often don't seem to reflect what I see in day-to-day use. (I'm not shy of spending money on GPUs when I feel the need to, so believe me this isn't me being defensive of the 780 - if it wasn't performing to my requirements it would be gone without a second thought.)

Doing a little experimenting the other day, the numbers often seem to match up with what would happen if you pretended the 780's fairly healthy out-of-the-box boost clocks (not overclocking) didn't exist and limited it to the on-paper reference clocks. Which would suit both AMD and nVidia :S

However - Maxwell does have some changes to the shader architecture that give it up to a 35% increase in efficiency over Kepler in an otherwise like-for-like scenario - so there will be some games (especially newer releases) where that translates to a 10-15% bonus in-game that can't be made up even with any amount of overclocking advantage to Kepler, due to the nature of the bottleneck.

As for the video, I'm not even going to watch it - is it really anything other than yet another sensationalist, poorly understood attack on GameWorks by someone lacking actual technical insight? Because they are getting really boring.
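For anyone wondering how much of the gap reference-clocked benchmarks could account for, here's a rough back-of-the-envelope in Python. The clock figures are assumptions for illustration (the 780's advertised boost is around 900 MHz, while many cards sustain roughly 1000 MHz out of the box), and it assumes performance scales roughly linearly with core clock, which is only approximately true:

```python
# Back-of-the-envelope: how much benchmark gap could come from testing a
# GTX 780 at its advertised boost clock instead of its real-world boost?
# Clock values below are illustrative assumptions, not measurements.

reference_boost_mhz = 900    # advertised boost clock (approx.)
typical_boost_mhz = 1000     # typical sustained out-of-the-box boost (approx.)

# Assume performance scales roughly linearly with core clock.
scaling = typical_boost_mhz / reference_boost_mhz
uplift_pct = (scaling - 1) * 100

print(f"Uplift hidden by reference-clock testing: ~{uplift_pct:.1f}%")
```

On those numbers that's about an 11% swing, which is in the same ballpark as the discrepancies being discussed, though it obviously doesn't account for the architectural efficiency gap mentioned above.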
Lol! No mention of Irn-Bru?
Your opinion counts for nothing if you haven't even bothered to watch the video this thread is about.