With that admittedly verbose background out of the way, let’s dig into Watch Dogs specifically. I’ve been testing it over the weekend on a variety of newer AMD and Nvidia graphics cards, and the results have been simultaneously fascinating and frustrating. It’s evident that Watch Dogs is optimized for Nvidia hardware, but it’s staggering just how un-optimized it is on AMD hardware. I guarantee that when the game gets released, a swarm of upset gamers is going to point fingers at AMD for the sub-par performance. That anger would be misplaced.
I asked Robert Hallock about this specifically, and he explained that AMD had “very limited time with the title and [we've] been able to implement some respectable performance improvements thanks to the skill of our driver engineers. Careful performance analysis with a variety of internal tools have allowed us to profile this title, despite deliberate obfuscation attempts, to improve the experience for users.”
AMD will release a new driver to the public this week that reflects those improvements. (It’s the same driver I conducted my testing with.) Unfortunately, my conversation with Hallock didn’t end with a silver lining: “I am uncertain if we will be able to achieve additional gains due to the unfortunate practices of the Gameworks program,” he remarked...
What you’re seeing in the benchmarks above is a $500 AMD video card (Radeon 290X) struggling to keep up with a $300 one (GTX 770) from Nvidia using one of the lowest levels of anti-aliasing, since Nvidia’s TXAA isn’t available to Radeon users. And the performance deficiencies scale down accordingly. Both of the cards tested were reference boards, and the test system is an Intel Core i7-4770K with 16GB of 1866MHz RAM running Windows 8.1 and this week’s game-ready drivers from both Nvidia and AMD.
To further put this in perspective, AMD’s 290X performs 51% better than Nvidia’s 770 in one of the most demanding PC titles around, Metro: Last Light, which also happens to be an Nvidia-optimized title. As you would expect given their respective prices, AMD’s flagship 290X can and should blow past Nvidia’s 770 and compete with Nvidia’s 780 Ti in most titles. To really drive the point home, my Radeon 290X can hit 60fps in Metro: Last Light with High quality settings and 4x anti-aliasing, at a higher resolution of 1440p. It all points to a poorly optimized game across the board, but one that runs substantially worse on AMD hardware, as Joel Hruska accurately predicted several months ago. (For those wondering, AMD CrossFire and Nvidia SLI scaling are almost nonexistent at this point, especially at higher resolutions like 1440p and 4K.)
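For readers curious how a figure like “51% better” is derived, here’s a minimal sketch of the arithmetic. The fps values below are hypothetical placeholders chosen only to illustrate the calculation; they are not the actual benchmark numbers behind the charts in this article.

```python
# Sketch of how a relative-performance percentage is computed from two
# average-fps measurements. The numbers used here are illustrative only,
# not the article's benchmark data.

def percent_faster(fps_a: float, fps_b: float) -> float:
    """Return how much faster card A is than card B, as a percentage."""
    return (fps_a / fps_b - 1.0) * 100.0

# Hypothetical example: if card A averaged 68 fps and card B averaged
# 45 fps in the same run, card A would come out roughly 51% faster.
print(f"{percent_faster(68, 45):.0f}% faster")  # -> 51% faster
```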