The Witcher 3 Benchmarks

A year ago a 780 Ti was as fast as a 980; today you need two 780 Tis to match it. A single 780 Ti cost MORE than BOTH my 290 Pros... yet as a 780 Ti owner you'll find Nvidia actively refusing to optimise drivers for you.

What a non-argument. For one thing, Witcher 3 runs great for me; I don't care about waiting for a proper driver. Look in the Nvidia driver thread created today: the latest Nvidia driver, the one they rushed out for Witcher 3, is giving people lots of problems. I'm soooo sad that I'm sat here playing Witcher 3 perfectly fine, just without crossfire, waiting on a stable driver that fixes the problems, while the Nvidia guys can do the same... or install the rushed-out driver that people are having problems with.

I bought my second 290 knowing that crossfire (and SLI) don't always have perfect support for brand-new games; I don't care about it at all. Anyone buying a second graphics card from either company thinking it will work perfectly in every game is ignorant, as is anyone who claims SLI works flawlessly in every game at launch.

The difference is, had I spent that £600+ on a single-GPU 780 Ti, which I wouldn't have expected any problems with... I'd have been left high and dry. Paying £600 for a card with supposedly magically brilliant support, on which performance optimisation ceases immediately as soon as a new-generation card is released.

Yeah, a non-argument; no lack of support on the Nvidia side at all.

Again, I'm not claiming and didn't claim perfect support from AMD; I'm laughing at the ignorance of those Nvidia guys who claim perfect support from Nvidia... because it's laughable. This isn't the first time last-gen products have consistently 'lost performance' after a new generation was launched, and it won't be the last.

The other day I overclocked my 780 to around 1250 MHz on the core, left the RAM at stock, and managed a 12k GPU score in Firestrike. That's the same as the 980 G1 Gaming according to Guru3D, despite that card being paired with a more powerful CPU. Before anyone jumps on the hate train here: yes, I know you can overclock the 980 even further, but what makes no sense to me is that I can't get half the performance the 980 gets in Witcher 3. For a game that isn't that good looking, I'm surprised it tanks performance the way it does on last-gen Nvidia. Then again, I'm not really; after all, it's a GameWorks title.

I don't know who I want to point fingers at here, the devs or Nvidia. I guess the devs, since they choose what to put in the game after all. They don't have to make use of GameWorks features, as there are plenty of other vendor-neutral options out there.

Not a single crash for me with the latest drivers, and I spent the whole day and night yesterday playing TW3. I'm sure there are people with problems though, on both sides. There's no real need for the damning posts DM has made in the last couple of threads, and I hope everyone, from AMD to Nvidia and even the consoles, can play this superb game without issues.

I agree with you. Before we go and slaughter Nvidia for this mess, we should give them some time to fix their driver bug, because it has to be a driver bug looking at those performance numbers. I hope, for their sake, that it's a driver bug :P
 
The driver doesn't have issues for everyone, and at least they attempted it; they'll probably have another out by the time AMD release theirs. It's not exactly difficult to roll back to the previous driver to play the game if you do have issues anyway.

If it's not just one game, then link me some more benchmarks where a 980 is as good as or beating 780 Ti SLI.

Ah, so a driver that doesn't work and causes problems is better than no driver, okay. Hardly an issue to roll back... or they could just take the time to make a less buggy driver, particularly as this is a TWIMTBP game, so they have clearly had more involvement and time to work on it, even more so with it being a GameWorks game.

As for the latter, if you can show me where I claimed the 980 was as good as or beating 780 Ti SLI in other games, I'll find some benchmarks to show it... because I'm sure if I had claimed that, it would be true. I didn't claim that, however; it may be true, it may not be.

If the 780 Ti was on average 5-10% behind a 980 at launch, and is on average anything from 20-50% behind now, then regardless of exactly which number it is... it isn't right. With almost every passing driver set and every new game, more and more people notice and remark on the growing disparity between Maxwell and Kepler performance. Some growth would be expected; a massive change wouldn't be.
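To make those percentages concrete, here is a quick sketch using purely hypothetical frame rates (not benchmark data); it only shows what being 5-10% behind versus 20-50% behind would mean in practice:

```python
# Hypothetical illustration only, not real benchmark numbers: what being
# "X% behind" a GTX 980 would mean if the 980 rendered a scene at 60 fps.

def fps_behind(reference_fps: float, percent_behind: float) -> float:
    """Frame rate that is `percent_behind` percent slower than the reference."""
    return reference_fps * (1 - percent_behind / 100)

reference = 60.0  # assumed GTX 980 frame rate for the example
for gap in (5, 10, 20, 50):
    print(f"{gap:>2}% behind a {reference:.0f} fps 980 -> {fps_behind(reference, gap):.0f} fps")
```

In other words, against the same 60 fps reference, the gap being described is the same card dropping from roughly 54-57 fps at launch to somewhere between 30 and 48 fps now.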
 
You have people in this very thread saying AMD don't have a driver for it, what a joke... yet the driver Nvidia have put out has issues and hasn't improved performance for the 780 Ti.

It also isn't remotely just one game; the ongoing, obvious disparity in Nvidia's last-gen products has been commented on by many people in many games since the 980 launch. This isn't the first game in which the increasing performance gap has been noticed, and it certainly won't be the last.

Didn't see any mention in there of Maxwell having better tessellation performance than Kepler. Maybe you forgot to mention that?


Sorry, but in the real world not giving access to source code is an everyday affair, as is putting people under confidentiality agreements. You do not need the source code to optimise for the game, as should be evident even to the less informed among you, seeing as you've already pointed out that developers did not even have access to it prior to changes in the program. CD Projekt Red cannot optimise for AMD hardware without AMD.

Seemingly, as someone has shown on Reddit, a simple user-level driver setting tweak has greatly improved the game's HairWorks/tessellation performance.

All these things speak volumes - the game is out, and there is no AMD driver yet.

AMD want you to hate GameWorks. NVIDIA is a huge conglomerate with numerous shader technologies, which it continues to develop. Why should they hand these out for free just because AMD say they should?

It is up to AMD to:

Push their technologies onto developers - there is no licensing that prohibits separate libraries from separate vendors coexisting in the same game

Buy a licence (unless NVIDIA is prohibiting them from doing so; frankly, I think AMD would be the first to tell you if that were the case)

Provide a similar library, if they don't want developers just using NVIDIA's toolsets

Lastly, stop pointing the blame elsewhere for optimisation of their own products. They are whining at you (their consumers) and doing absolutely nothing about it other than stirring things up through the media.

You can either put up a banner in protest at developers using vendor libraries and insist they create in-engine technologies in-house - which, by the way, is more often than not financially unviable, hence why GameWorks even exists - and end up having none of these technologies in games at all.

Or you can tell AMD to stand up and offer developers the same, or better, toolsets.



End of.
 
So you have a go at others for assuming things, then you do the same yourself by assuming devs don't have access to GameWorks code...

I didn't have a go at someone for assuming 'things'; I said someone was assuming they had access to the source code, which is a specific assumption. It was also because of the points I made: like you, they believed Nvidia changed things so that GameWorks users definitely have the source code, when in fact there are two options.

Likewise, the dev has stated they can't optimise for AMD, and that is usually because they do NOT have the source code. I said his assumption was likely incorrect based on the things we do know. Like, for instance, that Nvidia offer the non-source-code GameWorks library cheaper than access to the source code. We know Nvidia didn't want to tell anyone how much source code access cost, we know Nvidia only added a source code option WAY down the line after they received a lot of public criticism, and we know the dev has stated they can't optimise.

Assuming something that goes against everything known is silly; assuming something the facts generally tend to agree with is not.
 
You're not assuming things based on facts, though; you're assuming things based on more assumptions.

You don't know whether the devs have to pay for the GameWorks code or not, for a start.
 
Didn't see any mention in there of Maxwell having better tessellation performance than Kepler. Maybe you forgot to mention that?


Sorry, but in the real world not giving access to source code is an everyday affair, as is putting people under confidentiality agreements. You do not need the source code to optimise for the game, as should be evident even to the less informed among you, seeing as you've already pointed out that developers did not even have access to it prior to changes in the program. CD Projekt Red cannot optimise for AMD hardware without AMD.

Seemingly, as someone has shown on Reddit, a simple user-level driver setting tweak has greatly improved the game's HairWorks/tessellation performance.

All these things speak volumes - the game is out, and there is no AMD driver yet.

AMD want you to hate GameWorks. NVIDIA is a huge conglomerate with numerous shader technologies, which it continues to develop. Why should they hand these out for free just because AMD say they should?

It is up to AMD to:

Push their technologies onto developers - there is no licensing that prohibits separate libraries from separate vendors coexisting in the same game

Buy a licence (unless NVIDIA is prohibiting them from doing so; frankly, I think AMD would be the first to tell you if that were the case)

Provide a similar library, if they don't want developers just using NVIDIA's toolsets

Lastly, stop pointing the blame elsewhere for optimisation of their own products. They are whining at you (their consumers) and doing absolutely nothing about it other than stirring things up through the media.

You can either put up a banner in protest at developers using vendor libraries and insist they create in-engine technologies in-house - which, by the way, is more often than not financially unviable, hence why GameWorks even exists - and end up having none of these technologies in games at all.

Or you can tell AMD to stand up and offer developers the same, or better, toolsets.



End of.

I completely agree with all of the above.

However, a world where Nvidia and AMD share the same libraries would benefit us gamers. I'm not sure how this would work, but it would be the ideal.

Locking the other vendor out is no good for us in the long run, whichever side you fall into, whether that be Green or Red.
 
AMD's open practices are better IMO, and TressFX ran better on Nvidia cards (Tomb Raider) than HairWorks does on AMD's cards.

The next TressFX game is Deus Ex though, isn't it? It's not had much uptake :(
 
You're not assuming things based on facts, though; you're assuming things based on more assumptions.

You don't know whether the devs have to pay for the GameWorks code or not, for a start.

Yes, that is a stated fact from Nvidia themselves. Go and look up all the threads or articles on GameWorks. Nvidia ADDED another licence, giving devs the option of the black box (which was originally the only option) or a licence that includes the source code; as I said, they wouldn't tell anyone how much more it cost.

This is fact, not assumption. Likewise, it's pretty simple to see that the black-box licence would be significantly cheaper than the source-code licence... how much cheaper is questionable, and Nvidia wouldn't tell anyone, which, if you're being honest, doesn't suggest value for money in that option.
 
Drunkenmaster, you can't argue with people. They immediately shut down once they think they're being judged for making a wrong choice. And how could you reasonably argue with a man who's defending Nvidia against his own self-interest? Or one who takes no personal responsibility for his purchases and pretends that they have no consequences?

It is strange, though, how this whole fiasco over how unsupported older cards are has done nothing to dampen the zealous optimism people have for Nvidia. I guess people don't mind having to buy new cards every year.
 
I completely agree with all of the above.

However, a world where Nvidia and AMD share the same libraries would benefit us gamers. I'm not sure how this would work, but it would be the ideal.

Locking the other vendor out is no good for us in the long run, whichever side you fall into, whether that be Green or Red.

Then what you need are libraries that aren't owned by either vendor. Having open-source libraries is something you could consider honourable, but laying hate on a company for not doing so is nothing short of ridiculous.

The ball is firmly in AMD's court, really. TressFX on its own isn't really an example of generosity, seeing as it's the only technology on offer, and in only one title - even if (from what was seen in Tomb Raider) it is infinitely better than HairWorks IMO.
 
AMD Radeon Cards Get Improved Performance In Witcher 3 With Reduced Tessellation Levels

Read more: http://wccftech.com/amd-radeon-card...s-catalyst-whql-drivers-issued/#ixzz3ag6NiCo0

bU4EYfD.png
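For a rough sense of why capping the tessellation level helps so much, here is a back-of-the-envelope sketch. The numbers are illustrative and assume uniform integer tessellation of a triangle patch; HairWorks tessellates hair strands rather than plain patches, so treat this only as an indication of how the amount of generated geometry scales with the tessellation factor:

```python
# Illustration only: how generated geometry grows with the tessellation factor
# for a uniformly tessellated triangle patch (each edge split into N segments
# yields roughly N*N sub-triangles).

def triangles_per_patch(tess_factor: int) -> int:
    """Approximate triangle count for uniform integer tessellation."""
    return tess_factor ** 2

baseline = triangles_per_patch(16)  # roughly the kind of cap the driver override applies
for factor in (16, 32, 64):
    tris = triangles_per_patch(factor)
    print(f"x{factor}: {tris} triangles per patch ({tris // baseline}x the x16 workload)")
```

In this simplified model, dropping from x64 to x16 cuts the generated geometry by roughly a factor of 16, which gives a sense of why a driver-level tessellation cap can recover so much performance, consistent with the gains reported in the article above.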
 
Drunkenmaster, you can't argue with people. They immediately shut down once they think they're being judged for making a wrong choice. And how could you reasonably argue with a man who's defending Nvidia against his own self-interest? Or one who takes no personal responsibility for his purchases and pretends that they have no consequences?

It is strange, though, how this whole fiasco over how unsupported older cards are has done nothing to dampen the zealous optimism people have for Nvidia. I guess people don't mind having to buy new cards every year.

My argument has nothing to do with thinking I've made the wrong choice :confused:

You can flip that argument on its head, though, and ask why he's always so hell-bent on having a go at Nvidia when most of the time he has no facts to back up his assumptions. He still hasn't proved the devs have to pay to use GameWorks code.

Writing long posts filled with waffle doesn't make him right.
 
I just ignored that comment; it's deep and cynical whilst raising no objection to there still not being an optimised driver for the game in question. It's not about that - for instance, I think XDMA CrossFire has been better than SLI for well over a year now, despite recently dwindling driver support from AMD.
 
Drunkenmaster, you can't argue with people. They immediately shut down once they think they're being judged for making a wrong choice. And how could you reasonably argue with a man who's defending Nvidia against his own self-interest? Or one who takes no personal responsibility for his purchases and pretends that they have no consequences?

It is strange, though, how this whole fiasco over how unsupported older cards are has done nothing to dampen the zealous optimism people have for Nvidia. I guess people don't mind having to buy new cards every year.

The NV GPU is the new mobile phone.
 
That 2013 vs 2015 comparison is a very poor example of how the actual game looks.

Here are some screenshots I took earlier using a SweetFX profile, which are a much better indication. You can get it not that far away from the 2013 footage just by changing the cartoonish colours of the retail version.

SpGPPoh.jpg


zNbQoRj.jpg


uXB6mft.jpg
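Purely as an illustrative sketch (this is not the actual SweetFX profile used for the shots above, and the values are made up), this is the kind of per-pixel colour adjustment a post-process profile applies to every frame, e.g. pulling saturation back to tone down a cartoonish palette:

```python
# Illustration only: a simple saturation tweak on one RGB pixel, the sort of
# whole-frame adjustment a SweetFX-style post-process profile performs.

def adjust_saturation(rgb, factor):
    """Blend each channel towards the pixel's luma; factor < 1.0 desaturates."""
    r, g, b = rgb
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709 luma weights
    return tuple(luma + (channel - luma) * factor for channel in (r, g, b))

# Example: tone down an over-saturated green by 25% (values are hypothetical).
print(adjust_saturation((0.20, 0.80, 0.25), factor=0.75))
```

A full SweetFX profile chains several such passes (sharpening, vibrance, tone adjustments), but the principle is the same.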

To be fair, in those screens that looks impressive.
 