That's the second recent game (the other being Crysis 2) where one of my gfx cards may as well be made out of wood.
Get yourself Radeon pro:
http://www.radeonpro.info/en-US/
You can force xfire profiles onto unsupported games; for Crysis 2 you can use the 'Bioshock' profile.
For The Witcher 2, use the 'DiRT 2' profile. It's not a tremendous boost just now, but something is better than nothing.
Does anyone know the technical reason why crossfire optimisation appears to be in the hands of gfx card manufacturers rather than developers? Surely multi-GPU functionality should be open for developers to use their own optimisation techniques?
CatalystCreator Andrew D tweeted:
'We always try to get CF support the day a game releases, but sometime we don't see the final build of a title until it releases...'
'We're looking at Witcher 2 for single and CF - will release hotfix driver as soon as possible'
'Witcher 2 - CF profile - looks like a driver change is needed (can't do to much with just CAP) - will let you know more info when I have it'
When talking about xfire performance in Crysis 2, he blamed Crytek for not putting AFR (alternate frame rendering) support into the game.
He sort of put the blame on developers, saying all they need to do is build AFR support into their games.
Nvidia must be using a different rendering mode than AFR for their titles, and I presume that's why there are big differences in performance between the two camps in certain games.
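For anyone wondering what AFR actually means in practice: the driver hands out whole frames to the GPUs in round-robin order, so with two cards one renders the even frames and the other the odd ones. A toy sketch of the idea (purely illustrative, nothing to do with AMD's or Nvidia's actual driver code):

```python
def afr_schedule(num_frames, num_gpus=2):
    """Map each frame index to the GPU that would render it under AFR.

    Alternate frame rendering simply deals out whole frames
    round-robin, so each GPU renders every Nth frame.
    """
    return [frame % num_gpus for frame in range(num_frames)]

# With two cards, even frames go to GPU 0 and odd frames to GPU 1.
print(afr_schedule(6))  # [0, 1, 0, 1, 0, 1]
```

This is also why AFR needs support from the game: if frame N+1 depends on render output from frame N (as with some post-processing effects), the two GPUs end up waiting on each other and the scaling disappears.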
I suppose it's payback time for the red team just now, as DA II ran like a dog on team green's hardware when it came out.