Whatever the reason, if BF3 were my game of choice, I wouldn't take a card that loses 33% of its frame rate over a card that only loses 19.5% with 4xMSAA, if both cards are supposed to be in the same tier and price range (6970 vs GTX 570). Yes, people can always speculate about why AMD cards do so poorly with MSAA on BF3/Frostbite, but regardless of the cause, I think it is just common sense to get the card that performs better in the games you play, rather than saying "aww, you tried. Don't worry about the poor MSAA performance... I'll buy you anyway, even though the competitor offers a more capable card at the same price point in the games I play." I don't think consumers are a charity...
To be honest I was excited about the 7950... but after seeing that AMD STILL loses a huge chunk of frame rate when AA is applied, just like its previous-generation cards (the 7950 is faster than the GTX 580 by a fair margin at 0xAA, but once 4xAA is applied it becomes slower than the GTX 580 by a fair margin in BF3), that became a turn-off, and it made me want to wait for Kepler to see how it performs, as it is likely to offer at least similar performance and the same VRAM benefit, but without the MSAA performance weakness.
A DICE dev stated that Nvidia DLC was implemented into the engine; if it increases performance, then quite rightly so.
I was talking about paying the same/more money for a slower card (the performance I quoted was for the GTX 570 vs the 6970), not the 6950, which is known to be one of the best bang-for-buck cards of last gen. OcUK's best-selling GPU of 2011 was the 6950; that wasn't down to charity, it was down to best bang for buck.
Funny thing is that BF3 is a Gaming Evolved title, and people presume that means the same as TWIMTBP, i.e. that everything would be done in AMD's favour. In fact it just means AMD worked together with the developer on some aspects, and even when money is involved the developer is not limited to implementations favourable to AMD; it's just reassurance that those aspects will work on AMD hardware. DICE also worked with NV on that title.
http://blogs.amd.com/play/2011/12/12/bf3techinterview/2/
And I think you misunderstood what I'm saying about the AMD cards' "poor MSAA performance". I'm not talking about how many games the 7950 beats the GTX 580 in at 1920 with 4xAA, but about the percentage of frame rate lost going from 0xAA to 4xAA. The 7950 does beat the GTX 580 in most games at 4xAA, but it does so while losing a bigger percentage of its frame rate from 0xAA, so you can't help but question whether AMD could have improved their GPU architecture to lose less frame rate when AA is applied. I mean, if you look at the 0xAA results, you could even argue the 7950 is streets ahead of the GTX 580... but because it loses so much more frame rate than the GTX 580 with AA applied, it ends up only a tiny bit faster. So what I'm saying is that the AMD cards clearly have the grunt (looking at the 0xAA results), but the AA performance is dragging their legs.
My lowly GTX 460 1GB has a **** fit if I try everything on ultra ; ;
@random guy
right, so you aren't playing at max settings then!
4x MSAA isn't exactly maximum AA, and is negligible over 2x in terms of visual quality. If you want to play with 'max AA', then you need to be forcing 8x SSAA through your driver settings.
1) The reason the in-game settings don't allow you to set AA higher than 4x or 8x MSAA in just about every game is that most setups wouldn't be able to handle it.
2) In terms of visual quality, all forms of MSAA are terrible; none work on every surface in every game. If you want proper AA, you need to be forcing SSAA through the drivers.
3) I'd rather be using FXAA over MSAA for the performance, or SSAA for the visual quality if the game will still run fine. Only using the in-game AA settings for comparative reasons is daft.
Regardless of that, a pair of 1GB GTX 560 Tis in SLI ABSOLUTELY WILL run BF3 on ultra settings and 4x MSAA with no problem with 8GB of system RAM. A single one won't, because its GPU is too slow.
Can you post up an Afterburner log? At least three of us on here have had the totally opposite experience: 1GB cards fall off a cliff when you enable MSAA.
I'd feel really sorry for anyone believing your blatant lie, getting a pair of 1GB cards and finding out they'd just wasted 300 quid.
And in fact, if you google "560 Ti SLI Battlefield 3" you'll find numerous forum posts of people saying "if I enable 4xMSAA my fps drops to 5".
Check this thread and get a re-education on this topic:
http://forums.overclockers.co.uk/showthread.php?t=18336345
In the other thread, there are OcUK forum members with their own comparisons of 1GB 560 SLI vs 2GB 560 SLI.
http://www.tomshardware.com/reviews/radeon-hd-7950-overclock-crossfire-benchmark,3123-6.html
The 6970 2GB loses 33% of its frame rate when 4xMSAA is applied, whereas the GTX 570 1.25GB only loses 19.5%.
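For anyone wondering how those percentages are worked out, it's just the relative drop from the 0xAA result; a quick sketch (the fps numbers below are hypothetical round figures for illustration, not benchmark data):

```python
def aa_loss_percent(fps_no_aa: float, fps_msaa: float) -> float:
    """Percentage of frame rate lost going from 0xAA to MSAA."""
    return (fps_no_aa - fps_msaa) / fps_no_aa * 100.0

# Hypothetical example: 100 fps at 0xAA dropping to 67 fps at 4xMSAA
print(aa_loss_percent(100.0, 67.0))   # 33% loss, like the 6970 figure
print(aa_loss_percent(100.0, 80.5))   # 19.5% loss, like the GTX 570 figure
```

The point being that the comparison is about the relative hit from enabling AA, not the absolute fps either card ends up at.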
Grunt/GPU architecture > VRAM for most games, even for games that are known to use a little more VRAM than what's available. Metro 2033 is probably the ONLY game where extra VRAM would make a big difference; it's VRAM-hungry like mad, and no other game is like that.
I think the texture mods are essential, as some of Skyrim's textures are terrible.