780 x 2 possible with this PC?

That is true, I was just showing what the old dog could do. I wouldn't want the OP to make the same mistake I did:

I got the upgrade bug and hit it with over 2k of goodies thinking that was the best way to go, but I kept the same GPUs (the 7970s in CrossFire) and there was no improvement at all, even with the 4930K at 4.4GHz. Faster does not always mean better.

So a new bundle for him may not net the desired results.

That system was sent back (it was sold with a guarantee of the CPU doing 4.8GHz, an Overclockers typo, but that's another story).

Then I went 4770K with SLI 780s, then SLI 780 Tis, and only then did I get what I wanted.

The OP will have to go 4770K and SLI to get what he wants, or do what Kappstad said in the beginning: try another 780 first and see what happens.

Just changing the bundle isn't going to make a dramatic improvement, going by this particular experience.
For 2560 res, I would imagine an i7 920 is already more or less a perfect balance with CF 7970s, as the CPU and GPU grunt are probably both hitting around the same frame rate range; upgrade the CPU and the bottleneck becomes the GPU, upgrade the GPU and it becomes the CPU.

Taking CF 7970 performance as 100%, SLI GTX 780 performance would be around 160%, so he would definitely need a platform faster than an i7 920 at 3.80GHz to make the most of it; if he went SLI GTX 780, there would be many occasions in some games where his performance would be worse than your CF 7970s on your i7 920 at 4.40GHz.
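
Just to sketch the balance argument in rough numbers (all figures below are hypothetical placeholders based on the 100%/160% estimate above, not benchmark results), something like this in Python:

    # Back-of-envelope bottleneck model: the FPS you actually see is roughly the
    # lower of the CPU-limited and GPU-limited frame rates.
    def effective_fps(cpu_limit_fps, gpu_limit_fps):
        return min(cpu_limit_fps, gpu_limit_fps)

    cpu_i7_920 = 60           # hypothetical CPU-limited FPS for an i7 920 @ 3.8GHz
    cf_7970 = 60              # hypothetical GPU-limited FPS for CF 7970 (the 100% baseline)
    sli_780 = cf_7970 * 1.6   # SLI GTX 780 at roughly 160% of that baseline

    print(effective_fps(cpu_i7_920, cf_7970))   # 60 - CPU and GPUs are balanced
    print(effective_fps(cpu_i7_920, sli_780))   # still 60 - the CPU now caps the faster GPUs

In other words, on the same CPU the extra GPU grunt only shows up where the game is actually GPU limited.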

IMHO he should just keep his system as it is, save up his money, and leave his upgrade until both the next-gen high-end 16nm cards and the next-gen Intel CPUs are out, and upgrade THEN. With the move to 16nm, the chances are the next-gen high-end card will be around 80% of SLI GTX 780 performance anyway.
 
I'm going to build the test rigs next week and then we'll see how it all scales up (or not) - should be a fun project :)

1st rig. MSI 790XT-G45, 8GB Kingston DDR2 800MHz CL5, AMD Phenom II X6 1090T, MSI GTX 780 ref SLI

2nd rig. ASUS CROSSHAIR IV FORMULA, 8GB Kingston Value 1333MHz CL9, AMD Phenom II X6 1090T, MSI GTX 780 ref SLI.

3rd rig. MSI X79-GD45 Plus, 16GB DDR3 1866MHz G.Skill Ares CL9, Core i7 3820, MSI GTX 780 ref SLI.

4th rig. Asus P9X79 Pro, 16GB Crucial BallistiX 1600MHz CL8, Core i7 3930K, MSI GTX 780 ref SLI.


I'll be using the same Intel 330 120GB SSD and Windows 7 Home Premium for all testing.

I'll be testing everything at stock clocks and only at 2560x1440 resolution, first with 0xAA and then with 16x/32xQCSAA (whatever the games support), to see what bottlenecks what.

I'll benchmark in:
Crysis 1
Crysis 2
Crysis 3
(Should be interesting to see how the hardware scales through the older games)
BioShock Infinite
COD Black Ops II
DCS Black Shark
Homefront
BattleField 3 SP

I won't be doing synthetic tests since I want some concrete, real-life gaming results.
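
Just to show the scale of what I'm planning (purely a bookkeeping sketch; one pass per rig/game/AA combination is my own assumption), in Python:

    # Rough sketch of the planned test matrix: 4 rigs x 8 games x 2 AA settings,
    # all at stock clocks and 2560x1440.
    from itertools import product

    rigs = ["790XT-G45 / X6 1090T", "Crosshair IV / X6 1090T",
            "X79-GD45 Plus / i7 3820", "P9X79 Pro / i7 3930K"]
    games = ["Crysis 1", "Crysis 2", "Crysis 3", "BioShock Infinite",
             "COD Black Ops II", "DCS Black Shark", "Homefront", "BattleField 3 SP"]
    aa_settings = ["0xAA", "16x/32xQCSAA"]

    runs = list(product(rigs, games, aa_settings))
    print(len(runs))   # 64 benchmark passes before any repeat runs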

Should be done around Saturday noon. :)


Taking CF 7970 performance as 100%, SLI GTX 780 performance would be around 160%, so he would definitely need a platform faster than an i7 920 at 3.80GHz to make the most of it; if he went SLI GTX 780, there would be many occasions in some games where his performance would be worse than your CF 7970s on your i7 920 at 4.40GHz.
I don't disagree, but isn't that the never-ending question of "something faster will make something go faster"? I think the point, in essence, is whether he will benefit from an extra card, regardless of how much faster his system needs to be or not. I say if he's playing at 2560x1440 then yes, he will get a good performance increase with two cards in his current configuration. A faster system will, yes, give a higher baseline score, but that still doesn't say anything about whether he would gain something with his current setup.
FPS isn't everything IMO! The option to increase the visual image quality settings without sacrificing performance is a huge plus to me! He may not gain FPS, but he can sure increase AA settings a whole darn lot!... and I don't like jaggies no matter the resolution :D
I was running 3x 580 at 1920x1200 at stock speeds with the Ci7 870 at stock too, and all I heard was bottleneck comments (well, I still do). I'd get lower 3DMark scores than people with 2x highly OC'ed 580s and OC'ed 920s, but that really didn't bother me since I played at max settings at 1920x1200 with 32xQCSAA at all times and had great FPS, great image quality and no jaggies - 3DMark scores can't beat that... I really hate jaggies :D
 
Last edited:
But isn't that the never-ending question of "something faster will make something go faster"? I think the point, in essence, is whether he will benefit from an extra card, regardless of how much faster his system needs to be or not. I say if he's playing at 2560x1440 then yes, he will get a good performance increase with two cards in his current configuration. A faster system will, yes, give a higher baseline score, but that still doesn't say anything about whether he would gain something with his current setup.
FPS isn't everything IMO! The option to increase the visual image quality settings without sacrificing performance is a huge plus to me! He may not gain FPS, but he can sure increase AA settings a whole darn lot!... and I don't like jaggies no matter the resolution :D
I was running 3x 580 at 1920x1200 at stock speeds with the Ci7 870 at stock too, and all I heard was bottleneck comments. I'd get lower 3DMark scores than people with 2x highly OC'ed 580s and OC'ed 920s, but that really didn't bother me since I played at max settings at 1920x1200 with 32xQCSAA at all times and had great FPS, great image quality and no jaggies - 3DMark scores can't beat that... I really hate jaggies :D
Except forcing a ridiculously high level of AA doesn't necessarily mean it looks noticeably better. People in general are far more sensitive to gameplay smoothness than to the so-called improved image quality. I honestly doubt people could tell the difference between your 32xQCSAA and a typical 8xMSAA even if they were looking at screenshots of the two right next to one another, let alone tell the difference in image quality on moving images during gaming :p
 
Except forcing a ridiculously high level of AA doesn't necessarily mean it looks noticeably better. People in general are far more sensitive to gameplay smoothness than to the so-called improved image quality. I honestly doubt people could tell the difference between your 32xQCSAA and a typical 8xMSAA even if they were looking at screenshots of the two right next to one another, let alone tell the difference in image quality on moving images during gaming :p

Higher AA makes all lines look smoother, so it's not ridiculous - if it were, so many kinds of AA algorithms wouldn't have been made. Some people may crave only FPS, but I crave image quality :rolleyes: (just like some are obsessed with the best sound quality).
X4cqV.png

http://international.download.nvidi...icles/batmanarkhamcity/BAC-AAComparison-2.png
batman_aa.jpg


I always play at the highest AA levels since I can, and I like the smoothness of the image - though I'm not so impressed with FXAA, it just blurs everything.
Especially when playing flight simulators or racing games, it makes the overall quality of gaming so much better. In flight sims, high AA levels help me spot an aircraft silhouette rather than a cluster of pixels :)
When I play at ultra high settings, I play at ultra high AA settings too :cool:
 
Last edited:
I have never really tested it, but what I find with maxed settings is that it tends to keep the fps closer to the average in games like Crysis 3. I don't seem to get very low minimums or very high maximums, which is nice.
 
So I would say we are getting CPU bottlenecking in Sleeping Dogs, which is very demanding on the CPU, but the other games don't have a problem.

I would also place BF4 and Crysis 3 in the same group as Sleeping Dogs, but unfortunately they don't have built-in benchmarks to test with.

Calling Sleeping Dogs a bottleneck is bull; you're at 150 fps, more than any monitor can show, and you lose what, 5% of your average fps from losing 30% of your cores, and the minimums are identical.
It may still be a slight bottleneck, but at fps beyond what anybody needs, so it's pointless to care.
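
Rough arithmetic behind that point (using the 150 fps and 5% figures quoted above as placeholders, nothing measured here):

    # If dropping ~30% of the cores only costs ~5% of the average fps, the game
    # is nowhere near scaling 1:1 with core count, i.e. not a hard CPU bottleneck.
    full_cores_fps = 150            # average fps with all cores (figure quoted above)
    reduced_cores_fps = 150 * 0.95  # roughly 5% lower with ~30% fewer cores
    cores_lost = 0.30

    fps_lost = 1 - reduced_cores_fps / full_cores_fps
    print(f"fps lost: {fps_lost:.0%} vs cores lost: {cores_lost:.0%}")
    # fps lost: 5% vs cores lost: 30%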
 
Last edited:
Calling Sleeping Dogs a bottleneck is bull; you're at 150 fps, more than any monitor can show, and you lose what, 5% from losing 30% of your cores.
It may still be a slight bottleneck, but at fps beyond what anybody needs, so it's pointless to care.

Still a bottleneck and there are games that are worse than Sleeping Dogs.

Have you also noticed the performance increase going from 4 to 6 cores in SD?

SB and SB-E CPUs are also more efficient than the older 920s, so I am expecting a drop in fps when I test one.

The point is that Sleeping Dogs has managed to bottleneck a 6-core, 12-thread Intel CPU, and that is not good news.

You should also see the runs I have done with my 4930 and 290Xs - now that is a massive bottleneck, with little increase in fps going from 1080p to 1600p lol, but that is for another day.
 
I have never really tested it, but what I find with maxed settings is that it tends to keep the fps closer to the average in games like Crysis 3. I don't seem to get very low minimums or very high maximums, which is nice.

Ditto!! I rarely get large FPS fluctuations. I rarely see more than a 30FPS difference from the minimum to the maximum FPS... It's really quite stable.
 
Higher AA makes all lines look smoother, so it's not ridiculous - if it were, so many kinds of AA algorithms wouldn't have been made. Some people may crave only FPS, but I crave image quality :rolleyes: (just like some are obsessed with the best sound quality).
X4cqV.png

http://international.download.nvidi...icles/batmanarkhamcity/BAC-AAComparison-2.png
batman_aa.jpg


I always play at the highest AA levels since I can, and I like the smoothness of the image - though I'm not so impressed with FXAA, it just blurs everything.
Especially when playing flight simulators or racing games, it makes the overall quality of gaming so much better. In flight sims, high AA levels help me spot an aircraft silhouette rather than a cluster of pixels :)
When I play at ultra high settings, I play at ultra high AA settings too :cool:
If your screenshots prove anything, it is that 32xCSAA is not worth killing the performance for compared to MSAA (if someone already has enough GPU grunt, fine, but not when people have to drop another £400 on a GTX 780 just for the sake of using it), as people are NOT really going to notice the difference when they are actually gaming and running around. They are not going to stand at the same spot staring at the screen all day :p

If someone was dropping an extra £400 on a GTX 780 to use 32xCSAA on a 1920 res monitor, they are doing it WRONG... they should be spending the money on a 2560 res monitor instead. Lots of 2560 res users comment that they don't really need to go beyond 2xMSAA, and the OP is already on 2560 res. I am using a custom resolution utility to run 2560x1440 on my Samsung SA700 23", and for games I don't even feel the need for AA... but I use 2xMSAA because psychologically it feels like it helps a bit, though in honesty I can't really tell the difference.

Maybe you are ultra-sensitive to this kind of thing, but I don't think it's fair to assume that everyone is as sensitive or has expectations as high as yours. I would even dare say you are most likely enabling 32xCSAA more for the psychological feel-good factor of "I'm running it at the absolute max" than because you can actually tell it apart from 8xMSAA when both are running at 2560 res during actual gaming (again, by gaming I mean with actual action going on, not standing on the spot staring at the screen all day). Had there been two identical monitors, one running 8xMSAA and the other running 32xCSAA, I don't think you could tell which is which, and your eyes certainly cannot enlarge pictures like the screenshot above (maybe if you were to stick your face 10cm from the screen you might, but what would be the point? You are not gaming with your eyes 10cm from the screen.)
 
Last edited:
Have to agree with marine; at 2560x1440 I can't really see any difference beyond 2xMSAA. I could drop another 290X in my PC tomorrow if I wanted, but I just don't feel as if I need it.
 
Still a bottleneck and there are games that are worse than Sleeping Dogs.

Have you also noticed the performance increase going from 4 to 6 cores in SD?

SB and SB-E CPUs are also more efficient than the older 920s, so I am expecting a drop in fps when I test one.

The point is that Sleeping Dogs has managed to bottleneck a 6-core, 12-thread Intel CPU, and that is not good news.

You should also see the runs I have done with my 4930 and 290Xs - now that is a massive bottleneck, with little increase in fps going from 1080p to 1600p lol, but that is for another day.

But it's at 144fps, and the majority of gamers are on 60Hz monitors; it wouldn't bottleneck your cards if you were running triple screens or 4K, which is what that many top-end cards are really for.
 
Last edited:
If your screenshots prove anything, it is that 32xCSAA is not worth killing the performance for compared to MSAA (if someone already has enough GPU grunt, fine, but not when people have to drop another £400 on a GTX 780 just for the sake of using it), as people are NOT really going to notice the difference when they are actually gaming and running around. They are not going to stand at the same spot staring at the screen all day :p

If someone was dropping an extra £400 on a GTX 780 to use 32xCSAA on a 1920 res monitor, they are doing it WRONG... they should be spending the money on a 2560 res monitor instead. Lots of 2560 res users comment that they don't really need to go beyond 2xMSAA, and the OP is already on 2560 res. I am using a custom resolution utility to run 2560x1440 on my Samsung SA700 23", and for games I don't even feel the need for AA... but I use 2xMSAA because psychologically it feels like it helps a bit, though in honesty I can't really tell the difference.

Maybe you are ultra-sensitive to this kind of thing, but I don't think it's fair to assume that everyone is as sensitive or has expectations as high as yours. I would even dare say you are most likely enabling 32xCSAA more for the psychological feel-good factor of "I'm running it at the absolute max" than because you can actually tell it apart from 8xMSAA when both are running at 2560 res during actual gaming (again, by gaming I mean with actual action going on, not standing on the spot staring at the screen all day). Had there been two identical monitors, one running 8xMSAA and the other running 32xCSAA, I don't think you could tell which is which, and your eyes certainly cannot enlarge pictures like the screenshot above (maybe if you were to stick your face 10cm from the screen you might, but what would be the point? You are not gaming with your eyes 10cm from the screen.)

It may be that I've been in this anti-aliasing game for so long; I was hooked for life when the Voodoo 5500 offered hardware-rendered AA.

I have a 2560x1440 monitor and I immediately see whether I'm using 0xAA, 2xAA, 8xAA or 32xAA. In some games it's harder to see the difference between the 16x and 32x AA levels, but yes, I see them - just as some people are very observant about sound quality... (I'm not, because I have a bit of tinnitus in one ear).

I don't say that people have the same expectations, but they have the option to pull every slider to the max! Whether they do or not is up to them. I'm just saying it's possible and it does give that extra edge - combine it with some extra SweetFX tweaking and you can get stunning results :)

Why is 32xCSAA on a 1920x1200 monitor wrong? I don't get the point. This stands in direct contrast to the original idea of the hardware-based AA 3Dfx came around with - running NFS Porsche at 640x480 with 4x hardware AA and getting image quality on jaggies close to 1280x1024 levels.
It just improves the quality. Going to 2560x1440 with 2xMSAA is not only a ~60% increase in pixels over 1920x1200 (nearly 80% over 1920x1080), but additional bandwidth is also being used for AA - so you can actually end up with lower performance than at 1920x1200 with 32xQCSAA.
Also, 32xCSAA isn't making your FPS drop by 50%; QCSAA is just a tad more demanding.
wp-aaperf2.jpg

http://www.pcper.com/reviews/Graphi...ew-Fermi-brings-DX11-desktop/Dont-Forget-ROPS

Also, upscaling from 1920x1080 to 2560x1440 on the same screen can't be compared to a native 2560x1440 panel and its image quality. You still have the same limited number of pixels on the 1080 screen, and the pixel pitch doesn't change either.
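
For reference, the raw resolution arithmetic behind the pixel-count point above (nothing measured, just the numbers):

    # Pixel counts for the resolutions being discussed.
    res_1920x1200 = 1920 * 1200   # 2,304,000 pixels
    res_1920x1080 = 1920 * 1080   # 2,073,600 pixels
    res_2560x1440 = 2560 * 1440   # 3,686,400 pixels

    print(res_2560x1440 / res_1920x1200)   # 1.6   -> ~60% more pixels than 1920x1200
    print(res_2560x1440 / res_1920x1080)   # ~1.78 -> ~78% more pixels than 1920x1080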

I don't say everybody needs, or is even at all interested in, AA - but neither NVIDIA nor AMD would devote so much time and energy to it if it wasn't something that matters for "immersive gaming" and if people didn't use it. If it really isn't that important, then why even bother making new methods or supporting levels higher than 4xAA?

I'm just saying that wanting the best image quality possible is just as valid as wanting the most FPS available :)

Personally I just don't see the point of going for 120FPS+, which I would rarely notice at all, whereas I instantly notice AA etc. - and I'd rather have max AA and 80-100FPS than 144FPS with very low or no AA. But I DO respect it if people wish to get 200FPS and not run AA... that's their choice. I'm just saying you can see things both ways: the option to get more FPS or the option to enable higher image quality settings - each picks their own game :)

Have to agree with marine; at 2560x1440 I can't really see any difference beyond 2xMSAA. I could drop another 290X in my PC tomorrow if I wanted, but I just don't feel as if I need it.

Exactly! I don't really notice the difference between 60 and 200FPS, just as you perhaps don't notice the difference between 2xMSAA and 32xCSAA :) (NB: not saying that either of us can't see a difference)

Sorry OP, this is going OT.
 
Last edited:
Why is 32xCSAA on a 1920x1200 monitor wrong? I don't get the point.
I think you misinterpreted what I meant.

What I meant was... let's say someone already has a 290 or GTX 780 and a 1920 res monitor, and his CPU is already starting to show its age and become a bottleneck. If he has £400 to improve image quality, the best thing for him to do is upgrade to a 2560 res monitor, not grab another GTX 780 in order to use 32xCSAA.
 
But it's at 144fps, and the majority of gamers are on 60Hz monitors; it wouldn't bottleneck your cards if you were running triple screens or 4K, which is what that many top-end cards are really for.

I also use a 60Hz monitor, and if the cards are allowed to run bottlenecked it does not look good. This is why two stock GTX 690s look better running Sleeping Dogs than four 290Xs at 1080p; the 690s are also faster on fps lol.

Fortunately I can use 1600p so this is not a problem.

Everyone has told you that high-end cards bottleneck, even LtMatt.
 
As a side note - my system (i7 4820K, 780) with the CPU at 4.4GHz and the GPU at 1188MHz hits ~370W draw from the wall in BF4 (full ultra, 1920x1080); with 4.6-4.7GHz on the CPU and 1267MHz on the GPU it hits around 400W.
 
I also use a 60Hz monitor, and if the cards are allowed to run bottlenecked it does not look good. This is why two stock GTX 690s look better running Sleeping Dogs than four 290Xs at 1080p; the 690s are also faster on fps lol.

Fortunately I can use 1600p so this is not a problem.

Everyone has told you that high-end cards bottleneck, even LtMatt.

Define "does not look good"? Provide me with quantifiable evidence, not hearsay.
 
Define "does not look good"? Provide me with quantifiable evidence, not hearsay.

It is not hearsay, as it is on my own PC. :D

I have already demonstrated that Sleeping Dogs can bottleneck a 6-core Intel CPU, and a 4-core one even more. I think it is about time you provided something of substance rather than being negative in the face of what most people are telling you.
 
If you can't see the difference in image 1 & 4, you need Specsavers man. :)
Hehe, while I don't disagree with you, I stand by my point: people would struggle to tell the difference between the 3rd and 4th pic had it not been a greatly enlarged, small, focused area of an image.

Try pressing Ctrl and minus in your browser until the page is as small as it will go, and it should more or less show the image at its original size. Then you can tell me whether you can still tell the difference between the lower AA levels and 32xCSAA :p

Edited: Actually, on second thought, I might as well just do this:
http://img13.imagevenue.com/img.php?image=569324667_AA_122_470lo.jpg
 
Last edited: