Hey KidCanary
I'll admit a lot of this stuff goes over my head - So perhaps I'm missing something here, but surely the graph in the first post (and other benchmarks) shows that although some games use over 768MB of memory, it doesn't make *that* much difference to gameplay?
We have no idea, actually, how much VRAM is being used in any of the AnandTech results but there is a small group of people here who think those results are worthless . . . they think min/avg/max benchmarks somehow "mask" how the game actually plays . . . their proposition is that every few seconds a card that does not have enough VRAM will "stutter" or the game will "skip" . . . not one of them has produced any evidence or data to back up this proposition, so it is in fact conjecture on their part.
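Just so we are all clear on what is actually being claimed (and I am not saying it is true, since nobody has posted real numbers) . . . here is a quick Python sketch with completely made-up frame times showing how an average FPS figure could, in theory, hide the odd long frame . . . this is the shape of their argument, not evidence for it:

# Made-up frame-time trace, NOT real benchmark data -- just illustrating the claim.
frame_times_ms = [16.7] * 57 + [120.0, 16.7, 16.7]   # ~1 second of frames with one nasty spike

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
worst_frame_ms = max(frame_times_ms)
spikes = sum(1 for t in frame_times_ms if t > 40.0)   # frames slower than ~25 FPS

print(f"Average FPS: {avg_fps:.1f}")            # still looks fairly healthy
print(f"Worst frame: {worst_frame_ms:.0f} ms")  # but one frame took 120 ms
print(f"Frames over 40 ms: {spikes}")           # the "stutter" an average would not show

Point being, min/avg/max could only be said to "mask" anything if spikes like that actually show up in real logs . . . which is exactly the data nobody has produced.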
I think the AnandTech data "perhaps" shows the truth of the situation and I think the difference in FPS between the 768MB card and the 1024MB card in those tests is not solely due to the VRAM difference but also down to the 33% extra ROPs and memory bandwidth on the 1024MB card . . .
I seem to remember that the 256-bit bus helps transfer the graphics data quicker, which helps when you game at high res like 1920x1200 . . . in the old days the really uber cards had a 512-bit memory bus, although apparently that costs a lot to produce now . . . they work around this by using faster memory!
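The back-of-an-envelope maths on the two GTX 460s (using the stock specs as I remember them . . . 192-bit/24 ROPs vs 256-bit/32 ROPs, both on 900MHz GDDR5 so 3.6Gbps effective . . . double-check the reviews if you care about exact figures) looks something like this:

# Rough memory-bandwidth sums for the two GTX 460s -- stock specs as I recall them.
effective_rate_gbps = 3.6   # 900 MHz GDDR5, effectively 3.6 Gbit/s per pin

for name, bus_bits, rops in [("GTX 460 768MB", 192, 24), ("GTX 460 1GB", 256, 32)]:
    bandwidth_gbs = bus_bits / 8 * effective_rate_gbps   # bus width in bytes x effective data rate
    print(f"{name}: {bus_bits}-bit bus, {rops} ROPs, ~{bandwidth_gbs:.1f} GB/s")

# Both the ROP count and the bus width scale by 256/192 = 32/24 = 1.33,
# which is where the "33% extra" figure comes from.

So roughly 86GB/s vs 115GB/s . . . plenty of reasons besides the extra 256MB for the 1GB card to pull slightly ahead.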
Rroff mentioned Modern Warfare 2, which was missing from the graph, but this benchmark shows that the 1GB version of the card still only gets just under 5 FPS average more than the 768MB card. (looking at the standard versions of the cards, as different manufacturers overclock in different ways)
Indeed . . . that benchmark shows the kind of data we have always been used to seeing . . . and now it is being ridiculed as "worthless & meaningless"
If somebody wants to do some testing and show a timeline from a game where the framerate can be viewed alongside the VRAM usage, that would be really helpful . . . sadly no one has made that effort and we are just expected to listen to one or two guys as if they actually know what they are talking about . . . a leap of faith if you will!
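Something along the lines of this rough Python sketch is all it would take . . . the file names and column headings here are made up, you would get the real ones out of whatever you log with (FRAPS for frame times, GPU-Z or similar for VRAM), but the idea is just to line the two logs up on one timeline:

# Line up a frame-time log with a VRAM-usage log on a shared timeline.
# File names and column names below are hypothetical -- substitute whatever your logging tools produce.
import csv

def load_log(path, time_col, value_col):
    # Read (timestamp_seconds, value) pairs from a simple CSV log.
    samples = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            samples.append((float(row[time_col]), float(row[value_col])))
    return samples

frames = load_log("frametimes.csv", "time_s", "frame_ms")   # hypothetical frame-time log
vram = load_log("vram_log.csv", "time_s", "vram_mb")        # hypothetical VRAM-usage log

# Walk the timeline second by second: worst frame time next to peak VRAM in use,
# so you can see whether any big spikes actually line up with VRAM going over 768MB.
end = int(max(frames[-1][0], vram[-1][0])) + 1
for second in range(end):
    frame_slice = [ms for t, ms in frames if second <= t < second + 1]
    vram_slice = [mb for t, mb in vram if second <= t < second + 1]
    if frame_slice and vram_slice:
        print(f"t={second:4d}s   worst frame {max(frame_slice):6.1f} ms   VRAM {max(vram_slice):6.0f} MB")

Until somebody posts output like that from a real game, it is all just hand-waving.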
With Crysis Warhead, which I imagine also would take a lot of VRAM, the difference is again only about 5FPS more.
It appears to be the same old story in most of the games . . . the 768MB card, despite having less VRAM, fewer ROPs and less memory bandwidth, chugs along quite nicely . . . but somehow anyone would be foolish to buy one! . . .
With the original Crysis, this benchmark shows that at the highest settings they tested, the average FPS is only 1 higher with the bigger card. Clearly neither card is good enough for that game at those settings - But at lower settings the difference is still only 1 or 2FPS.
Same old story . . . the Nvidia Fermi seems like a compelling choice for anyone who wants to save a few quid and get some gaming done "today" . . .
Certainly in the future a lot more games will use larger amounts of video memory, but by then the rest of the card will be so outdated that it's not going to matter that much.
I think most games have the capacity to use a heap of VRAM if you turn up the options . . . whether this makes the games better is subjective . . . I played Crysis on Medium, but then Crysis on Medium was like Ultra-High in most other games I'd played before!
The thing is, there appear to be some people who take their GPU purchases seriously . . . very seriously . . . and after weeks of research and careful "examination" they nearly always conclude by purchasing the most expensive GPU and feel pretty "justified" about it too . . . so when a newer card comes out that costs a lot less than they paid for theirs and still performs really well, it kinda kicks them in the goolies!
I'm wanting to do a bit of gaming soon actually and these GTX 460 768MB cards look just the ticket . . . sadly, as they become more popular and more people become aware of how much performance they bring, the price seems to be rising . . .
As I said, a lot of this stuff goes over my head, but such a tiny increase in FPS doesn't seem to justify the extra £40/£50 for the bigger card.
I would say you're just as much an expert as anyone here makes out to be . . . some people like to consider themselves an authority here but actually they are no smarter than you or me, they may just have some facts that we don't . . . but when you ask them for those facts you get abused!
I look at those benchmark charts the same way as you do and all I see is "bargain, buy me" . . . if some of these guys here who carry on like they have some wisdom would actually pull their fingers out and produce some meaningful facts then we would all be the wiser . . . sadly no one has, so if somebody is looking for a card to game at 1920x1200 then it seems the GeForce GTX 460 768MB is the bang-for-buck champion right now!
