Worst analogy ever. VRAM != speed.

C'mon, it's not that bad! Anyway, there are lots of valid points here, and this discussion will go on forever; there are already thousands of pages on the net and I'm sure many more to come.
Oh, and what happens if you don't use AA? VRAM usage goes down and FPS (speed) goes up.
This was taken from TechPowerUp.
Well, BF3 at 2560x1440 on ultra is definitely >2GB usage. And again, it's not just VRAM. In 5 years? All our cards will be dust.
BF3 should not be used to determine how much VRAM is required, because its usage scales with how much is available to it:
In my own Eyefinity / Surround testing I have monitored the following:
At 3620x1920, Ultra settings (no AA):
HD7970 - 2.2-2.4GB
GTX680 - 1.4-1.6GB
At 3620x1920, Ultra settings (2xAA):
HD7970 - 2.4-2.6GB
GTX680 - 1.6-1.8GB
At 3620x1920, Ultra settings (4xAA):
HD7970 - 2.4-2.8GB
GTX680 - 2GB+ (out of memory error).
What we can conclude from this is two things:
1) Yes, it is technically possible for BF3 to require more than 2GB of VRAM, but the settings that actually require it are well beyond 1080p levels (and certainly beyond what a single card can drive; a 30 FPS average is not really playable in BF3).
2) VRAM usage readings under normal scenarios are not accurate indicators of what is actually required, as there is a lot of caching going on.
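The thread doesn't say which tool produced those readings (Afterburner, GPU-Z, and similar overlays are the usual options). For anyone who wants to log the same thing on the NVIDIA side, here is a rough polling sketch using the pynvml bindings; the device index and one-second sample interval are just assumptions for illustration, not anything from the posts above.

# Minimal VRAM polling sketch (NVIDIA cards only), assuming the pynvml
# bindings are installed (pip install nvidia-ml-py). The device index and
# one-second sample interval are arbitrary choices for illustration.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # .used / .total in bytes
        print("VRAM used: %.2f GB of %.2f GB" % (mem.used / 1024**3, mem.total / 1024**3))
        time.sleep(1.0)  # sample once per second while the game is running
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()

As with point 2 above, keep in mind this counter reports what the driver has allocated, not what the game strictly needs, so the numbers will sit higher on a card with more memory to cache into.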
I completely agree with your second point: there is no point asking whether the current generation of cards will still manage a 60+ FPS average in 5 years' time. Regardless of VRAM requirements, they will simply be too slow.
To answer the original question: to attain a constant 60 FPS at maximum settings in every current game (plus oodles of AA) you will need GTX680 SLI or HD7970 CF at minimum, even at 1080p. That is ridiculous overkill for the most part, as dropping a couple of pointless settings (Ubersampling in The Witcher 2, for example) will allow for nigh on a 60 FPS average with just a single GTX680 or HD7970.