Depends on what games you play and the settings you play at.
Even with my X2 4400+ @ 2.5GHz I saw a 2.5x increase in performance in Crysis, going from an X1900XT to an HD4870.
In HL2:EP2, I'm now able to play at 1680x1050/max details/8xAA with v-sync on, something I couldn't do before.
Most games on Unreal Engine 3 (e.g. Gears of War, R6 Vegas, Bioshock) have seen a massive increase (at least 2x the performance) and are far more playable than ever before.
Devil May Cry 4 just runs at a constant 60+fps with max details and 8xAA.
Call of Duty 4 looks and runs sublime with this card; pretty much 60fps+ from start to finish without a single hiccup. I only tried the demo with my X1900XT, but in the same scene framerates jumped from 20-40fps to 60-100fps.
Even in Stalker, which is heavily CPU-dependent in many areas, I've seen a performance increase of 1.5x on average (2-3x in areas with little NPC activity).
Performance is only really hampered by some heavily CPU-dependent games, like the obvious RTS titles and some odd ones like Mass Effect (I'm still trying to figure out why Mass Effect gobbles up both cores, though).
Finally, a lot of benchmarks do show considerable bottlenecking with older CPUs, but what they don't point out is that anything above 60fps is very hard to notice (in fact it can look worse once you factor in screen tearing on LCDs). Sure, with a quad-core Intel CPU clocked at 4GHz you can get 200fps, but does that really look much better than 60fps? Is it even noticeable?
Fact is, my FPS increased within the ranges that matter, i.e. from 20-40fps to 50-100fps, and for that I think it's well worth it. Plus, even if the game is CPU-bottlenecked, you can at least now play with 24xAA + adaptive AA with no performance drop (as was the case with Oblivion).