We agree to disagree. For me, whether I own NVidia or ATI (and I have owned a lot of both), minimum fps is more important, and more noticeably distracting, than average fps. Perhaps this is because I set Vsync on for everything. I find the importance of min fps inversely proportional to that of max fps (which to me is meaningless), with avg sitting in the middle.

You're not disagreeing; I don't think you understand what the problem is. The minimum FPS is only useful when you know how long it occurred for. If you're playing a game at a pretty constant 60FPS 99.99% of the time, but the game drops to 1FPS for half a second while new textures are being loaded, your minimum FPS becomes 1. For someone who doesn't quite get how that works, it can look like low performance.
The minimum is just the lowest framerate recorded during the benchmark, but it seems like you're thinking of it as the lowest FPS you'll sustain over a period of time, like an average of the lowest framerates, which it doesn't represent.
It's hard to quantify the meaning of the minimum FPS without a graph that plots the framerate over time, so you can see where the drop occurs, how often it happens, and how long it lasts; but at that point you're effectively back to looking at the average. Average framerate is what matters most when it comes to smooth gameplay. Most of the time, though, the minimum framerate reflects a rapid drop and climb in framerate. In the HardOCP graphs, you can see the drop to 14FPS is quite brief, and then the framerate goes right back up again.
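To put rough numbers on that (purely hypothetical samples, not the HardOCP data): if a benchmark logs per-second FPS and a single texture stall wrecks one second out of sixty, the minimum reads 1FPS while the average barely moves.

```python
# Hypothetical per-second FPS samples: ~60 seconds of steady 60 FPS
# with one bad second where a texture stall tanks the frame count.
fps_samples = [60] * 59 + [1]

minimum = min(fps_samples)                     # 1 FPS: looks awful in isolation
average = sum(fps_samples) / len(fps_samples)  # ~59 FPS: the run was actually smooth

print(f"min: {minimum} FPS, avg: {average:.1f} FPS")
```

One stutter dominates the minimum but is noise in the average, which is exactly why the minimum needs a graph for context.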
When it comes to the average FPS, though, what you realistically want is the "mode": the FPS (or range of FPS) that occurs most often, and realistically you'll want that to be in the region of 60 (if you're using vsync). If the most common FPS is around the 60FPS+ mark, you can be getting smooth, consistent gameplay even if the minimum at some point is 1FPS.
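As a sketch of what I mean by the mode (same hypothetical samples as above, assuming whole-number per-second FPS readings):

```python
from collections import Counter

# The "mode" is the FPS value that shows up most often in the log,
# which says more about how the game feels than the one-off minimum does.
fps_samples = [60] * 59 + [1]
mode_fps, count = Counter(fps_samples).most_common(1)[0]
print(f"mode: {mode_fps} FPS ({count} of {len(fps_samples)} samples)")
```

With real logs you'd bucket the values first, since framerates rarely land on exact integers, but the idea is the same.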