Most of us consult a range of professional reviews before buying or recommending a product. This gives a better picture than a single user's impression, which inevitably carries a large error margin: it's just one person, with their own particular use cases. Reviews also flag issues specific to different setups, and they help avoid buyer's bias, where people want their new purchase to be worth the money and so convince themselves it has improved things.
You may be totally scientific in your assessment when you say it's snappier, but it contradicts what others have found, including many who have documented their methods. For example, we have no way of telling whether you compared an old install with lots of accumulated software slowing it down against a fresh install, or whether it was like for like. Most reviews have found the improved sequential write speed makes little difference, and at consumer levels of use higher IOPS doesn't make a big difference either. Random read performance (normally credited with the 'snappy' feeling) is slightly better, but not by a significant margin.
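To illustrate what a like-for-like comparison means, here is a minimal sketch that times small random reads against a freshly created test file, so a "before" and "after" run would measure the same workload rather than an aged install versus a clean one. All names and sizes here are illustrative, and a toy loop like this is heavily affected by the OS page cache; real reviewers use dedicated tools such as fio with direct I/O.

```python
import os
import random
import tempfile
import time

# Illustrative parameters (not from any real review methodology).
FILE_SIZE = 64 * 1024 * 1024   # 64 MiB test file
BLOCK = 4096                   # 4 KiB blocks, the usual random-read unit
READS = 1000

def time_random_reads(path):
    """Time READS random 4 KiB reads from the file at `path`."""
    blocks = FILE_SIZE // BLOCK
    fd = os.open(path, os.O_RDONLY)
    try:
        start = time.perf_counter()
        for _ in range(READS):
            # Read one block at a random aligned offset.
            os.pread(fd, BLOCK, random.randrange(blocks) * BLOCK)
        return time.perf_counter() - start
    finally:
        os.close(fd)

# Create the same fresh test file for every run, so results are comparable.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(FILE_SIZE))
    path = f.name

elapsed = time_random_reads(path)
print(f"{READS} random {BLOCK}-byte reads in {elapsed:.3f}s")
os.unlink(path)
```

The point is not the absolute number but the controlled setup: identical file, identical access pattern, so two drives (or two installs) can be compared on equal footing.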
That's not to say reviews are never wrong: reviewers have biases, make mistakes, don't always configure hardware properly, and sometimes get sent different versions of items than those sold at retail. In aggregate, user reviews also cover far more hardware combinations, which can be very valuable, especially when certain less common components turn out not to work well together. Hearing different viewpoints is interesting, especially if you're willing to write up the basis for your belief.