It depends on the card purchased IMHO.
If you purchase a reference card and DSR it because it isn't a demon overclocker, then I agree. On the other hand, if your highly expensive GPU aimed squarely at overclockers turns out to be crap at overclocking, then DSR away.
I bought a 7970 Matrix Platinum and it would only overclock to 1170 core. I DSR'd it without batting an eyelid. That GPU was aimed squarely at overclockers and was even advertised as such.
Then you need to take into account that Gibbo is very fond of posting here on the forums that "card X is a great little overclocker". For example, in his R9 290 buying guide thread he explicitly stated he got 1200+ core clock on all the Sapphire R9 290X cards he tested, and he didn't even bin them first. As soon as someone buys a turkey Sapphire R9 290X, I think they have a case if they remember that little quote and DSR that bugger.
Within reason, yes: if you pay for a premium GPU and it's not a great clocker, a DSR return could be argued for, but overclocking is a lottery regardless of the premium paid.
Some have no doubt bought and returned 3+ GPUs looking for that golden sample; that's what has wrecked DSR.
Gibbo mentioned that OcUK have a blacklist of repeat offenders; DSR wasn't brought in to protect the consumer over overclocking results.
I've used DSR for a faulty, noisy fan on a 7950 Ice-Q, which I was gutted to return as it went to 1240MHz on stock voltage (I can only dream of what it would have hit with moar voltage, but I never bothered as it was going back).
I also used it for a 7970 WF that artifacted at stock. I could have gone for a replacement, but I chose to take the hit on postage as I couldn't be arsed with the risk of a 'no fault found' verdict on a returned faulty GPU, plus a testing fee and return postage. That's because of what happened with the infamous cheap 480 WFs: loads of them were faulty and were getting returned as 'no fault found', I guess because OcUK were probably testing them with high-quality 1200W PSUs at the time, whereas quality 750W PSUs of the recommended spec didn't have enough juice on the 12V rail. But that was the fault of Nvidia's rated requirements at the time.