The problem with slower cards is that you need more computers to run them. It's a balance between up-front cost and running cost. However, TBF it's only the R9 280X that has been massively affected by price increases.
THG measured it at 68W peak load under GPGPU. THG also measured as much as 141W peak when gaming, meaning the card is cheating on its TDP. Remember, this is a card-only measurement.
People are thinking of using a picoPSU to power this card. It had better be a 200W unit then.
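For anyone actually sizing a picoPSU around those THG figures, here is a rough back-of-the-envelope check, a minimal sketch only: the card figure is THG's measured peak, while the CPU and "rest of system" numbers are illustrative assumptions, not measurements.

```python
# Rough picoPSU sizing check. The card figure is THG's measured gaming peak;
# the CPU and "rest of system" figures are illustrative assumptions only.
CARD_PEAK_W = 141          # THG card-only gaming peak
CPU_PEAK_W = 35            # assumed low-power CPU for a picoPSU build
REST_PEAK_W = 20           # assumed motherboard, RAM, SSD, fans

system_peak = CARD_PEAK_W + CPU_PEAK_W + REST_PEAK_W

for rating in (90, 120, 160, 200):   # candidate picoPSU ratings in watts
    verdict = "ok" if rating >= system_peak else "over budget"
    print(f"{rating:>4} W picoPSU: system peak {system_peak} W -> {verdict}")
```

Even with a frugal CPU, everything below 200W is over budget at peak, which is why a 200W unit is the sensible floor.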
Intel did the same thing with its phone chips: they would massively breach TDP to look good in short benchmark runs. Nvidia's GPU Boost 2.0 does the same thing, and reviews have shown the issues over longer time periods.
A lot of these new boost mechanisms from AMD and Nvidia are cheats. Very few sites test extended 10 to 15 minute runs, or try the cards in more cooling-limited situations. Most tests are run on open-air benches, or in a huge high-end case with the side panel off, in an air-conditioned office, and the runs are usually under 3 minutes long. You see a best-case scenario in reviews, ESPECIALLY with reference coolers.
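If you want to check this yourself rather than trust a 3-minute bench run, something like the sketch below will do. It assumes a single NVIDIA card with nvidia-smi on the PATH (AMD users would swap in their own telemetry tool) and simply logs power, clocks and temperature while you loop a game or stress test for 15 minutes, so you can compare the first few minutes against the steady state.

```python
import csv
import subprocess
import time

# Minimal sketch: log GPU power, core clock and temperature once a second for
# 15 minutes while a game or stress test runs in another window.
# Assumes a single NVIDIA GPU with nvidia-smi available.
DURATION_S = 15 * 60
QUERY = "power.draw,clocks.gr,temperature.gpu"

with open("boost_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "power_w", "core_clock_mhz", "temp_c"])
    start = time.time()
    while (elapsed := time.time() - start) < DURATION_S:
        out = subprocess.run(
            ["nvidia-smi", f"--query-gpu={QUERY}",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        power, clock, temp = [v.strip() for v in out.split(",")]
        writer.writerow([round(elapsed), power, clock, temp])
        time.sleep(1)

# Compare the average clock over the first 3 minutes with the last 5 minutes:
# a big drop means the performance in a short review run was never sustainable.
```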
These GPU boosting mechanisms are overly aggressive, unlike those on desktop CPUs, and they vary far too much with cooling and ambient conditions. You can see what aggressive boosting does for laptop CPUs - different models with the same chip show different scores - and in some ways the boosting on graphics cards is probably a bit worse.