Anyone still remember how the GTX 780 and GTX 780 Ti were running like turd after Nvidia released the GTX 900 series, got better for a little bit after 9 months of waiting, and then performance fell off a cliff once again and never recovered afterward?
The Witcher 3 was the game which became a big talking point on that front. Kepler severely underperformed in it at launch, which led to accusations of driver gimping. So much so that Nvidia released a hotfix driver not long after to improve Kepler performance in that game.
The thing about Kepler as an architecture is that it's actually quite radically different to Maxwell onwards. It has 192 CUDA cores per SM, yet only four warp schedulers, each able to address 32 cores per cycle to feed them. Each warp scheduler can actually issue two instructions per cycle, but that requires instruction-level parallelism, and that in turn requires driver and software support. The TL;DR on Kepler is that it's an architecture that needs both focused driver support and developer consideration in order to perform well. Without ILP the CUDA cores are starved of work (only 128 of the 192 are issued anything to do), and so they sit there doing nothing. Maxwell changed all this, moving to a 128 CUDA core per SM model with the same number of warp schedulers, meaning all 128 cores could be addressed without leaning on ILP. That's why it and subsequent Nvidia architectures have aged much better, relatively speaking: they're nowhere near as reliant on driver support and developer tricks to perform to their potential.
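Roughly what that ILP point looks like at the code level, as a toy CUDA sketch (the kernel names, constants, and host launcher are all made up for illustration, and whether the hardware actually dual-issues any given pair is ultimately down to the compiler and the scheduler): a serial dependency chain gives a Kepler scheduler's second dispatch port nothing to do, while splitting the same work across independent accumulators gives it pairs of instructions it can issue together.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Serial dependency chain: every multiply-add needs the result of the one
// before it, so adjacent instructions can't be issued in the same cycle.
// On Kepler this is the "only 128 of 192 cores fed" case.
__global__ void dependent_chain(const float* in, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    float a = in[i];
    a = a * 1.0001f + 0.5f;   // each line depends on the previous result
    a = a * 1.0001f + 0.5f;
    a = a * 1.0001f + 0.5f;
    a = a * 1.0001f + 0.5f;
    out[i] = a;
}

// Same arithmetic, split across two independent accumulators. Adjacent
// instructions no longer depend on each other, which is exactly the
// instruction-level parallelism a Kepler scheduler needs to co-issue work
// and keep the extra cores in the SM busy.
__global__ void independent_pairs(const float* in, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    float a = in[i];
    float b = in[i] + 1.0f;
    a = a * 1.0001f + 0.5f;   // these two lines are independent of each other
    b = b * 1.0001f + 0.5f;
    a = a * 1.0001f + 0.5f;
    b = b * 1.0001f + 0.5f;
    out[i] = a + b;
}

int main()
{
    const int n = 1 << 20;
    float *in = nullptr, *out = nullptr;
    cudaMalloc(&in, n * sizeof(float));
    cudaMalloc(&out, n * sizeof(float));

    // Launch both variants; the interesting part is the per-thread
    // instruction mix above, not the launch configuration.
    dependent_chain<<<(n + 255) / 256, 256>>>(in, out, n);
    independent_pairs<<<(n + 255) / 256, 256>>>(in, out, n);
    cudaDeviceSynchronize();

    cudaFree(in);
    cudaFree(out);
    printf("done\n");
    return 0;
}
```

That kind of restructuring is exactly the sort of thing that had to come from driver-side shader replacement or from developers tuning for Kepler specifically, which is why the architecture tanked once both stopped bothering.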
Ultimately, it wasn't really fair to say Nvidia were intentionally gimping Kepler. They simply dropped it like it was hot and didn't provide driver optimisations for it any more, whilst game developers also moved on and forgot about it. Without that software support it was boned and fell off a cliff.