Whilst that is partially true, interlacing also improves perceived motion smoothness: 25p is not directly comparable to 50i. The problem is that since the demise of CRT, few home devices use old-fashioned electronic scanning, so everything gets de-interlaced prior to display, which affects not only the way it looks but also the apparent smoothness. 50p is the way to go, as you get both the motion smoothness of 50i and the vertical resolution of progressive, but it costs you bandwidth, which is why 50i has been kept as a bandwidth saver.
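To put rough numbers on the bandwidth point, here is a quick back-of-envelope sketch (Python). It only counts raw luma samples at 1920x1080 and ignores chroma subsampling, blanking and codec efficiency, so treat the figures as illustrative rather than broadcast spec:

```python
# Rough, uncompressed luma-only pixel rates for common 1080-line formats.
# Assumes 1920x1080 and ignores chroma subsampling, blanking and compression.

WIDTH, HEIGHT = 1920, 1080

formats = {
    "1080p25 (25 full frames/s)":        25 * WIDTH * HEIGHT,
    "1080i50 (50 half-height fields/s)": 50 * WIDTH * (HEIGHT // 2),
    "1080p50 (50 full frames/s)":        50 * WIDTH * HEIGHT,
}

for name, pixels_per_second in formats.items():
    print(f"{name}: {pixels_per_second / 1e6:.1f} Mpixel/s")

# 1080p25 and 1080i50 both come out at ~51.8 Mpixel/s, while 1080p50
# doubles that -- which is why 50i gives 50 motion samples per second
# without the raw data cost of 50p.
```

The point of the arithmetic is simply that 50i delivers twice the temporal sampling of 25p for the same raw data rate, whereas 50p pays for its extra vertical resolution with roughly double the bandwidth.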
We shot and edited one programme at 25p instead of 50i to satisfy the producer's curiosity. Whilst the pictures looked superb (PMW-F5 with primes), the motion was too flickery for fast-moving subjects or pans. It would be far too time-consuming to add and render motion blur as an effect in post when using 50i solves the problem with no time penalty (plus you can't apply effects to live shots). Viewing interlaced content on a proper grading monitor like a Sony PVMA250 or JVC DT-R24L41D shows what it is supposed to look like (butter smooth).
I can't disagree that change is necessary, but 25p is not it. I'd rather see 1080p/50 than 4K, but I can't realistically see either happening for a good decade or so due to infrastructure requirements.