That is also why I find it strange that Maxwell would appear on 28nm. It was designed from the ground up for 20nm, with TSMC making the design kits available early on, and as the article illustrates, the lithography and layout of the same design implemented on 28nm are very different. That means they'd have to go through a whole load of extra work implementing and debugging for 2 separate processes with no direct shrink path available, though I guess it might be possible to port a low-spec design directly back in an inefficient manner.
There won't be a low-end Maxwell on 20nm anyway, and like hell will anything translate to the FinFET designs at 16nm, so why bother.
Nvidia barely does low end now, even though low end used to be probably 2-3 times the current volume (which is also shrinking by the quarter). Due to costs there is basically no financial incentive to make the low end uber small. Say a 20nm wafer costs double compared to a 28nm wafer, with 30% lower yields, and the die shrink only gets you say 50% more chips per wafer. A chip that cost $15 per core to make ends up costing around $27-28 instead. There just isn't any worth in doing that, which is why the low end normally lags significantly behind.
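The back-of-envelope math above can be sketched like this. All the numbers are hypothetical, just chosen to match the post's assumptions (double wafer cost, 30% lower yield, 50% more dies per wafer, $15 baseline per chip):

```python
# Rough per-die cost model: wafer price spread over the good dies.
# Illustrative numbers only, not real foundry data.
def cost_per_good_die(wafer_cost, dies_per_wafer, yield_rate):
    """Cost of one working chip."""
    return wafer_cost / (dies_per_wafer * yield_rate)

# Hypothetical 28nm baseline chosen so a chip costs $15.
cost_28 = cost_per_good_die(wafer_cost=6000, dies_per_wafer=500, yield_rate=0.8)

# 20nm per the post: wafer cost doubles, 50% more dies, yield 30% lower.
cost_20 = cost_per_good_die(wafer_cost=12000, dies_per_wafer=750, yield_rate=0.8 * 0.7)

print(round(cost_28, 2))  # 15.0
print(round(cost_20, 2))  # 28.57
```

The extra dies per wafer (1.5x) don't come close to offsetting the doubled wafer cost and lower yield, so the per-chip cost nearly doubles, landing in the $27-28 range the post cites.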
Older processes, particularly as people migrate to the new ones, have so much availability that the cost goes down and yields are up. You can't make a theoretical GM110 (Titan successor) on 28nm, but you can at 20nm... so where do you dedicate your early, small number of wafers with lower yields? The highest-profit cores that aren't possible on the older process.
Low end is the worst chip to do on a new process: because there's no performance push, running a design aimed at say 1GHz clocks at 600-700MHz and lower voltage reduces leakage drastically. The biggest gains from newer processes are at the high end, where leakage is worst.
The real question is whether AMD/Nvidia will bother with 20nm for a year when they can ignore it and work straight towards 16nm. The mobile guys, Qualcomm/Samsung/Apple, sell in quantities and at profit margins that make 20nm a no-brainer. Huge graphics parts... not so much, when it's the 16/14nm drop that's bringing most of the power reduction.
The answer is they probably will, but both companies should be focused on 16/14nm, and with the power drop being the biggest feature of the 20nm-to-16nm move, I wouldn't be surprised if a 7970/680 GTX-level card comes at 20nm, but the 290/Titan-sized cores only arrive with 16nm.