Exactly. The law is clear: anything that is not required in order to operate the car may only be done when it is safe to do so.
Anything that requires you to take your eyes off the road falls into this category.
But really the issue is that the law has not been amended at all, so the assumption is that the person behind the wheel is in control and hence fully liable should anything happen.
The law specifically says you cannot watch video whilst moving, so those devices must stop playing. IMO, doing something like checking Twitter on a screen is closer to watching video than to operating the car, and when one of these cases inevitably goes to court, case law may well push the interpretation that way.
If Autopilot missed something that then caused a massive crash, the driver of the car would be liable, just the same as they are in any car with adaptive cruise control, automatic braking, lane keeping, etc. (i.e. the main functions that allow a Tesla driver to not look at the road for short periods are also available on some ICE cars; the tech isn't unique in these cases).
IMO the one thing that's going to hold back the adoption of full automation is resolving the issue of who is liable when a fully automated car ploughs into a line of kids waiting for a bus. At the moment it's the driver, even if something were to go mechanically wrong. With full automation, do we pass liability to the software company?