The only thing of any real concern, even in the professional industry, is frequency-related attenuation per meter. I have dealt with cable lengths of around 5-700 m carrying digital signals. SDI is particularly sensitive to the type of cable used, as it is both video and 4-channel audio going down BNC-terminated coax. The old MV3333C cables that were used before cannot handle the frequency (1.5 GHz) over any considerable length without serious attenuation. Cable impedance also plays a crucial role: it needs to be 75 ohm for both digital signals and old-fashioned analogue video.
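To put some rough numbers on that (figures are purely illustrative, not from any particular cable's datasheet), here is a quick sketch of how coax loss stacks up with length and frequency, assuming the usual skin-effect behaviour where attenuation per 100 m grows roughly with the square root of frequency:

```python
import math

def coax_loss_db(length_m, freq_mhz, db_per_100m_at_100mhz):
    """Skin-effect dominated coax loss grows roughly with sqrt(frequency)."""
    db_per_100m = db_per_100m_at_100mhz * math.sqrt(freq_mhz / 100.0)
    return db_per_100m * length_m / 100.0

# Hypothetical cable spec: 3.5 dB per 100 m at 100 MHz.
# 750 MHz is the half-clock (Nyquist) frequency of 1.485 Gb/s HD-SDI.
for length in (100, 300, 700):
    print(f"{length} m: ~{coax_loss_db(length, 750, 3.5):.1f} dB at 750 MHz")
```

Even with those made-up figures you can see why a 700 m run that was fine for analogue composite can be hopeless at HD-SDI frequencies.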
Using SDI as an example...
All digital data is derived and managed by a repetitive pulse train called a clock. Without it, data transitions cannot be identified coherently. The digital data can either contain the clock information embedded within it, or the clock signal can accompany the data separately. Since SDI is a single-wire transmission scheme, the clock is embedded. (AFAIK the clock is embedded with HDMI too.) Therefore, not only does cable attenuation affect recovery of the data, it seriously affects the receiver's ability to recover the clock signal so that the system can stay synchronized. This is where cable attenuation comes in: the maximum cable distance is governed by the receiver's ability to recover clock and data reliably.
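Here is a deliberately idealised little sketch of what that means (noise-free, fixed phase offset, NumPy only): the receiver gets nothing but the data waveform, so it has to work out the bit phase from the transitions themselves before it can sample the data.

```python
import numpy as np

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 200)
ui = 10                                   # samples per unit interval (one bit)
offset = 3                                # phase offset unknown to the receiver

# NRZ waveform as it might arrive: data only, no separate clock wire.
wave = np.concatenate([np.zeros(offset), np.repeat(bits, ui).astype(float)])

# Receiver side: locate data transitions and estimate the bit phase from them.
edges = np.flatnonzero(np.diff(wave) != 0) + 1
phase = int(round(np.mean(edges % ui)))

# Sample in the middle of each recovered bit period and compare with the source.
centres = np.arange(len(bits)) * ui + phase + ui // 2
recovered = (wave[centres] > 0.5).astype(int)
print("bit errors:", int(np.sum(recovered != bits)))
```

A real SDI receiver does this continuously with a PLL rather than a one-off average, but the point stands: mangle the transitions badly enough and there is nothing left to lock onto.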
Cable loss affects the amplitude of the SDI signal, while jitter affects the zero-crossing point of the data edges: the edges appear to dance back and forth with random uncertainty. There is a jitter budget allowance, but since noise and jitter effects are largely random, the bit error rate can creep up periodically and cause lost data. If the jitter budget is exceeded, data cannot be recovered at all.
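For a rough feel for how random jitter eats into that budget, here's a back-of-envelope sketch (random-jitter tail only; a real budget also has to cover deterministic jitter) that asks how often a Gaussian edge excursion wanders past the mid-eye sampling point, half a unit interval away:

```python
import math

UI = 1 / 1.485e9              # one HD-SDI unit interval, roughly 673 ps

def ber_from_rms_jitter(sigma_seconds):
    # Chance that a Gaussian edge excursion reaches past the mid-eye sampling
    # point, half a UI away (one-sided tail, random jitter only).
    return 0.5 * math.erfc((UI / 2) / (math.sqrt(2) * sigma_seconds))

for sigma_ps in (40, 60, 80, 100):
    print(f"{sigma_ps} ps RMS jitter -> BER ~ {ber_from_rms_jitter(sigma_ps * 1e-12):.1e}")
```

The cliff is what matters: the error rate stays negligible for a long while and then shoots up over a small increase in jitter, which is exactly the "fine one day, unwatchable the next" behaviour people see on marginal cable runs.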
As with analogue signals, once you have noise in the signal, it is extremely difficult and costly to remove. Jitter caused by induced noise, unstable signal sources, or poor re-clocking systems will ruin digital signals. Sometimes, basic signal attenuation effects are mistaken for signal jitter. SDI signals contain a range of low to high frequencies, just like analogue signals, and cable attenuation still affects the high frequencies most. When looking at an eye pattern (which is typically used to evaluate signal quality, including jitter), the data zero-crossing point (the risetime/falltime area) appears wider than normal. This smears the data edges and makes it look as though large amounts of jitter are present (the thick green X-shaped sections on lucid's image show the jitter range; the lower the jitter, the thinner these sections get), when in fact measurement with SDI test equipment may show the signal well within jitter specifications.
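Here's a small sketch of that effect (a toy NRZ stream with a crude single-pole low-pass standing in for the cable; the numbers are illustrative only): band-limiting alone makes the 50% crossing point wander with the preceding data pattern, which reads as a fat crossing region on an eye display even though the source timing is perfectly clean.

```python
import numpy as np

rng = np.random.default_rng(1)
bits = rng.integers(0, 2, 2000)
ui = 32                                    # samples per unit interval
x = np.repeat(bits, ui).astype(float)      # clean NRZ stream

def one_pole_lowpass(signal, alpha):
    """Crude single-pole low-pass standing in for cable HF roll-off."""
    y, acc = np.empty_like(signal), signal[0]
    for i, s in enumerate(signal):
        acc += alpha * (s - acc)
        y[i] = acc
    return y

def crossing_spread_ui(signal):
    """Spread of the 50% crossing times, measured from each data transition."""
    times = []
    for k in range(1, len(bits)):
        if bits[k] != bits[k - 1]:                       # a data edge
            seg = signal[k * ui - 1 : (k + 1) * ui]
            hit = np.flatnonzero(np.diff(np.sign(seg - 0.5)) != 0)
            if hit.size:
                times.append(hit[0] / ui)
    return max(times) - min(times)

print("clean edges:        spread", crossing_spread_ui(x), "UI")
print("band-limited edges: spread", round(crossing_spread_ui(one_pole_lowpass(x, 0.05)), 3), "UI")
```

The filtered edges spread out by a noticeable fraction of a unit interval purely from inter-symbol interference, with not a picosecond of actual clock jitter anywhere in the source.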
Also note that what we commonly refer to as a square wave isn't technically possible, as rise/fall times from ICs simply aren't fast enough; a true square wave goes from off to on, and vice versa, instantaneously. (Note that the divisions on the diagram in lucid's post show that each division is 100 picoseconds. Eye patterns for SPDIF are no different.) If you look at the datasheets for various opamps, particularly those designed for video, they show how they handle rapid rise and fall times and how long they take to settle. Depending upon the impedance load they are working into and their power supply, some opamps won't actually settle and will oscillate. This can also happen with audio amplifiers, and is quite often part of the reason why the amplifier input is frequency filtered. Too high a frequency through some amps can cause parasitic oscillations, which can quite rapidly lead to the amplifier's destruction (and can take the speakers with it too).
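For a feel for the numbers, the usual single-pole rule of thumb t_r ≈ 0.35 / BW links 10-90% rise time to analogue bandwidth; a "true" square wave would need zero rise time and therefore infinite bandwidth, which is why no real driver produces one:

```python
# Rule-of-thumb link between 10-90% rise time and analogue bandwidth
# (single-pole response assumed; purely back-of-envelope).
def bandwidth_for_risetime(t_rise_seconds):
    return 0.35 / t_rise_seconds

for tr_ps in (1000, 270, 100):
    bw_ghz = bandwidth_for_risetime(tr_ps * 1e-12) / 1e9
    print(f"{tr_ps} ps rise time needs roughly {bw_ghz:.1f} GHz of bandwidth")
```

So edges clean enough to look sharp on a 100 ps/division eye display imply several GHz of analogue bandwidth in the driver, the cable and the receiver, which is exactly where ordinary opamps and cheap cable give up.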