You do actually get differences between cables. It's more about whether the binary signal gets to the other end cleanly enough that the receiving end can tell whether it was a 1 or a 0 that was sent. The receiving device is reading the voltage at set intervals and deciding whether it's a 1 or 0 from that. But the voltage at the receiving end is what can be affected by the quality of the cable and connectors.
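To make that concrete, here's a minimal sketch in Python (my own illustration, using made-up 0V/5V logic levels and a hypothetical decide_bit function, not anything from a real receiver) of what the receiving end is effectively doing: sample the voltage at each interval and compare it to a threshold halfway between the two levels.

```python
# Minimal sketch of threshold-based bit decisions, assuming 0V/5V logic
# levels purely for illustration.
V_LOW, V_HIGH = 0.0, 5.0
THRESHOLD = (V_LOW + V_HIGH) / 2     # decide 1 above this, 0 below it

def decide_bit(sampled_voltage: float) -> int:
    """Decide whether a voltage sampled at the read interval is a 1 or a 0."""
    return 1 if sampled_voltage >= THRESHOLD else 0

# Clean samples are easy to classify...
print(decide_bit(4.9), decide_bit(0.2))   # -> 1 0
# ...but a sample caught mid-transition is basically a coin toss.
print(decide_bit(2.4), decide_bit(2.6))   # -> 0 1
```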
In reality, digital waveforms don't go from 0 to 5V in a perfect square-edged step up and down. There are curved slopes because it takes time for the signal voltage to move between each state. So a real digital signal starts to look like a partly curved waveform in places ... almost a bit like an analog signal would.
At lower resolutions, the data rate is generally such that the time spent transitioning is much smaller than the time the signal holds steady between each reading, so it's quite easy to read the signal at the other end at the timed intervals. The signal takes time to change and settle into a state .... the receiver reads it at the set interval, the signal then changes to the next state, settles, and gets read again. As you ramp up the resolution and refresh rate, the timed intervals between voltage readings shrink considerably, so the signal has to change and hold steady in a shorter space of time.
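As a rough idea of how much that reading window shrinks: HDMI's TMDS signalling puts 10 bits on each data lane per pixel clock, and the commonly quoted pixel clocks are roughly 148.5 MHz for 1080p60 and 594 MHz for 4K60 (8-bit). Treat the figures below as back-of-envelope illustration only, not a spec reference (the bit_period_ns helper is just mine):

```python
# Back-of-envelope time-per-bit on one TMDS lane. The clock figures and
# the 10-bits-per-pixel-per-lane factor are the commonly quoted ones,
# used here purely for illustration.
def bit_period_ns(pixel_clock_mhz: float, bits_per_clock: int = 10) -> float:
    """Time available per bit on one lane, in nanoseconds."""
    bit_rate_hz = pixel_clock_mhz * 1e6 * bits_per_clock
    return 1e9 / bit_rate_hz

print(f"1080p60 (~148.5 MHz clock): ~{bit_period_ns(148.5):.3f} ns per bit")  # ~0.673 ns
print(f"4K60    (~594 MHz clock):   ~{bit_period_ns(594.0):.3f} ns per bit")  # ~0.168 ns
```

So the receiver goes from having roughly two thirds of a nanosecond per bit to under a fifth of one, and the transitions have to fit inside that.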
What would happen if the cable began to affect how quickly the signal voltage can change between states? ... what happens if it starts slowing down how fast the signal can transition?
It is perfectly feasible that a cheap cable may not be of high enough quality to prevent that sort of signal degradation. The result could be that sometimes the transition doesn't complete cleanly enough in time, and when the receiver reads the voltage at the set interval it can't make out what it should be. Imagine the receiver reading the voltage of the signal, but due to a poor cable it hasn't quite gone fully from 0 to 1 ... what would the reading be? Assume a 0? Assume a 1? It won't happen on every bit, so you'll still get a picture, just with the odd bit of data missed or wrong.
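Here's a toy simulation of exactly that failure mode. It's purely my own illustration: the cable is modelled as a simple first-order low-pass, where a poorer cable means a larger time constant (slower edges), and the receiver just thresholds one sample per bit. It's not a model of any real cable or HDMI receiver.

```python
import math, random

V_LOW, V_HIGH, THRESHOLD = 0.0, 5.0, 2.5
BIT_PERIOD_NS = 0.17                  # roughly the 4K60-class bit time from above

def count_errors(bits, tau_ns):
    """Count bits misread at the far end of a 'cable' with edge time constant tau_ns."""
    voltage, errors = V_LOW, 0
    for bit in bits:
        target = V_HIGH if bit else V_LOW
        # The line voltage relaxes toward its target during one bit period,
        # but may not get all the way there before the receiver samples it.
        voltage = target + (voltage - target) * math.exp(-BIT_PERIOD_NS / tau_ns)
        decoded = 1 if voltage >= THRESHOLD else 0
        errors += decoded != bit
    return errors

random.seed(1)
bits = [random.randint(0, 1) for _ in range(10_000)]

print("good cable (fast edges):", count_errors(bits, tau_ns=0.05), "errors")  # expect none
print("poor cable (slow edges):", count_errors(bits, tau_ns=0.30), "errors")  # expect some
```

With the fast edges every bit settles comfortably past the threshold before it's read; with the slow edges most bits still get through, but the ones that come right after a long run of the opposite value don't make it across the threshold in time ... which is the "mostly works but glitches" behaviour.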
A higher quality cable may prevent that sort of thing happening, thanks to better connectors, conductor material and insulation.
Indeed, this sort of cable difference in a digital environment is well established and accepted ... for example:
Ethernet: Cat 5, Cat 5e, Cat 6 ... it's well known that different cable categories can achieve different network speeds.
HDMI: a basic cheap HDMI cable may well do 1080p 60Hz fine for the TV, but struggle and fail at 4K 60Hz ... which is why you get the higher-specced high speed cables. I have direct experience of this: a cheap cable initially connected at 4K fine and provided a picture, but when it was actually asked to play content, it began to stutter and fail.
So overall, for high speed digital it's perhaps better to think of signal quality as a bar to be reached: once a cable reaches it there is nothing more to be gained from a higher spec one, but a lower spec cable just might not reach the bar in the first place.
Equally, it's possible that a longer high quality cable will perform better than a shorter, lower quality cable, simply because it degrades the signal travelling down it less.
--------------------
Does any of that make sense?