The time it took from clicking the link requesting this page to the page actually being displayed is latency.
Latency is the time gap between an action (input) and the outcome, or the time data takes to travel from point A to B. The longer the latency, the longer the time people have to wait for applications to load.
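That "time gap between an action and the outcome" is easy to measure yourself. The sketch below is a toy illustration, not a real benchmark: `simulated_action` is a hypothetical stand-in for any operation you wait on (a page load, a database query), and the timing simply brackets the call.

```python
import time

def simulated_action(delay_s: float) -> str:
    """Hypothetical stand-in for any operation whose outcome we wait on."""
    time.sleep(delay_s)  # pretend the work takes this long
    return "done"

# Latency = time between triggering the action and seeing the outcome.
start = time.perf_counter()
simulated_action(0.05)  # assume the outcome takes ~50 ms to arrive
latency_ms = (time.perf_counter() - start) * 1000
print(f"Observed latency: {latency_ms:.0f} ms")
```

In a real application you would wrap a network request or disk read in the same way; the observed figure is always at least the true delay, plus whatever overhead the measurement itself adds.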
Latency is therefore a factor in most modern technologies and systems, from data centres, networks, satellite communications, and audio, to the internet and beyond.
Why does latency occur? Largely because of the physical distance between two, or more, systems. Even when information travels at close to the speed of light, distance still imposes a delay, because the speed of light is a hard physical limit that no amount of engineering can exceed.
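You can put rough numbers on that physical limit. The sketch below is a back-of-the-envelope calculation, not measured data: it assumes light in optical fibre travels at roughly two-thirds of its speed in a vacuum, and the London-to-New York distance is an approximate great-circle figure.

```python
# Speed of light in a vacuum, and an assumed ~2/3 of that in optical fibre.
SPEED_OF_LIGHT_KM_S = 299_792
FIBRE_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3

def one_way_delay_ms(distance_km: float) -> float:
    """Theoretical minimum one-way delay over fibre, in milliseconds."""
    return distance_km / FIBRE_SPEED_KM_S * 1000

# London to New York is roughly 5,570 km as the crow flies.
print(f"{one_way_delay_ms(5570):.1f} ms")  # ≈ 27.9 ms one-way
```

Real-world latency is always higher than this floor, since cables do not follow straight lines and routers, switches, and servers each add processing delay, but it shows why distance alone guarantees a measurable wait.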
How do you reduce latency? The closer the data sits to where it is consumed, the lower the latency will be. This, for example, is one of the main arguments used in the data centre space to justify the need for edge data centres.
Data centres that serve financial hubs, like the City of London, are usually located close to the hub itself so latency is kept to a minimum.