Techsplained @FE: What is edge computing? How it works and why we need it
In the late 2000s, high-speed data access and seamless connectivity allowed companies to run their workloads on remote servers. Organisations could scale capacity at the click of a button, and a pay-as-you-go model developed. The cloud provided a much-needed boost to new-age start-ups. Now, any developer could create an app, run it on Amazon, Microsoft or Google's servers and make millions or billions in the process. The cloud reduced the cost of running a business. But as cloud computing becomes pervasive and more economies enter the realm of 5G, data footprints are growing beyond what a centralised cloud can efficiently handle. This has led to another age of computing. Instead of going bigger, companies are trying to go smaller.
What is edge computing?
Edge computing means processing data at the edge of the network, usually close to where it is generated. Instead of being sent to a distant cloud, data is processed on the device itself or by a nearby computer or server.
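To make the idea concrete, here is a minimal sketch in Python. It is purely illustrative: the endpoint URL, function names and sample readings are all assumptions, not a real system. The point it shows is that the device summarises its own data where it is generated and only a small result ever crosses the network to the cloud.

```python
# Illustrative sketch of the edge idea (hypothetical names throughout):
# a device produces sensor readings, processes them locally, and sends
# only a compact summary to a remote cloud endpoint.

import statistics

CLOUD_ENDPOINT = "https://cloud.example.com/ingest"  # placeholder URL

def read_sensor_batch():
    """Stand-in for raw readings produced on the device itself."""
    return [21.4, 21.6, 35.9, 21.5, 21.7]  # e.g. temperature samples

def process_at_edge(readings, threshold=30.0):
    """Process data where it is generated: filter outliers, compute a summary."""
    clean = [r for r in readings if r < threshold]  # discard noise locally
    return {"count": len(clean), "mean": round(statistics.mean(clean), 2)}

def send_to_cloud(summary):
    """Only the small summary leaves the device, not every raw sample."""
    print(f"POST {CLOUD_ENDPOINT} -> {summary}")  # real code would make an HTTP call

if __name__ == "__main__":
    summary = process_at_edge(read_sensor_batch())
    send_to_cloud(summary)
```

In a pure cloud setup, every raw reading would be uploaded and processed remotely; in the edge version sketched here, the heavy lifting happens on or near the device, which is what cuts bandwidth use and response times.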