Introduction to Edge Computing
[Image source: https://devops.com/is-your-infrastructure-ready-for-edge-computing/]

What is Edge Computing?

Edge computing brings computation and data storage closer to the devices where data is being gathered, rather than relying on a central location that can be thousands of miles away. This is done so that data, especially real-time data, does not suffer latency issues that can hurt an application's performance. In addition, companies can save money by processing data locally, reducing the amount of data that needs to be sent to and processed in a centralized location. Edge computing was developed in response to the exponential growth of IoT devices, which connect to the internet either to receive information from the cloud or to deliver data back to the cloud.

Why does edge computing matter?

Before edge computing, a smartphone scanning a person's face for facial recognition would need to run the facial recognition a...
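The cost-saving idea above can be made concrete with a small sketch. This is a hypothetical example (the function and field names are illustrative, not from any real edge framework): an edge node aggregates raw sensor readings locally and forwards only a compact summary to the cloud, so most of the data never crosses the network.

```python
# Hypothetical sketch: local aggregation at the edge, assuming a simple
# sensor that produces numeric readings and a cloud backend that only
# needs periodic summaries rather than every raw sample.
from statistics import mean

def summarize_readings(readings, threshold=75.0):
    """Process raw readings on the edge device; return only the small
    summary that would be uploaded to the cloud."""
    return {
        "count": len(readings),             # how many raw samples stayed local
        "mean": round(mean(readings), 2),   # aggregate statistic
        "max": max(readings),               # peak value seen at the edge
        "alerts": sum(1 for r in readings if r > threshold),  # threshold breaches
    }

# 1,000 raw readings are generated and processed on the device...
raw = [70.0 + (i % 10) for i in range(1000)]
summary = summarize_readings(raw)
# ...and only this small dictionary is sent upstream.
print(summary)
```

Here the edge node turns 1,000 samples into a handful of numbers before anything leaves the device, which is exactly the latency and bandwidth win the paragraph describes.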