By IT Brew Staff
Definition:
Edge computing is a more efficient alternative to processing and analyzing data in a far-off cloud data center. By processing data closer to where it is generated and stored (such as on networked IoT devices), edge computing reduces bandwidth costs and latency.
Computing on the edge. As the name implies, edge computing takes place at “the edge” of a network, using a mix of cloud and physical infrastructure. There’s also a cybersecurity benefit: because data processing and storage are decentralized, a successful attack exposes less data, which can significantly reduce its impact.
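To make the bandwidth point concrete, here’s a minimal sketch, assuming a hypothetical temperature sensor and upload function (not from the article), of an edge node that summarizes raw readings locally and sends only a compact summary to the cloud:

```python
import random
import statistics

def read_sensor_batch(n=100):
    """Simulate a batch of raw temperature readings from a local IoT sensor."""
    return [20.0 + random.gauss(0, 0.5) for _ in range(n)]

def summarize_at_edge(readings):
    """Reduce a raw batch to a small summary before it leaves the edge device."""
    return {
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "min": round(min(readings), 2),
        "max": round(max(readings), 2),
    }

def send_to_cloud(payload):
    """Stand-in for an upload to a cloud endpoint (e.g., over MQTT or HTTPS)."""
    print(f"uploading {payload}")

if __name__ == "__main__":
    batch = read_sensor_batch()
    # 100 raw readings stay on the edge device; only a handful of numbers travel upstream.
    send_to_cloud(summarize_at_edge(batch))
```

The same pattern scales down to latency: decisions based on the raw readings can be made on the device itself, without waiting on a round trip to a distant data center.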
Edge computing could play a major role in emerging technologies such as autonomous driving. To respond to rapid changes in road conditions, driverless cars will need to process data within the car’s own IT infrastructure, rather than in a far-off data center. Healthcare devices and factory machines can also benefit from innovations in edge computing.
Edge computing driven by AI, Forbes writes, is the “foundation of modern business innovation, fueling smarter operations, enhanced customer experiences, and entirely new business models.”
The publication said that enterprises racing to implement edge computing face the challenge of closing the gap between the technology’s potential and the value they have actually realized. It argued that these technologies are no longer futuristic concepts but the foundation of today’s enterprise tech stack, and that IT staffers must figure out how to integrate them effectively (and maybe even profitably).

