In the early days of computing, centralization was key. Computers were so large it couldn't be any other way. The machines themselves were locked away in dust-free rooms, and users interacted with them via the hallowed operator and, later, via terminals. In later decades, as miniaturization shrank the average computer, computing became more decentralized; applications ran on PCs sitting on or under desks.
In the last few years, the pendulum has swung back towards centralization. Although we all have computers in our pockets more powerful than even the most ambitious desktop PCs of a decade ago, much of the actual computing happens on public cloud platforms and dedicated servers in data centers far away from users.
With another turn of the wheel, it seems like we’re once again heading away from centralization towards what’s been dubbed edge computing. Edge computing is the idea that, in the future, more of our computing will be done by autonomous smart devices that can communicate with each other over local networks, but aren’t necessarily dependent on centralized control.
As is often the case, the growing importance of edge computing has been cast as a battle between centralization and distribution, but the reality is more complex.
The key driver of edge computing is the Internet of Things. In the next few years, billions of sensor-, processor-, and network-equipped smart devices will be released into the world. Just about every electronic device you can think of will be smartened up, as will products that previously had little to do with computing, like clothes.
Much of the benefit of ubiquitous smart devices depends on real-time feedback, and delivering real-time information effectively requires a degree of autonomy: the latency of a round trip to a distant data center is often more than real-time applications can tolerate. Hence the increased interest in decentralized edge computing.
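To make the latency argument concrete, here is a small Python sketch that compares a decision made locally on the device with the same decision routed through a simulated cloud round trip. The 50 ms deadline, the temperature threshold, and the 100 ms simulated network delay are all illustrative assumptions, not figures from any real product or study.

```python
# Illustrative comparison: local (edge) decision vs. simulated cloud round trip.
# All thresholds and delays below are assumptions chosen for the example.

import time

LATENCY_BUDGET_S = 0.050  # 50 ms: a hypothetical real-time deadline


def decide_locally(reading: float) -> str:
    """Edge logic: a trivial threshold check, answered in microseconds."""
    return "open_vent" if reading > 30.0 else "hold"


def decide_via_cloud(reading: float) -> str:
    """Simulated cloud path: the sleep stands in for a ~100 ms WAN round trip."""
    time.sleep(0.100)
    return "open_vent" if reading > 30.0 else "hold"


def measure(decide, reading: float) -> tuple[str, float]:
    """Run a decision function and report how long it took."""
    start = time.perf_counter()
    action = decide(reading)
    return action, time.perf_counter() - start


if __name__ == "__main__":
    for name, decide in (("edge", decide_locally), ("cloud", decide_via_cloud)):
        action, elapsed = measure(decide, 31.5)
        verdict = "within budget" if elapsed <= LATENCY_BUDGET_S else "too slow"
        print(f"{name}: {action} in {elapsed * 1000:.1f} ms ({verdict})")
```

Run as written, the local path comfortably meets the deadline while the simulated cloud path misses it, which is the whole case for pushing time-sensitive decisions to the edge.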
Edge computing is here to stay, and it’s likely to become more important over the next few years, but that doesn’t mean centralized cloud computing or the traditional bare metal server will go away any time soon. Edge computing is useful in many scenarios, but much of the benefit of smart devices only materializes when data is aggregated and analyzed. As InformationWeek writer Andrew Froehlich puts it:
“Yes, edge computing can indeed provide performance benefits such as real-time computation and a reduction of dependence on network connectivity into the cloud. But at the same time, you give up a great deal that makes cloud computing so appealing. This includes better overall management of applications and data since they are centralized.”
A local smart device without a connection to the cloud might be able to tell you that it's raining and that it rained three times last week, but it won't be able to tell you whether it'll rain tomorrow, because accurate forecasting demands centralized analysis of data gathered from far beyond a single location.
Another much-hyped trend in the technology space is machine learning: the application of self-learning algorithms to large data sets. Machine learning can spot patterns that humans can't, but to do so it needs large data sets and an awful lot of number crunching. As machine learning becomes increasingly important, the computing power required will only grow, and there's little chance that smart devices at the edge will have the requisite resources.
Rather than replacing centralized computing, edge computing will complement it: devices at the edge will collect data and carry out preliminary processing before sending it to centralized cloud platforms for deeper analysis. Any insights derived from that analysis can then flow back to influence the behavior of devices at the edge. Rather than a mesh of fully autonomous devices, the computing model of the near future will rely on both the edge and the center.
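As a simplified illustration of that division of labor, here is a minimal Python sketch of a hypothetical edge device: it samples a local sensor, reduces the raw readings to a compact summary, and forwards only that summary to the cloud. The sensor function, the one-minute window, and the `CLOUD_ENDPOINT` URL are illustrative assumptions; a real deployment would more likely use an IoT protocol such as MQTT.

```python
# A minimal sketch of the edge-plus-cloud pattern described above.
# The sensor source, aggregation window, and cloud endpoint are hypothetical.

import json
import random
import statistics
import time
import urllib.request

CLOUD_ENDPOINT = "https://example.com/ingest"  # placeholder URL
WINDOW_SECONDS = 60                            # aggregate locally for a minute


def read_sensor() -> float:
    """Stand-in for a real temperature sensor reading."""
    return 20.0 + random.random() * 5.0


def collect_window(seconds: int) -> list[float]:
    """Edge work: sample the sensor locally, once per second."""
    readings = []
    end = time.time() + seconds
    while time.time() < end:
        readings.append(read_sensor())
        time.sleep(1)
    return readings


def summarize(readings: list[float]) -> dict:
    """Preliminary processing at the edge: reduce raw samples to a summary."""
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
        "min": min(readings),
    }


def send_to_cloud(summary: dict) -> None:
    """Forward the compact summary, not the raw stream, for deeper analysis."""
    request = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=json.dumps(summary).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request, timeout=10)


if __name__ == "__main__":
    while True:
        window = collect_window(WINDOW_SECONDS)
        try:
            send_to_cloud(summarize(window))
        except OSError:
            # No connectivity: the edge device keeps working autonomously
            # and could buffer summaries until the link returns.
            pass
```

The design choice worth noting is that the raw sensor stream never leaves the device; only aggregates cross the network, which is what keeps the edge useful even when the cloud is slow or unreachable.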