Imagine a data logger chugging away at someone else’s desk halfway around the world, creating row after row of data 100 columns wide. Someone at that same office asks you for the largest number in column B even though you’re thousands of miles away. You are acting as the cloud, so the entire file gets sent to you. You have to acquire the document, do the filtering, and send the answer back to the requestor, consuming a large amount of bandwidth to download and process all that information from far away.
With edge computing, that data logger is hooked up to a computer on the same desk. When someone local needs the largest number in column B, they send a request over the network, the computer runs a script to process the data and gives the requestor the single number they want, and then the computer sends that number up to you for safekeeping and to log the request. While your computer is more powerful than the one sitting next to the data logger, it is much cheaper to transmit that massive amount of data to a computer a few feet away than to send it all to you, and the requestor typically gets their answer faster. The only reason you are involved at all is to hold onto that value as a backup in case something goes wrong at the network level. As the cloud, you are aware of the request and the result, but all that data stayed within the local network. Beyond the cost of sending data, you can probably see how people are more comfortable from a security standpoint as well when more data stays local.
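The flow above can be sketched in a few lines of Python. This is a minimal illustration, not a production edge service: the function names and the CSV sample are made up for the example, and the "report to cloud" step is a stand-in for whatever small message (MQTT, HTTPS, etc.) a real deployment would send.

```python
import csv
import io

def largest_in_column(csv_text: str, column: str = "B") -> float:
    """Scan the logger's CSV output locally and return the largest
    value in the requested column. All rows stay on the LAN."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return max(float(row[column]) for row in reader)

def report_to_cloud(value: float) -> None:
    # Placeholder: in practice this would be one small message to the
    # cloud, sent only for backup and to record that the request happened.
    print(f"cloud backup: {value}")

# Simulated logger output: a header row plus a few readings.
sample = "A,B,C\n1.0,4.2,9\n2.0,7.5,3\n3.0,6.1,8\n"

answer = largest_in_column(sample, "B")
report_to_cloud(answer)  # only this single number ever leaves the local network
```

The key point is the asymmetry: the edge computer iterates over every row, but only one number crosses the network boundary to the cloud.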
In fact, security often drives the decision to enable edge computing in a network more than the cost savings of transmitting less data to and from the cloud. At fractions of pennies a packet, it takes a lot of data cost savings to justify adding thousand-dollar servers or even standard computers to your network. Yet, with data becoming a higher liability and powerful single board computers (SBCs) decreasing in cost, the best solution is often to bring at least some processing back to the local network. Transmitting to a single board computer acting as a local server is no different than transmitting to any of the various sensors or devices on the local network that collect and transmit data, but now a $35 Raspberry Pi with some extra memory is improving network response time, storing and processing sensor information locally behind your firewall, and saving you hundreds of cloud messages a day.
That solution may sound too good to be true – if it’s such a great compromise, why is cloud computing so prevalent today? In reality, the processing needs of a network tend to be much more demanding than filtering a spreadsheet. Cheaper SBCs like the Raspberry Pi 3 and BeagleBone Black are fine for typical home use where only light processing is needed. For larger scale networks, it has traditionally been cheaper and faster to transmit data to the cloud, where it can be processed on extremely powerful computers, than to shoulder the cost of building local server farms. With the introduction of powerful x86 single board computers like the Intel Joule, edge computing has become a realistic option for significantly more applications. Now that the investment for such powerful SBCs is only a few hundred dollars, many networks use a blend of local and cloud processing: sensitive or timing-critical data is handled by a local computer and all other data is handled through cloud services, maximizing security and speed while minimizing system cost and risk exposure.
Inspired? Arrow’s IoT services can help you design a custom solution with your own unique blend of local and cloud based resources!