Trying to foresee the future almost never goes to plan: most technological predictions either fail outright or miss their estimated timelines by a wide margin. In this article, we will look at computing trends, where they may lead, and what the future of desktop electronics could look like.
Introduction
The electronics industry is unique in that it touches almost every aspect of everyday life: it improves other industries and may eventually replace much human labor. No one could have imagined what the electronics industry would become when the first valve (vacuum tube) was created, and no one could have predicted that electronic switches approaching the size of atoms would ever be made.
This integration of electronics into everyday life has dramatically changed how we live and what we consider essential. The only electronics most people encountered a century ago were radios, lighting, and telephones, and even then, most could do without such devices. Now, electronics provide instantaneous communication across the planet, enable the trading of millions of stock shares every second, deliver on-demand video content, and hold most of the information ever gathered by humanity. Anyone trying to remove electronics from their life today would likely find it impossible, which shows just how deeply integrated electronics have become.
How electronics moved to the home
The advances made in electronics have not only made them smaller and more efficient but also cheaper. Early equipment such as scanners and printers would only be found in the office due to their price and size. Some individuals may have been able to afford a home office, but this would be limited to those with the money who had a dire need for such equipment. As technology progressed, office equipment became smaller and more affordable and entered the home market.
The same occurred with computers; the first machines were the size of rooms and only ever found in high-end businesses and research centers that had a need for computational devices. As the first computers were extremely expensive and large, terminal systems were developed that enabled individual users in an office to access the central computer (i.e., mainframe) and use its processing power remotely.
Eventually, technological development reduced the size of a computer from an entire room to a single piece of silicon, and computers that stayed the size of a room exponentially increased in capability. This reduction in transistor size helped bring computers into the home, which further led to the development of entire industries centered around computers. Thanks to the ability to reduce the price and size of electronics, the world eventually saw the rise of the internet, software giants, games, and mobile applications.
Where are electronics now?
A lot has changed in the electronics and computing industries compared with two decades ago. One of the biggest changes is the move to cloud-based systems, whereby data is both processed and stored in remote data centers instead of locally on devices. While this has the disadvantage of requiring an internet connection, it enables multiple devices to be synced with ease, allows data to be backed up easily, and makes the same data accessible anywhere in the world.
From the concept of cloud computing (i.e., internet computing) came the internet of things (IoT). The idea behind IoT is that simple devices can also be connected to the internet, gathering data on their environment and allowing equipment to be controlled remotely. The difference between an IoT device and a standard internet-capable device (such as a personal computer) is that an IoT device typically measures sensory data and reports its readings to a remote server, while a standard internet-capable device runs browsers and other software to view content, interact, and display media. Furthermore, IoT devices are extremely low-cost compared with standard internet-enabled devices, making them ideal for mass deployment.
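As a rough illustration of this report-to-server pattern, the sketch below packages a sensor reading as JSON and POSTs it over HTTP using only Python's standard library. The `ENDPOINT` URL and payload fields are illustrative assumptions, not a real service; many real deployments would also use a lighter protocol such as MQTT.

```python
import json
import time
import urllib.request

# Hypothetical collection endpoint -- a real deployment would use its
# vendor's or broker's own URL (and often MQTT rather than plain HTTP).
ENDPOINT = "http://example.com/telemetry"

def make_reading(sensor_id: str, value: float) -> dict:
    """Package a single sensor measurement as a small JSON-friendly record."""
    return {"sensor": sensor_id, "value": value, "timestamp": time.time()}

def report(reading: dict, endpoint: str = ENDPOINT) -> None:
    """POST one reading to the remote server (performs a real network call)."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(reading).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

# Example usage (would contact the network, so not executed here):
#   report(make_reading("temp-01", 21.5))
```

On a microcontroller-class device the same loop would run against a much smaller HTTP or MQTT client, but the shape is identical: measure, package, report.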
The development of the IoT industry led industrial companies to develop the industrial internet of things (IIoT). Simply put, manufacturers and suppliers of industrial equipment realized that IoT devices used in an industrial environment could provide a wealth of information that can dramatically improve the efficiency and performance of an industrial setup. Combining IIoT with AI creates a closed-loop industrial environment that can self-monitor and recognize when processes are failing and require maintenance.
Cloud computing has also led to the development of web applications and cross-platform programming. One of the biggest challenges since the first PCs has been that operating systems from different manufacturers generally do not work well with each other. An application compiled for one system may not run on another (such as moving from Windows to Mac), requiring multiple compilations of the same program. Furthermore, programs designed around an operating system's specific framework (such as .NET) can operate only on that system. Web applications that run in-browser sidestep this problem: any device with a browser can access and run complex applications such as word processors and spreadsheets.
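To illustrate why in-browser applications are inherently cross-platform, here is a minimal sketch using only Python's standard library: one server process delivers a single HTML page whose logic runs as JavaScript in any visitor's browser, with no per-OS compilation. The page content and port are illustrative.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# A single HTML page with inline JavaScript: the "application" runs in any
# browser, regardless of the visitor's operating system.
PAGE = b"""<!DOCTYPE html>
<html>
  <body>
    <h1>Cross-platform notepad</h1>
    <textarea id="doc" rows="10" cols="60"></textarea>
    <script>
      // Persist the text in the browser's local storage -- no native install.
      const doc = document.getElementById('doc');
      doc.value = localStorage.getItem('doc') || '';
      doc.addEventListener('input', () => localStorage.setItem('doc', doc.value));
    </script>
  </body>
</html>"""

class AppHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the same bytes to every client; the client's browser does the rest.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

# To serve the app to any browser on the network (not executed here):
#   HTTPServer(("", 8000), AppHandler).serve_forever()
```

The server never needs to know whether the client is running Windows, macOS, or Linux; the browser is the common runtime.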
What aspects of electronics are becoming important?
The past few decades have seen device capability and accessibility at the core of what consumers and researchers care about: CPU power and speed, memory size, internet connection speed, and the number of polygons per second a GPU can render. However, the next decade in electronics is going to see a dramatic shift toward security and data protection.
With rising reports of hacking, ransomware, and cyberattacks, there is no doubt that consumers are increasingly concerned about internet-connected devices. While such devices allow us to stream our favorite TV shows, access bank accounts, and pay bills, the possibility that someone may be gathering personal data to sell to advertisers or to spy on one's activity is becoming a genuine concern.
Such concerns can be seen in legislation being introduced around the world. For example, the U.K. government has already introduced its own IoT laws, which require such devices to let users wipe personal information, prohibit common default passwords, and mandate secure, trusted connections when personal data is communicated across the internet. Other examples include Western nations such as the U.S. and U.K. banning Chinese-made equipment from key infrastructure such as cellular networks.
How could electronics change in the future?
To understand where the future of desktop electronics will go, we need to understand how current changes in electronics are affecting everyday life. Of all the electronic devices ever created, the smartphone is arguably the one that has been adopted the fastest. Stop someone in the street and ask what electronics they own: there is a good chance they have a laptop or desktop computer, but it is almost guaranteed that they have a smartphone.
While desktop processors continue to increase in performance and core count, it is the mobile SoC sector that is changing fastest. With integrated AI co-processors and GPUs capable of 4K output, some mobile SoCs now rival many desktop computers in performance. This means the next few years will start to see smartphones that can, in theory, replace an entire computer.
Considering how the price of smartphones continues to fall relative to their processing capability, it is very likely that users will shift to using a single device for all their needs. An advanced smartphone could be mounted on a docking station that converts it into a full-fledged PC, removed when on the go, and mounted in a vehicle to power any in-vehicle apps. The same device could be brought into work and docked to serve as an office computer, then carried onto an industrial site to interact with machinery via a cross-platform web app.
The use of a single device would see individuals invest more in that device, as no additional devices are needed, and the reliance on one device could help reduce e-waste, which is a further incentive for adoption. Such devices may also become modular in design to allow future expansion (e.g., increased processor performance and memory size), removing the need to recycle the device when a new model is released.
Mainframe vs. cloud
The use of cloud computing has exponentially increased and now allows basic remote devices to take advantage of large-scale data centers. For example, converting spoken words to text is a resource-intensive task that an IoT device may not be able to perform on its own; a recorded conversation can instead be sent to a remote data center for processing and the result returned to the IoT device. However, if privacy and trust continue to grow in importance, users may want to move away from cloud computing so that personal data can be better protected and controlled.
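The privacy trade-off described above can be sketched as a simple routing decision: process locally when the data is sensitive (or the network is unavailable), and offload to the data center otherwise. The transcription functions below are placeholders, not real speech-to-text implementations.

```python
def transcribe_local(audio: bytes) -> str:
    # Placeholder for an on-device model -- typically slower or lower
    # quality, but the audio never leaves the device.
    return "<local transcript>"

def transcribe_cloud(audio: bytes) -> str:
    # Placeholder for an upload to a remote data center's speech service.
    return "<cloud transcript>"

def transcribe(audio: bytes, private: bool, has_network: bool) -> str:
    """Route resource-intensive work to the cloud only when it is allowed."""
    if private or not has_network:
        return transcribe_local(audio)
    return transcribe_cloud(audio)
```

The same routing logic generalizes to any offloadable task: the `private` flag is exactly where a user's privacy preference (or a home-data-center policy) would plug in.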
At the same time, computing technology will continue to improve, and the next decade or two will see personal computers with exceptional capabilities. In fact, it may even be possible for future computers to take on the role of a personal data center at home whereby a single machine can service all users via terminal links. Thus, instead of each individual in a home needing their own computer, a single computer can stream a desktop environment to different terminals, which consist of just a monitor, mouse, and keyboard.
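A toy version of this one-machine/many-terminals arrangement can be sketched with Python's standard library: a threaded TCP server stands in for the home mainframe, and each connecting "terminal" gets its own session thread. A real system would stream a full desktop environment; here each session just receives a text acknowledgment, and the host, port, and protocol are all illustrative.

```python
import socket
import socketserver
import threading

class SessionHandler(socketserver.StreamRequestHandler):
    """Each connected terminal is served by its own thread on the central machine."""
    def handle(self):
        # The terminal identifies itself with one line; the mainframe replies.
        name = self.rfile.readline().strip().decode()
        self.wfile.write(f"session started for {name}\n".encode())

def serve_once(port: int = 0) -> int:
    """Start the 'mainframe' in a background thread; return its listening port."""
    server = socketserver.ThreadingTCPServer(("127.0.0.1", port), SessionHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server.server_address[1]

def connect(port: int, name: str) -> str:
    """A minimal 'terminal': open a session and read the mainframe's reply."""
    with socket.create_connection(("127.0.0.1", port)) as s:
        s.sendall(name.encode() + b"\n")
        return s.makefile().readline().strip()
```

Because the server is threaded, several terminals can hold sessions at once, which is the essential property of the mainframe-and-terminals model described above.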
If such a computer were developed, it could see individual users hosting their own personal cloud and data center. All devices owned by a user could instead use the home data center to process resource-intensive tasks, which provides both privacy and trust: privacy, because all data gathered by the system is stored in the user's home, and trust, because the data center gathering the data is owned by the user. This ability to place trust in a home mainframe would also enable smart homes that use entirely internal connections between devices and the mainframe.
Conclusion
The role of smartphones will likely continue to change, and they may eventually replace entire PCs. A portable computing platform that merely needs to connect to a docking station would see users invest more in their smartphone while spending less on home computer equipment.
The increase in processor power and memory size could also see the rise of personal data centers, which enable trust and privacy. If such a system were combined with cloud computing, it could lead to a world in which control of personal data is finally returned to the user in its entirety while still letting the user travel anywhere in the world and access their data.