3 IoT Myths to Take With a Grain of Salt

The conceptual framework known as the Internet of Things is still in its early years.

The network of the future is already here. The Internet of Things (IoT), a conceptual framework that sat on the technological horizon for years, is now becoming reality: the number of connected devices surpassed the number of people living on Earth in 2008.

For now, the “things” connecting to the Internet are still largely personal devices, such as smartphones, tablets and notebook computers, but that is changing quickly, even exponentially. According to Cisco Systems, digital infrastructure is being adopted at five times the rate that electricity and telephony were in their early stages, and the number of connected devices is projected to grow to 50 billion by 2020, an average of six online objects per person. Those objects won’t be limited to personal devices: as sensors become less expensive, engineers are developing more applications to usefully process the data they produce. Ultimately, any object that creates data valuable to enterprises and consumers will be connected to the Internet.

Contrary to the lifeless connotation of the word “thing,” IoT can connect inanimate objects and living beings alike. The example consumers are probably most familiar with is the increasing popularity of devices that track health and fitness metrics, such as connected pedometers, heart-rate monitors, bike and run trackers, and smartwatches.

It is important to remember, however, that the connectivity needed for IoT to function is still emerging. The ability of connected devices to gather and share data is already changing which types of devices communicate over networks, and in the very near future it will vastly increase the amount of data being shared and stored. Network managers must be prepared for this shift, which in part means understanding what IoT is and what it isn’t.

Perhaps because this technology is still emerging (and because it is the source of so many diverse use cases), the Internet of Things is the subject of many misconceptions. What follows are three common myths about IoT and explanations of why they’re wrong.

Myth No. 1: IoT will make data centers irrelevant because all of the processing will take place at or near the endpoints.

It is true that computing at or near endpoints will be necessary to handle the billions of new connections, and the flood of new data, that the Internet of Things will create.

Partly, this will be accomplished by moving some computing to routers, which will handle data that doesn’t need to be processed at a data center. For example, if a sensor is placed on a machine to detect when the failure of a certain part is imminent, that sensor will constantly produce data. But most of this data will simply be the repeated message that the part is still working fine.

There’s no need, in most cases, for those messages to be sent to a data center, when they can instead be handled by a router that is positioned closer to the sensor. Only when the sensor is reporting trouble will the data be transmitted to an organization’s central data center or the cloud.
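
To make the pattern concrete, here is a minimal Python sketch of threshold-based filtering at the network edge. The threshold, function names and alert path are hypothetical stand-ins; a real deployment would publish alerts over a protocol such as MQTT or HTTPS.

```python
# Minimal sketch of edge filtering: logic running on or near a router drops
# routine "all is well" readings and forwards only anomalies upstream.
FAILURE_THRESHOLD_C = 90.0  # hypothetical overheating limit, degrees Celsius

def forward_to_data_center(temperature_c: float) -> None:
    # Placeholder for a real uplink (e.g., an MQTT publish or HTTPS POST).
    print(f"ALERT: part at {temperature_c:.1f} C, escalating to data center")

def handle_reading(temperature_c: float) -> None:
    """Process one sensor reading at the network edge."""
    if temperature_c >= FAILURE_THRESHOLD_C:
        forward_to_data_center(temperature_c)  # rare: trouble detected
    # Routine readings are dropped here and never cross the WAN.

# Example: only the last reading generates traffic to the data center.
for reading in (71.2, 72.0, 94.5):
    handle_reading(reading)
```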

Rather than becoming irrelevant, data centers will be instrumental in facilitating IoT, and they will need to adapt in order to accommodate this emerging technology.

Myth No. 2: IoT won’t reach large-scale deployment because of a lack of bandwidth.

The prospect of billions of new devices producing data that must be transmitted and processed in real time makes some observers nervous about a bandwidth drought.

It shouldn’t.

First, many IoT-connected sensors will collect only minute amounts of data at a given time. Consider a sensor on a piece of industrial equipment that records the temperature of a part in order to alert maintenance staff to overheating. Now compare the connection capacity needed for that one data point to the amount of bandwidth required for current widespread applications such as video streaming and peer-to-peer file sharing. It’s a relative drop in the bucket.
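
To put rough numbers on that comparison, the back-of-the-envelope Python calculation below contrasts a once-per-minute temperature reading with a single HD video stream. The payload size and bit rate are illustrative assumptions, not figures from Cisco.

```python
# Back-of-the-envelope bandwidth comparison (illustrative numbers only).
SENSOR_PAYLOAD_BYTES = 50    # one timestamped temperature reading
READINGS_PER_HOUR = 60       # one reading per minute
HD_VIDEO_BPS = 5_000_000     # a typical HD stream, about 5 Mbit/s

sensor_bps = SENSOR_PAYLOAD_BYTES * 8 * READINGS_PER_HOUR / 3600  # ~6.7 bit/s

print(f"Sensor: {sensor_bps:.1f} bit/s")
print(f"HD video: {HD_VIDEO_BPS:,} bit/s")
print(f"One HD stream ~= {HD_VIDEO_BPS / sensor_bps:,.0f} such sensors")
```

Even allowing generous protocol overhead, a single video stream consumes the bandwidth of hundreds of thousands of such sensors.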

Obviously, the data adds up when thousands of sensors are all transmitting readings. But much of this data will be processed at or near the point of collection, dramatically reducing the bandwidth needed to carry data back and forth to central data centers.

Networking professionals will have time to prepare for the influx of new data, as well. Cisco predicts that, because of their low-bandwidth nature, machine-to-machine applications will represent less than 3 percent of global IP traffic by 2018.

Myth No. 3: IoT’s lack of standards will make for a Wild West environment that limits wider adoption.

It is true that the industry has not reached consensus on standards for IoT devices, but this is not surprising for an emerging technology. Going back only a few years, the Blu-ray and HD DVD formats competed for dominance, until Blu-ray ultimately prevailed. Before that, VHS beat out Betamax. Such technology wars date back to at least the 1800s, when Thomas Edison and George Westinghouse battled in the “War of Currents” to determine whether Edison’s direct current or Westinghouse’s alternating current would become the standard for electricity.

Similarly, enterprises and consumers can be confident that industry standards for IoT will emerge, though it may take some time for a dominant standard to win out. Already, a number of companies and other organizations are working to create standards and governance frameworks for IoT.

Dive Deeper

Check out CDW’s “Networking: Focus on Software” tech insights guide to learn more about:

  • The increased role of software in networking
  • Essential wireless management tools
  • Network optimization using Intelligent WAN