It won't be the Internet of Things as you know it


Too many technological concepts are overhyped before they are understood. Some are given catch-phrases. Back in the '80s, I lost count of the number of pitches I received for news related to the "modem generation." Our whole world is going online, I was told. Online was changing the way we live, work, think and eat, I was told. By 1990, over a billion people will own modems. Some people will own three or four modems. Why? Because you'll need one at work too, then one in your car--and some people own two cars.

Modems will be everywhere--all the trends were pointing in that direction. And just think of the many "modem generation" possibilities: Maybe restaurants will serve up modems right at your table. How will modems be distributed to people in third-world nations? How soon will modems render the post office obsolete? Will kids in public schools recognize the sing-song pattern of Microcom 56Kbps handshake tones before they can recall the National Anthem?

Of course, modems aren't much without telephones. With greater mobility enabling people for the first time to work dozens, even hundreds, of feet from their offices, technologists projected people would be keeping modems in their pockets, or maybe on their key rings, by the turn of the century. Why? Because people will have to dial up AOL somehow. Something has to plug into all those telephones.

Technology analysis that focuses on devices loses sight of the forces of evolution. The Internet of Things will thrive by 2025, concluded the Pew Research Center last May, because it asked 1,606 people this question: "Will the Internet of Things have widespread and beneficial effects on the everyday lives of the public by 2025?" And some 83 percent of those folks said yes. Had the phrase "Internet of Things" been replaced by "gilded fairy leprechauns," I doubt the majority would have shrunk all that much.

The guiding principle of IoT is this: When you embed inexpensive communications devices into ordinary objects, you enable the type of sensor-intensive applications that previously required human observation. If things can report their status, applications can be written to make that data useful.
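
The principle is easier to see in code. Below is a minimal sketch, in Python, of a thing reporting its status to a nearby hub; the hub address, the report_status helper and the payload fields are all hypothetical, chosen for illustration rather than taken from any particular product or protocol.

```python
# Hypothetical sketch: a sensor reports its status to a local hub.
# HUB_ADDR, the port and the field names are assumptions for illustration.
import json
import socket
import time

HUB_ADDR = ("192.168.1.10", 5000)   # assumed hub on the local subnet

def report_status(sensor_id: str, reading: float) -> None:
    """Send one compact status datagram to the hub, then go quiet."""
    payload = json.dumps({
        "id": sensor_id,
        "reading": reading,
        "ts": int(time.time()),
    }).encode("utf-8")
    # UDP keeps the device side tiny: a few dozen bytes, fire and forget.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, HUB_ADDR)

if __name__ == "__main__":
    report_status("fridge-door-1", 1.0)   # e.g. "door open"
```

An application on the far side of the hub turns those few bytes into something useful; the thing itself never needs to know what that application is.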

But those applications require something called infrastructure, which doesn't manifest itself organically. A useful application must present itself before the builders of infrastructure make the necessary investment. The notion that an IoT can be "the Internet" (the set of all Internet Protocol hosts) derives from the hope that devices can leverage the communications medium that's already here, the infrastructure we already have.

That's not practical for the applications being developed for "real IoT," for a variety of reasons:

  • Things don't need to communicate with each other. They only need to communicate with a central hub. That hub can have an IP address, but even then, it makes sense to leave the hub in a subnet that communicates with one and only one server. That server may contain the policies and permissions necessary to send signals to these hubs (see the sketch after this list).
  • Things don't need to join "the Internet" as peers. Put another way, we, the people, don't need to communicate with these things. It's more practical to communicate with the application that controls the hub.
  • An Internet of equivalent things and people would be extremely difficult to manage. You think Netflix is a bandwidth hog? What about all the cartons of eggs?
  • Inserting things into "the Internet" would make people less secure. We need identity to secure the communications between people online. The only reason things would need identity in a shared Internet would be to distinguish them from people, as in, "Do not authenticate me, I am not a person, I am a 4-by-12 sheet of drywall." And since things would require fewer policy controls than people, the easiest course of action for a malicious agent would be to disguise itself as a 4-by-12 sheet of drywall.
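
Here is the sketch promised above: a hub that listens on its own subnet and relays whatever it hears to one and only one application server, which holds all the policies and permissions. The listening port, the upstream URL and the record format are assumptions made for illustration, not a description of any shipping system.

```python
# Hypothetical sketch of the hub-and-single-server pattern described above.
# LISTEN_ADDR, APP_SERVER_URL and the record format are illustrative assumptions.
import json
import socket
import urllib.request

LISTEN_ADDR = ("0.0.0.0", 5000)                         # subnet-facing side
APP_SERVER_URL = "https://iot-app.example.com/ingest"   # the one upstream server

def relay_forever() -> None:
    """Receive device datagrams on the subnet and forward each to the server."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(LISTEN_ADDR)
        while True:
            data, source = sock.recvfrom(2048)
            record = {"source": source[0], "payload": json.loads(data)}
            req = urllib.request.Request(
                APP_SERVER_URL,
                data=json.dumps(record).encode("utf-8"),
                headers={"Content-Type": "application/json"},
            )
            # Only the hub holds credentials and an outbound route; the things
            # behind it never appear on "the Internet" at all.
            urllib.request.urlopen(req, timeout=10)

if __name__ == "__main__":
    relay_forever()
```

The design choice is the point: identity, policy and routable addresses live at the hub and the server, while the things stay anonymous fixtures on a private subnet.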

All the problems that would be created by the Internet of Things we read about in press releases have already been solved by the architects of the real IoT. There are low-bandwidth communications protocols that things may use to communicate with hubs. But so long as we continue to envision this singular, colossal mesh of signals the way it's being marketed to us, rather than the way it should truly be built, we actually work against the real IoT. A network of interconnected devices needs low-bandwidth communication--at a time when we're ripping up low-bandwidth cellular towers and replacing them with 4G. We're building an infrastructure around the Internet of people, and we're often blind to the fact that things don't require the same resources, any more than the Internet of people required more telephone modems.
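
As a rough illustration of how little bandwidth such devices need (this is just fixed-width packing with Python's standard library, not any particular low-bandwidth protocol), a complete status report can ride in a handful of bytes:

```python
# Illustrative only: a complete status report packed into 7 bytes.
import struct

def pack_report(sensor_id: int, reading: float, flags: int) -> bytes:
    # 2-byte id + 4-byte float + 1-byte flags = 7 bytes on the wire.
    return struct.pack("<HfB", sensor_id, reading, flags)

frame = pack_report(42, 21.5, 0b00000001)
print(len(frame), frame.hex())   # prints: 7 2a000000ac4101
```

At a few bytes per report, the pipes these devices need look nothing like the pipes we keep building for people.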

There are folks who argue that the original Star Trek series is now behind the times, and that we've already progressed past the flip-top communicator stage of our evolution. I tell them they forget about the part they don't see. Kirk and Spock could talk to each other when stranded on opposite sides of the same planet, or buried beneath miles of solid rock. Their communicators were not cellphones. They generated their own infrastructure where none existed before--and that, to me, has always been the extraordinary part. We get so caught up in the devices that we forget how we need to evolve the things that connect them. - SF3