Making the Internet of Things
Ask 10 people to define IoT and you’ll probably get 10 different answers. Even though the Internet of Things has been around for more than 20 years, there is still no universally recognized definition.
Here’s a fairly non-controversial way to describe IoT:
The Internet of Things, IoT for short, is the growing network of Internet-connected devices around the world that can send and receive data over a wireless network with little or no human assistance.
Today, pretty much anything can be outfitted with an inexpensive processor, connected to a good wireless network, and given a voice on the Internet of Things. Everyday objects such as your umbrella, coffee pot, or garage door can now share Internet connectivity that was previously reserved for our laptops and smartphones. As a result, the word smart has outgrown its relationship with humans and evolved into an adjective with seemingly limitless potential. We now chat about smart homes and smart cities surrounded by smart cars, smart speakers, smartphones, and smartwatches that our increasingly smart industry cleverly produces.
How did this mesmerizing digital transformation begin? Where did the Internet of Things originate?
Let’s begin with an uncanny prediction from 1926, courtesy of the inventor Nikola Tesla:

“When wireless is perfectly applied the whole earth will be converted into a huge brain... We shall be able to communicate with one another instantly, irrespective of distance... and the instruments through which we shall be able to do this will be amazingly simple compared with our present telephone. A man will be able to carry one in his vest pocket.” – Nikola Tesla, Collier’s magazine, 1926
Tesla was not alone in his inklings of what was to come. However, we are going to jump forward a few decades to the United States at the height of the Cold War. The year was 1969, and the Advanced Research Projects Agency Network (funded by the U.S. Defense Department) had just managed to briefly connect a host computer at the University of California, Los Angeles (UCLA) with a host computer at Stanford University. ARPANET was an experimental computer network designed to link computers in government-funded research institutions over telephone lines. The project was the result of many years of research and the desire to decentralize government and military systems in case of a foreign invasion. Although the project began with military objectives in mind, interest within the academic community was keen from the start. For the next 10 years, ARPANET would provide fertile ground for numerous networking breakthroughs that would eventually form the backbone of the Internet as we know it.
Which brings us to a refreshing part of our story: the cold soft drinks.
In the early 1980s, several bright young technology students at Carnegie Mellon University in Pittsburgh, Pennsylvania enjoyed drinking ice-cold Coca-Cola. Fortunately, the computer science department at CMU had a Coke machine. Unfortunately, some students had to walk a relatively long way to retrieve their beverage of choice. As the department grew and the distance to the Coke machine increased, two troubling issues arose. First, at the end of the trek, high demand frequently left a caffeine-deprived student standing in front of an empty machine. Worse yet, a student might happily deposit coins in the machine only to receive a recently stocked bottle that was still warm.
The main computer of the CMU computer science department was one of only a few hundred computers worldwide with access to ARPANET. The thirsty students recognized the unique opportunity this connection presented. Joining forces with a research engineer at the university, they set out to solve their unsatisfactory soft-drink situation once and for all:
They installed micro-switches in the Coke machine to sense how many bottles were present in each column.
The switches were hooked up to the main departmental computer.
A server program was written to monitor the state of the Coke machine (including how long each restocked bottle had been in the machine).
If you ran a status-inquiry program, you could see the following information for each column in the Coke machine from the comfort of your desk (a rough sketch of such a program follows the list):
COLD: A properly chilled bottle of Coca-Cola awaited you in the column.
EMPTY: No product was available in the column.
Elapsed time: The number of hours and minutes since the bottles in the column had entered the machine. The students determined that the optimal temperature (just slightly above freezing) was reached in about three hours. After three hours of chilling, the status changed to COLD.
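The students’ original code is not part of this story, but as a minimal sketch of the logic described above, here is what such a status program might look like in modern Python. The three-hour threshold comes from the story; the class names, column count, and output format are assumptions for illustration only.

```python
import time

CHILL_SECONDS = 3 * 60 * 60  # the students' ~3-hour chill time

class Column:
    """Tracks the most recently stocked bottles in one machine column."""
    def __init__(self):
        self.stocked_at = None  # epoch time the column was last restocked

    def restock(self):
        self.stocked_at = time.time()

    def empty(self):
        self.stocked_at = None

    def status(self):
        if self.stocked_at is None:
            return "EMPTY"
        elapsed = int(time.time() - self.stocked_at)
        if elapsed >= CHILL_SECONDS:
            return "COLD"
        hours, minutes = divmod(elapsed // 60, 60)
        return f"chilling for {hours}h {minutes}m"  # not yet cold

# Simulate a machine with six columns, one freshly restocked.
machine = [Column() for _ in range(6)]
machine[0].restock()
for number, column in enumerate(machine, start=1):
    print(f"Column {number}: {column.status()}")
```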
The program was wildly successful and after a few more modifications, people could check the Coke machine from any CMU computer (and later the Internet). Despite push-back from the disgruntled Coke repairman, the communicative Coke machine continued to serve students faithfully for more than a decade.
The Internet Coke machine of CMU is generally considered the first Internet-connected appliance.
About the same time that ARPANET was helping things go better with Coke, adoption of the new Transmission Control Protocol/Internet Protocol (TCP/IP) began to radically change how things would go on ARPANET. Equipped with the common language that TCP/IP provided, ARPANET expanded quickly into a network of interconnected networks, an ‘Internet’.
For the average person, the new global network of networks didn’t look like much. In the late 1980s, that situation also changed. Tim Berners-Lee, a young software engineer at the CERN research labs in Switzerland, had an idea about how to share information. He called his project the WorldWideWeb. It turned out to be a very good idea. Realizing the importance of the project, CERN graciously made the underlying code available on a perpetual, royalty-free basis. The Web would remain an open standard that anyone could use to access and navigate the Internet.
People were warming up to the Internet and eager to discuss how things should be done. IT networking conferences sprang up to satisfy this need to share ideas. One of the original and longest-running conferences of its kind, Interop, is where we meet the toaster of our story.
A very special Sunbeam Deluxe Automatic Radiant Control Toaster.
In 1989, the president of the Interop conference presented the computer scientist John Romkey with a challenge: if he could “bring up his toaster on the Net,” he could have top billing at the following year’s Interop. Center stage at a conference is very appealing. Not one to shy away from a bit of ridiculousness, John Romkey and his colleague Simon Hackett took up the challenge. At Interop 1990, the pair served up a little slice of the future: a toaster that they could turn on and off over the Internet.
The talented little household device was a huge hit at the conference. The toaster was connected over TCP/IP networking and operated with a single control: power on. The darkness of the toast was determined by how long you let the power stay on. (Since this Sunbeam toaster was indeed a deluxe model, the slice of bread lowered and popped back up automatically.)
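To make the idea of a single network-controlled power switch concrete, here is a minimal Python sketch. It assumes nothing about the original 1990 implementation: the port number, the ON/OFF command protocol, and the set_power stub are all invented for illustration.

```python
import socket
import time

HOST, PORT = "0.0.0.0", 9090  # hypothetical address for this sketch
power_on_at = None            # when the heating element was switched on

def set_power(on: bool):
    """Stand-in for whatever actually switches the heating element."""
    global power_on_at
    power_on_at = time.time() if on else None

with socket.create_server((HOST, PORT)) as server:
    while True:
        conn, _ = server.accept()
        with conn:
            command = conn.recv(16).decode().strip().upper()
            if command == "ON":
                set_power(True)
                conn.sendall(b"OK: toasting\n")
            elif command == "OFF" and power_on_at is not None:
                duration = time.time() - power_on_at  # darkness ~ time on
                set_power(False)
                conn.sendall(f"OK: toasted for {duration:.0f}s\n".encode())
            else:
                conn.sendall(b"ERR: send ON or OFF\n")
```

The entire “API” is one switch; the darkness of the toast lives in how long the client waits between ON and OFF, just as it did on the conference floor.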
The one hitch was that the physical presence of a human was required to drop in the bread and remove the finished toast. This obvious drawback was eliminated the next year. In 1991, the toaster was accompanied to Interop by a robotic arm that took care of the bread. Naturally, this handy robotic sidekick was also controlled over the Internet.
The TCP/IP-connected Sunbeam toaster is usually considered the first IoT device.
Over the next several years, many events improved the overall usability of the Internet and paved the way for the further development of IoT, most notably the completion of the first version of the GPS satellite network and the release of IPv6. However, the next stop in our story takes us to a pivotal year in the early development of IoT: 1999.
Similar to the Internet of people, the Internet of Things took a while to find a name and a common language. Although the idea of connecting devices had been around for many years, the name Internet of Things is credited to a 1999 PowerPoint presentation by Kevin Ashton. At the time, Mr. Ashton was a brand manager for Procter & Gamble (P&G) and keen on optimizing the supply chain for a desirable item. In this case, the item was a wonderfully popular shade of Oil of Olay lipstick that always seemed to be out of stock.
Frustrated with the lack of accurate inventory information, Mr. Ashton proposed placing a tiny radio microchip on the lipstick (and every other P&G product) that could raise an alert when the stock was low. The Internet was a hot new topic in 1999; radio-frequency identification (RFID) was not on the tip of everyone’s tongue. To grab the attention of senior decision-makers at P&G, Ashton decided on an appropriately evocative term for his presentation: the Internet of Things. The presentation went well, the name stuck, and the Internet of Things had its name (the acronym IoT followed later).
That same year, IoT functionality as we know it began to take shape. Two engineers, Andy Stanford-Clark (IBM) and Arlen Nipper (Cirrus Link), needed to monitor oil and gas pipelines in a desert. Given the difficult location, landlines, wired connections, and radio transmissions were not good choices. Satellites offered a possibility, but satellite charges were based on the amount of data transmitted, a potentially expensive option. To keep costs in check, the engineers had to make sure that the multitude of sensors they placed along the pipeline used as little bandwidth as possible.
No existing solution fit the bill. They wanted something compact, easy to use, and easy to implement. Something that would gracefully handle remote locations, patchy network connections, and limited everything. Something new. So, they created MQTT: an extremely lightweight machine-to-machine protocol that allows applications to communicate easily with one another over TCP/IP.
As it turned out, MQTT was precisely what the Internet of Things needed to connect its ‘Things’. MQTT’s flexible publish/subscribe model, combined with its ability to run over SSL/TLS for secure communication between devices, fit the bill perfectly. Widespread implementation led to MQTT’s acceptance as an open OASIS and ISO standard several years later. Today, you are just as likely to find MQTT connecting things in your living room or car as on a lonely pipeline.
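To make the publish/subscribe model concrete, here is a minimal sketch using the open-source paho-mqtt Python client (2.x callback API). The broker hostname, topic name, and sample reading are assumptions for illustration, not part of the original pipeline system.

```python
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"           # hypothetical broker address
TOPIC = "pipeline/sensor1/pressure"     # hypothetical topic name

def on_connect(client, userdata, flags, reason_code, properties):
    # Subscribe (and publish a sample reading) once the connection is up.
    client.subscribe(TOPIC, qos=1)       # QoS 1 = at-least-once delivery
    client.publish(TOPIC, "42.7", qos=1) # a hypothetical pressure reading

def on_message(client, userdata, msg):
    print(f"{msg.topic}: {msg.payload.decode()}")

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_connect = on_connect
client.on_message = on_message
# client.tls_set()  # run over TLS for secure communication, as noted above
client.connect(BROKER, 1883)
client.loop_forever()
```

Notice the decoupling: the sensor publishes to a topic and any number of subscribers receive the reading through the broker, without the devices ever knowing about each other. That indirection is exactly what makes the model forgiving of remote locations and patchy connections.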
Since the turn of the century, exploration and implementation of Internet of Things applications have continuously accelerated, and there are surely many more interesting tales to tell.
Our look into the origins of IoT ends at the local newsstand with a prediction written in 1999:

“In the next century, planet Earth will don an electronic skin. It will use the Internet as a scaffold to support and transmit its sensations.” – Neil Gross, Business Week, 1999