In this column, we briefly introduce the concepts of the Internet of Things (IoT), the Industrial Internet of Things (IIoT), the Internet of Healthcare Things (IoHT), and the Artificial Intelligence of Things (AIoT).
From one perspective it could be said that, not so long ago, things used to be a lot simpler than they are today. If you had a sensor like a thermometer mounted on the wall of your house, for example, then all it did was measure the temperature. Furthermore, in order to determine what it was trying to tell you, you had to fight your way out of your comfy chair, waddle across the room, and look at the little rascal.
On the bright side, this meant that you got a lot of exercise. On the downside, raw data is of limited use in isolation. As we’ve grown to discover, if we can gather and access data from remote locations, we can use it to monitor and control what’s happening. And, if we store disparate data over time, we can use technologies like artificial intelligence (AI), machine learning (ML), and deep learning (DL) to spot subtle patterns and identify complex relationships between seemingly unrelated sources (see also Fundamentals: AI, ANNs, ML, DL, and DNNs). Furthermore, we can use these technologies to detect trends that are indicative of future problems, offer predictions, and provide solutions.
Everything is Connected
In 1998, I formed a company with my friend, Alvin Brown. We both had day jobs, so every day after work we ambled across the road to an office we rented. We had both purchased PC tower computers powered by Intel 486 processors. I can’t remember how much RAM they had, but it wasn’t much, and I can’t recall the capacity of their hard disk drives, but it wasn’t large. What I do remember is that these little rascals each cost around $2,400, which made my eyes water.
I also remember that we purchased a 56K modem, which had just come out that year. Even though this data rate was only a trickle compared to today’s “fire hose” connections, it was almost twice what we’d had before, and we thought we were “sitting in the catbird seat” as the Americans so amusingly say.
The only downside was that only one of us could use the modem at any particular time. This wasn’t too much of an issue in itself. The real problem was that our two desks faced back to back, which meant we had to climb on top of them to unplug the modem cable from the back of one machine and plug it into the other.
I still remember when Alvin purchased a special card that he plugged into his tower computer. This little beauty allowed us to connect the 56K modem to his computer, and from there to mine. Although it was still only possible for one of us to use the modem at a time, at least we could access the little scamp without having to clamber around on the furniture.
This all seems so antediluvian now (“So 20th century, my dear”). The scary thing is that it was only 21 years ago as I pen these words. Now, we have wireless networks, the internet, smartphones, smart sensors, smart devices, and more acronyms than you can shake a stick at. Speaking of which, let’s take a look at some of these little rascals…
The Internet of Things (IoT): The definition of what the IoT actually is has evolved over time. An early definition might have read something like, “Sensors and actuators that are embedded in physical objects and that are linked through wired and wireless networks.” More recently, we might talk about, “A system of interrelated computing devices, mechanical and digital machines, objects, animals, or people that are provided with unique identifiers and the ability to transfer data over a network without necessarily requiring human-to-human or human-to-computer interaction.”
The term “Internet of Things” was coined by British technology pioneer Kevin Ashton during a presentation he made at Procter & Gamble (P&G) in 1999. Kevin used “Internet of Things” to describe a system where the internet is connected to the physical world via ubiquitous sensors. It wasn’t long before the term Internet of Things and its IoT abbreviation had themselves become ubiquitous.
The first “thing” on the IoT actually predates the IoT moniker by almost two decades. As related in The Little-Known Story of the First IoT Device, in the early 1980s, some students and techies at Carnegie Mellon University tinkered with a Coke machine and made history. These little scamps augmented the Coke machine with sensors, and then connected these sensors to the department’s main computer, which was itself connected to ARPANET (the precursor of the internet). This allowed anyone with access to the ARPANET to remotely check whether or not there were any Cokes in the machine and — if so — which ones were cold.
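Just for giggles, the essence of that Coke machine can be sketched in a few lines of modern code. Everything here is hypothetical — the original students used custom sensors wired to a departmental computer, not Python — but the core idea is the same: each slot reports whether it holds a bottle and whether that bottle has been chilling long enough to be cold.

```python
import time

class CokeMachine:
    """A playful sketch of the networked Coke machine (all names made up)."""

    CHILL_SECONDS = 3 * 60 * 60  # assume ~3 hours for a bottle to get cold

    def __init__(self, num_slots=6):
        # For each slot, record when a bottle was loaded (None = empty).
        self.loaded_at = [None] * num_slots

    def load(self, slot, now=None):
        self.loaded_at[slot] = time.time() if now is None else now

    def vend(self, slot):
        self.loaded_at[slot] = None

    def status(self, now=None):
        """What a remote ARPANET user would see: EMPTY, WARM, or COLD."""
        now = time.time() if now is None else now
        report = []
        for t in self.loaded_at:
            if t is None:
                report.append("EMPTY")
            elif now - t >= self.CHILL_SECONDS:
                report.append("COLD")
            else:
                report.append("WARM")
        return report

machine = CokeMachine(num_slots=3)
machine.load(0, now=0)            # loaded four hours "ago"
machine.load(1, now=4 * 60 * 60)  # just loaded
print(machine.status(now=4 * 60 * 60))  # ['COLD', 'WARM', 'EMPTY']
```

The punchline is the `status()` method: once that report is reachable over a network, anyone, anywhere, can decide whether the trip to the machine is worth making — which is the whole IoT idea in miniature.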
How things have changed. According to Statista, there are expected to be ~30 billion IoT-connected devices installed around the world in 2020, rising to ~75 billion in 2025.
The Industrial Internet of Things (IIoT): This refers to interconnected sensors, instruments, and other devices networked together with computers and industrial applications, including manufacturing and energy management. This connectivity allows for data collection, exchange, and analysis, potentially facilitating improvements in productivity and efficiency as well as providing other economic benefits.
The IIoT is enabled by technologies such as cybersecurity, cloud computing, edge computing, mobile technologies, machine-to-machine (M2M), 3D printing, advanced robotics, big data, the Internet of Things (IoT), RFID technology, and cognitive computing. (FYI If you want to learn more, the folks at Tiempo have created a rather good What is the IIoT? online book that introduces various aspects of the IIoT in an easy-to-understand way.)
The Internet of Heavier Things: The term “Heavy Industry” refers to an industry that exhibits one or more characteristics such as large and heavy products, large and heavy equipment and facilities (e.g., heavy equipment, large machine tools, huge buildings, and large-scale infrastructure), or complex or numerous processes.
The point is that there is an estimated $6.8 trillion worth of existing “dumb” fixed infrastructure and heavy machinery in the USA alone. There are tremendous efficiency and productivity benefits to be gained by connecting heavy industry to the IIoT. The options are to struggle along as-is, to replace the existing “dumb” equipment with bright and shiny “smart” equipment, or to retrofit the existing equipment with modern sensors and control systems, connect it to the IIoT, and give it a new lease on life, as it were.
American venture capital firm Kleiner Perkins has designated the augmenting of industrial systems with IIoT and AIoT capabilities as the “Industrial Awakening.” In an article published in 2015, The Industrial Awakening: The Internet of Heavier Things, Kleiner Perkins referenced a report generated by the World Economic Forum, which noted that this “Industrial Awakening” is expected to generate $14.2 trillion of global output by 2030.
The Internet of Healthcare Things (IoHT): The Internet of Healthcare Things is a concept that describes uniquely identifiable devices used in the medical arena that are connected to the internet and able to communicate over it and with each other.
By itself, the Internet of Healthcare Things is expected to top $163 billion by 2020. As just one example, take a look at the Injectsense Real-Time Eye Pressure Monitoring platform. This teeny-tiny hermetically sealed silicon device is only 2.5 mm × 0.6 mm. The goal of this type of device is to offer organ-to-cloud data connections that will provide unprecedented visibility into the human body and enable clinicians to assess the effectiveness of alternative therapies in real-time.
The Artificial Intelligence of Things (AIoT): According to the IoT Agenda, “The Artificial Intelligence of Things (AIoT) is the combination of artificial intelligence (AI) technologies with the Internet of Things (IoT) infrastructure to achieve more efficient IoT operations, improve human-machine interactions, and enhance data management and analytics […] the AIoT is transformational and mutually beneficial for both types of technology as AI adds value to IoT through machine learning capabilities and IoT adds value to AI through connectivity, signaling, and data exchange.” I couldn’t have said it better myself.
The Cloud, the Fog, the Edge, and the Far Edge
The term “The Cloud” is typically used to refer to data centers available to many users over the internet. Large clouds often have their functions distributed over multiple locations, each of which is a data center. Around the world, there are thousands of data centers boasting tens of millions of server-class processing modules.
Many companies have mission-critical requirements, which means they can’t afford to lose the ability to monitor and control their systems in real-time. In turn, this means that they have to be able to keep on running, even when they lose their internet connection to the cloud. The solution is to use “The Fog,” which may be visualized as small, local clouds. In real terms, this means a smaller group of server-class processors at the level of the local area network (LAN). At the time of this writing, there are millions of fog-level nodes.
The term “The Edge” refers to the devices at the edge of the internet. This is where the internet meets, and interfaces with, the real world. At the time of this writing, there are billions of edge-level devices.
Unfortunately, there’s a bit of confusion here, because some people regard a local server in a factory as being on the edge. To put this another way, there are some who regard the edge and the fog to be one and the same thing. In order to minimize confusion, some folks are beginning to talk about “The Far Edge” or “The Extreme Edge” to refer to the IoT devices that interface with the real world (see also my column on EEJournal.com — What the FAQ is the Edge vs. the Far Edge? — for an in-depth discussion on this topic).
Not surprisingly, the term “cloud computing” refers to processing data in the cloud, the term “fog computing” refers to processing data in the fog, and the term “edge computing” refers to processing data where it is being generated at the edge of the network.