Some techy terminology is inherently intriguing. The World Wide Web, for instance. Or quantum computing, blockchain, artificial intelligence… These terms serve to pique your interest and desire to find out more. They sound exciting and futuristic, perhaps evoking memories of sci-fi novels and movies.
But the “Internet of Things”? What’s that about?!
It sounds like something you say when you’ve forgotten the right words. What is the definition of a “thing” and why do they need the internet? It seemingly describes something dull, generic and unimportant, when the reality is anything but…
Believed to have been coined by Kevin Ashton of Procter & Gamble in 1999, the term “internet of things” was originally conceived as the “internet for things”. He viewed radio-frequency identification (RFID) as an essential component in enabling computers to manage all individual things.
The Internet of Things (or “IoT” as it’s affectionately known) is an information system infrastructure for implementing smart, connected objects. Or, in simpler terms, the extension of internet connectivity into everyday, physical devices.
Arguably the easiest way to understand the concept from a consumer perspective is to consider “smart home” technology.
From doorbells and lighting to thermostats and home security systems, these appliances form part of a common ecosystem which can be controlled remotely via smartphones, smart speakers or similar internet-enabled devices. They communicate and interact over the internet, enabling the IoT via the merging of technologies including embedded electronics, wireless sensor networks, automation and machine learning, amongst others.
When you consider the immeasurable impact that the internet has already had on our day-to-day lives – communication, education, governance, science, sports, relationships and so on – it would be folly to underestimate the radical potential for change that the IoT wields. By collating, analysing and distributing data, the IoT represents the next phase of our digital evolution – and it’s more than a giant leap for mankind.
The old Latin aphorism scientia potentia est (“knowledge is power”) is particularly apt as the IoT looks destined to change the world as we know it – offering unprecedented insight, information and wisdom to governments, businesses and individuals. Projects are already underway to narrow the divide between the rich and the poor, and to more effectively regulate and distribute the world’s most precious resources.
On a smaller scale, the IoT is forecast to make businesses much more efficient. For example, farmers can ensure a better yield by employing sensor networks to monitor crop health and growth rates. Similarly, major contractors are adding sensors to building projects to run simulations that spot flaws and inefficiencies before they become much costlier problems.
With the release of the Apple Watch Series 4 model, it is easy to see how the IoT can also benefit individuals in a big way. The Heart Rate app continuously monitors the user’s pulse for signs of atrial fibrillation, and sends a notification if abnormal heart rhythms are detected. The watch will also spring into action if the wearer takes a tumble, immediately presenting the option to contact emergency services with a single swipe of the finger.
The Cisco Internet Business Solutions Group (IBSG) defined the Internet of Things as simply the point in time when more “things or objects” were connected to the internet than people. With the explosive growth of smartphones following the release of the now-ubiquitous iPhone in 2007, we didn’t have to wait long for this digital transformation to occur. By 2010, there were 12.5 billion internet-enabled devices, compared to a human populace of 6.8 billion – a ratio of 1.84:1 connected devices to people.
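For the sceptical (or the curious), that ratio is easy to verify from the two IBSG figures quoted above:

```python
# Cisco IBSG's 2010 estimates, as quoted above
connected_devices = 12.5e9  # internet-enabled devices in 2010
world_population = 6.8e9    # human population in 2010

ratio = connected_devices / world_population
print(f"{ratio:.2f} connected devices per person")  # → 1.84
```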
Those numbers are mind-blowing and frightening in equal measure. With almost a decade having passed since then, it’s daunting to consider what the ratio would be now. Let alone in another 5, 10 or 20 years’ time!
Whilst the logic behind IBSG’s definition is clear, I’d argue that the Internet of Things does not pertain to a tipping point – an isolated moment in time when the digital landscape altered forever – but to the tip of an iceberg that we are decades – perhaps even centuries – away from fully comprehending.
Nowadays, all sorts of “smart” inventions have made their way to market. From toothbrushes that record and assess your daily brushing habits to toasters that remember your browning preferences, it seems that if something can be connected, then it will be.
Even the cows aren’t safe...
In a 2010 report by The Economist entitled “Augmented Business”, readers were introduced to Sparked – a Dutch start-up responsible for implanting sensors in the ears of cattle to allow farmers to remotely monitor their health and track their mooo-vements.
As with most advancing and emerging technologies, however, there are serious concerns around security and privacy. The use of video as a form of visual sensor raises questions around the controversial role of surveillance in our daily lives. Whilst one family may find at-home cameras useful to keep tabs on pets, small children and elderly relatives, another may find them to be a totally unacceptable breach of trust and boundaries.
Similarly, there are legitimate fears around the security vulnerabilities of technology which relies heavily on a connection to the Cloud. Whether it be large corporations harvesting personal data or hackers controlling appliances from afar, an increasingly connected world undoubtedly equates to greater risks.
So, what next for the development of IoT?
The playing field is constantly changing. As are the goal posts. For the IoT to achieve its full potential, it will be necessary to set global standards across security, privacy, architecture and communications, as well as to solve the challenges of powering and hosting the billions of new sensors that are set to be deployed in the coming years.
It is sure to be a long and complicated road to navigate, but I cannot wait to see where it will take us.