In most cases, we have all the technology building blocks today for the Internet of Things. We have microcontrollers and SoCs small enough to put intelligence into everything, plus sensors to detect all that is happening. We have the Internet and wireless networks and mobile devices of all shapes and sizes. We have cloud computing and storage and data analytics and a host of other technologies.
On a small to medium scale, IoT configurations are a slam dunk. Connecting a smartphone to home automation or fitness devices via Wi-Fi or Bluetooth – easy. Setting up beacons in retail stores and sporting venues to inform patrons – easy. Creating smarter hospital rooms with wirelessly connected instruments – fairly easy, compliance issues aside. Putting 70,000 ZigBee nodes in the ARIA Resort & Casino, controlling light switches and thermostats and door locks and more – still relatively easy.
What these examples all have in common is they are bounded. Each of these networks knows essentially what devices and apps it is connecting with, controls its own data and the format of the stream, and doesn’t have to share much with the world at large. Granted, there are still issues mostly centering on privacy or security or safety, but these types of applications can be built today – the requirements are straightforward and scalable.
The unbounded world, where everything streams data to everything in a dynamic environment, is another story. Case in point: the spectacular failure of the Joint Tactical Radio System (JTRS). Writing about the program's demise, David Axe summarized the JTRS vision in terms chillingly similar to those often used to describe the IoT:
[The US Army] developed a plan to create a “cyberwar” force on the battlefield, upending a tradition that only leaders carry radios, and information flows not between individuals, but between squads, platoons, companies, and other units. Instead of merely pointing their rifles and scanning with their eyes, every soldier in the networked force would be an information node with his own cameras, GPS tracker and radio, all communicating perfectly with others.
Transmitting large amounts of unstructured data in far too many different formats – over 30 at last count – quickly turned a great vision into bloatware. The original goal of a man-portable JTRS radio all but evaporated, and even vehicle-mounting the inflated units became a challenge. Fortunately, the idea behind software-defined radio lives on in everyday use: most 4G smartphones carry a form of SDR in their baseband interface so they can deal with a variety of cellular networks.
Recent calls to replace the commercial airliner "black box" face similar requirements problems: 93,000 flights per day across regions with differing communications frequencies and air traffic control protocols; uploading far more extensive data from an extremely complex system to the cloud via satellite, streamed in real time; dealing with a diversity of airframes carrying different models of engines, manufactured in a variety of configurations over time. The technology certainly exists, if one could wave a wand and have a standard – and the ITU has issued an invitation for just such an effort.
On the ground, similar calls for connected cars that brilliantly share data with each other seem feasible, but the same problems arise. Auto manufacturers are each creating their own data schemes, and unless some standards body steps in with data formats and APIs, the odds that cars will be able to exchange vehicle dynamics information seamlessly with other makes and models in real time are low. Event data recorders are already mandated by 49 CFR 563, but that rule only defines which variables are to be recorded and how often – not a format for a real-time data stream, with byte order and encoding, which has already evolved to be very vehicle-specific. Also in progress is SAE J1698, but it currently addresses only frontal impact data.
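The byte-order problem is easy to demonstrate. The sketch below is purely illustrative – the 8-byte record layout (a float32 speed followed by a uint32 engine RPM) is invented, not taken from 49 CFR 563 or any manufacturer – but it shows why agreeing on variables without agreeing on a wire format is not enough:

```python
import struct

# Hypothetical 8-byte vehicle-dynamics sample: speed (float32, km/h)
# followed by engine RPM (uint32). The field list echoes the kind of
# variables 49 CFR 563 names, but this wire layout is invented here.
payload = struct.pack(">fI", 88.5, 3200)  # maker A writes big-endian

# Maker B reads the identical bytes assuming little-endian order.
speed_be, rpm_be = struct.unpack(">fI", payload)
speed_le, rpm_le = struct.unpack("<fI", payload)

print(speed_be, rpm_be)   # 88.5 3200 – the intended values
print(speed_le, rpm_le)   # same bytes, wrong byte order: garbage
```

Both readers parse the stream without error; only one recovers the truth. A data interchange standard has to pin down exactly this level of detail.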
Requirements "creep" or overspecification is often cited as the issue – the JTRS requirements document tripled in size over the life of the program – but that is only a symptom. The root cause of this expansion and diversity of requirements in connected systems is a rush to commercialization before data stream standardization. Manufacturers are often rewarded for early innovation and being first to market, but the penalty shows up later, when incompatible implementations try to join together.
In the larger vision of the IoT, we are now at exactly this point. Connectivity is just the starting point – devices are merely babbling away unless one knows the data format. The way to prevent a JTRS-like failure in the IoT endgame is to define higher-level standards for data interchange, perhaps at first within specific industries such as automotive, medical (where the Continua Health Alliance, working with ISO and the IEEE-SA, is making significant progress), and the smart grid.
Marketing consortia can take on some of this for particular verticals and protocols, but most are biased toward a particular solution set – not a protocol-agnostic, data-centric approach. An effort like Hyper/CAT, which builds a machine-readable catalog of resources and their data formats, holds some promise for taming the ad-hoc development already underway – if more developers embrace JSON.
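To make the Hyper/CAT idea concrete, here is a minimal sketch of a catalog in the Hypercat JSON style (a catalog object holding rel/val metadata pairs and a list of items). The device URL, description strings, and content types are invented for illustration; consult the Hypercat specification for the normative format:

```python
import json

# A minimal Hypercat-style catalog. The structure (catalogue-metadata,
# items, rel/val pairs) follows the Hypercat convention; the specific
# resources listed here are hypothetical.
catalogue = {
    "catalogue-metadata": [
        {"rel": "urn:X-hypercat:rels:isContentType",
         "val": "application/vnd.hypercat.catalogue+json"},
        {"rel": "urn:X-hypercat:rels:hasDescription:en",
         "val": "Building sensors"},
    ],
    "items": [
        {"href": "http://example.com/sensors/thermostat-1",
         "item-metadata": [
             {"rel": "urn:X-hypercat:rels:isContentType",
              "val": "application/json"},
             {"rel": "urn:X-hypercat:rels:hasDescription:en",
              "val": "Lobby thermostat, temperature in Celsius"},
         ]},
    ],
}

# The whole catalog is itself plain JSON, ready to serve over HTTP.
catalogue_json = json.dumps(catalogue, indent=2)

# A consumer can discover each resource's data format before fetching it.
for item in catalogue["items"]:
    fmt = next(m["val"] for m in item["item-metadata"]
               if m["rel"] == "urn:X-hypercat:rels:isContentType")
    print(item["href"], "->", fmt)
```

The point is discoverability: a client that has never seen the thermostat before can learn, from the catalog alone, what format its data stream will arrive in – which is precisely what the unbounded IoT lacks today.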
Hyper-competitive firms may bristle at this whole idea, which is the very antithesis of how we have succeeded in developing technology, creating sustainable advantage, and reaping the rewards in the past. Closed IoT systems can succeed in the short run, but only until they grow beyond their boundaries. Privacy advocates will also be quick to speak up, and rightly so: there are still layers of security – starting with the basics of authentication and encryption – needed to keep the data itself safe, and those should also be baked into interchange standards.
For the IoT market, the lesson learned from JTRS should be early collaboration – and global standards bodies like the IEEE, ISO, the ITU, and the SAE need to help catalyze more of the needed higher level data interchange efforts before implementations run rampant.