It is often surprising how closely our science fiction novels and movies model the future. The Internet of Things (IoT) and big data may prove to be an excellent example of this phenomenon. Coined in 1999 by Kevin Ashton (the RFID standards pioneer), the IoT is one of the least descriptive monikers of all time for something very important in the history of technology. At its core, the IoT (or IoE or IoX) is a catchall term for a network of interconnected devices across multiple technologies, spanning everything from smartphones to utility systems, medicine, and vehicles, all of which will collect data and communicate with the cloud and with each other to make “intelligent” decisions about their operation within the total context of the network. Add to this an astounding proliferation of sensors measuring everything from temperature to vibration, chemicals, and magnetic fields. Drop in some big data analytics; make the leap from traditional neural nets and genetic algorithms written in LISP or Prolog to more modern programming languages like Python or Haskell; transition to distributed parallel architectures and a team of Google programmers led by Jeff Dean, Geoffrey Hinton, and Ray Kurzweil; and you begin to have something that looks very much like the mythical “Skynet” from James Cameron’s 1984 movie “The Terminator.”
What exactly are these devices that will be connected?
In 2015, with 13.4 billion connected devices, we are close to saturation on human-computer interface devices such as tablets, mobile phones, and personal computers.
Editor's Note: This article originally appeared in the November 2015 issue of The PCB Magazine.