Surveillance and censorship, creativity and altruism. The internet of things is the last best chance to build an open society. This timeline covers over 134 events and includes some credible predictions and likely milestones for the years ahead. The entries span a range of social and engineering developments related broadly to device networks, and can be sorted by category: drones and satellites, types of innovations, security, network trends, culture, politics and policy, and technical standards. Suggestions for new entries are welcome; please send them to Phil Howard (pnhoward((at))uw.edu).
American mathematician Edward O. Thorp, together with mathematician Claude Shannon, invented the world's first wearable computer. The cigarette-pack-sized device was built to predict the motion of roulette wheels.
UCLA and the Stanford Research Institute (SRI) were connected by the first permanent link of ARPANET, the precursor to the Internet. About a month earlier, on 29 October 1969, the first message had been sent over the network from UCLA to SRI by UCLA student programmer Charley Kline.
American psychologist and computer scientist Dr. Joseph Carl Robnett Licklider wrote a series of memos exploring his idea of an “intergalactic computer network”. He imagined a system of computers connected to one another, a space where all data would be available to everyone from anywhere. This idea paved the way for online banking interfaces, digital libraries, and cloud computing.
French inventor Roland Moreno demonstrated that a plastic card with an embedded computer chip could be used for electronic payments. Moreno is generally credited with inventing the smart card, which he called la carte à puce (“the flea card”). The card took eight years to become popular in France, and even longer to become widespread elsewhere. The first trials of chip-equipped ATM bank cards were successfully conducted in 1984.
British inventor and entrepreneur Michael Aldrich demonstrated real-time transaction processing by connecting a domestic television set to a computer over a telephone line. He called his invention teleshopping. It was a centralized, two-way online system that transmitted information in real time, much as airport departure boards do today.
Metcalfe's law was presented, stating that the value of a network is proportional to the square of the number of connected nodes (n²). In the 1980s the law referred to telephone and fax networks, but with the spread of the internet it carried over to users and connected devices. The logic is that a single node is worth nothing, while each additional device greatly increases the total value of the network by introducing multiple new connections. The law is often used to illustrate the potential growth that comes with the development of the IoT.
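As a back-of-the-envelope illustration (not part of the original timeline): a fully connected network of n nodes has n(n-1)/2 unique links, which grows on the order of n². The short Python sketch below tabulates this for a few network sizes.

```python
def pairwise_links(n: int) -> int:
    """Unique connections among n nodes: n*(n-1)//2, which grows roughly as n^2."""
    return n * (n - 1) // 2

# A single node has no links; each added node multiplies the possible connections.
for n in (1, 2, 10, 100, 1000):
    print(f"{n:>5} nodes -> {pairwise_links(n):>6} possible links")
```

Going from 10 to 100 nodes, for example, multiplies the number of possible links by roughly a hundred, which is the intuition behind the law's application to the IoT.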
American researcher Steve Mann designed and built a wearable multimedia computer with wireless capabilities. The laptop had not yet been invented, so a battery-operated computer was a novelty. The system also had imaging capabilities: Mann carried it in a backpack, viewed output on a helmet-mounted CRT display, and carried a lamp so he could take pictures in the dark.
The US Department of Defense adopted TCP/IP as the standard for all military computer networking. TCP/IP is a set of communication protocols used on computer networks that provides end-to-end connectivity between computers.
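For readers who want a concrete feel for what “end-to-end connectivity” means, here is a purely illustrative sketch using Python's standard socket library: a tiny echo server and a client exchange a message over a TCP connection on the local machine (the port number and message are arbitrary assumptions).

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 9099  # arbitrary loopback address and port for this demo

# Server side: listen for one TCP connection and echo back whatever arrives.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind((HOST, PORT))
srv.listen(1)

def echo_once() -> None:
    conn, _ = srv.accept()
    with conn:
        conn.sendall(conn.recv(1024))

threading.Thread(target=echo_once, daemon=True).start()

# Client side: open an end-to-end TCP connection, send a message, read the reply.
with socket.create_connection((HOST, PORT)) as cli:
    cli.sendall(b"hello over TCP/IP")
    print(cli.recv(1024))  # b'hello over TCP/IP'

srv.close()
```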
Four graduate students at Carnegie Mellon University's School of Computer Science installed micro-switches in a soda vending machine so they could check from their desks whether the machine was stocked with drinks. This was the first Internet-connected appliance in the world.
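To give a flavour of the pattern (a hypothetical sketch, not the students' actual code): the idea boils down to exposing a sensor reading over the network so it can be queried from anywhere. The slot names, port, and readings below are invented for illustration.

```python
import socket

# Hypothetical switch readings: True means a bottle is present in that slot.
SLOT_SENSORS = {"cola_1": True, "cola_2": False, "diet": True}

def status_report() -> bytes:
    """Render the current switch states as a plain-text report."""
    lines = [f"{slot}: {'stocked' if full else 'EMPTY'}" for slot, full in SLOT_SENSORS.items()]
    return ("\n".join(lines) + "\n").encode()

# Serve the report to anyone who connects -- the networked equivalent of
# walking down the hall to look at the machine. Blocks until a client connects,
# e.g. `nc localhost 9100` would print the report.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", 9100))  # arbitrary port for the sketch
    srv.listen(1)
    conn, _ = srv.accept()
    with conn:
        conn.sendall(status_report())
```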
Inventor Charles Walton first patented the Radio Frequency Identification (RFID) device. The device, which consists of a small chip and an antenna, is used to transfer data wirelessly between connected objects. The underlying technology had first been developed for espionage in 1945.