The Turing Machine
Alan Mathison Turing (1912–1954) was an extremely gifted mathematician who was influential in the development of computer science. His famous Turing machine provided a formalization of the concepts of algorithm and computation, and he played a significant role in the creation of the modern computer. Turing discovered that it was possible, in principle, to devise one single universal machine that, all by itself, could carry out any possible computation.
The Turing Machine
The Turing machine is an idealized computing device, consisting of a read/write head (or scanner) with a paper tape passing through it. The tape is divided into squares, each square bearing a single symbol (0 or 1, for example). This tape is the machine's general-purpose storage medium, serving both as the vehicle for input and output and as a working memory for storing the results of intermediate steps of the computation. In his original article of 1936, Turing actually imagines not a mechanical machine but a person, whom he calls the computer, who obediently carries out these deterministic mechanical rules.
The input that is inscribed on the tape before the computation starts must consist of a finite number of symbols. However, the tape is of unbounded length, for Turing's aim was to show that there are tasks that these machines are unable to perform, even given unlimited working memory and unlimited time. The read/write head is programmable. To compute with the device, you program it, inscribe the input on the tape, place the head over the square containing the leftmost input symbol, and set the machine in motion. Once the computation is completed, the machine will come to a halt with the head positioned over the square containing the leftmost symbol of the output (or elsewhere, if so programmed).
The Turing machine can perform six types of fundamental operation: read, write, move left, move right, change state, and halt. A complicated computation may consist of hundreds of thousands, or even millions, of such operations. Despite the Turing machine's simplicity, it is capable of computing anything that any modern computer can compute.
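The six fundamental operations above are enough to simulate a Turing machine in a few lines of code. The sketch below is illustrative (the tape representation, state names, and example program are our own, not Turing's): a dictionary models the unbounded tape, and a transition table maps each (state, symbol) pair to a write-move-change-state step.

```python
# Minimal Turing machine simulator. The tape is a dict from position to
# symbol (blank everywhere else); the program is a transition table
# mapping (state, symbol) -> (symbol to write, move, next state).

BLANK = " "

def run(program, tape_input, start_state="q0", halt_state="halt",
        max_steps=10_000):
    tape = {i: s for i, s in enumerate(tape_input)}
    head, state = 0, start_state
    for _ in range(max_steps):
        if state == halt_state:                 # halt
            break
        symbol = tape.get(head, BLANK)          # read
        write, move, state = program[(state, symbol)]  # change state
        tape[head] = write                      # write
        head += {"L": -1, "R": +1}[move]        # move left / move right
    return "".join(tape[i] for i in sorted(tape)).strip()

# Example program: invert a binary string (0 -> 1, 1 -> 0), halting when
# the head reaches the blank square past the end of the input.
invert = {
    ("q0", "0"): ("1", "R", "q0"),
    ("q0", "1"): ("0", "R", "q0"),
    ("q0", BLANK): (BLANK, "R", "halt"),
}

print(run(invert, "10110"))  # -> 01001
```

Changing the transition table changes what the machine computes, which is exactly the sense in which the device is universal: one fixed mechanism, arbitrarily many programs.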
The History Behind It
Back then, computers were people; they compiled actuarial tables and did engineering calculations. As the Allies prepared for World War II, they faced a critical shortage of human computers for military calculations. When men left for war the shortage got worse, so the U.S. mechanized the work by building the Harvard Mark I, an electromechanical monster 50 feet long. It could do calculations in seconds that took people hours.
The British also needed mathematicians to crack the German Navy's Enigma code. Turing worked at the British top-secret Government Code and Cypher School at Bletchley Park. There code-breaking became an industrial process; 12,000 people worked three shifts, 24/7. Although Polish cryptanalysts had cracked Enigma before the war, the Nazis had since made the Enigma machines more complicated; there were approximately 10^114 possible permutations. Turing designed an electromechanical machine, called the Bombe, that searched through the permutations, and by the end of the war the British were able to read all daily German Naval Enigma traffic. It has been reported that Eisenhower said the contribution of Turing and others at Bletchley shortened the war by as much as two years, saving millions of lives.
As the 1950s progressed, business was quick to see the benefits of computers, and business computing became a new industry. These computers were all universal Turing machines—that's the point: you could program them to do anything.
By the 1970s a generation had grown up with "electronic brains," but they wanted their own personal computers. The problem was that they had to build them. In 1975 some hobbyists formed the Homebrew Computer Club; they were excited by the potential of the new silicon chips to let them build their own computers.
One Homebrew member was a college dropout called Steve Wozniak, who built a simple computer around the MOS 6502 microprocessor, which he hooked up to a keyboard and television. His friend Steve Jobs called it the Apple I and found a Silicon Valley shop that wanted to buy 50 of them for $500 each. Apple had its first sale, and Silicon Valley's start-up culture was born. Another college dropout, Bill Gates, realized that PCs needed software and that people were willing to pay for it—his Microsoft would sell the programs.
Turing's Test and the Legacy He Left Behind
In 1950 Turing published a paper called "Computing Machinery and Intelligence." He had an idea that computers would become so powerful that they would think. He envisaged a time when artificial intelligence (AI) would be a reality. But how would you know whether a machine was intelligent? He devised the Turing Test: a judge sitting at a computer terminal types questions to two entities, one a person and the other a computer. The judge decides which entity is the human and which the computer. If the judge cannot reliably tell them apart, the computer has passed the Turing Test and is deemed intelligent.
Although Turing's vision of AI has not yet been achieved, aspects of AI are increasingly entering our daily lives. Car satellite navigation systems and Google search algorithms use AI. Apple's Siri on the iPhone can understand your voice and respond intelligently. Car manufacturers are developing cars that drive themselves; some U.S. states are drafting legislation that would allow autonomous vehicles on the roads. Turing's vision of AI may soon be a reality.