Editors’ Note: This is the third in a series of four essays on Alan Turing, who is considered the father of modern computing. These essays are by Jack Copeland, an expert on Turing’s life and work.
How could researchers tell if a computer—whether a humanoid robot or a disembodied supercomputer—is capable of thought? This is not an easy question. For one thing, neuroscience is still in its infancy. Scientists don’t know exactly what is going on in our brains when we think about tomorrow’s weather, or plan out a trip to the beach—let alone when we write poetry, or do complex mathematics in our minds. But even if we did know everything there is to know about the functioning of the brain, we might still be left completely uncertain as to whether entities without a human (or mammalian) brain could think. Imagine that a party of extraterrestrials find their way to Earth, and impress us with their mathematics and poetry. We discover they have no organ resembling a human brain; inside they are just a seething mixture of gases, say. Does the fact that these hypothetical aliens contain nothing like human brain cells imply that they do not think? Or is their mathematics and poetry proof enough that they must think—and so also proof that the mammalian brain is not the only way of doing whatever it is that we call thinking?
Of course, this imaginary scenario about aliens is supposed to sharpen up a question that’s much nearer to home. For alien, substitute computer. When computers start to impress us with their poetry and creative mathematics—if they don’t already—is this evidence that they can think? Or do we have to probe more deeply, and examine the inner processes responsible for producing the poetry and the mathematics, before we can say whether or not the computer is thinking? Deeper probing wouldn’t necessarily help much in the case of the aliens—because ex hypothesi the processes going on inside them are nothing like what goes on in the human brain. Even if we never managed to understand the complex gaseous processes occurring inside the aliens, we might nevertheless come to feel fully convinced that they think, because of the way they lead their lives and the way they interact with us. So does this mean that in order to tell whether a computer thinks, we only have to look at what it does—at how good its poetry is—without caring about what processes are going on inside it?
That was certainly what Alan Turing believed. He suggested a kind of driving test for thinking, a viva voce examination that pays no attention at all to whatever causal processes are going on inside the candidate—just as the examiner in a driving test cares only about the candidate’s automobile-handling behavior, and not at all about the nature of the internal processes that produce the behavior. Turing called his test the “imitation game,” but nowadays it is known universally as the Turing test.
At the end of World War II, mathematician and early computer scientist Alan Turing was a hero. He had led the British in breaking the Enigma code, an effort credited with helping bring the war to a close.
But not long after the war, in 1952, he was arrested for what was then a crime in England: his sexuality, or, in legalese, “acts of gross indecency between adult men.” He had a choice: imprisonment or estrogen. He chose the hormone treatment, which made him impotent and caused him to grow breasts. Two years later, he died in what is widely believed to have been a suicide.
In more recent times, England has sought ways to repent for what it did to Alan Turing. In 2009 Prime Minister Gordon Brown issued a formal apology. But official forgiveness remains in the offing: Earlier this year, members of parliament introduced legislation to pardon Turing. Now, several of the nation’s top scientists, including Stephen Hawking, and other leaders have penned a letter to the Telegraph, throwing their support behind the bill.
Alan Turing: Inventor of the Basic Ideas of the Modern Computer and Pioneer of Artificial Intelligence
In 1936, at the age of just 23, Turing invented the fundamental logical principles of the modern computer—almost by accident. A shy, boyish-looking genius, the young Turing had recently been elected a fellow of King’s College, an unusual honor for such a young researcher. King’s College, in the heart of Cambridge, lies a few steps along narrow medieval lanes from Trinity College, where in the seventeenth century Isaac Newton revolutionized our understanding of the universe. King’s College was Turing’s intellectual home, and he remained a fellow there until the early 1950s, when he was given a specially created Readership in the Theory of Computing at Manchester University, a new position in the new field that Turing was instrumental in creating: computer science.
At King’s, the young Turing worked alone in a spartan room at the top of an ancient stone building beside the River Cam (scarcely more than a narrow stream winding through the old masonry, and a productive source of damp and chill). It was all quite the opposite of a modern research facility—Cambridge’s scholars had been doing their thinking in comfortless stone buildings, reminiscent of cathedrals or monasteries, ever since the university had begun to thrive in the Middle Ages. Turing was engaged in highly abstract work in the foundations of mathematics. No one could have guessed that anything of any practical value would emerge from this research, let alone a machine that would change all our lives.
As everyone who can operate a personal computer knows, the way to make the machine perform the job you want—word-processing, say—is to locate the appropriate program in memory and click on it. That’s the so-called ‘stored program’ concept and it was Turing’s invention in 1936. (This was just on paper—no engineering at this stage.) Turing’s fabulous idea, dreamed up by pure thought, was of a single processor—a single slab of hardware—that, by making use of instructions stored inside its memory, could change itself from a machine dedicated to one specific task into a machine dedicated to a completely different task—from calculator to word-processor to chess opponent, for example. Turing called his invention the ‘universal computing machine’; now we call it simply the universal Turing machine. If Isaac Newton had known about it, he would probably have wished that he had thought of it first. Nowadays, though, when nearly everyone owns a physical realization of Turing’s universal machine, his idea of a one-stop-shop computing machine is apt to seem as obvious as the wheel and the arch. But in 1936, when engineers thought in terms of building different machines for different purposes, Turing’s idea of a single universal machine was revolutionary.
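The stored-program idea is easy to see in miniature. The toy sketch below is a modern illustration, not Turing’s own 1936 formalism, and the little “flipper” machine and all of its names are invented for this example. The `run` function plays the role of the universal machine’s single, fixed slab of hardware; what the machine actually does is determined entirely by a program held as ordinary data, which can be swapped for any other program without touching the hardware.

```python
# A toy Turing-machine simulator. The "hardware" (run) never changes;
# the machine's behavior is supplied as data (the program), which is
# the essence of the stored-program idea. Illustrative only: this
# example machine is invented, not one of Turing's.

def run(program, tape, state="start", max_steps=1000):
    """Execute `program`, a dict mapping (state, symbol) to
    (new_symbol, move, new_state), on `tape` (a dict of cells)."""
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")          # "_" is the blank symbol
        new_symbol, move, state = program[(state, symbol)]
        tape[head] = new_symbol               # write
        head += 1 if move == "R" else -1      # move the head
    return tape

# A tiny program: flip every 0/1 on the tape, then halt at a blank.
flipper = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

result = run(flipper, {0: "1", 1: "0", 2: "1"})
print("".join(result[i] for i in range(3)))   # prints "010"
```

Replacing `flipper` with a different transition table turns the very same `run` into a completely different machine, which is, in cartoon form, the move from dedicated machines to the universal one.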
Editors’ Note: This is the first in a series of four essays on Alan Turing, who is considered the father of modern computing and whose work breaking German codes changed the course of World War II. These essays are by Jack Copeland, an expert on Alan Turing.
Writing ‘counterfactual’ history is always speculative, never cut and dried: if some key matters had gone differently, the overall outcome of a war or battle or election might have been very different, or it might nevertheless have been just the same. If the CIA had killed Osama Bin Laden in 2000, 9/11 might still have happened, perhaps because, following Bin Laden’s (counterfactual) death, one of his lieutenants would have stepped forward to take control of Al Qaeda and implement Bin Laden’s plans.
If Hitler’s Operation Sea Lion (Seelöwe)—his planned invasion of Great Britain in 1940—had actually been launched, troop carriers would have poured across the English Channel from France, accompanied by fleets of supply barges loaded with tanks, artillery, and heavy machine guns. During the massive attack by sea and air, thousands of gliders crammed with heavily armed crack German soldiers would have descended onto British soil. Paratroops would also have rained down, with swarms of dive-bombers disabling airfields and holding back a British ground response.
Once the invaders had secured a foothold—a patch of territory containing suitable harbours and airfields—Hitler’s formidable forces would have advanced ruthlessly in every direction, until they held all Britain’s key cities, or so the Führer planned. In the event, though, the Sea Lion invasion was postponed and then abandoned. But Britain’s fate had hung by a thread. If her Royal Air Force had not proved so resilient during the summer of 1940, if the German leader’s attention had not been wandering in the direction of Russia, if Alan Turing’s complicated electromechanical machine or bombe, whimsically named ‘Agnus Dei’—the Lamb of God—had not been breaking the Luftwaffe’s top-secret Enigma communications … then it might all have turned out very differently. When the Imperial Japanese air force attacked Pearl Harbor in 1941, Roosevelt might have faced a Europe completely dominated by Emperor Hirohito’s ally, Hitler.
The tide began to turn against the German military in 1942, with the neutralisation of the North Atlantic U-boat threat, and the humiliating rout of Field Marshal Rommel’s panzer army at El Alamein in North Africa. British successes in the U-boat war—where Turing was a key player—freed up the supply routes from North America to Britain, while the disaster at El Alamein denied Hitler his chance of taking the Suez Canal and capturing the precious Middle Eastern oilfields. Debilitating shortages of fuel plagued the German military for the rest of the war.
‘A Perfect and Beautiful Machine’: What Darwin’s Theory of Evolution Reveals About Artificial Intelligence
Some of the greatest, most revolutionary advances in science have been given their initial expression in attractively modest terms, with no fanfare.
Charles Darwin managed to compress his entire theory into a single summary paragraph that a layperson can readily follow.
Francis Crick and James Watson closed their epoch-making paper on the structure of DNA with a single deliciously diffident sentence. (“It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material.”)
And Alan Turing created a new world of science and technology, setting the stage for solving one of the most baffling puzzles remaining to science, the mind-body problem, with an even shorter declarative sentence in the middle of his 1936 paper on computable numbers:
It is possible to invent a single machine which can be used to compute any computable sequence.
Turing didn’t just intuit that this remarkable feat was possible; he showed exactly how to make such a machine. With that demonstration the computer age was born. It is important to remember that there were entities called computers before Turing came up with his idea, but they were people, clerical workers with enough mathematical skill, patience, and pride in their work to generate reliable results from hours and hours of computation, day in and day out. Many of them were women.
Before the world began developing artificial intelligence, Alan Turing was asking the question, “Can machines think?” In 1936 he described the “Universal Turing Machine”, a single machine that could be programmed to perform any computable task; his famous “Turing Test” for machine thought came later, in a 1950 paper.
Turing biographer Andrew Hodges describes the idea’s significance in this way:
‘It is now almost impossible to read Turing’s 1936 work without thinking of a Turing machine as a computer program, and the Universal Turing Machine as the computer on which different programs can be run. We are now so familiar with the idea of the computer as a fixed piece of hardware, requiring only fresh software to make it do entirely different things, that it is hard to imagine the world without it. It was also essential to Turing’s 1936 work that a Turing machine could be thought of as data to be read and manipulated by another Turing machine — this is the principle of the modifiable stored program on which all computing now depends.’
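Hodges’ last point, that one machine’s description is itself data which another machine can read and manipulate, can also be shown in miniature. The sketch below is a modern illustration with invented names, not anything from Turing’s paper: a transition table of the form (state, symbol) → (write, move, next state) describes one machine, and a second program mechanically rewrites that description to produce a different machine.

```python
# A machine's program is just data, so one program can read and
# rewrite another. Here every L/R head move in a transition table is
# mirrored, mechanically producing a new, different machine.
# Illustrative sketch only; the example machine is invented.

def mirror(program):
    """Return a new machine description with all head moves reversed."""
    flip = {"L": "R", "R": "L"}
    return {key: (write, flip[move], nxt)
            for key, (write, move, nxt) in program.items()}

# A tiny example machine that writes "1" on a blank cell and steps right.
stepper = {("start", "_"): ("1", "R", "start")}

mirrored = mirror(stepper)
print(mirrored[("start", "_")])   # prints "('1', 'L', 'start')"
```

Nothing here executes either machine; `mirror` simply treats a machine description the way a word-processor treats text, which is the “modifiable stored program” principle in its smallest possible form.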
The BBC is marking what would have been his 100th birthday this Saturday with a series of articles about his life. Well, the technology part of his life. Oddly enough (or maybe not so oddly), nothing is said about why he “tragically died” (as one of the articles puts it - he committed suicide) or about his persecution by the British Government for being gay, which effectively ended his life. In fact, in all those articles he is referred to only as “a gay/cultural icon” and nothing else.
In the beginning, it was his work in computing, and his genius, that caught the attention of the British government. They put him to work as a codebreaker, deciphering coded messages from the Nazis, work that was imperative to winning the war.
After the war, he went back to developing the ideas that would eventually become artificial intelligence. However, Turing was gay, and homosexuality was illegal in Britain at the time, an arrestable offense.
Then Turing was burglarized, and he suspected a young man he had taken home one evening. He reported the robbery to the police, and through that report they realized Turing was gay. Instead of being treated as the national hero that he was, he was arrested. He pleaded guilty to an indecency offense and was offered two choices: prison or chemical castration. To avoid the scandal and the loss of his reputation, he opted for chemical castration.
He was to take diethylstilbestrol, which lowers testosterone and raises estrogen levels in the body. It also prevents erections, shrinks the testes, and eventually causes breast growth. And because hormones are tied to brain function, Turing’s thinking started to become muddled. He couldn’t concentrate on anything, and his ability to do his work stopped. His brilliant mind was gone.
He took his life by cyanide poisoning on June 7, 1954, two weeks shy of his 42nd birthday. Lost to us forever.
I had the pleasure of seeing a condensed, 50-minute screening of a documentary about him last Tuesday; I believe the film is coming out shortly. It tells the story of his work, of the people he loved and who inspired him, and of his sad, tragic end. turingfilm.com
UPDATE: I came across this article in the Daily Beast by his brother, John Turing. It gives more interesting insight.
The life and achievements of Alan Turing - the mathematician, codebreaker, computer pioneer, artificial intelligence theoretician, and gay/cultural icon - are being celebrated to mark what would have been his 100th birthday on 23 June.
To mark the occasion the BBC has commissioned a series of essays to run across the week, starting with this overview of Turing’s legacy by Vint Cerf.
Rory Cellan-Jones gets a preview of an exhibition dedicated to the life and work of scientist and computer pioneer Alan Turing.
I’ve worked in computing, and more specifically computer networking, nearly all my life. It’s an industry in a constant state of innovation, always pushing beyond the limits of current capability.
It is sometimes said that “broadband” is whatever network speed you don’t have yet!
Things we take for granted today were, not that long ago, huge technological breakthroughs.
Although I’ve been lucky enough in my career to be involved in the development of the internet, I’ve never lost sight of the role played by my predecessors, without whose pioneering labour so much would not have been accomplished.
This year, the centenary of his birth, one man in particular is deservedly the focus of attention: Alan Turing.
Turing was born into a world that was very different, culturally and technologically, yet his contribution has never been more important.
His is a story of astounding highs and devastating lows. A story of a genius whose mathematical insights helped save thousands of lives, yet who was unable to save himself from social condemnation, with tragic results. Ultimately though, it’s a story of a legacy that laid the foundations for the modern computer age.
In 1936, while at King’s College, Cambridge, Turing published a seminal paper, On Computable Numbers, which introduced two key concepts - “algorithms” and “computing machines” - that continue to play a central role in our industry today.