Future by design

The Second Digital Turn: Design beyond intelligence
Mario Carpo
MIT Press

THE Polish futurist Stanislaw Lem once wrote: “A scientist wants an algorithm, whereas the technologist is more like a gardener who plants a tree, picks apples, and is not bothered about ‘how the tree did it’.”

For Lem, the future belongs to technologists, not scientists. If Mario Carpo is right and the “second digital turn” described in his extraordinary new book comes to term, then Lem’s playful, “imitological” future, where analysis must be abandoned in favour of creative activity, will be upon us in a decade or two. Never mind our human practice of science: science itself will no longer exist, and our cultural life will consist of storytelling, gesture and species of magical thinking.

Carpo studies architecture. Five years ago, he edited The Digital Turn in Architecture 1992-2012, a book capturing the curvilinear, parametric spirit of digital architecture. Think Frank Gehry’s Guggenheim Museum in Bilbao – a sort of deconstructed metal fish head – and you are halfway there.

Such is the rate of change that five years later, Carpo has had to write another book (the urgency of his prose is palpable and thrilling) about an entirely different kind of design. This is a generative design powered by artificial intelligence, with its ability to thug through digital simulations (effectively, breaking things on screen until something turns up that can’t be broken) and arrive at solutions that humans and their science cannot better.

This kind of design has no need of casts, stamps, moulds or dies. No costs need be amortised. Everything can be a one-off at the same unit cost.

Beyond the built environment, it is the spiritual consequences of this shift that matter, for by its light Carpo shows all cultural history to be a gargantuan exercise in information compression.

Unlike their AIs, human beings cannot hold much information at any one time. Hence, for example, the Roman alphabet: a marvel of compression, approximating all possible vocalisations with just 26 characters. Now that we can type and distribute any glyph at the touch of a button, is it any wonder emojis are supplementing our tidy 26-letter communications?

Science itself is simply a series of computational strategies to draw the maximum inference from the smallest number of precedents. Reduce the world to rules and there is no need for those precedents. We have done this for so long and so well that some of us have forgotten that “rules” aren’t “real” rules; they are just generalisations.

AIs simply gather or model as many precedents as they wish. Left to collect data according to their own strengths, they are, Carpo says, “postscientific”. They aren’t doing science we recognise: they are just thugging.


Carpo foresees the “separation of the minds of the thinkers from the tools of computation”. But in that alienation, I think, lies our reason to go on. Because humans cannot handle very much data at any one time, sorting is vital, which means we have to assign meaning. Sorting is therefore the process whereby we turn data into knowledge. Our inability to do what computers can do has a name already: consciousness.

Carpo’s succinctly argued future has us return to a tradition of orality and gesture, where these forms of communication need no reduction or compression since our tech will be able to record, notate, transmit, process and search them, making all cultural technologies developed to handle these tasks “equally unnecessary”. This will be neither advance nor regression. Evolution, remember, is maddeningly valueless.

Could we ever have evolved into Spock-like hyper-rationality? I doubt it. Carpo’s sincerity, wit and mischief show that Prospero is more the human style. Or Peter Pan, who observed: “You can have anything in life, if you will sacrifice everything else for it.”

 

“We don’t know why we did it”

Two views of the US space programme reviewed for New Scientist, 2 July 2014

“WE HAVE no need of other worlds,” wrote Stanislaw Lem, the Polish science fiction writer and satirist in 1961. “We need mirrors. We don’t know what to do with other worlds. A single world, our own, suffices us; but we can’t accept it for what it is.”

A few years later, as NASA’s advocates hunted for suitable justification for the US’s $24 billion effort to put a man on the moon, they began to invoke humanity’s “outward urge” – an inborn desire to leave our familiar surroundings and explore strange new worlds.

A hastily concocted migration instinct might explain tourism. But why astronauts visited the moon, described by the 1940s US columnist Milton Mayer as a “pulverised rubble… like Dresden in May or Hiroshima in August”, requires a whole other level of blarney.

In Marketing the Moon: The selling of the Apollo lunar program, released earlier this year, David Meerman Scott and Richard Jurek curated that blarney in their illustrated account of how Apollo was sold to a public already paying a bloody price for the Vietnam war.

Historian Matthew Tribbe, on the other hand, looks in an almost diametrically opposite direction. His No Requiem for the Space Age sweeps aside the Apollo programme’s technocratic special pleading – and the subsequent nostalgia – to argue that Americans fell out of love with space exploration even before Neil Armstrong took his first steps on the moon in July 1969.

There is no doubt that national disillusionment with the space programme swelled during the 1970s, as counter-cultural movements sent the US on “the biggest introspective binge any society in history has undergone”. But digging beneath this familiar narrative, Tribbe also shows that opposition to Apollo was both long-standing and intellectually rigorous.

The Nobel laureate physicist Max Born called Apollo “a triumph of intellect, but a tragic failure of reason”. And novelist Norman Mailer considered it “the deepest of nihilistic acts – because we don’t know why we did it”.

Apollo was the US’s biggest, brashest entry in its heart-stoppingly exciting – and terrifying – political and technological competition with the Soviet Union. By the time Apollo 11 was launched, however, that race was already won, and only a fanatic (or a military-industrial complex) would have kept running.

There was a fairly concerted attempt to sell Apollo as science. But that never rang true, and anyway what we really seek in space, as the science fiction writer Arthur C. Clarke told the American Aeronautical Society in 1967, is “not knowledge, but wonder, beauty, romance, novelty – and above all, adventure”. Apollo was supposed to offer the world’s most technologically advanced nation a peacetime goal as challenging and inspiring as war.

But the intractability of the war in Vietnam put paid to John F. Kennedy’s fine words to Congress on 25 May 1961, about sending an American safely to the moon before the end of the decade. As the Washington Evening Star columnist Frank R. Getlein observed: “The reason you have a moral equivalent of war is so you don’t have to have war… For us Americans, unfortunately, the moral equivalent of war has turned out to be war.”

Tribbe argues that popular enthusiasm was doused as soon as people realised just who was going into space – not them, but the representatives of the very technocratic power structure that was wreaking havoc on Earth.

This, you could argue, was hardly NASA’s fault. So it is reassuring, among all this starkly revealed futility, to see Tribbe expressing proper respect and, indeed, real warmth for NASA and its astronauts. NASA had labelled them “super-normal”; with such a moniker, it was perhaps inevitable that they failed to capture hearts and minds as easily as everyone had assumed they would. While public uninterest is Tribbe’s theme, he does not lay the blame for it at NASA’s door.

Explorations rarely inspire contemporary stay-at-homes. For example, over a century elapsed between Columbus’s initial voyage and the first permanent English settlements. Lem was right. We don’t need alien places. We need an ever-expanding supply of human ones. The moon may yet provide them. This, at least, is the compelling and technically detailed argument of Arlin Crotts’s forthcoming book The New Moon: Water, exploration, and future habitation – a perfect speculative antidote for those who find Tribbe’s history disheartening.

Tribbe quotes an unnamed journalist who wrote, during the Vietnam war: “The moon is a dream for those who have no dreams.” This may sum up many of the problems people had with Apollo in the 1970s. But Tribbe is no pessimist, and history need not demoralise us. Times and technologies change, so do nations, and so, come to think of it, do dreams.

Maths into English

One to Nine by Andrew Hodges and The Tiger that Isn’t by Michael Blastland and Andrew Dilnot
reviewed for the Telegraph, 22 September 2007

Twenty-four years have passed since Andrew Hodges published his biography of the mathematician Alan Turing. Hodges, a long-term member of the Mathematical Physics Research Group at Oxford, has spent the years since exploring the “twistor geometry” developed by Roger Penrose, writing music and dabbling with self-promotion.

Follow the link to One to Nine’s web page, and you will soon be stumbling over the furniture of Hodges’s other lives: his music, his sexuality, his ambitions for his self-published novel – the usual spillage. He must be immune to bathos, or blind to it. But why should he care what other people think? He knows full well that, once put in the right order, these base metals will be transformed.

“Writing,” says Hodges, “is the business of turning multi-dimensional facts and ideas into a one-dimensional string of symbols.”

One to Nine – ostensibly a simple snapshot of the mathematical world – is a virtuoso stream of consciousness containing everything important there is to say about numbers (and Vaughan Williams, and climate change, and the Pet Shop Boys) in just over 300 pages. It contains multitudes. It is cogent, charming and deeply personal, all at once.

“Dense” does not begin to describe it. There is extraordinary concision at work. Hodges covers colour space and colour perception in two or three pages. The exponential constant e requires four pages. These examples come from the extreme shallow end of the mathematical pool: there are depths here not everyone will fathom. But this is the point: One to Nine makes the unfathomable enticing and gives the reader tremendous motivation to explore further.

This is a consciously old-fashioned conceit. One to Nine is modelled on Constance Reid’s 1956 classic, From Zero to Infinity. Like Reid’s, each of Hodges’s chapters explores the ideas associated with a given number. Mathematicians are quiet iconoclasts, so this is work that each generation must do for itself.

When Hodges considers his own contributions (in particular, to the mathematics underpinning physical reality), the skin tightens over the skull: “The scientific record of the past century suggests that this chapter will soon look like faded pages from Eddington,” he writes. (Towards the end of his life, Sir Arthur Eddington, who died in 1944, assayed a “theory of everything”. Experimental evidence ran counter to his work, which today generates only intermittent interest.)

But then, mathematics “does not have much to do with optimising personal profit or pleasure as commonly understood”.

The mordant register of his prose serves Hodges as well as it served Turing all those years ago. Like Turing: the Enigma, One to Nine proceeds, by subtle indirection, to express a man through his numbers.

If you think organisations, economies or nations would be more suited to mathematical description, think again. Michael Blastland and Andrew Dilnot’s The Tiger that Isn’t contains this description of the International Passenger Survey, the organisation responsible for producing many of our immigration figures:

The ferry heaves into its journey and, equipped with their passenger vignettes, the survey team members also set off, like Attenboroughs in the undergrowth, to track down their prey, and hope they all speak English. And so the tides of people swilling about the world… are captured for the record if they travel by sea, when skulking by slot machines, half-way through a croissant, or off to the ladies’ loo.

Their point is this: in the real world, counting is back-breaking labour. Those who sieve the world for numbers – surveyors, clinicians, statisticians and the rest – are engaged in difficult work, and the authors think it nothing short of criminal the way the rest of us misinterpret, misuse or simply ignore their hard-won results. This is a very angry and very funny book.

The authors have worked together before, on the series More or Less – BBC Radio 4’s antidote to the sort of bad mathematics that mars personal decision-making, political debate, most press releases, and not a few items from the corporation’s own news schedule.

Confusion between correlation and cause, wild errors in the estimation of risk, the misuse of averages: Blastland and Dilnot round up and dispatch whole categories of woolly thinking.

They have a positive agenda. A handful of very obvious mathematical ideas – ideas they claim (with a certain insouciance) are entirely intuitive – are all we need to wield the numbers for ourselves; with them, we will be better informed, and will make more realistic decisions.

This is one of those maths books that claims to be self-help, and on the evidence presented here, we are in dire need of it. A late chapter contains the results of a general knowledge quiz given to senior civil servants in 2005.

The questions were simple enough. Among them: what share of UK income tax is paid by the top one per cent of earners? For the record, in 2005 it was 21 per cent. Our policy-makers didn’t have a clue.

“The deepest pitfall with numbers owes nothing to the numbers themselves and much to the slack way they are treated, with carelessness all the way to contempt.”

This jolly airport read will not change all that. But it should stir things up a bit.

Unknown Quantity: a Real and Imagined History of Algebra by John Derbyshire

reviewed for the Telegraph, 17 May 2007

In 1572, the civil engineer Rafael Bombelli published a book of algebra, which, he said, would enable a novice to master the subject. It became a classic of mathematical literature. Four centuries later, John Derbyshire has written another complete account. It is not, and does not try to be, a classic. Derbyshire’s task is harder than Bombelli’s. A lot has happened to algebra in the intervening years, and so our expectations of the author – and his expectations of his readers – cannot be quite as demanding. Nothing will be mastered by a casual reading of Unknown Quantity, but much will be glimpsed of this alien, counter-intuitive, yet extremely versatile technique.

Derbyshire is a virtuoso at simplifying mathematics; he is best known for Prime Obsession (2003), an account of the Riemann hypothesis that very nearly avoided mentioning calculus. But if Prime Obsession was written in the genre of mathematical micro-histories established by Simon Singh’s Fermat’s Last Theorem, Derbyshire’s new work is more ambitious, more rigorous and less cute.

It embraces a history as long as the written record and its stories stand or fall to the degree that they contribute to a picture of the discipline. Gone are Prime Obsession’s optional maths chapters; in Unknown Quantity, six “maths primers” preface key events in the narrative. The reader gains a sketchy understanding of an abstract territory, then reads about its discovery. This is ugly but effective, much like the book itself, whose overall tone is reminiscent of Melvyn Bragg’s Radio 4 programme In Our Time: rushed, likeable and impossibly ambitious.

A history of mathematicians as well as mathematics, Unknown Quantity, like all books of its kind, labours under the shadow of E T Bell, whose Men of Mathematics (1937) set a high bar for readability. How can one compete with a description of 19th-century expansions of Abel’s Theorem as “a Gothic cathedral smothered in Irish lace, Italian confetti and French pastry”?

If subsequent historians are not quite left to mopping-up operations, it often reads as though they are. In Unknown Quantity, you can almost feel the author’s frustration as he works counter to his writerly instinct (he is also a novelist), applying the latest thinking to his biography of the 19th-century algebraist Évariste Galois – and draining much colour from Bell’s original.

Derbyshire makes amends, however, with a few flourishes of his own. Also, he places himself in his own account – a cultured, sardonic, sometimes self-deprecating researcher. This is not a chatty book, thank goodness, but it does possess a winning personality.

Sometimes, personality is all there is. The history of algebra is one of stops and starts. Derbyshire declares that for 269 years (during the 13th, 14th and early 15th centuries) little happened. Algebra is the language of abstraction, an unnatural way of thinking: “The wonder, to borrow a trope from Dr Johnson, is not that it took us so long to learn how to do this stuff; the wonder is that we can do it at all.”

The reason for algebra’s complex notation is that, in Leibniz’s phrase, it “relieves the imagination”, allowing us to handle abstract concepts by manipulating symbols. The idea that it might be applicable to things other than numbers – such as sets, and propositions in logic – dawned with tantalising slowness. By far the greater part of Derbyshire’s book tells this tale: how mathematicians learned to let go of number, and trust the terrifying fecundity of their notation.

Then, as we enter the 20th century, and algebra’s union with geometry, something odd happens: the mathematics gets harder to do but easier to imagine. Maths, of the basic sort, is a lousy subject to learn. Advanced mathematics is rich enough to sustain metaphor, so it is in some ways simpler to grasp.

Derbyshire’s parting vision of contemporary algebra – conveyed through easy visual analogies, judged by its applicability to physics, realised in glib computer graphics – is almost a let-down. The epic is over. The branches of mathematics have so interpenetrated each other, it seems unlikely that algebra, as an independent discipline, will survive.

This is not a prospect Derbyshire savours, which lends his book a mordant note. This is more than an engaging history; it records an entire, perhaps endangered, way of thinking.

 

Elephants on Acid and Other Bizarre Experiments by Alex Boese

There is a connection between vaudeville and science, and it is more profound than people credit. Alex Boese’s collection of bizarre scientific anecdotes illuminates this connection, claims far too much for it, and loses the thread of it entirely.

This probably doesn’t matter – by Boese’s own estimation, Elephants on Acid is a book you dip into in the bathroom. There’s even an entire chapter, ‘Toilet Reading’, dedicated to this very idea.

But Boese, quietly meticulous, is a champion of the idea of science. So, at the risk of taking a mallet to a sugar-coated almond, let’s take him seriously here.

Boese is the curator of a splendid on-line museum of hoaxes – museumofhoaxes.com. To move from deliberate fakery to science gone awry, deliberately or not, is, Boese argues, but a small step.

Hoaxers and experimenters are both manipulators of reality. But only experimenters wrap themselves in the authority of science. ‘This sense of gravity is what lends bizarre experiments their particularly surreal quality.’ More charitably, he might have added: only scientists run a serious and career-busting risk of hoaxing themselves.

Boese’s accounts of unlikely experiments include sensible and legitimate studies into risible subjects (how could studies into human ticklishness not sound silly?). Elsewhere, accounts of doubtful ‘discoveries’ reveal how badly credulousness and ambition will misdirect the enquiring mind.

Wandering among Boese’s carnival of curiosities we learn, for example, the precise weight of a human soul and acquire a method for springing crystalline insects out of rocks.

Less convincing are his stories of research misinterpreted by gullible or hostile media. A sharper editor would have spotted when Boese’s eye for a good tale was leading him astray.

In 1943 the behaviourist Burrhus Skinner invented a comfortable, labour-saving crib for his baby daughter – only to be pilloried for imprisoning her in an experimental ‘box’. This is a tale of irony and injustice, deftly told. But it is not ‘bizarre science’.

It’s devilishly difficult to get good at something unless you can find the fun in it. The more intellectually serious a work is, the more likely it is to have playful, even mischievous aspects. Science is no exception.

The more entertaining, and less troubling, of Boese’s tales involve ingenious, self-aware acts of scientific folly. We learn a truly magnificent (and wrong) formula for working out the moment at which cocktail parties become too loud.

A study that involves erotically propositioning young men on a wobbly bridge must surely have fallen out of the bottom of an Atom Egoyan movie. And pet owners should heed a slapstick 2006 study entitled ‘Do Dogs Seek Help in an Emergency?’ (‘Pinned beneath the shelves, each owner let go of his or her dog’s leash and began imploring the animal to get help from the person in the lobby.’)

Yet, for all its hilarity, Elephants on Acid proves to be an oddly disturbing experience when read cover-to-cover.

The decision to put all the truly gut-wrenching vivisection stories in the first chapter was foolhardy. Robert White’s 1962 attempt to isolate a monkey’s brain by removing, piece by piece, the face and skull, absolutely belongs in this book – but it is delivered so early that it’s one hell of a hurdle to clear in the first five minutes of reading.

Other horrors lurk in wait for those who persevere (Ewen Cameron’s brainwashing experiments of the 1950s are particularly horrendous). Boese’s off-the-cuff observation that the Cold War had its surgical and psychological aspects is not staggeringly original, but it does mollify our easy outrage at such past ‘mistakes’.

Quite rightly so, for most of what we primly label ‘maverick science’ is no such thing; it is simply science that served a long-since-vanished purpose.

Most disturbing of all, however, are those celebrated and familiar behavioural experiments that, while harming no one, reveal human gullibility, spite, vanity and witlessness.

Philip Zimbardo’s prison-psychology experiment at Stanford University had to be terminated, so keenly did his volunteers brutalise each other. Testing the limits of obedience (clue: there aren’t any), Stanley Milgram invited volunteers to inflict what they thought were potentially lethal electric shocks to people. Few demurred. Ironically, these kinds of experiments share methods with many stage magic routines.

The connection between vaudeville and science is profound, all right – and not particularly funny. Boese is right to invite us to dip in and out of his book. His facetious mask cannot hide for long the underlying seriousness of such striking material.

 

Glimpses of the Wonderful

Glimpses of the Wonderful by Ann Thwaite, reviewed for New Scientist, 2 November 2002

WHY do we study the natural world? Today, we might answer: to uncover life’s underlying principles. In the mid-19th century, those underlying principles were thought to be already established: life was a Creation of God’s.

Ann Thwaite, a literary biographer best known for her lives of Edmund Gosse and A. A. Milne, forays into the history of science with this life of Edmund Gosse’s father, the naturalist Philip Henry Gosse.

Thwaite shows that Gosse believed “the gratification of scientific curiosity is worse than useless if we ignore God”. After all, what is science for, if not veneration? What Gosse never could do was abandon his belief in the revealed Word and take up the un-anthropomorphic “search for underlying principles” which would become the defining feature of modern science.

A self-styled Puritan who famously called Christmas pudding “the devil’s sweetmeat”, Gosse was also the finest naturalist of his age. He enjoyed a lifelong, friendly correspondence with Charles Darwin, and popularised the science of his day with rigour and intelligence. For most of us, though, Gosse is best known through his son’s memoir Father and Son – a poignant account of Edmund’s father’s “strange severities and eccentric prohibitions”, to which Thwaite provides a robust response.

Dogmatic belief shields us from the inevitability of death. Gosse, born into a millennial age, believed that Christ would return before he died. He spent the last hours of his life in a state of heart-rending and terrible dejection. Who can say they do not share Gosse’s terrible fear of death? His unenviable distinction was to hold fast to conventional comforts in a revolutionary age.

Thwaite weaves together Gosse’s professional studies and personal convictions, not into some dead synthesis, but into a story of a man caught in the toils of the scientific establishment as it re-geared itself for the modern age. Omphalos, Gosse’s great – and greatly lampooned – attempt to marry creationist dogma with the evolutionary record, is the work by which he is best known. It is a measure of Thwaite’s intellectual grasp that we understand how well considered that book really is, and at the same time how unworthy of him. Better that we remember Gosse as a friend remembered him: a figure to embody all the contradictions of his day, “plunging into a pool in full sacerdotal black, after a sea anemone”.