Putting the Kama Sutra in the shade

Reviewing a new verse translation of Gendun Chopel’s Treatise on Passion for The Spectator, 9 June 2018.

The Tibetan artist and poet Gendun Chopel was born in 1903. He was identified as an incarnate lama, and ordained as a Buddhist monk. In 1934 he renounced his vows, quit Tibet for India, learned Sanskrit and — if his long poem, usually translated as A Treatise on Passion, is to be taken at face value — copulated with every woman who let him.

Twelve years later he returned to Tibet, and was thrown into prison on trumped-up charges. The experience broke him. He died of cirrhosis in 1951, as troops of China’s People’s Liberation Army were marching through the streets of Lhasa.

Chopel’s reputation as the most important Tibetan writer of the 20th century is secure, resting mostly on his travelogue, Grains of Gold. The Passion Book is very different; it is Chopel’s reply to the kamasastra, a classical genre of Sanskrit erotica best known to us through one rather tame work, the Kama Sutra.

If Chopel had wanted to show off to his peers back home he could simply have translated the Kama Sutra — but where would have been the fun in that? The former monk spent four years researching and writing his own spectacularly explicit work of Tibetan kamasastra.

It is impossible not to like Chopel — ‘A monastic friend undoing his way of life,/ A narrow-minded poser losing his facade’ — if only for the sincerity of his effort. At one point he tries to get the skinny on female masturbation: ‘Other than scornful laughs and being hit with fists/ I could not find even one who would give an honest answer.’

Still, he gets it: ‘Since naked flesh and sinew are different,’ he warns his (literate, therefore male) readership, ‘How can a thorn sense what the wound feels?’

Thus, like touching an open wound,
The pleasure and pain of women is intense

Chopel insists that women’s and men’s experiences of sex differ, and that women are not mere sources of male pleasure but full partners in the play of passion. So far, so safe. But let’s not be too quick to tidy up Chopel’s long, dizzying, delirious mess of a poem, which jumps from folk wisdom about how to predict a woman’s future by studying the moles on her face, to a jeremiad against the hypocrisy of the rich and powerful, to evocations of tantric states, to the sexual preferences of women in various regions of India, to sexual positions, to full-blown sexual delirium:

They copulate squatting and they copulate standing;
Intertwined, with head and foot reversed, they copulate.
Hanging the woman in the air
With a rope of silk they copulate.

Chopel’s translators, Donald S. Lopez Jr and Thupten Jinpa — Tibetan and Buddhist scholars who a few years ago also translated Grains of Gold — have appended a long afterword which goes some way to revealing what is going on here. In the Christian faith, sexual intercourse may lead to hell. The early tradition of Buddhism took a different position: sex is fine, so far as it goes; it’s everything that follows — marriage, home, property, domestic contentment, the pram in the hall — that paves the road to perdition.

This is what inspires Buddhism’s tradition of astounding misogyny. Something has got to stop you from having sex with your own wife — and a famous Mahayana sutra has the solution. Think of her as a demon. An ogre. A hag. As sickness, old age, or death:

As a huge wolf, a huge sea monster, and a huge cat; a black snake, a crocodile, and a demon that causes epilepsy; and as swollen, shrivelled and diseased.

The rise of the tantric tradition altered sexual attitudes to the extent that one was now actually obliged to have intercourse if one ever hoped to achieve buddhahood. But the ideal tantric playmate — a girl of 16 or younger, and ideally low-caste — was still no more than a tool for the enlightenment of an elite male.

Chopel, coming late to the ordinary delights and comforts of sex, was having none of it. Lopez and Jinpa speculate entertainingly about where Chopel sits in the pantheon of such early sexologists as Ellis, Freud and Reich. For sure, he was a believer in sexual liberation: ‘When suitable deeds are prohibited in public,’ he asserts, ‘Unsuitable deeds will be done in private.’

Jeffrey Hopkins translated Chopel’s A Treatise on Passion into prose in 1992 as Tibetan Arts of Love. This is the first effort in verse, and though it is a clear scholarly advance, the translators have struggled to render the carefully metered original into lines of even roughly the same number of syllables. You can understand their bind: there is still no critical edition of the Tibetan original. With so much basic scholarship left to be done, it would have been pointless for them to jazz-hand their way through a loose paraphrase.

Their effort captures Chopel’s charm, and that’s the main thing. As Chopel said of the act itself: ‘It may not be a virtue, but how could it be a sin?’

Elements of surprise

Reading Vera Tobin’s Elements of Surprise for New Scientist, 5 May 2018

How do characters and events in fiction differ from those in real life? And what is it about our experience of life that fiction exaggerates, omits or captures to achieve its effects?

Effective fiction is Vera Tobin’s subject. And as a cognitive scientist, she knows how pervasive and seductive it can be, even in – or perhaps especially in – the controlled environment of an experimental psychology lab.

Suppose, for instance, you want to know which parts of the brain are active when forming moral judgements, or reasoning about false beliefs. Research in these fields rests on fMRI brain scans. Volunteers receive short story prompts with information about outcomes or character intentions and, while their brains are scanned, have to judge what other characters ought to know or do.

“As a consequence,” writes Tobin in her new book Elements of Surprise, “much research that is putatively about how people think about other humans… tells us just as much, if not more, about how study participants think about characters in constructed narratives.”

Tobin is weary of economists banging on about the “flaws” in our cognitive apparatus. “The science on this phenomenon has tended to focus on cataloguing errors people make in solving problems or making decisions,” writes Tobin, “but… its place and status in storytelling, sense-making, and aesthetic pleasure deserve much more attention.”

Tobin shows how two major “flaws” in our thinking are in fact the necessary and desirable consequence of our capacity for social interaction. First, we wildly underestimate our differences. We model each other in our heads and have to assume this model is accurate, even while we’re revising it, moment to moment. At the same time, we have to assume no one else has any problem performing this task – which is why we’re continually mortified to discover other people have no idea who we really are.

Similarly, we find it hard to model the mental states of people, including our past selves, who know less about something than we do. This is largely because we forget how we came to that privileged knowledge.

There are implications for autism, too. It is, Tobin says, unlikely that many people with autism “lack” an understanding that others think differently – known as “theory of mind”. It is more likely they have difficulty inhibiting their knowledge when modelling others’ mental states.

And what about Emma, titular heroine of Jane Austen’s novel? She “is all too ready to presume that her intentions are unambiguous to others and has great difficulty imagining, once she has arrived at an interpretation of events, that others might believe something different”, says Tobin. Austen’s brilliance was to fashion a plot in which Emma experiences revelations that confront the consequences of her “cursed thinking” – a cognitive bias making us assume any person with whom we communicate has the background knowledge to understand what is being said.

Just as we assume others know what we’re thinking, we assume our past selves thought as we do now. Detective stories exploit this foible. Mildred Pierce, Michael Curtiz’s 1945 film, begins at the end, as it were, depicting the story’s climactic murder. We are fairly certain we know who did it, but we flash back to the past and work forward to the present only to find that we have misinterpreted everything.

I confess I was underwhelmed on finishing this excellent book. But then I remembered Sherlock Holmes’s complaint (mentioned by Tobin) that once he reveals the reasoning behind his deductions, people are no longer impressed by his singular skill. Tobin reveals valuable truths about the stories we tell to entertain each other, and those we tell ourselves to get by, and how they are related. Like any good magic trick, it is obvious once it has been explained.

Pushing the boundaries

Rounding up some cosmological pop-sci for New Scientist, 24 March 2018

In 1872, the physicist Ludwig Boltzmann developed a theory of gases that confirmed the second law of thermodynamics, more or less proved the existence of atoms and established the asymmetry of time. He went on to describe temperature, and how it governed chemical change. Yet in 1906, this extraordinary man killed himself.

Boltzmann is the kindly if gloomy spirit hovering over Peter Atkins’s new book, Conjuring the Universe: The origins of the laws of nature. It is a cheerful, often self-deprecating account of how most physical laws can be unpacked from virtually nothing, and how some constants (the peculiarly precise and finite speed of light, for example) are not nearly as arbitrary as they sound.

Atkins dreams of a final theory of everything to explain a more-or-less clockwork universe. But rather than wave his hands about, he prefers to clarify what can be clarified, clear his readers’ minds of any pre-existing muddles or misinterpretations, and leave them, 168 succinct pages later, with a rather charming image of him tearing his hair out over the fact that the universe did not, after all, pop out of nothing.

It is thanks to Atkins that the ideas Boltzmann pioneered, at least in essence, can be grasped by us poor schlubs. Popular science writing has always been vital to science’s development. We ignore it at our peril and we owe it to ourselves and to those chipping away at the coalface of research to hold popular accounts of their work to the highest standards.

Enter Brian Clegg. He is such a prolific writer of popular science, it is easy to forget how good he is. Icon Books is keeping him busy writing short, sweet accounts for its Hot Science series. The latest is Gravitational Waves: How Einstein’s spacetime ripples reveal the secrets of the universe.

Clegg delivers an impressive double punch: he transforms a frustrating, century-long tale of disappointment into a gripping human drama, affording us a vivid glimpse into the uncanny, depersonalised and sometimes downright demoralising operations of big science. And readers still come away wishing they were physicists.

Less polished, and at times uncomfortably unctuous, Catching Stardust: Comets, asteroids and the birth of the solar system is nevertheless a promising debut from space scientist and commentator Natalie Starkey. Her description of how, from the most indirect evidence, a coherent history of our solar system was assembled, is astonishing, as are the details of the mind-bogglingly complex Rosetta mission to rendezvous with comet 67P/Churyumov-Gerasimenko – a mission in which she was directly involved.

It is possible to live one’s whole life within the realms of science and discovery. Plenty of us do. So it is always disconcerting to be reminded that longer-lasting civilisations than ours have done very well without science, or even formal logic. And who are we to say they afforded less happiness and fulfilment than our own?

Nor can we tut-tut at the way ignorant people today ride science’s coat-tails – not now antibiotics are failing and the sixth extinction is chewing its way through the food chain.

Physicists, especially, find such thinking well-nigh unbearable, and Alan Lightman speaks for them in his memoir Searching for Stars on an Island in Maine. He wants science to rule the physical realm and spirituality to rule “everything else”. Lightman is an elegant, sensitive writer, and he has written a delightful book about one man’s attempt to hold the world in his head.

But he is wrong. Human culture is so rich, diverse, engaging and significant, it is more than possible for people who don’t give a fig for science or even rational thinking to live lives that are meaningful to themselves and valuable to the rest of us.

“Consilience” was biologist E.O. Wilson’s word for the much-longed-for marriage of human enquiry. Lightman’s inadvertent achievement is to show that the task is more than just difficult: it is absurd.

Writing about knowing

Reading John Brockman’s anthology This Idea Is Brilliant: Lost, overlooked, and underappreciated scientific concepts everyone should know for New Scientist, 24 February 2018 

Literary agent and provocateur John Brockman has turned popular science into a sort of modern shamanism, packaged non-fiction into gobbets of smart thinking, made stars of unlikely writers and continues to direct, deepen and contribute to some of the most hotly contested conversations in civic life.

This Idea Is Brilliant is the latest of Brockman’s annual anthologies drawn from edge.org, his website and shop window. It is one of the stronger books in the series. It is also one of the more troubling, addressing, informing and entertaining a public that has recently become extraordinarily confused about truth and falsehood, fact and knowledge.

Edge.org’s purpose has always been to collide scientists, business people and public intellectuals in fruitful ways. This year, the mix in the anthology leans towards the cognitive sciences, philosophy and the “freakonomic” end of the non-fiction bookshelf. It is a good time to return to basics: to ask how we know what we know, what role rationality plays in knowing, what tech does to help and hinder that knowing, and, frankly, whether in our hunger to democratise knowledge we have built a primrose-lined digital path straight to post-truth perdition.

Many contributors, biting the bullet, reckon so. Measuring the decline in the art of conversation against the rise of social media, anthropologist Nina Jablonski fears that “people are opting for leaner modes of communication because they’ve been socialized inadequately in richer ones”.

Meanwhile, an applied mathematician, Coco Krumme, turning the pages of Jorge Luis Borges’s short story The Lottery in Babylon, conceptualises the way our relationship with local and national government is being automated to the point where fixing wayward algorithms involves the application of yet more algorithms. In this way, civic life becomes opaque and arbitrary: a lottery. “To combat digital distraction, they’d throttle email on Sundays and build apps for meditation,” Krumme writes. “Instead of recommender systems that reveal what you most want to hear, they’d inject a set of countervailing views. The irony is that these manufactured gestures only intensify the hold of a Babylonian lottery.”

Of course, information technology wasn’t created on a whim. It is a cognitive prosthesis for significant shortfalls in the way we think. Psychologist Adam Waytz cuts to the heart of this in his essay “The illusion of explanatory depth” – a phrase describing how people “feel they understand the world with far greater detail, coherence and depth than they really do”.

Humility is a watchword here. If our thinking has holes in it, if we forget, misconstrue, misinterpret or persist in false belief, if we care more for the social consequences of our beliefs than their accuracy, and if we suppress our appetite for innovation in times of crisis (all subjects of separate essays here), there are consequences. Why on earth would we imagine we can build machines that don’t reflect our own biases, or don’t – in a ham-fisted effort to correct for them – create ones of their own we can barely spot, let alone fix?

Neuroscientist Sam Harris is one of several here who, searching for a solution to the “truthiness” crisis, simply appeals to basic decency. We must, he argues, be willing to be seen to change our minds: “Wherever we look, we find otherwise sane men and women making extraordinary efforts to avoid changing [them].”

He has a point. Though our cognitive biases, shortfalls and the like make us less than ideal rational agents, evolution has equipped us with social capacities that, smartly handled, run rings round the “cleverest” algorithm.

Let psychologist Abigail Marsh have the last word: “We have our flaws… but we can also claim to be the species shaped by evolution to possess the most open hearts and the greatest proclivity for caring on Earth.” This may, when all’s said and done, have to be enough.

Future by design

The Second Digital Turn: Design beyond intelligence
Mario Carpo
MIT Press

The Polish futurist Stanislaw Lem once wrote: “A scientist wants an algorithm, whereas the technologist is more like a gardener who plants a tree, picks apples, and is not bothered about ‘how the tree did it’.”

For Lem, the future belongs to technologists, not scientists. If Mario Carpo is right and the “second digital turn” described in his extraordinary new book comes to term, then Lem’s playful, “imitological” future, where analysis must be abandoned in favour of creative activity, will be upon us in a decade or two. Never mind our human practice of science: science itself will no longer exist, and our cultural life will consist of storytelling, gesture and species of magical thinking.

Carpo studies architecture. Five years ago, he edited The Digital Turn in Architecture 1992-2012, a book capturing the curvilinear, parametric spirit of digital architecture. Think Frank Gehry’s Guggenheim Museum in Bilbao – a sort of deconstructed metal fish head – and you are halfway there.

Such is the rate of change that five years later, Carpo has had to write another book (the urgency of his prose is palpable and thrilling) about an entirely different kind of design. This is a generative design powered by artificial intelligence, with its ability to thug through digital simulations (effectively, breaking things on screen until something turns up that can’t be broken) and arrive at solutions that humans and their science cannot better.

This kind of design has no need of casts, stamps, moulds or dies. No costs need be amortised. Everything can be a one-off at the same unit cost.

Beyond the built environment, it is the spiritual consequences of this shift that matter, for by its light Carpo shows all cultural history to be a gargantuan exercise in information compression.

Unlike their AIs, human beings cannot hold much information at any one time. Hence, for example, the Roman alphabet: a marvel of compression, approximating all possible vocalisations with just 26 characters. Now that we can type and distribute any glyph at the touch of a button, is it any wonder emojis are supplementing our tidy 26-letter communications?

Science itself is simply a series of computational strategies to draw the maximum inference from the smallest number of precedents. Reduce the world to rules and there is no need for those precedents. We have done this for so long and so well some of us have forgotten that “rules” aren’t “real” rules, they are just generalisations.

AIs simply gather or model as many precedents as they wish. Left to collect data according to their own strengths, they are, Carpo says, “postscientific”. They aren’t doing science we recognise: they are just thugging.

Carpo foresees the “separation of the minds of the thinkers from the tools of computation”. But in that alienation, I think, lies our reason to go on. Because humans cannot handle very much data at any one time, sorting is vital, which means we have to assign meaning. Sorting is therefore the process whereby we turn data into knowledge. Our inability to do what computers can do has a name already: consciousness.

Carpo’s succinctly argued future has us return to a tradition of orality and gesture, where these forms of communication need no reduction or compression since our tech will be able to record, notate, transmit, process and search them, making all cultural technologies developed to handle these tasks “equally unnecessary”. This will be neither advance nor regression. Evolution, remember, is maddeningly valueless.

Could we ever have evolved into Spock-like hyper-rationality? I doubt it. Carpo’s sincerity, wit and mischief show that Prospero is more the human style. Or Peter Pan, who observed: “You can have anything in life, if you will sacrifice everything else for it.”


Maths into English

One to Nine by Andrew Hodges and The Tiger that Isn’t by Michael Blastland and Andrew Dilnot
reviewed for the Telegraph, 22 September 2007

Twenty-four years have passed since Andrew Hodges published his biography of the mathematician Alan Turing. Hodges, a long-term member of the Mathematical Physics Research Group at Oxford, has spent the years since exploring the “twistor geometry” developed by Roger Penrose, writing music and dabbling with self-promotion.

Follow the link to One to Nine’s web page, and you will soon be stumbling over the furniture of Hodges’s other lives: his music, his sexuality, his ambitions for his self-published novel – the usual spillage. He must be immune to bathos, or blind to it. But why should he care what other people think? He knows full well that, once put in the right order, these base metals will be transformed.

“Writing,” says Hodges, “is the business of turning multi-dimensional facts and ideas into a one-dimensional string of symbols.”

One to Nine – ostensibly a simple snapshot of the mathematical world – is a virtuoso stream of consciousness containing everything important there is to say about numbers (and Vaughan Williams, and climate change, and the Pet Shop Boys) in just over 300 pages. It contains multitudes. It is cogent, charming and deeply personal, all at once.

“Dense” does not begin to describe it. There is extraordinary concision at work. Hodges covers colour space and colour perception in two or three pages. The exponential constant e requires four pages. These examples come from the extreme shallow end of the mathematical pool: there are depths here not everyone will fathom. But this is the point: One to Nine makes the unfathomable enticing and gives the reader tremendous motivation to explore further.

This is a consciously old-fashioned conceit. One to Nine is modelled on Constance Reid’s 1956 classic, From Zero to Infinity. Like Reid, Hodges devotes each chapter to the ideas associated with a given number. Mathematicians are quiet iconoclasts, so this is work that each generation must do for itself.

When Hodges considers his own contributions (in particular, to the mathematics underpinning physical reality), the skin tightens over the skull: “The scientific record of the past century suggests that this chapter will soon look like faded pages from Eddington,” he writes. (Towards the end of his life, Sir Arthur Eddington, who died in 1944, assayed a “theory of everything”. Experimental evidence ran counter to his work, which today generates only intermittent interest.)

But then, mathematics “does not have much to do with optimising personal profit or pleasure as commonly understood”.

The mordant register of his prose serves Hodges as well as it served Turing all those years ago. Like Turing: the Enigma, One to Nine proceeds, by subtle indirection, to express a man through his numbers.

If you think organisations, economies or nations would be more suited to mathematical description, think again. Michael Blastland and Andrew Dilnot’s The Tiger that Isn’t contains this description of the International Passenger Survey, the source of many of our immigration figures:

The ferry heaves into its journey and, equipped with their passenger vignettes, the survey team members also set off, like Attenboroughs in the undergrowth, to track down their prey, and hope they all speak English. And so the tides of people swilling about the world… are captured for the record if they travel by sea, when skulking by slot machines, half-way through a croissant, or off to the ladies’ loo.

Their point is this: in the real world, counting is back-breaking labour. Those who sieve the world for numbers – surveyors, clinicians, statisticians and the rest – are engaged in difficult work, and the authors think it nothing short of criminal the way the rest of us misinterpret, misuse or simply ignore their hard-won results. This is a very angry and very funny book.

The authors have worked together before, on the series More or Less – BBC Radio 4’s antidote to the sort of bad mathematics that mars personal decision-making, political debate, most press releases, and not a few items from the corporation’s own news schedule.

Confusion between correlation and cause, wild errors in the estimation of risk, the misuse of averages: Blastland and Dilnot round up and dispatch whole categories of woolly thinking.

They have a positive agenda. A handful of very obvious mathematical ideas – ideas they claim (with a certain insouciance) are entirely intuitive – are all we need to wield the numbers for ourselves; with them, we will be better informed, and will make more realistic decisions.

This is one of those maths books that claims to be self-help, and on the evidence presented here, we are in dire need of it. A late chapter contains the results of a general knowledge quiz given to senior civil servants in 2005.

The questions were simple enough. Among them: what share of UK income tax is paid by the top one per cent of earners? For the record, in 2005 it was 21 per cent. Our policy-makers didn’t have a clue.

As Blastland and Dilnot put it: “The deepest pitfall with numbers owes nothing to the numbers themselves and much to the slack way they are treated, with carelessness all the way to contempt.”

This jolly airport read will not change all that. But it should stir things up a bit.

Unknown Quantity: a Real and Imagined History of Algebra by John Derbyshire

reviewed for the Telegraph, 17 May 2007

In 1572, the civil engineer Rafael Bombelli published a book of algebra, which, he said, would enable a novice to master the subject. It became a classic of mathematical literature. Four centuries later, John Derbyshire has written another complete account. It is not, and does not try to be, a classic. Derbyshire’s task is harder than Bombelli’s. A lot has happened to algebra in the intervening years, and so our expectations of the author – and his expectations of his readers – cannot be quite as demanding. Nothing will be mastered by a casual reading of Unknown Quantity, but much will be glimpsed of this alien, counter-intuitive, yet extremely versatile technique.

Derbyshire is a virtuoso at simplifying mathematics; he is best known for Prime Obsession (2003), an account of the Riemann hypothesis that very nearly avoided mentioning calculus. But if Prime Obsession was written in the genre of mathematical micro-histories established by Simon Singh’s Fermat’s Last Theorem, Derbyshire’s new work is more ambitious, more rigorous and less cute.

It embraces a history as long as the written record, and its stories stand or fall to the degree that they contribute to a picture of the discipline. Gone are Prime Obsession’s optional maths chapters; in Unknown Quantity, six “maths primers” preface key events in the narrative. The reader gains a sketchy understanding of an abstract territory, then reads about its discovery. This is ugly but effective, much like the book itself, whose overall tone is reminiscent of Melvyn Bragg’s Radio 4 programme In Our Time: rushed, likeable and impossibly ambitious.

A history of mathematicians as well as mathematics, Unknown Quantity, like all books of its kind, labours under the shadow of E T Bell, whose Men of Mathematics (1937) set a high bar for readability. How can one compete with a description of 19th-century expansions of Abel’s Theorem as “a Gothic cathedral smothered in Irish lace, Italian confetti and French pastry”?

If subsequent historians are not quite left to mopping-up operations, it often reads as though they are. In Unknown Quantity, you can almost feel the author’s frustration as he works counter to his writerly instinct (he is also a novelist), applying the latest thinking to his biography of the 19th-century algebraist Évariste Galois – and draining much colour from Bell’s original.

Derbyshire makes amends, however, with a few flourishes of his own. He also places himself in his own account – a cultured, sardonic, sometimes self-deprecating researcher. This is not a chatty book, thank goodness, but it does possess a winning personality.

Sometimes, personality is all there is. The history of algebra is one of stops and starts. Derbyshire declares that for 269 years (during the 13th, 14th and early 15th centuries) little happened. Algebra is the language of abstraction, an unnatural way of thinking: “The wonder, to borrow a trope from Dr Johnson, is not that it took us so long to learn how to do this stuff; the wonder is that we can do it at all.”

The reason for algebra’s complex notation is that, in Leibniz’s phrase, it “relieves the imagination”, allowing us to handle abstract concepts by manipulating symbols. The idea that it might be applicable to things other than numbers – such as sets, and propositions in logic – dawned with tantalising slowness. By far the greater part of Derbyshire’s book tells this tale: how mathematicians learned to let go of number, and trust the terrifying fecundity of their notation.

Then, as we enter the 20th century, and algebra’s union with geometry, something odd happens: the mathematics gets harder to do but easier to imagine. Maths, of the basic sort, is a lousy subject to learn. Advanced mathematics is rich enough to sustain metaphor, so it is in some ways simpler to grasp.

Derbyshire’s parting vision of contemporary algebra – conveyed through easy visual analogies, judged by its applicability to physics, realised in glib computer graphics – is almost a let-down. The epic is over. The branches of mathematics have so interpenetrated each other, it seems unlikely that algebra, as an independent discipline, will survive.

This is not a prospect Derbyshire savours, which lends his book a mordant note. Unknown Quantity is more than an engaging history: it records an entire, perhaps endangered, way of thinking.