
I Want A Word


A weekly letter from the founding editor of The Browser. Correspondence and criticism gratefully received and always read: robert@thebrowser.com

If you are not a paying subscriber to The Browser, but enjoy this letter, please do become a paying subscriber to The Browser, because that is how I earn the money to write this letter.

This week: Things I want to read, books I have been reading, and a word that gives me goosebumps.


MY WANTS and needs are few. I am glad to have the three lowest levels of Maslow's hierarchy covered for the time being. Above that I will settle for a connected iPad running Kindle, a sofa, a spaniel, decent public transport, and world peace. All but the last are within my grasp.

But, since you ask, here are some things that I want to read more about:

Dishwashers: Everybody has their own ideas about loading the dishwasher, and most stuff doesn't fit anyway. Can't somebody tip off the manufacturers that dining plates are big, soup bowls are concave, and wine glasses have long stalks?  

The Voynich Manuscript: Why can't AI solve this in 0.1 seconds?

ASML: Given what it does, why is it the only company that does what it does?

Thomas Pynchon: Not to the point of invading his privacy, but what would it take to persuade him to publish an autobiography?

What did happen to Hu Jintao?

Where is Putin's fortune?


Here are some things that I have been reading:

Crassus, by Peter Stothard (2022).
The problem with history is that there are too many people in it, as Nick Hornby noted. The particular problem with ancient Roman history is that there are too many names in it. Not only does everybody of significance have three names, but every famous person tends to share at least two of their names with somebody else famous doing something similar at roughly the same time, since they all belong to the same families and all divide their time between governing Rome and invading other countries.

Marcus Licinius Crassus, though having many eminent relations with similar names, avoided confusion with his namesakes by getting fantastically rich, which was strangely unfashionable in those days. His cv was so distinctive that he passed into history with just the one name, Crassus. Even Julius Caesar needed two. His fortune was reckoned by Plutarch at 229 tons of gold, which today would be worth about $14 billion, whatever that might mean in purchasing power across two millennia. At worst one might confuse him with Croesus. If the name "Crassus" makes you think of "crass", so much the better. Crassus was crass.

Stothard's book is short and taut. Plain prose. Pleasure to read. Inevitably too many names, everybody arrives dragging a family tree behind them, but the pace is generally well maintained. In brief: Crassus crushes Spartacus, makes a fortune in real estate, bankrolls Julius Caesar, partners up with Pompey, worries about seeming wussy by comparison, decides to invade Parthia for no particular reason, gets killed.

The book is part of Yale's Ancient Lives series, a series which hopes to persuade us that the big questions in life “have changed very little over the course of millennia”. I finished Crassus thinking exactly the contrary.

The ancient Rome of Gibbon is so varnished over with irony and high style that it reads like fiction. The ancient Rome of Stothard reads more like a newspaper delivered two thousand years late. Everything that happens in Crassus is described in admirably straightforward terms. But I can scarcely guess at the psychology of the people involved, nor model the choices that they faced.

The richest person on Earth invades a big foreign country in person, at the head of his own army, on spec, just for the lulz, and loses? Not to mention the six thousand slaves he crucified earlier in life along the Appian Way? I suppose if you could graft stem-cells from Vladimir Putin on to stem-cells from Yevgeny Prigozhin then you might grow something that ticked most of the boxes, but I don't understand those people either.

The Company, by Robert Littell (2002).
I had been meaning to read The Company for more than a decade, ever since I was knocked sideways by Jonathan Littell's The Kindly Ones (2006), which I thought to be one of the best novels about World War Two that I had ever read — not to mention a peculiarly dangerous and disturbing novel, because The Kindly Ones gets you inside the heads of Nazis who are carrying out the Holocaust, which is absolutely not a place that any sane person wants to be for very long.

Finding that Jonathan Littell's father, Robert Littell, had written a score of spy novels (and seems even now to be still writing them), I was curious to see whether any heritable talent might have been involved. The answer, I conclude, is Yes. Robert Littell is a very fine thriller writer indeed, even if his son is the greater novelist. The Company kept me gripped for at least three-quarters of its length, from Cold-War Berlin through the Bay of Pigs and the Kennedy assassination to Iran-Contra and 1980s Afghanistan. I mean that as high praise, given that the print edition of the book runs to 894 pages, and there are only so many ways to skin a cat, even when you are CIA.

Littell would have done better, I think, to have divided The Company into two volumes, and taken a break between the two. Still, with the Kindle edition massively discounted on Amazon as I write (to $1.99/£1.99), I cannot think of a better way to spend two dollars.

The Diaries, Volume Three, 1943-57, by Henry Channon (2022).
I found the young Channon insufferable in Volume One, 1918-38. So I didn't read Volume Two, 1938-43. And then, just when I thought I was out, they pulled me back in. I was reading Hugo Vickers's book about all the fun he had while writing his biography of Cecil Beaton, and it struck me that pretty much all the interesting people in Beaton's world also drifted in and out of Channon's world (both of them having a marked taste for dukes and duchesses, film stars, cabinet ministers and debauchees in roughly that order). So I gave Channon another go.

This third volume is the one to read. Channon has grown up. He is just as arrogant and snobbish and self-centred as he was in his twenties, but the passage of time is making him less naive. He is no longer quite so enamoured of all those countesses and kings-in-exile, all those dinner parties in Belgrave Square; he still gets some pleasure from his lovers and from his seat in parliament, but he is starting to accept that he will never be a Great Man in any walk of life, starting to worry that he might not even get a peerage, and starting to wonder whether these diaries of his, these writings which he began so frivolously in his youth, might yet be his main claim to remembrance. On each of these last points he is correct. And so the personality of Channon himself starts to become interesting.

The Wrestling, by Simon Garfield (1996).
Should I even mention this? OK, but very briefly. If you grew up watching televised wrestling in Britain in the 1960s, if you saw Mick McManus and Jackie Pallo grunt in person at some Corn Exchange or Free Trade Hall, if the name Kent Walton rings any bells, then you may share my delight at this anthology of reminiscences. If not, then not.


Also, I want a word. Which may seem an odd request when the English language has a million of them freely available. But perhaps for that very reason I can't put my finger on quite the right one.

I want a word which means: "Using a word to show that one knows that particular word".  

You might say that usage of this kind overlaps with what John Searle called  "performative" utterances — statements which "change the reality of what they are describing".  Searle had in mind a tight causal relationship between the saying of a given word or phrase and the effect of that word or phrase. By saying, "I bet you ..." you make the bet. By saying, "I do", at the altar, you contract to marry.  

Now, for example, imagine that I use the word "eponymous" where in simpler times I might have said "gave her name to" or "is named after". I am using the word "eponymous" as ordinary speech. It is behaving like any other word. It is not a code or a trigger. But I am hoping for a secondary effect — that my listener will be favourably impressed to learn that I am the sort of person who knows and uses the word "eponymous".

The relationship here between word and hoped-for effect is far weaker, far less certain, than in Searle's "performatives". Using fancy words is a form of showing off, and perhaps "showing off" is all that one can say about it. But it would be nice to have some more technical-sounding term to hand, that one could employ without giving obvious offence to the person who is showing off.  

I wonder — to bore you for a moment more — if we might remember here Leonard Sachs, host of The Good Old Days, a variety show broadcast on BBC television in the 1960s. His shtick consisted largely of using elaborate Latinate words to ridiculous effect. Please do sample this one-minute audio clip of Sachs on YouTube (expect some brief audience noise, then Sachs warming up from a cold start in a loud voice).

Perhaps "Sachsian" is the word I want, to describe a pulchritudinous propagation of pedagogical polysyllables. But a free and giftable one-year Browser subscription to anybody who comes up with a mot more juste.


Not that I am obsessed with the word "eponymous" or anything, but when I die you will find it carved on my heart. A plurality of English-speaking humanity is repelled by the word "moist". For me the goosebumps-word is "eponymous" — with "eschew" a close second.

And it turned out that I didn't even know my enemy.  

A couple of weeks ago I was laying down the law on what I imagined to be the meaning of "eponymous" (to Uri, if you must know), and saying that it was doubly a pity that so many journalists had started using this horrible word since they almost always used it wrongly.

"Eponymous", I insisted, referred strictly to a person who gave their name to a thing, not to a thing which was named after a person. Thus Gordon Selfridge was the eponymous founder of Selfridge's department store in London, but Selfridge's was not the eponymous department store founded by Gordon Selfridge.

Then I troubled to check this claim in a few dictionaries and I found that I was talking through my hat. The two usages are gazetted as equally correct. Selfridge's and Gordon Selfridge can eponymise one another in complete mutuality, reciprocity, and commutability.

Bah. Was it always thus?

Fowler's Modern English Usage (1965) lists as "eponymous words" only proper names (of people) which have subsequently passed into English for describing things or practices closely associated with them — Captain Boycott, Professor Bunsen, Dr Guillotine etc.

Fowler does not specifically rule out "eponymous" to describe a thing which assumes the name of its originator; and Fowler is rarely slow to rule out things which need ruling out; so either the alternative usage did not trouble H.W. Fowler, or H.W. Fowler thought the possibility of so gross an error so improbable as to be not worth mentioning.


I shouldn't mind. I am a liberal. I am a descriptive rather than a prescriptive grammarian. We have the language that we have because enough people want it to be that way. Wisdom of crowds.

But I can't always help myself. When I see the word "eponymous" in a newspaper or magazine I generally just stop reading the piece there and then — much as I tend to stop reading when I see other, wholly inoffensive, words which nonetheless are reliably absent from writing that I enjoy. These include "best I have ever tasted", "whip-smart", "razor-sharp", "pitch-perfect", "toxic" (in a non-chemical context), "TikTok", "hegemony", "bivalve", and "paradox" (when "apparent contradiction" is meant).

I also tend not to read articles which have the word "must" or "secret" in the headline, since there is never anything that anyone must do, nor anything which can be both "secret" and the subject of a newspaper article at the same time, except possibly the location of the Amber Room.

If that makes me sound curmudgeonly, let me add that I will read almost anything, at least until I find reason to do otherwise, which contains any of a far longer list of words, among them, "Voynich" (see above), "Monty Hall", "paradox" (used correctly), "Fowler's Modern English Usage", "cathar", "bayesian", "new books", "Elena Ferrante", "Michael Hofmann", and "1970s". Yes it does occur to me to feed all those words into ChatGPT and see what comes out, but it also occurs to me that most of the pleasure lies in the anticipation, so I will put off this narrative rendezvous for as long as I can bear.

Robert


I Might Be Wrong


A weekly letter from the founding editor of The Browser. Topics may vary. Correspondence and criticism welcome: robert@thebrowser.com.

This week: Russia, Seymour Hersh, lying, George Orwell, and books I have been reading


Where to begin? Various medium-sized thoughts will succeed one another in this letter, all of which have been jostling in what I laughingly refer to as my mind over recent days.

Russia's war in Ukraine is still with us, one year on. My heart is with the people of Ukraine as they suffer through this terrible winter. My mind is wondering what sort of duty I have, at least so far as The Browser is concerned, to learn more about what the war looks like from the Russian side, how it looks to the Russian people.

Charles Crawford dealt well with this question, of trying to see the other country's point of view, in his article, Explanations Come to An End, which we recommended recently on The Browser. He argued that, in ordinary times, one could make whatever allowances one felt to be appropriate for Russian history, Russian interests, Russian beliefs and Russian illusions. But in launching a war of aggression, Russia was attacking, not only Ukraine, but also the continued existence of an orderly world; and for this there could be no justification, no mitigation.

I seek to be as complete a supporter of the Ukrainian people in this war as one can reasonably be from a safe distance (I donate through Stopify). But I can never get out of my head — not only in this context, but whenever I feel certain of anything — that remark of Bertrand Russell's: "I would never die for my beliefs, because I might be wrong".

An example of such momentary doubt: I was genuinely bewildered by Seymour Hersh's claim earlier this month that the United States had blown up the Nord Stream pipelines. I recommended Hersh's piece in my first draft of that night's Browser, saying that I thought it was a good read even if it was totally wrong. Then I deleted my recommendation, thinking that it would be far worse to spread a contagious lie than to share a guilty pleasure.

Even now I do not know what to make of Hersh's story (which is still a work in progress). It makes sense, if one is willing to swallow the far-fetched premise that America would decide to anger and alienate its European allies, to the point of fracturing Nato, in furtherance of supposed global and economic goals.

One cannot go halfway by saying only that there must be something in it. The story is so closely argued, and with so much detail, that either it is all true (save for some trivial errors of time and place) or it is a fabrication from start to finish.

I infer that somebody told Hersh the story in its entirety; and that the somebody was a person whom Hersh had good reason to trust; and/or it was somebody who had put enormous time and effort into winning Hersh's trust in order to deceive him. When and if the story of Hersh's story is ever written, I will be sure to recommend it on The Browser. I hope this is not long in coming. Does nobody even tap Hersh's phones these days?


On a slightly more abstract level, I was also musing yesterday about a piece called The First Year Of The Conflict, written by a Russian academic called Oleg Barabanov, published in the journal Russia In Global Affairs.

Barabanov is a professor at MGIMO, the most elite of all Russian educational institutions, a finishing-school for diplomats and spies. Russia In Global Affairs is meant to be a sort of Russian version of Foreign Affairs, house-trained and open-minded, welcome at Davos.

There is something going on in this article and I am not sure what it is. If I were to summarise it very crudely, saying what I think Barabanov to have been writing in invisible ink between the lines, it would go like this:

We have been lying about this war — not least by refusing even to call it a "war". This made things easier at the start. But when the war dragged on and we had to call up reservists, the official line became harder to sustain. Was this a war or not? What was really going on? The Russian people, habituated to trusting the government blindly, sensed that something was wrong. They began to express doubts and ask questions. Then, gradually, fortunately, the public grasped the truth: We were indeed in a war. Our soldiers had to be supported. Everybody must do what they can. As a result, Russians are now united in their patriotism, and willing to work for victory.

Barabanov's claim that there has been an uptick in the Russian public mood just lately sounds to me like something added in the course of a rewrite to make the piece publishable at all. But the rest is quite daring in its way: It does not say that the Kremlin was wrong to lie about the war, but it does question whether such lying was productive.

Not, of course, that the word "lying" is ever mentioned. The English text (I can find no Russian original) refers to "semantic euphemisms" and "Aesopian language". Here is Barabanov tip-toeing around usage of the word "war" itself:

The term “war”, as we all know, is being avoided in official Russian discourse. To a certain extent, how willingly this or that Russian uses this phrase “special military operation” in his speech can serve as a marker of his attitude to what is happening. Opponents of the military solution rarely use this official wording. On the contrary, people who have organically included this phrase in their vocabulary, as a rule, are supporters of the operation itself. But, in our opinion, the situation is more complicated with supporters. As the experience of communication over the past year has shown, a fairly significant number of people who support the actions taken by the Russian leadership, at the same time prefer, from their point of view, to call a spade a spade.

From which I take it that he would rather call the war a war and have done with it.

Professor Barabanov does not sound like a warmonger. He holds the Jean Monnet Chair in European Relations at MGIMO, a chair which has been partly funded in past years by the European Commission. If we were to meet him when the war is over, I imagine he would say that he was against the invasion, that he expressed what reservations he could, but that in time of war one cannot always say and do just what one wishes, one has a duty of solidarity with one's country, right or wrong. I wonder what I would be doing now if I were in his shoes.


Plans for a proposed debate between Henry Oliver and me about rules for writers, and particularly George Orwell's rules for writers, are advancing well. Henry and I are in discussion with an admirable institution in central London which has expressed interest in hosting the event. Henry and I are both champing at the bit. If all goes according to plan, we will have enough tickets available to satisfy all those who have already written saying that they would like to attend, subject to the final decision on time and place (we hope for London in late April).

If you have written, you are on the list. But we are now approaching potential capacity. So if you think you might like to hear me explaining why Orwell's rules are a decent stab at fool-proofing English, and Henry explaining why Orwell's rules are a slippery slope towards Newspeak, do drop me a line while there is still time: robert@thebrowser.com.  


It strikes me now that Barabanov's argument, and my argument with Henry, are not unrelated (a construction which Orwell hated, by the way, saying, “it should be possible to laugh the not un- formation out of existence”). Henry and I are both interested in ways in which language can reveal and conceal truths. Henry might argue that life is complex, and so language must be complex to capture life's truths. I might argue that truth is inherently simple, so complex language is only ever needed to obscure truth, not to express it. As for Barabanov, he seems to be saying that the truth will come out in the end, so you might as well admit it in the first place.

I am tempted at this point to write about the place of lying in Russian culture and history. I have never known a society in which lying has been so generally accepted — even expected — in all but the most trivial interactions. At the risk of exaggeration I would say that, during the Stalinist terror of 1937-38, every single word in the public domain was a lie, from the articles in Pravda about the course of Soviet life to the signs above the shops claiming that milk and beef were available for purchase.

But this would risk underplaying the role of lying at other times and in other places. In Plato's Republic, Socrates makes a strong argument that the mechanics by which the ruling elite of the Republic are selected and perpetuated should be concealed from the public at large, for the public's own good.

Utilitarians (with whom I am generally in sympathy) argue that lying may be justified if it produces a better outcome than telling the truth. Henry Sidgwick says as much in his Methods Of Ethics, and goes on to argue that rulers may justifiably seek to keep their programmes a secret from those whom they rule, whether by remaining silent or by claiming to be doing something quite different, on the grounds that if the public knew what was going on they would only mess it up:

The opinion that secrecy may render an action right which would not otherwise be so should itself be kept comparatively secret; and similarly it seems expedient that the doctrine that esoteric morality is expedient should itself be kept esoteric. Or if this concealment be difficult to maintain, it may be desirable that Common Sense should repudiate the doctrines which it is expedient to confine to an enlightened few. And thus a Utilitarian may reasonably desire, on Utilitarian principles, that some of his conclusions should be rejected by mankind generally; or even that the vulgar should keep aloof from his system as a whole, in so far as the inevitable indefiniteness and complexity of its calculations render it likely to lead to bad results in their hands.

But this way madness lies, does it not? Segments of society cannot conduct themselves like double- and triple-secret agents, lying about lying and expecting lies about lies in return. No doubt Sidgwick meant well, but he loses himself in his abstracted and impersonal logic, which is made all the foggier by the hesitancy of his prose — "comparatively secret", "seems expedient", "it may be desirable", "mankind generally", "in so far as", "render it likely".

Orwell's idea of good writing was that, “When you make a stupid remark its stupidity will be obvious, even to yourself”. I believe Sidgwick makes Orwell's case.


I have been thinking a lot about indexes lately, and had intended to write about them in this letter, but I have already gone on too long, so they must wait until next week. Let me conclude instead with notes on a few books that I have been reading:

The Visit Of The Royal Physician, by Per Olov Enquist. Historical fiction on a par with Hilary Mantel's Thomas Cromwell trilogy. The story is set in the Danish royal court of the late-18th century. I knew nothing of the underlying history before reading, and felt at no disadvantage. I learned much, and pleasurably so. In brief: What happens when a physician, filled with the ideals of the Enlightenment, becomes the closest friend and counsellor of a weak-minded yet absolute monarch, and lover of the monarch's wife? Now read on.  

The Disappearance Of Josef Mengele, by Olivier Guez. I have yet to finish this, but I will be most surprised if it lets me down. Any book published by the doggedly left-wing Verso, and yet favourably reviewed by the Wall Street Journal, is worth a look for that coincidence alone. There are a couple of flashbacks to Auschwitz, but this is mostly about Mengele making his next life in South America. I would say that Guez wrote the book in a bid to show how the terrible possibilities of Nazism could act upon an otherwise mediocre man like Mengele. In spirit, therefore, not far from Arendt's Banality Of Evil. But with more plot and less moralising.    

A History Of Water, by Edward Wilson-Lee. A couple of weeks in Lisbon last month has left me fascinated by everything Portuguese. Portugal seems to be an extreme case of a country which produced more history than could be consumed locally. There is any amount of it left over, just lying around the place and crying out to be turned into novels like this one. The story is set in the 16th century when Portugal was peaking as a world power. Its heroes are a diplomat and a poet. The Guardian loved it. Me too.  

Valuable Humans in Transit, by QNTM. Science fiction. Blurbed by Charlie Stross as "a refreshing dose of existential despair". A collection of short stories, the first of which, Lena, is alone worth the price of admission. Suppose the futurists are correct, and we eventually work out how to create digital replicas of our minds which can go on functioning as minds up there in the cloud. What if somebody copies the copy, and the copies go viral, and soon there are millions of versions of your mind in circulation, for people to do with as they please, rather like Henrietta Lacks's cancer cells?


And, finally, a book to which I have been listening. At last there is some Isaiah Berlin on Audible — The Hedgehog And The Fox, read by Peter Kenny. A fine start. Kenny's reading is pleasing if a touch sprightly: I would have preferred Jonathan Keeble as narrator, or, better still, Berlin's editor, Henry Hardy. But that is just me, and if Princeton University Press favours Kenny, then I defer to their judgment, and I am delighted that they are doing this at all. There are 18 volumes of Berlin's writings in print, and four volumes of letters, so I trust that more Audible editions will follow.
Robert


Write A Book, Run A Country


A weekly letter from the founding editor of The Browser. Topics may vary. Correspondence and criticism welcome: robert@thebrowser.com.

This week: Novels by presidents and prime ministers; preceded by a note about a possible live event, and a grateful salute to recent correspondents.


I have had a serious argument with the writer Henry Oliver. It was a pre-arranged argument, I hasten to say. We wanted to be sure that we did indeed disagree. And we do.

Henry thinks that George Orwell's rules for writers ("Never use a long word when a short one will do", etc) are just plain bad: Bad for writers, bad for readers, bad for society in general. I think Orwell's rules are generally good, generally useful, and certainly the best such rules of which I am aware.

Here is Henry on the subject, previously. And here is me. Henry and I both received lively comments on our respective articles, enough to suggest that the topic was one of broader interest, and we are wondering now whether we might attempt to deepen our differences by means of an in-person argument before an audience of friends.

Practical considerations would require that we do this, at least initially, in London, though we would post a video recording. Our apologies to friends who might like to attend but would be prevented from doing so by distance. As a first step, may I ask whether such an event would hold any attractions for readers of this letter?

I imagine that Henry and I would want to organise the evening as an argument around the title "Orwell was wrong about writing" or some such; it would happen somewhere in central-ish London; and it would last for about 90 minutes, with drinks before and after.

If you think that you might like to attend such an event, perhaps in late April, would you be so kind as to drop me a line? If you happen to have any experience of organising or hosting such events, I would be doubly pleased to hear from you. Henry and I would happily place ourselves in hardier hands than our own. I am robert@thebrowser.com.  


My thanks for your generous comments on my recent letter about doctors in literature. My particular thanks for alerting me to two errors in the piece (and if those were the only errors then I got off lightly):

— I mis-spelt Iain McGilchrist's name (giving him an "Ian" at first reference). Let me atone by recommending at least an exploratory foray into The Matter With Things (2021), McGilchrist's two-volume magnum opus developing the argument begun in The Master And His Emissary (2009) that much of what we think about the world, and of how we think about the world, is the result of our having a divided brain.

— I listed Socrates as a writer, when, as far as history records, Socrates never wrote anything at all and may even have nursed an active hostility towards the written word. Thus: Socrates was a writer who did not write.

As for omissions, I learn that there is much more to be said for Richard Gordon than I had previously imagined. Behind the farcical fictions there was a highly accomplished doctor who wrote several technical books, served a year as a ship's surgeon, and was deputy editor of the British Medical Journal.

I should have found room to mention The House of God, that deliriously funny novel by "Samuel Shem", who was in civilian life the psychiatrist Stephen Bergman. And I cannot think what transient dysfunction caused me to neglect the writing of Atul Gawande, Abraham Verghese, Siddhartha Mukherjee, Jerome Groopman, Perri Klass and Rebecca Skloot.

For these and other points, to which I hope to return, I am indebted to Ramesh V, Roman J, Iain B, Reed H, Loren W, David B, Galen S, Judyth R, Matthew N, Mike D, Jamie P, Meena A, Charles T and Anne P.


AFTER THOSE recent ruminations about doctors as authors and doctors in fiction, my long-time friend and correspondent Reed Hundt wrote to ask: What about presidents and prime ministers? Has anybody ever run a country and written a decent novel?

My thoughts turned to Winston Churchill, the first world leader since Julius Caesar to have written books for the ages. Was there a novel buried somewhere among Churchill's histories and memoirs? Indeed there was. It was the first book Churchill wrote, at the age of 24. The trouble is, it is not much good.

The novel in question, Savrola: A Tale of the Revolution in Laurania (1898), is an action thriller set in a fictional European state vaguely resembling Spain. The plot turns on the overthrow of a dictator. In later life Churchill said of Savrola: "I have consistently urged my friends to abstain from reading it."

Response to Savrola was mixed. Some critics claimed to quite like it. One newspaper bought the serial rights. On the strength of that reception lesser authors might have tried their hand at a second novel. But Churchill, with a clearer view of his own relative strengths, abandoned fiction in favour of the fighting of wars, the running of governments, and the writing of history.

Having looked in briefly on Savrola, I am inclined to follow Churchill's advice, and abstain. In case you might feel differently, here are some sample lines.

Savrola begins with a crowd of protestors gathering in front of the Lauranian dictator's palace:

Wild passions surged across the throng, as squalls sweep across a stormy sea.

The army fires on the crowd. Many are killed. Others flee:

The President remained unmoved. Erect and unflinching he gazed on the tumult as men gaze at a race about which they have not betted.

Was it a dark and stormy night? No, oddly enough:

There had been a heavy shower of rain, but the sun was already shining through the breaks in the clouds and throwing swiftly changing shadows on the streets, the houses, and the gardens of the city of Laurania.

I had always been in two minds about the Oxford comma. There again, I had never previously seen the Oxford comma deployed to quite the bewildering effect that Churchill achieves in those final clauses. I am now cured of the Oxford comma.  


I suspect, although I have no evidence for this, that another factor in Churchill's decision to abandon fiction so abruptly may have been his discovery, soon after completing Savrola, that his place at table had been taken. The literary world was already lionising an American novelist called Winston Churchill whose commercial success the British Churchill could scarcely hope to rival.

While British Winston's Savrola was finding modest commercial success in England, American Winston's Richard Carvel (1899), a society novel set partly in London, was selling two million copies in the United States. It was the Da Vinci Code of its day.  

It fell to the lesser (British) Winston to write to the greater (American) Winston remarking on the possibilities for confusion and wondering how they might keep out of one another's way. It was not going to be easy. They were of comparable age; both had served with some distinction in the military; both were keen painters; and both, it turned out, had political ambitions.

On the naming question they agreed to differ. American Winston would continue to write as "Winston Churchill". British Winston would write as "Winston Spencer Churchill", which he later shortened to "Winston S. Churchill".

The Winstons even met, in 1900, when British Winston, on a speaking tour of the United States, went to collect his poste restante mail in Boston, only to find that the mail had been delivered to American Winston, who lived at 181 Beacon Street.

According to the Boston Globe, American Winston visited British Winston at the latter's hotel for what was described as “an odd meeting”. Conversation cannot have been helped by British Winston's opening line: "How came you by that name?” To which American Winston replied: “It seems that there have been Winston Churchills over here for a good many years”. (I score that point as a win for American Winston.)

British Winston, then 26, was in a buoyant mood. “I mean to be Prime Minister of England", he told American Winston, "it would be a great lark if you were President of the United States at the same time”.

History does not record American Winston's reply. I imagine it was a polite demurral. He already had a perfectly serious political career mapped out in his mind. It was no "great lark".

His immediate target was the New Hampshire House of Representatives, where he served from 1903 to 1905. In 1906 he entered the Republican primary for Governor of New Hampshire, and made no secret of his higher ambitions.

“Watch the Winstons”, wrote the New York Times from London on 9th August 1906:

Here is Lord Randolph Churchill’s son, at one and thirty the most striking and picturesque figure in the Liberal Party — a potential Premier. Across the water is his cognominal double, a man of 34, aiming at a post which is a step upon the way of a man whose goal is the Presidency.

I imagine a counter-factual novel by Robert Harris in which American Winston reaches the White House while British Winston languishes on the back benches. But in reality this was the point at which the scissors of history crossed.

American Winston lost the Republican primary of 1906. He ran for Governor of New Hampshire as the candidate of the Progressive Party in 1912, and lost again. He gave up writing fiction, retired from public life, consolidated his modest reputation as a painter, immersed himself in theology, and died in 1947. As to the course of British Winston's life, I shall not bore you here.


Thomas Jefferson re-edited the Bible. Ulysses Grant wrote the first presidential memoir. Theodore Roosevelt published an account of an African safari in the course of which he and his party killed 11,400 animals for the Smithsonian museum collections. But it was not until 2003 that an American president published a novel. This was Jimmy Carter's lone work of fiction, The Hornet's Nest: A Novel Of The Revolutionary War.

The Hornet's Nest was not a bad book. It was better than Savrola. It was better than the worst of its reviewers claimed. It sold decently. If you read it, you came away better informed about the Revolutionary War, which Carter had studied diligently.

It was not a work of genius; and all first novelists, without exception, hope to be acclaimed for their genius. But I doubt Jimmy Carter pitched his hopes too high. He was almost 80 when the book was published, and no novelist of genius, as Carter must have been aware, had ever revealed themselves so late in life. I hope on balance he was happy with the warm and respectful manner in which his book was received.

There have been two more presidential novels since The Hornet's Nest, both by Bill Clinton "with" James Patterson.

In The President Is Missing (2018), President Jonathan Lincoln Duncan sets out alone from the White House to foil a cyber-attack against America while evading a variety of assassins. In The President's Daughter (2021), ex-president Matthew Keating assembles a posse of pals to rescue his daughter from the clutches of revenge-seeking Middle Eastern terrorists.

Both books were best-sellers. Both received admiring reviews. Both were good books of their kind. I mean no disrespect in saying that nobody will be reading them a hundred years from now, which debars them from greatness.    


Greatness hedged Vaclav Havel, the late president of the former Czechoslovakia, but while Havel wrote many acclaimed plays, he wrote no novels. President Léopold Senghor of Senegal was elected to the Académie Française, but as a poet. Mario Vargas Llosa won a Nobel Prize for his novels, but lost to Alberto Fujimori when he ran for president of Peru.

As far as I can see, that leaves only one person whom we might possibly regard as a gold-medal winner in both politics and fiction. I phrase the suggestion hesitantly because the person I have in mind is Benjamin Disraeli, and for myself I cannot quite learn to love Disraeli's novels. A page or two I can manage, five hundred pages I cannot.

But here I will defer to Robert McCrum, a writer and publisher whose judgment is far superior to my own, and who ranks Disraeli's Sybil as the eleventh-best novel ever written in the English language.  Here is the nub of McCrum's argument:

Disraeli's plots are far-fetched, and his characters balsa-wood. Yet Disraeli has flashes of brilliance that equal Dickens and Thackeray at their best. With his polemical fiction of 1844-47 (Coningsby, Sybil and Tancred), he more or less invented the English political novel. From this trilogy, Sybil, or the Two Nations, stands out as perhaps the most important Victorian condition-of-England novel of its time. Without Disraeli, Charles Dickens might not have written Hard Times. We are approaching the summit of the mid-Victorian novel.

From what McCrum says here, it sounds to me as though Disraeli at least had the right stuff. He might have made a great novelist if he had only put his back into it, if he had made writing his life's work and his life's ambition rather than dashing books off in his spare time when he needed money.

Viewed with a century or two of hindsight, writing might even have been the better choice. We may speak now of Disraeli as having lived in the age of Dickens; we would never speak of Dickens as having lived in the age of Disraeli.

There again, I doubt Disraeli was overly concerned with posterity. He lived for the here and now. He wanted, more than anything else, to be on top of everything and everyone. This is not generally the condition of even the most triumphantly successful writer, at least within their own lifetime.

And what a life Disraeli had! By the time he was 21 he had made and lost a fortune on the stock exchange. By the time he was 30 he had started a newspaper, written eight novels, had a nervous breakdown, done a Grand Tour of Europe, and run twice for parliament as a Radical. He then switched parties, entered parliament as a Conservative, married money, paid off his enormous debts, wrote more novels, became chancellor of the exchequer, and the rest is, quite literally, history.

Throughout it all Disraeli had to contend with the casual anti-semitism of the British public, and the systematic anti-semitism of the English upper classes who dominated political life. He did so by fabricating an elaborate pedigree for his family, dating back centuries, to equal anything which any English aristocrat could claim. He aligned himself as far as he could in his views and his behaviour with the aristocracy itself. He outlasted his tormentors. Had I been around in the day (and if I had had the vote) I imagine that I would have voted for Gladstone. But I would have greatly enjoyed watching Disraeli.  

I rely for this view of Disraeli largely on Adam Kirsch's Benjamin Disraeli, in the Jewish Encounters series, which I recommend as a highly readable and not-too-long account of Disraeli's life with a particular focus on his Jewish identity. Here are a few key lines:

Disraeli had to turn his Jewishness from a handicap into a mystique. He had to convince the world, and himself, that the Jews were a noble race, with a glorious past and a great future. He even had to turn anti-Semitic myths to his own account — to make people believe that, if he was a wizard and a conjuror, he would at least use his powers for England.

Adam Kirsch is a marvellous writer, is he not? He resolves one of the most bewildering political conundrums of the 19th century — how Disraeli rose to govern a country which treated him explicitly as an outsider — in an entirely satisfactory and even rather uplifting manner, all within sixty or so words.

All Adam Kirsch's writing is that good, by the way. It is not just the economy of his argument; it is the boldness and vivacity with which he expresses it. When I read Kirsch, he has my full attention. Perhaps more to the point, I feel that I have his full attention. Writing is what he cares about most.

I wonder, on reflection, if this factor of caring accounts for the shortage of good-to-great novels by presidents and prime ministers. When world leaders write, their hearts are not in it; their minds are elsewhere; they have seen too much. To a great novelist, at least when he or she is writing, the writing must be all that matters. The problems of life can be as nothing compared to the problems of fiction. I doubt that anybody who has had a country to run could possibly feel that way, and, I have to concede, they would be right. — Robert


Altruistic Violence


A weekly letter from the founding editor of The Browser. Topics may vary. Correspondence and criticism welcome: robert@thebrowser.com

This week: Doctors as writers, doctors in literature. Preceded by a personal note.


My recent letter about George Orwell and David Bentley Hart, Rules For Writers, provoked generous comment for which I am much in your debt. Most came as private email; an exception was Henry Oliver's admirable public rejoinder at The Common Reader. I hope to return to this general subject, whether there can be rules of any general value to writers, and, if so, what those rules might be, with a view to arguing in favour of soft rules intended to help writers express themselves clearly, and against hard rules intended to constrain individuality and experimentation.

I do read and value all the thoughts and comments which friends and subscribers are kind enough to send me, about this letter and about The Browser in general. I think between us we must have the gentlest and most cultivated community in the writing and reading world.

Your emails touch me so deeply that I feel all the more keenly my own shortcomings as a correspondent. Often I find myself replying to emails weeks and months in arrears, and sometimes not at all. It is not, I promise you, because I do not want to reply. It is because I want to reply well, which I never feel that I have the time and focus to do at the given moment, with the result that I delay, and, well, you know what comes of that.

Allow me, as a temporary measure, to thank, salute, honour and praise my friends and recent correspondents Matthew H, David N, Charles C, Barbara E, Clayton M, Robert De V, Martin W, Irwin R, Alison T, David Y, Timothy W, Ben W, Steven Mac, Steven M, Gemma B, Charles A, Jim B, David H, Stephen S, Benjamin C, Nick F, Provi, Hannah K, Jim W, Reed H, Kyle B, Daniel S, David Y, Will H, Ruth Mac, John A, Peter T, Thierry M, Leslie S, Michael D, Jeffrey R, Antonio G, John, Simon S, Tony D, Tony C, Frank R, Roman J, Stephen F, David P, Rodney D, Peter M, Galen S, Christopher S, Jsh, Steve C, Warren F, Steve P, Donald N, Fred R, Peter L, Lois P, Tobias S, Jonathan H, Jeremy C, Brian, Shardul C, Richard F, Henry F, Antony D, klgraham, Ted O, John A, Ellen W, Lewis L, Sarah P, Jay S, Judyth R, Michael H, Jeanette C, Willis R, Dennis T, Charles A, James C.

To each of you I owe a letter; forgive me my fecklessness; and above all, though it is a great deal to ask in the circumstances, please do continue to write. A particular thanks to Barbara Epstein and Irwin Rosenthal, whose emails have been a constant source of inspiration and information to me, and without whom my reading and writing would have been much the poorer.

On now to this week's letter, about doctors and books, sparked by my long-standing admiration for the brain surgeon Henry Marsh. We have been recommending Marsh's writing on The Browser since 2012, when he made his literary debut with an essay in Granta about the pineal gland, accompanied by a brief and touching interview.

In 2015 we highlighted an extraordinary piece of writing about Henry Marsh, by Karl Ove Knausgård, who watched Marsh operate on a patient in Albania. The piece has since disappeared from its original home online at the London Telegraph, but can also be found in the New York Times magazine of January 2016.

Marsh said last week that his cancer, mentioned below, is currently in remission. This has allowed him to do more charitable work in Ukraine, and to complete his book about dying, And Finally. His character is caught well in a recent photograph by Patrick Sherlock.


IN HIS forty-year career as a brain surgeon Henry Marsh never once sought to have his own brain scanned, noting that the surprises which came from doing so were invariably on the downside. He was in semi-retirement by the time he finally agreed to have a scan, and then only because it was required of him as a volunteer in a clinical trial:

I had blithely assumed that the scan would show that I was one of the small number of older people whose brains show little sign of aging. I can now see that although I had retired, I was still thinking like a doctor — that diseases only happened to patients, and not to doctors, that I was still quite clever and had a good memory with perfect balance and coordination.

The scans told a different story. He had been right in the first place:

I was looking at aging in action. My seventy-year-old brain was shrunken and withered, a worn and sad version of what it once must have been. There were ominous white spots known in the trade as white-matter hyperintensities. My brain was starting to rot. I am starting to rot.    

Reading this passage, I was looking forward to Marsh's reflections on the cognitive aspects of aging. I had reckoned without the next twist in his tale. His worries about dementia were overtaken by the news that he also had prostate cancer: "I had a PSA of 127, I couldn’t really believe it. Frantic googling told me that most men with a PSA of over 100 will be dead within a few years".

Marsh, who still walks among us, is an unusually powerful writer even by the standards of his profession, partly thanks to his subject matter. Nobody does brain surgery better, at least on the printed page. He tells us about life-and-death operations. He tells us what he thinks while performing such operations. When I am reading Marsh I think: This is how it feels to be a surgeon.

It is clear, too, that Marsh enjoys his work, which is an odd thing to say when a critical part of that work involves cutting holes in people's heads. "I found its controlled and altruistic violence deeply appealing", he says of surgery in his memoir, Do No Harm.

If that line sounds almost psychopathic, the reality could scarcely be more different. Marsh is virtue incarnate. Even so, there is something unnerving going on here. Obviously, you cannot be a good surgeon if you recoil in shock from what you see on the operating table, as most people would. "Altruistic violence" is a concept worth investigating, and perhaps a skill worth cultivating.

Nor can you be a model surgeon if you develop (and express) strong sympathies towards some patients and aversions towards others. Your counterparty is the disease, not the patient. As Ernst Jünger said in a suitably military formulation: "The ill person is the tactical object of medicine, the illness is the strategic object”.


I hesitate to say that all physicians make good storytellers, but the correlation is certainly impressive, and holds up well against other philosophically-inclined professions.

True, soldiering and writing have always got on well together: Think of Xenophon, Herodotus, Descartes, Chateaubriand, Siegfried Sassoon, T.E. Lawrence, Winston Churchill, Erich Remarque and Ernst Jünger.

Lawyering has given to our libraries Cicero, Francis Bacon, Henry Fielding, Walter Scott, J.W. Goethe, Charles Perrault, Franz Kafka and Wallace Stevens.

Even the advertising industry has done its bit for literary fiction, nurturing at various points in the 20th century F. Scott Fitzgerald, Peter Carey, Don DeLillo, Salman Rushdie, Fay Weldon and Dorothy Sayers.

Other writers, any number of them, have doubled as farmers, teachers, civil servants, scientists, clergy, politicians, criminals, athletes, parents, invalids and retirees, each carrying something from this other life into their writing.

But doctors, even so, still seem to me to be in a class of their own when they commit their ideas to paper. As perhaps they should. They possess skills verging on the miraculous; they work on the frontiers of life and death; they know at first-hand more than any philosopher or priest does about relations between mind and body. If anybody can tell us the truth about the human condition, it ought to be doctors.

Any self-respecting bookshelf of doctors-as-writers would be a very long one indeed, extending from Luke the Evangelist to Rivka Galchen by way of François Rabelais, Thomas Browne, John Keats, Friedrich Schiller, Oliver Wendell Holmes, Anton Chekhov, Arthur Conan Doyle, Somerset Maugham, Mikhail Bulgakov, Sigmund Freud, Robert Bridges, William Carlos Williams, Jonathan Miller, Oliver Sacks, Adam Phillips and Iain McGilchrist.

Such a bookshelf might also find room for Dante, who was a member of the Physicians' and Apothecaries' Guild of Florence (which was also the guild for paper-makers); James Joyce, who failed three times to get into medical school; Gertrude Stein, who dropped out of a medical degree at Johns Hopkins pleading boredom; and perhaps even Marcel Proust, modernity's greatest authority on insomnia and hypochondria.

In Scattered Limbs, his highly recommendable anthology of reflections on medicine, Iain Bamforth, a doctor himself, compares doctors to critics:

If what critics do is analytical, what doctors do is often anatomical: they cut a long story short, parse it in ingenious, abrupt and sometimes violent ways in order to reduce it to its bare plot lines: the solid vertebrae that hold up the fleshy superstructure.

This makes doctors sound more like editors, I think, which is a pleasingly suggestive analogy. There are certainly times when one's body needs a good edit.

As for a more direct analogy with novelists, I would rather say that doctors actually do what most novelists merely want to do: They see life at its barest and most vulnerable, in its most private and extreme states, and they act forcefully, sometimes painfully, upon it.

This necessary rudeness of technique, and the uncertainty as to outcomes, tended to make doctoring an unpopular business prior to the rise of scientific medicine in the 20th century. When Sir Thomas Browne published Religio Medici (The Religion Of A Doctor) in 1643, he began it with a disclaimer asking readers to forgive what he called “the general scandal of my profession".  


Doctors are everywhere as characters in literature, from Charles Bovary to Yuri Zhivago. But what they actually do in literature when closeted with their patients has tended to remain something of a mystery. Depictions of medical practice are constrained by the prudishness of the day in matters of bodies and bodily functions. As a general rule, if the selling-point of a story is its physiological detail, then it probably belongs in a genre other than literary fiction.

When Gustave Flaubert sought to include a few glimpses of general practice in Madame Bovary, critics professed disgust: “We are in a dissection room, and we have just read an autopsy report”, said one. But Flaubert did succeed in moving the dial towards anatomical realism; as did the novels of D.H. Lawrence and James Joyce.

Joyce in particular delighted in bodily functions. He may well have been the first novelist to give prime time to the kidney since Tobias Smollett (a ship's surgeon) in the mid-18th century.

Joyce took the novel outside my own personal comfort zone by reintroducing defecation to the printed page in Ulysses (1922). Please do not correct me if I am wrong, but I doubt that defecation had previously been addressed so directly in literary fiction since Don Quixote (1605); and it has not been much addressed since, to my knowledge, save by Philip Roth and Anthony Burgess. Tolstoy's War And Peace tracks the behaviour of 500 characters across three thousand days without a single lavatory scene.  

This is not meant as a complaint. Far from it. I am sure I would skip any prolonged and graphic lavatory scenes if any novelist did think fit to include them — much as I skipped the dozen-or-so pages of Ian McEwan's The Innocent (1990) which described in gruesome detail the dismembering of a body such that it could be fitted into two suitcases.

Still, it is an expression of puzzlement. The lavatory can be a place of solitude, of deep reflection, of reacquaintance with one's animal self. I will leave to a stronger and wiser stomach than my own a more thorough investigation of why this element of human behaviour has been so neglected in literature, and, indeed, in the generality of public life.


Ever since the mass-production of penicillin in the 1940s, doctors have had big science on their side. Their status has risen in life and in art. But before then, as John Salinsky has noted, doctors depicted in literary fiction were a very mixed bunch:

There is Emily Brontë’s Dr Kenneth, who does his best with some very difficult patients in Wuthering Heights. Dr Slop the man-midwife makes a botched delivery of the infant Tristram Shandy. We spend an evening in the Dublin Rotunda Hospital in the company of Leopold Bloom and a crowd of drunken medical students. We study the career of young Dr Lydgate of Middlemarch in some detail and emerge shaking our heads sadly.

Of course, George Eliot's Doctor Lydgate is a fictional character, as is his near-contemporary, Anthony Trollope's Doctor Thorne. But if we take those two as being plausible representations of the doctors of their times, and since both are represented as sincerely wanting to do good, I find myself wondering what such doctors thought they were doing when they met with their patients.

For example, when Trollope's Dr Thorne attends Lady Arabella Gresham for what Trollope simply calls "cancer", are we supposed to think that Dr Thorne understands much about the given illness, or that he thinks himself to have any means of curing or arresting it? What does he tell Lady Gresham? What does he tell himself?

When Tertius Lydgate attends Casaubon in George Eliot's Middlemarch after what seems to have been Casaubon's first stroke, on what grounds and with what degree of confidence does Lydgate tell Dorothea that her husband might die the next day or live another 15 years? Of what use is such information supposed to be?

Doctors such as Lydgate and Thorne, practising in the first and second thirds of the nineteenth century respectively, would certainly have had some sense of scientific method, which was then advancing rapidly in the natural sciences. They must have known that their "art" of medicine had, by comparison, little or no scientific basis. They had no reliable cures for any serious diseases.

And yet they persisted; which provokes me into wondering, since the wisdom of great literature is timeless, whether things are so very different nowadays.

We know now that many diseases are carried by microscopic organisms which we call germs, that such diseases can be cured with drugs, that failing organs can be replaced, and that cancers can be eradicated or immobilised.

But while we have learned to prolong the average human lifespan in ways at which history will marvel, this prolongation has been largely the result, not of new technologies, but of widespread drastic reductions in infant mortality achieved through means of no intrinsic novelty whatsoever — encouraging doctors to wash their hands, and enabling parents to put more food on the table.

By reducing infant mortality we prolong youth, and youth is the most wonderful of things. But prolonging old age, the explicit focus of much current medical research, strikes me as less of a deliverance. My impression is that life adjusts to later prolongation in the manner prescribed by Parkinson's Law: Work expands to fill time available.

I marvel at modern medicine. I have benefited from it myself, and greatly so. As Iain Bamforth says, there are no technophobes on the operating table. But in my more philosophical moments, I think all that I can reasonably ask of any doctor (indeed, of anybody at all) is that I should not die unnecessarily. Perhaps here I begin to appreciate better the role of a Doctor Lydgate or a Doctor Thorne. If they did not cure, they did at least care; they were the best of their kind. Doubtless I would have felt just as much moral and psychological reassurance in their hands two centuries ago as I feel in the hands of any doctor today. — Robert


Rules For Writers


A weekly letter from the founding editor of The Browser. Topics may vary. Correspondence and criticism welcome: robert@thebrowser.com

This Week: The Persistence Of George Orwell


There are no good universal rules for writers. And yet, in moments of weakness, many writers feel an urge to set down "rules" for other writers in which they describe the habits which have served them well in their own years of craft.  

Living writers who stumble down this rabbit-hole soon bump into the ghost of George Orwell. Orwell's "six rules" for writers, included in a 1946 essay called Politics And The English Language, have become a mandatory part of every style guide, and I have yet to find a majority against them. You are doubtless familiar with them, but let me reproduce them here for ease of reference. They read, in full:

i. Never use a metaphor, simile or other figure of speech which you are used to seeing in print.
ii. Never use a long word where a short one will do.
iii. If it is possible to cut a word out, always cut it out.
iv. Never use the passive where you can use the active.
v. Never use a foreign phrase, a scientific word or a jargon word if you can think of an everyday English equivalent.
vi. Break any of these rules sooner than say anything outright barbarous.

I found myself revisiting Orwell this week after learning, by way of my Browser colleague, Caroline Crampton, that David Bentley Hart, a noted American writer on religious matters, had recently experienced his own moment of weakness, and had set down his own "rules for writers", in opposition to those of Orwell.

For a long time I had been allowing Orwell to fall in my estimation. He was too wrong about too many things. But reading Hart's critique of Orwell has obliged me to rediscover Orwell's merits.

Hart begins his "rules" with a conventional disclaimer:

To propose a list of rules for writers is probably a very presumptuous thing to do. The only authority it can possibly have is one’s own example, and so offering it to the world is something of a gamble. One has to assume that one’s own writing is impressive enough to most readers to provide one with the necessary credentials for the task.

Hart digs himself deeper into this hole by trying next to explain why Orwell's rules just will not do:

George Orwell was a perfectly competent (if rather boring) stylist; and yet his celebrated essay Politics And The English Language, which was intended as a rebuke of obscurantist jargon, endures now mostly as a manifesto of literary provincialism.

Cheap shots aside, I do not follow Hart's argument here. If he is saying of Orwell, "Boring stylist wrote tedious essay", this scarcely seems a point worth making.  

When I look again now at Orwell's rules in the light of Hart's critique, I think of H.L. Mencken's axiom: “There is a solution to every problem: simple, quick, and wrong”.

Orwell's rules are simple and quick. They are not absolutely wrong in themselves, but they are grossly inadequate.

Hart's rules are not simple and quick. They are complicated and time-consuming. They are also so impractical that their rightness or wrongness scarcely matters. They are far worse.

Here are a very few lines from Hart (his essay runs to 6,000 words):

Always use the word that most exactly means what you wish to say, in utter indifference to how common or familiar that word happens to be.
The exotic is usually more delightful than the familiar. Be kind to your readers and give them exotic things when you can. In general, life is rather boring, and a writer should try to mitigate that boredom rather than contribute to it.
Never squander an opportunity for verbal cleverness
Orwell decrees: “If it is possible to cut a word out, always cut it out.” No great writer in the history of any tongue has ever observed this rule, and no aspiring writer should follow it. The correct counsel would be “If a word is so excessive as to mar the effect of a sentence, remove it; but never remove a word simply because it is possible to do so”.

There is something to be said for each of Hart's claims, but not much. Re-reading that last citation from Orwell, by way of Hart ...

If it is possible to cut a word out, always cut it out.

... I find myself agreeing so strongly with Orwell's principle that I want to cut a word even from the rule itself. Orwell's rule should read:

If it is possible to cut a word out, cut it out.

Perhaps even:

If you can cut a word, cut it.

And let others debate the meaning of "can".


In journalism many decades ago I learned, and learned to respect, an equivalent axiom ...

"When in doubt, cut it out"

... which was meant to encourage the deletion not merely of surplus words, but also of facts of uncertain provenance, of proper names which might have been mis-spelled, of descriptive passages of dubious relevance, of entire paragraphs, and sometimes of entire articles.

So perhaps, when Hart claims, in contradiction of Orwell's advice to cut words wherever possible, that "No great writer in the history of any tongue has ever observed this rule", he may have in mind novelists rather than journalists. But even then, his claim is ridiculous.

I doubt that any great work of literature has come into being, at least since the dawn of the novel, which was not at some earlier stage vigorously edited, and often vigorously truncated, usually by its author. The wordiest novel is often the most truncated.

Every novelist, as best I can tell, has always had too much to say. Every first draft comes in too long by a factor between two and ten. Drafts never come in too short. "Editing" means cutting the last draft down to size; it never means inflating the prose to fill excess space available.

Cutting out cuttable words is one of the easier and more rewarding parts of any writer's life. The more cuttable words you can cut from your manuscript, the more of your uncuttable words will survive into print. All writers, pace Hart, have respected this rule.


Orwell's six rules are perfectly sound rules for journalism. They may also be sound rules for speechwriting and for marketing. They are not, in my view, sound rules for literature. But since Orwell made no particular distinction between journalism and literature in his essay, doubtless he intended his rules to guide writers of all kinds.

Orwell's main aim in writing Politics And The English Language, the essay into which his rules were embedded, was to argue for a new definition of "good writing". Orwell wanted "good writing" to mean writing which

(i) Said something true, and
(ii) Did so in language that was clear to all readers.

Other writing — writing which was not clear, writing which failed to express a true message — was, almost by definition, the work of a bad writer, or of a woolly thinker, or of a deliberate propagandist.

It is a short hop from here to 1984, Orwell's great novel about a totalitarian England, which was published in 1949. The central invention of 1984 is an official language called "Newspeak", a version of the English language which has been reduced, by force of law, to a tiny vocabulary of approved words so bland and generic as scarcely to permit the imagining of a dissenting opinion, let alone the expressing of it.

A character in 1984 called Syme, who is said to be editing the 11th edition of the Newspeak Dictionary, explains the virtues of Newspeak to Winston Smith, the novel's hero, as follows:

If you have a word like ‘good’, what need is there for a word like ‘bad’? ‘Ungood’ will do just as well — better, because it’s an exact opposite, which the other is not. Or again, if you want a stronger version of ‘good’, what sense is there in having a whole string of vague useless words like ‘excellent’ and ‘splendid’ and all the rest of them? ‘Plusgood’ covers the meaning, or ‘doubleplusgood’ if you want something stronger still. Of course we use those forms already. But in the final version of Newspeak there’ll be nothing else. In the end the whole notion of goodness and badness will be covered by only six words — in reality, only one word.

Syme's argument is by no means absurd. It has attractions and precedents.

Similar claims were made by advocates of Esperanto, an entirely virtuous project to invent a European lingua franca in the late 19th century.

Similar principles were invoked by two Cambridge scholars, I.A. Richards and Charles Ogden, when they proposed in 1930 a simplified version of English, to be called "Basic English", which would use just 850 common words, and was designed to spread English as a global language.


As to the rest of Orwell's rules, I cannot see much harm in them.

Few people can tell the difference between the active and passive voices, so there is not much to be gained from overthinking the use of these voices in literary fiction.

Aversion to the passive voice does, however, remain a healthy instinct in journalism. "Jones was hurt" contains much less information than "Smith hurt Jones". Newspapers should always tell us who to blame.

I might also agree with Orwell that short words are generally better than long ones, but I cannot easily say why this should be so. Perhaps it comes from some distantly imbibed inverse snobbery on both of our parts. In any case, Orwell's rule applies only where a short word and a long word are close-enough substitutes for one another, where one "will do" for the other, which is rarely the case. No two words are perfectly synonymous in all contexts and for all audiences.

As for Orwell's throw-away exit-line ...

Break any of these rules sooner than say anything outright barbarous.

... this is an ingratiating and essentially fraudulent appeal to the reader's or writer's supposed common sense. It suggests that one can follow Orwell's rules while still feeling in charge of one's own prose.


When he sat down to write 1984, on the Scottish island of Jura in 1946, Orwell had evidently reflected further on the nature of "good writing".

He had seen the nationalist propaganda of World Wars 1 & 2; he had watched a rising tide of ideological propaganda flood Europe since the Spanish Civil War; he had watched the rise of broadcast media and of broadcast advertising.

In 1984 he concluded and gave warning that simplicity of language was dangerous when taken to extremes. Life itself was complicated, and if we could not discuss Life in complicated terms, then we, like Winston Smith, were lost.

Ambiguity, imprecision, vagueness, repetition, jargon and digression are all figures and manners of speech which give speakers and writers places to hide, places to conceal their opinions and ideas without malice.

Orwell banished all such hiding-places from Newspeak. The permitted vocabulary of Newspeak allowed only for praise of Big Brother and for repetition of Big Brother's slogans. You can imagine Orwell's despair when he realised quite how easily this could be done.

All that said, I wonder whether Orwell's experience of propaganda might possibly have left him with an exaggerated view of what propaganda could achieve, and with a distaste for propaganda which deterred him from any proper study of its inner workings.

He imagined in 1984 that people were best motivated by simple commands and physical force. They would obey any order that was repeated to them often enough in plain enough terms. Let me try to explain my hesitation about this model by citing a line from the American poet and novelist Paul Eldridge:

A man will die for an idea, provided the idea is not quite clear to him.

Now this is surely true. People around the world have shown their willingness over many centuries to die for a God, or for a nation, or for an ideal, all things in respect of which the mass of people might possess at best some very few fragments of fact or some vague impressions acquired by hearsay.

People have not, on the other hand, generally shown a similar willingness to die for particular concrete things, and for things that they know well. They will not generally die to ensure the survival of their superiors at work, nor of the house in which they live, nor to perpetuate some actually existing civic virtue, such as cleanliness.

In the case of 1984 I do not for a moment believe that the citizens of Oceania would die willingly for Oceania merely because Oceania declared itself to be "doubleplusgood". Even the most totalitarian of states does well to provide some hidden place for romanticism within its ideology or within its regime. People need somewhere to place their irrational hopes.

In Russia the Orthodox Church provides this service. It purports to provide spiritual comfort to the government, and spiritual guarantees to the public at large, while being in most respects an undeclared arm of government itself.

In Britain, which is not a totalitarian state, the equivalent job goes to the Royal Family. The British monarchy gives religious and historical legitimacy to the government, while being in most respects an undeclared arm of government itself.


The sort of prose style which Orwell so deplored in his 1946 essay, and which Hart has defended in his 2023 essay, was first and most ably discussed by H. W. Fowler and F. G. Fowler, in The King's English (1906) under the rubric of "elegant variation", of which they say:

We include under this head all substitutions of one word for another for the sake of variety, and some miscellaneous examples. But we are chiefly concerned with what may be called pronominal variation, in which the word avoided is either a noun or its obvious pronoun substitute.  

Sub-editors on The Guardian newspaper have a term of their own for words and phrases introduced solely in pursuit of "elegant variation". These are called "povs", an acronym from "popular orange vegetable", referencing a long-lost and perhaps legendary piece of copy about the health-giving properties of carrots, the writer of which, having introduced carrots as "carrots" in paragraph one, felt obliged to open his second paragraph with the words: "The popular orange vegetable ..."  

H.W. Fowler returned to "elegant variation" in his Dictionary Of Modern English Usage (1926), mainly in order to warn still more strictly against it:

It is the second-rate writers, those intent rather on expressing themselves prettily than on conveying their meaning clearly, and still more those whose notions of style are based on a few misleading rules of thumb, that are chiefly open to the allurements of elegant variation.

Message received: Follow Fowler and avoid vulgarity. But I envy Fowler the presumption of hauteur from which he can deliver such a declaration. Who — save perhaps for Shakespeare and Homer, Sappho and Pushkin, Du Fu — would ever dream of claiming to be a first-rate writer? To be a second-rate writer, up there with Charlotte Brontë and Anthony Trollope, would be bliss indeed.

Snobbery aside, there is still some value in Fowler's scorn. Fowler warns that no writer who is admired primarily for "prettiness" in their own time is likely to be thought "first-rate" by later generations. Good advice. Keep your prose-style in check until you have first been admired for your wisdom. Oscar Wilde fell at this fence, a furlong short of greatness; and so, a century later, did Martin Amis.


Having got this far into an argument against the very principle of "rules for writers", how else can I exit, save by offering my own "rules for writers", if only to demonstrate the futility of such rules?

Describing, as a reader, the qualities which I seek out in a writer, I would couch my advice to writers as follows, and in descending order of importance:

(i) Be honest
(ii) Point to something in the world and talk about it
(iii) Have something memorable to say
(iv) Start well.

I could expand on these rules, but rules requiring expansion are not good rules in the first place.

There is nothing about prose-style in these rules, nor should there be. All style rules find their contradictions in Shakespeare.

At some levels and in some places there will be an audience for long words over short ones, old words over new ones, sadness over happiness, ramblings over brevity, experimental fiction over genre fiction, shock over decorum. There will be audiences for Samuel Beckett and Jane Austen, for Elmore Leonard and for Frozen, for Quentin Tarantino and for Muriel Spark, for David Brooks and for Amia Srinivasan.

There may be good and bad reasons to distinguish between such audiences, but I beg you to proceed gently when doing so. Because I myself am in all of them.

Robert


Time And Space


A weekly letter from the founding editor of The Browser. Topics may vary. Correspondence and criticism welcome: robert@thebrowser.com

This week: Books from the 1920s


How glorious the nineteen-twenties must have seemed to those artists and scientists fortunate enough to survive the 1914-18 War without great personal loss and then to rebound into a world where old certainties were crumbling and anything seemed possible.

The very nature of reality was being contested by advances in relativity and quantum theory. The formulators of quantum theory could say only that it described the behaviour of matter at a very small scale and that its principles departed from what had previously counted as common sense. If Newtonian physics had been a system of mathematics and mechanics, quantum physics was more like a system of metaphors — metaphors which could as easily be put to work in other fields of human activity. If reality defied common sense at a very small scale, why should it not defy common sense at all other scales?

Relativity was easier to understand if you had some mathematics. It did not rewrite the logic of the universe, but it did show the need to adjust intuitions about time and space in order to explain effects that astronomers in particular were observing. Relativity released its own swarm of metaphors into the world of ideas, beginning with its propositions that every property of everything in the universe was relative, and that time was neither a wholly abstract phenomenon nor entirely uniform in its behaviour. One might argue that modernism, a long nineteen-twenties, began with the spreading awareness of Einstein's theory of general relativity, first among intellectuals and then among the general public, after its publication in 1915.

The principles of genetics, revealed by the rediscovery of Gregor Mendel's work in 1900, were assumed in the 1920s to offer a scientific method for improving the entire human race by means of selective breeding, or eugenics. Freud's lectures proposed a new theory of the human mind. The modelling of the atom by Ernest Rutherford reopened the old claims of the alchemists: All matter might be fungible. New technologies that had been accelerated and operationalised by the recent war included aeroplanes, weaponry, radios and televisions, psychological techniques of persuasion and propaganda, psychological treatments for grief and trauma.

No less a shift was under way in philosophy, where Bertrand Russell argued that philosophy, mathematics and logic were more or less the same thing. Mathematics was a symbolic language in which logical propositions were expressed; analytical philosophy could be worked out in the language of mathematics; the confusions of ordinary language could thus be eliminated and the operations of ordinary language exposed. Language was the key to knowledge and even to reality itself.

To this ferment in the hard sciences and philosophy was added the prosperity of post-war America, where the beneficiaries of price-rises saw themselves as the creators of wealth. This was a golden age for management theorists, economists, and free-market ideologists, who could find evidence for their laws and theories wherever they looked. Dissenters could find encouragement in the survival, if not yet the success, of the communist revolution in Russia.


This torrent of ideas and analogies pouring into the humanities made the 1920s as thrilling for writers and painters as it was for physicists and philosophers.

Novelists who had previously considered words as mere instruments for the telling of stories now came to see words as things in themselves. Words determined facts, not vice-versa. Fictional characters could do strange and unpredictable things in a universe which was itself strange and unpredictable. Plots became arbitrary when old rules were disappearing and new rules were still being discovered. One could spend the 2020s reading novels from the 1920s and feeling that literature had added nothing of comparable significance in the intervening hundred years.

There was Ulysses, of course; The Great Gatsby; Mrs Dalloway; To The Lighthouse; The Sun Also Rises; The Sound And The Fury; A Passage To India; Women In Love; All Quiet On The Western Front; Decline And Fall; Point Counter Point; The Trial; We; Swann's Way; The Magic Mountain. All human life was there, minus more recent elements of human life — cellphones, climate change, culture wars — that we probably don't much want in our fiction anyway.

All this you know. What I hope to do in this letter is to highlight one or two of the non-fiction books of the period which seem to me in their various ways to have a sensibility all their own: A grounding in what we can recognise and respect as scientific method, coupled with a certain innocence about the world — a belief that the rules of life, once discovered, will be comprehensible, and favourable to humanity. These are also books that hold their own as works of literature, however far they may have been overtaken as works of science and pseudo-science.


On Growth And Form by D'Arcy Thompson, published in 1917, seemed at the time to be among the great books of the new century. Now it is scarcely known.

Thompson's argument was that properties of size and shape had been much underrated and understudied by zoologists and biologists; that evolution was as much a matter of changing size and shape as of anything else; that fitness, in Darwinian terms, was largely determined by size and shape; and that the life of an organism was limited, even defined, by its size and shape.

Perhaps the most extraordinary feature of Thompson's book was the extravagant admiration which it inspired among its partisans. In a preface to the 1992 revised edition of On Growth And Form, Stephen Jay Gould wrote:

From Falstaff to the Ring of the Nibelungen, great constructions and great works of art have paid a price for amplitude beyond usual standards. D’Arcy Wentworth Thompson (1860–1948), Professor of Zoology at Scotland’s University of St. Andrews, and perhaps the greatest polymath of our century, was scarcely homo unius libri (a man of one book). He composed two volumes of commentaries on all birds and fishes mentioned in classic Greek texts; he prepared the standard translation of Aristotle’s Historia animalium; he labored for years over statistics for the Fishery Board of Scotland; and he wrote the section on pycnogonids (a small but fascinating group of arthropods) for the Cambridge Natural History series. But his enduring (indeed evergrowing) fame rests upon a glorious (and very long) book that served more as the active project of a lifetime than a stage of ontogeny — On Growth And Form (first edition of 793 pages in 1917, second edition enlarged to 1116 pages in 1942).

The Nobel-prizewinning immunologist Peter Medawar, widely considered one of the finest scientific minds of the 20th century, said of Thompson:

[He was] an aristocrat of learning whose intellectual endowments are not likely ever again to be combined within one man. He was a classicist of sufficient distinction to have become President of the Classical Associations of England and Wales and of Scotland; a mathematician good enough to have had an entirely mathematical paper accepted for publication by the Royal Society; and a naturalist who held important chairs for sixty-four years, that is, for all but the length of time into which we must nowadays squeeze the whole of our lives from birth until professional retirement. He was a famous conversationalist and lecturer (the two are often thought to go together, but seldom do), and the author of a work which, considered as literature, is the equal of anything of Pater’s or Logan Pearsall Smith’s in its complete mastery of the bel canto style. Add to all this that he was over six feet tall, with the build and carriage of a Viking and with the pride of bearing that comes from good looks known to be possessed.

I hope by now I have piqued your interest such that you will consider downloading the free sample of On Growth And Form available from Amazon for Kindle. It includes Stephen Jay Gould's foreword, Thompson's first chapter, which introduces his argument, and his second chapter, On Magnitude, which is a feast in itself — from which, a morsel:

Small insects skating on a pool have their movements controlled and their freedom limited by the surface-tension between water and air, and the measure of that tension determines the magnitude which they may attain. A man coming wet from his bath carries a few ounces of water, and is perhaps 1 per cent heavier than before; but a wet fly weighs twice as much as a dry one, and becomes a helpless thing. A small insect finds itself imprisoned in a drop of water, and a fly with two feet in one drop finds it hard to extricate them.

Risk, Uncertainty, And Profit by Frank Knight, published in 1921, grew out of Knight's PhD thesis at Cornell. It launched Knight on a brilliant career in economics spent largely at the University of Chicago, where his pupils included Milton Friedman and James Buchanan. His name lives on in the concept known to economists as "Knightian uncertainty", following the distinction which Knight developed in his book between "risk" and "uncertainty":

“Risk" means in some cases a quantity susceptible of measurement, while at other times it is something distinctly not of this character; and there are far-reaching and crucial differences in the bearings of the phenomenon depending on which of the two is really present and operating. There are other ambiguities in the term "risk" as well, which will be pointed out; but this is the most important. It will appear that a measurable uncertainty, or "risk" proper, as we shall use the term, is so far different from an unmeasurable one that it is not in effect an uncertainty at all. We shall accordingly restrict the term "uncertainty" to cases of the non-quantitive type. It is this "true" uncertainty, and not risk, as has been argued, which forms the basis of a valid theory of profit and accounts for the divergence between actual and theoretical competition.

But whereas the concept of Knightian uncertainty remains current 50 years after Knight's death, the rest of his landmark book appears to be little read. If you buy the digital edition of the book you get the old 1921 print edition crudely scanned. Something of an explanation for this recent neglect can be found in a journal article about Knight published by the American economists Richard Langlois and Metin Cosgel in 1993:

For decades now, economists have struggled to interpret Frank Knight's Risk, Uncertainty, And Profit. Like a handful of other classic texts — Das Kapital and The General Theory come to mind — it has produced nearly as much confusion as inspiration, nearly as much misinterpretation as interpretation. Risk, Uncertainty, And Profit is a brilliant book. But it is also idiosyncratic in scope and method. Worse yet, in the eyes of the modern economist, it is deeply philosophical.

Needless to say, that last sentence made my eyes light up. I sat down to read Risk, Uncertainty, And Profit and found that for the non-economist it is brilliant without being in the least confusing.

Knight argues that the world is an uncertain place, and that true uncertainty cannot be quantified nor perhaps even identified. If we can assign a well-grounded probability to a future event then we no longer have uncertainty but something more like certainty — the near-certainty, for example, that something will happen seven times out of ten.

Knight goes on to argue that the profits of almost any company will almost always arise from true uncertainty about future business conditions. A company that guesses these future conditions correctly, and positions itself accordingly, will stand to make a profit. A company that guesses incorrectly will not. Where there is no true uncertainty, but merely "risk", which can be costed, and insured or provided against, any number of people may enter the business with absolute confidence and any profits will be competed away.

Knight goes on to develop from this model a theory of management which seems to me to be both original and unsurpassed.

In conditions of uncertainty, a good manager is somebody who is good at judging the fitness of subordinates and business partners to fulfil particular tasks in conditions of uncertainty. The more elevated the manager's position, the more completely his or her job will consist of judging subordinates and business partners; and the key quality on which these subordinates or business partners will be judged will be their capacity for judging others similarly. The essential work of a manager, therefore, is second-guessing other people's guesses. The whole business world is a Keynesian beauty contest, all the way down.

A few lines from Knight to give you a sense of his style:

Profit arises out of the inherent, absolute unpredictability of things, out of the sheer brute fact that the results of human activity cannot be anticipated and then only in so far as even a probability calculation in regard to them is impossible and meaningless.
The existence of a problem of knowledge depends on the future being different from the past, while the possibility of the solution of the problem depends on the future being like the past.
The ability to judge men in relation to the problems they are to deal with, and the power to "inspire" them to efficiency in judging other men and things, are the essential characteristics of the executive.

Knight's book has some dull patches where it goes into detail about the practicalities of corporate management; these sections seem distinctly dated in the way that his more general arguments do not; but skipping over a few pages here and there is a price well worth paying for the wisdom that remains.


I shall stop here simply because these letters should not be unreasonably long. Next week, if you will allow, I will conclude by discussing Alfred North Whitehead's Science And The Modern World (1925) and J.W. Dunne's An Experiment With Time (1927).

Robert Cottrell


Books From The Future


Founding Editor's note:

As part of our new Friends of The Browser package, we are adding a weekly editor's letter, which has no set theme, but is unlikely to stray far from my main topics of interest — journalism, literature, language and philosophy. We will share some of these letters with all Browser members. The first follows here.

And, by the way, thank you for being a Browser subscriber. We are proud of what we have been able to achieve with your help, and we hope you are too. If you would like to become a Friend of The Browser — reaping your rewards in beautiful Browser merchandise, conversations in our online community, a giftable bonus Browser subscription, and, most important of all, our endless love and gratitude – please visit our web page Friends of The Browser, or email sylvia@thebrowser.com.

Robert Cottrell


27th November 2021

Robert Writes: The Quick And The Dead

Of the many books that I hope still to read in my life, the most elusive are those that have yet to be written.

To introduce obliquely the first of these books that I dream of imagining into being, while lacking the application to write them myself, I should say that the one contribution of lasting value I ever made to anything in my thirty years of full-time journalism was my advocacy of an obituaries page for The Economist.

I was, nominally, Features Editor of The Economist at the time — 1994 or so — and trying to make myself generally useful around the place. But since there was only one long weekly feature in the paper which was not being otherwise handled by specialist section editors, I tended to have a bit of time on my hands (such things happened in those days).

I had a weakness for obituaries after having worked on The Independent, where the obituaries editor and antiquarian James Fergusson had turned a traditional backwater into a garden of delights. Transported to my new home in St James's Street I modelled, using the past year's newspapers, the confidence with which we could expect interesting people to die at appropriate intervals and thus ensure a compelling obituary in each week's Economist. The numbers were encouraging, the obituary page was blessed by the editor, Bill Emmott, and it has been duly executed ever since by two of the most perfect prose stylists ever to have graced a printed page, Keith Colquhoun (himself the subject of an Economist obituary in 2010) and Ann Wroe.

All of which is to say that I have a certain interest in obituaries, a certain knowledge of how they are done, and a keen desire to read a quirky crime novel set in the Fleet Street of the mid-1990s when the internet is starting to show its teeth and the old soaks are fearing for their expense accounts.

Let us imagine as our central character a journalist somewhat past his prime, known as Dave "Coffin" Lidd by virtue of a smoking habit formed when such things were still mandatory in the newspaper business. Dave has been bumped from the sports section to the obituaries desk after an awkward incident involving a women's basketball team. He rightly fears he is on course for "voluntary" retirement within months if not weeks.

But he will not go gently into that dark night. He still has the rat-like cunning and the capacity for plausible invention under pressure which one necessarily acquires over long years at the Daily Filthpacket; and, if his new job is technically a demotion, he has no particular animus against writing about the dead, but rather an almost indecent feeling of excitement — for the dead, as every journalist knows, cannot sue for libel. He senses opportunity here. He will ransack the last days of the rich and famous to procure and fabricate scoops and super-scoops that will send sales of the Filthpacket soaring, while his colleagues and rivals on more scrupulous newspapers look on in amazement.

And so he does. Here is one of his first exploits.

A retired Vice-Marshal of the Royal Air Force is strolling home from drinks at his West End club at 8.30pm on a Monday evening. By 10pm he is dead in a mortuary, the victim of a hit-and-run accident. By 8am the following morning he is decorating breakfast tables across the Home Counties in the form of a sensational 2,000-word obituary in the Daily Filthpacket, under Lidd's byline, filled not only with tales of derring-do in times of war, but also with a variety of anecdotes involving ladies' underwear, ballet dancers, poppers and hidden cameras which the Vice-Marshal, fortunately for the Filthpacket, is no longer in a position to dispute.

Lidd considers this a job well done — especially the muted ferocity, peculiar to journalists, market porters and night-club bouncers, with which he was able to shoehorn the piece into the final London edition of the Filthpacket just minutes before the 2am press run, assuaging an obstructive stone-sub with a case of Glenfiddich and the promise of minicabs home for life.

Lidd's peers on other newspapers, a mostly mild-mannered group, generously attribute his success to an expertise doubtless honed during his years as a sports reporter which has enabled him to build a network of trusted contacts among doctors and nurses in the emergency rooms of the top London hospitals, not to mention oncologists and policemen, taxidermists and coroners, crematorium-stokers and general practitioners.

Nor do his rivals resent his success as much as one might have expected. His regular front-page splashes for the Filthpacket — "Countess chokes on rock star's vomit" — "Cabinet minister hid childhood sex-change" — cast a reflected glory on the obituary-writing profession in general, alerting editors-in-chief the length of Fleet Street to the possibilities of extracting sensational stories from this neglected page which they might otherwise have axed had they thought about it at all.

So far, so good. But as Dave Lidd's star rises, and he bestrides his profession like a colossus with a scythe in one hand and a pen in the other, the doubts begin to accumulate; not only among the readers of our imagined book, but even among the readers and editors of the Daily Filthpacket.

The warning signs are already there, if one only knows where to look: an odd coincidence, a cryptic remark, a strange visitor to Lidd's office, a nocturnal absence, a momentary panic involving a lost diary, a misrouted telephone call, a confusion over two identically-spelled names in Burke's Peerage — all of them can be explained away as isolated incidents, but still, one cannot help but worry that Lidd enjoys just a little more good luck than any prudent journalist has the right to possess.

Slowly, reluctantly, the truth dawns upon us. Under pressure to maintain his hit-rate of untimely death notices, Dave Lidd is reversing the apparent course of events. Perhaps he always did so. He is choosing his subjects, writing their obituaries, and then killing them himself.

How does it end?

"Coffin" Lidd must surely sense a growing unease around his singular clairvoyance. He decides to risk one last job — the biggest job of all — before retiring from Fleet Street to host a television show, How They Died, about the final hours of celebrities.

But, in conformity with the rules of the "one last job" genre, something goes wrong.

Perhaps owing to his unfamiliarity with the finer points of online publishing, or to some change in the Filthpacket content-management system, Lidd sends his last, transcendent obituary notice "live" before he has sent its subject to their death.

As horror and wonder sweep the nascent Internet, Lidd is arrested in the grounds of Balmoral Castle, disguised as a ghillie, preparing to shoot the Queen Mother with a rifle belonging to another member of the Royal Family.

(finis, shock, applause)

It seems obvious to me that this book must have been written by Michael Frayn around 1973, but for some reason I can find no trace of it. The Internet would not have been available as a plot device, but since newspapers in those days printed substantially revised editions from late afternoon until the small hours of the morning, premature publication of the Queen Mother's obituary could still have cooked Lidd's goose in much the same way.

If this book does not exist, I assert my moral right to be identified as the author of this short fiction. If it does exist, and you have a copy of it, would you send it to me, please?
