Or is it a question of where to look?
A madman lit a lantern in the bright morning hours and ran into the market place. ‘“Whither is God?” he cried; “I will tell you. We have killed him – you and I. All of us are his murderers.”’ Many of those standing around laughed at this ungainly interruption, even though their faith had either lapsed or turned lukewarm. And, finally, it dawned on the madman. He smashed his lantern on the ground. He cried out something histrionic about lightning and thunder, then tailed off with the words “deeds, though done, still require time to be seen and heard”.
You might begin to detect my line of thought. In his famous allegory for the death of God, Nietzsche spots the gap between the tectonic movements of culture and the individuals scuttling about on them. Something of this sort is happening with the internet. We, too, have a “market place” of ordinary folk persisting with the status quo, while making flippant comments about it. We, too, have had a few hysterics barking about the shifting ground beneath us (I’m one of them – witness my insanity).
In the case of the internet, one of the oceanic issues at stake is the truth. The title of a paper published last year by the Tow Center for Digital Journalism, called “Lies, Damn Lies, and Viral Content”, pretty much says it all. With a perverse incentive to clock up as many clicks or views as possible, publishers and content producers will cut corners. In the catalogue of bad practice, we find that many news sites don’t trouble to verify the stories they publish, but pass the buck to the source they quote. This makes them easy prey for cheap rumours. They will jump at the opportunity for a story thrown up by hearsay, and cover their backs with hedging language (something like ‘Inflatable German sex toy causes unnaturally long life, research claims…’). The same report also found that news organisations which do verify their stories, and which have more rigorous quality controls, often don’t challenge the rumour-mill or fact-check the myths it disseminates.
Even a summary reading of this report leaves the impression that the internet is an out-of-control lie factory. The Guardian’s editor, Katharine Viner, inveighed against this sloppy practice in a recent piece, and raised the obvious question it implies: “Does the truth matter any more?” The question has since been echoed by a run of articles in the national press on a similar theme. This month the Economist ran a cover story about social media called “The art of the lie”.
Viner gives an example of the Tow Center’s analysis, in the well-known story drawn from the unofficial biography of David Cameron. The story claimed that the then Prime Minister, as an Oxford undergraduate, “inserted a private part of his anatomy” into the head of a dead pig. But within twenty-four hours the book’s authors admitted that they could not provide solid evidence to corroborate the story – which, surprise, surprise, had already been voraciously consumed by the media.
This example might seem pretty frivolous and innocent. But it’s no surprise to find that Viner spots something much more serious at stake. As click-farming steals the ground from investigative journalism, she suggests, it creates vacant territory in which ideas and movements flourish unchecked. The two most visible examples in our immediate field of vision are Brexit and Donald Trump. Increasingly, even on the most high-profile matters, politicians can assert something which suits them and worry about the truth afterwards. Two of the key “facts” at the vanguard of the “Leave” campaign – that leaving the EU would free up £350 million per week for the NHS, and that immigration would fall – were quickly retracted once the vote had been won.
In the month before the referendum, Peter Oborne, writing in the Spectator, levelled the same charge at the “Remain” side. In particular he accused George Osborne of misusing the offices of state for a purely rhetorical purpose. The chancellor, he claimed, “has now converted the Treasury into a partisan tool to sell the referendum, exactly as Tony Blair used the Joint Intelligence Committee to make the case for war against Iraq.” The Remain camp had put forward as fact that families would be £4,300 worse off in the event of Brexit. Oborne suggests that this relied on a misleading conflation of GDP with household income, and that the chancellor then pimped his facts by touring the broadcast studios before the Treasury documents were released.
For Oborne this sort of practice typifies an era of “post-factual politics”. Nurtured by New Labour, then adopted by the Coalition Government and its Tory successor, politics is now more about presentation than the integrity of public policy. If the charge has some ground to it, this sort of “spinning” clearly predates the widespread influence of the internet. But the internet, and especially social media, may have made matters worse.
Social media makes it quicker and easier to publish and share something than ever before, and in the clamour for attention, as the Tow Center’s report says, “online media frequently promote misinformation in an attempt to drive traffic and social engagement”. Large communities of people seize on a story at a moment’s notice. These conditions mean that a rearguard editor, troubled by their conscience, can easily get left behind. For a cynical politician (I’m sure there must be a few of them), this creates a space in which their main aim is to craft “facts and figures” primed to go viral. It creates a fertile habitat for sophistry.
I suspect that most politicians understand this sort of cultural change, and that part of their communications strategy will aim to exploit it in a tactical but cautious way. Some are clearly a little more brazen than others. Or, as the headline from one of Nick Cohen’s opinion pieces shortly after the EU vote put it, “There are liars and then there’s Boris Johnson and Michael Gove”.
Donald Trump seems to have taken this approach on an industrial scale. He treats public debate in the same way Irving Thalberg – pioneer of the Hollywood studio system – treated the film production process. You test a public statement with a focus group to see if it will make waves, then you go mainstream. Whether the statement is true or not is an afterthought. Hence it echoed recently all around the world that Barack Obama had founded ISIS (while, no doubt, holding a sex orgy in the company of a dubious-looking man with a bifurcated tail). The main accolade we might award to a successful piece of digital content is summed up by Rihanna: “I love the way you lie”.
I’m exaggerating the point: the answer to Viner’s question, “Does the truth matter any more?”, might not yet be “no”, but that seems to be the direction of travel. Which suggests that something in the character of social media – or the character of the way we use it – isn’t too fussed about the truth. If you go looking for its characteristics, words like “subversive”, “ironic”, “funny”, “surreal” often surface quickly. There’s usually also a game or a quiz to chivy you along. So if we can spy any relationship to the truth, it’s love-knotted with entertainment.
And, from some angles, whatever it is that entertains looks skin-deep. Research suggests that large numbers of people read little more than a headline or a few curated “facts”, usually confected into an image or infographic. It’s commonplace for news organisations to share links back to more “substantial” stories, but, with the exception of Buzzfeed, these links send only small percentages of traffic to the “mother” site. The BBC website, for example, receives only 7% of its traffic from social media. If we also take into account that people read less than 30% of the content on a web page (however they get there), we’d better hope that the adage “less is more” holds firm. (“More of less” might be more accurate.)
All of which gives the impression that people trawl through their social media channels glancing from one meat-sliced headline to another, without really engaging in any meaningful or critical way. If the truth makes any kind of appearance, it’s so ephemeral that you would be forgiven for failing to notice it. “Thought leaders” in this area often effuse around words like “engagement” and the search for “sticky” content. The irony might be that most of this content is so shallow that almost none of it sticks long enough for anyone to engage with it. Then there’s the familiar problem of memory loss. The act of forgetting happens almost as quickly as the act of discovery. Far from a warm and welcoming home for the truth, isn’t all of this a kind of high-spec, programmatic nihilism?
The Guardian’s criticisms do, of course, make assumptions. They assume that the truth is a “thing” – or an object – waiting out there for conscientious journos to discover. This is dicey. We sit firmly within a culture which has a two-and-a-half-thousand-year history populated by wildly differing views of what “the truth” means. Some even have the temerity to suggest it doesn’t exist at all. Our Teutonic friend intent on reading the funerary rites for the deity is also famous for having said “there are no facts, only interpretations,” before adding, just for clarity, “and that too is an interpretation”.
As anyone who has managed to scrape through the history of philosophy in the twentieth century will know, we have, for some time, been feeling wary of “fixed” truths out there which neatly correspond to the mushy matter of the mind. Nietzsche and Heidegger, with many followers, forged the view – the “hermeneutic” turn – that truth is a matter of interpretation. For some this makes it relative; for others, just “weak”. Wittgenstein ushered in a “linguistic” turn, rooted in the traditions of analytical philosophy. Here the truth obtains only in a cultural and linguistic context. Or through “language games”. Scientists operate within a set of parameters, which creates the conditions in which they can do business (though it might seem pretty arcane to the rest of us). Politicians, too, have their own explicit or implicit rules, shaped by the structures of power and the soft cultural wrapping that sits around them. As do economists. Or avant-garde artists. Or academic historians. And so on.
In a recent public lecture on Philosophy and the Internet at the Bath Royal Literary and Scientific Institution, Dr Matthew Harris mused on how this turn relates to the current state of the internet and digital publishing. For some thinkers, the nature of online publishing can look like a riotous fulfilment of twentieth-century disillusionment. Harris cited, as an example, the Italian philosopher Gianni Vattimo. For Vattimo, if philosophy began with a grand account of reality reduced to first principles, it has reached a state in which an unstoppable plurality of voices talk about and interpret “reality” in different and incompatible ways. The internet micro-slices spontaneous moments into contradictory and irreconcilable perspectives.
So digital technology could be seen as liberating. It has dismantled the corporate character of truth with its stranglehold on ordinary life. And it has set free the proliferating differences and innate ambiguities which burgeon in human culture.
The title of Viner’s article is “How technology disrupted the truth”. Heidegger might have, albeit uncharacteristically, raised a smile at the prospect that any technology could disrupt the truth. He was cautious about claims to have “overcome” the sort of adamantine and objective account of reality – or “Being” – which he traces through the history of western culture. But he finds a measure of comfort in ways of framing the truth which disrupt the reductive violence of the metaphysical mind and situate “Being” as a participatory event.
The language of philosophy – “metaphysics”, “Being” and so on – is, shall we say, a little more abstract than the shin-kicking of journalism and politics. It’s all very well to talk of the truth in this way, but where public policy, economics, or for that matter, the exploratory questing of Biotech are concerned, “facts” mean something very real for many people.
So Viner’s concern for the truth is well-placed. It is framed by the hard realities of politics and public debate. But, in a sense, this is a point shared by most postmodern thinkers: “truth” is, in their view, not a piece of rarefied thought. It is political. Broadsheet journalism doesn’t take place inside an aura of divine light. The context – the “language game” – is the institutions of parliamentary democracy.
Journalists in this game never really “stand outside” the context in which they are reporting. They are a professional and skilled class who belong to a kind of establishment. The way public reporting is often justified makes the information sound like a package of data passed between two agents. A journalist discovers the facts, and puts them in the public domain, all in the name of accountability. The facts have a “fixed” meaning shared between the reporter and the public. But really these facts are an interpretation from the inside, which requires considerable skill, networking and PR polish. In the “power” and “the people” divide, journalists belong firmly on the “power” side of the fence. Even when journalists and politicians are tearing chunks out of each other, there is an implied fraternity shaped by the structures of the game.
And this is true for all forms of national journalism, whether “broadsheet” or “tabloid”. Popular media might be seen as obviously more low-brow, but in some ways it is more patrician. It takes just as much, if not more, rhetorical skill to craft genuinely popular content, and the reach and degree of influence – the power – is greater. Why else do political parties go to such lengths to court it? The Prime Minister’s Director of Communications can hardly be anything other than an establishment job. It is telling that two of its most recent incumbents – Andy Coulson and Alastair Campbell – both had a background in the popular press. So, in nuanced ways, all journalism reflects the select world in which it is created.
The point that someone like Vattimo seems to be making is that technology is democratising the “corporate” view of truth forged in the furnace of executive power and filtered through a handful of media moguls. It is challenging an unbalanced, almost monopolised, view of public debate. So if we want to really cross from “power” to “the people” we have to scroll down from the professionally crafted article to the comment wall beneath it, or take a swim in the vitriolic cesspit of social media.
It doesn’t take long before we start to feel unclean. It would, of course, be wonderful if the voices outside mainstream publishing were lending more colour and contrast to the public debate by empowering a broader and more diversified discussion. So the big question is: are they? And the short answer is: for the most part, no.
“Heretical” views about digital content have become more strident in recent years. To contract some of them, we might say that “truth” is drowning in an anonymous and sadistic sea of sludge. My own view is that the excess of social media is more logical than it might seem. Technology reflects back at us something about the way we think, and a similar habit of thought underpins our political culture. Both technology and politics want to reduce our lives to a singular cast of mind, which is in violent conflict with the mysterious messiness of people. And because we aren’t the determinate and utopian creatures that techno-politics wants us to be, this “idealism” quickly converts to chaotic nihilism. Reading any article on any news site, and then scrolling down to the comments, is an iterated illustration in miniature of a destructive psychology. The light of truth is going out because we start from an unrealistic outlook.
Both Viner and Harris observe something perversely subjective about the internet. It creates the impression that the world revolves, in sometimes very particular ways, around you. At its most innocent or practical, this may simply look like efficiency. The internet is good at delivering services. We can now even have buttons which will deliver washing powder at our convenience. And Tinder, if you know what you are doing (I discovered recently that I don’t …), makes getting laid only secondarily related to ordering sushi. Soon technology might fulfil every fetish and hedonistic inclination with scarcely any need for movement or thought. Maybe shopping online isn’t so bad (or is it?), but things start to look more grotesque when convenience colours judgement. It’s a short leap from a selfie to selfishness.
The internet seems to teach that everything turns around you. Or, to change the angle of reflection slightly, it is removing the challenge of different perspectives. A vague creed of “liberal diversity” is standard in many western societies, but the culture and practice of the internet is marginalising direct encounters with difference, just as a programme loops logically through an array. In fact, as Viner points out, computer algorithms are now programmed to sift the global spread of data and deliver results based on your digital profile. This meant that on the day of the Brexit result, when she searched through Facebook, she was unable to discover anyone from the 52% of voters who chose to leave. Turned around, it also meant that many of the digitally savvy millennials who voted to remain worked themselves up into a fury of self-righteous rage, and caricatured the damned 52% as a collective of racist idiots. If they had exhibited the same behaviour towards an ethnic group, women, or any other bastion of liberal inclusion, they would have been brutally condemned for their intolerance and retrograde thinking. Does this mean that digital culture is hypocritical as well as selfish?
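The kind of profile-driven filtering described above can be made concrete with a toy model. The sketch below is purely illustrative and assumes nothing about any real platform’s ranking system: it scores stories by their overlap with a user’s past engagement, and shows how a one-sided profile produces a one-sided feed.

```python
# A deliberately tiny, hypothetical sketch of profile-based feed filtering.
# It is not any real platform's algorithm; it only illustrates the mechanism:
# ranking by similarity to past engagement quietly excludes opposing views.

def rank_feed(stories, profile, top_n=3):
    """Score each story by the overlap between its tags and the user's
    engagement profile, then return only the highest-scoring items."""
    def score(story):
        return sum(profile.get(tag, 0) for tag in story["tags"])
    return sorted(stories, key=score, reverse=True)[:top_n]

# A user whose clicks have all favoured one side of the EU debate.
profile = {"remain": 5, "europe": 3, "leave": 0}

stories = [
    {"title": "Why Remain makes economic sense", "tags": ["remain", "europe"]},
    {"title": "Leave voters explain their choice", "tags": ["leave", "europe"]},
    {"title": "A Remain rally in pictures",       "tags": ["remain"]},
    {"title": "The case for leaving",             "tags": ["leave"]},
]

feed = rank_feed(stories, profile, top_n=2)
# The 'remain'-tagged stories dominate; 'leave' perspectives never surface.
```

The point of the sketch is that nothing in it is malicious: a few lines of entirely neutral-looking scoring logic are enough to guarantee that the user never meets the other 52%.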
Technology does not, from this angle, liberate culture into a postmodern encounter with the eclectic and unfamiliar. In its current guise, it is modernity on speed. It entrenches subjectivity, and division. It shouldn’t come as any surprise to hear, as Harris emphasised, that among the people most adept at using it are terrorists, who actively want to shape unformed minds into an inflexible and dogmatic mould.
That digital culture is borderline narcissistic should tell us that it is, above everything else, reductive. It reduces everything to the archetype of a machine. It puts technology before people. Jaron Lanier, in his 2010 book You Are Not a Gadget, portrays this character at length. In their eagerness to use social networking sites, or other digital platforms, people make concessions, or trim themselves around the edges, to embrace the accepted grammar and idiom of the site. “Conversations” between people on social networking sites are not like “real” conversations. You have the time to groom your appearance (if you choose), using the artificial etiquette of the site. As Lanier says, the most successful users of social media are the ones who are best at fictionalising themselves.
The same reductive character of the web means that most things online look, if not the same, then very similar. News is mashed up into aggregated tools and appears, free from its original context, in units of pristine uniformity. In fact, even if we hunt out the original context by visiting different news sites, we very often find that, superficially at least, most of the newspapers adopt a similar design and public profile (which, perhaps, suggests they are all rooted in the same basic mentality). The same could be said of the websites for political parties, charities, and many businesses. Indeed, finding a site that is genuinely distinctive is quite challenging. Does the internet ever provide context? Sticking with that thought, I was recently at a conference for university web managers, where someone confidently claimed they could spot a university’s content management system from the aesthetics of its site. An idiosyncratic environment for teaching and research is flattened by the technology.
Then there are the terrifying trolls. The examples abound – so much so that we now have a growing collection of legal cases straining the letter of the law. Tabloid journalism in the pre-digital age was often slapped about the face for its uncouth character, but this sort of sensationalism can scarcely compete with the bloody-minded blob of social networking. The extreme character of trolling is one of its most salient characteristics. The journalist and campaigner Caroline Criado-Perez, who, in 2013, put forward the politely liberal suggestion that we should include Jane Austen on ten-pound banknotes, quickly received rape threats on Twitter. This lowered the bar for standards in public debate by a significant degree. Threatening a woman with rape is, in itself, vile. But as a reaction to this sort of campaign it also seems just a tiny bit unhinged.
The “dark net” has been well-chronicled, from sadism to suicide, from overt criminality to every kind of venality. For Lanier this entails the same reductive mindset. For anyone who has trained someone to use social media professionally, one of the first lessons to impart is that everything you say and do is public. And yet people go to it with the assumption that they are somehow less visible than they would be if they were speaking face-to-face. In a sense, both perspectives are right. Everything you say and do is public, but we only ever glimpse a filtered or shielded view of the real person. This distance between subjects easily creates the perception that you can speak much more freely, in a way that you might in an impersonal or private space. (It’s ironic that digital marketers often encourage organisations to embrace the “personal” nature of social media, when in fact this is “personal” in much the same way that “reality” TV is real.)
So, to say it again, the current culture of technology brings life within strict horizons. It gives one perspective the appearance of something universal, but violently reduces the mysterious world beyond the horizon to its own. It doesn’t cultivate the sort of space which grants natural differences the dignity proper to their nature. Interestingly, Lanier makes an obvious connection between the blunt-edged, impersonal character of technology and financial speculation:
“A hedge fund manager might make money by using the computational power of the cloud to calculate fantastical financial instruments that make bets on derivatives in such a way as to invent out of thin air the phony virtual collateral for stupendous risks. This is a subtle form of counterfeiting, and is precisely the same maneuver a socially competitive teenager makes in accumulating fantastical numbers of “friends” on a service like Facebook.”
This connection is right, but we can see a similar artificial mindset at work in the character of politics. Democratic accountability in most modern liberal democracies operates like the calculations of a computer. The act of calculating is called “voting”. Citizens do not exercise power directly. A small executive authority – Government – enacts laws under an electoral mandate. The theory assumes that if the electorate are properly informed, they will make a judgement in their own interest and choose a parliamentary representative who will do their bidding. The points of contention – the “politics” – arise over a question of knowledge: what is the right way to understand the interests of society? The system wants to think about knowledge, power and “truth” in the same way as a piece of technology; something with firm edges that can mandate clear and measurable actions. It cedes personal circumstances to an impersonal fixed object.
Does this theory serve the interests of society in practice? Or, just as the reductive tendency of technology creates extreme characteristics, does the reductive character of political truth not have the same effect on society at large? I can’t help but notice that the concentration of power in a small executive confers privilege, financial reward, a revolving door with executive positions in industry, and a significant political, economic, and cultural imbalance between one part of the country and the rest of the population. It also seems to create an enfranchised population who are more or less unfailingly cynical about the political class (whatever their political persuasion). We are now all familiar with the rank inequalities in many western societies, stagnant wage levels, the rise of populist anti-establishment movements, a disenfranchised underclass, and so on. Has the exercise of electoral choice simply not managed to identify the truth? Do we need more and better investigative journalism so that we can make the right choices? Or is truth not the fetishised object that this theory assumes?
Dovetailing social media with politics is troublesome for all the reasons that Viner explores. It is also revealing for what it says about our political culture. One way to think about the contribution of social media to public debate might be to say that it has pushed the current system to a kind of breaking point.
Digital technology has dramatically increased the number of people, organisations and vested interests jostling for a position in the small and imperfect central spotlight. As we have seen, the pressure this places on publishers undermines quality controls and produces a radically truncated account of the facts, skewed by the need to catch the consumer’s attention. Truth is lost in a zoetrope of fleeting images. The paradox is that, under the pressure of social media, the attempt to fix information makes it more transitory.
Just as the attempt to fix truth has the opposite effect, we have also seen that reducing it to a subjective viewpoint makes it more divisive. At the level of national political culture this means that social media flushes out the animosities and divisions that lurk behind the clique of power. In a representative democracy, the “subjective viewpoint” is the executive: the government, or a little more broadly, parliament. To exercise power takes skill, insight, education, a certain élan. And the discussions which hover around this exercise of power – traditionally fostered by mainstream publishers – require a similar level of professionalism (perhaps even the word “professional” took shape in this structure).
So is it not logical that seeing the truth through a narrow and insular prism marginalises it elsewhere? Or is it surprising that the quality, tone, and character of content on social media is so poor or outright shocking? If the conditions for truth, insight, sound judgement, as well as civil behaviour, stick to the locus of power, then why shouldn’t everyone else behave like the eighteenth century mob soaked in gin? The intense trolling visited on members of parliament is a logical expression of an imbalance in power. The privileges of office and the raging, maladjusted cynicism directed at it are two sides of the same coin.
Publications like The Guardian or The Economist are all rooted in a commitment to public accountability and the detailed investigative journalism which makes this possible. They also rightly spot the threat to their business posed by the cultural changes ushered in by the technology. In many of the recent articles about “post-factual” or “post-truth” politics, the response to this threat seems to have been to hunker down. Viner is quite clear that the sort of rigorous concern for the truth practised by her paper is worth fighting for and that public discussion may still require gatekeepers of some sort to check the excesses of populist manipulation. The truth, it is implied, is still out there, but “without some form of consensus, it is hard for any truth to take hold.” The Economist cautiously, and a little doubtfully, goes even further. It suggests that to seek out the truth old media needs “to take the power out of users’ hands and recreate the gatekeepers of old.” But it quickly acknowledges that it will be “hard put to get a new grip on the gates”.
This understates it. Short of a counter-revolution, it would be more or less impossible to roll back the tide of social media. And so the spate of articles about digital technologies and the decline of truth begin to look like a caged canary gasping in a sudden belch of carbon monoxide.
To risk a different line of thought, it’s tempting to suggest that social media is a kind of light shed on an unseemly side of modern culture that has hitherto remained in the dark. If that’s the case, then it is something we should confront and address. The objectification of truth could be seen as a rock, the dark, dank underside of which creates fertile conditions for creepy crawlies. Social media has simply lifted up the rock.
Instead we should ask a question: why has digital technology disrupted the truth? My suggestion is that it reflects back at us an unrealistic habit of mind that is entrenched within modern culture. The internet is telling us something about us: that we operate and think inside horizons. When we try to discover the truth beyond these horizons, we create all the problems with which we are now familiar: truth becomes self-centred, exclusive, politically unhinged, impersonal, and shallow. When the web fetishises the truth as a fixed object, its sheer power really does kill it.
Another thinker I haven’t yet managed to name-drop is the historian of science, Thomas Kuhn. Kuhn, in his 1962 book, The Structure of Scientific Revolutions, is famous for introducing the idea of a “paradigm shift”. In the ordinary run of things, scientific discovery takes place within an established paradigm, or an assumed framework of understanding, shared across the research community. Invention, innovation, empirical research, make incremental contributions to the body of knowledge circumscribed by this context. But some discoveries are so radical that they call into question the paradigm. Copernicus overturned the Aristotelian view of the cosmos. Einstein challenged the model of Newtonian mechanics. Darwin introduced an entirely new way of thinking about biology.
It’s not unreasonable to suggest that the printing revolution in the fifteenth century created a cultural paradigm shift in western society, which stood in a direct relationship with the Reformation, the Renaissance and the Enlightenment. It’s also not unreasonable to speculate that digital technology has pushed us to the edge of a similar paradigm shift. In fact the ailing health of public truth could be an early signal of change on this scale. It may be forcing us to re-think the way we think.
My guess is that digital technology is goading us towards a more differentiated and particular view of the truth. Or, at least, one that is less univocal. I am a woolly-minded arts graduate, who discovered a love of reading and culture at university. I have also spent the last fifteen years trying frantically to keep pace with the character of digital publishing. Even now the difference between a good novel, a work of art, a study of history, philosophy or religion, and the character of content on most websites or digital platforms is striking. Human culture is, to say the very least, rich, unpredictable and diverse. The more you probe it, the more it yields, and the more odd nooks and crannies you discover. This, it seems to me, is the real character of truth. But most digital content, in its current guise, is basically the same thing repeated endlessly. It is a straitjacket for culture.
It follows that to resuscitate the truth, digital technology needs to adopt a mindset which is more open and responsive to the manifold forms truth can take. It needs more culture. Travelling might provide a metaphor. If, for example, you take a trip through different European cities, you might take the time to explore their unique characteristics and attractions: Dublin’s Kilmainham Gaol and its haunting history of British colonialism, Cologne’s masterly gothic Cathedral, Baratti & Milano in Turin where Mark Twain is supposed to have dined. Rome is, in itself, such a dense palimpsest of cultural history that you could spend years there without exhausting everything it has to offer. Digital technology is self-evidently a powerful way of joining up different human experiences. If it were properly curated and developed, it could provide a similar experience, or routes through the myriad maze of human nature and all its contorted byways. Instead, the mindset that underpins it too often wants to see the whole world all at once.
It also follows that, in the same way, an accurate view of public life can only take shape in the frame of these differences. Where the Economist calls for a return to gatekeepers of the truth, a fair question might be: where, and whose, are the gates? Truth is always mediated through a particular circumstance. The worries that mainstream publishers like the Guardian and the Economist express are real. The United States faces the prospect that it will soon elect a cartoon character as president. (Donald Trump obviates the need for political satire because he is a caricature.) To put it mildly: if you think people are capable of discerning the truth, and that the truth is the paterfamilias for a whole family of values like justice, integrity, beauty and love, then this sort of development can’t be a good thing. The commitment to the truth which Viner articulates is elemental, and its erosion threatens democratic culture. But I am not convinced that committed mainstream publishers will stave off this threat by simply maintaining a traditional publishing model.
Publishers need to use digital technology to discover the truth inside the limitations and horizons which make it meaningful. They need to discover new perspectives. And a “perspective” should mean more than a two-and-a-half-minute film or five hundred words on a web page reworked to suit the reading age of a twelve-year-old. Such a perspective ought to be, by its nature, a much richer experience.
The niche policy site wonkhe began life as a blog run by and for higher education policy wonks. It has since grown to become one of the most authoritative and respected sources of information about the English higher education sector, rivalling more mainstream publications like THE or the Guardian Higher Education. The site offers a sustained, forensic and often very well-informed perspective on the latest developments, attuned to the interests and concerns of the Academy. And here “participation” across the community does not mean a free-for-all of bilious comment, but a carefully curated discussion among those actively involved in different areas of the sector. It gives a voice to those involved in the issues at stake.
The “particular” in this example is actually a particular take on something quite broad: national higher education policy. But clearly particularities are legion. As far as the public good and public truth are concerned, are there not opportunities to let the truth unfold more locally, in a space that is more immediately visible to the public it concerns? In the UK, the last fifteen years have seen successive moves towards political devolution and regional empowerment. Bristol, the city in which I live, has an elected Mayor. But, beyond that, it has a rich cultural inheritance: a lively music and theatre scene, and festivals of ideas, music, comedy, economics, food, beer and digital culture. This is fertile territory in which digital publishing could go to work. It has a local elected representative to hold to account, and a thriving social and cultural community to celebrate. National publications sport the appearance of a broad view of the truth, one which masks the metropolitan clique in which it is forged. Would not a local digital publication of this sort create a more intimate, and so more meaningful, account of public truth?
It’s even possible to turn the logic of this argument on its head. If the clamour of voices digital communication has empowered is breaking the systems of public accountability in representative democracy, does this not support the case for greater devolution? Power needs to accommodate itself to a more manageable community of interest. It needs to make the public feel confident that they have an active stake in it. With a more direct hold on power comes a better understanding of practical politics and public administration. This should defuse public cynicism and make people less susceptible to the deranged excesses of demagogic populism.
None of this would undermine journalism. Rather, it should push it in new – maybe even more interesting – directions. Viner connects the impact of digital technology on news reporting with the decline of journalism as a profession. She points out that the number of journalists in the UK fell by a third between 2001 and 2010. This might be a symptom of the death of impersonal and corporate truth, but it is ironic, because the skills journalists practise are needed more than ever. The need for human stories, the need to create, write, film, photograph, edit, design and curate a meaningful account of the richness and variety in the world around us, has never been quite so important or quite so possible.
It is a cliché that all good writing is re-writing. The same could be said of anything vaguely “creative”. Good works of culture have been polished like pebbles on a beach, smoothed by tidal patterns. They require a careful, respectful and sensitive set of skills and passions. So if there is a place for mediated truth, it hinges on creative, editorial judgement. Digital communication has in the last ten years begun to evolve, in a chaotic and haphazard fashion, a kind of grammar which governs its different forms: how to visualise data, how to juxtapose images and video with text, how interactive experiences or games enrich content. But mostly digital content is not carefully polished. It is knocked out with mechanical fury, only to pirouette over the surface of the mind. Really, these skills should be applied with love and discipline to create experiences with much deeper roots. We need digital cathedrals, not makeshift tents.
The concern among the current gatekeepers of public truth is real enough. It will affect the health of our democracy, and with it, our political economy. The danger is that if the kind of journalism that sorts through shredded paper to find the facts is under threat, we fall into the chaos of political populism: a Roman Republic that loses out to the hubris and intrigue of a Caesar.
The writer James Joyce claimed that by always writing about Dublin he could get to the heart of all the cities of the world. The “digital revolution” is urging us to think about truth in the same way. As Joyce put it, “In the particular is contained the universal.”