Monday, 30 November 2009

Feeling listless? If only...

As the year draws to an end, so polls of the "best" albums, films, books etc. of the last twelve months appear in newspapers and magazines; as the decade draws to an end (allegedly), so polls of the "best" albums, films, books etc. of the last ten years appear in newspapers and magazines. These are disparagingly but inaccurately called "lists", presumably by those who think that election results are just meaningless itemizations of random people's names. Such critics' polls are especially common in those cultural forms where academia holds little sway, like rock and pop, or where consumer opinion is noisily heard, like film: you don't get many polls about the "best" Greek tragedies or Victorian poems. Consequently, the results are to a certain extent canon-making, in the absence of professionals thought to possess the socio-cultural capital (the authority) to create them. The recurring "best albums of all time" polls found in the NME, Rolling Stone, Q etc. have wound up establishing a sense that certain texts - Revolver, Astral Weeks, etc. - are of an exceptional quality regardless of how many copies they sell or what the punters may think.

"Texts of the year" polls, on the other hand, come too early to be worthwhile. The NME chose "Heroes" as its album of the year in 1977, a verdict surely nobody would today wish to endorse (not even Bowie). Yet they sum up a mood among critics, which has a certain interest in itself. The fact that in 1989 the NME plumped for De La Soul's 3 Feet High and Rising, although four years later they chose the first Stone Roses album as the greatest of all time, let alone of the year of its release, reminds me of the critical popularity at the end of the 80s of postmodernist theories of pastiche and intertextuality, and hence of the modishness of sampling. It brings back an era, regardless of whether it's "true" or not.

So what have I learned from the current glut of lists, sorry, polls?

(1) That the transformation of American and consequently world popular cinema into a form of children's entertainment continues apace, and that although some of its manifestations are cynically sterile (Transformers 2, etc.), many of them are brilliantly original (Up, Fantastic Mr Fox). I expect that the Christmas Radio Times, which will be out in a fortnight, will again be packed to the rafters with "children's films". What these have replaced is not, as some drearily and disingenuously moan, towering works of film art like 2001 and Taxi Driver; instead, they have superseded earlier forms of popular entertainment. Thirty years ago the Christmas Radio Times was a litany of MGM musicals, 1930s slapstick comedies, and war movies. Some of these were brilliantly original too, but tastes change.

(2) That the rock/pop album is dead. The results of the various albums-of-the-decade polls that I've seen have been truly pitiful. Kid A, a very good record that pales in comparison with its predecessor in Radiohead's oeuvre, as the best album of the past 3,600 days? Or The Streets, the only artist in the history of rock to provide its own devastating parody at the same time and in the same place as its music? Some ancient rocker, I forget who, once complained that critics were so lazy that, were they to review a record called "I'm Completely Crap", they would just write "You said it, mate" even if it sounded like "Good Vibrations". Bearing this in mind I don't quite know how to respond to the fact that the NME chose Is This It? as the best long-player of the noughties. Is that it? (Sorry.) That slab of Diet Coke Marquee Mooning? The best we can come up with? You might as well have listed the "best" music hall artists of the decade. The point is, of course, that a poll of the best downloads would yield far more impressive results, since that is now the format in which rock and pop achievement comes.

(3) That TV drama and comedy have been, across the decade, of an extraordinarily high standard. They have, I think, culturally outpaced their film equivalents. The advent of "box-setting", whereby TV shows are written and structured largely with a view to being watched at home in a self-scheduled and intensive run analogous to the reading of a novel, has vastly improved these genres, especially American drama. This is important to note, because just switching on your set every night and flicking through the channels you can get the impression (a) that TV is crap, and (b) that TV is nothing but wall-to-wall docusoaps, reality TV, and futile "challenges". The point is, of course (to repeat myself), that box-setting takes shows outside the economy of television: they're deprived, or freed, of the advertising that bankrolled them. Fantastic for the viewer, tough on the networks.

(4) Somebody writing on Wikipedia called my "The Death of Postmodernism and Beyond" an "entirely pessimistic" take on 2000s culture. Since writing it in 2006, I've come to the conclusion that my negativity about the quality of contemporary culture owed too much to a backward-looking conventionality (the same that causes critics to cling to notions of "best albums"). The real achievement of the decade hasn't been the appearance of great works in established formats so much as the emergence of new formats of expression. I shouldn't therefore have been looking so much for terrific new novels, plays, films etc., as exploring the new kind of textuality that has emerged. If I were invited on to a show discussing the drift of culture in the 2000s that's what I would emphasize: the extraordinary flowering of new avenues and economies of expression, rather than sensational new instances within long-established forms.

Mea culpa? Yes, all right; but 2006 was, in digimodernist terms, a very very long time ago.

Monday, 23 November 2009

A Mysterious Affair of Style (with apologies to Gilbert Adair)

I decided to write Digimodernism in a very particular style, and alluded to this in the Introduction, where I wrote:

I’ve tried to address here a hybrid audience, and for an important reason: on one side, it seemed hardly worth discussing such a near-universal issue without trying to reach out to the general reader; on the other, it seemed equally pointless to analyze such a complex, multifaceted and shifting phenomenon without a level of scholarly precision. Whatever the result may be, this approach is justified, even necessitated, by the status and nature of the theme.

With this in mind, I tried to phrase my points as if writing for two kinds of people: academic specialists on the one hand and the intelligent general reader on the other. So I looked for interesting ways into subjects, and tried to maximize references to actual cultural examples and to minimize allusions to theorists. At the same time, I also aimed to be as judicious, exact and analytical as possible. There's an obvious danger in such a double-barrelled approach, which is that you wind up pleasing no one: the general reader finds it too off-puttingly abstract and rarefied, and the specialist finds it shallow and argumentative.

There was, however, an ulterior motive at work behind my choice of style. One of the few books I talked about at length was Terry Eagleton's After Theory, and while I was fairly critical of its intellectual content, I felt that its style suggested the possible emergence of a new approach to writing about cultural issues. If it was true that postmodernism was over, what would become of the jargon-clogged prose most wonderfully (ahem) exemplified by Homi Bhabha? It was already no longer possible to believe, as the likes of Gayatri Spivak seem once to have imagined, that world justice and liberty could be measurably advanced by a better critique of Heart of Darkness. If cultural criticism's day of messianic delusion was over, so surely was its use of jargon for its own sake.

This is not to be confused with populism. Philosophy and cultural analysis are bound to be hard to understand if they are to have any sophistication to them. People who jeer at the prose of theorists have usually never tried to read any other philosophy; you can in truth only mock Derrida's style if you can honestly say that you have worked through the Critique of Pure Reason and you find it more lucid. A friend of mine who works as a postdoctoral researcher in genetics recently showed me a paper he had contributed to; its title comprised 7 or 8 words, of which I knew only "the" and "of". Why then should abstract reasoning, thought about thought, be simple to understand?

[As postmodernism has been the dominant theoretical trend for several decades now, and its influence has crossed many disciplines, it tends frequently to be the only form of abstract thought people are familiar with. Consequently, it is often confused with philosophy itself, as witness complaints about how tough it is to understand. Another instance of this is when people describe postmodernism as casting doubt on all our systems of thought and values. No; it's philosophy that does that; and postmodernism is a form of philosophy (among much else).]

What I mean is slightly different: it is an abandonment of deliberate obfuscation. My objective was to write prose that was precisely as difficult as its subject matter. It is to my mind shameful that there are today academics writing no more than criticism of novels in prose that is harder to grasp than the philosophical writings of Wittgenstein; the chances that the former are saying something more profound or original than the latter are zero. This may be partially attributable to the popularity on campus of French-language theorists, whose writings translate uneasily into English: Freud and Nietzsche are far easier to read (though not necessarily to understand) than Lacan and Derrida. But this should not be overstated: some French theorists, like Foucault and Barthes, are very lucid indeed; The Postmodern Condition is a very accessible text; and most of Baudrillard is, though peculiarly expressed, not actually that difficult to figure out. The real demon is the Anglo-American academic who wants to work with this stuff but lacks the intellectual equipment or training to get inside it. S/he therefore dons its outward form, its linguistic "noise", which shrouds an absence of any real philosophical engagement.

There are issues too regarding deliberate obfuscation, which I don't want to talk about for now, that relate to the role and place of the university in the digimodernist society. These are socio-economic and political questions, too.

Wednesday, 4 November 2009

Electric Literature

Yet more news of digimodernism in literature. Why so much recently? Maybe because literature has hitherto been so far behind the digimodernist game: there were digimodernist films (well, one anyway) as early as 1993. Two New Yorkers have created Electric Literature, a fascinating and entirely positive venture. This comes from their website:

Electric Literature’s mission is to use new media and innovative distribution to return the short story to a place of prominence in popular culture.

As A.O. Scott wrote recently in the New York Times:
“The blog post and the tweet may be ephemeral... but the culture in which they thrive is fed by a craving for more narrative.”
Fiction transports us. It uniquely captures the experience of human consciousness like no other art form, revealing underlying truth and opening us to life’s possibilities. Like any creative act, writing fiction carries within it an implicit belief in the future. Electric Literature was created by people who believe in the future of writing.

We're tired of hearing that literary fiction is doomed. Everywhere we look, people are reading—whether it be paperbooks, eBooks, blogs, tweets, or text messages. So, before we write the epitaph for the literary age, we thought, let’s try it this way first: select stories with a strong voice that capture our readers and lead them somewhere exciting, unexpected, and meaningful. Publish everywhere, every way: paperbacks, Kindles, iPhones, eBooks, and audiobooks. Make it inexpensive and accessible. Streamline it: just five great stories in each issue. Be entertaining without sacrificing depth. In short, create the thing we wish existed.

Here's how our model works: To publish the paperback version of Electric Literature, we use print-on-demand; the eBook, Kindle, iPhone, and audio versions are digital. This eliminates our up-front printing bill. Rather than paying $5,000 to one printer, we pay $1,000 to five writers, ensuring that our writers are paid fairly. Our anthology is available anywhere in the world, overruns aren’t pulped, and our back issues are perpetually in print. We hope that this model can set a precedent: more access for readers, and fairness for writers.

Publishing is going through a revolution. There's opportunity and danger. The danger lies in ignoring or resisting the transformation in media. New platforms present an opportunity to adapt. We believe the short story is particularly well-suited to our hectic age, and certainly for digital devices. A quick, satisfying read can be welcome anywhere, and while you might forget a book, you’ll always have your phone.

To us, literature is what is important, not the medium. If eBooks, Kindles, or iPhone apps help literature survive, then we’re all for them.

People of our generation—with one foot in the past and one in the future—must make sure that the media gap is bridged in a way that preserves and honors literature. We don’t want to be sentimental old folks in a world where literary fiction is only read by an esoteric few.
Andy Hunter & Scott Lindenbaum, Editors

Monday, 26 October 2009

Well, what do you know

Two new-ish things:

An article in the Guardian on "twitterfiction", where authors experiment with narratives in 140 characters. I fear they haven't thought this through: Twitter isn't some OULIPO-esque kind of arbitrarily constrained writing (i.e. limited in scope) but another way of defining authorship itself, among much else. Still, it's interestingly digimodernist.

And Edexcel, the exam board, have sent out their new 2010 GCSE specifications, which include - in the field of "English" - three options: English; English Language and Literature; and English Studies: Digital Literacy. Tellingly, the digital now ranks twice as heavily as the literary. It is also, perplexingly, defined against the literary. Hm.

Monday, 19 October 2009

Write your own review!

Somewhere among my collection of books is a copy of Alasdair MacIntyre's little study for the Fontana Modern Masters series of the life and thought of Marcuse (bear with me here). I don't remember anything about reading it beyond the fact that it actually belongs to a former college friend who I'm no longer - for uninteresting reasons - in contact with. And I remember that because of the comments he wrote all over it.
MacIntyre is hostile to Marcuse, declaring at the outset that basically all his ideas are "false", which probably isn't a helpful stance to take toward a thinker you're summarising and simplifying for an implied readership of amateurs and students. An introductory book like this doubtless requires a position of neutrality, real or assumed, in the specialist doing the presenting, since if Marcuse or another really is wrong it's going to be for complex and recondite reasons, and this isn't the forum to air those. But MacIntyre was writing in 1970, after several years of campus unrest enacted to a degree in the name of the theories of One Dimensional Man, and Marcuse's influence may have seemed so great and pernicious to him that he was unable to cite his theories without trying to demolish them.
My college friend, reading the book in the late 1980s, was almost inevitably more sympathetic to Marcuse than to the man who had come to bury him. Consequently a chapter of MacIntyre's exegesis concludes with the scribbled comment, in pencil: "superficial rubbish". When MacIntyre describes Marcuse's style as incantatory and devoid of reasoning, my friend scrawled in the margin: "if you are spiritually dead". And so on - you get the idea. The digimodernist point, of course, is that in the late 2000s there is no motivation or need to hunt for one's pencil: you go online and vent your spleen there. And it cuts both ways: if your mother or father once defaced their library books with the words "Great!" or "Love it!", these days you go on to Amazon and write a "customer review", describing the author's style and message, his/her background and qualifications for writing, and recommending that a certain kind of reader purchase it.
In the pre-digital age there was a clear distinction between the reader and the critic. The reader would read for pleasure and give up on a book that was proving a pain, and would keep their reactions to themselves (pretty much). The critic would read from professional duty, often trudging through tedious material, and would read in order to speak or write; that is, their reading would imply a spoken or written reiteration of the book's meaning or importance or qualities, to be made at a later date. The digital age conflates the two roles: by commenting on blogs or reviewing for Amazon or the IMDb or Metacritic or by going on message boards to respond to newspaper articles, you are making yourself a critic in the latter sense, that is, you are investing your reading with the future option of critiquing or evaluating the text in question. However, the people who publish their textual responses in this professionalized manner tend to do so with the mind of readers: as a result, for instance, Amazon customer reviews mostly give the minimum or the maximum number of stars to their texts - in other words, they are ways of shouting either "superficial rubbish" or "love it!", which are readers' reactions. In short, digimodernism creates the individual who reads like a reader and publishes their responses for the whole world to peruse like a critic. What do we call this new synthesis - a readic? a criter? Neither sounds very positive.
This development is different from the emergence of online reviewers like Bookslut who are allegedly driving print journalists out of a job. There is a fairly small number of such people, who turn their voracious reading habit into a self-publishing venture. What I am referring to is far more widespread: the sort of spontaneous and emotional declarations that my friend was inspired to by reading Marcuse vs MacIntyre, but unable to share with the planet.
This post then is self-referential, or textually narcissistic, since it relates in part to the comments that some people or maybe you (dear Reader!) can append or have appended to this blog. Every writer wants only positive feedback, or, in Martin Amis's wonderfully sarcastic phrase, wants "to be received like the Warrior Christ half an hour before Armageddon". But it is in the nature of a publication to receive criticism, as it is in the nature of a child to make its own enemies. As the author of Digimodernism, however, I feel I have the right to expect a certain dimension from online criticism, whether it be worshipful or contemptuous or anywhere in between. I can expect that it show a critical awareness of its own special textual conditions. You have the right to think that my book is rubbish, but by going online to say so you must realize that you are confirming or instantiating one of its arguments, which is not something that can be said of any of Amazon's 847 customer reviews of The God Delusion. Simply by placing your fingers on your keyboard you are engaging with what I call digimodernism, even if you despise Digimodernism. Your review, then, is foreseen by the text you are critiquing: your critique of it is, to a certain extent, what the book is actually about. For instance, in chapter 2 I describe the digimodernist text as typically characterized by anonymous or pseudonymous authorship, and as tending to invective or abuse as a result (try, if you feel up to it, YouTube's comments pages, unspeakable and seething cauldrons of bile and hate). You may think that I'm talking superficial rubbish, but taking a pseudonymous name and pouring out a lot of invective or abuse just confirms what I am saying, though in a rather worthless fashion. As a response it's structurally self-contradictory. If I'm wrong, you're going to have to try harder than that to show it.
What I'd like to see happen is this. Web 2.0 enables, as is well known, everyone to become a published writer, and the 21st century is the golden age of the self-appointed critic who is busy sending out his or her critiques and evaluations for the world to see. I would like the ideal spirit of the professional critic to come to inflect the writings of these readers-turned-critics (and I say that knowing that much, perhaps most, actual professional criticism is scarcely worth using to wrap chips up in): knowledgeable, proportionate, judicious, contextualizing, fair, sympathetic (as much as possible), and so on.
I would also like to see a code by which all anonymous or pseudonymous criticism is treated as the digital equivalent of pressing someone's doorbell and running away: antisocial and immature. If you're not prepared to show your face, to put yourself on the line, then what you have to say is of no value. In the Milgram experiment carried out in 1961, volunteers who were ordered to by an authority figure (a scientist) willingly administered what they believed to be dangerous, possibly fatal, electric shocks to unseen victims once assured that they would not be held responsible for the effects. Milgram saw this, understandably in the aftermath of the death camps and at the start of the 1960s, as shedding light on the human tendency to mindless obedience to authority figures. However, it also suggests that a fairly large proportion of the population will happily carry out sadistic acts if assured that they will never be held responsible for them. The viciousness of much pseudonymous copy on Web 2.0 would seem further confirmation of this.
So write your own review! Every (wo)man a critic! But remember that those who judge are subject to judgment too.
I also fondly remember the college friend who confused Marcuse with Mark Hughes...

Friday, 9 October 2009

Roman Polanski and Brooke Shields

Two recent incidents, both involving children and their putative sexual exploitation, highlight changes in the prevailing conception of the “artist” and his/her sensibility.

The first, and more internationally notorious, was the arrest of Roman Polanski in Switzerland on a charge of drugging and raping a thirteen-year-old girl in California in 1977. The judicial move, which occurred when Polanski had travelled to a film festival to pick up a lifetime achievement award, was instantly and roundly condemned by the French government: Frédéric Mitterrand, the Minister for Culture, described the arrest as “absolutely appalling”; Polanski had for thirty years been protected by the French state, and had been granted French citizenship. It was tempting at first to interpret this indignation as an expression of the fondly and widely held belief by which France, the “beacon of civilization and art” resists America, the “philistine and puritanical bully”; Polanski, then, would supposedly become the cultured and Gallicized martyr of the brutishly Yankee Satan. However, the French response was quickly echoed by an international battalion of filmmakers, many of them American, who signed petitions of protest calling for Polanski’s release. Polanski had, it is worth noting, already pleaded guilty to the crime, and had fled America before he could be sentenced and punished. Juridically, the nature of the offence and the extent of his guilt have never been disputed, least of all by the director himself.

It seems likely that this defence of Polanski – and indeed his protection since 1977 – is generated by the vestiges of a Romantic conception of the author or artist. The expressions of outrage repeatedly referred, for instance, to Polanski being a “great director”, even a “genius”; his “originality” and “daring” were evoked (Agnès Poirier even accused the US of never forgiving Polanski for his maverick tendencies when in Hollywood, as though the arrest were some bizarre form of long delayed film criticism). And yet these epithets do not stack up. The longevity of Polanski’s career is indeed remarkable: this is a man who made exceptional films both in the early 1960s and in the early 2000s; and so is its geographical scope, since he made enduring films in Poland, Britain, America and France. However, his forty-odd-year career does include about a quarter of a century during which he made nothing of artistic value and his continuing fame depended on his newsworthiness as a fugitive; and thematically his work, which returns endlessly to sexual torture and rape, is hardly separable from his queasy private life. And even his best films pale by comparison with those of his contemporaries and peers: Repulsion or Rosemary’s Baby or Chinatown are both conventional and second-rate when placed alongside the work of Losey, Coppola or Altman. In short, Polanski’s “greatness” appears to have been invented as a necessary element of the martyr narrative into which, under the aegis of a Romantic ideology, Polanski was plunged by his defenders. By the terms of this ideology – with Byron as an early example – the Artist is troubling, disturbing, unconventional, bohemian, he (probably he) breaks the rules, shocks the bourgeoisie, outrages the puritans, and produces dazzling works of breathtaking originality and greatness. His alcohol and drug-taking and illicit sex and weird dress are part of this story, as is his persecution by a hypocritical and brutish society. 
It seems evident that this prefabricated identity has been transferred on to Polanski: not only, then, is it no big deal that he raped a child (though it would be, were he not an Artist), but it guarantees the greatness of his Works (which cannot be located in his actual works) and the injustice of his prosecutor (though this, save for procedural issues, has not been demonstrated).

Interestingly, the response in cyberspace was very different. Online polls and message boards in France and indeed worldwide rang with fury against the defenders of Polanski, and with calls for equality before the law. The Mitterrand/Poirier/Woody Allen position was revealed as narrowly based. It is clear that digimodernist authorship, which is multiple and anonymous, does not square at all with the Romantic image of the exceptional, suffering Genius. The French government soon retreated from its anger, while the Swiss tellingly refused Polanski bail. What the fall-out from this episode suggests is the obsolescence, beyond an institutionalized and self-interested elite, of a certain conception or ideology of the artist. Ministers and other creators may still afford it some credence, but in cyberspace the screams of the victim take precedence.

The second incident involved the removal by the British police, before the exhibition it was due to feature in had even opened, of Richard Prince’s Spiritual America from the walls of Tate Modern. Prince’s piece, which dates from the early 1980s (the heyday of formulations of postmodernism), reproduces and refracts a photograph taken of Brooke Shields for Playboy when she was ten years old: she is naked and wearing lipstick and turning a “sensual” shoulder to the camera. In short, this is a work of art distancing itself from and commenting on but nonetheless reproducing a paedophilic photograph. The police seem to have found the element of the work contained in the last four words of my previous sentence decisive: their action was, in a sense, a work of art criticism. In defence of Prince’s work, one might argue politically, in libertarian or liberal manner, that the police have no right in a free society to decide what galleries may display. The legal retort to this is that the public display of an indecent (i.e. both nude and sexualized) image of an actual child appears to be a criminal act; morally, and in support of this, it must be noted that Shields had unsuccessfully fought as an adult to have the picture suppressed. More specifically, and in defence of Prince, a surprising number of commentators retreated to a decrepit model of authorial intent demolished (at the latest) by Roland Barthes in the late 1960s: that Prince meant the work as a socio-cultural comment, not as paedophilic titillation, and so that must be what it really is. The notion that the meaning of a text is not contained in its author’s stated or imagined “intention” seemed to have passed such commentators by.

Nonetheless, the removal of the piece caused relatively little fuss. This stands in need of some explanation. My sense is that the art-critical scaffolding erected around the paedophilic photo in order to transform it into Prince’s comment on our sexualized culture no longer stands up. For, to justify or validate or explain Spiritual America it is to the discourse of postmodernism that we must turn: the piece is a cultural détournement or recuperation, it is meta-representation, an image of an image, an image about the making of images, it is depthless, affectless, a reflection on a media-saturated hyperreality where images refer only to other images and the “real” is dead (or her suit is dismissed), it is an ambivalent response to a culture of desire and representation and exploitation; it’s a simulacrum, an art of the exhaustion of art, a commodified artwork refracting a commodified photo, it’s the logic of Warhol’s Marilyn at its most extreme. One could go on and on. Defenders of Prince accused the police of philistinism: hadn’t they read Jameson or Baudrillard? Certainly they hadn’t, but the general sense seems to have been that all that theoretical apparatus, that barrage of abstract discourse which Prince relies on and adds to, is no longer interesting enough to redeem the public display of an undoubtedly exploitative and paedophilic photograph. In 2009, all one feels is that here is a vile image passed through and subjected to a certain art-critical discourse. But if the last ten words of my previous sentence no longer refer to something people care about, they fall away and leave only the nastiness of the image. Prince is not (one assumes) a paedophile and nor are (most of) the spectators of his work, but he is the postmodernist redeployer of paedophilia, and when “postmodernism” loses its currency, its potency and heft – as I suggest this episode shows it has – all that is left to the viewer is the paedophilia itself. 
For me this betrays the weakness of the piece: in contrast to Cindy Sherman’s Untitled Film Stills, which also invites, depends on and enriches a postmodernist discourse, Spiritual America does not walk artistically by itself.

So if the Romantic notion of the artist as shocking but all-justified genius no longer has general currency, neither does the postmodernist conception of the artist as the recycler of images from our commodified hyperreality. In each case the sexually assaulted child prevails. What, then, of the sensibility of the artist in the digimodernist age? It is socialized, not asocial; but nor is it simply the creature of our continuing media excess. It moves between these two poles.

Thursday, 1 October 2009

News flash

Two bits of news: Digimodernism is out in the UK (I believe - I haven't actually seen it in the shops, but that's what I've been told). And a paper proposal of mine has been accepted for an international conference on 21st century European literature to be held in September 2010 at St Andrews University. I'll be speaking about the various theories of culture after postmodernism, and while my own will obviously figure prominently my paper won't especially plead the case for digimodernism. It's called "The Inheritors", an apt enough title I think.

Wednesday, 30 September 2009

Shifting down to the kids

Two current TV series, one back for its second series, the other probably embarking on its only series, illustrate one of the most controversial arguments in Digimodernism. In chapter five I assert - at nothing like the length and in nothing like the detail that the theme requires - that "popular culture", such a favourite of postmodernism and of postmodern criticism, has been supplanted by a dominant children's entertainment which does not correlate to any extant sense of "popular". This shift has very different meanings in the various contexts where it is found, and it is not an entirely negative development.
Merlin (BBC) is an enjoyably mystical/nonsensical/Potterish re-telling of the Arthurian sagas for the digimodernist age. It's packed with CGI and earnestness (in place of irony), and it reshapes the inherited narrative structures in one very striking way. The sorcerer Merlin himself is no longer the bearded, wrinkly, ancient oldster whose literary descendants include Gandalf and Dumbledore. Instead, he's about 19 years old. Consequently, he doesn't transfer the baby Arthur from his threatened father King Uther to a foster home, but is Arthur's exact contemporary, and is obliged to protect the once and future monarch from the position of uneasy peer/rival. Reducing Merlin to one third of his usual age, the show keeps Uther alive and difficult, while Merlin, far from being the finished magical article, is still at conjuring school under the tutelage of a tetchy oldster. Guinevere is also Merlin's age, as are most of the peripheral characters, so that, as an ensemble, the cast resembles a pop group like Hear'Say, complete with token nods to racial inclusiveness. The assumption here is, I think - and this is not a thought that would have occurred to the makers of children's entertainment forty years ago - that child-friendly entertainment must be focused on semi-children and on juvenile issues (getting on with teachers and Dad etc.). It's supposedly for families, but Mary Poppins it ain't.
The execrable Trinity (ITV2), on the other hand, is essentially what you would get if you made a class of comprehensive-school fifteen-year-olds watch Brideshead Revisited and then had them write a TV show about a black kid starting at a posh college. He is relentlessly patronized and bullied and humiliated by a group called the Dandelion Club, who are vaguely based on David Cameron's Bullingdon Club, and who are equipped with ludicrous dialogue endlessly adumbrating their class superiority. All they appear to do all day is saturate the world around them with their snobbery. The entire college is basically run by this clique of arrogant beaux, who effortlessly bend the Dean and the Warden and the Council to their will (they are guaranteed their degrees and have no need to work [!]). Filmed at a school, Dulwich College, the show presents a locale for this tomfoolery that looks nothing like a university, and nobody behaves remotely like people who have been through higher education. Super-privileged students at Oxbridge, I know for a fact, just ignore those they socially despise; they don't seek them out and strut ostentatiously around them. Equally, the only colleges controlled by their students figure in the narcissistic dreams of particularly immature teenagers. And yet these days almost 50% of the English go to university; half the population will therefore be immediately aware that Trinity offers a laughably unreal image of higher education. The show's makers, tellingly, don't seem to care about this; instead, they have chosen to construct HE as what you might imagine it is before you go there, because they assume that their audience must be under 18.
The primary goal of both shows seems to be to gain the attention of children, either by throwing them and their concerns up on to the screen regardless of the source material, or by consciously inventing worlds which correspond, no matter how risibly, to their limited prejudices. Actors over 30 appear as parents or teachers whose only wish is to impose boring and petty restrictions on the glamorous stars, which the latter nevertheless easily get round. Of the two, it is Trinity that is more radical, because - scheduled at 10 pm and filled with sex and violent death - it is not in fact sold as "children's entertainment". Most people would probably file it under "popular culture". Narratologically, though, it is primarily children's entertainment, because popular culture has largely become children's entertainment.

Tuesday, 29 September 2009

Tings Things

Last night I listened to the Ting Tings' album We Started Nothing. Released just over a year ago, it's a bit of enjoyable pop fluff most notable for the remarkable "That's Not My Name", which reached number one in Britain. To my ears it sounds like a retread of angular 80s pop, with New Order among its creditors; the very title of the album seems to admit the belatedness of this form of music, its lateness in the day of its genre, while also perhaps acknowledging the lack of socio-cultural importance that rock and pop now has to contend with. Despite this, the structure of "That's Not My Name" strikes me as interesting. Five minutes long, the song very gradually constructs itself before the listener: the first couple of minutes are unexceptional, spiky pop fizz, then element after element is added to the mix, and then elements are taken away from it, so that listening to it is like watching something be assembled and then dismantled before your eyes. It would be a stretch to call this an example of onwardness in music. But there is a sense of listening not just to people playing music, but to them actually inventing it; and as a result the overall musical identity of the piece, just like its lyric which obsesses negatively about personal identity, is muddied.

Monday, 28 September 2009

No, not "Postpostmodernism"

A review has appeared in the latest issue of New Left Review of Nicolas Bourriaud's book The Radicant, where the Frenchman argues that art has shifted towards the new paradigm of the "altermodern". The review's not freely available, but here it is in brief:

I so dislike the word (widespread on the Internet and even consecrated by its own Wikipedia page) which the reviewer uses, with a question mark, as his title: "postpostmodernism". Very obviously, it's ugly as sin. Worse, it's highly misleading, since it implicitly defines what comes after postmodernism in terms determined by postmodernism, i.e. it reinforces the authority of the very thing whose overthrow it is supposedly tracing. "Postmodernism" sees itself in linear terms as that which comes after modernism, a contention which it assumes and never demonstrates, although various objections could be levelled at this piece of intellectual and cultural historiography, e.g. that something distinctive and important happened between high modernism (c. 1920) and high postmodernism (early 1970s on), or that philosophically postmodernism positions itself more as a form of counter-modernism, as a naysayer, than as its successor.

"Postpostmodernism" perpetuates this error by implying that we are due to have more of pretty much the same thing. As coincidence would have it, this unimaginativeness reflects the shortcomings of Bourriaud's theory, which simply prunes back and reissues postmodernism for the 21st century (though admittedly he is using some rather uninteresting work as the jumping-off point for his thoughts about contemporary art). "PoMo" insists by definition on coming-after, on its posteriority, its successor state; "PoPoMo" supposes that something will come after "PoMo" which insists on its doubly coming-after, its reiterated posteriority, its successor state to a successor state.

This is neither true nor plausible. Whatever the merits of the theory of digimodernism, the cultural dominant which succeeds postmodernism will stand by itself; it will be marked by a level of conceptual autonomy. Its definition will not be created, either directly or indirectly, under the aegis of the definition of postmodernism. Postmodernism, then, will really be over; it will be over when we no longer need its limits and tendencies to define what comes after it.

Friday, 11 September 2009


Sally Potter's new film Rage looks absolutely fascinating. Here's a newspaper story about it:

Thursday, 10 September 2009

Digimodernism and the book (again)

From yesterday's Guardian:

If you disagree with a point in Po Bronson's new book about parenting, NurtureShock, then don't bother returning it or giving it a one-star review on Amazon: you can tell Bronson directly, thanks to an online experiment that will allow readers to add their own footnotes to the pages of a digital version of the book.
As of next week, readers of Bronson and Ashley Merryman's NurtureShock: New Thinking About Children, will be able to go online and make notes on three chapters of the book. Covering the topics of why 98% of children lie, why too much praise for children is a bad idea, and how important an extra hour of sleep is, the three chapters will be posted on, and, where readers will be able to highlight sections of the text, and add their own footnotes to their selections.
"I'm interested in building community around books, facilitating discussion. This is an experiment to see what happens," said novelist and journalist Bronson, whose book, NurtureShock, was published in the US last week by Twelve, an imprint of Hachette Book Group USA, with UK publication lined up for next year. "Our book already has 70 pages of sources, and 7,000 words of footnotes, that we've put in there."
Caroline Vanderlip, chief executive of SharedBook, the American company enabling the exercise, agreed with Bronson. "We believe that the community can enrich the original, similar to how footnotes or marginalia have enriched books for years," she said. "The difference here is that it's collaborative annotation, rather than from one source."
Interested collaborators will then be able to buy a PDF of the three chapters complete with their new footnotes. "We think the level of comments could be as engaging as the original," said Vanderlip. "Because our system supports annotation in a very detailed, contextual way, we have found that users do not abuse the system. But we have the means to delete anything that might be offensive." Bronson said he saw the project as having a "'wisdom of crowds'/Wikipedia-like community moderation".
Philip Jones, managing editor of, said that publishers were all looking at ways of making books "more communal". "It's the whole idea of having a conversation around a book, no longer reading in isolation and building a community of readers," he said. "[The Bronson experiment is] another innovation from publishers who are seeking ways to reach out to readers in the digital age. It works for Amazon who have created a whole new platform for getting feedback on books in their comments. There is nowhere else you can get that feedback, and I know authors use it."
At Penguin, digital publisher Jeremy Ettinghausen said that readers were increasingly "wanting to discuss and comment and tag things, and as an initiative which allows people to indulge that, this is welcome". "I'm looking forward to a version when people can read the same book at the same time and all comment together," he said. "We are always thinking about how we can develop communities around particular books or categories, and there will be a time when we'll be able to integrate those communities and conversations with content."
"Enhanced ebooks will almost certainly be the way forward, and as the quality of ereaders improves, there will be a multitude of ways in which we can do this," added Hodder & Stoughton's Isobel Akenhead, pointing to "director's cut" editions of books – with commentary from the author about why and how the text might have changed, as well as user commentaries, which she said would work particularly well for reference books such as recipe books.

Friday, 4 September 2009

The "digi-novel" and digimodernism

This is fascinating:

'Digi-novel' combines book, movie and website
Wednesday, 2 September 2009

Is it a book? Is it a movie? Is it a website? Actually it's all three.
Anthony Zuiker, creator of the "CSI: Crime Scene Investigation" U.S. television series, is releasing what he calls a "digi-novel" combining all three media -- and giving a jolt to traditional book publishing.
Zuiker has created "Level 26," a crime novel that also invites readers to log on to a website about every 20 pages using a special code to watch a "cyber-bridge" -- a three-minute film clip tied to the story.
Starting next Tuesday, readers can buy the book, visit the website, log in to watch the "cyber-bridges," read, discuss and contribute to the story.
"Just doing one thing great is not going to sustain business," he said. "The future of business in terms of entertainment will have to be the convergence of different mediums. So we did that -- publishing, movies and a website."
He said he did not believe the digi-novel would ever replace traditional publishing, but said the business did need a shot in the arm.
"They need content creators like myself to come in the industry and say, 'Hey, let's try things this way,'" he said.
Zuiker put together a 60-page outline for the novel, which was written by Duane Swierczynski, and wrote and directed the "cyber-bridges." He said the book could be read without watching the "cyber-bridges."
Zuiker said the United States was infatuated with technology and it had become such a permanent part of people's lives that more entertainment choices were needed.
Increasingly, people are reading books on electronic readers like Amazon's Kindle and Sony Corp's Reader.
Those devices don't play videos, so "Level 26" readers still need to log on to the Internet on a different device. Apple Inc is said to be developing a touchscreen tablet, which some analysts envision as a multimedia device that could play videos.
Zuiker said people's attention span was becoming shorter and shorter and that it was important to give people more options on how they consumed entertainment and books.
"Every TV show in the next five, 10 years will have a comprehensive microsite or website that continue the experience beyond the one-hour television to keep engaging viewers 24/7," he said. "Just watching television for one specific hour a week ... that's not going to be a sustainable model going forward."
"I wanted to bring all the best in publishing, in a motion picture, in a website and converge all three into one experience," he said.
"And when the book finished and the bridges finished, I wanted the experience to continue online and in a social community."
Zuiker said he came up with the idea for the "digi-novel" during a three-month TV writers strike in 2007/08.

Tuesday, 18 August 2009

It takes all sorts to make an episteme

Another approach to shifting cultural paradigms. Check out:

"Clarice Garcia’s compilation was built around the rebel manifesto that post-modernism is dead and duly deconstructed casual day dresses into irregular blocks of carnation, tangerine and orchid."


Tuesday, 11 August 2009

Another interview I gave (long, but good I hope)

1.) What is digimodernism?

At its simplest, digimodernism is the name I give to the cultural dominant which has emerged since the second half of the 1990s in the wake of the exhaustion of postmodernism. It denotes a prevailing cultural paradigm, what Fredric Jameson called “a dominant cultural logic or hegemonic norm… the force field in which very different kinds of cultural impulses… must make their way”. In this sense it is postmodernism’s successor, although cultural history cannot, of course, be cleanly divided into watertight compartments: it is strongly inflected, in its contemporary form, by postmodernist residues, especially some of its habits of thought.
More precisely, digimodernism is the name I give to the cultural impact of computerization. It denotes the point at which digitization intersects with cultural and artistic forms. Most recognizably, this leads to a new form of text with its own peculiar characteristics (evanescence, onwardness, haphazardness, fluid-boundedness, etc.). But there are wider implications which make digimodernism, though easy to sum up in a misleadingly quick slogan, a disparate and complex phenomenon. Digimodernism is the label under which I trace the textual, cultural and artistic ripples which spread out from the explosion of digitization. Under its sign, I seek patterns in the most significant cultural shifts of the last decade or so, in such a way as to have predictive value: recent phenomena such as Nicolas Bourriaud’s Altermodern exhibition at Tate Britain and Antony Gormley’s Fourth Plinth in Trafalgar Square have confirmed its outline of our cultural present.
However, digimodernism differs from terms which superficially resemble it, like “modernism” and “postmodernism”, or even “Romanticism”, in two crucial ways. First, it does not clearly refer to a privileged quantity of artistic content or set of artistic styles for creators to select from, mould and transform. It is not primarily an aesthetic given, at least not yet. Secondly, it is not found by merely gathering together the work of the era’s most innovative or intelligent artists and reading off what they have in common, as Bourriaud seeks to do. Both of these cultural-historical traditions are mired in assumptions about the avant-garde and the historical linearity of art which strike me as outdated. Digimodernism is not automatically an aesthetic achievement; to a degree it is characterized by a certain value-neutrality, evinced by the potential which opens before each participant as s/he steps on to Gormley’s plinth.

2.) Do you believe that digimodernism has come about as a result of the fragmentation and break up of grand narratives of postmodernism? Would you say that digimodernism is part of postmodernism or comes after postmodernism?

I think digimodernism’s origins lie in innovations in computer technology, which are in the throes of revolutionizing every inherited dimension of the text: its authorship, reception, material form, boundedness, economics, and so on. These upheavals entrain and are paralleled by a raft of cultural and social shifts. They are, I think, inimical to a postmodernism formulated well over a quarter of a century ago now, though it does not suddenly invalidate postmodernism to say that its moment has passed. The superannuation of postmodernism was noted in 2002-03, long before digimodernism became visible, by Linda Hutcheon and Ernst Breisach, among others. It is simplest to say that digimodernism succeeds postmodernism, because the former’s vitality is simultaneous with and to a degree reliant on the latter’s exhaustion. But as both terms are complex and multifaceted, so is the historical relationship between them.

3.) Do you think that twittering and blogging help to create a pluralist society and help to break up violent thinking/just one media voice?
4.) Do you think that Twittering and blogging etc fragments or unifies us as a society?

Remembering how Goethe, the apostle of Enlightenment, died with the phrase “More light!” on his lips, Steven Connor has some fun imagining his postmodernist equivalent departing this world with the cry: “More voices!” Postmodernism valorized the project of moving to the cultural centre previously silenced or marginalized voices (women, “colonials”, etc.), immeasurably and irreversibly enriching our sense of cultural history. Such a project inevitably destabilized certain entrenched cultural and social power formations, and there is no reason to believe it is finished. In consequence, there is an impulse to welcome blogs and Twitter. Quantitatively, they dramatically increase the numbers of people who write for publication and for an audience potentially global and enduring in scope. Indeed, all the platforms of Web 2.0 drive up the number and broaden the range of articulating voices in a way which postmodernism has taught us to see as inherently pluralist, emancipating, and transgressive.
The most obvious retort to such a view in this context is to point to the mind-numbing banality or the savage viciousness of much that actually appears on Web 2.0. The cultural empowerment of an ever-wider cross-section of the public runs up against its educational and social failings. More interestingly, it is the very digimodernist textuality of these platforms that predisposes them to these faults: it is their evanescence that breeds a tendency to triviality, their anonymity (or pseudonymity) that paves the way for aggressiveness. The postmodern project could not foresee this. The blending together in one space of such a vast number and wide range of voices seems unifying in effect, the renewal of pluralist democracy through a sort of electronic town hall or a challenge to the corporate control of the media. Yet social interaction presupposes a physical proximity that Web 2.0, which aggregates in large cyber-groups what are socially tiny numbers of people from an infinitely large and dispersed number of places, militates against. Moreover, I cannot see how, in ordinary times, a platform as evanescent as Twitter can solder a society together: a formed society rests upon a reasonably stabilized textuality such as books or films allow, enduring over time so that it can be shared and passed on.
The flipside of this, however, is that in exceptional circumstances, when a society is being re-formed through war or political crisis or collective dissent, the haphazardness, onwardness and evanescence of the digimodernist text are ideally suited to the dissemination of information on a wide scale which is intended as the basis for action. Many instances of this – the Iraq war, the Iranian elections, the 1st April demonstrations – can be given. Official information conduits are then bypassed and citizenship enhanced. Textual unformedness here goes hand in hand with socio-political uncertainty.

5.) In one of your blog entries you quote Charles Arthur as saying that 95% of existing blogs on the Web are abandoned and that the bloggers have moved on to Facebook and Twitter. Do you agree with this?

I have no empirical data on either the numbers or the motivations of people who abandon blogs. In my book I talk about the characteristic evanescence of textuality under digimodernism at the level of the individual creation, and Arthur made me wonder whether this might be extended to digimodernist platforms themselves, though logically there must be some limit to this. Twitter and indeed Spotify came to prominence after I had finished the book, and their emergence evinces the continuing dynamism of digital textual innovation. But the revolutions of digitization will outlast the social excitement they may elicit.

6.) In another of your blog entries you talk about Nick Cohen from the Observer as saying that professional journalists ‘look as doomed as blacksmiths in the age of the combustion engine’ due to Web writing. Do you think that journalism is being overturned by digitization? What do you mean when you talk about digimodernist novels and poems?

Contrary to some apocalypticists, I suspect that journalism will be turned inside out by digitization but not destroyed. There is a social demand for reliable sources of information about the outside world which long predated print and will outlast it. The contemporary challenge, and hardly an insurmountable one, is to monetize digital journalism. We can only speculate about what journalism will look like in twenty years’ time, but it may be that it will retrench at the level of the national/international and the weekly/monthly, abandoning forever the local and the daily. When speed and proximity are more important than breadth or depth, the digital and amateur will beat the professional and print; but also vice versa.
As for digimodernist literature, I’m not sure that this exists yet, though I can think of some proto-digimodernist works such as B. S. Johnson’s The Unfortunates. Nevertheless, there is evidence of a move beyond postmodernism in contemporary literature. Digitization has transformed the shape and status of many kinds of written or printed text, sweeping away or radically revamping such ancient modes as the diary, the cheque, the map, the newspaper, and the letter. One assumes that the highest form of writing, literature, will eventually be engulfed too by this wave. Already the authorship, production and reception of literature and books are being revolutionized by computerization; their content and style will surely follow.

Wednesday, 5 August 2009

Not desperate, but not romantic either

The BBC is currently screening a new costume drama about the Pre-Raphaelites called Desperate Romantics. It seems a useful peg on which to hang a few observations about the contemporary digimodernist conception of the past. Desperate Romantics is symptomatic of a trend in historical drama, and the points I am going to make apply just as easily to other recent TV series such as Rome, The Tudors, and Life on Mars as well as Hollywood productions like The Mummy or Peter Jackson’s King Kong.

Fredric Jameson famously identified the nostalgia film as one of the central instances of 1970s-80s postmodernism. In a world where “history”, or the sense of the past feeding into the present in a continuous cycle, is lost, it can only be evoked as something fossilized, stylized, and mourned: as frozen in aspic, transformed into fashion, and suffused with melancholic longing for what is now irretrievable. Desperate Romantics, on the other hand, could scarcely be more different in its approach to the past. It’s self-consciously tongue-in-cheek, as its joky title and nod to the series Desperate Housewives attest; a disclaimer at the start of each episode warns us that certain fanciful liberties have been taken with the historical record. But inaccuracy is not the issue here.

In short, Desperate Romantics recreates the 1850s as the 2000s in vintage clothing. As Rossetti, Millais, and Hunt stride heartily along London streets with their long hair flowing and their youthful eyes ablaze, they do look, as one reviewer commented, like a contemporary boy band about to burst into song. But whereas postmodernism might have richly played past and present off each other, as Blackadder or Back to the Future did, Desperate Romantics swamps its nominal past with the actual present. The cast move and talk like present-day Oxbridge graduates dressed in old-style clothes; no attempt is made to mimic the stiffness or formality portrayed in Victorian novels. The average viewer is given the impression that the painters were no more interested in or informed about art history and literature than s/he is. Their speech foregrounds present-day sexual frankness: they openly discuss their “virginity”, Effie Ruskin casually reminds her husband of when he “cupped my breast” – genteel characters have an easy sexual discourse that in 1850s England would only have been voiced by a prostitute. In a reversal of actual dominant ideology, Victorian repression is depicted as peripheral or as a joke: Tom Hollander’s Ruskin is uptight and anguished, but also ludicrous and marginal. The implication, as conceited as it is historically untrue, is that interesting and worthwhile people in the past were tolerant (open to other classes, genders, races), free (in sex and discourse), and indistinguishable from ourselves. Anyone else is comic relief.

Similarly, in Life on Mars a 2006 policeman travelled back to 1973 to discover that he was more knowledgeable (he knew everything they knew, but they didn’t know, for instance, that Britain would soon have a woman Prime Minister), more tolerant (towards women and ethnic minorities), and less technologically advanced (in forensic science) than his parents’ generation. They and their world are uglier, their food is worse, and so on. This assumption of unearned temporal superiority is partly explained as a product of the brain of a particularly self-confident individual lying in a coma; and though it cannot be articulated, the lost qualities of 1973 are finally inchoately felt in the show’s conclusion. On the whole, the present strides through Life on Mars’s 1973 like a messiah of knowledge, tolerance, and taste come to redeem the benighted heathen.

Some of the superiority of the present day here is well founded, of course, especially the advances in forensics and equality. Moreover, it is as long-standing a human trait to feel that one’s generation is better than its predecessors as it is to imagine one’s culture better than foreign ones. Since the early 19th century people have complacently enjoyed the myth that all pre-Columbian Europeans believed the earth was flat: if humans like to construct other societies as “backward”, they relish setting their invidious constructions in distant times as well as in remote lands. Life on Mars’s temporal superiority complex becomes limiting and unsatisfactory, while Desperate Romantics – which would like to see itself as a “romp” – displays a general indifference to the pastness of the past.

Essentially, it assumes that if 1850s Victorians are not like us, they are of no value or interest – they are, like Ruskin, cartoonish, grotesque, screwed-up. They need people like us to come among them and save them – real people, good people, normal people. This missionary premise was memorably dramatized as long ago as 1998 by the film Pleasantville, where the present day magically invests the 1950s with sexual fulfilment, personal freedom, and racial and gender equality. Pleasantville is closer to postmodernism in its treatment of the past, but the move beyond nostalgia, beyond fossilization and mourning, was already apparent. The present is here become arrogant, imperialistic, totalizing, and deluded: be as us, it proclaims, or be wrong, stupid, dull, unhappy or wicked. Such films and TV series are, then, morality plays in which, by living now, we are guaranteed to be the goodies: it is time that tells.

Thursday, 9 July 2009

The plinth, Twitter, and similar

Charlotte Higgins at the Guardian has blogged about the installations at the plinth in Trafalgar Square, which strike me as pure digimodernism in action:

Wednesday, 8 July 2009

Got to be ending something

The death and memorial service for Michael Jackson bring to an end something besides the life of a man. I drafted a post on this last week, put it to one side while I was writing a paper for a conference I'm going to this Friday (on Literary London, if you must know), then found that Momus had said pretty much exactly what I wanted to say in the meantime: that Jackson was the last of his type, a relic from another age... and Momus had been quoted around the world for it:

In particular he notes:

Michael Jackson is not just the King of Pop, but the Last King of Pop. Three major factors will prevent there ever being another one: digital culture and its fragmentation of the big "we are the world"-type audience into a million tiny, targeted audiences; the demographic decline of the "pigs in the pipe" (the Baby Boomers, Gen X and Gen Y, who made pop music's four-decade-long pre-eminence possible); and the decline of the influence of the United States....
Jerry Del Colliano, a professor of the music industry at the University of Southern California... thinks that stars will emerge from social networking software.
[I], however, believe that social networking may have the opposite effect... the world may be headed back to what celebrated sociologist Pierre Bourdieu found in 1960s France -- white-collar workers preferred high-brow classical music, while manual laborers listened to cheap pop. A few decades later, postmodern consumer culture had leveled that, at least superficially: now, people with college degrees spoke about Michael Jackson 'intelligently,' people from lower class backgrounds spoke about him 'passionately.' But everybody spoke about him... But social networking is now limiting interaction among groups with different tastes... I think we'll see different classes embracing different cultures again. Things will settle back into the kind of cultural landscape Bourdieu described.
Socio-cultural prediction is a mug's game, and my instincts tell me that wherever we are going it is not forward into the past. Indeed, the clinging to Bourdieu may itself be a consoling retreat to the familiar even as the end of an era is announced. But the incommensurability of digital forums such as social networking with the old postmodern landscape is accurately perceived, I feel.
In truth, as an entertainer Jackson was already a ghost from a vanished world when he died. He left us the same day as Farrah Fawcett, another pop culture superstar from the late 1970s/early 1980s. After the end of the 80s Jackson remained in the public eye through ever-more grotesque and sordid personal behaviour, not through his work; his songs and videos - which once defined MTV and the whole early 80s Fredric Jameson aesthetic of pastiche, the waning of affect et al - became void of style or content outside of their deluded messianism (clips of him bestowing hope on thousands of the downtrodden, etc.). The website set up for Jackson's funeral invites us to choose Jackson's best song - "Thriller", "Billie Jean" or "Bad" - and doesn't notice that the most recent of these is considerably older than the average pop music fan.

So, Jackson's passing encompasses four kinds of death, beginning with the immediately physical. Textually he was yesterday's superstar too, like Fawcett; creatively he was already a burned-out force. The nature and scale of his superstardom is doubtless a thing of the past too, as Momus notes. Finally, in the wake of a Glastonbury dominated by Blur, a band who peaked commercially 15 years ago and split up 6 years ago, Jackson's death highlights the contemporary status of pop and rock as something superannuated, living on its memories, fading slowly into the cultural night.

Tuesday, 7 July 2009

What postmodernism isn't (or wasn't): part two

My phrase about "postmodernism" becoming some kind of intellectual black hole which sucks into itself all of the thinking and culture of the last forty years, like an insatiable and monstrous thought-mouth, reminds me of a joke (not a funny one really) that people used to make in the 1980s. It took this kind of form:

Person 1: "Ow!"

Person 2: "What is it?"

Person 1: "I've stubbed my toe!"

Person 2: "... That's so postmodern."

- Because everything was postmodern. Or, rather, everything that happened in the very recent past and present was postmodern. Or, rather, everything that happened in the very recent past and present to interesting, cool and hip types like ourselves was postmodern. Postmodernism, such an evil empire to a certain kind of evangelical Christian, was such a badge of cultural superiority to a certain kind of hipster.

Wikipedia's entry on postmodernism reads in part:

"The movement of Postmodernism began with architecture, as a reactionary movement against the perceived blandness and hostility present in the Modern movement."

The black hole beckons. If we are speaking of a "movement against", why not use the term "antimodernism"? If we agree that something called modernism exists, like America and clockwise movement, then surely antimodernism can exist along with anti-Americanism and anticlockwise movement? And I would certainly call a range of writers like Auden, Larkin, Orwell and Greene antimodernist - they were reacting against the experimentation and difficulty of Eliot, Joyce et al with a shift toward simplicity and tradition. And after them came writers (in Britain) like B.S. Johnson or Christine Brooke-Rose who went in another direction again, a postmodernist one.

Postmodernism has this drive toward an intellectual black hole because it operates on certain levels and in certain quarters - and very ironically - as a sort of grand narrative, a totalizing and complete description and explanation of the world. This is a fairly low, even trashy level of postmodernist discourse, but it palpably exists. Terry Eagleton's The Illusions of Postmodernism is aimed at it. When functioning in this manner it has a tendency, as grand narratives do, to see the world in binary terms as a battle of good and evil, with itself taking up the role of good.

Such a worldview is again beyond irony, for postmodernism emerged simultaneously with (but not quite coterminously with) deconstruction, which argued (in part) that language is composed of binaries but that we privilege one or other term (white over black, male over female) in a way which destabilizes the flow of meaning. It's unarguable that postmodernism frequently defines itself over and against modernism. Ihab Hassan produced a pair of columns setting modernist and postmodernist characteristics against one another: "metaphor" against "metonymy", "transcendence" against "immanence" etc. Fredric Jameson begins his magnum opus with a comparison of a Van Gogh and a Munch against a Warhol. Binaries, pairs.

The assumption is then made, and the message is then transmitted, that:

(1) Postmodernism is the immediate successor to modernism. Nothing happened between them; we went straight from Woolf to Doctorow. (Dubious in the extreme.)

(2) Postmodernism is the equal of modernism: every Ulysses, Battleship Potemkin, Demoiselles d'Avignon or Waste Land has an equal postmodernist achievement in the fields of the novel, film, art and poetry. (Arguable, but tenuous in the extreme.)

(3) Modernism is the only alternative to postmodernism. If the latter is the way we are now, the former is the alternative to it. Binaries imply a totality: if you do not fight for God, you fight for the devil, necessarily and with no other options; if you are not pomo, then you are modernist. (Nonsense for published postmodern theorists, but a pit into which students who are unversed in history and culture prior to 1890 all too often fall.)

(4) Postmodernism is better than modernism as a spiritual and/or moral and/or political and/or philosophical condition. As in deconstruction, we privilege one of the terms of the binary. This position is suggested by The Postmodern Condition and it is extremely attractive again to hipsters (if we can use the word) or anyone who wants to feel good about themselves. We are not a culture which is interested in the idea of being inferior to another time, as mass media texts from Pleasantville to Life on Mars unflaggingly repeat. Our temporal superiority complex is redoubtable. And so postmodernism gets identified with freedom and justice while modernism or modernity, we are told, are all about oppressive universals. I won't belabour this: on the Internet where the under-30s discuss postmodernism it is obvious that they consider it the equal of, the alternative to, and the clear superior of, modernism. Hence the hostility of some of them to the notion that it may be over.

My sense is that postmodernism had much that was good about it. It represented a redrawing of the cultural and political landscape to include many who had once been excluded or denigrated, with empowering and enriching effects. But it had serious drawbacks too. And in any case, it's a mistake to see postmodernism as a camp in a Manichean struggle against previous oppression. We are (or were) thrown into postmodernism; we are/were born and find ourselves floundering within it. It must be stressed that you will not find such assumptions in the writings of most postmodernist theorists. But those writings are disappearing into the past now (it's the 25th anniversary this year of "Postmodernism, or the Cultural Logic of Late Capitalism", which is as far away from us as The Smiths' first album or the Soviet boycott of the Olympics). And what remains today is something of lower quality, more exposed and untenable. The great days of postmodern theory are long gone, and we are stuck in their dog days. Hopefully Digimodernism can be a ladder by which people can start to climb out.

More on this to come: metafiction; the blurring of high and low culture; and lots more besides.

Friday, 3 July 2009

What postmodernism isn't, or wasn't: part one

Part one of an occasional series.

For several reasons my book doesn't spend much time defining what postmodernism is, or, increasingly, was. Plenty of existing books by writers like Hans Bertens, Steven Connor and Simon Malpas already do a perfectly good job of this. Consequently, Digimodernism pretty much assumes that we all know well enough what postmodernism means.

However, looking at recent uses of the term on the Internet makes this assumption rather awkward. There are an awful lot of misconceptions out there. This matters to a degree because an understanding of digimodernism and its status as a contemporary cultural dominant depends on a correct apprehension of the nature of what it succeeded. Roughly half of new references to postmodernism that I find on the Internet seem to be from Christians, especially American Christians. This confirms a pattern I found when, in 2008, I ran a search through books published in the last 6-7 years held in Oxford's university library (which receives by law a copy of every book published in the UK) for the word "postmodernism" or its cognates. About 80% of contemporary usages of the term, on the Internet and in the publishing world, came either from students of college modules or their professors (giving the term the same status as "Romanticism") or from Christians.

The danger with the term was always that it would become a kind of intellectual black hole, sucking into itself concepts and practices that had no need to be labelled or understood in that manner. "Postmodernism" must mean, to a strong extent, something periodized, a historical era; it must be temporal or it must mean nothing at all. Both the body of the word and its prefix emphasize a moment in time, with a beginning and, therefore, an end. This moment will certainly have its precursors, but "postmodernism" cannot be an eternally available option. It is not, as one contributor to an Internet forum opined, a "state of mind" with exemplars among the ancient Romans. States of mind have their own descriptors - see a dictionary for details. Writers who explicitly discuss postmodernism, such as Lyotard, Harvey and Jameson, agree that there was once a time when it simply did not exist. The confusion arises when "postmodernism" is taken to cover writers, notably Derrida and Foucault, who never treated the subject, though their work clearly bears some family resemblance to it. Derrida's work aims at eternal validity, as philosophy does; it is not, therefore, strictly postmodern.

Consequently, postmodernism is NOT, as so many online Christians seem to think, merely relativism or subjectivism regarding truth or ultra-scepticism regarding knowledge and objectivity or a belief in pluralism. These ideas are age-old, familiar to the ancient Greeks, and have floated around whenever people sat down to think seriously about thinking. Nor is postmodernism identical with some kind of questioning of the bases of knowledge - that's philosophy (duh).

The most common error made by Christians here is to construct postmodernism as the mirror image of themselves. Evangelicals, by their very nature, are keenly interested in the intellectual and philosophical and moral state of the world they are trying to evangelize, in the same way that footballers are very interested in the teams and players they come up against. The question they ask is, then, eternally this one: Why isn't the world Christian? What is it with the world, that makes it not believe? I was an evangelical Christian myself in the early 1980s (while Lyotard, Baudrillard and Jameson were defining postmodernism as the expression of our time), and I was informed by church leaders who had been at university in the late 1960s that the world was fundamentally... humanist. This prevailing humanism was said to lie at the root of abortion on demand, tolerance for homosexuality, amoral politicians, and all other social evils.

Twenty-five years later, evangelicals are asking the same question, and this time their leaders, who had been at university in the late 1980s and early 1990s, answer that the world is fundamentally... postmodernist. This prevailing postmodernism is said to lie at the root of abortion on demand, tolerance for homosexuality, amoral politicians, and all other social evils. Exactly like humanism. Never mind that postmodernism is anti- or post-humanist. And never mind that postmodernism in 2009 is as hip and now as humanism was in 1983-84.

I'm all for trying to understand the world we live in systematically and historically. It's just ignorance and arrogance to suppose that one is the sole and supreme source of what one does and thinks. But the Christian take on postmodernism is unreal. It imagines that "the postmodernists" are an organized body, a recognizable set of people with a raft of shared beliefs and ideas. It imagines that these people sit around sharing their common views on truth and reality and knowledge and belief - that they hold to an agreed value system about the meaning of the real and the true. But this is a description of evangelical Christians, not of postmodernists. For this reason, all such references to "the postmodernists" prove incapable of supplying names.

There are people these days who espouse an ultra-subjectivist or relativist view of truth ("you're right from your side/and I'm right from mine", in Dylan's words). But this is much less the effect of "postmodernism" than of consumerism, which sees all choices in the marketplace as equally valid. American Christians are obsessed with postmodernism because they are socially conditioned never to question the economic system they inhabit, and this is nothing new (I noticed it in about 1982, and it wasn't new then either). The US is dominated by a certain economic system and its driving ideology, and American Christianity is deeply infected, as all non-Christians know, by nationalism: in the early 1980s the US was said to be under attack from humanism and Marxism, and now it is said to be threatened by postmodernism, but in all cases and all times it is said to be menaced by something foreign and alien and coming from abroad. Its problems can never be home-grown.

So version one of what postmodernism isn't or wasn't: it's not an ultra-sceptical or relativist creed about reality and truth which threatens the free world. More on this later.

Tuesday, 30 June 2009

Matilda Anderson: Theory and criticism

"It is my conjecture, however, that neither the modernist nor post modernist approach is fitting for artwork created within the past five years."

Matilda Anderson: Theory and criticism

Monday, 29 June 2009

Of sex and blogs

Two interesting articles in last week's Technology Guardian leapt out at me.

Aleks Krotoski noted that sex is absolutely excluded from videogames in any form other than visual titillation. Yes, there are heroines with bulging tight tops to gawp at. But actual sex as a practice is rigorously excluded. You can adopt a self that blows someone's head off, fine. But you can't adopt a game self that has intercourse. Why? This is a very interesting question. Every other textual form - the novel, theatre, film, TV, etc. - has found a place for sexual activity and tended to sell better when that place was a prominent one. Why not videogames?

I suspect the answer is, in short, that videogames are a different (digimodernist) species of textuality. The inherited rules of games don't apply to them; they're not commensurable with what's gone before them; their exceptionalism takes us into uncharted reaches.

Charles Arthur, on the other hand, argued that the long tail of blogging is over. Yes, people still blog and read blogs, but he quoted research saying that 95% of existing blogs on the Web are in effect abandoned. Where have all the bloggers gone? To Facebook and Twitter, he thinks.

The digimodernist text is clearly subject to its own evanescence, but also, it would seem, on the level of the platform. There is a powerful wave of fascination with computerized text around now, and it tends to surge towards the newest thing, leaving behind older models.

Thursday, 25 June 2009

Another successor to postmodernism?

Philip Galanter has suggested a successor to postmodernism called "complexism". Again, I haven't fully digested this yet, but it does seem stimulating. Here's a chapter on the subject he wrote:

"As a central part of the art manifesto aspect of this chapter I’m asserting that it is time to go beyond postmodernism. Like all waves of philosophical skepticism, postmodernism taken to its ultimate conclusion leads to an intellectual and existential dead-end. And, indeed, even in the arts and humanities there is a vague sense that postmodernism has been “played out.”"

Allan Cameron's modular narratives

After James Harkin, Allan Cameron has written about the rise of "modular narrative" in contemporary cinema, and changes in narrative structure in films such as 21 Grams, Time Code, Run Lola Run, Eternal Sunshine of the Spotless Mind, and Memento. Very interesting, though he's more reticent (it seems) on the links between new digital technologies, the Internet and such movies. I need to study this more carefully, but here's the introduction to his book (published 2008).

Friday, 19 June 2009

Friday, 12 June 2009

Symbols clash

It's been announced that "Web 2.0" is the millionth word in the English language:

A shame, perhaps, that it wasn't digimodernism (!), but otherwise it's entirely fitting that a concept so close to the heart of our new cultural and textual dominant should be given such a symbolic status. There's a whole chapter about some of the modes of Web 2.0 in the book.

Except, of course, "Web 2.0" is not a word... Never mind. When the symbol and the facts diverge, print the symbol!

Friday, 5 June 2009

Lost and gained

I've been watching the first episodes of Lost, first screened in 2004.

Many people pointed out a while ago that Lost resembles the 1960s TV drama The Prisoner, and I think a comparison of the two is interesting. Both concern the sudden and violent arrival, against their will, of English-speaking individuals in a beautiful and sinister, exotic-looking location. One is kidnapped, the others fall from the sky in a plane crash. Where they are, they don't really know - it's a major enigma and plot driver - but it seems to combine the idyllic, the touristically delightful, and the dangerous, the menacing. Stranded and striving to survive, these people have secrets which they may or may not want to reveal; they are opaque to each other and to the viewer. The attraction of both series is the unresolved nature of their premises - who these people really are, where they are, and why, are all nebulous, such that viewers become simultaneously hooked and frustrated, eager for explanations and ready to advance their own. And so a "cult" series is created, in the sense that its fans become ravenously keen to know more about it.

And yet there are differences between the shows which suggest to me the passage of cultural time. One key variation reflects the shift from "popular culture" to a dominant "children's entertainment" paradigm which is consonant both with the fulfilment of postmodernism and its superannuation. The Prisoner was a mature man, a worker, sober, serious and inscrutable; most of the Lost cast are very young, single, non-employed, and childless adults who dress and act for the most part in order that young teenagers can identify with them. I've never been on a flight with fewer couples, kids, or over-thirties on board! They all look 23 going on 18. Moreover, they're "young" in a teenage way - as cool dudes and slackers, geeks and rebellious babes. They wear scratches from the crash like make-up, like members of some hip subculture. They're lightly unattached, sexually attractive, and given to fashionably unexamined attitudes about gender and race.

Much has been lost by Lost. The geopolitical, ideological, technological and indeed political aspects of The Prisoner have been evacuated by this dash for a young teenage audience. No knowledge of or interest in the actual existing world is assumed by Lost's makers. Equally missing is any philosophical enquiry, however allusive and finally unsatisfying was the one sprinkled through The Prisoner. Consequently the stories oscillate between the soapily quotidian (the backstories, the character interplay) and the mystically sinister, between the natural and the ineffable, squeezing out the middle ground, the social, on the way. It's a Prisoner for a consumerist, a post-political age, and just as it's not really "popular" (it has no desire to please the older half of the population), it's not really "cultural" either - not really significant or resonant beyond itself.

Narratologically, too, the episodes are rather shapeless compared to those of The Prisoner. The show seems predicated on its own apparent endlessness: just piling enigmas and mysteries and riddles on to each other, just ceaselessly adding puzzles and their solutions, the show is set up to go on, in principle, forever. It moves forward episode by episode, but in no very palpable direction. Most interesting new TV dramas made over the last decade grow from episode to episode and contain no internal mechanism by which they could end themselves, so that they continue until halted by a TV executive (rather than until they conclude their own business). Lost is an obvious example of this. Personally, I find it rather ridiculous, though enjoyable: it seems so nakedly arranged to be "intriguing" and "addictive", to draw its viewers along a wild goose chase-cum-treasure hunt, rather like Life on Mars and The Da Vinci Code, with no intention of ever repaying the fascination they elicit. Such stories, with their evocation of a problematic and opaque reality system, owe much to the strand of postmodernist fiction that destabilized their characters' sense of the real, from The Crying of Lot 49 to The Matrix and from Money to The Man in the High Castle. But Lost isn't interested in ideas related to the "fabrication of the real". It's like an afterecho of such fictions, rendered shallow and subjugated entirely to its prevailing endless narrative structure.

Wednesday, 3 June 2009

Interview I gave

Here's the link to an interview I gave about digimodernism:

Thursday, 28 May 2009

If not now, when?

So, it's not going to be out in May, is it... The best bet would seem to be some time in July. But watch this (cyber)space.