Monday, 30 November 2009
As the year draws to an end, so polls of the "best" albums, films, books etc. of the last twelve months appear in newspapers and magazines; as the decade draws to an end (allegedly), so polls of the "best" albums, films, books etc. of the last ten years appear in newspapers and magazines. These are disparagingly but inaccurately called "lists", presumably by those who think that election results are just meaningless itemizations of random people's names. Such critics' polls are especially common in those cultural forms where academia holds little sway, like rock and pop, or where consumer opinion is noisily heard, like film: you don't get many polls about the "best" Greek tragedies or Victorian poems. Consequently, the results are to a certain extent canon-making, in the absence of professionals thought to possess the socio-cultural capital (the authority) to create them. The recurring "best albums of all time" polls found in the NME, Rolling Stone, Q etc. have wound up establishing a sense that certain texts - Revolver, Astral Weeks, etc. - are of an exceptional quality regardless of how many copies they sell or what the punters may think.
"Texts of the year" polls, on the other hand, come too early to be worthwhile. The NME chose "Heroes" as its album of the year in 1977, a verdict surely nobody would today wish to endorse (even Bowie). On the other hand, such polls sum up a mood among critics, which has a certain interest in itself. The fact that in 1989 the NME plumped for De La Soul's 3 Feet High and Rising, although four years later they chose the first Stone Roses album as the greatest of all time, let alone of the year of its release, reminds me of the critical popularity at the end of the 80s of postmodernist theories of pastiche and intertextuality, and hence of the modishness of sampling. It brings back an era, regardless of whether it's "true" or not.
So what have I learned from the current glut of lists, sorry, polls?
(1) That the transformation of American and consequently world popular cinema into a form of children's entertainment continues apace, and that although some of its manifestations are cynically sterile (Transformers 2: etc etc), many of them are brilliantly original (Up, Fantastic Mr Fox). I expect that the Christmas Radio Times, which will be out in a fortnight, will again be packed to the rafters with "children's films". What these have replaced is not, as some drearily and disingenuously moan, towering works of film art like 2001 and Taxi Driver; instead, they have superseded earlier forms of popular entertainment. Thirty years ago the Christmas Radio Times was a litany of MGM musicals, 1930s slapstick comedies, and war movies. Some of these were brilliantly original too, but tastes change.
(2) That the rock/pop album is dead. The results of the various albums-of-the-decade polls that I've seen have been truly pitiful. Kid A, a very good record that pales in comparison with its predecessor in Radiohead's oeuvre, as the best album of the past 3,650 days? Or The Streets, the only artist in the history of rock to provide its own devastating parody at the same time and in the same place as its music? Some ancient rocker, I forget who, once complained that critics were so lazy that, were they to review a record called "I'm Completely Crap", they would just write "You said it, mate" even if it sounded like "Good Vibrations". Bearing this in mind, I don't quite know how to respond to the fact that the NME chose Is This It? as the best long-player of the noughties. Is that it? (Sorry.) That slab of Diet Coke Marquee Mooning? The best we can come up with? You might as well have listed the "best" music hall artists of the decade. The point is, of course, that a poll of the best downloads would yield far more impressive results, since that is now the format in which rock and pop achievement comes.
(3) That TV drama and comedy have been, across the decade, of an extraordinarily high standard. They have, I think, culturally outpaced their film equivalents. The advent of "box-setting", whereby TV shows are written and structured largely with a view to being watched at home in a self-scheduled and intensive run analogous to the reading of a novel, has vastly improved these genres, especially American drama. This is important to note, because just by switching on your set every night and flicking through the channels you can get the impression (a) that TV is crap, and (b) that TV is nothing but wall-to-wall docusoaps, reality TV, and futile "challenges". The point is, of course (to repeat myself), that box-setting takes shows outside the economy of television: they're deprived, or freed, of the advertising that bankrolled them. Fantastic for the viewer, tough on the networks.
(4) Somebody writing on Wikipedia called my "The Death of Postmodernism and Beyond" an "entirely pessimistic" take on 2000s culture. Since writing it in 2006, I've come to the conclusion that my negativity about the quality of contemporary culture owed too much to a backward-looking conventionality (the same conventionality that causes critics to cling to notions of "best albums"). The real achievement of the decade hasn't been the appearance of great works in established formats so much as the emergence of new formats of expression. I shouldn't therefore have been looking so much for terrific new novels, plays, films etc. as exploring the new kind of textuality that has emerged. If I were invited on to a show discussing the drift of culture in the 2000s, that's what I would emphasize: the extraordinary flowering of new avenues and economies of expression, rather than sensational new instances within long-established forms.
Mea culpa? Yes, all right; but 2006 was, in digimodernist terms, a very very long time ago.
Monday, 23 November 2009
I decided to write Digimodernism in a very particular style, and alluded to this in the Introduction, where I wrote:
I’ve tried to address here a hybrid audience, and for an important reason: on one side, it seemed hardly worth discussing such a near-universal issue without trying to reach out to the general reader; on the other, it seemed equally pointless to analyze such a complex, multifaceted and shifting phenomenon without a level of scholarly precision. Whatever the result may be, this approach is justified, even necessitated, by the status and nature of the theme.
With this in mind, I tried to phrase my points as if writing for two kinds of people: academic specialists on the one hand and the intelligent general reader on the other. So I looked for interesting ways into subjects, and tried to maximize references to actual cultural examples and to minimize allusions to theorists. At the same time, I also aimed to be as judicious, exact and analytical as possible. There's an obvious danger in such a double-barrelled approach, which is that you wind up pleasing no one: the general reader finds it too off-puttingly abstract and rarefied, and the specialist finds it shallow and argumentative.
There was, however, an ulterior motive at work behind my choice of style. One of the few books I talked about at length was Terry Eagleton's After Theory, and while I was fairly critical of its intellectual content, I felt that its style suggested the possible emergence of a new approach to writing about cultural issues. If it was true that postmodernism was over, what would become of the jargon-clogged prose most wonderfully (ahem) exemplified by Homi Bhabha? It was already no longer possible to believe, as the likes of Gayatri Spivak seem once to have imagined, that world justice and liberty could be measurably advanced by a better critique of Heart of Darkness. If cultural criticism's day of messianic delusion was over, so surely was its use of jargon for its own sake.
This is not to be confused with populism. Philosophy and cultural analysis should be hard to understand if they are to have any sophistication to them. People who jeer at the prose of theorists have usually never tried to read any other philosophy; in truth, you can only mock Derrida's style if you can honestly say that you have worked through the Critique of Pure Reason and found it more lucid. A friend of mine who works as a postdoctoral researcher in genetics recently showed me a paper he had contributed to; its title comprised seven or eight words, of which I knew only "the" and "of". Why then should abstract reasoning, thought about thought, be simple to understand?
[As postmodernism has been the dominant theoretical trend for several decades now, and its influence has crossed many disciplines, it tends frequently to be the only form of abstract thought people are familiar with. Consequently, it is often confused with philosophy itself, as witness complaints about how tough it is to understand. Another instance of this is when people describe postmodernism as casting doubt on all our systems of thought and values. No; it's philosophy that does that; and postmodernism is a form of philosophy (among much else).]
What I mean is slightly different: it is an abandonment of deliberate obfuscation. My objective was to write prose that was precisely as difficult as its subject matter. It is to my mind shameful that there are today academics writing no more than criticism of novels in prose that is harder to grasp than the philosophical writings of Wittgenstein; the chances that the former are saying something more profound or original than the latter are zero. This may be partially attributable to the popularity on campus of French-language theorists, whose writings translate uneasily into English: Freud and Nietzsche are far easier to read (though not necessarily to understand) than Lacan and Derrida. But this should not be overstated: some French theorists, like Foucault and Barthes, are very lucid indeed, The Postmodern Condition is a very accessible text, and most of Baudrillard is, though peculiarly expressed, not actually that difficult to figure out. The real demon is the Anglo-American academic who wants to work with this stuff but lacks the intellectual equipment or training to get inside it. S/he therefore dons its outward form, its linguistic "noise", which shrouds an absence of any real philosophical engagement.
There are issues too regarding deliberate obfuscation, which I don't want to talk about for now, that relate to the role and place of the university in the digimodernist society. These are socio-economic and political questions, too.
Wednesday, 4 November 2009
Electric Literature’s mission is to use new media and innovative distribution to return the short story to a place of prominence in popular culture.
As A.O. Scott wrote recently in the New York Times: “The blog post and the tweet may be ephemeral... but the culture in which they thrive is fed by a craving for more narrative.”
Fiction transports us. It captures the experience of human consciousness like no other art form, revealing underlying truth and opening us to life’s possibilities. Like any creative act, writing fiction carries within it an implicit belief in the future. Electric Literature was created by people who believe in the future of writing.
We're tired of hearing that literary fiction is doomed. Everywhere we look, people are reading—whether it be paper books, eBooks, blogs, tweets, or text messages. So, before we write the epitaph for the literary age, we thought, let’s try it this way first: select stories with a strong voice that capture our readers and lead them somewhere exciting, unexpected, and meaningful. Publish everywhere, every way: paperbacks, Kindles, iPhones, eBooks, and audiobooks. Make it inexpensive and accessible. Streamline it: just five great stories in each issue. Be entertaining without sacrificing depth. In short, create the thing we wish existed.
Here's how our model works: To publish the paperback version of Electric Literature, we use print-on-demand; the eBook, Kindle, iPhone, and audio versions are digital. This eliminates our up-front printing bill. Rather than paying $5,000 to one printer, we pay $1,000 to five writers, ensuring that our writers are paid fairly. Our anthology is available anywhere in the world, overruns aren’t pulped, and our back issues are perpetually in print. We hope that this model can set a precedent: more access for readers, and fairness for writers.
Publishing is going through a revolution. There's opportunity and danger. The danger lies in ignoring or resisting the transformation in media. New platforms present an opportunity to adapt. We believe the short story is particularly well-suited to our hectic age, and certainly for digital devices. A quick, satisfying read can be welcome anywhere, and while you might forget a book, you’ll always have your phone.
To us, literature is what is important, not the medium. If eBooks, Kindles, or iPhone apps help literature survive, then we’re all for them.
People of our generation—with one foot in the past and one in the future—must make sure that the media gap is bridged in a way that preserves and honors literature. We don’t want to be sentimental old folks in a world where literary fiction is only read by an esoteric few.
Andy Hunter & Scott Lindenbaum
Editors
editors@electricliterature.com
Monday, 26 October 2009
An article in the Guardian on "twitterfiction", where authors experiment with narratives in 140 characters. I fear they haven't thought this through: Twitter isn't some OULIPO-esque kind of arbitrarily constrained writing (i.e. limited in scope) but another way of defining authorship itself, among much else. Still, it's interestingly digimodernist:
And Edexcel, the exam board, have sent out their new 2010 GCSE specifications, which include - in the field of "English" - three options: English; English Language and Literature; and English Studies: Digital Literacy. Tellingly, the digital now ranks twice as heavily as the literary. It is also, perplexingly, defined against the literary. Hm.
Monday, 19 October 2009
Friday, 9 October 2009
The first, and more internationally notorious, was the arrest of Roman Polanski in Switzerland on a charge of drugging and raping a thirteen-year-old girl in California in 1977. The judicial move, which occurred when Polanski had travelled to a film festival to pick up a lifetime achievement award, was instantly and roundly condemned by the French government: Frédéric Mitterrand, the Minister for Culture, described the arrest as “absolutely appalling”; Polanski had for thirty years been protected by the French state, and had been granted French citizenship. It was tempting at first to interpret this indignation as an expression of the fondly and widely held belief by which France, the “beacon of civilization and art” resists America, the “philistine and puritanical bully”; Polanski, then, would supposedly become the cultured and Gallicized martyr of the brutishly Yankee Satan. However, the French response was quickly echoed by an international battalion of filmmakers, many of them American, who signed petitions of protest calling for Polanski’s release. Polanski had, it is worth noting, already pleaded guilty to the crime, and had fled America before he could be sentenced and punished. Juridically, the nature of the offence and the extent of his guilt have never been disputed, least of all by the director himself.
It seems likely that this defence of Polanski – and indeed his protection since 1977 – is generated by the vestiges of a Romantic conception of the author or artist. The expressions of outrage repeatedly referred, for instance, to Polanski being a “great director”, even a “genius”; his “originality” and “daring” were evoked (Agnès Poirier even accused the US of never forgiving Polanski for his maverick tendencies when in Hollywood, as though the arrest were some bizarre form of long delayed film criticism). And yet these epithets do not stack up. The longevity of Polanski’s career is indeed remarkable: this is a man who made exceptional films both in the early 1960s and in the early 2000s; and so is its geographical scope, since he made enduring films in Poland, Britain, America and France. However, his forty-odd-year career does include about a quarter of a century during which he made nothing of artistic value and his continuing fame depended on his newsworthiness as a fugitive; and thematically his work, which returns endlessly to sexual torture and rape, is hardly separable from his queasy private life. And even his best films pale by comparison with those of his contemporaries and peers: Repulsion or Rosemary’s Baby or Chinatown are both conventional and second-rate when placed alongside the work of Losey, Coppola or Altman. In short, Polanski’s “greatness” appears to have been invented as a necessary element of the martyr narrative into which, under the aegis of a Romantic ideology, Polanski was plunged by his defenders. By the terms of this ideology – with Byron as an early example – the Artist is troubling, disturbing, unconventional, bohemian, he (probably he) breaks the rules, shocks the bourgeoisie, outrages the puritans, and produces dazzling works of breathtaking originality and greatness. His alcohol and drug-taking and illicit sex and weird dress are part of this story, as is his persecution by a hypocritical and brutish society. 
It seems evident that this prefabricated identity has been transferred on to Polanski: not only, then, is it no big deal that he raped a child (though it would be, were he not an Artist), but it guarantees the greatness of his Works (which cannot be located in his actual works) and the injustice of his prosecution (though this, save for procedural issues, has not been demonstrated).
Interestingly, the response in cyberspace was very different. Online polls and message boards in France and indeed worldwide rang with fury against the defenders of Polanski, and with calls for equality before the law. The Mitterrand/Poirier/Woody Allen position was revealed as narrowly based. It is clear that digimodernist authorship, which is multiple and anonymous, does not square at all with the Romantic image of the exceptional, suffering Genius. The French government soon retreated from its anger, while the Swiss tellingly refused Polanski bail. What the fall-out from this episode suggests is the obsolescence, beyond an institutionalized and self-interested elite, of a certain conception or ideology of the artist. Ministers and other creators may still afford it some credence, but in cyberspace the screams of the victim take precedence.
The second incident involved the removal by the British police, before the exhibition it was due to feature in had even opened, of Richard Prince’s Spiritual America from the walls of Tate Modern. Prince’s piece, which dates from the early 1980s (the heyday of formulations of postmodernism), reproduces and refracts a photograph taken of Brooke Shields for Playboy when she was ten years old: she is naked, wearing lipstick, and turning a “sensual” shoulder to the camera. In short, this is a work of art distancing itself from and commenting on but nonetheless reproducing a paedophilic photograph. The police seem to have found the element of the work contained in the last four words of my previous sentence decisive: their action was, in a sense, a work of art criticism. In defence of Prince’s work, one might argue politically, in libertarian or liberal manner, that the police have no right in a free society to decide what galleries may display. The legal retort to this is that the public display of an indecent (i.e. both nude and sexualized) image of an actual child appears to be a criminal act; morally, and in support of this, it must be noted that Shields had unsuccessfully fought as an adult to have the picture suppressed. More specifically, and in defence of Prince, a surprising number of commentators retreated to a decrepit model of authorial intent demolished (at the latest) by Roland Barthes in the late 1960s: that Prince meant the work as a socio-cultural comment, not as paedophilic titillation, so that must be what it really is. The notion that the meaning of a text is not contained in its author’s stated or imagined “intention” seemed to have passed such commentators by.
Nonetheless, the removal of the piece caused relatively little fuss. This stands in need of some explanation. My sense is that the art-critical scaffolding erected around the paedophilic photo in order to transform it into Prince’s comment on our sexualized culture no longer stands up. For, to justify or validate or explain Spiritual America it is to the discourse of postmodernism that we must turn: the piece is a cultural détournement or recuperation, it is meta-representation, an image of an image, an image about the making of images, it is depthless, affectless, a reflection on a media-saturated hyperreality where images refer only to other images and the “real” is dead (or her suit is dismissed), it is an ambivalent response to a culture of desire and representation and exploitation; it’s a simulacrum, an art of the exhaustion of art, a commodified artwork refracting a commodified photo, it’s the logic of Warhol’s Marilyn at its most extreme. One could go on and on. Defenders of Prince accused the police of philistinism: hadn’t they read Jameson or Baudrillard? Certainly they hadn’t, but the general sense seems to have been that all that theoretical apparatus, that barrage of abstract discourse which Prince relies on and adds to, is no longer interesting enough to redeem the public display of an undoubtedly exploitative and paedophilic photograph. In 2009, all one feels is that here is a vile image passed through and subjected to a certain art-critical discourse. But if the last ten words of my previous sentence no longer refer to something people care about, they fall away and leave only the nastiness of the image. Prince is not (one assumes) a paedophile and nor are (most of) the spectators of his work, but he is the postmodernist redeployer of paedophilia, and when “postmodernism” loses its currency, its potency and heft – as I suggest this episode shows it has – all that is left to the viewer is the paedophilia itself. 
For me this betrays the weakness of the piece: in contrast to Cindy Sherman’s Untitled Film Stills, which also invites, depends on and enriches a postmodernist discourse, Spiritual America does not walk artistically by itself.
So if the Romantic notion of the artist as shocking but all-justified genius no longer has general currency, neither does the postmodernist conception of the artist as the recycler of images from our commodified hyperreality. In each case the sexually assaulted child prevails. What, then, of the sensibility of the artist in the digimodernist age? It is socialized, not asocial; nor is it merely the creature of our continuing media excess. It moves between these two poles.
Thursday, 1 October 2009
Wednesday, 30 September 2009
Tuesday, 29 September 2009
Monday, 28 September 2009
I so dislike the word (widespread on the Internet and even consecrated by its own Wikipedia page) which the reviewer uses, with a question mark, as his title: "postpostmodernism". Very obviously, it's ugly as sin. Worse, it's highly misleading, since it implicitly defines what comes after postmodernism in terms determined by postmodernism, i.e. it reinforces the authority of the very thing whose overthrow it supposedly traces. "Postmodernism" sees itself in linear terms as that which comes after modernism, a contention which it assumes and never demonstrates, although various objections could be levelled at this piece of intellectual and cultural historiography: e.g. that something distinctive and important happened between high modernism (c. 1920) and high postmodernism (early 1970s on), or that philosophically postmodernism positions itself more as a form of counter-modernism, a naysayer, than as modernism's successor.
"Postpostmodernism" perpetuates this error by implying that we are due to have more of pretty much the same thing. As coincidence would have it, this unimaginativeness reflects the shortcomings of Bourriaud's theory, which simply prunes back and reissues postmodernism for the 21st century (though admittedly he is using some rather uninteresting work as the jumping-off point for his thoughts about contemporary art). "PoMo" insists by definition on coming-after, on its posteriority, its successor state; "PoPoMo" supposes that something will come after "PoMo" which insists on its doubly coming-after, its reiterated posteriority, its successor state to a successor state.
This is neither true nor plausible. Whatever the merits of the theory of digimodernism, the cultural dominant which succeeds postmodernism will stand by itself; it will be marked by a level of conceptual autonomy. Its definition will not be created, either directly or indirectly, under the aegis of the definition of postmodernism. Postmodernism, then, will really be over; it will be over when we no longer need its limits and tendencies to define what comes after it.
Friday, 11 September 2009
Thursday, 10 September 2009
If you disagree with a point in Po Bronson's new book about parenting, NurtureShock, then don't bother returning it or giving it a one-star review on Amazon: you can tell Bronson directly, thanks to an online experiment that will allow readers to add their own footnotes to the pages of a digital version of the book.
As of next week, readers of Bronson and Ashley Merryman's NurtureShock: New Thinking About Children, will be able to go online and make notes on three chapters of the book. Covering the topics of why 98% of children lie, why too much praise for children is a bad idea, and how important an extra hour of sleep is, the three chapters will be posted on PoBronson.com, Nurtureshock.com and Twelvebooks.com, where readers will be able to highlight sections of the text, and add their own footnotes to their selections.
"I'm interested in building community around books, facilitating discussion. This is an experiment to see what happens," said novelist and journalist Bronson, whose book, NurtureShock, was published in the US last week by Twelve, an imprint of Hachette Book Group USA, with UK publication lined up for next year. "Our book already has 70 pages of sources, and 7,000 words of footnotes, that we've put in there."
Caroline Vanderlip, chief executive of SharedBook, the American company enabling the exercise, agreed with Bronson. "We believe that the community can enrich the original, similar to how footnotes or marginalia have enriched books for years," she said. "The difference here is that it's collaborative annotation, rather than from one source."
Interested collaborators will then be able to buy a PDF of the three chapters complete with their new footnotes. "We think the level of comments could be as engaging as the original," said Vanderlip. "Because our system supports annotation in a very detailed, contextual way, we have found that users do not abuse the system. But we have the means to delete anything that might be offensive." Bronson said he saw the project as having a "'wisdom of crowds'/Wikipedia-like community moderation".
Philip Jones, managing editor of theBookseller.com, said that publishers were all looking at ways of making books "more communal". "It's the whole idea of having a conversation around a book, no longer reading in isolation and building a community of readers," he said. "[The Bronson experiment is] another innovation from publishers who are seeking ways to reach out to readers in the digital age. It works for Amazon who have created a whole new platform for getting feedback on books in their comments. There is nowhere else you can get that feedback, and I know authors use it."
At Penguin, digital publisher Jeremy Ettinghausen said that readers were increasingly "wanting to discuss and comment and tag things, and as an initiative which allows people to indulge that, this is welcome". "I'm looking forward to a version when people can read the same book at the same time and all comment together," he said. "We are always thinking about how we can develop communities around particular books or categories, and there will be a time when we'll be able to integrate those communities and conversations with content."
"Enhanced ebooks will almost certainly be the way forward, and as the quality of ereaders improves, there will be a multitude of ways in which we can do this," added Hodder & Stoughton's Isobel Akenhead, pointing to "director's cut" editions of books – with commentary from the author about why and how the text might have changed, as well as user commentaries, which she said would work particularly well for reference books such as recipe books.
Friday, 4 September 2009
'Digi-novel' combines book, movie and website
Wednesday, 2 September 2009
Is it a book? Is it a movie? Is it a website? Actually it's all three.
Anthony Zuiker, creator of the "CSI: Crime Scene Investigation" U.S. television series, is releasing what he calls a "digi-novel" combining all three media -- and giving a jolt to traditional book publishing.
Zuiker has created "Level 26," a crime novel that also invites readers to log on to a website about every 20 pages using a special code to watch a "cyber-bridge" -- a three-minute film clip tied to the story.
Starting next Tuesday, readers can buy the book, visit the website, log in to watch the "cyber-bridges," read, discuss and contribute to the story.
"Just doing one thing great is not going to sustain business," he said. "The future of business in terms of entertainment will have to be the convergence of different mediums. So we did that -- publishing, movies and a website."
He said he did not believe the digi-novel would ever replace traditional publishing, but said the business did need a shot in the arm.
"They need content creators like myself to come in the industry and say, 'Hey, let's try things this way,'" he said.
Zuiker put together a 60-page outline for the novel, which was written by Duane Swierczynski, and wrote and directed the "cyber-bridges." He said the book could be read without watching the "cyber-bridges."
Zuiker said the United States was infatuated with technology and it had become such a permanent part of people's lives that more entertainment choices were needed.
Increasingly, people are reading books on electronic readers like Amazon.com's Kindle and Sony Corp's Reader.
Those devices don't play videos, so "Level 26" readers still need to log on to the Internet on a different device. Apple Inc is said to be developing a touchscreen tablet, which some analysts envision as a multimedia device that could play videos.
Zuiker said people's attention span was becoming shorter and shorter and that it was important to give people more options on how they consumed entertainment and books.
"Every TV show in the next five, 10 years will have a comprehensive microsite or website that continue the experience beyond the one-hour television to keep engaging viewers 24/7," he said. "Just watching television for one specific hour a week ... that's not going to be a sustainable model going forward."
"I wanted to bring all the best in publishing, in a motion picture, in a website and converge all three into one experience," he said.
"And when the book finished and the bridges finished, I wanted the experience to continue online and in a social community."
Zuiker said he came up with the idea for the "digi-novel" during a three-month TV writers strike in 2007/08.
Tuesday, 18 August 2009
"Clarice Garcia’s compilation was built around the rebel manifesto that post-modernism is dead and duly deconstructed casual day dresses into irregular blocks of carnation, tangerine and orchid."
Tuesday, 11 August 2009
At its simplest, digimodernism is the name I give to the cultural-dominant which has emerged since the second half of the 1990s in the wake of the exhaustion of postmodernism. It denotes a prevailing cultural paradigm, what Fredric Jameson called “a dominant cultural logic or hegemonic norm… the force field in which very different kinds of cultural impulses… must make their way”. In this sense it is postmodernism’s successor, although cultural history cannot, of course, be cleanly divided into watertight compartments: it is strongly inflected, in its contemporary form, by postmodernist residues, especially some of its habits of thought.
More precisely, digimodernism is the name I give to the cultural impact of computerization. It denotes the point at which digitization intersects with cultural and artistic forms. Most recognizably, this leads to a new form of text with its own peculiar characteristics (evanescence, onwardness, haphazardness, fluid-boundedness, etc.). But there are wider implications which make digimodernism, though easy to sum up in a misleadingly quick slogan, a disparate and complex phenomenon. Digimodernism is the label under which I trace the textual, cultural and artistic ripples which spread out from the explosion of digitization. Under its sign, I seek patterns in the most significant cultural shifts of the last decade or so, in such a way as to have predictive value: recently phenomena such as Nicolas Bourriaud’s Altermodern exhibition at Tate Britain and Anthony Gormley’s Fourth Plinth in Trafalgar Square have confirmed its outline of our cultural present.
However, digimodernism differs from terms which superficially resemble it, like “modernism” and “postmodernism”, or even “Romanticism”, in two crucial ways. First, it does not clearly refer to a privileged quantity of artistic content or set of artistic styles for creators to select from, mould and transform. It is not primarily an aesthetic given, at least not yet. Secondly, it is not found by merely gathering together the work of the era’s most innovative or intelligent artists and reading off what they have in common, as Bourriaud seeks to do. Both of these cultural-historical traditions are mired in assumptions about the avant-garde and the historical linearity of art which strike me as outdated. Digimodernism is not automatically an aesthetic achievement; to a degree it is characterized by a certain value-neutrality, evinced by the potential which opens before each participant as s/he steps on to Gormley’s plinth.
2.) Do you believe that digimodernism has come about as a result of the fragmentation and break-up of the grand narratives of postmodernism? Would you say that digimodernism is part of postmodernism or comes after postmodernism?
I think digimodernism’s origins lie in innovations in computer technology, which are in the throes of revolutionizing every inherited dimension of the text: its authorship, reception, material form, boundedness, economics, and so on. These upheavals entrain and are paralleled by a raft of cultural and social shifts. They are, I think, inimical to a postmodernism formulated well over a quarter of a century ago now, though it does not suddenly invalidate postmodernism to say that its moment has passed. The superannuation of postmodernism was noted in 2002-03, long before digimodernism became visible, by Linda Hutcheon and Ernst Breisach, among others. It is simplest to say that digimodernism succeeds postmodernism, because the former’s vitality is simultaneous with and to a degree reliant on the latter’s exhaustion. But as both terms are complex and multifaceted, so is the historical relationship between them.
3.) Do you think that twittering and blogging help to create a pluralist society and help to break up violent thinking/just one media voice?
4.) Do you think that Twittering and blogging etc fragments or unifies us as a society?
Remembering how Goethe, the apostle of Enlightenment, died with the phrase “More light!” on his lips, Steven Connor has some fun imagining his postmodernist equivalent departing this world with the cry: “More voices!” Postmodernism valorized the project of moving previously silenced or marginalized voices (women, “colonials”, etc.) to the cultural centre, immeasurably and irreversibly enriching our sense of cultural history. Such a project inevitably destabilized certain entrenched cultural and social power formations, and there is no reason to believe it is finished. In consequence, there is an impulse to welcome blogs and Twitter. Quantitatively, they dramatically increase the number of people who write for publication and for an audience potentially global in scope and enduring in time. Indeed, all the platforms of Web 2.0 drive up the number and broaden the range of articulating voices in a way which postmodernism has taught us to see as inherently pluralist, emancipating, and transgressive.
The most obvious retort to such a view in this context is to point to the mind-numbing banality or the savage viciousness of much that actually appears on Web 2.0. The cultural empowerment of an ever-wider cross-section of the public runs up against its educational and social failings. More interestingly, it is the very digimodernist textuality of these platforms that predisposes them to these faults: it is their evanescence that breeds a tendency to triviality, their anonymity (or pseudonymity) that paves the way for aggressiveness. The postmodern project could not foresee this. The blending together in one space of such a vast number and wide range of voices seems unifying in effect, the renewal of pluralist democracy through a sort of electronic town hall or a challenge to the corporate control of the media. Yet social interaction presupposes a physical proximity that Web 2.0, which aggregates in large cyber-groups what are socially tiny numbers of people from an infinitely large and dispersed number of places, militates against. Moreover, I cannot see how, in ordinary times, a platform as evanescent as Twitter can solder a society together: a formed society rests upon a reasonably stabilized textuality such as books or films allow, enduring over time so that it can be shared and passed on.
The flipside of this, however, is that in exceptional circumstances, when a society is being re-formed through war or political crisis or collective dissent, the haphazardness, onwardness and evanescence of the digimodernist text are ideally suited to the dissemination of information on a wide scale which is intended as the basis for action. Many instances of this – the Iraq war, the Iranian elections, the 1st April demonstrations – can be given. Official information conduits are then bypassed and citizenship enhanced. Textual unformedness here goes hand in hand with socio-political uncertainty.
5.) In one of your blog entries you quote Charles Arthur as saying that 95% of existing blogs on the Web are abandoned and that the bloggers have moved on to Facebook and Twitter. Do you agree with this?
I have no empirical data on either the numbers or the motivations of people who abandon blogs. In my book I talk about the characteristic evanescence of textuality under digimodernism at the level of the individual creation, and Arthur made me wonder whether this might be extended to digimodernist platforms themselves, though logically there must be some limit to this. Twitter and indeed Spotify came to prominence after I had finished the book, and their emergence evinces the continuing dynamism of digital textual innovation. But the revolutions of digitization will outlast the social excitement they may elicit.
6.) In another of your blog entries you talk about Nick Cohen from the Observer as saying that professional journalists ‘look as doomed as blacksmiths in the age of the combustion engine’ due to Web writing. Do you think that journalism is being overturned by digitization? What do you mean when you talk about digimodernist novels and poems?
Contrary to some apocalypticists, I suspect that journalism will be turned inside out by digitization but not destroyed. There is a social demand for reliable sources of information about the outside world which long predated print and will outlast it. The contemporary challenge, and hardly an insurmountable one, is to monetize digital journalism. We can only speculate about what journalism will look like in twenty years’ time, but it may be that it will retrench at the level of the national/international and the weekly/monthly, abandoning forever the local and the daily. When speed and proximity are more important than breadth or depth, the digital and amateur will beat the professional and print; but also vice versa.
As for digimodernist literature, I’m not sure that this exists yet, though I can think of some proto-digimodernist works such as B. S. Johnson’s The Unfortunates. Nevertheless, there is evidence of a move beyond postmodernism in contemporary literature. Digitization has transformed the shape and status of many kinds of written or printed text, sweeping away or radically revamping such ancient modes as the diary, the cheque, the map, the newspaper, and the letter. One assumes that the highest form of writing, literature, will eventually be engulfed too by this wave. Already the authorship, production and reception of literature and books are being revolutionized by computerization; their content and style will surely follow.
Wednesday, 5 August 2009
Fredric Jameson famously identified the nostalgia film as one of the central instances of 1970s-80s postmodernism. In a world where “history”, or the sense of the past feeding into the present in a continuous cycle, is lost, the past can only be evoked as something fossilized, stylized, and mourned: as frozen in aspic, transformed into fashion, and suffused with melancholic longing for what is now irretrievable. Desperate Romantics, on the other hand, could scarcely be more different in its approach to the past. It’s self-consciously tongue-in-cheek, as its joky title and its nod to the series Desperate Housewives attest; a disclaimer at the start of each episode warns us that certain fanciful liberties have been taken with the historical record. But inaccuracy is not the issue here.
In short, Desperate Romantics recreates the 1850s as the 2000s in vintage clothing. As Rossetti, Millais, and Hunt stride heartily along London streets with their long hair flowing and their youthful eyes ablaze, they do look, as one reviewer commented, like a contemporary boy band about to burst into song. But whereas postmodernism might have richly played past and present off each other, as Blackadder or Back to the Future did, Desperate Romantics swamps its nominal past with the actual present. The cast move and talk like present-day Oxbridge graduates dressed in old-style clothes; no attempt is made to mimic the stiffness or formality portrayed in Victorian novels. The average viewer is given the impression that the painters were no more interested in or informed about art history and literature than s/he is. Their speech foregrounds present-day sexual frankness: they openly discuss their “virginity”, Effie Ruskin casually reminds her husband of when he “cupped my breast” – genteel characters have an easy sexual discourse that in 1850s’ England would only have been voiced by a prostitute. In a reversal of actual dominant ideology, Victorian repression is depicted as peripheral or as a joke: Tom Hollander’s Ruskin is uptight and anguished, but also ludicrous and marginal. The implication, as conceited as it is historically untrue, is that interesting and worthwhile people in the past were tolerant (open to other classes, genders, races), free (in sex and discourse), and indistinguishable from ourselves. Anyone else is comic relief.
Similarly, in Life on Mars a 2006 policeman travelled back to 1973 to discover that he was more knowledgeable (he knew everything they knew, but they didn’t know, for instance, that Britain would soon have a woman Prime Minister), more tolerant (towards women and ethnic minorities), and less technologically advanced (in forensic science) than his parents’ generation. They and their world are uglier, their food is worse, and so on. This assumption of unearned temporal superiority is partly explained as a product of the brain of a particularly self-confident individual lying in a coma; and though it cannot be articulated, the lost qualities of 1973 are finally inchoately felt in the show’s conclusion. On the whole, the present strides through Life on Mars’s 1973 like a messiah of knowledge, tolerance, and taste come to redeem the benighted heathen.
Some of the superiority of the present day here is well founded, of course, especially the advances in forensics and equality. Moreover, it is as long-standing a human trait to feel that one’s generation is better than its predecessors as it is to imagine one’s culture better than foreign ones. Since the early 19th century people have complacently enjoyed the myth that all pre-Columbian Europeans believed the earth was flat: if humans like to construct other societies as “backward”, they relish setting their invidious constructions in distant times as well as in remote lands. Life on Mars’s temporal superiority complex becomes limiting and unsatisfactory, while Desperate Romantics – which would like to see itself as a “romp” – displays a general indifference to the pastness of the past.
Essentially, it assumes that if 1850s Victorians are not like us, they are of no value or interest – they are, like Ruskin, cartoonish, grotesque, screwed-up. They need people like us to come among them and save them – real people, good people, normal people. This missionary premise was memorably dramatized as long ago as 1998 by the film Pleasantville, where the present day magically invests the 1950s with sexual fulfilment, personal freedom, and racial and gender equality. Pleasantville is closer to postmodernism in its treatment of the past, but the move beyond nostalgia, beyond fossilization and mourning, was already apparent. The present is here become arrogant, imperialistic, totalizing, and deluded: be as us, it proclaims, or be wrong, stupid, dull, unhappy or wicked. Such films and TV series are, then, morality plays in which, by living now, we are guaranteed to be the goodies: it is time that tells.
Monday, 29 June 2009
Aleks Krotoski noted that sex is absolutely excluded from videogames in any form other than visual titillation. Yes, there are heroines with bulging tight tops to gawp at. But actual sex as a practice is rigorously excluded. You can adopt a self that blows someone's head off, fine. But you can't adopt a game self that has intercourse. Why? This is a very interesting question. Every other textual form - the novel, theatre, film, TV etc etc - has found a place for sexual activity and tended to sell better when that place was a prominent one. Why not videogames?
I suspect the answer is, in short, that videogames are a different (digimodernist) species of textuality. The inherited rules of games don't apply to them; they're not commensurable with what's gone before them; their exceptionalism takes us into uncharted reaches.
Charles Arthur, on the other hand, argued that the long tail of blogging is over. Yes, people still blog and read blogs, but he quoted research saying that 95% of existing blogs on the Web are in effect abandoned. Where have all the bloggers gone? To Facebook and Twitter, he thinks.
The digimodernist text is clearly subject to its own evanescence, and this operates, it would seem, at the level of the platform as well as of the individual text. There is a powerful wave of fascination with computerized text around now, and it tends to surge towards the newest thing, leaving older models behind.
Thursday, 25 June 2009
"As a central part of the art manifesto aspect of this chapter I’m asserting that it is time to go beyond postmodernism. Like all waves of philosophical skepticism, postmodernism taken to its ultimate conclusion leads to an intellectual and existential dead-end. And, indeed, even in the arts and humanities there is a vague sense that postmodernism has been “played out.”"
Friday, 12 June 2009
A shame, perhaps, that it wasn't digimodernism (!), but otherwise it's entirely fitting that a concept so close to the heart of our new cultural and textual dominant should be given such a symbolic status. There's a whole chapter about some of the modes of Web 2.0 in the book.
Except, of course, "Web 2.0" is not a word... Never mind. When the symbol and the facts diverge, print the symbol!
Friday, 5 June 2009
Many people pointed out a while ago that Lost resembles the 1960s TV drama The Prisoner, and I think a comparison of the two is interesting. Both concern the sudden and violent arrival, against their will, of English-speaking individuals in a beautiful and sinister, exotic-looking location. One is kidnapped, the others fall from the sky in a plane crash. Where they are, they don't really know - it's a major enigma and plot driver - but it seems to combine the idyllic, the touristically delightful, and the dangerous, the menacing. Stranded and striving to survive, these people have secrets which they may or may not want to reveal; they are opaque to each other and to the viewer. The attraction of both series is the unresolved nature of their premises - who these people really are, where they are, and why, are all nebulous, such that viewers become simultaneously hooked and frustrated, eager for explanations and ready to advance their own. And so a "cult" series is created, in the sense that its fans become ravenously keen to know more about it.
And yet there are differences between the shows which suggest to me the passage of cultural time. One key variation reflects the shift from "popular culture" to a dominant "children's entertainment" paradigm which is consonant both with the fulfilment of postmodernism and its superannuation. The Prisoner was a mature man, a worker, sober, serious and inscrutable; most of the Lost cast are very young, single, non-employed, and childless adults who dress and act for the most part in order that young teenagers can identify with them. I've never been on a flight with fewer couples, kids, or over-thirties on board! They all look 23 going on 18. Moreover, they're "young" in a teenage way - as cool dudes and slackers, geeks and rebellious babes. They wear scratches from the crash like make-up, like members of some hip subculture.
They're lightly unattached, sexually attractive, and given to fashionably unexamined attitudes about gender and race.
Much has been lost by Lost. The geopolitical, ideological, technological and indeed political aspects of The Prisoner have been evacuated by this dash for a young teenage audience. No knowledge of or interest in the actual existing world is assumed by Lost's makers. Equally missing is any philosophical enquiry, however allusive and finally unsatisfying the one sprinkled through The Prisoner may have been. Consequently the stories oscillate between the soapily quotidian (the backstories, the character interplay) and the mystically sinister, between the natural and the ineffable, squeezing out the middle ground, the social, on the way. It's a Prisoner for a consumerist, post-political age, and just as it's not really "popular" (it has no desire to please the older half of the population), it's not really "cultural" either - not really significant or resonant beyond itself.
Narratologically, too, the episodes are rather shapeless compared to those of The Prisoner. The show seems predicated on its own apparent endlessness: just piling enigmas and mysteries and riddles on to each other, just ceaselessly adding puzzles and their solutions, the show is set up to go on, in principle, forever. It moves forward episode by episode, but in no very palpable direction. Many of the most interesting new TV dramas made over the last decade grow from episode to episode and contain no internal mechanism by which they could end themselves, so that they continue until halted by a TV executive (rather than until they conclude their own business). Lost is an obvious example of this. Personally, I find it rather ridiculous, though enjoyable: it seems so nakedly arranged to be "intriguing" and "addictive", to draw its viewers along a wild goose chase-cum-treasure hunt, rather like Life on Mars and The Da Vinci Code, with no intention of ever repaying the fascination it elicits. Such stories, with their evocation of a problematic and opaque reality system, owe much to the strand of postmodernist fiction that destabilized their characters' sense of the real, from The Crying of Lot 49 to The Matrix and from Money to The Man in the High Castle. But Lost isn't interested in ideas related to the "fabrication of the real". It's like an after-echo of such fictions, rendered shallow and subjugated entirely to its prevailing endless narrative structure.