“Metamodernism is not so much a philosophy […] as it is an attempt at a sort of vernacular, or . . . a sort of open source document, that might contextualize and explain what is going on around us, in political economy as much as in the arts.” – Timotheus Vermeulen
The internet has radically reframed the art of criticism. This is obvious, but I don't think we quite realize the extent to which the game has been changed. I don't mean criticism in the academic sense – I only mean the everyday definition of a critic as "someone whose job is to write about art and pop culture events." The internet has made criticism – and the objects of criticism – more accessible than anyone ever thought possible. This unimaginable access arrived faster than anyone could have imagined – and the critical apparatus has, for the most part, failed to keep up.
I don’t actually know what I’m talking about here and I’m about to make some bold, unproven claims, but this will ultimately prove to be in the spirit of everything, so let’s go.
Criticism began in a time when art was limited to events – cultural happenings that occurred a limited number of times in a limited number of places in front of a limited audience. A performance of a Stravinsky ballet happened only so many times, and in all likelihood you were too poor to be there for it. Critics existed to tell you about what you couldn't see but nevertheless ought to care about. As a discipline it was essentially the same as journalism – just focused on material that was more obviously subject to opinion.
The rise of mass media shifted the focus of criticism from the after to the before. Instead of describing something that rich people had already experienced, critics were impelled to describe things that anyone could hypothetically pay money to experience themselves. The job was no longer to assess a cultural entity for history’s sake, but to describe it and assess its worth as a personal investment. Carl Van Vechten told you that the symphony you missed was a major historical event – Roger Ebert told you that the movie about to be released was a major historical event and that you could be a part of it. This was – and still is – an exciting prospect for many people, and the consumption of criticism became a global obsession as a result.
The pre-internet mass media critic was held to two essential standards. First, he had to present his facts accurately. He was a journalist, and he was held, above all else, to objective honesty. Second, he had to present a thoughtful and informed assessment. The critic was paid because he knew more about art than anyone else. He knew what he was seeing, and he had the breadth of knowledge and the strength of vision to pick out what was good and what was bad. He was the only person who could afford to see every single movie – his purpose in life was to use that luxury to let you know how best to spend your money.
These are the axes of cultural criticism, and they held true for nearly 100 years. This is the school that we grew up with, and with a few notable exceptions this is the school that predominates today. For the entirety of the 20th century, in a world still defined by intellectual limits, it worked. It was the only thing that made sense. It was completely unprepared for the expansion of the internet.
The internet provided a series of previously unimaginable innovations that radically altered the critic's relationship with the object of criticism, and, as a result, the critic's relationship with his reader. Some of the most relevant allowances are –
1. CRITICAL COMMENTARY ON EVERYTHING – not just the important events. Paper limited criticism by space and by payroll – there were a limited number of critics and a limited number of events that could fit into the limited space provided for their opinions. Everything still happened – it just didn't all get covered. Now there is coverage in some form of nearly everything that happens on any given day (provided by essentially free labor), and anyone can learn about any particular event if they try hard enough.
2. WIDER DIVERSITY OF ACCESSIBLE OPINION – Before the internet, one had access to only a few critical opinions at a time. You subscribed to your local paper, several magazines, maybe the New York Times, and you could watch TV. A major film might be discussed in each of them – a less major film might only be discussed in one or two – an independent production would maybe be discussed in a one-paragraph blurb in the back of the local A&E section. Now anyone with a viable internet connection has access to everything anyone writes about anything. A major film has 100s of professional reviews, 1,000s of amateur reviews, and untold millions of half-sentence social media reviews.
3. UNIVERSAL ACCESS TO THE PRESENT – Before mass media, only the privileged witnessed culture. In the pre-internet mass media age, it was witnessed only by those who paid. Now, anyone can witness anything – the only restrictions are time and morality. Even private screenings and live performances – once the epitome of limited cultural experience – are now subject to instantly shared amateur filming. You might not actually be at the concert, but you can easily get a relatively solid understanding of what happened.
4. INCREASING ALL-COMPREHENSIVE ACCESS TO HISTORY – This is maybe the most important. The critic's greatest hold over the masses was once upon a time his vast knowledge of seemingly everything. Now, everyone has seeming knowledge of seemingly everything. A critic used to impress readers by remembering the uncredited onscreen debut of a now leading actor. Now I know that Keanu Reeves's feature film debut was in the 1986 hockey film "Youngblood," a fact that I was not aware of two minutes ago but that I can now state with as much authority as A. O. Scott.
I don't think I've said anything remotely surprising here, but I think it's necessary to lay out the obvious facts in order to make it apparent how mixed up we are at this moment in critical history. The typical critic continues to behave as if it is his job to describe to the masses exactly what the masses are supposed to be witnessing, and he is confused when suddenly the masses are less willing to pay him money for it. At the same time, the masses still expect their critics to tell them something they didn't already know, without making the glaringly apparent connection that they already know pretty much everything.
* * *
Late last year I wrote a review of a Sturgill Simpson concert for my local weekly. Sturgill Simpson is a country musician whose music reflects his comprehensive knowledge of country music – I listen to “Red Solo Cup” sometimes when I’m drunk. It was relatively easy for me, though, to burn through Simpson’s catalog, study up on his influences, perform some Google searches of his lyrics and interviews, and write a review that placed his performance in a relatively accurate spot in country music history. It’s not bullshitting – it’s learning things very quickly and writing them as if I’ve known what I’m talking about for longer than I actually have. Once you realize that this is what critics are doing 90% of the time you start to discover that you’re nowhere near as dumb as you think you are when you read an Ian Cohen BNT.
A comment on my review noted with considerable disdain that I had incorrectly attributed a cover of "I Never Go Around Mirrors" to Keith Whitley. This was an accurate criticism – it was originally sung by Lefty Frizzell – and I was adequately embarrassed (thank you and touché, jenningstim6). But I was mostly frustrated by the circumstances that impelled my new enemy to post a mean comment – there was nothing I could do to improve my article in order to make it a better document, and there was nothing my enemy could do to use his clearly superior knowledge for anything other than petty vanity. Everyone looked bad.
The reason my enemy was justly angry was that he felt as if an inaccurate opinion was being forced down his throat as ordained fact (also, he probably assumed that I was getting paid a handsome salary for half-assed research – sorry bro but – :/). He had the simultaneous experience of knowing that a simple Google search could have corrected my error, but also the weird sense of intellectual inferiority that print (digital or otherwise) always inflicts on the reader. That I had written the review meant that someone – especially me – believed it to be true. That I was wrong meant that some injustice had been done – especially to him. The situation ends in anger – him feeling that I can't do my job, me feeling annoyed for not doing my job, no one allowing the opportunity for me, in this particular instance, to do my job better.
In a better world, my enemy could have posted a comment saying "Lefty Frizzell wrote 'I Never Go Around Mirrors,'" I could respond "thanks," and then I could edit my article to correct the mistake, leaving our exchange at the bottom of the article as an implicit citation (this is how it's done on Gawker sites, where commenters are admirably listed as contributors in the byline and the comment sections are often just as interesting as the article itself).
In an even better world, though, the review I wrote would barely even have existed. I wouldn't have had to pretend I knew anything about country music, and could have freely admitted that my attempt to estimate Sturgill Simpson's place in country's canon could be bested by anyone who just spent an hour more reading into it than I did. I wouldn't have had to waste time with comparisons to other artists because anyone can see just as well as me that Sturgill Simpson is like some things and unlike some others, and that his relative likeness lies entirely in the hands of subjective experience. I wouldn't have had to describe the events of the show because there was already, by the time of my writing, a comprehensive body of cellphone footage. The facts of the matter could have been left up to crowdsourcing, with an editor playing MC as a horde of interested fans called out inaccuracies and added new facts. The only thing I would have needed to contribute would be the particularities that only I could see based on my critical creativity. I wouldn't be held accountable for things that everyone can know just as well as me – I would only be responsible for the things that only I, as critic, was able to contribute.
Obviously this is utopian and far more participatory than anyone has time or energy for. The critic’s job, despite the complications, hasn’t changed. He still has to sum up what’s happened, describe what’s happening, and give his readers enough information to decide whether they consider that happening worthy of their time. But for the critic to be held to a higher standard of authority in a matter of easily proven facts is, in a time of infinitely free information, ridiculous. The critic can no longer pretend that he is above anyone else for his knowledge. And his readers can’t pretend to hold him to that standard when their knowledge is so very equal, if not superior, to his own.
I think that the rules of criticism need to be changed to reflect our changing relationship with information and objectivity. We're holding critics to century-old standards. It's time that we moved forward and rewrote the job description. Here's an initial attempt.
1. The institution of critical finality needs to end. Everything can be edited, and – when proven to be incorrect – should be edited. The fact of the editing can be left behind for honesty's sake (Pitchfork's practice of putting strikethroughs over edited sections is one of its more admirable traits). We can't continue to treat the critic's opinion as if it is final – if we do we do a disservice to his readers, and we force him to do something that he cannot and never will be able to do.
2. We need to stop holding the critic accountable for facts. Facts account for themselves, and they are always right there. But sometimes critics fuck them up, especially when they’re scrambling to keep up with the impossibly fast and absurdly comprehensive pace of the internet. Once upon a time, a critical mistake amounted to an uncorrectable un-truth that could result in real harm – effectively, a lie. Now there is an army of fact checkers, the leader of which is, first and foremost, you. Facts are for print journalism. There is no reason to trust a critic’s facts – their only purpose is to point you in the direction of the truth.
3. We need to embrace hyperlinks. We already do, but we need to fully appreciate what an incredible innovation they are, and the stunning enormity of their potential. Hyperlinks used to exist to back up an argument – now they can become the argument’s very foundation.
4. Critics need to stop pretending that their knowledge is actually superior to anyone else’s. When a critic acts as if he has a comprehensive knowledge of something that happened ten minutes ago, stupid/self-conscious people feel like they’re worthless and smart people make the correct assumption that the critic is full of shit. The critical voice needs to shift from the position of an expert to the perspective of a thoughtful and equally informed observer (unless the critic is an expert – in which case he has to be prepared to go toe-to-toe with other experts, and to take down those who claim to be experts but are clearly not despite their comprehensive arsenal).
5. Except in the realm of summary (which is, above all else, helpful) critical focus needs to shift away from objective truth – but without spiraling into complete subjective opinion. No one knows objective truth – everyone has a subjective opinion. Estimations of both these things are abundantly present on the internet, and there is no need for someone who calls himself a critic to contribute to the noise that is adequately covered by Yahoo Answers and Facebook notes. The critic's focus needs to be on some unique glory of the object of criticism that is apparent to him and him alone. He has to discover the object's world as his personal subjectivity experiences it, and he needs to share that world with a readership that has very likely experienced more worlds in the object than he could ever imagine. It's worthless for him to determine whether the object is good or bad – it's worthless for him to determine what it is, or where it exists in a historical or cultural spectrum. The object is both good and bad, and it exists in too many places in too many spectrums for the critic to be even remotely aware of its infinite potentials. His job is to find one of those potentials and present it in a way that will get someone else excited about what he is seeing. A conservative morality tale in "Girls" – a comprehensive ontology of animal v. machine warfare in Pixar films. He has to give up any conception of knowing what he's talking about and dive into the world of the object, pulling out what's shiny and excellent to him and then capturing that shine in a way that will convince someone else that the experience is worth their time and energy.
I think this is what you can call a metamodern critical perspective, and I think it's essential to pick it up if we want criticism to survive in a world that knows increasingly far too much for its own good. I don't think this is a new method of criticism, and I think that it's been slowly developed by the ascending digital media groups over the past ten years. Gawker Media, despite its evils, has allowed this critical methodology to grow almost to the point of signature, and Caity Weaver is maybe our generation's first great metamodern critic. Vox Media's flagship site, though still finding its footing, is another step in the right direction, posting news items on a wide variety of topics for the purpose of continuous growth and editing, allowing a news story to do what news stories do – develop. And there's also a whole new school of critical theory – "metamodernism" itself – that's attempting to figure all of this out. We're still not there in terms of cultural consciousness – the uglier side of any comment section of a critical piece paired with the smug in-the-know attitude that defines internet journalism will make this clear in about two seconds – but we're getting there.
This is the future, it is better than the past, and traditional media outlets should probably adapt. Criticism isn't what it used to be, and that begins first and foremost with the critic's fall from grace. All-knowing objectivity and transcendent knowledge are chill and all – but reality is pretty cool too.