On August 23rd, 2005, Mark Zuckerberg purchased the domain name “facebook.com” for $200,000. That same week, I was moving into a dorm room at Cornell University, to begin four years of mostly feeling uncomfortable. Zuckerberg’s new social networking site had spent a year at “thefacebook.com,” but the company was now taking its first steps toward world domination under the leadership of Sean Parker, founder of Napster. I was just trying to make friends.
There is a routine you follow when you meet other college students, one that Facebook was beginning to turn into a reservoir of hard data. What’s your name? Where are you from? What’s your major? After that fundamental formula, follow-up questions varied. But there’s a classic that was of particular interest to me. As a teenager, my only interest in life had been music, and my main goal for college was to start a band. I had decided to be an English major because Lou Reed had been one, just a bit further upstate at Syracuse University. So there was a special urgency to asking, or being asked, “What kind of music do you like?”
There is an answer to this question that is also a classic: one I was to hear repeatedly in the fall of 2005, and see under the heading “Favorite Music” on countless Facebook profiles. “I like everything,” it goes, “except rap and country.”
There are two implications in this response, one obvious and one unspoken. The obvious part is the negative. I do not like rap and country, my fellow young Ivy Leaguers told me, standing there in their boat shoes. Rap and country were not the resident musical forms of the halls of academe. They had not emerged from the streets where my classmates grew up, which was almost invariably someplace called “Westchester.” Rap and country were the music of the unwashed masses.
The unspoken part of that response is its affirmative aspect. With two exceptions, I like everything — everything else. But “everything” is hard to define. Luckily, if you were an incoming freshman in 2005 who put “everything but rap and country” on your Facebook profile, there was another website growing in popularity that year that covered everything: Pitchfork Media.
I’m being unfair. Pitchfork did cover rap once in a while, but that was new at the time. The website had been founded in 1995 by Ryan Schreiber, while he was still a teenager living at his parents’ house. In an earlier time, its lack of a print edition might have affected its credibility, but Pitchfork’s audience was the first generation not to care. For its first decade, Pitchfork’s content was oriented mostly towards expressing Schreiber’s taste in music. Coverage skewed heavily towards white musicians, preferably ones with guitars. (This is often called “indie rock.”) Albums were given alarmingly precise ratings, with their infamous decimal points.
The earliest regular column of rock criticism, by Robert Christgau in the Village Voice, was called the “Consumer Guide.” But Pitchfork’s audience didn’t buy music, at least not anymore. The internet had made it possible to hear a new album immediately, so it was less important for a review to describe what it sounded like. Instead, you would read a Pitchfork review with your opinion already formed. The review functioned as an expression of a consensus — the common sense that prevailed among your social set. You could then measure your personal opinion against it. Rather than a consumer guide, Pitchfork was more like a lifestyle guide: a musical Michelin or Zagat for the young urban student or professional navigating the contemporary cultural landscape.
Like Zuckerberg, Schreiber had stumbled into something bigger than he could have imagined, also catching a wave that had begun with Napster. Pitchfork reviewers were the organic intellectuals of the upwardly mobile millennial class. As Richard Beck put it in a history of Pitchfork at n+1, “no genre’s fans are more vulnerable to music criticism than the educated, culturally anxious young people who pay close attention to indie rock.”
It would take another two years before Pitchfork was able to move from pitchforkmedia.com to pitchfork.com, having finally accumulated enough profit to purchase the URL from the livestock company that owned it. By that time, it was as ubiquitous on college campuses as Facebook.
In those years, Pitchfork’s coverage of rap was controversial. It was a deviation from “everything.” The definition of “everything” may not have been articulated, but it was no less the marking of a boundary. “The world is everything that is the case,” says Proposition 1 of Wittgenstein’s Tractatus Logico-Philosophicus. When we speak of “everything,” we are building a world. Where does that leave the exceptions?
In 2004, Kelefa Sanneh wrote an influential essay for the New York Times called “The Rap Against Rockism.” Sanneh issued a righteous challenge to the centrality of rock in the critical canon, on behalf of music made under different modes of production — rap, yes, but also country. It took years for his point to become absorbed, and today, a reductive version of it called “poptimism” is all but critical orthodoxy. In the mid-2000s, though, it was still under debate, and not all Pitchfork readers wanted to see reviews of rap records.
What Pitchfork was doing was not unlike what my English professors were doing: building a canon. Canonization is a kind of sacrament, originally referring to the selection of saints by the Christian church. Literary scholar Raymond Williams defined the creation of a cultural canon as a “selective tradition.” In its construction, he wrote, “a version of the past is used to ratify the present and to indicate directions for the future.”
A canon is formed by something called “taste,” and then, in a strange loop, the canon goes on to determine what taste is. “Taste” is a supple concept. We say it’s a matter of taste to express a kind of laissez-faire cultural relativism — everybody makes their own choices. But there is also the common sense of good taste, a universal etiquette we recognize even when we transgress it, with misbehavior or kitsch. At the same time, there’s no accounting for taste, and people’s preferences are not necessarily under their control. The privileged knowledge of good taste is what sociologist Pierre Bourdieu famously named “cultural capital.” It has a social function, both as a shared set of references and as a mechanism for gatekeeping. A canon is constituted not only by what it includes, but what it leaves out.
Pitchfork’s most recent act of canonization is a list of The 200 Best Albums of the 1980s. In part, this list is meant to address the acknowledged limitations of the site’s scope from around the time I started reading it. Pitchfork had already published a list of The Top 100 Albums of the 1980s in 2002, since scrubbed from the internet. According to the new list’s introduction, the old list “represented a limited editorial stance we have worked hard to move past; its lack of diversity, both in album selections and contributing critics, does not represent the voice Pitchfork has become.”
This is the result of an ongoing pattern of self-criticism. Unlike Facebook, Pitchfork has shown a willingness to evaluate itself and adjust — today, much of the best writing on music, of all kinds, is published there. As the list’s introduction puts it, the publication has spent the past decade “looking at Pitchfork’s own history frankly.” It has also rewritten that history to an extent, removing and replacing many of its old reviews over the years. What used to look like a classic now looks like a relic; what was once maligned by a reviewer who didn’t really listen to a particular genre is now recognized as a classic of that genre. The John Coltrane review written in Clockwork Orange argot speaks for itself.
Franchises of genre fiction have a name for this: retcon, or “retroactive continuity.” If we introduce a new superpower for this particular comic book hero in this particular issue, we have to include flashbacks to explain how that was possible. The function of a list like Pitchfork’s 200 Best Albums of the 1980s is to provide an origin story for the music it covers today. The introduction acknowledges that while “indie rock” as we know it hadn’t coalesced into a genre yet, there was a “thoughtful nexus” pointing the way. That nexus is a selective tradition coming into being. Once a retcon is accepted by its audience, it becomes part of the fictional universe. “This is canon,” fans will say.
The best definition of indie rock was given by Beavis and Butt-head, as they watched the music video for The Flaming Lips’ 1993 crossover hit “She Don’t Use Jelly.” “Uh-oh,” said Butt-head. “I think this is college music.” My own social life in college came to be built on indie rock. After the initial disappointment I experienced in September, when I learned that none of the residents of my dorm building bore any resemblance to characters in a Jean-Luc Godard movie, I had resigned myself to self-imposed isolation. Try as I might, I could not participate in the social rituals that took place at frat parties, the axis around which campus social life orbited. To this day, I have no idea how to play beer pong.
My salvation was the local indie rock scene. I discovered a community that orbited instead around the performance of music. This is a wonderful foundation for a community. But it was accompanied by a regulatory culture that was not always so wonderful. We held meetings to discuss booking bands for shows, and suggesting a band that was slightly too well-known might win you derisive laughter. Even while I immersed myself in this scene, I started to feel an aversion to the cultural norms it inscribed.
I listened to less and less rock. I listened to some rap, but I got even more interested in house music, which had so little currency among my peers that most of them did not recognize its existence. Pitchfork declined to cover this music, until LCD Soundsystem built an audience by superimposing wry commentary over it. In the meantime, I became a DJ, clearing the floor at parties when anyone made the mistake of asking me to play.
At the same time, I had also started listening to country music. Gram Parsons was the gateway drug, and George Jones and Merle Haggard followed. This music, with its rich narrative detail, with its wit and wordplay, appealed to a different part of my brain. Rather than abstract, it was concrete; what it lacked in complexity, it made up for in ambiguity. Growing up, I’d thought of country music as the absolute worst that culture had to offer. It was just not sophisticated. There was more to life, I thought, than pickup trucks and cheap beer. But once within an ivory tower, face to face with the heights of insularity and elitism, I felt differently.
In a Monty Python’s Flying Circus sketch, a playwright’s son leaves home to become a coal miner. His father is ashamed of him for abandoning his roots, treating him with disdain when he returns to visit. “You’re all bloody fancy talk since you left London,” says the father, in response to talk of “tungsten carbide drills.” The son storms out, shouting, “One day you’ll realize there’s more to life than culture!”
Rock criticism is not the first cultural institution to rewrite its own history. It is a repetition, in a way, of a process undertaken by literary studies over the course of the 20th century. Scholars like Raymond Williams pointed out that if social context is part of a text’s meaning, and if experimental literature leaves us uncertain of what a text is in the first place, literary scholars have to read more than books. Feminism and the civil rights movement brought their own challenges, leading to a thorough audit of the storied Western Canon.
At first, the canon was made to include novels by women and nonwhite authors, before expanding to the point that it could also include pop music and TV shows. A new discipline of “cultural studies” was born. By the 1990s, only cranks objected to these developments — mostly Harold Bloom and Allan Bloom, who were not even related. But whether this new inclusiveness added up to a critical mass is hard to determine. The literary canon is still mostly books, just as Pitchfork’s canon has always mostly been indie rock.
The second 1980s list is a laudable development. As I scanned it, I was pleased to see a decent amount of house music, which would never have happened when I was trying to play house records for Pitchfork readers at parties. I was less convinced by its selection of jazz albums, which seem to be someone’s personal favorites rather than an attempt to capture the scope of the genre in that period. But that may be an insurmountable problem. The perspective from within a genre has its own concerns. This is a challenge for any canonizer.
The most remarkable thing about Pitchfork’s list, though, is the near-absence of country music. Only one entry appears that might fit the qualification — Lucinda Williams’ self-titled album from 1988. It’s a great album. But in the context of the list, it may not even be understood as country. Its presence here seems to represent inclusion in a tradition cutting across genres — that of the confessional singer-songwriter, which would come to include canonical indie rock artists like Liz Phair. Williams was frank about female sexuality in a way that anticipates some of the indie rock of the 1990s, and besides, the album came out on the British indie label Rough Trade. As Pitchfork contributor Stephen M. Deusner’s blurb puts it, the music is “rock-tinged country (or is it country-tinged rock?)”.
But the Pitchfork list is simply called “Best Albums,” not “Best Rock Albums.” In the absence of a categorical designation, it’s fair to assume that the list is meant to encompass “everything” — or at least, everything good. This framing is analogous to “literary fiction,” a description that tells us nothing about a novel except that it does not belong to a genre.
After the relative decline of print publications like Rolling Stone and Spin, Pitchfork is the major music publication still standing, proclaiming itself “The Most Trusted Voice in Music,” in a bold echo of CNN’s “The Most Trusted Name in News.” The rock critical establishment finds itself in the position James Murphy described in the first LCD Soundsystem single: “I’m losing my edge to the art-school Brooklynites in little jackets and borrowed nostalgia for the unremembered Eighties.”
The 1980s were a strange era for country music. The 1960s had been characterized by pop crossover in Nashville and back-to-basics insurgency in Bakersfield. The major event of the 1970s, amid the disco flirtations every genre indulged in, was outlaw counterculture in Austin. Rock musicians, from the Rolling Stones to Bob Seger, had borrowed so extensively from country music it was hard to draw a dividing line.
Then, in the 1980s, a tendency arose that is sometimes called neotraditionalism. If that sounds like an exercise in canon-building, it is. Country stations were moving to FM radio and introducing a “classic country” format at the time this new music emerged. There was now enough historical distance from the 1950s and 1960s that a selective tradition could be established. Country musicians of the 1980s incorporated earlier flirtations with pop and rock by pulling them back into a core that retained its autonomy — not unlike Pitchfork’s “thoughtful nexus” of early indie rock. But the result had to be country. It was made to be played on country radio for those who chose to listen to country music, not to reach a new audience that had previously found the music distasteful.
Country’s roots run earlier than rock and roll. It has never completely invested in the equation between popular music and youth culture, the equation that birthed both rock criticism and pop radio. “Country music has always been adult music sung by adults,” says Bruce Hinton, chairman emeritus of MCA Records Nashville, in a New York Times story on modern country. By the 1980s, country music itself had reached adulthood. The decade might be the most critical moment in the music’s biography. It is when, instead of getting swept up as one among many elements in the generic landslide of pop, country music chose to continue to exist.
At that point, country and rap were aligned, on the periphery of the central canon of rock. But Pitchfork’s new list, in which rap gets a major hearing, shows how much they have diverged. Pitchfork participated in this process — its slow acceptance of rap was a public reckoning.
The year before I started college, Kanye West had put out his debut album, The College Dropout. Ironically, that album began to allow rap to take on a collegiate quality — if “backpack rap” like Jurassic 5 had previously appealed to nerds, Kanye appealed to preppies. The year I graduated, in 2009, Asher Roth put out a novelty rap single called “I Love College.” As for Kanye, he may have started as a dropout, but he proceeded to Late Registration and Graduation in short order. In 2010, Pitchfork gave a new rap album a 10.0 rating for the first time — an honor previously reserved mostly for Radiohead. It was Kanye West’s My Beautiful Dark Twisted Fantasy. He was a college graduate now.
After this shift — one that shocked many readers — Pitchfork continued to make a valuable adjustment to the canon by reviewing previously released albums. Today, rap classics like Ghostface Killah’s Supreme Clientele hold the full scores they were once denied. The site started doing a year-end roundup in 1999, but a list of The 50 Best Albums of 1998 was published in 2018, giving Outkast’s Aquemini the top slot. Had the list been put together ten years earlier, a southern rap album could never have taken that position. This isn’t a sign of hypocrisy; it’s a sign of progress. But that progress has its limits.
What made college students who had previously preferred Animal Collective and the Arcade Fire start to care about rap? In some sense, it was a surface-level expression of a larger pattern. The Pitchfork generation came of age after the implementation of widespread multicultural educational policy — debates about the canon were taking place in both classrooms and boardrooms. For the neoliberal interests behind the scenes, the core function of higher education had to be preserved, even against the challenge of multiculturalism. The university served as a conveyor belt for the assembly line of the middle class, either carrying on its lineage or allowing for occasional entry from below. The development of collegiate culture reflected these conditions. For career-minded students, the cultural canon could open up to diversity, but ultimately, it needed to be compatible with the ideology of social mobility.
If rap became more palatable to bourgeois sensibilities, it may be in part because the narrative content of its most commercially successful expression, gangsta rap, has often dealt with power. A limited conception of the arc of power — the rise without the fall — made rap appealing to middle-class college students in the same way as a Scarface poster. At the same time, rap could also be understood as a cutting-edge technology; its mode of production was a recent invention. The search for novelty in mass culture mirrors the pursuit of progress in high culture — both in the concert hall and on the pop charts, there is a modernist imperative to “make it new.” The adoption of new technology is a classic means of displaying cultural capital.
But country sits the whole thing out. It may reluctantly acknowledge changes in society and technology, but it keeps a steady pace and an eye over its shoulder. Its rejection of the model of innovation is also a rejection of social mobility. For some, this makes it a useful cloak — upper middle-class suburbanites or nouveau riche entrepreneurs who wish to signal a populist temperament can use country for this purpose, and crossover stars like the men of “bro country” have made this possible. But for the upwardly mobile, it’s a dead weight. To many young Americans, country music is the sound of poverty.
If rap can be incorporated, the limit Pitchfork now sets on good taste is “everything but country.” That phrase happens to be the title of a fine essay at Pitchfork from 2007, by Stephen M. Deusner. In a review of a various-artists box set called Legends of Country, Deusner makes the case against listening to “everything, except rap and country”:
Let’s set aside the possible race and class implications of that appending phrase; anyone who closes themselves off to these two genres is missing out on vast and exciting worlds of music in which territory is being explored that’s foreign to indie guitar bands, squeaky clean pop acts, and dead rock idols.
Deusner’s language here is a lot like Sanneh’s in “The Rap Against Rockism.” It’s an argument against the “literary fiction” model of music criticism, with its world beyond genre. By the end, he makes an earnest plea: “The truth is, mainstream country music — now as ever — isn’t all that bad.” Apparently, no one was convinced.
I probably won’t convince anyone either. But what I can do is make a list. Here’s my attempt, in no particular order, at listing The 20 Best Country Albums of the 1980s.
Randy Travis – Always & Forever
George Strait – Strait from the Heart
Dwight Yoakam – Buenas Noches from a Lonely Room
Rosanne Cash – Seven Year Ache
Rodney Crowell – Diamonds and Dirt
Lyle Lovett – Pontiac
George Jones – I Am What I Am
Merle Haggard – Big City
Steve Earle – Guitar Town
The Mekons – Fear and Whiskey
Keith Whitley – I Wonder Do You Think of Me
k.d. lang – Shadowland
Joe Ely – Live Shots
Dolly Parton – 9 to 5 and Odd Jobs
Guy Clark – Better Days
Ricky Skaggs – Highways & Heartaches
Clint Black – Killin’ Time
The Judds – Why Not Me
Dolly Parton, Linda Ronstadt, & Emmylou Harris – Trio
Rosie Flores – Rosie Flores
Foster & Lloyd – Faster and Llouder
Kris Kristofferson – To the Bone
Lorrie Morgan – Leave the Light On
Reba McEntire – My Kind of Country
Carlene Carter – Musical Shapes
Patty Loveless – Honky Tonk Angel
Jason & The Scorchers – Lost & Found
The Long Ryders – Native Sons
Marty Stuart – Hillbilly Rock
Bobby Bare – Drunk & Crazy
So… that’s 30 albums. Twenty would have been better conceptually, but I had too much trouble deciding what to cut. In my defense, there’s a lot to choose from. There are elder statesmen like Haggard and Jones, who put out a couple of their all-time best records. There is the first wave of neotraditionalism: Travis, the classic crooner, Yoakam, the hillbilly honky-tonker, Strait, the elegant ranger. There are the outliers, rocker Steve Earle and bebopper Lyle Lovett, who expanded the scope of what could be called country. There are hippie insurgents from Texas, like Guy Clark and Joe Ely, setting the stage for Americana and alternative folk music. There are the progenitors of alt-country, including a gang of Yorkshire punks, The Mekons, and a Canadian vegan, k.d. lang. Would any contemporary musicians claim these albums as a selective tradition? Yes, absolutely — let’s say Kacey Musgraves and Sturgill Simpson for starters, both of whom have been reviewed at Pitchfork.
What ties these separate works together? To some extent, it’s the narrative richness and the lexical wit that initially drew me to the genre. It may also be something like “twang,” which is hard to describe, but easy to spot — both in a steel guitar and in a Southern accent. It’s less clear whether there’s a conceptual unity to all of these albums. Country has a reputation for being monolithically conservative, but that isn’t true of these artists, many of whom — even Merle Haggard, whose storytelling sometimes gave people the wrong idea about his politics — were liberals, and some of whom, like Earle and Kristofferson, are outspoken radicals.
To take an example from the list, consider Rosie Flores, a singer of Mexican origin, performing “God May Forgive You (But I Won’t),” a song about a woman leaving her husband upon his conversion to evangelical Christianity — written by Harlan Howard, a Republican, and Bobby Braddock, a Democrat. Country’s homogeneous political reputation may have less to do with what it is than what it can be used to represent. For politicians claiming to speak for working-class people, as reactionaries always do, country is a powerful signifier. But it’s a malleable one. Barack Obama played Brooks and Dunn’s “Only in America” at his presidential acceptance speech, after winning the 2008 election. It was the same song George W. Bush had played at rallies during his campaign in 2004.
All that really proves about country music is its popularity. According to Nielsen, country radio has remained the top format in the United States for nearly a decade. Streaming may have become the predominant means of listening to music for media professionals, but most people still listen to the radio — 93 percent of American adults, says Nielsen’s data. The introduction to Pitchfork’s 1980s list says its editors hope it “represents the best of what this innovative decade has to offer, as well as how people consume music now.” Since a lot of people listen to country music, there may be something missing from this description. Anyone who listens to country radio knows that most of the above artists still get played every day.
My list is also an act of canonization, and, as such, is open to dispute. Now that I’ve finished it, I myself am inclined to dispute it. There are at least two albums each by Yoakam, Cash, Travis, Strait, and Lovett that, on a different day, I might have swapped in for the ones I chose. You could widen the definition of “country” to include more folk music (bluegrass is mostly a blind spot for me), or on the other end, more southern rock (I fully admit my bias in favor of alt-country, but I left off my beloved Blasters for coherence). I couldn’t decide which album to include by neotraditionalist stalwart John Anderson — his first four are all of roughly the same quality, with a few standouts each (get the Greatest Hits). I also left off a couple records I personally really enjoy, but which are deliberate stylistic throwbacks (Willie Nelson & Ray Price’s San Antonio Rose and Emmylou Harris’s Roses in the Snow). And of course, if Pitchfork hadn’t made its one exception for her already, I would have included Lucinda Williams.
This kind of indecisiveness is inevitable, whether you’re putting together a cultural canon or a grocery list. But today, listmaking might be the most common form of cultural criticism. After the growth of Facebook and Twitter — essentially lists of acquaintances — private canons can travel in memes and through arguments. It’s a risky move to publish a “Best Of” list these days, with the citizens of Twitter lying in wait. Even if you accept the framing, you can always find a bone to pick.
But the further we drift from rock, the more charged each choice becomes. Even in the golden age of rock criticism, it was not unusual to see the likes of A Love Supreme or Bitches Brew on lists of the best albums of the 1960s and early ’70s. The 1980s are more of a challenge. This was a time when jazz, like country, was more interested in establishing itself as a sustainable practice than in entering a new stage. This was as true for avant-gardists, like the artists of Chicago’s Association for the Advancement of Creative Musicians, as it was for the traditionalist “Young Lions.”
Neither category is represented on the Pitchfork list. The jazz musicians present — Cecil Taylor, Ornette Coleman, and Alice Coltrane — all had established careers during jazz’s most canonized period. It’s not surprising that Wynton Marsalis is absent, but one might wonder about the omission of David Murray, Geri Allen, Paul Motian, James Blood Ulmer, Henry Threadgill, Bill Frisell, Muhal Richard Abrams, and their contemporaries. The exclusion of artists like these would be entirely comprehensible if jazz was not within the scope of the list. But the list has jazz on it; it opens the door.
If everything is permitted, something is being said about what’s left out. What I noticed as I assembled country albums is that most of them wouldn’t fit on Pitchfork’s list. I don’t know exactly why, but I know it’s true. They are all full of great music, and they are all in some sense reflective of their era. But assuming the framing Pitchfork sets, they would neither ratify the present nor indicate directions for the future. Once you’ve done the math, these albums don’t point towards an organizing principle that explains what it takes for a new piece of music to get a good review at Pitchfork. In fact, they seem to be in violation of that principle — too unsophisticated, or not innovative enough, or something.
For all the trouble it causes, genre has its uses. If a comedy doesn’t make you laugh, it has failed to achieve the state of being a comedy. If you can tell who committed the murder right away, it’s not a satisfying mystery. If you can’t dance to it, it isn’t dance music. These rubrics of genre are limitations, but they are also the expression of shared values, enabling communication. Seeking to transcend them, by stepping outside the lines of genre, means determining a new set of values. Most rock critics, not to mention David Hume and Immanuel Kant, would admit as much.
Setting out a universal standard for aesthetic judgment is a bold move. If we were to name a set of existing genres, and could articulate what unified the set, we might be able to come to a mutual understanding. But when The Most Trusted Voice in Music tells us what is Best, we lack access to the standards that informed that judgment. And we can only recognize exclusion as a kind of judgment in itself.
Needless to say, one list of albums does not have a decisive effect on society. But it isn’t an overstatement to say that there is a force at work that leads us to view the expression of class interests as a neutral expression of taste. It’s the same thing that asks us to consider a hierarchical institution like the university a meritocracy. Social theory has a name for this imaginary relationship to existing conditions: “ideology.” The project of cultural studies was based on the premise that understanding ideology, by investigating how culture is produced and how canons are formed, can help explain how wider social formations operate.
The marking of a boundary requires definitions, even if they are unspoken; an exception needs to be defined in order to be excluded. That Pitchfork’s list includes everything but country reflects some understanding of what country is. The question is, what is everything else, when it’s everything but country?
Shuja Haider