When I embarked on adulthood with the English degree I was repeatedly told had marked me for a lifetime of failure and poverty, I didn’t have the luxury of being picky. As Anne Helen Petersen recently pointed out in BuzzFeed, millennials came of age in a collective state of financial calamity. I graduated from college in the spring of 2009, meaning the last year of my university education took place in the shadow of the 2008 financial crisis. It was already clear that while the U.S. government would provide public assistance to banks and multinational corporations, those of us who needed to make a living were on our own.
I applied for jobs at the magazines, newspapers, and websites I had admired as a reader, and was mostly ignored (note to aspiring young writers: this will also happen to you). Gradually, desperation led me to pursue opportunities at less auspicious venues. But things were looking bad. If I didn’t find a job, I would have to continue living at my parents’ house, in a town where the only activity available to me was to go to bars I couldn’t get into as a teenager, all of which turned out to be disappointing.
But then my luck changed. One of these more obscure companies, which I believed to be a press agency smaller than but similar in remit to Reuters or the AP, invited me in for an interview in a relatively interesting city. I knew in advance that if they offered me the job I would take it. This is not a good position to be in. It is a strange and artificial set of circumstances that leads individuals to believe they need jobs more than businesses need workers.
I arrived at the interview in the same blazer I had worn for my graduation, purchased at a clearance sale at a department store, and was greeted by a mysterious man with a British accent whom I never spoke to again. He handed me off to his underlings. I don’t remember anything they asked in the interview except their final question: “Do you have any questions?”
I had been doing okay up until then, but this one really stumped me. It had never occurred to me that something like this could happen. My only question was, “Are you going to give me this job or what?”
I babbled something I felt stupid about. I now pass this wisdom on to any aspiring job-havers who may be reading: decide on a question before the interview, ideally two, in case the bastards answer the first one before you have a chance to ask it.
To my great shock, I got an email a few days later offering me the job. I was grateful. I hadn’t been to journalism school, so I ordered a few style manuals and textbooks. But I was soon to discover that they would be of no use. I was not quite beginning a career in journalism. I didn’t know the term then, but I had become a toiler on a content farm.
I came in a day early to do paperwork. Everything had the loose vibe of a new millennium startup: big open space, no cubicles, exposed brick, iMacs, Keurig machine with free pods. The office was conceptually divided into two halves. One half consisted of two islands of desktop computers, staffed by about eight people each. The other half of the room, which geometrically and numerically speaking was more than half, belonged to the sales staff. There were at least twice as many of them as there were writers, all of them male, and they spent their days cold calling businesses to peddle the product the writers were making.
I was given an email address and a style guide. What are the hours, I asked. Oh, nothing’s set in stone, I was told; we’re pretty relaxed about that.
The trick of startup culture is that the lack of imposed discipline doesn’t mean liberation. The philosopher Slavoj Žižek describes this kind of cultural ethic with a parable of the Postmodern Father. The authoritarian Modern Father says, I don’t care what you want, you have to visit your grandmother whether you like it or not—putting you in a position where you can rebel. But the Postmodern Father says it’s up to you, do what you want, but you know how much your grandmother loves you.
“Beneath the appearance of a free choice there is an even more oppressive demand than the one formulated by the traditional authoritarian father,” says Žižek, “namely an implicit injunction not only to visit the grandmother, but to do it voluntarily, out of the child’s own free will.” This is openly the ideology of the so-called science of contemporary management theory, with its appeals to principles of “humanism.” We are compelled by the imperatives of competition and the shadow of precarity to discipline ourselves. #riseandgrind, as the kids say today.
Frederick Winslow Taylor’s 1911 book Principles of Scientific Management was an early articulation of the structure of corporate culture, in an influential formulation that made no attempt to conceal the inequality on which it must depend. Taylor had developed his theory by observing the productivity of laborers hauling pig iron. He wrote,
the science of handling pig iron is so great and amounts to so much that it is impossible for the man who is best suited to this type of work to understand the principles of this science, or even to work in accordance with these principles without the aid of a man better educated than he is.
Taylor’s strategies amounted to increasing competition between workers by offering them incentives and punishments. I had become familiar with this structure in the first job I had as a teenager, flipping burgers. Actually, in the Fordist model adopted by the burger joint I worked at, I did not flip them. My role was strictly to place the burger upon a bun. The guy to my right had applied ketchup and mustard to it and the guy to my left proceeded to assemble it with lettuce and tomatoes. I was allotted a specific number of seconds to perform my task—nine, if I remember right.
A different rhetoric was introduced in the 1920s by Elton Mayo, a psychologist who shifted the emphasis from mechanical processes to “human relations.” In a way, he predicted trends in the American economy: from 1949 to 2018, the proportion of manufacturing jobs declined from 30 percent to 8.5 percent. This has led some observers to call our era “post-Fordism.” In the context of this increasingly predominant form of work, it becomes, as Marx anticipated more than a century ago, “the development of the social individual which appears as the great foundation-stone of production and of wealth.”
Whether or not the process has changed in the Information Age, its goal never has: maximizing production for the increase of profit, whether by the lash or the Keurig machine. The line of thought that passes through Frederick Winslow Taylor and Elton Mayo to Malcolm Gladwell and Tim Ferriss leaves this project unquestioned.
Browse the Business section at a bookstore and you’ll see the same message in many guises. Your goal is to be “effective” and “successful.” You are to “innovate” and “disrupt.” This is achieved through “influence” and “leadership.” Your social life is for “networking,” and your leisure time is for “recharging.” All aspects of your life are relations of production. The ideological injunction to commit to this project is so complete that we are expected to do it “proactively.” In Taylor’s time, better-educated men barked orders on shop floors. Today, Harvard Business Review has a book called On Managing Yourself.
At the contemporary office (or “co-working space”), you are your own taskmaster. You and your colleagues are not members of a collective, but a competitive market. I found this out slowly, after mistakenly assuming I was not under scrutiny. Yet no matter how hard I tried, how much earlier I came in, at least half the writing staff would be there before me. There was too much content to harvest for anyone to get away with sleeping in. Even more alarmingly, no matter how late I forced myself to stay, I was never the last one to leave.
I was assigned an alter ego, an alliterative nom de plume that immediately brought to mind a male porn star. Like the rest of my colleagues, mostly fellow misguided English majors, I was given a list of clients, mid-sized corporations whose websites I was to write for. The company sought to exploit a loophole in online search engines: by including a news feed with “original articles” on their websites, these corporations could get better results for themselves on Google.
Let’s say you are a manufacturer of thumb drives. In addition to “Buy Now” and “Contact Us,” your website has a tab called “News.” You could hire my company to produce stories every day to go on that news page, about developments in fiber optics, nanotechnology, the silicon industry, or whatever other thumb drive-related subjects, generating a steady flow of content meant to draw traffic to your site through Google News.
It was openly a bait-and-switch operation, driven by SEO, or “search engine optimization.” Google frowns on this practice the same way a casino frowns on counting cards. The odds are supposed to be in the house’s favor. Over the years, Google has implemented various proprietary algorithms to blunt the effectiveness of many of the techniques I was meant to use, in spite of the efforts of white-hat SEO practitioners to keep search engines neutral.
In any case, back then, the tricks worked. Not being experts in business or technology, we produced this content mostly by rewriting press releases or stories in other publications, which we found using the same search engines we were trying to game.
My primary subject was “cloud computing.” This is a more familiar concept now that we have Dropbox and Google Docs, but at the time, it was entirely new to me. I was writing for multiple companies that found the topic important for different, sometimes contradictory reasons. Some provided remote hosting services, and wanted to spread the gospel of cloud computing. Others were manufacturers of consumer hardware, determined to stop the dangerous scourge of cloud computing in favor of localized storage.
At the time, if you had asked me to explain what cloud computing was, I wouldn’t have been able to do it. I was too overloaded with contradictory perspectives. This was true of many of the subjects I wrote about, financial and technological concepts ranging from obscure to trivial. One company I wrote for sold lanyards. I needed to write three stories a day about news related to the subject of lanyards. I cannot for the life of me remember how I pulled this off.
We were given lists of potential search terms for each client, and were meant to include as many of them as possible. This meant not only incorporating words like “servers” and terms like “cloud computing,” but longer phrases like “store information online.” These phrases would come up in search results even if interrupted by punctuation, so to fit the longer phrases in, you could string together bullshit sentences that were unrelated to the sense of the phrase. Something like:
“These days, you don’t have to ask for directions to the grocery store. Information online is easily available, even in your own neighborhood.”
You would then try to pivot to offering an opinion without seeming to hold one, e.g.:
“Conveniences like these have been made possible by cloud computing, which processes data at high speeds. According to recent statistics, that speed is increasing.”
Or if you were arguing the opposite side, you might say:
“But you wouldn’t want someone to be able to access your private servers just as easily. According to the latest reports, a new criminal underground is growing online.”
I have to emphasize these pages appeared as results on Google News, where they looked just like any other source. But the content of the news itself didn’t really matter. Whatever it was, you wrote it in a way that strongly implied anyone reading needed to purchase the company’s product, by assembling the required terms in a way that concealed your intentions.
We would send off our stories to the editor, who would check not whether they were well written, interesting, or true, but whether they baited the hook properly. Shortly after, they would begin appearing in our own search results as we researched our next stories.
I was uncomfortable with the work, but I was also bad at it. There were growing doubts about my work ethic. Though I had taken the job out of desperation, I hadn’t recognized just how dire the situation was. My colleagues were savvier. Most of them were perfectly nice, without malice, just trying to get by, and pulling it off more competently than I could. Some of them, though, were true believers, who saw the successes of both companies and individuals as the outcome of the same honorable work ethic. My supervisor was one of them.
After a short grace period, criticisms of my performance emerged. It was eventually made clear to me that taking the full hour allotted for lunch was generally frowned upon. Unfortunately, there was nothing I could do about this—that’s just how long it took me to walk to Chinatown and back. Most of my colleagues got lunch at the Starbucks in the nearby train station, and ate at their desks.
It was also made clear to me that I was not producing nearly enough content. This has been pointed out to me by every editor I’ve worked with since, but here the numbers were stark. I was meant to be writing more than 10,000 words per day, but I had trouble managing even half that. I was getting a reputation as the laziest farmhand in the field.
This state of affairs required numerous meetings with both of my superiors. The chief editor took a what’s-the-big-deal attitude, but his second-in-command was stern and disapproving.
“What’s going on?” she asked, holding direct eye contact without blinking. I squirmed.
“I guess I don’t always understand what I’m writing about,” I said.
“You don’t have to,” she said.
It was a farce of a newsroom, but banal rather than absurd. My editor demanded not that I break stories or get facts straight, but that I make my writing less meaningful so I could produce more of it. This general emptiness was obscured by strict adherence to business jargon and statistical terminology.
One habit that was impressed upon me, as you can see in the sentence above about word counts, was to always use “more than” or “fewer than” rather than “over” or “under” when referring to numbers. In 2014, the AP Stylebook officially announced that it was okay to say “over” or “under,” but the rule has stuck with me, like the conditioning of a lab rat expecting an electric shock.
At that point, I was going through about a dozen Keurig cups a day. I had churned out a lot of bullshit writing papers in college, but no one other than underpaid teaching assistants had ever read them, and I’d never had to say anything I didn’t believe. Here, I was seeing my own work rise to the top of Google search results. I was having an influence on public opinion and I didn’t know what most of it meant.
At one point, I was called over by my editor, who told me a story I had written wasn’t convincing enough. This had never come up before, and I have wondered in retrospect if it was a mild hazing ritual.
“You need a quote,” he said.
This made no sense to me. I had never interacted with anyone at any of the companies I was assigned to write for. I had never been encouraged, or even permitted, to do any reporting for the stories I wrote.
“Where do I get a quote?” I asked.
“Here’s a name,” he said, scribbling something on a post-it note while still staring at his computer screen. “Make something up.”
I already felt guilty about sending information into the world that I couldn’t personally verify, even under a pseudonym. But it struck me with some force that this was crossing a line. It even seemed possible that the line was behind me, and I had been too panicked and exhausted to notice when I had crossed it.
This is the natural climax of the story. If it were a well-written story with a good narrative arc, I would have quit then and there. I would even have given a speech. It would have been like Jerry Maguire or Inherit the Wind. I’m sorry to report that the real story is not as good.
“I…I can’t do that,” I said, apologetically.
He looked up from his screen. So did everyone else.
“Why?”
“I, uh,” I replied.
He looked at me with confusion, before he remembered he was the kind of guy who did not make a big deal about things.
“Okay, I’ll finish this one,” he said.
My colleagues watched me with pity as I returned to my seat. They continued to look at me this way from then on, as I continued to go to work as usual, and do my usual substandard job. But the atmosphere had changed. In the eyes of my employer, my sin was no longer just sloth, but pride. I had placed myself above the job. My days were numbered.
They didn’t fire me, presumably because finding and training a replacement was a greater expense than keeping an employee who sucked. After two more meetings with each of my superiors, I decided to quit. I did so with no righteousness whatsoever.
The story of what happened next is even more anticlimactic. With an urge to compensate for this false start, I decided to throw myself into freelancing, in an attempt to establish my own byline as a known quantity. This was an even more drawn-out, uneventful failure. In college, I’d had some luck publishing music criticism, never making more than pocket change. The lack of compensation had not been an issue while student loans were coming in, but it became a problem once I started having to pay them back. I had one story published, but it was so hard-won I gave up completely. I resigned myself to the inevitable, moved back to my parents’ house, and got a job at the reference desk of the local library. I stopped writing altogether.
How you start a career as a writer has never been predictable. Education and credentials seem to have nothing to do with it. In 1969, Lester Bangs began his career by sending Rolling Stone an angry, drug-fueled review of the MC5’s Kick Out the Jams that was vaguely insulting to Rolling Stone itself. He insisted that if they declined to publish him, they send him an explanation of why. Instead of ignoring him, which they could easily have done, they ran his review, and subsequently many others. The result was a now-classic body of work that continued until he was fired by Jann Wenner in 1973 for “disrespecting musicians.”
It would seem like the internet should have let a thousand Lesters bloom, after splitting the distribution of information and opinion from the dictates of advertising. It almost did, during the heralded Golden Age of Blogging, but in the long run, the opposite seems to have taken place. Aspiring writers are increasingly forced to work on the content farm, scramble endlessly as freelancers, or hold precarious positions at publications that lay off employees left and right, as apparent industry leaders Huffington Post and BuzzFeed did recently. According to CNN, 1,000 media jobs were cut last week.
For my part, I only began writing again out of a desire to collect my thoughts, with no intention of beginning a career. Some of the work I produced found an audience, which led to offers of payment. I can’t advise anyone to go about pursuing a career as a writer this way, because I was not pursuing one. My own career now is also precariously dependent on the ecosystem of the industry, which could change to my disadvantage, drastically, at any moment.
Mark Zuckerberg became the third richest person in the world by inventing a platform whose primary function has turned out to be monopolizing the spread of information. Meanwhile, the search for the truth is a risky, if not self-destructive, proposition. President Trump was unsurprisingly pleased by last week’s media layoffs, given that he has named the press “the enemy of the people.” But for all the joy he takes in firing people, Trump was not the person who fired these workers.
It’s hard to put it better than A.J. Liebling did: “Freedom of the press is guaranteed only to those who own one.” Today, that freedom is guaranteed to the likes of Rupert Murdoch, Jeff Bezos, and Sheldon Adelson; for the rest of us, its promise is conditional. We can say what we think, but if Peter Thiel or Joe Ricketts don’t like it, they can make sure no one will hear it.
One of the most obvious signs of an authoritarian society is that it jails or kills journalists. But what kind of society is it that starves them?