On January 24th, 25-year-old snowmobiler Caleb Moore crashed during the X Games freestyle finals in Aspen. A week later, he died of complications from his injuries. And if you want, today, you can watch the fatal crash in its entirety right on YouTube. It’s been uploaded dozens of times by perverse opportunists, viewed by millions of post-ironic eyeballs, and sponsored by a herd of unsuspecting advertisers. By any measure, it’s one of the most popular snuff films in history.
Had Moore died on the scene, YouTube might have pulled the plug on the video early. Its Community Guidelines are clear about what is and is not acceptable content, and deaths and other gruesome, graphic material are high on the list. But Moore gets up and walks away. He doesn’t die on the screen. And that was enough to nudge it past YouTube’s own guidelines. Not only did it make it onto the site, it began displaying ads for massive brands like Mercedes, Samsung, Foot Locker, North Face, Crysis 3 and more, right next to uploaders and commenters basking in the crash. And once it was clear that one could actually make money off of Moore’s death, it was everywhere.
The number of uploads increased. One industrious profiteer changed the video’s thumbnail to include red block-lettered splash text next to the image of Moore falling, shouting FATAL ACCIDENT. Another version of the video was headlined “Caleb Moore dead Snowmobile Crash Winter X Games die.” The actual death came later, but this is what killed him. This is the moment his life ended. Calling it anything else is insulting.
Warning: Links referring to on-screen deaths in this piece contain graphic imagery, as do the videos themselves.
This isn’t about squeamishness. Awful things are uploaded and written by millions of gleeful internet deviants every day. It’s about YouTube, an impressively sanitized First World of internet video, finding itself in a position where it literally can’t stop itself from hosting a video of a man dying and serving up targeted ads against it. Are we okay with that as a cost of doing business on the internet?
Moore’s crash isn’t the first time YouTube has had this issue, and with a sporting event, no less. In 2005, the year YouTube was founded, boxer Leavander Johnson fought Jesus Chavez to defend his IBF lightweight title. He lost the fight—though he left the ring under his own power—and later collapsed in his dressing room. Johnson passed away five days later, after being put into a drug-induced coma. You can still see the fight that killed him, blow by blow, on YouTube. Then, in 2011, Russian boxer Roman Simakov passed away due to brain injuries two days after losing by knockout to Sergey Kovalev. Two videos of the fight, titled “THE K.O OF DEATH—-RUSSIAN FIGHTER DEATH.” and “Boxer dies in ring” and uploaded the day news of Simakov’s death was made public, sit at over 1.1 million and 960,000 views right now, respectively.
And you don’t want to see the videos of bullfighters, but those are there too. Gruesome, fatal, and viewed millions of times.
The videos of those fights don’t run ads; they’ve been up for long enough that they’ve probably had time to be properly flagged. Given their titles, that’s a safe assumption, but it’s also possible that no one’s bought that tract of advertising (old boxing fights) for a while.
Public curiosity about death and injury is nothing new. Faces of Death was a phenomenon more than 30 years ago. The term Darwin Awards has sanitized all manner of fatal and life-altering injuries into digestible, linkable jokes. For better or worse, these things are part of the fabric of the internet, and this is not a call to scrub them off of it. But some re-evaluation has to happen at some point. Very few people are anti-porn, but there is no porn, copyrighted or otherwise, on YouTube. So what does it mean that these videos are? And what does it mean that there will always be money made in public forums off of millions of people clicking over to watch someone die?
It’s not that YouTube is a neutral or silent proprietor. Its own Community Guidelines are straightforward—if broad—on the subject:
The world is a dangerous place. Sometimes people do get hurt and it’s inevitable that these events may be documented on YouTube. However, it’s not okay to post violent or gory content that’s primarily intended to be shocking, sensational or disrespectful.
Asked for further clarification, a Google spokesperson told us that “videos flagged by YouTube users are reviewed against our Community Guidelines. While YouTube’s Guidelines generally prohibit graphic or violent content, we may make exceptions for material with documentary, or news value.”
That “news value” of graphic content on YouTube has often come from wartime uploads, including the possible deaths of Muammar Gaddafi in 2011 and Syrian soldiers in 2012. Those were troubling, and raised questions about and within YouTube, but they weren’t far removed from the type of correspondent-based war reporting that’s been the standard since Vietnam.
But that doesn’t preclude that news sense from shifting. Look at the biggest spikes in Wikipedia traffic, or the circulation of celebrity rags when one lands an exclusive shot of a famous corpse, or even the coverage of Steve Jobs’ later life. We aren’t just obsessed with death, we’re obsessed with the ceremony of it. Every time there’s a death in the public consciousness, the rush to get all the information—drugs? violence? recklessness?—and a photo of the dead body begins. Moore’s fall didn’t take place in a war-torn country. It wasn’t captured with a handheld video camera. It was a man falling down in a nationally televised competition, and the world looping back to see how he died. That’s enough, apparently, to make it news. And news can be sponsored.
Tunneling through these cracks in the system are thousands of opportunistic profiteers trafficking in eyeballs and ad impressions. The YouTube Partners program launched in 2007 and, as of last year, had more than a million members in 27 countries. Of those, “hundreds” make a six-figure income off of the advertisements that are served against their uploads, with an elite few making over a million. But for the overwhelming majority? A few thousand bucks, if they’re lucky.
Targeting celebrity or public deaths for spikes in AdSense revenue is probably more nihilistic than it is depraved. But it’s hard to shake the sense we’re living out some dystopian pastiche, with a digital ringmaster selling on-demand tickets to watch men die.
It’s important to remember that for YouTube and its advertising partners, none of this is deliberate. After we contacted YouTube with some questions about the video, it started taking down ads on all of the uploads. A Google spokesperson told us, “When we become aware of ads that are showing against sensitive content, we disable ad serving.” It did a pretty thorough job, stripping corporate sponsorship from most of the first 20 or so results found in a standard search. But it’s an imperfect system.
The most viewed version of the Caleb Moore crash video had more than 1.4 million views and counting as of last Thursday. After its ads were stripped on Friday, the account, 2013XGAMES, removed the video (or was compelled to by YouTube) and replaced it with a new one. The ads returned, running just fine until the reup was taken down a day later. Video of the crash can still be seen peripherally in an update on Moore’s condition that replays the crash twice, once in slow motion. And of course, “caleb moore crash” will turn up any number of additional results. It’s impossible to keep up with all of it.
Here’s the problem: There’s no real alert system in the advertising world for when something goes wrong. Typically, sites try to collapse ads before they ever display on controversial content. Is this possibly offensive to the advertiser? Don’t run the ad. But as advertising has expanded across the internet, the filters have been far outpaced by the scale of sites displaying ads—especially one as monolithic as YouTube. So when something goes obviously wrong, like a deeply homophobic video showing up with a political ad, the “alert system” is usually a forwarded email saying, “You see this???”
Fixing that is crucial to YouTube. Advertisers buy what’s called a Run of Site (ROS) placement, which means their ads will display on certain types of videos. Say, “cats” or “sports” or “video games”. Media buyers we reached out to for specifics were hesitant to speak on the record due to their relationship with Google, but you don’t have to go much further than YouTube’s support pages to see how this can go wrong:
The Adsense ads displayed on your video are determined automatically by our system based on a number of contextual factors relating to your video. These factors include but are not limited to your video metadata and how you categorize your video. We aren’t able to control all of the ads that appear with your videos manually.
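In other words, the system buckets a video by its metadata and serves whatever was bought against that bucket. A rough sketch of the mechanism, with every category, keyword, and advertiser name made up for illustration:

```python
# Toy model of contextual ad matching: a video is bucketed by keywords in
# its metadata, and ads bought against that bucket run alongside it.
# Categories, keywords, and advertiser names are all hypothetical.

CATEGORY_KEYWORDS = {
    "sports": ["x games", "snowmobile", "boxing"],
    "pets": ["kitten", "puppy"],
}

AD_BUYS = {
    "sports": ["Mercedes", "Samsung", "Foot Locker"],
    "pets": ["PetFoodCo"],
}

def categorize(title):
    """Return the first category whose keywords appear in the title."""
    lowered = title.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(kw in lowered for kw in keywords):
            return category
    return None

def ads_for(title):
    """Ads served against a video, based purely on its metadata."""
    return AD_BUYS.get(categorize(title), [])
```

Note that an imposter upload titled to look like official X Games footage lands in the same bucket, and pulls the same advertisers, as the real thing; the matcher has no notion of who uploaded the video or what the footage actually shows.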
And every time an advertiser is furious over an unfortunate placement, YouTube loses money. Or more accurately, it loses future money, since advertisers demand a “makegood”, or compensated ad space, for the flub. For something as glaring as the Caleb Moore video, the free impressions could, across all advertisers, number in the millions.
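The arithmetic behind a makegood is simple; the dollar figure depends entirely on the CPM, the price per thousand impressions, which varies widely by advertiser and is assumed below purely for illustration:

```python
# Back-of-envelope makegood math. CPM ("cost per mille") is the price an
# advertiser pays per thousand impressions; the $5 figure used below is
# an assumption for illustration, not a reported number.

def makegood_value(impressions, cpm_dollars):
    """Dollar value of compensated ad impressions at a given CPM."""
    return impressions / 1000 * cpm_dollars

# Two million comped impressions at an assumed $5 CPM:
owed = makegood_value(2_000_000, 5.0)  # $10,000 in free ad space
```

Small per-placement, but multiplied across every advertiser and every flagged upload, it adds up to real money YouTube would rather not give away.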
So as mad as Mercedes, Samsung, Foot Locker, North Face, and all the other advertisers might be over having unwittingly underwritten death on film, they will be happily sated by lots and lots of free ad impressions. Hypothetically, edgier brands could even actively seek out riskier material, hoping to catch YouTube displaying their ad on an inappropriate video. That’s horribly impractical, though; not even Google knows with 100 percent certainty what types of videos will turn up in what types of ad tracts. And even if it did, it’s hard to keep tabs on every single thing that’s uploaded.
It is hard to feel sympathy for corporate advertisers, but here, you almost do. They wanted to be associated with the X Games, or perhaps generically sports-related videos. Instead they got a snuff film.
And it would’ve been almost impossible for them not to end up there. The account that ran the most popular YouTube iteration of the crash is a good illustration of this. It’s called 2013XGAMES, and it uses the official X Games logo. It titles its videos like any standard highlight channel. To a casual observer, this was ESPN, wantonly promoting the crash to a million and a half willing viewers. It wasn’t, but aside from getting some videos pulled down when and if it notices them, there’s not much ESPN can do to stop its imitators.
For its part, ESPN has done its best to distance itself from the Moore crash. You could say it’s being tasteful, or that it’s straining to insulate its X Games brand from the impression that its athletes can no longer defy death. Either way, it’s gone about scrubbing the crash from the internet as best it can. That includes requesting takedowns of many of the videos we reference here, though the crash is still easy to find. You can’t put that genie back in its bottle.
That’s just how the internet works now. We’ve become so efficient at the monetization and distribution of content that it’s impossible to stop grim things from finding a large stage, and for people to profit off of those tragedies. And what’s scariest isn’t that it exists. It’s that it’s impossible to stop.