Spreading false information is potent. Particularly now, because information spreads quickly. The Russian government is actively spreading false information about Sweden, Ukraine and Syria. It is, in the words of Neil MacFarquhar of The New York Times, deploying “weaponized information” instead of guns and bombs to spread discord and chaos.

Deliberately planting false stories is nothing new. The United States government does it. Kanye West does it. Donald Trump. Just about any Hollywood publicist I’ve ever met. But anyway.

Something else is going on, something bigger, deeper, faster, machine driven and therefore more banal and even more dangerous. Misinformation is being spread by dumb machines controlled by smart people. Yep, badly tuned AI is your new news editor.

Facebook landed in hot water for firing an entire team of contractors responsible for curating its Trending news section. Facebook, which wants to become our news overlord sooner rather than later, was experimenting with making the Trending section more automated. In a press release, it wrote that people would no longer be required to “write descriptions for trending topics.”

Dump the people? Big disaster. You get a machine learning train wreck.

Facebook’s Trending news promoted a false story about Megyn Kelly and also a story about a man having sex with a McChicken sandwich. (I’d give you the links, but then you’d just click away. Look them up later.)

The idea, which did not work well, was to get rid of bias by reducing human control over what was in the Trending feed. It went deeper than that, because Facebook was using human journalists as a stopgap: the contractors were to be employed just long enough to train the machine so that it could take over. It turned out the machine couldn’t pick news stories very well, and the engineers supervising it couldn’t recognize a false news story when one turned up.

The lesson? You can’t exchange journalists for engineers, because domain expertise counts for something. An algorithm smart enough to gather a sufficient number of articles and posts to signify a trend just isn’t smart enough. You need hoax detection, satire detection, bullshit detection, outright goddamn lie detection, and there’s no algorithm for that yet. (Regular people aren’t all that great at that kind of detection, either.)
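Facebook hasn’t published its algorithm, so here is a deliberately naive sketch of the count-based approach described above, with made-up field names and thresholds. Notice what it never asks: whether any of it is true.

```python
from collections import Counter

def trending_topics(posts, min_mentions=3):
    """Naive trend detection: a topic 'trends' once enough posts mention it.

    Nothing here checks whether a story is accurate. A hoax shared
    widely scores exactly like real news, which is the whole problem.
    """
    counts = Counter(topic for post in posts for topic in post["topics"])
    return [topic for topic, n in counts.items() if n >= min_mentions]

posts = [
    {"topics": ["megyn kelly"]},  # hoax article
    {"topics": ["megyn kelly"]},  # reshare of the hoax
    {"topics": ["megyn kelly"]},  # another reshare
    {"topics": ["election"]},     # an actual news story, under-shared
]
print(trending_topics(posts))  # ['megyn kelly'] -- the hoax trends
```

Popularity is the only signal, so virality and veracity are indistinguishable to the machine.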

Machine learning or artificial intelligence will be running lots of things one day soon, but not next week. There has been some success with automated curation: Twitter’s Moments groups posts into something resembling a narrative, apps like Blendle do a decent job of pulling interesting news stories, and Apple’s News app is not completely terrible, though pretty close.

Automating writing, even automating headlines, is a tough gig, because writing is hard. (Duh.) There are nuances, skills and expertise beyond the purview of the engineers tasked with writing the algorithms meant to do the job. Sorry, engineers, but I’m grateful for your failure, temporary as it is, because it gives us all time to enjoy blogs like this one, written by a human at the end of a long work day. Yay typos.

Teaching machines to write and curate seems next to impossible, but there are already a few glimmers of success. AP has a bot that writes minor league baseball stories. There’s a bot that takes your data and turns it into a product description. The New York Times is experimenting with a bot that writes haiku based on news stories. It sounds rather charming, until you discover that a haiku like this:

He told students to
get their diplomas and shared
his dream of escape.

… came from a story about the murder of a rapper in Baltimore. Not what you might have been expecting when you read that poetry, huh? A bot can’t figure that out, yet. But wait.
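The systems behind these bots are proprietary, but the underlying technique is mostly template filling over structured data. A toy sketch of a baseball-recap bot, with hypothetical field names, shows both how it works and why it is brittle:

```python
def game_recap(game):
    """Turn a box score into a one-sentence recap via template filling.

    Structured data in, canned prose out. The template never knows
    what the data means, which is why context (like a murder behind
    a feel-good quote) sails right past it.
    """
    if game["home_score"] > game["away_score"]:
        winner, loser = game["home"], game["away"]
        ws, ls = game["home_score"], game["away_score"]
    else:
        winner, loser = game["away"], game["home"]
        ws, ls = game["away_score"], game["home_score"]
    verb = "edged" if ws - ls <= 2 else "beat"  # crude word choice by margin
    return f"{winner} {verb} {loser} {ws}-{ls} on {game['date']}."

print(game_recap({"home": "Durham Bulls", "away": "Norfolk Tides",
                  "home_score": 5, "away_score": 4, "date": "Tuesday"}))
```

Feed it clean numbers and you get serviceable prose. Feed it anything that requires judgment and the template has nothing to say.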

It’s strange how the mundane tasks of writing newspaper headlines and blog post titles can go so far astray. The problem begins with the people assigned to the programming: you can’t just swap out journalists for engineers, as Facebook did, and expect the engineers to program something appropriately journalistic. You get people having sex with sandwiches and false stories about Megyn Kelly getting fired. Why? Because smart people tend to be smart in their own field. Software may be eating the world, but it’s a fallacy to assume that everything can be programmed, at least not yet. Language is still too mysterious for that. Today. For now. The countdown has started, and things will not be this way for long.
