Crowdsourcing Swift

Earlier this week I led a class on the topic of Crowdsourcing. Since our discussion was focused primarily on Human Computation research, I took the opportunity to show a live demonstration of Mechanical Turk.

After some thought about appropriate perception-based tasks that could be outsourced to workers and return meaningful results between the beginning and end of class, I settled on a modern-day rewrite of Jonathan Swift’s A Modest Proposal.

If you haven’t read Swift’s famous 18th-century satire, I encourage you to do so. In it, Swift describes the plight of the impoverished in Ireland before offering a solution: for the poor to sell their children to the wealthy for food. The brilliance of the piece is in the cold rhetoric used to argue for such a shocking proposition.

Of course, a modern read of A Modest Proposal as a satire is different from a completely naive read, one where the reveal of the proposal is truly a shock and where there’s a risk that a reader may not recognize it as satire at all.

This is why the idea of paying workers to rewrite it in plain English, sentence by sentence with no context, provided much amusement. What would these workers think, looking at a sentence written in such unassuming prose and deciphering it, only to realize that it is about cannibalism? Even better, I suspected most wouldn’t realize it, except for those rewriting a select few sentences.

I compartmentalized the task into two steps: rewriting and voting. To constrain the rewriting task and keep turkers from simply offering back the same line, I had the rewrites done as tweets, which is to say written in 140 characters or less. Each line was rewritten either two or three times (starting with three, I lowered the count after observing less noise than expected) before being promoted to the voting stage. In the voting stage, workers were presented with the original sentence and the rewrites, and chose the best one.
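As a minimal sketch of how the rewriting stage might check a submission, here is roughly what that validation could look like in PHP; the function name and exact rules are illustrative, not lifted from the actual module:

    <?php
    // Hypothetical check for a submitted rewrite: it has to fit in a tweet
    // and must not simply echo the original sentence back.
    function is_valid_rewrite($original, $rewrite) {
        $rewrite = trim($rewrite);

        // Reject empty submissions and anything over 140 characters.
        if ($rewrite === '' || mb_strlen($rewrite) > 140) {
            return false;
        }

        // Reject rewrites identical to the original, ignoring case.
        if (mb_strtolower($rewrite) === mb_strtolower(trim($original))) {
            return false;
        }

        return true;
    }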

The rewriting and voting modules were written in PHP and MySQL over the weekend, and then modified to fit into Mechanical Turk tasks using Amazon’s Command Line Tools. I paid $0.11 for each rewritten sentence and $0.02 for each vote. At 64 sentences, this cost around twenty dollars, though the rewriting wage was notably higher than that of comparable tasks on the site.
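For a rough sense of where that figure comes from, here is a back-of-the-envelope estimate; the per-sentence rewrite and vote counts are assumptions on my part (the post only gives the prices and the range of two to three rewrites), not recorded totals:

    <?php
    // Rough cost estimate under assumed per-sentence counts.
    $sentences     = 64;
    $rewrite_price = 0.11;  // paid per rewritten sentence
    $vote_price    = 0.02;  // paid per vote
    $rewrites_each = 2.5;   // assumed average, somewhere between two and three
    $votes_each    = 2;     // assumed number of votes per sentence

    $total = $sentences * ($rewrites_each * $rewrite_price + $votes_each * $vote_price);
    printf("Estimated worker payments: \$%.2f\n", $total); // roughly $20, before Amazon's commission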

I have a somewhat hesitant relationship with paid human computation. Crowdsourcing with volunteers forces the organizer to be considerate of the crowd and to offer a satisfying intrinsic reward, but once you’re paying workers it’s easy to see people as simply labour, because they are. Though this isn’t inherently bad, it introduces a slippery slope toward an exploitative relationship. A Modest Proposal criticizes such dehumanization of citizens through its cold, systems-level view, which felt appropriate given that the experiment was partly a response to Soylent, a Microsoft Word plug-in for outsourcing document proofing to Turk.

The crowdsourced Swift is on Twitter now, repulsing people with his views over the upcoming week. Follow him at @swiftsays.
