Many of Amazon's least-paid workers aren't in its warehouses, but online. They're the workers of Mechanical Turk (MTurk), Amazon's marketplace where companies can pay individuals, referred to as Turkers, to complete micro-sized tasks, called Human Intelligence Tasks (HITs), for pennies a pop. The tasks vary widely, and can include transcription, writing, editing, and content moderation, among many other things.
But Turkers aren't just helping to translate sentences or tag pictures on Instagram, they're contributing to academia. As a post published Wednesday by Vanessa Williamson on the Brookings Institution's TechTank blog points out, professors and researchers are increasingly turning to the platform to mine data for their studies. According to the post, "a search on Google Scholar returns thousands of academic papers citing MTurk, increasing from 173 in 2008 to 5,490 in 2014."
MTurk appeals to academics across many disciplines, from the STEM fields to philosophy. It allows them to gather data quickly, easily, and, most importantly, cheaply. Researchers are also able to source data from a wider swath of participants than might be available on the campuses of their respective universities, including people who are older or from different geographical regions.
The problem is, in some circumstances it's not clear whether the data gathered via Turkers is valid, and there are questions about whether it's sourced ethically, since at least some people who work on MTurk are paid far below the federal minimum wage of $7.25 per hour. In her post, Williamson wrote: "I paid hundreds of Americans living in poverty the equivalent of about $2 an hour."
Like Uber drivers, Turkers aren't considered employees of Amazon, but rather independent contractors. While workers' identities are verified (Amazon reportedly checks users against their IRS documents, and new workers must wait out a probationary period), Amazon does not take responsibility for things like ensuring workers earn an hourly minimum wage, or handling their tax withholding and contributions. MTurk functions like a micro-sized temporary work agency: individuals compete to snag extremely short gigs that are often on the site for only seconds before they're claimed.
How to make ends meet while working on MTurk is a popular topic on the platform's subreddit, r/mturk. For example, user EveryManALion asked in a post yesterday how they should report their MTurk income to the IRS and how it will impact the food stamp benefits that they receive.
While relatively rare, it's possible for requesters to "reject" the work that Turkers have done, sometimes for no reason, jeopardizing their ability to be approved for new projects in the future. Workers have the ability to reach out to requesters to ask about what they may have done wrong or for the rejection to be reversed, but oftentimes their requests go unanswered.
Despite these concerns, several Turkers I talked to on Reddit reported that academic requesters are often the least problematic on the platform. "I think that around 50 percent of academic requesters try to pay fairly, which is more than the requester pool in general; meaning a higher percentage of academic requesters care about paying fairly compared to requesters who aren't posting surveys," Reddit user auralgasm, who said he works on MTurk full time, told me.
"For the most part, academic posters stick to the pay rates suggested by many mturk forums ($0.10/min), the surveys aren't as monotonous as the most frequently seen batch tasks, and at the end of the survey we're given a debrief that tells us how the work we have done is helping to make a contribution to our understanding of psychology/economics/etc. (with other tasks, it's sometimes very hard to see what the point of what you are doing is)," Reddit user xai pointed out.
One of the biggest issues with academic surveys on the platform is that they often take longer than advertised, so a Turker might be paid less than a dollar for what can end up being an hour of work.
There is a group of workers who often receive higher compensation for their work than normal: those who hold the Masters qualification. Masters are "workers who have demonstrated excellence across a wide range of HITs," according to Amazon. These workers enjoy higher compensation, according to several users I spoke to on Reddit, including MarciTX. "I Turk full time and [am] a master. If you go to MturkCrowd.com, most of those turkers are masters and averaging $200-2000 a week. Between $50-800 a day," MarciTX told me.
The problem is, it's not clear why or how Amazon awards the qualification. Some Turkers who have spent relatively little time on the site are awarded the qualification, while those who have completed tens of thousands of tasks are still without it.
It's also possible that normal Turkers are now being paid even less than before for their work. Requesters pay twice for a task: once to workers (the "Worker Reward") and once to Amazon (the "Mechanical Turk Fee"). The Mechanical Turk Fee is calculated from the Worker Reward, although it's paid separately; in other words, Amazon does not take any cut of the workers' pay. Instead, it charges the requester 20 percent of the offered wages, plus an additional 20 percent if a researcher needs their survey or activity to be completed by more than 10 people, which is almost always the case.
In June of last year, Amazon doubled its base commission rate from 10 to 20 percent. Because that percentage is based on, and added on top of, Turkers' wages, Turkers say the new structure has caused requesters to pay lower wages for the same work to save money. For example, if a researcher has a hard budget of $500 for conducting surveys, Amazon increasing its commission fee means the researcher would have to either run fewer surveys or cut wages to make up the difference.
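The budget math above can be sketched in a few lines. This is a back-of-the-envelope illustration only: the function names are mine, and the rates follow the article's description (a 20 percent base fee on the Worker Reward, plus another 20 percent for surveys completed by more than 10 people, versus the old 10 percent base rate).

```python
def requester_cost(reward_per_hit, num_workers):
    """Total cost to the requester for one survey run under the current fees."""
    # 20% base fee, plus 20% more when more than 10 workers complete the task
    fee_rate = 0.20 + (0.20 if num_workers > 10 else 0.0)
    return reward_per_hit * num_workers * (1 + fee_rate)

def max_reward(budget, num_workers):
    """Highest per-HIT reward a fixed budget can support under the current fees."""
    fee_rate = 0.20 + (0.20 if num_workers > 10 else 0.0)
    return budget / (num_workers * (1 + fee_rate))

# A fixed $500 budget for 500 respondents: under the old 10% base rate
# (30% total fee) the reward could be higher than under the new 40% total.
old = 500 / (500 * 1.30)    # roughly $0.77 per survey
new = max_reward(500, 500)  # roughly $0.71 per survey
print(f"old: ${old:.2f}, new: ${new:.2f}")
```

On a fixed budget, the fee increase comes straight out of the per-survey reward, which matches what Turkers say they observed.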
"This dropped survey wages quite a bit; I do a lot of surveys and noticed a definite plunge in what I earn from those," auralgasm told me. "I don't think it's ethical to recommend people to stop posting work, [on MTurk] they should just ask their university for more money," they said.
When asked to comment on the changes in pricing on Mechanical Turk, Amazon pointed toward several blog posts that provide background on the decision.
When asked how Amazon protects Mechanical Turk workers from scammers or shady requesters, a spokesperson from Amazon stated that, "We employ several mechanisms. Some are visible to Workers, while others happen in the background. For example, Workers can flag Requesters and HITs for investigation by Amazon. These investigations can result in corrective actions including Requester warnings or suspensions. Internal reviews happen periodically based on Requester behaviors and actions."
One of the biggest issues with academics using MTurk has nothing to do with pay at all. Many researchers simply don't know how the site works or how to use it properly. For example, in 2014 a researcher trying to prevent Turkers from retaking his survey reportedly blocked hundreds of them by accident on Christmas Eve. He didn't realize that Amazon sometimes suspends a worker's account after they're blocked, which can prevent them from earning income until their account is restored.
"Another really common thing is requesters having an error in their surveys (for instance, forgetting to ask people their gender) and then rejecting everyone who took the survey because they didn't want to pay for bad data," auralgasm told me. "This means that all the time you spent on the survey was wasted. It sucks that the requester will lose out on money if they approve, but it's not OUR fault. They should have been more careful."
The online Mechanical Turk community is keenly aware of the ethics of conducting academic research through the site, and its members have created a wiki page dedicated to providing guidelines for academic requesters who use MTurk. The guidelines grew out of Dynamo, a group working toward reforming work on Mechanical Turk.
Even if the contractors on MTurk are paid ethically, it's not clear whether the data they provide is skewing researchers' findings.
As political scientists Kathleen Searles and John Barry Ryan wrote in the Washington Post, it's definitely reasonable to use MTurk to gather data in certain circumstances. The trouble is knowing when. Even when it's appropriate, it's important not to oversell the results because "studies using MTurk do not always lead to the same conclusion as those using national samples," they said.
Many of the tasks on the site are completed by a very small set of workers, who frequently chat with one another about the surveys they are completing in various online forums. It's possible that using such a small sample of people might skew research results across many different fields. According to Williamson's Brookings Institute post, around 80 percent of tasks on the site are completed by roughly 20 percent of the site's reported half a million workers.
In a paper published in this month's PS: Political Science & Politics, Williamson calls for academics to examine the way that MTurk and other crowdsourced data collection methods are used, and to push academic journals to publish only ethically sourced data.
In the meantime, Turkers have taken matters into their own hands, creating tools like Turkopticon, a Chrome extension that allows Turkers to review requesters, in order to help each other avoid the ones who abuse the site. The extension is maintained in part by Lilly Irani, a professor at the University of California, San Diego.
There's no doubt that MTurk provides a lifeline for some workers who are unable or unwilling to earn income in other ways. "Turk basically allowed me to pay off my 10 year grad school loan in TWO years, saving me approximately $6k in interest charges," Reddit user rdulany told me. But many Turkers are paid far below the minimum wage, and it's unclear whether the data they supply to academic researchers is valid.
As researchers continue to question the ethics around MTurk, its workers will keep completing many of the strange and monotonous tasks that keep academia and the internet at large churning along.
Update and correction: This story has been updated to add comment from Amazon. An earlier version of this story misidentified Lilly Irani's teaching university affiliation.