S. A. Applin, Ph.D. is an anthropologist whose research explores the domains of human agency, algorithms, AI, and automation in the context of social systems and sociability. You can find more @anthropunk.
When Amazon’s algorithms ban shoppers, without warning, for returning too many items, it isn’t just inconvenient. It cuts people off from participating in Amazon’s increasingly dominant economy. This threat sets a dangerous precedent, subtly encouraging us to change our behavior to appease its algorithms.
This week, I got a letter from Zappos, which is owned by Amazon, telling me that it might break up with me. It said it seemed I wasn’t happy with its level of Customer Service (I was), and that I had returned too many shoes, so as a way of improving that Customer Service, it would stop providing it.
“Although our primary focus is the 100% satisfaction of our customers, we also need to take into account the cost effectiveness of continuing to do business on an individual basis,” the email read in part. “After careful review of your account history, this email serves as a warning, that any future issues that occur after the date of this notice may result in the permanent closure of your account and any associated accounts.”
Zappos also asked what it could do to improve Customer Service. (Hint: keep my account open and don’t send me pre-break-up letters.)
In recent years, Amazon’s dominance of the retail industry has forced many small and local businesses to shut down all over the country. In its mission to “dominate retail,” Amazon has hollowed out Main Street and is increasingly where many Americans do all their shopping. The retail business Amazon has built now exists as a walled ‘obey-to-play’ garden, where people are encouraged to shop for everything and have it delivered. Amazon has thus placed itself in a new role: a single vendor providing an entire city’s worth of retail.
In this context, if Amazon begins banning people from its platform for unclear, algorithmically generated reasons, the people affected will not merely be inconvenienced; they could be critically cut off from a supply chain that has subsumed other options. This is the equivalent of Amazon not only kicking someone out of a store, but kicking them out of an entire marketplace that used to exist in the public physical sphere and has now been privatized into a digital one.
In my case, I called Zappos. Apparently, ordering more expensive shoes and then returning them had raised concerns about my shopping behavior. Go figure.
The Customer Service agent said it was solely the dollar amount of my returns that triggered a flag. Zappos sent me the letter because it looked only at the monetary value of what was being returned, not the context of the return.
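To illustrate how a context-blind rule like this can misfire, here is a minimal sketch. It is purely a guess at the mechanism; Zappos has not published its rules, and the threshold, field names, and function are all invented for illustration:

```python
# Hypothetical sketch: a return flag that keys only on dollar amounts,
# ignoring why an item came back (sizing, defects, handmade variance).
RETURN_VALUE_THRESHOLD = 500.00  # assumed cutoff; the real value is unknown

def flags_account(returns):
    """Flag an account if the total value of returned items crosses
    a fixed threshold, with no notion of context."""
    total_returned = sum(r["amount"] for r in returns)
    return total_returned > RETURN_VALUE_THRESHOLD

# Two pairs of handmade shoes ordered to bracket sizes, both sent back:
returns = [
    {"amount": 320.00, "reason": "sizing"},  # reason is recorded...
    {"amount": 320.00, "reason": "sizing"},  # ...but never consulted
]
print(flags_account(returns))  # True: flagged despite a legitimate reason
```

The `reason` field sits right next to the amount, but a rule that only sums dollars never reads it, which is exactly the gap the agent described.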
The shoes I ordered are handmade. Handmade shoes are wonky. They often run irregular in sizing, the left shoe may fit differently than the right, and they have other idiosyncrasies. Sometimes I need to order two pairs in the same size to try them, or to bracket sizes to get the right fit.
In general, shoes are things that need to be tried on, that take some time to be fitted, that vary from brand to brand—often within styles of the same brand—and require a tactile component to their purchase. Shoes are not books.
While I don’t buy handmade shoes often, I was apparently doing it enough to get the pre-break-up letter. The Zappos agent I spoke with understood my explanation of the context, but could do nothing about the flags that had been triggered. They could only add “notes” to my account; they could not reset its flag status.
Last week, there were several articles about Amazon cancelling customer accounts for making too many returns or requesting too many refunds. There are numerous plausible reasons for such bans: fraud, wearing items and then returning them, or other criminal acts. But there are also legitimate reasons based on human choice and preference.
What stands out is that Amazon and Zappos expect homogeneity and normalized consumer behavior, and have programmed in an inability to understand that shoes, clothes, and other personal items are very different from books and other commoditized goods. Both seem to use the same algorithms to flag purchases across a range of goods. This is problematic because Amazon’s algorithms do not appear to understand context, and so they likely trigger flags based on very simplistic usage patterns. Because humans exercise agency and think and act in ways well outside expected norms in certain contexts, our behavior does not match algorithmic expectations of what that behavior should be. The system is too rigid.
There are the usual reasons for returning items: they are defective, damaged in packing, damaged in transit, or damaged on arrival. When this happens, a customer will want to return the item. But if a customer purchases something outside the norm to begin with (handmade shoes on Zappos), or something expensive or otherwise unusual, the algorithms spring into action. Sometimes incorrectly.
Where the system seems to be broken is in understanding why. Humans don’t understand what algorithms and AI are doing, and algorithms and AI currently cannot grasp the concept of “why” and thus, the context of what humans are doing. It seems the algorithms can only count and flag patterns and compare those patterns to others. For example, if most people buy an item and keep it, someone who returns the same item may be flagged. If that happens enough, that person could be singled out and sent a letter, or have their account terminated without any warning, and perhaps without any human intervention.
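The count-compare-escalate loop described above can be sketched as follows. This is a hypothetical reconstruction, not Amazon’s actual logic; every threshold here (the population return rate, the outlier multiplier, the strike counts) is an assumption made for illustration:

```python
# Hypothetical sketch: flag a customer whose return rate is an outlier
# versus the population, then escalate on repeated flags. No step asks
# "why" the returns happened; the system only counts.
POPULATION_RETURN_RATE = 0.05  # assumed: ~5% of buyers return this item
OUTLIER_MULTIPLIER = 5         # assumed: 5x the norm reads as "abnormal"
WARNING_STRIKES = 2            # assumed: second flag draws a warning letter
TERMINATION_STRIKES = 3        # assumed: third flag closes the account

def review_cycle(return_rate, strikes):
    """One automated review: compares a pattern, never a reason."""
    if return_rate > POPULATION_RETURN_RATE * OUTLIER_MULTIPLIER:
        strikes += 1
    if strikes >= TERMINATION_STRIKES:
        return strikes, "terminate account"
    if strikes >= WARNING_STRIKES:
        return strikes, "send warning letter"
    return strikes, "ok"

# A handmade-shoe buyer who brackets sizes returns 2 of 2 orders (rate 1.0),
# a legitimate pattern the counter cannot distinguish from abuse.
strikes, action = 0, "ok"
for _ in range(3):
    strikes, action = review_cycle(1.0, strikes)
print(action)  # "terminate account" after three context-blind reviews
```

Nothing in the loop distinguishes size-bracketing from fraud; both produce the same counts, so both end at the same action.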
While Amazon says it does have humans review these accounts, it also says it has 300 million customers worldwide. It is unlikely that humans could review all cases. Even in situations where a human does review a case, the Customer Service agents operate via a series of scripts and processes that can further cloud a person’s plea for account restoration.
Unfortunately, this type of “Customer Service” and algorithmic surveillance sets up a kind of algorithmic intolerance that creates biases against people for having human needs and preferences—and bodies that need to fit items cut and measured either to a mass standard most of us do not easily conform to, or a handmade one that requires nuance.
Amazon has swallowed the retail world, becoming many people’s go-to store. However, if people, once having purchased items, have personal, emotional, or physical needs that are not quite satisfied by Amazon’s products, and this happens over an unspecified time, they are unceremoniously dumped back into the Commons where Amazon has decimated local retail.
Thus, we’re allowed to shop at Amazon and its subsidiaries as long as we keep what we buy and have no problems with it, the shippers who sent it, or the third party within Amazon who sold it. As long as our actions don’t trip an unknown algorithmic pattern, we’ll be OK.
This might be fine for people who aren’t fully installed in the Amazon system for groceries, entertainment, clothing, household supplies, and everything else. There are still Walmarts and Targets and other online retailers, but these, too, have suffered as a result of Amazon’s dominance.
However, this system begins to darken if, as Amazon creeps back from being online into the Commons, it brings these “bans” with it. Amazon Go stores operate without cashiers (human or otherwise) and require an Amazon account; Amazon bookstores in the community are linked to its online store; and if other types of Amazon retail move back into the Commons, they could exclude people whose return patterns its algorithms rejected.
What happens when someone has invested in a household Echo and/or Dot system and Amazon cancels their account? They lose access to their smarthome, as well as a substantial hardware investment. As Whole Foods markets (owned by Amazon) continue to operate in communities and offer special discounts to Prime customers, those no longer on the system will be excluded—and eventually, Amazon could choose to make its grocery stores “members only” after running down the competition.
It isn’t that Amazon, or even Zappos, is bad. They’re great for lots of people and have clearly homed in on a formula that people appreciate (as long as they don’t return very much). However, they are being run by algorithms that have real consequences for people, and those algorithms are not being properly monitored. It may not matter today if Amazon or Zappos restricts a customer from its retail platform, or its other products and services. But it does matter what happens as Amazon grows, and as it continues to subsume and consume the Commons—and us.
For if we begin to change our behavior, yielding our choices and preferences to appease a rather kludgy algorithm in order to maintain a certain type of privilege, we forfeit our agency and instead volunteer for a type of retail tether that erodes our dignity and our options for choice.
It is in these ways that we begin to be more easily controlled—and that becomes more dangerous as more of our food, medication, and everything else is filtered through one behemoth.
Chances are, if I decide to buy shoes from Zappos again, it will trigger the same flag. I can find the shoes elsewhere, and will likely buy them elsewhere because I want to support local businesses. Having things shipped home is all well and good until you are algorithmically tracked, judged, and sentenced.
I’m going to balance my shopping, and shop local more often when I can. Before it goes away for good—and before I am no longer allowed to participate in Amazon’s economy.
Correction: An earlier version of this article said Amazon Go stores have no human employees. They have no human cashiers.