Motherboard

Delivery Robots Will Rely on Human Kindness and Labor

If robots are going to invade public spaces, they will be subject to vandalism and pranking.

by S.A. Applin
May 8 2018, 12:00pm

Image: Starship Technologies

S. A. Applin, Ph.D. is an anthropologist whose research explores the domains of human agency, algorithms, AI, and automation in the context of social systems and sociability. You can find more @anthropunk.

In April, Starship Technologies announced that it is going to launch “robot delivery services for campuses.” Its goal is to deploy at least 1,000 delivery robots to “corporate and academic campuses in Europe and the U.S.” within the next year. It’s the latest in a long list of automated delivery schemes from tech companies big and small.

This is another version of civic disruption, only this time instead of deploying devices into the broader Commons, deployment is aimed at both public and private bounded Commons, which I call the micro-Commons. The micro-Commons are smaller, bounded domains, which have social and cultural rules of engagement for cooperation within that domain. Examples of micro-Commons would be a university or corporate campus, shopping mall, hospital, or convention center.

Starship Technologies appears to assume that the environments it deploys to will be completely cooperative with its delivery robots. Furthermore, there is an assumption that both humans and robots will function similarly across all of the micro-Commons campuses where they are deployed.

Spoiler: they won’t.

In 2015, M.D. Fischer and I wrote a paper called “New Technologies and Mixed-Use Convergence: How Humans and Algorithms are Adapting to Each Other.” We outlined the ways in which humans help algorithms to function. In the case of delivery robots, which are driven by algorithms, human interaction engages with the robot’s action, rather than directly with its software. We had several observations.

First, with each new “disruptive technology,” people are expected to rapidly adjust to an increasingly heterogeneous presence. This means that not only are people going to be expected to adjust to the delivery robots right away, they will be doing so within the context of dockless e-bikes or scooters, new apps or VR/AR on their phones, or whatever other new disruptive device is launched into the market and then onto the Commons or micro-Commons at around the same time.

Second, the rigidity of the programming and limited adaptability of the algorithm (in this case controlling the robot) will cause people to develop workarounds to figure out how to deal with this new disruptor in their domain. Thus people, and in this case robots, are going through an “adjustment period” in which they sort out how to share physical space within domains that have not previously included robots.

Companies that propose to use delivery robots are making an assumption that their devices will smoothly meld into the existing socio-technosystem, without requiring much labor (on the company’s part). This is largely because the labor model will be shifted to those in the micro-Commons who will be interacting with these robots on a daily basis—even if they are not recipients of deliveries.

The delivery robots will be deployed on different types of campuses—both academic and corporate—and as such they will require a boatload of human cooperation to function. For example, on a daily basis at a college campus, the delivery robots will be required to navigate and negotiate terrain and paths, the latter of which have pre-defined social navigation rules that enable people to cooperatively move throughout the landscape efficiently. Those navigating on paths may include pedestrians, golf carts or other small transport, bikes, scooters, roller skates/blades, Segways, hoverboards, and/or whatever new mobility thing gets released in the next year that people will be using—as well as other delivery robots, perhaps from other vendors with other navigation algorithms that are incompatible.


The Starship Technologies delivery robots, like other robots, will stop, get stuck, crash into things, and veer off the path, and the only way they will continue on their journey is if and when someone on the path stops what they are doing and where they are going in order to right each disabled robot and set it back on the path. They also might trip up and injure others on the path by stopping and causing traffic jams or sudden movements. Thus there are obstacles and issues of sociability that must be solved for delivery robots to become a useful and continued presence in these domains.

The robots will also have to deal with weather, doors, elevators, and other physical limitations. Current technology does not enable bots to open doors or control elevators, so in order for these bots to get around, they will require human cooperative labor provided by the people within whatever domain the robot is deployed to. This could be students, faculty, administrators, custodians, workers, visitors, or anyone else in a delivery robot’s path.

Potentially, the robot mishaps and the need for free human custodial labor will disrupt people’s movement and intent within the micro-Commons. This is a big assumption of cooperation on the part of Starship Technologies. Additionally, Starship Technologies may only get feedback that its robots are effective, not realizing that the robots’ delivery success is accrued only at the hands of invisible human helpers. This incorrect data will then be used as evidence for even more deployment, where even more people will be required to plug the gaps in this faulty model of automation.

All of this applies only if people are cooperative. They can also choose not to be. The robots can be bot-napped, signal-jammed, ridden, tripped, rolled into feces or chewing gum, stolen, or pranked and disrupted in any number of creative ways that humans can invent. The expectation of trust here on the part of Starship Technologies is phenomenally naive. It is expecting on-demand volunteer crowdsourcing from people who may not wish to cooperate.

The last point, and one of the most important, is about culture. The cultures and social rules of engagement at private companies and private educational campuses versus public college and university campuses, between regions of the U.S., and between countries in Europe are markedly different. Patterns of engagement, cooperative models, helpfulness, and even mischief all differ by region because of our cultural frameworks. When an autonomous robot is deployed in different regions, people will respond to it in different ways, and its programming must be developed to include ways to mesh with the social and cultural rules of mobility and carriage in physical space. The cultural myopia in this instance sets a precedent whereby the robots will only be successful within cultures that match their particular way of engagement.