The Nature of the Beast
- Posted on September 6, 2018
- Estimated reading time 2 minutes
Every year, you hear stories about bears in Yosemite or Yellowstone National Park that need to be relocated because they have lost their natural fear of humans. Or you hear stories of people on safari trips who are surprised when the wild animals they encounter react to them in an unexpected way. There’s no doubt that invading something’s natural habitat can disrupt the order of things. As we humans become more accustomed to living with technology as part of our daily lives, some of those same phenomena are happening.
While we quite enjoy having something on hand to update us on the weather, stream our music or tell us a joke, we can get upset when we realize the same gadget is listening to us constantly and potentially recording what we say. We enjoy the free service that enables us to keep up with our friends, family and acquaintances but draw the line at any of our browsing history or personal data being shared. Or, much more consequentially, we expect driverless cars to drive perfectly—to the point that they can anticipate and account for the sometimes-irrational behavior of pedestrians, bicyclists and other vehicles. For instance, when an autonomous vehicle struck and killed a pedestrian in Arizona earlier this year—the U.S. state with the highest rate of pedestrian deaths per resident—it was international news. How often do you hear, even locally, about pedestrian fatalities involving human drivers?
While it’s understandable that we hold technology to a higher standard than humans when it comes to rational decision-making, does that mean we leave no room for error? And if we want technology to uphold a high standard, shouldn’t we expect to give something in return? Like vigilance as a pedestrian, or an acceptance that an always-on virtual assistant must always be on to be effective.
If you look at old videos from Yellowstone Park in the 1970s, you will see people feeding and interacting with bears. As park rangers realized the dangers for both parties in the encounters, they worked to change the culture and re-establish a healthy respect for powerful, wild creatures.
With AI and machine learning, we are unleashing a powerful new creature into our environment. As tourists did with bears, many people expect technology to behave according to—or at least be aware of—human standards of decorum. In both cases we often try to say, “You can interact with me in my space when I want you to, but otherwise, you need to leave me in peace.” This will be no more successful with machines than it was with bears. We need to identify and shape the right relationship between humans and technology, and that calls for a thoughtful approach to digital culture and digital ethics. Digital ethicists are playing a role much like that of the park rangers of the 1970s: trying to build a healthy respect for the power of technology while also keeping us from harm.
How much protection is enough?
In 1865, the British Parliament passed a series of laws to protect horses and people from horseless carriage drivers, including a requirement that someone walk in front of the car waving a red flag to warn of impending danger (the source of the term “red flag laws”). This shows it is possible to go too far in the name of safety and protection. As much as it would be good for job creation, I don’t think anyone wants to start an industry of red flag wavers for autonomous vehicles.
So, who are the park rangers of the digital world? At Avanade, we believe that the organizations leading the charge in building, using and applying the technology should ultimately take responsibility, and we are already working with our clients on defining a digital ethical framework. At a broader level, The Partnership on AI brings together 50+ companies in 9 countries to explore the impact of AI on people and society.
Somewhere between bears poking into car windows looking for food and red flag wavers is the place where we need to be when it comes to living with technology. A proactive approach and a healthy respect (maybe even paranoia), combined with the knowledge and expectation that sometimes things go wrong—despite best intentions—seems to be a good place to start.