The promise and pitfalls of relying on artificial intelligence
UMass Boston professor worries about ceding choice to machines
ARTIFICIAL INTELLIGENCE IS changing the way we think about authorship, art, and white-collar work. It may be changing how we think, full stop.
As artificial intelligence, or machine learning, becomes more integrated into people’s everyday lives, it runs the risk of “replacing moral judgments, or by replacing practical judgments, or replacing everyday judgments,” Nir Eisikovits, professor of philosophy at the University of Massachusetts Boston, said this week on The Codcast.
Decisions as minor as what to watch on a streaming service or as major as whether to approve someone for a mortgage are being gently, or not so gently, automated.
Eisikovits founded the Applied Ethics Center at UMass Boston. Since the center opened in 2017, Eisikovits and his colleagues have been thinking about the impact of artificial intelligence on moral decision-making and its unique relationship to work and creativity.
Flashy fears about artificial intelligence are probably not the best targets for human hypervigilance, Eisikovits notes. “One important misconception is that we’re moving closer and closer to a sentient kind of Skynet AI that’s capable of generating its own intentions and finally becoming a robot overlord,” he said. “I think that’s not in the cards for the near future.”
A risk of increased reliance on artificial intelligence is, instead, about short-cutting many of the basic moral and creative decisions that people make. If a machine can suggest possible song samples for aspiring musicians, or go further and create a music video from a popular artist nearly indistinguishable from the real thing, the relationship between artist and artistic product can get confusing.
“We admire great performances, because they represent the kind of giftedness that awes us,” Eisikovits said. “And in the case of technologically generated performance, our admiration moves from the giftedness to an engineer or to a pharmacist or to a lab person. And it’s not clear what art can still do for you under those conditions.”
There are some upsides to better machine learning, in theory. A person’s “spidey sense” could boil down to personal bias that a fairer machine learning system could counteract. On the other hand, the bias of a programmer, or of the pools of data that machines are trained to sort through, could fundamentally taint the outcome.
Famously, Eisikovits recalled, facial recognition technologies were better at recognizing light-skinned faces than dark-skinned faces because the data used to train the programs included far more images of White people.
“Importantly, if there’s enough political will, and commercial pressure, and both, then those biases can be fixed, just like Microsoft fixed its facial recognition software in response to pressure,” he said. “So in some way, I think the extension-of-bias question is a big question. It’s important. But it can be addressed. What I think can’t be addressed, or is much harder to address, has to do with the loss of capacity from this replacing us in some basic functions.”
Artificial intelligence resembles industrial machinery in its capacity to replace workers: it can perform labor-intensive, low-skill work more efficiently – reviewing large volumes of documents in legal or medical fields, for instance.
It ties into a broader conversation about the nature of work and fulfillment. Is there an inherent value in having a human being pore over tens of thousands of pages, or summarize stock briefings, or pull together a basic marketing deck? CommonWealth last month considered the rise of the four-day work week, which proponents assert gives employees more time for their personal lives while, counterintuitively, getting the same amount of work done as a five-day work week.

“What seems to be happening is rather than jobs being replaced wholesale, parts of jobs are being replaced,” Eisikovits said. “So people’s job definitions are changing. I think there’s no way around that, meaning that sooner or later, you’ll need fewer people to generate the same kind of productivity.”