If you work for a boss who has all the emotional intelligence of a computer, consider that someday your boss might actually be a computer. That’s not a dystopian fantasy — for some workers, it’s reality.
Take ridesharing. Millions of cab rides in the U.S. have started with someone pulling out their smartphone, plopping a pin on a map and waiting for a driver to show up and take them on their way. Uber started the mobile ride-hailing revolution a decade ago, and some transportation network company users have probably never called a cab the old-fashioned way.
Those who have called for a cab will remember talking to a dispatcher who coordinated their pickup. The dispatcher performed a middle manager function, communicating between drivers and passengers, allocating labor resources according to the company’s goals. Then Uber and Lyft came along and automated the dispatcher’s job. The automation of leadership isn’t likely to stop with the personal transportation revolution, according to a new paper in Computers in Human Behavior, which outlines a framework for further explorations into the changing nature of traditional human-to-human workplace hierarchies.
The paper proposes a conceptual framework of computers-as-leaders that can inform research, and a Leadership-TAM, or technology acceptance model. Researchers have used such models to predict “individual adoption and use of technology” and they have been “successfully applied to a broad range of different technologies,” the authors write.
Academic theories around automated leadership can help policymakers and the public understand the real consequences of computer bosses, but those theories are lagging behind real-world implementation, according to the paper. Drawing on technology acceptance research, the authors assume that for automated leadership to work, workers need to perceive their automated leaders as useful and need to be able to interact with them effortlessly.
“Technology philosophers say algorithms cannot be leaders because they are not legal persons and they cannot be sanctioned and they don’t have inherent moral sentiments and feelings and this is, like, full stop,” says social scientist Jenny Wesche, visiting professor at Humboldt University of Berlin and one of the authors. “Although I agree a computer is not a legal person, it’s important to be open to this paradigm in order to see the people who are working under the leadership of algorithms. If you say machines cannot be leaders you ignore the people who are already working in such situations.”
Computers: from paperweight to the corner office
Researchers typically frame this discussion around what they call “human-computer interaction.” When computers first entered workplaces they were viewed as tools. They were much more sophisticated than, say, a hammer. But, like a hammer, an early computer was a mere paperweight if it didn’t have a human telling it what to do. “The computer is a moron,” Peter Drucker wrote in a classic McKinsey Quarterly essay in 1967.
Around the turn of the 21st century, researchers started to explore computers as team players, with computers and humans acting in cooperation, even as peers. Almost twenty years on, with exponentially greater processing capabilities and machine learning, the algorithm-as-boss is here for many workers, particularly those in food service and the gig economy.
“Computers are becoming intelligent entities and are already making decisions that seriously influence human work and life,” write Wesche and her co-author, University of Fribourg psychologist Andreas Sonderegger.
Major chains in the service sector often use automated scheduling, and so do some hospital systems. A computer, not a shift leader, might tell your favorite barista when to show up in the morning. A few years ago, reporting revealed that the scheduling algorithm Starbucks used was creating havoc for some workers, who were sent scrambling for child care in order to make shifts at odd hours.
“Even if these algorithms maybe are not fully autonomous they nevertheless have a big impact on workers’ lives,” Wesche says. “It is more drastic in the gig economy because, from my personal view, the workers there are quite exchangeable.”
For example, Wesche says that a transportation network company may not be particularly invested in career advancement for drivers who use their smartphone application, “and this is different from most traditional companies, especially in higher-skill jobs.”
What’s a leader?
Organizational psychologists have examined from many angles how human leaders and subordinates interact. But when technology is added to the mix, the scholarship tends to focus on how computers can help human leaders and teams – less so the idea of computers as “active agents in leadership and team processes,” the authors write.
Some researchers put personal management styles – charismatic or inspirational, for example – at the core of being a leader. Leadership is “a shared human process,” workplace researchers Wilfred Drath and Charles Palus wrote in the mid-1990s. Other definitions of leadership, like that put forward by University at Albany psychologist Gary Yukl, are more functional and have to do with one individual guiding another in structuring activities and relationships. Workers with bosses who primarily do things that can be quantified – scheduling, establishing goals and priorities, monitoring job performance – are those seemingly most likely to encounter automated leaders.
“I think we will also in traditional companies see increasingly that functions will be automated, because it is much more efficient,” says Wesche. “But the question is, how do we design it?”
That design may need to account for some lost human element. Social exchange theory observes that workplace relationships can become something more than a financial transaction. Professional relationships can develop beyond an employee being productive for an organization in exchange for monetary or other compensation. Mutual trust between people builds over time, and this can play out in positive social ways. If a barista’s grandmother dies, their longtime boss might empathize, approve time off without a fuss, and trust the employee will be back to work when they’re ready – and, perhaps, be open to taking an early shift in a pinch.
“It’s important to do research on the way that humans, with their need for social contact, can interact with a computer leader so that they can in fact flourish at work, they can perform well and at the same time develop personally and experience well-being,” Wesche says.
Hey Alexa, how’s my TPS report?
Humans are arguably already quite comfortable interacting, at least on a basic level, with non-anthropomorphic algorithms. A growing body of research, for example, is exploring how adults and children interact with smart speakers. Alexa, the voice of the Amazon Echo, is backed by sophisticated algorithms and it reverberates from a body that has zero resemblance to the human form – but people can still form real connections with it.
“So my daughter thinks she knows Alexa’s habits and she can understand Alexa even if I can’t,” one parent recounted in a 2017 conference paper that analyzed 278,000 Alexa commands. “It’s kind of creepy. As I say it out loud it’s totally weird that my daughter is friends with a tower that sits on my counter.”
Advanced economies are complex and some workers may never interact, at least directly, with an automated leader. For them, the concept of computers-as-bosses may be moot. But for others, like gig workers and food service staff, the future is now — and there are still a lot of unknowns as to how automated leaders might change the lives of their human charges.
“It’s not the question of whether this future will come, or whether functions can be automated, or whether artificial intelligence is taking over more and more functions at work — it’s a question of how it’s going to happen,” Wesche says. “We should be aware of that. I think it is the responsibility of scientists and journalists and politicians to guide this development and set boundaries and discuss standards and discuss development — and not so much discuss whether it is going to come or not.”
SOURCE: Clark Merrefield
VIA: Journalist’s Resource