Robots have already taken over our work, but they’re made of flesh and bone
Brett Frischmann and Evan Selinger
Many jobs in the modern economy have been sapped of their humanity. How should we resist the rise of ‘digital Taylorism’?
Most of the headlines about technology in the workplace relate to robots rendering people unemployed. But what if this threat is distracting us from another of the distorting effects of automation? To what extent are we being turned into workers that resemble robots?
Take taxi drivers. The prevailing wisdom is that they will be replaced by Uber drivers, who in turn will ultimately be replaced by self-driving cars. Those lauding Transport for London’s refusal to renew Uber’s licence might like to consider how, long before that company “disrupted” the industry, turn-by-turn GPS route management and dispatch control systems were de-skilling taxi drivers: instead of building up navigational knowledge, drivers increasingly rely on satnavs.
Fears about humans becoming like machines go back longer than you might think. The sort of algorithmic management we see in the modern gig economy – in which drivers and riders for digital platforms such as Uber and Deliveroo are dispatched and managed not by human beings, but by sophisticated computer systems – has its roots in a management theory developed by Frederick Taylor in the early 20th century. As a young man, Taylor worked as a shop foreman for a steel-making corporation in Philadelphia, where he diagnosed inefficiencies he saw as being products of poorly structured incentives, unmotivated and sometimes shirking workers, and a huge knowledge gap that rendered management ineffective. Managers, he proclaimed, knew too little about the workforce, their tasks, capabilities and motivations.
Taylor and his disciples extolled the virtues of breaking down tasks into inputs, outputs, processes and procedures that can be mathematically analysed and transformed into recipes for efficient production. Over decades, and across different industries, his theories have been used to apply time-and-motion studies to workplaces, workers and what they produce. The assembly line is the most recognised example of Taylorism: unskilled workers engage in repetitive, mindless tasks, attending to semi-finished parts that, in the end, are combined into a whole product.
Over time, Taylorism became synonymous with the evils of extracting maximum value from workers while treating them as programmable cogs in machines. An early case in point: in 1917, at the height of wartime, approximately 100,000 Australian workers took part in a general strike. The action was sparked by the introduction of time cards, which recorded every minute spent at jobs and breaks. Today, it’s hard to think of time cards, even digital ones, as innovation. They have faded into the background of office life, business-as-usual for many workers. But back then they were seen as a new tool of oppression. Managers could use the information to learn how fast everyone worked and demand a quicker pace. This demeaning model was decried as “robotism”.
Taylor’s approach jump-started debates about data-driven innovation and surveillance that continue today. The modern, digital version of Taylorism is more powerful than he could have ever imagined, and more dehumanising than his early critics could have predicted. Technological innovations have made it increasingly easy for managers to quickly and cheaply collect, process, evaluate and act upon massive amounts of information. In our age of big data, Taylorism has spread far beyond the factory floor. The algorithmic management of the gig economy is like time cards on steroids.
And it’s not just taxi drivers who are being de-skilled. The logistics and trucking industries use even more extensive and intensive data-driven systems to control fleets and employees. Employers deploy an array of sensors to track location, timing, driving and other aspects of performance. Complex algorithms, analytics software and other hidden components of management systems generate intelligence, which is then used to instruct truck drivers. Cornell University professor Karen Levy has documented how these intense management systems reduce workers’ autonomy and can incentivise sleep deprivation and speeding.
Technology also allows much more sophisticated performance management of employees than during Taylor’s lifetime. Back then, employee reviews were costly in resource terms. They required face-to-face meetings or documents that took time to pull together. Today small businesses as well as giants such as Amazon are using digital tools to create continuous streams of data for employee appraisal. Constant monitoring, and the addition of peer review to supervisor feedback, can create overly competitive, and sometimes hostile, dynamics between employees.
It’s not just the intensity of the monitoring that is different. Surveillance is increasingly hidden. In Taylor’s analogue era, workers were acutely aware when they were being observed by management with stopwatches and notebooks. Today, management tools are much less visible. A cashier at a fast-food franchise who rings up purchases with a virtual cash register app on her tablet might be unaware of the programs running surreptitiously in the background, logging keystrokes, recording audio or video, transmitting data and continuously rating her performance. Workers who know that their boss monitors calls, texts and browsing on their employer-issued smartphones might be surprised to learn that those devices also transmit geolocation data, allowing their movements to be tracked 24/7.
France’s ‘right to disconnect’, banning workplace emails on weekends or holidays, is a step in the right direction.