Back in the 1990s (I think), there was a TV series on the BBC in which a successful businessman would be brought into failing businesses. He would examine what was going wrong and make suggestions.
In an episode I’ll never forget, a production line of people was making soft toys. The work was repetitive; there was no variation in what the staff did, day in, day out.
He asked all the staff for suggestions, and with their help the company implemented them.
A year later the TV show went back.
The business was still failing. The workers’ suggestions had been implemented by the company, but they had enormously, and negatively, impacted their working environment.
The production line had been reshaped to make it more efficient. This meant that, from then on, none of the workers had any excuse to get out of their chairs.
The small but highly motivating conversations they had every day, as they got out of their chairs to move the toys from A to B, had made the job bearable. Eager to show themselves in a favourable light for the TV cameras, and not realising that they needed those conversations to make the job bearable, they had mistakenly co-opted themselves into a demonstrably worse future.
They did not know it at the time, but with hindsight, their efficiency drive had taken something from them, something they regretted letting go of. It was now gone, and there was no hope of a return to how things had been.
That’s how I’m starting to see AI.
We’re co-opting our future selves, and all those who follow us, into environments in which humanity is harder and harder to find: a future in which work and decisions must happen at the speed of an electron, and most of us will have little to no oversight of any of it. We risk reducing the moments we have to do the things that make life worth living.