Reinvention done properly: John Haynes’ answer to the AI question

by Richard Holley

John Haynes will present “(A)I'm not scared: How AI could take your job (and what you can do about it)”

John Haynes has a bold claim: he might just be able to save your (working) life. 

A recurring theme in our industry is the idea of digital taking over. We hear of people falling into digital careers, and of consumers using digital products unquestioningly. Now, we’re told, the robots are coming for our jobs. 

At DigiFest, John will talk about AI and the future of work as we know it. His experience in product design speaks of embracing change and radical thinking. Artificial Intelligence is not something to fear; it’s something to get to grips with. Key to this is changing how we understand Artificial Intelligence.

When you inherit a situation, you inherit its pitfalls. 

People who shake up the world often didn’t set out to do so. They found themselves in a world of inadequate technologies and inefficient systems and decided to do something about it. What John can teach us is that being swept along by AI isn’t the only path available to us. By looking for legacy-free reinvention, we make AI work for us. When you inherit a situation, you don’t have to inherit its pitfalls. 

As soon as you get into a discussion on AI with John, you realise how much he has thought about it. He is interested in how we use technology and how, once we’ve built a system, we can quickly become prisoners of it. John refers to a cartoon he saw: “We all dreamed that AI would do our laundry and our dishes so that we can make art and write poetry. Not for AI to do my art and poetry so that I can do laundry and wash dishes”. Setting aside the fact that dishwashers already exist, there’s a powerful point here: if you aren’t careful with how you implement a technology, you won’t get the results you wanted. In fact, you might get the opposite. 

In a varied career, John has repeatedly come up against the idea that ‘technology is coming for your job’. Remaining philosophical, and learning from the mistakes people make, is one of his great strengths. He introduces me to the futurist Ben Hammerstein: 

“He's got this phrase that he uses… legacy-free reinvention (more on this later…). Thinking about the task, trying to break it down and asking those questions at each stage of the process, it helps you assess the value in each step. And as soon as you find non-valuable stuff, you can stop doing it”. 

Like anybody who understands what they’re doing, John makes it seem obvious. It isn’t, though. It’s radical: “It's not about looking at the process through the lens of the solution. Your solution won't come until you understand the ‘why’ behind what you do. What you're trying to get to is the underlying motivation behind doing these things”. 

Using examples in fields as varied as film production, the automotive industry and academic research, John demonstrates how automation can’t replace insight and ingenuity. 

“A lot of these systems, people don't understand them, so they feel threatened by them, like a large language model. It feels really like it's having a conversation with you. All it's doing is predicting the next piece of text, right? So, it's just autocomplete on steroids. It kind of crosses the uncanny valley quite quickly, because it feels like it, but there's no logical thought going into the responses. It's essentially a bit like a Google search, where you type in something, and it just pulls up a result. It isn't capable of making decisions. All it's doing is there's like a bell curve of all the possible results that it can pick, and it's basically picking something out of the multiple ones, right? But the real value of answering specific questions is in, like, the upper end of that bell curve. That’s where creativity, new ideas, thinking laterally, strategy, that kind of stuff is”.
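For the technically curious, here is a minimal sketch, in Python, of what “picking from a distribution” means in practice. The vocabulary and probabilities below are made up for illustration; they aren’t drawn from any real model.

```python
import random

# Toy next-token distribution for the prompt "The cat sat on the".
# These words and probabilities are invented for illustration only;
# a real language model derives them from its training data.
candidates = {
    "mat": 0.55,
    "sofa": 0.25,
    "roof": 0.15,
    "laptop": 0.05,
}

def pick_next_token(distribution):
    """Sample one token, weighted by its probability."""
    tokens = list(distribution)
    weights = list(distribution.values())
    return random.choices(tokens, weights=weights, k=1)[0]

print("The cat sat on the", pick_next_token(candidates))
```

A real language model repeats that single weighted draw, word after word, which is all the “conversation” amounts to.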

If creative decision making is too important to trust to AI, it’s worth thinking about how technology can free us up to make those decisions. As John puts it: “In the development space, in coding, there is a threat. A lot of junior development work is quite a basic application of what they do. So, all that AI will do is kind of raise the bar. Raise the bar for the threshold of what a junior developer is. I think that a lot of these tools that you have, like GitHub Copilot, and even just using ChatGPT to kind of learn to code, it's going to raise productivity. When productivity increases, if one person can do more stuff, then there's going to be more products in the world. And a world with more products needs more people to maintain it.”

What we’re talking about here is a world where technological innovation doesn’t destroy opportunities, it creates new ones, and we must adapt to embrace them.

“It's like we were saying with the car”, John says. “You think the car is going to replace people’s jobs, to take away a whole way of life. It actually created a whole new industry, and there are the second and third order effects of that. Like mechanics. Whole industries popped up. It’s still carrying on. You've got people who do same day car parts delivery, for example”.  

Whilst the idea of society and technology evolving together isn’t new, AI does carry with it a sense of bigger possibilities and greater change. Asked about the challenges this poses to social cohesion, John is in no doubt: “I'm not saying it's not going to challenge people. There's a difference between a fixed mindset and a growth mindset. Having a fixed mindset and just saying ‘well, this is what I do, and this is what I'm going to do for the next 30 to 40 years’… that isn’t true”. 

Ruminating on what the lion’s share of your career has in store for you raises many questions. One of these questions is about agency. If we’re not able to say what we’ll be doing in 30 years’ time, are we comfortable with surrendering that decision-making to the machine?  

“I mean, that's already happened. Blackrock. Biggest investment firm in the world, 21 trillion under management. For the last 25 years they've been developing a system called Aladdin that does integrated decision making, and that has recently gotten into purchasing real estate. And I don't mean commercial, I mean people’s homes. They’ve got to the point where they own family homes, and the only way to disrupt this stuff is to understand where the value lies in these things”. The more you talk to John about automation, the more you realise how far things have come: “No one can compete with Google in search. OpenAI is working on it… no one can compete with Amazon in publishing. They aren’t just a monopoly, they’re a monopsony. Monopolies aren’t the biggest threat. It's not one company owning all of the stuff that you can buy. It's owning the selling experience, right? Amazon owns the output as well as the input, then the production, then distribution, then the user experience, the whole thing”. 

Our conversation moves through different examples of businesses gaining power through automation and businesses losing power through not moving beyond legacy. John has some fascinating insight into the Hollywood industry: “The biggest threat to Hollywood with AI is that it’s going to change the model of distribution. There’s a lot of rhetoric that I don’t buy. Disney has, for years, been really exclusive with the films, the games studios, the manufacturing for all the toys and merchandise. Not to mention the experiential side, the theme parks. They want to keep all that. So, when Disney sits down and has conversations with the unions and talks about AI, it’s not in their interest to want to change that”. The same rings true for many legacy organisations, from TV to print media to transportation. There are some huge actors with an interest in us being scared of change.

A cursory glance at Mr Musk’s approach to transportation (on this planet, at least) is a great example of using technology to tinker, rather than create: “Electric cars, to lots of people, look like the future. I look at electric cars and think that's the death knell of the car. It's the car in its final form. Because actually the car is not the most efficient form of travel. It does a particular type of transport, but something like last mile transport is way more important. Those electric scooters, they’re closer to the ideal. They’re electric, they’re distributed. You can pick one up, ride it and get off it”. 

John’s attitude to Artificial Intelligence seems born of an innate sense of justice and a desire to improve the world. “The biggest concern I have is that people trust it too much and rely on it to make decisions”, John says. He refers to the growing number of governments using Blackrock’s Aladdin to make decisions and inform policy. “I think the only way around it is collective action. I believe that people are fundamentally good, right? But with things like the NHS, it’s on its knees and we need a care system in the UK. The only way that we're going to do that is by forming some sort of collective. The only way that we can change these things is by being involved, stepping up and being involved”. We talk about the possibility of reimagining society, using democratising technology to share resources, power and information. 

“Coming back to the purpose of the talk,” John says, “I want to try and give people a mental framework that they can take away and use to challenge their thinking. So they can ask: what is the legacy-free reinvention approach?”

In the digital domain, we are largely self-taught, and society adapts to new technologies. How we implement technology will dictate the role it plays in our lives. Luckily, people like John are here to help us understand our relationship with technology. Unwise people aren’t people who don’t make mistakes; they’re people who don’t learn from their mistakes. 

