The Business of Human Technology – Episode 5

The following is a transcript of the 5th episode of the podcast This Human Business, which is available on iTunes, Stitcher, and Castos.

Jonathan Cook: Hello, and welcome to This Human Business, an expression of the growing movement of professionals seeking to reassert the role of uniquely human contributions to business.

This movement has formed in reaction against the idea that all innovation means technological development, that we’ve reached the end of human ability, and now need to seek a transcendence of our humanity through digital technology. It’s a belief that has dominated business culture this century.

As a cultural belief, however, simple technophilia is just a narrow entryway into a broader range of ideas available to the digital revolution. One thing holding back Silicon Valley from a fuller exploration of these possibilities is that the new digital business orthodoxy has been positioned by its advocates not as a set of beliefs, but as a collection of facts beyond debate.

The gurus of Silicon Valley spin wonderful yarns about the future, and then get tangled up in those yarns so tightly that they come to believe the future they have predicted is already here. It’s easy to forget, given this zeal, but there is not yet any true artificial intelligence. The Singularity has not yet happened, hasn’t even appeared on the horizon, and may in fact never occur. No one has yet transcended humanity.

Let’s lay one claim to rest: The people of Silicon Valley are not beings of pure rationality. They are as grounded in culture as the rest of us. Their forecasts of the future are simply new mythologies, full of potent metaphors. They can code and quantify all they like, but the people who work in big tech still can’t help but remain human.

So, the movement for an increased role for humanity in business isn’t a luddite rejection of technology. It’s an effort to bring the roles of automation and humanity back into a mutually-respectful balance. This episode of This Human Business focuses on the effort to restore this balance.

There’s a long way to go. It isn’t just a matter of protecting our jobs from robots. As the ideological power of the digital industry increases, we find ourselves asking questions that a few years ago would have seemed ridiculous. Prime among these questions: What is the point of a human being any more?

I want to introduce you to Andy Akester, a researcher into commercial culture whom I’ve collaborated with many times over the years. The first thing you need to know about Andy is that he is one of the most thoughtful, sensitive people you’ll ever meet. He puts soul into his work. Nonetheless, when I spoke to Andy recently, he told me that he worries he’s become like a robot in his work.

Andy moved into the business world after leaving a career in counseling, and the course that he took in that journey seems to parallel a larger journey that humanity as a whole may be taking as we accept the presence of more and more technology in our professional lives.

Andy Akester: Isn’t that terrible? I’m like a robot.

Jonathan Cook: What makes you say you’re like a robot?

Andy Akester: The thing that’s the most human element of the field that I had chosen is the thing that at its core was what shook me the most. So, I feel like I’ve moved away from that depth of true, kind of like human, striving work, and then that maybe sometimes it feels like I’ve landed quite comfortably in the realm of the more superficial or aspirational elements of the human experience, sometimes, yeah I think I sometimes long for a little bit deeper connection. There’s a part of me that still very much loves people and is very motivated by connecting with people and wants the best for people. And I feel like we get flashes of being able to uncover some of that, or I get to leverage some of that at least in the work that we do. It’s probably a part of what makes me as marginally successful as I have been with it. Sometimes I wish maybe for a chance to go a little bit more, it would be great to have some follow up with people and really can be invested a little bit more again.

I think this is one of the hardest aspects of living as a human in our world, is we learn most in the mess. We learn most in the chaos. We thrive the most whenever we are a tad bit upended, or significantly upended, either one when we are upended, and I think that the business model itself is always sort of about guarding and girding against that sort of mess and that sort of chaos, right? We have to deliver a deliverable. We have to have a clear cut objective. We have to stick with our timeline and make sure that fiduciary contracts are upheld. It’s all about keeping the thing on the rails, whereas I think if we had the chance to be a little bit more messy, we might be maybe in a business setting we could get a little more deep that way.

Humanity and technology in particular are going to be strange bedfellows with one another in very intimate ways.

Jonathan Cook: There’s a conflict represented by Andy’s experience. The truest, most useful experiences in our lives come when we’re most vulnerable, but being vulnerable is terrifying, so we often find ourselves striving for intimacy in a world of increasingly facile business transactions. We run away from human vulnerability into the cold but secure arms of digital technology.

The most pressing concern expressed by business professionals these days isn’t that robots will take away our jobs – for most of us, that threat is a few years away still – but that digital technology is already forcing human beings to become more robotic in the way that they work. It’s a profound identity crisis we’re going through, to have come to the point where we wonder what the point of human effort is any more.

The strangeness of it all is represented by that gatekeeper’s code many of us now confront several times a day, thanks to CAPTCHA: “I am not a robot”. We ought not to take lightly the irony that this is a robotic algorithm that tests our humanity – and one that is routinely fooled by other robots. We have to prove to robots that we are not robots in order to gain access to robotic systems that take advantage of our humanity.

The disorientation caused by this crisis came full circle in last week’s unexpected podcast episode featuring Jan Kremlacek, who built a business around the idea that the robotic identity of artificial intelligence could be reframed as the identity of a domestic housecat.

We don’t know who we are anymore. Ubiquitous digital technology has alienated us from our own world. Is it any wonder that we have difficulty trusting digital business?

After decades of promises of better living through digital technology, skepticism is rising. Among those who have critical questions is Nina Kruschwitz, a researcher who edits the Journal of Beautiful Business.

Nina Kruschwitz: I admit to being a bit of a luddite. I am somewhat wary of technology, this technology especially, partly because it is created by humans, and humans’ intentions and motivations, and ability to execute on things can be really flawed. We are really not great at looking at how things come out over time. We are far more attracted to what is shiny and new and right in front of us, and pursuing that with little thought to the long term.

The little bit of experience I’ve had just with software sometimes, it’s so frustrating to be using a piece of software where I feel like I’m imprisoned in somebody else’s mind. Somebody else’s mind or some group’s mind created this software program, and I’m kind of stuck, you know? I’m completely at the mercy of how their mind works. So, what really scares me is ending up in a world that is the product of a bunch of, still, let’s face it, mostly white guys, possibly on the spectrum, sitting around in a room in Silicon Valley, thinking, “Hey, wouldn’t it be cool if we could live forever?” That does not appeal to me, for one thing, and for another, it’s doubtful and probably impossible that we would include all seven billion of us.

Jonathan Cook: Immortality, in the abstract, sounds lovely. The thing is, as Nina suggests, the quest for immortality never takes place in the abstract. It has impacts in the real world. The weird combination of emotional abstraction and economic elitism in Silicon Valley is creating the impression of a digital culture that’s being shaped by self-obsessed misanthropes.

The revolution in digital technology has brought unprecedented power to an isolated few, but as we watch Jeff Bezos fail to connect to the human beings in the communities around him, and witness Elon Musk smoking marijuana while bragging about the results of sleepless nights at work, we see a frightening gap between those who hold power and those who hold wisdom.

Nina Kruschwitz: You know, one of the things that we’ve talked very little about here that is sort of eerily compelling is the idea that computers are developing to the point where they are learning themselves and becoming conscious and becoming much smarter than we are, really, really quickly. What happens at that point?

Would we know? Would they want us to know, or let us know? Would they want to keep us around? Would we have to make a case for ourselves?

There are all kinds of intelligence, and to be really kind of trite, there’s a big difference between being smart and being wise. Wisdom doesn’t just come from data and experience. You can be 95 and still not be very wise. It’s how you engage with your internal world and the larger external environment, and with everything that’s basically invisible and immeasurable, things that we treasure but can’t exactly measure or explain, like love, or spirit, emotions, intuition. I don’t think that a robot is going to have an intuition.

There is a great bit in that Lydia Netzer book where the main character talks about the three things robots can’t do. It goes something like: show preference without reason, which would be called love; doubt rational decisions, which we would call regret; and trust data from a previously unreliable source, which would be being able to forgive. I’m pretty sure that a world run by an AI without those three capacities would not be that great for humans.

Jonathan Cook: It isn’t just outside critics who are worried about the tech industry losing touch with human need. People inside digital businesses are also expressing concerns.

Among these inside critics is Mark Williams, a leader at People First, a digital company that offers software platforms aimed at aligning the interests of employers and workers.

Mark Williams: My name is Mark Williams. I am currently head of the product team at People First. I come from a creative background. So, I started in fine art, actually. I did a degree in fine art, and then it was more kind of interactive installations, and using computers for art, which really got me into the technical side, and then I became a developer and came through a kind of development route and then started to go into research, and that’s really where the concept of, you know, how to fix work came up, really.

It’s part technical and part kind of sociological. So, you can’t just focus on the software because there’s people involved as well and so we were looking at both technological trends and how that might help but also the wider sociological trends and how they intersect.

Jonathan Cook: Fixing work is an admirable goal, but it’s easier said than done. So, Mark is struggling to find the right balance, working in a software company while striving to look for solutions beyond what software can offer.

Mark Williams: I’ve got a kind of healthy skepticism with Big Data. Certainly, not all measurement is good measurement. How you wrap it up, how you reward, if you reward based on measurement then you get, people can game it. We talk about pragmatic people analytics, because I think it’s very easy to set up metrics and data. There’s a book called The Tyranny of Metrics, which is brilliant on this, because yes, some metrics are good but also there’s a lot of them which can be gamed, which don’t actually help the person who you’re kind of trying to help in the first place.

The more we kind of quantify ourselves as enterprises, not as people, then the closer we get into being replaced by a robot, because the amount of humanness that we bring to a situation gets less and less.

Well, I’m definitely not a techno-utopian. I don’t believe technology will fix problems, and there are power structures outside, and that’s why we’ve done the whole thing with the alliance. There are power structures that live outside of what technology can do, that you need to resolve, that you can’t just throw software at and expect it to solve.

So yeah, I’ve got a healthy skepticism for what is possible with software, and the dangers of metrics. You’ve got to be careful and it’s very easy to throw tech at it and think that’s going to solve it, and actually, you’re dehumanizing the very people you’re trying to help.

Jonathan Cook: You can’t just throw an app at a problem, Mark tells us, but for far too long, that’s been the approach of business, both inside and outside Silicon Valley. Who in the corporate world doesn’t have a story about management coming in with a new digital system that’s supposed to make work more efficient, but ends up doing nothing but causing alienation and chaos?

A catchphrase from a few years ago was “There’s an app for that.” You don’t hear people saying that much anymore. Jaimie Stettin of the House of Beautiful Business reflects on her own experience with the vast wasteland behind the dazzling colors of the App Store.

Jaimie Stettin: I think sometimes, and I don’t mean this to sound like too much of a criticism, but there are too many apps to get a project done. I’m going to name a name here, but when people propose setting up a Slack channel for a project, my initial reaction is always please, no, let’s not do that, because I think we’re over-apping our lives to a certain degree, and I think about simplifying processes.

I think that while apps can help organize, I think sometimes there’s just one too many technological solutions that complicates things rather than streamlines them. Sometimes it’s just too much. I like it. I think it’s a good app. I think it’s just one more app, just one more thing, and I still don’t know 100 percent what the added value is if you’re all using your G suite apps anyway. I get it. You can have lots of different rooms for different things and different channels, and you can send files easily, but it just seems a little unnecessary.

Jonathan Cook: Slack channels inevitably go slack. Google struggles to maintain user engagement as concerns about the lack of privacy in its cloud apps grow. And so, a new trend in business is to pull back, to use digital technology as part of a larger set of business tools, many of which are purposefully kept offline.

I spoke about some of these issues with Helene Petersen, a poet and artist in residence at the Danish Parliament, who has worked with the House of Beautiful Business to radically rehumanize conversations about the struggles of working life.

Before you listen to Helene, I have to admit something to you: with the following recordings of Helene, I attempted to engage in just the sort of over-apping that Jaimie and Helene herself warned me about. I tried to edit out the background noise with an algorithmic filter, but it actually made her sound less natural. It was a mistake.

You’ll hear material in the interviews in this podcast that isn’t perfect. There’s hiss in the background, or a murmur, or some other imperfection. This is what the non-edited world really sounds like. It’s the human experience, and as much as it might irritate the ear of an audio connoisseur, I think that when we eliminate it, we miss something essential.

We try to do too much of this in our digital lives, editing out all the noise in the hopes of achieving a perfect sound. In this case, it just didn’t work: the noise was reduced, but the result sounded deeply unnatural, much more annoying than the original. So this time, let’s embrace the background noise from my time with Helene, sitting by the shore of the river in Lisbon, Portugal, with the sounds of the city and the water around us. It’s not as clean as what we might have recorded in a studio, but I think it’s more her style.

Helene Petersen: So basically, this conference has been about technology and humanity in technology. Ethics, the ethics of AI, is also bringing us to the question of the ethics of humanity, like, if we are not able to treat each other with compassion, respect, basically protecting each other’s lives, not agreeing, not being the same.

If we want to embrace the future not with fear but with trust, we need to start embracing being human and we need a new space for doing that, because right now language is not helping us. On the contrary, we have so many misunderstandings, we’re communicating so much due to technology, making the space for misunderstanding even greater than it has ever been. It’s like an explosion of fragility if we are not careful. And that’s why we need the space of awareness of being human and recognizing each other’s common ground, like what is exactly the same for me as for you.

Jonathan Cook: Recognizing each other’s common ground, embracing being human – as I hear Helene say these words, I recognize that much of the debate over artificial intelligence is spoken in intimate, passionate terms, and realize that when we argue about the role of artificial intelligence and other digital technologies in business, we’re really talking about something deeper than just a technological issue.

This is an issue about a relationship between people that’s become abusive and hurtful. When we express our reluctance about the growing role of artificial intelligence in business, we do so with the tone of a jilted lover. The tale of digital automation in business is a tale of unrequited love.

Businesses want our loyalty. They seek our passionate attachment, but when we reach out, they ask us to talk to their chatbots. We speak of our fear of being replaced by robots, and in response, business tells us that there will be other jobs for us. What we’re really feeling so uneasy about, though, isn’t that we won’t have jobs to provide for our economic security, but that business leaders seem to believe that humans can be replaced by algorithms. As customers, as workers, we begin to wonder whether that’s all we ever have been to the brands we’ve loved: Has our relationship been anything more than a formulaic calculation?

Universal basic income is no replacement for being loved.

Helene Petersen: I do believe that our drive to search for solutions and technology is also applied to finding answers to existential questions. And my point is: search for these answers in your relation to yourself and to the life that you live while you have it, because you don’t have it forever.

Developing us further within technology, we’re still struggling with the consequences of industrialism. We are basically drying out the earth’s resources due to industrialism. We have an ocean of plastic floating out there. That is the consequence of one of our inventions, and now we want to solve that invention by creating another invention. But I think that we will continue to run after ourselves and basically miss the whole point of life, meaning just living life, if we are not conscious of ourselves, and that’s where I really hope that this color spectrum will prove itself to be the foundation for a new way of being human by reminding us of something that we have always known.

Jonathan Cook: Helene makes a disturbing observation: We still haven’t figured out how to run the Industrial Revolution without endangering the planet. So, what makes us think we can manage the Digital Revolution without unforeseen consequences?

Helene urges us to refocus on understanding the human condition on human terms. Semiotician Martina Olbertova, however, warns that digital technology giants seem to view our humanity as an error in their operating systems.

Martina Olbertova: They’re looking at us and saying we are the glitch in the system because we no longer basically react and work the way that we’re supposed to do, and there’s something wrong with people, whereas the whole system the corporations created is fundamentally wrong.

We have this gap in between organizations and humanity right now, because people aren’t machines. You cannot replicate behaviors that you just want consumers to have and everybody will be happy. This is not how we humans work. This is not what we were built on. This is not what we should be doing.

So, I think that it’s up to organizations to actually change themselves to mirror what matters to people, and what we value, rather than for them to recreate us and how we operate and this fundamental clash in between who has the power is exactly what is at the epicenter of this discussion about business humanity. So, we need to make businesses more humane and humanity driven and informed by meaning and sense and culture and other human principles.

Jonathan Cook: It’s up to us humans to craft business organizations that are driven by human principles. Can we do this, though, when the tools that we’re using to communicate and organize are primarily digital?

Designer Scott Dawson echoes Martina’s concerns, observing that technological applications, in business and in the world in general, are replacing social skills, not augmenting them. It’s a sober assessment about the future of a world dominated by artificial intelligence.

Scott Dawson: I started off in a very technical role, but I was promoted to the point where I was managing teams most of the time, and I was really far away from that individual contributor role, and I spent all my day in meetings, and just dealing with people on different levels, professionally, personally, making sure they had everything that they needed, making sure they were happy, making sure they had the tools they needed to succeed, and it was paradoxical, because you spend all this time in school being educated technically, but nobody really sits you down in the technology world and says, here’s how you deal with people.

I’m a firm believer that to be successful in business, whether you’re on site or remote, you have to know how to deal with people, how to meet them where they are, how to build relationships, how to establish trust, and that’s what makes organizations work, is when you have good relationships between people. Without that, when you have distrust, it’s not an efficient team.

I am really concerned about the proliferation of apps that don’t seem to serve a purpose to enrich our lives, to enrich our relationships, to help us offload or delegate things that take up our time so that we can do other things that are more worthy pursuits. I’m talking about kids walking around on their phones all the time, just kind of lost in their own world, and the argument that they’re being social or they’re being enriched, I don’t think that falls squarely in my head as a valid argument.

I think a lot of the apps that are out there are designed to be addictive. They’re designed to line pockets with advertising money. They’re not designed to promote a healthier lifestyle.

Jonathan Cook: Are these criticisms unfair? Too harsh? They need to be taken in the context of a generation of hype from technology companies that have promised economic opportunity, democratization, less work, greater health, and an all-around better society. On all these terms, Silicon Valley has failed to deliver, and in many cases has actually made our problems worse. They promised us a digital utopia, and what we got instead was another Gilded Age, a time in which a very small number of people have gotten very wealthy at the expense of everyone else.

So, is the techlash too harsh? Not by half, and keep in mind, Scott’s someone who works with technology for a living, as are many of the other people you’re hearing in this episode. They’re not standing on the outside, throwing stones. They’re in the middle of the storm.

Scott Dawson: I’m kind of in a position with being in my 40s where I knew a world where we didn’t have smartphones. I knew a world where our social contacts were in person, and you learned to talk to somebody else, and you learned to empathize with somebody else. I think we need to get back there, but I don’t think we can get back there. Too much has happened, and we’re on this track now where technology, for better or worse, is part of the fabric of how we live.

I think we can pull the needle back so that we have more of a healthy balance, where we’re living a life that’s more engaged with people in the real world, where we’re civic-minded, where we’re not just out for ourselves, where we’re contributing to the betterment of where we live right now. And I think the danger is that the generation that’s growing up right now doesn’t have the context that that was a reality, that it is a reality in certain places, and that it really should be. There’s so much bound up in these devices that we hold in our hands that’s changed the way we live.

Jonathan Cook: Scott talks about the problem of digital ways of thinking and living influencing our behavior, reducing the human connection in our lives.

Julia von Winterfeldt, the founder of SoulWorx, explained to me a mirror image of this concern. She’s worried that the problems that already existed in corporate culture have been magnified by digital technology. Specifically, she’s worried about the overlap of problems with gender and problems with technology, both arenas that adopt a masculine bias.

Julia von Winterfeldt: We’re in danger of putting into this, putting into the technology, programming this technology more of the masculine qualities, more efficient, more top of game, having the best and always coming up with the best solution, ad-hoc, quickly. I only wish that we not only raise the number of technologists who are female and really ask their support and give them the opportunity to create together. And I also ask that anyone or everyone who is creating new technologies is looking at it from a place of, ‘Why am I doing this and what benefit am I really trying to bring into the world by doing this?’

Jonathan Cook: Julia has concerns about the masculine characteristics of business culture that emphasize competition over a sustainable process. Competition drives us to quick solutions, but the best solution in the moment is often harmful in the long run. Digitally optimized businesses win this kind of Pyrrhic victory far too often.

When your metrics are short sighted, they miss the larger landscape that a business needs to navigate through. The trouble with these concrete measurements is that they define success in such a tightly circular way that the system becomes blind to risks that lie outside of the circle. In the same way, the masculine bias implicit within the digital dominance of business culture isn’t recognized by many people working in tech, because they’ve gotten used to seeing masculinity as the default, rather than just one of many options.

Julia’s larger point is that the limited scope of business technology has led to a limited vision of what business can be. Too many businesses are creating technological solutions to make a profit without first considering whether there are really any human problems that call for the solutions they’ve engineered.

Though he’s working in an inherently human field, Reinhard Lanner, Chief Digital Officer at the National Tourism Office of Austria, shares Julia’s concern.

Reinhard Lanner: I think that the tourism industry is in a very lucky situation because what we were talking about in Lisbon with humanity and business, about co-creation between the consumer and the producer, that’s what tourism always has been. The tourism product has always been about co-creation between the producer, the hotelier, and the visitor.

So, hospitality, empathy, creativity, some skills which we discussed in Lisbon, are fundamental for the tourism business, and like other sectors, the tourism business has developed in an industrial way as well over the last 50, 60 years. So, we have the big, big companies, big travel agencies, and so it’s kind of like producing a car, with management and everything.

That’s things where we learn from other technologies, but basically, and especially in the European Alps, we have a very small structure of companies. Most of them are family business, and family businesses are not only thinking about the next three months, for the profit, but they are thinking about the longer term, for their kids, how they can continue. Technology brought us the big, big opportunity that we connect different of these small companies much easier than we could do this 20 or 30 years ago.

So, I think that we should adapt these technologies quickly, and all the skills we need for that, but they are only the background for doing our business in the front, and the front is all the skills humanity does for us. Sometimes, at these conferences like Web Summit or so, technology is the end, somehow, and we should think about them more as a means to create something else, and that it supports our real business, and that’s something that I think gets easier through technology.

Jonathan Cook: If Reinhard is right, and digital technology should be employed as a means to an end, rather than an end in itself, then we need to reintroduce the idea that, sometimes, the best means to the ends that we seek are offline. Jaimie Stettin confirms this idea. At times, getting offline is the best thing a business can do.

Jaimie Stettin: I had this conversation with a friend the other day, and he was saying, wouldn’t it be great if there was an app that sent messages to people so you wouldn’t have to keep track and keep up with everyone all the time, and for me, that was a huge boundary crossing, because to me, what makes getting a message or sending a message matter is that the person who sent it or meant to send it to me thought about sending it to me, wrote something, and sent it to me themselves. The idea that someone somewhere would be automating messages to the people that they quote, unquote, care about, that seems to be a sort of breach of a boundary, or a hole in humanity, if we have to start automating messages to each other. That takes out all the meaning from communication.

Jonathan Cook: As I listen to Jaimie, my mind goes back to the idea Martin Reeves brought up in the last episode of this podcast. He talked about how biology can serve as a potent metaphor in business.

Biologists have shown that a diversity of strategic approaches makes for an overall healthier competitive landscape. In the natural world, there’s a place for many different species, even within a specific ecological niche, because they’re not all trying to do things the same way.

In business today, we’ve seen a dangerous reduction of strategic diversity. With a rush to digitize everything, just a few big companies are taking up most of the resources for themselves.

Business has become ecologically imbalanced. Luckily, Tim Leberecht, author of The Business Romantic, sees the pendulum of business culture beginning to move back toward the center.

Tim Leberecht: Especially after working in Silicon Valley at Frog Design, where I worked for eight years, witnessing the work of my design colleagues and technology colleagues, I did not have a relationship with technology that was by any means problematic. To me, it was simply a part of life. I took it for granted. If anything, I admired what people were doing, because it was just foreign to me.

Then, after my years in Silicon Valley, I think slowly but surely, I developed a much more conscious relationship to both the potential but also the perils of technology, and I think at some point I became more and more worried about a very myopic belief in technology as the universal panacea, the key to solving every single problem on earth, and what Evgeny Morozov calls solutionism, which is I think a perfect term to describe much of the thinking that is typical of Silicon Valley, and I was worried.

When I wrote the book, The Business Romantic was very much a response to an almost religious belief in the quantification of everything, and this idea of wanting to optimize our world, including ourselves, by the way. In my book, my work was very much meant as a rebuttal to that, wanting to show people that, no, wait a minute, there is actually another world out there, a world of arts, a world of the humanities, a world of the interior soul, of that which is inexplicable and not quantifiable.

Then, I think most recently, actually, a couple of years ago, for a number of reasons, many of which have to do with scandals that occurred on Silicon Valley’s watch, notably the Cambridge Analytica scandal around Facebook, and many other mishaps and developments, I think the global consciousness has really shifted, and the world is now waking up to the idea that Silicon Valley is not the only paradigm in town. It’s time for us to resurrect and celebrate a renaissance of other disciplines and other worldviews that balance the purely tech-driven, solutionism-driven worldview that Silicon Valley is the epitome of.

So, I think the pendulum is swinging back, and I see this: I see that the real humanization of business is a huge trend, more and more voices on the need for bringing other disciplines into this conversation, the need for a new social contract, for really redefining what it means to be human in an age of technology, and not just doing that from a technological viewpoint of what’s possible, but what is actually desirable for us humans. So, I think that conversation started a couple of years ago and is now reaching not just the mainstream of business but also, I think, the mainstream discourse in our societies. I see this with great encouragement and optimism.

Jonathan Cook: In 2018, it’s become impossible for honest people to ignore the pervasive problems in big tech, but Tim sees hope. His idea isn’t for an overthrow of all things digital. As Scott Dawson pointed out, such a luddite revolution seems impossible. How could it be organized without a Facebook group?

Digital technology does wonderful things for us. If I sit back and think of all the digital tools that I’ve used to produce this podcast, it’s staggering. There’s no way I want to go back to a world where I’d have to splice magnetic tape to edit a radio show together.

Researcher and artist Tania Rodamilans envisions a future in which digital technology gives people in business the freedom to acknowledge the limits of their knowledge and expertise, and to follow their passions liberated from these obstacles.

Tania Rodamilans: I think new technologies are the perfect example of forcing people to really admit they don’t know. Even with products or devices that you are really familiar with, something is going to be different tomorrow, and you’re not going to know how to use something that doesn’t even exist today. So, the rate of novelty and change that we are experiencing today, and the even faster pace of technology change that we’re going to be experiencing in the future, I think it’s going to require people to rely on other people and to admit that they just don’t know everything.

There’s no way you can possibly know everything there is to know, no matter how much experience you have in your own field of work or expertise, including all the simple apps that you use in your everyday life and that you just can’t figure out how they work. At least I can’t. So, what technology will allow us to do is to have a lot of the tasks that might get in the way of human interaction taken care of, you know, all the never-ending to-do lists that seemed to get in the way of actually taking the time to have a conversation and real understanding of other human beings.

In the kind of work that I do, what I see happening, and what ideally I would like technology to do, is to kind of free people from worrying about the details, like: is this conversation being properly recorded, are the images being collected, are the interviews being scheduled at the right times? You know, all of that kind of smaller, I would say more admin-type task in any given project. Technology should be like having a personal assistant of sorts that takes care of a lot of the things that might get in the way of you being able to just sit down for two hours and just think about something. So technology’s job should be removing obstacles for us.

Jonathan Cook: The key to Tania’s positive vision of a technological future in business seems to be an abandonment of the competitive model. People can’t compete with computers on their own terms, but we have never derived our greatest satisfaction from mere speed, or from the ability to store huge amounts of information in our memories. As computers take care of the range of skills that they specialize in, Tania sees an opportunity for human beings to recapture the particular specialty of our own species: Flamboyant creativity.

Tania Rodamilans: Well, that reminds me of the reason why I brought up photography earlier on in our conversation. As a technology, digital photography was viewed as a threat and mistrusted by some. But to me it was a way to remove technical barriers, at least technical barriers that I personally had with the cameras.

I take a lot of pictures with my phone now and having the kind of technology that gives me the ability to do that with my phone and to do it better and better and more professionally looking every day, that’s amazing. The technology gets better every day and the lenses get better and the editing apps are great, and also a lot of fun to play with. So, that has removed a big barrier for me when it comes to photography, because in the past, I had a few cameras but I wasn’t really taking that many pictures because I was just too concerned and worried about the settings and the technical part of it. So, the settings felt convoluted to me and I was spending more time thinking about that than actually taking any pictures.

The technical part, I would say, was getting in the way of creating. So what the technology behind digital photography has done for me, and I would say not just as an artist but as an individual, I think it has touched my life and helped me in developing my own visual aesthetic and the way I see things as a visual artist. It has given me more freedom, more creative freedom for sure, and more time to focus on what I’m really creating rather than the way in which the technology actually works.

I can see the pictures right away and the result of what I’m doing right away. I can test and experiment more. I can take the same picture twenty different ways from three different angles and then right away just know what’s working and what I like about it. You know some people might not consider themselves a photographer and they’ll never go into that world professionally because they might think it’s scary. It’s too complex.

Now, you’ve created a tool that everybody with a phone in their hands can use and everybody can be a photographer. I think that’s great, because you are in a way democratizing an area of knowledge that used to be only for certain people. That’s technology at its best, opening up barriers and helping people be able to do more things. From my point of view particularly when technology helps you do more things in the creative world, I think that’s fantastic.

Jonathan Cook: For far too long, people have been treated like machines at work, and have suffered in roles that didn’t fit their dreams. Merely making money isn’t enough, because most of our lives are spent in labor, not on vacation.

Ironically, the integration of machines at work could finally allow human beings to stop working like machines.

At the House of Beautiful Business last year, Karel Golta shared a very personal story about the struggle to find human purpose at work. The example of his father’s executive estrangement has shaped Karel’s career.

Karel Golta: When I was at a very early age or young age, probably due to having been brought up in Switzerland, you think about how to plan ahead a lot. All the retirements, there are four different systems of retirements in place to save money for your retirement in Switzerland, four that you participate in, so it’s like quadruple redundancy, and that makes you always have to be prepared.

Are you going to have a job? You get a job or whatever you plan ahead so that you are super safe, and that there is no accident, basically, in your life, and so for me it was important that the job, and I remember thinking that when I was about eighteen, nineteen years old, I didn’t want to get into a mid-life crisis like so many people did.

I remember my father saying that he was a civil engineer, doing bridges and tunnels in Switzerland. It’s super complicated stuff, and he loved to do the calculation for it, you know, but he became the CEO of one of these big companies, and he hated it, because he wasn’t able to calculate and design bridges and tunnels anymore, but he had to do the number crunching. And that was the point where I said, hey, no, let’s not.

It’s also more important that you stay at the place where you want to be, where you still can live your passion. I don’t want, at a given point in time, to have to tell myself, when the plane is about to hit the mountain and I’m crashing, darn, I should have done it differently.

Jonathan Cook: Karel, as the CEO of Indeed Innovation, seeks to apply these lessons to his firm’s work with technology. Their designs, he says, are always “human first”.

Karel Golta: I have a vision that we need to embrace technology, but not for the sake of technology, only for the benefit to humans, and therefore I would like to switch this artificial intelligence discussion, which is a technical term, and transform it into augmented intelligence. This is, I think, a really important aspect, because if we can, as we did in the times before, really utilize this super powerful technology, it makes us stronger, not dependent, stronger, more creative, more facing the individual strength of each person. I mean, that is a great future, I think.

It’s totally okay if certain jobs will be eliminated, or certain professions, whatever you want to call that. It doesn’t mean that there will be less work. Remember Charlie Chaplin? Charlie Chaplin in Modern Times, right? I mean, that’s a stupid kind of work, and we have enough stupid work like that. Even in accounting, of course, you have work that doesn’t probably help a lot, because it’s not something where you utilize the true strength of a human being.

It’s different for a cab driver, for example, because of course you will eventually have self-driving autonomous cars in the city. Well, imagine you are in Barcelona, and they take you from the airport to your hotel. It’s totally fine because you know exactly you want to go from here to the airport, but how many times are you talking to a cab driver and want a list from him, maybe what is actually the most authentic restaurant in Lisbon, right? Would you get that from Google? No, because Google, or the self-driving car will tell you most likely which one is highest ranked because of whatever, but does it need to be authentic at that point? Questionable, honestly. So they will co-exist.

What I don’t want, though, and I have to say as well, because I think those perspectives are giving guides to anticipate the spectrum of what I am talking about, is there are many people saying, “Oh great. There will be no work for people. We can all go play tennis. It’ll be wonderful.” We heard it Sunday several times. Time for tennis, right? This is stupid, because work defines us and we define work.

I think even dreadful work sometimes is super important, because it is the delta to the joyful moment of relaxation. We need this duality, OK. We need distractions from relaxation. You can’t have sex all the time, even if it would be joyful. You would lose the sense of that, right? So, work is nothing bad. Even stupid work sometimes, okay, that we all do, is good. Cleaning the house sometimes, at some point we all have to do that, even ironing. If you do that all the time, and that’s only your life, that is stupid. But to sometimes do hard work or difficult things, that is important, because otherwise, I always see, I am sure you know Disney’s movie Wall-E, right? I mean, those wobbly people on these floating devices just consuming? That’s a bad future.

Jonathan Cook: Karel seems to be arguing that, while we should embrace technology, we shouldn’t lose ourselves in its embrace. Instead, he challenges us to work counter to the doppelgangers we encounter in the digital field, and work to discover the human experience embedded in what we do.

The implication, I think, is that the most important challenge of technology isn’t who can race to the top first, who can be fastest, who can be smartest. The challenge is to find our center again.

Perhaps what makes us so afraid of technology is that it forces us to face up to the traps we’ve made for ourselves in our own lives. It confronts us with the image of our lack of fulfillment in our work – a problem that predated the digital age. We want work that is meaningful to us, that makes us feel vital, and we also want to keep the pleasure of plain, ordinary work, not to have it taken away from us by the bots of convenience.

As we face the autopilot in Wall-E, we need to recognize that it isn’t just an external enemy. The ideology that enables artificial intelligence predates the invention of digital computers. It’s been a sore spot in business culture for generations, a shadow in our minds whenever we come to the office: that horrible, data-driven vision that can only perceive business technologies as tools of manipulation, because it believes, deep down, that human beings are nothing more than superficial, behaviorist routines of habit.

We want more than The Power of Habit. We want to feel vital in our business, staying true to our dignity and purpose even as we confront the drudgery of everyday work.

Artificial intelligence technology is a shadow of our selves, a screen upon which we project our fears about the limits of who we are. Technology is a puppet – and we need to remember that its voice, no matter how frightening, is really our own. If its voice is that of a monster, a demon without a soul, then where could that voice have come from but within our own minds?

Whenever we’re talking about technology, we’re really talking about ourselves – for good or for ill. Technology isn’t just a bunch of objects we invent. Humans imagined these objects. They are ideas made manifest. As Martina Olbertova teaches us, technology is ideology.

Martina Olbertova: Every technology has its own form of ideology embedded in the way it looks, and the way it works, and the way it functions. Basically every technology was created and devised by the human mind and the set of prevailing beliefs about what humans are like and what the reality is like. So, you cannot have a technology without the embedded bias in it.

Technology is only an enabler of a transformation or of human evolution. It’s not what should drive it. The human mind should drive it, but it should be enabled by technology but I feel that the sort of crisis that we’re getting into now is that people started worshipping technology as a sort of, it’s almost like a self-fulfilling prophecy, this idea that we’re creating something that’s infinitely more powerful and intelligent than us and it will lead us into the future.

Well, look, human minds should lead us into the future, because we should ultimately be the ones responsible and in control of technology. So, I think the sense is much more important than technology. Technology is just the enabler. It’s how information is structured, but it’s the sense that comes out of that technology, through applying it to social problems or to wicked problems or business conundrums or something, it’s the sense that comes out of it that is the value, not the technology. Technology is the requisite of something greater, and somehow we ended up in this culture where we embrace how we do things instead of why we do them or what should be the outcome of applying it. So, that’s why we are obsessed with technology. That’s why we are obsessed with data.

It’s only a path that is supposed to lead us to an outcome, and we’re just basically standing with our backs to the outcome looking at the data or Big Data or the algorithms or technology and just worshipping the greatness of something instead of using it to our own advantage.

Jonathan Cook: Martina challenges us to shake off the idolatries of technological ideology, and reclaim control for humanity. What cult of technology is stronger right now than the fetish of blockchain, a magical promised land that seems to offer whatever its believers most want to see, though it doesn’t quite live up to its promises in the present? Reinhard Lanner is confronting this particular idolatry in his work in tourism.

Reinhard Lanner: Blockchain is a kind of new technology. I don’t have any experience personally, but I am observing it as I observe all the developments which are going on. I think at the moment, it’s a kind of marketing hype on the one hand. On the other hand, I think, yes, we are now working with an Internet that was created in the 70s, with the TCP/IP protocol and so on, and we are thinking about storing things in central hubs and servers, and so on and so on.

As the amount of data keeps increasing, maybe the method of having a few servers in a center and delivering it from here and there is not the best option for the future, and if we think about data-rich files, like music and videos, which will increase in the future as well, maybe the technology has to change. If we think about when we watch a YouTube video, that video does not come from one server. The video or the film comes from different servers and is put together. So, I think that storing things in a network at the end, and not in the center, and connecting things together, the basic idea of blockchain, will be relevant in the future.

If it is the blockchain technology we know about today, I do not know, but I am very pragmatic with this thing. I don’t really know how a car functions, or how an airplane functions. I am using it when it’s there, and when it serves me. Right now, I do not have an example in the tourism industry where I would say that it’s only possible with blockchain technology.

Jonathan Cook: Who is to say blockchain won’t live up to the hype, eventually? Perhaps it will. The point is that the days of faith in salvation by technology are gone. Chasing after new technological will-o’-the-wisps in the hopes that they’ll deliver us from the conflicts that are inherent in doing business will only lead us deeper into the mire. Now is the time for skeptical inquiry into the claims of the Temple of Silicon Valley.

Bhavik Joshi, Strategic Director at LPK, is helping his clients to see beyond the hype, to find the tools they actually need.

Bhavik Joshi: I think that one of the things we struggle with is that all of these social media, social listening, social discovery, anything that has to do with the technological aspect of research with trying to understand people through technology is such a shiny object right now, it’s kind of like the Tesla in your driveway. Every CEO wants to say that they have one, right?

So, whether it be neurotesting, or social listening, or hey could you just parse through the Instagram feed of influencers and see what it is that they are talking about and what it is that their followers are getting most influenced by, all of that is such a shiny object that all of us fall prey to using them without doing any deeper level of discovery. It’s very easy to say, oh, look at this. This person is talking about such and such, and she has clicked the picture in such and such manner, so there is the content of it, but also the expression, and that content and that expression when summed up together could mean this. But we also know that a lot of the driving factor behind someone posting something on social media might have to do with a need to get more likes or to have something to do with the status aspect of how it appears.

So, there might not be a deeper driving force, or at least not the one that you are jumping to a conclusion to, so I think we use technology because it has scale, and it has reach, and it’s faster, I would say, but at the same time, I’m very conscious of the fact that the content you get from this kind of media does need to be investigated deeper, further down, and when you do, you might get to something more universal, archetypal maybe, but if not archetypal, then something that binds us together as humanity and has a solid foundational root instead of just being at the superfluous level of expression.

Jonathan Cook: We began this century, this new millennium, with the feeling that we were somehow entering into a kind of technological End Time, with an Information Superhighway, and a Singularity that would end all our suffering and deliver our heart’s desire. Now, as the buzz dies down, we’re realizing that things have stayed the same more than they’ve changed.

We can stick a wireless chip into an object and call it “smart”, but there’s more to business intelligence than that. When it comes down to it, the greatest opportunity for business still remains connecting human beings with each other.

Bhavik Joshi: I am optimistic about the fact that people who are curious about the human condition and want to understand it in context of certain media, such as technologically-assisted media, but also want to understand it independent of the context of any medium, will play a much better role, will play a much more influential role in everything that’s going to happen in the realm of technology in the future. It’s kind of encouraging and feels rewarding to hear that even the machine learning aspect of trying to understand data or pictures or words, by just understanding more and more of it, it’s also being springboarded through the coaching of humanists, people who understand social sciences, of people who understand imagery, and through experts, social psychologists and cognitive psychologists.

I think all those, if I may use a very general, broad brushstroke, of these humanities-based inquiry fields, if all of those kind of planted themselves in these hubs like Silicon Valley, and I’m hearing more and more that’s happening, but if that was more prolific, I think I see a future where the growth of technology, the application of technology to benefit mankind, to benefit humanity, does not happen independent of the understanding of humanity. Right now, it just feels like the invention is happening just because it can. It’s like, hey, we can do it, so why not, you know? Hey, we know how to, why not?

There is not much more of an intentional effort to understand not only the impact of that, but also the cause of wanting to make something happen. Why should we do this? Who would benefit from this, and how would they benefit, and how would their life be different? Maybe taking the classic, traditional research approach, but not hand it to a mirrored focus group kind of thing, but actual investigative human research, and infusing that into these technological moonshot aspirations.

Jonathan Cook: With the initial thrill of technological development subsiding, business is now learning how to live with digital tools in a long term, sustainable, healthy relationship. As Bhavik points out, the task ahead of us is not to upload everything and make it digital, but to find ways to integrate new technology into vibrant human culture.

David Altschul, a specialist in the craft of storytelling in business, reminds us that neither humanity nor technology will be a victor in this conflict. The conflict is too deep not to endure.

David Altschul: There are three essential pieces to the story framework: conflict, meaning, and purpose. The conflict is the source of energy for the story. It’s based on the most fundamental principle of storytelling, which is that without conflict there is no story. The story starts when the protagonist’s world is thrown out of balance. For some reason, an inner conflict arises that has to be resolved. So, the conflict gives energy and authenticity to the story.

Conflict is, when I say conflict, I’m talking about some universal human conflict. I’m not talking about a conflict between a good thing and a bad thing. That’s not helpful, because the audience knows how that story is supposed to come out. The conflicts that drive stories that go on indefinitely, that don’t ever need to end, are conflicts between good things that happen to be opposed, like virtue versus pleasure, which turns out to be the uber conflict of the food category, or safety versus freedom, or spirited versus sensible.

You can see that in any of those conflict pairs, both energies are positive. You say, ‘Which would you prefer to be, spirited or sensible?’ Well, the truth is you have to be both. Which would you give up? Well, you can’t give up either. If you were only spirited, you would fly off the deep end, and if you were only sensible, you would be boring. I think that is also a quality of the kind of conflicts that drive really compelling stories: not only are both energies positive and opposed to each other in some way, but you can see how either energy, taken to an extreme, would be a bad thing.

Freedom versus safety: Which one are you going to give up? If you take freedom to an extreme, you’d be dead fairly quickly. If you take safety to an extreme, your life would be quite boring. And it’s not about striking a balance, because if you’re half free and half safe, you’re really neither. It’s about going after both of them a hundred percent, and learning to live in the tension that arises. In fact, that’s the reason why we continually tell stories: to think about, and remind each other about, ways to deal with these conflicts that can never be resolved.

Jonathan Cook: The conflict between technology and humanity, between quantification and qualitative experience, between intellect and emotion, can never be resolved, but by keeping the conflict alive, we can keep our humanity alive. We’ll use the technologies of Google, but continue to criticize Google when it oversteps the boundaries of reasonable behavior. We’ll limit our use of any single tech giant, and regulate them reasonably, bringing them back into the realm of responsible human society.

When we find the human tools to keep technology in its place, and to work with it meaningfully, then we’ll be able to let the techlash subside and re-engage with digital devices as the enchanted objects they can be. It will take the work of qualitative, human researchers to establish these new connections.

Anthropologist Tom Maschio is on the job.

Tom Maschio: Google, I work for Google a lot. It’s kind of the ground zero for the kind of business you are talking about, I guess. Google knows everything. Google is big data, and Google knows what you know, I suppose. They know where you go. They know about all sorts of trends that they measure quantitatively: people’s searches, you know, where they are during the day. You know, they know where you are geographically, and they have all this data, like Facebook and all the other great media companies. So, they still need to know about the human kind of meaning of things. I think I’ve done six projects now for Google, on an initiative called The Humanizing Digital Project, which is interesting. I mean, Google came to me originally wanting to know about the human dimensions of smartphone use.

They wanted to know the meaningful grammar of search: what people were searching for, and why. I mean, they had all the how and the what of search. Google knows everything about search in that sense, but they didn’t know why, what it was doing for people. They didn’t understand the meaningful dimensions of search on smartphones, or even what smartphone phenomenology really was. They are a great technical company, data-driven, design and engineering centric, but they felt that, for all their studies and quantitative measurements of customer usage of their search engine on mobile devices, the human meanings of what people were up to, in all their depth, were somehow eluding them.

Jonathan Cook: An ethnography of the smartphone is the job for a cultural outsider. Of course, with the pace at which digital culture is developing, we’re all outsiders – even the people who are developing the technology that’s the focus of our new cultural practices.

Tom Maschio: Suddenly, everyone had a smartphone. But, you know, what would that mean? And how do you, what is the business problem?

What they were really asking me to do was give them some sense of how people were humanizing their digital technologies, but people have humanized technologies and all sorts of objects from the get-go. They attribute meaningful dimensions to all the phenomena they encounter, and humanistic anthropology, which is what I do, is the study of these meanings.

They give us permission to suspend disbelief, to daydream, and in the digital world, small objects such as smartphones and tablets share in the symbolism of the small. They spur the imagination, like the kind of toys that lend themselves to fantasy and play. The small and the miniature is the realm of childhood, really.

I know people are on Twitter, and they’re doing all sorts of nasty things to each other there, but, you know, they’re also on Instagram and other platforms where they’re playing, where there is a kind of craft playfulness, and they’re building worlds. They are doing world building through the portal of the small, the smartphone.

So what are people doing? What’s the ritual that they’re engaging in? You know, when they engage in smartphone play, they’re building a dwelling place, a kind of architecture of happiness for themselves, and they’re mapping and exploring that space.

They’re placing an intellectual structure upon reality through play, and in this way, I think, bringing the cold objectivity of the world into line with their own inner dispositions, inclinations, and desires. So, they’re humanizing the digital space that they’re exploring through smartphones.

Jonathan Cook: Tom Maschio provides a special caution to tech executives who think that they can simply convene a summit of digital industry leaders and crack the secret for how to humanize their businesses.

Consumers are already engaged in projects of humanization, and the first thing the teams of people at companies like Apple and Amazon need to do is get back out into the world beyond the borders of their corporate campuses, to study the cultural revolution that’s working with their tools.

One tech company seems to be emblematic of this new human wave of technological adaptation. I’m talking about Trint, a company that’s all about listening.

Trint was founded by Jeff Kofman, a former television journalist. He founded the company to solve a functional problem that journalists, and other people who do lots of interviews for a living, were struggling with: wading through their interview recordings.

Jeff Kofman: The thing that stopped it from going out faster was the process of getting the story assembled, and the slowest part of that process was transcription. If you have three interviews that are twenty minutes each, there’s an hour of content you’ve got to wade through to get the right soundbite, to get the right quote that you can put in the story.

Jonathan Cook: Trint is an automated transcription service, using machine learning technology, and as such, it’s a valuable service. There’s no question that the functional need for fast, accurate transcription is real. Nonetheless, solving this functional problem isn’t enough of a story to build a brand around. For that, we need to look deeper, to find what really motivates interviewers in their work.

Jeff Kofman: You talk about the quality of work. Being a stenographer is, first of all, not fun. Anybody who has been through that workflow knows that it’s just a big pain. It’s drudgery, and it doesn’t make you a better journalist. I think it’s important to know your interview content, to be clear, but that doesn’t mean typing out every word. So, you know, what we’re doing actually is increasing the quality of work for journalists, because you can focus on content creation, not stenography, and that’s what Trint’s really about. We’re not taking people’s jobs away. We’re liberating people to do their jobs.

I think that, applied well, artificial intelligence, in our case, can liberate. It can allow us to use our brains for the things that we can most effectively use them for, and that excite us, rather than simple drudgery. So, you know, I think that AI, which in our case is automated speech to text, when leveraged to an extra level through Trint software, makes work more engaging. It makes work more interesting, and it makes work more productive. Those are all good things; those are not evils of technology. Those are liberating things.

Jonathan Cook: The more Jeff Kofman talks about what Trint can do, the more clear it becomes that just getting a quick and accurate transcript isn’t at the heart of what the company’s software does. The underlying human need is to give journalists the space to engage more with their interviews, rather than to pull back from them. By spending less time worrying about typing, researchers who use Trint can spend more time thinking about the material they’ve gathered.

Trint is an instructive model for the future of digital technology because it didn’t begin with a technological solution in search of a problem. Trint began with a functional need, which it then developed to address the core human need to connect. Only then was a technological solution applied. The result is that, when a person uses Trint, it’s the interview material that’s elevated, rather than the technology that assists in its processing.

Maybe, as Jeff Kofman suggests, technology isn’t really a competitor to humanity in business. Perhaps what digital technology is doing is challenging us to step up and do the quality of work that we’ve always dreamed of.

If technology is, as Martina Olbertova says, an ideology, perhaps we can leverage the human ability to live within ambiguity to dynamically thrive within the conflict, as David Altschul suggests all successful characters in good stories do.

Mark Williams of People First foresees a future in which technology fades, the more powerful it becomes.

Mark Williams: The bits that automation won’t touch for a long time are the ones that are about face-to-face interaction, creativity. You know, that’s where we’ll, that’s where we should thrive. That said, the enabling technology that takes all the other stuff out of the way allows me to just be free and engage in what I, as a human, do best, rather than thinking I’d like to do a spreadsheet. You know, the automation will take those bits away. I think the human bit comes in with what we’re looking for creativity, in the widest sense, to solve. We’re going to have robots to solve efficiency problems. The really big problems are human problems.

Jonathan Cook: The really big problems are human problems, and technology problems are human problems.

Technology doesn’t create problems for humans. It amplifies the problems humans have always had, just as it amplifies our positive abilities.

As the power of technology increases, it’s time that we increase the positive power of our humanity to match it. To do that, we have to build visions of purpose for what we want to do with technology. We can’t outsource this work, because worthwhile vision only comes through human experience.

People in business need new skills to meet the digital age – and I’m not talking about learning how to code. I’m talking about learning how to break the code. We need human methods to enable a new kind of vision quest, to discover what our place can be in the digital world.

I said at the beginning of this episode that we’d be talking about technology. What we’ve learned, though, is that whenever we’re talking about technology, we’re really just talking about physical representations of the weird, subjective, emotional issues that we have as human beings. No matter how technological business gets, it’s still thoroughly human.

The future is human. That’s the topic of next week’s episode of This Human Business. We’ll be talking about competing visions of what the future of business might be.

Of course, that’s in the future. Until next week, be in the present.
