
What’s Next for Humanity: Automation, New Morality and a ‘Global Useless Class’

Video transcript: Yuval Noah Harari on the Future of Humanity

Humans, Mr. Harari warned, “have created such a complicated world that we’re no longer able to make sense of what is happening.”

Just as the industrial revolution of the 19th century created the urban working class, so the automation revolution of the 21st century might create the useless class, and much of the political and social history of the coming decades might revolve around the problems, and the hopes, and the fears of this new class.

With disruptive technology, the danger, in a way, is far greater because it has some wonderful potential. So there are a lot of forces that for some very good reasons are pushing us faster and faster to develop and adopt these disruptive technologies. And it’s very difficult to know in advance what the consequences will be. They might very soon reach the point when they create systems, they create algorithms that understand us better than we understand ourselves, and you won’t even understand that this is happening. The liberal democracy will become an emotional puppet show. This is the kind of threat that we are already beginning to see emerging today.

Now, in the coming years, in the coming decades, we will face individual discrimination, and it might actually be based on a good assessment of who you are. You will not be able to do anything or almost anything about this discrimination, first of all, because it’s just you. They don’t discriminate against you or me because you’re Jewish, or gay, or black, or whatever, because you are you. And the worst thing is it will be true. [laughter]

Now, time is accelerating. So the long term is not 2,000 years or 200 years. The long term is 20 years. Nobody knows the basics about how the world would look like in 20 or 30 years, such as what the job market would look like, what kind of skills people will need, what family structure would look like, what gender relations would look like. So it is the first time in history when we have no idea.

I think that politics and government in most of the world today, they are doing a far better job than ever before in running the day-to-day business of the country. But what they have almost lost completely is the ability to have a long-term plan for the future. So what you see in more and more countries is that they look to the past instead of to the future. They repackage nostalgic fantasies about the past.

To really act well, it’s not enough to have good values. You need to have a good understanding of the chains of causes and effects. Like, if you think about the commandment like, don’t steal, the big problem is that stealing has become so complicated that I’m stealing all the time and I’m not even aware of it. And even if I am aware, I don’t know how the corporation makes its money. And during that time, I will be guilty of so many other crimes which I know nothing about. The problem is on understanding the extremely complicated chains of cause and effect in the world. And again, my fear is that maybe homo sapiens is just not up to it. We have created such a complicated world that we are no longer able to make sense of what is happening.


LONDON — What will our future look like — not in a century but in a mere two decades?

Terrifying, if you’re to believe Yuval Noah Harari, the Israeli historian and author of “Sapiens” and “Homo Deus,” a pair of audacious books that offer a sweeping history of humankind and a forecast of what lies ahead: an age of algorithms and technology that could see us transformed into “super-humans” with godlike qualities.

In an event organized by The New York Times and How To Academy, Mr. Harari gave his predictions to the Times columnist Thomas L. Friedman. Humans, he warned, “have created such a complicated world that we’re no longer able to make sense of what is happening.” Here are highlights of the interview.

Just as the Industrial Revolution created the working class, automation could create a “global useless class,” Mr. Harari said, and the political and social history of the coming decades will revolve around the hopes and fears of this new class. Disruptive technologies, which have helped bring enormous progress, could be disastrous if they get out of hand.

“Every technology has a good potential and a bad potential,” he said. “Nuclear war is obviously terrible. Nobody wants it. The question is how to prevent it. With disruptive technology the danger is far greater, because it has some wonderful potential. There are a lot of forces pushing us faster and faster to develop these disruptive technologies and it’s very difficult to know in advance what the consequences will be, in terms of community, in terms of relations with people, in terms of politics.”

The combination of biotech and I.T. might reach a point where it creates systems and algorithms that understand us better than we understand ourselves.

“Once you have an external outlier that understands you better than you understand yourself, liberal democracy as we have known it for the last century or so is doomed,” Mr. Harari predicted.

“Liberal democracy trusts in the feelings of human beings, and that worked as long as nobody could understand your feelings better than yourself — or your mother,” he said. “But if there is an algorithm that understands you better than your mother and you don’t even understand that this is happening, then liberal democracy will become an emotional puppet show,” he added. “What happens if your heart is a foreign agent, a double agent serving somebody else, who knows how to press your emotional buttons, who knows how to make you angry, how to make you bold, how to make you joyful? This is the kind of threat we’re beginning to see emerging today, for example in elections and referendum.”

In the 20th century, discrimination was directed against entire groups based on various biases. It was fixable, however, because those biases were not true and victims could join together and take political action. But in the coming years and decades, Mr. Harari said, “we will face individual discrimination, and it might actually be based on a good assessment of who you are.”

If algorithms employed by a company look up your Facebook profile or DNA and trawl through your school and professional records, they could figure out pretty accurately who you are. “You will not be able to do anything about this discrimination, first of all, because it’s just you,” Mr. Harari said. “They don’t discriminate against you because you’re Jewish or gay, but because you’re you. And the worst thing is that it will be true. It sounds funny, but it’s terrible.”

It took centuries, even thousands of years, for us to reap the rewards of decisions made by our forebears, for example, growing wheat that led to the agricultural revolution. Not anymore.

“Time is accelerating,” Mr. Harari said. The long term may no longer be defined in centuries or millenniums — but in terms of 20 years. “It’s the first time in history when we have no idea what human society will be like in a couple of decades,” he said.

“We’re in an unprecedented situation in history in the sense that nobody knows the basics of how the world will look in 20 or 30 years. Not just the basics of geopolitics but what the job market will look like, what kind of skills people will need, what family structures will look like, what gender relations will look like. This means that for the first time in history we have no idea what to teach in schools.”

Leaders and political parties are still stuck in the 20th century, in the ideological battles pitting the right against the left, capitalism versus socialism. They don’t even have realistic ideas of what the job market will look like in a mere two decades, Mr. Harari said, “because they can’t see.”

“Instead of formulating meaningful visions for where humankind will be in 2050, they repackage nostalgic fantasies about the past,” he said. “And there’s a kind of competition: who can look back further. Trump wants to go back to the 1950s; Putin basically wants to go back to the Czarist Empire, and you have the Islamic State that wants to go back to seventh-century Arabia. Israel — they beat everybody. They want to go back 2,500 years to the age of the Bible, so we win. We have the longest-term vision backwards.”

“We’re now living with the collapse of the last story of inevitability,” he said. The 1990s were flush with ideas that history was over, that the great ideological battle of the 20th century had been won by liberal democracy and free-market capitalism.

This now seems extremely naïve, he said. “The moment we are in now is a moment of extreme disillusionment and bewilderment because we have no idea where things will go from here. It’s very important to be aware of the downside, of the dangerous scenarios of new technologies. The corporations, the engineers, the people in labs naturally focus on the enormous benefits that these technologies might bring us, and it falls to historians, philosophers and social scientists to think about all of the ways that things could go wrong.”

“To act well, it’s not enough to have good values,” Mr. Harari said. “You have to understand the chains of causes and effects.”

Stealing, for example, has become enormously complicated in today’s world. Back in biblical times, Mr. Harari said, if you stole, you were aware of your actions and their consequences for your victim. But theft today could entail investing — even unwittingly — in a very profitable but unethical corporation that damages the environment and employs an army of lawyers and lobbyists to protect itself from lawsuits and regulations.

“Am I guilty of stealing a river?” asked Mr. Harari, continuing his example. “Even if I’m aware, I don’t know how the corporation makes its money. It will take me months and even years to find out what my money is doing. And during that time I’ll be guilty of so many crimes, which I would know nothing about.”

The problem, he said, is understanding the “extremely complicated chains of cause and effect” in the world. “My fear is that homo sapiens is just not up to it. We have created such a complicated world that we’re no longer able to make sense of what is happening.”
