Technology’s State of Crisis Demands Crisis Management

San Francisco Chronicle, September 16, 2018, p. E7.

Make no mistake about it, technology is in a state of crisis of its own making.

Technology has betrayed our deepest sense of trust and well-being. It has allowed itself — indeed, its values are deeply woven into the underlying business model of tech companies — to be used for nefarious purposes.

It has collected and sold without our full awareness, let alone permission, our personal information to third parties for their gain, not ours. It’s monetized every aspect of our being. It’s provided a platform for fake news and hate speech. It’s allowed foreign governments to interfere with our elections. It’s served as a vehicle for cyberbullying, thereby hounding people every moment of their lives.

One of the deepest fears is that, instead of aiding us, artificial intelligence will take over and control us. In these and countless other ways, technology has sown distrust into the very fabric of society.

Every day brings news of yet another crisis caused by all-too-powerful tech companies. More and more, the crises affect not only them, but all of us as well. When Facebook’s stock took a big hit, for example, it dragged down tech stocks across the board, thereby negatively impacting the entire economy.

The concerns lie not just with the problematic intended uses of technology but with the failure to think about and anticipate the unintended uses. Technologies are routinely abused and misused in ways that their creators didn’t envision, and in far too many cases, never wanted to consider. For instance, from my more than 30 years in the field of crisis management, I’m convinced that virtually all of Facebook’s innumerable crises could have been foreseen if crisis management had been an integral part of the company’s culture and thinking from its founding.

Had teams of parents, teachers, psychologists and kids been convened before Facebook’s launch, there is every reason to believe they would have identified the possibility of its being used as a vehicle for cyberbullying. If steps had been taken to address that threat before the platform went live, we would still have something like Facebook, but, one hopes, a much more responsible one.

We need a government agency, similar to the U.S. Food and Drug Administration, to not only oversee the social impacts of technology, but to protect us from those technologies that present a clear danger to our well-being. We must establish panels composed of parents, social scientists, child development experts, ethicists, crisis management authorities — and kids — to think of as many ways as they can in which a proposed technology could be abused and misused.

Ideally, tech companies would do this on their own. Indeed, research has shown that companies that are proactive in anticipating and planning for crises are substantially more profitable than those that are merely reactive. Crisis management is not only the ethical thing to do, it’s good for business; it heads off major crises before they are too big to fix.

Crisis management needs to be built into every technology, from inception and across its lifetime. As difficult as the invention of a technology is, its management is just as difficult, if not more so. It requires a different set of skills and levels of maturity than that which is needed to invent a technology. We need different types of technologists and tech companies.

But that’s no reason to hesitate: The backlash against technology and tech companies is clearly brewing.

About the Author

Dr. Ian Mitroff is Professor Emeritus at the Marshall School of Business and the Annenberg School for Communication at the University of Southern California in Los Angeles. He is the president and founder of Mitroff Crisis Management, a private consulting firm based in Oakland, California, that specializes in the treatment of human-caused crises. He is a Senior Affiliate with the Center for Catastrophic Risk Management at the University of California, Berkeley.