Imagining possible futures can help us plan a secure information technology environment for the years to come.
Imagine, for example, a near future in which wearable devices can sense and transmit their users’ emotional states, and that data is routinely gathered and monitored. Whether for blackmail, “revenge porn” or other motives, cybercriminals and hostile governments in this world would find new ways to exploit data about emotion. The terms of cybersecurity would be redefined, as it became more important for people to manage and protect how their emotions and mindsets appeared to those doing the monitoring.
This is just one of several potential future cybersecurity scenarios recently dreamed up by a multidisciplinary group of experts. Here at the Center for Long-Term Cybersecurity, we asked them to envision what could be happening in the near future of 2020. These are not predictions; it is impossible to make precise forecasts about such a complex set of issues. Rather, the scenarios paint a landscape of future possibilities, exploring how emerging and unknown forces could intersect to reshape the relationship between humans and technology, and what it means to be “secure.”
And they raise pressing questions we should consider today as we lay the groundwork for a secure information technology environment in the future: How might individuals function in a world where they can no longer ignore the likelihood that everything they do online will be hacked or stolen? How could the proliferation of networked appliances, vehicles and devices transform what it means to have a “secure” society? What would be the consequences of almost unimaginably powerful algorithms that could predict individual human behavior at the most granular scale?
At the heart of our approach is scenario thinking, a proven methodology for identifying the driving forces and unexpected consequences that could shape the future. The process often generates more questions than answers, but the possibilities it surfaces can help guide us toward solutions as society and technology evolve.
Our scenario about emotion-sensing, for example, raises many questions of this kind.
Our broad interdisciplinary group of experts in computer science, political science, neuroscience and other fields came from universities, the private sector, nonprofits and governments. They helped us develop that scenario, along with four others, for the year 2020.
For example, imagine that two decades after the first dot-com bust, the advertising-driven business model for major Internet companies has fallen apart. As overvalued web companies large and small collapse, criminals and companies alike race to gain ownership of underpriced but potentially valuable data assets. It’s a “war for data” under some of the worst possible circumstances: financial stress and sometimes panic, ambiguous property rights, opaque markets and data trolls everywhere.
In this world, cybersecurity and data security become inextricably intertwined. Criminals exploit two key assets: the datasets themselves, which become the principal targets of attack, and the humans who work on them, as the industry’s collapse leaves unemployed data scientists seeking new jobs. The questions that arise here are difficult ones.
This is just the beginning. In one of our other scenarios, we imagine that hackers have become so successful that the public’s default expectation about Internet transactions flips from “we are basically safe” to “we are going to have our data stolen.” Another looks at the potential of predictive algorithms: if they improve to the point of predicting individual behavior, all sorts of new attacks become possible. Still another looks at the Internet of Things, suggesting that governments may lead the way in IoT adoption and could become both more effective and more vulnerable as a result.
The world in 2020 could look very different from today. Our scenarios are designed to serve as a starting point for conversation and debate among academic researchers, industry practitioners, and government policymakers. We invite the public to join us as well; please read the full-text scenarios and engage with them on Twitter (@cltcberkeley). We look forward to building a better cybersecurity future with you.