Why Your Data Needs An 'Expiration Date' to Stay Safe (Industry Perspective)

It is crucial in today’s security climate to begin classifying data and networks in a new way, based not just on levels of sensitivity but on shelf life and the realities of our evolving computing landscape.

As public administrations in the U.S. and around the world move communications and data storage from paper-based and local network systems to the cloud, security has become a serious concern. It’s becoming easier and easier for hackers to intercept information, as tools that can capture data in transit are increasingly cheap and widely available. Cyber attacks are a constant threat for all types of organizations -- and government agencies are no exception.

With mounting pressure to protect citizens’ data, network administrators at government agencies today face increasing regulations both domestically and globally. In the post-Snowden era, we can expect these regulations to continue piling up, especially as countries around the world engage in online espionage.

The Security Balancing Act

Government security managers know that protecting their citizens’ data is a tall order, and much easier said than done. The central challenge they face is finding and implementing security systems that balance affordability and practicality with efficacy over the long-term.

While high-security sectors like defense and intelligence services tend to employ manual key distribution via smart cards for data security purposes, these options are generally too expensive, tedious and impractical for small government agencies, especially within public administration, to adopt.

Instead, most government security managers rely on public key cryptography (such as RSA), where encryption is performed using a so-called public key and decryption requires a private key. Since the public key is only useful for encryption, it can be distributed without special security precautions, as long as the private key is kept secure.
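The asymmetry described above can be sketched in a few lines. The following is a toy RSA example with deliberately tiny, insecure parameters (the primes, exponents and message are illustrative assumptions, chosen only to make the math visible): anyone holding the public key can encrypt, but only the private key recovers the message.

```python
# Toy RSA sketch: the public key (e, n) encrypts; only the
# private key (d, n) decrypts. Parameters are far too small
# to be secure -- for illustration only.

def make_toy_keys():
    p, q = 61, 53              # tiny demonstration primes
    n = p * q                  # modulus, shared by both keys
    phi = (p - 1) * (q - 1)    # Euler's totient of n
    e = 17                     # public exponent, coprime with phi
    d = pow(e, -1, phi)        # private exponent: inverse of e mod phi
    return (e, n), (d, n)

def encrypt(message, public_key):
    e, n = public_key
    return pow(message, e, n)  # ciphertext = m^e mod n

def decrypt(ciphertext, private_key):
    d, n = private_key
    return pow(ciphertext, d, n)  # message = c^d mod n

public, private = make_toy_keys()
ciphertext = encrypt(42, public)
assert decrypt(ciphertext, private) == 42
```

The point of the sketch is the distribution property the article relies on: `public` can be handed out freely, because recovering `d` from `(e, n)` requires factoring `n` -- easy for these toy numbers, hard (today) for real key sizes.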

But the traditional cryptography approach is vulnerable on three fronts. Human ingenuity knows virtually no bounds, so unforeseen mathematical breakthroughs could weaken today’s algorithms. As classical computing power increases, traditional cryptography will only be easier to break. And when quantum computing becomes available, it will be rendered all but useless for highly sensitive transmissions.

In short, while public key cryptography may be practical for public administrations now, it’s not perfectly secure and will only grow weaker with time. It is worth noting that, over the past few years, cryptographers have devised a new class of public key cryptosystems that are resilient to currently known quantum attacks. But while these are an improvement over their predecessors, they remain exposed to the first two threats mentioned above, and there’s no guarantee that future quantum attacks won’t be able to defeat them (in fact, it is very likely they will).

What a Working Quantum Computer Would Mean for Government Security

The NSA is currently building a quantum computer that it hopes will be “cryptologically useful”: one that could crack RSA and similar public key cryptosystems within a matter of minutes.

At ID Quantique, we believe the first non-classified, small-scale, universal quantum computer will be available within the next five to 10 years. This estimate is based on the state of scientific research around key elements of the quantum computer: superconducting qubits and ion traps, as well as the level of investment by public funding agencies. (And of course, government agencies beyond the NSA are also working on this topic and investing significant resources in classified projects, so they are likely to be ahead of public research.)

This is a big problem for network administrators, since it will nullify the effectiveness of the public key cryptosystems they rely on for data security.

The Problem with Long-Term Security

But if we don’t have a working quantum computer yet, our data is safe for today at least, right?

Some government security managers assume that if their systems are good enough to foil today’s hackers, then their data is safe enough for now. They know they’ll need new systems eventually, but they feel they have plenty of time to implement them. However, it’s important to realize that even if we don’t have a quantum computer today, hackers can record currently encrypted data and wait until one of the advances mentioned above comes to pass to decrypt it.

In other words, we don’t need a fully functioning quantum computer for our data to be in jeopardy. “Backward vulnerability” represents a sea change in the data security dynamic, one that we need to take seriously.

Not all Data is Created (or Secured) Equally

From a practical standpoint, government security managers know that they need to prioritize and deploy their strongest security techniques to protect the most sensitive pieces of data. Not every single message sent by or within a state requires absolute secrecy forever. Some data doesn’t require perfect long-term security, since it loses its relevance quickly. Other types of data need to be secured for five, 10 or 20 years -- or more.

For example, the location of a prominent political figure at any given time is highly sensitive during and leading up to that moment -- but is largely irrelevant after the person has moved to a new location. This type of information should still be encrypted, of course, but it presents less of a long-term security concern due to its transient nature and is thus a good candidate for public key encryption.

Data with a longer shelf life must be protected carefully from threats both present and future. For example, internal communications about foreign relations can be highly sensitive even years after the fact. Data like social security numbers, tax records, or evidence in high-profile trials often needs to be kept secret for decades or more.

This explains the growing need for data lifecycle management (DLM) solutions. However, most DLM solutions do not address the growing problem of advanced computing and the reality that soon quantum computers will be able to crack encrypted messages captured in the past.

The only currently known technology that sufficiently protects data from both current and future attacks is quantum key distribution (QKD), a method of exchanging encryption keys whose security rests on the laws of quantum physics rather than on computational hardness, making the keys safe from eavesdropping.

Moving Forward in Today’s New Data Climate

In today’s security climate, it is crucial to begin classifying data and networks in a new way, based not just on levels of sensitivity but on shelf life and the realities of our evolving computing landscape. We must take care to define both the level of secrecy and the duration associated with data, and then secure it differently based on these classifications. For information that needs to be protected for more than five years, we should mandate protection from the computational powers of the future, specifically quantum computing.
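As a rough sketch of such a classification rule, the snippet below pairs each record with a sensitivity level and a secrecy shelf life, then picks a protection class. The record names, labels and the five-year threshold are illustrative assumptions drawn from the discussion above, not an established standard.

```python
from dataclasses import dataclass

# Hypothetical classification sketch: the five-year cutoff and the
# protection-class labels are illustrative, not a formal policy.

@dataclass
class Record:
    name: str
    sensitivity: str       # e.g. "public", "confidential", "secret"
    shelf_life_years: int  # how long the data must remain secret

def protection_class(record: Record) -> str:
    if record.sensitivity == "public":
        return "no encryption required"
    if record.shelf_life_years > 5:
        # long-lived secrets need quantum-safe protection (e.g. QKD)
        return "quantum-safe"
    # short-lived secrets: conventional public key cryptography suffices
    return "conventional public key cryptography"

assert protection_class(Record("press release", "public", 0)) == "no encryption required"
assert protection_class(Record("motorcade route", "secret", 1)) == "conventional public key cryptography"
assert protection_class(Record("tax records", "secret", 30)) == "quantum-safe"
```

The design choice the article argues for is exactly this second axis: shelf life, not sensitivity alone, decides whether future quantum capability belongs in the threat model.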

Network administrators who aren’t classifying data by such standards can’t risk delaying the implementation of these new measures; if they wait much longer, it may already be too late.

Gregoire Ribordy is the co-founder and CEO of ID Quantique, and has worked in the security and communications industry for more than 15 years. He was previously a research fellow at the Group of Applied Physics of the University of Geneva, where he actively developed quantum cryptography technology and is the holder of a number of patents in the field. 
