Cyberthreats are universal. But the appropriate response may be quite different in academia from what works in the corporate world.
Cybersecurity concerns crop up everywhere you turn lately – around the election, email services, retailers. And academic institutions haven’t been immune to security breaches either. According to a recent report by VMware, the vast majority of universities in the United Kingdom (87 percent) have been victims of cybercrime. Between 2006 and 2013, 550 universities suffered data breaches. When higher ed breaches occur, attackers typically steal student information, intellectual property or research data. Among the criminals behind these attacks are nation-states and organized crime groups motivated by economic gain.
A common knee-jerk reaction to a cyberattack – wherever it happens – is to clamp down on access and add more security controls. For example, in 2005, after a major attack against a credit card processor affected 40 million customers, there were urgent calls in the U.S. Senate for new mandatory encryption standards. As paranoia sets in, a sense of urgency to do something about a possible next attack takes over, just as happened in the University of California system. After a 2015 hack, the university administration started monitoring user traffic without consulting faculty and students (not to mention receiving their consent), resulting in a huge backlash.
As is so often the case, too much of anything is not good. Cybersecurity is a delicate balancing act between usability and countermeasures designed to reduce or prevent threats. A one-size-fits-all, or Procrustean, approach usually leads to lower productivity and a large group of unhappy users. And it’s particularly tricky to get the balance right in an academic setting.
Much of what we in academia do hinges upon our academic freedom. American scholars count on the freedom to pursue academic projects without administrators imposing any political, religious or philosophical beliefs from above. Our free access to information technology (IT) resources is a big part of how we accomplish our scholarly work. But unlimited access may no longer be realistic as we start to grapple with the realities of an ever-hostile cyberthreat environment. Campus security leaders must walk a fine line when considering how to improve cybersecurity, particularly in the wake of an attack.
Cybersecurity in higher education is different from cybersecurity in the corporate world. Companies have a much easier time compelling employees to comply with access-control policies that protect intellectual property and trade secrets.
But the free flow of information among students, faculty members and the surrounding community is part of what allows academic communities to flourish. Unfortunately, the academic ideal of openness conflicts with some of cybersecurity’s major goals and can lead to more vulnerabilities – and the attacks that exploit them.
In this sensitive environment, I’d suggest security practitioners on campus should avoid an “all-or-nothing” approach. Their mission is to help faculty members do their jobs effectively and safely. Productivity in both research and teaching needs to be supported as much as possible and balanced with security requirements.
Take a two-factor authentication scenario. A faculty member may be required to carry an additional device (like a cellphone) to generate a one-time code to be entered alongside the usual system password. It’s safer, but it cuts into productivity because of the extra time and care it takes to generate the code on the second device.
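To make the mechanism concrete, here is a minimal sketch of how such a one-time code is typically generated, following the time-based one-time password (TOTP) scheme standardized in RFC 6238. The secret key and parameters are illustrative, not those of any particular campus system.

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, interval: int = 30, digits: int = 6, now=None) -> str:
    """Generate a time-based one-time password (RFC 6238)."""
    counter = int((time.time() if now is None else now) // interval)
    msg = struct.pack(">Q", counter)            # 8-byte big-endian time step
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

The phone and the login server both derive the same short-lived code from a shared secret, so entering it proves possession of the second device – the extra security the faculty member pays for in time and attention.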
Sometimes it may be appropriate to compromise – accepting some risk while trying to maximize productivity and minimize security vulnerabilities. The administration can decide on an acceptable risk level and translate it into security policies. Then it’s up to security and IT professionals to enforce them. In an academic setting, input from faculty and students needs to be factored into this process since they’re the organization’s primary customers.
Context is going to count on campus. There are bound to be faculty and staff members who need more access to IT resources simply to do their jobs. For example, instructors teaching an ethical hacking course will want administrative access, while those mostly doing clerical work on their computers don’t usually request the same kind of privileges. By accommodating each individual’s occupational needs as much as possible, a chief information officer can hopefully avoid rogue users who do IT tasks in their own way without authorization – and end up introducing new security vulnerabilities without the security group’s knowledge.
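One way a CIO can accommodate those differing needs without handing out blanket privileges is a simple role-based, default-deny permission table. The roles and permission names below are purely illustrative, not a real campus policy:

```python
# Hypothetical roles and permissions for illustration; not a real campus policy.
ROLE_PERMISSIONS = {
    "clerical":           {"email", "office_apps"},
    "instructor":         {"email", "office_apps", "course_lms"},
    "hacking_instructor": {"email", "office_apps", "course_lms", "lab_admin"},
}

def is_allowed(role: str, action: str) -> bool:
    """Deny by default; grant only what the role explicitly needs."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

Because unknown roles fall through to an empty set, a new account starts with no access until someone deliberately maps it to a role: a least-privilege default that keeps access decisions visible to the security group instead of driving users to rogue workarounds.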
It is also crucial for the security experts to listen to the users and establish trust and transparency. They need as much buy-in as possible so everyone on campus is, in effect, on the same cybersecurity team. The last thing a CIO wants is users not sounding the alarm about a potential security problem and trying to solve it on their own due to a lack of trust. It’s mutually beneficial for users and security people to freely communicate with each other about anything that could have an impact on their professional lives. This could mean announcements of additional security restrictions imposed on end users by the IT group, or a user’s confession of a personal security oversight.
Logging and monitoring are a CIO’s best friend. For employees, this means that every move they make on their computers is tracked and recorded. Because colleges need flexibility in how they manage security, watching for potential security incidents and responding to them quickly are critical. The sooner you detect an attack, the better the chance of stopping it before it does irreversible damage. On the other hand, academics may feel uneasy about this type of surveillance, which could impinge on their academic freedom.
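As a sketch of what such monitoring can look for, the function below scans authentication log lines and flags source addresses with repeated failed logins – one of the simplest detection rules. The log format here is a made-up assumption for illustration; a real deployment would parse its own auth logs.

```python
from collections import Counter

def flag_repeated_failures(log_lines, threshold=3):
    """Flag source IPs with `threshold` or more failed logins.

    Assumes hypothetical lines like 'FAILED LOGIN for alice from 10.0.0.5';
    a real deployment would parse its own auth log format instead.
    """
    failures = Counter()
    for line in log_lines:
        if "FAILED LOGIN" in line:
            failures[line.rsplit("from ", 1)[-1].strip()] += 1
    return {ip for ip, count in failures.items() if count >= threshold}
```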
CIOs should educate, not babysit, and think of users as resources. One of the biggest challenges in cybersecurity is that its scope is beyond the capabilities of a single person or even a small group. If security professionals attempt to do it all themselves and keep users out of the loop, they are destined to fail. There are simply too many things to protect and too few resources, including manpower, time and budget. In addition, an uninformed or misinformed user can turn out to be a major security vulnerability, as demonstrated by social engineering cyberattack techniques such as phishing – an impersonation attack that gets victims to surrender sensitive information.
A better approach is to educate the users about how to protect themselves and delegate some security-related responsibilities to them, depending on their knowledge and roles. Using a “carrot and stick” strategy can help. There need to be consequences for repeated violations of security policies – for instance, visiting an unauthorized website can mean losing privileges such as full access to a computer system. Desirable behaviors can be reinforced by rewards, even as simple as a chance to win a gift certificate or other swag. This way the users can better protect themselves, and IT staff can get some relief in terms of their workload.
As an end user, be reasonable. Don’t insist you need unlimited rights and privileges. Do you really require full access to everything? Unrestricted access to all student records since the university’s founding is probably overkill. Unnecessary access makes the job of security professionals almost impossible because it introduces an uncontrollable number of security vulnerabilities. Even a system administrator doesn’t typically have this kind of unlimited power anymore, because of concerns about insider threats.
At the same time, that’s not to say academics should allow their rights to be stifled. Scholars should be proactive and communicate their IT needs to the appropriate people. You may assume that your request for a personal wireless router in your office will be either ignored or outright rejected; but an IT team’s primary security goal may simply be awareness of what’s going on in its network – including the router’s connection. Many security incidents result from a lack of visibility. It’s users’ responsibility to notify IT staff before taking any security-relevant actions.
Lastly, you can be the master of your own cybersecurity destiny. End users are frequently the weakest link. Faculty members should demand systematic training on how to take their own precautions against common security attacks. For example, a majority of phishing attempts can be defeated with an elevated level of awareness and education.
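Part of that training can be as simple as teaching users to check where a link actually points. The sketch below encodes one such rule of thumb – the link’s hostname must exactly match a known-good domain. The domains in the example are made up, and real phishing defenses rely on many more signals than this:

```python
from urllib.parse import urlparse

def looks_suspicious(url: str, trusted_domains: set) -> bool:
    """Return True unless the link's hostname exactly matches a trusted domain.

    A crude awareness-training heuristic, not a complete phishing filter:
    'psu-edu.login-verify.example' is not 'psu.edu', however official it looks.
    """
    host = urlparse(url).hostname or ""
    return host not in trusted_domains
```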
Whether you’re an IT service user or a security professional, a common goal should be making security as usable, transparent and unobtrusive as possible. Unfortunately, most security solutions today are highly visible and can be detrimental to getting things done. We can all do better by meeting in the middle, because higher education’s core mission – teaching and research – hinges on both academic freedom and cybersecurity.
Jungwoo Ryoo is Associate Professor of Information Sciences and Technology at the Altoona campus of Pennsylvania State University. This article was originally published on The Conversation. Read the original article.