Proponents say constant peer review creates more secure applications.
When you buy software, you probably trust that you're getting a secure product that runs well. This faith may come from the fact that the source code - the digital DNA that tells the program how to work and what to do - is hidden from consumers. In most cases, only the select programmers tasked with maintenance and security can see it and make changes.
Closed, or proprietary, code powers legions of vendor-made products. Many of them, like Microsoft's nearly ubiquitous Windows software, are kept closed to prevent piracy and duplication by competitors or users. And for some license owners, the perceived benefit of closed code is that if no one can see it, those who intend to do harm can't easily find the software's vulnerabilities or figure out how to exploit them.
The prevalence of open source code, however, could make one wonder how much secret code matters. The term "open source" generally refers to programs in which people can view or modify the programming code. Open code is developed in a collaborative environment where programmers can make changes that are visible for the community to see. People can download many of these programs free of charge and can choose to join the development process by making modifications or viewing changes as they see fit.
But does this openness make it less secure than its closed source brethren? Open source advocates certainly don't think so.
"You know exactly what needs to be done to secure it and what vulnerability it has. It's quantifiable; it's knowable," said Christopher Adelman, vice president of sales and marketing for Alien Vault. Alien Vault created OSSIM (Open Source Security Information Management). "The problem with closed source solutions is there's a certain leap of faith associated with closed source software."
Open source code lets users judge how secure a program is, Adelman said. When you can't see the code, you can't see for yourself just how secure it is or isn't. "You know exactly what you're getting into, and for me, that's everything. Game won right there."
A popular argument of the pro-open source crowd is this: If the code is open, it's essentially up for peer review, which means more sets of eyes to identify security holes and fix them. In a closed environment, how do you know how thoroughly your software is being reviewed if you can't see what's happening or know who's doing it?
"The things that keep me awake at night are the things I don't know about. It's the things that I have no idea are out there that the hackers know that I don't, that are going to cause us problems on our security operation front," said Jon Dolan, chief information security officer of Oregon State University.
Open source can also make patching software a bit faster. There's no need to contact the vendor about a bug - like you'd have to with proprietary code - or wait for the next release of the software that fixes it.
"If I find a bug in an open source program ... I submit a fix to the people who are responsible for the program," Dolan said. "It gets peer reviewed before it's accepted, but then it is accepted in short order, so we eliminate this whole workflow of reporting a bug to have somebody else fix it. You just fix it yourself and pass along the fix to everyone."
Other users feel the same way.
"What I think is good about an open source project is that all of those discussions happen out in the open, and so people can see if there's a fix for something right away," said Michael Tutty, enterprise IT expert for the Iowa Department of Administrative Services, where IT personnel use open source Geeklog, a PHP/MySQL-based application for content management. "I don't believe that a for-profit company could approach the speed that an open source company can patch things at."
Open source advocates can point to strong evidence of just how widely consumers have embraced open code, regardless of any security-related distinctions between it and closed code.
Netcraft uses a tool to periodically query Web sites and discern statistical data about them, including what Web servers and operating systems they use. The company's April 2009 query received responses from more than 231 million sites from around the world, and approximately 106 million of them used Apache Web servers - technology developed in an open source community facilitated by the Apache Software Foundation. Apache Web server technology has been No. 1 in Netcraft surveys like this since 1996.
"I think it's true that most people run both open source code and closed code, and that there's a hodgepodge of it on all of our computers today," Dolan said.
But do users themselves worry about the integrity of their open code? Of course, but to some, the closed versus open argument oversimplifies the issue.
"From my perspective, no code is truly safe and secure. Whether the code base is open or closed, they both have the same level of frailties, and that's because it's designed by humans, and all things that we do are imperfect," said Noel Hidalgo, director of technology innovation for the New York state Senate.
His office uses Drupal open source software to manage Web site content. He feels that open source, by its very nature, could motivate people to plug security holes.
"They're always having to go about greater levels of security audits, and maybe that's because people just feel insecure about it and that there's a certain level of insecurity. But in the end," he said, "because more eyes are being viewed upon it and that they are forced to do more security audits, I feel that open source software has a tendency to be more secure."
Software audits for open source code might not necessarily be the same, however, as ones done by a technology giant that sells code to millions of consumers. These companies have thorough and laborious documentation and quality assurance processes. They also have cash.
"If you consider somebody's time equal to the money that it would cost for them to do the work, how much are people really going to spend auditing open source software seriously to find bugs versus a commercial piece of software?" said John Viega, chief technology officer of the software-as-a-service business unit at McAfee.
It's assumed that vendors pay skilled programmers to iron out the kinks in their own closed code, which makes one wonder: In an open environment, where programmers often work for free, are they as skilled or motivated as seasoned programmers at a Microsoft- or Oracle-level company? Wouldn't programmers facing the threat of unhappy bosses and the allure of paychecks have more incentive to work harder than those who aren't?
Answers vary depending on whom you ask, but most people don't seem too worried about the credentials of the developers in their open source communities.
"I don't think that's true at all," said Steve Grubb, team lead for the security technologies team at Red Hat, a prominent provider of open source technology. "I think that if word got out that the Red Hat operating system is not secure, the government would not be buying and deploying it. There's just as much motivation for us to make sure that our source code is as good as it can be because it really comes down to reputation."
He feels that a collaborative community brings benefits that outweigh financial incentive.
"In the open source world, the source code is viewable all over the world by just about anybody, and so you can draw upon the experience of security experts from all over the world and you're not limited to just how many people you can put on your own payroll," he said.
Even so, Red Hat has the money to pay its programmers. The company earns revenue from customers who pay for support and maintenance, not for the free software itself. So in these cases, the "many eyes" in the open environment have a financial incentive to be diligent.
Once you go a few - or quite a few - steps below an operation at Red Hat's level, you'll find communities of programmers who are working for free, or sometimes a mixture of the paid and unpaid. In Geeklog's case, Tutty said the majority of developers don't receive pay for their work.
"They're all volunteers. Some of the contributors to the project are people who actually do sell their time to do support for Geeklog sites," he said. But it's not anything like the sophisticated affair of larger open source providers. "These guys are much more informal, much more 'open sourcy' about it. Their main goal is to put out this thing because they all use it, and they all like it, and then, if you need help, you get it from the forums or mailing lists."
In smaller open source communities, peer review still makes open code worthwhile, developers say, even when monetary compensation isn't a factor.
"The vast majority of these people are not paid to do this. This is done because they have a hobby. They don't play sports; they don't go to play tennis at lunch; they don't play racquetball. They code. That's where they get their enjoyment," said Harper Apted, network administrator of the Warsaw Community Public Library in Indiana. "When they submit code, they get 'props' for this. They get reputation, perks, and are known in their community and their social circle."
According to Apted, if you're a hardcore programmer, then creating or tweaking programs in your free time that satisfy users is reward enough to make you do exemplary work.
"If you are known for submitting bad, buggy code, corrupt or even virus-laden [code], you will be the dreg of your social circle," he said. "You probably won't even be allowed to participate, so reputation is a very valuable commodity, even if it's intangible."
Three out of the Warsaw Community Public Library's 12 servers run operating systems built on top of the Linux open source kernel. These servers support the integrated library system, which is a database of books, videotapes, CDs, audio books and everything else that can be checked in or out.
Apted said he and his colleagues don't modify the open source programs they use, but they do view the code as they wish and can see how it's architected. When they need modifications, they hire someone to handle that. And because it's open code, it's relatively simple to see what needs tweaking and do it, he said.
Users of commercial, closed source products generally assume that their software's developers are professionals. But with open source products, consumers may wonder about the credentials of the mysterious programmers wandering around these free environments.
"Frankly you look at a lot of open source projects, there's somewhat of a perception that large parts of it were written by high school kids just to start learning and building a reputation," McAfee's Viega said.
But the reality is that most users don't know who developed their software - whether it's open source or proprietary. You probably don't give much thought to the Microsoft staffers who created your latest Windows release. It's probably not much different if you're downloading software from an open source resource like SourceForge.net, either.
"As far as it goes, how do you really know anybody that's writing the code is worth it? I don't know that it's necessarily built on trust. A lot of it is built on reputation," Apted said.
Software's integrity may have little to do with whether it's open or closed and everything to do with how widely it's used. If people care about it, they'll scrutinize it.
"I know that the U.S. government has paid to review many open source projects such as OpenSSL, the free security library, and Apache. So I'm sure they've done it more than once," Viega said. "The government even paid for OpenSSL to be FIPS [Federal Information Processing Standards] certified."
The National Institute of Standards and Technology (NIST) first certified OpenSSL in January 2006 as compliant with FIPS 140-2 Level 1 standards. The certification was revoked in June of that year over concerns about how the validated module interacted with other software. However, NIST reinstated the certification in 2007.
In open source's case, this scrutiny comes not only from the members of the open community creating and modifying it, but also from third parties like private companies and the government that want to vet software for their own use. In the closed code world, companies typically pay for the same sort of vetting themselves, but again, these companies likely only do it for the most popular applications. Security, in these cases, has little to do with whether code is hidden, but everything to do with how much people want to use it.
"The way that security bugs are found is by people looking at the code," Viega said. "There's a big industry around this. There are many companies looking for vulnerabilities. The vulnerability company is focused on self-marketing, really, at the expense of the customers."
In his book, The Myths of Security, Viega wrote that "the most popular open source software gets reviewed more. The most popular commercial software typically has a large investment in training, tools, auditing and so on." He implies that if you're concerned about the security of your software and asking if it's closed or open source, you're asking the wrong question. It's a moot point.
Perhaps you should consider how long the product has existed, how well it has performed, and how you would handle modifications if they're necessary. You may have to pay a third party or the open source provider to handle the work, but if the community your code came from is robust and thriving, you likely won't have to.
"One aspect of open source security that is a little less tangible but makes sense when you think about it is, when security professionals have all of the source code, they can explore new solutions to old problems in a very creative way," Grubb said.