Indian Institute of Information Technology - Allahabad
The Big Debate: To report software flaws or not?
Alumni of IIIT-A (Btech-IT, 2005)
Master of Information System Management
Carnegie Mellon University, Pittsburgh, US
On one hand, warning the public about flaws in software can help users take suitable actions to protect themselves against the vulnerabilities. The company providing the software can also develop patches to close those vulnerabilities and focus on continuous innovation and improvement to safeguard against security holes in its software. Moreover, hackers, being technically savvy, can unearth bugs in software and exploit them anyway. So public disclosure would only benefit the end users.
On the other hand, hackers can misuse information about these vulnerabilities and exploit them to cause harm. Public disclosure of vulnerabilities may encourage a larger base of people to misuse that information before naïve users can take appropriate steps to protect themselves. The company providing the software can suffer heavy losses if consumers develop negative opinions about its product. Exposing vulnerabilities may even become a tactic for thwarting competition, maligning companies, and magnifying small flaws. This may create a negative impression among naïve consumers, scaring them unnecessarily and making them pessimistic about using software at all.
An intermediate option would be to inform the company producing the software about the bugs or flaws in its software, rather than exposing them publicly. However, the company must be receptive to such reports. If the bugs are exposed publicly and users are aware of those flaws, the company will have no choice but to fix those security issues in order to remain competitive. On the other hand, if the public is not told about the vulnerabilities, companies would not be under pressure to take immediate action, and hackers may keep exploiting those faults and causing harm. Thus, public disclosure might be the only way to force companies to write secure code.
In fact, this argument can be supported by the example of open source software. Since open source software’s vulnerabilities are not hidden, the open source community quickly fixes them. Therefore, open source software undergoes rapid development creating a mature and robust product in a relatively short time.
Software companies and research organizations have divergent agendas. Vendors want an indefinite amount of time and want to control the distribution of information to their customers; they also prefer to release only bug fixes, not details. Researchers, by contrast, want to find vulnerabilities and inform the general public so that people at large can understand the risk as soon as possible, before any hackers take advantage of it; they want immediate release with as much detail as possible. This often leads to mistrust between the two: security researchers accuse software vendors of putting profits before consumer welfare, and software companies in turn accuse researchers of seeking publicity by exposing their software bugs to the public. In light of such divergent agendas, a neutral third-party approach may be the only solution to the deadlock.
Software makers might be slow to publicize bugs because they want to have thorough fixes in hand before spreading news of a flaw to those who might exploit it for destructive purposes. On the other hand, those who argue in favor of full and immediate public disclosure of software flaws hold that keeping bugs secret will not keep hackers from finding these flaws on their own. Hackers are technically savvy people who do not rely on the media to obtain information about security bugs; they probably know more than the research organizations do. So publicly disclosing flaws would only be in the interest of the larger user community. In fact, many software companies recognize this and hold hacking contests to discover flaws in their software. Software vendors would certainly profit from knowing about critical unpatched flaws that could affect thousands of users. But public disclosure can cause more harm than good.
In my opinion, software companies should be given a chance and time to fix these problems before the public is informed. After all, no software in the world is without flaws, and no amount of testing is ever enough. So companies should be given a fair chance to overcome flaws in their software as and when they are discovered: they need to validate the flaw, come up with fixes for different platforms, and test those fixes before releasing them. As for the argument that public disclosure forces developers to accelerate their effort, it applies only when vendors have no intention of fixing the problem. If developers are already working on a fix, they would only be distracted by the volume of email that follows a public disclosure; going public may actually delay a firm's ability to get patches out quickly. Most companies already try to resolve problems as soon as possible, and seeing the flaw in print won't make them go any faster; it just adds the overhead of managing the media. Moreover, prematurely releasing the information would only arm hackers, who would use it to compromise systems and attack users.
But then the questions arise: How much time should software companies have? Are standards needed to govern this arena? I believe there should be regulatory bodies that control the disclosure of information impartially, keeping the public interest in mind while remaining fair to the software companies. The time given to a company should depend on the severity of the bug: for critical flaws, an immediate patch should be required, while less sensitive flaws can wait until the next release. Software companies should also establish relationships with research organizations, make it easy for them to report problems and issues with their software, and develop the internal structures needed to handle security flaws more swiftly and effectively. Researchers, in turn, should recognize the negative effects of public disclosure and report bugs in a responsible manner.