Demand quality software, for everyone's sake
A team of researchers led by SCADA security firm Digital Bond announced last week that they had found major security vulnerabilities in industrial control systems sold by five manufacturers. These are systems that are at work in nuclear power plants, utilities, gas pipelines, chemical plants and the like.
What struck me hardest about the news was not that the products have flaws, but that the vendors allegedly were aware of the problems and opted not to do anything about them.
A "large percentage" of the vulnerabilities the researchers identified were already known to the manufacturers, who had "chosen to live with" them, Dale Peterson, founder of Digital Bond, is quoted as saying in an article at Wired. "Everyone knows PLC's [programmable logic controllers] are vulnerable, so what are we really disclosing? We're just telling you how vulnerable they are."
The security holes identified by the researchers include "backdoors, weak credential storage, ability to change ladder logic and firmware, command line interface, overflows galore, TFTP for important files and so much more," according to a blog post by Peterson. If there is anywhere that fixing flaws of this nature is clearly worth the cost, isn't it a critical infrastructure facility?
And the larger, more obvious and more perplexing question: Why is security not built into this kind of software from the outset? If I ran a nuclear power plant or a gas pipeline, why wouldn't I demand that my PLC supplier deliver a secure product? Would it take a little longer to get a secure product than a shoddy one? No doubt. Would it be a lot more expensive? I'm not so sure. It turns out that the most expensive PLC the researchers found vulnerabilities in also had the most security holes. It took researcher Reid Wightman only 16 hours to locate staggering flaws in it.
"He found that the system used no authentication to control the uploading of 'ladder logic' to program the PLC. Backdoors in the system also allowed him to list processes, see where in memory they lived and to read and write to memory," Wired's Kim Zetter reported. "He also had access to the configuration file, which listed, among other things, user names and passwords that would allow an attacker to gain access to the system using legitimate credentials. Basic buffer overflow flaws in the system could also be used to crash it."
We have been hearing for years that critical infrastructure is under constant threat of cyber attack, and this regularly raises questions about how much information industry should be sharing with the government and vice versa. In Washington there is a steady stream of proposals from lawmakers and the administration calling for greater government authority over critical infrastructure. The talk grows ever more urgent, with ideas ranging from the infamous "Internet kill switch" to mandatory breach notification to draconian penalties for hackers.
Wouldn't it be better if businesses just started demanding quality software instead? - Caron