The US government is deliberating about how to approach the “cyber” security problem. But what the government needs is not sweeping authority over the Internet; it is the common-sense network security practices it has so far failed to implement.

As we previously said, it is unfortunate that the government tends toward the dramatic and seeks to broadly expand its powers in the name of security, while continuing to overlook more prosaic issues. Bruce Schneier explains,

GAO reports indicate that government problems include insufficient access controls, a lack of encryption where necessary, poor network management, failure to install patches, inadequate audit procedures, and incomplete or ineffective information security programs. These aren't super-secret NSA-level security issues; these are the same managerial problems that every corporate CIO wrestles with.

Moreover,

The best thing the government can do for cybersecurity world-wide is to use its buying power to improve the security of the IT products everyone uses. If it imposes significant security requirements on its IT vendors, those vendors will modify their products to meet those requirements. And those same products, now with improved security, will become available to all of us as the new standard.

We know the market pressure approach can work. Once Microsoft saw that the market would (at least threaten to) make purchasing decisions on the basis of security, we suddenly got the Secure Windows Initiative and Trustworthy Computing. A key security technique is keeping the heat on vendors.

There is also an operational problem. To get a handle on the state of security in important infrastructure, try a Google search for [ scada security ]. It turns up alarming reports of basic security problems in some of our nation’s most important systems. (“SCADA” stands for “supervisory control and data acquisition” and generally refers to industrial control systems for things like water purification, electricity, and manufacturing.) Here are some examples:

  • SCADA Security and Terrorism: We're Not Crying Wolf by Maynor and Graham explains that some SCADA systems depend on a Windows design flaw that has long since been fixed, and so cannot be upgraded to safer, more recent versions of Windows. They further note that although it would be easy to hack into many SCADA systems, doing so is generally unnecessary since the systems are completely unprotected by design (a sketch of what that looks like in practice follows this list):
    • “Blaster” worm exposed the problem with DCOM
    • So Microsoft SP2 turned off “anonymous” by default for DCOM
    • This breaks SCADA systems because they don’t have logins
      • No security
      • X-Force research: looks like OPC problem [sic] has lots of buffer-overflows in it, but since everyone uses it with no authentication anyway, it’s pointless researching them.
      • Go to http://www.opcfoundation.org/, download trial software, test the authentication, binary review their code
  • 21 Steps to Improve Cybersecurity of SCADA Networks, a DOE guidelines document, confirms Maynor and Graham’s assertion that SCADA systems are indeed accidentally connected to the Internet. All 21 steps are mundane network management tasks, yet they are exciting news in the SCADA world.

    It is all too cheap and easy to connect to the Internet; maintaining an air gap requires conscious effort and incurs a cost. (A sketch of the kind of connectivity check this implies also appears after this list.)

  • Joe St. Sauver’s academic presentation agrees that SCADA security today lags 5 to 10 years behind business security. Of course, we already know that business security tends to lag behind attacker capabilities by some number of years.
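
To make the “no logins” point concrete, here is a minimal sketch of what talking to an unauthenticated control device can look like. The slides above are about DCOM and OPC; purely as an assumption for illustration, this sketch uses Modbus/TCP instead, another widely deployed industrial protocol whose base specification likewise has no authentication, simply because it fits in a few lines. The host address, unit ID, and register range are hypothetical.

    import socket
    import struct

    # Hypothetical target: a PLC speaking Modbus/TCP on the standard port.
    HOST, PORT = "192.0.2.10", 502
    UNIT_ID = 1   # hypothetical unit (slave) identifier

    def read_holding_registers(host, port, unit, start, count, timeout=3.0):
        """Issue a Modbus 'read holding registers' request (function code 0x03).
        The protocol has no concept of credentials, so the request carries
        nothing but addresses; that is what 'no logins' means in practice."""
        # MBAP header: transaction id, protocol id (always 0), remaining length, unit id
        request = struct.pack(">HHHB", 1, 0, 6, unit)
        # PDU: function code 0x03, starting register, number of registers
        request += struct.pack(">BHH", 0x03, start, count)
        with socket.create_connection((host, port), timeout=timeout) as sock:
            sock.sendall(request)
            response = sock.recv(260)   # small replies normally arrive in one read
        # Reply layout: MBAP (7 bytes), function code (1), byte count (1), register data
        byte_count = response[8]
        data = response[9:9 + byte_count]
        return struct.unpack(f">{byte_count // 2}H", data)

    if __name__ == "__main__":
        print(read_holding_registers(HOST, PORT, UNIT_ID, start=0, count=4))

Nothing in the exchange identifies or authorizes the requester; network reachability is the only access control.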
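
To make the “accidentally connected” point concrete, here is an equally small sketch of the kind of check the DOE’s mundane steps imply: from a network that is supposed to be separated from the control systems, verify that nothing answers. The host address and the port list are assumptions chosen for illustration, not a recommended audit procedure.

    import socket

    # Hypothetical address of a supposedly air-gapped control host.
    CONTROL_HOST = "192.0.2.10"
    PORTS = {
        135:  "MS-RPC / DCOM endpoint mapper",
        502:  "Modbus/TCP",
        4840: "OPC UA",
    }

    def is_reachable(host, port, timeout=2.0):
        """Return True if a TCP connection to host:port succeeds within the timeout."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # Run this from the corporate network or from outside the plant; every
    # "REACHABLE" line is evidence that the air gap is not actually a gap.
    for port, service in PORTS.items():
        state = "REACHABLE" if is_reachable(CONTROL_HOST, port) else "closed or filtered"
        print(f"{CONTROL_HOST}:{port} ({service}): {state}")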

The purpose here is not to scare people. As Maynor and Graham note, “There is neither cause to panic nor cause to ignore the issue.” The way out of the security mess is reason, not paranoia. Rather, these examples show that even the most important systems suffer surprisingly basic problems, and those problems have equally basic fixes.

In my own private-sector security industry work, I observed a pattern: the higher the stakes, the worse the security. “Worse” here usually meant problems that are more easily resolved with known techniques. I evaluated a wide range of applications and platforms, and almost invariably found that the most important systems (those managing life, health, and money) were poorly engineered. By contrast, small startups doing something interesting but not (yet) critical would sometimes have very well-engineered systems, with entire classes of vulnerability designed away, minimal feature creep, and solid development practices reducing the risk of accidental implementation flaws. I suspect the reason for this pattern is that organizations handling life, health, and money do not think of themselves as software engineering organizations, and so seek to minimize engineering costs. Additionally, engineering-driven companies tend to be disruptive newcomers that have not yet made a big enough impact on the market to control much important information.
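
As a small illustration of what designing an entire class of vulnerability away can mean, consider SQL injection: an interface that accepts only parameterized queries removes the class outright, instead of relying on every developer to escape inputs correctly. The following minimal Python/sqlite3 sketch uses a hypothetical accounts table.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (owner TEXT, balance INTEGER)")
    conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 50)")

    def get_balance(owner: str) -> list:
        # Vulnerable pattern: building SQL by string concatenation lets crafted
        # input like "alice' OR '1'='1" change the meaning of the query:
        #   conn.execute(f"SELECT balance FROM accounts WHERE owner = '{owner}'")
        #
        # Parameterized query: the driver keeps data separate from SQL text, so
        # injection is not possible for this statement at all.
        return conn.execute(
            "SELECT balance FROM accounts WHERE owner = ?", (owner,)
        ).fetchall()

    print(get_balance("alice"))                 # [(100,)]
    print(get_balance("alice' OR '1'='1"))      # [] -- treated as a literal, not as SQL

The point is not this particular API but the pattern: the well-engineered systems described above make the insecure version hard or impossible to write in the first place.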

Although some members of Congress want to give the president the power to declare a “National Cyber Emergency”, as in S. 3480, simple measures like keeping systems updated and keeping critical systems air-gapped would provide more day-to-day safety for the nation. The government should instead use its enormous purchasing power to pressure platform and application vendors to raise their engineering standards.