
04 May 2007, 19:33

Good Guys, Bad Laws - Protect Our Vulnerability Researchers!

In a threat space of rapidly changing detail, with new types of attack emerging by the day, it is easy to overlook the stability of the fundamentals. Even from a purely technical perspective, the same errors are made time after time. Just for example, thirty years after Kernighan and Ritchie introduced the C programming language, there should be no excuse for developers using strcpy() or the dozen or so well-known similarly unsafe functions in an unprotected manner that can be exploited to cause buffer overflows. However, COTS developers have always seemed to have difficulty coming to terms with such issues, probably in part because their products have become so bloated by evolutionary development that they are essentially untestable in any exhaustive manner. I am grateful therefore for the services of the multitude of independent vulnerability testers who provide a public service in discovering security holes in software that vendors would find uneconomic to identify. In general this has to be done without formal authorisation, as PR concerns tend to diminish the cooperativeness of vendors. And although this comes under the umbrella title of "hacking", it is to my mind a completely different activity from attempting to penetrate other people's live systems without their consent: something I absolutely cannot condone.
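To make the point concrete, here is a minimal C sketch of the kind of unchecked strcpy() call referred to above, alongside a bounded alternative; the function names, buffer size and test strings are purely illustrative.

    #include <stdio.h>
    #include <string.h>

    /* Unsafe: if `input` holds more than 15 characters plus the terminator,
     * strcpy() writes past the end of `buf` -- the classic stack buffer overflow. */
    static void copy_unsafe(const char *input)
    {
        char buf[16];
        strcpy(buf, input);              /* no bounds check at all */
        printf("unsafe copy:  %s\n", buf);
    }

    /* Safer: bound the copy explicitly and terminate by hand, because strncpy()
     * does not NUL-terminate when the source fills the whole buffer. */
    static void copy_bounded(const char *input)
    {
        char buf[16];
        strncpy(buf, input, sizeof buf - 1);
        buf[sizeof buf - 1] = '\0';
        printf("bounded copy: %s\n", buf);
    }

    int main(void)
    {
        const char *attacker_controlled =
            "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA";  /* 40 bytes */

        copy_unsafe("hello");              /* fine only because the input is short */
        /* copy_unsafe(attacker_controlled);  would corrupt the stack -- left out  */
        copy_bounded(attacker_controlled); /* truncates instead of overflowing     */
        return 0;
    }

In practice one would reach for snprintf() or, where available, strlcpy(), but the essential point is simply that the bound has to come from somewhere other than the attacker.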

From the US Digital Millennium Copyright Act of 1998 and the UK Copyright, Designs and Patents Act of 1988 to the "toothier" amendments to the Computer Misuse Act passed last November, measures have progressively been introduced to curb hacking in its broadest sense. Sadly, the distinction between software vulnerability research and unauthorised cracking of live systems has usually been missed, with much of the legislative emphasis falling on the former. Call me cynical, but I personally suspect that vested interests have also come into play. It is, after all, embarrassing and expensive to be forced to fix a publicised bug in your mainstream product line. The net result, though, is that almost all "reverse engineering" and software examination is at present strictly unlawful, and established legitimate researchers have regularly been threatened with crippling lawsuits for announcing important vulnerabilities.

Interestingly, the balance seems to be getting restored, at least in the UK. The recent announcement that the "supply" clause (section 37) in Part 5 of the Police and Justice Act is to be suspended while it is revised is significant. As enacted, this clause would have effectively made it an offence to supply tools that "might" be used for unlawful hacking. The threat of having to defend oneself against an accusation of hypothetical foreknowledge of the motives of third parties would make most people cringe, so some of the best tools in the defenders' armoury might well have disappeared from legitimate use. But of course that would not have prevented criminals from carrying on as before. So hooray for a gleam of common sense at last. We can only hope that the revised version of the supply clause is better tuned to prosecuting the real bad guys than to pursuing hard-working, often voluntary, legitimate researchers. Our global computing infrastructure largely depends on these people if it is to survive increasingly organised criminal intrusions that already have a significant impact on trust and on the conduct of business and government. They should therefore be explicitly protected by the law, rather than threatened by it.
