19 November 2009
Some members of Congress have gotten extremely upset about peer-to-peer filesharing. Even the New York Times has editorialized about the issue. The problem of files leaking out is a real one, but the bills are misguided.
Fundamentally, the real issue is that files are being shared without the user intending that result. This is not a weakness unique to peer-to-peer software; more or less any mechanism for publishing files can do that. The real problem is that the targeted software — whatever it is; the news stories full of outrage haven't identified which package or packages are implicated — is bad software, either because it shares files the user hadn't intended to share or because it makes it too hard for the user to understand what will happen. Given the sub rosa nature of much peer-to-peer software, perhaps this is not surprising; developing good software is remarkably difficult. Perhaps Congress should instead decriminalize sharing of music and video...
I digress. The real issue I'm addressing is bad legislation. Quite apart from my general concerns, the bills are just poorly drafted.
The first bill, H.R. 1319, is in many ways more reasonable: it mandates notice to the user of what is happening, and bars software that is difficult to remove. However, it stumbles badly when trying to define peer-to-peer software:
the term `peer-to-peer file sharing program' means computer software that allows the computer on which such software is installed—
(A) to designate files available for transmission to another computer;
(B) to transmit files directly to another computer; and
(C) to request the transmission of files from another computer.
As best I can tell, any web browser is covered by that definition.
The newer bill, H.R. 4098, does a much better job on a workable definition, though it's fun to try to twist it into knots, too. I particularly like the way software "designed primarily to operate as a server that is accessible over the Internet using the Internet Domain Name system" is not covered; who would have thought that the DNS had such mystical shielding properties?
The problem with H.R. 4098 is that it bans the wrong thing. Yes, NASA's use of BitTorrent would be permitted because it is "instrumental in completing a particular task or project that directly supports the agency's overall mission", but NASA employees probably wouldn't be allowed to download such files on their home computers because the bill seeks to block "the download, installation, or use by Government employees and contractors of such software on home or personal computers as it relates to telework and remotely accessing Federal computers, computer systems, and networks". In other words, you can either view such files or you can save the government money by using your own computer to work from home.
I should add a personal disclaimer: I, like most professors in the sciences and engineering, receive substantial government grants and contracts; as best I can tell, that technically makes me a government contractor. Am I covered? Are my students who receive stipends from such grants?
For those who are wondering if this bill is really just another ploy by a paid shill for the content industry, campaign finance records do not seem to support the notion. According to OpenSecrets.org, while Rep. Towns (the introducer) did indeed receive considerable campaign funding from PACs associated with content owners, he has also received a lot of money from PACs associated with companies like Verizon that have not been particularly sympathetic to the content industry's demands. I do not think the shill theory is supported by the data.
Overall, what we have here is too much firepower being aimed in the wrong direction. If the incidents are taking place from home computers, the solution is to provide government employees with the government-owned equipment — and government-provided software, support, and system administration — to let them do their jobs properly. Using poorly managed or maintained machines carries many more security risks than just peer-to-peer software; I could make a very good case that such software is the least of the security problems. If the incidents have taken place on office computers, the issue is really a management problem: employees are making more than the normal and acceptable de minimis personal use of their employer's equipment. There is also likely a problem with the quality of systems administration in such organizations. Again, those issues pose many more risks. These are real problems; focusing on peer-to-peer software won't address them.
3 November 2009
For years now, there have been calls for a high-level cybersecurity official, preferably reporting directly to the president. This has never happened. Indeed, there is a lot of unhappiness in some circles that President Obama has not appointed anyone as "czar" (or czarina), despite the early fanfare about the 60-day cybersecurity review. There are many reasons why nothing has happened, I'm sure, up to and including high-level disagreement over the need for such a post. But another reason, I suspect, is that there are (at least) three different roles that need to be filled. The different roles have different needs and different responsibilities, but all are very difficult.
The first role is effectively as chief security officer for .gov. That is, the government — and I'm speaking of the civilian sector, not the military — has a vast IT complex. Securing any one part of the government is very hard; securing all of it may be impossible. The czar's role, though, is to cadge, cajole, or coerce many different departments into doing something. Given how independent the departments are, it wouldn't be easy. Presidential authority might help, but Truman predicted that Eisenhower would say "Do this! Do that! And nothing will happen". A czar, by definition lower-level than the president, would have an even more frustrating time.
There have been attempts to set a single security policy for the government. The Federal Information Security Management Act (FISMA) tried it; unfortunately, it appears to have turned into yet another exercise in security by checklist. Beyond that, there's a more subtle problem: a proper security posture is site- and application-specific. The requirements for securing, say, an informational web server are very different from those for an EPA monitoring project polling air quality sensors around the country. One size does not fit all; a centralized policy won't work very well.
Some things, such as intrusion monitoring, might (or might not) be better off centralized. Detailed security policy is probably better off decentralized — if different departments will do it properly. The key to that is finding the right incentives, since we're not dealing with profit-making organizations for which money is a suitable metric. That, I think, is the challenge for securing .gov. It is not clear that a high-level czar would help; one cannot enforce a policy if that policy doesn't exist.
The second role I see for a cybersecurity czar is providing policy advice to the president. Cybersecurity (and cyber policy in general) is a cross-cutting issue. Do you want a smart power grid? How will you secure the sensors, the actuators, and the computer systems that talk to them? Hunting cybercriminals? Is there a suitable agreement with the country they're in? Improving education by providing computers to schools and libraries? How will these be secured? The president needs to hear advice on such issues, from someone with a very broad grasp of not just cybersecurity, but the fields in which there may be security concerns. There needs to be someone at a very high level advising the president on such issues, but should this advisor report directly to the president, or just be part of an office of science and technology policy?
The cybersecurity advisor has another big responsibility, though: devising a national strategy. What policies should the government pursue to help improve the overall security of computers in general? To give one example, many people have advocated a liability-based model: make vendors liable for problems caused by their security flaws, and let the market work its magic. Is this a good idea? Someone needs to look into this in detail, and make a recommendation to the president. Others have suggested replacing the Internet with something newer and more secure. Will this help? What about broad, national initiatives, like electronic health records, where the security and privacy risks are pervasive? All of these have very deep implications; someone needs to advise the president about them. Again, though, at what level should this advice be given, directly to the president or at one remove?
The third major cybersecurity role is liaison to the private sector. Most of the national computing capability is in private hands; what these organizations and people do has a great impact on the nation's cybersecurity. Some changes can be accomplished by legislation or regulation, especially in critical infrastructure sectors; others, though, require persuasion. For example, suppose it were concluded that ubiquitous encryption would be a tremendous security advantage. The cybersecurity liaison would try to jawbone vendors, web sites, etc., into implementing this. Does this need presidential access? It wouldn't seem to, but as Theodore Roosevelt noted, the presidency is a bully pulpit; the further the cybersecurity liaison is from the center of power, the less influence he or she would have.
These, then, are the three roles: government CSO, cybersecurity advisor, and cybersecurity liaison. The first and last need the presidency's power; the middle needs access. Is this one person, two, or three?
I'm certainly not privy to the debates going on inside the White House. I suspect, though, that some variant of the questions I've posed — the exact role and (especially for the CSO option) powers this person would have — are the reason for the delay. I also suspect that trying to combine all three roles in one position is counterproductive; the necessary skills are very different.