Steven M. Bellovin. Rethinking privacy regulation. GWU Journal of Law and Technology, 1(1), 2025. To appear. [ bib | http ]

Steven M. Bellovin, 1 Geo. Wash. J.L. & Tech. __ (2025) (forthcoming)

Steven M. Bellovin. The antiquity of algorithmic patents. Ohio State Technology Law Journal, 20(2), May 2024. To appear. [ bib | http ]

Steven M. Bellovin, 20 Ohio St. Tech. L.J. __ (2024) (forthcoming)

Steven M. Bellovin. Who coined the phrase “data shadow”? Ohio State Technology Law Journal, 20(2), May 2024. To appear. [ bib | http ]

The phrase “data shadow” is commonly used in books and articles on privacy. The origin of the phrase, though, is mysterious. It is often attributed to Alan Westin, but it does not seem to appear in any of his writings. I show that it was coined in the early 1970s by Kerstin Anér, a member of the Swedish parliament, as “dataskugga.” Later in the 1970s, she used the phrase in English as well. It was briefly popular then, but disappeared from use until the early 1990s. It has since become a popular and evocative phrase to describe how our activities, online and offline, follow us around.
Steven M. Bellovin, 20 Ohio St. Tech. L.J. __ (2024) (forthcoming)

Susan Landau, James X. Dempsey, Ece Kamar, and Steven M. Bellovin. Recommendations for government development and use of advanced automated systems to make decisions about individuals, March 2024. [ bib | http ]

Contestability---the ability to effectively challenge a decision---is critical to the implementation of fairness. In the context of governmental decision making about individuals, contestability is often constitutionally required as an element of due process; specific procedures may be required by state or federal law relevant to a particular program. In addition, contestability can be a valuable way to discover systemic errors, contributing to ongoing assessments and system improvement. On January 24-25, 2024, with support from the National Science Foundation and the William and Flora Hewlett Foundation, we convened a diverse group of government officials, representatives of leading technology companies, technology and policy experts from academia and the non-profit sector, advocates, and stakeholders for a workshop on advanced automated decision making, contestability, and the law. Informed by the workshop's rich and wide-ranging discussion, we offer these recommendations. A full report summarizing the discussion is in preparation.

Hal Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Jon Callas, Whitfield Diffie, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, and Carmela Troncoso. Bugs in our pockets: The risks of client-side scanning. Journal of Cybersecurity, 10(1), 2024. [ bib | http ]

Janet Zhang and Steven M. Bellovin. Preventing intimate image abuse via privacy-preserving anonymous credentials. SMU Science and Technology Law Review, 26:149--215, November 2023. [ bib | http ]

Janet Zhang & Steven M. Bellovin, 26 SMU Sci. & Tech. L. Rev. 149 (2023)

Steven M. Bellovin, Adam Shostack, and Tarah Wheeler. Ten questions we hope the Cyber Safety Review Board answers---and three it should ignore. Lawfare, February 9, 2022. [ bib | http ]

Steven Bellovin and Adam Shostack. Finally! A cybersecurity safety review board. Lawfare, June 7, 2021. [ bib | http ]

Steven M. Bellovin. Testimony for the New York City Council Committee on Technology hearing on “Benefits and Disadvantages of Cloud-computing Systems”, December 15, 2020. [ bib | .pdf ]

Steven M. Bellovin, Matt Blaze, Susan Landau, and Brian Owsley. Seeking the source: Criminal defendants' constitutional right to source code. Ohio State Technology Law Journal, 17(1):1--73, December 2020. [ bib | http ]

The right to a fair trial is fundamental to American jurisprudence. The Fifth Amendment of the Bill of Rights guarantees “due process,” while the Sixth provides the accused with the right to be “confronted with the witnesses against him.” But “time works changes, brings into existence new conditions and purposes.” So it is with software. From the smartphones we access multiple times a day to more exotic tools---the software “genies” of Amazon Echo and Google Home---software is increasingly embedded in day-to-day life. It does glorious things, such as flying planes and creating CAT scans, but it also has problems: software errors. Software has also found its way into trials. Software's errors have meant that defendants are often denied their fundamental rights. In this paper, we focus on “evidentiary software”---computer software used for producing evidence---that is routinely introduced in modern courtrooms. Whether from breathalyzers, computer forensic analysis, data taps, or even FitBits, computer code increasingly provides crucial trial evidence. Yet despite the central role software plays in convictions, computer code is often unavailable to examination by the defense. This may be for proprietary reasons---the vendor wishes to protect its confidential software---or it may result from a decision by the government to withhold the code for security reasons. Because computer software is far from infallible---software programs can create incorrect information, erase details, vary data depending on when and how they are accessed, or fail in a myriad of other ways---the only way that the accused can properly and fully defend himself is to have an ability to access the software that produced the evidence. Yet often the defendants are denied such critical access. In this paper, we do an in-depth examination of the problem. Then, providing a variety of examples of software failure and discussing the limitations of technologists' ability to prove software programs correct, we suggest potential processes for disclosing software that enable fair trials while nonetheless preventing wide release of the code.
Steven M. Bellovin et al., 17 Ohio St. Tech. L.J. 1 (2020)
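
The kinds of silent failure the paper catalogs need not be exotic. The fragment below is a hypothetical illustration in Python (the readings and the 1.0 reporting threshold are invented; the floating-point behavior is standard IEEE-754 arithmetic): two mathematically equivalent ways of summing the same sensor readings reach opposite conclusions about a legal threshold, a discrepancy invisible without access to the code.

    # A toy illustration of the kind of silent software error the paper
    # argues defendants need code access to uncover.  The "samples" and
    # the 1.0 threshold are hypothetical; the arithmetic is real.
    import math

    samples = [0.1] * 10          # ten readings of 0.1 units each

    naive = sum(samples)          # left-to-right binary floating point
    print(naive)                  # 0.9999999999999999
    print(naive >= 1.0)           # False: reports "under threshold"

    exact = math.fsum(samples)    # correctly rounded summation
    print(exact)                  # 1.0
    print(exact >= 1.0)           # True: same inputs, opposite conclusion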

Steven M. Bellovin. Mail-in ballots are secure, confidential, and trustworthy. Columbia News, October 23, 2020. [ bib | http ]

Steven M. Bellovin. Testimony for the New York City Council Committee on Technology and Committee on Small Business hearing on “Cybersecurity for Small Businesses”, February 25, 2020. [ bib | .pdf ]

Simha Sethumadhavan, Steven M. Bellovin, Paul Kocher, and Ed Suh. Please disclose security vulnerabilities! February 7, 2019. [ bib | http ]

Steven M. Bellovin. Yes, “algorithms” can be biased. Here's why. Ars Technica, January 24, 2019. [ bib | http ]

Steven M. Bellovin, Preetam K. Dutta, and Nathan Reitinger. Privacy and synthetic datasets. Stanford Technology Law Review, 22(1):1--52, 2019. [ bib | http ]

Steven M. Bellovin et al., 22 Stan. Tech. L. Rev. 1 (2019)

Steven M. Bellovin and Peter G. Neumann. The big picture. Communications of the ACM, 61(11), November 2018. [ bib | .pdf ]

Steven M. Bellovin. Comments on privacy. LawArXiv, November 2018. Comments submitted to the NTIA request for comments on privacy. [ bib | http ]

Today, all privacy regulations around the world are based on the 50-year-old paradigm of notice and consent. It no longer works. The systems we deal with---web pages with their multiple levels of advertising, the Internet of Things, and more---are too complex; consumers have no idea what sites they are contacting or what those sites' privacy policies are. Privacy harms are not well-defined, especially under U.S. law. Furthermore, the privacy policies themselves are ambiguous and confusing. Use controls---the ability for users to control how their data is used, rather than who can collect it---are more promising but pose their own challenges. I recommend research on a new privacy paradigm, and give suggestions on interim changes to today's privacy regulations until there is something new.
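
The use controls mentioned above can be made concrete with a small sketch. Everything below is illustrative rather than drawn from the comments themselves: the Purpose names and the check are hypothetical, but they show the shift from gating who may collect data to gating what any holder may do with it.

    from dataclasses import dataclass, field
    from enum import Enum, auto

    class Purpose(Enum):
        """Hypothetical purposes a data subject might permit."""
        BILLING = auto()
        FRAUD_DETECTION = auto()
        ADVERTISING = auto()

    @dataclass
    class LabeledRecord:
        """A record tagged with the uses its subject has allowed."""
        value: str
        allowed_purposes: set = field(default_factory=set)

    def use(record: LabeledRecord, purpose: Purpose) -> str:
        """Release the value only if this *use* is permitted.

        Note that no identity check appears here: the gate is the
        purpose of the access, not the accessor.
        """
        if purpose not in record.allowed_purposes:
            raise PermissionError(f"use for {purpose.name} not permitted")
        return record.value

    record = LabeledRecord("555-0100", {Purpose.BILLING, Purpose.FRAUD_DETECTION})
    print(use(record, Purpose.BILLING))    # allowed
    try:
        use(record, Purpose.ADVERTISING)   # denied: purpose not granted
    except PermissionError as e:
        print(e)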

Steven Bellovin and Susan Landau. Encryption by default equals national security. Lawfare, October 26, 2018. [ bib | http ]

Steven M. Bellovin, Matt Blaze, Dan Boneh, Susan Landau, and Ronald L. Rivest. Op-ed: Ray Ozzie's crypto proposal---a dose of technical reality. Ars Technica, May 7, 2018. [ bib | http ]

Steve Bellovin. Here's how to make sure Hawaii's missile warning fiasco isn't repeated. Ars Technica, January 21, 2018. [ bib | http ]

Jonathan Bair, Steven Bellovin, Andrew Manley, Blake Reid, and Adam Shostack. That was close! Reward reporting of cybersecurity “near misses”. Colorado Technology Law Journal, 16(2):327--364, 2018. [ bib | .pdf ]

Jonathan Bair et al., 16 Colo. Tech. L.J. 327 (2018)

Steven Bellovin. Replacing social security numbers is harder than you think. Vice Motherboard, October 5, 2017. [ bib | http ]

Steven M. Bellovin, Susan Landau, and Herbert S. Lin. Limiting the undesired impact of cyber weapons: Technical requirements and policy implications. Journal of Cybersecurity, 3(1), 2017. [ bib | http ]

Steven M. Bellovin. Columbia's riots and rebellions in the 1970s. Columbia Spectator, October 13, 2016. [ bib | http ]

Steven M. Bellovin and Adam Shostack. Input to the Commission on Enhancing National Cybersecurity, September 2016. [ bib | .pdf ]

Steven M. Bellovin. Comments on “Protecting the privacy of customers of broadband and other telecommunications services”, July 2016. [ bib | .pdf ]

Steven M. Bellovin, Matt Blaze, and Susan Landau. Insecure surveillance: Technical issues with remote computer searches. IEEE Computer, 49(3):14--24, March 2016. An earlier version is available at https://www.cs.columbia.edu/~smb/papers/rsearch.pdf. [ bib | http ]

Steven M. Bellovin, Matt Blaze, Susan Landau, and Stephanie Pell. It's too complicated: How the Internet upends Katz, Smith, and electronic surveillance law. Harvard Journal of Law and Technology, 30(1):1--101, Fall 2016. [ bib | .pdf ]

For more than forty years, electronic surveillance law in the United States developed under constitutional and statutory regimes that, given the technology of the day, distinguished content from metadata with ease and certainty. The stability of these legal regimes and the distinctions they facilitated was enabled by the relative stability of these types of data in the traditional telephone network and their obviousness to users. But what happens to these legal frameworks when they confront the Internet? The Internet's complex architecture creates a communication environment where any given individual unit of data may change its status---from content to non-content or vice versa---as it progresses through the Internet's layered network stack while traveling from sender to recipient. The unstable, transient status of data traversing the Internet is compounded by the fact that the content or non-content status of any individual unit of data may also depend upon where in the network that unit resides when the question is asked. In this IP-based communications environment, the once-stable legal distinction between content and non-content has steadily eroded to the point of collapse, destroying in its wake any meaningful application of the third party doctrine. Simply put, the world of Katz and Smith and the corresponding statutes that codify the content/non-content distinction and the third party doctrine are no longer capable of accounting for and regulating law enforcement access to data in an IP-mediated communications environment. Building on a deep technical analysis of the Internet architecture, we define new terms (communicative content, architectural content, and architectural metadata) that better reflect the structure of the Internet, and use them to explain why and how we now find ourselves bereft of the once reliable support these foundational legal structures provided. Ultimately, we demonstrate the urgent need for development of new rules and principles capable of regulating law enforcement access to IP-based communications data.
Steven M. Bellovin et al., 30 Harv. J.L. & Tech. 1 (2016)
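
A toy model may help make the layer-dependence concrete. The sketch below is a simplification (two layers instead of the full stack, with invented field names; the paper's actual vocabulary is communicative content, architectural content, and architectural metadata): the same email bytes are classified as metadata or content depending on which layer is asked.

    # Toy two-layer model: the same data is "content" at one layer and
    # "metadata" at another.  Illustrative only.

    email = {
        "headers": {"From": "alice@example.com", "To": "bob@example.org"},
        "body": "Lunch at noon?",
    }

    packet = {
        "ip_header": {"src": "198.51.100.7", "dst": "203.0.113.9"},
        "payload": email,  # the whole email rides as opaque payload
    }

    def classify(layer: str) -> dict:
        """Return what counts as metadata vs. content at a given layer."""
        if layer == "network":
            # To a router, only the IP header is metadata; everything in
            # the payload, including the email's To/From lines, is
            # content it does not interpret.
            return {"metadata": packet["ip_header"], "content": packet["payload"]}
        if layer == "application":
            # To the mail system, the email headers are addressing
            # metadata, and only the body is the communicated content.
            return {"metadata": email["headers"], "content": email["body"]}
        raise ValueError(f"unknown layer: {layer}")

    for layer in ("network", "application"):
        view = classify(layer)
        print(layer, "->", view["metadata"])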

Steven M. Bellovin. The danger of “exceptional access”. CNN.com, November 18, 2015. [ bib | .html ]

Harold Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Matthew Green, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Michael A. Specter, and Daniel J. Weitzner. Keys under doormats: Mandating insecurity by requiring government access to all data and communications. Journal of Cybersecurity, 1(1), September 2015. [ bib | DOI | http ]

Twenty years ago, law enforcement organizations lobbied to require data and communication services to engineer their products to guarantee law enforcement access to all data. After lengthy debate and vigorous predictions of enforcement channels “going dark,” these attempts to regulate security technologies on the emerging Internet were abandoned. In the intervening years, innovation on the Internet flourished, and law enforcement agencies found new and more effective means of accessing vastly larger quantities of data. Today, there are again calls for regulation to mandate the provision of exceptional access mechanisms. In this article, a group of computer scientists and security experts, many of whom participated in a 1997 study of these same topics, has convened to explore the likely effects of imposing extraordinary access mandates. We have found that the damage that could be caused by law enforcement exceptional access requirements would be even greater today than it would have been 20 years ago. In the wake of the growing economic and social cost of the fundamental insecurity of today's Internet environment, any proposals that alter the security dynamics online should be approached with caution. Exceptional access would force Internet system developers to reverse “forward secrecy” design practices that seek to minimize the impact on user privacy when systems are breached. The complexity of today's Internet environment, with millions of apps and globally connected services, means that new law enforcement requirements are likely to introduce unanticipated, hard to detect security flaws. Beyond these and other technical vulnerabilities, the prospect of globally deployed exceptional access systems raises difficult problems about how such an environment would be governed and how to ensure that such systems would respect human rights and the rule of law.
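
The forward secrecy practice the abstract refers to can be illustrated with a toy Diffie-Hellman exchange. The parameters below are deliberately tiny and cryptographically worthless; real systems use standardized groups (e.g., RFC 3526) or elliptic curves. The point is structural: each session derives its secret from ephemeral values that are then destroyed, which is precisely what a key-escrow or exceptional-access mandate would have to undo.

    import secrets

    # Toy finite-field Diffie-Hellman.  P is a real (but far too small)
    # prime; real deployments use RFC 3526 groups or elliptic curves.
    P = 0xFFFFFFFB   # 2**32 - 5, prime, toy-sized
    G = 5

    def ephemeral_keypair():
        # A fresh keypair per session is what makes the session
        # forward secret: the private half is discarded afterward.
        priv = secrets.randbelow(P - 2) + 1
        return priv, pow(G, priv, P)

    # Each session runs a fresh exchange...
    for session in range(3):
        a_priv, a_pub = ephemeral_keypair()
        b_priv, b_pub = ephemeral_keypair()
        shared_a = pow(b_pub, a_priv, P)
        shared_b = pow(a_pub, b_priv, P)
        assert shared_a == shared_b
        print(f"session {session}: shared secret {shared_a:#010x}")
        # ...and then deletes its private values.  Later compromise of a
        # long-term identity key reveals nothing about these secrets; an
        # exceptional-access mandate would have to retain them instead.
        del a_priv, b_priv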

Steven M. Bellovin, Matt Blaze, and Susan Landau. Comments on proposed remote search rules, October 2014. [ bib | .pdf ]

Steven M. Bellovin, Renée M. Hutchins, Tony Jebara, and Sebastian Zimmeck. When enough is enough: Location tracking, mosaic theory, and machine learning. NYU Journal of Law and Liberty, 8(2):555--628, 2014. [ bib | http ]

Steven M. Bellovin et al., 8 NYU J.L. & Liberty 555 (2014)

Steven M. Bellovin, Matt Blaze, Sandy Clark, and Susan Landau. Lawful hacking: Using existing vulnerabilities for wiretapping on the Internet. Northwestern Journal of Technology and Intellectual Property, 12(1):1--64, 2014. [ bib | http ]

For years, legal wiretapping was straightforward: the officer doing the intercept connected a tape recorder or the like to a single pair of wires. By the 1990s, though, the changing structure of telecommunications---there was no longer just “Ma Bell” to talk to---and new technologies such as ISDN and cellular telephony made executing a wiretap more complicated for law enforcement. Simple technologies would no longer suffice. In response, Congress passed the Communications Assistance for Law Enforcement Act (CALEA), which mandated a standardized lawful intercept interface on all local phone switches. Technology has continued to progress, and in the face of new forms of communication---Skype, voice chat during multiplayer online games, many forms of instant messaging, etc.---law enforcement is again experiencing problems. The FBI has called this “Going Dark”: their loss of access to suspects' communication. According to news reports, they want changes to the wiretap laws to require a CALEA-like interface in Internet software. CALEA, though, has its own issues: it is complex software specifically intended to create a security hole---eavesdropping capability---in the already-complex environment of a phone switch. It has unfortunately made wiretapping easier for everyone, not just law enforcement. Congress failed to heed experts' warnings of the danger posed by this mandated vulnerability, but time has proven the experts right. The so-called “Athens Affair”, where someone used the built-in lawful intercept mechanism to listen to the cell phone calls of high Greek officials, including the Prime Minister, is but one example. In an earlier work, we showed why extending CALEA to the Internet would create very serious problems, including the security problems it has visited on the phone system. In this paper, we explore the viability and implications of an alternative method for addressing law enforcement's need to access communications: legalized hacking of target devices through existing vulnerabilities in end-user software and platforms. The FBI already uses this approach on a small scale; we expect that its use will increase, especially as centralized wiretapping capabilities become less viable. Relying on vulnerabilities and hacking poses a large set of legal and policy questions, some practical and some normative. Among these are:

* Will it create disincentives to patching?
* Will there be a negative effect on innovation? (Lessons from the so-called “Crypto Wars” of the 1990s, and, in particular, the debate over export controls on cryptography, are instructive here.)
* Will law enforcement's participation in vulnerability purchasing skew the market?
* Do local and even state law enforcement agencies have the technical sophistication to develop and use exploits? If not, how should this be handled? A larger FBI role?
* Should law enforcement even be participating in a market where many of the sellers and other buyers are themselves criminals?
* What happens if these tools are captured and repurposed by miscreants?
* Should we sanction otherwise-illegal network activity to aid law enforcement?
* Is the probability of success from such an approach too low for it to be useful?

As we will show, though, these issues are indeed challenging. We regard them, on balance, as preferable to adding more complexity and insecurity to online systems.
Steven M. Bellovin et al., 12 Nw. J. Tech. & Intell. Prop. 1 (2014)

Steven M. Bellovin. Why healthcare.gov has so many problems. CNN.com, October 15, 2013. [ bib | http ]

Steven M. Bellovin. Submission to the Privacy and Civil Liberties Oversight Board: Technical issues raised by the Section 215 and Section 702 surveillance programs, July 2013. [ bib | .pdf ]

Steven M. Bellovin, Matt Blaze, Sandy Clark, and Susan Landau. Going bright: Wiretapping without weakening communications infrastructure. IEEE Security & Privacy, 11(1):62--72, January--February 2013. [ bib | DOI | .pdf ]

Mobile IP-based communications and changes in technologies, including wider use of peer-to-peer communication methods and increased deployment of encryption, have made wiretapping more difficult for law enforcement, which has been seeking to extend wiretap design requirements for digital voice networks to IP network infrastructure and applications. Such an extension to emerging Internet-based services would create considerable security risks as well as cause serious harm to innovation. In this article, the authors show that the exploitation of naturally occurring weaknesses in the software platforms being used by law enforcement's targets is a solution to the law enforcement problem. The authors analyze the efficacy of this approach, concluding that such law enforcement use of passive interception and targeted vulnerability exploitation tools creates fewer security risks for non-targets and critical infrastructure than do design mandates for wiretap interfaces.

Steven M. Bellovin, Scott O. Bradner, Whitfield Diffie, Susan Landau, and Jennifer Rexford. Can it really work? Problems with extending EINSTEIN 3 to critical infrastructure. Harvard National Security Journal, 3:1--38, 2012. [ bib | .pdf ]

In 2004 the increasing number of attacks on U.S. federal civilian agency computer systems caused the government to begin an active effort to protect federal civilian agencies against cyber intrusions. This classified program, EINSTEIN, sought to do real-time, or near real-time, automatic collection, correlation, and analysis of computer intrusion information as a first step in protecting federal civilian agency computer systems. EINSTEIN grew into a series of programs, EINSTEIN, EINSTEIN 2, and EINSTEIN 3, all based on intrusion-detection and intrusion-prevention systems (IDS and IPS). Then there was public discussion of extending the EINSTEIN system to privately held critical infrastructure.

Extending an EINSTEIN-like program to the private sector raises serious technical and managerial issues. Scale matters, as do the different missions of the private sector and the public one. Expanding EINSTEIN-type technology to critical infrastructure is complicated by the complex legal and regulatory landscapes for such systems. There are simply fundamental differences between communication networks supporting the U.S. federal government and those supporting private-sector critical infrastructures, differences that create serious difficulties in attempting to extend EINSTEIN-type technologies beyond the federal sector. This paper examines the technology's limitations, pointing out the problems involved in expanding EINSTEIN beyond its original mandate.

Steven M. Bellovin et al., 3 Harv. Nat'l Sec. J. 1 (2011)
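
For readers unfamiliar with the underlying technology, a minimal sketch of signature-based intrusion detection and prevention follows. The signatures and traffic are invented for illustration; production systems such as Snort or Suricata add protocol parsing, connection state, and line-rate matching, which is where the paper's scale concerns bite.

    import re

    # Minimal signature matching of the kind IDS/IPS systems are built
    # on.  The signatures below are invented for illustration.
    SIGNATURES = {
        "sql-injection": re.compile(r"(?i)union\s+select"),
        "path-traversal": re.compile(r"\.\./\.\./"),
        "test-beacon": re.compile(r"BEACON-[0-9a-f]{8}"),
    }

    def inspect(flow: str) -> list:
        """Return the names of all signatures matching one flow."""
        return [name for name, pat in SIGNATURES.items() if pat.search(flow)]

    def ids(flows):
        """Detection only: log alerts, let traffic pass (IDS)."""
        for flow in flows:
            for hit in inspect(flow):
                print(f"ALERT {hit}: {flow[:40]!r}")

    def ips(flows):
        """Prevention: drop flows that match a signature (IPS)."""
        return [f for f in flows if not inspect(f)]

    traffic = [
        "GET /index.html HTTP/1.1",
        "GET /q?id=1 UNION SELECT password FROM users",
        "GET /../../etc/passwd",
    ]
    ids(traffic)
    print(ips(traffic))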

Maritza L. Johnson, Steven M. Bellovin, and Angelos D. Keromytis. Computer security research with human subjects: Risks, benefits and informed consent. In Financial Cryptography and Data Security, Lecture Notes in Computer Science. Springer Berlin / Heidelberg, 2011. [ bib | .pdf ]

Computer security research frequently entails studying real computer systems and their users; studying deployed systems is critical to understanding real-world problems, as is having would-be users test a proposed solution. In this paper we focus on three key concepts in regard to ethics: risks, benefits, and informed consent. Many researchers are required by law to obtain the approval of an ethics committee for research with human subjects, a process which includes addressing the three concepts focused on in this paper. Computer security researchers who conduct human subjects research should be concerned with these aspects of their methodology regardless of whether they are required to by law; it is our ethical responsibility as professionals in this field. We augment previous discourse on the ethics of computer security research by sparking the discussion of how the nature of security research may complicate determining how to treat human subjects ethically. We conclude by suggesting ways the community can move forward.

Steven M. Bellovin, Scott O. Bradner, Whitfield Diffie, Susan Landau, and Jennifer Rexford. As simple as possible---but not more so. Communications of the ACM, 2011. Note: this is a shorter version of “Can it really work?”. [ bib | .pdf ]

Steven M. Bellovin, Matt Blaze, Whitfield Diffie, Susan Landau, Peter G. Neumann, and Jennifer Rexford. Risking communications security: Potential hazards of the “Protect America Act”. IEEE Security & Privacy, 6(1):24--33, January--February 2008. [ bib | .pdf ]

Steven M. Bellovin, Matt Blaze, Whitfield Diffie, Susan Landau, Peter G. Neumann, and Jennifer Rexford. Internal surveillance, external risks. Communications of the ACM, 50(12), December 2007. [ bib ]

Paula Hawthorn, Barbara Simons, Chris Clifton, David Wagner, Steven M. Bellovin, Rebecca Wright, Arnold Rosenthal, Ralph Poore, Lillie Coney, Robert Gellman, and Harry Hochheiser. Statewide databases of registered voters: Study of accuracy, privacy, usability, security, and reliability issues, February 2006. Report commissioned by the U.S. Public Policy Committee of the Association for Computing Machinery. [ bib | .pdf ]

Steven M. Bellovin, Matt Blaze, Ernest Brickell, Clinton Brooks, Vint Cerf, Whitfield Diffie, Susan Landau, Jon Peterson, and John Treichler. Security implications of applying the Communications Assistance to Law Enforcement Act to Voice over IP, 2006. [ bib | .pdf ]

Steven M. Bellovin, Matt Blaze, and Susan Landau. The real national-security needs for VoIP. Communications of the ACM, 48(11), November 2005. “Inside RISKS” column. [ bib | .pdf ]

Steven M. Bellovin. Cybersecurity research needs, July 2003. Testimony before the House Select Committee on Homeland Security, Subcommittee on Cybersecurity, Science, Research, & Development, hearing on “Cybersecurity---Getting it Right”. Transcript at https://archive.org/details/gov.gpo.fdsys.CHRG-108hhrg98150. [ bib | .ps | .pdf ]

Steven M. Bellovin, Matt Blaze, David Farber, Peter Neumann, and Gene Spafford. Comments on the Carnivore system technical review draft, December 2000. [ bib | .html ]

Matt Blaze and Steven M. Bellovin. Tapping on my network door. Communications of the ACM, 43(10), October 2000. [ bib | .html ]

Matt Blaze and Steven M. Bellovin. Open Internet wiretapping, July 2000. Written testimony for a hearing on “Fourth Amendment Issues Raised by the FBI's `Carnivore' Program” by the Subcommittee on the Constitution, House Judiciary Committee. [ bib | .html ]

Steven M. Bellovin. Wiretapping the Net. The Bridge, 20(2):21--26, Summer 2000. [ bib | .ps | .pdf ]

Fred Schneider, Steven M. Bellovin, and Alan Inouye. Critical infrastructures you can trust: Where telecommunications fits. In Telecommunications Policy Research Conference, October 1998. [ bib | .ps | .pdf ]

Hal Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, and Bruce Schneier. The risks of key recovery, key escrow, and trusted third-party encryption, May 1997. A report by an ad hoc group of cryptographers and computer scientists. [ bib | .pdf ]

Yakov Rekhter, Paul Resnick, and Steven M. Bellovin. Financial incentives for route aggregation and efficient address utilization in the Internet. In Proceedings of Telecommunications Policy Research Conference, 1997. [ bib | .html ]