Bulletin of the Atomic Scientists, August/September 1983, pp. 17-18

Attempts to limit scientific communication may hinder the very
dynamism our society seeks to promote.

by Stephen H. Unger

THE CAMPAIGN by government officials over the past half decade to
clamp down on the free flow of scientific and technical information
threatens fundamental principles of openness inherent both in the
scientific process and in U.S. political traditions. A novel aspect of
this campaign is the claim that the government should be able to
impose secrecy on work done by private individuals or organizations
not covered by contracts with secrecy provisions, and in some cases
not involved at all with government contracts or grants. The issue is
not the balancing of personal rights and privileges of individual
engineers and scientists against the security of the nation. On the
contrary, impeding that open communication among U.S. scientists and
engineers which has helped bring this country to the forefront in
science and engineering would significantly undermine the
technological basis of the United States' national security.

In the political realm, the principles of free speech, freedom of the
press and free assembly are justified mainly because they facilitate
the efficient and just operation of society. Both in this area and in
the scientific process, truth is most closely approached when ideas
and data are exposed to an open process of criticism and
enhancement. This stimulates the generation of new ideas and exposes
errors, distortions and omissions. It also reduces the useless
duplication of work.

But isn't the security of the nation impaired when the results of
U.S. research and development efforts are made available for
exploitation by potential enemies? Why, for example, should we publish
information useful to the Soviet Union in manufacturing integrated
circuits incorporated in sophisticated weapons? Shouldn't we try at
least to delay publication of such knowledge for a few years and make
them expend the effort necessary to develop the information for
themselves? Facile answers to these questions can be misleading. An
outline of a deeper analysis yields a very different conclusion.

-There is no practical way to restrict the outflow of scientific and
 engineering knowledge across our borders without significantly
 reducing its availability within our borders.

-Restricting communications among members of a community is likely to
 impair their effectiveness to a far greater degree than would
 restricting their access to information generated elsewhere. This
 effect is enhanced by the fact that the technical publications of a
 country that is ahead in a given field are likely to deal principally
 with matters that will not be of concern to less advanced countries.

-It is argued that the restrictions on publication would not apply to
 basic ideas, but only to specific techniques that would facilitate
 actual production of devices and systems. But manufacturing know-how
 is not transferred effectively on pieces of paper. On-site training
 and the transfer of hardware are necessary for this purpose.

In the context of current weaponry, incremental technological
advances by the Soviet Union would not make any appreciable
difference in the balance of power. But, in any event, a bottom-line
argument is that the relatively open U.S. system has generated a lead
of five to 10 years over the closed Soviet system in the fields of
electronics and computers. Japan and some West European countries,
who also follow relatively open publication policies, are also well
ahead of the Soviet Union in these fields. In fact, a more
restrictive U.S. policy would be largely nullified if a similar
policy were not also adopted by these nations.

It does not follow from the above arguments that nothing at all
should be kept secret. Such matters as the details of military plans,
cipher keys (but not the principles on which the ciphers are based),
details of weapon designs and characteristics of systems related to
electronic counter-measures ought to be safeguarded. This would not
interfere significantly with engineering and scientific progress or
with the discussion of issues of public interest.

One might at first be inclined to broaden this list to include more
general characteristics of existing or proposed military systems. But
this would make it impossible to have meaningful public debate over
the wisdom of proceeding with the development and/or deployment of
such systems as the MX missile, ABMs or cruise missiles. Existing
secrecy now hampers discussion of matters such as the efficacy of
verifying compliance with arms control measures via satellite
observation or seismography. Certainly a vigorous public debate on
the nuclear powered airplane project might have saved taxpayers a
great deal of money. More discussion of the MIRVing of missiles when
this idea was first proposed might have led to a more stable military
situation. (And the concept of a missile with a single warhead would
not now be the latest idea of strategic thinkers.) The events of
recent decades make it clear that those charged with responsibility
for national defense are by no means exempt from the human frailties
that make us unwilling to allow those in other branches of government
to conceal their operations from the public eye. Personal ambition,
interservice rivalries, fanaticism, ignorance, corruption and
stupidity have all shown a tendency to flourish behind screens
labeled "national security."

Unless there are clear and narrowly drawn rules as to what may be
kept secret, with strong oversight mechanisms, we can be sure that
there will be major abuses. As far back as 1970, a Defense Department
task force (chaired by Frederick Seitz and including Edward Teller)
concluded that perhaps 90 percent of all classified scientific and
technical information should be declassified. Their recommendations
were not followed, and the report itself was classified.

During a period when an attempt is being made to persuade us that
more secrecy is necessary, one might expect that particular care
would be exercised as to the choice of cases. But even in this phase,
we see one example after another of absurd behavior. Going back to
1977, there was the issuance of a secrecy order against three
engineers planning to market a low-cost voice scrambler they had
invented. During 1981, the State Department, which had set up an
exchange program with the specific purpose of helping China in high
technology areas, attempted to pressure several universities that had
accepted students under this program into keeping them away from
recent work in the computer area. This past November, the Air Force
moved to block presentation at the Institute of Electrical and
Electronics Engineers International Test Conference of three papers
concerned with general techniques for making integrated digital
circuits more testable. Although the work was done under an Air Force
contract, it was not classified, was not particularly relevant to
military systems and was of a nature quite similar to other work
presented at the same conference or published elsewhere. After a
minor furor, permission to present the papers was finally granted.

Such absurdities should not be regarded as perturbations in an
otherwise reasonable system. They are typical examples of the way
that censors behave. It is always safer for a censor to object to
publication when there is any doubt. Since the criteria for rejection
and the material being screened may both be difficult to comprehend,
doubtful cases are likely to be common. When it may take an expert a
full day to understand a typical paper, and where referees often
disagree as to the merits of material they are reviewing for
publication, what sort of staff can be set up to screen the many
thousands of papers submitted monthly to U.S. journals and
conferences? Who would take on such a job? A related point is that
once it became known that work in a particular field was regarded as
being particularly relevant to national security and thus more likely
to be censored, researchers would naturally tend to avoid that field.

This discussion has focused on the problem of increased governmental
constraints on the flow of scientific and engineering information. No
implication is intended that there are not other serious barriers to
this flow, such as commercial secrecy, the desire to beat out rival
researchers, laziness and inadequate communications skills. The
existence of some barriers in no way, however, justifies the
imposition or tolerance of others.

Stephen H. Unger, professor of computer science at Columbia University
in New York City, is vice-president of the Institute of Electrical and
Electronics Engineers Society on Social Implications of Technology and
author of Controlling Technology: Ethics and the Responsible Engineer.

The IEEE Society on Social Implications of Technology adopted the
following resolution in February 1983:

Whereas openness with respect to scientific and technical knowledge

  is in consonance with basic American principles,

  is part of the general tradition of openness that is essential to
  progress in science and technology,

  and has been a positive factor in establishing and maintaining the
  technological excellence that is an important pillar of our national
  security, however defined.

Be it resolved that the IEEE Society on Social Implications of
Technology opposes governmental efforts to impose restrictions on the
free flow of scientific and technical information other than with
respect to specific classified details directly concerning weaponry,
military plans and the like.