Obviously there is a need for more basic research and development (R&D) in almost every aspect of our lives. So why are we getting less? Symbolic of this decline is the death by a thousand cuts being suffered by Bell Laboratories, once the leading research facility in the private sector. While many corporations have R&D labs, not surprisingly, the objective of virtually all privately funded R&D is to produce products profitable in the near term. Some benefits to people other than owners and top-level corporation managers do often occur, but only as side effects. There are NSF (National Science Foundation) grants and other funding for truly basic research, but this supports only a small fraction of scientific and engineering work. How serious is this problem, and what can be done about it?
There is significant R&D in the area of alternative energy sources (though not, in my opinion, nearly enough), because this can plausibly lead to substantial profit for big companies. But new ideas for reducing energy waste or increasing the efficiency of various systems and devices are less closely linked to the profit margins of large corporations. So we don't see a lot of basic work on novel heat-insulation methods, on increasing the efficiency of small motors, or on improving hot water heaters. Such ideas as heat pumps or zoned heating systems for homes have the potential for large reductions in energy consumption, but have not been refined to the extent required for widespread use. Most people don't think much about energy usage in connection with their computers (apart from the drain on batteries), so there probably wouldn't be a lot of profit in developing a printer, for example, that used a fraction of the energy consumed by current models.
In the area of agriculture, most research is done to benefit production on large factory farms, rather than to find better ways for small farmers (or home gardeners) to operate, or to develop products that better serve the health of consumers.
The general subject of public health and safety hazards is grossly neglected. During the past half century, tens of thousands of new chemicals have been introduced into the human environment, appearing in consumer products, in the workplace, in the air we breathe, the food we eat, and in our water supply. Most have not been tested at all for toxicity, and very few of them have been subjected to more than superficial testing. Almost nothing has been done to explore the effects of long-term exposure, or the synergistic effects of simultaneous exposure to a multiplicity of chemicals. There is an urgent need to develop effective methods for carrying out such tests, and there is work being done on this problem—but not nearly enough. Here again, nobody is likely to get rich by investing in such research.
Similarly, possible hazards associated with electromagnetic radiation in various parts of the spectrum are not being investigated objectively. In fact, if it turned out that the use of cell phones, for example, was dangerous, there might be serious financial consequences for many companies. So work that might produce such knowledge is actively discouraged.
Consider medical R&D, clearly important to the well-being of all of us. To a large extent, it is controlled by large pharmaceutical companies. But these companies have little interest in finding cures for diseases. They seek products that can ameliorate symptoms of chronic ailments, preferably medications that need to be taken daily for life. (All the better if they are addictive.) 
Real scientific research on nutrition, involving large-scale controlled experimental studies (other than work funded by companies pushing particular products), is minimal, considering its great potential for bettering the health of so many people. Apart from the efforts of an all-too-small number of people working to discover what is best to eat, dissemination of what is known on the subject is also minimal, beyond the commercial promotion of profitable products.
It is widely accepted that we need to drink, on average, roughly two quarts of water daily. But there seems to be little scientific evidence to justify this belief. The authors of the cited paper examined the literature and found many interesting bits of evidence showing the effect of water intake on various important medical conditions. At one point they comment, "Only large and expensive randomized trials could settle these questions definitively. Given that water cannot be patented, such trials seem unlikely." The concluding sentences in the paper are,
In fact, there is simply a lack of evidence in general. Given the central role of water not only in our bodies but also in our profession, it seems a deficit worthy of repletion.
Let's zero in more closely on another neglected topic: research on the health effects of gargling.
Positive results were obtained, but the studies were small-scale; the principal one involved 387 subjects in a controlled experiment. It is obvious that a lot more experimental work is needed to verify the results. There are numerous variables that should be explored. One is the issue of plain water versus solutions of various substances such as salt, iodine, or tea, to mention a few in common use. Is the temperature of the solution important? What are the effects of gargling frequency? What ailments are affected? (For example, one small Japanese study indicated that gargling significantly reduces influenza incidence among the elderly.) If gargling is effective, as suggested by the Japanese work, what exactly is the underlying mechanism?
Preliminary results suggest that this simple, low-cost, and almost certainly safe procedure has a lot of potential for significantly reducing the incidence of very widespread ailments affecting all of us. We won't know for sure until more extensive, carefully designed studies are made. Unfortunately, gargling is not something likely to lead to big payoffs for any large corporation. Don't expect the Hackensack Water Company or Morton Salt to fund research on gargling. In general, the money-driven US research establishment is not interested in conducting serious studies on such topics.
Thus, when the FDA (Food and Drug Administration) sets up a committee of experts to advise it about, say, the efficacy of some potential new medication for which test results submitted by the manufacturer are questionable, all or most of the committee members usually have financial connections that may bias their views. Often these connections are not revealed, but even if they are, the work of the committee must be considered seriously tainted. Apart from the fact that FDA officials setting up such committees may themselves be biased by past or prospective industry connections, it is often difficult to find people with the required expertise who do not have financial ties that might influence their views. This is a consequence of the fact that the income of even university researchers is likely to come, at least in part, directly or indirectly, from industry sources.
There are companies that employ scientists to cast doubt on arguments and tests revealing defects in the products of their clients. Using techniques pioneered by tobacco corporations to obscure the hazards of tobacco, they are often hired by pharmaceutical companies to delay the removal of hazardous or ineffective products from the marketplace.
Nobody should be shocked to learn that corporations design their R&D budgets to enhance their profits. That is what they are supposed to do. It should also not surprise people that, while some of the means used to enhance corporate profits also benefit the general public, this is by no means always the case. Some profit-producing operations, on balance, are neutral with respect to the public good, and some may be harmful. There are also, as illustrated above, areas where R&D potentially beneficial to many people is not deemed by corporate managers to be likely to result in enough profit to warrant the necessary investment.
It makes sense, therefore, to consider basic R&D as a public function, to be carried out in government laboratories. The mission of such laboratories would be to explore science and technology deemed to be of potential value to the people of the country. This is by no means a novel idea, and, of course, such national laboratories have existed for many decades. The National Institute of Standards and Technology (NIST, formerly the National Bureau of Standards) has been an important factor in promoting technology in the US since its inception in 1901. Many federal agencies, including the DoD, NASA, the FDA, the EPA, and the NIH, have R&D facilities.
Unfortunately, many of these are grossly underfunded. One of the most important victims of such inadequacy is the FDA. Its overall budget is much too small for anything approaching proper routine inspections. The situation was summarized three years ago by former FDA Associate Commissioner William Hubbard:
Food inspections have dropped from a robust 50,000 in 1972 to about 5,000 today, meaning that U.S. food processors are inspected on average about every 10 years. The chance of a food product from overseas being inspected is infinitesimal. Most raw materials for our drugs come from foreign producers that are rarely inspected.
The FDA has essentially no budget for testing proposed new drugs or for monitoring the effects of drugs currently used by Americans. This has led to the ludicrous situation in which the testing of proposed new drugs is left under the control of the manufacturers. The FDA also lacks proper facilities and staffing for serious in-house research on drug evaluation methods and basic R&D. The situation is similar for other regulatory agencies such as the EPA.
What we need is a massive increase in funding for government R&D, both to beef up existing labs and to start new ones. Government funding for university research is helpful, but also too low. University research has therefore, for some time, grown increasingly dependent on corporate funding. Using federal money to subsidize industry R&D is counterproductive, since it only reinforces the emphasis on promoting private profit as opposed (often) to the public good.
As an example, if R&D organizations within the NIH were generously funded, the kinds of deficiencies pointed out above could be properly addressed. Topics for medical research could be chosen on the basis of potential benefits for people, rather than to maximize opportunities for corporate profit. Similar arguments could be made with respect to virtually every area of technology. The tunes played by the pipers of science are chosen by those who pay the bills. If we want the public to benefit, then the public should pay, and the natural mechanism for this is the government. It is becoming increasingly clear that allowing big corporations to dominate science and technology does not adequately serve the needs of real people.
There is an important secondary benefit of publicly financed science and technology research. If most R&D were carried out in government labs, or in government funded university labs, then large numbers of scientists and engineers, in virtually every field, would be employed in such labs. Relatively unbiased experts would be available to testify about the safety or efficacy of various products, or to help set standards for safety and efficacy, particularly in the medical field. This assumes that strong civil service regulations would protect the independence of laboratory staff members.
Another benefit of publicly controlled R&D is that it could be carried out more openly. Results would be made available to all. (Along these lines, consideration should also be given to reforming our broken patent system.) Private companies could compete with one another to commercially exploit the knowledge developed in the public labs. Since US taxpayers would be paying the bills, provision might be made to give American companies some modest advantages, such as invitations to seminars at which preliminary results were discussed.
Comments can be sent to me at unger(at)cs(dot)columbia(dot)edu