If human illness were strictly a private matter, never impinging upon the health, freedom or affairs of others, there would be little ambiguity or dispute concerning the need for health regulations; indeed, there would be no need for any public health legislation since each person’s health, just as each person’s bank account or sex life, would then remain a strictly personal matter. But, medically speaking, no man is an island entire unto himself. And, to paraphrase Donne still further, any man’s illness diminishes us because we are involved in mankind.
The origins of the first regulations defining community response to individual health are necessarily obscured in early history. But if the Bible provides us with some dim vision of these early public health efforts, then the earliest regulations probably reflected an ill-defined appreciation that certain diseases, which moderns now call contagious, were capable of afflicting some and then disabling still others within the immediate community.
How best, then, to decontaminate the tribal space when confronting a potential threat to tribal health? Ritual killing, undertaken by certain primitive groups, must have been regarded, sooner or later, as excessive; hence, a compromise was reached, allowing the infected person to live – but to live ostracized beyond the tribal area. Experience taught the nomads that, after a set interval, the person set apart either died of the disease or was no longer infective; and hence, the interval of segregation was limited to some observable time frame such as a lunar cycle. It was later extended to 40 days (hence quarantine, from the Italian quarantina, a period of 40 days).
Quarantining, the physical separation of the sick from the healthy, operated in two ways: first, by expelling a newly sick person from within the community; and second, by preventing a sick foreigner from entering the tribal confines. Sovereign nations learned that immigrants might bring in communicable disease, and therefore ports of entry became sites of medical vigilance, with maritime officials granted the authority to bar those with visible disease. The United States Immigration Service, beginning in the 19th century, had its own corps of physicians who examined all incoming aliens to detect such illnesses as pulmonary tuberculosis and an eye infection called trachoma. Indeed, trachoma continues to be a public health threat: It is highly communicable and, if left untreated, eventually results in blindness. An estimated 146 million cases of trachoma currently exist, mainly in Africa and southern Asia.
Diseases perpetuated by physical contact, particularly those associated with sexual intimacy, became the first to be internally regulated. Thus, the Book of Leviticus decrees that males with urethral discharges are to be ostracized until pronounced clean by the priest-physicians. In the primitive nomadic world of the wandering Israelites, cleanliness became an abiding watchword, and things were simply deemed either clean or unclean. Among the unclean were such states as menstruation, bodily discharges, skin rashes and death.
Conflict necessarily arose between the unwritten rights of privacy and the need of the greater community to know where threats to its public health might lurk. Sexually transmitted diseases, essentially diseases of intimate contact, were viewed as a serious threat to the community and, in many modern nations, physicians were required to report all cases of venereal disease to public authorities, who then interviewed those infected, required them to undergo therapy and sought out all of their sexual contacts. The public weal was considered of greater importance than the citizen’s right to sexual confidentiality.
By the 19th century, the scope of public health authority had expanded considerably. In the interests of the well-being of the greater community, public health officials planned sewage systems and garbage disposal plants, and even secured access to uncontaminated drinking water.
The earliest public health efforts were to separate the sick from the well; to bar new communicable illnesses from entering the community from without; and then to aggressively pursue those within the community afflicted with diseases transmitted by physical contact. The second public health phase represented efforts to prevent new disease by recognizing that unregulated water supplies regularly infected large numbers of people with diseases such as cholera, typhoid and dysentery. And when a person who consistently shed the germs of one of these water-borne infections was found, the authorities were granted police powers to arrest and indefinitely detain such carriers (as happened with Typhoid Mary).
The transition from quarantining to prevention and, finally, to preemption was inevitable; yet it took many centuries before the state, through its public health arm, asserted its right to prevent a person from acquiring an infectious disease, since he or she, if infected, might then represent a palpable threat to the community. Vaccination against the viral disease called smallpox, despite known complications, was declared a necessary intervention. After 1820, many nations not only paid for the vaccine but demanded that all children be vaccinated by a certain age. In some American states, for example, children are barred from the public school system unless they show documentary proof of a series of preventive vaccinations.
Obligatory vaccination programs have been a point of considerable debate, with some civil libertarians claiming, “Yes, vaccination is a worthy intervention, but let it be a voluntary act since the sanctity of one’s own body is not to be violated by intrusive state regulations. First it will be smallpox control and next thought control.” Clearly, there exists a delicate balance between civil liberty and community health.
Stanley M. Aronson, M.D., (email@example.com) is dean of medicine emeritus at Brown University.