At the beginning of each school year, children around the country will be asked to provide school supplies alongside evidence that they have received all recommended vaccinations against childhood vaccine-preventable diseases. Although these requirements have become increasingly contentious, they were, in fact, created only in the last 35-50 years in an effort to create greater access to care for the children most in need.
It is useful to consider the history of modern vaccines before the laws requiring them. At the time the inactivated polio vaccine was licensed in 1955, there were, for practical purposes, no legal requirements for vaccination. Those that had been passed at the turn of the twentieth century to require vaccination against smallpox had fallen out of favor and been repealed or were no longer enforced.
Polio was a disease with relatively low rates of morbidity and mortality, with about 90 percent of those afflicted having few or mild symptoms. It killed fewer children than measles, diphtheria, or pertussis. Yet, polio was feared because in about one percent of cases, those infected would become permanently crippled; it also spread easily. Known as the disease of infantile paralysis, polio showed no preference for race, class, sanitation, or neighborhood.
In 1938, President Franklin Roosevelt—who had been stricken with the disease in adulthood—and his former law partner Basil O’Connor founded the National Foundation for Infantile Paralysis (NFIP) to provide care for victims of polio and funding to search for a cure for or vaccine against the disease. Rather than drawing on family wealth as most foundations did, NFIP instituted the March of Dimes, through which people all over the country were encouraged to contribute small denominations of money that would collectively fund the national battle against polio. The campaign drew on parental fear, public sympathies, and aggressive marketing through film, radio, and print media, with the face of a president himself crippled by the disease out in front. The March of Dimes made the search for a polio vaccine a national priority, even though polio did not represent the largest health risk at the time.
The March of Dimes’ success demonstrated public faith in science, which had brought penicillin to soldiers in World War II. Financial investment in biomedical research rose quickly in the postwar years in both the public and private sectors, as such research was newly seen as a national asset that could better humanity.
Among the several teams competing, NFIP chose the vaccine Jonas Salk’s group was developing, which used a killed virus, and launched what would become the largest clinical trial in history. In 1954, NFIP began enrolling school children in a test of the polio vaccine. Within a year, 1.8 million children had enrolled in the study and more than 90 percent of the population knew about the trials. It is hard to imagine that kind of public trust for testing a new pharmaceutical product today. The payoff of the trial, however, was a safe and effective vaccine, and mass production began.
Demand for the new vaccine rapidly outpaced supply, particularly as summer polio season approached. It was unclear who would get the vaccine first and who should decide. NFIP had already bought vaccine from all six manufacturers to distribute to states for first- and second-grade children, the age group polio most often struck. After companies filled that order, no one was sure how much would be available or how it should be distributed. Should it all be sold to private physicians? Should city and county health agencies be guaranteed a certain portion? The vaccine was most effective when given as a three-shot series, but it could be rationed, giving fewer shots to each child in order to spread it more broadly. Despite the good will that led to the creation of the vaccine, suspicions that those with political connections or wealth would be given priority created distrust among consumers.
Parents and local authorities voiced concern about equitable distribution, and some urged the federal government to step in, in part to make sure the vaccine remained available to poor children. Yet others passionately opposed government involvement. The American Medical Association, which had aggressively campaigned against President Truman’s plan to create a national healthcare system, ferociously objected and in 1955 passed a resolution opposing any purchase or distribution of vaccine by the federal government except for those who were too poor to pay for it themselves (and thus too poor to pay private physicians). The American Drug Manufacturers Association lobbied against federal involvement, arguing that the vaccine belonged to the companies and that if the government took over, they would have no incentive to develop new products, which would harm the country. With a diffuse fear of communism as a backdrop, Congress authorized funds to give to states to buy vaccines, but refused to determine how the vaccine would be distributed, except to note that it should go to people under the age of 20 and to pregnant women. Doctors resented these age limits and insisted they should be allowed to buy as much vaccine as needed to meet the demands of paying patients. Parents, community groups, school officials, and others complained of unfair distribution and argued that no private physicians should get any vaccine until all children had the opportunity to receive it in school or community settings. These tensions—between for-profit markets and community priorities—plague vaccine policy today.
These issues remained unresolved as new vaccines, like those against measles (1963), mumps (1967), and rubella (1969), emerged in the following decade. These childhood diseases were not feared like polio, but they were nearly universally experienced and were very serious for some. Measles, for example, is a mild illness for most people, but can lead to more serious complications like pneumonia, ear infection, diarrhea, deafness, brain swelling, mental retardation, or, in rare instances (about one in 1,000), death; in pregnant women, infection can cause miscarriage or premature birth. Mumps is also relatively mild, with fever, headache, muscle ache, loss of appetite, and, most characteristically, swollen glands under the ears or jaw; in teens and adults it can cause testicular or breast swelling or infection and sterility. Rubella (German measles) is also a minor childhood illness, causing a rash, low fever, and cold-like symptoms that last a few days. Yet women exposed to rubella early in pregnancy face high rates of birth defects in their offspring, including blindness, deafness, heart damage, cataracts, internal organ damage, and mental retardation. Before vaccination, rubella was the leading cause of congenital deafness in the United States.
As questions about how to recommend, distribute, and fund vaccines persisted following polio, the federal government stepped in. Reflecting the Kennedy administration’s goal to address poverty, Congress passed the Vaccination Assistance Act of 1962 (an amendment to the Public Health Service Act), which created funding to vaccinate poor children. Although the law made no mention of mandates, Christian Scientists wanted assurance that vaccines would not become compulsory. CDC officials assured them that the bill did not contain language about compulsion and that the federal government had no power or authority to require vaccines, since this power resided with the states and localities. The American Academy of Pediatrics demanded and received assurance that the bill would not pull patients away from private practices. The CDC formed the Advisory Committee on Immunization Practices (ACIP) in 1964, which became responsible for coordinating recommendations for routine vaccination and advising private physicians. In so doing, the CDC as a federal agency became central to national vaccination policy.
More generally, healthcare programs were expanding at this time. In 1965, Congress passed Medicaid into law to provide healthcare to low-income individuals, and Medicare to provide healthcare to seniors and the disabled. As the country prioritized helping the poor through programs supported as part of the “War on Poverty,” children’s vaccines took precedence, reflecting the reality that poor children were less than half as likely to be vaccinated, and only a third as likely to see a private physician, as wealthier children. Unlike other health inequities, large numbers of unvaccinated children could spread infectious disease, which might pose risks to, and thus politically motivate, even those disinclined to care about low-income children’s health. As public health leaders saw vaccination as the best tool to battle disease, policy and resources to address disparities in access followed.
During the campaigns to promote the polio vaccine, the NFIP argued that vaccines, like charitable giving, should be voluntary. Bills in the early 1960s to require vaccination all failed. Yet, as the federal government newly prioritized immunization, states again considered compulsory vaccination laws. By 1968, half of the states had laws requiring evidence of vaccination for school attendance. There was evidence these laws worked. In one notable example, a measles outbreak infected children in Texarkana, a city that straddles Texas and Arkansas. Texas had no vaccine requirement for measles; Arkansas did. Children on the Texas side experienced measles at twelve times the rate of neighboring children in Arkansas. By 1974, 40 states required evidence of vaccination for school attendance, and by 1981, all states did.
Although it does not match current perceptions, compulsory vaccination laws were intended to increase public access to vaccines and, under the programs of the Great Society, were aimed at working toward social justice. The relatively rapid passage of these laws was not particularly controversial. Polls from this period suggest that many Americans—as many as 25 percent—did not know their state had a compulsory vaccination law; most said they planned to have their children vaccinated anyway.
These laws (in all but two states) contained exemptions for children whose parents held religious beliefs that would be violated by forced vaccination. The result of lobbying by Christian Scientists, these exemptions were in some states written broadly enough to include other faiths or personal beliefs as well. How these exemptions should be interpreted proved complicated. Did parents need to belong to an organized religion? Could they be expected to demonstrate the sincerity of their beliefs? Did risk to other children outweigh these rights? A series of lawsuits carved out the multiple meanings of religious exemption; in short, the answer to each question was no. Yet the relatively quiet expansion of school attendance laws, which made school personnel the unwilling enforcers of public health law, reflected a general consensus that vaccines were good and that enforcement would not be difficult. As an increasing number of parents today use exemptions for non-religious reasons, this consensus can no longer be taken for granted.
This fall, as we register our children for school, we do so with piles of paperwork that embody our relative trust in meanings of health, science, and community responsibility. In 2016, California repealed the state law that allowed parents to exercise exemptions for religious and personal beliefs, noting that they had become widely used in ways not intended. Other states search for ways to persuade parents that maintaining high levels of vaccination is good not only for their own children but also for other people’s children, hoping, as the NFIP did, that vaccination, like charitable giving, will be voluntary and reflect generosity toward those in our communities. Yet throughout, we should remember that these laws were always intended to protect those most vulnerable among us and to work toward social justice for all children.
Jennifer Reich is Associate Professor of Sociology at the University of Colorado, Denver. Her publications include the award-winning books Calling the Shots: Why Parents Reject Vaccines (NYU Press 2016) and Fixing Families: Parents, Power, and the Child Welfare System.
Feature image: Vacunas by Carlos Reusser Monsalvez. CC0 1.0 via flickr.