Educational Nonfiction


Nancy Leys Stepan
Rating: 7.4

“Stepan’s history of eradication efforts gives you a good sense of how involved the work can get, how many different kinds of approaches have been tried without success, and how much we’ve learned from our failures. She writes in a fairly academic style that may make it hard for non-experts to get to her valuable arguments, but it’s worth the effort. You come away from it with a clearer sense of how we can use the lessons of the past to guide future efforts to save lives.”
-Bill Gates, “The Best Books I Read in 2015”

Eradication has proven unsuccessful at improving overall health, despite its popularity among public health practitioners.

The emergence of bacteriology and parasitology helped shift public health's focus from sanitation to disease-causing agents. The discovery that bacteria, viruses and parasites caused specific diseases spurred the development of vaccines. The discovery of disease vectors reinforced the belief that eradication was possible: eliminating the vectors, contemporaneous thinking held, would end human suffering from these diseases.

Some public health practitioners saw eradication as the most promising avenue for improving overall health, while others pointed to the plethora of factors involved in disease. Chief among these is poverty, though geography, climate and social organization also play roles. Development projects that aim to improve quality of life often change landscapes, social structures and patterns of interaction in ways that unintentionally promote the spread of disease.

Despite the complex interaction of factors related to disease incidence, advances in microbiology helped maintain public health focus on eradicating the immediate biological causes of diseases.

Scholars found that social interventions, including improved sanitation, changes in political culture and increased education, generated the most significant impacts on public health. Some practitioners view eradication campaigns as a method for improving health without waiting for economic development. Their thinking holds that healthier populations will be more productive, which leads to economic growth and more rapid development.

“Despite the scale and resources deployed in eradication campaigns, they are weak determinants of public health because they are usually independent of basic health services.”

Problematically, this approach depends upon social leaders investing the benefits of economic growth in human health. Supporting this argument is the theory that equality – not collective wealth – improves societal health.

Eradication campaigns involve a top-down approach. Their specific, narrow focus may not work in all circumstances.

Yellow fever still exists, despite a long eradication campaign. The disease was never a strong candidate for eradication, since, as is now known, it persists in animal as well as insect hosts. Governments focused on yellow fever because it caused dramatic epidemics, often with high fatality rates, and disrupted economic growth.

The United States' military occupation of Cuba during the Spanish-American War reduced the impetus to investigate yellow fever more fully. The military, not a civilian political structure, controlled the US empire, and army doctors had responsibility for public health. Instead of research, occupying forces relied on insect vector control: eradicating mosquitoes. The campaign's success in 1901 led to a similar, controlled approach during construction of the Panama Canal.

“Because the eradication campaign in Cuba was run by the military, a much different system could be implemented than in ordinary civilian situations.”

The driving force behind the campaigns was that non-native white people, who lacked immunity to yellow fever, moved into areas where the disease was endemic. In the Panama Canal Zone, workers of color were not granted the same protections and suffered far worse public health outcomes. The relative success of both campaigns solidified confidence in vector-based eradication rather than sanitation-focused plans.

Eradication campaigns targeted individual diseases, not overall public health. Often this meant setting up structures outside any existing health services. Eradication campaigns required accurate mapping, record keeping and a top-down organization with strict regulations. These campaigns utilized chemicals extensively, often without regard for environmental or health impacts. The combination of these factors, and especially the success of the limited campaigns in Cuba and Panama, made eradication seem possible and desirable.

“Something that had been viewed as immensely difficult, the elimination of yellow fever, was now being presented as within reach and even morally imperative.”

Because eradication campaigns are highly focused, independent and hierarchical, they must be understood within the context of the tension between individual rights and the common good. Outside of military or other authoritarian settings, public compliance must be earned, not imposed.

Major factors in eradication campaign failures include incomplete understanding of the diseases in question and a lack of basic health services in the countries involved.

The theories that disease caused poverty and that removing disease would spur economic growth drove eradication in the early 20th century. By thus depoliticizing disease control, organizations such as the Rockefeller Foundation could pursue the work they thought important without negotiating the political structures of their host countries. This focus by the foundation and others on vector control obviated the need for extensive medical knowledge. Killing mosquitoes, the insect vector for both yellow fever and malaria, allowed campaigns to proceed without a complete understanding of disease transmission.

When these campaigns rolled out globally, they often proved at odds with local public health priorities. For example, yellow fever was not a priority in West Africa, because many locals had some immunity. But the Rockefeller Foundation made yellow fever a global concern and overrode local programs. Many nations came to see eradication as an international, almost imperial priority, conducted globally to protect US interests.

“Human activities had to be recognized as factors affecting the breeding of mosquitoes and thereby the incidence of malaria.”

Disease transmission, even of the same disease, does not occur the same way in every context, so a campaign that succeeds in one area may fail in another. But Fred L. Soper, eradication's pioneer and chief promoter, saw campaigns as administrative processes that bypassed the need to understand the local disease context.

National public health campaigns that rely on international assistance are subject to changes in sponsor priorities, which can leave local structures unable to continue eradication. These factors, along with eradication's limited success, caused the approach to fall out of favor by World War II.

Despite widespread understanding of the difficulties of eradication, post-World War II optimism and emerging technologies fueled new attempts.

After World War II, eradication made a comeback, partly due to the development of DDT as a relatively cheap, effective insecticide. The World Health Organization (WHO) launched in 1948 as a United Nations body with a mission to improve health through science. That desire to apply scientific approaches to public health situated WHO within Cold War competition, and disease eradication served as both a public health approach and a propaganda tool. Some viewed approaches that focused on general public health services as "socialized medicine," and Western governments wanted no part of them. Eradication, by contrast, offered an intervention that did not require countries to develop first, and Western governments promoted it to encourage development.

In the 1960s, global health organizations focused more on social and economic methods of improving public health. This sprang in part from the realization that a more holistic approach was necessary and that eradication campaigns were expensive, given their limited social benefit.

The success of Western Europe and the United States in reducing malaria to a few cases pushed WHO officials to launch the Malaria Eradication Programme (MEP) to rid the world of the disease. After great initial success, enthusiasm for the program waned, and malaria has returned to many areas where it had visibly declined. Malaria is again a target for new eradication programs, which the Gates Foundation and WHO notably pursue.

Before World War II, attempts to control malaria derived from the strategic importance of a given area; the Panama Canal Zone is a prime example. Some attempts depended on eliminating insect vectors, while others, notably the Italian school, focused on draining swampy areas to reduce breeding habitats and improve local economies.

In the United States, economic improvement and associated social changes proved the most important factors in the decline of malaria.

“Ecological knowledge was overlooked and complexities forgotten in the rush to apply DDT.”

Eradication was also popular because it allowed organizations such as WHO to work on aspects of public health they could control. These organizations could not control land distribution or other economic and social issues, but they could target diseases and vectors. Heavy reliance on DDT for mosquito eradication bred resistance in the insects, which became a problem early in the process; some mosquito populations showed resistance as early as 1942. Despite knowing this, health officials decided to press toward complete eradication before resistance became widespread, rather than seek alternative methods of control.

Officials realized that each geographic and social context was different, and they had to adjust to local environments. Another major factor in the failure to eradicate malaria was lack of political support. Countries struggling with other issues tended not to see the program as vital to overall health. When malaria programs transferred to national health services, they were no longer priorities.

Successful eradication of smallpox kept the strategy alive in the second half of the 20th century.

Human intervention has eradicated only smallpox. This took nearly 200 years from the development of the first vaccine. The most concerted eradication took place between 1967 and 1977. Vaccination was important to eradicating smallpox, but surveillance and containment measures were equally critical.

In the 1950s, 80% coverage rates were arbitrarily set as expectations for vaccination campaigns. These high vaccination requirements proved onerous for Latin American countries in which smallpox was not the greatest concern. Problems arose with vaccine production, quality and delivery. At the time, public health services had not integrated vaccination into their programs as they do now.

“The situation on the ground was messy, and messy in each country in its own way.”

Local knowledge emphasized complex variations in outbreaks, and improved understanding of smallpox's transmission enabled its eradication. This understanding allowed implementation of the surveillance-containment strategy: isolating victims and monitoring their local contacts. WHO formally announced it had defeated smallpox in 1980.

Improving public health requires situational knowledge of the political, economic and social environments in which disease-control campaigns operate.

The success with smallpox has kept eradication popular as a public health strategy. The privatization of health care and the decline in public health services in the mid-20th century helped keep eradication relevant.

Researchers hoped that examining past failures would teach public health practitioners how to succeed with future eradication campaigns. Eradication of Guinea worm disease, for which there is neither a vaccine nor a cure, has depended on shifting people's behavior, "often considered the hardest thing to change of all." Preventing Guinea worm disease has relied upon health education delivered through primary health services.

The return to a philanthropic model of supporting public health in the 21st century meant the return of technological approaches, despite knowledge that basic public health services are often more successful.

“The greatest challenge to the Gates Foundation and others like it is to realize that the social, political, economic and local factors cannot be separated from the biological and medical when considering the determinants of disease and its solutions.”

A better approach is a sustainable control project that focuses on a disease within an overall public health agenda.
