The Expert in American Life

Andrew J. Taylor

Fall 2021

The Covid-19 pandemic provides a unique opportunity to assess the current state of the expert in American public life. The crisis is vast and complex, and policymakers at all levels of government have looked to trained and experienced professionals for advice on how to stop the contagion from spreading. From their regular appearances on television, health-care experts like former Food and Drug Administration commissioner Scott Gottlieb, former White House coronavirus response coordinator Deborah Birx, and National Institute of Allergy and Infectious Diseases director Anthony Fauci became household names overnight. On social media and elsewhere, previously anonymous epidemiologists, immunologists, and public-health academics emerged from relative obscurity to become public heroes.

But the pandemic came at a fraught moment for the American expert. In the years leading up to the crisis, President Donald Trump routinely expressed disdain for experts, often to the thunderous applause of his supporters. He very publicly and frequently lambasted the political experts who believed he couldn't win the 2016 election, the economic experts who denigrated his tax policies, and the "deep state" experts who allegedly undermined his administration from within.

When Covid-19 began spreading across America in March 2020, Trump largely kept to his populist script. Almost immediately, he contradicted Fauci, the federal government's top infectious-disease official, on the effectiveness of the anti-malarial drug hydroxychloroquine in treating the virus. After the doctor expressed uncertainty about the drug at a press conference, Trump retorted, "I feel good about it. That's all it is, just a feeling, you know." Tensions mounted between the two over subsequent weeks, and the very public Trump-Fauci feud became something of a symbol of Americans' disagreements over the appropriate role of experts in guiding government policy. By July, Fauci had become a folk hero to millions of Americans, a villain to millions of others, and persona non grata to the White House.

Unsurprisingly, the dispute over the role of experts like Fauci falls largely along partisan lines. In a 2017 Buzzfeed News/PredictWise survey, 60% of Republicans agreed with the statement "I'd rather put my trust in the wisdom of ordinary people than the opinions of experts and intellectuals," compared to only 44% of Democrats. A January 2019 Pew poll reported that 73% of Democrats believed scientists "should take an active role in policy debates," while only 43% of Republicans said the same.

Democrats have repeatedly identified Trump and his brand of Know Nothing populism as the pernicious forces driving Republicans' skepticism of experts and expertise. They argue that the rise of anti-expert sentiment threatens the perceived legitimacy of America's increasingly expert-driven government, and that the key to restoring its legitimacy is to combat the bad-faith political actors who actively undermine it.

These Democrats aren't wrong to worry that mounting distrust of experts is a destabilizing force in our politics. But their account of its causes both overstates the influence of Trump and dismisses the truth that lies at the core of populist anti-expertise grievance: Experts do play too large a role in American government.

To be sure, experts' contributions to American government are indispensable. But our system of accountable, transparent, republican self-government requires that the authority of experts be more carefully limited and more clearly defined than it is today. Rather than simply bemoaning the "anti-science" posture of many voters, America's leaders should recognize that there is something legitimate underlying their complaint. If the growing challenge of anti-expert sentiment is to be met, the country's political leaders must restore experts to their appropriate place within America's system of government.

THE RISE OF THE AMERICAN EXPERT

Before considering the outsized power and authority of the modern expert class, it's necessary to clarify precisely who the experts are and understand how they came to play such an important role in American political affairs.

In his 1963 book, Anti-Intellectualism in American Life, Richard Hofstadter offered a definition of the "expert" by distinguishing him from the "intellectual." The intellectual, Hofstadter explained, has a deep love of ideas broadly defined and is naturally philosophical, while the expert tends to be more practical and more narrowly trained. Intellectuals have a calling to tackle the central questions of human life; they feel obligated to examine moral issues on our behalf. They are also inherently cosmopolitan, prone to calling for the universal application of their ideas. The expert's aspirations, by contrast, are more modest. Though he may display a missionary's zeal to remake the world, his contribution to collective knowledge and human flourishing is limited to his field. His approach is also largely empirical.

Unlike Europe, the United States was largely devoid of experts during the 18th and 19th centuries. The country was founded by polymaths like Benjamin Franklin and Thomas Jefferson, who tended to be what Isaiah Berlin later called "foxes" rather than "hedgehogs" — men with broad liberal educations and voracious intellectual appetites but without advanced training in any particular field. Though the founders were deeply knowledgeable about the views of the ancients, Italian Renaissance thinkers, English republicans, and French Enlightenment philosophers, as historians like Bernard Bailyn and J. G. A. Pocock have shown, they were not "political scientists" in the strict sense. They were guided by ideas and theory rather than empirical experimentation and study.

The principal political movements of the pre-industrial age were similarly devoid of scientists and experts. Early Democrats — who tended to subscribe to the views of men like Jefferson and Andrew Jackson — organized into a political party by mobilizing workers and yeomen farmers and embracing the cause of the "common man." This populist temper shaped the decentralized but nevertheless pervasive system of American public schools, where the education was broad but largely vocational. Rather famously, such schooling did not produce many philosophers or theoretical scientists; it was not until the 20th century that our public-education system could support research universities with world-class reputations.

The Democrats' opponents, the conservatives, originated in the South. They valued such things as religion, moral codes, tradition, and leisure. America's antebellum conservatives were deeply anti-capitalist, fearing industrialization's potential to destroy their genteel pastoral existence, to say nothing of their personal finances. The Civil War ended much of this way of life.

Those who took up the mantle of American conservatism during the Gilded Age were northern businessmen who knew much — not technical information, but human relations, economic behavior, and social organization. These "robber barons" made themselves — and in turn, the United States — very wealthy. But expertise was not the source of this wealth, and even at the end of their lives, men like Andrew Carnegie and John D. Rockefeller recognized the importance of intellect, as opposed to expertise, in American public life through their targeted philanthropy and investments in liberal higher education.

The American expert only began to emerge during the Second Industrial Revolution in the 1880s, as the United States took its position as a pre-eminent world economic power. Inventiveness and entrepreneurialism were hallmarks of the country's first industrial scientists — including Thomas Edison, Alexander Graham Bell, and George Westinghouse. But even these early experts, born at least a decade or two before the Civil War, retained much of the character of their intellectual predecessors — an interest in various fields, artistic talents and tastes, a hunger for theoretical knowledge, and a desire to generate information in service of their fellow citizens. Their educations were often unorthodox and interrupted. They were free to experiment, and were not tasked with following promulgated procedures.

It was not until large, research-driven industrial firms came to dominate the economy that expertise took the form we recognize today. Within massive corporations like U.S. Steel, American Telephone and Telegraph, General Electric, Standard Oil, and DuPont, individual managers and teams of workers began to specialize in knowledge much the way workers were committed to specific, repetitive tasks on the recently developed production line. By the 1910s, observers as diverse as Frederick Winslow Taylor and Max Weber had noted the rise of expertise. Professional associations were established in fields like medicine and engineering, furthering a trend that had begun a quarter-century earlier with the founding of the more explicitly academic and theoretical National Academy of Sciences.

It was also during this period that the American research university, somewhat lagging its European counterparts, exploded into public life. The Morrill Act of 1862 established land-grant universities with missions to research and teach in fields like agriculture, engineering, and military science. In 1887, the Hatch Act funded agricultural experiment stations that brought experts to farms across the nation. In 1873, Johns Hopkins left much of his fortune to establish a hospital devoted to science-based medicine; in 1910, the Flexner Report pushed American medical schools to follow suit.

During the first third of the 20th century, some of these experts forged the technocracy movement. Sympathetic to but distinct from most progressives of the era, technocrats were more interested in economic and sociological theories than in saving the poor or busting the trusts. Their intellectual father was Thorstein Veblen, who warned of the damage the "leisure class" was doing to the economy, castigated it for its "conspicuous consumption," and sought to bring about a balance between depletion and production within the overall system. The technocrats' mantra was efficiency, and they displayed boundless confidence in the capacity of their expertise to shape not just the natural world in its service, but social life as well.

Technocracy blurred the boundaries between the expert and the intellectual. The world Veblen described was one where scientists and engineers served the public interest. Veblen's technocratic offspring argued that they should be granted the power to use their knowledge for the collective good unencumbered by self-interested politicians and the volatile impulses of public opinion. This was a role similar to what Samuel Taylor Coleridge had in mind for experts when he described them as a modern "clerisy" nearly a century earlier, to the approval of noted utilitarian John Stuart Mill.

As committed progressives, presidents Theodore Roosevelt and Woodrow Wilson viewed the technocratic enterprise favorably, and the movement grew considerably in influence during the early years of American progressivism. It was the Great Depression, however, that provided technocrats with an opportunity to apply their theories fully, and they became more organized in response.

In Franklin Roosevelt, technocrats found a dependable ally — someone receptive to the need for complex planning and the human talent required to pull it off. It was not the cabinet or Congress that put together the New Deal, but FDR's Ivy League-educated "brain trust" of men like Adolf Berle, Harry Hopkins, Raymond Moley, and Rexford Tugwell. The political coalitions required to pass the Social Security Act may have been built by blue-collar lawmakers like "Farmer Bob" Doughton of North Carolina and Robert Wagner of New York, but the program itself was drafted by Frances Perkins and her staff of technocrats at the Committee on Economic Security. Demonstrating that experts were not a monolith, the brain trusters often disagreed with one another. But regardless of their differing opinions, they all agreed that they — the experts — should be the ones driving policy.

Technocracy died as a formal movement in the late 1930s, not least because social engineering was out of keeping with the collective effort to defeat totalitarianism. By then, however, many of its ideas had been absorbed into the American mainstream, where they have remained ever since. The template of policy-by-experts became a central assumption of government in the years after World War II, and universities poured resources into producing the next generation of economists, political scientists, sociologists, and engineers that the new mode of governance would require.

The primacy of experts inevitably attracted opposition, particularly from right-wing populists — heirs to a tradition stretching back to the Anti-Masons and the Know Nothings of the antebellum era. Trump is only the latest inheritor of their populist banner. Yet despite his political victories, experts remain a central force in American public life, as the Covid-19 pandemic demonstrated, often forging policy that both implements their ideas and perpetuates their influence. Trump's presidency did little to alter what is not an entirely healthy status quo.

GOVERNMENT BY EXPERTS

As defined by Jason Brennan, an "epistocracy" is a regime in which the knowledgeable rule by virtue of their technical know-how, where the votes of people who understand politics outweigh those of their compatriots. In the formal sense, then, America is not an epistocracy. Nor can it be described as what Joel Kotkin called a "neo-feudal" society, in which a small number of technocratic oligarchs use their expertise, supported by a progressive professional class, to enrich themselves. But at all levels of government — federal, state, and local — America's political establishment increasingly resembles the kind of government that the technocrats of the early 20th century worked to achieve.

Experts today enjoy a great deal of formal power granted to them by statutes, rules of procedure, and conventions, which privilege their positions in the policymaking process over those of laymen. Many government positions require approved professional preparation, significant training, or a related academic credential. This is notably the case in the federal executive branch. Although the courts have prohibited Congress from establishing qualifications so specific that they significantly limit presidential discretion to nominate appointees, legislators have been defining proficiency and sufficient preparedness for federal-government positions since the Progressive Era. Today's lawmakers seem to desire rather than require certain credentials of the president's nominees — a doctor of medicine for surgeon-general nominees, for example. Still, they typically require nominees to demonstrate knowledge of, and provide evidence of lengthy work within, a relevant field.

Some executive-branch officials — including undersecretaries, executive-agency heads, and members of independent collegial bodies — must meet formal standards, like those set for the administrator of the Federal Emergency Management Agency (FEMA). Chastened by its experience with Michael Brown during Hurricane Katrina ("Brownie," as President George W. Bush famously called him, actually holds a law degree), Congress established a series of qualifications for prospective FEMA heads in 2006. From then on, candidates needed a "demonstrated ability in and knowledge of emergency management and homeland security" and "not less than 5 years of executive leadership and management experience in the public or private sector."

States' legal stipulations for government positions are often quite specific. Most judicial and attorney positions, for example, mandate significant experience in legal practice, which requires both a law degree and membership in the state bar. Thirty-one U.S. states and territories require as much of their attorneys general. Twelve states demand their comptrollers have college diplomas, often in accounting, and sometimes even graduate degrees. States frequently require school superintendents to hold a teaching certificate as well as a terminal degree, such as a doctorate. Engineers are often required to take courses and exams beyond their formal education to become "professional," which enables them to secure higher pay and perform specific tasks.

At the municipal level, many policymaking processes are so esoteric that only experts can effectively participate. All over the country, municipal councils and boards use quasi-judicial processes for what are, at root, legislative proceedings. The rules prohibit ex parte communications, force the recusal of elected policymakers who have a conflict of interest or some previous knowledge of the matter, and dictate other standard procedures used in courtrooms, such as representation by counsel and the use of expert witnesses. The quasi-judicial process is frequently observed during the disposition of land-use matters, effectively excluding ordinary residents and leaving commissioners and council members at the mercy of attorneys, civil engineers, and realtors. Those who dare attempt participation are generally outmaneuvered within the byzantine maze of rules, regulations, and procedures.

Such quasi-judicial proceedings have been adopted by executive officials at all levels of government in an effort to be even-handed and provide protections to participants in what many believe to be their adjudicatory role. Yet issuing legal authority for someone to do something is an intrinsically legislative act. Congress and state assemblies may delegate that power to an administrative agency, but the power retains its legislative character. When lawmakers reserve the authority to make such decisions, they are free to consider all relevant information, and average citizens are at liberty to express their opinions and make their views known to their national representatives. When lawmakers delegate that authority to agencies, trained bureaucrats make decisions behind closed doors with little input from the public.

As obvious numerical minorities, individuals face clear disadvantages in a majoritarian political process. Yet lawmakers — especially those sensitive to constitutional and legal constraints — issue decisions only after consulting with concerned members of their jurisdiction. The legislative process can thus enable broad public participation and respect each resident's rights. An adjudicative process, by contrast — through which the state exercises power based on the recommendations of experts and the claims of those directly affected — precludes both.

The legislative process in Washington is more porous than it is at the state and local levels. The public makes its opinions known to members of Congress continuously, and lawmakers and their staffs are generally not experts in a single field.

The problem here, however, is that our national legislature no longer makes many of the rules that shape American public life. In the 1930s, Congress began delegating its lawmaking authority to the executive branch as well as independent agencies like the Federal Reserve and the National Labor Relations Board. In 1946, lawmakers enacted the Administrative Procedure Act (APA) to standardize executive rulemaking, effectively rendering the practice routine. As Congress increasingly ceded its legislative authority to these agencies, the Supreme Court failed to intervene, refusing to strike down such delegations as separation-of-powers violations. In the 1945 case Bowles v. Seminole Rock & Sand Co., the Court granted further leeway to agency experts by directing federal courts to yield to agencies' interpretations of their own vague regulations. And in 1984's Chevron v. Natural Resources Defense Council, the Court ruled that judges should defer to regulators when congressionally written instructions are ambiguous.

The number of regulations governing American life has skyrocketed since the 1930s, with no ceiling in sight. During Barack Obama's tenure as president, rules occupied 80,000 to 90,000 pages of the Federal Register — the federal government's daily publication of agency rules, proposed rules, and public notices. The Office of Information and Regulatory Affairs reported in its Unified Agenda that, on Obama's watch, agencies were considering around 2,500 new rules at any one point in time — not one of which had been reviewed by, much less received approval from, the people's representatives.

To be sure, when executive agencies make rules, they must provide for public comment before promulgating a final decision. And when they use their adjudicatory powers, Americans can advocate for themselves, as they might do in front of an administrative-law judge determining whether they are eligible for Social Security benefits.

Increasingly, however, administrative processes are elbowing citizens aside. Contemporary lawmaking at the federal level is now largely the domain of agency experts. Competing forums and complex procedures — standardized by the APA — intimidate the public, rendering participation unappealing and likely ineffective. Scientists and social engineers in federal agencies compose rules in the context of sophisticated research and technical considerations. They do not have constituents in the sense legislators do, and they do not make decisions in a political environment. Lawyers then consider the legality of these rules before they are enacted.

Philip Hamburger laments the unlawful nature of administrative law, largely because the Constitution explicitly grants the power to constrain the liberty of Americans only to the legislative and judicial branches. He also sees administrative lawmaking as inherently undemocratic, as it displaces the people and their representatives in favor of members of what he calls "the knowledge class." Unsurprisingly, administrative law remains attractive to the technocrat. As administrative-law advocate John Preston Comer wrote, it "has greater permanence, continuity and scientific value than legislative administration issuing from fickle popular bodies." But as a mechanism of governance in a republican democracy, it remains wanting.

The public has also been squeezed out of the process of distributing public goods through grants and procurement — money spent on administering a program combating homelessness or building a new weapons system, for example. Modern federal-procurement decisions are decreasingly shaped by markets and democratic mechanisms of allocation like elections; instead, agency bureaucrats are directed to certain outcomes by complex and entrenched rules and procedures. Potential contractors must commit to paying prevailing wages, while the procuring agency must encourage "buying American" so as to give advantage to domestic suppliers; provide "maximum practicable opportunities" to small businesses; issue contracts in geographic areas where there is a surplus of labor or a history of business "underutilization"; support minority-, female-, and veteran-owned small businesses; and evaluate firms' environmental records, business ethics, and hiring of illegal immigrants. Congress could rescind these requirements, of course. But they now have legitimacy, in addition to powerful constituencies within the expert class that manages federal purchasing.

Legislative "earmarks" — provisions in bills that direct federal spending or tax benefits to a particular recipient in a congressional district or state — once superseded agency formulas, bureaucratic discretion, and competitive procurement and grant-making processes as a mechanism for the distribution of funds and resources. Yet in response to accusations of corruption in 2011, House and Senate leaders put in place an earmark moratorium. The moratorium was recently lifted, but such constraints still shape the legislative politics of government spending. As a result, Congress has effectively ceded distributive policymaking to the experts in federal agencies, many of whom are high-ranking career civil servants. The White House, meanwhile, has little time or inclination to get deeply involved in all but the largest federal contracts; it tends to leave officials involved in acquisitions and grant-making alone.

As government procurement has come to focus more heavily on services and information rather than tangible goods, it increasingly requires an army of trained technical experts to evaluate complex bids and maintain what are called "indefinite delivery, indefinite quantity" contracts — arrangements between agency and vendor that evolve over the duration of the agreement. Whereas the Department of Defense used to order a certain number of aircraft from Boeing and await delivery, today it is more likely to purchase data and software from a host of obscure companies located in and around the District of Columbia, such as CACI, GDIT, Leidos, and SAIC. Not surprisingly, the ongoing relationship between government experts and the companies' researchers creates a growing community of powerful and well-paid scientists, computer engineers, and intelligence analysts, many enticed into government work by a revolving door that permits access to and from the lucrative private sector. Generalists, who inhabit the administration's top layers, can do little to hold them accountable.

Experts have tightened their grasp on federal policymaking in less formal ways as well. Many scientists and engineers rely on government to fund their research or purchase their products. As a result, they have learned to become effective lobbyists for their interests and influence government policy. "Communication" and "outreach" have become buzzwords in both the academy and private research settings. Although experts and their organizations, like the American Association for the Advancement of Science, bemoan their lack of influence, they have enjoyed great success in Washington; the annual federal budget for research-and-development expenditures is now about $145 billion. Though this amount has not grown much since the financial crisis of 2008-09, it is nearly twice what it was in the late 1990s. Before the end of the Cold War, more than two-thirds of the budget went to research directed at security and the military. Today, about half of this money is used for matters other than national defense.

State governments add only about $2.5 billion to annual research-and-development expenditures, but experts have succeeded in persuading those officials to use regulatory policy to promote their interests as well. As is the case in the federal government, states have bolstered occupational-licensing laws to protect experts' careers and influence over public life. Thirty-five states and the District of Columbia have certificate-of-need rules that permit public-health agencies to control private investment in the expansion of health-care facilities. Most states require residents to use "professional" engineers — those who have seemingly superfluous training beyond their formal education — when preparing and submitting routine engineering proposals to government agencies. Such rules advance the interests of experts at the expense of public input and self-government.

THE PROPER ROLE OF EXPERTS

Though it's easy to dismiss President Trump's senseless hostility toward experts in general, it would be irresponsible to ignore the grievance expressed by the many millions of voters who cheer his Know Nothing impieties. In some important ways, experts do have too much power in America today.

Many observers think experts have hardly any power at all, but this view is only viable because that power is so easily obscured. At all levels of American government, a small class of highly credentialed and experienced men and women — those generating knowledge in fields such as engineering, law, and the natural and social sciences — is granted significant authority in the policymaking process. Americans are not wrong to worry about the impact this new class of powerful technocrats is having on the accessibility and accountability of our democracy.

Expertise, of course, is essential to good government. Bills, executive orders, and other policy promulgations are instruments by which we solve social and economic problems. For these efforts to bear fruit, they must be informed. "Experts," as Yuval Levin has written, "are needed precisely when facts and figures do not speak for themselves and require analysis by people with a well-honed aptitude for seeing through fog." But it is important for elected officials, who are typically generalists and selected through democratic processes, to harness the information experts generate without allowing such information and its progenitors to unilaterally direct decision-making.

In addition to eroding democratic legitimacy, government by experts undermines effective policymaking. Because of their deep immersion in the complexities of a particular issue, experts frequently have little understanding of other fields. They are incapable of integrating policies into a coherent programmatic approach to the problems a city, region, or country faces. They cannot prudently weigh policy options from different issue domains against one another. Experts' training and credentials convince them of the importance of their work but leave them blinded to the costs, whether economic, social, or political, of their favored programs or policies. Effectively steering the ship of state requires leaders to understand and balance these costs, making specialized technocrats among the least-suited individuals to take the helm.

The administrative state exacerbates these pathologies. Scientific exploration is naturally pluralistic, assuming a diversity of theories and methods, incomplete data, and skeptical investigation. It was in this spirit that many public-health experts signed the Great Barrington Declaration criticizing the lockdowns that accompanied the early stages of the Covid-19 pandemic. But in modern governance, as Hamburger has noted, experts and the policymaking process they stand atop "hark back to the medieval monarchical vision of a wise ruler, who knows what is best for his people, and who therefore must have the full range of unspecialized power to impose justice."

Government officials today seek absolutes — what one might call "the truth." Experts purport to know what that is, and the delegation of lawmaking authority to administrative agencies staffed by such experts, combined with complex agency procedures that only they fully understand, allows them to force their conceptions of "the truth" into law.

At the outset of the pandemic, many epidemiologists pushed for widespread economic shutdowns with evangelical zeal even as they argued with one another over the accuracy of the data, the various theories of the virus's transmission, and the seriousness of the disease. Any dissent from their recommendations was characterized as self-serving and insensitive. They could not believe businesses and their employees were willing to take risks to protect their careers and livelihoods. Often working from the comfort of home offices and behind layers of employment protections, they were frequently oblivious to other health-related issues — such as deteriorating mental states and the delaying of critical medical procedures unrelated to Covid-19 — that arose from their decisions.

Skillful leaders recognize experts' motives. They recruit and deploy those with specialized knowledge as "teams of rivals," checking and balancing them against one another in a process similar to the one James Madison laid out in his discussion of factions in Federalist 10. Elected officials should pit experts against each other, encouraging them to continually cross-examine one another's methods and conclusions. Even then, however, they must not permit the "winner" to set policy unilaterally. Instead, decision-makers need to use their own judgment, necessarily guided by political self-interest and the varied interests of their constituents, as they weave together the most convincing of the experts' prognoses into a coherent political program.

This is no less the case at the local and state levels, where the complexity of policymaking is lessened. It is here that self-government — by average citizens, not experts — is most beneficial in not just a democratic, but a practical sense. Americans do not believe they can, or even should, influence the process by which the federal government selects the next weapons system to protect the nation, to take one example. But they do not wish to be similarly disenfranchised when they participate in community deliberations about the placement of traffic signals in their neighborhood. In these cases, their experience is surely just as valuable as the knowledge of the experts.

TEMPERING THE TECHNOCRATIC IMPULSE

Professor and commentator Tom Nichols recently argued that Americans' deteriorating regard for experts represents "the death of expertise." In his 2017 book by that name, Nichols presented alarming trends in the levels of disregard, distrust, and outright hostility that modern Americans display toward the knowledge class. Though Americans today are no more or less misinformed than they have been in the past, Nichols asserted that the rise of Information Age technology and an increasingly narcissistic cultural ethos have convinced Americans that their opinions are just as valid as those of credentialed authorities.

Though technological and cultural changes likely represent important causes of shifting attitudes regarding experts, Nichols's explanation fails to account for the way growing distrust of experts so closely mirrors the growth of their authority and power in American public life. Americans may not always fully understand the mechanisms behind recent expansions of expert power, but they perceive the tremendous influence that technocrats now exert over public policy, and they are not wrong to worry that this influence has come at the expense of both responsive lawmaking and democratic accountability. Along with technological and cultural change, this revolution in the structure of American government should be regarded as a significant cause of Americans' mounting resentment toward the nation's expert class.

If the crisis of confidence in America's experts is to be addressed, a good place to start would be to rectify the outsized role they have come to play in shaping American public life. In May 2020, Anthony Fauci himself explained succinctly what such a change could look like. In response to a question from Senator Rand Paul, who asserted during a committee hearing that the doctor should not be the person who makes policy decisions, Fauci concurred: "Senator Paul, I have never made myself out to be the end-all and only voice in this. I'm a scientist, a physician, and a public-health official. I give advice according to the best scientific evidence."

Experts like Dr. Fauci are critical in crafting and implementing public policy, but they must not be regarded as the "end-all" authorities in that work. Such authority ultimately rests with the voters and their elected representatives. America's political leaders must do more to ensure that the structure of their government better reflects this truth.

ANDREW J. TAYLOR is a professor of political science at North Carolina State University.

