My Paper: Lessons from Around the World on Drug Decriminalization and Legalization

After decades of tremendous financial and social costs, the punitive drug model is being steadily eroded at home and abroad. Even conservative law-and-order types who oppose the use of illicit drugs increasingly accept that the war on drugs has failed both in its objective (undercutting drug use) and in its efficiency (accomplishing little while exacting a huge economic and human toll).

Even Mexico, which has suffered more than most nations from our appetite for illegal drugs, has moved forward with legalizing marijuana in an effort to undercut a major source of funding for its powerful and vicious cartels. (So now both of America's immediate neighbors have done away with punitive attitudes towards one of the weaker and comparatively less harmful illicit substances.)

All that being said, I do feel validated in having proposed and written a paper exploring the alternative methods, policies, and cultural attitudes of various countries when it comes to illegal drugs. As the U.S. and other countries question the wisdom of the status quo, it may help to look abroad at those places that were ahead of the curve in dispensing with the punitive approach in favor of more constructive methods. I focus especially on Portugal, which twenty years ago blazed the trail towards decriminalizing all illegal drugs and framing their use as a public health matter rather than a criminal one.


As you will hopefully read, many of these strategies are unique to the time, place, or sociopolitical context of the nations that implemented them; nevertheless, there are still useful lessons to glean, and at the very least we can see proof that there are other ways to address the scourge of drug addiction, trafficking, and other associated ills, besides the blunt instrument of police and prisons.

Feel free to leave your thoughts, reactions, and feedback. Thanks again for your time.

The Outbreaks That Never Happened and the Unseen Success of Global Institutions

Given all the death and dysfunction resulting from the COVID-19 pandemic, it is worth appreciating the many potential outbreaks that never happened, thanks to the efforts of Kenya, Mozambique, and Niger, alongside the United Nations and other international partners.

In December 2019, just months before the COVID-19 pandemic hit full swing, these nations managed to halt an outbreak of a rare strain of "vaccine-derived polio", which occurs in communities "where overall immunization is low and that have inadequate sanitation, leading to transmission of the mutated polio virus". The achievement is all the more commendable given that Niger is among the ten poorest countries in the world.

The fact that polio remains both rare and relatively easy to quash is the result of a U.N.-backed campaign announced in 2005 to immunize 34 million children against the debilitating disease, which often leaves victims permanently disabled. The effort was led by the World Health Organization (WHO), the U.N. Children's Fund (UNICEF), Rotary International, and the United States Centers for Disease Control and Prevention.

A nurse administers an oral poliovirus vaccine (OPV) to a baby at the Kaloko Clinic, Ndola, Zambia.
© UNICEF/Karin Schermbrucke

A little over fifteen years later, two of the three strains of polio have been eradicated—one as recently as last year—while the remaining strain persists in just three countries: Afghanistan, Nigeria, and Pakistan. This once-widespread disease is on its way to becoming only the second human disease ever eradicated, after smallpox, which once killed millions annually. That feat, accomplished only in 1979, was also a multinational effort led by the U.N., even involving Cold War rivals America and the Soviet Union.

Even now, the much-maligned WHO actively monitors the entire world for "acute public health events" or other health emergencies of concern that could portend a future pandemic. As recently as one month ago, the U.N. agency issued an alert and assessment concerning cases of MERS-CoV (a respiratory illness related to COVID-19) in Saudi Arabia. Dozens of other detailed reports have been published in the past year through WHO's "Disease Outbreak News" service, spanning everything from Ebola in Guinea to monkeypox in the United States. (WHO also has an influenza monitoring network spanning over half the world's countries, including the U.S.)

Not bad for an agency with an annual budget of slightly over $2 billion—smaller than the budgets of many large U.S. hospitals. And contrary to popular belief in the U.S., the WHO did in fact move relatively quickly with respect to the COVID-19 pandemic:

On 31 December 2019, WHO’s China office picked up a media statement by the Wuhan Municipal Health Commission mentioning viral pneumonia. After seeking more information, WHO notified partners in the Global Outbreak Alert and Response Network (GOARN), which includes major public health institutes and laboratories around the world, on 2 January. Chinese officials formally reported on the viral pneumonia of unknown cause on 3 January. WHO alerted the global community through Twitter on 4 January and provided detailed information to all countries through the international event communication system on 5 January. Where there were delays, one important reason was that national governments seemed reluctant to provide information.

Of course, it goes without saying that the WHO, and global institutions generally, have their shortcomings and failings (as I previously discussed). But much of that stems from structural weaknesses imposed by the very governments that criticize these international organizations in the first place:

WHO also exemplifies the reluctance of member states to fully trust one another. For example, member states do not grant WHO powers to scrutinise national data, even when they are widely questioned, or to conduct investigations into infectious diseases if national authorities do not agree, or to compel participation in its initiatives. Despite passing a resolution on the need for solidarity in response to covid-19, many member states have chosen self-centred paths instead. Against WHO’s strongest advice, vaccine nationalism has risen to the fore, with nations and regional blocks seeking to monopolise promising candidates. Similarly, nationalistic competition has arisen over existing medicines with the potential to benefit patients with covid-19. Forgoing cooperation for selfishness, some nations have been slow to support the WHO organised common vaccine development pool, with some flatly refusing to join.

The tensions between what member states say and do is reflected in inequalities in the international governance of health that have been exploited to weaken WHO systematically, particularly after it identified the prevailing world economic order as a major threat to health and wellbeing in its 1978 Health for All declaration. WHO’s work on a code of marketing of breastmilk substitutes around the same time increased concern among major trade powers that WHO would use its health authority to curtail private industry. Starting in 1981, the US and aligned countries began interfering with WHO’s budget, announcing a policy of “zero growth” to freeze the assessed contributions that underpinned its independence and reorienting its activities through earmarked funds. The result is a WHO shaped by nations that can pay for their own priorities. This includes the preference that WHO focus on specific diseases rather than the large social, political, and commercial determinants of health or the broad public health capacities in surveillance, preparedness, and other areas needed for pandemic prevention and management.

In fact, it was this prolonged period of chronic underfunding, and of WHO member states prioritizing nonemergency programs, that precipitated the agency's abysmal failings in the early phases of the 2014 Ebola outbreak. But once that crisis ended, member states, rather than defund or abandon the organization, opted to reform and strengthen its emergency functions; this overhaul resulted in the Health Emergencies Program, which was tested by the pandemic and has thus far proven relatively robust, as the response timeline quoted above attests.

I know I am digressing into a defense of WHO, but that ties into the wider problem of too many governments and their voters believing that global governance is ineffective at best and harmfully dysfunctional at worst. We Americans in particular, as constituents of the richest country in the world, have more sway than any other society over how institutions like the U.N. function—or indeed whether they are even allowed to function.

As our progress against polio, smallpox, and many other diseases makes clear, what many Americans decry as "globalism" is actually more practical and effective than we think, and more relevant than ever. Fortunately, we have many potential outbreaks that never happened to prove it.

Forgotten Allies

The contributions of our foreign allies to the Afghanistan War have been overlooked or downplayed throughout the 20-year conflict. But in proportion to their size, many of them committed more troops and funds, and suffered more casualties, than even the U.S.

The 9/11 attacks marked the first time NATO invoked Article 5 of its treaty, which enshrines the principle of "collective defense" by recognizing an attack against one ally as an attack against all. Thus, the other 29 members of NATO—along with 21 partner countries ranging from Australia to South Korea—contributed troops, money, and other aid to the war in Afghanistan.

(It is also worth adding that even the typically deadlocked U.N. Security Council resoundingly supported American retaliation, indicating an exceptionally rare degree of international support.)

Besides the U.S., the top five countries to send troops were the United Kingdom, Germany, France, Italy, and Canada. The U.K. in particular supplied roughly two to three times the troops of the other top contributing allies relative to its population.

British and Canadian troops put their lives at risk at twice the rate of American troops, when seen as a percentage of each country’s peak deployment. Proportionally, both suffered more than double the casualties of U.S. forces, while France suffered a similar rate.

As a proportion of their militaries, many smaller countries played an outsized role, with Denmark, Estonia, Georgia, Norway, and North Macedonia ranking near the top after the U.S. and U.K.; consequently, some of these countries suffered the highest fatality rates per capita.

The top contributing allies lost over a thousand lives in U.S.-led conflicts in Afghanistan as well as Iraq; all told, roughly half of all foreign military deaths in Afghanistan were among U.S. allies.

When measured as a percentage of their annual baseline military spending, the U.K. and Canada spent roughly half as much on Afghanistan as the U.S.; relative to their overall economic size, the U.K. spent more than the U.S., while Germany and Canada spent about the same.

This did not have to be our allies' fight. The likes of Georgia, Norway, and South Korea (among dozens of others) had little to no skin in the game, aside from a broader sense that terrorism could potentially impact them. But even then, involvement put them at greater risk of retaliation and domestic opposition (as Spain learned the hard way when it lost nearly 200 lives in a terrorist attack perpetrated in response to its participation in Iraq).

The Sadly Prescient Warnings of the United Nations

The United Nations warned about the deteriorating situation in Afghanistan for years, and just three months ago published a report with tragically accurate warnings about the repercussions of a hasty withdrawal. It is a grim reminder that we should pay more attention to international institutions like the U.N., since they benefit from having a large pool of resources from different countries, and are given access that most governments are denied.

The U.N. report stated the Taliban was trying to demoralize the government, intimidate the populace, and put "major pressure" on provincial capitals, "massing forces around key provincial capitals and district centers, enabling them to remain poised to launch attacks"—which we saw play out in barely two weeks.

U.N. observers believed the Taliban were planning their operations around the withdrawal date announced by Trump and Biden, after which foreign troops would "no longer [be] able to effectively respond". The report cautioned that the Afghan military was "in decline" and that our departure "will challenge Afghan Forces by limiting aerial operation with fewer drones and radar and surveillance capabilities, less logistical support and artillery, as well as a disruption in training"—again, all of which explains why the government melted away so soon.

The U.N. also predicted that the Taliban would target departing foreign troops to "score propaganda points", and believed the group remained "closely aligned" with al-Qaeda, with "no indication of breaking ties" despite efforts to mask the connection. To make matters worse, the U.N. warned that Islamic State could position itself in Afghanistan, which recent news reports suggest is already happening.

While it remains to be seen whether some of the pending predictions come true, the U.N.’s overall conclusion was sadly spot on: “The Afghan Taliban poses a major threat to the survival of the Afghan government, which is likely to substantially grow with the full withdrawal of U.S. forces”.

[Literally one day after I shared the U.N. report on social media, Kabul’s airport was attacked by an Islamic State affiliate, killing over a dozen Americans and scores of Afghans desperately trying to flee. The report had warned of other extremist groups that are or will grow more powerful, often with tacit Taliban support, and that the Taliban would take full advantage of our withdrawal and target departing foreign troops to “score propaganda points”. Sadly, it was once again not too far off the mark.]

I am not sure how many more disasters and tragedies it will take for us to learn to listen to our international partners, many of whom have intelligence networks and resources we lack. One does not have to be a “globalist” to recognize that — the writing was almost literally on the wall.

The Rebellion that Shook a Fledgling America

Shays' forces flee Continental troops at Springfield.

On this day in 1786, the newly minted United States faced its greatest domestic challenge yet, when Daniel Shays led an armed uprising against government authority in western Massachusetts, known as "Shays' Rebellion".

Shays was a veteran of the American Revolutionary War who saw combat in several major battles and was wounded in action. Like most people in the fledgling country of roughly three million, he was a subsistence farmer just scraping by; most residents of rural Massachusetts had few assets beyond their land, and often had to rely on barter or debt from urban merchants.

Like most revolts, this one had many complex ideological, political, and economic drivers that led Shays and some four thousand others to take up arms against their purportedly representative government. Most veterans received little pay for their service, and still had difficulty getting the government to pay up. Compounding this problem were mounting debts to city businessmen and higher taxes from the state government, which happened to be dominated by the same mercantile class.

Mounting bankruptcies and repossessions, coupled with the ineffectiveness of the democratic process in addressing these grievances, finally boiled over into well-organized efforts to shut down the courts and prevent more "unjust" rulings. Over time, the protests morphed into an outright insurrection that sought the overthrow of the Massachusetts government. Things came to a head in 1787, when Shays' rebels marched on the federal Springfield Armory in an unsuccessful attempt to seize its weaponry for their cause. The national government, then operating under the Articles of Confederation, had neither the power nor the money to field troops to put down the rebellion; it fell to the Massachusetts state militia, and even privately funded local militias, to end the uprising, at the loss of nine lives in total.

Most participants were ultimately pardoned, including Shays himself, who died poor and obscure in 1825. But the legacy of the conflict far outlived its relative blip in history: Though still widely debated, the rebellion may have accelerated the already-growing calls for the weak Confederation to be replaced by a federal system under a new constitution. Among other things, the event is credited with inspiring a relatively more powerful executive branch, as it was believed a single president would have a better chance of acting decisively against national threats. Some delegates felt the uprising proved the masses could not be trusted; the proposed Senate was already designed to be indirectly elected (as it would remain until the early 20th century), but some wanted even the House of Representatives removed from the popular vote.

Regardless of its effects, if any, on the course of our constitutional and political development, the causes, sentiments, and public debates around Shays' Rebellion (and the response to it) are no doubt familiar to many of us today; depending on how you look at it, that is either reassuring (things are not so uniquely bad after all) or depressing (things are *still* pretty bad over two centuries later).

The Franco-American Alliance and U.S. Independence

Among the four Revolutionary War paintings prominently displayed in the U.S. Capitol Rotunda is the Surrender of Lord Cornwallis by John Trumbull (known as the "Painter of the Revolution" for his many iconic depictions of the war and its era; you'll recognize many of them if you look him up).

Surrender of Lord Cornwallis, by John Trumbull. (Wikimedia.org)

The painting shows the British surrender at Yorktown in 1781, which marked the decisive end of the American Revolution. Flanked on one side of the defeated general are Americans carrying the Stars and Stripes; on the other, French soldiers beneath the banner of France's monarchy—the two forces portrayed as equal combatants. Trumbull's decision to show the French and Americans as equal victors reflected widespread acknowledgement that the U.S. owed its independence to the Kingdom of France. (Ironically, the world's first modern republic was birthed with the help of one of Europe's oldest and most absolute monarchies—more absolute than Great Britain's!)

Almost as many French troops took part in the final battle as Americans; one of the two military columns that secured victory was entirely French. Meanwhile, the French Navy had kept British ships from coming to Cornwallis’ aid, prompting him to surrender—and the British to sue for peace. Even this already-critical contribution is just one example of decisive French aid.

Well before the Declaration of Independence, the Founders actively sought an alliance with France. While the French monarchy was everything the revolution stood against—heck, it was more authoritarian than even Britain's—the Patriots were pragmatic enough to recognize that only the French had both the motive and the means to take on the British, to whom they had lost all their North American colonies barely a decade before, in the Seven Years' War (to say nothing of centuries of rivalry and mutual enmity). Indeed, France's foreign minister urged the king to support the Americans, arguing that "[destiny] had marked out this moment for the humiliation of England."

Hence the Founders pursued a two-year diplomatic mission, led by noted Francophile Benjamin Franklin, to court the French for as much aid and support as possible.

Wikimedia.org

The alliance was not merely opportunistic: Most of the Founders were avid consumers of French political philosophy, which promoted ideals of individual liberty and political representation. As far back as the 1760s, it was trendy for Americans to favor France over their English overlords; as one historian notes, “It became almost a patriotic duty for colonists to admire France as a counterpoise to an increasingly hostile England”. France’s powerful monarchy helped spur many French thinkers to explore better political alternatives—and in the process, inspire Americans across the Atlantic.

Patrick Henry's famous exhortation, "Give me liberty, or give me death!", which helped convince the colonists to prepare for war, echoed French philosopher Jean-Jacques Rousseau, who opened his influential 1762 work, The Social Contract, with the words "Man is born free and everywhere he is in chains". Rousseau's core argument—predating the American Revolution by over a decade—is familiar to us now: Sovereignty rests not in a monarch but in the people, and laws must reflect the common good, not the whims of an aristocratic elite. These ideals were channeled by Thomas Jefferson—another avid reader and noted Francophile—into the language of the Declaration of Independence.

The U.S. Constitution may have drawn from the even older work of Baron de Montesquieu, whose "The Spirit of the Laws", published some forty years earlier, laid out many now-familiar principles: that the executive, legislative, and judicial functions of government should be separated, so that each branch can keep the others in check; that laws should ensure a fair trial, the presumption of innocence, and proportional punishment; and that people have freedom of thought, speech, and assembly. (He also argued against slavery, though sadly that idea did not take root until much later.)

Lafayette (right) depicted alongside George Washington at Valley Forge. John Ward Dunsmore (1907)

In any event, the admiration was mutual: Many French, including those who directly aided and fought in the American Revolution, were chafing under the monarchy and sought change; many of the political philosophers beloved by the Founders, including Rousseau and Montesquieu, faced persecution and even exile for their writings. To many in France, the nascent American republic represented their ideals made real, an experiment they wanted to succeed so it might serve as a model for their own efforts. (It is no coincidence that the French Revolution—which was bolder but bloodier than our own—would occur less than two decades after America's.)

But as important as the ideological support was the practical kind. Even the noblest efforts require money to succeed, and France—then one of the world's wealthiest countries—provided open-ended credit worth billions in today's dollars. American troops, who initially lacked even basic goods like boots and winter jackets, received those supplies and more: By some measures, 90% of American gunpowder was of French origin, as was a similar proportion of U.S. armaments at Yorktown.

The Comte de Rochambeau, who is pictured as Washington's equal in the Surrender of Lord Cornwallis, led the French Expeditionary Force that helped secure American victory—and which remains the only foreign allied force ever to campaign on American soil. Other brilliant Frenchmen, like the Marquis de Lafayette, Louis Duportail, and Pierre L'Enfant, played leading roles in the war and were personal friends and aides to George Washington (L'Enfant even went on to design the nation's capital). Tens of thousands more French served as soldiers and sailors, with the latter making up the bulk of the allied naval force.

Beyond the military dimension, France's diplomatic heft cannot be overstated: As the first country to recognize American independence, it provided considerable legitimacy to the Patriots' cause; if one of the most powerful countries in the world saw something in these upstart Americans, why shouldn't other nations? Sure enough, France managed to get other powers like Spain and the Dutch Republic to throw in their lot with the Americans—turning what could have been just another self-contained rebellion into a full-fledged world war that stretched British forces thin. France even helped broker the peace deal that finally secured British recognition of U.S. independence—the Treaty of Paris—after refusing Britain's offer of a separate peace without the Americans (a pretty solid ally indeed).

Sources: Wikipedia; Encyclopedia Britannica; History.com, "How Did the French Help Win the American Revolution?"

A World of Knowledge

It is odd that Americans are so reluctant, if not outright hostile, to look abroad for ideas about how to do things, such as education, voting methods, and healthcare. The principles and ideas that underpinned this nation's founding did not emerge from nowhere: They were inspired by, or even directly drawn from, Enlightenment thinkers from across Europe; certain elements of British law and government (ironically), such as the Magna Carta and the English Bill of Rights; and of course the Greeks and Romans, from whom we borrowed specific methods, institutions, terminology, and even architecture. (The U.S. Senate is explicitly inspired by the original Roman Senate, senatus being Latin for "council of elders".)

Americans make up less than five percent of humanity. The U.S. is one of nearly 200 countries. Its history as a nation, let alone as a superpower, is a relative blink in time; as a point of reference, the Roman-Persian wars lasted over 600 years, nearly three times America’s lifespan. Conversely, many countries are much younger, including most of the world’s democracies, providing fresher or bolder perspectives on certain issues not addressed or contemplated by our more conservative system.

Given all that, it stands to reason that someone, somewhere out there, has done something that we have not thought of or figured out, something worth studying or implementing. It is statistically unlikely that we alone have all the answers, given our narrow slice of time, geography, and experience. The fact that so many innovators, inventors, and other contributors to this country have come from all over the world proves that the U.S. has always tacitly accepted the idea that the rest of the world has something to offer.

In fact, this would be in accordance with the vision of most of the nation's founders, who were far from nationalistic. Their debates, speeches, and correspondence reveal them to have been fairly worldly folks, open to foreign ideas and perspectives, who sought to integrate the country into the international system. From Jefferson's cherished copy of the Koran, to Franklin's open Francophilia, to Madison's insistence that we respect global public opinion and norms, the supposed dichotomy between patriotism and internationalism is a false one, belied by the Founders' own service to the nation.

It is all the more ironic because one of the few schools of philosophy to originate in the United States is pragmatism, which emerged in the 1870s and postulated, among other things, that ideas should be judged by their practical effects and benefits (i.e., regardless of their national or foreign origin). It should not matter where our solutions to certain problems come from; what matters is that they are solutions, and thus benefit our community, in the first place.

An American Parliament

As the U.S. once again finds itself stuck between two widely unpopular choices, it is worth reflecting on this 2016 hypothetical from The Economist, a British newspaper: an American parliament, with parties centered on narrower but more representative platforms.

What if the United States had a parliament? The Economist's predicted parliament of 435 seats, based on April 22nd–26th 2016 polling, with seats allocated proportionally by census region (North, Midwest, South, West):

Left, "Social Democratic Party" (Bernie Sanders): 26% of the vote, 113 seats
Centre-left, "Liberal Party" (Hillary Clinton): 28%, 124 seats
Centre-right, "Conservative Party" (John Kasich): 8%, 37 seats
Right, "Christian Coalition" (Ted Cruz): 11%, 49 seats
Populist, "People's Party" (Donald Trump): 26%, 112 seats

Sources: YouGov; CPS; The Economist. Picture credits: Getty Images; Reuters

America's presidential system, along with its winner-take-all elections and Electoral College, tends to produce gridlock and polarization. These mechanisms and institutions were devised before political parties were a thing—or at least before they hardened into their current form—and thus never seriously took them into account. Hence, we are stuck with two big-tent parties that are far from representative of the complex spectrum of policies and ideologies.

Rather than the proportional representation you see above, members of Congress are elected in single-member districts according to the "first-past-the-post" (FPTP) principle, meaning that the candidate with a plurality of votes—i.e., not necessarily a majority—wins the congressional seat. The losing party or parties, and by extension their voters, get no representation at all. This tends to produce a small number of major parties, a dynamic known in political science as Duverger's Law.
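To make the contrast concrete, here is a minimal sketch in Python, using invented vote shares purely for illustration (not real election data), of how the same ballots translate into seats under FPTP versus a simple largest-remainder proportional rule:

```python
from collections import Counter

# Illustrative only: three parties' vote shares across five equal districts.
# Party A holds a narrow plurality everywhere; B and C split the rest.
districts = [
    {"A": 40, "B": 35, "C": 25},
    {"A": 38, "B": 37, "C": 25},
    {"A": 41, "B": 30, "C": 29},
    {"A": 39, "B": 36, "C": 25},
    {"A": 42, "B": 33, "C": 25},
]

# First-past-the-post: each district's single seat goes to the plurality winner.
fptp_seats = Counter(max(d, key=d.get) for d in districts)

# Largest-remainder proportional allocation of the same five seats.
totals = Counter()
for d in districts:
    totals.update(d)
seats = len(districts)
quotas = {p: v * seats / sum(totals.values()) for p, v in totals.items()}
pr_seats = {p: int(q) for p, q in quotas.items()}
# Hand out any leftover seats to the largest fractional remainders.
for p in sorted(quotas, key=lambda p: quotas[p] % 1, reverse=True)[: seats - sum(pr_seats.values())]:
    pr_seats[p] += 1

print("FPTP:", dict(fptp_seats))  # {'A': 5}: 40% of the vote, all of the seats
print("PR:  ", pr_seats)          # {'A': 2, 'B': 2, 'C': 1}
```

In this toy example, a party with about 40% of the vote sweeps every district under FPTP, while the proportional rule spreads the same five seats across all three parties; that winner-take-all squeeze on smaller parties is precisely the dynamic Duverger's Law describes.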

With the Electoral College, a similar dynamic is at play: in nearly every state, a presidential candidate needs only a plurality of the popular vote, not a majority, to win all of the state's electors. Some states have considered awarding electors proportionally, but so far only Maine and Nebraska split their electoral votes, and they do so by congressional district rather than by strict proportionality.
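The same squeeze can be sketched for a single hypothetical state (again with made-up numbers, and a deliberately simplified proportional split for comparison):

```python
# Illustrative only: a three-way race in a hypothetical 10-elector state.
state_votes = {"X": 44, "Y": 41, "Z": 15}  # vote percentages; nobody has a majority
electors = 10

# Winner-take-all, as nearly every state does it: the plurality winner
# receives every elector.
winner = max(state_votes, key=state_votes.get)
winner_take_all = {p: electors if p == winner else 0 for p in state_votes}

# A naive proportional split (a real scheme would need a rule for
# fractional remainders, as in the sketch above).
proportional = {p: round(v / 100 * electors) for p, v in state_votes.items()}

print(winner_take_all)  # {'X': 10, 'Y': 0, 'Z': 0}
print(proportional)     # {'X': 4, 'Y': 4, 'Z': 2}
```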

This is why you see so many seemingly contradictory interests lumped into one or the other party. In other systems, you may have a party centered on labor rights, another on the environment, yet another for “conventional” left-wing or right-wing platforms, etc. The fragmentation might be messy, but it also forces parties to either appeal to a larger group of voters (so they can have a majority) or form coalitions with other parties to shore up their legislative votes (which gives a voice to smaller parties and their supporters).

Note that this is a huge oversimplification, as literally whole books have been written about all the reasons we are stuck with a two-party system most do not like. And of course, a parliament would not fix all our political problems, which go as deep as our culture and society.

But I personally think we may be better off with a parliamentary-style multiparty system—uncoincidentally the most common in the world, especially among established democracies—than what we have now.

What are your thoughts?

Compulsory Voting

As I see folks share that they voted, I’m reminded of the idea of mandatory voting, in which all eligible citizens are required to vote unless they have a valid excuse.

In ancient Athens, it was seen as the duty of every eligible citizen to participate in politics; while there was no explicit requirement, you could be subject to public criticism or even a fine.

Today, only a few countries require citizens to vote, most of them in Latin America; but of this already small number, only a handful actually enforce it with penalties.

Map of compulsory voting worldwide: countries are shaded as having no compulsory voting, no sanctions, minimal sanctions, or costly sanctions. (Data: V-Dem Dataset Version 8, 2018; via Wikimedia)
Note: The light blue countries require voting but don't enforce it.

Moreover, just five of the world's 35 established democracies have compulsory voting: Australia, Luxembourg, Uruguay, Costa Rica, and Belgium (which has the oldest existing compulsory voting system, dating back to 1893). In Belgium, registered voters must present themselves at their polling station, and while they don't have to cast a vote, those who fail to at least show up without proper justification can face prosecution and a moderate fine. (To make it easier, elections are always held on Sundays.) Those who fail to vote in at least four elections can lose the right to vote for 10 years, and may face difficulties getting a government job (though in practice fines are no longer issued).

The argument for compulsory voting is that democratic elections are the responsibility of citizens—akin to jury duty or paying taxes—rather than merely a right. The idea is that making voting obligatory means all citizens share responsibility for the government they choose; in a sense, it makes the government more legitimate, since it represents the vast majority of people.

The counterargument is that no one should be forced to take part in a process they don't believe in or otherwise don't want to be a part of; not voting, in other words, is itself a form of expression. Unsurprisingly, this view is prevalent in the U.S., where many believe compulsory voting violates freedom of speech, because the freedom to speak necessarily includes the freedom not to speak. Moreover, many citizens will vote solely because they have to, in total ignorance of the issues or candidates. Some might deliberately spoil their ballots to slow the polling process and disrupt the election, or vote for frivolous or joke candidates. This is prevalent in Brazil, the largest democracy with mandatory voting, where people have grown increasingly cynical about politics, elect joke candidates, and still choose not to vote despite the penalty.

Some have argued that compulsory voting helps prevent polarization and extremism, since politicians have to appeal to a broader base (i.e., the entire electorate). It does not pay to energize your base to the exclusion of all other voters, since elections cannot be determined by turnout alone. This is allegedly one reason Australian politics is relatively more balanced, with strong social policies but also a strong conservative movement.

Finally, there is the claim that making people vote might also make them more interested in politics. It has been shown that while many people resent jury duty, for example, once they are on a jury they typically take the process seriously. Similarly, people may hate mandatory voting in theory but in practice will find themselves trying to make the best of it.