My Paper: Lessons from Around the World on Drug Decriminalization and Legalization

After decades of tremendous financial and social costs, the punitive drug model is being steadily eroded at home and abroad. Even the conservative law-and-order types who oppose the use of illicit drugs are increasingly accepting that the war on drugs has failed both in its objective (undercutting drug use) and in its efficiency (accomplishing little while exacting a huge economic and human toll).

Even Mexico, which has suffered more than most nations from our appetite for illegal drugs, has gone forward with legalizing marijuana in an effort to undercut a major source of funding for its powerful and vicious cartels. (So now both of America’s immediate neighbors have done away with punitive approaches towards one of the milder and comparatively less harmful illicit substances.)

All that being said, I do feel validated in having proposed and written a paper exploring the alternative methods, policies, and cultural attitudes of various countries when it comes to illegal drugs. As the U.S. and other countries question the wisdom of the status quo, it may help to look abroad at those places that were ahead of the curve in dispensing with the punitive approach in favor of more constructive methods. I focus especially on Portugal, which twenty years ago blazed the trail towards decriminalizing all illegal drugs and framing their use as a public health matter rather than a criminal one.


As you will hopefully read, many of these strategies are unique to the time, place, or sociopolitical context of the nations that implemented them; nevertheless, there are still useful lessons to glean, and at the very least we can see proof that there are other ways to address the scourge of drug addiction, trafficking, and other associated ills, besides the blunt instrument of police and prisons.

Feel free to leave your thoughts, reactions, and feedback. Thanks again for your time.

The Shanghai Cooperation Organization

It is not a household name like NATO or the European Union, but the milquetoast-sounding Shanghai Cooperation Organization may become one of the most important geopolitical blocs in the world. Iran’s recent entry into the Eurasian alliance has given it a rare spotlight in mainstream Western news media.

Founded two decades ago, the SCO is the world’s largest regional organisation, covering three-fifths of the Eurasian continent, nearly half the human population, and one-fifth of global GDP. It originated from a mutual security agreement in the 1990s between Russia, China, and several Central Asian countries (all former Soviet republics), which committed to maintaining “military trust” along their border regions.

But since being announced by member governments in Shanghai in 2001, the SCO has become more integrated along political, economic, and even cultural lines, in addition to beefing up military cooperation beyond simply maintaining border security. The fact that the alliance is led by two of America’s chief rivals, and composed mostly of authoritarian countries, certainly adds to its image as the principal antipode to the Western-led world order.

No doubt Iran’s membership will add to that perception, though it also joins the likes of India and Pakistan, which became members in 2017, both of which are close (if tenuous) partners with the United States and other Western countries.


In fact, many analysts warn that the perception of the SCO as an anti-American or anti-Western bloc is vastly overstated. While it is certainly predicated on the idea of a “multipolar” world—coded language for an international order not dominated by the U.S. specifically—the group is far from presenting itself as anything akin to an “Eastern” NATO:

Rather than major political or economic gains, Iran’s main takeaway from this success in the short term may be limited to a boost in prestige and diplomacy.

The main issue with Iran’s approach towards the SCO is that it looks at it as a “concert of non-Western great powers” rather than a modern international organisation, and views it in an anti-Western or anti-US setting, says Hamidreza Azizi, visiting fellow at the German Institute for International and Security Affairs (SWP).

“This is despite the fact that countries such as Pakistan and India are US’s close partners, and even Russia and China have never been willing to openly challenge the US on the global scene,” Azizi told Al Jazeera.

“The combination of these two misunderstandings, and also Iran’s self-perception as a natural hegemon in West Asia, would make the whole thing appear to the Iranian leaders as Iran joining other anti-Western great powers to form a strong coalition that is going to challenge the US hegemony.”

Azizi added that SCO members are reluctant to entangle themselves in Iran’s rivalries, which may be why, on Friday, they also admitted Saudi Arabia, Qatar and Egypt as “dialogue partners” in a balancing effort.

From a diplomatic perspective, the approval is significant.

Indeed, for a country as diplomatically and economically isolated as Iran, joining such a large and imposing regional body, whatever its limitations, is at least good optics.

A slightly dated map showing SCO members (dark green), observers (light green), and “dialogue partners” (yellow). Source: Wikimedia Commons

The SCO is far from being a full-fledged alliance with formal and binding commitments among its members; there is nothing like NATO’s Article 5, which obligates all members to come to the defense of another member in an attack, nor does it have the level of economic integration of the European Union. As one analyst describes it, the SCO is more of a “venue” for discussion among “high-level dignitaries”—which is perfectly suited for mostly autocratic countries that jealously guard their sovereignty.

Still, many powerful regional blocs like the EU did start from humble beginnings, growing from diplomatic talk shops to fully institutionalized arrangements over the span of decades. A wide array of countries have expressed interest in joining the group or are currently engaged with it in some way, including NATO members like Turkey and strategic partners like Saudi Arabia. It remains to be seen if the SCO will ever become as tightly integrated as its Western counterparts, though this is unlikely given its explicit commitment to nonintervention in members’ affairs—which ironically makes it all the more appealing for certain countries to join.

The Outbreaks That Never Happened and the Unseen Success of Global Institutions

Given all the death and dysfunction resulting from the COVID-19 pandemic, it is worth appreciating the many potential outbreaks that never happened, thanks to the efforts of Kenya, Mozambique, and Niger, alongside the United Nations and other international partners.

In December 2019, just months before the COVID-19 pandemic came into full swing, these nations managed to halt an outbreak of a rare strain of “vaccine-derived polio”, which occurs in places “where overall immunization is low and that have inadequate sanitation, leading to transmission of the mutated polio virus”. It is all the more commendable given that Niger is among the ten poorest countries in the world.

The fact that polio remains both rare and relatively easy to quash is the result of a U.N.-backed campaign announced in 2005 to immunize 34 million children against the debilitating disease, which often leaves victims permanently disabled. The effort was led by the World Health Organization, the U.N. Children’s Fund (UNICEF), Rotary International, and the United States Centers for Disease Control and Prevention.

A nurse administers an oral poliovirus vaccine (OPV) to a baby at the Kaloko Clinic, Ndola, Zambia.
© UNICEF/Karin Schermbrucke

A little over fifteen years later, two out of three strains of polio have been eradicated—one as recently as last year—while the remaining strain is found in just three countries: Afghanistan, Nigeria, and Pakistan. This once widespread disease is on its way to becoming only the second human disease to be eradicated, after smallpox, which once killed millions annually. That feat, accomplished only in 1979, was also a multinational effort led by the U.N., even involving Cold War rivals the United States and the Soviet Union.

Even now, the much-maligned WHO actively monitors the entire world for “acute public health events” or other health emergencies of concern that could portend a future pandemic. As recently as one month ago, the U.N. agency issued an alert and assessment concerning cases of MERS-CoV (a respiratory illness related to COVID-19) in Saudi Arabia. Dozens of other detailed reports have been published over the past year through WHO’s “Disease Outbreak News” service, spanning everything from Ebola in Guinea to “Monkeypox” in the United States. (WHO also has an influenza monitoring network spanning over half the world’s countries, including the U.S.)

Not bad for an agency with an annual budget of slightly over two billion dollars—smaller than that of many large U.S. hospitals. And contrary to popular belief in the U.S., the WHO did in fact move relatively quickly with respect to the COVID-19 pandemic:

On 31 December 2019, WHO’s China office picked up a media statement by the Wuhan Municipal Health Commission mentioning viral pneumonia. After seeking more information, WHO notified partners in the Global Outbreak Alert and Response Network (GOARN), which includes major public health institutes and laboratories around the world, on 2 January. Chinese officials formally reported on the viral pneumonia of unknown cause on 3 January. WHO alerted the global community through Twitter on 4 January and provided detailed information to all countries through the international event communication system on 5 January. Where there were delays, one important reason was that national governments seemed reluctant to provide information

Of course, it goes without saying that the WHO, and global institutions generally, have their shortcomings and failings (as I previously discussed). But much of that stems from structural weaknesses imposed by the very governments that criticize these international organizations in the first place:

WHO also exemplifies the reluctance of member states to fully trust one another. For example, member states do not grant WHO powers to scrutinise national data, even when they are widely questioned, or to conduct investigations into infectious diseases if national authorities do not agree, or to compel participation in its initiatives. Despite passing a resolution on the need for solidarity in response to covid-19, many member states have chosen self-centred paths instead. Against WHO’s strongest advice, vaccine nationalism has risen to the fore, with nations and regional blocks seeking to monopolise promising candidates. Similarly, nationalistic competition has arisen over existing medicines with the potential to benefit patients with covid-19. Forgoing cooperation for selfishness, some nations have been slow to support the WHO organised common vaccine development pool, with some flatly refusing to join.

The tensions between what member states say and do is reflected in inequalities in the international governance of health that have been exploited to weaken WHO systematically, particularly after it identified the prevailing world economic order as a major threat to health and wellbeing in its 1978 Health for All declaration. WHO’s work on a code of marketing of breastmilk substitutes around the same time increased concern among major trade powers that WHO would use its health authority to curtail private industry. Starting in 1981, the US and aligned countries began interfering with WHO’s budget, announcing a policy of “zero growth” to freeze the assessed contributions that underpinned its independence and reorienting its activities through earmarked funds. The result is a WHO shaped by nations that can pay for their own priorities. This includes the preference that WHO focus on specific diseases rather than the large social, political, and commercial determinants of health or the broad public health capacities in surveillance, preparedness, and other areas needed for pandemic prevention and management

In fact, it was this prolonged period of chronic underfunding, and of WHO member states prioritizing nonemergency programs, that precipitated the agency’s abysmal failings in the early phases of the 2014 Ebola outbreak. But once that crisis ended, member states, rather than defund or abandon the organization, opted to reform and strengthen its emergency functions; this overhaul resulted in the Health Emergencies Program, which was tested by the pandemic and, as the timeline quoted above shows, has thus far proven relatively robust.


I know I am digressing into a defense of WHO, but that ties into the wider problem of too many governments and their voters believing that global governance is ineffective at best and harmfully dysfunctional at worst. We Americans, in particular, as constituents of the richest country in the world, have more sway than any other society over how institutions like the U.N. function—or indeed whether they are even allowed to function.

As our progress with polio, smallpox, and many other diseases makes clear, what many Americans decry as “globalism” is actually more practical and effective than we think, and increasingly more relevant than ever. We fortunately have many potential outbreaks that never happened to prove it.

Map: How Nuclear Powers Pledge to Use Their Nukes

The world has been fortunate to have seen nuclear weapons used in war against only one nation, nearly eighty years ago, during the waning days of the Second World War (of course, this is small comfort to the hundreds of thousands of victims in Hiroshima and Nagasaki).

This is all the more surprising considering we now have nine countries with nuclear weapons, some of which have been governed by certifiable mass murderers (e.g., Stalin and Mao) or by men with questionable moral positions on ordering nuclear strikes (e.g., Nixon). One would think sheer probability would have produced at least one accidental launch by now (indeed, there have been several close calls).

This got me wondering how this select group of nuclear-armed countries approach the weighty issue of using their nukes against another nation. The most recent and reliable source I could find is a 2018 article from the Council on Foreign Relations, which offers a country-by-country breakdown on the “no first use” policy, the position that nukes should never be used first in any conflict but only in retaliation to a nuclear strike.

Based on the article, I made the following map, which shows the distressing rarity of that commitment:

It’s my first map, so I welcome any feedback or suggestions!
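For anyone curious to tinker with their own version, below is a rough sketch of how a similar map could be generated in Python with the plotly library. This is not necessarily how mine was made, and the posture labels are my simplified paraphrases of the CFR country-by-country breakdown, so treat them as illustrative rather than authoritative.

```python
# Rough sketch: a choropleth of declared "no first use" postures using plotly.
# The posture labels are simplified paraphrases of the 2018 CFR breakdown.
import pandas as pd
import plotly.express as px

nfu = pd.DataFrame({
    "country": ["China", "India", "Pakistan", "Russia", "United States",
                "United Kingdom", "France", "Israel", "North Korea"],
    "posture": ["No first use (unconditional)", "No first use (with caveats)",
                "No pledge", "No pledge", "No pledge", "No pledge",
                "No pledge", "No pledge (undeclared arsenal)", "No pledge"],
})

fig = px.choropleth(
    nfu,
    locations="country",
    locationmode="country names",   # match on country names rather than ISO codes
    color="posture",
    title="Declared 'no first use' postures among nuclear-armed states",
)
fig.show()
```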

As explained in the article:

A so-called NFU pledge, first publicly made by China in 1964, refers to any authoritative statement by a nuclear weapon state to never be the first to use these weapons in a conflict, reserving them strictly to retaliate in the aftermath of a nuclear attack against its territory or military personnel. These pledges are a component of nuclear declaratory policies. As such, there can be no diplomatic arrangement to verify or enforce a declaratory NFU pledge, and such pledges alone do not affect capabilities. States with such pledges would be technically able to still use nuclear weapons first in a conflict, and their adversaries have generally not trusted NFU assurances. Today, China is the only nuclear weapon state to maintain an unconditional NFU pledge.

Given that such pledges are not binding, it is odd that more nations do not make them anyway; China’s lone commitment to this stance—which only India comes close to echoing—may not count for much, but clearly it carries enough significance for other nuclear powers to avoid it.

In fact, the United States had previously considered adopting an NFU policy, but has refrained from doing so out of fear that it might indicate insufficient deterrence of foreign threats:

During the Cold War and even today, the credible threat of the United States using its nuclear weapons first against an adversary has been an important component of reassuring allies. At the height of the Cold War, the threat of U.S. tactical nuclear use was conceived of as a critical bulwark against a conventional Soviet offensive through the Fulda Gap, a strategically significant lowland corridor in Germany that would allow Warsaw Pact forces to enter Western Europe. A nuclear first-use policy was thought to be a cornerstone of the defensive posture of the North Atlantic Treaty Organization (NATO), given the large number of bases of Warsaw Pact conventional military forces. Accordingly, NATO has always opposed a U.S. NFU declaration and has never ruled out U.S. first use under its “flexible response” posture since 1967. Today, U.S. allies in East Asia and Europe alike rely on credible commitments from the United States to use nuclear weapons first to deter major nonnuclear threats against them.

I guess these pledges are not so vacuous after all.

The Franco-American Alliance and U.S. Independence

Among the four paintings by John Trumbull prominently displayed in the U.S. Capitol Rotunda is the Surrender of Lord Cornwallis (Trumbull is known as the “Painter of the Revolution” for his many iconic depictions of the war and its era; you’ll recognize many of them if you look him up).

Surrender of Lord Cornwallis, by John Trumbull. Source: Wikimedia Commons

The painting shows the British surrender at Yorktown in 1781, which marked the decisive end of the American Revolution. Flanking the defeated general on one side are Americans carrying the Stars and Stripes, and on the other, French soldiers beneath the banner of France’s monarchy—the two forces portrayed as equal combatants. Trumbull’s decision to show the French and Americans as identical victors reflected widespread acknowledgement that the U.S. owed its independence to the Kingdom of France. (Ironically, the world’s first modern republic was birthed with the help of one of its oldest and most absolute monarchies—more so than Great Britain’s!)

Almost as many French troops took part in the final battle as Americans; one of the two military columns that secured victory was entirely French. Meanwhile, the French Navy had kept British ships from coming to Cornwallis’ aid, prompting him to surrender—and the British to sue for peace. Even this already-critical contribution is just one example of decisive French aid.

Well before the Declaration of Independence, the Founders actively sought an alliance with France: While the French monarchy was everything the revolution stood against—heck, it was more authoritarian than even Britain’s—the Patriots were pragmatic enough to recognize that only the French had both the motive and the means to take on the British, to whom France had lost all its North American colonies just over a decade before, in the Seven Years’ War (to say nothing of centuries of rivalry and mutual enmity). Indeed, France’s foreign minister urged the king to support the Americans, arguing that “[destiny] had marked out this moment for the humiliation of England.”

That is why the Founders pursued a two-year diplomatic mission, led by noted Francophile Benjamin Franklin, to court the French for as much aid and support as possible.


The alliance was not merely opportunistic: Most of the Founders were avid consumers of French political philosophy, which promoted ideals of individual liberty and political representation. As far back as the 1760s, it was trendy for Americans to favor France over their English overlords; as one historian notes, “It became almost a patriotic duty for colonists to admire France as a counterpoise to an increasingly hostile England”. France’s powerful monarchy helped spur many French thinkers to explore better political alternatives—and in the process, inspire Americans across the Atlantic.

Patrick Henry’s famous exhortation, “Give me liberty or give me death!”, which helped convince the colonists to prepare for war, echoed French philosopher Jean-Jacques Rousseau, who opened his influential 1762 work, The Social Contract, with the words “Man is born free and everywhere he is in chains”. Rousseau’s core argument—predating the American Revolution by over a decade—is familiar to us now: Sovereignty rested not in a monarch but in the people, and laws should reflect the common good, not the whims of an aristocratic elite. These ideals were channeled by Thomas Jefferson—another avid reader and noted Francophile—in the language of the Declaration of Independence. The U.S. Constitution may have drawn from the even older work of Baron de Montesquieu, who some forty years earlier had published “The Spirit of the Laws”, which laid out many familiar principles: that the executive, legislative, and judicial functions of government should be separated, so that each branch can keep the others in check; that laws should ensure a fair trial, the presumption of innocence, and proportional punishments; and that people should have freedom of thought, speech, and assembly (he also argued against slavery, though sadly that idea did not take root until much later).

Lafayette (right) depicted alongside George Washington at Valley Forge. John Ward Dunsmore (1907)

In any event, the admiration was mutual: Many French, including those who directly aided and fought in the American Revolution, were reeling under the monarchy and sought change; many of the political philosophers beloved by the Founders, including Rousseau and Montesquieu, faced persecution and even exile for their writings. To many in France, the nascent American republic signified their ideals made real, an experiment they wanted to succeed so it could perhaps be a model to their own efforts. (It is no coincidence that the French Revolution—which was bolder but bloodier than our own—would occur less than two decades after America’s.)

But as important as the ideological support was the practical kind. Even the most noble efforts require money to succeed, and France—then one of the world’s wealthiest countries—provided open-ended credit to the tune of billions of dollars. American troops, who initially lacked even basic goods like boots and winter jackets, received those supplies and more: By some measures, 90% of American gunpowder was of French origin, as was a similar proportion of U.S. armaments at Yorktown.

The Comte de Rochambeau, who is pictured as Washington’s equal in the Surrender of Lord Cornwallis, led the French Expeditionary Force that helped secure American victory—and which remains the only foreign allied force ever to campaign on American soil. Other brilliant Frenchmen like the Marquis de Lafayette, Louis Duportail, and Pierre L’Enfant played leading roles in the war and were personal friends and aides to George Washington (L’Enfant even helped design the nation’s capital). Tens of thousands more French served as soldiers and sailors, with the latter making up the bulk of our naval force.

Beyond the military dimension, France’s diplomatic heft cannot be overstated: As the first country to recognize American independence, it provided considerable legitimacy to the Patriots’ cause; if one of the most powerful countries in the world saw something in these upstart Americans, why shouldn’t other nations? Sure enough, France managed to get other powers like Spain and the Dutch Republic to throw in their lot with the Americans—turning what could have been just another self-contained rebellion into a full-fledged world war that stretched British forces thin. France even helped broker the peace deal that finally secured British recognition of U.S. independence—the “Treaty of Paris”—after refusing Britain’s offer of a separate peace deal without the Americans (a pretty solid ally indeed).

Sources: Wikipedia; Encyclopedia Britannica; History.com, “How Did the French Help Win the American Revolution?”

The First War to End All Wars

Yesterday was an even more devastating anniversary than the bar exam.

On July 28, 1914—exactly one month after the assassination of Archduke Franz Ferdinand—Austria-Hungary declared war on Serbia and the First World War began. Despite directly setting off the war, both nations would soon be overshadowed by the much bigger players they dragged with them: France, Germany, Russia, and the U.K.


After putting up stiff resistance for the first year, Serbia was conquered by the end of 1915 and occupied by Austro-Hungarian forces until the war’s end in 1918. Over 1.1 million Serbs died, including one in four soldiers; that amounted to up to a quarter of the total population and some 60 percent of the male population. Proportionally, Serbia suffered more losses than any other country involved (the Ottoman Empire ranks second in this regard, losing 13-15 percent of its population, followed by Romania at 7-9 percent).

For its part, the weak and declining Austro-Hungarian Empire lost over 2 million people, of whom 120,000 were civilians, amounting to about 4 percent of its total population. Having exhausted itself in its Pyrrhic victory against Serbia, the country barely held itself together throughout the conflict, remaining a peripheral power dependent on German support; indeed, Austria-Hungary would ultimately collapse into several new countries, some of which would join Serbia to form a new multiethnic state called Yugoslavia.

All told, some 8 million fighting men were killed by combat and disease, and 21 million more were wounded. As many as 13 million civilians died as a result of starvation, exposure, disease, military action, and massacres. Four great empires and dynasties—the Hohenzollern, the Habsburg, the Romanov, and the Ottoman—fell, and the intercontinental movement of troops helped fuel the deadliest influenza pandemic in history. The ripple effects of the war, from the Great Depression, to World War II, to the Cold War, continue to be felt today. The war helped usher in the Russian Revolution, and ultimately the Soviet Union, the first major communist government (which ironically would play the pivotal role in helping end the second iteration of the war).


Better known are the grievances engendered by the post-war Versailles Treaty, which helped fuel the desperation and misery that became the Nazis’ stock in trade. Even Japan saw its star rise further as a major world power, joining the Allies and getting a seat at the table as one of the leaders of the post-war League of Nations (no small feat for a non-European country).

In Casualties of History, John Arquilla describes the almost morbidly comical arrogance and stupidity of this meat grinder of a conflict:

“Yes, a second and even more destructive conflict followed all too soon after the “war to end all wars”, impelling a name change from Armistice Day to Veterans Day. And the rest of the 20th century was littered with insurgencies, terrorism, and a host of other violent ills — most of which persist today, guaranteeing the steady production of new veterans, of which there are 22 million in the United States.

But despite the seemingly endless parade of wars waged and fresh conflicts looming just beyond the bloody horizon, World War I still stands out for its sheer horror. Over ten million soldiers died, and more than twice that number were wounded. This is a terrible enough toll. But what makes these casualties stand out even more is their proportion of the total numbers of troops mobilized.

For example, France put about 7.5 million soldiers in the field; one in five died, and three out of four who lived were wounded. All other major combatants on both sides suffered horribly: the Austro-Hungarian Empire’s 6.5 million soldiers had a combined casualty rate of 74 percent. For Britain and Russia, the comparable figures totaled a bit over 50 percent, with German and Turkish losses slightly below one-half of all who served. The United States entered the conflict late, and so the overall casualty rate for the 4.3 million mobilized was “just” 8 percent. Even so, it is more than double the percentage of killed and wounded from the Iraq War, where total American casualties amounted to less than 4 percent of the one million who served.

Few conflicts in all of military history have seen victors and vanquished alike suffer such shocking losses as were incurred in World War I, so it is worth taking time to remember how this hecatomb came to pass. A great body of evidence suggests that this disaster was a product of poor generalship. Historian Alan Clark’s magisterial “The Donkeys” conveys a sense of the incredible stubbornness of high commanders who continued, for years, to hurl massed waves of infantry against machine guns and rapid-firing artillery. All this went on while senior generals stayed far from the front. A British field commander, who went riding daily, even had soldiers spread sand along the country lane he followed, to make sure his horse didn’t slip.

It is little wonder that in the face of Nazi aggression barely a generation later, most of Europe melted away and succumbed to occupation within a year. Most nations did not have the political or public will to endure yet another meat grinder of a conflict; indeed, the major powers could not imagine that anyone would actually want another war after all the bloodletting that had gone before. Perhaps the greatest tragedy of the First World War is that even all that death and destruction failed to stem the hatred, cruelty, and aggression of monstrous men and their millions of supporters and collaborators; in fact, the shortsightedness and vindictiveness of postwar leaders (already evidenced by their callous ineptitude on the battlefield) all but ensured that desperation and humiliation would give the likes of Hitler, Mussolini, and their minions plenty of currency to start an even bloodier war.

Thank goodness that, for now, that has not played out again all these decades later.

Source: Encyclopædia Britannica, Inc./Kenny Chmielewski

A World of Knowledge

It is odd that Americans are so reluctant, if not hostile, when it comes to looking abroad for ideas about how to do things, whether in education, voting methods, or healthcare. The principles and ideas that underpinned this nation’s founding did not emerge from nowhere: They were inspired by, or even directly drawn from, Enlightenment thinkers from across Europe; certain elements of British law and government (ironically), such as the Magna Carta and the English Bill of Rights; and of course the Greeks and Romans, from whom we borrowed specific methods, institutions, terminology, and even architecture. (The U.S. Senate is explicitly inspired by the original Roman Senate, senatus being Latin for “council of elders”.)

Americans make up less than five percent of humanity. The U.S. is one of nearly 200 countries. Its history as a nation, let alone as a superpower, is a relative blink in time; as a point of reference, the Roman-Persian wars lasted over 600 years, nearly three times America’s lifespan. Conversely, many countries are much younger, including most of the world’s democracies, providing fresher or bolder perspectives on certain issues not addressed or contemplated by our more conservative system.

Given all that, it stands to reason that someone, somewhere out there, has done something that we have not thought of or figured out, something worth studying or implementing. It is statistically unlikely that we alone have figured everything out, given our narrow slice of time, people, and experience. The fact that so many innovators, inventors, and other contributors to this country have come from all over the world proves the U.S. has always tacitly accepted the idea that the rest of the world has something to offer.

In fact, this would be in accordance with the vision of most of the nation’s founders, who were far from nationalistic. Their debates, speeches, and correspondence reveal them to have been fairly worldly folks who were open to foreign ideas and perspectives and sought to integrate the country into the international system. From Jefferson’s cherished copy of the Koran, to Franklin’s open Francophilia, to Madison’s insistence that we respect global public opinion and norms, their example shows that the supposed dichotomy between patriotism and internationalism is a false one.

It is all the more ironic because one of the few schools of philosophy to originate in the United States was pragmatism, which emerged in the 1870s and postulated, among other things, that ideas should be judged by their practical effects and benefits (i.e., regardless of their national or foreign origin). It should not matter where our solutions to certain problems come from; what matters is that they are solutions, and thus beneficial to our community, in the first place.

An American Parliament

As the U.S. once again finds itself choosing between two widely unpopular candidates, it is worth reflecting on this 2016 hypothetical from the Economist, a British newspaper: an American parliament made up of parties centered on narrower but more representative ideas.

The Economist’s hypothetical U.S. parliament, based on April 22nd-26th 2016 polling, with 435 seats allocated proportionally by census region (North, Midwest, South, West): Social Democratic Party (left, Bernie Sanders), 26% of the vote, 113 seats; Liberal Party (centre-left, Hillary Clinton), 28%, 124 seats; Conservative Party (centre-right, John Kasich), 8%, 37 seats; Christian Coalition (right, Ted Cruz), 11%, 49 seats; People’s Party (populist, Donald Trump), 26%, 112 seats. Sources: YouGov; CPS; The Economist.

America’s presidential system, along with its winner-take-all elections and Electoral College, tends to lead to gridlock and polarization. These mechanisms and institutions were devised before political parties were a thing—or at least before they were as rigid as they are now—and thus never seriously took them into account. Hence, we are stuck with two big parties that are far from representative of the complex spectrum of policies and ideologies.

Rather than the proportional representation you see above, members of Congress are elected in single-member districts according to the “first-past-the-post” (FPTP) principle, meaning that the candidate with a plurality of votes—not necessarily a majority—wins the congressional seat. The losing party or parties, and by extension their voters, get no representation at all. This tends to produce a small number of major parties, a tendency known in political science as Duverger’s Law.

With the Electoral College, there is a similar dynamic at play: a presidential candidate needs only a plurality of a state’s popular vote to win all of its electors. Some states have considered allocating electors proportionally, but so far only Maine and Nebraska split their electoral votes (and they do so by congressional district rather than proportionally).

This is why you see so many seemingly contradictory interests lumped into one or the other party. In other systems, you may have a party centered on labor rights, another on the environment, yet another for “conventional” left-wing or right-wing platforms, etc. The fragmentation might be messy, but it also forces parties to either appeal to a larger group of voters (so they can have a majority) or form coalitions with other parties to shore up their legislative votes (which gives a voice to smaller parties and their supporters).
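To make the mechanics concrete, here is a minimal sketch of proportional allocation using the largest-remainder method and the approximate vote shares from the Economist graphic above. The Economist allocated seats region by region, so its exact numbers differ slightly; this is just an illustration of the principle, not a reproduction of their model.

```python
# Minimal illustration: allocating the 435 seats proportionally with the
# largest-remainder (Hare quota) method, using approximate vote shares from
# the Economist's April 2016 polling. The Economist allocated seats region
# by region, so its published numbers differ slightly from this naive version.

def largest_remainder(vote_shares, total_seats):
    """Allocate seats in proportion to vote share, handing leftover seats
    to the parties with the largest fractional remainders."""
    total = sum(vote_shares.values())
    quotas = {party: share / total * total_seats for party, share in vote_shares.items()}
    seats = {party: int(quota) for party, quota in quotas.items()}   # whole seats first
    leftover = total_seats - sum(seats.values())
    for party in sorted(quotas, key=lambda p: quotas[p] - seats[p], reverse=True)[:leftover]:
        seats[party] += 1
    return seats

shares = {
    "Social Democratic (Sanders)": 0.26,
    "Liberal (Clinton)": 0.28,
    "Conservative (Kasich)": 0.08,
    "Christian Coalition (Cruz)": 0.11,
    "People's (Trump)": 0.26,
}

print(largest_remainder(shares, 435))
# Under first-past-the-post, by contrast, the plurality winner in each district
# takes that seat outright, so parties polling in the high 20s can sweep nearly
# every seat while an 8-11% party wins none.
```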

Note that this is a huge oversimplification, as literally whole books have been written about all the reasons we are stuck with a two-party system most do not like. And of course, a parliament would not fix all our political problems, which go as deep as our culture and society.

But I personally think we may be better off with a parliamentary-style multiparty system—uncoincidentally the most common in the world, especially among established democracies—than what we have now.

What are your thoughts?

Compulsory Voting

As I see folks share that they voted, I’m reminded of the idea of mandatory voting, in which all eligible citizens are required to vote unless they have a valid excuse.

In ancient Athens, it was seen as the duty of every eligible citizen to participate in politics; while there was no explicit requirement, you could be subject to public criticism or even a fine.

Today, only a few countries require citizens to vote, most of them in Latin America; but of this already small number, only a handful actually enforce it with penalties.

A map of compulsory voting laws, distinguishing countries with no compulsory voting from those that enforce it with no sanctions, minimal sanctions, or costly sanctions. The light blue countries require voting but don’t enforce it. (Source: V-Dem Dataset Version 8 (2018), via Wikimedia)

Moreover, just five of the world’s 35 established democracies have compulsory voting: Australia, Luxembourg, Uruguay, Costa Rica, and Belgium (which has the oldest existing compulsory voting system, dating back to 1893). In Belgium, registered voters must present themselves at their polling station, and while they don’t have to cast a vote, those who fail to at least show up without proper justification can face prosecution and a moderate fine. (To make it easier, elections are always held on Sundays.) If they fail to vote in at least four elections, they can lose the right to vote for 10 years, and might face difficulties getting a job in government (though in practice fines are no longer issued).

The argument for compulsory voting is that democratic elections are the responsibility of citizens—akin to jury duty or paying taxes—rather than merely a right. The idea is that making voting obligatory means all citizens share responsibility for the government they choose; in a sense, it makes the government more legitimate, since it represents the vast majority of people.

The counterargument is that no one should be forced to take part in a process they don’t believe in or otherwise don’t want to be a part of; basically, not voting is itself a form of expression. Unsurprisingly, this view is prevalent in the U.S., where many believe compulsory voting violates freedom of speech because the freedom to speak necessarily includes the freedom not to speak. There is also the risk that many citizens will vote solely because they have to, in total ignorance of the issues or candidates. In some cases, they might deliberately spoil their ballots to slow the polling process and disrupt the election, or vote for frivolous or joke candidates. This is common in Brazil, the largest democracy with mandatory voting, where people have become increasingly cynical about politics, elect joke candidates, or simply choose not to vote despite the penalty.

Some have argued that compulsory elections help prevent polarization and extremism, since politicians have to appeal to a broader base (i.e. the entire electorate). It does not pay to energize your base to the exclusion of all other voters, since elections cannot be determined by turnout alone. This is allegedly one reason Australian politics are relatively more balanced, with strong social policies but also a strong conservative movement.

Finally, there is the claim that making people vote might also make them more interested in politics. It’s been shown that while lots of folks resent jury duty, for example, once they’re on a jury they typically take the process seriously. Similarly, they may hate mandatory voting in theory but in practice will find themselves trying to make the best of it.