The Outer Space Treaty

On this day in 1967, the Outer Space Treaty entered into force, becoming the first effort to establish universal principles and guidelines for activities in outer space. It was created under the auspices of the United Nations based on proposals by the world’s two principal space powers, the United States and Soviet Union.

Naturally, I took the opportunity to improve the Wikipedia article about it, which deserved greater justice. (See the before-and-after photos below.)

It may not be a household name — then again, few treaties are — but the Outer Space Treaty remains one of the most relevant texts in international law today. It is the foundational framework for what we now know as space law, a legal field that is more relevant than ever now that dozens of countries and companies are actively involved in space activities.

The Outer Space Treaty forms the basis of ambitious projects such as the International Space Station (the biggest scientific endeavor in history) and the Artemis Program, a U.S.-led international coalition to return humans to the Moon and to ultimately launch crewed missions to Mars and beyond.

The treaty was signed in Washington, Moscow, and London, the capitals of the first three countries to have placed artificial satellites in space at the time.

The crux of the Outer Space Treaty is preventing the placement of weapons of mass destruction in space; its broader principles include allowing all nations to freely explore space; limiting space activities to peaceful purposes; preventing any one nation from claiming territory in space; and fostering goodwill and cooperation in space exploration (such as rescuing one another’s astronauts or preventing one’s space probes from damaging those of others).

I know, I know, it is all quite idealistic. But all things considered, the treaty has held up fairly well: Most of the world’s countries, including all the major space powers, have ratified it and abided by its terms (after all, it is in everyone’s self-interest to keep everyone else from putting nukes in space). Naturally, some provisions were written vaguely enough to allow workarounds — for example, space forces are still allowed so long as they are neither armed with WMDs nor belligerent.

The Outer Space Treaty is influential enough to still be referenced by the major space programs, and has enough legitimacy that every government feels the need to at least pay lip service to its terms. Whether this holds up in an ever-intensifying rivalry among both countries and companies is a different story — but it is certainly better than nothing.

The Outbreaks That Never Happened and the Unseen Success of Global Institutions

Given all the death and dysfunction resulting from the COVID-19 pandemic, it is worth appreciating the many potential outbreaks that never happened, thanks to the efforts of Kenya, Mozambique, and Niger, alongside the United Nations and other international partners.

In December 2019, just months before the COVID-19 pandemic came into full swing, these nations managed to halt an outbreak of a rare strain of “vaccine-derived polio”, which occurs in places “where overall immunization is low and that have inadequate sanitation, leading to transmission of the mutated polio virus”. It is all the more commendable given that Niger is among the ten poorest countries in the world.

The fact that polio remains both rare and relatively easy to quash is the result of a U.N.-backed campaign announced in 2005 to immunize 34 million children against the debilitating disease, which often leaves victims permanently disabled. The effort was led by the World Health Organization, the U.N. Children’s Fund (UNICEF), Rotary International, and the United States Centers for Disease Control and Prevention.

A nurse administers an oral poliovirus vaccine (OPV) to a baby at the Kaloko Clinic, Ndola, Zambia.
© UNICEF/Karin Schermbrucker

A little over fifteen years later, two out of three strains of polio have been eradicated—one as recently as last year—while the remaining strain persists in just three countries: Afghanistan, Nigeria, and Pakistan. This once-widespread disease is on its way to becoming only the second human disease to be eradicated, after smallpox, which once killed millions annually. That feat, accomplished only in 1979, was also a multinational effort led by the U.N., even involving Cold War rivals the United States and the Soviet Union.

Even now, the much-maligned WHO actively monitors the entire world for “acute public health events” or other health emergencies of concern that could portend a future pandemic. As recently as one month ago, the U.N. agency issued an alert and assessment concerning cases of MERS-CoV (a respiratory illness related to COVID-19) in Saudi Arabia. Dozens of other detailed reports have been published over the past year through WHO’s “Disease Outbreak News” service, spanning everything from Ebola in Guinea to “Monkeypox” in the United States. (WHO also has an influenza monitoring network spanning over half the world’s countries, including the U.S.)

Not bad for an agency with an annual budget of slightly over $2 billion—smaller than those of many large U.S. hospitals. And contrary to popular belief in the U.S., the WHO did in fact move relatively quickly with respect to the COVID-19 pandemic:

On 31 December 2019, WHO’s China office picked up a media statement by the Wuhan Municipal Health Commission mentioning viral pneumonia. After seeking more information, WHO notified partners in the Global Outbreak Alert and Response Network (GOARN), which includes major public health institutes and laboratories around the world, on 2 January. Chinese officials formally reported on the viral pneumonia of unknown cause on 3 January. WHO alerted the global community through Twitter on 4 January and provided detailed information to all countries through the international event communication system on 5 January. Where there were delays, one important reason was that national governments seemed reluctant to provide information.

Of course, it goes without saying that the WHO, and global institutions generally, have their shortcomings and failings (as I previously discussed). But much of that stems from structural weaknesses imposed by the very governments that criticize these international organizations in the first place:

WHO also exemplifies the reluctance of member states to fully trust one another. For example, member states do not grant WHO powers to scrutinise national data, even when they are widely questioned, or to conduct investigations into infectious diseases if national authorities do not agree, or to compel participation in its initiatives. Despite passing a resolution on the need for solidarity in response to covid-19, many member states have chosen self-centred paths instead. Against WHO’s strongest advice, vaccine nationalism has risen to the fore, with nations and regional blocks seeking to monopolise promising candidates. Similarly, nationalistic competition has arisen over existing medicines with the potential to benefit patients with covid-19. Forgoing cooperation for selfishness, some nations have been slow to support the WHO organised common vaccine development pool, with some flatly refusing to join.

The tensions between what member states say and do is reflected in inequalities in the international governance of health that have been exploited to weaken WHO systematically, particularly after it identified the prevailing world economic order as a major threat to health and wellbeing in its 1978 Health for All declaration. WHO’s work on a code of marketing of breastmilk substitutes around the same time increased concern among major trade powers that WHO would use its health authority to curtail private industry. Starting in 1981, the US and aligned countries began interfering with WHO’s budget, announcing a policy of “zero growth” to freeze the assessed contributions that underpinned its independence and reorienting its activities through earmarked funds. The result is a WHO shaped by nations that can pay for their own priorities. This includes the preference that WHO focus on specific diseases rather than the large social, political, and commercial determinants of health or the broad public health capacities in surveillance, preparedness, and other areas needed for pandemic prevention and management.

In fact, it was this prolonged period of chronic underfunding, and of WHO member states prioritizing nonemergency programs, that precipitated the agency’s abysmal failings in the early phases of the 2014 Ebola outbreak. But once that crisis ended, member states, rather than defund or abandon the organization, opted to reform and strengthen its emergency functions. This overhaul resulted in the Health Emergencies Program, which was tested by the pandemic and has thus far proven relatively robust, as the response timeline quoted above demonstrates.

I know I am digressing into a defense of WHO, but that ties into the wider problem of too many governments and their voters believing that global governance is ineffective at best and harmfully dysfunctional at worst. We Americans, in particular, as constituents of the richest country in the world, have more sway than any society in how institutions like the U.N. function—or indeed whether they are even allowed to function.

As our progress with polio, smallpox, and many other diseases makes clear, what many Americans decry as “globalism” is actually more practical and effective than we think, and more relevant than ever. We fortunately have many potential outbreaks that never happened to prove it.

The Only Woman Executed in the French Revolution for Her Politics

Olympe de Gouges

On this day in 1793, French playwright, journalist, and outspoken feminist Olympe de Gouges (born Marie Gouze) published the Declaration of the Rights of Woman and of the Female Citizen, hoping to expose the failures of the French Revolution to recognize gender equality.

Initially hopeful that the French Revolution would usher in equality between men and women, Gouges became disenchanted upon discovering that the key revolutionary tenet of égalité would not be extended to women. In 1791, in response to the Declaration of the Rights of Man and of the Citizen—an otherwise seminal work in human rights—she wrote a counter-declaration that proposed full legal, social, and political equality between men and women. She also published her treatise Social Contract, named after the famous work of Enlightenment thinker Jean-Jacques Rousseau, calling for marriage based upon gender equality.

Even before the revolution, Gouges was well ahead of her time both ideologically and professionally. She dared to write plays and publish political pamphlets at a time when women were denied full participation in the public and political space. After releasing a play critical of slavery, she was widely denounced and even threatened, both for her anti-slavery stance and for involving herself in the male-dominated profession of theatre in the first place. Gouges remained defiant: “I’m determined to be a success, and I’ll do it in spite of my enemies.” Unfortunately, threats and outright sabotage from the slavery lobby forced the theatre to abandon her play after just three days.

Heck, even her name was an act of defiance against prevailing social norms, as explained by Columbia College:

…Gouges took on her mother’s middle name, changed the spelling of her father’s and added the aristocratic “de.”  Adding to this already audacious gesture, the name “Gouges” may also have been a sly and provocative joke.  The word “gouge” in Occitan was an offensive slang term used to refer to lowly, bawdy women.  

Unsurprisingly, once the French Revolution came into full swing, Gouges wasted no time in seizing the moment. Aside from her already-bold feminist views, she rigorously supported a range of policies and rights that proved radical even for the revolution:

She produced numerous broadsides and pamphlets between 1789 and 1792 that called for, among other things, houses of refuge for women and children at risk;  a tax to fund workshops for the unemployed;  the legitimation of children born out of wedlock;  inheritance equality;  the legalization and regulation of prostitution;  the legalization of divorce;  clean streets;  a national theater and the opening of professions to everyone regardless of race, class or gender.  She also began to sign her letters “citoyenne,” the feminine version of the conventional revolutionary honorific “citoyen.”  

Gouges’ opposition to the revolution’s growing and bloody radicalism, and her support for a constitutional monarchy, put a target on her back. Above all, she openly disliked Maximilien Robespierre, in effect the most powerful man in the country, going so far as to use the informal tu when referring to him in an open letter. This proved the last straw: she was tried, convicted, and executed for treason, one of only three women to be executed during the Reign of Terror, and the only one executed for her politics.

Nonetheless, Gouges’ legacy lived on for decades, influencing women’s rights movements across Europe and North America: the 1848 Seneca Falls Convention in New York—the first convention dedicated to women’s rights—based its “Declaration of Sentiments” on her “Declaration of the Rights of Woman”. 

Map: How Nuclear Powers Pledge to Use Their Nukes

The world has been fortunate to see nukes used aggressively against only one nation, nearly eighty years ago, during the waning days of the Second World War (of course, this is small comfort to the hundreds of thousands of victims in Hiroshima and Nagasaki).

This is all the more surprising considering we now have nine countries with nuclear weapons, some of which have been governed by certifiable mass murderers (e.g., Stalin and Mao) or by men with questionable moral positions on ordering nuclear strikes (e.g., Nixon). One would think sheer probability would have resulted in at least one accidental launch by now (we have certainly had several close calls).

This got me wondering how this select group of nuclear-armed countries approaches the weighty issue of using their nukes against another nation. The most recent and reliable source I could find is a 2018 article from the Council on Foreign Relations, which offers a country-by-country breakdown of the “no first use” policy: the position that nukes should never be used first in any conflict, but only in retaliation to a nuclear strike.

Based on the article, I made the following map, which shows the distressing rarity of that commitment:

It’s my first map, so I welcome any feedback or suggestions!
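
For anyone curious how such a map can be made programmatically, here is a minimal sketch in Python using the plotly graphing library. To be clear, this is an illustration rather than my exact workflow: the stance labels below are my own shorthand for the article’s country-by-country breakdown, not an authoritative dataset.

```python
# A minimal sketch: color the nine nuclear-armed states by their declared
# first-use policy. Requires plotly and pandas (pip install plotly pandas).
# The stance labels are illustrative summaries, not an official dataset.
import pandas as pd
import plotly.express as px

stances = pd.DataFrame({
    "iso3": ["CHN", "IND", "RUS", "USA", "GBR", "FRA", "PAK", "PRK", "ISR"],
    "stance": [
        "Unconditional no first use",   # China, since 1964
        "No first use, with caveats",   # India
        "First use not ruled out",      # Russia
        "First use not ruled out",      # United States
        "First use not ruled out",      # United Kingdom
        "First use not ruled out",      # France
        "First use not ruled out",      # Pakistan
        "First use not ruled out",      # North Korea
        "Undeclared / ambiguous",       # Israel
    ],
})

# ISO-3 codes are plotly's default location mode, so no shapefiles are needed.
fig = px.choropleth(
    stances,
    locations="iso3",
    color="stance",
    title="Declared nuclear first-use policies (illustrative)",
)
fig.write_html("nfu_map.html")  # or fig.show() in a notebook
```

The nice part of this approach is that plotly draws the world basemap for you; dedicated GIS software offers far more control over projections and styling, but for a simple categorical map, a dozen lines will do.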

As explained in the article:

A so-called NFU pledge, first publicly made by China in 1964, refers to any authoritative statement by a nuclear weapon state to never be the first to use these weapons in a conflict, reserving them strictly to retaliate in the aftermath of a nuclear attack against its territory or military personnel. These pledges are a component of nuclear declaratory policies. As such, there can be no diplomatic arrangement to verify or enforce a declaratory NFU pledge, and such pledges alone do not affect capabilities. States with such pledges would be technically able to still use nuclear weapons first in a conflict, and their adversaries have generally not trusted NFU assurances. Today, China is the only nuclear weapon state to maintain an unconditional NFU pledge.

Given that such pledges are not binding, it is odd that more nations do not make them anyway; China’s lone commitment to this stance—which only India comes close to echoing—may not count for much, but clearly it carries enough significance for other nuclear powers to avoid it.

In fact, the United States has previously considered adopting an NFU policy, but has refrained from doing so out of fear that it might signal insufficient resolve to deter foreign threats:

During the Cold War and even today, the credible threat of the United States using its nuclear weapons first against an adversary has been an important component of reassuring allies. At the height of the Cold War, the threat of U.S. tactical nuclear use was conceived of as a critical bulwark against a conventional Soviet offensive through the Fulda Gap, a strategically significant lowland corridor in Germany that would allow Warsaw Pact forces to enter Western Europe. A nuclear first-use policy was thought to be a cornerstone of the defensive posture of the North Atlantic Treaty Organization (NATO), given the large number of bases of Warsaw Pact conventional military forces. Accordingly, NATO has always opposed a U.S. NFU declaration and has never ruled out U.S. first use under its “flexible response” posture since 1967. Today, U.S. allies in East Asia and Europe alike rely on credible commitments from the United States to use nuclear weapons first to deter major nonnuclear threats against them.

I guess these pledges are not so vacuous after all.

The Rise of Killer Drones

Alright, so I am being a bit cheeky here. (Come on, even the big-name media brands use hyperbolic headlines!)

But, buried within a 548-page United Nations report on the Libyan Civil War is a troubling account about an autonomous military drone (specifically an “unmanned aerial vehicle”, or UAV) attacking soldiers without any direct human command.

Described as “a lethal autonomous weapons system”, the drone was powered by artificial intelligence and used by government-backed forces against an enemy militia. According to the report, these fighters “were hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapons systems” and even when they retreated, the drones subjected them to “continual harassment”; no casualties are mentioned.

The alleged killer drone, the Turkish-built Kargu-2

The report further states that the weapon systems “were programmed to attack targets without requiring data connectivity between the operator and the munitions”—in other words, they were “fire and forget” weapons.

However, it is unclear whether the operators designated the targets or the drone selected them “on its own”, so to speak. Either way, some observers already consider it the first attack in history carried out by a drone on its own initiative.

It is worth mentioning that the drone in question is a Kargu-2, a small rotary drone built by a Turkish company closely affiliated with that country’s government. Turkey has emerged as an unlikely pioneer in drone technology: another one of its drones, the larger and better armed Bayraktar TB2, is credited with helping Azerbaijan win its war with Armenia in 2020; after years of literally losing ground against a militarily superior foe, Turkey’s ally gained a decisive edge because of these drones.

Drone strikes — targeting Armenian and Nagorno-Karabakh soldiers and destroying tanks, artillery and air defense systems — provided a huge advantage for Azerbaijan in the 44-day war and offered the clearest evidence yet of how battlefields are being transformed by unmanned attack drones rolling off assembly lines around the world.

The expanding array of relatively low-cost drones can offer countries air power at a fraction of the cost of maintaining a traditional air force. The situation in Nagorno-Karabakh also underscored how drones can suddenly shift a long-standing conflict and leave ground forces highly exposed. […]

“Drones offer small countries very cheap access to tactical aviation and precision guided weapons, enabling them to destroy an opponent’s much-costlier equipment such as tanks and air defense systems,” said Michael Kofman, military analyst and director of Russia studies at CNA, a defense think tank in Arlington, Va.

“An air force is a very expensive thing,” he added. “And they permit the utility of air power to smaller, much poorer nations.”

In Azerbaijan, the videos of the drone strikes have been posted daily on the website of the country’s Defense Ministry, broadcast on big screens in the capital, Baku, and tweeted and retweeted online.

Washington Post

Little wonder that Ukraine is rumored to be seeking these same drones to take back territory controlled by Russian-backed separatists, or that Iraq is considering acquiring some to hunt down ISIS militants and even to shore up gaps in its fledgling air force. (Unsurprisingly, Turkey has seized on the success and prestige of its drone industry by proclaiming itself one of the world’s three leaders in combat drone technology.)

To be sure, the U.S. is still far and away the dominant user of combat drones, due in large part to the massive expense of acquiring and maintaining the highest-end systems. Within a decade it may have up to 1,000 drones at its disposal, well above the fewer than 100 employed by chief rivals China and Russia.

Infographic: The Countries Set To Dominate Drone Warfare (Statista)

Of course, a lot can happen between now and 2028; a technology that was once exclusive to just a handful of nations is now proliferating across the world, thanks to innovations that make drones easier and cheaper to develop, build, and operate. As of 2019, close to 100 countries use military drones — albeit the vast majority for surveillance purposes — up from around 60 a decade earlier. There are at least 21,000 drones in active service worldwide (though the number may be much higher), spanning over 170 different systems; 20 nations are known to have armed, higher-end models.

As to be expected, China and Russia are among the countries with armed drones, but so are the likes of Israel, Iran, Pakistan, and Nigeria. So far, only eleven countries are known to have used drone technology on the battlefield: the U.S., Israel, the U.K., Pakistan, Iraq, Nigeria, Iran, Turkey, Azerbaijan, Russia, and the United Arab Emirates.

Note that most of these countries are not among the wealthiest or most powerful in the world, which can also be said of several more countries currently developing drones. The D.C.-based think tank New America has an excellent up-to-date report on this fast-moving world of drone tech, which includes a number of helpful infographics.

Drones have become accessible enough that they are even utilized by nonstate actors, ranging from paramilitary groups to terrorist organizations and even cartels.

Military drones have come a long way since Israel first used them for surveillance purposes in the 1960s (the U.S. used Israeli-made UAVs to provide intelligence during the Bosnian War of the 1990s, and Israel remains a leading exporter of military drones). Indeed, just a few months after the U.N. report, Israel reportedly used a “swarm of drones” to identify and strike targets in the Gaza Strip—the first time this type of A.I. has been used. These swarms can number in the hundreds, coordinating with one another as they cover far more ground, and far more quickly, than other means. This is no doubt why China is also pioneering this particular type of drone tech, reportedly developing rocket-armed helicopter drones that can overwhelm targets like a swarm of angry bees—with just the push of a faraway button.

Not to be outdone, Russia is also looking to build an “army of robot weapons” backed by Chinese advances in A.I. tech. A report drawing on Pentagon intelligence identified two dozen platforms being developed by the Russian military incorporating some degree of A.I. or autonomy; these include land, air, and sea vehicles, specialized mines, A.I.-powered logistical and training systems, and supposedly even an anthropomorphic robot capable of dual-wielding firearms and driving cars. (This does not even include Russia’s purported edge in hypersonic missiles, which is already engendering yet another arms race among the big powers.)

While a lot of this is no doubt posturing, there is zero doubt that countries of all shapes and sizes are going to pursue this tech and ultimately succeed. There were times when firearms, tanks, and aircraft were cutting-edge tech limited to a handful of great powers; now, even the smallest military forces have them.

Of course, as some hapless Libyan militants can attest, none of that hardware has the potential to go off the rails like A.I. does…

The Rebellion that Shook a Fledgling America

Shays’ forces flee Continental troops at Springfield.

On this day in 1786, the newly minted United States faced its greatest domestic challenge yet when Daniel Shays led an armed uprising, known as “Shays’ Rebellion”, in western Massachusetts against the federal government.

Shays was a veteran of the American Revolutionary War who saw combat in several major battles and was wounded in action. Like most people in the fledgling country of roughly three million, he was a subsistence farmer just scraping by; most residents of rural Massachusetts had few assets beyond their land, and often had to rely on barter or on credit from urban merchants.

Like most revolts, there were many complex ideological, political, and economic factors that drove Shays and four thousand others to take up arms against their purportedly representative government. The majority of veterans had received little pay for their service, and still had difficulty getting the government to pay up. Compounding this problem were mounting debts to city businessmen and higher taxes from the state government, which happened to be dominated by the same mercantile class. Mounting bankruptcies and repossessions, coupled with the ineffectiveness of the democratic process in addressing these issues, finally boiled over into well-organized efforts to shut down the courts and prevent more “unjust” rulings. Over time, the protests morphed into an outright insurrection that sought the overthrow of the Massachusetts government.

Things really came to a head in 1787, when Shays’ rebels marched on the federal Springfield Armory in an unsuccessful attempt to seize its weaponry for their cause. The national government, then governed by the Articles of Confederation, had neither the power nor the means to finance troops to put down the rebellion; it fell to the Massachusetts State militia and even privately funded local militias to put an end to the uprising, at the cost of nine lives in total.

Most participants were ultimately pardoned, including Shays himself, who died poor and obscure in 1825. But the legacy of the conflict far outlived its relative blip in history: Though it is still widely debated, the rebellion may have influenced the already-growing calls for the weak Confederation to be replaced by a federal system under a new constitution. Among other things, the event is credited with inspiring a relatively more powerful executive branch, as it was believed a single president would have a better chance of acting decisively against national threats. Some delegates felt that the uprising proved the masses could not be trusted; the proposed Senate was already designed to be indirectly elected (as it would remain until the early 20th century), but some wanted even the House of Representatives to be removed from the popular vote.

Regardless of its effects, if any, on the course of our constitutional and political development, the causes, sentiments, and public debates around Shays’ Rebellion (and the response to it) are no doubt familiar to many of us today; depending on how you look at it, that is either reassuring (things are not so uniquely bad after all) or depressing (things are *still* pretty bad over two centuries later).

The First War to End All Wars

Yesterday was an even more devastating anniversary than the bar exam.

On July 28, 1914—exactly one month after the assassination of Archduke Franz Ferdinand—Austria-Hungary declared war on Serbia and the First World War began. Despite directly setting off the war, both nations would soon be overshadowed by the much bigger players they dragged in with them: France, Germany, Russia, and the U.K.


After putting up stiff resistance for the first year, Serbia was conquered by the end of 1915 and occupied by Austro-Hungarian forces until the war’s end in 1918. Over 1.1 million Serbs died, including one out of four troops; that amounted to up to a quarter of the population and 60 percent of its men. Proportionally, Serbia suffered more losses than any other country involved (the Ottoman Empire ranks second in this regard, losing 13-15 percent of its population, followed by Romania at 7-9 percent).

For its part, the weak and declining Austro-Hungarian Empire lost over 2 million people, of whom 120,000 were civilians, amounting to about 4 percent of its total population. Having exhausted itself in its Pyrrhic victory against Serbia, the country barely held together throughout the conflict, remaining a peripheral power dependent on German support; indeed, Austria-Hungary would ultimately collapse into several new countries, some of which would join Serbia to form a new multiethnic state called Yugoslavia.

All told, some 8 million fighting men were killed by combat and disease, and 21 million more were wounded. As many as 13 million civilians died as a result of starvation, exposure, disease, military action, and massacres. Four great empires and dynasties—the Hohenzollern, the Habsburg, the Romanov, and the Ottoman—fell, and the intercontinental movement of troops helped fuel the deadliest influenza pandemic in history. The ripple effects of the war, from the Great Depression, to World War II, to the Cold War, continue to be felt today. The war helped usher in the Russian Revolution, and ultimately the Soviet Union, the first major communist government (which ironically would play the pivotal role in helping end the second iteration of the war).


Better known are the grievances engendered by the post-war Versailles Treaty, which helped fuel the desperation and misery that became the Nazis’ stock in trade. Even Japan saw its star rise further as a major world power, belatedly joining the Allies and earning a seat at the table as one of the leaders of the post-war League of Nations (no small feat for a non-European country).

In Casualties of History, John Arquilla describes the almost morbidly comical arrogance and stupidity of this meat grinder of a conflict:

Yes, a second and even more destructive conflict followed all too soon after the “war to end all wars”, impelling a name change from Armistice Day to Veterans Day. And the rest of the 20th century was littered with insurgencies, terrorism, and a host of other violent ills — most of which persist today, guaranteeing the steady production of new veterans, of which there are 22 million in the United States.

But despite the seemingly endless parade of wars waged and fresh conflicts looming just beyond the bloody horizon, World War I still stands out for its sheer horror. Over ten million soldiers died, and more than twice that number were wounded. This is a terrible enough toll. But what makes these casualties stand out even more is their proportion of the total numbers of troops mobilized.

For example, France put about 7.5 million soldiers in the field; one in five died, and three out of four who lived were wounded. All other major combatants on both sides suffered horribly: the Austro-Hungarian Empire’s 6.5 million soldiers had a combined casualty rate of 74 percent. For Britain and Russia, the comparable figures totaled a bit over 50 percent, with German and Turkish losses slightly below one-half of all who served. The United States entered the conflict late, and so the overall casualty rate for the 4.3 million mobilized was “just” 8 percent. Even so, it is more than double the percentage of killed and wounded from the Iraq War, where total American casualties amounted to less than 4 percent of the one million who served.

Few conflicts in all of military history have seen victors and vanquished alike suffer such shocking losses as were incurred in World War I, so it is worth taking time to remember how this hecatomb came to pass. A great body of evidence suggests that this disaster was a product of poor generalship. Historian Alan Clark’s magisterial “The Donkeys” conveys a sense of the incredible stubbornness of high commanders who continued, for years, to hurl massed waves of infantry against machine guns and rapid-firing artillery. All this went on while senior generals stayed far from the front. A British field commander, who went riding daily, even had soldiers spread sand along the country lane he followed, to make sure his horse didn’t slip.

It is little wonder that in the face of Nazi aggression barely a generation later, most of Europe melted away and succumbed to occupation within a year. Most nations did not have the political or public will to endure yet another meat grinder of a conflict; indeed, the major powers could not imagine that anyone would actually want another war given all the bloodletting that had gone around. Perhaps the greatest tragedy of the First World War was that even all that death and destruction failed to stem the hatred, cruelty, and aggression of monstrous men and their millions of supporters and collaborators; in fact, the shortsightedness and vindictiveness of postwar leaders—as had already been evidenced by their callous ineptitude on the battlefield—all but ensured that desperation and humiliation would give the likes of Hitler, Mussolini, and their minions plenty of currency to start an even bloodier war.

Thank goodness that, for now, that has not played out again all these decades later.


A World of Knowledge

It is odd that Americans are so reluctant, if not hostile, to look abroad for ideas about how to do things, such as education, voting methods, and healthcare. The principles and ideas that underpinned this nation’s founding did not emerge from nowhere: They were inspired by, or even directly drawn from, Enlightenment thinkers from across Europe; certain elements of British law and government (ironically), such as the Magna Carta and the English Bill of Rights; and of course the Greeks and Romans, from whom we borrowed specific methods, institutions, terminology, and even architecture. (The U.S. Senate is explicitly inspired by the original Roman Senate, with senatus being Latin for “council of elders.”)

Americans make up less than five percent of humanity. The U.S. is one of nearly 200 countries. Its history as a nation, let alone as a superpower, is a relative blink in time; as a point of reference, the Roman-Persian wars lasted over 600 years, nearly three times America’s lifespan. Conversely, many countries are much younger, including most of the world’s democracies, providing fresher or bolder perspectives on certain issues not addressed or contemplated by our more conservative system.

Given all that, it stands to reason that someone, somewhere out there, has done something that we have not thought of or figured out, something worth studying or implementing. It is statistically unlikely that we are the only people or nation to have all the answers, given our narrow slice of time, humanity, and experience. The fact that so many of this country’s innovators, inventors, and other contributors have come from all over the world proves the U.S. has always tacitly accepted the idea that the rest of the world has something to offer.

In fact, this would be in accordance with the vision of most of the nation’s founders, who were far from nationalistic. Their debates, speeches, and correspondence reveal them to have been fairly worldly folks who were open to foreign ideas and perspectives and sought to integrate the country into the international system. From Jefferson’s cherished copy of the Koran, to Franklin’s open Francophilia, to Madison’s insistence that we respect global public opinion and norms, the supposed dichotomy between patriotism and internationalism is a false one; engaging with the world was never seen as at odds with one’s service to the nation.

It is all the more ironic because one of the few schools of philosophy to originate in the United States is pragmatism, which emerged in the 1870s and postulated, among other things, that people promote ideas based on their practical effects and benefits (i.e., regardless of their national or foreign origin). It should not matter where our solutions to certain problems come from; what matters is that they are solutions, and thus beneficial to our community, in the first place.