The Outer Space Treaty

On this day in 1967, the Outer Space Treaty entered into force, becoming the first effort to establish universal principles and guidelines for activities in outer space. It was created under the auspices of the United Nations based on proposals by the world’s two principal space powers, the United States and Soviet Union.

Naturally, I took the opportunity to improve the Wikipedia article about it, which deserved far better (see the before and after photos below).

It may not be a household name (then again, few treaties are), but the Outer Space Treaty remains one of the most relevant texts in international law today. It is the foundational framework for what we now know as space law, a field that is more important than ever now that dozens of countries and companies are actively involved in space activities.

The Outer Space Treaty forms the basis of ambitious projects such as the International Space Station (the biggest scientific endeavor in history) and the Artemis Program, a U.S.-led international coalition to return humans to the Moon and to ultimately launch crewed missions to Mars and beyond.

The treaty was signed in Washington, Moscow, and London, the capitals of the first three countries to have artificial satellites in space at the time.

The crux of the Outer Space Treaty is its prohibition on placing weapons of mass destruction in space; its broader principles include allowing all nations to freely explore space, limiting space activities to peaceful purposes, preventing any one nation from claiming territory in space, and fostering goodwill and cooperation in space exploration (such as rescuing one another’s astronauts or preventing our space probes from damaging those of others).

I know, I know, it is all quite idealistic. But all things considered, the treaty has held up fairly well: most of the world’s countries, including all the major space powers, have ratified it and abided by its terms (after all, it is in everyone’s self-interest to keep everyone else from putting nukes in space). Naturally, some provisions were written vaguely enough to allow workarounds; space forces, for example, are still permitted so long as they are not armed with WMDs or used belligerently.

The Outer Space Treaty is influential enough to still be referenced by the major space programs, and has enough legitimacy that every government feels the need to at least pay lip service to its terms. Whether this holds up in an ever-intensifying rivalry among both countries and companies is a different story — but it is certainly better than nothing.

The Shanghai Cooperation Organization

It is not a household name like NATO or the European Union, but the milquetoast-sounding Shanghai Cooperation Organization may become one of the most important geopolitical blocs in the world. Iran’s recent entry into the Eurasian alliance has given it a rare spotlight in mainstream Western news media.

Founded two decades ago, the SCO is the world’s largest regional organization, covering three-fifths of the Eurasian continent, nearly half the human population, and one-fifth of global GDP. It originated from a mutual security agreement in the 1990s among Russia, China, and several Central Asian countries (all former Soviet republics), which committed to maintaining “military trust” along their border regions.

But since being formally announced by member governments in Shanghai in 2001, the SCO has become more integrated along political, economic, and even cultural lines, in addition to beefing up military cooperation beyond simply maintaining border security. The fact that the alliance is led by two of America’s chief rivals, and composed mostly of authoritarian countries, certainly adds to its image as the principal counterweight to the Western-led world order.

No doubt Iran’s membership will add to that perception, though it also joins the likes of India and Pakistan, both of which became members in 2017 and are close (if tenuous) partners of the United States and other Western countries.


In fact, many analysts warn that the perception of the SCO as an anti-American or anti-Western bloc is vastly overstated. While it is certainly predicated on the idea of a “multipolar” world—coded language for an international order not dominated by the U.S. specifically—the group is far from presenting itself as anything akin to an “Eastern” NATO:

Rather than major political or economic gains, Iran’s main takeaway from this success in the short term may be limited to a boost in prestige and diplomacy.

The main issue with Iran’s approach towards the SCO is that it looks at it as a “concert of non-Western great powers” rather than a modern international organisation, and views it in an anti-Western or anti-US setting, says Hamidreza Azizi, visiting fellow at the German Institute for International and Security Affairs (SWP).

“This is despite the fact that countries such as Pakistan and India are US’s close partners, and even Russia and China have never been willing to openly challenge the US on the global scene,” Azizi told Al Jazeera.

“The combination of these two misunderstandings, and also Iran’s self-perception as a natural hegemon in West Asia, would make the whole thing appear to the Iranian leaders as Iran joining other anti-Western great powers to form a strong coalition that is going to challenge the US hegemony.”

Azizi added that SCO members are reluctant to entangle themselves in Iran’s rivalries, which may be why, on Friday, they also admitted Saudi Arabia, Qatar and Egypt as “dialogue partners” in a balancing effort.

From a diplomatic perspective, the approval is significant.

Indeed, for a country as diplomatically and economically isolated as Iran, joining such a large and imposing regional body, whatever its limitations, is at least good optics.

A slightly dated map showing SCO members (dark green), observers (light green), and “dialogue partners” (yellow). Source: Wikimedia Commons

The SCO is far from being a full-fledged alliance with formal and binding commitments among its members; there is nothing like NATO’s Article 5, which obligates all members to come to the defense of another member in an attack, nor does it have the level of economic integration of the European Union. As one analyst describes it, the SCO is more of a “venue” for discussion among “high-level dignitaries”—which is perfectly suited for mostly autocratic countries that jealously guard their sovereignty.

Still, many powerful regional blocs like the EU did start from humble beginnings, growing from diplomatic talk shops to fully institutionalized arrangements over the span of decades. A wide array of countries have expressed interest in joining the group or are currently engaged with it in some way, including NATO members like Turkey and strategic partners like Saudi Arabia. It remains to be seen if the SCO will ever become as tightly integrated as its Western counterparts, though this is unlikely given its explicit commitment to nonintervention in members’ affairs—which ironically makes it all the more appealing for certain countries to join.

The Outbreaks That Never Happened and the Unseen Success of Global Institutions

Given all the death and dysfunction resulting from the COVID-19 pandemic, it is worth appreciating the many potential outbreaks that never happened, thanks to the efforts of Kenya, Mozambique, and Niger, alongside the United Nations and other international partners.

In December 2019, just months before the COVID-19 pandemic came into full swing, these nations managed to halt an outbreak of a rare strain of “vaccine-derived polio”, which occurs in places where overall immunization is low and sanitation is inadequate, “leading to transmission of the mutated polio virus”. The feat is all the more commendable given that Niger is among the ten poorest countries in the world.

The fact that polio remains both rare and relatively easy to quash is the result of a U.N.-backed campaign announced in 2005 to immunize 34 million children against the debilitating disease, which often leaves victims permanently disabled. The effort was led by the World Health Organization, the U.N. Children’s Fund (UNICEF), Rotary International, and the United States Centers for Disease Control and Prevention.

A nurse administers an oral poliovirus vaccine (OPV) to a baby at the Kaloko Clinic, Ndola, Zambia.
© UNICEF/Karin Schermbrucke

A little over fifteen years later, two out of three strains of polio have been eradicated (one as recently as last year), while the remaining strain persists in just three countries: Afghanistan, Nigeria, and Pakistan. This once widespread disease is on its way to becoming only the second human disease to be eradicated, after smallpox, which once killed millions annually. That feat, accomplished only in 1979, was also a multinational effort led by the U.N., one that even involved Cold War rivals America and the Soviet Union.

Even now, the much-maligned WHO actively monitors the entire world for “acute public health events” or other health emergencies of concern that could portend a future pandemic. As recently as one month ago, the U.N. agency issued an alert and assessment concerning cases of MERS-CoV (a respiratory illness related to COVID-19) in Saudi Arabia. Dozens of other detailed reports have been published over the past year through WHO’s “Disease Outbreak News” service, spanning everything from Ebola in Guinea to “Monkeypox” in the United States. (WHO also has an influenza monitoring network spanning over half the world’s countries, including the U.S.)

Not bad for an agency with an annual budget of slightly over two billion dollars, smaller than the budgets of many large U.S. hospitals. And contrary to popular belief in the U.S., the WHO did in fact move relatively quickly with respect to the COVID-19 pandemic:

On 31 December 2019, WHO’s China office picked up a media statement by the Wuhan Municipal Health Commission mentioning viral pneumonia. After seeking more information, WHO notified partners in the Global Outbreak Alert and Response Network (GOARN), which includes major public health institutes and laboratories around the world, on 2 January. Chinese officials formally reported on the viral pneumonia of unknown cause on 3 January. WHO alerted the global community through Twitter on 4 January and provided detailed information to all countries through the international event communication system on 5 January. Where there were delays, one important reason was that national governments seemed reluctant to provide information.

Of course, it goes without saying that the WHO, and global institutions generally, have their shortcomings and failings (as I previously discussed). But much of that stems from structural weaknesses imposed by the very governments that criticize these international organizations in the first place:

WHO also exemplifies the reluctance of member states to fully trust one another. For example, member states do not grant WHO powers to scrutinise national data, even when they are widely questioned, or to conduct investigations into infectious diseases if national authorities do not agree, or to compel participation in its initiatives. Despite passing a resolution on the need for solidarity in response to covid-19, many member states have chosen self-centred paths instead. Against WHO’s strongest advice, vaccine nationalism has risen to the fore, with nations and regional blocks seeking to monopolise promising candidates. Similarly, nationalistic competition has arisen over existing medicines with the potential to benefit patients with covid-19. Forgoing cooperation for selfishness, some nations have been slow to support the WHO organised common vaccine development pool, with some flatly refusing to join.

The tensions between what member states say and do is reflected in inequalities in the international governance of health that have been exploited to weaken WHO systematically, particularly after it identified the prevailing world economic order as a major threat to health and wellbeing in its 1978 Health for All declaration. WHO’s work on a code of marketing of breastmilk substitutes around the same time increased concern among major trade powers that WHO would use its health authority to curtail private industry. Starting in 1981, the US and aligned countries began interfering with WHO’s budget, announcing a policy of “zero growth” to freeze the assessed contributions that underpinned its independence and reorienting its activities through earmarked funds. The result is a WHO shaped by nations that can pay for their own priorities. This includes the preference that WHO focus on specific diseases rather than the large social, political, and commercial determinants of health or the broad public health capacities in surveillance, preparedness, and other areas needed for pandemic prevention and management

In fact, it was this prolonged period of chronic underfunding, and of WHO member states prioritizing nonemergency programs, that precipitated the agency’s abysmal failings in the early phases of the 2014 Ebola outbreak. But once that crisis ended, member states, rather than defund or abandon the organization, opted to reform and strengthen its emergency functions; this overhaul resulted in the Health Emergencies Program, which was tested by the pandemic and, as the timeline quoted above shows, has thus far proven relatively robust.

I know I am digressing into a defense of the WHO, but that ties into the wider problem of too many governments and their voters believing that global governance is ineffective at best and harmfully dysfunctional at worst. We Americans in particular, as constituents of the richest country in the world, have more sway than any other society in how institutions like the U.N. function, or indeed whether they are allowed to function at all.

As our progress with polio, smallpox, and many other diseases makes clear, what many Americans decry as “globalism” is actually more practical and effective than we think, and increasingly more relevant than ever. We fortunately have many potential outbreaks that never happened to prove it.

The Duty and Devotion of Albanian Hospitality

Albania, one of the poorest countries in Europe, has committed to taking in up to 4,000 Afghan refugees, among the most of any country in the world and the most in proportion to its population of roughly 2.8 million. Hundreds of Afghans, including roughly 250 children, are being housed in coastal resorts under a clever emergency plan the government developed in response to a devastating 2019 earthquake: when thousands of people were rendered homeless, officials opted to shelter them in the mostly unused space of beach hotels.

Afghan refugees in Albania are being housed in resorts along the Adriatic coast.
Credit: New York Times

Such hospitality is deeply rooted in Albanian culture. The Muslim-majority country is known for its stringent code of generosity and hospitality to anyone and everyone who needs it. Known as besa, which roughly translates to “trust”, “faith”, or “oath”, it commits all Albanians to help people in need regardless of their background or circumstances. As locals explain, the tradition is simple: “If someone needs a place to stay, you give it to them, period”.

While the practice may go back to ancient times, it was first codified in the Kanun, a set of customary laws written in the 15th century to govern the many independent tribes of the region. Within this book is a proverb that sums it up nicely: “Before the house belongs to the owner, it first belongs to God and the guest.” You could knock on the door of any house and ask for help and the owner would have to take you in. The Kanun even advises households to always have a spare bed ready at any time, just in case.

Credit: BBC

While besa is a duty that binds all Albanians, there is evidence that they genuinely take pride in hosting guests. There is one anecdote about a town that rebelled against a hotel that was going to be built there; everyone went to town hall and complained, saying people who needed a place to stay could just come knock on their doors.

Perhaps the greatest proof of this tradition came during the Second World War, after which Albania was possibly the only country in Europe to have more Jews than before the Holocaust. Not only did Albanians save nearly their entire Jewish community, they also saved another two thousand or so Jews who had fled to the country, largely resisting the pressure and threats of Axis forces to turn over people in hiding. Had anyone given up their guest, they would bear a great shame that could only be resolved by “cleaning the blood”, meaning taking vengeance against whoever took and harmed their guest (which is one hell of a story idea…).

This is also why the U.S. and Europe rely on Albania to take in folks neither wants, from Iranian and Syrian refugees to Guantanamo detainees deemed innocent but nonetheless distrusted.

The Only Woman Executed in the French Revolution for Her Politics


On this day in 1793, French playwright, journalist, and outspoken feminist Olympe de Gouges (born Marie Gouze) was executed, two years after publishing the Declaration of the Rights of Woman and of the Female Citizen, a work that sought to expose the failures of the French Revolution to recognize gender equality.

Initially hopeful that the French Revolution would usher in equality between men and women, Gouges became disenchanted upon discovering that the key revolutionary tenet of égalité would not be extended to women. In 1791, in response to the Declaration of the Rights of Man and of the Citizen (an otherwise seminal work in human rights), she wrote a counter-declaration that proposed full legal, social, and political equality between men and women. She also published her treatise, Social Contract, named after the famous work of Enlightenment thinker Jean-Jacques Rousseau, calling for marriage based upon gender equality.

Even before the revolution, Gouges was well ahead of her time both ideologically and professionally. She dared to write plays and publish political pamphlets at a time when women were denied full participation in public and political life. After releasing a play critical of slavery, she was widely denounced and even threatened, both for her anti-slavery stance and for involving herself in the male-dominated profession of theatre in the first place. Gouges remained defiant: “I’m determined to be a success, and I’ll do it in spite of my enemies”. Unfortunately, threats and outright sabotage from the slavery lobby forced the theatre to abandon her play after just three days.

Heck, even her name was an act of defiance against prevailing social norms, as explained by Columbia College:

…Gouges took on her mother’s middle name, changed the spelling of her father’s and added the aristocratic “de.” Adding to this already audacious gesture, the name “Gouges” may also have been a sly and provocative joke. The word “gouge” in Occitan was an offensive slang term used to refer to lowly, bawdy women.

Unsurprisingly, once the French Revolution came into full swing, Gouges wasted no time in seizing the moment. Aside from her already-bold feminist views, she vigorously supported a range of policies and rights that proved radical even for the revolution:

She produced numerous broadsides and pamphlets between 1789 and 1792 that called for, among other things, houses of refuge for women and children at risk; a tax to fund workshops for the unemployed; the legitimation of children born out of wedlock; inheritance equality; the legalization and regulation of prostitution; the legalization of divorce; clean streets; a national theater and the opening of professions to everyone regardless of race, class or gender. She also began to sign her letters “citoyenne,” the feminine version of the conventional revolutionary honorific “citoyen.”

Gouges’ opposition to the revolution’s growing and bloody radicalism, and her support for a constitutional monarchy, put a target on her back. Above all, she openly disliked Maximilien Robespierre, in effect the most powerful man in the country, going so far as to use the informal tu when addressing him in an open letter. This proved the last straw: she was tried, convicted, and executed for treason, one of only three women to be executed during the Reign of Terror, and the only one executed for her politics.

Nonetheless, Gouges’ legacy lived on for decades, influencing women’s rights movements across Europe and North America: the 1848 Seneca Falls Convention in New York—the first convention dedicated to women’s rights—based its “Declaration of Sentiments” on her “Declaration of the Rights of Woman”. 

Map: How Nuclear Powers Pledge to Use Their Nukes

The world has been fortunate to only see nukes used aggressively against one nation, nearly eighty years ago, during the waning days of the Second World War (of course this is small comfort to the hundreds of thousands of victims in Hiroshima and Nagasaki).

This is all the more surprising considering we now have nine countries with nuclear weapons, some of which have been governed by certifiable mass murderers (e.g., Stalin and Mao) or by men with questionable moral positions on ordering nuclear strikes (e.g., Nixon). One would think sheer probability would have resulted in at least one accidental launch (of which we have had several close calls).

This got me wondering how this select group of nuclear-armed countries approaches the weighty issue of using their nukes against another nation. The most recent and reliable source I could find is a 2018 article from the Council on Foreign Relations, which offers a country-by-country breakdown of the “no first use” policy: the position that nukes should never be used first in any conflict, but only in retaliation for a nuclear strike.

Based on the article, I made the following map, which shows the distressing rarity of that commitment:

It’s my first map, so I welcome any feedback or suggestions!

As explained in the article:

A so-called NFU pledge, first publicly made by China in 1964, refers to any authoritative statement by a nuclear weapon state to never be the first to use these weapons in a conflict, reserving them strictly to retaliate in the aftermath of a nuclear attack against its territory or military personnel. These pledges are a component of nuclear declaratory policies. As such, there can be no diplomatic arrangement to verify or enforce a declaratory NFU pledge, and such pledges alone do not affect capabilities. States with such pledges would be technically able to still use nuclear weapons first in a conflict, and their adversaries have generally not trusted NFU assurances. Today, China is the only nuclear weapon state to maintain an unconditional NFU pledge.

Given that such pledges are not binding, it is odd that more nations do not make them anyway; China’s lone commitment to this stance—which only India comes close to echoing—may not count for much, but clearly it carries enough significance for other nuclear powers to avoid it.

In fact, the United States had previously considered adopting an NFU policy, but has refrained from doing so out of fear that it might indicate insufficient deterrence of foreign threats:

During the Cold War and even today, the credible threat of the United States using its nuclear weapons first against an adversary has been an important component of reassuring allies. At the height of the Cold War, the threat of U.S. tactical nuclear use was conceived of as a critical bulwark against a conventional Soviet offensive through the Fulda Gap, a strategically significant lowland corridor in Germany that would allow Warsaw Pact forces to enter Western Europe. A nuclear first-use policy was thought to be a cornerstone of the defensive posture of the North Atlantic Treaty Organization (NATO), given the large number of bases of Warsaw Pact conventional military forces. Accordingly, NATO has always opposed a U.S. NFU declaration and has never ruled out U.S. first use under its “flexible response” posture since 1967. Today, U.S. allies in East Asia and Europe alike rely on credible commitments from the United States to use nuclear weapons first to deter major nonnuclear threats against them.

I guess these pledges are not so vacuous after all.

Forgotten Allies

The contributions of our foreign allies to the Afghanistan War have been overlooked or downplayed throughout the 20-year conflict. But in proportion to their size, many of them committed more troops and funds, and suffered more casualties, than even the U.S.

The 9/11 attacks were the first time NATO invoked Article 5 of its treaty, which enshrines the principle of “collective defense” by recognizing an attack against one ally as an attack against all allies. Thus, all the other 29 members of NATO—along with 21 partner countries ranging from Australia to South Korea—contributed troops, money, and other aid to the war in Afghanistan.

(It is also worth adding that even the typically deadlocked U.N. Security Council resoundingly supported American retaliation, indicating an exceptionally rare degree of international support.)

Besides the U.S., the top five countries to send troops were the United Kingdom, Germany, France, Italy, and Canada. The U.K. in particular supplied roughly two to three times the troops of the other top contributing allies relative to its population.

British and Canadian troops put their lives at risk at twice the rate of American troops, when seen as a percentage of each country’s peak deployment. Proportionally, both suffered more than double the casualties of U.S. forces, while France suffered a similar rate.

As a proportion of their militaries, many smaller countries played an outsized role, with Denmark, Estonia, Georgia, Norway, and North Macedonia ranking near the top after the U.S. and U.K.; consequently, some of these countries suffered the highest fatality rates per capita.

The top contributing allies lost over a thousand lives in U.S.-led conflicts in Afghanistan as well as Iraq; all told, roughly half of all foreign military deaths in Afghanistan were among U.S. allies.

When measured as a percentage of their annual baseline military spending, the U.K. and Canada spent roughly half as much on Afghanistan as the U.S.; relative to their overall economic size, the U.K. spent more than the U.S., while Germany and Canada spent about the same.

This did not have to be our allies’ fight. The likes of Georgia, Norway, and South Korea (among dozens of others) had little to no skin in the game, aside from a broader sense that terrorism could potentially affect them too. And even then, involvement put them at greater risk of retaliation and domestic opposition (as Spain learned the hard way when it lost nearly 200 lives in a terrorist attack perpetrated in response to its participation in Iraq).

The Sadly Prescient Warnings of the United Nations

The United Nations warned about the deteriorating situation in Afghanistan for years, and just three months ago published a report with tragically accurate warnings about the repercussions of a hasty withdrawal. It is a grim reminder that we should pay more attention to international institutions like the U.N., since they benefit from having a large pool of resources from different countries, and are given access that most governments are denied.

The U.N. report stated that the Taliban was trying to demoralize the government, intimidate the populace, and put “major pressure” on and near the capital, “massing forces around key provincial capitals and district centers, enabling them to remain poised to launch attacks”, a scenario we saw play out in barely two weeks.

U.N. observers believed the Taliban were planning their operations around the withdrawal date announced by Trump and Biden, when foreign troops would “no longer [be] able to effectively respond”. The report cautioned that the Afghan military was “in decline” and that our departure “will challenge Afghan Forces by limiting aerial operation with fewer drones and radar and surveillance capabilities, less logistical support and artillery, as well as a disruption in training”. Again, all of this helps explain why the government melted away so quickly.

The U.N. also predicted that the Taliban would target departing foreign troops to “score propaganda points” and believed the group was “closely aligned” with al-Qaeda, with “no indication of breaking ties” despite its efforts to mask those connections. To make matters worse, the U.N. believes the Islamic State may position itself in Afghanistan, which recent news reports suggest is already happening.

While it remains to be seen whether some of the pending predictions come true, the U.N.’s overall conclusion was sadly spot on: “The Afghan Taliban poses a major threat to the survival of the Afghan government, which is likely to substantially grow with the full withdrawal of U.S. forces”.

[Literally one day after I shared the U.N. report on social media, Kabul’s airport was attacked by an Islamic State affiliate, killing over a dozen Americans and scores of Afghans desperately trying to flee. The report had warned of other extremist groups that are or will grow more powerful, often with tacit Taliban support, and that the Taliban would take full advantage of our withdrawal and target departing foreign troops to “score propaganda points”. Sadly, it was once again not too far off the mark.]

I am not sure how many more disasters and tragedies it will take for us to learn to listen to our international partners, many of whom have intelligence networks and resources we lack. One does not have to be a “globalist” to recognize that — the writing was almost literally on the wall.

The First War to End All Wars

Yesterday was an even more devastating anniversary than the bar exam.

On July 28, 1914, exactly one month after the assassination of Archduke Franz Ferdinand, Austria-Hungary declared war on Serbia and the First World War began. Despite directly setting off the war, both nations would soon be overshadowed by the much bigger players they dragged in with them: France, Germany, Russia, and the U.K.


After putting up stiff resistance for the first year, Serbia was conquered by the end of 1915 and occupied by Austro-Hungarian forces until the war’s end in 1918. Over 1.1 million Serbs died, including one out of four soldiers; that amounted to up to a quarter of the population and 60 percent of the country’s men. Proportionally, Serbia suffered more losses than any other country involved (the Ottoman Empire ranks second in this regard, losing 13-15 percent of its people, followed by Romania at 7-9 percent).

For its part, the weak and declining Austro-Hungarian Empire lost over 2 million people, of whom 120,000 were civilians, amounting to about 4 percent of its total population. Having exhausted itself in its Pyrrhic victory against Serbia, the country barely held together throughout the conflict, remaining a peripheral power dependent on German support; indeed, Austria-Hungary would ultimately collapse into several new countries, some of which would join Serbia to form a new multiethnic state called Yugoslavia.

All told, some 8 million fighting men were killed by combat and disease, and 21 million more were wounded. As many as 13 million civilians died as a result of starvation, exposure, disease, military action, and massacres. Four great empires and dynasties—the Hohenzollern, the Habsburg, the Romanov, and the Ottoman—fell, and the intercontinental movement of troops helped fuel the deadliest influenza pandemic in history. The ripple effects of the war, from the Great Depression, to World War II, to the Cold War, continue to be felt today. The war helped usher in the Russian Revolution, and ultimately the Soviet Union, the first major communist government (which ironically would play the pivotal role in helping end the second iteration of the war).


Better known are the grievances engendered by the post-war Versailles Treaty, which helped fuel the desperation and misery that became the Nazis’ stock-in-trade. Even Japan saw its star rise further as a major world power, joining the Allies and getting a seat at the table as one of the leaders of the post-war League of Nations (no small feat for a non-European country).

In Casualties of History, John Arquilla describes the almost morbidly comical arrogance and stupidity of this meat grinder of a conflict:

Yes, a second and even more destructive conflict followed all too soon after the “war to end all wars”, impelling a name change from Armistice Day to Veterans Day. And the rest of the 20th century was littered with insurgencies, terrorism, and a host of other violent ills — most of which persist today, guaranteeing the steady production of new veterans, of which there are 22 million in the United States.

But despite the seemingly endless parade of wars waged and fresh conflicts looming just beyond the bloody horizon, World War I still stands out for its sheer horror. Over ten million soldiers died, and more than twice that number were wounded. This is a terrible enough toll. But what makes these casualties stand out even more is their proportion of the total numbers of troops mobilized.

For example, France put about 7.5 million soldiers in the field; one in five died, and three out of four who lived were wounded. All other major combatants on both sides suffered horribly: the Austro-Hungarian Empire’s 6.5 million soldiers had a combined casualty rate of 74 percent. For Britain and Russia, the comparable figures totaled a bit over 50 percent, with German and Turkish losses slightly below one-half of all who served. The United States entered the conflict late, and so the overall casualty rate for the 4.3 million mobilized was “just” 8 percent. Even so, it is more than double the percentage of killed and wounded from the Iraq War, where total American casualties amounted to less than 4 percent of the one million who served.

Few conflicts in all of military history have seen victors and vanquished alike suffer such shocking losses as were incurred in World War I, so it is worth taking time to remember how this hecatomb came to pass. A great body of evidence suggests that this disaster was a product of poor generalship. Historian Alan Clark’s magisterial “The Donkeys” conveys a sense of the incredible stubbornness of high commanders who continued, for years, to hurl massed waves of infantry against machine guns and rapid-firing artillery. All this went on while senior generals stayed far from the front. A British field commander, who went riding daily, even had soldiers spread sand along the country lane he followed, to make sure his horse didn’t slip.

It is little wonder that in the face of Nazi aggression barely a generation later, most of Europe melted away and succumbed to occupation within a year. Most nations did not have the political or public will to endure yet another meat grinder of a conflict; indeed, the major powers could not imagine that anyone would actually want another war given all the bloodletting that had gone on. Perhaps the greatest tragedy of the First World War was that even all that death and destruction failed to stem the hatred, cruelty, and aggression of monstrous men and their millions of supporters and collaborators; in fact, the shortsightedness and vindictiveness of postwar leaders (as had already been evidenced by their callous ineptitude on the battlefield) all but ensured that desperation and humiliation would give the likes of Hitler, Mussolini, and their minions plenty of currency to start an even bloodier war.

Thank goodness that, for now, history has not played out that way again all these decades later.

Source: Encyclopædia Britannica, Inc./Kenny Chmielewski