On this day in 1791, the Polish-Lithuanian Commonwealth—one of the largest and most powerful countries in Europe—adopted the first written national constitution in Europe, and only the second in the world, after the U.S. Constitution just two years earlier.
Like its counterpart across the Atlantic, Poland’s constitution—titled the Governance Act and known simply as the Constitution of 3 May 1791—was influenced by the Enlightenment, the European intellectual movement that, among other things, pioneered concepts like civil liberty, individual rights, and religious and political tolerance.
Remarkably, despite the vast geographic distance between the two countries, Poland’s constitutional structure was markedly similar to that of America: There were three branches of government—legislative, executive, and judicial—with checks and balances, a bicameral legislature, and a cabinet of ministers. The constitution declared that “all power in civil society [should be] derived from the will of the people” and defined the role of government as ensuring that “the integrity of the states, civil liberty, and social order shall always remain in equilibrium.” While Roman Catholicism was recognized as the “dominant faith”, freedom of religion was guaranteed—a remarkable proposition in a continent where people regularly killed each other for being the wrong kind of Christian or simply holding the wrong doctrine.
The people of Poland-Lithuania were defined not as “subjects” of a king, but as “citizens” with popular sovereignty—a status extended to townspeople and peasants, who in most of Europe had no such recognition. The right to acquire property, hold public office, and join the nobility—whose powers and immunities were restricted—was extended to millions more people, including Jews (who almost everywhere else were denied anything akin to legal recognition, let alone political rights).
The new constitution even introduced a version of habeas corpus—the core legal right that prevents abuse of power—known as Neminem captivabimus, summarized as “We shall not arrest anyone without a court verdict”.
To be clear, the Constitution of 3 May 1791 had its limits, and its radicalism should not be overstated. The monarchy was retained, with the king serving as head of the executive branch. Religious minorities such as Jews, as well as the peasants who made up the vast majority of the population, still had few powers. While constrained, the nobility was not abolished as in the U.S. and later France, and in fact retained many privileges.
But even in these areas, the Commonwealth went farther than almost any other country in the world at the time. The monarchy was not absolute: The king’s powers were constrained by the constitution and essentially shared with a council of ministers, who could overrule his decrees, forcing him to go to parliament. While peasants and Jews had few rights, they now had official protection from abuse—a step closer to recognizing their political rights, and well beyond what was normal at the time. Eligible middle-class people could even join the ranks of the nobility, a seemingly paradoxical form of progress that, again, was unusual for the time; nobles certainly couldn’t ride roughshod over common folk as they did elsewhere in Europe (which isn’t to say there weren’t abuses—this was still feudal Europe, after all).
In any event, the Constitution of 3 May 1791 was a relatively bold and momentous step in the right direction, as evidenced by its rarity at the time—and sadly, by its short existence. In fewer than two years, the Polish-Lithuanian Commonwealth would be extinguished by the absolute monarchies of neighboring Prussia and Russia, which felt threatened by the constitution and the dangerous “revolutionary” ideas it introduced and could spread. Poland would cease to exist for well over a century, its experiment never fully tested—but also never dying off entirely, as the then-ongoing French Revolution and subsequent political reverberations would prove.
This week in 1846 saw the outbreak of one of the most obscure, consequential, and unjust wars in U.S. history: the Mexican-American War, which in two years resulted in the U.S. becoming a continental power at the expense of its weaker southern neighbor—something even American heroes like Abraham Lincoln and Ulysses S. Grant regarded as a grave injustice.
The war began under the equally obscure but history-making presidency of James K. Polk, a one-term president with the rare distinction of having fulfilled all his campaign promises—one of which was expanding U.S. territory to the Pacific.
The problem was that Mexican (and to a lesser extent British) territory was in the way. Beginning with the Louisiana Purchase of 1803, which more than doubled the size of the fledgling republic, there were several overtures to purchase what was then Spanish territory; in the late 1820s, Andrew Jackson made a sustained effort to buy the northern lands of what was by then newly independent Mexico, to no avail.
Meanwhile, Mexico was well aware of its precarious position: Not only was it wracked by political instability and social strife, but it lacked full authority over the rugged, sparsely inhabited lands of the now-American Southwest—especially against the various fiercely independent native tribes that were effectively sovereign. So, in the 1820s, the Mexican government invited Americans to settle and “civilize” the vast, largely empty plains of present-day Texas; among them were men like Stephen F. Austin, the “Father of Texas”, who brought hundreds of “Anglo” families with him.
The rapid influx of Americans soon left Mexicans outnumbered in their own distant territory, which was already thousands of miles from Mexico’s political base in Mexico City. Aside from cultural and linguistic barriers, a major sticking point—surprise—was slavery: Mexico’s constitution had outlawed the practice decades before the U.S. did, but the vast majority of American settlers were slaveowners.
In a macabre foreshadowing of what was to come, disputes over slavery—along with the Mexican government’s effort to impose property taxes on the fiercely independent American immigrants—led Mexico to close the border with the U.S. American slave owners simply continued crossing into Mexico illegally (no need to harp on the irony here).
Escalating matters further, Mexico’s strongman president, Antonio López de Santa Anna, sought to roll back the country’s federal system in favor of centralized power; this upset the quasi-independent “Texans”, and when Santa Anna led an army to rein them in, the Texas Revolution broke out. The Texans, with U.S. support, achieved de facto independence in 1836.
Mexico never recognized this claim—though the U.S. and other foreign powers did—and the borders of the new “Republic of Texas” were subsequently unclear and disputed. So, when America made the controversial move of annexing Texas as a state in 1845—hotly debated in Congress and by the public—this brought the dispute to what was now our border.
After yet another failed attempt to buy Mexican territory, and facing significant opposition to starting a war with America’s only independent neighbor, Polk essentially egged on Mexico to start hostilities first—by sending a military expedition deep into disputed territory. Even Grant, who served in the war despite his opposition to it, claims in his Personal Memoirs (1885) that the main goal was to provoke the outbreak of war without attacking first, thereby hindering domestic opposition to the war.
“The presence of United States troops on the edge of the disputed territory farthest from the Mexican settlements, was not sufficient to provoke hostilities. We were sent to provoke a fight, but it was essential that Mexico should commence it. It was very doubtful whether Congress would declare war; but if Mexico should attack our troops, the Executive could announce, “Whereas, war exists by the acts of, etc.,” and prosecute the contest with vigor. Once initiated there were, but few public men who would have the courage to oppose it. … Mexico showing no willingness to come to the Nueces to drive the invaders from her soil, it became necessary for the “invaders” to approach to within a convenient distance to be struck. Accordingly, preparations were begun for moving the army to the Rio Grande, to a point near Matamoras. It was desirable to occupy a position near the largest centre of population possible to reach, without absolutely invading territory to which we set up no claim whatever.”
After Mexican forces engaged what they saw as American invaders, killing or capturing dozens, Polk made his case for war. Many pro-slavery Democrats supported a declaration of war, while many northern “Whigs” remained staunchly opposed—including a freshman Congressman from Illinois named Abraham Lincoln, who challenged Polk’s assertion that American blood had been shed on American soil as “a bold falsification of history.” Within days, Congress voted to formally declare war against Mexico—one of the few times in history that the U.S. has formally declared war on another country.
Notwithstanding some success on the battlefield, Mexico simply lacked the resources, military experience, and political unity to defend itself against superior American forces. Once its capital was occupied—along with most other major cities—it was clear that the U.S. was victorious and could dictate terms—which unsurprisingly included annexing the northern territories the U.S. had long sought.
(There was actually an “All of Mexico Movement” that sought to take the entirety of Mexico, but it fell apart due in large part to racist concerns about incorporating the millions of Indian and mixed-race people, deemed “inferior,” who comprised the majority of the country’s population.)
In the peace treaty that followed, Mexico ceded to the United States the present-day states of California, Nevada, and Utah, most of New Mexico, Arizona and Colorado, and parts of Texas, Oklahoma, Kansas, and Wyoming.
In return, Mexico received $15 million—$470 million today—which was less than half the amount the U.S. offered before the war; the U.S. further agreed to assume $3.25 million in debts that the Mexican government owed to U.S. citizens ($102 million today).
Aside from its obvious enrichment of the U.S., the war had a huge impact on American domestic politics: A bloody expansion led to a bitter and polarizing debate about whether America was fulfilling its “Manifest Destiny” as an enlightened republic or was instead no different than the imperialist Europeans it claimed to have broken from. Once again, Grant captured the mood in his memoirs:
“For myself, I was bitterly opposed to the measure, and to this day regard the war, which resulted, as one of the most unjust ever waged by a stronger against a weaker nation. It was an instance of a republic following the bad example of European monarchies, in not considering justice in their desire to acquire additional territory.”
The already-violent debate over slavery came to a head as both sides debated which of these vast territories should be “free” or “slave”; it was a cruel irony considering that the war had begun partly because illegal American immigrants insisted on having slaves in an “uncivilized” nation that had long since banned the despicable practice.
In some sense, America’s actions came back to haunt it barely a generation later, when disputes over the fate of former Mexican territory helped push the nation to the boiling point of the American Civil War—a conflict led and fought by many veterans of the Mexican-American War, using tactics and strategies learned in Mexico.
Grant also expressed the view that the war against Mexico had brought punishment on the United States in the form of the American Civil War: “The Southern rebellion was largely the outgrowth of the Mexican war. Nations, like individuals, are punished for their transgressions. We got our punishment in the most sanguinary and expensive war of modern times.”
The Ides of March coin, also known as the Denarius of Brutus or EID MAR, is a rare coin issued by the Roman Republic from 43 to 42 BC to celebrate the assassination of Julius Caesar on March 15, 44 BC.
One side features Marcus Junius Brutus, once a close friend of Caesar who, after becoming disillusioned with his autocratic behavior and policies, helped lead his assassination.
The other side depicts a pileus cap between two daggers. The pileus cap was a Roman symbol of freedom and was often worn by recently freed slaves (it is still used in the coats of arms of several republics and in revolutionary art and propaganda); the daggers, of course, represent the assassins’ weapons. At the bottom is EID MAR, short for Eidibus Martiis—“on the Ides of March”—the date Caesar was assassinated.
The coins were minted under the auspices of Brutus during the “Liberators’ Civil War” that followed Caesar’s death; they were likely intended as a form of propaganda, lending official legitimacy to an assassination that, contrary to the assassins’ hopes, was not supported by the majority of Romans.
Given its brief and minimal use, the coin is considered one of the rarest in the world.
Fun fact: The Ides of March coin is a type of “denarius”, a nickel-sized silver coin that was standard Roman currency for about four centuries. It is the root of the word for “money” in several Mediterranean countries, including Spain (dinero), Italy (denaro), Slovenia (denar), and Portugal (dinheiro), and also survives in the Arabic word “dinar”, the name of the official currencies of several Arab countries, including Algeria, Tunisia, and Syria (all Mediterranean) and farther-off places like Kuwait and Iraq.
On this day in 1922, a dying 14-year-old named Leonard Thompson received the first purified dose of insulin for his diabetes at Toronto General Hospital in Canada.
Barely six months before Thompson received his life-saving dose, a team of researchers led by his doctor, Frederick Banting of the University of Toronto, discovered that a hormone known as insulin regulates blood sugar, successfully isolating it to treat humans. (As is common with such groundbreaking work, Banting’s colleagues came from various countries and were building on the research of German and Romanian scientists.)
Though widely seen as a modern disease (and it is indeed more common today), diabetes is one of the oldest known scourges of humanity; it is described in Egyptian and Indian medical records from well over 2,000 years ago. In the 19th century, a 10-year-old child with Type 1 diabetes would typically live for just another year; now, thanks to discoveries like insulin, people with Type 1 diabetes can expect to live almost 70 more years.
Until Banting’s achievement, the recommended treatment for Type 1 diabetes was a near-starvation diet, in order to keep sugar from accumulating in the blood. Thompson was just 65 pounds, and probably days from death, before Banting injected him with insulin; another round of shots successfully stabilized his blood sugar levels—and spared him and countless others from enduring such a long, painful, and dangerous treatment.
Banting rightfully won the Nobel Prize in Medicine the following year, along with Scottish team member John James Rickard Macleod. (At age 32, Banting remains the youngest Nobel laureate in the field). Believing that his colleague Charles Herbert Best also deserved recognition as a co-discoverer, the humble Canadian doctor shared his prize money with him.
But more telling of Banting’s character and contributions to humanity was what he did with this groundbreaking—and potentially lucrative—accomplishment: He refused to patent it and make a profit even after being offered $1 million and royalties for the formula. Banting believed that the Hippocratic Oath prohibited him from profiting off such lifesaving treatment, stating that “insulin belongs to the world, not to me”. His co-laureate Macleod likewise turned down the opportunity.
Thus, it was Banting’s teammates Best and James Collip, a Canadian biochemist, who were officially named as inventors in the patent application—but they immediately transferred all rights to their insulin formula to the University of Toronto for just one dollar. All these men believed that insulin should be made as widely available as possible, without any barriers such as cost—something quaint by today’s standards, where the costs of the four leading types of insulin in the U.S. have more than tripled over the past decade, to roughly $250 a vial (some patients need two to four vials a month).
No doubt, Banting and his colleagues would be spinning in their graves.
On this day in 1961, former Nazi leader Adolf Eichmann—one of the key perpetrators of the Holocaust—was sentenced to death by an Israeli court after being found guilty on fifteen criminal charges, including war crimes, crimes against humanity, and crimes against the Jewish people. His widely publicized trial helped popularize the infamous defense of many evil men: that they were just cogs in a bigger killing machine, with no choice but to follow orders.
While undoubtedly one of the most sinister figures in history, Eichmann, like many Nazi leaders, had a relatively uninteresting life—he was a college dropout turned traveling oil salesman before joining the Nazi Party in 1932. He rose through the ranks to eventually become head of the “Jewish Department”, which was initially tasked with intimidating Jews, through violence and economic pressure, into leaving Germany, and increasingly all of Europe.
After drafting plans to deport Jews to distant “reservations” such as Madagascar, Eichmann was informed of a “Final Solution to the Jewish question”: rather than expulsion and resettlement, Jews were to be exterminated. This was decided at the Wannsee Conference of January 1942, a meeting of leading Nazi figures chaired by Eichmann’s superior, Reinhard Heydrich—widely considered to be the darkest figure of the regime and the principal architect of the Holocaust.
Eichmann was thereafter charged with facilitating and managing the large-scale logistics of the Holocaust: the mass deportation of millions of Jews to ghettos and extermination camps in Nazi-occupied Eastern Europe during World War II. In essence, he was a faceless administrator of death, tallying the number of Jews in a given area, organizing the seizure and accounting of their stolen property, and ensuring the trains ran on time to take them to certain death. He held regular meetings with staff and conducted inspections and tours of ghettos and camps across Europe, like some regional manager making sure all the stores under his care were running smoothly.
In this sense, Eichmann revealed the morbidly dispassionate and bureaucratic nature of the Holocaust; he was never a leader or even a policymaker, but, like hundreds of thousands involved in the Holocaust, was simply doing his job: keeping the Nazi killing machine well-oiled and efficient.
After the war, Eichmann managed to evade Allied forces through several aliases and connections, before finally settling in Argentina to live the quiet life he had denied so many others. He was captured there by Mossad in 1960—a whole other saga worthy of its own post—and put on trial in Israel.
The trial revealed how normal men could commit and rationalize seemingly abnormal things (like the slaughter of an incalculable number of people). Eichmann defended his actions by simply asserting that he was “just following orders” (dubbed the “Nuremberg Defense” for how often it was invoked by his associates after the war). He insisted he had no authority in the Nazi regime and was bound by his oath to Hitler; since the decision to murder millions was made by the likes of Hitler and Heydrich, he felt completely absolved of guilt. Reflecting on the Wannsee Conference that had set the Holocaust in motion, Eichmann expressed relief and satisfaction that a clear decision had been made by the higher-ups, since it meant the killings were out of his hands.
Even before the trial, investigators had concluded that Eichmann seemed genuinely incapable of grasping the enormity of his crimes, never once showing remorse. During the trial, he admitted to not liking Jews and even seeing them as enemies, but claimed he did not think they needed to be killed. In one of his last statements in court, he admitted to being guilty only of arranging the transports—not of their consequences.
(Eichmann would admit at trial that in 1945 he had stated, “I will leap into my grave laughing because the feeling that I have five million human beings on my conscience is for me a source of extraordinary satisfaction”; however, he wrote this off as simply reflecting his “opinion” at the time.)
In 2016, Eichmann’s written plea for pardon was published, revealing that this steadfast lack of conscience was evidently (and disturbingly) sincere: “There is a need to draw a line between the leaders responsible and the people like me forced to serve as mere instruments in the hands of the leaders. I was not a responsible leader, and as such do not feel myself guilty”.
Eichmann was executed by hanging on June 1, 1962. His last words were reportedly, “Long live Germany. Long live Argentina. Long live Austria. These are the three countries with which I have been most connected and which I will not forget. I greet my wife, my family and my friends. I am ready. We’ll meet again soon, as is the fate of all men. I die believing in God”; it is claimed he later mumbled “I hope that all of you will follow me”.
Eichmann’s trial had a lasting impact on our reflection and understanding of the Holocaust and of human evil as a whole. Perhaps the most famous example comes from Hannah Arendt, who reported on the trial and later wrote a book about it, Eichmann in Jerusalem, where she described him as the embodiment of the “banality of evil”: an otherwise average and mundane person, rather than a fanatic or sociopath, who rationalized his evil actions rather than owning them; who was motivated by advancing his career rather than ideological commitment; and who was simply complacent about what was going on around him.
Jewish Nazi hunter Simon Wiesenthal, who helped capture Eichmann, reflected on the trial:
The world now understands the concept of the “desk murderer”. We know that one doesn’t need to be fanatical, sadistic, or mentally ill to murder millions; that it is enough to be a loyal follower eager to do one’s duty.
The term “little Eichmanns” has since been used to describe people whose actions, on an individual scale, seem relatively harmless even to themselves, but who collectively create destructive and immoral systems in which they are actually complicit—but too far removed to notice, let alone feel responsible.
The Eichmann trial is a disturbing reminder that much of human evil, including the worst atrocities imaginable, is perpetrated or facilitated not by psychopaths or fanatics, but by normal and sometimes even otherwise decent people. It is a cautionary tale for all times, places, and people.
One of my latest Wikipedia projects concerns the Brazilian Expeditionary Force (FEB in Portuguese), a military division of 25,000 men and women that fought with the Allies in World War II.
That’s right: Brazil was an active, and at times significant, participant in humanity’s largest conflict. As early as 1941, the United States and Great Britain actively sought Brazil’s allegiance, owing to its vast resources and strategic location (the Battle of the Atlantic had already been raging for nearly two years, and the country’s coastline was the longest in the Western Hemisphere).
After Brazil agreed to cut diplomatic ties with the Axis, host several major American bases—including the largest overseas airbase—and provide precious natural resources to the Allied cause, Hitler called for a “submarine blitz” against Brazil’s merchant vessels. The loss of three dozen ships and close to 2,000 lives led to Brazil’s formal declaration of war in August 1942.
Brazil thus became the only independent country outside the Western powers to fight in the Atlantic and European theaters. The FEB was deployed to the Italian Campaign, among the most grueling and difficult of the war. They were nicknamed the “Smoking Cobras”—and even had shoulder patches featuring a snake smoking a pipe—based on commenters skeptically noting that the world would sooner see snakes smoking than Brazilian troops on the battlefield (akin to the saying “when pigs fly”).
So, in characteristically Brazilian humor, those “unlikely” troops took that as their mantra. Lacking the resources of the major Allied powers, Brazilian troops were placed under U.S. command and equipped with American weapons and supplies. They mostly saw combat at the platoon level, providing a reprieve for exhausted Allied soldiers who had already been fighting for months.
The FEB performed with distinction across Italy: they scored victories in over a dozen decisive battles, capturing over 20,500 enemy troops, including two generals and almost 900 officers. What the Brazilians lacked in training and experience they more than made up for in tenacity and enthusiasm—allegedly retreating only when they ran out of ammunition. Allies and adversaries alike commented on their bravery and fighting prowess, with one German captain telling his Brazilian captors:
Frankly, you Brazilians are either crazy or very brave. I never saw anyone advance against machine-guns and well-defended positions with such disregard for life … You are devils.
Meanwhile, Brazil’s fledgling air force punched well above its weight, successfully completing 445 missions and 2,550 individual sorties. Despite making up only 5% of the war’s air sorties, they managed to destroy 85% of Axis ammo dumps, 36% of Axis fuel depots, and 28% of Axis transportation infrastructure.
The Brazilian Navy actively participated in the Battle of the Atlantic, defending thousands of merchant marine convoys, engaging Axis naval forces at least 66 times, and taking out over a dozen subs. Aside from its military contribution, Brazil’s abundance of natural resources, from rubber to agricultural products, proved crucial to the Allied war machine. Brazilian forces were considered threatening enough for the Axis to target them with Portuguese propaganda leaflets and radio broadcasts urging them not to fight someone else’s war. It certainly did not help the Axis cause to fight troops that were racially integrated, which even the Allies did not do. (Notice the ethnic composition of the Brazilian units.) The U.S. also produced propaganda informing Americans of Brazil’s contributions. By the end of the war, Brazil had lost around 1,900 men, dozens of merchant vessels, three warships, and 22 fighter aircraft.
While Brazil’s involvement was hardly decisive, it served as an understandable point of pride for its people, who were proud to represent their country on the world stage. It also indicated the country’s growing global prominence, with many seeing Brazil as an up-and-coming power. The U.S. even wanted Brazil to maintain an occupation force in Europe, though its government became reluctant to get too involved overseas.
Belated World AIDS Day post: Although HIV/AIDS remains a scourge of humanity—particularly in its likely place of origin, Africa—we have made tremendous progress in reducing both infections and deaths. Being HIV positive is no longer the death sentence it once was—ironically, the large number of people living with the disease is in part a testament to the success of treatments and of policies to make them widely affordable and accessible (aided in large part by the much-maligned WHO).
Even though #worldaidsday has been used to promote awareness of the disease and mourn those who have died from it since 1988, the global epidemic is far from over.
According to data by @unaidsglobal, more than ten million people with HIV/AIDS don’t currently have access to antiretroviral treatment and the number of new infections with #HIV has remained the same compared to 2019 at roughly 1.5 million. When taking a closer look at the numbers, there are enormous regional differences in terms of battling the epidemic. Eastern and southern Africa, for example, combine for 55 percent of all known HIV/AIDS cases, while reducing new infections by 43 percent between 2010 and 2020. Western and central Africa also saw a decline of 37 percent when comparing 2010 and 2020, although it falls short of the benchmark of 75 percent set by the United Nations General Assembly.
While the number of new infections has dropped from 2.9 million in 2000 to 1.5 million last year, the number of people living with HIV increased from 25.5 million to approximately 37.7 million over the past two decades. According to UNAIDS, the increase is not only caused by new infections, but also a testament to the progress that has been made in treating HIV with antiretroviral therapy, which has vastly improved the outlook of those infected with HIV.
The even more astute data-lovers at Our World in Data vividly convey both the scale of the problem and just how much we have progressed, even in the most hard-hit places:
While in law school, some colleagues and I had the incredible opportunity to meet the hardworking and earnest people at UNAIDS headquarters in Geneva. This unique entity is the first and only one of its kind in the world, combining the personnel and resources of nearly a dozen U.N. agencies to offer a comprehensive response to this pandemic. UNAIDS is also the only such initiative to include civil society organizations in its governing structure.
Since it was launched in 1994, UNAIDS has helped millions of people worldwide get antiretroviral treatment for HIV/AIDS and provided millions more with preventive measures. Thanks to its efforts, and those of its partners across the world, rates of infection and death from HIV/AIDS have stagnated or even declined in many areas, while rates of treatment have increased.
On this day in 1967, the Outer Space Treaty entered into force, becoming the first effort to establish universal principles and guidelines for activities in outer space. It was created under the auspices of the United Nations based on proposals by the world’s two principal space powers, the United States and Soviet Union.
Naturally, I took the opportunity to improve the Wikipedia article about it, which deserved far better. (See the before and after photos below.)
It may not be a household name—then again, few treaties are—but the Outer Space Treaty remains one of the most relevant texts in international law today. It is the foundational framework for what we now know as space law, a legal field that is more relevant than ever now that dozens of countries and companies are actively involved in space activities.
The Outer Space Treaty forms the basis of ambitious projects such as the International Space Station (the biggest scientific endeavor in history) and the Artemis Program, a U.S.-led international coalition to return humans to the Moon and to ultimately launch crewed missions to Mars and beyond.
The crux of the Outer Space Treaty is preventing the placement of weapons of mass destruction in space; its broader principles include allowing all nations to freely explore space; limiting space activities to peaceful purposes; preventing any one nation from claiming territory in space; and fostering goodwill and cooperation in space exploration (such as rescuing one another’s astronauts or preventing our space probes from damaging others).
I know, I know, it is all quite idealistic. But all things considered, the treaty has held up fairly well: Most of the world’s countries, including all the major space powers, have ratified it and abided by its terms (after all, it is in everyone’s self-interest to keep everyone else from putting nukes in space). Naturally, some provisions were written vaguely enough to allow workarounds—for example, space forces are still allowed so long as they are not armed with WMDs and do not act belligerently.
The Outer Space Treaty is influential enough to still be referenced by the major space programs, and has enough legitimacy that every government feels the need to at least pay lip service to its terms. Whether this holds up in an ever-intensifying rivalry among both countries and companies is a different story — but it is certainly better than nothing.
Initially hopeful that the French Revolution would usher in equality between men and women, Gouges became disenchanted upon discovering that the key revolutionary tenet of égalité would not be extended to women. In 1791, in response to the Declaration of the Rights of Man and of the Citizen—an otherwise seminal work in human rights—she wrote a counter-declaration that proposed full legal, social, and political equality between men and women. She also published her treatise, Social Contract, named after the famous work of Enlightenment thinker Jean-Jacques Rousseau, calling for marriage based upon gender equality.
Even before the revolution, Gouges was well ahead of her time both ideologically and professionally. She dared to write plays and publish political pamphlets at a time when women were denied full participation in public and political life. After releasing a play critical of slavery, she was widely denounced and even threatened, both for her anti-slavery stance and for daring to enter the male profession of theatre in the first place. Gouges remained defiant: “I’m determined to be a success, and I’ll do it in spite of my enemies”. Unfortunately, threats and outright sabotage from the slavery lobby forced the theatre to abandon her play after just three days.
…Gouges took on her mother’s middle name, changed the spelling of her father’s, and added the aristocratic “de.” Adding to this already audacious gesture, the name “Gouges” may also have been a sly and provocative joke: the word “gouge” in Occitan was an offensive slang term for lowly, bawdy women.
Unsurprisingly, once the French Revolution came into full swing, Gouges wasted no time in seizing the moment. Aside from her already-bold feminist views, she vigorously supported a range of policies and rights that proved radical even for the revolution:
She produced numerous broadsides and pamphlets between 1789 and 1792 that called for, among other things, houses of refuge for women and children at risk; a tax to fund workshops for the unemployed; the legitimation of children born out of wedlock; inheritance equality; the legalization and regulation of prostitution; the legalization of divorce; clean streets; a national theater; and the opening of professions to everyone regardless of race, class, or gender. She also began to sign her letters “citoyenne,” the feminine version of the conventional revolutionary honorific “citoyen.”
Gouges’ opposition to the revolution’s growing and bloody radicalism, and her support for a constitutional monarchy, put a target on her back. Above all, she openly disliked Maximilien Robespierre, in effect the most powerful man in the country, going so far as to use the informal tu when addressing him in an open letter. This proved the last straw: she was tried, convicted, and executed for treason, making her one of only three women executed during the Reign of Terror, and the only one executed for her politics.
Nonetheless, Gouges’ legacy lived on for decades, influencing women’s rights movements across Europe and North America: the 1848 Seneca Falls Convention in New York—the first convention dedicated to women’s rights—based its “Declaration of Sentiments” on her “Declaration of the Rights of Woman”.
On this day in 1786, the newly minted United States faced its greatest domestic challenge when Daniel Shays led an armed uprising in western Massachusetts, known as “Shays’ Rebellion”, against the government.
Shays was a veteran of the American Revolutionary War who saw combat in several major battles and was wounded in action. Like most people in the fledgling country of roughly three million, he was a subsistence farmer just scraping by; most residents of rural Massachusetts had few assets beyond their land, and often had to rely on bartering or credit from urban merchants.
Like most revolts, there were many complex ideological, political, and economic factors that drove Shays and four thousand others to take up arms against their purportedly representative government. The majority of veterans received little pay for their service, and still had difficulty getting the government to pay up. Compounding this problem were mounting debts to city businessmen and higher taxes by the state government, which happened to be dominated by the same mercantile class. Mounting bankruptcies and repossessions, coupled with the ineffectiveness of the democratic process in addressing these issues, finally boiled over into well-organized efforts to shut down the courts and prevent more “unjust” rulings. Over time, the protests morphed into an outright insurrection that sought the overthrow of the Massachusetts government. Things really came to a head in 1787, when Shays’ rebels marched on the federal Springfield Armory in an unsuccessful attempt to seize its weaponry for their cause. The national government, then governed by the Articles of Confederation, had neither the power nor the ability to finance troops to put down the rebellion; it came down to the Massachusetts state militia and even privately funded local militias to put an end to the rebellion, at the cost of nine lives in total.
Most participants were ultimately pardoned, including Shays himself, who died poor and obscure in 1825. But the legacy of the conflict far outlived its relative blip in history: Though it is still widely debated, the rebellion may have influenced the already-growing calls for the weak Confederation to be replaced by a federal system under a new constitution. Among other things, the event is credited with creating a relatively more powerful executive branch, as it was believed a single president would have a better chance of acting decisively against national threats. Some delegates felt that the uprising proved the masses could not be trusted; the proposed Senate was already designed to be indirectly elected (as it would remain until the early 20th century), but some wanted even the House of Representatives to be removed from the popular vote. Regardless of its effects, if any, on the course of our constitutional and political development, the causes, sentiments, and public debates around Shays’ Rebellion (and the response to it) are no doubt familiar to many of us today; depending on how you look at it, that is either reassuring (things are not so uniquely bad after all) or depressing (things are *still* pretty bad over two centuries later).