After decades of tremendous financial and social costs, the punitive drug model is being steadily eroded at home and abroad. Even the conservative law-and-order types who oppose the use of illicit drugs are increasingly accepting that the war on drugs has failed both in its objective (undercutting drug use) and its efficiency (accomplishing little yet reaping a huge economic and human toll).
Even Mexico, which has suffered more than most nations from our appetite for illegal drugs, has gone forward with legalizing marijuana in an effort to undercut a major source of funding for its powerful and vicious cartels. (So now both of America’s only neighbors have fully done away with punitive attitudes towards one of the weaker and comparatively less harmful illicit substances.)
All that being said, I do feel validated in having proposed and written a paper exploring the alternative methods, policies, and cultural attitudes of various countries when it comes to illegal drugs. As the U.S. and other countries question the wisdom of the status quo, it may help to look abroad at those places that were ahead of the curve in dispensing with the punitive approach in favor of more constructive methods. I focus especially on Portugal, which twenty years ago blazed the trail towards decriminalizing all illegal drugs and framing their use as a public health matter rather than a criminal one.
As you will hopefully read, many of these strategies are unique to the time, place, or sociopolitical context of the nations that implemented them; nevertheless, there are still useful lessons to glean, and at the very least we can see proof that there are other ways to address the scourge of drug addiction, trafficking, and other associated ills, besides the blunt instrument of police and prisons.
Feel free to leave your thoughts, reactions, and feedback. Thanks again for your time.
Albania, one of the poorest countries in Europe, has committed to taking in up to 4,000 Afghan refugees, among the most in the world and the most in proportion to its population (roughly 2.8 million). Hundreds of Afghans, including roughly 250 children, are being housed in coastal resorts under a clever emergency plan the government developed in response to a devastating 2019 earthquake; when thousands of people were rendered homeless, officials opted to shelter them in the mostly unused space of beach hotels.
Such hospitality is deeply rooted in Albanian culture. The Muslim-majority country is known for its stringent code of generosity and hospitality to anyone and everyone who needs it. Known as besa, which roughly translates to “trust”, “faith”, or “oath”, it commits all Albanians to help people in need regardless of their background or circumstances. As locals explain, the tradition is simple: “If someone needs a place to stay, you give it to them, period”.
While the practice may go back to ancient times, it was first codified in the Kanun, a set of customary laws written in the 15th century to govern the many independent tribes of the region. Within this book is a proverb that sums it up nicely: “Before the house belongs to the owner, it first belongs to God and the guest.” You could knock on the door of any house and ask for help and the owner would have to take you in. The Kanun even advises households to always have a spare bed ready at any time, just in case.
While besa is a duty that binds all Albanians, there is evidence that they genuinely take pride in hosting guests. There is one anecdote about a town that rebelled against a hotel that was going to be built there; everyone went to town hall and complained, saying people who needed a place to stay could just come knock on their doors.
Perhaps the greatest proof of this tradition is the Second World War, after which Albania was perhaps the only country to have more Jews than before the Holocaust. Not only did they save nearly their entire Jewish community, but they saved another two thousand or so who had fled to the country. Albanians largely resisted all the pressure and threats by Axis forces to turn over people in hiding. Had anyone given up their guest, they would bear a great shame that could only be solved by “cleaning the blood”—meaning taking vengeance against whoever took and harmed their guest (which is one hell of a story idea…).
This is also why Albania is relied upon by the U.S. and Europe to take in folks neither wants, from Iranian and Syrian refugees, to Guantanamo detainees deemed innocent but nonetheless untrusted.
Initially hopeful that the French Revolution would usher in equality between men and women, Gouges became disenchanted upon discovering that the key revolutionary tenet of égalité would not be extended to women. In 1791, in response to the Declaration of the Rights of Man and of the Citizen—an otherwise seminal work in human rights—she wrote a counter-declaration that proposed full legal, social, and political equality between men and women. She also published her treatise, Social Contract, named after the famous work of Enlightenment thinker Jean-Jacques Rousseau, calling for marriage based upon gender equality.
Even before the revolution, Gouges was well ahead of her time both ideologically and professionally. She dared write plays and publish political pamphlets at a time when women were denied full participation in the public and political space. After releasing a play critical of slavery, she was widely denounced and even threatened for both her anti-slavery stance and being involved in the male profession of theatre in the first place. Gouges remained defiant: “I’m determined to be a success, and I’ll do it in spite of my enemies”. Unfortunately, threats and outright sabotage from the slavery lobby forced the theatre to abandon her play after just three days.
…Gouges took on her mother’s middle name, changed the spelling of her father’s and added the aristocratic “de.” Adding to this already audacious gesture, the name “Gouges” may also have been a sly and provocative joke. The word “gouge” in Occitan was an offensive slang term used to refer to lowly, bawdy women.
Unsurprisingly, once the French Revolution came into full swing, Gouges wasted no time in seizing the moment. Aside from her already-bold feminist views, she vigorously supported a range of policies and rights that proved radical even for the revolution:
She produced numerous broadsides and pamphlets between 1789 and 1792 that called for, among other things, houses of refuge for women and children at risk; a tax to fund workshops for the unemployed; the legitimation of children born out of wedlock; inheritance equality; the legalization and regulation of prostitution; the legalization of divorce; clean streets; a national theater and the opening of professions to everyone regardless of race, class or gender. She also began to sign her letters “citoyenne,” the feminine version of the conventional revolutionary honorific “citoyen.”
Gouges’ opposition to the revolution’s growing and bloody radicalism, and her support for a constitutional monarchy, put a target on her back. Above all, she openly disliked Maximilien Robespierre, in effect the most powerful man in the country, going so far as to use the informal tu when referring to him in an open letter. This proved the last straw; she was tried, convicted, and executed for treason, one of only three women to be executed during the Reign of Terror, and the only one executed for her politics.
Nonetheless, Gouges’ legacy lived on for decades, influencing women’s rights movements across Europe and North America: the 1848 Seneca Falls Convention in New York—the first convention dedicated to women’s rights—based its “Declaration of Sentiments” on her “Declaration of the Rights of Woman”.
Yesterday was an even more devastating anniversary than the bar exam.
On July 28, 1914—exactly one month after the assassination of Archduke Franz Ferdinand—Austria-Hungary declared war on Serbia and the First World War began. Despite directly setting off the war, both nations would soon be overshadowed by the much bigger players they dragged with them: France, Germany, Russia, and the U.K.
After putting up stiff resistance for the first year, Serbia was conquered by the end of 1915 and occupied by Austro-Hungarian forces until the war’s end in 1918. Over 1.1 million Serbs died, including one out of four troops, up to a quarter of the population and 60 percent of men; proportionally, Serbia suffered more losses than any other country involved (the Ottoman Empire ranks second in this regard, losing 13-15 percent of people, followed by Romania at 7-9 percent).
For its part, the weak and declining Austro-Hungarian Empire lost over 2 million people, of whom 120,000 were civilians, amounting to about 4 percent of its total population. Having exhausted itself in its pyrrhic victory against Serbia, the country barely kept it together throughout the conflict, remaining a peripheral power dependent on German support; indeed, Austria-Hungary would ultimately collapse into several new countries, some of which would join Serbia to form a new multiethnic state called Yugoslavia.
All told, some 8 million fighting men were killed by combat and disease, and 21 million more were wounded. As many as 13 million civilians died as a result of starvation, exposure, disease, military action, and massacres. Four great empires and dynasties—the Hohenzollern, the Habsburg, the Romanov, and the Ottoman—fell, and the intercontinental movement of troops helped fuel the deadliest influenza pandemic in history. The ripple effects of the war, from the Great Depression, to World War II, to the Cold War, continue to be felt today. The war helped usher in the Russian Revolution, and ultimately the Soviet Union, the first major communist government (which ironically would play the pivotal role in helping end the second iteration of the war).
Better known are the grievances engendered by the post-war Versailles Treaty, which helped fuel the desperation and misery that became the Nazis’ stock in trade. Even Japan saw its star rise further as a major world power, belatedly joining the Allies and getting a seat at the table as one of the leaders of the post-war League of Nations (no small feat for a non-European country).
In Casualties of History, John Arquilla describes the almost morbidly comical arrogance and stupidity of this meat grinder of a conflict:
“Yes, a second and even more destructive conflict followed all too soon after the “war to end all wars”, impelling a name change from Armistice Day to Veterans Day. And the rest of the 20th century was littered with insurgencies, terrorism, and a host of other violent ills — most of which persist today, guaranteeing the steady production of new veterans, of which there are 22 million in the United States.
But despite the seemingly endless parade of wars waged and fresh conflicts looming just beyond the bloody horizon, World War I still stands out for its sheer horror. Over ten million soldiers died, and more than twice that number were wounded. This is a terrible enough toll. But what makes these casualties stand out even more is their proportion of the total numbers of troops mobilized.
For example, France put about 7.5 million soldiers in the field; one in five died, and three out of four who lived were wounded. All other major combatants on both sides suffered horribly: the Austro-Hungarian Empire’s 6.5 million soldiers had a combined casualty rate of 74 percent. For Britain and Russia, the comparable figures totaled a bit over 50 percent, with German and Turkish losses slightly below one-half of all who served. The United States entered the conflict late, and so the overall casualty rate for the 4.3 million mobilized was “just” 8 percent. Even so, it is more than double the percentage of killed and wounded from the Iraq War, where total American casualties amounted to less than 4 percent of the one million who served.
Few conflicts in all of military history have seen victors and vanquished alike suffer such shocking losses as were incurred in World War I, so it is worth taking time to remember how this hecatomb came to pass. A great body of evidence suggests that this disaster was a product of poor generalship. Historian Alan Clark’s magisterial “The Donkeys” conveys a sense of the incredible stubbornness of high commanders who continued, for years, to hurl massed waves of infantry against machine guns and rapid-firing artillery. All this went on while senior generals stayed far from the front. A British field commander, who went riding daily, even had soldiers spread sand along the country lane he followed, to make sure his horse didn’t slip.
It is little wonder that in the face of Nazi aggression barely a generation later, most of Europe melted away and succumbed to occupation within a year. Most nations did not have the political or public will to endure yet another meat grinder of a conflict; indeed, the major powers could not imagine that anyone would actually want another war given all the bloodletting that had gone before. Perhaps the greatest tragedy of the First World War was the fact that even all that death and destruction failed to stem the hatred, cruelty, and aggression of monstrous men and their millions of supporters and collaborators; in fact, the shortsightedness and vindictiveness of postwar leaders—as had already been evidenced by their callous ineptitude on the battlefield—all but ensured that desperation and humiliation would give the likes of Hitler, Mussolini, and their minions plenty of currency to start an even bloodier war.
Thank goodness that, for now, that has not played out again all these decades later.
Setting aside my own globalist sentiments, it is worth noting that all the top COVID-19 vaccines are products of international collaboration, and a testament to the fruits of globalization.
The Oxford-AstraZeneca vaccine (marketed in some places as Covishield) is the most straightforward example, as it was developed in a partnership between Oxford University in the U.K. and the British-Swedish multinational pharmaceutical company AstraZeneca.
The Pfizer vaccine, which was the first to be confirmed 90% effective, was developed by a German company, BioNTech, founded and led by a Turkish-born married couple of leading immunologists. Pfizer, which was founded in the U.S. by German immigrants, helped provide vital resources for logistics, clinical trials, and manufacturing.
The Johnson & Johnson vaccine, like Pfizer’s, was also developed in Europe with the backing of American resources, by Janssen Vaccines in Leiden, Netherlands, and its Belgian parent company Janssen Pharmaceuticals, a subsidiary of J&J.
Heck, even Russia’s “Sputnik V” vaccine—which was technically the first to be developed—has turned out to be more efficacious than initially believed (much to my own surprise and that of many epidemiologists, apparently).
While the pandemic exposed the many perils of an interconnected world, it has also shown the even greater peril of trying to go it alone when it comes to major challenges and threats that disregard political boundaries and nationalities.
I’m hardly the first or only person to notice this: As long ago as 1851, when the Industrial Era helped rapidly globalize trade, travel, and war—and with them, more rapidly and widely spread diseases—the first of several “International Sanitary Conferences” was convened by the Ottoman Empire to coordinate containment strategies for infectious diseases—even among rivals and former enemies. It was the first time that a formal process of international collaboration was devised for public health; but as we’re learning, it remains even more relevant nearly two centuries later.
Of course, one doesn’t have to be a “globalist” to appreciate the logic of multilateralism (in public health and generally). One study in the medical journal BMJ examining the international response to COVID-19 argues:
The reasons for collaboration remain clear, logical, and have endured essentially unchanged from their original conceptualisation in the 1800s. Three of the most central are as follows. Firstly, the many ties between nations create collective health risks that are difficult to manage independently. The rapid spread of SARS-CoV-2 shows the close connections between countries, and the poorly managed economic and social costs are further evidence of their shared fate. Secondly, sharing knowledge and experience accelerates learning and facilitates more rapid progress. Information and knowledge on pathogens, their transmission, the diseases they provoke, and possible interventions are all areas in which researchers and public health professionals can benefit from the experience of others. Thirdly, agreeing on rules and standards supports comparability of information, helps establish good practices, and underpins shared understanding and mutual trust. All three reasons drive nations to collaborate and are reflected in their creation of WHO, a central authority, and its World Health Assembly (WHA), which serves as a forum for countries to share information, debate issues, and take collective decisions.
Little wonder why, despite the rise of nationalism and insularity (which predate the pandemic but were exacerbated by it), some global survey data suggest that a majority of people believe that more global collaboration would help reduce the impact of COVID-19. Far from idealistic, it is simply pragmatic to throw everything we have at this problem, regardless of which national jurisdiction the resources or knowhow happen to be located in.
I’ll leave the final word to the above-mentioned study in BMJ, which I think makes a sober, evidence-based case for multilateralism, which is all too often treated as Utopian or naïve rather than realistic and practical:
The covid-19 pandemic painfully shows the reasons why nations are better off when they cooperate and collaborate in health, and also reveals the hazards of their incomplete commitment to doing so. Member states have prioritised themselves by restricting WHO from meaningful oversight of national information and endangered global health security by competing for vaccines rather than allocating them equitably. The inability to verify national data or advance its own estimates is just one of the many crucial dimensions in which WHO is prevented from maintaining the primacy of technical competence over the self-interested obfuscations of some member states. WHO’s independence is compromised also through the manipulation of its budget. The patchwork of institutions active in health reflects the limited, ad hoc agreement among powerful countries. Although generally global institutions have performed well in their missions, their often limited mandates leave the world’s people inadequately protected from new threats. In a pandemic, the cost is expressed in lives and livelihoods. More than 10,000 people were dying daily at the end of 2020, and the world economy was forecast to lose $5tn or more in 2020 alone. The imperative of finding collaborative and collective solutions—solidarity—has never been more obvious, or more urgent, for covid-19, climate change, non-communicable diseases, and the many other pressing and grave challenges that hinge on collective action.
Meaningful international collaboration is a critical part of the road ahead and calls for immediate action in three areas. Firstly, member states must end the systematic weakening of WHO—end ad hoc institutional fragmentation in global health and end budgetary manipulation. Secondly, they must support the independence of WHO—increase its core budget and build its authority over trade and travel related issues, including compulsory licensure for pharmaceuticals. Thirdly, states must uphold fairness, participation, and accountability by granting WHO powers to hold members accountable, including for overcoming deficiencies in national data, and by decolonising its governance to address the undue influence of a small number of powerful member states.
Half the ISS—which involves five space agencies and fifteen countries—is Russian-built and operated, and to this day Russia does most of the legwork in launching both crew and cargo. It was a rare and enduring example of cooperation between two erstwhile rivals, an interesting if fragile antidote to the petty politics on the ground. (Scientists and astronauts from both countries get along pretty well and have consistently collaborated even through the worst flareups of tensions and hostility.)
China was never part of the ISS—a notable absence given its hefty financial resources and technical knowledge—due to a controversial NASA policy implemented by Congress in 2011 that excludes any form of cooperation with any Chinese institution or organization. So I imagine its ambitious attempt at a national space station, like so many of its actions abroad, clearly has a triumphalist “We’ll show you!” aspect to it.
But China’s Tiangong, or “Heavenly Palace”, which is set for completion in just a year, will have only one-sixth the mass of the ISS, and roughly a quarter of its habitable space. This isn’t to say it won’t be an impressive feat—especially for a developing country that remains a byword for cheap consumer goods—but its full potential is likely limited given the sheer costs and complexity of building (and regularly maintaining) a human habitat in space.
Meanwhile, Russia’s plans are less clear: Though it holds many records in space stations—including launching the first one, having the most in total, and having the most experience with space walks and the like—it no longer has the financial resources to back this knowhow. (That’s what made the ISS so successful: What Russia lacked in America’s vast resources it made up for with its proven expertise, and vice versa.)
Even the otherwise prideful U.S.—albeit namely its pragmatic scientists at NASA—has now seemingly realized that space is too big, costly, and complex an endeavor for even superpowers to handle.
Aside from being a key founder of the ISS, which was created to replace a planned U.S. station that would have been too costly, NASA plans to return humans to the Moon for the first time in fifty years through the Artemis Program—a decidedly international effort.
While it will be led primarily by NASA and its mostly American commercial contractors, it will include personnel, tech, and resources from Europe, Japan, Canada, Italy, Australia, the United Kingdom, United Arab Emirates, Ukraine, and Brazil. (Believe it or not, those last three do carry a lot of technological heft in space; the UAE has a probe orbiting Mars as we speak, and India is notable for accomplishing many difficult space ventures at fairly low cost.) More countries have been invited and are expected to join.
The Artemis Program not only aims to put humans (including the first woman) on the Moon by 2024, but has the long-term goal of establishing a lunar base that will be a launchpad for crewed missions to Mars.
Surprisingly, all this was promulgated during the tenure of a Trump-appointed, former Oklahoma congressman as NASA Administrator, who explicitly modeled the “Artemis Accords”, which broaden international participation in the program, on the United Nations Outer Space Treaty of 1967 (on which most space law is grounded).
To be sure, neither the Artemis Program, nor the Accords that essentially “internationalize” it, are without their critics. Many international legal scholars see them as a way for America to apply its own self-interested interpretation of space law that permits commercial exploitation of celestial bodies; as The Verge reports:
[The] Outer Space Treaty is pretty vague — purposefully so — which means there is a lot of room for interpretation on various clauses. The goal of the Artemis Accords is to provide a little more clarity on how the US wants to explore the Moon without going through the slow treaty-making process. “We are doing this in keeping with the Outer Space Treaty,” said Bridenstine, adding that NASA is trying to “create a dynamic where the Outer Space Treaty can actually be enforced.”
One big thing NASA wanted to make clear in the accords is that countries can own and use resources that are derived from the Moon. As part of the Artemis program, NASA hopes to extract lunar materials, such as the Moon’s dirt or water ice that’s thought to be lurking in the shadows of lunar craters. The Outer Space Treaty forbids nations from staking claim to another planetary body, but the policy of the US is that countries and companies can own the materials they extract from other worlds. “Article II of the Outer Space Treaty says that you cannot appropriate the Moon for national sovereignty,” Bridenstine said. “We fully agree with that and embrace it. We also believe that, just like in the ocean, you can extract resources from the ocean. But that doesn’t mean you own the ocean. You should be able to extract resources from the Moon. Own the resources but not own the Moon.”
It’s an interpretation of the Outer Space Treaty that not everyone may agree on. A pair of researchers writing in the journal Science last week have called on countries to speak up about their objections to this interpretation, and that the United States should go through the United Nations treaty process in order to negotiate on space mining. “NASA’s actions must be seen for what they are—a concerted, strategic effort to redirect international space cooperation in favor of short-term U.S. commercial interests, with little regard for the risks involved,” the researchers wrote in Science.
Still, the overall substance and spirit of the Accords — which, at just seven pages, make for an easy read — seem like the sensible way forward. I know, I know: count on the internationalist to reach that conclusion! But really, if we want to maximize humanity’s potential in space, we must do so as, well, humans: unified in our resources, knowhow, innovation, and vision. Given how much has been accomplished by just a handful of nations on their own — and the number of countries joining the space club grows annually — imagine what a united front can offer?
Given that China and Russia have lunar aspirations of their own—including a joint lunar base that sort of speaks to my point—it will be interesting to see which vision will play out successfully: The Star Trek-style pan-humanist approach, or the more familiar competitiveness and nationalism that characterized the Cold War or even the colonial era.
I was so busy reeling from the results of my cursed Bar Exam that I forgot April 12 was also a much happier occasion: International Day of Human Space Flight, which commemorates the 1961 flight of Russian cosmonaut Yuri Gagarin—the first man to enter outer space and the first to orbit the Earth. He spent 108 minutes aboard the Vostok 1, which was basically one big cannonball with rudimentary, if resourceful, technology.
Gagarin subsequently became the most visible and iconic Russian in the world, a far cry from the dour and disreputable figures that were more familiar to outsiders. His natural charm and friendliness—both personally and in every media spotlight—earned him the moniker “the Smiling Soviet”, as it contradicted the popular image of Russians as gruff and sullen.
How does one become the first human in space, especially as the son of peasants in a country as seemingly blighted as Soviet Russia? After personally enduring the grief and hardship of the Second World War—including having his home occupied by a German officer, and serving in the resistance—Gagarin returned to normal life; he loved math and science in school, and was fascinated with planes, building model aircraft and eventually joining a local flying club. Unsurprisingly, he joined the Soviet Air Force, where his confidence and knack for flying were matched only by his astute technical knowledge; as a youth, he worked in a steel factory and later went to vocational school, learning about industrial work and tractors.
As the Soviet space program went into high gear in the 1960s, Gagarin and other talented pilots were being screened for their fitness and aptitude as “cosmonauts”—something no one had ever been before. (There was only so much we could know about the effect of space travel on a human.)
When it came down to him and 19 other candidates, an Air Force doctor made the following evaluation of him:
Modest; embarrasses when his humor gets a little too racy; high degree of intellectual development evident in Yuri; fantastic memory; distinguishes himself from his colleagues by his sharp and far-ranging sense of attention to his surroundings; a well-developed imagination; quick reactions; persevering, prepares himself painstakingly for his activities and training exercises, handles celestial mechanics and mathematical formulae with ease as well as excels in higher mathematics; does not feel constrained when he has to defend his point of view if he considers himself right; appears that he understands life better than a lot of his friends.
Gagarin was also heavily favored by his peers—even those otherwise competing with him for the glory of first man in space. When the 20 candidates were asked to anonymously vote for which other candidate they would like to see as the first to fly, all but three chose him.
Another favorable factor was, of all things, his short stature (at least partly a product of his rough and impoverished childhood). At just 5’2″, Gagarin could easily fit in the small, rudimentary cockpit of the Vostok 1. (Being the first into space is scary enough—imagine in something that cramped.)
As Valentina Malmy wrote beautifully in the book Star Peace:
He was like a sound amplified by a mountain echo. The traveler is small, but the mountains are great, and suddenly they merge into a single whole. Such was Yuri Gagarin. To accomplish a heroic exploit means to step beyond one’s own sense of self-preservation, to have the courage to dare what today seems unthinkable for the majority. And to be ready to pay for it. For the hero himself, his feat is the limit of all possibilities. If he leaves something “in reserve”, then the most courageous deed thereby moves into the category of work: hard, worthy of all glorification, but — work. An act of heroism is always a breakthrough into the Great Unknown. Even given most accurate preliminary calculations, man enters into that enterprise as if blindfold, full of inner tension.
I can’t wrap my head around being the first person to venture into something as unknown and terrifying as space—to be able to put your thumb up in front of you and see our big planet as small as your fingernail.
Little wonder why Gagarin became such a worldwide celebrity, touring dozens of countries in the years following his fateful flight. The geopolitical implications melted away in the face of this impressive feat, and the man’s genuine charm and affability—this was something all humankind could celebrate.
Of course, this was still the Cold War: As a living symbol of Soviet triumph, Gagarin could not be risked on another spaceflight, given their inherent danger even today, let alone fifty years ago. Ironically, he died unexpectedly just a few years later during a routine training flight, an event subject to much secrecy and rumor (one conspiracy theory is that newly installed Soviet leader Leonid Brezhnev ordered his death due to being overshadowed by the gregarious cosmonaut at public events).
For his part, the “Smiling Soviet” seemed above such politics, notwithstanding his (likely symbolic) stint as a member of the Soviet legislature. As to be expected, being the first man in space really changes you and puts things in perspective; you’re literally looking down on everything you, and all your fellow humans, have ever known. I wonder if it was surreal or even lonely being the only person with that sort of view.
Despite being banned from the U.S. by the Kennedy Administration—perhaps because his popularity among average Americans undermined the competitive spirit of the Space Race—Gagarin was honored by the Apollo 11 crew (ironically the same mission that ended the race in America’s favor). Astronauts Neil Armstrong and Buzz Aldrin left a memorial on the surface of the Moon commemorating him, the first human to venture into outer space, and fellow cosmonaut Vladimir Komarov, the first to die on a space mission. (Another memorial was left by Apollo 15 in 1971 to commemorate the Americans and Russians who died in space.)
Though untimely and cruelly ironic—an expert pilot dying from a routine flight rather than the first space mission—Gagarin is survived by one hell of a legacy: The almost banal regularity of human spaceflight in the 21st century is a testament to his courageous and spirited embrace of the ultimate unknown.
Yesterday was World Water Day, launched by the UN in 1993 to raise awareness about the importance of water both environmentally and for humanity as a whole.
I think our strictly terrestrial species is ill-equipped to truly grasp the significance of water, from its role in generating most of our oxygen, to the fact that most organisms that have ever existed have been aquatic or amphibious.
As a middle class person in a developed part of the world, it is also easy to take for granted just how elusive access to clean water is; for most of human history, most humans died or were sickened (sometimes permanently) by diseases related to dirty water.
While we’ve made tremendous progress over the past century alone, well over a million humans still die annually from water-borne diseases (many of them children), and nearly one out of four people lack the access to clean water that most of us take as a given. The effects of climate change and overexploitation risk depleting an already strained water supply—making World Water Day’s mission of awareness all the more invaluable.
Below is a big data dump concerning all things water, including the progress we’ve made in expanding clean water access, and the challenges that remain in continuing this development while doing so sustainably.
On this day in 1843, A Christmas Carol by English author Charles Dickens was first published (first edition pictured below), arguably influencing Christmas as we know it more than any pagan tradition. In fact, the phrase “Merry Christmas” was popularized by the story!
Dickens was ambiguous about religion; while he was likely a Christian and admired Jesus, he openly disliked rigid orthodoxy, evangelicalism, and organized religion. (He once published a pamphlet opposing the banning of games on the Sabbath, arguing that people had a right to pleasure.)
To that end, A Christmas Carol placed less emphasis on faith and observance and instead focused on family, goodwill, compassion, and joy. Dickens sought to incorporate his more humanist approach to the holiday, constructing Christmas as a family-centered festival that promotes generosity, feasting, and social cohesion. Some scholars have even termed this “Carol Philosophy”.
So when religious and nonreligious folks alike think of loved ones and the “Christmas spirit”, they are basically channeling Dickens’ once-unique take on the holiday. (Though in his time, other British writers had begun to reimagine Christmas as a celebratory holiday, rather than a strictly religious occasion.)