The Canadian Doctor Who Discovered Insulin and Gave it to the World for Free

On this day in 1922, a dying 14-year-old named Leonard Thompson received the first purified dose of insulin for his diabetes at Toronto General Hospital in Canada.

Barely six months before Thompson received his life-saving dose, a team of researchers led by his doctor, Frederick Banting of the University of Toronto, had discovered that the hormone insulin regulates blood sugar and successfully isolated it for treating humans. (As is common with such groundbreaking work, Banting’s colleagues came from various countries and were building on the research of German and Romanian scientists.)

Though widely seen as a modern disease (and it is indeed more common today), diabetes is one of the oldest known scourges of humanity; it is described in Egyptian and Indian medical records from well over 2,000 years ago. In the 19th century, a 10-year-old child with Type 1 diabetes would typically live for just another year; now, thanks to discoveries like insulin, people with Type 1 diabetes can expect to live almost 70 years.

Until Banting’s achievement, the recommended treatment for Type 1 diabetes was a near-starvation diet, in order to keep sugar from accumulating in the blood. Thompson weighed just 65 pounds, and was probably days from death, when Banting injected him with insulin; another round of shots successfully stabilized his blood sugar levels—and spared him, and countless others after him, from enduring such a long, painful, and dangerous treatment.

Banting rightfully won the Nobel Prize in Medicine the following year, along with Scottish team member John James Rickard Macleod. (At age 32, Banting remains the youngest Nobel laureate in the field). Believing that his colleague Charles Herbert Best also deserved recognition as a co-discoverer, the humble Canadian doctor shared his prize money with him.

But more telling of Banting’s character and contributions to humanity was what he did with this groundbreaking—and potentially lucrative—accomplishment: He refused to patent it and make a profit even after being offered $1 million and royalties for the formula. Banting believed that the Hippocratic Oath prohibited him from profiting off such lifesaving treatment, stating that “insulin belongs to the world, not to me”. His co-laureate Macleod likewise turned down the opportunity.

Thus, it was Banting’s teammates Best and James Collip, a Canadian biochemist, who were officially named as inventors in the patent application—but they immediately transferred all rights to their insulin formula to the University of Toronto for just one dollar. All these men believed that insulin should be made as widely available as possible, without barriers such as cost—a stance that seems quaint by today’s standards, when the costs of the four leading types of insulin in the U.S. have more than tripled over the past decade, to roughly $250 a vial (some patients need two to four vials a month).

No doubt, Banting and his colleagues would be spinning in their graves.

The ASAT Race

Russia’s anti-satellite (ASAT) test, which took out an old but large Soviet satellite, garnered widespread condemnation for the risk it posed to the International Space Station—and the wider problem of mounting space debris threatening human endeavors in space. But it also reflects yet another battleground among the world’s major powers.

Russia is one of just four countries—along with the U.S., China, and India—to have anti-satellite capabilities; this test exceeded the altitudes of the Indian and American tests, but fell short of China’s. No doubt yet another race is in the works.

In addition to clogging up space with more deadly junk, such technology also serves as a demonstration to rivals: in a highly digital world, the ability to take down satellites can greatly weaken an enemy’s reconnaissance and spying capabilities, as well as disrupt the lives of billions.

Ironically, all four anti-sat players are also competing in the development and/or launching of satellites, which are in greater demand than ever.

The Outer Space Treaty

On this day in 1967, the Outer Space Treaty entered into force, becoming the first effort to establish universal principles and guidelines for activities in outer space. It was created under the auspices of the United Nations based on proposals by the world’s two principal space powers, the United States and Soviet Union.

Naturally, I took the opportunity to improve the Wikipedia article about it, which deserved greater justice (see the before and after photos below).

It may not be a household name—then again, few treaties are—but the Outer Space Treaty remains one of the most relevant texts in international law today. It is the foundational framework for what we now know as space law, a legal field that is more consequential than ever now that dozens of countries and companies are actively involved in space activities.

The Outer Space Treaty forms the basis of ambitious projects such as the International Space Station (the biggest scientific endeavor in history) and the Artemis Program, a U.S.-led international coalition to return humans to the Moon and to ultimately launch crewed missions to Mars and beyond.

The treaty was signed in Washington, Moscow, and London, representing the first three countries to have artificial satellites in space at the time.

The crux of the Outer Space Treaty is preventing the placement of weapons of mass destruction in space; its broader principles include allowing all nations to freely explore space; limiting space activities to peaceful purposes; preventing any one nation from claiming territory in space; and fostering goodwill and cooperation in space exploration (such as rescuing one another’s astronauts or preventing our space probes from damaging others).

I know, I know, it is all quite idealistic. But all things considered, the treaty has held up fairly well: Most of the world’s countries, including all the major space powers, have ratified it and abided by its terms (after all, it is in everyone’s self-interest to keep everyone else from putting nukes in space). Naturally, some provisions were written vaguely enough to allow some workarounds — for example, space forces are still allowed so long as they are neither armed with WMDs nor belligerent.

The Outer Space Treaty is influential enough to still be referenced by the major space programs, and has enough legitimacy that every government feels the need to at least pay lip service to its terms. Whether this holds up in an ever-intensifying rivalry among both countries and companies is a different story — but it is certainly better than nothing.

The Joys of Bottled Borscht in Space

Across different times, cultures, and places, food has always been a unifier. This is especially salient in space, where the tough environment and complete detachment from Earth make a good meal both comforting and psychologically affirming.

Some endearing examples: pictured below are American astronauts holding what appear to be tubes of Russian vodka given to them by Russian cosmonauts in a gesture of goodwill. This followed the famous “handshake in space” of 1975, when the two political and scientific rivals docked their flagship space vessels with one another in an unlikely display of cooperation and mutual respect (notwithstanding continued rivalry in and out of space). The “vodka” was actually Russian borscht, a sour but hearty beet soup.

Supercluster

Flashforward to this photo of a typical dinner night aboard the International Space Station, which by some measures is the largest and most expensive scientific project in history. Not much has changed otherwise.


Once again, the U.S. and Russia have come together in space exploration, despite their very real political differences, this time joined by Japan, Canada, and eleven European nations. This makes the creature comforts of space all the more enjoyable, as Smithsonian Magazine notes:

One big perk of international cooperation on the station is the advancement of the space food frontier. Astronauts and cosmonauts regularly gather on both sides of the station to share meals and barter food items. Roscosmos’ contribution to the food rations is the unique assortment of canned delicacies from traditional Russian cuisine. Perlovka (pearl barley porridge) and tushonka (meat stew), dishes familiar to the Russian military veterans since World War II, found new popularity among the residents of the station. Cosmonaut Aleksandr Samokutyaev says his American counterparts were big fans of Russian cottage cheese.

The cosmonauts, meanwhile, have few complaints about sharing meals with a country that flies up real frozen ice cream (not the freeze-dried stuff made for gift shops), as the U.S. did in 2012. Ryazansky has also spoken fondly of the great variety of American pastries. “We should say,” he clarified, “our food is better than the Americans’…. Despite the variety, everything is already spiced. But in ours, if you wish you can make it spicy; if you want, you can make it sour. American rations have great desserts and veggies; however, they lack fish. Our Russian food has great fish dishes.” The cosmonauts’ cuisine benefits when European and Japanese crew arrive. Both agencies brought unique flavors from their culinary heritages—including the one thing the cosmonauts really wanted. “Japanese rations have great fish,” Ryazansky wrote.

Every new cargo ship comes with fresh produce, filling the stale air on the station with the aroma of apples and oranges. Deprived of strong flavors in their packaged food, cosmonauts often craved the most traditional Russian condiment: fresh garlic. Mission control took the request seriously. “They sent us so much that even if you eat one for breakfast, lunch, and dinner, we still had plenty left to oil ourselves all over our bodies for a nice sleep,” Suraev joked on his blog.

There’s something endearing and downright adorable about astronauts—perhaps the world’s toughest and gruffest folks, one would think—excitedly exchanging meals with one another like kids trading candy on the playground. It almost makes you forget all the petty and vicious squabbles back on Earth. (As I understand it, scientists, space explorers, and visionaries of these nations tend to operate on a different level than their politicians.)

The Intelligence of Betta Fish

Contrary to popular belief, Siamese fighting fish are fairly intelligent. Research indicates they have complex behaviors, social interactions, and even individualized personalities. Males engage in carefully coordinated combat, dance-like courtship, and the building of “bubble nests”, which they fiercely protect; all this indicates a fairly well-developed nervous system. Bettas are even capable of associative learning, meaning they can learn to pair certain responses with new stimuli (think of Pavlov’s famous experiment with dogs, which learned to associate a bell ring with food).

Having had bettas for over fifteen years—including around 36 at the moment (blame the pandemic!)—I can vouch for this by personal experience. Our bettas are inquisitive, alert, and generally perceptive of their surroundings, watching and exploring anything new that comes their way. They also have varied personalities: Some are nearly always aggressive, tending to flare at us when we walk by; others are more shy and reclusive. They even have distinct tastes in food (which has prompted me to get several different brands and types).

Our beautiful betta Dream, a “dumbo” or “elephant ear” type.

Now, aside from this being anecdotal, I know we humans tend to anthropomorphize animals, especially our pets, attributing human traits, behaviors, and intelligence to their natural behaviors. But there is quite a bit of scientific research backing my impressions (and perhaps those of fellow betta fish keepers).

In fact, Siamese fighting fish are frequently utilized in physiology and psychology studies due to their complex biology; many scientists in these fields consider them “prime models” for understanding how hormones and other chemicals affect behavior.

For example, one study found that bettas were affected by antidepressants, specifically fluoxetine, which relies on serotonin transporter pathways to regulate behaviors; in this case, the bettas saw a reduction in their characteristic aggression, which indicates that they have a comparable neurological framework. (In fact, bettas can be bored, depressed, and happy; moving them to a bigger tank or adding new decorations will elicit a positive response, with each betta having its own preferences.)

A more recent study showed that bettas are able to synchronize their behavior during fights—something that has been observed among mammals as well! The longer they fought, the more precisely they could time their strikes and bites, to an extent that surprised the researchers. The study also determined that fights are highly choreographed, with seemingly “agreed on” breaks between each move. Bouts escalate every five to ten minutes, when the fish lock onto each other’s jaws to prevent breathing—and thus test who can hold out the longest. The bettas then break apart to catch their breath, and the cycle begins anew—not unlike a boxing match!

Even more surprising, the team found that this synchronicity went down to the molecular level: certain genes of the combatants were “turned on”, and while it is unclear what they do, they may influence how bettas engage in future fights. Thanks to the betta’s renowned martial prowess, the researchers claim to have found a “new dimension” for studying the relationship between genes and the nervous system in humans.

Given the complex personalities among bettas, and their capacity to feel happy, sad, or bored, they should be given far more than a cup or vase to live in: Not unlike humans, they prefer more space, more decor, and cleaner water, even if they can otherwise tolerate less than ideal conditions.

The Swedes Who Saved Millions of Lives

Meet Nils Bohlin and Gunnar Engellau, whose work at Swedish carmaker Volvo has helped save millions of lives worldwide.

Engellau, Volvo’s president and an engineer himself, pushed for a more effective seatbelt after a relative died in a traffic accident due partly to the flaws of the two-point belt design—which was not even a standard feature in cars at the time. This personal tragedy drove Engellau to seek a better solution, hiring Bohlin to develop one quickly.

There were two major problems with the historic two-point belt design, which crosses the lap only. First, because the human pelvis is hinged, a single strap fails to restrain the torso, leaving passengers vulnerable to severe head, chest and spinal injuries; positioned poorly, the belt can even crush internal organs on impact. Second, they were notoriously uncomfortable, so many people chose not to wear them. Bohlin’s innovation was to find a design that resolved both problems at once.

After millions of dollars and thousands of tests through the 1950s and 1960s, Volvo became the first carmaker in the world to standardize the three-point safety belt we now take for granted. More than that, Volvo pushed hard for the seatbelt to be adopted in its native Sweden, which like most places was initially resistant to having to wear seatbelts.

But Volvo didn’t stop there. While it patented the design to protect its investment from copy-cats, the company did not charge significant license fees to rivals or keep the design to itself to give its own cars an edge. Knowing that lives were at stake worldwide, Engellau made Bohlin’s patent immediately available to all. Having sponsored the costly R&D, Volvo gifted its design to competitors to encourage mass adoption. It is estimated that Volvo may have lost out on $400 million in additional profits, if not more.

Instead, literally millions of people have been spared injury and death by this now-ubiquitous seatbelt we take for granted. All because a couple of Swedes decided to put people over profits (which isn’t to say they reaped no financial benefit—rather, they proved you can do both).

World Mental Health Day

Today is World Mental Health Day, launched in 1992 by the World Federation for Mental Health—with support from the WHO—to raise awareness about one of the most misunderstood but increasingly problematic issues facing humanity.

Even the concept of mental health is fairly new in human history. What we now call mental illnesses were known, studied, and treated by the ancient Mesopotamians, Egyptians, Greeks, Romans, Chinese, and Indians. Some were called “hysteria” and “melancholy” by the Egyptians, and certain Hindu texts describe symptoms associated with anxiety, depression, and schizophrenia. The term “psychosis” derives from Greek roots meaning “principle of life/animation”, in reference to the condition of the soul.

In virtually every society up until the 18th century, mental illness was associated with moral, supernatural, magical, and/or religious causes, usually with the victim at fault in some way. The Islamic world came closest to developing something like a mental health institution, with “bimaristans” (hospitals) as early as the ninth century having wards dedicated to the mentally ill. The terms “crazy” (from Middle English meaning “cracked”) and “insane” (from Latin insanus, meaning “unhealthy”) came to denote mental disorder in medieval Europe.

In the mid-19th century, American doctor William Sweetser coined the term “mental hygiene” as a conceptual precursor to mental health. Advances in medicine, both technological and philosophical, soon established the connection between mental and physical health while minimizing the idea that moral or spiritual flaws were the cause (the Greeks came close to this, namely Hippocrates, who attributed mental disorders to physical causes).

But the dark takeaway from this was the so called “social hygiene movement“, which saw eugenics, forced sterilization, and harsh experimental treatments as the solutions to mental and physical disabilities or divergences. Though the Nazis were the ultimate manifestation of this odious idea, their propaganda and policies cited most of the Western world, including the U.S., as standing with them in their efforts to cleanse populations. (In fact, the term mental health was devised after the Second World War partly to replace the now-poisoned idea of mental “hygiene”.)

While we have come a long way towards realizing the evils and horrors of how we treat mental illness—from ancient times to very recent history—abuses, misunderstandings, and neglect remain worldwide problems.

Hence I also want to take today to thank everyone throughout my life who has been so understanding, supportive, and affirming with respect to my own mental health struggles. I would never have broken through my anxiety- or depression-induced barriers without a loving and compassionate social support structure along the way (to say nothing of my relative socioeconomic privilege—a lack of means unfortunately remains the most common barrier to mental health treatment in the U.S.).

I am certainly luckier than most. Mental illnesses are more common in the U.S. than cancer, diabetes, or heart disease, which are far better known and addressed. Over a quarter of all Americans over the age of 18 meet the criteria for having a mental illness. Youth mental health has become especially dire, with 13% reporting a major depressive episode in the past year, of whom only 28% receive treatment. And over 90% of Americans with a substance abuse issue (which is usually tied to mental health) receive no treatment.

Worldwide, one out of four humans endures a mental health episode in their lifetime. Depressive disorders are already the fourth leading cause of the global disease burden, and will likely rank second by the end of 2020, behind only ischemic heart disease. According to the World Health Organization (WHO), the global cost of mental illness—in terms of treatment, lost productivity, etc.—was nearly $2.5 trillion in 2010, with a projected increase to over $6 trillion by 2030.

Tragically, most mental health issues can be treated with relative ease: 80% of people with schizophrenia can be free of relapses following one year of treatment with antipsychotic drugs combined with family intervention. Up to 60% of people with depression can recover with a proper combination of antidepressant drugs and psychotherapy. And up to 70% of people with epilepsy can be seizure free with simple, inexpensive anticonvulsants. Even changing one’s diet could have an effect.

But over 40% of countries have no mental health policy, over 30% have no mental health programs, and around 25% have no mental health legislation. Nearly a third of countries allocate less than 1% of their total health budgets to mental health, while another third spend just 1% of their budgets on mental health. (The U.S. spent about 7.6% in 2001.)

In his book, Lost Connections: Uncovering the Real Causes of Depression – and the Unexpected Solutions, Johann Hari explores the environmental and socioeconomic factors that contribute to poor mental health, and how these are often neglected in discussions and approaches to depression and anxiety.

Someone could meditate, think positively, or pursue therapy all they want, but if they are rationing insulin to stay alive, cannot find affordable housing, struggle to find a well paying job, and are otherwise at the mercy of external forces that leave them fundamentally deprived, such treatments—however effective and beneficial in many contexts—can only go so far.

He illustrates this perfectly with the following account:

In the early days of the 21st century, a South African psychiatrist named Derek Summerfeld went to Cambodia, at a time when antidepressants were first being introduced there. He began to explain the concept to the doctors he met. They listened patiently and then told him they didn’t need these new antidepressants, because they already had antidepressants that work. He assumed they were talking about some kind of herbal remedy.

He asked them to explain, and they told him about a rice farmer they knew whose left leg was blown off by a landmine. He was fitted with a new limb, but he felt constantly anxious about the future, and was filled with despair. The doctors sat with him, and talked through his troubles. They realised that even with his new artificial limb, his old job—working in the rice paddies—was leaving him constantly stressed and in physical pain, and that was making him want to just stop living. So they had an idea. They believed that if he became a dairy farmer, he could live differently. So they bought him a cow. In the months and years that followed, his life changed. His depression—which had been profound—went away. ‘You see, doctor,’ they told him, the cow was an ‘antidepressant’.

To them, finding an antidepressant didn’t mean finding a way to change your brain chemistry. It meant finding a way to solve the problem that was causing the depression in the first place. We can do the same. Some of these solutions are things we can do as individuals, in our private lives. Some require bigger social shifts, which we can only achieve together, as citizens. But all of them require us to change our understanding of what depression and anxiety really are.

This is radical, but it is not, I discovered, a maverick position. In its official statement for World Health Day in 2017, the United Nations reviewed the best evidence and concluded that ‘the dominant biomedical narrative of depression’ is based on ‘biased and selective use of research outcomes’ that ‘must be abandoned’. We need to move from focusing on ‘chemical imbalances’, they said, to focusing more on ‘power imbalances’.

I can only hope that as mental health becomes less stigmatized—less a matter of superstition, genetic inferiority, or moral and individual failing—we can work towards building fairer and more just societies that promote human flourishing, physically, mentally, and spiritually.

Source: WHO

The Little Satellite that Triggered the Space Age

On this day in 1957, the Soviet spacecraft Sputnik 1, the first artificial satellite to orbit the Earth, was launched from the Baikonur Cosmodrome (the first, largest, and most active spaceport to this day). Thus began a series of pioneering Soviet firsts—from uncrewed lunar landings to explorations of Venus—that would in turn trigger the Space Race with America, culminating in the Moon landings.


Ironically, despite the centralized and authoritarian nature of the Soviet political system, the U.S.S.R. never developed a single coordinating space agency like NASA. Instead it relied on several competing “design bureaus” led by brilliant and ambitious chief engineers vying to produce the best ideas. In other words, these Cold War rivals embraced space exploration with the other side’s philosophy: the Americans were more government-centered, while the Russians went with something closer to a free market. (Of course, this oversimplifies things, since the U.S. relied—and still relies—on independent contractors.)


Hence Sputnik was the product of six different entities, from the Soviet Academy of Science to the Ministry of Defense and even the Ministry of Shipbuilding. The satellite had been proposed and designed by Sergei Korolev, a visionary rocket scientist who also designed its launcher, the R-7, the world’s first intercontinental ballistic missile. He is considered the father of modern astronautics, playing a leading role in launching the first animal and the first human into space, with plans to land on the Moon before his unexpected death in 1966—three years before the U.S. achieved that feat (who knows whether the Russians would have made it had Korolev lived).

As many of us know, Sputnik’s launch led to the so called “Sputnik crisis”, which triggered panic and even hysteria among Americans, who feared the “free world” was outdone by the communists and that American prestige, leadership, scientific achievement, and even national security were all at stake. (After all, the first ICBM had just been used to launch the satellite and could very well do the same with nukes.)

Surprisingly, neither the Soviet nor the American government attached much importance to Sputnik, at least not initially. The Russian response was pretty low-key, as Sputnik was not intended for propaganda. The official state newspaper devoted only a few paragraphs to it, and the government had kept private its advances in rocketry and space science, which were well ahead of the rest of the world.

The U.S. government response was also surprisingly muted, far more so than the American public. The Eisenhower Administration already knew what was coming due to spy planes and other intelligence. Not only did they try to play it down, but Eisenhower himself was actually pleased that the U.S.S.R., and not the U.S., would be the first to test the waters of this new and uncertain frontier of space law.

But the subsequent shock and concern caught both the Soviet and American governments off guard. The U.S.S.R. soon went all-in with propaganda about Soviet technological expertise, especially as the Western world had long propagandized its superiority over the “backward” Russians. The U.S. poured money and resources into science and technology, creating not only NASA but DARPA, which is best known for planting the seeds of what would become the Internet. There was a new government-led emphasis on science and technology in American schools, with Congress enacting the 1958 National Defense Education Act, which provided low-interest loans for college tuition to students majoring in math and science.

After the launch of Sputnik, one poll found that one in four Americans thought Russian science and engineering were superior to America’s; the following year, this stunningly dropped to one in ten, as the U.S. began launching its own satellites into space. The U.S.-run GPS system was largely the result of American physicists realizing Sputnik’s potential for allowing objects to be pinpointed from space.

The response to Sputnik was not entirely political, fearful, or worrisome. It was also a source of inspiration for generations of engineers, scientists, and astronauts across the world, even in the rival U.S. Many saw it optimistically as the start of a great new space age. The aeronautic designer Harrison Storms—responsible for the X-15 rocket plane and a head designer for major elements of the Apollo and Saturn V programs—claimed that the launch of Sputnik moved him to think of space as being the next step for America. Astronauts Alan Shepard, the first American in space, and Deke Slayton, one of the “Mercury Seven” who led early U.S. spaceflights, later wrote of how the sight of Sputnik 1 passing overhead inspired them to pursue their record-breaking new careers.

Who could look back and imagine that this simple, humble little satellite would lead us to where we are today? For all the geopolitical rivalry involved, Sputnik helped usher in tremendous hope, progress, and technological achievement.

The WEIRD Phenomenon

Most of us are familiar with the Müller-Lyer optical illusion above, named after its creator, German psychologist Franz Carl Müller-Lyer.

Like most optical illusions, it is designed to test basic brain and visual functions, helping us learn how and why human senses, cognition, etc. work the way they do. Many folks think the second line is longer than the first, even though both are the same length, which purportedly shows that humans are susceptible to certain visual cues like arrows (though explanations for why this happens vary).

But the results do not tell the whole story: while many Westerners fall for this illusion (myself included), a study of 14 indigenous cultures found that none were tricked to the same degree. In fact, some cultures, like the San people of the Kalahari Desert, saw that the two lines were of equal length.

That’s because most studies claiming to reflect universal traits of human psychology and physiology only do so for a small and specific demographic—people from “WEIRD” societies, or Western, Educated, Industrialized, Rich and Democratic—which represent a tiny minority of all humans (about 12 percent).

The “WEIRD” phenomenon was first described in a 2010 paper from the University of British Columbia in Vancouver, which found that 96 percent of studies in economics, psychology, and cognitive science—such as the ones on optical illusions—were performed on people with European backgrounds. A sample of hundreds of studies in leading psychology journals found close to 70 percent of subjects were from the U.S., and of these, 67 percent were undergraduates studying psychology (which further slants studies to reflect one particular age group).

All this means that a randomly selected American undergraduate is 4,000 times likelier to be the subject of a psych study—and thus be taken to represent all of human nature—than a random non-Westerner.
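That likelihood claim is just a ratio of two sampling probabilities. Here is a minimal back-of-the-envelope sketch of the arithmetic; every figure in it is a round illustrative assumption of mine, not an actual input from the 2010 UBC paper:

```python
# Back-of-the-envelope ratio of sampling probabilities.
# ALL figures below are illustrative assumptions, not the paper's data.

us_undergrad_share_of_subjects = 0.47  # assume ~67% of the ~70% U.S. subjects are psych undergrads
nonwestern_share_of_subjects = 0.04    # assume a small residual non-Western share of subjects

us_undergrad_population = 10_000_000   # assumed pool of U.S. undergraduates
nonwestern_population = 5_800_000_000  # assumed non-Western world population

# Chance that a random member of each pool ends up as a study subject
# (proportional chance, since the total number of studies cancels in the ratio).
p_us_undergrad = us_undergrad_share_of_subjects / us_undergrad_population
p_nonwesterner = nonwestern_share_of_subjects / nonwestern_population

ratio = p_us_undergrad / p_nonwesterner
print(f"A U.S. undergrad is roughly {ratio:,.0f}x likelier to be sampled")
```

With these assumed inputs the ratio lands in the thousands, the same order of magnitude as the 4,000-to-1 figure; the point is simply that concentrating nearly half of all subjects in a pool of millions, while drawing almost none from a pool of billions, produces an enormous skew.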

Yet when scientists perform some of these experiments in other cultures, the results are very different—not just for optical illusions, but for things as diverse as moral reasoning, notions of fairness, and sexual behavior. Even mental disorders seem to manifest differently across cultures and ethnic groups: one small study found that people with schizophrenia in India and Ghana hear friendlier voices than their counterparts in the U.S., suggesting that culture and environment may play a role. (This may account for why Westerners have a harder time with the Müller-Lyer illusion than some indigenous people: most Americans are raised in urban environments where straight lines and sharp corners are ubiquitous; this presumably trains us to make optical calibrations that can misfire, something forager societies like the San do not have to worry about.)

In fact, people from WEIRD societies like the U.S. appear to be outliers among humans, with the authors of the UBC study concluding that Westerners “are among the least representative populations one could find for generalizing about humans”.

As a writer for NPR blithely noted, “It was not so much that the emperor of psychology had no clothes. It was more that he was dancing around in Western garb pretending to represent all humanity”.

Fortunately, researchers have wised up to these biases over the past decade, carefully adding qualifiers and caveats such as “in college populations” or “in Western society.” But it’s still easy for journalists, analysts, and casual readers like ourselves to read the findings of these studies and ascribe them to all of humanity. Much of human nature, like humans themselves, is a lot more complicated and variable than studies of WEIRD subjects suggest.