An Ancient Greek Yearbook

You’re looking at an ancient Greek yearbook, which was rediscovered earlier this month after over 130 years in storage at a Scottish museum.

It lists the names of 31 graduates of the ephebate, a year of military and civic training undertaken around age 18 to prepare young men for adult life. It ends with “of Caesar”, referring to the Emperor Claudius, fourth ruler of the Roman Empire (reigned 41–54), indicating that they graduated during his reign. (Greece had been under Roman rule for over a century, though its traditions—like the ephebate—remained largely unchanged.)

Among the names clearly visible on the marble are Atlas, Dionysos, Theogas, Elis, Zopyros Tryphon, Antypas, and Apollonios; many have never been attested before, and some are nicknames, such as Theogas for Theogenes and Dionysas for Dionysodoros. Using shortened names was unusual and suggests a sense of camaraderie among the graduates; the full class probably numbered about 100 men, and the nicknames—along with terms like “co-ephebes”, or “co-cadets”—suggest that the inscription was made by classmates who had become friends and wanted to remember one another.

According to Dr. Peter Liddel, professor of Greek history and epigraphy at the University of Manchester, who led the research behind the discovery, this is also the earliest evidence of noncitizens taking part in the ephebate in this period—suggesting a greater level of social and cultural integration in the empire than previously thought.

“This is a really interesting inscription”, says Dr. Liddel, “partly because it’s new but also because it gives us new names and a bit of insight into the sort of access or accessibility of this institution which is often associated with elite citizens.”

It is unknown where the list was displayed, but it could have been somewhere public, such as a community space or gymnasium where the young men trained.

Dr. Liddel said: “It was made to create a sense of camaraderie and comradeship among this group of people who had been through a rigorous training program together and felt like they were part of a cohort.”

“It’s the ancient equivalent of a graduate school yearbook,” he adds, “although this is one which is created by a number of individuals who wanted to feel like they had come together as friends.”

Another example of ancient peoples being more familiar and relatable than we would think!

Sources: Greek Reporter, NPR

Happy Anniversary to History’s Second Constitution

On this day in 1791, the Polish-Lithuanian Commonwealth—one of the largest and most powerful countries in Europe—adopted the first written national constitution in Europe, and only the second in the world, after the U.S. Constitution, which had come into force just two years earlier.

Like its counterpart across the Atlantic, Poland’s constitution—titled the Governance Act and known simply as the Constitution of 3 May 1791—was influenced by the Enlightenment, the European intellectual movement that, among other things, pioneered concepts like civil liberty, individual rights, and religious and political tolerance.

The first page of the original 1791 constitution.

Remarkably, despite the vast geographic distance between the two countries, Poland’s constitutional structure was markedly similar to that of America: There were three branches of government—legislative, executive, and judicial—with checks and balances, a bicameral legislature, and a cabinet of ministers. The constitution declared that “all power in civil society [should be] derived from the will of the people” and defined the role of government as ensuring that “the integrity of the states, civil liberty, and social order shall always remain in equilibrium.” While Roman Catholicism was recognized as the “dominant faith”, freedom of religion was guaranteed—a bold proposition on a continent where people regularly killed one another for being the wrong kind of Christian or simply holding the wrong doctrine.

The people of Poland-Lithuania were defined not as “subjects” of a king but as “citizens” with popular sovereignty—a category that included townspeople and peasants, who in most of Europe enjoyed no such recognition. The right to acquire property, hold public office, and join the nobility—whose powers and immunities were restricted—was extended to millions more people, including Jews (who almost everywhere else were denied anything akin to legal recognition, let alone political rights).

The new constitution even introduced a version of habeas corpus—the core legal right that protects against abuse of power—known as Neminem captivabimus, summarized as “We shall not arrest anyone without a court verdict”.

The Constitution of 3 May 1791, an idealized portrayal of the constitution’s adoption, by Polish artist Jan Matejko. It was painted to commemorate the 100th anniversary of its adoption.

To be clear, the Constitution of 3 May 1791 had its limits, and its radicalism should not be overstated. The monarchy was retained, with the king serving as head of the executive branch. Religious minorities such as Jews, as well as the peasants who made up the vast majority of the population, still had few rights. And while constrained, the nobility was not abolished as in the U.S. and later France, and in fact retained many privileges.

But even in these areas, the Commonwealth went farther than almost any other country in the world at the time. The monarchy was not absolute: The king’s powers were constrained by the constitution and essentially shared with a council of ministers, who could overrule his decrees and force him to go to parliament. While peasants and Jews had few rights, they now had official protection from abuse—a step closer to recognizing their political rights, and well beyond what was normal at the time. Eligible middle-class people could even join the ranks of the nobility, a seemingly paradoxical form of progress that, again, was unusual for the era; nobles certainly couldn’t ride roughshod over common folk as they did elsewhere in Europe (which isn’t to say there weren’t abuses—this was still feudal Europe, after all).

In any event, the Constitution of 3 May 1791 was a relatively bold and momentous step in the right direction, as evidenced by its rarity at the time—and, sadly, by its short existence. Within two years, the constitution was overthrown and the Polish-Lithuanian Commonwealth carved up by the absolute monarchies of neighboring Prussia and Russia, which felt threatened by the constitution and the dangerous “revolutionary” ideas it introduced and could spread. Poland would cease to exist for well over a century, its experiment never fully tested—but also never dying off entirely, as the then-ongoing French Revolution and its political reverberations would prove.

World AIDS Day

Belated World AIDS Day post: Although HIV/AIDS remains a scourge of humanity—particularly in its likely place of origin, Africa—we have made tremendous progress in reducing both infections and death rates. Being HIV positive is no longer the death sentence it once was; ironically, the large number of people living with the disease is in part a testament to the success of treatments and of policies to make them widely affordable and accessible (aided in large part by the much-maligned WHO).

As usual, German data-crunching company Statista lays it all out beautifully on their Instagram (which I highly recommend following).

Even though #worldaidsday has been used to promote awareness of the disease and mourn those who have died from it since 1988, the global epidemic is far from over.

According to data by @unaidsglobal, more than ten million people with HIV/AIDS don’t currently have access to antiretroviral treatment and the number of new infections with #HIV has remained the same compared to 2019 at roughly 1.5 million. When taking a closer look at the numbers, there are enormous regional differences in terms of battling the epidemic. Eastern and southern Africa, for example, combine for 55 percent of all known HIV/AIDS cases, while reducing new infections by 43 percent between 2010 and 2020. Western and central Africa also saw a decline of 37 percent when comparing 2010 and 2020, although it falls short of the benchmark of 75 percent set by the United Nations General Assembly.

While the number of new infections has dropped from 2.9 million in 2000 to 1.5 million last year, the number of people living with HIV increased from 25.5 million to approximately 37.7 million over the past two decades. According to UNAIDS, the increase is not only caused by new infections, but also a testament to the progress that has been made in treating HIV with antiretroviral therapy, which has vastly improved the outlook of those infected with HIV.

The even more astute data-lovers at Our World in Data vividly convey both the scale of the problem and just how much we have progressed, even in the most hard-hit places:

While in law school, some colleagues and I had the incredible opportunity to meet the hardworking and earnest people at UNAIDS headquarters in Geneva. This entity is the first and only one of its kind in the world, combining the personnel and resources of nearly a dozen U.N. agencies to mount a comprehensive response to this pandemic. UNAIDS is also the only initiative to include civil society organizations in its governing structure.

Since it was launched in 1994, UNAIDS has helped millions of people worldwide get antiretroviral treatment for HIV/AIDS and provided millions more with preventative methods. Thanks to their efforts, and those of their partners across the world, the rates of infection and death from HIV/AIDS have stagnated or even declined in many areas, while the rate of treatment has increased.

As with so many other things, the COVID-19 pandemic has weakened the fight against HIV/AIDS, disrupting preventative measures and sapping already-taxed healthcare systems. With reports of individuals who seem to have naturally cleared the virus, I have hope that we can regain momentum and maybe even develop an outright cure. Fortunately, the progress of the past several years proves we do not have to wait until then to make a difference in tens of millions of lives.

The First War to End All Wars

Yesterday was an even more devastating anniversary than the bar exam.

On July 28, 1914—exactly one month after the assassination of Archduke Franz Ferdinand—Austria-Hungary declared war on Serbia and the First World War began. Though the two of them directly set off the war, both nations would soon be overshadowed by the much bigger players they dragged in with them: France, Germany, Russia, and the U.K.


After putting up stiff resistance for the first year, Serbia was conquered by the end of 1915 and occupied by Austro-Hungarian forces until the war’s end in 1918. Over 1.1 million Serbs died, including roughly one in four soldiers; the dead amounted to as much as a quarter of the population and some 60 percent of the country’s men. Proportionally, Serbia suffered more losses than any other country involved (the Ottoman Empire ranks second in this regard, losing 13-15 percent of its population, followed by Romania at 7-9 percent).

For its part, the weak and declining Austro-Hungarian Empire lost over 2 million people, of whom 120,000 were civilians, amounting to about 4 percent of its total population. Having exhausted itself in its Pyrrhic victory over Serbia, the country barely held together throughout the conflict, remaining a peripheral power dependent on German support; indeed, Austria-Hungary would ultimately collapse into several new countries, some of which would join Serbia to form a new multiethnic state called Yugoslavia.

All told, some 8 million fighting men were killed in combat or by disease, and 21 million more were wounded. As many as 13 million civilians died as a result of starvation, exposure, disease, military action, and massacres. Four great empires and dynasties—the Hohenzollern, the Habsburg, the Romanov, and the Ottoman—fell, and the intercontinental movement of troops helped fuel the deadliest influenza pandemic in history. The ripple effects of the war, from the Great Depression, to World War II, to the Cold War, continue to be felt today. The war helped usher in the Russian Revolution, and ultimately the Soviet Union, the first major communist government (which ironically would play a pivotal role in ending the second iteration of the war).


Better known are the grievances engendered by the post-war Versailles Treaty, which helped fuel the desperation and misery that became the Nazis’ stock-in-trade. Even Japan saw its star rise further as a major world power, belatedly joining the Allies and getting a seat at the table as one of the leaders of the post-war League of Nations (no small feat for a non-European country).

In Casualties of History, John Arquilla describes the almost morbidly comical arrogance and stupidity of this meat grinder of a conflict:

Yes, a second and even more destructive conflict followed all too soon after the “war to end all wars”, impelling a name change from Armistice Day to Veterans Day. And the rest of the 20th century was littered with insurgencies, terrorism, and a host of other violent ills — most of which persist today, guaranteeing the steady production of new veterans, of which there are 22 million in the United States.

But despite the seemingly endless parade of wars waged and fresh conflicts looming just beyond the bloody horizon, World War I still stands out for its sheer horror. Over ten million soldiers died, and more than twice that number were wounded. This is a terrible enough toll. But what makes these casualties stand out even more is their proportion of the total numbers of troops mobilized.

For example, France put about 7.5 million soldiers in the field; one in five died, and three out of four who lived were wounded. All other major combatants on both sides suffered horribly: the Austro-Hungarian Empire’s 6.5 million soldiers had a combined casualty rate of 74 percent. For Britain and Russia, the comparable figures totaled a bit over 50 percent, with German and Turkish losses slightly below one-half of all who served. The United States entered the conflict late, and so the overall casualty rate for the 4.3 million mobilized was “just” 8 percent. Even so, it is more than double the percentage of killed and wounded from the Iraq War, where total American casualties amounted to less than 4 percent of the one million who served.

Few conflicts in all of military history have seen victors and vanquished alike suffer such shocking losses as were incurred in World War I, so it is worth taking time to remember how this hecatomb came to pass. A great body of evidence suggests that this disaster was a product of poor generalship. Historian Alan Clark’s magisterial “The Donkeys” conveys a sense of the incredible stubbornness of high commanders who continued, for years, to hurl massed waves of infantry against machine guns and rapid-firing artillery. All this went on while senior generals stayed far from the front. A British field commander, who went riding daily, even had soldiers spread sand along the country lane he followed, to make sure his horse didn’t slip.

It is little wonder that in the face of Nazi aggression barely a generation later, most of Europe melted away and succumbed to occupation within a year. Most nations did not have the political or public will to endure yet another meat grinder of a conflict; indeed, the major powers could not imagine that anyone would actually want another war given all the bloodletting that had gone around. Perhaps the greatest tragedy of the First World War was the fact that even all that death and destruction failed to stem the hatred, cruelty, and aggression of monstrous men and their millions of supporters and collaborators; in fact, the shortsightedness and vindictiveness of postwar leaders, as had already been evidenced by their callous ineptitude on the battlefield, all but ensured that desperation and humiliation would give the likes of Hitler, Mussolini, and their minions plenty of currency to start an even bloodier war.

Thank goodness that, for now, this has not played out again all these decades later.

Source: Encyclopædia Britannica, Inc./Kenny Chmielewski

Echoes of the Roman Empire

The more you read about the history and politics of Rome, the more you realize that America follows the Roman example in far more than architecture and Latin terminology; even the word “senate” roughly translates from Latin as “council of elders” — an apt description of the generation gap between those with political power and everyone else (though to both the Greeks and the Romans, this was not a bad thing; age signified experience, after all).

Read some of the descriptions of Rome’s political system by historians like Adrian Goldsworthy and Richard Miles with today’s America in mind.

The Romans valued military service above all else. It was seen both as a noble obligation of citizenship and as a way to drum up glory and thus political support. Over time, Roman politicians began to stress their personal military service — or at least their support of the military — to get elected. Political factions increasingly backed military conquests as a way to win popular approval, distract the masses with the glory of triumph, or prove they had the chops to govern.

Ironically, this deification of the military — in which the U.S. is unique among established democracies — would contribute to Rome’s downfall, as one general or soldier after another seized power from venal politicians by capitalizing on their own popularity following a victory or distinguished war record (only, of course, to become venal politicians themselves).

Roman high office was notoriously and openly cliquish. Only the same handful of wealthy, intermarried families had a shot at power. The Romans believed that merit and achievement passed from generation to generation, prompting politicians to emphasize the accomplishments of one past or distant relative or another (which was easy to do, since the families all intermarried and could thus point to someone to do the trick). This had the obvious effect of creating political dynasties that made it very hard for so-called “new men” to enter politics, or at least the highest offices. Eventually, when the republic and later the empire crumbled under the weight of incompetent and corrupt politicians, these new men — now emphasizing their nonpolitical nature and success in business or the military — capitalized on the public’s disgust with established politicians, only to become part of the problem in the end.

Politics in Rome was highly personal, given the aforementioned dominance of families. Politicians openly curried favor with certain families for support, and both sides expected something in return. For this reason, Rome did not have political parties per se; there was little in the way of established policy or consistent ideology, as politicians simply went with whatever would advance their interests or those of their allies or clients. Alliances shifted constantly; everyone invoked the need to serve the public, but it was an open secret that politics was just a means to the ends of power, wealth, and glory. Again, none of this was unusual; the Romans openly tried to work the system to their own ends.

During emergencies, most commonly war, the Romans suspended politics as usual and appointed a “temporary” solution in the form of the “dictatorship”, a Latin term that describes a single individual’s ability to take control of — i.e., “dictate” — policy for the good of the republic. Though the office typically lasted just six months, the famous case of Julius Caesar, who was alleged to have sought permanent dictator status, shows the age-old problem of balancing liberty and security.

Even Roman culture mirrored our own: The Romans stressed the material wealth, prosperity, and relative freedom that came with Roman citizenship. They advertised to citizens and foreigners alike the sophisticated baths, restaurants (possibly a Roman invention), and other amenities unique to Roman life. They even developed a credit system, not unlike today’s credit cards, that allowed average people to ostensibly share in the benefits.

Comparing America to the Roman Republic and Empire is a cliché among political scientists — but clearly for good reason, I think.

The Arab Queen Who Took on the Roman Empire

I’ve recently become fascinated with the historical figure of Zenobia, a third-century Arab queen and the only woman who ever came close to ruling the Roman Empire.

An idealized portrayal titled Queen Zenobia’s Last Look upon Palmyra, by Herbert Gustave Schmalz (1888)

Zenobia came to power as regent to her ten-year-old son, who inherited the throne of Palmyra, an ancient Syrian city that was one of the wealthiest and most powerful in the ancient world. (You may recall that it was targeted for destruction by ISIS, which led to the loss of literally millennia of history.)

By the time it came under Roman control in the first century, Palmyra was already a prosperous and cosmopolitan city, mostly Arab but with large minorities of Greeks, Arameans, and other ethnic groups. Multiple languages were spoken, a variety of faiths were tolerated, and there was even a Greco-Roman-style senate that ran various civil affairs. Its incredible wealth and beauty—including cutting-edge urban planning and numerous monuments and public works—earned it the moniker “pearl of the desert”. Situated at the crossroads between the Roman Mediterranean and Western Asia, its caravans ranged across Europe, Africa, and even the Silk Road, making it a huge asset to Rome—and earning its rulers uniquely significant autonomy under Roman imperial rule.

In fact, by the time Zenobia became the de facto queen of Palmyra in 267, the desert city-state had essentially become an allied power rather than a province; not only did it bring in commercial goods and revenue, but it offered protection against unruly Arab tribes and eastern rivals, above all the old nemesis, Persia. Hence, when the Roman Empire began to unravel during its “Crisis of the Third Century”, Zenobia apparently saw an opportunity for her people to attain well-deserved greatness.

The Palmyrene Empire she founded spanned most of the Roman east, from central Turkey into western Iraq and down to Egypt (then one of the richest provinces of Rome). While she declared both herself and her son emperors of all of Rome, she was never able to extend her rule beyond these territories, though conquering Egypt and keeping the Persians (who had sensed Roman weakness) at bay was impressive enough.

Zenobia was very much a product of her city: She spoke four languages, received a comprehensive education, and was steeped in the latest philosophy and science. Her reign was characterized by a policy of religious tolerance and intellectualism. While she worshipped a pantheon of Semitic gods, she was familiar with other faiths and cultures, and accommodated all religious groups, from the small but controversial cult known as Christianity to the Jews who had long been in conflict with Rome. She invited scientists, philosophers, and other thinkers from all over the known world to her royal court, seeking to turn Palmyra into the next Athens.

While her empire barely lasted three years before it was subdued by Rome—her ultimate fate remaining unknown—Zenobia left a lasting legacy.

The Augustan History, a fourth-century Roman collection of biographies of emperors and usurpers, lamented that “all shame is exhausted, for in the weakened state of the [Roman] commonwealth . . . a foreigner, Zenobia by name . . . proceeded to cast about her shoulders the imperial mantle [and ruled] longer than could be endured from one of the female sex.” She is also a point of pride for the people of Syria (where the Palmyrene kingdom was located) and remains a role model to women across the Arab world and beyond. Even Edward Gibbon, the seminal historian of the Roman world, remarked that few women in history have been as influential as she was.

COVID-19 and the Impartial Judgments of Nations

With the world responding to the pandemic in a variety of ways—and many countries learning from each other or from the U.N. World Health Organization (itself made up of experts from all over the world)—I am reminded of the largely forgotten words of James Madison, the architect of the U.S. Constitution.

This darling of patriots and conservatives—the Federalist Society uses his silhouette as its logo—once said that “no nation was so enlightened that it could ignore the impartial judgments of other nations and still expect to govern itself wisely and effectively.”

In the Federalist Papers, which were published to promote ratification of the Constitution, he emphasized the importance of respecting global public opinion:

An attention to the judgment of other nations is important to every government for two reasons: the one is, that, independently of the merits of any particular plan or measure, it is desirable, on various accounts, that it should appear to other nations as the offspring of a wise and honorable policy; the second is, that in doubtful cases, particularly where the national councils may be warped by some strong passion or momentary interest, the presumed or known opinion of the impartial world may be the best guide that can be followed. What has not America lost by her want of character with foreign nations? And how many errors and follies would she not have avoided, if the justice and propriety of her measures had in every instance been previously tried by the light in which they would probably appear to the unbiased part of mankind?

This was at a time when the U.S. was virtually the only republic in the world. Even the most patriotic and liberty-loving Founders recognized that whatever the political or cultural differences between the nations of the world, mere pragmatism should permit us to take whatever ideas or resources we can.

Consider that, unlike other nations, we declined to use the W.H.O.’s test kits. Back in January 2020, more than a month before the outbreak reached most countries, the Chinese published information on the mysterious new virus. Within a week, German scientists had produced the first diagnostic test. By the end of February, the U.N. had shipped tests to 60 countries.

As I’ve said ad nauseam, global cooperation is not merely idealistic or utopian: It’s the sober reality of living in a globalized society where we face problems that affect all humans, regardless of where they happen to be born. Even in the 18th century, our political founders and leaders understood this. We ignore it at our peril.

International Mother Language Day

In honor of International Mother Language Day—created to promote linguistic diversity and preservation—check out this beautiful and very detailed chart of the world’s languages. A lot of the data might surprise you!

It’s too big to fit here, but below is a little snapshot to give you an idea.

Here are some fun and colorful language infographics that do fit here!

Infographic: The 100 Most Spoken Languages in the World

As the name suggests, the massive Indo-European family includes most of the languages spoken from northern India through Iran and across nearly all of Europe between Portugal and Russia (Hungarian, Estonian, and Finnish being notable exceptions).

The language with the most speakers is, probably not surprisingly, English; about 15 percent of humanity can speak it!

However, the vast majority of people who speak English learned it as a second language (as you might have noticed in the top infographic). Here are the languages with the most native speakers compared to second-language (L2) speakers:

Here’s an interesting breakdown from the source:

Nearly 43% of the world’s population is bilingual, with the ability to switch between two languages with ease.

From the data, second language (L2) speakers can be calculated by looking at the difference between native and total speakers, as a proportion of the total. For example, 66% of English speakers learned it as a second language.

Swahili surprisingly has the highest ratio of L2 speakers to total speakers—although it only has 16 million native speakers, this shoots up to 98 million total speakers. Overall, 82% of Swahili speakers know it as a second language.

Swahili is listed as a national or official language in several African countries: Tanzania, Kenya, Uganda, and the Democratic Republic of Congo. It’s likely that the movement of people from rural areas into big cities in search of better economic opportunities is what’s boosting the adoption of Swahili as a second language.

Indonesian is another similar example. With a 78% proportion of L2 speakers compared to total speakers, this variation on the Malay language has been used as the lingua franca across the islands for a long time. In contrast, only 17% of Mandarin speakers know it as a second language, perhaps because it is one of the most challenging languages to learn.
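For anyone who wants to reproduce that arithmetic, the L2 share is simply (total speakers − native speakers) divided by total speakers. Below is a minimal Python sketch using the Swahili figures quoted above (16 million native, 98 million total); note that these rounded inputs give roughly 84 percent rather than the cited 82 percent, so treat it as an illustration of the formula rather than an exact recomputation from the source’s dataset.

    def l2_share(native_millions: float, total_millions: float) -> float:
        """Fraction of speakers who learned the language as a second language (L2)."""
        return (total_millions - native_millions) / total_millions

    # Swahili figures quoted above, in millions of speakers (rounded).
    print(f"Swahili L2 share: {l2_share(16, 98):.0%}")  # ~84% with these inputs; the source cites 82%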

Tragically, the U.N. has good reason to dedicate a day to the preservation of languages: The 100th most common language is “Sanaani Spoken Arabic”, spoken primarily in Yemen by around 11 million people. Yet there are a total of 7,111 languages still spoken today, meaning the vast majority of them—all but the top 100—have fewer than 11 million speakers.

In fact, approximately 3,000 of all languages (40 percent) are at risk of being lost, or are already in the process of dying out today. (By one estimate, a language dies every two weeks.) Fortunately, growing awareness and advanced technology are helping to document and preserve these unique aspects of human existence, and all the unique ideas, stories, and concepts they each contain.

When Cities Are as Powerful as Nations

Before the emergence of the political units we now call countries, humans organized themselves in a variety of other ways, ranging from bands and tribes, to chiefdoms, kingdoms, and empires. Most of these entities were not proper countries as we think of them today, lacking a cohesive political or national identity, a firm boundary, or much in the way of an organized government.

The ancient societies of Egypt, Greece, China, Mesoamerica, the Indus River Valley, and Mesopotamia were among the exceptions, which is why they are recognized as “cradles of civilization”, places where the first features of what we consider modern society emerged: agriculture, urban development, social stratification, complex communication systems, infrastructure, and so on.

The urban character of civilization is what I find most interesting, because cities were where power, both political and economic, was concentrated. Urban centers were the places from which rulers asserted their authority. Cities are where democracy and republicanism took root, and where civic engagement survived through the Middle Ages in places like Florence, Venice, Krakow, and Hamburg.

This dynamic has changed little in the 21st century; in fact, it is arguably stronger and more pronounced than ever, as globalization, population growth, and advanced technology come together to create metropolises as populous, wealthy, and powerful as entire countries.

The following map, courtesy of CityLab, draws on data from 2015 to show the incredible growth and prestige of modern cities (the data for cities comes from the Brookings Institution’s Redefining Global Cities report, while the data for nations is from the World Bank’s World Development Indicators; the map was compiled by Taylor Blake of the Martin Prosperity Institute).

A few highlights noted by the article:

  • Tokyo, the world’s largest metro economy with $1.6 trillion in GDP-PPP, is just slightly smaller than all of South Korea. Were it a nation, Tokyo would rank as the 15th largest economy in the world.
  • New York City’s $1.5 trillion GDP places it among the world’s twenty largest economies, just a tick under those of Spain and Canada.
  • Los Angeles’ $928 billion GDP is a bit smaller than Australia’s, at $1.1 trillion.
  • Seoul ($903 billion) has a bigger economy than Malaysia ($817 billion).
  • London’s $831 billion GDP makes its economic activity on par with the Netherlands ($840 billion).
  • Paris, with $819 billion in GDP, has a bigger economy than South Africa, $726 billion.
  • The $810 billion economy of Shanghai outranks that of the Philippines, with $744 billion.

To put things in further perspective: if you added up the ten largest metropolitan areas, you’d get an economy of over $9.5 trillion, bigger than the Japanese and German economies combined. Add the next ten largest metros, and you get the second-largest economy in the world, at $14.6 trillion, less than $4 trillion shy of the U.S.

In other words: Cities really are the new power centers of the global economy—the platforms for innovation, entrepreneurship, and economic growth. But when it comes to fiscal and political power, they remain beholden to increasingly anachronistic and backward-looking nation-states, which has become distressingly obvious with the rise of Trumpism in the United States and populism around the world.

The greatest challenge facing us today is how to ensure that global cities have the economic, fiscal, and political power to govern themselves and to continue to be a force for innovation and human progress.


A very relevant question, as the balance of power both within and between countries shifts to certain global cities, especially in the developing world.

What are your thoughts?

Ancient Links Between Rome and Sri Lanka

It never ceases to amaze me how well connected and globalized the ancients were. We think of globalization as a thoroughly modern phenomenon, yet its seeds were planted centuries or even millennia ago, when global connections would have seemed impossible.

As Science reports:

Visit Mantai, nestled into a bay in northwestern Sri Lanka, and today you’ll see nothing but a solitary Hindu temple overlooking the sea. But 1500 years ago, Mantai was a bustling port where merchants traded their era’s most valuable commodities. Now, a study of ancient plant remains reveals traders from all corners of the world—including the Roman Empire—may have visited or even lived there.

Mantai was a hub on the ancient trade networks that crisscrossed the Indian Ocean and connected the distant corners of Asia, Africa, Europe, and the Middle East. The port town flourished between 200 B.C.E. and 850 C.E. During that time, it would have been a nexus for the spice trade, which ferried Indonesian cloves and Indian peppercorns to Middle Eastern and Roman kitchens.

[…]

The team also found remains that could link the port city to the ancient Mediterranean world—processed wheat grains dated to 100 to 200 C.E. and grape seeds dated to 650 to 800 C.E. Neither crop can grow in Sri Lanka’s wet, tropical climate, so they had to be imported, possibly from as far as Arabia or the Roman world. Kingwell-Banham says her team is studying the chemical isotopes absorbed by the plants to determine where they were grown.

But no matter their precise origin, the coexistence of rice and wheat is evidence of Mantai’s “cosmopolitan cuisine,” in which both local and foreign foods were eaten, she says. The discovery of wheat and grapes in Mantai “is entirely new, and shifts the focus from goods transported from South Asia to the Roman world, to goods that went in the other direction,” Coningham says.


While there is no evidence that Roman merchants or other travelers lived in what is today Sri Lanka, it is certainly not out of the realm of possibility: just a few years ago, remains were unearthed in London that appear to be of Chinese origin — and date back to between the third and fifth centuries C.E., when it was the Roman city of Londinium.