Northeastern’s Historic Buildings

Like many universities, or indeed many large institutions, Northeastern’s history is steeped in the stories of numerous property acquisitions.

Some of those, of course, have been quite scandalous. Northeastern’s Speare Hall, for example, lies on Huntington Avenue between the Boston Symphony and the Museum of Fine Arts – on land which once housed a magnificent “temple of music”: Boston’s original Opera House.

The opera house opened in 1909, but its glory was unfortunately short-lived. Its original opera company went bankrupt by 1915. Various theater companies used the space, but the building fell into disrepair.

In 1957, the property was purchased by the Boston Redevelopment Authority, which sold it to Northeastern a week later. It’s unclear to me exactly how transparent this deal was. In a 2009 story, the Boston Globe indicated that the building really was far past repair, and mentioned off-handedly that a Northeastern building now sits there.

But in a 2011 piece from Northeastern’s student newspaper, Emeritus Professor Wilfred Holton indicates a different story:

“It was kind of sneaky how they did it,” said Holton. “Northeastern said they had no interest in the building. Then the developers bought it and it looked like Northeastern had a deal with them because within a week, the university bought it from them. But the university got away with it, obviously.”

Despite the sale, Boston’s cultural community tried to save the building, but “the condition of the building required it to be demolished and rebuilt.”

While I imagine some are skeptical of the necessity of this demolition, Northeastern’s 1976 Master Plan is clear that the Boston Opera House “had been condemned as unsafe prior to acquisition by Northeastern.”

Apparently a brick from the original building is preserved in the university archives.

What started me on this story, though, was the history of Northeastern’s Holmes Hall. Purchased in 1961 and dedicated in 1979, four Northeastern buildings – Lake Hall, Meserve Hall, Nightingale Hall, and Holmes Hall – once belonged to the United Drug Company.

United Drug Company (UDC) was the corporate force behind the retail chain of Rexall Drug Stores. Founded in 1903 by Louis Kroh Liggett, the Boston-based company once boasted “as many as 12,000 drug stores across the United States.”

Incidentally, Liggett apparently got his start selling “Vinol” made from wine and cod livers. It’s unclear to me exactly what ailment this tonic was intended to address.

Northeastern’s archives – which house an odd assortment of Rexall remedies – indicate that “in the 1930s, UDC built six buildings on its Boston campus that housed its corporate offices and manufacturing and research facilities.” While many of these buildings were eventually demolished, Northeastern renovated one UDC building – splitting it into the four Northeastern buildings which exist today.

Now, somebody had told me that one of these buildings, Holmes Hall, used to be a rubber factory – a fact I was beginning to doubt as I read about the history of United Drug Company.

But then I ran across this tidbit. After the initial construction of “a small factory” in Boston, “A candy making department was the next installation, followed by one for perfumery in 1905. Stationery and fountain supplies were added in 1910, rubber goods in 1912, brushes in 1913 and hospital items in 1919.”

So I guess United Drug Company had a rather diverse manufacturing portfolio.

John N. Ingham’s Biographical Dictionary of American Business Leaders, Volume 2 – the book you never knew you needed – confirms this history, noting that “United’s first product was a dyspepsia tablet, but it soon was bringing out a wide variety of patent medicines, along with spices, toilet soap, candy, and rubber goods.”

Furthermore, optically scanned versions of various trademark and patent applications can be found online.

But what I really want to know is how the still-standing building maps onto UDC’s original operation.

Some indications of this can be found in a 1998 Northeastern publication:

Evidence of the United Drug Company survives today. Fired terra-cotta shields at the tops of the beveled corners at Greenleaf Street carry the lettering “UD Co.” On the fifth floor of Lake Hall, the Math Department enjoys the dark wood paneling and marble fireplace of United Drug’s president’s office. Every floor in the building carries a large, walk-in safe, perhaps for protecting secret product formulas. And over the door on the Leon Street end of today’s Ryder Building is a carved sign: United Drug Company Department of Research and Technology.

And one more fun fact revealed by that document:

In 1961, Northeastern purchased a seven-acre parcel of land from the United Realty Company. The entire, red-brick industrial complex occupying the site, once owned by the United Drug Company, was to be razed to make way for a sports facility. After reducing three blocks of buildings facing Forsyth Street to rubble, however, the demolition crew was ordered to stop. The University had grown so rapidly that the old buildings now had to be salvaged for offices and laboratories.


Networks in Political Theory

While graph theory has a long and rich history as a field of mathematics, it is only relatively recently that these concepts have found their way into the social sciences and other disciplines.

In 1967, Stanley Milgram published his work on The Small World Problem. In 1973 Mark Granovetter studied The Strength of Weak Ties. But while this applied methodology is young, the interest in networks has been in the ether of political theory for some time.
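To make the “small world” idea a bit more concrete, here is a minimal, purely illustrative sketch in Python – using the networkx library and a Watts–Strogatz model with parameters of my own choosing, not anything drawn from Milgram or Granovetter. The point it illustrates: in a sparsely connected network, a few random “shortcut” ties are enough to keep the average distance between any two nodes remarkably short.

```python
# Illustrative only: a Watts-Strogatz "small world" graph keeps neighborhoods
# tightly clustered while a few random shortcut ties keep path lengths short.
import networkx as nx

# 1,000 nodes, each tied to its 10 nearest neighbors, with 5% of ties
# rewired at random (all parameters are arbitrary choices for this sketch).
G = nx.connected_watts_strogatz_graph(n=1000, k=10, p=0.05, seed=42)

print("average clustering coefficient:", round(nx.average_clustering(G), 3))
print("average shortest path length:", round(nx.average_shortest_path_length(G), 2))
```

Even with only a handful of rewired ties, the average path length collapses to just a few steps – the mathematical echo of Milgram’s famous “six degrees.”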

I’ve noted previously how Walter Lippmann can be interpreted as invoking networks in his 1925 book The Phantom Public. Lippmann argues vehemently against this thing we call ‘public opinion’ – a myth Lippmann doesn’t believe truly exists:

We have been taught to think of society as a body, with a mind, a soul, and a purpose, not as a collection of men, women and children whose minds, souls and purposes are variously related. Instead of being allowed to think realistically of a complex of social relations, we have had foisted upon us by various great propagative movements the notion of a mythical entity, called Society, the Nation, the Community.

Rather than thinking of society as a single, collective whole, Lippmann argues that we ought to “think of society not as the name of a thing but as the name of all the adjustments between individuals and their things.”

I’d noted this passage in Lippmann when I first read his work several years ago. But I was surprised recently to come across a similarly network-oriented sentiment in John Dewey’s 1927 rebuttal, The Public and Its Problems:

In its approximate sense, anything is individual which moves and acts as a unitary thing. For common sense, a certain spatial separateness is the mark of this individuality. A thing is one when it stands, lies or moves as a unit independently of other things, whether it be a stone, tree, molecule or drop of water, or a human being. But even vulgar common sense at once introduces certain qualifications. The tree stands only when rooted in soil; it lives or dies in the mode of its connections with sunlight, air and water. Then too the tree is a collection of interacting parts; is the tree more a single whole than its cells?

…From another point of view, we have to qualify our approximate notion of an individual as being that which acts and moves as a unitary thing. We have to consider not only its connections and ties, but the consequences with respect to which it acts and moves. We are compelled to say that for some purposes, for some results, the tree is an individual, for others the cell, and for a third, the forest or the landscape…an individual, whatever else it is or is not, is not just the specially isolated thing our imagination inclines to take it to be.

…Any human being is in one respect an association, consisting of a multitude of cells each living its own life. And as the activity of each cell is conditioned and directed by those with which it interacts, so the human being whom we fasten upon as individual par excellence is moved and regulated by his association with others; what he does and what the consequences of his behavior are, what his experience consists of, cannot even be described, much less accounted for, in isolation.


“Love the Hell Out of Everybody” – an Evening with John Lewis

There aren’t too many people who get a standing ovation before they even speak.

John Lewis, former chairman of the Student Nonviolent Coordinating Committee (SNCC) and the last living member of the “Big Six” civil rights leaders, is one of them.

From the moment he walked on stage, I could feel the energy in the room: the overwhelming love and appreciation for this man who endured so many brutal beatings as he strove for justice; the rising hope that tenaciously carries on from the victories of the civil rights movement; and the growing despair that we are sliding backwards in time, regressing towards our darker days of hatred and oppression.

And then he spoke. A deep, melodic voice that rolled across the room, reverberating from every corner. The crowd fell silent.

This was actually the second time I had the pleasure of hearing John Lewis speak. The first was in 2009, when he was my commencement speaker as I finished my master’s degree at Emerson College. The second time, last night, he delved even deeper into his experience of the civil rights movement as he was hosted by my former colleagues at Tisch College at Tufts University.

He’s a politician now – Lewis has served as Congressman for Georgia’s 5th congressional district since 1987 – but he doesn’t speak with the same canned cadence which is so widespread amongst elected officials.

You get the distinct impression he genuinely believes what he says; and that his beliefs have been shaped by the difficult crucible of experience.

In 1965 he led nearly 600 protestors in a peaceful march over the Edmund Pettus Bridge in Selma, Alabama.

Prepared to be arrested, the 25-year-old Lewis carried the essentials with him: two books, an apple, a banana, and a toothbrush. All the things he thought he might need in prison.

“I thought I’d be arrested and jailed,” Lewis recalled. “But I didn’t think I’d be beaten – that I’d be left bloody there.”

Lewis’ skull was fractured by a police nightstick and he nearly lost his life.

It wasn’t the first time Lewis had been beaten, either. At the age of 21, Lewis was the first of the Freedom Riders to be assaulted while working to desegregate the interstate bus system.

This was life for a young, black man in 1960s America.

And, perhaps most remarkably, through it all Lewis continues to follow the message of his friend and mentor, Dr. Martin Luther King. In response to such brutal attacks, in the face of the terrible injustices of today, Lewis turns not to anger, but to love.

“To revolutionize America to be better, we must first revolutionize ourselves. We have to humanize our institutions, humanize ourselves,” he argues.

For Lewis, the choice is quite simple, “You can’t set out to build the beloved community and use tactics which are not loving.”

So he endured the bloody beatings, endured the deepest injustices of our system. And in 2009, when a former Klansman apologized for being one of Lewis’ assailants, the two hugged and cried.

“That’s the power of the way of peace, the way of love, the way of non-violence,” Lewis said.

Of course, not all activists share this view – and in remembering the civil rights movement, we too often gloss over or belittle the important contributions of activists like Malcolm X. But that’s a longer debate for another day.

So for now, I will leave you with a final thought from John Lewis, who has endured so much in his continuing fight for the equality of all people. Quoting Dr. King, Lewis just smiles and explains his philosophy simply:

“Just love the hell out of everybody.”


The Death of Dr. Martin Luther King

On April 4, 1968 – forty-eight years ago yesterday – at 6:01 pm, Dr. Martin Luther King, Jr. was assassinated as he stood on the second floor balcony of room 306 at the Lorraine Motel in Memphis, Tennessee.

Four days later, Congressman John Conyers (D-MI) introduced legislation to establish a national holiday to honor Dr. King. That legislation was eventually signed into law on November 2, 1983, fifteen years after Dr. King’s death.

On the occasion of the bill signing, President Ronald Reagan declared:

…our nation has decided to honor Dr. Martin Luther King, Jr., by setting aside a day each year to remember him and the just cause he stood for. We’ve made historic strides since Rosa Parks refused to go to the back of the bus. As a democratic people, we can take pride in the knowledge that we Americans recognized a grave injustice and took action to correct it. And we should remember that in far too many countries, people like Dr. King never have the opportunity to speak out at all. 

But traces of bigotry still mar America. So, each year on Martin Luther King Day, let us not only recall Dr. King, but rededicate ourselves to the Commandments he believed in and sought to live every day: Thou shall love thy God with all thy heart, and thou shall love thy neighbor as thyself. And I just have to believe that all of us—if all of us, young and old, Republicans and Democrats, do all we can to live up to those Commandments, then we will see the day when Dr. King’s dream comes true.

Perhaps, in the optimistic spirit of Dr. King, it is right that we remember his legacy on the day of his birth. Yet this observance is cruel in kindness, somehow – a soft celebration of our darker days.

As a democratic people, we can take pride in the knowledge that we Americans recognized a grave injustice and took action to correct it. 

Took too little action, I’m afraid.

We made progress, no doubt, but far too little compared to the difficult work still ahead of us. The legacy of our history, the deeds of our ancestors, are not so easily wiped out. It’s shameful to pretend otherwise.

Yet once a year, we blithely celebrate our victory; we take pride in our justice and imagine that we, if given the opportunity, would have been on the right side of history. Such tragedies would never happen in our America.

And little do we note the day of Dr. King’s passing; the day white violence took him from our world.

It’s an easy choice in some ways; far better to recognize a day of hope, to celebrate our better selves. But the murder of Dr. King is our legacy, too; a painful reality which is easier to ignore.

So perhaps we would do well to remember the speech Dr. King gave the day before his assassination.

Dr. King declared that he was happy to have lived “just a few years” in the current time. It was a dark and dangerous time, but he was happy because:

Something is happening in our world. The masses of people are rising up. And wherever they are assembled today, whether they are in Johannesburg, South Africa; Nairobi, Kenya; Accra, Ghana; New York City; Atlanta, Georgia; Jackson, Mississippi; or Memphis, Tennessee — the cry is always the same: “We want to be free.”

Dr. King knew that there were threats against him. He knew that the FBI had investigated him and urged him to commit suicide. He knew that there were many who would act to see him dead. But, he declared:

…it really doesn’t matter with me now, because I’ve been to the mountaintop.

And I don’t mind.

Like anybody, I would like to live a long life. Longevity has its place. But I’m not concerned about that now. I just want to do God’s will. And He’s allowed me to go up to the mountain. And I’ve looked over. And I’ve seen the Promised Land. I may not get there with you. But I want you to know tonight, that we, as a people, will get to the promised land!

And so I’m happy, tonight.

I’m not worried about anything.

He was murdered the next day.

And his legacy lives on.

But his true legacy is not a reflection of the injustice behind us, but rather a reminder of the work still ahead of us. He had gone up the mountain; he had seen the promised land.

We have a long journey remaining, and there is much work to be done.


The AP and Nazi Germany

Harriet Scharnberg, a German historian and Ph.D. student at the Institute of History of the Martin Luther University of Halle-Wittenberg, made waves yesterday with the release, in the journal Studies in Contemporary History, of her paper, Das A und P der Propaganda: Associated Press und die nationalsozialistische Bildpublizistik.

The paper finds that, prior to the expulsion of all foreign media in 1941, the AP collaborated with Nazi Germany – signing the Schriftleitergesetz (editor’s law), which forbade the employment of “non-Aryans” and effectively ceded editorial control to the German propaganda ministry.

These are claims which the AP vehemently denies:

AP rejects the suggestion that it collaborated with the Nazi regime at any time. Rather, the AP was subjected to pressure from the Nazi regime from the period of Hitler’s coming to power in 1933 until the AP’s expulsion from Germany in 1941. AP staff resisted the pressure while doing its best to gather accurate, vital and objective news for the world in a dark and dangerous time.

AP news reporting in the 1930s helped to warn the world of the Nazi menace. AP’s Berlin bureau chief, Louis P. Lochner, won the 1939 Pulitzer Prize for his dispatches from Berlin about the Nazi regime. Earlier, Lochner also resisted anti-Semitic pressure to fire AP’s Jewish employees and when that failed he arranged for them to become employed by AP outside of Germany, likely saving their lives.

Lochner himself was interned in Germany for five months after the United States entered the war and was later released in a prisoner exchange.

Regardless of which account presents a more accurate historical truth, I find this controversy quite fascinating.

According to the Guardian, the AP was the only western news agency able to stay open in Hitler’s Germany, while other outlets were kicked out for refusing to comply with Nazi regulations.

This exclusivity lends credence to the claim that the news agency did, in some way, collaborate – since it seems improbable that the Nazis would have allowed them to continue without some measure of compliance. It also suggests a shameful reason for this compliance: choosing to stay, even under disagreeable terms, was a smart business decision.

But it also highlights the interesting challenge faced by foreign correspondents covering repressive regimes.

For German news media, it was a zero-sum game: either comply with the Schriftleitergesetz or face charges of treason – a charge that would likely have serious repercussions for one’s family as well.

The AP, from what I can tell, seems to have skirted some middle ground.

By their account, the AP did work with a “photo agency subsidiary of AP Britain” which, in 1935 “became subject to the Nazi press-control law but continued to gather photo images inside Germany and later inside countries occupied by Germany.”

While images from this subsidiary were supplied to U.S. newspapers, “those that came from Nazi government, government-controlled or government–censored sources were labeled as such in their captions or photo credits sent to U.S. members and other customers of the AP, who used their own editorial judgment about whether to publish the images.”

The line between collaboration and providing critical information seems awfully fuzzy here.

Critics would claim that the AP was simply looking out for its own bottom line, sacrificing editorial integrity for an economic advantage. The AP, however, seems to argue that it was a difficult time and they did what they had to do to provide the best coverage they could – they did not collaborate, but they played by the rules just enough to maintain the access needed to share an important story with the world.


A Brief History of Saint Days

So, I went down a bit of a rabbit hole this morning trying to figure out answers to what I thought were somewhat straightforward questions. First, when did people in various western European countries stop celebrating their Saints’ day – or name day, if you will – and second, how did the various reorganizations of the liturgical calendar affect name day celebrations?

I rather thought there would be plenty of information and resources to explore these questions, but I’m afraid I’ve merely found fragments.

The Catholic Church has celebrated feast days for important saints nearly since its inception. St. Martin of Tours, born in 316 in Sabaria (now Szombathely, Hungary), is thought to be the first saint – or at least the first not to die as a martyr.

Saint days quickly became a staple of the early Catholic church. As Christian Rohr has argued, these were not just days of religious observance, but were deeply steeped in the symbols and politics of their times:

When the feudal and the chivalrous system had been fully established during the High Middle Ages these leading social groups had to find an identity of their own by celebrating courtly feasts. So, they distinguished themselves from the rest of the people. Aristocratic festival culture, consisting of tournaments, courtly poetry and music, but also of expensive banquets, was shown openly to the public, representing the own personality or the own social group in general. Town citizens and craftsmen, however, were organized in brotherhoods and guilds; they demonstrated their community by celebrating common procession, such as on the commemoration day of the patron saint of their town or of their profession.

These courtly feasts were “held on high religious celebration days” – over half took place on Whitsunday. For craftsmen, Rohr points to the French city of Colmar, where “the bakers once stroke for more than ten years to receive the privilege to bear candles with them during the annual procession for the town patron.”

And, somewhere amid these deeply interwoven strands of religion, economics, and power, people began celebrating their own Saints’ day. That is, as most people shared a name with one of the saints, that saint’s feast day would have special significance for them.

It’s unclear to me exactly when or how this came about. Most references I read about these name day celebrations simply indicate that they have “long been popular.”

Name day celebrations today – though generally more secular in their modern incarnation – take place in a range of “Catholic and Orthodox countries…and [have] continued in some measure in countries, such [as] the Scandinavian countries, whose Protestant established church retains certain Catholic traditions.”

But here’s the interesting thing: at least based on Wikipedia’s list of countries where name day celebrations are common, the practice is much more common in Eastern Orthodox countries than in Roman Catholic ones.

Now, the great East–West Schism – which officially divided the two churches – took place in 1054. My sense – though I’ve had trouble finding documentation of this – is that celebrating one’s saints’ day was a common practice in both east and west at that time. Name day celebrations do take place in the western European countries of France, Germany, and – importantly – Italy, which seems to indicate that the difference in name day celebration rates is not merely a reflection of an east-west divide.

It’s entirely unclear to me what led to this discrepancy. One theory is that this is a by-product of the Reformation – during which time, at least in the UK, various laws banned Catholics from practicing.

But, I also find myself wondering about the effects of various reorganizations of the (Roman Catholic) liturgical calendar – i.e., the calendar of Saint Days and other religious festivals. The calendar has been adjusted many times over the years, including as recently as 1969, when Pope Paul VI, explaining that “in the course of centuries the feasts of the saints have become more and more numerous,” justified the new calendar:

…the names of some saints have been removed from the universal Calendar, and the faculty has been given of re-establishing in regions concerned, if it is desired, the commemorations and cult of other saints. The suppression of reference to a certain number of saints who are not universally known has permitted the insertion, within the Roman Calendar, of names of some martyrs of regions where the proclaiming of the Gospel arrived at a later date. Thus, as representatives of their countries, those who have won renown by the shedding of their blood for Christ or by their outstanding virtues enjoy the same dignity in this same catalogue.

Most notably and controversially, Saint Christopher was deemed not to be of the official Roman tradition, though celebration of his feast day is still permitted under some regional calendars. If you’re curious, you can read a list of the full changes made to the liturgical calendar in 1969.

Many of these changes, such as the removal of Symphorosa and her seven sons, likely had little effect on anyone’s name day celebration. But, by mere probability, I would think that at some point over the years, someone had their Saint removed from the liturgy – which I imagine would probably be a rather disarming event. Though I suspect that wasn’t a big enough factor in diminishing the strength of the celebration over time.

Well, that is all that I have been able to find out. I have many unanswered questions and many more which keep popping up. If you have some expertise in Catholic liturgy and have any theories or answers, please let me know. Otherwise, I suppose, it will remain another historical mystery.

 


The Easter Rebellion and Lessons From Our Past

I had planned today to write something commemorating the centenary of Ireland’s Easter Rising; the quickly-crushed insurrection which paved the way for the Irish Free State.

But such reflections seem somewhat callous against the grim backdrop of current world events.

Just this weekend, a suicide bomber killed at least 70 – mostly children – in an attack on a park in Lahore, Pakistan.

I debated this morning whether to write about that instead. Whether to grieve the mounting death toll from attacks around the world, or whether to question, again, our seemingly preferential concern for places like Brussels and Paris. Or perhaps to highlight the inequities evident in such headlines as CNN’s In Pakistan, Taliban’s Easter bombing targets, kills scores of Christians.

The majority of those killed were Muslim.

Perhaps these details hardly matter; it is all of it a horror.

But if I were to write about every global tragedy, these pages would find room for little else. There is no end to suffering, no limit of atrocity.

Perhaps I should write instead about Radovan Karadzic, the former Bosnian Serb leader, who – twenty years after orchestrating the ethnic cleansing of Srebrenica – was just convicted of genocide, war crimes and crimes against humanity by a United Nations tribunal.

Of course, such news also serves as a reminder that Omar al-Bashir, the current, sitting president of Sudan, is wanted by the International Criminal Court (ICC) for war crimes and crimes against humanity. He is also widely considered to be a perpetrator of genocide, though the ICC demurred from making that charge. The ICC issued its arrest warrant in 2009, citing numerous crimes committed since 2003. Bashir won reelection in 2010 and again in 2015.

It is all too much.

Perhaps I should write about the Easter Rising – a notable event for my own family – after all.

In the midst of World War I, on Easter Monday 1916, 1,600 Irish rebels seized strategic government buildings across Dublin. From the city’s General Post Office, Patrick Pearse and other leaders of the rising issued a Proclamation of the Provisional Government of the Irish Republic:

We declare the right of the people of Ireland to the ownership of Ireland and to the unfettered control of Irish destinies, to be sovereign and indefeasible. The long usurpation of that right by a foreign people and government has not extinguished the right, nor can it ever be extinguished except by the destruction of the Irish people.

The overwhelming superiority of British artillery soon put an end to the provisional government. Over 500 people were killed; more than half were civilians. In The Rising, historian Fearghal McGarry argues that Irish rebels attempted to avoid needless bloodshed, while the British troops, according to one British soldier, “regarded, not unreasonably, everyone they saw as an enemy, and fired at anything that moved.”

During the fighting, the British artillery attacks were so intense that the General Post Office (GPO) was left as little but a burnt-out shell. As an aside, the GPO housed generations of census records and other government documents – a loss that has made it permanently impossible for my mother to reconstruct our family tree.

After the rebellion had been crushed, fifteen people identified as leaders were executed by firing squad the following week.

This week is rightly a time of commemoration and celebration in Ireland. The brutality of the British response galvanized the Irish people – among whom the uprising had initially been unpopular. The tragedy of the Easter Rising thus led to Irish freedom and, after many more decades, ultimately to peace.

It’s a long and brutal road, but amid all the world’s horrors, confronted by man’s undeniable inhumanity to man, perhaps it is well to remember: we do have the capacity for change.


Populism and Democracy

Yesterday, I discussed some of the concerns Walter Lippmann raised about entrusting too much power to “the people” at large.

Such concerns are near blasphemy in a democratically-spirited society, yet I consistently find myself turning towards Lippmann as a theorist who eloquently raises critical issues which, in my view, have yet to be sufficiently addressed.

At their worst, Lippmann’s arguments are interpreted as rash calls for technocracy: if “the people” cannot be trusted, only those who are educated, thoughtful, and qualified should be permitted to voice public opinions. In short, political power should rightly remain with the elites.

I find that to be a misreading of Lippmann and a disservice to the importance of the issues he raises.

In fact, Lippmann’s primary concern was technocracy – government by an elite caring solely for its own interests, whose power ensured its continued dominion. Calling such a system “democracy” merely creates an illusion of the public’s autonomy, thereby only serving to cement elites’ power.

I do not dispute that Lippmann finds “the public” wanting. He clearly believes that the population at large is not up to the serious tasks of democracy.

But his charges are not spurious. The popularity of certain Republican candidates and similarly fear-mongering politicians around the world should be enough to give us pause. The ideals of democracy are rarely achieved; what is popular is not intrinsically synonymous with what is Good.

This idea is distressing, no doubt, but it is worth spending time considering the possible causes of the public’s failures.

One account puts this blame on the people themselves: people, generally speaking, are too lazy, stupid, or short-sighted to properly execute the duties of a citizen. This would be a call for some form of technocratic or meritocratic governance – perhaps those who don’t put in the effort to be good citizens should be plainly denied a voice in governance.

Robert Heinlein, for example, suggests in his fiction that only those who serve in the military should be granted the full voting rights of citizenship. “Citizenship is an attitude, a state of mind, an emotional conviction that the whole is greater than the part…and that the part should be humbly proud to sacrifice itself that the whole may live.”

Similarly, people regularly float the idea of a basic civics test to qualify for voting. You aren’t permitted to drive a car without proving you know the rules of the road; you shouldn’t be allowed to vote unless you can name the branches of government.

Such a plan may seem reasonable on the surface, but it quickly introduces serious challenges. For generations in this country, literacy tests have been used to disenfranchise poor voters, immigrants, and people of color. And even if such disenfranchisement weren’t the result of intentional discrimination – as it often was – the existence of any such test would be biased in favor of those with better access to knowledge.

That is – those with power and privilege would have no problems passing such a test while our most vulnerable citizens would face a significant barrier. To make matters worse, these patterns of power and privilege run deeply through time – a civics test for voting quickly goes from a tool to encourage people to work for their citizenship to a barrier that does little but reinforce the divide between an elite class and non-elites.

And this gives a glimpse towards another explanation for the public’s failure: perhaps the problem lies not with “the people” but with the systems. Perhaps people are unengaged or ill-informed not because of their own faults, but because the structures of civic engagement don’t permit their full participation.

Lippmann, for example, documented how even the best news agencies fail in their duty to inform the public. But the structural challenges for engagement run deeper.

In Power and Powerlessness, John Gaventa documents how poor, white coal miners regularly voted in local elections – and consistently voted for those candidates supported by coal mine owners. These were often candidates who actively sought to crush unions and worked against workers’ rights. Any fool could see they did not have the interest of the people at heart…but the people voted for them anyway, often in near-unanimous elections.

To the outsider, these people seem stupid or lazy – the type whose vote should be taken away for their own good. But, Gaventa argues, to interpret it that way is to miss what’s really going on:

Continual defeat gives rise not only to the conscious deferral of action but also to a sense of defeat, or a sense of powerlessness, that may affect the consciousness of potential challengers about grievances, strategies or possibilities for change….From this perspective, the total impact of a power relationship is more than the sum of its parts. Power serves to create power. Powerlessness serves to re-enforce powerlessness.

In the community Gaventa studied, past attempts to exercise political voice dissenting from the elite had led to people losing their jobs and livelihoods. If I remember correctly, some had their homes burned and some had been shot.

It had been some time since such retribution had been taken, but Gaventa’s point is that it didn’t need to be. Elites had established their control so thoroughly, so completely, that poor residents did what was expected of them with hardly a thought. They didn’t need to be threatened so rudely; their submission was complete.

Arguably, theorists like Lippmann see a similar phenomenon happening more broadly.

If you are deeply skeptical of the system, you might believe it to be set up intentionally to minimize the will of the people. In the States at least, our founding fathers were notoriously scared of giving “the people” too much power. They liked the idea of democracy, but also saw the flaws and dangers of pure democracy.

In Federalist 10, James Madison argued:

From this view of the subject it may be concluded that a pure democracy, by which I mean a society consisting of a small number of citizens, who assemble and administer the government in person, can admit of no cure for the mischiefs of faction. A common passion or interest will, in almost every case, be felt by a majority of the whole; a communication and concert result from the form of government itself; and there is nothing to check the inducements to sacrifice the weaker party or an obnoxious individual. Hence it is that such democracies have ever been spectacles of turbulence and contention; have ever been found incompatible with personal security or the rights of property; and have in general been as short in their lives as they have been violent in their deaths. Theoretic politicians, who have patronized this species of government, have erroneously supposed that by reducing mankind to a perfect equality in their political rights, they would, at the same time, be perfectly equalized and assimilated in their possessions, their opinions, and their passions.

To give equal power to all the people is to set yourself up for failure; to leave nothing to check “an obnoxious individual.”

Again, there is something very reasonable in this argument. I’ve read enough stories about people being killed in Black Friday stampedes to know that crowds don’t always act with wisdom. And yet, from Gaventa’s argument I wonder – do the systems intended to check the madness of the crowd instead work to reinforce power and inequity, making the nameless crowd just that much more wild when an elite chooses to whip them into a frenzy?

Perhaps this system – democracy but not democracy, populism but not populism – is self-reinforcing; a poison that encourages the public – essentially powerless – to use what power they have to support those crudest of elites who prey on fear and hatred to advance their own power.

As Lippmann writes in The Phantom Public, “the private citizen today has come to feel rather like a deaf spectator in the back row …In the cold light of experience he knows that his sovereignty is a fiction. He reigns in theory, but in fact he does not govern…”


On Public Opinion

Walter Lippmann was notoriously skeptical of “the people.”

The Pulitzer Prize winning journalist was all too familiar with the art of propaganda, with the ease with which elites could shape so-called “public opinion.”

In 1920, Lippmann – who had worked for the “intelligence section” of the U.S. government during the first World War – published a 42-page study on “A Test of the News” with collaborator Charles Merz.

“A sound public opinion cannot exist without access to the news,” they argued, and yet there is “a widespread and a growing doubt whether there exists such an access to the news about contentious affairs.”

That doubt doesn’t seem to have diminished any in the last hundred years.

Civic theory generally imagines the ideal citizen as one who actively seeks out the news and possesses the sophistication to stay informed of current events without bias. But debate over the practicality of that ideal is moot if even such an ideal citizen cannot gain access to accurate and unbiased news.

Lippmann and Merz sought to empirically measure the quality of the news by examining over three thousand articles published in the esteemed New York Times during the Russian Revolution (1917-1920).

What they found was disheartening:

From the point of view of professional journalism the reporting of the Russian Revolution is nothing short of a disaster. On the essential questions the net effect was almost always misleading, and misleading news is worse than none at all. Yet on the face of the evidence there is no reason to charge a conspiracy by Americans. They can fairly be charged with boundless credulity, and an untiring readiness to be gulled, and on many occasions with a downright lack of common sense.

Whether they were “giving the public what it wants” or creating a public that took what it got, is beside the point. They were performing the supreme duty in a democracy of supplying the information on which public opinion feeds, and they were derelict in that duty. Their motives may have been excellent. They wanted to win the war; they wanted to save the world. They were nervously excited by exciting events. They were baffled by the complexity of affairs, and the obstacles created by war. But whatever the excuses, the apologies, and the extenuation, the fact remains that a great people in a supreme crisis could not secure the minimum of necessary information on a supremely important event.

And lest we think such failures are relegated to history, consider the U.S. media’s coverage leading up to the Iraq War. Here, too, it seems fair to say that whatever the motives of media, they were indeed derelict in their duty.

Such findings gave Lippmann a deep sense of unease about “popular opinion.”

“The public,” he writes in The Phantom Public (1925), “will arrive in the middle of the third act and will leave before the last curtain, having stayed just long enough perhaps to decide who is the hero and who the villain of the piece.”

The public makes its judgements on gut instinct and imperfect knowledge. Most do not understand a situation in full detail – they know neither the history nor the possible implications of their views. They are consumed with the details of their own daily lives, raising their eyes to politics just long enough to briefly consider what might be best for them in that moment.

Such a system is sure to end in disaster – with public opinion little more than a tool manipulated by elites.

As Sheldon Wolin describes in Political Theory as Vocation, such a system would be ‘democracy’ in name but not in deed:

The mass of the population is periodically doused with the rhetoric of democracy and assured that it lives in a democratic society and that democracy is the condition to which all progressive-minded societies should aspire. Yet that democracy is not meant to realize the demos but to constrain and neutralize it by the arts of electoral engineering and opinion management. It is, necessarily, regressive. Democracy is embalmed in public rhetoric precisely in order to memorialize its loss of substance. Substantive democracy—equalizing, participatory, commonalizing—is antithetical to everything that a high-reward, meritocratic society stands for.

This is the nightmare Lippmann sought to avoid – but it is also the undeniable reality he saw around him.

In elevating “the voice of the people” to “the voice of god,” our founders not only made a claim Lippmann considers absurd, but paved the way for a government of elites, by elites, and for elites – all in the hollow, but zealously endorsed, name of “the people.”


How Human Brains Give Rise to Language

Yesterday, I attended a lecture by Northeastern psychology professor Iris Berent on “How Human Brains Give Rise to Language.” Berent, who works closely with collaborators in a range of fields, has spent her career examining “the uniquely human capacity for language.”

That’s not to say that other animals don’t have meaningful vocalizations, but, she argues, there is something unique about the human capacity for language. Furthermore, this capacity cannot simply be attributed to mechanical differences – that is, human language is not simply a product of the computational power of our brains or of our oral and aural processing abilities.

Rather, Berent argues, humans have an intrinsic capacity for language. That is, as Steven Pinker describes in The Language Instinct, “language is a human instinct, wired into our brains by evolution like web-spinning in spiders or sonar in bats.”

While this idea may seem surprising, in some ways it is altogether reasonable: humans have specialized organs for seeing, breathing, processing toxins, and more – is it really that much more of a jump to say that the human brain is specialized, that the brain has a specialized biological system for language?

Berent sees this not as an abstract, philosophical question, but rather as one that can be tested empirically.

Specialized biological systems exhibit an invariant, universal structure, Berent explained. There is some variety among human eyes, but fundamentally they are all the same. This logic can be applied to the question of innate language capacity: if language is specialized, we would expect to find shared principles – we would expect what Noam Chomsky called a “universal grammar.”

In searching for a universal grammar, Berent doesn’t expect to find such a thing on a macro scale: there’s no universal rule that a verb can only come after a noun. But rather, a universal grammar would manifest in the syllables that occur – or don’t occur – across the breadth of human language.

To this end, Berent constructs a series of syllables which she expects will be increasingly difficult for human brains to process: bl > bn > bd > lb.

That is, it’s universally easier to say “blog” than to say “lbog,” with “bnog” and “bdog” having intermediate difficulty.
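To give a rough sense of what such a hierarchy looks like in practice, here is a toy sketch in Python. It ranks these onsets by their rise in sonority – roughly, how much more vowel-like the second consonant is than the first – using made-up sonority values of my own; this is an illustration of the general idea, not Berent’s actual formalism.

```python
# Toy illustration of a sonority-based ranking of onset clusters.
# The sonority values are illustrative (stops < nasals < liquids),
# not taken from Berent's work.
SONORITY = {"b": 1, "d": 1, "n": 2, "l": 3}

def sonority_rise(onset: str) -> int:
    """Sonority change from the first consonant of the onset to the second."""
    first, second = onset
    return SONORITY[second] - SONORITY[first]

# A larger rise predicts an easier syllable; this reproduces bl > bn > bd > lb.
for onset in ["bl", "bn", "bd", "lb"]:
    print(onset, sonority_rise(onset))  # bl 2, bn 1, bd 0, lb -2
```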

One argument for this is simply the frequency of such constructions – in languages around the world “bl” occurs more frequently than “lb.”

Of course, this by no means proves the existence of an innate, universal grammar, as we cannot account for the socio-historical forces that shaped modern language, nor can we be sure such variance isn’t due to the mechanical limitations of human speech.

Berent’s research, therefore, aims to prove the fundamental universality of such syllables – showing that there is a universal hierarchy of what the human brain prefers to process.

In one experiment, she has Russian speakers – whose language does use the difficult “lb” construction – read such a syllable out loud. She then asks speakers of languages without that construction (in this case English, Spanish, and Korean) how many syllables the sound contains.

The idea here is that if your brain can’t process “lbif” as a syllable, it will silently “repair” it to the 2-syllable “lebif.”
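As a toy model of that repair – again just an illustration built on the same made-up sonority scale, not Berent’s actual stimuli or procedure – imagine a listener who perceptually inserts a vowel whenever an onset falls below the sonority rise their language tolerates:

```python
# Toy model of perceptual "repair": if an onset's sonority rise falls below a
# listener's threshold, an epenthetic vowel is perceived, splitting one
# syllable into two. Illustrative values only.
SONORITY = {"b": 1, "d": 1, "n": 2, "l": 3}

def perceive(word: str, min_rise: int) -> str:
    first, second = word[0], word[1]
    rise = SONORITY[second] - SONORITY[first]
    if rise < min_rise:
        return first + "e" + word[1:]  # repair: "lbif" is heard as "lebif"
    return word

print(perceive("lbif", min_rise=1))  # "lebif" - repaired to two syllables
print(perceive("blif", min_rise=1))  # "blif"  - heard as one syllable
```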

In numerous studies, she found that as listeners went from hearing syllables predicted to be easy to syllables predicted to be hard, they were in fact more likely to “repair” the word. Doing the experiment with fMRI and Transcranial Magnetic Stimulation (TMS) further revealed that people’s brains were indeed working harder to process the predicted-harder syllables.

All this, Berent argues, is evidence that a universal grammar does exist. That today’s modern languages are more than the result of history, social causes, or mechanical realities. The brain does indeed seem to have some specialized language system.

For myself, I remain skeptical.

As Vyvyan Evans, Professor of Linguistics at Bangor University, writes, “How much sense does it make to call whatever inborn basis for language we might have an ‘instinct’? On reflection, not much. An instinct is an inborn disposition towards certain kinds of adaptive behaviour. Crucially, that behaviour has to emerge without training…Language is different…without exposure to a normal human milieu, a child just won’t pick up a language at all.”

Evans instead points to a simpler explanation for the emergence of language – cooperation:

Language is, after all, the paradigmatic example of co‑operative behaviour: it requires conventions – norms that are agreed within a community – and it can be deployed to co‑ordinate all the additional complex behaviours that the new niche demanded…We see this instinct at work in human infants as they attempt to acquire their mother tongue…They are able to deploy sophisticated intention-recognition abilities from a young age, perhaps as early as nine months old, in order to begin to figure out the communicative purposes of the adults around them. And this is, ultimately, an outcome of our co‑operative minds. Which is not to belittle language: once it came into being, it allowed us to shape the world to our will – for better or for worse. It unleashed humanity’s tremendous powers of invention and transformation.
