City Life

I have been almost entirely offline for the last two weeks – in Vienna for two days, then in Budapest, first speaking at a great workshop on gendered creative teams hosted by CEU, and then staying for an extra week of sightseeing and visiting.

It was an exciting and valuable trip in a number of ways, and I’m still trying to process all the things I saw and heard, all the people I met and learned from. There was so much, in fact – so many rich details I want to hang on to – that I plan to spend this week slowly reflecting on and working through my experiences from the trip, some mundane and some academic.

I’m still a little jet-lagged and working my way back into normal life, so I want to start today with some simple observations.

I am hardly the most well-traveled person, but from the places I have been – Japan, India, parts of Europe, and, of course, the U.S. – I have this theory that all big cities are essentially the same in some fundamental way.

I don’t mean to dismiss the differences between places, people, and cultures. Each city I have been in has had a rich personality, uniquely its own. But at the same time, there’s something I find delightfully human about the universality of city life: people just trying to get to work and going about their day.

There are tourists and students, people who are paid to be happy, and people who will be grumpy no matter how much they are paid. There are people at all different stages of their lives; some having good days and others having bad days. I saw people taking wedding photos, playing with their kids, and enjoying each other’s company in the park. I heard people complaining, I heard teenagers gossiping, and I saw the blank, morning stare that I can only describe as the universal commuter face.

Cities just have so much life.

And while local customs and culture add a meaningfully distinctive flair to each city, one of the main things I notice when I travel is just how much our shared humanity unites us.

All around the world, no one is excited to commute into work early on a Monday morning.


The Gender of Folly

In Erasmus’ famous 1511 essay, The Praise of Folly, the embodiment of Folly herself delivers satirical oratory, praising herself and critiquing the norms and institutions of the day.

The piece itself is wonderfully well written, and there is a wealth of scholarship examining Erasmus’ satirical intents.

But there is one element of the essay which I have always found particularly striking. As Folly finalizes her argument, she closes her refined rhetoric by stating:

If anything I have said shall seem too saucy or too glib, stop and think: ‘tis Folly, and a woman, that has spoken. But of course you will also remember that Greek proverb, “Even a foolish man will often speak a word in season,” unless, perhaps you assume that this does not extend to women.

Patricia Bizzell notes that scholars have generally paid little attention to Folly’s gender – after all, female muses and even fools were common in Renaissance oration, with roots dating back further.

Yet ignoring Folly’s gender seems a misstep – it is not incidental, but rather a core element of Erasmus’ satire. Folly’s gender allows her to dismiss herself – after all, ‘tis Folly, and a woman, that has spoken – even as she delivers outspoken criticism of society.

Her gender also makes her an outsider, as Bizzell writes:

I can’t take the persona’s gender for granted, especially as she’s depicted in Holbein’s illustrations for an early edition of the Praise: a woman in a fool’s cap and bells and an academic gown, speaking from a rostrum to an audience of men similarly attired (see Moriae 1989).

And while female personas were perhaps common in Renaissance work, Folly’s place as an orator is particularly notable. As Bizzell points out, “in the Renaissance, a woman who practices rhetoric in public, whether by orating or publishing, is usually deemed to be unchaste.”

Even as humanist education expanded to include upper-class women as well as men, women continued to be barred from the study of rhetoric. Oratory and rhetorical debate were fields where learned men battled. For a woman to enter such an arena – to share her voice in the public sphere – was, in Bizzell’s words, like being “the only female player in a touch football game…what chaste women would take such a risk?”

All this leaves unanswered the question of exactly what Erasmus argues for in Folly, but it underscores the importance of gender in transmitting that message.

The role of the Fool has long been to speak truth to power, protected by their own foolishness and disdained place in society. Folly, the unchaste woman, has particular power in this regard – power bestowed by her entire lack of power.

Though ‘entire lack’ is a blatant overstatement here, as the woman rhetor, well trained in the humanist arts, is no doubt of a certain class and a certain race – maligned for her gender but more empowered than others nonetheless. As Bizzell concludes:

If we think of ourselves as symbolically risking making fools of ourselves, we might consider the implications of taking on not only the fool’s disregard for social convention, which allows social criticism and the enactment of solidarity, but also the fool’s embrace of marginal social positions as well.

Perhaps this is ultimately why the persona of Folly spoke so strongly to me when I first read Erasmus’s mock-encomium. In the persona of the foolish slut, I saw, on the one hand, ways to compensate for my lack of gender privilege, that is to wrest rhetorical freedom out of the liabilities I incur as a woman breaking the taboos that still to some extent obtain on a woman’s speaking in public. On the other hand, I saw ways to undermine my race and class privileges, which may prevent me from identifying with oppressed people as much as I want to do: this very adoption of the ass-eared cap lends a provisionality to my words which, I hope, invites all others into the rhetorical process with me.


Two Years

I have recently completed the second year of my doctoral program in Network Science at Northeastern University, and it feels an appropriate time to satisfy my periodic indulgence for self-reflection.

Two years. That is a long time, though also not a long time. I know “new” businesses which have been open more than two years; I remember “recent” events which took place far longer than two years ago. Two years is nothing; it is a blink of an eye. Yet the last two years have seemed so long. So long in a good way: I have learned so much, changed so much, grown so much.

It’s been a great two years.

Before I continue, it is worth noting – for those of you playing at home – that, no, I am not almost done. I have at least three years left, so even the halfway mark seems a distant point on the horizon.

But I am entering what I can only describe as the ‘grown up’ phase of my studies. I am officially done with coursework – though I will no doubt continue to take classes from time to time. I’ve nearly put test-taking behind me – though I’ll spend the next several months studying for our Qualifying Exam. On the surface, then, it may seem as though little has changed…but this moment marks a subtle turning point in my academic life, as I increasingly shed the title of student and move into the role of researcher.

I rather imagined this would occur as a crystallizing event. As though I might crawl into my doctoral studies, quietly cocooned until I miraculously emerged a scholar.

And though I knew that’s never how it would happen, I find it nonetheless remarkable how transformative the meticulous metamorphosis has proven to be. I have learned so much – not just facts and skills, though I have learned those, too – but the past two years have fundamentally shifted the way I think and approach problems.

At the end of my first semester, I wrote that I had “been learning how to see the world through a particular epistemic frame: learning what questions to ask and what tools to deploy in answering them.”

At the end of my first year, I boasted that I could “trade nerdy jokes with people from any discipline” – a remark meant to highlight the value of interdisciplinary work. “As much as I have to learn from everyone I meet,” I wrote, “We all have something to learn from each other.”

This sentiment is reflected in the theme that comes to mind when I reflect on my past year of learning:

Year 2: I think I might know things.

The first year gave me the lay of the land; helped me learn the contours of all the things I didn’t know. The second year helped me start defining that landscape for myself. It would perhaps be an overstatement to say the second year helped me begin to make my own contributions – but it left me with the ineffable sense that I am on a path to be able to make contributions.

I still have much to learn – there is always more to learn. But as I wind down the second year of my studies, learning feels much more like the everyday act of living than the frantic attempts of someone in over their head.

That is to say, I am still learning – I frankly hope to always be learning – but for the first time it feels as though I could contribute nearly as much as I could learn.

Or more plainly: I think –

I might know things.

 


Re-Learning to be Human

I’m returning from a two-week blogging hiatus – the first of several I will be taking over the summer months.

This break was prompted by the madness of finals week: when my blogging devolves into posting snippets of homework assignments, it feels appropriate to take some time off. And then I decided to take the following week off as well. I was, I decided, in the most general sense of the term, on vacation.

I wasn’t lying on a beach somewhere or taking in the tourist sites, but rather I was staring at the wall, staring at my desk, catching up with people, completing miscellaneous errands, and fundamentally trying to remember how I normally live my life.

Most probably due to my immersion in the deliberative literature, the phrase that most came to mind this past week was Dewey’s expression, learning to be human.

“To learn to be human,” Dewey writes, “is to develop through the give-and-take of communication an effective sense of being an individually distinctive member of a community; one who understands and appreciates its beliefs, desires and methods, and who contributes to a further conversion of organic powers into human resources and values.”

Like much of Dewey’s writing, the expression comes dangerously close to an impossibly lofty, grandiose vision.

On its face, it seems almost absurdly metaphorical – are humans not born human? In what sense, then, might a human learn to be human?

Dewey argues that what we call “human” is much more than a collection of biological traits. Rather, being human, in its most fundamental sense, is essentially a social construct: “everything which is distinctively human is learned.”

Yes, we must indeed “learn to be human.”

And if this sounds absurd, I recommend reflecting on the expression the next time you emerge from an intensely focused cocoon. When you can’t remember what time you normally get up or what you’re supposed to do when you feel hungry. When you have this vague sense that you used to have friends, but you haven’t actually spoken to any of them in weeks. When you’re trying to remember your priorities in life, or maybe just trying to remember how to determine your priorities. When you have no real sense of what’s going on around you, just the unmistakable sense that things have been going on.

When you realize you’ve cordoned yourself so far off from society that you actually need to reintegrate before you can meaningfully engage –

That’s when you’re learning – or relearning, perhaps – what it means to be human.

And as Dewey argues, this isn’t something we can do by ourselves; one does not learn to be human alone. Rather, learning to be human is a fundamentally social endeavor, an ongoing process through which we each learn how to act and interact. It is the everyday work of learning and growing; of becoming who we are.


Networking

Having attended a conference last weekend, I met a lot of people and had a lot of conversations…and had a lot of conversations about meeting new people.

One thing that kept coming up was people’s dislike of utilitarian networking – the idea that, especially when at a conference, you should talk to specific kinds of people or intentionally work on building certain relationships out of a pure utilitarian desire to leverage that relationship for your own good.

Perhaps I simply haven’t attended enough conferences, but I don’t find this concern very…concerning. To be clear, I do find the very idea of utilitarian networking to be distasteful, but I don’t find networking to be inherently utilitarian.

Or perhaps I’m just not doing it right.

In a previous life, I would go to social events and not talk to anyone. Not necessarily out of distaste for networking, but out of a general malaise about life. Then, somehow, at some point along the line, I started talking to people.

And what I found was that people are really interesting.

Every conversation is like a window into a whole other universe of personhood. And the less you know the person, the more there is to learn.

So now when I go to events, I talk to people. As many people as meaningfully possible. Not out of a utilitarian drive to advance myself through connection, but out of a genuine desire to meet and learn from others.

Maybe I’m wrong, but I just thought that’s what networking is.


Keeping the Public Sphere Open

Tomorrow I will be participating in a conference on “Keeping the Public Sphere Open” hosted by Northeastern’s NULab for Texts, Maps, and Networks. The conference is taking place from 9:30 am – 5:30 pm and is free and open to the public. You may register here.

Here’s the description from the conference website:

On March 24, the NULab will be hosting its first annual conference, showcasing the work of faculty, fellows, alumni, and research collaborators. The conference will include a range of panels and talks, all organized around the theme: “Keeping the Public Sphere Open.”

The keynote address will be delivered by Peter Levine, Associate Dean and Lincoln Filene Professor of Citizenship & Public Affairs in Tufts University’s Jonathan M. Tisch College of Civic Life and Director of CIRCLE (The Center for Information and Research on Civic Learning and Engagement). Uta Poiger, Dean of Northeastern’s College of Social Sciences and Humanities and Professor of History, will deliver a welcome message to open the conference.

The conference will feature research from several NULab-supported projects. Ryan Cordell will speak about the Viral Texts project, Sarah Connell will discuss the Women Writers Project, Sarah Payne and William Bond will share the work of the Margaret Fuller Transnational Archive, and Elizabeth Dillon will talk about the Early Caribbean Digital Archive. There will also be talks by NULab faculty: Brooke Foucault Welles will present on networked counterpublics and the #HashtagActivism project; Nick Beauchamp will discuss his research into productive internet discourse, with Ph.D. candidate Sarah Shugars; David Lazer will talk about his work on transforming democracy by strengthening connections between citizens and town halls; David Smith will share research on modeling information cascades and propagating scientific knowledge; John Wihbey will present on the democratic role of news in an age of networks; Élika Ortega will discuss the architectures of print-digital literature; and Dietmar Offenhuber, Alessandra Renzi, and Nathan Felde will share the outcomes of a public event to digitize and tag posters from the Boston Women’s March.

Other talks will include the work of graduate students: Matt Simonson on social networks and cross-ethnic ties in Uganda; and Elizabeth Polcha and Nicole Keller Day on building the Digital Feminist Commons and how feminist humanists approach coding. NULab Fellow alum Jim McGrath (Brown University) will highlight some of the intersections between digital humanities and public humanities in his work at the John Nicholas Brown Center for Public Humanities and Cultural Heritage.


The Internet and Modernity

Reading Lee Rainie and Barry Wellman’s Networked, I was struck by their rebuttal of the argument put forth by McPherson, Smith-Lovin, and Brashears – an argument which rings throughout the work of Putnam and other scholars – that modern individuals are sad, hollow, isolated shells of humanity, and that modern technologies like the internet are what made this so.

Perhaps I was struck simply because this is an argument I have given so little serious attention. I am way past even considering concerns that video games make you violent, that rock n’ roll leads to devil worship, or that the internet has led to the collapse of our civic infrastructure. The internet is interesting to consider as a factor, perhaps, but to scapegoat the internet – to use Rainie and Wellman’s term – strikes me as absurd.

Rainie and Wellman argue that this “fixation on the internet” ignores “nearly a century of research showing that technological changes before the internet – planes, trains, telephones, telegraphs, and cars – neither destroyed relations and communities nor left them alone as remnants locked up in rural and urban villages.”

In defense of the internet, they point to the fact that “when asked, few people say that they, themselves, are living lives of lonely desperation.” And thus they find it wearisome that “even with these realizations, some people – and commentators – believe that they are the exceptions and that the masses around them are lonely, isolated, and fearful.”

“There is,” they assure us, “no reason to panic.”

Perhaps what is most striking about this debate – internet: friend or foe? – is that the problem isn’t really one of the modern moment; it is more properly a problem of modernity, an era that stretches back as far as one might dare to extend the concepts of modern thought or sensibilities.

In 1854 – which is, if I’m not mistaken, before the widespread popularity of the internet – Henry David Thoreau wrote in Walden, “the mass of men lead lives of quiet desperation.”

Thoreau went to the woods because he wished to live deliberately; because he yearned to escape the crushing speed and pressures of modern life. 1854 modern life, that is. As he famously writes in Walden:

I went to the woods because I wished to live deliberately, to front only the essential facts of life, and see if I could not learn what it had to teach, and not, when I came to die, discover that I had not lived. I did not wish to live what was not life, living is so dear; nor did I wish to practise resignation, unless it was quite necessary. I wanted to live deep and suck out all the marrow of life, to live so sturdily and Spartan-like as to put to rout all that was not life, to cut a broad swath and shave close, to drive life into a corner, and reduce it to its lowest terms…

A contemporary Thoreau might say the same about turning off Facebook or sticking with a flip phone; it’s a challenge of modernity, not a problem of technology.

In 1942, Albert Camus wrote of the absurd tragedy of Sisyphus, that Greek hero who was condemned to “ceaselessly rolling a rock to the top of a mountain, whence the stone would fall back of its own weight.” Camus, too, points to the challenge of modernity: “The workman of today works everyday in his life at the same tasks, and his fate is no less absurd. But it is tragic only at the rare moments when it becomes conscious.”

From this perspective, the internet and other technologies have given us increased distraction; increased refuge from the crushing reality of the emptiness that is life. Which is not to say that without these technologies our burden would be relieved; no, we would simply find other ways of burying the truth, of hiding from the void.

The problem, then – if, indeed, there is a problem at all – cannot be laid at the feet of the internet or of specific online platforms. The challenge is much deeper and much more mundane. It is not a challenge of how we live in an ever connected world; it is a fundamental challenge of modern life: how do we live an average, daily life knowing everything that we deeply know?

How, in the words of modern youth, do we even?


Aggregated Injustice

I ran across a colorfully titled Mother Jones article which documents “a brief history of men getting credit for women’s accomplishments.” As promised by the subtitle, the article is written to do just that, presenting a series of poignant vignettes from the Paleolithic era to the present.

The entries range from enraging:

1843: Mathematician Ada Lovelace shows how Charles Babbage’s Analytical Engine (a theoretical computer) could be induced to perform complex math. Her contribution, considered the first software, was dismissed by many male historians: “It is no exaggeration to say that she was a manic-­depressive with the most amazing delusions.”

To frustratingly understandable:

1840s: …Mary Ann Evans later writes Middlemarch as George Eliot, probably to avoid “being treated as ‘just’ a female writer,” one expert notes.

The piece also captures the uniquely terrible discrimination faced by African American women:

1888: Ellen Eglin sells the rights to the clothes wringer she invented to an agent. The invention brings “great financial success” to the buyer, who paid her $18. “If it was known that a negro woman patented the invention, white ladies would not buy the wringer,” she explains.

But while this article does an excellent job of encapsulating the gender discrimination which has gone on since nearly the dawn of time, it doesn’t quite capture the aggregated effects of such discrimination.

Consider Michael Spence’s economic model of gender-based pay disparity. Imagine an employee pool in which people have two observable characteristics: sex and education. An employer assigns each employee a higher or lower wage by inferring the unobserved characteristic of productivity. Assume also that gender and productivity are perfectly uncorrelated.

Intuitively, this should mean that gender and pay will also be uncorrelated. However, Spence’s game-theoretic model reveals that, after initial rounds of hiring, the employer will begin to associate higher levels of education with higher levels of productivity. More precisely, because an employer’s opinions are conditioned on gender as well as education, “if at some point in time men and women are not investing in education in the same ways, then the returns to education for men and women will be different in the next round.”

In other words, Spence finds that there are numerous system equilibria and, given differing initial investments in education, the pay schedules for men and women will settle into different equilibrium states.
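To make the multiple-equilibria point concrete, here is a minimal Python sketch in the spirit of Spence’s setup – not his formal model. The two productivity types, the signaling costs, the flat starting wages, and the two groups’ initial conventions are all illustrative assumptions of mine; the only feature carried over from the argument above is that the employer’s beliefs are conditioned separately on each group.

```python
# Illustrative sketch of group-conditioned signaling, loosely in the spirit of
# Spence's model. All specific numbers here are assumptions chosen for clarity.
import random

random.seed(0)

LOW, HIGH = 1.0, 2.0            # productivity types (unobservable to the employer)
COST = {LOW: 1.2, HIGH: 0.6}    # assumed cost of acquiring the education signal, by type

def run_group(high_types_signal_at_start, rounds=20, n=1000):
    """Simulate one group of workers facing an employer whose beliefs about
    the (education -> productivity) mapping are conditioned on that group alone."""
    # Employer's wage schedule for this group: the wage offered at edu = 0 and edu = 1.
    # Start from a flat prior: pay the population-average productivity either way.
    wages = {0: 1.5, 1: 1.5}
    seeded = True
    for _ in range(rounds):
        observed = {0: [], 1: []}
        for _ in range(n):
            prod = HIGH if random.random() < 0.5 else LOW
            if seeded:
                # First round: behavior follows the group's starting convention.
                edu = 1 if (high_types_signal_at_start and prod == HIGH) else 0
            else:
                # Later rounds: each worker best-responds to the current wage schedule.
                edu = 1 if wages[1] - COST[prod] > wages[0] else 0
            observed[edu].append(prod)
        seeded = False
        # Employer update: wage at each signal = mean productivity observed at that signal.
        for e in (0, 1):
            if observed[e]:
                wages[e] = sum(observed[e]) / len(observed[e])
    return wages

# Identical productivity distributions; only the starting convention differs.
print("Group A (high types start out signaling):", run_group(True))
print("Group B (no one starts out signaling):   ", run_group(False))
```

Under these assumed numbers, the first group settles into a separating equilibrium – educated workers are paid 2.0, uneducated workers 1.0 – while the second group pools at roughly the average wage of 1.5, and education never becomes worth its cost. The same distribution of workers ends up on different pay schedules purely because of where the process started.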

While the correlation between education and productivity makes for a simple toy model, the “signaling” generated by actual success would presumably create an even stronger effect.

That is, men taking credit for women’s inventions, insights, and effort is not just damaging to the person whose ideas are stolen – it is damaging more broadly to people who are identified as women. It weakens women’s equilibrium for signaling success – an effect, again, felt even more strongly by women of color.


Snow Days

Snow days can cause chaos insofar as everything that was scheduled for a snow day needs to be rescheduled for a subsequent day; perhaps even the immediately following day – thus cramming two days of work into one.

Which is surprising, perhaps, because the snow day itself had no shortage of work either.

But somehow time just got all messed up; after a snow day things just don’t quite occur in the right order any more.

But I appreciate snow days as a humbling experience – they come as a reminder that sometimes even the most pressing meetings can still survive being postponed for a day.
