Medusa

They say that Medusa was the most horrifying woman ever known.

According to legend, she was so terrible to behold that a mere glance at her viper-entwined visage was enough to turn the viewer to stone. She was so ugly, so terrible to look at, that one could not even survive the horror.

The hero Perseus cut off her head – a just fate, it appears, for such a monster – whereupon he seems to have kept it safely secured to be used as a weapon against unsuspecting foes. I imagine him carrying it around in a dirty burlap sack, periodically displaying the dead woman’s head with pride, even in death using her as a tool to defeat foes far greater than he.

In early mythology, Medusa and her Gorgon sisters were born that way – monsters, if you will – with wings and entwined snakes for hair.

This story proved uninspiring, I suppose, because it eventually changed form.

Medusa wasn’t born a monster, no, she was born beautiful. The most beautiful woman you can imagine.

Too beautiful.

Ovid allows Perseus to tell her story:

…Beyond all others she
was famed for beauty, and the envious hope
of many suitors. Words would fail to tell
the glory of her hair, most wonderful
of all her charms—A friend declared to me
he saw its lovely splendour.

Nothing good happens to beautiful women.

Perseus continues:

…the Sovereign of the Sea attained her love
in chaste Minerva’s temple.

This was a terrible wrong – Poseidon’s forceful attainment of the beautiful Medusa.

Minerva was enraged.

…she turned her head away and held her shield
before her eyes. To punish that great crime
Minerva changed the Gorgon’s splendid hair
to serpents horrible. And now to strike
her foes with fear, she wears upon her breast
those awful vipers—creatures of her rage.

And thus, on Ovid’s telling, Medusa was rightfully punished. For the actions of Poseidon. For being just too beautiful.

Chastised so with awful vipers, men could never again look upon her.

And then brave Perseus sneaks in, finds her asleep, and cuts off her head.

Nothing good ever happens to ugly women.


Summer Days

I love how summer days roll by.

I love long summer days and warm summer nights. I love the feeling of possibility, as if everything will be sunny and relaxing forever.

I love sitting outside and reading a good book.

***

I hate how summer days drag on.

I hate how hard it is to focus, to really tackle a task. I hate how much longer everything takes, how everyone’s vacations are at different times.

I hate that it gets so hot I’d rather sit in air conditioning to read a book.

***

In the end I’m not sure, I suppose, whether I love or hate summer days after all. But I do know this –

They do go on.


Hungarian Art and Weltanschauung

Inspired in part by my recent trip to the Hungarian National Gallery, I’ve been reading Éva Forgács’s excellent book, “Hungarian Art.” Forgács frames the arc of Hungarian art through the lens of an ongoing tension between “European” art and culture and distinctively “Hungarian” art and culture.

In the early 20th century, for example, artists and scholars such as Károly Kernstock, György Lukács, and Béla Balázs sought to “integrate Hungarian painting into contemporary European art.” As Forgács argues, they “thought that the time had come to present an argument for synchronicity between new Hungarian achievements and those of Western culture, and thus validate their work in the eyes of a rather reluctant Hungarian audience. They were apparently unaware that the segments of the Hungarian audience that hesitated to accept them did so exactly because of the painters’ European orientation.”

On the other hand, “cultivation of the ‘national genius’ was, through the greater part of the twentieth century, a sub-current in Hungarian art and culture, addressing deeply ingrained, suppressed reservoirs of what was perceived as genuinely Hungarian…However, ‘genuine Hungarian’ artworks had failed to constitute a mythical meta-narrative; they lacked the potential to become official or mainstream art, or even a decisive trend in counterculture.”

Of particular interest to me in this debate is the frequent use of the German word Weltanschauung, roughly translated as “worldview.” Lukács wrote that through the work of European-oriented Hungarian artists, “a new Weltanschauung appeared, which aspired to a higher truth than the ephemeral world of appearances of impressionist painting.” Forgács further argues that following the Second World War, the European School saw themselves as “constructing a new, post-war, post-holocaust Weltanschauung” – work that had “an almost revolutionary aura.”

While “worldview” is a passable translation of Weltanschauung, the word itself is much richer than its translation allows. It means not only “worldview,” but implies a shared worldview – a sort of cultural unity without deviation.

The very idea of a “Western” culture or an “Eastern” culture rests upon the concept of Weltanschauung; upon the argument that there is something distinctive which binds members of these cultures together.

Wittgenstein, who was particularly interested in how people communicate and share ideas, often refers to Weltanschauung, perhaps most notably asking in Philosophical Investigations: “The concept of a perspicuous representation is of fundamental significance for us. It earmarks the form of account we give, the way we look at things. (Is this a ‘Weltanschauung’?)”

Though he never answers the question he raises parenthetically, Konstantin Kolenda points to a similar sentiment in a passage from Wittgenstein’s earlier Tractatus Logico-Philosophicus: “Everything that can be said can be said clearly.”

If, indeed, everything can be said clearly, that is arguably because of Weltanschauung – because words and symbols have a shared meaning which can successfully be conveyed from me to you.

I think also of the computational models of “cultural systems” undertaken by Spicer, Axelrod, and others. In these models, individuals with distinctive characteristics gradually take on the characteristics of their neighbors – eventually leading to balkanization between communities of identical individuals.
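To make that dynamic concrete, here is a minimal sketch of an Axelrod-style cultural dissemination model in Python – the grid size, feature count, trait count, and step count are illustrative assumptions, not parameters drawn from any particular paper:

```python
import random

# A minimal sketch of an Axelrod-style cultural dissemination model.
# All parameters below are illustrative assumptions.
SIZE = 10        # the world is a SIZE x SIZE grid
FEATURES = 5     # cultural features per individual
TRAITS = 10      # possible values for each feature
STEPS = 200_000  # interaction events to simulate

# Each individual starts with a random culture vector.
grid = {(x, y): [random.randrange(TRAITS) for _ in range(FEATURES)]
        for x in range(SIZE) for y in range(SIZE)}

def neighbors(x, y):
    """Adjacent cells (von Neumann neighborhood), clipped at the edges."""
    candidates = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    return [c for c in candidates if c in grid]

for _ in range(STEPS):
    x, y = random.randrange(SIZE), random.randrange(SIZE)
    a = grid[(x, y)]
    b = grid[random.choice(neighbors(x, y))]
    shared = sum(i == j for i, j in zip(a, b))
    # Similar neighbors are more likely to interact, and interaction
    # copies one of the neighbor's differing traits.
    if 0 < shared < FEATURES and random.random() < shared / FEATURES:
        i = random.choice([k for k in range(FEATURES) if a[k] != b[k]])
        a[i] = b[i]

# Distinct cultures remaining: typically a few internally identical blocs.
print(len({tuple(c) for c in grid.values()}))
```

Left to run, the grid settles into a patchwork of culturally identical regions with nothing left to exchange across their borders – the balkanization described above.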

 

And this is what I find so interesting about the struggle in Hungarian art; about the constant tension artists feel between a European and a Hungarian Weltanschauung; about the sense of building a new Weltanschauung.

Weltanschauung is problematic in its unity; in its insistence that all of a culture’s people must share characteristics – or, perhaps, conversely, that a person who does not share certain cultural aspects can be naturally derided as an outsider.

In studying Hungarian artists’ search for Weltanschauung, Forgács engages the divergent approaches as not entirely contradictory, but as seeking out a shared path; as trying to transcend the tension and to build something new. To move beyond the confines of existing Weltanschauung and to truly create.

 


City Life

I have been almost entirely offline for the last two weeks – in Vienna for 2 days, then in Budapest, first speaking at a great workshop on gendered creative teams hosted by CEU, and then for an extra week of sightseeing and visiting.

It was an exciting and valuable trip in a number of ways, and I’m still trying to process all the things I saw and heard; all the people I met and learned from. There was so much, in fact – so many rich details I want to hang on to – that I plan to spend this week slowly reflecting and working through my experiences from the last two weeks; some mundane and some academic.

I’m still a little jet-lagged and working my way back into normal life, so I want to start today with some simple observations.

I am hardly the most well-traveled person, but from the places I have been – Japan, India, parts of Europe, and, of course, the U.S. – I have this theory that all big cities are essentially the same in some fundamental way.

I don’t mean to dismiss the differences between places, people, and cultures. Each city I have been in has had a rich personality, uniquely its own. But at the same time, there’s something I find delightfully human about the universality of city life: people just trying to get to work and going about their day.

There are tourists and students, people who are paid to be happy, and people who will be grumpy no matter how much they are paid. There are people at all different stages of their lives; some having good days and others having bad days. I saw people taking wedding photos, playing with their kids, and enjoying each other’s company in the park. I heard people complaining, I heard teenagers gossiping, and I saw the blank, morning stare that I can only describe as the universal commuter face.

Cities just have so much life.

And while local customs and culture add a meaningfully distinctive flair to each city, one of the main things I notice when I travel is just how much our shared humanity unites us.

All around the world, no one is excited to commute into work early on a Monday morning.


The Gender of Folly

In Erasmus’ famous 1511 essay, The Praise of Folly, the embodiment of Folly herself delivers satirical oratory, praising herself and critiquing the norms and institutions of the day.

The piece itself is wonderfully well written, and there is a wealth of scholarship examining Erasmus’ satirical intents.

But there is one element of the essay which I have always found particularly striking. As Folly finalizes her argument, she closes her refined rhetoric by stating:

If anything I have said shall seem too saucy or too glib, stop and think: ‘tis Folly, and a woman, that has spoken. But of course you will also remember that Greek proverb, “Even a foolish man will often speak a word in season,” unless, perhaps you assume that this does not extend to women.

Patricia Bizzell notes that scholars have generally paid little attention to Folly’s gender – after all, female muses and even fools were common in Renaissance oration, with roots dating back further.

Yet ignoring Folly’s gender seems a misstep – it is not incidental, but rather a core element of Erasmus’ satire. Folly’s gender allows her to dismiss herself – after all, ‘tis Folly, and a woman, that has spoken – even as she delivers outspoken criticism of society.

Her gender also makes her an outsider, as Bizzell writes:

I can’t take the persona’s gender for granted, especially as she’s depicted in Holbein’s illustrations for an early edition of the Praise: a woman in a fool’s cap and bells and an academic gown, speaking from a rostrum to an audience of men similarly attired (see Moriae 1989).

And while female personas were perhaps common in Renaissance work, Folly’s place as an orator is particularly notable. As Bizzell points out, “in the Renaissance, a woman who practices rhetoric in public, whether by orating or publishing, is usually deemed to be unchaste.”

Even as humanist education expanded to include upper-class women as well as men, women continued to be barred from the study of rhetoric. Oratory and rhetorical debate were fields where learned men battled. For a woman to enter such an arena – to share her voice in the public sphere – was, in Bizzell’s words, like “the only female player in a touch football game…what chaste women would take such a risk?”

All this leaves unanswered the question of exactly what Erasmus argues for in Folly, but it underscores the importance of gender in transmitting that message.

The role of the Fool has long been to speak truth to power, protected by their own foolishness and disdained place in society. Folly, the unchaste woman, has particular power in this regard – power bestowed by her entire lack of power.

Though ‘entire lack’ is a blatant overstatement here, as the woman rhetor, well trained in the humanist arts, is no doubt of a certain class and a certain race – maligned for her gender but more empowered than others nonetheless. As Bizzell concludes:

If we think of ourselves as symbolically risking making fools of ourselves, we might consider the implications of taking on not only the fool’s disregard for social convention, which allows social criticism and the enactment of solidarity, but also the fool’s embrace of marginal social positions as well.

Perhaps this is ultimately why the persona of Folly spoke so strongly to me when I first read Erasmus’s mock-encomium. In the persona of the foolish slut, I saw, on the one hand, ways to compensate for my lack of gender privilege, that is to wrest rhetorical freedom out of the liabilities I incur as a woman breaking the taboos that still to some extent obtain on a woman’s speaking in public. On the other hand, I saw ways to undermine my race and class privileges, which may prevent me from identifying with oppressed people as much as I want to do: this very adoption of the ass-eared cap lends a provisionality to my words which, I hope, invites all others into the rhetorical process with me.


Two Years

I have recently completed the second year of my doctoral program in Network Science at Northeastern University, and it feels an appropriate time to satisfy my periodic indulgence for self-reflection.

Two years. That is a long time, though also not a long time. I know “new” businesses which have been open more than two years; I remember “recent” events which took place far longer than two years ago. Two years is nothing, it is a blink of an eye. Yet the last two years have seemed so long. So long in a good way: I have learned so much, changed so much, grown so much.

It’s been a great two years.

Before I continue, it is worth noting – for those of you playing at home – that, no, I am not almost done. I have at least three years left; so even the halfway mark seems a distant point on the horizon.

But I am entering what I can only describe as the ‘grown up’ phase of my studies. I am officially done with course work – though I will no doubt continue to take classes from time to time. I’ve nearly put test-taking behind me – though I’ll spend the next several months studying for our Qualifying Exam. On the surface, then, it may seem as though little has changed…but this moment marks a subtle turning point in my academic life, as I increasingly shed the title of student and move into the role of researcher.

I rather imagined this would occur as a crystallizing event. As though I might crawl into my doctoral studies, quietly cocooned until I miraculously emerged a scholar.

And though I knew that’s never how it would happen, I find it nonetheless remarkable how transformative the meticulous metamorphosis has proven to be. I have learned so much – not just facts and skills, though I have learned those, too – but the past two years have fundamentally shifted the way I think and approach problems.

At the end of my first semester, I wrote that I had “been learning how to see the world through a particular epistemic frame: learning what questions to ask and what tools to deploy in answering them.”

At the end of my first year, I boasted that I could “trade nerdy jokes with people from any discipline” – a remark meant to highlight the value of interdisciplinary work. “As much as I have to learn from everyone I meet,” I wrote, “We all have something to learn from each other.”

This sentiment is reflected in the theme that comes to mind when I reflect on my past year of learning:

Year 2: I think I might know things.

The first year gave me the lay of the land; helped me learn the contours of all the things I didn’t know. The second year helped me start defining that landscape for myself. It would perhaps be an overstatement to say the second year helped me begin to make my own contributions – but it left me with the ineffable sense that I am on a path to be able to make contributions.

I still have much to learn – there is always more to learn. But as I wind down the second year of my studies, learning feels much more like the everyday act of living than the frantic attempts of someone in over their head.

That is to say, I am still learning – I frankly hope to always be learning – but for the first time it feels as though I could contribute nearly as much as I could learn.

Or more plainly: I think –

I might know things.

 


Re-Learning to be Human

I’m returning from a two-week blogging hiatus – the first of several I will be taking over the summer months.

This break was prompted by the madness of finals week: when my blogging devolves into posting snippets of homework assignments, it feels appropriate to take some time off. And then I decided to take the following week off as well. I was, I decided, in the most general sense of the term, on vacation.

I wasn’t lying on a beach somewhere or taking in the tourist sites, but rather I was staring at the wall, staring at my desk, catching up with people, completing miscellaneous errands, and fundamentally trying to remember how I normally live my life.

Most probably due to my immersion in the deliberative literature, the phrase that most came to mind this past week was Dewey’s expression, learning to be human.

“To learn to be human,” Dewey writes, “is to develop through the give-and-take of communication an effective sense of being an individually distinctive member of a community; one who understands and appreciates its beliefs, desires and methods, and who contributes to a further conversion of organic powers into human resources and values.”

Like much of Dewey’s writing, the expression comes dangerously close to an impossibly lofty, grandiose vision.

On its face, it seems almost absurdly metaphorical – are humans not born human? In what sense, then, might a human learn to be human?

Dewey argues that what we call “human” is much more than a collection of biological traits. Rather, being human, in its most fundamental sense, is essentially a social construct: “everything which is distinctively human is learned.”

Yes, we must indeed “learn to be human.”

And if this sounds absurd, I recommend reflecting on the expression the next time you emerge from an intensely focused cocoon. When you can’t remember what time you normally get up or what you’re supposed to do when you feel hungry. When you have this vague sense that you used to have friends, but you haven’t actually spoken to any of them in weeks. When you’re trying to remember your priorities in life, or maybe just trying to remember how to determine your priorities. When you have no real sense of what’s going on around you, just the unmistakable sense that things have been going on.

When you realize you’ve cordoned yourself so far off from society that you actually need to reintegrate before you can meaningfully engage –

That’s when you’re learning – or relearning, perhaps – what it means to be human.

And as Dewey argues, this isn’t something we can do by ourselves; one does not learn to be human alone. Rather, learning to be human is a fundamentally social endeavor, an ongoing process through which we each learn how to act and interact. It is the everyday work of learning and growing; of becoming who we are.


Networking

Having attended a conference last weekend, I met a lot of people and had a lot of conversations…and had a lot of conversations about meeting new people.

One thing that kept coming up was people’s dislike of utilitarian networking – the idea that, especially when at a conference, you should talk to specific kinds of people or intentionally work on building certain relationships out of a pure utilitarian desire to leverage that relationship for your own good.

Perhaps I simply haven’t attended enough conferences, but I don’t find this concern very…concerning. To be clear, I do find the very idea of utilitarian networking to be distasteful, but I don’t find networking to be inherently utilitarian.

Or perhaps I’m just not doing it right.

In a previous life, I would go to social events and not talk to anyone. Not necessarily out of distaste for networking, but out of a general malaise about life. Then, somehow, at some point along the line, I started talking to people.

And what I found was that people are really interesting.

Every conversation is like a window into a whole other universe of personhood. And the less you know the person, the more there is to learn.

So now when I go to events, I talk to people. As many people as meaningfully possible. Not out of a utilitarian drive to advance myself through connection, but out of a genuine desire to meet and learn from others.

Maybe I’m wrong, but I just thought that’s what networking is.


Keeping the Public Sphere Open

Tomorrow I will be participating in a conference on “Keeping the Public Sphere Open” hosted by Northeastern’s NULab for Texts, Maps, and Networks. The conference is taking place from 9:30 am – 5:30 pm and is free and open to the public. You may register here.

Here’s the description from the conference website:

On March 24, the NULab will be hosting its first annual conference, showcasing the work of faculty, fellows, alumni, and research collaborators. The conference will include a range of panels and talks, all organized around the theme: “Keeping the Public Sphere Open.”

The keynote address will be delivered by Peter Levine, Associate Dean and Lincoln Filene Professor of Citizenship & Public Affairs in Tufts University’s Jonathan M. Tisch College of Civic Life and Director of CIRCLE (The Center for Information and Research on Civic Learning and Engagement). Uta Poiger, Dean of Northeastern’s College of Social Sciences and Humanities and Professor of History, will deliver a welcome message to open the conference.

The conference will feature research from several NULab-supported projects. Ryan Cordell will speak about the Viral Texts project, Sarah Connell will discuss the Women Writers Project, Sarah Payne and William Bond will share the work of the Margaret Fuller Transnational Archive, and Elizabeth Dillon will talk about the Early Caribbean Digital Archive. There will also be talks by NULab faculty: Brooke Foucault Welles will present on networked counterpublics and the #HashtagActivism project; Nick Beauchamp will discuss his research into productive internet discourse, with Ph.D. candidate Sarah Shugars; David Lazer will talk about his work on transforming democracy by strengthening connections between citizens and town halls; David Smith will share research on modeling information cascades and propagating scientific knowledge; John Wihbey will present on the democratic role of news in an age of networks; Élika Ortega will discuss the architectures of print-digital literature; and Dietmar Offenhuber, Alessandra Renzi, and Nathan Felde will share the outcomes of a public event to digitize and tag posters from the Boston Women’s March.

Other talks will include the work of graduate students: Matt Simonson on social networks and cross-ethnic ties in Uganda; and Elizabeth Polcha and Nicole Keller Day on building the Digital Feminist Commons and how feminist humanists approach coding. NULab Fellow alum Jim McGrath (Brown University) will highlight some of the intersections between digital humanities and public humanities in his work at the John Nicholas Brown Center for Public Humanities and Cultural Heritage.


The Internet and Modernity

Reading Lee Rainie and Barry Wellman’s Networked, I was struck by their rebuttal of the argument put forth by McPherson, Smith-Lovin, and Brashears – an argument which rings throughout the work of Putnam and other scholars: modern individuals are sad, hollow, isolated shells of humanity, and modern technologies like the internet are what made this so.

Perhaps I was struck simply because this is an argument I have given so little serious attention. I am way past even considering concerns that video games make you violent, that rock n’ roll leads to devil worship, or that the internet has led to the collapse of our civic infrastructure. It may be interesting to consider the internet as a factor, perhaps, but to scapegoat it – to use Rainie and Wellman’s term – strikes me as absurd.

Rainie and Wellman argue that this “fixation on the internet” ignores “nearly a century of research showing that technological changes before the internet – planes, trains, telephones, telegraphs, and cars – neither destroyed relations and communities nor left them alone as remnants locked up in rural and urban villages.”

In defense of the internet, they point to the fact that “when asked, few people say that they, themselves, are living lives of lonely desperation.” And thus they find it wearisome that “even with these realizations, some people – and commentators – believe that they are the exceptions and that the masses around them are lonely, isolated, and fearful.”

“There is,” they assure us, “no reason to panic.”

Perhaps what is most striking about this debate – internet: friend or foe? – is that the problem isn’t really one of the modern moment; it is more properly a problem of modernity; an era that stretches back as far as one might dare to extend the concepts of modern thought or sensibilities.

In 1854 – which is, if I’m not mistaken, before the widespread popularity of the internet – Henry David Thoreau wrote in Walden, “the mass of men lead lives of quiet desperation.”

Thoreau went to the woods because he wished to live deliberately; because he yearned to escape the crushing speed and pressures of modern life. 1854 modern life, that is. As he famously writes in Walden:

I went to the woods because I wished to live deliberately, to front only the essential facts of life, and see if I could not learn what it had to teach, and not, when I came to die, discover that I had not lived. I did not wish to live what was not life, living is so dear; nor did I wish to practise resignation, unless it was quite necessary. I wanted to live deep and suck out all the marrow of life, to live so sturdily and Spartan-like as to put to rout all that was not life, to cut a broad swath and shave close, to drive life into a corner, and reduce it to its lowest terms…

A contemporary Thoreau might say the same about turning off Facebook or sticking with a flip phone; it’s a challenge of modernity, not a problem of technology.

In 1942, Albert Camus wrote of the absurd tragedy of Sisyphus, that Greek hero who was condemned to “ceaselessly rolling a rock to the top of a mountain, whence the stone would fall back of its own weight.” Camus, too, points to the challenge of modernity: “The workman of today works everyday in his life at the same tasks, and his fate is no less absurd. But it is tragic only at the rare moments when it becomes conscious.”

From this perspective, the internet and other technologies have given us increased distraction; increased refuge from the crushing reality of the emptiness that is life. Which is not to say that without these technologies our burden would be relieved; no, we would simply find other ways of burying the truth, of hiding from the void.

The problem, then – if, indeed, there is a problem at all – cannot be laid at the feet of the internet or of specific online platforms. The challenge is much deeper and much more mundane. It is not a challenge of how we live in an ever-connected world; it is a fundamental challenge of modern life: how do we live an average, daily life knowing everything that we deeply know?

How, in the words of modern youth, do we even?
