
Normalizing the Non-Standard

I recently read Eisenstein’s excellent What to do about bad language on the internet, which explores the challenge of using Natural Language Processing on “bad” – i.e., non-standard – text.

I take Eisenstein’s use of the normative word “bad” here somewhat ironically. He argues that researchers dislike non-standard text because it complicates NLP analysis, but it is only “bad” in this narrow sense. Furthermore, while the effort required to analyze such text may be frustrating, efforts to normalize these texts are potentially worse.

It has been well documented that NLP approaches trained on formal texts, such as the Wall Street Journal, perform poorly when applied to less formal texts, such as Twitter data. Intuitively this makes sense: most people don’t write like the Wall Street Journal on Twitter.
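
To make the mismatch concrete, here is a minimal sketch (assuming the NLTK library and its 'punkt' and 'averaged_perceptron_tagger' data packages are available) of how a part-of-speech tagger trained on newswire-style text handles a formal sentence versus a tweet-like one:

```python
# Minimal sketch: NLTK's default POS tagger is trained on formal,
# newswire-style text; applying it to tweet-like text illustrates
# the domain mismatch described above.
import nltk

formal = "The senator announced that the committee will vote on Tuesday."
tweet = "omg the senator finna vote tuesday lol smh #politics"

for text in (formal, tweet):
    tokens = nltk.word_tokenize(text)
    print(nltk.pos_tag(tokens))

# The formal sentence is tagged cleanly; non-standard tokens such as
# "finna", "smh", and "#politics" typically receive wrong or
# uninformative tags, since nothing like them appears in the
# tagger's training data.
```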

Importantly, Eisenstein quickly does away with common explanations for the prevalence of poor language on Twitter. Citing Drouin and Davis (2009), he notes that there are no significant differences in the literacy rates of users who do or do not use non-standard language. Further studies also dispel the notions that users are too lazy to type correctly, that Twitter’s character limit forces unnatural contractions, and that runaway auto-correct is to blame.

In short, most users employ non-standard language because they want to. Their grammar and word choice intentionally convey meaning.

In normalizing this text, then, in moving it towards the unified standards on which NLP classifiers are trained, researchers explicitly discard important linguistic information. Importantly, this approach has implications not only for research, but for language itself. As Eisenstein argues:

By developing software that works best for standard linguistic forms, we throw the weight of language technology behind those forms, and against variants that are preferred by disempowered groups. …It strips individuals of any agency in using language as a resource to create and shape their identity.

This concern is reminiscent of James C. Scott’s Seeing Like a State, which raises deep concerns about the power of a centralized, administrative state. In order to function effectively and efficiently, an administrative state needs to be able to standardize certain things – weights and measures, property norms, names, and language all have implications for taxation and distribution of resources. As Scott argues, this tendency towards standardization isn’t inherently bad, but it is deeply dangerous – especially when combined with things like a weak civil society and a powerful authoritarian state.

Scott argues that state imposition of a single, official language is “one of the most powerful state simplifications,” which lays the groundwork for additional normalization. The state process of normalizing language, Scott writes, “should probably be viewed, as Eugen Weber suggests in the case of France, as one of domestic colonization in which various foreign provinces (such as Brittany and Occitanie) are linguistically subdued and culturally incorporated. …The implicit logic of the move was to define a hierarchy of cultures, relegating local languages and their regional cultures to, at best, a quaint provincialism.”

This is a bold claim, yet not entirely unfounded.

While there is further work to be done in this area, there is good reason to think that the “normalization” of language disproportionately affects people who are outside the norm along other social dimensions. These marginalized communities – marginalized, incidentally, because they fall outside whatever is defined as the norm – develop their own linguistic styles. Those linguistic styles are then in turn disparaged and even erased for falling outside the norm.

Perhaps one of the most well documented examples of this is Su Lin Blodgett and Brendan O’Connor’s study on Racial Disparity in Natural Language Processing. As Eisenstein points out, it is trivially impossible for Twitter to represent a coherent linguistic domain – users around the globe use Twitter in numerous languages.

The implicit pre-processing step, then, before even normalizing “bad” text to be in line with dominant norms, is to restrict analysis to English-language text. Blodgett and O’Connor find that tweets from African-American users are over-represented among the tweets thrown out for being non-English.
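
As an illustration (this is not Blodgett and O’Connor’s actual method), here is a sketch of that implicit filtering step using the off-the-shelf langdetect package, showing how easily short, dialectal, but still English tweets can be discarded:

```python
# Illustrative sketch of the implicit pre-processing step described
# above: keep only tweets that an off-the-shelf language detector
# labels as English. Assumes the `langdetect` package is installed.
from langdetect import detect

tweets = [
    "The committee will reconvene tomorrow morning.",
    "he woke af tho",       # dialectal, but still English
    "on se voit demain ?",  # French
]

kept = []
for tweet in tweets:
    try:
        if detect(tweet) == "en":
            kept.append(tweet)
    except Exception:
        pass  # detection can fail on very short strings; the tweet is silently dropped

print(kept)
# Short, non-standard tweets may be misclassified as non-English and
# silently filtered out -- the kind of disparity the study documents.
```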

Dealing with non-standard text is not easy. Dealing with a living language that can morph in a matter of days or even hours (#covfefe) is not easy. There’s no getting around the fact that researchers will have to make difficult calls in how to process this information and how to appropriately manage dimensionality reduction.

But the worst thing we can do is to pretend that it is not a matter of concern; to begin our work by thoughtlessly filtering and normalizing without giving significant thought to what we’re discarding and what that discarded data represents.


Don’t Miss the Sept. 20th Nevins Fellowship Confab Call

As we announced last month, NCDD is hosting a special Confab Call with the McCourtney Institute for Democracy and Healthy Democracy next Wednesday, September 20th from 1-2pm Eastern / 10-11am Pacific. The call is the best place to learn more about this incredible opportunity to have a D&D trained student come work with your organization at no cost, so we strongly encourage the NCDD network to register today!


During the call, NCDD Member and McCourtney’s Managing Director Christopher Beem will provide an overview of the Nevins Democracy Leaders Program and its aims, discuss the training that the future fellows are going through, and share more about how your organization can take advantage of this great chance to help cultivate the next generation of D&D leaders while getting more support for your work – all for FREE! We’ll also be joined by NCDD Member Robin Teater of Healthy Democracy, who will share her experiences hosting a fellow this summer.

NCDD is proud to have partnered the last couple years with the McCourtney Institute to help identify organizations in the field that can host Nevins fellows, and we’re continuing the exciting partnership this year. You can get a better sense of what the program experience is like by checking out this blog post from a 2017 Nevins Fellow about their summer fellowship with NCDD Sponsoring Member The Jefferson Center.

This is a rare and competitive opportunity for leading organizations in our field, and this Confab Call will be one of the best ways to find out more about how your group can take advantage of this program, so make sure to register today to save your spot on the call! We look forward to talking with you more then!

Tunnel-dialogue: Application of methods and processes for participatory citizen involvement in ecologically relevant investment decisions, in the special case of road tunnel filters

Author: 
What effects does the exhaust air of the "Einhorn-Tunnel" in Schwäbisch Gmünd have on humans and the environment, and what benefits would the installation of a tunnel filter bring? A citizen dialogue in Schwäbisch Gmünd was established to gather results on this topic.

Data Technologies Colonize the Ontological Frontier

Writing recently in Medium, Salvatore Iaconesi -- a designer, engineer and founder of Art is Open Source and Human Ecosystems -- offers an extremely important critique of the blockchain and other data-driven network technologies.

While recognizing that these systems have enormous potential for “radical innovation and transformation,” he astutely warns against their dangerous psychological and cultural effects. They transfer entire dimensions of perception, feeling, relationships, trust-building, and more -- both individually and collectively experienced -- into algorithmic calculations. The System becomes the new repository of such relational epiphenomena. And in the process, our very sense of our shared agency and intentionality, relationships, and a common fate begins to dissolve.

In their current incarnations, the blockchain and related network-based technologies start with the ontological presumption that everything can be broken apart into individual units of feeling and action. Our character, viewpoints, emotions, behaviors, and more are all translated into data-artifacts. This is the essential role of data, after all – to distill the world into manipulable, calculable units of presumably significant information.

Think that you are a whole human being? Forget it. Data systems are abstracting, fragmenting and filleting our identities into profiles that we don’t even control. A simulacrum of our "real identities" is being constructed to suit new business models, laying the foundation for what Iaconesi calls the "transactionalization of life." As he writes:

Everything is turning into a transaction: our relationships, emotions and expressions; our ways of producing, acquiring and transferring knowledge; communication; everything.

As soon as each of these things become the subject of a service, they become transactions: they become an atomic part of a procedure.

Because this is what a transaction is: an atom in a procedure, in an algorithm. This includes the fact that transactions are designed, according to a certain business, operational, strategic, marketing model.

This means that when our relationships, emotions, expressions, knowledge, communication and everything become transactions, they also become atoms of those business models whose forms, allowances, degrees of freedoms and liberty are established by those models.

"Everything, including our relations and emotions, progressively becomes transactionalized/financialized, and the blockchain represent an apex of this tendency. This is already becoming a problem for informality, for the possibility of transgression, for the normation and normalization of conflicts and, thus, in prospect, for our liberties and fundamental rights, and for our possibility to perceive them (because we are talking about psychological effects)," according to Iaconsi.

How does this process work?

By moving "attention onto the algorithm, on the system, on the framework. Instead of supporting and maintaining the necessity and culture of establishing co-responsibility between human beings, these systems include “trust” in procedural ways. In ways which are technical. Thus, the necessity for trust (and, thus, on the responsibility to attribute trust, based on human relations) progressively disappears," he writes.

Therefore, together with it, society disappears. Society as actively and consciously built by people who freely decide if and when to trust each other, and who collectively agree to the modalities of this attribution.

What remains is only consumption of services and products. Safe, transparent and all. But mere transactionalized consumption. Society ends, and so does citizenship: we become citizen of nothing, of the network, of the algorithm.

These are not technical issues, but psychological ones, perceptive ones. And, thus, even more serious.

As soon as I start using them [blockchains], as soon as I start imagining the world through them, everything starts looking as a transaction, as something which is “tokenizable”….Technology creates us just as much as we create technology.

In short, the radical atomization, objectification and financialization of human relationships begins to dissolve the very idea of a shared society.

Institutions and other people disappear, replaced by an algorithm. Who knows where trust is at/in! It is everywhere, diffused, in the peer-to-peer network. Which means that it’s nowhere, and in nobody.

In a weird way it is like in call centers: they are not really useful for the client, and they completely serve the purpose minimizing bother for the companies, letting clients slipping into the “procedure” (which is synonym with algorithm), and avoiding them from obtaining real answers and effects, in their own terms outside of procedures.

These are all processes which separate people from each other, from institutions, organizations, companies, through the Procedure.

Citizens of everywhere. Citizens of nowhere and nothing.

So what might be done?  

Iaconesi talks about the Third Infoscape, which derives from the concept of the Third Landscape. He writes that in the Third Landscape, “where ‘technicians’ see ‘weeds,’ the Third Landscape sees opportunity, biodiversity, an open source media which is a reservoir for the future of the planet, which does not require energy to maintain, but produces energy, food, knowledge, relations.”

Citing Marco Casagrande, Iaconesi argues that data and information should not be “laid out geometrically, formally, as in gardens, but more like the woods and wild nature, in which multiple forms of dimensions, boundaries, layers and interpretations co-exist by complex desire, relation and interaction, not by design.”

This, of course, implies “a different kind of technology, a different kind of science, with a different imagination to support it.” It also implies that we begin to speak not just of technology design, but of “sensibility, imagination and aesthetics.” 

Iaconesi's critique reminded me of Montreal-based communications professor Brian Massumi's important 2015 book, Ontopower: War, Powers, and the State of Perception. His basic thesis is that the national security state, in its perpetual fight against terrorism, has telescoped its political priorities into a new ontological paradigm. It seeks to validate a new reality through what he calls “ontopower.” This is “the mode of power embodying the logic of preemption across the full spectrum of force, from the ‘hard’ (military intervention) to the ‘soft’ (surveillance).”

The point is that perception of reality itself is the new battleground. Power is not just carried out in overt state or policy settings – legislatures, courts, the media. State power wants to go beyond messaging and framing the terms of debate. It has deliberately moved into the realm of ontology to define the terms of reality itself. In the national security context, that means that nefarious terrorist threats exist potentially everywhere, and thus the logic of preemptive military action (drone killings, extra-legal violence, etc.) is fully justified. (Cf. the film Minority Report.)

Massumi writes:

“Security threats, regardless of the existence of credible intelligence, are now felt into reality. Whereas nations once waited for a clear and present danger to emerge before using force, a threat's felt reality now demands launching a preemptive strike. Power refocuses on what may emerge, as that potential presents itself to feeling.”

So if ontopower is arising as a new strategy in national security agencies, it should not be surprising that a related mode of ontopower is being developed by Silicon Valley, which is a frequent partner with the national security state. 

The new frontier in Big Tech is to leverage Big Data to obtain unassailable market dominance and consumer control.  Naturally, the surveillance state envies this capacity and wants to be dealt into the game. Hence the tight alliances between the US Government and Silicon Valley, as revealed by Snowden. Now that the likes of Google, Amazon, Facebook and others have secured political and economic supremacy in so many markets and cultural vectors, is it any wonder that such power is itching to define social reality itself?

Democrats as technocrats

This web search takes you to a whole stack of good recent writing about the Democratic Party as the technocratic party, with headlines ranging from Twilight of the Technocrats? to The Triumph of the Technocrats. In lieu of a critical review, I’d pose these questions:

  1. What would a technocrat support and do in our context? It’s possible to be a socialist technocrat or a technocrat who works for a huge, for-profit company. I presume that a technocratic Democrat today is someone who believes in optimizing GDP growth, environmental sustainability, and reductions in tangible human distress (e.g., disease, homicide) through efficient governmental policies. These desired outcomes often conflict, and then technocrats are fine with compromise. To qualify as a technocrat, you can’t be too enthusiastic about working with ordinary citizens on public issues, and you can’t base your agenda on controversial, challenging moral ideals.
  2. Do Democrats present themselves as technocrats, in this sense? Some do and some don’t. It seems fair to read the positive agenda of Hillary Clinton’s 2016 campaign as largely technocratic (she promised to govern competently and continue the balanced progress of her predecessor), although her critique of Donald Trump was ethical rather than technical. I also think that Clinton was in a tough spot because she didn’t believe that she could accomplish transformative change with a Republican Congress; thus managerial competence seemed a workable alternative. The 2016 campaign does not demonstrate that she–let alone all Democrats–was fully technocratic. However, consider a different case that is pretty revealing: the Josiah Bartlet Administration. This is an informative example just because it is idealized and fictional, free of any necessary constraints. The Bartlet White House is staffed with hard-working, highly-educated, unrealistically competent, smartest-guy-in-the-room, ethical people who strive to balance the budget while making incremental progress on social issues. Hollywood’s idealized Democrats are technocrats in full.
  3. Do Democrats choose technocratic policies? Again, I’d say “sometimes.” Both the Clinton and Obama Administrations definitely showed some predilection for measurable, testable outcomes; for behavioral economics; and for models that were consistent with academic research about the economy and the climate. They weren’t particularly good at empowering citizens to govern themselves or collaborating with social movements. On the other hand, the Affordable Care Act has a moral core (aiming to cover people without health insurance), even if many of its tools and strategies are best defined as technocratic.
  4. Are Democrats good technocrats? There has been more economic growth under Democratic than Republican presidents. But the sample is small, several Democratic presidents faced conservative congresses, and any correlation with a small “n” can easily be spurious. A deeper point is that Democrats are currently more committed to the mainstream findings of climate science, social policy research, and academic economics than Republicans are. Their accomplishments may be affected by sheer chance, but their strategies tend to be consistent with positivist, empirical research.
  5. Is Democratic technocracy consistent with justice? No. Almost any theory of justice, from libertarian to strongly egalitarian, would demand fundamental shifts from the status quo. Certainly, I would favor deeper changes in our basic social contract. On the other hand, compared to what? Managing our existing social policies in a competent way delivers substantial, if inadequate, justice. It beats incompetence or deliberate assaults on existing social institutions. In a multi-party parliamentary democracy, a center-left technocratic party would play an important role. I would be open to voting for it, depending on the circumstances and the alternatives. In our two-party system, a technocratic and centrist component competes for control of the Democratic Party. It shouldn’t be surprising that this component receives constant criticism from within the Party, because the Democrats represent a broader coalition, and there is plenty of room to the left of someone like Hillary Clinton. Whatever you think of her, I don’t think you can complain that she was criticized from her left.
  6. Is Democratic technocracy good politics? That’s not a question that will be settled to everyone’s satisfaction any time soon. Clinton lost to Trump but also won the popular vote. She was technocratic but not completely so. She faced many contingencies, from Fox News to Bernie to Comey, and handled them in ways that we can debate for the next decade. Again, the answer has to be: Compared to what? A compelling new vision of America’s social contract would beat competent management at the polls. But competent management may beat incompetence or a deeply unpopular vision (from either right or left).
  7. What’s driving the Democratic Party’s drift to technocracy? One could explain it in class terms: the Democratic coalition is now highly educated, including many people who make a living by demonstrating expertise. But I would propose a deeper thesis. Modernity itself is defined by constant increases in specialization and differentiation, plus radical doubts about our ability to know which ends are moral or just. In that context, people prosper who are good at applying technical reasoning to complex problems without worrying too much about whether the ultimate ends are right. Modernity has generated a white-collar governing class that is currently aligned with the Democrats, but more than that, it has generated a very high estimation of expertise combined with a leeriness about moral discourse. Religious conservatives monopolize the opposition to both of these trends. Getting out of this trap requires more than new messages and policies. It is a fundamental cultural problem.

See also: the rise of an expert class and its implications for democracy; varieties of neoliberalism; the big lessons of Obamacare; the new manipulative politics: behavioral economics, microtargeting, and the choice confronting Organizing for Action; and why the white working class must organize.