Easter readings: a new selection of articles and notes on democracy, open government, civic tech and more

Open government’s uncertain effects and the Biden opportunity: what now? 

A review of 10 years of open government research reveals: 1) “a transparency-driven focus”, 2) “methodological concerns”, and 3) [maybe not surprisingly] “the lack of empirical evidence regarding the effects of open government”. My take is that these findings are somewhat self-reinforcing.

First, the early focus on transparency by open government advocates, while ignoring the conditions under which transparency can lead to public goods, is in part to blame. This is even more so when open government interventions insist on tactical, rather than strategic, approaches to accountability. Second, the fact that many of those engaging in open government efforts do not take the existing evidence into account does not help in terms of designing appropriate reforms, nor in terms of calibrating expectations. Proof of this is the recurrent and mostly unsubstantiated spiel that “transparency leads to trust”, voiced by individuals and organizations who should know better. Third, should there be any effects of open government reforms, they are hard to verify in a credible manner, given that evaluations often suffer from the methodological weaknesses indicated in the paper.

Finally, open government’s semantic extravaganza makes building critical mass all the more difficult. For example, I have my doubts over whether the paper would have reached similar conclusions had it expanded the review to practices that, in the literature, are not normally labeled as open government. This would be the case, for instance, of participatory budgeting (which has been shown to improve service delivery and increase tax revenues), or of strategic approaches to social accountability that present substantial results in terms of development outcomes.

In any case, the research findings are still troubling. The election of President Biden gives some extra oxygen to the open government agenda, and that is great news. But in a context where autocratization goes viral, making a dent in how governments operate will take less policy-based evidence searching and more evidence-based strategizing. That involves leveraging the existing evidence where it is available, and where it is not, the standard path applies: more research is needed.

Open Government Partnership and Justice

On another note, Joe Foti, from the Open Government Partnership (OGP), writes on the need to engage more lawyers, judges and advocates in order to increase the number of accountability-focused OGP commitments. I particularly like Joe’s ideas on bringing these actors together to identify where OGP commitments could be stronger, and how. This resonates with a number of cases I’ve come across in the past where the judiciary played a key role in ensuring that citizens’ voice also had teeth. 

I also share Joe’s enthusiasm for the potential of a new generation of commitments that put forward initiatives such as specialized anti-corruption courts and anti-SLAPP provisions. Having said this, the judiciary itself needs to be open, independent and capable. In most countries that I’ve worked in, a good part of open government reforms fail precisely because of a dysfunctional judicial system.

Diversity, collective intelligence and deliberative democracy 

Part of the justification for models of deliberative democracy is their epistemic quality, that is, large and diverse crowds are smarter than the (elected or selected) few. A good part of this argument finds its empirical basis in the fantastic work by Scott Page.

But that’s not all. We know, for instance, that gender diversity on corporate boards improves firms’ performance, ethnic diversity produces more impactful scientific research, diverse groups are better at solving crimes, popular juries are less biased than professional judges, and politically diverse editorial teams produce higher-quality Wikipedia articles. Diversity also helps to explain classical Athens’ striking superiority vis-à-vis other city-states of its time, due to the capacity of its democratic system to leverage the dispersed knowledge of its citizens through sortition.
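
To make the intuition behind Page’s “diversity trumps ability” result more concrete, here is a minimal toy simulation in that spirit (my own sketch, with made-up parameters; not Page’s actual model or code): agents defined by different search heuristics hill-climb a random solution landscape, and a team of the highest-ability individuals is compared with a randomly drawn, more heuristically diverse team.

```python
import random

random.seed(7)

# A circular "landscape" of 200 candidate solutions with random values.
N = 200
values = [random.random() for _ in range(N)]

def climb(start, heuristic):
    """Greedy search: repeatedly apply the agent's step sizes,
    moving whenever a step lands on a better solution."""
    pos = start
    improved = True
    while improved:
        improved = False
        for step in heuristic:
            nxt = (pos + step) % N
            if values[nxt] > values[pos]:
                pos, improved = nxt, True
    return pos

def team_score(team, start):
    """Agents take turns improving the team's current solution
    until no one can improve it further."""
    pos = start
    improved = True
    while improved:
        improved = False
        for heuristic in team:
            nxt = climb(pos, heuristic)
            if values[nxt] > values[pos]:
                pos, improved = nxt, True
    return values[pos]

# Each agent is defined by three distinct step sizes (its heuristic).
pool = [tuple(random.sample(range(1, 13), 3)) for _ in range(50)]
starts = list(range(0, N, 5))

def ability(heuristic):
    """An agent's average solo performance across many starting points."""
    return sum(values[climb(s, heuristic)] for s in starts) / len(starts)

pool.sort(key=ability, reverse=True)
team_of_stars = pool[:9]               # the nine best individual performers
diverse_team = random.sample(pool, 9)  # nine randomly drawn agents

print("team of stars:", sum(team_score(team_of_stars, s) for s in starts) / len(starts))
print("diverse team :", sum(team_score(diverse_team, s) for s in starts) / len(starts))
```

In many runs of models like this, the randomly drawn team matches or beats the team of individual stars, because its members get stuck at different local optima and can unstick one another. That, in a nutshell, is the epistemic argument for large, diverse deliberating bodies.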

Now, a Nature article, “Algorithmic and human prediction of success in human collaboration from visual features”, presents new evidence of the power of diversity in problem-solving tasks. In the paper, the authors examine the patterns of group success in Escape The Room, an adventure game in which a group attempts to escape a maze by collectively solving a series of puzzles. The authors find that groups that are larger, older and more gender-diverse are significantly more likely to escape. But there is an exception: more age-diverse groups are less likely to escape. Intriguing, isn’t it?

Deliberative processes online: rough review of the evidence

As the pandemic pushes more deliberative exercises online, researchers and practitioners have started to take more seriously the question of how effective online deliberation can be compared to in-person processes. Surprisingly, there are very few empirical studies comparing the two methods.

But a quick run through the literature offers some interesting insights. For instance, a 2004 online deliberative poll on U.S. foreign policy and a traditional face-to-face deliberative poll conducted in parallel presented remarkably similar results. A 2007 experiment comparing online and face-to-face deliberation found that both approaches can increase participants’ issue knowledge, political efficacy, and willingness to participate in politics. A similar comparison from 2009, looking at deliberation over the construction of a power plant in Finland, found considerable resemblance in the outcomes of online and face-to-face processes. A study published in 2012 on waste treatment in France found that, compared to the offline process, online deliberation was more likely to: i) increase women’s interventions, ii) promote the justification of arguments, and iii) be oriented towards the common good (although in this case the two processes differed in design).

The external validity of these findings, however encouraging they may be, remains an empirical question, particularly given that the technology used to support deliberations has in many cases changed since these studies were conducted (e.g. from written to “zoomified” deliberations). Anyhow, kudos should go to the researchers who started engaging with the subject well over a decade ago: if that work was niche then, its importance now is blatantly obvious.

(BTW, on a related issue, here’s a fascinating 2021 experiment examining whether online juries can make consistent, repeatable decisions: interestingly, deliberating groups are much more consistent than non-deliberating groups.)

Fixing the Internet? 

Anne Applebaum and Peter Pomerantsev have published a great article in The Atlantic on the challenges posed to democracy by an Internet model that fuels disinformation and polarization, presenting alternative paths to address this. I was thankful for the opportunity to make a modest contribution to such a nice piece.

At the same time, an excellent Twitter thread by Levi Boxell is a good reminder that we may sometimes be overestimating some of the effects of the Internet on polarization. Levi highlights three stylized facts with regard to mass polarization: i) it has been increasing since at least the 1980s in the US, ii) it has been increasing more quickly among older age groups in the US, and iii) over the past 30 years, countries have presented different patterns of polarization despite similar Internet usage.

Of course, that doesn’t mean we shouldn’t be concerned about the effects of the Internet on politics. For instance, a new study in the American Political Science Review finds that radical right parties benefit more than any other parties from malicious bots on social media.

Open democracy

2021 continues to be a good year for the proponents of deliberative democracy, with growing coverage of the subject in the mainstream media, in part fueled by the recent launch of Hélène Landemore’s great book “Open Democracy.” Looking for something to listen to? Look no further than this interview by Ezra Klein with Hélène.

A dialogue among giants 

The recording of the roundtable Contours of Participatory Democracy in the 21st Century is now available. The conversation between Jane Mansbridge, Mark Warren and Cristina Lafont can be found here.

Democracy and design thinking 

Speaking of giants, the new book by Michael Saward, “Democratic Design”, is finally out. I’m a big fan of Michael’s work, so my recommendation may be biased. In this new book, Michael brings design thinking together with democratic theory and practice. If the design of democratic institutions is one of your topics, you should definitely check it out!

Civic Tech 

I was thrilled to have the opportunity to deliver a lecture at the Center for Collective Learning – Artificial and Natural Intelligence Institute. My presentation, Civic Technologies: Past, Present and Future, can be found here.

Scholarly articles:

And finally, for those who really want to geek out, a list of 15 academic articles I enjoyed reading:

Protzer, E. S. (2021). Social Mobility Explains Populism, Not Inequality or Culture. CID Research Fellow and Graduate Student Working Paper Series.

Becher, M., & Stegmueller, D. (2021). Reducing Unequal Representation: The Impact of Labor Unions on Legislative Responsiveness in the US Congress. Perspectives on Politics, 19(1), 92-109.

Foster, D., & Warren, J. (2021). The politics of spatial policies. Available at SSRN 3768213.

Hanretty, C. (2021). The Pork Barrel Politics of the Towns Fund. The Political Quarterly.

Rad, S. R., & Roy, O. (2020). Deliberation, Single-Peakedness, and Coherent Aggregation. American Political Science Review, 1-20.

Migchelbrink, K., & Van de Walle, S. (2021). A systematic review of the literature on determinants of public managers’ attitudes toward public participation. Local Government Studies, 1-22.

Armand, A., Coutts, A., Vicente, P. C., & Vilela, I. (2020). Does information break the political resource curse? Experimental evidence from Mozambique. American Economic Review, 110(11), 3431-53.

Giraudet, L. G., Apouey, B., Arab, H., Baeckelandt, S., Begout, P., Berghmans, N., … & Tournus, S. (2021). Deliberating on Climate Action: Insights from the French Citizens’ Convention for Climate (No. hal-03119539).

Rivera-Burgos, V. (2020). Are Minorities Underrepresented in Government Policy? Racial Disparities in Responsiveness at the Congressional District Level.

Erlich, A., Berliner, D., Palmer-Rubin, B., & Bagozzi, B. E. (2021). Media Attention and Bureaucratic Responsiveness. Journal of Public Administration Research and Theory.

Eubank, N., & Fresh, A. Enfranchisement and Incarceration After the 1965 Voting Rights Act.

Mueller, S., Gerber, M., & Schaub, H. P. Democracy Beyond Secrecy: Assessing the Promises and Pitfalls of Collective Voting. Swiss Political Science Review.

Campbell, T. (2021). Black Lives Matter’s Effect on Police Lethal Use-of-Force. Available at SSRN.

Wright, N., Nagle, F., & Greenstein, S. M. (2020). Open source software and global entrepreneurship. Harvard Business School Technology & Operations Mgt. Unit Working Paper No. 20-139.

Boxell, L., & Steinert-Threlkeld, Z. (2021). Taxing dissent: The impact of a social media tax in Uganda. Available at SSRN 3766440.

Miscellaneous radar: 

  • Modern Grantmaking: That’s the title of a new book by Gemma Bull and Tom Steinberg. I had the privilege of reading snippets of it, and I can already recommend it not only to those working in grantmaking, but also to pretty much anyone working in the international development space.
  • Lectures: The Center for Collective Learning has a fantastic line-up of lectures open to the public. Find out more here.
  • Learning from Togo: While unemployment benefits websites were crashing in the US, the Togolese government showed how to leverage mobile money and satellite data to effectively get cash into the hands of those who need it the most.
  • Nudging the nudgers: British MPs are criticising academics for sending them fictitious emails for research. I wonder if part of their outrage is not just about the emails, but about what the study could reveal in terms of their actual responsiveness to different constituencies.
  • DataViz: Bringing data visualization to physical/offline spaces has been an obsession of mine for quite a while. I was happy to come across this project while doing some research for a presentation.

Enjoy the holiday.

DemocracySpot’s Most Read Posts in 2014

Glasses for reading (1936) – Nationaal Archief

(I should have posted this on the 31st, but better late than never)

Below are some of the most read posts in 2014. While I’m at it, I’ll take the opportunity to explain the reduced number of posts in the last few months. Since mid-2014 I have been working with a small team of political and data scientists on a number of research questions at the intersection of technology and citizen engagement (I presented a few preliminary findings here). Following the period of field work, data collection and experiments, we have now started the drafting and peer-review stage of our research. This has been an extremely time-consuming process, which has taken up most of my weekends, when I generally write for this blog.

Still, one of my new year’s resolutions is precisely to better discipline myself to post more regularly. And I am hopeful that the publication of our upcoming research will make up for the recent reduction in posts. We will start to disseminate our results soon, so stay tuned.

In the meantime, here’s a selection of the five most read posts in 2014.

The Problem with Theory of Change

Technology and Citizen Engagement: Friend or Foe? 

A Brilliant Story of Participation, Technology and Development Outcomes

When Citizen Engagement Saves Lives (and what we can learn from it) 

Social Accountability: What Does the Evidence Really Say?


Now the paper: Evidence of Social Accountability Initiatives


A little while ago I wrote about Jonathan Fox’s work on the evidence of social accountability initiatives. Initially in the format of a PDF slide presentation, it has now been turned into a magnificent paper, the first of the GPSA working paper series. Below is the abstract:

Policy discussion of social accountability initiatives has increasingly focused on questions about their tangible development impacts. The empirical evidence is mixed. This meta-analysis rethinks some of the most influential evaluations through a new lens: the distinction between tactical and strategic approaches to the promotion of citizen voice to contribute to improved public sector performance. Field experiments tend to study bounded, tactical interventions that rely on optimistic assumptions about the power of information alone both to motivate collective action and to influence public sector performance. More promising results emerge from studies of multi-pronged strategies that encourage enabling environments for collective action and bolster state capacity to actually respond to citizen voice. This reinterpretation of the empirical evidence leads to a proposed new series of grounded propositions that focus on state-society synergy and sandwich strategies through which ‘voice’ and ‘teeth’ can become mutually empowering.

You can download the paper here: Social Accountability: What does the Evidence Really Say [PDF]. You can also read my take on the main lessons from Jonathan’s work here. Enjoy the read.

***

PS: I have been away for a while doing field work, but hope to start posting (more or less) regularly soon.


Social Accountability: What Does the Evidence Really Say?

So what does the evidence about citizen engagement say? Particularly in the development world, it is common to say that the evidence is “mixed”. That type of answer, even if correct in extremely general terms, does not really help those who are actually designing and implementing citizen engagement reforms.

This is why a new (GPSA-funded) work by Jonathan Fox, “Social Accountability: What does the Evidence Really Say”, is a welcome contribution for those working with open government in general and citizen engagement in particular. Rather than a paper, this work is intended as a presentation that summarizes (and disentangles) some of the issues related to citizen engagement.

Before briefly discussing it, some definitional clarification: I am equating “social accountability” with the idea of citizen engagement, given Jonathan’s own definition of social accountability:

“Social accountability strategies try to improve public sector performance by bolstering both citizen engagement and government responsiveness”

In short, this definition frames social accountability broadly as “citizen participation” followed by government responsiveness, encompassing practices as distinct as FOI law campaigns, participatory budgeting and referenda.

But what is new about Jonathan’s work? A lot, but here are three points that I find particularly important, based on a very personal interpretation of his work.

First, Jonathan makes an important distinction between what he calls “tactical” and “strategic” social accountability interventions. The first type, which could also be called “naïve” interventions, includes those bounded in their approach (based on a single tool) and those that assume that mere access to information (or data) is enough. Conversely, strategic approaches deploy multiple tools and articulate society-side efforts with governmental reforms that promote responsiveness.

This distinction is important because, when examining the impact evaluation evidence, one finds that while the evidence is indeed mixed for tactical approaches, it is much more promising for strategic approaches. A blunt lesson to take from this is that when looking at the evidence, one should avoid comparing lousy initiatives with more substantive reform processes. Otherwise, it is no wonder that “the evidence is mixed.”

Second, this work offers an important re-reading of some of the literature that has found “mixed effects”, reminding us that when it comes to citizen engagement, the devil is in the details. For instance, in a number of studies that seem to show that participation does not work, a closer look makes the null results unsurprising: many times the problem is precisely that there was no real participation to begin with. False negatives, as eloquently put by Jonathan.

Third, Jonathan highlights the need to bring together the “demand” (society) and “supply” (government) sides of governance. Many accountability interventions seem to assume that it is enough to work on one side or the other, and that an invisible hand will bring them together. Unfortunately, when it comes to social accountability it seems that some degree of “interventionism” is necessary in order to bridge that gap.

Of course, there is much more in Jonathan’s work than that, and it is a must read for those interested in the subject. You can download it here [PDF].