Sub-theme 5. Modelling and multi-methods approaches in polycentric commons systems
Panel 5.3.
Leveraging computational methods to decode institutions -- implementing computer science methods to study commons.
The panel is dedicated to sharing research that implements computational methods for extracting, annotating, and analyzing strategies, norms, and rules, or any other forms of information intended to regulate agents’ behavior. The panel aims to gather researchers from different disciplines, including computer science, computational social science, as well as social science more broadly, who rely on specific computational methods to develop a better understanding of how institutions govern social interactions, especially those involving commons. To date, however, efforts to collate an exposition of different computational approaches have been limited. The panel thus encourages knowledge transfer across different (sub)disciplines, and specifically invites contributions that relate to:
- Novel theoretical or conceptual frameworks for studying the commons through computational methods;
- Empirical work by both researchers and practitioners exploring the opportunities offered by computational techniques (whether existing or novel) in the field;
- Research focusing on the development of technological approaches that illustrate or unleash novel analytical opportunities, or advance existing techniques for the study of the commons specifically, and institutions more generally.
For all the highlighted directions, the panel is agnostic about the nature of the underlying data (e.g., qualitative, quantitative) and the specific methods (including formal methods) used in the study. Where relevant, researchers are further invited to speak specifically to the challenges of the employed techniques, as a basis for identifying best practices regarding their selection and application.
- June 24, 2023
- 11:00 am
- Tenth Floor - 1002
1. Socio-cultural heterogeneity and collective action
Sechindra Vallury1, Jacopo Baggio2, and Nathan Cook3
1University of Georgia, USA, 2University of Central Florida, USA, 3Indiana University-Purdue University Indianapolis, USA
This study examines how socio-cultural heterogeneity can shape individual decision-making and interactions in a common-pool resource setting. Scholars have long argued that the identity of an individual – their sense of self – is associated with different social categories (e.g., caste, gender), and their decisions are shaped by norms or beliefs about how people in these categories should behave. These identities may be further reinforced or mitigated through social networks that facilitate the exchange of knowledge, cultural products, symbols, or beliefs. Here, we develop an agent-based model to examine how individuals from different social groups collaborate, share knowledge, learn, and ultimately make decisions affecting cooperation and collective action outcomes. We explore this question in the context of a smallholder farming community that is subject to the variability in water availability caused by climate change. Specifically, our model is motivated by the Kuhl irrigation system in India, in which farmers from different social groups make decisions about several collective choice arrangements (e.g., canal repair, participation in meetings). Using precipitation data from 2010-2018, we simulate how the performance of the system changes under different projections of climate change in the region (e.g., increase and decrease in rainfall, changes in length of the monsoon season, and changes in the timing of the onset of the monsoon). Our model will help us understand how climate change is likely to impact farmers from different social groups and what factors facilitate adaptation to climate change.
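As a schematic illustration of the kind of agent-based model described above (not the authors' Kuhl implementation; all group labels, parameter values, and update rules are hypothetical), a minimal Python sketch might look like this:

```python
import random

class Farmer:
    """Agent with a social-group identity and a baseline propensity to cooperate."""
    def __init__(self, group, coop_prob):
        self.group = group          # e.g., a caste or gender category (hypothetical)
        self.coop_prob = coop_prob  # propensity to join collective tasks such as canal repair

    def decide(self, water_stress):
        # Cooperation becomes more likely when water is scarce.
        return random.random() < min(1.0, self.coop_prob + 0.3 * water_stress)

def simulate(n_agents=100, n_rounds=50, rainfall_factor=1.0, seed=1):
    random.seed(seed)
    groups = {"A": 0.6, "B": 0.4}  # hypothetical group-specific baseline norms
    farmers = [Farmer(g, p) for g, p in groups.items() for _ in range(n_agents // 2)]
    cooperation = []
    for _ in range(n_rounds):
        rainfall = rainfall_factor * random.gauss(1.0, 0.3)  # stylised monsoon variability
        water_stress = max(0.0, 1.0 - rainfall)
        contributors = sum(f.decide(water_stress) for f in farmers)
        rate = contributors / len(farmers)
        # Social learning: agents drift toward the observed cooperation level.
        for f in farmers:
            f.coop_prob += 0.05 * (rate - f.coop_prob)
        cooperation.append(rate)
    return sum(cooperation) / len(cooperation)

if __name__ == "__main__":
    for factor in (0.8, 1.0, 1.2):  # decreased, baseline, and increased rainfall scenarios
        print(factor, round(simulate(rainfall_factor=factor), 3))
```

Varying `rainfall_factor` stands in for the climate projections mentioned in the abstract; a full model would additionally encode social networks and the specific collective choice arrangements.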
2. Deconstructing written rules and hierarchy in peer produced software communities
Mahasweta Chakraborti, Qiankun Zhong, Beril Bulat, and Seth Frey
UC Davis, USA
We employ recent advances in computational institutional analysis and NLP to investigate the systems of authority reflected in the written policy documents of the Apache Software Foundation (ASF). Our study, which examines how far the ASF model resembles or departs from that of conventional software companies, reveals evidence of both flat and bureaucratic governance in a peer-production setting, suggesting a complicated relationship between business-based theories of administrative hierarchy and the foundational principles of the OSS movement.
Working paper: https://arxiv.org/abs/2206.07992
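One simple ingredient of this kind of policy-document analysis is profiling deontic language as a rough proxy for formal authority. The sketch below is illustrative only (the lexicon and the example sentences are invented, not the study's coding scheme):

```python
import re
from collections import Counter

# Deontic markers often used to signal authority in written policies
# (an illustrative lexicon, not the study's actual annotation scheme).
DEONTICS = {"must", "shall", "should", "may", "can"}

def deontic_profile(text: str) -> Counter:
    """Count deontic markers as a rough proxy for how prescriptive a policy text is."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return Counter(t for t in tokens if t in DEONTICS)

# Hypothetical policy snippets in the style of OSS governance documents.
policy = (
    "Committers must sign a contributor agreement. "
    "The PMC may grant commit access. "
    "Release managers should follow the published process."
)
print(deontic_profile(policy))  # e.g., Counter({'must': 1, 'may': 1, 'should': 1})
```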
3. Composing games into complex institutions
Joshua Tan1, Jules Hedges2, Phillip Zahn3, and Seth Frey4
1The Metagovernance Project, USA, 2University of Strathclyde, UK, 3University of St. Gallen, Switzerland, 4UC Davis, USA
Game theory is used by all behavioral sciences, but its development has long centered around tools for relatively simple games and toy systems, such as the economic interpretation of equilibrium outcomes. Our contribution, compositional game theory, permits another approach of equally general appeal: the high-level design of large games for expressing complex architectures and representing real-world institutions faithfully. Compositional game theory, grounded in the mathematics underlying programming languages, and introduced here as a general computational framework, increases the parsimony of game representations with abstraction and modularity, accelerates search and design, and helps theorists across disciplines express real-world institutional complexity in well-defined ways.
Working paper: https://arxiv.org/abs/2108.05318
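As a toy illustration of the compositional idea (not the open-games lens construction itself, which is considerably richer), one can represent stage games as modular objects and build a larger game by sequential composition; all names and payoffs below are hypothetical:

```python
from dataclasses import dataclass
from itertools import product
from typing import Callable, Tuple

@dataclass
class StageGame:
    """A one-shot decision stage: a set of moves and a payoff rule."""
    moves: Tuple[str, ...]
    payoff: Callable[[str, str], float]  # payoff given (own move, upstream context)

def compose(first: StageGame, second: StageGame):
    """Sequential composition: play `first`, then `second` observes the first move."""
    def total_payoff(m1: str, m2: str) -> float:
        return first.payoff(m1, "") + second.payoff(m2, m1)
    return [(m1, m2, total_payoff(m1, m2)) for m1, m2 in product(first.moves, second.moves)]

# Hypothetical example: a 'propose rule' stage followed by a 'comply or defect' stage.
propose = StageGame(("strict", "lenient"), lambda m, _: 1.0 if m == "strict" else 0.5)
comply = StageGame(("comply", "defect"),
                   lambda m, ctx: 2.0 if (m == "comply" and ctx == "strict") else 1.0)

for outcome in compose(propose, comply):
    print(outcome)  # enumerate joint moves and their combined payoffs
```

The point of the compositional framework is that such building blocks, and the operations that glue them together, are given a precise mathematical semantics, so large institutional architectures can be assembled and analysed modularly.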
4. Identifying tensions in UNESCO proceedings with a Natural Language Processing approach
Joanna Wojciechowska, Igor Kamiński, Maria Śmigielska, Mateusz Sypniewski, Emilia Wiśnios, Stanisław Giziński, Bartosz Pieliński, Hanna Schreiber, and Julia Krzesicka
University of Warsaw, Poland
Natural Language Processing (NLP) has been widely used in argument mining to identify, extract, and classify arguments in natural language texts. In this study, we applied NLP techniques to the proceedings of UNESCO, a global organisation that promotes the conservation and protection of cultural and natural heritage, with the aim of defining and analysing specific social dilemmas surrounding heritage commons.
To this end, we preprocessed the proceedings to extract the argumentative units and their components, such as premises and conclusions. Then, we applied argument mining techniques to classify the arguments into different categories and identify the main topics and issues discussed in the proceedings and the relations between them.
Overall, our study demonstrated the potential of NLP techniques in identifying tensions and conflicts in UNESCO proceedings and providing insights into the organisation’s decision-making processes. Further research is needed to develop and refine the argument mining techniques to improve the accuracy and reliability of the results.
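For readers unfamiliar with argument mining, the classification step can be pictured with a very small baseline such as the one below; this is an illustrative TF-IDF plus logistic-regression sketch with invented training snippets, not the study's actual pipeline or data:

```python
# Illustrative argument-component classification (claim vs. premise) baseline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical, hand-labelled training snippets in the style of committee debates.
texts = [
    "The element should be inscribed because it fosters community identity.",
    "Inscription would strengthen safeguarding measures.",
    "Several states reported a decline in the number of practitioners.",
    "The nomination file was submitted in March.",
]
labels = ["claim", "claim", "premise", "premise"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                    LogisticRegression(max_iter=1000))
clf.fit(texts, labels)

print(clf.predict(["The committee should defer the nomination because documentation is missing."]))
```

In practice, transformer-based models and far larger annotated corpora would replace this baseline, and a further step would link the classified components into argument structures.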
5. Decomposing institutional statements using Large Language Models
Stanisław Giziński1 and Bartosz Pieliński2
1Warsaw University of Technology, Poland, 2University of Warsaw, Poland
Institutional Grammar (IG), a formalism based on Elinor Ostrom's IAD framework, has many applications in modern policy research. Extensions of IG such as Nested ADICO allow for fine-grained analysis of complex sentences, capturing institutions in greater detail. This methodology consists of splitting a complex institutional statement into smaller statements, called atomic statements, connected by logical relationships.
However, due to the complexity of legislative language, coding these complex statements, especially very long sentences, proves to be problematic and time-consuming. As a result, little work has been done on analyzing institutions using the nested IG paradigm.
I argue that this problem can be tackled with Natural Language Processing. Recent advances in NLP include the rise of large language models, such as GPT, which have previously proven useful for semantic parsing tasks.
In my talk, I will present a method for automatically breaking down complex institutional statements into atomic statements using GPT, along with a validation of its results against manually parsed sentences.
I believe that employing this technique in the process of coding Institutional Grammar can greatly speed up the whole process, allowing for the analysis of much larger volumes of data.
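A minimal sketch of this kind of GPT-assisted decomposition is shown below; the prompt wording, the model name, and the example statement are illustrative rather than the author's actual setup, and the code assumes the `openai` Python package (version 1.x) with an API key available in the environment:

```python
# Illustrative sketch: asking a GPT model to split a complex institutional
# statement into atomic statements linked by logical connectives.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PROMPT = (
    "Split the following institutional statement into atomic statements, "
    "one per line, and state the logical connective (AND/OR/XOR) linking them.\n\n"
    "Statement: {statement}"
)

def decompose(statement: str, model: str = "gpt-4") -> str:
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": PROMPT.format(statement=statement)}],
        temperature=0,
    )
    return response.choices[0].message.content

# Hypothetical complex institutional statement.
example = ("The water users association must maintain the canal and, "
           "if funds permit, may hire additional labourers during the monsoon.")
print(decompose(example))
```

In a full pipeline, the model's output would then be compared against manually coded atomic statements to estimate the accuracy of the automated decomposition.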