What is IR-for-Good?


(This blog post has been co-authored by Maria Heuss and Bhaskar Mitra, co-chairs of the ECIR’26 IR-for-Good Track. Please see the call for papers for more details about the track. Abstracts are due October 21 and papers are due October 28.)


Seriously, what is it? When we met this summer as track chairs to plan ahead for next year’s IR-for-Good track, our conversation largely revolved around this question: “What is IR-for-Good?”. Typically, when a conference has a special track, the goal is to nurture a specific research direction that is not yet mainstream within the broader field and to build a community around it. Special tracks generally focus on particular problems or new approaches. But what does it mean to have a special track for IR research that contributes to societal good? Shouldn’t all IR research be societally beneficial? Does a special track on IR-for-Good (😇) imply that other contributions to the conference are IR-for-Bad (😈)? How do we define what societal good is? And how do we decide whether particular IR contributions are likely to benefit or harm society?

These are the thorny—and yet intellectually exciting and societally critical—questions that we started with. Reflecting on these questions explicitly helped us clarify for ourselves what we are trying to achieve with the IR-for-Good track and prompted several changes to the track this year that we hope the community will find useful. In this post, we would like to share with the broader IR community our motivations for these changes and initiate a conversation about how we collectively center societal needs in IR research and about the role of the IR-for-Good track in effecting desired transformations in the IR community.

What is societal good?

Access to trustworthy information is a critical societal need, including supporting informed citizenry in democratic societies, as a catalyst for social transformations, and as a social determinant of health and economic progress. It is imperative that IR research concerns itself with not just the information needs of individual users but also its responsibilities towards collective societal good. We must not assume that all technological progress in IR contributes positively to society nor accept the techno-deterministic view that there is a single pre-determined path forward for progress in IR research. Instead, we must explicitly study and critique the systemic impact of information access technologies on society in light of the sociopolitical context in which they are developed and deployed, and leverage our improved understanding to guide future IR research towards realizing positive societal outcomes.

But first, we must articulate what we mean by societal good. These conversations are already happening in different parts of the IR community, including in forums like SWIRL 2025 (see Section 7 of the SWIRL’25 report). We reviewed some of this literature and then decided to adopt the following operative definition of IR-for-Good:

IR-for-Good refers to IR research and practices that contribute towards realizing more equitable, emancipatory, and sustainable futures.

Starting from this definition, we enumerated potential relevant topics of interest for this track to include how IR intersects with and/or can support:

  • Accessibility and disability justice
  • Art, culture, and representation
  • Crisis and disaster management
  • Decolonization and racial justice
  • Emancipation, anti-oppression, and social justice
  • Gender and sexuality justice
  • Informed citizenry, democracy, and collective decision making
  • Law and restorative justice
  • Literacy and knowledge production
  • Privacy and dignity
  • Public health and community health
  • Social, political, and economic equity
  • Sustainability and environmental justice
  • Worker rights and labor movements

We made an intentional choice to center these topics on societally-beneficial outcomes (e.g., equity, emancipation, justice, and sustainability) rather than on the approaches that may help us progress towards those outcomes (e.g., procedural fairness, interpretability, and transparency). We of course welcome submissions focusing on different approaches in this special track. Our motivation for centering outcomes over approaches is to encourage exploration of a broader space of diverse sociotechnical methods as well as to hold ourselves accountable to the ultimate goal of effecting positive societal impact.

We consider this definition of societal good to be neither fixed nor complete. It is ultimately up to the IR community to iterate on, extend, and further explicate this definition over time. But for now, we hope it provides reasonable clarity on the societal outcomes we aspire to.

Finally, we defined the scope of the IR-for-Good track to include IR research that:

  1. Explicitly pursues new research directions and system designs to achieve specific societally beneficial outcomes,
  2. Develops new fairness, privacy, transparency, accessibility, sustainability, and other similar societally-motivated interventions, and/or
  3. Identifies and critiques the ways in which existing IR methods and systems, and how we do IR research, may contribute to systemic harm or impede social progress.

Within the above specified scope, we invite contributions to the track that explore new positions, critiques, tools, methods, resources, and interventions for IR-for-Good.

How do we decide if particular IR contributions are likely to benefit or harm society?

The ultimate goal of IR-for-Good is to achieve positive societal impact through relevant IR research. To effect real change, we must ensure that our research is grounded in a rigorous understanding of the sociotechnical challenges and the complex sociopolitical context in which our work is embedded. We must discourage a non-performative research gaze and hold ourselves collectively accountable to ensure that we are not simply “spinning our wheels” and that our scholarship indeed translates to material positive societal impact. And we must be particularly careful to ensure that our research pushes for structural change and does not unintentionally contribute to ethics-washing harmful technologies.

When we, the IR-for-Good track chairs, started discussing how to ensure the community is indeed making progress on desired societal outcomes, it quickly became apparent to us that we need to develop new community practices that help us build a shared understanding of how technologies impact society and help us become more effective at bringing about positive societal change. Intuitively, if we want to encourage more critical scholarly discourse within the IR community on how specific research directions may contribute towards desired societal outcomes, the first step should be to make these theories of change explicit in our scholarship. With that motivation, we are requiring every ECIR’26 IR-for-Good track submission that proposes new IR tools, methods, resources, or interventions to include a separate section elaborating how the work contributes towards desired societal outcomes. Position papers and critiques are exempt from this requirement, as such arguments should in any case be a core contribution of those submissions.

We recommend that the “Theory of Change” section explicitly state:

  1. What is the identified societal need / problem, and how are the core contributions from this current work expected to address them?
  2. What preconditions are necessary or what assumptions need to hold for this work to have its desired effect, and how likely are they to hold true in practice?
  3. What are possible negative externalities of this approach and is it plausible that this may lead to new or different harms?

Authors are encouraged to include any additional discussion that they deem relevant in this section. Note that it is not mandatory to name the section “Theory of Change”, but it should be apparent from the section title that it elaborates on how the work contributes towards desired societal outcomes.

Contributions focusing on algorithmic bias, fairness, transparency, interpretability, explainability, trustworthiness, misinformation, disinformation, hate speech, replicability, transferability, robustness, uncertainty, security, ethics, and other related topics are also required to explicitly articulate how the work contributes towards positive societal outcomes, rather than implicitly assume that all research on these topics contributes to societal good. As a corollary, certain IR topics that have not historically been seen as societally focused (e.g., designing distributed information access platforms or developing more effective ranking models without the use of user behavior data) are also welcome in this track if the authors can appropriately argue that the work is likely to contribute to societal good, e.g., by making platforms more robust to authoritarian capture or by disincentivizing mass user surveillance, respectively.

We want to strongly emphasize that this section should not be an afterthought. Instead, it should be a critical part of the key motivations for the work, and as important as any other core section of the paper. We encourage authors and reviewers to critically engage with this section while acknowledging the real uncertainty of how any well-intentioned research may impact society in practice. Our goal is not to encourage authors to inflate their claims of societal impact but to rigorously deliberate on their sociotechnical assumptions, to thoroughly enumerate the preconditions necessary for the work to have its desired impact, and to identify potential negative externalities. Having explicit theories of change in our publications further opens up the opportunity for future scholarship to analyze, critique, validate, and improve upon these theories of change. It also creates the possibility for scholars from non-IR disciplines to engage with and critically analyze the emerging theories of change within the IR community. And we hope that over time this practice also encourages IR researchers to more actively reach beyond their disciplinary boundaries to work with other scholars, experts, practitioners, policymakers, civil rights advocates, activists, and movement organizers pushing for social justice and sustainability.

Here are a few example cases to illustrate the kind of critical reflections we would like to see more of in the IR community:

Claim: Our work that proposes a method for making expensive machine learning models for IR more efficient contributes towards sustainability and reducing impact on the environment.
Considerations for this claim: What preconditions are necessary for the efficiency improvements to translate to reduced impact on the environment? E.g., according to the Jevons paradox in economics, when technological advancements make a resource more efficient to use (thereby reducing the amount needed for a single application) it often results in overall increase in demand, causing total resource consumption to rise instead of falling. Is it more likely then that more efficient models may in fact lead to a false sense of mitigation and result in much wider adoption contributing to increased harm to the environment?
Claim: Our work that develops new assistive tools for document authoring increases worker productivity and contributes towards reduced labor for workers.
Considerations for this claim: What preconditions are necessary for the improvement in productivity to benefit the workers? In other words, who gets to benefit from the surplus provided by technology here? Does it benefit workers or does it lead to further reduction in their compensation and changes in job expectations that lead to lower status? Does the proposed approach provide any enforceable mechanisms to ensure that the surplus primarily benefits the workers?
Claim: Our work that improves alignment of LLMs towards specific social values contributes towards user safety by preventing exposure to harmful content.
Considerations for this claim: Who gets to decide what is harmful or select the values the system should be aligned with? Could these approaches in fact further centralize power and control over what is deemed "acceptable" vs. "harmful" speech? Could this stifle the voices of marginalized people and social activists? Could this incentivize authoritarian capture of information access platforms to manipulate public opinion? Does the proposed approach consider both the social and technical aspects of this problem to ensure democratic oversight and emancipatory outcomes?
Claim: Our work that proposes new methods for generating explanations for model outputs contributes towards increasing user trust in the system.
Considerations for this claim: Is that trust beneficial or harmful for the user? Is that trust warranted or could it in fact draw users into a false sense of safety and distract them from noticing how the system surveils them and subtly manipulates their behavior? How can this explainability intervention actually help reveal and challenge existing power structures?
Claim: Our work that proposes a new ranking approach for gender fairness contributes towards gender justice.
Considerations for this claim: How does the adopted definition of "gender fairness" in this work translate to mitigating real-world "gender discrimination"? Does this work assume that gender is binary, erasing other identities? Does this work assume that gender is known for all users / subjects and incentivize further intensification of surveillance and collection of private demographic data from members of historically marginalized communities? How can this work be operationalized in practice towards gender and sexuality justice? Are there example use-cases where this can be reliably demonstrated (e.g., ranking in hiring or job recommendation applications)?

Our own theory of change

It would be inconsistent for us to ask others to state their theories of change without at least briefly articulating our own. After many conversations and careful deliberations over the summer, we, the co-chairs of the ECIR’26 IR-for-Good track, concluded that societal good should obviously be the motivation and goal of all IR research. The role of the IR-for-Good track in this endeavor, then, is to be a space where we can explore, experiment with, and develop new community practices and norms to promote more societally-beneficial IR research. Our motivation is to subsequently contribute the identified best practices back to the broader IR community in an effort to ensure that all of IR research is IR-for-Good.

The transformations that the IR-for-Good track wants to realize in the broader IR community will not happen overnight. At ECIR’26 we are trying to re-clarify for ourselves and the IR community what we want to achieve with this special track and to build on the original IR-for-Good vision. We need future IR-for-Good track chairs to continue evolving these emerging practices and to experiment with new ones. We need to collaborate with and enable cross-pollination of ideas and practices with other societally-motivated IR sub-communities, such as the TheWebConf Web4Good Special Track, the TheWebConf Responsible Web track, the RecSys FAccTRec workshop, and the KDD Responsible AI Day. And we need to raise our sociopolitical consciousness within the IR community and push towards more epistemic rigor, working in partnership with scholars and experts from other disciplines.

We need you 🫵🏽

At ECIR’26, the IR-for-Good track will serve as a platform to showcase the very best societally-motivated research in the field of IR. Starting this year, IR-for-Good will be a core track at the conference and will run alongside the main conference, not on workshop day. We invite you to not just participate in this special track, but to be an active part of building a broader movement in the IR community to bend the arc of IR research towards societal good.

Here’s how you can get involved:

  • Submit your work to the track (Abstracts due: Oct 21, papers due: Oct 28).
  • Sign up as a reviewer for the track. We are especially looking for reviewers who can bring in interdisciplinary perspectives, such as at the intersections of IR with human-computer interaction (HCI), information sciences, media studies, design, science and technology studies (STS), social and political sciences, philosophy, law, environmental sciences, public health, and educational sciences.
  • Send us your feedback and ideas for the IR-for-Good track, and tell us what you would like to see at the conference track this year.
  • If you are involved in other societally-motivated IR sub-communities or tracks at other IR venues, then let’s share notes and work together!

And please join us in Delft next year to continue the conversation!



Would you like to comment on or discuss this post? You can do so on these social media threads on Bluesky, Mastodon, LinkedIn, and Twitter.