Summit on AI and Democracy

On November 7, 2023, the Summit on AI and Democracy gathered experts from multiple institutions to discuss ongoing research, policy, and development efforts related to recent advances in AI.
Carnegie Mellon University Conference on “Operationalizing the NIST Risk Management Framework”

The Carnegie Mellon University Block Center recently held a conference on “Operationalizing the NIST Risk Management Framework,” which GETTING-Plurality Research Network member Sarah Hubbard attended. The attendees spanned government, academia, and industry partners working toward Responsible AI. Below, we include a short recap of the event.
Recent Reports on Decentralized Autonomous Organizations

GETTING-Plurality Research Network member Sarah Hubbard spent the past academic year as a Technology and Public Purpose Fellow at the Harvard Kennedy School. During this fellowship, she researched and convened various stakeholders on Decentralized Autonomous Organizations. Below is a summary of this work, along with several recently published reports from the Belfer Center.
White House Memos on Responsible AI

The GETTING-Plurality Research Network submitted a series of memos to the White House in response to its call for memos on national priorities for AI.
GETTING-Plurality Open Research Questions

Network members have developed an open and collaborative list of research priorities in response to the recent advances in artificial intelligence.
Ovadya discusses bridging systems at Columbia symposium

GETTING-Plurality Workstream Lead Aviv Ovadya recently discussed his work on bridging systems as part of “Optimizing for What? Algorithmic Amplification and Society,” at Columbia University’s Knight First Amendment Institute.
Putting Flourishing First: Applying Democratic Values to Technology

In this short research brief, the authors unpack and comment on the four-step logic at the core of GETTING-Plurality’s foundational white paper, Ethics of Decentralized Social Technologies: Lessons from Web3, the Fediverse, and Beyond. They outline four assertions from the paper that demonstrate the power and the challenge – and above all, the urgency – of placing human flourishing at the center of technology governance.
Ethics of Decentralized Social Technologies: Lessons from Web3, the Fediverse, and Beyond

The plethora of experiments with decentralized social technologies (DSTs)—clusters of which are sometimes called “the Web 3.0 ecosystem” or “the Fediverse”—has brought us to a constitutional moment. These technologies enable radical innovations in social, economic, and political institutions and practices, with the potential to support transformative approaches to political economy. They demand governance innovation. The paper develops a framework of prudent vigilance for making ethical choices in this space that helps both to grasp positive opportunities for transformation and to avoid potentially problematic consequences. Most of our specific examples and concerns come from the blockchain/Web3 universe, as this has received the greatest investment, attention, and adoption to date. However, we aim to offer a framework for governance decision-making under conditions of uncertainty that applies more broadly to other DSTs. Specifically, under the framework of prudent vigilance, we propose a pragmatic, democratic, and pluralist approach to navigating bold experimentation with social practices and political economy enabled by these technologies. Our overarching goal is to provide a framework open to transformative improvement and constrained by guardrails and guiding values supportive of democracy, freedom, and pluralism. We take a relatively strong position, rather than simply laying out ethical issues and potential approaches. We seek to be provocative in order to spur further work, and we hope this paper will serve as a first bridge between academic philosophy and the DST community, which have hardly interacted to date.
Plural Publics

Data governance is usually conceptualized in terms of “privacy” versus “publicity.” Yet a core feature of pluralistic societies is association: groups that share with each other, privately. These constitute a diversity of (plural) publics, each externally private but able to coordinate and share internally. Empowering plural publics requires tools that allow shared communicative contexts to be established and defended against sharing outside of context. The ease of spreading information online has challenged such “contextual integrity,” and the rise of generative foundation models like GPT-4 may radically exacerbate this challenge. In the face of this challenge, we highlight why we believe the problem of “plural publics” is a core challenge of data governance, discuss existing tools that can help address it, and lay out a research agenda to further develop and integrate these tools with a design eye specifically on the requirements of plural publics.
GETTING-Plurality Launch

Harvard Edmond and Lily Safra Center for Ethics Launches Research Network on the Governance of Emerging Technologies through Plurality

March 20, 2023

The Harvard Edmond & Lily Safra Center for Ethics is proud to announce the launch of the Governance of Emerging Technology and Technology Innovations for Next-Generation Governance through Plurality (GETTING-Plurality) research network. This […]