Events and Announcements: 1 October 2023

To have your event or announcement featured in next week’s post, please send a link and a brief description to ojeventsandannouncements@gmail.com!

Calls for Papers

Propaganda and Emerging Technologies: We invite abstracts for ‘Propaganda and Emerging Technologies,’ a conference hosted by the Information Society Project, and to be held at Yale Law School on April 5–6, 2024. Problems of propaganda, hate speech, misinformation, manipulation, and electoral influence have persisted for years. These problems will likely increase as generative AI and extended reality technologies become more sophisticated and accessible. Because the problems are global as well as domestic, the solutions require a global as well as a domestic perspective. This international and interdisciplinary conference aims to explore how generative AI, extended reality, and other emerging technologies create new problems for the public sphere, and the best ways to deal with these problems. The goal of the conference is to help lawyers, policy makers and social media companies anticipate and plan for a new wave of propaganda in the age of artificial intelligence.

We invite abstracts from legal scholars and experts, as well as scholars from other disciplines – including history, public health, philosophy, anthropology, sociology, media studies, information science, computer science, statistics, and cultural studies – that investigate questions of propaganda: historic, contemporary, and emerging. We welcome papers that explore (but are not limited to) how new technologies like AI and extended reality might undermine trust in, or increase distrust of, knowledge-producing and credentialing institutions, including science, journalism, and academia, and how best to counteract these effects. We also seek work that discusses how new technologies will produce new forms and new kinds of propaganda. For more information, click here.

Events

Protecting Civilians Against Digital Threats During Armed Conflict: In situations of armed conflict, access to and the integrity of digital technology are of significant importance and can save lives. At the same time, the digitalization of armed conflicts brings new threats for civilians and those who work to protect them. While state and non-state actors have used digital technology to overcome their adversaries militarily, digital technologies have also been used to disable critical civilian infrastructure and services, to incite violence against civilian populations, to disrupt medical and humanitarian relief efforts, and to undermine trust in those efforts. What limits does international humanitarian law set on such operations, and which legal and policy measures should belligerents, tech companies, and humanitarian organizations take to protect civilians from new threats?

Over the past two years, the ICRC’s President has chaired a ‘Global Advisory Board’ of high-level leaders and experts from the legal, military, policy, technological, and security fields to advise the ICRC on digital threats and identify concrete recommendations to protect civilians against them. At this event, the Board will present its final report, featuring four guiding principles and a set of concrete legal and policy recommendations to states, belligerents, tech companies, and humanitarian organizations to prevent or mitigate digital threats. The event takes place online on 19 October 2023 from 13:30–14:30 CEST. To register, click here.

Online Lecture Series: Underworlds – Sites and Struggles of Global Dis/Ordering: Engagement with practices of global ordering is often guided towards specific locations and legacies: the sovereign state, the formal sources and standards of international law, the intricacies of global diplomacy, the historical juncture and its (anti-)heroes, the international palaces of hope in Geneva, New York, or The Hague. These explorations entail ideas of where power resides and where it is to be unmasked or undone – ideas implicitly grounded in modernist geographies, temporalities, and subjectivities. Starting from the limits of these familiar perspectives, this lecture and workshop series traces the multiple ways in which these sites, actors, and events are cabined, crossed, and cut apart by alternative material arteries, lineages, and languages of global dis/ordering.

The series takes as its starting point that authority and order are not fixed properties of specific actors or institutions, but the result of ongoing material processes of ordering and world-making. As such, it traces unconventional forms and sites of global dis/ordering – from raw materials to projections of hope – as material, infrastructural, and discursive compositions that shape patterns of power. The encounter between old and new materialist, Marxist, and decolonial methodologies and modes of critique is one of the key objectives of this series. Its aim, however, is not only methodological: it aspires to inspire new ethical and political openings that attend to our inevitable complicity in taking part in these processes, and to reveal new modes of resistance and refusal, of struggle and sociality. These interventions are not narrowly targeted at the old nemeses of critique – the state, the truth, the universal – but work from within both entrenched and emergent material sites and practices of dis/ordering: oceans, oil, coal, breath, debt, commons, frontier(s), waste, hope, wild, wild / vessels. For the lecture schedule and to register, click here.

Announcements

AI & Human Rights Newsletter: If you are interested in artificial intelligence and human rights, sign up for the weekly AI & Human Rights Newsletter! The newsletter is part of the UKRI-funded project ‘AI & Human Rights: Understanding how AI affects our lives and shapes our society,’ led by Dr Daragh Murray, Senior Lecturer at Queen Mary University of London School of Law. To sign up, and for more information along with previous editions of the newsletter, visit aihumanrights.blog.
