The International Conference Ethics of Socially Disruptive Technologies (ESDIT2022) will be held on 6 & 7 October 2022 in the Netherlands.
This is a major academic conference in 2022, organized by a consortium of seven universities in the Netherlands. The focus is on new technologies that are disruptive and transformative of society: artificial intelligence, robotics, neurotechnology, synthetic biology, 3D printing, energy transition technologies, and others.
About the conference
The objective of the conference is to study and ethically assess the transformative consequences of these emerging technologies for social institutions, the environment, human relations, personal identities, thought, and language. There is a particular focus on the challenges that disruptive technologies pose to key concepts and values, such as “truth”, “agency”, “democracy”, “human nature” and “life”.
KEYNOTE SPEAKERS
KEYNOTE SPEAKERS OUTLINE
Keynote Speaker: Sally Haslanger
A characteristic of critical theories is that they resist formulaic solutions to social problems based on a priori theory. Behind this is a commitment to the epistemic and social empowerment of the oppressed so that they can solve their own problems. I argue that we can best understand this epistemic commitment by exploring the path dependency of value. If values are contingent, historically and materially grounded constructions, then a priori inquiry will have only a limited role to play in achieving social justice. I will briefly discuss a case study of participatory design in rural Kenya, showing how material and embodied engagement in co-design can have an impact. I argue that such path dependency is compatible with values – and social justice – being objective, though not discoverable by ideal theory.
BIO
Sally Haslanger is Ford Professor of Philosophy and Women’s and Gender Studies at MIT. She also teaches in MIT D-Lab, a hands-on program using participatory design to create inclusive, accessible, and sustainable solutions to global poverty challenges. Broadly speaking, her philosophical work links issues of social justice with contemporary work in epistemology, metaphysics, philosophy of language, and philosophy of mind. Papers representing this effort over twenty years were collected in Resisting Reality: Social Construction and Social Critique (Oxford, 2012), which received the Joseph B. Gittler Award for outstanding work in the philosophy of the social sciences. She recently co-authored What Is Race? Four Philosophical Views (Oxford, 2019) with Joshua Glasgow, Chike Jeffers and Quayshawn Spencer. She is currently finishing a book, Doing Justice to the Social, which is under contract with OUP. In 2013–14, she was President of the Eastern Division of the American Philosophical Association; in 2015, she was elected to the American Academy of Arts and Sciences. See also: http://sallyhaslanger.weebly.com
TRACK KEYNOTE SPEAKERS
- Track 1 – Ethics of Human-like Robots: John Danaher (NUI Galway)
- Track 2 – Fundamental Issues in AI: Lena Kästner (University of Bayreuth)
- Track 3 – Technology and Changing Self-Understanding: Jeroen van den Hoven (TU Delft), Eleonora Viganò (University of Zurich)
- Track 5 – Social Justice and Technology: Shen-yi Liao (University of Puget Sound)
- Track 6 – Control and Technology: C. Thi Nguyen (University of Utah), Regina Rini (York University)
- Track 7 – The Conceptual Disruption of Nature by Socially Disruptive Technologies: Susanna Lindberg (Leiden University)
- Track 8 – The Technical Mimesis of Nature: Hub Zwart (Erasmus University Rotterdam)
- Track 9 – The role of Socially Disruptive Technologies in Climate change and Climate recovery: Martin Drenthen (Radboud University)
- Track 10 – Criteria for Conceptual Engineering: Manuel Gustavo Isaac (University of Zurich), David Ludwig (Wageningen University), Paul-Mikhail Podosky (Macquarie University), Catarina Dutilh Novaes (VU Amsterdam)
- Track 11 – Moral Change and Technology: Chiara Lisciandra (University of Munich)
TRACK KEYNOTE SPEAKERS OUTLINE
Track 1: John Danaher (NUI Galway)
Track 2: Lena Kästner (University of Bayreuth)
Complex artificial intelligence (AI) systems are often considered black boxes. At the same time, they are becoming increasingly prevalent in our modern lives. As a result, there is a growing demand to turn these black boxes into glass boxes, viz. to make AI systems explainable and their behaviour intelligible. But how should this be achieved? Recent debates about explainable AI (XAI) focus on deploying specific algorithms and tools in specific contexts, uncovering explanatorily relevant information in a single shot. However, from the life sciences we know that uncovering how a system behaves as a functional whole is rarely achieved in a single shot. Rather, it takes a lengthy investigative process relying on a combination of different strategies and manipulations. Employing these strategies in XAI holds the potential to significantly enrich the field.
BIO
Lena Kästner is Professor for Philosophy, Computer Science and AI at the University of Bayreuth. Her research focuses on explaining the behavior of natural and artificial intelligent systems. Prof. Kästner has a background in cognitive science and neuroscience; she received her PhD in philosophy from Ruhr-University Bochum in 2014 and held positions at Humboldt-Universität zu Berlin, Saarland University and Tilburg University before moving to Bayreuth. She is currently head PI of the project Explainable Intelligent Systems (EIS; www.eis.science), funded by the Volkswagen Foundation.
Track 3: Jeroen van den Hoven (TU Delft), Eleonora Viganò (University of Zurich)
In this talk, I will deal with an ethical issue concerning the future selves and good lives of people using digital well-being technologies, which aim to improve mental or physical well-being.
I will show that, as the recommendation algorithms of such technologies employ the user’s past data and/or data from subjects similar to the user, they tend to create a homogeneous digital environment that limits the variety and diversity of choice options. In the long run, this reduces the user’s future self’s ability to experiment with life opportunities and versions of themselves. The result is a limitation of the good life of the user’s future self in terms of life experience, authenticity, and self-knowledge.
BIO
Eleonora Viganò is a Postdoctoral Researcher at the Institute of Biomedical Ethics and History of Medicine at the University of Zurich. She was recently the Executive Manager of the ELSI Task Force for the Swiss National Research Programme 75 “Big Data” and developed the moral aspects of the Swiss National Research Programme 77 project “Socially acceptable AI and fairness trade-offs in predictive analytics”.
Her main research areas are Neuroscience of Ethics and Digital Ethics. She is currently working on the ethical issues of digital technologies aiming to improve people’s well-being.
Her latest publication is the book “Moral Choices for Our Future Selves”, in the series Routledge Focus on Philosophy.
Track 5: Shen-yi Liao (University of Puget Sound)
It is well-known that racism is encoded into the social practices and institutions of medicine. Less well-known is that racism is encoded into the material artifacts of medicine. We argue that many medical devices are not merely biased, but materialize oppression. An oppressive device exhibits a harmful bias that reflects and perpetuates unjust power relations. Using pulse oximeters and spirometers as case studies, we show how medical devices can materialize oppression along various axes of social difference, including race, gender, class, and ability. Our account uses political philosophy and cognitive science to give a theoretical basis for understanding materialized oppression, explaining how artifacts encode and carry oppressive ideas from the past to the present and future.
Oppressive medical devices present a moral aggregation problem. To remedy this problem, we suggest redundantly layered solutions that are coordinated to disrupt reciprocal causal connections between the attitudes, practices, and artifacts of oppressive systems.
[co-authored with Vanessa Carbonell]
BIO
Shen-yi Liao is an associate professor in the philosophy department at the University of Puget Sound, and is also affiliated with the Asian studies, bioethics, gender & queer studies, and neuroscience interdisciplinary programs. His current book project is about objects and spaces where cognition meets oppression. He also does research on imagination, experimental philosophy, aesthetics, and language.
Track 6: C. Thi Nguyen (University of Utah), Regina Rini (York University)
Track 7: Susanna Lindberg (Leiden University)
After earning a PhD at the University of Strasbourg, she has worked as a researcher at the University of Helsinki and at the Université Paris Ouest Nanterre, as a lecturer and professor at the University of Tampere, and as a core fellow at the Collegium for Advanced Studies of the University of Helsinki.
Her publications include From Technological Humanity to Bio-Technical Existence (forthcoming with SUNY Press, 2023), Techniques en philosophie (Hermann, 2020), Le monde défait. L’être au monde aujourd’hui (Hermann, 2016), Heidegger contre Hegel: Les irréconciliables, and Entre Heidegger et Hegel: L’éclosion et vie de l’être (L’Harmattan, 2010). She has also edited several collected volumes, notably The Ethos of Digital Environments: Technology, Literary Theory and Philosophy (with Hanna Roine, Routledge, 2021), The End of the World (with Marcia Sá Cavalcante Schuback, Rowman and Littlefield, 2017) and Europe Beyond Universalism and Particularism (with Sergei Prozorov and Mika Ojakangas, Palgrave Macmillan, 2014). In addition, she has published a number of academic articles.
Track 8: Hub Zwart (Erasmus University Rotterdam)
Track 9: Martin Drenthen (Radboud University)
Rewilding is an increasingly popular strategy in nature conservation and ecological restoration. It is sometimes defined as “a progressive approach to conservation” that does not so much aim to restore ecosystems to their previous state but instead focuses on “restoring natural processes to shape land and sea”. Yet, despite its popularity, rewilding is also highly controversial, especially when applied in culturally saturated landscapes. In this paper I examine what is morally at stake in debates between proponents of rewilding and those who see rewilding as a threat to traditional cultural landscapes worthy of protection. I will argue that rewilding should be understood not only as a conservation practice, but also as a disruptive eco-technology. Rewilding implies a radical, non-anthropocentric normative reinterpretation of landscape and human history that calls for a critical re-examination of the cultural identities that are based on that history.
BIO
Martin Drenthen is Associate Professor of Environmental Philosophy at the Institute for Science in Society (ISiS) at Radboud University in Nijmegen (Netherlands).
His research topics include environmental hermeneutics, ethics of place, philosophy of landscape, the ethics of environmental restoration and rewilding, and the ethics of wolf resurgence. He was project leader of the research project ‘Reading the Landscape’, which focused on the relation between notions of moral identity and interpretations of landscape in ethical issues regarding rewilding and landscape conservation. Currently, his research focuses on ethical issues related to cohabitation with resurging, unruly wildlife.
He has published extensively on environmental philosophy in both Dutch and English. He co-edited New Visions of Nature: Complexity and Authenticity (Springer, 2009), Interpreting Nature: The Emerging Field of Environmental Hermeneutics (Fordham University Press, 2013), Environmental Aesthetics: Crossing Divides and Breaking Ground (Fordham University Press, 2014), and Old World and New World Perspectives in Environmental Philosophy: Transatlantic Conversations (Springer, 2014). He is the author of Grenzen aan wildheid [2003, in Dutch], on the meaning of Nietzsche’s critique of morality for environmental ethics. His 2018 book Natuur in mensenland [in Dutch] explores the moral significance of rewilding in old cultural landscapes. His most recent book, Hek [2020, in Dutch], examines the ethics of the border between agricultural land and nature areas.
Track 10: Manuel Gustavo Isaac (University of Zurich), David Ludwig (Wageningen University), Paul-Mikhail Podosky (Macquarie University), Catarina Dutilh Novaes (VU Amsterdam)
Manuel Gustavo Isaac was a Swiss NSF Research Fellow at the University of Zurich. As principal investigator, he has conducted four postdoctoral research projects on conceptual engineering, funded by prestigious fellowships and grants in Amsterdam (ILLC), Barcelona (LOGOS), St Andrews (ARCHÉ), and Zurich. His work has been published in Philosophy Compass, Erkenntnis, Ratio, Philosophia, Inquiry, Synthese, and History and Philosophy of Logic, among other venues. In spring 2020, he launched the Conceptual Engineering Online Seminar series, which ran for five seasons. He is the creator of the Conceptual Engineering YouTube channel and the initiator of the Conceptual Engineering Network.
____
Gender is not a peppered moth: What attitude should we take towards conceptual engineering?
A split has emerged between cautious pessimists and cautious optimists about conceptual engineering. Whereas the former worry that concepts are hard to engineer because they are, in a very strong sense, “out of our control,” the latter accept that control is possible, though still very hard. However, this does not exhaust the space of possible attitudes. Personally, I have no idea what to feel. In this talk, I show that when we attend closely to the conditions of our conceptual environment, we will see that the burden of implementation cannot be placed on the shoulders of individuals. But nor can we rely on existing social infrastructure to get our concepts to stick. In the end, it is hard not to be a cautious pessimist: conceptual engineering just seems infeasible. This offers grounds for thinking that, when it comes to the normativity of conceptual engineering, our attention is better focused on cultivating the right conditions for conceptual uptake rather than taking our chances with a world designed to keep things conceptually as they are.
BIO
Paul-Mikhail Catapang Podosky (he/they) is a Filipinx philosopher, passionate about all things relating to human drama. Paul’s research investigates the limits of conceptual engineering as a tool for promoting social justice, and in the critical philosophy of race and gender, he explores the politics of classification, with a specific focus on mixed-race identity. Previously, he was Global Perspectives on Society Fellow at New York University, and presently he is Lecturer in Philosophy at Macquarie University.
Track 11: Chiara Lisciandra (University of Munich)
The Science and Ethics of Academic Search Engines
Search engines for scientific literature, such as the Web of Science, Scopus, and Google Scholar, draw on citation ranking algorithms. These technologies have acquired an increasingly important role in scientific research. By complementing traditional academic libraries, they have fundamentally changed the way in which scientists search for and select relevant literature. This talk discusses some of the epistemic and ethical consequences of this shift, both for the development of science and for the distribution of scientists’ recognition and rewards. Building on this, the final part of the talk discusses some normative issues that search engines bring into play, concerning both objectivity in literature search and fairness in the criteria used to assess scientists.
_
Chiara Lisciandra is a Humboldt Experienced Researcher at the Munich Center for Mathematical Philosophy, University of Munich. She received her Ph.D. in Logic and Philosophy of Science from Tilburg University, The Netherlands. Before Munich, she held positions in Germany, Finland, and the Netherlands. For more information, see Chiara’s website.
Track 13: PANEL (Diana Martin, Jessica Morley, Marc Steen, Mihalis Kritikos, Joyca Lacroix)
___
Marc Steen works as a senior research scientist at TNO, a Dutch research and technology organization. He earned MSc, PDEng and PhD degrees in Industrial Design Engineering at Delft University of Technology. He is an expert in Human-Centred Design, Value-Sensitive Design, and Responsible Innovation. His mission is to support organizations in using technologies in ways that help to create a just society and promote people’s flourishing. He asks uneasy questions about technologies, especially about the ethics involved in the design and application of algorithms and artificial intelligence. His first book, ‘Ethics for People Who Work in Tech’, will be published by Routledge/CRC Press in October 2022.
MORE INFORMATION ABOUT THE TRACKS
This conference comprises 13 tracks.
Track 1: Ethics of human-like robots (rights, moral considerations)
Track chairs: Sven Nyholm (s.r.nyholm@uu.nl), and Cindy Friedman (c.friedman@uu.nl)
The humanoid robots track aims to explore how humanoid robots may challenge, change, or disrupt concepts and aspects we typically associate with what it means to be human. Robots that look and behave like human beings are already being designed and created for a variety of purposes. As these robots become increasingly human-like, their “humanlikeness” may (re)shape our ideas of our own humanity. The questions this track will explore include: Will humanoid robots impact who we regard as agents and patients in the moral domain? Will this affect the way in which human beings interact with their social and material environment? Is the creation of humanoid robots a good idea, or is it an inherently bad one?
Track 2: Fundamental issues in AI (agency of AI, human-AI hybrids, intellectual history and evolution of the concept of AI, etc.)
Track chairs: Kristy Claassen (k.claassen@utwente.nl), and Sven Nyholm (s.r.nyholm@uu.nl)
The purpose of this track is to reflect on how we should understand the idea of artificial intelligence. This concept keeps evolving. There are narrow and broad conceptions of what artificial intelligence is, and realistic and science fiction-inspired conceptions of what it can become. The term “artificial intelligence” was coined in a research proposal in 1955, but the idea of creating technologies that are intelligent or that can imitate intelligence goes back much further. It has also long been the stuff of fiction, such as in Mary Shelley’s Frankenstein from 1818. Track presenters are encouraged to reflect critically on what we should understand by “artificial intelligence”, with an eye to the past and the present, as well as to what the future of AI should be.
Track 3: Technology and changing self-understanding
Track chair: Matthew Dennis (m.j.dennis@tue.nl)
The ESDiT self-understanding track will explore the capacity of socially disruptive technologies (SDTs) to improve how we understand the human condition. Recently, a host of technologies have been developed that claim to increase self-understanding in key ways. These technologies include those that provide insights into how our bodies function (fitness trackers), those that shed light on our psychological states (therapy chatbots, self-care apps), and technologies that show how our individual behaviours can be reliably predicted with the help of vast data sets (artificial intelligence, machine learning). This track aims to examine the ethical issues of using technologies to increase self-understanding, and will ask whether this results in living a more satisfying life.
Track 4: Democracy and technology
Track chairs: Lucie White (l.a.white@uu.nl) and Arthur Gwagwa (e.a.gwagwa@uu.nl)
How have advancements in technology shaped democratic systems at the civil, corporate, national, and supranational levels? Is there a sense in which democratic procedures relating to free and fair elections could be enhanced by the deployment of certain technologies, such as blockchain, through their security features? Do technology companies wield a level of power that is at odds with democracy, properly conceived? What impact is the data economy having on citizens’ lives? This track seeks to explore the various ways in which our democratic culture is being influenced by technological developments, what, if anything, is wrong with this, and what could be done to mitigate harm. Further, as a result of fundamental technological disruption in democratic societies, should we revise our philosophical models of democracy?
Track 5: Social Justice and technology
Track chair: Patrik Hummel (p.a.hummel@tue.nl)
Novel, state-of-the-art technologies are often framed as interacting with justice: they can cause, perpetuate, or amplify injustice within and across various domains. Under the right conditions, they could also promote rather than undercut justice, for example by correcting for human cognitive biases. Throughout, the impact of technology could shift the ways in which we think about justice in the first place. There are still gaps in our understanding of the interaction between technologies and justice, its conceptual foundations, and applied strategies for shaping relevant processes in society, technology design, and governance. The “Social Justice and Technology” track will investigate these questions and related issues.
Track 6: Control and technology
Track chair: Emily Sullivan (e.e.sullivan@tue.nl)
How does technology shape control? Perhaps technology leads us to reimagine the very concept of control. Or maybe technology expands what we have control over. Perhaps some technologies take control away from individuals or society. When is a change in control an improvement? When is it problematic? How might non-Western conceptions of control impact technology development or help diagnose problematic cases of control? This track addresses these issues and more, considering the concept of control across a wide variety of technologies and application domains, ranging, for example, from control in relation to environmental technologies to the control of information on social media.
Track 7: The conceptual disruption of nature by Socially Disruptive Technologies (SDTs)
Track chairs: Jochem Zwier (jochem.zwier@wur.nl), and Vincent Blok (vincent.blok@wur.nl)
In this track, we critically reflect on the question of how Socially Disruptive Technologies (SDTs) disrupt our conceptualizations of, and basic assumptions about, nature. Current developments in synthetic biology and the increasing interest in geoengineering technologies, for instance, challenge conceptual dichotomies between nature and technology, natural and artificial, organism and artefact, the natural and the human realm. It is, however, unclear what such a disruption means, what precisely is disrupted, and what triggers the disruption. We invite papers that reflect on these questions in the context of SDTs that are assumed to disrupt our concepts of nature.
Track 8: The technical mimesis of nature (e.g. biomimicry, synthetic biology, Digital Twins)
Track chairs: Paulan Korenhof (paulan.korenhof@wur.nl), and Vincent Blok (vincent.blok@wur.nl)
With nature a hot topic on the contemporary political and existential agenda, it is playing an increasingly prominent role in emergent technology design. Many of these emergent technologies imitate nature in one way or another in order to deal with contemporary sustainability challenges. Prominent examples of emergent nature-mimicking technologies are biomimetic technologies, synthetic biology, and Digital Twins. As the artificial intends to imitate the natural, several questions arise: What is the relation between the two? What assumptions about nature and technology underpin this relation? And what are the transformative consequences for technology, nature, and the human understanding of both? In this track we will critically explore the relation between nature and technology in technologies that imitate nature.
Track 9: The role of Socially Disruptive Technologies (SDTs) in Climate change and Climate recovery
Track chairs: Dominic Lenzi (d.s.lenzi@utwente.nl), and Vincent Blok (vincent.blok@wur.nl)
Socially Disruptive Technologies (SDTs), including energy and climate engineering technologies, are often put forward as enablers of the transition toward a more sustainable society. However, the implementation of these technologies can have destructive side effects elsewhere on the planet or impose additional injustice. Similarly, though less discussed, SDTs like AI, robotics and digital twins raise concerns about environmentally destructive side effects. What explains the generally destructive side effects of technological progress? How have sustainable SDTs been conceptualized as solutions, and how might they actually contribute to climate recovery?
Track 10: Criteria for Conceptual Engineering
Track chairs: Guido Löhr ( g.lohr@tue.nl), and Michael Klenk (M.B.O.T.Klenk@tudelft.nl)
Several authors in psychology and philosophy have recently asked the following question: When is it permissible to disrupt a conceptual status quo, either by linguistic interventions (intentionally changing the meaning or use of words) or by introducing a new technology? The aim of this track is to find answers to this question.
Track 11: Moral Change and Technology
Track chair: Elizabeth O’Neill (e.r.h.oneill@tue.nl)
What is the relationship between technologies and moral change? By what mechanisms can technologies alter moral norms, values, concepts, practices, roles, relationships, emotions, and other elements of moral life? When do technologies hinder moral change, and when do they facilitate it?
Moral change can occur at multiple levels, ranging from the level of psychological faculties to the level of societies. Philosophers have recently paid special attention to the phenomenon of moral revolutions as a distinctive form of society-level moral change. Other important forms of society-level moral change include moral reform and moral drift (Baker, R. 2019. The Structure of Moral Revolutions. MIT Press). What roles can technologies play in these or other types of society-level moral change? How can technologies hinder or facilitate moral change at the level of the individual, and what roles can technologies play in moral learning and development over the course of a lifetime?
Any moral change will be the result of a complex process involving many types of causes. Are there recurring patterns of interaction between technology and other causal factors, such as human agency and economic, political, or social phenomena? For this track we welcome submissions on the relationship between technologies and moral change, including historical or contemporary case studies.
Track 12: New approaches in Ethics of Technology
Track chairs: Björn Lundgren (b.a.lundgren@uu.nl), and Philip Brey (p.a.e.brey@utwente.nl)
The aim of this track is to present and discuss research on methods of ethics of technology. Specifically, we aim to explore new methods, or approaches, in ethics of technology that can shed light on both old and new ethical problems of technology. We are also interested in meta-methodological discussions about justification or limitations of various methods in ethics of technology.
Track 13: At the Intersection of ethics and STEM
Track chairs: Wijnand IJsselsteijn (w.a.ijsselsteijn@tue.nl), and Matthew Dennis (m.j.dennis@tue.nl)
The ESDiT ‘At the Intersection of Ethics and STEM’ track will investigate the connection between ethics and STEM perspectives. Since this is a new ESDiT track, we will begin the session by introducing our research aims and methods, as well as hosting an open discussion on future STEM track activities. We welcome contributions exploring how links between philosophy and the empirical disciplines can be strengthened, especially where these synergies inform the real-world design, deployment, and regulation of socially disruptive technologies. Potential topics include interdisciplinarity, x-phi and empirical philosophy, ethnographic research, ethics/green-washing, knowledge utilisation, and public-private partnerships.
REGISTRATION
CONFERENCE FULLY BOOKED
Registration for the ESDiT2022 conference was open until 5 September 2022. The standard conference fee was €125; the fee for postdoctoral students was €75, and participation was free of charge for members of the ESDiT consortium.
IMPORTANT DATES
- Call for abstracts
- Deadline for abstract submission: 22 May 2022
- Decision about acceptance: 15 June 2022
- Confirmation of participation for chosen presenters: 1 July 2022
- Conference dates: 6 & 7 October 2022
VENUE
CORPUS Congress Centre
Willem Einthovenstraat 1
2342 BH Oegstgeest (near Leiden), the Netherlands
ACCOMMODATION
- Hilton Garden Inn Leiden
- Golden Tulip & Tulip Inn
- Ibis Leiden Centre
- Fletcher Hotels
- Hotel Van der Valk Leiden
- Hotel Van der Valk Sassenheim
CONTACT INFORMATION
ESDiT Conference Secretariat:
esdit2022-bms@utwente.nl
PROGRAM COMMITTEE
- Philip Brey (University of Twente), chair
- Vincent Blok (Wageningen University & Research)
- Michael Klenk (Delft University of Technology)
- Sven Nyholm (Utrecht University)
- Elena Ziliotti (Delft University of Technology)
ORGANIZING COMMITTEE
- Patrik Hummel (Eindhoven University of Technology), track coordinator
- Melanie Braamhaar (ESDiT programme manager)
- Seeta Autar (ESDiT secretariat)
- Ben Hofbauer (Delft University of Technology)
- Julia Rijssenbeek (Wageningen University & Research)
ACKNOWLEDGEMENT
This conference is funded by the Dutch Research Council (NWO), the participating universities (University of Twente, Delft University of Technology, Eindhoven University of Technology, Wageningen University & Research, Utrecht University, UMC Utrecht, Leiden University) in the ESDiT programme and the Faculty of Behavioural, Management and Social Sciences (BMS) of University of Twente.