Description of This Research Line
The “well-being, health, and emotions” research line [WHE] examines 1) how socially disruptive technologies affect people’s actual health, emotions, and well-being, 2) how the very concepts of health, emotions, and well-being are disrupted by newly emerging technologies, and 3) how experiences and conceptualizations of health and well-being are not homogeneous, calling for a diversity of perspectives (e.g. from critical disability studies, feminist philosophy of technology, and non-Western approaches).
For instance, some members of the WHE research line examine how Brain-Computer Interfaces (BCIs) used for communication by patients with locked-in syndrome disrupt and transform the emotions and well-being of BCI communicators. They also attend to the ways in which BCIs might disrupt existing ideas and concepts surrounding ‘healthy communication’ and ‘human well-being,’ and to how BCIs might both sustain and disrupt ableist assumptions about experiences and conceptualizations of communication and well-being.
Much of our line’s research centers on digital technologies, but we also study ‘health, emotions, and well-being’ in the application domains of biomedical technologies and climate change, which we view as a socially disruptive phenomenon induced by technologies and socio-technical systems.
Get Involved: We are always interested in new collaborations with researchers (and experts-by-experience) whose research and/or lived experience bears on the above set of questions! Researchers from the Global South are especially encouraged to reach out. We hold monthly research line meetings, some of which are reserved for external guest speakers.
Related Projects
Ethics of Data-Driven Mental Health Diagnostics [2021-2025]
PhD candidate: Anna van Oosterzee (a.m.vanoosterzee@uu.nl)
Daily supervisor: Dr. Sander Werkhoven (s.werkhoven@uu.nl)
Co-supervisor: Prof. Dr. Joel Anderson (j.h.anderson@uu.nl)
In collaboration with Leiden University, Leiden Institute of Advanced Computer Science
Dr. Anna Kononova (a.kononova@liacs.leidenuniv.nl) & Prof. Dr. Thomas Bäck (T.H.W.Baeck@liacs.leidenuniv.nl)
Project Description
As part of the ESDiT research line on “The Human Condition,” this project addresses a complex and multi-faceted range of issues raised by recent and expected developments in machine learning and predictive data-driven analytics. Through the development of ever more complex computational techniques for analysing large data sets, translational bioinformatics and other computational approaches increasingly enable medical researchers and clinicians to develop diagnostic approaches that are more fine-grained, personalized, and predictively accurate than those based on current categories of disease. Although “precision medicine” is relatively well established, parallel approaches in psychiatry are quite new. For the data scientists, research psychiatrists, and mental health professionals developing these approaches, these emerging technologies raise vexing ethical issues – issues that are also central to research in philosophy of psychiatry, disability studies, philosophy of science, data ethics, and philosophy of technology.
The aim of this PhD project is to investigate the following cluster of questions: What ethical concerns are raised by integrating data-driven analytics and translational bioinformatics into psychiatric diagnosis? What implications does the highly personalized character of these computational approaches have for reconceptualizing what is “normal” for human beings? And how should these concerns shape the regulation and ongoing design of these emerging technologies in this highly contested domain?
Empathy, Communication Technologies, and Neurodiversity [2020-2024]
PhD candidate: Caroline Bollen (c.j.m.bollen@tudelft.nl)
Daily supervisor: Dr. Janna van Grunsven (J.B.vanGrunsven@tudelft.nl)
Supervisor: Prof. Dr. Sabine Roeser (S.Roeser@tudelft.nl)
Project Description
Currently, there exists no robust account of empathy as technologically mediated. This is striking, since numerous empirical studies suggest that VR, social media platforms, and other new digital technologies can either undermine empathy or, on the contrary, extend its scope. One of our main research questions is: how does empathy, qua concept, need to be reconsidered in light of these new digital technologies? The answer to this question not only has the potential to change current philosophical debates on the nature and scope of empathy; it is also needed to confront a practical-evaluative lacuna. Empirical studies proclaim the empathy-promoting or empathy-distorting effects of various digital technologies, thereby seemingly validating or questioning their ethical and social (un)desirability. But since we lack a robust, up-to-date philosophical account of empathy as technologically mediated, these assessments rest on an unexamined notion of empathy. Furthermore, current understandings of the concept of empathy often exclude autistic empathic experiences. Recent knowledge from, and acknowledgement of, neurodivergent perspectives challenges the way empathy is conceptualized in research.
In this PhD project, a new concept of empathy is being developed that is inclusive of autistic empathic experiences and that can be used to reflect normatively on the impact of technology on the way we relate to one another. This is being done at different levels: empathy as a concept in moral theory, empathy as mediated by communication technologies, and the specific case study of empathy as mediated by Augmentative and Alternative Communication (AAC) technologies.
Related Publications

Attending to the Online Other: A Phenomenology of Attention on Social Media Platforms. Book chapter. In: Bas de Boer & Jochem Zwier (Eds.), Chapter 9, pp. 215-240, Open Book Publishers, 2024.

Reimagining Digital Well-Being. Technical report, 2024.

Ethics of early detection of disease risk factors: A scoping review. Journal article. In: BMC Medical Ethics, vol. 25, 2024.

E-coaching systems and social justice: ethical concerns about inequality, coercion, and stigmatization. Journal article. In: AI and Ethics, pp. 1-10, 2024.