A longer description of this vacancy
PhD project: Improving ethical analysis through better conceptualization of the moral role of technological products and systems (4 years)
Intended starting date: September 2023
Daily supervisor and intended promotor:
- Dr. Adam Henschke, UT, daily supervisor
- Prof. dr. Philip Brey, UT, promotor
Involved ESDiT-partner(s) from other universities (if any):
The PhD candidate will be based at UT and embedded in the Foundations & Synthesis (F&S) research line of the ESDiT programme. The topic allows many links to prior and ongoing research within ESDiT. Informal supervision by members of the F&S line will be explored.
Research proposal
If we assume that technology is not neutral but affects moral decision-making and moral outcomes, how can we define its role? Are technological products moral agents, as some have claimed, or co-agents? Or if not, how should we understand their moral role in relation to their human users? What is the moral role of guns in causing harm, for example? While humans ultimately pull the trigger, guns have affordances that make killing much easier, with less risk for the shooter. Or what is the role of self-driving cars in causing harm? Here, a human operator is not even directly involved. Should we therefore assign moral responsibility to self-driving cars? In spite of significant efforts in the philosophy of technology and science and technology studies to analyze and understand the interaction between technology and society, we still have few approaches in ethics that are able to provide answers to these questions. There are many approaches for analyzing interactions between technology, users and society, but few that focus on implications for ethical analysis.
This project investigates how we can conceptualize the role of technological products in moral and immoral acts and events, and how this conceptualization can result in new and better approaches to ethical analysis. The focus is not just on single technological products with single users, but also on larger systems with multiple technological components and human operators. These include both sociotechnical systems (predominantly technical systems with human operators and enablers, such as electricity networks, industrial production systems and the internet) and technosocial systems (predominantly social systems that rely on technology, such as modern organizations and associations).
The candidate will investigate the different causal and moral roles that technology can take in morally consequential actions and events involving technologies with single users, autonomous technologies, and sociotechnical and technosocial systems. He or she will do so, initially, by studying and critiquing approaches in ethics (especially ethics of technology) that conceptualize the role of technology, and by making explicit what roles are assigned to technology. Also considered will be descriptive approaches to technology, users and society in STS, technology assessment and impact assessment, and their possible translation to ethics. This will be followed by the development of a new approach to ethical analysis, which includes a vocabulary and methodology for understanding and assessing the various roles of various types of (socio)technological products and systems in moral actions and events. The approach will be mostly theoretical but will also involve a number of (smaller) case studies.
There will be a particular focus on socially disruptive technologies: technologies that transform everyday life, social institutions, cultural practices, and potentially even fundamental beliefs and values.
There are different ways to do this project, and the PhD candidate will get room to do it in his or her own way. Choices can be made in the types of existing theories that will be studied, the extent to which a comparative analysis and assessment of current approaches in ethics of technology is carried out, the extent to which work on technology and society from philosophy of technology, STS and the social sciences will be involved, and the selection of case studies in the project. What is important, though, is that an innovative answer is provided to the research question, one that will help other researchers in the field of ethics of technology to improve their approaches and give a better and more nuanced place to technology in ethics. Ideally, a vocabulary is formulated that allows one to classify types of technological products and applications of technology according to the ethical role they play, e.g., neutral instrument in human action, instrument that modifies human action, stand-alone actor or co-actor that is able to act (and, in the case of AI, decide) autonomously, part of a hybrid human-technology ensemble, sociotechnical system or technosocial system, etc. This vocabulary will allow for better assignments of moral and causal responsibility for (moral) outcomes, for better, more ethical choices in whether and how to use technology, and for better ethical analysis, assessment and guidance of technology and technological practices. The case studies should ideally exemplify different technologies according to this taxonomy and show that the vocabulary can be made to work in practice.
Relation to ESDiT research lines and sublines (should specify how the proposal strengthens the research line as a whole):
The main research line of this project is in the F&S line, which it strengthens by continuing the sub-line on new approaches to ethics of technology, while also building on and incorporating the sub-line on the nature of socially disruptive technologies. It aims to connect to other lines by engaging in a dialogue on methods and approaches for doing ethics of technology.
The project centrally focuses on the “new approaches” research priority. It also contributes to the research focus on multidisciplinarity and transdisciplinarity by relating the philosophy and ethics of technology to the social sciences, especially STS, technology assessment and impact assessment. It could also contribute to the conceptual disruption line through its study of the relation between the human/social and the technological, and ways in which this relation can become blurred, as well as new conceptions of agency and responsibility.
Key ESDiT concepts to be investigated: human being / social / technological / agency / responsibility
Key ESDiT technology to be investigated: to be determined after project starts
Keywords
Artifact ethics; moral agency; social agency; moral factors; delegated morality; delegated responsibility; technological affordances; embedded values; distributed morality; sociotechnical systems; sociotechnical networks; extended agency; technological bias; morality in design; politics of technology; moral responsibility; causal responsibility; technology governance; structural ethics; institutional ethics
Selected literature
Adam, A. (2005). Delegating and Distributing Morality: Can We Inscribe Privacy Protection in a Machine? Ethics and Information Technology, 7, 233–242. https://doi.org/10.1007/s10676-006-0013-3
Adam, A. (2008). Ethics for things. Ethics and Information Technology, 10, 149–154. https://doi.org/10.1007/s10676-008-9169-3
Arzroomchilar, E. (2022). “Structural Ethics” as a Framework to Study the Moral Role of Non-Humans. Techné: Research in Philosophy and Technology, 26(2), 285–299.
Bazargan-Forward, S., & Tollefsen, D. P. (Eds.) (2021). The Routledge Handbook of Collective Responsibility. Routledge.
Bengtsson, S. (2018). Ethics Exists in Communication: Human-machine ethics beyond the Actor-Network (pp. 1–25). London School of Economics and Political Science. http://urn.kb.se/resolve?urn=urn:nbn:se:sh:diva-37239
Brey, P. (1999). Worker Autonomy and the Drama of Digital Networks in Organizations. Journal of Business Ethics, 22, 15–25.
Brey, P. (2005). Artifacts as social agents. In H. Harbers (Ed.), Inside the Politics of Technology: Agency and Normativity in the Co-Production of Technology and Society (pp. 61–84). Amsterdam: Amsterdam University Press.
Brey, P. (2010). Values in Technology and Disclosive Computer Ethics. In L. Floridi (Ed.), The Cambridge Handbook of Information and Computer Ethics (pp. 41–58). Cambridge: Cambridge University Press.
Brey, P. (2014). From Moral Agents to Moral Factors: The Structural Ethics Approach. In P. Kroes & P.-P. Verbeek (Eds.), The Moral Status of Technical Artifacts (pp. 125–142). Dordrecht: Springer.
Ceicyte, J., & Petraite, M. (2018). Networked Responsibility Approach for Responsible Innovation: Perspective of the Firm. Sustainability, 10, 1720. https://doi.org/10.3390/su10061720
Chopra, A., & Singh, M. (2018). Sociotechnical Systems and Ethics in the Large. In AIES ’18: Proceedings of the 2018 AAAI/ACM Conference on AI, Ethics, and Society (pp. 48–53). https://doi.org/10.1145/3278721.3278740
Devon, R., & Van de Poel, I. (2004). Design ethics: The social ethics paradigm. International Journal of Engineering Education, 20.
Dewsbury, G., & Dobson, J. (2007). Responsibility and Dependable Systems. Springer. https://doi.org/10.1007/978-1-84628-626-1
Ferrero, L. (Ed.). (2022). The Routledge Handbook of Philosophy of Agency (1st ed.). Routledge. https://doi.org/10.4324/9780429202131
Flanagan, M., Howe, D., & Nissenbaum, H. (2008). Embodying Values in Technology: Theory and Practice. In J. Van den Hoven & J. Weckert (Eds.), Information Technology and Moral Philosophy (Cambridge Studies in Philosophy and Public Policy, pp. 322-353). Cambridge: Cambridge University Press. doi:10.1017/CBO9780511498725.017
Floridi, L. (2013). Distributed morality in an information society. Science and Engineering Ethics, 19(3), 727–743.
Floridi, L. (2016). Faultless responsibility: On the nature and allocation of moral responsibility for distributed moral actions. Philosophical Transactions of the Royal Society A, 374, 20160112. https://doi.org/10.1098/rsta.2016.0112
Floridi, L., & Sanders, J. W. (2004). On the Morality of Artificial Agents. Minds and Machines, 14(3), 349–379.
Green, B. (2021). The Contestation of Tech Ethics: A Sociotechnical Approach to Technology Ethics in Practice. Journal of Social Computing, 2(3), 209–225. https://doi.org/10.23919/JSC.2021.0018
Hanson, F. (2009). Beyond the skin bag: On the moral responsibility of extended agencies. Ethics and Information Technology, 11(1), 91–99.
Heersmink, R. (2017). Distributed Cognition and Distributed Morality: Agency, Artifacts and Systems. Science and Engineering Ethics, 23, 431–448. https://doi.org/10.1007/s11948-016-9802-1
Himma, K. (2009). Artificial agency, consciousness, and the criteria for moral agency: What properties must an artificial agent have to be a moral agent? Ethics and Information Technology, 11(1), 19–29.
Johnson, D. (2006). Computer systems: Moral entities but not moral agents. Ethics and Information Technology, 8(4), 195–204.
Johnson, D. (2015). Technology with No Human Responsibility? Journal of Business Ethics, 127, 707–715. https://doi.org/10.1007/s10551-014-2180-1
Jonas, H. (2014). Technology and Responsibility: Reflections on the New Tasks of Ethics. In R. L. Sandler (Ed.), Ethics and Emerging Technologies. London: Palgrave Macmillan. https://doi.org/10.1057/9781137349088_3
Klenk, M. (2021). How Do Technological Artefacts Embody Moral Values? Philosophy & Technology, 34, 525–544. https://doi.org/10.1007/s13347-020-00401-y
Kroes, P., & Verbeek, P.-P. (Eds.) (2014). The Moral Status of Technical Artifacts. Philosophy of Engineering and Technology, Vol. 17. Dordrecht: Springer. https://doi.org/10.1007/978-94-007-7914-3
Lagnado, D. A., Gerstenberg, T., & Zultan, R. (2013). Causal Responsibility and Counterfactuals. Cognitive Science, 37, 1036–1073. https://doi.org/10.1111/cogs.12054
Magnani, L., & Bardone, E. (2008). Distributed morality: Externalizing ethical knowledge in technological artifacts. Foundations of Science, 13(1), 99–108.
Malafouris, L. (2008). At the potter’s wheel: An argument for material agency. In C. Knappett & L. Malafouris (Eds.), Material agency: Towards a non-anthropocentric approach (pp. 19–36). New York: Springer.
Millar, J. (2015). Technology as Moral Proxy: Autonomy and Paternalism by Design. IEEE Technology and Society Magazine, 34(2), 47–55. https://doi.org/10.1109/MTS.2015.2425612
Nelkin, D. K., & Pereboom, D. (Eds.) (2022). The Oxford Handbook of Moral Responsibility. New York: Oxford University Press.
Nielsen, R. P., & Massa, F. G. (2013). Reintegrating Ethics and Institutional Theories. Journal of Business Ethics, 115, 135–147. https://doi.org/10.1007/s10551-012-1384-5
Peterson, M., & Spahn, A. (2011). Can technological artefacts be moral agents? Science and Engineering Ethics, 17(3), 411–424.
Ribes, D., Jackson, S., Geiger, S., Burton, M., & Finholt, T. (2013). Artifacts that organize: Delegation in the distributed organization. Information and Organization, 23(1), 1–14. https://doi.org/10.1016/j.infoandorg.2012.08.001
Sandvig, C., Hamilton, K., Karahalios, K., & Langbort, C. (2016). When the Algorithm Itself Is a Racist: Diagnosing Ethical Harm in the Basic Components of Software. International Journal of Communication, 10, 4972–4990.
Schulzke, M. (2013). Autonomous weapons and distributed responsibility. Philosophy and Technology, 26(2), 203–219.
Shah, E., & Boelens, R. (2021). The moralization of hydraulics: Reflections on the normative-political dimensions of water control technology. Geoforum, 121, 93–104. https://doi.org/10.1016/j.geoforum.2021.02.009
Sober, E. (1988). Apportioning Causal Responsibility. The Journal of Philosophy, 85(6), 303–318. https://doi.org/10.2307/2026721
Van de Poel, I. (2020). Embedding Values in Artificial Intelligence (AI) Systems. Minds & Machines 30, 385–409. https://doi.org/10.1007/s11023-020-09537-4
Van de Poel, I., Royakkers, L., & Zwart, S. D. (2015). Moral Responsibility and the Problem of Many Hands. Routledge.
Verbeek, P.-P. (2008). Morality in Design: Design Ethics and the Morality of Technological Artifacts. In P. Vermaas, P. Kroes, A. Light, & S. Moore (Eds.), Philosophy and Design. Dordrecht: Springer. https://doi.org/10.1007/978-1-4020-6591-0_7
Verbeek, P.-P. (2011). Moralizing Technology: Understanding and Designing the Morality of Things. Chicago and London: The University of Chicago Press.
Waelbers, K. (2009). Technological Delegation: Responsibility for the Unintended. Science and Engineering Ethics, 15, 51–68. https://doi.org/10.1007/s11948-008-9098-x
Waelbers, K., & Dorstewitz, P. (2013). Ethics in Actor Networks, or: What Latour Could Learn from Darwin and Dewey. Science and Engineering Ethics, 20. https://doi.org/10.1007/s11948-012-9408-1
Wallach, W., & Vallor, S. (2020). Moral Machines: From Value Alignment to Embodied Virtue. In S. M. Liao (Ed.), Ethics of Artificial Intelligence. New York: Oxford University Press. https://doi.org/10.1093/oso/9780190905033.003.0014
Winner, L. (1980). Do artifacts have politics? Daedalus, 109(1), 121–136.
Woodgate, J., & Ajmeri, N. (2022). Principles for Macro Ethics of Sociotechnical Systems: Taxonomy and Future Directions. arXiv:2208.12616 [cs.CY]. https://doi.org/10.48550/arXiv.2208.12616