This symposium is based on an online international and interdisciplinary conference hosted by the Centre for Ethics, University of Toronto, on June 20, 2022.
In the past few years, numerous policy documents have been crafted to ensure AIs are developed, used, and governed for the sake of the public. Many of these documents outline how we should establish trust in AI, offering ethical principles and guidelines.
The field of AI ethics has pointed out both the positive aspects and the limitations of these efforts. We have learned that AI-based technologies, commonly used by for-profit companies and oppressive law enforcement, often serve the powerful, deepen inequality, and exclude those who are affected from shaping them. At the same time, we see how research can inform activism and result in meaningful change.
This workshop aims to address some of the insights that we have gained about the ethics of AI and the concept of trust. We critically explore practical and theoretical issues relating to values and frameworks, engaging with carebots, evaluations of decision support systems, and norms in the private sector. We assess the objects of trust in a democratic setting and discuss how scholars can further shift insights from academia to other sectors.
- Hellos and Opening Remarks
- Judith Simon (University of Hamburg), Can and Should We Trust AI?
- video ➡︎ 0:02:38
- Vivek Nallur (University College Dublin), Trusting a Carebot: Towards a Framework for Asking the Right Questions
- video ➡︎ 0:29:44
- Justin B. Biddle (Georgia Institute of Technology), Organizational Perspectives on Trust and Values in AI
- video ➡︎ 0:59:10
- Sina Fazelpour (Northeastern University), Where Are the Missing Humans? Evaluating AI Decision Support Systems in Context
- video ➡︎ 1:38:28
- Esther Keymolen (Tilburg University), Trustworthy Tech Companies: Talking the Talk or Walking the Walk?
- video ➡︎ 2:04:13
- Ori Freiman (University of Toronto), Making Sense of the Conceptual Nonsense “Trustworthy AI”: What’s Next?
- video ➡︎ 2:49:11