AI Ethics

I have applied my expertise in ethics directly by working with consulting firms to educate their clients.
In summer 2021, I worked with Ethically Aligned AI, Inc., to develop an AI ethics micro-credential certificate offered by Athabasca University. As part of the micro-credential's materials, I developed a worksheet called the AI Ethics Canvas, an exercise that helps organizations and teams consider different social and ethical issues as they plan their AI projects.

In July 2023, I gave a talk on responsible AI development as part of an AI ethics webinar developed by Overlap Associates, a design consultancy that works mainly with public sector and non-profit clients.

Contact me if you're interested in working together.
Image by Yasmin Dwiputri & Data Hazards Project on Better Images of AI
In 2023, I visited Phillips Academy, a boarding school in Andover, Massachusetts, where I spoke with students and faculty about computer ethics, the ethics of gamification, and the ethics of AI. The school's student newspaper, The Phillipian, interviewed me and wrote an article about my visit. Read it here.
In 2022, I gave an interview to Dialexicon, a philosophy podcast produced by and for youth. The interview focuses on education, trust, and computer ethics. Listen to our conversation on their website here.
In summer 2021, I worked with Katrina Ingram (the CEO of Ethically Aligned AI, Inc.), and Digital 55 (a Toronto-based educational media firm) to create a series of four self-paced AI ethics courses for Athabasca University's PowerED™ unit. Access the courses here.
In winter 2022, Katrina and I spoke to University Affairs about the courses and the importance of ethical AI and technology development. Read the article here. Digital 55 produced a short video in which Katrina and I talk about the experience. Watch it on YouTube.
In this webinar from 2021, Jeff Behrends, James Mickens, and I discuss how Embedded EthiCS at Harvard teaches ethical reasoning skills to computer science students. Watch it on YouTube.
In 2017, I contributed to a series of videos by philosophers at the University of Sheffield about the refugee crisis.
In my video, I discuss and refute a worry that the City of Sanctuary movement in the UK is undemocratic and contrary to the wishes of the public. The video series was featured at the 2018 Migration Matters Festival in Sheffield. You can watch all the videos in the series at the university's website.
In July 2023, I published a post about ChatGPT, artificial intelligence, automation, the ideology of OpenAI, what all of this says about what we value, and how we should respond.
In April 2023, I published a post summarizing some of my research on moral responsibility and autonomous systems. I argue that developers of autonomous systems and those who deploy them should take responsibility for any harms that result from the system, even if those people are not to blame for the outcome.

In March 2022, I published a post describing how I have used a video clip from Star Trek: The Next Generation to introduce a unit on AI ethics. The point was in part to seriously consider the issue of rights for artificial life, but also to set aside these questions in order to discuss ethical issues surrounding forms of AI that exist today.

In August 2021, I published a post showcasing my computer ethics course at Dalhousie University. I reflect on the experience of using team-based learning and teaching online during the pandemic, and highlight some of the content of the course and the importance of ethical instruction in computer science programs.
Copyright 2016–2024 by Trystan Goetze.