Arranging playdates for children is commonplace: most parents want their kids to play, learn from others, and make friends. When scheduling these playdates, parents typically share allergy information or entertainment restrictions that reflect their family’s safety concerns and values.
Enter artificial intelligence (AI) into these scenarios, particularly as it relates to kids’ toys. AI systems that interact with children can impose values that may not align with yours or your friends’. In Engadget’s recent Public Access article, The Cultural Ramifications of Ubiquitous AI, John C. Havens, Executive Director of The IEEE Global Initiative for Ethical Considerations in Artificial Intelligence and Autonomous Systems, explores the ethical considerations of AI as it enters everyday life.
All sorts of cultural implications need to be considered when designing AI devices. For example, where should a robot with facial features direct its gaze while speaking so as not to offend its owner? The answer depends on where the device is released: in some cultures it is rude not to look at someone while you are speaking, while in others deference is expressed by looking toward the floor.
Recently, IEEE published the first version of Ethically Aligned Design: A Vision for Prioritizing Human Wellbeing with Artificial Intelligence and Autonomous Systems, which aims to address many of these ethical and cultural implications of AI. Launched by The IEEE Global Initiative for Ethical Considerations in Artificial Intelligence and Autonomous Systems, the document is now open for comments and feedback.
Learn more from the recent press release: IEEE Ethically Aligned Design Document Elevates the Importance of Ethics in the Development of Artificial Intelligence (AI) and Autonomous Systems (AS)