About the Activity
Artificial Intelligence (AI) is already being used without guidelines relating to family violence. Guidance is needed to address the misuse of AI technology, which goes far beyond simple bias or generic criminality. After IoT devices became widespread in the home, it took technologists several years to recognize the harm those devices could cause in the context of domestic and family violence. Because AI enables even more intrusive behavior, a similar delay should be avoided.
The urgency is great because the understanding of family violence is moving beyond physical and sexual violence to focus on coercive control, such as emotional and economic abuse. If AI draws on an inadequate understanding of coercive control, particularly as it affects vulnerable communities, it could entrench existing biases and incomplete knowledge of coercive control.
Goals of the Activity
This IC activity seeks to harness the strengths of AI to better identify and evaluate the risk of family violence.
Getting Involved
Who Should Get Involved
- Organizations dealing with how family violence is experienced by vulnerable communities
- Organizations dealing with those affected by family violence
- Regulators
- Government user groups, such as the police
- Researchers
- AI developers
- Product developers who integrate AI into their products
- Social policy developers
How to Get Involved
To learn more about the program and how to join the User-Centered Principles for Artificial Intelligence Used in Evaluating Family Violence activity, please express your interest by completing the activity's interest form.