Can Artificial Intelligence Help Children Who Have Experienced Abuse Disclose Their Trauma?

I have focused my career on helping children who have been abused speak out about their abuse. From helping develop the first evidence-based curriculum for students pre-K through 12th grade, to connecting that program with the nation’s largest child abuse hotline, intervening in and preventing child abuse is the core purpose of my work. Of the various programs and services I am honored to help offer these children, among the most impactful I have had the opportunity to help develop are multidisciplinary child advocacy centers.

Through multidisciplinary child advocacy centers, children who have been abused find a safe environment where they can disclose their abuse to a trained forensic interviewer. Because a team of community partners works together under one roof, a child needs to disclose the trauma they have experienced only once. This approach significantly reduces the re-traumatization a child must endure and significantly increases the rate of prosecution of those who commit these crimes against children. It is possible because we capture the child’s full testimony in a video- and audio-recorded format, so there are no variations in the child’s testimony.

As I examined ISTE Standard 1.3, Knowledge Constructor, I focused on two parts of the standard. ISTE 1.3c encourages students to curate information from digital resources, using a variety of tools and methods to create collections of artifacts that demonstrate meaningful connections or conclusions. ISTE 1.3d encourages students to build knowledge by actively exploring real-world issues and problems, developing ideas and theories, and pursuing answers and solutions.

In his book Teaching in a Digital Age, Tony Bates discusses emerging technologies such as artificial intelligence (AI). In his chapter on AI, Bates describes its advancements using examples such as chatbots. According to Bates, “a chatbot is programming that simulates the conversation or ‘chatter’ of a human being through text or voice interactions. Chatbots in particular are a tool used to automate communications with students.” The idea of chatbots, and the learning impact they make in the classroom, left me curious to explore two questions: Can artificial intelligence be a useful digital tool to help children who have been abused disclose their abuse? And, through multidisciplinary child advocacy centers, can artificial intelligence equip trained forensic interviewers with digital tools to collect information appropriately while further reducing a child’s re-traumatization?
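To make Bates’s definition concrete, here is a minimal, purely illustrative sketch of a rule-based chatbot that automates routine student communications. The keywords and canned responses are hypothetical examples of my own, not tied to any real product:

```python
# Toy rule-based chatbot in the sense Bates describes: pattern-matched
# text responses that automate routine communications with students.
RULES = {
    "deadline": "The assignment deadline is posted on the course page.",
    "office hours": "Office hours are Tuesdays at 3 pm.",
    "hello": "Hi! How can I help you with the course today?",
}

def reply(message: str) -> str:
    """Return the first canned answer whose keyword appears in the message."""
    text = message.lower()
    for keyword, answer in RULES.items():
        if keyword in text:
            return answer
    # Fall back to a human when no rule matches.
    return "I'm not sure -- I'll forward your question to the instructor."

print(reply("Hello there"))   # greeting rule fires
print(reply("When is the deadline?"))
```

Real chatbots replace the keyword table with language models, but the basic shape, user text in and automated response out, is the same.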

In the forensic interview process, interviewers are challenged to decipher what the right questions are, how to present them in the right way, and when the right time is to ask them. This “formula” is crucial to prosecuting offenders, especially when the child who has experienced abuse is the only one aware of the crime, and to doing so in a way that does not further re-traumatize the child. A 2018 article from USC’s Neuroscience Graduate Program, titled “Can Artificial Intelligence Help Abuse Victims Disclose Traumatic Testimony?”, sought to “determine if and how computer-aided tools can accurately assess the productivity of forensic interviews.” The researchers behind this work were aware “of how the rapport built among interviewer and interviewee, the tone in which questions are asked, pauses and even question order can impact how much meaningful information is shared.” In this study, “the way that children in these interviews responded was highly correlated to their age. For younger children, the emotional content of the interviewer’s words had an impact on how much information they were willing to share during the phase of the interview. Older children were more influenced by the way interviewers vocalized their words (the pitch and loudness).” In short, their work began to demonstrate that AI could help train forensic interviewers, whether through virtual assistants that alert interviewers to identified patterns and cues during an interview, or through simulated child interviews that help develop that critical formula.
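The vocal cues the study mentions, pitch and loudness, are quantities a computer can measure directly from audio. The sketch below is a simplified illustration of that idea, not the researchers’ actual tooling: it estimates loudness as root-mean-square energy and pitch via autocorrelation, demonstrated on a synthetic tone standing in for a stretch of speech:

```python
import numpy as np

def rms_loudness(frame):
    """Root-mean-square energy: a simple proxy for perceived loudness."""
    return float(np.sqrt(np.mean(frame ** 2)))

def estimate_pitch(frame, sample_rate):
    """Estimate fundamental frequency (pitch) via autocorrelation."""
    frame = frame - frame.mean()
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    rising = np.nonzero(np.diff(corr) > 0)[0]  # skip past the zero-lag peak
    if len(rising) == 0:
        return 0.0
    peak = rising[0] + np.argmax(corr[rising[0]:])  # first strong repeat
    return sample_rate / peak

# Synthetic example: one second of a 220 Hz tone at 16 kHz.
sr = 16000
t = np.arange(sr) / sr
tone = 0.5 * np.sin(2 * np.pi * 220 * t)
print(rms_loudness(tone))        # ~0.354 (amplitude / sqrt(2))
print(estimate_pitch(tone, sr))  # ~220 Hz
```

Tracking how features like these change across an interview is the kind of signal a virtual assistant could surface to an interviewer in real time.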

I connected this study with ISTE Standard 1.3, Knowledge Constructor, in two ways. First, it explores how encouraging forensic interviewers to curate information received through digital AI resources can help create meaningful connections and conclusions that advance both the appropriate testimony of a child who has experienced abuse and the appropriate investigation and prosecution of the offender. Second, by using AI for purposes such as simulating a child interview, the study highlights how forensic interviewers can build knowledge by actively exploring the real-world issues and problems associated with child testimonies, developing ideas, and pursuing answers and solutions.

In a presentation from The Zero Abuse Project at the 2018 AI for Good Global Summit, founder Joelle Casteix shares her work on the use of AI to address child abuse. She states, “we train the tool about what the various behaviors are…[and] use AI to find patterns that we have never thought possible, to protect more and more children from abuse within institutions.” She goes on to note that these “new advancements in machine learning and AI are allowing researchers to identify patterns in data sets never before seen…[and] Project G is a tool that identifies risk factors of predatory behaviors, not only on predators who prey on children, but those associated with the cover up of sexual exploitation. It’s an amazing, fascinating tool because what it does, it helps show us what a predator looks like.”

Children should never have to experience the horrors of abuse. Those who do, or who are at risk, should always be protected and educated through prevention education efforts like Childhelp Speak Up Be Safe. Children who need the services of multidisciplinary child advocacy centers, like those offered through Childhelp, should be given the opportunity to disclose or share their testimony in a way that minimizes their trauma and enhances the rate of prosecution of those who offend against children. AI is beginning to show how it can be a valuable tool for forensic interviewers. It can help not only in the full and appropriate collection of information from a traumatized child, but also in holding abusers accountable and in building a body of data that can hopefully be used to intervene and save children from abuse sooner, with greater efficiency and success. This is truly how AI can be used for the greatest good.

References:

Bates, A. W. (2022). Teaching in a digital age. Retrieved from https://pressbooks.bccampus.ca/teachinginadigitalagev3m/  

Casteix, J. (2018). How AI is helping stop abuse and exploitation of children [Video]. AI for Good Global Summit. https://aiforgood.itu.int/how-ai-is-helping-stop-abuse-and-exploitation-of-children-video/

Abitha, M., Ardulov, V., & Narayanan, S. (2018). Can artificial intelligence help abuse victims disclose traumatic testimony? University of Southern California. https://ngp.usc.edu/2018/12/20/can-artificial-intelligence-help-abuse-victims-disclose-traumatic-testimony/
