Does Counseling Need to Address Bias in Psychological AI?
Artificial intelligence is already reshaping how some clients engage with the counseling profession. But even as AI becomes more efficient and effective, the implicit biases of the programmers and counselors developing this technology could limit its ability to serve the diverse populations in need. To explore how programmers can integrate cultural competence into the development of psychological AI, Counseling@Northwestern sat down with core faculty members Dr. Tonya Davis and Dr. Russell Fulmer for a discussion about the future of AI in counseling and how this technology could improve engagement in some communities.
Cultural competence is essential in counseling, but culture isn’t static. It is not something you can simply teach to students and professionals and then move on. Why is cultural competence important, and how can counselors best educate themselves?
Dr. Davis: I think culture is defined in a myriad of ways. What we’re teaching when we teach cultural competence is a willingness to be open-minded and a willingness to sit in another person’s experience. It’s ever-evolving. And it is defined in so many ways that you can’t simply say, “This is what you’re going to learn, it’s in this box, and it’s going to serve you forever.”
Dr. Fulmer: I lived overseas for five years working in medical education. I was a counselor to students, faculty, and local islanders. You couldn’t piece together a more diverse climate. It was a trial by fire, but it was the best education of my life. Through that experience, I saw some of the limitations of my own training. For instance, the individual diversity within a group with, theoretically, common characteristics is astounding. I agree that culture is not static; it is a dynamic system, multilayered and evolving. An educational foundation in cultural competence gives you a set of ideas to expand upon, modify, and test in the real world. Regardless of what you read or are taught, there really is no substitute for immersion in a diverse environment to maximize learning.
Why is understanding the cultural background of a client important for counselors to consider?
Dr. Davis: One of the things that comes to mind is that, as clinicians, it is imperative we do not pathologize someone’s culture or the distinctions thereof. For example, when someone has deep-rooted religious beliefs, a counselor who does not understand those spiritual beliefs or values may be inclined to diagnose that person with anxiety, depression, or the like. To be clear, everyone is unique in many ways. However, to be culturally competent, it is essential for counselors to be aware of self and to have a working knowledge of their personal values and beliefs. That understanding makes room for an appreciation of the values and beliefs of others, which in turn allows counselors to broach these subjects responsibly.
The crux of what we want to discuss here is the intersection between cultural competence and the use of artificial intelligence in the counseling field. But first, can you talk a little about what AI actually looks like in the field today and what role it could play in the future?
Dr. Fulmer: AI is mainly a supplement to traditional, face-to-face counseling. No AI chatbot has taken over a counselor’s primary role, and I don’t think any counselor is in danger of losing their job anytime soon. Counselors have general intelligence; AIs are specialists. Still, psychological AI helps people, and a growing body of research attests to its efficacy. Every sign points to AI assisting helping professionals worldwide. To illustrate AI’s proliferation, there is a well-known case of a psychological AI used to aid Syrian refugees with PTSD.
And what does the future look like?
Dr. Fulmer: The future is conjecture. We must have the humility to admit we don’t know. I believe the odds of AI proliferating are greater than AI dwindling. Humans have an abysmal track record of prognosticating the distant future. We fare a little better in the short term. So, here is my prediction that I will keep unspecific: Considering the mind-blowing nature of current AI research coupled with an upcoming, tech-savvy Generation Z, future counselors and clients alike will be much more receptive to the incorporation of AI in every aspect of counseling. Today’s skepticism will give way to tomorrow’s acceptance. Acceptance will depend in part on how AI is manufactured today, which in turn depends on what form that interface of AI and cultural competence takes.
Is there any concern that some of the cultural challenges that exist in person-to-person interactions could creep into AI as programmers are creating algorithms?
Dr. Davis: That’s one of the questions I raised: Who’s programming what? Psychologists out of Harvard created the implicit association test (IAT), which measures unconscious biases: the things hidden just beneath the surface of our awareness, things that are not tangible. How might this translate when we as helping professionals and practicing clinicians aid in the programming of AI chatbots? What might we be missing because of our inability to identify unconscious or implicit biases? Are AI programmers provided the tools and resources needed to recognize these biases within themselves? Those are just a few of the questions I have. How do we go about navigating that? One way is to learn to take a solid inventory of self. That is going to take a dedicated willingness to know thyself. I would also imagine that identifying biases before or in the early stages of algorithm development may prove useful. My hope is that all practicing clinicians are reliably modeling a willingness to know oneself, navigating in-depth self-reflection, and doing the interpersonal work required of us to be the clinicians we are striving to be. Are we going to be perfect? No, we’ll never be perfect. But I do think that if there is a deliberate effort to identify, understand, and resolve the biases that live beneath the surface of our awareness, we could be one step closer to addressing the cultural challenges that exist in person-to-person interactions, thereby minimizing these challenges in the development of AI.
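To make the idea of auditing for bias early in development concrete, here is a minimal sketch in Python of a word-embedding association test (WEAT), a computational analogue of the IAT that NLP researchers use to probe language models for differential associations. The vectors and word groups below are fabricated placeholders; a real audit would draw them from the chatbot’s own embedding model.

    # Minimal WEAT sketch: does one target group sit closer to "pleasant"
    # attribute words than another? Toy vectors only, for illustration.
    import numpy as np

    def cosine(u, v):
        # Cosine similarity between two embedding vectors.
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    def association(w, attr_a, attr_b):
        # How much more strongly w associates with attribute set A than B.
        return (np.mean([cosine(w, a) for a in attr_a])
                - np.mean([cosine(w, b) for b in attr_b]))

    def weat_effect_size(targets_x, targets_y, attr_a, attr_b):
        # Difference in mean association between the two target groups,
        # scaled by the pooled standard deviation; values far from zero
        # suggest the embedding space encodes a differential association.
        ax = [association(w, attr_a, attr_b) for w in targets_x]
        ay = [association(w, attr_a, attr_b) for w in targets_y]
        return (np.mean(ax) - np.mean(ay)) / np.std(ax + ay, ddof=1)

    # Fabricated 3-d embeddings standing in for real word vectors.
    rng = np.random.default_rng(0)
    vecs = lambda n: [rng.normal(size=3) for _ in range(n)]
    targets_x, targets_y = vecs(4), vecs(4)   # e.g., names from two groups
    pleasant, unpleasant = vecs(4), vecs(4)   # e.g., "joy" vs. "agony" words

    print(f"WEAT effect size: "
          f"{weat_effect_size(targets_x, targets_y, pleasant, unpleasant):.3f}")

An effect size near zero would suggest no measured association gap; a large positive or negative value would be a cue to investigate the underlying training data before the chatbot ever reaches a client.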
What are some ways counselors can address issues of bias in AI?
Dr. Fulmer: It starts with awareness on the part of those who create AI. Most AI developers are busy keeping up with a technology that changes more by the week than some entire fields have in years. They are not trying to be cruel or dismissive; cultural competence just may not be at the forefront for them. We have to bring it to the developers’ attention.
Then the question becomes, what to do about it? AI developers enlisting the help of culturally competent counselors would help. The counselor’s “voice” becomes the AI’s voice (through an algorithm), at least initially. Over time, the AI must be monitored by its developer and by the counselor using it, looking for signs of error, bias, and prejudice. Prevention is the best medicine, even if the medicine is imperfect.
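As one hypothetical illustration of that monitoring loop, the Python sketch below screens each chatbot reply against a counselor-maintained watch list and routes flagged exchanges to a human reviewer. Every name, term, and data structure here is invented for illustration; a real system would pair far richer classifiers with clinical judgment.

    # Hypothetical human-in-the-loop review queue for a psychological chatbot.
    from dataclasses import dataclass, field

    @dataclass
    class Exchange:
        client_message: str
        bot_reply: str
        flags: list = field(default_factory=list)

    # Counselor-maintained watch list (placeholder examples only).
    FLAG_TERMS = {"crazy", "lazy", "just get over it"}

    def screen(exchange):
        # Attach a flag for every watch-list term found in the bot's reply.
        reply = exchange.bot_reply.lower()
        exchange.flags = [t for t in FLAG_TERMS if t in reply]
        return exchange

    def review_queue(exchanges):
        # Yield only the exchanges a counselor needs to look at.
        for ex in map(screen, exchanges):
            if ex.flags:
                yield ex

    log = [
        Exchange("I can't focus lately.", "That sounds hard. Tell me more?"),
        Exchange("I feel stuck.", "Maybe you should just get over it."),
    ]
    for ex in review_queue(log):
        print(f"FLAGGED {ex.flags}: {ex.bot_reply!r}")

The point of the sketch is the division of labor: the developer maintains the plumbing, while the counselor curates what counts as a flag, keeping the clinical voice in the loop as the AI evolves.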
Does AI offer an opportunity to break down social distance between counselors and clients who may have different racial or ethnic backgrounds, sexual orientations, gender identities, etc.?
Dr. Davis: I believe it’s a possibility. What allows people to open up, or prevents them from opening up, is so fluid that it’s hard to grasp. I think about my clients who might be agoraphobic, who might be unable to leave their homes because of that sort of social anxiety. Can AI assist those clients to the point where I can get them out of their homes and into my office? I can see that.
I can visualize AI helping individuals in war-torn regions that we cannot access. I can also see how AI could be programmed to be extremely helpful for people who have had traumatic lived experiences. So, does AI also have the propensity to reduce or minimize the effectiveness of the therapeutic alliance, or the establishment of the therapeutic relationship in general? Yes, I believe that to be a possibility based on my professional experience. So then, how might we utilize AI in a way that maximizes the client’s ability to experience people? If a client is solely dependent upon AI, does that limit their ability to thrive in social situations? Does it exacerbate what they are experiencing? I imagine these questions will have to be considered and answered on a case-by-case basis.
Some clients who have engaged with psychological AI have said they like not working with a human because they feel less judged. What are the practical implications of that response?
Dr. Fulmer: The question makes me wonder: what is the best user interface for participants interacting with an AI? Should it be the image of a person? Should it be an abstract image? That will probably be an upcoming research project. Perhaps the AI could take the form of whatever image or person the client is most comfortable with, a choice that raises questions about the nature of a therapeutic relationship.
I see opportunity. I see promise with psychological AIs helping specific populations. What we are trying to do now is make sure that programmers and counselors receive cultural competence training to ensure such expertise is reflected in the AI.
Dr. Russell Fulmer, Dr. Tonya Davis, and Counseling@Northwestern alumna Lisa Pisani will present on this topic at the International Association for Counseling’s 2018 conference in Rome, September 22–23. The conference will address counseling issues across ages and cultures.
Citation for this content: Northwestern University’s online Master of Arts in Counseling Program.