Featured Article
Article Title
Examining Artificial Intelligence Policies in Counsellor Education
Authors
Laurie O. Campbell - Department of Learning Sciences and Educational Research, University of Central Florida, Orlando, Florida, USA
Caitlin Frawley - Department of Counselor Education and School Psychology, University of Central Florida, Orlando, Florida, USA
Glenn W. Lambie - College of Community Innovation and Education, University of Central Florida, Orlando, Florida, USA
Karina S. Cabrera - Department of Counselor Education and School Psychology, University of Central Florida, Orlando, Florida, USA
Bryanna D. Vizcarra - Department of Counselor Education and School Psychology, University of Central Florida, Orlando, Florida, USA
Abstract
Aims: This study investigated generative artificial intelligence (AI) policies in doctoral-level counsellor education programmes accredited by the Council for Accreditation of Counseling and Related Educational Programs (CACREP). We aimed to contribute to emerging research on the use of generative AI within counsellor education. Methods: A content analysis of the policies was conducted, along with a linguistic analysis to determine the authenticity, tone, and analytical nature of the university and programme policies. Results: The content analysis of generative AI usage policies within doctoral counsellor education programmes indicated that only five programmes had programme-specific generative AI policies; most programmes relied on university policies or usage guidance. Conclusion: Suggestions for practice include providing definitional clarity about the different types of AI to reduce potential frustration for learners. Further, programmes should consider developing a programme-specific policy, since the counselling profession requires a high level of ethical responsibility to best serve clients.
Keywords
artificial intelligence (AI); CACREP; counsellor education; ethics; policy
Summary of Research
“...In response to the surge of generative AI... there is a pressing need for research identifying the future of counselling and counsellor education in the age of artificial intelligence. Therefore, we aim to contribute to this emerging research body on the use of generative AI within counsellor education learning environments by conducting a content analysis of AI policies for doctoral-level counsellor education programmes” (p. 1).
“Counsellor education has leveraged emerging technologies to facilitate learning both in face-to-face and distance learning. In general, the use of technologies in counselling and psychotherapy is an accepted modality for counselling and therapy. Predictors of technology use for counsellors, patients and therapists alike are the perceived relevance and usefulness of the technology… the use of generative AI in counselling and psychotherapy is emerging, and training for its use is unfolding. Therefore, there is a need to investigate the use of generative AI in the education of counsellors” (p. 2).
“Researchers in higher education have called for educators to provide learners with the necessary guidance to understand and ethically use generative AI to foster growth… The development of AI-based policies in an educational context can support learners who will integrate AI into their classwork and future work, promoting transparency… The emergence of generative AI has sparked discussions and apprehension regarding its use in academic settings. Concerns include (a) diminishing human connection, and (b) stymying critical thinking. Therefore, having established policies regarding generative AI may support counsellor educators in training as they begin to engage with generative AI. Due to the novelty of the use of generative AI and the uncertainty of where generative AI is situated in academic ethics and authorship, counsellor educators have noted the need for policies to support the use of generative AI” (p. 3).
“The primary aim of our study was to characterise AI policies of CACREP-accredited doctoral counsellor education programmes and complete a linguistic analysis of the policies. The research questions guiding our study were as follows: Research Question 1. How are CACREP-accredited counsellor education doctoral programmes addressing generative AI? Research Question 2. What are the common characteristics and linguistic content found in generative AI policies and use guidance among counsellor education doctoral programmes?... Out of the 92 counsellor education doctoral programmes, data were collected from 81 programmes, either through direct response or from their publicly available websites” (p. 4).
“The five programme-specific policies had a clout score slightly lower than the university score of 49.24. The authenticity of the statements was genuinely low at 15.21… Of the 81 CACREP-accredited programmes included in the data analysis, only five programmes reported that they utilised a programme-specific AI policy; most programmes relied on university guidelines or did not have a policy” (p. 7).
Translating Research into Practice
Prioritizing Policies
“Counsellor education programmes must carefully evaluate their policies' language for clarity and conciseness as they partner with students. Prior studies related to plagiarism have noted the need for continuity of policies to prevent confusion and to provide clarity. Therefore, counsellor education programmes are encouraged to adopt, develop or revise programme-specific policies and guidelines regarding the utilisation of AI by counsellors in training and counsellor education doctoral students to diminish potential confusion for trainees utilising generative AI in coursework, internship or practicum” (p. 7).
“When developing, creating or revising programme-specific generative AI policies, programmes should consider that the content of the policies includes: (a) definitions, (b) differences and (c) explanations. Guidance for developing policies and guidelines includes clarifying terms. Indicating the distinctions between generative AI and assistive AI would support students by providing clear guidelines. The terms generative AI, assistive AI and AI should be denoted within the policies instead of the generic term AI, as there are distinct differences and many types of AI exist in higher education” (p. 7).
“Counselling and psychotherapy education AI policies should include reminders of client privacy to prevent sharing sensitive and confidential information with generative AI, as it learns from queries. Additionally, these cautions should become a part of the curriculum” (p. 7).
Other Interesting Tidbits for Researchers and Clinicians
“Generative AI use in learning requires the use of statements. The type of AI (generative or assistive) allowed for assignments should be clear. Existing generative AI policies or guidance could reference the honour code and academic integrity as appropriate” (p. 7).
Avoiding Lags in Leadership and Learning
“Higher education is metaphorically wrestling with establishing guardrails related to generative AI and its use by learners by establishing policies for its use. Developing academic policies takes time and interest. Since generative AI developed quickly, there may have been a lag in policy development. The outcomes (e.g., cost, time and energy) of operating without a clear and succinct generative AI policy in counsellor education have not been realised to date…
As generative AI continues to expand and individuals innovate and discover ways to integrate generative AI into teaching and learning practices, higher education needs to embrace clear and concise use guidelines, policies and approaches relative to this emerging technology. Whether a programme allows or prohibits the use of generative or assistive AI, the language utilised in the content of the policy can focus on affiliation type words to work in partnership with learners” (p. 8).
Additional Resources/Programs
As always, please join the discussion below if you have thoughts or comments to add!