Cal State Long Beach lecturer Casey Goeller wants his students to know how to use AI before they enter the workforce.
Tasmin McGill/EdSource
Since OpenAI's release of ChatGPT in 2022, artificial intelligence (AI) chatbots and models have found their way into the California college systems. These AI tools include language models and image generators that provide responses and images based on user prompts.
Many college professors have spoken out against AI's use in college coursework, citing concerns of cheating, inaccurate responses, student overreliance on the tool, and, as a consequence, diminished critical thinking. Universities across the U.S. have implemented AI-detecting software like Turnitin to prevent cheating through the use of AI tools.
However, some professors have embraced the use of generative AI and envision its integration into curricula and research in various disciplines. To these professors, students learning how to use AI is critical to their future careers.
An October 2024 report from the University of Southern California's Marshall School of Business found that 38% of the school's faculty use AI in their classrooms.
Ramandeep Randhawa, professor of business administration and data science at USC, was one of the report's 26 co-authors and organized the effort.
"As companies increasingly integrate AI into their workflows, it is critical to prepare students for this AI-first environment by enabling them to use this technology meaningfully and ethically," Randhawa said. "Universities, as bastions of knowledge, must lead the way by incorporating AI into their curricula."
All in on AI
At California State University, Long Beach, gerontology lecturer Casey Goeller has incorporated AI into his course assignments since fall 2023.
Students enter Goeller's Perspectives on Gerontology course with varying levels of experience with AI. By asking for a show of hands, Goeller estimates the class is usually evenly split: some students have no experience, others have dabbled with it, and some have used it extensively.
Goeller aims to help students understand how AI can be beneficial to them academically, whether it be assisting with brainstorming, organizing, or acting as a 24/7 on-call tutor.
To achieve this, Goeller's assignments include having students use an AI tool of their choice to address his feedback on their essays, based on criteria such as content, flow and plagiarism concerns. Another assignment, worth 15% of their grade, emphasizes the importance of prompt engineering by having students use AI-generated questions to interview an older person in their life.
While Goeller gets a lot of questions from fellow faculty members about how AI works and how to implement it, he also hears plenty of hesitation.
"There's a lot of faculty who's still riding a horse to work, I call it," Goeller said. "One of them said, 'I am never going to use AI. It's just not going to happen.' I said, 'What you should do if you think you can get away with that is tomorrow morning, get up really early and stop the sun from coming up, because that's how inevitable AI is.'"
Goeller acknowledges the difficulty of establishing a definitive way to incorporate AI into curricula, given the range of academic disciplines and styles of learning, but he recognizes the growing presence of AI in the workforce. Today, AI is filling various roles across industries, from analyzing trends in newsrooms and grocery stores to generating entertainment, a point of contention for SAG-AFTRA members during 2023's Hollywood strikes.
"If we don't help our students understand AI before they escape this place, they're going to get into the workforce where it's there," Goeller said. "If they don't know anything about it or are uncomfortable with it, they're at a disadvantage compared to a student with the same degree and knowledge of AI."
California State University, Northridge, journalism lecturer Marta Valier has students use ChatGPT to write headlines, interview questions and video captions in her Multimedia Storytelling and Multi-platform Storytelling classes due to the inevitability of AI in the workforce.
The goal of the implementation is to teach students how AI algorithms operate and how journalists can use AI to assist their work. Not using it, she said, "would be like not using ink."
"I absolutely want students to experiment with AI because, in newsrooms, it is used. In offices, it is used," Valier said. "It's just a matter of understanding which tools are useful, for what and where human creativity is still the best and where AI can help."
AI tools such as ChatGPT and Copilot are frequently updated, so Valier emphasizes flexibility when teaching about these technological topics.
"I basically change my curriculum every day," Valier said. "I think it reminds me as a professional that you need to constantly adapt to new technology because it's going to change very fast. It's very important to be open, to be curious about what technology can bring us and how it can help us."
However, Valier acknowledges the issues of AI in terms of data privacy and providing factual responses. She reminds students that it is their responsibility to make sure the information ChatGPT provides is accurate by doing their own research or rechecking results, and to avoid reliance on the platform.
"Be very careful with personal information," Valier said. "Especially if you have sources, or people that you want to protect, be very careful putting names and information that is sensitive."
Valier sees a clear difference in the quality of work produced by students who combine AI with their own skills, versus those who rely entirely on artificial intelligence.
"You can tell when the person uses ChatGPT and stays on top of it, and when GPT takes over," Valier said. "What I am really interested in is the point of view of the student, so when GPT takes over, there is no point of view. Even if [a student] doesn't have the best writing, the ideas are still there."
Balancing AI use in the classroom
Many AI-friendly instructors seek to strike a balance between AI-enriched assignments and AI-free assignments.
At USC, professors are encouraged to develop AI policies for each of their classes. Professors can choose between two approaches, as laid out in the school's instructor guidelines for AI use: "Embrace and Enhance" or "Discourage and Detect."
Bobby Carnes, an associate professor of clinical accounting at USC, has adopted a balance between both approaches while teaching Introduction to Financial Accounting.
"I use it all the time, so it doesn't make sense to tell (students) they can't use it," Carnes said.
Carnes uses AI to refine his grammar in personal and professional work and to develop questions for tests.
"I give ChatGPT the information that I taught in the class, and then I can ask, 'What topics haven't I covered with these exam questions?' It can help provide a more rich or robust exam," Carnes said.
He doesn't allow students to use AI in exams that test for practical accounting skills, though.
"You need that baseline, but we're trying to get students to be at that next level, to see the big picture," he said.
Carnes said he wants his students to take advantage of AI tools that are already changing the field, while mastering the foundational skills they'll need to become financial managers and leaders.
"The nice thing about accounting is that the jobs just become more interesting (with AI), where there's not as much remedial tasks," Carnes said.
Preserving foundational learning
Olivia Obeso, professor of education and literacy at California State Polytechnic University, San Luis Obispo, believes establishing foundational knowledge and critical thinking skills through AI-free teaching is non-negotiable.
Obeso enforces her own no ChatGPT/AI usage policy in her Foundations of K-8 Literacy Teaching class to prepare her students for challenges in their post-collegiate life.
"AI takes out the opportunity to engage in that productive struggle," Obeso said. "That means my students won't necessarily understand the topics as deeply or develop the skills they need."
Obeso is also concerned about ChatGPT's environmental impact: For an in-class activity at the start of the fall 2024 semester, she asked students to research the software's energy and water use.
The energy required to power ChatGPT emits 8.4 tons of carbon dioxide per year, according to Earth.Org; the average passenger vehicle produces 5 tons per year. Asking ChatGPT 20 to 50 questions uses about 500 milliliters (16.9 ounces) of water, roughly the volume of a standard plastic water bottle.
By the end of the exercise, Obeso said her students became "experts" on ethical considerations concerning AI, sharing their findings with the class through a discussion on what they read, how they felt and whether they had new concerns about using AI.
"You are a student and you are learning how to operate in this world, hold yourselves accountable," Obeso said.
Jessica Odden, a senior majoring in child development, said Obeso's class helped her understand AI use in the classroom as an aspiring teacher.
"For people that are using (AI) in the wrong ways, it makes people reassess how people might be using it, especially in classes like this where we are training to become teachers," Odden said. "What are you going to do when you actually have to lesson-plan yourself?"
Odden makes sure she sticks to learning the fundamentals of teaching herself so that she will be prepared for her first job.
AI in curricula
At the University of California, San Diego, some faculty members have echoed a concern for AI's infringement upon independent learning.
Academic coordinator Eberly Barnes is interested in finding a middle ground that incorporates AI into curricula where it complements studentsâ critical thinking, rather than replaces it.
Barnes oversees the analytical writing program, Making of the Modern World (MMW), where her responsibilities include revising the course's policy on AI use in student work.
The current policy allows students to use AI to stimulate their thinking, reading and writing for their assignments. However, it explicitly prohibits using the software to replace any of those skills or to produce the written piece itself.
Despite the encouraged use of AI, Barnes expressed her own hesitancy about the role of AI in the field of social sciences and the research and writing skills needed to work within it.
"One of the goals in MMW is to teach critical thinking and also to teach academic writing. And the writing is embedded in the curriculum. You're not going to learn to write if you're just going to machine," Barnes said. "The policy is inspired by the fact that we don't think there's any way to stop generative AI use."
When Barnes designs the writing prompts for the second and third series in the MMW program, she collaborates with teaching assistants to make assignment prompts incompatible with AI analysis and reduce the likelihood that students will seek out AI's help for passing grades.
"Students feel absolutely obsessed with grades and are very pressured to compete," Barnes said. "That's been around. I mean it is definitely worse here at UCSD than it was at other colleges and universities that I've been at."
A tool, not a cheat code
Celeste Pilegard is a professor of cognitive science and educational psychology at UCSD. She has been teaching introductory research methods since 2019, focusing on foundational topics that will prepare students for higher-level topics in the field.
Educators like Pilegard have been struggling to adapt after the widespread adoption of AI tools.
"For me and a lot of professors, there's fear," Pilegard said. "We're holding onto the last vestiges, hoping this isn't going to become the thing everyone is using."
Pilegard is concerned that students rely on AI tools to easily pass their intro-level courses, leaving them without a firm understanding of the content and unable to properly assess AI's accuracy.
"It's hard to notice what is real and what is fake, what is helpful and what is misguided," Pilegard said. "When you have enough expertise in an area, it's possible to use ChatGPT as a thinking tool because you can detect its shortcomings."
However, Pilegard does believe AI can assist in learning. She likens the current situation with AI to the advent of statistical analysis software back in the 1970s, which eliminated the need to do calculations by hand.
At that time, many professors argued for the importance of students doing work manually to comprehend the foundations. However, these tools are now regularly used in the classroom with the acceptance and guidance of educators.
"I don't want to be the stick in the mud in terms of artificial intelligence," Pilegard said. "Maybe there are some things that aren't important for students to be doing themselves. But when the thing you're offloading onto the computer is building the connections that help you build expertise, you're really missing an opportunity to be learning deeply."