As CXD introduces AI to Bootcamp, students and faculty discuss initiatives to increase AI literacy

Courtesy of Bowdoin Communications
ARTIFICIAL INTELLIGENCE IN ACTION: Sophomores bond during a team activity at Bootcamp. Following feedback from last year, CXD repeatedly acknowledged the widespread use of artificial intelligence, or AI, as a critical new aspect of the career process this year.

As students gathered in Kresge Auditorium for a series of presentations during this year’s Sophomore Bootcamp, they not only gained the typical tools for writing cover letters and resumes but also learned how to incorporate artificial intelligence (AI) into their job search. The Career Exploration and Development Center’s (CXD) endorsement of AI left many students questioning what role it will play in their future careers.

Noah Rossin ’27 said AI was proposed as a resource in the majority of CXD presentations, with presenters recommending ChatGPT to check resumes, evaluate job eligibility, proofread cover letters and identify keywords in job descriptions. However, Rossin expressed qualms about using AI for those tasks.

“Those are things that other services can do or things that you as a college student should be able to do, and in theory, will be even more enticing to jobs in the future as more people use AI,” Rossin said.

Director of Partnerships and Programming Bethany Walsh, who organized Sophomore Bootcamp, said her goal in adding AI to the curriculum was to give students more options to improve their job searches as the application landscape changes.

“We’re definitely seeing employers using AI a lot more in how they’re reviewing applications and also expectations of students’ ability to navigate AI tools [are changing], because they’re going to be using them on the job a lot more,” Walsh said. “And that’s happened really rapidly over the last year—it’s just gone from zero to a million.”

Walsh noted that many students approached CXD in the past year with questions about where AI tools can be helpful or harmful to their applications. Because an optional workshop CXD offered on the topic last year drew few participants, Walsh decided to incorporate aspects of that talk into other parts of the Bootcamp curriculum.

“One of our main goals of Bootcamp is to make sure every student knows about the resources that are available to them, so they can thoughtfully choose which ones they want to use or how they want to use them,” Walsh said.

Walsh emphasized that as students apply to more jobs and recruiters receive more applications, AI can lighten the burden of this already time-consuming process.

“How can we point you towards the best use of your time so you can find the thing you’re looking for, apply to it successfully and hopefully get through the door to an interview?” Walsh said.

The CXD’s approach to AI education aligns with the College’s plan to roll out AI software to faculty and students over the next few years. Professor of Physics and Chair of the Committee on Teaching and Classroom Practice Dale Syphers serves as the principal investigator of a $250,000 grant to integrate AI into Bowdoin’s curriculum. About a quarter of the faculty are currently experimenting with the AI platform Amplify, and the number of classes using AI tripled this semester compared to last.

Syphers explained that, while software like ChatGPT is typically used to generate answers, the College hopes to teach students how to use AI to supplement their learning and critical thinking rather than as a shortcut.

“The key here at Bowdoin is to get an understanding of how it works on your own when it gives answers, and what might be right or wrong, but not to use that to help you craft your own understanding of a field. And if you can coexist in those two worlds, you are perfectly suited for the future,” Syphers said.

Both Syphers and Walsh said that AI is not a threat to future jobs but that Bowdoin students should nonetheless be well-versed in these tools as they enter the job market.

“I think everyone at the College is really invested in preparing students for the world they’re about to enter—good, bad or ugly,” Walsh said. “If you’re comfortable with AI, you’re willing to experiment with it and you’re adaptable and willing to learn, then you’re going to put yourself in a good stance to be successful. We don’t get to choose this unfortunately.”

However, students like Rossin are grappling with the ethical implications of AI and its implementation at a college dedicated to the common good.

“My perspective on it is that AI is a net negative. It is bad for your critical thinking. It is bad for the environment. A lot of these major AI models, like ChatGPT, are trained on almost slave labor in places like Kenya,” Rossin said.

Syphers acknowledged the environmental impacts of AI and the exploitative labor practices used in its training. However, he predicts that within the next five years, the technology will shrink its carbon footprint by running locally on personal computers instead of in large data centers, eventually allowing AI software to use about the same amount of energy as a laptop.

“I think everyone is going to have to come to their own understanding of where the ethics are for use. I base mine on where this technology is and where it’s going,” Syphers said. “So yes, in the short term, it’s going to have a huge energy [impact]. But in the long term, I don’t believe it is.”

Similar to the CXD’s approach, Syphers encourages students to strike their own balance with AI but to understand where the technology is heading.

“This is a tidal wave that is heading our way. There is no doubt about it. There’s no place to hide from it,” Syphers said.