This Counselor Used AI to Help Students Apply to College. Here’s How

High school counselors have a lot on their plates. They’re holding individual meetings with students to help them think about their futures, fostering connections with college admissions officers, and writing letters of recommendation.

Much of the work can get repetitive, as they field the same questions from multiple parents every year and write dozens of recommendation letters.

That’s where artificial intelligence tools can help, said Jeffrey Neill, director of college counseling at Graded: The American School of São Paulo in Brazil, a private school serving students from age 3 through high school that uses an American curriculum and teaches in English.

AI has made its way into many other aspects of K-12 education, helping teachers create lesson plans, grade students’ work, compose emails to parents, and craft Individualized Education Programs, or IEPs, for students with disabilities.

Neill discussed his experience incorporating AI tools into counseling at the College Board’s annual forum here in Austin this week. Neill, whose newsletter offers a detailed look at some of the free and paid online tools he uses, also spoke with Education Week about best practices for using AI when counseling high school students about college.

This interview has been edited for length and clarity.

How can counselors use AI tools in their regular work?

The tasks that really lend themselves to artificial intelligence are those that are repetitive, simple, and information-based. So, for example … letters of recommendation.

These are heartfelt, passionate letters, but when a counselor takes the time to stop and think about how much time goes into them, I think most would recognize that once they have all of the information aggregated in one place, the writing process doesn’t take as long as they think. It used to take me about three hours to write a single letter of recommendation, and I realized that at least an hour and a half of that, maybe more, was me collecting the information from all the different locations.

Artificial intelligence can do that, pull it all together, so that when you go to write the letter, you’re working from one source of information, really just hacking away at it, shaping it into that passionate letter that advocates on behalf of the individual student.

There are also lots of little things that can help with processes and procedures. About 150 colleges and universities from around the world have visited us on campus so far this year, and every time a college visits, we try to advertise it to our students. We just ask ChatGPT to write a three-sentence blurb about the university, including famous alums or something like that, and we blast that out to the students. I could write those on my own in most cases, but ChatGPT does it faster.
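
For counselors who would rather script that step than type the same request into ChatGPT each time, a minimal sketch along these lines could work. The model name, function, and wording here are illustrative assumptions, not a description of Neill’s actual workflow:

```python
# Illustrative sketch only: the article describes typing this request into ChatGPT by hand.
# Assumes the official `openai` Python package and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

def college_visit_blurb(college_name: str) -> str:
    """Draft a three-sentence announcement blurb for a visiting college."""
    prompt = (
        f"Write a three-sentence blurb about {college_name} for high school students, "
        "including a famous alum or another memorable detail."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; any chat model would do
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(college_visit_blurb("University of Notre Dame"))
```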

I spent some time [at the College Board forum] talking about the email triage tool, which is essentially an artificial intelligence tool that is trained exclusively on your past emails. Anytime a new email comes in, it drafts a reply based on what it thinks you will say, drawing on how you’ve answered the same question for other people. And right now, I think it’s probably operating at like a 90 percent accuracy rate. I do have to change some things, but it has also learned how to write like me.
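
The article does not describe how that triage tool works under the hood. One plausible way to build something similar is to retrieve a counselor’s most relevant past replies and hand them to a model as style and content examples, as in the hypothetical sketch below. Every name, model choice, and the sample data are assumptions for illustration:

```python
# Hypothetical sketch of an email-triage drafter grounded in past replies.
# This is one plausible design, not a description of the tool mentioned in the interview.
# Assumes the `openai` package; past_replies would come from an exported mailbox.
from openai import OpenAI
import numpy as np

client = OpenAI()

past_replies = [
    {"question": "When are teacher recommendations due?",
     "answer": "Teacher recommendations are due two weeks before each application deadline."},
    {"question": "Can my student take the SAT again senior year?",
     "answer": "Yes, the October and November sittings still reach most regular-decision deadlines."},
]

def embed(text: str) -> np.ndarray:
    """Return an embedding vector for a piece of text."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=text)
    return np.array(resp.data[0].embedding)

# Pre-compute embeddings for past questions once.
for item in past_replies:
    item["vec"] = embed(item["question"])

def draft_reply(incoming_email: str, k: int = 2) -> str:
    """Find the k most similar past exchanges and draft a reply in the same voice."""
    query = embed(incoming_email)
    scored = sorted(past_replies,
                    key=lambda it: float(np.dot(query, it["vec"])),
                    reverse=True)[:k]
    examples = "\n\n".join(f"Q: {it['question']}\nA: {it['answer']}" for it in scored)
    messages = [
        {"role": "system",
         "content": "Draft a reply in the counselor's voice, consistent with these past answers:\n"
                    + examples},
        {"role": "user", "content": incoming_email},
    ]
    resp = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    return resp.choices[0].message.content
```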

How can counselors use AI tools with their students?

We’ve been experimenting with lots of different ways to basically empower students to make good, responsible, and ethical use of these tools.

I start by saying, “…there is only one rule: don’t copy and paste text from ChatGPT and claim it as your own.”

But there are some ways in which we’re trying to get the students to use it as a sounding board, a way to get feedback. One is to dump in a completed essay and ask it, “Please rank this essay on a scale of 1 to 10, and give me three points of critical feedback for how I might improve it.”

There’s no difference between asking ChatGPT to do that and asking me as their college counselor, asking their English teacher, asking their parent, an independent consultant, whoever. The benefit of ChatGPT, and of empowering our students to use it, is that they can do it on their own timeframe.

We’ve also developed some tools specifically for the Common Application [which allows students to apply to multiple colleges and universities at once]. In the activities section, which is basically the resume section, there’s a 150-character description. It asks, “What’s the activity?” Cross-country. “How often do you do it?” And then it says, “In 150 characters, tell us about this.”

And the students often have a really hard time formulating that, either because they have so much to say or because they say so little. So we’ve created an iterative prompt that asks the student a series of questions to help formulate what a good use of those 150 characters would be.
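
Neill’s actual prompt isn’t published in the article, but a minimal sketch of that kind of iterative, question-by-question prompt might look like the following. The system instructions, model name, and interview questions are illustrative assumptions:

```python
# Hypothetical sketch of an "iterative prompt" for the Common App activities section.
# The instructions below are illustrative, not the prompt described in the interview.
# Assumes the `openai` package and an interactive terminal session.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "You are helping a high school student describe one extracurricular activity for the "
    "Common App. Interview them one question at a time (role, time commitment, impact, "
    "what they're proudest of). After a few answers, draft a description of at most "
    "150 characters and show the character count."
)

def activity_interview() -> None:
    """Run a back-and-forth interview until the student types 'done' or 'quit'."""
    messages = [{"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": "I'm ready to describe my activity."}]
    while True:
        resp = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
        reply = resp.choices[0].message.content
        print(reply)
        messages.append({"role": "assistant", "content": reply})
        answer = input("> ")  # student answers each question in turn
        if answer.lower() in {"quit", "done"}:
            break
        messages.append({"role": "user", "content": answer})

if __name__ == "__main__":
    activity_interview()
```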

Are there any cons to using AI in counseling?

It’s more of a caution, just the idea of student confidentiality: trying to go to great lengths to honor FERPA [the federal Family Educational Rights and Privacy Act] or GDPR [the General Data Protection Regulation in the European Union] or your regional version of those protections, to make sure confidential student information is not being put out there. I strongly recommend that when you work with any tool that might be dealing with confidential information, like a letter of rec, you have clear conversations with your IT department and whoever oversees data protection to make sure that you’re staying within those bounds.

Another con is just that there are so many tools out there. One of the challenges is we don’t have a lot of spare time. We don’t have a lot of time to … go explore and see what other tools are out there that we might use.

I think the only other negative or con in this situation is … that there has been some pretty sloppy language from universities around the use of artificial intelligence, where one particular university stated publicly that students should not be using artificial intelligence at all.

What does that mean? Because on one hand, the student who goes onto Amazon to get an SAT test prep book is using artificial intelligence. So many kids are using things like Grammarly to check grammar and spelling. Are those things not supposed to happen? What I think they mean is, don’t copy and paste. Don’t use it to write the essay. There are many other ways to use it, though, and I’m very curious to see how this develops and how the language becomes more precise.

When it comes to our work, we’ve been very transparent with university reps about what we do. However much skepticism some of them might have at the outset, when they hear about the amount of time we spent writing those [letters of recommendation] the old-fashioned way versus how much time we get back, coupled with the fact that we’re being very intentional about giving that time back to the students … even the most ardent skeptics kind of threw their hands up and said, “This sounds amazing.”

What are some of the equity considerations of using AI in counseling?

I have about 35 to 38 students per grade. I get to know each one of them intimately, know their parents and families.

And then you go to a public school in, say, California, where some of the student-counselor ratios go up to 600 to 1. They don’t know their kids. They can’t know their kids. And as part of the Common App process, there’s a box that we can check that says, I am not writing a letter of recommendation for this student.

You think about those two applications: one where the counselor checked the box and one where the counselor submitted a two-page, comprehensive letter. It’s not to say that the first student is hurt by not having a letter, but the student who has a letter is getting an advantage. There’s a total lack of equity in a system that really rewards a person like me for helping kids who ostensibly already have a leg up.

And then there’s something we run into in Brazil in particular: access to the internet. It is not a foregone conclusion that every kid has access to the internet. And when artificial intelligence is fundamentally or primarily an internet-based tool, if you don’t have the internet … there is an access issue from that perspective as well that I’m worried about.