ChatGPT is in the classroom; teachers want kids to think on their own
Technology in classrooms is rarely a case of simple addition.
The calculator, for example, was welcomed into some K-12 classrooms in the 1990s even as some states banned its use on standardized tests. Laptops, now handed out to millions of students as they start school, were once met with resistance from educators who questioned how much schools should invest in computers at all.
Now, the debate has turned to what students can access with those computers. Among the biggest questions facing teachers and administrators: What should they do about ChatGPT?
“I think that calculators are more similar to spellcheck than this,” said Edgerton High School English teacher Sue White. “This feels different because people are tricked into thinking that it can think.”
ChatGPT, which can be accessed online, allows users to ask a question or submit a prompt and receive an artificial intelligence-generated response. The language model, launched in November 2022, is “capable of generating human-like text based on context and past conversations,” according to its developer, OpenAI.
Mars Subola, who teaches English at Madison East High School, described themself as somewhat naive about the artificial intelligence chatbot throughout the last school year. That changed in the year’s final weeks, as Subola noticed a “huge influx of writing that just didn’t seem like my kids’ writing.”
Subola discovered that more than a dozen students had ChatGPT write their papers for them.
“It just kind of felt like … they didn’t have the trust in themselves or in myself that we could create a solution to whatever problem they were facing,” Subola said. “They didn’t have to resort to ChatGPT and it just felt like these are really, really, really talented, intelligent, skilled writers who found a way out, I guess.”
The growing prevalence of AI writing bots and their increasing accessibility online raise questions for educators from elementary school through college about how to make them an educational asset rather than a shortcut, especially when it comes to writing.

Pretending it doesn’t exist isn’t an option, nor would it serve students who will eventually enter a workforce where they might be expected to use ChatGPT or other AI tools as part of their daily roles.
“It’s going to have to be teaching alongside AI, not ignoring — that’s just not possible,” said another Edgerton English teacher, Jess Eichstedt, who has plenty of apprehension about the technology.
Even with the understanding that they will have to adapt, Eichstedt and other educators worry about the long-term consequences of students becoming overly reliant on the technology rather than developing their own skills and voices as writers, consequences that range from hampered job prospects to the loss of some of the “humanity” that makes writing an art.
This spring, the Madison Metropolitan School District tried to buy time for teachers as they adapted, banning access to ChatGPT on district devices, said Deputy Superintendent TJ McCray, who led the technology department before moving into his new role in June.
It’s not a long-term solution, McCray said, because students with other devices will still have access, creating an inequity. That said, he’s not sure when MMSD will remove the ban.
“First and foremost we need to make sure our teachers felt comfortable with understanding what it is,” McCray said, noting that he wants “our teachers to just begin to embrace” the opportunities ChatGPT can offer in classrooms.
What matters most to teachers interviewed by the Cap Times and others confronting the use of AI in classrooms is conveying to students that sometimes the process of writing is more important than the final product.
“A lot of the writing that we work on with teachers and students is, writing is thinking, writing is reflection, writing is metacognition,” said Bryn Orum of the Greater Madison Writing Project, a University of Wisconsin-Madison program that offers professional development for teachers and writing camps for youths. “The dirty secret is often it’s not really about the writing; the writing is a tool to know what you know, to sort out what you know, to organize your thoughts, to look back on your thinking and understand or to express yourself to yourself or to someone else.
“That really will always be a human process that is necessary in educational spaces, but in lots of other places as well; and at the same time, there’s a lot of writing that if it can be done by a robot, maybe it should be.”
‘No one understands what (AI) does’
Robin Zebrowski has been working in the AI field since the 1990s, eventually earning a master’s degree in a cognitive science program that was “basically a theoretical artificial intelligence program.”
Now a professor of cognitive science at Beloit College, Zebrowski called it “a horror show” to see AI entering the mainstream conversation in the way it has over the past year.
“It’s like when you try and tell everybody to listen to your favorite band and everyone’s like, ‘No, I don’t want to do that,’ and then everyone suddenly loves it and they’re in the top 40, and you’re like, ‘Oh, I’m so sick of this,’” she said.
She’s been disappointed with a lack of work explaining “how the technology actually does what it does and what you can reasonably expect from it,” seeing most articles instead focused on “people saying, ‘Here we are, we’re about to change the world,’ or, ‘Oh, my God, this shouldn’t exist, let’s bury it in the dirt.’”
“The biggest problem is that no one understands what this system does, and therefore they expect things that are not what it was meant to do,” Zebrowski said.
“If you’re asking (AI) questions about something factual, it’s not answering with concepts of those things, it is simply spitting out the statistically most common word that came after that phrase in its training set,” she explained. “Literally just doing math and saying, ‘Oh, I’ve seen this phrase before, I should spit this next word out.’ That’s it.
“So I’m not that worried about my students using it to write papers because they would be bad papers.”
Explaining how the system works was the first order of business at a recent symposium for teachers, organized by the Greater Madison Writing Project and hosted by Madison College, called “How To Teach Writing in the Age of ChatGPT.”
UW-Madison computer sciences professor Jerry Zhu explained how AI language models work, illustrating the increasing complexity with which chatbots are able to operate. ChatGPT and similar tools are built on “conditional probabilities”: predictions of which letter, word or punctuation mark is most likely to come next, based on patterns in the enormous body of text the model was trained on.
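Zhu’s “conditional probabilities” can be sketched in a few lines of code. The toy model below is a deliberate simplification, not how ChatGPT actually works; real chatbots use large neural networks trained on vastly more text, and they condition on far more than the previous word. Still, it shows the basic mechanism: count how often each word follows another in some training text, then generate a continuation one statistically likely word at a time.

```python
import random
from collections import Counter, defaultdict

# Tiny training text; real models learn from enormous corpora.
text = "the cat sat on the mat and the cat ran to the mat"
words = text.split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for prev, nxt in zip(words, words[1:]):
    following[prev][nxt] += 1

# Generate a continuation: repeatedly sample the next word in
# proportion to how often it followed the current word in training.
word = "the"
sentence = [word]
for _ in range(8):
    counts = following[word]
    if not counts:  # dead end: this word was never followed by anything
        break
    choices, weights = zip(*counts.items())
    word = random.choices(choices, weights=weights)[0]
    sentence.append(word)

print(" ".join(sentence))  # e.g. "the cat sat on the mat and the cat"
```

The loop at the bottom (pick a statistically likely next word, append it, repeat) is the process Zhu and Zebrowski described; scale and sophistication, not a different kind of step, are what separate this sketch from ChatGPT.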
Edgerton’s Eichstedt, who attended the symposium with her colleague White, said the session showed that “the technology is incredibly impressive.”
But, she said, its educational limits still exist: “We want to continue to encourage original thought. This is a sampling of what’s already in existence.
“We have to monitor that all the time, making sure students aren’t just feeding back what we tell them, either,” Eichstedt said. “AI aside, I don’t want them to just regurgitate what I’ve taught them; I want them to have insight beyond what we’ve already discussed.”

The OpenAI logo is seen on a mobile phone in front of a computer screen displaying ChatGPT examples and limitations.
‘Are (students) going to be able to think?’
Technology has long forced educators to adjust.
Computer labs allowed students to explore the internet and write papers in ways that weren’t possible years earlier, when schools had just one or two desktops. Now, most students have access to their own device to do the same wherever they are, in school or out.
AI might simply be the next step in that ongoing evolution. Instead of debating whether ChatGPT should be used, educators should be figuring out the rules and considering AI use on a “continuum,” said Brian Bartel, a technology integration specialist for kindergarten through sixth grade in the Appleton Area School District.
“Plugging in a prompt, copying and pasting the entire thing, we don’t want that and I think we can agree that that’s not something we want to do and we don’t want students to do that,” Bartel said. “On the other side of that is not using any help at all — not using the internet, for example, not using a tutor, not using someone to proofread your paper, completely doing it on your own is that other end of it.
“Maybe somewhere in the middle is where we want students to be: to generate ideas, to perhaps go over the grammar piece to suggest improvements to your writing.”
Generating ideas is a troubling step for White, who called writing “expression of thought.”
“That concern for me is not just the ethics of it, but just the humanity of it,” she said. “Are (students) going to be able to think?”
The technology is forcing some to rethink how they assess student learning. Assessment and assignment design received significant attention at the Writing Project symposium in July.
“We’re teaching the students how to think, we’re teaching students how to reason through and give evidence that makes sense to the audience they’re speaking to, to really consider the rhetorical situation in which they find themselves rather than just regurgitate back an answer, which a bot is just as good at as anybody else,” said Sarah Johnson, director of the Madison College Writing Center.

Emily Hall, the Writing Across the Curriculum director at UW-Madison, talks about some of the concerns surrounding use of artificial intelligence and ChatGPT in education during her keynote presentation at a ChatGPT symposium at Madison College Goodman South Campus.
Focused on learning, not catching cheaters
A significant point of discussion among educators has been how to detect the use of AI and catch students cheating.
Some software programs claim to detect writing produced by ChatGPT, but experts have suggested those tools are unreliable and will quickly become outdated as the technology advances. Emily Hall, the Writing Across the Curriculum director at UW-Madison, said her “No. 1 concern was the sort of culture of suspicion that (ChatGPT) might inaugurate.”
Instead, Hall told educators at the July symposium, they need to develop a trusting relationship with their students on the use of AI. She believes talking with students about what exactly they are supposed to learn from an assignment could build trust and help students feel more motivated.
“I know it’s idealistic, but my goal is to have people feel like they’re on the same team, that we are working together,” Hall said in an interview with the Cap Times. “As your instructor I want you to learn and for this assignment, I think the best way for you to learn this is by working through it on your own and developing ideas, and here’s why.”
The ultimate solution could vary from classroom to classroom.
“That’s where we’re trying to wrap our heads around, what are the systems we need to have in place to make sure that students are actually showcasing their learning, and not someone else’s or not a machine’s learning?” Appleton’s Bartel told the Cap Times earlier this year. “Teachers, first of all, need to have discussions with students about this as a tool and maybe model it to students, as well.”
While most teachers support the principle of classroom autonomy, some are seeking guidance from school leadership as they navigate this new technology, McCray said. Some teachers who took part in the initial learning opportunities the Madison school district offered this spring asked for “a policy to tell us exactly should we use it or not.”
“I don’t believe that’s exactly what we want,” McCray said. “Instead, we need to be able to allow teachers to really enhance it. ChatGPT is just like any other instructional tool that exists, and it needs to be up to the teachers to determine the way in which they utilize it.”
AI already is a workforce skill
Some see the growth of AI as inevitable.
That means that knowing how to use tools like ChatGPT, and whatever develops after it, will be a key skill for students as they enter the workforce.
“If we think about being able to do it all on their own, that’s not the future they’re going into, and most companies won’t want them to be there, either,” Bartel said. “They want their future employees to be able to leverage the AI systems to maximize productivity.”
In some professions, writing bots already are used daily.
“I have so many different friends who are like, ‘We use AI on a regular basis because it’s cutting down the amount of time that we have to do things,’” McCray, the Madison deputy superintendent, said. “We need to make sure, are we truly preparing our students for every capability that exists beyond our classrooms? Technology is the thing that’s doing it, so we have to embrace it.”
Laura Grossenbacher, director of the Technical Communication Program in UW-Madison’s engineering school, teaches writing to engineering students and thinks often about how students in high school and college need to be familiar with the latest tools professional firms are using, with a caveat.
“I hear that in engineering firms they’re already using ChatGPT to generate these reports and people have adopted this wholesale, and I think our students might be at a disadvantage if they don’t know something about using these machines,” Grossenbacher, who presented at the July symposium, said. “At the same time, I can’t help but feel like the pendulum may swing at some point and people will realize, ‘No, the best people we have, the folks we want in leadership positions don’t need a machine; they can make good decisions and think and write on their own.’”

Teachers take notes during Emily Hall’s keynote presentation at the Teaching Writing in the Age of ChatGPT symposium at Madison College Goodman South Campus.
ChatGPT could help English learners
The technology could also help specific populations who otherwise struggle with English.
For students — and adults — whose native language is something other than English, educators see an opportunity to level the playing field in some cases. ChatGPT could help them write stronger resumes or cover letters when seeking a job, for example. On the other hand, some worry those sorts of short-term benefits could stunt long-term learning.
“I see that would be helpful, but I also see that, will they ever learn English as well as they should?” White said.
Another side of the equity conversation around ChatGPT involves access and understanding. Different standards between schools, or even between classrooms within a single school, could leave students with uneven understandings of the technology, hampering them in college or at work.
“I worry that with the onset of these different tools, there’ll be the kids of computer scientists who have all the cutting edge plugins and tools and other kids won’t and might not have access,” UW-Madison’s Hall said.
Bot surveillance of students?
Educators acknowledged some ethical concerns around AI use in schools.
For Appleton’s Bartel, those concerns are in line with broader questions about how technology is used to track students throughout their education. Surveillance, he said, is the problem with generative AI “that keeps me up at night.”
“Technology’s done a wonderful thing for schools, but it’s also created a digital dossier for students that follows them everywhere,” Bartel said. “Students can’t make mistakes in the way they used to and I want to highlight that because if anywhere, schools should be a place where students can make a mistake that doesn’t follow them for the rest of their lives.
“If we are building artificial intelligence into systems that can analyze, observe, seek out information in ways that we don’t even know it’s possible, we have to be careful that that doesn’t really determine a student’s path in ways that we just can’t imagine right now.”
In addition, there are concerns about the privacy of schoolwork. Some wonder whether a teacher plugging a student’s paper into an AI detector violates some expectation or law, given that some detectors might retain data from the academic work a student created.
Beloit’s Zebrowski said “the ethical questions should be very forefront in lots of people’s minds,” but that many people are unaware of what those questions are, from the environmental costs of the computing used to build and run the technology to whether it’s actually beneficial for students.
“I do think that the question of deception should be on people’s minds, because if you don’t know what the technology does, you’re likely to be deceived into thinking it does more than it does,” Zebrowski said. “If kids are using it, they’re going to absolutely think it’s a person.”
Grossenbacher, from the UW engineering program, emphasized the importance of recognizing what humans bring to the table. For example, she said, two people can summarize a reading or a discussion in different ways.
“I don’t like to think about human beings as AIs, but we have our own training data that has essentially trained us to think about things in certain ways that are creative and unique to us,” Grossenbacher said. “You take that away and you hand that over to a machine, then you’re relying on whatever value system it has scraped from the web.”
For Madison East’s Subola, the stakes are simple and significant: “I fear that with more and more AI usage, especially in areas like English and the humanities, you’re going to lose what it means to be human.”
“A big part of what makes us people and special is that we tell stories and we connect it to our personhood and our experiences and that’s how we create community,” Subola said. “If we don’t learn how to communicate effectively, then we all lose our humanity.”