The thread about Honor Council/ChatGPT accusation made me wonder about this and I didn’t want to hijack that thread.
I know ChatGPT is relatively new. Neither S23's (spring 2023 community college DE) nor S24's (full-year online high school) teachers have addressed it at all with respect to what they see as legitimate uses or honor code violations. I'm hopeful that going forward teachers/professors will be very clear about what they consider acceptable and not. But until then, what do you think?
Some things are obvious:
Have ChatGPT write an essay and turn it in as your own – NOT OKAY
Have ChatGPT write an essay and change the language in it to be your own – NOT OKAY
but I feel like others are a bit more ambiguous…
Tell ChatGPT your thesis and have it suggest supporting points, which you then find evidence for on your own – probably NOT OKAY
Tell ChatGPT the general area you need to write a paper about, and ask it to help you brainstorm topics – maybe okay? (Both my sons have used it this way this semester. Neither actually used any of the ideas, but it helped them get their own ideas flowing.)
What do you think? For what purposes do you think it is okay to use ChatGPT for papers?
Not for papers specifically, but this is what my son says is the norm in his programming class:
Have ChatGPT help you debug code, if you get stuck - OKAY
Have ChatGPT debug all your code - not advisable, you should try first to do it yourself
Have ChatGPT explain what some code does, if you get stuck - OKAY
Have ChatGPT explain all the code - not advisable, you should try first to do it yourself
Have ChatGPT write the code - NOT OKAY (but a lot of kids still do it)
I think it's okay to get debugging help from ChatGPT - as long as you then spend time and effort understanding what the issues are.
Otherwise you're going to find yourself drawing a blank at an in-person job interview when your interviewer shows you a piece of code and asks you what's wrong with it.
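For example (totally made-up snippet, not from any class), an interviewer might put something like this in front of you and ask what's wrong with it:

    # find the largest value in a list - looks fine at a glance
    def largest(nums):
        biggest = 0               # bug: wrong starting value if every number is negative
        for n in nums:
            if n > biggest:
                biggest = n
        return biggest

    print(largest([-5, -2, -9]))  # prints 0, which isn't even in the list

If you've only ever pasted errors into ChatGPT and accepted whatever fix came back, spotting something like that cold is exactly where you'll blank.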
Yeah, I agree. I am just reporting what my son says about what students consider to be the norm in his class. Personally, I’m a bit skeptical about using the ChatGPT “training wheels” so much.
On the other hand, my husband says everyone at work uses ChatGPT constantly (he develops software at a big Bay Area tech company) and that good use of AI tools like ChatGPT is now an essential skill… especially for things like test cases, comments in code, debugging, and explaining code.
I would prohibit it while kids are developing foundational writing skills in K-12. Afterwards? It's too hard to monitor; require in-class assessments only. I fully expect college essays to be abandoned in the near future as AOs get overwhelmed with such essays.
Our son leads a development team and doesn't want anyone wasting time coding what can be produced effortlessly. I posted previously that his developers are now freed from mundane coding tasks and can focus on solving problems faster and more efficiently. They can review Chat code and, for anything not quite on target, feed the alternative in to ensure that that particular interpretation does not recur. This type of iterative learning improves both Chat and the user, which is how machine learning is supposed to work.
Currently, he is attending the three-month captain's course to earn his next rank. His team, with the approval of the instructor, is using ChatGPT-4 to complete all of their homework. They feed in the exact problems (strategic battlefield, resource allocation, communications, and logistics issues, etc.) to see what the program comes up with. They have been stunned by the acceptability of the output, often requiring little modification. Army Cyber is very interested/involved in AI capability and is developing and experimenting with it in many ways. What AI can do, let it do, and use manpower to teach it to do better.
The conundrum is that in order to improve AI, people need to have the skills to evaluate what it produces and modify as necessary, kind of a chicken-and-egg dilemma. So it is still very important for students to be able to do original work and hone their problem solving skills to avoid using AI to create cement lifejackets.
How schools harness this beast will be interesting to watch. I don’t believe policies or honor codes to define or prevent plagiarism are the way to go. I think we need to embrace the power and accessibility of AI tools and teach students how to use them to best advantage. And, perhaps, altogether change the way we teach and what we expect students to learn.
If it isn't cheating to ask a person to help you, it is OK to use an AI bot.
Similarly, if you would be required to acknowledge a person for this help, you are required to acknowledge the use of an AI bot.
There are many wealthy kids who are using a person to write their essays (there was even an ad for this type of job, which somebody shared here). So now any kid with decent computer access can get the same sort of service.
My daughter’s fourth grade class has a number of students who do not/will not write (or learn to physically write), and feel that watching a video replaces reading a book. So, it’s already happening. (Not even getting to the issues she has with some of them and math…)
You're absolutely right. If students can submit, and are submitting, essays that they did not write, and there is no way to tell, it becomes a pointless exercise. If colleges want to require essays, they need to have ways to make sure that the essay is written by the student. I do not know how. Maybe have the student write something in a proctored room, with a series of prompts that change each year?
Yes, there is an AI checker, and a kid we know already got an honor code violation for submitting an essay that substantially used AI. I would think it would need to be in the category of "have it write the essay for you" to be picked up, though.
I feel our best option is a private school that I found for my youngest child that still teaches handwriting and requires the kids to write their answers in full sentences and paragraphs, often by hand. The school issues computers preloaded with software. They do enough writing in school in front of the teacher that she can tell if a student is using assistive technology.
I don’t know if it is permitted for some students with accommodations, but my child who does not receive accommodations is not permitted to use assistive writing technology of any kind in school.
I still talk to them about the importance of doing the writing themselves and never using assistive technology other than search engines and reputable websites. And I am continually having the “what’s a reputable website” discussion.
Haven’t AOs repeatedly insisted that they can tell the voices of 17-year-olds in their essays? AFAIK, ChatGPT and the like aren’t trained to mimic the voices of any particular age group.