---
title: "ChatGPT Is Worse For Students Than Stack Overflow"
date: 2023-10-05T10:11:00+02:00
categories:
- education
tags:
- AI
---
Our first-year engineering students have been exposed to higher education for a good two weeks now, and as is typical for each "virgin" semester, casualties are slowly but surely starting to appear. This year, however, something irritated me more than usual as a teacher: how quickly students give up.
Focus has been a worsening problem for decades now, thanks to the invention of smartphones and social media. Even though I tell myself not to pay too much attention to the slackers in the back rows who do nothing but scroll on their phones, I always return home disappointed and annoyed. This time even more so, thanks to the speed at which students whip up answers by simply asking ChatGPT to do their Python exercise for them.
When I confront them with this, they quip: "How else should I learn this?" Perhaps by trying the exercise yourself? Perhaps by getting used to failing, asking for help, studying the given material, and retrying? It dawned on me that this phenomenon isn't exactly new: high school students regularly dare to outsource (and even pay for) their work, which is easy to do through the internet. It can be as simple as asking the Stack Overflow community to solve it for you.
But the chances of your Stack Overflow question being answered as soon as it's posted are low. That means students usually won't get the answer handed to them within the time frame of my class. This dynamic changed completely with the release of ChatGPT---which, again to my frustration, first-year students are more than happy to exploit.
I know a lot of papers are being published concluding that we'll have to accept that ChatGPT is here and that our way of teaching should change because of it. I disagree. That's like saying "okay, people can't stop scrolling on their phones, I guess we'll shorten and cut up our content 'cause they can't handle it anymore"---which, to an extent, is exactly what's happening in education, by the way. I admit that restricting usage usually has the opposite effect, but the direction technology is heading right now only gives me headaches.
Here's a simple example. Given a string in Python, write a function that converts each lowercase character to uppercase and the other way around:
```python
sentence = "Hi There Sup?"
expected = "hI tHERE sUP?"
```
If you were to ask this using ChatGPT, I've seen it produce something like:
```python
def convert(sentence):
    result = []
    for c in sentence:
        if c.isupper():
            result.append(c.lower())
        else:
            result.append(c.upper())
    return "".join(result)
```
Then the student has the guts to ask me to explain this code, since they don't understand it---especially the weird `join()` call that, of course, we haven't covered yet. Meanwhile, the simple solution we expect students to come up with is:
```python
def convert(sentence):
    result = ""
    for c in sentence:
        if c.isupper():
            result = result + c.lower()
        else:
            result = result + c.upper()
    return result
```
This exercise is just there to get to know `def` and `for`; students should already be familiar with strings and how to append to them from a previous class. To make matters worse, I was thoroughly confused after seeing `"".join()`, since I regularly use that trick myself, and instead of explaining the simple version, I explained the hard one. The student then proceeded to yell: "See? ChatGPT!" Right. Time for my blood pressure pills.
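For what it's worth, the `join()` trick is mostly a matter of idiom and performance: appending characters to a list and joining once at the end avoids rebuilding a brand-new string on every iteration. Here's a minimal sketch of that version, plus (as an aside that's obviously off-limits in week two) Python's built-in `str.swapcase()`, which solves the whole exercise in one call:
```python
def convert(sentence):
    # collect the swapped characters in a list and glue them together
    # once at the end, instead of rebuilding the string on every iteration
    result = []
    for c in sentence:
        result.append(c.lower() if c.isupper() else c.upper())
    return "".join(result)

sentence = "Hi There Sup?"
assert convert(sentence) == "hI tHERE sUP?"
# the standard library already covers this exact case:
assert sentence.swapcase() == "hI tHERE sUP?"
```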
If we as a society keep on creating and accepting tools like ChatGPT (and smartphones...) that can so clearly be exploited in all manner of ways, then I wonder what kind of world our daughter is going to live in thirty or forty years from now. These first-year students are probably proud of their ability to copy-paste and quickly "finish" the assignment, but in reality:
- They have no idea how good or how bad the proposed solution is, since they're still lacking critical thinking skills and programming experience;
- They have no idea that some proposed solutions are plain wrong, yet accept most or all of them anyway;
- They have no idea which data the model was trained on, and how unethically it was assembled;
- They have no idea that anything they put in will also be gobbled up and used in some way;
- They have no idea that they gave away personal data such as their mobile number to create an OpenAI account, and how it will undoubtedly be misused in the future;
- The worst part: if you try to explain the points raised in this post, they won't listen.
The problem is that teaching ethics, critical thinking, and the other key abilities that provide the much-needed perspective to use these tools responsibly simply cannot be done within the first two semesters of higher education, which means entry courses such as the Python one will always be susceptible to copy-paste bragging. Sure, exams without access to these tools can easily expose these students, but does that mean ChatGPT will end up partially responsible for an even higher drop-out rate?
I increasingly feel like an old man yelling at clouds...