Last Friday I attended a Zoom panel on ChatGPT hosted by my school's writing center. Signing up for ChatGPT requires like five levels of identity verification, so I haven't fucked around with it myself, but here's what I managed to pick up from the panel.
So first, ChatGPT can't actually, like, write papers. The sentences it generates are grammatically correct, but they don't connect to one another in a way that makes sense to a human being. If you skim what the software generates without paying too much attention, it reads as legible, but the flaws show up in the connective tissue between sentences and paragraphs. ChatGPT also generates citations that don't exist. In a broad field like English or History, you could maybe get away with this, but not so much in a more specialized field. And on top of that, ChatGPT can't write lab reports or other documents that summarize numerical data.
In other words, if you want to use ChatGPT to write a paper, it's going to take about as much time as writing the paper from scratch. According to the writing center director, most undergrads who don't write their own papers have a friend or family member write them instead, and that's difficult to catch precisely because the same human being is writing all of their papers. So basically, Penn kids are too smart to use ChatGPT to cheat.
That being said, ChatGPT can do three interesting things. First, you can use it as something like a style+grammar check for something you've already written. Second, you can plug in something that someone else has written in order to make their writing more comprehensible (which can be useful when trying to make your way through an academic article, for instance). And third, you can ask ChatGPT to write a simplified-language Wikipedia article for you about an unfamiliar topic.
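(For anyone who'd rather poke at this through the API than the chat window, here's a rough sketch of that first proofreading use. I haven't run this myself; it assumes OpenAI's official Python client and an API key set in your environment, and the model name and prompt wording are just placeholders, not anything the panel recommended.)

```python
# Rough sketch (untested): using ChatGPT as a grammar/style pass over a draft.
# Assumes the official "openai" Python package (v1+) and an API key in the
# OPENAI_API_KEY environment variable; model name and prompts are placeholders.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

draft = "Last friday I attended a zoom panel on ChatGPT hosted by my schools writing center."

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumed model name; swap in whatever is current
    messages=[
        {
            "role": "system",
            "content": (
                "You are a copy editor. Fix grammar, spelling, and awkward "
                "phrasing, but keep the author's voice. Return only the revised text."
            ),
        },
        {"role": "user", "content": draft},
    ],
)

print(response.choices[0].message.content)
```

The same pattern presumably covers the second use too: swap the system prompt for something like "rephrase this in plain language" and paste in the dense academic paragraph instead of your own draft.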
The director's caveat about using ChatGPT is that giving it prompts and "training" it is a discrete skill that takes time and dedication to develop. So I guess it's like AI image generation, where a prompt like "Dark Souls castle labyrinth" will get you something cool and interesting, while "a hand holding a flower" will return pixelated garbage.
I've said this before, but I think AI text generators are already being used fairly extensively, especially for fake Amazon reviews and placeholder pages on larger websites that exist solely for search engine optimization. Still, all the recent articles about ChatGPT being indistinguishable from undergraduate writing don't really reflect what the software is actually capable of at the moment.