I bitch and moan about GenAI, but I think it’s important to emphasize that it’s not all one thing, and that many applications of these programs can be super useful. Like in language learning, for instance, or in helping researchers in STEM fields organize and present data. Just because some people are evil and stupid and lazy doesn’t mean the technology is “bad” by default.
One of my many problems with the hypocrisy of woke academics is that they put all their time and energy into “calling out” and “critiquing” abstract problems while all but ignoring more specific material problems that they can actually solve. Like sure, we need to resist Islamophobia. Of course we do!! But also, how about paying our Arabic teachers a living wage? Like maybe more than $4k a class, a salary that hasn’t been raised in more than twenty years?
With this in mind, I started attending (via Zoom) labor organization meetings. Something like 80% of my university’s faculty is severely underpaid, not to mention employed on extremely precarious annual contracts. The current political climate isn’t helping. I’m not convinced that I can actually be useful to anyone, but it’s been interesting to follow the process.
In the most recent meeting, a union leader from another school said that the university will try to frustrate organizing efforts by dividing the faculty of different programs along ideological lines. One example she gave involved ChatGPT: most Humanities people hate it and want it banned, while STEM people tend to be more curious and permissive.
For me at least, that really hammered home the point that the “enemy” isn’t necessarily the technology itself. Rather, it’s how institutions use the technology to exacerbate pre-existing inequalities related to labor.
no subject
Date: 2025-07-03 05:59 pm (UTC)

no subject
Date: 2025-07-05 12:24 pm (UTC)

I spent about a week driving around in Europe this past May. I can usually get by when it comes to speaking, but reading can be a challenge. I therefore spent a lot of time experimenting with Google Lens, which lets you take a picture of something (a menu, for instance) with your smartphone and then instantly overlays the text with a (surprisingly very good!!) English translation. It's really amazing, like Star Trek in real life.
And it's wild to me that we can take such a literally fantastic set of technologies and be like: How do we use this to exacerbate pre-existing income inequalities and destroy the middle class?
Like the utopian potential is definitely there! But, as you said, part of the responsibility (and the joy, I would think) of developing these technologies is putting your money where your mouth is and helping to guide their use and implementation in the right direction.