As Halloween creeps closer, the shadows on campus aren’t just from carved pumpkins or fog machines; they’re also digital. Across the university, students and professors are talking about something far more chilling: the ghostly rise of artificial intelligence (AI). Most professors have already placed strict limits on AI in their syllabi, allowing its use only for checking sentence structure and grammar. For many, the rule isn’t just about academic integrity; it’s about keeping human creativity alive in an age when computers are becoming too good at imitation.
The “spookiness” of AI hit new heights this month when Martin Luther King Jr.’s estate intervened after strange, disrespectful AI-generated videos of King began circulating online. OpenAI was forced to block users of Sora 2, its new video model, from generating videos of King after his family condemned the viral clips for distorting his legacy.
In a statement, the estate said, “Public figures and their families should have control over how their likeness is used, especially when technology can easily twist or disrespect their image.”
His daughter, Bernice King, added a more personal plea: “I concur concerning my father. Please stop.” She was echoing Zelda Williams, Robin Williams’ daughter, who publicly accused creators of “tarnishing the legacies of real people” and turning real lives into “disgusting, over-processed hotdogs.”
At the same time, the trend isn’t isolated to King. On TikTok and other social platforms, AI-driven clips featuring the likenesses of historical and pop-culture figures, including Malcolm X, Michael Jackson and Robin Williams, are exploding in popularity. Users create short loops of these figures in unusual, sometimes absurd contexts: Malcolm X dancing in a club, Michael Jackson giving a lecture on how to dab or King rhyming rap lyrics. The content is viral and oddly entertaining, but it is also ethically wrong.
This chilling development connects directly to the concern many professors at the university express about AI in the classroom. If technology can twist a revered civil rights leader’s image into something fake and demeaning, it’s not hard to imagine how it could also misrepresent a student’s voice or learning. The fear isn’t about robots taking over, but about the loss of authenticity. The university’s AI policy is meant to protect originality and truth, the same values that are being tested globally as AI deepfakes haunt digital spaces.
Using AI to write an essay might save time, but it also risks replacing a student’s unique perspective with something hollow. Still, the fascination remains: students joke about AI ghosts haunting their essays or whispering answers during late-night study sessions. It’s all fun and games until someone realizes those same algorithms are generating eerily realistic versions of people who never consented to appear. That’s the real “spookiness” of AI: not that it’s alive, but that it makes the unreal look alive and makes truth harder to tell from fiction.
As the Halloween moon rises over campus, the message is clear: keep your creativity human. The ghost in the computer is real, but so is your own voice. So enjoy your costumes, your candy and the “haunt” of AI, but steer clear of that ghost. Let’s keep what’s real… real.
Happy Halloween, Chargers, and beware the ghostly glow of AI.
