“When readers discovered the truth about how the book was created, many were hurt. I deeply regret that, but it was necessary,” he says.
WIRED interviewed Colamedici in a conversation that explored the nuances of his project.
This interview has been edited for length and clarity.
WIRED: What was the inspiration for the philosophical experiment?
Andrea Colamedici: First of all, I teach prompt thinking at the European Institute of Design and I lead a research project on artificial intelligence and thought systems at the University of Foggia. Working with my students, I realized that they were using ChatGPT in the worst possible way: to copy from it. I observed that they were losing an understanding of life by relying on AI, which is alarming, because we live in an era where we have access to an ocean of knowledge, but we don’t know what to do with it. I’d often warn them: “You can get good grades, even build a great career using ChatGPT to cheat, but you’ll become empty.” I have trained professors from several Italian universities and many ask me: “When can I stop learning how to use ChatGPT?” The answer is never. It’s not about completing an education in AI, but about how you learn while using it.
We must keep our curiosity alive while using this tool correctly and teaching it to act how we want it to. It all starts from a crucial distinction: There is information that makes you passive, that erodes your ability to think over time, and there is information that challenges you, that makes you smarter by pushing you beyond your limits. This is how we should use AI: as an interlocutor that helps us think differently. Otherwise, we won’t understand that these tools are designed by big tech companies that impose a certain ideology. They choose the data, the connections among it, and, above all, they treat us as customers to be satisfied. If we use AI this way, it will only confirm our biases. We will think we are right, but in reality we will not be thinking; we will be digitally embraced. We can’t afford this numbness. This was the starting point of the book. The second challenge was how to describe what is happening now. For Gilles Deleuze, philosophy is the ability to create concepts, and today we need new ones to understand our reality. Without them, we are lost. Just look at Trump’s Gaza video, generated by AI, or the provocations of figures like Musk. Without solid conceptual tools, we are shipwrecked. A good philosopher creates concepts that are like keys allowing us to understand the world.
What was your goal with the new book?
The book seeks to do three things: to help readers become AI literate, to invent a new concept for this era, and to be theoretical and practical at the same time. When readers discovered the truth about how the book was created, many were hurt. I deeply regret that, but it was necessary. Some people have said, “I wish this author existed.” Well, he doesn’t. We must understand that we build our own narratives. If we don’t, the far right will monopolize the narratives, create myths, and we will spend our lives fact-checking while they write history. We can’t allow that to happen.
How did you use AI to help you write this philosophical essay?
I want to clarify that AI didn’t write the essay. Yes, I used artificial intelligence, but not in a traditional way. I developed a method that I teach at the European Institute of Design, based on creating opposition. It’s a way of thinking and of using machine learning in an antagonistic way. I didn’t ask the machine to write for me; instead, I generated ideas and then used GPT and Claude to critique them, to give me perspectives on what I had written. Everything written in the book is mine. Artificial intelligence is a tool that we must learn to use, because if we misuse it (and “misuse” includes treating it as a kind of oracle, asking it to “tell me the answer to the world’s questions; explain to me why I exist”), then we lose our ability to think. We become stupid. Nam June Paik, a great artist of the 1990s, said: “I use technology in order to hate it properly.” And that is what we must do: understand it, because if we don’t, it will use us. AI will become the tool that big tech uses to control us and manipulate us. We must learn to use these tools correctly; otherwise, we’ll be facing a serious problem.