
“When readers discovered the truth about how the book was created, many felt hurt. I am deeply sorry, but it was necessary,” Andrea Colamedici said in a conversation with WIRED exploring the meaning of his project.
This interview has been edited for length and clarity.
WIRED: What was the inspiration for this philosophical experiment?
Andrea Colamedici: First, some context: I teach prompt thinking at the Istituto Europeo di Design and I lead a research project on artificial intelligence and thought systems at the University of Foggia. Working with my students, I realized that they were using ChatGPT in the worst possible way: to copy from it. I noticed they were losing their engagement with life by depending on AI, which is worrying, because we live in an era in which we have access to an ocean of knowledge, yet we don't know what to do with it. I often warned them: “You can get a good grade, even by using ChatGPT to cheat, but you will remain empty.” I have trained professors at several Italian universities, and many have asked me: “When can I stop learning how to use ChatGPT?” The answer is never. It is not about completing some course of AI education; it is about how you learn while using it.
We must preserve our curiosity when using this tool, handling it properly and teaching it how we want it to work. It all begins with an important distinction: there is information that makes you passive, that erodes your ability to think over time, and there is information that challenges you, that makes you smarter by pushing you beyond your limits. That is how we should use AI: as an interlocutor that helps us think differently. Otherwise, we fail to see that these tools are designed by big technology companies that impose a specific ideology. They choose the connections between data and, above all, they treat us as customers to be kept satisfied. If we use AI in this way, it will only confirm our biases. We will think we are right, but in reality we will not be thinking at all; we will be digitally coddled. We cannot afford this delusion. That was the starting point of the book.

The second challenge was to describe what is happening right now. For Gilles Deleuze, philosophy is the art of creating concepts, and today we need new concepts to understand our reality. Without them, we are lost. Just look at Trump's Gaza video, generated with AI, or at the provocations of figures like Musk. Without strong conceptual tools, we are shipwrecked. A good philosophical concept is like a key that allows us to understand the world.
What was your goal with the new book?
The book tried to do three things: to educate readers, to discover a new concept for this era, and to be theoretical and practical at the same time. When readers discovered the truth about how the book was created, many felt hurt. I am deeply sorry, but it was necessary. Some people say, “I wish this author existed.” Well, this author does not exist. We must understand that we make our own narratives. If we do not, the far right will monopolize storytelling, creating myths, and when they write history we will spend our lives fact-checking. We cannot let that happen.
How did you use AI to write this philosophical essay?
I would like to clarify that AI did not write the essay. Yes, I used artificial intelligence, but not in the conventional way. I applied a method that I teach at the Istituto Europeo di Design, based on creating opposition: a way of thinking with machine learning by using it adversarially. I did not ask the machine to write for me; instead, I generated the ideas myself and then used GPT and Claude to critique them, to give me a different perspective on what I had written. Everything written in the book is mine.

Artificial intelligence is a tool that we must learn to use, because if we abuse it, and “abuse” includes treating it as a kind of oracle (“answer my questions about the world; explain why I exist”), we lose the ability to think. We become stupid. Nam June Paik, a great artist of the 1990s, said: “I use technology in order to hate it properly.” And that is what we must do: understand it, because if we do not, it will use us. AI will become the instrument that Big Tech uses to control and manage us. We must learn to use these tools properly; otherwise, we will face a serious problem.