As we move into a new semester, I want to remind our readers of an important fact: Artificial Intelligence (AI) is not your classmate, nor is it your friend, and it should not be treated as one. AI can help, but unchecked reliance on it erodes critical thinking and risks replacing genuine learning.
The use of AI has increased significantly over the past few years, especially since the release of generative AI tools in 2022 and 2023. AI usage at work has nearly doubled in just two years, according to a Gallup study from June.
This rise illustrates the corporate indoctrination laid upon us by companies like Microsoft and OpenAI and their artificial toolboxes. Mention ChatGPT, and people immediately know what you’re referencing. AI technology has latched its black tendrils onto our culture, integrating itself not only into offices and workspaces, but also into the public eye, masquerading as a trendy fad.
I hate to burst the proverbial bubble, but the invention of AI has already irrevocably changed the reality of our world. And as we continue to feed the mechanized mire, our reliance upon it is pushing us toward real-world consequences that once belonged to science fiction. I’ll forgo the whole “Skynet is real” metaphor in favor of words that ask you to reflect on your part in the grander story.
By nature, we are scholars. The blessing of free will permits us to seek the unknown until it becomes known, to better ourselves and the world around us. That is why we must retain our scholarly drive and refuse to be corrupted by the murky ooze that capitalism attempts to feed us under the guise of intellectual assistance.
In May, researchers at MIT surveyed 319 participants on how generative AI affected their critical thinking, asking: When and how do workers perceive they use critical thinking with AI? What factors influence that effort?
In the study, a large majority of participants reported reduced perceived effort in various cognitive activities when using AI. That is to say, the more these participants used AI, the less they seemed to exercise their own thinking and rely on their own skills.
The study also found that higher confidence in the AI tools led to less critical thinking overall; participants felt they didn’t need to think for themselves because a machine was doing it for them. Critical thinking is a learned skill that must be practiced, not taken for granted.
Additionally, participants with stronger confidence in their own abilities put more effort into critical thinking, while those with less self-confidence put in less.
So how can we apply this to our own studies? Sure, you could let AI write a few papers for you. We all juggle jobs, bills and families. Isn’t this tool supposed to make life easier?
At Palomar, a Nectir pilot program found that nearly three-quarters of students reported improved learning with an AI assistant. That can’t be ignored. But as encouraging as that sounds, research also shows students may rely on AI at the cost of their own critical thinking skills.
While I acknowledge that AI has its uses and benefits as a tool, I urge users not to grow too comfortable with their virtual assistants.
AI tools were not created with students’ best interests at heart. Yes, they may ease workloads in the short term, but they also risk replacing jobs and weakening study habits. Instead of turning toward a soulless study-buddy, we should be asking our community for advice. Hearing other voices can spark new ideas and lead to deeper thinking and understanding.
AI is a tool to be used, not a teammate to rely on. Let’s not forget we are the real scholars.
