Whether you’re new to Hopkins or months away from graduation, I’d like you to reflect on how many times you’ve heard the phrase “I just care about the grade” in the last month. I can’t even count how many times I’ve heard it in the last week, from my own mouth or from my peers.
As students at a highly competitive school that prides itself on the overwhelming quantity of students who got into Yale last year (I don’t need to drop the number, you know it too), it’s an accepted fact that grades are the most important aspect of our academic lives. No matter how many times teachers tell us not to think too much about them, the truth is that grades have their hands on the steering wheel when it comes to shaping the competitive learning environment Hopkins values so much. But it leaves me questioning how such an environment shapes our dependency on AI as students.
According to a recent study conducted by Education Week, approximately 1 in 5 students report using AI in a way they consider to be outright cheating, while 25% see themselves operating in a “gray area.” While these statistics may not directly apply to AI use on our campus, it's scary to face how pervasive AI use is in our community. I think almost every single student I know has turned to AI at least once in their academic careers in a way that teachers would consider academically dishonest—whether it’s minor homework help or a bigger infraction like plagiarizing an essay.
After sitting through our hour-long academic honesty assembly, I was left with a sense of dissatisfaction with Hopkins’ approach to limiting AI use. My friend summed it up more aptly than I could put into words: “Fix the problem, not the solution.” Rather than attempting to scare AI dependency out of us, it seems more productive to look at it as a bad solution to a problem that we are largely unwilling to solve. Because ultimately, I don’t think the majority of Hopkins students rely on AI because they want to cheat or are just feeling lazy—it’s because of a more deeply rooted outlook that stems from surrounding oneself with constant competition. This gives rise to two larger issues that drive AI dependency: our heavy prioritization of grades over engagement, and the average student’s overloaded schedule.
When it comes to focusing on grades, I’m as guilty as the next person. As a humanities person, I find my attitude towards math and science entirely dependent on the letter grade I most recently received. To a certain extent, this makes sense. Part of the purpose of grades is to motivate students to engage with the material on a deeper level, but when that source of motivation overshadows the discovery and joy that’s intended to come hand in hand with learning, issues start to arise. Because cutting corners is so much easier when we start letting the ends justify the means.
I’m not saying students need to gobble up every bite of information and ignore the fact that we’re being evaluated. But there's value in learning for the sake of learning, not for getting something out of it. When this transactional mindset seeps into every aspect of our academic lives, it strips us of the ability to fully appreciate the opportunities and experiences we have at our fingertips. One of the most dangerous—and arguably most apparent—ways this manifests itself is through our rampant AI use. At the end of the day, using grades as a justification for reliance on AI overshadows the point of going to a school like Hopkins: the quality of education we actually receive.
The other, and perhaps more widely acknowledged, facet of AI use is our objectively overloaded schedule. With a heavy workload and hours of afterschool commitments, I sympathize with AI overusers. It’s definitely tempting to cut those corners when you get home at 7:00 pm on Tuesday and somehow have a test, quiz, and paper due the next morning. I think I speak for most Hopkins students when I say we’ve all been there, some of us more regularly than others. The odds of doing well on all three with so little time to prepare aren’t great from where I’m standing—and our extension policy leaves a lot to be desired in terms of relieving the load. As someone who only fully commits to athletics for one or two seasons rather than three, I have it easier than most. I think many students will agree with me when I say it's difficult to find room for some grace towards oneself with everything else there is to balance.
One way Hopkins could decrease AI use on campus is by creating more spaces where students can use it responsibly rather than banning it outright. For example, workshops on academic integrity or assignments that integrate AI use intentionally could help encourage this. And honestly, Hopkins has already succeeded at doing some of these things. In the past couple of years I’ve had several assignments that thoughtfully use AI to engage students in innovative ways while setting more deliberate boundaries. In addition, decreasing work intensity could also help limit AI use. I appreciate efforts made by the school to broaden the extension policy, but ultimately if we want to create real change we need to look at the issue from students’ perspectives. I don’t know about anyone else, but one extension per class doesn’t feel like enough to actually change students’ circumstances and promote less AI use.
To be fair, criticizing Hopkins’ competitive environment is a double-edged sword: on the one hand it brings attention to some issues we’re largely unwilling to solve, and on the other competition defines Hopkins’ identity and is what makes it desirable. But this conundrum still deserves our attention, and confronting it casts a wider net over our AI consumption than trying to scare us out of dependency does. I applaud the effort to sympathize with students and begin to acknowledge why so many turn to AI, but our mindset as a school should address the problem itself, not just point out that it exists. It might seem overly optimistic, but as members of this community we have the ability to change the way we approach our education, and how we use artificial intelligence as a result.