Generals Embrace AI. Professors Hide From It.
My career has straddled two very different industries, and they are taking opposite approaches to AI.
Early in December, Secretary of War Pete Hegseth stood before the Pentagon and delivered a message with the subtlety of a drill sergeant: “AI should be in your battle rhythm every single day. It should be your teammate.” Within 72 hours, three million military personnel, civilians, and contractors had access to Google’s Gemini through GenAI.mil. The directive was clear. Log in. Learn it. Use it immediately.
Meanwhile, at universities across the country, committees are still meeting.
I have watched both worlds up close. I spent five years overseeing machine learning at Chegg, at the center of education technology, and over a decade leading AI companies in national security. The contrast is jarring. One institution treats artificial intelligence as an existential opportunity. The other treats it as an existential threat. One sector is sprinting. The other is still lacing its shoes.
The numbers tell a damning story. A 2024 survey found that 80 percent of American provosts report their institutions lack comprehensive AI policies. Faculty request guidance and receive silence. Students report confusion about when AI use crosses into misconduct. Some universities have reverted to pen-and-paper exams, a retreat to the 19th century to avoid the challenges of the 21st. Sciences Po in Paris and others have banned ChatGPT entirely, as if prohibition ever stopped determined students from finding workarounds.
The Pentagon, by contrast, chose integration over prohibition. The Department of Defense’s memo frames AI as a competitive necessity: “Victory belongs to those who embrace real innovation.” Under Secretary Emil Michael put it bluntly: “There is no prize for second place in the global race for AI dominance.”
Higher education seems to believe there is plenty of time for second place.
The irony cuts deep. Universities exist to prepare students for the economy they will enter. That economy will be saturated with AI. Every major law firm, consulting company, and technology employer now expects fluency with these tools. Yet the institutions charged with workforce preparation are actively preventing students from developing that fluency. They are training young people for a world that no longer exists, at the very moment when new graduates are struggling to find jobs.
The comparison reveals something uncomfortable about institutional priorities. The military, often caricatured as rigid and hierarchical, moved with startling speed. Within months of recognizing AI’s strategic importance, the Pentagon deployed it to every desktop in the organization. Universities, supposedly bastions of intellectual agility and openness to new ideas, remain mired in debate. The institution designed to follow orders proved more adaptable than the one designed to question everything.
Part of this reflects genuine complexity. Academic integrity matters. Assessment has to mean something. These are not trivial concerns. But three years after ChatGPT's release, the question is no longer whether to engage with AI. The question is whether American higher education will lead that engagement or be dragged into it, unprepared and resentful.
A few schools are waking up. Purdue and Ohio State have introduced AI competency requirements for undergraduates. Credit where due. But those announcements came more than two years after ChatGPT launched, and the requirements take effect in 2026. The Pentagon moved in 72 hours.
The students, meanwhile, have already decided. Surveys show 88 percent of UK students used AI for assessments this year, up from 53 percent the year before. American numbers track similarly. The gap between official policy and actual behavior grows wider by the semester. Universities can pretend this is a problem of enforcement. In reality, it is a failure of leadership.
And these students will graduate into a workforce that assumes AI fluency. Their employers will not care whether their professors approved of ChatGPT. They will care whether the new hire knows how to use it.
Right now, service members have AI on their desktops with full training and support. Students have it on their phones, pretending they don’t. One institution is teaching its people how to use the future. The other is teaching them how to hide from it.