Content is abundant. Genuine understanding, the kind that survives three weeks and transfers to a new problem, is far rarer. Completion rates for online courses hover below 15%, according to MIT and Harvard researchers studying MOOCs. Students enroll with real intent, then drift away. The content was never the problem. The design was.
A Pew Research Center survey of 1,458 U.S. teens and their parents, conducted from Sept. 25 to Oct. 9, 2025, finds that 57% of teens use AI to search for information, while 54% use it to help with schoolwork. Yet their queries extend beyond asking a chatbot to define the Pythagorean Theorem or to explain the significance of Boo Radley's character in To Kill a Mockingbird.
Parents are opting their children out of school-issued laptops and are asking teachers to return to pen and paper. In a recent report, families described a growing discomfort with this digital imperative in education. Importantly, this is less about the logistical aspects of technology and more about something universal: control. These instincts seem reasonable. Screens distract, and artificial intelligence hovers over homework like an invisible presence, or worse, a co-conspirator in cheating.
Whenever I made my initial rounds at a school, a quick peek at its technological resources was often a reliable predictor of its ability to meet students' broad needs. The differences in the quality and volume of computing labs at a school like Lincoln Park High School on Chicago's wealthy north side, where the local population is 75% white, versus Raby High School, located in economically distressed East Garfield Park which is 83% Black, were stark.
Our guest authors started the year strong with first-rate content full of thought leadership. Want to know how to make your virtual classroom survive the 15-minute cliff, or why we need to build rights-based AI-powered EdTech systems that protect children's privacy? Check out the first installment for eLearning Industry's Guest Author Article Showcase for 2026. In no particular order, these are our favorite guest posts published in January.
Designed for a comparative literature course on medieval and Renaissance-era writing and announced by UCLA at the end of 2024, the digital textbook was immediately met with widespread mockery from educators. Its AI-generated cover was riddled with incomprehensible text ("Of Nerniacular Latin To An Evoolitun On Nance Langusages," for example) and featured generic visuals that had little to do with the period it was supposedly covering. At the time, Elizabeth Landers, a grad student who helped put together the volume, said that the errors "aren't a failure of AI." Instead, she argued, "they're an intentional artistic choice that prompts students to question their assumptions about language, meaning and historical truth."
Initially, I surveyed the situation from the safe distance of a journalist who happens to also be a career professor and university administrator. I saw myself as an envoy between America's college campuses and its citizens, telling the stories of the people whose lives had been shattered by these transformations. By the summer, though, that safe distance had collapsed back on me.
Each class begins with several minutes of journaling in notebooks, and nearly all assignments must be handwritten and physically turned in. "If you walk into almost any one of my classes today, you will see that all of my students are handwriting," Bond says, "and they are journaling, and they are constantly and consistently doing everything with a pen or a pencil."
When students began using ChatGPT for homework in late 2022, chatbots were widely seen as cheating tools to be banned or blocked. Now, across K-12 and higher education, that resistance is giving way to a broader acceptance that AI is here to stay, and that avoiding it could leave students unprepared for what comes next. That shift has created an opportunity the tech giants are racing to seize.
Done! Finished! One might expect to hear such exclamations from exultant college students, relieved or ready to rejoice upon polishing off their latest essay assignment. Instead, these are the words I hear with increasing frequency from fellow professors who have come to think that the out-of-class essay itself is now done. It's an antiquated assignment, some say. An outmoded form of pedagogy. A forlorn fossil of the Writing Age, a new coinage that seems all too ready to consign writing instruction to extinction.
To counter that, he revived oral exams and enlisted an AI agent to administer them at scale, in an attempt to "fight fire with fire." "We need assessments that evolve toward formats that reward understanding, decision-making, and real-time reasoning," Ipeirotis said. "Oral exams used to be standard until they could not scale," he added. "Now, AI is making them scalable again." In the blog post detailing the experiment, Ipeirotis said he and his colleague built the AI examiner using ElevenLabs' conversational speech technology.
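The article doesn't include implementation details of the AI examiner. As a purely hypothetical sketch, an oral-exam agent can be thought of as a loop that asks a question, scores the spoken answer against a rubric, and probes deeper when the answer is incomplete. All names below are illustrative, not from Ipeirotis's system; a real implementation would hand the answer to an LLM grader and a speech layer (such as ElevenLabs' conversational technology) rather than the toy keyword rubric used here:

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    prompt: str
    rubric_keywords: set   # concepts a complete answer should mention
    follow_up: str         # probe asked when the answer falls short

@dataclass
class OralExamAgent:
    """Toy oral-exam loop: ask, score against a rubric, probe deeper."""
    questions: list
    transcript: list = field(default_factory=list)

    def grade(self, q: Question, answer: str) -> float:
        # Fraction of rubric concepts the answer touches on.
        words = set(answer.lower().split())
        return len(q.rubric_keywords & words) / len(q.rubric_keywords)

    def administer(self, answers: list) -> list:
        scores = []
        for q, a in zip(self.questions, answers):
            s = self.grade(q, a)
            self.transcript.append((q.prompt, a, s))
            if s < 1.0:  # incomplete answer triggers the follow-up probe
                self.transcript.append((q.follow_up, None, None))
            scores.append(s)
        return scores

exam = OralExamAgent(questions=[
    Question("Why does a hash map give O(1) average lookup?",
             {"hash", "bucket"},
             "What happens when two keys collide?"),
])
print(exam.administer(["The hash function picks a bucket directly."]))
# → [1.0]
```

The scalability Ipeirotis describes comes from exactly this structure: once the question bank, grading step, and follow-up logic are automated, the marginal cost of examining one more student is near zero.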
In today's competitive higher education landscape, universities and academic publishers must uphold content credibility and trust. As the global eLearning market is projected to reach $840.11 billion by 2030, the demand for robust QA services in education is rapidly increasing. The rise of digital learning and AI tools has introduced new challenges to maintaining academic integrity. According to the International Center for Academic Integrity, 65-75% of undergraduates admit to cheating at least once, while 62% have cheated on written assignments.
Artificial intelligence is fundamentally reshaping countless industries; education is no exception. As AI tools rapidly enter classrooms, there are concerns about fair access, effective implementation, and the risk of widening the persistent digital divide. Who are the players best positioned to guide this transition in a way that truly benefits every student? I recently spoke with Alix Guerrier, CEO of DonorsChoose, an education nonprofit where teachers submit funding requests based on classroom needs.
As AI becomes integral to modern learning, these platforms are now expected to deliver predictive insights, automated workflows, and highly personalized learning experiences. Yet many providers still operate legacy LMS systems, unable to support modern workloads or data-intensive features. These platforms are rigid, costly to maintain, and slow to update. Release cycles can often stretch into quarterly or annual updates, creating a widening gap between user expectations and the platform's capabilities.
It includes the potential for a new AI system, Gemini for Government, which the government hopes will cut bureaucracy, automate routine tasks and free up civil servants to focus on improving services for people. Through the partnership, Google DeepMind said its existing cutting-edge AI models will be made available to UK scientists. These include tools like AlphaGenome, which uses AI to analyze DNA sequences and spot potential weaknesses; and AI Co-scientist, which supports researchers in generating new theories and research proposals.
Ask any instructor what helps students learn, and it's unlikely any of them will answer "a really big wall of text". It's incredible to me, as both a university instructor and a UX designer, that the army of people working at OpenAI is not imagining better tools for our students. I want to walk you through a design pattern in ChatGPT that, despite its good intentions, might be creating unintended hurdles for students.
SAN FRANCISCO (KGO) -- Colin Kaepernick is launching a new AI platform to help boost student literacy. "It can help ask questions to get those ideas to come to life for you to explore areas that you don't know, to explore and one of the things that we often talk about is - we don't know what we don't know," Kaepernick said. The AI tool called "Lumi" is designed to help students create and publish stories while building reading skills.
The U.S. workforce is facing a pivotal challenge: a widening skills gap that threatens economic growth and innovation. While demographic trends, like declining birth rates and a shrinking pipeline of young workers, are real, the more actionable issue is the growing mismatch between the skills employers need and those available in the labor market.