
The Future of AI in Education
TL;DR
The education sector is undergoing a seismic shift as generative AI reshapes how students learn and how institutions operate. While the technology presents real opportunities—especially for startups and SMEs—it also raises critical questions around pedagogy, trust, data ethics, and investment readiness. Founders looking to scale in this space will need more than flashy features; they'll need a strong AI strategy, measurable outcomes, and a deep understanding of the human elements that make learning meaningful.
For much of its history, education has relied on a remarkably consistent formula: a teacher, a classroom, a textbook. While delivery mechanisms have evolved from slate boards to smartboards, from libraries to online repositories, the underlying model has remained deeply human. The classroom was a space of relationship, of pacing and rhythm, of direct accountability between student and educator. Then came generative AI: not the kind that recommends your next playlist, but the kind embodied by large language models like OpenAI's ChatGPT, Anthropic's Claude, and Google's Gemini. Released to the public in late 2022, these tools are capable of writing essays, solving equations, summarizing texts, translating languages, and responding to student queries with uncanny fluency.
In the span of just 18 months, their rapid adoption has forced schools, universities, and EdTech companies across the globe to re-evaluate what they teach, how they assess learning, and what role human educators should play in an AI-assisted classroom. While the rush to integrate AI into education has been met with equal parts enthusiasm and trepidation, one thing is clear: the future of learning will not look like the past.
For startups and SMEs, this changing landscape presents both an opportunity and a challenge. Education has traditionally been a slow-moving, institutionally bound space: hard to penetrate, heavily regulated, and resistant to disruption. But the AI shift has unsettled that formula. And in that uncertainty, new models and partnerships are beginning to take shape.
A Shift in Infrastructure, Not Just Interface
"The biggest changes we're seeing aren't just in how students interact with tools," says Nick Solly, CTO of MOHARA. "The real shift is happening in how those tools are built."
Solly's team has been embedding generative AI into the backend of EdTech platforms: automating testing, enhancing data oversight, and even speeding up product iteration. These changes are mostly invisible to the average user, but they matter. A tutoring platform that previously shipped updates every six months can now do so every six weeks.
Matthew Henshall, founder of the online learning platform Lessonspace, agrees. "AI has made us code faster, test better, and offer teachers more visibility into what's happening post-lesson." Lessonspace, which operates in 90+ countries, powers real-time instruction in subjects like coding and mathematics. And while the platform is actively exploring AI-enhanced experiences for both tutors and students, Henshall cautions against overpromising.
"Everyone's focused on whether AI can replace a tutor," he says. "But the magic is still in the social contract between people: the expectation that a student shows up at 4pm and the tutor is there, present and accountable. AI doesn't replicate that."
The Human Element (Still) Matters
Henshall's comment echoes a larger concern among educators: that in the race to digitize, we might forget what makes learning meaningful.
A recent MIT Sloan study published in 2025 found that overreliance on AI tools, especially without guidance, can lead to diminished critical thinking skills over time, particularly in younger learners. The study tracked students across four schools in the U.S. Midwest and found that those who used AI to complete assignments without teacher oversight showed a measurable decline in metacognitive engagement over six months.
This aligns with concerns raised by Dr. Melody Lang. For founders, this is more than a philosophical concern; it's a product strategy issue. Tools that undermine learning outcomes, even subtly, will lose traction over time. "Founders need to be able to answer: why AI, and why now? […] We want to back founders who think critically about the pedagogical implications of their tools," Lang says. "Adding an AI layer isn't a value proposition in itself. It has to enhance learning, not dilute it."
For Lang, this means funding startups that view AI not as a substitute but as a co-pilot: tools that augment teacher capacity, personalize pacing, or provide richer formative feedback.
Building Trust, Not Just Features
But Lang also points to a more emotional dimension. "Education is built on trust. Between teachers and students, parents and schools, founders and their users. AI can destabilize that if not introduced carefully." For SME leaders and scale-up founders, these are not abstract values. Trust directly shapes adoption rates, retention, and word-of-mouth in an ecosystem where gatekeepers (teachers, parents, procurement teams) are rightly cautious. Crucially, trust cannot be built solely on branding or promises. It must be grounded in evidence-based solutions that prove their impact and effectiveness.
Irmak Atabek, co-founder of KidsAI, understands this risk intimately. Her company builds explainable AI tools for children aged 5–12. "We made a conscious choice: Olii, our AI assistant, doesn't pretend to be your friend. It doesn't simulate empathy, or say things it can't back up."
Instead, Olii is designed to be honest about what it is and isn't. It can answer factual questions, explain difficult topics, or help children explore a concept, but it doesn't replace the emotional warmth of a teacher or caregiver.
"For kids, that boundary is essential," says Atabek. "You can't blur the line between tool and companion. That's where harm can creep in."
KidsAI's approach has earned it praise in academic and policy circles, particularly in Europe, where child safety regulations around digital products are becoming increasingly stringent.
What Investors Are Looking For Now
The investor lens has shifted too. As Lang notes, capital is now flowing toward startups that can demonstrate their AI strategy, not simply declare one.
"It's no longer enough to say you're exploring AI," says Lang. "You need to show that you've embedded it in a way that drives clear outcomes, academic, operational, or otherwise."
This is consistent with what's playing out globally. A HolonIQ report released in July 2024 showed that while EdTech funding declined 17% overall year-on-year, investment in AI-enabled EdTech products rose 36%. The catch? Startups with vague AI roadmaps or inflated claims saw markedly lower conversion rates with institutional investors.
For startup founders, this means AI must be woven into your product architecture, not slapped on as a feature. You'll need a credible roadmap and early signals of traction, whether that's improved outcomes, faster onboarding, or better retention.
This has led to a subtle recalibration: AI needs to be real, not rhetorical.
If you're raising in the next 12 months, now is the time to tighten your AI narrative and make sure your metrics can back it up.
Ethics, Privacy and the Role of Institutions
There are also open questions around data privacy, especially when tools interact with minors. Who owns the data generated by AI-student interactions? What happens if an AI provides biased or inappropriate information? Who is accountable?
Solly puts it bluntly: "If your AI tool is gathering insights from a 10-year-old's browsing behavior or lesson responses, you need to be prepared for a whole different level of scrutiny."
Several of the panelists noted the growing regulatory momentum in Europe and the Middle East. In the UAE, new frameworks are being tested for AI oversight in schools. In the UK, Ofsted has launched a task force exploring ethical AI use in education. And in South Africa, some public universities have begun formally allowing AI-generated submissions, provided students disclose their usage and remain accountable for the content.
"This shift is already underway, and many institutions are scrambling to catch up," says Henshall. "The challenge is to adapt without losing the integrity of the learning experience."
Powerful opportunities ahead
So where does that leave us?
The consensus among those building and funding in this space is cautious optimism. AI will likely reshape education more than any technology of the last two decades. But its success will depend not on the size of the model or the flashiness of the interface, but on the depth of its alignment with how people actually learn.
Atabek is clear-eyed about the path forward. "This is still a human business. Kids don't remember the software; they remember the feeling they had while learning."
Or, as Solly puts it: "Trust will outlast every product cycle. That's what we need to build for."
Four Emerging Signals to Watch:
- Micro-interventions with AI – Instead of trying to replace teachers, many startups are focusing on short, AI-powered interventions (e.g. post-lesson summaries, voice-based reminders, adaptive quizzes) that integrate seamlessly into existing platforms.
- AI Literacy as a Core Skill – Schools are beginning to treat AI as a literacy issue, not a tech issue. Teaching students when to use it, how to challenge it, and how to reflect on its outputs is becoming central.
- Decentralized Tutoring Networks – Platforms like Lessonspace are creating micro-networks of qualified tutors globally, with AI helping to match students to the right instructor based on goals, personality, and learning style.
- Ethical Certification for AI Tools – Several NGOs and policy groups are pushing for an "AI-Ready" certification for tools used in classrooms. Expect this to become a differentiator for startups in the next few years.
Direct quotes in this article were drawn from conversations held with MOHARA and guests as part of a virtual roundtable discussion on the future of AI in education in August 2025, part of MOHARA LIVE's AI Roundtable Series.
Thinking about your own AI readiness?
For those navigating this shift, it's easy to feel behind. You don't need a full-time AI team or a pricey assessment from a large consultancy to get started. MOHARA's AI Readiness Programme helps growing companies assess where AI can actually move the needle: operationally, educationally, and commercially.
Learn more here or reach out to our team directly.
MOHARA Team
Innovation & Strategy