Using This Moment Well: A Builder’s Guide to EdTech in the Age of AI
- Krzysztof Kosman

How to choose your wedge, respect context, and build products that change real trajectories
AI is hitting education at the very moment education systems are already stretched: teacher shortages, student disengagement, rising dropout rates, and a coming wave of job disruption.
In his conversation on the EdTech Dots Podcast, Greg Margas put words to a tension many builders feel: we’re overstimulated by ideas and hype, but under-focused on the few things that would actually matter in ten years.
This article pulls out several of those themes and reframes them as a practical guide for founders, developers, and product leaders building at the intersection of software, AI, and education.
Watch the full EdTech Dots conversation with Greg Margas: https://youtu.be/M-86DgKOo7Y
Focus in an Overstimulated EdTech Market
The EdTech/AI space is full of attractive buzzwords: gamification, neuro-edtech, adaptive learning, copilots for everything. Greg pointed to something more mundane: some of the strongest businesses in this space are built on narrow wedges like:
a single gamification plugin that improves engagement in existing courses
an accessibility plugin that makes e-learning usable for learners who were effectively excluded
These are not grand “reinvent education” visions. They are focused answers to specific pains.
What this means for builders:
Pick one problem, for one user, in one context. “Help first-year CS students complete weekly exercises” is a better starting point than “fix computer science education.”
Anchor on outcomes, not activity. Design around completion rates, concept mastery, and reductions in manual teacher effort, not the number of prompts, minutes in app, or AI conversations (a small sketch of this distinction follows the list).
Ignore 90% of trends on purpose. You don’t need to bolt on a “community,” a marketplace, and a chatbot in v1. Focus is not only a product choice; it’s a cultural constraint on what you allow into your roadmap.
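To make that distinction concrete, here is a minimal sketch of what anchoring on outcomes could look like in code. The metric names and the `CohortOutcomes` structure are illustrative assumptions, not something taken from the conversation or from any particular product:

```python
from dataclasses import dataclass

@dataclass
class CohortOutcomes:
    """Outcome metrics for one cohort: the numbers the roadmap is judged by."""
    exercise_completion_rate: float  # share of weekly exercises completed
    concept_mastery_rate: float      # share of core concepts demonstrably mastered
    teacher_hours_saved: float       # manual grading and admin hours removed per week

def outcomes_moved(before: CohortOutcomes, after: CohortOutcomes) -> bool:
    """True only if a learning or workload outcome actually improved,
    regardless of how many prompts, minutes in app, or AI conversations
    the product generated along the way."""
    return (
        after.exercise_completion_rate > before.exercise_completion_rate
        or after.concept_mastery_rate > before.concept_mastery_rate
        or after.teacher_hours_saved > before.teacher_hours_saved
    )
```

The point of a check like this is that it only returns true when a real outcome moved, however impressive the activity dashboard looks.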
In an overstimulated market, clarity about the one problem you exist to solve is itself a competitive advantage.
Local-First, Global-Aware
Greg also stressed something many globally ambitious teams would rather ignore: education problems don’t transfer 1:1 between contexts.
A solution that works beautifully in one Polish university may fail completely in a rural school in another country, not because the code is bad, but because:
curricula differ
schedules and assessment cultures differ
infrastructure and bandwidth differ
incentives for teachers and institutions differ
What this means for builders:
Pilot deeply in one context. A real department, a real school, a real community. Not an abstract “global learner.”
Design with constraints in mind. Assume uneven bandwidth, mixed devices and older hardware, and teachers with very limited time and support.
Keep your solution configurable, not hard-coded, so you can later adapt calendars, grading scales, languages, and policies without rebuilding the product from scratch (see the sketch after this list).
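As a concrete illustration, context-specific rules can live behind a single settings object instead of being baked into core logic. Everything below, from the field names to the Polish-style grading scale, is a hypothetical example rather than a prescribed schema:

```python
from dataclasses import dataclass

@dataclass
class InstitutionConfig:
    """Per-deployment settings: a new context should mean a new config, not a rewrite."""
    locale: str = "pl-PL"                                   # UI language and formatting
    grading_scale: tuple = (2.0, 3.0, 3.5, 4.0, 4.5, 5.0)   # illustrative Polish university scale
    semester_start: str = "2025-10-01"                      # academic calendar anchor
    late_submission_policy: str = "grace_48h"               # named policy, resolved in one place

def is_passing(config: InstitutionConfig, score: float) -> bool:
    # Core logic reads the threshold from config instead of hard-coding "pass = 3.0".
    return score >= config.grading_scale[1]
```

Adapting to a different calendar or grading scale then becomes a configuration change, not a fork of the codebase.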
“Local-first, global-aware” is a more honest and ultimately more scalable stance than “global from day one.” The goal isn’t to avoid specificity; it’s to start specific while leaving room to generalize.
Hard Problems Are Layered Problems
One of the concrete challenges Greg mentioned is university dropout. Institutions have been trying to reduce it for more than a decade, with only partial success.
What matters here is his framing: dropout is multi-layered. It’s not just about learning materials or a better user interface. Financial stress, mental health, family obligations, poor academic preparation, lack of belonging — all of these interact.
In an AI-heavy moment, it’s tempting to pitch “we reduce dropout with personalization and nudges.” That’s the kind of oversimplification that erodes trust.
What this means for builders:
Treat problems like dropout as ecosystems, not toggles. Your product might influence one slice: early risk detection, better feedback, smoother transitions between modules. Name that slice honestly.
Be evidence-minded, not “AI-powered” for its own sake. Log the signals that matter: missed key activities, repeated failures on core concepts, changes in participation — and surface them to humans who can act.
Keep humans in the loop by design. Advisors, tutors, teachers, or peers should be able to see what the system is flagging and decide what to do, not be overridden by an opaque model (a minimal sketch of this follows).
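To ground those last two points, here is a minimal sketch of explainable risk signals surfaced to a human rather than acted on automatically. The signal names and thresholds are placeholder assumptions, not a validated dropout model:

```python
from dataclasses import dataclass

@dataclass
class StudentSignals:
    """Simple, explainable signals that a human can reason about."""
    missed_key_activities: int   # skipped labs or assignment deadlines
    failed_core_attempts: int    # repeated failures on core concepts
    participation_drop: float    # relative drop in participation, 0.0 to 1.0

def reasons_to_flag(signals: StudentSignals) -> list[str]:
    """Return human-readable reasons for an advisor, not an automated intervention.
    Thresholds are placeholders a real deployment would have to calibrate."""
    reasons = []
    if signals.missed_key_activities >= 3:
        reasons.append("Missed three or more key activities this term")
    if signals.failed_core_attempts >= 2:
        reasons.append("Repeated failures on a core concept")
    if signals.participation_drop >= 0.5:
        reasons.append("Participation dropped by half or more")
    return reasons  # the advisor, tutor, or teacher decides what, if anything, to do
```

Whether thresholds like these mean anything in practice is exactly the kind of question that has to be answered by piloting deeply in one real context.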
Respecting complexity doesn’t mean you can’t contribute. It means you’re clear about which layer you’re touching, and you design your system to work with the rest of the human infrastructure, not pretend to replace it.
Equal Access as a 10-Year Benchmark
Greg articulated a simple ten-year test: in a decade, will a kid in rural Poland have access to learning of a quality comparable to what a student at an elite university gets? And, more generally, will we have fewer "lost geniuses", people whose potential was never realized because they were born in the wrong place or into the wrong circumstances?
That’s a powerful benchmark because it’s both ambitious and concrete.
What this means for builders:
Design for imperfect conditions, not ideal ones. If your product only shines on modern hardware and fast internet, it’s likely excluding exactly the learners who’d benefit most.
Think beyond institutional budgets. Equal access is partly a business model choice: tiers or sponsorship models that reach low-resource environments, and ways for individuals to access value even if their institution never buys.
Be intentional about who your product is really for. There’s nothing wrong with building for well-funded schools — as long as you don’t claim you’re “fixing access for everyone” while doing it.
Equal access doesn’t happen by accident. It happens when teams treat it as a design constraint and a north star, not just a talking point.
AI Disruption and the Shift from Content to Pathways
Another of Greg’s points cuts against the “AI is just hype” narrative: job losses driven by automation and AI are real and already visible in some markets. Many estimates suggest that more than half the workforce will need substantial upskilling or reskilling over the coming years.
If that’s true, then the job of AI-enhanced education is not simply to generate more content. It is to help people move from one life path to another with dignity and confidence.
What this means for builders:
Think in terms of pathways, not prompts. A useful system doesn't just answer today's question; it helps someone see where they stand now (skills and gaps), a believable next role or competence, and the steps and projects that could get them there (see the sketch after this list).
Make room for identity and emotion. Losing a job is not just an information problem. It’s a hit to identity and self-worth. Tools that assume a calm, confident learner will miss the point.
Measure success in changed trajectories. Did someone complete a transition into a new role? Did they stay in a program that would traditionally lose them? These are harder to measure than “lessons completed,” but far more meaningful.
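As a rough data sketch of the "pathways, not prompts" framing, a pathway might be represented something like this. The structure and field names are assumptions made for illustration, not a description of any specific product:

```python
from dataclasses import dataclass

@dataclass
class Pathway:
    """A pathway answers three questions: where am I now, where could I
    plausibly go, and what are the concrete steps in between."""
    current_skills: list[str]   # what the learner can already do
    skill_gaps: list[str]       # what the target requires but is currently missing
    target_role: str            # a believable next role or competence
    next_steps: list[str]       # ordered courses, projects, or practice milestones

def transition_completed(pathway: Pathway, completed: set[str]) -> bool:
    # Success measured as a changed trajectory: every step finished,
    # not just some amount of content consumed along the way.
    return all(step in completed for step in pathway.next_steps)
```

Everything downstream, whether content, nudges, or tutoring, is then in service of closing specific gaps rather than generating more material.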
If AI in education leaves people with more content but no clearer path, we will have wasted this moment.
Mindset, Fear, and Psychological Safety
Finally, Greg shared a personal story: internalizing in school that he was “bad at maths,” then later freezing completely during a practice lesson when he stood in front of a class. That experience was so intense that he didn’t want anything more to do with teaching.
This isn’t just a personal anecdote. It’s a reminder that mindset and psychological safety are prerequisites for learning and for adopting new tools.
What this means for builders:
Design low-stakes practice environments. Whether for students or teachers, people need places to try, fail, and try again without public embarrassment.
Use feedback to build confidence, not just correctness. Highlight progress and strategies (“you broke the problem into smaller parts”) as well as errors.
Treat reluctance as a signal, not resistance. If teachers are slow to adopt your tool, the problem may not be usability alone. It may be fear of being exposed, judged, or replaced. Your product and messaging can either amplify or reduce that fear.
We cannot seriously talk about “AI in classrooms” while ignoring the emotional reality of the humans we expect to use it.
Using This Moment Well
Underneath all of Greg Margas’s points is a simple, uncomfortable question: when we look back in ten years, will we be able to say we used this moment well?
For builders, that doesn’t translate into “solve everything.” It translates into something much more operational:
Choose one real problem and one real context.
Understand its layers before promising to “fix” it.
Treat equal access and real human mobility as constraints, not slogans.
Build systems that open pathways for people whose worlds are being reshaped by AI.
Design with the mindset and feelings of learners and teachers in mind.
If your next release moves even a small group of people closer to those goals — one school, one program, one cohort, one teacher — you’re on the right side of this moment.
Everything else is just noise.


