AI Transcribes My Piano Lessons But Misses What Matters

I spend my days at J.P. Morgan building technical solutions for a global investment bank. Then I go home and teach piano to 15-20 students every week.

The contrast is sharp.

At work, AI tools face intense scrutiny. Data security concerns limit what we can deploy. But at Bournemouth Music School, I have flexibility corporate environments don't allow.

I use Notion's AI transcription tool to record every lesson. It produces actionable notes detailing what we covered. Parents love it. It saves me time.

But here's what the transcription misses.

What AI Cannot Capture

The tool struggles with bar numbers. It can't track specific notes in the order they're played. Sometimes it confuses who's speaking.

These aren't minor gaps.

When I studied at Chetham's School of Music, my teachers didn't just tell me what to fix. They physically adjusted my posture. They held my arms and shook them to release tension in my shoulders.

That tactile feedback changed how I played.

Research shows 78.1% of violin teachers agree physical interaction is essential for posture, instrument positioning, and tension optimization. You can't replicate that through a screen.

When I teach now, I read tension in real time. Gripping. Shoulder stiffness. Arms that need loosening.

I make adjustments in the moment. Students trust that expertise because they feel the difference immediately.

AI can't physically move you.

The Dependency Problem

Apps like Simply Piano promise convenience. Turn them on, follow the prompts, learn to play.

But turn the app off, and you still don't know how to play the piano.

That's not learning. That's guided dependency.

Students become overly reliant on AI feedback. They don't develop problem-solving skills or the ability to self-assess. Learning music requires trial and error, persistence, and personal growth.

Real musicianship comes from trying things live. Learning. Struggling. Without backing tracks or excessive pointers.

Your brain needs to wire itself, building those patterns from scratch.

There are no shortcuts.

The Growing Divide

Here's what concerns me most.

Well-funded institutions can afford expensive AI systems at scale. They're creating enhanced educational experiences for students who already have advantages.

Meanwhile, smaller music schools struggle to compete. The AI capital divide means wealthy students get technology plus human teachers to guide them. Others just get the technology.

That gap will widen unless we're intentional about access.

At Bournemouth Music School, we bring lessons to students' homes. £20 for 30 minutes, £40 for an hour. We're young teachers who connect with students differently than traditional instructors.

We're exploring AI tools. I'm even developing proprietary solutions for music education.

But the core mission stays the same: keep learning music alive in an age obsessed with convenience.

What Actually Works

AI transcription helps students practice what we covered in lessons. That's valuable.

But it can't replace the moment when a teacher physically adjusts your hand position and you suddenly understand what relaxed playing feels like.

It can't build the neural pathways that come from struggling through a difficult passage without algorithmic assistance.

It can't create the trust that develops when an expert shows you exactly what to change and you hear the improvement instantly.

I work in one of the most technologically advanced industries in the world. I understand what AI can do.

I also understand what it can't.

Teaching music requires physical presence, human judgment, and the kind of struggle that builds genuine skill. Apps create the illusion of progress while your brain never learns to function independently.

The students I teach are developing real musicianship. Not app-dependent mimicry.

That difference matters more than any transcription tool can capture.
