Hi, Alex, Curriculum Director of Math Academy here.

After some reflection, I decided to respond to this.

TLDR

* Math Academy is a work in progress (which is why we're still in beta).

* Do we avoid or skip the teaching of concepts? No, certainly not!

* Could we do more to ensure conceptual understandings are hitting home? Absolutely.

* Can anything be done to ensure that a larger proportion of 4th-grade students can benefit? For sure.

* Is Math Academy fundamentally broken and beyond fixing? GTFOH!

Long Form

I cannot post as a single comment, so it must be done as a series of comments.

While the author raises a few reasonable points, none of them are things we're not already aware of. It's also self-evident that the author had no intention of properly engaging with the platform.

It's not my intention to go through the review line-by-line and comment on every point. However, some points about the author's perspective on education stood out for me.

> Good teachers don’t sit around and watch kids read. They ask probing questions. They ask for explanations. They build understanding.

Not quite. When a good one-on-one tutor introduces a new concept to a student, they might start with some (very) short motivation. Then, they'll introduce a worked example, as simple as possible to start, and carefully walk through the solution, invoking prerequisite ideas where necessary and using their experience to anticipate where the pain points in the solution process might be. They show the student how to solve the problem and then have them demonstrate their understanding by solving similar problems.

Introducing a concept for the first time is usually not the time to ask "probing questions." It's way too early for that! The student only learned what the hell a quadratic equation was two minutes ago, and you're already probing them! Give them a chance to internalize some information first!

So, go through 1-2 worked examples and check that they understand by having them solve some similar problems. Once they've gotten the hang of that, slightly increase the complexity of the problem. Do that 2-3 times, and you'll come away with a student who's had a happy and satisfying learning experience!

Most students struggling with a new idea don't want to be probed for understanding every two seconds. If anything, too much probing too early could undermine their confidence entirely. Now, that's not to say students shouldn't be probed. But give them a chance to get familiar with the material first. Once they're happy, confident, and have reached a certain point in their understanding, *then* you can challenge them.

I feel that, unlike most users, the author did not make a genuine effort to engage with the product. Quote:

> Given the choice, most people will skim over technical explanations... I was no different.

Actually, no. Most people genuinely interested in learning new material are willing to spend some time reading information about what they're trying to learn! (I'm assuming we're talking about more mature students now). Surely, it's not gotten to the point where we're discouraging reading to learn!

Granted, technical discussions can be difficult to wade through, at any level. I sympathize. But this is why we break a course down into hundreds of individual topics, which are themselves broken down into multiple stages. The cognitive load (i.e., the amount of new information that has to be held in working memory) at any one point during the learning experience is designed to be as low as possible.

The author raises a reasonable point: getting some 4th-graders to read text and learn from it is nigh impossible. This could be for a variety of reasons. We've never claimed that our 4th-grade course *in its current form* will work for every 4th-grade student, and admittedly, it's down to us to do more to make that dream a reality. But with all that said, our 4th-grade course currently has a 93% completion rate (meaning, 93% of students on this course mastered *every single topic*!), and 99% of lessons are passed within two attempts (coincidentally, 93% of lessons are passed on the first attempt). So there's no doubt that *some* 4th-graders can read the material, learn from it, and consequently get tremendous value out of the product.

It's our job to widen the net and make the course as inclusive as possible. However, far from this being the unfixable problem the author suggests, technical solutions are possible and already in the works, such as:

* In-task analysis (figuring out exactly what students are doing while using the platform and taking steps to intervene if they slack or goof off).

* In-task coaching, using the in-task analysis data (basically, guiding students on how to use the system correctly).

* For younger students who have difficulty reading text, videos and/or animations are always a possibility.

The in-task analysis and coaching are technically challenging. You'll need to figure out what the student is doing in real time and respond accordingly, just like a teacher or tutor would. But these things are certainly not impossible. We're already on it!
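To make that concrete, here's a minimal sketch of the kind of heuristic in-task analysis could rely on. The event names, thresholds, and data structures below are illustrative assumptions for the purpose of this comment, not our actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Event:
    timestamp: float   # seconds since the task started
    kind: str          # e.g. "question_shown" or "answer_submitted"
    correct: bool = False

IDLE_GAP_SECONDS = 120     # no activity for 2 minutes -> likely off-task
RAPID_GUESS_SECONDS = 5    # a wrong answer in under 5 seconds -> likely guessing

def flag_off_task(events: list[Event]) -> list[str]:
    """Return coaching flags raised by one student's event stream."""
    flags = []
    last_time = 0.0
    question_shown_at = None
    for event in events:
        # A long silence between any two events suggests the student is idle.
        if event.timestamp - last_time > IDLE_GAP_SECONDS:
            flags.append(f"idle gap before t={event.timestamp:.0f}s")
        if event.kind == "question_shown":
            question_shown_at = event.timestamp
        elif event.kind == "answer_submitted" and question_shown_at is not None:
            # Very fast wrong answers are a classic rapid-guessing signal.
            elapsed = event.timestamp - question_shown_at
            if elapsed < RAPID_GUESS_SECONDS and not event.correct:
                flags.append(f"possible rapid guess at t={event.timestamp:.0f}s")
        last_time = event.timestamp
    return flags

# Example: one idle gap and one rapid wrong answer both get flagged.
stream = [
    Event(0, "question_shown"),
    Event(3, "answer_submitted", correct=False),   # answered in 3s, wrong
    Event(200, "question_shown"),                  # 197s of silence before this
    Event(230, "answer_submitted", correct=True),
]
print(flag_off_task(stream))
```

The hard part is doing this sort of thing live and deciding how to respond (nudge, re-explain, or alert), but that's an engineering problem, not an impossibility.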

Quote:

> I still don’t know what a geometric distribution is.

That's probably because, as you said, you didn't read the information!

Our discussion about the geometric distribution goes like this (spread out over multiple lessons):

* Introduce a simple, concrete problem. In this case, it's "A fair die is thrown. What's the probability of getting a six for the first time on the third throw?" For example, the throw sequences (2, 3, 6) or (1, 1, 6).

* Figure out the answer using properties of probability they already know (product rule, independence, layering on top of existing knowledge); a worked version of these first steps appears just after this list.

* Generalize this concept. This gives them the information they need to construct the PMF of a geometric random variable. We also break down each factor in the PMF, analyzing its structure and building a "deeper understanding."

* Go back to the original problem, model it using the language of a geometric random variable, and show that we get the same answer as before.

* Have students actively practice computing some geometric probabilities in a non-contextual setting.

* Have students actively practice identifying real-life situations that can be modeled using a geometric random variable.

* Have students actively practice solving problems involving geometric random variables.
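To make the first few steps concrete, here's a compact worked version (the wording in our lessons differs, but the math is the same):

```latex
% Concrete problem: first six on the third throw of a fair die.
% The first two throws must miss (probability 5/6 each) and the third must hit (1/6);
% independence lets us multiply:
\[
P(\text{first six on throw } 3) = \frac{5}{6} \cdot \frac{5}{6} \cdot \frac{1}{6}
  = \frac{25}{216} \approx 0.116.
\]
% Generalizing: if each trial succeeds with probability p, the number of trials X
% needed for the first success has PMF
\[
P(X = n) = (1 - p)^{n-1} \, p, \qquad n = 1, 2, 3, \dots
\]
% Modeling the original problem as X ~ Geometric(1/6) recovers the same answer:
\[
P(X = 3) = \left(\tfrac{5}{6}\right)^{2} \cdot \tfrac{1}{6} = \frac{25}{216}.
\]
```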

If you do not understand geometric distributions both conceptually and procedurally after making a serious attempt at all that, then you're in a significant minority. Nearly 80% of students can solve contextual problems involving the geometric distribution in a timed quiz environment. These questions aren't easy. There's no prompting: the students won't know they're supposed to use the geometric distribution unless they've understood this concept! We don't tell them. Yet 78% of students can correctly reason about geometric distribution problems in a timed setting (p.s. This is impossible unless they've understood the concepts and the necessary procedures).

Later, we introduce the negative binomial distribution in a similar way and show how the two are related. Students who give it enough time will eventually see just how closely these two distributions are connected and go on to do interesting things with moment-generating functions, etc.
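For anyone curious, the relationship can be written out compactly in standard notation (a summary, not a transcript of our lessons):

```latex
% Negative binomial (number-of-trials form): N is the number of trials needed to
% reach r successes, each trial succeeding independently with probability p:
\[
P(N = n) = \binom{n-1}{r-1} \, p^{r} (1 - p)^{\,n - r}, \qquad n = r, r+1, \dots
\]
% Setting r = 1 collapses the binomial coefficient to 1 and recovers the
% geometric PMF, i.e. Geometric(p) is NegBin(1, p):
\[
P(N = n) = p \, (1 - p)^{\,n - 1}.
\]
```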

For someone claiming to be so hell-bent on requiring a deep understanding of things, I cannot understand why, as an adult, the author wouldn't take the time to read the information that explains everything they need to know about geometric random variables. We go pretty deep into procedural *and* conceptual ideas in this particular topic.

I will concede that we could do more in the learning process to probe students' understanding of the conceptual side of things once we've explained it to them. Introducing more "elaborative interrogation" questions and multipart problems for this purpose is part of our roadmap. Why haven't we done it already? One of the other commenters here put it quite well:

> I reckon both approaches are valid and not mutually exclusive—you can alternate—but if you are making a product, you probably have to choose one or the other, at least to begin with, simply because of time and budget constraints.

Bingo! We're a micro-entity (basically a mom-and-pop business) that, until recently, had extremely limited resources. It's difficult to do everything all at once. But as an entity that's seen some decent growth in the last 6-12 months, we will certainly start addressing this.

https://x.com/ninja_maths/status/1887898815761621478

(the author kindly acknowledged this post and its contents)

The Pasadena program is entirely separate from the online platform. I'm not sure why the author even referenced this.

As for the "fundamentally broken" remark, I honestly don't think any fair-minded person can take that comment seriously when thousands of users are getting tremendous value from the product.

For balance, I'd like to include some posts from users who are actively, genuinely, and seriously engaged with the product (not just to sign up for a short time, pick as many holes as possible, and write a review):

https://jonathanwhitmore.com/posts/2024-09-10-MathAcademy-after-2000-points/

https://x.com/KamStaszewski/status/1888598235318473056

https://chadnauseam.com/coding/gamedev/the-game-design-of-math-academy#The+Game+Design+of+Math+Academy

https://x.com/sumitdotml/status/1874456583306100864

https://x.com/abarrallen/status/1869165529514250697

https://gmays.com/how-im-relearning-math-as-an-adult/

And before anyone asks: We have not paid or otherwise incentivized any of the people I've quoted here. I could also provide more feedback like this.

Math Academy is a work in progress, probably half-finished at best. There's still lots to do (which is why we're still in beta), and the system is still rough in places. I'm all for constructive feedback, and I'll be the first to admit there's a lot of work ahead of us. But far from being impartial, this review stretches its criticism to the point of being unreasonable, for reasons unknown.

For those who are serious about learning or relearning math but are now on the fence (TBF, I would be too after this): just try it for yourself. There's a 30-day money-back guarantee if it's not a great fit. If we fail to teach you math, we don't want your money! But the reality is that thousands of students believe they're gaining tremendous value from the product, which is why we're still here.

"Math Academy is a work in progress, probably half-finished at best."

But then why claim that Math Academy is already superior to traditional teaching and equal to expert tutors? And not just for adult learners, but for all ages? And not just for gifted students, but for all students?

I'm happy to give you guys time to figure things out, but you're already out here claiming that you've got the code cracked. And it seems to me that everything that you'll need to do to improve MA will lead to less than 4x acceleration.

MA is clearly not "fundamentally broken" for all purposes. As I mentioned, I could imagine it as a powerful supplement to a course. But having seen where the acceleration is coming from, I think that as you expand you'll be forced to slow students down, or else lose the 90%+ pass rate. That, along with trying to match the rhetoric you guys use to critique more traditional learning (which of course also has its flaws), is what I was going for with that last line.

"And it seems to me that everything that you'll need to do to improve MA will lead to less than 4x acceleration."

I don't think this is true.

* In-task coaching and analysis. These won't do anything to slow things down - on the contrary, they will help students to be more efficient.

* Elaborative interrogation and projects. We already have projects - we need to improve them. That's where the majority of elaborative interrogation questions will feature.

* Videos/animations for younger students: These will most likely be optional for those who want them. Again, I don't see this as a huge time-sink on the student's part: instead of reading a paragraph or two of text, they'll watch a short video.

I think in-task coaching and elaborative interrogation will definitely introduce friction and slow students down, just as it does in real-time teaching. (I agree that videos probably won't.)

I disagree, and I think we're in a better position to comment since we're the ones who are actually building this thing. But I guess we'll see.

Alex Smith - Have you taken a look at the worked examples for different levels (Algebra, Geometry) from the SERP Institute? They incorporate the self-explanation prompts (as well as other best practices for worked examples) that Michael Pershan references below. I agree that they would potentially slow the pace down, and also agree that's not a bad thing, because it lessens the chance a learner reading the explanation will focus on the wrong thing and overestimate their understanding.

https://www.serpinstitute.org/geometry-by-example

The background for my comments is not just my understanding of teaching but also my read of research -- for example, on self-explanation.

https://link.springer.com/article/10.1007/s10648-018-9434-x

I've been reading Pershan for years, and I think accusing him of bad-faith writing here is not a fair move on your part. I'd recommend a little more curiosity and a little less judgment. (In particular, I think his book on Worked Examples would genuinely inform your practice, whether you agree with his critique here or not.)

More specifically, there are a lot of tools that work for independently engaged, interested students. It's not a killer argument that you're doing great work for people who will already read the longform explanations and do their best to interpret them. I've independently taught myself lots of new things from textbooks. I'm not a typical 4th grade student!

Pershan was testing your software by using it as a typical student would. He didn't skip the explanations because he didn't care; he skipped them because students will skip them if they can. For the median student to master the content, the path of least resistance has to lead to content mastery. I think it's pretty clear that, right now, the path of least resistance isn't content mastery. You don't even really contest that; you just say "lots of people do read the explanations and they like them." Congratulations, you've written a pretty solid textbook. That doesn't make it a revolution.

If the path of least resistance leads to poor understanding, this is not a game changer for most teachers or most students. I think, charitably, you've got a lot of work to do to change that.

I don't know if this matters but let the record state that I said that I *skimmed* them, not skipped.

Thanks for the review, Michael. I’ve noticed some of the recent MA developer posts, so I was keen to check it out. I’m surprised because I thought this would be right up your alley (and so did you, I guess?).

But I do wonder how much of your assessment is about a disagreement over how "conscious" to make a particular act of learning?

It’s been ages since I read your book, but if I remember correctly, you like to use examples to stimulate thought, reflection, and analysis prior to doing a procedure—is that right?

But you can also use examples to build a kind of mental muscle memory without ever really making the learner aware of it, until you choose to bring something to their awareness “wax-on, wax-off” style. That approach requires a lot of volume, so you want to make each individual task as quick as possible to get enough reps.

(I reckon both approaches are valid and not mutually exclusive—you can alternate—but if you are making a product, you probably have to choose one or the other, at least to begin with, simply because of time and budget constraints.)

Based on your review, MA sounds intentionally like a high-rep, high-speed, low-reflection way to build muscle memory, which is good for trivialising procedures and making other conceptual instruction easier. Does that sound fair? Am I missing the point? (Maybe I should just read their manifesto and not dump these questions on you! 😅)

"Based on your review, MA sounds intentionally like a high-rep, high-speed, low-reflection way to build muscle memory."

Some have brought up piano scales or weight lifting as an analogy for what MA is offering. And there's something to that analogy. It's the sort of super-focused drill work that is an important part of learning.

But then let's take the analogy further. No piano lesson is just drills. No baseball practice is just weight lifting. When I took piano lessons, we did a few minutes of drills and then I got coached on a particular piece of music. When I coach baseball...well, it's little league, so there's no weights. But we don't just practice micro-motions, like gripping the ball across the seams or getting in a good athletic stance on defense. We do that, and then we teach and play the game.

There's a role for high-rep, high-speed, low-reflection, muscle memory learning. MA makes it the whole game.

This accelerated learning is in contrast to AoPS, where students spend *more* time on each subject than you'd see in a typical school, not less. My kids spent 1 calendar year (12 months, not 9) on algebra. (AIME quals starting in middle school, ended up at Stanford.) I can't even conceive of spending 3 months studying algebra. I have a student who spent 2 years on AoPS algebra before representing at IMO and ending up at MIT.

> Math Academy offers direct instruction for procedures, discovery learning for concepts

🎯🎯🎯

This is simply not accurate.

Concepts are clearly explained throughout the curriculum. Can we do more? Yes, and more can indeed be done to probe conceptual understanding. If you care to read my comments on this article, you can understand more about the situation and why many of the author's arguments simply don't hold water (though I concede a few valid points are made).

https://open.substack.com/pub/pershmail/p/math-academy-wants-to-supercharge?r=26hw1q&utm_campaign=comment-list-share-cta&utm_medium=web&comments=true&commentId=92323085

> Concepts are clearly explained throughout the curriculum

Just because concepts are clearly explained doesn't mean they're read or learned. Check your clickstream logs and let us know how many kids linger over those conceptual explanations vs. skipping straight to the exercises and pattern matching from the worked examples.
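As a rough illustration of the kind of log analysis I mean, assuming a hypothetical list of page-view records with time-on-page (not any real Math Academy schema):

```python
from dataclasses import dataclass

@dataclass
class PageView:
    student_id: str
    page: str              # e.g. "explanation" or "exercise"
    seconds_on_page: float

# Hypothetical cutoff: under 20 seconds on an explanation counts as skimming.
SKIM_THRESHOLD_SECONDS = 20

def share_who_linger(views: list[PageView]) -> float:
    """Fraction of students whose average time on explanation pages exceeds the cutoff."""
    per_student: dict[str, list[float]] = {}
    for v in views:
        if v.page == "explanation":
            per_student.setdefault(v.student_id, []).append(v.seconds_on_page)
    if not per_student:
        return 0.0
    lingerers = sum(
        1 for times in per_student.values()
        if sum(times) / len(times) > SKIM_THRESHOLD_SECONDS
    )
    return lingerers / len(per_student)

views = [
    PageView("a", "explanation", 95.0),
    PageView("a", "exercise", 40.0),
    PageView("b", "explanation", 6.0),   # skimmed
]
print(f"{share_who_linger(views):.0%} of students linger on explanations")
```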

> Just because concepts are clearly explained doesn't mean they're read or learned.

I understand the point, and I agree with it.

As well as clearly presenting mathematical concepts, the curriculum already contains plenty of points where we check for conceptual understanding, and we don't allow students to make progress unless they've demonstrated the required understanding. We need more of this, certainly, and I have already stated that we plan to do more.

We also have regular timed quizzes, multipart problems, and reviews, which all help ensure students learn the material and are not simply pattern-matching.

In its current form, Math Academy works exceptionally well for serious students with reasonably good study skills who want to fully engage with the material as it is presented.

Students who are uninterested in engaging with the material or do not read the information (like the author) will struggle, fair enough. But as I mentioned, Math Academy is a work in progress. We're still in beta.

* Can we do more to engage less serious students? Certainly.

* Can we do more to help students develop the study skills they need to be successful with Math Academy? Absolutely.

* Are there more ways we can check for conceptual understanding along the way? Yes.

* Are we doing all of the above? You bet!

I discussed our plans to address all these points in my response to the main article if you'd be good enough to read it. If you'd like to critique some of the specific points or otherwise have a constructive discussion about our plans, I'm happy to do that.

I wonder whether the need for this type of program (and Khan Academy before it) is an indictment of our school systems ignoring skills training in mathematics. There are certain parts of math that definitely require memorization and practice of skills, but that's not everything. I like that in the Beast Academy series there is a character named Sargent Rote who shows up whenever this type of training is needed. But there is a reason Sargent Rote is not your main teacher; he only needs to show up once in a while.

In a world where students are never properly drilled in these skills, the lack of skills shows up as the most obvious problem and looks like the only one. As a result, we see programs like this showing up, promising a miracle cure for math.

From my perspective, US schools are certainly spending a great deal of time on procedural learning. Even teaching that I don't find particularly effective focuses on procedures over concepts. If I had to point to a single system-wide fault, it's that we tend to teach a unit/chapter and not return to it at some grade levels, so kids move on without having actually mastered the skills.

Long time reader, first time poster. Geo distributions are for modelling situations where the variable is the number of attempts needed for the first success, e.g. X = the number of times Michael must roll a die before scoring a 6, X~Geo(1/6). It's a geometric (exponential) sequence: P(X=n) = failure^(n-1) * success. If you are familiar with Binomial distributions, where the number of trials is fixed and the number of successes is the variable, Geo is a special case of the Negative Binomial, where the number of trials is the variable and the number of successes is fixed, i.e. Geo(p) is NB(1, p).

As an 11-18 maths teacher coming from a background in Computer Science, I have little faith in any online 'adaptive' platforms that I've come across thus far.

Aw, rats. I was feeling kind of hopeful about this option. I'll still check it out, but with much lower expectations. From what you're saying it sounds like a slightly more thoughtful version of IXL that's added in automatic spaced repetition. Were the explanations shallow, or just too tedious? Seems that even using it as a practice supplement would be problematic if you don't have access to all the topics at any given time.

Personally, I’ve used Math Academy for five months and I’ve become so much more comfortable with calculus than I ever was. I find that filling in the intuition after the fact is far more efficient than trying to grok every single math skill ahead of time.

Sounds though like you've had past experience with calc? I think "sharpening previously or incompletely remembered math" is definitely a solid use case given my critiques.

Ben, genuine Q here: have you tried testing your MA-obtained knowledge on non-MA questions? I'm concerned about what Michael wrote re the questions all being the same, and hence 'learning' just being repeated plug-and-play.

I think the explanations themselves are fine, if tedious and wordy. There were times, though, when the explanation was particularly shallow, and I couldn't tell if there was a better explanation coming later (when they served me a follow-up skill) or if it was in some pre-req lesson.

Though you're right that it would be tricky to use as a supplement, it would be kind of cool to have practice that filled in prior gaps. It's true that it would jump around the course -- sometimes previewing, sometimes reviewing. Not sure if that would work or not.
