Math Academy, a buzzy online math program, claims its users learn math at “four times the speed of a traditional math class.” That is fast. Unbelievably fast. It typically takes eight years to get from 4th to 12th Grade, but with Math Academy it takes just two, and then on to college courses like “Linear Algebra” and “Machine Learning.”
Jason and Sandy Roberts launched Math Academy as an accelerated math program for middle schoolers, housed in Pasadena Unified School District. The software arrived after Jason, an early Uber employee, got tired of collecting and marking homework. Once the pandemic hit, Jason developed the software further, transforming Math Academy into an edtech company with ambitions to fix math education, which they believe to be badly broken.
Well, sure. Everyone in edtech says school is broken and they can fix it. I was prepared to give Math Academy no further thought.
But I was intrigued, mainly because of a dynamic, constantly growing 457-page manifesto created by their Director of Analytics, titled “The Math Academy Way.” This manifesto is subtitled “Using the Power of Science to Supercharge Student Learning.” In it they claim to have designed a system that “emulates the decisions of an expert tutor.” I was definitely skeptical, but they’re thoughtful and have experience with kids. So I shelled out $50, completed my diagnostic exam, and entered the world of accelerated learning.
Explanations are Optional
Math Academy is unabashedly procedural. They believe above all in practice that closely follows given examples. Here is how lessons are described in “Math Academy Way”:
Each lesson starts out with an introduction, and then moves to a worked example, followed by 2-5 practice questions on the same type of problem as the worked example.
Introduction. Example. Practice, largely multiple choice. This is the core of the Math Academy experience.
Now, I’m a big fan of practice. But consider what it would look like for me, a human tutor, possibly an expert, to teach as this program does:
I hand my student a page from a textbook and ask her to read. We don’t talk. She tells me when she’s finished. Then I hand her another piece of paper, this time with an example on it. She looks at it as long as she likes. Then I give her two multiple-choice problems. If she solves them correctly I say “Congratulations! You’ve earned +5 XP.”
Is that great tutoring? For a high school student? For a 4th Grader? No, no, no. Good teachers don’t sit around and watch kids read. They ask probing questions. They ask for explanations. They build understanding.
Given the choice, most people will skim over technical explanations. During my month on MA, enrolled in Probability & Statistics, I was no different, especially since the worked examples were briefer, clearer, and always sufficient for passing the lesson.
![](https://substackcdn.com/image/fetch/w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F26a831e4-c71b-477c-8254-5088005c5069_1450x1460.png)
Here is one source of that 4x acceleration: Math Academy has created a system where explanations are optional.
In classrooms, we don’t let kids hit fast-forward. We prompt with questions. We command attention in real-time. Admittedly, that slows things down. I know a lot of kids would skip it if they could—that’s why you hand out the worksheet after the discussion, you know?
In most cases, “Math Academy Way” comes down strongly in favor of teacher guidance. They are against group work and unguided practice. Fair enough. But their app gives students total control over time on task and depth of engagement. No expert tutor would do this; researchers certainly don’t recommend it.
But you only get that sweet XP for solving problems…so yeah, students are going to go faster, and learn shallowly as a result.
Never Forget, Never Understand
You might expect that this would all take care of itself. MA is, after all, a “spaced repetition” program, constantly reviewing earlier content. Poor learning will eventually lead to incorrect answers. Users will be forced to revisit earlier material and improve.
But during my time with the program, there was a problem with this: Math Academy never let me forget my shallow learning. I kept answering questions correctly, and MA kept pushing me onward, without a chance to deepen my understanding.
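The review mechanism described above is the standard spaced-repetition schedule: each correct answer stretches the interval before the next review, while a miss pulls the item back into frequent rotation. Here is a minimal sketch of that logic (a generic textbook version, not Math Academy’s actual algorithm; the `ease` growth factor is an illustrative assumption):

```python
from dataclasses import dataclass

@dataclass
class Card:
    interval_days: float = 1.0   # days until the next review
    ease: float = 2.5            # how much a success stretches the interval

def review(card: Card, correct: bool) -> Card:
    """Update a card's review interval after one practice attempt.

    Correct answers stretch the interval multiplicatively; a miss
    resets it, forcing the material back into frequent rotation.
    """
    if correct:
        card.interval_days *= card.ease
    else:
        card.interval_days = 1.0
    return card

# A string of correct answers pushes the next review far into the future...
card = Card()
for _ in range(4):
    review(card, correct=True)
print(round(card.interval_days, 1))  # 1 * 2.5^4 = 39.1 days
```

The complaint in this section follows directly from that loop: as long as the answers keep coming back correct, the interval only grows, so shallow knowledge that still produces right answers is never scheduled for relearning.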
I ended up thinking about multiplication facts. A definite finding of cognitive science is that people store facts in a connected network. A prompt of “12 x 8” triggers a whole host of associations—6 x 8 = 48, 12 x 4 = 48, 10 x 8 = 80, 8 + 8 = 16, and so on. A significant part of what we call “conceptual understanding” is this network of learned associations.
In remedial situations, you’ll often encounter students whose network of associations is weak. For them, 12 x 8 triggers 12 + 8 = 20, or even twelve 8s. In intervention situations it sometimes makes sense to focus on memorizing facts and formulas first, without worrying as much about the network. As MA puts it, if you practice retrieving this fact often enough, you’ll “retain it indefinitely.” In some cases, that can help.
In my time with MA, I felt myself turning into this kind of remedial student. Spaced repetition works—I really was able to hold on to formulas for variance and expected value long enough to easily ace their quizzes. But I felt the weakness of my own understanding, and it honestly bummed me out. I still don’t know what a geometric distribution is.
My retention was only possible because they were targeting it so precisely. Their probability course is composed not of units but of 180 topics, each itself composed of 2-3 skills. The problems I practiced were identical to the examples, just with different numbers. Same with review and assessment—I never encountered a question in practice that I hadn’t seen explained before.
In an actual classroom, this sort of micro-targeting is impossible.¹ I would have forgotten the variance formula, and had to go back and relearn it—perhaps more deeply. My weak understanding would have been exposed by unfamiliar questions which didn’t directly resemble ones I had already studied. I’d need to go back and relearn, or ask for help. It would require, no doubt, a certain amount of resilience. I guess you could call that a flaw.
But in Math Academy, learning was smooth and flawless, to a fault.
I hated this. It made me feel sad and stupid to know how to answer questions but only in this shallow way. It also meant that, for all the criticism in “Math Academy Way” of unguided instruction, I was essentially left on my own to flesh out my understanding. Math Academy offers direct instruction for procedures, discovery learning for concepts.
But, yes: this flawless learning will indeed go faster than a typical classroom.
Who Is Math Academy For?
Friends with bright kids sometimes ask me for advice. Their children are bored in math and looking for more. It’s a hard question to answer. I’m always looking for products or courses to recommend.
Man, I sure wish I could recommend Math Academy.
I will say that as a supplement to an actual course, MA could be powerful. I don’t know of another site that offers this much basic practice for university mathematics, something that is sorely needed.
But in the absence of a teacher, I simply can’t recommend this program, even for strong students. The rapid gains are somewhat illusory. Unless you’re going deeper into something familiar, you’ll have to go back and study it all again. I was on track to pass Probability & Stats in a few months based on my accumulated ability to answer multiple-choice questions I didn’t understand. Is this what strong math students deserve?
At the end of my month on MA, I went looking for information about the original Pasadena program. I found a student article titled “Math Academy, A Decade Later,” and was surprised to see my perspective echoed by Pasadena high school students:
“Math Academy sucks and genuinely should not exist. The dropout rate is absolutely ridiculous and the stress it puts on students is just unparalleled,” said one student who left the program after 8th grade.
So what are the problems that students have with the program? For one, students say that going through content at such a fast pace results in very high standards and only a surface-level coverage of many important concepts. “We do too much in not enough time, and by the time we finish learning a topic, it doesn’t stick with us,” said another student who left after middle school.
“The XP system is just dumb. It isn’t relevant to anything we learn in class and the negative XP discourages learning on the website because the goal isn’t to learn, it’s to get as much XP as possible,” said a sophomore currently in Math Academy.
I truly have no ill will toward the people who have made this thing. They all seem sincerely devoted to improving math education. There are things that they do well. They’re only in “Beta,” and will surely make further tweaks as they grow.
But I have to call it like I see it: Math Academy is fundamentally broken, and I don’t think it can be fixed.
¹ Though they tried to build it into their in-person classes: “Before building our online system, we actually tried performing a loose approximation of spaced repetition manually while teaching in a human-to-human classroom.”
Hi, this is Alex, Curriculum Director of Math Academy.
After some reflection, I decided to respond to this.
TLDR
* Math Academy is a work in progress (which is why we're still in beta).
* Do we avoid or skip the teaching of concepts? No, certainly not!
* Could we do more to ensure conceptual understandings are hitting home? Absolutely.
* Can anything be done to ensure that a larger proportion of 4th-grade students can benefit? For sure.
* Is Math Academy fundamentally broken and impossible to fix? GTFOH!
Long Form
This is too long to post as a single comment, so it will have to come as a series of comments.
While the author raises a few reasonable points, it's nothing we're not already aware of. Also, it's self-evident that the author had no intention of properly engaging with the platform.
It's not my intention to go through the review line-by-line and comment on every point. However, some points about the author's perspective on education stood out for me.
> Good teachers don’t sit around and watch kids read. They ask probing questions. They ask for explanations. They build understanding.
Not quite. When a good one-on-one tutor introduces a new concept to a student, they might start with some (very) short motivation. Then, they'll introduce a worked example, as simple as possible to start, and carefully walk through the solution, invoking prerequisite ideas where necessary and using their knowledge to anticipate where the pain points in the solution process might be. They show them how to solve the problem and then get the student to demonstrate their understanding by solving similar problems.
Introducing a concept for the first time is usually not the time to ask "probing questions." It's way too early for that! The student only learned what the hell a quadratic equation was two minutes ago, and you're already probing them! Give them a chance to internalize some information first!
So, go through 1-2 worked examples and check that they understand by having them solve some similar problems. Once they've gotten the hang of that, slightly increase the complexity of the problem. Do that 2-3 times, and you'll come away with a student who's had a happy and satisfying learning experience!
Most students struggling with a new idea don't want to be probed for understanding every two seconds. If anything, too much probing too early could undermine their confidence entirely. Now, that's not to say students shouldn't be probed. But give them a chance to get familiar with the material first. Once they're happy, confident, and have reached a certain point in their understanding, *then* you can challenge them.
I feel that, unlike most users, the author did not make a genuine effort to engage with the product. Quote:
> Given the choice, most people will skim over technical explanations... I was no different.
Actually, no. Most people genuinely interested in learning new material are willing to spend some time reading about what they're trying to learn! (I'm assuming we're talking about more mature students now.) Surely it hasn't gotten to the point where we're discouraging reading to learn!
Granted, technical discussions can be difficult to wade through, at any level. I sympathize. But this is why we break a course down into hundreds of individual topics, which are themselves broken down into multiple stages. The cognitive load (i.e., the amount of new information that requires storing in working memory) at any one point during the learning experience is designed to be as low as possible.
The author raises a reasonable point: getting some 4th-graders to read text and learn from it is nigh impossible. This could be for a variety of reasons. We've never claimed that our 4th-grade course *in its current form* will work for every 4th-grade student, and admittedly, it's down to us to do more to make that dream a reality. But with all that said, our 4th-grade course currently has a 93% completion rate (meaning, 93% of students on this course mastered *every single topic*!), and 99% of lessons are passed within two attempts (coincidentally, 93% of lessons passed within the first attempt). So there's no doubt that *some* 4th-graders can read the material, learn from it, and consequently get tremendous value out of the product.
It's our job to widen the net and make the course as inclusive as possible. However, far from this being an unfixable problem, as the author suggests, technical solutions are possible and already in the works, such as:
* In-task analysis (figuring out exactly what students are doing while using the platform and taking steps to intervene if they slack or goof off).
* In-task coaching, using the in-task analysis data (basically guiding students on how to use the system correctly).
* For younger students who have difficulty reading text, videos and/or animations are always a possibility.
The in-task analysis and coaching are technically challenging. You'll need to figure out what the student is doing in real-time and respond accordingly, just like a teacher or tutor would. But these things are certainly not impossible. We're already on it!
Thanks for the review, Michael. I’ve noticed some of the recent MA developer posts, so I was keen to check it out. I’m surprised because I thought this would be right up your alley (and so did you, I guess?).
But I do wonder how much of your assessment comes down to a disagreement over how “conscious” to make a particular act of learning?
It’s been ages since I read your book, but if I remember correctly, you like to use examples to stimulate thought, reflection, and analysis prior to doing a procedure—is that right?
But you can also use examples to build a kind of mental muscle memory without ever really making the learner aware of it, until you choose to bring something to their awareness “wax-on, wax-off” style. That approach requires a lot of volume, so you want to make each individual task as quick as possible to get enough reps.
(I reckon both approaches are valid and not mutually exclusive—you can alternate—but if you are making a product, you probably have to choose one or the other, at least to begin with, simply because of time and budget constraints.)
Based on your review, MA sounds intentionally like a high-rep, high-speed, low-reflection way to build muscle memory, which is good for trivialising procedures and making other conceptual instruction easier. Does that sound fair? Am I missing the point? (Maybe I should just read their manifesto and not dump these questions on you! 😅)