New Perspective on the Department of Education’s Report to Congress on Educational Software
Friday, April 06, 2007
As I reviewed the recent report on the effectiveness of reading and math software published by the DOE, the study appeared to be very well designed, written, and executed. However, given my background as an educational software developer, I quickly spotted one very large confound: the reading and math software titles selected were all “first-generation” products that, quite honestly, are mostly outdated! By that I mean these products lack intelligence in how they interact with students.
Let me elaborate. If you put a student on a computer-based learning program, it won’t necessarily benefit the student if the instructional content is not appropriate to the student’s instructional level. It’s like going to the doctor and saying, “I don’t feel well. Can you give me some medicine?” And then the doctor throws you a bottle of random pills. This would never happen. The doctor would first do a thorough diagnosis.
This diagnosis is exactly what is missing from most of the software products used in this study. Let’s be realistic: most big publishers (and most of the programs were from larger publishers) simply buy smaller companies, repackage their materials, and then market those materials as the silver-bullet solution. And of course they pay for a study to show this to be true. For the most part, the first-generation products used in the study treat reading or math as a linear skill for instruction. If Johnny is in second grade and is behind in reading, they place him into their first-grade materials. This is a simplistic approach to instruction and is thus ineffective! They don’t ask what the real issue is. Is it decoding, vocabulary, comprehension strategies, or a combination of all three? In math, is it a lack of knowledge of math terms or basic math facts, or is it a conceptual misunderstanding, such as of fractions? Students’ needs are diverse, and without proper evaluation, instruction will be inefficient, whether it is delivered by direct instruction or by computer.
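To make the distinction concrete, here is a toy sketch (not any vendor’s actual system, and the strand names and numbers are illustrative assumptions) of the difference between placing a student at one linear point and diagnosing each reading strand separately:

```python
# Toy sketch: single-point "placement" vs. per-strand diagnostic placement.
# All strand names and grade levels here are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class ReadingDiagnosis:
    decoding: float       # measured grade level in decoding
    vocabulary: float     # measured grade level in vocabulary
    comprehension: float  # measured grade level in comprehension

def linear_placement(d: ReadingDiagnosis) -> dict:
    """First-generation approach: one number drives all instruction."""
    level = min(d.decoding, d.vocabulary, d.comprehension)
    return {s: level for s in ("decoding", "vocabulary", "comprehension")}

def per_strand_placement(d: ReadingDiagnosis) -> dict:
    """Diagnostic approach: each strand gets instruction at its own level."""
    return {"decoding": d.decoding,
            "vocabulary": d.vocabulary,
            "comprehension": d.comprehension}

# A second grader who is "behind" overall, but unevenly so:
johnny = ReadingDiagnosis(decoding=1.0, vocabulary=2.5, comprehension=1.5)
print(linear_placement(johnny))      # every strand forced down to 1.0
print(per_strand_placement(johnny))  # each strand taught at its own level
```

With linear placement, Johnny’s strong vocabulary is dragged back to first-grade material along with everything else; with per-strand placement, only his weak decoding gets remediated at that level.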
The sad reality of most first-generation technology products is that publishers have simply taken one set of curricula and pushed it onto the computer. Often they say a “diagnostic” assessment is built in, but it is usually little more than a placement test. Students may take it, and it may place them at a single linear point in the instruction. Then they proceed to work through lessons that are either too hard or too easy. The reality is that, especially in urban schools, students are so far behind that we cannot waste any instructional time. Every minute must be spent teaching students at their instructional level, or it is wasted.
To meet the individual needs of students, deep and thorough diagnostic assessment must be combined with flexible and differentiated instruction. This is true for both the classroom and computer-based instruction. At Let’s Go Learn, we understand this model and use it to guide our product development. But then again, we are a next-generation technology company, building tools from the ground up rather than repackaging old technologies.
Fundamentally, next-generation tools combine a powerful assessment solution with multiple instructional solutions. I say multiple instructional solutions because another reality is that different instructional courseware takes different approaches. Some work well with certain students, while others benefit other students. Generally, companies that are good at creating instruction are not always good at creating assessments, so teaming up is necessary. This is exactly what Let’s Go Learn has done. Our proven expertise is in producing great online diagnostic assessments, but we don’t have the time or expertise to produce great computer-based instructional lessons. So we’ve teamed up with multiple instructional developers and combined our skills to make next-generation products that will meet the instructional needs of a greater range of students. This has to happen more! And it will. More people today understand what diagnostic assessment means, and fewer confuse it with accountability or state testing. In 2001, it was amazing how many administrators thought that their state tests were diagnostic. Fortunately, people are getting smarter. Now we just need to work on making our kids smarter by putting diagnostic assessment into our classrooms and computer-based activities.
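The “one assessment, many instructional solutions” idea can be sketched in a few lines. This is a hypothetical illustration only: the product names and the routing rule (send the student to the courseware covering the strand where the gap from grade level is largest) are my own assumptions, not a description of any real partnership:

```python
# Hypothetical routing: one diagnostic profile feeds several instructional
# products, each strong in a different area. Names are invented for
# illustration and do not refer to real courseware.
def dominant_need(profile: dict) -> str:
    """Return the strand where the student is furthest behind grade level."""
    grade = profile["grade"]
    gaps = {strand: grade - level for strand, level in profile["levels"].items()}
    return max(gaps, key=gaps.get)

# Assumed mapping from diagnosed need to a partner's instructional product.
COURSEWARE = {
    "decoding": "PhonicsTutor",
    "vocabulary": "WordBuilder",
    "comprehension": "StrategyCoach",
}

student = {"grade": 2.0,
           "levels": {"decoding": 1.0, "vocabulary": 1.8, "comprehension": 1.5}}
print(COURSEWARE[dominant_need(student)])  # prints "PhonicsTutor"
```

The point is architectural: because the diagnostic layer is separate from the instruction, new courseware can be plugged into the mapping without rebuilding the assessment.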
Finally, please don’t be too down on technology based on this study. After all, technology is an evolving tool. Consider the stone hammer of early cavemen: it wasn’t much better than a rock in hand, but when the stone hammer became an iron hammer, it was indeed much better.
My humble opinion,
Richard Capone, CEO