Online Education Blog

Our blogs discuss issues pertaining to the education community, for both educators and parents.

How can my district prepare for the new Common Core State Standards online test?

Last weekend I presented at the California Math Council conference in Asilomar, California, and a month earlier at the council’s Palm Springs conference.  I presented on the technology implications of the new Smarter Balanced Assessment Consortium (SBAC) state test.  I defined “adaptive assessment” for the audience and discussed the logistics of online testing, and we had a great discussion of the entire topic.  (My slides - PDF)

Here are some of the questions and concerns that emerged about CCSS and the upcoming Smarter assessment.

  • How many computers will we need to test all our students?
  • What is computer adaptive testing?
  • How can adaptive testing be used on a summative state assessment?
  • What will the test look like?
  • How can free response questions be used in an assessment?
  • Is the state test going to be valid if it uses open or free responses?
  • Will the tests work on iOS or iPads?
  • What is the difference between the summative SBAC assessment and the interim assessments?
  • What is the difference between SBAC and PARCC?
  • How can we prepare for the new CCSS assessment?

Before I answer these questions, I want to point out that many teachers and administrators present were concerned about whether the new state tests would be fair or valid, given that many of the CCSS were potentially very hard to assess.  For instance, higher-level thinking and problem solving were areas of real concern.  Others in the audience and I tried to put things in perspective.  The CCSS are here.  They reflect a shift toward instructional standards that is necessary for long-term student growth and readiness for college and careers.  With that said, districts cannot control how well SBAC or PARCC develop their assessments.  But schools and districts can adopt instructional practices that teach students for the long term: not teaching to the test, not focusing solely on skills-based exercises, including higher-level thinking and problem-solving strategies in the curriculum, and so on.  Of course, the idea of a new online state test is a scary prospect, so much of the session focused on discussing and debating the implications of SBAC.

Q: How many computers will we need to test all our students?

A:  The testing window for the year-end summative test is currently set to 12 weeks.  This means that one or two labs of computers may be sufficient for a school of 500 students.  I pointed out that in our experience, schools have been able to test their students with our assessments using only one lab.  I recommended that schools with only one lab work toward having two labs, or approximately 60 computers.  This could mean 30 computers and 30 touch-screen devices.
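
To make that capacity math concrete, here is a minimal back-of-the-envelope sketch; the session counts, sittings per student, and other scheduling numbers are my own illustrative assumptions, not SBAC figures.

```python
# Rough lab-capacity estimate for a 12-week summative testing window.
# All scheduling numbers below are illustrative assumptions, not SBAC figures.

students = 500             # students to test
labs = 2                   # two labs of 30 seats each
computers_per_lab = 30
sessions_per_day = 4       # assumed testing blocks a lab can host per school day
school_days = 5 * 12       # 12-week window
sittings_per_student = 2   # assumed separate sittings (e.g., ELA and math)

seats_available = labs * computers_per_lab * sessions_per_day * school_days
seats_needed = students * sittings_per_student

print(f"Seat-sessions available over the window: {seats_available}")
print(f"Seat-sessions needed: {seats_needed}")
print(f"Window utilization: {seats_needed / seats_available:.1%}")
```

Even with generous slack for scheduling conflicts and make-up sessions, the arithmetic shows why a school of 500 is unlikely to need more than two labs.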

Q: What is computer adaptive testing?

A: Adaptive testing allows the test to adjust to the student in real time.  It reduces student frustration and improves the quality of the assessment by not wasting precious testing time on questions that are clearly too hard or too easy for a student.  In the case of our assessments, even at-risk students walk away feeling good about how they did.  Let’s Go Learn’s assessments work in item sets at a given level, and once we know a student can’t master a construct such as short vowels, there is no reason to give him or her the full set of 5-8 questions.  We stop and move on to something easier within the same sub-test, or on to a new sub-test.
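
Here is a highly simplified sketch of that kind of early-stopping logic; the construct names, item counts, and stopping rule are illustrative assumptions on my part, not Let’s Go Learn’s actual algorithm.

```python
# Simplified sketch of early-stopping logic within one sub-test.
# Construct names, item counts, and the stopping rule are illustrative
# assumptions, not the actual Let's Go Learn algorithm.

def administer_construct(items, ask_question, max_wrong=3):
    """Present up to len(items) questions; stop early once mastery is clearly absent."""
    wrong = 0
    for item in items:
        if not ask_question(item):
            wrong += 1
        if wrong >= max_wrong:
            # No reason to give the full set of 5-8 questions:
            # stop here and move on to something easier.
            return "not mastered"
    return "mastered"

def run_subtest(constructs, ask_question):
    """Administer each construct in turn, applying the early-stopping rule to each."""
    return {name: administer_construct(items, ask_question) for name, items in constructs}

# Example with a stand-in answer function that misses every question.
constructs = [("long vowels", ["q1", "q2", "q3", "q4", "q5"]),
              ("short vowels", ["q6", "q7", "q8", "q9", "q10"])]
print(run_subtest(constructs, lambda item: False))
```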

Q: How can adaptive testing be used on a summative state assessment?

A: There was a lot of worry because in an adaptive assessment students are not all given the same questions.  Concerns were therefore raised as to whether the test could provide normative results (for example, that Student A is at the 45th percentile).  I explained that this is not a concern.  In adaptive tests, all items are statistically scaled to a common difficulty scale before the operational assessment, so students’ scores are comparable even when they see different items.  The modern statistical methods used today allow adaptive tests to be norm-referenced.  I am 100% confident in this, and it should not be a concern at all.
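
To illustrate why this works, here is a minimal sketch using a simple Rasch (one-parameter IRT) model; the items, difficulties, and norming distribution are illustrative assumptions, not SBAC’s actual psychometric model.

```python
# Sketch of why differently-routed adaptive tests still yield comparable scores.
# Item difficulties are pre-calibrated on a common scale (a simple Rasch model
# here); the ability estimate, not the raw number correct, is what gets reported.
# Items, difficulties, and the norming distribution are illustrative assumptions.
import math
from statistics import NormalDist

def rasch_prob(theta, difficulty):
    """Chance a student of ability theta answers an item of this difficulty correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

def estimate_ability(responses):
    """Crude maximum-likelihood search over a grid of ability values."""
    grid = [t / 10 for t in range(-40, 41)]  # abilities from -4.0 to +4.0
    def log_likelihood(theta):
        return sum(math.log(rasch_prob(theta, d) if correct else 1 - rasch_prob(theta, d))
                   for d, correct in responses)
    return max(grid, key=log_likelihood)

# Two students who saw *different* items are still placed on the same scale.
student_a = [(-1.0, True), (-0.5, True), (0.0, True), (0.5, False)]  # easier route
student_b = [(0.0, True), (0.5, True), (1.0, True), (1.5, False)]    # harder route
for name, responses in (("A", student_a), ("B", student_b)):
    theta = estimate_ability(responses)
    percentile = NormalDist(0.0, 1.0).cdf(theta) * 100  # assumed norming distribution
    print(f"Student {name}: ability about {theta:+.1f}, roughly the {percentile:.0f}th percentile")
```

Because both students’ responses are evaluated against pre-calibrated item difficulties, the student who answered harder items correctly earns the higher ability estimate and percentile, even though both answered the same number of questions correctly.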

Q: What will the test look like?

A: SBAC has released sample items on its site:  (http://www.smarterbalanced.org/sample-items-and-performance-tasks/)

Q: How can free response questions be used in an assessment?  AND
Q: Is the state test going to be valid if it uses open or free responses?

A: The questions surrounding open or free response brought a lot of heated discussion.  First, I clarified that open or free response does not always mean the question will be hard to score.  In the case of certain math or spelling questions, a free response is easy to score and reduces the chance of guessing.  If the answer to a math problem is “15.8,” allowing the student to enter a free response reduces the chance of guessing correctly from 25% on a four-choice multiple-choice question to almost 0% on an open response question.
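
As a tiny illustration of how mechanically such a response can be scored, consider this sketch; the tolerance and example values are my own assumptions.

```python
# Sketch of machine-scoring a numeric free response.
# The exact-match tolerance and example values are illustrative assumptions.

def score_numeric_response(student_input, correct_answer, tol=1e-9):
    """Give credit only if the entry parses as a number equal to the key."""
    try:
        return abs(float(student_input) - correct_answer) <= tol
    except ValueError:
        return False  # non-numeric entries earn no credit

print(score_numeric_response("15.8", 15.8))   # True
print(score_numeric_response("15.80", 15.8))  # True  (formatting does not matter)
print(score_numeric_response("16", 15.8))     # False

# Guessing odds: 1 in 4 on a four-option multiple-choice item,
# effectively 0 on an open numeric entry field.
```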

However, I think the greatest concern related to the category of written free responses.  Teachers and administrators were concerned that having either humans or an automated system review the responses might lead to inaccurate results.  There is a rubric that is supposed to be used for the scoring.  But here is what I stated, in sum: SBAC is not going to use items that are unreliable.  If a rubric is faulty or the human scores are not consistent, this fact will come out in the trial data analysis.  I reminded everyone that this is a high-stakes test, and if the variability is too great, the items will not be used.

I’ve talked to psychometricians (we have one at Let’s Go Learn), and identifying bad items is a known science.  In simple terms, you take sample items and pilot them with real students.  If too many test-takers get a question correct, you throw it out.  If too few test-takers get a question correct, you throw it out.  If the variability of the item’s scores is too great, you throw it out.  This last criterion is harder to explain, but essentially it is derived by comparing an item with other similarly leveled items.  With large numbers of students, you can group students of like ability.  If those students’ results vary too much on certain free response questions, you know that those questions have issues and will probably not give you reliable results if used.
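
To sketch the kind of pilot screening described above in very simple terms, something like the following check applies; the thresholds and sample data are illustrative assumptions, not SBAC’s actual criteria.

```python
# Sketch of screening piloted free-response items.
# Thresholds and sample data are illustrative assumptions, not SBAC criteria.
from statistics import mean, pstdev

def flag_item(scores, too_easy=0.90, too_hard=0.10, max_spread=0.35):
    """scores: rubric scores scaled to 0..1 from pilot students of similar ability."""
    p = mean(scores)         # item difficulty: average score earned
    spread = pstdev(scores)  # score variability among like-ability students
    if p > too_easy:
        return "drop: too many test-takers get it right"
    if p < too_hard:
        return "drop: too few test-takers get it right"
    if spread > max_spread:
        return "drop: scores vary too much for students of similar ability"
    return "keep"

# Example pilot results for three free-response items (0 = no credit, 1 = full credit)
print(flag_item([1, 1, 1, 1, 0.75, 1]))        # too easy
print(flag_item([0.5, 0.75, 0.5, 0.5, 0.25]))  # keep
print(flag_item([0, 1, 0, 1, 1, 0]))           # inconsistent scoring
```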

Q: Will the tests work on iOS or iPads?

A: All test and web developers know that Adobe Flash is on the way out.  HTML5 is the new standard, and from my perspective as a developer I am fairly confident that all of the new state tests will work on iOS devices such as the iPad.  Also, SBAC states that it will support multiple platforms, including iOS.  See the link here.

Q: What is the difference between the summative SBAC assessment and the interim assessments?

A: The summative assessment will be given during the last 12 weeks of the school year.  This is the norm-referenced, high-stakes test that will be used to rank schools.  The interim assessment is optional; districts can choose whether to use it.  It will provide formative and diagnostic data to inform instruction.  Given that Let’s Go Learn is a diagnostic assessment company, we see this interim assessment as being very different and probably having a greater adaptive logic range.  Its reports will target teachers, who will use the data in the classroom to inform instruction.  Here are some samples of Let’s Go Learn’s diagnostic reports; notice how they are focused on individual students.

Reading Detailed Report - K-7 Math Detailed Report - Pre-Algebra Detailed Report

Q: What is the difference between SBAC and PARCC?

A: SBAC, the Smarter Balanced Assessment Consortium, is developing an adaptive online state assessment based on the new CCSS.  PARCC, the Partnership for Assessment of Readiness for College and Careers, is another consortium building an online state test, but one that is not computer-adaptive.  The CCSS will still be assessed; PARCC’s test will simply follow the traditional model of a fixed test administered per grade level.

SBAC website - PARCC website

Q: How can we prepare for the new CCSS assessment?

A: From a technology standpoint, online testing is not new.  The pitfalls are known: insufficient Internet bandwidth, district web-content filters, problematic wireless networks, schools’ lack of logistical experience, etc.
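
As one example of turning those pitfalls into a concrete check, here is a rough bandwidth estimate; the per-student figure and overhead factor are my own assumptions, so consult your test vendor’s technical documentation for real numbers.

```python
# Back-of-the-envelope bandwidth check for concurrent online testing.
# The per-student figure and overhead factor are assumptions; check your test
# vendor's technical documentation for actual requirements.

concurrent_testers = 60    # e.g., two full labs testing at the same time
kbps_per_student = 50      # assumed sustained bandwidth per tester
overhead_factor = 1.5      # headroom for everything else on the network

required_mbps = concurrent_testers * kbps_per_student * overhead_factor / 1000
print(f"Plan for roughly {required_mbps:.1f} Mbps of available bandwidth during testing.")
```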

My recommendation is that you check out the links below and then create your own district guidelines.  Also, most districts now use some sort of web-based services.  I think it is a good idea to use web-based testing tools that will help you move toward personalized learning and, at the same time, give you experience in rolling out testing to portions of your student population.  Our diagnostic assessments and optional instruction are a good place to start.  You can use them with a single class, a school, or a district.  Get More Info or Start a Trial Now with Let's Go Learn.

District Technical Document Review Guide (PDF) - School Lab Guide (PDF)

Tags: CCSS, SBAC, PARCC, CAT, computer adaptive testing, common core state standards, smarter balanced assessment consortium, what can schools expect

Comments
  1. Jonathan of GUSD, 02/13/13

    This was very helpful information and answered many of my questions.  I admit our district has been very anxious about the new CCSS and how they will affect us in the coming years.  Like many other districts we are also trying to move towards personalized learning.  Thus, we will be contacting you soon to evaluate DORA and ADAM.  I think it is time we integrated good online diagnostic assessment into our schools and classrooms.
