A New Era: The Consortia Assessments Go Live
The Alliance for Excellent Education Invites You
to Attend a Webinar
Pascal D. Forgione, Jr., PhD, Executive Director,
K–12 Center at ETS
Jacqueline E. King, PhD, Director of Higher Education Collaboration, Smarter Balanced Assessment Consortium
Bob Rothman, Senior Fellow,
Alliance for Excellent Education
Laura Slover, Chief Executive Officer, Partnership for Assessment of Readiness for College and Careers (PARCC)
After four years of development and extensive field testing involving thousands of educators and testing experts, the new assessments that measure student performance against the Common Core State Standards go live this year. Millions of students will take the tests, developed by the Partnership for Assessment of Readiness for College and Careers (PARCC) and the Smarter Balanced Assessment Consortium. What will these assessments look like? What did the consortia learn from the field tests? How will the consortia continue in the future?
Please join the Alliance for Excellent Education for a webinar to discuss the consortia assessments. Leaders of the two consortia, Jacqueline King and Laura Slover, will describe plans for implementation in school year 2014–15. Pascal Forgione will discuss the implications for districts and states. Bob Rothman will moderate the discussion. Panelists will also address questions submitted by webinar viewers from across the country.
Support for this webinar is provided in part by the William and Flora Hewlett Foundation.
Register and submit questions for the webinar using the registration form below. After registering, you will receive an email confirmation. Please check your email settings to be sure they are set to receive emails from firstname.lastname@example.org.
Please direct questions concerning the webinar to email@example.com.
If you are unable to watch the webinar live, an archived version will be available at all4ed.org/webinars usually one or two days after the event airs.
Bob Rothman: Good afternoon. My name is Bob Rothman. I’m a senior fellow at the Alliance for Excellent Education, a nonprofit policy and advocacy organization in Washington, DC. I’ll be your host today. We are very glad that you’ve joined us for the next 90 minutes as we examine one of the most significant developments in American education – the launch of new assessments in more than half the states. Please join the conversation by using the hashtag #all4ed, or leave a question in the box below this video player.
Today we are privileged to be joined on set by a panel of experts on this matter. This promises to be a very informative session, and we will meet our panelists in just a moment. But first let me provide a bit of background.
Four years ago Secretary of Education Arne Duncan awarded grants to two consortia of states to develop new assessments to measure student progress against the Common Core State Standards in English language arts and mathematics. At the time Secretary Duncan called these the next generation of assessments and described them as an “absolute game changer in American education.” These assessments would, for the first time, let students, parents, and teachers know if students are on track toward college and career readiness, according to Secretary Duncan. The assessments would measure complex thinking and learning, not just basic skills. And he said they would set a consistent high bar for success nationwide, as well as making use of technology.
Now, four years later, that day is here. The assessments developed by the two consortia, the Partnership for Assessment of Readiness for College and Careers, or PARCC, and the Smarter Balanced Assessment Consortium, are being administered this school year in 26 states. Millions of students will be taking the assessments, providing themselves, their parents, and their teachers with new and important information on student performance.
All this raises a number of logical questions. What will these assessments look like? What changes have the consortia made after last year’s field tests? How will the consortia continue moving forward? Fortunately we are joined today by leaders of the two consortia. They are here to answer those questions and questions you have about the assessments.
To my immediate left is Jacqueline King, director of higher education collaboration for Smarter Balanced. She’s a former assistant vice president of the American Council on Education and works with higher education institutions to ensure that the assessments align with the expectations of colleges and universities.
Next to Jackie is Laura Slover, chief executive officer of PARCC. She’s a former high school English teacher and former senior vice president of Achieve, where she played a key role in the development of the Common Core state standards.
And at the end of the table we have Pascal “Pat” Forgione. Pat is the executive director of the K–12 Center at ETS and a former district and state superintendent. Welcome, everyone.
This is the seventh webinar we have held with the two consortia. You can view all of these and track the development of the assessments on our website, all4ed.org. Remember, this webinar is an interactive affair. You can join us via the hashtag #all4ed. And don’t forget, there’s a box below the video player where you can ask questions. We will go to those questions from time to time, but please know we will not be able to get to every question. We received several before this webcast even got started, when you registered.
With those preliminaries out of the way we’ll begin the discussion. First, Pat, do you have any opening comments?
Pat Forgione: Thank you, Bob. It’s good to be with you again for our Alliance-sponsored webinar with the leaders of the two comprehensive multistate assessment consortia. Our objective in these webinars is to foster broader and deeper understanding of the important next-generation assessment development work underway by the two state-led and state-governed assessment consortia: the Partnership for Assessment of Readiness for College and Careers, or PARCC, and the Smarter Balanced Assessment Consortium, or Smarter Balanced.
We’ve now completed four years of work on the design, development, and pilot testing of these new assessment systems. And as one who spent many years at the district and state levels, I understand how very challenging it is to stay up on important initiatives like this that will have a major impact on how our students will be prepared and assessed in some 33 member states across the country, beginning this spring of 2015.
I thank the Alliance for Excellent Education for sponsoring these webinars and both consortia for your participation. In addition, one-page, up-to-date summaries of the current assessment designs of PARCC and Smarter Balanced are available on the Alliance site for this webinar and on the K–12 Center website at www.k12center.org. Please join the K–12 Center email list and we’ll send timely and useful resources to you as they become available. Thank you again, Bob.
Bob Rothman: Thank you, Pat. Now let’s turn to Jackie and Laura and find out what’s been happening with the consortia. First, there have been a number of changes in state participation in the consortia over the past few months. We’ll start with Laura. What’s the current status of your membership? Which states will be using the assessments this year?
Laura Slover: Thank you, Bob. Thanks for having me. Thanks, Pat, for being here. And it’s a real testament to the work you do that you’ve had seven webinars, so thank you for continuing to invite us back. PARCC is very strong right now. We have 12 members – 11 states and the District of Columbia – moving forward to administer the tests this spring. Nearly 5 million students will participate. And we are also in conversations with other states and entities who have indicated an interest in joining the group and coming on board.
Your introduction was really great framing, and I just wanna thank you for that. You know, in addition to wanting to provide a comparable metric across states, the states that are working together on PARCC are really building quality. And what they care most about is a quality measure that includes writing at every grade level, problem solving and critical thinking at every grade level, and making sure that kids are doing more than just filling in bubbles. And that’s really important because we want to make sure that classrooms, kids, and teachers are spending the most time doing the most important work.
Bob Rothman: Jackie, what’s the current status of your membership?
Jacqueline King: Sure. We also have a slide to run this down for folks. There are 17 states and one territory, the US Virgin Islands, that we expect to use the Smarter Balanced assessments in 2014/15. Those states are in the process right now of establishing memoranda of understanding with UCLA. Smarter Balanced will be a program within the Graduate School of Education at UCLA. In fact, we are already in that transition process now. So 12 states have signed those memoranda. We expect six more to sign them this year. And then we have four more states that are staying engaged in the consortium but that are still making decisions about assessments and likely won’t administer the Smarter Balanced assessments until some time subsequent to 2014/15.
Pat Forgione: Let me ask a question: can schools or districts that are not part of the consortia use your assessments? Jackie, why don’t you start off?
Jacqueline King: Sure, Pat. So participation by independent schools or other schools outside of the public school systems would really be a state decision within Smarter Balanced. So if, say for example, a group of private schools or a parochial school system were to have an interest in participating in Smarter Balanced, they would need to work that out with their state department of education. And then they could participate sort of under the auspices of their state department.
Pat Forgione: Laura?
Laura Slover: The same is true for PARCC. PARCC is going to make the assessments available to nonmember states, for example, and other entities. So we’ve had inquiries from independent schools, as Jackie mentioned. Charter schools. Archdioceses. Even a little bit of interest from schools in other countries, which is exciting. And of course there are districts within PARCC states that would like to test at more grade levels than their state may. So we are in conversations about that with a number of entities, and we will be rolling out more information on that this month.
Pat Forgione: Thank you.
Bob Rothman: Thanks. I’m going to ask specifically about high school assessments, because those are a little different from grades three through eight. And specifically to Jackie: because you have comprehensive exams at the high school level, how many states are using those to develop their own end-of-course tests?
Jacqueline King: No one will have end-of-course tests for 2014/15 based on the Smarter Balanced item bank. But I’m aware of at least one state that is considering doing that. That would be Idaho, which is considering building end-of-course assessments for grades 9, 10, and possibly 12 using the Smarter Balanced item bank. That’s the only one I’m aware of at this point. It is an option that we are making available to our states.
Bob Rothman: Laura, are there any modifications for the high school tests for PARCC states?
Laura Slover: PARCC is continuing to develop and will administer this year high school assessments in English 1, 2, and 3 – so freshman year, sophomore year, and junior year. And in mathematics, two different math sequences: one a traditional sequence of Algebra I, Geometry, and Algebra II, and then a more integrated Math 1, Math 2, Math 3 sequence. So we’re sticking with that original plan.
Pat Forgione: Laura, PARCC has recently changed its price structure. Can you explain the reason for that and what it means for states in the PARCC consortium?
Laura Slover: Well, initially, several years ago, PARCC publicized an estimate for an average price of about $29.50 a student. Those were early estimates, and we wanted to be really conservative as states were building their budgets. So I’m happy to say that we have been able to find economies of scale and bring down that price quite considerably, by about $5.00 a student. As the numbers have come in on the volumes of how many students are gonna be taking the test and what states are doing to administer the test, we’ve been able to capture those economies of scale.
So I think the most important thing, as I mentioned, is we’ve got 5 million students testing. And as more and more students in more states and entities come in, we’ll be able to drive that price down even lower. So that’s a really good plug for economies of scale. And of course they’re buying together the best-in-class assessments that they have built through hours and hours and hundreds of review sessions with educators across those states.
Pat Forgione: When I was in Connecticut and Delaware – little states – I never had those economies of scale. So that’s very good. Jackie, do you anticipate any changes in the Smarter Balanced price structure? And have many states opted for the full package versus just the summative assessment only?
Jacqueline King: Sure. And we’ve got a slide that details the Smarter Balanced price structure. It has not changed since our original numbers came out. For the summative assessment only, it’s $22.50 per student. And for the full package, which includes access to our digital library and our interim assessments as well as the summative assessment, our estimate is $27.30 per student. I call that an estimate, and it raises an important point about the way our price structure works. States will pay a membership fee to Smarter Balanced to support the continued research and development and the state-led nature of the governance of the assessments. But they’ll administer the assessments themselves. And they’ve all entered into their own contracts to do test administration and scoring.
So looking at that slide, the second set of numbers – for example, $17.75 for the full system – that’s an estimate of what states will pay to administer the assessment. But it really depends on the services that a state plans to offer, the arrangements that they make with vendors, etcetera. And so it can definitely come in lower than that, absolutely. Potentially higher. Although we haven’t heard of any states being surprised to see their administration costs coming in higher than what we estimated. Which is great.
In terms of the packages that the states are choosing, states are still signing their agreements with us and with UCLA. But to date we only have two states that are opting to provide only the summative assessment. The rest of them are offering the full package. Which is great. We’re thrilled about that.
Bob Rothman: I want to follow up, excuse me, on this model of state contracting for delivery. Have there been any mini consortia forming, so that states can do this jointly? And also, we’ve heard some questions about what type of delivery systems they will use. Will they use the Smarter Balanced open-source system, or can they use proprietary systems?
Jacqueline King: Sure. So there have been states that have banded together. A number of the New England states – probably not surprising given the history in New England with the NECAP assessment. Also a group of states in the Pacific Northwest, kind of the northwest corner of the country, that are procuring together. So that’s been a way for smaller states to still realize those economies of scale.
Good question also about the test delivery platforms. Most states will either use our open-source platform or some slight variation on it in order to match what they currently do in other subjects – for example, in social studies or in science. We don’t have a complete breakdown yet of which states are gonna be in which camp in terms of what they’re doing. Some states are still finalizing those arrangements.
Bob Rothman: Mm hmm. Great.
Laura Slover: Bob, if I might. Just PARCC has a slightly different model. So the 12 states that I mentioned are really moving together to contract and each one is building its own separate contract, but they’re all using a common vendor and a common platform to make sure that they have the highest degree of comparability.
You asked about mini consortia. I think within PARCC there are mini consortia forming around which states might want translations in a particular language, again, for the sake of getting economies of scale. If they band together and they all want particular languages, those can be developed more cost effectively.
Bob Rothman: Okay. We also want to ask about the field tests that you conducted last spring. They were very extensive involving millions of students. And we want to talk about what you learned from that and how your assessments changed based on your learning. First, Laura, how did the field test go? Any major glitches?
Laura Slover: Well, I’m really glad to say it went very well. I think Jackie and I both – we’ve been through a lifetime in the last year, but it did go very well. We had a lot of upfront preparation, both consortia, and everything worked fairly smoothly. Now, I want to say it was not without glitches. When you administer a large technology-based field test to over a million kids, glitches are bound to happen. But they were relatively small. They had to do with log-on issues, not system failure, for example. So that was very good news. We had about a million kids, as I said, in 16,000 schools, administered in 14 states and DC. And we learned a few things. And I think there’s probably a follow-up question on this. But we learned that schools really benefited from doing a dress rehearsal, and that’s something that we’ll be promulgating a lot more information about: how classrooms can get online and how schools can really practice with the technology in advance, particularly the equation editor and some other functionality around the items.
In cases where students had seen sample questions and tutorials and educators had taken those tutorials, it went very smoothly. There were no log-on issues at all, and students were very familiar with the functionality.
We did learn that the test administration manuals, which essentially guide a test administrator through the process of test administration, were a little weedy and need to be refined. Those have been reviewed and are being revised right now.
The last thing I’d say is we took a close look at testing time during the process. We had really allocated extra time to make sure that all students had sufficient time to complete the field test. And because kids weren’t taking advantage of that time, we were able to reduce a little bit of the time that schools need to schedule, while still giving students every opportunity to complete their answers.
Bob Rothman: So about how much time in elementary grades and high school?
Laura Slover: We’re looking at a range. There’s a difference between time on task, which is how long kids will actually be taking the test, and how long schools will schedule. Schools will schedule around nine to just over ten hours, but students should complete the test in around the seven- to eight-hour range.
Bob Rothman: Jackie, how did the Smarter Balanced field test go?
Jacqueline King: It went very well. We tested 4.2 million students, which makes it the largest online assessment ever given in the United States. Which is quite something when you think that it’s just the practice run. We were also in a little more than 16,000 schools. Given that size and that volume, we had five states that tested all or virtually all of their students, including several states that had never done online assessment before, most notably California. And hats off to the states – they really just prepared so well. And as a result of that preparation we really saw very few problems. Very similar to PARCC, the technology glitches were minor. They were about lost passwords. Sometimes a little bit of confusion around using some of the accommodations tools online, which are new. And we realized where we needed to improve – similarly, making some of our test administration manuals more streamlined, simpler, easier for people to find information in. But all in all, just really a bigger success than we anticipated it would be, quite frankly.
I’d say in terms of lessons learned, I’ve already mentioned keeping those test administration materials streamlined, clear, simple for people is really important. We also saw that, and this really goes beyond PARCC or Smarter Balanced, but managing test security in the era of the smart phone is challenging. And particularly with the high school students, being sure that those test administration procedures keep those phones on the other side of the room from the students, very important. We were monitoring Twitter and Facebook and finding pictures of, occasionally pictures of test questions and that sort of thing. So this is something that in the world of testing everybody’s just gonna have to find a way to deal with.
So those were some of the lessons learned. We did not have any major changes as a result of the field test in our test design.
Pat Forgione: That’s gonna be our next question, Jackie.
Jacqueline King: Great.
Pat Forgione: Let’s shift beyond the logistical and administrative issues, which you handled remarkably given the size of this effort in both your cases. The field test obviously gave you insights into the constructs you’re measuring and what your assessment instruments should look like. Jackie, could you start: are there any changes to the Smarter Balanced assessment system as a result of the data you gained, so that next spring we’ll have some different configurations of assessment? Can you highlight some pieces?
Jacqueline King: Sure. Nothing significant in that regard. You know, we did do a pilot test in spring 2013 with about 650,000 students. So that was really our first big tryout of the item types. And we did at that juncture make some adjustments. But really, once we got to the field test, having had that experience in 2013, the focus was much more on test administration and logistics. And we haven’t identified any major changes to items or item types that we’re planning.
We did notice in some of the earlier grades that some of the instructions may not have been written as simply as they should have been for the grade level. So we’ve gone back and taken a look at that. But that’s really the main change.
Pat Forgione: Could you reflect on the interim system? Because that has changed. And maybe let our audience know of some of the changes. I didn’t know if it came out of the field test, but you just announced some changes there.
Jacqueline King: We did. The interim system is very exciting. We’re thrilled about it. It will roll out to schools this winter. And it features two types of assessments. An interim comprehensive assessment, which is designed on the same blueprint, will cover the same material as the end-of-year assessment and will be scored and reported on the same scale as the end-of-year assessment. That test we do intend ultimately to be a computer adaptive test, like our end-of-year summative assessment. But for 14/15 we don’t feel that we’ve got the number of test questions that we need to mount a fully robust computer adaptive assessment for the interim. So in this first introductory year it will be a fixed-form assessment. And then subsequently, in 15/16, it will become computer adaptive.
The other type of assessment is the interim assessment blocks. Those are shorter tests focused around groups of related standards. So I think of them as an end of unit assessment. The kind of thing that the fourth grade teachers might all want to implement in the school when they’ve finished a significant unit say on operations with fractions. That will be a fixed form test also for this year, ultimately with a goal of it moving towards being computer adaptive as well.
Pat Forgione: Laura, any changes to the system of assessment for PARCC coming out of the field test?
Laura Slover: Pat, there are a few changes. Just as Jackie said, the PARCC field test was really an opportunity to test the test. It wasn’t a test of students per se – there’s no student-level data coming out of the test. But really what you take a look at is the items and how they fare, and whether you get enough reliable and valid data from the assessment overall. And so, in addition to the logistical things around scheduling and around test administration manuals, we were able to identify two areas that we could change.
One was the blueprint for the English language arts test. We were able to reduce the number of reading passages. We learned that we didn’t need as many reading passages in order to get reliable data about students. And what we want are quality tests – we don’t want to test a moment more than we need to. So we were able to reduce the length of the test.
And as I mentioned before, we also learned that kids didn’t need quite as much time to complete the items as we had allocated for them. So for both of those reasons we were able to reduce the overall testing time, which is really important. We don’t want to test kids any more than we need to. We think that assessment of course goes hand in hand with good instruction, but we want to make sure that we are getting the most out of that experience as we can.
I will say that I had the opportunity to sit in and watch students in a district in Maryland take the test. And it was really exciting. I got to see firsthand the excitement of the kids who took the tests. I think there’s a slide that shows some of their quotes. The kids generally reacted quite positively to the computer-based environment. They thought it was more fun and more connected to the work they were doing in school and on their own. And I was particularly gratified to hear that they liked the item types. Some of them said things like, “It really makes you think. It’s more engaging than the kinds of tests we’re taking right now.” So that really speaks well to the quality and the conceptual demand of the content.
Pat Forgione: Might you comment – I know the PARCC board just made a decision to take speaking and listening off the table as a requirement, not part of accountability, for this current year. Did that insight come from the pilot, or what brought you to that?
Laura Slover: Thanks for the question. That’s a good reminder that PARCC, like Smarter Balanced, is also developing a number of additional tools: kindergarten and first-grade diagnostic tools, third- through ninth-grade diagnostic tools, and the speaking and listening assessments that you referenced.
The decision was made to not require them of all schools but to incentivize students to take them and teachers to use them because they’re gonna be such great high quality tools. Speaking and listening is part of the standards. And it’s often a part of the standards that goes unassessed. So we want to make sure that teachers have access to these great tools. But it’s not quite ready at this point for it to be included as part of a summative assessment experience.
Pat Forgione: Great.
Bob Rothman: Yeah, I think before we move on to the prepared questions we’ve got a question from a viewer that probably is worth clarifying on that point. Jasmine from here in Washington wants to know what’s included in a state assessment package? What do states get and what’s optional? So I think if we can clarify that for the viewers that would be helpful.
Laura Slover: Sure. Should I start?
Bob Rothman: Go.
Laura Slover: So for PARCC, what states are going to administer this year are the summative assessments. But there are two components of that which are important to unpack a little bit. One is the performance-based assessments. So in English language arts that’s the writing. Students will read several sources and write about those sources, using evidence from those sources to support an argument, make a claim, and provide some examples from text. In mathematics they’ll be asked to do extended problems – multi-step problems where they bring information from one part of the question in, solve multiple steps and watch how the information unfolds, justify their responses, and explain why they think their answer is correct.
Importantly, in the math instance students can get partial credit. So they can work their way through a problem, but if they make a computational error at the end they still get points for what they do know, which is really important.
At the end of the year there is another component, called the end-of-year component, that is selected response and short answer, and the results will come back very quickly. That assesses reading and the priority topics in mathematics. Together those components will comprise the end-of-year score, if you will. And both parts of those really matter. That was important because when the states were developing these assessments, they wanted to make sure that emphasis was placed on the hardest-to-measure skills, like writing and problem solving, because those are the things that tend to get short shrift in typical assessments.
The other components, the diagnostic components are being field tested and piloted this year. So there’s an opportunity for member states to get involved in the piloting of those and the early tools and the speaking and listening tools. So those will be rolled out next year and will be optional for states and districts to use.
Bob Rothman: They’ll be available at the beginning of the year?
Laura Slover: They’ll be available at the beginning of the year. Yeah.
Bob Rothman: And?
Jacqueline King: So in Smarter Balanced there are three components to the Smarter Balanced system. We’ve talked a bit about the summative assessment, the end-of-year assessment. That has two major sections to it. The first is a computer adaptive test. My layperson’s description of computer adaptive testing is the game of “you’re getting hotter, you’re getting colder.” Students start at the beginning of the assessment taking questions at about the middle range of difficulty. If they get a question right, they get a somewhat more difficult question. If they get a question wrong, the next question is a little bit easier. The test really hones in on the student’s level of knowledge and skill, then moves on to the next topic, and the process repeats. That’s the first part of the assessment.
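The “hotter/colder” loop described above can be sketched in a few lines of Python. This is only a toy illustration, not Smarter Balanced’s actual item-selection algorithm (operational adaptive tests select items using item response theory); the function name and the 1–10 difficulty scale are hypothetical.

```python
# Toy sketch of an adaptive-testing loop: harder question after a
# correct answer, easier after a wrong one, with shrinking steps so
# the estimate homes in on the student's level.

def adaptive_estimate(answer_correctly, num_items=10, lo=1.0, hi=10.0):
    """Home in on a student's level on a lo-hi difficulty scale.

    answer_correctly(difficulty) -> bool simulates the student's
    response to a question at the given difficulty.
    """
    difficulty = (lo + hi) / 2   # start at middle difficulty
    step = (hi - lo) / 4         # how far to move after each answer
    for _ in range(num_items):
        if answer_correctly(difficulty):
            difficulty = min(hi, difficulty + step)   # harder next time
        else:
            difficulty = max(lo, difficulty - step)   # easier next time
        step = max(0.5, step / 2)  # shrink the step to home in
    return difficulty

# A simulated student who can handle anything up to difficulty 7:
estimate = adaptive_estimate(lambda d: d <= 7)
print(round(estimate, 1))
```

The shrinking step is what lets the sequence converge near the student’s true level instead of oscillating widely, which is the intuition behind adaptive tests needing far fewer questions than a fixed-form test of comparable precision.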
The second part is the performance task, very much like the performance-based assessment that Laura described. This is a chance for students to really dig in to a particular topic and pull on their skills from an array of areas. So in English language arts they’re gonna be reading critically, they’ll be writing, and they’ll be demonstrating their research skills, all pulling that together around a particular task. Likewise in mathematics they’re going to not just use their mathematics knowledge, but they’re gonna communicate their reasoning. They’re gonna describe how it is that they came to an answer, to demonstrate their conceptual understanding and not just that they’ve memorized procedures.
So all of that comprises the end-of-year summative assessment.
I’ve mentioned the interim assessments already. Those are rolling out to schools this winter.
And then the last leg of the stool is the digital library. This is a tool for teachers – it’s not a test. It is a collection of about fifteen hundred instructional and professional development resources that have been identified and screened by teachers, for teachers. We have had well over 1,000 teachers around the country working on this now for several years – bringing in resources, screening them, making sure that they meet high standards for quality, and then ultimately populating this digital repository. That’s been available as a preview for all of our states this summer. That preview just ended last week, on the thirtieth, and it’s now available to teachers in the states that have subscribed. So that’s a wonderful set of resources for teachers to embed formative assessment in their classroom practice every day and to really elicit information about student learning so that they can customize instruction.
Bob Rothman: Laura, PARCC will have a digital library too?
Laura Slover: Yes, I failed to mention that. PARCC has what's called the Partnership Resource Center, which is very similar to what Jackie described. It's under development, will be in pilot phase this fall and into this spring, and then will be available to all the PARCC states.
Bob Rothman: Thank you. I wanna get back to the field test. One of the issues was around technology. You mentioned that this was the largest online assessment ever administered. What did the field test tell you about schools' capacity to administer online tests: their hardware, their software, and their bandwidth? And how are you accommodating schools that might not have the technological capacity yet? Laura, you want to start?
Laura Slover: Sure. We had a very good experience, as I said, with the platform. Generally the platform functioned well. As Jackie and I both mentioned, there were little quirky issues that came up, some glitches, but they were very manageable and small in scale.
So we learned that, I mean the big headline was very positive experience in terms of the technology. We learned that kids really prefer, really enjoyed the technology based platform and the experience. It’s much more like what they’re used to. They got to highlight. They got to drag and drop. They got to plot numbers on a number line. They really got to take advantage of the technology to do technology enhanced work which is really accelerating their learning and really reflecting what they can do.
I think that states are in different places on where they are in administering computer-based tests. Some have been there for a while; for some it is new. And I think there's a learning curve, and the learning curve primarily has to do with scheduling and how to move students in and out of a computer-based environment for testing. As I said, that's old hat in some cases, and in some cases it's gonna take a little while to figure that out. But the dress rehearsal of the field test was quite useful in that regard.
We offered a lot of modules and technology-based training in advance; I think there are five modules on the PARCC website that help administrators and educators get engaged and know how to use the tools. We think there's an opportunity for those to be accessed more in preparation for this fall and this spring, and I think that would be very helpful.
But we learned that, you know, the system is ready. We had a million kids and we'll have 5 million this year. And the platform works, and works well.
Bob Rothman: Are the schools ready?
Laura Slover: The schools are ready. You asked about bandwidth and hardware. Primarily we learned that the schools that participated in the field test had sufficient bandwidth. We didn't have real bandwidth issues. There were log-on problems and problems with passwords, in-the-moment issues that we were generally able to resolve quite quickly.
In regard to devices, PARCC put out last year, maybe a year and a half ago, device specifications, bandwidth specifications, and guidance about the rule of thumb for how many devices schools should have. The guidance that we put out essentially said that schools should have as many devices as students in their largest tested grade, and that would be enough to provide the access that schools need.
So ideally, in a perfect world, you'd have one device for every kid at every moment. But I think that's an instructional need, not just an assessment need. We want to make sure that our schools are using twenty-first-century technology not just for an assessment experience but in their day-to-day instruction.
Bob Rothman: Mm hmm. Jackie, what did you find?
Jacqueline King: Sure. So I think we found in general that access to devices wasn't typically the most significant issue. One of the benefits of a computer adaptive test is that you don't have a lot of the same security worries, so you can spread out your test administration over a longer period of time. I think schools are still getting used to that freedom and kind of wrapping their heads around that flexibility. But by taking advantage of it, while you can certainly make a good instructional argument for one-to-one computing, it's not necessary for the assessment. In fact, we estimate that a 600-student middle school could assess all of its students with one 30-computer lab if it were to take advantage of the full testing window that we're offering.
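The 600-students-with-one-lab estimate is easy to check with back-of-envelope arithmetic. The student count and lab size come from the transcript; the sessions per day and the multi-week window are assumptions added here just to make the scheduling math concrete:

```python
# Back-of-envelope check of the capacity claim above.
# 600 students and a 30-computer lab are from the transcript;
# sessions_per_school_day is an assumed scheduling number.
import math

students = 600
lab_seats = 30
lab_sessions_needed = math.ceil(students / lab_seats)   # 20 lab sessions

sessions_per_school_day = 2                             # assumption
days_needed = math.ceil(lab_sessions_needed / sessions_per_school_day)
# 10 school days of lab time, comfortably inside a multi-week testing window
```

Even at a conservative two sessions a day, the whole school rotates through the lab in two weeks, which is why the long testing window makes one-to-one devices unnecessary for the assessment.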
So devices didn’t seem to be a big issue. We were concerned, and I think again this is a broader concern, this is the year that Microsoft is going to stop supporting Windows XP. And when we first implemented, together with PARCC, our technology readiness tool we were seeing a lot of schools, as I recall more than half, responding – this is a couple of years ago. That they were running Windows XP. And that’s a broader concern, of course, because when a company stops supporting a software that means they stop supporting a software that means they stopped providing patches for viruses and that sort of thing.
So it's a much broader concern than the consortia, but nonetheless something that we were worried about. We've since seen that schools have really responded to that and recognized the problem, and the proportion using Windows XP has dropped significantly, down to, I believe, less than 20 percent. So that was a kind of whew, good, we're glad to see that.
The biggest issue for schools still seems to be bandwidth. We saw, especially in our five states where they tested all the students, that they've managed that pretty well. But we will still offer a paper-and-pencil option for three years for schools that still need help making that transition. And it does seem like where that's a concern, the biggest issue is bandwidth.
Laura Slover: I should have mentioned that as well: PARCC will offer a pencil-and-paper version of the assessment. In the field test about three-quarters of the students took it online, and about a quarter took the pencil-and-paper version. It looks like it'll probably be about the same in the operational assessment this year, about three-fourths to one-quarter.
Pascal Forgione: Your assessment systems represent a step change in our testing industry with the inclusion of the performance component, really measuring the whole spectrum of the hard-to-measure standards. What did we learn from the pilot, Jackie, about the performance component and the use of it? Were kids engaged, and do you think we can trust those results?
Jacqueline King: I think so, Pat. We really saw the engagement, in particular with the younger students. They were thrilled that somebody wanted to know what they thought and that they got to demonstrate their learning. I think our challenge, to the extent that we have one, is with the high school students, who are taking a lot of assessments in high school and who aren't necessarily interested in doing something just because their teacher told them it's important. That's where the involvement of higher ed becomes so critical. And we're thrilled that we already have several states that have committed to use the Smarter Balanced assessments as an indicator of readiness for credit bearing courses.
This means that when eleventh graders sit for the assessment in these states, they will know that if they're admitted to a public institution in their home state, that institution will place them directly into credit bearing courses. They won't have to take remedial courses, as long as they score at the necessary level and complete any twelfth-grade requirements that those public higher ed systems have established.
So I think over time, as more and more higher ed institutions come online with that, and we see them moving strongly in that direction, which is really fantastic, that's gonna help. But in the near term we do have a communications challenge to make sure that our high school students understand why these assessments are important and why it's important for them to put forth their best effort to demonstrate what they know.
Pascal Forgione: How did your states feel about the addition of the performance component?
Laura Slover: We had great feedback, particularly from kids who, as Jackie said, really enjoyed showing what they knew. They enjoyed being able to actually think and having to go back into a text to find an answer. So it pushed them to do the kinds of things that good teachers are already doing. We did a survey of the kids after they had taken the assessment and asked them a number of things, in particular whether they were confident that they had seen the content before. The majority of them had. I know that one worry, of course, is whether we are getting out too far ahead of where teachers are with their implementation of the standards, and what we learned from the survey, at least, was no: the majority of kids had seen the majority of the content before in their classrooms. The other thing we learned is that kids really liked how the items engaged them and let them show what they know.
I think Jackie’s point about high school is it’s not a new problem. It is the same kinds of motivational issues we’ve had and the shift I think here is that these assessments, both PARCC and Smarter Balanced, are door openers. They’re door openers directly into credit bearing courses at colleges and universities, two and four year. And both consortia had worked very hard with the higher ed communities across the states to accept those scores as one of a set of measures that will indicate readiness for credit bearing courses without the need for additional remediation and without the need to take an additional placement test when a student gets on campus. And so that will trigger incredible opportunity for students and we think that that will help incentivize them to take it seriously and do well. We also think that teachers, because the assessment so closely mirror what good teaching looks like and really are no different than the content in the standards they measure with great fidelity the standards that this will provide real incentives for teachers to adapt and adopt these assessments because it’ll help them make these tight connections between what they’re doing instructionally and what their kids are actually being asked to do on assessments. So they won’t need to do separate test prep. We really think it will eliminate drill and kill test prep. Students will be doing in their classrooms the kind of work that will prepare them for the assessments at the end of the year and therefore, will save teachers time and will make the whole connection instructionally much more relevant and close.
Pascal Forgione: Thank you. Bob.
Bob Rothman: Well, let’s look to the future. Jackie, first with you, the federal funding that I mentioned at the beginning has now ended. So how are you sustaining yourself –
Jacqueline King: Sure. We did ask for and receive a short no-cost extension of our federal grant. It will expire at the end of this year, December 31. That helped us tie up a few loose ends with some of our projects, in particular our work around setting achievement levels on the assessments. But as I mentioned at the top of the show, our states are signing agreements with UCLA. We are becoming an independent program within the UCLA graduate school of education and information studies. So I never thought I'd hear myself say this, but go Bruins.
And that process has already begun, which is really exciting. The staff are becoming staff of UCLA, and we are starting to develop that work with the university. It provides us with a number of benefits, we think. We have access to some really fantastic researchers and graduate students. One of our partners will be CRESST, the National Center for Research on Evaluation, Standards, and Student Testing at UCLA, a fantastic group of people who will be partners with us on a lot of our validation research and psychometric research going forward. We also get to take advantage of all the infrastructure of a large university and don't need to build that ourselves. So a lot of great advantages.
It also allowed our states to procure their membership in Smarter Balanced in a somewhat more streamlined way because UCLA is a public entity. So that process is well under way.
Bob Rothman: Laura, you also have an extension, but how have you organized yourself to sustain your work in the future?
Laura Slover: We have an extension. The consortium has an extension through August 2015 to wrap up additional work: finalizing the important diagnostic tools, the formative tools, and the mid-year and speaking and listening assessments. Those will be in pilot and field test and be completed next summer, to be rolled out next year. So that important work is still being done. And PARCC has a somewhat different model for how it organized itself. All the states are moving together to deliver the assessment on a common platform; each state is building its own contract with the vendor to do that.
But the states have also recently formed a nonprofit called PARCC, Inc., surprise, which I am now employed by. And we are moving forward to make sure that the consortium itself, in terms of the states and the state governance, stays together and builds on this common vision.
The standard setting process that Jackie referred to, which Smarter Balanced is doing right now, is something that PARCC will do next summer as well, after the first full administration of the assessment. So that's another thing that the nonprofit will take the lead on, making sure that states build good teams. And I think we'll talk about this a little more later, but a team of higher ed and K12 leaders from each state will come together and set those performance levels, so that there is a common set of metrics across the states. That's another thing that will really bind the states together and keep them together: this common way of thinking about what performance means across the states.
Bob Rothman: Well, let’s go ahead and talk about that. You have different approaches to standard setting. Why did you choose the approach you did and what do you hope to get out of it?
Laura Slover: I’m not sure if the approaches themselves are different. The timeline’s a little bit different. PARCC is gonna build a process we’ve started already to build out teams in each state of higher ed faculty and leaders and K12 educators and leaders to come together across states. So there’ll be a team within each state and then an umbrella across the states. And they’ll come together and look at the setting of the standards.
Now, setting the standards means identifying where those performance levels should be set. PARCC will have five performance levels. Many states have four; some states have three. The states agreed that they wanted five performance levels to get a more fine-grained look across the full performance continuum, so they would have just as much information about high-achieving kids as low-achieving kids. Because, and as a former teacher I can say this, the more information you have about kids, the better able you are to support them.
So the standard setting process will happen next summer, after the first full administration of the assessment, and all of the data from that will go into this process. Educators will look at actual items from the assessment and make judgments about which items students performing at level one would get right, and likewise at levels two, three, and four. Those performance levels will then be recommended by the panel to the governing board and to the higher ed advisory committee, which together will establish the final cut scores.
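Mechanically, once standard setting has produced cut scores, reporting a level is just a lookup: four cut scores divide the score scale into PARCC's five levels. The numbers below are invented placeholders; the real cuts come out of the process Laura describes, not from this sketch:

```python
from bisect import bisect_right

# Hypothetical cut scores separating five performance levels.
# These values are made up for illustration; PARCC's actual cuts are
# set by the panels, governing board, and higher ed advisory committee.
CUT_SCORES = [700, 725, 750, 786]   # boundaries 1|2, 2|3, 3|4, 4|5

def performance_level(scaled_score):
    """Map a scaled score to a performance level from 1 to 5."""
    return bisect_right(CUT_SCORES, scaled_score) + 1

level = performance_level(735)   # falls between 725 and 750, so level 3
```

A score exactly at a cut lands in the higher level here (`bisect_right`), one of the small policy choices a real reporting system has to pin down.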
Jacqueline King: So our process for setting achievement levels actually kicks off today. Yeah, it's very exciting. We have a slide on this which might be helpful for people, and since it's happening now I'm gonna take a minute to walk you through the process.
Last spring our states approved our achievement level setting design, and we are now in the process of implementing it. We are starting today what we're calling our online panel for achievement level setting. This is something that really nobody has done before; we're breaking some new ground with this one. We will also have the kind of in-person gathering that Laura described. That starts in a week, and there'll be about 500 educators at that gathering. That's pretty traditional for what states have done to set achievement levels.
But take that number of 500 and break it down: we had 22 states participate in our field test, and then English and math, each grade level, higher ed, K12. Very quickly you can end up in a situation where you may have only one third grade math teacher on the panel that is recommending achievement levels for third grade math. We really felt it was important to have more participation, so we came up with this notion of the online panel.
This is a means whereby really anybody who wanted to could sign up to participate in recommending the achievement level score for level three, which is our level for being on grade level and on track to being college and career ready. We had 10,000 people register to participate in this process, which will be going on for the next two weeks. So from your home or office, if you registered to do this, you log on to your computer and walk through some training on how to do this. Then you walk through a set of test questions and make your recommendation about the test question that best differentiates between a student not yet on grade level and on track and a student who would be. And we will aggregate that information up across all these 10,000 individuals, being able to look at what the teachers said, what the higher ed faculty said, what the parents said, what the business leaders said.
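The group-by-group summary Dr. King describes can be pictured in miniature. The groups and scores below are invented, and how Smarter Balanced actually summarizes its panel data is not specified in the transcript; a per-group median is just one plausible way to aggregate:

```python
# Sketch of aggregating online-panel recommendations by respondent group.
# Data and the choice of median are illustrative assumptions only.
from collections import defaultdict
from statistics import median

def summarize_panel(recommendations):
    """recommendations: iterable of (respondent_group, recommended_cut).

    Returns each group's median recommended cut score, the kind of
    breakdown (teachers vs. higher ed vs. parents) fed to the workshop.
    """
    by_group = defaultdict(list)
    for group, score in recommendations:
        by_group[group].append(score)
    return {group: median(scores) for group, scores in by_group.items()}

panel = [("teacher", 62), ("teacher", 58), ("teacher", 60),
         ("higher_ed", 65), ("higher_ed", 67),
         ("parent", 55)]
summary = summarize_panel(panel)
# e.g. teachers' median recommendation is 60, higher ed's is 66
```

Scaled up to 10,000 registrants, a summary like this is what the in-person third grade math panel would see alongside its own deliberations.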
And we’ll bring that information into the workshop. So when that third grade panel is meeting to talk about math, they will know what the hundreds, maybe over 1,000 people who looked at third grade math, what did they have to say? What did their fellow teachers have to say about where that line should be drawn for that achievement level?
So that’s super exciting and it started today and we’re just really interested to start to have the feedback roll in. So that’s launched. I’ve mentioned we’ll have a workshop with about 500 educators participating next week. That kicks off. We’ll then have a committee that will look across the grades to make sure that there’s proper alignment across the grades of the achievement levels. Our technical advisory committee will review it. And then on November 6 our states are gathering for a vote on the achievement levels. And for grade 11 that vote will encompass both K12 and higher education leaders.
And then subsequent to that, there's still a necessity in most states for some kind of formal adoption process, typically by the state board of education. So that will occur this fall and through the winter in each state, as needed, in advance of the first administration of the summative assessment.
Laura Slover: It’s really exciting. I just want to say this is precedent setting. So it’s very exciting to be embarking on this journey.
Jacquelyn King: It is. It is.
Pascal Forgione: One of our listeners, Brooke, sent in a question that builds off of that. How will the test scores from the first operational tests on the Common Core State Standards affect students and schools, knowing that a first effort with something newly developed may be challenging for everyone? So at least we understand it's challenging for everyone. Will there be any leeway, basically? Laura, would you –
Laura Slover: I’ll take a stab and then I’d love to hear Jackie. So as I mentioned, the PARCC states are gonna set the performance level standards next summer. We’re gonna do that as quickly as we can using the data that comes from the first real assessment so it’s real data that kids are – you know that are on the items that we use. Those standards will be set, those performance levels will be set. And they’ll be adopted by the consortium in a manner very similar to what Jackie said. So the policy group, the chiefs, the state superintendents and the higher ed, their higher ed partners will be voting to establish that based on the recommendation of all of these work groups who have done the deep work looking at the data on the items and the items themselves.
Those will be the consortium wide performance levels. And we will establish a level four as the level that means ready for credit bearing with a high likelihood of success. And then those will be back mapped down through the grades.
Your question is very interesting; I guess it was Brooke's question. Each state will determine how it plans to use the information from the test in the first year. PARCC will set the cut scores, establish the performance levels, and then report out on that. But each state has its own policy and its own approach to what "counts," and that will be determined in each state; it is not dictated by the consortium itself.
Pascal Forgione: Jackie?
Jacqueline King: Exactly. It really is a state policy decision. We all know that we're living in this brave new world of waivers and states without waivers; you can't tell the players without a scorecard. So each state is making its own policy decisions about how it will use scores for various types of accountability.
Bob Rothman: We received a number of questions, and we should get your answers to them, about accommodations for students with disabilities and for English language learners. Could you briefly describe the policies you have?
Jacqueline King: Sure. Because these assessments are being offered on computer, we have the ability to make them far more accessible for students than tests have ever been in the past. In Smarter Balanced we've thought about that at three different levels. We have a set of what we call universal tools that are available to any student if and when they need them. For example, something I appreciate as I've gotten older is being able to zoom the text and make it a little bit bigger, because my arms keep getting shorter and shorter. That's something that's available to all students. Using scratch paper, those kinds of things, you can now make available to everybody with no challenge.
Then we have a set of what we call designated supports, and there are a lot of those on the Smarter Balanced assessment; that's probably the biggest category. Those are available to any student, but an adult needs to activate them in the system, because we don't want students playing with so many different tools and bells and whistles that it actually distracts them from the assessment. We want them to have the tools that they need. This is where our features for English language learners come in. Smarter Balanced has translation glossaries in mathematics in 11 languages and multiple dialects.
So if, for example, a student’s first language is Arabic, on the math assessment if it’s a question about finding the ratio of – finding the area of a playground. The word area, because that’s part of the mathematical concept that is being evaluated, would not be translated. But the word playground would be translated. And the student can just mouse over that word and there it is in Arabic for them.
We will also offer for students who are in a full language immersion, dual language immersion kind of program, in Spanish we will have a stacked translation, a full translation. But actually the research has shown, it’s pretty fascinating, that a full translation for most students actually makes the assessment more difficult not less difficult. They’re reading twice as much information. And they can get hung up on the translation as opposed to the actual content of the question.
So with the glossaries, students only access a translation when they need it; otherwise, if they know the word "playground" in English, they keep going along their merry way and answer the question. The research has really shown that that's the optimal approach. And so that's what we'll be making available to most students, with the stacked translation available if needed.
The last set of tools are accommodations for students who have an IEP or a Section 504 plan. Those encompass a number of tools that are embedded in the system, for example videos of human signers for those who need American Sign Language, and refreshable Braille, and then some tools that are standalone, non-embedded. But it's quite extensive. There's detailed information on our website; I encourage viewers who are interested to go to our page for under-represented students –
Bob Rothman: And your website is?
Jacqueline King: www.smarterbalanced.org. Thank you. Take a look at that, because it's quite an extensive list.
Bob Rothman: Laura?
Laura Slover: Well, I don’t want to repeat what Jackie said. That was a great list. I think the most important thing is just to underscore what she said. The computer based environment – two things are really important. One is this universal design set of principles that both consortia have used, which really means that the accommodations are being built in not retrofitted afterwards. So it’s a huge benefit to the field and actually really pushes towards more innovation and more opportunity.
PARCC will also have tools that everyone can use: highlighter, zoom (yeah, I like that function too), changing the background, things that make it easier for anybody. And then there are tools that can be turned on; an adult will have to turn those on based on a prearranged, pre-understood need that the student has. So those are easily accessible; they just need to be activated. But the bigger picture is that we want to make these accessible to all students, so that all students can show what they know in an environment that they're comfortable with, reflecting the learning that they've done in the classroom. Providing access to all kids, ELLs and students with disabilities, is a high priority.
Pascal Forgione: A two-part question with ___ ___. You both alluded to the fact that among your member states some are still in flux and may decide to take a wait-and-see attitude the first year. Do you anticipate more states participating in the consortia in the future? And there is also the question of what resources will be available to non-participating states. Is there a way for them to take advantage of these terrific development activities? Jackie?
Jacqueline King: Those are great questions. We certainly hope that there'll be more states participating in the future; that would be fantastic. We do have four states that are continuing to be active in the consortium, and we hope that those states ultimately will use the assessments. A number of the things that we have done, we've done in open source: all of the software that we have developed is open source software. And I've got another website for you. For the techies in the crowd there is a site called SmarterApp, smarterapp.org. That is our site for developers, for software people, where they can download all of the specifications and even the source code for the tools that we've developed.
So states, or even countries, that aren't participating in Smarter Balanced can take advantage of the software tools that we have developed. We really hope that by putting that out there in open source we will spur innovation and bring new players into the assessment business, because it lowers some of the barriers to entry for smart, technologically savvy people out there who want to invent the next great thing. We hope it'll be a way to really move the whole field of assessment forward and a catalyst for that.
In terms of access to the actual assessments themselves, or to the digital library, which I know is something that a lot of teachers are excited about and want access to: at this point that really has to happen through a state. There are continuing costs to maintaining and growing the digital library. And I should mention that it's not just a site that you go to and pull down resources; it's an online community. Teachers can rate materials. They can enter into dialogue with their colleagues around the country about the materials that they're using. We're gonna be sponsoring forums through the digital library, bringing experts in to provide professional development experiences for teachers digitally.
So the cost of doing all of that does require that it's open only to subscribers. We have had some school districts that don't currently have access to the digital library ask about making arrangements through Smarter Balanced states to get access to it. We'll see how that plays out and develops over time.
Pascal Forgione: Laura.
Laura Slover: Well, we certainly hope that the consortium will grow. That additional states and entities will want to join. And we think that that will happen because of the quality of the tools, the incredible leadership of state educators across K12 and higher ed spreading the word about the good work that they’ve done to develop these tools, and the low costs that can be achieved by banding together for best in class assessment tools. These are tools that will be not just summative but diagnostic, formative, benchmark assessments. And we think that the power of the system will draw new players to the table.
Now one of the things we haven’t talked a lot about but there is a sense right now that there’s too much testing going on. In districts and in states. The power of the consortia developed assessments are one that they are high quality. They’re gonna give data back at a very fine grain level. The kind of data that teachers, as a former teacher I would have loved to have had about my students. About their knowledge. About what they’re learning. So I could support them in their future growth.
The whole system is gonna be aligned. It's gonna be really high quality and provide this data. There'll be fewer but better assessments that provide data at a deeper level, so that teachers can really use it in making decisions about their instruction. All of these tools will be aligned, so the information will be clear and connected. Teachers won't be getting a bunch of things that seem disconnected and perhaps not that useful, but a set of tools that really track students' growth, both from the beginning of the year to the end of the year and from year to year. We think the power of that will be very attractive not just to states but to other entities like districts, charter management groups, and schools that are really trying to do what's right for their kids and provide a trajectory and an on-ramp to success after high school.
Pascal Forgione: Could I just push a little more, regarding items? Obviously, as Bob mentioned in the beginning, the federal government put a very big investment into each of these consortia, and you've developed an incredible item bank. When I was in little Connecticut developing tests for Delaware, if I could get two forms of a test done and keep one in case a mistake happened, I was glad; the modeling you've done for the industry is just wonderful. But could people out there who aren't members, for different reasons, approach you about getting access to the items? Obviously you'd have to protect confidentiality, and you've put a lot of sweat equity into this. Is there any way your boards would think, in the future, that you could develop that access bridge? Because then it won't be us and them; it'll be two consortia that are continuing. And I can't wait for 2.0; imagine what's gonna come in the future. Access to the items is an issue that's been brought up many times in my presentations: how do we get access to this and learn from it? So, Laura, is that gonna come? Is it possible? What are your thoughts on that with the board?
Laura Slover: I think it’s very possible. You know, the Race to the Top assessment grant required both consortia to make available the content, the assessments that we are developing. And we are exploring ways to make that content accessible to a wide group of interested parties. The trick for us in the first year is making sure that we have enough items – and Jackie mentioned this earlier – enough items to fill out the forms the member states are counting on for this year, and at all times to hold those items secure, because the assessment requires secure items.
So in the first year we’re looking to provide access to forms. So interested parties could administer PARCC forms. Either on the same platform that other states are going to use or there’s potential for a different platform. And so I’d be happy to talk to people about that. And then in the future there is the question of whether we can find an opportunity to provide access to items over time.
Now I will say that the Race to the Top item development covered essentially two years of item development, maybe three, and that PARCC states are committed to releasing those items. Because the most important thing we can do is to release into the public domain quality items that can be used in classrooms. So all of the items, by the end of the first three years of the program or so, will be released into the public domain – that’s a release policy the consortium has.
So there’s two issues. One is test security. But then there’s providing access and lifting the hood on the incredible work that’s been done. And I think we can do both of those things over the next few years.
Pat Forgione: – the understanding of that point, that eventually they will be released. Cause that’s a very good point. But obviously you have to have the security.
Laura Slover: Absolutely.
Pat Forgione: Anything to add to that, Jackie?
Jacqueline King: Sure. So we have established a similar policy that would provide interested states with access to the assessments for the same membership fee that we would charge to a member state. They don’t have to pay more, but they wouldn’t pay less than a member state; that only seems fair. Providing access to the item bank only – and pardon us for the testing speak; I am not a testing person, so to me this is an item, but for testing people an item means a test question. So, providing people with access to the questions – I’ll try to use the more universal term – presents some challenges that, really, honestly, nobody has approached us about yet. And so it’s a bridge that we haven’t come to, and so we haven’t crossed it. But with these items, even if they’re kept secure, the more eyeballs that are on them, the quicker you need to retire them as secure items. And so that presents additional cost to the member states. So you’ve gotta figure out – there’s some serious head scratching to do to figure out a policy that would allow folks to have access to questions while not disadvantaging the member states that are continuing to pay for the ongoing research and development.
So it’s certainly not something that we’re opposed to thinking about. And if it comes to pass that there are states that are interested in that and approach us about that, then that’s something that we would consider.
Laura Slover: Can I add one other thing? I think another tension is around – there’s item security, and then there’s tension around comparability. And, you know, if you’re a real testing person, a psychometrician, you would insist on similar test administration procedures, testing environments, testing windows, the same training protocols for scoring, the same rubrics, the same anchor papers. So you can see that the more degrees of separation there are between the process that’s been established and what other states or entities might use, the less likely it is that comparability statements can be made – or at least it’s much more challenging to do that. And so I think that needs to be part of the conversation. Even though the consortium would love to invite other players in, thinking about that last piece of comparability will be really important.
Pat Forgione: And may the perfect not be the enemy of the good. Cause we want you to deliver a good product, and we know that measurement hasn’t always been a fan in letting you do that. But we’ve gotta have better products for our kids. And so thank you for the good work. Bob.
Bob Rothman: Yeah. You’ve described in a lot of ways a very new world in assessment for most schools and students. So this question from Helen from Albuquerque seems relevant. What’s the best way to prepare students for this test, particularly the online testing? Similar resources or how do you prepare students to take a test that’s new for them?
Jacqueline King: So the first thing I would say is teach the standards. With whatever curriculum you are using in your school district, if you are teaching students to read critically, to pull evidence from text, to write persuasively using evidence, to really understand math at a conceptual level, your kids are gonna do well on these assessments. If you’re implementing the standards with fidelity, that’s what you need to do. There’s no special magic drill that you need to put your students through.
That said, there are a number of tools so that your students and you can be familiar with the assessments in advance. We’ve made available for about the last year and a half a practice test. And we’ve refreshed those a couple of times over that period. Those are full assessments in English and math at each grade level that your students can go through. They feature all the accommodations and other tools that we’ve been describing. So that’s a great opportunity for your students and you to become very familiar with what the assessments look like.
We also have, if you’re shorter on time or you’re getting close to test administration time, something called a training test. Those are by band: there’s one each for elementary, middle, and high school. They’re much shorter, and the purpose of those is to quickly familiarize students with the software platform and the tools that are available to them – the kinds of items they’re gonna use, the things they’re gonna do on the assessment. So there’s two options there that teachers can take advantage of.
Bob Rothman: Mm hmm. Laura, how would you respond to Helen?
Laura Slover: Well, I would say that PARCC has developed and made available many of the same tools. Several years ago we released sample items that just showed the PARCC understanding of the standards – how they might be assessed, the kinds of activities and questions that students might be asked. And those continue to be very helpful in kind of shedding light on the kinds of experience students will have. Sample tests: there are practice tests on the website now, and I’m happy to say we’ll be adding additional practice tests later this fall for students, for teachers who want to get deeply into the content, and for parents to go and see what kinds of questions are gonna be asked of their students.
And there are a number of professional development modules, training modules, etcetera, for test administrators, educators and any parent could go on and look at that.
I just want to finish where Jackie started, which is: these assessments are about the standards. So for parents and teachers who have familiarized themselves with the standards and can prepare their kids for that, there’ll be no surprises. This is the most transparent assessment effort ever – unprecedented, in fact, in that every blueprint document, task model, and sample item – everything is on the website, in the eye of the public. So people can really get prepared and get ready and know exactly what’s coming.
I do want to say, again, that therefore, there’s no need to have some artificial kind of test prep going on in classrooms. And district administrators and school leaders should take note of that and make sure that what their teachers are doing is engaging students in meaningful content and teaching the standards. Through whatever curriculum they’re using, the best preparation is good instruction.
Jacqueline King: Absolutely.
Bob Rothman: We have a lot of other questions, but we’re starting to run down on time. So I wanted to give you both an opportunity to say anything that you haven’t said so far that you would like our viewers to hear, either about your activities for the coming year or anything you want to add. So, Laura, you want to start?
Laura Slover: Well, there are two really exciting things happening this year. First of all, the administration and rollout of the actual assessment. Building on a successful field test last year and going into the 2014–15 school year – little-known fact – PARCC will actually be rolled out in December for a small sample of kids who will take the block test this fall. And then the main administration will be in the spring, where we have about 5 million kids taking it. So it’s a big year and a very important year in the assessment, followed by the standard-setting work next summer, where we really engage the higher ed community together with the K–12 educators to set those performance levels.
At the same time, it’s the year of the instructional tools that will be rolled out: the diagnostic assessments; the kindergarten and first-grade formative tools; the midyear and speaking and listening tests. Those will all be piloted and field tested in states. There’s a great opportunity to get teachers and classrooms engaged in the field tests of those diagnostic assessments. This is a huge opportunity. All of those items and tools, which will ultimately be computer adaptive, will be tested out this year. And, again, in this effort to create good-quality aligned tools that can help districts and schools get rid of a lot of the assessments that they’re currently giving and kinda go deep on meaningful tools that provide good information, I think it’s a great opportunity. It’s also a really good opportunity to get educators involved in the craft of building assessments. So I would welcome people to get involved and call me directly if they want to become part of that piloting and field test work.
Bob Rothman: Mm hmm. Jackie.
Jacqueline King: Sure. So it’s a huge year for us. We are launching our partnership with UCLA. We are this month having the achievement level setting that I described, which we’re very excited about. And then we’re rolling out the full assessment system: the digital library, already available to teachers – and we’re getting some great feedback on that; the interim assessments; and then the summative assessment coming this spring. So a lot happening. Very, very exciting. Continuing to involve a lot of educators in our member states. I’m personally very excited to see the rollout of the use of this assessment in higher ed. In two states I neglected to mention earlier, Washington and West Virginia, the public higher ed institutions will use the eleventh-grade assessment from the administration this spring. We are hopeful that a number of other state public higher ed systems will join them this year. So really a lot of excitement going on. We’ll be launching a lot of our validation research and continuing that this year. So there’s a lot of good work happening and it’s exciting to see.
Bob Rothman: Anything else you wanted to add that hasn’t –
Laura Slover: Thank you for your support.
Jacquelyn King: Thank you for this opportunity.
Pat Forgione: I thought it might be worth noting that the litmus test of what’s ahead of us is gonna be with those teachers and educators in the classroom who want to make a difference for our kids. Someone from California – I didn’t know there was a Carson, California, but there is – Susan, asked: can teachers get training to write their own specific assessments and lessons? You’ve talked about that a little bit today, but I think that audience really wants to feel comfortable that these lessons and assessments can mirror testing in the style and form of each of your consortia.
Jacqueline King: I would really encourage you – it was Susan?
Pat Forgione: Susan.
Jacqueline King: Susan, I would really encourage you to make sure you’ve got your credentials for the digital library and check it out. Because it really – it includes sample instructional materials that describe how you can embed assessment in what you’re doing every day. It can be writing test questions for quizzes and things like that. It can also be the way you do classroom conversation and dialogue with students, the way you elicit information from students in the process of teaching, and listening to the dialogue that students have with each other and being able to really plumb that for good information. So there are lots of different ways for teachers to use the formative assessment process.
The other thing that’s in the digital library is, and we’re building more of this and adding new information weekly, is professional development resources on how to take advantage of assessment results and how to really use those in the classroom. I think one of our challenges and one of the reasons people feel like there’s too much testing is they get these results back but they don’t know what to do with the information. So we’re really hoping through the digital library to provide teachers with a lotta support, so when they get the Smarter Balanced results back, they really understand how to take that information and put it to work in their classrooms.
Pat Forgione: Laura?
Laura Slover: That’s critical, and assessment literacy – how to use results – is something that the PARCC states are looking closely at. In regards to how teachers can use the PARCC tools to inform their instruction, there are a number of things that PARCC has developed, and they’re on the website. Like evidence statements: what is the quality performance that we’re looking for in students? Task models: well, if we want to get this performance, what kind of tasks do we need to put in front of students so they can really show what they can do? And then the sample items and the practice tests can be good models.
You know, one of my visions as a former high school teacher is to get educators deeply engaged in the development of assessment items in general, both for use in their classrooms but also for use in some of these tools that are shared across states. So we are launching a pilot to put educators in the driver’s seat on developing items, and I’ll be able to talk more about that in the coming months. But we’re looking for educators who, using a set of resources and tools – guideposts, if you will – become the creative innovators of quality items and submit those to be vetted by their peers and to be included in the future world of the assessments.
So more on that to come. But it’s very promising and really exciting as a former educator to be able to pursue that.
Pat Forgione: And I know both of you care so much about the teacher and the parent and student. And obviously I thought it was worth punctuating that, Bob.
Bob Rothman: Mm hmm. That’s a good point. One other question about sort of the relationship between the consortia from Cheryl from California. Are the two consortia planning to do comparability studies between students’ scores to determine if a score would be comparable in a state that uses the other consortia assessment?
Jacqueline King: I think it’s safe to say we have really focused on that measure of grade-level performance. When a student is on grade level and on track to being college ready in a Smarter Balanced state, how does that look in comparison to them being similarly defined in a PARCC state? And making sure that we’ve got good, strong comparability at that level. That’s what’s really gonna be crucial for policymakers as they want to look at test results.
Laura Slover: Yes. And I’d say particularly around the college and career piece – at every grade level, but particularly as students go off into institutions of higher ed, we want to be able to say with great confidence, whether a student goes through a Smarter Balanced assessment state or a PARCC assessment state, that the information that higher education – colleges and universities – is getting is comparable in terms of its reliability and validity for getting kids into credit-bearing courses without the need for remediation. Which is of course the goal of what we’re here to do in the first place.
Bob Rothman: Great. Well, we really could continue on this all day, but unfortunately our time on the set is up. There were many questions that were asked that we did not get to. For that reason we’ll keep you posted on how we will work to get more questions answered and make other materials available on our website, all4ed.org.
I want to thank the William and Flora Hewlett Foundation for making this webinar possible, and thank especially our panelists today, Laura and Jackie. And I want to thank Pat and Nancy Doorey from the K–12 Center for their partnership on this and our previous webinars. We really appreciate our work together.
Thanks very much for joining us today and giving us the opportunity to dig deeply into this matter. And one final thing: if you’re a member of the Alliance’s Action Academy – and if you are, you know what that is – the code word for today’s webinar is “green.” You can enter this code word into a box on this webinar page for extra points. If you’re not a member of the Action Academy, you can learn more and join by visiting our website, all4ed.org, and clicking on “Take Action.”
For the Alliance for Excellent Education, thank you very much for joining us.
[End of Audio]