State ESSA Plans: Gift or Empty Stocking for Nation’s Students?
Chad Aldeman, Principal, Bellwether Education Partners
Phillip Lovell, Vice President of Policy Development and Government Relations, Alliance for Excellent Education
Nikki McKinney, Director of Policy Development and Federal Government Relations, Alliance for Excellent Education
Andrew Ujifusa, Assistant Editor, Education Week
On December 13, 2017, the Alliance for Excellent Education held a webinar exploring trends in state plans to implement ESSA. It has been nearly two years since the Every Student Succeeds Act (ESSA) was signed into law by President Barack Obama, who called the measure “a Christmas miracle.” Are ESSA state plans a gift for the nation’s students, or do they represent an empty stocking?
Thus far, sixteen out of the initial seventeen ESSA state plans have been approved by the U.S. Department of Education (ED). The remaining plans have been submitted and will likely be approved by January 2018.
Which state policies are innovative? What policies are concerning? How are the U.S. Congress and ED responding to state plans? This webinar dove into some of the details of policies that states are putting in place regarding school letter grades, opting out of assessments, identifying low-performing schools, and more.
Please direct questions concerning the webinar to email@example.com. If you are unable to watch the webinar live, an archived version will be available at https://all4ed.org/webinars 1–2 business days after the event airs.
The Alliance for Excellent Education is a Washington, DC–based national policy, practice, and advocacy organization dedicated to ensuring that all students, particularly those traditionally underserved, graduate from high school ready for success in college, work, and citizenship.
If you are interested in renting the Alliance’s facilities for your next meeting or webinar, please visit our facilities page to learn more.
Photo by Allison Shelley/The Verbatim Agency for American Education: Images of Teachers and Students in Action
Nikki McKinney: Good afternoon. My name is Nikki McKinney and I’m the Director of Policy Development and Federal Government Relations here at the Alliance for Excellent Education. All4Ed is a national policy, practice, and advocacy organization dedicated to ensuring that all students – particularly those who are historically underserved – graduate from high school ready for success in college, a career, and citizenship.
Two years and three days ago, President Obama signed the Every Student Succeeds Act, or ESSA, into law. Described by President Obama as a Christmas miracle, ESSA is a bipartisan reauthorization of the Elementary and Secondary Education Act of 1965, and is seen as a dramatic but long overdue pendulum swing from the prescriptive construct of No Child Left Behind to a system based on state and local flexibility but within federal guardrails for equity.
The first step to implementing ESSA – the development and approval of state ESSA plans – is well under way. Sixteen states and the District of Columbia submitted their state plans last spring. And all but one of those have been approved.
All4Ed has developed dashboards that rate these state plans on key equity policies. You can find dashboards for Arizona, Connecticut, Colorado, Delaware, D.C., Illinois, Louisiana, Maine, Massachusetts, New Jersey, New Mexico, Nevada, North Dakota, Oregon, Tennessee, and Vermont at the link below.
The remaining states and territories submitted plans in September. And the Department has 120 days to approve them. So, what has the process to develop state plans yielded thus far? Are states leveraging the flexibility within ESSA to advance equity?
Or are they submitting plans just to address the bare minimum of federal requirements? We’ll examine those questions and many others during today’s webinar. But before we get to that, a few housekeeping items.
You can follow today’s conversation and ask questions on Twitter using the hashtag #ESSA. You can also submit questions for our panelists, using the box below. This webinar will be archived for later viewing at www.all4ed.org/webinars.
Now, let me introduce our guests for today. To my right is Phillip Lovell. Phillip is the Vice President of Policy and Government Relations for the Alliance for Excellent Education. Phillip leads All4Ed’s policy and advocacy efforts and formerly worked for organizations such as the National Crime Prevention Council and America’s Promise Alliance on issues ranging from juvenile justice to homelessness. Welcome, Phillip.
Phillip Lovell: Thanks much.
Nikki McKinney: To Phillip’s right is Andrew Ujifusa. Andrew is an Assistant Editor for Education Week. He covers education policy at the federal and state levels and writes for the Politics K-12 blog. So glad you could join us today, Andrew.
Andrew Ujifusa: Thank you for having me.
Nikki McKinney: And next to Andrew is Chad Aldeman. Chad is a Principal at Bellwether Education Partners. Previously, Chad was a Policy Advisor in the Office of Planning, Evaluation and Policy Development at the U.S. Department of Education, where he worked on elementary and secondary education act waivers, teacher preparation, and the Teacher Incentive Fund.
Thank you for being with us here today, Chad.
Chad Aldeman: Glad to be here.
Nikki McKinney: We are thrilled to have all of you – a very distinguished panel. But since Phillip is my boss, we’re going to start with you.
Phillip Lovell: Lucky me.
Nikki McKinney: The title of today’s webinar is “State ESSA Plans: Gift or Empty Stocking for Nation’s Students?” As we mentioned earlier, All4Ed is developing dashboards on each state plan. So in a nutshell, what’s the answer: gift or empty stocking?
Phillip Lovell: It’s an excellent question. And I think it’s kind of like if you have a really great gift card in your stocking. But then you look under the tree, and you don’t have very much there. So, it’s somewhat of a mixed bag.
ESSA state plans cover so much – from things that are relatively straightforward, like goals for the percentage of students that are supposed to be proficient in math and reading, to things that are really wonky, like n-size, the number of kids that have to be present in order to trigger the law’s accountability and reporting requirements.
So across all these policies, there were many opportunities for nice gifts under the tree and many opportunities for empty stockings. And we saw them both.
Nikki McKinney: Chad, you’re similarly policy-wonkish to my boss, Phillip. Bellwether Education Partners and the fine folks at the Collaborative for Student Success convened an independent peer review process to review all the ESSA state plans.
The peer review was bipartisan and included folks from the Bush Administration and the Trump Administration – at least, the transition team – as well as former Obama Administration officials.
From that peer review process, what’s your take: Christmas gift or empty stocking?
Chad Aldeman: I’m going to go with Phillip on the empty stocking metaphor. I think it works pretty well here. It is a gift to states. They can figure out what to do with it. As Phillip said, it’s sort of like a gift card that states can decide what to do with.
But it’s also somewhat empty at this point. So, we looked and found that a lot of the state plans still are incomplete. They’re not always finalized. There are missing components about how the indicators are calculated or how the indicators will roll up into overall ratings.
And so, I like that metaphor – it’s a vision, and you can take what you want from it. And we also know that states will go forward once their plans are approved and make lots of decisions about how things are implemented.
And we’re still going to see those types of things coming forward at some point.
Nikki McKinney: Okay. Andrew, we know that there are the policy wonks, like our friends, Phillip and Chad. But there’s the court of public opinion. There has been a lot of stakeholder engagement done by states. We know that there are parents and teachers and other outside organizations who are looking at these plans.
So in the court of public opinion, what’s the overall feeling about state plans? Is the public aware of what’s going on? Are they expressing any strong feelings about plans? Or is this just seen as another “check the box” bureaucratic exercise?
Andrew Ujifusa: So, I wish we could say that there was detailed public opinion polling on each of these state plans. I would spend so much time on those polls. But unfortunately, we’re not there. We don’t have that kind of polling.
I think it’s too early to say just how the public feels. I think a lot of people in a lot of states haven’t gotten a good grasp of what’s in the plans. Oftentimes, as Phillip said, they’re very wonky – deal with very technical issues.
There was an interesting poll that the Collaborative for Student Success put out last spring, I believe, where they surveyed teachers and the general public about ESSA itself, not any specific ESSA plan.
And the poll found that a plurality of the public – 43 percent – said that they viewed ESSA as, at least, a good opportunity to get things done in schools that would help students. I think a plurality of teachers sort of took the other view that they were maybe seeing this as just another federal initiative that maybe wouldn’t help their work very much.
I think that’s also a function of the fact that the ESSA plans don’t address some very controversial things that people are used to reading in the news, like teacher evaluation. They don’t really deal with that hot button issue.
So, I think it’s too early to say, in many cases, just how the public feels about most of these plans.
Nikki McKinney: Okay. Now, Chad and Phillip – and probably even you, Andrew – you are among the few people who have read all 50 state plans, plus the District and the Territories.
Phillip Lovell: We know how to have a good time.
Nikki McKinney: Yeah, you do.
Chad Aldeman: Party on.
Nikki McKinney: So from a positive standpoint, what’s the strongest element that you’ve seen coming out of the state plans? And are there any exceptions to that?
Phillip Lovell: Sure, so – and actually, we have a couple slides here. One of the strongest elements of the state plans were the goals that were set. So in the dashboards that you were talking about, we gave states a green on their goals if they had a goal of having 75 percent or more of students proficient by 2030, and a yellow for slightly less than that.
So, and as we can see, among the 16 approved state plans, all of the states got a green or a yellow, so fairly strong. I would say a caveat to this is that it’s important for goals to actually matter in the accountability system. And by and large, they don’t.
But there are exceptions – in Illinois and Nevada, for example, their interim goals matter in the system. We think that’s important. But the goals themselves – by and large, states set fairly high goals. In addition, one of the concerns going into ESSA implementation was that the flexibility provided in the law would be used to water down our focus on academics.
And by and large, we didn’t see that. We gave states a green if their indicators were 75 percent focused on academics or more. And as you can see, 15 out of the 16 states with approved plans had a strong focus on academics.
So, those were the two, I think, you know, strong elements of the state plans.
Chad Aldeman: Yeah, I would mostly agree with that. I think one of the biggest positive trends over the last few years in the accountability space has been the expansion of how we define what it means to be a quality school.
And so, more states are including more robust measures beyond just the NCLB era reading and math proficiency. So in the first 17, all 17 used at least some measure of individual student growth. Even among all 51, we saw 47 that had some measure of student growth that they’re building into their plans.
We’re seeing more states include things like science and social studies. Some states have student engagement surveys, or they’ve been able to include physical education or art – other things that they’ve found high quality ways to measure that.
And generally, our peers found those things to be positive. To Phillip’s point, they’re still – for the most part, states are still putting a strong emphasis on academics. And there’s a balance there.
We don’t know exactly what that right balance is. It’s more of an art than a science. But I think our peers were generally encouraged to see both of those trends continue at the same time. A couple exceptions, I would say – there’s a couple states that do have some lower weighting.
North Dakota, for their high schools, is only going to give 51 percent of the school’s weighting based on academics. Maryland is only going to give 40 percent, I believe – maybe 45 percent – of their elementary school weighting to elementary reading and math and growth.
And so, that’s a concern that those aren’t going to be a significant weight, particularly if they’re going to be outweighed by other things that maybe aren’t as linked to long-term student outcomes.
Nikki McKinney: So, we’ve covered what some states have done really well and what, across the board, states are doing pretty well. Conversely, where do we think some states failed to live up to the promise of ESSA?
What are some – is there a weakness you saw across the 50 states, as you reviewed them?
Phillip Lovell: So, definitely. I mean, I think that’s one of the biggest challenges in the state plans. Generally, it’s around equity, and specifically two areas of policy. One, where states have letter grades, do student subgroups actually count within those letter grades?
And two, how they’re defining consistently underperforming subgroups of students for targeted intervention. We recently did a blog post on the consistently underperforming issue.
And I know we’ll go into more detail on this later in the webinar. But we think it’s really important that, if we’re going to have letter grades, they should be reflective of all kids. And that’s not the case in many states.
And that said, there are a few exceptions. Actually, thankfully, there are several exceptions. So for example, in Tennessee, 40 percent of their letter grade is based on subgroup performance. And in Louisiana, you can’t have an A as a school if you have a subgroup of students that has been identified for targeted support.
So, there are some states that have, I think, important safeguards around these policies. But by and large, the equity issues, which we already know are the most important, are also some of the weakest in state plans.
Chad Aldeman: Yeah, we would agree with that. So, our peers didn’t give – we had a one through five rating scale for all of our nine different categories. And our peers didn’t give a five to any state for the equity issue – the category that we called “All Students”.
So, we can dig in more on that later. The one thing that I would add to Phillip’s point: we had a category looking at how states were supporting low-performing schools. And generally, those plans tend to be fairly vague and non-specific.
We’re coming off an era where No Child Left Behind was very specific and had a one size fits all model. That didn’t work so well for lots of places. The Obama Administration tried a more intensive effort called the School Improvement grants.
That had some downsides, as well. So, ESSA gives states a lot more flexibility about how they design school turnaround interventions. And there’s still a lot of uncertainty in the plan. So, we noted that in ESSA, there’s something called a 7-percent set-aside that all states are required to take for school improvement activities.
And states can choose how they want to award those funds, whether by formula or by competition. The federal template did not ask states what they were doing with that money.
It’s about a billion dollars, nationally. About 12 of the 34 states had something in their plans about what they’re going to do. That means the other 22 did not. There’s another set-aside – an optional set-aside for states called Direct Student Services – to provide students in low-performing schools a direct service: things like tutoring or transportation.
And none of the 34 states said that they were going to take that up. And only one state even mentioned it. So, there’s a lot of – to the earlier point about the empty stocking, there’s a lot of still details to be filled out about what the school improvement plans actually look like.
Phillip Lovell: And I would just add to that I think that the Direct Student Services set-aside is, to me, the biggest missed opportunity in the implementation thus far. At least in the first round, I think that Louisiana and New Mexico had said that they were planning to use the 3-percent set-aside.
It can be used for things like access to advanced coursework. And as you had said, Chad, one of the other bright spots in the ESSA plans is that more states are using more than just proficiency in math and reading.
They’re also now looking at college and career readiness. Well, students need access to that coursework. And we know that not every student has access to the coursework. That 3-percent set-aside can be used to fill the gaps.
But very few states are actually planning to use that 3-percent set-aside.
Chad Aldeman: Yep. I forgot to give my exception, if that’s okay to add those in. So, New Mexico was one of the plans where our peers found –
Phillip Lovell: Brilliant thinking.
Chad Aldeman: Yeah, exactly. The other ones I would give would be – in this round – New York, Indiana, Rhode Island all had fairly strong plans. And they had a mixture – those three states all talked about how they were going to use their school improvement funds.
They all had different ways to provide direct feedback to schools. New York has sort of an inspection-style school quality review process: trained reviewers go out from the state, use a very short, standardized seven-page template, and then give feedback to schools to help them draft an improvement plan.
Rhode Island has what they call a hub of evidence-based practices that then they can point to schools and districts. So, “Here are strong interventions that we recommend.” And states can – or schools and districts can then pick from those lists.
Nikki McKinney: Excellent. Andrew, I’m going to come back to you.
Andrew Ujifusa: Okay.
Nikki McKinney: ESSA is a federal statute. And many of the architects of ESSA are still sitting members of Congress. I think we actually have a clip of Senator Murray at a hearing a few weeks ago, expressing her views of the Department’s oversight of ESSA plans. Let’s roll that clip.
[Video played, 0:17:18 to 0:18:02]
Senator Murray: You and I worked together on Every Student Succeeds Act. We reached an agreement that gives states flexibility while including some clear requirements for states in the statute. The requirements are in black and white.
They’re in the law and have nothing to do with regulations. And I am deeply troubled that violations of the law are being ignored by the Department of Education. I want to give you an example. The law requires in statute that states identify three distinct categories of schools for improvement: bottom five percent of schools, all schools where one subgroup of students is consistently underperforming, and schools where any subgroup is performing as poorly as the bottom five percent.
But plans are now being approved that violate this.
Nikki McKinney: So, Andrew, what are other members of Congress saying about state plans, and in particular, about the Department’s review of state plans? Do they share Senator Murray’s viewpoint?
Andrew Ujifusa: Yeah, so this is kind of a multipart answer. I think Murray there was speaking for a lot of Democrats, including Bobby Scott, who’s the top Democrat for education in the House, as well as civil rights groups around the country and similar organizations.
I think that was a culmination of frustration that Murray feels because she and others believe that Betsy DeVos, the Education Secretary, has been a little too quick to waive some state plans through the approval process.
Now in that clip, you might’ve noticed that she didn’t call out a particular state. But she did leave sort of some clues as to where she thought there were big problems in ESSA plans, including for consistently underperforming students and how that group is defined and handled.
We sort of went back and tried to reverse engineer her question and asked, “Well now, what state could she be talking about?” And we thought that Delaware might be an example where she felt that the plan was approved, you know, too quickly without more oversight of that.
In Delaware’s plan, there’s sort of a duplication of the definition of what is a consistently underperforming group of students. Now, I think it’s – she hasn’t gone so far as to say, “This state’s plan is a problem and shouldn’t have been approved,” and so on and so forth.
But I think Murray, Scott, and a lot of groups in states are feeling frustrated with how DeVos is handling this. We should also mention John Kline, who is no longer in Congress but was an ESSA architect, former Republican Chairman of the House Ed Committee.
He actually wrote to DeVos, saying essentially that he’s concerned that at least a few state plans aren’t requiring that the same statewide tests be given to all students. And this was something where I think all of ESSA’s authors agree that, you know, you want all the kids taking the same statewide – and, ideally, high quality – test.
And then, it’s up to the states in many cases how to use the results from that. So, Kline was concerned that if you don’t have all the kids taking the same test, you can’t really judge results and use the results in a uniform way to address, you know, disparities or problems in schools.
And then, there’s Lamar Alexander, who I think is taking a very different approach. He is not keen on aggressive federal oversight. He has made this point repeatedly. In fact, there was a notable exchange over the summer involving one of Betsy DeVos’s deputies, Jason Botel.
Delaware’s plan has sort of been strangely central to a lot of these ESSA debates, I think. Botel wrote to Delaware, identifying various problems he saw with their plan. And Alexander did not like that. Alexander is the Chairman of the Senate Education Committee, a Republican, very powerful, and a former Secretary of Education himself.
And he basically told Botel pretty directly that he didn’t think that Botel had read the law carefully and that he should read it again. And that he did not want the Federal Department of Education mucking around in states’ plans that way and essentially telling them the problems with their plans.
And since then, I think DeVos’s folks have taken a much more hands-off approach, in terms of feedback. So, ESSA’s authors, as you say, have sort of taken different approaches to DeVos’s oversight of ESSA plans.
Nikki McKinney: That has to make everybody’s life easier, doesn’t it? Andrew, I’m going to stay with you. And, Chad, I’d also like you to think about your response to this question. One of the newest aspects of ESSA is the school quality and student success indicator.
How have states tackled this in their ESSA plans?
Andrew Ujifusa: Sure. So, as you said, this is sort of a new element that people were addressing in their ESSA plans. Basically, ESSA’s authors designed it to be something that didn’t have to do with test scores or graduation rates.
They wanted states to use it as an opportunity to look at non-test-score things that they thought were important for the education of their students. As you can see, the majority of states picked chronic absenteeism and attendance as at least one indicator of school quality and student success.
And I think there are a few reasons for that. There’s a generally agreed upon definition. I think it’s missing at least 10 percent of school days, about 18 days. It’s relatively easy to track compared to other things they might be interested in measuring.
And also – I don’t know how much credit you want to give him, but for the one year he was Education Secretary, John King talked about it a lot. Now, I don’t know how much influence he had when states were putting together their plans.
But he definitely raised the profile of the issue. But as you can see, you know, states also picked a variety of other things. And even within the college and career readiness definition, there’s a lot of variety, as I think Chad has noted, in terms of, you know, what that means state by state.
And that variety might be a good thing.
Nikki McKinney: Chad, your thoughts?
Chad Aldeman: Yeah, so the law has some rules on what this means. So, every state has to have at least one school quality or student success indicator. And the law says that they have to be statewide and comparable.
They have to be valid and reliable. And they have to be able to be disaggregated across subgroups. So, that is tricky for some indicators that some states might’ve wanted to use. All of those criteria have made it difficult – in a good way, I think – for most states to incorporate new measures.
But I think our peers – and I talked about this earlier – we had 45 experts look at the state plans. And they were optimistic. They liked seeing some of these new ideas, particularly when they’re done in a high quality way – figuring out how to measure them well, in a way that’s fair for schools, and then slowly incorporating them into how we view school success.
I’ll mention a couple things. One is I’m a little worried about how we measure chronic absenteeism. We’re setting thresholds – in states, it’s measured either as a number of days missed or a percentage of days missed.
And those thresholds could become gamed, if you will – getting students right underneath that threshold. So far, the research says it matters for kids to be in school.
And as a binary, it’s better for kids not to be chronically absent than to be chronically absent. But the measure isn’t that fine grained – if you miss nine percent of the year vs. ten percent of the year, that’s not a huge difference.
It’s a little more nuanced than that. And I’m worried that states are setting up some strange incentives there.
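The threshold effect described here can be sketched with a toy calculation. This is an illustrative sketch only, assuming a 180-day school year and the 10-percent definition mentioned earlier; actual state rules vary.

```python
# Illustrative sketch of a binary chronic-absenteeism flag (assumed
# 180-day school year and 10-percent threshold; states differ).
SCHOOL_DAYS = 180
THRESHOLD = 0.10  # 10% of days missed, about 18 days

def chronically_absent(days_missed: int) -> bool:
    """Return True if the student crosses the chronic-absence line."""
    return days_missed / SCHOOL_DAYS >= THRESHOLD

# Two students with nearly identical attendance land on opposite
# sides of the line -- the incentive problem described above.
print(chronically_absent(17))  # 9.4% of days missed -> False
print(chronically_absent(18))  # 10.0% of days missed -> True
```

A school nudged by the rating system has a strong reason to care about the 18th absence and much less about the 19th, even though the difference for the student is negligible.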
Andrew Ujifusa: Could I also add I think that before the state plans started rolling in, there was a lot of speculation that at least a few states might be interested in measuring social emotional learning in some way. It’s been talked about a lot in the last few years.
It’s sort of a trendy thing for education policy wonks to discuss. But I also think in the SEL community, if you will, there was some fear that by measuring it, particularly for accountability purposes, it would create problems for SEL – that it’s not designed to be measured that way for accountability.
And as it turns out, I don’t think any state chose SEL for the school quality measure. That doesn’t mean that states aren’t exploring issues or using social emotional learning in schools. You just won’t see it in ESSA plans.
Nikki McKinney: Excellent.
Phillip Lovell: Yeah, I think a few states said that they were going to explore it –
Andrew Ujifusa: Yes.
Phillip Lovell: – and see if they can incorporate it, as the measures get refined and perfected.
Andrew Ujifusa: Yes.
Phillip Lovell: But by and large, I think there was some restraint in making sure that their indicators are high quality and ready for accountability. Yeah, so I totally agree.
Andrew Ujifusa: They’re not there yet.
Chad Aldeman: Right.
Phillip Lovell: Yeah, they’re not there yet.
Nikki McKinney: I want to piggyback and pick up on something that Andrew said in his previous response about the school quality and student success indicator. And that is that a number of states are using this college and career readiness measure.
How are states proposing to do this in their plans? What does this college and career readiness indicator look like, generally?
Phillip Lovell: So, you know, by and large, there are different ways in which it’s measured. But I think most states are using similar ingredients in their cake, right. So, everyone has, you know, butter or sugar and other things that are yummy.
But they’re making different types of cake with their college and career readiness measure. And that’s important because this isn’t like graduation rates, where we have something that you can compare across states.
But it is good that we’re moving in this direction. So, states are using things like the percentage of kids that are participating in Advanced Placement or getting a three or above on an AP test, a four or above on an International Baccalaureate test, dual enrollment, performance on the ACT and SAT.
We saw a lot of states focusing in on career readiness. And so, including things like the percentage of students that are receiving industry recognized credentials. I’ll note a few states that I think did some innovative things.
So, South Carolina will be reporting – or at least, they proposed that they will report the percent of students that are college ready, that are career ready, that are college or career ready, and that are college and career ready.
Right, so you have to – well, if you’re watching this at home, you can scroll back to see what I said. Right, so college ready, career ready, college or career ready, and college and career ready. And maybe this is just interesting to the wonks among us.
But I think this is actually really important, as we know that just getting a diploma is not enough. South Carolina is going to be really looking at the information and making that publicly available, just, “How ready are our students? And in what areas are they ready?”
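South Carolina’s four reported categories amount to simple Boolean logic over two flags. A hypothetical sketch (the function name and structure here are illustrative, not South Carolina’s actual reporting code):

```python
# Hypothetical model of South Carolina's four readiness categories
# as set logic over two per-student flags.
def readiness_categories(college_ready: bool, career_ready: bool) -> dict:
    return {
        "college ready": college_ready,
        "career ready": career_ready,
        "college or career ready": college_ready or career_ready,
        "college and career ready": college_ready and career_ready,
    }

# A student who is career ready but not college ready counts in the
# "or" category but not the "and" category.
student = readiness_categories(college_ready=False, career_ready=True)
print(student["college or career ready"])   # True
print(student["college and career ready"])  # False
```

Reporting all four numbers separately is what makes the distinction visible: the "or" rate will always look rosier than the "and" rate.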
So, I think that’s really good. In Louisiana, they have a strength of diploma index. So again, looking at more than just whether our kids are getting a diploma. They’re giving 160 points if you have an associate’s degree, 150 points if you’re getting a 3 or higher on AP, a 4 or higher on IB, or an advanced industry-recognized credential, 110 points if you get a basic industry-recognized credential, and 100 points if you’re graduating in 4 years.
All of it is to say that Louisiana is not looking at each indicator as being equal.
Right, so getting a basic industry-recognized credential is not the same as getting a four on an IB test. And Louisiana doesn’t treat them the same, whereas I think in some states, they’re using all these different indicators but, by and large, give them all the same weight.
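As a rough sketch, the point scale discussed here could be modeled as a lookup table. The values are as recounted in the conversation; the actual Louisiana index covers more outcomes and rules than shown, and the outcome labels below are invented for illustration.

```python
# Rough sketch of a strength-of-diploma point scale (values as
# recounted in the discussion; outcome names are hypothetical).
DIPLOMA_POINTS = {
    "associate_degree": 160,
    "ap3_ib4_or_advanced_credential": 150,
    "basic_credential": 110,
    "four_year_graduate": 100,
}

def diploma_points(outcome: str) -> int:
    # Outcomes outside the table earn no points in this sketch.
    return DIPLOMA_POINTS.get(outcome, 0)

# The indicators are deliberately not weighted equally: an advanced
# credential is worth more than a basic one.
print(diploma_points("ap3_ib4_or_advanced_credential")
      - diploma_points("basic_credential"))  # 40
```

The design choice being highlighted is the unequal weights themselves: a flat scale (every outcome worth the same) would erase the distinction between a basic credential and a 4 on an IB exam.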
Chad Aldeman: Yeah, I would add a couple things to that. One – and I think this is a positive trend nationwide over the last few years – is building this in. We used to just hold high schools accountable for graduation and a fairly low-level proficiency test.
And having these included is a positive in my mind. There are lots of intricacies to it, and so I’ll mention a couple. One, if you’re interested in these, the Education Strategy Group has followed this pretty closely.
And they have a good resource for what states are doing and strength of these measures. So, I won’t go into too much. But I will say a couple things. One, the denominator really matters, whether you’re looking at the cohort of entering 9th graders, whether you’re looking at just the cohort of 12th graders, or graduates.
The denominator changes for each of those groups, and you’re giving different incentives based on which one you use. The other thing our peers called out was whether the measures were externally validated.
So, if you’re using something like Advanced Placement or International Baccalaureate or an industry certification, those all have some external check on them – an external check on quality.
Some states are using more, like, career plans or other things like that, where it’s going to be much more difficult to know whether it’s a high quality pathway or not. And then, the last thing I would say is whether certain student groups are being pushed into certain pathways. That is a real concern for our peers – you know, if academic kids are being pushed into the academic pathways and other students are being pushed into the career pathways, are those really, truly fair and comparable?
Phillip Lovell: And having been one of the peers, you know, that was definitely something that we were observant of and concerned about. And I want to go back to the issue of the denominator because this is super important.
Some states are looking – so when they’re using remediation rates, right, which is a good thing – we should be looking at the percentage of students that need remediation in postsecondary education as a measure of how well the system is working.
And really, only a few states are using remediation. But we’ll just take that as an example. If we’re looking at just 12th graders, we’re missing out on kids who may have dropped out in grades 9, 10, and 11.
So, remediation rates, as high as they might be, are actually undercounts of whether kids are actually ready for college and career, because you don’t count dropouts. And if we’re looking at AP course taking and performance, the same thing goes.
If we’re just looking at 11th and 12th grade students or just graduates, then we may have inflated figures of college and career readiness. And that’s something that they’ll definitely need to keep an eye out for.
Nikki McKinney: Andrew, one of the major areas of disagreement in education accountability is whether schools should receive a summative rating, such as an A through F letter grade, or whether schools should be measured using a dashboard that includes data on multiple indicators or some combination of the two.
What did states propose in their ESSA plans?
Andrew Ujifusa: Yeah, so I was so eager to talk about this a few minutes ago. I jumped the gun on the slide. But now, we can get to it. So, I think for many folks this is sort of the business end of accountability. And it’s sort of the – especially when you’re thinking about parents and how they think about their child’s school or interact with school.
This is sort of an – both sort of a final product, in a way, but also an entry point for a lot of people in the general public. So, the short answer is, as you can see from the slide, if you’re a fan of schools getting some sort of final grade or rating, you’re in luck because most states – 42 and D.C. – do that.
Now, there’s a variety within that that I can get to in a second. But there were only six states that we counted that used a dashboard model that, you know, assigns a different grade or a different rating to individual factors in the accountability system.
But it doesn’t tote them all up and say to a school, “All right, at the end of the day, you’re an A school,” or, “you’re an F school,” or, “you’re a three-star school,” or whatever. A few states chose not to do that.
I think one thing I want to highlight from the slide, even within the summative ratings, A through F, obviously, has been very controversial. It’s been pushed in a lot of states by, among other groups, Jeb Bush’s Foundation for Excellence in Ed.
That number that you see there is pretty consistent with what we’ve seen over the last few years, in terms of the number of states using that. It grew rapidly for a few years, and then it’s kind of leveled off.
I kind of find it interesting that Nebraska, for example, is using words to describe schools. On the one hand, it’s very clear, right. But different words mean different things to different people. And they can hide a lot of things, in some cases.
So, and then you have – I don’t know if Chad’s going to talk about this – but you have the example of California, where it’s gotten a lot of attention, as to how they rate different things going on at different schools.
And a lot of folks are complaining that it is impossible to – for the average person – to understand. So, there are some general trends there. But within that, there’s a lot of variety too.
Nikki McKinney: Chad, did you want to add anything?
Chad Aldeman: Yeah. The only thing I would add is that – so, we would actually add more states to our list of not doing a summative rating. And the reason is there are some states – like, actually, California is going to have a rating system.
They’re going to use their color-coded dashboard to identify schools to meet ESSA’s requirements. And there are – we counted about 15 states that are doing that process, but not going to use it for anything else.
And they might not even warn schools about when they’re close to being identified. They might not use the rating system for any particular purpose.
Andrew Ujifusa: Right.
Chad Aldeman: And so, I think that is a concern. Like, I’ve written in the past about summative ratings. I’m on the record saying I like it. I think it sends a good message to parents. It’s an easy way to communicate an overall sense, a quick sense, of what schools are doing.
If states don’t do that, someone else is going to. And the other thing is that it helps drive improvement. So, we know – like, in New York, when they transitioned from Mayor Bloomberg, where they had an A through F system, to Mayor de Blasio, they used the same school rating system, but they just dropped the grades.
So, a researcher calculated the grades, and then said, “Did F schools continue to improve as they had been previously?” And he found that just by dropping the letter grades, F schools no longer had an incentive to improve.
And so, I do think there is some value to that summative rating. States don’t always like it, but it does send a good message to people.
Nikki McKinney: That’s interesting. Phillip, in the All4Ed dashboards, you looked at the inclusion of subgroup performance and the rating of school performance and the definition of consistently underperforming in identifying schools for support.
In the 16 approved plans, how did states fare on your dashboard?
Phillip Lovell: So, this is, like we were saying earlier, I think one of the biggest areas for improvement within the states that currently have their plans pending. And again, we just released a blog on this. So, it has some additional details there, but a few slides to show.
So for targeted support, the law says that three types of schools need to be identified. And this was one of Senator Murray’s top concerns, right, that we have to have some schools identified for comprehensive improvement.
And these are schools that, by and large, really have a lot of need. But then, we have two types of schools that are identified for targeted support – one group that has consistently underperforming subgroups, and then another group that’s labeled in the law as needing additional targeted support because these subgroups are performing at the level of the lowest performing five percent of Title I schools.
To get a green in our system, by our measure, you have to have a separate definition for consistently underperforming and for this additional targeted support, because in the law there are two separate requirements.
And then in addition to that, you had to be using two or fewer indicators to identify a school for targeted support. And the reason is that we were seeing that a number of states are essentially requiring students to fail on all indicators, in order to be identified.
And we think that if you’re not performing at grade level in math or if you’re not performing at grade level in reading or if you have a low graduation rate, that should be sufficient to trigger a response.
You shouldn’t have to be failing in math, failing in reading, not going to school, having a low graduation rate, and not, you know, having access to art or whatever else might be in the system. So all told here, five states received a red in our system, eight received a yellow, and only three received green.
The other issue is around the inclusion of subgroup performance – so, ESSA says that the performance of subgroups must matter. And what we saw in the states’ systems was that, thankfully, in ten of the approved states, there was some way for subgroups to matter in their rating system.
But of the other six, four received a red and two received a yellow. We gave a red if subgroups just didn’t count in the system, and a yellow if there was some way for them to matter, but it wasn’t substantial.
The good news is that you had states like Louisiana, like I mentioned earlier. Illinois is another state where you can’t get the highest rating in the system if you have a low performing subgroup. In Tennessee and D.C., you have a specific percentage of the rating going to subgroups, so subgroups do matter.
But to the point around the letter grades, they can be transparent. But are they actually accurate? So in one state, for example, we were looking at a school report card, and a school got an excellent for two years in a row.
But then, you peel back the onion a bit, and go to Page Two, go to Page Three, go to Page Four, get out your magnifying glass, and you find that, “Oh, African American students have a graduation rate, you know, of 60 percent.”
And that’s not exactly a transparent system. So, we think that that’s another area where there needs to be some improvement.
Nikki McKinney: Chad, in the Bellwether process, you also examined these issues by looking at the degree to which all students, including all student subgroups, are receiving a high quality education and the extent to which a state ensures that schools and student subgroups that are most in need of support, receive it.
In your review of submitted plans, what are you seeing on these issues?
Chad Aldeman: Yeah, generally similar trends to what Phillip identified. I’d add a couple things that we saw. One is our peers generally wanted to see more from states on this category. So, most states took what the law requires.
And sometimes they copied and pasted it into their plan, essentially saying they’re going to be following the law. And, I mean, that’s a good assurance. But these states have had two years to develop their plans.
I think our peers generally wanted to see more about what is actually going to happen in those schools. Also, states have had two years to design their systems. They’re proposing to put these systems in place.
And I think our peers wanted to see some more indicators of what the actual effects of those systems will be. So I’ll give you an example. Only 3 out of the 51 states could estimate how many schools they’ll be identifying under their rules.
Those states are Minnesota, Louisiana, and Tennessee. They are the only ones that have either run their system or presented numbers publicly on the results. The rest either haven’t run their systems, or they’re not willing to share the results yet.
And neither of those interpretations is a positive one. So, there’s just a lot to be determined. I think it’s a little bit concerning that we’re at this stage of the process and states haven’t run their systems yet.
And they’re going to be running them in six months in the summer of 2018. And I think we’re going to get some surprises about how many schools are identified, where those schools are. We’re not going to have much warning in lots of states.
And we might have a – all of a sudden, to Andrew’s point about people aren’t paying attention to ESSA, that might be a place where the rubber starts hitting the road.
Andrew Ujifusa: Yeah. And I just want to add a little bit of important context. Some of your viewers out there might already know this. But in an interview that Betsy DeVos did, she was talking about these ESSA plans.
And she said, in terms of the flexibility that states take advantage of and so on and so forth, she said she wants states to go right up to the line, in terms of what they can do in these plans under the law and just basically see how far they can push it.
Now, she wasn’t advocating for states to violate the law. But she was – it’s an unusual thing to hear, I think, for many people coming from an education secretary in the context of states trying to follow a federal law, right.
And so, I think for folks like Lamar Alexander, that was very encouraging; for others, not so much.
Phillip Lovell: And if I can just add something to this, I think that yes, people may not be paying attention to ESSA now. But when the plan – but when schools start getting identified, then all of a sudden, this will start to matter.
And, you know, I think the way that states have defined consistently underperforming and this additional targeted support school matters. A number of states were just using the additional targeted support definition in order to have as few schools identified as possible, because under NCLB, you saw an increasing number of schools getting identified, to the point that the systems were really losing legitimacy.
What I think is going to happen is that there’ll actually be a number of schools that are identified, but they’ll only be identified for students with disabilities or English learners. Because the way that that additional targeted support requirement is setup, it says that, basically, that the subgroup on its own is at the level of the lowest performing five percent of all Title I schools.
And so, I think that we’ll have a number of schools that are identified. But we may not see all the subgroups getting support that is needed within those schools because they’re only identified for their special needs students or for their English learners.
And I think it’s something that a lot of us will be watching out for to see just, “Do these systems pass the smell test,” right. If you have subgroups that are getting graduation rates below 60 percent and they’re not identified, like, that’s something to be concerned about.
Clearly, the system is not doing what it should be when you’re mixing all of these indicators together and they’re coming out with maybe a cake that tastes good or a cake that tastes not so good.
Andrew Ujifusa: And I think that the issue of subgroups and how they’re handled – setting aside issues like opt-out, which has gotten a lot of attention in the last few years – I think that is the single biggest issue, the single biggest concern, that members of Congress are expressing.
If you look at the letters they’ve sent and their comments, subgroups, subgroups, subgroups are very important to folks like Patty Murray and Bobby Scott.
Phillip Lovell: And the Alliance for Excellent Education.
Andrew Ujifusa: Of course.
Chad Aldeman: And Bellwether.
Nikki McKinney: And Bellwether and the Collaborative for Student Success. So for those of you who are out there watching, we have exhausted every metaphor there is to exhaust in the education community. We have baked cake.
We’ve got stockings. We’ve got onions peeling back. We are metaphored out.
Phillip Lovell: We all took English class.
Nikki McKinney: We –
Phillip Lovell: Those are our college creative writing and our English writing skills.
Nikki McKinney: And you all sort of read my mind about the next question that I wanted to pose, which is that states have been working on these plans for the last two years. Sixteen have been approved.
The remaining will be approved in the coming weeks and months. So, what is the next phase of ESSA implementation?
Phillip Lovell: So, I think – so, a couple things are going to be happening. So one, we’ll start to see schools being identified. But I think the next sort of policy issues to be addressed will be at the district level, as there’s a requirement in the law for states to have their plans.
But districts have to have their ESSA plans. And then in addition to the district ESSA plan, when schools are identified, they have to have their school improvement plan. So, there are – pardon the additional metaphor – but there are a few more bites at the education policy apple here.
Andrew Ujifusa: Another food metaphor. That’s –
Phillip Lovell: ‘Tis the season.
Andrew Ujifusa: You know, the thing that I’m interested in – this isn’t strictly an accountability issue. But it does deal with public reporting and something everybody understands, which is money. ESSA has a requirement that a school’s per pupil spending be identified and where that money comes from.
And I think if you have districts where there are a lot of disparities between different types of schools and who goes to those schools, I think in some communities that could cause – start a lot of conversation, I think.
Phillip Lovell: That’s the whole –
Chad Aldeman: Yeah, the only thing I would add – so, I like to say that 2016 was sort of the year of federal implementation. The administration at the time, the Obama Administration, tried to draft regulations and tried to provide guidance to the field. Those got taken away, which caused 2017 to start on a hectic basis.
2017 was sort of the year of the states. It’s about the states submitting their plans. 2018 is where we start getting to the districts and where actual parents and teachers start seeing the effects of ESSA, both good and bad.
They’re going to wake up and see what they’re being held accountable for and whether they’re being identified for improvement. And to Phillip’s point, we’re going to see improvement plans, improvement activities, some of the dollars that I mentioned earlier – that $1 billion in school improvement activities – and funds will start going out soon.
Phillip Lovell: And surely, more states will start taking the 3-percent set-aside after hearing from Chad Aldeman that it’s a good idea, so.
Chad Aldeman: That’s right –
Yeah, so those sorts of things. The other thing I would say, I’m optimistic, like, long-term. ESSA is a blank template. So, states can always come back and if they want to add social emotional learning, if they want to revise weighting, if they want to be more aggressive in student services, or whatever that is, a new governor or a new chief can come in and steer in that direction.
And the template – the floor is open for them to do those things.
Nikki McKinney: Well, thank you. You’ve answered all of my questions. But we have audience members, who’ve submitted questions, as well. So, I’m going to throw them out at you sort of rapid fire, so be ready. Our first question comes from James in Pennsylvania.
And we’ve talked a lot about summative ratings. You’ve got a particular perspective. Phillip, I know the Alliance for Excellent Education does, as well. So James’s question is, “Will states that use a dashboard rather than a summative rating be at a disadvantage compared to their peers” – those schools that use a dashboard, as opposed to a summative rating?
Phillip Lovell: I think it depends on what we think of as an advantage or a disadvantage. You know, with the dashboard, the theory, I think, is that you get richer information – you’re not just looking at does the school get an A, but what’s going on with achievement, what’s going on with graduation rates, what’s going on with college and career readiness.
The flipside is that the system can be really complicated – and, you know, you can look to California’s plan, I think, as an example. When you see their 5-by-5 matrix with its 25 cells – I mean, it looks more like a Christmas tree.
And is that going to be understandable to parents and the public the way that it should be? The downside I see to the A through F system is that, you know, very few states that have them, I think, weight subgroups enough.
So, there are advantages and disadvantages to each approach. I think that what really matters is when it comes down to being able to compare, whether it’s a letter grade or a dashboard with what schools are identified, it’s like, “Does this system make sense?”
And then, we can see if there’s an advantage or a disadvantage to one approach.
Chad Aldeman: Yeah, the first thing I would say is – so, every state has proposed an identification system. So, they have summative ratings. Like, Pennsylvania has one. California has one. They’re just not compiling the indicators in –
Phillip Lovell: Into a letter grade or something, yeah.
Chad Aldeman: Into a letter grade or something like that. But they have ways – essentially, business rules – that they’re going to use to identify schools and satisfy ESSA’s requirements about identifying comprehensive and targeted support schools.
And so, they have a summative rating system. They’re just not doing it in a very transparent or clear method. And to Phillip’s point, I think that accountability systems work best when people know what they’re being held accountable for and they can understand and react.
And so, I think the downside of a place that is choosing not to do a summative rating in any really transparent way is that they’re not giving that feedback to schools about how they can improve or anything like that.
And there’s a risk that parents and educators won’t understand why they’re being identified, and they won’t know what to do to no longer be identified.
Nikki McKinney: Okay. Several of our viewers have asked which state plans have the most promising practices for school turnaround. Andrew, Chad, any thoughts about that?
Andrew Ujifusa: I would just say, generally – and then, Chad and Phillip might have more detail. But I think that a lot of the plans are very vague on school improvement. Part of that is because while these are state plans, a lot of the turnaround strategies are going to be managed by districts – are going to be run by districts.
So, it’s tough to capture that in a single state plan. And states were also allowed to be pretty vague on them. And I think that this is another instance where if states didn’t have to provide a lot of fine-grained detail, they didn’t.
Now, you can argue about the merit of that. But I think that’s a big – those two things combined are a big reason why we don’t necessarily see a lot of detail about school improvement plans in the ESSA plans.
Chad Aldeman: Yeah, I mentioned a couple earlier. New York has this inspection system that seems promising. And they’ve been using it for a few years. And they’ve gotten good feedback on that. And they’re going to keep using it under ESSA.
New Mexico we mentioned, as well. They have sort of a list of interventions that they think are high quality. And if a school struggles after a certain number of years, they’ll essentially force them to pick from interventions on that list.
It’s not as prescriptive as NCLB, like, “Here’s what happens in Year One, Year Two, and so forth.” But it is a finite list of activities that schools have to choose from. Other states have sort of, like, state takeover or state mandates if a school struggles for a certain number of years.
So, Indiana has a program where the state, essentially, can dictate things about money or use of funds after a certain number of years of low performance. So generally, our peers liked that specificity when they saw it.
I think our peers would have different preferences on what sort of interventions they might like. But they liked seeing something they could latch onto and say, “This is actually going to happen,” in the case of low performance.
Phillip Lovell: And this is – and I think, you know, all of this comes down to implementation, but especially school improvement comes down to implementation. Because like, you can have something that’s really vague on paper, but that in implementation gets done really well.
Or you can have something that’s really detailed on paper, but in implementation doesn’t actually get carried out. So, you know, I think that the state plans were really vague in this area. But moving forward, it’s really the school improvement plans themselves and the implementation of them that’s really going to matter.
Chad Aldeman: I’ll add one other thing that I think was sort of a missed opportunity. A lot of states have multi-tiered systems of supports. And it’s unclear exactly what those do. A lot of them are using regional centers to try to give supports.
What I missed seeing was content-specific areas of support. So to Phillip’s earlier point, Louisiana provided data saying that about half of their schools are going to be identified for low performing students with disabilities.
So, the state needs to come up with a response to that, then, and maybe those schools should be some sort of cohort that then gets tailored support around that particular area of need.
Nikki McKinney: Interesting. Candice in Virginia asks, “Does anyone really think that states will stick with the goals, academic or grad rates, that they’ve set in their plans?” Ha, ha, thank you, Candice, for stumping the panel.
Andrew Ujifusa: I’m not touching that one.
Phillip Lovell: I think that they will stick with their graduation rate goals because, by and large, graduation rates have been improving. And the graduation rate goals, when you compare the baselines to the goals themselves, haven’t been huge leaps, because graduation rates are much higher now than they were, you know, ten years ago.
On achievement, many states put into their state plan, “And we will reevaluate our goals and, you know, recalibrate moving forward,” because the goals – you know, many states set very aspirational goals.
And I think that, you know, as policy people, you have to ask yourself, “So, what is the purpose of the goal? Is it to set an aspiration for the system? Or is it to be used as an accountability metric?”
And by and large, you know – if you have goals that are ideal goals, you know, 95-plus percent of kids being proficient, but you’re starting out with proficiency in the single digits, well then, are you setting up reasonably ambitious expectations for schools?
Well, no, those are probably not reasonably ambitious. But at the same time, if you’re setting goals that are in the 50s, you know, what does that say about your expectations for kids?
Chad Aldeman: Yeah, I totally agree. So, the graduation goals tend to be better because they were based on historical evidence and they’re generally tied to some objective benchmark about, “Here’s what we’ve done in the past. Here’s what we can expect going forward.”
Achievement is totally different. It’s a hodgepodge of throwing some targets out, and it’s really hard to decide whether they’re ambitious or achievable, or what the balance is. And to Phillip’s point, in most states they don’t matter other than being reported on.
But for a school, they don’t matter, other than a report card. And so, there’s no accountability attached to them. There are a few states that have used the goals in their exit criteria for low performing schools, for a school to show that they’ve made progress.
That’s the exception. Nevada and a couple other states are doing things like that.
Andrew Ujifusa: There’s also a huge amount of variation as to when those goals are set. Some states are looking five or six years out, which is not too far away. But then, there are states, I think, going into the late 2020s and even beyond –
Phillip Lovell: Beyond 2030, even. Yeah.
Andrew Ujifusa: Yeah, so I think that some of the kids who would be expected to meet those goals, at least nominally, are very young now or may not even be born yet. So, there is a lot of variation between states.
Chad Aldeman: We also have a lot of moving targets. So a lot of states, as they transition to new assessments, revise their goals. And so every time that happens, they keep revising goals. And it becomes very difficult to have that historical performance to look back, and then set goals that are meaningful.
Phillip Lovell: Yeah, I would prefer to err on the side of ambition. I mean, I think the goals should matter in the system, don’t get me wrong. In the few states where they do, I think that makes a world of sense.
But I would rather set high goals that are not achieved than set low goals that send just a message of mediocrity.
Nikki McKinney: We’ve received a lot of questions from viewers related to federal funding for education and for ESSA implementation, specifically. Folks have noted that there are looming budget cuts, that there’s already limited resources.
And so, how do you all propose states should plan to address gaps in equitable funding for their most vulnerable students? Tough question to close out the webinar. Anybody want to try and tackle that one?
Chad Aldeman: I’ll give it a shot. So, as the question suggested, there is the possibility that there could be federal budget cuts. We don’t know exactly how the Trump budget proposal, the House spending bill, and the Senate spending bill will all play out.
But it’s certainly a possibility that funds in many areas are going to stay flat or decline. That decline is particularly possible when it comes to Title II, which deals with teachers and class size. That could be cut significantly, which could impact some states’ ESSA preparation, at least.
I would also just note that GOP tax bills, which are being sort of merged together and discussed maybe as we speak, right now, could, depending on where you live, exert sort of a downward pressure on state and local school funding.
So, the short answer is, if you’re an advocate for school funding, it’s not a great time. And it’s not entirely clear where you might look for new revenue.
Phillip Lovell: Yeah, I think it’s really important for members of Congress to hear about the importance of funding – I mean, it’s always important, but especially with ESSA being implemented now and schools starting to be identified, you know, this summer. It’s going to be pretty apparent that, “Oh, wait a second, we’re going to need actual funding in order to support these schools.”
You know, a lot of states were really nervous about identifying more schools than they could provide funding for. I think it’s a perfectly reasonable concern. And those concerns need to be understood by our colleagues on Capitol Hill so that Title I – and that 7-percent set-aside – can be as robust as possible.
Chad Aldeman: Yeah, I think Title I, even in this climate, will get about the same funding level. I don’t think it’s going to go up dramatically. I don’t think it will be cut dramatically. So, an individual school district won’t see a big change in their Title I dollar amount. Where they will see changes is that it probably will not keep up with inflation.
And it definitely won’t keep up with inflation and per pupil adjustments. And so –
Phillip Lovell: And it probably won’t keep up with the number of schools that are identified.
Chad Aldeman: Right, right. So, between the other portions of the federal government and the tax bill, I think state and local budgets are where a lot of the work has to be done.
Andrew Ujifusa: Mm-hmm.
Nikki McKinney: Well sadly, we’ve come to the end of our time this afternoon. I want to thank our panelists, Phillip Lovell, Andrew Ujifusa, and Chad Aldeman, for being with us today. Remember, that this webinar will be archived for later viewing at www.all4ed.org/webinars.
I’m Nikki McKinney for the Alliance for Excellent Education. Enjoy the rest of your day.
[End of Audio]