I will protect your pensions. Nothing about your pension is going to change when I am governor. - Chris Christie, "An Open Letter to the Teachers of NJ" October, 2009

Friday, February 5, 2016

The Mis-NAEPery of @TeachForAmerica

Teach For America is having a big party in Washington, D.C. this weekend to celebrate its 25th year -- and look who's crashing:
Teach for America, the program that places newly minted college graduates in some of the nation’s most difficult classrooms for two-year teaching stints, is holding a summit this weekend in Washington to mark its 25th anniversary.
The list of speakers reads like a who’s who of activists and leaders behind recent changes in education policy around the country, from former D.C. Schools chancellor Michelle Rhee to Eva Moskowitz, the head of the largest chain of charter schools in New York City. The singer Janelle Monáe will entertain at a glittery gathering of an estimated 15,000 Teach For America alumni; the organization’s many donors will also be on hand.
And roaming among them is Gary Rubinstein, a nationally known scold of TFA.
Rubinstein, a former TFA volunteer who is in his 14th year of teaching math at Stuyvesant High School in New York City, says he wants to force an “honest discussion” about TFA — including its weaknesses.
To that end, he created a Twitter account @TFA25FactCheck and a new blog and will attend the summit, looking for opportunities to inject what he calls “reality” into discussions about the best ways to improve public education. He is helping to organize a happy hour for those who share his concerns about TFA and said he will also hold an impromptu discussion during the three-day event, after he said his requests to join official panels were spurned by TFA organizers.
TFA doesn't welcome critics? I'm simply shocked...

You can follow this weekend's festivities via Twitter: @TFA25FactCheck. But be prepared: if you haven't recalibrated your B.S. detector, it might just explode:


Apparently, this slide was shown as "proof" that the reformy reforms in Washington D.C. -- including innumerate teacher evaluation -- are leading to gains so large in the nation's capital that they must be rendered in bright yellow bars.

Too bad whoever made this forgot a few things:

- Test score gains are not necessarily equivalent across different tests. In other words: we don't know if gaining 10 points on the Grade 4 reading test is at all equivalent to gaining 10 points on the Grade 8 math test.

- Test score gains are not necessarily equivalent across different parts of a score distribution. In other words, going from a 230 to a 240 is not equivalent to going from a 260 to a 270: it might be much harder to gain 10 points from one starting point than it is from another.

So combining the scale scores of different tests at different starting points and then comparing them is pretty much worthless.

- It is pointless to compare test score gains without accounting for changes in student populations. We know D.C. has seen substantial demographic changes; you can't just slap up scores that correlate to student characteristics without acknowledging these changes.

- Test score changes are not, by themselves, proof that particular policies are successful. Look at the top of this slide: "How the DC Public Schools Changed Everything to Get, Grow, and Keep Great Teachers and Principals." Is the person who put this up seriously suggesting a few teacher policy choices are the cause of the test score gains? That it couldn't possibly be a host of other factors? Really?

OK, I wasn't there. Maybe this was a simple descriptive introduction, leading up to a sophisticated analysis with proper controls for student population changes and scale differences. Maybe this was used as an example of how not to use NAEP data to make a case for a particular policy intervention. Maybe TFA is going to have a weekend full of serious policy discussions, and not engage in some really shameless data manipulation to push their particular agenda.

Maybe...



ADDING: Tonight's menu:

Mis-NAEPery a la Baker.

Mis-NAEPery a la Polikoff.

Mis-NAEPery a la DiCarlo.

Bon appétit!

Monday, February 1, 2016

Charter School Realities: East Brunswick, NJ

This year, I'm doing an occasional series about the many unknown charter schools that aren't affiliated with the big, non-profit charter management organizations like KIPP or Uncommon or Success Academies. Last time, I showed how a small charter in Red Bank, NJ, was likely impeding any chance of the community being able to integrate their schools. Let's stay in the Jersey 'burbs and discuss another small charter, and how it's affecting the local public district schools.

  * * *

Nearly two years ago, the NJDOE denied an expansion request for Hatikvah Academy Charter School in East Brunswick. But if there's one thing to know about charter operators, it's that they can be a persistent bunch: a year later, Hatikvah got permission for its current students, then in grades K through 5, to stay through Grade 8. But even that wasn't enough...

According to this letter from the East Brunswick Public Schools -- the "hosting" public school district for Hatikvah -- the charter now wants to expand its enrollments for all of its grades. The reason, ostensibly, is that Hatikvah has more students on its wait list than it does available seats. But the district says there really isn't a community demand for the charter, because Hatikvah is drawing students from all over the state:

I made this map from NJDOE data provided to me by concerned citizens whose districts are affected by Hatikvah (more in a minute). The sending districts to the north are more than 25 miles away; the furthest to the east is more than 15 miles. But not only that:



About half of Hatikvah's students are in East Brunswick, where the school is located. But small numbers of students -- in some cases, a single student -- are coming from districts within a wide area. Florham Park, the northernmost district, is sending one student. What's going on here?

The answer is to be found in Hatikvah's unique curriculum: it's a Hebrew immersion school. Apparently, the NJDOE thinks it's so important for a publicly funded school to teach Hebrew that it's willing to allow students from a wide area to enroll in Hatikvah, dragging their share of taxpayer revenue with them.

If you're surprised to hear that tax dollars can be used to support Hebrew charter schools, you shouldn't be: there is actually a group called the Hebrew Charter School Center, which "...works with public charter schools and planning groups who focus on the instruction of the Hebrew language and culture, as well as the study of the culture and history of Israel and its immigrant communities." The group claims Hatikvah as one of its schools; according to speakupnj.org, it has also funded a religious after-school program at the charter:
We have looked at all of the HCSC's available Form 990s, and what we found is that from 2010-2013 the HCSC has given over $1 million dollars to Hatikvah, and over $500,000 to create a religious after school program that ONLY serves students from Hatikvah. The stated goal of the program, called Nefesh Yehudi Academy, is "to provide a Jewish education to complement the curriculum of a Hebrew immersion charter school program, while encouraging students to appreciate the diversity of all Jews."

This is not surprising, since the mission of the Steinhardt Foundation for Jewish Life is "to revitalize Jewish identity through educational and cultural initiatives that reach out to all Jews, with an emphasis on those who are on the margins of Jewish life, as well as to advocate for and support Hebrew and Jewish literacy among the general population."

While Mr. Steinhardt is welcome to use his vast fortune to "revitalize Jewish identity," his efforts to do so using public tax dollars to create secular charter schools supplemented by religious after school programs, both created and supported with HCSC funds, are a misuse of the tax dollars that flow into the public charters. Steinhardt has essentially created a low cost alternative to a religious education (Hatikvah is free, and Nefesh Yehudi Academy is $2,800 as per their registration form). 
So the taxpayers pony up for the Hebrew instruction during the school day, and private funds supplement the religious instruction "after school." I guess that's one way around the First Amendment...

Now, this might not be all that serious if the hosting districts weren't feeling the effects of having to support a charter -- even one that enrolls just a few of their students. Districts, however, have fixed costs; depending on the size of the districts, losing students to a charter can have a real fiscal impact on the hosts. Ask East Brunswick:
As in the past, Hatikvah posits that increasing enrollment would not harm East Brunswick. They suggest that only approximately 9 students of the proposed expansion of 200 students would come from East Brunswick and this would only cost the district in year one an additional $100,000 or so. (Think what it would cost the other districts that have no say in Hatikvah’s expansion into their districts.) Even another $100,000 next year would harm East Brunswick. The East Brunswick Public Schools already spend a sum equal to 98% of the State imposed cap to support Hatikvah. Another $100,000 would make it more than 100% of the cap! The cost would increase each year as students move up through the grades.
That's the host district, again, sending about half of the students. But what about some of the others -- like, say, Highland Park, which makes up about 6 percent of Hatikvah's students? Well, funny you should ask...

Long time readers know one of my best blogging (and real-life) buddies is the intrepid Darcie Cimarusti, aka Mother Crusader. Darcie has been fighting for a good long time to bring sanity to the New Jersey education system, starting with her hometown of Highland Park. Her activism led her to run for the school board and win a seat; this past year, she was elected board president.

Darcie and I have been talking about Hatikvah and how its expansion will damage her own daughters' school district, which led me to look more carefully at the data and produce the analysis below. As Darcie recently told the state BOE in testimony, Hatikvah has done real damage to her district's finances:

My board would like to inform you that Hatikvah has essentially morphed into a statewide charter school, pulling students from 28 districts in 7 counties to fill their seats. Hatikvah was approved to serve East Brunswick, and East Brunswick alone. In fact, in their 2009 application to the state they stated that they didn’t anticipate any out of district enrollment. Now five years into their charter, 50% of their enrollment comes from districts other than East Brunswick, demonstrating a clear lack of interest in their district of residence.

There are no provisions in the state’s charter school law that allow a statewide charter to even exist. This Department has written no regulations, and this Board has approved no regulations to oversee the operation of a statewide charter. If Hatikvah were restricted to accepting students only from their district of residence, not only would they be unable to expand, they would be forced to close. Even if they were restricted to their district of residence and a handful of contiguous districts they would be unable to exist.

In order to keep their doors open they continue to draw more students from more districts, and as each year passes, more funds are lost. When Hatikvah opened in 2010 we lost $61,847. This year Highland Park lost $318,201. In 5 years time, Highland Park’s costs have increased fivefold.

But Highland Park is not part of Hatikvah’s district of residence, so Hatikvah is not required by law to notify our district of their plans for expansion, and our district is not given an opportunity to respond to the proposed expansion. 

We are essentially left in the dark and rendered voiceless.
And that is the insanity of NJ charter school law: because the state is the authorizer, a charter does not need local approval before it can enroll students and cause harm to a district's programs. A district, therefore, has to bear the fiscal burden of supporting a redundant school system simply to sustain a boutique curriculum that the vast majority of families in the town do not want.

How does this make any sense? Is it really so important that a few families, scattered over a wide region, get a publicly funded school to teach their children Hebrew? Is the benefit from this so great that it's worth negatively impacting the finances of local, district public schools?

And, perhaps more importantly: why should local towns have to support charter schools that increase economic and racial segregation?



Here are the free and reduced-price lunch percentages over the last five years for Hatikvah and its top seven sending districts. Hatikvah has never enrolled the same proportion of economically disadvantaged students as its largest sending districts. And the racial differences are also pronounced:


Hatikvah's concentration of white students is higher than that of any of its largest feeders. Is anyone really surprised that a Hebrew immersion school would enroll a mostly white population? In addition:

Hatikvah enrolls fewer special education students than its feeders -- in some cases, far fewer. This is typical for New Jersey charter schools, which largely under-enroll special needs students. When you look at the raw numbers, this becomes even more obvious:


In addition: the very few special education students Hatikvah does enroll tend to have lower-cost needs compared to East Brunswick.

Again, this is quite typical for a New Jersey charter school. And it matters not just in terms of cost: it matters for accountability measures.

You see, it's easier to raise your test scores when you serve proportionally fewer children in economic disadvantage, or fewer who have special education needs. But what if you control for those differences? As I've done many times before, I use a linear regression model here that adjusts test scores for differences in student populations (see below for specifics). The sample is every school in Middlesex County, NJ; the scores are from the last administration of the NJASK in 2014. Let's see how Hatikvah's 5th graders did compared to East Brunswick's in English Language Arts (ELA):

The East Brunswick schools are in green: they all "beat prediction," meaning they all scored better than we would predict based on their free and reduced-price lunch (FRPL) and special education proportions. Hatikvah scored under prediction.

Does this mean East Brunswick's schools are "better" than Hatikvah's? I wouldn't ever make that claim based on a few test scores, but I will say this: it's hard to justify Hatikvah's expansion based on any claim that the charter educates its students better than East Brunswick's schools. Here are the adjusted math scores:

Can anyone credibly make a claim that Hatikvah is a badly needed alternative to East Brunswick's public district schools?

I have more of these below for Grades 3 and 4; they vary somewhat, but there is no consistent pattern of Hatikvah outscoring East Brunswick. And it's the same in Highland Park:




Highland Park has, proportionally, more than twice as many FRPL students as Hatikvah. Yet it outstrips the charter on adjusted Grade 5 ELA and math scores -- handily. Again, there's variation by grade... but where is any proof these districts' students need a charter for a "better" educational option than their already fine public district schools?

Folks, if you want your kids to go to a school where they learn Hebrew, then by all means, sign them up at a private school. Or send them for religious instruction after school -- it's your right, of course. But the idea that our schools should be able to cater to every parent's desire, every educational whim, is absurd.

The taxpayers should not be made to support redundant and inefficient systems of schools simply to fulfill the specialized wishes of a scant few families.

It took former Education Commissioner Chris Cerf a while to figure it out, but he eventually came to the realization that putting a bunch of boutique charters in the 'burbs was a really bad idea -- educationally, fiscally, and politically. After all, Chris Christie's (quickly eroding) political base was in the suburbs: why would they want to see their public schools, which enhance the value of their homes, damaged just so a few students could enroll in a Hebrew immersion school fueled by their tax dollars?

Unfortunately, a few of these charters out in the leafy 'burbs got approved; now, some are having trouble enrolling enough students from their own towns to justify their existence. I don't doubt that if I started a charter based on a clown college curriculum, I could probably find enough students across a wide area who would sign up. That doesn't mean allowing these schools to expand their reach is good public policy -- particularly if they increase racial and economic segregation while damaging the public district schools' bottom lines.

Next in this series: let's stay out in the Jersey 'burbs for a bit more -- but then I want to get to Upstate New York...


Highland Park: One of the nicest small towns you'll ever visit.


ADDING: Here are some other adjusted scores. The regression model uses NJASK scale scores for the dependent variable, and school-level FRPL percentage or special education percentage (all courtesy of the NJDOE) for the independent variables. Residuals are expressed as scale score differences and are not standardized. For the Grade 4 models, special education percentage wasn't a statistically significant factor, so I removed it (yeah, we could argue about that, I know, but I ultimately decided it was more fair to do so if I couldn't get p < 0.05). I plotted the residuals on the y-axis against FRPL percentage on the x-axis just to give a little more description to the data.
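If you want to see the mechanics, here's a minimal sketch in Python of the kind of model I'm describing. The file name and column names (score, frpl_pct, sped_pct) are placeholders, not the actual NJDOE layout; the sketch simply fits mean scale scores on school-level FRPL and special education percentages, then plots the residuals against FRPL, the same way the charts above are laid out.

```python
# Minimal sketch of the adjustment described above -- not the exact script used here.
# Assumes a hypothetical CSV of Middlesex County schools with columns:
#   school, score (mean NJASK scale score), frpl_pct, sped_pct
import pandas as pd
import statsmodels.api as sm
import matplotlib.pyplot as plt

schools = pd.read_csv("middlesex_njask_2014.csv")  # placeholder file name

# Fit mean scale score on FRPL % and special education %
X = sm.add_constant(schools[["frpl_pct", "sped_pct"]])
model = sm.OLS(schools["score"], X).fit()

# Residual = actual score minus the score predicted from student population;
# a positive residual means the school "beat prediction"
schools["residual"] = model.resid

# Plot residuals (y) against FRPL % (x), as in the charts above
plt.scatter(schools["frpl_pct"], schools["residual"])
plt.axhline(0, color="gray", linewidth=1)
plt.xlabel("FRPL %")
plt.ylabel("Scale score residual (actual - predicted)")
plt.show()
```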




Here are adjusted SGP or "growth" scores. It's a weaker model because SGPs account somewhat more for student characteristics, but again: there's no consistent pattern of Hatikvah outscoring its host districts.




 Here are the same graphs for Highland Park:








Again: where's the evidence Hatikvah is really needed in either town?

Tuesday, January 26, 2016

Charter School Realities: Red Bank, NJ

ADDING: I've mentioned this before: hyperlocals are becoming some of the best sources for local reporting out there. RedBankGreen.com was the source for the quotes below, but I also read some earlier reports to get some context. They do a really nice job.


Let me show a pie chart Bruce Baker made that I keep coming back to:


This is from a dataset I made (with Bruce's guidance) of charter schools and their affiliated charter management organizations (CMOs). We hear a lot these days about KIPP or Uncommon or Success Academies; however, we hear much less about Academica or Charter Schools USA or White Hat. What do we really know about these schools?

Further: what about that "OTHER" category? Who runs these schools? How do they perform? How do they affect their local districts?

One goal I have over the next year at this blog is to spend some more time looking at these lesser-known charter schools -- the ones who, in reality, are the backbone of the charter sector. Let me start here in New Jersey with a story that doesn't involve a high profile charter leader like Eva Moskowitz or a high profile CMO like KIPP; however, it's a story that I believe is quite instructive...

* * *

Red Bank might be best known as the hometown of the great Count Basie. Like many small towns in New Jersey, it runs its own K-8 school district; high school students attend a larger "regional" high school that is fed by two other K-8 districts, Little Silver and Shrewsbury. The three districts are all quite small; when combined, their size is actually smaller than many other K-12 districts or regional high schools in the area.

This map from the National Center for Education Statistics shows the Red Bank Regional High School's total area, and the three smaller K-8 districts within it. You might wonder why the three districts don't consolidate; just the other day, NJ Senate President (and probable gubernatorial candidate) Steve Sweeney argued he'd like to do away with K-8 districts altogether. The estimates as to how much money would be saved are probably too high, but in this case it would still make a lot of sense.

The reality, however, is that these three K-8 districts are actually quite different:

Here are the free-lunch eligible rates for the three K-8 districts, and the regional high school. Red Bank students are far more likely to qualify for free lunch, a measure of economic disadvantage. Last year, Shrewsbury had one student who qualified for free lunch. It's safe to guess most of the high school's FL students came from Red Bank.

But I've also included another school: Red Bank Charter School. Its FL population is higher than Little Silver or Shrewsbury, but only a fraction of the FL population in Red Bank. What's going on?

Well, if you read the local press (via RedBankGreen.com), you'll see that this charter school is a huge source of controversy in the town:
With the first flakes of an anticipated blizzard falling outside, a hearing on a proposed enrollment expansion by the Red Bank Charter School was predictably one-sided Friday night.
As expected, charter school Principal Meredith Pennotti was a no-show, as were the school’s trustees, but not because of the weather. They issued a statement earlier in the day saying they were staying away because the panel that called the hurry-up session should take more time in order to conduct “an in-depth analysis without outside pressure.”
Less expected was district Superintendent Jared Rumage’s strongly worded attack of charter school data, which he said obscured its role in making Red Bank “the most segregated school system in New Jersey.”
That's a very strong claim. I'm not about to take it on, but I do think it's worth looking more closely at how the charter school's proposed expansion might affect Red Bank's future:
The charter school proposal calls for an enrollment increase to 400 students over three years beginning in September. Supporters of the non-charter borough schools contend the expansion would “devastate” the district, draining it of already-insufficient funding, a claim that charter school officials and their allies disputed at a closed-door meeting Wednesday night.
As a row of chairs reserved for charter school officials sat conspicuously empty, a standing-room crowd gathered in the middle school auditorium heard Rumage revisit familiar themes, claiming that the expansion plan filed with the state Department of Education on December 1 relies on outdated perceptions about the district.
Continuing a battle of statistics that’s been waged for the past eight weeks, Rumage countered assertions made at a closed meeting Wednesday, where charter school parents were told the expansion would have no adverse impact on the district, and would in fact bolster the district coffers.
This is, of course, the standard play by charter schools these days when confronted with the fiscal damage they do to their hosting districts: claim that they are actually helping, not hurting, their hosts. Julia Sass Rubin*, however, did a study of how Red Bank CS funding affects the local schools. What she points out -- and what seems to have been lost on the charter's spokespeople in their own presentation to their parents -- is that the charter gets less funding per pupil largely because it enrolls a different student population than the public district schools.

This is one of the great, untold secrets of NJ charter school funding: the amounts are weighted by the types of students you enroll. If a charter school takes a student who qualifies for free lunch, or is Limited English Proficient, or has a special education need, the charter gets more money than if it took a student who was not in those categories. That's only fair, as we know students who are at-risk or have a particular educational need cost more to educate.
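To make the mechanics concrete, here is a toy sketch of weighted per-pupil funding. The base amount and weights below are purely illustrative placeholders -- they are not the actual values in New Jersey's funding formula -- but they show why a school's aid depends on which students it enrolls.

```python
# Toy illustration of weighted per-pupil funding.
# The base amount and weights are hypothetical, NOT New Jersey's actual formula values.
BASE = 11_000          # hypothetical base per-pupil amount
WEIGHTS = {
    "at_risk": 0.45,   # e.g., free-lunch eligible (illustrative weight)
    "lep": 0.50,       # limited English proficient (illustrative weight)
    "sped": 1.00,      # special education (illustrative weight)
}

def per_pupil_amount(flags):
    """Return the weighted amount for one student given a set of category flags."""
    return BASE * (1 + sum(WEIGHTS[f] for f in flags))

# A school enrolling few at-risk or special education students will,
# quite properly, receive less per pupil than one enrolling many.
print(per_pupil_amount(set()))             # general-education student
print(per_pupil_amount({"at_risk"}))       # at-risk student costs more to educate
print(per_pupil_amount({"at_risk", "lep"}))
```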

Here are the special education classification rates for all schools in the Red Bank Regional HS area. Red Bank CS has, by far, the lowest classification rate of any district in the region. Of course they are going to get less funding; they don't need it as much as their host district, because their students aren't as expensive to educate. Further, by enrolling fewer special education students, they are concentrating those students in the Red Bank Borough Public Schools. Is this a good thing?

But that's not the only form of segregation that's happening:


Red Bank Borough has few white students in its public district; the charter school has far more. But look at the high school and the other two feeders: they have even more white students proportionally than the charter school. Yes, the charter is creating segregation -- but that's hardly the entire story.

In addition, there's one more very curious thing about this situation. There are, in fact, other areas in New Jersey with K-8 districts that feed into regional high schools, and those K-8 districts, like here, can have very different student populations. StateAidGuy points out a particularly interesting case in Manchester Regional High School: many students who attend K-8 school in North Haledon, a more affluent town than its other neighboring feeders, don't go on to the regional high school. The unstated reason is that parents in that town do not want their children attending high school with children from less-affluent districts; Jeff also notes the racial component to that situation.

But that's not the case for Red Bank Regional High School; in fact, the school attracts more students than those who graduate from its feeders!

These are the sizes of different student cohorts when they are in Grade 8 in the feeders, and then Grade 9 in the high school. The high school actually attracts more students from the area: it has popular vocational academies that can enroll students from other districts, and an extensive International Baccalaureate program.

So the notion that the largely white and more affluent families in Shrewsbury and Little Silver would be scared off by a three-district consolidation with Red Bank doesn't seem to have a lot of evidence to support it. The students already come together in the high school, and that appears to be working out well (at least as far as we can learn from the numbers).

Furthermore, the three towns are within a small geographic area, about 4 miles across. A centrally located school, particularly for the younger children, wouldn't be any further than a couple of miles away for families. It would be quite feasible to implement a "Princeton Plan" for the area; for example, all K-2 students would attend one school, 3-5 another, and 6-8 another.

But the Red Bank Charter School appears to be moving the area away from desegregation. If the expansion goes through, any real chance at consolidation will likely disappear, because the Red Bank district will become even more segregated.

Again, the effects of consolidation on the budgets of the schools would probably be modest -- but the effects on desegregation could be enormous. New Jersey has highly segregated schools; this would be a real chance to undo some of that. But expanding a charter which serves a fundamentally different student population is almost certain to make segregation in the Red Bank region more calcified.

And for what? In their application for expansion, Red Bank CS boasts about its higher proficiency rates than Red Bank Boro. But it's not hard to boost your test scores when you enroll fewer special needs students and fewer students in economic disadvantage. What if you take into account the different student populations?

What I've done below is to create a simple linear regression model that predicts mean scale scores. The sample is all schools in Monmouth County, NJ. The model uses free and reduced-price lunch (FRPL) as an independent variable in Grade 5. I add special education percentages (which weren't statistically significant in the Grade 5 model) for Grade 8.

What I'm basically doing here is looking at all the schools in the county and, based on their scores and their students, creating a model that predicts where we would expect the school's average test score to be given its student population. Some schools will "beat prediction": they'll score higher than we expect. Some will score lower than prediction.

Let me be very clear on this: I would never suggest this is a comprehensive way to judge a school's effectiveness. I'm only saying that if you're going to make a case that a school should be allowed to expand based on its test scores, this is a far more valid approach than simply putting out numbers that are heavily influenced by student population characteristics.
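For readers who want to see what "beating prediction" means mechanically, here's a rough sketch (again, the file and column names are placeholders, not the actual data layout): fit the county-wide model, check how much of the score variation it explains, and compare each school's actual score to its predicted one.

```python
# Rough sketch of the "beat prediction" comparison -- placeholder names throughout.
import pandas as pd
import statsmodels.api as sm

schools = pd.read_csv("monmouth_njask_2014_gr5_ela.csv")  # hypothetical file

X = sm.add_constant(schools[["frpl_pct"]])
model = sm.OLS(schools["score"], X).fit()

print(f"R-squared: {model.rsquared:.2f}")  # share of score variation explained by FRPL

schools["predicted"] = model.predict(X)
schools["beats_prediction"] = schools["score"] > schools["predicted"]

# e.g., look up a particular school by name
print(schools.loc[schools["school"] == "Red Bank Middle School",
                  ["score", "predicted", "beats_prediction"]])
```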

Let's start with Grade 5 English Language Arts (ELA).

That's Red Bank Middle School in the upper right. About 76 percent of the variation in Grade 5 ELA scale scores in Monmouth County can be statistically explained by the percentage of FRPL students enrolled in each school. Red Bank Middle has one of the highest FRPL rates in the county, yet it does exceptionally well in getting test scores above where we'd predict they'd be based on its student population.

What about Grade 8?

For those with sharp eyes: I changed the x-axis to FL instead of FRPL (the model still uses FRPL). The charter does somewhat better than Red Bank Middle school; however, the public district school in Red Bank still beats prediction.

Here are the models for math:


Same thing: in Grade 5, Red Bank Middle beats Red Bank CS in adjusted scores. The reverse happens in Grade 8, but Red Bank Middle still beats prediction.

I always say this when I do these: absent any other information, I have no doubt that Red Bank CS is full of hard-working students and dedicated teachers; they should all be proud of their accomplishments. But it's clear that it's very hard to make the case that Red Bank CS is far and away superior to Red Bank Middle. 

The Red Bank region has a chance to do something extraordinary: create a fully-integrated school district that serves all children well. I don't for a second believe that will be at all easy; we have plenty of research on the tracking practices, based on race and other factors, of schools that are integrated in name only.

But why turn down the chance to at least attempt something nearly everyone agrees is desirable in the name of "choice"? Especially when the "choice" is going to have a negative effect on the hosting school's finances? And when there's little evidence the "choice" is bringing a lot of extra value to its students to begin with?

Who knows -- maybe there's some way to have Red Bank CS be part of this. Maybe it can provide some form of "choice" to all students in the region. But not like this; all an expansion will do in this case is make it even harder to desegregate the area's schools. This is exactly the opposite of NJDOE Commissioner Hespe's mandate; can he honestly say there are benefits from expanding Red Bank CS that are worth it?

I wish I could say that what's happening in Red Bank is an isolated incident; it's not. Let's stay out in the NJ 'burbs for our next stop...

Stay strong, Red Bank!
(photo via RedBankGreen.com)


* Full disclosure: Julia and I co-wrote the NJ charter school report for the Tanner Foundation last year.

Monday, January 25, 2016

Once Again: The FACTS On Newark, Charters, and Spending

I see we have to go over this once again. Fine:

- TEAM Academy Charter School, the Newark branch of the national charter chain KIPP, outscored the Montclair School District one year in one grade on one test -- by 0.2 scale score points.


The next year, that same cohort of students, who were now in Grade 4, showed substantially different results on the same test; the Montclair average was now substantially higher than TEAM/KIPP's.



I don't point this out to suggest either that Montclair's schools are superior, or that TEAM/KIPP's schools are inferior. Without adequately controlling for at least the observed variations in each district's populations (and acknowledging that there are likely many unobserved variations), any comparison between the two systems is utterly pointless.

My point here is that facile, a-contextual, cherry-picked factoids like these are completely meaningless, and that people who bring them up time and again show themselves to be fatuous.

- The latest official figure for TEAM/KIPP's post-secondary (college) enrollment rate is 82 percent. I think this is very good and TEAM/KIPP should be proud of their work; however, once again, it is pointless to say that TEAM/KIPP is getting far superior results to the district schools unless and until you account for the differences, both reported and unreported, in the student populations. Further, simply citing one year's post-secondary enrollment rate, which has not even been confirmed by official sources, is at best incomplete and at worst just plain old lazy.

- Dale Russakoff's book, The Prize, does not make the claim that TEAM/KIPP spends $400 per student on custodians while the Newark Public Schools spends $1,200 per student. As I wrote in my brief on Russakoff's (mis-)use of data, here is the relevant passage from the book:
“Christie had not funded the full formula since taking office, citing the state fiscal crisis, but the allocation was still equivalent to about $20,000 per student. Less than half of this, though, reached district schools to pay teachers, social workers, counselors, classroom aides, secretaries, and administrators – the people who actually delivered education to children. For example, the district calculated that it spent $1,200 a year per student on Avon’s janitorial services; BRICK founder Dominique Lee researched the cost on the private market and found it was close to $400 per student.” (p.135)
First of all, there is nothing in here about TEAM/KIPP. Second, the claim of $1,200 per year at BRICK, an NPS school, is unsourced. My review of NPS data calls into question the veracity of the claim; NPS documents showed spending of about $225 per pupil on custodial salaries (see my brief for the data source). Finally, there is no documentation of how Lee calculated her figure, or what the "private market" means.

I think I've been more than fair to Russakoff, but I also think it's simply unacceptable for "facts" like these to work their way into the mainstream media. She has actually misquoted her own book in interviews. It's important to be clear and rigorous with this stuff; I have found Russakoff's use of data in The Prize to be neither. Sorry to be blunt, but enough's enough.

- The notion that Newark's charters have less bureaucratic bloat than NPS schools is contradicted by state data.

Newark spends more on classroom instruction per pupil than most Newark charters, including TEAM/KIPP.

NPS spends far more on student support services -- guidance counselors, nurses, librarians, psychologists -- than the charters.

This is reflected in the large number of these support personnel per student at NPS compared to most charters.

While TEAM/KIPP has equivalent numbers of social workers per student compared to NPS, the district also has many more psychologists, school counselors, and nurses per student.


NPS has lower administration costs per pupil than any Newark charter school.

NPS's administrative salary costs are among the lowest in the city.

Despite having a crumbling infrastructure, NPS plant costs are not inordinately high compared to the charters.

Russakoff has claimed that only half the money spent by NPS makes it "into the classroom." Yet she never explains what that means, she never explains her methodology for arriving at the figure, and she never fully sources the figure. In the face of all this contradicting evidence that comes directly from the state, Russakoff and the people who quote her have an obligation to explain the apparent contradiction here. Is the state data wrong? If so, how do we know?

You can't just fling data around without explaining how it was created, where you got it, and how it should be interpreted in the proper context.

I'm tired of hectoring people who clearly don't give a damn about their own reputations. But I'm not going to stop pointing out when claims are made about schools that have no proper context, are cherry-picked, are poorly sourced, or are just plain wrong. What I have above are the facts. You can check them out yourself. If I'm wrong, I'll correct them.

But if I'm right...

You can't argue with people who repeatedly bury their heads in the sand. All you can do is point out the facts to those who are willing to listen.


ADDING: This is very, very frustrating to me. In an otherwise excellent conversation about Newark and its schools, Owen Davis, whom I admire greatly, uses Russakoff's book as a source to make the case the charters have less bureaucratic bloat than NPS:
OD: Of course the district should undergo the “forensic audit” that Russakoff suggests. More money should be going to the children in the classrooms, especially when that means more social workers, counselors, teachers assistants, etc. But it has to be understood w/in the context of a depressed local economy where middle class jobs are scarce.
The charter schools in Newark aren’t weighed down by that economic drag, and Russakoff shows how kids and teachers benefit from leaner bureaucracies and more agile administrators. There’s no question that kids are better off when their schools can provide them with more, faster. But the existence of charter schools doesn’t answer the question of wider economic impacts when the district shrinks. [emphasis mine]
Again: Russakoff's tale is contradicted by official state data. Further, she has absolutely not made the case that her sources are better than the state's own reporting.

This has got to stop. We are telling the wrong story, and it's going to lead us to the wrong conclusions.

Sunday, January 24, 2016

Stronger Than The Swarm

Regular readers have probably noticed two things about this blog:

1) Lately, the posts are longer and much more likely to be filled with statistical stuff.

2) I've had it with arguing with reformy hacks.

This is probably something I should have said as part of a New Year's post, but what the hell... I had a lot of time to think today while pushing the snow blower around, and it's become increasingly clear that I want to change direction in 2016:

As I said before: the reformy side really has nothing. If the best response you have to charter skepticism -- which is not, by the way, the same as saying there is no place for choice or charter schools in our education system; it's actually saying that the claims of vastly superior results in the charter sector are largely nonsense -- is to make thinly veiled accusations of racism, you're really running on fumes.

If the best response you have to the legitimate concerns of parents who, among other actions, opt their children out of standardized tests is to say that they are merely coddling their kids, you really have nothing to contribute to the conversation about America's schools.

If you spend your days beating up teachers unions while ignoring the serious problem of inadequate and inequitable funding for our schools, you're not someone I want to waste my time on.

So this blog is, I hope, entering a new phase. Or maybe it's more accurate to say I'm going to try to spend more time writing things I myself would like to read: evidence-based, rigorous, serious discussions about American education, using publicly available data and other forms of evidence to fight off the tired, ignorant platitudes that have come to dominate the conversations about this nation's schools.

Reformsters: if you want to "swarm" me while I do this, go ahead. At this point, I really couldn't care less. I am not going to waste my time debating you on your facile meandering. If you want me to engage, step up; otherwise, you're just not worth it.

Let's start by spending the next week or two looking at the New Jersey suburbs, and why almost everything you've heard about school "choice" is probably wrong. Stand by...

Help is on the way...

Wednesday, January 20, 2016

PARCC, Teacher Evaluations & "Junk Statistics": An Expert Speaks

A little background on what you're about to read: 

In the spring of last year, nj.com posted a story about the PARCC exam -- the new, semi-national standardized test that has been a large source of controversy -- and how it would affect teacher evaluations in the state.

I happened to notice a really great comment from "Rutgers Professor" just below the article. The scholar in question is Gerald A. Goldin. I don't know him personally, but I had certainly heard about him: he is the definition of a scholar, distinguished in both his field and the teaching of his field.

It bothered me, frankly, that someone as knowledgeable as Goldin, who had written a genuine essay within his comment, wasn't featured more prominently alongside the article. Since I'm at Rutgers myself, I contacted him to ask if I could publish what he wrote.

I didn't hear back from him until later in the fall; he was away, and then I was away, and you know how that goes. Dr. Goldin, however, was very gracious and agreed to let me reprint what he wrote. I can only apologize that I haven't done so until now.

What you're about to read is important; Gerald Goldin's opinion on how PARCC will be used matters. I know the state has dropped the percentage of SGP used in a teacher's total evaluation to 10 percent, but even that's too much for a method that is fundamentally invalid. 

I'm honored to host this essay on my blog. Thanks, Dr. Goldin, for this contribution.


* * *


An 8th thing to know: Junk statistics

I read with interest the on-line article (March 16, 2015), “7 things to know about PARCC’s effect on teacher evaluations” at www.nj.com/education/.

As a mathematical scientist with knowledge of modeling, of statistics, and of mathematics education research, I am persuaded that what we see here could fairly be termed "junk statistics" -- numbers without meaning or significance, dressing up the evaluation process with the illusion of rigor in a way that can only serve to deceive the public.

Most New Jersey parents and other residents do not have the level of technical mathematical understanding that would enable them to see through such a pseudoscientific numbers game. It is not especially reassuring that only 10% of the evaluation of teachers will be based on such numbers this year, 20% next year, or that a teacher can only be fired based on two years' data. Pseudoscience deserves no weight whatsoever in educational policy. It is immensely troubling that things have reached this point in New Jersey.

I have not examined the specific plans for using PARCC data directly, but am basing this note on the information in the article. Some of the more detailed reasons for my opinion are provided in a separate comment.

In short, I think the 8th thing to know about PARCC’s effect on teacher evaluation is that the public is being conned by junk statistics. The adverse effects on our children’s education are immediate. This planned misuse of test results influences both teachers and children.

Sincerely,

Gerald A. Goldin, Ph.D.
Distinguished Professor
Mathematics, Physics, and Mathematics Education
Rutgers – The State University of New Jersey

-------------------------------------------------------------------------------------------------------------------

Why the reportedly planned use of PARCC test statistics is “junk science”:

First, of course, we have the “scale error” of measurement in each of the two tests (PARCC and NJ-ASK). Second, we have random error of measurement in each of the two tests, including the effects of all the uncontrollable variables on each student’s performance on any given day, resulting in inattention, misreading of a question, “careless mistakes,”  etc. Third, we have any systematic error of measurement – possibly understating or overstating student competency –  that may be present in the test instruments, may be different in the two instruments, and may vary across the test scales.

The magnitude of each of these sources of error is about doubled when the difference of two independently-obtained scores is taken, as it is in calculating the gain score. In addition, since two different test instruments are being used in the calculation, taking the difference of the scores requires some derived scale not specified in the article, which can introduce additional error. These sources of error mean that each student's individual gain score has a wide "error bar" as a measure of whatever it is that each test is designed to measure.
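To sketch this minimally, assuming the two tests' errors are independent and have roughly equal standard errors $\sigma$: the variance of a difference of independent scores is the sum of the variances, so

\[
\sigma_{\text{gain}} \;=\; \sqrt{\sigma_{\text{pre}}^{2} + \sigma_{\text{post}}^{2}} \;\approx\; 1.4\,\sigma,
\qquad
|e_{\text{gain}}| \;\le\; |e_{\text{pre}}| + |e_{\text{post}}|,
\]

and in the worst case the two absolute errors simply add, roughly doubling the error of either test alone.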

Fourth, we have “threshold effects” – some students are advanced well beyond the content intended to be measured by each test, while others are far behind in their knowledge of that content. The threshold effects contribute to contaminating the data with scores that are not applicable at all. Note that while the scores of such students may be extremely high or low, their difference from one year to the next may not be extreme at all. Thus they can contribute importantly to the calculation of the median (see below).

A fifth effect results from students who did not take one of the two tests. Their gain scores cannot be calculated, and consequently some fraction of each teacher’s class will be omitted from the data. This may or may not occur randomly, and in any case it contributes to the questionability of the results.

Sixth is the fact that many variables other than the teacher influence test performance – parents’ level of education, socioeconomic variables, effects of prior schooling, community of residence, and so forth. Sophisticated statistical methods sometimes used to “factor out” such effects (so-called “value added modeling”) introduce so much additional randomness that no teacher’s class comes close in size to being a statistically significant sample. But without the use of such methods, one cannot properly attribute “academic growth” or its absence to the teacher.

According to the description in the article, the student gain scores are then converted to a percentile scale ranging from 0 to 100, by comparison with other students having "similar academic histories." It is not clear to me whether this means simply comparison with all those having taken both tests at the same grade level, or also means possibly stratifying with respect to other, socioeconomic variables (such as district factor groupings) in calculating the percentiles. Then the median of these percentile scores is found across the teacher’s class. Finally the median percentile of gain scores is converted to a scale of 1-4; it is not specified whether one merely divides by 25, or some other method is used.

However, a seventh objection is that test scores, and consequently gain scores, are typically distributed according to a bell-shaped curve (that is, approximately a normal distribution). Percentile scores, on the other hand, form a level distribution (that is, they are uniformly distributed from 0 to 99). This artificially magnifies the scale toward the center of the bell-shaped distribution, and diminishes it at the tails. Small absolute differences in gain scores near the mean gain score result in important percentile differences, while large absolute differences in gain scores near the extremes result in small percentile differences.
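A quick numerical illustration of that magnification, assuming for simplicity that gain scores are normally distributed (the numbers are only illustrative):

```python
# Equal gain-score differences translate into very unequal percentile differences
# under a (simplified) normal distribution of gains.
from scipy.stats import norm

# A 0.2 standard-deviation difference near the middle of the distribution...
print(100 * (norm.cdf(0.2) - norm.cdf(0.0)))   # ~8 percentile points

# ...versus the same 0.2 SD difference out in the tail
print(100 * (norm.cdf(2.2) - norm.cdf(2.0)))   # ~0.9 percentile points
```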

There are more complications. The distribution of performance on one or both tests may be skewed (this is called skewness), so that it is not a symmetrical bell-shaped curve. How wide the distribution of scores is (the “sample standard deviation”) is very important, but does not seem to have been taken into account explicitly. Sometimes this is done in establishing the scales for reporting scores, in which case one thereby introduces an additional source of random error into the derived score, particularly when distributions are skewed.

Eighth, and perhaps most tellingly, the median score as a measure of central tendency is entirely insensitive to the distribution of scores above and below it. A teacher of 25 students with a median “academic growth” score of 40 might have as many as 12 students with academic growth scores over 90, or not a single student with an academic growth score above 45. To use the same statistic in both cases is patently absurd.

These comments do not address the validity of the tests, which some others have criticized. They pertain to the statistics of interpreting the results.

The teacher evaluation scores that will be derived from the PARCC test will tell us nothing whatsoever about teaching quality. But their use tells us a lot about the quality of the educational policies being pursued in New Jersey and, more generally, the United States.

Gerald A. Goldin, Ph.D.
Distinguished Professor, Rutgers University
Mathematics, Physics, Mathematics Education