Long ‘waiting list’ for Florida vouchers doesn’t actually exist
This belongs in the you-can’t-make-up-this-stuff category.
The short version:
Florida’s lawmakers are considering expanding a voucher-like tax credit program because, legislators keep saying, there is a huge waiting list of families who want to participate. It turns out that there is no waiting list.
The long version:
The Florida Legislature has been considering legislation that would expand the state’s Tax Credit Scholarship Program, a voucher-like scheme that allows public money to be used for private school tuition but wouldn’t require much if anything in the way of accountability from schools that accept vouchers. (For example, the students wouldn’t have to take the high-stakes standardized tests required of public school students.)
The Senate bill’s sponsor, Republican Bill Galvano of Bradenton, wound up pulling the bill after stories about the lack of accountability began to spread, and he cited the accountability measures as the reason for his action. He did not mention an embarrassing video that was uncovered in which Doug Tuthill, the president of Step Up for Students, which administers the tax credit program, talks about how much money his organization spends funding political campaigns. The Tampa Bay Times wrote about the video in this story, which said in part:
In the video, Step Up for Students President Doug Tuthill outlined the organization’s political strategy. He talked about the role of an affiliated political committee.
“One of the primary reasons we’ve been so successful we spend about $1 million every other cycle in local political races, which in Florida is a lot of money,” Tuthill told a group at the University of California, Berkeley. “In House races and Senate races, we’re probably the biggest spender in local races.”
Tuthill said he and other proponents “make low-income families the face of the program.”
“We put those people in the face of Democrats and say ‘How can you deny this parent the right to educate their child in the ways that they need?’ ” he said.
Just when people thought the expansion of the program was dead in the legislature for the year, Florida House members found a way to resurrect it by combining it with another reform bill still alive. What will happen is unclear.
But the larger point is that the expansion of the program has been pushed by Step Up For Students based on what it and supportive legislators have said is a very, very long waiting list of families who want to participate. Rep. Erik Fresen, a Republican from Miami who was one of the legislators who figured out how to keep the expansion idea alive, said at a hearing in Tallahassee about the bill that there is a waiting list of families seeking the tax credits that now stands “at 100,000 students.” During the debate about the legislation, a figure of 34,000 families on a waiting list has been thrown about, as have other figures.
Specifically citing such numbers suggests there is an actual waiting list. But, it turns out, there isn’t. After activists and reporters asked for details about the waiting list, Step Up For Students acknowledged that, alas, it doesn’t really keep one. There aren’t any people on the waiting list because there isn’t a waiting list. Why?
Jon East, of the redefinED blog, which is published by Step Up For Students, wrote in this post:
The people who process applications at Step Up, which publishes this blog, have become so overwhelmed in recent years that they no longer wanted to give low-income families false hope. They concluded that the waiting list was mostly for show, and they wanted no part of that.
Mostly for show? The organization has sought an expansion of the program, and legislators have cited the waiting list as a reason for funding it.
For the current school year, 2013-14, the $286 million cap has allowed Step Up to serve 59,765 low-income students. But applications were coming in so fast last spring that the processing team decided to stop taking them on June 28, about a month and a half before school started. Even so, 94,104 students had already started applications.
That June number, 94,104 applicants against the 59,765 students actually served, is the origin of the 34,000 “waiting list” that has been asserted many times during the current debate. In reality, it’s not a waiting list, but it’s a powerful indication of demand.
Whatever the demand, it remains the case that public funding has no business being used to pay for private school tuition. That’s the bottom line.
November 12, 2013
FEA cautions against jumping to conclusions after court orders flawed VAM numbers released
TALLAHASSEE – Florida Education Association (FEA) President Andy Ford expressed disappointment after a First District Court of Appeal (DCA) panel ordered the release of flawed evaluation data for every teacher in Florida, overturning a trial judge’s earlier ruling. He cautioned Floridians not to jump to conclusions about the rankings of teachers because the numbers provided by the Florida Department of Education (DOE), based solely upon student test scores, provide an incorrect measure of public school teachers.
FEA intervened to join the Florida Department of Education in the case -- Morris Publishing Group v. Florida DOE -- defending against the disclosure of teachers’ value-added scores in response to a public records request by Jacksonville newspaper The Florida Times-Union. The trial court ruled against disclosure since teacher performance evaluations have historically been exempt from public disclosure, and The Times-Union appealed. Today, a panel of the First DCA overturned the trial court ruling.
“The evaluation data on teachers that is about to be made public is meaningless, which is why we joined in to enforce the public records exemption and prevent it from being published,” said FEA President Andy Ford. “The numbers to be released are subject to misinterpretation. They have not been put in their proper context.”
Ford said that research has shown that even the most sophisticated and valid “value-added” or “VAM” measurements are limited in what they can measure.
“But Florida’s VAM formula is not valid,” Ford said. “It is deeply flawed.”
Nearly all teachers’ VAM numbers are calculated according to students’ FCAT scores, yet only about 35 percent of teachers teach students and subjects tested on the FCAT. So for 65 percent or more of teachers, the VAM does not even attempt to measure the teacher’s actual teaching. The Legislature openly recognized this flaw earlier in passing SB 1664, which requires future VAM scores to be based upon a teacher’s actual students. But the two years of VAM data the court has ordered to be released do not take into account the new law, making all of the data meaningless.
“The FEA fully supports teacher accountability,” Ford said. “But assessments of teachers, like assessments of students, must be valid, transparent and multi-faceted. These value-added model calculations are none of these. We hope that The Florida Times-Union – and anyone else who publishes these numbers – makes it fully clear to its readers how little meaning these numbers have in determining the quality of an individual teacher.”
VAM Talking Points
- The FEA fully supports teacher accountability, as no one wants an ineffective teacher in the classroom. But assessments of teachers, like assessments of students, must be valid, transparent and multi-faceted. These value-added model (VAM) calculations are none of these.
- It is ludicrous to try to determine the value of a teacher using a formula that is comprehensible only to a small number of statisticians. Given the problems the DOE has been having with data on testing and school grades, we have little confidence in these complex figures being used to determine a teacher’s evaluation.
- The numbers released by the DOE are subject to misinterpretation. They are mechanical calculations and have not been put in their proper context. The complex value-added statistical model is part of a highly complex accountability system. The Florida public is well aware of the ongoing problems with FCAT and school grades. The full accountability system must be examined and reimagined.
- Research has shown that even the most sophisticated and valid VAM measurements are limited in what they can measure. But Florida’s VAM formula is not valid; it is deeply flawed in practice.
- The two-year cumulative number includes data from 2009-10, before the law even went into effect, calculated after the fact with gaping holes of missing data and little or no roster verification.
- Nearly all teachers’ VAM numbers are calculated according to students’ FCAT scores, yet only approximately 35 percent of teachers teach students and subjects tested on the FCAT. So for 65 percent or more of teachers, the VAM does not even attempt to measure the teacher’s actual teaching.
- The Legislature openly recognized this flaw in passing SB 1664, which requires future VAM scores to be based upon a teacher’s actual students. But the two years of VAM data to be released by DOE does not take into account the new law, making all of the data meaningless.
- After two years, millions of dollars spent on the formula, and countless hours spent implementing SB 736, if these calculations are all the DOE has to show for its efforts, Floridians can now see what this effort has accomplished for Florida’s students: nothing. It’s an expensive boondoggle that is taking money from the classroom and putting it into the pockets of the private consultants contracted by DOE. Our students deserve better.
- Researchers have issued numerous warnings about basing teacher evaluations substantially on student test scores. The Legislature and the DOE have largely ignored these warnings.
- Most researchers agree that VAM is not appropriate as a primary measure for evaluating individual teachers. Reviews of research on value-added methodologies for estimating teacher “effects” based on student test scores have concluded that these measures are too unstable and too vulnerable to many sources of error to be used for teacher evaluation.
- Value-added models, taken by themselves, are not an adequate measure of overall educational quality. Like any other measure based on standardized tests, VAMs provide an incomplete view of students’ knowledge, skills and dispositions. Standardized tests only assess a fraction of what teachers teach and students learn; VAM scores based on these “fractional” standardized test results should not be used as the singular assessment of a teacher’s impact on student learning.
- Teachers’ ratings are affected by differences in the students who are assigned to them. Statistical models cannot fully adjust for the fact that some teachers will have a disproportionate number of students who may be exceptionally difficult to teach: students with poor attendance, who are homeless, who have severe problems at home, etc. Also, the model does not accommodate high-performing students who “hit the ceiling” with near-perfect or perfect test scores and cannot show growth. For example, a teacher of gifted students may have a negative VAM score based on student learning growth even though all of his students consistently perform at the highest achievement level each and every year.
- Value-added models of teacher effectiveness do not produce stable ratings of teachers. Teachers look very different in their measured effectiveness when different statistical methods are used. In addition, a given teacher may appear to have differential effectiveness from class to class, from year to year, and even from test to test. Researchers have noted that ratings are most unstable at the upper and lower ends of the scale, where many would like to use them to determine high or low levels of effectiveness.
- In other states, a teacher who scored in the top quintile (top 20 percent) in one year had about a 50 percent chance of scoring in the third, fourth or fifth quintile the next year. Did half of the highly rated teachers one year become average or below the next? No, the system is flawed.
- It is impossible to fully separate out the influences of students’ other teachers, as well as school conditions, on their apparent learning. Many previous teachers have lasting effects, for good or ill, on students’ later learning, and other current teachers also may have an impact on students’ knowledge and skills.
- The instability and bias of the measures may cause the wrong teachers to be fired and other capable teachers to quit.
- Good systems must be designed so that teachers are not discouraged from collaborating with other teachers or from teaching the students with the greatest educational needs.
- Parents, teachers and the public don’t trust the ever-changing numbers coming out from DOE with regard to testing and school grading. Why should they trust the more complex figures coming from the same department as it relates to the evaluations of more than 180,000 Florida public school teachers?
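The quintile churn described in the talking points above follows directly from low year-to-year reliability: when measurement noise is large relative to a teacher's true effect, rankings reshuffle on their own. The short simulation below illustrates the point with made-up parameters; it is a minimal sketch, not Florida's actual VAM model, and the noise level is an illustrative assumption chosen to roughly match the churn rates reported in other states.

```python
import random

random.seed(42)

N = 10_000       # simulated teachers
NOISE_SD = 1.7   # assumed measurement noise, larger than the true-effect spread (illustrative)

# Each teacher has a fixed "true" effectiveness; each year's observed VAM-style
# score is that true effect plus independent random noise.
true_effect = [random.gauss(0, 1) for _ in range(N)]

def yearly_scores():
    return [t + random.gauss(0, NOISE_SD) for t in true_effect]

def quintiles(scores):
    # Quintile 1 = top 20 percent, quintile 5 = bottom 20 percent.
    order = sorted(range(N), key=lambda i: scores[i], reverse=True)
    q = [0] * N
    for rank, i in enumerate(order):
        q[i] = rank * 5 // N + 1
    return q

q_year1 = quintiles(yearly_scores())
q_year2 = quintiles(yearly_scores())

# Of the teachers rated in the top quintile in year 1, what share falls to
# the third, fourth or fifth quintile in year 2 -- purely from noise?
top_year1 = [i for i in range(N) if q_year1[i] == 1]
dropped = sum(1 for i in top_year1 if q_year2[i] >= 3) / len(top_year1)
print(f"Top-quintile teachers landing in quintiles 3-5 the next year: {dropped:.0%}")
```

With these assumed parameters, a large share of top-quintile teachers drop to the middle or bottom of the distribution the following year even though no teacher's true effectiveness changed at all, which is the instability the talking points describe.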