This blog provides information on public education, children, teaching, and homeschooling

Showing posts with label data.
Monday, October 12, 2009

California Knocks Down Data Firewall

California is adept at building firebreaks to stop advancing wildfires throughout the state. The inferno that is the student-teacher firewall issue was apparently doused yesterday when Governor Arnold Schwarzenegger signed a bill eliminating a statutory ban on using student achievement data to evaluate teachers. Such a restriction would have rendered California ineligible for a federal Race to the Top competitive grant award.

Here is the Governor's press release.

Here, here and here are background posts on the student-teacher data firewall issue in California.
Wednesday, July 29, 2009

Is California's "Firewall" Penetrable?

California Superintendent of Public Instruction Jack O'Connell countered criticism from Education Secretary Arne Duncan over a state law restricting the use of student assessment data in teacher evaluations. As reported in today's Los Angeles Times, O'Connell highlighted Long Beach Unified as a school district that already uses such data in its evaluations.
California's top education official sought Tuesday to counter federal criticism of the state's reluctance to use student test scores to evaluate teachers, paying a visit to Long Beach to highlight one of the few California school districts to make extensive use of such data.

The Long Beach Unified School District's use of student scores to assess the effectiveness of programs, instructional strategies and teachers is a rarity in California, and state Supt. of Public Instruction Jack O'Connell called it a model for other California school districts during a hastily arranged round-table discussion.

At issue is a 2006 California law that prohibits use of student data to evaluate teachers at the state level. O'Connell said Obama and Duncan misunderstand the law, which does not bar local districts from using the information.
O'Connell also released a statement on this issue last week.

Long Beach Unified is a 2009 finalist for the Broad Prize and was recently profiled by TIME magazine as one of the top urban school systems in the nation.
Tuesday, July 28, 2009

(Re)Focusing on What Matters

Last week I spoke at a meeting of the Lumina Foundation’s Achieving the Dream Initiative, a gathering of policymakers from 15 states, all working to improve the effectiveness of community colleges. At one point, a data working group shared the results of its efforts to create new ways to measure college outputs. This was basically a new kind of report card, one capable of reporting results for different subgroups of students and enabling comparisons of outcomes across colleges. Something like it might someday replace the data collection currently part of IPEDS.

While it's always gratifying to see state policymakers engaging with data and thinking about how to use it in meaningful ways, I couldn’t help but feel that even this seemingly forward-thinking group was tending toward the status quo. The way we measure and report college outputs right now consistently reinforces a particular way of thinking-- a framework that focuses squarely on colleges and their successes or failures.

What’s the matter with that, you’re probably wondering? After all, aren’t schools the ones we need to hold accountable for outcomes and improved performance? Well, perhaps. But what we’re purportedly really interested in—or what we should be interested in—is students, and their successes or failures. If that's the case, then students, rather than colleges, need to be at the very center of our thinking and policymaking. Right now this isn't the case.

Let’s play this out a bit more. Efforts are currently afoot to find ways of measuring college outcomes that make more colleges comfortable with measurement and accountability--and thus help bring them on board. That typically means using measures that give even the lowest-achieving colleges at least a viable shot at success, and using measures colleges feel are meaningful, related to what they think they’re supposed to be doing. An example: the three-year associate degree completion rate of full-time community college entrants deemed “college ready” by a standardized test. We can measure this for different schools and report the results. Where does that get us? We can then see which colleges have higher rates and which have lower ones.

But then what? Can we then conclude that some colleges are doing a better job than others? Frankly, no. It’s quite possible that higher rates at some colleges are attributable to student characteristics or to contextual factors outside an individual college’s control (e.g., proximity to other colleges, the local labor market, region) rather than to anything the college itself does. But that’s hard to get people to focus on when what’s simplest to see are the differences between colleges.

It's not clear that this approach actually helps students. What if, instead, states reported outcomes for specified groups of students without disaggregating by college? How might the policy conversation change? Well, for example, a state could see a glaring statewide gap in college completion between majority and minority students. It would then (hopefully) move to the next step of looking for the sources of the problem--likely trying to identify the factors with the greatest influence, and those most amenable to policy. This might lead analysts back to the colleges in the state to look for poor or weak performers, but it might instead lead them to aspects of K-12 preparation, state financial aid policy, the organizational structure of the higher education system, and so on. The point is that in order to help students, states would need to do more than simply point at colleges and work to inspire them to change. They’d be forced to try to pinpoint the source(s) of the problems and then work on them. I expect the approaches would need to vary by state.
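To make that contrast concrete, here is a minimal sketch in Python of the same completion records aggregated two ways: once per college, the way report cards usually slice the data, and once per student subgroup statewide. The colleges, subgroups, and records below are invented purely for illustration.

```python
# Hypothetical records: (college, student subgroup, completed within 3 years).
# All values are made up for illustration only.
records = [
    ("College A", "majority", True), ("College A", "minority", False),
    ("College A", "majority", True), ("College B", "minority", True),
    ("College B", "minority", False), ("College B", "majority", True),
]

def completion_rate(rows):
    """Share of students in rows who completed."""
    return sum(1 for _, _, done in rows if done) / len(rows)

# Framing 1: a rate for each college (the usual report card).
for college in sorted({c for c, _, _ in records}):
    subset = [r for r in records if r[0] == college]
    print(college, round(completion_rate(subset), 2))

# Framing 2: a statewide rate for each student subgroup, ignoring
# which college students attended -- the student-centered view.
for group in sorted({g for _, g, _ in records}):
    subset = [r for r in records if r[1] == group]
    print(group, round(completion_rate(subset), 2))
```

The second pass surfaces the statewide gap directly, before anyone starts ranking individual campuses.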

Don’t get me wrong, I’m not trying to absolve any college of responsibility for educating its students. What I’m suggesting is that we think hard about why the emphasis right now rests so heavily on relative college performance—an approach that embraces and even creates more institutional competition—rather than on finding efficient and effective ways to increase the success of our students. Are we over-utilizing approaches, often adopted without much critical thought, that reify and perpetuate our past mistakes? I think so.

Image Credit: www.openjarmedia.com
Friday, July 24, 2009

Obama's Firewall

Michele McNeil at Education Week's Politics K-12 blog reports that President Obama himself approved the Race to the Top provision that blocks these competitive grants from going to states that restrict the use of student achievement data in teacher evaluations. Short of statutory or regulatory changes in these three states, should we be saying au revoir to the Golden, Empire, and Badger States?

Only two things can render a state ineligible for Race to the Top grants.

And only one of them is a biggie: the student-teacher data firewall issue.

This effectively means New York, California, and Wisconsin, at the very least, are ineligible for Race to the Top—or will at least have some explaining to do. They have laws on the book that essentially bar the use of student-achievement data in some teacher-evaluation decisions.

Erin Richards at the Milwaukee Journal Sentinel picks up the story, too.

Background here and here.
Monday, May 4, 2009

Cheerleading for NCLB

I guess my reaction to today's Washington Post shout-out to No Child Left Behind ("'No Child' in Action") from former U.S. Education Secretary Margaret Spellings is a question: "If NCLB's accountability alone is such a silver bullet, then how come test scores at the high school level didn't improve?"

Although Spellings mentions that NCLB requires math and reading tests in grades 3-8, it is quite disingenuous of her not to mention that such tests were also required in high school. If the achievement gains aren't sustained through high school, what real difference do they make?

The wise Aaron Pallas offers his take on this issue ("Wishful Thinking"), calling into question Spellings's claims:
But what portion of those trends can be attributed to NCLB? Margaret Spellings refers to changes since 1999, which is convenient for her story, because there were sharp increases in grade 4 reading between 2000 and 2002, and in grade 4 and grade 8 math between 2000 and 2003. But NCLB was signed into law in January, 2002; the first final regulations dealing with assessment were issued in December, 2002; and initial state accountability plans were approved by the U.S. Department of Education no later than June, 2003. The 2003 main NAEP was administered between January and March of 2003. Is it realistic to claim that NCLB affected scores before the 2003 NAEP administration? I, and a great many other analysts, think not.

Only in Margaret Spellings’ world can NCLB affect NAEP scores for the four years before the law was passed and implemented. Now that’s wishful thinking.

UPDATE -- Diane Ravitch comes to similar conclusions in her blog post.
Thus, when one looks at the patterns, it suggests the following: First, our students are making gains, though not among 17-year-olds. Second, the gains they have made since NCLB are smaller than the gains they made in the years preceding NCLB. Third, even when they are significant, the gains are small. Fourth, the Long Term Trend data are not a resounding endorsement of NCLB. If anything, the slowing of the rate of progress suggests that NCLB is not a powerful instrument to improve student performance.
Caveat emptor.
Tuesday, July 8, 2008

A Blank Prescription for Policy Reform

Stanford has yet another interesting publication, Pathways, which integrates critical analysis of research on poverty with bold prescriptions for policy reform. I'm a bit late to the game in mentioning this, since the first issue came out in December, but then again, as a mom to a toddler, a six-month lag or so in my reading ain't bad.

I'm especially a fan of Becky Blank's piece, which includes a list of priority efforts for antipoverty programs. Blank is whip-smart and was recently a contender for the chancellorship here at Madison. Darn it all, she wasn't chosen.

Here's a quote from near the end of her article, which should illustrate why many of us would've loved to have her here: "Social policy evaluation is one of the least well appreciated tools of long-term policy design."

If there were one message I could send to the presidential candidates, it would be this: when tackling ANY area of policy reform, please, please, please take the evaluation of your programs seriously. No, data isn't entirely objective, but it's a whole heck of a lot more objective than simply deciding something's working -- or not -- based entirely on politics or ideology.