This blog provides information on public education for children, teaching, and homeschooling

Showing posts with label college degree. Show all posts
Wednesday, October 6, 2010

Witchy Woman

What an eerie coincidence. It turns out that 1969 gave birth both to the Monty Python comedy troupe as well as to Christine O'Donnell, Tea Party darling and Republican nominee for one of Delaware's two U.S. Senate seats.

What do Monty Python and O'Donnell have in common? Why, witches, of course!!!

One of the highlights of the Pythons' 1975 feature film, "Monty Python and the Holy Grail," is a scene that employs a "scientific" method -- one that I can easily see some Tea Party candidates employing in public policy if given the chance -- to determine whether a woman is, in fact, a witch.



In one of the most bizarre beginnings to a political advertisement EVER, 2010 Senate candidate O'Donnell announces, "I am not a witch."



O'Donnell, as you may have heard, admitted in 1999 on Bill Maher's ABC show, "Politically Incorrect," that she had "dabbled in witchcraft" and had a date "on a satanic altar." Whether or not it's actually true, it's downright bizarre, especially when considered alongside her other wacky quotes. In addition -- to bring it back to education -- O'Donnell has repeatedly lied about her educational credentials: claiming falsely that she studied at Oxford, claiming that she was taking graduate classes at Princeton University, and claiming for years that she was a graduate of Fairleigh Dickinson University.

Delaware Republicans must be so proud, having thrown well-respected, long-serving Congressman Mike Castle under the bus for a woman who, even Karl Rove admits, says "a lot of nutty things."

Is it just me, or has this political year brought out one of the craziest -- and, in certain cases, most dangerous -- assortments of public officials ever? The likes of Michele Bachmann, Ken Cuccinelli and Jim DeMint already represent some Americans. The likes of Sarah Palin, Sharron Angle and Carl Paladino would like to.

Happy Halloween!
You have read this article Christine O'Donnell / college degree / Delaware / Education with the title college degree. You can bookmark this page URL http://apt3e.blogspot.com/2010/10/witchy-woman.html. Thanks!
Monday, December 21, 2009

First, Do Your Homework

There's growing concern with higher education's affordability problem, as well there should be. It's hard to see how college will promote social mobility if a kid's ability to access it is increasingly linked to whether or not his family has money.

So it's heartening to see college leaders attempting to provide solutions. But it'd be even better if we first saw them earnestly attempting to understand where the real sources of trouble lie. I'm afraid that step's being skipped a bit too often, running the risk of making things worse.

Here's a recent example. At this month's Regents Board Meeting, University of Wisconsin System President Kevin Reilly was explicitly asked to name some solutions for promoting affordability at his institutions. There were many ways he could have responded. To his credit, Reilly acknowledged the importance of growing the state's paltry support for need-based aid, and he said that multiple solutions were needed--there's no one silver bullet. Fair enough. But then he took a bit of a flying leap, saying we also needed an informational campaign aimed at helping students and families understand that it's best to finish college in four years.

Huh? This one left me scratching my head.

More specifically, Reilly said that his administration needs to do a better job communicating with students and families about their educational "choices" and the financial implications of those choices. He suggested that students and families do not know that finishing in four years saves money, and that if they did, they'd make "better" decisions.

Based on what, exactly?

Was Reilly in possession of some new empirical evidence indicating that Wisconsin families don't perceive the returns to a college degree, or to one earned on time? Had he or his staff done homework showing that students were taking longer to finish because they lacked a "focus on 4"? I wish that were the case, but I doubt it. The only data Reilly has publicly offered for his argument is this: he compared the completion rates of in-state students to those of out-of-state students and noted that the latter group pays more tuition and finishes degrees faster. Given the numerous differences between the two groups, this is an especially weak argument, and one that a decent analysis of the data could easily tear apart.

On the other hand, we have a new national report from the Gates Foundation about the most common reason for college dropout: students' overwhelming need to work. There's also a rigorous study from the National Bureau of Economic Research showing that declining resources for higher education (i.e., supply-side factors) contribute more to college completion rates than do student-side factors. In an earlier paper, the same authors pointed to how the overcrowding of non-top-50 public institutions (a category into which nearly all of Reilly's institutions fit) leads to increased time-to-degree. And within Wisconsin, I am co-leading a team of researchers investigating precisely how and why affordability matters for college success. None of that work provides support for the idea that students don't know that finishing a degree faster will save them money. Instead, they have a hard time figuring out how to make that happen while juggling work, family, and school.

Of course, Reilly isn't alone in thinking that he needs to share this "money-saving advice" with students and families. The problem is that his assumption and his message aren't benign. In particular, both come across as out of touch and insensitive to the harsh realities of some students' lives. Just consider his own words on the subject, which include these quotes: "You've got to realize how much more you're going to be paying unless you focus." "...Part of the problem clearly is students choosing to say, 'I don't want to take an 8 a.m. course' or 'I want to take my courses between 10 (a.m.) and 3 (p.m.) on Tuesday and Thursday.'" "We need to be clearer about results of choices that students and families make about college... There are ways that students and families, by planning ahead a bit and making some focused intentional choices, can hold the cost of an education down."

The assumption he's making -- that the choices made by low-income families are not "intentional" or even informed -- rests on shaky ground. As I've argued elsewhere, the common sport of painting working-class students and families as irrational is off base. In fact, taken in the context of the significant constraints on their lives, the decisions many students make about extending their time to degree are quite rational. As a former UW undergraduate told me, "It's not an issue of choosing to work when classes are available, but often an issue of you don't get to choose your schedule, especially as the number of hours you work increases."

I have a feeling that when making his suggestion, Reilly was picturing those picky students who want to sleep late and be choosy about their courses -- a reputation commonly pinned on Madison undergrads, for example. The problem is, those aren't the students failing to complete degrees in four years. In essence, he's drawing on impressions of an elite group of students to shape solutions to the problems of the non-elite. Not gonna work.

In the absence of any empirical support, one has to wonder -- why does this idea have any traction at all? I think it's because it fits with American ideals -- those who work the hardest and "focus" the most will get ahead. It places the blame squarely on individuals rather than institutions, even while purporting to share responsibility. Constraints be damned; if you "know" what's good for you, you'll do it. Plus, communicating to students what's good for them is far less expensive than providing the financial support they need to make their actual choices pay off.

Crafting solutions to policy problems without doing sufficient homework first can incur trouble. For one, you risk insulting and alienating the very folks you wanted to help. That's certainly what happened here. As the former UW student told me, "Very few people are oblivious to the fact that adding an extra year to your education costs more money... I'm disappointed that the UW-System seems absolutely unaware of the challenges faced by its students, and its president believes that it's due to personal choice or ignorance that a student would not graduate in four years...The system misunderstands the plight of students who have similar circumstances to the ones I experienced."
Friday, August 28, 2009

Strengthening Student Support: A Sensible Proposal with What Results?

Cross-posted from Brainstorm

Anyone who's taken a hard look at the reasons why so many students drop out of community college realizes it's got to have at least something to do with their need for more frequent, higher-quality advising. After all, in many cases these are students who are juggling multiple responsibilities, only one of which is attending college, and they need to figure out a lot of details -- how to take the right courses to fit their particular program (especially if they hope to later transfer credits), how to get the best financial aid package, how to work out a daily schedule that can maximize their learning, etc. It's easy to imagine that community college students would actually stand to benefit more from good advising than their counterparts at many 4-year institutions.

Except high-quality advising isn't what they get. Counselor-student ratios average 1000:1. That's right -- one counselor for a population the size of a decent high school. In elementary and secondary schools the ratio is 479:1. There's a pay disparity as well -- in k-12, the Bureau of Labor Statistics reports that the median annual earnings for a counselor in 2006 were nearly $54,000. For counselors at community colleges they were $48,000 (and for those at other colleges, $42,000). Now, perhaps the salary differentials reflect different workloads, and assumptions that it's easier to counsel adults. But I tend to think this is off base -- these are outdated notions of who community college students are and what they need.

So what would happen if we reduced the counselor/student ratio at community colleges to a standard even better than the national average in k-12? And at the same time ramped up the intensity of the counseling? Theory would suggest we should see some meaningful results. Many studies, including my own, point toward a persistent relationship between parental education and college outcomes that's indicative of the importance of information-- and information (plus motivation) is what counseling provides. So, putting more counselors into a community college and increasing the quality of what they provide should work-- if students actually go and see them.

To test these hypotheses, MDRC (a terrific NYC-based evaluation firm) recently conducted a randomized program evaluation in two Ohio community colleges. In a nutshell, at college A students in the treatment group were assigned (at random) to receive services from a full-time counselor serving only 81 students, while at college B students in the treatment group had a counselor serving 157 students. In both cases, the control group students saw counselors serving more than 1,000 students each. In addition to serving far fewer students than is typical, these counselors were instructed to provide services that were "more intensive, comprehensive, and personalized." In practice, students in the treatment group did see their counselors more often. The "treatment" lasted two semesters.

The students in this study are Midwesterners, predominantly (75%) women, mostly white (54%), with an average age of 24, half living below the poverty line, and half working while in school. I think it's also worth pointing out that while all applied for financial aid, these were not folks overwhelmingly facing circumstances of deprivation -- 88% had access to a working car, and 64% had a working computer at home. And 98% were U.S. citizens.

The results were modest. The biggest effects occurred after one semester of program implementation -- students in the treatment group were 7 percentage points more likely to register for another semester (65% vs. 58%). But those differences quickly disappeared, and no notable differences emerged in the number of credits taken or other academic outcomes. Moreover, the researchers didn't find other kinds of effects you might expect -- such as changes in students' educational goals, feelings of connection to the college, or measured ability to cope with struggles.
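For what it's worth, this post doesn't report the study's group sizes, so here's a rough sketch of how one might check whether a 7-point retention gap (65% vs. 58%) clears statistical noise. The n = 500 per arm is purely a made-up illustration; with MDRC's real numbers the conclusion could differ.

```python
# Two-proportion z-test sketch for the retention gap described above.
# NOTE: n = 500 per group is an assumption for illustration only --
# the actual MDRC sample sizes are not given in this post.
from math import sqrt

def two_prop_z(p1, n1, p2, n2):
    """Two-proportion z statistic for the difference p1 - p2."""
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)          # pooled proportion
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))  # standard error
    return (p1 - p2) / se

z = two_prop_z(0.65, 500, 0.58, 500)
print(f"z = {z:.2f}")  # |z| > 1.96 would be significant at the 5% level
```

Under these hypothetical sample sizes, a 7-point gap would clear the conventional significance bar, which is consistent with MDRC flagging the first-semester effect while the later, smaller differences washed out.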

So what's going on? The folks at MDRC suggest 3 possibilities: (1) the program didn't last long enough to generate impacts, (2) the services weren't comprehensive enough, (3) advising may need to be linked to other supports--including more substantial financial aid--in order to generate effects. I think these are reasonable hypotheses, but I'd like to add some more to this list.

First and foremost, there's a selection problem. MDRC tested the effect of enhanced advising on a population of students already more likely to seek advice -- those who signed up for a study and for more services. Now, this is a common problem in research, and it doesn't compromise the internal validity of the results (i.e., I'm not saying that they mis-estimated the size of the effect). And MDRC did better than usual by using a list of qualified students (all of whom, by the way, had to have completed a FAFSA) and actively recruiting them into the study -- rather than simply selecting participants from folks who showed up at a sign-up table and agreed to enter a study. But in the end, they are testing the effects of advising on a group that was responsive to the study intake efforts of college staff. And we're not provided with any data on how that group differed from the group who weren't responsive to those efforts -- not even on the measures included on the FAFSA (to which it seems the researchers had access). Assuming participants differ from non-participants (and they almost always do), I'm betting the participants have characteristics that make them more likely to seek help -- and therefore are perhaps less likely to accrue the biggest benefits from enhanced advising. I wish we had survey measures to test this hypothesis -- for example, we could look at the expectations of participants at baseline and compare them to those of more typical students -- but the first survey wasn't administered until a full year after the treatment began. To sum up, this issue doesn't compromise the internal validity of the results, but it may help explain why such small effects were observed: there are often heterogeneous effects of programs, and the students for whom you might anticipate the bigger effects weren't in the study at all.

A second issue: we just don't know nearly enough about the counterfactual in this case -- specifically, what services students in the control group received. (We know a bit more about differences in what they were offered, e.g. from Table 3.3, but not in terms of what they received.) We are provided comparisons in services received by treatment status for only one measure -- services received 3+ times during the first year of the study (Appendix Table C.3) -- but not for the full range of services such as those shown in Appendix Table C.1. For example, we don't know whether students in the control and treatment groups had similar chances of contacting a counselor once or twice, only the incidence of 3+ contacts. If the bar was set rather high, it may have been tougher to clear (i.e., the treatment would've needed a bigger impact to register as significant).

Having raised those issues, I want to note that these are fairly common problems in evaluation research (not knowing much about either study non-participants or about services received by the control group), and they don't affect MDRC's interpretations of findings. But these problems may help us understand a little bit more about why more substantial effects weren't observed.

Before wrapping up, I want to give MDRC credit for paying attention to more than simply academic outcomes in this study-- they tested for social and health effects as well, including effects on stress (but didn't find any). As I've written here before, we need to bring the study of student health and stress into educational research in a more systematic way, and I'm very glad to see MDRC doing that.

So, in the end, what have we learned? I have no doubt that the costs of changing these advising ratios are substantial, and the impacts in this case were clearly small. Right now, that doesn't lend much credence to increasing spending on student services. But this doesn't mean that more targeted advising couldn't be more effective. Perhaps it can really help men of color (who are largely absent from this study). Clearly (drumroll/eye-rolling please), more research is needed.
Thursday, June 4, 2009

Sorting, Selection, and Success

Cross-posted from Brainstorm, over at the Chronicle of Higher Education.

The latest report from the American Enterprise Institute, Diplomas and Dropouts, hits one nail on the head: plenty of students starting college do not finish a degree. Access does not equate with success, and partly as a result, U.S. higher education is perpetuating a lot of inequality.

What do we do about this? The authors identify a key fact: “analysis of graduation rates reveals wide variance among institutions that have similar admissions standards and admit students with similar track records and test scores.” They interpret this to mean that “while student motivation, intent, and ability matter greatly when it comes to college completion, our analysis suggests that the practices of higher education institutions matter, too.”

This is a pretty common argument made by many policy institutes and advocacy organizations, including but not limited to the Education Trust and the Education Sector. I understand their goal—to make sure that colleges and universities can’t hide behind the excuse of “student deficits” in explaining low graduation rates, and instead focus on the things they can do something about. In some ways that mirrors efforts over the last fifty years to focus on “school effects” in k-12 education—witness the continuing discussion of class size and teacher quality despite evidence that overall variation in student achievement is much more attributable to within-school differences in student characteristics than to between-school differences in school characteristics. Like many others, I read those findings to say that if we really want to make progress in educational outcomes, we must address significant social problems (e.g. poverty, segregation) as well as educational practices. Don’t misinterpret me: it’s not that I think teachers don’t matter. It’s simply a matter of degree—where and how can we make the biggest difference for kids, and under what circumstances?
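For readers unfamiliar with the within-/between-school decomposition mentioned above, here's a toy sketch using entirely made-up test scores for three hypothetical schools; the point is only that total variance splits exactly into a between-school piece and a within-school piece.

```python
# Toy illustration of decomposing variance in student scores into
# between-school and within-school components (law of total variance).
# The scores below are invented for illustration; they are not real data.
from statistics import mean, pvariance

schools = {
    "A": [70, 75, 80, 85, 90],
    "B": [72, 77, 82, 87, 92],
    "C": [68, 73, 78, 83, 88],
}

all_scores = [s for scores in schools.values() for s in scores]
grand_mean = mean(all_scores)
n = len(all_scores)

# Between-school: variance of school means, weighted by school size.
between = sum(len(v) * (mean(v) - grand_mean) ** 2 for v in schools.values()) / n
# Within-school: size-weighted average of each school's own variance.
within = sum(len(v) * pvariance(v) for v in schools.values()) / n

total = pvariance(all_scores)
print(f"between = {between:.2f}, within = {within:.2f}, total = {total:.2f}")
```

In this toy example the two components sum to the total, and nearly all of the variation sits within schools rather than between them -- a miniature version of the k-12 pattern the paragraph above describes.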

Unlike k-12, access in higher education isn’t universal and competitive admissions processes and pricing structures result in lots of sorting of kids into colleges and universities. As a result, they differ tremendously in the students they serve. In turn, as the AEI report admits, this necessarily shapes their outcomes.

The problem is, all this sorting (selection bias) has to be properly accounted for if you want to isolate the contributions that colleges make to graduation rates. (I’ll qualify that briefly to add that the role college enrollment management —tuition setting, financial aid, and admissions— plays in the sorting process is quite important, and is under colleges’ control.) But if you want to isolate institutional practices that ought to be adopted, you first have to get your statistical models right.

Unfortunately, I don’t think the AEI authors have done that. To be sure, they try to be cautious, pointing out colleges that look “similar” but have extremely different graduation rates (rather than modestly different ones). But how they reached “similarity” leaves a lot to be desired. It seems to rest entirely on level of selectivity and geographic region. Their methods don’t begin to approach the gold-standard tools needed to figure out what works (say, a good quasi-experimental design). Important student-level characteristics (socioeconomic background, high school preparation, need for remediation, etc.) aren’t taken into account. Nor are many key school-level characteristics (e.g. resource levels and allocations). In sum, we are left with no empirical evidence that the numerous other plausible explanations for the findings have even been explored.

I’m not surprised by this, but I have to admit that I’m a bit bummed. Yes, I “get” that AEI and places like it aren’t research universities. Folks there don’t want to spend long periods of time conducting highly involved quantitative research before starting to talk policy and practice. But I don’t see how this approach is moving the ball forward—sure, it gets people’s attention, but it’s not compelling to the educated reader—the one who votes and takes action to change the system. Moreover, it doesn’t get us any closer to the right answers, or provide any confidence that if we follow the recommendations we can expect real change.

There have been solid academic studies of the causes of variation in college graduation rates (here’s one example). They struggle with how hard it is to deal with the many differences among students and colleges that are not recorded—and thus not detectable—in national datasets. If we want better answers, we need to start by investing in better data and better studies. In the meantime, I think skipping the step of sifting and winnowing for the most accurate answers is inadvisable. Though, sadly, unsurprising…
Friday, January 23, 2009

Ensuring Real Education

It's been months since Charles Murray's PR folks sent me Real Education, but I'm finally ready to weigh in on his new treatise. Far too many naive bloggers think Murray has hit the nail on the head.

So here's my take: Murray is an opponent of expanding formal education, and especially a college-for-all culture that broadly promotes college aspirations. He argues that academic degrees reflect students’ general cognitive and social skills rather than what they learned in college or how well they will perform on the job. But even though in some sense credentials do act as signals and of course the skills of college graduates are not entirely created by colleges, there is still good evidence that what students learn in school has an invaluable, positive impact on their long-term life outcomes.

Moreover, Murray offers no practical alternatives. He argues that employers should develop testing instruments to better assess skills for specific jobs when, in fact, these assessments already exist in many occupations and organizations. Where such assessment tools are being used, they are at best very weak predictors of worker performance. The bottom line is that formal education both creates important skills and provides signals for employers that are quite valuable.

You won't find me arguing for an increase in meaningless credentialing, nor advocating that college become compulsory for everyone. My take is that by setting expectations for sub-baccalaureate outcomes and equipping community colleges with the resources needed to achieve those outcomes, we can enable a revitalized focus on student learning. I mean both the general and specialized learning needed to perform specific jobs, and the kind of skills that all citizens need and that colleges are best positioned to provide.

So, in conclusion, no -- Murray's Losing Ground didn't change my opinion of welfare, and Real Education sure isn't changing my opinion about schooling. But I do thank the publishers, since Real Education has evolved into a tasty chew treat for my puppy.
Wednesday, October 1, 2008

Palin is a Swirling Student!

What a day! Today, the day before the VP Candidate faceoff (I can hardly wait!), the AP busts out with a fabulous story that Sarah Palin switched colleges 6 times in 6 years!

This makes Palin what I referred to in my U. Penn. doctoral dissertation as a "swirling student." Changing colleges is relatively common among today's college students, but as my research shows, it's especially common for students with poor college grades. Palin started at U. Hawaii-Hilo, a 4-year school. She moved to Hawaii Pacific (another 4-year), then to North Idaho College (2-year), then to U. Idaho (4-year), then to Matanuska-Susitna College (4-year), and then back to U. Idaho.

What to make of this? Well, as the AP story indicates, her reasons are very unclear. Normally, I would caution anyone against assuming such moves are indicative of poor decisions (but I won't do that here, since this woman is clearly not a maker of good decisions!). But one thing is very, very clear -- it is INCREDIBLY hard to get a coherent, rich college education when you're constantly changing schools. Congrats to Palin for eventually attaining a degree, but it's far from certain that she learned anything from the college experience.

"Aha", you say-- well, that explains her incredible lack of intellect or awareness of the world around her, interest in reading newspapers and so on. Maybe those things are associated with a college education, not a college degree.
Tuesday, August 12, 2008

Moving Beyond Access

While the proportion of high school graduates going on to college has risen dramatically, the percent of entering college students finishing a bachelor’s degree has not. In 1972, just over half (53%) of all high school graduates went on to college and 39% of those students finished a BA within 8.5 years of leaving high school. Twenty years later, 81% of high school graduates attended college, but only 42% completed a bachelor’s degree. As a result, the proportion of the population attaining college access—and therefore “some college”—has increased much faster than the proportion of the population succeeding in earning college degrees.
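To make the arithmetic concrete, here's a quick back-of-the-envelope sketch using only the figures cited above (the multiplication is mine, not from any dataset beyond those two cohort snapshots):

```python
# Back-of-the-envelope: shares of high school graduates ending up with a BA
# versus "some college, no degree," using the rates quoted above.

def shares(attend_rate, complete_rate):
    """Given the fraction of HS grads who enter college and the fraction of
    entrants who finish a BA, return (BA share, some-college-no-degree share)."""
    ba = attend_rate * complete_rate
    some_college = attend_rate * (1 - complete_rate)
    return ba, some_college

ba_1972, sc_1972 = shares(0.53, 0.39)   # 1972 cohort: 53% attend, 39% finish
ba_1992, sc_1992 = shares(0.81, 0.42)   # twenty years later: 81% attend, 42% finish

print(f"1972: {ba_1972:.1%} with BA, {sc_1972:.1%} with some college but no degree")
print(f"1992: {ba_1992:.1%} with BA, {sc_1992:.1%} with some college but no degree")
```

The sketch shows the point numerically: college access jumped 28 percentage points across the two cohorts, while the share of all high school graduates holding a BA rose only about 13 points, so "some college, no degree" became a much bigger category.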

What can we do about this? For some thoughts, see my new paper, A Federal Agenda for Promoting Student Success and Degree Completion, published today by the DC-based Center for American Progress.

