by Anselmo Watkins
The widely anticipated release of the Data-Based Assessment of Research-Doctorate Programs by the National Research Council on September 28 has gotten the dander up of higher education administrators across the country, with one going so far as to threaten to “kick the ass” of committee members should he ever meet any of them.
According to the National Academies’ website, the survey is intended to help universities improve the quality of their programs through “benchmarking” while providing prospective students with information on the specifics of doctoral programs. The data was collected during the week of July 24, 2006, which is what has administrators across the country up in arms.
“The survey data is over four years old. It is pretty much useless at this point except for a 20/20 hindsight review,” said Bryce College executive vice chancellor Jack Richards. “No school in the country is exactly the same as it was four years ago. We have invested so much here to build great programs, and now they are turning back the clock. It makes me so angry that I want to pummel the people responsible.”
Richards cites Bryce College’s Autistic Outreach PhD program that started in 2005. The two-year program was just concluding its first year when the survey was taken and had not yet graduated any students. Since then the program has been praised as one of the strongest of its kind in the country.
“So, according to this survey, we have no graduates, no accolades. Nothing. The last four years of work have been totally erased,” said program chair Jessica Leveller. “How, exactly, does this provide a service for prospective students?”
“I appreciate the fact that they want to make things better, but this information is essentially useless unless they have some sort of time machine,” she added.
“We have been having meetings about this for weeks,” said Moffice University VP for external affairs Greg Easter. “Everyone is on edge because of the potential damage this could do. We’ve built a Q&A website, and we’re preparing for questions from the media and our students. I feel like we’re battening down for a hurricane.”
“I guess it might help some of the schools that have been hammered by the budget crisis, but for those that improved? Bupkis,” he added.
Perhaps the most noteworthy aspect of the survey is the fact that it has been burdened by delays. Conducted by a 15-person committee supplemented by an 11-person panel that assisted in data collection, the survey was originally scheduled, according to its website, to be released in September 2008.
“I wonder if they’ll do a study on the importance of getting time-sensitive research out in a timely manner,” quipped Easter.
Previous editions of the survey were conducted in 1983 and 1995. There is no available record of the time it took for those survey results to be compiled and released.
Speaking on condition of anonymity, a source familiar with the NRC and its study explained the thinking behind releasing the outdated information.
“We know that this is going to do more harm than good, but we have to produce something. The folks who gave us those grants want to see some bang for their bucks,” the source said. “Truth be told, I think there is a bit of ego at work. Our NRC is just the third most important agency in the country with these initials. We really need to do something to get noticed and at least jump up to second behind the Nuclear Regulatory Commission. Those guys are bigtime.”