

Placement Rates, Other Data Colleges Provide Consumers Are Often Alternative Facts

March 22, 2017
By Jon Marcus


When prospective students want to know how much money they’ll make if they major in a particular field at Montgomery College, the school goes to great lengths to give them an answer.

The Maryland community college used a private company to painstakingly cross-check 22,000 graduates’ names, addresses, phone numbers, email addresses and other identifying information against professional licensing records from state agencies, online career sites such as LinkedIn and CareerBuilder, and other sources to get as close as it could to a true idea of future earnings.

That’s about as accurate a picture as any college can give of whether its alumni get jobs when they graduate — something surveys show incoming freshmen consider the single most important reason to get a degree.

This story also appeared in The Huffington Post

Most institutions don’t make anywhere near that great an effort. The job-placement rates they give prospective students are instead based on unscientific surveys of alumni. Who are the likeliest to answer? Those who have good jobs, experts in this field say, which can dramatically improve the results. But universities seldom disclose that’s where those impressive-looking numbers come from.

At a time of alternative facts, much of the information students get when choosing colleges — which they’ll be doing soon, as acceptance letters show up in their mailboxes — is sometimes inaccurate, almost never independently corroborated and often intended to put the best face on the universities’ performance with carefully chosen wording.

This includes not only the important questions of how many graduates get jobs in their fields and how much they’ll make with a degree in a given major, but also whether their credits will transfer, how students do on graduate school admission tests and even how much an education will ultimately cost.

“It’s a scandal. There’s no other word for it,” said Mark Schneider, vice president of the American Institutes for Research and former commissioner of the National Center for Education Statistics, who leads an effort called College Measures to report postgraduate earnings of colleges and universities using state government employment data without the involvement of the schools themselves.

“At one time one could argue it was hard to get these numbers and you did the alumni surveys and if they worked in your favor, you published them,” Schneider said. “There are so many other ways to get these numbers now.”

Rather than having more of an incentive to improve the accuracy of the information they provide consumers, however, colleges and universities may soon have less of one.

The Trump administration and Republican congressional leaders, arguing that regulation puts a burden on universities and colleges that contributes to increasing costs for students, say they want to roll back controversial Obama-era attempts at greater transparency. That job would be assumed by accrediting agencies, which critics complain are made up of university and college insiders who evaluate each other’s programs.

One of these, the Accrediting Council for Independent Colleges and Schools, or ACICS, was stripped in December by the U.S. Department of Education of its authority after several for-profit colleges it oversaw were found to have fabricated their graduation and job-placement rates and produced misleading information about their prices, among other things. Several, including Corinthian Colleges, have closed. ACICS has appealed.

It’s not just for-profit colleges and universities that report misleading information, however, consumer advocates contend. After five years of enrollment declines and a falloff in the number of high school graduates, college and university recruiting offices are under huge and growing pressure to fill seats.

Some have reported inaccurate figures outright. A flurry of well-regarded universities were found in 2011 and 2012 to have provided faulty information about such things as students’ high school class standing and entrance test scores to U.S. News and World Report and other independent rankings. All have since announced changes to their reporting policies.

“If you create very powerful, very public incentives for gaming numbers, people will do it,” said Wendy Nelson Espeland, a sociologist at Northwestern University and coauthor of Engines of Anxiety: Academic Rankings, Reputation, and Accountability, about how the race for higher rankings affects colleges’ behavior.

“If you’re reporting honest numbers and other people aren’t, you’re going to plummet in the rankings,” Espeland said. “There’s all kinds of rhetorical games that you can play.” And while the instances of this that have been widely exposed date back to the early 2010s, Espeland said she is certain they continue. “The stakes for playing those games are higher than ever,” she said. “There’s all kinds of stories about how this stuff gets massaged.”

Several law schools have been sued by their own graduates for allegedly fabricating job-placement rates, for instance. In the most recent case, one against the Thomas Jefferson School of Law that was decided last year, a former student alleged that the 80 percent of alumni the school reported had gotten work included a graduate who was cleaning pools and another who was waiting tables.

She lost, and judges in similar cases have agreed with the law schools and ruled that students enroll in higher education at their own risk. No institution can guarantee employment, they’ve said.

That’s a vexing problem for the increasing number of students applying to college whose own parents haven’t already navigated this complex process and who may be easily misled, consumer advocates say.

“The colleges tell them, ‘Ninety-five percent of our graduates are employed with full-time jobs,’” said Bob Giannino, CEO of uAspire, which works with low-income and first-generation college applicants. “It doesn’t tell them how that data was ascertained. It could be that they did a survey and that’s what came back. It doesn’t say whether these are meaningful jobs worth the incredible amount of debt you’re about to take out.”

Some institutions seem to choose statistics that put the best spin on things.

Students who earned certificates from the public Washburn Institute of Technology in Topeka, Kansas, in 2014 in computer systems networking and telecommunications — a specific course that a university spokesman said is no longer offered — earned a median salary of $17,331, according to the U.S. Department of Education, which tracked this using graduates’ Social Security records.

On its website, under “job and salary outlook” for the identical-sounding computer-networking certificate program that’s offered now, however, that university uses figures from the Kansas Department of Labor. They show that the average pay of a network and computer systems administrator there is $68,618.

Both are true, if at odds with one another. In addition to the fact that the earlier program was eliminated and may not have been identical to the existing one, the spokesman said the university doesn’t have the resources to track down graduates and find out what they’re making, as the U.S. Education Department did. The state Labor Department figures, by comparison, he said, are readily available, even if they don’t necessarily show what previous graduates of that specific Washburn course actually earn.

A crusade to compel universities and colleges to have this kind of information independently audited went nowhere.

“Self-regulation in any industry doesn’t work,” said the man behind that campaign, Myron Roomkin, former dean of the Weatherhead School of Management at Case Western Reserve University. “There’s going to be a certain percentage of people and organizations that are going to try to go as close to the edge of the envelope as they can, and even over it.”

Roomkin got interested in this issue when a competing business school reported information about the quality of its applicants he knew was exaggerated — because the same students had applied to his school.

But when he suggested to the association of people who compile such figures at universities and colleges that this would be a good topic for their annual convention, Roomkin said, “There was no interest whatsoever. It’s like doctors; they don’t want anyone holding a program on malpractice. The lawyers don’t want people looking over their shoulders, the cops don’t want anybody looking over their shoulders.”

Even in cases where information was once audited, it isn’t anymore. A spokeswoman for the Graduate Management Admissions Council said it no longer randomly audits the average scores colleges report their students achieve on its Graduate Management Admission Test — an important measure of how well undergraduates are prepared for graduate school — and now takes the information directly from the schools’ websites, without verifying it independently.

The spokeswoman said the change was because the commercial data-collection tool the organization previously used was replaced. (In response to persistent complaints about law school data, the American Bar Association has begun to audit the placement rates reported by the law schools it accredits; a spokesman there said the first round of audits is still under way.)

Giannino said he sees confusing information about prices, too. For one thing, the “sticker price” colleges and universities advertise is not necessarily the one most students pay. For another, while colleges and universities are required to put a “net-price calculator” on their websites, the accuracy of this tool is not independently corroborated, and Giannino said that many net-price calculators are out of date.

Other research has shown that colleges and universities in the same counties estimate dramatically different off-campus living costs, which they’re also required to disclose — and that some reduce these costs without explanation, magically offsetting increases in tuition and fees.

Offers of financial aid can also be misleading.

“They’ll say, ‘You have received the President’s Tuition Scholarship,’ and you look at it and think, ‘Oh my god, I got all my tuition paid for,’” Giannino said. “But your other costs may not be covered by that scholarship. So those kids don’t even apply anywhere else, and three or four months later they learn that that tuition scholarship covers only $10,000 of the $40,000 cost.”

Nor do universities and colleges usually disclose that the financial aid they offer incoming freshmen is likely to decline when those freshmen become sophomores, in what critics call a “bait-and-switch.” Federal data broken down by college and university show that a lower percentage of undergraduates typically receive institutional financial aid than freshmen alone do; the amount they get is also usually lower.

Some colleges and universities that promise their credits will transfer don’t explain that they often mean general-education credits, and not credits toward a major. Students who do change schools, as a result, lose an average of 13 credits, or more than a semester, that they’ve already finished and paid for, prolonging their educations and further increasing the cost.

A few public colleges and universities, some with prodding from state legislatures and activists, have begun to try to fix these problems.

“Students don’t often go into the college-going experience worried that the college is going to try to put one over on them. They’re really pretty trusting,” said Mamie Voight, vice president of policy research at the Institute for Higher Education Policy. “Colleges should be held responsible for meeting their end of the bargain.”

The state of Washington has produced a transfer student bill of rights requiring, among other things, that students at its public universities get clear and accurate information, including when and whether their credits will transfer.

“Students were coming to us and saying, ‘Help us,’” said Randy Spaulding, director of academic affairs and policy for the Washington Student Achievement Council, a state agency. “They wanted some assurance that the schools were looking out for them.”

Then there are the Montgomery College efforts to produce reliable job-placement rates.

“If you ask any community college, they’ll tell you the return response rates are abysmal” for those alumni surveys on which most base the placement rate and postgraduate earnings they report, said Kevin Long, Montgomery’s senior planning and policy analyst. Yet most use them anyway, and few explain that to their students.

What Montgomery does, by comparison, Long said, “is not going to be completely accurate every time, but it’s at least closer to reality for students — a much more honest depiction of what a student can expect.”

Frustrated by the fog of information, outside organizations have taken matters into their own hands. College Measures uses state unemployment insurance data to show the real-world job placements and earnings of graduates of universities and colleges, by major, in Colorado, Tennessee, Texas, Virginia and other states. The Obama administration’s College Scorecard also tracks the earnings of alumni of specific colleges and universities, though not by major and only for students who received federal financial aid.

Many prospective students, including adults considering going back to college, don’t know these tools exist, or don’t use them, however, according to a survey by Public Agenda.

Most, said Giannino, rely on what they get from colleges. “And there’s no accountability mechanism to make sure the data you get is not inaccurate or not misleading. So you have students and families making decisions with really, really bad information.”
