Everyone needs a college degree, but not everyone can afford one.
The economic shift toward skilled labor, coupled with an increasing demand from employers across all industries for credentialed workers, means that college degrees are a must-have. While universities still market their degrees as a springboard to success, the reality is that fewer and fewer career tracks remain open without a degree, and those that do pay less.
In any case, the baseline demand for degrees has ballooned in recent years, along with tuition rates, pricing out students from lower-income families. In fact, the cost of college effectively changes the very definition of ‘lower-income.’
Tuition for Life
The core of the trouble isn’t just the cost; it is the stagnation and decline of the long-term return on investment. Lowering cost barriers to renowned and highly competitive schools does not really make them more accessible; the overwhelming demand for a brand-name degree will still see hordes of rejection letters flying out to applicants emboldened by a tuition guarantee.
The high cost of a college education is not a problem unique to lower-income families, even if it is more pronounced. For lower-income Americans, college debt has about the same effect as payday loans: dooming them to a cycle of desperation, default, and deepening insolvency.
Just as grocery taxes burden poor families more egregiously simply because they spend proportionately more of their income on necessities, the now mandatory college degree impacts such families more dramatically as well — without significantly elevating earning potential or basic employment prospects.
However, treating tuition as a problem felt only by poorer Americans misses the economic fallacy surrounding the whole higher education industry. Consequently, would-be solutions exacerbate the problem as much as they help.
Universities are not and should not be responsible for ensuring they are affordable; they should be accountable for demonstrating consistent, measurable value at any price.
Keeping Politics Out of Money
The university cost conflict has provided a great platform for partisan grandstanding, as arguments rage over the role of state and federal government in paying for college, and over what individuals and families can reasonably be expected to contribute. The trouble with the plans is that they all accept the rising cost of college, along with the value of the associated education, without question. Whether it is President Obama’s goal of guaranteeing two free years of community college based on academic merit, Stanford University’s new promise of free tuition for families making less than $125,000 per year (the median household income for its students), or even Starbucks’ new offer to provide scholarships to its employees, each plan asks only who should pay.
Shifting the bill from families to universities to taxpayers via government subsidies, or even to private-sector corporations, does nothing to actually control costs. The politics of competing plans miss the point: the basic economics of the university system don’t make sense.
As in the rest of the economy (with the exception of health care), cost controls tend to be more effective (and more natural) when prices are tied to real, measurable value. If degrees were priced relative to their actual, long-term value, it would follow that students could pay for them in a more predictable, sane manner. Presently, that is not the case, because we have been conditioned to prize degrees without measuring their worth.
Buying Status or Acquiring Knowledge?
Degree valuation is largely artificial, because it is pegged first and foremost to the selectivity of university admissions, and subsequently to brand names and the research programs they represent, even though neither has any direct correlation with individual learning outcomes, education quality, or practical skill development. If students who graduate from highly selective, research-driven universities earn more, it does not necessarily follow that they are contributing more in terms of productivity or intellectual capital; it may be merely that the degrees they purchased opened more gilded doors.
Even as observers hoped that disruptive technology might democratize education through Massive Open Online Courses (MOOCs) and online degree programs, the reality has yet to catch up. Some institutions have extended their brand names to online portals (Rutgers has offered various online degrees since 1999) without any associated discount; MIT and Harvard are all but driving the MOOC bandwagon, but such free courses do not count toward a degree program.
Holding schools more accountable for measurable, practical outcomes would do more for cost (and especially value) management than any combination of public and private financing initiatives. Additionally, it would prove once and for all whether online learning can compete head-to-head with traditional classroom models.
Ready for the Real World
There was a time when apprenticeships and trade schools were the norm, and the university setting was reserved for intellectuals and researchers. Changing that model is not a bad thing, but simply requiring a degree—any degree—as a standard prerequisite to employment is not practical, efficient, or affordable.
A modern education does not remain so for long. In as little as a few years, depending on the industry, the ‘facts’ one learned in college could be outdated to the point of uselessness by virtue of changing technology and scientific advancement. In terms of longevity, then, college degrees take longer to pay off than they do to ‘expire.’
Further differentiation among vocational training (of the sort often found in community colleges), advanced academics (most bachelor’s degrees), and research (for many students and tenured professors, a career in itself rather than a means to an end) would also allow universities to compete better by matching missions, instead of trying to be everything to everyone. At least it would give students a clearer idea of what they are buying: what degree, instead of a degree.
Currently, there is no reliable way to determine what a given graduate took away from his or her time in college, except, of course, a document proving commencement. How can we confidently claim one MBA program is better than another without a set of standards for measuring student performance? How can we associate academic quality with a research university whose classes are overwhelmingly taught by graduate students?
Without standards, we can’t — we just cling to assumptions.
The modern university model hasn’t changed much in structure from the earliest examples of mixing research with higher education. What society, and the modern economy, expects from these institutions has changed fundamentally. Bridging that divide is critical to any sustainable cost-management program.
Bringing the university system in line with reality doesn’t mean compromising on quality; it just requires proving that quality still exists in economic terms.