Biden's recent (popular) decision to cancel some student debt seemed to dance around a bigger question: what is college for?
Obviously, one answer is to provide some sort of vetting and training for professions and professional schools. You want to be a doctor? Pass organic chemistry and do well on your MCATs. You'll need a good undergraduate program for that.
"Job training" however would seem to be a low priority. You hear all the time that people responsible for hiring people directly out of college want generalists and people who did well in school, almost regardless of their major. Sure, a STEM major...blah blah blah. But investment banks hire English majors who they think can...think.
There are excellent vocational colleges and community colleges, where you go to learn a craft or job skill, and we should support them. We should also support the X-ray technicians and IT support personnel with decent societal benefits like universal single-payer health insurance, but whatever. Different post.
All of this is what I thought should be the proper role of college in society, but I was wrong.
It was to pay a handful of football coaches an obscene amount of money.