This may be a bit overdramatic, but maybe it is time to look at this paradigm.
It used to be that you needed a degree to work in certain fields; now you need one for almost everything. The degree has replaced the high school diploma as the minimum education standard required for most living-wage work.
But it has very little to do with actual training for one's chosen field... More and more, it matters little what your B.A. was in, only that you have one.
In this way, it really is being treated as diplomas used to be.
When hiring, we used to say that this was because we wanted to see that the potential employee could "finish what they started," and I still agree that there is some merit to that concept. Plus, the development of critical-thinking skills, which increasingly happens in college rather than high school, is another benefit.
But considering the cost of college these days, there is a darker, and huge, benefit for companies that only hire college grads... the immense debt most recent grads carry into the workforce!
There is so much fear around paying off these loans that people will put up with a lot more than they used to... low wages and salaries, intolerable and invasive corporate policies, and the knowledge that most workers are immediately replaceable, thanks to all the out-of-work college grads out there just waiting for the opportunity to "do something with their degrees" or, even more insidious, grads who are simply desperate to start paying off their loans.
Of course, what could be better for a company than workers who have gone into virtually lifelong debt for the privilege of working there, mostly doing work that could be completed by any average high school grad?
Fearful employees willing to do anything and put up with anything just to keep ahead of their government debt…
What’s better than this? If you don't look too closely, these companies even come off looking like the good guys, since they are the ones giving us a chance to stay one step ahead of the big bad menace of government debt.