A college degree, I mean.
I'm afraid it's become an unnecessary expense.

Once upon a time, a college degree made you an educated person. You were worth hiring. You learned critical skills. Nowadays, not so much.
And a law school degree is descending into irrelevance as well.
It's an entertaining discussion. From the comments:
What would happen if they lower the salary of a women's studies professor? Would she take a job in private industry bitching at men?
Absolutely. And she'd be well paid to do it.