When I say computer science, I mean the practical kind - where people are writing useful software that will be used by someone else.

If universities really cared about educating their students, they would take stock of the current programming world at least once a year.

They would look at the currently available development platforms and identify the ones that were well-established and best introduced the tools for solving the real-world problems that application developers were actually working on.

Then they would go out and hire professionals with real experience in these languages (!) to train the students in using these tools without sucking.

This would comprise most of the third- and fourth-year studies for computer scientists (or whatever bachelor's degree would come to represent "practical programming").

But wait, what about the first two years? Oh, well, those would be spent teaching developers to work on projects in a way that makes them (the software projects, that is) not suck.

Everyone initially sucks at writing software that other coders (or even their future selves) will have to maintain. It generally takes at least a year or two to work the worst of these symptoms out of your system.

If universities focused on the practical, they would work on beating that stupidity out of every student who wanted to write software.

If universities were practical, while simultaneously forward-thinking, they would smack the stupidity out of students while also training them on systems that real companies were using to solve current problems.

Based on my limited education, I note that universities spend most of their efforts on the completely irrelevant (general electives, ho!) and on mathematical work that is only useful to coders who are already well-educated.

But don't take my opinion too seriously - I am a college dropout, after all.