Friday, September 5, 2014

The Failure of Colleges and Computer Programming...

I've taught at several colleges, released a book on programming, assisted other teachers and students, attended classes, and interviewed a lot of candidates for senior and mid-level IT roles, and I've seen great contrasts between those focused on college education and those without it.  This may be a rant, but its intention is to help potential students pick classes and learn, and to help colleges improve their curricula.

Naturally, I'm open to debate about any of this.  I want to help out, and I'm not going to improve anything by being closed-minded.


1) Math is not that important in programming, so why do so many colleges have such hefty pre-reqs?


"Any high school freshman has enough math skills to become a great computer programmer."

I've known brilliant people, with IQs over 160, who didn't want to go to college because they weren't that good at math.  However, they were excellent at taking large, complicated business plans and wrapping software and sites around them, which is exactly what they were and would be doing in the real world anyway.  I'm not arguing against basic algebra and even some minor calculus, but in general, any high school freshman in the US has enough math skills to become a great computer programmer.

So why do colleges do this?  I believe it comes from history.  When computers first came into use, they were typically one of two things: 1) a specialist piece of hardware focused on a particular task, or 2) a GIANT programmable calculator.  While the study of electricity and circuits already existed, the ability to program these machines would have fallen into the realm of the mathematician.  And since then, colleges have never been able to let go of a strong math background as part of the degree.

I'd love to hear some feedback on why math is so important here.  Because after 80,000 hours of tech work, I don't see it as the norm.

*Update: From conversations with many readers: science in general is very mathematical, most people sign up for Computer Science courses specifically, and in that context a math background is important.  Many colleges do have more practice-oriented degrees available.


What should happen instead?  I believe most colleges should drop the heavy math requirements when it comes to computers, unless the major explicitly focuses on math.  The most important thing to most businesses is the ability to apply business logic in applications.  Even in game development, it is common for a programmer to use existing math functions, or to stub out the code and let a more math-focused individual figure out the most effective way to solve a math problem.
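As a minimal Python sketch of what "use existing math functions" looks like in practice (the function names here are my own illustration, not from any real project), the programmer's job is wiring up the logic; the math itself comes from the standard library or a looked-up formula:

```python
import math

def distance(x1, y1, x2, y2):
    """Distance between two points -- delegated entirely to math.hypot."""
    return math.hypot(x2 - x1, y2 - y1)

def projectile_range(velocity, angle_deg, gravity=9.81):
    """Range of a projectile on flat ground.

    The formula (v^2 * sin(2a) / g) is looked up, not derived; a more
    math-focused colleague could later swap in a better model without
    touching the calling code.
    """
    angle = math.radians(angle_deg)
    return (velocity ** 2) * math.sin(2 * angle) / gravity

print(distance(0, 0, 3, 4))        # 5.0
print(projectile_range(10, 45))    # roughly 10.19 meters
```

The stub-and-delegate pattern is the point: the game programmer only needs to know which function to call, not how to derive it.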

TO STUDENTS:  Unless you know your goals will require a lot of math, like 3D physics engines, I recommend finding a school that focuses more on patterns than on math.



2) Why are outdated patterns running rampant in professional college courses?


"a good instructor will typically have real world projects and commitments"

Spaghetti code, Hungarian notation, and other terrible naming conventions are still demonstrated in modern courses, and those are just a couple of obvious examples.  The real problem stems from instructors who are unaware of, or unwilling to find out about, the changes happening in the industry.

Items on a resume are typically obsolete after 4 years in the tech industry, yet many college instructors haven't worked outside the classroom for far longer than that.  I mean no offense to current instructors.  I've worked with some and helped them improve their classes, and the ones I've worked with have all been very receptive to new coding technology and patterns.  But without project deadlines, and without having to work with new technologies every semester, it is harder for many instructors to improve, or even to see what needs improvement.

I would say that a good instructor will typically have real-world projects and commitments relating to the technology they teach.  They will probably have blogs themselves, and will be able to recommend others they follow for new technologies and patterns.  The hardest part is that many instructors don't know when their material is becoming outdated, and most colleges rely primarily on the instructors to decide what material is right to teach.  In my opinion, a good instructor will probably split their time between paid contracts or projects for outside clients and teaching their classes, often getting students involved in the actual work.

TO STUDENTS:  This is a tough one, but find a mentor working with technology as modern as you can find; heck, find a couple.  Ask them to review the required materials, and occasionally sample code and lessons from the instructors, and see what they think.  Also make sure your professors are actually still active in the modern world of programming.  If they haven't released TEAM projects in a while, don't trust that their training is modern.




3) Why is Flash still being taught?

"I was a strong VB6 supporter, but I would never make it a requirement."

Flash does not run on iPhones, iPads, or most modern tablets and phones, and in general it is no longer accepted as a relevant consumer technology.  Despite that, I still see plenty of colleges teaching it this year, some even presenting it as a highlight of advanced website development.  While this relates to outdated patterns, the point is that colleges often present classes just because that is what they taught last year, and many incoming students won't know any better.  And these outdated classes are often required in order to pass.

I for one was a strong VB6 supporter, and I did not like the fact that .NET required 15 times the memory just to run.  VB6 is still used by a huge number of companies, despite being more than a decade old.  It is an effective language, and I know there are plenty of jobs out there for people with good VB6 skills.  But I would never make it a required class for a programming degree.  I could see offering it, especially if a considerable number of local businesses were using it, but that's it.

TO STUDENTS:  Take the school's required course topics to a job board like dice.com and see if you can find many jobs in the area where you want to live.  Is the technology used much at all?  If you're not finding many of the required topics, even in the school's home town, that's probably a bad sign.

*Update: After speaking with many readers, it was pointed out that many colleges are trying to teach patterns and practices that can be applied to nearly any language, making the student more universally knowledgeable.  Even so, the major languages of today have been around for a while, and I still believe that a person can AND SHOULD be taught those principles via industry standards.


4) College is often harder than it should be.


"It was one of the toughest reads I've ever experienced"


I remember the first book I read on binary back in the '80s (though I don't recall the name or author).  It was somewhere around 200 pages, clearly written by someone with a strong mathematical background, and it was one of the toughest reads I've ever experienced.  Later, in my own teaching, I was able to effectively teach all the same concepts and materials to 7th graders in less than an hour, and have them using it in code as part of a game class.
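To illustrate the kind of exercise that makes binary click quickly in a game class (this is my own Python sketch, not material from the course described above): each power of two is one on/off switch, so a single integer can store a whole set of game states.

```python
# Each flag occupies one bit: 1, 2, 4, 8, ...
HAS_KEY   = 0b0001
HAS_SWORD = 0b0010
POISONED  = 0b0100

player = 0                       # no bits set: empty inventory
player |= HAS_KEY                # OR turns a bit on
player |= HAS_SWORD

print(bin(player))               # 0b11 -- key and sword bits are set
print(bool(player & HAS_KEY))    # True -- AND tests a single bit
player &= ~POISONED              # AND-NOT turns a bit off (safe even if unset)
```

An hour of this, with students flipping bits on their own game characters, covers the same ground a 200-page mathematical treatment does: place value, powers of two, and why computers count this way.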

I also remember taking a course on some development practices that I was very familiar with, but wanted to make sure I had down well.  The first 6 weeks were easy, all well below my skill level.  On the 7th week we were still doing things well within my skill level, but presented in ways so complex that they made little sense; I had to decompile the libraries and inspect the internal structure of the code to understand what on earth was required.  After reflecting on it for a while, I realized the project was simple and relatively straightforward; it was just that the instruction and training leading up to it were poorly communicated and hard to follow.  The students having trouble weren't at fault; the teacher was.  To their credit, most of the course was good, but I see this happen all the time: papers and course materials written by people who are clearly brilliant in their professions, but poor communicators.

TO INSTRUCTORS:  You might argue that the real world is complicated and projects are not usually straightforward.  The problem is that complexity doesn't help students learn.  It would be far better to present the topics as simply as possible and then provide extra homework that focuses on breaking down more complicated business logic and code.

Tutors and any assisting professionals should be able to help correct and improve these issues in the actual course material, but it is typically not their place to do so.  College curriculum management is often a hidden practice, derived primarily from a single mind (the professor's) and resistant to change by others unless there is an obvious mistake; it is rarely open to improvement otherwise.

TO STUDENTS:  Understand that materials are often taught from one mental perspective.  The fact that you don't get it doesn't make you stupid; more often it means the instructor didn't present it well.  Keep in mind that you might just need a different perspective.  When pieces of it start to click, try to figure out why they didn't in the first place.  Maybe it was a hard topic with no better way to learn it, maybe it was taught poorly, or maybe you were missing some small but pivotal piece of knowledge that should have been picked up in an earlier course.  Understanding the difficulty will often help you improve your learning with that particular teacher or course material.

*Update: A college instructor at an Ivy League university told me that colleges often like to include a few near-impossible yet required courses at the beginning as a kind of filtering process.  I.e., if you don't have the determination or experience to figure your way through the course, then you *probably* won't do as well in general.  I disagree with this practice.  I should also note that I have no idea which colleges do this, or how often.



5) So what's holding them back?


"what does it mean for students when the topics covered last year have changed"


Unlike most college majors, computing is changing at an incredible rate.  Medicine, physics, and other sciences are seeing significant improvements as well, but those are usually incremental improvements, not completely shifting paradigms.  Computers, on the other hand, are evolving left and right, and not just through improvements, but through significant changes that force us to adopt different solutions and patterns in every language, not to mention the invention of, and need for, new languages.

Many colleges and professors face great difficulty when restructuring courses.  It can be expensive; catalogs, books, accreditation, and curricula all need to be updated.  It can be confusing; what does it mean for students when the topics covered last year have changed?  It can be hard to track student growth; how do you tell whether minor changes helped when the entire class material keeps changing?  And professors and instructors need to know all this new material inside and out; to teach it, you'd better know it.

There are hundreds of difficulties in altering the directions and requirements in courses, majors and colleges.  But this is what colleges and universities are supposed to be doing.  Below are some proactive suggestions.

TO STUDENTS:  Be expressive at your school.  If you feel the technology is outdated or the practices are not relevant, bring it up with your professors, mentors, and school directors/counselors.  Schools have the uniquely difficult task of staying ahead of the curve in technology while at the same time keeping the curriculum stable and valuable.  It's tough, especially if no one is helping them.



6) Here are a few things I think can help colleges and universities.


6.1) Interview the instructors every year.

For real jobs.  There are many companies that would be willing to help out if you put the requests out.  Let your instructors interview for some significant positions in their fields (not to actually take other employment, of course).  Don't expect them to always succeed.  The point of the exercise is to start identifying where you are off base with what the real world wants.  The instructors should take a strong look at any area where they came up weak, and also strongly consider whether their courses are preparing students for positions like these.

*Update: This also helps the teacher focus on preparing students for similar situations.

TO STUDENTS:  Ask your mentors to interview you for mock positions.  Not to get a job, but to get a review of how well you did, the weaknesses you can improve, and the technologies you should have been more familiar with.


6.2) Start Dicing up your required course topics.

Make a list of all the technologies listed in required courses (granted, that is not so obvious when it comes to patterns and some other class types), do job searches (on sites like Dice.com) in your area, and compare the results against other, more popular technologies.  If you teach a technology that only has 5 job listings in your state, and those only mention it as a footnote, that's a good sign to at least drop it from the required courses.
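The audit above can be sketched in a few lines of Python.  The counts below are made-up placeholders standing in for manual job-board searches (the technologies, numbers, and threshold are all illustrative assumptions); only the comparison logic is the point:

```python
# technology -> local job postings found by manual searches (placeholder data)
REQUIRED_TOPICS = {
    "Flash": 5,
    "VB6": 40,
    "JavaScript": 900,
    "SQL": 650,
}
MIN_POSTINGS = 25  # arbitrary cutoff for "worth keeping as a requirement"

def audit(topics, threshold):
    """Split required topics into keep/reconsider lists by local job demand."""
    keep = sorted(t for t, n in topics.items() if n >= threshold)
    drop = sorted(t for t, n in topics.items() if n < threshold)
    return keep, drop

keep, drop = audit(REQUIRED_TOPICS, MIN_POSTINGS)
print("keep as required:", keep)   # ['JavaScript', 'SQL', 'VB6']
print("reconsider:", drop)         # ['Flash']
```

Even done by hand in a spreadsheet, the same comparison makes it obvious which required topics the local job market no longer supports.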


6.3) Have the students make the classes.

Offer extra credit for students to rewrite the training material.  If students walk into a topic understanding that there is room for improvement, they'll be less apt to put themselves down for being too stupid to get it, more apt to figure out why it is difficult to learn, and, once they understand it, more comfortable sharing their difficulties with others.

TO STUDENTS: If you can, find mentors who are advanced in your class, or have completed it prior, and ask them if they can simplify topics when the teacher or teaching material is confusing.  People will help out more often than not, because it makes them feel more valuable.


6.4) KISS.

Keep It Simple, Stupid.  Math is not as prominent in computer science jobs as many colleges make it out to be, but it adds a level of complexity that wards off many young programmers.  I implore colleges to put together groups to determine what is really important to jobs and what is not.  Around 50% of any interview I conduct is based on cultural fit, yet seldom are there courses that teach good cultural etiquette in the modern business world, especially in computer science.

The word "job" might be too specific here.  Programming is changing: teams often telecommute and have never met in person, and many of the biggest web companies were started by kids in a garage (or basement, or web team, or whatever).  What is really needed?

TO STUDENTS: School course counselors can be of great value, but seldom will you find one who has "been there, done that" in your selected field.  I strongly recommend you find mentors willing to review the required classes with you and see whether they are valuable or not, especially when selecting a school in the first place.


This isn't that tough; we can do it.
