Dev Chair: Do we want scientists or engineers?

Good computer science graduates do not make good software developers. Really, I mean it. But for the polar opposite of the reason these two New York University computer science professors give.

When I was in high school my physics teacher once told us, "All physics experiments work. They just may not work the way you want them to."

This encapsulates neatly what software development is all about. On the one hand, it is science. It is deterministic: each programming language statement performs exactly as written (barring bugs in the compiler, the SDK, or the OS). On the other hand, software development is closer to engineering, where years of experience allow a developer to spot patterns in the model and apply them to build a system.
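As a small illustration (my own hypothetical snippet, not from any course material), here is a Java statement that "works" exactly as the language specifies yet not the way a beginner might want, which is the software version of my physics teacher's point:

public class IntegerDivision {
    public static void main(String[] args) {
        // The statement performs exactly as the language defines:
        // integer division truncates toward zero.
        int average = (7 + 8) / 2;          // 7, not 7.5
        double expected = (7 + 8) / 2.0;    // 7.5 once one operand is a double

        System.out.println("Integer result:  " + average);
        System.out.println("Expected result: " + expected);
    }
}

The program is perfectly deterministic and bug-free from the language's point of view; knowing whether it does what the customer actually intended is the engineering part.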

Unfortunately, just as in physics, computer science courses do not prepare students for what comes after graduation. Skills that are considered crucial in almost all commercial software projects are either not taught in college or are only touched upon. This disparity between the skills graduates possess and what the industry is looking for means it generally takes one to two years of working on real-life projects before a graduate is fully trained.
For example, here is Stanford University's Computer Science course schedule for 2007 and 2008 (spring, summer, autumn, and winter). There are plenty of computer science courses such as Object-Oriented Programming (in C++) and plenty of advanced topics such as Artificial Intelligence (AI) and algorithm design. But there is only one course that teaches software engineering, and one on C# and the .NET Framework.

Where do students learn about project management, requirements gathering, testing, builds and deployment? Or soft skills such as how to interact with customers and clients? Do colleges expect students to acquire these skills by themselves on the side? How would the students know they are picking up the 'right' skills and not just acquiring bad habits?

Look at it from another angle. How useful is it for students to know how to design and program an OS kernel? Or to know object-oriented programming in C++, versus 'modern' languages such as Java, C#, Ruby, or even VB.NET? How many OS programmers do we need compared with the number of regular commercial application programmers? Unless the product is a new OS or a new programming language, I would rather have engineers who build applications on my team than computer scientists who theorize and argue the merits of a design pattern.

One explanation given by colleges is that they are not there to dictate what role a graduate should take in the industry; instead, they teach the fundamentals so graduates have the basic skills to specialize further. One may wish to be a developer while another may wish to be a tester. Personally, I feel this is more of an excuse than an explanation.

Yes, the industry changes rapidly. Tools or techniques fashionable last year are replaced by something new and shiny this year. Does it not strike anyone as contradictory that a computer science department is unable to react to these changes the way the industry does? Why bother teaching students how powerful and wonderful computers are and how they can improve our lives when the very people teaching these concepts are not keeping pace with the industry?

If computer science departments are not willing to prepare students for real-life jobs, perhaps another department (Electrical Engineering, for example) or a completely new one should take over that role? After all, you wouldn't want a scientist to build a bridge now, would you?

Tags: c, c#, computer science, ComputerScience, devchair, java, ruby, stanford, vb.net
