The Innovation Paradox

A number of prominent software developers have written essays and blog posts explaining how important it is to embrace new technologies, encourage developers of ‘fringe’ languages, and the like. There are a number of articles that I’m too lazy to look up that make these assertions and similar ones. Here’s one. Here’s another. It is extremely common for people to argue that the most important thing when developing software is to use the right tool for the job; if Python is the best language for a task, then use it instead of a more popular language. This is true to a very large degree, but this viewpoint misses a crucial practical aspect of software development.

“Right tool for the job” is an ideological stance, one I happen to strongly agree with, but as with all ideological stances, it is frequently at odds with practical considerations that require compromising the ideal. When tasked with new development, the most natural approach is to figure out what tools and languages are needed, then use them. This attitude, however, is somewhat shortsighted. A product spends far more time in maintenance mode than in original development, and the people who maintain an application are rarely its original developers for the lifetime of the product. That means that whoever maintains the code has to understand it well enough to make changes, and, as a result, the person hired into that role must be familiar with the tools and languages used to develop it.

So it’s very possible that when you hire a Python or Lisp programmer, you get a ‘better’ programmer, because someone who uses a fringe language is likely doing so out of passion for it, which makes them a better programmer than your average paycheck monkey. Indeed, a lot of the people out there seeking and getting Java or .NET jobs merely know the language well enough to collect a paycheck; they aren’t passionate about the language or about software development in general. The argument that someone seeking a Python job is more passionate DOES hold a lot of water (even though there are many passionate Java and .NET developers), because paycheck monkeys are unlikely to learn Python, Lisp, or Ruby: they’re going to learn whatever gives them the most opportunity.

The problem arises when that original passionate developer retires, moves up in the company, leaves the company, gets hit by a bus, or avoids being hit by a bus due to a last-second premonition that causes Death to come after him or her, along with a group of sexy teenagers who also avoided being hit by the bus, creating an imbalance that must be corrected through a series of complex, deadly Rube Goldberg machines. The point is, when someone has to take the project over, if the project was written in a fringe language by a passionate person, it becomes extremely difficult to find a good candidate.

This is something a lot of software development houses don’t consider during the ‘right tool for the job’ discussion, and it seems to be something that various prominent developers are ignoring when proclaiming the importance of hiring people who write Python or Lisp. Yes, the original developers are likely to be more passionate about their work if they prefer a language they had to learn on their own, but the long-term consideration of finding replacements for those people cannot be ignored.

Where I work, we write a lot of ColdFusion applications. ColdFusion was the right tool for the job when we began development, but now the company needs to expand and hire more people. There are tons and tons of Java/.NET developers out there looking for work, but there are far fewer CF developers. We chose CF because the most important thing at the time was speedy development, and ColdFusion was ideal for rapid prototyping. The problem is that we’re now stuck with that decision when looking for job applicants, and the selection pool is very limited because of the ColdFusion restriction. Sure, if we were in New York or some other heavily populated area, things would be different, but we’re in Denver, and there just aren’t that many programmers out here.

In retrospect, it may have been better to develop in something more popular, not because it was a better fit for the business, but because it would have made the search for additional workers less difficult. I’m not advocating going with what is popular at all times; innovation is unlikely with such a mentality. I’m simply saying that, when new development work starts and someone talks about the right tool for the job, it’s worth considering the downside later on, which is, in my experience, substantial.
