Tuesday, April 24, 2007

What happened to OOP?

When recruiting engineers I always start with a discussion of object oriented programming. I try not to completely surprise the candidate, so I always list the four topics: abstraction, encapsulation, polymorphism, and inheritance. Then I ask the candidate to define each term and tell me how (and why) it is used in real-world situations.
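
Here is the sort of answer I hope to hear, reduced to a minimal Java sketch (the Shape and Circle classes are hypothetical, purely for illustration; any small hierarchy would make the same point):

    // Abstraction: Shape describes what a shape can do, not how it does it.
    abstract class Shape {
        // Encapsulation: state is private and reached only through methods.
        private final String name;

        protected Shape(String name) {
            this.name = name;
        }

        public String getName() {
            return name;
        }

        public abstract double area();
    }

    // Inheritance: Circle reuses everything Shape already provides.
    class Circle extends Shape {
        private final double radius;

        public Circle(double radius) {
            super("circle");
            this.radius = radius;
        }

        public double area() {
            return Math.PI * radius * radius;
        }
    }

    public class ShapeDemo {
        public static void main(String[] args) {
            // Polymorphism: the caller holds a Shape reference and the
            // correct area() implementation is chosen at run time.
            Shape s = new Circle(2.0);
            System.out.println(s.getName() + " area = " + s.area());
        }
    }

A candidate who can point to where each of the four terms lives in a dozen lines like these is usually someone who can also explain why they matter.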

I have to admit that I am amazed at the percentage of developers who do not know the fundamentals of OOP. Even developers coming right out of school struggle with this conversation, despite the fact that most of our entry-level developers come out of Master's programs in Computer Science.

It has caused me to wonder if OOP is falling out of favor. If this is the case, then what is replacing it? Gang of Four patterns? Something else?

You could say I am an old-school developer. I learned to code when the style was structured programming. There was no concept of OOP when I earned my Computer Science degree. I was introduced to OOP several years later, when building my first applications for Windows. I immediately saw the beauty of maintaining a single piece of reusable code; it was a logical extension of function libraries.

OOP was a natural evolution of structured programming, and yet I was amazed at the number of my colleagues who did not make the switch. And those who didn't were relegated to mainframe jobs and the maintenance of legacy systems. The best engineering opportunities were given to those who were evangelists of object oriented programming.

But as Internet development took off during the first dot-com boom, a couple of trends started. One was the adoption of Visual Basic, and the other was design patterns from the Gang of Four (Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides).

In my mind, Visual Basic is and was a horrible trend in the practice of software development. The language, especially in its early versions, encouraged poor programming practices. And Microsoft's CASE-style designer tools only exacerbated the problem. Departmental developers inside corporations picked up the easy-to-learn language and churned out applications that were impossible to maintain. Although recent versions of the Basic language implement OOP constructs, traditional VB developers generally do not use them.

Design patterns should have been an advancement in OOP development. In practical use, however, programmers often use patterns without practicing OOP themselves, much the same way that procedural developers use Java or .Net and still fail to implement OOP.
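
The Template Method pattern from the Gang of Four book is a good example of why that matters: strip away the inheritance and polymorphism and there is no pattern left. Here is a rough Java sketch, with a hypothetical ReportGenerator purely for illustration:

    // Template Method (Gang of Four): the base class fixes the skeleton of
    // an algorithm; subclasses supply the steps that vary.
    abstract class ReportGenerator {
        // The template method itself; the overall flow never changes.
        public final String generate() {
            return header() + body() + footer();
        }

        protected String header() { return "=== Report ===\n"; }
        protected String footer() { return "=== End ===\n"; }

        // The varying step, provided through inheritance and dispatched
        // polymorphically when generate() runs.
        protected abstract String body();
    }

    class SalesReport extends ReportGenerator {
        protected String body() {
            return "Sales figures go here.\n";
        }
    }

    // Usage: new SalesReport().generate() produces a complete report.

Copy that structure into a code base without understanding the inheritance underneath it, and all you have is a pattern-shaped pile of procedural code.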

The problem for me, as a manager and leader of engineering teams, is finding people who write code that is easy to maintain. To a degree, I equate maintainability with reusability, because code that is reused is not rewritten. And code that is reused is tested frequently. Coders who do not understand or practice disciplined OOP fall into the copy-and-paste trap. When this happens, many versions of similar code appear throughout the source code, creating a maintenance nightmare.
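
To make the contrast concrete, here is the kind of refactoring I expect a disciplined developer to reach for instinctively. The validation rule below is hypothetical; the point is that it lives in exactly one place:

    public final class Validation {
        private Validation() { }

        // One shared implementation; callers reuse it instead of pasting a
        // slightly different copy into every form handler.
        public static boolean isValidEmail(String value) {
            return value != null
                && value.matches("[^@\\s]+@[^@\\s]+\\.[^@\\s]+");
        }
    }

When the check lives in one class it is exercised by every caller and fixed once; when it is pasted into ten form handlers, it gets fixed in three of them.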

The irony here is that descriptions of object oriented programming are very common. Wikipedia, for instance, has an entry for OOP that a developer could review and understand in a couple of minutes. I hope, too, that Computer Science programs strive to instill these basic concepts in their students. In the meantime, I continue my search for solid OOP engineers who will help evolve our products.

Thursday, April 12, 2007

Recruiting headaches: how not to get hired

I spend much of my time recruiting people for my technology teams. Currently we have openings for software engineers, QA Analysts, and a DBA. Thankfully I have corporate resources to post ads and review resumes. After candidates successfully navigate through HR, I do a short phone screen to gauge the candidate's ability to carry a conversation, and to match their knowledge with their experience (as it appears on paper).

I believe the best developers and testers have a strong academic knowledge of their profession. From this knowledge come good coding habits and solid testing methodology. So my phone conversations always start with a discussion of fundamentals.

For software engineers, the conversation begins with a discussion of object oriented programming. Because I often catch candidates by surprise with this line of questioning, I always list the specific terms we will discuss: abstraction, encapsulation, inheritance, and polymorphism. I have been applying these concepts for nearly twenty years and expect every competent developer to understand how they work in real-world applications.

Of course, some candidates cannot describe OOP concepts. I usually give a lot of latitude on abstraction and encapsulation; these tend to be more conceptual than inheritance and polymorphism, which are implemented with specific language constructs. Imagine my surprise, though, when a candidate recently told me he did not know any of the terms.

When a candidate struggles with my OOP discussion, I try to give him some relief by asking him to define inheritance. Inheritance, after all, is fundamental to all modern programming and is easy to describe: simply define the word. But I was shocked when this candidate admitted that he had never heard of inheritance.

This story should end right here, for at this point I politely ended my line of questioning and suggested that we did not have an appropriate "fit". The candidate, however, wasn't quite finished. He assured me that given proper requirements he could finish any project. Then he claimed that academic concepts weren't that important and if necessary he could easily look up the information he needed.

Of course he was wrong. A developer cannot possibly build a class library or a reusable object without understanding inheritance. In fact, you cannot write a meaningful Java, C++, or .Net application without inheriting from an object. And you cannot possibly know when to use an interface without understanding polymorphism. You cannot implement objects without understanding encapsulation, and your objects will be a mess if you do not apply abstraction. Most of all, you cannot hope to get hired into a position if you downplay knowledge the interviewer deems important.
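
To put the interface point in concrete terms: an interface earns its keep only when callers treat different implementations interchangeably, which is polymorphism by another name. A minimal Java sketch, with hypothetical payment classes used purely for illustration:

    // The interface exists so callers depend on behavior, not on a class.
    interface PaymentProcessor {
        void charge(double amount);
    }

    class CreditCardProcessor implements PaymentProcessor {
        public void charge(double amount) {
            System.out.println("Charging " + amount + " to a credit card");
        }
    }

    class InvoiceProcessor implements PaymentProcessor {
        public void charge(double amount) {
            System.out.println("Adding " + amount + " to an invoice");
        }
    }

    class Checkout {
        // Polymorphism: this method never knows which implementation it
        // holds, and it does not need to.
        void settle(PaymentProcessor processor, double amount) {
            processor.charge(amount);
        }
    }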

My quality assurance conversations cover testing methods in much the same manner as the OOP concepts. For QA, I have twelve types of tests that I expect a candidate to discuss. In the course of the discussion, I ask the candidate to describe how he has applied each type of test in his own experience.

I am guilty of acquiring my definitions of the tests from the web. I can't even remember where they came from, but we have a list of a few dozen QA terms. From this list I have flagged twelve to drill candidates on. I always start by asking the candidate to describe Black Box Testing.

It was surprising when a recent candidate quoted the exact phrase "not based on any knowledge of internal design or code." Surprising, because that phrase is an exact match for the definition on my sheet. I assumed this was a coincidence and continued.

But the next item was quoted exactly too. And the third, Unit Test, was again quoted exactly, using the phrase "the most 'micro' scale of testing." Nobody talks like that. She could have at least used the word "smallest".
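
To be fair, the definition she quoted is accurate, even if the phrasing is stilted: unit testing is the smallest scale of testing, exercising one piece of code in isolation. A minimal sketch, assuming JUnit and a hypothetical PriceCalculator class:

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // A hypothetical class under test, kept tiny on purpose.
    class PriceCalculator {
        double discountedPrice(double price, double discount) {
            return price * (1.0 - discount);
        }
    }

    public class PriceCalculatorTest {
        // Unit testing at the "micro" scale: one method, one expected value,
        // no database, no user interface, no knowledge of the rest of the system.
        @Test
        public void appliesTenPercentDiscount() {
            PriceCalculator calc = new PriceCalculator();
            assertEquals(90.0, calc.discountedPrice(100.0, 0.10), 0.001);
        }
    }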

At this point I stopped her and asked what she was reading from. She denied reading the responses. I mentioned that I was reading my terms from a sheet that we had acquired long ago, and asked where she had gotten her material. I do not believe it is possible for anyone to have memorized those specific terms using such precise language. Again, she denied using reference material.

Generally speaking, I would not fault a person for using notes or reference material during a phone screen. But I expect candidates to be able to show how an academic concept makes sense in real situations. I would also expect a person to acknowledge using reference material once it has become obvious to the interviewer. She had the opportunity to look resourceful, but instead she looked like a cheat.

In both of these cases, the candidates' resumes looked great, but they weren't a good fit for our team. These candidates simply didn't apply common sense to our conversation. Want to know how not to get hired? Simple: be unprepared, lie, or ridicule the interviewer's questions; any of these is guaranteed to keep you out of the position.

The end of an era: say goodbye to an icon

I was shocked last week when I received my copy of InfoWorld. Printed in the corner of the cover was the announcement: "The Final Print Issue."

I have been reading InfoWorld for nearly twenty years. I remember when it was printed on large newsprint. In the early days, the paper was the premier source of information on technology. The articles, reviews, and news were relevant to the industry. I loved Robert X. Cringely when he provided insider information wrapped in his unique humor about Pammy and his Studebaker Hawk.

But the years have not been kind to the paper. The content simply has not retained the quality and relevance it had during the 90s. Even Cringely descended into useless gossip from readers and rants against Microsoft. From my perspective, only Tom Yager remained insightful.

InfoWorld spins the change as evolution: a move away from the print world and strictly into the online world. I see it as evolution toward death. The move ignores the fact that people read newspapers and magazines for the convenience of the medium. It's very easy to skim through a paper, absorb the headlines, and drill into interesting articles. No one skims online content in the same manner.

Everyone knows the web is a tremendous source of information. And every provider of news, technology or otherwise, must have a web presence. The online experience, however, is very different from the offline experience. InfoWorld, for example, was delivered to me. It's the perfect push technology, because once it is in my hands I always flip through it. InfoWorld also sends me email with links to its site. Those messages are very easy to ignore, unsubscribe from, or filter as junk. I generally ignore them.

I don't see InfoWorld reinventing itself into Digg. It is still old school, and its web site doesn't put it on the Web 2.0 radar (which is getting tired anyway). In fact, its web site feels more like CNN than Digg. As for me, it looks like I will be getting my technology news from eWeek and Wired.

Hey Spencer F. Katt, I miss Pammy and the Studebaker.
