RIGHT. This is exactly why these tests are given. In the deliberate-practice literature, such people are called experienced non-experts.
Ten years of bad programming may make you experienced, but it doesn't make you good.
I would question the big-picture awareness of any competent developer who refused this section of an interview. How senior can you really be if you don't recognize why this section of the interview is being given?
I think we are discussing two different cases. I was referring to the dev that recognizes why it's given but philosophically disagrees with this approach of the company's interview process.
(By the way I'm a huge fan of your app and have used it every week for years.)
I cannot agree with you more. It is unfortunate that experience is often taken as synonymous with years alone and skill or knowledge gained or a lack thereof is ignored.
Better opportunities meaning they don't have a coding verification step for senior engineers, so some of your peers don't actually know how to code (permanently, or just for a while before they get outed and fired)? That sounds like a worse opportunity.
I have worked with good technical managers and know of good sales engineers that might not pass a coding verification test like this. I don't think it's right for every role. But that test doesn't directly correlate with the responsibilities of their job. I'm a big fan of contracted simple projects to test someone's skills beyond interview questions.
That's irrelevant. Obviously asking people who aren't expected to program to program in an interview is silly, we're talking about asking people who would be hired for a programming role to program.
I've never met a great architect who didn't also understand how to code. It's hard for me to imagine someone who is unable to structure code and design solutions at the micro level yet has good intuition for structuring code and designing solutions at the macro level.
The biggest recurring issue in any production-quality programming is the handling of edge cases. (The most frequent source of production issues is config management, but let's keep that out of scope here.)
I would expect a really good design to consider the expected edge cases up-front, and take them into account. The goal cannot be a dogmatic "eliminate all edge cases" approach - because that can only become a self-defeating exercise in frustration. Instead the edge cases should become easier to detect and, as much as possible, not cascade. Debugging a subtle concurrency related timing issue or race condition is bad enough. Debugging the cascading failure and data loss due to a chain of them is a morale destroyer.
Understanding where these edge cases can crop up is important. Even more so, understanding why they appear, and how to address them, is the mark of a really good senior engineer.
For that reason I expect a senior to still be able to program. Senior engineers may not spend their days gazing at their editor, but I do expect any senior to spend plenty of time on both code review and mentoring.
I really don't get the attitude that it's insulting to ask someone to show you some programming. You wouldn't hire a musician without seeing them play. I have interviewed plenty of experienced, non-beginner programmers who couldn't do shit. You can't find that out without asking them to code something. There are tons of people from large companies in my town who cannot do anything. I also worked at some of those large companies and ran into them there.
How would you identify these people if you don't ask programming problems?
Nobody's saying you can't test their ability to program. You can ask questions about programming without having them solve some problem from freshman data structures class. You can ask them how they do their current job, and ask for specifics about code they've written. You can ask them to review pieces of code to see how they would improve or change them. You can ask them for a code sample, and then have them walk you through it to be sure they understand it as well. There are many other less insulting options that still give you the chance to "hear the musician play." As an analogy, you probably wouldn't hire Wynton Marsalis and ask him to play "Happy Birthday" in an interview. You would probably ask for some samples of his music, or links to videos of him playing, etc.
Thanks for responding. So what is insulting about specifically asking for coding? It is not that different from asking someone to look at code in the interview and comment on it. We do both at my company. I am not persuaded by the famous-musician example; think of someone auditioning whom you haven't seen play. Someone famous who is active, and who hasn't, say, recently suffered an accident, is in a different category.
We interviewed some experienced people who did terribly in the in-person interview. So we tried giving experienced candidates a pre-interview question that took maybe a couple of hours of focused effort. The idea was that people who couldn't perform well on the spot could sit down in their preferred environment, code something up, and take their time. Some people did better with this. I personally hate pre-interview questions, so I argued against it, but we still tried it.
I find code review of bad code too easy to be persuasive. Is it a bad idea for a service to hand out IDs that are the memory locations of runtime structures, and take them back? Obviously, but a surprising number of people don't see that, including experienced people.
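A minimal Python sketch of the smell described above (the `SessionService` and its methods are hypothetical, purely for illustration): in CPython, `id()` is a memory address, so using it as a client-facing handle means that once the structure is freed and the address is reused, a stale handle can silently resolve to someone else's data. An opaque counter avoids the problem.

```python
import itertools


class SessionService:
    """Hypothetical service showing the anti-pattern: handing out the
    memory location of a runtime structure as a client-facing ID."""

    def __init__(self):
        self._sessions = {}

    def create_session(self, user):
        session = {"user": user}
        handle = id(session)  # BAD: id() is a memory address in CPython
        self._sessions[handle] = session
        return handle

    def get_user(self, handle):
        return self._sessions[handle]["user"]

    def close_session(self, handle):
        # After this, the structure can be garbage-collected and its
        # address reused by a later allocation, so a client still holding
        # `handle` may later resolve it to a *different* session.
        del self._sessions[handle]


class SaferService(SessionService):
    """Same service, but with opaque handles that are never reused."""

    def __init__(self):
        super().__init__()
        self._next = itertools.count(1)

    def create_session(self, user):
        handle = next(self._next)  # opaque, monotonically increasing
        self._sessions[handle] = {"user": user}
        return handle
```

The interview question is essentially whether the candidate spots why the first version is dangerous and reaches for something like the second.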
My theory is that the state of technology is that people are able to use APIs, libraries, and Stack Overflow code to muddle through most business requirements. And if that's true, perhaps the standards of an engineer have shifted? If those people actually did work at those companies for those many years, surely they did something of business value?
Or maybe the questions being asked in technical interviews are no longer germane to the actual day to day experience of coding?
I'm not saying you should necessarily hire those people, but perhaps all of those articles that go "Why can't our programmers program???" should stop pearl-clutching and actually try to figure out why that is.
I've interviewed probably dozens of people who couldn't code, but never worked with one. I've had plenty of bad coworkers, but all of them could write code well enough to pass a basic coding interview.
I assume there is just a large population of people who can't do anything at all, and they migrate from one company to another, coasting without doing anything for a couple of years, because firing is hard. I keep remembering a story I heard from a co-worker. He worked at a giant laptop repair shop. There was a guy there who didn't know how to fix anything. He stayed for a year or so and was finally let go. They found dozens of spare parts in his drawers: he would order random parts from the warehouse to imitate work, then leave them in his desk.
This is part of the unspoken dues that corporations pay to society, creating a hidden safety net for people who are just not good at what they do, or perhaps can't do anything at all. It is both good and terrible. I am glad this safety net exists, but it must be dispiriting for these people. I wish society had a better way of helping them.
I prefer to not conflate "computer science" with "software development." The majority of the problems with tech interviews in the Valley stem from an overabundance of CS grads who mistakenly believe that their academic trivia quiz questions have much value for developing software people use.
Very few programming jobs actually require the more mathy end of CS. If you want an analogy with physical science and engineering, the algorithms-and-data-structures interview is like giving a candidate for a satellite engineering job an interview laden with mid- to upper-level undergrad physics questions. It's just nonsense.
> Very few programming jobs actually require the more mathy end of CS
I disagree. I'm a lowly web dev who is not in Silicon Valley, but I do know the practical benefits of applying the right data structures and algorithms, even though I don't apply this knowledge every single day.
Yes, nested for loops and a hash table will "solve" almost any problem and you can push the release out the door, but how much will that sloppiness cost the employer/customer in hardware and the maintainer and users in time?
> how much will that sloppiness cost the employer/customer in hardware and the maintainer and users in time?
Next to nothing in almost every case.
In fact that hash table is probably overkill most of the time. Unless you're dealing with keys that are costly to compare, a large number of key-value pairs, or a large number of lookups (for some context-specific value of "large"), the most naive and "inelegant" solution you could imagine, an unsorted array of key-value pairs, is sufficient. So what you call a "sloppy" solution is probably, quite by accident, far more powerful an algorithm than the problem truly calls for.
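For what it's worth, the "naive" structure described above takes only a few lines; a Python sketch (the `AssocList` name and methods are mine, not from the thread): an unsorted list of (key, value) pairs with linear-scan lookup. Each operation is O(n), but for small collections that is often perfectly adequate, and the code has no hashing machinery to get wrong.

```python
class AssocList:
    """Unsorted list of (key, value) pairs: O(n) lookup, trivial code.
    For small n this is frequently fast enough in practice."""

    def __init__(self):
        self._pairs = []

    def put(self, key, value):
        # Linear scan: replace the value for an existing key, else append.
        for i, (k, _) in enumerate(self._pairs):
            if k == key:
                self._pairs[i] = (key, value)
                return
        self._pairs.append((key, value))

    def get(self, key, default=None):
        # Linear scan; returns `default` when the key is absent.
        for k, v in self._pairs:
            if k == key:
                return v
        return default
```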
Understanding complexity analysis is useful when the domain has performance demands or constraints that make it necessary. "Lowly" web dev is almost never one of those domains.
One example: a directed graph representing arbitrary, user-defined relationships between fields, where events on one field trigger value changes in other fields. Since the changes could cascade repeatedly, skipping transient states drastically improved performance.
I also routinely avoid accidentally exponential complexity.
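A hypothetical Python sketch of the cascading-fields idea above (the `deps`/`rules` representation and the `propagate` function are my own illustration, not the original system): the fields affected by a change are recomputed in topological order, so each field is updated exactly once per change rather than once per transient intermediate state.

```python
# Requires Python 3.9+ for graphlib.
from graphlib import TopologicalSorter


def propagate(values, deps, rules, changed):
    """values: field -> current value; deps: derived field -> set of fields
    it reads; rules: derived field -> function(values) computing its value;
    changed: name of the field that just changed."""
    # Find every field that transitively depends on `changed`.
    affected = set()
    frontier = [changed]
    while frontier:
        f = frontier.pop()
        for field, reads in deps.items():
            if f in reads and field not in affected:
                affected.add(field)
                frontier.append(field)
    # Recompute affected fields with prerequisites first, so each field
    # is evaluated once, after its inputs have settled (no transient states).
    ts = TopologicalSorter({f: deps[f] & affected for f in affected})
    for field in ts.static_order():
        values[field] = rules[field](values)
    return values
```

With naive per-event propagation, a field read by two others in the chain would be recomputed for each incoming event; here the topological ordering collapses those into a single update.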
What you describe is far more often a CS guy looking for a way to apply his CS education instead of asking whether he's identified the correct problem, and frequently that's because, having only an academic CS background, it doesn't occur to him there might be a more serious underlying problem that has a better solution.
I don't know if it was your intention, but it almost sounds like you're saying a CS education is a handicap to solving problems; that is at odds with my experience (I was an embedded developer before I earned my degree).
How do you know? Because they failed your whiteboard test? That doesn't necessarily tell you anything about what they can do with a real workstation and realistic deadlines, unless you've empirically evaluated its effectiveness (including hiring some people who failed it).
This isn't very scientific, but I feel pretty confident about not hiring the people I described. Frankly, I have no interest in working with someone who can't come up with the syntax to define a function in the language of their choice. This is a real thing I have seen from people with significant experience on their resume.
There are candidates who are more on the line, and those I'm less confident about rejecting. But we can't just hire the hundreds of applicants per position we see, and I haven't come across a filtering strategy that isn't obviously worse than what we do now (which, to be clear, is the candidate's choice between whiteboarding and coding in an editor).
The Peter principle is one thing to think about. Another is finding those developers roles that suit them better. Maybe that's some other aspect of the software pipeline like QA or testing or internal tools.
Perhaps someone with more experience interviewing and hiring (or not hiring) in this category can provide a better suggestion.
I mean, I wasn't _really_ looking for a suggestion. I interview them and tell them we're not interested. Which seems like the right thing to do in this situation.
This idea of hiring everyone with experience and hoping you have a job for them doesn't seem terribly sustainable.
> Another is finding those developers roles that suit them better. Maybe that's some other aspect of the software pipeline like QA or testing or internal tools.
I'm used to QA and testing being done by developers. This results in a lot of high-quality automated tests, which are much cheaper than testing the same thing manually every few days.
Internal tools is where I would want my best developers, I would think. Dragging down the productivity of the entire department by having bad internal tooling is pretty worst-case.
> Another is finding those developers roles that suit them better. Maybe that's some other aspect of the software pipeline like QA or testing ...
As a QA professional, I regret greatly that HN's rules restrain me from giving your post the appropriately pithy response it deserves.
QA is not your dumping ground for failed devs; we do not want them either. If they couldn't hack it as an SDE, they are certainly not going to hack it as an SDET.
Hi, I think perhaps you're reading into the non-exhaustive list of example alternatives to traditional developer roles more than intended.
I don't see anything in my comment to suggest that "QA is dumping ground for failed devs", nor was that my intention.
In fact, I wouldn't label anyone a "failed dev". Rather, the idea behind my post was to find a role that better fits that person's skill set and experience. My apologies if this was unclear.
Is this true, or is it true that there are devs with a lot of experience who can't program during an interview? An interview is an artificial environment.
But those are easily recognized by looking at their GitHub, past jobs, and portfolio, and by calling a few references. I would not invite seniors who lack a clear track record, and for those who have one I do not require the kind of interview discussed here. Also, most seniors are referrals anyway. I still do not see the point of the weird interviews I read about here, not even for juniors.