21 things to check before you submit your technical test

As a programmer and a recruiter, I tend to read my candidates’ take-home technical tests before submitting them. I’m pretty sure it’d be completely wrong to give anyone feedback on their completed tests before passing them on to my clients, so I’ve instead built up non-specific pre-test advice over the years, which seems fair to send to candidates along with clients’ technical tests.

There’s a pretty limited amount that can be learned about a developer in an interview, and the interviewing developers generally don’t interview people for a living, so we want to avoid one of the things they do learn being that you’re sloppy or not especially conscientious.

If you work through this checklist before submitting your technical test, you’ll avoid at least some of the easily fixed issues that I’ve seen arise over and over and over again in test submissions.


1. Is your code packaged in the right format?

Double-check what you’ve been asked for. Is it a zip file? tgz? An upload to BitBucket? Is your code meant to be an installable library? If it’s a series of small tasks to complete, have you added a script that demonstrates they are completed?

2. Is your code sensibly named? Does it decompress sensibly?

Sensibly named is usually your-name-company-name-tech-test.tgz. Decompressing sensibly generally means (allowing for language-specific conventions): “it unzips / untars into its own named directory” and not “it pollutes the current working directory with hundreds of files and directories”. If you’re submitting a repository, have you used a sensible branch name like main or master?
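As a quick sanity check, you can confirm everything in your tarball lives under a single top-level directory before you send it. Here’s a sketch using Python’s standard library (the archive and directory names are just examples):

```python
import os
import tarfile
import tempfile


def unpacks_into_single_directory(archive_path: str) -> bool:
    """Return True if every entry in the tarball sits under one top-level directory."""
    with tarfile.open(archive_path) as tar:
        top_levels = {member.name.split("/")[0] for member in tar.getmembers()}
    return len(top_levels) == 1


if __name__ == "__main__":
    # Build a well-behaved archive, then check it.
    with tempfile.TemporaryDirectory() as tmp:
        src = os.path.join(tmp, "jane-doe-acme-tech-test")
        os.makedirs(src)
        open(os.path.join(src, "README.md"), "w").close()

        archive = os.path.join(tmp, "jane-doe-acme-tech-test.tgz")
        with tarfile.open(archive, "w:gz") as tar:
            # arcname ensures everything unpacks under one named directory
            tar.add(src, arcname="jane-doe-acme-tech-test")

        print(unpacks_into_single_directory(archive))  # True
```

The equivalent one-liner at the command line is `tar tzf your-archive.tgz` — eyeball the listing and make sure every path starts with the same directory.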

3. Have you removed random file detritus?

Is there anything you’ve included that a sensible .gitignore would have stopped you committing? .DS_Store, ~myfile.js, myfile.c.swp? If you’re submitting a repository, are there stray branches that you shouldn’t be pushing?
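A minimal .gitignore along these lines would have caught all of the files above (the entries are examples — tailor the list to your language and tooling):

```
# OS and editor detritus
.DS_Store
*~
*.swp

# Dependencies and build output
node_modules/
dist/
```

GitHub maintains ready-made .gitignore templates for most languages if you’d rather start from one of those.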

4. Have you removed any sample data that isn’t yours to include?

Incredibly, people do use their current company’s live database as a source for test data for technical tests at other companies. That’s bad; don’t do that. Make sure any sample data you include is either randomly generated or sensibly licensed, and don’t include other people’s personal information (including lists of email addresses).

4.1 Have you removed any code that wasn’t yours to include?

I don’t want to see your current workplace’s copyright notice in any code you’re submitting; it doesn’t matter if you claim you wrote most of it or that “they’re just useful generic functions”. I’ve received far too many pieces of proprietary production code from developers’ current workplaces when asking for sample code…

Contents / README.md

5. Have you documented what you completed and how?

Some tests will explicitly ask you for commentary on what you completed, what you didn’t complete, and thoughts on your implementation. If you’ve been asked for it, you should definitely include it. If you haven’t been asked for it, you should probably include it. What are the strengths and weaknesses of your approach? Are there other approaches you considered? What would you have done if you’d had more time?

6. Have you added installation / run instructions?

Don’t make the code reviewer think too hard. Add any instructions for installation and running the code that might be relevant, even if that starts with obvious things like `npm install`. If it’s a web application + server, does it have a default bind port? What local URL should the reviewer visit to get started? How can they run the tests? Do they need to initialize a test database? Is that easily done using copy-pasteable commands you’ve included?
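Something as short as this in your README is usually enough (a sketch for a hypothetical Node project — the project name, port, and the `db:setup` script are invented; substitute your own):

```markdown
## Getting started

    npm install          # install dependencies
    npm run db:setup     # create and seed the local test database
    npm start            # serves the app on http://localhost:3000
    npm test             # run the test suite
```

The litmus test: a reviewer who has never seen your code should get from a fresh clone to a running app by copy-pasting, without reading any source.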

7. Is your version history / commit log sensible?

If you’ve been asked to deliver your code by pushing it to a version control system, are you happy with the version control artifacts you’ve created? Will the contents of `git log` pass muster? How are your commit messages?


8. Have you removed pointless boilerplate?

Documentation is great! Placeholder documentation created by a module / framework starter script is worthless if you haven’t added anything to it. Tests are great! Tests that your testing library adds as a “HOW TO” are worthless. Code is important! Unused stubs that a generator wrote for you are not very interesting, and usually have no place in your test submission.

9. Have you run a code tidier over the code?

Some people consider code style to be a deeply personal choice. Some people think it’s best handled by a computer and a shared style configuration. Almost everybody agrees that internal consistency is important. While you can go wrong by mixing tabs and spaces as indentation, you’ll almost certainly never go wrong by running a popular code tidier over your code with its default settings.

10. Have you linted your code?

Linters find “areas worthy of review” in your code via static analysis. They help Perl developers remember to `use strict` and `use warnings`, they help JavaScript developers find a sensible semi-colon strategy, and they incessantly nag Haskell developers to lose redundant parentheses. You don’t have to accept all of your language’s linter’s suggestions, but you should review every suggestion to catch any dumb mistakes.

11. Does your documentation render properly?

Push any Markdown files to a private GitHub repo and check they render correctly. If your programming language has an embedded documentation format like POD, or uses method annotations for documentation, try viewing that documentation and check you haven’t botched it with a mistyped character that throws the parser off.

12. Have you triple-checked your code dependencies?

Code reviewing code that has incorrectly-specified dependencies is irritating. If you’ve installed libraries globally, it can be easy to miss that you’ve forgotten to specify them in your code as well. Consider spinning up a virgin AWS instance (or a clean container) and checking that you can install and run your code there.

13. Have you removed any extraneous dependencies?

In the process of developing anything, I generally pull in libraries that I no longer need by the time I’m finished. If your code reviewer is like me and starts by looking at the dependencies to get a feel for how your code might fit together, you’ll confuse and frustrate them if you’ve left in a bunch of dependencies you no longer make use of, and you’ll look sloppy.
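For a Python project you can get a rough first pass at this mechanically. This is only a sketch — it naively assumes each declared requirement’s name matches the module name you import (which isn’t true for e.g. scikit-learn vs sklearn); dedicated tools do this properly:

```python
import ast
import os


def imported_modules(project_dir: str) -> set:
    """Collect the top-level module names imported anywhere in the project."""
    found = set()
    for root, _dirs, files in os.walk(project_dir):
        for name in files:
            if not name.endswith(".py"):
                continue
            with open(os.path.join(root, name)) as fh:
                tree = ast.parse(fh.read())
            for node in ast.walk(tree):
                if isinstance(node, ast.Import):
                    found.update(alias.name.split(".")[0] for alias in node.names)
                elif isinstance(node, ast.ImportFrom) and node.module:
                    found.add(node.module.split(".")[0])
    return found


def possibly_unused(project_dir: str, requirements: list) -> set:
    """Declared requirements that never appear in an import statement."""
    return set(requirements) - imported_modules(project_dir)
```

Anything this flags deserves a look: either delete the dependency or satisfy yourself that it’s used indirectly (plugins, CLI tools, and renamed packages will produce false positives).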


14. Did you write some tests?

There may be good reasons you haven’t included automated tests in your submitted code. “I didn’t have time” and “I couldn’t be bothered” are not good reasons.

15. Do your tests pass? Do your tests pass without filling the screen with warnings?


16. Are your tests good examples of production code?

Testing code should be a first-class citizen. This means it should be documented, sensibly laid out, and easy to follow. Separating test data from the test code that exercises it is usually a sensible starting strategy.
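One sketch of that separation, using Python’s unittest (the function under test, slugify, is invented for the example): the cases live in a data table, the assertion logic lives in one place, and adding a case is a one-line change.

```python
import unittest


def slugify(title: str) -> str:
    """Toy function under test: lower-case, hyphen-separated."""
    return "-".join(title.lower().split())


# Test data is kept apart from the test logic.
SLUG_CASES = [
    ("Hello World", "hello-world"),
    ("  Already  spaced  ", "already-spaced"),
    ("single", "single"),
]


class TestSlugify(unittest.TestCase):
    def test_cases(self):
        for title, expected in SLUG_CASES:
            # subTest reports each failing case individually
            with self.subTest(title=title):
                self.assertEqual(slugify(title), expected)
```

Run it with `python -m unittest`. A reviewer can see your intended behaviour at a glance from the table, without wading through repetitive assertion code.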

17. Are you including tests you didn’t write, and if so, are they useful?

See point 8. A boilerplate test that performs a useful function is fine. The example test script that your framework included (`assert(true, "Hello world!")`) is generally worthless.


18. Can you explain what every line of your code is doing?

Developers have a nasty habit of taking a cargo-cult approach to interacting with their tooling, especially given the all-encompassing knowledge available on StackOverflow.

If you’ve copied-and-pasted any part of your code from StackOverflow, or library documentation, or it was generated for you by your tooling, or your linter told you to do it a certain way, make sure you know what — and can explain what — it does.

19. Have you avoided “cutting edge approaches”?

Libraries and approaches the reviewer of your test hasn’t seen before are more likely to elicit a response of WTF? rather than admiration for your use of cutting-edge tooling. Unless you’re reasonably certain the company uses — or is likely familiar with — any exotic libraries or approaches, your technical test is not a great place to show off your familiarity with them. Using Cucumber for your test suite will usually be a rash choice unless you’re being asked to show off your BDD chops.

Your technical test is also not the place to demonstrate you know how to use arcane language techniques. It’s not the place to introduce a library that overhauls your language’s entire OO system that was released 2 weeks ago. Limit your tooling to tools you probably wouldn’t need to ask permission to introduce into production.

20. Is your code robust?

For example:

  • Is there user-provided input at some point? Is your code easily broken by bad input?
  • Does your code degrade nicely with edge-cases?
  • Can I pass untrusted inputs into your SQL?
  • Can the user create a security issue by passing you HTML or JavaScript where you expect something else?
  • Do you need to consider text encodings? Have you done so?
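To make the SQL and HTML points concrete, here’s a minimal Python sketch (the schema and functions are invented for illustration): use parameterised queries so untrusted input can’t alter the SQL, and escape user-supplied text on output so it can’t become live markup.

```python
import html
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, bio TEXT)")


def add_user(name: str, bio: str) -> None:
    # Parameterised query: the driver quotes the values, so a name like
    # "x'); DROP TABLE users; --" is stored as plain text, never executed.
    conn.execute("INSERT INTO users (name, bio) VALUES (?, ?)", (name, bio))


def render_bio(name: str) -> str:
    row = conn.execute("SELECT bio FROM users WHERE name = ?", (name,)).fetchone()
    # Escape on output: a user-supplied <script> tag becomes inert text.
    return "<p>{}</p>".format(html.escape(row[0]))


add_user("mallory", "<script>alert(1)</script>")
print(render_bio("mallory"))  # <p>&lt;script&gt;alert(1)&lt;/script&gt;</p>
```

Neither of these is exotic — string-concatenated SQL or unescaped output in a test submission is exactly the kind of thing a reviewer will pounce on.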


21. Have you used the test to show off your best work, or did you rush through it?

Aim to do a portion of the test well, rather than desperately trying to complete the whole thing and running out of time. Well-presented, thought-out, commented and tested code that completes 3 of the 4 set tasks is often better received than the trappings of a new framework you’ve started building to solve all the problems they might ever encounter.

Most tests I’ve seen in the wild set an unrealistically short time-frame to complete all the tasks. I’ve yet to see anyone penalized for spending more time than they were allocated to produce good work — I’ve seen it commented on a few times by reviewers, but never penalized.

The code you’ve submitted will be seen as a proxy for the code you’ll produce when you work there, so bring your A game. In almost all cases, quality is more important than quantity.

And so…

Did I miss anything? Have you been caught out by making stupid mistakes in your technical tests? Do you see the same issues over and over in code you have to review from interviewees? Drop me an email to [email protected], I’d love to hear from you.

Published by

Peter is a former developer and CTO turned recruiter who wants to demystify the recruitment process so that developers can find jobs and get paid more.