
exercism.io — Crowdsourced code review, mentorship and learning

Recently one of my old co-workers sent me the URL of exercism.io. I looked at it and instantly fell in love.

In this article I will explain why I love the site and how you could benefit from using it too.

Prerequisites

To use exercism.io you will need:

  • the command line client
  • a GitHub account
  • a compiler or interpreter for the language(s) you want to use

This is not much: most developers already have a GitHub account, and if you work with a programming language you already have the compiler or interpreter needed to run your applications.

How it goes

It's easy. After setting up the command line client (the project's website has a detailed getting-started tutorial that covers the configuration) you can call

exercism fetch [<programming language>]

The programming language is optional. If you do not provide one, the CLI downloads the first exercise for every available language.

Every fetch also updates your already fetched exercises to the latest available version, and you can only fetch the next exercise to solve in the provided order.

While solving the exercises, take a closer look at the provided unit tests. Sometimes the majority of the tests are skipped or commented out (I've encountered this with Ruby and Elixir). This means your solution runs against only one particular test case, so it looks like it works perfectly when that is not really the case.

This is not true everywhere, though: the Java and Python tracks run all the test cases at once, so you see exactly what's missing in your code.
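To illustrate the skip mechanism the Ruby and Elixir tracks use, here is a rough equivalent sketched with Python's unittest. The exercise name, function, and expected responses are made up for illustration; they are not taken from an actual exercism test file.

```python
import unittest

# Hypothetical solution stub; exercism ships only the tests,
# the implementation is yours to write.
def hey(message):
    return "Whatever."

class HeyTest(unittest.TestCase):
    # Only the first test is active, so even a naive solution passes.
    def test_statement(self):
        self.assertEqual(hey("Tom-ay-to, tom-aaaah-to."), "Whatever.")

    # Later tests are skipped; you un-skip them one by one, test-first.
    @unittest.skip("remove this decorator when you get here")
    def test_shouting(self):
        self.assertEqual(hey("WATCH OUT!"), "Whoa, chill out!")

if __name__ == "__main__":
    unittest.main()
```

Running the file reports one passing and one skipped test; as you remove the skips one by one, the suite drives you toward the full solution.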

So let's say your code is OK and all tests pass. Then you submit your solution with the command line client.

After this the code is visible on the website (the CLI shows you the exact URL) and you can receive nitpicks.

Nitpicks are comments and suggestions for improving your code. Sometimes a remark can be partially wrong, either because it misses your idea of how to solve the problem, or because the reviewer is still learning the language and, say, doesn't know the module for working with regular expressions.

If you encounter such a nitpick, do not get angry. Think back to when you started to code: you made false assumptions too, and wrote code that was not quite readable.

Giving nitpicks

Once you have submitted your code, you not only receive nitpicks but can also give them to other developers who solved the same problem.

Again: when giving nitpicks, remember your own beginnings, when you used the concepts exactly as you had learned them. For example, an else branch is unnecessary if the if block already ends in a return statement, but some developers use the if-else construct anyway. In this case be friendly and make a kind remark about the redundant else.
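For instance (a made-up sketch, not from any exercism exercise), these two functions behave identically, but the second drops the redundant else:

```python
# With an explicit else branch:
def parity_with_else(n):
    if n % 2 == 0:
        return "even"
    else:
        return "odd"

# The else is unnecessary: if the condition holds we have already
# returned, so the fall-through line only runs in the "odd" case.
def parity(n):
    if n % 2 == 0:
        return "even"
    return "odd"
```

Both shapes are defensible, which is exactly why such a nitpick should be phrased as a friendly suggestion rather than a correction.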

Giving nitpicks also deepens your own knowledge of the underlying concepts. Martin Fowler says, too, that if you want to learn something you should teach it. So be friendly with the other developers.

What I am missing

Well, this is not the kind of love where all your wishes are fulfilled right from the start. I miss one little feature (and I think others feel the same): submitted exercises are not evaluated against the tests after submission.

This means I could submit any solution to the server and get the next exercise.

Of course the nitpicks would point out that I am doing something wrong (and it would hurt my reputation, if I cared about that).

I went through the open issues but could not find any about this topic in the general repository. Perhaps they are in the language-specific ones? Update: I've since been pointed to the issue I was looking for. I thought there had to be one, because I cannot be the only person wishing for such functionality.

Anyhow, I understand that some languages do not fit this approach easily, because the source code has to be compiled before it can be run rather than just handed to an interpreter. And with a growing number of users, starting an evaluation process for every submission could hurt the server's performance.

But this issue could be handled. The solutions don't have to be evaluated on the fly; it can take some time (at Coursera, for example, some homework assignments take around 20 minutes to evaluate). True, this would keep the user from downloading the next exercise until the evaluation against the unit tests had passed, but I would say that is a good time to wait and revisit the code. There is always a way to improve the solution or make it a bit cleaner.

And that's it. Of course there is one thing to consider: the test cases should be stored along with the submission, because they can change over time through pull requests, and it would be unfortunate to have your submission rejected because of a new version of a test case.
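A minimal sketch of how such deferred evaluation could look, assuming a simple in-process queue (all names here are invented; a real service would need sandboxing, persistence, and per-language runners): each submission is queued together with the test cases as they existed at submission time, so a later change to the tests cannot retroactively fail it, and a worker evaluates submissions off the request path.

```python
import queue
import threading

submissions = queue.Queue()
results = {}

def evaluate(solution, tests):
    # Stand-in for running the real unit test suite against the solution.
    return all(test(solution) for test in tests)

def worker():
    while True:
        item = submissions.get()
        if item is None:  # sentinel: shut the worker down
            break
        user, solution, frozen_tests = item
        results[user] = evaluate(solution, frozen_tests)
        submissions.task_done()

# The tests are frozen with the submission, so a later pull request
# changing them cannot reject already-submitted solutions.
frozen_tests = [lambda f: f(2) == 4, lambda f: f(3) == 9]
submissions.put(("alice", lambda n: n * n, frozen_tests))
submissions.put(None)

t = threading.Thread(target=worker)
t.start()
t.join()
```

The queue decouples submission from evaluation, which is what makes the "it can take 20 minutes" trade-off workable.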

I would be glad to help get this done for interpreted languages.

Conclusion

exercism.io is a nice way to learn new programming language concepts with the help of fellow developers, and to gain other perspectives on how to solve a problem.

And what I really liked: when I mentioned some concepts in my nitpicks, I later saw the developers I had mentioned them to passing those same concepts along to others.

So if you want to learn new ways of looking at problems and try a test-first development approach, give exercism.io a go. I am currently doing it with Elixir.

About the author

GHajba

Senior developer, consultant, author, mentor, apprentice. I love to share the knowledge and insights I gain through my daily work, which is not trivial -- at least not for me.
