I have always been fascinated by speed, from overclocking hardware to writing efficient algorithms, but I used to struggle with compiled programming languages and concurrency; I remember shooting myself in the foot with these concepts in my early days as a developer. For compiled languages I used to say: “well, this binary is fast enough, I don’t need more speed”; or for scripting languages: “Hey! it only runs once, I don’t need to speed things up”. But once I got involved in web application development, I found that speed matters. If your latency is greater than a second, your users will start to complain, so it is better to be fast.
Nobody complains if you make your code faster
I just want pretty badges
While I was building this site I wanted nicely styled badges for my GitHub account, but I could not find any service that I liked. So I decided to build my own badge system that pulls statistics from my GitHub repositories and presents them nicely. The problem was simple: fetch your repositories, get the programming languages for each one of them, and merge them into a single hash. Very simple, right?
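The merge step above can be sketched in Go (the original used a Ruby hash; the repository names and byte counts here are made-up examples):

```go
package main

import "fmt"

// mergeLanguages folds the per-repository language byte counts
// into a single map, summing counts for languages that appear
// in more than one repository.
func mergeLanguages(perRepo []map[string]int) map[string]int {
	total := make(map[string]int)
	for _, langs := range perRepo {
		for lang, bytes := range langs {
			total[lang] += bytes
		}
	}
	return total
}

func main() {
	// Hypothetical per-repo language stats, as the GitHub API returns them.
	repos := []map[string]int{
		{"Ruby": 1200, "JavaScript": 300},
		{"Go": 900, "Ruby": 100},
	}
	fmt.Println(mergeLanguages(repos))
}
```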
The first solution isn't always best
When in doubt benchmark
Experimentation and Second Approach
Although the Ruby algorithm was extremely simple, it took around 15 seconds to fetch the languages for all my repositories, in less than 20 lines of code; it looked like a good example to benchmark against Go.
I figured out that the latency was caused by the sequence of API calls: if I had 20 repositories and each request took around 185 ms, well, there was my bottleneck. So I decided to solve this using concurrency. For that I had to implement another get method that sends its result over a channel instead of returning a value, and fetch each repository’s languages using goroutines. The result was fantastic: the whole process took less than a second. I MADE IT!!
Concurrency in Go is painless, but you will have to be more declarative and write more code.