Programming


Swift vs Objective-C: the ultimate answer

Since 2014, the Swift programming language has captured the attention of developers and newcomers alike. But what about the old and reliable Objective-C, which for years has been the default language for iOS and OS X applications? Is Objective-C dead? Should new generations learn only Swift? Is Swift the right tool for everything? In this article, I will try to answer those questions by comparing the two languages across different categories.

> In this article I am not considering backend development with Swift.

No big framework, No problem

Recently you might have faced the need to learn a new frontend framework, just as you did last year. Unfortunately, this is a very common situation among web developers: spending too much time climbing the learning curve of a technology they won't even be using in their next job. A pure JavaScript solution is not very sexy, but with the right architecture, any developer can be productive from the first day.

What and where precision matters

It is well known that floating-point math loses some accuracy in every programming language. That is why a statement like 0.1 + 0.2 != 0.3 evaluates to true. This is explained in detail here, but basically it happens because some fractional numbers cannot be represented exactly in binary, so each operation carries a small rounding error even though the computation itself is deterministic. There are several ways to mitigate this effect; the best known are using double-precision numbers or big-decimal representations. The problem with those is that the operations tend to use more computing resources, especially with BigNumber-style types, which are usually objects backed by large strings and complicated math.
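A minimal Swift sketch of the effect described above. Foundation's Decimal is used here as one example of a base-10 representation that avoids the binary rounding error, at the cost of heavier arithmetic; the tolerance value is an arbitrary choice for illustration.

```swift
import Foundation

// Binary floating point: 0.1 and 0.2 have no exact base-2 representation,
// so the accumulated rounding error makes the exact comparison fail.
let sum = 0.1 + 0.2
print(sum)                // 0.30000000000000004
print(sum == 0.3)         // false

// One common mitigation: compare against a tolerance instead of exact equality.
let tolerance = 1e-9      // assumption: an acceptable error for this example
print(abs(sum - 0.3) < tolerance)   // true

// Another: use a base-10 representation such as Foundation's Decimal.
// The comparison is now exact, but the arithmetic is slower and heavier.
let decimalSum = Decimal(string: "0.1")! + Decimal(string: "0.2")!
print(decimalSum == Decimal(string: "0.3")!)    // true
```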

Divide and... conquer?

We have all been taught that “divide and conquer” is a great technique for solving almost any problem. Well, sometimes in practice it is harder than we thought. Here is a case study of how our team applied this concept and it still wasn't enough; we needed to divide even further and tweak some other things.

The problem

In the beginning, the developer in charge of creating an algorithm to compute the mean value of some data points did not keep in mind the massive amount of data, and the memory it would require, that this piece of code would have to process in the future.
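To illustrate the memory issue, here is a hedged Swift sketch, not the team's actual algorithm: a naive mean that loads every data point into an array, contrasted with an incremental running mean that only keeps a count and an accumulator, so the data can arrive in chunks. The chunked input and values are hypothetical.

```swift
// Naive approach: hold every value in memory, then average.
// With massive data sets the array itself becomes the bottleneck.
func naiveMean(of values: [Double]) -> Double {
    guard !values.isEmpty else { return 0 }
    return values.reduce(0, +) / Double(values.count)
}

// Streaming approach: a running mean in the style of Welford's algorithm.
// Only the count and the current mean stay in memory, so the data can be
// divided into batches and still produce a single result.
struct RunningMean {
    private(set) var count = 0
    private(set) var mean = 0.0

    mutating func add(_ value: Double) {
        count += 1
        mean += (value - mean) / Double(count)
    }
}

// Hypothetical usage: process the data points one chunk at a time.
var stats = RunningMean()
let chunks = [[1.0, 2.0, 3.0], [4.0, 5.0]]   // stand-ins for batches read from disk
for chunk in chunks {
    for value in chunk {
        stats.add(value)
    }
}
print(stats.mean)   // 3.0
```

The streaming version trades a single pass over the data for constant memory, which is usually the first step before splitting the work across processes or machines.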