Have you noticed how more and more projects use Python where in the past they would have used C/C++? I’m certainly seeing it.
Don’t get me wrong, I love Python, it is so much more enjoyable to code in than embedded C. I’m not here to complain that “it was better in the old days”. I would just like to reflect a bit on whether embedded programming is bound to die or not.
Low level not required anymore
Having learned to code for the first time using assembler on an 8-bit microcontroller, I was able to see the evolution of programming languages and techniques up to today. Back in the day, having an OS on your microcontroller was ridiculous: way too big. Nowadays it is more and more difficult to find a project that doesn’t use an OS (FreeRTOS, for example).
The fact is that while Moore’s law might be dead for big processors, it is still going quite well for the smaller ones. Every generation brings more processing power, more memory, more connectivity. Modern high-end microcontrollers have 100 MHz processors, USB and Ethernet. At that point it is not a micro anymore, it is a small computer.
With a more powerful target, you can start using more abstract languages: going from assembler to C, or from C to C++, maybe even Python.
That process works both ways: electronics are getting more and more powerful, but computers are also getting smaller and smaller (the Raspberry Pi, for example). Why bother developing custom electronics when you can get a small, powerful computer very cheaply?
Big data to the rescue?
At the same time as I see this unfolding at the low level, we see more and more companies managing huge amounts of data, and more generally projects where having the best hardware is a given, but you want to get as much as possible out of it.
In those cases I can see more and more applications for the concepts deployed in low-level programming. Memory management, algorithm optimization, network optimization: all of these are part of the daily tasks of the low-level developer.
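As a toy illustration of how that low-level mindset carries over even into Python (variable names are mine, the numbers will vary by interpreter): the same sequence of values can be held fully in memory or produced lazily, and a developer who thinks about resource management will reach for the second form when the data set is large.

```python
import sys

N = 1_000_000

# Naive approach: materialise every value in memory at once.
squares_list = [n * n for n in range(N)]

# Resource-conscious approach: a generator yields values lazily,
# so its memory footprint stays constant regardless of N.
squares_gen = (n * n for n in range(N))

print(sys.getsizeof(squares_list))  # megabytes
print(sys.getsizeof(squares_gen))   # a few dozen bytes
```

Nothing exotic, but it is exactly the kind of trade-off an embedded developer makes instinctively, applied to big-data code.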
Moore’s law keeps applying at the low level, giving us more and more powerful hardware, which allows us to use more powerful methods and languages. At the other end of the spectrum, bigger and bigger data sets force us to keep the low-level knowledge of resource management for those applications.
Another killer of low level could be the smartphone: why build a new electronic device when an app fills the need? I wrote on that subject here.