I remember when software development was much simpler in the earlier days. By earlier days I mean over twenty years ago…
Taking a computer science degree at university meant you had to be good at maths, with a logical mind, to write code and algorithms from scratch. You needed to know the hardware resources available. I remember having to write a program at the register level! You had to be very frugal with storage, memory and CPU. Memory leaks were the bane of my life. Didn't like the way it had to be coded? Tough luck.
There were no software frameworks, fewer programming languages to choose from, and that career path called 'product management' didn't exist. Anything graphical was a text-based green screen (if you were lucky). Build engineers were employed full time to merge code changes. Everything was driven from the command line. DevOps? What's that? Oh, and waterfall was the only methodology (rapid prototyping didn't count).
Fast forward to today. We have all of the above and more! Moore's Law could just as well apply to the rate of change in the software world, and to the learning needed to keep up. Don't like the language or tool you're using? Swap it out. No wonder we have to be agile.
No single software engineer knows it all anymore. Knowledge of the 'full stack'? Highly unlikely. Having both depth and breadth? Not possible. If someone says they do, they're stretching the truth. It's more likely that they have a sliver of understanding given their experiences. These days you need more experts than ever before to arrive at a correct solution.
Even with all this change, the great thing is that the principles of software development have remained virtually unchanged. Because of that, you can jump from one technology or tool to another, making inferences and educated guesses and weighing up the pros and cons. Some call that being fickle. I see it as having a bigger, better toolbox from which you can choose the right tool for that moment in time.
One thing that's vastly improved is our creativity in naming software 'stuff', be it new languages, frameworks and more. Just look at Ruby with its gems, or Chef with its cookbooks.
Oh, and on the subject of names, there was a subject I studied at uni called neural networks. I should have paid more attention to it back then, but it wasn't a cool enough name. I never thought that writing a program to automatically recognise geometric shapes would be useful. But now that neural networks are part of Artificial Intelligence, I'd better hit the books. I mean ebooks.
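For old times' sake, here's a minimal sketch of the kind of thing we coded from scratch in that class: a single-neuron "network" (a perceptron) that learns to separate two classes of points. The dataset and numbers below are entirely invented for illustration, not anything from my actual coursework.

```python
# A toy perceptron, written from scratch -- no frameworks, just like the old days.
# It learns a linear boundary between two classes of 2D points.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Classic perceptron learning rule: nudge the weights on each mistake."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(samples, labels):
            pred = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            err = y - pred          # 0 if correct; +1 or -1 on a mistake
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Invented toy data: "small" shapes (class 0) vs "large" shapes (class 1),
# described by two made-up normalised features.
samples = [(0.1, 0.2), (0.2, 0.1), (0.8, 0.9), (0.9, 0.8)]
labels = [0, 0, 1, 1]
w, b = train_perceptron(samples, labels)

def predict(x1, x2):
    return 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
```

A few dozen lines, no imports, and it fits in your head: that was the appeal of writing these things yourself.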