Software, not hardware, will drive performance in the AI era, so Rust, Go, Swift and Dart could be the best options to future-proof a dev’s career
Below is a list of emerging trends in technology that, as developers, we should probably keep in mind both in our professional lives and when designing our career paths.
Moore’s trend (please do not call it a law) is over. CPUs will start getting physically bigger again: part of the Apple A11’s secret sauce over Qualcomm’s Snapdragon 835 can be attributed to its larger surface area
Optimization is where enhanced performance will come from, and it will become a bigger and bigger deal, since hardware won’t be there to cover software’s shortcomings anymore (and, by the way, we might actually end up with systems that perform worse because of the Meltdown and Spectre mitigations)
Hardware fatigue might ensue. Since incremental enhancements in smartphones and laptops basically come down to revamped designs and camera features, manufacturers will need to up their game to justify the purchase of a new version of a gadget. Specifically:
AI-based software features (that might just conveniently require a specific chip) are where gadget manufacturers will look for growth. The Pixel 2 embraces this trend: a higher DxOMark score for the camera won’t cut it anymore
AI-based assistants and related features will become central: at the moment Google seems the best-placed player in mobile, and Alexa in the smart home. Apple’s Siri is years behind the incumbents, which raises the question of whether Apple went to market too early with its AI assistant, a very non-Jobs move for a company famous for arriving late to the game with the intention of getting it perfectly right. As for Samsung, it seems the most requested Bixby feature is the ability to remap the button that triggers it to the Google Assistant
If the above trend continues, expect the emergence of an open-source assistant. “But… what about the seed data to train the machine-learning algorithms?” Well, what about a distributed system like this?
AI-based OSes will ensue. Enter Google’s Fuchsia, an operating system with all the ambition of becoming the new Android for mobile, desktop and wearables alike (“this is the year of convergence” has become the new “this is the year of desktop Linux”).
Fuchsia looks very interesting for several reasons:
It has a new UI paradigm based on a Google Now-like feed organized around Stories and Modules: it remains to be seen whether this will end up like Microsoft’s famously tragic Windows 8 move, when they tried to remove the desktop only to put it back shortly after
Fuchsia is a clean slate: it’s neither Linux nor Unix. It uses a new kernel called Zircon, and as such it won’t have to bear any legacy “features”
Fuchsia is being built with, and will support, performance-first programming languages (promoting good coding practices). Specifically, the old behemoths are still there (C and C++), but so are the new cool kids on the block (Rust, Go and Swift). Plus, its UI is built on Flutter, which uses the statically typed Dart programming language
Regardless of Fuchsia’s success (it will likely be released in 2019 or 2020), I personally believe it is very wise for any developer wondering what to add to their dev toolbelt to observe whatever programming languages Fuchsia supports and learn them:
If you are into front-end development, you should learn TypeScript (used in both React and Angular) and keep a very open mind about Dart and Flutter for unifying a mobile codebase (Ionic, Cordova and any WebView-based framework won’t withstand the test of performance)
If you are into back-end microservices, Go should definitely be part of your toolkit. I personally find some Rust concepts utterly over-complicated for the server realm, but hey, security first, right?
If you are into IoT or embedded systems, besides the usual C and C++, Rust should be on your horizon