2020: Trends and predictions for technology and IT
Predictions are always hard to make, especially in technology. Or are they? Machine Learning is getting better and better at making predictions, while IoT is reversing common approaches to Cloud Computing even as it builds on it. This article aims to give insights into what may (or may not) happen in 2020.
Cloud: Cloud and Edge computing, containerization (and the K-word)
Cloud computing has revolutionized the way large applications, datasets, databases, and almost everything else are deployed and managed. It is safe to assume the trend will continue throughout 2020. While some innovations that gained momentum in recent years are now less prominent (such as OpenStack), others, *cough* Kubernetes, are steadily gaining traction.
As Cloud computing continues its advance, containerization currently seems to be the chosen path forward. Docker will probably continue to lose its predominant position in the field while alternatives (such as Podman) continue to erode it. Docker Swarm currently lags behind Kubernetes, its use cases are narrow, and even Docker EE now bundles Kubernetes; it is likely this trend will continue. On a side note, there is a subtle trend of workloads returning from the cloud to on-premises, and it is rather difficult to predict where it will lead.
Edge computing moves processing as close as possible to where the data is produced. In the past decades we have seen a trend in which data (and the software using that data) were consolidated into facilities, data centers and, more recently, the cloud. While this allowed for simpler approaches, with the advent of the Internet of Things and smart devices it has become difficult, and sometimes impossible, to gather the data coming from appliances and process it in a centralized fashion. Inverting this process resulted in Edge computing; technologies like 5G and inexpensive units such as (but not limited to) the Raspberry Pi further boost this trend.
Although Edge computing is slowly and steadily gaining attention, a solution to rule them all has yet to appear. As a matter of fact, there are a few open source IoT/Edge platforms that aim to speed up the adoption of Edge computing, but none of them is currently mature enough to take the leap. As a result, most IoT/Edge solutions are proprietary and/or built in-house.
Data science: Artificial Intelligence, Machine Learning and Deep Learning (and Big Data)
"Software is eating the world" (Marc Andreessen)
I would say that at this point software has already eaten the world. What software is doing right now is learning about what it swallowed, the very world we live in: Artificial Intelligence (in practice, ML). AI is almost undoubtedly the hottest topic in IT right now, and it is highly likely this will continue, or even grow further, in 2020.
Within AI, Machine Learning is the field that has reached the best balance between results and the effort required to achieve them. ML techniques and models are becoming more and more pervasive and precise. From email classification to sentiment analysis, ML is gaining momentum, and more and more programmers are being exposed to AI and ML concepts. On top of that, Business Intelligence is slowly but steadily adopting ML. What would happen if e-commerce were to benefit from mainstream ML? Hard to say, and hard to see happening in 2020, but not entirely out of the race.
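To make the email-classification example above concrete, here is a minimal sketch of a multinomial Naive Bayes text classifier in plain Python. The training data, labels and function names are purely illustrative, not taken from any particular library:

```python
import math
from collections import Counter, defaultdict

# Tiny illustrative training set: (text, label)
TRAIN = [
    ("win money now free prize", "spam"),
    ("free offer click now", "spam"),
    ("meeting agenda for monday", "ham"),
    ("project status and next steps", "ham"),
]

def train(samples):
    """Count word frequencies per label (multinomial Naive Bayes)."""
    word_counts = defaultdict(Counter)
    label_counts = Counter()
    vocab = set()
    for text, label in samples:
        words = text.split()
        word_counts[label].update(words)
        label_counts[label] += 1
        vocab.update(words)
    return word_counts, label_counts, vocab

def predict(text, word_counts, label_counts, vocab):
    """Pick the label maximizing log P(label) + sum of log P(word|label)."""
    total = sum(label_counts.values())
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        score = math.log(label_counts[label] / total)
        n = sum(word_counts[label].values())
        for word in text.split():
            # Laplace smoothing so unseen words don't zero out the score
            score += math.log((word_counts[label][word] + 1) / (n + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

model = train(TRAIN)
print(predict("free money prize", *model))       # → spam
print(predict("monday project meeting", *model)) # → ham
```

Real-world pipelines (scikit-learn, for instance) add proper tokenization, TF-IDF weighting and evaluation, but the underlying counting idea is the same.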
Deep Learning is the application of concepts from neurology to problems and algorithms, essentially Neural Networks. Deep Learning is currently going through an accelerated phase of its existence: while using Neural Networks for problems already addressed by ML may be overkill, producing reliable, reusable networks for complex problems is still the prerogative of an elite few. Open frameworks such as TensorFlow 2 are making DL easier to approach and learn; nonetheless, the previous sentence still holds true. It is likely Deep Learning will continue its growth in 2020, but it is very unlikely to become as pervasive as ML currently is (mind that ML is not THAT pervasive).
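To illustrate the building block behind those Neural Networks, here is a sketch of a single artificial neuron (a perceptron) trained to learn the logical AND function. This is illustrative pure Python under simplified assumptions, not TensorFlow; real networks stack many such units and use gradient-based training:

```python
# Training data for the logical AND function: ((x1, x2), target)
DATA = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

def step(x):
    """Threshold activation: fire if the weighted sum is non-negative."""
    return 1 if x >= 0 else 0

def train_perceptron(data, lr=1, epochs=10):
    """Perceptron learning rule: nudge weights toward the target on each error."""
    w, b = [0, 0], 0
    for _ in range(epochs):
        for (x1, x2), target in data:
            output = step(w[0] * x1 + w[1] * x2 + b)
            error = target - output
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

w, b = train_perceptron(DATA)
# The learned weights reproduce the AND truth table
print([step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in DATA])  # → [0, 0, 0, 1]
```

A single neuron can only separate linearly separable classes; the "deep" in Deep Learning comes from layering many of them, which is exactly where frameworks like TensorFlow 2 earn their keep.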
Big Data is in decline: while AI/ML/DL benefit from huge amounts of data, the means to store and process such data in an organized fashion are steadily being overshadowed by the importance of AI algorithms. The merger of Cloudera and Hortonworks is pretty indicative of how the market is shrinking. Despite this, Apache Spark is still relevant, and while not as hyped as before, it remains the go-to tool for Big Data processing. Interest in Hadoop is almost entirely gone, and the software is now regarded as a brick on which to build AI/ML. Meanwhile, other solutions such as Ceph compete with Hadoop in a market that is not what it used to be three or four years ago.
IoT, smart devices, Blockchain, security et al
The Internet of Things is about extending the Internet into everyday life, placing Internet-enabled appliances where most wouldn't look; enabling people to leverage pervasive technology while keeping it hidden. IoT has been a trend for quite a while now and it is likely to keep a steady, if decreasing, pace. From smart cities to smart homes and wearable devices, IoT has the potential to revolutionize the way we live every day. But that is unlikely to happen soon: the lack of a common platform and the need for different protocols and hardware (e.g. LoRaWAN-enabled devices) are currently hampering IoT's ability to become widespread. In 2020, IoT will probably benefit a lot from 5G, while ML/DL will continue to thrive and will probably become a core part of future IoT solutions.
Cryptocurrencies have suffered from market fluctuations over the last two years: while Bitcoin has become widely known and its usage has risen dramatically, the hype around cryptocurrency has begun to fade (among people not interested in tech/decentralization). What hasn't faded is the Blockchain technology, which is "secretly" being used by many organizations to build the most disparate things. Making predictions about cryptocurrencies is hard, but it is likely that interest in Blockchain will continue to wane or stagnate.
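The core idea behind Blockchain, independent of any cryptocurrency, can be sketched in a few lines: each block commits to the hash of the previous one, so tampering with any block breaks the chain. The sketch below is a hypothetical minimal illustration (names and data are invented), not any organization's actual implementation:

```python
import hashlib
import json

def make_block(data, prev_hash):
    """A block commits to its data and to the hash of the previous block."""
    payload = json.dumps({"data": data, "prev_hash": prev_hash}, sort_keys=True)
    return {
        "data": data,
        "prev_hash": prev_hash,
        "hash": hashlib.sha256(payload.encode()).hexdigest(),
    }

def verify(chain):
    """Recompute every hash; any tampering breaks the links."""
    for i, block in enumerate(chain):
        payload = json.dumps(
            {"data": block["data"], "prev_hash": block["prev_hash"]},
            sort_keys=True,
        )
        if block["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False  # block contents no longer match its hash
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False  # link to the previous block is broken
    return True

# Build a three-block chain, then tamper with the middle block
chain = [make_block("genesis", "0")]
chain.append(make_block("payment A->B", chain[-1]["hash"]))
chain.append(make_block("payment B->C", chain[-1]["hash"]))
print(verify(chain))  # → True
chain[1]["data"] = "payment A->Mallory"
print(verify(chain))  # → False
```

Production blockchains add consensus (proof of work or stake), signatures and distribution on top of this hash-chaining, but the tamper-evidence property shown here is what the enterprise use cases build on.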
Security is hard to pin down, since it can mean many different things in the field. What is likely to happen is that security in AI/ML/DL, as well as IoT security, will become more important. Security is an evolving field, and ML/DL algorithms applied to security have only scratched the surface.
Hardware is likely to get a lot more attention after all the problems that were discovered last year. Open hardware initiatives will surely be taken much more seriously. RISC architectures will continue to gain attention, while it is unlikely that ARM will surpass x64 inside data centers.
Robotics, bioinformatics and health informatics continue to grow, at times slowly. It is unlikely that any of these fields will abruptly gain momentum in 2020.
(Bonus!): Programming languages for 2020
- The best overall: Python. From Data Science, Cloud computing and IoT to automation and websites, Python is everywhere; you might as well invest time in learning it.
- Best for Data Science: Python, R. While Python is a general-purpose language used in Data Science, R is a language built to work with data, and one of the most popular choices when it comes to Data Science.
- Best evergreen: C/C++. You may be puzzled to hear this, but C/C++ programmers are still needed in many industries, especially robotics and manufacturing. While it may not be the best investment of your time, C/C++ are not going to disappear any time soon.
- Best for enterprise: Java. Java's presence in the enterprise is still strong, and while many enterprises are now transitioning to microservices/containers, Java is still there to withstand enterprise workloads and to aid the transition.
- Bonus: Go. If you’re interested in a lightweight, easy-to-learn, easy-to-use, easy-to-parallelize language, Golang is for you.