The holiday season and the first few weeks of every year are a time to reflect on the old and get ready for the new, with no shortage of predictions about what the future will bring. One of the things “the old” can teach us is how bad we are at predicting the near future. From the US presidential election to the Chicago Cubs winning the World Series and Leicester City being crowned Premier League champions, 2016 was a particularly tough year for Nostradamus wannabes.
When it comes to science and technology, we often overestimate how fast we’ll achieve the next big thing, but we tend to get it right in the very long run. We know what the future will look like; we just don’t know exactly when it will arrive. From “The Jetsons” to the campy 1966 Batman TV series, inventions that looked impossible back then are trivial today. Granted, Batman’s GPS was a bit cumbersome (see below), but hey, nobody is perfect.
The last couple of years will go down as the time when the world woke up to the promise of machine learning and robots, or perhaps as the time when the hype around them approached its peak: apparently, these have become mandatory levers for securing generous funding for any new venture, from smart clocks to intelligent medical devices.
In the not-so-distant past, we feared that manufacturing jobs would be replaced by automated production lines and that certain jobs would move overseas where labor was cheaper, but we thought that work in areas such as consulting, legal advice, and hairdressing was safe and would remain local. As we move fast and furiously through the digital age, that certainty is vanishing. We are seeing strong evidence that machines can learn to do almost anything, given enough data and good algorithms. There’s an argument to be made that machines still can’t learn the way humans do, and that they lack intuition and creativity, but advances in unsupervised learning suggest that even that last frontier may be crossed at some point.
When you think about it, this is nothing new: one of the surest predictions anyone could have made since the beginning of civilization is that the workplace of the future will be very different from that of the present. Hunter-gatherers became farmers, who then moved to the city as artisans, then factory workers, and finally office dwellers. The workplace has always been changing, but transitions that used to take generations or centuries are now happening in a matter of years. While the traditional workplace moved at the pace of Gone with the Wind, its digital incarnation looks more like a time-lapse video.
What this actually means is that most of our jobs, as they currently exist, are doomed. Some will disappear over the next few years; others will stick around much longer, but we don’t know their exact expiration dates. Oddly enough, this should not lead to despair or a dystopian future. By acknowledging that this change is inevitable, we can do what we have always done and prepare for a future in which jobs keep disappearing but careers live on, as new jobs and new needs are created along the way. We’ll move from a centuries-old model in which we learn a trade as youngsters and keep the same job for life to one that seems much more interesting: learning will be a continuous, lifelong activity, and we’ll eagerly anticipate what our next job will look like. By then, nothing will sound more outdated than saying that “you can’t teach an old dog new tricks,” and that’s a good thing. Old and new dogs alike will soon thrive on learning new tricks.