Sat Feb 22, 2020
Last month I took stock at the start of a new decade and had some fun thinking about both the previous and the next ten years in tech. Because it was so much fun, I’d like to continue last month’s theme and go a bit deeper into the tech retrospective.
I looked back on some old blog posts from the last decade where I made some sort of prediction to see if I got anything right. I mostly didn’t. Being a soothsayer is, quite frankly, really hard.
So, instead, here are some predictions I got wrong.
You’ve got mail
In 2012, I predicted that personal email would fade away. I gave some very good reasons for email’s eventual decline. But I was wrong.
Email is far from dead - on the contrary, email newsletters have made a comeback. What happened?
I believe the main reason is that many people have retreated to email as a safe haven. They are tired of the nonsensical bombardment of information in social media. Your social media feed is no longer a simple chronological timeline - algorithms now determine what you see. And more importantly, what you don’t see.
Privacy issues have also come to the fore of late. The social media honeymoon is over - people are now much more wary of how their data is being sold and manipulated.
For marketers, engagement is no longer the Holy Grail it once was. Public opinion is fickle and mass messaging that misses the mark can backfire and turn ugly. Just look what happened to Tim Hortons when they tried to offer free coffee to Meghan and Harry if they moved to Canada.
Furthermore, the number of “followers” or “friends” you have does not translate to loyalty or benevolence. Donald Trump, for instance, may have a lot of followers on Twitter, but only because Americans are wisely following the old adage to “keep your friends close, but your enemies closer”.
So, yeah, email isn’t going anywhere. Unlike certain RSS readers that I won’t mention. I still enjoy using an RSS reader, but most people don’t. I can’t blame anyone for falling back to the inbox they’ve been trusting since 1972.
The unlikely content-creation machine
Six years ago, I thought that Netflix would greatly benefit from streaming rentals (paying à la carte for newer titles in addition to the monthly subscription fee). I still think it was a decent idea at the time, but it’s now obvious that Netflix doesn’t need to do this at all.
Introducing additional paid options would only muddy the waters. It would also dilute their value proposition as a seamless one-stop buffet of streaming goodies. Not to mention that the licensing arena has become increasingly hostile. Disney, for example, having just launched its own streaming platform, Disney Plus, has yanked the rights to its movies back from Netflix. It seems unlikely Disney would ever let Netflix rent out anything of theirs.
Yes, Netflix still has a lot of bad B movies and other stinkers that failed at the box office. But their latest strategy is one that I failed to take seriously in 2014 - making their own shows and movies. Since then, Netflix has pumped billions of dollars into churning out their own content. House of Cards was just the beginning of a “produced by Netflix” avalanche.
Some of their movies and shows are really bad. Some of them are really good. At the Golden Globes this year, Netflix got a record-setting 34 nominations. They won only two.
However, Netflix doesn’t really care if their content is good or bad, as long as there is always something new to watch. They are winning on sheer output alone. Other competitors can’t hope to outpace their content-creation machine. Even Disney Plus, which got off to a great start with millions of subscribers, doesn’t have much original content to watch after you’re done with The Mandalorian.
So, ignore my advice and carry on, Reed Hastings - you’re doing just fine. I just wish you hadn’t cancelled all the Marvel shows I liked (RIP Daredevil) - that was petty.
Cars are…still cars
Also six years ago, I imagined a post-PC era where you could dispense with the PC and seamlessly hook your phone up to a keyboard and monitor to do some serious computing.
This hasn’t happened, and isn’t likely to anytime soon.
Modern phone hardware is more than capable - feast your eyes on the specs for Samsung’s recently announced Galaxy S20: an SoC that clocks up to 2.84 GHz and up to 16 GB of RAM. That’s actually double the RAM I have on my workhorse PC at work.
The problem is with the software. Nobody has the vision or the resources to create a hybrid UI that scales and adapts to different screen sizes and input methods. And a dual-boot option is not user-friendly, especially if data isn’t shared seamlessly between equivalent applications on each platform.
Speaking of Samsung, they did come up with a desktop interface for your phone called DeX, but it seemed half-hearted at best. Dieter Bohn had a good take on it last year - his conclusion was that “we built a different future than the one we imagined”.
Instead of making your phone the central hub, it’s easier to keep your data in the cloud and access it on whatever platform you’re using, whether it be a computer, an iPad, or a TV. The emphasis isn’t on horsepower anymore - it’s on latency: how fast you can stream your data to your terminal.
Perhaps in the far future a subdermal microchip implant will seamlessly sync our data with our computer, phones, earbuds, smart glasses, etc. on a 10G network. A truck will still be a truck and a car will still be a car (albeit a much faster car), but the driver will seamlessly switch between vehicles and never lose speed.
Tick tock Apple Watch
When smartwatches first started flourishing in the last decade, I wasn’t sure what problem they were solving. I don’t think anybody did. I think the only reason everybody aggressively started putting them out is that there was a rumour that Apple was making one.
Well, the rumours of course were true. However, when it was first announced in 2014, my initial impression of the Apple Watch wasn’t exactly enthusiastic. The first Apple Watch was an ungodly mess - trying to be everything to everybody and sporting a half-baked UI (that was completely overhauled a few years later).
I was right about one thing at the time - fitness tracking is the only reasonable use for a smartwatch. I even picked up a Samsung Gear Fit 2 a few years ago to track my runs. In retrospect, buying a Samsung device for use with my iPhone was a terrible idea because they don’t play well together. But I still use it - mostly just to tell time and to sneak a peek at my phone notifications while sitting in church (if my Gear Fit is in the mood to show notifications, that is).
Apple has wisely focused on health over the last five iterations of their smartwatch. Potentially life-saving features such as fall detection, heart irregularity notifications, high-decibel warnings, and the ECG app are genuinely useful. However, despite the Apple Watch becoming incrementally more useful every year, I can’t shake off my disappointment at how slow progress has been since it launched over five years ago. Here’s what I predicted when the first Apple Watch came out:
In a few years, smartwatches will be thinner, have better battery life, sport a more diverse design, and be more autonomous (no tethering to a smartphone in your pocket).
Looking at the Apple Watch, the opposite has happened. The newest Apple Watch (Series 5) is thicker than the original (by 0.2 mm). Battery life is still limited to one day, and the Series 5 actually fares worse because of its always-on display. Hardware designs are less diverse than before now that Apple has ditched the more expensive metals nobody was buying (not that the $10,000 Gold Apple Watch was ever going to be a bestseller).
And finally, the Apple Watch must still be tethered to an iPhone to work properly. There is a cellular version available now, but it’s not completely autonomous. A standalone Apple Watch is probably something Apple is still working towards. It would almost certainly expand the potential market in a significant way. But it’s not happening anytime soon.
Honourable mention: RIP Cyanogenmod
Android’s miserly two years of software support was one of the reasons I switched to an iPhone in 2015. My 6S is still on the latest version of iOS, and support for the next version is looking good too.
However, when I was using Android, I tried to keep the software fresh by flashing third-party ROMs onto my HTC One S. The best ROM out there was Cyanogenmod, and I had high hopes for its continuing success:
I think that Cyanogenmod is just getting started, and will ride a wave of support from budget-conscious and environmentally-aware people who are pushing back against the two-year upgrade cycle.
Sadly, this never happened. A number of bad decisions led to the company’s demise at the end of 2016. Flashing ROMs hasn’t gotten any easier, but the good news is that Cyanogenmod was forked and lives on in LineageOS.
What hasn’t changed: Google and other Android device makers still offer only two years of software updates.