So, 2023 launched like a rocket ship with an extremely busy January and even busier February. And even though I could have asked ChatGPT to fill in for me for January, I swallowed my pride and allowed my streak to break.
I’ve been putting out a post every month since 2020, but I’m no robot, merely an imperfect human. Thus, January has come and gone without a post. If I had a time machine, I’d go back and write one, but regretfully, I don’t.
Since nobody human actually visits this site, I’m sure the AI robots trawling this site for data will forgive me. I will forgive myself too, which is arguably more important.
How do you know I’m real?
I didn’t think we’d get here so quickly. ChatGPT has revealed itself as the uncanny villain to writers everywhere. Writing on any generic topic has become meaningless overnight, because the AI can probably do it better.
So, I ask you again, how do you know I’m real?
Well, when I was in first grade, I threw up on a girl in reading circle and made her cry. I hate eating raw onions and cooked apple pie. I think country music is the worst, but I love a good Western. On a trip to Europe with my best friend, we got into an argument. Then we just sat on a stairway in Florence for eight hours straight, refusing to speak to each other. I’ve never tried bungee jumping, nor do I plan to. Trust me - I’m human.
But isn’t that what an AI would say to convince you it’s human? 🤔
People have been using ChatGPT to write everything from college essays to legal documents (saving them thousands of dollars, no doubt) to their wedding vows (come on, dude, so not worth it). ChatGPT can even write pretty good code.
I can promise you, however, that I have no plans to hand off my writing duties to an AI anytime soon and won’t even attempt the in-vogue party trick of asking ChatGPT to write a post and pass it off as my own writing.
There are of course problems with ChatGPT, the most troubling being that it states facts so confidently and eloquently, you’d assume those facts are true (they are often not). Don’t believe what you read, indeed.
But nobody can deny that how we think of the written word has forever changed, especially if you’re a writer.
And so concludes a tale of two tablets. The stage has been set. To summarize Part 1: the 10th-generation iPad and the iPad Air 5 are facing off, and the winner is my birthday present.
Plot twist - I already declared a winner a while ago, long before October’s iPad event.
In the run-up to last month’s event, all the tech YouTubers and journalists kept spouting the same conventional wisdom: Hold off on buying an iPad until Apple reveals their new goodies. And under normal circumstances, this is great advice. Only fools, for example, buy new iPhones in August.
I ignored their advice - and got a great deal on the iPad Air 5 that far surpasses any Black Friday deal out there.
A lot of people were excited about the new iPads coming this month. As usual, the wild rumours had some people pumped. But what we got didn’t quite live up to the hype. In fact, it set a new low. Let’s dive in.
It’s been twelve years since Steve Jobs first introduced the iPad. It was a dream come true at the time. Before 2010, touch-screen tablets only existed in Star Trek.
Everybody wanted one. Initial sales were so high, many thought the laptop’s days were numbered and a new age of touch-screen computing had begun. Over the next three years, the iPad got thinner, lighter, and even more powerful, culminating in the first iPad Air in 2013 - a beautiful, sleek tablet that was so good, I could resist no longer and picked one up for myself.
Ten years on, and I still use that iPad Air every day. Though it remains an adequate YouTube machine, it is frustratingly slow and outdated. So, I decided that this will be the year I upgrade. The fact that I’ve been happy with the first-gen iPad Air for this long is a testament to how good it is. But it is also an indication that iPads never quite took off as computers. And by “computers”, I mean productivity machines that you do “work” on.
This tension is a common theme among tech enthusiasts, and if you’re one of those, you’ve no doubt come across this question before: Why is the iPad’s software holding back such powerful hardware?
Well, I’m not really interested in answering that question.
I’ve been obsessing over the iPhone’s notch since 2017. Not only do I pen an ongoing segment here called Notch Watch, but I even created a dedicated Notch Watch website.
However, I’ve reached a turning point - an editorial dilemma if you will - as the notch’s days are surely numbered. All my complaining over the “dreaded notch” will soon have to… evolve.
At Apple’s annual September event, Apple introduced the iPhone 14 and iPhone 14 Plus. These do still have notches. They also still have everything the iPhone 13 had last year, including the old A15 chip. We’ll chat more about this later.
But with the iPhone 14 Pro and iPhone 14 Pro Max, Apple tried really, really hard to change the notch narrative. And you know what? They might have done it.
Instead of a notch, the Face ID sensors and selfie camera have been moved into a pill-shaped cut-out that only Apple could get away with calling the “Dynamic Island”.
Yes, there is nobody else on this planet who could take this bombastic name seriously. Case in point: Joanna Stern at the Wall Street Journal hilariously paddled out to a literal island in a canoe for her iPhone 14 Pro review. Amazing. The tech press have even started calling the old notch the “Static Peninsula” - with tongue firmly in cheek.
But goofy name aside, the Dynamic Island - sorry, I just can’t - the pill cut-out (henceforth called “the island”) is now far more than it appears to be.
Her name was Bailey - and she was the best horse. Gentle, plodding, and content to stay in line instead of bolting ahead like Dexter (the horse behind me, bearing its unfortunate rider). Bailey’s calm temperament, despite carrying a new rider, was most welcome.
It was an especially hot day in the Cariboo. An unusually wet spring, however, had kept the landscape verdant and the mosquitoes thriving. Due to the heat, our route led us through serene, shaded forest rather than open plains.
I felt a good connection with Bailey - a slight tug at the reins was sufficient to nudge her in the right direction. And when we dismounted for a rest stop, she responded to my call when it was time to mount up once more.
Given my lack of experience (does riding a horse in Red Dead Redemption count?), trotting was uncomfortable at first, until I started to anticipate Bailey’s rhythm and stand up in the saddle. The last time I had ridden a horse (in real life) was 18 years ago in Costa Rica, so it had been a while.
There’s something so meditative and calming about swaying in the saddle, listening to the monotonous clip-clop of the hooves, and simply trusting this magnificent animal to bear your weight on a day when it’s too hot to even walk.
Summer is in full swing, which means desk jockeys who spend all day in front of the computer are forced to venture outside and collect sunburns, mosquito bites, and a decreased life expectancy from breathing in campfire smoke. That is, unless sitting all day (which is, of course, “the new smoking”) has already exacted an even worse toll.
Seriously though, I do love being outdoors, soaking up the beauty of nature in person instead of settling for Planet Earth in HDR. I also appreciate being in areas with no cell service. I hate to use the phrase “tech detox”, but it’s a great tech detox. Taking an extended break from being online only makes you appreciate the internet more when you return.
I’m well aware that there is an anti-tech movement out there with zealots who swap their smartphones for flip phones, only use paper books and maps, and eschew screens of any kind. But I don’t think we can go backwards. It only takes the frustrating and lengthy act of texting on a flip phone for the doubt to creep in.
Smartphones are actually more efficient because they save time. It’s quicker to follow GPS directions than stare helplessly at a paper map. E-books save so much space that even Marie Kondo can’t deny the benefits. I could go on.
On the other hand, I’m genuinely mystified as to how people can spend four to five hours on their phones every day. Is there really that much to do after reading a few tweets?
Regardless, some time in the backcountry, away from civilization - and far away from the crowds - is good for everybody.
What more could you ask for from Apple’s WWDC than a plethora of Craig Federighi (aka Hair Force One) memes? From Craig’s hair blowing in the wind to Craig in a garish tracksuit - we got it all.
Apple’s video production team proves once again that online video presentations are more fun for Apple fans who could never afford to attend in person.
Although Apple did invite a group of both developers and journalists to attend, it was only to watch the video presentation outside while getting painfully toasted in the Californian sun. Not really worth the expense, in my opinion, even if you got to eat croissants in the Apple Park café.
Who knows what next year will bring. But, for now, I’m enjoying the cheeky transitions and faster pace of a pre-recorded show.
As is tradition around here, here are my personal thoughts and impressions of what went down. For a full detailed rundown, head over to your favourite tech journalism site…
At Google I/O ten years ago, Sergey Brin dropped into a Google Hangout with a gang of daredevil skydivers wearing Google Glass. The demo was messy, chaotic, and infused with pure hubris.
Both Google Hangouts and Glass are now discontinued. But the 2012 Google I/O is still the most memorable one, if only for its overreaching ambition.
This year’s I/O felt very similar to those of the last few years - measured, academic, and forgettable. It was only at the end that a glimmer of excitement made me sit up straighter in my chair.
Google Glass had returned.
Well… not exactly. The ever-respectable and approachable Sundar Pichai, waxing poetic about the possibilities of AR, dropped a video showing a prototype pair of smart glasses stripped of everything - including the “glasshole” camera - that performed only one function: real-time language translation.
Google is onto something here. Single-purpose gadgets can become great products precisely because they are born out of constraint to become focused tools. It’s the natural evolution of product design that precedes the all-purpose “one device”. You simply cannot converge into a compelling device that “does everything” before the technology is ready.
I only hope that Google stays the course with this - not unlike many other dreamers I know, they’re just really bad at finishing what they started. One thing’s for sure though - nobody is going to be calling an Asian grandmother struggling to understand English a “glasshole”. Google’s scaled-back approach is definitely devoid of any hubris this time.
I was starting to wonder if I’m the only human on planet Earth who still writes blog posts. I’m not going to lie - now that social media is entrenched in the fabric of the 21st century, blog posts seem retro and anachronistic - even to an old man like me.
And then I came across Matt Gemmell’s brilliant insight on the rhetoric of writing in these dark times, giving much-needed context to my thoughts. He’s right - words matter. Gemmell sees “a worrying trend towards trivialisation amongst those who make things on their own”.
I found his disassembly of the phrase “blog post” particularly insightful.
He points out that the word “blog” has “connotations of the ramblings of some random person, without authority or polish”.
And here is his take on the word “post”:
It focuses on the mechanic of putting-up, making the actual published material almost irrelevant. It’s an emission; it’s another chunk. That’s minimising, and trivialising. It’s insulting.
Gemmell brings it home by calling out “content” as the most hideous word out there: “content is fungible, space-filling, placeholder-replacing stuff, and that’s not even its most offensive connotation,” he argues.
Read the rest of his piece to see how the offense escalates.
So, perhaps I won’t call my longform pieces “blog posts” anymore. To counterbalance the insidious narrative, writers need to reframe what we do. If blogging is dead, then it is we who should wield the knife.
And needless to say, I’ll be calling myself a “writer” until the day I die - never a “content creator”. It’s no wonder that such a demeaning job title commands so little pay. It really is depressing how far the monetary value of the written word has plummeted - and how many writers are complicit, having been beaten into submission and lost all confidence in their craft and talents.