A decade ago, there was a naïve delight in posting on Facebook or Twitter. Social media hadn’t yet evolved into the complex, anxiety-producing system of social currency it is today. Status updates were simply that - status updates.
In fact, when Facebook opened up in 2007, you were forced to write them in the third person: “John is baking cookies today.”
There was no pressure to be culturally savvy or sarcastic or applause-worthy. You could be authentic, sincere, or boring if you wanted to because it all started out as a friendly little experiment. In the beginning, there were no “likes”. Can you even imagine social media without a heart or a thumbs up moderating every online interaction?
So, when I came across this article about Mastodon on the Outline, I was intrigued. I’d forgotten that microblogging used to be fun. I’d also totally forgotten about Mastodon, which arrived on the scene in 2016 and seemed as doomed to fail as app.net or ello. But how could you not feel the nostalgic, warm-and-fuzzy appeal of “a nicer version of Twitter” that “makes the internet feel like home again”? And sometimes open-source initiatives in the decentralized web can flourish - just look at where Bitcoin is going these days.
Unfortunately, although the people who hang out in Mastodon instances seem lovely (depending on the instance), the front-end user interface is just horrible.
The chemistry lecture was so boring I wanted to drop my head onto my desk and pass out. Even surreptitiously browsing through Twitter (as the rest of the class was apparently doing) held no appeal. I needed a Starbucks or something, anything to keep me awake.
The professor droned on about the spectroscopic properties of carbon compounds. Properties I could have rattled off when I was in kindergarten. This stuff was so easy.
It’s not easy being smart. As a young kid burdened with a ridiculously high IQ, I quickly learned to downplay my intelligence or face ostracism on the playground. In high school, I quietly drifted through the system, keeping my mouth shut in class rather than being labelled as the teacher’s pet.
I got in a lot of trouble for daydreaming. The teachers never caught on that while I was doodling in my notebook and drawing pictures of unicorns and spaceships, I was soaking in every word they said.
Now I’m in college and I’m finally free to embrace my inner geek. Unfortunately, the middle-of-nowhere small town where I live, Mountain Valley, isn’t exactly a magnet for intellectual types.
The professor started talking about an experiment I was familiar with and writing out some equations on the whiteboard. But he was getting it all wrong. Nails on a chalkboard would be less painful than watching him painstakingly write out the wrong equations. I couldn’t help myself - I raised my hand.
I’ve been looking back at the past year and chastising myself for only writing half as many posts as I’d hoped.
So, I’m going to mix it up a bit and write more fiction on this blog. I’ll start with a serialized short story with the modest goal of one “chapter” per post with no set schedule and see how it goes.
I’ll intersperse these fictional interludes with other posts as well, so I’m not going all-in on the fiction thing.
I should warn prospective readers that any fiction on this site is probably rated R, so you can expect some coarse language at the very least. Also, this is all very rough and experimental and unedited, so don’t take this as a finished piece of work - it’s just for fun. For fun, but not necessarily in a “fun” writing style, because my writing tends to get really dark really quickly. I should probably work on that.
As you may know, I have a thing for time travel. So, the first story I’m starting with could be considered Back to the Future fan fiction of sorts. None of the characters from the movies appear here, but I’ve shamelessly ripped off the plot of Back to the Future 2.
The genesis of this story lies in this YouTube video, where Biff Tannen is replaced with Donald Trump with eerie seamlessness. I’ve since learned, by the way, that the Back to the Future 2 writers intentionally based Biff’s character on Trump.
Obviously, the darkest timeline meme has also contributed to the thought process behind this story. And a weird dream, which I won’t go into.
In the Back to the Future films, Marty McFly jumps back in time 30 years from 1985 to 1955 and also 30 years forward in time from 1985 to 2015. I thought it would be fun for a protagonist in 2015 to jump back in time to 1985 to experience all the amusingly anachronistic quirks of another era. It’s funny, and more than a little disturbing, to think of the 80s as another era because I lived through that time.
Anyway, this preamble is now longer than part one of my story, so I’d best begin.
My iPhone 6S shows no signs of aging, so I’m not looking to upgrade any time soon. But even if I were, the iPhone X is not a phone I would buy.
Not because it is a terrible product - the reviews have been borderline sycophantic. Nor because it is too radical - I would almost certainly pass over the iPhone 8 as being too similar to the 6S to justify an upgrade. Not even the high price, which most reviewers cite as the main drawback of the X, fazes me. After paying $1200 for my last iPhone (to upgrade the piss-poor entry-level storage from 16GB to 64GB), an extra $200 doesn’t seem that hard to swallow.
And no, I have no compelling reason to switch back to Android at this point.
The reason I wouldn’t buy this phone is that it may be “the future of the smartphone”, but it is not “the smartphone of the future.” It is a precursor to perfection. A signpost pointing to greatness a few years from now. And I thank all those who bought it for alpha testing my future phone.
Here are my thoughts. Let’s start with the hardware design, which is most important because, unlike software, it cannot change in this iteration of the device.
Google Home is the first smart speaker officially available in Canada, so I couldn’t resist pre-ordering one when it landed in the True North. Being an early adopter is definitely out of character for me - I think this is the first product I’ve ever pre-ordered. Anyway, I’ve been using it for a few months now. Here are my thoughts so far.
The main draw of the Google Home is the ability to converse with the Google Assistant. Ambient computing, as some call it, is a cool sci-fi concept and, along with the flip phone and the tablet, is another Star Trek crossover from fantasy into reality. A lot of geeks out there love the idea of leaning back in their captain’s chair with a cup of Earl Grey and barking out “computer!” to chat with an AI.
The problem is, in real life, you aren’t running a starship. And to run your household seamlessly through voice control, the Internet of Things (IoT) needs to catch up to the future. Right now it is a tangled mess of incompatible competing platforms. A case in point - Siri doesn’t seem to have a problem controlling my Philips Hue smart bulbs, but Google does. The Google Assistant cheerily assures me the lights are off, but they stubbornly stay on, proving that even an artificial intelligence can lie.
Although I deal more in words than aesthetics, I have a deep appreciation of design - especially architectural, industrial, and UI design.
I have an extra appreciation for a fully realized and well-documented design language. When I was tasked with designing a user interface, for example, I found Google’s Material Design documentation to be an invaluable resource.
I recently upgraded to Windows 10 on my work machine and I am enjoying the fresh coat of “Metro” paint. But I have to say that Microsoft’s newly-announced Fluent Design feels far more like the future. And I mean that literally, because the future of computing may very well take place in augmented reality, or mixed reality as some now prefer to call it.
If there is a counterpoint to the so-called ambient computing environment (presided over by virtual assistants such as the Amazon Echo’s Alexa), then this is it. Or maybe we are moving toward a fusion of these two UI paradigms?
Fluent Design is clearly an attempt to establish some design fundamentals in a 3D environment. But there is also a sparse desktop concept floating out there that looks amazing. I’m looking forward to a more defined design language as Fluent Design moves out of the concept phase and into implementation.
Having said that, is it time for the pendulum to swing away from the clean, minimalist trend of the past decade and back toward colour and chaos?
About twenty years ago, when I was a student, I lived in an old apartment building on a busy main street. It wasn’t in a good neighbourhood. The dudes in the building across from us appeared to be drug dealers and would beg us for smokes every time we went on the balcony - even though we never had any damn smokes.
But the rent was cheap. I had a roommate, and each of us chipped in less than $300 a month.
That summer was the worst. The building was a rotting carcass that smelled sickening when the sun baked its cracked wooden siding. And even with the windows closed - which only made the smell worse - the noise from the traffic roaring by never ceased.
I had an early bedtime that summer because I was working at the Canadian Tire warehouse and had to wake up at 5 AM to make the bus on time.
But I didn’t get much sleep.
Ten years ago today, wordbit joined the blogosphere. A few years after that, the term blogosphere joined other buzzwords such as web 2.0 in the internet fad graveyard. I’m hoping social media will be joining them at some point.
But ten years ago, social media was still fun and innocent. Heck - ten years ago, there were no smartphones. The iPhone would debut about four months later in a defining moment that changed the trajectory of the internet forever. Three years later, in 2010, Wired Magazine declared that “the Web is dead”. The Web, in case you didn’t know, meaning the World Wide Web (WWW) - a collection of HTML pages served up in desktop browsers. The Web’s death was greatly exaggerated, of course - you can thank responsive web design for that.
Then again, there are millions of people who use Facebook but have no idea they are on the Internet at all. Zuckerberg hooked even the most technophobic without them ever knowing it.
Yes, the world is a different place - but where does that leave the humble blog?
Shelling out for a fitness tracker in January seems like a cliché. Yet, here I am - shelling out for a fitness tracker. I don’t have a New Year’s resolution to get in shape, and I’m not planning on buying one of those ill-fated gym memberships that everybody regrets signing up for after one too many holiday pig-outs.
It is true though that I haven’t gone running for a while. I entirely blame the anomalous snowy weather we’ve been having on the West Coast for that. I also know that a fitness tracker won’t work miracles in getting my butt off the couch - only good old-fashioned discipline can do that.
I guess buying my first fitness tracker just kind of happened.
About that MacBook Pro I had my eye on. I couldn’t wait. As much as I appreciate all my iMac has done for me, I just couldn’t listen to the endless spinning and chirping of that dreadful hard drive for another year without going insane.
I’ll admit, I was hoodwinked at first by the narrative spun by a few entitled tech journalists. It had been two years since the MacBook Pro was last updated, in early 2015, and people were hungry for a new Mac. That is why the disappointment was all the more acute when the new MacBook Pros were recently released.
Professionals bemoaned the lack of bleeding-edge CPUs and a limit of 16GB of RAM, because of course they all need at least 32GB (dude, please). And recreational users (like me) winced hard at the sky-high price increase - an increase, no doubt, resulting from the questionable Touch Bar that replaces the function keys.
But I came to another realization. A realization that perhaps did not enter the minds of most tech pundits who begrudge the steep price for being an early adopter, but then pay it anyway.