Google I/O - a lot cooking, but not much to eat

Fri May 27, 2022

At Google I/O ten years ago, Sergey Brin dropped into a Google Hangout with a gang of daredevil skydivers wearing Google Glass. The demo was messy, chaotic, and infused with pure hubris.

Both Google Hangouts and Glass are now discontinued. But the 2012 Google I/O is still the most memorable one, if only for its overreaching ambition.

This year’s I/O felt very similar to ones from the last few years - measured, academic, and forgettable. It was only at the end that a glimmer of excitement made me sit up straighter in my chair.

Google Glass had returned.

Well… not exactly. The ever-respectable and approachable Sundar Pichai, waxing on about the possibilities of AR, dropped a video showing a prototype of smart glasses stripped of everything - including the “glasshole” camera - that performed only one function: real-time language translation.

Google is onto something here. Single-purpose gadgets can become great products precisely because they are born out of constraint to become focused tools. It’s the natural evolution of product design that precedes the all-purpose “one device”. You simply cannot converge on a compelling device that “does everything” before the technology is ready.

I only hope that Google stays the course with this - not unlike many other dreamers I know, they’re just really bad at finishing what they started. One thing’s for sure though - nobody is going to be calling an Asian grandmother struggling to understand English a “glasshole”. Google’s scaled-back approach is definitely devoid of any hubris this time.

Watching the I/O livestream, it was good to see in-person events happening again. There’s something about audience applause that validates what the presenter is saying. Without it, the presentation feels more like an infomercial where the presenter can say any bonkers thing and get away with it.

I wish I could tell you that the audience alone gave I/O the energy it needs, but it’s hard to get excited about announcements such as “24 new languages added to Google Translate”.

We got the inevitable improvements: “we made YouTube better” (auto-generated chapters), “we made Google Maps better” (Immersive View), “we made Google Docs better” (automated summaries), and “we made Google Search better” (Scene Exploration).

Most of the improvements flex Google’s AI muscle and are… nice. Nice, but not mind-blowing. Every year, Google trots out obscure-sounding AI acronyms like LaMDA and PaLM and expects the audience to get excited about language models, linguistics, and natural-language processing, which, of course, they don’t.

Perhaps Google’s forthcoming AI Test Kitchen app will let people get some practical hands-on time with these models, so they won’t sound like homework anymore. I hope so, because so far, public access to AI tools has been extremely limited.

There was a lot of excitement and buzz around an AI tool recently, but it wasn’t Google’s (now that’s like rain on your wedding day). I’m talking about the incredible DALL·E 2. This AI system can create complex image compositions on the fly from a text description.

You really have to see it to believe it. Take a look at the DALL·E 2 subreddit for some samples.

This tool also isn’t currently available to the public, I might add. But DALL·E 2 brings us what is sorely missing in machine learning right now: whimsy, wonder, absurdism, surrealism, delight, and surprise.

Not to be outdone, Google hastily announced their retort, even claiming better performance than DALL·E 2. They call it Imagen. But this announcement came a couple of weeks after I/O - I guess even they were taken aback when DALL·E 2 went viral. Amazing stuff - I wonder why they didn’t show it off at I/O 🤔.

I’m sure Google was hoping that the last part of I/O would really get people pumped - hardware announcements. We got a good look at the Pixel 6A, Pixel 7, and Pixel Buds Pro. We even got a look at an outdated-looking Pixel tablet coming in 2023.

But the biggest applause was for the Pixel Watch.

It remains to be seen whether the Pixel Watch can compete with the top-of-its-class Apple Watch, especially considering these peripheral devices are ecosystem-locked. But Android users have had it rough in the wearables department for so long and I’m happy to say that the Pixel Watch appears to have a premium industrial design.

Traditional round watches are so much better looking than the “computer-on-your-wrist” rectangular watches.

Although I get the function-over-form argument here: scrolling through content on a round smartwatch is not a great user experience - unless you’re okay with cramped or cut-off text.

I’m looking forward to seeing how the UI looks and performs.

Google’s strategy to Pixel-ify everything and create a “family” of products that all just work together is obviously very familiar to Apple users. Given the open - almost experimental - nature of Google’s platform and the sheer number of Android partners out there, I’m not sure Google will arrive at the same place as Apple, but they’re giving it the good old college try.

In the meantime, Samsung - sitting at the top of the Android food chain - is shrugging its shoulders. I’m sure they’re not particularly concerned either.

So yeah, that’s it really - Google’s hardware announcements were light on details and didn’t have me leaping out of my chair. Although I did shout “yes please!” when the presenter said “we’re working towards your TV pausing itself when you get up for a snack”. That would be sick but, unfortunately, it’s just another unfinished project.

In conclusion: It seems like there’s a lot cooking at Google, but not a lot to eat right now.

I’m not sure what’s going on with Apple either - but WWDC is coming up next month, so there isn’t long to wait for some software news.

We’ll probably have to wait longer for hardware news (there might be some MacBook stuff at WWDC), although plenty of rumours are flying around as usual. The rumoured iPhone 14 Pro sounds decent: no notch (replaced with hole punch and pill), A16 chip, massive camera with 48-megapixel Wide lens, titanium body, and even a vapor chamber thermal system.

The iPhone 14 rumours, however, are puzzling. From what I’ve heard, the iPhone 14 will retain the same form factor, notch, and even the same A15 chip as the iPhone 13! The iPhone 14 display likely still won’t support high refresh rates (ProMotion). It also won’t be getting the 48-megapixel Wide lens camera that the Pro is supposedly getting. The only upgrade seems to be to the selfie camera.

So, why would anyone buy an iPhone 14 if it is almost exactly the same as an iPhone 13? If the rumours are true, then surely only the selfie girls would be interested.

The iPhone 14 really is a mystery. Skimping on the chip upgrade seems especially egregious as, until now, processing power was a reliable benchmark for determining a phone’s longevity. But we’re getting ahead of ourselves - maybe it won’t happen. The mystery will be solved in September one way or another, and then I’ll have more to say.

As far as the notch goes, I’ve updated my Notch Watch website with a prediction from the reliable display supply-chain analyst, Ross Young. By his estimation, the all-screen dream won’t arrive until the iPhone 18 in 2026. Yelp. That is a far longer wait than I was hoping for.

Although to be fair, after actually living with the notch on my iPhone XS, it doesn’t bother me that much anymore. Don’t get me wrong - I still think it’s an ugly wart. But the hole-and-pill cutouts rumoured for the iPhone 14 Pro look worse. One wart is better than two warts.

That about does it for this month - see you in June!
