The Google Pixel’s squeeze-for-Assistant gesture was a button without a button

The Pixel 2 is almost five years old, but it introduced a feature I miss more and more with each passing year. It was called Active Edge, and it let you summon Google Assistant just by squeezing your phone. In a way, it’s an unusual idea. But it gave you something sorely lacking in modern phones: a way to physically interact with the phone to simply get something done.

Looking at the sides of the Pixel 2 and 2 XL, you won’t see anything to indicate they’re hiding something special. Sure, there’s a power button and a volume rocker, but otherwise the sides are bare. Give the phone’s edges a good squeeze, though, and a subtle vibration and animation play while Google Assistant pops up from the bottom of the screen, ready to start listening. You don’t need to wake the phone, hold down any physical or virtual buttons, or touch the screen. You just squeeze and start talking.

Looking at the sides of the Pixel 2, you’d never guess that it’s actually a button.
Photo by Amelia Holowaty Krales/The Verge

We’ll talk about how useful this is in a second, but I don’t want to overlook how great it feels. Phones are rigid objects made of metal and plastic, and yet the Pixel can tell the difference between me squeezing it and simply holding it. According to an old iFixit teardown, this is made possible by a few strain gauges mounted inside the phone that detect the slight flex in the phone’s case when you squeeze it. For the record, that flex is something my human nervous system can’t pick up on; I can’t tell that the phone is bending at all.

Whether you found Active Edge useful probably came down to whether you liked using Google Assistant, as illustrated in this Reddit thread. Personally, the only time I used a voice assistant on a daily basis was when I had the Pixel 2, because it was literally at my fingertips. The thing that made it so convenient is that the squeeze basically always worked. Even if you were in an app that hid the navigation buttons or your phone’s screen was completely off, Active Edge still got the job done.

While that made it extremely useful for looking up fun facts or doing quick calculations and conversions, I’d say Active Edge could have been far more useful if I’d been able to remap it. I enjoyed having the assistant on hand, but if I could have turned on my flashlight with a squeeze, I would have had instant access to one of my phone’s most important features no matter what I was doing.

This version of the feature actually existed. HTC’s U11, which came out a few months before the Pixel 2, had a similar but more customizable feature called Edge Sense. The two companies worked together on the Pixel and Pixel 2, which explains how the squeeze ended up on Google’s devices. That same year, Google bought a large part of HTC’s mobile division team.

Active Edge also wasn’t Google’s first attempt at providing an alternative to the touchscreen or physical buttons for controlling your phone. A few years before the Pixel 2, Motorola let you open the camera by twisting your phone and turn on the flashlight with a karate-chop motion, not unlike how you played music on a 2008 iPod Nano. The camera shortcut arrived during the relatively short period when Google owned Motorola.

However, as time went on, phone makers moved further and further away from letting you access essential features with a physical action. Take my daily driver, an iPhone 12 Mini, for example. To launch Siri, I have to press and hold the power button, which has become burdensome since Apple got rid of the home button. To turn on the flashlight, something I do several times a day, I have to wake up the screen and then tap and hold the button in the bottom-left corner. The camera is a bit more convenient, since it can be reached with a left swipe on the lock screen, but the screen still has to be on for that to work. And if I’m actively using the phone, the easiest way to get to the flashlight or camera is through Control Center, which involves swiping down from the top-right corner and trying to pick out a specific icon from a grid.

In other words, if I look up from my phone and notice that my cat is doing something cute, there’s a good chance she’ll have stopped by the time I actually get the camera open. It’s not that launching the camera or turning on the flashlight is difficult; it’s that it could be much more convenient if there were a dedicated button or squeeze gesture. Apple even briefly acknowledged this when it made a battery case for the iPhone with a button to launch the camera. A few seconds saved here and there add up over the life of a phone.

Just to prove the point, here’s how quickly the camera launches on my iPhone compared to the Samsung Galaxy S22, where you can double-press the power button to open the camera:

GIF showing an iPhone camera launch via the Control Center shortcut next to a Samsung S22 camera launch via a button press. The S22 launches its camera a second or two faster than the iPhone.

There’s less time to think when you can just press a button to launch the camera.

Neither phone handles screen recording and the camera preview at the same time very well, but the S22 has its camera app open before you’ve even tapped the camera icon on the iPhone.

Unfortunately, even Google phones aren’t immune to the disappearance of physical buttons. Active Edge stopped showing up on Pixels with the 4A and 5 in 2020. Samsung also removed a button it once included to summon a virtual assistant (which, tragically, turned out to be Bixby).

There have been attempts to add virtual buttons that are activated by interacting with the device. Apple, for example, has an accessibility feature that lets you tap the back of your phone to launch actions or even your own mini programs in the form of shortcuts, and Google added a similar feature to Pixels. But to be perfectly honest, I just haven’t found them reliable enough. A virtual button that almost never works is not a great button. Active Edge worked almost every time for me, even though I had a beefy OtterBox on my phone.

It’s not like physical controls have completely disappeared from phones. As I mentioned earlier, Apple lets you launch things like Apple Pay and Siri through a series of taps or presses on the power button, and there’s no shortage of Android phones that let you launch the camera or other apps by double-pressing the power button.

I would say, however, that one or two shortcuts assigned to a single button can’t give us easy access to everything we should have easy access to. To be clear, I’m not demanding that my phone be absolutely covered in buttons, but I do think the big manufacturers should take a cue from the phones of the past (and, yes, from the smaller phone makers; I see you, Sony fans) and bring back at least one or two physical shortcuts. As Google showed, that doesn’t necessarily require adding an extra physical key that has to be waterproofed. Something as simple as a squeeze can be a button that lets users quickly access the functions that they, or in the Pixel’s case Google, consider essential.
