Google Home gadgets to get an incredibly useful new feature this week


Google is bringing a handy new feature to its Google Nest Hub (née Google Home Hub) and Nest Hub Max this week. These smart home gadgets will now tweak what appears on the display based on how far away you’re standing from the screen. So, if you’re barking an order from the other side of the room, the reply will be much larger than if you’re lying in bed and quietly whisper an instruction first thing in the morning.

Google has already implemented some of these changes in both the Google Nest Hub and Hub Max so that timers, commute times and weather will be displayed at different sizes depending on where you are in the room. How does Google know where you’re standing?

Well, the Nest Hub Max has a built-in Nest camera, so that’s pretty obvious. The smart screen already used the camera to customise the notifications it would display based on who was peering at the screen – so you wouldn’t be presented with your partner’s calendar appointments and vice versa. Once the Nest Hub Max had identified you, it would also be able to handle requests about your personal Spotify account, contacts, or any other personal data.

But one of the biggest selling points of the Google Nest Hub at launch was that it didn’t have a camera – something Google proudly announced on-stage as it suggested users could place the smart gadget on a bedside table to work as an alarm clock without losing any sleep over privacy implications. So, how does it know where you’re standing?

It turns out Google can send a high-frequency chirp from the smart speaker and then listen to the echoes using the built-in microphones as the sound bounces back. As David Attenborough fans will undoubtedly know, this is exactly the same technique used by bats to gauge their location.
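The idea can be sketched in a few lines of Python: time how long the chirp takes to bounce back, halve the round trip to get a one-way distance, and switch the UI accordingly. The function names and the 1.5-metre threshold below are illustrative assumptions, not Google's actual implementation.

```python
# Illustrative sketch of echo-based proximity sensing (not Google's code).
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 °C


def distance_from_echo(round_trip_seconds: float) -> float:
    """Convert a chirp's round-trip time into a one-way distance in metres.

    The sound travels out to the person and back again, so the one-way
    distance is half the total path length.
    """
    return SPEED_OF_SOUND_M_S * round_trip_seconds / 2.0


def display_mode(distance_m: float, threshold_m: float = 1.5) -> str:
    """Pick a UI mode based on proximity (the threshold is hypothetical)."""
    return "detailed" if distance_m < threshold_m else "glanceable"


# A 10 ms round trip puts the listener roughly 1.7 m from the speaker,
# so the display would fall back to large, glanceable text.
d = distance_from_echo(0.010)
print(d, display_mode(d))
```

The real devices do something far more sophisticated with their microphone arrays, but the core geometry is just this round-trip division.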

In a blog post about the new technique rolled out to the Google Nest Hub, Product Manager Ashton Udall says: “Bats emit ultrasonic ‘chirps’ and listen to how those chirps bounce off of objects in their environments and travel back to them. In the same way, Nest Hub and Nest Hub Max emit inaudible sound waves to gauge your proximity to the device.

“If you’re close, the screen will show you more details and touch controls, and when you’re further away, the screen changes to show only the most important information in larger text. Ultrasound sensing allows our smart displays to react to a user’s distance.”


As mentioned above, the ultrasound sensing is already up-and-running for timers, commute times and weather.

But keep checking your Nest Hub and Nest Hub Max over the next week to see the same technology applied to reminders, appointments, and alerts whenever you approach the display.

And if you’re worried about the privacy implications of this new feature, Google says “because this is using a low-resolution sensing technology, ultrasound sensing happens entirely on the device and is only able to detect large-scale motion (like a person moving), without being able to identify who the person is,” so Google won’t be able to work out who is shuffling around your house… unless you own a Google Nest Hub Max, that is.

As with all updates to Google-designed products, you don’t have to trawl through any settings menus. Google will stagger the roll-out, and you should see the new capabilities appear automatically in the coming days. To test it, stand a good distance from your Nest Hub and ask to see your upcoming reminders, or any appointments scheduled for the rest of the day.



