After Apple’s iOS 13 Push, Google Tightens its Privacy Rules | TechTree.com


Google comes out with a note on how it plans to enhance user privacy with its Assistant in the wake of snooping allegations


Months after the world woke up to the shocking news that smart speakers were quietly sharing voice samples with human reviewers in an obvious privacy screw-up, Google has followed Apple and Amazon in officially announcing how users can adjust their privacy settings to keep their voice samples from being used for research.

The story of eavesdroppers listening in on voice samples caused a furore in the industry some months ago in the United States, when it came out that tech giants were using voice samples to study tone, tenor and pronunciation with a view to enhancing the output quality of their smart assistants on mobile devices.

While Apple issued a public apology late last month, Amazon brought out a document that allowed users to understand how their data gets used and how they could review it. Google, which had paused its human audio review globally in July, came out with a blog post to announce that it was resuming the review with increased user data control.

As is always the case, the blog post begins with a rather curt “We believe you should be able to understand how your data is used and why, so you can make choices that are right for you.” Of course, there is no apology or even a modicum of regret in the tone of the post, which goes on to detail points that at best attempt to smooth a few ruffled feathers.

Authored by Nino Tasca, Senior Product Manager, Google Assistant, the blog clarifies that Google doesn’t retain audio recordings by default. It claims this was never the case and will not be in the future either. It goes on to exhort users to continue using the Assistant “to help you throughout the day, and have access to helpful features like Voice Match.”

By way of an oblique apology, the post has this to say: “Recently we’ve heard concerns about our process in which language experts can listen to and transcribe audio data from Google Assistant to help improve speech technology for different languages. It’s clear that we fell short of our high standards (emphasis ours) in making it easy for you to understand how your data is used, and we apologize.”

Tasca goes on to suggest that users can opt in to the Voice & Audio Activity (VAA) settings while setting up the Assistant. Users can also view past interactions with the Assistant and delete any of them at any time.

In other words, the onus of keeping voice samples off the Assistant rests with the user. Which means Google may just keep doing what it does, unless the user is sharp enough to clear those recordings every day!

Of course, the blog does say that by the end of the year it plans to update the Google Assistant policies to reduce the amount of voice data the company stores, and that Google will delete the majority of audio data older than a few months. One can only guess how many months “a few” means in the blog. Small mercies that the company won’t keep your audio files forever.

Google’s statement comes at a time when Microsoft revealed last month that it no longer uses human review for audio snippets from Skype’s translation feature or Cortana on Xbox. Facebook too had stated that it would stop human review of Messenger audio, though this announcement runs contrary to its revelation that it used human transcription for its Portal hardware released last week.

Do these changes reveal a real desire to enhance user privacy? Only time will tell. Or at least until the next controversy breaks out.


TAGS: Google, Google Assistant, Voice, Recordings, Privacy