
Apple is turning Siri audio clip review off by default and bringing it in house

The top line news is that Apple is making changes to the way Siri audio review, or ‘grading’, works across all of its devices. First, it is making audio review an explicitly opt-in process in an upcoming software update, one that will apply to every current and future Siri user.

Second, only Apple employees, not contractors, will review any of this opted-in audio, in an effort to bring processes that handle private data closer to the company’s core operations.

Apple has released a blog post outlining some Siri privacy details that may not have been common knowledge, as they were previously described only in security white papers.

Apple also apologizes for the issue:

“As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize. As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users — but only after making the following changes…”

It then outlines three changes being made to the way Siri grading works.

  • First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
  • Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
  • Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

Apple is not implementing any of these changes, nor lifting the suspension of the grading process it halted, until a software update that allows users to opt in becomes available for its operating systems. Once people update to the new versions of the OS, they will have the chance to say yes to the grading process, which uses audio recordings to help verify requests that users make of Siri. This effectively means that every Siri user will be opted out of audio review once the update goes live and is installed.

Apple says that it will continue using anonymized, computer-generated transcripts of your requests to feed its machine learning engines with data, in a fashion similar to other voice assistants. These transcripts may be subject to review by Apple employees.

Amazon and Google have faced similar revelations that their assistants were being improved with human review of audio, and they have begun putting opt-ins in place as well.

Apple is making changes to the grading process itself as well, noting that, for example, “the names of the devices and rooms you setup in the Home app will only be accessible by the reviewer if the request being graded involves controlling devices in the home.”
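As a sketch of what that kind of contextual minimization might look like in practice (purely hypothetical; Apple has not published its implementation, and all names here are invented), the rule amounts to gating reviewer-visible metadata on the category of the request being graded:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical illustration of contextual data minimization, not
# Apple's actual code: Home app metadata is exposed to a grader only
# when the request being graded involves controlling home devices.

@dataclass
class GradingSample:
    transcript: str
    category: str                      # e.g. "home_control", "weather"
    home_names: Optional[dict] = None  # device/room names from the Home app

def reviewer_view(sample: GradingSample) -> dict:
    """Build the minimal view of a sample that a grader is shown."""
    view = {"transcript": sample.transcript, "category": sample.category}
    if sample.category == "home_control" and sample.home_names:
        view["home_names"] = sample.home_names  # only for relevant requests
    return view
```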

A story in The Guardian in early August outlined how Siri audio samples were sent to contractors Apple had hired to evaluate the quality of Siri’s responses and transcriptions for its machine learning engines to work on. The practice is not unprecedented, but Apple’s privacy policies did not make it as clear as they should have that humans were involved in the process. There was also the matter that contractors, rather than employees, were being used to evaluate these samples. One contractor described the samples as containing sensitive and private information that, in some cases, could be tied to a user, even with Apple’s anonymizing processes in place.

In response, Apple halted the grading program worldwide while it reviewed the practice. This blog post and the process changes are the result of that review.

Apple says that around 0.2% of all Siri requests got this audio treatment in the first place, but given that there are some 15 billion requests per month, the quick math tells us that though the percentage is tiny, the raw number of reviewed requests could be quite high, as the back-of-the-envelope calculation below shows.
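For a rough sense of scale, here is that quick math, assuming the ~15 billion monthly requests figure above holds:

```python
# Back-of-the-envelope math using the figures cited above.
monthly_requests = 15_000_000_000  # ~15B Siri requests per month
review_rate = 0.002                # ~0.2% sampled for human grading

reviewed = monthly_requests * review_rate
print(f"{reviewed:,.0f} requests graded per month")  # 30,000,000
```

In other words, a statistically small sampling rate still works out to tens of millions of clips per month.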

The move away from contractors was signaled by Apple letting go of contractors working on Siri grading in Europe, as noted by Alex Hern earlier on Wednesday.

Apple is also publishing an FAQ on how Siri’s privacy controls fit in with its grading process; you can read that in full here.

The blog post and the FAQ give consumers some detail about how Apple handles the grading process, how it minimizes the data given to reviewers, and how Siri privacy is preserved.

Apple’s work with Siri has, from the beginning, focused enormously on on-device processing whenever possible. This has led many experts to say that Apple was trading raw capability for privacy by eschewing the data-center-heavy processes of assistants from companies like Amazon or Google in favor of keeping a ‘personal cloud’ of data on device. Sadly, the lack of transparency around human review and the use of contractors undercut all of that foundational work. So it’s good that Apple is cranking its privacy policies on grading and improvement all the way back past the industry standard. That is where it needs to be.

The fact is that no other assistant product is nearly as privacy-focused as Siri — as I said above, some would say to the point of hampering its ability to advance as quickly. Hopefully this episode leads to better transparency from Apple when humans get involved in processes that are presumed to be fully automated.

Most people assume that ‘AI’ or ‘machine learning’ means computers only, but the sad fact is that most of those processes are still intensely human-driven, because AI (which doesn’t really exist) and ML are still pretty crap. Humans will be involved in making them seem smarter for a very long time yet.



from Apple – TechCrunch https://ift.tt/346eZid
