Apple: Raising the stakes on data privacy
Apple started the year off in privacy with a colossal flex: giant billboards across from January's CES show in Vegas declaring that what happens on your iPhone stays on your iPhone. It was privacy by design. Privacy as a civil right.
That was quickly followed first by a bug that could allow eavesdropping through Group FaceTime calls, then by an impressive array of new privacy protections, including Sign in with Apple, HomeKit Secure Video and routers, increased tracking protection, private federated learning, and an anonymized new Find My network.
Then came a scandal involving Siri: humans, contractors even, listening to and grading customer voice recordings for quality assurance. An industry-wide practice-become-scandal, yes, but one both unexpected and unacceptable given Apple's position on privacy, moral and marketing alike.
Up, down, up, down. What was left was to make it right. And Apple's doing that in two ways: A) with last week's release of a new, fully disclosed, opt-in grading process for Siri as part of iOS 13.2, and B) with the unveiling of a new, detailed privacy website, including four new privacy-centric white papers.
Apple’s new privacy website
Apple has had a privacy section on its website for a while, and it's been updated before. That's all been built around Apple's stated policies of data minimization, on-device intelligence, transparency and control, identity protection, and security to enable privacy.
Today's update, though, is the biggest one yet. It goes beyond statements and policies to break down, in plain language (with, yes, cleverly titled sections and cute animations), what Apple is doing across its software and services to make privacy manifest.
- Intelligent Tracking Prevention, social widget blocking, fingerprinting defense, private browsing, and search minimization in Safari.
- Random identifiers, on-device personalization, location fuzzing, sandboxed extensions, and end-to-end encryption in Maps.
- On-device memory creation, sharing suggestions, classification, and curation, single photo permission, secure metadata sync between devices, and sharing control in Photos.
- Random identifiers and end-to-end encryption for iMessage and FaceTime.
- Random identifiers, on-device suggestions, and on-device processing for Siri, along with the new opt-in grading system.
- Zero profiling and on-device recommendations in News.
- Device account numbers and dynamic security codes to protect your Apple Pay and Apple Card purchase data — even from merchants. Also, Apple Payments Inc. keeps your Apple Cash transactions separate even from Apple.
- Encrypted health data in Health, and if you start sharing activity and then stop, historical information is purged from shared devices.
- One-time location sharing, background tracking notifications, Wi-Fi and Bluetooth shaming, and case-by-case location sharing options in Location Services.
- Random identifiers, on-device curation and recommendations, and opt-in information sharing with publishers in News.
- No in-game tracking or ads in Apple Arcade.
- Encrypted data transmission, unique-key encrypted storage for Home, Health, and Keychain data, two-factor authentication, and Apple-retained keys for data stored at third-party data centers in iCloud.
- Encrypted, keychain stored data for Home, local analysis for HomeKit Secure Video, local processing for HomeKit actions, granular control for HomeKit-enabled routers, and random identifiers for Siri requests.
There was a story yesterday about a woman who was using a third-party cycle tracking app and forgot to fill it in one month. Well, unbeknownst to her, that app was secretly sharing her personal, intimate data, and she started getting served Facebook ads for baby products. Because, the people stealing her data just assumed she must be pregnant. She went back, filled in the month, and the ads stopped. Then, creeped out, she took to Twitter.
That's the largely unregulated, surveillance-centric world we're living in, and it's only by actively supporting options that come as close to zero-knowledge as we can get that companies will be forced to adopt similar policies and implement privacy-respecting technologies.
Apple’s new privacy white papers
The new white papers Apple has posted dive more deeply into four specific apps and services.
- Safari's cross-site tracking prevention, fingerprinting defense, and private click measurement, which Apple has proposed as a new web standard: a way for advertisers to get the data they legitimately need without sucking everything else up along with it, or bringing the browser-maker along for the ride.
- Location Services permissions, disclosures, settings, background access, tracking notifications, and Bluetooth and Wi-Fi shaming. Which, if you've seen how many apps iOS 13 has exposed as secretly using Bluetooth to spy on our location, is a real eye-opener, if not a rage inducer.
- Photos on-device processing for scene classification, composition analysis, people and pet identification, image quality, face quality, and audio classification, relevancy curation, sharing suggestions, optional full-data sharing, and a ton of other quantifiers and enhancements. Which is way beyond what I thought they were doing there.
- Sign in with Apple's email obfuscation, anonymous relay, and tracking prevention. In other words, the same single sign-on convenience as signing in with Google or Facebook, but without the huge compromise in personal privacy that goes with them.
One of the technologies I’m most fascinated by is private federated learning.
Federated learning is one of the latest machine learning buzz concepts: it basically pushes a model down to your device, lets you help train it, then pushes that training back up to the cloud, where it's combined with everyone else's on-device training to improve the model for everyone.
It offers better, immediate performance for us because it’s on our actual devices all the time, but it also gets better over time as the model gets updated by everyone.
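As a rough illustration, the round trip looks something like this federated-averaging sketch. Everything here — the toy one-weight model, the made-up client data, the learning rate — is purely illustrative, not Apple's implementation:

```python
import random

def local_update(w, data, lr=0.1):
    """One pass of gradient descent on a single device's private data.
    Toy model: 1-D linear regression, y ~ w * x. Only the updated
    weight leaves the device; the raw (x, y) pairs never do."""
    for x, y in data:
        grad = 2 * (w * x - y) * x
        w -= lr * grad
    return w

def federated_round(global_w, client_datasets):
    """The server pushes the current model to every device, each device
    trains locally, and the server averages the results (FedAvg-style)."""
    updates = [local_update(global_w, data) for data in client_datasets]
    return sum(updates) / len(updates)

# Five "devices", each holding its own noisy samples of y = 3x.
random.seed(0)
clients = [[(x, 3 * x + random.gauss(0, 0.1)) for x in (0.1, 0.5, 1.0)]
           for _ in range(5)]

w = 0.0
for _ in range(50):
    w = federated_round(w, clients)
print(w)  # converges near the true slope of 3
```

The key property is in the data flow: the server only ever sees model updates, never the data those updates were trained on.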
I’ve described machine learning before as Tinder for computers, where it’s yes, no, no, no, yes, yes, no, hotdog. In this case, as your phone gets better at identifying your stuff, for example, everyone’s phones get better at identifying all stuff.
What a lot of companies call private, though, isn’t really private. Anonymized data has a way of being easily de-anonymized based on a variety of signals that come along with it.
So, what Apple is doing is obscuring things further using differential privacy.
If you're not familiar with it, imagine you and 999 other family members are at dinner arguing about which is better, hot dogs or hamburgers. Instead of everyone blurting out their opinions and suffering the wicked side-eye for them, they each flip a coin. If it's heads, they write down their honest preference. If it's tails, they flip again and let the second coin answer for them: heads for hot dogs, tails for hamburgers. That way, you never know whether any one answer is honest or just coin noise, but with statistics, you can still pull back the real, total results.
That’s what Apple’s doing with federated learning both on-device and on the server for things like adding new trendy terms to the QuickType keyboard and Siri recommendations. So, literally, it has a better chance of knowing the memes.
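That coin-flip trick is the textbook differential-privacy mechanism known as randomized response, and it's small enough to sketch. This is a purely illustrative toy, not Apple's actual algorithm — Apple's deployment layers considerably more math on top:

```python
import random

def randomized_response(prefers_hot_dogs):
    """First coin: heads, answer honestly. Tails: let a second coin
    answer for you (True = hot dogs, False = hamburgers). Any single
    answer is plausibly deniable as coin noise."""
    if random.random() < 0.5:
        return prefers_hot_dogs
    return random.random() < 0.5

def estimate_true_fraction(answers):
    """Expected reported rate = 0.5 * true_rate + 0.25, so invert it
    to recover the aggregate without trusting any individual answer."""
    reported = sum(answers) / len(answers)
    return 2 * (reported - 0.25)

random.seed(1)
# 1,000 diners, 700 of whom genuinely prefer hot dogs.
truths = [i < 700 for i in range(1000)]
answers = [randomized_response(t) for t in truths]
print(estimate_true_fraction(answers))  # close to the true 0.7
```

No individual answer can be held against anyone, yet the family still learns, within a small statistical margin, how the dinner table really splits.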
Ok, wow, that got nerdy fast.
Setting the bar
So, real talk. I get the knocks on Apple and privacy. I really do. The billboards. The ads. The attitude. It can come off as cocky and, when the inevitable bugs like FaceTime calling, or screw-ups, like Siri data grading, come up, that cockiness comes back to bite them on the ass, at best, or make it all ring hollow, at worst.
The industry schadenfreude is palpable.
Some, many even, might say Apple should cut out the hype, stop with the strut, and just fix and improve their stuff.
And I get it. I totally get it. That was my reaction too. But, now, I think that’s exactly wrong.
Apple raising the stakes is ultimately better for everyone. It forces a higher level of scrutiny and accountability on them. Those billboards put them in the crosshairs. That’s great for Apple customers.
But it also annoys the hell out of, and helps shame, the rest of the industry into doing better. It puts them in the crosshairs as well. And that's great for all customers.
Before Windows viruses, nobody cared about security. Then, the malware panic began and security became a competitive advantage. Now, everyone is better off.
Before Facebook and Cambridge Analytica, nobody cared about privacy. Now, privacy is on the verge of becoming a competitive advantage as well.
We’re already seeing Facebook and Google being forced to put privacy front and center on their keynote stages this year. They focused on third parties instead of first, of course. And Facebook tried to gaslight us by conflating privacy with encryption and Google with data retention. But that, at least, puts them into play and under the spotlight. Hopefully to enjoy the same scrutiny and accountability — both for what they’re doing and what they’re failing to do.
That includes Apple. iCloud backups are fail-safe because that's what's best for the majority of users, but I still want to see a fail-secure option for those who need it. iMessage is end-to-end encrypted but SMS fallback isn't; I'd love a per-instance option to prevent that, instead of just a universal switch-off. Google is still the default search engine, and there's no way to choose DuckDuckGo as part of the initial setup. And then there are governments: America with San Bernardino, Australia with anti-encryption laws, and most recently China, with its efforts to exert influence well beyond its borders.
So, Apple, you go. Ratchet up the rhetoric, and ratchet up the pressure as well, on yourselves and on the industry.
Like that James Cameron saying: set your goals so ridiculously high that even when you fail, you'll fail above everyone else's success.
Make the stakes that high. Higher. High enough that you worry about failing, all the time, and force everyone else in the industry to worry about it and do something about it as well.
via iMore – The #1 iPhone, iPad, and iPod touch blog https://www.imore.com/
November 7, 2019 at 01:04AM