Apple has shared its latest white paper via its Machine Learning Journal. Today’s entry, “Learning with Privacy at Scale,” covers the specific algorithms Apple uses with differential privacy to improve product features, along with use cases like discovering popular emoji.

Since launching the journal this past summer, Apple has used it to share details about the evolution of Siri, how ‘Hey Siri’ works, its face detection work, and more.

Today’s paper details how Apple balances accessing user data to improve its products with protecting users’ information through local differential privacy.

Within the differential privacy framework, there are two settings: central and local. In our system, we choose not to collect raw data on the server which is required for central differential privacy; hence, we adopt local differential privacy, which is a superior form of privacy [3]. Local differential privacy has the advantage that the data is randomized before being sent from the device, so the server never sees or receives raw data.
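The core idea behind local differential privacy is that each device adds noise to its own data before anything leaves the device, so the server can only recover aggregate statistics. A minimal sketch of that idea is the classic randomized-response mechanism shown below; the function names and the debiasing helper are illustrative, not from Apple's paper.

```python
import math
import random

def randomized_response(bit: bool, epsilon: float) -> bool:
    """Report the true bit with probability e^eps / (1 + e^eps);
    otherwise report its flip. The server never sees the raw bit."""
    p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return bit if random.random() < p_truth else not bit

def estimate_true_rate(reports: list, epsilon: float) -> float:
    """Server side: debias the observed rate of 1s to recover an
    unbiased estimate of the true population rate."""
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    observed = sum(reports) / len(reports)
    return (observed - (1.0 - p)) / (2.0 * p - 1.0)
```

With a reasonably large epsilon the server's aggregate estimate is accurate, yet no individual report reveals a user's true value with certainty.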

Apple also notes that its system is opt-in only and transparent, with no data recorded or sent before user approval.

The document goes into detail about the system architecture Apple is using as well as the algorithms it has designed, including a “Private Count Mean Sketch”, “Private Hadamard Count Mean Sketch”, and a “Private Sequence Fragment Puzzle”.

As for use cases, Apple notes that it is able to improve predictive emoji QuickType suggestions based on keyboard locale.

Given the popularity of emojis across our user base, we want to determine which specific emojis are most used by our customers and the relative distribution of these characters. To that end, we deploy our algorithms to understand the distribution of emojis used across keyboard locales. For this use case, we set the parameters for CMS to be m = 1024, k = 65,536, and ε = 4 with a dictionary size of 2,600 emojis.

The data shows many differences across keyboard locales. In Figure 6, we observe snapshots from two locales: English and French. Using this data, we can improve our predictive emoji QuickType across locales.
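To make the quoted parameters concrete, below is a simplified, hedged sketch of the count-mean-sketch (CMS) idea: each device picks one of k hash functions, one-hot encodes its emoji into m buckets, and flips every bit with a probability derived from ε before sending. The hash construction and debiasing here are illustrative assumptions, not Apple's exact algorithm.

```python
import hashlib
import math
import random

M, K, EPS = 1024, 65536, 4.0  # m, k, and epsilon as quoted in the paper

def h(j: int, value: str, m: int = M) -> int:
    """The j-th hash function: bucket `value` into [0, m).
    (Illustrative construction using SHA-256.)"""
    digest = hashlib.sha256(f"{j}:{value}".encode()).digest()
    return int.from_bytes(digest[:8], "big") % m

def privatize(value: str):
    """Client side: one-hot encode under a randomly chosen hash index,
    then flip each bit independently with probability 1/(1 + e^(eps/2))."""
    j = random.randrange(K)
    bucket = h(j, value)
    flip_p = 1.0 / (1.0 + math.exp(EPS / 2))
    vec = [1 if i == bucket else 0 for i in range(M)]
    noisy = [b ^ (random.random() < flip_p) for b in vec]
    return j, noisy

def estimate_count(records, candidate: str) -> float:
    """Server side: debias each report's bit at the candidate's bucket,
    then correct for hash collisions to estimate the candidate's count."""
    f = 1.0 / (1.0 + math.exp(EPS / 2))
    s = sum((noisy[h(j, candidate)] - f) / (1.0 - 2.0 * f)
            for j, noisy in records)
    n = len(records)
    return (s - n / M) * M / (M - 1)
```

The server only ever receives noisy bit vectors, yet averaging many of them recovers a close estimate of each emoji's popularity.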

Other use cases include “Identifying High Energy and Memory Usage in Safari” and “Discovering New Words”.

Read the full journal entry here.

Check out 9to5Mac on YouTube for more Apple news:




About the Author

Michael Potuck

Michael is an editor for 9to5Mac. Since joining in 2016 he has written more than 3,000 articles including breaking news, reviews, and detailed comparisons and tutorials.