Apple unveiled its newest A14 processor with the new iPad Air last month, and early benchmarks have already hinted at its performance. In a new interview with German news outlet Stern, Apple’s vice president of platform architecture, Tim Millet, offered additional details on the A14 and more.

Millet explained in the interview that part of the potential of the A14 processor is due to the growing capabilities of machine learning, which is “enabling a whole new class” of applications. “It takes my breath away when I see what people can do with the A14 Bionic chip,” Millet said.

Apple did not invent neural networks; “the foundations for them were laid many decades ago,” Millet noted. “But back then, there were two problems: there was no data and there was no computing power to develop the complex models that could process this amount of data.” In 2012, there was finally a breakthrough that greatly accelerated the training times of models. Only then did technologies such as face unlocking become conceivable in a smartphone.

Millet also emphasized that part of Apple’s advantage is that its hardware and software teams can work closely together to maximize performance gains from processor improvements. This extends to third-party developers, as well:

“We work very closely with our software team throughout the development process to ensure that we don’t just build a piece of technology that is useful to a few. We wanted to make sure that thousands upon thousands of iOS developers could do something with it.”

“Core ML is a fantastic opportunity for people who want to understand and find out what possibilities they have. We’ve invested a lot of time in making sure that we don’t just put transistors in the chip that are then not used. We want the masses to access them.”

Finally, Millet offered some interesting color on how the use of face masks amid the COVID-19 pandemic has affected usage of Face ID. The Apple VP explained that while there are ways Apple could theoretically tailor Face ID to work with face masks, doing so would compromise security:

“It’s hard to see something you can’t see. The face recognition models are really good, but it’s a tricky problem. Users want convenience, but they also want to be safe. And at Apple, it’s all about making sure that the data stays secure.”

“We can think of techniques that don’t involve the part of the face that is covered by the mouth-nose protection,” Millet explains. “But then you lose some of the features that make your face unique, which in turn makes it easier to imagine that someone could unlock the phone.”

One important thing to keep in mind is that Apple is likely still being cagey about the details of the A14 processor because the same chip will also power the upcoming iPhone 12. Apple is presumably saving some details about the A14 until the iPhone 12 event, which is expected to occur sometime this month.

Millet’s full interview can be found on the Stern website.


About the Author

Chance Miller

Chance is an editor for the entire 9to5 network and covers the latest Apple news for 9to5Mac.
