How Is Apple Using Machine Learning? | @ThingsExpo #AI #ML #DL #DX #IoT
Today, machine learning is found in almost every product and service by Apple
By: Nate Vickery
Sep. 23, 2017 12:00 PM
The concept of artificial intelligence (AI) has been the subject of many discussions lately. According to some predictions, AI will learn by itself, outclass the capabilities of the human brain, and even fight for equal rights by the year 2100. Even though these are (still) just speculations, companies like Apple are already developing and implementing machine learning technology, a field still in its infancy. So how is Apple using machine learning?
Apple's beginnings with deep learning technologies
Today, machine learning is found in almost every Apple product and service. Apple uses deep learning to extend battery life between charges on its devices, detect fraud on the App Store, recognize the locations and faces in your photos, and help choose news stories for you. Machine learning determines whether Apple Watch owners are really exercising or just walking around, and it figures out when you'd be better off switching to the cellular network because of a weak Wi-Fi signal.
Apple's smart assistant
Siri's voice recognition was moved to a neural-net-based system, leveraging machine learning techniques including deep neural networks (DNNs), long short-term memory units, convolutional neural networks, n-grams, and gated recurrent units. On the surface Siri looked and sounded the same, but its recognition engine now ran on deep learning.
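Of the techniques on that list, n-grams are the simplest: they model language by counting how often short word sequences occur, then scoring a word by how likely it is to follow the previous one. A minimal bigram sketch in Python (purely illustrative, not Apple's implementation):

```python
from collections import Counter

def bigram_counts(corpus):
    """Count adjacent word pairs (bigrams) and the words that start them."""
    pairs, starts = Counter(), Counter()
    for sentence in corpus:
        words = ["<s>"] + sentence.split()  # <s> marks sentence start
        starts.update(words[:-1])
        pairs.update(zip(words[:-1], words[1:]))
    return pairs, starts

def bigram_prob(pairs, starts, prev, word):
    """Maximum-likelihood estimate of P(word | prev)."""
    if starts[prev] == 0:
        return 0.0
    return pairs[(prev, word)] / starts[prev]

# Toy training corpus of voice commands.
corpus = ["play some music", "play a song", "send a message"]
pairs, starts = bigram_counts(corpus)
print(bigram_prob(pairs, starts, "play", "some"))  # 0.5: "play" seen twice, once before "some"
```

Real speech systems combine such language-model scores with acoustic-model scores; the deep networks listed above replaced or augmented exactly this kind of statistical component.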
Every iPhone user has come across Apple's AI: when you swipe on your screen to get a short list of the apps you're most likely to open next, when the phone identifies a caller who isn't saved in your contacts, when a map location pops up for the accommodation you've reserved, or when you're reminded of an appointment you forgot to put in your calendar. Apple's neural-network-trained system watches as you type, detecting items and key events like appointments, contacts, and flight information. The company doesn't collect this information; it stays on your iPhone and in your cloud-based backups, filtered so that personal details can't be inferred from it. All of this is made possible by Apple's adoption of neural nets and deep learning.
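The core idea behind detecting key items in typed text can be sketched with simple pattern matching. The patterns below are hypothetical stand-ins (Apple's detectors are trained neural networks, not hand-written regexes), but they show the shape of the task: scan text, tag recognizable entities, hand them to the calendar or contacts app.

```python
import re

# Illustrative patterns only; real detectors are far more robust.
FLIGHT = re.compile(r"\b([A-Z]{2})\s?(\d{2,4})\b")          # e.g. "UA 1234"
TIME = re.compile(r"\b\d{1,2}:\d{2}\s?(?:am|pm)\b", re.IGNORECASE)

def detect_events(text):
    """Scan typed text for key items such as flight numbers and times."""
    events = []
    for m in FLIGHT.finditer(text):
        events.append(("flight", m.group(0)))
    for m in TIME.finditer(text):
        events.append(("time", m.group(0)))
    return events

print(detect_events("My flight UA 1234 lands at 6:45 pm"))
# [('flight', 'UA 1234'), ('time', '6:45 pm')]
```

Because a function like this runs entirely on the device, the text it scans never has to be sent to a server, which matches the privacy model described above.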
During this year's WWDC, Apple showed how a new Siri-powered watch face uses machine learning to customize its content in real time (news, traffic information, reminders, upcoming meetings, and so on), surfacing each item when it's likely to be most relevant.
Making mobile AI faster with new machine learning API
The essential machine learning model types that the new Core ML framework supports include neural networks (deep, convolutional, and recurrent), tree ensembles, and linear models. As for privacy, the data used to improve the user experience never leaves the user's phone or tablet, because inference runs on the device itself.
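Of those model types, a linear model is the simplest: prediction is just a weighted sum of the input features plus a bias, cheap enough to evaluate entirely on-device. A minimal Python sketch of that computation (illustrative only, not Core ML's API):

```python
def predict_linear(weights, bias, features):
    """Linear-model inference: weighted sum of features plus a bias.

    This is small and fast enough to run entirely on-device,
    so the input features never need to leave the phone.
    """
    return sum(w * x for w, x in zip(weights, features)) + bias

# Toy model with two features: 0.5*4.0 + (-1.0)*1.0 + 2.0 = 3.0
print(predict_linear([0.5, -1.0], 2.0, [4.0, 1.0]))  # 3.0
```

Neural networks and tree ensembles are heavier, but the deployment story is the same: the trained model ships inside the app, and all inference happens locally.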
Making AI run well on mobile devices has become an industry-wide trend, and other companies are pursuing it as well. As for Apple, it's clear that deep learning has changed its products; it's less clear whether it's changing the company itself. Apple tightly controls the user experience, with everything precisely coded and pre-designed, yet with machine learning its engineers have to step back and let the software discover solutions by itself. If Apple manages to adjust to this new reality, will machine learning systems end up having a hand in product design?