iOS 11 ushers in a whole suite of features for the mobile operating system
Apple has revealed iOS 11, bringing in a suite of updated features and a smarter Siri.
At Apple’s World Wide Developer Conference (WWDC) 2017, the Cupertino company championed a host of consumer-level updates to its new mobile operating system. These include the ability to control multi-room audio with the HomeKit app, more stickers, the ability to make person-to-person Apple Pay payments through iMessage, and better photo features, including 50 per cent better compression for high-quality pictures.
Siri and iOS 11
But the most interesting part of iOS 11 is the smarter Siri. Apple’s virtual assistant looks to take on the likes of Google Assistant and Amazon’s Alexa by making better use of deep learning techniques, enabling Siri to learn a user’s behaviour and react to it.
For example, Siri will learn about a user’s browsing habits and iOS interactions and surface information on what they are interested in, such as suggesting a news article.
With iOS 11, deep learning takes place on the iOS device itself, allowing Siri to perform translations in various languages, such as English, German, Spanish and Chinese, as well as predicting what a user might do, write or search next. Apple has also used deep learning to give Siri more natural-sounding female and male voices.
Apple is arguably behind Google, Amazon and Microsoft when it comes to virtual assistants, but with iOS 11, Siri might just be catching up at a rapid pace.
For developers, iOS 11 also brings a wider development kit for integrating Siri more deeply into third-party apps.
Another interesting ‘smart’ feature is the evolution iOS 11 brings to Apple’s CarPlay in-car interface. The new mobile OS pairs an updated Maps app with Wi-Fi and Bluetooth connectivity, using the Doppler effect to work out when an iPhone user is driving. It then prompts a ‘Do not disturb while driving’ option to prevent drivers from being distracted by unwanted notifications on their iPhone when behind the wheel.
AR development kit
Using the cameras, central and graphics processors, and motion sensors of an iPhone or iPad, developers can use ARKit to build augmented reality (AR) experiences, integrate their own third-party frameworks into the software, and tap into graphics engines such as Unity and the Unreal Engine.
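As a rough illustration of what that looks like in practice, a minimal ARKit session in Swift might resemble the sketch below. The view controller and `sceneView` property are assumptions for the example, not part of Apple’s announcement:

```swift
import UIKit
import ARKit

// Minimal sketch: start world tracking in a view controller that
// owns an ARSCNView (here assumed to be called `sceneView`).
class ARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // Track the device's position and orientation in the real
        // world, detecting horizontal surfaces such as tables.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

Because ARKit fuses camera frames with motion-sensor data on the device, an app like this gets world tracking without any extra calibration code.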
More machine learning
Apple also noted that iOS 11 opens up more APIs for developers, giving them the ability to tap into Apple’s machine learning platform, known as CoreML, and access tools for integrating features such as face tracking and natural language cognition into their apps. Previously, such machine learning features were limited to native apps on iOS.
CoreML gives developers access to all manner of neural network machine learning modelling, including deep, convolutional and recurrent learning, along with more straightforward machine learning such as tree ensembles and linear models.
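To sketch how a developer might use this in practice, the snippet below runs an image through a CoreML model via the Vision framework. The model name `FlowerClassifier` is hypothetical; any compiled CoreML classifier bundled with the app would work the same way:

```swift
import CoreML
import Vision

// Sketch, assuming a compiled CoreML model named "FlowerClassifier"
// (hypothetical) has been added to the app bundle.
func classify(image: CGImage) throws {
    let modelURL = Bundle.main.url(forResource: "FlowerClassifier",
                                   withExtension: "mlmodelc")!
    let coreMLModel = try MLModel(contentsOf: modelURL)

    // Vision wraps the CoreML model and handles image scaling/cropping.
    let visionModel = try VNCoreMLModel(for: coreMLModel)
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        if let top = (request.results as? [VNClassificationObservation])?.first {
            print("\(top.identifier): \(top.confidence)")
        }
    }
    try VNImageRequestHandler(cgImage: image, options: [:])
        .perform([request])
}
```

Because the model runs on the device rather than a server, predictions like this work offline and keep user data local, in line with Apple’s on-device approach described above.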
These tools should be a boon for iOS developers looking to inject more smart features into their applications.
The reveal of iOS 11 follows Apple’s showcase of its new macOS High Sierra.