iOS 11 sets a new standard for the world’s most advanced mobile operating system. Your apps can now become more intelligent using the power of machine learning with Core ML. You can create incredible augmented reality experiences with ARKit. And you can deliver a more unified and immersive user experience with new iPad multitasking features such as drag and drop, plus the new Files app, new camera APIs, new SiriKit domains, Apple Music integration, and more.
The latest update of ARKit in iOS 11.3 delivers new features that let you create an even more realistic user experience. With improved scene understanding, your app can see and place virtual objects on vertical surfaces, and more accurately map irregularly shaped surfaces. Real-world images, such as signs, posters, and artwork, can be integrated into the AR experience, so your app can fill a museum with interactive content or bring a movie poster to life. And now the pass-through camera view of the real world is higher resolution and supports autofocus for a sharper view in more situations.
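The capabilities above are enabled on an `ARWorldTrackingConfiguration`. A minimal sketch, assuming an `ARSCNView` named `sceneView` and an asset catalog resource group named "AR Resources" holding the reference images (both names are illustrative):

```swift
import ARKit

// Sketch: configuring an ARKit session for the iOS 11.3 features
// described above. `sceneView` and the "AR Resources" group name
// are assumptions for this example, not fixed API requirements.
func startSession(on sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()

    // Detect vertical surfaces in addition to horizontal ones.
    configuration.planeDetection = [.horizontal, .vertical]

    // Recognize known 2D images — signs, posters, artwork — in the scene.
    configuration.detectionImages =
        ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                         bundle: nil)

    // Sharper pass-through view via autofocus on the camera.
    configuration.isAutoFocusEnabled = true

    sceneView.session.run(configuration)
}
```

Detected surfaces and images are then delivered to your `ARSCNViewDelegate` as `ARPlaneAnchor` and `ARImageAnchor` objects, where you attach your virtual content.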
iPhone XS, iPhone XS Max, and iPhone XR feature a trio of all-screen displays paired with A12 Bionic and the next-generation Neural Engine. A12 Bionic is the smartest and most powerful chip ever in a smartphone, designed with performance in mind for Core ML, ARKit, Metal, and more. And now Face ID works even faster than before to securely and privately unlock, authenticate, and pay.
Build unparalleled augmented reality experiences for hundreds of millions of users on iOS — the biggest AR platform in the world. With ARKit 2 on iOS 12, your AR apps can now be experienced by multiple users simultaneously and resumed at a later time in the same state. You can also incorporate real-world objects into your AR experiences, making them even more immersive for your users.
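Both shared and resumable experiences rest on ARKit 2's `ARWorldMap`: you capture the current session's map, archive it (or send it to a peer), and later feed it back as the initial world map. A hedged sketch, with the file URL and error handling kept illustrative:

```swift
import ARKit

// Sketch: persisting and restoring an ARKit 2 session state via
// ARWorldMap (iOS 12). The save location and recovery strategy
// here are example choices, not the only way to do it.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(
                  withRootObject: map,
                  requiringSecureCoding: true)
        else { return }
        try? data.write(to: url)
    }
}

func restoreSession(on session: ARSession, from url: URL) {
    guard let data = try? Data(contentsOf: url),
          let map = try? NSKeyedUnarchiver.unarchivedObject(
              ofClass: ARWorldMap.self, from: data)
    else { return }

    let configuration = ARWorldTrackingConfiguration()
    // Resume in the same state: anchors placed in the saved
    // session reappear in the same real-world positions.
    configuration.initialWorldMap = map
    session.run(configuration,
                options: [.resetTracking, .removeExistingAnchors])
}
```

Sending the same archived map to nearby devices — for example over MultipeerConnectivity — is what lets multiple users share one AR experience simultaneously.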