Your Guide to Apple’s ARKit
The current value of the augmented reality (AR) market stands at an eye-watering £26 million. With Microsoft, Google, Apple and Facebook just a few of the big names investing in the technology, this value is only set to soar. It’s therefore in the interests of every business to get on board with this futuristic technology.
AR blurs the lines between digital reality and the physical world. AR applications allow users to see 3D digital models in real-world settings. This capability opens up endless exciting possibilities for retail, healthcare, education, manufacturing and beyond.
In this article, we discuss a software development kit (SDK) that is opening up this technology to the masses. Apple’s ARKit has enabled developers to create apps and experiences that have seeped into our everyday lives. We’re here to show you how it works, what it can do and offer some advice on how to make the most of the technology.
What is ARKit?
Apple’s ARKit is a development tool for creating AR experiences and currently the world’s largest AR platform. First released in 2017, ARKit is now in its fourth generation, providing even more advanced functionality.
Apple has built the ultimate AR double-act, designing both hardware and software in sync, from the ground up. The software utilises the potential of iOS device cameras, processors and motion sensors to create highly realistic experiences that bridge the digital and physical worlds.
Can I Use ARKit?
One of the many advantages of ARKit is its accessibility. The software is compatible with a wide array of iOS devices. The basic requirements are iOS 11 and an A9 processor or later.
A full list of compatible devices is available on Apple’s developer website.
How Does ARKit Work?
ARKit uses a technology called Visual-Inertial Odometry (VIO), combined with plane detection, to make AR applications possible.
VIO is a highly complex bit of technology. It combines information from the iOS device’s motion-sensing hardware with computer vision analysis of the scene visible to the device’s camera. This means that 3D models realistically sit within a physical environment and respond to the phone movements made by the user.
Along with VIO, plane detection is another crucial element needed for ARKit to work. In simple terms, plane detection ‘grounds’ your 3D objects. Without it, digital objects would float aimlessly in space. With plane detection, 3D objects realistically interact with surfaces in the physical world, like floors, ceilings or tabletops.
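To give a feel for how little code plane detection requires, here is a minimal sketch. It assumes a view controller with an `ARSCNView` named `sceneView` (a placeholder name), with the view controller acting as the scene renderer’s delegate:

```swift
import ARKit
import SceneKit

// Ask ARKit to look for flat surfaces while tracking the world.
// `sceneView` is an assumed ARSCNView set up elsewhere in the app.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]
sceneView.session.run(configuration)

// ARKit reports each detected surface through the renderer delegate:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    // planeAnchor describes the estimated position and size of the
    // surface - the 'ground' your 3D objects can sit on.
    print("Detected a plane of extent \(planeAnchor.extent)")
}
```

Once a plane anchor exists, any node attached to it stays pinned to that real-world surface as the user moves around.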
What Features are Included in ARKit 4?
ARKit 4 includes an impressive range of features. Below is a selection of the most impressive highlights and new additions. For a complete breakdown, head to Apple’s dedicated developer website.
ARKit 4’s Depth API delivers a new and improved way to access the detailed information picked up by the LiDAR Scanner. Devices equipped with the LiDAR Scanner at present include the iPhone 12 Pro, iPhone 12 Pro Max and iPad Pro. These devices can detect more accurate information about the surrounding environment than previous generations. The information is then used to integrate virtual objects quickly and enable them to interact with the physical world more realistically than before.
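Opting into the Depth API is a matter of adding a frame semantic to the session configuration. A hedged sketch, assuming a LiDAR-equipped device and an existing `ARSession` named `session`:

```swift
import ARKit

// Enable per-pixel scene depth where the hardware supports it.
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    configuration.frameSemantics.insert(.sceneDepth)
}
session.run(configuration)

// Each ARFrame then carries a depth map from the LiDAR Scanner:
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    guard let sceneDepth = frame.sceneDepth else { return }
    // sceneDepth.depthMap is a CVPixelBuffer of camera-to-scene
    // distances in metres, one value per pixel.
    _ = sceneDepth.depthMap
}
```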
Face tracking is now supported by far more devices. Any device with an A12 Bionic chip or later, including iPhone SE, can now support AR using the front-facing camera. The TrueDepth camera now tracks up to three faces at once. This is great news for anyone hankering to use Snapchat filters in their group shots.
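Multi-face tracking is opt-in. A minimal sketch, assuming an existing `ARSession` named `session`:

```swift
import ARKit

// Face tracking requires the front-facing camera and a supported chip.
guard ARFaceTrackingConfiguration.isSupported else { return }

let configuration = ARFaceTrackingConfiguration()
// Cap the face count at whatever the device supports
// (up to three at once on TrueDepth hardware).
configuration.maximumNumberOfTrackedFaces =
    ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
session.run(configuration)
```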
Location Anchors use high-resolution data from Apple Maps, letting you place AR experiences at specific points in the world. Using latitude, longitude and altitude coordinates, you can anchor your AR experience to landmarks or city locations. However, this currently requires iPhone XS, XS Max, XR or later and is only available in select cities so far.
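Because geo-tracking only works in select cities, Apple recommends checking availability before starting the session. A hedged sketch, assuming an existing `ARSession` named `session`; the coordinates below are illustrative values only:

```swift
import ARKit
import CoreLocation

// Geo-tracking is limited to supported regions, so check first.
ARGeoTrackingConfiguration.checkAvailability { available, error in
    guard available else { return }
    let configuration = ARGeoTrackingConfiguration()
    session.run(configuration)

    // Pin AR content to a real-world latitude and longitude.
    let coordinate = CLLocationCoordinate2D(latitude: 37.3349,
                                            longitude: -122.0090)
    let geoAnchor = ARGeoAnchor(coordinate: coordinate)
    session.add(anchor: geoAnchor)
}
```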
Built-In AR Coaching for iOS 13
There are even more advanced features for more recent iOS devices. Every AR experience has to initialise upon opening. The software and hardware need time to detect and make sense of the surrounding environment. In iOS 13 and later, the built-in coaching view lets you teach people how to use your app while the programme is initialising.
The default coaching view provides users with tips on how to set up their AR experience. With some editing, you can configure this setup for your own app: change the visual style of the coaching or add extra instructions as required.
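Attaching the built-in coaching view takes only a few lines. A sketch for iOS 13 and later, assuming an `ARSCNView` named `sceneView`:

```swift
import ARKit

// Overlay Apple's built-in coaching UI on top of the AR view.
let coachingOverlay = ARCoachingOverlayView()
coachingOverlay.session = sceneView.session
coachingOverlay.goal = .horizontalPlane       // what to coach the user toward
coachingOverlay.activatesAutomatically = true // appears while initialising,
                                              // hides once tracking is ready
coachingOverlay.frame = sceneView.bounds
sceneView.addSubview(coachingOverlay)
```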
Best of the Rest
Besides the new additions, ARKit 4 has kept the best of ARKit 3’s features, including:
People Occlusion: Virtual objects pass realistically in front and behind real-world people.
Motion Capture: Track the motion of a person with a single camera and use people’s movements and poses to interact with your AR experience.
Simultaneous Front and Back Camera: Use face and world tracking simultaneously across both front and back cameras.
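Simultaneous front and back camera use, the last of these carried-over features, is enabled through the world-tracking configuration. A sketch, assuming an existing `ARSession` named `session`:

```swift
import ARKit

// Combine world tracking (back camera) with face tracking
// (front camera) on devices that support it.
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    configuration.userFaceTrackingEnabled = true
}
session.run(configuration)
// Face anchors now arrive alongside world-tracking data, so the
// user's expressions can drive content in the rear-camera scene.
```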
10 Top Tips for Creating AR Experiences with ARKit
Getting your head around Apple’s ARKit features is just the first of many steps required to create your own AR experiences. Whether you’re creating a new game, education app or digital services software, here are some general AR factors to consider, as well as some specific ARKit tips:
1. AR is meant to be immersive. Keep controls and unnecessary information to a minimum so that you can devote as much screen space as possible to displaying your app’s virtual objects within the user’s physical world.
2. If you do need to offer some instruction, keep it relatable and avoid jargon. Words like ‘plane’, ‘anchors’ and ‘adjust tracking’ won’t make a lot of sense to a user unfamiliar with how AR works. Instead, use conversational language such as ‘move more slowly’ and ‘try turning on more lights’.
3. How realistic your experience is largely depends on the quality of the 3D models and the effects placed on them. Make sure to use textured surfaces, reflect environmental lighting conditions and cast 3D object shadows on real-world surfaces. You can achieve all of this using ARKit 4.
4. Make sure your app updates scenes 60 times per second. This will prevent any flickering or unnatural jumping in the movement of your 3D objects and keep your digital illusions as believable as possible.
5. Enhance the user experience by including audio effects. When a 3D object hits a wall or drops off a table edge, consider adding lifelike sound effects to add to the realism.
6. AR applications cannot be used in all kinds of environments. They may require the user to have ample space to move around or need a large, flat surface to project onto. Make it clear to users from the outset what will be required for the app to work properly. A more advanced option is to provide alternate settings depending on the environment your user is in.
7. Consider the real-life movements of the user when you create your app. People immersed in AR are not paying as much attention to their physical surroundings, and large sweeping movements could bring them smack bang into a lamppost, wall, grandma – you get the idea. Introduce motion gradually and always consider the scope of your movements.
8. Users won’t be able to see everything on one screen and will have to move around to interact with the rest of the AR experience. To help people find what they are looking for, guide people towards off-screen virtual objects using pointers, arrows or audible cues.
9. Think about how users will interact with virtual objects. If the user is going to be moving around a lot themselves, indirect controls such as arrows on the screen can work better. However, direct controls where the user moves the object by touching it directly are more immersive and intuitive. These work best when the user largely remains still. Apple provides a useful section of its website that delves into the world of user interactions, including a page dedicated to iOS user gestures.
10. Try to avoid repeated reloading by embedding non-AR experiences into the AR view. For example, if visitors to an online furniture store are looking at a virtual armchair in their living room, provide options to change the upholstery within the AR experience. In doing so, the change can happen quickly, without users having to revert to a 2D screen and then reload the AR experience.
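Tip 3’s advice on lighting and reflections maps directly onto configuration options. A sketch, assuming an `ARSCNView` named `sceneView` rendered with SceneKit:

```swift
import ARKit
import SceneKit

// Let ARKit's light estimate drive the virtual scene's lighting so
// objects brighten and dim with the real room.
sceneView.automaticallyUpdatesLighting = true

let configuration = ARWorldTrackingConfiguration()
configuration.isLightEstimationEnabled = true
// Generate environment textures from the camera feed so shiny
// surfaces reflect the real surroundings.
configuration.environmentTexturing = .automatic
sceneView.session.run(configuration)
```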
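The sound effects suggested in tip 5 can be made positional, so volume and stereo pan follow the object in 3D space. A sketch using SceneKit audio; `impact.wav` and `objectNode` are placeholder names for your own asset and node:

```swift
import SceneKit

// Attach a positional sound to a 3D object's node.
if let source = SCNAudioSource(fileNamed: "impact.wav") {
    source.isPositional = true  // pan and attenuate with distance
    source.load()
    let player = SCNAudioPlayer(source: source)
    objectNode.addAudioPlayer(player)  // plays from the object's position
}
```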
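The direct controls described in tip 9 are usually built from a tap gesture plus a raycast against detected surfaces. A sketch, assuming an `ARSCNView` named `sceneView` with a tap recogniser attached, and `objectNode` as the node being moved:

```swift
import ARKit
import UIKit

// Move a virtual object to wherever the user taps on a real surface.
@objc func handleTap(_ gesture: UITapGestureRecognizer) {
    let point = gesture.location(in: sceneView)
    guard let query = sceneView.raycastQuery(from: point,
                                             allowing: .existingPlaneGeometry,
                                             alignment: .horizontal),
          let result = sceneView.session.raycast(query).first else { return }
    // Snap the object to the tapped real-world position.
    objectNode.simdTransform = result.worldTransform
}
```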
The Potential of Apple ARKit
Apple’s ARKit has been used to create some of the most successful and advanced AR apps on the market today. IKEA’s internationally popular IKEA Place app allows users to experiment with virtual furniture to make sure it will fit and suit their home before they buy. LEGO has delighted children (and adults!) by bringing their sets to life with built-in AR anchors, virtual LEGO sets and AR games. For more inspiration on what you can build with this technology, check out our post on the latest and greatest AR examples.