This tutorial will walk through the process of building an AR application for iOS using Unity's ARKit Plugin, which handles all of the setup and code required to get started. By the end of this tutorial, you will have a base Unity project with most of the core AR features in place.
Requirements
First and foremost, you will need to download Unity. The ARKit Plugin works with Unity 2017.4 or later. You can download Unity here. In the download client, make sure to check “iOS Build Support” in the “Components” section.
You will also need Xcode 10.0 or later and an iOS device running iOS 12 or later.
Creating a Unity Project with the ARKit Plugin
We will be using the ARKit Plugin to create a new project in Unity that will include all of the functionality of Apple’s ARKit SDK natively in Unity.
To begin, download the plugin’s base project from here. Then launch Unity and open this project. You can read more about the plugin’s features here. The plugin also comes with a few example scenes that show off some of the major ARKit features, such as face tracking.
Setting Up the Scene
Upon opening the plugin project, you’ll be presented with a new scene that contains just the Main Camera and some lighting. This section will go through the steps of setting up this scene with the basic ARKit features of scanning a room and placing objects. Luckily, the ARKit Plugin does most of the hard work for us; it’s just a matter of attaching the right scripts to the right GameObjects.
Building to an iOS Device
Before we get going with creating the scene, it’s useful to know how to build the project and run it on an iOS device so that you can test along the way. To do this, go to File -> Build Settings in Unity, then click “Add Open Scenes” to add the current scene. You’ll be prompted to give the scene a name of your choice and save it to the Assets folder.
Next, switch the platform to iOS by selecting iOS in the platform list and clicking Switch Platform in the bottom right corner of the window. If you didn’t install iOS Build Support when installing Unity, you’ll have to do that now. Finally, select Build And Run. You will want to create a “Builds” folder in the project directory and save the build to it.
This will build the Unity project into an Xcode project. If you haven’t used Xcode before, you’ll need to add your Apple ID in the Xcode preferences. If you get an error in Xcode, you may need to add your Team to the Xcode project and change your Bundle Identifier. Once the app successfully installs on your device, you’ll see the default Unity background.
Getting Camera Passthrough
Let’s start by getting our device to use its camera feed as the background for the scene. You’ll notice that your Assets folder contains all of the scripts and scenes for the ARKit Plugin. First, we want to attach the UnityARVideo script to the Main Camera GameObject. The script can be found at Assets/UnityARKitPlugin/Plugins/iOS/UnityARKit/Helpers/UnityARVideo.cs (or just by searching the Assets folder). Simply drag this script onto the Main Camera in the Hierarchy. If you click on the Main Camera now, you’ll see the script in the Inspector.
You’ll see that the script has a public material field called Clear Material. Drag the YUVMaterial material from the Assets folder into this field. You will also need to add the UnityARCameraNearFar.cs script to the Main Camera. In the Camera component of Main Camera, you’ll see a dropdown menu for “Clear Flags”. Change this to “Depth only” so that the Main Camera uses your phone’s camera output as the background instead of the default skybox.
Next, we need to create a camera manager that controls the position and rotation of the camera within the world. Right-click in the Hierarchy and select Create Empty to create an empty GameObject. Name it “Camera Manager”, then drag the UnityARCameraManager.cs script onto it. In the Inspector for Camera Manager, drag the Main Camera GameObject into the Camera field of the script.
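If you’d rather wire these steps up in code than drag things around in the editor, a small bootstrap script along these lines should be equivalent. This is a sketch, not part of the plugin: the component class names (UnityARVideo, UnityARCameraNearFar, UnityARCameraManager) come from the plugin’s Helpers folder, and the public field names m_ClearMaterial and m_camera are assumptions based on the plugin source at the time of writing, so verify them against your version.

```csharp
using UnityEngine;
using UnityEngine.XR.iOS; // namespace used by the Unity ARKit Plugin

// Attach to any bootstrap object in the scene. Mirrors the manual editor
// steps above: camera passthrough plus a camera manager.
public class ARSceneBootstrap : MonoBehaviour
{
    // Assign the plugin's YUVMaterial asset in the Inspector.
    public Material clearMaterial;

    void Start()
    {
        Camera cam = Camera.main;

        // Camera passthrough: render the device camera feed behind the scene.
        UnityARVideo video = cam.gameObject.AddComponent<UnityARVideo>();
        video.m_ClearMaterial = clearMaterial;   // the "Clear Material" Inspector field
        cam.gameObject.AddComponent<UnityARCameraNearFar>();
        cam.clearFlags = CameraClearFlags.Depth; // same as choosing "Depth only"

        // Camera manager: drives the camera's pose from ARKit tracking.
        GameObject manager = new GameObject("Camera Manager");
        UnityARCameraManager arManager = manager.AddComponent<UnityARCameraManager>();
        arManager.m_camera = cam;                // the "Camera" Inspector field
    }
}
```

Either approach ends up with the same scene; the editor route is easier to follow along with, while the code route is handy if you want to recreate the setup in other scenes.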
At this point, if you build the project, you should be able to see your device’s camera output.
Adding Point Clouds and Planes
Now, to get depth data into our project, we will use point cloud particles. Create a new empty GameObject called “PointCloudParticles” and attach the PointCloudParticleExample.cs script to it. In its Inspector, drag ParticlePrefab into the Point Cloud Particle Prefab field, change Max Points to Show to 10000, and change Particle Size to 0.01.
We’ll use these to create viable planes where we can place objects. Create another new empty GameObject called “GeneratePlanes” and attach the UnityARGeneratePlane.cs script to it. Drag the debugPlanePrefab into the Plane Prefab field of the script, as we’ve done before.
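The two steps above can also be sketched in code. As before, this is an illustrative sketch rather than plugin code: the public field names (pointCloudParticlePrefab, maxPointsToShow, particleSize, planePrefab) are assumptions taken from the plugin source, so double-check them against the version you downloaded.

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

// Programmatic equivalent of the editor steps: a point cloud visualizer
// plus plane generation on detected flat surfaces.
public class DepthSetup : MonoBehaviour
{
    public ParticleSystem particlePrefab; // the plugin's ParticlePrefab asset
    public GameObject debugPlanePrefab;   // the plugin's debugPlanePrefab asset

    void Start()
    {
        // Point cloud visualization of ARKit's detected feature points.
        GameObject points = new GameObject("PointCloudParticles");
        PointCloudParticleExample cloud = points.AddComponent<PointCloudParticleExample>();
        cloud.pointCloudParticlePrefab = particlePrefab; // "Point Cloud Particle Prefab"
        cloud.maxPointsToShow = 10000;                   // "Max Points to Show"
        cloud.particleSize = 0.01f;                      // "Particle Size"

        // Spawn a plane prefab on each flat surface ARKit detects.
        GameObject planes = new GameObject("GeneratePlanes");
        UnityARGeneratePlane gen = planes.AddComponent<UnityARGeneratePlane>();
        gen.planePrefab = debugPlanePrefab;              // "Plane Prefab"
    }
}
```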
If you build the project now, you should see yellow point clouds mapping out your environment and planes being generated on flat surfaces.
Adding Objects to the World
Now that we have this depth map of the world, let’s add the basic functionality of placing an object!
Right-click in the Hierarchy and add a 3D Object -> Cube. Create another empty GameObject called “Cube Parent” and drag the cube onto it to make the cube a child object. We do this so that we can keep track of the parent object’s transform (its position in the world). The Plugin includes the functionality of spawning the cube where you tap the screen, which is convenient since it would otherwise require some annoying raycasting. Drag the UnityARHitTestExample.cs script onto the child cube object, then drag the parent object into the Hit Transform field.
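For completeness, here is the same cube-and-parent setup as a code sketch. The field name m_HitTransform is an assumption based on the plugin’s source for UnityARHitTestExample (it backs the Hit Transform slot in the Inspector); the cube scale is an arbitrary illustrative choice.

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

// Mirrors the editor steps: a cube under an empty parent, with the plugin's
// hit-test script moving the parent to wherever the user taps on a plane.
public class PlaceableCubeSetup : MonoBehaviour
{
    void Start()
    {
        GameObject parent = new GameObject("Cube Parent");

        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.SetParent(parent.transform, false);
        cube.transform.localScale = Vector3.one * 0.1f; // 10 cm cube; assumed size

        // The hit-test script goes on the child cube, but moves the parent.
        UnityARHitTestExample hitTest = cube.AddComponent<UnityARHitTestExample>();
        hitTest.m_HitTransform = parent.transform;      // the "Hit Transform" field
    }
}
```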
And that’s it! This will be enough to get you started using ARKit in Unity. With this base scene and functionality, you should be able to expand this scene to do whatever you want in AR!