As a Unity Asset Store Associate, this website earns from qualifying purchases and contains affiliate links: check the footer for more info.

AR Testing in Unity & AR Foundation

AR testing with AR Foundation and Unity is not a quick process by default: to debug and test changes in your Augmented Reality app or game, you need to make a new build every time.

However, there are Unity Assets (plugins and Editor AR extensions) that help you test AR experiences right in the Unity Editor, accelerating the development process several times over. This article gives an overview of each AR testing solution and then compares them in a convenient table.

AR Testing with Unity Assets (Plugins)

AR Foundation Editor Remote

AR Foundation Editor Remote — AR Testing as It Should Be, “Run & Drive” solution with live-data streaming from AR device to Unity Editor.

This Unity Asset is built on top of the XR Plugin architecture, which means the installation process is as easy as installing any other XR Plugin into your project.


After installation, you need to install the AR Companion app on your AR device and then run your AR scene in the Unity Editor. The AR Companion sends AR data back to the Editor, so you can test your project as if it were running on a real device.

This solution is the closest you can get to real-world usage but requires you to have an AR device (which you probably already have if you decided to dig into AR development).

AR Foundation Editor Remote requires no additional scene setup and has no coupling with the existing project — just open AR Companion and run your AR Foundation scene in the Unity Editor.

Unity AR Foundation XR Simulation

Unity provides an official tool for testing AR Foundation 5 in the Unity Editor without building or live-data streaming from an AR device: XR Simulation. This method requires additional developer overhead and does not give a complete picture of the real build experience on each target device.

Custom Solution by Makaka Games

I make Unity Assets with AR Foundation, and I developed my own approach to testing for the Plane Detection and Image Tracking packages.

As far as I can see, this is the fastest way to test: the game simply plays in the Unity Editor with the correct camera settings and no additional overhead. The view in the Unity Editor is the same as in the real build, but without Tracked Planes and Image Markers.

Controls are described in the Testing section of AR Throwing (Unity Asset).

AR Testing in Editor with Unity AR Foundation (ARCore, ARKit)

The laptop's hardware camera is also turned on for testing. The same method of getting the camera feed is used on devices that do not support Plane Detection with ARCore or ARKit, so the game also works on low-budget devices using the gyroscope or accelerometer (the "Pseudo AR" approach).
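As a rough illustration of this fallback idea, here is a minimal C# sketch (not the actual Makaka Games implementation; the class name and the RawImage field are hypothetical) that plays the first available camera feed behind the scene content, which works both with a laptop webcam in the Editor and with the rear camera on a non-AR phone:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Minimal sketch of a "Pseudo AR" camera background:
// show the device (or laptop) camera feed when ARCore/ARKit
// Plane Detection is unavailable.
public class WebcamFallback : MonoBehaviour
{
    // Assumed: a full-screen RawImage rendered behind the 3D content.
    [SerializeField] private RawImage background;

    private WebCamTexture webcamTexture;

    private void Start()
    {
        if (WebCamTexture.devices.Length == 0)
        {
            Debug.LogWarning("No camera found: keeping the default background.");
            return;
        }

        // Use the first available camera (the webcam in the Editor).
        webcamTexture = new WebCamTexture();
        background.texture = webcamTexture;
        webcamTexture.Play();
    }

    private void OnDestroy()
    {
        if (webcamTexture != null)
        {
            webcamTexture.Stop();
        }
    }
}
```

In a real project, this would be combined with gyroscope- or accelerometer-driven camera rotation to complete the "Pseudo AR" effect.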

I expect that adding support for Unity's XR Simulation would make testing longer and harder. I believe that if you need to test with the target environment, you need to build every time to get the true AR experience on each target platform: iOS, Android, etc.

You can't replace real testing, because you can't have the mobile AR sensors of an iPhone on your Mac. Real testing exists only with real sensors. Unity's XR Simulation is a middle way between my approach and real testing, and for now I don't see any advantage in adding it.

Unity MARS (Mixed and Augmented Reality Studio)

Unity MARS is an official Unity framework for AR development. It’s not just another XR Plugin, but a whole suite of tools for AR development. 

It wraps the AR Foundation API, so to integrate MARS into an existing project you need to make a full transition from AR Foundation to MARS and learn the new API: rebuild your AR scenes from the ground up with MARS components and rewrite your scripts. And once you make this transition, there is no way back, because your project now depends on MARS and its subscription model. So it's better to start working with MARS in the early stages of development.

Unity MARS has many great features for AR development, but this article only covers its functionality related to testing AR Foundation in Editor.

Currently, MARS provides simulated environments to test your AR scenes in the Editor. Also, with the Unity AR Companion app, you can record an AR environment with a real device, save it to the cloud, and sync it with the Unity Editor; by contrast, AR Foundation Editor Remote streams the session from the device in real time.

Several built-in simulated environments cover many scenarios. You can switch between them instantly in the Simulation view to test how your AR objects will interact with different real-world arrangements.

AR Simulation

AR Simulation is also built on top of the XR Plugin architecture, which brings the aforementioned advantage of easy installation, but it takes a different approach: instead of streaming data from an AR device to the Editor, the plugin simulates the data.


In contrast to AR Foundation Editor Remote, which requires you to hold your AR device in one hand, with AR Simulation everything happens instantly right in the Unity Editor and both of your hands stay free.

But on the downside, simulated data differs from real-world behavior, which makes it harder to develop optimal user experiences. AR Simulation also requires minor scene setup if you want simulated planes or tracked images to appear at non-default positions.

Smart AR

Also, there is the Smart AR plugin, which can be confused with a remote tool for testing AR in the Editor, but it is not one: it is a fairly limited tool that lets you reposition previously created objects and save their positions to a file.

Once you have saved the positions to a file, you're pretty much done: there is no built-in way to send this file back to the Editor. This plugin is not included in the comparison table below because it has no AR testing features.

Unity MARS vs AR Foundation Editor Remote vs Others

This table conveniently gathers the AR features supported by each AR testing solution, so you can pick the one that best suits the needs of your AR project.

| AR Testing Feature | AR Foundation Editor Remote | AR Simulation | Unity MARS |
| --- | --- | --- | --- |
| ARKit Support | + | + | + |
| ARCore Support | + | + | + |
| HoloLens Support | Join Beta Test | +/- (anchors only) | + |
| Magic Leap Support | Join Beta Test | +/- (plane tracking, anchors, image tracking & raycast only) | + |
| Plane Tracking | + | + | + |
| Point Cloud Tracking | + | + | + |
| Image Tracking | + | + | + |
| ARKit Mesh Classification | + | | |
| Face Tracking | + | | +/- (Editor Testing with Webcam requires ULSee plugin) |
| Session Recordings | + | + | + |
| Raycast (ARRaycastManager) | + | + | |
| Light Estimation | + | | + |
| Anchors (ARAnchorManager) | + | +/- | |
| ARCore Cloud Anchors | + | | |
| Device Simulator | + | + | |
| Single Touch Simulation | + | + | |
| Multi-Touch Simulation | + | + | |
| iOS Eye Tracking | + | | |
| iOS Face Blendshapes | + | | |
| Camera Feed Video | + | | |
| Accessing the Camera Image on the CPU | + | + | |
| Location Services (GPS) | + | | |
| ARKit Human Segmentation | + | | |
| ARKit 2D & 3D Body Tracking | + | | |
| ARKit Object Tracking | + | | |
| ARKit World Map | + | | |
| Easy integration into an existing project | + | + | – (requires a full transition to MARS API) |
| No coupling with the project (you can remove the extension at any time) | + | + (requires minor scene cleanup of simulated trackables) | – (no simple way back) |
| Simulated Environments | | + | + |
| New Input System | + | + | +/- (no multi-touch) |
| Minimum AR Foundation version | 3. | | |
| Minimum Unity version | 2019.4 | 2019.3 | 2019.4 |

Comparison Table of AR Testing Tools for AR Foundation & Unity

Support for Unity Assets

I am Andrey Sirota, Founder of Makaka Games and a full-time publisher on the Unity Asset Store. First, read the latest docs online. If that didn't help, get support.

