AR Testing in Unity & AR Foundation
AR Testing with AR Foundation & Unity is not a quick process by default: to debug and test changes in your App or Game with Augmented Reality, you need to make a new build every time.
However, there are 3 Unity Assets (Plugins/Editor AR Extensions) that let you test your AR experience right in the Unity Editor, speeding up the development process several times:
- AR Foundation Editor Remote
- AR Simulation
- Unity MARS
This article will give an overview of each AR Testing Solution and then compare them to each other in a single convenient table.
AR Testing with Unity Assets (Plugins)
AR Foundation Editor Remote
AR Foundation Editor Remote — AR Testing as It Should Be, a “Run & Drive” solution with live data streaming from an AR device to the Unity Editor.
The Unity Asset is built on top of the XR Plugin architecture. This means that the installation process is as easy as installing any other XR Plugin into your project.
After installation, you install the AR Companion app on your AR device and then run your AR scene in the Unity Editor. The AR Companion sends AR data back to the Editor, so you can test your project as if it were running on a real device.
This solution is the closest you can get to real-world usage but requires you to have an AR device (which you probably already have if you decided to dig into AR development).
AR Foundation Editor Remote requires no additional scene setup and has no coupling with the existing project — just open AR Companion and run your AR Foundation scene in the Unity Editor.
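Because the plugin streams real AR data into the Editor, ordinary AR Foundation scripts work unchanged. Below is a minimal sketch of such a script: a standard AR Foundation plane-detection subscriber (the class name `PlaneLogger` is hypothetical, chosen for illustration). Nothing in it is specific to AR Foundation Editor Remote — that is the point: when the companion app streams data, this exact code runs in the Unity Editor as-is.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// A plain AR Foundation script: logs planes as they are detected.
// With AR Foundation Editor Remote, the planesChanged events arrive
// in the Editor exactly as they would on a real device.
[RequireComponent(typeof(ARPlaneManager))]
public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager planeManager;

    void OnEnable()
    {
        planeManager = GetComponent<ARPlaneManager>();
        planeManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable()
    {
        planeManager.planesChanged -= OnPlanesChanged;
    }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
            Debug.Log($"Plane added: {plane.trackableId}");
    }
}
```

Attach it to the GameObject holding your ARPlaneManager, press Play with the AR Companion running, and the log fills with planes detected by the physical device.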
AR Simulation
AR Simulation is also built on top of the XR Plugin architecture, which brings the same advantage of easy installation, but it takes a different approach: instead of streaming data from an AR device to the Editor, the plugin simulates the data.
In contrast to AR Foundation Editor Remote, which requires you to hold your AR device in one hand, with AR Simulation everything happens instantly right in the Unity Editor and both of your hands stay free.
On the downside, the simulated data differs from real-world behavior, which makes it harder to develop an optimal user experience. AR Simulation also requires minor scene setup if you want simulated planes or tracked images to appear at non-default positions.
Unity MARS (Mixed and Augmented Reality Studio)
Unity MARS is an official Unity framework for AR development. It’s not just another XR Plugin, but a whole suite of tools for AR development.
It wraps the AR Foundation API, so to integrate MARS into an existing project you need to make a full transition from AR Foundation to MARS and learn a new API: you rebuild your AR scenes from the ground up with MARS components and rewrite your scripts. And once you make this transition, there is no way back: your project now depends on MARS and its subscription model. So it's better to start working with MARS in the early stages of development.
Unity MARS has many great features for AR development, but this article only covers its functionality related to testing AR Foundation in Editor.
Currently, MARS provides simulated environments to test your AR scenes in Editor. But in the future, you’ll be able to record an AR session with a real device or stream the session in real-time, similar to AR Foundation Editor Remote.
Several built-in simulated environments cover many scenarios. You can switch between them instantly in the Simulation view to test how your AR objects will interact with different real-world arrangements.
There is also a Smart AR plugin that can be confused with a remote tool for testing AR in the Editor, but it is not one: it is a fairly limited tool that lets you re-position previously created objects and save their positions to a file.
Once you have saved the positions to a file, you're pretty much done: there is no built-in way to send this file back to the Editor. This plugin is not included in the comparison table below because it has no AR Testing features.
Unity MARS vs AR Foundation Editor Remote vs Others
This table gathers all AR features supported by the aforementioned AR Testing Solutions in one place, so you can pick the one that best suits the needs of your AR project.
| AR Testing Feature | AR Foundation Editor Remote | AR Simulation | Unity MARS |
| --- | --- | --- | --- |
| HoloLens Support | Join Beta Test | +/- (anchors only) | + |
| Magic Leap Support | Join Beta Test | +/- (plane tracking, anchors, image tracking & raycast only) | + |
| Point Cloud Tracking | + | + | + |
| ARKit Mesh Classification | + | – | – |
| Face Tracking | + | – | +/- (Editor Testing with Webcam requires ULSee plugin) |
| ARCore Cloud Anchors | + | – | – |
| Single Touch Simulation | + | + | – |
| iOS Eye Tracking | + | – | – |
| iOS Face Blendshapes | + | – | – |
| Camera Feed Video | + | – | – |
| Accessing the Camera Image on the CPU | + | – | – |
| Location Services (GPS) | + | – | – |
| ARKit Human Segmentation | + | – | – |
| ARKit 2D & 3D Body Tracking | + | – | – |
| ARKit Object Tracking | + | – | – |
| ARKit World Map | + | – | – |
| Easy integration into an existing project | + | + | – (requires a full transition to the MARS API) |
| No coupling with the project (you can remove the extension at any time) | + | + (requires minor scene cleanup of simulated trackables) | – (no simple way back) |
| New Input System | + | – | +/- (no multi-touch) |
| Minimum AR Foundation version | 3.0.1 | 3.0.1 | 2.1.8 |
| Minimum Unity version | 2019.4 | 2019.3 | 2019.3 |