AR Testing in Unity & AR Foundation

Testing the real user experience with AR Foundation & Unity is not a quick process by default: to debug & test updates in your AR App or Game, you need to make a new build every time.
However, there are Unity Assets (Plugins / Editor AR Extensions) that help you test the AR experience right in the Unity Editor, accelerating development several times over. This Tutorial gives an overview of each AR Testing Solution and then compares them in a convenient table.
AR Testing with Unity Assets (Plugins) in Editor
AR Foundation Remote
AR Foundation Remote — AR Testing as It Should Be: a “Run & Drive” solution with live data streaming from an AR device to the Unity Editor.
The Unity Asset is built on top of the XR Plugin architecture. This means that the installation process is as easy as installing any other XR Plugin into your project.

After installation, you need to install the AR Companion app on your AR device and then run your AR scene in the Unity Editor. The AR Companion sends AR data back to the Editor, so you can test your project as if it were running on a real device.
This solution is the closest you can get to real-world usage, but it requires an AR device (which you probably already have if you decided to dig into AR development).
AR Foundation Editor Remote requires no additional scene setup and has no coupling with the existing project — just open AR Companion and run your AR Foundation scene in the Unity Editor.
Unity AR Foundation XR Simulation
Unity provides a free official tool for testing AR Foundation 5+ in the Unity Editor, without building or streaming live data from an AR device: XR Simulation. This method does not give a complete picture of the real build experience on each target device, but it’s a great starting point.
Use the Navigation Controls to imitate the translational and rotational motion of a smartphone: this way you can test the AR Safe Zone (where present in my Unity Assets) and observe AR Objects from all sides.

Use the fullscreen Game View in the Unity Editor while testing to get a seamless experience. Note that testing on a real device will still differ from testing in the Unity Editor.
Also, in my Unity Assets with AR Foundation, you can forcibly test the case when AR is not supported by enabling the following flags on the AR Foundation Support Checker Game Object in the Hierarchy window:
- Is Checked In Editor On Init;
- Is AR Unsupported In Editor Test;
- Is AR Unsupported Not In Editor Test.
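Outside of my assets, a similar support check can be built with AR Foundation’s own API. Below is a minimal sketch, assuming AR Foundation 4+; `StartFallbackMode` is a hypothetical placeholder for your own non-AR fallback logic:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ARSupportChecker : MonoBehaviour
{
    private IEnumerator Start()
    {
        // Asks the platform whether AR (ARCore/ARKit) is available.
        yield return ARSession.CheckAvailability();

        if (ARSession.state == ARSessionState.Unsupported)
        {
            // Hypothetical fallback for devices without AR support.
            StartFallbackMode();
        }
    }

    private void StartFallbackMode()
    {
        Debug.Log("AR is not supported on this device.");
    }
}
```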
Unity MARS (Mixed and Augmented Reality Studio)
Unity MARS is an official Unity framework for AR development. It’s not just another XR Plugin, but a whole suite of tools for AR development.
It wraps the AR Foundation API, and to integrate MARS into an existing project you need to make a full transition from AR Foundation to MARS and learn the new API: rebuilding your AR scenes from the ground up with MARS components and rewriting your scripts. And once you make this transition, there is no way back — your project now depends on MARS and its subscription model. So, it’s better to start working with MARS in the early stages of development.
Unity MARS has many great features for AR development, but this article only covers its functionality related to testing AR Foundation in Editor.
Currently, MARS provides simulated environments for testing your AR scenes in the Editor. Also, with the Unity AR Companion App, you can record an AR Environment using a real device, then save it to the cloud and sync it with the Unity Editor.
To stream a session in real time from a device, use the 3rd-party AR Foundation Remote plugin instead.
Several built-in simulated environments cover many scenarios. You can switch between them instantly in the Simulation view to test how your AR objects will interact with different real-world arrangements.
Unity MARS vs AR Foundation Remote vs XR Simulation
This table conveniently gathers the AR features supported by each AR Testing Solution, so you can pick the one that best suits the needs of your AR project.
| AR Testing Feature | AR Foundation Remote | AR Foundation XR Simulation | Unity MARS |
|---|---|---|---|
| ARKit Support | + | + | + |
| ARCore Support | + | + | + |
| HoloLens Support | Beta Test | + | – |
| Magic Leap Support | Join Beta Test | + | – |
| Plane Tracking | + | + | + |
| Point Cloud Tracking | + | + | + |
| Image Tracking | + | + | + |
| Meshing | + | + | + |
| ARKit Mesh Classification | + | + | + |
| Face Tracking | + | – | +/- (using Session Recordings) |
| Session Recordings | + | – | + |
| Raycast (ARRaycastManager) | + | + | – |
| Light Estimation | + | + (AR Foundation 6.0+) | – |
| Anchors (ARAnchorManager) | + | + | – |
| ARCore Cloud Anchors | + | – | – |
| Device Simulator | + | – | – |
| Single Touch Simulation | + | – | – |
| Multi-Touch Simulation | + | – | – |
| iOS Eye Tracking | + | – | – |
| iOS Face Blendshapes | + | – | – |
| Camera Feed Video | + | – | – |
| Occlusion | + | + (AR Foundation 6.0+) | – |
| Accessing the Camera Image on the CPU | + | + | – |
| Location Services (GPS) | + | – | – |
| ARKit Human segmentation | + | – | – |
| ARKit 2D & 3D body tracking | + | – | – |
| ARKit Object Tracking | + | – | – |
| ARKit World Map | + | – | – |
| Easy integration into an existing project | + | + | – (requires a full transition to MARS API) |
| No coupling with the project (you can remove the extension at any time) | + | + | – (no simple way back) |
| Simulated Environments | – | + | + |
| New Input System | + | + | +/- (no multi-touch) |
| Minimum AR Foundation version | 3.0.1 | 5.0 | 4.0 |
| Minimum Unity version | 2019.4 | 2021.2 | 2019.4 |
| Pricing | $150 | Free | $600/year |
Testing AR Build on Smartphone
Plane Detection
Example Unity Asset with Plane Detection: FPS Shooter.
You can learn more about Plane Detection in AR Foundation here.
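If you are wiring up Plane Detection yourself, the core of it is subscribing to `ARPlaneManager.planesChanged` (AR Foundation 5 and earlier). A minimal sketch, assuming the component lives on the same Game Object as the `ARPlaneManager`:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(ARPlaneManager))]
public class PlaneLogger : MonoBehaviour
{
    private ARPlaneManager planeManager;

    private void Awake() => planeManager = GetComponent<ARPlaneManager>();

    private void OnEnable() => planeManager.planesChanged += OnPlanesChanged;

    private void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    private void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // Log every newly detected plane and its alignment
        // (horizontal, vertical, etc.).
        foreach (ARPlane plane in args.added)
        {
            Debug.Log($"Plane added: {plane.trackableId}," +
                $" alignment: {plane.alignment}");
        }
    }
}
```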
Image Tracking and Multiple Images
Example Unity Asset with Multiple Image Tracking: Business Card.
You can learn more about Image Tracking in AR Foundation here.
ARCore (Android) Behavior
ARCore Requirements: Supported Devices & Augmented Images.
The ARCore API assumes that AR Images are static in the environment, so once they are recognized successfully, they will always appear in the list of anchors (the updated list) until the session is reset.
This means 2 things:
- AR Foundation will never mark a Tracked Image as removed, and it will never set the Tracking State of a Tracked Image to None.
- If you test 2 different images on your monitor (desktop/laptop) and switch between them with CMD+Tab (macOS) or Alt+Tab (Windows), so that the images appear in the same position on the monitor, ARCore will not register the change.
ARKit (iOS) Behavior
You can learn more about ARKit Requirements for Image Tracking here.
- The Tracking State of a Tracked Image can be set to None when the camera stops seeing it.
- 2 images can be successfully tested on the monitor in the same position — the scenario described above where ARCore fails.
Base Code Implementation for Image Tracking
An image that goes out of view, for example, might not be removed, but its tracking state likely changes. In practice, you don’t need to worry about how removal is handled internally in ARKit/ARCore: you need to show the content when it’s relevant, and my code below is enough for this task.
private void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs args)
{
    foreach (ARTrackedImage arTrackedImage in args.added)
    {
        DebugPrinter.Print("added. Name: " + arTrackedImage.name);
        ShowContent(arTrackedImage, true);
    }

    // ----------------------------------------------------------------
    // ARCore NOTICE from https://makaka.org/unity-tutorials/ar-testing
    // ----------------------------------------------------------------
    // ARCore API assumes that AR Images are static in the environment,
    // so once they are recognized successfully, they will always appear
    // in the list of anchors until the session is reset. This means that
    // AR Foundation will never report that a tracked image is removed,
    // nor will a tracked image's tracking state be reported as None.

    // args.updated contains all images that were added.
    foreach (ARTrackedImage arTrackedImage in args.updated)
    {
        if (arTrackedImage.trackingState == TrackingState.Tracking)
        {
            ShowContent(arTrackedImage, true);
        }
        else
        {
            ShowContent(arTrackedImage, false);
        }

        //DebugPrinter.Print($"upd." +
        //    $" State: {arTrackedImage.trackingState}." +
        //    $" Name: {arTrackedImage.name}");
    }

    foreach (ARTrackedImage arTrackedImage in args.removed)
    {
        DebugPrinter.Print("removed. Name: " + arTrackedImage.name);
        ShowContent(arTrackedImage, false);
    }
}
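`ShowContent()` above is a helper from my asset. A minimal stand-in, assuming your visual content is parented under the Tracked Image Game Object, could be:

```csharp
// Minimal stand-in for ShowContent(): toggles all child renderers
// of the tracked image. Assumes the content to show/hide is
// parented under the ARTrackedImage's transform.
private void ShowContent(ARTrackedImage arTrackedImage, bool isShown)
{
    foreach (Renderer childRenderer in
        arTrackedImage.GetComponentsInChildren<Renderer>(true))
    {
        childRenderer.enabled = isShown;
    }
}
```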
XR Simulation (Unity Editor)
- With XR Simulation, you can detect and track all simulated tracked images in the Unity Editor, even if you have not included their textures in your reference image library.
- To optionally bind a simulated tracked image to your reference image library, set the Image field of its Simulated Tracked Image component to reference a texture asset that is also used in the reference image library.
- When XR Simulation detects images that are not part of the reference image library, the corresponding ARTrackedImage trackables will not contain a fully initialized referenceImage. Instead, the guid property of the referenceImage is set to zero, and its texture is set to null.
Mutable Runtime Reference Image Library
MutableRuntimeReferenceImageLibrary is intended for adding images for tracking on the go, while the app is running on the user’s phone. Using AR Foundation 5.1.0+ and the following tips, you can correctly test the addition of new images to the Mutable Library at runtime with XR Simulation.
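For reference, adding an image to the mutable library at runtime looks roughly like this. This is a sketch: `newImageTexture` and the physical width of 0.1 m are assumptions for your project, and the texture must be readable:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class RuntimeImageAdder : MonoBehaviour
{
    [SerializeField] private ARTrackedImageManager trackedImageManager;
    [SerializeField] private Texture2D newImageTexture; // assumed readable

    public void AddImage()
    {
        if (trackedImageManager.referenceLibrary
            is MutableRuntimeReferenceImageLibrary mutableLibrary)
        {
            // Schedules an async job that validates the image and
            // adds it to the library; 0.1f is the assumed physical
            // width of the image in meters.
            mutableLibrary.ScheduleAddImageWithValidationJob(
                newImageTexture, "New Image", 0.1f);
        }
    }
}
```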
Demo Scene to Test
AR Foundation Samples includes the BasicImageTracking Example Scene, which uses a Mutable Library. To prepare it for testing, I:
- Duplicated the default Simulated Environment: Window > XR > AR Foundation > XR Environment > XR Environment overlay > Pencil icon > Duplicate Environment.
- Added one more Tracked Image next to the existing one on the Scene and set its Image field as mentioned before.
- Added the following parts of code to the OnTrackablesChanged() method of ARTrackedImageManager.cs.

2 Approaches to Test
You can test the addition of a new image at runtime using 2 predefined lists.
1. added list
foreach (var newImage in added)
{
    Debug.Log("added >: " + newImage.referenceImage.guid);
}
There is no reaction in the added list when a new image is added to the library during runtime, because all Simulated Tracked Images were in the Camera View at start.
The Simulated Environment assumes that an AR Image is static in the environment, so once it appears in the camera view, it is marked as static: it is not a separate image but a container for an image (SimulatedTrackedImage.cs), and the container does not move.
To trigger the added list, you just need to bring a new SimulatedTrackedImage.cs into the camera view: place the SimulatedTrackedImage Game Objects, before the Scene starts, at some distance to the right/left of the Camera View.
This method will not work for true testing goals. To observe the camera’s starting position:
Scene View > open the duplicated Simulation Environment prefab > root Game Object > Simulation Environment component > Camera Starting Pose > Rotation/Position.

2. updated list
foreach (var newImage in updated)
{
    Debug.Log("updated: " + newImage.referenceImage.guid);
}
It triggers perfectly for an image added at runtime, so you can track the first appearance of an image after adding it to the library.
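To react only to the first appearance of each image via the updated list, you can remember the GUIDs you have already seen. A minimal sketch, assuming `using System.Collections.Generic;` and the usual AR Foundation namespaces in the enclosing class:

```csharp
// Remembers which images have already been reported as Tracking,
// so each image's first appearance is logged exactly once.
private readonly HashSet<System.Guid> seenImages =
    new HashSet<System.Guid>();

private void LogFirstAppearances(List<ARTrackedImage> updated)
{
    foreach (ARTrackedImage image in updated)
    {
        // HashSet.Add returns true only the first time a GUID is added.
        if (image.trackingState == TrackingState.Tracking
            && seenImages.Add(image.referenceImage.guid))
        {
            Debug.Log("first appearance: " + image.referenceImage.guid);
        }
    }
}
```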