
Augmented Reality (AR)


ViroCore uses ARCore for AR Tracking

ARCore supports the following devices running Android N and later: Google Pixel, Pixel XL, Pixel 2, Pixel 2 XL, and Samsung Galaxy S8.

The Viro platform supports Augmented Reality (AR) development through integration with ARCore. This guide gives a brief overview of AR, then describes the components and features that enable developers to build AR experiences.

Device Support

AR is not supported on all platforms. The full list of devices supporting AR can be found on Google's ARCore page. To programmatically detect if a device supports ARCore, use the ViroViewARCore.isDeviceCompatible(Context) method. If a given device isn't compatible, an UnavailableDeviceNotCompatibleException will be thrown if you attempt to construct a ViroViewARCore.

If a device is compatible, then when the ViroViewARCore is constructed users will be prompted to download ARCore, if they haven't already. This is an asynchronous flow that you can listen to by passing a StartupListener into the ViroViewARCore constructor. If the download and install of ARCore was successful, the StartupListener's onSuccess() callback will be invoked. Otherwise, onFailure() will be invoked with the corresponding error code.

To summarize, the workflow is as follows:

1. Check device compatibility with ViroViewARCore.isDeviceCompatible(Context).
2. Construct the ViroViewARCore, passing in a StartupListener.
3. Wait for the StartupListener's onSuccess() callback before setting your ARScene; handle errors in onFailure().
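A minimal sketch of this startup flow is below. It assumes the code runs inside an Activity's onCreate(); the StartupListener callback signatures shown here reflect our reading of the ViroCore API, so verify them against the reference documentation.

```java
// Hedged sketch of the AR startup flow, assumed to run in an Activity
if (!ViroViewARCore.isDeviceCompatible(this)) {
    // Fall back to a non-AR experience on unsupported devices
    return;
}

ViroViewARCore viroView = new ViroViewARCore(this, new ViroViewARCore.StartupListener() {
    @Override
    public void onSuccess() {
        // ARCore is installed and the view is ready; set your ARScene here
    }

    @Override
    public void onFailure(ViroViewARCore.StartupError error, String errorMessage) {
        // e.g. CAMERA_PERMISSIONS_NOT_GRANTED, or the user declined the
        // ARCore download
    }
});
setContentView(viroView);
```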

Required Camera Permissions for AR

ViroCore requires access to the device's camera for AR features like tracking / SLAM to function properly. This permission check occurs automatically during the onActivityResumed phase of the ViroViewARCore view. If the user has already granted camera permissions to your application, everything works as normal.

However, if camera permissions have not yet been granted, Viro will automatically request them via Android's requestPermissions() API. This pops up the standard permission request dialog shown below:

If the user then denies camera permissions, the StartupListener's onFailure() callback will be invoked with the CAMERA_PERMISSIONS_NOT_GRANTED error code. Note that, as with any other Android permission, you can also preemptively check for and request camera permissions in your application before constructing the ViroViewARCore.

Developing for AR

Camera Tracking

The Viro platform supports development of 6 degrees-of-freedom (6DOF) AR experiences. The platform tracks the user's real-world rotation and position as they move, and keeps the virtual camera in sync with that movement. The platform maintains a right-handed coordinate system whose origin is the user's location at the time AR tracking was initialized. The camera's forward vector is [0, 0, -1] and its up vector is [0, 1, 0].

AR Scenes and Anchors

AR experiences are created in Viro by creating a ViroViewARCore, and configuring and setting an ARScene. ARScene blends virtual 3D content with the device camera's view of the real world. Similar to a normal Scene, an ARScene contains a hierarchy of Node objects representing the virtual 3D world. Behind these objects the camera's live video feed is rendered. ARScene ensures the two worlds -- real and virtual -- stay in sync with a unified coordinate system.

There are two ways to add virtual content to the ARScene. The first is to simply place content manually. The origin of the coordinate system is the initial position of the user, and the scale is meters. To add content you only need to set the position of the Node within this coordinate system, and add it to the scene graph. You can find more information on Scenes in our Scenes guide.
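For manual placement, a minimal sketch (using ViroCore's Node, Sphere, and Vector classes; the `scene` variable is assumed to be the ARScene already set on your ViroViewARCore) might look like this:

```java
// Place a small sphere one meter in front of the user's starting position.
// Assumes 'scene' is your ARScene; units are meters.
Node sphereNode = new Node();
sphereNode.setGeometry(new Sphere(0.1f));       // 10 cm radius
sphereNode.setPosition(new Vector(0, 0, -1));   // 1 m in front of the origin
scene.getRootNode().addChildNode(sphereNode);
```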

The second way to add virtual content is to use ARAnchor. ARAnchors represent features detected in the real-world. You can associate (or "anchor") your content to these features by adding your content to the ARNode that corresponds to each detected ARAnchor. To do this, implement the ARScene.Listener. The ARScene.Listener callbacks are invoked whenever ARAnchors are found (or updated, or removed) in the real-world.

In the example below, we create an ARScene and set it up to add a virtual Box to sit on every detected real-world plane.

final ARScene scene = new ARScene();

scene.setListener(new ARScene.Listener() {
    @Override
    public void onTrackingInitialized() {
        // Invoked when AR tracking is ready
    }
    @Override
    public void onAmbientLightUpdate(float lightIntensity, float colorTemperature) {
    }
    @Override
    public void onAnchorFound(ARAnchor anchor, ARNode arNode) {
        // Create a Box to sit on every plane we detect
        if (anchor.getType() == ARAnchor.Type.PLANE) {
            Box box = new Box(1, 1, 1);
            Node boxNode = new Node();
            boxNode.setGeometry(box);
            arNode.addChildNode(boxNode);
        }
    }
    @Override
    public void onAnchorUpdated(ARAnchor anchor, ARNode arNode) {
    }
    @Override
    public void onAnchorRemoved(ARAnchor anchor, ARNode arNode) {
    }
});

ViroViewARCore view = new ViroViewARCore(context, null);
view.setScene(scene);

AR Features

Viro supports numerous AR features, some of which are detailed below.

AR Hit Testing

ViroViewARCore has a variety of methods used to "hit-test" against the real-world. These hit tests can be used to determine, to the best of ARCore's ability, what real-world features exist at a given point on the 2D screen. Note that since a single 2D point on the view corresponds to a 3D ray in the scene, multiple results may be returned (each at a different depth). The results may be anchors, or they may be feature points that have not yet been fully identified.
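A hedged sketch of such a hit test is below. The performARHitTest method and the ARHitTestListener / ARHitTestResult types are taken from the ViroCore API; `tapX` and `tapY` are hypothetical screen coordinates from a touch event.

```java
// Hit-test against the real world at a tapped screen point
// (android.graphics.Point; tapX/tapY come from your touch handler)
viroView.performARHitTest(new android.graphics.Point(tapX, tapY), new ARHitTestListener() {
    @Override
    public void onHitTestFinished(ARHitTestResult[] results) {
        // Multiple results may come back, each at a different depth
        for (ARHitTestResult result : results) {
            // Each result reports the type of feature hit (e.g. a plane
            // or an unclassified feature point) and its world position
            Vector worldPosition = result.getPosition();
        }
    }
});
```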

Fixed to World Dragging

Normally, when dragging a Node through a DragListener, the dragged Node stays at a fixed distance from the user, as though the user is dragging the Node across the inner surface of a sphere. This is called FixedDistance dragging.

Viro also supports FixedToWorld dragging in AR, where instead of keeping a dragged Node's distance from the user fixed, the dragged Node's distance is instead determined by its intersection with the nearest real-world object. This is useful for dragging a virtual object across a real-world surface, for example.
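As a sketch, FixedToWorld dragging is enabled per Node. The drag-type enum and DragListener signature below are our reading of the ViroCore Node API; verify them against the reference.

```java
// Make a Node draggable along real-world surfaces
Node draggableNode = new Node();
draggableNode.setDragType(Node.DragType.FIXED_TO_WORLD);
draggableNode.setDragListener(new DragListener() {
    @Override
    public void onDrag(int source, Node node, Vector worldLocation, Vector localLocation) {
        // worldLocation follows the Node's intersection with the nearest
        // real-world object as the user drags
    }
});
```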


Portals

Portals are an AR effect where a 'window' or 'door' is displayed that users can use to peer into a virtual world, as shown below.

PortalScene is the root of the subgraph of Nodes that is displayed through a Portal. Each PortalScene can contain any number of child nodes and content, and each PortalScene can have its own background texture. If a PortalScene is set to passable, users are able to walk through the Portal into the PortalScene.

The following example shows how to create a simple Portal that transports you from your current location to a beach-like resort.

// Add a Light so the ship door portal entrance will be visible
OmniLight light = new OmniLight();
light.setPosition(new Vector(0, 1, -4));
scene.getRootNode().addLight(light);

// Load a model representing the ship door
Object3D shipDoorModel = new Object3D();
shipDoorModel.loadModel(Uri.parse("file:///android_asset/portal_ship.vrx"), Object3D.Type.FBX, null);

// Create a Portal out of the ship door
Portal portal = new Portal();
portal.addChildNode(shipDoorModel);
portal.setScale(new Vector(0.5, 0.5, 0.5));

// Create a PortalScene that uses the Portal as an entrance.
PortalScene portalScene = new PortalScene();
portalScene.setPortalEntrance(portal);
portalScene.setPosition(new Vector(0, 0, -5));

// Add a 'beach' background for the Portal scene
final Bitmap beachBackground = getBitmapFromAssets("beach.jpg");
final Texture beachTexture = new Texture(beachBackground, Texture.Format.RGBA8, true, false);
portalScene.setBackgroundTexture(beachTexture);

scene.getRootNode().addChildNode(portalScene);


To try this example, you can download the beach 360 photo and portal ship door assets here. Unzip them and store them in your application's assets folder. The resulting scene is shown below.

Video and Still Capture

Viro supports video and still image capture, to make sharing AR experiences easy. To do this, simply grab the ViroMediaRecorder from the ViroViewARCore.
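A hedged sketch is below. The method names reflect our reading of the ViroCore ViroMediaRecorder API, and `errorListener`, `finishListener`, and `screenshotListener` are hypothetical callback objects you would supply; verify the exact signatures against the reference documentation.

```java
// Record the AR view to a video file
ViroMediaRecorder recorder = viroView.getRecorder();

// Start recording; the boolean requests that the result also be
// saved to the device's camera roll
recorder.startRecordingAsync("my_ar_session", true, errorListener);

// ...later, stop recording; the listener receives the finished video
recorder.stopRecordingAsync(finishListener);

// Still image capture works similarly
recorder.takeScreenShotAsync("my_ar_photo", true, screenshotListener);
```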

Point Cloud Rendering

Enable point cloud rendering in ARScene to display on-screen the rough feature points that ARCore is detecting. This is primarily useful for debugging and analysis.
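A minimal sketch, assuming the displayPointCloud toggle on ARScene from the ViroCore API:

```java
// Toggle point cloud rendering for debugging
ARScene scene = new ARScene();
scene.displayPointCloud(true);   // render ARCore's detected feature points
// ...
scene.displayPointCloud(false);  // hide them again when done debugging
```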