

Powering AR with Viro


Device Support

Viro uses ARCore for tracking, so AR is available on any ARCore-capable device; see Google's ARCore documentation for the list of supported devices.

The Viro platform supports Augmented Reality (AR) development through various AR-specific components and features. This guide offers an overview of AR and a high-level look at the tools Viro provides to build these experiences.

Building AR Experiences

The Viro platform provides a large suite of components that developers can leverage to build AR experiences. Unlike traditional 3D rendering and VR, however, AR experiences must respond to the user's real world. To accomplish this, the Viro platform provides a number of AR-specific classes and features, detailed below:

AR Classes

The following is an overview of AR classes provided by Viro.




ViroViewARCore

Top-level Android View for rendering AR content. The camera feed is automatically rendered as the view's background.


ARScene

Maintains a scene graph that is rendered in AR atop the real world. Enables the creation of anchors that latch virtual objects onto the real world, and provides a callback interface that alerts you when 'trackable' objects like horizontal or vertical planes are detected.


ARAnchor

Anchors connect nodes (ARNodes) to the real world. Anchors can also be hosted in the cloud to create multiplayer AR applications.


ARNode

Scene-graph node that is anchored to the real world by way of an ARAnchor. Adding content to ARNodes ensures the content will not drift.


ARPlaneAnchor

Anchor that represents a horizontal or vertical plane detected in the real world by the AR system. A plane can serve as the surface for a table-top game, or as a wall on which to mount a virtual picture.


ARImageAnchor

Anchor that represents a detected image. You can create entire virtual user interfaces that appear over real-world images, or make things like movie posters come alive when they're detected.


ARImageTarget

Define the images you want Viro to detect by creating ARImageTargets and adding them to the scene.
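As a minimal sketch, registering an image target might look like the following. It assumes an existing arScene and a poster image bundled as an app resource; the constructor shape follows the ViroCore Java API, so verify exact signatures against the reference:

```java
// Load the image to track and describe it to the AR system. The last
// argument is the printed target's physical width in meters, which
// helps the tracker estimate the image's real-world pose.
Bitmap posterBitmap = BitmapFactory.decodeResource(getResources(), R.drawable.poster);
ARImageTarget posterTarget = new ARImageTarget(posterBitmap, ARImageTarget.Orientation.Up, 0.3f);
arScene.addARImageTarget(posterTarget);
```

When the image is later found in the camera feed, the scene reports a corresponding image anchor through its listener.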


ARPointCloud

Render the AR point cloud or extract its raw data for debugging. The AR point cloud is the feed of feature-point detection data coming into the system.
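To show how these classes fit together, here is a hedged sketch that attaches content when the AR system detects a trackable. It assumes an existing arScene and a pre-built contentNode; the callback names follow the ViroCore Java API, and the remaining ARScene.Listener callbacks (tracking state, ambient light) are omitted for brevity:

```java
arScene.setListener(new ARScene.Listener() {
    @Override
    public void onAnchorFound(ARAnchor anchor, ARNode node) {
        // The AR system found a trackable (e.g., a horizontal plane) and
        // created an ARNode latched to it. Content added here won't drift.
        node.addChildNode(contentNode);
    }

    @Override
    public void onAnchorUpdated(ARAnchor anchor, ARNode node) {
        // Called as the AR system refines the anchor's position.
    }

    @Override
    public void onAnchorRemoved(ARAnchor anchor, ARNode node) {
        // Remove or hide attached content when the anchor is lost.
    }

    // ...other Listener callbacks omitted for brevity.
});
```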

AR Enhanced APIs

The AR system also provides other information which we make available through the following APIs on our existing components.




6 Degrees of Camera Movement

Camera position and orientation can be accessed from ViroView.

The camera automatically moves in step with the user's movements in the real world. This results in virtual objects appearing to stay in their positions.

Video Pass Through (rendered as background)


In AR, the back camera is enabled and serves as the "background" of the view onto which virtual objects are rendered.

Ambient Light Estimation

ARScene offers two methods: getEstimatedAmbientLightIntensity() and getEstimatedAmbientLightColor(). You can also listen for ambient light updates via the ARScene.Listener.

Provides an estimate of light intensity and color as detected by the camera.
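As a sketch, these estimates can be applied to a scene light so virtual objects match the real-world lighting. This assumes ambientLight is an AmbientLight already attached to the scene's root node:

```java
// Match the virtual ambient light to the camera's estimate so rendered
// objects blend with the real environment. Call this on each ambient
// light update from the ARScene.Listener.
float intensity = arScene.getEstimatedAmbientLightIntensity();
ambientLight.setIntensity(intensity);
```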

Fixed to World and Fixed to Plane Dragging

Supported by all Nodes by setting the Node's drag type.

Allows the user to drag objects such that they look fixed to points or planes in the real world.

AR-based Hit Tests

ViroViewARCore provides a variety of performARHitTest methods.

Allows the user to get points in the real world from the AR system. For example, if the user touches the screen, what real-world object (if any) did she touch?
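A hedged sketch of a tap-driven hit test follows. It assumes viroView is the active ViroViewARCore and tapX/tapY are screen coordinates; the callback and result types follow the ViroCore Java API, so verify exact names against the reference:

```java
// Ask the AR system what lies beneath a screen touch point.
viroView.performARHitTest(new Point(tapX, tapY), (ARHitTestResult[] results) -> {
    for (ARHitTestResult result : results) {
        // Each result reports what was hit (plane, feature point, etc.)
        // and where it sits in world coordinates.
        Vector position = result.getPosition();
        // Place or move content at 'position'...
    }
});
```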


Portals

Portal and PortalScene.

Allows the developer to add a virtual "portal" from the real-world into a virtual world, and back again.

Video and still capture

The ViroMediaRecorder, accessible from the ViroView.

Makes it easy to record and share AR experiences with others.

With the above features, developers can illuminate their AR experiences with more realistic lighting, make their virtual objects interact realistically with the environment, add other-worldly portals, and much more.