{"_id":"5b689070f3c8dc0003c6d1da","project":"5a065a6134873d0010b396ab","version":{"_id":"5b689070f3c8dc0003c6d1df","project":"5a065a6134873d0010b396ab","__v":0,"forked_from":"5b4e90b9d9ea0b00031c6194","createdAt":"2018-04-18T18:19:34.288Z","releaseDate":"2018-04-18T18:19:34.288Z","categories":["5b689070f3c8dc0003c6d1a4","5b689070f3c8dc0003c6d1a5","5b689070f3c8dc0003c6d1a6","5b05923ea5a2f9000357b452","5b05f793c2c86c0003cbe414","5b689070f3c8dc0003c6d1a7","5b689070f3c8dc0003c6d1a8","5b689070f3c8dc0003c6d1a9"],"is_deprecated":false,"is_hidden":false,"is_beta":false,"is_stable":true,"codename":"","version_clean":"1.9.0","version":"1.9.0"},"category":{"_id":"5b689070f3c8dc0003c6d1a7","project":"5a065a6134873d0010b396ab","version":"5b689070f3c8dc0003c6d1df","__v":0,"sync":{"url":"","isSync":false},"reference":false,"createdAt":"2018-05-24T14:09:29.251Z","from_sync":false,"order":2,"slug":"develop-ar","title":"Augmented Reality"},"user":"579a69d53de0a217007eda56","githubsync":"","__v":0,"parentDoc":null,"updates":[],"next":{"pages":[],"description":""},"createdAt":"2018-05-24T14:11:47.773Z","link_external":false,"link_url":"","sync_unique":"","hidden":false,"api":{"results":{"codes":[]},"settings":"","auth":"required","params":[],"url":""},"isReference":false,"order":2,"body":"When you create a ViroViewARCore, the first thing you'll notice is the camera feed rendered as the background to your ARScene. That camera background represents the real world. Viro enables you to fuse virtual objects and UI with that real world. You accomplish this with tracking and anchors.\n[block:api-header]\n{\n  \"title\": \"Camera Tracking\"\n}\n[/block]\nThe Viro platform supports development of 6 degrees-of-freedom (6DOF) AR experiences. The platform tracks the user's real-world rotation and position as he or she moves, and keeps the virtual camera in sync with that movement. The platform maintains a right-handed coordinate system, where the origin of the system is the user's location at the time AR tracking was initialized. The scale of the coordinate system is zero. The camera's forward vector (the direction in which it is facing) -- is initially `[0, 0, -1]`, and the camera's initial up vector is `[0,1,0]`.\n[block:image]\n{\n  \"images\": [\n    {\n      \"image\": [\n        \"https://files.readme.io/ac9ecad-camera.png\",\n        \"camera.png\",\n        607,\n        676,\n        \"#3cbbfb\"\n      ]\n    }\n  ]\n}\n[/block]\n\n[block:api-header]\n{\n  \"title\": \"AR Scenes and Anchors\"\n}\n[/block]\nAR experiences are created in Viro by creating a [ViroViewARCore](https://developer.viromedia.com/virocore/reference/com/viro/core/ViroViewARCore.html), and configuring and setting an [ARScene](https://developer.viromedia.com/virocore/reference/com/viro/core/ARScene.html) on that view. ARScene blends virtual 3D content with the device camera's view of the real world. Similar to a normal Scene, an ARScene contains a hierarchy of Node objects representing the virtual 3D world. Behind these objects the camera's live video feed is rendered. ARScene ensures the two worlds -- real and virtual -- stay in sync.\n\nThere are two ways to add virtual content to the ARScene, described below.\n\n## Manual Anchoring\n\nThe first is to place content manually. While it's possible to add a [Node][3] to the [ARScene][4] at an arbitrary point in world coordinate space, this is not recommended: if AR conditions change, the Node may begin to drift and disconnect from its initial position. 
## AR Scenes and Anchors

AR experiences are created in Viro by creating a [ViroViewARCore](https://developer.viromedia.com/virocore/reference/com/viro/core/ViroViewARCore.html), then configuring and setting an [ARScene](https://developer.viromedia.com/virocore/reference/com/viro/core/ARScene.html) on that view. ARScene blends virtual 3D content with the device camera's view of the real world. Like a normal Scene, an ARScene contains a hierarchy of Node objects representing the virtual 3D world. Behind these objects the camera's live video feed is rendered. ARScene ensures the two worlds, real and virtual, stay in sync.

There are two ways to add virtual content to the ARScene, described below.

### Manual Anchoring

The first is to place content manually. While it's possible to add a [Node][3] to the [ARScene][4] at an arbitrary point in world coordinate space, this is not recommended: if AR conditions change, the Node may begin to drift and disconnect from its initial position. Instead, the preferred way to add content at arbitrary positions is to *anchor* that content using [ARScene.createAnchoredNode(Vector)][1]. This method returns an ARNode at the provided position. An ARNode is a Node that is affixed to the real world via an ARAnchor. Each ARNode behaves as a regular [scene graph](doc:scenes) node: you can add child nodes, lights, sounds, and more, and this entire virtual subgraph will stay in sync with the real world. An ARNode stays latched to its position until [ARNode.detach()][2] is invoked. Detaching an ARNode removes it from the ARScene and makes it unusable.

The following example shows how to construct an ARNode at an arbitrary position 2 meters in front of the user. A box is added to the ARNode, and "Hello World" text is displayed in a child node above the box.

[1]: https://developer.viromedia.com/virocore/reference/com/viro/core/ARScene.html#createAnchoredNode(com.viro.core.Vector)
[2]: https://developer.viromedia.com/virocore/reference/com/viro/core/ARNode.html#detach()
[3]: https://developer.viromedia.com/virocore/reference/com/viro/core/Node.html
[4]: https://developer.viromedia.com/virocore/reference/com/viro/core/ARScene.html

```java
// Create the AR view and scene
ViroView view = new ViroViewARCore(context, null);
final ARScene scene = new ARScene();

// Create a new anchored node 2 meters in front of the user
ARNode node = scene.createAnchoredNode(new Vector(0, 0, -2));

// Anchored node creation can fail, e.g. in poor lighting conditions
if (node != null) {
    // Display a box at the anchored position
    Box box = new Box(1, 1, 1);
    node.setGeometry(box);

    // Create a child node for the text, positioned above the box
    Node textNode = new Node();
    textNode.setPosition(new Vector(0, 1, 0));

    // Create the 'Hello World' text
    Text text = new Text(view.getViroContext(), "Hello World", 1, 1);
    textNode.setGeometry(text);
    node.addChildNode(textNode);
}

view.setScene(scene);
```
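When the anchored content is no longer needed, the ARNode can be released with the detach() call linked above. A minimal sketch, assuming `node` is the ARNode returned by createAnchoredNode in the example:

```java
// Detach the ARNode once its content is no longer needed. This removes the
// node and everything attached beneath it from the ARScene and makes the
// ARNode unusable, so drop any references to it afterward.
if (node != null) {
    node.detach();
    node = null;
}
```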
### Trackable Anchoring

The second way to add virtual content is to listen for features (also called "trackables") detected by the AR system. Trackables are things like vertical or horizontal surfaces, or [images](doc:ar-image-recognition) (like posters or markers) found in the real world. Each time the AR system finds one of these, it creates an associated [ARAnchor](https://developer.viromedia.com/virocore/reference/com/viro/core/ARAnchor.html). You can attach your content to these features by adding it to the [ARNode](https://developer.viromedia.com/virocore/reference/com/viro/core/ARNode.html) that corresponds to each anchor. To do this, implement the [ARScene.Listener](https://developer.viromedia.com/virocore/reference/com/viro/core/ARScene.Listener.html); its callbacks are invoked whenever ARAnchors are found, updated, or removed.

In the example below, we create an ARScene and set it up to place a virtual Box on every detected *horizontal* real-world plane.

```java
final ARScene scene = new ARScene();

scene.setListener(new ARScene.Listener() {
    @Override
    public void onAmbientLightUpdate(float intensity, Vector color) {
    }

    @Override
    public void onAnchorFound(ARAnchor anchor, ARNode arNode) {
        // Create a Box to sit on every plane we detect
        if (anchor.getType() == ARAnchor.Type.PLANE) {
            ARPlaneAnchor planeAnchor = (ARPlaneAnchor) anchor;

            // Ensure this is a horizontal plane
            if (planeAnchor.getAlignment() == ARPlaneAnchor.Alignment.HORIZONTAL) {
                Box box = new Box(1, 1, 1);
                Node boxNode = new Node();
                boxNode.setGeometry(box);
                arNode.addChildNode(boxNode);
            }
        }
    }

    @Override
    public void onAnchorRemoved(ARAnchor anchor, ARNode arNode) {
    }

    @Override
    public void onAnchorUpdated(ARAnchor anchor, ARNode arNode) {
    }

    @Override
    public void onTrackingUpdated(TrackingState state, TrackingStateReason reason) {
    }
});

ViroView view = new ViroViewARCore(context, null);
view.setScene(scene);
```
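The remaining callbacks are useful for housekeeping. For example, when the AR system stops tracking a feature it previously reported, onAnchorRemoved is invoked with the affected anchor and its ARNode; if you attached content in onAnchorFound, this is a natural place to release it. A minimal sketch of such a cleanup, written as a drop-in body for the listener above (what exactly you need to release depends on what your app attaches and caches):

```java
@Override
public void onAnchorRemoved(ARAnchor anchor, ARNode arNode) {
    // The trackable backing this anchor is gone. Remove the virtual content we
    // attached in onAnchorFound so the scene and our references stay in sync.
    arNode.removeAllChildNodes();
}
```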
