8th Wall XR for Unity enables any developer to quickly and easily build AR apps that run on any Android or iOS device.
8th Wall XR for Unity brings the best possible AR experience to each user's device. It integrates seamlessly with native APIs such as ARKit and ARCore, while allowing the very same apps to run on any Android phone or iPhone using the phone's camera and inertial sensors. It installs directly into a standard Unity workflow and provides easy-to-use APIs for lighting, surfaces, textures, image detection, world points, hit tests and 6DoF tracking.
This document describes how to integrate 8th Wall XR into Unity applications.
8th Wall XR for Unity Release 11.2 is now available! This release provides a number of updates and enhancements, including:
Release 11.2:
New Features:
Breaking Changes:
Click here to see a full list of changes.
Mobile OS:
iOS:
Android:
Recommended: Android Nougat (7.0) or higher on an ARCore compatible phone.
Unity:
XCode:
Platform | Lighting | AR Background | AR Textures | Camera Motion | Horizontal Surfaces | Vertical Surfaces | Image Detection | World Points | Hit Tests |
---|---|---|---|---|---|---|---|---|---|
ARKit 1.5/2.0 (iOS 11.3+/12.0+) | Yes | Yes | Yes | 6 DoF | Yes, Deformable | Yes, Deformable | Yes | Yes | Yes |
ARKit 1.0 (iOS <11.3) | Yes | Yes | Yes | 6 DoF | Yes, Deformable | No | No | Yes | Yes |
ARCore 1.5** | Yes | Yes | Yes | 6 DoF | Yes, Deformable | Yes, Deformable | Yes | Yes | Yes |
iOS 7/8/9/10 | Yes | Yes | Yes | 6 DoF (Scale Free) | Yes, Instant Planar | No | No | Yes | Yes |
Android (Non-ARCore) | Yes | Yes | Yes | 6 DoF (Scale Free) | Yes, Instant Planar | No | No | Yes | Yes |
[**] ARCore Note: Unsupported devices, devices without the ARCore app installed, or devices with an outdated version of ARCore will use 8th Wall's SLAM tracker instead of ARCore. If you have ARCore installed on your device, please check the Google Play store for updates.
This guide provides all of the steps required to get you up and running quickly and easily.
Videos:
If you don't already have Unity installed, please download it from www.unity3d.com
Note: Make sure you install the Android build support package, the iOS build support package, or both, depending on which platform(s) you plan to develop for:
New Users:
Existing Users:
If you don't have an XR Developer Workspace, go to https://console.8thwall.com/workspaces and create one by clicking the "+ Create a New Workspace" button.
Click "Downloads" in the left hand navigation
Download 8th Wall XR for Unity (.unitypackage file)
Optional: Download a sample 8th Wall XR enabled Unity project:
Open Unity and on the welcome screen, click "New" to create a new project. Give it a name and click "Create Project"
First, add 8th Wall XR to your Unity project. Locate the .unitypackage file you just downloaded and double-click it. A progress bar will appear as it's loaded.
Once finished, a window will display the contents of the XR package. Leave all of the boxes checked and click "Import".
Make sure your XR Developer workspace is selected
Click "Dashboard" in the left hand navigation
Click "+ Create a new App Key":
IMPORTANT: The bundle identifier entered here needs to be identical to the bundle identifier in your Unity project.
Note: A bundle identifier is typically written in reverse domain name notation and identifies your application in the app store. It should be unique.
IMPORTANT: Make sure that the Bundle Identifier matches the bundle identifier you entered in Step #2.
Create an XRController object in your scene. You can do this in the Hierarchy panel via Create -> XRController or GameObject menu -> XRController:
Note: You'll notice that the XRController object is automatically configured with a Tag of "XRController". This is for your convenience so that other XR scripts can automatically find this one. Don't change it!
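If you write your own scripts that need access to XR, you can use the same tag-based lookup that the bundled controller scripts use. A minimal sketch (the class name is illustrative):
using UnityEngine;

public class MyXRScript : MonoBehaviour {
  private XRController xr;

  void Start() {
    // The XRController object is tagged "XRController", so any script can find it by tag.
    xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
  }
}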
At this point you are ready to create AR-enabled objects in your scene. Here are some features to take advantage of while building your application:
The 8th Wall Console is a web based portal that allows you to:
To access the 8th Wall Console, go to https://console.8thwall.com
New Users:
Existing Users:
Go to https://console.8thwall.com/login and login with your email and password.
If you have multiple workspaces associated with your console user, select your XR Developer workspace.
An App Key is required to build and run mobile apps using 8th Wall XR. Each mobile app you create requires a unique App Key that is tied to the bundle identifier of your app.
It is recommended that you use the same bundle identifier, (i.e., package), on Android and iOS so one App Key can be used for both the Android and iOS versions of the same app.
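App Keys can also be supplied from code via XRController.SetAppKey, which must be called before Start() in the Unity lifecycle (see the scripting API section). A minimal sketch, using a placeholder key string:
using UnityEngine;

public class AppKeySetter : MonoBehaviour {
  void Awake() {
    var xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
    // Placeholder value; use the App Key generated for your app's bundle identifier.
    xr.SetAppKey("YOUR-APP-KEY");
  }
}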
Make sure your XR Developer workspace is selected
Click "Dashboard" in the left hand navigation
Click "+ Create a new App Key":
IMPORTANT: The bundle identifier entered here needs to be identical to the bundle identifier in your Unity project.
Note: A bundle identifier is typically written in reverse domain name notation and identifies your application in the app store. It should be unique.
IMPORTANT: Make sure that the Bundle Identifier matches the bundle identifier you entered in Step #2.
NOTE: Before building your project, first navigate away from the XRAppSettings panel by selecting any asset or GameObject in your scene (e.g. click on Main Camera). There is a race condition in some versions of Unity related to AssetDatabase.SaveAssets() being called in PreProcessBuild scripts that can cause Unity to crash.
Note: Disabling an App Key will cause any applications using it to stop working. You can always re-enable it later.
Click Dashboard in the left navigation.
Find the desired App Key in the list of "My XR Applications" and click on the name to expand.
Click the Status slider to the OFF position.
Click OK to confirm.
To re-enable an App Key after it has been disabled:
Click Dashboard in the left navigation
Find the desired App Key in the list of "My XR Applications" and click on the name to expand.
Click the Status slider to the ON position.
Click OK to confirm.
Note: Deleting an App Key will cause any apps using it to stop working. You cannot undo this operation.
Click Dashboard in the left navigation.
Find the desired App Key in the list of "My XR Applications" and click the Delete button.
Type "DELETE" to confirm and then click OK.
Each Workspace has a team containing one or more Users, each with different permissions. Users can belong to multiple Workspace teams.
Add other members to your team to allow them to use the same App Keys you have created. You can collaboratively work on the same set of apps and see their usage.
Team members can have one of three roles:
Capabilities for each role:
Capability | OWNER | ADMIN | DEV |
---|---|---|---|
View Dashboard / Usage | X | X | X |
App Keys - Create | X | X | X |
App Keys - Edit | X | X | X |
App Keys - Delete | X | X | X |
App Keys - Enable/Disable | X | X | X |
Downloads | X | X | X |
Teams - View Users | X | X | X |
Teams - Invite Users | X | X | |
Teams - Remove Users | X | X | |
Teams - Manage User Roles | X | X | |
Workspaces - Create | X | X | X |
Workspaces - Edit | X | | |
Workspaces - Manage Plans | X | | |
View Quick Start | X | X | X |
View Release Notes | X | X | X |
Edit Profile | X | X | X |
A Workspace is a logical grouping of Users and Applications. Grouping Applications under the same Workspace allows you to view consolidated usage and billing. Workspaces can contain one or more Users, each with different permissions. Users can belong to multiple Workspaces.
There are 3 types of Workspaces:
When signing up for a new 8th Wall account, you must select an initial workspace type. You can create additional workspaces later.
The Workspaces page in the console allows you to view and manage the Workspaces you are a member of.
Use one of the following methods to access the Workspaces page:
NOTE: You can have a maximum of one FREE Web Developer workspace and one FREE AR Camera workspace.
8th Wall XR Remote allows you to preview and test AR apps directly inside Unity without having to compile and deploy them to a device. It turns your iOS or Android device into a remote control that streams data to your development computer, such as touches, camera image, camera position (6DoF), field of view, accelerometer, gyroscope, magnetometer and environmental sensing data (e.g. light level, detected surfaces and anchor points).
XR Remote allows developers to build and test AR apps faster than ever before!
Overview video: https://youtu.be/ZkWj11RDNr8
8th Wall Remote runs on both iOS and Android devices, and supports streaming to Unity (Mac & Windows) via USB and WiFi connections.
The following OS / device / connection combinations are currently supported:
Desktop OS | iOS (WiFi) | iOS (USB) | Android (WiFi) | Android (USB) |
---|---|---|---|---|
Windows | Yes [1] | No | Yes [1] | Yes |
Mac | Yes | Yes [2] | Yes | Yes |
Other Information:
[2] Mac / iOS / USB connections:
Remote functionality is enabled by default within XRController. To enable, select the XRController game object within your scene, and check "Enable Remote":
Remote functionality is enabled by default within XRController. To disable, select the XRController game object within your scene, and uncheck "Enable Remote":
To start streaming data from the 8th Wall XR Remote app on your device, hit Play within the Unity editor.
If Remote is enabled, your Game View will indicate that it's waiting for a remote to connect:
Next, open the XR Remote app on your device, and select your connection method. Select USB if your device is connected via USB cable. For WiFi select the hostname of your computer running Unity:
Screenshot of XR Remote in action:
Having issues with XR Remote?
First, please make sure you are running the latest version installed on your device. See Install Remote App for links to the App stores.
The #1 cause of connectivity problems is the firewall on your development system. Before troubleshooting any further, or contacting 8th Wall, please completely disable the Windows or macOS firewall and try again.
Please see the Troubleshooting section for common connection issues and resolutions.
8th Wall XR provides a number of controller scripts that enable you to easily add AR functionality to your application without having to write any code. Simply attach a controller to a relevant game object.
The following controllers are provided:
Controller | What Does It Do? | Attached To | Description |
---|---|---|---|
XRCameraController | Camera Movement | Main Camera | XRCameraController modifies the position and rotation of the camera in your scene as you move your device in the real world. |
XRImageDetectionController | Image Detection | Game Object | Set and get image targets to be detected |
XRLightController | Light Estimation | Directional Light | XRLightController adjusts the intensity of your scene light based on the lighting conditions in the world around you. |
XRSurfaceController | Surface Estimation | Game Object | XRSurfaceController places objects onto detected surfaces. |
XRVideoController | AR Scene Background | Main Camera | XRVideoController captures camera input and sets it as the background of your scene. |
XRVideoTextureController | AR Textures | Game Object | XRVideoTextureController captures camera input and sets it as the main texture on a game object. |
Controller overview video: https://youtu.be/NPI6hnHlNgs
Description
XRCameraController attaches to your Main Camera. Its primary function is to control the position and rotation of the camera in your scene as you move your device in the real world.
6DoF tracking is supported on all phones. This includes phones without ARKit or ARCore!
Variables
Parameter | Type | Default | Description |
---|---|---|---|
scale | float | 1.0 | scale the camera's position vector by this value to affect movement speed |
Example
Select the Camera in your scene and perform one of the following actions:
To scale the movement speed of the camera, adjust the "scale" parameter.
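The scale field is public, so it can also be set from code instead of the Inspector. A minimal sketch, assuming the script is on the same camera and runs before XRCameraController initializes (e.g. via Script Execution Order):
using UnityEngine;

public class FeetScale : MonoBehaviour {
  void Awake() {
    // 3.28084 makes one Unity unit correspond to one foot of real-world movement.
    GetComponent<XRCameraController>().scale = 3.28084f;
  }
}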
Videos
6DoF Experience (On all phones!): https://youtu.be/R07Y3EE9BwY
Please refer to the support matrix to see which devices support image detection.
Description
XRController contains functions that allow you to set and get image targets to be detected:
8th Wall's public GitHub repo contains two example scripts that make it easy to define images to detect and trigger events on Game Objects when image detection occurs:
This example simply moves a specified game object to the location of the detected image, but could be customized to achieve different behavior.
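As a rough illustration of that pattern, the sketch below polls the scripting API each frame and moves its game object to the first detected target; it assumes the position field shown in the GetDetectedImageTargets example is a Vector3.
using System.Collections.Generic;
using UnityEngine;

public class MoveToDetectedImage : MonoBehaviour {
  private XRController xr;

  void Start() {
    xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
  }

  void Update() {
    // Move this game object to the first image-target detected this frame, if any.
    List<XRDetectedImageTarget> targets = xr.GetDetectedImageTargets();
    if (targets.Count > 0) {
      transform.position = targets[0].position;
    }
  }
}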
Example
In this example, we will use an image of a rocket to trigger placement of a clock game object.
This example assumes you have already:
Download XRImageDetectionController.cs and XRImageDetectionTargetController.cs and add them to your Unity project:
An image of the 8th Wall rocket exists in the GitHub repo under images/. Download and add the image to your Unity project. This makes it easy to test image detection by simply pointing your camera at the image.
For each image you wish to detect, make sure the target texture has the following properties set:
Create an empty game object called "XRImageDetectionController" and attach XRImageDetectionController.cs:
Configure Detection Textures:
For each image:
Attach XRImageDetectionTargetController.cs to the game object you wish to "anchor" to the image target
Result
Video: https://youtu.be/EvFisEJEHsY
Description
XRLightController adjusts the intensity of your scene light based on the lighting conditions in the world around you.
Example
Select the light in your scene and perform one of the following actions:
Description
Attach XRSurfaceController to a game object in your scene. When XR detects a surface, the game object will be placed on the surface. Typically this will be a Plane and other game objects will be children of the plane so everything is placed together.
IMPORTANT: For non-ARKit/non-ARCore devices, please make sure that the game object (e.g. Plane) that has the XRSurfaceController attached is at height zero (y=0). To get an idea of how it will look on your device, select the Main Camera and look at the Camera Preview.
AR Shadows: If you have attached an XRSurfaceController to a plane (i.e. a "ground" object) in your scene, and want it to both be transparent AND receive shadows, click here for more info.
Variables
Parameter | Type | Default | Description |
---|---|---|---|
deformToSurface | bool | false | If true, modify the rendered mesh and the collider mesh of the surface so that it matches the detected surface. This allows for interactions like shadows that clip to surface boundaries, and objects that can fall off surfaces. Only for ARKit/ARCore based phones. |
displayImmediately | bool | false | If true, the game object will appear as placed in the scene prior to surface detection. If false, the renderer and collider for this object are disabled, and all child objects are deactivated until a surface is detected. |
groundOnly | bool | false | If true, only attach to ground surfaces. If displayImmediately is on, groundOnly only adjusts the height of the game object to try to match it to the ground. If displayImmediately is off, groundOnly searches for surfaces that are sufficiently lower than the camera and are lower than any other detected surface. If false, use any detected surface. |
lockToFirstSurface | bool | true | If true, attach to the first detected surface and don't move it. If false, move the game object to the currently active surface. |
onSurfaceReady | UnityEvent | | Invoked as soon as the surface is visible and positioned. This is invoked immediately when displayImmediately is true. Otherwise it fires at the same time as onSurfaceAttach. |
onSurfaceAttach | UnityEvent | | Invoked on first attachment to a detected surface, i.e. the first time a suitable surface is detected. This is invoked immediately on devices where native AR engines are disabled or unavailable. |
onSurfaceSwitch | UnityEvent | | Invoked when switching from one surface to another after the first attachment to a surface. This will never be invoked when lockToFirstSurface is true, when displayImmediately and groundOnly are true, or on devices where native AR engines are disabled or unavailable. |
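The UnityEvent hooks above can be wired up in the Inspector or from code. A minimal sketch, assuming XRSurfaceController is attached to the same game object:
using UnityEngine;

public class SurfaceEventLogger : MonoBehaviour {
  void Start() {
    var surface = GetComponent<XRSurfaceController>();
    // Log the first attachment to a detected surface.
    surface.onSurfaceAttach.AddListener(() => Debug.Log("Attached to a surface"));
    // Log subsequent switches to other surfaces.
    surface.onSurfaceSwitch.AddListener(() => Debug.Log("Switched surfaces"));
  }
}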
Example
First, create a plane and call it "GameSurface":
Select the object in your scene and perform one of the following actions:
Description
The XRVideoController script captures camera input and sets it as the background of your scene.
Example
Select the Camera in your scene and perform one of the following actions:
Unity Editor:
Result:
Description
The XRVideoTextureController script captures camera input and sets it as the main texture on a game object.
Example
Select the object in your scene and perform one of the following actions:
It's recommended that the material on your object is "unlit" (e.g. Unlit/Texture) to avoid appearing washed out. XR takes care of this automatically for you!
Unity Editor:
Result:
This section of the documentation contains details of 8th Wall XR's scripting API.
Use this information to interface directly with the 8th Wall XR engine instead of using the controllers provided by 8th Wall.
Enumeration
Description
Indicates the availability of ARCore on the current device.
Properties
Property | Description |
---|---|
UNSPECIFIED | The device is non-Android, or unable to determine the availability of ARCore. |
SUPPORTED_APK_TOO_OLD | The Android device is supported by ARCore, ARCore is installed, but the installed version of ARCore is too old. |
SUPPORTED_INSTALLED | The Android device is supported by ARCore, ARCore is installed, and is available for use. |
SUPPORTED_NOT_INSTALLED | The Android device is supported by ARCore, but ARCore is not installed on the device. |
UNSUPPORTED_DEVICE_NOT_CAPABLE | The Android device is not supported by ARCore. |
UNKNOWN | The Android device does not have ARCore installed, and the query to check for availability has either: failed with an error, timed out or not completed. |
Struct
Description
AR capabilities available to a given device based on its native AR engine (e.g. is it using ARKit vs ARCore vs. 8th Wall's Computer Vision technology).
Properties
Property | Type | Description |
---|---|---|
positionTracking | PositionTracking | The type of AR position tracking used by the device. |
surfaceEstimation | SurfaceEstimation | The type of AR surface estimation used by the device. |
targetImageDetection | TargetImageDetection | The type of AR image-target detection used by the device. |
Convenience Methods
Function | Description |
---|---|
IsPositionTrackingRotationAndPosition | If true, the device has 6DoF positional and rotational camera tracking, where position is based on physically accurate distances (such as the position tracking offered by ARKit and ARCore). |
IsPositionTrackingRotationAndPositionNoScale | If true, the device has full 6DoF positional and rotational camera tracking, where position is scaled based on the distance to a horizontal surface in the visual scene. |
IsSurfaceEstimationFixedSurfaces | If true, the device is using 8th Wall instant surface placement (Non-ARKit/ARCore). |
IsSurfaceEstimationHorizontalOnly | If true, the device is using an early version of ARKit/ARCore that only supports detection of horizontal surfaces. |
IsSurfaceEstimationHorizontalAndVertical | If true, the device is using a newer version of ARKit/ARCore that supports detection of both horizontal and vertical surfaces. |
IsTargetImageDetectionUnsupported | If true, the device is running an AR engine that does not support detection of image-targets. |
IsTargetImageDetectionFixedSizeImageTarget | If true, the device is running an AR engine that supports detection of image-targets of a developer-specified, predefined physical size in meters. |
Enumeration
Description
The type of AR position tracking used by the device.
Properties
Property | Description |
---|---|
UNSPECIFIED | Unable to determine the tracking engine used by the device. |
ROTATION_AND_POSITION | The device has 6DoF positional and rotational camera tracking, where position is based on physically accurate distances (such as the position tracking offered by ARKit and ARCore). |
ROTATION_AND_POSITION_NO_SCALE | The device has full 6DoF positional and rotational camera tracking, where position is scaled based on the distance to a horizontal surface in the visual scene. |
Enumeration
Description
The type of AR surface estimation used by the device.
Properties
Property | Description |
---|---|
UNSPECIFIED | Unable to determine the surface estimation engine used by the device. |
FIXED_SURFACES | The device is using 8th Wall instant surface placement (Non-ARKit/ARCore). |
HORIZONTAL_ONLY | The device is using an early version of ARKit/ARCore that only supports detection of horizontal surfaces. |
HORIZONTAL_AND_VERTICAL | The device is using a newer version of ARKit/ARCore that supports detection of both horizontal and vertical surfaces. |
Enumeration
Description
The type of AR image-target detection used by the device.
Properties
Property | Description |
---|---|
UNSPECIFIED | Unable to determine the image-target detection engine used by the device. |
UNSUPPORTED | The device is running an AR engine that does not support detection of image-targets. |
FIXED_SIZE_IMAGE_TARGET | The device is running an AR engine that supports detection of image-targets of a developer-specified, predefined physical size in meters. |
public bool IsPositionTrackingRotationAndPosition()
Parameters
None
Description
This is a convenience method. Equivalent to calling:
XRCapabilities.positionTracking == XRCapabilities.PositionTracking.ROTATION_AND_POSITION;
If true, the device has 6DoF positional and rotational camera tracking, where position is based on physically accurate distances (such as the position tracking offered by ARKit and ARCore).
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class MyClass : MonoBehaviour {
private XRController xr;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
if (xr.GetCapabilities().IsPositionTrackingRotationAndPosition()) {
// Device is using ARKit/ARCore for 6DoF tracking
} else {
// Device is using 8th Wall 6DoF tracking (Non-ARKit/ARCore).
}
}
}
public bool IsPositionTrackingRotationAndPositionNoScale()
Parameters
None
Description
If true, the device has full 6DoF positional and rotational camera tracking, where position is scaled based on the distance to a horizontal surface in the visual scene.
This is a convenience method. Equivalent to calling:
XRCapabilities.positionTracking == XRCapabilities.PositionTracking.ROTATION_AND_POSITION_NO_SCALE;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class MyClass : MonoBehaviour {
private XRController xr;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
if (xr.GetCapabilities().IsPositionTrackingRotationAndPositionNoScale()) {
// Device is using 8th Wall 6DoF tracking (Non-ARKit/ARCore).
} else {
// Device is using ARKit/ARCore for 6DoF tracking
}
}
}
public bool IsSurfaceEstimationFixedSurfaces()
Parameters
None
Description
If true, the device is using 8th Wall instant surface placement (Non-ARKit/ARCore).
This is a convenience method. Equivalent to calling:
XRCapabilities.surfaceEstimation == SurfaceEstimation.FIXED_SURFACES;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class MyClass : MonoBehaviour {
private XRController xr;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
if (xr.GetCapabilities().IsSurfaceEstimationHorizontalAndVertical()) {
Debug.Log("Device supports vertical surface estimation.");
} else if (xr.GetCapabilities().IsSurfaceEstimationHorizontalOnly()) {
Debug.Log("Device is using ARKit/ARCore for horizontal surface estimation.");
} else if (xr.GetCapabilities().IsSurfaceEstimationFixedSurfaces()) {
Debug.Log("Device is using 8th Wall instant surface placement (Non-ARKit/ARCore).");
} else {
Debug.Log("Can't determine Surface Estimatation capabilities");
}
}
}
public bool IsSurfaceEstimationHorizontalOnly()
Parameters
None
Description
If true, the device is using an early version of ARKit/ARCore that only supports detection of horizontal surfaces.
This is a convenience method. Equivalent to calling:
XRCapabilities.surfaceEstimation == SurfaceEstimation.HORIZONTAL_ONLY;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class MyClass : MonoBehaviour {
private XRController xr;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
if (xr.GetCapabilities().IsSurfaceEstimationHorizontalAndVertical()) {
Debug.Log("Device supports vertical surface estimation.");
} else if (xr.GetCapabilities().IsSurfaceEstimationHorizontalOnly()) {
Debug.Log("Device is using ARKit/ARCore for horizontal surface estimation.");
} else if (xr.GetCapabilities().IsSurfaceEstimationFixedSurfaces()) {
Debug.Log("Device is using 8th Wall instant surface placement (Non-ARKit/ARCore).");
} else {
Debug.Log("Can't determine Surface Estimatation capabilities");
}
}
}
public bool IsSurfaceEstimationHorizontalAndVertical()
Parameters
None
Description
If true, the device is using a newer version of ARKit/ARCore that supports detection of both horizontal and vertical surfaces.
This is a convenience method. Equivalent to calling:
XRCapabilities.surfaceEstimation == SurfaceEstimation.HORIZONTAL_AND_VERTICAL;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class MyClass : MonoBehaviour {
private XRController xr;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
if (xr.GetCapabilities().IsSurfaceEstimationHorizontalAndVertical()) {
Debug.Log("Device supports vertical surface estimation.");
} else if (xr.GetCapabilities().IsSurfaceEstimationHorizontalOnly()) {
Debug.Log("Device is using ARKit/ARCore for horizontal surface estimation.");
} else if (xr.GetCapabilities().IsSurfaceEstimationFixedSurfaces()) {
Debug.Log("Device is using 8th Wall instant surface placement (Non-ARKit/ARCore).");
} else {
Debug.Log("Can't determine Surface Estimatation capabilities");
}
}
}
public bool IsTargetImageDetectionUnsupported()
Parameters
None
Description
If true, the device is running an AR engine that does not support detection of image-targets.
This is a convenience method. Equivalent to calling:
XRCapabilities.targetImageDetection == TargetImageDetection.UNSUPPORTED;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class MyClass : MonoBehaviour {
private XRController xr;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
if (xr.GetCapabilities().IsTargetImageDetectionFixedSizeImageTarget()) {
Debug.Log("Device supports detection of image-targets of a developer-specified, predfined phyisical size in meters.");
} else if (xr.GetCapabilities().IsTargetImageDetectionUnsupported()) {
Debug.Log("Device does not support detection of image-targets.");
} else {
Debug.Log("Can't determine Image Detection capabilities");
}
}
}
public bool IsTargetImageDetectionFixedSizeImageTarget()
Parameters
None
Description
If true, the device is running an AR engine that supports detection of image-targets of a developer-specified, predfined phyisical size in meters.
This is a convenience method. Equivalent to calling:
XRCapabilities.targetImageDetection == TargetImageDetection.FIXED_SIZE_IMAGE_TARGET;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class MyClass : MonoBehaviour {
private XRController xr;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
if (xr.GetCapabilities().IsTargetImageDetectionFixedSizeImageTarget()) {
Debug.Log("Device supports detection of image-targets of a developer-specified, predfined phyisical size in meters.");
} else if (xr.GetCapabilities().IsTargetImageDetectionUnsupported()) {
Debug.Log("Device does not support detection of image-targets.");
} else {
Debug.Log("Can't determine Image Detection capabilities");
}
}
}
Description
XRController provides low-level access to 8th Wall XR and can be used to interface directly with the engine instead of using the controllers provided by 8th Wall.
Public Variables
Parameter | Type | Default | Description |
---|---|---|---|
enableRemote | bool | True | Allow the 8th Wall Remote app to stream AR data to the Unity editor. This value can only be changed prior to pressing 'play' in the Unity Editor. |
remoteOnly | bool | False | Only use the XR controller in this scene to enable development with the 8th Wall Remote app; don't enable any AR features. |
enableLighting | bool | True | Enable lighting estimation in the AR engine. Changes to this value will take place after the next call to ConfigureXR. |
enableCamera | bool | True | Enable camera motion estimation in the AR engine. Changes to this value will take place after the next call to ConfigureXR. |
enableSurfaces | bool | True | Enable horizontal surface finding in the AR engine. Changes to this value will take place after the next call to ConfigureXR. |
enableVerticalSurfaces | bool | False | Enable vertical surface finding in the AR engine. Changes to this value will take place after the next call to ConfigureXR. |
enableCameraAutofocus | bool | False | Allow the camera to autofocus if it's able. This may improve the quality of the camera feed but might decrease tracking performance. |
requestAndroidCameraPermissions | bool | True | If true, 8th Wall XR will automatically request camera permissions on Android devices if needed. If false, you will be responsible for requesting them elsewhere in your app. Only applies to Unity 2018.3 and newer. |
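As a quick illustration, the sketch below enables vertical surface finding at runtime and applies the change with ConfigureXR (described under Public Functions below):
using UnityEngine;

public class EnableVerticalSurfaces : MonoBehaviour {
  void Start() {
    var xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
    // Changes to enableVerticalSurfaces take effect after the next call to ConfigureXR().
    xr.enableVerticalSurfaces = true;
    xr.ConfigureXR();
  }
}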
Public Functions
Function | Description |
---|---|
ConfigureXR | Reconfigure XR based on the currently selected options (lighting, camera, surfaces, etc.). |
DisabledInEditor | Indicates that AR related features are currently disabled when playing a scene in the Editor. |
DisableNativeArEngine | Disable the native AR engine (e.g. ARKit / ARCore) and force the use of 8th Wall's instant surface tracker. |
GetActiveSurfaceId | Returns the id of the currently active surface, or 0 if there is no active surface. The active surface is the detected surface that is currently in the center of the device's camera feed. |
GetActiveSurfaceMesh | (Deprecated) Returns the Mesh of the active surface, or null if there is no active surface. |
GetARCoreAvailability | Returns the availability status of ARCore on Android devices. |
GetCameraIntrinsics | Get the intrinsic matrix of the Camera. This specifies how to set the field of view of the Unity camera so that digital objects are properly overlaid on top of the AR camera feed. |
GetCameraPosition | Get the position of the Camera in Unity's world coordinate system. |
GetCameraRotation | Get the rotation of the Camera in Unity's world coordinate system. |
GetCapabilities | Returns the AR Capabilities available to the device, e.g. position tracking and surface estimation. |
GetDetectedImageTargets | Returns the image-targets that have been detected after calling SetDetectionImages. |
GetDetectionImages | Get the target-images that were last sent for detection to the AR engine via SetDetectionImages, or an empty map if none were set. |
GetDeviceCapabilities | Static method that returns the ARCapabilities available to the device, e.g. position tracking and surface estimation. |
GetLightExposure | Returns the exposure of the environment as a value in the range -1 to 1. |
GetLightTemperature | Returns the light temperature of the environment. |
GetRealityRGBATexture | Returns what the phone's camera is capturing as an RGBA texture. |
GetRealityYTexture | Returns what the phone's camera is capturing as a Y texture stored in the R channel. |
GetRealityUVTexture | Returns what the phone's camera is capturing as a UV texture stored in the RG channels. |
GetRealityTextureAspectRatio | Returns aspect ratio (width/height) of captured image. |
GetSurface | Returns the XRSurface of the surface with the requested ID, or XRSurface.NO_SURFACE if no surface with that id exists. |
GetSurfaces | Returns a list of all surfaces known to the AR engine. |
GetSurfaceWithId | (Deprecated) Returns the Mesh of the surface with the requested ID, or null if no surface with that id exists. |
GetTextureRotation | Get the amount that a camera feed texture should be rotated to appear upright in a given app's UI based on the app's orientation (e.g. portrait or landscape right) on the current device. |
GetTrackingState | Returns tracking state (and reason, if applicable) of underlying AR engine as an XRTrackingState struct. |
GetVideoShader | Returns the appropriate Video shader for drawing the AR scene background. |
GetVideoTextureShader | Returns the appropriate Video texture shader for drawing AR video textures on objects. |
GetWorldPoints | Returns the estimated 3d location of some points in the world, as estimated by the AR engine. |
HitTest | Estimate the 3D position (in Unity units) of a point on the camera feed. |
IsPaused | Indicates whether or not the XR session is paused. |
Pause | Pause the current XR session. While paused, the camera feed is stopped and device motion is not tracked. |
Recenter | For Non-ARKit/ARCore phones, reset surface to original position relative to camera. |
Resume | Resumes the XR session. |
SetAppKey | Set App Key. Needs to be set prior to Start() in the Unity lifecycle (i.e. in Awake or OnEnable). |
SetDetectionImages | Sets the target-images that will be detected. |
ShouldUseRealityRGBATexture | Returns if capture feed is encoded as a single RGBA texture (e.g. ARCore). |
UpdateCameraProjectionMatrix | Configure the XRController for the unity scene. |
public void ConfigureXR()
Parameters
None
Description
Reconfigure XR based on the currently selected options (lighting, camera, surfaces, etc.).
All configuration changes are best-effort based on the setting and device. This means that changes might take effect immediately (next frame), soon (in a few frames), on camera session restart (the next call to pause and resume), or never (if a setting is not supported on a given device).
using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
/**
* Example for toggling AutoFocus On/Off from UI element.
* ConfigureXR() needs to be called after modifying XRController features.
* Attach this to a UI/Toggle element
*/
public class CameraAutofocus : MonoBehaviour {
private XRController xr;
void Awake () {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
bool isChecked = gameObject.GetComponent<UnityEngine.UI.Toggle>().isOn;
UpdateXRConfigCameraAutofocus(isChecked);
}
public void UpdateXRConfigCameraAutofocus(bool enableCameraAutofocus) {
xr.enableCameraAutofocus = enableCameraAutofocus;
xr.ConfigureXR();
}
}
public bool DisabledInEditor()
Parameters
None
Description
Indicates that AR related features are currently disabled when playing a scene in the Editor. AR features can be disabled because "enableRemote" is set to false, or because there is no connected 8th Wall Remote. When running on device, this always returns false (i.e. AR features are always enabled).
public class XRCameraController : MonoBehaviour {
public const float METERS_SCALE = 1.0f;
public const float FEET_SCALE = 3.28084f;
private XRController xr;
private Camera sceneCamera;
private bool initialized = false;
// XRCameraController.scale allows for scaling the effective units of a scene. For example, if
// feet is a more natural unit for a scene than meters, set scale to 3.28084f.
public float scale = METERS_SCALE;
void OnEnable() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
if (!xr.DisabledInEditor()) {
Initialize();
}
}
void Initialize() {
initialized = true;
sceneCamera = GetComponent<Camera>();
xr.UpdateCameraProjectionMatrix(sceneCamera, transform.position, scale);
}
void Update () {
if (xr.DisabledInEditor()) {
return;
}
if (!initialized) {
Initialize();
}
transform.position = xr.GetCameraPosition();
transform.rotation = xr.GetCameraRotation();
sceneCamera.projectionMatrix = xr.GetCameraIntrinsics();
}
}
public void DisableNativeArEngine(bool isDisabled)
Parameters
Parameter | Type | Default | Description |
---|---|---|---|
isDisabled | bool | | Disable the native AR engine (e.g. ARKit / ARCore) and force the use of 8th Wall's instant surface tracker. |
Description
Disable the native AR engine (e.g. ARKit / ARCore) and force the use of 8th Wall's instant surface tracker. This can be useful if you want to test compatibility with legacy devices, or if you want to use 8th Wall's instant surface tracking.
This should only be called during scene setup, e.g. in Awake or OnEnable.
public class XrEngineController : MonoBehaviour {
public bool disableNativeArEngine = false;
void OnEnable() {
var xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
if (xr) {
xr.DisableNativeArEngine(disableNativeArEngine);
}
}
}
public long GetActiveSurfaceId()
Parameters
None
Description
Returns the id of the currently active surface, or 0 if there is no active surface. The active surface is the detected surface that is currently in the center of the device's camera feed.
public class XRSurfaceController : MonoBehaviour {
private XRController xr;
private long surfaceId = Int64.MinValue;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
}
void Update() {
// If there are no meshes, reset the id to default and don't change anything
Mesh mesh = xr.GetActiveSurfaceMesh();
if (mesh == null) {
surfaceId = Int64.MinValue;
return;
}
surfaceId = xr.GetActiveSurfaceId();
}
}
public Mesh GetActiveSurfaceMesh()
Parameters
None
Description
Deprecated in XR 7.0. Replaced by: xr.GetSurface(xr.GetActiveSurfaceId()).mesh
Returns the Mesh of the active surface, or null if there is no active surface.
public class XRSurfaceController : MonoBehaviour {
private XRController xr;
private long surfaceId = Int64.MinValue;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
}
void Update() {
// If there are no meshes, reset the id to default and don't change anything
Mesh mesh = xr.GetActiveSurfaceMesh();
if (mesh == null) {
surfaceId = Int64.MinValue;
return;
}
}
}
public static ARCoreAvailability GetARCoreAvailability()
Parameters
None
Description
Returns the availability status of ARCore on Android devices as an ARCoreAvailability enum.
On non-Android devices, this will return ARCoreAvailability.UNSPECIFIED.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class CheckARCoreAvailability : MonoBehaviour {
void Start() {
switch (XRController.GetARCoreAvailability()) {
case ARCoreAvailability.SUPPORTED_APK_TOO_OLD:
Debug.Log("The Android device is supported by ARCore, ARCore is installed, but the installed version of ARCore is too old.");
break;
case ARCoreAvailability.SUPPORTED_INSTALLED:
Debug.Log("The Android device is supported by ARCore, ARCore is installed, and is available for use.");
break;
case ARCoreAvailability.SUPPORTED_NOT_INSTALLED:
Debug.Log("The Android device is supported by ARCore, but ARCore is not installed on the device.");
break;
case ARCoreAvailability.UNSUPPORTED_DEVICE_NOT_CAPABLE:
Debug.Log("The Android device is not supported by ARCore.");
break;
case ARCoreAvailability.UNKNOWN:
Debug.Log("The Android device does not have ARCore installed, and the query to check for availability has either: failed with an error, timed out or not completed.");
break;
default:
// ARCoreAvailability.UNSPECIFIED
Debug.Log("The device is non-Android, or unable to determine the availability of ARCore.");
break;
}
}
}
public Matrix4x4 GetCameraIntrinsics()
Parameters
None
Description
Get the intrinsic matrix of the Camera. This specifies how to set the field of view of the Unity camera so that digital objects are properly overlaid on top of the AR camera feed. The returned intrinsic matrix is suitable for the Unity Camera that was previously configured using UpdateCameraProjectionMatrix().
public class XRCameraController : MonoBehaviour {
public const float METERS_SCALE = 1.0f;
public const float FEET_SCALE = 3.28084f;
private XRController xr;
private Camera sceneCamera;
// XRCameraController.scale allows for scaling the effective units of a scene. For example, if
// feet is a more natural unit for a scene than meters, set scale to 3.28084f.
public float scale = METERS_SCALE;
void OnEnable() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
sceneCamera = GetComponent<Camera>();
xr.UpdateCameraProjectionMatrix(sceneCamera, transform.position, scale);
}
void Update () {
transform.position = xr.GetCameraPosition();
transform.rotation = xr.GetCameraRotation();
sceneCamera.projectionMatrix = xr.GetCameraIntrinsics();
}
}
public Vector3 GetCameraPosition()
Parameters
None
Description
Get the position of the Camera in Unity's world coordinate system.
public class XRCameraController : MonoBehaviour {
public const float METERS_SCALE = 1.0f;
public const float FEET_SCALE = 3.28084f;
private XRController xr;
private Camera sceneCamera;
// XRCameraController.scale allows for scaling the effective units of a scene. For example, if
// feet is a more natural unit for a scene than meters, set scale to 3.28084f.
public float scale = METERS_SCALE;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
sceneCamera = GetComponent<Camera>();
xr.UpdateCameraProjectionMatrix(sceneCamera, transform.position, scale);
}
void Update () {
transform.position = xr.GetCameraPosition();
transform.rotation = xr.GetCameraRotation();
sceneCamera.projectionMatrix = xr.GetCameraIntrinsics();
}
}
public Quaternion GetCameraRotation();
Parameters
None
Description
Get the rotation of the Camera in Unity's world coordinate system.
public class XRCameraController : MonoBehaviour {
public const float METERS_SCALE = 1.0f;
public const float FEET_SCALE = 3.28084f;
private XRController xr;
private Camera sceneCamera;
// XRCameraController.scale allows for scaling the effective units of a scene. For example, if
// feet is a more natural unit for a scene than meters, set scale to 3.28084f.
public float scale = METERS_SCALE;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
sceneCamera = GetComponent<Camera>();
xr.UpdateCameraProjectionMatrix(sceneCamera, transform.position, scale);
}
void Update () {
transform.position = xr.GetCameraPosition();
transform.rotation = xr.GetCameraRotation();
sceneCamera.projectionMatrix = xr.GetCameraIntrinsics();
}
}
public XRCapabilities GetCapabilities()
Parameters
None
Description
Returns the AR Capabilities available to the device, e.g. position tracking and surface estimation.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class MyClass : MonoBehaviour {
private XRController xr;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
if (xr.GetCapabilities().IsPositionTrackingRotationAndPosition()) {
// Device is using ARKit/ARCore for 6DoF tracking
} else {
// Device is using 8th Wall 6DoF tracking (Non-ARKit/ARCore).
}
}
}
public List<XRDetectedImageTarget> GetDetectedImageTargets()
Parameters
None
Description
Returns the image-targets that have been detected after calling SetDetectionImages.
using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class DetectedImagePositions : MonoBehaviour {
private XRController xr;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
}
void Update () {
var images = xr.GetDetectedImageTargets();
foreach (var im in images) {
Debug.Log("Position of Detected Image Target: " + im.position.ToString());
}
}
}
public Dictionary<String, XRDetectionImage> GetDetectionImages()
Parameters
None
Description
Get the target-images that were last sent for detection to the AR engine via SetDetectionImages, or an empty map if none were set.
To add an image to the image-targets being detected, call:
var images = xr.GetDetectionImages();
images.Add(
"new-image-name",
XRDetectionImage.FromDetectionTexture(
new XRDetectionTexture(newImageTexture, newImageWidthInMeters)));
xr.SetDetectionImages(images);
public static XRCapabilities GetDeviceCapabilities()
Parameters
None
Description
Static method that returns the AR Capabilities available to the device, e.g. position tracking and surface estimation.
public class MyClass : MonoBehaviour {
private XRController xr;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
if (XRController.GetDeviceCapabilities().IsSurfaceEstimationHorizontalOnly()) {
// Device is using ARKit/ARCore for surface estimation.
} else {
// Device is using 8th Wall instant surface placement (Non-ARKit/ARCore).
}
}
}
public float GetLightExposure()
Parameters
None
Description
Returns the exposure of the environment as a value in the range -1 to 1.
public class XRLightController : MonoBehaviour {
private XRController xr;
private Light sceneLight;
void Start() {
sceneLight = GetComponent<Light>();
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
}
void Update () {
if (xr.DisabledInEditor()) {
return;
}
// Update the light exposure.
float exposure = xr.GetLightExposure();
float temperature = xr.GetLightTemperature();
// Exposure ranges from -1 to 1 in XR, adjust to 0-2 for Unity.
sceneLight.intensity = exposure + 1.0f;
sceneLight.colorTemperature = temperature;
RenderSettings.ambientIntensity = exposure + 1.0f;
}
}
public float GetLightTemperature()
Parameters
None
Description
Returns the light temperature of the environment. Temperature measures the color of light on a red to blue spectrum where low values (around 1000) are very red, high values (around 15000) are very blue, and values around 6500 are close to white.
public class XRLightController : MonoBehaviour {
private XRController xr;
private Light sceneLight;
void Start() {
sceneLight = GetComponent<Light>();
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
}
void Update () {
if (xr.DisabledInEditor()) {
return;
}
// Update the light exposure.
float exposure = xr.GetLightExposure();
float temperature = xr.GetLightTemperature();
// Exposure ranges from -1 to 1 in XR, adjust to 0-2 for Unity.
sceneLight.intensity = exposure + 1.0f;
sceneLight.colorTemperature = temperature;
RenderSettings.ambientIntensity = exposure + 1.0f;
RenderSettings.ambientLight = tempToColor(temperature);
}
float trunc(double color) {
return color < 0.0f ? 0.0f : (color > 255.0f ? 255.0f : (float)color);
}
Color tempToColor(float temp) {
temp = temp < 0.0f ? 0.0f : temp / 100.0f;
float red = temp <= 66.0f
? 255.0f
: trunc(329.698727446 * Math.Pow(temp - 60.0, -0.1332047592));
float green = temp <= 0.0f
? 255.0f
: (temp < 66.0f
? trunc(99.4708025861 * Math.Log(temp) - 161.1195681661)
: trunc(288.1221695283 * Math.Pow(temp - 60.0, -0.0755148492)));
float blue = temp >= 66.0f
? 255.0f
: (temp <= 10.0f
? 0.0f
: trunc(138.5177312231 * Math.Log(temp - 10.0) - 305.0447927307));
return new Color(red / 255.0f, green / 255.0f, blue / 255.0f);
}
}
public Texture2D GetRealityRGBATexture()
Parameters
None
Description
Returns what the phone's camera is capturing as an RGBA texture.
public class XRVideoTextureController : MonoBehaviour {
private XRController xr;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
Renderer r = GetComponent<Renderer>();
r.material.shader = xr.GetVideoTextureShader();
if (xr.ShouldUseRealityRGBATexture()) {
r.material.mainTexture = xr.GetRealityRGBATexture();
} else {
r.material.SetTexture("_YTex", xr.GetRealityYTexture());
r.material.SetTexture("_UVTex", xr.GetRealityUVTexture());
}
r.material.SetInt("_ScreenOrientation", (int) Screen.orientation);
}
}
public float GetRealityTextureAspectRatio()
Parameters
None
Description
Returns aspect ratio (width/height) of captured image.
public class XRVideoController : MonoBehaviour {
private XRController xr;
private Camera cam;
public void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
cam = GetComponent<Camera>();
}
public void OnPreRender() {
if (xr.DisabledInEditor()) {
return;
}
float scaleFactor = cam.aspect / xr.GetRealityTextureAspectRatio();
}
}
public Texture2D GetRealityUVTexture()
Parameters
None
Description
Returns what the phone's camera is capturing as a UV texture stored in the RG channels.
public class XRVideoTextureController : MonoBehaviour {
private XRController xr;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
Renderer r = GetComponent<Renderer>();
r.material.shader = xr.GetVideoTextureShader();
if (xr.ShouldUseRealityRGBATexture()) {
r.material.mainTexture = xr.GetRealityRGBATexture();
} else {
r.material.SetTexture("_YTex", xr.GetRealityYTexture());
r.material.SetTexture("_UVTex", xr.GetRealityUVTexture());
}
r.material.SetInt("_ScreenOrientation", (int) Screen.orientation);
}
}
public Texture2D GetRealityYTexture()
Parameters
None
Description
Returns what the phone's camera is capturing as a Y texture stored in the R channel.
public class XRVideoTextureController : MonoBehaviour {
private XRController xr;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
Renderer r = GetComponent<Renderer>();
r.material.shader = xr.GetVideoTextureShader();
if (xr.ShouldUseRealityRGBATexture()) {
r.material.mainTexture = xr.GetRealityRGBATexture();
} else {
r.material.SetTexture("_YTex", xr.GetRealityYTexture());
r.material.SetTexture("_UVTex", xr.GetRealityUVTexture());
}
r.material.SetInt("_ScreenOrientation", (int) Screen.orientation);
}
}
public XRSurface GetSurface(long id)
Parameters
None
Description
Returns the XRSurface of the surface with the requested ID, or XRSurface.NO_SURFACE if no surface with that id exists.
public class XRSurfaceController : MonoBehaviour {
private XRController xr;
private long surfaceId = Int64.MinValue;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
}
void Update() {
if (xr.DisabledInEditor()) {
return;
}
XRSurface surface = surfaceId == Int64.MinValue
? xr.GetSurface(xr.GetActiveSurfaceId()) : xr.GetSurface(surfaceId);
// If no mesh, reset the id to default and don't do anything.
if (surface == XRSurface.NO_SURFACE || surface.mesh == null) {
surfaceId = Int64.MinValue;
return;
}
}
}
public List<XRSurface> GetSurfaces()
Parameters
None
Description
Returns a list of all surfaces known to the AR engine.
public class XRSurfaceController : MonoBehaviour {
private XRController xr;
private long surfaceId = Int64.MinValue;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
}
void Update() {
if (xr.DisabledInEditor()) {
return;
}
List<XRSurface> allSurfaces = xr.GetSurfaces();
// Print the ID's of all surfaces
foreach(XRSurface surface in allSurfaces) {
Debug.Log("Found surface with ID: " + surface.id);
}
}
}
public Mesh GetSurfaceWithId(long id)
Parameters
None
Description
Deprecated in XR 7.0. Replaced by: xr.GetSurface(id).mesh
Returns the Mesh of the surface with the requested ID, or null if no surface with that id exists.
public class XRSurfaceController : MonoBehaviour {
private XRController xr;
private long surfaceId = Int64.MinValue;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
}
void Update() {
Mesh mesh = surfaceId == Int64.MinValue ? xr.GetActiveSurfaceMesh() : xr.GetSurfaceWithId(surfaceId);
if (mesh == null) {
surfaceId = Int64.MinValue;
return;
}
surfaceId = xr.GetActiveSurfaceId();
}
}
public XRTextureRotation GetTextureRotation()
Parameters
None
Description
Get the amount that a camera feed texture should be rotated to appear upright in a given app's UI based on the app's orientation (e.g. portrait or landscape right) on the current device.
switch(xr.GetTextureRotation()) {
case XRTextureRotation.R270:
rotation = -90.0f;
scaleFactor = cam.aspect * xr.GetRealityTextureAspectRatio();
xrMat.SetInt("_ScreenOrientation", (int) ScreenOrientation.LandscapeRight);
break;
case XRTextureRotation.R0:
rotation = 0.0f;
xrMat.SetInt("_ScreenOrientation", (int) ScreenOrientation.Portrait);
break;
case XRTextureRotation.R90:
rotation = 90.0f;
scaleFactor = cam.aspect * xr.GetRealityTextureAspectRatio();
xrMat.SetInt("_ScreenOrientation", (int) ScreenOrientation.LandscapeLeft);
break;
case XRTextureRotation.R180:
rotation = 180.0f;
xrMat.SetInt("_ScreenOrientation", (int) ScreenOrientation.PortraitUpsideDown);
break;
default:
break;
}
public XRTrackingState GetTrackingState()
Parameters
None
Description
Returns tracking state (and reason, if applicable) of underlying AR engine as an XRTrackingState struct.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class TrackingStateExample : MonoBehaviour {
private XRController xr;
private XRTrackingState trackingState;
// Use this for initialization
void Start () {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
}
// Update is called once per frame
void Update () {
trackingState = xr.GetTrackingState();
switch (trackingState.status) {
case XRTrackingState.Status.UNSPECIFIED:
case XRTrackingState.Status.NOT_AVAILABLE:
case XRTrackingState.Status.NORMAL:
Debug.Log("Tracking State: " + trackingState.status);
break;
case XRTrackingState.Status.LIMITED:
Debug.Log("Tracking State: " + trackingState.status + " Reason: " + trackingState.reason);
break;
default:
break;
}
}
}
public Shader GetVideoShader()
Parameters
None
Description
Returns the appropriate Video shader for drawing the AR scene background.
public class XRVideoController : MonoBehaviour {
private XRController xr;
private Material xrMat;
private Camera cam;
public void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
cam = GetComponent<Camera>();
cam.clearFlags = CameraClearFlags.Depth;
xrMat = new Material(xr.GetVideoShader());
xrMat.SetInt("_ScreenOrientation", (int) Screen.orientation);
}
}
public Shader GetVideoTextureShader()
Parameters
None
Description
Returns the appropriate Video texture shader for drawing AR video textures on objects.
public class XRVideoTextureController : MonoBehaviour {
private XRController xr;
private bool initialized = false;
private float texAspect = 1.0f;
Material rMat = null;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
if (!xr.DisabledInEditor()) {
Initialize();
}
}
void Initialize() {
initialized = true;
// Set reality texture onto our material. Make sure it's unlit to avoid appearing washed out.
// Note that this requires Unlit/Texture to be included in the unity project.
Renderer r = GetComponent<Renderer>();
rMat = r.material;
rMat.shader = xr.GetVideoTextureShader();
if (xr.ShouldUseRealityRGBATexture()) {
var tex = xr.GetRealityRGBATexture();
rMat.mainTexture = tex;
texAspect = tex.width * 1.0f / tex.height;
} else {
var ytex = xr.GetRealityYTexture();
rMat.SetTexture("_YTex", ytex);
rMat.SetTexture("_UVTex", xr.GetRealityUVTexture());
texAspect = ytex.width * 1.0f / ytex.height;
}
}
}
public List<XRWorldPoint> GetWorldPoints()
Parameters
None
Description
Returns the estimated 3D locations of points in the world, as estimated by the AR engine.
using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class WorldPointsRenderer : MonoBehaviour {
private static readonly float POINT_WIDTH = 0.04267f; // Golf ball
private static Color DARK_GREEN = new Color(5.0f/255, 106.0f/255, 50.0f/255);
private XRController xr;
private List<PointWithRenderer> points;
private class PointWithRenderer {
public readonly GameObject point;
public readonly MeshRenderer renderer;
public PointWithRenderer() {
point = GameObject.CreatePrimitive(PrimitiveType.Sphere);
renderer = point.GetComponent<MeshRenderer>();
renderer.material.color = DARK_GREEN;
point.transform.localScale = new Vector3(POINT_WIDTH, POINT_WIDTH, POINT_WIDTH);
}
public void RenderPoint(Vector3 position) {
point.transform.position = position;
renderer.enabled = true;
}
}
void RenderPoints(List<XRWorldPoint> pts) {
int v = 0;
foreach (var pt in pts) {
// Center
if (points.Count < v + 1) {
points.Add(new PointWithRenderer());
}
points[v].RenderPoint(pt.position);
++v;
}
for (; v < points.Count; ++v) {
points[v].renderer.enabled = false;
}
}
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
points = new List<PointWithRenderer>();
}
void Update () {
RenderPoints(xr.GetWorldPoints());
}
}
public List<XRHitTestResult> HitTest(float x, float y)
public List<XRHitTestResult> HitTest(float x, float y, List<XRHitTestResult.Type> includedTypes)
Parameters
X and Y are specified as numbers between 0 and 1, where (0, 0) is the upper left corner and (1, 1) is the lower right corner of the camera feed as rendered in the camera that was specified with UpdateCameraProjectionMatrix.
includedTypes optionally filters the results by the source of information that is used to estimate the 3d position. See XRHitTestResult
Description
Estimate the 3D position (in unity units) of a point on the camera feed, optionally filtering the results by the source of information that is used to estimate the 3d position. If no types are specified, all hit test results are returned.
X and Y are specified as numbers between 0 and 1, where (0, 0) is the upper left corner and (1, 1) is the lower right corner of the camera feed as rendered in the camera that was specified with UpdateCameraProjectionMatrix.
Multiple 3d position estimates may be returned for a single hit test, depending on the source of data used to estimate the position. The data source used for each estimate is indicated by XRHitTestResult.Type.
// Hit Test - return all hits
List<XRHitTestResult> hits = new List<XRHitTestResult>();
if (Input.touchCount != 0) {
var t = Input.GetTouch(0);
if (t.phase == TouchPhase.Began) {
float x = t.position.x / Screen.width;
float y = (Screen.height - t.position.y) / Screen.height;
hits.AddRange(xr.HitTest(x, y));
}
}
// Hit Test - filter to only return points on detected surfaces
List<XRHitTestResult> hits = new List<XRHitTestResult>();
List<XRHitTestResult.Type> types = new List<XRHitTestResult.Type>();
types.Add(XRHitTestResult.Type.DETECTED_SURFACE);
if (Input.touchCount != 0) {
var t = Input.GetTouch(0);
if (t.phase == TouchPhase.Began) {
float x = t.position.x / Screen.width;
float y = (Screen.height - t.position.y) / Screen.height;
hits.AddRange(xr.HitTest(x, y, types));
}
}
public bool IsPaused()
Parameters
None
Description
Indicates whether or not the XR session is paused.
public class PauseResume : MonoBehaviour {
private XRController xr;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
}
public void ResumeXRSession() {
if(xr.IsPaused()) {
xr.Resume();
}
}
}
public void Pause()
Parameters
None
Description
Pause the current XR session. While paused, the camera feed is stopped and device motion is not tracked.
public class PauseResume : MonoBehaviour {
private XRController xr;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
}
public void PauseXRSession() {
xr.Pause();
}
}
public void Recenter()
Parameters
None
Description
For Non-ARKit/ARCore phones, reset the surface to its original position relative to the camera.
public class XRViewController : MonoBehaviour {
private XRController xr;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
xr.Recenter();
}
}
public void Resume()
Parameters
None
Description
Resumes the XR session.
public class PauseResume : MonoBehaviour {
private XRController xr;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
}
public void ResumeXRSession() {
if(xr.IsPaused()) {
xr.Resume();
}
}
}
public void SetAppKey(string key)
Parameters
Parameter | Type | Default | Description |
---|---|---|---|
key | string | none | The app key you wish to use in your application. |
Description
Set App Key. Needs to be set prior to Start() in the Unity lifecycle (i.e. in Awake or OnEnable).
You will need to create a unique license key for each 8th Wall app that you develop.
public class MyXRClass : MonoBehaviour {
private XRController xr;
private string myAppKey = "Example123";
void OnEnable() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
xr.SetAppKey(myAppKey);
}
}
public void SetDetectionImages(Dictionary<String, XRDetectionImage> images)
Parameters
None
Description
Sets the target-images that will be detected.
using UnityEngine;
using System.Collections;
using System.Collections.Generic;
public class XRImageDetectionController : MonoBehaviour {
private XRController xr;
// Specify list of images (textures) you want to search for in Unity Editor
// Textures must have the "Read/Write Enabled" setting checked, and must have the "Non Power Of 2" setting set to "None".
public List<XRDetectionTexture> detectionTextures;
public void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
if (detectionTextures.Count > 0) {
Dictionary<string, XRDetectionImage> detectionImages =
new Dictionary<string, XRDetectionImage>();
foreach (XRDetectionTexture detectionTexture in detectionTextures) {
detectionImages.Add(
detectionTexture.tex.name, XRDetectionImage.FromDetectionTexture(detectionTexture));
}
xr.SetDetectionImages(detectionImages);
}
}
}
public bool ShouldUseRealityRGBATexture()
Parameters
None
Description
Returns true if the capture feed is encoded as a single RGBA texture (e.g. ARCore). If false, the capture feed is stored in two separate textures containing the Y and UV color components; these two textures should be combined using an appropriate shader prior to display.
public class XRVideoController : MonoBehaviour {
private XRController xr;
private Material xrMat;
private CommandBuffer buffer;
private bool isCBInit;
private Camera cam;
private bool initialized = false;
private Shader shader;
public void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
if (!xr.DisabledInEditor()) {
Initialize();
}
}
private void Initialize() {
initialized = true;
cam = GetComponent<Camera>();
cam.clearFlags = CameraClearFlags.Depth;
isCBInit = false;
shader = xr.GetVideoShader();
xrMat = new Material(shader);
}
public void OnPreRender() {
if (xr.DisabledInEditor()) {
return;
}
if (!initialized || xr.GetVideoShader() != shader) {
Initialize();
}
if (!isCBInit) {
buffer = new CommandBuffer();
buffer.Blit(null, BuiltinRenderTextureType.CurrentActive, xrMat);
cam.AddCommandBuffer(CameraEvent.BeforeForwardOpaque, buffer);
isCBInit = true;
}
if (xr.ShouldUseRealityRGBATexture()) {
xrMat.mainTexture = xr.GetRealityRGBATexture();
} else {
xrMat.SetTexture("_YTex", xr.GetRealityYTexture());
xrMat.SetTexture("_UVTex", xr.GetRealityUVTexture());
}
}
}
public void UpdateCameraProjectionMatrix(Camera cam, Vector3 origin, float scale)
Parameters
Parameter | Type | Default | Description |
---|---|---|---|
cam | Camera | none | The camera that should have its projection matrix updated. |
origin | Vector3 | (0,0,0) | Initial camera position in the virtual scene. |
facing | Quaternion | (0,0,0,1) | Initial camera orientation in the virtual scene. |
scale | float | 1.0 | Scale provides information about how units in Unity's coordinate system relate to distances in the real world. |
Description
Configure the XRController for the unity scene.
The Camera provides information about how AR overlay data will be presented, so that subsequent calls to GetCameraIntrinsics return appropriate values. Origin and Facing specify an initial camera position and orientation in the virtual scene so that the virtual scene can be properly aligned to the real world.
When the engine is started, the camera will start in the scene at the provided origin, facing along the x/z direction as specified by facing. Tilts and in-plane rotations in the facing rotation are ignored. Scale provides information about how units in Unity's coordinate system relate to distances in the real world.
For example, if scale is set to 10, moving the device 1 physical meter will cause the unity Camera to move by 10 unity units, while moving the device by 10cm will cause the unity Camera to move by 1 unity unit. Note that scale only applies when GetCapabilities().positionTracking is PositionTracking.ROTATION_AND_POSITION. When a device uses PositionTracking.ROTATION_AND_POSITION_NO_SCALE, the scene is scaled by the height of the origin value.
public class XRCameraController : MonoBehaviour {
public const float METERS_SCALE = 1.0f;
public const float FEET_SCALE = 3.28084f;
private XRController xr;
private Camera sceneCamera;
// XRCameraController.scale allows for scaling the effective units of a scene. For example, if
// feet is a more natural unit for a scene than meters, set scale to 3.28084f.
public float scale = METERS_SCALE;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
sceneCamera = GetComponent<Camera>();
xr.UpdateCameraProjectionMatrix(sceneCamera, transform.position, scale);
}
}
Struct
Description
An image-target that was detected by an AR Engine.
Properties
Property | Type | Description |
---|---|---|
id | Int64 | A unique identifier for this detected image-target that is consistent across updates. |
name | String | The name of the image-target that was provided by the developer on a call to XRController.SetDetectionImages. |
position | Vector3 | The position of the center of the image in unity coordinates. |
rotation | Quaternion | The orientation of the detected image. The detected image lies in the x/z plane of this rotation. |
width | float | Width of the detected image-target, in unity units. |
height | float | Height of the detected image-target, in unity units. |
trackingState | TrackingState | The tracking state for the detected image. |
Public Functions
None
Enumeration
The state of tracking for a given image-target.
Properties
Property | Description |
---|---|
UNSPECIFIED | Unable to determine the tracking state of the image-target. |
FULL_TRACKING | The location of an image-target is being tracked using the camera image. |
LAST_KNOWN_POSE | The location of an image-target can no longer be tracked using the camera image, so it is tracked using its last known pose. |
NOT_TRACKING | The location of the image-target is not being tracked. |
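The sketch below illustrates one way a detected image-target might be consumed, using only the properties listed above. The struct name XRDetectedImageTarget and the nested TrackingState qualification are placeholders for illustration; substitute the actual type names exposed by your version of the plugin.
using UnityEngine;

public class ImageTargetAnchor : MonoBehaviour {
  // "XRDetectedImageTarget" is a placeholder name for the detected image-target
  // struct described above; adjust it to match your plugin version.
  public void AlignTo(XRDetectedImageTarget target) {
    // Ignore targets the engine is no longer tracking at all.
    if (target.trackingState == XRDetectedImageTarget.TrackingState.NOT_TRACKING) {
      return;
    }
    // Place this object at the center of the detected image and match its orientation.
    // The detected image lies in the x/z plane of target.rotation.
    transform.position = target.position;
    transform.rotation = target.rotation;
    // Scale a unit-sized quad or plane to the physical footprint of the image (unity units).
    transform.localScale = new Vector3(target.width, 1.0f, target.height);
  }
}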
Struct
Description
Source image data for a image-target to detect. This can either be constructed manually, or from a Unity Texture2D.
Properties
Property | Type | Description |
---|---|---|
widthInPixels | int | The width of the source binary image-target, in pixels. |
heightInPixels | int | The height of the source binary image-target, in pixels. |
targetWidthInMeters | float | The expected physical width of the image-target, in meters. |
encoding | Encoding | The encoding of the binary image data. |
imageData | byte[] | The binary data containing the image-target to detect. |
Constructors
Constructor | Description |
---|---|
XRDetectionImage | Initializes a new XRDetectionImage struct with a specified widthInPixels, heightInPixels, targetWidthInMeters, encoding, and imageData. |
Public Functions
Function | Description |
---|---|
FromDetectionTexture | Initializes a new XRDetectionImage from a unity Texture2D and a specified targetWidthInMeters. |
Enumeration
Description
Indicates the binary encoding format of a image-target to detect.
Properties
Property | Description |
---|---|
UNSPECIFIED | Unable to determine the image-target binary encoding. |
RGB24 | Pixels are stored in 3-byte RGB values, values ranging from 0-255, ordered by row. The length of imageData should be 3 × widthInPixels × heightInPixels. |
RGB24_INVERTED_Y | Pixels are stored in 3-byte RGB values, values ranging from 0-255, ordered by row in reverse order (from bottom to top). The length of imageData should be 3 × widthInPixels × heightInPixels. |
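As a hedged fragment (in the style of the hit test snippets above), the following shows how an XRDetectionImage might be constructed manually from raw RGB24 bytes using the constructor documented below. The XRDetectionImage.Encoding qualification is an assumption based on the FromDetectionTexture implementation shown later.
int width = 512;
int height = 512;
float physicalWidthMeters = 0.2f;  // The printed target is expected to be 20cm wide.
// 3 bytes (R, G, B) per pixel, values 0-255, ordered by row from top to bottom.
// For Encoding.RGB24 the length must be 3 × width × height.
byte[] pixels = new byte[3 * width * height];
// ... fill "pixels" from your own image source here ...
XRDetectionImage detectionImage = new XRDetectionImage(
    width, height, physicalWidthMeters, XRDetectionImage.Encoding.RGB24, pixels);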
static public XRDetectionImage FromDetectionTexture(XRDetectionTexture texture)
Parameters
Property | Type | Description |
---|---|---|
texture | XRDetectionTexture | The unity Texture2D that can be used as a source for image-target detection. |
Description
Generate the image-target (XRDetectionImage) to detect from an XRDetectionTexture
using UnityEngine;
using System.Collections;
using System.Collections.Generic;
public class XRImageDetectionController : MonoBehaviour {
private XRController xr;
// Specify list of images (textures) you want to search for in Unity Editor
// Textures must have the "Read/Write Enabled" setting checked, and must have the "Non Power Of 2" setting set to "None".
public List<XRDetectionTexture> detectionTextures;
public void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
if (detectionTextures.Count > 0) {
Dictionary<string, XRDetectionImage> detectionImages =
new Dictionary<string, XRDetectionImage>();
foreach (XRDetectionTexture detectionTexture in detectionTextures) {
detectionImages.Add(
detectionTexture.tex.name, XRDetectionImage.FromDetectionTexture(detectionTexture));
}
xr.SetDetectionImages(detectionImages);
}
}
}
public XRDetectionImage(int widthInPixels, int heightInPixels, float targetWidthInMeters, Encoding encoding, byte[] imageData)
Parameters
Property | Type | Description |
---|---|---|
widthInPixels | int | The width of the source binary image-target, in pixels. |
heightInPixels | int | The height of the source binary image-target, in pixels. |
targetWidthInMeters | float | The expected physical width of the image-target, in meters. |
encoding | Encoding | The encoding of the binary image data. |
imageData | byte[] | The binary data containing the image-target to detect. |
Description
Initializes a new XRDetectionImage struct with a specified widthInPixels, heightInPixels, targetWidthInMeters, encoding, and imageData.
public XRDetectionImage FromDetectionTexture(XRDetectionTexture texture){
byte[] byteData;
if (texture.tex.format == TextureFormat.RGB24) {
byteData = texture.tex.GetRawTextureData();
} else {
Texture2D newTexture2DInRGB24 = new Texture2D(texture.tex.width, texture.tex.height,
TextureFormat.RGB24, false);
newTexture2DInRGB24.SetPixels(texture.tex.GetPixels());
newTexture2DInRGB24.Apply();
byteData = newTexture2DInRGB24.GetRawTextureData();
}
return new XRDetectionImage(
texture.tex.width,
texture.tex.height,
texture.widthInMeters,
Encoding.RGB24_INVERTED_Y,
byteData);
}
}
Struct
Description
A unity Texture2D that can be used as a source for image-target detection.
Properties
Property | Type | Description |
---|---|---|
tex | Texture2D | The unity texture containing the image data for detection. Textures must have the "Read/Write Enabled" setting checked, and must have the "Non Power Of 2" setting set to "None". |
widthInMeters | float | The expected physical width of the image-target, in meters. |
Public Functions
None
Struct
Description
The result of a hit test query to estimate the 3D position of a point shown on the device's camera feed.
Properties
Property | Type | Description |
---|---|---|
type | Type | The type of data that was used to generate the hit test result. |
position | Vector3 | The estimated 3d position in the unity scene of the queried point on the camera feed. |
rotation | Quaternion | The estimated 3d rotation of the queried point on the camera feed. |
distance | float | The estimated distance from the device of the queried point on the camera feed. |
Public Functions
None
Enumeration
Description
The type of data that was used to generate a hit test result.
Properties
Property | Description |
---|---|
UNSPECIFIED | Unable to determine how a hit test result was generated. |
FEATURE_POINT | The location of a hit test result was estimated from nearby feature points. |
ESTIMATED_SURFACE | The location of a hit test result was inferred from the location of a known surface, but the AR engine has not yet confirmed that the location is actually part of a surface. |
DETECTED_SURFACE | The location of a hit test result is within the bounds of a confirmed detected surface. |
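To show how these results might be consumed, here is a hedged variant of the hit test example above that moves an existing GameObject onto the first confirmed surface hit for a screen tap. The placedObject field is hypothetical; assign any GameObject in the Inspector.
using UnityEngine;

public class TapToPlace : MonoBehaviour {
  private XRController xr;
  // Hypothetical field: assign the GameObject you want to move onto tapped surfaces.
  public GameObject placedObject;
  void Start() {
    xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
  }
  void Update() {
    if (Input.touchCount == 0) {
      return;
    }
    var t = Input.GetTouch(0);
    if (t.phase != TouchPhase.Began) {
      return;
    }
    // Convert the touch to the (0,0)-upper-left normalized coordinates HitTest expects.
    float x = t.position.x / Screen.width;
    float y = (Screen.height - t.position.y) / Screen.height;
    foreach (var hit in xr.HitTest(x, y)) {
      // Only use results that fall within a confirmed detected surface.
      if (hit.type == XRHitTestResult.Type.DETECTED_SURFACE) {
        placedObject.transform.position = hit.position;
        placedObject.transform.rotation = hit.rotation;
        break;
      }
    }
  }
}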
Struct
Description
A surface detected by an AR surface detection engine.
Properties
Property | Type | Description |
---|---|---|
id | Int64 | A unique identifier for this surface that persists across updates. |
type | Type | The type of the surface, e.g. horizontal or vertical. |
rotation | Quaternion | The orientation of this surface. Applying this rotation to GameObjects will rotate them to lie flat on the surface. |
mesh | Mesh | A mesh that covers the surface. |
Public Functions
Function | Description |
---|---|
Equals | Determines whether this instance and a specified object, which must also be an XRSurface object, are equal. |
GetHashCode | Calculates a hash of the surface object. |
Operators
Operator | Description |
---|---|
operator == | Determines whether two specified XRSurface objects are equal, e.g. their id, Type, rotation and mesh are the same. |
operator != | Determines whether two specified XRSurface objects are different, e.g. their id, Type, rotation and mesh are not all the same. |
Enumeration
Description
The type of the surface, e.g. horizontal or vertical.
Properties
Property | Description |
---|---|
UNSPECIFIED | Unable to determine the type of surface that was detected. |
HORIZONTAL_PLANE | A flat surface parallel to the ground, e.g. a table or the ground. |
VERTICAL_PLANE | A flat surface perpendicular to the ground, e.g. a wall. |
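A hedged sketch of working with these values: it iterates the surfaces returned by GetSurfaces() (also used in the GetHashCode example below) and reports only horizontal planes. The XRSurface.Type qualification is assumed from the property table above.
using System.Collections.Generic;
using UnityEngine;

public class HorizontalSurfaceLogger : MonoBehaviour {
  private XRController xr;
  void Start() {
    xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
  }
  void Update() {
    List<XRSurface> surfaces = xr.GetSurfaces();
    foreach (XRSurface surface in surfaces) {
      // Skip anything that isn't a flat, ground-parallel surface.
      if (surface.type != XRSurface.Type.HORIZONTAL_PLANE) {
        continue;
      }
      // The mesh covers the detected surface and is refined as more of it is observed.
      Debug.Log("Horizontal surface " + surface.id + " has " +
                surface.mesh.vertexCount + " mesh vertices");
    }
  }
}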
public override bool Equals(object o)
Parameters
Property | Type | Description |
---|---|---|
o | object | object to compare to. Must be an XRSurface object |
Description
Determines whether this instance and a specified object, which must also be an XRSurface object, are equal, e.g. their id, Type, rotation and mesh are the same.
bool EqualSurfaces(XRSurface a, XRSurface b) {
return a.Equals(b);
}
public override int GetHashCode()
Parameters
None
Description
Gets the hash of an XRSurface object
public class XRSurfaceController : MonoBehaviour {
private XRController xr;
private long surfaceId = Int64.MinValue;
void Start() {
xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
}
void Update() {
if (xr.DisabledInEditor()) {
return;
}
List<XRSurface> allSurfaces = xr.GetSurfaces();
// Print the hash of each detected surface
foreach(XRSurface surface in allSurfaces) {
Debug.Log("Hash of surface " + surface.id + " is: " + surface.GetHashCode());
}
}
}
public static bool operator ==(XRSurface a, XRSurface b)
Description
Determines whether two specified XRSurface objects are equal, e.g. their id, Type, rotation and mesh are the same.
bool IsNotASurface(XRSurface a) {
return a == XRSurface.NO_SURFACE;
}
public static bool operator !=(XRSurface a, XRSurface b)
Description
Determines whether two specified XRSurface objects are different, e.g. their id, Type, rotation and mesh are not all the same.
bool FoundASurface(XRSurface a) {
return a != XRSurface.NO_SURFACE;
}
Enumeration
Description
The rotation that should be applied to the camera feed texture, based on the orientation of the device.
Properties
Property | Description |
---|---|
UNSPECIFIED | Unable to determine the camera feed texture rotation. |
R0 | The camera feed texture does not need to be rotated. |
R90 | The camera feed texture should be rotated by 90 degrees. |
R180 | The camera feed texture should be rotated by 180 degrees. |
R270 | The camera feed texture should be rotated by 270 degrees. |
Struct
Description
Indicates the current quality of a device's tracking in the user's current environment.
Properties
Property | Type | Description |
---|---|---|
status | Status | Indicates the current quality level of tracking. |
reason | Reason | Indicates why tracking is currently limited. Only specified when tracking status is LIMITED. |
Public Functions
None
Enumeration
Description
Indicates the current quality level of tracking.
Properties
Property | Description |
---|---|
UNSPECIFIED | Unable to determine tracking quality. |
NOT_AVAILABLE | Tracking is not currently enabled. |
LIMITED | Tracking is enabled but its quality is currently low. See Reason for more information. |
NORMAL | Tracking is enabled and operating as expected. |
Enumeration
Description
Indicates why tracking is currently limited. Only specified when tracking status is LIMITED.
Properties
Property | Description |
---|---|
UNSPECIFIED | Tracking status is not currently LIMITED, or unable to determine why tracking is LIMITED. |
INITIALIZING | Tracking is limited because the tracking engine is still starting up. |
RELOCALIZING | Tracking is limited because the tracking engine is unable to determine the device's current location. |
TOO_MUCH_MOTION | Tracking is limited because the device is moving too much. |
NOT_ENOUGH_TEXTURE | Tracking is limited because the current camera feed does not contain enough visual information to determine how the device is moving. |
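Building on the GetTrackingState example earlier, this hedged sketch maps a LIMITED reason to a user-facing hint. The XRTrackingState.Reason qualification is assumed from the property table above, and the hint strings are purely illustrative.
using UnityEngine;

public class TrackingHints : MonoBehaviour {
  private XRController xr;
  void Start() {
    xr = GameObject.FindWithTag("XRController").GetComponent<XRController>();
  }
  void Update() {
    XRTrackingState state = xr.GetTrackingState();
    if (state.status != XRTrackingState.Status.LIMITED) {
      return;
    }
    // Translate the limited-tracking reason into a hint you could surface in your UI.
    switch (state.reason) {
      case XRTrackingState.Reason.INITIALIZING:
        Debug.Log("Starting up - look around slowly.");
        break;
      case XRTrackingState.Reason.RELOCALIZING:
        Debug.Log("Re-finding your position - return to an area you viewed before.");
        break;
      case XRTrackingState.Reason.TOO_MUCH_MOTION:
        Debug.Log("The device is moving too fast - slow down.");
        break;
      case XRTrackingState.Reason.NOT_ENOUGH_TEXTURE:
        Debug.Log("Point the camera at a more detailed area.");
        break;
      default:
        break;
    }
  }
}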
Struct
Description
A point in the world detected by an AR engine.
Properties
Property | Type | Description |
---|---|---|
id | Int64 | A unique identifier of this point that persists across frames. |
position | Vector3 | The 3d position of the point in the unity scene. |
confidence | float | Indicates how confident the AR engine is in the location of this point. |
Public Functions
None
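As a small, hedged fragment in the style of the hit test snippets, the following filters world points by confidence before using their positions. The 0.5f threshold is arbitrary, and a 0-to-1 confidence range is an assumption.
// Collect only the positions of points the AR engine is reasonably confident about.
List<Vector3> reliablePositions = new List<Vector3>();
foreach (XRWorldPoint pt in xr.GetWorldPoints()) {
  if (pt.confidence < 0.5f) {
    continue;  // Skip points with low confidence (threshold and range are assumptions).
  }
  reliablePositions.Add(pt.position);
}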
In order to fully support ARCore functionality within your 8th Wall XR enabled Unity application, you will need to install ARCore 1.5 (or newer) on your ARCore supported mobile device.
ARCore Supported Devices: https://developers.google.com/ar/discover/supported-devices
Install ARCore on the device:
Note: Unsupported devices, devices without the ARCore app installed, or devices with an outdated version of ARCore will use 8th Wall's SLAM tracker instead of ARCore. If you have ARCore installed on your device, please check the Google Play store for updates.
To access ARKit features that haven't been exposed through the 8th Wall XR API, or to modify ARKit configuration parameters, you can override the ARSession, its configuration, and its session delegate.
To do so, edit Assets/Plugins/iOS/XROverride.mm
Note: This is intended for advanced users only.
This file contains 3 methods that you can override:
Function | Description |
---|---|
c8_getCustomARSession | Allows you to instantiate your own ARSession for 8th Wall to use. |
c8_getCustomARSessionConfig | Allows you to override the default ARWorldTrackingConfiguration (e.g. to enable things like environmental texture cubemaps) |
c8_getCustomARSessionDelegate | Allows you to implement delegate methods (e.g. if you wanted to use didUpdateAnchors for reading AREnvironmentProbeAnchors) |
Overriding c8_getCustomARSession allows you to instantiate your own ARSession for 8th Wall to use.
Example:
void* c8_getCustomARSession() {
auto session = [ARSession new];
return (__bridge_retained void*) session;
}
If you want a custom ARWorldTrackingConfiguration (whether 8th Wall makes an ARSession for you or not), override this method.
Example:
void* c8_getCustomARSessionConfig() {
auto config = [ARWorldTrackingConfiguration new];
return (__bridge_retained void*)config;
}
Note: The default ARWorldTrackingConfiguration used by 8th Wall XR currently has the following properties:
If you want custom ARSessionDelegate methods for the existing ARSession, override this method to create an NSObject<ARSessionDelegate> and return its pointer.
Example:
void* c8_getCustomARSessionDelegate() {
auto delegate = [CustomDelegate new];
return (__bridge_retained void*) delegate;
}
@interface CustomDelegate : NSObject<ARSessionDelegate>
- (void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame;
@end
@implementation CustomDelegate
- (void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame {
NSLog(@"CustomDelegate::didUpdateFrame");
}
@end
AR is more realistic when virtual objects in your scene cast shadows onto the physical world. 8th Wall XR provides a material and a shader that can be used on transparent surfaces.
Video: https://youtu.be/NPI6hnHlNgs?start=258
Under Assets/XR/Materials/, you will find a material called "XRTransparentSurface". Apply the material to the surface object you want to make transparent (e.g. a ground plane).
Alternatively, create your own material (for example, call it MyXRTransparentSurface), select 8thWall -> XRShadow from the list of shaders, then apply the material to the surface object you want to make transparent (e.g. a ground plane):
To adjust the opacity of the shadows, select your Directional Light and adjust the Strength of the shadow. To improve shadow quality, set the light's shadow Resolution to "Very High Resolution".
For improved shadow quality, go to Edit -> Project Settings -> Quality. Set Shadow Resolution to "Very High Resolution" and Shadow Projection to "Close Fit".
Also, check the "Default" quality settings (indicated by the GREEN checkbox) for each platform, and make sure that shadows are enabled:
If needed, change the default quality level by clicking on the "down arrow" under each platform and selecting a higher quality that has appropriate shadow settings:
Result:
AR is more realistic when virtual objects in your scene are sized properly for the environment around you. Often, assets imported into your unity scene (from the Asset Store or elsewhere) aren't sized properly and need to be adjusted. While you can typically adjust the transform.localScale of the objects, it can get complex if you have to scale hundreds or thousands of game objects. Also, scaling objects can have a negative impact on things like physics, especially in situations where objects are scaled very small.
8th Wall XR allows you to adjust the experienced scale without actually modifying the scale of your game objects. This works differently depending on the type of device:
ARKit/ARCore devices
For ARKit/ARCore devices, adjust the "scale" parameter of the XRCameraController script that is attached to the Main Camera. By increasing this value you essentially scale both the content and the camera movement speed. Increase this value to make objects look smaller. Decrease the value to make them look bigger.
8th Wall SLAM devices (Non-ARKit/Non-ARCore)
For Non-ARKit/Non-ARCore devices, adjust the scale of the scene by modifying the height (Y-position) of the Main Camera. Use the Camera Preview within the Unity Editor to get a feel for how the objects will look:
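For the ARKit/ARCore path, the scale can also be adjusted from code rather than the Inspector. The hedged sketch below assumes the XRCameraController script from the UpdateCameraProjectionMatrix section is attached to the same camera; it sets the scale in Awake so the value is in place before XRCameraController calls UpdateCameraProjectionMatrix in its Start().
using UnityEngine;

public class SceneScaleSetup : MonoBehaviour {
  // Attach this to the Main Camera, next to XRCameraController.
  void Awake() {
    // Awake runs before Start, so the new scale is picked up when
    // XRCameraController configures the projection matrix in its Start().
    var cameraController = GetComponent<XRCameraController>();
    if (cameraController != null) {
      cameraController.scale = XRCameraController.FEET_SCALE;  // 3.28084f: work in feet.
    }
  }
}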
You can test the latest features of 8th Wall XR by switching to the Beta channel.
To Switch Channels:
Release 11.2:
New Features:
Breaking Changes:
Release 11.1:
Fixes and Enhancements:
Release 11:
Fixes:
Release 10:
Fixes:
Release 9.3:
Fixes:
Release 9.2:
Fixes:
Release 9.1:
New Features:
Fixes:
Release 9:
New Features:
Fixes:
Release 8.1:
Fixes:
Release 8:
New features:
Enhancements and Fixes:
Release 7.1:
Fixes:
Release 7:
New Features:
Enhancements and Fixes:
Release 6.3:
Fixes:
Release 6.2:
Fixes:
Release 6.1:
Fixes:
Release 6:
New Features:
Enhancements and Fixes:
Release 5:
New 8th Wall Console - provides developer access to:
New Features:
Enhancements and Fixes:
Release 4:
Enhancements and Fixes:
XRSurfaceController:
XRController:
XREngine:
Release 3:
XRSurfaceController enhancements:
Release 2:
Release 1:
Issue: I'm using 8th Wall XR and Unity 2018.3 (or newer) to build an Android app, but when I run the app on device, the background is white and I'm not asked for camera permissions.
Example:
Resolution: If you are using Unity 2018.3 or newer, your app must manually request Android permissions.
Unity made a change in 2018.3 where Android apps no longer automatically request camera permissions. It’s now controlled by a new API.
Here is more info on the change and new APIs:
https://blogs.unity3d.com/2018/12/13/introducing-unity-2018-3/
https://docs.unity3d.com/2018.3/Documentation/Manual/android-RequestingPermissions.html
8th Wall XR Release 10 added an option to XRController that will handle this for you. This option only appears if you are using Unity 2018.3 or newer. If you would like 8th Wall XR to automatically request Android Camera Permissions on app startup, check this box:
If you'd prefer to handle requesting Android permissions yourself, disable "Request Android Camera Permissions" on your XRController and create a custom script to handle this. Here is a very basic example of requesting camera permissions (attach to a game object):
using UnityEngine;
#if PLATFORM_ANDROID
using UnityEngine.Android;
#endif
public class RequestCamera : MonoBehaviour {
void Awake() {
#if PLATFORM_ANDROID
if (!Permission.HasUserAuthorizedPermission(Permission.Camera)) {
Permission.RequestUserPermission(Permission.Camera);
}
#endif
}
}
Issue: When XR Remote is running on an Android device and connected to Unity via USB, data isn't streaming in Play mode.
Resolution: Restart adb
Kill adb
adb kill-server
Restart adb
adb start-server
adb devices -l
Issue: XR Remote won't connect to Unity over WiFi
If you are unable to connect via WiFi, please check the following:
Issue: Unity crashes when I attempt to Build my project if XRAppSettings is selected.
Resolution: Before building your project, first navigate away from the XRAppSettings panel. Simply select any other asset or GameObject in your scene (e.g. click on Main Camera). There is currently a race condition in Unity related to AssetDatabase.SaveAssets() being called in PreProcessBuild scripts that can cause Unity to crash.
Issue: I have a 3D Plane game object in my scene, and applied the XRTransparent surface. Shadows appear within the Unity Editor, but when the app is run on my device, the shadows are missing.
Resolution: Check default quality settings. The selected profile is set to "Disable Shadows".
Video: https://youtu.be/NPI6hnHlNgs?start=305
INCORRECT - Example of "Very Low" default quality where shadows are disabled:
CORRECT - Example where shadows are enabled:
If needed, change the default quality level by clicking on the "down arrow" under each platform and selecting a higher quality that has appropriate shadow settings:
Issue: I'm trying to run my app on a device without ARKit or ARCore. As I move my phone, the camera position does not update.
Resolution: Check the position of the Main Camera in your Unity Scene. The camera should NOT be at a height (Y) of zero. Set it to a desired approximate real-life height (e.g. 1.0 or 1.2).
To preview initial positions of objects in your scene, select the Main Camera. The Unity Editor will show a camera preview. This is what you can expect to see when running the app on a Non-ARKit/Non-ARCore device, and can size/position your objects accordingly.
Also, if you have any game objects with an XRSurfaceController attached, it's assumed that the ground is at a height (Y) of zero.
Issue: Scene crashes on iOS devices that support Metal, if Metal API is removed from Unity project settings.
XCode reports an "EXC_BAD_ACCESS" error message within the setManagedCameraTexture() function:
This happens on certain phones if "Auto Graphics API" is unchecked and Metal is removed from the list of Graphics APIs as seen here:
See Apple Documentation for the list of phones that support Metal
Resolution: Add Metal to list of Graphics APIs
Issue: My mobile application asks for camera access, but doesn't render the camera view.
Resolution: Disable Multithreaded Rendering
Issue: When building for iOS, XCode complains there are undefined symbols
Example:
Undefined symbols for architecture arm64:
"_cblas_sgemv", referenced from:
c8::HMatrix::operator*(c8::HPoint<3ul>) const in libXRPlugin.a (hmatrix.o)
c8::HMatrix::operator*(c8::HVector<3ul>) const in libXRPlugin.a (hmatrix.o)
ld: symbol(s) not found for architecture arm64
Resolution:
8th Wall XR automatically adds various frameworks via the XRBuildPostProcessor script (in XR/Editor). In this particular example, the cblas functions come in from linking the "Accelerate" framework.
Make sure that the following frameworks are included with your XCode project:
Issue: None of the AR features are working (camera position/rotation, Video background, etc)
Looking through logs, you'll see NullReferenceExceptions such as:
"NullReferenceException: A null value was found where an object instance was required.
at XRController.GetCameraPosition () [0x00000] in
Resolution: Add an XRController (with "XRController" tag) to your scene. You can do this in the Hierarchy panel via Create -> XRController or GameObject menu -> XRController:
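If you would like a clearer error than the NullReferenceException above, a hedged variant of the usual lookup is shown below; the error message is illustrative.
using UnityEngine;

public class SafeXRLookup : MonoBehaviour {
  private XRController xr;
  void Start() {
    // FindWithTag returns null when no active object carries the tag.
    GameObject controllerObject = GameObject.FindWithTag("XRController");
    if (controllerObject == null) {
      Debug.LogError("No GameObject tagged 'XRController' found. " +
                     "Add an XRController to the scene via GameObject -> XRController.");
      enabled = false;  // Disable this component so later code doesn't use the missing reference.
      return;
    }
    xr = controllerObject.GetComponent<XRController>();
  }
}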
Need some help? 8th Wall is here to help you succeed. Contact us directly, or reach out to the community to get answers.
Ways to get help:
Slack | Email Support | Stack Overflow | GitHub
[1] Intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for 8th Wall’s products remain at the sole discretion of 8th Wall, Inc.