The easiest way to get started with the RealityAPI is to connect to the XR system with the default configuration via the connectXR method.
It will try to connect to the device's camera stream and enable model-based tracking.
Once connected, the camera stream will be shown in the background of the viewer automatically.
Since having access to an image source (e.g. the AR camera stream) is a prerequisite for using the
AR functionalities, the RealityAPI methods can only be used once the RealityAPI has been connected to an image source.
This can be done by calling the connectXR method. The method returns a Promise which resolves when the connection was successful.
It requires a configuration object as an argument, which determines which parts of the API are to be used.
See XRConfiguration for more information. Whether the desired capabilities are available can be checked by
calling the getXRCapabilities method.
Example: Using the device's camera stream
```javascript
const ctx = webvis.getContext();

// Start by requesting the capabilities of the XR system
const capabilities = ctx.getXRCapabilities();

// Check if the XR system supports providing XR images from the device
if (!capabilities.includes(webvis.XRCapability.SUPPORTS_DEVICE_IMAGE_SOURCE)) {
    // The XR system does not support providing XR images from the device.
}

// Connect to the XR system with the desired configuration
const xrConfiguration = {
    imageSourceConfig: {
        type: webvis.XRImageSource.DEVICE // Use the device's camera stream
    },
    modelTrackingEnabled: capabilities.includes(webvis.XRCapability.MODELTRACKER), // If unset, defaults to true
    deviceScreenshotsEnabled: capabilities.includes(webvis.XRCapability.DEVICE_SCREENSHOTS), // If unset, defaults to false
    autoShowBackgroundFeed: true // Show the background feed automatically when available, default is true
};
await ctx.connectXR(xrConfiguration);
```
Now, the RealityAPI is connected to the XR system and can be used to access the AR functionalities.
Please note that some functionalities can only be used if the XR system has the required capabilities.
Example: RealityAPI function call fails due to unsupported XR Capability
```javascript
// Will fail if the XR system does not support model tracking,
// i.e. XRCapability.SUPPORTS_MODEL_TRACKING is not present in the capabilities
ctx.exportXRInitTemplate();
```
For a complete list of available capabilities, see XRCapability.
For a detailed description of the AR functionalities, see the corresponding methods in the RealityAPI.
After the AR functionalities are no longer needed, the RealityAPI can be disconnected by calling the disconnectXR method.
Also, if the user wants to use a different configuration, the RealityAPI can be reconnected with the new configuration.
Example: Disconnecting from and reconnecting to the XR system
```javascript
// ...

// Disconnect from the XR system
await ctx.disconnectXR();

// Reconnect to the XR system with a different configuration
const newXRConfiguration = { ... };
await ctx.connectXR(newXRConfiguration);

// ...
await ctx.disconnectXR();
```
The XR Playback API is a subset of the RealityAPI that allows for playback of frame sequences.
With the XR Playback API, recorded frame sequences can be played back anywhere, removing the need to visit the physical location for each test iteration.
The recordings can be used to test new features easily or to increase development iteration speed without needing a physical device available.
To use the XR Playback API, the XR system must have the XR Playback capability SUPPORTS_PLAYBACK. Furthermore, inside the xrConfiguration
object, the imageSourceConfig must be of type PLAYBACK. This type allows for a more detailed configuration
tailored to the playback source. The configuration object must specify the URL of the frame sequence to be played back.
The URL must point to the manifest.json file of the frame sequence which contains the necessary information about the frame sequence.
See XRImageSourceConfigPlayback for more information.
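A playback configuration might look like the following sketch. The `url` property name and the exact shape of XRImageSourceConfigPlayback are assumptions for illustration (as is the literal "PLAYBACK" string standing in for the webvis.XRImageSource.PLAYBACK enum value); consult XRImageSourceConfigPlayback for the actual fields.

```javascript
// Hypothetical playback configuration -- the property names inside
// imageSourceConfig (especially `url`) are assumptions for illustration;
// see XRImageSourceConfigPlayback for the actual fields.
// "PLAYBACK" stands in for the webvis.XRImageSource.PLAYBACK enum value.
const xrConfiguration = {
    imageSourceConfig: {
        type: "PLAYBACK", // use a recorded frame sequence instead of the camera
        url: "https://example.com/recordings/table-scan/manifest.json" // must point to the manifest.json
    }
};

// The URL must point to the manifest.json file -- a quick sanity check:
function isValidPlaybackConfig(config) {
    return config.imageSourceConfig.type === "PLAYBACK"
        && config.imageSourceConfig.url.endsWith("manifest.json");
}

console.log(isValidPlaybackConfig(xrConfiguration)); // true
```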
The XR Playback API provides methods to control the playback, such as starting, pausing, stopping, seeking, setting the playback speed, etc.
You can also query the playback properties and state to get information about the playback sequence.
XR Playback API methods can be identified by the naming infix XRPlayback.
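To illustrate the control surface, the following sketch drives a mock context. Only the method names (seekXRPlayback, setXRPlaybackSpeed, setXRPlaybackBoomerang) come from the documented API; the synchronous mock behavior is invented for the example, and the real methods live on the webvis context and return Promises.

```javascript
// Minimal mock of the playback-related context methods. The method names
// follow the documented API; the synchronous behavior is invented for
// illustration (the real methods return Promises).
function createMockPlaybackContext(totalFrames) {
    return {
        frameIndex: 0,
        speed: 1,
        boomerang: false,
        seekXRPlayback(frameIndex) {
            if (frameIndex < 0 || frameIndex >= totalFrames) {
                throw new Error("frame index out of range");
            }
            this.frameIndex = frameIndex;
        },
        setXRPlaybackSpeed(speed) {
            // documented range: 0..1, a fraction of the recording's FPS
            this.speed = Math.min(Math.max(speed, 0), 1);
        },
        setXRPlaybackBoomerang(boomerang) {
            this.boomerang = boomerang;
        },
    };
}

const playback = createMockPlaybackContext(120);
playback.seekXRPlayback(60);           // jump to the middle of the sequence
playback.setXRPlaybackSpeed(0.5);      // half of the recording's original FPS
playback.setXRPlaybackBoomerang(true); // loop forward and backward

console.log(playback.frameIndex, playback.speed, playback.boomerang); // 60 0.5 true
```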
Anchors the model at the current 3D pose. Visually, this has the effect that the model stays at the
current position and orientation in the real world.
Please note that in a model-based tracking scenario, the model is anchored automatically
once the alignment of the model with the real-world object is high enough that tracking can be performed
(the model is snapped).
Returns a Promise which resolves when the operation was successful or rejects in an error case
disconnectXR
disconnectXR(): Promise<void>
Experimental
Disconnect from the XR system.
This method should be called when the AR functionalities are no longer needed.
Returns Promise<void>
Returns a Promise which resolves when the operation was successful or rejects in an error case
enterXRInitMode
enterXRInitMode(xrInitOptions?): Promise<void>
Experimental
Enter the XR initialization mode. The initialization mode is used as an entry point for model-based tracking.
It unanchors any previously anchored model and starts the model-based tracking process.
In this mode, the user can align the model with the real object (snapping). When the model is
snapped, the anchored value is set to true, which triggers an XRStateChangedEvent.
The init mode is then exited and the model is anchored to the real object. The model is now tracked
and moves with the device.
Please note that this method should only be called if the XR system has the SUPPORTS_MODEL_TRACKING capability.
Returns a Promise which resolves when the operation was successful or rejects in an error case
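The documented flow (enter init mode, align until the model snaps, anchored flag flips and init mode exits) can be sketched as a small standalone state machine. Everything below is invented for illustration; only the enterXRInitMode name and the anchored value come from the API description.

```javascript
// Invented sketch of the documented flow: entering init mode, then
// exiting automatically once the model snaps (anchored becomes true).
// Names other than enterXRInitMode and `anchored` are illustrative.
function createTrackingState() {
    const state = { initMode: false, anchored: false };
    return {
        state,
        enterXRInitMode() {
            state.initMode = true;
            state.anchored = false; // unanchors any previously anchored model
        },
        // Called by the tracker once the model/real-world alignment is
        // good enough for tracking (the model is "snapped").
        onSnapped() {
            state.anchored = true;  // would trigger an XRStateChangedEvent
            state.initMode = false; // init mode is exited automatically
        },
    };
}

const tracking = createTrackingState();
tracking.enterXRInitMode();
console.log(tracking.state); // { initMode: true, anchored: false }
tracking.onSnapped();
console.log(tracking.state); // { initMode: false, anchored: true }
```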
exportXRInitTemplate
exportXRInitTemplate(): Promise<string>
Experimental
Exports an initialization template for model-based tracking.
In a model-based tracking scenario, after a successful tracking session, the learned initialization data
can be exported with this function and stored as a template for later.
Get the array of XRCapability. The array of capabilities determines which parts of the API can be used. The capabilities
are mostly determined by the device in use.
Returns an Array<XRCapability> containing the XR system's capabilities.
getXRMembers
getXRMembers(): number[]
Experimental
Returns an array of memberIDs of those session members which are currently using an XR device and have an active
Reality connection.
The array will not contain the session member ID of the current session member.
If the session is not connected, the promise will reject with an error.
If the session is connected, but no active Reality users are found, the promise will resolve with an empty array.
Returns number[]
The array of memberIDs of those session members that are currently publishing an XRImage stream
Returns a Promise which resolves when the operation was successful or rejects in an error case
importXRInitTemplate
importXRInitTemplate(template): Promise<void>
Experimental
Imports an initialization template for model-based tracking.
In a model-based tracking scenario, initialization templates are captured during the tracking process.
This initialization data is linked to previously visited viewpoints along the traveled camera path.
Once the tracking is lost the templates are used to quickly reinitialize from similar viewpoints without the
user having to align the line model with the real object.
Once the initialization template data is imported, it will persist until enterXRInitMode with
resetInitTemplate set to true is called.
Returns a Promise which resolves when the operation was successful or rejects in an error case
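A possible round trip over the two template methods, sketched against an invented synchronous mock. Only the names exportXRInitTemplate and importXRInitTemplate come from the documented API; the real methods return Promises and talk to the tracker.

```javascript
// Invented mock context for illustration only: export learned data in one
// session, persist it, and import it in a later session.
function createMockTemplateContext() {
    let stored = null;
    return {
        exportXRInitTemplate() {     // real method returns Promise<string>
            stored = JSON.stringify({ viewpoints: 3 }); // fake learned data
            return stored;
        },
        importXRInitTemplate(template) { // real method returns Promise<void>
            stored = template;
        },
        getStored() { return stored; },
    };
}

const session1 = createMockTemplateContext();
const template = session1.exportXRInitTemplate(); // export after a successful tracking session
// ... persist `template` (e.g. to a file or server), then later:
const session2 = createMockTemplateContext();
session2.importXRInitTemplate(template);          // reuse it to skip manual alignment
console.log(session2.getStored() === template);   // true
```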
requestXRDeviceScreenshot
requestXRDeviceScreenshot(): Promise<string>
Experimental
Request a screenshot of the webview's content inside the native XR device application.
Returns Promise<string>
Returns a Promise which resolves with the base64-encoded image data when the operation
was successful or rejects in an error case.
seekXRPlayback
seekXRPlayback(frameIndex): Promise<void>
Experimental
Jump to the frame with the specified index in the playback sequence. To get the total number of frames in the sequence,
see XRPlaybackProperties and the getXRPlaybackProperties method.
Returns a Promise which reports whether the operation was successful or not
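For example, a seek target can be clamped to the sequence length before calling seekXRPlayback. The `totalFrames` property name used here is an assumption for illustration; check XRPlaybackProperties for the actual field.

```javascript
// Clamp a requested frame index to the valid range of the sequence.
// `totalFrames` as a property name is assumed for illustration; see
// XRPlaybackProperties for the actual field.
function clampFrameIndex(frameIndex, playbackProperties) {
    const last = playbackProperties.totalFrames - 1;
    return Math.min(Math.max(frameIndex, 0), last);
}

console.log(clampFrameIndex(500, { totalFrames: 120 })); // 119 -- past the end, clamp to last frame
console.log(clampFrameIndex(-3, { totalFrames: 120 }));  // 0   -- before the start
console.log(clampFrameIndex(42, { totalFrames: 120 }));  // 42  -- already valid
```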
setXRPlaybackBoomerang
setXRPlaybackBoomerang(boomerang): Promise<void>
Experimental
Specify whether the playback should "boomerang" (play forward and backward in a loop).
This is useful for creating a seamless transition at the end of a sequence in terms of pose updates.
Parameters
boomerang: boolean
Whether the playback should boomerang or not
Returns Promise<void>
Returns a Promise which reports whether the operation was successful or not
Returns a Promise which resolves with the number of frames in the playback if successful
and rejects otherwise
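To illustrate what boomerang playback means for the frame order, this standalone helper (not part of the API) computes the index sequence for one forward/backward cycle; not repeating the endpoints is what makes the turn seamless in terms of pose updates.

```javascript
// Compute one boomerang cycle of frame indices for a sequence of
// `totalFrames` frames: forward to the end, then backward to the start
// without repeating the endpoints. Standalone illustration only.
function boomerangCycle(totalFrames) {
    const forward = Array.from({ length: totalFrames }, (_, i) => i);
    const backward = forward.slice(1, -1).reverse();
    return forward.concat(backward);
}

console.log(boomerangCycle(4)); // [ 0, 1, 2, 3, 2, 1 ]
```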
setXRPlaybackSpeed
setXRPlaybackSpeed(speed): void
Experimental
Set the desired playback speed. The specified speed must be a value between 0 and 1.
It gives the playback speed as a fraction of the original FPS, which is stored in the XRPlaybackProperties.
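Since the speed value is a fraction of the recording's original frame rate, the effective playback FPS can be derived as below. The `originalFps` property name is an assumption for illustration; the original FPS is stored in the XRPlaybackProperties.

```javascript
// Effective frames per second at a given playback speed (0..1).
// `originalFps` as a property name is an assumption for illustration.
function effectiveFps(originalFps, speed) {
    if (speed < 0 || speed > 1) {
        throw new RangeError("speed must be between 0 and 1");
    }
    return originalFps * speed;
}

console.log(effectiveFps(30, 0.5)); // 15 -- half speed of a 30 FPS recording
console.log(effectiveFps(30, 1));   // 30 -- original speed
```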
Starts spectating the XRImage stream published by the session member with the specified ID within a shared session.
This will also hide any other background feed that is currently shown.
Parameters
sessionMemberId: number
The session member id of the member to spectate
Returns Promise<void>
Returns a promise which resolves when the operation was successful or rejects in an error case
AR and webvis: The RealityAPI
Overview
The RealityAPI provides AR functionalities inside the webvis context.
XR State
The XR state can be queried by calling the getXRState method. The XR state is represented by the XRState type.