.. _doc_openxr_settings:

OpenXR Settings
===============

OpenXR has its own set of settings that are applied when OpenXR starts.
While it is possible for OpenXR extensions implemented through Godot plugins to add additional settings,
we will only discuss the settings in the core of Godot here.

.. image:: img/openxr_settings.webp

General settings
----------------

Enabled
~~~~~~~

This setting enables the OpenXR module when Godot starts.
This is required when the Vulkan backend is used.
For other backends you can enable OpenXR at any time by calling ``initialize`` on the :ref:`OpenXRInterface <class_openxrinterface>`.
This also needs to be enabled to get access to the action map editor.

You can use the ``--xr-mode on`` command line switch to force this to on.
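If you initialize OpenXR manually, a minimal sketch could look like this (this assumes the script sits on a node in your startup scene and that the main viewport should output to the headset once the session is up):

.. code-block:: gdscript

    func _ready():
        var xr_interface: XRInterface = XRServer.find_interface("OpenXR")
        if xr_interface and xr_interface.initialize():
            # Output the main viewport to the headset.
            get_viewport().use_xr = true
        else:
            print("OpenXR failed to initialize.")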
Default Action Map
~~~~~~~~~~~~~~~~~~

This specifies the path of the action map file that OpenXR will load and communicate to the XR runtime.

Form Factor
~~~~~~~~~~~

This specifies whether your game is designed for:

- ``Head Mounted`` devices such as a Meta Quest, Valve Index, or Magic Leap,
- ``Handheld`` devices such as phones.

If the device on which you run your game does not match the selection here, OpenXR will fail to initialize.

View Configuration
~~~~~~~~~~~~~~~~~~

This specifies the view configuration your game is designed for:

- ``Mono``, your game provides a single image output, e.g. phone-based AR;
- ``Stereo``, your game provides stereo image output, e.g. head mounted devices.

If the device on which you run your game does not match the selection here, OpenXR will fail to initialize.

.. note::

    OpenXR has additional view configurations for very specific devices that Godot doesn't support yet.
    For instance, Varjo headsets have a quad view configuration that outputs two sets of stereo images.
    These may be supported in the near future.
Reference Space
~~~~~~~~~~~~~~~

Within XR all elements, such as the player's head and hands, are tracked within a tracking volume.
At the base of this tracking volume is our origin point, which maps our virtual space to the real space.
There are however different scenarios that place this point in different locations,
depending on the XR system used.
In OpenXR these scenarios are well defined and selected by setting a reference space.

Local
^^^^^

The local reference space places our origin point at the player's head by default.
Some XR runtimes will do this each time your game starts, others will make the position persist over sessions.

This reference space does not prevent the user from walking away.
If your game requires the user to stay in place, for instance inside the vehicle they are controlling,
you will need to detect them leaving yourself, as this could potentially break your game.

This reference space is the best option for games like flight simulators or racing simulators
where we want to place the :ref:`XROrigin3D <class_xrorigin3d>` node where the player's head should be.

When the user enacts the recenter option on their headset, the method of which differs per XR runtime,
the XR runtime will move the :ref:`XRCamera3D <class_xrcamera3d>` to the :ref:`XROrigin3D <class_xrorigin3d>` node.
The :ref:`OpenXRInterface <class_openxrinterface>` will also emit the ``pose_recentered`` signal
so your game can react accordingly.

.. note::

    Any other XR tracked elements such as controllers or anchors will also be adjusted accordingly.

.. warning::

    You should **not** call ``center_on_hmd`` when using this reference space.
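If you want to react when the user recenters, you can connect to this signal; a minimal sketch (the ``_on_pose_recentered`` handler name is just an example):

.. code-block:: gdscript

    func _ready():
        var xr_interface := XRServer.find_interface("OpenXR") as OpenXRInterface
        if xr_interface:
            xr_interface.pose_recentered.connect(_on_pose_recentered)

    func _on_pose_recentered():
        # The runtime has already moved the camera back to the origin,
        # so only game specific reactions are needed here, e.g. repositioning UI.
        # Do not call center_on_hmd in this reference space.
        pass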
Stage
^^^^^

The stage reference space is our default reference space and places our origin point at the center of our play space.
For XR runtimes that allow you to draw out a guardian boundary, this location and its orientation are often set by the user.
Other XR runtimes may decide on the placement of this point by other means.
It is however a stationary point in the real world.

This reference space is the best option for room scale games where the user is expected to walk around a larger space,
or for games where there is a need to switch between game modes.
See :ref:`Room Scale <doc_xr_room_scale>` for more information.

When the user enacts the recenter option on their headset, the method of which differs per XR runtime,
the XR runtime will not change the origin point.
The :ref:`OpenXRInterface <class_openxrinterface>` will emit the ``pose_recentered`` signal
and it is up to the game to react appropriately.
Not doing so will prevent your game from being accepted on various stores.

In Godot you can do this by calling the ``center_on_hmd`` function on the :ref:`XRServer <class_xrserver>`, as shown in the sketch after this list:

- Calling ``XRServer.center_on_hmd(XRServer.RESET_BUT_KEEP_TILT, false)`` will move the :ref:`XRCamera3D <class_xrcamera3d>` node
  to the :ref:`XROrigin3D <class_xrorigin3d>` node, similar to the ``Local`` reference space.
- Calling ``XRServer.center_on_hmd(XRServer.RESET_BUT_KEEP_TILT, true)`` will move the :ref:`XRCamera3D <class_xrcamera3d>` node
  above the :ref:`XROrigin3D <class_xrorigin3d>` node, keeping the player's height, similar to the ``Local Floor`` reference space.
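A minimal sketch of handling the signal this way (the ``_on_pose_recentered`` handler name is just an example; pick the reset call that fits your game):

.. code-block:: gdscript

    func _ready():
        var xr_interface := XRServer.find_interface("OpenXR") as OpenXRInterface
        if xr_interface:
            xr_interface.pose_recentered.connect(_on_pose_recentered)

    func _on_pose_recentered():
        # Recenter the player while keeping their physical height.
        XRServer.center_on_hmd(XRServer.RESET_BUT_KEEP_TILT, true)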
.. note::

    Any other XR tracked elements such as controllers or anchors will also be adjusted accordingly.

Local Floor
^^^^^^^^^^^

The local floor reference space is similar to the local reference space as it positions the origin point where the player is.
In this mode however the height of the player is kept.
Same as with the local reference space, some XR runtimes will persist this location over sessions.

It is thus not guaranteed that the player will be standing on the origin point.
The only guarantee is that they were standing there when the user last recentered,
and the player is free to walk away.

This reference space is the best option for games where the user is expected to stand in the same location,
or for AR type games where the user's interface elements are bound to the origin node
and are quickly placed at the player's location on recenter.

When the user enacts the recenter option on their headset, the method of which differs per XR runtime,
the XR runtime will move the :ref:`XRCamera3D <class_xrcamera3d>` above the :ref:`XROrigin3D <class_xrorigin3d>` node
while keeping the player's height.
The :ref:`OpenXRInterface <class_openxrinterface>` will also emit the ``pose_recentered`` signal
so your game can react accordingly.

.. warning::

    Be careful using this mode in combination with virtual movement of the player.
    Recentering in this scenario can be unpredictable unless you counter the move when handling the recenter signal.
    It can even be game breaking, as the effect would be the player teleporting to whatever abstract location
    the origin point was placed at during virtual movement, including locations that should be off limits.
    It is better to use the Stage mode in this scenario and limit resetting to orientation only when a ``pose_recentered`` signal is received.

.. note::

    Any other XR tracked elements such as controllers or anchors will also be adjusted accordingly.

.. warning::

    You should **not** call ``center_on_hmd`` when using this reference space.
Environment Blend Mode
~~~~~~~~~~~~~~~~~~~~~~

The environment blend mode defines how our rendered output is blended into "the real world", provided this is supported by the headset.

- ``Opaque`` means our output obscures the real world, we are in VR mode.
- ``Additive`` means our output is added to the real world.
  This is an AR mode where the optics do not allow us to fully obscure the real world (e.g. Hololens).
- ``Alpha`` means our output is blended with the real world using the alpha output (the viewport should have transparent background enabled).
  This is an AR mode where the optics can fully obscure the real world (Magic Leap, all passthrough devices, etc.).

If a mode is selected that is not supported by the headset, the first available mode will be selected.
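The blend mode can also be changed at runtime through the ``environment_blend_mode`` property on :ref:`XRInterface <class_xrinterface>`. A minimal sketch that switches to alpha blending when the headset supports it (this assumes ``xr_interface`` holds your initialized OpenXR interface and the code runs in a node inside the XR viewport):

.. code-block:: gdscript

    func enable_passthrough(xr_interface: XRInterface) -> void:
        var supported_modes := xr_interface.get_supported_environment_blend_modes()
        if XRInterface.XR_ENV_BLEND_MODE_ALPHA_BLEND in supported_modes:
            xr_interface.environment_blend_mode = XRInterface.XR_ENV_BLEND_MODE_ALPHA_BLEND
            # Alpha blending needs a transparent viewport background.
            get_viewport().transparent_bg = true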
.. note::

    Some OpenXR devices have separate systems for enabling/disabling passthrough.
    From Godot 4.3 onwards, selecting the alpha blend mode will also perform these extra steps.
    This does require the latest vendor plugin to be installed.
.. _doc_openxr_settings_foveation_level:

Foveation Level
~~~~~~~~~~~~~~~

Sets the foveation level used when rendering, provided this feature is supported by the hardware used.
Foveation is a technique where content further away from the center of the viewport is rendered at a lower resolution.
Most XR runtimes only support fixed foveation, but some will take eye tracking into account and use the focal point for this effect.

The higher the level, the better the performance gains, but also the greater the reduction in quality in the user's peripheral vision.

.. note::

    **Compatibility renderer only.**
    For the Mobile and Forward+ renderers, set the ``vrs_mode`` property on :ref:`Viewport <class_viewport>` to ``VRS_XR``.

.. warning::

    This feature is disabled if post effects are used such as glow, bloom, or DOF.
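For the Mobile and Forward+ renderers, the equivalent setup mentioned in the note above could look like this minimal sketch (assuming the script runs on a node inside the XR viewport):

.. code-block:: gdscript

    func _ready():
        # Use foveated rendering through XR variable rate shading
        # on the Mobile and Forward+ renderers.
        get_viewport().vrs_mode = Viewport.VRS_XR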
Foveation Dynamic
~~~~~~~~~~~~~~~~~

When enabled, the foveation level will be adjusted automatically depending on current GPU load.
It will be adjusted between low and the foveation level selected in the previous setting.
It is therefore best to combine this setting with the foveation level set to high.

.. note::

    **Compatibility renderer only.**

Submit Depth Buffer
~~~~~~~~~~~~~~~~~~~

If enabled, an OpenXR-supplied depth buffer will be used while rendering and submitted alongside the rendered image.
The XR runtime can use this for improved reprojection.

.. note::

    Enabling this feature will disable stencil support during rendering.
    Not many XR runtimes make use of this;
    it is advised to leave this setting off unless it provides noticeable benefits for your use case.
Startup Alert
~~~~~~~~~~~~~

If enabled, an alert message is presented to the user if OpenXR fails to start.
We don't always receive feedback from the XR system as to why starting fails. If we do, we log this to the console.
Common failure reasons are:

- No OpenXR runtime is installed on the host system.
- Microsoft's WMR OpenXR runtime is currently active. This runtime only supports DirectX and will fail if OpenGL or Vulkan is used.
- SteamVR is used but no headset is connected/turned on.

Disable this if you support a fallback mode in your game so it can be played in desktop mode when no VR headset is connected,
or if you're handling the failure condition yourself by checking ``OpenXRInterface.is_initialized()``.
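A minimal sketch of such a fallback check (``start_vr`` and ``start_desktop`` are hypothetical placeholders for your own setup code):

.. code-block:: gdscript

    func _ready():
        var xr_interface := XRServer.find_interface("OpenXR") as OpenXRInterface
        if xr_interface and xr_interface.is_initialized():
            get_viewport().use_xr = true
            start_vr()  # Hypothetical: your VR specific setup.
        else:
            start_desktop()  # Hypothetical: your desktop fallback.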
Extensions
----------

This subsection allows you to enable various optional OpenXR extensions. Keep in
mind that the extensions will only work if the OpenXR runtime (SteamVR, Oculus, etc.)
the project is run with supports them.

Debug Utils
~~~~~~~~~~~

Enabling this will log debug messages from the XR runtime.

Debug Message Types
~~~~~~~~~~~~~~~~~~~

This allows you to choose which debug messages are logged.

Hand Tracking
~~~~~~~~~~~~~

This enables the hand tracking extension when supported by the device used. It is on by default for legacy reasons.

The hand tracking extension provides access to data that allows you to visualize the user's hands with correct finger positions.
Depending on platform capabilities, the hand tracking data can be inferred from controller inputs, come from data gloves,
come from optical hand tracking sensors, or any other applicable source.
If your game only supports controllers, this should be turned off.

See the page on :ref:`hand tracking <doc_openxr_hand_tracking>` for additional details.
Hand Tracking Unobstructed Data Source
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Enabling this means hand tracking may use the exact position of fingers, usually
what a headset camera sees.

Hand Tracking Controller Data Source
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Enabling this means hand tracking may use the controller itself, and infer where
fingers are based on controller input or sensors on the controller.

Hand Interaction Profile
~~~~~~~~~~~~~~~~~~~~~~~~

Enabling this extension allows the use of two new hand tracking poses: the pinch pose,
which is the location between the thumb and index finger pointing forward, and the poke
pose, which is at the tip of the index finger.

This also allows three more gesture based inputs: pinch, when the user pinches their
thumb and index finger together; aim activation, when the index finger is fully
extended; and grasp, when the user makes a fist.

When both a hand interaction profile and a controller interaction profile are supplied, the
runtime will switch between profiles depending on whether optical tracking is used or
the user is holding a controller.
If only a hand interaction profile is supplied, any runtime should use hand
interaction even if a controller is being held.
Eye Gaze Interaction
~~~~~~~~~~~~~~~~~~~~

This enables the eye gaze interaction extension when supported by the device used.
When enabled, we will get feedback from eye tracking through a pose situated between the user's eyes,
oriented in the direction the user is looking. This will be a unified orientation.

In order to use this functionality you need to edit your action map and add a new pose action,
say ``eye_pose``.
Now add a new interaction profile for the eye gaze interaction and map the ``eye_pose``:

.. image:: img/openxr_eye_gaze_interaction.webp

Don't forget to save!

Next, add a new :ref:`XRController3D <class_xrcontroller3d>` node to your origin node,
set its ``tracker`` property to ``/user/eyes_ext``,
and set its ``pose`` property to ``eye_pose``.
Now you can add things to this controller node, such as a raycast, and control things with your eyes.
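If you prefer to set this node up in code, a minimal sketch could look like this (assuming the script is attached to your :ref:`XROrigin3D <class_xrorigin3d>` node and the ``eye_pose`` action described above exists in your action map):

.. code-block:: gdscript

    func _ready():
        # Create a controller node that follows the eye gaze pose.
        var eye_gaze := XRController3D.new()
        eye_gaze.tracker = "/user/eyes_ext"
        eye_gaze.pose = "eye_pose"
        add_child(eye_gaze)

        # Add a raycast so we can detect what the user is looking at.
        var ray := RayCast3D.new()
        ray.target_position = Vector3(0.0, 0.0, -10.0)
        eye_gaze.add_child(ray)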
Binding Modifiers
-----------------

These control whether or not binding modifiers can be used. Binding modifiers are
used to apply thresholds or offset values. You can find information on how to use
and set them up on the XR action map page :ref:`here <doc_binding_modifiers>`.

Analog Threshold
~~~~~~~~~~~~~~~~

Allow analog threshold binding modifiers.

Dpad Binding
~~~~~~~~~~~~

Allow D-pad binding modifiers.