Title:
OVERLAY ON AN ARTIFICIAL REALITY ENVIRONMENT
Document Type and Number:
WIPO Patent Application WO/2024/085996
Kind Code:
A1
Abstract:
Aspects of the present disclosure are directed to providing an overlay over a concurrently executing artificial reality (XR) environment. Implementations immerse a user in a three‑dimensional XR environment via a XR system. An overlay manager can dynamically display a two-dimensional overlay over the three-dimensional XR environment in response to input that triggers the overlay. The overlay can provide a runtime that executes application components (e.g., system shell applications and applications remote from the system shell). For example, an executing application can provide a two-dimensional virtual object (e.g., panel) displayed in the two‑dimensional overlay. The overlay can provide a user concurrent access to the 2D components of additional applications, without requiring termination of the three‑dimensional XR environment. In some implementations, the XR environment is transitioned to a paused state while the overlay is active and restored to an active state when the overlay is closed.

Inventors:
FURTWANGLER NATHAN (US)
CAMPBELL PETER LOWYRIE (US)
FORBES RONALD OMEGA JR (US)
LUTHER MATTHEW (US)
Application Number:
PCT/US2023/033526
Publication Date:
April 25, 2024
Filing Date:
September 22, 2023
Assignee:
META PLATFORMS TECH LLC (US)
International Classes:
G06F3/01; G02B27/01; G06F8/30; G06F8/36; G06F8/38; G06F9/445; G06F9/451; G06F9/46; G06F9/48; G06F9/54; G09G5/14
Foreign References:
US20170214782A1 (2017-07-27)
Other References:
GAME UI EXAMPLES: "[VR] Half Life Alyx Menu / Settings / UI", 27 February 2021 (2021-02-27), XP093104653, Retrieved from the Internet [retrieved on 20231122]
Attorney, Agent or Firm:
COLBY, Steven et al. (US)
Claims:
CLAIMS

1. A method for providing an overlay over a concurrently executing artificial reality, XR, environment, the method comprising: displaying, to a user by a XR system, a three-dimensional XR environment, wherein content for the XR environment is provided by a first software process executing at the XR system; receiving input that initiates a two-dimensional overlay component; and triggering a display of the two-dimensional overlay component, over the XR environment, to the user by the XR system, wherein content of the two-dimensional overlay component is provided by a second software process executing at the XR system, and the triggering pauses the XR environment such that the first executing software process is transitioned to a background priority and the second executing software process is transitioned to a foreground priority.

2. The method of claim 1, wherein, while the first executing software process is in the background priority, a majority of XR system resource capacity is allocated to executing the first software process.

3. The method of claim 1 or claim 2, wherein the XR system resource capacity comprises at least processor capacity.

4. The method of any preceding claim, further comprising: receiving first input via a first input channel that interacts with the overlay component and second input via a second input channel that interacts with the XR environment.

5. The method of claim 4, wherein the first input channel comprises user hand input and, in response to the first input, the second executing software process moves a cursor at the overlay component, the user hand input comprising touch input, handheld controller input, tracked hand motion, or any combination thereof; and/or preferably wherein the second input channel comprises user head input and, in response to the second input, the first executing software process alters the display of the XR environment to the user by changing a perspective of the user’s view of the XR environment, the head input comprising tracked user head movement, tracked user gaze, or any combination thereof.

6. The method of any preceding claim, wherein at least a portion of the content displayed by the overlay component is based on the user’s context with respect to the XR environment; and preferably wherein the XR environment comprises a shared XR environment from which the user can travel to a plurality of XR worlds, and the portion of content comprises a list of at least some of the users that are present in a current XR world of the user.

7. The method of any preceding claim, wherein the first executing software process comprises a XR process that hosts one or more applications that provide functionality for the XR environment, the second executing software process comprises a shell process that hosts one or more additional executing applications, and the overlay component displays content from the one or more additional executing applications.

8. The method of claim 7, further comprising: terminating, in response to user input, display of the overlay component, wherein the first executing software process is transitioned to the foreground priority in response to the terminating, and the XR environment is resumed from the pause; and preferably wherein state information is stored for the overlay component in response to the terminating, the state information comprising a state of content displayed at the overlay component and at least one application state for at least one additional executing application of the one or more additional executing applications; and further preferably wherein, upon triggering a redisplay of the two-dimensional overlay component over the XR environment to the user by the XR system, content displayed at the redisplayed overlay component is restored using the stored state information, and an application state for the at least one additional executing application is restored using the stored state information.

9. A computer-readable storage medium storing instructions that, when executed by a computing system, cause the computing system to perform a process for providing an overlay over a concurrently executing artificial reality, XR, environment, the process comprising: displaying, to a user by a XR system, a three-dimensional XR environment, wherein content for the XR environment is provided by a first software process executing at the XR system; receiving input that initiates a two-dimensional overlay component; and triggering a display of the two-dimensional overlay component, over the XR environment, to the user by the XR system, wherein content of the two-dimensional overlay component is provided by a second software process executing at the XR system, and the triggering pauses the XR environment such that the first executing software process is transitioned to a background priority and the second executing software process is transitioned to a foreground priority.

10. The computer-readable storage medium of claim 9, wherein, while the first executing software process is in the background priority, a majority of XR system resource capacity is allocated to executing the first software process.

11. The computer-readable storage medium of claim 9 or claim 10, wherein the XR system resource capacity comprises at least processor capacity.

12. The computer-readable storage medium of any one of claims 9 to 11, further comprising: receiving first input via a first input channel that interacts with the overlay component and second input via a second input channel that interacts with the XR environment.

13. The computer-readable storage medium of claim 12, wherein the first input channel comprises user hand input and, in response to the first input, the second executing software process moves a cursor at the overlay component, the user hand input comprising touch input, hand-held controller input, tracked hand motion, or any combination thereof; and/or preferably wherein the second input channel comprises user head input and, in response to the second input, the first executing software process alters the display of the XR environment to the user by changing a perspective of the user’s view of the XR environment, the head input comprising tracked user head movement, tracked user gaze, or any combination thereof.

14. The computer-readable storage medium of any one of claims 9 to 13, wherein at least a portion of the content displayed by the overlay component is based on the user’s context with respect to the XR environment.

15. A computing system for providing an overlay over a concurrently executing artificial reality, XR, environment, the computing system comprising: one or more processors; and one or more memories storing instructions that, when executed by the one or more processors, cause the computing system to perform a process comprising: displaying, to a user by the computing system, a three-dimensional XR environment, wherein content for the XR environment is provided by a first software process executing at the computing system; receiving input that initiates a two-dimensional overlay component; and triggering a display of the two-dimensional overlay component, over the XR environment, to the user by the computing system, wherein content of the two-dimensional overlay component is provided by a second software process executing at the computing system, and the triggering pauses the XR environment such that the first executing software process is transitioned to a background priority and the second executing software process is transitioned to a foreground priority.

Description:
OVERLAY ON AN ARTIFICIAL REALITY ENVIRONMENT

TECHNICAL FIELD

[0001] The present disclosure is directed to a concurrent overlay on an artificial reality environment.

BACKGROUND

[0002] Artificial reality systems have grown in popularity and this trend is expected to accelerate. Immersive artificial reality environments can provide unique experiences and support virtual social interactions among users. However, artificial reality systems often create a rigid user experience once a user is immersed in an artificial reality environment. For example, interactions with other applications, such as communication applications, often require terminating the artificial reality environment prior to engaging with the other applications. The computing resources consumed when immersing a user in an artificial reality environment often leave XR systems with few available resources to support other applications.

SUMMARY

[0003] According to a first aspect of the disclosure, there is provided a method for providing an overlay over a concurrently executing artificial reality (XR) environment, the method comprising: displaying, to a user by a XR system, a three-dimensional XR environment, wherein content for the XR environment is provided by a first software process executing at the XR system; receiving input that initiates a two-dimensional overlay component; and triggering a display of the two-dimensional overlay component, over the XR environment, to the user by the XR system, wherein content of the two-dimensional overlay component is provided by a second software process executing at the XR system, and the triggering pauses the XR environment such that the first executing software process is transitioned to a background priority and the second executing software process is transitioned to a foreground priority.

[0004] In some embodiments, while the first executing software process is in the background priority, a majority of XR system resource capacity is allocated to executing the first software process.

[0005] In some embodiments, the XR system resource capacity comprises at least processor capacity.

[0006] In some embodiments, the method further comprises: receiving first input via a first input channel that interacts with the overlay component and second input via a second input channel that interacts with the XR environment.

[0007] In some embodiments, the first input channel comprises user hand input and, in response to the first input, the second executing software process moves a cursor at the overlay component, the user hand input comprising touch input, handheld controller input, tracked hand motion, or any combination thereof.

[0008] In some embodiments, the second input channel comprises user head input and, in response to the second input, the first executing software process alters the display of the XR environment to the user by changing a perspective of the user’s view of the XR environment, the head input comprising tracked user head movement, tracked user gaze, or any combination thereof.

[0009] In some embodiments, at least a portion of the content displayed by the overlay component is based on the user’s context with respect to the XR environment.

[0010] In some embodiments, the XR environment comprises a shared XR environment from which the user can travel to a plurality of XR worlds, and the portion of content comprises a list of at least some of the users that are present in a current XR world of the user.

[0011] In some embodiments, the first executing software process comprises a XR process that hosts one or more applications that provide functionality for the XR environment, the second executing software process comprises a shell process that hosts one or more additional executing applications, and the overlay component displays content from the one or more additional executing applications.

[0012] In some embodiments, the method further comprises: terminating, in response to user input, display of the overlay component, wherein the first executing software process is transitioned to the foreground priority in response to the terminating, and the XR environment is resumed from the pause.

[0013] In some embodiments, state information is stored for the overlay component in response to the terminating, the state information comprising a state of content displayed at the overlay component and at least one application state for at least one additional executing application of the one or more additional executing applications.

[0014] In some embodiments, upon triggering a redisplay of the two-dimensional overlay component over the XR environment to the user by the XR system, content displayed at the redisplayed overlay component is restored using the stored state information, and an application state for the at least one additional executing application is restored using the stored state information.

[0015] According to a second aspect of the disclosure, there is provided a computer-readable storage medium storing instructions that, when executed by a computing system, cause the computing system to perform a process for providing an overlay over a concurrently executing artificial reality (XR) environment, the process comprising: displaying, to a user by a XR system, a three-dimensional XR environment, wherein content for the XR environment is provided by a first software process executing at the XR system; receiving input that initiates a two-dimensional overlay component; and triggering a display of the two-dimensional overlay component, over the XR environment, to the user by the XR system, wherein content of the two-dimensional overlay component is provided by a second software process executing at the XR system, and the triggering pauses the XR environment such that the first executing software process is transitioned to a background priority and the second executing software process is transitioned to a foreground priority.

[0016] In some embodiments, while the first executing software process is in the background priority, a majority of XR system resource capacity is allocated to executing the first software process.

[0017] In some embodiments, the XR system resource capacity comprises at least processor capacity.

[0018] In some embodiments, the computer-readable storage medium further comprises: receiving first input via a first input channel that interacts with the overlay component and second input via a second input channel that interacts with the XR environment.

[0019] In some embodiments, the first input channel comprises user hand input and, in response to the first input, the second executing software process moves a cursor at the overlay component, the user hand input comprising touch input, handheld controller input, tracked hand motion, or any combination thereof.

[0020] In some embodiments, the second input channel comprises user head input and, in response to the second input, the first executing software process alters the display of the XR environment to the user by changing a perspective of the user’s view of the XR environment, the head input comprising tracked user head movement, tracked user gaze, or any combination thereof.

[0021] In some embodiments, at least a portion of the content displayed by the overlay component is based on the user’s context with respect to the XR environment.

[0022] According to a third aspect of the disclosure, there is provided a computing system for providing an overlay over a concurrently executing artificial reality (XR) environment, the computing system comprising: one or more processors; and one or more memories storing instructions that, when executed by the one or more processors, cause the computing system to perform a process comprising: displaying, to a user by the computing system, a three-dimensional XR environment, wherein content for the XR environment is provided by a first software process executing at the computing system; receiving input that initiates a two-dimensional overlay component; and triggering a display of the two-dimensional overlay component, over the XR environment, to the user by the computing system, wherein content of the two-dimensional overlay component is provided by a second software process executing at the computing system, and the triggering pauses the XR environment such that the first executing software process is transitioned to a background priority and the second executing software process is transitioned to a foreground priority.

BRIEF DESCRIPTION OF THE DRAWINGS

[0023] Figure 1 is a block diagram illustrating an overview of devices on which some implementations of the present technology can operate.

[0024] Figure 2A is a wire diagram illustrating a virtual reality headset which can be used in some implementations of the present technology.

[0025] Figure 2B is a wire diagram illustrating a mixed reality headset which can be used in some implementations of the present technology.

[0026] Figure 2C is a wire diagram illustrating controllers which, in some implementations, a user can hold in one or both hands to interact with an artificial reality environment.

[0027] Figure 3 is a block diagram illustrating an overview of an environment in which some implementations of the present technology can operate.

[0028] Figure 4 is a block diagram illustrating components which, in some implementations, can be used in a system employing the disclosed technology.

[0029] Figure 5 is an artificial reality environment with a displayed overlay.

[0030] Figure 6 is a diagram of a system architecture for concurrently executing an artificial reality environment with an overlay service.

[0031] Figure 7 is a diagram of an example overlay display.

[0032] Figure 8 is a flow diagram illustrating a process used in some implementations of the present technology for providing an overlay over a concurrently executing artificial reality environment.

[0033] The techniques introduced here may be better understood by referring to the following Detailed Description in conjunction with the accompanying drawings, in which like reference numerals indicate identical or functionally similar elements.

DETAILED DESCRIPTION

[0034] Aspects of the present disclosure are directed to a shell system with an overlay manager that provides an overlay over a concurrently executing artificial reality environment. Implementations immerse a user in a three-dimensional artificial reality environment via an artificial reality system. The overlay manager can dynamically display a two-dimensional overlay over the three-dimensional artificial reality environment in response to input that triggers the overlay. The overlay can provide a secondary execution runtime that executes components of applications (e.g., system shell components and components from applications remote from the system shell). For example, an executing application can provide a two-dimensional virtual object (e.g., panel) that is displayed in the two-dimensional overlay. The overlay can provide a user concurrent access to additional applications without requiring termination of the three-dimensional artificial reality environment. In some implementations, the artificial reality environment is transitioned to a paused state while the overlay is active and restored to an active state when the overlay is closed. Example applications include web browsers, music players, video players, social media applications, messaging or other communication applications, third-party applications, streaming/casting applications, a content library application, or any other suitable application.
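The following Python sketch summarizes the overlay flow described above at a very high level. It is illustrative only: the class and method names (OverlayManager, XrEnvironmentProcess, ShellProcess, show_overlay, etc.) are hypothetical and do not correspond to any API in the disclosure, and a real XR shell would coordinate separate operating-system processes rather than in-memory objects.

# Minimal sketch of the overlay flow described above. All names are hypothetical.

class XrEnvironmentProcess:
    """Stands in for the first software process that renders the 3D environment."""
    def __init__(self):
        self.state = "active"          # "active" or "paused"
        self.priority = "foreground"   # "foreground" or "background"

    def pause(self):
        self.state = "paused"
        self.priority = "background"

    def resume(self):
        self.state = "active"
        self.priority = "foreground"


class ShellProcess:
    """Stands in for the second software process that hosts the 2D overlay runtime."""
    def __init__(self):
        self.priority = "background"
        self.panels = []               # 2D virtual objects (panels) from shell apps

    def show_overlay(self, panels):
        self.priority = "foreground"
        self.panels = panels

    def hide_overlay(self):
        self.priority = "background"
        saved = list(self.panels)
        self.panels = []
        return saved                   # state that could be restored later


class OverlayManager:
    """Toggles the overlay in response to a trigger input (e.g., a controller button)."""
    def __init__(self, xr_process, shell_process):
        self.xr = xr_process
        self.shell = shell_process
        self.overlay_active = False

    def on_trigger_input(self):
        if not self.overlay_active:
            self.xr.pause()            # XR environment keeps executing, but paused
            self.shell.show_overlay(["browser_panel", "people_panel"])
        else:
            self.shell.hide_overlay()
            self.xr.resume()           # XR environment restored to an active state
        self.overlay_active = not self.overlay_active


if __name__ == "__main__":
    manager = OverlayManager(XrEnvironmentProcess(), ShellProcess())
    manager.on_trigger_input()   # overlay shown, XR environment paused
    manager.on_trigger_input()   # overlay closed, XR environment resumed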

[0035] When triggered, an overlay can display controls, tools, widgets, etc. that the user can use to interact with or launch/execute one or more applications separate from the application controlling the artificial reality environment. The overlay manager can display the overlay as a two-dimensional virtual object (e.g., panel) within the current artificial reality environment, and the overlay can include widgets from other applications executing on the artificial reality device. The overlay can thus display content for the executing applications and support user interactions (e.g., cursor-based interactions) with the executing applications. Multiple applications can be executed in the overlay (e.g., by the shell system) at once, and multiple corresponding two-dimensional objects can be displayed in the overlay.

[0036] In some implementations, the content displayed by the overlay two-dimensional virtual object is contextual, relative to the state of the artificial reality environment. For example, the overlay two-dimensional virtual object can display, to the user, known users (e.g., social links, members of a group, users that previously gamed with the user, etc.) that are present in the same artificial reality world and/or that are proximate to the user’s location in the artificial reality world. In some implementations, the overlay two-dimensional virtual object can display a deep link that transports the user to the location of one or more of the known users within the artificial reality world. In some implementations, the overlay two-dimensional virtual object can display a link to initiate a dialogue with a known user. For example, the link can launch a communication application/virtual object (e.g., messaging application, audio or video call application, etc.) that initiates a dialogue (e.g., message, call, etc.) with the known user.
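As an illustration of the contextual content described above, the following Python sketch builds a simple "people in this world" panel with travel and message deep links. The data shapes, the xr:// URI scheme, and the function names are assumptions made for illustration and are not taken from the disclosure.

# Illustrative sketch of contextual overlay content: listing known users who are
# present in the user's current XR world and offering travel/message deep links.

def known_users_nearby(social_graph, presence, current_world, current_user):
    """Return known users (e.g., social links) present in the same XR world."""
    friends = social_graph.get(current_user, set())
    return [u for u, world in presence.items()
            if u in friends and world == current_world]

def build_people_panel(users, current_world):
    """Build panel entries with hypothetical deep links for travel and messaging."""
    return [{
        "label": user,
        "travel_link": f"xr://world/{current_world}/user/{user}",   # assumed URI scheme
        "message_link": f"xr://messages/new?to={user}",
    } for user in users]

if __name__ == "__main__":
    social_graph = {"alice": {"bob", "carol"}}
    presence = {"bob": "world-7", "carol": "world-3", "dave": "world-7"}
    users = known_users_nearby(social_graph, presence, "world-7", "alice")
    for entry in build_people_panel(users, "world-7"):
        print(entry["label"], entry["travel_link"])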

[0037] While interacting with the artificial reality environment, the user can travel to different locations within an artificial reality world. The overlay (e.g., stored overlay state) can travel along with the user such that when the user restores the overlay, the overlay is provided with the previously stored state (e.g., state when the user last closed the overlay) and updates application/virtual object content according to the user’s updated context relative to the artificial reality environment (e.g., the traveled-to destination). For example, the overlay can display an updated list of known users that are present in the user’s current artificial reality world and/or that are proximate to the user’s updated location in the current artificial reality world.

[0038] Implementations of the overlay manager store the state of active/launched applications and their corresponding two-dimensional virtual objects when the overlay is closed by a user (e.g., to resume/restore the artificial reality environment when ending a user session, etc.). The responsibilities for storing state can be divided among the overlay manager and the individual applications. For example, state stored by the overlay manager can include: size and position (e.g., within the overlay/shell environment) of the corresponding two-dimensional virtual object(s), application-specific display parameters (e.g., tab orientation, relevant websites/webpages, etc.), and any other suitable application state parameters that initialize an application. State stored by the individual applications can include: specific state of content displayed at the time the application is closed (e.g., open tab, scroll position, open webpage, open messages in a communication application, etc.), state of content displayed in the corresponding two-dimensional virtual object, and any other suitable application state parameter.

[0039] When an overlay is restored, implementations of the overlay manager can restore/initialize display of application(s) according to the overlay’s stored state (e.g., position and size of the corresponding virtual objects, initialization parameters for the application, etc.) and the application itself can restore the contents according to the application’s stored state. In combination, this state save and restoration workflow provides the user a seamless experience. For example, when an overlay is closed with components having a given state for running applications (e.g., virtual object position and size, content displayed in each virtual object, etc. within the overlay) and a user later triggers display of the overlay again, the restored overlay displays the components, corresponding to various applications with features shown in the overlay, in the same state (without any additional interaction with or management by the user).

[0040] Implementations of a resource manager can allocate, while the overlay is displayed/active, system resources between a process that executes the artificial reality environment and a process that executes the overlay. For example, a XR environment process can connect to a runtime service that provides XR system level resources and implements the XR environment. When an overlay is triggered, a shell process with an overlay service can connect to the runtime service. In some implementations, the shell process can maintain a persistent connection to the runtime service while the XR environment is displayed. The shell process and overlay service can interact with the runtime service to utilize system resources and implement the overlay.
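A minimal sketch of the split state save/restore workflow described above follows, assuming the overlay manager persists panel layout and initialization parameters while each application persists its own content state. The BrowserApp and OverlayManagerState classes are hypothetical stand-ins, not the disclosed implementation.

# Sketch of the divided state-save/restore responsibilities: the overlay manager
# persists layout-level state, each application persists its own content state.

class BrowserApp:
    """Example overlay application that manages its own content state."""
    def __init__(self):
        self.open_url = "about:blank"
        self.scroll = 0

    def save_state(self):
        return {"open_url": self.open_url, "scroll": self.scroll}

    def restore_state(self, state):
        self.open_url = state["open_url"]
        self.scroll = state["scroll"]


class OverlayManagerState:
    """Persists layout-level state (panel size/position, app init parameters)."""
    def __init__(self):
        self.saved = None

    def save(self, panels, apps):
        self.saved = {
            "panels": {name: dict(layout) for name, layout in panels.items()},
            "apps": {name: app.save_state() for name, app in apps.items()},
        }

    def restore(self, apps):
        panels = {name: dict(layout) for name, layout in self.saved["panels"].items()}
        for name, app in apps.items():
            app.restore_state(self.saved["apps"][name])
        return panels


if __name__ == "__main__":
    apps = {"browser": BrowserApp()}
    apps["browser"].open_url, apps["browser"].scroll = "https://example.com", 420
    panels = {"browser": {"x": 0.2, "y": 0.4, "w": 0.6, "h": 0.5}}

    store = OverlayManagerState()
    store.save(panels, apps)                 # overlay closed: state captured

    fresh_apps = {"browser": BrowserApp()}   # overlay reopened later
    restored_panels = store.restore(fresh_apps)
    print(restored_panels["browser"], fresh_apps["browser"].open_url)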

[0041] Implementations of a display manager coordinate display of pixels for a composite of the overlay display and the artificial reality environment. For example, while the overlay is displayed (e.g., over the artificial reality environment) the artificial reality environment process can continue to provide display output (e.g., frames, pixels, etc.) for display at the artificial reality system. The shell process for the overlay can provide the display manager display output for the displayed overlay and the artificial reality environment process can provide the display manager display output for the artificial reality environment. For example, the artificial reality environment can be an immersive three-dimensional environment that alters the display presented to the user in response to detected movement of the user, such as changes in head position. The display output from the XR environment process (e.g., in response to changes in user head position) is provided to the display manager while the overlay process provides display output (e.g., pixels, frames, etc.) for the displayed overlay.

[0042] Implementations of the display manager can generate a composite display that includes the two-dimensional content for the overlay and the three-dimensional content for the artificial reality environment. For example, the user can interact with the two-dimensional overlay by driving a cursor via user input while the display of the three-dimensional artificial reality environment in the background is dynamically changed in response to user head movement. In some implementations, the artificial reality environment in the background is paused (e.g., in a frozen state, in a holding state, etc.) while the overlay is displayed/active; however, the artificial reality environment process continues to output display (e.g., frames, pixels) while paused, such as display changes in response to user head movement.
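The per-frame compositing described above can be sketched as follows: the paused XR environment keeps producing frames that follow head pose, the shell process renders the 2D overlay with a cursor driven by hand input, and the display manager layers the two. This is an illustrative sketch; the render and composite functions are placeholders rather than the disclosed display pipeline.

# Rough sketch of per-frame compositing of overlay output over XR environment output.

def render_xr_frame(head_yaw_degrees):
    """Paused XR environment still re-renders as the head moves (placeholder)."""
    return f"xr-frame(yaw={head_yaw_degrees:.1f})"

def render_overlay(cursor_xy):
    """Shell process renders the 2D overlay with a cursor driven by hand input."""
    return f"overlay(cursor={cursor_xy})"

def composite(xr_frame, overlay_frame):
    """Display manager layers the 2D overlay over the 3D environment frame."""
    return f"{xr_frame} + {overlay_frame}"

if __name__ == "__main__":
    # Two concurrent input channels: head pose updates the background view,
    # hand/controller input drives the overlay cursor.
    for head_yaw, cursor in [(0.0, (100, 80)), (4.5, (120, 90)), (9.0, (150, 95))]:
        print(composite(render_xr_frame(head_yaw), render_overlay(cursor)))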

[0043] Implementations of the resource manager can allocate artificial reality system resources, such as processor capacity, A) to the artificial reality environment process to support execution of the artificial reality environment and B) to the shell process to support execution of the overlay. In some implementations, the resource manager allocates a majority of processor capacity to the artificial reality environment process while the overlay is displayed. Processor capacity can represent a number of processor cores, threads, processor core portions (e.g., clock speed, processor cycles/processing windows, processor cache memory, etc.), any other suitable processor capacity metric, or any combination thereof. In some cases, the artificial reality environment process may require larger quantities of artificial reality system resources to maintain execution of the artificial reality environment (even in a paused/frozen state). The resource manager’s capacity allocation can provide system resources to drive overlay functionality while preserving allocation of system resources for the artificial reality environment. The allocation achieves concurrent execution of the overlay process and the artificial reality environment process, as well as concurrent display of the overlay and the artificial reality environment.
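The following sketch illustrates one possible capacity split while the overlay is active, with the backgrounded but still executing XR environment process keeping the majority of processor capacity and the lean shell/overlay process receiving the remainder. The 80/20 split is an assumption chosen for illustration; the disclosure does not specify particular numbers.

# Illustrative processor-capacity split between the XR process and the shell process.
# The xr_share value is an assumed example, not a figure from the disclosure.

def allocate_capacity(total_cores, xr_share=0.8):
    """Give the XR environment process the majority of cores; the shell gets the rest."""
    xr_cores = max(1, round(total_cores * xr_share))
    shell_cores = max(1, total_cores - xr_cores)
    return {"xr_process": xr_cores, "shell_process": shell_cores}

if __name__ == "__main__":
    print(allocate_capacity(total_cores=8))   # e.g., {'xr_process': 6, 'shell_process': 2}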

[0044] In some implementations, the resource manager can scale portions of processor capacity allocated to the shell process/overlay when the overlay is displayed and/or in response to interactions with the user. For example, the clock speed of the processor cores allocated to the shell process/overlay can be dynamically scaled up. In these implementations, the processor capacity available to the shell process/overlay can be increased without shifting system resources from the artificial reality environment process.

[0045] Embodiments of the disclosed technology may include or be implemented in conjunction with an artificial reality system. Artificial reality or extra reality (XR) is a form of reality that has been adjusted in some manner before presentation to a user, which may include, e.g., virtual reality (VR), augmented reality (AR), mixed reality (MR), hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include completely generated content or generated content combined with captured content (e.g., real-world photographs). The artificial reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional effect to the viewer). Additionally, in some embodiments, artificial reality may be associated with applications, products, accessories, services, or some combination thereof, that are, e.g., used to create content in an artificial reality and/or used in (e.g., perform activities in) an artificial reality. The artificial reality system that provides the artificial reality content may be implemented on various platforms, including a head-mounted display (HMD) connected to a host computer system, a standalone HMD, a mobile device or computing system, a "cave" environment or other projection system, or any other hardware platform capable of providing artificial reality content to one or more viewers.

[0046] "Virtual reality" or "VR," as used herein, refers to an immersive experience where a user's visual input is controlled by a computing system. "Augmented reality" or "AR" refers to systems where a user views images of the real world after they have passed through a computing system. For example, a tablet with a camera on the back can capture images of the real world and then display the images on the screen on the opposite side of the tablet from the camera. The tablet can process and adjust or "augment" the images as they pass through the system, such as by adding virtual objects. "Mixed reality" or "MR" refers to systems where light entering a user's eye is partially generated by a computing system and partially composes light reflected off objects in the real world. For example, a MR headset could be shaped as a pair of glasses with a pass-through display, which allows light from the real world to pass through a waveguide that simultaneously emits light from a projector in the MR headset, allowing the MR headset to present virtual objects intermixed with the real objects the user can see. "Artificial reality," "extra reality," or "XR," as used herein, refers to any of VR, AR, MR, or any combination or hybrid thereof.

[0047] Conventional XR systems can display a XR environment to a user, but often fail to provide additional functionality outside the XR environment. For example, conventionally, XR systems are unable to pause or freeze a XR environment and permit other applications to execute concurrently. Because XR environment software processes often require large quantities of computing resources, conventional XR systems must trade off between executing the applications/software processes that support the XR environment and executing other applications.

[0048] Implementations provide an overlay over a concurrently executing XR environment. For example, the XR environment can be supported by a XR software process and the overlay can be supported by a shell software process, where both the XR software process and shell software process are concurrently executing. In some implementations, while the overlay is displayed, the XR environment is frozen/paused, the XR software process is backgrounded, and the shell software process is foregrounded. Although the shell software process is foregrounded, a majority of the XR system computing capacity can be allocated to executing the XR software process to keep the XR environment from crashing. The shell software process and overlay service can be designed with lean computing resource requirements. Additional applications can be executed via the shell software process and the content for the additional applications can be displayed in the overlay. Accordingly, implementations provide mechanisms that permit additional applications to execute concurrently with a paused and/or frozen XR environment.

[0049] Several implementations are discussed below in more detail in reference to the figures. Figure 1 is a block diagram illustrating an overview of devices on which some implementations of the disclosed technology can operate. The devices can comprise hardware components of a computing system 100 that provide an overlay over a concurrently executing artificial reality environment. In various implementations, computing system 100 can include a single computing device 103 or multiple computing devices (e.g., computing device 101, computing device 102, and computing device 103) that communicate over wired or wireless channels to distribute processing and share input data. In some implementations, computing system 100 can include a stand-alone headset capable of providing a computer created or augmented experience for a user without the need for external processing or sensors. In other implementations, computing system 100 can include multiple computing devices such as a headset and a core processing component (such as a console, mobile device, or server system) where some processing operations are performed on the headset and others are offloaded to the core processing component. Example headsets are described below in relation to Figures 2A and 2B. In some implementations, position and environment data can be gathered only by sensors incorporated in the headset device, while in other implementations one or more of the non-headset computing devices can include sensor components that can track environment or position data.

[0050] Computing system 100 can include one or more processor(s) 110 (e.g., central processing units (CPUs), graphical processing units (GPUs), holographic processing units (HPUs), etc.). Processors 110 can be a single processing unit or multiple processing units in a device or distributed across multiple devices (e.g., distributed across two or more of computing devices 101-103).

[0051] Computing system 100 can include one or more input devices 120 that provide input to the processors 110, notifying them of actions. The actions can be mediated by a hardware controller that interprets the signals received from the input device and communicates the information to the processors 110 using a communication protocol. Each input device 120 can include, for example, a mouse, a keyboard, a touchscreen, a touchpad, a wearable input device (e.g., a haptics glove, a bracelet, a ring, an earring, a necklace, a watch, etc.), a camera (or other light-based input device, e.g., an infrared sensor), a microphone, or other user input devices.

[0052] Processors 110 can be coupled to other hardware devices, for example, with the use of an internal or external bus, such as a PCI bus, SCSI bus, or wireless connection. The processors 110 can communicate with a hardware controller for devices, such as for a display 130. Display 130 can be used to display text and graphics. In some implementations, display 130 includes the input device as part of the display, such as when the input device is a touchscreen or is equipped with an eye direction monitoring system. In some implementations, the display is separate from the input device. Examples of display devices are: an LCD display screen, an LED display screen, a projected, holographic, or augmented reality display (such as a heads-up display device or a head-mounted device), and so on. Other I/O devices 140 can also be coupled to the processor, such as a network chip or card, video chip or card, audio chip or card, USB, firewire or other external device, camera, printer, speakers, CD-ROM drive, DVD drive, disk drive, etc.

[0053] In some implementations, input from the I/O devices 140, such as cameras, depth sensors, IMU sensors, GPS units, LiDAR or other time-of-flight sensors, etc. can be used by the computing system 100 to identify and map the physical environment of the user while tracking the user's location within that environment. This simultaneous localization and mapping (SLAM) system can generate maps (e.g., topologies, grids, etc.) for an area (which may be a room, building, outdoor space, etc.) and/or obtain maps previously generated by computing system 100 or another computing system that had mapped the area. The SLAM system can track the user within the area based on factors such as GPS data, matching identified objects and structures to mapped objects and structures, monitoring acceleration and other position changes, etc.

[0054] Computing system 100 can include a communication device capable of communicating wirelessly or wire-based with other local computing devices or a network node. The communication device can communicate with another device or a server through a network using, for example, TCP/IP protocols. Computing system 100 can utilize the communication device to distribute operations across multiple network devices.

[0055] The processors 110 can have access to a memory 150, which can be contained on one of the computing devices of computing system 100 or can be distributed across the multiple computing devices of computing system 100 or other external devices. A memory includes one or more hardware devices for volatile or non-volatile storage, and can include both read-only and writable memory. For example, a memory can include one or more of random access memory (RAM), various caches, CPU registers, read-only memory (ROM), and writable non-volatile memory, such as flash memory, hard drives, floppy disks, CDs, DVDs, magnetic storage devices, tape drives, and so forth. A memory is not a propagating signal divorced from underlying hardware; a memory is thus non-transitory. Memory 150 can include program memory 160 that stores programs and software, such as an operating system 162, overlay manager 164, and other application programs 166. Memory 150 can also include data memory 170 that can include, e.g., sensor data, virtual object data, application data, overlay state data, configuration data, settings, user options or preferences, etc., which can be provided to the program memory 160 or any element of the computing system 100.

[0056] Some implementations can be operational with numerous other computing system environments or configurations. Examples of computing systems, environments, and/or configurations that may be suitable for use with the technology include, but are not limited to, XR headsets, personal computers, server computers, handheld or laptop devices, cellular telephones, wearable electronics, gaming consoles, tablet devices, multiprocessor systems, microprocessor-based systems, set-top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, or the like.

[0057] Figure 2A is a wire diagram of a virtual reality head-mounted display (HMD) 200, in accordance with some embodiments. The HMD 200 includes a front rigid body 205 and a band 210. The front rigid body 205 includes one or more electronic display elements of an electronic display 245, an inertial motion unit (IMU) 215, one or more position sensors 220, locators 225, and one or more compute units 230. The position sensors 220, the IMU 215, and compute units 230 may be internal to the HMD 200 and may not be visible to the user. In various implementations, the IMU 215, position sensors 220, and locators 225 can track movement and location of the HMD 200 in the real world and in an artificial reality environment in three degrees of freedom (3DoF) or six degrees of freedom (6DoF). For example, the locators 225 can emit infrared light beams which create light points on real objects around the HMD 200. As another example, the IMU 215 can include e.g., one or more accelerometers, gyroscopes, magnetometers, other non-camera-based position, force, or orientation sensors, or combinations thereof. One or more cameras (not shown) integrated with the HMD 200 can detect the light points. Compute units 230 in the HMD 200 can use the detected light points to extrapolate position and movement of the HMD 200 as well as to identify the shape and position of the real objects surrounding the HMD 200.

[0058] The electronic display 245 can be integrated with the front rigid body 205 and can provide image light to a user as dictated by the compute units 230. In various embodiments, the electronic display 245 can be a single electronic display or multiple electronic displays (e.g., a display for each user eye). Examples of the electronic display 245 include: a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, an active-matrix organic light-emitting diode display (AMOLED), a display including one or more quantum dot light-emitting diode (QOLED) sub-pixels, a projector unit (e.g., microLED, LASER, etc.), some other display, or some combination thereof.

[0059] In some implementations, the HMD 200 can be coupled to a core processing component such as a personal computer (PC) (not shown) and/or one or more external sensors (not shown). The external sensors can monitor the HMD 200 (e.g., via light emitted from the HMD 200) which the PC can use, in combination with output from the IMU 215 and position sensors 220, to determine the location and movement of the HMD 200.

[0060] Figure 2B is a wire diagram of a mixed reality HMD system 250 which includes a mixed reality HMD 252 and a core processing component 254. The mixed reality HMD 252 and the core processing component 254 can communicate via a wireless connection (e.g., a 60 GHz link) as indicated by link 256. In other implementations, the mixed reality system 250 includes a headset only, without an external compute device or includes other wired or wireless connections between the mixed reality HMD 252 and the core processing component 254. The mixed reality HMD 252 includes a pass-through display 258 and a frame 260. The frame 260 can house various electronic components (not shown) such as light projectors (e.g., LASERS, LEDs, etc.), cameras, eye-tracking sensors, MEMS components, networking components, etc.

[0061] The projectors can be coupled to the pass-through display 258, e.g., via optical elements, to display media to a user. The optical elements can include one or more waveguide assemblies, reflectors, lenses, mirrors, collimators, gratings, etc., for directing light from the projectors to a user's eye. Image data can be transmitted from the core processing component 254 via link 256 to HMD 252. Controllers in the HMD 252 can convert the image data into light pulses from the projectors, which can be transmitted via the optical elements as output light to the user's eye. The output light can mix with light that passes through the display 258, allowing the output light to present virtual objects that appear as if they exist in the real world.

[0062] Similarly to the HMD 200, the HMD system 250 can also include motion and position tracking units, cameras, light sources, etc., which allow the HMD system 250 to, e.g., track itself in 3DoF or 6DoF, track portions of the user (e.g., hands, feet, head, or other body parts), map virtual objects to appear as stationary as the HMD 252 moves, and have virtual objects react to gestures and other real-world objects.

[0063] Figure 2C illustrates controllers 270 (including controllers 276A and 276B), which, in some implementations, a user can hold in one or both hands to interact with an artificial reality environment presented by the HMD 200 and/or HMD 250. The controllers 270 can be in communication with the HMDs, either directly or via an external device (e.g., core processing component 254). The controllers can have their own IMU units, position sensors, and/or can emit further light points. The HMD 200 or 250, external sensors, or sensors in the controllers can track these controller light points to determine the controller positions and/or orientations (e.g., to track the controllers in 3DoF or 6DoF). The compute units 230 in the HMD 200 or the core processing component 254 can use this tracking, in combination with IMU and position output, to monitor hand positions and motions of the user. The controllers can also include various buttons (e.g., buttons 272A-F) and/or joysticks (e.g., joysticks 274A-B), which a user can actuate to provide input and interact with objects.

[0064] In various implementations, the HMD 200 or 250 can also include additional subsystems, such as an eye tracking unit, an audio system, various network components, etc., to monitor indications of user interactions and intentions. For example, in some implementations, instead of or in addition to controllers, one or more cameras included in the HMD 200 or 250, or from external cameras, can monitor the positions and poses of the user's hands to determine gestures and other hand and body motions. As another example, one or more light sources can illuminate either or both of the user's eyes and the HMD 200 or 250 can use eye-facing cameras to capture a reflection of this light to determine eye position (e.g., based on a set of reflections around the user's cornea), modeling the user's eye and determining a gaze direction.

[0065] Figure 3 is a block diagram illustrating an overview of an environment 300 in which some implementations of the disclosed technology can operate. Environment 300 can include one or more client computing devices 305A-D, examples of which can include computing system 100. In some implementations, some of the client computing devices (e.g., client computing device 305B) can be the HMD 200 or the HMD system 250. Client computing devices 305 can operate in a networked environment using logical connections through network 330 to one or more remote computers, such as a server computing device.

[0066] In some implementations, server 310 can be an edge server which receives client requests and coordinates fulfillment of those requests through other servers, such as servers 320A-C. Server computing devices 310 and 320 can comprise computing systems, such as computing system 100. Though each server computing device 310 and 320 is displayed logically as a single server, server computing devices can each be a distributed computing environment encompassing multiple computing devices located at the same or at geographically disparate physical locations.

[0067] Client computing devices 305 and server computing devices 310 and 320 can each act as a server or client to other server/client device(s). Server 310 can connect to a database 315. Servers 320A-C can each connect to a corresponding database 325A-C. As discussed above, each server 310 or 320 can correspond to a group of servers, and each of these servers can share a database or can have their own database. Though databases 315 and 325 are displayed logically as single units, databases 315 and 325 can each be a distributed computing environment encompassing multiple computing devices, can be located within their corresponding server, or can be located at the same or at geographically disparate physical locations.

[0068] Network 330 can be a local area network (LAN), a wide area network (WAN), a mesh network, a hybrid network, or other wired or wireless networks. Network 330 may be the Internet or some other public or private network. Client computing devices 305 can be connected to network 330 through a network interface, such as by wired or wireless communication. While the connections between server 310 and servers 320 are shown as separate connections, these connections can be any kind of local, wide area, wired, or wireless network, including network 330 or a separate public or private network.

[0069] Figure 4 is a block diagram illustrating components 400 which, in some implementations, can be used in a system employing the disclosed technology. Components 400 can be included in one device of computing system 100 or can be distributed across multiple of the devices of computing system 100. The components 400 include hardware 410, mediator 420, and specialized components 430. As discussed above, a system implementing the disclosed technology can use various hardware including processing units 412, working memory 414, input and output devices 416 (e.g., cameras, displays, IMU units, network connections, etc.), and storage memory 418. In various implementations, storage memory 418 can be one or more of: local devices, interfaces to remote storage devices, or combinations thereof. For example, storage memory 418 can be one or more hard drives or flash drives accessible through a system bus or can be a cloud storage provider (such as in storage 315 or 325) or other network storage accessible via one or more communications networks. In various implementations, components 400 can be implemented in a client computing device such as client computing devices 305 or on a server computing device, such as server computing device 310 or 320.

[0070] Mediator 420 can include components which mediate resources between hardware 410 and specialized components 430. For example, mediator 420 can include an operating system, services, drivers, a basic input output system (BIOS), controller circuits, or other hardware or software systems.

[0071] Specialized components 430 can include software or hardware configured to perform operations for providing an overlay over a concurrently executing artificial reality environment. Specialized components 430 can include system shell 434, resource manager 436, display manager 438, and components and APIs which can be used for providing user interfaces, transferring data, and controlling the specialized components, such as interfaces 432. In some implementations, components 400 can be in a computing system that is distributed across multiple computing devices or can be an interface to a server-based application executing one or more of specialized components 430. Although depicted as separate components, specialized components 430 may be logical or other nonphysical differentiations of functions and/or may be submodules or code-blocks of one or more applications.

[0072] System shell 434 can manage the software components of a XR system, such as system display components, software functionality for system display components, and interactions with launched applications. For example, system shell 434 can be a shell environment in which applications are executed, such as shell applications or applications remote from the shell. Example applications include web browsers, music players, video players, social media applications, messaging or other communication applications, third-party applications, streaming/casting applications, a content library application, or any other suitable application.

[0073] Implementations of system shell 434 can interact with the applications executing at the XR system to manage visual displays for the applications, such as two-dimensional virtual object displays (e.g., panels). System shell 434 can manage positioning and sizing of the virtual object displays and allocate portions of display area/volume to the executing applications. In some implementations, system shell 434 can comprise several units of executing software in combination. Additional details on system shell 434 are provided below in relation to Figures 5, 6, 7, blocks 806, 808, 810, and 812 of Figure 8.
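As an illustration of the shell's panel positioning and sizing responsibilities described above, the sketch below assigns each executing application an equal-width slice of the overlay's display area. The even-split policy and the layout_panels function are assumptions for illustration, not the layout logic of system shell 434.

# Minimal sketch of the shell laying out 2D panels inside the overlay's display area.

def layout_panels(app_names, area_width=1.0, area_height=0.6):
    """Assign each application an equal-width panel inside the overlay area."""
    if not app_names:
        return {}
    panel_width = area_width / len(app_names)
    return {
        name: {"x": i * panel_width, "y": 0.0,
               "width": panel_width, "height": area_height}
        for i, name in enumerate(app_names)
    }

if __name__ == "__main__":
    for name, rect in layout_panels(["browser", "messages", "music"]).items():
        print(name, rect)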

[0074] Resource manager 436 can allocate system resources between a process that executes the XR environment and a process that executes the overlay. For example, resource manager 436 can allocate XR system resources, such as processor capacity, to a XR environment process to support execution of the XR environment and a shell process to support execution of the overlay. Processor capacity can represent one or more of a number of processor cores, processor core performance (e.g., clock speed, processor cycles or processing windows, processor cache memory, etc.), any other suitable processor capacity metric, or any combination thereof. In some implementations, the processor capacity resource allocations from resource manager 436 can provide enough system resources to the shell process to support overlay functionality while preserving a majority of system resources for allocation to the XR process to support the XR environment. For example, the shell process and overlay service can be configured with lean resource requirements so that concurrent execution with the XR environment process is feasible.

[0075] In some implementations, resource manager 436 can scale portions of processor capacity allocated to the overlay when the overlay is displayed and/or in response to interactions with the user. For example, the clock speed of the processor cores allocated to the shell process/overlay can be dynamically scaled up. In these implementations, resource manager 436 can increase the processor capacity available to the shell process/overlay without shifting system resources from the artificial reality environment process. Additional details on resource manager 436 are provided below in relation to Figure 6.

[0076] Display manager 438 can coordinate display of pixels for a composite of the overlay and the XR environment. For example, while the overlay is displayed (e.g., over the XR environment) the XR environment process can continue to provide display output (e.g., frames, pixels, etc.) for display at the XR system. In some implementations, display manager 438 can receive display output for the overlay from the shell process and display output for the XR environment from the XR process. For example, the XR environment can be an immersive three-dimensional environment that alters the display presented to the user in response to detected movement of the user, such as changes in head position.

[0077] Implementations of display manager 438 can generate a composite display that includes the two-dimensional content for the overlay and the three-dimensional content for the XR environment. For example, the user can interact with the two-dimensional overlay by driving a cursor via user input while the display of the three-dimensional XR environment in the background is dynamically changed in response to user head movement. In some implementations, the XR environment in the background is paused (e.g., in a frozen state, in a holding state, etc.) while the overlay is displayed/active; however, the XR environment process continues to output display (e.g., frames, pixels) while paused, such as display changes in response to user head movement. Additional details on display manager 438 are provided below in relation to Figures 5 and 7.

[0078] Figure 5 is an artificial reality environment with a displayed overlay. Diagram 500 includes XR environment 502, overlay 504, and user 506. Implementations can immerse a user 506 in XR environment 502 via any suitable XR system. For example, user 506 can interact with XR environment 502 via hand-held controller input (e.g., tracked motion, buttons, joystick, etc.), user movement (e.g., tracked hand movements), or any other suitable input channel. In some implementations, user 506 can trigger the display of overlay 504, such as via a button press of a hand-held controller, a tap on a UI element (e.g., on a user's wrist), a pre-defined gesture or voice command, or any other suitable input.

[0079] In response to the user input, implementations display overlay 504, a two-dimensional virtual object (e.g., panel), over XR environment 502. When overlay 504 is displayed, XR environment 502 can be concurrently executed in a paused or frozen state. For example, a shell process can implement overlay 504 and a XR process can implement XR environment 502. The shell process (or any other suitable process) can notify the XR process that overlay 504 is active and, in response, the XR process can transition XR environment 502 into the paused or frozen state. For example, when overlay 504 is active, the shell process can be granted input priority over the XR process. When the shell process has input priority, portions of the input channels from the XR system (e.g., hand-held controller input, tracked user hand input, etc.) can drive interactions at overlay 504 (e.g., cursor-based interactions) instead of XR environment 502 interactions. Due to this transition, XR environment 502 may receive no or limited input from user 506 via the input channels of the XR system, and thus XR environment 502 can be paused until overlay 504 is closed/no longer active.

[0080] Overlay 504 can display shell controls, widgets, and tools (e.g., where user 506 can launch applications). For example, a launched application can comprise a display component, such as a two-dimensional virtual object (e.g., panel). Overlay 504 can permit user 506 access to applications run by the XR system without terminating XR environment 502. Implementations concurrently run the shell process and the XR process and concurrently display overlay 504 and XR environment 502. In some implementations, interactions with overlay 504 are achieved by user input via a first channel of the XR system (e.g., hand-held controller input, tracked user hands, etc.), while XR environment 502 can alter its display in response to user input via a second channel of the XR system (e.g., tracked head position). For example, the first channel input can drive cursor-based interactions at overlay 504 while the second channel input dynamically changes the view of XR environment 502. Implementations concurrently display changes to the display of overlay 504 based on the first channel input and changes to XR environment 502 based on the second channel input.
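A hedged sketch of this two-channel routing, with hypothetical event fields and handler names (move_cursor, update_view), is:

class ShellStub:
    def move_cursor(self, dx, dy):
        print("overlay cursor moved by", dx, dy)

class XRStub:
    def handle_hand_input(self, event):
        print("XR environment handles hand input", event)

    def update_view(self, pose):
        print("XR environment view updated to", pose)

def route_input(event: dict, overlay_active: bool, shell, xr) -> None:
    """Route an input event to the shell process or the XR process (illustrative only)."""
    if event["channel"] == "hand":            # controller / tracked-hand input
        if overlay_active:
            shell.move_cursor(event["dx"], event["dy"])   # cursor-based overlay interaction
        else:
            xr.handle_hand_input(event)
    elif event["channel"] == "head":          # tracked head pose / gaze
        xr.update_view(event["pose"])         # perspective changes even while paused

route_input({"channel": "head", "pose": (0.1, 0.0, 0.0)}, True, ShellStub(), XRStub())
route_input({"channel": "hand", "dx": 4, "dy": -2}, True, ShellStub(), XRStub())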

[0081] In some implementations, a multi-client runtime service provides the shell process and the XR process access to system resources that support the overlay and the XR environment. Figure 6 is a diagram of a system architecture for concurrently executing an artificial reality environment with an overlay service. Diagram 600 includes shell process 602, XR process 604, XR runtime process 606, overlay service 608, game engine 610, and XR runtime service 612.

[0082] During immersive display of a XR environment, game engine 610 performs XR environment dynamics and drives display of the immersive three-dimensional experience. Game engine 610, running in XR process 604, accesses system resources via XR runtime service 612 that operates in XR runtime process 606. When an overlay is triggered, shell process 602 can connect to XR runtime process 606 to access system resources for the overlay. For example, XR runtime process 606 can be a multi-client runtime. In some implementations, shell process 602 comprises a persistent connection to XR runtime service 612.
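One way to picture the multi-client runtime of diagram 600 is as a service that accepts connections from both processes and hands back resource handles; the class and method names below are assumptions for illustration only.

class XRRuntimeService:
    """Hypothetical multi-client runtime: grants resource handles to connected clients."""
    def __init__(self):
        self.clients = {}

    def connect(self, client_name: str, persistent: bool = False) -> dict:
        handle = {"client": client_name, "persistent": persistent}
        self.clients[client_name] = handle
        return handle

runtime = XRRuntimeService()
xr_handle = runtime.connect("xr_process/game_engine")            # immersive XR environment
shell_handle = runtime.connect("shell_process/overlay_service",  # overlay, kept connected
                               persistent=True)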

[0083] Dynamics and processing for the overlay can be performed by overlay service 608. When an overlay is active (e.g., displayed over an XR environment), implementations allocate a majority of system resources (e.g., processor capacity) to XR process 604 (e.g., game engine 610) and a minority of system resources to shell process 602 (e.g., overlay service 608). XR runtime process 606 can provide access to these allocated system resources to shell process 602 and XR process 604.

[0084] In some implementations, user input relative to the displayed overlay can include interactions with two-dimensional virtual objects (e.g., panels) that comprise the display components of launched/executing applications. Figure 7 is a diagram of an example overlay display. Diagram 700 includes overlay 702, XR environment 704, and virtual objects 706 and 708. Overlay 702 can be displayed over XR environment 704, for example while XR environment 704 is in a paused or frozen state.

[0085] Overlay 702 displays virtual objects 706 and 708, which comprise two-dimensional panels. Virtual object 706 can display content for a first executing application launched by a user and virtual object 708 can display content for a second executing application launched by the user. User interactions can reposition and resize virtual objects 706 and 708 in overlay 702. Implementations can store the state of overlay 702, for example by storing: an overlay state that defines location, position, and initialization parameters for a virtual object and corresponding executing application; and an application state that defines content displayed in the virtual object by the executing application (e.g., webpage, scroll position, playing video or song, open messages in a messaging application, etc.). For example, the state of overlay 702 can be stored when a user closes the overlay or otherwise ends the user session with the XR system.

[0086] In some implementations, each application can store its own application state upon closing of overlay 702. For example, the applications can comprise Android applications with lifecycle management. An overlay manager (e.g., a shell system application/process) can store the overlay state upon closing of overlay 702. For example, the overlay state can be saved as a serialized JavaScript Object Notation (JSON) file, or any other suitable data file.
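A hedged sketch of this save path, with assumed file and field names, separates the overlay manager's placement/launch data from each application's own content state:

import json

overlay_state = {
    "panels": [
        {"app_id": "browser", "x": 0.5, "y": 0.2, "width": 1.5, "height": 0.9,
         "init_params": {"start_page": "home"}},
        {"app_id": "messenger", "x": 2.1, "y": 0.2, "width": 1.0, "height": 0.9,
         "init_params": {}},
    ]
}

def save_overlay_state(state: dict, path: str = "overlay_state.json") -> None:
    # The overlay manager (a shell system application) serializes placement/launch data.
    with open(path, "w") as f:
        json.dump(state, f)

def save_application_state(app_id: str, app_state: dict) -> None:
    # Each application stores its own content state (URL, scroll position, open message, ...).
    with open(f"{app_id}_state.json", "w") as f:
        json.dump(app_state, f)

save_overlay_state(overlay_state)
save_application_state("browser", {"url": "https://example.com", "scroll": 0.42})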

[0087] In some implementations, a user may reopen overlay 702 after previously closing the overlay. For example, the user may close overlay 702, interact with XR environment 704 for a period of time, and later reopen overlay 702. In this example, the overlay manager can restore overlay 702 according to the stored overlay state and stored application states to recreate the display of overlay 702 when it was closed. For example, the overlay manager can restore virtual objects 706 and 708 to their positions/sizes when overlay 702 was closed by accessing the stored overlay state (e.g., accessing and de-serializing the JSON data file). In addition, the overlay manager can relaunch/initialize the applications previously executing, and the launched applications can restore the contents displayed by the executing applications at virtual objects 706 and 708 (e.g., website, scroll positioning, song playing, video playing, messages open, etc.) by accessing their own stored application states. In combination, virtual objects 706 and 708 can be positioned and sized to mimic the state of overlay 702 when closed and the contents of virtual objects 706 and 708 can be restored to mimic the content displayed when overlay 702 was closed.

[0088] Those skilled in the art will appreciate that the components illustrated in Figures 1-7 described above, and in each of the flow diagrams discussed below, may be altered in a variety of ways. For example, the order of the logic may be rearranged, substeps may be performed in parallel, illustrated logic may be omitted, other logic may be included, etc. In some implementations, one or more of the components described above can execute one or more of the processes described below.

[0089] Figure 8 is a flow diagram illustrating a process used in some implementations of the present technology for providing an overlay over a concurrently executing artificial reality environment. In some implementations, process 800 can be performed by a XR system. In some implementations, process 800 can be triggered when a XR environment is displayed to a user.

[0090] At block 802, process 800 can display a XR environment to a user. For example, a XR system can display an immersive three-dimensional XR environment to a user. In some implementations, content for the XR environment can be provided by a process executing at the XR system, such as an XR process. The XR process can host one or more XR applications that provide functionality for the XR environment. In some implementations, the XR process can connect to a XR runtime service.

[0091] At block 804, process 800 can detect when an overlay has been triggered. For example, a user can launch an overlay via input from the XR system, such as a button push on a hand-held controller, tracked user hand input, touchpad input, or any other suitable input. When an overlay has been triggered, process 800 can progress to block 806. When the overlay has not been triggered, process 800 can loop back to block 802, where the XR environment can continue to be displayed.

[0092] At block 806, process 800 can display the two-dimensional overlay over the XR environment. For example, the two-dimensional overlay can provide a separate overlay service, under control of a shell process, in which additional applications can be executed. In some implementations, the shell process can connect to the XR runtime service.

[0093] At block 808, process 800 can transition the shell process to a foreground priority and the XR process to the background priority. For example, the shell process (e.g., an overlay service at the shell process) can be granted user input priority over the XR environment when the overlay is launched. The user input priority for the shell process permits user input via a first input channel of the XR system to drive interactions at the displayed overlay (e.g., cursor based interactions) rather than the XR environment. In some implementations, an indication about the user input priority of the overlay can be provided to the XR environment (e.g., XR process), and the XR environment can initiate a pause or frozen state in response to the indication. In some implementations, user input via a second input channel can drive interactions with the XR environment (e.g., the backgrounded XR process). For example, the first input channel can be user hand input (e.g., touch input, hand-held controller input, tracked hand motion, or any combination thereof) and the second input channel can be user head input (e.g., tracked user head movement, tracked user gaze, or any combination thereof).
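A minimal sketch of the block 808 transition, using hypothetical flags (input_priority, paused) to stand in for the actual priority and pause mechanisms, is:

class ProcessState:
    def __init__(self, name: str):
        self.name = name
        self.input_priority = False
        self.paused = False

def transition_to_overlay(shell: ProcessState, xr: ProcessState) -> None:
    """Block 808: shell gains foreground (input) priority, XR moves to background priority."""
    shell.input_priority = True
    xr.input_priority = False
    # The XR process is notified of the overlay's input priority and responds by
    # pausing, while continuing to output frames that follow head movement.
    xr.paused = True

shell, xr = ProcessState("shell"), ProcessState("xr")
transition_to_overlay(shell, xr)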

[0094] In some implementations, while the XR process is in the background priority, a majority of XR system resource capacity is allocated to executing the XR process. For example, a majority of the XR system resource capacity may be required to execute the XR process in order to keep the XR process from crashing. In some implementations, the XR system resource capacity comprises at least processor capacity.

[0095] At block 810, process 800 can receive input and interact with the user. For example, the displayed overlay can receive user input, via the first input channel (e.g., user hand input), to launch/execute one or more additional applications, interact with two-dimensional virtual objects executing at the shell process, move or resize the two-dimensional virtual objects, and perform other suitable shell environment functionality. In some implementations, the two-dimensional overlay and launched additional applications run concurrently with the XR environment (e.g., in a paused or frozen state). For example, the shell process and the XR process can run concurrently and support concurrent execution of the overlay and XR environment.

[0096] In some implementations, at least a portion of the content displayed by the overlay component is based on the user’s context with respect to the XR environment. For example, the XR environment can be a shared XR environment (e.g., shared with a plurality of users) that includes a plurality of XR worlds (e.g., different portions of the XR environment). The portion of content displayed by the overlay component that is based on the user’s context with respect to the XR environment can be a list of known users (e.g., the user’s friends, users in a group or clan, etc.) that are present in a same XR world of the XR environment as the user.
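One concrete reading of this example, using assumed data structures (a simple mapping of friend name to current XR world), is:

def known_users_in_same_world(user_world: str, friends: dict) -> list:
    """Return the known users (e.g., friends) present in the same XR world as the user.

    'friends' maps a friend's name to the XR world that friend is currently in.
    """
    return [name for name, world in friends.items() if world == user_world]

presence = {"ada": "lobby_world", "grace": "arena_world", "linus": "lobby_world"}
print(known_users_in_same_world("lobby_world", presence))   # ['ada', 'linus']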

[0097] In some implementations, the XR environment can receive user input via the second input channel (e.g., user head input). For example, in response to input via the second input channel, the XR process can alter the display of the XR environment to the user by changing a perspective of the user’s view of the XR environment. In other words, when the user moves the user’s head and/or changes the user’s gaze, the XR process can alter the display of the XR environment (e.g., displayed behind the overlay) to provide the user an immersive experience.

[0098] At block 812, process 800 can detect when the user closes the overlay. For example, the user can close the overlay via user input at the XR system, by ending the user session with the XR system, or in any other suitable manner. In some implementations, user input that closes the overlay can terminate display of the overlay component (e.g., display of the overlay over the XR environment). When it is detected that the user has closed the overlay, process 800 can progress to block 814. When it is not detected that the user has closed the overlay, process 800 can loop back to block 810, where user interactions can continue until it is detected that the user has closed the overlay.

[0099] At block 814, process 800 can store state information for the overlay component. For example, state information can be stored for the overlay component in response to the terminating, the state information comprising a state of content displayed at the overlay component and at least one application state for at least one additional application executing via the shell process.

[00100] At block 816, process 800 can resume the XR environment. For example, when display of the overlay is terminated via input at the XR system (e.g., hand-held controller buttons, selection of a close element at the overlay via user input, etc.) the XR environment can be notified. In some implementations, the XR environment can be transitioned to the foreground in response to the terminating (e.g., the XR environment can regain priority for input via the first input channel). In this example, based on the notification, the XR environment can be transitioned from a paused or frozen state to a resumed or unfrozen state. In some implementations, user interactions with the XR environment via user input at the XR system (e.g., given the regained user input priority) can resume the XR environment. Process 800 can then progress to block 802, where the XR environment is displayed to the user.
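Pulling blocks 802 through 816 together, a hedged skeleton of process 800 might look as follows; xr_system and its methods are placeholders rather than an actual API.

def run_process_800(xr_system) -> None:
    """Skeleton of Figure 8; xr_system is a placeholder object, not a real interface."""
    while xr_system.session_active():
        xr_system.display_xr_environment()                               # block 802
        if not xr_system.overlay_triggered():                            # block 804
            continue
        xr_system.display_overlay()                                      # block 806
        xr_system.set_priorities(shell="foreground", xr="background")    # block 808
        while not xr_system.overlay_closed():                            # block 812
            xr_system.handle_overlay_input()                             # block 810
        xr_system.store_overlay_state()                                  # block 814
        xr_system.resume_xr_environment()                                # block 816, then loop to 802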

[00101] In some implementations, the overlay can be redisplayed in response to a trigger (e.g., user input that loads the overlay). For example, the redisplayed overlay can comprise previously stored state information. Upon triggering a redisplay of the overlay over the XR environment, content displayed at the redisplayed overlay can be restored using the stored state information, and an application state for at least one additional executing application can be restored using the stored state information. For example, the content displayed at the overlay can be restored to the content it displayed when it was closed/terminated. In addition, the additional application (e.g., hosted by the shell process) can be restored to the execution state that it was in when the overlay was closed/terminated.
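A hedged sketch of this restore path, mirroring the assumed save format above (overlay_state.json plus per-application state files), is:

import json

def restore_overlay(path: str = "overlay_state.json") -> list:
    """Rebuild the overlay from stored state; file and field names are illustrative assumptions."""
    with open(path) as f:
        state = json.load(f)
    restored = []
    for panel in state["panels"]:
        # Recreate the panel at its previously stored position and size.
        restored.append(panel)
        # Relaunch the application; it reads its own stored state to restore content.
        try:
            with open(f"{panel['app_id']}_state.json") as f:
                app_state = json.load(f)
        except FileNotFoundError:
            app_state = {}                      # launch fresh if no saved state exists
        print(f"relaunch {panel['app_id']} at ({panel['x']}, {panel['y']}) "
              f"size {panel['width']}x{panel['height']} with {app_state}")
    return restored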

[00102] Reference in this specification to "implementations" (e.g., "some implementations," "various implementations," “one implementation,” “an implementation,” etc.) means that a particular feature, structure, or characteristic described in connection with the implementation is included in at least one implementation of the disclosure. The appearances of these phrases in various places in the specification are not necessarily all referring to the same implementation, nor are separate or alternative implementations mutually exclusive of other implementations. Moreover, various features are described which may be exhibited by some implementations and not by others. Similarly, various requirements are described which may be requirements for some implementations but not for other implementations.

[00103] As used herein, being above a threshold means that a value for an item under comparison is above a specified other value, that an item under comparison is among a certain specified number of items with the largest value, or that an item under comparison has a value within a specified top percentage value. As used herein, being below a threshold means that a value for an item under comparison is below a specified other value, that an item under comparison is among a certain specified number of items with the smallest value, or that an item under comparison has a value within a specified bottom percentage value. As used herein, being within a threshold means that a value for an item under comparison is between two specified other values, that an item under comparison is among a middle-specified number of items, or that an item under comparison has a value within a middle-specified percentage range. Relative terms, such as high or unimportant, when not otherwise defined, can be understood as assigning a value and determining how that value compares to an established threshold. For example, the phrase "selecting a fast connection" can be understood to mean selecting a connection that has a value assigned corresponding to its connection speed that is above a threshold.

[00104] As used herein, the word "or" refers to any possible permutation of a set of items. For example, the phrase "A, B, or C" refers to at least one of A, B, C, or any combination thereof, such as any of: A; B; C; A and B; A and C; B and C; A, B, and C; or multiple of any item such as A and A; B, B, and C; A, A, B, C, and C; etc.

[00105] Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Specific embodiments and implementations have been described herein for purposes of illustration, but various modifications can be made without deviating from the scope of the embodiments and implementations. The specific features and acts described above are disclosed as example forms of implementing the claims that follow. Accordingly, the embodiments and implementations are not limited except as by the appended claims.