

Title:
SYSTEM AND METHOD FOR RICH CONTENT BROWSING MULTITASKING ON DEVICE OPERATING SYSTEMS WITH MULTITASKING LIMITATIONS
Document Type and Number:
WIPO Patent Application WO/2024/097301
Kind Code:
A1
Abstract:
A controller can be used with a computing device to select and/or interact with content using user input devices on the controller. The content can be locally-stored on the computing device and/or streamed from a remote device. In one embodiment, a contextually-aware platform service switcher is provided. In another embodiment, a system and method for automatic content capability detection are provided. In yet another embodiment, a system and method for rich content browsing multitasking on device operating systems with multitasking limitations are provided. These embodiments can be used alone or in combination. Other embodiments are provided.

Inventors:
CHOW CASEY (US)
KAPUR ROHAN (US)
HOOKANO KAUHI (US)
Application Number:
PCT/US2023/036609
Publication Date:
May 10, 2024
Filing Date:
November 01, 2023
Assignee:
BACKBONE LABS INC (US)
International Classes:
A63F13/53
Foreign References:
US20190230400A12019-07-25
US20150128042A12015-05-07
US11389721B22022-07-19
US202117504283A2021-10-18
US202016808339A2020-03-03
US202217866166A2022-07-15
US202217866234A2022-07-15
US202217856895A2022-07-01
US202217850912A2022-06-27
Attorney, Agent or Firm:
HETZ, Joseph, F. et al. (US)
Claims:
What is claimed is:

1. A method comprising: performing in a computing device in communication with a controller: receiving an input from the controller indicating actuation of one or more user input devices of the controller; and in response to receiving the input, displaying a subordinate view of content, wherein the subordinate view provides an indication of a user’s position in a queue to play the content.

2. The method of Claim 1, wherein the subordinate view comprises a minimized display area in a picture-in-picture display.

3. The method of Claim 1, wherein the subordinate view comprises an overlay to a display area.

4. The method of Claim 1, wherein the content is displayed in a browser.

5. The method of Claim 1, wherein the computing device’s operating system does not support the subordinate view.

6. The method of Claim 1, wherein the content is playable via a controller app in the computing device.

7. A non-transitory computer-readable storage medium storing a computer program having instructions that, when executed by one or more processors in a computing device, cause the one or more processors to: receive an input from a controller in communication with the computing device indicating actuation of one or more user input devices of the controller; and in response to receiving the input: un-focus content while keeping the content in a foreground; and display a subordinate view of the content.

8. The non-transitory computer-readable storage medium of Claim 7, wherein the subordinate view comprises a minimized display area in a picture-in-picture display.

9. The non-transitory computer-readable storage medium of Claim 7, wherein the subordinate view comprises an overlay to a display area.

10. The non-transitory computer-readable storage medium of Claim 7, wherein the subordinate view provides indicia identifying the content.

11. The non-transitory computer-readable storage medium of Claim 7, wherein the subordinate view provides information about the content.

12. The non-transitory computer-readable storage medium of Claim 11, wherein the information comprises an indication of a user’s position in a queue to play the content.

13. The non-transitory computer-readable storage medium of Claim 7, wherein the content is displayed in a browser.

14. The non-transitory computer-readable storage medium of Claim 13, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: prevent termination of the browser.

15. The non-transitory computer-readable storage medium of Claim 7, wherein the subordinate view is movable by a user in a display.

16. The non-transitory computer-readable storage medium of Claim 7, wherein the computing device’s operating system does not support the subordinate view.

17. The non-transitory computer-readable storage medium of Claim 7, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: automatically display the subordinate view.

18. The non-transitory computer-readable storage medium of Claim 7, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: provide an indication to a user that the user can perform other activities while waiting on a queue.

19. The non-transitory computer-readable storage medium of Claim 7, wherein the content is playable via a controller app in the computing device.

20. A computing device comprising: an interface configured to communicate with a controller; and one or more processors configured to communicate with the interface and further configured to: receive an input from the controller indicating actuation of one or more user input devices of the controller; and in response to receiving the input, display a subordinate view of content.

21. The computing device of Claim 20, wherein the subordinate view provides an indication of a user’s position in a queue to play the content.

22. The computing device of Claim 20, wherein the computing device’s operating system does not support the subordinate view but utilizes operating-system-supported subordinate views to simulate the subordinate view.

23. A non-transitory computer-readable storage medium storing a computer program having instructions that, when executed by one or more processors in a computing device, cause the one or more processors to: receive an input from a controller in communication with the computing device indicating interaction of one or more user input devices of the controller; in response to receiving the input from the controller, focus a user interface that presents content for engagement; in response to receiving user input, un-focus content selected from the user interface and display the un-focused content subordinate to the user interface; in response to receiving user input, dismiss a user interface that is currently presented that cannot be un-focused; and in response to receiving user input, launch a user interface that is currently not presented.

24. The non-transitory computer-readable storage medium of Claim 23, wherein the user input is received from the controller.

25. The non-transitory computer-readable storage medium of Claim 23, wherein the user input is received from a touch screen of the computing device.

26. The non-transitory computer-readable storage medium of Claim 23, wherein the subordinate view comprises a minimized display area in a picture-in-picture display.

27. The non-transitory computer-readable storage medium of Claim 23, wherein the subordinate view comprises an overlay to the user interface.

28. The non-transitory computer-readable storage medium of Claim 23, wherein the subordinate view provides indicia identifying the content.

29. The non-transitory computer-readable storage medium of Claim 23, wherein the subordinate view provides an indication of meaningful state data to play the content.

30. The non-transitory computer-readable storage medium of Claim 23, wherein the content is playable via a controller app in the computing device.

31. A system comprising: a controller; and a computing device comprising: one or more processors; and a non-transitory computer-readable storage medium storing a computer program having instructions that, when executed by one or more processors, cause the one or more processors to: receive an input from the controller indicating actuation of one or more user input devices of the controller; and in response to receiving the input: un-focus content while keeping the content in a foreground; and display a subordinate view of the content.

Description:
System and Method for Rich Content Browsing Multitasking on Device Operating Systems with Multitasking Limitations

Cross-Reference to Related Application

[0001] This application claims the benefit of U.S. provisional patent application number 63/422,797, filed November 4, 2022, which is hereby incorporated by reference.

Background

[0002] A game controller is a device used to provide input to a video game, for example, to control an object or character in the video game. The video game may be running on a computer, a specially-designated gaming system, or a mobile device. In some prior art devices, the game controller is designed to mechanically couple to a mobile device. Also, a controller can be used with a computing device to select and/or interact with content using user input devices on the controller. The content can be locally- stored on the computing device and/or streamed from a remote device. For example, the controller can be a game controller used to play a game that is native to the computing device and/or to play a game that is streamed from a remote server to a browser of the computing device.

Brief Description of the Drawings

[0003] Figure 1 is a top, perspective view showing portions of a game controller of an embodiment.

[0004] Figure 2 is a top, perspective view of the game controller of Figure 1.

[0005] Figure 3 is a top, perspective view of the game controller of Figure 1.

[0006] Figure 4 is a partially exploded view of the game controller of Figure 3.

[0007] Figure 5 is a bottom, perspective view of the game controller of Figure 3.

[0008] Figure 6 is a perspective view of a game controller of an embodiment.

[0009] Figure 7 is a top view of the game controller of Figure 6.

[0010] Figure 8 is an illustration of an integrated game controller of an embodiment.

[0011] Figure 9 is an illustration of an example environment for a game controller of an embodiment.

[0012] Figure 10 is an illustration of a cloud service of an embodiment.

[0013] Figure 11 is an illustration of subsystems of an embodiment.

[0014] Figure 12 is an illustration of a mobile game controller and mobile device of an embodiment.

[0015] Figure 13 is a flowchart of an example upgrade procedure of an embodiment.

[0016] Figure 14 is an illustration of a game controller of an embodiment with primary and secondary ports.

[0017] Figure 15 is a diagram of a configuration flow of an embodiment.

[0018] Figure 16 is a screen shot of a settings menu of an embodiment.

[0019] Figure 17 is a screen shot of a selection menu for a Play on Any Screen feature of an embodiment.

[0020] Figure 18 is an illustration of a demonstration of a Play on Any Screen feature of an embodiment.

[0021] Figure 19 is a block diagram of a mobile game controller, a mobile device, and a USB host of an embodiment.

[0022] Figure 20 is a screen shot of a capture, edit, and share feature of an embodiment.

[0023] Figure 21 is a screen shot of an embodiment for recording.

[0024] Figure 22 is a screen shot of an embodiment for streaming.

[0025] Figure 23 is a block diagram of a mobile game controller and a platform operating service of an embodiment.

[0026] Figure 24 is a screen shot of an embodiment for choosing a recording mode.

[0027] Figure 25 is a screen shot of an embodiment displaying a notification that smart recording is enabled.

[0028] Figure 26 is a block diagram of an audio/video pipeline of an embodiment.

[0029] Figure 27 is an illustration of a mobile phone and mobile game controller of an embodiment.

[0030] Figures 28-29 are screen shots that illustrate a live streaming feature of an embodiment.

[0031] Figures 30-31 are screen shots that illustrate an account linking feature of an embodiment.

[0032] Figures 32-37 are screen shots that illustrate a capture gallery feature of an embodiment.

[0033] Figure 38 is a screen shot that illustrates a watermarking feature of an embodiment.

[0034] Figures 39-41 are screen shots that illustrate a video editor feature of an embodiment.

[0035] Figures 42-43 are screen shots that illustrate a game tagging feature of an embodiment.

[0036] Figure 44 is an illustration of a controller input architecture of an embodiment.

[0037] Figure 45 is an illustration of buttons of a controller of an embodiment.

[0038] Figure 46 is an illustration of a button descriptor of an embodiment.

[0039] Figure 47 is an illustration of an integrated dashboard of an embodiment.

[0040] Figure 48 is an illustration of a content grid of an embodiment.

[0041] Figure 49 is an illustration of an app store badge of an embodiment.

[0042] Figure 50 is a flow diagram of an integrated dash of an embodiment.

[0043] Figure 51 is an illustration of a content grid of an embodiment.

[0044] Figure 52 is a content page flow diagram of an embodiment.

[0045] Figure 53 is an illustration of a general client-server architecture of an embodiment.

[0046] Figure 54 is a flow diagram illustrating application/service games database interaction of an embodiment.

[0047] Figure 55 is an illustration of a system architecture of an embodiment.

[0048] Figure 56 is a flow diagram of an application integrated dashboard of an embodiment.

[0049] Figures 57-59 are screen shots that show examples of personalized content of an embodiment.

[0050] Figure 60 is a flow diagram of an embodiment.

[0051] Figure 61 is a screen shot of an embodiment showing an example of recently-played games.

[0052] Figure 62 is a screen shot that illustrates this embodiment.

[0053] Figures 63-64 are screen shots of an embodiment for launching a browser.

[0054] Figures 65-66 are screen shots of example implementations of a browser rendering an external gameplay service of an embodiment.

[0055] Figure 67 is a screen shot of an exit screen of an embodiment.

[0056] Figure 68 is a screen shot of an embodiment showing popular games.

[0057] Figures 69-70 are screen shots of an embodiment for searching content.

[0058] Figure 71 is an illustration of a platform operating service and external gameplay service of an embodiment.

[0059] Figure 72 is a notification flow diagram of an embodiment.

[0060] Figure 73 is a screen shot of a Friends that Play feature of an embodiment.

[0061] Figure 74 is a screen shot of an example game search view of an embodiment.

[0062] Figures 75-76 are screen shots of welcome screens of an embodiment.

[0063] Figure 77 is a screen shot of an embodiment for user name input.

[0064] Figure 78 is a screen shot of an embodiment that informs a user that a console is not required.

[0065] Figures 79-83 are screen shots of an embodiment for educating a user and prompting the user to purchase or subscribe.

[0066] Figures 84-86 are screen shots of an embodiment.

[0067] Figure 87 is a screen shot showing a notification of an embodiment.

[0068] Figure 88 is a flow diagram of an embodiment.

[0069] Figure 89 is a screen shot of an embodiment showing an example of a human-readable presence indicator.

[0070] Figure 90 is a block diagram of a computing environment of an embodiment.

[0071] Figures 91 and 92 are screenshots of an embodiment.

[0072] Figure 93 is a flow diagram of an embodiment related to focusable content.

[0073] Figure 94 is a flow diagram of an embodiment related to un-focusable content.

[0074] Figure 95 is a flow chart of a method of an embodiment for assigning a function to one or more user input devices of a controller.

[0075] Figure 96 is a flow chart of a method of an embodiment for using a picture-in-picture view of an un-focused application to keep a user’s place in a queue.

[0076] Figure 97 is a flow diagram of an embodiment for keeping a user’s place in a queue.

[0077] Figure 98 is a flow chart of a method of an embodiment for determining content to include for selection on a user interface display.

[0078] Figure 99 is a flow chart of a content sync method of an embodiment.

[0079] Figure 100 is a flow chart of a content fetch method of an embodiment.

[0080] Figure 101 is a flow diagram for content sync and fetch processes of an embodiment.

[0081] Figure 102 is a flow chart of a method of an embodiment.

[0082] Figure 103 is a flow diagram of a background state indication/termination prevention method of an embodiment.

[0083] Figure 104 is a flow diagram of a user interaction of an embodiment.

[0084] Figures 105-111 are screenshots of embodiments.

Detailed Description

[0085] I. Software-Enabled Mobile Game Controller with Integrated Platform Operating Service

[0086] Features, aspects, and advantages of the presently-disclosed technology may be better understood with regard to the following description and accompanying drawings (including actual screenshots). The drawings are for the purpose of illustrating example embodiments, but it is understood that the embodiments are not limited to the arrangements and instrumentality shown in the drawings. Additionally, here are some terms that may be used in the description below, in addition to illustrative examples for each term in accordance with certain embodiments. It will be understood by one of ordinary skill in the art that each term might comprise numerous other and/or different examples:

[0087] Mobile game controller: A physical device which captures user inputs and interacts with a computing device to allow the user to play a video game.

[0088] Computing device: Smartphone, tablet, etc.

[0089] Gameplay device: The combination of a mobile game controller with a computing device, or an all-in-one computing device.

[0090] Embedded software: Software running on the mobile game controller.

[0091] Platform Operating Service: Software app (one or more) and cloud service (one or more).

[0092] Smart mobile game controller: The combination of a mobile game controller, computing device, embedded software, and platform operating service.

[0093] It is understood that the description discloses, among other things, various example systems, methods, apparatus, and articles of manufacture including, among other components, firmware and/or software executed on hardware. It is also understood that such examples are merely illustrative and should not be considered as limiting. For example, it is contemplated that any or all of the firmware, hardware, and/or software aspects or components can be embodied exclusively in hardware, exclusively in software, exclusively in firmware, or in any combination of hardware, software, and/or firmware. Accordingly, the examples provided are not the only way(s) to implement such systems, methods, apparatus, and/or articles of manufacture.

[0094] Additionally, references herein to “embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one example embodiment of an invention. The appearances of this phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. As such, the embodiments described herein, explicitly and implicitly understood by one skilled in the art, can be combined with other embodiments.

[0095] The description is presented largely in terms of illustrative environments, systems, procedures, steps, logic blocks, processing, and other symbolic representations that directly or indirectly resemble the operations of data processing devices coupled to networks. These process descriptions and representations are typically used by those skilled in the art to most effectively convey the substance of their work to others skilled in the art. Numerous specific details are set forth to provide a thorough understanding of the present disclosure. However, it is understood to those skilled in the art that certain embodiments of the present disclosure can be practiced without certain, specific details. In other instances, well known methods, procedures, components, and circuitry have not been described in detail to avoid unnecessarily obscuring aspects of the embodiments.

[0096] When any embodiments are read to cover a purely software and/or firmware implementation, at least one of the elements in at least one example is hereby expressly defined to include a tangible, non-transitory medium such as a memory, DVD, CD, Blu-ray, and so on, storing the software and/or firmware.

[0097] Attaching a mobile game controller to a computing device brings to life an entirely new gaming player device and gaming environment. A combination of a user input device, platform operating service, cloud service, screen, and mobile operating system allows the system to take advantage of a plethora of synergies and offers entirely new experiences. The embodiments described herein can provide a sophisticated and highly-extensible gaming experience that is vastly more than the sum of the parts.

[0098] While not every component would be, or necessarily need to be, used in every configuration, the components of the system(s) may include:

[0099] 1. A Mobile Game Controller

[00100] The mobile game controller is an input device designed to primarily be used with a computing device, typically with input surfaces on either side of the computing device. The mobile game controller can be connected via a wired or wireless connection to the computing device. In certain embodiments, the mobile game controller might also output signals, such as haptic feedback, visual, audible signals, or some combination thereof. While the embodiments described herein are not so limited unless otherwise stated, an example of a mobile game controller is the Backbone One, offered by Backbone (www.playbackbone.com).

[00101] One example type of a mobile game controller is described in the U.S. Patent Application No. 17/504,283, filed October 18, 2021 entitled “Game Controller for a Mobile Device with Flat Flex Connector,” which is a continuation-in-part application of U.S. Patent Application No. 16/808,339, filed March 3, 2020, both of which are incorporated into the present disclosure by reference. The following are also hereby incorporated by reference: (a) U.S. Patent Application No. 17/866,166, filed July 15, 2022, entitled “Game Controller for a Mobile Device with Flat Flex Connector;” (b) U.S. Patent Application No. 17/866,234, filed July 15, 2022, entitled “Game Controller for a Mobile Device with Extended Bumper Button;” (c) U.S. Patent Application No. 17/856,895, filed July 15, 2022, entitled “Game Controller with Magnetic Wireless Connector;” and (d) U.S. Patent Application No. 17/850,912, filed July 15, 2022, entitled “Game Controller for a Mobile Device.”

[00102] It is also understood that a mobile game controller, in some embodiments, might be less mobile or substantially fixed or stationary, depending on the desired use or application of the mobile game controller. Additional embodiments of mobile game controllers can be as follows.

[00103] An external controller can be connected to a television or other computing devices. In certain embodiments, the mobile game controller can have a “pass-through charging port,” which enables electrical power to be transferred to a computing device during gameplay. As battery life can have a large impact on usage and retention, this feature can allow users to continue playing longer without running out of charge. In some cases, if the controller is wireless, the pass-through charging port can also, either by default or by configuration, charge the controller as well. The secondary port designed for multi-use is positioned in a way that does not interfere with the hands during gameplay.

[00104] In certain embodiments, the mobile game controller has one or more software service buttons which, when selected, can perform in-app functions or functions through a platform operating service API, as opposed to providing the inputs solely via the standard input device framework provided by the device operating system for standard game controller inputs (e.g., ABXY, L1/R1, L2/R2, D-Pad, Joysticks, Start, Select). (Start and Select are also sometimes known as Menu and Options.) In some embodiments, a software service button is a physical button of an electrical circuit that can be depressed by a user. In other embodiments, a software service button is a capacitive touch or other kind of soft button.

[00105] A software service button may also send inputs via the mobile operating system’s input device framework based on user context. For example, in some embodiments, the behavior of the software service button can change depending on whether the current application is in the foreground or background. For instance, when the user is playing a game inside a web browser within the platform operating service, the software service button behavior changes based on context. While streaming a game inside the Amazon Luna service, pressing the software service button can send a Human Interface Device (HID) command that opens the Amazon Luna menu by triggering the “Start” button, or can invoke the cloud gaming service’s own capture functionality.

[00106] Note that in certain embodiments, a software service button can be another type of input besides a button, such as a switch or thumbstick. In other embodiments, a software service button is a combination of two or more inputs that the platform operating service interprets as a single signal to perform the aforementioned functions.
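The context-dependent routing described above can be sketched as a small dispatch function. This is a minimal illustration only; the type and function names (ForegroundContext, ButtonAction, routeServiceButton) are invented for this sketch and are not part of the disclosed system.

```typescript
// Possible foreground contexts for the gameplay device (illustrative).
type ForegroundContext = "platform_app" | "luna_stream" | "other";

// Actions the software service button can trigger (illustrative).
interface ButtonAction {
  kind: "open_dashboard" | "hid_start" | "invoke_capture";
}

// Route the same physical button press to different actions depending on
// what is currently in the foreground, as the paragraph above describes.
function routeServiceButton(context: ForegroundContext): ButtonAction {
  switch (context) {
    case "luna_stream":
      // While streaming inside a cloud service, forward a HID "Start"
      // press so the service's own menu opens.
      return { kind: "hid_start" };
    case "platform_app":
      // Inside the platform app, the button might invoke capture.
      return { kind: "invoke_capture" };
    default:
      // Otherwise, bring the platform operating service to the foreground.
      return { kind: "open_dashboard" };
  }
}
```

In a real implementation the context would come from the device operating system's app lifecycle callbacks rather than a string parameter.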

[00107] 2. A Computing Device

[00108] A primary computing device is a computing device that can be connected via the main interface of the mobile game controller (e.g., a Lightning plug, USB-C plug, or via Bluetooth). In certain embodiments, the primary computing device is the main type of device that the mobile game controller is designed to be used with. In some embodiments, examples of computing devices include an iPhone™, an Android phone, or tablet computer (“tablet”).

[00109] In certain embodiments, the primary computing device can be a general-purpose computing device, like a mobile smartphone, which can often offer functionality like productivity tools, telephony, messaging, wireless communication (e.g., phone calls, etc.). Thus, in this case, the mobile game controller system transforms a user’s general-purpose computing device into a gaming-centric device. In certain embodiments, the primary computing device and mobile game controller can be physically fused into one device for the end user. A secondary computing device is a computing device that is connected via a secondary port, either wired or wireless, on the mobile game controller, which is not the primary computing device the mobile device was designed for.

[00110] 3. Device Operating System

[00111] The device operating system has a framework or API interface that enables gaming input hardware devices to interact with applications running on the device operating system of a computing device. Examples include Android, iOS, and webOS.

[00112] 4. Content

[00113] Content can refer to any form of media that a user can install, view, play, edit, store, and share. Applications that enable a user to access content can themselves be considered content. Games are a form of content, either installed or streamed. Examples include Amazon Luna Games, Remote Play, Roblox, Minecraft, Marvel Snap, etc.

[00114] 5. Platform Operating Service

[00115] The platform operating service comprises one or more applications (apps) that can run on computing device(s) and one or more services. The platform operating service can be part of the device operating system (first-party with respect to the device operating system) or be third-party with respect to the device operating system. Further, the platform operating service can also include features of the device operating system. For example, if the device operating system were to offer a certain functionality described below, the platform operating service could include that functionality.

[00116] Examples might include, but are not limited to, a combination of the game controller app, APIs natively on the computing device, interfaces within the device operating system, and various backend services hosted both internally and externally. Further, the platform operating service application can comprise any user interface presented on the computing device. For example, a popup prompt describing the benefits of the platform operating service that is surfaced by the device operating system would also be considered a part of the platform operating service application. A website displayed on the computing device that enables the user to view and launch content would also be considered part of the platform operating service application. The platform operating service can interface with hardware features of the mobile game controller.

[00117] The platform operating service can interact with one or more software service buttons. The software service buttons can be used to perform application functions. In some embodiments, the software service button can be used to load the application into the foreground view of the device operating system stack. The platform operating service can also interact with one or more software service indicators. Software service indicators are physical affordances on the mobile game controller such as status lights that the platform operating service can activate. For example, when a device is connected, a software service indicator on the mobile game controller might pulse white.

[00118] In certain embodiments, the platform operating service provides the following features. The platform operating service application can be launched with the software service button and the user can navigate the interface with the inputs on the device.

[00119] The platform operating service application can invoke additional software-enabled functionality, including the “play on any screen” function (which allows the user to reconfigure the multi-use port to have other functionality via the integrated application), live streaming, screen sharing, flashback recording, and more.

[00120] The integrated application can contain a search function which allows the user to query a database of content across multiple services (e.g., remote play, cloud gaming, or native applications). In certain embodiments, when a user launches the platform operating service application, they can use the controller to search for compatible games across the platform operating service. The user can either launch into or download them.
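The cross-service search described above can be illustrated with a simple filter over a merged content database. This is a sketch under stated assumptions: the ContentEntry shape, the service names, and the searchContent function are all invented here, not the actual schema.

```typescript
// One row in a merged, multi-service content database (illustrative).
interface ContentEntry {
  title: string;
  service: "remote_play" | "cloud_gaming" | "native_app";
  controllerCompatible: boolean;
}

// Query the merged database and return only controller-compatible titles
// matching the query, across all services, as the search feature describes.
function searchContent(db: ContentEntry[], query: string): ContentEntry[] {
  const q = query.trim().toLowerCase();
  return db.filter(
    (e) => e.controllerCompatible && e.title.toLowerCase().includes(q)
  );
}
```

A production search would likely run server-side with ranking; the point here is only that one query spans multiple services and filters on compatibility.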

[00121] The integrated application can also allow users to connect their accounts from external services including, but not limited to, Xbox Cloud Gaming, Steam, Netflix, and Apple Arcade. For instance, based on recent history of gameplay, the platform operating service application can then insert those games into the compatible games list within their library or otherwise adjust the Integrated Dashboard content. Users can then use the software service button to open directly into a dashboard of their compatible games across multiple external services. Further, this allows the platform operating service’s cloud service to provide enhanced suggestions to users based on a multitude of inputs such as device state, user account information, platform preferences, and more.
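Folding a linked account's recent play history into the compatible-games list, as described above, can be sketched as a de-duplicating merge. The LibraryItem shape and mergeRecentGames name are hypothetical, introduced only for illustration.

```typescript
// An entry in the user's compatible-games list (illustrative shape).
interface LibraryItem {
  title: string;
  source: string; // e.g., "library", "Xbox Cloud Gaming", "Steam"
}

// Prepend recently played titles from linked external services that are
// not already in the library, so the Integrated Dashboard reflects the
// user's play history across services.
function mergeRecentGames(
  library: LibraryItem[],
  recent: LibraryItem[]
): LibraryItem[] {
  const known = new Set(library.map((i) => i.title));
  const added = recent.filter((i) => !known.has(i.title));
  return [...added, ...library];
}
```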

[00122] The content suggestions in the platform operating service application can be based on the product SKU properties of the device. For example, if the user’s smart mobile game controller device is tailored for a particular service, e.g., PS Remote Play, then when the platform operating service application renders the Integrated Dashboard, it can provide a response that prioritizes PlayStation content.

[00123] The platform operating service application can reconfigure its button icon language based on the symbology used on the mobile game controller. For example, if a mobile game controller with an ABXY button layout is detected rather than a game controller with an ABCD button layout, the interface can automatically detect the SKU and replace the button hints inside of the interface.
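The SKU-driven button-hint swap above amounts to a lookup from detected layout to glyph set. A minimal sketch, assuming invented SKU strings and names (ButtonLayout, buttonHintsForSku); real SKU encoding would differ.

```typescript
// Button symbologies the interface knows how to render (illustrative).
type ButtonLayout = "ABXY" | "ABCD";

// Glyph sets shown as button hints for each layout.
const GLYPHS: Record<ButtonLayout, string[]> = {
  ABXY: ["A", "B", "X", "Y"],
  ABCD: ["A", "B", "C", "D"],
};

// Infer the layout from a (hypothetical) SKU string and return the button
// hints the interface should display, defaulting to ABXY.
function buttonHintsForSku(sku: string): string[] {
  const layout: ButtonLayout = sku.includes("ABCD") ? "ABCD" : "ABXY";
  return GLYPHS[layout];
}
```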

[00124] Mobile apps are typically based in portrait mode (designed to be consumed in the format where the computing device is held so that the longer edge is vertical) and are designed to be used primarily with touch interfaces. In certain embodiments, when the mobile game controller is attached to the computing device, the platform operating service application can adjust the screen orientation into an alternate orientation, such as landscape, on the screen, and the interface can accept the mobile game controller inputs as the primary input modality.

[00125] The interface of the platform operating service application can be navigated using game controller inputs. In some embodiments, button hints are shown to illustrate which controls correspond to which actions on the screen. The interface may also accept touch controls. In certain embodiments, users may be able to launch into content without using the touch interface, using only controls on the mobile game controller. In some embodiments, a device operating system can have native features (e.g., landscape mode, controller support) that can be used by a mobile game controller. In this case, the mobile game controller is still part of the platform operating service. In certain embodiments, the software service button can be used to interact with the features natively built into the device operating system.

[00126] 6. Cloud Service

[00127] In certain embodiments, the platform operating service can comprise one or more cloud services, hosted on one or more server computers, to provide additional features, including suggestions and recommendations for compatible game or media content for the mobile game controller. The cloud service is not necessarily hosted in a cloud computing facility or service provider. In other embodiments, the cloud service can be hosted in a self-maintained data center or even on a computer on the user’s local area network. Further, the cloud service can also comprise first and/or third-party services that stream games to the platform operating service over a network such that they can be consumed through the user interface of the platform operating service.

[00128] Figure 10 is an illustration of a cloud service 500 of an embodiment. As shown in Figure 10, the cloud service 500 of this embodiment comprises a server 510 connected with a database 520 and an analytics element 530. The cloud service 500 of this embodiment also comprises a content management element 540, a game notification provider 550, and a push notification provider 560.

[00129] Example Hardware

[00130] The following discussion refers to the drawings numbered Figures 1 to 9 that are provided with this disclosure. The three-digit numbers appearing in the text are reference numbers and refer to features in the drawings.

[00131] Figure 1 is a top, perspective view showing portions of a game controller 100, according to embodiments, in an example of a retracted configuration of the game controller and illustrated next to an example mobile device 199. Figure 2 is a top, perspective view of the game controller 100 of Figure 1, illustrating how the game controller 100 may contact and support a mobile device 199 in some embodiments. As illustrated in Figures 1-2, the game controller 100 may include a first handle 101, a second handle 102, and a bridge 119. Each of the first handle 101 and the second handle 102 is configured to contact and support the mobile device 199, though not all contemplated embodiments will include the second handle 102.

[00132] In certain embodiments, “mobile device” generally refers to a portable, handheld computing device, such as a smartphone, tablet, or other comparable mobile device. The mobile device typically runs an operating system, including but not limited to iOS or Android.

[00133] As illustrated, the first handle 101 includes a user-accessible, first hardware interface 103. The first hardware interface 103 could be a button, an analog stick, a touchscreen, a touchpad, a knob, a slider, a switch, a wheel, a dial, a directional pad, or another such feature configured to accept touch inputs from a user’s finger or a stylus. As shown in Figure 1, the first hardware interface 103 may include multiple such hardware interfaces.

[00134] As illustrated, the second handle 102 further includes a user-accessible, second hardware interface 104. As above for the first hardware interface 103 of the first handle 101, the second hardware interface 104 could be a button, an analog stick, a touchscreen, a touchpad, a knob, a slider, a switch, a wheel, a dial, a directional pad, or another such feature configured to accept touch inputs from a user’s finger or a stylus. The second hardware interface 104 may include multiple such hardware interfaces, as illustrated in Figure 1.

[00135] In configurations, one or more of the buttons, etc. of the first hardware interface 103 or the second hardware interface 104 are the software service buttons discussed elsewhere in this disclosure.

[00136] One or both of the first handle 101 and the second handle 102 may include a connector 125 for physical and electrical connection to the mobile device 199. The connector 125 may be, for example, a USB-C connector. In addition, one or both of the first handle 101 and the second handle 102 may include a pass-through charging port 120 or a headphone jack 122, or both. The pass-through charging port 120, for example, allows the mobile device 199 to have its battery charged when the mobile device 199 is attached to the connector 125. The headphone jack 122 allows an audio signal generated by the game controller 100 or the mobile device 199, or both to be sent to an external headphone or other speaker.

[00137] Figure 3 is a top, perspective view of the game controller 100 of Figure 1, showing only certain features internal to the game controller 100. Figure 4 is a partially exploded view of the game controller 100 of Figure 3. Figure 5 is a bottom, perspective view of the game controller 100 of Figure 3.

[00138] As illustrated in Figures 3-5, the first handle 101 includes a first electronic circuit 141 that is coupled to the first hardware interface 103. For example, as illustrated in Figure 3, the first hardware interface 103 has a corresponding feature 142 of the first electronic circuit 141. The corresponding feature 142 of the first electronic circuit 141 is configured to translate a mechanical, touch input to the first hardware interface 103 into an electrical signal. Hence, for example, the corresponding feature 142 of the first electronic circuit 141 may be an electronic switch.

[00139] Likewise, the second handle 102 includes a second electronic circuit 143 that is coupled to the second hardware interface 104. For example, as illustrated in Figure 3, the second hardware interface 104 has a corresponding feature 144 of the second electronic circuit 143. The corresponding feature 144 of the second electronic circuit 143 is configured to translate a mechanical, touch input to the second hardware interface 104 into an electrical signal. Hence, for example, the corresponding feature 144 of the second electronic circuit 143 may be an electronic switch.

[00140] The second handle 102 further includes an electronic controller 145. The electronic controller 145 is configured to receive an electrical signal from the second electronic circuit 143. The electrical signal from the second electronic circuit 143 may be, for example, the electrical signal produced by the corresponding feature 144 of the second electronic circuit 143 in response to a touch input at the second hardware interface 104. The electronic controller 145 is also configured to receive an electrical signal from the first electronic circuit 141 via a flat, flexible cable 146. The electrical signal from the first electronic circuit 141 may be, for example, the electrical signal produced by the corresponding feature 142 of the first electronic circuit 141 in response to a touch input at the first hardware interface 103.

[00141] The flat, flexible cable 146 is configured to conduct an electrical signal between the first handle and the second handle. In configurations, the flat, flexible cable 146 is a flat and flexible plastic film base, with multiple, flat, metallic conductors bonded to one surface of the film base. As illustrated most clearly in Figure 5, the flat, flexible cable 146 may be coupled at a first end 147 of the flat, flexible cable 146 to the first electronic circuit 141 and, at a second end 148 of the flat, flexible cable 146, to the second electronic circuit 143.

[00142] As illustrated most clearly in Figure 4, in configurations the flat, flexible cable 146 includes a double fold 149. In configurations, the double fold 149 may be at the midline 121 of the bridge 119. In the illustrated configuration, the double fold 149 includes a folded section 150. Between the folded section 150 and a first elongated section 151 of the flat, flexible cable 146 is a first fold 153. And between the folded section 150 and a second elongated section 152 of the flat, flexible cable 146 is a second fold 154.

[00143] As illustrated, the fold angle 155 of the first fold 153 and the fold angle 156 of the second fold 154 are substantially equal and are less than 180°. As used in this context, “substantially equal” means largely or essentially equivalent, without requiring perfect identicalness. Accordingly, the first elongated section 151 of the flat, flexible cable 146 and the second elongated section 152 of the flat, flexible cable 146 are substantially parallel. As used in this context, “substantially parallel” means largely or essentially equidistant at all points (if the longitudinal centerline 169 of each elongated section were conceptually extended), without requiring perfect parallelism.

[00144] Consequently, the folded section 150 is at an angle to each of the first elongated section 151 and the second elongated section 152 as defined by the fold angles. Stated another way, before it is folded, the flat, flexible cable 146 has a longitudinal centerline 169 midway between its edges. Once folded, the longitudinal centerline 169 running through the first elongated section 151 and the longitudinal centerline 169 running through the folded section 150 are at an angle to each other, that angle being related to the fold angles.

[00145] In configurations, including in the illustrated configuration, the double fold 149 causes the same side 157 of the flat, flexible cable 146 to double over onto itself for each of the first fold 153 and the second fold 154. In configurations, the flat, flexible cable 146 is fixed to the bridge. The flat, flexible cable 146 may be fixed to the bridge by, for example, glue or another adhesive. In configurations, the flat, flexible cable 146 is fixed to the bridge at the midline of the bridge. Fixing the flat, flexible cable 146 to the bridge may help to prevent the flat, flexible cable 146 from sliding within the bridge 119 during normal use of the game controller 100, where the game controller 100 may be repeatedly moved between the retracted configuration and the extended configuration. In configurations where the flat, flexible cable 146 is fixed to the bridge, the bridge may or may not include the tray 158 (as described below), the flat, flexible cable 146 may or may not include the double fold 149, and the tray 158 may or may not include the double jog 160 (described below).

[00146] As illustrated most clearly in Figure 4, the bridge 119 may include a tray 158 that is configured to contain the flat, flexible cable 146 within the tray 158. As illustrated, the tray 158 may include a narrow conduit 159 that is slightly wider and taller than the flat, flexible cable 146 such that the flat, flexible cable 146 fits snugly within the tray 158. The tray 158 is configured to prevent the flat, flexible cable 146 from bunching during normal use of the game controller 100, where the game controller 100 may be repeatedly moved between the retracted configuration and the extended configuration.

[00147] In configurations, the tray 158 may include a double jog 160, or turn. The double jog 160 in the tray 158 is configured to snugly contain the double fold 149 of the flat, flexible cable 146. Accordingly, the double fold 149 of the flat, flexible cable 146 coincides with the double jog 160 in the tray 158. The combination of the double fold 149 and the double jog 160 help to prevent the flat, flexible cable 146 from sliding within the tray 158 (and, therefore, within the bridge 119) during normal use of the game controller 100, where the game controller 100 may be repeatedly moved between the retracted configuration and the extended configuration.

[00148] Figure 6 is a perspective view of a game controller, according to embodiments. Figure 7 is a top view of the game controller of Figure 6, shown with an example mobile device. In addition to what is described here, the game controller 200 of Figures 6 and 7 may have the features and options as described above for the game controller 100 of Figure 1. Accordingly, the same reference numbers are used.

[00149] As illustrated in Figures 6 and 7, in configurations a game controller 200 may include a magnetic connector 241 within the bridge 219. In configurations, the bridge 219 of Figures 6 and 7 may be as described above for the bridge 119 of Figure 1, except as noted here. The magnetic connector 241 is configured to magnetically retain a mobile device 199 to the game controller 200. As examples, the magnetic connector 241 may retain the mobile device 199 to the game controller 200 by magnetic wireless connection, including by magnetic induction for near-field communication or for wireless charging, or both near-field communication and wireless charging. The wireless charging is to recharge the battery of the mobile device 199. In configurations, the magnetic connector 241 allows the mobile device 199 to be retained to the game controller 200 without a wired connection. As noted, in configurations, the wireless connection, provided by the magnetic connector 241, may allow the transfer of data between the game controller 200 and the mobile device 199 through near-field magnetic induction (NFMI). In configurations, the magnetic connector 241 may retain the mobile device 199 to the game controller 200 in addition to a wired connection, such as through the connector 125 (see Figures 1 and 2).
In configurations that include a wired connection, versions may include spring mechanisms, such as the first spring mechanism 107 and the second spring mechanism 107 discussed above, to bias the first handle 101 or the second handle 102, or both, toward the retracted configuration, thereby helping to retain the mobile device. Other versions may lack one or both of the first spring mechanism 107 or the second spring mechanism 107.

[00150] Accordingly, in configurations the magnetic connector 241 may provide a wireless data connection, wireless charging, physical attachment to the game controller 200 through magnetic attraction, all three of those features, or any two of those features.

[00151] As illustrated in Figure 7, a user may place the mobile device 199 onto the magnetic connector 241. While the mobile device 199 is shown in landscape position in Figure 7, in configurations the mobile device 199 instead may be placed onto the magnetic connector 241 in portrait position. In configurations, the first handle 101 and the second handle 102 of the game controller 200 may be translated in the retraction direction toward the magnetic connector 241. In configurations, when the mobile device 199 is magnetically retained to the game controller 200, the first handle 101 and the second handle 102 contact the mobile device 199 as the first handle 101 and the second handle 102 translate in the retraction direction. To remove the mobile device 199, the first handle 101 and the second handle 102 of the game controller 200 may be translated in the extension direction away from the magnetic connector 241. In other configurations, the mobile device 199 may be removed from the magnetic connector 241 without the need to move the first handle 101 or the second handle 102 of the game controller 200, such as by lifting the mobile device 199 off of the magnetic connector 241.

[00152] In configurations, the magnetic connector 241 is not within the bridge 219 but is instead affixed to or within another part of the game controller 200, such as the first handle 101 or the second handle 102.

[00153] In another example configuration, such as illustrated in Figure 8, an integrated game controller 300 comprises a display screen 302 integrated into a handheld controller 304 and does not require interfacing with a mobile device. Specifically, in this configuration the device 300 utilizes its own operating system and other software to perform the methods described elsewhere as being performed by the mobile device 199.

[00154] Figure 9 illustrates an example environment for a game controller 400, which could be, as examples, the game controller 100 of Figure 1, or the game controller 200 of Figures 6 and 7, or the game controller 300 of Figure 8. In the configuration illustrated in Figure 9, a game controller 400 sends a signal 402 over a network 404. The network 404 communicates the signal 402 to a cloud gaming server 406, wherein a game 408 is stored and run. The server 406 then streams a display 410 of the game back to the user at the game controller 400. In other configurations, the game play may be local, meaning that the game is stored and run locally, typically by the mobile device 199.

[00155] Example Mobile Game Controller

[00156] In this section, a mobile game controller is described that is designed for a computing device and interoperates with a platform operating service. In certain embodiments, the game controller specializes in high-performance input controls combined with custom protocols to empower the platform operating system. The embedded system within the mobile game controller comprises several subsystems that work together to enable rich features of the platform. Unique features enabled by this sophisticated embedded system can include one or more of: low-latency game controller inputs, custom platform-specific buttons (software service buttons), a software service indicator (e.g., a multi-color status light), pass-through charging, pass-through audio via an external headset, a platform operating service API and analytics, and cross-platform compatibility.

[00157] Figure 11 is an illustration of subsystems 600 of an embodiment. As shown in Figure 11, these subsystems 600 include a processor 610, a memory 620, a primary transceiver 630, a game controller interface 640 (which comprises a measurement sequencer 642, an HID gamepad 644, and a gamepad profile manager 646), a configurable secondary port 650 (which comprises a secondary transceiver 652 and a charger subsystem 654), an application interaction model 660, a controller analytics element 670, and boot/upgrade support element 680.

[00158] As with many embedded solutions, the system is powered by a processor (CPU) 610 with its associated random access memory 620. In addition, the system is connected to persistent external memory, which is used to store the program, configuration data, and any system information that needs to be remembered across power cycles.

[00159] To establish connectivity to a mobile device, the system requires a transceiver capable of transmitting digital data over a serial bus. The bus can implement input events, audio, and bulk data traffic simultaneously if there is sufficient bandwidth on the bus. The mobile device can optimize bus traffic depending on the use case, but potentially at the cost of transmission latency.

[00160] Figure 12 is an illustration of a mobile game controller 700 and mobile device 710 of an embodiment. As shown in Figure 12, this architecture comprises a mobile game controller 700 and a mobile device 710. The mobile game controller 700 comprises a CPU 701, a memory 702, a game controller interface 703, an audio interface 704, a charging subsystem 705, and a transceiver 706. The mobile device 710 comprises a transceiver 720, an operating system 730, and a mobile application 740. The operating system 730 comprises an input framework 732, an accessory framework 734, and system audio 736. The mobile application 740 comprises a game controller interface 742, a software stack 744, and an accessory interface 746.

[00161] In certain embodiments, the wired transceiver and serial bus can be replaced with a wireless RF transceiver and over-the-air protocol. For example, a Bluetooth protocol using the HID and serial port profiles can be interchangeable with the wired equivalents. Downstream functionality in the mobile operating system can be largely unaffected by the difference.

[00162] Game Controller Interface

[00163] The game controller interface can implement a measurement system which is designed to periodically measure the various surface inputs on the controller. The responsibility of the measurement system is to schedule and measure all of the input signals, both analog and digital voltages. The period of the measurement system directly impacts latency and should ideally be as low as possible. A measurement interval close to 1 millisecond enables the controller to have a maximum input rate of approximately 1000 Hz. Practical rates are usually limited by the computing device, however.
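The period/rate relationship described above can be sketched as follows. The function names are illustrative, not from this disclosure; the key point is that the scan interval sets an upper bound on the report rate, which the computing device may further cap:

```python
def max_input_rate_hz(measurement_period_ms: float) -> float:
    """Maximum report rate achievable for a given measurement period,
    e.g. a ~1 ms scan interval allows approximately 1000 Hz."""
    return 1000.0 / measurement_period_ms

def effective_rate_hz(measurement_period_ms: float, host_limit_hz: float) -> float:
    """The practical rate is the scan-limited rate capped by whatever
    rate the attached computing device will accept."""
    return min(max_input_rate_hz(measurement_period_ms), host_limit_hz)
```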

[00164] Once all of the inputs have been measured, each input is translated into an input report. The format of each input report is determined by a Human Interface Device (HID) descriptor. HID is a well-established standard defined by the USB-IF and is utilized broadly among devices which support game controllers. The HID descriptor defines collections of inputs and outputs for the device and how the respective reports are formatted. Both inputs and outputs in the HID descriptor are assigned usage values, which are generally standardized via the USB HID usage table. The assembled input reports can then be sent to the transceiver, rate-throttled as necessary based on the mobile device on the other end.
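To illustrate how measured inputs might be assembled into an input report, here is a minimal sketch assuming a hypothetical three-byte report layout (one button-bitmap byte followed by two signed axis bytes). A real layout is dictated by the device's HID descriptor, not by this example:

```python
import struct

def pack_input_report(buttons: list, x: int, y: int) -> bytes:
    """Pack button states and one stick's axes into a hypothetical
    3-byte report: a button bitmap byte, then signed X and Y (-128..127)."""
    bitmap = 0
    for i, pressed in enumerate(buttons):
        if pressed:
            bitmap |= 1 << i
    # "<Bbb" = little-endian: unsigned byte, signed byte, signed byte
    return struct.pack("<Bbb", bitmap, x, y)
```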

[00165] If the mobile game controller is designed to work with multiple device vendors or platforms, it may be necessary to tailor the HID descriptor for each target platform. To address this in the embedded system, multiple HID descriptors can be organized into a profile system, where a profile contains: an HID descriptor, an HID usage map (surface input to HID input), an input rate, a vendor ID/product ID, and other product strings and identifiers. In addition, when the mobile game controller has multiple transceivers, the profile preferably specifies one or more transceivers to interoperate with.
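The profile record described above can be sketched as a simple data structure. Field names and the lookup helper are illustrative assumptions, not the disclosure's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class GamepadProfile:
    hid_descriptor: bytes     # platform-tailored HID report descriptor
    usage_map: dict           # surface input name -> HID usage value
    input_rate_hz: int        # report rate for this platform
    vendor_id: int
    product_id: int
    product_strings: dict = field(default_factory=dict)
    transceivers: tuple = ("primary",)  # which transceiver(s) to use

def profile_for_platform(profiles: dict, platform: str) -> GamepadProfile:
    """Look up the profile tailored to a target platform."""
    return profiles[platform]
```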

[00166] Configurable Secondary Port

[00167] To make the mobile game controller compatible with platforms other than the primary computing device, a secondary port may exist on the product to enable USB connectivity to those platforms. Example platform devices include PC, Mac, iPad, tablets, TVs, and game consoles. Since USB has become the charging standard for mobile devices, the system can combine the function of charging and game controller within the same port.

[00168] The platform operating service feature of configuring the secondary port for another platform is referred to as Play On Any Screen, and is explained in further detail in this document.

[00169] To implement a USB game controller on a secondary port, the flexible gamepad profile system used for the primary port can be reused, simply targeting the secondary transceiver instead. In addition, because an HID usage map exists on every gamepad profile, there is no restriction on simultaneously mapping inputs to the primary and secondary transceivers. Once the surface inputs are read, it is just a matter of translating the input values into the format defined by the specified HID descriptor.

[00170] Software Service Indicator

[00171] The mobile game controller can support one or more software service indicators (e.g., a programmable status light with rich waveform support). In the current device, the status light has full RGB color support as well as brightness control. To achieve dynamic brightness, the processor utilizes programmable pulse width modulators (PWM) which allow for fine-grained control over each color component's intensity.
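A minimal sketch of deriving per-channel PWM duty cycles from an RGB color and a global brightness level, as the dynamic-brightness scheme above implies (the scaling convention here is an assumption; real firmware may apply gamma correction or hardware-specific scaling):

```python
def pwm_duties(rgb, brightness):
    """rgb: (r, g, b) components each 0-255; brightness: 0.0-1.0.
    Returns a duty cycle 0.0-1.0 per color channel, so the PWM
    hardware can scale each component's intensity independently."""
    return tuple(round((component / 255) * brightness, 4) for component in rgb)
```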

[00172] A waveform management layer exists on top of the status light control, which enables more complicated patterns such as smooth in/out fading and programmable duty-cycle blinks. Waveforms also have a priority level, which allows multiple waveform states to run concurrently, but only the highest-priority waveform is rendered. Waveforms that have the same priority level can optionally be configured to stack, in which case all waveforms at that level will execute in a round-robin fashion, essentially taking turns based on the defined waveform period.
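The arbitration rule above can be sketched as follows. The tuple representation and tick-based round-robin are illustrative assumptions; the point is that only the highest priority level renders, and stacked waveforms at that level take turns:

```python
def active_waveform(waveforms, tick):
    """waveforms: list of (name, priority, stackable) tuples.
    Returns the name of the waveform rendered at time step `tick`."""
    top = max(priority for _, priority, _ in waveforms)
    candidates = [w for w in waveforms if w[1] == top]
    if len(candidates) == 1:
        return candidates[0][0]
    # Multiple waveforms share the top priority: stacked ones take turns.
    stacked = [w for w in candidates if w[2]]
    if stacked:
        return stacked[tick % len(stacked)][0]
    return candidates[0][0]
```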

[00173] In addition to waveform brightness modulation for fading, the software service indicator can globally modulate its brightness via API control from the platform operating service. In some embodiments, the brightness level is synchronized with the brightness level of the mobile device screen.

[00174] Accessory Analytics

[00175] Within the embedded software, analytics are automatically aggregated for mobile game controller presses and hold durations. The analytics values are collected for every input over a period of time and sent up to the app in a single report. When the platform operating service application receives each analytics report, it has the opportunity to apply further processing and aggregation based on the use case. For example, the platform operating service application uses the analytics to identify when there is user activity to drive features that need an idle state.
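The per-input aggregation described above might look like the following sketch, where press counts and hold durations accumulate over a window and are emitted as one report. The class and method names are hypothetical; the reset-on-read behavior mirrors the analytics reporting described later in this document:

```python
class InputAnalytics:
    """Accumulates press counts and hold durations per input, then
    emits them as a single aggregated report."""

    def __init__(self):
        self.press_counts = {}
        self.hold_ms = {}

    def record(self, name, hold_duration_ms):
        """Record one press of `name` held for `hold_duration_ms`."""
        self.press_counts[name] = self.press_counts.get(name, 0) + 1
        self.hold_ms[name] = self.hold_ms.get(name, 0) + hold_duration_ms

    def flush_report(self):
        """Emit one aggregated report and reset the accumulators."""
        report = {"presses": dict(self.press_counts),
                  "hold_ms": dict(self.hold_ms)}
        self.press_counts.clear()
        self.hold_ms.clear()
        return report
```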

[00176] In addition, the accessory analytics can be recorded in a real-time mode to supply game controller metadata to gameplay recordings. This can be useful not only as a scrubbing tool during editing, but also as a visualization tool of game controller input when viewing the video.

[00177] Communication Layer

[00178] In certain embodiments, the mobile game controller can establish a custom bidirectional communication link with the computing device. This serial interface allows data to be sent in two main use cases: (1) command-response, used for features such as firmware upgrade where the app initiates a command and waits for the accessory response; and (2) push messages, used for buttons and other asynchronous events where the accessory sends the message to the app.
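A hedged sketch of framing for the two message flows described above. The opcodes, sequence field, and length-prefixed layout are hypothetical, not the disclosure's actual wire format:

```python
# Hypothetical message types for the bidirectional serial link.
MSG_COMMAND = 0x01   # app -> accessory, expects a matching response
MSG_RESPONSE = 0x02  # accessory -> app, answers a command
MSG_PUSH = 0x03      # accessory -> app, unsolicited asynchronous event

def frame(msg_type: int, seq: int, payload: bytes) -> bytes:
    """Frame a message as [type, sequence, length, payload...]."""
    return bytes([msg_type, seq, len(payload)]) + payload

def parse(data: bytes):
    """Parse a framed message back into (type, sequence, payload)."""
    msg_type, seq, length = data[0], data[1], data[2]
    return msg_type, seq, data[3:3 + length]
```

The sequence byte lets a command-response pair be matched up even when push messages arrive in between.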

[00179] Platform Operating Service API

[00180] To extend the functionality of the mobile game controller and mobile device, a protocol can be created to communicate with the platform operating service. The protocol utilizes the bidirectional communication link to exchange messages, which enables various system features.

[00181] Custom Platform Buttons

[00182] Additional buttons not standard to extended game controllers can be sent up to the platform operating service as push messages. The button press and release events can be sent directly, along with the button hold time information. A custom platform button is an embodiment of a software service button defined previously.

[00183] If the platform communication link is not established, the mobile game controller may send internal transmissions to the mobile device itself to help locate the necessary software.

[00184] Status Light Control

[00185] The status light is an embodiment of the software service indicator. Control of the mobile game controller’s status light is provided in the API to allow for rich color and waveform customization. Waveforms set through the API can coexist with any internal waveforms and are controlled via a priority level. Typically, waveforms set by the platform operating service will take precedence over the built-in waveforms.

[00186] Application Launch

[00187] The platform operating service can request to launch a specific application on the mobile device. This applies to mobile devices that support a rich accessory protocol.

[00188] Return to Menu

[00189] The platform operating service can request that the mobile device exit to its normal operating mode/menu. This is achieved by sending a specific HID usage to instruct the mobile device to go back to its main menu.

[00190] Secondary Gamepad Profile

[00191] The platform operating service can set the gamepad profile for the secondary port on the mobile game controller. Setting the profile will write the configuration into persistent memory such that it is loaded by default when the game controller is operating via the secondary port.

[00192] Analytics Configuration

[00193] The platform operating service can configure the interval at which analytics are reported, as well as a timeout after which analytics stop being sent when the controller is idle. The controller is considered idle when there is no change on any of the controller inputs; sending reports in that state would be redundant because all inputs remain at 0.

[00194] Analytics Reporting

[00195] Analytics are exchanged in two ways with the platform operating service. First, the mobile game controller can periodically send analytics report packets based on the configured interval. Second, the platform operating service can read out analytics manually through a polling approach. In either case, accessing the analytics resets any accumulators or statistics.

[00196] Firmware Upgrade

[00197] The mobile game controller can have the capability to upgrade the software on its embedded system through an upgrade mechanism. This upgrade process allows the system to expand the functionality of the game controller as the capabilities of the platform operating service evolve. In the mechanism described here, the platform operating service determines whether a firmware upgrade is appropriate, and will initiate and control the upgrade procedure. For instance, to support robust and error-free upgrading of a device-powered game controller, the serial flash within the embedded system is partitioned into multiple regions. This helps to ensure that the area of flash being written to during the upgrade is not the same area from which the embedded system is operating.

[00198] Additionally, the flash can further be subdivided as a file system so that a partition contains more than one binary file. For example, there may be a file for the program itself, a file for configuration parameters, and a file for embedded resource data. As part of the upgrade procedure, each file within the target partition is erased and written one by one, performing verification steps after each file is written. If the upgrade procedure is interrupted for any reason, such as if the game controller is unplugged, the upgrade can resume from the last completed step. Once all files within the target partition have been successfully written, the final step is to configure the system to boot from the updated region of the flash. The system can either reboot immediately, or configure the flash pointer and wait for the next power cycle.
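The file-by-file write/verify/resume flow described above can be sketched as follows. The flash is modeled here as a simple dictionary and SHA-256 stands in for whatever verification the firmware actually uses; both are illustrative assumptions:

```python
import hashlib

def write_file(partition, name, data):
    """Write one file into the target partition with a stored digest."""
    partition[name] = {"data": data,
                       "digest": hashlib.sha256(data).hexdigest()}

def verify_file(partition, name, data):
    """True if `name` was fully written and matches the expected data."""
    entry = partition.get(name)
    return entry is not None and entry["digest"] == hashlib.sha256(data).hexdigest()

def upgrade(partition, files):
    """files: dict of name -> bytes. Files already written and verified
    are skipped, so an interrupted upgrade resumes from the last
    completed step. Returns the names written during this run."""
    written = []
    for name, data in files.items():
        if verify_file(partition, name, data):
            continue  # completed before an earlier interruption
        write_file(partition, name, data)
        assert verify_file(partition, name, data)  # verify after each write
        written.append(name)
    return written
```

Once every file verifies, the final step (not shown) would point the boot configuration at the updated flash region.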

[00199] Figure 13 is a flowchart 1300 of an example upgrade procedure of an embodiment. As shown in Figure 13, a determination is made regarding whether the mobile game controller is connected (act 910). If the mobile game controller is not connected, the process waits for connection (act 920). However, if the mobile game controller is connected, the firmware version is read (act 930), and a determination is made regarding whether a newer firmware exists (act 940). If a newer firmware does not exist, the firmware is up-to-date (act 950). However, if a newer firmware does exist, info on the next file partition is checked (act 960) to see if the partition matches the target version (act 970). If it matches and this is the final file (act 980), the procedure ends. However, if it does not match, the partition is erased (act 975) and the partition is written (act 985).

[00200] Background Upgrade

[00201] In certain embodiments, when the mobile game controller is used in conjunction with a platform operating service, it is possible to conduct the upgrade of the embedded system when the integrated dashboard of the platform operating system is not visible. In this case, the platform operating service is continuing to run in a background operating mode to maintain the connection to the mobile game controller. The procedure need not be different when it is performed in the background as long as the platform operating service can maintain a constant communication link to the mobile game controller.

[00202] When the upgrade procedure has completed, the platform operating service may have multiple options to apply the new embedded system firmware: reboot immediately (or reboot as soon as possible), quietly stage the new firmware and wait for the next boot, or provide a notification indicating that new firmware is ready.

[00203] Play On Any Screen - Overview

[00204] In the past, wired mobile game controllers could only be fully compatible with one computing device type, such as an Apple iPhone or an Android device. As a result, users would have to own multiple game controllers to have an optimal experience across multiple devices (e.g., mobile, PC, Google Chrome or other web browsers, game consoles, Mac, smart TVs, living room entertainment systems, etc.). For example, in order to play on a traditional living room game console and a mobile computing device, a user would need to own at minimum both a traditional living room game console controller and a mobile game controller. This is due to a range of factors including, but not limited to, issues such as physical port incompatibility, interface protocol incompatibility, or software incompatibility. Although wireless game controllers, such as those based on Bluetooth, allowed for some multi-use scenarios, users had to deal with the potentially higher latency of the wireless connection on their traditional game console game controller.

[00205] The Play On Any Screen feature allows the mobile game controller to intelligently detect when a port, such as the secondary port, is connected to another host device, and reconfigure the game controller interface. (Figure 14 is an illustration of a game controller of an embodiment with primary and secondary ports.) By utilizing a secondary port, such as the charging port, the controller interface can be easily customized without affecting the primary game controller function. In certain embodiments, the Play On Any Screen feature is intuitive, and therefore it can eliminate, or reduce, the need for additional explanation in user manuals or education on how to reboot the device into different modes. Persistent memory inside of the controller helps to ensure that the Play On Any Screen function works when disconnected from the mobile device.

[00206] In certain embodiments, Play On Any Screen is one of the features of the mobile game controller system enabled by the configurable port. The configurable port allows a mobile game controller to be used with additional computing devices beyond the main devices it was designed to physically couple with, including, but not limited to, a PC, Mac, iPad, or Google Chrome. The wired connection of Play On Any Screen can offer superior end-to-end latency, lower than traditional Bluetooth-connected devices, which can provide a better user experience when a user is streaming games from the cloud. Through the combined system, which comprises the embedded software, platform operating service, and device operating system, users can now use their mobile gaming controller with other computing devices like iPad, Mac, and PC to play on external services such as Amazon Luna.

[00207] In certain embodiments, the user configures which additional external platform they would like to use via the platform operating service. The user uses an appropriate cable to wire the charging port to a capable host device. The smart mobile game controller system is able to intelligently detect when the charging port is connected to another USB host and is able to reconfigure as a USB game controller in a manner as to be fully compatible with the connected computing device. Persistent storage inside of the mobile game controller can help to ensure that the Play On Any Screen function works when detached from the computing device.

[00208] Example Configuration Flow

[00209] In certain embodiments, the platform operating service has an application installed on a computing device which communicates with the mobile game controller through the primary transceiver. In the case of a Lightning version of the mobile game controller, the primary communication for supporting the platform operating service application is through the Lightning protocol, and the secondary port communication uses a USB transceiver.

[00210] Figure 15 is a diagram of a configuration flow of an embodiment between a platform operating service 1000, a mobile game controller 1010, and a secondary platform 1020. As shown in Figure 15, the mobile game controller 1010 sends a message to the platform operating service 1000 to connect through the primary port (act 1030), and, in response, the platform operating service 1000 establishes a communication link (act 1040). After the user sets the secondary gamepad profile (act 1050), the mobile game controller 1010 disconnects the primary port (act 1060) and connects to the secondary platform 1020 through the secondary port (act 1070). The secondary platform 1020 then establishes the HID game controller (act 1080).

[00211] In certain embodiments, to use the feature, a user selects the device they want to play on in the platform operating service’s settings (see Figures 16 and 17) and then they connect their mobile game controller to the device with a cable (e.g., a USB-C to USB-C cable or a Lightning to USB-C cable). The software service indicator can change state when this occurs to signal to the user that the device is being used with an alternate host device as opposed to the primary one. For instance, the software service indicator can be a status light that glows blue (or some other color or indication) in this mode to indicate that it’s being used with another device. (See Figure 18, which shows a user demonstrating the behavior of the Play on Any Screen feature, where a status light can change color when the functionality is active and in use.) With this feature, the end-to-end input latency can be lower than traditional Bluetooth controllers, making it optimal for cloud game streaming use cases.

[00212] Example Implementation

[00213] In certain embodiments, Play On Any Screen can be implemented based on a combination of key aspects of the system architecture: (i) configurable secondary port, optionally doubling as a charging port; (ii) HID game controller profile management in the embedded software, as well as flexible/reconfigurable USB endpoints in the microprocessor; and (iii) a platform operating service application on a computing device that communicates with the mobile game controller to assist in configuring the mode. Figure 19 is a block diagram of a mobile game controller 1100, a mobile device 1110, and a USB host 1120 of an embodiment. As shown in Figure 19, the mobile game controller 1100 comprises a primary transceiver 1102, gamepad inputs 1104, a secondary transceiver 1106, and a charging subsystem 1106.

[00214] Configurable Secondary Port

[00215] In certain embodiments, the configurable secondary port is used to establish power and data connectivity to a secondary host platform such as PC, Mac, iPad, etc. This secondary port can double as a charging port. When the primary mobile device is not connected, the pass-through charging function can be ignored, while the secondary transceiver on the port can be used to communicate with another device.

[00216] The mobile game controller can detect when a USB host is connected to its secondary port by monitoring the state of the USB differential data pins. There is little harm in turning on the transceiver regardless of the presence of a USB host, but some embodiments may probe the state of the data pins and suspend the device if a dedicated charging port is detected on the other side of the cable. A common approach to determine a dedicated charging port is to measure the voltage on the D+/D- lines of the USB. In the event that the lines are not in a J or K state (where + and - are opposite states), it can be inferred that there is not a valid USB bus on the other end.
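The dedicated-charger inference described above can be sketched as follows. The check is simplified to "D+ and D- at the same level means no J or K state"; a production design would follow the USB Battery Charging specification's detection sequence, and the function names here are hypothetical.

```python
# Illustrative sketch of dedicated-charging-port detection on the
# secondary port. The same-level test on D+/D- is a simplification of
# the J/K-state check described in the text.

def is_dedicated_charger(d_plus_high, d_minus_high):
    """A valid USB bus idles in a J or K state, in which D+ and D- sit at
    opposite levels; if both lines read the same, infer a charger."""
    return d_plus_high == d_minus_high


def on_secondary_port_attach(d_plus_high, d_minus_high):
    """Suspend the transceiver for chargers; enumerate for real hosts."""
    if is_dedicated_charger(d_plus_high, d_minus_high):
        return "suspend"
    return "enumerate"
```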

[00217] When configuring the function of the secondary port, a specific game controller profile is specified, with each profile designed to specifically interoperate with the secondary host platform. Each host platform may have its own requirements or button mapping for HID game controllers which often requires platform specific customization. Some host platforms may have further requirements such as an authentication layer, which require further customization of the device descriptors. As an example, additional USB endpoints could be required to implement a platform specific communication and authentication protocol.

[00218] In certain embodiments, the configurable secondary port can make its decision on which game controller profile to use based on a physical input on the game controller. This can be a new physical switch or existing input on the controller surface. The embedded system can scan this input on connection to the secondary port, or scan continuously to reconfigure the secondary port on the fly.

[00219] In certain embodiments, the configurable secondary port can make its decision on which game controller profile to use based on an internal or external detection of the host device. As an example, the decision could be made by detecting the type of cable connected to the secondary port. The port could also provide an internal mechanism to identify the host, such as the use of a platform specific protocol utilizing vendor specific interfaces on the serial bus.

[00220] In certain embodiments, the configurable secondary port may implement a wired interface other than USB. In this case, equivalent game controller device paradigms can be used to implement the behavior of Play On Any Screen: (i) a communication medium to exchange information between device and host; (ii) a device descriptor which describes the device configuration to the host; (iii) a HID descriptor which describes how the surface inputs map to Human Interface Device inputs; and (iv) a mechanism in which HID inputs can be transmitted from device to host.

[00221] In certain embodiments, the primary and secondary transceivers of the mobile game controller may be the same physical hardware block. As an example, the mobile game controller may have a single USB transceiver which gets shared by both the primary and secondary ports. In order to manage the primary and secondary host device signals, a multiplexing solution is used to switch the single transceiver between multiple ports. The switch can then be managed by detecting whether the primary or secondary ports are physically connected. Usually the primary port takes precedence when plugged in, and the secondary port becomes active when the primary port is disconnected.
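The precedence rule described above (the primary port wins when plugged in, and the secondary port becomes active only when the primary is disconnected) can be sketched as a small selection function; the name and return values are illustrative.

```python
# Illustrative sketch of the multiplexer switch rule for a shared
# USB transceiver: primary port takes precedence when connected.

def select_transceiver_port(primary_connected, secondary_connected):
    if primary_connected:
        return "primary"
    if secondary_connected:
        return "secondary"
    return "idle"
```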

[00222] Game Controller Profiles

[00223] In certain embodiments, one of the key capabilities in the embedded software is the ability to reconfigure the game controller configuration to enable support of nearly any permutation of Human Interface Device (HID), and ultimately identify as many kinds of game controllers. Each platform (PC, Mac, iPad, etc.) has its own limited set of devices it supports out of the box, which requires tailored HID descriptors. In addition, software running on a platform may have button mapping requirements of its own that can further influence the configuration. For example, a web browser on a host platform may require a specific button mapping to work with a cloud gaming service.

[00224] Example Game Controller Profile Subsystem

[00225] At the top level, there is a single management module that controls which game controller profile should be used. Which profile to select is based on two factors: which port is currently active, primary or secondary (plug or receptacle), and which gamepad mode was specified by the user. The logic might use a lookup table.

[00226] For each game controller profile, there are several interfaces that can be customized based on the needs of the profile:
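A lookup-table form of the two-factor profile selection described above might look like the following sketch. The table contents are hypothetical examples, not the actual shipped profile set.

```python
# Illustrative sketch of profile selection keyed on (active port,
# user-selected gamepad mode). Profile and mode names are hypothetical.

PROFILE_TABLE = {
    ("primary", "default"): "native_mobile_profile",
    ("secondary", "pc"): "pc_gamepad_profile",
    ("secondary", "mac"): "mac_gamepad_profile",
    ("secondary", "ipad"): "ipad_gamepad_profile",
}

def select_profile(active_port, gamepad_mode):
    """Return the game controller profile for the current port and mode,
    falling back to a generic HID gamepad profile."""
    return PROFILE_TABLE.get((active_port, gamepad_mode),
                             "standard_hid_gamepad")
```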

[00227] Top-level USB descriptor: This is the top-level configuration of the USB device, which encodes which USB interfaces and endpoints are used, as well as unique values like IDs or strings to uniquely identify with the USB host. In addition to preparing the data to send over USB, the descriptor data can be used to configure local USB hardware, such as endpoint type, direction, packet size, etc.

[00228] HID report descriptor: This is the data established over USB which encodes which game controller elements are present and how to pack/unpack the value for each. Each mode has dramatically different report descriptors, and even different encodings/mappings for buttons, joysticks, and triggers.

HID usage map: A compact lookup table which maps surface inputs to HID usages present in the HID report descriptor.

[00229] HID input report builder: This interface takes the HID report descriptor data, HID usage map, and sampled game controller input values, and packages the data into a HID report which can be transmitted via USB. The rate of transmission is often dependent on the platform for which the profile is designed.
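A simplified sketch of such an input report builder is shown below, assuming a byte-granular usage map. Real HID reports are bit-packed according to the parsed report descriptor; the byte-level packing and all names here are illustrative assumptions.

```python
# Illustrative sketch: a usage map assigns each surface input a byte
# offset, and sampled values are packed into the report at those offsets.

def build_input_report(usage_map, sampled_inputs, report_size):
    """usage_map: surface input name -> byte offset in the report.
    sampled_inputs: surface input name -> value (0..255).
    Returns the packed HID input report bytes."""
    report = bytearray(report_size)
    for name, offset in usage_map.items():
        report[offset] = sampled_inputs.get(name, 0) & 0xFF
    return bytes(report)
```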

[00230] USB event hooks: A USB stack provides various hooks that allow the embedded software to respond to USB events coming from the host. These events vary by implementation, but a few examples include USB HID class requests, USB set interface requests, and USB endpoint and other kinds of interrupt events. The key here is that the USB stack handles a large percentage of the low-level communication details but gives these profiles the option to implement custom behaviors, effectively minimizing the amount of redundant code needed for each gamepad type.

[00231] In certain embodiments, the HID report descriptor is parsed directly inside the embedded system in the same way that a host device would. This unlocks functionality that requires dynamically assigning bit/byte positions when building HID input reports. By including the parsing within the embedded system, the descriptor and usage maps can be sent directly without requiring post processing by the platform operating system.

[00232] Embedded Software Updates

[00233] To increase flexibility of the system, certain embodiments may utilize the platform operating system to update the game controller profiles supported by the mobile game controller. In one embodiment, the platform operating system updates the embedded system through a firmware update. In this case, new embedded code can be added to extend existing profiles or add new profiles.

[00234] In certain embodiments, game controller profiles can be established by the platform operating service and written to the mobile game controller via the platform operating service API. In this case, controller support can be added to new platforms by transferring just the game controller profile data. In this embodiment where the mobile game controller can be represented by a standard HID descriptor, a common game controller profile can be customized by externally driven data. The profile can be transferred as a HID report descriptor and a HID usage map and stored to the embedded system’s persistent storage.

[00235] Analytics

[00236] An important potential challenge with this feature is obtaining usage metrics and debugging issues in the field (e.g., suppose there is an issue in recognizing the mobile game controller as an input device on a particular version of iPadOS). Since the Play On Any Screen feature can be used with a different computing device as opposed to just the primary computing device for the mobile game controller, usage stats and analytics are recorded and stored in the mobile game controller’s embedded storage, then read out on the next connection to the platform operating service application installed on the computing device and communicated to the platform operating service’s cloud service. This implementation enables off-platform data and analytics collection.

[00237] In certain embodiments, the secondary platform connected through the secondary port can supply analytics data to the mobile game controller. The analytics are communicated through the secondary transceiver and stored into the mobile controller’s embedded storage to be read out on the next connection to the platform operating service.

[00238] Content Capture System - Overview

[00239] Options for recording gameplay on a mobile computing device were limited until a capture system was introduced that enabled recording, streaming, editing, and sharing as part of an end-to-end system for mobile device game capture.

[00240] Traditionally, device operating systems bury screen video recording in a settings shelf, and in some cases do not expose the feature by default. The capture system allows the user to start/stop capture recording without having to remove their hands from the mobile game controller. By integrating a software service button and embedded software function into the gameplay device itself, the feature can reach a large audience. In addition, because the feature is very easy to access and low friction, users are not taken out of the game to use it.

[00241] In certain embodiments, the feature can be organized into three main areas: (1) control signals and status indication; (2) a background audio/video pipeline; and (3) content upload, delivery, and tagging powered by backend servers. Through testing with users, it was observed that start and stop recording functions, while certainly useful, are not sufficient for all use cases. On occasion, a player may only realize that an interesting gameplay moment occurred after the fact. In this scenario, being able to trigger a flashback recording, or retroactively grab video or audio buffers (e.g., the last 30 seconds of video or audio), is useful. Similarly, users may also want to live stream to another destination or share their screen with other users in the platform operating service.

[00242] Example Capture Modes

[00243] Example capture modes include screen recording, screen sharing, live streaming (to a live streaming service such as Backbone, Twitch, YouTube, Facebook, etc.), and flashback recording. A capture button is a software service button that enables screen capture on a mobile game controller. A capture button can be integrated with the platform operating services to support other forms of capture besides start and stop recording; namely enabling streaming, screen-sharing, and flashback recording. Note that the term “capture” is used herein to encompass all of these different forms.

[00244] One issue with capturing mobile gameplay is that content creators oftentimes do not know when the capture is actually occurring. For example, a YouTube or Twitch content creator may use the built-in operating system interface to capture the screen. In some examples, the mobile operating system may not provide a sufficient visual affordance for indicating that the screen is being captured. Further, doing so may occlude gameplay on the screen which is harmful to the user experience. Thus, content creators may play for an extended period of time, only to realize after the fact that none of their session was recorded because, for example, the screen was not being captured or the built-in screen capture crashed in the background.

[00245] When a mobile game controller is combined with the screen of the computing device, the user would normally have to rely solely on the screen of the computing device to understand the state of the capture system. In the smart mobile game controller system, there is a unique solution in that a software service indicator, such as a status light, can be used to indicate that the screen is being captured without occluding screen real estate. Figures 20-22 are screen shots that illustrate these features.

[00246] Control and Status

[00247] On the smart mobile game controller there can be a software service button (e.g., a Capture Button) to allow you to capture gameplay (e.g., start/stop recording or streaming). There can also be a software service indicator on the device to indicate broadcast/capture status. In addition, the device operating system can serve as a control and indication mechanism, using the display and interactive controls to enable the capture functionality. Regardless of the initiation source, the platform operating service application and embedded software can work in harmony to provide the user with a single, consistent, visual and tactile affordance that readily communicates state even if the application is in the background. These affordances include, but are not limited to, software service indicators.

[00248] The software service indicator can give users instant, or substantially instant, feedback on the current capture state and functionality. To indicate to the user that a capture is occurring, for example, a status light can be illuminated. This light can be further customized to communicate additional state details, such as applying a color profile or pulsing pattern to provide functionality insight at a glance. One example of this is utilizing associated brand colors with active service integrations, such as pulsing the status light the associated shade of purple when the user is streaming to Twitch. Further, in the case of a status light used as a platform service indicator, the illumination state, color, illumination patterns, and brightness can also be changed to provide visual cues to the user. These states can be singular, or enhanced to provide overlapping visual cues to the user at any time. For example, while recording and streaming, the status light can be illuminated and pulsing between the assigned capture color (red by default) and the color of the associated streaming service (example: purple for Twitch).

[00249] The software service indicator can of course be supplemented with one or more on-screen affordances as well. In some embodiments, the on-screen affordance can be inside the platform operating service application and/or in the mobile operating system. In certain embodiments, the platform operating service, because it can potentially be accessed through the software service button, can also indicate on the screen the current state of the capture system in lieu of or in conjunction with a software service indicator. In certain embodiments, the platform operating service can stop an outstanding capture session on the disconnection of the mobile game controller.

[00250] Example Implementation

[00251] In certain embodiments (see Figure 23), the capture system is implemented through coordination of the mobile game controller 1200 and the platform operating service 1210. As shown in Figure 23, the mobile game controller 1200 of this embodiment comprises a capture button 1202, controller input 1204, and a status light 1206. The platform operating service 1210 comprises an application 1220 and an operating system 1230. The application 1220 comprises an API 1221 and a capture service 1222 that outputs output destinations 1223, 1224 to a network and file storage 1225, 1226, respectively.

[00252] The mobile game controller 1200 provides the surface buttons and status indicators for the capture system. The platform operating service 1210 provides the screen and audio capture capabilities and handles interfacing to various cloud services. Using a platform operating service API 1221, the two systems exchange status and control information to create a capture system that is accessible and useful to the user.

[00253] Mobile Game Controller

[00254] In certain embodiments, the mobile game controller contains a software service button (capture button) which can be used to interact with the capture system of the platform operating service. Through the platform operating service application, the capture button, a software service button, can invoke various recording functions based on the use case. For instance, in screen recording mode, a short press starts/stops recording, and a long press saves a marker in the recording. In flashback recording mode, a short press starts/stops circular buffer recording, and a long press saves the last N seconds from the circular buffer. In screenshare mode, a short press starts/stops screensharing/buffering, and a long press saves the last N seconds from the circular buffer (Flashback recording can run concurrently with screensharing). In live stream mode, a short press starts/stops livestreaming/buffering, and a long press saves the last N seconds from the circular buffer (Flashback recording can run concurrently with live streaming).

[00255] In some embodiments, the Capture Button, as a software service button, can have special behaviors when the platform operating service application is in the foreground. For example, the device operating system can have its own circular buffer of length N seconds, and therefore when the user is streaming cloud games in the platform operating service, they don’t have to request user permission. In some embodiments, the Capture Button, as a software service button, can have user programmable behaviors where the function of the button is defined by a user setting in the platform operating service settings. For example, a user may select whether they want a traditional capture recording button or to use the flashback recording feature. Figures 24 and 25 are screen shots related to the recording features.
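The per-mode short/long press behaviors listed above could be organized as a dispatch table, sketched below with hypothetical mode and action names taken from the description; the table structure itself is an assumption, not the disclosed implementation.

```python
# Illustrative sketch of capture-button dispatch by (mode, press type),
# mirroring the per-mode behaviors described in the text.

CAPTURE_ACTIONS = {
    ("screen_recording", "short"): "toggle_recording",
    ("screen_recording", "long"): "save_marker",
    ("flashback", "short"): "toggle_circular_buffer",
    ("flashback", "long"): "save_last_n_seconds",
    ("screenshare", "short"): "toggle_screenshare",
    ("screenshare", "long"): "save_last_n_seconds",   # flashback runs concurrently
    ("live_stream", "short"): "toggle_livestream",
    ("live_stream", "long"): "save_last_n_seconds",   # flashback runs concurrently
}

def on_capture_button(mode, press):
    """Return the recording function invoked for this mode and press."""
    return CAPTURE_ACTIONS.get((mode, press))
```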

[00256] Example Status Indication

[00257] Based on the currently-active output destinations, one or more LED patterns are selected and sent to the connected controller. When multiple LED patterns stack, the patterns alternate to allow for multiple states to be shown simultaneously.
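The pattern stacking described above can be sketched as a round-robin schedule over the patterns of the active destinations; the colors, destination names, and scheduling scheme are illustrative assumptions.

```python
# Illustrative sketch: when multiple output destinations are active,
# their LED patterns alternate so each state gets shown in turn.

def led_pattern_schedule(active_destinations, cycles):
    """Return the sequence of patterns displayed over `cycles` slots,
    alternating round-robin through the stacked patterns."""
    patterns = {
        "recording": "solid_red",
        "twitch": "pulse_purple",
        "screenshare": "pulse_blue",
    }
    stacked = [patterns[d] for d in active_destinations if d in patterns]
    if not stacked:
        return []
    return [stacked[i % len(stacked)] for i in range(cycles)]
```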

[00258] Platform Operating Service

[00259] In some embodiments, the platform operating service implements the capture system through a service running on the mobile device which is able to capture audio and video data, compress/encode into the required format(s) and transfer to the specified output destination. The capture service operates in the background in order to record gameplay, and communicates with the platform operating system application through a bi-directional messaging interface. Example messages include: Add output destination (start recording), Remove output destination (stop recording), Report capture status, Software service button pressed, and Error has occurred with the output destination.
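The bi-directional messaging interface could be sketched as follows; the message names mirror the examples listed above, but the enum encoding and the simple state model are assumptions made for illustration.

```python
# Illustrative sketch of the capture service's messaging interface,
# using the example messages from the text. Encoding is assumed.

from enum import Enum

class CaptureMessage(Enum):
    ADD_OUTPUT = "add_output_destination"        # start recording
    REMOVE_OUTPUT = "remove_output_destination"  # stop recording
    REPORT_STATUS = "report_capture_status"
    BUTTON_PRESSED = "software_service_button_pressed"
    OUTPUT_ERROR = "output_destination_error"

def handle_message(state, message, payload=None):
    """Apply a message to a simple capture-service state dict
    of the form {"outputs": set_of_destination_names}."""
    if message is CaptureMessage.ADD_OUTPUT:
        state["outputs"].add(payload)
    elif message is CaptureMessage.REMOVE_OUTPUT:
        state["outputs"].discard(payload)
    elif message is CaptureMessage.REPORT_STATUS:
        return {"active": bool(state["outputs"]),
                "outputs": sorted(state["outputs"])}
    return None
```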

[00260] Broadcast and Capture Service

[00261] The capture service can be described as an audio/video pipeline. In the simple case, audio and video routes to a single destination, but in more complicated cases multiple output destinations can be active at the same time. On a mobile device, there is generally one interface to the mobile operating system audio/video stream, and therefore a common entry point for input samples can exist when supporting multiple destinations. Audio/video samples can then be multiplexed and routed to feature specific sample handlers.
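The common entry point with multiplexed routing described above can be sketched as follows; the class and handler-registration interface are hypothetical.

```python
# Illustrative sketch: one tap on the OS audio/video stream feeds a
# multiplexer that routes each sample to every active sample handler.

class SampleMultiplexer:
    def __init__(self):
        self.handlers = {}

    def add_handler(self, name, handler):
        """Register a feature-specific sample handler (output destination)."""
        self.handlers[name] = handler

    def remove_handler(self, name):
        self.handlers.pop(name, None)

    def on_sample(self, sample):
        """Route the same input sample to every active destination."""
        for handler in self.handlers.values():
            handler(sample)
```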

[00262] Audio/Video Pipeline

[00263] Figure 26 is a block diagram of an audio/video pipeline of an embodiment. As shown in Figure 26, raw screen/audio samples (raw video and audio sample data) 1300 are provided to multiplexed sample handlers 1310 with independent start/stop controls for each interface. Multiple possible output destinations allow for simultaneous operation and delivery. For example, the multiplexed sample handlers 1310 can output a screen recording 1320 which is sent via file I/O as a movie file 1330, screen sharing 1340 which is sent via a network to a WebRTC server 1350, live streaming 1360 which is sent via a network to an RTMP ingest server 1370, and a flashback recording 1380 which is sent via file I/O as a movie file 1390. The service scales based on active outputs, with the ability to add/remove outputs while running. The service can shut down when the last output is removed, and start when one or more outputs are added.

[00264] Audio and video sample data

[00265] Video frames are delivered from the mobile operating system as pixel buffers, which have a buffer descriptor defining the pixel format, e.g. 32-bit RGBA. Each frame is given a timestamp which helps encoders establish the frame rate of the video. For efficiency, video frames often arrive in a buffer list which represents a list of video frames over some period of time. The downstream encoders therefore need to be capable of unpacking several buffers at a time, and will consider the embedded timestamps when controlling the frame rate.

[00266] Audio frames are delivered from the mobile operating system in a similar way as pixel buffers, except their buffer descriptor defines the format of the PCM audio, such as the number of channels, bit depth, and sample rate. Audio buffers can also arrive in a buffer list, and timestamps are important for the encoder to match them up with the video frames.

[00267] Usually the capture system records the gameplay audio, but some mobile operating systems may supply voice audio from a system or external microphone. In this case, the voice samples arrive in a similar way as the audio frames, but may need to be mixed before reaching the encoder. However, some sophisticated MPEG encoders may support multiple audio tracks, which can reduce the complexity in the capture service.

[00268] Sample Handlers

[00269] In this description of the capture system, a sample handler is a subfeature of the service which implements a single output destination. Output destinations may be local for gameplay recording or broadcast externally for streaming destinations.

[00270] Example Screen Recording

[00271] For this mode, audio and video samples are streamed into an MPEG encoder and progressively written to disk until the process is stopped by the user. Resolution, bitrate, and framerate can all be specified by the user.

[00272] Example Screen Sharing

[00273] For this mode, real-time media credentials are established when the handler is started, which handle authentication as well as which room to share in. Once connected, audio/video frames are encoded and queued for upload to the real-time media server. Resolution, bitrate, and framerate are automatically handled by the feature to optimize for responsiveness.

[00274] Example Live Streaming

[00275] For this mode, an RTMP (Real-Time Messaging Protocol) ingest server endpoint is established when the handler is started, which determines which live streaming channel will receive the video content. Audio/video frames are adaptively encoded based on the server bitrate limits as well as network performance.

[00276] Example Flashback Recording

[00277] Audio and video samples are encoded on the fly into segments, but instead of being uploaded to the web as in an HLS (HTTP Live Streaming) use case, the segments are written into a file-based circular buffer continuously. A short history (for example, 15-30 seconds) is maintained, where the oldest segments are replaced with new segments in a circular fashion. When a recording is requested, the various segments are assembled in chronological order to produce a movie file.

[00278] One important criterion of the flashback recording feature is that the recording mechanism continues to run even if a request to save video occurs. This means that the system is capable of exporting a movie while simultaneously managing new segments in the circular buffer.

[00279] Example Content Delivery and Upload

[00280] Through coordination of the device operating system and platform operating services, user captured audio/video content can be uploaded to the platform operating service, and ultimately delivered more broadly to other users.

[00281] Captured Gameplay

[00282] In certain embodiments, gameplay that is captured to file can be uploaded to the platform operating service and distributed through a content delivery network. Metadata from the associated gameplay is automatically stored on the platform operating service to power search and content discovery.

[00283] Live Streamed Gameplay

[00284] In certain embodiments, gameplay can also be streamed live to various first- or third-party services like Twitch and YouTube (see Figure 27). The platform operating service tracks concurrent user live streams and screen sharing sessions to notify other users of watchable content.

[00285] Smart Record

[00286] Smart record is an example implementation of the flashback recording feature within a platform operating service, with deep integration with the mobile game controller.

[00287] Flashback Recording

[00288] In certain embodiments, the flashback recording implementation uses a recording mechanism similar to HTTP Live Streaming (HLS), where the audio/video encoder is configured in segmentation mode. In this mode, the encoder outputs multiple movie fragments rather than one single continuous movie. With HLS, each segment would be uploaded to a streaming endpoint on a periodic interval. Here, instead, the segmentation interval is set to 1 second, and each segment is stored in a circular buffer. To reduce runtime memory requirements, each segment is written to computing device storage, with a limit on the maximum number of segments that can live in the computing device's local storage at any given time (usually N+M, where N is the number of seconds of flashback memory and M is the number of spare/scratch buffers to help prevent overrun). Once the circular buffer is full, any new segments replace the oldest segment in a circular fashion.

[00289] The flow may comprise the following steps:

[00290] Set up the audio/video encoder in segment mode.

[00291] Save the initialization segment from the encoder for use as the movie header in the final output file.

[00292] Set up a circular buffer of N + M movie segments at 1-second intervals, where N is the number of seconds of flashback history and M is the number of extra seconds of buffering to avoid segment overwrite during video export. The write pointer points to the oldest segment; the read pointer is always N entries forward from the write pointer.

[00293] Observe screen/audio buffers uncompressed and feed into audio/video encoder.

[00294] For each movie segment out of the encoder, write the segment into the circular buffer until a stop event occurs.

[00295] When a “create flashback recording” event arrives, write the initialization segment as the movie header, and then write N movie segments based on the flashback history depth. One or more movie segments may arrive during this process; the flashback history should have M movie segments of extra space to avoid overrun.
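The steps above could be sketched, in simplified in-memory form, as follows; the class and method names are hypothetical, and real segments would be files on device storage rather than byte strings:

```python
from collections import deque


class FlashbackBuffer:
    """In-memory sketch of the N+M circular buffer of 1-second movie segments.

    N = seconds of flashback history to keep; M = spare/scratch slots that
    absorb segments arriving while an export is in progress.
    """

    def __init__(self, n_history: int, m_spare: int):
        self.n_history = n_history
        # deque with maxlen evicts the oldest entry automatically once full,
        # mirroring the write pointer wrapping around the circular buffer.
        self.segments = deque(maxlen=n_history + m_spare)

    def write(self, segment: bytes) -> None:
        """Append a newly encoded segment, replacing the oldest when full."""
        self.segments.append(segment)

    def export(self, init_segment: bytes) -> bytes:
        """Assemble header + newest N segments into a movie.

        Recording can keep writing into the spare slots while this
        snapshot is being assembled.
        """
        history = list(self.segments)[-self.n_history:]
        return init_segment + b"".join(history)
```

For example, after ten 1-second segments flow through a buffer with N=3 and M=2, an export yields the header plus the three most recent segments, while the writer is free to continue appending.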

[00296] In certain embodiments, the flashback recording mechanism may need to be started or ask for user permission before starting. In this example, the user would first press the capture button to initiate the capture service to enable recording, and then would use a different button gesture to save a flashback movie to their capture gallery. For example, a short press can start/stop flashback recording service, and a press-and-hold can save flashback movie and remaining recording.

[00297] While the flashback recording mechanism is running, some embodiments may use a software status indicator to indicate that the system is actively supporting the flashback recording mechanism. The status indicator can change its state based on whether the flashback recording service starts, stops, or successfully creates a flashback movie. For example, there may be no status light when flashback recording is stopped, a solid yellow light when flashback recording is running, and a triple blink pattern on the yellow light when successfully saving a movie.

[00298] Automatic Recording

[00299] In certain embodiments, the process of triggering a flashback movie to be saved can be automatic. Instead of requiring the user to press a software service button, the system may instead create movies automatically using additional context from the mobile game controller. For example, a real-time controller analytics stream can be analyzed to determine peak activity on the controller inputs to decide to save a movie.
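One simple way the real-time controller analytics stream might drive automatic saves is a windowed activity threshold; the window size, threshold value, and function name below are illustrative assumptions, not the actual trigger logic:

```python
def should_save_flashback(activity: list, window: int = 30,
                          threshold: float = 0.75) -> bool:
    """Decide whether recent controller activity peaked enough to save a movie.

    `activity` is a per-second weighted sum of controller inputs; a save is
    triggered when the average over the trailing window exceeds the threshold.
    """
    if len(activity) < window:
        return False  # not enough history yet
    recent = activity[-window:]
    return sum(recent) / window >= threshold
```

A production trigger would likely also debounce, so that a sustained peak produces one movie rather than one per second.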

[00300] Movies that are saved from the same recording session can be grouped by the platform operating service so they can be quickly reviewed simultaneously, allowing for more efficient editing of gameplay. Grouping movies of the same recording session also has the benefit of being able to combine or stitch the content more easily.

[00301] Live Streaming

[00302] In certain embodiments, this document describes live streaming gameplay on a device operating system in conjunction with the mobile game controller. The system can be designed to enable “one-touch” control of the live stream, allowing the broadcast to be controlled from a software service button. In some embodiments, control of the live stream can also be implemented through a user interface inside the platform operating service application. The interface can be either a controller-interactable button or a touch surface. In the absence of a software service button, such as when a mobile game controller is disconnected or does not support a software service button for capture, an alternate interface can be presented in place of the controller button. Figures 28 and 29 are screen shots that illustrate a live streaming feature of an embodiment.

[00303] The following subsystems help implement a seamless live streaming experience: account linking, audio/video capture and streaming, and accessory service capable of background operation.

[00304] Account Linking

[00305] Figures 30-31 are screen shots that illustrate an account linking feature of an embodiment. To set up a live stream within a capture service, it may be necessary to perform a one-time setup to link the user account with the platform operating service. The platform operating service application requires a proper stream key, which in turn determines the destination URL for the live stream ingest server. Most popular live streaming services provide an API to link the account, but it is also possible for the user to manually enter their stream key or manually input the URL of the ingest server.

[00306] In certain embodiments, the platform operating service provides a user interface to connect and link a live stream account with the platform operating service application. Once the account is linked, the stream key and other live stream channel information (user name, channel name, etc.) may be accessed from the live streaming API and used in conjunction with the audio/video capture service to enable live streaming of gameplay.

[00307] In certain embodiments, the platform operating service provides a user interface to manually enter a stream key for a live streaming account which can be used in conjunction with the audio/video capture service to enable live streaming of gameplay.

[00308] Audio/Video Streaming

[00309] In some embodiments, once the platform operating service has established the stream key and/or ingest server URL, an RTMP connection is formed with an RTMP server in preparation for streaming audio/video content to the live stream.

[00310] To get gameplay audio/video to the live streaming service, the screen and system sounds need to be captured and converted via an MPEG encoder. The encoded frames are then passed to a broadcaster module, which handles packaging in the RTMP frame format and sending through a network transport over the established RTMP connection.
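The capture, encode, and broadcast chain described above might be wired together as a simple pipeline; the stand-in callables below (for the MPEG encoder, the broadcaster's RTMP packaging, and the network transport) are hypothetical placeholders:

```python
def stream_pipeline(frames, encode, packetize, send) -> int:
    """Push captured frames through encoder and broadcaster (sketch).

    frames    -- iterable of raw captured audio/video frames
    encode    -- stands in for the MPEG encoder
    packetize -- stands in for RTMP frame packaging
    send      -- stands in for the network transport on the RTMP connection
    Returns the number of packets sent.
    """
    sent = 0
    for frame in frames:
        packet = packetize(encode(frame))
        send(packet)
        sent += 1
    return sent
```

A real implementation would run capture and network send on separate threads with a bounded queue between them, dropping or re-encoding frames under backpressure.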

[00311] In additional embodiments, other video transport protocols (e.g., Secure Reliable Transport, HTTP Live Streaming) can be used to stream audio/video content to a live stream, with adjustments for any encoding specifications.

[00312] Status Indication

[00313] In some embodiments, a software status indication can be used with a mobile game controller to show the live stream status, such as a status light. The broadcast status in this case is communicated from the platform operating service to the mobile game controller.

[00314] In certain embodiments, the status indication may be set by the platform operating service for display within the operating system of the computing device. The operating system may allow for status indication in its status bar, system notifications, and other places where system status can be found.

[00315] Capture Edit/Upload - Capture Gallery

[00316] In certain embodiments, once a video clip has been recorded, it is copied into the platform operating service application’s local storage for easy access to view/edit. All outstanding videos and screenshots are organized into a gallery within the platform operating service application, with the ability to view, edit, share, and delete. In addition, the capture gallery shows any videos that have been uploaded to the platform operating service, which is described herein. Figures 32-37 are screen shots that illustrate a capture gallery feature of an embodiment.

[00317] Watermarking

[00318] In certain embodiments, a recorded video may be watermarked before uploading or saving externally. A watermark can be a static or moving image or text that is blended into the video to track its original creator and/or program that created it. Figure 38 is a screen shot that illustrates a watermarking feature of an embodiment.

[00319] Example Video Editor

[00320] When viewing original video recordings, the platform operating service application allows the user to edit and trim their video in preparation for uploading to the platform operating service. The in-app video editor is designed to be fully controller navigable, using the joysticks of the smart controller to quickly scrub through video in an ergonomic fashion, and utilizing the mobile game controller button shortcuts to toggle between the trim handles. When the user moves the joystick left and right, the scrub position of the video is altered and the preview of the current video frame is updated in real-time. There are three scrubbing modes: (1) scrub start position of trim window, (2) scrub current position of preview playback, and (3) scrub end position of trim window.

[00321] As part of the editing workflow, the user will need to establish where the clip starts and ends. To do this efficiently, the user will playback the video and pause at the point where they want the video to start. Once the position is identified, they can use a mobile game controller button to create the start and end markers. The user can then use the mobile game controller buttons to quickly toggle between the three scrubbing modes, usually previewing the video after making slight adjustments to the clip endpoints.

[00322] To aid in the editing of long videos, an activity waveform is shown in the video timeline in the screen shots of Figures 39-41. This waveform is derived from the controller input activity while the gameplay was recorded. The signal itself is essentially a weighted sum of all of the controller inputs at each point in time. The resulting waveform tends to be quite noisy, so the signal is finally processed through a moving average filter to smooth out the visualization. The end result is that areas of gameplay with significant game controller activity show up as peaks in the waveform, and places where the user is waiting on loading screens, awaiting player respawn, etc. are visualized as low activity. This can significantly improve the efficiency of finding points of interest within a video recording.
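The waveform computation described above, a weighted sum of the controller inputs at each point in time followed by a moving average filter, could be sketched as follows; the parameter names and window size are illustrative:

```python
def activity_waveform(samples, weights, smooth: int = 5):
    """Compute the controller-activity waveform for a video timeline.

    samples -- list of frames, each a list of per-input magnitudes
    weights -- contribution of each controller input to the sum
    smooth  -- moving-average window (in frames) to tame the noisy raw signal
    """
    # Weighted sum of all controller inputs at each point in time.
    raw = [sum(w * v for w, v in zip(weights, frame)) for frame in samples]
    # Trailing moving average smooths the visualization.
    out = []
    for i in range(len(raw)):
        window = raw[max(0, i - smooth + 1):i + 1]
        out.append(sum(window) / len(window))
    return out
```

Peaks in the returned series correspond to periods of heavy controller use, while loading screens and respawn waits flatten toward zero, matching the editing aid described above.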

[00323] Game Tagging

[00324] In certain embodiments, using a combination of the software service button and a universal search feature, users can easily tag the game that was recorded in their video. The application can track when a user launches a game from an application user interface, such as the personalized dashboard or the universal search functionality. In some embodiments, the application can also detect when a user launches a game from outside the application by interacting with the device operating system. This launch detection functionality can be utilized to detect what games were played during the recording period and suggest these games to the user. The user can then confirm the game tagging suggestion (see Figure 42). In the case that the source of the recording is unknown or incorrect, the user can search for and select the game to be tagged (see Figure 43). The software application can prioritize games the user has previously played or that have been detected, then provide a search prompt to tag any game within the platform operating service.

[00325] Example Video Upload

[00326] Once a video has been trimmed within the minimum acceptable length, the user can save and upload the clip to the platform operating service. In certain embodiments, this is a two-step process. First, the clip is uploaded to the content delivery network (CDN) that hosts the video. For the best user experience, a CDN can be used which provides flexibility in streaming the video using HLS technology at various bitrates and resolutions. Once the file is uploaded to the CDN, the source URL is passed on to the platform operating service along with other metadata for the video, such as author, tagged game, and other video attributes important for the viewing experience.

[00327] When a video is added to the platform operating service, it is associated with the logged in user so that it will appear in the local user’s profile as well as in certain areas of the dashboard that include user generated content. For example, gameplay video uploads that are correctly tagged can appear on game detail pages for other users. Additionally, videos can show up in recent and trending rows on the Integrated Dashboard.

[00328] Platform Operating Service with Integrated Dashboard - Overview

[00329] One of the important issues users of a mobile game controller face is that they are not able to access all of the content across multiple distinct services in a straightforward way. Mobile app stores do not necessarily have an incentive to enumerate or index individual game titles that can be played through use cases such as console/PC remote play or cloud game streaming. This is largely because streamed content is typically consolidated into a single app (e.g., Netflix or Amazon Luna) and because some of this content can be streamed on a host device wholly separate from the computing device connected to the mobile game controller, as is the case with console/PC remote play.

[00330] In certain embodiments, a smart mobile game controller has one or more integrated dashboards. This software transforms the computing device into a significantly more capable end-to-end gaming device, acting as an overall entry point into the experience of using the smart mobile game controller. The integrated dashboard is a part of the platform operating service. The integrated dashboard can provide a rich full screen experience when connected to the mobile game controller. When using the software service button(s) and/or connecting the mobile game controller, the integrated dashboard can be opened. The integrated dashboard can be designed to be primarily in landscape mode when the mobile game controller is connected to the computing device. Further, the integrated dashboard user interface can be configured to treat game controller input as its primary input modality. Thus, the combination of the mobile game controller and the platform operating service can feel like a dedicated end-to-end gaming system that elevates it above a core input device experience.

[00331] The integrated dashboard can aggregate all the services from native distribution (e.g., Apple Arcade), remote play (e.g., PlayStation Remote Play), and cloud game streaming (e.g., Amazon Luna). Through the integrated dashboard, the user can seamlessly launch streamed games from online resources directly inside the platform operating service using the smart service button, reducing user friction and streamlining their ability to instantly jump into gaming content. Users can leverage a universal search feature across multiple services to easily find games across the platform operating service and their computing device. User actions within the interface can be dynamically updated to be contextual and pertinent to the current device state, taking into account installation status, streaming capability, past user actions, user capabilities and linked accounts, localization, and other factors. The software service button(s) can further enhance the experience by allowing the user to quickly switch between different gaming experiences on the mobile device.

[00332] Controller Input Architecture

[00333] To navigate the integrated dashboard with a mobile game controller, the user interface can adjust its interaction paradigm to a focus-based approach. Some or all of the user interface components can be selectable via controller inputs. Further, in certain embodiments, some or all of the user interface can respond to touch. When a component is selected it is considered to be in focus, and generally only one component can be in focus at once. The selected component can receive additional input from the controller such as button clicks. For instance, a button may be selected by navigating the UI with the joystick and then activated by pressing the A button or touching the interface element directly. Figure 44 is an illustration of a controller input architecture of an embodiment.

[00334] To help the user navigate efficiently, the integrated dashboard utilizes several surfaces on the game controller. For the purposes of this button explanation, a standard ABXY button layout is used to describe the core front-facing buttons (see Figure 45). Product variations of the smart mobile game controller may use alternate symbols and glyphs while maintaining the same four-button diamond. These buttons are referred to as face buttons. In certain embodiments, the face button labels can be transposed based on the product variation; for example, the face buttons could be BAXY instead of ABXY.

[00335] The joystick and directional pad are used to adjust which component of the dashboard is in focus. Face buttons of the controller are used to move forward and back (the A button launches into the focused content, equivalent to a tap, and the B button returns to the previous content, equivalent to tapping on a back chevron/button). The remaining face buttons are used for contextual actions (the X button executes contextual action slot 1, and the Y button executes contextual action slot 2). The options button (...) is used for opening contextual actions that are associated with a menu of options. The L1/R1 shoulder buttons are used for quick tab navigation and, in some cases, to scroll to the beginning or end of a scrolling collection. The menu button (hamburger) is used for invoking the dashboard's secondary menu system (system settings, friends list, etc.). Software service button 1 is used for switching between games and bringing the dashboard back into focus. The capture button (software service button 2) is used for interacting with the capture system.
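The button assignments above could be represented as a simple lookup table that the dashboard's input layer consults; the key and action names below are hypothetical labels, not the product's actual identifiers:

```python
# Hypothetical mapping from controller surfaces to dashboard actions,
# mirroring the assignments described in the text.
BUTTON_ACTIONS = {
    "A": "activate_focused",        # launch into focused content (like a tap)
    "B": "back",                    # return to previous content
    "X": "contextual_action_1",
    "Y": "contextual_action_2",
    "options": "open_context_menu",
    "L1": "tab_previous",
    "R1": "tab_next",
    "menu": "open_side_menu",       # secondary menu system
    "service_1": "switch_game_or_focus_dashboard",
    "service_2": "capture",
}


def handle_button(button: str) -> str:
    """Resolve a controller button press to a dashboard action name."""
    return BUTTON_ACTIONS.get(button, "ignored")
```

Centralizing the mapping this way also makes product variations (e.g., a BAXY layout) a matter of swapping the table rather than changing input-handling logic.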

[00336] Button Hints

[00337] To make the integrated dashboard as intuitive as possible, button hints can be used to explain the various actions that may change contextually in the system. A button hint is a combination of a symbol/glyph image representing the controller button plus a short action text describing the action. Button hints can be dynamic. When the platform operating service application requests information about the Integrated Dashboard, the platform operating service returns a Contextual Action Button (CAB) that describes the button in more detail. Part of the CAB description is the button hint. The button hint is used for rendering the button (see Figure 46). Another part of the CAB description is the action. The CAB action is invoked when the corresponding button on the mobile game controller is pressed. CAB descriptions are cached locally by the platform operating service application for faster rendering when the application loads.
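A CAB description as characterized above, a rendering hint plus an invokable action, together with the local cache, might be modeled as follows; the class and field names are illustrative assumptions, not the service's actual schema:

```python
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class ContextualActionButton:
    """Sketch of a CAB: a button hint for rendering plus an action to invoke."""
    button: str                 # which controller button, e.g. "X"
    glyph: str                  # symbol/glyph image identifier for the hint
    hint_text: str              # short action text shown next to the glyph
    action: Callable[[], str]   # invoked when the button is pressed


class CabCache:
    """Local CAB cache so hints can render before the service responds."""

    def __init__(self):
        self._cabs: Dict[str, ContextualActionButton] = {}

    def update(self, cab: ContextualActionButton) -> None:
        """Store/refresh the CAB description returned by the service."""
        self._cabs[cab.button] = cab

    def invoke(self, button: str) -> str:
        """Run the cached action for a button press, if any."""
        cab = self._cabs.get(button)
        return cab.action() if cab else "no-op"
```

Here the cache doubles as the dispatch table: rendering reads the glyph and hint text, and a button press invokes the stored action.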

[00338] When focus changes to a new component in the dashboard, any available contextual actions can be translated into button hint affordances. This explicitly defines which button can be used to invoke each action. Since many components support more than one possible action, button hints can be grouped together in a fixed location on the screen. This provides a consistent anchor point for the user to understand what options are possible at any given time.

[00339] Dashboard Architecture

[00340] As shown in Figure 47, the integrated dashboard can be organized into the following top-level component collections: shortcut buttons, status bar, side menu, content grid, and contextual button hints.

[00341] Regarding the shortcut buttons, in the top left, several shortcuts can be provided for easy access to commonly accessed features. For example, in certain embodiments, there can be a magnifier glass button or other button to easily bring up the universal search feature. The shortcut buttons can be invoked by pressing buttons on the mobile game controller. For instance, the menu button is activated when pressing the menu button on the mobile game controller. Regarding the status bar, since the integrated dashboard can be a full screen experience, the device operating system status can be replaced with a custom status bar. This shows common status items such as time and battery level, but also includes platform operating service features such as number of friends online.

[00342] Regarding the side menu, additional settings and features are organized into an intermittently visible side menu. The menu can be revealed at any time by pressing the mobile game controller’s menu button. Regarding the content grid, content driven by the platform operating service can stretch edge to edge of the screen, and is conceptually organized into a grid. This grid of content tiles can be easily navigated with the joystick and leads to additional content pages. Regarding contextual button hints, in the lower right, any applicable button hints can be shown. Which buttons are shown depends on the content which is currently in focus. Each contextual action possible for the content will be shown as its own button, and will dynamically update when new content is selected on the dashboard.

[00343] Content Grid

[00344] As shown in Figure 48, content from the platform operating service is primarily presented as a grid. Content is organized horizontally into rows, and each row contains several individual items of content. Content can take many forms, such as games, editorial pages, promotions, user-generated content (UGC), and gameplay highlights. Game content can have a badge which displays the game’s platform. For example, a game may have an App Store, Xbox, or Apple Arcade badge. All data in the content grid is server driven, but the smart mobile game controller shares its identity and analytics to customize the content.

[00345] Figure 49 is an illustration of an app store badge of an embodiment. When the content grid is in focus, one row is selected at a time. The in-focus row can be scrolled to a position which is centered vertically, for clarity on what is in focus and to ensure other floating components of the dashboard do not obstruct the content. Using the mobile game controller joystick or directional pad, the up and down directions scroll the content vertically, switching focus to the row above or below the currently selected row. The content scrolls in row increments rather than pixel/point increments, which dramatically improves navigation efficiency. The left and right directions conversely navigate between tiles of content within the focused row. Similar to row scrolling, content moves one item at a time rather than scrolling the canvas by pixel. In both scrolling modes, holding a particular direction will initially move one component at a time. As the controller input is held, the input event is repeated at progressively shorter intervals. The scrolling therefore accelerates over time, allowing the user to quickly navigate to the beginning or end of the focused content.
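The accelerating repeat behavior could be modeled by shrinking the repeat interval the longer a direction is held; the constants below (base interval, floor, decay) are illustrative, not the actual tuning:

```python
def repeat_interval(hold_ms: int, base_ms: int = 400, floor_ms: int = 60,
                    decay: float = 0.85) -> float:
    """Milliseconds until the next repeated move while a direction is held.

    The interval shrinks geometrically with hold time, so navigation
    accelerates toward the start/end of a long row, clamped at a floor
    so repeats never become uncontrollably fast.
    """
    steps = hold_ms // base_ms          # how many repeats have elapsed
    return max(floor_ms, base_ms * (decay ** steps))
```

With these values, the first repeat fires after 400 ms, subsequent repeats speed up by 15% each step, and the rate saturates at one move per 60 ms.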

[00346] Figure 50 is a flow diagram of an integrated dash of an embodiment. As shown in Figure 50, an application 1400 sends an integrated dashboard request 1440 to a service 1410. In response to receiving the request, the service 1410 collects recommended games 1450 from a recommendation engine 1430, as well as user games 1445, friend highlights 1455, trending highlights 1460, platform games 1465, perks and rewards 1470, and active screen sharing 1475 from a database 1420. The service 1410 then sends an integrated dashboard response 1480 to the application 1400 based on the collected information.

[00347] Content Pages

[00348] Content pages can be opened by pressing the primary face button when a particular content tile is in focus. This can expand into a detailed view for the content, which utilizes a subset of the content grid functionality but is driven by the details of the game/article, as shown in Figure 51. Content pages can be either inline or external. With inline pages, the content can expand/zoom inline within the content grid. This is analogous to how collapse/expand chevron controls work within a text editor: collapsing the detail page restores the original row, and expanding the detail page reveals additional row content beneath the original row.

[00349] External content pages however can be presented in a separate modal which enables the page to take up the entire screen and is ideal for rich full screen experiences. External pages can be ideal when helping the user focus on a single piece of content, whereas inline pages are better suited for situations where the user wants to scroll through a collection of games, for example. Content page information can be requested from the platform operating service. In certain embodiments, the platform operating service application sends a request to the platform operating service for the detailed game information. The service responds with various metadata and identifiers such as, but not limited to, the sources of the game, screenshots, and a summary.

[00350] Figure 52 is a content page flow diagram of an embodiment. As shown in Figure 52, the application 1400 sends an extended game info request 1510 to the service 1410, which gathers sources (platforms) 1520, screenshots 1530, and a summary 1540 from the database 1420. The service 1410 then sends an extended game info response to the application 1400.

[00351] Additional embodiments could include content that has been favorited by the user to form a content page, content surfaced from non-gaming external apps such as Netflix or a web browser, and content not publicly available such as in application test environments such as TestFlight or Firebase.

[00352] Server Architecture

[00353] The platform operating service that powers the integrated dashboard experience essentially provides an API over a rich content database which aggregates data from multiple sources. The platform operating service application is provided with the necessary information to render the content.

[00354] Figure 53 is an illustration of a general client-server architecture of an embodiment. As shown in Figure 53, a cloud service 1600 is in communication with a cloud streaming service 1610 and a gameplay device 1620. The cloud service 1600 comprises a server 1635, a database 1640, an analytics element 1645, a push notification provider 1650, a game information provider 1655, and a content management element 1660.

[00355] Games Database

[00356] The platform operating service’s games database comprises a library of games across all relevant external platforms with information about each game. Each game entry can contain the information needed by the platform operating service application in order to render various functionality.

[00357] The games database serves multiple purposes within the platform operating service application, including, but not limited to: (i) assessing whether a particular game supports game controllers, (ii) providing rich media and localized metadata for the game to surface to the user, (iii) user-initiated search and filtering of games, so that users can manually determine whether a particular game is supported by the platform, discover games supported by their gameplay platform of choice, learn more about a particular game, and find highlights and screenshots of other users playing the game, and (iv) surfacing game suggestions to users on the Integrated Dashboard, both through editorial curation and algorithmic personalization (see below).
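A games-database entry and a controller-support query like those described above might be modeled as follows; the record fields and function names are illustrative assumptions, not the service's actual schema:

```python
from dataclasses import dataclass, field


@dataclass
class GameEntry:
    """Sketch of one games-database record (fields are illustrative)."""
    title: str
    platforms: list                       # e.g. ["appstore", "xbox"]
    controller_supported: bool = False    # verified via testing/reports
    metadata: dict = field(default_factory=dict)  # localized media, summary, etc.


def filter_supported(games, platform):
    """Games on the given platform that are marked controller-supported,
    as used for purpose (i) and platform-filtered discovery above."""
    return [g for g in games
            if g.controller_supported and platform in g.platforms]
```

Queries like this one would back both user-initiated search filters and the server-driven rows of the Integrated Dashboard.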

[00358] This games database can be kept up-to-date through both automated consistency checks that compare the database’s content against content platforms’ own databases and, in the cases of platforms that do not provide such information, manual intervention as the result of user reports of missing or incorrect games. This manual intervention can be important given the fragmented availability of this information across multiple platforms. Human intervention usually occurs as the result of user reports, such as app reviews or customer support requests, but user search and launch analytics can be used to determine games that users wish to play but have not yet reported. The games surfaced from these queries are tested to ensure they support game controllers and are then marked as such in the database.

[00359] Figure 54 is a flow diagram that shows application/service games database interaction. As shown in Figure 54, after the application launches (act 1710), the application 1400 sends a request 1720 for the latest games to the service 1410, which gets game information 1730 from the database 1420 and responds to the application 1400 (act 1740). The application 1400 then adds/updates games to the application database 1700 (act 1750).

[00360] Analytics

[00361] As the user interacts with the platform operating service application, analytics can be sent to the platform operating service. The platform operating service can use these analytics to enhance the user experience by personalizing application content. In some embodiments, this can be achieved by customizing the data returned in the application’s Integrated Dashboard request.

[00362] The flow of analytics can start on the gameplay device, with events sent to the platform operating service’s analytics service. The analytics service can forward the events to several destinations. One destination can be a recommendation engine. This engine is used to recommend games to a user based on their activity and possibly other users’ preferences. Another destination can be an analytics store. The analytics store is used to build information the platform operating service can use when generating Integrated Dashboard content.

[00363] Examples of customized content include prioritizing rows based on usage, prioritizing rows based on the age of the content, hiding items the user may already have installed, displaying highlights based on views or reactions, and driving behavior in the application, such as prompting the user for a review after a custom set of criteria is met.

[00364] Figure 55 is an illustration of an architecture that can be used for analytics. As shown in Figure 55, this architecture comprises a cloud service 1800 in communication with a gameplay device 1810. The cloud service 1800 comprises a server 1820, a database 1830, an analytics store 1840, an analytics service 1850, and a recommendation engine 1860.

[00365] Personalization

[00366] The following are examples of content personalization. The mobile game controller-based launch analytics and installed game detection can be used in conjunction with the games database and additional user context, such as the user’s device state, information provided by the device operating system, and information from accounts on external or native platforms, to produce game recommendations as part of the integrated dashboard. Viewing analytics of user-generated content can also be fed into the recommendation engine to influence game and content recommendations. Games can also be organized into groups that can be displayed as rows on the Integrated Dashboard. The personalized dashboard can rank the rows based on whether the row has been seen or not, to additionally surface new and unseen content. The platform operating service application can report to the cloud service the list of games that it has detected. The platform operating service uses this information to provide lists of games based on a platform. For instance, if the user has Playstation Remote Play installed, they may see a row specifically for Playstation games. The platform operating service may enable the user to maintain friendships with other users, allowing the server to surface friend activity and relevant content such as game highlights and suggestions from friends. Game tiles can also be customized with information about other users that are playing a particular game.

[00367] Game tiles can be associated with actions based on specific accessory/device context that are relevant and customized to the user and current device state, such as “Download”, “Play Instantly”, “Add or Hide from Library”, “See Details”, “Ways to Play”, and more. Content can also be segmented based on product SKU, to surface games, how-tos, perks and benefits, and other product-specific content, such as Playstation titles and content for the Playstation co-branded SKU. The server can know the user’s configured language and can provide localization-specific content, including content filtering and availability, language, media and image variation, ratings, and more.

[00368] Launching the Application with Software Service Button

[00369] In some embodiments, the user can press a software service button in order to launch the platform operating service application. In certain embodiments, the application can take the user into an onboarding and setup flow, adjust computing device permissions, or go straight into the integrated dashboard. When the integrated dashboard is shown to the user, it can receive data from the platform operating service to display user-personalized content; alternatively, it can display curated content. The user can use the mobile game controller to navigate through the integrated dashboard, where they can launch games, view clips and highlights, adjust settings, or use any feature that is part of the integrated dashboard.

[00370] User Journey

[00371] The following is an example of a user journey: (1) connect mobile game controller to the computing device, (2) press software service button, (3) latest personalized content shown in dashboard, (4) navigate to desired gaming experience, (5) launch into the game.

[00372] Example Implementation

[00373] In certain embodiments, the platform operating service app can request game information from the platform operating service’s games database so that it can be cached locally. This reduces load times and subsequent network bandwidth requirements to enhance the user experience. The local games database cache provides data used throughout the platform operating service application, including the rendering of games in the Integrated Dashboard as well as game search and discovery.

[00374] In certain embodiments, the platform operating service application can provide metadata, including detected games, to the platform operating service so that the platform operating service can generate and return a personalized dashboard, currently represented by collections of related content organized into rows. A row may contain popular games, trending content, device-specific information, or simply a list of games. Further, some rows may contain rendering instructions for the platform operating service application to render with specialized logic not otherwise available on the Integrated Dashboard. One example of such an instruction is known as “User Games”—when the platform operating service application encounters this instruction from the platform operating service, it renders tiles for all games the platform operating service application knows to be installed on the computing device.
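The “User Games” rendering instruction described above can be sketched as follows. The row format, function name, and tile shapes are illustrative assumptions only.

```python
# Hypothetical sketch: expand dashboard rows into tiles, with the special
# "User Games" instruction rendering tiles for locally installed games.
def render_rows(dashboard_rows, installed_games):
    rendered = []
    for row in dashboard_rows:
        if row.get("instruction") == "User Games":
            # Specialized logic: build tiles from locally detected games.
            rendered.append([{"game_id": g} for g in installed_games])
        else:
            # Ordinary rows carry their tiles directly from the service.
            rendered.append(row.get("tiles", []))
    return rendered

rows = [{"instruction": "User Games"}, {"tiles": [{"game_id": "featured"}]}]
tiles = render_rows(rows, ["g1", "g2"])
```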

[00375] The platform operating service application can render the integrated dashboard. The user can navigate the integrated dashboard using any combination of input from the mobile game controller and touch gestures. A user can also use the mobile game controller or touch gestures to open or launch a tile, after which the application can perform a designated action, such as launch a game, open a rich media experience, visit a website, view another aspect of the application, or launch a game or experience within another application on the computing device.

[00376] Figure 56 is a flow diagram of an application integrated dashboard of an embodiment. As shown in Figure 56, after the application 1400 launches (act 1910), the application 1400 sends an integrated dashboard request 1920 to the service 1410, which communicates with the database 1420 (act 1930). The application 1400 renders a cached home screen (act 1940), and the service 1410 returns an integrated dashboard response 1950. The application database 1900 and the application 1400 exchange stored integrated dashboard information (acts 1960, 1970), and the application 1400 updates the integrated dashboard (act 1980).

[00377] Additional embodiments can include launching content on an external device, launching videos, opening websites, viewing live streams, and other applications or other functionality within the application. Additionally, the platform operating service application can also be launched when the controller is attached to the computing device. Figures 57-59 are screen shots that show examples of personalized content.

[00378] Viewing Available Installed Content

[00379] In some embodiments, the user can open the platform operating service application through pressing a software service button or launching from the device operating system. The user can be taken to the Integrated Dashboard to see what content is available to use within the application. The Integrated Dashboard can surface content that can be played, as well as recall and quickly launch games the user had previously played.

[00380] The following is an example user journey: (1) user opens the personalized dashboard by pressing the software service button, (2) the platform operating service compares its understanding of installed applications with a list of supported games, and (3) the user sees a list of supported games on the personalized dashboard.

[00381] Example Implementation

[00382] In certain embodiments, the platform operating service can automatically determine what games are available to play by leveraging a number of data sources in the underlying system and comparing them to a curated list of games known to be compatible with the platform operating service. This system can also preserve user privacy by only collecting and storing information about software packages known to be games, thus limiting the data available to cloud services to only the details necessary for such a collection.

[00383] Figure 60 is a flow diagram that illustrates an embodiment. As shown in Figure 60, a cloud service 2000 sends a list of supported games for the platform to the application 1400 (act 2020). After the user presses the software service button (act 2030), the application 1400 sends a query to the application operating system 2010 for installed software packages (act 2040), and the application operating system 2010 returns the list (act 2050). The application 1400 then sends a list of installed packages that are also games to the cloud service 2000 (act 2060), which returns presentational metadata for the games (act 2070). The application 1400 then presents a rendered list of available games to the user (act 2080).

[00384] Further, the platform operating service can receive a list of content that is supported for the gameplay device. When the user presses the software service button, the platform operating service application can query the application operating system for installed software packages, as well as restore a cached list of applications previously detected on the gameplay device (in case, for example, the computing device operating system does not return a complete list of installed software packages). The software application can then cross-reference this list of known available content with the cloud service-provided list of supported content, and send this list to the cloud service. The cloud service, which can store metadata for all supported games, is then able to send down a customized set of metadata for the games to render, for example, on the personalized dashboard for the user to view and launch available content.
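The cross-referencing step described above can be sketched as a set operation over freshly queried and cached package lists. The function name and package identifiers are hypothetical; the cached list stands in for applications previously detected on the gameplay device.

```python
# Hypothetical sketch: merge freshly queried and cached package lists, then
# keep only packages that appear in the cloud-provided supported-content list.
def detect_available_games(installed_packages, cached_packages, supported_games):
    known = set(installed_packages) | set(cached_packages)
    return sorted(pkg for pkg in known if pkg in supported_games)

available = detect_available_games(
    installed_packages=["com.example.a", "com.example.b"],  # from the OS query
    cached_packages=["com.example.c"],                      # previously detected
    supported_games={"com.example.a", "com.example.c", "com.example.d"},
)
```

The resulting list could then be sent to the cloud service, which returns presentational metadata for each supported game.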

[00385] In some embodiments, the application can retrieve and/or store the last time content was detected, and then send the list of installed content to the server in order of this last detection time. This additional functionality can be used to sort the rendered list of available content to the user so that more recently-used content is easier for the user to see and launch.
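The ordering by last detection time described above can be sketched as a simple sort; the function name and timestamp convention (larger means more recent, missing means never detected) are illustrative assumptions.

```python
# Hypothetical sketch: order content so the most recently detected items
# appear first; items never detected sort last (default timestamp 0.0).
def order_by_last_detected(content_ids, last_detected):
    return sorted(content_ids, key=lambda c: last_detected.get(c, 0.0), reverse=True)

ordered = order_by_last_detected(["g1", "g2", "g3"], {"g1": 10.0, "g2": 30.0})
```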

[00386] Additional embodiments could include non-game media being cataloged and launched from the application, and games that are not locally installed on the gameplay device but are still available to play using the gameplay device (see the section below).

[00387] Another embodiment could be the application not sending such information to the cloud service in favor of rendering a list of available content using locally-stored information. One embodiment can be to allow the user to determine whether an installed application can be played with the gameplay device and maintain this user-curated list of supported content. The benefit and use case of such embodiment(s) can be to enable the user to view available content when the cloud service is unavailable, for example, because the user does not have a network connection.

[00388] Another embodiment could include being able to add or remove content from this list. Users can create a list of their favorite games that they can quickly access when opening the application.

[00389] Figure 61 is a screen shot showing an example of recently-played games.

[00390] Playing Content Available on an External Gameplay Service

[00391] In some embodiments, the content listed in the Integrated Dashboard may be available to stream from an external service within the platform operating service application. The user is able to view details about the content and launch the content without leaving the application. The user can go between the application and the external content by using the software service buttons and continue to leverage platform operating service features.

[00392] The following is an example of a user journey: (1) the user opens the Integrated Dashboard by pressing the software service button, (2) the user navigates the Integrated Dashboard using the smart mobile game controller and finds a game on the Integrated Dashboard that is available on an external gameplay service, (3) the user presses a button on the smart mobile game controller to launch the game, (4) a browser is opened inside of the platform operating service application, (5) the user plays the game without having to leave the application, leveraging the same smart mobile game controller features as installed games (such as live streaming and recording of gameplay, including flashback recording, rich presence notifications, and controller-integrated real-time audio chat), and (6) the user presses the software service button to exit the game and return to the Integrated Dashboard.

[00393] Example Implementation

[00394] To determine what games are available to the user, the platform operating service application can surface a prompt to the user to authorize external gameplay services to share gameplay history with the platform operating service. If a user proceeds with the prompt, the platform operating service can then launch the user into the Account Linking functionality.

[00395] The platform operating service games database can store mappings between the streaming services’ identifiers and the platform operating service identifiers, which allows the service to identify which games from the user’s gameplay history are playable on the platform operating service, and then to surface curated metadata and imagery for playable games on the personalized dashboard. These games can be intermingled with games installed on the gameplay device to provide the user a seamless experience between games on external gameplay services and those available on the platform operating service. Further, the platform operating service can identify and include games the user played on the external gameplay platform but not on the platform operating service; this allows the platform operating service to prompt users to continue games they were previously playing elsewhere. Figure 62 is a screen shot that illustrates this embodiment.
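The identifier mapping and "continue playing" behavior described above can be sketched as follows. The function names, identifier formats, and sample data are hypothetical.

```python
# Hypothetical sketch: map external gameplay-service identifiers to platform
# identifiers, and suggest games played externally but not yet on the platform.
def playable_from_history(history_ids, id_mapping):
    # Unmapped external IDs are treated as not playable on the platform.
    return [id_mapping[eid] for eid in history_ids if eid in id_mapping]

def continue_playing_suggestions(history_ids, id_mapping, local_play_history):
    return [g for g in playable_from_history(history_ids, id_mapping)
            if g not in local_play_history]

mapping = {"ext-1": "g1", "ext-2": "g2"}
playable = playable_from_history(["ext-1", "ext-2", "ext-9"], mapping)
suggestions = continue_playing_suggestions(["ext-1", "ext-2"], mapping, {"g1"})
```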

[00396] The platform operating service can specify how to launch the game on the external gameplay service. In some embodiments, the client can be instructed to launch a browser to a specific resource on the external gameplay service. It is also possible to launch the computing device browser to view details about the game specified, or to launch an embedded experience showcasing games available on the external gameplay platform. An example of this is shown in the screen shots in Figures 63 and 64.

[00397] Users can also direct the platform operating service to open the browser to a specific destination, which allows the user to browse websites in full screen and launch games not provided on the personalized dashboard, or utilize external gameplay services not directly supported by the platform operating service. Figure 65 is an example implementation of the browser rendering an external gameplay service, in this instance, Xbox Cloud Gaming, and Figure 66 is an example implementation of the browser rendering the same external gameplay service with a touch-based exit button. The exit button could be tapped by the user to exit the content viewing experience or could indicate to the user that they can use the software service button to exit the experience.

[00398] The browser can be presented on top of the personalized dashboard interface. The platform operating service application can configure the browser to support inputs from the mobile game controller, so that there is no need for any platform-specific intervention for the game controller to support play.

[00399] The browser can allow users to return to the personalized dashboard by pressing the software service button, and then confirming their intention to exit (see Figure 67). This confirmation dialog can be provided because sessions on external gameplay services often take considerable time to initialize. Using the software service button, which can be otherwise unavailable for games on external gameplay services, enables the platform operating service application to support all standard mobile controller inputs in a full-screen immersive experience while still enabling the platform operating service application to suspend or end the experience when a user requests to do so.

[00400] Additional embodiments could include external computing devices streaming games into the application, games available within the platform operating service, or games available within a specialized browser or content rendering mechanism. This invention could also enable users to launch and/or play games renderable by the browser functionality that are included with or downloaded into the platform operating service application, as well as launch and/or play games within a browser application external to the platform operating service. Another embodiment could be launching into an external application for the purpose of enabling the user to play games. Additional embodiments can leverage other embedded content runtimes to stream games into the application.

[00401] Content Discovery

[00402] In some embodiments, the Integrated Dashboard can provide the ability for users to discover content personalized to them from the platform operating service. Analytics and other metadata are leveraged to surface desirable content. By using the mobile game controller, the user can navigate through the personalized content, view details on the content, and play/install content directly to their computing device.

[00403] The following is an example of a user journey: (1) the user opens the Integrated Dashboard by pressing the software service button, (2) the user navigates the integrated dashboard using the mobile game controller, (3) the user sees a curated and personalized list of games based on application analytics, (4) for native games, the user can download the game via a single button push using the mobile game controller, (5) for installed native and streamable games, the user launches the game with a single button push using the mobile game controller, and (6) the user is provided with additional options specific to content type and context.

[00404] Example Implementation

[00405] Game discovery in the platform operating service application depends on several components. The first component is the games database. When the platform operating service application is launched, the client requests and synchronizes data from the platform operating service games database. The games database is stored locally on the computing device in order to provide fast lookups and reduce user bandwidth requirements and usage.

[00406] The second component is the Integrated Dashboard. Another phase of the platform operating service application launch is to query the platform operating service for the updated Integrated Dashboard information. This information includes popular games as well as other information the user might find relevant based on analytic data and games detected on the computing device.

[00407] Using the combination of the information in the games database and collected user data, the user can be provided relevant contextual actions allowing them to install a new game, launch an existing game, view more information about a game, manage game visibility within the interface, and more.

[00408] In certain embodiments, the Integrated Dashboard can begin playback of rich media content without user intervention. For example, if a user does not change their selection of a specific tile, the Integrated Dashboard could begin playback of an associated video within the tile, or in another designated user interface element. Such an associated video could be supplied by the platform operating service cloud service, or retrieved from an alternative source, and could be played back using a content viewer embedded in the relevant user interface element. This could enable the user to engage with the content more easily, better educate the user about the value of a particular piece of content, or encourage the user to launch the content.

[00409] Additional embodiments could include games installed on other external computing devices and games favorited in the computing device’s web browsers (see Figure 68).

[00410] Content Search

[00411] In some embodiments, the platform operating service application can provide users a way to search for content installed on the computing device, content provided by external providers, or content that has been stream-enabled by external content providers. Users can use the mobile game controller or leverage touch controls to navigate the user interface. Upon finding a game, the user can view details about the game from the locally-stored version of the games database, and download or play the game.

[00412] The following is an example of a user journey: (1) the user navigates to the dashboard with the mobile game controller, (2) the user selects the search button by pressing a button on the mobile game controller, (3) the user is presented with a list of games, (4) the user presses the App Store button to see only games available in the App Store, (5) the user installs the game from the app store, and (6) the user launches the game.

[00413] Example Implementation

[00414] Game search is another feature that uses the games database stored on the phone. When the user presses the search button, a view is presented that contains all of the games known to the platform operating service application. The list can be searched by title, or the list can be filtered by selecting a platform from the list. The games database contains all of the information needed when the user selects a game from the list.
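The title search and platform filter described above can be sketched as a filter over the locally cached games database. The function name, record fields, and platform labels are illustrative assumptions.

```python
# Hypothetical sketch: search the locally cached games database by title
# substring and optionally filter by source platform.
def search_games(games, query="", platform=None):
    q = query.lower()
    return [g for g in games
            if q in g["title"].lower()
            and (platform is None or g["platform"] == platform)]

games = [
    {"title": "Sky Racer", "platform": "appstore"},
    {"title": "Asteroid Run", "platform": "xcloud"},
]
by_title = search_games(games, query="run")
by_platform = search_games(games, platform="appstore")
```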

[00415] The list of games known to the application is synchronized with the service during application launch. Refer to the games database section for details about application/service synchronization.

[00416] Additional embodiments could include searching through all associated computing devices where the platform operating service application is installed, and any content that can be detected as having been played before (see Figures 69 and 70). Depending on the SKU of the mobile game controller, the search can be tailored to prioritize items that work better with the mobile game controller or to adjust the software service buttons and software service indicators.

[00417] Account Linking

[00418] Account linking gives users the ability to play games without having to leave the platform operating service application to re-authenticate with external cloud gameplay services. Further, it enables the platform operating service application to detect, suggest, and launch games on external cloud gameplay services without additional user intervention.

[00419] The following is an example of a user journey: (1) the user opens the platform operating service application by pressing the software service button, (2) the personalized dashboard recognizes the user has not linked their account on an external platform and renders a prompt to link their account on the platform with their platform operating service account, (3) the user uses the smart mobile game controller to launch into the prompted account linking flow, (4) the user enters their external platform login credentials (on successful connection, the platform operating service application returns the user to the personalized dashboard), and (5) the user starts a cloud gameplay session without the need to enter their credentials.

[00420] Example Implementation

[00421] Account linking leverages the interoperability of several components: the gameplay device, the cloud service, the cloud gameplay platform servers, and the cloud gameplay platform authentication website. Figure 71 is an illustration of a platform operating service 2100 and external gameplay service 2110 of an embodiment. As shown in Figure 71, the platform operating service 2100 comprises a cloud service 2102 and a gameplay device 2104, and the external gameplay service 2110 comprises a cloud gameplay platform service 2112 and a cloud gameplay platform authentication website 2114.

[00422] In certain embodiments, Account Linking works as follows: (1) the platform operating service application requests authentication with an external service, (2) the platform operating service responds with a URL for the user to enter their credentials, (3) the platform operating service application renders the credential website in an embedded web browser, (4) the user enters their credentials and authorizes the platform operating service to access their information, and (5) a token representing this authorization is stored in the platform operating service, and the user’s authentication state is securely stored within the platform operating service application.
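The token-storage step of the flow above can be sketched as a simple per-service token registry. The class and method names are hypothetical; a real implementation would use secure storage rather than an in-memory dictionary.

```python
# Hypothetical sketch: store per-service authorization tokens so the user
# does not need to re-enter credentials for a linked gameplay service.
# In practice, tokens would live in secure storage, not a plain dict.
class LinkedAccounts:
    def __init__(self):
        self._tokens = {}  # service name -> authorization token

    def store(self, service, token):
        self._tokens[service] = token

    def is_linked(self, service):
        return service in self._tokens

    def token_for(self, service):
        return self._tokens.get(service)

accounts = LinkedAccounts()
accounts.store("example-cloud-service", "token-123")
```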

[00423] Because the platform operating service stores the authorization state and can recognize whether the user has already authenticated with the external cloud gameplay servers, the platform operating service leverages a dynamic button mapping capability to enter this account linking flow only when it is relevant to the user. Otherwise, the platform operating service can specify either to hide the prompt or instead to show a confirmation that the platform has been successfully linked.

[00424] This approach allows the platform operating service application to request authentication with an arbitrary number of services without additional client changes. Since the user’s authentication state for the cloud gameplay platform is stored within the platform operating service application, it can be retrieved on every game launch when launching the embedded web browser into a cloud streaming gameplay experience with that platform, so that the user does not need to reauthenticate.

[00425] Additional embodiments could include synchronizing any authentication state or authorization credentials from the platform operating service application to the cloud service, or storing all credentials and authorization locally on the gameplay device.

[00426] Content Discovery Through Notifications

[00427] In some embodiments, the platform operating service can surface notifications to users when other users of the platform are viewing/playing content. The notifications can provide a discovery mechanism for content within the platform operating service application. Users can use the mobile game controller or touch controls to open details of the game and launch/install the game on their computing device.

[00428] The following is an example user journey: (1) a friend of the user starts playing a game with a smart mobile game controller, (2) the user receives a notification of friend activity, (3) the user presses the software service button, (4) the integrated dashboard opens and leads the user to the game detail page, (5) the user downloads the game via a single button push, and (6) the user launches the game with a single button push.

[00429] Example Implementation

[00430] Figure 72 is a notification flow diagram of an embodiment. As shown in Figure 72, after “Gameplay Device 2” 2202 launches a game (act 2210), “Gameplay Device 2” 2202 sends a rich presence update to the service 1410 (act 2220). The service 1410 sends a request to push a notification to friends (act 2230) to the push notification provider 2200, which sends the push notification (act 2240). The user of “Gameplay Device 1” 2201 then presses the smart software button (act 2250), and an application launches to the friend’s game (act 2260). Then, the button is pressed to install the game (act 2270), and the button is pressed to launch the game (act 2280).

[00431] When a user starts playing a game, the platform operating service application sends a request to the platform operating service to update their presence. User presence contains rich context about which game they are playing. The platform operating service will then generate and send push notifications to friends letting them know that a friend of theirs is playing a game.
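The presence-to-notification fan-out described above can be sketched as follows. The function name, friendship map, and list-backed outbox are hypothetical stand-ins for the service's friendship store and push notification provider.

```python
# Hypothetical sketch: a presence update fans out push notifications to the
# playing user's friends via a push provider (modeled here as a list outbox).
def notify_friends(user, game_title, friendships, outbox):
    for friend in friendships.get(user, []):
        outbox.append({"to": friend, "body": f"{user} is playing {game_title}"})

outbox = []
notify_friends("alice", "Sky Racer", {"alice": ["bob", "carol"]}, outbox)
```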

[00432] The friend that receives the notification can interact with the push notification using the software service button. In this case, the software service button can direct the UI to open detailed information about the game that their friend launched. From there, the user can view more information about the game, install the game if they don’t already have it, or launch the game to join their friend.

[00433] Additional embodiments could include sending notifications when friends are playing games that could be played on the smart mobile game controller, whenever an associated friend is streaming, or sending notifications to users on other surfaces besides the computing device, such as a user’s television or smart appliance.

[00434] Social Play Through Content Tiles

[00435] In some embodiments, the platform operating service can surface on the Integrated Dashboard which friends of a user have played a certain game. As the user navigates the content grid of the Integrated Dashboard with the mobile game controller or touch controls, a specific content tile may display the number of friends that have viewed/played the content. Users can use the mobile game controller or touch controls to open details of the game and launch/install the game on their computing device.

[00436] The following is an example of a user journey: (1) the user opens the Integrated Dashboard by pressing the software service button, (2) the user navigates personalized content on the integrated dashboard using the smart mobile game controller, (3) the user sees a game which friends also play, (4) the user opens up the game details page, (5) the user downloads the game via a single button push, and (6) the user launches the game with a single button push.

[00437] Example Implementation

[00438] Users in the platform operating service application can search for friends and add them at any time. The list of friendships is then stored in the platform operating service. Analytics track events in the application. Each time a game is launched, the platform operating service application registers an analytic event with the platform operating service. The combination of these concepts is what powers Friends that Play. When the platform operating service application queries for the Integrated Dashboard tiles, the platform operating service queries for friendships. The server uses the list of friends to then query whether any friends have played a given tile. The list of friends who have played a game is then returned to the platform operating service application for presentation to the user. Figure 73 is a screen shot of a Friends that Play feature of an embodiment.
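The Friends that Play query described above can be sketched as an intersection of the user's friend list with the set of users whose launch events reference a given game. All names and event shapes are illustrative assumptions.

```python
# Hypothetical sketch of the "Friends that Play" query: intersect the user's
# friend list with the set of users who have launched a given game.
def friends_that_play(user, game_id, friendships, launch_events):
    friends = set(friendships.get(user, []))
    players = {e["user"] for e in launch_events if e["game_id"] == game_id}
    return sorted(friends & players)

events = [
    {"user": "bob", "game_id": "g1"},
    {"user": "dave", "game_id": "g1"},
    {"user": "carol", "game_id": "g2"},
]
result = friends_that_play("alice", "g1", {"alice": ["bob", "carol"]}, events)
```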

[00439] Searching for New Content

[00440] In some embodiments, the Integrated Dashboard can provide users the ability to search for new content both native and external to the computing device through the platform operating service. Users can use the mobile game controller or touch controls to type search terms to find content. Upon finding any content, the user can use the mobile game controller or touch controls to open details of the content and launch/install the content on their computing device.

[00441] The following is an example user journey: (1) the user plugs the computing device into the mobile game controller, (2) the user presses the software service button, (3) in the integrated dashboard, there is a clearly visible search button, (4) the user enters text into the search prompt, (5) the user finds the game and opens its detail page, (6) the user downloads the game via a single button push, and (7) the user launches the game with a single button push.

[00442] Example Implementation

[00443] Searching for a new game can leverage the games database on the computing device. The platform operating service application keeps the local version of the games database up to date each time the platform operating service application is launched by requesting the most recent information from the platform operating service. Each game in the database contains information that allows the platform operating service application to install or launch a game. When the user opens the search view, the list of games is rendered. The user can use the mobile game controller to scroll through the list or select one of the filters to only show games from a particular source.

[00444] Additional embodiments could include searching the games database for games that could be played with any associated computing device and device operating system, searching through external cloud streaming providers, or searching the list directly from the cloud service. Another embodiment could include searching a custom list provided locally or from a platform operating service without caching the results each time. The content surfaced can be of any form, even if it is not controller supported. Figure 74 is a screen shot of an example game search view.

[00445] Per-Product Customization

[00446] When the mobile game controller is plugged into the computing device, the unique product identifiers are registered with the platform operating service. This, in turn, adjusts the suitable content which should be shown in the dashboard. For example, in the case of a product designed for a particular cloud gaming service, the content delivery system will prioritize games from that service to help get the user started.

[00447] In addition, the product version also affects the button symbols and hints used throughout the platform operating service application. So, if a version of the smart mobile game controller implemented a different layout of its face buttons (for example, a layout other than ABXY), the buttons in the app would dynamically update to reflect the currently attached controller. Lastly, the smart mobile game controller implements a custom vendor string to enable third party apps and games to detect the controller and adapt their own UI as necessary.

[00448] Additional embodiments could include changing software service indicators to match indicators used while playing different cloud streaming games, or to match external content indicators (e.g., red lights on external game controllers).

[00449] In-App Button Functionality

[00450] Additional software service buttons on the mobile game controller can be designed to enable unique platform operating service functionality. The implementation and usages are described below.

[00451] Example Software Service Button

[00452] The software service button on the smart mobile game controller provides a high degree of flexibility, and its function is contextual, based on the state of the system. When the platform operating service application is not active, the button can enable launching the app into the foreground. The process of launching the platform operating service triggers synchronization between the system and application to update the dashboard with any game context changes and provide updated play history with contextual actions. The service is then able to retain and utilize the contextual information on subsequent launches to drive additional personalization of the Integrated Dashboard content.

[00453] When the software service button is pressed while the platform operating service is active, the platform operating service decides the action based on the state of the Integrated Dashboard. In the case where there is no interactive gaming session within the Integrated Dashboard, pressing the button switches back to the previous gaming context (e.g., previous entry on the task stack). However, when in an interactive gaming session, the button can instead act as a signal to manage the session. Usually this is to invoke some kind of system-level menu.

[00454] Lastly, some games have their own concept of a home button, and menu functionality around the button. The software service button on the smart mobile game controller also implements a secondary gesture, where holding the button longer maps to the game or service’s home button functionality, and shorter presses map to the platform operating service application. This allows for short presses to do the smart Integrated Dashboard behavior, and long presses to interact with the game/service.
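The contextual behaviors of the software service button described in the preceding paragraphs can be summarized as a small decision table; the action names below are hypothetical labels for the behaviors described above, not actual API identifiers:

```python
# Hypothetical decision table for the software service button: the chosen
# action depends on whether the press is long or short, whether the
# platform operating service application is active, and whether an
# interactive gaming session is in progress.

def service_button_action(app_active, in_gaming_session, long_press):
    if long_press:
        return "game_home_menu"      # secondary gesture: game/service home
    if not app_active:
        return "launch_dashboard"    # bring the application to the foreground
    if in_gaming_session:
        return "session_menu"        # invoke a system-level session menu
    return "previous_context"        # switch back to the previous gaming context
```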

[00455] Example Software Service Button - Capture Button

[00456] The capture button is another software service button on the smart mobile game controller. Its function is generally focused on audio/video capture but, similar to the other software service button, its behavior is contextual based on the state of the system. The capture button can be designed to support three primary gestures: (1) single press, (2) double press (i.e., two presses in quick succession), and (3) press and hold. The nominal capture button behavior is to start/stop recording with a single press, and take a screenshot with press and hold. The user can then change the behavior of the capture system as they desire. For example, the user may opt to enable Smart Recording, where the capture service continuously records gameplay into a short circular buffer and produces video clips on demand. In this case, short press would enable/disable the feature, double press would create a clip, and press and hold would take a screenshot.
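The Smart Recording circular buffer described above can be sketched as follows; the buffer length and frame representation are illustrative assumptions:

```python
from collections import deque

# Sketch of the Smart Recording feature: gameplay frames are continuously
# written into a short, fixed-length circular buffer, and a clip is
# produced on demand from whatever the buffer currently holds.

class SmartRecorder:
    def __init__(self, max_frames=300):
        # A bounded deque drops the oldest frame once the buffer is full.
        self.buffer = deque(maxlen=max_frames)

    def on_frame(self, frame):
        self.buffer.append(frame)

    def create_clip(self):
        # Snapshot the most recent gameplay as a clip.
        return list(self.buffer)
```

A bounded deque gives the "circular" behavior for free: once full, each new frame displaces the oldest one, so the clip always covers the most recent window of gameplay.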

[00457] The capture button is not limited to clip recording. Through the integrated dashboard, a user can also take advantage of screensharing and live streaming features. With the architecture of the capture service allowing multiple destinations, it is possible to configure the button gestures to control multiple recording functions. For example, while live streaming, the gestures could be mapped to: single press (start/stop clip recording), double press (save a clip), and press and hold (end live stream). This setup would allow the user to simultaneously record clips while also streaming their gameplay to a live streaming service. Screensharing is very similar, and so any remote broadcasting feature could easily be plugged into one of the capture gestures.
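The configurable gesture mappings described above can be sketched as simple lookup tables; the gesture and action names are hypothetical:

```python
# Hypothetical gesture-to-action mappings for the capture button. The
# default mapping and the live-streaming mapping follow the examples
# described above; names are illustrative only.

CAPTURE_GESTURES_DEFAULT = {
    "single_press": "toggle_recording",
    "press_and_hold": "take_screenshot",
}

CAPTURE_GESTURES_LIVESTREAM = {
    "single_press": "toggle_clip_recording",
    "double_press": "save_clip",
    "press_and_hold": "end_live_stream",
}

def capture_action(gesture, mapping):
    # Gestures without an assigned action are ignored in this sketch.
    return mapping.get(gesture, "ignored")
```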

[00458] Contextual Interaction with Notifications

[00459] Another aspect of the platform operating service is that it can have the ability to surface notifications to the user. If the user presses the software service button while a notification is active on screen, the platform operating service application can combine the press with the notification context, to bring up the relevant content in the dashboard or execute the relevant action. For example, if the user is notified that another user on the platform invited them to participate in a party voice chat, the user can press the software service button to quickly join the chat.

[00460] Similarly, when a new video clip is recorded, a notification will appear. The user could use the software service button to immediately view or edit the clip. In this case, the smart service buttons work together to allow for intuitive control of the system using just the controller.

[00461] The implementation of the contextual interaction with notifications is agnostic of whether the device operating system supports button interaction with notifications, or whether the platform operating service (“the app”) is even running on the device when the software service button is pressed. This is achieved by checking, on application launch, what the last relevant displayed notification was, and executing the appropriate action. Therefore, as long as it is possible for the device operating system to support an application launch (i.e., of the platform operating service), this behavior is possible.
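The launch-time notification check described above can be sketched as follows; the notification kinds and resulting actions are illustrative assumptions:

```python
# Sketch of the launch-time check: on application launch, the last
# relevant displayed notification determines which action to execute.
# Notification kinds and action names are hypothetical.

NOTIFICATION_ACTIONS = {
    "party_invite": "join_voice_chat",
    "clip_recorded": "open_clip_editor",
}

def action_on_launch(last_notification):
    if last_notification is None:
        return "open_dashboard"   # no pending notification context
    return NOTIFICATION_ACTIONS.get(last_notification, "open_dashboard")
```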

[00462] Mute

[00463] The options software service button can also be used while in a party or a room. If there is an active voice session, double tapping this button will mute the user if unmuted, or unmute them if muted.

[00464] Customization

[00465] The software service buttons can also be customized. For instance, the buttons could be mapped to start or join a party, send messages or invoke other functionality in the platform operating service.

[00466] Setup and Onboarding

[00467] One user onboarding system is designed both to ensure users are fully educated on the functionalities of the system and to collect the necessary information and system permissions to allow them to use social networking services properly.

[00468] When users first launch the app, if they do not connect the accessory, they are presented with a screen explaining the benefits of purchasing the accessory via a product video, as well as a call to action to purchase the accessory via an e-commerce portal (see the Welcome Screen in Figure 75). Once the application detects that the accessory is connected, it transitions to the screen shot shown in Figure 76. The unit on the left in that screen shot is a three-dimensional render of the accessory, which animates into view and then repeatedly swivels and bounces. This animation occurs by stitching the starting animation with a video that can be looped, creating a seamless infinite looping transition. Both videos are encoded using HEVC to enable transparency.

[00469] Users can authenticate themselves into the service using one of two identity providers: Sign in with Apple, or Google Authentication. When using Sign in with Apple, authentication can take place by validating the user-provided authorization code against the provider’s server, which then returns claims about the user’s identity that are used to prefill the user’s profile information. When using Google’s authentication flow, the application instead provides the service an identity claim (in the form of a JSON Web Token) from the identity provider, which the service then verifies using the identity provider’s RSA public key.

[00470] In both cases, once the user’s identity is verified, the service issues a persistent access and refresh token in accordance with OAuth 2.0.
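The identity-claim handling described above can be sketched structurally as follows. This sketch only decodes the two JSON segments of a JSON Web Token; a real service must additionally verify the token's RSA signature against the identity provider's published public key before trusting any claims:

```python
import base64
import json

# Structural sketch of handling an identity claim in JSON Web Token form.
# A JWT is three base64url segments separated by dots: header, payload,
# and signature. Signature verification is deliberately omitted here and
# must be performed by a real service before the claims are trusted.

def decode_jwt_unverified(token):
    header_b64, payload_b64, _signature = token.split(".")

    def b64url_decode(segment):
        # JWT segments use unpadded base64url; restore padding first.
        return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))

    return (json.loads(b64url_decode(header_b64)),
            json.loads(b64url_decode(payload_b64)))
```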

[00471] User Phone Number Verification

[00472] One onboarding process verifies the user has a phone number, and the screen shot in Figure 77 allows the user to input a user name.

[00473] Cloud Gaming Education

[00474] In order to educate the user that the platform operating service and controller allow for frictionless gameplay on mobile devices, it may surface to the user the fact that they can interact with cloud gaming/streaming services with inputs from the mobile gaming controller, and from the platform operating service. It may also educate the user that use of these services does not require the presence of a non-mobile device, such as a specialized gaming console (e.g., an Xbox) or a personal desktop/laptop, as shown in the screen shot in Figure 78.

[00475] Subscription Education

[00476] Users may have difficulty in discovering and accessing features provided by the platform operating service, complicating their ability to receive benefit and value from the service. This can be especially important when the platform operating service may be an embedded experience within the device operating system. To ensure users can find and understand the value of the platform operating service, regardless of whether they have access to all or some aspects of the platform operating service, the software service button can be utilized to facilitate easy access to some interfaces of the platform operating service, including functionality that educates and surfaces to users the features the platform operating service provides.

[00477] Example Implementation

[00478] In some embodiments, some aspects of the platform operating service may require additional purchase or purchases, such as a subscription. Users may be educated about the benefits of these aspects from within the platform operating service application and/or prompted to purchase or subscribe to such features (see Figures 79-83).

[00479] In certain embodiments, launching the platform operating service application could show user interfaces designed to educate users about the benefits of the platform operating service, either immediately or after the user interacts with the platform operating service for a period of time. Users can be provided with easy access to these explanations, for example, by pressing the software service button or launching the platform operating service application through the device operating system. This can simplify the user’s access to learning about the benefits of the platform operating service, which could otherwise only be accessible through other launch interfaces, such as the default launch functionality included in the device operating system.

[00480] If the user proceeds with the purchase or subscription, the software service button can be pressed to open an alternative screen, such as the Integrated Dashboard, or can show a user-configurable interface on press. This can allow the user to access the benefits of the platform operating service, especially the aspects thereof that may require additional purchase, without needing to use other, potentially more difficult to use, interfaces provided by, for example, the device operating system.

[00481] In other embodiments, the application can show these interfaces during the user setup process as well. This can ensure the user is given multiple opportunities to understand the benefits of the platform operating service.

[00482] Mobile Game Controller Physical Accessory Integration

[00483] An important aspect of the system experience is ensuring broad mobile device, mobile device protective case, and/or other physical accessory compatibility, namely enabling/improving fitment with mobile devices that have distinct dimensions, interfaces and device attributes as well as allowing the product to be used with a mobile device both coupled with and without protective cases.

[00484] In the past, the physical fitment of the phone in the mobile game controller has not been a key consideration of the user experience, leading to device obsolescence given the rapid cadence of mobile device development, as well as user confusion. From an experience point of view, fitment can mean more than just baseline compatibility; fitment is especially important because, in order for the system to feel like a dedicated gaming host device, the mobile device and the mobile game controller are ideally rigidly coupled in one or more primary axes so that there is minimal flexion or play of the mobile device within the mobile game controller, so as to enable immersive gameplay and provide a premium feel typically embodied in consumer electronics through stiff enclosures. In embodiments that involve some type of physical constraint (e.g., docking with a Lightning interface or USB-C) or a magnetic attach solution, the phone may tilt or translate undesirably, stressing the connector or magnetic attachment module. Additional risks include insufficient magnet strength due to tilt or translation misalignment due to the use of an incompatible phone case or an incompatible phone lacking a magnetic attach subsystem.

[00485] In some prior art relating to mobile device-based VR, the user friction of having to remove a mobile device case in order to use mobile device-based VR products has had a harmful impact on long-term user retention.

[00486] The following proposes a unique solution, specific to a mobile game controller system, to allow mobile game controller devices to improve fitment and physical interface compatibility with a multitude of mobile devices and mobile device cases or accessories by taking advantage of the integrated platform operating service and platform cloud service.

[00487] In this system, during the product or app setup flow, or when a mobile device is connected to the mobile game controller, the mobile device/tablet model, product SKU number, and device sensor data (e.g., accelerometer and camera data) from the mobile device as well as the mobile game controller can be sent to and processed by the platform operating service to trigger a specific on-screen interaction within the application intended to suggest ways the user can improve the physical fitment and coupling of devices by adjusting the phone or to intelligently recommend a compatible accessory. The recommended accessory can be an adapter used to enable or improve fitment or a protective mobile device case specific to a mobile device model that can be kept on the mobile device while the mobile game controller device is in use. Further, the on-screen interface can allow the user to then checkout and place an order for the relevant accessory if they do not already have it with the recommended accessory SKUs pre-populated based on the aforementioned telemetry (e.g., phone model, device SKU, accelerometer data). If the user’s account information is known, the accessory checkout experience can also be delivered via email, text message, or through in-app means.
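The pre-populated accessory checkout described above can be sketched as follows; the lookup table, SKU strings, and checkout URL are hypothetical, not actual product data:

```python
from urllib.parse import urlencode

# Hypothetical sketch of pre-populating an accessory checkout link from
# reported telemetry (phone model and controller SKU). The compatibility
# table and base URL are illustrative assumptions.

ADAPTER_TABLE = {
    ("phone-x", "ctrl-1"): "adapter-a",
}

def checkout_url(phone_model, controller_sku,
                 base="https://example.com/checkout"):
    sku = ADAPTER_TABLE.get((phone_model, controller_sku))
    if sku is None:
        return None                      # no compatible adapter known
    return base + "?" + urlencode({"sku": sku})
```

Encoding the recommended SKU directly in the URL is what allows the single-action purchase flow described above, whether delivered in-app, by email, or by text message.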

[00488] In certain embodiments, when a user connects a mobile device p that does not meet the product’s fitment criteria to a mobile game controller device with SKU s, the platform operating service can request that the application provide on screen cues to suggest an accessory or physical mobile device placement that enables or enhances fitment. This prevents the user from having to make an independent, subjective judgment call about the perceived quality of the fitment, or rely solely on physically printed instructions that may lead to ambiguity or confusion.

[00489] For example, if an iPhone or Android phone is undesirably tilted, translated and/or misaligned in the product, causing shear or axial force on a Lightning, USB-C or other physical connector, the application can leverage mobile accelerometer data to provide on screen cues that indicate the mobile device is not level and can suggest using an alternate accessory that allows the phone to sit closer to level. In one embodiment, adapter inserts can be used in order to allow for phones of various sizes to fit into the smart mobile game controller device.
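The accelerometer-based levelness check described above can be sketched as follows; the choice of the z axis as the expected gravity direction and the 5-degree threshold are illustrative assumptions:

```python
import math

# Hypothetical tilt check using mobile accelerometer data: when the phone
# sits level in the controller, the measured gravity vector should align
# with one expected axis (taken here as z). The threshold is illustrative.

def tilt_degrees(ax, ay, az):
    mag = math.sqrt(ax * ax + ay * ay + az * az)
    cos_angle = max(-1.0, min(1.0, az / mag))   # clamp for float safety
    return math.degrees(math.acos(cos_angle))

def fitment_ok(ax, ay, az, max_tilt_deg=5.0):
    return tilt_degrees(ax, ay, az) <= max_tilt_deg
```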

[00490] Further, in the event that the phone screen, phone camera systems, ports, speakers, or microphones are occluded due to the nature of the fitment of the mobile device p, the corresponding sensor data can be sent to the server, or audio test tones can be played and sent to the server. The server and application can interpret this data and can then inform the user via on screen cues that there is a fitment issue and suggest an adapter or accessory that would address it.

[00491] In certain embodiments, when such a mobile device p is connected, another possible concern is that the adapter or other necessary accessory is not coupled with the device at point of sale or is otherwise difficult to obtain. In some embodiments, the application can intelligently recommend a specific adapter or accessory SKU s’ based on the modes of data described above and further provide a checkout flow to allow the user to purchase the relevant accessory. For example, suppose a new iPhone model is released after the introduction of a mobile game controller and is physically incompatible without the usage of an adapter. Alternatively, suppose that the user connects a phone for which they do not have an adapter that would enable or improve fitment. Leveraging the modes of data described above, the correct adapter can automatically be selected and encoded in the checkout URL, allowing the user to purchase in the app or via the web through the mobile browser. The user can then, in a single action, order an adapter SKU s’ that allows a mobile game controller device SKU s to have optimal fitment with a specific mobile device model p. This idea can be pursued to provide a scalable fitment solution across a large sample space of potential phone models, mobile game controllers and accessories.
In one embodiment, physical adapters can be automatically manufactured as needed through 3D printing, additive manufacturing or subtractive manufacturing and sent to the user, possibly through automated ordering and delivery systems. In another embodiment, already stockpiled adapters can be automatically ordered and shipped to the user (“drop shipped”) directly from the manufacturer or warehouse, eliminating the need for human customer service agents.

[00492] In some embodiments, prior user analytics from the platform operating service can be used to determine if a user owns a mobile game controller SKU s. In this case, the user can then be prompted as soon as it is detected that the user has installed the application on the mobile device p. The converse is also true; which is to say, the modes of data described above do not have to be processed synchronously in order to recommend an adapter accessory.

[00493] In some embodiments, the user’s subscription status or the distribution channel through which the device was purchased can inform the price and distribution method of the accessory. For example, a user may purchase a mobile game controller device as described here that comes with a compatible phone case; however, the compatible phone case may not be offered at point of sale due to the large number of potential phone models and cases. Rather, when the user sets up the device or connects their phone, using the combination of the SKU s, mobile device p, and/or the user account data, the server and application can pre-populate a checkout flow for a mobile device case c compatible with both mobile game controller SKU s and detected mobile device p. Therefore, the user can have a method that allows them to purchase or redeem the mobile device case in a single action. The checkout link can also be a function of the channel through which the device was originally purchased.

[00494] In some embodiments, the same implementation can also apply to recommend accessories for mobile game controllers or mobile devices such as carrying cases, battery packs, and more.

[00495] Figures 84-86 are screen shots that relate to the above embodiments.

[00496] Community and Social

[00497] In the past, many rich social experiences around gameplay were based around users’ consoles and computing devices thought to be powerful enough for rich gameplay. The following describes a set of systems that ensure users can have a rich social experience on a much wider range of computing devices and operating services.

[00498] Real-Time Sharing of Activity State Performed on the Gameplay Device

[00499] In some embodiments, the activities performed by users on the gameplay device can be shared with other users, for example, with the user’s friends on a social network, and this activity can be shared in real-time. Users can see various presence states for other users, for example, what games their friends are playing, whether they are currently in an audio chat experience, whether they are leveraging livestreaming or screen sharing functionalities, and whether they are available to engage in synchronous gameplay experiences. Users can be notified by the computing device about changes to this status and can be prompted to engage in social experiences with other users (see Figure 87).

[00500] The following is an example of a user journey. First, the user performs one of a number of actions that can trigger a change to their activity status. This can include, but is not limited to, connecting a mobile gameplay controller to their computing device, launching the platform operating service application, launching a game from the integrated dashboard, entering or leaving an audio chat, beginning or ending sharing their screen in an audio chat, beginning or ending a live stream, and entering a period of inactivity by not providing input to the game controller for a period of time. Second, other users receive notifications about changes to the user’s activity status. Users viewing the platform operating service can see the user's activity status update in real time.

[00501] Example Implementation

[00502] In certain embodiments, the platform operating service application can observe a number of signals that represent different aspects of user activity, including, but not limited to, whether the application is active, whether the mobile game controller is connected to the computing device, the time of the last input the user made to the mobile game controller, whether the user is engaging in an audio chat, and if so, which one, the content the platform operating service has detected, whether the platform operating service application believes the user is playing any content, and whether the user is sharing a live stream, or sharing their screen through an audio chat. On update of any of these signals, the application can transmit the state change to the cloud service.

[00503] In certain embodiments, the cloud service can receive the transmitted state change and update its understanding of the user’s status. The cloud service can aggregate the state of multiple gameplay devices belonging to the same user to present a complete understanding of the user’s status, and can transmit this understanding to other gameplay devices in real time. Applications can use the platform operating service server’s transmitted understanding to render the state information into a human-readable presentation of the user’s status. Further, the platform operating service can send notifications to the user through a number of different media to prompt users to engage with the user whose state changed, or other related users; for example, the platform operating service can respond to a user who becomes active and starts playing a game by prompting other users to play the same game, or the system can respond to a user who starts broadcasting a live stream by prompting other users to view the live stream.
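The aggregation of per-device signals into a single user status, as described above, can be sketched as follows; the signal names, status labels, and precedence order are hypothetical:

```python
# Sketch of the cloud service aggregating per-device activity signals
# (one dict per gameplay device belonging to the user) into a single
# human-readable status. Names and precedence are illustrative.

def aggregate_status(device_states):
    if any(s.get("streaming") for s in device_states):
        return "live_streaming"
    if any(s.get("in_voice_chat") for s in device_states):
        return "in_voice_chat"
    if any(s.get("playing_game") for s in device_states):
        return "playing"
    if any(s.get("app_active") for s in device_states):
        return "online"
    return "offline"
```

Ordering the checks from most to least specific ensures that the richest activity (e.g., live streaming) wins when a user is active on several devices at once.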

[00504] As shown in the flow diagram in Figure 88, when User A launches Game A (act 2320), User A’s application 2300 sends a notification to the cloud service 2010 (act 2330). The cloud service 2010 then notifies User B’s application 2310 (act 2340), which notifies User B (act 2350). User B then taps on the notification to view User A’s status (act 2360). In response, User B’s application 2310 sends a request for User A’s status to the cloud service 2010 (act 2370). The cloud service 2010 responds by informing User B’s application 2310 that User A is active and is playing Game A (act 2380). User B’s application 2310 then sends a notification to User B (act 2390).

[00505] Figure 89 is a screen shot showing an example of a human-readable presence indicator.

[00506] In other embodiments, the platform operating service application can observe different sets of signals, or transmit state information on a different schedule, for example, on a fixed interval to the platform operating service. Similarly, other embodiments could include the platform operating service transmitting the state change in other intervals, or only transmitting its understanding of user state upon request. Further, other embodiments could include the transmission of user activity state between computing devices directly, without the intervention of the platform operating service.

[00507] In other embodiments, the platform operating service can receive updated state information from sources besides the platform operating service application. For example, the platform operating service can receive information from external gameplay platforms about the user’s activity status and/or what games a user is currently playing or has played recently. Such an embodiment could be enabled by leveraging the Account Linking functionality mentioned above.

[00508] As noted above, this description discloses, among other things, various example systems, methods, apparatus, and articles of manufacture including, among other components, firmware and/or software executed on hardware. It is also understood that such examples are merely illustrative and should not be considered as limiting.

[00509] Attaching a mobile game controller to a computing device brings to life an entirely new gaming player device and gaming environment. A combination of a user input device, platform operating service, cloud service, screen, and mobile operating system allows the system to take advantage of a plethora of synergies and offers entirely new experiences. The embodiments described herein can provide a sophisticated and highly-extensible gaming experience that is vastly more than the sum of the parts.

[00510] II. System and Method for Rich Content Browsing Multitasking on Device Operating Systems with Multitasking Limitations

[00511] Introduction

[00512] In one embodiment, a contextually-aware platform service switcher is provided. In another embodiment, a system and method for automatic content capability detection are provided. In yet another embodiment, a system and method for rich content browsing multitasking on device operating systems with multitasking limitations are provided. These embodiments can be used alone or in combination. Other embodiments are provided. It is important to note that any of these embodiments can be used alone or in combination, and details of these embodiments should not be read into the claims unless expressly recited therein.

[00513] Overview of an Exemplary Computing Environment

[00514] Turning now to the drawings, Figure 90 is an illustration of a computing environment of an embodiment. As shown in Figure 90, this environment comprises a user controller 3100, a computing device 3200, and a remote device 3300. The user controller 3100 and computing device 3200 are in communication with each other via respective wired or wireless interfaces 3108, 3208. Likewise, the computing device 3200 and the remote device 3300 are in communication with each other via wired or wireless interfaces 3209, 3308. As used herein, “in communication with” can mean in direct communication with or in indirect communication with via one or more components, which may or may not be mentioned herein. For example, in the embodiment shown in Figure 90, the computing device 3200 and the remote device 3300 are in communication with each other via a network 3250 (e.g., the Internet, a local area network, a peer-to-peer wireless mesh, etc.). However, in other embodiments, the computing device 3200 and the remote device 3300 can communicate with each other in the absence of a network. Also, as used herein, the remote device 3300 is “remote” in the sense that it is physically separate from the computing device 3200 in some fashion. In many implementations, the physical distance is relatively great, such as when the remote device 3300 is located in another town, state, or country. In other implementations, the physical distance may be relatively short, such as when the remote device 3300 is in the same room or building as the computing device 3200. Also, the term “remote device” can refer to a single remote device or multiple remote devices.

[00515] As shown in Figure 90, in this embodiment, the controller 3100 comprises one or more processors 3102, a memory 3104, and one or more user input devices 3106. The user input devices 3106 can take any suitable form, such as, but not limited to, a button, a joystick, a switch, a knob, a touch-sensitive screen/pad, a microphone for audio input (e.g., to capture a voice command or sound), a camera for video input (e.g., to capture a hand or facial gesture), etc. To be clear, as used herein, a “user input device” refers to a control surface and not to the entire system or parent device on which user input devices are placed.

[00516] Generally speaking, the controller 3100 can be used by a user in the selection and (passive or active) consumption of content (e.g., playing a game, watching a video, listening to audio, reading text, navigating a displayed user interface, etc.) presented using the computing device 3200 in some fashion. The controller 3100 may be referred to based on the content with which it is being used. For example, the controller 3100 can be referred to as a game controller when it is being used to play a game. And if the controller 3100 is being used to play a game on a mobile device, such as a phone or tablet (as opposed to a relatively-stationary game console), the controller 3100 can be referred to as a mobile game controller. However, the same controller 3100 may also be used to control the playback of nongame content, such as video or audio. Accordingly, a specific use should not be read into the term “controller” unless expressly stated.

[00517] The computing device 3200 can also take any suitable form, such as, but not limited to, a mobile device (e.g., a phone, tablet, laptop, watch, eyewear, headset, etc.) or a relatively more-stationary device (e.g., a desktop computer, a set-top box, a gaming console, etc.). In the embodiment shown in Figure 90, the computing device 3200 comprises one or more processors 3202 and a memory 3204. In this particular embodiment, the memory 3204 stores computer-readable program code for an operating system (O/S) 3210 (e.g., iOS or Android), native content 3220, and an application configured for use with the controller 3100 (“controller app”) 3240. This application 3240 will sometimes be referred to herein as the client platform operating service or system. Exemplary functions of this application 3240 will be described herein. Also, as used herein, “native content” refers to content that is at least partially stored in the computing device 3200. For example, native content can be wholly stored on the computing device 3200; or native content can be stored partially on the computing device 3200 and partially on one or more remote devices 3300 or some other device or set of devices.

[00518] The remote device 3300 also comprises one or more processors 3302 and memory units 3304 storing remote content 3320 and an application (“remote app”) 3340 (which is sometimes referred to herein as the remote platform operating service or system) that can be used to communicate with the controller app 3240 or another entity on the computing device 3200.

[00519] It should be understood that more or fewer components than those shown in Figure 90 can be used. For example, the computing device 3200 can have one or more user input device(s) (e.g., a touchscreen, buttons, switches, etc.), as well as a display (e.g., integrated with a touchscreen). Further, while the components in the controller 3100, computing device 3200, and remote device 3300 are all shown in respective single boxes in Figure 90, implying integration in respective single devices, it should be understood that the components can be located in multiple devices. For example, the processor 3302 and memory 3304 in the remote device 3300 can be distributed over multiple devices, such as when the processor 3302 is a server and the memory 3304 is a remote storage unit. As used herein, the remote device 3300 can also refer to multiple remote devices that are in communication with the computing device 3200. Other variations for any of the devices 3100, 3200, 3300 are possible.

[00520] Finally, the memory 3104, 3204, 3304 in these various devices 3100, 3200, 3300 can take any suitable form and will sometimes be referred to herein as a non-transitory computer-readable storage medium. The memory can store computer- readable program code having instructions that, when executed by one or more processors, cause the one or more processors to perform certain functions.

[00521] Exemplary Game Controller Implementation

[00522] As mentioned above, the controller 3100, computing device 3200, and remote device 3300 can take any suitable form. For purposes of describing one particular implementation of an embodiment, the controller 3100 in this example takes the form of a handheld game controller, the computing device 3200 takes the form of a mobile phone or tablet, and the remote device 3300 takes the form of a cloud gaming system. An example of this is shown in Figures 1 and 9, which were discussed above. Again, this is just one example, and other implementations can be used. Further, as mentioned above, a game is just one example of content that can be consumed, and the controller 3100 can be used with other types of content (e.g., video, audio, text). So, the details presented herein should not be read into the claims unless expressly recited therein.

[00523] As discussed above, Figure 1 shows an example handheld game controller and mobile phone of an embodiment. This game controller has a number of user input devices, such as joysticks, buttons, and toggle switches. In this example, the game controller takes the form of a retractable device, which, when in an extended position, is able to accept the mobile phone. A male communication plug on the controller mates with a female communication port on the computing device to place the controller and computing device in communication with one another. The controller in this embodiment also has a pass-through charging port that allows the computing device to have its battery charged, as well as a headphone jack. In other embodiments, the controller can connect to the computing device through other means, such as pairing wirelessly to the phone. Again, this is just an example, and other types of controllers can be used, such as those that do not fit around a mobile device.

[00524] As shown in Figure 9, in this embodiment, the controller can be used to play a game that is locally stored on the computing device (a “native game”) or a game that is playable via a network on a cloud gaming service. In this example embodiment of remote gameplay, based on input from the game controller, the computing device sends signals to the cloud gaming service and receives display data back. In one embodiment, a browser on the computing device is used to send and receive the signals to stream the game to the user. There can be multiple variants of remote game play. One embodiment includes a host device (e.g., a game console, PC, or other computing device) that is not actively being controlled, whose content can be streamed to the active computing device (e.g., a smartphone) that a user can access remotely. Another embodiment includes a cloud gaming service (which can be streamed from a data center), such as Xbox Game Pass, Amazon Luna, or another service, that can be streamed to the active computing device.

[00525] In one embodiment, the controller app 3240 can facilitate the selection of a game (or other content). For example, the controller app 3240 can display a user interface (e.g., on a display of the computing device 3200 or on another display). The controller app 3240 can also receive user input from the controller 3100 to navigate and engage with content, for example, browse for, select, and launch a game from a displayed list of games. In this example, once the game is launched, input from the game controller 3100 can be provided directly to the game or indirectly to the game through the controller app 3240. As will be discussed in more detail below, the controller app 3240 can enhance the standard experience offered on a computing device by extending functionality and providing enhanced interface capabilities in addition to the inherent interface of the computing device itself. For example, in some embodiments, the controller app 3240 assigns a function to one or more of the user input devices on the controller 3100 based on the particular content being consumed. As will be discussed in more detail below, such assignment can be done by sending mapping information 3105 (see Figure 90) to be stored in the memory 3104 of the controller 3100. In another embodiment, this assignment can also be achieved through a dynamic interface within the service 3240 that intercepts and maps controller 3100 input before propagating the input to the service 3240.

[00526] In one embodiment, the controller 3100 is used in a cloud gaming environment. The following paragraphs provide a discussion of such an environment. It is important to note that this is merely an example and that the details discussed herein should not be read into the claims unless expressly recited therein.

[00527] With the advent of cloud game streaming on the iOS platform in 2021, users could play games like Halo: Master Chief Collection on their mobile device through services like Xbox Game Pass and the now-defunct Google Stadia for the first time. But the user experience was plagued with issues: users would have to pin a browser page for each service they were interested in onto their iOS home screen and, much to their confusion, would have to further configure each pinned browser to work with their controller. The lack of discoverability of this feature, in conjunction with the difficulty in getting it to work, resulted in relatively-low adoption for cloud gaming on mobile in general. As a result, prior gaming controllers did not initially offer any direct support for cloud gaming. Users could launch and play native mobile games through the associated controller app, but instructions, user education, and features that enabled cloud gaming, arguably the most-disruptive technology in all of gaming within the last decade, did not exist. The main issue was that it was difficult to explain to users how to go about setting up cloud gaming due to the sheer amount of user-facing complexity.

[00528] To address these issues and make cloud gaming far more accessible to a wider array of players, one embodiment embeds a content browser within the controller app 3240 and pipes inputs from the controller 3100 through the controller app 3240 into the content browser. With this embodiment, users can now have the benefits of both native gaming experiences and streamed gaming experiences all inside the controller app 3240.

[00529] This approach presents several advantages. For example, because cloud games are often much more complex than their native mobile counterparts, users can be gradually onboarded onto cloud gaming after they have gotten used to playing simpler native games locally on the computing device 3200. This allows cloud gaming to be a user’s second experience. As another example, as will be discussed in more detail below, a user input device on the controller 3100 can allow users to alternate between multiple cloud gaming services (e.g., Google Stadia and Amazon Luna) within the controller app 3240, which enables much richer experiences, as opposed to having them as independent web clips pinned to a user’s home screen. Users can stay within one central application for all their streaming as well as native needs.

[00530] As another advantage, with these embodiments, users can take full advantage of hardware-software integration with the controller 3100 to avoid unintended disconnects and reconnects in between bouts of gameplay. Typical apps running on the computing device 3200 lack the ability to stay active in the background. In one embodiment, an app is allowed to be kept active in the background, which, when combined with the embedded cloud gaming experience, allows the user to persist their cloud gaming session even if they background the application when, for example, responding to a quick text message, so they never lose their place in the content. Users can still use their computing device 3200 (e.g., as a phone), whether it is for music, charging, or other apps, and can shift back-and-forth between their gaming session and their previous context without the downsides.

[00531] Also, one of the biggest hurdles to making cloud gaming accessible is reducing queue time or wait time for a session. In some game streaming environments, the cloud service provider allocates remote instances that users can connect to, which can require users to wait before they enter into a gaming session. On mobile, where users expect to be able to do everything right away, being asked to wait, say, six minutes for a game session to start can be especially vexing, especially when there are numerous free, instant, and downloadable alternatives that are basically one app store search away. In one embodiment, the controller app 3240 allows users to start a cloud gaming session and, through a subordinate (e.g., picture-in-picture) view, enables them to continue to utilize the controller app 3240, which can contain game suggestions, user-generated content, and more, while they wait for the session to load. This combination of the controller app 3240, controller 3100, and embedded cloud gaming experiences enables users to use their wait time productively, so they are not stuck waiting on a screen for minutes at a time as the service connects.

[00532] Further, as will be described in more detail below, the behavior of one or more user input device(s) on the controller 3100 can be greatly enhanced when cloud gaming is embedded in the controller app 3240 based on context. For example, when in the embedded cloud gaming view within the controller app 3240, users can perform one press to move between the game session and the top-level integration dashboard, as well as back-and-forth between the two.

[00533] In one embodiment, to enable this fully-integrated cloud gaming experience for users, a web browser can be embedded inside the controller app 3240 that is capable of receiving controller input at the same time as the larger application. In one example implementation, this is built on top of the operating system-provided WKWebView primitive on iOS and the WebView primitive on Android. When a user enters input through the controller 3100, the event is first sent into the controller app 3240, which then assesses the state of the controller app 3240 to route the event accordingly. In the case of the browser, the controller app 3240 can decide to route the input event to the browser when the browser is active and recently launched a game, and to route the event to other subsystems of the application otherwise. When the event gets routed to the browser, the browser then forwards the event into the underlying content, which can then respond to the event accordingly, usually by changing the state and modifying its display or making a sound.
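The routing decision described above can be sketched as a small pure function. This is an illustrative sketch only; the type and function names (AppState, ControllerEvent, routeEvent) are assumptions and are not taken from any actual implementation.

```typescript
// Sketch of the input-routing decision: send a controller event to the
// embedded browser when it is active and recently launched a game, and to
// the app's other subsystems otherwise. All names are illustrative.

type Destination = "browser" | "dashboard";

interface AppState {
  browserActive: boolean;        // an embedded content browser is open
  recentlyLaunchedGame: boolean; // the browser recently launched a game
}

interface ControllerEvent {
  button: string; // e.g., "A" or a joystick identifier (hypothetical encoding)
}

function routeEvent(state: AppState, event: ControllerEvent): Destination {
  if (state.browserActive && state.recentlyLaunchedGame) {
    return "browser"; // forwarded into the underlying content
  }
  return "dashboard"; // handled by the controller app's own UI
}
```

In practice the routed event would then be forwarded into the web view, but the core decision is just this state check.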

[00534] Since most cloud streaming applications are currently hosted on the web, a content browser embodied as a web browser allows the controller app 3240 to support a wide variety of cloud gaming platforms. Content can be launched from the controller app’s integrated dashboard through user input, for example, either a gesture on the computing device 3200 or an input into the controller 3100, which the controller app 3240 can determine to mean it should launch specific content. The specific instructions to launch the content can be stored in a content database stored remotely and sent to the controller app 3240 via a remote procedure call. In the example of a cloud game, the database could specify that the game be launched by visiting a specific URL in the browser, which would then load the necessary code and data to play the game within the browser, with minimal additional user intervention.
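One possible shape for such a remotely stored content-database entry, and how the app might resolve it into a launch action, is sketched below. The field names and the string-encoded actions are assumptions made for illustration; the actual schema is not specified in this description.

```typescript
// Hypothetical content-database entry fetched via a remote procedure call.
interface ContentEntry {
  id: string;
  title: string;
  kind: "native" | "cloud";
  launchUrl?: string; // for cloud content: the URL the embedded browser visits
}

// Resolve an entry into a launch action: cloud content opens a URL in the
// embedded browser; native content is launched locally by identifier.
function launchInstruction(entry: ContentEntry): string {
  if (entry.kind === "cloud" && entry.launchUrl) {
    return `open-browser:${entry.launchUrl}`;
  }
  return `open-native:${entry.id}`;
}
```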

[00535] For example, if a user wishes to pause gameplay, they can do so through a number of ways, such as, but not limited to, pressing a button during gameplay. This could minimize the gameplay into a subordinate (e.g., picture-in-picture) view that allows the user to do other things within the controller app 3240. The specific implementation details can differ by platform due to multitasking limitations on the platforms. For example, Android uses a picture-in-picture view that leverages existing overlay technologies available on the platform, while, on iOS, the corresponding primitive is not able to sustain game content reliably, so a picture-in-picture primitive can be developed that is capable of hosting an arbitrary web browser.

[00536] Unfortunately, again due to iOS system limitations, this browser may not be shown outside the surface of the controller app 3240, so a number of alternative ways of showing the user state can be developed. For example, when the user places the controller app 3240 in the background, the controller app 3240 can send the user a push notification indicating to them that their game is still available in the background, and tapping the notification can resume gameplay. Another example is that the application state can be minimized into an iOS 16 Live Activity display, which persistently reminds the user that their gameplay session is still active. This can be accomplished by checking on a background event whether the user was actively playing a game, and if so, using iOS API calls to register the Live Activity display.

[00537] These content displays (e.g., picture-in-picture) can be used to perform other tasks while the user is queuing to start gameplay (e.g., for platforms such as Nvidia GeForce Now, which may have a limited number of available gameplay consoles). To make this experience more ergonomic and useful, a number of technologies can be developed to help surface key context, such as, but not limited to, countdown timers, while queueing in line for gameplay. For one, key information, such as, but not limited to, the user’s position in the queue, can be extracted from the external content provider via the browser. This can be achieved, for example, by injecting ECMAScript code into the content that can extract information and pipe that information to the application for alternate display. In a picture-in-picture view, this may be important because the position in the queue can be illegible when the content page is shrunken and needs to be expanded or otherwise reformatted to be legible in the view. This information can then surface in the aforementioned surfaces like a push notification, iOS Live Activity display, or, on Android, a status indicator light or screen on the device.

[00538] Further, especially on iOS, it may be necessary to develop a way to ensure that a cloud gameplay session is kept active while in the background because applications can often be terminated if they are in the background for more than a few seconds to minutes, depending on the application. While users may infer that this is the case, since this is the standard behavior on certain operating systems like iOS, when using a picture-in-picture view, users generally expect the content within the inner picture to persist while multitasking. Preferably, users would not lose their gameplay state, which can be extremely time-consuming and frustrating to restore.
To do this, when the controller app 3240 receives an event indicating it is in the background, it can request the controller to keep the application alive for a set period of time. In one example implementation, it does not do so indefinitely to avoid impacting the computing device’s battery life; if kept alive indefinitely, the user may find their device battery drained due to the background state because they forgot about their cloud gaming session. The controller 3100 can store this request in its memory and then send API calls at a regular interval to the device operating system 3210 to ensure the application stays active.
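The bounded keep-alive just described can be modeled as a small deadline check. This is a hedged sketch under assumed names (KeepAlive, request, shouldPing); the actual controller firmware interface is not specified here.

```typescript
// Sketch of the bounded keep-alive: the controller honors a keep-alive
// request only for a set period, so a forgotten session cannot drain the
// device battery indefinitely. Times are milliseconds on an abstract clock.

class KeepAlive {
  private deadline = 0;

  // The backgrounded app requests the controller keep it alive for `ms` ms.
  request(now: number, ms: number): void {
    this.deadline = now + ms;
  }

  // Called at a regular interval: true while the controller should still
  // issue an OS API call to keep the backgrounded application active.
  shouldPing(now: number): boolean {
    return now < this.deadline;
  }
}
```

On each interval tick, the controller would check `shouldPing` and, if true, send the keep-alive API call to the operating system 3210.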

[00539] As noted previously, the exemplary computing and game environments discussed above are merely examples. The various embodiments discussed herein can, but do not have to be, used with these exemplary computing and game environments. As such, the details presented herein should not be read into the claims unless expressly recited therein.

[00540] Programmable User Input Device(s)

[00541] As mentioned above, in one embodiment, some or all of the user input devices of the controller 3100 are programmable by the controller app 3240. That is, the controller app 3240 can assign a specific function to a specific user input device (e.g., so a certain function happens when a specific button is pressed) or to more than one user input device (e.g., the certain function happens when two or more specific buttons are pressed simultaneously or in sequence). For ease of illustrating the following embodiments, a single button will be used as the “one or more input devices” that are programmed with a function. However, it should be understood that more than one input device may need to be actuated to trigger the function and that other types of user input devices (e.g., a switch, knob, joystick, a microphone, a camera, etc.) can be used instead of a button.

[00542] The controller app 3240 can assign a function to the button in any suitable way. For example, in one embodiment, the controller app 3240 provides the controller 3100 with an identification of the button and a command that the controller 3100 is to send to the computing device 3200 when the button is pressed. That mapping information 3105 can be stored in the memory 3104 of the controller 3100. When the button is pressed, the one or more processors 3102 in the controller 3100 use the stored mapping information 3105 to identify the command associated with the pressed button and then send that command to the computing device 3200. In another embodiment, the mapping information is stored in the memory of the computing device 3200. In that embodiment, when the button on the controller 3100 is pushed, the processor(s) 3102 in the controller 3100 can provide the computing device 3200 with the identification of the button, and the computing device 3200 uses its internally-stored map, determined by the content state, to execute the appropriate command. In yet another embodiment, the mapping can be stored on a remote device 3300, which the computing device 3200 can query. In this embodiment, when the controller 3100 button is pressed, the processor(s) 3102 therein can provide the computing device 3200 with the identification of the button; the controller app 3240 can then send information to the remote device 3300 related to the button press, including, but not limited to, the identity of the button and the content state of the controller app 3240. The remote device 3300 can respond with instructions for the controller app 3240 on how to respond to the press, or modify display information it had already been sending to the controller app 3240. Other ways of assigning a function to a button can be used.
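The first variant above, where the mapping information 3105 lives in the controller's memory 3104, reduces to a simple table lookup on each press. The sketch below is illustrative only; the class and method names are assumptions, and the fallback of sending a raw button identification for unmapped buttons is likewise an assumption.

```typescript
// Sketch of the controller-side mapping lookup (first variant above):
// the controller app assigns button-to-command mappings, and on a press
// the controller resolves the command to send to the computing device.

type ButtonId = string;
type Command = string;

class MappingStore {
  private map = new Map<ButtonId, Command>();

  // The controller app sends mapping information to be stored (cf. 3105).
  assign(button: ButtonId, command: Command): void {
    this.map.set(button, command);
  }

  // On a button press, look up the assigned command; unmapped buttons
  // fall back to reporting the raw button identification (an assumption).
  onPress(button: ButtonId): Command {
    return this.map.get(button) ?? `raw:${button}`;
  }
}
```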

[00543] Any suitable function can be assigned to the button on the controller 3100. In one embodiment, the function that is assigned to a button is based on whether selected content is consumable remotely from a remote device 3300 (i.e., remote content 3320) or locally from the computing device 3200 (i.e., native content 3220). If the content is playable remotely from the remote device 3300 (e.g., using a browser in the computing device 3200), the controller app 3240 (or some other entity) can assign a certain function to the button. However, if the content is playable locally from the computing device 3200, the controller app 3240 (or some other entity) can assign a different function to the button. So, the function of the button depends on the context of the content (e.g., the logic of the button assignment can be done in response to the content state, so that the logic maps states to functions). Many alternatives can be used. For example, instead of basing the assignment of the button on the location of the content (local vs. remote), the assignment can be based on whether the content is focusable or un-focusable, as discussed below; whether the content has a prominent interface that needs to be wholly displayed; whether the interface has an alternate UI state that may be better suited to the interface device; a user-stated preference; or other factors.
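The idea that the assignment logic maps content states to functions can be sketched as a single pure mapping. Everything here is an assumption for illustration: the state fields, the function names, and the specific state-to-function rule (which anticipates the focus and foreground behaviors discussed below).

```typescript
// Illustrative state-to-function mapping for the programmable button.
// The rule shown (remote/focusable -> focus toggle, otherwise foreground
// toggle) is one hypothetical policy, not the only possible assignment.

type ContentLocation = "remote" | "native";

interface ContentContext {
  location: ContentLocation; // remote content 3320 vs. native content 3220
  focusable: boolean;        // whether the content supports focus/un-focus
}

function assignButtonFunction(ctx: ContentContext): string {
  if (ctx.location === "remote" && ctx.focusable) {
    return "toggle-focus";      // e.g., focus/un-focus streamed content
  }
  return "toggle-foreground";   // e.g., foreground/background native content
}
```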

[00544] Exemplary Programmable Functions

[00545] The following paragraphs provide examples of various functions that can be assigned to the button. It should be understood that these are merely examples and that other types of functions can be assigned. Further, while some of the content in these examples are games, it should again be understood that other types of content, including video, audio, and text, can be used.

[00546] In one example, the content playable from the remote device 3300 is focusable, content playable locally from the computing device 3200 is un-focusable, and the functions assigned to the button on the controller 3100 take this difference into account. As used herein, “focusable content” refers to content that can be selectively focused and un-focused via a user input from the controller, a user gesture on the gameplay device, programmatic means, or another event propagated to the content. When content is in focus, the content is expected to respond to user input from the controller 3100 in its normal use. In one embodiment, when a game is in focus, information about actuation of a joystick, switches, etc. on the controller 3100 can be provided to the game (e.g., to move a character in the game). When content is not in focus, the content does not normally respond to user input from the controller 3100. Instead, that user input can be provided to whatever content is currently in focus, and in some embodiments, is not provided to the content at all. So, if a user interface of the controller app 3240 is in focus and the game is not in focus, movement of the joystick can cause movement of a cursor in the displayed user interface instead of movement of the character in the game. In some embodiments, the content is prevented from receiving user input as one exemplary means of preventing response to user input, but in other embodiments, the content can be provided user input and told by the platform operating system not to respond to the input, or a subset of the user input the controller 3100 receives is passed on to the content.

[00547] In some embodiments, un-focused content is still displayed to the user in some fashion. (In contrast, un-focusable content is either open (in which case the content is displayed) or closed (in which case the content is not displayed).) In one embodiment, the un-focused content is displayed in a minimized display area in a picture-in-picture display. The non-minimized display area in the picture-in-picture display can display other content, such as, but not limited to, a user interface of the controller app 3240, native content 3220 (e.g., another game, a web browser, etc.), or other remote content 3320. For example, Figure 91 shows un-focused content (here, a game) in a minimized display area 3400 in a picture-in-picture display and a user interface of the controller app 3240 in the non-minimized display area 3410, allowing the user to select other native or remote content. As another example, Figure 92 shows an un-focused game in a minimized display area 3500 in a picture-in-picture display and a web browser in the non-minimized display area 3510. These are merely examples, and other apps, etc. can be shown in the minimized and non-minimized display areas.

[00548] Instead of the un-focused content being a miniaturized version of what would otherwise be displayed, other types of representation of the un-focused content can be displayed. The representation can take any suitable form, such as, but not limited to, a banner (e.g., with the name of the content) overlaid on top of other displayed content (an overlaid subordinate user interface), a side-by-side display, an icon, a change in how other content is displayed, etc. As will be discussed in more detail below, the representation can provide dynamic information to the user about the content, such as a user’s place in a queue, a countdown or timer for when their game session will start, a chat or message count, an interaction indicator, general state, or other appropriate information communicating the un-focused content state, so the user can productively and simultaneously utilize the focused content inside of the overall platform operating service. In certain mobile game controller embodiments, this can be highly advantageous because the controller app can be in landscape mode and be navigable with the mobile game controller input device, so the user can remain in their context without having to detach the mobile game controller or revert to the overall device operating system’s primary interface, which would potentially cause them to change orientations or relinquish control of their input experience. In such an embodiment, the user is thereby able to remain within the abstraction of a gaming system experience created on the mobile device.

[00549] Turning again to the drawings, Figure 93 is a flow diagram illustrating the process of assigning a function to a button on the controller 3100 to focus/un-focus focusable content. In this example, the controller 3100 takes the form of a mobile game controller, the controller app 3240 takes the form of a client platform operating service, the remote app 3340 takes the form of a server platform operating service, and the remote content 3320 takes the form of a game. Also in this example, the button on the controller 3100 is referred to as a “software service button (SSB),” as, in this example, un-focusing the content causes a user interface of the client platform operating service (e.g., the controller app 3240) to be displayed. As shown in Figure 93, the controller app 3240 causes a user interface to be displayed on the computing device 3200 (act 3605). The user then uses the appropriate user input device(s) on the controller 3100 to navigate to desired content (act 3610). For example, movement of a user input device (e.g., the right joystick or directional pad) can cause a cursor to move on the displayed user interface to select content, or cause selected content to be displayed larger or differently (e.g., with a darkening layer or outline, scaling, animation, etc.). Also, as will be discussed in detail below, the user interface can present both focusable content (e.g., remote content 3320) and un-focusable content (e.g., native content 3220) for selection by the user. The flow diagram in Figure 93 is directed to focusable content (more specifically, to toggling/swapping between focusing and un-focusing the focusable content), whereas the flow diagram in Figure 94 is directed to un-focusable content (more specifically, to toggling/swapping between opening and closing un-focusable content).

[00550] In this example, when focusable content is available and unfocused, the SSB is initially assigned to the function of launching the selected content (act 3615). So, when the SSB is pressed (act 3620), the controller 3100 sends the assigned command to the controller app 3240 on the computing device 3200 (act 3625), in response to which the controller app 3240 sends a request to the remote app (content platform service) 3340 for the focusable content (act 3630). The focused content is then displayed in the appropriate user interface (e.g., web browser) (act 3635), and the controller app 3240 re-assigns the function of the SSB to un-focus the content (act 3640). So, when the SSB is later pressed, the controller 3100 sends the newly-assigned command to the controller app 3240 (act 3645), which causes the content to be displayed in an un-focused manner (e.g., in a minimized picture-in-picture display, as an overlaid banner, etc.) (act 3650). The controller app 3240 then re-assigns the function of the SSB yet again, this time to re-focus the content (act 3655). So, the SSB is used in this example to toggle between focusing and un-focusing the focusable content. In addition to the SSB, the platform operating service 3240 can also maintain contextual behavior in response to other interfaces on the computing device 3200, for example, an available touch interface, button press, or movement-based gesture, allowing for a single controller interface to behave in the same fashion as the SSB.

[00551] As mentioned above, the user interface of the controller app 3240 can present both focusable content (e.g., remote content 3320) and un-focusable content (e.g., native content 3220) for selection by the user. Figure 94 is a flow diagram related to un-focusable content. As shown in Figure 94, the controller app 3240 causes a user interface to be displayed on the computing device 3200 (act 3705). The user then uses the appropriate user input device(s) on the controller 3100 to navigate to desired content (act 3710). In this example, when un-focusable content is selected, the SSB is initially assigned to place the controller app 3240 in the background (act 3715). So, when the SSB is pressed (act 3720), the controller app 3240 is placed in the background by the processor(s) 3202 of the computing device 3200 (act 3725). This would cause the user interface to no longer be displayed; instead, the selected native content 3220 would be displayed. The function of the SSB is reassigned to move the controller app 3240 to the foreground (act 3730). That way, when the SSB is later pressed (act 3735), the controller app 3240 would be moved to the foreground (act 3740), and the function of the SSB would revert to placing the controller app 3240 in the background (act 3745). So, the SSB is used in this example to toggle between placing the controller app 3240 and the native content 3220 in the foreground/background.
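The repeated press-then-reassign pattern in the Figure 93 flow can be modeled as a tiny state machine, where each press performs the currently assigned function and the app then reassigns the button. This is a hedged sketch; the names and the three-state encoding are assumptions made for illustration.

```typescript
// Sketch of the SSB toggle for focusable content: "launch" on first press,
// then alternating "unfocus"/"refocus" as the app reassigns the button
// after each press (cf. the reassignment acts in the Figure 93 flow).

type SsbFunction = "launch" | "unfocus" | "refocus";

class SsbButton {
  assigned: SsbFunction = "launch"; // focusable content selected, not yet launched

  // Perform and return the current function, then reassign the button.
  press(): SsbFunction {
    const performed = this.assigned;
    // Launching or re-focusing puts content in focus, so the next press
    // un-focuses; un-focusing means the next press re-focuses.
    this.assigned = performed === "unfocus" ? "refocus" : "unfocus";
    return performed;
  }
}
```

Pressing four times yields the sequence launch, unfocus, refocus, unfocus, matching the toggle behavior described above.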

[00552] As mentioned above, in certain embodiments, the controller navigation can include the ability to foreground and background platform operating services. In certain embodiments, the term foreground can refer to prioritizing a platform operating service’s user interface (e.g., maximizing, highlighting, switching to, etc.) for interaction on a computing device. In certain embodiments, the term foreground can also refer to prioritizing the processes and resources in a computing device for the associated platform operating service. In certain embodiments, the term background can refer to de-prioritizing a platform operating service’s user interface (e.g., minimizing, dimming, switching from, etc.) so as to allow another primary interaction intent to be foregrounded. In certain embodiments, the term background can refer to de-prioritizing the processes and resources in a computing device for the associated platform operating service.

[00553] When content is in the background, the computing device 3200 may reclaim resources in the computing device 3200 (e.g., memory, network connections, processing resources, etc.). When the user later wants to use the content (i.e., put the content in the foreground), the content may need to be reloaded. For example, for a local content reload, the content can be reloaded from the memory 3204, resuming the state from disk, which may not be possible if the content does not provide an adequate persistent state restoration mechanism. For a remote content reload, closing the connection to the remote content server 3300 could erase the content state, or the connection may be reallocated, so the user will need to wait to be allocated a new connection/remote content server.

[00554] In contrast, when the content is in the foreground, the computing device 3200 can avoid reclaiming resources in the computing device 3200. In some situations, content in the foreground is displayed or otherwise presented to the user, whereas content in the background is not.

[00555] As mentioned above, Figure 93 is directed to an example flow for focusable content, and Figure 94 is directed to an example flow for un-focusable content. Figure 95 is a flow chart showing an example of how both flows can interact. As shown in Figure 95, the controller app 3240 displays a user interface, which can present both focusable content (e.g., remote content 3320) and un-focusable content (e.g., native content 3220) (act 3805). Next, a determination is made regarding whether unfocused content is present (act 3806). If unfocused content is not present, the controller app 3240 assigns the SSB the function of placing the controller app 3240 in the background (act 3810). If the SSB is pressed when unfocused content is not present, the controller app 3240 can determine whether focusable content is available (act 3820). When available, the focused content is presented (act 3825), and the SSB is assigned to un-focus the content (act 3830). So, when the SSB is pressed (act 3835), the processor(s) 3202 in the computing device 3200 move the content to the un-focused state (act 3840) and display the un-focused content as an overlay, minimized picture-in-picture display, etc. (act 3845). The SSB behavior is then assigned to focus content (act 3847). That way, if unfocused content is present at act 3806, pressing the SSB (act 3807) causes the content to re-focus (act 3825). Thus, the SSB in this part of the flow chart serves to toggle between focusing and unfocusing the content. It should be noted that, in other embodiments, this contextually-aware focus toggle behavior could be initiated through other interface mechanisms on the controller 3100 or through user input devices on the computing device.

[00556] Referring to earlier in the flow chart, if the SSB is pressed when no focusable content is launched (act 3850), the controller app 3240 is moved to the background (act 3855), and previously-selected native content 3220 is moved to the foreground (act 3860) (this also occurs if it is decided in act 3820 that the content is not focusable). The function of the SSB is then assigned to move the controller app 3240 to the foreground (act 3865). That way, when the SSB is pressed (act 3870), the main user interface is displayed (act 3805).

[00557] As mentioned above, when un-focusable content is swapped out for the display of the main user interface, the un-focusable content is placed in the background. By being in the background, resources in the computing device 3200 can be reclaimed. In that situation, when the content is swapped back to the foreground, the user may need to wait for the processor(s) 3202 of the computing device 3200 to re-populate the states of the content or to relaunch the content altogether.

[00558] In one embodiment, when focusable content is un-focused, the content is still kept in the foreground and not moved to the background, even though the user may not be currently actively interacting with the content. This can prevent the previously-discussed detriments, such as state loss, from occurring. This feature can have particular advantages in the context of gameplay. For example, some remote games have a waiting list to play (e.g., when the game supports play by a limited number of users). When the game is launched and is full, the user is placed in a queue. However, if the user places the game in the background, they may lose their place in line. In other instances, there is simply a load time for the session to start because a remote server has to, for example, be configured to have the game and user account set up to allow for a cloud gaming session. By keeping the game in the foreground when un-focused, the user keeps their place in line, while, by being unfocused, the user can use other services or apps on the computing device 3200 to multitask, as will be discussed further below. Also, keeping the content in the foreground provides the ability to perform rich display or enter/exit the un-focused state when entering/exiting the line, as will also be discussed further below.

[00559] Figure 96 is a flow chart that illustrates this example process. As shown in Figure 96, when the user is in a queue for remote content (act 3905) and then presses the SSB (act 3910), the computing device 3200 determines if there is existing content in a browser (act 3915). If there isn’t, content is launched in an external application, as discussed above (act 3920). However, if there is, the content is launched in full-screen mode (act 3925). When the user later presses the SSB (act 3930), the content transitions to a picture-in-picture or another representational view (e.g., a banner overlay) (act 3940). That way, the user keeps their place in the queue while being able to multitask. In some embodiments, the representation of the content can show the user’s position in the queue and/or alert the user when they are out of (or close to being out of) the queue. This could be represented by a player count, countdown indicator, or other affordance within, or overlaid on top of, the platform operating service or even overlaid on the computing device’s operating interface. In one embodiment, picture-in-picture mode can be automatically unfocused if it is detected that a user enters a queue, is in a lobby, or is in another passive waiting situation, allowing the user to seamlessly return to the platform operating service and appropriately routing controller 3100 input to allow for multitasking. As will also be discussed further below, keeping a user’s place in a queue can be important given the time to ramp up to the game or get ready to start. In some environments, only the user’s position in the queue may be available. Also, the queue and real-time changes thereof can be a proxy for the time remaining before gameplay can begin.
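
The queue-aware presentation described above can be sketched as a simple view-selection helper: while the user waits in a remote-content queue, the content stays in the foreground but is shown un-focused with a position indicator. This is an illustrative sketch only; the function name, view-mode strings, and indicator text are assumptions for the example.

```python
def select_view(queue_position, queue_length, focused: bool):
    """Return a (view_mode, overlay_text) pair for the current queue state.

    queue_position: the user's place in line, or None once the session is ready.
    queue_length: total players waiting (used only for the indicator text).
    focused: whether the user last requested the content in the focused state.
    """
    if queue_position is None:
        # Session ready: honor the user's focus choice, no queue indicator.
        return ("fullscreen" if focused else "picture_in_picture", None)
    overlay = f"Position {queue_position} of {queue_length}"
    if queue_position <= 1:
        # Alert the user when they are close to being out of the queue.
        overlay += " - almost ready!"
    # While queued, auto-unfocus so the user can multitask without losing
    # their place: the content stays in the foreground, just minimized.
    return ("picture_in_picture", overlay)
```

For example, a user at position 3 of 10 would see a minimized picture-in-picture view with a "Position 3 of 10" indicator, regardless of the requested focus state.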

[00560] Figure 97 is a flow diagram that illustrates this process. As shown in Figure 97, when the user selects content to view on an external platform 3300 (act 4005), the controller app 3240 requests the content from the external platform 3300 (act 4010). A document including the position in the queue is returned (act 4015) and displayed to the user (act 4020). When ready, the content is provided and presented to the user (acts 4025, 4030).

[00561] Automatic Content Capability Detection

[00562] As mentioned above, the user interface of the controller app 3240 can present both focusable content (e.g., remote content 3320) and un-focusable content (e.g., native content 3220) for selection by the user. In some situations, not all content may support the controller 3100 as an input device. For example, content may have restrictions around which controllers can be used. These restrictions can relate to, for example, operating system issues, network connectivity, input compatibility limitations, application-level restrictions on input devices, licensing issues, software restrictions placed by the manufacturer, local legal issues, geographic limitations, and others.

[00563] In one example, which will be illustrated in conjunction with Figures 98-100, the remote device 3300 is configured to provide only remote content 3320 to the computing device 3200 that is compatible with the controller 3100, whereas some native content 3220 on the computing device 3200 may not be compatible with the controller 3100. The controller app 3240 can determine which native content 3220 is compatible with the controller 3100 based on the compatible remote content 3320 it receives from the remote device 3300. That is, as shown in Figure 98, in this example, from the home position (act 4105), the controller app 3240 performs content sync (act 4110) and content fetch (act 4115) processes, so that only playable content is provided on the main user interface for user selection (act 4120). Figure 99 is a flow chart of a content sync process of an embodiment, and Figure 100 is a flow chart of a content fetch process of an embodiment. These figures will be described below.

[00564] Turning first to Figure 99, in the content sync process of this embodiment, the controller app 3240 discovers the native (local) content 3220 on the computing device 3200 (act 4205) and sends a remote database request to the remote device 3300 (act 4210) to get a content database (act 4215), which identifies the remote content 3320 on the remote device 3300. As mentioned above, in this example, the content database can contain content that is compatible with the controller 3100. The controller app 3240 compares the content in the content database with the detected local content (act 4220) to assess compatibility (act 4225) and stores a database of compatible content (act 4230), which, here, would be the remote content database and whichever local content is deemed compatible with the controller 3100 based on the results from the remote content database. Only verified compatible and relevant content is stored and provided back to the main process, keeping unrelated content private on the computing device 3200. That is, in this embodiment, local content is kept on the client side until it can be assessed for compatibility and relevance before returning a relevant content data set to be used in the content fetch.
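
The comparison step of the content sync process can be sketched as a client-side intersection, under the assumption that the remote content database is a collection of controller-compatible content identifiers. The function and field names below are illustrative, not from the application; the key point is that unrelated local content never leaves the computing device.

```python
def sync_compatible(local_content, remote_compatible_db):
    """Build the on-device database of controller-compatible content.

    local_content: identifiers of native content discovered on the device.
    remote_compatible_db: identifiers of controller-compatible content
    fetched from the remote device's content database.
    """
    local = set(local_content)
    # Compatibility is assessed client-side; only verified matches are kept,
    # so unrelated local content stays private on the computing device.
    compatible_local = sorted(local & set(remote_compatible_db))
    return {
        "remote": sorted(remote_compatible_db),
        "local": compatible_local,
    }
```

For example, a local title that does not appear in the remote compatibility database is simply omitted from the result rather than being reported to the server.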

[00565] As shown in Figure 100, during the content fetch process, the server 3300 receives the compatible games information (act 4305), as well as a request for content (act 4310). So, in this embodiment, an identification of the local content 3220 that is compatible with the controller 3100 is provided as part of a fetch request. The server 3300 then performs contextual action processing (act 4315) and contextualizes the content (act 4320). This can involve assigning behavior to a display region for the content in the user interface. For example, if content can be served from the remote device 3300, a “stream” or “cloud” icon or alternate signifier can be displayed, or a download button can be provided for downloading native content. As another example, if certain content does not currently support the controller 3100, a mechanism can be provided for the user to send a request to have that content support the controller 3100 in the future.
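
The contextual-action step (acts 4315-4320) can be sketched as a mapping from a content record to the affordances its display region should expose. This is a hypothetical sketch; the record fields and action names are assumptions chosen to mirror the examples in the text (a stream signifier, a download button, and a support request).

```python
def contextualize(item):
    """Map a content record to the contextual actions for its display region."""
    actions = []
    if item.get("remote_available"):
        actions.append("stream")            # show a "stream"/"cloud" signifier
    if item.get("installable"):
        actions.append("download")          # offer a download button
    if not item.get("controller_supported"):
        # Let the user request future controller support for this content.
        actions.append("request_support")
    return actions
```

A record missing all fields thus falls through to a single support-request action, matching the behavior described for content that does not currently support the controller.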

[00566] The server 3300 then provides the controller app 3240 with a list of compatible content based on user-specific metrics and segmentation algorithms. This content is associated with relevant actions that are included in the return data model. These actions and capabilities are provided to the computing device 3200 to be used in hardware button mapping and user interface affordances.

[00567] Figure 101 is a flow diagram that illustrates the above steps. As shown in Figure 101, the controller app 3240 sends a content capability request to the server 3300 (act 4405), which provides a response (act 4410). The controller app 3240 then sends an installed content query to the computing device 3200 (act 4415), which returns information about the installed content (act 4420). The controller app 3240 then sends a request to the server 3300 for a content dashboard containing verified content (act 4425). The server 3300 provides that information (act 4430), and the controller app 3240 displays that information in the dashboard user interface (act 4435).

[00568] Example: Game Controller Overview

[00569] The following sections provide example implementations of some of the embodiments discussed above. It should be noted that these are just examples and that the details discussed herein should not be read into the claims unless expressly recited therein. In some of these examples, the mobile game controller has one or more software service buttons which, when selected, can perform in-app functions or functions through an operating system API, as opposed to providing the inputs solely via the standard input device framework provided by the device operating system meant for standard game controller inputs. A software service button may send inputs via the mobile operating system’s input device framework based on user context (e.g., whether content is focusable/un-focusable, whether content is native content or remote content, etc.). For example, in some embodiments, the behavior of the software service button can change depending on whether the current application is in the foreground or background. For example, when the user is playing a game inside a web browser within the platform operating service, the software service button behavior changes based on context. For example, while streaming a game inside the Amazon Luna service, pressing the software service button can send a Human Interface Device (HID) command that opens up the Amazon Luna menu by triggering the “Start” button or invoke the cloud gaming service’s own capture functionality.

[00570] In certain embodiments, the platform operating service application can be launched with the software service button, and the user can navigate the interface with the inputs on the device. In certain embodiments, when a user launches the platform operating service application, they can use the controller to search for compatible games within the platform operating service and can perform appropriate actions, such as launching into or downloading them via provided contextually-aware actions mapped to a controller.
The integrated application can also allow users to connect their accounts from external services including, but not limited to, Xbox Cloud Gaming, PS Remote Play, Nvidia GeForce NOW, Amazon Luna, Steam, Netflix, Google Play, and Apple Arcade. For instance, based on a recent history of gameplay, the platform operating service application can insert those games into the compatible games list within their library or otherwise adjust the Integrated Dashboard content. Users can then use the software service button to open directly into a dashboard of their compatible games across multiple external services. Further, this allows the platform operating service’s remote service to provide enhanced suggestions to users based on a multitude of inputs, such as device state, user account information, platform preferences, and more.

[00571] In some embodiments, the content listed in the Integrated Dashboard may be available to stream from an external service within the platform operating service application. The user is able to view details about the content and launch the content without leaving the application. The user can move between the application and the external content by using the software service button(s) and continue to leverage platform operating service features. The user journey begins when the user opens the Integrated Dashboard by pressing the software service button or manually opening the service. The user then navigates the Integrated Dashboard using the smart mobile game controller and finds a game on the Integrated Dashboard that is available on an external gameplay service. After the user presses a button on the smart mobile game controller to launch the game, a content browser is opened inside of the platform operating service application. The user plays the game without having to leave the application, leveraging the same smart mobile game controller features as installed games.

[00572] The browser can allow users to return to the personalized dashboard by pressing the software service button and then confirming their intention to exit. This confirmation dialog can be provided to give a better user experience, as sessions on external gameplay services often take considerable time to initialize. Using a software service button, which can be otherwise unavailable for games on external gameplay services, enables the platform operating service application to support all standard mobile controller inputs in a full-screen immersive experience while still enabling the platform operating service application to suspend or end the experience when a user requests to do so.

[00573] Additional embodiments could include external computing devices streaming games into the application, games available within the platform operating service, or games available within a specialized browser or content rendering mechanism. This could also enable users to launch and/or play games renderable by the browser functionality that are included with or downloaded into the platform operating service application, as well as launch and/or play games within a browser application external to the platform operating service. Another embodiment could also be launching into an external application for the purpose of enabling the user to play games. Additional embodiments can leverage other embedded content runtimes to stream games into the application.

[00574] It should be understood that these are merely examples and that other implementations can be used. Accordingly, none of the details presented herein should be read into the claims unless expressly recited therein.

[00575] Example: Contextually- Aware Platform Service Switcher

[00576] Due to limitations of computing devices, using a platform operating service may interfere with or disrupt other content services when used simultaneously. This adds complexity for users attempting to engage with the controller’s platform operating service while also interacting with content on other platform operating services. So, it is desired to allow users to switch between platform operating services seamlessly without disrupting their experience on other services, such as an active gaming session. In one embodiment, the controller, computing device, embedded software, and platform operating services work together to intelligently contextualize the user’s intent and reduce disruptions in their ability to consume and interact with content, as well as enhance the experience by allowing simultaneous and intuitive interactions using user input device(s) (e.g., a single hardware button) on the controller.

[00577] In some embodiments, external platform operating services can be enveloped into subordinate views within the controller’s platform operating service using a contextually-aware software service button on the controller. This view maintains the content state of the external platform operating service while allowing the user to engage and interact with content within the controller’s platform operating service.

[00578] The platform operating service can launch external platform operating services, such as Xbox Cloud Gaming, in an encapsulated interface. While this content is active, the controller’s software service button can operate contextually to toggle focus between the external content and the platform operating service interface. Further, the controller can automatically switch the functional context of all controls between the external content and the platform operating service in response to focus state changes. By keeping the external content active, the external platform operating services do not get disrupted by context switching between these interfaces.

[00579] While in the platform operating service, the user is able to engage with the interface as normal, including launching other content, such as native games, or placing the platform operating service in the background within the computing device. Regardless of user engagement, the external platform operating service state is maintained within this subordinate view. The controller and platform operating service detect these events and appropriately assign contextual behavior to the software service buttons to allow the user to foreground the application as normal and then return to the appropriate contextual state of automatically toggling focus.

[00580] The platform operating service can engage with various forms of content including launching native-installed and cloud-enabled games using third- party platform operating services directly from within the controller’s platform operating service application. For cloud-enabled gaming services, an external platform operating service, such as a web browser, can be used. In normal use, the user would launch these services in a browser run on the computing device. This browser operates independently of other platform operating services and is subject to the limitations of the operating system of the computing device, such as being terminated while in the background.

[00581] The platform operating service allows the user to launch a cloud gaming platform operating service within a subordinate view within the controller application, preserving the state of the cloud gaming service, while allowing the user to engage with other content within the controller application through a contextually-aware software service button on the smart mobile gaming controller or other controller input.

[00582] When the user presses the associated software service button on the controller, the cloud gaming platform operating service transitions into a non-focused view, such as picture-in-picture, alternative UI, or otherwise, preserving its state while delegating user interactions from the controller back to the controller’s platform operating service. When the user presses the associated software service button, or other assigned input, on the controller while the external platform operating service is running in the non-focused view, the view can be brought back to a focused state, restoring input from the controller back to the service, enabling a seamless transition between services.

[00583] If the platform operating service is unfocused in a subordinate view and the user launches another natively-installed platform operating service, such as a mobile game, the controller can detect that the user is currently using a natively-installed platform operating service and can intelligently change the behavior of the software service button to take the user back to the controller’s platform operating service when pressed. After returning, the controller’s platform operating service intelligently reassigns the software service button function to its focus toggle behavior, allowing the user to once again seamlessly switch between natively-installed external platform operating services and cloud-enabled platform services through simple interactions with the controller.

[00584] By allowing the platform operating service to launch external platform operating services within a contained environment, the platform operating service prevents the services from being terminated by the device operating system, providing the user with a seamless experience for both services. This provides external content state preservation. Also, by intelligently applying contextual awareness, the controller can propagate intended user interactions to the appropriate service automatically, while simultaneously allowing switching between services, including native content, cloud gaming content, and the platform operating service itself. This provides contextual user input handling. Further, by algorithmically assessing various data points such as service state and user actions, the controller is able to associate user interaction with implied intent and provide a seamless transition between services without disrupting the experience or associated services. This provides contextual software service functionality.

[00585] As can be seen from the above, the integration of the platform operating service, device operating system, and controller allows for seamless switching between content mediums, such as natively-installed games, first-party content, and cloud-enabled platform operating services using a software service button on the controller. This switching uses persistent self-contained mechanisms that assess user engagement and system context.

[00586] Also, an alternative implementation launches cloud-enabled services from within the platform operating service to utilize the computing device OS’s default handling of the services, such as integrated web browsers. However, this can provide an unreliable experience that often discards the service state while context switching between services, forcing the user to restart and, in some cases, even re-authenticate services upon return. Using a separate internal context within the platform operating service also runs the risk of the user having their service state suspended or terminated, causing similar frustrations.

[00587] Handling these services within a self-contained embodiment, such as picture-in-picture, that can persist the state while intelligently handling input context with the controller overcomes these problems. Using a self-contained embodiment to launch cloud-enabled platform operating services and keep them running in a non-focused state while using other aspects of the platform operating service provides unique use cases in which a user can wait for content to load, players to join, a game to start, and the like, while engaging with the platform operating service, including switching to or launching into another game, viewing other content, or backgrounding the app altogether and returning to their game session using the software service button. The container embodiment can be extended in other ways (e.g., to support additional features from the platform operating service to be used while in an external operating service context). These features can include, but are not limited to, voice communications, text communications, mobile game controller settings, and more.

[00588] Example: Automatic Content Capability Detection

[00589] Content available from various software marketplaces, such as Google Play, has various levels of support for mobile game controllers. These marketplaces are not guaranteed to implement a way for users or platform operating services to determine the level of support and associated capabilities for mobile game controllers with regard to provided content. In one implementation, the controller’s behavior and end-user experience can be enhanced by content that supports mobile game controllers to ensure functionality. As such, a way to determine whether or not content has controller support, and to what degree, could enhance the end-user experience.

[00590] In some embodiments, so that the platform operating service can automatically detect the degree to which content supports controllers, a content datastore can be created that defines the level of support for controllers to a degree of reliability and quality that would provide an enhanced user experience. This datastore can be created by manually testing a comprehensive content collection that encompasses the breadth of content likely to be consumed by users within the platform operating service. Each piece of content can be tested with a smart gaming controller to determine the degree of compatibility with the mobile game controller.

[00591] To preserve user privacy, the smart mobile gaming controller can provide this compatibility information from a central repository to the platform operating service, which scans the user’s computing device to assess which supported content is available. This verified content list can then be processed by the smart gaming controller and automatically displayed while providing contextual actions within the platform operating service according to content compatibility levels, without exposing non-relevant data to external devices or services.

[00592] To build the library, a comprehensive list of content to be tested can be curated from multiple sources and first-party research. This list serves as a starting point for testing and curation of content and associated compatibility. This list is formed through aggregating potentially-compatible content from online resources, manual data entry, and other sources. Manually testing this content with a mobile gaming controller allows for a determination of the level of support to be made. Multiple controllers can be tested to verify support consistency, including first-party controllers, generic controllers, wireless controllers, wired controllers, and others. This process can involve discovering, installing or accessing, and consuming the content in an appropriate context using the mobile gaming controller. In some embodiments, such as game applications, for example, this level of support is determined by play testing content in various configurations for the mobile game controller, and other controller options. Once the level of support is determined, the results are stored in a data repository, which can be utilized by the smart gaming controller.

[00593] To determine relevant content for the user, the smart mobile gaming controller can intelligently detect supported content on the mobile gaming device. Compatible content is then contextually evaluated, so that the user can view compatible content within the smart gaming controller interface with contextually appropriate actions available. For example, content available on the user’s mobile gaming device can be displayed within a collection in a dashboard of a user interface. This collection can be organized by level of support, with each piece of content being displayed with contextually-appropriate actions. In this embodiment, for supported content, said actions can include the ability to launch natively-installed content, stream or view cloud-enabled content, share content through content sharing services, and more. In addition, unsupported content can be displayed to the user with appropriate contextual actions, such as the ability to launch a marketplace review to encourage content providers to enhance their offerings with better mobile game controller support.

[00594] By observing and storing different levels of controller support, including, but not limited to, “Unsupported”, “Tutorial Unsupported”, “UI Unsupported”, and “Fully Supported”, appropriate expectations for the user can be set, and content can be categorized adequately within the platform operating service. This provides a level of granularity in the controller support database. Further, automatic detection of supported games can be provided. By scanning the user’s computing device to detect available content and automatically detect which content can be displayed in the platform operating service based on compatibility level, the smart mobile gaming controller can provide a seamless browsing experience for the user. Users can intuitively make a connection between what they see in the platform operating service and what they have installed on their computing device, while having access to contextually-appropriate actions.
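
The categorization described above can be sketched as a lookup from stored support level to contextually-appropriate actions. The level names come from the text; the action lists and the default behavior for unknown levels are assumptions for the sketch.

```python
# Hypothetical mapping from observed controller-support levels (as named in
# the text) to the contextual actions shown for that content in the dashboard.
SUPPORT_ACTIONS = {
    "Fully Supported":      ["launch", "stream", "share"],
    "UI Unsupported":       ["launch"],           # playable; menus need touch
    "Tutorial Unsupported": ["launch"],           # playable past the tutorial
    "Unsupported":          ["request_support"],  # e.g., marketplace review
}

def actions_for(level):
    """Look up contextual actions for a support level, defaulting to a
    support request when the level is unknown or untested."""
    return SUPPORT_ACTIONS.get(level, ["request_support"])
```

Unknown or untested levels fall back to the support-request action, matching the handling of unsupported content described in the next paragraph.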

[00595] Regarding unsupported content, by detecting available content that does not have adequate levels of support, the smart mobile gaming controller can provide users with options to enhance their experience. Allowing users the ability to send a request for controller support to content providers when they interact with non-supported content within the platform operating service effectively creates incentives for the mobile game industry to prioritize controller support and empowers users to self-advocate for a better experience.

[00596] Online web-based databases can provide some information regarding content capability with mobile game controllers. However, these sources typically do not provide detailed levels of support that adequately inform users of the expected user experience. Further, these services typically do not provide the integration of a smart mobile gaming controller that automatically assesses available content on a mobile gaming device to then determine, contextualize, and display the content with appropriate actions to the user. These services are often outdated, lack adequate coverage of available relevant content, and are difficult to find, digest, or access from a mobile gaming device's available platform operating services. In contrast, these embodiments allow for streamlined integration with the mobile gaming controller, providing the user with contextual actions available via the hardware interface of the mobile game controller in addition to the platform operating service UI.

[00597] Further, due to various mobile gaming device limitations, determining support for installed games was not previously possible due to user privacy concerns with reporting information to a remote software service. To facilitate a streamlined user experience with appropriate contextual actions, a reliable repository of content with mobile game controller support needed to be created. Using this repository intelligently and responsibly to assess content availability on a mobile gaming device, while keeping user privacy intact, provides an advantage. Also, user privacy changes by a mobile gaming device's operating system, such as Android, can make processing this information remotely impossible, as applications found to be doing so might be flagged and removed from the marketplace.

[00598] There are many alternatives that can be used with these embodiments. For example, use of the controller can be extended to synthesize support for content that would otherwise be unsupported. Another alternative relates to adding a platform operating extension that would enable content publishers to more readily add controller support to their content.

[00599] Example: Rich Content Browsing Multitasking on Device Operating Systems with Multitasking Limitations

[00600] External platforms can exhibit limited support for a number of common multitasking operations, such as switching between applications or viewing multiple applications at the same time. In addition, some device operating systems provide further restrictions to the platform operating service's ability to create such capabilities. One embodiment can help ensure users can engage in multiple activities on the platform operating service simultaneously, even when the device operating system places the aforementioned restrictions on the platform operating service. This embodiment can also enable users to engage in activities outside of the platform operating service while simultaneously engaging in activities on the platform operating service. Further, techniques can be used to drive user awareness of the limitations of the device operating system and surface rich information about the state of the content. For example, if a user is using a gameplay platform that requires them to wait in a virtual queue, the user's position in the queue can be surfaced even while the user is engaging with other surfaces within the platform operating service.

[00601] In one embodiment, a platform operating service is provided that can include a browser component capable of displaying static or interactive content. This browser component can be displayed on its own or as a subordinate view of other user interfaces within the platform operating service. If hosted on a gameplay device with a software service button enabled controller attached, pressing the software service button can toggle between the full screen state or the subordinate view state of the content browser component. In one embodiment, the component, while in a subordinate view, could be moved around by the user using a gesture. In another embodiment, the view could be integrated into the overall service UI. In yet another embodiment, the view could persist even outside of the service UI as an overlay. Performing a touch gesture or other interaction on the subordinate component could also toggle between the full screen and subordinate state of the component.
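The toggle behavior of the software service button described above can be modeled as a small state machine. This is a sketch under stated assumptions: the class and field names are hypothetical, and the session dictionary stands in for whatever content state the real browser component would preserve.

```python
from enum import Enum, auto

class BrowserState(Enum):
    FULL_SCREEN = auto()
    SUBORDINATE = auto()  # e.g., a movable picture-in-picture view

class ContentBrowser:
    """Minimal toggle model: pressing the software service button (SSB)
    flips the browser between full-screen and subordinate presentation
    without discarding the underlying content session."""

    def __init__(self):
        self.state = BrowserState.FULL_SCREEN
        # Session state survives presentation toggles; only the view changes.
        self.session = {"url": None, "alive": True}

    def on_ssb_press(self):
        self.state = (BrowserState.SUBORDINATE
                      if self.state is BrowserState.FULL_SCREEN
                      else BrowserState.FULL_SCREEN)
        return self.state
```

A touch gesture on the subordinate component could invoke the same `on_ssb_press` routine, since both inputs toggle the identical pair of states.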

[00602] In some embodiments, the content browser can be distinct from similar components provided by the device operating system. This could be necessary because the device operating system component may not support certain kinds of rich content. The browser component could maintain and store rich state of the content within, for example, in the case of a game, the gameplay state, or in the case of a video, the time the video is being played, playback state, volume, brightness, and so on. Further, the embodiment can include, but is not limited to, the availability of an internal or external content experience, such as Nvidia GeForce Now, that could require a user to queue before actively engaging with the content. The content browser component could launch into such a content experience and display a dynamic representation of the content in the browser, such as the user’s position in the queue.

[00603] In some embodiments, the content browser may not display on user interface surfaces outside of the platform operating service, potentially due to lack of support from the device operating system. In such scenarios, the platform operating service may surface additional surfaces to educate users about this lack of support and the expected behavior of the content browser. Such a behavior could require the use of alternative surfaces provided by the device operating system, such as, in the case of a mobile operating system, a push notification or even the picture-in-picture system enabled in the device.

[00604] One embodiment can utilize a content browser that is capable of displaying a number of rich media, including, but not limited to, photos, videos, interactive web pages, and games. When displayed in full screen, the content browser can receive input from the mobile game controller so that the user can interact with the content. The browser can be launched from the integrated dashboard or other interfaces of the platform operating service via controller input on, or attached to, the gameplay device.

[00605] In some embodiments, the controller can enable the user to continue using other portions of the platform operating service while engaging with content within the content browser, or queuing for access to content on an external service, through various input interactions, such as a software service button, which could interpret short and long press states. The software service button can be contextually aware such that these press patterns can expand the content browser into full screen, shrink it into a subordinate view, or perform custom behaviors within the external content, depending on the current state of the content browser and the associated press. While in the subordinate state, the content browser can be displayed on the platform operating service at the same time as other surfaces of the platform operating service, while the service delegates input from the mobile game controller away from the content and instead propagates input to allow the user to navigate and interact with other interfaces. The content browser can maintain the session state of the content while moving between these subordinate states, ensuring a user does not lose their content state while interacting with the platform operating service. This could be done by preserving the content browser instance in the gameplay device memory while the user interface for the content browser component transitions between the relevant states. This transition can also be animated or communicated to the user in a manner that reinforces the behavior, providing a seamless user experience.
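The contextual dispatch of short and long presses can be sketched as a pure function. The specific press-to-action policy below is a hypothetical illustration (the section only requires that behavior depend on press type and browser state), and the string names are invented for the sketch.

```python
def handle_ssb(press, state):
    """Contextual software-service-button dispatch (hypothetical policy).

    Returns (new_state, action): a short press toggles between full-screen
    and subordinate presentation, while a long press in full screen is
    forwarded to the external content as a custom action.
    """
    if press == "short":
        new_state = "subordinate" if state == "full_screen" else "full_screen"
        return new_state, "toggle_presentation"
    if press == "long" and state == "full_screen":
        return state, "custom_content_action"
    # Any other combination leaves routing unchanged.
    return state, None
```

Because the function is keyed on the browser's current state, the same physical button yields different behavior in different contexts, which is the contextual awareness the paragraph describes.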

[00606] In some embodiments, while in a subordinate state, the content browser can be displayed over other interfaces of the platform operating service and can be moved around using a user gesture or other user input. The content browser can provide visual or haptic feedback to the user and fluid animations to suggest aesthetically pleasing locations to place the browser within the platform operating service. Further, the content browser can be closed through a series of inputs to a controller attached to a computing device.

[00607] When the user exits the platform operating service, the current embodiment can indicate to the user the effect of this action on the state of the content browser. In some embodiments, this can be done by presenting a push notification to the user or by leveraging facilities provided by the device operating system, such as iOS's Live Activities or Dynamic Island functionality, Android's notification tray, a UI affordance, or other operating system affordances that present the state of background operations. One embodiment of such a behavior could be indicating to the user that because the content browser is not visible, the device operating system may elect to end the content browsing session after a period of time. Another embodiment could be indicating the user's current position in a queue for an external gameplay service.

[00608] In some embodiments, the platform operating service can ensure the device operating system does not terminate the content browser. In such embodiments, the mobile game controller could interact with the device operating system to request the non-suspension of the platform operating service’s core processes; the platform operating service could request the mobile game controller perform such a routine when the user exits the platform operating service. Since such a routine could impact the battery life of the gameplay device, the platform operating service could allow the user to specify the preferred amount of time they would like to keep the content browsing routine alive.

[00609] A dynamic content browser embodiment can enable the user to activate the subordinate component without losing their position in an external content queue. Further, this embodiment can use a software service button to activate or deactivate the subordinate service component. Also, by indicating the background behavior of the content browser to the user, the platform operating service allows the user to better understand the multitasking behavior of the gameplay device. Further, content browsing termination prevention can be provided. This behavior can be constructed using the unique capabilities of the controller to interact with the device operating system as well as the platform operating service, allowing the platform operating service to interact with the device operating system in ways that otherwise may not be allowed by the device operating system. By preventing the termination of the content browsing session, the platform operating service allows the user to perform multiple simultaneous activities on the computing device without losing the state of their content browsing session. Such a capability could be otherwise unavailable to the user.

[00610] Leveraging the platform operating service, device operating system, and computing device allows users to perform a number of other activities while engaging in gameplay or content browsing activities. No other platform operating service is able to provide a dynamic content browser with background multitasking capabilities while hosted on device operating systems with multitasking model restrictions as described.

[00611] In some environments, users are able to launch and queue for content in external applications, but due to device operating system limitations, for example the inability to programmatically launch web browser-based applications (also known as PWAs), such experiences often could not be launched directly from the platform operating service and required users to launch such experiences manually. Further, on certain device operating systems, such experiences were limited in their ability to intelligently educate users about their background behavior and were unable to prevent termination of the content browsing session, causing significant user frustration.

[00612] The development of a browser capable of displaying rich and interactive content experiences enables an extension of the browser capabilities to allow more dynamic behaviors such as those described above. The development of the contextually-aware platform service switcher and further development of methods to extract key information from content platforms that utilize queueing can also be used. The presence of the software service button can enable one-click entering and exiting of the full-screen content browser experience via a subordinate interface, significantly simplifying the user experience. In one embodiment, the content representation feature as a picture-in-picture component could be applied to other parts of the platform operating service. In another embodiment, the service can be used to view related content while in queue; for example, if a user is waiting to play a game, the user can see content related to the game, such as the game description, user-generated content made using the game, or tips and tricks that could help with gameplay.

[00613] Additional embodiments include alternate indications of content or browser state to the user, including, but not limited to, displaying the user's position in the queue, indicating to the user that they are in a queue, a display of the content being viewed in the content browser (such as the name or a representative icon of the content), or a display of the platform on which the content is being hosted. A timer (i.e., not showing position in the queue but having a countdown) can also be used. In another embodiment, the user could be in a lobby, waiting room, or other chat-enabled interface, and the service could display the number of messages, direct communications, or other pertinent information. Further, the browser component or indication of the browser state could be placed in other positions within the platform operating service; for example, the browser could be placed as a tile within the integrated dashboard of the platform operating service. On some device operating systems, cross-application picture-in-picture functionality is enabled for static displays of content, so alternative embodiments also include the use of this static picture-in-picture display component to display a user's position in a queue.

[00614] In some embodiments, alternative placement or display mechanisms can be used to enable the user to perform other activities on the platform operating service while the content browser is active. For example, the content browser could be placed in the background of the integrated dashboard, displayed on top of the integrated dashboard but with enough transparency to see other portions of the platform operating service, or only a portion of the content browser could be displayed to the user, or the presence of a content browser session could be indicated using a service indicator.
Further, some alternative embodiments could include displaying other portions of the platform operating service over or alongside the content browser; this could still enable the user to use other portions of the platform operating service while the content browser remains active, or inversely, enable use of the content browser while displaying other interfaces of the platform operating service. Embodiments can be used that include any combination of such alternative placement or display states, and could move between these states based on user input or other stimuli.

[00615] In other embodiments, the browser interface in a subordinate state could be used to surface other forms of contextual actions besides returning to the content browser in a full screen state. For example, tapping and holding the browser interface, or pressing the software service button, could invite another user to play with the user, exit the content, or view details or metadata about the content inside the browser component. Other embodiments could also include the platform operating service's automatically reopening the full screen content browser experience without user intervention; in such embodiments, the platform operating service could return to the content browsing experience automatically when the user-requested content is available for viewing or interaction. Another alternative embodiment could be applying the aforementioned techniques on internal or local gameplay platforms that may require queueing to access. In other embodiments, the behavior of the content browsing termination prevention could dynamically respond to a number of factors instead of, or in addition to, the user-specified behavior. For example, the content browser could prevent termination for a longer period of time when the computing device's battery level is higher, reduce the termination period when the gameplay device has weak network signal, or disable the functionality entirely when the device operating system indicates it is operating in a power conservation state.
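The dynamic termination-prevention policy sketched in the paragraph above, combining battery level, network quality, and power-conservation state, could look like the following. The scaling factors here are hypothetical illustrations; the source specifies only the direction of each adjustment, not the magnitudes.

```python
def keep_alive_seconds(base_window, battery_pct, weak_network, low_power_mode):
    """Compute a dynamic keep-alive window (hypothetical policy).

    - Power-conservation mode disables termination prevention entirely.
    - The user's base window scales with battery level (longer when full).
    - A weak network signal shortens the window (illustrative 50% cut).
    """
    if low_power_mode:
        return 0
    window = base_window * (battery_pct / 100.0)
    if weak_network:
        window *= 0.5
    return window
```

The function is pure, so the platform operating service could re-evaluate it whenever the device reports a change in battery or network conditions.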

[00616] In another embodiment, when the content browser recognizes that content enters a queue, or other recognized passive state, it could enter the subordinate state automatically and/or indicate to the user that they can perform other activities on the platform operating service while waiting on the state to become active.

[00617] As described above in a previous embodiment, the platform operating service can provide a picture-in-picture component that is distinct from the picture-in- picture component provided by the device operating system. This could be due to a limitation in the device operating system, such as limiting the content type eligible for picture in picture support. The content browser component can maintain and store state of the content within, for example, in the case of a game, the gameplay state, or in the case of a video, the time the video is being played.

[00618] When a content browser, such as an iOS web view (e.g., WKWebView), is removed from one view hierarchy and installed into another view hierarchy, its contents and metadata, as a whole, can remain intact. Also, when a content browser is rendered inside a parent container, any motion that the parent view undergoes introduces no render issues on the subordinate content. "Swapping" the contents of the web session between the subordinate and full-screen modes, instead of re-loading them, allows the user not to lose their content's context, such as loading progress, current display and data state, or potentially their spot in a queue. This can be achieved by having the contents stored globally in memory and passed, by an interactor, to another view. Thus, the reference to the content data is not lost when dismissing one location and showing another. When the user engages the controller input to toggle between locations, a swap routine can be performed; that is, if the picture-in-picture view is open, switch to the full-screen browser; if the browser is open, switch to PiP. In this location swap routine, the new appropriate location can be rendered, with the old one being torn down first. Then, the parent view of the content can be adjusted to be that of the newly rendered location. A custom animation routine can be employed such that the deferred addition of the content feels smooth and clearly communicates the behavior of the transition to the end user.
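The re-parenting swap described above can be illustrated with a toy view hierarchy. This is a language-agnostic sketch (the original context is an iOS view hierarchy around WKWebView); the `View` class and field names are invented for illustration.

```python
class View:
    """Toy view node; the children list models a view hierarchy."""
    def __init__(self, name):
        self.name = name
        self.children = []

def move_to_parent(content, old_parent, new_parent):
    """Re-parent the single content-browser instance instead of reloading it.

    The content is torn out of the old location first, then the same object
    is installed under the new location, so session state (queue position,
    playback time, etc.) travels with the instance.
    """
    if content in old_parent.children:
        old_parent.children.remove(content)   # tear down the old location
    new_parent.children.append(content)       # deferred re-attach

full_screen, pip = View("full_screen"), View("pip")
browser = View("browser")
browser.session = {"queue_position": 7}       # state held on the instance
full_screen.children.append(browser)

# SSB press: swap from full screen to the picture-in-picture location.
move_to_parent(browser, full_screen, pip)
```

Because only the parent reference changes, the browser's session dictionary is untouched by the swap, which is the property that keeps the user's queue position intact.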

[00619] Figure 102 is a flow chart of a method of an embodiment. As shown in Figure 102, the user is on the integrated dashboard (act 4505) and requests content to launch (act 4510). Then, a determination is made regarding whether there is existing content in the browser (act 4515). If there is, the method confirms that the user wants to replace the existing content (act 4520). If there isn't, the content launches in full screen (act 4525). Next, the user presses the SSB (act 4530), in response to which the content transitions to a picture-in-picture component (act 4535). The user is then shown the integrated dashboard with the content browser overlaid (act 4540).
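The launch decision path of this flow chart can be sketched as follows. The step names are hypothetical labels for the acts described; the sketch only shows the branching logic, not the UI work each act performs.

```python
def launch_flow(has_existing_content, user_confirms_replace):
    """Sketch of the Figure 102 decision path: launching content checks for
    an existing browser session and asks for confirmation before replacing
    it; otherwise the content launches directly in full screen."""
    steps = ["integrated_dashboard", "request_launch"]
    if has_existing_content:
        steps.append("confirm_replace")
        if not user_confirms_replace:
            return steps  # user declines; the existing session is kept
    steps.append("full_screen_content")
    return steps
```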

[00620] When the user closes the platform operating service (act 4545), a determination is made as to what content was in the content browser (act 4550). If the browser was viewing content, a push notification is sent to the user about background content behavior (act 4555), and a display of content being kept active is activated via the computing device's operating system (act 4560). If the content is queued for content viewing, a push notification is sent to the user affirming the queue position (act 4563), and a display of queue positions is activated via the device's operating system (act 4565). In either event, a determination is then made of whether the user opted in to keep content active while the platform operating service is in the background (act 4570). If the user opted in, the request is kept alive for a specified time period (act 4575). If the user didn't opt in, the platform operating service enters into a background state (acts 4580 and 4585).

[00621] Figure 103 is a flow diagram of a background state indication/termination prevention method of an embodiment. As shown in Figure 103, the user presses the SSB (act 4605), and a request is made to transition to an un-focused state (act 4610), which triggers the transition (act 4615). An un-focused browser representation is displayed (act 4620), and a gesture is made to exit the platform operating service (act 4625). Then, the user exits the platform operating service (act 4630). Enqueue notifications are sent (acts 4635 and 4640), and the platform operating service is reopened after a notification is tapped (acts 4645 and 4650). The queue position information is sent (act 4655), and an indicator is shown (acts 4660 and 4665). A game is being played (act 4670), the indicator of the current game is displayed (act 4675), and the queue position is displayed (act 4680).

[00622] Figure 104 is a flow diagram of a user interaction of an embodiment. As shown in Figure 104, an integrated dashboard is displayed (act 4705), and a user selects content (act 4710). An identifier for the content is sent (act 4715), and a request is made for the content (act 4720). A document is returned with a position in a queue (act 4725), which is displayed (act 4730). The content is requested when ready (act 4735) and then returned (acts 4740 and 4745). After the SSB is pressed (act 4750), a request is made to transition to an un-focused state (act 4755), and the transition is made (act 4760). The exit button is pushed to exit the browsing session (act 4765).

[00623] Many alternatives can be used.
For example, the SSB can launch the controller app or the controller app can be automatically launched upon device plug in but force users to tap the screen to trigger the focusing/un-focusing of content. In another alternative, a visual affordance is not used, and the content browser is kept alive in the background. This way, users derive the benefit of keeping their place in the game. Further, a user can launch content, and a countdown or some similar affordance can appear on the screen while it is loading. This would give the user the benefit of being able to use the wait time productively (and a PiP or other un-focused content would not need to be supported (besides the initial countdown, which is not an un-focused piece of content)). In this way, the user can hang out in the platform operating service while the game the user wants to play loads in the background, allowing the user to spend that otherwise idle time productively. In yet another alternative, instead of the SSB taking the user in and out of the un-focused state, other user input devices on the controller or a touch screen element on the computing device can perform that function.

[00624] Conclusion

[00625] It is intended that the foregoing detailed description be understood as an illustration of selected forms that the invention can take and not as a definition of the invention. It is only the following claims, including all equivalents, that are intended to define the scope of the claimed invention. Finally, it should be noted that any aspect of any of the embodiments described herein can be used alone or in combination with one another.

[00626] III. Examples

[00627] The disclosed technology is illustrated, for example, according to various examples described below. Various examples of the disclosed technology are described as numbered examples (1, 2, 3, etc.) for convenience. These are provided as examples and do not limit the disclosed technology. It is noted that any of the dependent examples may be combined in any combination and placed into a respective independent example. The other examples can be presented in a similar manner.

[00628] Example 1. A method comprising: performing in a game controller comprising a primary port, a secondary port, and a non-volatile memory storing a plurality of game controller profiles: using one of the plurality of game controller profiles to enable use of the game controller by a first computing device connected with the primary port; receiving a selection of a different one of the plurality of game controller profiles; and using the selected different one of the plurality of game controller profiles to enable use of the game controller by a second computing device connected with the secondary port.

[00629] Example 2. The method of any one of the preceding Examples, wherein the selection is received from a user interaction with the first computing device.

[00630] Example 3. The method of any one of the preceding Examples, wherein the selection is received from a position of a user-movable switch on the game controller.

[00631] Example 4. The method of any one of the preceding Examples, wherein the selection is received from an identification of the second computing device, wherein the identification is received from the second computing device via the secondary port.

[00632] Example 5. The method of any one of the preceding Examples, further comprising: receiving an update to the plurality of game controller profiles stored in the non-volatile memory in the game controller.

[00633] Example 6. The method of any one of the preceding Examples, further comprising: causing an output of a visual indicator of the game controller to change in response to the game controller being used with the second computing device.

[00634] Example 7. The method of any one of the preceding Examples, further comprising: receiving analytics data from the second computing device regarding use of the game controller with the second computing device; and sending the analytics data to the first computing device for local or remote analysis.

[00635] Example 8. The method of any one of the preceding Examples, wherein the secondary port is configured to use a USB protocol.

[00636] Example 9. The method of any one of the preceding Examples, wherein the secondary port is configured to pass data and power.

[00637] Example 10. A game controller comprising: a first port; a second port; one or more processors; a non-transitory computer-readable medium; and program instructions stored on the non-transitory computer-readable medium that, when executed by the one or more processors, cause the one or more processors to perform functions comprising: in response to a first computing device being connected with the first port, sending a first set of descriptors to the first computing device to enable the first computing device to use the game controller; and in response to a second computing device being connected with the second port, sending a second set of descriptors to the second computing device to enable the second computing device to use the game controller.

[00638] Example 11. The game controller of any one of the preceding Examples, further comprising: a transceiver used at least by the second port.

[00639] Example 12. The game controller of any one of the preceding Examples, wherein the program instructions, when executed by the one or more processors, further cause the one or more processors to perform functions comprising: monitoring a state of differential data pins of the second port to detect whether the second computing device is connected with the second port; and enabling the transceiver in response to detecting that the second computing device is connected with the second port.

[00640] Example 13. The game controller of any one of the preceding Examples, wherein the first and second set of descriptors comprise different human interface device (HID) descriptors.

[00641] Example 14. The game controller of any one of the preceding Examples, wherein the first and second set of descriptors comprise different user input device mappings.

[00642] Example 15. The game controller of any one of the preceding Examples, wherein the first and second set of descriptors comprise different audio settings.

[00643] Example 16. The game controller of any one of the preceding Examples, wherein the first and second set of descriptors comprise different authentication protocols.

[00644] Example 17. A non-transitory computer-readable medium storing program instructions that, when executed by one or more processors of a game controller comprising primary and secondary ports, cause the one or more processors to perform functions comprising: dynamically changing a configuration of the game controller from a first configuration that allows the game controller to be used by a first computing device connected with the primary port to a second configuration that allows the game controller to be used by a second computing device connected with the secondary port.

[00645] Example 18. The non-transitory computer-readable medium of any one of the preceding Examples, wherein the configuration of the game controller is dynamically changed in response to a user input.

[00646] Example 19. The non-transitory computer-readable medium of any one of the preceding Examples, wherein the configuration of the game controller is dynamically changed automatically in response to an identification of the second computing device.

[00647] Example 20. The non-transitory computer-readable medium of any one of the preceding Examples, wherein the first and second configurations comprise different human interface device (HID) descriptors, user input device mappings, audio settings, and/or authentication protocols.

[00648] Example 21. A method comprising: performing in a game controller comprising a port and a non-volatile memory storing a plurality of game controller profiles: using one of the plurality of game controller profiles to enable use of the game controller by a first computing device connected with the port; receiving a selection of a different one of the plurality of game controller profiles; and using the selected different one of the plurality of game controller profiles to enable use of the game controller by a second computing device later connected with the port.
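A minimal, non-limiting sketch of the profile-switching method of Example 21: a store of named controller profiles with an active selection, where the selected profile is the one applied when a computing device is later connected to the port. All class and key names are hypothetical.

```python
class ProfileStore:
    """Hypothetical store of game controller profiles (Example 21).

    Profiles would live in non-volatile memory on the controller; the
    active profile is used for the next device connected to the port.
    """

    def __init__(self, profiles, active):
        self.profiles = dict(profiles)  # name -> descriptor set
        if active not in self.profiles:
            raise KeyError(active)
        self.active = active

    def select(self, name):
        """Select a different profile (e.g., from a user interaction,
        a switch position, or a device identification)."""
        if name not in self.profiles:
            raise KeyError(name)
        self.active = name

    def descriptors_for_connection(self):
        """Return the descriptors applied when a computing device is
        later connected with the port."""
        return self.profiles[self.active]
```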

[00649] Example 22. The method of any one of the preceding Examples, wherein the selection is received from a user interaction with the first computing device.

[00650] Example 23. The method of any one of the preceding Examples, wherein the selection is received from a position of a user-movable switch on the game controller.

[00651] Example 24. The method of any one of the preceding Examples, wherein the selection is received from an identification of the second computing device, wherein the identification is received from the second computing device via the port.

[00652] Example 25. The method of any one of the preceding Examples, further comprising: receiving an update to the plurality of game controller profiles stored in the non-volatile memory in the game controller.

[00653] Example 26. The method of any one of the preceding Examples, further comprising: causing an output of a visual indicator of the game controller to change in response to the game controller being used with the second computing device.

[00654] Example 27. The method of any one of the preceding Examples, further comprising: receiving analytics data from the second computing device regarding use of the game controller with the second computing device; and sending the analytics data to the first computing device for local or remote analysis.

[00655] Example 28. The method of any one of the preceding Examples, wherein the port is configured to use a USB protocol.

[00656] Example 29. The method of any one of the preceding Examples, wherein the port is configured to pass data and power.

[00657] Example 30. A method comprising: performing by a platform operating service application in a computing device coupled with a mobile game controller: aggregating games available to play, wherein the games comprise at least one game that is locally playable from the computing device and at least one game that is remotely playable; and displaying an integrated dashboard that presents the aggregated games for user selection via the mobile game controller.
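An illustrative, non-limiting sketch of the aggregation step of Example 30: locally playable games and games from remote services are merged into a single dashboard list, each tagged with its source. The data shapes and field names are hypothetical.

```python
def aggregate_games(local_games, remote_services):
    """Sketch of Example 30's aggregation: merge games that are locally
    playable with games playable from remote services into one list
    suitable for an integrated dashboard.

    local_games: iterable of locally playable game titles
    remote_services: mapping of service name -> iterable of titles
    """
    dashboard = [{"title": g, "source": "local"} for g in local_games]
    for service, games in remote_services.items():
        dashboard.extend({"title": g, "source": service} for g in games)
    return dashboard
```

The dashboard entries could then be rendered for user selection via the mobile game controller, with the `source` field driving how a selected game is launched.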

[00658] Example 31. The method of any one of the preceding Examples, further comprising: receiving a user selection of a game that is remotely playable; and launching the game within the platform operating service application.

[00659] Example 32. The method of any one of the preceding Examples, wherein the game is launched in a browser opened inside of the platform operating service application.

[00660] Example 33. The method of any one of the preceding Examples, further comprising: toggling between the game and the integrated dashboard in response to actuation of a user input element on the mobile game controller.

[00661] Example 34. The method of any one of the preceding Examples, further comprising: receiving a user selection of a game that is remotely playable; and launching the game within an application separate from the platform operating service application.

[00662] Example 35. The method of any one of the preceding Examples, further comprising: identifying a game previously played but not played through the platform operating service application; and displaying a prompt to continue the game.

[00663] Example 36. The method of any one of the preceding Examples, wherein the integrated dashboard is displayed in response to actuation of a user input element on the mobile game controller.

[00664] Example 37. The method of any one of the preceding Examples, wherein at least one of the aggregated games is a game suggested based on a property of the mobile game controller.

[00665] Example 38. The method of any one of the preceding Examples, wherein the property comprises a SKU.

[00666] Example 39. The method of any one of the preceding Examples, wherein the mobile game controller is configured for a particular external game service, and wherein the method further comprises prioritizing a display in the integrated dashboard of a game playable from the particular external game service.

[00667] Example 40. The method of any one of the preceding Examples, wherein the integrated dashboard presents details of the aggregated games in expandable inline pages.

[00668] Example 41. The method of any one of the preceding Examples, wherein the integrated dashboard is configured to allow user selection of one of the aggregated games.

[00669] Example 42. The method of any one of the preceding Examples, further comprising presenting a perk and/or benefit based on a SKU of the mobile game controller.

[00670] Example 43. The method of any one of the preceding Examples, further comprising receiving a request to share an image or clip via the mobile game controller.

[00671] Example 44. The method of any one of the preceding Examples, wherein the at least one game that is remotely playable is remotely playable from at least one external game service.

[00672] Example 45. The method of any one of the preceding Examples, wherein the at least one external game service comprises a cloud game streaming service.

[00673] Example 46. The method of any one of the preceding Examples, wherein the at least one game that is remotely playable is remotely playable from a console using a remote play feature.

[00674] Example 47. A non-transitory computer-readable medium storing program instructions that, when executed by one or more processors of a computing device, cause the one or more processors to perform functions comprising: identifying games that are available to play using the computing device and a mobile game controller in communication with the computing device, wherein at least one game is locally stored in the computing device, and wherein at least one other game is remotely stored external to the computing device; and displaying a user interface that presents the identified games for user selection via the mobile game controller.

[00675] Example 48. The non-transitory computer-readable medium of any one of the preceding Examples, wherein the program instructions, when executed by the one or more processors, further cause the one or more processors to perform functions comprising: providing a search function to query a database of games across multiple game services.

[00676] Example 49. The non-transitory computer-readable medium of any one of the preceding Examples, wherein the program instructions, when executed by the one or more processors, further cause the one or more processors to perform functions comprising: changing a behavior of a user input element on the mobile game controller based on an identification of a game provider associated with a game being played by the computing device.

[00677] Example 50. The non-transitory computer-readable medium of any one of the preceding Examples, wherein the program instructions, when executed by the one or more processors, further cause the one or more processors to perform functions comprising: receiving from a server in communication with the computing device: a recommended game, a friend’s highlight, a trending highlight, a perk, a reward, active screen sharing, and/or promoted content.

[00678] Example 51. A computing device comprising: one or more processors; a non-transitory computer-readable medium; and program instructions stored on the non-transitory computer-readable medium that, when executed by the one or more processors, cause the one or more processors to perform functions comprising: aggregating content that is available to play locally from the computing device and content that is available to play remotely from a remote content service; and displaying the aggregated content for selection via a handheld controller in communication with the computing device.

[00679] Example 52. The computing device of any one of the preceding Examples, wherein the program instructions, when executed by the one or more processors, further cause the one or more processors to perform functions comprising: starting or stopping recording in a circular buffer in the computing device in response to a user interface element on the handheld controller being pressed for a first duration; and saving a last N seconds of recorded content from the circular buffer in response to the user interface element on the handheld controller being pressed for a second duration.
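The record-and-save behavior of Example 52 can be sketched, in an illustrative and non-limiting way, as a circular buffer that retains only the most recent frames: one press duration toggles recording, and a second press duration saves the last N seconds. The class name, frame rate, and press-duration handling are all hypothetical.

```python
from collections import deque

class ClipRecorder:
    """Sketch of Example 52: circular-buffer recording on the computing
    device, controlled by a user interface element on the controller."""

    def __init__(self, seconds, fps=30):
        # deque with maxlen acts as the circular buffer: old frames are
        # discarded automatically once capacity is reached.
        self.buffer = deque(maxlen=seconds * fps)
        self.recording = False

    def toggle_recording(self):
        """Invoked when the element is pressed for the first duration:
        start or stop recording into the circular buffer."""
        self.recording = not self.recording

    def add_frame(self, frame):
        if self.recording:
            self.buffer.append(frame)

    def save_clip(self):
        """Invoked when the element is pressed for the second duration:
        save the last N seconds of recorded content."""
        return list(self.buffer)
```

Because the deque silently evicts the oldest frames, `save_clip` always yields at most the most recent N seconds of content, matching the claimed behavior.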

[00680] Example 53. The computing device of any one of the preceding Examples, wherein the program instructions, when executed by the one or more processors, further cause the one or more processors to perform functions comprising: editing the recorded content.

[00681] Example 54. The computing device of any one of the preceding Examples, wherein the program instructions, when executed by the one or more processors, further cause the one or more processors to perform functions comprising: providing a suggestion to improve a physical coupling of the computing device and the handheld controller.

[00682] Example 55. The computing device of any one of the preceding Examples, wherein the content comprises a game.

[00683] Example 56. The computing device of any one of the preceding Examples, wherein the handheld controller comprises a mobile game controller.

[00684] Example 57. A non-transitory computer-readable storage medium storing a computer program having instructions that, when executed by one or more processors in a computing device, cause the one or more processors to: determine whether content is playable remotely from a server or locally from the computing device; in response to determining that the content is playable remotely from the server, assign a first function to one or more user input devices of a controller in communication with the computing device; and in response to determining that the content is playable locally from the computing device, assign a second function to the one or more user input devices of the controller.
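A minimal, non-limiting sketch of the context-dependent mapping described in Examples 57-66: remotely playable (streamed) content is assigned a function that toggles focus (e.g., into and out of a picture-in-picture view), while locally playable content is assigned a function that swaps the content and the platform service between foreground and background. The function names and return values are hypothetical labels, not actual claimed identifiers.

```python
def assign_input_function(content_is_remote):
    """Sketch of the first/second function assignment of Example 57.

    content_is_remote: True if the content is playable remotely from a
    server; False if it is playable locally from the computing device.
    """
    if content_is_remote:
        # First function: un-focus / re-focus the streamed content
        # while keeping it in the foreground (e.g., picture-in-picture).
        return "toggle_focus"
    # Second function: move the platform operating service to the
    # background and place the local content in the foreground.
    return "swap_foreground"
```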

[00685] Example 58. The non-transitory computer-readable storage medium of any one of the preceding Examples, wherein the first function causes the computing device to un-focus the content while keeping the content in a foreground.

[00686] Example 59. The non-transitory computer-readable storage medium of any one of the preceding Examples, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: after the content is un-focused, assign a function to the one or more buttons that causes the computing device to re-focus the content.

[00687] Example 60. The non-transitory computer-readable storage medium of any one of the preceding Examples, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: in response to the content being playable remotely from the server, assign a function to the one or more buttons that causes the computing device to launch the content.

[00688] Example 61. The non-transitory computer-readable storage medium of any one of the preceding Examples, wherein the un-focused content is displayed in a minimized display area in a picture-in-picture display.

[00689] Example 62. The non-transitory computer-readable storage medium of any one of the preceding Examples, wherein a non-minimized display area in the picture-in-picture display displays a user interface.

[00690] Example 63. The non-transitory computer-readable storage medium of any one of the preceding Examples, wherein the un-focused content is displayed as an overlay to other displayed content.

[00691] Example 64. The non-transitory computer-readable storage medium of any one of the preceding Examples, wherein the other displayed content comprises a user interface.

[00692] Example 65. The non-transitory computer-readable storage medium of any one of the preceding Examples, wherein the second function causes the computing device to move a platform operating service to a background and place the content in a foreground.

[00693] Example 66. The non-transitory computer-readable storage medium of any one of the preceding Examples, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: in response to the content being in the foreground, assign a function to the one or more buttons that causes the computing device to move the content to the background and place the platform operating service in the foreground.

[00694] Example 67. The non-transitory computer-readable storage medium of any one of the preceding Examples, wherein the content is playable via a controller app in the computing device.

[00695] Example 68. A method comprising: performing in a computing device in communication with a controller: determining whether content is focusable or un-focusable; and performing at least one of the following: in response to determining that the content is focusable, assigning a first function to one or more user input devices of the controller, wherein the first function selectively focuses and un-focuses the content; and in response to determining that the content is un-focusable, assigning a second function to the one or more user input devices of the controller, wherein the second function selectively swaps the content between a foreground and a background.

[00696] Example 69. The method of any one of the preceding Examples, wherein content streamed from a server is focusable and content local to the computing device is un-focusable.

[00697] Example 70. The method of any one of the preceding Examples, further comprising displaying un-focused content in a minimized area of a picture-in-picture display.

[00698] Example 71. The method of any one of the preceding Examples, further comprising displaying un-focused content as an overlay to other displayed content.

[00699] Example 72. The method of any one of the preceding Examples, wherein assigning at least one of the first and second functions comprises sending mapping information to the controller.

[00700] Example 73. The method of any one of the preceding Examples, wherein the content is playable via a controller app in the computing device.

[00701] Example 74. A controller comprising: an interface configured to place the controller in communication with a computing device; one or more user input devices; and one or more processors configured to communicate with the interface and the one or more user input devices, wherein the one or more processors are further configured to receive mapping information from the computing device to map a function to the one or more input devices based on a context of content.

[00702] Example 75. The controller of any one of the preceding Examples, wherein the context specifies whether the content is focusable or un-focusable.

[00703] Example 76. The controller of any one of the preceding Examples, wherein the context specifies whether the content is local content or streamed content.

[00704] Example 77. The controller of any one of the preceding Examples, wherein the controller comprises a game controller and the computing device comprises a mobile device.

[00705] Example 78. The controller of any one of the preceding Examples, wherein the controller is configured to fit around the computing device.

[00706] Example 79. The controller of any one of the preceding Examples, wherein the content is playable via a controller app in the computing device.

[00707] Example 80. A system comprising: a controller; and a computing device comprising: one or more processors; and a non-transitory computer-readable storage medium storing a computer program having instructions that, when executed by the one or more processors, cause the one or more processors to: determine whether content is playable remotely from a server or locally from the computing device; in response to determining that the content is playable remotely from the server, assign a first function to one or more user input devices of the controller; and in response to determining that the content is playable locally from the computing device, assign a second function to the one or more user input devices of the controller.

[00708] Example 81. A non-transitory computer-readable storage medium storing a computer program having instructions that, when executed by one or more processors in a computing device, cause the one or more processors to: receive, from a remote device, identification of content playable from the remote device, wherein the content playable from the remote device is compatible with a controller in communication with the computing device; identify content locally stored in the computing device, wherein some of the content locally stored in the computing device is not compatible with the controller; determine a subset of the content locally stored in the computing device that is compatible with the controller, wherein the determining is made based on the content playable from the remote device; and display an identification of the subset in a user interface.
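The subset determination of Example 81 can be sketched, illustratively and without limitation, as an intersection: the remote device reports which content is controller-compatible, and the locally stored content is filtered against that report. The function name and the use of plain title strings are hypothetical simplifications of the claimed identification data.

```python
def compatible_subset(local_titles, remote_compatible_titles):
    """Sketch of Example 81: determine the subset of locally stored
    content that is compatible with the controller, based on the
    identification of compatible content received from the remote
    device.

    local_titles: identifiers of content locally stored on the device
    remote_compatible_titles: identifiers the remote device reports
        as controller-compatible
    """
    remote = set(remote_compatible_titles)
    # Keep local order so the user interface can display the subset
    # in the same sequence the local content was identified.
    return [t for t in local_titles if t in remote]
```

Note that, as in Example 99, this filtering can run entirely on the computing device, so the remote device need not be informed of what local content exists.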

[00709] Example 82. The non-transitory computer-readable storage medium of any one of the preceding Examples, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: provide, to the remote device, the identification of the subset; and receive, from the remote device, the identification of the subset for display in the user interface.

[00710] Example 83. The non-transitory computer-readable storage medium of any one of the preceding Examples, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: send, to the remote device, a request for the identification of the content playable from the remote device.

[00711] Example 84. The non-transitory computer-readable storage medium of any one of the preceding Examples, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: create a content datastore.

[00712] Example 85. The non-transitory computer-readable storage medium of any one of the preceding Examples, wherein content locally stored in the computing device is displayed differently than content playable from the remote device.

[00713] Example 86. The non-transitory computer-readable storage medium of any one of the preceding Examples, wherein a display of the identification of the subset is organized by level of support.

[00714] Example 87. The non-transitory computer-readable storage medium of any one of the preceding Examples, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: display, in the user interface, an identification of content locally stored in the computing device that is not compatible with the controller.

[00715] Example 88. The non-transitory computer-readable storage medium of any one of the preceding Examples, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: send a request to make currently-incompatible content compatible with the controller.

[00716] Example 89. The non-transitory computer-readable storage medium of any one of the preceding Examples, wherein the content is playable via a controller app in the computing device.

[00717] Example 90. A method comprising: performing in a computing device in communication with a remote device: receiving, from the remote device, identification of remote content stored on the remote device; identifying native content stored in the computing device; comparing characteristics of the remote content and the native content to assess compatibility of the native content with a controller in communication with the computing device; and displaying, for selection, a subset of the native content that is compatible with the controller.

[00718] Example 91. The method of any one of the preceding Examples, further comprising: sending, to the remote device, an identification of the subset.

[00719] Example 92. The method of any one of the preceding Examples, further comprising: sending, to the remote device, a request for the identification of the remote content.

[00720] Example 93. The method of any one of the preceding Examples, further comprising: creating a content datastore.

[00721] Example 94. The method of any one of the preceding Examples, wherein the native content is displayed differently than the remote content.

[00722] Example 95. The method of any one of the preceding Examples, further comprising: organizing the display by level of support.

[00723] Example 96. The method of any one of the preceding Examples, further comprising: displaying, in the user interface, an identification of incompatible content.

[00724] Example 97. The method of any one of the preceding Examples, further comprising: sending a request to make currently-incompatible content compatible with the controller.

[00725] Example 98. The method of any one of the preceding Examples, wherein the content is playable via a controller app in the computing device.

[00726] Example 99. A computing device comprising: a memory configured to store local content, some of which is incompatible with a controller; a first interface configured to communicate with the controller; a second interface configured to communicate with a remote device; and one or more processors configured to communicate with the first and second interfaces and the memory, wherein the one or more processors are further configured to determine a subset of the local content that is compatible with the controller without informing the remote device of the local content.

[00727] Example 100. The computing device of any one of the preceding Examples, wherein the one or more processors are further configured to display the subset for selection by a user.

[00728] Example 101. The computing device of any one of the preceding Examples, wherein the one or more processors are further configured to determine the subset by comparing characteristics of the local content and remote content that is compatible with the controller.

[00729] Example 102. The computing device of any one of the preceding Examples, wherein the one or more processors are further configured to send a request to make currently-incompatible content compatible with the controller.

[00730] Example 103. The computing device of any one of the preceding Examples, wherein the content is playable via a controller app in the computing device.

[00731] Example 104. A system comprising: a controller; and a computing device comprising: one or more processors; and a non-transitory computer-readable storage medium storing a computer program having instructions that, when executed by one or more processors in a computing device, cause the one or more processors to: receive, from a remote device, identification of content playable from the remote device, wherein the content playable from the remote device is compatible with the controller; identify content locally stored in the computing device, wherein some of the content locally stored in the computing device is not compatible with the controller; determine a subset of the content locally stored in the computing device that is compatible with the controller, wherein the determining is made based on the content playable from the remote device; and display an identification of the subset in a user interface.