Title:
PERSONALIZED OPTICS
Document Type and Number:
WIPO Patent Application WO/2023/096713
Kind Code:
A1
Abstract:
Eyewear (100) dynamically adjusts viewing effects to match the wearer, the object or scene being viewed (luminance, color prominence, glare, visual blur/noise), other conditions: sensory parameters (gaze direction, focal length, eye gestures, other eye activity, other senses, wearer inputs), medical conditions, wearer location, environmental parameters, wearer activity, use by the wearer, the wearer's field of view. The eyewear can adjust visual features presented to the wearer, such as changes in refraction, polarization/shading, color, prismatic angles/functions, 3D displays. Eyewear can be tailored to form factor: glasses, contacts, RID, IOL, facemask/helmet, vehicles, windows, screens, scopes, AR/VR devices, nerve sensors, external devices. Eyewear can adjust refraction, polarization/shading, color filtering/injection, false coloring, color change; prismatic angles/functions. Eyewear can respond to wearer activity: police, military, firefighter, emergency responder, search and rescue, vehicle operation, sporting/theme-park events, viewing advertising/storefronts, conversation. Hybrid optimization of eyewear can be personalized to users.

Inventors:
LEWIS, Scott (US)
Application Number:
PCT/US2022/047070
Publication Date:
June 01, 2023
Filing Date:
October 19, 2022
Assignee:
PERCEPT TECH INC (US)
International Classes:
G02B27/01; A41D13/11; G02C7/10; G02C7/16
Foreign References:
US 11160319 B1 (2021-11-02)
US 2016/0029716 A1 (2016-02-04)
US 2018/0000179 A1 (2018-01-04)
Attorney, Agent or Firm:
SWERNOFSKY, Steven (US)
Claims

1. Facewear including a face covering disposed to obscure or obstruct at least a portion of a user’s face; a processor disposed to receive first information, the processor being disposed to present an audio/video effect on the face covering.

2. Facewear as in claim 1, wherein the first information includes an audio/video stream; the processor is disposed to decode the audio/video stream to provide second information; the processor is disposed to use the second information to present the audio/video effect on the face covering.
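
As a rough illustration of the mechanism claims 1 and 2 recite, the sketch below models the processor path from first information (an audio/video stream) through decoding to the presented effect. Every name in it (Frame, decode_stream, FaceCoveringDisplay) is an assumption for illustration, not part of the application:

```python
# Hypothetical sketch of the claims 1-2 pipeline: receive first information
# (an audio/video stream), decode it into second information (frames), and
# present the result on the face covering. All names here are illustrative;
# the specification does not define a concrete API.

from dataclasses import dataclass
from typing import Iterable, Iterator

@dataclass
class Frame:
    """Decoded video frame, with optional audio samples."""
    pixels: bytes
    audio: bytes = b""

def decode_stream(av_stream: Iterable[bytes]) -> Iterator[Frame]:
    # Stand-in decoder: real facewear would use an actual audio/video codec.
    for packet in av_stream:
        yield Frame(pixels=packet)

class FaceCoveringDisplay:
    """Illustrative presentation surface on the face covering."""
    def show(self, frame: Frame) -> None:
        print(f"presenting {len(frame.pixels)}-byte frame on face covering")

def run_facewear(av_stream: Iterable[bytes], display: FaceCoveringDisplay) -> None:
    for frame in decode_stream(av_stream):   # "second information"
        display.show(frame)                  # the audio/video effect

run_facewear([b"\x00" * 1024, b"\xff" * 2048], FaceCoveringDisplay())
```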

3. Facewear as in claim 1, wherein the audio/video effect includes a video image or a moving video.

4. Facewear as in claim 1, wherein the audio/video effect includes an audio output, a song, voice, or a sound effect.

5. Facewear as in claim 1, wherein the presentation device includes one or more of: a lamp or photodiode, a color-alterable device,

6. Facewear as in claim 1, wherein the presentation device is perceivable to an observer of the wearer of the facewear.

7. Facewear as in claim 1, wherein the audio/video effect includes an image of a mouth disposed to express at least one movement of the user’s mouth.

8. Facewear as in claim 7, wherein the movement of the user’s mouth is disposed to express at least a portion of the user’s voice.

9. Facewear as in claim 7, wherein the movement of the user’s mouth is disposed to show an image to express one or more of: speech, a song, a sound effect, a vocal effect.

10. Facewear as in claim 7, wherein the movement of the user’s mouth is disposed to show an image to express one or more of: blowing a kiss, breathing, burping, clearing one’s throat, coughing, gagging, groaning, hiccups, humming, licking or smacking one’s lips, panting.

11. Facewear as in claim 1, wherein the audio/video effect includes an image disposed to show one or more of: an emoji, an emotion, an emoticon, or another symbol.

12. Facewear as in claim 1, wherein the audio/video effect includes an image disposed to show one or more of: text in a language being spoken, text in a translation thereof, a known symbol, artwork, a design, a photograph, or an image of a painting, a representation of data, medical information with respect to the user, a livestream, a movie, a television image, or a webcam image.

13. Facewear as in claim 1, wherein the audio/video effect includes an image disposed to show one or more of: an image associated with a game or sport, an image received from among a group of players thereof, an image received from among a group of observers thereof, an image received from among a group of attendees at a political rally, an image showing support for a candidate or an opinion.

14. Facewear as in claim 1, wherein the audio/video effect includes an image disposed to show one or more of: an image of at least a portion of a concert, an image of at least a portion of a party, an image of at least a portion of a political event, an image of at least a portion of an entertainment event; an image of at least a portion of a sports event.

15. Facewear as in claim 1, wherein the audio/video effect includes an image disposed to show one or more of: an image providing a warning signal, an image providing information to alert a nearby second person.

16. Facewear as in claim 15, wherein the nearby second person includes an emergency responder, a law enforcement officer, a volunteer, another person disposed to provide assistance.

17. Facewear including a presentation device disposed on or near at least a portion of a user’s face; a processor disposed to receive first information, the processor being disposed to present an audio/video effect on the presentation device.

18. Facewear as in claim 17, wherein the first information includes an audio/video stream; the processor is disposed to decode the audio/video stream to provide second information; the processor is disposed to use the second information to present the audio/video effect on the presentation device.

19. Facewear as in claim 17, wherein the audio/video effect includes a video image or a moving video.

20. Facewear as in claim 17, wherein the audio/video effect includes an audio output, a song, voice, or a sound effect.

21. Facewear as in claim 17, wherein the presentation device includes one or more of: a lamp or photodiode, a color-alterable device,

22. Facewear as in claim 17, wherein the presentation device is perceivable to an observer of the wearer of the facewear.

23. Facewear as in claim 17, wherein the audio/video effect includes an image of a mouth disposed to express at least one movement of the user’s mouth.

24. Facewear as in claim 23, wherein the movement of the user’s mouth is disposed to express at least a portion of the user’s voice.

25. Facewear as in claim 23, wherein the movement of the user’s mouth is disposed to show an image to express one or more of: speech, a song, a sound effect, a vocal effect.

26. Facewear as in claim 23, wherein the movement of the user’s mouth is disposed to show an image to express one or more of: blowing a kiss, breathing, burping, clearing one’s throat, coughing, gagging, groaning, hiccups, humming, licking or smacking one’s lips, panting.

27. Facewear as in claim 17, wherein the audio/video effect includes an image disposed to show one or more of: an emoji, an emotion, an emoticon, or another symbol.

28. Facewear as in claim 17, wherein the audio/video effect includes an image disposed to show one or more of: text in a language being spoken, text in a translation thereof, a known symbol, artwork, a design, a photograph, or an image of a painting, a representation of data, medical information with respect to the user, a livestream, a movie, a television image, or a webcam image.

29. Facewear as in claim 17, wherein the audio/video effect includes an image disposed to show one or more of: an image associated with a game or sport, an image received from among a group of players thereof, an image received from among a group of observers thereof, an image received from among a group of attendees at a political rally, an image showing support for a candidate or an opinion.

30. Facewear as in claim 17, wherein the audio/video effect includes an image disposed to show one or more of: an image of at least a portion of a concert, an image of at least a portion of a party, an image of at least a portion of a political event, an image of at least a portion of an entertainment event; an image of at least a portion of a sports event.

31. Facewear as in claim 17, wherein the audio/video effect includes an image disposed to show one or more of: an image providing a warning signal, an image providing information to alert a nearby second person.

32. Facewear as in claim 31, wherein the nearby second person includes an emergency responder, a law enforcement officer, a volunteer, another person disposed to provide assistance.

33. Eyewear including a color-alterable element disposed to change color in response to one or more conditions; wherein the color-alterable element includes a portion of one or more of: a contact lens, an eyewear frame, a facemask or helmet, or a heads-up display; wherein the conditions include one or more of: an electromagnetic signal; a signal from an external device; a measure of audio/video noise within a selected field of view; a measure of ambient audio/video noise, pollution, or temperature; a measure of proximity to a selected location, object, or person; a recognized location, object, or person within a field of view; a time of day, day of the week, season, or weather; a user control or gesture; a user gaze direction or depth of focus; a user location; a user medical condition.
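
The condition-to-color behavior recited in claim 33 can be pictured as a small rule table; the following sketch is illustrative only, and the condition names, priority order, and colors are invented:

```python
# Illustrative-only sketch of claim 33: a color-alterable element changes
# color when a monitored condition fires. The condition names, priority
# order, and color choices are assumptions, not taken from the claims.

from typing import Dict

class ColorAlterableElement:
    def __init__(self) -> None:
        self.color = "clear"

    def set_color(self, color: str) -> None:
        self.color = color
        print(f"element color -> {color}")

# First matching rule wins; rules are checked in insertion order.
RULES: Dict[str, str] = {
    "user_medical_alert": "red",           # e.g. a detected medical condition
    "high_ambient_glare": "dark gray",     # e.g. luminance above a threshold
    "recognized_person_in_view": "green",  # e.g. a friend recognized in view
}

def update(element: ColorAlterableElement, conditions: Dict[str, bool]) -> None:
    for name, color in RULES.items():
        if conditions.get(name):
            element.set_color(color)
            return
    element.set_color("clear")

update(ColorAlterableElement(), {"high_ambient_glare": True})
```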

34. Eyewear as in claim 33, including one or more sensors disposed to provide information to a computing device; wherein the computing device is disposed to determine, in response to information from the one or more sensors, whether the user is exhibiting a condition for which the eyewear should change color.

35. Eyewear as in claim 34, wherein the computing device is disposed to receive information with respect to one or more of: an ambient environment condition; or an allergenic condition, a pollution condition, or a weather condition; a communication condition; or a presence of one or more friends, relatives, or other persons known to the user, a presence of one or more objects recognized by the user, a presence of one or more signals directed at the user or otherwise near the user; a user condition; or a desired color or pattern for the eyewear, a desired message to communicate, an emergency condition, an emotional condition, or a medical condition.

36. Eyewear as in claim 33, wherein the color-alterable element is disposed to change color or color texture in a time-dependent manner.

37. Eyewear as in claim 36, wherein the time-dependent manner includes one or more of: in a cyclical or pseudo-cyclical manner; or in a repeating or near-repeating manner; or in which the color-alterable element cycles among a set of colors or color combinations; in a random or pseudo-random manner; or in response to a random or pseudo-random effect; or in which the color-alterable element changes among a set of randomly selected colors; in a manner imitating or providing a fluorescent effect; or in response to ultraviolet light, in response to a selected audio/video frequency of light or sound; or by emitting light in response to a musical or other audio input, by emitting light in response to a selected electromagnetic frequency; in a manner imitating a natural process; or in response to a simulation of a natural artifact; or in a manner imitating glitter or a molded material including glitter, a sparkling material, or a pattern evincing gemstones; or in a manner imitating a natural creature; or in response to a pattern evincing one or more of: a butterfly, a chameleon, a hummingbird, or a lightning bug.
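
One of the time-dependent manners claim 37 recites (cycling among a set of colors) reduces to a scheduler over a palette. A minimal sketch, with an invented palette and timing:

```python
# Minimal sketch of one "time-dependent manner" from claim 37: cycling the
# element through a fixed palette. The palette, period, and step count are
# invented; real hardware would schedule updates rather than block.

import itertools
import time

def cycle_colors(set_color, palette=("amber", "teal", "violet"),
                 period_s=1.0, steps=6):
    for color in itertools.islice(itertools.cycle(palette), steps):
        set_color(color)
        time.sleep(period_s)

cycle_colors(lambda c: print("color ->", c), period_s=0.0)
```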

38. Eyewear as in claim 33, wherein the color-alterable element is disposed to signal to an observer that a user of the eyewear is subject to a selected condition.

39. Eyewear as in claim 38, wherein at least one observer includes the user of the eyewear.

40. Eyewear as in claim 33, wherein the color-alterable element includes a contact lens disposed to change color, wherein the color to which the color-alterable element changes is responsive to a color of the user’s lens.

41. Eyewear as in claim 40, wherein the color-alterable element includes a contact lens disposed to change color, wherein the color change of the color-alterable element is limited to a region outside the user’s field of view.

42. Eyewear as in claim 33, wherein the color-alterable element includes a contact lens disposed to change color, wherein the color change alters an external view of the user’s eye.

43. Eyewear as in claim 33, wherein the color-alterable element includes an e-chromatic material.

44. Eyewear as in claim 43, wherein the e-chromatic material is disposed to change color in response to a received electromagnetic signal other than visible light.

45. Eyewear as in claim 33, wherein the color-alterable element includes one or more of: a temple, a flat portion of the eyewear, a frame, a portion of a lens holder, or a nosepiece.

46. Eyewear as in claim 33, wherein the color-alterable element includes one or more of: a first portion of the frame nearer the lenses, or a second portion of the frame farther from the lenses.

47. Eyewear as in claim 33, wherein the color-alterable element is disposed to change color to alter a color balance of ambient light.

48. Eyewear as in claim 47, wherein the color-alterable element is disposed to change color to alter the user’s field of view.

49. Eyewear as in claim 33, wherein the color-alterable element is disposed to change color to alter a perceived size of the user’s pupil.

50. Eyewear as in claim 49, wherein the color-alterable element is disposed to change color for a portion of the user’s pupil, to alter a color balance of the user’s vision.

51. Eyewear as in claim 49, wherein the color-alterable element is disposed to change color only outside the user’s pupil.

52. Eyewear as in claim 33, wherein the color-alterable element is disposed to change color in other than a fixed color or color pattern.

53. Eyewear as in claim 52, wherein the color change includes one or more of: a coloring/tinting effect that changes with time, a fluorescent effect, or a glitter effect.

54. Eyewear as in claim 33, wherein the color-alterable element is disposed to change color in response to an audio or video signal.

55. Eyewear as in claim 54, wherein the audio or video signal is provided by the user.

56. Eyewear as in claim 54, wherein the audio or video signal is received by the eyewear.

57. Eyewear as in claim 54, wherein the color change is disposed in synchrony with the audio or video signal.
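
Claims 54-57 describe color change in synchrony with an audio or video signal. A hedged sketch keyed to a simple loudness envelope, with invented thresholds and colors:

```python
# Hedged sketch of claims 54-57: a color change kept in synchrony with an
# audio signal, keyed here to a loudness envelope. The thresholds and
# color names are invented for illustration.

def color_for_level(rms_level: float) -> str:
    """Map a normalized audio level (0..1) to an element color."""
    if rms_level > 0.8:
        return "white"   # loud peaks flash bright
    if rms_level > 0.4:
        return "blue"
    return "dim"

for level in (0.1, 0.5, 0.9):
    print(f"level {level} -> {color_for_level(level)}")
```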

58. Eyewear as in claim 33, wherein the color-alterable element is disposed to change color to present an image of a portion of the user’s face.

59. Eyewear as in claim 33, wherein the color-alterable element is disposed to change color without interfering with the user’s field of view.

60. Eyewear as in claim 33, wherein the eyewear includes an e-chromatic material disposed inside a substantially clear coating or material, wherein coloring/tinting changes to the e-chromatic material are disposed to show externally through the substantially clear material.

61. Eyewear as in claim 33, wherein the eyewear is responsive to one or more magnetic fields, wherein the eyewear is disposed to change color when a magnetic field is present.

62. Eyewear as in claim 61, wherein the color-alterable element includes one or more polymer beads disposed to change color when a magnetic field is present.

63. Eyewear as in claim 62, wherein the polymer beads are microscopic in size.

64. Eyewear as in claim 63, wherein the polymer beads are responsive to a tool disposed to be wielded by a user.

65. Eyewear as in claim 33, wherein the one or more conditions include one or more of: an ambient condition, an ambient condition capable of affecting the user, prompting a migraine, or prompting another medical condition with respect to the user or a person within the user’s field of view, an effect within the user’s field of view, an environment in which the user is participating, a local weather condition, pollution measure, or pollen count, a medical condition observed by the user, a medical condition of a person within the user’s field of view, a medical condition of the user, an object recognized within the user’s field of view, a user input, a user activity, an emotional affect or mood observed by the user, an emotional affect or mood of a person within the user’s field of view, an emotional affect or mood of the user, or a user focus on a particular portion of their field of view, a window screen, or a vehicle window or windshield.

66. Eyewear as in claim 33, wherein the one or more conditions include one or more of: whether the user is excessively tired, has high or low blood pressure, is intoxicated, or is about to or is currently subject to migraine or photophobia.

67. Eyewear as in claim 66, wherein one or more of: medical personnel, emergency responders, nearby volunteers, or friends of the user, can identify whether the user needs aid or assistance.

68. Eyewear as in claim 33, wherein the signal from an external device includes one or more of: an audio/video signal, an electromagnetic signal, or an ultrasonic signal.

69. Eyewear as in claim 68, wherein the audio/video signal includes one or more of: music, a user’s voice, a voice received from a person other than the user.

70. Eyewear as in claim 33, wherein the eyewear includes a processor responsive to an external signal; the processor is disposed to control the shading/inverse-shading or coloring/tinting of the color-alterable element so as to present the condition of the user to one or more persons.

71. Eyewear as in claim 70, wherein the eyewear includes a processor responsive to an external signal; the external signal includes audio/video information; the processor is disposed to control the shading/inverse-shading or coloring/tinting of the color-alterable element in response to recognition of a particular person’s voice, facial image, or another selected image, in the audio/video information.

72. Eyewear as in claim 70, wherein the eyewear includes a processor responsive to an external signal; the external signal includes audio/ video information; the processor is disposed to control the shading/inverse-shading or coloring/tinting of the color-alterable element so as to show that the user has recognized a particular person’s voice or facial image, or that the user has recognized another selected image, in the audio/video information.

73. Eyewear as in claim 33, wherein the signal from an external device includes a signal responsive to a condition of a user, the condition of the user including one or more of: a distance from a selected group of persons, an emotional condition, a location, a medical condition.

74. Eyewear as in claim 73, wherein the eyewear includes a processor responsive to the signal from the external device; the processor is disposed to control the shading/inverse-shading or coloring/tinting of the color-alterable element so as to present the condition of the user to one or more persons.

75. Eyewear as in claim 74, wherein the one or more persons include one or more of: a user of the eyewear, one or more persons in line of sight of the eyewear, one or more emergency responders, one or more medical personnel, one or more law enforcement officers, one or more firefighters, one or more search/rescue personnel, one or more observers of a sports activity.

76. Eyewear as in claim 33, wherein the signal from an external device includes a signal responsive to a selected condition, the selected condition including one or more of: a count of a selected group of persons within a selected distance, a distance of the eyewear from a selected group of persons, a distance to or location of the external device, an emotional condition of a selected person, an identity of a selected person, a medical condition of a selected person.


77. Eyewear as in claim 76, wherein the eyewear is disposed to provide a change in shading/inverse-shading or coloring/tinting in response to whether a collection of selected persons are sufficiently nearby, or if one or more of those selected persons has wandered off or gotten lost.

78. Eyewear including a shading/inverse-shading or coloring/tinting element disposed to shade/inverse-shade or color/tint in response to an audio or video signal; wherein the shading/inverse-shading or coloring/tinting element is coupled to one or more lenses of the eyewear.

79. Eyewear as in claim 78, including a receiver disposed to receive a signal from an external device, wherein the external device includes one or more of: an external measurement device, a smartphone or mobile device, a health monitor, a blood oximeter, a blood pressure monitor, a heart rate monitor, a mood-sensing ring, or a thermometer or other temperature sensor; wherein the eyewear is disposed to provide feedback with respect to a physical state of a user of the eyewear, without having to perform any measurement or review any measuring device.
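
Claim 79’s feedback idea (vitals arrive from an external monitor; the eyewear signals the wearer’s physical state without the wearer performing any measurement) can be sketched as a threshold map. The thresholds and colors below are assumptions:

```python
# Hedged sketch of claim 79: the eyewear receives readings from an external
# health monitor and gives the wearer at-a-glance color feedback, so no
# measuring device has to be checked. Thresholds here are invented.

def feedback_color(heart_rate_bpm: int, spo2_percent: int) -> str:
    """Map received vitals to a lens tint the wearer can notice passively."""
    if spo2_percent < 90 or heart_rate_bpm > 180:
        return "red"     # out-of-range vitals: urgent
    if heart_rate_bpm > 150:
        return "amber"   # elevated effort during the activity
    return "neutral"

assert feedback_color(120, 98) == "neutral"
assert feedback_color(160, 97) == "amber"
assert feedback_color(190, 88) == "red"
```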

80. Eyewear as in claim 79, including equipment disposed for the user to participate in a sport or another activity requiring consistent attention or rapid reaction.

81. Eyewear as in claim 80, wherein the activity requiring consistent attention or rapid reaction includes one or more of: operating an aircraft, race car, motorcycle, dirt bike, or another vehicle; or playing a first-person shooter or another video game.

82. Eyewear as in claim 78, wherein the shading/inverse-shading or coloring/tinting element includes a portion of one or more of: a contact lens, an eyewear frame, a facemask or helmet, a heads-up display.


83. Eyewear as in claim 78, wherein the audio or video signal includes a song, another music presentation, or another audio or video signal.

84. Eyewear as in claim 78, wherein the eyewear is disposed to receive the audio or video signal and, in response thereto, shade/inverse-shade, color/tint, or illuminate.

85. Eyewear as in claim 84, wherein the audio or video signal includes an alarm; and the eyewear is disposed to provide the user with a colorized indicator with respect to the alarm.

86. Eyewear as in claim 84, wherein the eyewear is disposed to provide the user with a colorized experience with respect to the audio or video signal.

87. Eyewear as in claim 78, wherein the eyewear is disposed to respond to the audio or video signal in response to an external device.

88. Eyewear as in claim 87, wherein the audio or video signal includes an electromagnetic or ultrasonic signal.

89. Eyewear as in claim 87, wherein the external device is responsive to a person other than the user of the eyewear.

90. Eyewear as in claim 78, wherein the eyewear is disposed to perform shading/inverse-shading or coloring/tinting in response to the user’s voice or facial movements.

91. Eyewear as in claim 90, wherein the eyewear is disposed to present a variable set of shading/inverse-shading or coloring/tinting in response to one or more of: a set of user’s gestures; or when the user, in a selected way, moves their eyes, nose, mouth, chin, neck, or other elements of their head/neck.

92. Eyewear as in claim 90, wherein the eyewear is disposed to present a variable set of shading/inverse-shading or coloring/tinting when the user is speaking, singing, grunting, or otherwise making artificial noises.

93. Eyewear as in claim 90, wherein the eyewear is disposed to present an image on one or more lenses, on a facemask, on another facial covering, or on a device coupled thereto, in response to when the user is speaking, singing, grunting, or otherwise making artificial noises.

94. Eyewear as in claim 93, wherein the eyewear is disposed to present a picture of how the user’s facial features would look without the facemask or facial covering, in response to movement or sound associated with the user’s facial features.

95. Eyewear as in claim 94, wherein the picture includes one or more of: a caricature of the user’s face, a filtered version of the user’s face using an image filter, a picture of another person’s face (such as a celebrity or a friend/relative of the user), a picture of an animal or cartoon, or another arbitrary image.

96. Eyewear as in claim 93, including a microphone disposed to receive sounds from the user’s mouth, throat, or other vocal apparatus.

97. Eyewear including a lens having a first adjustment disposed to be applied to a first portion of a user’s field of view and a second adjustment disposed to be applied to a second portion of the user’s field of view; at least one of the first adjustment or the second adjustment including more than one function disposed to be applied to infalling light, the more than one function including one or more of: a selected refraction function, a selected shading/inverse-shading function, a selected coloring/tinting function, a selected color balancing function, a selected polarization function, a selected prismatic deflection function, a selected dynamic visual optimization function.
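
Claim 97’s structure, with each lens region carrying more than one function applied to infalling light, composes naturally as a pipeline of functions per region. A sketch under invented names (Light, shade, tint are stand-ins):

```python
# Illustrative composition for claim 97: two lens regions, each applying its
# own stack of functions to infalling light. The Light type and the shade/
# tint functions are stand-ins; the claim lists the function families but
# not an implementation.

from dataclasses import dataclass, replace
from typing import Callable, List

@dataclass(frozen=True)
class Light:
    luminance: float      # arbitrary units
    tint: str = "none"

def shade(factor: float) -> Callable[[Light], Light]:
    return lambda light: replace(light, luminance=light.luminance * factor)

def tint(color: str) -> Callable[[Light], Light]:
    return lambda light: replace(light, tint=color)

def apply_region(light: Light, fns: List[Callable[[Light], Light]]) -> Light:
    for fn in fns:   # "more than one function" applied to infalling light
        light = fn(light)
    return light

first_adjustment = [shade(0.5), tint("gray")]   # e.g. an upper region
second_adjustment = [shade(0.9)]                # e.g. a lower region
print(apply_region(Light(100.0), first_adjustment))
print(apply_region(Light(100.0), second_adjustment))
```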

98. Eyewear as in claim 97, wherein at least one of the first or the second portion of the user’s field of view includes a central or peripheral portion thereof.

99. Eyewear as in claim 97, wherein at least one of the first or the second portion of the user’s field of view includes a part of an upper or a lower portion thereof and a part of a central or peripheral portion thereof.

100. Eyewear as in claim 97, wherein at least one of the first or the second portion of the user’s field of view includes an upper or a lower portion thereof.

101. Eyewear as in claim 97, wherein the first and the second portion of the user’s field of view collectively include reader glasses.

102. Eyewear as in claim 97, wherein the first and the second portion of the user’s field of view collectively include a bifocal, trifocal, or multifocal lens.

103. Eyewear as in claim 97, wherein the first adjustment is responsive to one or more of: a set of infalling light or a set of images, identified in the first portion of the user’s field of view.

104. Eyewear as in claim 103, wherein the infalling light or images identified in the first portion of the user’s field of view includes one or more of: content recognized with respect to that portion of the user’s field of view, or information with respect to ambient circumstances.


105. Eyewear as in claim 97, wherein the first portion of the user’s field of view includes a central portion thereof and the second portion of the user’s field of view includes a peripheral portion thereof.

106. Eyewear as in claim 97, wherein the first portion of the user’s field of view includes an upper portion thereof and the second portion of the user’s field of view includes a lower portion thereof.

107. Eyewear as in claim 97, wherein the first portion of the user’s field of view includes one or more of: a portion with respect to content recognized within the scope of the lenses; a set of ambient circumstances recognized within the scope of the lenses, or a time of day or location; a set of user inputs provided at a time when the user is viewing content using the lenses; a set of bookmarks defined by the user with respect to functions to be applied by the lenses in response to content recognized, a set of ambient circumstances recognized, or a set of user inputs provided at a time when the user is viewing content using the lenses, within the scope of the lenses.

108. Eyewear as in claim 107, wherein the first adjustment associated with the first portion includes a selected amount of shading/inverse-shading in response to a selected location within the user’s field of view.

109. Eyewear as in claim 108, wherein the first adjustment includes a first amount of shading/inverse-shading in a close-range portion of the user’s field of view; the second adjustment includes a second amount of shading/inverse-shading in a distant portion of the user’s field of view.

110. Eyewear as in claim 108, wherein when the user is reading in a bright environment, the first adjustment includes a first amount of shading/inverse-shading in a portion of the user’s field of view associated with reading.

111. Eyewear as in claim 108, wherein when the user is operating a vehicle, the first adjustment includes a first amount of shading/inverse-shading in a portion of the user’s field of view associated with a bright field of view.

112. Eyewear as in claim 107, wherein the first adjustment associated with the first portion includes a selected coloring/tinting in response to a selected location within the user’s field of view.

113. Eyewear as in claim 112, wherein the selected coloring/tinting is in response to a brightness or coloring/tinting of an ambient environment.

114. Eyewear as in claim 112, wherein when the user is determined to be about to be, or currently, subject to migraine or photophobia, the selected coloring/tinting is disposed to increase an amount of green light in response to a brightness or coloring/tinting in the user’s field of view.

115. Eyewear as in claim 112, wherein when the user is determined to be sensitive to bright light, or is subject to migraine or photophobia, the selected coloring/tinting is disposed to reduce an amount of blue/ultraviolet light in response to a brightness or coloring/tinting in the user’s field of view.

116. Eyewear as in claim 112, wherein when the user is determined to be subject to at least some color blindness, the selected coloring/tinting is disposed to enhance those portions of the user’s field of view that relate to particular colors for which the user’s attention is to be drawn.

117. Eyewear as in claim 116, wherein particular colors for which the user’s attention are to be drawn toward are disposed in a brighter format, or in a flashing format, or particular colors for which the user’s attention are not to be drawn toward are disposed in a dimmer format, or in a grayer format.

118. Eyewear as in claim 97, including a dynamic eye tracking mechanism; an outward-looking camera; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the lens is disposed to apply two or more alternative functions to infalling light; the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine an adjustment to infalling light from the user’s field of view which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the adjustment to infalling light, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique.
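
Claim 118 describes a closed loop: eye tracking and an outward camera drive a choice of light adjustment, and the chosen adjustment feeds parameters back into the AI/ML component. A loose sketch, with a random placeholder standing in for any real model:

```python
# Loose sketch of the claim-118 loop: use eye tracking plus the outward
# camera to score candidate adjustments to infalling light, apply the one
# that best enhances what the user is looking at, and keep the choice as
# feedback for the learning component. The scorer is a placeholder, not a
# real model.

import random

CANDIDATE_ADJUSTMENTS = ["no-op", "shade-50%", "tint-green", "polarize"]

def visibility_score(adjustment: str, gaze, scene) -> float:
    # Placeholder for an AI/ML estimate of visibility under this adjustment.
    return random.random()

def choose_adjustment(gaze, scene):
    scored = {a: visibility_score(a, gaze, scene) for a in CANDIDATE_ADJUSTMENTS}
    best = max(scored, key=scored.get)
    training_example = (gaze, scene, best)  # parameters fed back to the model
    return best, training_example

best, example = choose_adjustment(gaze=(0.1, -0.2), scene="street at dusk")
print("applying", best)
```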

119. Eyewear including a shading/inverse-shading or coloring/tinting element disposed to shade/inverse-shade or color/tint in response to a combination of one or more of a color or color balance in a user’s field of view; wherein the shading/inverse-shading or coloring/tinting element is coupled to one or more lenses of the eyewear; the shading/inverse-shading or coloring/tinting element is disposed, in response to a combination of one or more of a coloring/tinting or color balance in the user’s field of view, to present one or more particular coloring/tinting in a selected more visible or less visible format.

120. Eyewear as in claim 119, wherein the coloring/tinting or color balance include a traffic indicator; the presentation includes an enhancement of that traffic indicator.


121. Eyewear as in claim 119, wherein the eyewear is responsive to a medical condition of the user including a type of color blindness or difficulty in distinguishing coloring/tinting; the coloring/tinting or color balance includes the coloring/tinting for which the user is subject to coloring/tinting blindness or difficulty in distinguishing coloring/tinting; the presentation includes an enhancement of that coloring/tinting.

122. Eyewear as in claim 119, wherein the presentation includes one or more of: an enhancement or de-enhancement of a selected coloring/tinting or color balance.

123. Eyewear as in claim 122, wherein when the user is subject to red/green color blindness or when the user’s field of view is filtered to restrict coloring/tinting to primarily shades of green, the user’s field of view is adjusted to show red coloring/tinting in one or more of: a brighter format or a flashing format.
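
Claim 123’s enhancement for red/green color blindness amounts to detecting red-dominant regions and re-presenting them more visibly. A toy sketch with an invented threshold and boost factor:

```python
# Toy sketch of claim 123: for a user with red/green color blindness,
# re-present red-dominant pixels in a brighter format. The threshold and
# boost factor are invented; flashing would alternate this boost over time.

def enhance_red(rgb, boost=1.5, red_threshold=150):
    r, g, b = rgb
    if r >= red_threshold and r > g and r > b:   # crude "red" detector
        return (min(255, int(r * boost)), g, b)
    return rgb

print(enhance_red((180, 40, 40)))   # red-dominant pixel gets brightened
print(enhance_red((40, 180, 40)))   # green pixel passes through unchanged
```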

124. Eyewear as in claim 119, including a dynamic eye tracking mechanism; an outward-looking camera; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine a preferred control of the shading/inverse-shading or coloring/tinting element which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the preferred control of the shading/inverse-shading or coloring/tinting element, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique.

125. Eyewear including a dynamic eye tracking system disposed to determine a gaze direction or focal length of a user, wherein the dynamic eye tracking system is disposed to determine a location in three-dimensional space at which the user is looking; one or more lenses disposed to have an amount of refraction be dynamically altered; a control circuit disposed to dynamically alter the amount of refraction of the one or more lenses in response to the gaze direction or focal length of the user.
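
The claim-125 control loop can be reduced to one step: estimate the focal distance from the eye tracker and set lens power to the matching accommodation demand. A sketch using the standard diopter relation (power in diopters = 1 / distance in meters); the class names and tracker interface are invented:

```python
# Sketch of the claim-125 control loop: take a gaze sample with an estimated
# focal distance and set lens power to match. The diopter relation is
# standard optics; the class names and interfaces are illustrative.

from dataclasses import dataclass

@dataclass
class GazeSample:
    direction: tuple         # unit gaze vector from the eye tracker
    focal_distance_m: float  # estimated distance to the point of regard

class TunableLens:
    def set_power_diopters(self, power: float) -> None:
        print(f"lens power set to {power:.2f} D")

def control_step(sample: GazeSample, lens: TunableLens) -> None:
    # Accommodation demand in diopters is the reciprocal of the distance in
    # meters; clamp the distance to avoid divide-by-zero at near focus.
    demand = 1.0 / max(sample.focal_distance_m, 0.1)
    lens.set_power_diopters(demand)

control_step(GazeSample(direction=(0.0, 0.0, 1.0), focal_distance_m=0.4),
             TunableLens())   # reading distance -> 2.50 D
```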

126. Eyewear as in claim 125, wherein at least one of the one or more lenses includes regions composable into a bifocal, trifocal, progressive, or otherwise multi-focal lens; the control circuit is disposed to alter the amount of refraction in response to one or more of: which region the user’s gaze direction is through, a distance of the user’s focal length; whereby the control circuit is disposed to optimize the user’s visual acuity in response to the location in three-dimensional space at which the user is looking.

127. Eyewear as in claim 125, wherein at least one of the one or more lenses includes regions composable into a bifocal, trifocal, progressive, or otherwise multi-focal lens; the control circuit is disposed to alter the amount of refraction in response to one or more of: which region the user’s gaze direction is through, a distance of the user’s focal length; whereby the control circuit is disposed to optimize the user’s visual acuity in response to distance of an object at which the user is looking.

128. Eyewear as in claim 127, wherein at least one of the lenses includes a bifocal lens having a close-vision region and a distant vision region.

129. Eyewear as in claim 127, wherein at least one of the lenses includes a trifocal lens having a close- vision region, a distant vision region, and a mid-range vision region.

130. Eyewear as in claim 127, wherein at least one of the lenses includes a progressive lens having multiple regions distinguished at one or more of: borders between regions, or relatively smoothly progressing from a first to a second correction and from a second to a third correction.


131. Eyewear as in claim 127, wherein at least one of the lenses includes multiple regions disposed at one or more of: upper or lower ranges of the user’s field of view, right or left ranges of the user’s field of view, central or peripheral portion of the user’s field of view.

132. Eyewear as in claim 125, wherein the control circuit is disposed to perform object recognition in response to the user’s gaze direction; the control circuit is disposed to alter the amount of refraction in response to one or more of: a distance of an object recognized by object recognition; whereby the control circuit is disposed to optimize the user’s visual acuity in response to distance of the object recognized by object recognition.

133. Eyewear as in claim 132, wherein the eyewear is disposed to optimize the user’s visual acuity in response to the distance of an object at which the user is looking.

134. Eyewear as in claim 132, wherein at least one of the one or more lenses is disposed to have its refraction electronically controlled.

135. Eyewear as in claim 125, including an outward-looking camera; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine a preferred control of at least one of the lenses which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the preferred control of at least one of the lenses, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique.


136. Eyewear including a dynamic eye tracking system disposed to determine a gaze direction or focal length of a user; one or more lenses each having a plurality of regions, each region disposed to perform a function altering a field of view, at least two of those regions disposed to perform different functions; wherein the different functions include at least two or more of: refraction, shading/inverse- shading, coloring/tinting or color balance alteration, polarization, prismatic alteration, or dynamic visual optimization.

137. Eyewear as in claim 136, including a dynamic eye tracking system disposed to determine a gaze direction or focal length of a user, wherein the dynamic eye tracking system is disposed to determine a location in three-dimensional space at which the user is looking; a control circuit disposed to dynamically alter a degree to which at least one of the regions performs its function with respect to the user’s field of view, in response to the gaze direction or focal length of the user.

138. Eyewear as in claim 137, wherein at least one of the regions includes a close- vision region through which the user can look when gazing at a relatively close object, the close-vision region having an amount of refraction associated with a likely distance at which the user will be looking through that region.

139. Eyewear as in claim 138, wherein the amount of refraction associated with the close-vision region is fixed.

140. Eyewear as in claim 138, wherein the amount of refraction associated with the close-vision region is adjustable in response to a determination of the user’s visual intent.

141. Eyewear as in claim 137, wherein at least one of the regions includes a distant vision region through which the user can look when gazing at a relatively distant object.

142. Eyewear as in claim 141, wherein the amount of refraction associated with the distant vision region is fixed.


143. Eyewear as in claim 141, wherein the amount of refraction associated with the distant vision region is adjustable in response to a determination of the user’s visual intent.

144. Eyewear as in claim 137, wherein at least one of the regions includes a mid-range vision region through which the user can look when gazing at a relatively mid-range object.

145. Eyewear as in claim 144, wherein the amount of refraction associated with the mid-range vision region is fixed.

146. Eyewear as in claim 144, wherein the amount of refraction associated with the mid-range vision region is adjustable in response to a determination of the user’s visual intent.

147. Eyewear as in claim 136, including an outward-looking camera; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine a function to be applied to infalling light from the user’s field of view which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the determined function to be applied to infalling light, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique.

148. Eyewear including a dynamic eye tracking system disposed to determine a gaze direction or focal length of a user; one or more lenses each disposed to perform a plurality of functions altering a field of view, wherein the functions include: refraction, shading/inverse-shading, coloring/ tinting or color balance alteration, polarization, prismatic angle deflection, or dynamic visual optimization.


149. Eyewear as in claim 148, including a first lens disposed to perform a first alteration of a first region of a field of view, the first alteration including one or more of: refraction, shading/inverse-shading, coloring/tinting or color balance alteration, polarization, prismatic angle deflection, or dynamic visual optimization; a second lens disposed to perform a second alteration of a second region of the field of view, the second alteration including one or more of: refraction, shading/inverse-shading, coloring/tinting or color balance alteration, polarization, prismatic angle deflection, or dynamic visual optimization; wherein the first and the second region each include at least a portion of the field of view, the first and the second lens having differing effects, the first and the second lens being combinable to provide a combination of effects.

150. Eyewear as in claim 149, wherein the first region includes a combination of a first selected refractive effect and/or a first selected shading/inverse-shading effect.

151. Eyewear as in claim 149, wherein the second region includes a combination of a second selected refractive effect and/or a second selected shading/inverse-shading effect.

152. Eyewear as in claim 149, wherein the first region includes a combination of a first selected refractive effect and/or a first selected shading/inverse-shading effect; the second region includes a combination of a second selected refractive effect and/or a second selected shading/inverse-shading effect different from the first shading/inverse-shading effect; wherein a user is provided with a distinction between the first and second selected refractive effects using a difference between the first and second selected shading/inverse-shading effects.

153. Eyewear as in claim 149, wherein the first region includes a combination of a first selected refractive effect and/or a first selected shading/inverse-shading effect; the second region includes a combination of a second selected refractive effect and/or a second selected shading/inverse-shading effect different from the first shading/inverse-shading effect; wherein the second lens is disposed to perform shading/inverse-shading to encourage a user to look through a selected one of the first region or the second region.

154. Eyewear as in claim 149, wherein the first region includes a combination of a first selected refractive effect and/or a first selected shading/inverse-shading effect; the second region includes a combination of a second selected refractive effect and/or a second selected shading/inverse-shading effect different from the first shading/inverse-shading effect; wherein one or more of the first or second lens is disposed to perform shading/inverse-shading to encourage a user to look toward a selected gaze direction, to promote user health.

155. Eyewear as in claim 149, wherein the first region includes a combination of a first selected refractive effect and/or a first selected shading/inverse-shading effect; the second region includes a combination of a second selected refractive effect and/or a second selected shading/inverse-shading effect different from the first shading/inverse-shading effect; wherein one or more of the first or second lens is responsive to one or more of: a viewing feature of the field of view, a user viewing attention pattern, or a user medical condition.

156. Eyewear as in claim 148, including a third lens disposed to perform a third alteration of a third region of the field of view, the third alteration including one or more of: refraction, shading/inverse-shading, coloring/tinting or color balance alteration, polarization, prismatic angle deflection, or dynamic visual optimization; wherein the first and the third region each include at least a portion of the field of view, the first and the third region having differing effects, the first and the third lens being combinable to provide a combination of effects.


157. Eyewear as in claim 156, wherein the third region includes a combination of a third selected refractive effect and/or a third selected shading/inverse-shading effect.

158. Eyewear as in claim 156, wherein the third region includes a combination of a third selected refractive effect and/or a selected coloring/tinting effect.

159. Eyewear as in claim 148, including an outward-looking camera; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine whether application of one or more of the first or second lens enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the determination of application of one or more of the first or second lens, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique.

160. Eyewear including a dynamic eye tracking system disposed to determine a gaze direction or focal length of a user; one or more lenses each disposed to perform a plurality of functions altering a field of view, wherein the functions include: refraction, shading/inverse-shading, coloring/tinting or color balance alteration, polarization, prismatic angle deflection, or dynamic visual optimization; each one lens of the one or more lenses having a different selected set of functions, each one lens associated with a different portion of a field of view.

161. Eyewear as in claim 160, wherein a first and a second lens of the one or more lenses collectively include an overlapping portion of the field of view.

162. Eyewear as in claim 160, wherein a first one of the lenses includes an upper portion of the field of view; a second one of the lenses includes a lower portion of the field of view.

163. Eyewear as in claim 160, wherein a first one of the lenses includes a center portion of the field of view; a second one of the lenses includes a peripheral portion of the field of view.

164. Eyewear as in claim 160, wherein a first and a second lens of the one or more lenses each have different ones or more of: refractive functions, shading/inverse-shading functions, coloring/tinting or color balancing functions, polarization functions, prismatic deflection functions, or dynamic visual optimization functions.

165. Eyewear as in claim 160, wherein a first and a second lens of the one or more lenses are each responsive to one or more selected features of the user’s field of view, the selected features including one or more of: content recognized with respect to a selected portion of the field of view; ambient circumstances recognized with respect to the selected portion of the field of view; a user input provided in association with the user viewing content using the lenses; a bookmark of one or more functions to be performed, selected by the user in response to one or more of the preceding factors.

166. Eyewear as in claim 160, including an outward-looking camera; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine which one or more of the plurality of functions altering a field of view the user favors; the computing device is disposed to be responsive to the determination of which one or more of the plurality of functions, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique.

167. Eyewear including a lens having a shading/inverse-shading effect responsive to a signal from a nonlethal area effect weapon.

168. Eyewear as in claim 167, wherein the nonlethal area effect weapon provides a sudden bright light.

169. Eyewear as in claim 167, wherein the nonlethal area effect weapon includes one or more of: a flashbang grenade, or a bright floodlight.

170. Eyewear as in claim 167, wherein the lens is disposed to shade a user from an effect of a sudden bright light.

171. Eyewear as in claim 167, wherein a user of the lens includes one or more of law enforcement personnel, military personnel, or animal control personnel.

172. Eyewear as in claim 167, including a computing device disposed to attempt to predict a time when the weapon is about to affect a recipient thereof, and disposed to control the lens to shade a user from an effect thereof in response to the predicted time.

173. Eyewear as in claim 172, wherein the computing device is disposed to trigger a shading/inverse-shading effect for the lens a selected time in advance of the effect of the weapon, in response to the predicted time.
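
Claims 172-173 turn on timing: predict when the weapon’s bright light will arrive and trigger shading a selected lead time earlier. A minimal sketch; the 50 ms lead and the timestamps are invented:

```python
# Sketch of claims 172-173: given a predicted time at which the nonlethal
# bright-light weapon takes effect, trigger lens shading a selected lead
# time beforehand. The 50 ms lead and the timestamps are invented.

def shading_trigger_time(predicted_effect_t: float, now: float,
                         lead_s: float = 0.050) -> float:
    """Return when to darken the lens; never schedule in the past."""
    return max(now, predicted_effect_t - lead_s)

trigger_t = shading_trigger_time(predicted_effect_t=10.000, now=9.900)
print(f"shade lens at t={trigger_t:.3f} s")   # 9.950: 50 ms before the flash
```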

174. Eyewear including a sensor disposed to detect a sudden bright light; a lens disposed to perform a shading/inverse-shading effect in response to the sensor; wherein a user of the lens is protected from the bright light.

175. Eyewear as in claim 174, wherein the user includes personnel piloting a vehicle.


176. Eyewear as in claim 175, wherein the vehicle includes one or more of: an aircraft, racing car, sailboat or speedboat.

177. Eyewear as in claim 175, wherein the sudden bright light includes one or more of: the sun being revealed after an obstacle, another vehicle’s bright lights, a change from a dark region to a non-dark region, a laser.

178. Eyewear as in claim 174, including a computing device disposed to attempt to predict a time when the bright light is about to affect the user, and disposed to control the lens to shade the user from an effect thereof in response to the predicted time; wherein the computing device is disposed to trigger the shading/inverse-shading effect a selected time in advance of the effect of the bright light, in response to the predicted time.

179. Eyewear including a sensor disposed to detect a signal from an external device; a lens disposed to perform a function in response to the signal, the function including one or more of: refraction, shading/inverse-shading, coloring/tinting or color balance alteration, polarization, prismatic angle deflection, dynamic visual optimization, or presenting one or more images or sequences of images.

180. Eyewear as in claim 179, wherein the external device includes one or more of: a smartphone, a mobile device, a camera, a processor, a local wi-fi hotspot, a sensor disposed on/in or near the user, a transponder disposed on an item of merchandise or on/in or near the user, a transponder disposed on an item of merchandise or on/in or near another user, another eyewear, a local audio/video signal or a broadcast/narrowcast signal associated with an audio/video signal, an electromagnetic or ultrasonic signal provided by an entertainer.

181. Eyewear as in claim 180, wherein the smartphone or mobile device is disposed to recognize one or more features of an ambient environment.


182. Eyewear as in claim 181, wherein the one or more features include one or more of: a measure of luminance, a measure of coloring/tinting, or a measure of audio/video complexity or other interference with possible visual acuity.

183. Eyewear as in claim 181, wherein when the mobile device detects features of the ambient environment which indicate that an adjustment of shading/inverse-shading or coloring/tinting is called for, the mobile device is disposed to signal the eyewear to make an adjustment thereto.

184. Eyewear as in claim 181, wherein the eyewear is disposed to make such adjustments in response to a user directing the mobile device.

185. Eyewear as in claim 180, wherein the eyewear is disposed to allow the user to send/receive, or respond to, messages including one or more of: advertising, comments on media articles, communication with other users, phone calls or text messages, or social media.

186. Eyewear as in claim 185, wherein the eyewear is disposed to detect one or more eye/face gestures, eyebrow or head gestures, hand/finger gestures, or hand movement toward or away from a sensor.

187. Eyewear as in claim 179, wherein the external device includes a camera disposed to capture one or more still or moving images, and a processor disposed to operate on those images to detect one or more gestures by a user.

188. Eyewear as in claim 187, wherein the gestures include one or more eye/face gestures and/or hand/finger gestures by the user.

189. Eyewear as in claim 188, wherein the eyewear is disposed to perform, in response to one or more of the gestures, one or more of the following: adjusting a level or volume with respect to music or other audio/video presentation to the user; accepting or rejecting an offer to receive a screen-sharing or other AR/VR communication with another eyewear or digital eyewear; performing an operation of the smartphone or mobile device, performing shading/inverse-shading or coloring/tinting control with respect to one or more lenses; altering shading/inverse-shading or coloring/tinting; or altering a zoom or distant focus control.

190. Eyewear as in claim 189, wherein the operation of the smartphone or mobile device includes one or more of: making or taking a call, sending or reading a text message, using an AR/VR display, sending or reading a social media communication.
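
Claims 189-190 enumerate gesture-triggered operations; one plausible (and entirely assumed) realization is a dispatch table from recognized gestures to actions:

```python
# Assumed dispatch table for claims 189-190: recognized gestures map to
# eyewear or paired-device operations. The gesture names and actions are
# examples only; the claims list the operations but not a concrete binding.

GESTURE_ACTIONS = {
    "double_blink":        "accept screen-sharing offer",
    "sustained_look_up":   "raise audio volume",
    "head_shake":          "reject incoming call",
    "palm_toward_sensor":  "darken lenses",
}

def dispatch(gesture: str) -> str:
    return GESTURE_ACTIONS.get(gesture, "no-op")

print(dispatch("double_blink"))      # -> accept screen-sharing offer
print(dispatch("unknown_gesture"))   # -> no-op
```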

191. Eyewear as in claim 179, including a processor coupled to the sensor, disposed to control at least one function performed by the lens, and disposed to receive medical information with respect to a user; wherein the processor is disposed to determine when the user is subject to a current medical condition or is likely to be subject to an oncoming or future medical condition; the processor, in response thereto, is disposed to exchange information with one or more of: the user, a second eyewear, a person other than the user, or an external device.

192. Eyewear as in claim 179, including a processor coupled to the sensor, disposed to control at least one function performed by the lens, and disposed to receive medical information with respect to a user; wherein the processor is disposed to receive information from one or more of: the user, a second user, a second eyewear, or an external device, and in response thereto, to request or receive information from the user.


193. Eyewear as in claim 179, wherein the processor is disposed to provide information to one or more of: the user, an emergency responder, medical personnel, a volunteer, or a second person; and the processor is disposed to provide information to obtain assistance to the user.

194. Eyewear as in claim 179, wherein the sensor is coupled to one or more of: the user, an ambient environment, an external device capable of determining a medical status of the user.

195. Eyewear as in claim 179, including a processor coupled to the sensor, disposed to control at least one function performed by the lens, and disposed to receive medical information with respect to a user; wherein the sensor is coupled to one or more of: the user, an ambient environment, an external device capable of determining a medical status of the user; the processor is disposed to control at least one function of the lens so as to perform one or more of: assisting in ameliorating the likelihood, severity, or duration of the medical condition; or assisting with self-care by the user.

196. Eyewear as in claim 195, wherein the processor is disposed to control the lens to perform one or more of: admitting light in a selected frequency range, filtering infalling light to alter the color balance toward the selected frequency range, or injecting light in the selected frequency range.

197. Eyewear as in claim 196, wherein the selected frequency range includes 500-560 nm (green).

198. Eyewear as in claim 195, wherein the processor is disposed to control the lens to perform one or more of: excluding light in a selected frequency range, filtering infalling light to alter the color balance away from the selected frequency range, or injecting light to counterbalance light in the selected frequency range.

199. Eyewear as in claim 198, wherein the selected frequency range includes blue or ultraviolet.
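
Illustrative sketch only (not part of the claims): a minimal model of the frequency-range control recited in claims 195-199, biasing transmission toward a roughly 500-560 nm green band or away from blue/ultraviolet. The gain values and band edges are assumptions, not values from the specification.

```python
# Illustrative sketch: per-wavelength transmission gains that bias the
# color balance toward or away from a selected band. The piecewise
# gains below are assumptions.

def band_gain(wavelength_nm: float, mode: str) -> float:
    """Return a transmission gain in [0, 1] for infalling light."""
    if mode == "favor_green":            # claims 196-197
        return 1.0 if 500.0 <= wavelength_nm <= 560.0 else 0.4
    if mode == "suppress_blue_uv":       # claims 198-199
        return 0.1 if wavelength_nm < 480.0 else 1.0
    return 1.0                           # pass-through

# Example: full transmission at 530 nm when favoring the green band.
assert band_gain(530.0, "favor_green") == 1.0
```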

200. Eyewear as in claim 179, including a processor coupled to the sensor and disposed to receive the signal; wherein the processor is disposed, in response to the signal, to present an augmented reality (AR) / virtual reality (VR) image and/or sound to the user.

201. Eyewear as in claim 200, wherein the user is engaged in an event including one or more of: observing or participating in a show: in a movie theater, as part of an outdoor presentation, as part of an interactive presentation, or in/on a ride having a presentation associated therewith.

202. Eyewear as in claim 201, wherein the AR/VR image and/or sound provides one or more of: commentary on or information about the event; closed-captioning, subtitles, or translation; or assistance to users with sight or hearing impairment.

203. Eyewear as in claim 200, wherein the user is engaged in one or more of: operating a vehicle; observing or participating in a sporting event; a re-creation or re-enactment of an historical event or an alternative version thereof; a role-playing game or live-action role-playing game; a geo-caching event, orienteering event, or scavenger hunt; or an event having special effects as a part thereof.

204. Eyewear as in claim 203, wherein when the user is engaged in operating a vehicle, the AR/VR image and/or sound provides one or more of: information about operating the vehicle, information about the ambient environment in which the vehicle is disposed to travel, information about the vehicle’s speed or heading, or information about other vehicles;

enhancements or warnings to the user with respect to particular sensor data; information from an instructor or observer with respect to the user’s operation of the vehicle; or information to assist users with sight or hearing impairment.

205. Eyewear as in claim 203, wherein when the user is engaged in observing or participating in an event, the AR/VR image and/or sound provides one or more of: information about the event, statistical information about players, historical information about re-enactments; or assistance to users with sight or hearing impairment.

206. Eyewear as in claim 200, wherein the user is observing or participating in an instructional, testing, or training exercise, a military field training exercise, a search/rescue training exercise, a law enforcement training exercise, or a firefighting training exercise.

207. Eyewear as in claim 200, wherein the user is observing or participating in a computer game or video game, a first-person shooter, or a role-playing game.

208. Eyewear as in claim 179, wherein the external device includes one or more of: a GPS or other location device, or a proximity sensor or other external tracking device.

209. Eyewear as in claim 208, wherein when operating a vehicle, the eyewear is disposed to provide a warning signal to a user in response to a vehicle condition.

210. Eyewear as in claim 209, wherein the warning signal includes a brief flash, a change in shading/inverse-shading or coloring/tinting, a shaded/inverse-shaded or a colored/tinted marker in a portion of the user’s field of view, another indicator, or another shading/inverse-shading effect.

211. Eyewear as in claim 209, wherein the vehicle condition includes one or more of: when the vehicle exceeds a speed limit, when the vehicle approaches a designated exit or turnoff, when the vehicle is within a selected proximity of a law enforcement vehicle, when the vehicle is near or is causing a driving or racing hazard, or as selected by the user.

212. Eyewear as in claim 209, wherein while traveling with one or more other persons, the vehicle condition includes one or more of: when the user exceeds a selected distance from those other persons, when the user exceeds that selected distance for a designated amount of time, or when the user exhibits signs of becoming detached from a group.

213. Eyewear as in claim 209, wherein while traveling with one or more other persons, the vehicle condition includes one or more of: when the user approaches within a selected distance of a second user in the first user’s circle of persons related to the first user, with respect to a social network, or with respect to a class schedule.
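
Illustrative sketch only (not part of the claims): how the vehicle-condition warnings of claims 209-213 might be evaluated before presenting the warning signal of claim 210. All field names and thresholds are hypothetical.

```python
# Illustrative sketch of the vehicle-condition checks in claims 209-213.
# Field names and thresholds are hypothetical assumptions.

from dataclasses import dataclass

@dataclass
class VehicleState:
    speed_kmh: float
    speed_limit_kmh: float
    distance_to_exit_m: float
    distance_to_group_m: float

def warning_conditions(s: VehicleState) -> list[str]:
    """Return the conditions that should trigger a warning signal."""
    warnings = []
    if s.speed_kmh > s.speed_limit_kmh:
        warnings.append("over speed limit")        # claim 211
    if s.distance_to_exit_m < 500.0:
        warnings.append("approaching exit")        # claim 211
    if s.distance_to_group_m > 200.0:
        warnings.append("detached from group")     # claim 212
    return warnings
```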

214. Eyewear as in claim 179, wherein the signal is disposed to warn a user of an audio/video effect that is disposed to shock or surprise the user; wherein the lens is disposed to protect the user from the audio/video shock.

215. Eyewear as in claim 214, wherein the audio/video effect includes one or more of: a gunshot, a loud sound, a muzzle flash, a flashbang grenade, a bright flash, an explosive, an electrical/chemical energy release, or a sudden noise or lighting change.

216. Eyewear as in claim 214, wherein the user is a law enforcement officer; the eyewear is disposed to receive a signal from a weapon indicating that it is about to be discharged; wherein

the eyewear is disposed, in response to the signal, to adjust the lens so as to prevent the user from having their eyesight disturbed by the weapon.

217. Eyewear as in claim 216, including earphones or earplugs; wherein the eyewear is disposed, in response to the signal, to adjust the earphones or earplugs so as to prevent the user from having their hearing disturbed by the weapon.

218. Eyewear as in claim 216, wherein the weapon disposed to send the signal is associated with a second law enforcement officer.

219. Eyewear as in claim 214, wherein the signal is encrypted when sent and decrypted when received; whereby the signal is prevented from being received by an unauthorized party.

220. Eyewear including a dynamic eye tracking system disposed to determine a gaze direction or focal length of a user, wherein the dynamic eye tracking system is disposed to determine a location in three-dimensional space at which the user is looking; a processor disposed to recognize an external device in a field of view; a control circuit disposed, in response to one or more user controls, to exchange one or more signals with the external device, to operate the external device.

221. Eyewear as in claim 220, wherein the external device includes one or more control elements; the control circuit is disposed to exchange one or more signals with the control elements in response to the user controls.

222. Eyewear as in claim 220, wherein when the external device includes a vehicle, the control circuit is disposed to allow the user to control the vehicle using one or more user controls.

223. Eyewear as in claim 222, wherein the vehicle includes one or more of: a ground vehicle, an aircraft, or a watercraft.

224. Eyewear as in claim 222, wherein the control circuit is disposed to allow the user to perform one or more of: starting the vehicle, setting a temperature or related controls, turning on/off air conditioning or defrosters or related controls, operating a radio or related equipment, opening/closing doors or windows, opening/closing an engine hood or a trunk, opening/closing a gas or other fluid entry, extruding/retracting cup holders or related equipment, turning on/off internal lights or displays, turning on/off or adjusting external lights, presenting/highlighting alerts such as from the engine or fuel reserves, controlling “cruise control” or other automatic driving controls, or controlling other controls relating to electric vehicles or golf carts.

225. Eyewear as in claim 222, wherein when the vehicle includes an aircraft, the control circuit is disposed to allow the user to perform one or more of: operating engine and/or flight surface controls using one or more gestures, or one or more hand/finger gestures, increasing/decreasing a throttle setting, executing a slip or turn, raising or lowering an elevator or aileron, or operating cabin lights or a radio.

226. Eyewear as in claim 222, wherein when the vehicle includes a ground vehicle, the control circuit is disposed to allow the user to perform one or more of: operating an accelerator/brake, gearing, or turning controls; executing a turn; operating doors, windows, locks, or a trunk; or altering a clutch or throttle, a gear selection, or one or more warning blinkers or lights.

227. Eyewear as in claim 220, wherein the one or more user controls include one or more of: a gesture, a combination of two or more gestures, a gesture in combination with a gaze direction or focusing distance, a gesture in combination with a hand-held vehicle control, or a gaze direction or focusing distance in combination with a hand-held vehicle control.

228. Eyewear as in claim 227, wherein the gesture includes one or more of: an eye/face gesture or hand/finger gesture, or a combination of two or more gestures.

229. Eyewear as in claim 220, including a lens disposed to present to the user, using augmented reality or virtual reality, periodic, requested, or triggered information with respect to travel.

230. Eyewear as in claim 229, wherein the information with respect to travel includes warnings of racing conditions, road hazards, directions, or timing.

231. Eyewear as in claim 220, wherein the external device includes a weapon; the external device includes a control circuit disposed to receive a signal and activate/de-activate a safety mechanism in response thereto.

232. Eyewear as in claim 231, wherein the weapon includes a pistol, rifle, or taser.

233. Eyewear as in claim 231, wherein the weapon includes a sight having a detector disposed to determine whether a user of the weapon is authorized.

234. Eyewear as in claim 233, wherein the detector disposed to determine whether a user is authorized includes an iris scanner or another biometric device.

235. Eyewear as in claim 231, wherein the weapon includes a processor having a camera and disposed to determine whether a target of the weapon has been excluded from targeting by the user.

236. Eyewear as in claim 235, wherein the excluded target includes a non-suspect citizen or another law enforcement officer.

237. Eyewear as in claim 231, including a dynamic eye tracking system disposed to determine a gaze direction or focal length of a user; wherein the weapon includes a targeting pointer; including a processor disposed to determine whether the user’s gaze direction and/or focal length are directed at the same target as the targeting pointer.

238. Eyewear as in claim 237, wherein the processor is disposed to warn the user when the user’s gaze direction and/or focal length are not directed at the same target as the targeting pointer.

239. Eyewear as in claim 237, including a safety mechanism; wherein the processor is disposed to activate the safety mechanism when the user’s gaze direction and/or focal length are not directed at the same target as the targeting pointer.

240. Eyewear as in claim 237, wherein the targeting pointer is not visible to the unaided human eye; the processor is disposed to present a virtual image of the targeting pointer to the user when the user’s gaze direction and/or focal length are not directed at the same target as the targeting pointer, without revealing the targeting pointer to a target.

241. Eyewear as in claim 231, including a detector disposed to determine when a user gesture indicates to activate the weapon; a processor responsive to the detector and disposed to activate the weapon in response to a user gesture.

242. Eyewear as in claim 241, wherein the user gesture includes one or more eye/face gestures or hand/finger gestures.

243. Eyewear as in claim 241, wherein the user gesture includes one or more blinks, eye or face movements, mouth or nasal movements, squints, or winks.

244. Eyewear as in claim 241, wherein the processor is disposed to personalize its actions in response to one or more user gestures in response to a user control.

245. Eyewear as in claim 220, wherein the external device includes medical or scientific equipment; the external device includes a control circuit disposed to receive a signal and activate/de-activate the equipment in response thereto.

246. Eyewear as in claim 245, wherein the medical equipment includes surgical or dental equipment.

247. Eyewear as in claim 245, wherein the medical equipment includes a surgical or dental sensor having a safety threshold associated therewith; the control circuit is disposed to provide a warning or another signal to medical personnel in response to the sensor indicating a value outside the safety threshold.

248. Eyewear as in claim 220, including a dynamic eye tracking system disposed to determine a gaze direction or focal length of a user; wherein the external device includes sports equipment; including a processor disposed to determine whether the user’s gaze direction and/or focal length are directed at the same target as the sports equipment.

249. Eyewear as in claim 248, wherein the control circuit is disposed to warn the user when the user’s gaze direction and/or focal length are not directed at the same target as the sports equipment.

250. Eyewear as in claim 249, wherein the sports equipment includes a golf club; the control circuit includes a sensor disposed to determine a direction of a golf ball when the user uses the sports equipment; the control circuit is disposed to present a confirmation to the user when the user’s gaze direction and/or focal length are directed at the same target as the direction of the golf ball.

251. Eyewear as in claim 220, including a dynamic eye tracking system disposed to determine a gaze direction or focal length of a user; wherein the external device includes one or more of: a security door, an iris scanner, or another biometric scanner; including a processor disposed to send a signal to the security door in response to whether the user’s iris scan or other biometric scan indicates that the user is authorized.

252. Eyewear including one or more lenses each disposed to perform one or more functions altering a field of view, wherein the functions include one or more of: refraction, shading/inverse-shading, coloring/tinting or color balance alteration, polarization, prismatic angle deflection, or dynamic visual optimization; a blink detection system disposed to determine when a user is blinking; a control circuit disposed to alter the functions of the one or more lenses while the user’s eye is closed during a blink.

253. Eyewear as in claim 252, including a dynamic eye tracking mechanism; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the blink detection system, so as to attempt to determine ahead of time when the user is about to blink; the computing device is disposed to be responsive to the determination ahead of time when the user is about to blink, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique.

254. Eyewear including one or more lenses each disposed to perform one or more functions altering a field of view, wherein the functions include one or more of: refraction, shading/inverse-shading, coloring/tinting or color balance alteration, polarization, prismatic angle deflection, or dynamic visual optimization;

a gaze detection system disposed to determine when a user’s eye is closed; a control circuit disposed to alter the functions of the one or more lenses while the user’s eye is closed.

255. Eyewear as in claim 254, wherein the gaze detection system is disposed to determine when a user’s eye is closed during a blink, squint, wink, or other eye activity; the control circuit is disposed to act in response to the blink, squint, wink, or other eye activity.

256. Eyewear as in claim 254, wherein the gaze detection system is disposed to determine when a user’s pupil is covered by the user’s eyelid; the control circuit is disposed to act in response to when the user’s pupil is covered.

257. Eyewear as in claim 254, including a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the computing device is disposed to be responsive to the gaze detection system, so as to attempt to determine ahead of time when the user’s eye is about to be closed; the computing device is disposed to be responsive to the determination ahead of time when the user’s eye is about to be closed, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique.
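
Illustrative sketch only (not part of the claims): the blink-gated lens adjustment of claims 252-257, in which queued lens-function changes are applied only while the eye is detected (or predicted) to be closed, so the user does not perceive the transition. The interfaces are hypothetical.

```python
# Illustrative sketch of claims 252-257: defer lens-function changes and
# apply them only while the eye is closed. All names are hypothetical.

import collections

class BlinkGatedLens:
    def __init__(self):
        self.pending = collections.deque()   # queued function changes

    def request_change(self, change) -> None:
        """Queue a change (e.g. a new shading level) for the next blink."""
        self.pending.append(change)

    def on_eye_state(self, eye_closed: bool, lens) -> None:
        """Called each frame by the blink/gaze detection system."""
        while eye_closed and self.pending:
            lens.apply(self.pending.popleft())
```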

258. Eyewear including one or more lenses each disposed to perform one or more functions altering a field of view, wherein the functions include one or more of: refraction, shading/inverse-shading, coloring/ tinting or color balance alteration, polarization, prismatic angle deflection, or dynamic visual optimization; a sensor disposed to detect an ambient environment condition in which at least one of the lenses is likely to be subject to fog, frost, or another obstruction of visibility.

259. Eyewear as in claim 258, wherein the sensor includes one or more of: a thermometer, a thermocouple, or another temperature detector.

260. Eyewear as in claim 258, wherein at least one of the lenses is coupled to one or more of: a resistive circuit or another heating element.

261. Eyewear as in claim 258, including a user control, the user control being responsive to one or more of: an eye/face gesture, a hand/finger gesture, or a capacitive or touch control.

262. Eyewear as in claim 258, including a receiver disposed to receive one or more of: information with respect to the ambient environment, or weather or related information; a processor coupled to the receiver and disposed to attempt to predict whether at least one of the lenses is likely to be subject to fog, frost, or another obstruction of visibility.

263. Eyewear as in claim 258, including a receiver disposed to receive one or more of: information with respect to the ambient environment, or weather or related information; one or more of: a resistive circuit or another heating element, coupled to at least one of the lenses; a processor coupled to the receiver and disposed to attempt to predict whether at least one of the lenses is likely to be subject to fog, frost, or another obstruction of visibility, and coupled to at least one of the resistive circuit or another heating element and disposed to proactively treat one or more of the lenses so as to prevent obstructions to visibility.

264. Eyewear as in claim 263, including one or more of: a resistive circuit or another heating element, coupled to one or more of: a windshield or a clear surface of a vehicle; wherein the processor is coupled to the receiver and disposed to attempt to predict whether at least one of the windshield or the clear surface is likely to be subject to fog, frost, or another obstruction of visibility, and is coupled to at least one of the resistive circuit or another heating

element and disposed to proactively treat one or more of the windshield or a clear surface of a vehicle so as to prevent obstructions to visibility.
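
Illustrative sketch only (not part of the claims): predicting lens fogging as recited in claims 258-264 by comparing lens temperature against the dew point computed with the standard Magnus approximation, then enabling a heating element proactively. The activation margin and sensor interface are assumptions.

```python
# Illustrative sketch of claims 258-264: Magnus dew-point approximation
# used to decide when to energize a resistive heating element.

import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Standard Magnus approximation of the dew point in Celsius."""
    a, b = 17.27, 237.7
    gamma = (a * temp_c) / (b + temp_c) + math.log(rel_humidity_pct / 100.0)
    return (b * gamma) / (a - gamma)

def should_heat(lens_temp_c: float, ambient_c: float, rh_pct: float,
                margin_c: float = 1.0) -> bool:
    """True when the lens is within margin_c of the dew point (assumed margin)."""
    return lens_temp_c <= dew_point_c(ambient_c, rh_pct) + margin_c
```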

265. Eyewear including a gesture sensor disposed to determine one or more of: a first gesture sensor responsive to a relative distance, location, or motion of a first gesture or user input; or a second gesture sensor responsive to a second gesture or user input indicating a setting with respect to the first gesture or user input.

266. Eyewear as in claim 265, wherein the first gesture sensor includes a sensor disposed to determine a measure of distance, location, or motion responsive to a movement by a user; the measure of distance, location, or motion is substantially adjustable in a continuous range.

267. Eyewear as in claim 265, wherein the first gesture sensor includes one or more of: a camera disposed to determine an observed size of an object to be sensed; a capacitive sensor disposed to determine a relative distance of an object to be sensed; an infrared or temperature sensor disposed to determine a relative distance of an object to be sensed; a light sensor disposed to determine an amount of light obscured by the object to be sensed; or a time-of-flight sensor disposed to determine a relative distance of an object to be sensed.

268. Eyewear as in claim 267, wherein the object to be sensed includes one or more of: the user’s hand, finger, or another body part.

269. Eyewear as in claim 267, wherein the infrared or temperature sensor is disposed to determine a comparison of a local infrared flux or temperature, with an expected temperature of the object to be sensed.

270. Eyewear as in claim 267, wherein the light sensor is disposed to determine a comparison of an amount of light available when in a shadow of the object to be sensed, with an amount of light available in an ambient environment.

271. Eyewear as in claim 265, wherein the first gesture sensor is responsive to movement of the object sensed in a first direction; the second gesture sensor is responsive to movement of the object sensed in a second direction perpendicular to the first direction.

272. Eyewear as in claim 271, wherein the first direction includes a linear distance from the first sensor.

273. Eyewear as in claim 271, wherein the second direction includes movement in a plane substantially perpendicular to a direct line from the first sensor.

274. Eyewear as in claim 265, wherein the second gesture or user input includes one or more of: an eye/face gesture, a hand/finger gesture, or a voice input.

275. Eyewear as in claim 265, wherein the second gesture or user input includes one or more of: a blink/wink, an eye/eyebrow movement, a nose gesture, a smile/smirk or grimace, a squint, a tongue gesture, or a voice input.

276. Eyewear as in claim 265, wherein the second gesture or user input includes one or more of: an instruction to maintain or set a value in response to the first gesture or user input, an instruction to delete or remove a value maintained or set in response to the first gesture or user input, an instruction to alter a value disposed to be maintained or set in response to the first gesture or user input, or an instruction to switch to another value disposed to be maintained or set in response to the first gesture or user input.

277. Eyewear as in claim 276, wherein the value disposed to be maintained or set in response to the first gesture or user input includes one or more of: a measure of pain, light sensitivity, audio sensitivity, or another indicator of the occurrence or severity of migraine or photophobia (or phonophobia); a measure of the user’s desire for shading/inverse-shading, a measure of brightness or shadow the user prefers, a measure of the user’s desire for coloring or color balancing, a measure of the user’s desire for refraction (or a measure of fuzziness or unclarity the user perceives), or another visual effect to be applied to light infalling to the user’s eye; a measure or other indicator of the user’s subjective perception of a treatment for migraine or photophobia as recently or currently being applied by the digital eyewear 1700; a measure of the user’s subjective belief that migraine, photophobia, or phonophobia, is oncoming or likely to be so, or is finishing or likely to be so; a measure or other indicator of the user’s ambient environment or recent behavior, an amount of sleep, a measure of stress, a measure of perceived confusion or glare; a measure of, a degree of, or an intensity of, efforts at self-care; an adjustment of dynamic visual optimization that the user prefers, or a measure of visual acuity currently or recently perceived by the user.
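
Illustrative sketch only (not part of the claims): the two-sensor gesture scheme of claims 265-277, where a continuous first gesture proposes a value over an adjustable range and a discrete second gesture confirms or cancels it. The distance-to-value mapping and gesture names are assumptions.

```python
# Illustrative sketch of claims 265-277: a first (continuous) gesture
# sensor sets a candidate value from a measured distance; a second
# (discrete) gesture confirms or cancels it.

class TwoGestureControl:
    def __init__(self, lo: float, hi: float):
        self.lo, self.hi = lo, hi
        self.candidate = None
        self.value = lo

    def on_distance(self, distance_mm: float, max_mm: float = 300.0) -> None:
        """Map a hand-to-sensor distance onto the continuous range (claim 266)."""
        frac = max(0.0, min(1.0, distance_mm / max_mm))
        self.candidate = self.lo + frac * (self.hi - self.lo)

    def on_second_gesture(self, gesture: str) -> None:
        if gesture == "confirm" and self.candidate is not None:
            self.value = self.candidate      # maintain/set (claim 276)
        elif gesture == "cancel":
            self.candidate = None            # delete/remove (claim 276)
```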

278. Eyewear including a front piece; one or more of: a right temple or a left temple; wherein at least one temple is coupled to the front piece using a hinge having a magnetic coupling; the temple coupled to the front piece being disposed to complete a circuit, using a circuit coupling other than the magnetic coupling, when the hinge is in a selected position, the circuit being coupled using at least a portion of the front piece.

280. Eyewear as in claim 278, wherein the right temple is disposed to complete a circuit including the right temple and the left temple, using a circuit coupling other than the magnetic coupling, when the hinge between the right temple and the front piece is in a first selected position and the hinge between the left temple and the front piece is in a second selected position.

281. Eyewear as in claim 280, wherein the circuit being coupled between the right temple and the left temple uses at least a portion of the front piece.

282. Eyewear as in claim 280, wherein the front piece includes at least a portion of a circuit; the circuit being coupled between the right temple and the left temple uses the portion of the circuit of the front piece.

283. Eyewear as in claim 278, wherein the magnetic coupling is disposed to be completed when the hinge is opened to a position in which the eyewear is disposed to be placed on the user; the circuit is disposed to be completed when the magnetic coupling is completed.

284. Eyewear as in claim 278, wherein the right temple is disposed to couple to the front piece using the magnetic hinge, wherein the hinge between the right temple and the front piece is disposed to hold the right temple in a fixed position with respect to the front piece using a magnetic field.

285. Eyewear as in claim 278, wherein the left temple is disposed to couple to the front piece using the magnetic hinge, wherein the hinge between the left temple and the front piece is disposed to hold the left temple in a fixed position with respect to the front piece using a magnetic field.

286. Eyewear as in claim 278, wherein the circuit includes a plurality of electrical/electronic connections; at least one of the electrical/electronic connections is disposed to couple a power source or circuit load to the circuit.

287. Eyewear as in claim 278, wherein the circuit includes a plurality of electrical/electronic connections;

at least one of the electrical/electronic connections is disposed to couple a data source or data sink to the circuit.

288. Eyewear as in claim 278, wherein the circuit includes a plurality of electrical/electronic connections; at least one of the electrical/electronic connections is disposed to couple a power source or circuit load to the circuit; at least one of the electrical/electronic connections is disposed to couple a data source or data sink to the circuit.

289. Eyewear as in claim 278, wherein the circuit includes one or more optical connections; at least one of the one or more optical connections is disposed to couple a data source or data sink to the circuit.

290. Eyewear as in claim 278, wherein the circuit includes one or more electrical/electronic connections; the circuit includes one or more optical connections; at least one of the electrical/electronic connections is disposed to couple a power source or circuit load to the circuit.

291. Eyewear as in claim 278, wherein the circuit includes one or more electrical/electronic connections; the circuit includes one or more optical connections; at least one of the electrical/electronic or the optical connections is disposed to couple a data source or data sink to the circuit.

292. Eyewear as in claim 278, wherein one or more of: a first right temple or left temple is disposed to be detached from the eyewear; when one or more of the first right temple or left temple is detached from the eyewear, the hinge is disposed to allow connection of a second right temple or left temple.

293. Eyewear as in claim 292, wherein the second right temple or left temple is disposed to replace the first right temple or left temple.

294. Eyewear as in claim 292, wherein the second right temple or left temple is disposed to provide different or additional functions from the first right temple or left temple.

295. Eyewear including one or more circuit elements disposed in distinct separable portions, disposed to be coupled using one or more connectors, the one or more connectors being disposed to perform one or more of: coupling the separable portions while the eyewear is in operation; or coupling one or more of those portions to external devices while the eyewear is otherwise dormant.

296. Eyewear as in claim 295, wherein the separable portions include one or more of: a front piece, a first temple, or a second temple.

297. Eyewear as in claim 296, wherein the front piece is disposed to support one or more lenses disposed with respect to viewing.

298. Eyewear as in claim 296, wherein the front piece is disposed to be supported by one or more of the first or second temple.

299. Eyewear as in claim 296, wherein one or more of the front piece or the first temple is disposed to be coupled to a battery or energy storage.

300. Eyewear as in claim 296, wherein one or more of the front piece or the first temple is disposed to be coupled to a computing device or data storage.

301. Eyewear as in claim 296, wherein one or more of the front piece, the first temple, or the second temple, are disposed to be coupled to one or more of: a battery, energy storage, or a power source; one or more of the front piece, the first temple, or the second temple, are disposed to be coupled to one or more of: a communication device, a computing device, or data storage.

302. Eyewear as in claim 295, wherein the separable portions include one or more elements selected by a user, selected by medical personnel, or personalized to a user.

303. Eyewear as in claim 302, wherein the medical personnel include one or more of: an emergency responder, an eye surgeon, an optometrist, an ophthalmologist, or a neurologist.

304. Eyewear as in claim 295, wherein the separable portions include one or more of: a front piece, a first temple, or a second temple; one or more of the first or second temples are disposed to be coupled to the front piece using one or more detachable hinges.

305. Eyewear as in claim 304, wherein one or more of the detachable hinges includes a magnetic hinge and a circuit, the circuit being disposed to couple one or more of the first or second temple to the front piece.

306. Eyewear as in claim 304, wherein one or more of the detachable hinges includes a magnetic hinge and a circuit, the circuit being disposed to couple one or more of the front piece, the first temple, or the second temple, to one or more of: a battery, energy storage, or power source.

307. Eyewear as in claim 304, wherein one or more of the detachable hinges includes a magnetic hinge and a circuit, the circuit being disposed to couple one or more of the front piece, the first temple, or the second temple, to one or more of: a communication device, a computing device, or data storage.

308. Eyewear as in claim 304, wherein one or more of the detachable hinges includes a magnetic hinge and a circuit, a first circuit being disposed to couple one or more of the front piece, the first temple, or the second temple, to one or more of: a battery, energy storage, or power source; a second circuit being disposed to couple one or more of the front piece, the first temple, or the second temple, to one or more of: a communication device, a computing device, or data storage.

309. Eyewear as in claim 308, wherein the first and second circuit are collectively disposed to couple one or more of: the battery, energy storage, or power source, to one or more of: the communication device, a computing device, or data storage.

310. Eyewear as in claim 295, wherein the first and second temples are coupled to the front piece including one or more magnetic hinges, wherein at least one of the magnetic hinges is disposed to hold the temples in place, when worn by a user.

311. Eyewear as in claim 310, wherein the first and second temples are coupled to the front piece including one or more magnetic hinges, wherein each of the first and second temples is coupled to the front piece by two or more of: mechanically, magnetically, and electrically.

312. Eyewear as in claim 310, wherein the first and second temples are electrically or optically coupled to the front piece, wherein the front piece is disposed to couple the first and second temples together by one or more of: electrically or optically.

313. Eyewear as in claim 310, wherein at least one of the first and second temples includes one or more of: a battery, energy storage, or power source; at least one of the first and second temples includes one or more of: a communication device, a computing device, or data storage; the first and second temples are coupled to the front piece to dispose a digital and power coupling between the hinge for the first temple and the hinge for the second temple.

314. Eyewear as in claim 295, wherein the first and second temples are coupled to the front piece including one or more hinges, wherein at least one of the hinges includes a first portion including a magnetic coupling and a second portion including an electrical coupling, the electrical coupling having a plurality of pairs of electrical pins, each pair of pins having a first pin on a first side of the hinge and a second pin on a second side of the hinge, whereby closing the hinge disposes an electrical and a magnetic coupling, and opening the hinge removes the electrical and magnetic coupling.

315. Eyewear as in claim 314, wherein one or more of the pins is disposed to be coupled to a selected fixed voltage.

316. Eyewear as in claim 315, wherein the selected fixed voltage includes a ground voltage or a digital voltage.

317. Eyewear as in claim 295, wherein a temple having a battery or energy storage is disposed to be coupled to a charger/recharger; the battery or energy storage is disposed to be restored to a full charge after at least partial use.

318. Eyewear as in claim 295, wherein a temple having a computing device or data storage is disposed to be coupled to a communication element or a second computing device or data storage; the computing device or data storage is disposed to exchange data between the eyewear and an external device.

319. Eyewear as in claim 295, wherein the eyewear is disposed to indicate a low-battery or other lack of charging condition by presenting an audio/video indicator.

320. Eyewear as in claim 319, wherein the eyewear is disposed to flash a greater number of times as a low-battery or lack of charging condition becomes more serious.

321. Eyewear as in claim 319, wherein the eyewear is disposed to fail-over to a benign state including one or more of: a set of clear lenses, or another state in which the eyewear is usable.

322. Eyewear as in claim 319, wherein the eyewear is disposed to fail-over softly to a benign state, maintaining its power reserve for urgent uses.
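
Illustrative sketch only (not part of the claims): the escalating low-battery indicator and soft fail-over of claims 319-322. The charge thresholds and flash counts are assumptions, not values from the specification.

```python
# Illustrative sketch of claims 319-322: flash more as charge falls
# (claim 320), then fail over softly to clear lenses while keeping a
# power reserve for urgent uses (claims 321-322). Thresholds assumed.

def battery_behavior(charge_pct: float) -> dict:
    if charge_pct < 5.0:
        return {"flashes": 0, "fail_over_to_clear": True,
                "reserve_power_for_urgent_use": True}
    if charge_pct < 10.0:
        return {"flashes": 3, "fail_over_to_clear": False,
                "reserve_power_for_urgent_use": False}
    if charge_pct < 20.0:
        return {"flashes": 1, "fail_over_to_clear": False,
                "reserve_power_for_urgent_use": False}
    return {"flashes": 0, "fail_over_to_clear": False,
            "reserve_power_for_urgent_use": False}
```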

323. Eyewear including a front piece; one or more of: a right temple or a left temple; wherein at least one temple is coupled to the front piece using a hinge having a magnetic coupling; wherein the at least one temple is disposed to be decoupled from the magnetic coupling to the front piece and replaced with a replacement temple.

324. Eyewear as in claim 323, wherein the front piece is disposed to be decoupled from the right temple and the left temple and replaced with a replacement front piece.

325. Eyewear as in claim 324, wherein the replacement front piece is disposed to include one or more of the following: support for a replacement lens, a replacement color-alterable element, a replacement coupling between the right temple and the left temple.

326. Eyewear as in claim 325, wherein the replacement lens is disposed to match a new prescription for a user.

327. Eyewear as in claim 325, wherein one or more of: the replacement color-alterable element is disposed in or on one or more of: the front piece, or the right or left temple; the replacement color-alterable element is disposed to couple to a color-alterable element in or on one or more of: the right or left temple; the replacement color-alterable element is disposed to match a color-alterable element in or on the eyewear.

328. Eyewear as in claim 324, wherein the front piece includes a front circuit element; the replacement front piece is disposed to include a replacement front circuit element.

329. Eyewear as in claim 328, wherein the replacement front circuit element is disposed to match or to take advantage of a replacement connection of one or more of: between the front piece and the right temple, between the front piece and the left temple, or between the right temple and the left temple.

330. Eyewear as in claim 324, wherein the replacement front piece includes one or more of: a debugging device, or a replacement of a defective part.

331. Eyewear as in claim 323, wherein the replacement temple includes one or more of: a debugging device, or a replacement of a defective part.

332. Eyewear including a lens; a circuit disposed to be coupled to the lens using a magnetic connector, the circuit being disposed to perform a function with respect to at least a portion of a field of view.

333. Eyewear as in claim 332, wherein the circuit is disposed on a second lens, the second lens operating on infalling light from the portion of the field of view; the second lens being controlled by the circuit to perform the function.

334. Eyewear including a lens; a circuit disposed to adjust an operation of the lens to perform a function on at least a selected portion of a field of view, the circuit using polarization to highlight, or de-highlight or otherwise block, that selected portion; the circuit being disposed to perform the function in response to instruction from a user, the user selecting a set of elements in response to which to determine the selected portion to be highlighted, de-highlighted, or otherwise blocked; whereby the circuit performs polarization on the selected portion of the field of view, the polarization having the effect of highlighting, de-highlighting, or otherwise blocking, the selected portion.

335. Eyewear as in claim 334, including a polarization circuit disposed to adjust a function of the lens, the polarization circuit being responsive to the circuit disposed to adjust an operation of the lens.

336. Eyewear as in claim 334, including a shading/inverse-shading or coloring/tinting circuit disposed to adjust a function of the lens, the shading/inverse-shading or coloring/tinting circuit being responsive to the circuit disposed to adjust an operation of the lens.

337. Eyewear as in claim 334, including a dynamic eye tracking mechanism; an outward-looking camera; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the lens is disposed to apply two or more alternative functions to infalling light;

the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine an adjustment to infalling light from the user’s field of view which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the adjustment to infalling light, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique; wherein at least one of the two or more alternative functions includes using polarization to highlight, or de-highlight or otherwise block, a selected portion of the user’s field of view.

338. Eyewear including a lens having a plurality of layers, each one of the plurality of layers disposed to perform a coloring/tinting function on at least a selected portion of a field of view; wherein an overlay of the layers performs a function on the selected portion presenting one or more selected color images to a user.

339. Eyewear as in claim 338, wherein at least one of the plurality of layers is disposed to perform a uniform coloring/tinting function on more than half of the entire field of view.

340. Eyewear as in claim 339, wherein the uniform coloring/tinting function includes enhancing a set of frequencies including green.

341. Eyewear as in claim 339, wherein the uniform coloring/tinting function includes enhancing a set of frequencies in a range between 500-560 nm.

342. Eyewear as in claim 339, wherein the uniform coloring/tinting function includes enhancing a set of frequencies having one or more of the following effects: preventing, ameliorating, or treating, effects of migraine or neuro-ophthalmic disorder.

343. Eyewear as in claim 339, wherein the uniform coloring/tinting function includes de-enhancing a set of frequencies including blue or ultraviolet.

344. Eyewear as in claim 339, wherein the uniform coloring/tinting function includes enhancing a set of frequencies including red or amber.

345. Eyewear as in claim 338, wherein at least one of the plurality of layers is disposed to perform a distinct coloring/tinting function on each of a selected set of pixels or regions of the field of view.

346. Eyewear as in claim 345, wherein the distinct coloring/tinting function on the selected set of pixels or regions is disposed to present an image to the user.

347. Eyewear as in claim 345, wherein the distinct coloring/tinting function on the selected set of pixels or regions is disposed to present a moving image to the user.

348. Eyewear as in claim 338, wherein one or more of the layers includes a dichromic material disposed on a base layer of the lens, the dichromic material being disposed to change color between a clear state and a coloring/tinting state in response to a signal received from a device outside the eyewear.

349. Eyewear as in claim 348, wherein the coloring/tinting state includes a color filtering state disposed to admit only a selected set of frequencies.

350. Eyewear as in claim 348, wherein the signal received from a device outside the eyewear includes an electronic signal.

351. Eyewear as in claim 348, wherein the base layer includes a substantially clear glass component.

352. Eyewear as in claim 348, wherein the dichromic material includes a coating chemically deposited on the base layer.

353. Eyewear as in claim 348, wherein the dichromic material provides a band-pass filter for a selected color.

354. Eyewear as in claim 353, wherein the selected color includes one or more of: red, green, blue, cyan, magenta, yellow, or a selected combination thereof.

355. Eyewear as in claim 348, wherein the dichromic material provides a band-pass filter for a selected frequency range.
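
Illustrative sketch only (not part of the claims): modeling each dichromic layer of claims 348-355 as a band-pass transmission window, with an overlay of layers composing multiplicatively as in claim 338. The window edges and out-of-band leakage are assumptions.

```python
# Illustrative sketch of claims 338 and 348-355: per-layer band-pass
# windows whose overlay multiplies per-wavelength transmission.

def layer_transmission(wl_nm: float, lo_nm: float, hi_nm: float,
                       leak: float = 0.05) -> float:
    """Idealized band-pass window with assumed out-of-band leakage."""
    return 1.0 if lo_nm <= wl_nm <= hi_nm else leak

def stack_transmission(wl_nm: float,
                       windows: list[tuple[float, float]]) -> float:
    """Overlaying layers multiplies their per-wavelength transmissions."""
    t = 1.0
    for lo, hi in windows:
        t *= layer_transmission(wl_nm, lo, hi)
    return t

# Example: a green window stacked over a broad visible window.
print(stack_transmission(530.0, [(400.0, 700.0), (500.0, 560.0)]))  # 1.0
print(stack_transmission(620.0, [(400.0, 700.0), (500.0, 560.0)]))  # 0.05
```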

356. Eyewear as in claim 338, wherein a first layer includes a substantially clear glass base component; the first layer having a first coating (such as on a first side) having a first filtering effect, and a second coating (such as on a second side) having a second filtering effect; whereby the first and second filtering effects are applied to light passing through the selected layer.

357. Eyewear as in claim 356, wherein the first coating is disposed on a first side of the base component.

358. Eyewear as in claim 356, wherein the second coating is disposed on a second side of the base component.

359. Eyewear as in claim 338, wherein a first layer includes a substantially clear glass base component; the first layer having a first coating having a first filtering effect, and a second coating having a second filtering effect; whereby the first filtering effect is applied to light passing through a first portion of a selected field of view and the second filtering effect is applied to light passing through a second portion of a selected field of view.

360. Eyewear as in claim 359, wherein the first coating is disposed on a first side of the base component.

361. Eyewear as in claim 359, wherein the second coating is disposed on a second side of the base component.

362. Eyewear as in claim 338, wherein a first layer includes a substantially clear glass base component; the first layer having a first coating having a first filtering effect, and a second coating having a second filtering effect; whereby the first filtering effect is applied to light passing through a first portion of a selected field of view and the second filtering effect is applied to light passing through a second portion of a selected field of view; whereby both the first and the second filtering effects are applied to light passing through at least a selected portion of a field of view, providing a combination disposed to provide a selected color.

363. Eyewear as in claim 338, including a dynamic eye tracking mechanism; an outward-looking camera; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the lens is disposed to apply two or more alternative functions to infalling light; the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine an adjustment to infalling light from the user’s field of view which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the adjustment to infalling light, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique; wherein at least one of the two or more alternative functions includes using a multi-layer lens, each layer being disposed to perform at least one filtering effect on frequencies of light.

364. Eyewear including a lens having a first adjustment disposed to be applied to a first portion of a user’s field of view and a second adjustment disposed to be applied to a second portion of the user’s field of view;

at least one of the first adjustment or the second adjustment including more than one function disposed to be applied to infalling light, the more than one function including one or more of: a selected refraction function, a selected shading/inverse-shading function, a selected coloring/tinting function, a selected color balancing function, a selected polarization function, a selected prismatic deflection function, or a selected dynamic visual optimization function.

365. Eyewear as in claim 364, wherein at least one of the first or the second portion of the user’s field of view includes an upper or a lower portion thereof.

366. Eyewear as in claim 364, wherein the first and the second portion of the user’s field of view collectively include reader glasses.

367. Eyewear as in claim 364, wherein the first and the second portion of the user’s field of view collectively include a bifocal, trifocal, or multifocal lens.

368. Eyewear as in claim 364, wherein at least one of the first or the second portion of the user’s field of view includes a central or peripheral portion thereof.

369. Eyewear as in claim 364, wherein at least one of the first or the second portion of the user’s field of view includes a part of an upper or a lower portion thereof and a part of a central or peripheral portion thereof.

370. Eyewear as in claim 364, wherein the first adjustment is responsive to one or more of: a set of infalling light or a set of images, identified in the first portion of the user’s field of view.

371. Eyewear as in claim 370, wherein the infalling light or images identified in the first portion of the user’s field of view includes one or more of: content recognized with respect to that portion of the user’s field of view, ambient light or images.

372. Eyewear as in claim 364, including a dynamic eye tracking mechanism; an outward-looking camera; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the lens is disposed to apply two or more alternative functions to infalling light; the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine an adjustment to infalling light from the user’s field of view which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the adjustment to infalling light, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique; at least one of the two or more alternative functions includes using a selected one of a first or second refraction function, a selected shading/inverse-shading function, a selected coloring/tinting function, a selected color balancing function, a selected polarization function, a selected prismatic deflection function, or a selected dynamic visual optimization function.

373. Eyewear including a dynamic eye tracking mechanism; an outward-looking camera; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the lens is disposed to apply two or more alternative functions to infalling light; the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine an adjustment to infalling light from the user’s field of view which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the adjustment to infalling light, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique;

at least one of the two or more alternative functions includes using a shading/inverse-shading or coloring/tinting element to shade/inverse-shade or color/tint at least a selected portion of the user’s field of view; the shading/inverse-shading or coloring/tinting element is disposed, when the user is looking in a selected direction, to shade or inverse-shade or color/tint the selected portion of the user’s field of view, so as to encourage the user to alter their gaze direction.

374. Eyewear including a dynamic eye tracking mechanism; an outward-looking camera; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the lens is disposed to apply two or more alternative functions to infalling light; the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine an adjustment to infalling light from the user’s field of view which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the adjustment to infalling light, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique; the dynamic eye tracking system is disposed to determine one or more of: a gaze direction or focal length of a user, the dynamic eye tracking system being disposed to determine a three-dimensional location at which the user is looking; and including an object recognition system disposed to select an object at the three-dimensional location at which the user is looking; a shading/inverse-shading or coloring/tinting element disposed to shade/inverse-shade or color/tint at least a portion of the user’s field of view associated with the object selected by the object recognition system, so as to improve a measure of visual acuity with respect to the object, for the user.
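
Illustrative sketch only (not part of the claims): selecting the lens region to shade around the object at the user’s three-dimensional gaze point, per claim 374. The projection onto lens coordinates and the circular region model are hypothetical assumptions.

```python
# Illustrative sketch of claim 374: derive a circular shading region on
# the lens from the recognized object's bounding box (in lens pixels).
# The region model and padding are assumptions.

from dataclasses import dataclass

@dataclass
class Region:
    x: float        # center, lens pixels
    y: float
    radius: float

def shading_region(object_bbox: tuple[float, float, float, float],
                   pad_px: float = 10.0) -> Region:
    """Center the shading region on the recognized object's bounding box."""
    x0, y0, x1, y1 = object_bbox
    cx, cy = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    radius = max(x1 - x0, y1 - y0) / 2.0 + pad_px
    return Region(cx, cy, radius)
```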

375. Eyewear including a dynamic eye tracking mechanism; an outward-looking camera;

a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the lens is disposed to apply two or more alternative functions to infalling light; the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine an adjustment to infalling light from the user’s field of view which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the adjustment to infalling light, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique; at least one of the two or more alternative functions includes using the lens and a second lens, using the lens to provide a first effect in at least a portion of the user’s field of view, and disposing a second effect provided by the second lens to encourage the wearer to look through the first lens.

376. Eyewear including a dynamic eye tracking mechanism; an outward-looking camera; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the lens is disposed to apply two or more alternative functions to infalling light; the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine an adjustment to infalling light from the user’s field of view which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the adjustment to infalling light, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique; at least one of the two or more alternative functions includes using the lens and a second lens, the lens having a first visual effect and the second lens having a second visual effect, and selecting between the lens and the second lens in response to a user action.

377. Eyewear including a dynamic eye tracking mechanism; an outward-looking camera;

a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the lens is disposed to apply two or more alternative functions to infalling light; the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine an adjustment to infalling light from the user’s field of view which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the adjustment to infalling light, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique; at least one of the two or more alternative functions includes using a shading/inverse-shading or coloring/tinting element disposed to shade/inverse-shade or color/tint at least a portion of the user’s field of view in response to a visual effect, the visual effect including one or more of: an amount of brightness or luminosity, a color balance, a comparison of an intensity of incoming light with an amount of sensitivity by the wearer, a comparison of a color balance with the wearer’s preferred color balance, or a degree of likely or possible damage to the wearer’s eyesight or night vision, disposed within the user’s field of view; the shading/inverse-shading or coloring/tinting element is disposed, when the user is looking in a selected direction, to shade/inverse-shade or color/tint the portion of the user’s field of view that provides relief from the visual effect.

378. Eyewear including a dynamic eye tracking mechanism; an outward-looking camera; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the lens is disposed to apply two or more alternative functions to infalling light; the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine an adjustment to infalling light from the user’s field of view which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the adjustment to infalling light, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique;

at least one of the two or more alternative functions includes using: a first signal coupled to a first view disposed to be available to a user using an interface between the user and at least a region of the user’s field of view; a second signal coupled to a second view disposed to be available to the user without using the interface between the user and the region of the user’s field of view; a comparator disposed to determine a measure of visual acuity for a user in response to a difference between the first and the second signal; the eyewear being disposed to apply at least a first visual effect to the region of the user’s field of view in response to the measure of visual acuity.

379. Eyewear including a dynamic eye tracking mechanism; an outward-looking camera; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the lens is disposed to apply two or more alternative functions to infalling light; the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine an adjustment to infalling light from the user’s field of view which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the adjustment to infalling light, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique; at least one of the two or more alternative functions includes using the lens and a second lens, the lens being disposed to apply a first selected adjustment to a forward-facing portion of a field of view, the second lens being disposed to apply a second selected adjustment to a peripheral-facing portion of the field of view, and altering the first and second selected adjustments so as to improve visual acuity with respect to the peripheral-facing portion of the field of view to nearer that of the forward-facing portion of the field of view.

380. Eyewear including a dynamic eye tracking mechanism; an outward-looking camera;

a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the lens is disposed to apply two or more alternative functions to infalling light; the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine an adjustment to infalling light from the user’s field of view which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the adjustment to infalling light, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique; at least one of the two or more alternative functions includes using: an object recognition system disposed to recognize an object viewable in a field of view and to determine a direction at which the object is viewable, disposing the lens to apply a first selected adjustment to a forward-facing portion of the field of view, disposing a second lens to apply a second selected adjustment to a peripheral-facing portion of the field of view, wherein the first and second selected adjustments include one or more visual effects responsive to the direction where the object is viewable.

381. Eyewear including a dynamic eye tracking mechanism; an outward-looking camera; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the lens is disposed to apply two or more alternative functions to infalling light; the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine an adjustment to infalling light from the user’s field of view which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the adjustment to infalling light, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique; at least one of the two or more alternative functions includes using one or more of the alternative functions to apply a selected adjustment to a peripheral-facing portion of the field of view,

so as to improve visual acuity with respect to the peripheral-facing portion of the field of view to nearer that of a forward-facing portion of the field of view.

382. Eyewear including a dynamic eye tracking mechanism; an outward-looking camera; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the lens is disposed to apply two or more alternative functions to infalling light; the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine an adjustment to infalling light from the user’s field of view which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the adjustment to infalling light, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique; a sensor disposed to detect a light brighter than ambient in a peripheral field of view; a processor coupled to the sensor and disposed to determine whether the light is approaching a direction in which the user’s vision is likely to be impaired, the processor being disposed to select one or more of the alternative functions in response thereto.

383. Eyewear including a dynamic eye tracking mechanism; an outward-looking camera; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the lens is disposed to apply two or more alternative functions to infalling light; the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine an adjustment to infalling light from the user’s field of view which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the adjustment to infalling light, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique;

a sensor disposed to detect a light brighter than ambient in a peripheral field of view; a processor coupled to the sensor, the processor being disposed to control a shading/inverse-shading or coloring/tinting effect on the lens.

384. Eyewear including a dynamic eye tracking mechanism; an outward-looking camera; a computing device disposed to perform one or more of: an artificial intelligence or machine learning technique; wherein the lens is disposed to apply two or more alternative functions to infalling light; the computing device is disposed to be responsive to the dynamic eye tracking mechanism and the outward-looking camera, so as to determine an adjustment to infalling light from the user’s field of view which enhances visibility of at least one viewable object or scene the user favors; the computing device is disposed to be responsive to the adjustment to infalling light, so as to determine a set of parameters of one or more of the artificial intelligence or machine learning technique; at least one of the two or more alternative functions includes using a reflective effect from a mirror or a deflective effect from a prism, so as to allow the user to see at an angle not otherwise within their field of view.


Description:
Personalized optics

Table of Contents

PATENT APPLICATION
INCORPORATED DISCLOSURES
COPYRIGHT NOTICE
BACKGROUND
SUMMARY OF THE DISCLOSURE
BRIEF DESCRIPTION OF THE FIGURES
DETAILED DESCRIPTION
GENERAL DISCUSSION
TERMS AND PHRASES
FIGURES AND TEXT
Fig. 1 - Active Correction or Enhancement
Fig. 2 - Retinal Image Display
Fig. 3 - Contact lenses or intra-ocular lenses
Fig. 4 - Facemask or helmet
Fig. 5 - Scopes or sights
Fig. 6 - Nerve sensors/stimulators
Fig. 7 - Used with display
Fig. 8 - Hybrid personalization
Fig. 9 - Dynamic adjustment of polarization
Fig. 10 - Adjustment of magnification
Fig. 11 - Dynamic adjustment of reflection
Fig. 12 - Dynamic adjustment of 3D presentation
Adapting to changes in light/dark viewing
Protecting eyesight from changes in light/dark environments
Fig. 13 - Illumination where the user is looking
Fig. 14 - Peripheral vision
Fig. 15 - Music and entertainment shading/inverse-shading
Fig. 16 - Controlling external devices
Fig. 17 - Hand/finger gesture sensor
Fig. 18 - Couplable circuit elements and temples
Fig. 19 - Clip-on couplable circuit elements and lenses
Fig. 20 - Multi-layer lenses
Fig. 21 - Highlighting using polarization
Combination of functions
ALTERNATIVE EMBODIMENTS
CLAIMS
ABSTRACT OF THE DISCLOSURE


Incorporated Disclosures

[1] This Application describes technologies that can be used with inventions, and other technologies, described in one or more of the following documents. This Application claims priority, to the fullest extent permitted by law, of these documents.

[2] This Application is a continuation of

— Application 17/534,444, filed November 23, 2021, naming inventor Scott LEWIS, titled “Personalized optics”, Attorney Docket No. PCP 6603a, currently pending.

[3] Each of these documents is hereby incorporated by reference as if fully set forth herein. Techniques described in this Application can be elaborated with detail found therein. These documents are sometimes referred to herein as the “Incorporated Disclosures,” or variants thereof.

Copyright Notice

[4] A portion of the disclosure of this patent document contains material subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.

Background

[5] This background is provided as a convenience to the reader and does not admit to any prior art or restrict the scope of the disclosure or the invention. This background is intended as an introduction to the general nature of technology to which the disclosure or the invention can be applied.

[6] Corrective lenses must match the needs of the wearer; more particularly, those lenses must match the needs of the wearer when viewing an object at a particular distance, or when otherwise providing assistance to the wearer in viewing. However, standardized lenses do not match every wearer, and even lenses that are specific to a particular wearer do not match every viewing distance or provide proper assistance in all circumstances. Accordingly, some corrective lenses provide more than one amount of correction, depending on distance to an object being viewed by the wearer. These are sometimes called “bifocals” or “progressive” lenses; they provide different corrective eye prescriptions depending on the position of the object in the wearer’s field of view.

[7] One drawback with bifocals or progressive lenses is that they cannot easily be standardized, as each wearer might have a different amount of correction required at each different distance. However, having bifocals or progressive lenses made-to-order can be expensive, at least in comparison with lenses having a single prescription. Another drawback that can occur is that the distance to an object being viewed might fail to match its location in the wearer’s field of view, such as when the object is in an unexpected position or when the object is moving toward or away from the wearer. This can cause inconvenience to the wearer by prompting them to move their head about in an effort to move the object into a position within their field of view that is properly corrected. If the object is moving quickly, it might occur that the wearer cannot do this well enough to anticipate the object’s movement.

[8] Another drawback that can occur, both with single-prescription lenses and with bifocals or progressive lenses, is that the wearer’s features might change. The wearer’s eye prescription can change with time. The effective prescription needed by the wearer can also change with respect to whether the wearer is squinting, or possibly other reasons. This can cause inconvenience to the wearer by failing to provide proper correction after time, or in circumstances that might not have been anticipated by the lens maker.

[9] Each of these issues, as well as other possible considerations, might cause difficulty in aspects of using eyewear that is preferably matched to the wearer and scene being viewed.

Summary of the Disclosure

[10] This summary of the disclosure is provided as a convenience to the reader, without any intent to limit or restrict the scope of the disclosure or the invention. This summary is intended as an introduction to more detailed description found in this Application, and as an overview of techniques explained in this Application. The described techniques have applicability in other fields and beyond the embodiments specifically reviewed in detail.

[11] Among other disclosures, this Application describes a system, and techniques for use, capable of providing eyewear that can dynamically adjust its effect on viewing to match a combination of the wearer, the object or scene being viewed, and possibly other conditions.

Adjusting visual features

[12] In one embodiment, the eyewear (or digital eyewear) can be disposed to be responsive to one or more of:

— sensory parameters, such as the wearer’s gaze direction or focal length; eye gestures or multiple eye gestures by the wearer; other eye activity by the wearer, such as pupil or iris size, blink rate, squinting, eye twitching or nystagmus, saccades; or other senses such as hearing, smell, or touch (possibly including the wearer triggering a control on the eyewear, conducting a hand or other body gesture, or otherwise as described herein);

— medical conditions, such as whether the wearer is subject to allergies, “dry eyes” and related conditions, migraines/photophobia or related conditions, sleep deprivation, epilepsy or other seizure concerns, being under the influence of alcohol or other substances, or otherwise as described herein;

— wearer parameters, such as the wearer’s eye activity, or changes thereof; the wearer’s location or distance from a selected object, or changes thereof; or otherwise as described herein;

— environmental parameters, such as features of the wearer’s field of view, including luminance, color prominence, glare, visual blur or noise, or otherwise as described herein; presence of particular objects or people in view, such as persons known to the wearer, or such as weapons (guns, knives, or otherwise as described herein); or features of the ambient environment, such as a relationship between the wearer and the scene or object being viewed, such as whether the wearer is in motion with respect thereto, or otherwise as described herein.

[13] In one embodiment, the eyewear or digital eyewear can be disposed to be responsive to wearer activity, such as one or more of:

— an activity being conducted by the wearer, such as whether the wearer is engaged in police, military, firefighter, emergency responder, search and rescue activity, or otherwise as described herein;

— whether the wearer is engaged in operating a vehicle, such as a racing car, a speed boat, an aircraft, another type of vehicle, or otherwise as described herein; whether the wearer is engaged in observing a sporting activity or other event, such as a baseball or football game, a live-action or recorded concert, a movie or other presentation, a theme-park event or other interactive experience, an advertisement or store front, an augmented reality (AR) or virtual reality (VR) event or other three-dimensional (3D) experience, or otherwise as described herein;

— whether the wearer is reading, conversing with another person, viewing a target at a distance, viewing a panorama, or otherwise as described herein; or other possible wearer activities.

[14] In one embodiment, the type of eyewear or digital eyewear can be disposed to be particular to a use being made by the wearer. For example, wearable eyewear can include one or more of:

— glasses, contact lenses, a retinal image display (RID), an intra-ocular lens (IOL), or otherwise as described herein;

— a helmet, such as might be disposed for use by police, military, firefighter, emergency responder, search and rescue activity, or other personnel;

— augmented eyewear, such as a microscope or telescope, a rifle scope or other scope, binoculars, a still or motion-picture camera, “night vision” glasses or other infrared detectors, or otherwise as described herein;

— nerve sensors or stimulators, such as optic nerve sensors or stimulators, optical cortex or other brain element sensors or stimulators, or otherwise as described herein;

— whether the eyewear can be used in combination or conjunction with other devices, such as smartphones, smart watches, or other wearable or implantable devices; concert screens or other displays; AR presentations; cameras, scopes, and related devices; wireless or other electromagnetic signals; medical devices; or otherwise as described herein.

[15] In one embodiment, the eyewear or digital eyewear can be disposed to be responsive to a wearer’s field of view (FOV), or a portion of the wearer’s FOV, such as whether the wearer’s FOV is long-range or short-range, higher or lower, right or left, central or peripheral vision, or otherwise as described herein.

[16] In one embodiment, the eyewear or digital eyewear can be disposed to adjust visual features presented to the wearer, such as using changes in refraction; changes in polarization or shading; changes in color filtering, color injection, false coloring, or otherwise as described herein; changes in prismatic angles or functions; changes in presentation of 3D displays; or otherwise as described herein.

[17] For example, the eyewear or digital eyewear can be disposed to adjust visual features presented to the wearer, so as to encourage the wearer to look in a particular direction or through a particular region of the lenses. In such cases, the eyewear can include multiple lenses with combined operations to provide further personalization.

Multiple lenses and combinations

[18] In one embodiment, a lens can include a first region for vision correction (possibly using refraction), such as for close-range viewing, and a second region for different vision correction (also possibly using refraction), such as for longer-range viewing. The first region or the second region, or portions thereof, can be adjusted so as to optimize the viewer’s clarity of vision. For one example, the amount of refraction can be adjusted in each region, such as using electronic control of the refraction. This can have the effect that the wearer can be provided with a relatively optimized view despite the distance of objects at which they are looking. For another example, one or more of the regions, or portions thereof, can be shaded or inverse-shaded, such as using electronic control of the shading or inverse-shading. This can have the effect that the wearer can be provided with a relatively optimized view despite the direction in which they are looking.
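By way of illustration only, the following minimal sketch (in Python) shows how electronic control of per-region refraction might be driven by an estimated viewing distance. The region names, the diopter cap, and the helper functions are hypothetical and are not taken from this disclosure.

    # Hypothetical sketch: choose a refractive add for each lens region
    # from an estimated viewing distance (all constants illustrative).

    def added_power_diopters(distance_m: float) -> float:
        """Accommodative demand in diopters is the reciprocal of the
        viewing distance in meters; clamp to a plausible range."""
        demand = 1.0 / max(distance_m, 0.1)
        return min(demand, 3.0)  # cap near a typical reading add

    def region_corrections(gaze_distance_m: float) -> dict:
        """Return an illustrative added power per lens region: the lower
        region follows close-range demand, the upper follows far demand."""
        return {
            "lower": added_power_diopters(min(gaze_distance_m, 0.5)),
            "upper": added_power_diopters(max(gaze_distance_m, 6.0)),
        }

    print(region_corrections(0.4))    # reading distance
    print(region_corrections(10.0))   # distance viewing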

[19] In another embodiment, the eyewear or digital eyewear can include a combination of more than one lens. For example, a first lens can include a first region for vision correction (again, e.g., using refraction) for close-range viewing and a second region for vision correction for longer-range viewing. A second lens can be aligned with the first lens and can shade either the first region or the second region, so as to encourage the wearer to focus on relative close-range viewing or on relative longer-range viewing. The second lens can be responsive to features of the wearer’s eye, such as when the wearer’s eyes become strained or dried-out in response to excessive close-range viewing. When the eyewear detects a prospective or actual problem with respect to the wearer’s attention pattern, the eyewear can cause the second lens to shade aligned regions of the first lens, so as to encourage the wearer to alter their attention pattern. In alternative embodiments, the eyewear can respond to the wearer’s attention pattern with respect to bright lights or lights with glare or flashing, concentration on small objects, lights with a potentially adverse color balance, or other aspects of the wearer’s field of view (FOV) that might affect the wearer’s vision, attention, or medical conditions.

[20] In another embodiment, the eyewear or digital eyewear can include more than one function, each associated with a separate portion of the user’s field of view. For example, the lenses can be separated into upper/lower portions, such as in “reader” glasses; or such as in bifocal, trifocal, or multifocal lenses. In such cases, the upper/lower portions of the lenses can each be disposed with separate functions. The upper/lower portions of the lenses can each include different refractive functions. Similarly, the upper/lower portions of the lenses can each include different shading/inverse-shading functions, different coloring/tinting or color balancing functions, different polarization or prismatic deflection functions, different dynamic visual optimization functions, or as otherwise described herein.

[21] In such cases, the different functions can be responsive to the particular portion of the user’s field of view affected by the lenses, such as when the lenses relate to upper/lower portions or central/peripheral portions of the user’s field of view. Alternatively, the different functions can be responsive to one or more of the following: (A) content recognized with respect to the portion of the user’s field of view within the scope of the lenses; (B) ambient circumstances recognized with respect to the portion of the user’s field of view within the scope of the lenses, such as time of day or location; (C) user inputs provided at a time when the user is viewing content using the lenses; (D) defined “bookmarks” with respect to functions to be applied with respect to one or more of the preceding factors; or as otherwise described herein.

Shading/inverse-shading and illumination

[22] For example, the eyewear or digital eyewear can be disposed to adjust shading with respect to an object or a portion of the user’s field of view (FOV) at which the user is looking. In such cases, when the user is looking in a particular direction, the eyewear can be disposed to shade only portions of the user’s FOV in that direction. Similarly, in such cases, when the user is looking at a particular object, such as when looking in a particular direction and at a particular depth of focus so as to distinguish a selected object, the eyewear can be disposed to shade only that selected object. An outbound camera, such as a camera mounted behind one or more of the lenses and disposed to view a location or region at which the user is looking, can be disposed to determine an amount of shading that optimizes the user’s view, or to determine an amount of shading that optimizes a clarity of the location or region at which the user is looking.

[23] In such cases, the eyewear or digital eyewear can be disposed to detect where the user is looking in response to one or more of: a dynamic eye tracking system, or one or more “outbound” cameras disposed to review the user’s field of view (FOV) from inside one or more lenses. For example, the dynamic eye tracking system can be disposed to determine in what direction, and at what depth of focus, the user is looking. This can have the effect that the dynamic eye tracking system can determine a location in three-dimensional (3D) space at which the user is looking. For another example, the outbound camera can be disposed to examine the user’s FOV from inside one or more of the lenses. Either of these techniques can have the effect that when the user moves their head or otherwise alters their FOV, the eyewear can adjust the 3D location that is shaded. More precisely, the eyewear can adjust a location on each lens so that the joint focus of the user’s eyes at that 3D location is shaded.
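As an illustrative sketch only: one way to compute the three-dimensional point at which the eyes jointly focus is to take the midpoint of the shortest segment between the two gaze rays reported by an eye tracking system. The coordinate conventions and function names below are hypothetical, not taken from this disclosure.

    import numpy as np

    def gaze_point_3d(p_left, d_left, p_right, d_right):
        """Estimate the 3D fixation point as the midpoint of the shortest
        segment between two gaze rays (eye position p, unit direction d)."""
        p1, d1 = np.asarray(p_left, float), np.asarray(d_left, float)
        p2, d2 = np.asarray(p_right, float), np.asarray(d_right, float)
        w = p1 - p2
        a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
        d, e = d1 @ w, d2 @ w
        denom = a * c - b * b
        if abs(denom) < 1e-9:            # near-parallel rays: treat gaze as distant
            return p1 + d1 * 1e3
        s = (b * e - c * d) / denom      # closest-approach parameter, left ray
        t = (a * e - b * d) / denom      # closest-approach parameter, right ray
        return (p1 + s * d1 + p2 + t * d2) / 2.0

    # Example: eyes 6 cm apart converging on a point about 0.5 m ahead.
    left, right = np.array([-0.03, 0.0, 0.0]), np.array([0.03, 0.0, 0.0])
    target = np.array([0.0, 0.0, 0.5])
    dl = (target - left) / np.linalg.norm(target - left)
    dr = (target - right) / np.linalg.norm(target - right)
    print(gaze_point_3d(left, dl, right, dr))   # approximately [0, 0, 0.5]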

[24] This can have the effect that the eyewear or digital eyewear “shades where the user is looking”. This can be applied to inverse shading as well. When the user adjusts the direction they are looking, adjusts the depth of field at which they are looking, tilts their head, squints, or otherwise moves due to an external force, the eyewear can shade where the user looks, and if so desired, only where the user looks. For example, the user might be in a vehicle, such as an aircraft, racecar, or sailboat or speedboat; or the user might be looking at a dashboard or instrument; or the user might be looking at an external object. In such cases, the eyewear can shade where the user is looking, notwithstanding the user’s head or eye movement, the vehicle’s movement, or other movement that might affect where the user is looking. Similarly, the eyewear or digital eyewear can shade with respect to a particular light source, such as a welding torch, a glass blowing element, a firearm or fireworks, or as otherwise described herein.

[25] In another embodiment, the eyewear or digital eyewear can be disposed to provide light in a direction where the user is looking, such as in response to a dynamic eye tracking mechanism. This can have the effect that the eyewear can illuminate objects at which the user is looking; thus, the eyewear can “light where the user is looking”. Thus, when the user adjusts the direction in which they are looking, adjusts the depth of field at which they are looking, tilts their head, squints, or otherwise moves due to an external force, the eyewear can “light where the user looks”, and if so desired, only where the user looks. This can be applied to specific portions of the user’s field of view as well, such as only applied to an upper/lower or a central/peripheral portion of the lenses. For example, as described above, the user might be in a vehicle, such as an aircraft, racecar, or sailboat or speedboat; or the user might be looking at a dashboard or instrument; or the user might be looking at an external object. In such cases, the eyewear can illuminate where the user is looking, notwithstanding the user’s head or eye movement, the vehicle’s movement, or other movement that might affect where the user is looking.

[26] In one such case, the eyewear or digital eyewear can include a lamp (such as a laser or an LED) and can be disposed to direct the light from the lamp in a direction or at a focal length where the user’s eyes are focusing. The lamp can be disposed on a portion of the eyewear, such as on a front piece (such as at a location between the user’s eyes) or on an earpiece, sometimes also referred to herein as a “temple” (such as at a location near the user’s temple), and disposed to provide a light beam in the direction in which the user is looking, focused at the distance at which the user is focusing.

Differential shading/inverse-shading

[27] In one embodiment, the eyewear or digital eyewear can be disposed to perform differential amounts of shading/inverse-shading for distinct locations in the user’s field of view. For example, the eyewear or digital eyewear can be disposed to perform a first amount of shading/inverse-shading in a close-range portion of the user’s field of view and a second amount of shading/inverse-shading in a distant portion of the user’s field of view. For a first example, when the user is reading in a bright environment, such as in sunlight, the eyewear or digital eyewear can be disposed to shade/inverse-shade the portion of the user’s field of view associated with reading, to account for brightness of the reading material. For a second example, when the user is operating an aircraft, the eyewear or digital eyewear can be disposed to shade/inverse-shade the portion of the user’s field of view associated with a bright sky field, to account for brightness of the ambient environment.
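By way of illustration only, the following sketch shows one simple rule for choosing a different amount of shading for distinct portions of the field of view, attenuating each region in proportion to how far its measured luminance exceeds a comfort level; the constants and region names are hypothetical.

    # Hypothetical sketch: per-region shading from measured luminance.

    def transmittance(measured_nits: float, comfort_nits: float = 250.0) -> float:
        """Pass all light at or below the comfort level; above it,
        attenuate proportionally, never darker than 10% transmission."""
        if measured_nits <= comfort_nits:
            return 1.0
        return max(comfort_nits / measured_nits, 0.1)

    regions = {"reading": 1200.0, "sky": 8000.0, "instruments": 180.0}
    print({name: round(transmittance(nits), 2) for name, nits in regions.items()})
    # e.g. {'reading': 0.21, 'sky': 0.1, 'instruments': 1.0}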

[28] Similarly, the eyewear or digital eyewear can be disposed to adjust coloring/tinting or color balance of at least a portion of the user’s field of view, such as in response to the brightness or coloring/tinting of the ambient environment. For a first example, when the user is determined to be sensitive to bright light, such as when the user is subject to migraine or photophobia, the coloring/tinting of the ambient environment can be adjusted to reduce (in whole or in part) the amount of blue/ultraviolet light in the coloring/tinting of the user’s field of view. For a second example, when the user is determined to be about to be, or currently, subject to migraine, the coloring/tinting or color balance of the ambient environment can be adjusted to increase the amount of green light received by the user’s eyes.

[29] In another embodiment, when the user is subject to at least some color blindness (whether natural or induced by an adjustment to the coloring/tinting of the user’s field of view), the coloring/tinting of the ambient environment can be adjusted to enhance those portions of the user’s field of view that relate to particular colors for which the user’s attention is to be drawn. For example, when the user is subject to red/green color blindness or when the user’s field of view is filtered to restrict coloring/tinting to primarily shades of green, the user’s field of view can be adjusted to show red coloring/tinting (such as traffic lights or signs when the user is driving) in a brighter format or in a flashing format, so as to allow the user to determine the presence of those colors even when the user is unable to see them directly.
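As an illustration of the coloring/tinting adjustments described above, the sketch below applies per-channel gains to damp blue light, emphasize green, or brighten reds for a wearer with red/green color blindness. The gain values are hypothetical and would in practice be tuned to the wearer.

    # Hypothetical sketch: per-channel color gains for tinting effects.

    def apply_gains(rgb, gains):
        """Scale an (r, g, b) triple in 0..255 by per-channel gains."""
        return tuple(min(int(c * g), 255) for c, g in zip(rgb, gains))

    MIGRAINE_RELIEF = (0.8, 1.2, 0.5)    # damp blue, favor green light
    RED_EMPHASIS = (1.5, 0.9, 0.9)       # make reds stand out for the wearer

    pixel = (180, 140, 200)
    print(apply_gains(pixel, MIGRAINE_RELIEF))   # (144, 168, 100)
    print(apply_gains(pixel, RED_EMPHASIS))      # (255, 126, 180)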

Music and entertainment shading

[30] In one embodiment, the eyewear or digital eyewear can be disposed to respond to an audio/video signal, such as a song or other music presentation. The eyewear or digital eyewear can receive the audio/video signal and shade/inverse-shade in response thereto. Similarly, the eyewear or digital eyewear can receive the audio/video signal and illuminate in response thereto. This can have the effect that the user experiences a shading/inverse-shading effect or an illumination effect in response to a song, another music presentation, or another audio/video signal.

[31] For example, the eyewear or digital eyewear can be disposed to allow an external device or another person to control its response to the audio/video signal. In such cases, the eyewear or digital eyewear can be disposed to allow a “DJ” or another entertainer to send a signal (such as an electromagnetic or ultrasound signal) to the eyewear or digital eyewear, so as to provide the user with a music shading experience.

[32] For example, the eyewear or digital eyewear can be disposed to provide a color change with respect to one or more lenses in response to the song, other music presentation, or other audio/video. This can have the effect that an external device or another person can provide the user with a colorized experience to go with associated music. This can alternatively have the effect that an external device or another person can provide the user with a colorized enhancement of an alarm or other audio/video (such as a fire alarm or an emergency vehicle siren).

[33] In one embodiment, the eyewear or digital eyewear can be disposed to perform shading/inverse-shading or coloring/tinting in response to an external measurement device, such as one or more of the following: an Apple Watch™, Fitbit™, blood oximeter, blood pressure monitor, heart rate monitor, mood-sensing ring, thermometer or other temperature sensor, or as otherwise described herein. This can have the effect that the user can obtain relatively quick feedback with respect to their own physical state, without having to perform any measurement or review any measuring device. This can be useful when the user is engaged in a sport or another activity requiring consistent attention and/or rapid reactions, such as operating an aircraft, race car, motorcycle, or dirt bike (or many other vehicles); playing a video game (particularly a “first-person shooter”); or as otherwise described herein.
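By way of illustration, the following sketch maps a reading from an external measurement device (here, a heart-rate monitor) to a coloring/tinting cue, so the user receives feedback without looking away; the thresholds and tint names are hypothetical.

    # Hypothetical sketch: biometric reading -> tint cue.

    def tint_for_heart_rate(bpm: int) -> str:
        """Return an illustrative tint cue for a heart-rate value."""
        if bpm < 100:
            return "no tint"
        if bpm < 140:
            return "amber edge tint"
        return "red edge tint"

    for bpm in (72, 118, 165):
        print(bpm, "->", tint_for_heart_rate(bpm))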

[34] In one embodiment, the eyewear or digital eyewear can be disposed to perform shading/inverse-shading or coloring/tinting in response to the user’s voice or facial movements. For example, the eyewear or digital eyewear can be disposed to present a variable set of shading/inverse-shading or coloring/tinting when the user is speaking, singing, grunting, or otherwise making artificial noises, or as otherwise described herein. For another example, the eyewear or digital eyewear can be disposed to present a variable set of shading/inverse-shading or coloring/tinting in response to a set of the user’s gestures, such as when the user moves their eyes, nose, mouth, chin, neck, or other elements of their head/neck in selected ways.

[35] For another example, the eyewear or digital eyewear can be disposed to present a picture on the eyewear’s lenses (or on a device coupled thereto, such as a facemask or other facial covering) in response to the user’s speaking, singing, or other examples such as described herein. In such cases, the eyewear or digital eyewear can be disposed to present a picture of how the user’s facial features would look without the facemask or facial covering, in response to movement or sound provided by the user’s facial features. This can have the effect that the user can wear a protective covering on their face (such as when they are ill or when they are protecting against transmitted illness) and still present a natural look to others while speaking or otherwise communicating.

[36] For example, the user can wear a protective facemask and still show facial features while speaking, singing, or as otherwise described herein. An audio microphone can be disposed to receive sounds from the user’s mouth, throat, or other vocal apparatus, and can be disposed to present a picture of how the user’s face/mouth would look while the user was speaking, singing, or as otherwise described herein. The picture also need not look exactly as the user’s facial features would look; the picture can be presented on a facemask or other facial shield, showing a picture of an arbitrary face/mouth. For example, the picture presented can be disposed to show a caricature of the user’s face, a filtered version of the user’s face (such as using an image filter), a picture of another person’s face (such as a celebrity or a friend/relative of the user), a picture of an animal or cartoon, another arbitrary image, or as otherwise described herein.

Controlling external devices

[37] In another such case, the eyewear or digital eyewear can include a transmitter, such as an electromagnetic or ultrasonic transmitter, disposed to control an external device, such as a smartphone or other mobile device. For example, when the user is looking at a smartphone (or other mobile device), the eyewear can send a signal to the mobile device that directs the mobile device to highlight a designated portion of the mobile device’s screen. Thus, when the user is looking at a particular portion of the screen, the eyewear can direct the mobile device to highlight only that portion of the screen.

[38] This can have the effect that the mobile device can show the user just what the user is looking for. Possible advantages include (A) enabling the user to more easily see that portion of the screen, (B) saving power usage or battery time available to the smartphone, (C) providing increased brightness for the designated portion of the screen without using excessive power or battery time, or possibly other advantages. Alternatively, this can also have the effect that the mobile device can urge the user to review particular portions of the screen, such as by moving the highlighted portions of the screen across a sequence of text to improve reading speed.
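As an illustrative sketch, the eyewear might translate the user's gaze point on a screen into a request that the device highlight only that region. The JSON message format below is hypothetical, and any real transport (e.g., a wireless link) is omitted.

    import json

    # Hypothetical sketch: gaze point -> "highlight this region" command.

    def highlight_request(gaze_x: float, gaze_y: float,
                          screen_w: int, screen_h: int,
                          radius_px: int = 120) -> str:
        """Build a command asking the device to brighten the region
        around the gaze point (given in 0..1 screen coordinates) and
        dim the remainder to save power."""
        x = int(min(max(gaze_x, 0.0), 1.0) * screen_w)
        y = int(min(max(gaze_y, 0.0), 1.0) * screen_h)
        return json.dumps({"cmd": "highlight", "x": x, "y": y,
                           "radius": radius_px, "dim_rest": True})

    print(highlight_request(0.25, 0.6, 1170, 2532))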

[39] In additional examples of these techniques, the eyewear or digital eyewear can be disposed to recognize commands or requests from the user to alter the intensity (or other features) of the illumination. In such cases, user commands can include capacitive or touch controls, eye or face gestures, finger or hand gestures, head or mouth movements, voice commands, electromagnetic commands from another device, other user commands described herein, or other ways the user can direct the eyewear.

[40] In such cases, the eyewear or digital eyewear can be disposed to allow the user to direct the illumination to have a different amount of area at the illuminated device, a different angle or amount of polarization, a different color or color balance (or a different set of colors in a varying color pattern), or otherwise as described herein. In additional such cases, the eyewear can be disposed to direct the mobile device to increase a magnification, or to impose other visual effects, on the portion of the screen being viewed by the user. For example, the eyewear can be disposed to alter a color or color balance of that portion, to cause that portion to blink, or otherwise change a way that portion can be viewed by the user.

[41] For another example, the eyewear or digital eyewear can be disposed to operate with multiple display screens, whether controlled by a single device (either a mobile device or a “desktop” device) or multiple devices. In such cases, the eyewear can determine whether the user is looking at a first screen or a second screen, and in response thereto, cause the screen being looked at (the “active” screen) to have a first visual effect and the screen not being looked at (the “inactive” screen) to have a second visual effect. For example, the eyewear can direct the inactive screen to be substantially dimmed, so the user is not subject to excessive brightness directed at their peripheral vision. For another example, the eyewear can direct the inactive screen to have its color balance altered: (A) the inactive screen can be filtered to be more amber, so as to reduce peripheral-vision brightness in the blue portion of the visual spectrum; or (B) the inactive screen can be directed to provide green light, so as to prevent or reduce the likelihood of, or to treat or reduce the severity of, migraines, photophobia, or neuro-ophthalmic disorder.
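For illustration only, a minimal sketch of the active/inactive screen behavior described above: the screen being looked at keeps its normal appearance, while the others are dimmed and shifted toward amber. The effect labels are hypothetical placeholders for whatever display commands a given device accepts.

    # Hypothetical sketch: choose an effect per screen from gaze.

    def screen_effects(active_id: str, screen_ids: list) -> dict:
        """Active screen stays normal; inactive screens are dimmed and
        amber-shifted to reduce blue light in peripheral vision."""
        return {sid: "normal" if sid == active_id else "dim 40%, amber shift"
                for sid in screen_ids}

    print(screen_effects("left", ["left", "right", "laptop"]))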

Controlling vehicles

[42] For example, when the user is looking at an external device with a control panel or other control elements, the eyewear can operate the external device using one or more controls in the control panel. For another example, when the external device includes a control for a vehicle (such as a ground vehicle, an aircraft, or a watercraft), the user can control the vehicle, or elements thereof, by looking in a direction relevant to the control and executing a gesture or other technique for triggering the control. In such cases, the user can use a gesture in combination with a gaze direction or focusing distance to aid the user’s ability to operate the vehicle; this can be valuable when the vehicle is operated at speed or when the user is making rapid operation decisions.

[43] For example, the user can be allowed to operate a vehicle using eye/face gestures or hand/finger gestures, such as one or more of the following: starting the vehicle, setting a temperature or related controls, turning on/off air conditioning or defrosters or related controls, operating a radio or related equipment, opening/closing doors or windows, opening/closing an engine hood or a trunk, opening/closing a gas or other fluid entry, extruding/retracting cup holders or related equipment, turning on/off internal lights or displays, turning on/off or adjusting external lights, presenting/highlighting alerts such as from the engine or fuel reserves, controlling “cruise control” or other automatic driving controls, or controlling other controls relating to electric vehicles such as golf carts.

[44] For example, when the user is operating a vehicle having a dashboard with control elements, such as an automobile or a racing car (with control elements on a dashboard or steering wheel) or such as an aircraft (with control elements on level and upper dashboards, a control yoke, or a throttle), the user might benefit from (A) maintaining eye contact with a path of travel, (B) maintaining control contact with one of those controls when operating a different control, or otherwise as described herein. In such cases, the eyewear or digital eyewear can respond to one or more of: a user gaze direction, a user hand/eye gesture or other body movement, so as to direct operation of a first vehicle control while the user otherwise maintains operation of a second vehicle control.

[45] For some possible examples:

— In an aircraft, the user can operate engine and/or flight surface controls using one or more eye/facial gestures or one or more hand/finger gestures (such as possibly hand/finger gestures as described in the Incorporated Disclosures). For example, the user can glance upward/downward once (or 2-3 times in succession) and perform a first selected gesture (such as a blink or squint) to increase/decrease a throttle setting. For another example, the user can glance rightward/leftward and perform a second (not necessarily different) selected gesture (such as a hand wave or a finger touch at a selected location on a control panel near the pilot) to execute a slip or turn. Alternatively, the user can perform other combinations of actions (thus, of eye/facial gestures, hand/finger gestures, and/or other body movements) to operate other aircraft controls. For another example, the user can glance or look at a particular control element and perform a selected gesture to operate that control element; thus, looking at an artificial horizon and performing a thumbs-up (or down) gesture to raise (or lower) an elevator control. Alternatively, the user can perform other combinations of actions, such as described herein, to operate other aircraft controls (including such possibilities as operating cabin lights, a radio, or otherwise as described herein).

— In an automobile or racing car, the user can operate accelerator/brake, gearing, and/or turning controls, using one or more eye/facial gestures or one or more hand/finger gestures. For example, the user can glance upward/downward once (or 2-3 times in succession) and perform a first selected gesture (such as a blink or squint) to apply/relax an accelerator and/or apply/release a brake. For another example, the user can glance rightward/leftward and perform a second (not necessarily different) selected gesture (such as a hand wave or a finger touch at a selected location on a wheel) to execute a turn. Alternatively, the user can perform other combinations of actions, such as described herein, to operate other vehicle controls (including such possibilities as operating doors, windows, locks, a trunk, or otherwise as described herein). A table-driven sketch of such glance-and-gesture mappings appears after this list.
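For illustration only, the glance-and-gesture combinations in the examples above might be implemented as a table-driven dispatch; the command names, gesture labels, and mapping below are hypothetical.

    # Hypothetical sketch: (glance, gesture) -> vehicle command.

    COMMANDS = {
        ("up", "blink"): "throttle increase",
        ("down", "blink"): "throttle decrease",
        ("right", "hand_wave"): "turn right",
        ("left", "hand_wave"): "turn left",
    }

    def dispatch(glance: str, gesture: str) -> str:
        """Look up the command for a glance direction plus gesture;
        unrecognized combinations deliberately do nothing."""
        return COMMANDS.get((glance, gesture), "no-op")

    print(dispatch("up", "blink"))         # throttle increase
    print(dispatch("left", "hand_wave"))   # turn left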

[46] Other and further possibilities are not limited to aircraft, automobiles, or racing cars. For example, the user can receive information with respect to a travel surface while operating a motorcycle, mountain bike, dirt bike, or bicycle. The user can maintain their hands on handlebars to control a direction of the vehicle while concurrently performing eye/facial gestures to operate one or more vehicle controls, such as altering an accelerator, a clutch or throttle, a gear selection, one or more warning blinkers or lights, or otherwise as described herein. The eyewear or digital eyewear can also present augmented reality or virtual reality (AR/VR) input to the user to provide periodic, requested, or triggered information with respect to travel, such as warnings of racing conditions, road hazards, directions, timing, or otherwise as described herein.

Controlling other devices

[47] For example, when the user is a law enforcement officer (or military personnel), the eyewear or digital eyewear can control a weapon, such as a pistol, rifle, or taser. The user can set a safety mechanism on a weapon so as to operate that weapon only when the user explicitly releases the safety using an eye/facial gesture.

[48] In one alternative, the user can select a set of persons as explicit non-targets (such as non-suspect citizens or other law enforcement officers) so as to operate the weapon only when the eyewear or digital eyewear recognizes that it is not directed at (thus, pointed at or otherwise targeting) one or more of those non-targets. This can have the effect of reducing the likelihood of the law enforcement officer accidentally targeting one or more of the explicitly designated non-targets.

[49] In another alternative, the weapon can be associated with the user in response to an iris scan or other biometric device. The eyewear or digital eyewear can include (or be coupled to) an iris scanner or other biometric device, which can be disposed to lock the weapon if the user is not detected as the proper operator of the weapon. This can have the effect that if a law enforcement officer’s gun is taken away, the new possessor of the weapon cannot use it. The weapon can be associated with more than one such user, such as a set of law enforcement officers who work together.

[50] In another alternative, the weapon can cooperate with the eyewear or digital eyewear so as to determine whether the weapon is properly aligned with an object at which the user is looking. For example, if the user is looking at a particular target, but the weapon is in fact aimed elsewhere, the eyewear or digital eyewear can either (A) signal the user that the weapon is mis-aimed, or (B) signal the weapon not to fire at the wrongly selected target. Similarly, the eyewear or digital eyewear can be disposed to present one or more of: (C) a laser sight indicating where the weapon is aimed, or (D) a virtual image of what a laser sight would look like, either of which might allow the law enforcement officer to accurately aim at the target, in the latter case without revealing the aim to the target. This can have the effect that the weapon is less likely to be fired at an erroneous target.
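By way of illustration, determining whether the weapon is aligned with the object the user is looking at can reduce to comparing the angle between the weapon's aim vector and the gaze vector against a tolerance; the vectors and the two-degree threshold below are hypothetical.

    import math

    # Hypothetical sketch: flag a mis-aimed weapon from aim vs. gaze.

    def misaligned(aim, gaze, tolerance_deg: float = 2.0) -> bool:
        """True if the angle between the aim and gaze vectors exceeds
        the tolerance (vectors need not be pre-normalized)."""
        dot = sum(a * g for a, g in zip(aim, gaze))
        norm = (math.sqrt(sum(a * a for a in aim))
                * math.sqrt(sum(g * g for g in gaze)))
        angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
        return angle > tolerance_deg

    print(misaligned((0, 0, 1), (0.001, 0, 1)))   # False: well aligned
    print(misaligned((0, 0, 1), (0.2, 0, 1)))     # True: aimed elsewhere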

[51] In another alternative, the eyewear or digital eyewear can detect one or more gestures or other controls by the user and operate the external device itself in response thereto. For example, when the user is a law enforcement officer, the eyewear or digital eyewear can detect an eye/face gesture and perform an associated operation such as “squint to shoot” or “wink to shoot” with a firearm or other weapon.

[52] In such cases, the user might personalize their selection of gestures and the actions associated therewith. Alternatively, the eyewear or digital eyewear can be disposed to adjust to user gesture capability when determining its sensitivity to those gestures. For example, a user who can easily manipulate their nose might be offered a selection of gestures associated with nasal movements, such as flaring the nostrils, raising the nose bridge, or wiggling the nose; a user unable to easily perform such actions might be offered a different selection of gestures.

[53] For another example, when the user is an emergency responder (or medical personnel), the eyewear or digital eyewear can control medical equipment, such as presenting a warning to medical personnel with respect to a patient condition while performing a surgical operation (or a dental surgery operation). During or prior to a medical procedure, the medical personnel can select a warning trigger for a medical sensor, such as using an eye/facial gesture. When the medical personnel is performing the medical procedure, if the medical sensor presents a sensor value that satisfies the warning trigger, a computing device coupled to the medical sensor can so indicate, thus prompting the eyewear or digital eyewear to present the warning to the medical personnel without the latter having to direct their attention or their gaze toward the medical sensor during the procedure.

[54] For another example, when the user is participating in a sport, the eyewear or digital eyewear can be disposed to identify a direction in which they are looking and can match that with a direction in which they are directing sports equipment. For a golfer attempting a putt, the eyewear or digital eyewear can be disposed to show a direction in which the ball would move given the angle of the putter and the degree of backswing the player is allocating; when this lines up with a direction the player is looking, the eyewear or digital eyewear can be disposed to present a confirming notification. When available, the eyewear or digital eyewear can be disposed to compute a likely path in response to a contour map of a putting green, and where possible, in response to a wind direction and strength.
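For illustration, the putting example can be reduced to comparing the direction the ball would take (from the putter-face angle) with the direction the player is looking, and confirming when the two agree within a tolerance; the angle convention and threshold are hypothetical.

    # Hypothetical sketch: confirm a putt line against the gaze line.

    def putt_confirmed(face_angle_deg: float, gaze_angle_deg: float,
                       tolerance_deg: float = 1.5) -> bool:
        """Both angles are measured in the plane of the green from a
        common reference; the difference is wrapped to [-180, 180]."""
        diff = abs((face_angle_deg - gaze_angle_deg + 180.0) % 360.0 - 180.0)
        return diff <= tolerance_deg

    print(putt_confirmed(87.0, 88.0))   # True: line matches the gaze
    print(putt_confirmed(87.0, 95.0))   # False: ball would miss the line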

[55] For another example, the eyewear or digital eyewear can be disposed to send an electromagnetic or other signal to an external device, so as to allow a user to control that device using eye/face gestures or other gestures detectable by the eyewear or digital eyewear. In such cases, the eyewear or digital eyewear can be disposed to detect the user’s eye/face gestures and to send one or more appropriate electromagnetic or other signals to the external device to operate one or more of its functions.

[56] For some possible examples:

— The user can use the eyewear or digital eyewear to control a garage door or other automatic device using one or more eye/face gestures. The eyewear or digital eyewear can detect the one or more eye/face gestures and, in response thereto, send an electromagnetic signal to the garage door to cause it to open/close, as the user instructs.

— The user can use the eyewear or digital eyewear to control a security door in response to an iris scanner or other biometric scanner. The eyewear or digital eyewear can include (or be coupled to) the iris scanner or other biometric scanner and can send an electromagnetic signal to the security door to cause it to open/close, as the user instructs.

— The user can use the eyewear or digital eyewear to control an external device, so as to emulate an automobile key, an entertainment device, a game controller, a house lights controller, a laptop (or other computing device) keyboard or pointing device, a sound system controller, a television remote, a universal remote, or any other remote controller. The eyewear or digital eyewear can respond to eye/face gestures and in response thereto, send one or more electromagnetic signals to external devices so as to emulate an appropriate controller, as the user instructs.

— The user can use the eyewear or digital eyewear to control an external device in response to an RFID transponder. The eyewear or digital eyewear can include (or be coupled to) the RFID transponder and can allow the transponder to operate with the external device. Where applicable, the user can send one or more signals to control the external device in response to eye/face gestures.

Control using other devices

[57] In one embodiment, the eyewear or digital eyewear can be disposed to cooperate with one or more external devices, so as to identify control signals from the user and adjust operation of the eyewear or digital eyewear, the external devices, or both. For example, the external devices can include a smartphone or mobile device, such as a mobile device including a camera disposed to capture one or more images of the user and including a processor disposed to operate on those images to detect one or more eye/face gestures by the user.

[58] For example, the mobile device can be disposed to recognize one or more eye/face gestures and/or hand/finger gestures by the user and to adjust one or more of the following:

— A level or volume with respect to music or other audio/video presentation to the user. For example, the user can raise/lower the volume until satisfied.

— An offer to send, or an acceptance to receive, a screen-sharing or other AR/VR communication with another eyewear or digital eyewear. For example, the user can present their own field of view to another user who is willing to receive it.

— An operation of the smartphone or mobile device, such as to make or take a call, send or read a text message (possibly using an AR/VR display with the eyewear or digital eyewear), send or read a social media communication, or as otherwise described herein. For example, the user can communicate with another user using social media or otherwise.

— A shading/inverse-shading or coloring/tinting control with respect to one or more of the lenses. For example, the user can alter shading/inverse-shading or coloring/tinting until satisfied.

— A zoom or distant focus control. For example, the user can “zoom in” or out, or alter their depth of focus. The user might also use this type of control when playing a video game.

Or as otherwise described herein.

[59] For example, the mobile device can be disposed to recognize one or more features of an ambient environment, such as a measure of luminance, a measure of coloring/tinting, a measure of audio/video complexity or other interference with visual acuity, or as otherwise described herein. When the mobile device detects features of the ambient environment which indicate that an adjustment of shading/inverse-shading or coloring/tinting is called for, the mobile device can signal the eyewear or digital eyewear to make that adjustment.

[60] For another example, the user can direct the mobile device to cause the eyewear or digital eyewear to make such adjustments, in response to the user’s preference in the moment. Thus, rather than requiring the user operating a mobile device to pause operation of that device so as to operate the eyewear or digital eyewear, the user can direct the mobile device to make any adjustments with respect to shading/inverse-shading, coloring/tinting, or other effects as the user might desire.

[61] In another embodiment, the external device can include a vehicle having a set of controls disposed to operate the eyewear or digital eyewear. The vehicle can be real or virtual (such as in an AR/VR environment, or such as in a simulation or video game). For example, the controls can be disposed on a dashboard, on a steering wheel or control yoke, on a detachable control device, or as otherwise described herein. The controls can include one or more of the following:

— A control to adjust shading/inverse-shading, coloring/tinting or color balance, refraction, polarization, prismatic deflection, or other audio/video effects.

— A control to set automatic adjustment of one or more audio/video effects, such as setting a threshold at which one or more such audio/video effects are performed.

— One or more sensors disposed to detect objects and/or proximity at a side of the vehicle.

Or as otherwise described herein.

[62] When in use with a vehicle, the eyewear or digital eyewear can be disposed to receive signals from the vehicle indicating its state, and possibly warnings with respect to its status and/or proximity. For example, when backing up, the vehicle can be disposed to send a signal to all nearby eyewear or digital eyewear, each of which can alert its user of a possible hazard. For example, the user can be shown a flashing screen or a flashing icon, a warning color/tint (e.g., red), or a warning message. Similarly, when operating a vehicle that is backing up, the operator can be warned of any objects in the way or in proximity thereto.

Coloring/tinting and combinations

[63] In another embodiment, the eyewear or digital eyewear can include a combination of the two lenses described above, plus a third lens having an additional or complementary effect. For example, the third lens can be tinted, either with a fixed chemical tint or with an electronically activated tint. This can have the effect that the eyewear can be disposed to provide clarity of vision to the wearer both at close-range and at longer-range distances, while also protecting the wearer’s eyesight or night vision against damage from excessive light (whether ambient light or artificial spotlights) or from glare. In such cases, a chemical or electrochemical tint can be applied to a surface of the first or the second lens, without substantially increasing the thickness of the eyewear. In such cases, the third lens can assist with shading or inverse-shading, particularly with respect to colors that are relatively intense for computer, smartphone, and other device displays. For example, color balance, color filtering, tinting, and related effects can protect the wearer’s eyes against excessive blue or ultraviolet light from mobile phones, particularly when viewed in an otherwise dark environment. This can have the effect of allowing the wearer to read from a display in an otherwise bright ambient environment, without having to increase the brightness of the display to the point of eye pain.

[64] In such cases, in addition to tinting, the third lens can also be disposed to adjust the color balance of the wearer’s field of view, or to filter out undesired frequencies (or to specifically inject desired frequencies). For one example, wearers who are subject to migraines or photophobia can have the color balance of their field of view adjusted to allow for greater brightness without excessive pain, or to provide calming, soothing, or therapeutic colors such as amber or green. For another example, the third lens can provide a separate visual effect, such as a polarization effect (to reduce glare), a prismatic effect (to alter a direction of the wearer’s line-of-sight or field of view), or otherwise as described herein.

[65] In one embodiment, the eyewear or digital eyewear can be responsive to a detected mental state of the user, or a diagnosis of the user’s mental state by medical personnel, so as to provide coloring/tinting, or other color balance effects to assist with treatment and/or amelioration of adverse mental states. For example, users who are subject to migraines can be aided by altering the coloring/tinting and/or the color balance of their field of view so as to include more green light (particularly green light in the 500-560 nm range). For another example, users who are subject to seasonal affective disorder (“SAD”) can be aided by altering the coloring/tinting and/or the color balance of their field of view so as to include more blue light.
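
As a sketch of how such a mapping might be expressed in code, the per-channel gains and condition labels below are assumptions chosen only to illustrate the green and blue emphasis described above, not clinical recommendations.

    # A minimal sketch, assuming per-channel gain control over a tintable lens.
    def color_profile(condition: str) -> dict:
        """Return assumed (R, G, B) gain factors for a coloring/tinting layer."""
        if condition == "migraine":
            # Emphasize green light, roughly the 500-560 nm band noted above.
            return {"red": 0.7, "green": 1.2, "blue": 0.7}
        if condition == "sad":
            # Emphasize blue light for seasonal affective disorder.
            return {"red": 0.8, "green": 0.9, "blue": 1.3}
        return {"red": 1.0, "green": 1.0, "blue": 1.0}  # neutral default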

Adjusting shading/inverse-shading

[66] In an environment in which there is a substantial amount of excessive lighting from one or more sources, it can matter (A) whether any particular light source exceeds an amount of ambient light, and if so, by how much; (B) whether the user is looking in the direction of, or focusing on, any particular light source, and if so, how directly; and (C) whether the object the user is looking at is bright or not, has contrast or not, is reflective or not, or other factors that might have an effect on the user’s eyesight. In such cases, it can be desirable to adjust an amount of shading in response to lighting conditions and in response to the nature of the object at which the user is looking.

[67] For example, one such environment can be when the user is controlling an aircraft. A pilot’s eyes might need to look at instruments within the aircraft, and those instruments might be positioned (A) in shadow, (B) where they reflect sunlight, (C) where they are illuminated by cabin lights, or some combination thereof. A pilot’s eyes might alternatively need to look at objects outside the aircraft, and those objects might be positioned (A) in shadow, such as under cloud cover, (B) where they reflect sunlight, such as when the cloud cover itself is brightly lit, (C) where they are backlit by sunlight, such as when transiting the sun or approaching from sunward, or some combination thereof.

[68] Accordingly, it can be desirable to adjust shading in response to whether the user is looking at an object outside the aircraft or whether the user is looking at an instrument inside the aircraft. The eyewear can be disposed to shade in response to (A) a direction at which the user is looking, (B) a distance at which the user is focusing, such as in response to a dynamic eye tracking system, or (C) whether the user tilts their head or otherwise gestures in response to a change in attitude concurrent with looking inside or outside the aircraft.
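
For illustration only, the following minimal sketch shows one way a shading level might be selected from gaze inputs of this kind; the thresholds, field names, and the log-scale mapping are assumptions, not part of this disclosure.

    # A minimal sketch, assuming a dynamic eye tracking system that reports
    # focal distance and gaze pitch.
    import math

    def cockpit_shading(focal_distance_m: float, gaze_pitch_deg: float,
                        instrument_lux: float, outside_lux: float) -> float:
        """Return a shading level in 0.0 (clear) .. 1.0 (dark)."""
        # Close focus or a downward gaze suggests the pilot is looking at
        # instruments inside the aircraft (assumed heuristic).
        looking_inside = focal_distance_m < 1.5 or gaze_pitch_deg < -20.0
        lux = instrument_lux if looking_inside else outside_lux
        # Map luminance to shading on a simple logarithmic scale.
        return max(0.0, min(1.0, (math.log10(max(lux, 1.0)) - 2.0) / 3.0))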

[69] For another example, the eyewear or digital eyewear can be disposed to obtain information with respect to an ambient environment near the user, so as to determine whether adjusting shading/inverse-shading is necessary or desirable. The eyewear or digital eyewear can be disposed to request and/or receive information from one or more of the following: a compass direction detector, a detector of the user’s head angle (e.g., toward the sky or toward the ground), an elevation detector, a GPS device or other location detector, a time of day or season detector, an ultraviolet measurement device, a weather detector, or as otherwise described herein. The eyewear or digital eyewear can be disposed to integrate such information so as to determine a measurement of surrounding luminance, and where applicable, its primary direction and coloring/tinting.
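
One possible shape for such integration, as a sketch; the sensor fields, the weighting, and the output scale are assumptions, and a fuller model would also use the compass, elevation, and ultraviolet readings.

    # A minimal sketch: fuse several detector readings into a crude estimate
    # of surrounding luminance. Constants are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class AmbientInputs:
        compass_deg: float     # direction the user faces (unused in this sketch)
        head_pitch_deg: float  # toward the sky (+) or the ground (-)
        elevation_m: float     # (unused in this sketch)
        hour_of_day: int       # from a time-of-day detector
        uv_index: float        # (unused in this sketch)
        cloud_cover: float     # 0.0 clear .. 1.0 overcast, from a weather detector

    def estimate_luminance(s: AmbientInputs) -> float:
        """Very rough daylight estimate in lux from fused sensor inputs."""
        daylight = max(0.0, 1.0 - abs(s.hour_of_day - 12) / 8.0)  # peaks at noon
        sky_factor = 1.0 + max(0.0, s.head_pitch_deg) / 90.0      # brighter looking up
        return 100_000.0 * daylight * (1.0 - 0.8 * s.cloud_cover) * sky_factor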

[70] In one embodiment, the eyewear or digital eyewear can be disposed to adjust shading with respect to at least a portion of the user’s field of view (FOV) in response to a sudden rise (or other change) in brightness/luminosity or color balance. For example, the user’s eye might have been subjected to a bright light or a laser. In such cases, the eyewear can be disposed to shade in response to an intensity of the bright light or laser, so as to protect the user’s eyes against damage to eyesight or night vision. In such cases, the eyewear can be disposed to shade in response to a direction of the bright light or laser, so as to maintain as much of the user’s field of view (FOV) as possible, and so as to provide the user with an indicator of where the bright light or laser is coming from. If the user is piloting a vehicle, such as an aircraft, sailboat, or speedboat, the user can use this information to direct the vehicle toward or away from the source of the bright light or laser.
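
A sketch of one possible reaction, assuming a luminance sensor, a bearing estimate for the source, and a lens object with a hypothetical shade_sector interface; the spike ratio and sector width are assumptions.

    # A minimal sketch: shade only a sector around a sudden bright source,
    # preserving the rest of the FOV and indicating the source direction.
    def react_to_flash(prev_lux: float, lux: float,
                       source_azimuth_deg: float, lens) -> None:
        SPIKE_RATIO = 5.0  # assumed trigger: a 5x jump in luminance
        if prev_lux > 0.0 and lux / prev_lux >= SPIKE_RATIO:
            level = min(1.0, lux / 100_000.0)
            # Hypothetical call: darken a 20-degree sector centered on the source.
            lens.shade_sector(center_deg=source_azimuth_deg,
                              width_deg=20.0, level=level)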

Adjusting polarization

[71] In one embodiment, the eyewear or digital eyewear can be disposed to detect polarization of the bright light or laser, and to adjust polarization with respect to at least a portion of the user’s field of view (FOV) in response thereto. This can have the effect that the brightness/luminosity of the bright light or laser can be reduced (when the bright light or laser is polarized). This can also have the effect that the eyewear can protect the user’s eyes against damage to eyesight or night vision, while providing the user with the ability to see through the region of their FOV impacted by the bright light or laser. The eyewear can also be disposed to detect changes in the polarization of the bright light or laser, and to adjust polarization with respect to those changes, so as to maintain protection of the user’s eyes even when the bright light or laser is itself changing.

[72] In one embodiment, the eyewear or digital eyewear can be disposed to adjust polarization when light sources the user desires to view are polarized at a relative angle to the eyewear that causes those light sources to be difficult to see. In particular, the eyewear can be disposed to adjust polarization when the user divides their attention between an ambient environment, such as when operating a vehicle, and close-range devices, such as controls or sensors in that vehicle. Maladjustment between polarization of close-range devices and eyewear can cause the controls or sensors to appear extremely darkened, or even black, reducing their value to the user to nearly zero. The eyewear can be disposed to determine a relative angle between the external devices and the eyewear’s own polarization angle, so as to assure that external devices remain clear to the user even when the user moves their head at different angles or looks at the external devices from differing angles.

[73] In one embodiment, the eyewear or digital eyewear can be disposed to adjust polarization to account for a selected display. When the eyewear or digital eyewear includes an e-polarizer, the e-polarizer can be disposed to adjust polarization in response to a control signal, which can itself be modified to adjust to each particular display that comes within the user’s field of view.

[74] For one example, when the user is watching a broadcast or streaming presentation, such as using a television display, the polarization can be adjusted to account for the particular characteristics of the television display. One possible use might be to deliberately polarize the lenses so as to render the screen blank to the user during commercials; in such cases, the eyewear or digital eyewear can be disposed to retain audio so as to allow the user to determine when the commercial is over, or the audio can be muted and a 30-second timer counted down for the time allotted to the commercial.

[75] For another example, a right-circular or left-circular polarizer can be used to shade images on the display without blacking out the entire display with planar polarization.

Adjusting prismatic deflection

[76] In one embodiment, the eyewear or digital eyewear can be disposed to detect a direction of a bright light or laser, and to adjust a prismatic angular deflection with respect to at least a portion of the user’s field of view (FOV) in response thereto. This can have the effect that the brightness/luminosity of the bright light or laser can be deflected from the user’s eye. This can also have the effect that the eyewear can protect the user’s eyes against damage to eyesight or night vision, while providing the user with the ability to see through other regions of their field of view not impacted by the bright light or laser. The eyewear can also be disposed to detect changes in the direction of the bright light or laser, and to adjust prismatic angular deflection with respect to those changes, so as to maintain protection of the user’s eyes even when the bright light or laser is itself changing.

[77] In alternative related cases, the prismatic angular deflection can be disposed to be responsive to a fixed (ophthalmic) deflection, or responsive to an electronically controlled shift of the user’s angular view. For example, a fixed ophthalmic deflection can be presented so as to adjust for a user’s misaligned eyes.

Adjusting visual effects during blink

[78] In one embodiment, the eyewear or digital eyewear can be disposed to adjust shading with respect to at least a portion of the user’s field of view (FOV) during a time period while the user blinks. Since a blink takes a finite amount of time, the eyewear can adjust an amount of shading while the user is blinking (and the pupil is covered by the eyelid). This can have the effect that the user sees a different amount of shading before the blink and after the blink. The eye integrates the amount of shading into its received image. This can have the effect that the user does not notice the change in the amount of shading.
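
For illustration, a sketch of blink-gated adjustment, assuming an eye tracker exposing an is_blinking predicate and a lens exposing set_shading; both interfaces are assumptions for exposition.

    # A minimal sketch: hold a pending shading change until the next blink,
    # then apply it while the pupil is covered by the eyelid.
    import time

    def apply_during_blink(eye_tracker, lens, target_shading: float,
                           timeout_s: float = 5.0) -> bool:
        """Apply the change during a blink; fall back after a timeout."""
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            if eye_tracker.is_blinking():   # assumed tracker predicate
                lens.set_shading(target_shading)
                return True
            time.sleep(0.005)  # poll faster than a blink (roughly 100-300 ms)
        lens.set_shading(target_shading)    # apply anyway if no blink arrives
        return False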

[79] In one embodiment, the eyewear or digital eyewear can be similarly disposed to adjust other visual effects (such as polarization, refraction, or prismatic deflection) with respect to at least a portion of the user’s field of view (FOV) during a time period while the user blinks. Similar to adjustment of shading during the user’s blink, this can have the effect that the user sees different other visual effects (such as polarization, refraction, or prismatic deflection) before the blink and after the blink, which can be integrated by the eye into its received image, so that the user does not notice the change.

[80] Similarly, the eyewear or digital eyewear can be disposed to adjust one or more visual effects with respect to at least a portion of the user’s field of view during a time period while the user is not looking in that direction. For example, the eyewear or digital eyewear can be disposed to adjust shading/inverse-shading of one or more lenses for a peripheral portion of the user’s field of view while the user is looking in a frontal direction, or for a frontal portion of the user’s field of view while the user is looking in a peripheral direction.

Responding to visual acuity

[81] In one embodiment, the eyewear or digital eyewear can be disposed to determine a measurement of visual acuity available to the user, and to adjust an effect on the user’s field of view in response thereto. For example, the eyewear can measure visual acuity in response to a comparison between (A) a first view available to the user using the eyewear, and (B) a second view available to the user without using the eyewear. The eyewear can include a first camera disposed to capture the field of view available to the user using the eyewear and a second camera disposed to capture the same field of view available to the user, only without using the eyewear. The first camera can be disposed to view through a lens of the eyewear; the second camera can be disposed to view outside any lens of the eyewear. By determining a difference between the first camera and the second camera, the eyewear can determine a measurement of visual acuity available to the user while using the eyewear.
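
As a sketch of one way the comparison might be computed, using the variance of a Laplacian-like response as a stand-in sharpness measure; the metric choice is an assumption, and the images are plain 2-D arrays of gray values.

    # A minimal sketch: compare through-lens and outside-lens sharpness as a
    # proxy for the visual acuity available to the user with the eyewear.
    def sharpness(gray) -> float:
        """Variance of a 4-neighbor Laplacian over a 2-D gray image."""
        h, w = len(gray), len(gray[0])
        vals = [gray[y-1][x] + gray[y+1][x] + gray[y][x-1] + gray[y][x+1]
                - 4 * gray[y][x]
                for y in range(1, h - 1) for x in range(1, w - 1)]
        mean = sum(vals) / len(vals)
        return sum((v - mean) ** 2 for v in vals) / len(vals)

    def acuity_ratio(through_lens_img, outside_img) -> float:
        """Above 1.0 suggests the eyewear improves clarity; below 1.0, degrades."""
        return sharpness(through_lens_img) / sharpness(outside_img)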

[82] In response to the measure of visual acuity, the eyewear can adjust one or more parameters, such as color balance, polarization, shading, or other parameters affecting the user’s field of view. The user’s field of view can depend at least in part on what one or more objects the user is looking at or focusing upon. As described herein, the eyewear can determine what one or more objects the user is looking at or focusing upon in response to a dynamic eye tracking system or other features of the scene available in the user’s field of view.

[83] In one embodiment, the eyewear or digital eyewear can be disposed to present a 3D display on a selected background. For example, the selected background can include a screen, such as a smartphone screen or a screen with respect to another mobile device. For another example, the selected background can include a billboard, a movie theater screen, a theme-park display or other interactive display, an outdoor background, a region of the sky or other natural background, or another region of the wearer’s field of view appropriate for a 3D display.

Signaling from external devices

[84] In one embodiment, the eyewear or digital eyewear can be disposed to provide signals from one or more external devices, such as a smartphone or mobile device, a GPS or other location device, a proximity sensor or other external tracking device, or another device such as described herein.

[85] For example, while operating a vehicle, the eyewear or digital eyewear can be disposed to “flash” (that is, to rapidly shade/inverse-shade so as to present a brief flash), or otherwise signal the user, when the user exceeds a speed limit, when the user approaches a designated exit or turnoff, when the user is within a selected proximity of a law enforcement vehicle, when the user is near (or is causing) a driving or racing hazard, as otherwise designated or selected by the user, or as otherwise described herein. There is no particular requirement that the signal to the user is a “flash”; a relatively slower change in shading/inverse-shading or coloring/tinting, a shaded/inverse-shaded or a colored/tinted marker in a portion of the user’s field of view, or another indicator, would also be workable.
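
One way such signaling might be prioritized, as a sketch; the telemetry fields, margins, and signal names are assumptions.

    # A minimal sketch: map vehicle telemetry to an indicator choice,
    # preferring gentler indicators for less urgent conditions.
    from typing import Optional

    def driving_signal(speed_kmh: float, limit_kmh: float,
                       near_exit: bool) -> Optional[str]:
        if speed_kmh > limit_kmh * 1.1:
            return "flash"           # rapid shade/inverse-shade
        if speed_kmh > limit_kmh:
            return "tint-marker"     # slower, colored/tinted peripheral marker
        if near_exit:
            return "edge-highlight"  # mark the approaching exit or turnoff
        return None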

[86] For example, while traveling with one or more other persons, the eyewear or digital eyewear can be disposed to signal the user when the user exceeds a selected distance from those other persons, exceeds that selected distance for a designated amount of time, otherwise exhibits signs of becoming detached from a group, or otherwise as described herein. Similarly, the eyewear or digital eyewear can be disposed to signal the user when the user approaches within a selected distance of a second user in the first user’s circle of “friends” or other persons related to the first user with respect to a social network, with respect to a similar class schedule, or as otherwise described herein.

[87] For another example, the eyewear or digital eyewear can be disposed to allow the user to send/receive, or respond to, messages such as one or more of the following: advertising, comments on media articles, communication with other users, phone calls or text messages, social media, or as otherwise described herein. In such cases, the user can be allowed to use eye/face gestures (possibly including eyebrow or head gestures), hand/finger gestures (possibly including hand movement toward/away from a sensor such as described in the Incorporated Disclosures), or otherwise as described herein.

Color change by eyewear

[88] In one embodiment, the eyewear or digital eyewear can be disposed to provide color change by the eyewear or digital eyewear. For example, this can include a color change by the frame when the eyewear or digital eyewear includes glasses, a facemask, helmet, or otherwise as described herein. For another example, this can include a color change by a portion of the eyewear or digital eyewear, such as associated with the iris so as to not interfere with the wearer’s vision, when the eyewear or digital eyewear includes a contact lens, or otherwise as described herein. For another example, this can include a size change associated with the eyewear or digital eyewear, such as associated with the pupil so as to not interfere with the wearer’s vision, when the eyewear or digital eyewear includes a contact lens, or otherwise as described herein. Thus, the color change can include a portion of a contact lens that covers the iris or sclera, but not the pupil. For another example, this can include a color change associated with the pupil or lens, so as to alter a color balance of the wearer’s vision, when the eyewear includes a contact lens or implantable lens, or otherwise as described herein.

[89] In one embodiment, the eyewear or digital eyewear can be disposed to provide a color change other than a fixed color or color pattern. For example, the color change might include one or more of the following: (A) a glitter effect, (B) a fluorescent effect, (C) a coloring/tinting effect that changes with time, such as a pattern having a time-dependent element. In such cases, the color change might be responsive to an ambient environment, a situational context, a user input, a determination of a user state (such as whether the user is subject to a medical condition, a mental condition such as migraine or photophobia, or a mood or other emotional condition).

[90] In one embodiment, the eyewear or digital eyewear can be disposed to provide color change by the eyewear or digital eyewear frame (such as including temples and/or a front piece) in response to an audio/video signal, such as described herein with respect to a “music shading” feature. In such cases, the eyewear or digital eyewear can itself change color in synchrony with the audio/video signal. This can have the effect that the eyewear or digital eyewear presents a changing image related to the audio/video signal.

[91] In one embodiment, the eyewear or digital eyewear can include an e-chromatic material disposed inside a substantially clear coating or other substantially clear material, so as to show any coloring/tinting changes to the e-chromatic material externally through the substantially clear material. For example, the eyewear or digital eyewear can include an e-chromatic material sandwiched between layers of substantially clear material.

[92] In one embodiment, the eyewear or digital eyewear can be responsive to one or more magnetic fields, so as to change color when a magnetic field is present. The eyewear or digital eyewear can include a frame or portion thereof, or one or more lenses, including polymer beads (possibly substantially microscopic in size) that are disposed to change color when a magnetic field is present. When the eyewear or digital eyewear is responsive to a magnetic field, it can allow the user to dynamically adjust or alter the eyewear’s color, or a portion thereof, using a magnet or another tool.

[93] In one embodiment, the eyewear or digital eyewear can be responsive to a detected mental state of the user, or a diagnosis of the user’s mental state by medical personnel, so as to change color to identify the user’s mood or other mental state. For example, users who are subject to migraines can cause the eyewear or digital eyewear to identify an (oncoming or current) migraine by altering the color of the eyewear so as to inform medical personnel, emergency responders, or other nearby volunteers, to assist the user. For another example, users who are subject to narcolepsy or other disorders can be aided by altering the color of the eyewear so as to inform medical personnel, emergency responders, or other nearby volunteers, to assist the user.

Other changes by eyewear

[94] In one embodiment, the eyewear or digital eyewear can be responsive to information with respect to the ambient environment to detect whether one or more lenses are likely to be subject to fog, frost, similar effects, or other obstructions to visibility. For example, the eyewear or digital eyewear can include a thermometer, thermocouple, or another temperature detector, so as to determine whether fog, frost, or similar effects are likely. In such cases, the lenses can be coupled to a resistive circuit or other heating element, so as to de-fog, defrost, or otherwise maintain the lenses clear of visual obstruction. For another example, the eyewear or digital eyewear can be responsive to one or more controls by the user, such as eye/face gestures, hand/finger gestures (such as those described in the Incorporated Disclosures), capacitive or touch controls, or other techniques for the user to signal their desire to de-fog or defrost the lenses.

[95] In one embodiment, the eyewear or digital eyewear can be disposed to receive information with respect to the ambient environment, as well as other weather or related information, and attempt to predict whether any one or more lenses are likely to be subject to fog, frost, similar effects, or other obstructions to visibility. When obstructions to visibility are predicted, the eyewear or digital eyewear can be disposed to proactively treat one or more of the lenses so as to prevent obstructions to visibility.
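
The prediction step can be grounded in a standard dew-point approximation. The sketch below uses the Magnus formula; the heating-element decision margin is an assumption.

    # A minimal sketch: fog is likely when the lens surface is at or below
    # the dew point of the ambient air (Magnus approximation, roughly valid
    # for 0-60 degrees C). The 1-degree margin is an assumption.
    import math

    def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
        """Magnus-formula dew point approximation."""
        a, b = 17.27, 237.7
        gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
        return b * gamma / (a - gamma)

    def defog_needed(lens_temp_c: float, ambient_c: float,
                     rel_humidity_pct: float, margin_c: float = 1.0) -> bool:
        """Enable the heating element when fogging is predicted."""
        return lens_temp_c <= dew_point_c(ambient_c, rel_humidity_pct) + margin_c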

[96] In one embodiment, when the user is operating a vehicle, particularly a vehicle having a windshield or other clear surface through which the user is expected to see an operating environment, the eyewear or digital eyewear can be disposed to perform its de-fog, defrost, or similar operation with respect to the windshield as well as with respect to the lenses. This can have the effect that the user can more easily operate the vehicle, such as when the vehicle includes one or more of the following: an airplane, glider, helicopter, or other aircraft; an automobile, motorcycle or dirt bike, race car, truck, or other ground vehicle; a motorboat or other water vehicle; or another vehicle as described herein.

Couplable circuit elements and temples

[97] In one embodiment, the eyewear or digital eyewear can include a set of circuit elements disposed in distinct separable portions. The distinct separable portions can be couplable using modular connectors, capable of both coupling the separable portions while the eyewear is in operation and coupling one or more of those portions to external devices while the eyewear is otherwise dormant. For example, the separable portions can include a set of elements selected by or personalized to the user. Alternatively, the separable portions can be selected by others (such as an optometrist, ophthalmologist, or other medical personnel).

[98] For example, the eyewear or digital eyewear can include a front piece supporting one or more lenses disposed for viewing, the front piece being supported by a first temple having a battery or other energy storage, and a second temple having a computing device or other data storage. The temples can be coupled to the front piece using detachable hinges, each of which can include a magnetic hinge and a circuit coupling, so as to allow the battery to couple to the computing device using a circuit coupling through the front piece.

[99] In one embodiment, when the temples include a magnetic hinge, the hinge can be disposed to hold the temples in place, such as when open and worn by the user. For example, each temple can be held in place by the magnetic hinge and with digital circuitry, with the effect that the temple is mechanically, magnetically, and electronically coupled to the front piece. The front piece can couple the two temples electronically and digitally, such as with an electric and digital coupling between the hinge for one temple and the hinge for the other temple.

[100] In one embodiment, the coupling between each temple and the front piece can include both a magnetic coupling (such as using an electromagnet) and a digital coupling. For example, the magnetic coupling can be disposed separately from the digital coupling; the digital coupling can include one or more pins, each coupled to a circuit element, that couple when the magnetic coupling is closed. For another example, the magnetic coupling and/or the digital coupling can be disposed so as to couple to a carrier, such as a charger/recharger or communication/storage element as described herein. When the eyewear or digital eyewear includes temples couplable using a hinge, the temples can be disposed to couple to one or more external devices, such as when the eyewear or digital eyewear is not being worn by the user. For example, the eyewear or digital eyewear can be disposed to be opened, so that each temple can couple to an external device suitable to interface with that temple.

[101] When the magnetic coupling is attached to another element, such as another portion of the eyewear or digital eyewear, or such as a carrier, one or more such pins can be disposed to couple analog/digital signals, one or more such pins can be disposed to couple electrical power/ground, one or more pins can be disposed to operate in combination with other circuits as receiver/transmitters, one or more pins can be disposed to operate to transmit optical signals within the eyewear or digital eyewear, or as otherwise described herein.

[102] For example, a temple having a battery or other energy storage can be coupled to a charger/recharger, so as to restore a relatively full charge to the battery after use. The charger/recharger can be coupled to a charging dock, charging outlet, charging station, building outlet, or other device for relatively rapid charging/recharging of the battery. When fully charged, the eyewear or digital eyewear can present an indicator thereof, such as providing fully dark lenses, cycling between/among a sequence of colors (such as green, yellow, and red), flashing while coupled to a charging station, or otherwise as described herein.

[103] For another example, a temple having a computing device or other storage can be disposed to couple to a communication element or to another computing device or storage device, so as to exchange data between the eyewear or digital eyewear and an external device. The external device can include a storage element, a coupling or outlet for a storage element (such as a USB port couplable to a hand-held SSD drive, or a similar device), a communication element (such as a wired or wireless transmitter, or such as a Bluetooth™ or Wi-Fi transmitter coupled to a local area network or to an internet router), and/or a processing element (such as a device suitable to review/revise data maintained in the temple).

[104] In one embodiment, when worn, the eyewear or digital eyewear can indicate a low-battery or other lack of charging condition by presenting an audio/video indicator to a user. For example, the eyewear or digital eyewear can flash or cycle between/among a sequence of colors to so inform the user. The eyewear or digital eyewear can flash a greater number of times, or otherwise indicate a greater urgency, as the low-battery or other lack of charging condition becomes more serious. In such cases, the eyewear or digital eyewear can fail-over to a benign state, such as to a set of eyewear with clear lenses. Similarly, when the eyewear or digital eyewear is very low on power, it can fail-over softly to a benign state, maintaining its power reserve for urgent uses.

Hybrid personalization

[105] In one embodiment, the eyewear or digital eyewear can be disposed to provide hybrid personalization of corrections or enhancement of the user’s vision. For example, the hybrid personalization can include one or more alternative corrections or enhancements of the user’s vision, in combination or conjunction with techniques for focusing the user’s gaze through portions of the eyewear that provide those alternative corrections or enhancements in appropriate circumstances. For example, a region of the eyewear that provides close-range correction or enhancement of the user’s vision can be combined with one or more techniques for directing the user’s gaze through that portion of the eyewear when the user is focusing on one or more objects at that range.

[106] For example, the eyewear or digital eyewear can include a computing device performing an artificial intelligence or machine learning technique, coupled to an outward-looking camera and a dynamic eye-tracking mechanism. In response thereto, the artificial intelligence or machine learning technique can “learn” (that is, can adapt to or otherwise determine a set of parameters with respect to) those objects or scenes which the user favors looking at, and can tune its adjustment of the user’s field of view to enhance the visibility of those objects or scenes the user favors.
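
As a stand-in for the learning technique described above, the sketch below keeps a running tally of gaze dwell time per recognized object class and derives an enhancement weight from it; the interfaces and the dwell-time heuristic are assumptions, not the disclosed method itself.

    # A minimal sketch of preference learning from gaze dwell time.
    from collections import defaultdict

    class GazePreferenceModel:
        def __init__(self) -> None:
            self.dwell_s = defaultdict(float)

        def observe(self, object_label: str, dwell_seconds: float) -> None:
            """Accumulate gaze dwell time per recognized object class."""
            self.dwell_s[object_label] += dwell_seconds

        def enhancement_weight(self, object_label: str) -> float:
            """Weight (0..1) favoring the classes the user looks at most."""
            total = sum(self.dwell_s.values())
            return self.dwell_s[object_label] / total if total else 0.0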

[107] After reading this Application, those skilled in the art would recognize that other and further combinations or extensions of the described devices and methods, or other possibilities suggested thereby, would be workable without further invention or undue experiment, and are within the scope and spirit of the described invention.

Brief Description of the Figures

[108] In the figures, like references generally indicate similar elements, although this is not strictly required.

[109] Fig. 1 (collectively including Figures 1A-1B) shows a conceptual drawing of example eyewear including wearable glasses, such as providing active correction or enhancement. Figure 1A shows a conceptual drawing of example glasses having multiple active regions related to wearer view. Figure 1B shows a conceptual drawing of example glasses having multiple active pixels related to individual wearer view.

[110] Fig. 2 shows a conceptual drawing of example eyewear including a retinal image display.

[111] Fig. 3 (collectively including Figures 3A-3B) shows a conceptual drawing of example eyewear including contact lenses or intra-ocular lenses. Figure 3A shows a conceptual drawing of example contact lenses having multiple active regions related to wearer view. Figure 3B shows a conceptual drawing of example contact lenses having multiple individual pixels related to wearer view.

[112] Fig. 4 (collectively including Figures 4A-4D) shows a conceptual drawing of example eyewear including a facemask, helmet, goggles, or visor. Figure 4A shows a conceptual drawing of an example facemask or helmet having multiple active regions related to wearer view. Figure 4B shows a conceptual drawing of an example facemask or helmet having multiple individual pixels related to wearer view. Figure 4C shows a conceptual drawing of example goggles or a visor having multiple active regions related to wearer view. Figure 4D shows a conceptual drawing of example goggles or a visor having multiple individual pixels related to wearer view.

[113] Fig. 5 shows a conceptual drawing of example eyewear including one or more scopes or sights, including binoculars, microscopes, rifle scopes, spotting scopes, telescopes, analog or digital cameras, rangefinders, or otherwise as described herein.

[114] Fig. 6 shows a conceptual drawing of example eyewear including one or more nerve sensors or stimulators.

[115] Fig. 7 (collectively including Figures 7A-7B) shows a conceptual drawing of eyewear used with an example display. Figure 7A shows a conceptual drawing of the example display disposed on or in a building or structure. Figure 7B shows a conceptual drawing of the example display disposed in a vehicle.

[116] Fig. 8 shows a conceptual drawing of an example eyewear used to provide hybrid personalization.

[117] Fig. 9 shows a conceptual drawing of an example eyewear used to provide dynamic adjustment of polarization.

[118] Fig. 10 shows a conceptual drawing of an example eyewear used to provide dynamic adjustment of magnification.

[119] Fig. 11 shows a conceptual drawing of an example eyewear used to provide dynamic adjustment with respect to reflection and partial reflection.

[120] Fig. 12 shows a conceptual drawing of an example eyewear used to provide dynamic adjustment with respect to three-dimensional (3D) viewing of a display.

[121] Fig. 13 (collectively including Figures 13A-13B) shows a conceptual drawing of eyewear used to provide dynamic lighting in a direction being viewed by a wearer. Figure 13A shows a conceptual drawing of eyewear being used to provide light where the user is looking. Figure 13B shows a conceptual drawing of eyewear being used to control one or more devices to highlight one or more displays, in response to where the user is looking.

[122] Fig. 14 (collectively including Figures 14A-14B) shows a conceptual drawing of eyewear including a peripheral vision lens. Figure 14A shows a side view of eyewear including a peripheral vision lens. Figure 14B shows a top view of eyewear including a peripheral vision lens.

[123] Fig. 15 shows a conceptual drawing of eyewear capable of performing music and entertainment shading.

[124] Fig. 16 (collectively including Figures 16A-D) shows a conceptual drawing of eyewear capable of controlling external devices or being controlled by external devices.

[125] Fig. 17 shows a conceptual drawing of eyewear capable of including a hand/finger gesture sensor.

[126] Fig. 18 (collectively including Figures 18A-B) shows a conceptual drawing of eyewear capable of including couplable circuit elements and temples, and capable of being coupled to an external device.

[127] Fig. 19 shows a conceptual drawing of eyewear capable of including magnetic clip-on couplable circuit elements and lenses.

[128] Fig. 20 shows a conceptual drawing of eyewear capable of including one or more multilayer lenses.

[129] Fig. 21 shows a conceptual drawing of eyewear capable of highlighting using polarization.

[130] After reading this Application, those skilled in the art would recognize that the figures are not necessarily drawn to scale for construction, nor do they necessarily specify any particular location or order of construction.

Detailed Description

GENERAL DISCUSSION

[131] In one embodiment, the eyewear (or digital eyewear) can be responsive to one or more of: sensory parameters, wearer parameters, environmental parameters, or otherwise as described herein. For example, sensory parameters can include the wearer’s gaze direction or focal length; eye gestures or multiple eye gestures by the wearer; other eye activity by the wearer, such as pupil or iris size, blink rate, squinting, eye twitching or nystagmus, saccades; or other senses such as hearing, smell, or touch (possibly including the wearer triggering a control on the eyewear, conducting a hand or other body gesture, or otherwise as described herein). Wearer parameters can include medical conditions, such as whether the wearer is subject to allergies, “dry eyes” and related conditions, migraines/photophobia or related conditions, sleep deprivation, epilepsy or other seizure concerns, being under the influence of alcohol or other substances, or otherwise as described herein; the wearer’s eye activity, or changes thereof; the wearer’s location or distance from a selected object, or changes thereof; or otherwise as described herein. Environmental parameters can include features of the wearer’s field of view, such as luminance, color prominence, glare, visual blur or noise, or otherwise as described herein; presence of particular objects or people in view, such as persons known to the wearer, or such as weapons (guns, knives, or otherwise as described herein); or features of the ambient environment, such as a relationship between the wearer and scene or object being viewed, such as whether the wearer is in motion with respect thereto, or otherwise as described herein.

[132] In one embodiment, the eyewear or digital eyewear can be responsive to wearer activity. Wearer activity can include one or more of: an activity being conducted by the wearer, such as whether the wearer is engaged in police, military, firefighter, emergency responder, search and rescue activity, or otherwise as described herein; whether the wearer is engaged in operating a vehicle, such as a racing car, a speed boat, an aircraft, another type of vehicle, or otherwise as described herein; whether the wearer is engaged in observing a sporting activity or other event, such as a baseball or football game, a live-action or recorded concert, a movie or other presentation, a theme-park event or other interactive experience, an advertisement or storefront, an augmented reality (AR) or virtual reality (VR) event or other three-dimensional (3D) experience, or otherwise as described herein; whether the wearer is reading, conversing with another person, viewing a target at a distance, viewing a panorama, or otherwise as described herein; or other possible wearer activities.

[133] In one embodiment, the type of eyewear or digital eyewear can be particular to a use being made by the wearer. For example, wearable eyewear can include glasses, contact lenses, a retinal image display (RID), an intra-ocular lens (IOL), or otherwise as described herein. For another example, wearable eyewear can include a helmet, such as might be disposed for use by police, military, firefighter, emergency responder, search and rescue activity, or other personnel. For another example, eyewear can include augmented eyewear, such as a microscope or telescope, a rifle scope or other scope, binoculars, a still or motion-picture camera, “night vision” glasses or other infrared detectors, or otherwise as described herein. For another example, eyewear can include nerve sensors or stimulators, such as optic nerve sensors or stimulators, optical cortex or other brain element sensors or stimulators, or otherwise as described herein. For another example, the eyewear can be used in combination or conjunction with other devices, such as smartphones, smart watches, or other wearable or implantable devices; concert screens or other displays; AR presentations; cameras, scopes, and related devices; wireless or other electromagnetic signals; medical devices; or otherwise as described herein.

[134] In one embodiment, the eyewear or digital eyewear can be responsive to a wearer’s field of view (FOV), or a portion of the wearer’s FOV, such as whether the wearer’s FOV is long-range or short-range, higher or lower, right or left, central or peripheral vision, or otherwise as described herein.

[135] In one embodiment, the eyewear or digital eyewear can adjust visual features presented to the wearer, such as using changes in refraction; changes in polarization or shading; changes in color filtering, color injection, false coloring, color change by the eyewear, or otherwise as described herein; changes in prismatic angles or functions; changes in presentation of 3D displays; or otherwise as described herein.

[136] In one embodiment, the eyewear or digital eyewear can include multiple lenses to provide hybrid personalization. A first lens can provide a first adjustment of visual features presented to the wearer, such as correction or enhancement of the wearer’s vision, while a second lens can provide a second adjustment of visual features presented to the wearer, possibly electronically induced, such as changes in refraction, changes in shading/inverse-shading, chromatic alteration (or other changes in color, color balance, color gamut, or false coloring, or otherwise as described herein), changes in polarization, changes in prismatic angles or functions; changes in presentation of 3D displays; or otherwise as described herein.

[137] In one embodiment, the eyewear or digital eyewear can include multiple lenses with combined operations to provide further personalization. For example, a first lens can include a first region in which it provides a first amount of vision correction (refraction), such as for close-range vision, and a second region in which it provides a second amount of vision correction, such as for longer-range vision. In such cases, a second lens can include a first region aligned with the first lens’ first region, in which it provides a first variable amount of shading, and a second region aligned with the first lens’ second region, in which it provides a second variable amount of shading. When the user has been looking through a close-range vision portion of the combined lens for too long, the eyewear can use shading (such as by polarizing that portion of the lens) to darken that portion of the lens and encourage the user to look elsewhere. Thus, the first lens can provide vision correction, while the second lens provides shading to encourage particular gaze directions (or to discourage particular gaze directions). This can have the effect that the user is encouraged not to stare at close objects for too long, and to look away periodically (or otherwise from time to time) at more distant objects. This can have the effect that the user is encouraged to avoid eyestrain. The first lens and the second lens can have additional regions, such as a close-range region, a mid-range region, and a long-range region. The first lens and the second lens can be responsive to other features of the user’s field of view, such as an amount of brightness, a color balance, an amount of concentration on small objects, or other factors that might affect the user’s eyesight, prompt headache, or prompt other medical issues.
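
For illustration, a sketch of the shading policy for the close-range region; the distance threshold, time limit, and ramp are assumptions (the time limit loosely echoes the common “20-20-20” eye-care habit and is not a value taken from this disclosure).

    # A minimal sketch: darken the close-range region after prolonged close
    # focus, encouraging the user to look away at more distant objects.
    CLOSE_RANGE_M = 0.6          # assumed close-focus distance threshold
    MAX_CLOSE_SECONDS = 20 * 60  # assumed limit before nudging the user

    def close_gaze_shading(focal_distance_m: float,
                           seconds_at_close_range: float) -> float:
        """Shading (0..1) for the close-range region of the second lens."""
        if focal_distance_m > CLOSE_RANGE_M:
            return 0.0  # not focusing close: leave the region clear
        over = seconds_at_close_range - MAX_CLOSE_SECONDS
        return 0.0 if over <= 0 else min(1.0, over / 60.0)  # ramp over a minute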

[138] In one embodiment, the eyewear or digital eyewear can be disposed to adjust shading with respect to an object or a portion of the user’s field of view (FOV) at which the user is looking. In such cases, when the user is looking in a particular direction, the eyewear can be disposed to shade only portions of the user’s FOV in that direction. Similarly, in such cases, when the user is looking at a particular object, such as when looking in a particular direction and at a particular depth of focus so as to distinguish a selected object, the eyewear can be disposed to shade only that selected object. An outbound camera, such as a camera mounted behind one or more of the lenses and disposed to view a location or region at which the user is looking, can be disposed to determine an amount of shading that optimizes the user’s view, or to determine an amount of shading that optimizes a clarity of the location or region at which the user is looking.

[139] In one embodiment, the eyewear or digital eyewear can be disposed to detect where the user is looking in response to one or more of: a dynamic eye tracking system, one or more “outbound” cameras disposed to review the user’s field of view (FOV) from inside one or more lenses. For example, the dynamic eye tracking system can be disposed to determine in what direction, and at what depth of focus, the user is looking. This can have the effect that the dynamic eye tracking system can determine a location in three-dimensional (3D) space at which the user is looking. For another example, the outbound camera can be disposed to examine the user’s FOV from inside one or more of the lenses. Either of these techniques can have the effect that when the user moves their head or otherwise alters their FOV, the eyewear can adjust the 3D location that is shaded. More precisely, the eyewear can adjust a location on each lens so that the joint focus of the user’s eyes at that 3D location is shaded.
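
The underlying geometry can be sketched as follows: recover the 3D fixation point as the closest point between the two gaze rays, then intersect each eye-to-point ray with its lens plane to locate the spot to shade. The coordinate frame and the flat-lens plane model are assumptions for exposition.

    # A minimal sketch with (x, y, z) tuples; the lens is modeled as the
    # plane z = lens_z in the head frame (an assumption).
    def _sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def _add(a, b): return tuple(x + y for x, y in zip(a, b))
    def _mul(a, s): return tuple(x * s for x in a)
    def _dot(a, b): return sum(x * y for x, y in zip(a, b))

    def fixation_point(eye_l, dir_l, eye_r, dir_r):
        """Midpoint of the shortest segment between the two gaze rays."""
        w0 = _sub(eye_l, eye_r)
        a, b, c = _dot(dir_l, dir_l), _dot(dir_l, dir_r), _dot(dir_r, dir_r)
        d, e = _dot(dir_l, w0), _dot(dir_r, w0)
        denom = a * c - b * b or 1e-9  # guard against near-parallel gaze
        s = (b * e - c * d) / denom
        t = (a * e - b * d) / denom
        p_l = _add(eye_l, _mul(dir_l, s))
        p_r = _add(eye_r, _mul(dir_r, t))
        return _mul(_add(p_l, p_r), 0.5)

    def lens_spot(eye, point, lens_z):
        """Intersect the eye-to-fixation ray with the lens plane z = lens_z."""
        d = _sub(point, eye)
        t = (lens_z - eye[2]) / d[2]  # assumes the gaze crosses the lens plane
        return _add(eye, _mul(d, t))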

[140] This can have the effect that the eyewear or digital eyewear shades “where the user is looking”. When the user adjusts the direction they are looking, adjusts the depth of field at which they are looking, tilts their head, squints, or otherwise moves due to an external force, the eyewear can shade where the user looks, and if so desired, only where the user looks. For example, the user might be in a vehicle, such as an aircraft, racecar, sailboat, or speedboat, and might be looking at a dashboard or instrument, or at an external object. The eyewear can shade where the user is looking, notwithstanding the user’s head or eye movement, the vehicle’s movement, or other movement that might affect where the user is looking.

[141] In one embodiment, it might occur that the environment has a substantial amount of excessive lighting from one or more sources. In such cases, it can matter (A) whether any particular light source exceeds an amount of ambient light, and if so, by how much; (B) whether the user is looking in the direction of, or focusing on, any particular light source, and if so, how directly; and (C) whether the object the user is looking at is bright or not, has contrast or not, is reflective or not, or other factors that might have an effect on the user’s eyesight. In such cases, it can be desirable to adjust an amount of shading in response to lighting conditions and in response to the nature of the object at which the user is looking.

[142] For example, one such environment can be when the user is controlling an aircraft. In such cases, a pilot’s eyes might need to look at instruments within the aircraft, and those instruments might be positioned (A) in shadow, (B) where they reflect sunlight, (C) where they are illuminated by cabin lights, or some combination thereof. In alternative such cases, a pilot might need to look at objects outside the aircraft, and those objects might be positioned (A) in shadow, such as under cloud cover, (B) where they reflect sunlight, such as when the cloud cover itself is brightly lit, (C) where they are backlit by sunlight, such as when transiting the sun or approaching from sunward, or some combination thereof.

[143] In such cases, the eyewear or digital eyewear can be disposed to adjust shading in response to whether the user is looking at an object outside the aircraft or whether the user is looking at an instrument inside the aircraft. The eyewear can be disposed to shade in response to (A) a direction at which the user is looking, (B) a distance at which the user is focusing, such as in response to a dynamic eye tracking system, or (C) whether the user tilts their head or otherwise gestures (such as using an eye/face gesture, a head gesture, a hand/finger gesture, or another gesture) in response to a change in one or more of airspeed, altitude, or attitude (or otherwise as described herein) concurrent with looking inside or outside the aircraft (or transitioning between the two).

[144] In one embodiment, the eyewear or digital eyewear can be disposed to adjust shading with respect to at least a portion of the user’s field of view (FOV) during a time period while the user blinks. Since a blink takes a finite amount of time, the eyewear can adjust an amount of shading while the user is blinking (and the pupil is covered by the eyelid). This can have the effect that the user sees a different amount of shading before the blink and after the blink. The eye integrates the amount of shading into its received image. This can have the effect that the user does not notice the change in the amount of shading.

[145] In one embodiment, the eyewear or digital eyewear can be similarly disposed to adjust other visual effects (such as polarization or refraction) with respect to at least a portion of the user’s field of view (FOV) during a time period while the user blinks. Similar to adjustment of shading during the user’s blink, this can have the effect that the user sees different other visual effects (such as polarization or refraction) before the blink and after the blink, which can be integrated by the eye into its received image, so that the user does not notice the change.

[146] As described above, in one embodiment, the eyewear or digital eyewear can be disposed to adjust shading with respect to at least a portion of the user’s field of view (FOV) during a time period while the user blinks; since the eye integrates the amount of shading into its received image, the user does not notice the change.

[147] In one embodiment, the eyewear or digital eyewear can be disposed to adjust shading with respect to at least a portion of the user’s field of view (FOV) in response to a sudden rise (or other change) in brightness/luminosity or color balance. For example, the user’s eye might have been subjected to a bright light or a laser. In such cases, the eyewear can be disposed to shade in response to an intensity of the bright light or laser, so as to protect the user’s eyes against damage to eyesight or night vision. In such cases, the eyewear can be disposed to shade in response to a direction of the bright light or laser, so as to maintain as much of the user’s field of view (FOV) as possible, and so as to provide the user with an indicator of where the bright light or laser is coming from. If the user is piloting a vehicle, such as an aircraft, sailboat, or speedboat, the user can use this information to direct the vehicle toward or away from the source of the bright light or laser.

[148] In one embodiment, the eyewear or digital eyewear can be disposed to detect polarization of the bright light or laser, and to adjust polarization with respect to at least a portion of the user’s field of view (FOV) in response thereto. This can have the effect that the brightness/luminosity of the bright light or laser can be reduced (when the bright light or laser is polarized). This can also have the effect that the eyewear can protect the user’s eyes against damage to eyesight or night vision, while providing the user with the ability to see through the region of their FOV impacted by the bright light or laser. The eyewear can also be disposed to detect changes in the polarization of the bright light or laser, and to adjust polarization with respect to those changes, so as to maintain protection of the user’s eyes even when the bright light or laser is itself changing.

[149] In one embodiment, the eyewear or digital eyewear can include an electrically controlled polarizer disposed to alter an angle of polarization in real time. For example, the polarizer can be adjusted in real time in response to changes in a relative angle between the wearer’s eye and a direction of infalling glare. When light is reflected from a surface, it can become polarized in a plane. This can have the effect that a planar polarizer can be adjusted so as to reduce or eliminate the amount of light allowed through the polarizer to the wearer’s eye. In such cases, the electrically controlled polarizer can alter the plane of polarization in response to a sensor for determining an angle at which the glare is viewed. The sensor can include a gyroscope or a magnetometer, or another device suitable to determine a relative orientation of the eyewear with respect to the infalling glare. Alternatively, the sensor can be disposed inside the eyewear and include a light sensor, an infrared (IR) sensor, a camera, or another device suitable to determine an amount of infalling glare.
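
The physics here is Malus’s law: a planar polarizer passes a fraction cos²(θ) of glare polarized at relative angle θ, so crossing the polarizer against the detected glare angle minimizes transmission. A sketch follows, with the sensor-supplied glare angle as an assumed input.

    # A minimal sketch of polarizer control via Malus's law.
    import math

    def transmitted_fraction(polarizer_deg: float, glare_deg: float) -> float:
        """Fraction of polarized glare passed: cos^2 of the relative angle."""
        theta = math.radians(polarizer_deg - glare_deg)
        return math.cos(theta) ** 2

    def best_polarizer_angle(glare_deg: float) -> float:
        """Cross the polarizer against the glare plane to extinguish it."""
        return (glare_deg + 90.0) % 180.0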

[150] In one embodiment, the eyewear or digital eyewear can include an electrically controlled magnifier disposed to alter an amount of magnification, such as in real time. For example, the magnifier can be adjusted, such as in real time, in response to eye gaze direction by the wearer’s eye, in response to eye gestures or other inputs by the wearer, or in response to object recognition by the eyewear. When the wearer looks at a particular object, their gaze direction and focal length can be determined and can identify a particular object. The eyewear can also identify the object using object recognition. Alternatively, when the wearer provides an eye gesture or other input, the wearer can designate a particular object and direct the eyewear to identify that object. In such cases, the eyewear can determine, such as in response to the wearer’s input, an amount of magnification desired by the wearer and can adjust an amount of magnification of that object provided by the eyewear.
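
A sketch of one way the magnification amount might be chosen once an object is identified; the comfortable apparent size and the zoom limit are assumptions.

    # A minimal sketch: scale magnification so the designated object fills an
    # assumed comfortable fraction of the field of view.
    def magnification_for(object_angular_deg: float,
                          desired_angular_deg: float = 10.0,
                          max_zoom: float = 8.0) -> float:
        """Zoom factor bringing an object to the desired apparent size."""
        if object_angular_deg <= 0.0:
            return 1.0
        return max(1.0, min(max_zoom, desired_angular_deg / object_angular_deg))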

[151] In one embodiment, the eyewear or digital eyewear can provide dynamic adjustment with respect to three-dimensional (3D) viewing of a display. For example, the display can include a smartphone or mobile device display, a phablet or tablet display, a computer display, a wearable or implantable device display, a gaming device display, a video display, or otherwise as described herein. In such cases, the eyewear can determine when the wearer is looking at, or otherwise directing their gaze toward, the display, and can determine whether the display is disposed to provide a 3D presentation. When the display is disposed to provide a 3D presentation and the wearer alters their gaze from/to the display, the eyewear can turn on/off a 3D presentation in response thereto. For example, when the eyewear is disposed to provide a 3D presentation at the display and the wearer moves their gaze from the display, the eyewear can turn off its 3D presentation and allow the wearer to see their normal field of view (FOV) without any 3D adjustment. When the wearer moves their gaze to the display, the eyewear can turn on its 3D presentation and allow the wearer to see the display using 3D viewing.
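
As a sketch, the on/off decision reduces to a containment test of the gaze point against the display’s region in the FOV; the bounds representation and the renderer interface are assumptions.

    # A minimal sketch: enable the 3D presentation only while the wearer's
    # gaze falls within the display's bounds.
    def update_3d_mode(gaze_xy, display_bounds, renderer) -> None:
        x0, y0, x1, y1 = display_bounds
        on_display = x0 <= gaze_xy[0] <= x1 and y0 <= gaze_xy[1] <= y1
        if on_display and not renderer.is_3d_enabled():    # assumed interface
            renderer.enable_3d()
        elif not on_display and renderer.is_3d_enabled():
            renderer.disable_3d()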

[152] In one embodiment, the eyewear or digital eyewear can be disposed to provide color change by the eyewear. For example, this can include a color change by the frame when the eyewear includes glasses, a facemask, helmet, or otherwise as described herein. For another example, this can include a color change by a portion of the eyewear, such as associated with the iris so as to not interfere with the wearer’s vision, when the eyewear includes a contact lens, or otherwise as described herein. For another example, this can include a size change associated with the eyewear, such as associated with the pupil so as to not interfere with the wearer’s vision, when the eyewear includes a contact lens, or otherwise as described herein. Thus, the color change can include a portion of a contact lens that covers the iris or sclera, but not the pupil. For another example, this can include a color change associated with the pupil or lens, so as to alter a color balance of the wearer’s vision, when the eyewear includes a contact lens or implantable lens, or otherwise as described herein.

[153] In addition to color change, the eyewear or digital eyewear can be disposed to provide a color texture. The color texture can include a combination of multiple colors, such as a color gradient, a color pattern, a picture, or another technique in which more than one color is disposed on the eyewear frame or on a contact lens. The color texture can be disposed over the entire eyewear, such as a gradient, pattern, or picture that is disposed over the entire frame or the whole contact lens. Alternatively, the color texture can be disposed with respect to portions of the eyewear, such as a color texture that is only applied to portions touching the lenses, or only applied to portions at the edges of contact lenses.

[154] The color change can also be disposed to vary over time. The color change can include a continuous change, such as a color texture that cycles from a first to a second color and back again. The color change can include a random element, such as a color texture that changes the color of portions of the eyewear randomly or pseudo-randomly, or randomly or pseudo-randomly and in response to objects in the user’s field of view (or otherwise subject to user parameters, such as the user’s skin temperature). The color texture can even be disposed to present a moving picture, such as on the side of the frame or on the iris portion of a contact lens.

[155] In one embodiment, the eyewear or digital eyewear can combine two or more such functions, such as in response to an input from the wearer designating that those functions should be combined, or such as in response to the eyewear recognizing a circumstance in which the wearer typically requests that those functions should be combined. For example, the wearer can designate that those functions should be combined using an eye gesture or other input. For another example, the eyewear can recognize a circumstance in which the wearer typically requests that those functions should be combined in response to a machine learning technique, such as a statistical response to sensory parameters, wearer parameters, environmental parameters, or otherwise as described herein. In such cases, the sensory parameters or wearer parameters can include information with respect to the wearer’s medical or other status; the environmental parameters can include information with respect to the scene in the wearer’s field of view (FOV). The eyewear can also be responsive to other information, or to a combination of factors, such as the eyewear being more/less sensitive to selected parameters (or to particular wearer inputs) when sensory parameters or wearer parameters indicate particular medical or other status, or otherwise as described herein.

TERMS AND PHRASES

[156] The following terms and phrases are exemplary only, and not limiting.

[157] The phrases “this application”, “this description”, and variants thereof, generally refer to any material shown or suggested by any portions of this Application, individually or collectively, and including all inferences that might be drawn by anyone skilled in the art after reviewing this Application, even if that material would not have been apparent without reviewing this Application at the time it was filed.

[158] The terms “earpiece” and “temple”, and variants thereof, generally refer to a portion of an eyewear in the form of glasses, used to hold a front piece over a wearer’s ears. Typically, each eyewear has a right earpiece/temple and a left earpiece/temple, disposed to hold the front piece over the wearer’s ears. The “front piece” typically has elements disposed to hold one or more lenses in front of the wearer’s eyes, and typically includes a nosepiece disposed to hold the front piece over the wearer’s nose.

[159] The term “eyewear”, the phrase “digital eyewear”, and variants thereof, generally refers to any device coupled to a wearer’s (or other user’s) input senses, including without limitation: glasses (such as those including lens frames and lenses), contact lenses (such as so-called “hard” and “soft” contact lenses applied to the surface of the eye, as well as lenses implanted in the eye), retinal image displays (RID), laser and other external lighting images, “heads-up” displays (HUD), holographic displays, electro-optical stimulation, artificial vision induced using other senses, transfer of brain signals or other neural signals, headphones and other auditory stimulation, bone conductive stimulation, wearable and implantable devices, and other devices disposed to influence (or be influenced by) the wearer. For example, the eyewear or digital eyewear can be wearable by the user, either directly as eyeglasses or as part of one or more clothing items, or implantable in the user, either above or below the skin, in or on the eyes (such as contact lenses), or otherwise as described herein. The eyewear or digital eyewear can include one or more devices operating in concert, or otherwise operating with other devices that are themselves not part of the eyewear or digital eyewear.

[160] The phrases “coloring”, “color balance”, the term “tinting”, and variants thereof, generally refer to any technique by which a set of one or more frequencies or frequency ranges can be selected for emphasis or deemphasis by digital eyewear, including one or more of: (A) adding or injecting light of one or more frequencies or frequency ranges to the user’s eye or to one or more lenses for receipt by the user’s eye; (B) illuminating digital eyewear or the user’s eye so as to improve the user’s ability to see in one or more frequencies or frequency ranges; (C) filtering or removing light of one or more frequencies or frequency ranges from infalling light, so as to prevent light of those frequencies or frequency ranges from reaching the user’s eye; or otherwise as described herein. In general, coloring/tinting can have the property that the user’s field of view can be improved so as to reduce the likelihood or severity of a medical condition, or to otherwise treat or ameliorate the medical condition.

[161] The phrase "dynamic visual optimization", and variants thereof, generally refers to any technique by which a moving object can be presented to an observer in a substantially non-moving manner, including one or more of: (A) presenting a sequence of substantially still images, each separately identifiable to the observer with at least some distinction between successive ones of that sequence, which collectively show a continuous motion of the object; (B) presenting a sequence of substantially short moving images, each separately identifiable to the observer with at least some distinction between successive ones of that sequence, which collectively show a continuous motion of the object; or (C) any other technique described herein by which the observer can distinguish between substantially local positions and direction of motion of the object, without the observer losing the ability to determine a relatively longer motion of the object. In general, dynamic visual optimization can have the property that the observer's view of the moving object improves the observer's visual acuity and reduces the cognitive load on the observer when viewing the object.

[162] The term “e-sun reader”, and variants thereof, generally refers to any device disposed to use a shading/inverse-shading effect to provide a readable portion of the wearer’s field of view in bright light, such as in bright sunlight. For example, and without limitation, an e-sun reader can include eyewear disposed to shade/inverse-shade one or more lenses so as to adjust brightness on a smartphone, tablet/phablet, or computer screen or another screen. This can have the effect that the wearer of the eyewear can read the screen even in sunlight (or other bright light) that would otherwise wash out the display on the screen and make it difficult to read.

[163] The phrase "eye-tracking", and variants thereof, generally refers to any technique by which a gaze direction and/or distance to an object being looked at can be determined, including one or more of: (A) determining a direction in which a user's eye is oriented; (B) determining a focal length of a single user's eye, or a point in space at which both of the user's eyes are directed; (C) determining a time of flight to an object in a direction in which a user's eye is oriented; (D) performing object recognition with respect to an object in a user's field of view in a direction at which a user's eye is oriented or nearly oriented; or otherwise as described herein.

[164] The phrase "motion blur", and variants thereof, generally refers to artifacts of viewing objects for which there is relative motion between the user and the object, in which the object appears blurred, smeared, or otherwise unclear, due to that relative motion. For example, motion blur can occur when the object and user are moving or rotating relatively quickly with respect to each other. For another example, motion blur can occur when the object is disposed in the user's field of view other than focused upon, such as a peripheral vision field of view or an upper or lower range of the user's field of view.

[165] The phrase "perceptual optimization", and variants thereof, generally refers to any technique by which user senses can be disposed to be adjusted so as to provide the user with a preferred view of an ambient environment, whether a natural ambient environment, or in response to an augmented reality or virtual reality environment. In one embodiment, perceptual optimization can include one or more of shading/inverse-shading, coloring/tinting, polarization, prismatic deflection, dynamic visual optimization, audio signal alteration, or as otherwise described herein. In one embodiment, perceptual optimization can include adjustment by a user, by another person, by a processor operating using control software (such as possibly predictive software, or an artificial intelligence or machine learning technique), or in response to a set of pre-set bookmarks set by one or more of the foregoing.

[166] The phrase "real time", and variants thereof, generally refers to timing, particularly with respect to sensory input or adjustment thereto, operating substantially in synchrony with real world activity, such as when a user is performing an action with respect to real world sensory input. For example, "real time" operation of digital eyewear with respect to sensory input generally includes user receipt of sensory input and activity substantially promptly in response to that sensory input, rather than user receipt of sensory input in preparation for later activity with respect to other sensory input.

[167] The phrases “sensory input”, “external sensory input”, and variants thereof, generally refer to any input detectable by a human or animal user. For example, sensory inputs include audio stimuli such as in response to sound; haptic stimuli such as in response to touch, vibration, or electricity; visual stimuli such as in response to light of any detectable frequency; nasal or oral stimuli such as in response to aroma, odor, scent, taste, or otherwise as described herein; other stimuli such as balance; or otherwise as described herein.

[168] The phrase "sensory overload", and variants thereof, generally refers to any case in which excessive volume of a sensory input (such as brightness, loudness, or another measure) can cause information to be lost due to human sensory limitations. For example, excessive luminance in all or part of an image can cause human vision to be unable to detect some details in the image. For another example, images having sensory overload can cause human vision to be unable to properly determine the presence or location of objects of interest.

[169] The term "shading", and the phrases "shading/inverse-shading", "inverse-shading", and variants thereof, generally refer to any technique for altering a sensory input, including but not limited to the following (a minimal sketch of the luminance cases appears after this list):

— altering a total luminance associated with an image, such as by reducing luminance at substantially each pixel in the image;

— altering a luminance associated with a portion of an image, such as by reducing luminance at a selected set of pixels in the image;

— altering a luminance associated with a portion of an image, such as by increasing luminance at a selected portion of the image, to brighten that portion of the image, to highlight a border around or near that portion of the image, to improve visibility of that portion of the image, or otherwise as described herein;

— altering a loudness associated with an auditory signal, such as by reducing loudness at substantially each portion of the auditory signal;

— altering a loudness associated with a portion of an auditory signal, such as by reducing loudness at a selected set of times or frequencies in that auditory signal;

— altering a loudness associated with a portion of an auditory signal, such as by increasing loudness at a selected set of times or frequencies in that auditory signal, to improve listening to that portion of the auditory signal, or otherwise as described herein;

— altering a selected set of frequencies associated with an image, such as to change a first color into a second color, for the entire image, for a portion of the image, or otherwise as described herein;

— altering a selected set of frequencies associated with an image, such as to provide a “false color” image of a signal not originally viewable by the human eye, such as to provide a visible image in response to an IR (infrared) or UV (ultraviolet) or other information ordinarily not available to human senses;

— altering a sensory input other than visual or auditory sensory inputs, such as reducing/increasing an intensity of a haptic input, of an odor, or of another sense.
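
As referenced above, the following is a minimal sketch of the luminance cases in this list, assuming images are held as arrays of per-pixel luminance values; the function name and gain convention are illustrative assumptions only:

    import numpy as np

    def shade(image, gain, region=None):
        """Scale luminance by `gain` (<1 shades, >1 inverse-shades).

        image:  H x W array of luminance values in [0, 255].
        region: optional boolean mask selecting the pixels to alter;
                if None, the whole image is altered.
        """
        out = image.astype(float)
        mask = np.ones(image.shape, dtype=bool) if region is None else region
        out[mask] = np.clip(out[mask] * gain, 0, 255)
        return out.astype(image.dtype)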

[170] The phrases “signal input”, “external signal input”, and variants thereof, generally refer to any input detectable by digital eyewear or other devices. For example, in addition to or in lieu of sensory inputs and external sensory inputs, signal inputs can include

— information available to digital eyewear in response to electromagnetic signals other than those detectable by human senses, such as signals disposed in a telephone protocol, a messaging protocol such as SMS or MMS or a variant thereof, an electromagnetic signal such as NFC or RFID or a variant thereof, an internet protocol such as TCP/IP or a variant thereof, or similar elements;

— information available to digital eyewear in response to an accelerometer, a gyroscope, a GPS signal receiver, a location device, an ultrasonic device, or similar elements;

— information available to digital eyewear in response to a magnetometer, a medical imaging device, an MRI device, a tomography device, or similar elements; or otherwise as described herein.

[171] The phrase “mobile device”, and variants thereof, generally refers to any relatively portable device disposed to receive inputs from and provide outputs to, one or more users. For example, a mobile device can include a smartphone, an MP3 player, a laptop or notebook computer, a computing tablet or phablet, or any other relatively portable device disposed to be capable as further described herein. The mobile device can include input elements such as a capacitive touchscreen; a keyboard; an audio input; an accelerometer or haptic input device; an input couplable to an electromagnetic signal, to an SMS or MMS signal or a variant thereof, to an NFC or RFID signal or a variant thereof, to a signal disposed using TCP/IP or another internet protocol or a variant thereof, to a signal using a telephone protocol or a variant thereof; another type of input device; or otherwise as described herein.

[172] The terms "random", "pseudorandom", and variants thereof, generally refer to any process or technique having a substantially nonpredictable result, including pseudorandom processes and functions.

[173] The phrase "remote device", and variants thereof, generally refers to any device disposed to be accessed, and not already integrated into the accessing device, such as disposed to be accessed by digital eyewear. For example, a remote device can include a database or a server, or another device or otherwise as described herein, coupled to a communication network, accessible using a communication protocol. For another example, a remote device can include one or more mobile devices other than a user's digital eyewear, accessible using a telephone protocol, a messaging protocol such as SMS or MMS or a variant thereof, an electromagnetic signal such as NFC or RFID or a variant thereof, an internet protocol such as TCP/IP or a variant thereof, or otherwise as described herein.

[174] The phrase "user input", and variants thereof, generally refers to information received from the user, such as in response to audio/video conditions, requests by other persons, requests by the digital eyewear, or otherwise as described herein. For example, user input can be received by the digital eyewear in response to an input device (whether real or virtual), a gesture (whether by the user's eyes, hands, or otherwise as described herein), using a smartphone or controlling device, or otherwise as described herein.

[175] The phrase “user parameters”, and variants thereof, generally refers to information with respect to the user as determined by digital eyewear, user input, or other examination about the user. For example, user parameters can include measures of whether the user is able to distinguish objects from audio/video background signals, whether the user is currently undergoing an overload of audio/video signals (such as from excessive luminance or sound), a measure of confidence or probability thereof, a measure of severity or duration thereof, other information with respect to such events, or otherwise as described herein.

[176] The phrase “visual acuity”, and variants thereof, generally refers to the ability of a user to determine a clear identification of an object in the user’s field of view, such as one or more of:

— The object is presented in the user’s field of view against a background that involves the user having relatively greater difficulty identifying the object against that background. This is sometimes called “static” visual acuity herein.

— The object is moving at relatively high speed, or relatively unexpected speed, in the user’s field of view, that involves the user having relatively greater difficulty identifying a path of the object. This is sometimes called “dynamic” visual acuity herein.

— The object is presented in the user’s field of view at an angle, such as a peripheral vision angle or another non-frontal visual angle, that involves the user having relatively greater difficulty identifying the object. This is sometimes called “peripheral” visual acuity herein.

— The object is in motion with respect to the user, such as objects that are moving directly toward or away from the user, or objects that are moving in a region of the user’s peripheral vision.

— The object is located poorly for viewing with respect to a background, such as an object that is brightly backlit, or for which the sun or other lighting is in the user’s eyes, or an object which appears before a visually noisy background, or otherwise is difficult to distinguish.

[177] The phrase “improving visual acuity”, and variants thereof, generally refers to improving the user’s audio and/or visual acuity, or improving the user’s ability to see motion, without degrading the user’s normal ability to sense audio and/or visual information, and without interfering with the user’s normal sensory activity. For example, as described herein, when the user’s visual acuity is improved, the user should still be able to operate a vehicle, such as driving a motor vehicle or piloting an aircraft, or operating another type of vehicle.

[178] The phrases “cognitive load”, “cognitive overload”, “cognitive underload”, and variants thereof, with respect to observing an object, generally refers to a measure of how difficult an observer might find determining a location or movement of that object, such as with respect to a foreground or a background. For example, as described herein, when the user’s cognitive load is reduced (whether due to a reduced amount of cognitive overload or cognitive underload), the user’s visual acuity is generally improved.

— As described herein, the phrase “cognitive overload”, and variants thereof, generally refers to a measure of excessive sensory input, with the effect that the user loses visual acuity due to that overload. For example, cognitive overload can occur with respect to a moving object when that moving object has a relatively bright sky or a relatively noisy image behind it.

— As described herein, the phrase “cognitive underload”, and variants thereof, generally refers to a measure of inadequate sensory input, with the effect that the user loses visual acuity due to that underload. For example, cognitive underload can occur with respect to a moving object when that moving object is relatively dim or indistinct with respect to its background.

[179] After reviewing this Application, those skilled in the art would recognize that these terms and phrases should be interpreted in light of their context in the specification.

FIGURES AND TEXT

Fig. 1 — Active Correction or Enhancement

[180] Fig. 1 (collectively including Figures 1A-1B) shows a conceptual drawing of example eyewear including wearable glasses.

[181] Figure 1A shows a conceptual drawing of example glasses having multiple active regions related to wearer view.

[182] Figure 1B shows a conceptual drawing of example glasses having multiple active pixels related to individual wearer view.

Active correction or enhancement — regions

[183] Figure 1A shows a conceptual drawing of example glasses having multiple active regions related to wearer view.

[184] In one embodiment, an example eyewear 100 can include glasses 110 disposed for use by the wearer (not shown), including elements shown in the figure, such as one or more of:

— a frame 111, such as possibly including temples 111a, a nosepiece 111b, or lens holders 111c;

— at least one lens 112, such as possibly a right lens 112a (shown in Figure 1A), or a left lens 112b (shown in Figure 1B).

[185] In one embodiment, the frame 111 can enclose, or hold, one or more electronic elements shown in the figure, such as one or more of:

— a computing device 121, such as possibly including a processor, memory or mass storage, a power supply, a clock circuit, or other elements used with computing devices;

— a communication device 122, such as possibly including a wireless or wired communication element, a communication protocol stack, or other elements used with communication devices;

— one or more sensors 123, such as possibly including one or more of: wearer sensors 123a disposed to receive information about the wearer (or their current condition), ambient sensors 123b disposed to receive information about an environment near the wearer (or its current condition), or other sensors.

[186] For example, the sensors 123 can include one or more visually evoked potential (VEP) elements disposed to measure a potential of the wearer's visual region of the brain. The VEP elements can be disposed using a set of electrodes disposed on the wearer's scalp, on a headset or headband, on the wearer's forehead, on the back of the wearer's neck, or otherwise as described herein. The sensors 123 can also include elements disposed to measure an electroencephalogram (EEG), an amount of skin moisture, a skin temperature, a galvanic skin response, other elements disposed to measure the wearer's emotional state, or otherwise as described herein.

[187] For another example, the sensors 123 can include one or more devices disposed to perform electroencephalography (EEG), electrooculography (EOG), electroretinography (ERG), optical coherence tomography (OCT), or other measures with respect to eye function. For example, anxiety or depression can be determined in response to ERG. For another example, cardiac risk can be determined in response to OCT. For another example, the computing device 121 can be disposed to use other measures with respect to eye function, such as in combination with one or more artificial intelligence (AI) or machine learning (ML) techniques, to predict one or more measures of efficacy of treatment, quality of life after treatment, or otherwise as described herein, with respect to monitoring, predicting, preventing, diagnosing, or treating medical conditions.

[188] For another example, the sensors 123 can include an electric field element disposed to measure a dipole moment of the eye. The dipole moment of the eye is weak but present; it is aligned at a known angle with respect to a gaze direction. This can have the effect that the element disposed to measure a dipole moment of the eye can measure a gaze direction, without requiring any input to, or view of, the pupil or iris.
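
Purely as a sketch of the geometry involved, the measured dipole vector can be rotated by the known offset angle between the eye's dipole axis and the visual axis to estimate gaze direction; the 5-degree offset and the assumption of a rotation about the vertical axis are hypothetical placeholders, not measured values:

    import numpy as np

    # Hypothetical fixed offset between the eye's dipole axis and the
    # visual axis, assumed here to be a rotation about the vertical axis.
    OFFSET_RAD = np.deg2rad(5.0)

    def gaze_from_dipole(dipole_xyz):
        """Estimate a unit gaze vector from a measured dipole moment vector."""
        d = np.asarray(dipole_xyz, dtype=float)
        d /= np.linalg.norm(d)  # normalize the measured dipole direction
        c, s = np.cos(OFFSET_RAD), np.sin(OFFSET_RAD)
        rot = np.array([[c, 0, s],   # rotate by the known offset angle
                        [0, 1, 0],
                        [-s, 0, c]])
        return rot @ d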

[189] For another example, the sensors 123 can include a gaze direction sensor (not shown), such as an element disposed to measure a reflection of an electromagnetic signal, such as infrared (IR) light directed at the eye and reflected in response to a direction of the pupil or the lens thereof. In such cases, the gaze direction sensor can provide a signal indicating a direction at which the wearer is looking, such as whether the wearer is looking up/down, right/left, centrally/peripherally, or through what region of the lens the wearer's gaze is directed. In such cases, the sensors 123 can also include a pupillometer, such as an element disposed to measure a size of the pupil, such as a camera or other device disposed to distinguish a size of the pupil. A size of the pupil can be used to determine a focal length at which the wearer is directing a gaze, such as at a distance, mid-range, or near range.
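
A minimal sketch of how pupil size might be mapped to a focal range follows; the thresholds are illustrative placeholders, and a real device would calibrate per wearer and correct for ambient luminance, since pupil size responds to light as well as to accommodation:

    def focal_range_from_pupil(pupil_mm):
        """Classify viewing distance from pupil diameter (near response)."""
        # Pupils constrict when focusing on near objects; these cut-offs
        # are hypothetical and would be calibrated per wearer in practice.
        if pupil_mm < 3.0:
            return "near"
        if pupil_mm < 5.0:
            return "mid-range"
        return "distance"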

[190] For another example, the sensors 123 can include one or more devices mounted on a vehicle, such as a vehicle being controlled by the wearer (such as a racing car or an aircraft). For example, the sensors 123 can be disposed surrounding the vehicle, directed at fields of view (FOV) not ordinarily available to the wearer when operating the vehicle. The sensors 123 can be mounted on the vehicle and directed to the sides or rear of the vehicle, at the front of the vehicle and directed at angles from the wearer’s FOV, or otherwise as described herein. The sensors 123 can be mounted on the vehicle and disposed so as to move relative to the vehicle, such as when the vehicle is turning, climbing or diving, accelerating or decelerating, or otherwise as described herein.

[191] For another example, the sensors 123 can include one or more remote devices, such as mounted on aircraft, drones, other vehicles, other distant stations, or otherwise as described herein. In such cases, the sensors 123 can be disposed to transmit information to the computing device 121, so as to control the lenses 112. In additional such cases, the sensors 123 can be disposed to transmit information from an over-the-horizon field of view (FOV), otherwise not ordinarily available to the wearer.

[192] In one embodiment, the lens holders 111c can be disposed to maintain one or more lenses 112, such as lenses used to correct vision on behalf of the wearer, lenses used to enhance vision on behalf of the wearer, or otherwise as described herein. For example, lenses 112 used to correct vision can have one or more lens prescriptions associated therewith, disposed to correct for myopia, presbyopia, astigmatism, or other wearer vision artifacts. For another example, lenses 112 used to enhance vision can include a zoom feature disposed to present the wearer with a zoomed-in or zoomed-out view of the wearer's field of view (FOV), or can include other features disposed to present the wearer with other vision enhancements described in the Incorporated Disclosures, or otherwise as described herein.

[193] The lenses 112 can include multiple lens regions 131, each disposed to correct vision or enhance vision on behalf of the wearer. For example, the lens regions 131 can include a central vision region 131a, disposed to provide distinct corrections or enhancements to vision in a region where the wearer is looking at objects using their central vision, or one or more peripheral vision regions 131b, disposed to provide distinct corrections or enhancements to vision in a region where the wearer is looking at objects using their peripheral vision. For another example, the lens regions 131 can include a close-vision region 131c, disposed to provide distinct corrections or enhancements to vision in a region where the wearer is looking at a close object, a mid-range vision region 131d, disposed to provide distinct corrections or enhancements to vision in a region where the wearer is looking at a mid-range object, or a distant vision region 131e, disposed to provide distinct corrections or enhancements to vision in a region where the wearer is looking at a distant object.

[194] In one embodiment, each lens region 131 can be individually controlled, such as by the computing device 121, or otherwise as described herein. This can have the effect that the wearer's vision can be corrected or enhanced in each region where the wearer might look. For example, the close-vision region 131c can be disposed with a distinct prescription from the mid-range vision region 131d. This can have the effect that when the wearer looks at a close object, their vision can be corrected or enhanced with respect to the prescription assigned to the close-vision region 131c, or when the wearer looks at a mid-range object, their vision can be corrected or enhanced with respect to the prescription assigned to the mid-range vision region 131d. For another example, the central vision region 131a can be disposed with a distinct prescription from the peripheral vision region 131b. This can have the effect that when the wearer looks directly at an object, their vision can be corrected or enhanced with respect to the prescription assigned to the central vision region 131a, or when the wearer uses their peripheral vision, their vision can be corrected or enhanced with respect to the prescription assigned to the peripheral vision region 131b.
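
One way to picture per-region control, offered only as a sketch: a table mapping the region labels of Figure 1A to prescriptions, driven through a hypothetical lens interface (the set_power method and the diopter values below are assumptions, not part of the specification):

    # Per-region prescriptions in diopters, keyed by the region labels
    # used in Figure 1A. The values are hypothetical examples.
    REGION_PRESCRIPTIONS = {
        "central_131a":    -2.00,
        "peripheral_131b": -1.75,
        "close_131c":      -0.50,   # reading distance
        "mid_131d":        -1.25,
        "distant_131e":    -2.00,
    }

    def apply_region(lens, region_id):
        """Drive one lens region to its assigned prescription (stub)."""
        # `lens.set_power` stands in for whatever electro-optic control
        # the lens hardware actually exposes.
        lens.set_power(region_id, REGION_PRESCRIPTIONS[region_id])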

[195] In one embodiment, when the wearer moves their head, the computing device 121 can determine, such as using an accelerometer or a gyroscope (which can be included with the sensors 123), a wearer's head movement. The computing device 121 can also determine, such as using a dynamic eye gaze tracker (which can be included with the sensors 123), a gaze direction. This information can allow the computing device 121 to determine whether the wearer is intending to look at a close object, a mid-range object, or a distant object; similarly, this information can allow the computing device 121 to determine whether the wearer is using their central vision or peripheral vision. In response thereto, the computing device 121 can control the correction or enhancement associated with one or more of the lens regions 131. This can have the effect that the eyewear 100 adjusts its correction or enhancement to match the wearer's intended use thereof.
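
A minimal sketch of the selection logic described above follows; the angular and distance cut-offs are hypothetical, and the region labels match the table sketched after paragraph [194]:

    def select_region(gaze_angle_deg, focal_length_m):
        """Pick the lens regions matching gaze direction and focus distance."""
        # Lateral choice: central versus peripheral vision (20 deg assumed).
        lateral = "peripheral_131b" if abs(gaze_angle_deg) > 20 else "central_131a"
        # Depth choice: close, mid-range, or distant (cut-offs assumed).
        if focal_length_m < 0.6:
            depth = "close_131c"
        elif focal_length_m < 3.0:
            depth = "mid_131d"
        else:
            depth = "distant_131e"
        return lateral, depth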

[196] In another embodiment, when the wearer shifts their gaze, the computing device 121 can determine, such as using a focal length detector (which can be included with the sensors 123), a distance to an object being viewed by the wearer. This information can allow the computing device 121 to determine whether the wearer is intending to look at a close object, a mid-range object, or a distant object. In response thereto, the computing device 121 can control the correction or enhancement associated with one or more of the lens regions 131. This can have the effect that the eyewear 100 adjusts its correction or enhancement to match the wearer's intended use thereof.

[197] In one embodiment, the lens regions 131 can overlap, such as shown in the figure. An example might occur when close-range overlaps with both central/peripheral vision. In such cases, the intersection of multiple lens regions 131, or the union of multiple lens regions 131, as appropriate, can be invoked by the computing device 121, so as to provide the wearer with the correction or enhancement to match the wearer's intended use of the eyewear 100.

Active correction or enhancement — pixels

[198] Figure 1B shows a conceptual drawing of example glasses having multiple active pixels related to individual wearer view.

[199] In one embodiment, an example eyewear 100 can include glasses 110 disposed for use by the wearer (not shown), including elements shown in the figure, such as one or more of:

— a frame 111, such as possibly including temples 111a, a nosepiece 111b, or lens holders 111c;

— at least one lens 112, such as possibly a right lens 112a (shown in Figure 1A), or a left lens 112b (shown in Figure 1B).

[200] The lenses 112 can include multiple lens pixels 141, each disposed to correct vision or enhance vision on behalf of the wearer. In one embodiment, each lens pixel 141 can include an individual region (similar to the lens regions 131, though typically smaller), disposed to provide distinct corrections or enhancements to vision in the region where the wearer's gaze direction intersects the lens pixel. Similar to the lens regions 131 described with respect to Figure 1A, each lens pixel 141 can be individually controlled, such as by the computing device 121, or otherwise as described herein. This can have the effect that the wearer's vision can be corrected or enhanced for each direction where the wearer might look.

[201] In one embodiment, the computing device 121 can associate a distinct set of lens pixels 141 for use as a separate one of the multiple lens regions 131. The computing device 121 can control the prescription with respect to each such lens region 131 by controlling each of the lens pixels 141 associated with that particular lens region. Similar to the possibility of overlap of lens regions 131, a set of lens pixels 141 can be associated with more than one such lens region. This can have the effect that when the computing device 121 determines that the wearer is using a particular lens region 131, it can select the set of lens pixels associated with that lens region, even if those lens pixels are also associated with another lens region. Similar to overlap of lens regions 131, the intersection of multiple sets of lens pixels 141, or the union of multiple sets of lens pixels 141, as appropriate, can be invoked by the computing device 121, so as to provide the wearer with the correction or enhancement to match the wearer's intended use of the eyewear 100. When the computing device 121 can determine the wearer's intended use of the eyewear 100 and can determine the particular lens pixel 141 that the wearer's gaze direction passes through, the computing device can invoke only that one lens pixel, possibly updating the particular lens pixel to invoke as the wearer's gaze direction might change.
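
The association of lens pixels 141 with lens regions 131, and the intersection/union behavior described above, can be pictured with ordinary set operations; the pixel coordinates below are arbitrary illustrative values:

    # Illustrative sketch: lens regions as sets of lens-pixel identifiers
    # on a hypothetical 64 x 64 pixel grid.
    close_pixels = {(r, c) for r in range(40, 64) for c in range(0, 64)}
    central_pixels = {(r, c) for r in range(16, 48) for c in range(16, 48)}

    # A lens pixel may belong to more than one region; the controller can
    # drive the intersection (pixels in both regions) or the union, as
    # appropriate for the wearer's current use.
    both = close_pixels & central_pixels
    either = close_pixels | central_pixels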

[202] The set of lens pixels 141 associated with each such lens region 131 can be adjusted by the computing device 121. This can have the effect that the set of lens pixels 141 associated with each such lens region 131 can be altered from time to time.

[203] In alternative embodiments, the lenses 112 can include one or more layers or alternative regions that can have their shading, or other effects, separately adjusted. Thus, in addition to or in lieu of lens pixels 141, the lenses 112 can use separate regions that are adjusted as a whole, rather than being adjusted as a collective of lens pixels 141. When a region is adjusted, this can have the effect that the eye can be drawn toward or away from a particular adjusted region. For example, when it is desired to encourage the user to look through a short-range focusing region, other regions can be shaded to decrease visibility, thus encouraging the user to look in a particular direction or through a particular region of the lenses.

[204] For example, a selected lens 112a or 112b can include a first region for a first degree of vision correction, such as using refraction, such as for close-range viewing, and a second region for a second degree of vision correction, such as for longer-range viewing. A second lens can be overlaid on the first lens, so that the second lens can shade one or more regions of the first lens. This can have the effect that the user is prompted to look in a selected direction, or through a particular region of the first lens. Thus, the second lens can shade so as to prompt the user to view through the selected lens 112a or 112b, thus looking at a field of view (FOV) through either a selected close-range lens (e.g., lens 112a) or a selected longer-range lens (e.g., lens 112b).

Predictive techniques

[205] In one embodiment, the computing device 121 can maintain a record of wearer activity with respect to use of the lens regions 131, so as to identify which portions of the lenses 112 should be associated with which lens regions 131 to provide the wearer with the best possible experience with using the eyewear 100. For example, when the computing device 121 determines that the wearer is most likely to need a particular prescription for a selected portion of the lenses 112, the computing device can adjust the prescription for that particular portion of the lenses so as to provide the wearer with that prescription when the wearer is using that portion of the lenses.

[206] In one embodiment, the computing device 121 can determine the wearer's most likely prescription in response to a predictive technique, such as using artificial intelligence (AI) or machine learning (ML). For example, the computing device 121 can train a recurrent neural network (RNN) to predict the wearer's most likely prescription in response to each lens region 131 and each other set of circumstances, such as information obtained from the sensors 123. Alternatively, the computing device 121 can determine a set of regression parameters to predict the wearer's most likely prescription in response to each lens region 131 and each other set of circumstances. The computing device 121 can use other and further AI or ML techniques, or other techniques, or otherwise as described herein, to make the desired prediction.
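
As a sketch of the regression alternative mentioned above (the feature choice is hypothetical, and a real system might instead use an RNN as described), logged wearer history could be fit by ordinary least squares:

    import numpy as np

    def fit(features, prescriptions):
        """Least-squares regression parameters from logged wearer history.

        features:      n x k array of sensor-derived features per sample.
        prescriptions: n observed prescriptions (diopters).
        """
        X = np.column_stack([features, np.ones(len(features))])  # add bias
        coef, *_ = np.linalg.lstsq(X, prescriptions, rcond=None)
        return coef

    def predict(coef, feature_row):
        """Predict a likely prescription for one set of circumstances."""
        return float(np.dot(coef[:-1], feature_row) + coef[-1])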

[207] Similar to predictive techniques with respect to the lens regions 131, the computing device 121 can determine the wearer's most likely prescription in response to one or more predictive techniques, such as using artificial intelligence (AI) or machine learning (ML) with respect to each lens pixel 141, with respect to association of lens pixels 141 with particular lens regions 131, or otherwise as described herein. In such cases, the computing device 121 can assign individual lens pixels 141 to selected lens regions 131, in response to one or more predictive techniques. Also similarly, the computing device 121 can adjust the set of lens pixels 141 associated with each lens region 131 in response to a predictive technique in response to wearer actions, such as the wearer moving their head when their gaze direction should be reassociated with a different lens region 131.

[208] In one embodiment, the computing device 121 can determine the wearer's most likely medical condition, such as in response to the sensors 123. For example, blink rate and other parameters with respect to the wearer's eye activity can be used to determine whether the wearer is excessively anxious, depressed, sleep-deprived, or otherwise needs to rest. In such cases, the eyewear 100 can be disposed to urge the wearer to take a break and rest. This can have the effect that safety is improved, such as for commercial pilots and other pilots, long-haul truckers and other long-distance drivers, police officers, military personnel, firefighters, emergency responders, medical personnel, and other personnel often subject to long hours or stressful circumstances. Alternatively, the eyewear 100 can be disposed to urge the wearer to take a break or to obtain a stimulant, such as caffeine, sugar, a meal, or otherwise as described herein.
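
A minimal sketch of a blink-based rest prompt follows, with the caveat that the thresholds below are hypothetical placeholders, not clinically validated values:

    def rest_alert(blinks_per_min, mean_blink_s):
        """Flag likely fatigue from blink rate and blink duration."""
        # Elevated blink rate or prolonged blinks are treated here as a
        # rough fatigue proxy; real thresholds would be per-wearer.
        fatigued = blinks_per_min > 25 or mean_blink_s > 0.4
        return "suggest a break" if fatigued else "ok"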

Environment features

[209] In one embodiment, an example eyewear 100 can be responsive to environment features, such as: features of the wearer's field of view (FOV), features of objects or scenes within the wearer's FOV, other features of the ambient environment, or otherwise as described herein.

[210] For example, features of the wearer’s field of view can include one or more of: ambient light, such as total luminance, luminance in a particular region thereof (such as in a region of peripheral vision), prominence of particular colors (such as excessive or inadequate red, green, or blue), glare, ultraviolet (UV), or otherwise as described herein. For another example, features of the wearer’s field of view can include the presence of infrared (IR) frequencies, such as for use with “night vision” eyewear. For another example, features of the wearer’s field of view can include particular frequency mixtures, such as: sunlight, indoor lighting, excessive UV, particularly when inappropriate for the time of day.

[211] For example, features of the wearer’s field of view can include identifying particular objects, such as weapons (guns, knives, or otherwise as described herein), possibly using object recognition. For another example, features of the wearer’s field of view can include identifying particular people, such as friends, teammates, co-workers, search/rescue targets, criminal suspects, accident victims or medical patients, or otherwise as described herein.

[212] For example, features of the wearer's ambient environment can include the wearer's location (including whether the wearer is within a particular area (such as within a known geofence), or whether the wearer is within a selected distance of a known object); the absence or presence of known electromagnetic signals, such as identify-friend-or-foe (IFF) signals for particular persons or equipment; atmospheric conditions, such as weather, pollution conditions, or allergens.

Electromagnetic signals and predictive actions

[213] When the wearer's ambient environment includes an IFF signal, the eyewear 100 can determine whether to adjust features of the wearer's field of view (FOV) in response to the IFF signal. For example, when the IFF signal indicates a warning that a stun grenade (sometimes called a "flashbang grenade") is about to be triggered in the local area, the eyewear 100 can adjust the wearer's FOV to (A) heavily shade infalling light so as to protect the wearer's eyes against the extreme light emitted by the flashbang grenade, and (B) heavily protect the wearer's ears against the extreme sound emitted by the flashbang grenade.
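
A sketch of how such an IFF warning might be handled follows; the device interface, message format, and attenuation figures are illustrative assumptions only:

    def on_iff_warning(eyewear, message):
        """React to a flashbang warning before detonation (sketch only).

        `eyewear` is a hypothetical device interface; the method names
        below are assumptions for illustration.
        """
        if message.get("type") == "flashbang":
            eyewear.set_lens_shading(1.0)        # (A) shade infalling light fully
            eyewear.set_ear_attenuation_db(30)   # (B) attenuate extreme sound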

[214] When the wearer's ambient environment includes a signal describing an object, the eyewear 100 can determine whether to adjust features of the wearer's field of view (FOV) in response to the object. Alternatively, the eyewear 100 does not need to wait for an explicit signal describing the object; the eyewear can use a predictive technique, such as an artificial intelligence (AI) or machine learning (ML) technique, in response to the ambient environment or other factors, to determine that the object is about to enter the wearer's FOV, so as to prepare itself accordingly to adjust the wearer's FOV.

(Dark tunnel)

[215] For example, the signal can indicate that the wearer is about to enter or to exit a dark tunnel, particularly when driving at relatively high speed. In such cases, the signal with respect to entering or exiting a dark tunnel can be emitted by a transmitter at or near the entrance or exit of the tunnel, or can be received with respect to a general location detector, such as a GPS device.

[216] When the signal indicates that the wearer is a driver of a vehicle and is about to enter a dark tunnel, particularly when driving at relatively high speed, the eyewear 100 can adjust the wearer's FOV to (A) remove any shading against sunlight so as to allow the driver to see within the dark tunnel, (B) enhance the wearer's vision within the dark tunnel, such as by enhancing any lighting within the tunnel and adjusting for any visual blur or noise due to the vehicle moving quickly within the dark tunnel, (C) adjust the wearer's prescription so as to account for the relative closeness of the walls of the dark tunnel, and (D) enhance the wearer's vision within the dark tunnel by adding to the light (by injecting light) in areas of the wearer's FOV where the dark tunnel is in shadow. The eyewear 100 can make similar adjustments to account for entering any similar darkened region, such as a canyon shadowed against sunlight.

[217] Similarly, when the wearer's vehicle exits the tunnel (or other darkened region, such as a canyon shadowed against sunlight), the eyewear 100 can adjust the wearer's FOV to (A) replace shading against sunlight so as to allow the driver to see when exiting the dark tunnel, and especially to remove glare from the sudden sunlight upon exiting the dark tunnel, (B) remove any enhancement of lighting so as to not exacerbate the effect of sudden sunlight, (C) adjust the wearer's prescription so as to account for the relative distance of the roadway outside the dark tunnel (or other darkened region), and (D) remove any light injection so as to not overload the wearer's vision.

(Sudden lighting changes)

[218] The eyewear 100 can similarly alert the wearer and adjust the wearer's field of view (FOV) in response to sudden changes in lighting condition, whether those sudden changes are due to known objects, known terrain features, or other known effects. For example, when the eyewear 100 detects a sudden change in lighting condition, the eyewear can adjust the wearer's FOV in response to that sudden change. Since the eyewear 100 can operate electronically, while the wearer's eye operates using the pupil and iris, this can have the effect that the wearer's FOV can be adjusted much faster by the eyewear than by the wearer's eye muscles. The eyewear 100 can operate to respond to sudden changes in lighting condition in approximately 1.5 milliseconds, while the pupil and iris might take as long as 300 to 400 milliseconds to respond. Accordingly, the eyewear can protect the wearer against sudden changes in lighting more effectively than the wearer's eye itself. In such cases, the eyewear 100 does not need to wait for a sudden change in lighting condition; the eyewear can use a predictive technique, such as an artificial intelligence (AI) or machine learning (ML) technique, in response to the ambient environment or other factors, to determine that a sudden change in lighting condition is imminent, so as to prepare itself accordingly to adjust the wearer's FOV.
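
A sketch of one fast control iteration follows; the 4x "sudden change" ratio and the step sizes are assumptions for illustration, the point being that an electronic loop of this kind can run every millisecond or two, far faster than the iris:

    def shading_step(lux, prev_lux, shade_level):
        """One control iteration: step shading toward the luminance change."""
        JUMP = 4.0  # treat a >4x luminance jump or drop as "sudden" (assumed)
        if lux > prev_lux * JUMP:
            shade_level = min(1.0, shade_level + 0.5)   # darken quickly
        elif lux < prev_lux / JUMP:
            shade_level = max(0.0, shade_level - 0.5)   # lighten quickly
        return shade_level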

[219] For another example, the signal can indicate that the wearer is about to view a display, such as a display described with respect to Fig. 7. In such cases, the signal with respect to viewing a display can be emitted by a transmitter on or near the display, or can be received with respect to a general location detector, such as a GPS device.

(Viewing a display)

[220] When the signal indicates that the wearer is about to view a display, such as when the wearer is driving, and the display includes a billboard or surface that enters the wearer's field of view (FOV), the eyewear 100 can adjust the wearer's FOV to augment the wearer's vision to inject an image at the location of the display. For example, the image injected onto the display can include information with respect to news, road conditions, or weather; one or more advertisements, such as in response to demographic or social information about the wearer, or information about which the wearer has expressed interest; or otherwise as described herein.

[221] When the signal indicates that the wearer is viewing a display, such as a display associated with a smartphone or other mobile device, or another selected background, the eyewear 100 can adjust the wearer's field of view (FOV) to include a three-dimensional (3D) display superposed on the selected background. For example, the eyewear 100 can adjust the wearer's FOV to present a 3D display on the smartphone's display when the wearer looks at the smartphone. For another example, the eyewear 100 can adjust the wearer's FOV to present a 3D display on another selected background, such as a billboard, a movie theater screen, a theme-park display or other interactive display, an outdoor background, a region of the sky or other natural background, or another region of the wearer's field of view appropriate for a 3D display.

[222] In one embodiment, the eyewear can be disposed to adjust shading, or other effects, with respect to an object or with respect to a portion of the user’s field of view (FOV) at which the user is looking. In such cases, when the user is looking in a particular direction, the eyewear can be disposed to shade only portions of the user’s FOV in that direction. Similarly, in such cases, when the user is looking at a particular object, such as when looking in a particular direction and at a particular depth of focus so as to distinguish a selected object, the eyewear can be disposed to shade only that selected object. An outbound camera, such as a camera mounted behind one or more of the lenses and disposed to view a location or region at which the user is looking, can be disposed to determine an amount of shading that optimizes the user’s view, or to determine an amount of shading that optimizes a clarity of the location or region at which the user is looking.

[223] For example, when the eyewear detects that the user is looking at a display, such as a smartphone or other mobile device, the eyewear can detect whether shading is necessary or appropriate, in response to the relative brightness of the display and of the ambient environment. For example, if the display is much brighter than the ambient environment (such as when the display is bright and is being viewed in relative darkness), the eyewear can be disposed to shade the region of the user's field of view (FOV) occupied by the display, and not those areas of the user's FOV occupied by other, less bright, objects. For another example, if the display is less bright than the ambient environment (such as when the display is not especially bright, when the ambient environment is quite bright, or when the ambient environment is substantially brighter than the display), the eyewear can be disposed to shade the regions of the user's FOV surrounding the display, rather than the display itself, so as to allow the display to be viewed even in the bright ambient environment.
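
The brightness comparison described above can be sketched as a simple decision rule; the 3x contrast threshold and the nit-based interface are illustrative assumptions:

    def shade_plan(display_nits, ambient_nits):
        """Decide what to shade when viewing a display (sketch only)."""
        RATIO = 3.0  # assumed contrast threshold, not a specified value
        if display_nits > RATIO * ambient_nits:
            return "shade display region only"        # bright screen in the dark
        if ambient_nits > RATIO * display_nits:
            return "shade surroundings, not display"  # dim screen in bright light
        return "no shading needed"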

[224] For another example, when the user is piloting a vehicle (such as an aircraft, a racing car, a sailboat or speedboat, or another controllable moving object), the eyewear can be disposed to detect the locations of the displays associated with that vehicle. For example, the eyewear can be disposed specifically for use with that vehicle, or the eyewear can be disposed to receive information from that vehicle when the user enters the vehicle. When the eyewear detects the locations of the displays associated with that vehicle, the eyewear can determine which displays are excessively bright or are over-brightly lit by the ambient environment. In such cases, the eyewear can be disposed to shade exactly those regions in the user’s field of view (FOV), or those regions in the user’s FOV and also depth of focus, where those displays are located with respect to the user’s position when piloting the vehicle. Similarly, the eyewear can be disposed to shade exactly those regions in the user’s FOV, or those regions in the user’s FOV and depth of focus, where those displays are located with respect to the user’s position when co-piloting the vehicle.

[225] In such cases, the eyewear can be disposed to detect the particular type of vehicle in response to a signal from the vehicle, such as an electromagnetic signal when the user opens or closes a door to the vehicle, or when the user buckles a harness in the vehicle, or triggers an engine for the vehicle, or otherwise indicates their readiness to pilot the vehicle. In other such cases, the eyewear can be disposed to operate with respect to a particular type of vehicle; the eyewear can be pre-loaded with information about the vehicle, including positions of the vehicle's displays when in operation. The eyewear can be pre-loaded by one or more of: (A) being designed for use with a particular vehicle; (B) having information about distinct types of vehicle and setting itself for use with one type of vehicle in response to one or more signals indicating the user's starting to pilot that type of vehicle; (C) having information about locations of displays for distinct types of vehicle and setting itself for use with one type of vehicle in response to identifying displays associated with one type of vehicle; or (D) otherwise identifying locations of displays in response to information about the vehicle.

(Viewing an object)

[226] When the signal indicates that the wearer is about to view an object, such as when the wearer is moving in a theme-park ride or other entertainment attraction, and the object is about to enter the wearer's field of view (FOV), the eyewear 100 can adjust the wearer's FOV to augment the wearer's vision to inject an image at the location of the object. For example, the image injected at the location of the object can replace the wearer's view of the object with a different object. This can have the effect that the viewable entertainment attraction can be replaced with a different attraction without substantial physical change. For another example, the image injected at the location of the object can augment the wearer's view of the object with an additional texture, such as a seasonal decoration. This can have the effect that the viewable entertainment attraction can be modified in response to a time of day, day of the week, or season of the year, without substantial physical change.

[227] For another example, the signal can indicate that the wearer is about to view an object, such as when the wearer is moving in a store, shopping mall, or other commercial area, and such as when the object is a product (or a display with respect to a service) in which the wearer might be interested. In such cases, the signal with respect to the object can be emitted by a transmitter on or near the object, or can be received with respect to a general location detector such as a GPS device.

[228] When the signal indicates that a product (or a display with respect to a service), in which the wearer might be interested, is about to enter the wearer's field of view (FOV), the eyewear 100 can adjust the wearer's FOV to augment the wearer's vision to inject an image at or near the location of the object. For example, the image can include (A) information about the product or service, such as a price or sale price, a product specification, a comparison with another product, a set of multiple views of the object, a view of the object in another color or style, or otherwise as described herein; (B) information about customer reviews of the product or services, such as positive or negative reviews that have been deemed helpful by other customers, or otherwise as described herein; (C) information about example uses, other products or services that can be used together, other products or services that have been purchased together, or otherwise as described herein; (D) an advertisement, such as one targeted to the wearer or related to topics in which the wearer is interested. In such cases, the eyewear 100 can adjust the wearer's FOV at such times when the wearer is directing their gaze or focus at the object itself, rather than the generalized area in which the object can be seen.

[229] When the signal indicates that a product (or a display with respect to a service), in which the wearer might be interested, is being viewed by the wearer, the eyewear 100 can adjust the wearer's view of the object to augment the wearer's vision in response to input from the wearer. For example, the wearer can indicate a particular use in which the wearer is interested, in response to which the eyewear 100 can adjust the wearer's view of the object to show the object in the context of that particular use. For example, when the wearer is viewing a product or service for which ordinary store lighting is not best suited, the eyewear 100 can adjust the wearer's view of the object to show the context in which the wearer intends to use the object.

[230] Examples can include:

— when the object includes sportswear or swimwear, or similar clothing, the eyewear 100 can adjust the wearer’s view to show how the object might look in full sunlight at a beach or pool, or otherwise as described herein;

— when the object includes club wear or party clothing, or similar clothing, the eyewear 100 can adjust the wearer’s view to show how the object might look in a bar or club, a party environment, or otherwise as described herein;

— when the object includes makeup or other beauty accessories, the eyewear 100 can adjust the wearer’s view to show how the object might look in a context with respect to one or more intended uses, such as in daytime or nighttime, indoors or outdoors, in bright or dark environments, or otherwise as described herein.

Augmented reality and virtual reality

[231] In one embodiment, an example eyewear 100 can enhance the wearer's vision using augmented reality or virtual reality. For example, the eyewear 100 can be disposed to provide one or more images in lieu of, or in addition to, images that would otherwise be available to the wearer's eye from the ambient environment. This can include one or more of:

— shading/inverse-shading light from the ambient environment incoming to the wearer's eye and replacing that incoming light with other light so as to form an alternative image not otherwise available in the wearer's field of view, and replacing the image available to the wearer with an altered field of view;

— overlaying additional light on light from the ambient environment incoming to the wearer's eye, while also possibly shading/inverse-shading light from the ambient environment so as to form an additional image not otherwise available in the wearer's field of view, while still allowing the wearer to see their otherwise-unaltered field of view.

[232] For example, when presenting an augmented reality or virtual reality view to the wearer, the eyewear 100 can provide one or more of: text, still pictures, moving pictures, lines (such as scrimmage lines or trajectories in sporting events, whether the wearer is a participant or a spectator), highlighting or outlines of objects (such as the wearer's friends, or such as suspects or weapons when the wearer is a law enforcement officer), information presented in false-coloring or isobars, or otherwise as described herein.

[233] For another example, when presenting an augmented reality or virtual reality view to the wearer, the eyewear 100 can provide one or more of: an image of an object or person not otherwise present in the wearer's field of view, such as an object being presented to the wearer as an advertisement or for an informational purpose, an object in a game, an object being presented to the wearer using remote communication; or such as a person with whom the wearer is interacting or communicating, or whom the wearer is observing or searching for.

[234] For another example, the wearer can control or invoke augmented reality or virtual reality functions, such as

— to add or remove additional images or information with respect to the wearer’s field of view;

— to adjust the use of augmented reality or virtual reality with respect to the wearer’s field of view;

— to provide selected enhancements to the wearer’s field of view that the wearer might desire (such as to improve audio/ visual acuity, to treat audio/ visual disorders such as glare or migraines, to treat problematic effects of the ambient environment such as inadequate light or contrast, to preserve the wearer’s night vision, or otherwise to correct or enhance the wearer’s audio/visual senses);

— to select between unaltered-reality images and augmented reality or virtual reality images; or otherwise as described herein.

[235] For another example, the wearer can control or invoke augmented reality or virtual reality functions to shade/inverse-shade an amount of incoming light from the ambient environment, such as to mitigate glare or excessively bright lighting, to preserve the wearer’s night vision, or to inject selected electromagnetic frequencies (such as amber or green light for a calming effect, or such as blue light for a stimulating or waking effect), or otherwise as described herein.

[236] For another example, the wearer can trigger augmented reality or virtual reality features, such as by proximity to a selected object or by looking in a selected direction. Proximity to a selected object can include approaching an object in a store that is for sale, whereupon the object can trigger the augmented reality or virtual reality feature to present an advertisement or other information to the wearer. Similarly, looking in a selected direction can include looking at a menu in a restaurant whereupon the menu can trigger the augmented reality or virtual reality feature to present a description or a picture of the selected menu item, or looking at an object associated with an audio/video presentation to present that presentation, or to present an augmented reality or virtual reality presentation associated with that object.

Medical parameters

[237] In one embodiment, an example eyewear 100 can be responsive to medical conditions of the wearer, such as whether the wearer is subject to allergies, “dry eyes” and related conditions, migraines/photophobia or related conditions, sleep deprivation, epilepsy or other seizure concerns, being under the influence of alcohol or other substances, or otherwise as described herein.

[238] For example, the eyewear 100 can determine whether the wearer is subject to allergies in response to whether there is any mucus buildup on the wearer’s eyes or tear ducts, or other parameters with respect to allergies.

[239] For another example, the eyewear 100 can determine whether the wearer is subject to “dry eyes” in response to whether the wearer exhibits red sclera (such as from display of blood vessels at the sclera), short tear film breakup time, thin tear films, or other parameters with respect to dry eyes, and features described with respect to the Incorporated Disclosures, particularly including Application 16/138,941, filed Sept. 21, 2018, naming the same inventor, titled “Digital eyewear procedures related to dry eyes”, Attorney Docket No. 6301, currently pending.

[240] For another example, the eyewear 100 can determine whether the wearer is subject to migraines/photophobia or related conditions in response to features described with respect to the Incorporated Disclosures, particularly including Application 15/942,951, filed Apr. 2, 2018, naming the same inventor, titled “Digital Eyewear System and Method for the Treatment and Prevention of Migraines and Photophobia”, Attorney Docket No. 6021, currently pending.

[241] For another example, the eyewear 100 can determine whether the wearer is subject to epilepsy or other seizure concerns, stroke or transient ischemia, traumatic brain injury (TBI), or being under the influence of alcohol or other substances, in response to the wearer’s eye activity, such as pupil or iris size, blink rate, eye twitching or nystagmus, saccade rates and distances, eye rotation, other measurable features of the wearer’s eye activity or facial activity, or otherwise as described herein. The eyewear 100 can determine the actual values of these or other measures, a comparison with a baseline “normal” rate for the wearer or for ordinary patients, a comparison with a baseline “normal” rate for the wearer under ordinary conditions (such as with respect to blink rate and related measures), or otherwise as described herein. The eyewear 100 can also determine first and other derivatives of those values, first-order and other statistical measures of those values, correlations of pairs of those values, medical information with respect to those values, or otherwise as described herein.

[242] For another example, the eyewear 100 can determine medical parameters with respect to the wearer’s retina, such as whether the wearer’s rods or cones are activated; whether the wearer’s eyes are operating in photopic, mesopic, or scotopic modes; a measure of activity of the wearer’s fovea; or otherwise as described herein.

[243] In one embodiment, the eyewear 100 can, with respect to one or more medical conditions, attempt to predict those medical conditions, prevent those medical conditions, diagnose those medical conditions (such as when they are beginning, occurring, or ending), monitor those medical conditions (as they begin, proceed, finish, end, or recur), treat those medical conditions (possibly with the assistance of the wearer), or otherwise as described herein.

[244] For example, the eyewear 100 can perform prediction, prevention, diagnosis, treatment, or otherwise as described herein, using one or more artificial intelligence (AI) or machine learning (ML) techniques, such as those described with respect to the Incorporated Disclosures, particularly including Application 15/942,951, filed Apr. 2, 2018, naming the same inventor, titled “Digital Eyewear System and Method for the Treatment and Prevention of Migraines and Photophobia”, Attorney Docket No. 6021, currently pending. In such cases, the eyewear 100 can perform prediction, prevention, diagnosis, treatment, or otherwise as described herein, with respect to medical conditions other than migraines or photophobia; for example, the eyewear 100 can perform these functions with respect to ADD or ADHD, Alzheimer’s disease, autism spectrum disorder, bipolar disorder, cancer, cardiovascular risk, dementia, depression, “dry eyes”, epilepsy or seizure disorders, eye fasciculations, hallucinations, Parkinson’s disease, PTSD, schizophrenia, sleep disorders or circadian disorders (including “night shift” and “jet lag”), stroke or transient ischemia, traumatic brain injury (TBI), other medical conditions, or otherwise as described herein.

[245] In such cases, the eyewear 100 can obtain, such as from a medical database or other remote source, a set of high-resolution longitudinal data with respect to a relatively large population. The high-resolution data can be used to generate an AI or ML model that the computing device 121 can apply to relatively low-resolution data obtained from the eyewear 100. The computing device 121 can apply the AI or ML model to the relatively low-resolution data obtained from the eyewear 100, so as to provide an in-the-field on-the-fly diagnosis with respect to the wearer.
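
The following is an illustrative sketch of this two-stage arrangement, not the claimed method: a classifier derived offline from high-resolution population data (here a trivial nearest-centroid stand-in for a generated AI or ML model) is applied on-device to low-resolution features from the eyewear’s sensors. All feature names and values are synthetic.

```python
import math

# Offline: high-resolution population data (synthetic placeholder values);
# columns: blink rate (blinks/min), pupil diameter (mm), saccade rate (1/s).
POPULATION = [((12.0, 3.1, 2.4), 0), ((28.0, 5.2, 0.9), 1),
              ((14.0, 3.3, 2.1), 0), ((31.0, 5.6, 0.7), 1)]

def centroid(label):
    rows = [x for x, y in POPULATION if y == label]
    return tuple(sum(col) / len(rows) for col in zip(*rows))

CENTROIDS = {label: centroid(label) for label in (0, 1)}  # 1 = condition present

def risk_score(sample):
    """Nearest-centroid stand-in for the generated AI/ML model."""
    d = {label: math.dist(sample, c) for label, c in CENTROIDS.items()}
    return d[0] / (d[0] + d[1])       # nearer the condition centroid -> higher

# On-device: one low-resolution sample from the eyewear's own sensors.
print(round(risk_score((26.0, 5.0, 1.0)), 2))   # -> 0.79, suggests follow-up
```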

[246] For another example, the eyewear 100 can perform prediction, prevention, diagnosis, treatment, or otherwise as described herein, using one or more AI or ML techniques, such as those described with respect to the Incorporated Disclosures, particularly including Application 16/264,553, filed Jan. 31, 2019, naming inventor Scott LEWIS, titled “Digital eyewear integrated with medical and other services”, Attorney Docket No. 6041, currently pending.

User feedback

[247] In one embodiment, an example eyewear 100 can include glasses 110 disposed for use by the wearer (not shown) and can be responsive to user input. User input can provide information to the computing device 121, such as an indication that the user is attempting a particular viewing activity, an indication that the user accepts/rejects a selected prescription for a particular gaze direction, or a command directing the computing device to perform a selected action. For example, user input can include one or more of:

— eye activity, such as possibly including eye gestures, facial gestures;

— manual activity, such as possibly including manual gestures, touch controls;

— external device activity, such as possibly including external screens, mobile devices, smartphones, smart watches, or computing devices (such as mice or keyboards, trackpads or computer styluses, or capacitive touch devices);

— other bodily activity, such as voice control, or possibly measurable by a wearable or implantable device; or otherwise as described herein.

(Eye gestures)

[248] In one embodiment, eye gestures can include one or more of: blinking one or more times, blinking rapidly with respect to an ordinary blink rate, glancing in a particular direction (such as glancing up/down, right/left, or doing so repeatedly), squinting one or more times, squinting rapidly with respect to an ordinary squint rate, or otherwise as described herein. Facial gestures can include movement of the ears (such as wiggling the ears), eyebrows (such as raising or lowering one or more eyebrows), mouth (such as opening or closing the mouth), teeth, tongue (such as touching controls coupled to the teeth), use of the wearer’s voice, or other facial features, or otherwise as described herein. Eye gestures and other movements can also include deliberately looking at particular objects, such as directing one’s eyes at a camera, scope, target, bar code or QR code, menu item or purchasable object, or another identifiable location in the user’s field of view or in three-dimensional space that can be given a particular eye gesture meaning.

[249] In one embodiment, one or more eye gestures or movements can be combined. Eye gestures or movements can also be supplemented with one or more other gestures or movements, such as facial or mouth gestures or other movements (as described above), head gestures or other movements, hand/finger gestures or other movements, or other bodily gestures or movements. For example, as described herein, the user can move their face, mouth, or head in defined ways that can be given a particular meaning. The user can also (as otherwise and further described herein) move their hands/fingers or body in gestures or other movements within the user’s field of view or in the field of view of a camera, so as to indicate a particular meaning.

(“Shade where you look”)

[250] As described herein, this can have the effect that the eyewear 100 is disposed to “shade where the user is looking”. This can be applied to inverse shading as well. When the user adjusts the direction where they are looking or adjusts the depth of field at which they are looking, whether by moving their eyes or head, tilting their head or moving their facial direction, squinting, or otherwise altering their field of view (FOV), such as by an external force (such as a centrifugal/centripetal force that pushes or turns the user’s body), the eyewear 100 can, notwithstanding that alteration, shade or inverse-shade where the user looks, and if so desired, only where the user looks.

[251] As described herein, techniques applicable to shading/inverse-shading where the user is looking are also applicable to illuminating where the user is looking.

[252] For example, the user might be in a vehicle, such as when operating or a passenger in an aircraft, racecar, or sailboat or speedboat, or the user might be looking inside the vehicle (such as when looking at a dashboard or instrument), or the user might be looking outside the vehicle (such as when looking at an external object, sky, or terrain). In such cases, the eyewear can shade where the user is looking, notwithstanding the user’s head or eye movement or the vehicle’s movement. Similarly, the eyewear can shade where the user is looking, notwithstanding, when the vehicle applies a force to the user’s head or body, or due to any other movement that might affect where the user is looking. For example, when the vehicle accelerates/decelerates, or turns, the user might be subject to g-forces that cause their head to move or turn, thus altering their gaze direction or possibly the depth of their focal length. The eyewear can include one or more sensors, such as accelerometers or similar devices, so as to determine when the user’s view is altered by external forces rather than the user deliberately chang-ing their gaze direction or depth of focal length.
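
A minimal sketch of how such a sensor might be used, with hypothetical thresholds: a gaze-direction change that coincides with a strong lateral acceleration is treated as externally forced, so the shading target is not re-aimed.

```python
import math

G = 9.81                       # m/s^2
FORCE_THRESHOLD_G = 0.4        # hypothetical: above this, treat as external force

def gaze_change_is_deliberate(accel_xyz, gaze_delta_deg):
    """accel_xyz: (ax, ay, az) in m/s^2 from the frame-mounted accelerometer;
    gaze_delta_deg: magnitude of the gaze-direction change since last sample."""
    lateral_g = math.sqrt(accel_xyz[0]**2 + accel_xyz[1]**2) / G
    if gaze_delta_deg > 0 and lateral_g > FORCE_THRESHOLD_G:
        return False           # likely pushed by the vehicle: keep prior target
    return True                # treat as deliberate: re-aim the shading

print(gaze_change_is_deliberate((3.9, 1.2, 9.8), 12.0))  # False: strong lateral g
print(gaze_change_is_deliberate((0.3, 0.1, 9.8), 12.0))  # True: quiet accelerometer
```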

[253] For another example, the user might be using a particular device, such as a welding torch, a glass blowing element, a firearm or fireworks, a “flashbang” grenade or other bright light disposed to non-lethally disable persons or animals, or as otherwise described herein. Similarly, the eyewear might be disposed to shade/inverse-shade with respect to a particular light source, such as one of the previously described devices, or such as one or more of the following:

— An external bright light: including the sun, a brightly-lit sky, a set of brightly-lit clouds, a set of floodlights in a stadium, a large-scale display in a stadium, or as otherwise described herein.

— An object presenting glare to the user: such as a body of water, a set of brightly-lit clouds, a metallic or otherwise shiny object, or as otherwise described herein.

[254] In one embodiment, the user’s eye can look in a selected direction with respect to an object being viewed. In such cases, the eyewear 100 can be disposed to “shade where the user is looking” (as otherwise and further described herein). More specifically, the eyewear 100 can be disposed to determine an object in the direction at which the user is looking and at a distance at which the user is focusing. In response to the selection of the object at which the user is looking, the eyewear 100 can be disposed to shade/inverse-shade that particular object. This can have the effect that when the object is subject to excessive light, the user can see the object more clearly when shaded.

[255] In such cases, the eyewear can be disposed to detect where the user is looking in response to one or more of: a dynamic eye tracking system, or one or more “outbound” cameras disposed to view the user’s field of view (FOV) from inside one or more lenses. For example, the dynamic eye tracking system can be disposed to determine in what direction, and at what depth of focus, the user is looking. This can have the effect that the dynamic eye tracking system can determine a location in three-dimensional (3D) space at which the user is looking. For another example, the outbound camera can be disposed to examine the user’s FOV from inside one or more of the lenses. Either of these techniques can have the effect that when the user moves their head or otherwise alters their FOV, the eyewear can adjust the 3D location that is shaded. More precisely, the eyewear can adjust a location on each lens so that the joint focus of the user’s eyes at that 3D location is shaded.
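
The per-lens pixel selection can be illustrated with simple ray geometry. The sketch below assumes a flat lens plane at a fixed distance from each eye, with hypothetical eye positions and pixel pitch; it is illustrative only.

```python
LENS_DISTANCE_M = 0.014          # assumed eye-to-lens distance
PIXEL_PITCH_M = 0.0002           # assumed size of one addressable lens pixel
EYES = {"left": (-0.031, 0.0, 0.0), "right": (0.031, 0.0, 0.0)}  # IPD ~62 mm

def lens_pixel_for(eye, gaze_point):
    """Return (col, row) of the lens pixel the eye-to-point ray crosses."""
    ex, ey, ez = EYES[eye]
    px, py, pz = gaze_point
    t = LENS_DISTANCE_M / (pz - ez)            # ray parameter at the lens plane
    x_on_lens = (px - ex) * t
    y_on_lens = (py - ey) * t
    return round(x_on_lens / PIXEL_PITCH_M), round(y_on_lens / PIXEL_PITCH_M)

# Shade the same 3D point on both lenses (generally different pixels per lens).
point = (0.5, 0.1, 2.0)                        # bright object 2 m ahead, to the right
for eye in ("left", "right"):
    print(eye, lens_pixel_for(eye, point))     # left (19, 4), right (16, 4)
```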

[256] The dynamic eye tracking system can be disposed to determine in what direction, and at what focal length, the user is looking. This can have the effect of identifying a particular object, at a particular location in three-dimensional (3D) space, at which the user is looking. Similarly, the outward-facing camera can be disposed to examine the user’s field of view (FOV) from inside one or more of the lenses.

[257] This can have the effect that the eyewear shades “where the user is looking”. When the user adjusts the direction in which they are looking, adjusts the depth of field at which they are looking, tilts their head, squints, or otherwise moves due to an external force, the eyewear can shade or inverse-shade where the user looks, and if so desired, only where the user looks. For example, the user might be in a vehicle, such as an aircraft, racecar, or sailboat or speedboat, and might be looking at a dashboard or instrument, or at an external object, external sky, or external terrain. The eyewear can shade where the user is looking, notwithstanding the user’s head or eye movement, the vehicle’s movement, or other movement that might affect where the user is looking.

[258] Similarly, this can have the effect that the eyewear 100 can be disposed to shade where the user is looking even when the user moves their head or gaze direction. Thus, the eyewear 100 can maintain that shading when the user is in a moving vehicle (so that objects change direction relative to the user as the vehicle moves), when the vehicle changes direction, when the vehicle applies a force to the user’s head or body, or during any other movement that might affect where the user is looking.

[259] When the user moves their head or otherwise alters their field of view (FOV), the eyewear 100 can be disposed to adjust the three-dimensional (3D) location that it shades. More specifically, the eyewear 100 can identify one or more lens pixels, associated with that three-dimensional (3D) location as perceived within the user’s FOV on one or more of the lenses, that should be shaded so as to shade the particular object at which the user is looking. When the eyewear 100 determines that lens pixels on both lenses should be shaded, the eyewear 100 can be disposed to select one or more lens pixels for each lens (although not necessarily the same lens pixels on each lens), so as to cause the joint focus with respect to the user’s eyes to be shaded at that 3D location.

[260] For example, when the user is operating a vehicle, such as an aircraft, racecar, sailboat or speedboat, or another type of vehicle, it might frequently occur that the user directs their gaze to different locations, either inside the vehicle, outside the vehicle, or between inside and outside the vehicle. The user might find themselves changing their direction of view, and thus their field of view (FOV), between two distinct instruments within the vehicle, or between an instrument within the vehicle and an object external to the vehicle (such as another vehicle, an airport or airstrip, a buoy or other marker, or a set of clouds or terrain), or between two objects outside the vehicle (such as between cloud cover and ground terrain, or such as between a moving vehicle and a stationary object). In such cases, the light environment can be quite different between inside the vehicle and outside the vehicle, and the user might desire a distinct degree of shading for each such light environment.

[261] This can have the effect that the eyewear 100 can be disposed to provide the user with an enhanced degree of perceptual acuity in the direction where the user is looking. When the user changes their gaze direction to look at a different object, the eyewear 100 can be disposed to provide the user with enhanced visual acuity with respect to the object at which the user is newly looking. This can have the effect that the user is provided enhanced visual acuity in all directions (since the user can look in any direction at any time) without having to enhance the user’s view in all directions at once.

Inside the vehicle

[262] The light environment inside the vehicle might be relatively dark (such as when shaded by parts of the vehicle), relatively bright (such as when the sun directly shines on the instruments or control elements), or subject to another effect (such as when the objects the user is looking at are lit by vehicle cabin lights, or otherwise as described herein). Thus, the amount of ambient light can be substantially different depending on where the user is looking.

[263] When the user changes their direction of view between two distinct instruments, it might occur that one instrument is shaded, while another instrument is lit by external (or internal) lighting. This can have the effect that the amount of shading desired to optimize the user’s view of the instrument depends on the selected instrument and on the relative brightness of the instrument with respect to an amount of light from the ambient environment.

[264] In one embodiment, the eyewear can be disposed with locations of particular instruments or control elements pre-selected. Each individual eyewear can be pre-loaded with locations for instruments or control elements for a particular aircraft or other vehicle, so that it is not necessary for the eyewear to determine which type of aircraft or other vehicle is being flown. This can have the effect that the user can switch between looking at selected instruments or control elements without the eyewear having to determine which instruments or control elements the user is actually looking at or focusing upon. This can have the effect that the user can look at the approximate area where their selected instrument or control element is found, and that the eyewear need not attempt to determine whether the user is actually looking at that selected instrument or control element.

[265] Similarly, it might be useful to inverse-shade when the user is looking at an object within the vehicle, such as an instrument or control element. For example, when the user looks at an instrument that is relatively dark in a relatively bright environment, the eyewear 100 can shade regions surrounding (or otherwise near) the object so as to allow the object to appear brighter and more readable with respect to its surroundings.
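
A minimal sketch of the pre-loaded approach, with a hypothetical cockpit layout: regions are stored per vehicle type as angular zones, so only an approximate gaze direction is needed to select the instrument region to shade or inverse-shade.

```python
# Angles are (azimuth, elevation) in degrees relative to straight ahead.
PRELOADED_REGIONS = {                         # hypothetical layout for one aircraft
    "altimeter": {"center": (-15, -20), "radius": 6},
    "airspeed":  {"center": (-25, -20), "radius": 6},
    "yoke":      {"center": (0, -35),   "radius": 10},
}

def region_for_gaze(azimuth, elevation):
    """Return the pre-loaded region containing the approximate gaze direction."""
    for name, r in PRELOADED_REGIONS.items():
        cx, cy = r["center"]
        if (azimuth - cx)**2 + (elevation - cy)**2 <= r["radius"]**2:
            return name
    return None                               # gaze is outside the pre-loaded regions

# An approximate gaze near the altimeter matches without object recognition.
print(region_for_gaze(-13, -18))              # -> "altimeter"
```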

Inside/outside the vehicle

[266] When the user changes their direction of view between a direction internal to the vehicle and a direction external to the vehicle, it might occur that an instrument internal to the vehicle is lit differently (either more or less brightly) than an object external to the vehicle. For just one example, an instrument internal to the vehicle might be relatively shaded, while a set of clouds or terrain might be brightly lit by the sun.

[267] In such cases, the eyewear can be disposed to shade a different amount in response to a relative change in brightness between a first and a second direction in which the user looks. When the user looks from a relatively bright instrument to a relatively dark external scene, the eyewear can reduce an amount of shading, so as to allow the user to see the darker scene clearly.
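
A minimal sketch of that adjustment, with hypothetical units and thresholds: the shading level follows the measured brightness of the current gaze target, so a transition from a bright instrument to a dark scene lowers the shading.

```python
MAX_COMFORT_LUX = 2000.0                 # hypothetical comfort ceiling

def shading_for(target_lux):
    """Return shading in [0, 1]: 0 = clear, 1 = fully shaded."""
    excess = max(0.0, target_lux - MAX_COMFORT_LUX)
    return min(1.0, excess / (10 * MAX_COMFORT_LUX))

print(shading_for(50_000))               # sunlit clouds -> 1.0 (fully shaded)
print(shading_for(800))                  # shaded instrument -> 0.0 (clear)
```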

Outside the vehicle

[268] The light environment outside the vehicle can be quite bright with respect to some objects, such as when the sun is reflected from cloud cover, when the objects the user is looking at are brightly lit by the sun or by other vehicle lights, when the user’s vision is possibly dazzled by backlighting from the sun or by other vehicle lights, or by other visual effects. Alternatively, the light environment outside the vehicle can be significantly less bright with respect to other objects, such as when the objects the user is looking at are less brightly lit, are in shade, or are not substantial reflectors (such as certain types of ground terrain).

[269] Accordingly, it might be useful to polarize or shade when the user is looking at bright objects. The eyewear 100 can be disposed to determine whether the user is looking at bright objects in response to both the direction at which the user is looking and the distance at which the user is focusing. For example, a dynamic eye tracking system can be disposed to determine a direction of each of the user’s eyes and to determine a focal length in response to a pupil size or a stereoscopic distance. The dynamic eye tracking system can also be disposed to determine when the user moves or tilts their head, or otherwise alters their gaze direction by a bodily movement.
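
The stereoscopic-distance determination can be sketched from vergence geometry: the angle between the two eyes’ gaze directions, together with the interpupillary distance, yields the focal distance. The sign convention and IPD value below are assumptions.

```python
import math

IPD_M = 0.062                           # interpupillary distance (typical value)

def focal_distance(left_azimuth_deg, right_azimuth_deg):
    """Azimuths are signed gaze angles from straight ahead; positive
    left-minus-right means the eyes converge on a nearby point."""
    convergence = math.radians(left_azimuth_deg - right_azimuth_deg)
    if convergence <= 0:
        return float("inf")             # parallel gaze: focused at far distance
    return (IPD_M / 2) / math.tan(convergence / 2)

print(round(focal_distance(1.8, -1.8), 2))   # ~0.99 m: near fixation
print(focal_distance(0.0, 0.0))              # inf: distant fixation
```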

In general

[270] In an environment in which there is a substantial amount of excessive lighting from one or more sources, it can matter (A) whether any particular light source exceeds an amount of ambient light, and if so, by how much; (B) whether the user is looking in the direction of, or focusing on, any particular light source, and if so, how directly; and (C) whether the object the user is looking at is bright or not, has contrast or not, is reflective or not, or other factors that might have an effect on the user’s eyesight. In such cases, it can be desirable to adjust an amount of shading in response to lighting conditions and in response to the nature of the object at which the user is looking.
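
A minimal sketch combining factors (A), (B), and (C) into a single shading amount, with hypothetical weights:

```python
def shading_amount(source_lux, ambient_lux, gaze_offset_deg, object_reflective):
    a = max(0.0, (source_lux - ambient_lux) / max(ambient_lux, 1.0))  # factor (A)
    b = max(0.0, 1.0 - gaze_offset_deg / 30.0)       # factor (B): 0 beyond 30 deg
    c = 1.3 if object_reflective else 1.0            # factor (C): reflective boost
    return min(1.0, 0.1 * a * b * c)

# Gazing 5 degrees off a source 20x ambient, at a reflective object:
print(round(shading_amount(20_000, 1_000, 5.0, True), 2))   # -> 1.0
```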

[271] For example, one such environment can be when the user is controlling an aircraft. A pilot’s eyes might need to look at instruments within the aircraft, and those instruments might be positioned (A) in shadow, (B) where they reflect sunlight, (C) where they are illuminated by cabin lights, or some combination thereof. A pilot’s eyes might alternatively need to look at objects outside the aircraft, and those objects might be positioned (A) in shadow, such as under cloud cover, (B) where they reflect sunlight, such as when the cloud cover itself is brightly lit, (C) where they are backlit by sunlight, such as when transiting the sun or approaching from sunward, or some combination thereof.

[272] Accordingly, it can be desirable to adjust shading in response to whether the user is looking at an object outside the aircraft or whether the user is looking at an instrument inside the aircraft. The eyewear can be disposed to shade in response to (A) a direction at which the user is looking, or (B) a distance at which the user is focusing, such as in response to a dynamic eye tracking system, or (C) whether the user tilts their head or otherwise gestures in response to a change in attitude concurrent with looking inside or outside the aircraft.

(Manual activity)

[273] In one embodiment, the eyewear 100 can be disposed to allow the user to set a shading level by manual activity, such as by touching a control element, performing hand or finger gestures, or otherwise manipulating one or more control elements. For example, the eyewear 100 can be disposed to allow a user to set a shading level by one or more of the following:

— The user can set a shading level by moving a body part (such as a finger or the back of a hand) on a slider, such as a slider attached to a frame of the eyewear 100. For example, longer/shorter slides can indicate more/less shading.

— The user can set a shading level by tapping a body part (such as a finger or the back of a hand) repeatedly on a touchable element, such as a button or a capacitive sensing element. For example, more/fewer repeated taps can indicate more/less shading.

— The user can set a shading level by gesturing with a body part (such as a finger, the back or palm of a hand) near a touchable element, such as an outward-facing camera, a capacitive sensor, or an element having an electromagnetic field. For example, waving a hand more/fewer times can indicate more/less shading. In such cases, the user can indicate an end to the count of more/fewer waves using a secondary gesture, such as a closed fist, a movement in another direction, or otherwise as described herein.

[274] In such cases, the eyewear 100 can be responsive to user gestures either (A) inside the user’s field of view (FOV), such as in response to an outward-facing camera in the user’s FOV or in response to a dynamic eye tracking mechanism; or (B) outside the user’s FOV, such as in response to an outward-facing camera outside the user’s FOV.
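
A minimal sketch of the tap-counting control from the list above, with hypothetical timing and shading steps: taps within a short window accumulate, and the count is converted to a shading level once the window closes.

```python
import time

TAP_WINDOW_S = 1.0
SHADING_STEPS = [0.0, 0.25, 0.5, 0.75, 1.0]

class TapShadingControl:
    def __init__(self):
        self.count = 0
        self.last_tap = None

    def on_tap(self, now=None):
        now = time.monotonic() if now is None else now
        if self.last_tap is not None and now - self.last_tap > TAP_WINDOW_S:
            self.count = 0                     # previous burst went stale
        self.count += 1
        self.last_tap = now

    def committed_level(self, now=None):
        """Return a shading level once the tap window has closed, else None."""
        now = time.monotonic() if now is None else now
        if self.count and self.last_tap is not None and now - self.last_tap > TAP_WINDOW_S:
            level = SHADING_STEPS[min(self.count, len(SHADING_STEPS)) - 1]
            self.count = 0
            return level
        return None

ctl = TapShadingControl()
for t in (0.0, 0.3, 0.6):                      # three quick taps
    ctl.on_tap(now=t)
print(ctl.committed_level(now=2.0))            # -> 0.5 (three taps = mid shading)
```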

[275] For another example, manual activity can include hand gestures (possibly aided by a glove or other sensor), hand gestures conducted within the wearer’s field of view (FOV), other bodily movement within the wearer’s FOV (such as movement by the wearer’s wrist, arm, elbow, leg, knee, or otherwise as described herein). Manual activity can include touch controls 151 (such as on the eyewear 100 or on an external device). In such cases, the touch controls 151 can include one or more buttons, sliders, switches, or capacitive sensors, and can be mounted on or near the eyewear frame 111. Alternatively, touch controls 151 can be mounted on an external device, such as an external screen, a mobile device, a smartphone, a smart watch, another wearable, a control panel for another device (such as a computing device or a vehicle), or otherwise as described herein.

[276] In such cases, when a touch control 151 is mounted on a vehicle, it can be disposed on a steering wheel for a racing car, a tiller for a sailboat, a control yoke for a speedboat, a control stick for an aircraft, a set of ski poles when skiing or a set of buttons when snowboarding, a controller for a gaming system, or otherwise as described herein. The eyewear 100 can be disposed to allow the wearer to use a touch control or other control disposed on the steering wheel, control yoke, control stick, ski poles, snowboard buttons, gaming system controller, or otherwise as described herein. The eyewear 100 can also be disposed to allow the wearer to use an eye gesture, hand gesture, or other gesture, to control the eyewear 100 itself, such as for shading/inverse-shading, or to control the vehicle or gaming system, such as to increase or decrease speed, alter direction, or control other functions thereof. The eyewear 100 can also be disposed to use one or more artificial intelligence (AI) or machine learning (ML) techniques to identify circumstances when shading/inverse-shading is desirable for the wearer, or when the wearer is subject to a medical condition or other debilitating circumstance, such as “dry eyes”, migraine/photophobia or related conditions, epilepsy or seizures, or otherwise as described herein.

(Medical conditions)

[277] For another example, a wearable or implantable device can be disposed to measure a bodily function, such as heart rate, movement, walking distance, or otherwise as described herein. In such cases, the wearable or implantable device can use the measure of the bodily function to provide feedback to the eyewear 100. Feedback to the eyewear 100 can indicate that the wearer is in medical distress or is otherwise subject to a medical condition, including whether the wearer is subject to a cardiac or stroke event, whether the wearer is subject to excessive stress, whether the wearer is subject to a migraine, whether the wearer is subject to a seizure, or otherwise as described herein. In such cases, the eyewear 100 can use the communication device 122 to alert emergency responders, medical personnel, search and rescue personnel, or volunteers who are nearby and able to help. Moreover, the eyewear 100 can be disposed to respond to medical conditions such as stress, migraine, or otherwise as described herein, by adjusting the wearer’s prescription to assist in treatment of eyestrain, headache or migraine, “dry eye” conditions, or otherwise as described herein.
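
A minimal sketch of that feedback path, with hypothetical vitals thresholds; the alert function is a stand-in for sending a message via the communication device 122.

```python
def assess(heart_rate_bpm, resting_bpm):
    """Hypothetical rule: flag a rate far outside the wearer's baseline."""
    if heart_rate_bpm > 2.0 * resting_bpm or heart_rate_bpm < 35:
        return "possible cardiac distress"
    return None

def alert_responders(message, location):
    # stand-in for sending an alert via the communication device 122
    print(f"ALERT -> responders: {message} at {location}")

condition = assess(heart_rate_bpm=148, resting_bpm=62)
if condition:
    alert_responders(condition, location=(37.77, -122.42))  # example coordinates
```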

(Voice commands)

[278] For another example, the eyewear 100 can be disposed to respond to the wearer’s voice commands, such as by using one or more artificial intelligence (AI) or machine learning (ML) techniques to recognize voice commands, parse those commands, and perform the actions requested by the wearer. In such cases, the eyewear 100 can be disposed to respond to a wakeup word, so as to only respond to voice commands when the wearer deliberately intends the eyewear 100 to respond, and not to respond to voice commands when the wearer is merely talking to another person (or themselves).
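
A minimal sketch of wakeup-word gating, assuming the speech has already been transcribed to text by a recognizer; the wake word and command set here are hypothetical.

```python
WAKE_WORD = "eyewear"                     # hypothetical wakeup word
COMMANDS = {"shade more": "+shade", "shade less": "-shade"}

def handle_utterance(text):
    """Ignore utterances that do not begin with the wakeup word."""
    words = text.lower().strip()
    if not words.startswith(WAKE_WORD):
        return None                       # ordinary conversation: do nothing
    command = words[len(WAKE_WORD):].strip()
    return COMMANDS.get(command)          # None for unrecognized commands

print(handle_utterance("Eyewear shade more"))     # -> "+shade"
print(handle_utterance("please shade more"))      # -> None (no wakeup word)
```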

(Gaze direction)

[279] In one embodiment, an example eyewear 100 can be responsive to the wearer’s gaze direction, so as to illuminate a location, an object, or a person, at which the wearer is looking. For example, the eyewear 100 can include sensors 123 including a gaze detector (not shown), disposed to determine a direction at which the wearer is directing their gaze, and a pupillometer (not shown), disposed to determine a size of the pupil and accordingly a focal length.

[280] In one embodiment, the gaze detector can be coupled to a lamp (not shown), disposed to illuminate in an outward direction at a region of the wearer’s field of view where the wearer is looking. For example, when the wearer moves their gaze across their field of view, the lamp can move an illumination effect with the wearer’s gaze direction. This can have the effect that the wearer’s field of view is illuminated where the wearer is looking, without the wearer having to move their hand (when holding a lamp) or their head (when wearing a lamp) to point the lamp toward an object of interest.

[281] Moreover, the lamp can be disposed to present its illumination effect in only the portion of the wearer’s field of view at which the wearer’s gaze is directed, such as to illuminate the location, object, or person, of interest to the wearer, without having to illuminate a larger region that includes the region of interest to the wearer.

[282] In one embodiment, the pupillometer, or another focal length detector, can be disposed to determine a distance at which the wearer is looking. This can have the effect that the eyewear 100 can determine a specific location, object, or person, of interest to the wearer, rather than a solid angle within the wearer’s field of view. For example, when a specific object of interest to the wearer is nearby, the lamp can be disposed to focus on that nearby object. This can have the effect that only that nearby object would be illuminated, not objects in which the wearer is not interested.

[283] In one embodiment, the illumination effect can be disposed (A) to enhance context sensitivity when viewing the object of interest, such as when the object has better contrast with respect to its background; (B) to enhance visual acuity when viewing the object of interest, such as when the object is subject to less visual blur or noise, motion blur, peripheral blur, or other effects that debilitate visual acuity; (C) to enhance visibility of a feature of the object, such as an edge thereof, a face thereof, writing on the object, or otherwise as described herein.

[284] In one embodiment, the illumination from the lamp can be polarized. This can have the effect that the illuminated object does not present glare to the wearer, even when the object is highly reflective or otherwise shiny. In another embodiment, the illumination from the lamp can be a blinking light or a strobe light. This can have the effect that the wearer can view the object of interest without debilitating their night vision, or while identifying the object to another viewer.

[285] In one embodiment, the illumination from the lamp can include a color effect, such as having a color distinct from the object or its background. For example, the illumination can emphasize the object by altering its color with respect to its background, or by altering the color of the background in the region of the object. For another example, the illumination can emphasize the object by altering the contrast of its color with respect to its background, or by altering the color contrast of the portion of the background in the region of the object.

[286] In one embodiment, the illumination from the lamp can include an augmented reality or virtual reality effect, such as a heads-up display (HUD) in which the object of interest is highlighted, or such as a virtual reality pointer directed at the object of interest.

[287] In one embodiment, the lamp can be directed at the wearer’s eye, such as at the wearer’s pupil or retina. This can have the effect of adjusting the wearer’s pupil or retina, such as to cause the wearer to see the object of interest more brightly or more darkly. For example, the lamp can be directed at the wearer’s pupil, such as to cause the pupil to contract and the object to be darkened. This can have the effect of emphasizing the object when otherwise presented against a brightly lit background. For another example, the lamp can be directed at another portion of the wearer’s eye, such as to cause the pupil to expand and the object to be brightened.

[288] In one embodiment, the lamp can be directed at the wearer’s eye, such as at the wearer’s pupil or retina, with the purpose of activating a particular mode of the wearer’s vision. For example, the wearer’s vision can be activated in a mesopic, photopic, or scotopic mode. In another embodiment, the lamp can be directed at the wearer’s eye, such as at the wearer’s pupil or retina, with the purpose of adjusting the size of the wearer’s pupil (A) to ameliorate visual aberration, such as when visual aberration occurs after LASIK surgery or other eye surgery, or (B) to promote night adaptation of the wearer’s vision, such as by adjusting the wearer’s pupil to become narrower even when the wearer enters a darkened region.

[289] User feedback can also include combinations of multiple user inputs, such as multiple eye gestures, multiple manual inputs, multiple external device inputs, combinations of different types of user inputs, or otherwise as described herein. For example, combinations of eye gestures can include activity such as “blink twice and glance left”.
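
A minimal sketch of matching such combinations, with hypothetical gesture names and bindings: recent gestures are kept in a short history and compared against multi-gesture patterns such as “blink twice and glance left”.

```python
from collections import deque

BINDINGS = {
    ("blink", "blink", "glance_left"): "undo_last_shading",
    ("blink", "glance_up"):            "increase_shading",
}

history = deque(maxlen=3)               # keep only the most recent gestures

def on_gesture(gesture):
    """Append a detected gesture and fire any binding the history now matches."""
    history.append(gesture)
    for pattern, action in BINDINGS.items():
        if tuple(history)[-len(pattern):] == pattern:
            history.clear()
            return action
    return None

for g in ("blink", "blink", "glance_left"):
    action = on_gesture(g)
print(action)                            # -> "undo_last_shading"
```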

Action by eyewear

[290] In one embodiment, an example eyewear 100 can be disposed to correct vision or enhance vision on behalf of the wearer. The eyewear 100 can be disposed to alter refraction, polarization/shading, color, prismatic angles/functions, or otherwise as described herein.

[291] For example, the eyewear 100 can be disposed to correct or enhance the wearer’s vision by altering the amount of refraction (such as an optometry prescription) in response to factors described herein. The eyewear 100 can be disposed to alter the amount of refraction in response to the wearer’s gaze direction or focal length, or to whether the wearer’s field of view (FOV) includes a recognized object at a particular distance. In such cases, the eyewear 100 can be disposed to alter the amount of refraction to correct or enhance the wearer’s vision to optimize the wearer’s ability to clearly see at the particular distance or to clearly see the recognized object.

[292] As further described herein with respect to predictive techniques, such as artificial intelligence (AI) or machine learning (ML) techniques, the eyewear 100 can be disposed to alter the amount of refraction in response to a predicted distance at which the wearer is most likely to be focusing when their gaze direction intersects a particular lens region 131 or lens pixel 141 of the lenses 112. Having learned the wearer’s behavior, the eyewear 100 can be disposed to select the amount of refraction statically, thus, without regard to the nature of the objects or scene in the wearer’s field of view (FOV). Alternatively, having learned the wearer’s behavior, the eyewear 100 can be disposed to select the amount of refraction dynamically in response to a focus distance determined with respect to the wearer, such as by measurement of the wearer’s pupil or iris size, contraction, or widening.
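
A minimal sketch of the static and dynamic selections described above, using the usual 1/distance relationship between focus distance and accommodation demand in diopters; the learned per-region distances are hypothetical.

```python
LEARNED_FOCUS_M = {                  # learned, per lens region (hypothetical)
    "lower_center": 0.4,             # reading / smartphone distance
    "upper_center": 6.0,             # distant scene
}

def static_refraction_diopters(region):
    """Static selection: 1/distance for the learned distance, ignoring the scene."""
    return 1.0 / LEARNED_FOCUS_M[region]

def dynamic_refraction_diopters(measured_focus_m):
    """Dynamic selection: follow the focus distance measured from the eye."""
    return 1.0 / measured_focus_m

print(static_refraction_diopters("lower_center"))    # 2.5 D for near work
print(round(dynamic_refraction_diopters(1.25), 2))   # 0.8 D for a 1.25 m target
```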

[293] For another example, the eyewear 100 can be disposed to correct or enhance the wearer’s vision by altering the amount of polarization/shading of light entering the wearer’s eyes through the lenses 112. In such cases, the eyewear 100 can alter the amount of polarization/shading in a particular gaze direction to alleviate glare, can alter the amount of polarization/shading in a particular vision region to alleviate excess luminance or UV light, or otherwise as described herein. When the wearer changes their gaze direction or focal length so as to view an object with a different amount of brightness, the eyewear 100 can alter the amount of polarization/shading in response thereto, so as to match the amount of polarization/shading to the brightness of the object being viewed by the wearer. When the wearer exhibits features associated with medical conditions or other conditions, such as in response to blink rate, pupil or iris size, squinting, redness or showing blood vessels on the sclera, inadequate tear films or tear film breakup time, other eye features (or significant changes therein), or otherwise as described herein, the eyewear 100 can respond to those conditions by altering the amount of polarization/shading of light entering the wearer’s eyes through the lenses 112. Similarly, the eyewear can adjust infalling light so as to induce photopic, mesopic, or scotopic activity of the eye’s rods and cones.

[294] For another example, in addition to, or in lieu of, responding to the wearer’s eye activity, the eyewear 100 can be disposed to alter the amount of polarization/shading of light entering the wearer’s eyes in response to a prediction of an amount of infalling light likely to enter the wearer’s eyes. In such cases, the eyewear 100 can determine its prediction in response to one or more artificial intelligence (AI) or machine learning (ML) techniques, possibly in response to a direction the wearer is facing, a location the wearer is positioned, a time of day, a season of the year, a measure of ambient lighting or detection of a number of ambient artificial lights, or otherwise as described herein. When performing shading, the eyewear 100 can electronically control the lenses 112, such as particular lens regions 131 or lens pixels 141.
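
A minimal sketch of such a prediction, using only coarse context (hour of day and indoors/outdoors) in place of a trained model; a deployed version might substitute an AI or ML predictor as described. All constants are hypothetical.

```python
import math

def predicted_lux(hour, outdoors):
    """Crude prediction of infalling light; a real system might also use
    facing direction, location, season, or a trained model."""
    if not outdoors:
        return 300.0                               # typical indoor ambient
    sun = max(0.0, math.sin(math.pi * (hour - 6) / 12))  # 0 before 6h / after 18h
    return 100_000 * sun                           # ~full daylight near noon

def preemptive_shading(lux):
    return min(1.0, lux / 50_000)                  # hypothetical mapping to [0, 1]

print(round(preemptive_shading(predicted_lux(12, outdoors=True)), 2))   # 1.0
print(round(preemptive_shading(predicted_lux(12, outdoors=False)), 2))  # 0.01
```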

[295] For another example, the eyewear 100 can be disposed to correct or enhance the wearer’s vision by altering the amount of color filtering, color injection, false coloring, or color changes. In such cases, when the eyewear 100 determines that the amount of infalling ambient light is excessive, or is excessive for a particular color or in a particular frequency range, or is unbalanced with respect to color, the eyewear 100 can adjust the amount of filtering for that color or frequency range to limit the amount of infalling light to a reasonable amount. In such cases, the eyewear 100 can reduce an amount of blue just before sleep. Similarly, the eyewear 100 can also detect infalling ultraviolet (UV) light, absorb that UV, and inject a false color in lieu thereof, using one or more electromagnetic or photochromatic techniques. In such cases, the eyewear 100 can alter the color balance of infalling light so as to allow artists, such as graphic designers or web developers, to generate color schemes that are accurate when viewed in their intended environment.

[296] When the eyewear 100 determines that the amount of infalling light is inadequate for a particular color, or in the case of migraines, attempts to treat the migraine effect by injecting some amount of that color (such as green), the eyewear 100 can adjust the amount of filtering, or can directly inject that color into the wearer’s field of view (FOV), such as by using color LEDs to directly inject selected colors. For example, red LEDs can be used to inject red pixels, green LEDs can be used to inject green pixels, blue LEDs can be used to inject blue pixels, or white LEDs can be used to inject white pixels. When the amount of infalling light is inadequate for the wearer to clearly see color (such as when the wearer’s rods are activated but their cones are not), the eyewear 100 can provide a false-coloring of the FOV to show features of interest to the wearer, such as when the eyewear 100 is operated using or in lieu of “night vision” goggles that detect infrared (IR), or when false coloring is used with object recognition, or otherwise as described herein. Similarly, the eyewear 100 can alter the color balance of infalling light to prompt the wearer’s eye to operate in a photopic, mesopic, or scotopic mode.

[297] When the eyewear 100 determines that the wearer is subject to an inadequate blink rate, or an excessive blink rate, the eyewear 100 can adjust the amount of color at selected frequencies injected into the wearer’s field of view (FOV). For example, color injection can be used to control an amount of melatonin produced by the brain. Blue light decreases an amount of melatonin produced by the brain, which is why blue light can interfere with sleep. Melatonin causes the eye to decrease blink rate, so an excessive blink rate can be controlled by color injection, at least in part. For another example, color injection can be used to control an amount of dopamine produced by the brain. Blue light increases an amount of dopamine produced by the brain. Dopamine causes the eye to increase blink rate, so an inadequate blink rate can be controlled by color injection, at least in part.

[298] For another example, the eyewear 100 can be disposed to correct or enhance the wearer’s vision by altering the amount of prismatic angle imposed by the lenses 112. In such cases, when the wearer’s activity indicates that the wearer intends to look down, such as at a keyboard or smartphone, the eyewear 100 can be disposed to alter the amount of prismatic angle imposed by the lenses 112 so as to allow the wearer to see the keyboard or smartphone without any unnecessary head movement. Similarly, when the wearer’s activity indicates that the wearer intends to look up, such as at a screen, presentation, window, or distant object, the eyewear 100 can be disposed to alter the amount of prismatic angle imposed by the lenses 112 so as to allow the wearer to see that object without any unnecessary head movement.
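
The blink-rate relationship of paragraph [297] can be sketched as a simple feedback rule, with hypothetical rates and step size: blue injection is increased for an inadequate blink rate (more dopamine) and decreased for an excessive one (more melatonin).

```python
NORMAL_BLINKS_PER_MIN = (12, 20)          # assumed comfortable range

def adjust_blue_injection(current_blue, blinks_per_min):
    """current_blue is the injected blue level in [0, 1]."""
    low, high = NORMAL_BLINKS_PER_MIN
    if blinks_per_min < low:
        # more blue -> more dopamine -> higher blink rate (per paragraph [297])
        return min(1.0, current_blue + 0.1)
    if blinks_per_min > high:
        # less blue -> more melatonin -> lower blink rate (per paragraph [297])
        return max(0.0, current_blue - 0.1)
    return current_blue

print(round(adjust_blue_injection(0.2, 8), 2))    # -> 0.3 (raise inadequate rate)
print(round(adjust_blue_injection(0.2, 26), 2))   # -> 0.1 (damp excessive rate)
```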

Active color change by eyewear - frame

[299] In one embodiment, the eyewear 100 can be disposed to change a color of its frame 111 or a portion thereof, such as changing a color of its temples 111a (or a portion thereof, such as a relatively flat portion near the temple), its nosepiece 111b, or its lens holders (or a portion thereof, such as changing a first portion above the lenses separately from a second portion below the lenses). For example, the eyewear 100 can be disposed to change its frame 111 from a relatively clear color to a relatively opaque color, or the reverse, or from a relatively cool color (blue) to a relatively warm color (red or orange), or the reverse.

[300] Similarly, the eyewear 100 can be disposed to change a color of one or more lenses, or portions thereof. When the eyewear 100 includes a physical frame 111, the eyewear can be disposed to change a first portion of the frame nearer the lenses, or a second portion of the frame farther from the lenses. When the eyewear 100 includes a contact lens, the eyewear can be disposed to change a portion of the lenses themselves that does not intersect the user’s field of view, such as a portion that covers only the user’s iris. In the latter case, the eyewear 100 can be disposed to change color in response to a color of the user’s iris (possibly in addition to other factors), with the effect of altering an external view of the user’s iris.

[301] For another example, the color change can be responsive to a wearer condition; to a color determined in response to an electromagnetic signal, such as a signal from the computing device 121 or from a wearer input; to a color determined in response to an environmental condition; or otherwise as described herein. In such cases, the wearer condition can include a medical condition or another condition, such as whether the user is excessively tired, has high or low blood pressure, is intoxicated, or is about to be, or is currently, subject to migraine or photophobia. This can have the effect that other persons, such as medical personnel, emergency responders, nearby volunteers, or friends of the user can identify whether the user needs aid or assistance.

[302] For another example, the eyewear 100 can be disposed to change a color texture of its frame, thus from a relatively solid color to a non-solid color scheme. In such cases, the non-solid color scheme can include a spotted or striped color scheme including more than one color (or including one color with differing amounts of grey or saturation), a leopard-print or other animal-like scheme, another pattern (whether recognizable as a natural pattern or otherwise as described herein), a modern art pattern, a picture or moving picture mapped onto at least a portion of the frame, or another pattern to the liking of the user.

[303] For another example, the eyewear 100 can be disposed to change the color texture of its frame in response to time, such as a color texture that cycles through multiple patterns. In such cases, the color texture can cycle through multiple patterns in a rotating order, in a random or pseudorandom order, in an order responsive to a personal condition of the user (such as a medical condition, an emotional state, or another condition responsive to a user status), or in an order responsive to an ambient condition (such as an environmental condition, an attempt to match other objects, or another condition external to the user).

[304] For another example, the eyewear 100 can be disposed to change the color texture of its frame in response to an input message, such as a user command, an electromagnetic message from another device, or another message received from outside the eyewear. In such cases, the eyewear 100 can receive and decode the message (which might be encrypted and have to be decrypted), determine a color texture or sequence of color textures to provide, and alter its color texture or sequence of color textures in response thereto.
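
A minimal sketch of that message path, assuming a trivially framed plain-text payload (a real message might be encrypted and require decrypting first); the texture table is hypothetical.

```python
TEXTURES = {"calm": ["blue", "teal"], "alert": ["red", "orange", "red"]}

def on_message(raw_bytes):
    """Decode a received message and apply the requested texture sequence."""
    name = raw_bytes.decode("utf-8").strip()
    sequence = TEXTURES.get(name)
    if sequence is None:
        return                                   # unknown request: ignore it
    for color in sequence:                       # stand-in for driving the frame
        print("frame color ->", color)

on_message(b"alert\n")
```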

[305] In one embodiment, the eyewear’s one or more sensors 123 can be coupled to the computing device 121 and disposed to provide information thereto. The computing device 121 can be disposed to determine, in response to information from the sensors 123, whether the user is exhibiting a condition for which the eyewear 100 should change color, and if so, in what manner the eyewear should present a new color (or color pattern).

[306] For example, the computing device 121 can be disposed to receive information from the sensors 123 with respect to one or more of the following:

— An ambient environment condition, such as an allergenic condition, a pollution condition, a weather condition, or as otherwise described herein.

— A communication condition, such as a presence of one or more “friends” or relatives or other persons known to the user, a presence of one or more objects recognized by the user, a presence of one or more signals directed at the user or otherwise near the user.

— A user condition, such as a desired color or pattern for the eyewear, a desired message to communicate, an emergency condition, an emotional condition, a medical condition, or as otherwise described herein.

Color textures

[307] In addition to color change, the eyewear can be disposed to provide a color texture, which can include a combination of multiple colors. The combination of multiple colors can include a distinct color in each one of a plurality of regions, whether those regions cover the whole of the eyewear frame or the contact lens, or otherwise as described herein. Alternatively, the combination of multiple colors can include a distinct color in each one of a plurality of individual pixels, whether those pixels cover the whole of the eyewear frame or the contact lens. When pixels are used, each individual pixel might be substantially similar to its neighbors, but the whole of the eyewear frame or the contact lens can exhibit substantial color variation.

[308] For example, a color texture can include a color gradient, such as a gradient between a blue color and a red color; a color pattern, such as a set of orange polka-dots on a purple background, or a faux-snakeskin or other imitative pattern; a picture, such as a corporate logo or a religious symbol; or another combination in which more than one color is disposed on the eyewear frame or on a contact lens. In such cases, the color texture can be selected so as to present an image of a three-dimensional (3D) object, such as by presenting a different color texture to viewers in response to an angle at which the color texture is viewed.

[309] In one embodiment, the color texture can be disposed over a flat portion of an eyewear temple, thus, to the sides of the user’s head; over a non-pupillary portion of a contact lens, thus, to the sides of the user’s pupil; or otherwise in a location where the color texture does not disturb the user’s vision. Alternatively, the color texture can be disposed over a pupillary portion of a contact lens, so as to affect the color balance of the user’s vision, as otherwise and further described herein.

(Glitter and related concepts)

[310] For another example, the eyewear 100 can be disposed to change a color or color texture of its frame in a time-dependent manner, thus in one or more of the following:

— In a cyclical or pseudo-cyclical manner, such as in a repeating or near-repeating manner, such as where the color of the eyewear 100 or a portion thereof cycles between red-green-blue.

— In a random or pseudo-random manner, such as in response to a random or pseudo-random effect, such as where the color of the eyewear 100 or a portion thereof changes among a set of randomly selected colors.

— In a manner imitating or providing a fluorescent effect, such as in response to ultraviolet light, or in response to a selected audio/video frequency of light or sound, such as: emitting light in response to a musical or other audio input, emitting light in response to a selected electromagnetic frequency such as blue light, or as otherwise described herein.

— In a manner imitating a natural process, such as in response to a simulation of a natural artifact, such as: glitter or a molded material including glitter, a sparkling material, a pattern evincing gemstones, or as otherwise described herein.

— In a manner imitating a natural creature, such as in response to a pattern evincing one or more of: a butterfly, a chameleon, a hummingbird, a lightning bug, or as otherwise described herein.

Color variation

[311] The color change or color texture can also be disposed to vary in response to one or more factors: a passage of time, a random element, a change in a parameter with respect to the user, or another measurable feature that can be expressed as a color variation. A processor can be disposed to receive information with respect to one or more such measurable features, to determine a new color change or color texture in response thereto, and to direct the eyewear to adopt the new color change or color texture in response thereto.

[312] For example, the processor can, in response to a measure of time, vary the color change or color texture from a starting point to an ending point and back, or from a starting point in a loop back to the same starting point, or otherwise in a selected time-varying pattern, such as a color texture that cycles from a first to a second color and back to the first color, or a color texture that cycles through a selected color wheel. Similarly, the processor can vary the color change or color texture in response to a random element, in response to one or more objects in the user’s field of view, in response to one or more user parameters, such as the user’s skin temperature or the eyewear’s prediction of whether the user is about to be affected by a migraine.
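
A minimal sketch of a time-varying texture that cycles from a starting point around a color wheel and back to the same starting point; the period is hypothetical.

```python
import colorsys

def frame_color_at(t_seconds, period_s=10.0):
    """Map elapsed time to an RGB color around the hue wheel."""
    hue = (t_seconds % period_s) / period_s          # 0..1 around the wheel
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return int(r * 255), int(g * 255), int(b * 255)

for t in (0.0, 2.5, 5.0, 10.0):                      # 10.0 returns to the start
    print(t, frame_color_at(t))
```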

[313] For another example, the processor can be disposed to present a color texture including a moving picture, such as a picture displaying a movie of the user’s children or pets. In such cases, the moving picture can be presented on the side of the eyewear frame, on the iris portion of a contact lens, where the user can see the moving picture, or where persons other than the user can see the moving picture.

(Medical condition)

[314] For example, the wearer condition can include one or more of: a medical condition, such as the wearer suffering from a dry eye condition, a migraine/photophobia or a neurological condition in response thereto, or otherwise as described herein. In such cases, the wearer condition can be determined in response to a computing device processing outputs from sensors coupled to the wearer, from an input from the wearer, or otherwise as described herein.

[315] In one embodiment, when the wearer is undergoing a medical condition for which the wearer needs assistance, medical or otherwise as described herein, the eyewear 100 can be disposed to change color to alert the wearer, and nearby persons, about the medical condition. For example, when the wearer is undergoing a coronary attack or a stroke, it can be desirable to alert emergency responders and medical personnel, and it can be desirable for volunteers to assist the wearer in being transported to an ambulance or hospital with haste. In such cases, the eyewear 100 can change color to alert the wearer and nearby persons accordingly.

[316] In such cases, the eyewear 100 can emit an electromagnetic signal, thus changing “color” to a frequency outside normal human vision. The electromagnetic signal can include a coded message which can be received by another device, such as another set of eyewear 100, a smartphone or other mobile device, or another type of device, which can be disposed to receive the electromagnetic signal and send a message to alert emergency responders and medical personnel that the wearer needs assistance and is in transit to an ambulance or hospital.
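
A minimal sketch of such a coded message, assuming a hypothetical radio object with a transmit() method; the payload fields are illustrative, and no particular encoding is prescribed by this disclosure:

    import json
    import time

    def broadcast_medical_alert(radio, wearer_id, condition, location=None):
        # Encode the assistance request as a compact coded message, to be
        # emitted at a frequency outside normal human vision (e.g., IR or RF).
        payload = {
            "type": "medical_alert",
            "wearer": wearer_id,
            "condition": condition,      # e.g., "stroke" or "coronary"
            "timestamp": time.time(),
            "location": location,        # optional GPS fix (lat, lon)
        }
        radio.transmit(json.dumps(payload).encode("utf-8"))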

(Wearer emotional state)

[317] In one embodiment, the eyewear 100 can change color to indicate the wearer’s emotional state or mood, such as when the wearer is (A) excessively tired, (B) under the influence of alcohol or other substances, (C) subject to a diabetic complication or other issue, (D) experiencing an unexpected lack of energy, or otherwise in an emotional state or mood indicating that the wearer needs assistance or should not be operating heavy machinery. In such cases, the wearer or nearby persons can take appropriate action to assist the wearer, such as by calling for a taxi or otherwise bringing them home.

(Ambient condition)

[318] For example, the wearer’s field of view (FOV) can include one or more of: an amount of luminance (whether excessive or inadequate), an amount of glare, an amount of sensory noise or cognitive stimulation (whether excessive or inadequate), or otherwise as described herein. In such cases, the effect of the wearer’s FOV can be determined in response to one or more of: a computing device processing inputs from the wearer’s FOV, a computing device processing the wearer’s response to the wearer’s FOV, an input from the wearer with respect to the wearer’s FOV, or otherwise as described herein.

(Matching other objects)

[319] For example, the eyewear 100 can be disposed to make itself brighter or dimmer in response to the ambient environment. This can have the effect that the eyewear 100 can adjust its color with respect to the ambient environment, by reducing or increasing its contrast with the ambient environment to become less or more prominent. Similarly, the eyewear 100 can be disposed to make itself less or more prominent with respect to an external device, such as (A) the wearer’s clothing or accessories, or (B) another person’s eyewear, clothing or accessories.

[320] This can have the effect that multiple persons can intentionally match the colors of their eyewear 100, so as to easily identify members of a group, such as a tour group. In such cases, the eyewear 100 can communicate with other eyewear so as to determine whether all members of the tour group are present, or whether some statistical measure is satisfied, such as whether members of the tour group are within a localized area. Similarly, one or more eyewear 100 can change color in a time-varying manner, such as by cycling among two or more different colors, such as (A) to improve the visibility of the eyewear, (B) to send a coded message to an electronic device, or otherwise as described herein.

[321] For another example, the eyewear 100 can be disposed to change color in response to an external signal, such as an electromagnetic signal from another eyewear, a smartphone or other mobile device, or another type of device.

(Environmental condition)

[322] For example, the environmental condition can include one or more of: a weather condition, an amount or severity of allergens or pollutants; or another environmental condition (such as a presence of smoke inhalation or soot, a presence of absorbable/inhalable hazards, a presence of hazardous biological/chemical substances, a presence of an ambient drug hazard, a presence of a pathogen, or otherwise as described herein). In such cases, the environmental condition can be determined in response to a computing device processing inputs from the wearer’s field of view (FOV), a computing device processing the wearer’s response to the wearer’s FOV, an input from the wearer with respect to the wearer’s FOV, or otherwise as described herein. In such cases, the weather condition (or the amount or severity of allergens or pollutants) can be determined in response to a sensor disposed to detect those conditions, in response to a GPS or other location device disposed to identify the wearer’s location and obtain a report of those conditions for that location, an input from the wearer with respect to those conditions, or otherwise as described herein.

[323] For example, the environmental condition can include one or more of: a wearer’s field of view (FOV), a weather condition, an amount or severity of allergens or pollutants; or another environmental condition. In such cases, the environmental condition can be determined in response to a computing device processing inputs from the wearer’s FOV, a computing device processing the wearer’s response to the wearer’s FOV, an input from the wearer with respect to the wearer’s FOV, or otherwise as described herein. In such cases, the weather condition (or the amount or severity of allergens or pollutants) can be determined in response to a sensor disposed to detect those conditions, in response to a GPS or other location device disposed to identify the wearer’s location and obtain a report of those conditions for that location, an input from the wearer with respect to those conditions, or otherwise as described herein.
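
One way to combine the two information sources described above (an on-board sensor and a location-based report) is a simple worst-case merge. The following Python sketch uses plain values and illustrative thresholds rather than any particular sensor or reporting service:

    def assess_environment(sensor_reading, location_report, thresholds):
        # Take the worse of the locally sensed value and the value reported
        # for the wearer's location, then bucket it into a severity level.
        severity = max(sensor_reading, location_report)
        if severity >= thresholds["hazard"]:
            return "hazard"      # e.g., trigger shading, filtering, or an alert
        if severity >= thresholds["elevated"]:
            return "elevated"
        return "normal"

    # Example: assess_environment(42.0, 55.0, {"elevated": 50, "hazard": 100})
    # returns "elevated".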

[324] For example, the frame 111 can be coated with an LCD material, an LED material, an OLED material, a PLED (polarized LED) material, a phosphorescent material, or a related material responsive to an electromagnetic signal or an electronic signal, in response to an environmental factor such as temperature or pollutants, or otherwise as described herein. The electromagnetic signal or electronic signal can be received from the computing device, from a wearer condition sensor, from an environmental sensor (including a sensor disposed to detect a color balance or other condition of a scene, an object or person identified in the scene, or otherwise as described herein), from a wearer input sensor, or otherwise as described herein.

[325] For example, the eyewear 100 can be disposed to have the new frame color set at the time of sale; this can have the effect that a smaller number of stock keeping units (SKUs) can be maintained by a seller while maintaining a degree of flexibility for sales. For another example, the eyewear 100 can be disposed to have the new frame color set at a time when the eyewear is lent or issued to the wearer, such as when 3D glasses are provided to the wearer at a show; in such cases, it can be advantageous for clerks issuing eyewear to wearers to be able to easily identify which eyewear is associated with which wearer’s tickets to see the show. Similarly, in such cases, it can be advantageous for clerks issuing eyewear to wearers to be able to easily set parameters for the eyewear, while concurrently setting the color of the eyewear to identify that the eyewear has been set with those parameters.

[326] For another example, when a display is disposed so as to allow viewers with different eyewear 100 to view different presentations, it can be convenient for clerks to present eyewear to viewers that have external colors associated with the presentation the viewer will see. This can have the effect that clerks can easily determine whether eyewear 100 for an adult presentation is being worn by a child, or whether the eyewear is otherwise improperly associated with the viewer.

[327] For another example, the eyewear 100 can be disposed to change its frame color in response to the wearer’s control, in response to an amount or color balance of ambient light (such as sunlight, indoor light, an amount of UV light, or otherwise as described herein), in response to a time of day or day of the week, in response to a change in the wearer’s appearance, in response to the wearer’s emotional affect or mood, or otherwise as described herein. In such cases, when the wearer desires to use the eyewear 100 in a particular context, the wearer can adjust the eyewear so as to match the context, such as by making the eyewear lighter when the wearer desires to emphasize their eyes and darker when the wearer desires the opposite.

[328] For example, the wearer’s emotional affect or mood can be determined in response to the wearer’s body temperature or skin temperature, in response to the wearer’s pupil size or eye motions (or frequency thereof), in response to the wearer’s heart rate or blood pressure (or stability thereof), in response to the wearer’s galvanic skin response, in response to other medical conditions, in response to one or more inputs from the wearer, or otherwise as described herein.

[329] For example, the wearer’s appearance can be determined in response to the wearer’s hair color (such as when it is dyed or gelled or otherwise styled, bleached by sunlight or stage-lit or otherwise altered, or subject to different lighting conditions, or otherwise as described herein); in response to the wearer’s eye color or skin color; in response to a color of the wearer’s outfit (such as clothing or jewelry); in response to whether the wearer is wearing cosplay/costuming or face paint, makeup or broadcast makeup, or suntan lotion; or otherwise as described herein.

[330] In one embodiment, the eyewear can be disposed to provide an active color change in response to one or more factors, so as to signal to observers that the wearer has a particular condition, such as a medical condition, a wearer activity, or a wearer focus on a particular portion of their field of view. For example, a glasses frame can be disposed to change color without interfering with the wearer’s field of view (FOV), while communicating to the wearer or to others the wearer’s emotional affect or mood or medical condition, or while matching an environment in which the wearer is participating. For example, the glasses frame can match the wearer’s hair color or eye color (even if one or more of those has been altered to match a social occasion), match the wearer’s outfit, or match ambient lighting.

[331] In one embodiment, the eyewear can be disposed to provide an active color change when disposed in one or more distinct form factors: a glasses frame; a facemask or helmet; a heads-up display (HUD); a window screen; a vehicle window or windshield; or otherwise as described herein. For example, the glasses frame, facemask, or helmet can be disposed to change color without interfering with the wearer’s field of vision (FOV). For another example, the facemask or helmet, HUD, window screen, or vehicle window or windshield can be disposed to change color so as to alter the wearer’s FOV, such as to alter the color balance of ambient light.

Active color change by eyewear - lens

[332] For another example, the eyewear 100 can be disposed to change a color of its lens, thus altering an external view of the lens and of the user’s eye in response to the change in color. In such cases, the lens can include an e-chromatic material, defined herein as a material that can change color responsive to an electromagnetic signal. In such cases, the lens can change color (a sketch follows this list) in response to one or more of:

— a user input;

— a medical condition of the user;

— a medical condition observed by the user, such as a medical condition of a person within the user’s field of view;

— an emotional state of the user;

— an emotional state observed by the user, such as an emotional state of a person within the user’s field of view;

— an ambient condition, such as an effect within the user’s field of view, a local weather condition, pollution measure, pollen count, or another ambient condition capable of affecting the user, prompting a migraine, or prompting another medical condition with respect to the user or a person within the user’s field of view;

— an object recognized within the user’s field of view, such as described herein; or otherwise as described herein with respect to color change of the eyewear frame.
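
The sketch promised above: a Python dispatch table mapping recognized triggers to lens colors. The trigger names, the chosen colors, and the lens.set_color() interface are assumptions for illustration, not a defined API:

    LENS_COLOR_RULES = {
        "user_input":        lambda ev: ev.get("requested_color"),
        "medical_condition": lambda ev: (255, 0, 0),    # red: wearer needs aid
        "migraine_risk":     lambda ev: (0, 64, 0),     # dim green tint
        "high_pollen":       lambda ev: (255, 191, 0),  # amber advisory
    }

    def on_trigger(lens, event):
        # Look up the rule for this kind of event and apply any color it yields.
        rule = LENS_COLOR_RULES.get(event["kind"])
        if rule is not None:
            color = rule(event)
            if color is not None:
                lens.set_color(color)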

[333] Similar to the eyewear frame, the eyewear 100 can be disposed to change a color texture of its lens, thus altering an external view of the lens to other than a solid color. The color texture of the lens can be disposed in one or more color textures, such as those described with respect to color texture of the frame. The color texture of the lens can also be disposed to change with respect to time, such as described with respect to color texture of the frame.

[334] In one embodiment, the color (and color texture) of the lens can be disposed so that the color (and color texture) of the portion of the lens disposed before the pupil differs from that disposed before the iris. For example, the user might desire to show blue irises, but not to alter their field of view (FOV) to show their entire FOV in blue. Accordingly, the color (and color texture) of the lens can be disposed in a first portion before the pupil and a second portion before the iris. This can have the effect that the pupil looks naturally black, while the iris can have its color (or color texture) altered.

[335] For example, the color (and color texture) of the first portion of the lens, before the pupil, can be generally disposed to be clear. In some cases, it might be appropriate to dispose the first portion of the lens to have a color (or color texture) to adjust the color balance of the user’s field of view (FOV). In one case, it might be desirable to adjust the color balance of the user’s FOV to increase the proportion of green or decrease the proportion of blue available for view by the user. In one case, it might be desirable to adjust the shading/inverse-shading of the user’s FOV; explicit shading/inverse-shading can be limited to the pupil.

[336] In such cases, the first portion of the lens, whether clear or color-balance adjusted, can be restricted to the user’s pupil. When the user’s pupil changes in size, such as in response to changes in ambient brightness, the first portion of the lens can be adjusted in response to changes in the size of the pupil, so as to maintain the first portion of the lens to cover substantially only the pupil and no substantial portion of the iris.

[337] In one embodiment, when the eyewear is disposed to adjust the color or color texture of the portion of the lens before the pupil, the eyewear can be disposed to adjust that color or color texture during a time period while the user blinks. As described herein, a blink takes a finite amount of time, so the eyewear can adjust the color or color texture of the portion of the lens before the pupil while the user is blinking (and the pupil is covered by the eyelid). This can have the effect that the user sees a different amount of color or color texture before the blink and after the blink. The eye integrates the amount of color or color texture into its received image. This can have the effect that the user does not notice the change when the eyewear adjusts the color or color texture.
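
A sketch of paragraphs [336] and [337] together, assuming a hypothetical lens driver with set_clear_aperture() and set_pupil_region_color() methods; the method names and the margin value are illustrative:

    class PupilRegionController:
        # Keeps the clear "first portion" of the lens sized to the pupil and
        # commits color changes only while a blink covers the pupil.

        def __init__(self, lens):
            self.lens = lens          # hypothetical lens-driver object
            self.pending = None       # color change awaiting the next blink

        def on_pupil_size(self, pupil_radius_mm, margin_mm=0.2):
            # Track pupil dilation so the clear aperture covers substantially
            # only the pupil and no substantial portion of the iris.
            self.lens.set_clear_aperture(pupil_radius_mm + margin_mm)

        def request_color(self, rgb):
            self.pending = rgb        # defer until the eyelid hides the change

        def on_blink_start(self):
            # The eyelid now covers the pupil; commit the deferred change so
            # the wearer never observes the transition directly.
            if self.pending is not None:
                self.lens.set_pupil_region_color(self.pending)
                self.pending = None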

[338] For example, the color (and color texture) of the second portion of the lens, before the iris, can be generally disposed to have a color that alters others’ view of the user’s iris. In such cases, the user’s iris might be naturally brown, but the user might desire to have their iris appear blue. The second portion of the lens can be disposed to show the iris as blue.

[339] In such cases, the second portion of the lens, regardless of color or color texture, can be restricted to the user’s iris. When the user’s pupil changes in size, such as in response to changes in ambient brightness, the second portion of the lens can be adjusted in response to changes in the size of the pupil, so as to maintain the second portion of the lens to cover substantially the iris and no substantial portion of the pupil.

[340] Similar to frame color, the eyewear 100 can be disposed to include one or more contact lenses 300, one or more of which can be disposed to change color, such as described with respect to the frame 111. The eyewear 100 can also be disposed to include photochromic lenses 112, which can be disposed to change color, as described with respect to the frame 111. The lenses 112 can also be disposed to change color in response to a gaze direction or focal length, so as not to impose an artificial color on the wearer’s view through the pupil. The lenses 112 can also be disposed with a color that is adjustable in response to the wearer’s iris color, so as to combine the iris color with the lenses’ color to form a selected color.

[341] For another example, the eyewear 100 can be disposed to change color or shading in response to a gaze direction or size of the wearer’s pupil, so as not to interfere with the wearer’s vision. In such cases, when the wearer’s pupil increases or decreases in size, the eyewear 100 can alter the portions in which it provides color or shading so as to avoid obscuring the width of the wearer’s pupil. Similarly, when the wearer’s pupil moves to change gaze direction, the eyewear 100 can alter the portions in which it provides color or shading so as to avoid obscuring the direction of the wearer’s pupil.

[342] For another example, the eyewear 100 can be disposed to deliberately alter the color balance of the wearer’s field of view (FOV), so as to alter the color balance seen by the wearer. In such cases, the eyewear 100 can alter the color it provides in the region (or for the set of pixels) associated with the wearer’s pupil, so as to alter the color balance of the wearer’s FOV when the eyewear includes a lens 112 disposed in a glasses frame, a facemask, or helmet; when the eyewear includes contact lenses, an intra-ocular lens or other implantable device; or otherwise as described herein.

[343] For another example, the eyewear 100 can be disposed to deliberately alter an amount of shading or inverse-shading of the wearer’s field of view (FOV), so as to alter an amount of luminance of light infalling to the wearer’s pupil or onto the wearer’s retina. In such cases, the eyewear 100 can alter the amount of shading/inverse-shading it provides in the region (or for the set of pixels) associated with the wearer’s pupil, so as to alter the amount of luminance or light contrast with respect to the wearer’s FOV. Similarly, the eyewear 100 can alter the amount of shading/inverse-shading it provides so as to assure that the wearer obtains sufficient contrast between objects to identify those objects even in relatively dim lighting (such as at night) or excessively bright lighting (such as in bright ambient light, when the object is brightly backlit, or when the object is subject to glare).

Visual acuity

[344] In one embodiment, the eyewear can include a first camera disposed to capture the field of view available to the user using the eyewear and a second camera disposed to capture the same field of view available to the user, only without using the eyewear. The eyewear can determine a comparison between (A) a first view available to the user using the eyewear, such as available using the first camera, and (B) a second view available to the user without using the eyewear, such as available using the second camera. In response to the comparison, the eyewear can determine a measurement of visual acuity available to the user.

[345] The first camera can be disposed to view through a lens of the eyewear; the second camera can be disposed to view outside any lens of the eyewear. The eyewear can determine the measure of visual acuity both in response to the difference between information from the first camera and information from the second camera, and in response to an examination of a digital image from one or more of the two cameras. For example, if there is no difference between the images from the two cameras, it might still occur that the eyewear can improve the user’s visual acuity by altering a visual effect on the user’s field of view, such as using one or more lenses to alter that visual effect. In such cases, the eyewear can adjust one or more lenses so as to adjust that visual effect.

[346] The eyewear can also include a sensor disposed to determine the user’s best currently available visual acuity, whether that acuity is achieved using the eyewear’s lenses or not. The sensor can identify one or more pixels in an image from the user’s field of view and determine whether like pixels present like images to the user. If not, the eyewear can determine that the user’s visual acuity could be improved, similar in spirit to the concept of autofocus. However, while autofocus generally measures a distance to a target and alters a focal length of a camera to match that distance, the method and system described herein do not operate by distance measurement.

[347] In response to the measure of visual acuity, the eyewear can adjust one or more parameters, such as color balance, polarization, shading, or other parameters affecting the user’s field of view. For example, the eyewear can adjust shading of the object being looked at or focused upon by the user. After adjusting the one or more parameters, the eyewear can re-measure the user’s visual acuity, so as to determine whether the user’s visual acuity has been improved by the adjustment of the one or more parameters. The eyewear can, in response to whether the user’s visual acuity has been improved, reverse the adjustment, extend the adjustment, try an adjustment of one or more different parameters, try a combination of two or more adjustments, or otherwise attempt to improve the user’s visual acuity.
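
The adjust-and-remeasure loop of this paragraph, sketched in Python; measure_acuity() and adjust() are hypothetical stand-ins for the eyewear’s actual measurement and actuation interfaces:

    def improve_acuity(eyewear, params=("shading", "color_balance", "polarization"),
                       step=0.1, max_iters=10):
        # Greedily nudge each parameter; keep a change only if the re-measured
        # acuity improves, otherwise reverse it, and stop when nothing helps.
        best = eyewear.measure_acuity()
        for _ in range(max_iters):
            improved = False
            for p in params:
                eyewear.adjust(p, +step)
                score = eyewear.measure_acuity()
                if score > best:
                    best, improved = score, True
                else:
                    eyewear.adjust(p, -step)   # undo an unhelpful change
            if not improved:
                break
        return best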

[348] For example, if the user’s view of a selected object is relatively washed out, such as due to excessive brightness or glare, the eyewear can determine that the user’s visual acuity is inadequate due to that excessive brightness or glare. In such cases, the eyewear can determine that it should shade those regions of the user’s field of view that are subject to the excessive brightness or glare. Once the eyewear has done so, the user’s visual acuity should be sufficiently improved that the eyewear can wait until conditions change.

[349] Similarly, the eyewear can adjust the amount of shading so as to prompt the user’s eye to operate in a mesopic range. This can have the effect that the eyewear can optimize the user’s view so as to operate with a best degree of black-and-white vision and color vision. When the amount of brightness in the ambient environment is too little for mesopic vision, the eyewear can brighten the image available to the user, so as to allow the user’s eye to see using both black-and-white vision and color vision. When the amount of brightness in the ambient environment is too much for mesopic vision, the eyewear can shade the image available to the user, so as to allow the user’s eye to see using mesopic vision.
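
As a sketch of this brightening/shading rule: the band limits below are commonly cited approximate values for mesopic vision (roughly 0.001 to 3 cd/m²), not values taken from this disclosure:

    MESOPIC_MIN_CD_M2 = 0.001    # approximate lower bound of mesopic vision
    MESOPIC_MAX_CD_M2 = 3.0      # approximate upper bound of mesopic vision

    def mesopic_gain(ambient_cd_m2):
        # Return a luminance multiplier: >1 brightens a too-dim scene,
        # <1 shades a too-bright scene, 1.0 leaves a mesopic scene alone.
        if ambient_cd_m2 <= 0:
            raise ValueError("luminance must be positive")
        if ambient_cd_m2 < MESOPIC_MIN_CD_M2:
            return MESOPIC_MIN_CD_M2 / ambient_cd_m2
        if ambient_cd_m2 > MESOPIC_MAX_CD_M2:
            return MESOPIC_MAX_CD_M2 / ambient_cd_m2
        return 1.0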

[350] In response to the re-measurement of the user’s visual acuity, the eyewear can determine whether the user’s visual acuity has reached a satisfactory measure, thus, whether the eyewear has successfully improved the user’s visual acuity to a satisfactory degree. If not, the eyewear can continue to adjust the one or more parameters, or combinations or conjunctions thereof, so as to reach a satisfactory measure of user visual acuity.

[351] The eyewear can periodically, or otherwise in response to changing conditions, re-measure the user’s visual acuity, so as to obtain a degree of visual acuity that is continually satisfactory. For example, whenever the user changes the direction in which they are looking or the distance at which they are focusing, the eyewear might determine that it should re-adjust the one or more parameters. This can have the effect that the user’s visual acuity can be maintained substantially constantly satisfactory.

Possible use cases

[352] In one embodiment, the eyewear 100 can be disposed to perform shading using the lenses 112 by one or more of: (A) darkening one or more lens regions 131 or lens pixels 141 through which the wearer is viewing a light source; (B) urging the wearer’s pupil or iris to contract, such as by injecting light into the pupil, by triggering an electronic signal to prompt the iris muscle to contract, by inducing a puff of air to prompt the iris muscle to contract, or otherwise as described herein; or (C) darkening one or more concentric rings of lens regions 131 or lens pixels 141, with the effect that the wearer’s pupil is artificially restricted in width, thus effectively contracted; or otherwise as described herein. This can have the effect that the wearer receives less infalling light on their retina, thus darkening their view.

[353] In one embodiment, the eyewear 100 can be disposed to perform shading in response to detecting epilepsy or seizure, measuring a rate of oscillation with respect to a seizure event, and fully shading away all infalling light in synchrony with the seizure event, so as to effectively remove any further trigger of the seizure event. This can have the effect that a seizure event can be treated, at least in part.
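
A sketch of the synchronized full-shading response, assuming a hypothetical lens.set_opacity() actuator and a previously measured oscillation rate:

    import time

    def shade_for_seizure(lens, oscillation_hz, duration_s=10.0):
        # Align shading onset against the measured oscillation, then block
        # all infalling light for the duration of the event.
        period_s = 1.0 / oscillation_hz
        time.sleep(period_s / 2.0)    # wait for the next trigger phase
        lens.set_opacity(1.0)         # fully shade away all infalling light
        time.sleep(duration_s)
        lens.set_opacity(0.0)         # restore the normal view afterward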

[354] Similarly, the eyewear 100 can be disposed to combine restriction of light injected into the pupil or iris, so as to focus infalling light on the center of the retina (the macula), with a disc or ring of light being allowed to flow through to the retina’s rods. This can have the effect that the wearer’s night vision can be improved, as the wearer’s rods would be activated, while also allowing the wearer’s color vision to be used, as the wearer’s cones would be activated. This can also have the effect of providing a treatment for Parkinson’s disease, at least in part. Allowing color into the eye can also have the effect of providing a treatment for autism or dementia, at least in part.

[355] In one embodiment, the eyewear 100 can be disposed to specifically urge the wearer’s pupil or iris to contract by (A) urging the wearer’s pupil or iris muscle to contract, as described just above; (B) darkening one or more concentric rings of lens regions 131 or lens pixels 141, with the effect that the wearer’s pupil is artificially restricted in width, thus effectively contracted, as described just above; (C) applying an electromagnetic field to the optic nerve, as further described herein with respect to fig. 6; or otherwise as described herein. This can have the effect that the wearer’s pupil or iris is urged to contract, which can be useful when conducting LASIK eye surgery or for adjusting pupil size after LASIK surgery.

[356] In one embodiment, the eyewear 100 can be disposed to perform inverse-shading using the lenses 112 by one or more of: (A) darkening one or more lens regions 131 or lens pixels 141 through which the wearer is viewing their field of view (FOV), with the exception of an object, display or screen that is being inverse-shaded; (B) injecting light into the wearer’s eye where their gaze would be directed at the selected inverse-shaded object, display or screen, similar to when the inverse-shaded object is glowing or phosphorescent; or otherwise as described herein. This can have the effect that the selected inverse-shaded object, display or screen is brighter than its background in the wearer’s FOV.

[357] In one embodiment, the eyewear 100 can be disposed to promote the wearer reading by one or more of: (A) performing enhanced refraction in one or more lens regions 131 or sets of lens pixels 141 in areas of the wearer’s field of view (FOV) through which the wearer would read a book or other readable object, causing a horizontal line to be available for viewing at a higher magnification; (B) performing a prismatic effect using the lenses 112 to alter the angle through which light is bent when passing through the lenses, with the effect that the wearer sees objects as if “looking down” even when the wearer’s gaze direction is straight ahead; (C) darkening one or more lens regions 131 or sets of lens pixels 141 in areas of the wearer’s field of view other than in a region through which the wearer would read a book, leaving a horizontal line available for viewing; or otherwise as described herein. This can have the effect that the wearer is urged to alter their gaze direction toward the book, thus performing a similar function as “reading glasses”. Similarly, the eyewear 100 can be disposed to promote the wearer reading by darkening one or more lens regions 131 or lens pixels 141 in areas of the wearer’s FOV, leaving a vertical line available for viewing. This can have the effect that the wearer is urged to alter their gaze direction along the line they are reading, selecting each word in turn.

[358] In one embodiment, the eyewear 100 can be disposed to provide an augmented reality (AR) or virtual reality (VR) display of an eye chart, a peripheral vision test, or another eye test. Using the eyewear 100, an optometrist or other medical personnel can conduct an eye exam to determine a prescription for the wearer, such as whether the wearer needs a prescription to address myopia, presbyopia, astigmatism, or otherwise as described herein. The eyewear 100 can also be disposed with a camera directed at the wearer’s retina, so as to determine whether the image provided by the AR or VR display is in focus on the wearer’s retina. This can have the effect that the optometrist or other medical personnel can conduct the eye exam without requiring the wearer to select which image is best in focus.

[359] In one embodiment, the eyewear 100 can be disposed to provide an augmented reality (AR) or virtual reality (VR) display of the wearer’s field of view (FOV), such as when the wearer is a police officer or military personnel, a firefighter or other emergency responder, search/rescue personnel, a physician or other medical personnel, or a volunteer assisting a nearby person in need of aid.

[360] For example, when the wearer is a police officer or is military personnel, the eyewear 100 can be disposed to use one or more artificial intelligence (AI) or machine learning (ML) techniques to recognize selected types of objects, such as weapons (guns, knives, or otherwise as described herein), that might be dangerous to the wearer. In such cases, the eyewear 100 can (A) inverse-shade the dangerous object, so as to emphasize its location to the wearer; or (B) provide an augmented reality (AR) or virtual reality (VR) display of information with respect to the dangerous object, so as to ensure the wearer does not fail to recognize that object. Similarly, the eyewear 100 can be disposed to use an AI or ML technique to recognize when the object is moving, or is within reach of a person’s hand, or otherwise becomes significantly more dangerous.

[361] For another example, when the wearer is a police officer or is military personnel, the eyewear 100 can be disposed to receive electromagnetic signals from a flashbang grenade when the grenade is triggered. At or just before the grenade is set to emit light and sound, the eyewear 100 can completely shade that light and sound, thus protecting the police or military from effects of the grenade. This can have the effect that police and military can use flashbang grenades to stun any opposition, without having to assure that they are not subject to their effects. Similarly, police and military can use light or sound as offensive devices and tactics against persons they seek to subdue. In a related example, police vehicle lights can reach 500 lux in brightness, sufficient to temporarily blind suspects. In such cases, the police vehicle lights can emit an electromagnetic signal when turned on, which can be received by the eyewear 100 so as to completely shade that light, thus protecting the police or military from effects of the extreme light. This can have the effect that the police or military can use vehicle lights against persons they seek to subdue, without having to temporarily blind themselves while so doing.

[362] For another example, when the wearer is a police officer or is military personnel, the eyewear 100 can be disposed to exchange electromagnetic signals with firearms, such as police pistols or military pistols or rifles. Firearms can be set with a “safety” mechanism on or off. In such cases, police or military firearms can be disposed to send electromagnetic signals to the eyewear 100 so as to indicate whether the safety is on or off, and to receive electromagnetic signals from the eyewear 100 so as to set the safety on or off. The eyewear 100 can be disposed so as to allow police or military personnel to identify whether the safety is on or off using an augmented reality (AR) indicator in their field of view (FOV), and to set the safety to be on or off using an eye gesture, hand gesture, or other action. This can have the effect that police or military personnel can both (A) be assured when the safety is on or off with respect to their firearms, and (B) be assured that they can set the safety on or off without having to actually touch the firearm.
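
A sketch of the two halves of this exchange (status in, command out); the message fields, gesture names, and radio/display interfaces are assumptions for illustration, not part of this disclosure:

    def on_firearm_status(display, message):
        # Show an augmented-reality cue reflecting the firearm's safety state.
        if message.get("type") == "safety_status":
            display.show_indicator(
                "SAFETY ON" if message["safety_on"] else "SAFETY OFF")

    def on_eye_gesture(radio, gesture):
        # Translate a recognized eye gesture into a safety command, so the
        # wearer can set the safety without touching the firearm.
        if gesture == "long_double_blink":
            radio.transmit({"type": "set_safety", "safety_on": True})
        elif gesture == "triple_blink":
            radio.transmit({"type": "set_safety", "safety_on": False})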

[363] For another example, when the wearer is a police officer or is military personnel, the eyewear 100 can be disposed to exchange electromagnetic signals with firearms, so as to identify in what direction and at what target the firearm is directed. In such cases, firearms can be disposed to send electromagnetic signals to the eyewear 100, so as to indicate in what direction the firearm is pointed. The computing device 121 can use this information to determine a line of sight and a current target for the firearm, and can inject this information using an augmented reality (AR) indicator in their field of view (FOV). This can have the effect that police or military personnel can identify at whom they are aiming without revealing that information to an opponent with a laser pointer. The computing device 121 can inject an AR indicator into their FOV to show what would be seen through the firearm’s gun sights, even if the officer is not actually so positioned. This can also have the effect that police or military personnel can identify when they are inopportunely aiming at another officer or at an innocent civilian. In such cases, the police or military can inform the computing device 121, such as using an eye gesture, which persons are not proper targets, and the computing device 121 can control the firearm so as to prevent accidents.

[364] For another example, when the wearer is a police officer or is search/rescue personnel, the eyewear 100 can be disposed to use one or more artificial intelligence (AI) or machine learning (ML) techniques to identify one or more persons (such as suspects or rescuees), such as in response to one or more facial recognition techniques, or otherwise as described herein. In such cases, the eyewear 100 can, in response to identifying those persons, (A) inverse-shade the recognized person, so as to emphasize their location to the wearer; (B) provide an augmented reality (AR) or virtual reality (VR) display of information with respect to the recognized person; (C) apply an electromagnetic field to the optic nerve, as further described herein with respect to fig. 6; or otherwise as described herein.

[365] For another example, when the wearer is a firefighter or other emergency responder, the eyewear 100 can be disposed to use one or more artificial intelligence (AI) or machine learning (ML) techniques to recognize selected types of events. In such cases, the selected types of events can include (for firefighters) objects that are significantly hotter than expected, such as in response to an infrared (IR) sensor, areas that have dangerous gases or other toxins, such as in response to a chemical sensor, or otherwise as described herein. In such cases, the selected types of events can include (for emergency responders) patients whose vital signs are abnormal, such as in response to a blood oxygen sensor, a blood pressure or pulse rate sensor, or otherwise as described herein.

[366] For another example, the eyewear 100 can identify one or more persons in need of aid by a volunteer, such as using one or more artificial intelligence (AI) or machine learning (ML) techniques, such as those described with respect to the Incorporated Disclosures, particularly Application 16/264,553, filed Jan. 31, 2019, naming inventor Scott LEWIS, titled “Digital eyewear integrated with medical and other services”, Attorney Docket No. 6041, currently pending.

Fig. 2 - Retinal Image Display

[367] Fig. 2 shows a conceptual drawing of example eyewear including a retinal image display (RID).

[368] In one embodiment, an example eyewear 100 can include elements shown in the figure, such as one or more of:

— a frame 201, such as possibly including one or more temples 201a, a nosepiece 201b, or a RID holder 201c;

— at least one RID 202, such as possibly for a right eye or a left eye.

[369] In one embodiment, the RID 202 can provide an alternative image, to replace the image available to the wearer’s eye, or a supplemental image to add to the image available to the wearer’s eye.

[370] To replace the image available to the wearer’s eye, the lens 112 (shown in fig. 1) in front of the wearer’s eye can be opaqued, and the RID 202 can provide the alternative image directly to the wearer’s retina. To opaque the lens 112 in front of the wearer’s eye, the computing device 121 can adjust, with respect to the lens, one or more of: shading, polarization, color filtering, prismatic adjustment, or otherwise as described herein.

[371] For example, the lens 112 can be adjusted by changing

— an amount of shading sufficient to make the lens opaque to a natural field of view (FOV) but not sufficient to prevent the wearer from seeing the retinal image;

— an amount of polarization sufficient to make the lens opaque to a natural FOV, while adjusting the RID 202 with an inverse amount of polarization;

— a selected set of color frequencies sufficient to filter out most of the natural FOV, while adjusting the RID 202 to inject those color frequencies into the retina;

— an amount or function of prismatic adjustment sufficient to cause the eye to not see the natural FOV, while adjusting the RID 202 with an inverse amount of prismatic adjustment; or otherwise as described herein.

[372] To supplement the image available to the wearer’s eye, the lens 112 in front of the wearer’s eye can be allowed to remain clear, and the RID 202 can provide the supplemental image directly to the wearer’s retina.
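
A sketch of both modes, replacement and supplement; the lens and RID method names are assumptions about the device interface, not a defined API:

    def replace_view_with_rid(lens, rid, image, mode="shading"):
        # Make the lens opaque to the natural field of view, then deliver
        # the alternative image directly to the retina via the RID.
        if mode == "shading":
            lens.set_shading(1.0)
        elif mode == "polarization":
            lens.set_polarization(90)    # block the natural FOV
            rid.set_polarization(0)      # inverse polarization for the RID
        rid.project(image)

    def supplement_view_with_rid(lens, rid, overlay):
        # Leave the lens clear and add a supplemental image on top.
        lens.set_shading(0.0)
        rid.project(overlay)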

Fig. 3 - Contact lenses or intra-ocular lenses

[373] Fig. 3 (collectively including Figures 3A and 3B) shows a conceptual drawing of example eyewear including contact lenses or intra-ocular lenses.

[374] Figure 3A shows a conceptual drawing of example contact lenses having multiple active regions related to wearer view.

[375] Figure 3B shows a conceptual drawing of example contact lenses having multiple individual pixels related to wearer view.

Contact lenses

[376] Similar to Figures 1A or 1B (fig. 1), an example eyewear 100 can include one or more contact lenses 300 disposed for use by the wearer (not shown) by affixing the contact lenses to the wearer’s eyes. The contact lenses 300 can include one or more lenses 300, such as possibly a right lens 300a or a left lens 300b. The contact lenses 300 can include elements shown in the figure, such as one or more of:

— a power harvester 301, such as possibly an antenna disposed to receive ambient electromagnetic energy. In one embodiment, the power harvester 301 can include an antenna tuned to receive electromagnetic energy from a cellular phone network, a Wi-Fi network, a 60 Hz power system, or otherwise as described herein;

— a communication device 302, such as possibly including a wireless antenna disposed to transmit or receive information using the power harvester 301, a clock circuit, or other elements used with communication devices;

— a computing device 303, such as possibly coupled wirelessly to the communication device 302, and possibly including a processor, memory or mass storage, a second power supply, or other elements used with computing devices;

— one or more sensors 304, such as possibly embedded in the contact lenses 300 or coupled to the computing device 303, and possibly including one or more of: wearer sensors 304a disposed to receive information about the wearer (or their current condition), ambient sensors 304b disposed to receive information about an environment near the wearer (or its current condition), or other sensors.

[377] In one embodiment, the one or more sensors 304 can also include a magnetic (or magnetizable) ring, or a set of magnetic (or magnetizable) elements at the edge of the contact lenses 300. This can have the effect that when the wearer’s gaze direction changes, the position of the contact lenses 300 also changes to match a vector from the retina through the pupil and iris. The computing device 303 can be disposed to detect the position of the contact lenses 300, such as using a capacitive sensor, a magnetometer, another electromagnetic device, or otherwise as described herein.

[378] In one embodiment, the one or more sensors 304 can also include one or more outward-facing photovoltaic cells, or similar electronic elements, such as affixed to the contact lenses 300 or elsewhere on the eye, so as to become covered by the eyelid when the wearer blinks. Similarly, the one or more sensors 304 can also include one or more inward-facing photovoltaic cells, or similar electronic elements, such as affixed to the contact lenses 300 or elsewhere on the eye, so as to obtain an image of the retina, which will be blanked out when the wearer blinks. This can have the effect that the sensors 304 can determine a blink rate for the wearer without any complex elements selected to identify when a blink occurs or whether the blink is a complete blink (thus, not a partial blink).
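
A sketch of the blink-rate computation this arrangement enables, treating the photovoltaic output as a normalized light level that collapses whenever the eyelid covers the cell; the threshold value is illustrative:

    def blink_rate_hz(light_samples, sample_hz, dark_threshold=0.2):
        # Count falling edges below the darkness threshold; each edge marks
        # the eyelid covering the cell, i.e., the start of one blink.
        blinks = 0
        covered = False
        for level in light_samples:    # normalized 0.0 (dark) .. 1.0 (bright)
            if level < dark_threshold and not covered:
                blinks += 1
                covered = True
            elif level >= dark_threshold:
                covered = False
        duration_s = len(light_samples) / sample_hz
        return blinks / duration_s if duration_s else 0.0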

[379] Similar to Figures 1A or 1B (as described herein), the one or more sensors 304 can also include sensors such as those described with respect to the sensors 123 coupled to the frame 111 (fig. 1). Where practical, and such as described with respect to Figures 1A or 1B, these can include one or more of:

— one or more visually evoked potential (VEP) elements disposed to measure a potential of the wearer’s visual region of the brain;

— one or more devices disposed to perform electroencephalography (EEG), electrooculography (EOG), electroretinography (ERG), optical computed tomography (OCT), or other measures with respect to eye function;

— an electric field element disposed to measure a dipole moment of the eye;

— a gaze direction sensor (not shown), such as an element disposed to measure a reflection of an electromagnetic signal, such as infrared (IR) light directed at the eye and reflected in response to a direction of the pupil or the lens thereof. In such cases, the gaze direction sensor can use reflections or refractions from the lenses to provide a signal indicating a direction at which the wearer is looking, as described with respect to Figures 1A and 1B.

— one or more devices mounted on a vehicle or otherwise remote devices, such as described with respect to Figures 1A and 1B, and disposed to provide information to the communication device 302 or the computing device 303.

Intra-ocular lenses

[380] Similar to Figure 3A or 3B (as described below), an intra-ocular lens (not shown) can be implanted in the wearer’s eye, such as by replacing or augmenting the natural lens of the wearer’s eye.

[381] For example, the intra-ocular lens can be disposed to be static, such as by determining its shape at the time of implantation, or by causing the amount of refraction by the intra-ocular lens to be set by one or more fuses or other electronic components, the values of which can be set at the time of implantation.

[382] For another example, the intra-ocular lens can be disposed to be alterable by the computing device 121, such as by causing the amount of refraction by the intra-ocular lens to be set by one or more fuses or other electronic components, the values of which can be altered by an electromagnetic signal from the computing device 121 or another device. Similar to the contact lenses 300, the intra-ocular lens can be powered by electromagnetic harvesting, or a related technique.

Multiple active regions

[383] Figure 3A shows a conceptual drawing of example contact lenses having multiple active regions related to wearer view.

[384] Similar to Figure 1A (as shown in fig. 1), the contact lenses 300 can be used to correct vision on behalf of the wearer, enhance vision on behalf of the wearer, or otherwise as described herein. For example, similarly, the contact lenses 300 can correct for myopia, presbyopia, astigmatism, or other wearer vision artifacts. Also similarly, the contact lenses 300 can enhance vision, such as by including a zoom feature disposed to present the wearer with a zoomed-in or zoomed-out view of the wearer’s field of view (FOV), or by including other features disposed to present the wearer with other vision enhancements described in the Incorporated Disclosures, or otherwise as described herein.

[385] Similar to Figure 1A (as shown in fig. 1), the contact lenses 300 can include multiple lens regions 310, each disposed to correct vision or enhance vision on behalf of the wearer. For example, the multiple lens regions 310 can include a close-vision region 311, a mid-range vision region 312, a distant vision region 313, or otherwise as described herein. Also similarly, each lens region 310 can be individually controlled, such as by the computing device 303, or otherwise as described herein. This can have the effect that the wearer’s vision can be corrected or enhanced in each region where the wearer might look.

[386] Similar to Figure 1A (fig. 1), each lens region 310 can be individually controlled, such as by the computing device 303, or otherwise as described herein. This can have the effect that the wearer’s vision can be corrected or enhanced in each region where the wearer might look. For example, the close-vision region 311 can be disposed with a distinct prescription from the mid-range vision region 312. This can have the effect that when the wearer looks at a close object, their vision can be corrected or enhanced with respect to the prescription assigned to the close-vision region 311, or when the wearer looks at a mid-range object, vision can be corrected or enhanced with respect to the prescription assigned to the mid-range vision region 312. For another example, the central vision region can be disposed with a distinct prescription from the peripheral vision region. This can have the effect that when the wearer looks directly at an object, their vision can be corrected or enhanced with respect to the prescription assigned to the central vision region, or when the wearer uses their peripheral vision, their vision can be corrected or enhanced with respect to the prescription assigned to the peripheral vision region.

[387] As described with respect to Figure 1A (fig. 1), when the wearer moves their head, the computing device 303 can determine, such as using an accelerometer or a gyroscope (which can be included with the sensors 304), a wearer’s head movement. Similarly, the computing device 303 can also determine, such as using a dynamic eye gaze tracker (which can be included with the sensors 304), a gaze direction. Also similarly, this information can allow the computing device 303 to determine a distance of the object at which the wearer is intending to look; similarly, this information can allow the computing device 303 to determine whether the wearer is using their central vision or peripheral vision, and to control the correction or enhancement associated with one or more of the lens regions 310.
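
A sketch of the region-selection decision, with illustrative distance and angle thresholds (the disclosure does not specify numeric values):

    def select_lens_region(gaze_angle_deg, focal_distance_m):
        # Pick a prescription region from the estimated focal distance, and
        # note whether central or peripheral vision is in use.
        if focal_distance_m < 0.5:
            region = "close-vision"      # lens region 311
        elif focal_distance_m < 3.0:
            region = "mid-range"         # lens region 312
        else:
            region = "distant"           # lens region 313
        vision = "central" if abs(gaze_angle_deg) < 15.0 else "peripheral"
        return region, vision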

[388] As described with respect to Figure 1A (fig. 1), when the wearer shifts their gaze, the computing device 303 can determine, such as using a focal length detector (which can be included with the sensors 304), a distance to an object being viewed by the wearer. Similarly, this information can allow the computing device 303 to determine a distance of the object at which the wearer is intending to look. Also similarly, the computing device 303 can control the correction or enhancement associated with one or more of the lens regions 310. This can have the effect that the eyewear 100 adjusts its correction or enhancement to match the wearer’s intended use thereof.

[389] As described with respect to Figure 1A (fig. 1), the lens regions 310 can overlap, such as shown in the figure. An example might occur when close-range vision overlaps with both central and peripheral vision. In such cases, the intersection of multiple lens regions 310, or the union of multiple lens regions 310, as appropriate, can be invoked by the computing device 303, so as to provide the wearer with the correction or enhancement to match the wearer’s intended use of the contact lens 300.

Multiple active pixels

[390] Figure 3B shows a conceptual drawing of example contact lenses having multiple individual pixels related to wearer view.

[391] Similar to Figure 1B (as shown in fig. 1), the contact lenses 300 can include multiple lens pixels 320, each disposed to correct vision or enhance vision on behalf of the wearer. For example, similarly, each lens pixel 320 can include an individual region (such as the multiple lens regions 310, only typically smaller), disposed to provide distinct corrections or enhancements to vision in the region where the wearer’s gaze direction intersects the lens pixel. Also similarly to the lens regions 310, each lens pixel 320 can be individually controlled, such as by the computing device 303, or otherwise as described herein. This can have the effect that the wearer’s vision can be corrected or enhanced for each direction where the wearer might look.

[392] Similar to Figure 1B (as shown in fig. 1), the computing device 303 can associate a distinct set of lens pixels 320 for use as a separate one of the multiple lens regions 310. For example, the computing device 303 can control the prescription with respect to each such lens region 310 by controlling each of the lens pixels 320 associated with that particular lens region. Also similarly to the possibility of overlap of lens regions 310, a set of lens pixels 320 can be associated with more than one such lens region. This can have the effect that when the computing device 303 determines that the wearer is using a particular lens region 310, it can select the set of lens pixels associated with that lens region, even if those lens pixels are also associated with another lens region. Similar to overlap of lens regions 310, the intersection of multiple sets of lens pixels 320, or the union of multiple sets of lens pixels 320, as appropriate, can be invoked by the computing device 303, so as to provide the wearer with the correction or enhancement to match the wearer’s intended use of the eyewear 100.

[393] As described with respect to Figure 1B (fig. 1), when the computing device 303 determines the wearer’s intended use of the eyewear 100, and determines the particular lens pixel 320 that the wearer’s gaze direction passes through, the computing device can invoke only that one lens pixel, possibly updating the particular lens pixel to invoke as the wearer’s gaze direction changes.
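
A sketch of resolving overlapping pixel sets by union or intersection; the region names and pixel ids are illustrative:

    def pixels_for_regions(region_pixels, active_regions, combine="union"):
        # region_pixels maps a region name to a set of lens-pixel ids; return
        # the pixels to drive for the regions currently in use.
        sets = [region_pixels[name] for name in active_regions]
        if not sets:
            return set()
        result = set(sets[0])
        for s in sets[1:]:
            result = (result | s) if combine == "union" else (result & s)
        return result

    # Example: pixels shared by overlapping close and central regions:
    # pixels_for_regions({"close": {1, 2, 3}, "central": {2, 3, 4}},
    #                    ["close", "central"], combine="intersection") -> {2, 3}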

[394] Similar to Figure 1B (as shown in fig. 1), in alternative embodiments, the contact lenses 300 can include one or more layers or alternative regions that can have their shading, or other effects, separately adjusted. Thus, in addition to or in lieu of lens pixels 320, the contact lenses 300 can use separate regions that are adjusted as a whole, rather than being adjusted as a collection of lens pixels 320. When a region is adjusted, this can have the effect that the eye can be drawn toward or away from a particular adjusted region. For example, when it is desired to encourage the user to look through a short-range focusing region, other regions can be shaded to decrease visibility, thus encouraging the user to look in a particular direction or through a particular region of the lenses.

[395] Similar to Figure 1B (fig. 1), the set of lens pixels 320 associated with each such lens region 310 can be adjusted by the computing device 303. This can have the effect that the set of lens pixels 320 associated with each such lens region 310 can be altered from time to time.

[396] For example, a selected contact lens 300 can include a first region for a first degree of vision correction, such as using refraction, such as for close-range viewing, and a second region for a second degree of vision correction, such as for longer-range viewing. A second contact lens layer (not shown) can be overlaid on the contact lens 300, so that the second lens layer can shade one or more regions of the contact lens 300. This can have the effect that the user is prompted to look in a selected direction, or through a particular region of the contact lens 300. Thus, the second lens layer can shade so as to prompt the user to view through the selected portion of the contact lens 300, thus looking at a field of view (FOV) through either a selected close-range region or a selected longer-range lens region.

Predictive techniques

[397] Similar to Figures 3A and 3B, in one embodiment, the computing device 303 can maintain a record of wearer activity with respect to use of the contact lens 300 and its lens regions 311 or 312, so as to identify which portions of the contact lens 300 should be associated with which lens regions 311 or 312 to provide the wearer with the best possible experience with using the contact lens 300. For example, when the computing device 303 determines that the wearer is most likely to need a particular prescription for a selected portion of the contact lens 300, the computing device 303 can adjust the prescription for that particular portion of the contact lens 300 so as to provide the wearer with that prescription when the wearer is using that portion of the contact lens 300.

[398] In one embodiment, the computing device 303 can determine the wearer’s most likely prescription in response to a predictive technique, such as using artificial intelligence (AI) or machine learning (ML). For example, the computing device 303 can train a recurrent neural network (RNN) to predict the wearer’s most likely prescription in response to each lens region 311 or 312 and each other set of circumstances, such as information obtained from the sensors 304. Alternatively, the computing device 303 can determine a set of regression parameters to predict the wearer’s most likely prescription in response to each lens region 311 or 312 and each other set of circumstances. The computing device 303 can use other and further AI or ML techniques, or other techniques, or otherwise as described herein, to make the desired prediction.
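
A least-squares stand-in for the regression alternative mentioned above (an RNN would be trained analogously on sequences of observations); the choice of features is illustrative:

    import numpy as np

    def fit_prescription_model(features, prescriptions):
        # Ordinary least squares: features is an (n, k) array of sensor-derived
        # values, prescriptions an (n,) array of diopter values.
        X = np.hstack([features, np.ones((features.shape[0], 1))])  # bias term
        coeffs, *_ = np.linalg.lstsq(X, prescriptions, rcond=None)
        return coeffs

    def predict_prescription(coeffs, feature_row):
        x = np.append(np.asarray(feature_row, dtype=float), 1.0)
        return float(x @ coeffs)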

[399] Similar to predictive techniques with respect to the lens regions 311 or 312, the computing device 303 can determine the wearer’s most likely prescription in response to one or more predictive techniques, such as using artificial intelligence (AI) or machine learning (ML) with respect to each lens pixel, with respect to association of lens pixels with particular lens regions 311 or 312, or otherwise as described herein. In such cases, the computing device 303 can assign individual lens pixels to selected lens regions 311 or 312, in response to one or more predictive techniques. Also similarly, the computing device 303 can adjust the set of lens pixels associated with each lens region 311 or 312 in response to a predictive technique in response to wearer actions, such as the wearer moving their head when their gaze direction should be reassociated with a different lens region 311 or 312.

[400] In one embodiment, the computing device 303 can determine the wearer’s most likely medical condition, such as in response to the sensors 304. For example, blink rate and other parameters with respect to the wearer’s eye activity can be used to determine whether the wearer is excessively anxious, depressed, sleep-deprived, or otherwise needs to rest. In such cases, the contact lens 300 can be disposed to urge the wearer to take a break and rest. This can have the effect that safety is improved, such as for commercial pilots and other pilots, long-haul truckers and other long-distance drivers, police officers, military personnel, firefighters, emergency responders, medical personnel, and other personnel often subject to long hours or stressful circumstances. Alternatively, the contact lens 300 can be disposed to urge the wearer to take a break or to obtain a stimulant, such as caffeine, sugar, a meal, or otherwise as described herein.

Fig. 4 — Facemask or helmet

[401] Fig. 4 (collectively including Figures 4A-4D) shows a conceptual drawing of example eyewear including a facemask or helmet.

[402] Figure 4A shows a conceptual drawing of an example facemask or helmet having multiple active regions related to wearer view.

[403] Figure 4B shows a conceptual drawing of an example facemask or helmet having multiple individual pixels related to wearer view.

[404] Figure 4C shows a conceptual drawing of an example goggles or visor having multiple active regions related to wearer view.

[405] Figure 4D shows a conceptual drawing of an example goggles or visor having multiple individual pixels related to wearer view.

[406] In one embodiment, an example eyewear 100 can include a facemask or helmet 400 disposed for use by the wearer (not shown), including elements shown in the figure, such as one or more of:

— a frame 401, such as possibly including a headgear 402a (such as a front piece for a facemask, or a head guard for a helmet) or an eye guard 402b;

— at least one lens 403, such as possibly a right lens 403a (shown in Figure 4A), or a left lens 403b (shown in Figure 4B), such as disposed in the eye guard 402b or integrated into the eye guard 402b as part of its solid form.

[407] Similar to the eyewear 100 described with respect to fig. 1, the frame 401 can enclose, or hold, one or more electronic elements shown in the figure, such as one or more of:

— a computing device 121 (as shown in fig. 1), such as possibly including a processor, memory or mass storage, a power supply, a clock circuit, or other elements used with computing devices;

— a communication device 122 (as shown in fig. 1), such as possibly including a wireless or wired communication element, a communication protocol stack, or other elements used with communication devices;

— one or more sensors 123 (as shown in fig. 1), such as possibly including one or more of: wearer sensors 123a (as shown in fig. 1) disposed to receive information about the wearer (or their current condition), ambient sensors 123b (as shown in fig. 1) disposed to receive information about an environment near the wearer (or its current condition), or other sensors.

[408] For example, similar to Figures 1A or 1B (as described herein), the one or more sensors 123 can also include sensors such as those described with respect to the sensors 123 coupled to the frame 111 (as shown in fig. 1). Where practical, and such as described with respect to Figures 1A or 1B, these can include one or more of:

— one or more visually evoked potential (VEP) elements disposed to measure a potential of the wearer’s visual region of the brain;

— one or more devices disposed to perform electroencephalography (EEG), electrooculography (EOG), electroretinography (ERG), optical coherence tomography (OCT), or other measures with respect to eye function;

— an electric field element disposed to measure a dipole moment of the eye;

— a gaze direction sensor (not shown), such as an element disposed to measure a reflection of an electromagnetic signal, such as infrared (IR) light directed at the eye and reflected in response to a direction of the pupil or the lens thereof. In such cases, the gaze direction sensor can use reflections or refraction from the lenses to provide a signal indicating a direction at which the wearer is looking, as described with respect to Figures 1A and 1B;

— one or more devices mounted on a vehicle or otherwise remote devices, such as described with respect to Figures 1A and 1B, and disposed to provide information to the computing device 121 or the communication device 122.

[409] In the facemask or helmet 400, or goggles or visor 450, similar to that described with respect to the glasses 110 (as shown in fig. 1), the one or more lenses 403 can be used to correct or enhance vision on behalf of the wearer, or otherwise as described herein. For example, the lenses 403 can be used to correct vision using one or more lens prescriptions associated therewith, disposed to correct for myopia, presbyopia, astigmatism, or other wearer vision artifacts. For another example, the lenses 403 can be used to enhance vision, such as with a zoom feature disposed to present the wearer with a zoomed-in or zoomed-out view of the wearer’s field of view (FOV), or with other features disposed to present the wearer with other vision enhancements described in the Incorporated Disclosures, or otherwise as described herein.

Multiple active regions

[410] Figure 4A shows a conceptual drawing of an example facemask or helmet having multiple active regions related to wearer view.

[411] Similar to Figure 1A (as shown in fig. 1), the lenses 403 can be used to correct vision on behalf of the wearer, enhance vision on behalf of the wearer, or otherwise as described herein. For example, similarly, the lenses 403 can correct for myopia, presbyopia, astigmatism, or other wearer vision artifacts. Also similarly, the lenses 403 can enhance vision, such as with a zoom feature disposed to present the wearer with a zoomed-in or zoomed-out view of the wearer’s field of view (FOV), or with other features disposed to present the wearer with other vision enhancements described in the Incorporated Disclosures, or otherwise as described herein.

[412] Similar to Figure 1A (as shown in fig. 1), the lenses 403 can include multiple lens regions 410, each disposed to correct vision or enhance vision on behalf of the wearer. For example, the multiple lens regions 410 can include a close-vision region 411, a mid-range vision region 412, a distant vision region 413, or otherwise as described herein. Also similarly, each lens region 410 can be individually controlled, such as by the computing device 121, or otherwise as described herein. This can have the effect that the wearer’s vision can be corrected or enhanced in each region where the wearer might look.

[413] As described with respect to Figure 1A (fig. 1), when the wearer moves their head, the computing device 121 can determine, such as using an accelerometer or a gyroscope (which can be included with the sensors 123), a wearer’s head movement. Similarly, the computing device 121 can also determine, such as using a dynamic eye gaze tracker (which can be included with the sensors 123), a gaze direction. Also similarly, this information can allow the computing device 121 to determine a distance of the object at which the wearer is intending to look; similarly, this information can allow the computing device 121 to determine whether the wearer is using their central vision or peripheral vision, and to control the correction or enhancement associated with one or more of the lens regions 410.

[414] As described with respect to Figure 1A (fig. 1), when the wearer shifts their gaze, the computing device 121 can determine, such as using a focal length detector (which can be included with the sensors 123), a distance to an object being viewed by the wearer. Similarly, this information can allow the computing device 121 to determine a distance of the object at which the wearer is intending to look. Also similarly, the computing device 121 can control the correction or enhancement associated with one or more of the lens regions 410. This can have the effect that the facemask 400a or helmet 400b adjusts its correction or enhancement to match the wearer’s intended use thereof.
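
The selection step described in the preceding paragraphs can be sketched as a simple mapping from a measured focal distance to a lens region. The boundary distances below are assumptions chosen for illustration only, not values given in this disclosure.

```python
# Illustrative sketch: map a measured focal distance to one of the lens
# regions 411-413. Boundary distances are assumptions for illustration.
CLOSE, MID, DISTANT = "close-vision 411", "mid-range 412", "distant 413"

def select_region(focal_distance_m: float) -> str:
    """Pick the lens region matching the distance the wearer is viewing."""
    if focal_distance_m < 0.6:      # reading / mobile-device range
        return CLOSE
    if focal_distance_m < 3.0:      # same-room / conversation range
        return MID
    return DISTANT                  # street / driving range

print(select_region(0.4))   # close-vision 411
print(select_region(12.0))  # distant 413
```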

[415] Similar to Figure 1A (fig. 1), the lenses 403 can each include multiple lens regions 410, each disposed to correct vision or enhance vision on behalf of the wearer. For example, the lens regions 410 can each include a central vision region, disposed to provide distinct corrections or enhancements to vision in a region where the wearer is looking at objects using their central vision, or one or more peripheral vision regions, disposed to provide distinct corrections or enhancements to vision in a region where the wearer is looking at objects using their peripheral vision. For another example, the lens regions 410 can each include a close-vision region, disposed to provide distinct corrections or enhancements to vision in a region where the wearer is looking at a close object, a mid-range vision region, disposed to provide distinct corrections or enhancements to vision in a region where the wearer is looking at a mid-range object, or a distant vision region, disposed to provide distinct corrections or enhancements to vision in a region where the wearer is looking at a distant object.

[416] Similar to Figure 1A (fig. 1), each lens region 410 can be individually controlled, such as by the computing device 121, or otherwise as described herein. This can have the effect that the wearer’s vision can be corrected or enhanced in each region where the wearer might look. For example, the close-vision region 411 can be disposed with a distinct prescription from the mid-range vision region 412. This can have the effect that when the wearer looks at a close object, their vision can be corrected or enhanced with respect to the prescription assigned to the close-vision region 411, or when the wearer looks at a mid-range object, vision can be corrected or enhanced with respect to the prescription assigned to the mid-range vision region 412. For another example, the central vision region can be disposed with a distinct prescription from the peripheral vision region. This can have the effect that when the wearer looks directly at an object, their vision can be corrected or enhanced with respect to the prescription assigned to the central vision region, or when the wearer uses their peripheral vision, their vision can be corrected or enhanced with respect to the prescription assigned to the peripheral vision region.

[417] As described with respect to Figure 1A, in one embodiment, when the wearer moves their head, the computing device can determine a wearer’s head movement, such as using an accelerometer or a gyroscope (which can be included with the sensors). The computing device can also determine a gaze direction, such as using a dynamic eye gaze tracker (which can be included with the sensors). This information can allow the computing device to determine whether the wearer is intending to look at a close object, a mid-range object, or a distant object; similarly, this information can allow the computing device to determine whether the wearer is using their central vision or peripheral vision. In response thereto, the computing device can control the correction or enhancement associated with one or more of the lens regions. This can have the effect that the eyewear 400 can adjust its correction or enhancement to match the wearer’s intended use thereof.

[418] As described with respect to Figure 1A, in another embodiment, when the wearer shifts their gaze, the computing device can determine a distance to an object being viewed by the wearer, such as using a focal length detector (which can be included with the sensors). This information can allow the computing device 121 to determine whether the wearer is intending to look at a close object, a mid-range object, or a distant object. In response thereto, the computing device can control the correction or enhancement associated with one or more of the lens regions. This can have the effect that the eyewear 400 can adjust its correction or enhancement to match the wearer’s intended use thereof.

[419] In one embodiment, the lens regions can overlap, such as shown in the figure. An example might occur when a close-range region overlaps with both the central vision and peripheral vision regions. In such cases, the intersection of multiple lens regions, or the union of multiple lens regions, as appropriate, can be invoked by the computing device, so as to provide the wearer with the correction or enhancement to match the wearer’s intended use of the eyewear 400.

Multiple active pixels

[420] Figure 4B shows a conceptual drawing of an example facemask or helmet having multiple individual pixels related to wearer view.

[421] Similar to Figure 1B (as shown in fig. 1), the lenses 403 can include multiple lens pixels 420, each disposed to correct vision or enhance vision on behalf of the wearer. For example, similarly, each lens pixel 420 can include an individual region (such as the multiple lens regions 410, only typically smaller), disposed to provide distinct corrections or enhancements to vision in the region where the wearer’s gaze direction intersects the lens pixel. Also similarly to the lens regions 410, each lens pixel 420 can be individually controlled, such as by the computing device 121, or otherwise as described herein. This can have the effect that the wearer’s vision can be corrected or enhanced for each direction where the wearer might look.

[422] Similar to Figure 1B (as shown in fig. 1), the computing device 121 can associate a distinct set of lens pixels 420 for use as a separate one of the multiple lens regions 410. For example, the computing device 121 can control the prescription with respect to each such lens region 410 by controlling each of the lens pixels 420 associated with that particular lens region. Also similarly to the possibility of overlap of lens regions 410, a set of lens pixels 420 can be associated with more than one such lens region. This can have the effect that when the computing device 121 determines that the wearer is using a particular lens region 410, it can select the set of lens pixels associated with that lens region, even if those lens pixels are also associated with another lens region. Similar to overlap of lens regions 410, the intersection of multiple sets of lens pixels 420, or the union of multiple sets of lens pixels 420, as appropriate, can be invoked by the computing device 121, so as to provide the wearer with the correction or enhancement to match the wearer’s intended use of the eyewear 100. As described with respect to Figure 1B (fig. 1), when the computing device 121 determines the wearer’s intended use of the eyewear 100 and the particular lens pixel 420 that the wearer’s gaze direction passes through, it can invoke only that one lens pixel, updating which lens pixel it invokes as the wearer’s gaze direction changes.
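
The bookkeeping described above, in which lens pixels can belong to more than one lens region and the computing device invokes the union or intersection of the associated sets, can be sketched as follows; the pixel identifiers and region names are illustrative assumptions.

```python
# Sketch: each region owns a set of lens pixel ids, a pixel may belong to
# several regions, and the controller invokes the union or intersection.
region_pixels = {
    "close-vision": {1, 2, 3, 4},
    "distant":      {3, 4, 5, 6},   # pixels 3-4 are shared between regions
}

def pixels_to_invoke(active_regions: list[str], combine: str = "union") -> set[int]:
    """Return the pixel ids to drive for the currently active region(s)."""
    sets = [region_pixels[r] for r in active_regions]
    if combine == "intersection":
        return set.intersection(*sets)
    return set.union(*sets)

print(pixels_to_invoke(["close-vision"]))                              # {1, 2, 3, 4}
print(pixels_to_invoke(["close-vision", "distant"], "intersection"))  # {3, 4}
```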

[423] Similar to Figure 1B (as shown in fig. 1), the set of lens pixels 420 associated with each such lens region 410 can be adjusted by the computing device 121. This can have the effect that the set of lens pixels 420 associated with each such lens region 410 can be altered from time to time.

[424] Similar to Figure 1B (as shown in fig. 1), the lenses 403 can each include one or more layers or alternative regions that can have their shading/inverse-shading, or other effects, separately adjusted. Thus, in addition to or in lieu of lens pixels 420, one or more of the lenses 403 can use separate regions that are adjusted as a whole, rather than being adjusted as a collection of individual lens pixels 420. When one such region is adjusted, this can have the effect that the eye can be drawn toward or away from a particular adjusted region. For example, when it is desirable to encourage the user to look through a particular focusing region (such as a short-range focusing region or a longer-range focusing region), other regions can be shaded/inverse-shaded to decrease visibility, thus encouraging the user to look in a particular direction or through a particular region of the lenses 403.

[425] For example, a selected lens 403 can include a first region having a first degree of vision correction, such as using refraction (such as for close-range viewing), and a second region having a second degree of vision correction, such as using a different amount of refraction (such as for longer-range viewing). A second lens layer can be overlaid on the first lens 403, so that the second lens layer can shade/inverse-shade one or more of the regions of the first lens 403. This can have the effect that the wearer is prompted to look in a selected direction, or through a particular region of the first lens 403. Thus, the second lens layer can shade/inverse-shade the first lens 403 so as to prompt the wearer to look through the selected portion of the first lens 403, which can have the effect of prompting the wearer to look at a field of view (FOV) through either a selected close-range portion of the first lens 403 or a selected longer-range portion of the first lens 403.
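
A minimal sketch of this prompting behavior follows, assuming the overlay layer exposes some per-region shading control; set_shading() is a hypothetical stand-in for that control, not an interface defined in this disclosure.

```python
# Minimal sketch: shade every region of the overlay layer except the one
# the wearer should look through, drawing the eye toward that region.
def set_shading(region: str, level: float) -> None:
    """Hypothetical actuator call for the overlay lens layer."""
    print(f"region {region}: shading {level:.0%}")

def prompt_region(regions: list[str], selected: str, shade_level: float = 0.7) -> None:
    """Darken all non-selected regions so the eye is drawn to `selected`."""
    for region in regions:
        set_shading(region, 0.0 if region == selected else shade_level)

prompt_region(["close-range", "longer-range"], selected="longer-range")
```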

Predictive techniques

[426] Similar to Figures 1A and 1B, in one embodiment, the computing device 404 can maintain a record of wearer activity with respect to use of one or more of the lenses 403 and its associated lens regions, so as to identify which portions of the particular lens 403 should be associated with which lens regions to provide the wearer with the best possible experience with using the eyewear 400. For example, when the computing device 404 determines that the wearer is most likely to need a particular prescription for a selected portion of the eyewear 400, the computing device 404 can adjust the prescription for that particular portion of the eyewear 400 so as to provide the wearer with that prescription when the wearer is using that portion of the eyewear 400.

[427] Similar to Figures 1A and 1B, in one embodiment, the computing device can determine the wearer’s most likely prescription in response to a predictive technique, such as using artificial intelligence (AI) or machine learning (ML). For example, the computing device can train a recurrent neural network (RNN) to predict the wearer’s most likely prescription in response to each lens region and each other set of circumstances, such as information obtained from the sensors. Alternatively, the computing device can determine a set of regression parameters to predict the wearer’s most likely prescription in response to each lens region and each other set of circumstances. The computing device can use other and further AI or ML techniques, or other techniques, or otherwise as described herein, to make the desired prediction.

[428] Similar to predictive techniques with respect to the lens regions, the computing device 404 can determine the wearer’s most likely prescription in response to one or more predictive techniques, such as using artificial intelligence (AI) or machine learning (ML) with respect to each lens pixel, with respect to association of lens pixels with particular lens regions, or otherwise as described herein. In such cases, the computing device can assign individual lens pixels to selected lens regions, in response to one or more predictive techniques. Also similarly, the computing device can adjust the set of lens pixels associated with each lens region in response to a predictive technique in response to wearer actions, such as the wearer moving their head when their gaze direction should be reassociated with a different lens region.

[429] In one embodiment, the computing device can determine the wearer’s most likely medical condition, such as in response to the sensors. For example, blink rate and other parameters with respect to the wearer’s eye activity can be used to determine whether the wearer is excessively anxious, depressed, sleep-deprived, or otherwise needs to rest. In such cases, the eyewear 400 can be disposed to urge the wearer to take a break and rest. This can have the effect that safety is improved, such as for commercial pilots and other pilots, long-haul truckers and other long-distance drivers, police officers, military personnel, firefighters, emergency responders, medical personnel, and other personnel often subject to long hours or stressful circumstances. Alternatively, the eyewear 400 can be disposed to urge the wearer to take a break or to obtain a stimulant, such as caffeine, sugar, a meal, or otherwise as described herein.

Goggles or visor

[430] Figures 4C-4D show conceptual drawings of example goggles or visors having multiple active regions or individual pixels related to wearer view.

[431] Figure 4C shows a conceptual drawing of example goggles or visors having multiple individual lens regions related to wearer view.

[432] Figure 4D shows a conceptual drawing of example goggles or visors having multiple individual lens pixels related to wearer view.

[433] Similar to Figures 4A-4B, in one embodiment, an example eyewear 400 can include a set of goggles or a visor 450 disposed for use by the wearer (not shown), including elements shown in the figure, such as one or more of:

— a frame 401, such as possibly including a headgear 402a (such as a front piece for a facemask, or a head guard for a helmet) or an eye guard 402b;

— at least one lens 403, such as possibly a right lens 403a or a left lens 403b, such as disposed in the eye guard 402b or integrated into the eye guard 402b as part of its solid form.

[434] Similar to the eyewear 100 described with respect to fig. 1, the frame 401 can enclose, or hold, one or more electronic elements shown in the figure, such as one or more of:

— a computing device 411 (as shown in fig. 1), such as possibly including a processor, memory or mass storage, a power supply, a clock circuit, or other elements used with computing devices;

— a communication device 412 (as shown in fig. 1), such as possibly including a wireless or wired communication element, a communication protocol stack, or other elements used with communication devices;

— one or more sensors 413 (as shown in fig. 1), such as possibly including one or more of: wearer sensors 413a (as shown in fig. 1) disposed to receive information about the wearer (or their current condition), ambient sensors 413b (as shown in fig. 1) disposed to receive information about an environment near the wearer (or its current condition), or other sensors.

[435] Similar to Figure 1A (as shown in fig. 1), the lenses 403 can include multiple lens regions 420, each disposed to correct vision or enhance vision on behalf of the wearer. For example, similarly, each lens region 420 can include an individual region, disposed to provide distinct corrections or enhancements to vision in the region where the wearer’s gaze direction intersects the lens region. Also similarly to the lens regions described with respect to Figure 1A, each lens region 420 can be individually controlled, such as by the computing device 411, or otherwise as described herein. This can have the effect that the wearer’s vision can be corrected or enhanced for each direction where the wearer might look.

[436] Similar to Figures 1A-1B, as described with respect to the eyewear 400, the one or more lenses 403 can be used to correct or enhance vision on behalf of the wearer, or otherwise as described herein. For example, the one or more lenses 403 can be used to correct vision using one or more lens prescriptions associated therewith, disposed to correct for myopia, presbyopia, astigmatism, or other wearer vision artifacts. For another example, the one or more lenses 403 can be used to enhance vision, such as with a zoom feature disposed to present the wearer with a zoomed-in or zoomed-out view of the wearer’s field of view (FOV), or with other features disposed to present the wearer with other vision enhancements described in the Incorporated Disclosures, or otherwise as described herein.

[437] Figure 4D shows a conceptual drawing of example goggles or a visor having multiple individual pixels related to wearer view.

[438] Similar to Figure 1B (as shown in fig. 1), the lenses 403 can include multiple lens pixels 430, each disposed to correct vision or enhance vision on behalf of the wearer. For example, similarly, each lens pixel 430 can include an individual region (such as the multiple lens regions 420, only typically smaller), disposed to provide distinct corrections or enhancements to vision in the region where the wearer’s gaze direction intersects the lens pixel. Also similarly to the lens regions 420, each lens pixel 430 can be individually controlled, such as by the computing device 411, or otherwise as described herein. This can have the effect that the wearer’s vision can be corrected or enhanced for each direction where the wearer might look.

[439] Similar to Figure 1B (fig. 1), the computing device 411 can associate a distinct set of lens pixels 430 for use as a separate one of the multiple lens regions 420. For example, the computing device 411 can control the prescription with respect to each such lens region 420 by controlling each of the lens pixels 430 associated with that particular lens region. Also similarly to the possibility of overlap of lens regions 420, a set of lens pixels 430 can be associated with more than one such lens region. This can have the effect that when the computing device 411 determines that the wearer is using a particular lens region 420, it can select the set of lens pixels 430 associated with that lens region, even if those lens pixels are also associated with another lens region. Similar to overlap of lens regions 420, the intersection of multiple sets of lens pixels 430, or the union of multiple sets of lens pixels, as appropriate, can be invoked by the computing device 411, so as to provide the wearer with the correction or enhancement to match the wearer’s intended use of the eyewear 400.

[440] As described with respect to Figure 1B (fig. 1), when the computing device 411 determines the wearer’s intended use of the eyewear 400, and determines the particular lens pixel 430 that the wearer’s gaze direction passes through, the computing device 411 can invoke only that one lens pixel, possibly updating the particular lens pixel to invoke as the wearer’s gaze direction might change.

[441] Similar to Figure 1B (fig. 1), the set of lens pixels 430 associated with each such lens region 420 can be adjusted by the computing device 411. This can have the effect that the set of lens pixels 430 associated with each such lens region 420 can be altered from time to time.

Fig. 5 — Scopes or sights

[442] Fig. 5 shows a conceptual drawing of example eyewear including one or more scopes or sights, including binoculars, microscopes, rifle scopes, spotting scopes, telescopes, analog or digital cameras, rangefinders, or otherwise as described herein.

[443] In one embodiment, an example eyewear 100 can include elements shown in the figure, such as one or more scopes or sights, including binoculars, microscopes, rifle scopes, spotting scopes, telescopes, analog or digital cameras, rangefinders, or otherwise as described herein. In such cases, each scope or sight can include a frame 501 disposed to maintain at least one lens 502 in position for sighting, such as in a frame holding lenses suitable for long-distance magnification (such as when used with binoculars, microscopes, rifle scopes, spotting scopes, telescopes, or otherwise as described herein), or other functions.

[444] Similar to the eyewear 100 described with respect to fig. 1, the frame 501 can enclose, or hold, one or more electronic elements shown in the figure, such as one or more of: a computing device, a communication device, one or more sensors, or otherwise as described herein.

[445] Also similar to the eyewear 100 described with respect to fig. 1, the one or more lenses 502 can be used to correct or enhance vision on behalf of the wearer, or otherwise as described herein.

[446] Also similar to the eyewear 100 described with respect to fig. 1, the one or more lenses 502 can include multiple active regions (not shown), such as close-vision regions, mid-range vision regions, distant vision regions, central vision regions, peripheral vision regions, or otherwise as described herein. Also similarly, each lens region (not shown) can be individually controlled, such as by the computing device, or otherwise as described herein. This can have the effect that the wearer’s vision can be corrected or enhanced in each region where the wearer might look.

[447] Also similar to the eyewear 100 described with respect to fig. 1, the one or more lenses 502 can include multiple active pixels (not shown), each possibly associated with one or more of the multiple active regions. Also similarly, the set of lens pixels associated with each such lens region can be adjusted by the computing device. This can have the effect that the set of lens pixels associated with each such lens region can be altered from time to time.

[448] In one embodiment, an example eyewear 100 can include one or more scopes or sights, analog or digital cameras, or otherwise as described herein, disposed to view a scene from a distant location or from a different angle than would be seen by the wearer. For example, a motion picture camera can be mounted on a vehicle, such as a racing car or an aerobatic aircraft, with an output electromagnetic signal from the camera being transmitted to the eyewear 100 or injected into the wearer’s field of vision (FOV). This can have the effect that the wearer would be able to use the eyewear 100 to see the image as provided by the scopes, cameras, or otherwise as described herein. In such cases, the wearer would be able to use the eyewear 100 to see an event, such as a sporting event, a dangerous event, or another event, without having to have line-of-sight on the objects or scene of the event.

Fig. 6 — Nerve sensors/stimulators

[449] Fig. 6 shows a conceptual drawing of example eyewear including one or more nerve sensors or stimulators.

[450] In one embodiment, an example eyewear 600 can include one or more nerve sensors or stimulators, disposed to affect nerve signals on the optic nerve 601, in a vision section 602 of the brain, in another section of the brain, or otherwise as described herein. The nerve sensors or stimulators can include elements shown in the figure, such as one or more of:

— one or more electromagnetic sensors 610, disposed to receive electromagnetic signals from the optic nerve 601, the vision section 602 of the brain, another section of the brain, or otherwise as described herein;

— one or more electromagnetic stimulators 620, disposed to provide and insert electromagnetic signals into the optic nerve 601, into the vision section 602 of the brain, into another section of the brain; or otherwise as described herein.

[451] For example, the electromagnetic sensors 610 can be disposed to receive electromagnetic signals from the optic nerve 601, to couple those electromagnetic signals, or processed variants thereof, to a computing device (not shown). In such cases, the electromagnetic sensors 610 can determine which signals from the optic nerve 601 are associated with which portions of an image viewed by the wearer. Similarly, the electromagnetic sensors 610 can be disposed to receive electromagnetic signals from the vision section of the brain, another section of the brain, or otherwise as described herein.

[452] In such cases, a computing device (not shown) can compare electromagnetic signals from particular portions (not shown) of the optic nerve 601 with the wearer’s experience of a viewed image (not shown). With information gleaned from the comparison, the computing device can determine an image viewed by the wearer in response to the electromagnetic signals from the optic nerve 601. Similarly, the computing device can compare reception of electromagnetic signals from particular portions of the vision section of the brain, from particular portions of another section of the brain, or otherwise as described herein.

[453] In such cases, a computing device can compare electromagnetic signals injected into particular portions of the optic nerve 601 with the wearer’s experience of a viewed image, such as an alteration of a natural viewed image. With information gleaned from the comparison, the computing device can determine how to make adjustments to a natural viewed image, such as in response to an augmented reality (AR) image or signal. Similarly, the computing device can compare injection of electromagnetic signals into particular portions of the vision section of the brain, into particular portions of another section of the brain, or otherwise as described herein.

[454] It might occur that the electromagnetic signals associated with particular portions of the optic nerve 601, the vision section 602 of the brain, or another section of the brain, could be different for distinct wearers. In such cases, the computing device can determine an association of portions of the viewed image with portions of the optic nerve 601, the vision section 602 of the brain, or another section of the brain, or otherwise as described herein, for each individual wearer.
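
The per-wearer association described above can be sketched as a simple calibration pass, assuming known test stimuli can be shown while signal energy is recorded per electrode channel; the data layout, values, and channel counts below are illustrative assumptions only.

```python
# Hedged sketch: show known test regions, record mean signal energy per
# channel, and associate each channel with the region it responds to most.
import numpy as np

def calibrate(recordings: dict[str, np.ndarray]) -> dict[int, str]:
    """recordings: stimulus-region name -> (n_channels,) mean signal energy.

    Returns: channel index -> region that channel is most responsive to.
    """
    regions = list(recordings)
    energy = np.stack([recordings[r] for r in regions])  # (n_regions, n_channels)
    best = energy.argmax(axis=0)                         # strongest region per channel
    return {ch: regions[r] for ch, r in enumerate(best)}

recordings = {
    "upper-left FOV":  np.array([0.9, 0.1, 0.2]),
    "lower-right FOV": np.array([0.2, 0.8, 0.1]),
    "center FOV":      np.array([0.1, 0.2, 0.7]),
}
print(calibrate(recordings))
# {0: 'upper-left FOV', 1: 'lower-right FOV', 2: 'center FOV'}
```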

[455] For another example, the electromagnetic stimulators 620 can apply an electromagnetic field to the optic nerve 601, or to a visual portion of the brain, to encourage the wearer’s eye to gaze in a selected direction. This can have the effect of ameliorating amblyopia (“lazy eye”), exotropia (misaligned eye or “wall eye”), and possibly other directional issues with respect to the eyes. This can also have the effect of encouraging the wearer to look at a particular area or object, such as a target; this can be useful with respect to police officers, military personnel, and in advertising.

[456] Similarly, the electromagnetic stimulators 620 can apply an electromagnetic field to the optic nerve 601, or to a visual portion of the brain, to encourage the wearer’s pupil or iris to contract or to expand. This can have the effect that the wearer’s eye is protected against excessive infalling light (such as sudden brightness or glare), or excessive infalling light of a particular color or frequency range (such as excessive blue or UV).

Fig. 7 - Used with display

[457] Fig. 7 (collectively including Figures 7A and 7B) shows a conceptual drawing of eyewear used with an example display.

[458] Figure 7A shows a conceptual drawing of the example display disposed on or in a building or structure.

[459] Figure 7B shows a conceptual drawing of the example display disposed in a vehicle.

Eyewear used with a display

[460] In one embodiment, multiple sets of eyewear 700 can be used with a display 701, such as a stationary display 701a (in or on a building or structure) or a moving display 701b (in or on a vehicle). For example, the display 701 can be disposed so as to be viewable by an audience, such as in a public arena. The display 701 can be operated at a refresh rate (frames per second or fields per second) higher than the rate desired by any single viewer (such as a refresh rate of 120 Hz, 170 Hz, 240 Hz, 360 Hz, or otherwise as described herein), while each set of eyewear 700 can present to its wearer only a selected subset of the frames being presented by the display 701. This can have the effect that each wearer of a separate set of eyewear 700 can receive a separate subset of the frames being presented by the display 701, and thus a separate (motion picture) presentation.
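
The frame-subset scheme can be sketched as a shutter schedule, assuming each set of eyewear 700 is assigned a channel number and passes only every k-th frame; the channel-assignment mechanism and rates below are assumptions for illustration, not details from this disclosure.

```python
# Sketch: a display running at k times the per-viewer rate interleaves k
# presentations; eyewear on a given channel opens its shutter only on
# frames assigned to that channel.
def shutter_open(frame_index: int, channel: int, num_channels: int) -> bool:
    """Eyewear on `channel` passes only every num_channels-th frame."""
    return frame_index % num_channels == channel

# Example: a 240 Hz display carrying four 60 Hz presentations.
for frame in range(8):
    visible = [ch for ch in range(4) if shutter_open(frame, ch, 4)]
    print(f"frame {frame}: visible to channel {visible[0]}")
```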

[461] For example, the display 701 can be operated at a refresh rate four times (4x) or eight times (8x) an ordinary refresh rate for a motion picture presentation, thus providing the possibility of four (or eight) separate motion picture presentations being displayed concurrently. In such cases,

— one such presentation can be associated with an ordinary motion picture, for which eyewear 700 is available to viewers at no extra cost or only a nominal extra cost;

— one such presentation can be associated with a motion picture that is reserved for children or other sensitive viewers, such as a presentation that has been edited to remove one or more of: sex, violence, conflict, frightening images, other adult themes, or otherwise as described herein (such as a non-conflict version, a non-violent version, a “G”-rated version, a “PG”-rated version, an “R”-rated version, an “X”-rated version, of substantially the same motion picture);

— one such presentation can be associated with a motion picture that has been edited to remove “triggering” images or scenes, such as images or scenes that have a substantial effect on wearers with epilepsy, PTSD, psychological sensitivities, images offensive to particular social groups, other triggering images or scenes, or otherwise as described herein;

— one such presentation can be associated with a motion picture that is a “premium” version of the motion picture, such as a “director’s cut”, a version having additional story elements, a version having superior special effects; or otherwise as described herein.

[462] For example, the display 701 can be operated with two, three, or more different presentations, such as those different possibilities described above. A first presentation can include a “G”-rated version, having only “G”-rated scenes; a second “R”-rated presentation can include all the “G”-rated scenes plus other more explicit scenes; a third presentation can include all the “G”-rated scenes plus other more explicit scenes distinct from the “R”-rated presentation or in addition thereto. In such cases, wearers using “G”-rated eyewear 700 would see only the “G”-rated presentation, wearers using “R”-rated eyewear would see only the “R”-rated presentation, and wearers using “X”-rated eyewear would see only the “X”-rated presentation. However, at least some scenes can be shared between pairs of those presentations, and possibly some scenes can be shared among all those presentations.

[463] After reading this Application, those skilled in the art would recognize that the display 701 can present a wide variety of different types of presentations, both including the possible presentations described above, as well as other possibilities. For example, the display 701 can be disposed to present a first version of a motion picture image in ordinary circumstances, or an alternative second version of the motion picture image in circumstances where legal restrictions limit the motion picture images allowed to be shown. This can have the effect that multiple versions of a motion picture image can be distributed, even when the unrestricted version would be banned or otherwise restricted in particular jurisdictions.

[464] For another example, the display 701 can be operated at a refresh rate two times (2x) or four times (4x) an ordinary refresh rate for a motion picture presentation, thus providing a possibility of a three-dimensional (3D) motion picture image being displayed. In such cases, a set of eyewear 700 can be disposed to present selected frames to different ones of the wearer’s eyes. The selected frames can differ slightly, such as with respect to point of view (POV). This can have the effect that the wearer’s brain can integrate the selected frames, with the wearer seeing a 3D image. For example, a 3D motion picture image can be presented to the wearer by interlacing whether the right lens is open, the left lens is open, both lenses are open, or neither lens is open. This also can have the effect that multiple separate 3D images can be provided to different wearers concurrently.

[465] To present a 3D image, the selected frames can be distinguished by one or more of:

— time division multiple access, with a portion of the selected frames, such as about half of them, presented for a POV for the wearer’s right eye and a portion presented for a POV for the wearer’s left eye;

— color division multiple access, with the portion presented for the wearer’s right eye being saturated by a first color (which is filtered by the eyewear’s right lens) and the portion presented for the wearer’s left eye being saturated by a second color (which is filtered by the eyewear’s left lens);

— polarization division multiple access, with the portion presented for the wearer’s right eye being polarized in a first manner (such as polarized vertically or right-circularly polarized), which is filtered by the eyewear’s right lens, and the portion presented for the wearer’s left eye being polarized in a second manner (such as polarized horizontally or left-circularly polarized), which is filtered by the eyewear’s left lens; or otherwise as described herein.

[466] Each of these techniques can have the effect that the wearer’s brain integrates the selected frames into a 3D motion picture image.

[467] For another example, the display 701 can be operated in response to movement by the wearer, such as with respect to a theme-park entertainment event or ride. In such cases, when the wearer enters a designated zone, such as within a tunnel or other location without natural light, the display 701 can switch from presenting an ordinary two-dimensional (2D) motion picture image to presenting a 3D motion picture image. Similarly, the display 701 can be operated in response to an ambient light level experienced by the wearer, such as with respect to a daytime or night-time event. In such cases, when the time changes from a daytime event to a night-time event, the display 701 can switch from presenting an ordinary two-dimensional (2D) motion picture image to presenting a 3D motion picture image.

Stationary display

[468] Figure 7A shows a conceptual drawing of the example display disposed in or on a building or structure.

[469] As shown in the figure, the display 701 can be mounted or projected in or on a surface of a building or structure 710, such as a movie theatre screen, an external or internal wall 711 of a building, a temporary backdrop, fog (for laser shows or other projected shows), a water surface, or otherwise as described herein. In such cases, an audience disposed to view the presentation can use eyewear 700 to obtain an enhanced or otherwise edited motion picture image, or otherwise as described herein.

Moving display

[470] Figure 7B shows a conceptual drawing of the example display disposed in or on a vehicle.

[471] As shown in the figure, the display 701 can be mounted or projected in or on a surface of a vehicle 720, such as an external window (which could possibly be used for advertising), an internal display (which could possibly be used for entertainment), or a windshield (which could possibly be used for a heads-up display, or “HUD”, for a driver or navigator). In such cases, a driver or navigator can obtain information with respect to manipulating the vehicle, passengers can obtain information with respect to entertainment or travel, or otherwise as described herein.

[472] Alternatively, an audience disposed to view the vehicle can use eyewear 700 to obtain an enhanced, or otherwise edited, motion picture image. This can have the effect that the vehicle can provide advertising or other information to the audience. For example, the vehicle can include a blimp or other lighter-than-air aircraft, onto which a motion picture image is displayed, similar to the display 701 described with respect to Figure 7A. An audience, such as a set of people attending a sports event, can view the vehicle 720 and each obtain information selected by their individual eyewear.

[473] The vehicle can include an internal panel 702, such as (for an automobile) a divider between the driver’s and the passengers’ compartment, onto which a motion picture image is displayed, similar to the display 701 described with respect to Figure 7A. The passengers can view the display 701 and each see a motion picture selected by their individual eyewear. This can have the effect that multiple passengers can each view different motion pictures at full size, even though only one such display 701 is available for viewing.

Fig. 8 - Hybrid personalization

[474] Fig. 8 shows a conceptual drawing of an example eyewear used to provide hybrid personalization.

[475] An example eyewear 800 can include one or more elements as shown in the figure, including at least

— one or more lenses 810a, 810b, such as lenses mounted on a frame, or such as contact lenses disposed for wearing by a user (not shown);

— one or more regions 820 disposed on at least one first lens 810a, such as a close-vision region 821, disposed to provide distinct corrections or enhancements to vision in a region where the wearer is looking at a close object, or a distant vision region 822, disposed to provide distinct corrections or enhancements to vision in a region where the wearer is looking at a distant object;

— one or more regions 830 disposed on at least one second lens 810b, such as a close-vision shading region 831 aligned with the close-vision region 821, or a distant-vision shading region 832 aligned with the distant vision region 822.

Multiple lens regions and combinations

[476] For example, the one or more regions 820 can be disposed to include regions composable into a bifocal, trifocal, progressive, or otherwise multi-focal lens. The bifocal, trifocal, progressive, or otherwise multi-focal lenses can be disposed to include different amounts of refraction, such as might be appropriate for viewing at different ranges, so as to optimize the wearer’s clarity of vision or visual acuity. This can have the effect that the wearer can be provided with a relatively optimized view despite the distance of objects at which they are looking.

[477] In such cases, a bifocal lens can include a close-vision region 821 and a distant vision region 822. A trifocal lens can include a close-vision region 821, a distant vision region 822, and a mid-range vision region. A progressive lens can include multiple regions 820 having distinct corrections, either distinguished at borders between regions or relatively smoothly progressing from a first to a second correction, similarly from a second to a third correction, or similarly from each correction to a next correction. A multi-focal lens can include regions 820 disposed at upper/lower ranges of the user’s field of view (FOV), regions disposed at right/left ranges of the user’s FOV, or otherwise disposed in response to the user’s gaze direction and/or focal length.

[478] In such cases, the first region (such as the close-vision region 821) and the second region (such as the distant vision region 822) can be adjusted so as to optimize the wearer’s clarity of vision and/or visual acuity while looking through those regions of the eyewear. For one example, the amount of refraction can be adjusted in each region, such as using electronic control of the refraction. For another example, the amount of refraction can be adjusted in response to one or more of: (A) a focal length at which the wearer is looking; (B) a recognized object at which the wearer is looking; or as otherwise described herein. This can have the effect that the wearer can be provided with a relatively optimized view despite the distance of objects at which they are looking.
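
The region layouts described above can be sketched as a small data structure in which each region carries its own, possibly electronically adjustable, refraction; the diopter values below are placeholders for illustration, not prescriptions from this disclosure.

```python
# Illustrative data structure: each lens region carries its own refraction,
# which may be retuned under electronic control (e.g. from a focal length).
from dataclasses import dataclass

@dataclass
class LensRegion:
    name: str
    diopters: float          # current refraction for this region
    adjustable: bool = True  # can be retuned under electronic control

def trifocal() -> list[LensRegion]:
    """A trifocal layout with placeholder prescriptions."""
    return [
        LensRegion("close-vision 821", diopters=+2.5),
        LensRegion("mid-range", diopters=+1.0),
        LensRegion("distant 822", diopters=-0.5),
    ]

def adjust_region(regions: list[LensRegion], name: str, new_diopters: float) -> None:
    """Retune one region, e.g. in response to a measured focal length."""
    for region in regions:
        if region.name == name and region.adjustable:
            region.diopters = new_diopters

lens = trifocal()
adjust_region(lens, "close-vision 821", +2.75)
print(lens[0])  # LensRegion(name='close-vision 821', diopters=2.75, adjustable=True)
```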

(Each region associated with a selected function)

[479] As shown in the figure, the close-vision region 821 can occupy a location through which the user will look when the user is gazing at a relatively close object, such as a book, a computer monitor, a smartphone or other mobile device, or otherwise as described herein. As described herein, the close-vision region 821 can be disposed with a relatively fixed amount of refraction, associated with a likely distance at which the user will be looking through that region, or can be disposed to be adjustable in response to one or more factors, such as relating to the user’s visual intent. Similarly, the distant vision region 822 can occupy a location through which the user will look when the user is gazing at a relatively distant object, such as across or down a street, through a window, or otherwise as described herein. Similar to the close-vision region 821, the distant vision region 822 can be disposed with a relatively fixed amount of refraction or can be disposed to be adjustable in response to one or more factors.

[480] Alternatively, the lenses 810 can include regions in addition to or in lieu of the close-vision region 821 or the distant vision region 822, such as a mid-range region (not shown) disposed to provide distinct corrections or enhancements to vision in a region where the wearer is looking at a mid-range object. For example, the mid-range region can be disposed to occupy a location through which the user would look when the user is gazing at a relatively mid-range object, such as an object in the same indoor room, a person with whom the user is speaking, or otherwise as described herein. Similar to the close-vision region 821 and the distant vision region 822, the mid-range region can be disposed with a relatively fixed amount of refraction or can be disposed to be adjustable in response to one or more factors.

(Lenses with combined functions)

[481] For another example, the one or more regions 820 can be disposed to include regions with differing effects. The regions with differing effects can be disposed to provide, in their differing regions, distinct effects or a combination of effects, such as (A) one or more refractive effects; (B) one or more shading/inverse-shading effects; (C) one or more coloring/tinting effects; (D) one or more polarization effects; (E) one or more prismatic angle deflection effects; (F) one or more dynamic visual optimization effects; or otherwise as described herein.

[482] For another example, the one or more regions 820 can be disposed to include overlapping lenses, such as each having differing effects, possibly combinable to each provide a combination of those effects. For example, a first region can include a combination of a selected refractive effect and a selected shading/inverse-shading effect. A second region can include a combination of a different selected refractive effect and/or a different selected shading/inverse-shading effect. A third region can include a combination of a selected refractive effect and/or a selected coloring/tinting effect. Other and further possibilities of combinations, as described herein, are also possible.

[483] For another example, as described herein, the refractive effects can be combined with the shading/inverse-shading effects, so as to provide a first region (such as a close-vision region 821) having a first refractive effect and a second region (such as a distant vision region 822) having a second refractive effect, each of which has a different shading/inverse-shading effect. This can have the effect that the wearer, when looking through the first region (such as the close-vision region 821), can see using the first refractive effect and has that first refractive effect identified using its particular shading/inverse-shading effect; while when looking through the second region (such as the distant vision region 822), can see using the second refractive effect and has that second refractive effect identified using its particular shading/inverse-shading effect.

[484] As described herein, a first lens can be disposed so as to allow the user to look therethrough, while a second lens can be disposed aligned with the first lens and capable of adjusting one or more visual effects of the first lens. For example, the second lens can be disposed to perform shading/inverse-shading with respect to either a first or a second visual region associated with the first lens. Thus, the first lens can include a first region (such as a close-vision region 821) and a second region (such as a distant vision region 822), while the second lens can be disposed to perform shading/inverse-shading so as to encourage the user to look through a selected one of the close-vision region 821 or the distant vision region 822.

[485] In further examples, the second lens can be responsive to features of the user’s eye, so as to encourage the user to look through either the close-vision region 821 or the distant vision region 822, to promote eye health. Similarly, the second lens can be responsive to viewing features of the user’s field of view or the user’s viewing attention pattern, such as bright lights or lights with glare or flashing, concentration on small objects, lights with a potentially adverse color balance, or aspects of the wearer’s field of view, vision, attention, or medical conditions.

(Lenses with multiple functions)

[486] In another embodiment, the eyewear can include lenses with multiple regions, or multiple lenses, each having different functions, either overlapping or each associated with a separate portion of the user’s field of view. For example, when the lenses have upper/lower portions, such as in “reader” glasses or such as in bifocal lenses, the upper/lower portions of the lenses can each be disposed with separate functions. In addition to or in lieu of different refractive functions, the upper/lower portions of the lenses can each include different shading/inverse-shading functions, different coloring/tinting or color balancing functions, different polarization or prismatic deflection functions, different dynamic visual optimization functions, or as otherwise described herein.

[487] For example, the different functions can be responsive to selected features of the user’s field of view, such as (A) content recognized with respect to the selected portion of the user’s field of view; (B) ambient circumstances recognized with respect to the portion of the user’s field of view; (C) user inputs provided at a time when the user is viewing content using the lenses; (D) “bookmarks” describing what functions to be performed, defined by the user with respect to one or more of the preceding factors; or as otherwise described herein. In the latter case, the user can provide a description, or a set of examples, for which the eyewear can recognize and maintain each such bookmark; when the eyewear recognizes one or more of such bookmarks, the eyewear can direct the lenses, or portions thereof, to perform the functions defined by the user with respect to that bookmark.

Triggering selection of an appropriate region

[488] In one embodiment, an appropriate region can be selected in response to a focal length or a gaze direction, such as might be determined by a dynamic eye movement sensor. When the dynamic eye movement sensor detects the particular focal length, gaze direction, or other reason for selecting a particular region, the identified region can be selected in response to (A) an action by the user, such as a gesture; (B) a timer or time duration; (C) an external event; (D) a communication from another device; or another triggering activity, such as further described herein.

(Action by the user)

[489] For example, a dynamic eye movement sensor can be disposed to identify eye movements, pupil direction, pupil width, interpupillary distance, blinking, blink rate, “dry eye” symptoms, or other features with respect to the eye, so as to identify a direction/distance at which the user is looking, or a time during which the user has been doing so. When the user is looking at a relatively close object, such as when reading a book, a smartphone, or another mobile device, a close-vision region 821 can be selected. Alternatively, when the user is looking at a relatively distant object, such as when scanning a field of view (FOV) of an aircraft, car/truck, or other vehicle, a distant vision region 822 can be selected.

[490] In one embodiment, an appropriate region can be selected in response to another prompt, such as an eye gesture, facial gesture, hand gesture, head movement, or mouth gesture. For example, the user can select a particular region 820 by directing their eye at that region and concurrently blinking/squinting. A sensor can be coupled to the dynamic eye tracking sensor and can detect the blink/squint (or one after the other, or both concurrently). In response thereto, the eyewear can select the particular region 820 at which the eye was directed when the eye gesture (or another gesture) occurred.

[491] For example, the eye gesture can include a blink or squint, a glance toward a selected direction, an eyeroll, a selected eye movement (such as a deliberate gaze at a particular object), or another activity by the user’s eyes. The eye gesture can include multiple ones of such gestures or combinations thereof.

[492] For example, the facial gesture can include a smile, half-smile, smirk, sneer, spasm, twitch, wince, wink, frown, grimace, grin, pucker, cheek movement, lip movement, nose movement, tongue movement, or another activity by the user’s facial muscles. The facial gesture can include multiple ones of such gestures or combinations thereof.

[493] For example, the hand gesture can include touching a button or surface, sliding a finger or other part of the hand along the button or surface, touching a first and second part of the hand, bringing one or more fingers or other parts of the hand within the user’s field of view (FOV), moving one or more fingers or other parts of the hand in a selected manner, or another activity by the user’s fingers/hands or related anatomy. The hand gesture can include multiple ones of such gestures or combinations thereof.

[494] For example, the head movement can include a hair flip, a nod, jerk, rattle, roll, shake, tilt, turn, twist, or another up/down or right/left movement, or another activity by the user’s head or related anatomy. The head movement can include multiple ones of such movements or combinations thereof. Examples can include multiple successive nods, tilts, turns, or other up/down or right/left head movements by the wearer, possibly in combination with other gestures (as further described herein).

[495] For example, the mouth gesture can include clenching/grinding of teeth, or another activity by the user’s mouth or related anatomy. The mouth gesture can include multiple ones of such gestures or combinations thereof, possibly in combination with other gestures (as further described herein).

[496] In one embodiment, the sensor can be disposed to detect combinations of first and second eye gestures, facial gestures, hand gestures/movements, head movements, mouth gestures, combinations or sequences of first and second types of gestures, or any other detectable action by the user.
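
A sketch of how such first-and-second gesture combinations might be matched is shown below; the event names and the 1.5-second window are illustrative assumptions:

```python
import time

class GestureSequenceDetector:
    """Match a gesture combination, e.g. 'glance up-left, then blink'."""

    def __init__(self, pattern, window_s=1.5):
        self.pattern = list(pattern)  # e.g. ["glance_up_left", "blink"]
        self.window_s = window_s
        self.history = []             # (timestamp, event) pairs

    def feed(self, event, now=None):
        """Record one detected gesture; return True when the full
        pattern has occurred, in order, within the time window."""
        now = time.monotonic() if now is None else now
        self.history = [(t, e) for t, e in self.history
                        if now - t <= self.window_s]
        self.history.append((now, event))
        events = iter(e for _, e in self.history)
        return all(step in events for step in self.pattern)

# Example: trigger when the user glances up-left and then blinks.
detector = GestureSequenceDetector(["glance_up_left", "blink"])
assert not detector.feed("glance_up_left", now=0.0)
assert detector.feed("blink", now=0.5)
```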

[497] For example, the sensor can be disposed to detect when the user glances up and to the left, followed by blinking. For another example, the sensor can be disposed to detect when the user blinks while tilting their head to the right. Those of ordinary skill in the art would readily recognize the wide variety of possible distinct or alternative combinations, orders, or other varieties of user gestures that can be identified to trigger selection of one or more particular regions.

(Timer or time duration)

[498] In one embodiment, the eyewear can be disposed to detect when a selected time duration has occurred. For example, the eyewear can detect when the user has been looking at a particular region of their field of view for longer than a selected duration and can trigger an action in response to that duration; examples follow, with an illustrative sketch after them.

— If the user has been looking at a close-range object for longer than a threshold duration (perhaps 10 minutes, although longer or shorter durations might be called for), the eyewear can trigger shading on a portion of the lens to encourage the user to look away, such as at a longer-range region of their field of view.

— If the user has been looking at a relatively small portion (perhaps 10 degrees square, although larger or smaller portions might be called for) of their field of view for longer than a threshold duration, the eyewear can trigger shading on a portion of the lens to encourage the user to look away, such as at a larger or wider region of their field of view.

— If the user has been looking at a relatively bright region of their field of view (or a region having relatively high contrast, or a region having a relatively high amount of glare), or a region with a flashing light (such as a strobe light) for longer than a threshold duration, the eyewear can trigger shading on a portion of the lens to encourage the user to look away, such as at a region of their field of view that is less bright, has less contrast, or has less glare.

— If the user has been looking at a portion of their field of view having a relatively high saturation of a particular hue (such as bright blue, bright white, bright red, or another color having an effect on the eyes, on the user’s emotional state, or correlated with migraines or other medical issues), the eyewear can trigger shading on a portion of the lens to encourage the user to look away, such as at a region of their field of view that is less colorful or otherwise less likely to trigger a migraine or another medical issue. Alternatively, in such cases, the eyewear can adjust the color balance available to the wearer’s eye, such as by electronically or otherwise altering the chromatic filtering of the lens.
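
The duration-based conditions above reduce to simple threshold checks over sensed state. In the sketch below, every field of `state` and every threshold is an illustrative assumption (the text itself hedges the 10-minute and 10-degree figures):

```python
def duration_triggers(state, now_s):
    """Evaluate the duration-based conditions listed above and return
    the shading actions to apply; names and thresholds are illustrative."""
    actions = []
    if now_s - state["close_range_since_s"] > 10 * 60:         # ~10 minutes
        actions.append("shade close-range region")    # urge a longer-range view
    if state["gaze_patch_deg"] < 10 and now_s - state["same_patch_since_s"] > 10 * 60:
        actions.append("shade fixated region")        # urge a wider view
    if state["bright_or_flashing"] and now_s - state["bright_since_s"] > 30:
        actions.append("shade bright region")         # urge a dimmer view
    if state["hue_saturation"] > 0.9:                 # e.g., saturated bright blue
        actions.append("rebalance chromatic filter")  # mitigate migraine triggers
    return actions
```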

[499] In these examples, or otherwise when the sensor detects that the user would benefit from looking in another direction for some time, the sensor can act to encourage or urge the user to look at another distance, at another object, in another direction, or otherwise ameliorate the issue detected by the sensor.

(External event)

[500] In one embodiment, the eyewear can be disposed to detect when an external event occurs that might have an adverse effect on the wearer’s vision. Examples can include: the rapid onset of a change in relative brightness in the wearer’s field of view (FOV), such as a sudden bright light or a sudden removal of light and onset of relative darkness; or the rapid onset of a change in glare, flashing, color balance, or other visual effects that might have an adverse effect on the wearer’s vision, prompt a migraine, or otherwise affect a wearer’s medical condition.

[501] In such cases, the eyewear can include a sensor disposed to detect when the external event occurs (or is occurring, or is about to occur, such as responsive to known initial states of the external event or known precursors of the external event). For example, the eyewear can detect a sudden bright light in response to a derivative in a measure of luminance in the wearer’s field of view (FOV), or in response to detecting an object likely to produce a sudden bright light, such as a floodlight.

[502] When the eyewear, or a sensor disposed thereon, detects an onset of an external event such as a sudden bright light, the eyewear can be disposed to react rapidly, such as by electronically shading one or more regions of the lenses. Electronic shading can be much faster than chemical shading. This can have the effect that, after an initial detection time (of about 50 msec, more or less), nearly all of the incoming bright light can be shaded, and the wearer can be protected from the sudden bright light.
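
A sudden-onset detector of the kind described, keyed to a large positive derivative of measured luminance, can be sketched as below; the rise threshold and the millisecond bookkeeping are assumptions:

```python
class SuddenLightDetector:
    """Flag the rapid onset of bright light from the time derivative
    of sensed luminance, so shading can be applied within ~50 ms."""

    def __init__(self, rise_lux_per_ms=50.0):
        self.rise_lux_per_ms = rise_lux_per_ms  # illustrative threshold
        self.prev = None                        # last (time_ms, lux) sample

    def sample(self, time_ms, lux):
        """Return True when the caller should shade immediately."""
        triggered = False
        if self.prev is not None:
            dt_ms = time_ms - self.prev[0]
            if dt_ms > 0 and (lux - self.prev[1]) / dt_ms > self.rise_lux_per_ms:
                triggered = True
        self.prev = (time_ms, lux)
        return triggered
```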

[503] In one embodiment, law enforcement personnel, military personnel, or other personnel using sudden bright light as a nonlethal area effect weapon (such as when using a flashbang grenade or a bright floodlight), can be shaded from the effects of that sudden bright light. This can have the effect that law enforcement personnel, military personnel, or other related personnel can deploy sudden bright light as a nonlethal area effect weapon without being subject to ill effects therefrom. Similarly, animal control personnel can deploy sudden bright light as a nonlethal area effect weapon applied to wild animals or other natural pests, without those personnel themselves being subject to ill effects therefrom.

[504] For example, law enforcement personnel, military personnel, or other personnel can use bright light (sudden or otherwise) as a nonlethal area effect weapon to suppress or disable persons or animals within a region. In such cases, personnel operating the bright light can direct the bright light into a region, such as using a floodlight or a bright flashlight into that region. Persons or animals within the region into which the bright light is directed will thus lose visual acuity and may suffer disorientation or other loss of direction, and may possibly suffer loss of initiative or other disabling effects. While disabling, these effects would be nonlethal and can thus be applied against persons or animals for which lethal force is unwarranted or undesired. Since law enforcement personnel, military personnel, or other personnel can be shielded against those effects using eyewear disposed to shade/inverse-shade against the bright light being applied, those personnel can continue to operate without disabling effect, even as the target of the bright light is disabled.

[505] In one embodiment, personnel piloting a vehicle, such as an aircraft, racing car, sailboat or speedboat, or another vehicle involving attention to operation, can be shaded from the effects of sudden bright light, such as when a bright light (e.g., the sun) is revealed after an obstacle is passed, or such as when another vehicle’s “brights” are suddenly revealed after cresting a hill, turning a corner, or passing a truck. For example, when the wearer is driving a racing car through a tunnel, the ambient environment as viewed by the wearer can change from relatively dark in the tunnel to quite bright as the wearer exits the tunnel. This can have the effect that the driver can drive out of the tunnel at relatively high speed without having to wait for their eyes to adjust to the ambient environmental change in brightness.

(Communication from another device)

[506] In one embodiment, the eyewear can be disposed to detect when a signal is received from a second device. For example, the second device can be a flashbang grenade, floodlight, or another device disposed to generate a sudden bright light. When the eyewear detects the signal from the second device, the eyewear can be alerted that a change is about to occur in the wearer’s ambient environment that might affect the wearer’s eyesight, prompt a migraine, or affect another medical condition. This can have the effect that law enforcement personnel, military personnel, or other personnel in the presence of a device that can generate a sudden bright light, will be alerted ahead of time to the onset of the sudden bright light. Accordingly, their eyewear can shade against the sudden bright light, allowing them to continue without substantial debilitating effect from the sudden bright light.

[507] For example, a flashbang grenade or a floodlight can transmit a (possibly encrypted) message to the eyewear before triggering a sudden bright light. The eyewear can receive the message and initiate shading of the lenses before the sudden bright light hits the wearer’s eyes. This can have the effect that the law enforcement personnel, the military personnel, or other personnel can be present when nonlethal visual weapons are used, without having to coordinate with the users of those weapons.

[508] For example, a racing car can receive a message from a device disposed near the exit of a tunnel, informing the driver’s eyewear that there will be a sudden bright light when the vehicle exits the tunnel. The eyewear can receive the message and shade the lenses just as the vehicle exits the tunnel, improving visibility and reducing strain on the wearer’s eyes. Similarly, for another example, the racing car can receive a message from a second device disposed near the entrance of the tunnel, informing the driver’s eyewear that there will be a sudden reduction in light when the vehicle enters the tunnel. The eyewear can receive the message and reduce shading on the lenses just as the vehicle enters the tunnel, improving visibility and clarity for the driver while maneuvering in the tunnel. This can have the effect that the driver is not debilitated by light changes when entering or exiting the tunnel.
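
The advance-warning exchange in these examples amounts to a small message protocol. The sketch below invents a JSON payload and a `lens` interface purely for illustration; a deployed system would presumably authenticate and encrypt the message, as the text notes:

```python
import json

def handle_warning_message(raw, lens):
    """React to an advance warning from a second device (flashbang,
    floodlight, tunnel beacon); the format and fields are invented."""
    msg = json.loads(raw)
    if msg["event"] == "bright-light":
        # shade before the light arrives; default to full shading
        lens.set_shading(level=msg.get("level", 1.0))
    elif msg["event"] == "light-drop":
        # entering darkness (e.g., a tunnel): remove shading
        lens.set_shading(level=0.0)

# Example payload a tunnel-exit beacon might broadcast:
# b'{"event": "bright-light", "level": 0.8}'
```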

Urging the user to use an appropriate region

[509] The one or more regions 820 can combine distinct corrections or enhancements to vision, in combination with one or more other techniques to urge the user to direct their gaze in a direction through those locations when gazing at an object at an associated distance. For example, the eyewear 800 can be disposed to shade a portion of the lens so as to urge the user to look through a portion of their field of view that does not involve looking through that portion of the lens.

— If the user gazes at an object at a close distance, the one or more regions 820 can be disposed to urge the user to look through the close-vision region 821. This can occur when the user is looking through a portion of the lens, such as when the lens includes a bifocal, trifocal, multifocal, or progressive lens (e.g., with differing degrees of vision correction). In such cases, the eyewear can encourage the user, such as using shading of portions of the lens, to look through a correct portion of the lens.

— Similarly, if the user is looking through a portion of the lens that is receiving light that is very bright, has substantial glare, or is flashing, the eyewear can urge the user to look away from that portion, such as through a portion of the lens that is not subject to those effects.

— Similarly, if the user is looking through a portion of the lens that is receiving light that is chromatically imbalanced, such as having a substantial excess of bright blue or bright red, or another color having an eyestrain effect, an emotional effect, or prompting a medical condition with respect to the user, the eyewear can urge the user to look away from that portion, such as through a portion of the lens that is not subject to those effects.

(Viewing distance)

[510] In one embodiment, the one or more regions 820 can each include one or more techniques to urge the user to direct their gaze appropriately. For example, the eyewear 800 can determine when the user’s gaze is directed at a close distance, (A) such as by determining the user’s gaze direction, and using object recognition to determine an object at which the user is looking; (B) such as by using pupil width or stereoscopy to determine a distance at which the user’s eyes are focusing; or otherwise as described herein.

[511] Having determined a distance at which the user’s gaze is focused, the eyewear 800 can determine which one of the one or more regions 820 is best suited for the correction or enhancement of the user’s vision when focusing at that distance. For example, when the user gazes at an object at a close distance, the one or more regions 820 can be disposed to urge the user to look through the close-vision region 821.

[512] In one embodiment, the eyewear 800 can be disposed to urge the user, such as under the control of a computing device, an electronic circuit, or otherwise as described herein, to look through a selected region 820 by one or more of:

— shading other unselected regions 820, so as to discourage the user from looking through the unselected regions 820, or inverse-shading the selected region 820, so as to encourage the user to look through the selected region 820;

— altering one or more chromatic responses of the unselected regions 820, so as to discourage the user from looking through the unselected regions 820, or so as to emphasize to the user that the unselected regions 820 are discouraged, or altering one or more chromatic responses of the selected region 820, so as to encourage the user to look through the selected region 820, or so as to emphasize to the user that the selected region 820 is encouraged;

— altering one or more polarization responses of the unselected regions 820, so as to discourage the user from looking through the unselected regions 820, or so as to emphasize to the user that the unselected regions 820 are discouraged, or altering one or more polarization responses of the selected region 820, so as to encourage the user to look through the selected region 820, or so as to emphasize to the user that the selected region 820 is encouraged;

— altering one or more prismatic responses of the unselected regions 820, so as to alter the field of view (FOV) of the user when looking through the unselected regions 820, such as to direct that FOV to one or more objects appropriate to the selected region 820, or altering one or more prismatic responses of the selected region 820, so as to alter the FOV of the user when looking through the selected region 820, such as to direct that FOV to one or more objects appropriate to the selected region 820;

— or otherwise altering the user’s view through the selected region 820 or one or more unselected regions 820, so as to direct the user’s viewpoint through the selected region 820 or away from one or more unselected regions 820.

[513] For example, the eyewear can include multiple lenses with distinct vision correction elements on a first lens and with distinct shading components on a second lens. The first lens can include a region for vision correction (e.g., using refraction) for close-range viewing and a region for vision correction for longer-range viewing. The second lens can be aligned with the first lens, such as by coupling the second lens in alignment with the first lens, so as to allow the second lens to affect the user’s vision at the same time as the first lens.

[514] For example, when the first lens includes multiple regions for vision correction, the second lens can include multiple regions for shading in response to a desired amount of vision correction. When the user focuses on an object at relatively close range, the first lens can include a region disposed for use at that relatively close range. When the user focuses on an object at relatively more distant range, the first lens can include a region disposed for use at that relatively more distant range.

[515] In such cases, when the user looks at an object at the relatively close range, the second lens can be disposed to urge the user to look through the portion of the first lens associated with that relatively close range. The second lens can be disposed to shade those portions of the user’s field of view (FOV) that are not associated with that relatively close range, so as to urge the user to look through the “correct” portion of the first lens, i.e., the portion of the first lens associated with that relatively close range. This can have the effect that the second lens urges the user to look through the “correct” portion of the first lens.

[516] Similarly, when the user looks at an object at the relatively more distant range, the second lens can be disposed to urge the user to look through the portion of the first lens associated with that relatively more distant range. The second lens can be disposed to shade those portions of the user’s field of view (FOV) that are not associated with that relatively more distant range, so as to (similarly) urge the user to look through the “correct” portion of the first lens, i.e., the portion of the first lens associated with that relatively more distant range.
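
Under the two-lens arrangement just described, the control loop only has to know which first-lens region suits the current focus distance, then shade the aligned second-lens regions that do not. A minimal sketch, with invented data structures, follows:

```python
from dataclasses import dataclass

@dataclass
class CorrectionRegion:
    aligned_id: str   # id of the matching region on the second lens
    near_m: float     # focus distances this region is meant for
    far_m: float

def update_second_lens(regions, focus_m, second_lens):
    """Clear the second-lens region aligned with the suitable
    first-lens region; shade all the others to steer the gaze."""
    for region in regions:
        suited = region.near_m <= focus_m < region.far_m
        level = 0.0 if suited else 0.7   # 0.7 = illustrative partial shading
        second_lens.set_shading(region.aligned_id, level)

# Example: region 831 pairs with close vision, 832 with distant vision.
# regions = [CorrectionRegion("831", 0.0, 1.0), CorrectionRegion("832", 1.0, 1e6)]
```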

[517] The eyewear can be disposed to control the second lens in response to features of the wearer’s eye, such as when the wearer’s eyes become strained or excessively dry in response to too much close-range viewing. This can have the effect that the eyewear can notice when the user is excessively focused on close-range vision. The eyewear can then encourage the user to relax their vision and look away toward one or more objects at a relatively more distant range.

[518] In one embodiment, the eyewear 800 can include one or more regions 820 disposed for close-order vision, such as a close-vision region that provides +1 diopter (or another adjustment for correction of close-order vision) when the user is gazing at an object disposed at a relatively close location. Such a close-vision region 821 is sometimes disposed for reading or otherwise examining close-order objects and is typically disposed to correct for the user’s ability to view objects at close range. Similarly, the eyewear 800 can include one or more regions 820 disposed for relatively distant vision, such as a distant region that provides +0 diopters (or another adjustment, or lack thereof, for correction of relatively distant vision) when the user is gazing at an object disposed at a relatively distant location. Such a distant region 822 is sometimes disposed for examining distant objects (that is, other than close-order objects) and is typically disposed to correct for the user’s ability to view objects at ranges other than close range.

[519] As described herein, the first lens 810a can include at least one region 821, such as a close-vision region, with which the second lens 810b can include at least one region 831 aligned therewith. Similarly, the first lens 810a can include at least one region 822, such as a distant vision region, with which the second lens 810b can include at least one region 832 aligned therewith. When the eyewear 800 urges the user to look away from the close-vision region 821 on the first lens 810a, the eyewear can shade the aligned region 831 on the second lens 810b so as to discourage the user from looking through the region 821. Similarly, when the eyewear 800 urges the user to look away from the distant vision region 822 on the first lens 810a, the eyewear can shade the aligned region 832 on the second lens 810b so as to discourage the user from looking through the region 822.

[520] In another embodiment, when the eyewear 800 urges the user to look away from the close-vision region 821 on the first lens 810a, the eyewear can adjust the color (or color texture) of the aligned region 831 on the second lens 810b so as to emphasize to the user that looking through the region 821 is disfavored. For example, when the wearer has been looking through the close-vision region 821 for more than a threshold amount of time (possibly 10 minutes, or more or less), the close-vision region 821 can be disposed with no color, or with a grey color, to indicate that it is disfavored, while the distant-vision region 822 can be disposed with an amber or blue color, to indicate that it is not disfavored.

[521] The selection of colors (or color textures) can be made particular to an individual use case. For example, when playing games using a ball whose targeting is relatively important, the ball itself can be disposed in an orange color (to stand out), while sky and grass can be disposed using a neutral grey color (to provide background). In contrast, when playing games using a smartphone or other mobile device, a close-vision region 821 can be polarized using settings for digital devices, while a distant vision region 822 can be polarized using settings for natural lighting.

(Bright objects)

[522] Similarly, when the eyewear detects that the user is looking at a very bright object or a region of the user’s field of view (FOV) that is very bright or has substantial glare, or is flashing or otherwise distracting, the eyewear can take action to direct the user’s attention away from the excessive light. For one example, the eyewear can shade the very bright region of the user’s FOV. For another example, the eyewear can direct the user’s attention to another object by inverse-shading the other object. In such cases, the eyewear can shade a region substantially surrounding (or otherwise emphasizing) the object to which the eyewear directs the user’s attention, while leaving the object itself unshaded. This can have the effect that the object appears brighter than the nearby shaded regions, without actually increasing the object’s brightness.

[523] For example, the eyewear can be disposed to adjust shading with respect to an object or a portion of the user’s field of view (FOV) at which the user is looking. In such cases, when the user is looking in a particular direction, the eyewear can be disposed to shade only portions of the user’s FOV in that direction. Similarly, in such cases, when the user is looking at a particular object, such as when looking in a particular direction and at a particular depth of focus so as to distinguish a selected object, the eyewear can be disposed to shade only that selected object. An outbound camera, such as a camera mounted behind one or more of the lenses and disposed to view a location or region at which the user is looking, can be disposed to determine an amount of shading that optimizes the user’s view, or to determine an amount of shading that optimizes a clarity of the location or region at which the user is looking.

[524] In such cases, the eyewear can be disposed to detect where the user is looking in response to one or more of: a dynamic eye tracking system, or one or more “outbound” cameras disposed to view the user’s field of view (FOV) from inside one or more lenses. For example, the dynamic eye tracking system can be disposed to determine in what direction, and at what depth of focus, the user is looking. This can have the effect that the dynamic eye tracking system can determine a location in three-dimensional (3D) space at which the user is looking. For another example, the outbound camera can be disposed to examine the user’s FOV from inside one or more of the lenses. Either of these techniques can have the effect that when the user moves their head or otherwise alters their FOV, the eyewear can adjust the 3D location that is shaded. More precisely, the eyewear can adjust a location on each lens so that the joint focus of the user’s eyes at that 3D location is shaded.
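
Geometrically, shading “the 3D location the user is looking at” means intersecting each eye’s gaze ray with its lens and darkening the pixels around the crossing point. A sketch, treating the lens as locally planar, follows; all names are illustrative:

```python
import numpy as np

def lens_shading_point(eye_pos, gaze_dir, lens_point, lens_normal):
    """Return the 3D point where the gaze ray crosses the lens plane,
    or None if the ray runs parallel to it. Run once per eye so both
    shaded spots converge on the same 3D location being viewed."""
    gaze_dir = np.asarray(gaze_dir, dtype=float)
    gaze_dir /= np.linalg.norm(gaze_dir)
    denom = float(np.dot(gaze_dir, lens_normal))
    if abs(denom) < 1e-9:
        return None
    t = float(np.dot(np.asarray(lens_point) - np.asarray(eye_pos), lens_normal)) / denom
    return np.asarray(eye_pos) + t * gaze_dir
```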

(Chromatic imbalance)

[525] Similarly, the eyewear can be disposed to detect when a region of the user’s field of view (FOV) is chromatically unbalanced, such as due to an atmospheric effect (e.g., sunrise or sunset). In such cases, the eyewear can be disposed to adjust the chromatic balance of that region of the user’s FOV, such as by altering a chromatic response of the lens (or one lens of multiple such lenses). This can have the effect that the user can see objects that might otherwise be obscured by the color imbalance, such as when sunset can make it difficult to see objects that are red or orange, such as streetlights or vehicle brake lights.

[526] In such cases, the eyewear can also be disposed to identify objects that are normally red/orange (or another color for which the color imbalance might obstruct visibility) and to inverse-shade a color balance for those objects. For example, the eyewear can be disposed to detect streetlights or vehicle brake lights, or aircraft warning lights or landing strip lights. In such cases, the eyewear can be disposed to chromatically adjust a region of the user’s field of view (FOV) substantially surrounding the object (or otherwise near the object) to remove the imbalanced color, while leaving the object itself its natural color. This can have the effect that the user can clearly see the object with its natural color despite the chromatic imbalance.
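
One way to realize this is to apply a corrective color gain everywhere except a mask of recognized objects whose natural color must be preserved. The sketch below assumes an RGB frame from an outward-facing camera and a Boolean mask from the object detector; the gains are illustrative:

```python
import numpy as np

def rebalance_except_objects(frame, mask, gain=(0.85, 1.0, 1.15)):
    """Damp the red cast of (say) a sunset and lift blue, everywhere
    except pixels flagged True in `mask` (brake lights, streetlights),
    which keep their natural color. frame: HxWx3 uint8; mask: HxW bool."""
    out = frame.astype(np.float32)
    out[~mask] *= np.asarray(gain, dtype=np.float32)
    return np.clip(out, 0, 255).astype(np.uint8)
```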

(Attention pattern)

[527] Similarly, when the eyewear detects a prospective or actual problem with respect to the wearer’s attention pattern, the eyewear can be disposed to identify the problem and take action to direct the user’s vision. For example, if the eyewear detects that the user’s vision is unusually focused in a particular direction, such as when the user is about to doze off while driving, it can take action to ameliorate the problem. The eyewear can direct the user’s attention away from the region where the user’s vision is unusually focused, direct the user’s attention toward a more propitious direction, or otherwise attempt to cause the user to focus properly while driving.

[528] For example, the eyewear can shade a region where the user’s vision is unusually focused, so as to force the user to look in a different direction. Alternatively, the eyewear can shade a very small region where the user’s vision is unusually focused, so as to avoid blocking the user’s vision in the direction where they are driving the vehicle; in such cases, the eyewear can generate a relatively smaller shaded region, disposed to move about in the user’s field of view (FOV), so as to attract the user’s attention and jostle the user away from dozing off. For example, the eyewear can be disposed to generate a relatively small darkly shaded dot that moves so as to attract the attention of the user.

[529] For another example, when the eyewear detects that the user’s vision is unusually defocused, such as might occur when the user is excessively tired, is dozing off, or is subject to an adverse medical condition, the eyewear can be disposed to inverse-shade a region of the user’s field of view (FOV) so as to attract the user’s attention. In such cases, the eyewear can inverse-shade a region of the user’s FOV so as to test whether the user is sufficiently responsive to prompts, thus possibly indicating whether the user is sufficiently alert to drive a ground vehicle, pilot an aircraft, or otherwise operate heavy machinery. In such cases, the eyewear can be disposed to alert the user, or if that is insufficient, to send a message to obtain assistance for the user.

[530] In such cases, when the eyewear detects, in response to the user’s vision, that the user is subject to an adverse medical condition, the eyewear can send a message to medical personnel, to emergency responders, to user support persons (such as friends or relatives), to local volunteers or other “good Samaritans”, or otherwise to request assistance. The eyewear can be disposed to send information describing the user’s medical status, the nature of the user’s vision, the nature of the adverse medical condition, and any other information relevant to obtaining or rendering assistance.

Hybrid alteration of unselected regions

[531] In such cases, the eyewear 800 can alternatively apply a different hybrid alteration to the unselected region 820, such as a chromatic alteration, a prismatic alteration, a polarization alteration, or otherwise as described herein.

(Shading/inverse-shading)

[532] In such cases, when the eyewear 800 determines that the user is gazing at a close-range object, the eyewear 800 can shade the distant region 822 (or inverse-shade the close-order region 821), so as to encourage the user to look through the close-order region 821. For example, the eyewear 800 can include a shading element disposed to shade the unselected region (that is, the distant region 822) or inverse-shade the selected region (that is, the close-order region 821). This can have the effect that the user is encouraged to look through the close-order region 821. When the unselected region 820 is 100% shaded, that is, made completely dark or otherwise opaque, the user must look through the selected region to be able to see at all, and is thus required to use the selected region.

[533] Thus, when the user’s gaze is directed to a close object, the eyewear 800 can require the user to use the close-order region 821 to view that object. Similarly, when the user’s gaze is directed to a distant object, the eyewear 800 can require the user to use the distant region 822 to view that object. Similar principles apply to mid-range objects, to objects in an area of the user’s peripheral vision, to objects that are partially obscured in the user’s field of view (FOV), or otherwise as described herein.

[534] In one embodiment, the eyewear 800 can apply shading/inverse-shading by polarization of the selected region 820 or the unselected region 820. For example, the eyewear 800 can apply shading by polarizing the unselected region 820 so as to remove much of its luminance; this can have the effect that the unselected region 820 can appear less bright than the selected region 820, thus encouraging the user to look through the selected region 820. For another example, the eyewear 800 can apply shading by polarizing the selected region 820 so as to remove glare or excessive light; this can have the effect that the selected region 820 can appear without glare or otherwise undisturbed by visual artifacts, thus (again) encouraging the user to look through the selected region 820.

(Chromatic alteration)

[535] For example, the eyewear 800 can apply a first chromatic alteration to the unselected region 820, such as by altering colors in the unselected region 820 to appear black-and-white to the user. This can have the effect that the user would immediately see that the unselected region 820 was disfavored, thus that the user was encouraged to use the selected region 820. However, the user would still be able to use the disfavored unselected region 820, only without color resolution.

[536] For another example, the eyewear 800 can apply a second chromatic alteration to the unselected region 820, such as by altering colors in the unselected region 820 to remove (or to enhance) one or more colors from the color gamut available to the user when looking through the unselected region. For example, the eyewear 800 can filter colors in the unselected region 820 so as to remove blue frequencies from the unselected region 820. This can have the effect that the user would see colors in the unselected region 820 as substantially skewed toward the red end of the visible spectrum, possibly thus appearing more orange in color, and providing the user with a clear hint that the unselected region 820 is disapproved by the eyewear 800.

(Prismatic alteration)

[537] For another example, the eyewear 800 can apply a prismatic alteration to one or more regions 820, such as by altering a direction of view through those regions 820 to direct the user’s view toward an object at the distance associated with those regions 820. In such cases, the eyewear 800 can direct the user’s view through those regions 820 toward an object at the appropriate distance. In such cases, when the user has eyewear 800 with both a close-order region 821 and a distant region 822, the eyewear 800 can use a prismatic effect to cause the user to see, when the user looks through the close-order region 821, a book, smartphone or other mobile device, or other near object that the user would generally look down to see. This could apply even if the user’s field of view (FOV) through the close-order region 821 was not directed at the near object. Similarly, the eyewear 800 can use a prismatic effect to cause the user to see, when the user looks through the distant region 822, an object or FOV associated with a relatively distant view.

[538] For another example, the eyewear 800 can apply a prismatic alteration to one or more regions 820, such as by altering a direction of view through those regions 820 associated with the user’s peripheral vision. In such cases, the eyewear 800 can direct the user’s view through those regions 820 toward an object otherwise appearing in the user’s peripheral FOV, where the user’s vision can be relatively imprecise, to cause that object to appear in the user’s central FOV, where the user’s vision can have better accuracy. This can have the effect that regions 820 ordinarily associated with peripheral vision can be directed toward the user’s central vision, allowing the user to better perceive peripheral regions when appropriate.

Multiple lenses and hybrid personalization

[539] In one embodiment, the eyewear 800 can include multiple lenses to provide hybrid personalization. A first lens 810a can provide for correction or enhancement of the user’s vision, while a second lens 810b, overlaying at least a part of the first lens, can provide for hybrid personalization of the first lens. For example, the second lens 810b can include a hybrid element, such as having electronically induced shading, or inverse-shading, electronically induced chromatic alteration, electronically induced prismatic alteration, or otherwise as described herein. The hybrid element can be responsive to a programmable computing device. In such cases, the programmable computing device can be responsive to a sensor responsive to infalling light, a sensor responsive to the user’s medical or other condition, a sensor responsive to a user input, or otherwise as described herein.

[540] Alternatively, the hybrid element can include a device responsive to the infalling light itself, such as a chemical or other device. The response to infalling light can be in response to a heat or thermal effect thereof, to an amount of ultraviolet (UV), to an electric or electromagnetic field thereof, or otherwise as described herein. In such cases, the hybrid element can be responsive to infalling luminance to chemically cause shading or chromatic alteration, such as performed by sunglasses responsive to UV or other luminance. Or, in such cases, the hybrid element can be responsive to infalling luminance to polarize the infalling light, with the effect that shading can be performed, such as performed by polarizing sunglasses.

[541] In one embodiment, the eyewear 800 can include, between the first lens 810a and the second lens 810b, an electronic transmitter 840, such as a microwave transmitter, disposed to receive energy from the infalling light or from a voltage differential between the first lens and the second lens. The electronic transmitter can include a power harvester 841, such as further described herein, so as to provide power to the transmitter. In such cases, the electronic transmitter can be disposed to send information to a computer device or other remote device 842, such as for processing, for reporting on the status of the eyewear 800 or the user, for controlling the eyewear or the hybrid element thereof, or otherwise as described herein.

[542] In one embodiment, the eyewear 800 can include, in response to the first lens 810a or the second lens 810b, or the electronic transmitter, a circuit 850 disposed to adjust one or more of: the first lens, the second lens, the electronic transmitter, or another aspect of the eyewear. The circuit 850 can be coupled to and responsive to the computing device or other remote device 842, such as a device on the eyewear 800, remotely couplable to the eyewear, or otherwise as described herein.

[543] For example, the eyewear 800 can adjust a correction or enhancement of the user’s eyesight (such as using the first lens 810a or the second lens 810b) in response to the circuit 850. This can have the effect that the eyewear 800 can respond to sensory information from the user, from user input, from the ambient environment, from one or more input images available at or near the user, from one or more objects near the eyewear, from predictive information available from a computing device, or otherwise as described herein. In such cases, the eyewear 800 can adjust the correction or enhancement of the user’s eyesight in response to an object at which the user is looking, in response to a user input, in response to an ambient environment (such as an ambient light level), in response to a medical condition or other condition of the user, or otherwise as described herein.

Emulation of user’s field of view

[544] For another example, the eyewear 800 can include a camera or other input element 860, such as a CCD, an infrared (IR) or ultraviolet (UV) receiver, an electromagnetic antenna, or otherwise as described herein, disposed to receive one or more inputs from a field of view (FOV) associated with the user’s eye(s). For example, a camera or other input element 860 disposed to receive an image similar to the user’s eye can be coupled to the computing device or other remote device 842 and customized to have a substantially identical response to the input image as the user’s eye(s). In such cases, the eyewear 800 can be controlled by the computing device or other remote device 842, coupled to the camera. This can have the effect that the correction or enhancement applied by the eyewear 800 can be automatically adjusted in response to the image seen by the camera or other input element 860.

(Shading alteration)

[545] In one such case, the eyewear 800 can determine an amount of shading required in response to an amount of light infalling on the camera or other input element 860, representing the amount of light infalling on the user’s eye. In response thereto, the eyewear 800 can cause one or more lens regions 820 to be shaded, or to be shaded or inverse-shaded with respect to an object being viewed by the user, so as to optimize a clarity of the user’s view.

[546] In another such case, the eyewear 800 can determine whether there is an excess of ultraviolet (UV) light infalling from the environment. When this occurs, the eyewear 800 can perform shading with respect to the excess UV light, inverse-shading with respect to a selected object being viewed by the user, or other actions to ameliorate the possible effects of excess UV light on the user. For example, some users might be sensitive to excess UV light, such as possibly being subject to migraines, photophobia, or other neurological conditions in response thereto.

[547] In one embodiment, the eyewear 800 can be disposed to adjust shading with respect to at least a portion of the user’s field of view (FOV) during a time period while the user blinks. Since a blink takes a finite amount of time, the eyewear 800 can adjust an amount of shading while the user is blinking (and the pupil is covered by the eyelid). This can have the effect that the user sees a different amount of shading before the blink and after the blink. The eye integrates the amount of shading into its received image. This can have the effect that the user does not notice the change in the amount of shading.

[548] In one embodiment, the eyewear can be similarly disposed to adjust other visual effects (such as polarization or refraction) with respect to at least a portion of the user’s field of view (FOV) during a time period while the user blinks. Similar to adjustment of shading during the user’s blink, this can have the effect that the user sees different other visual effects (such as polarization or refraction) before the blink and after the blink, which can be integrated by the eye into its received image, so that the user does not notice the change.
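
A blink-gated adjuster of this kind can be sketched as a two-step queue: requested changes wait until the eyelid signal says the pupil is covered. All interfaces below are invented for illustration:

```python
class BlinkGatedAdjuster:
    """Apply shading/polarization/refraction changes only while the
    eyelid covers the pupil, so the wearer never sees the transition."""

    def __init__(self, lens):
        self.lens = lens
        self.pending = []  # queued (effect, value) pairs

    def request(self, effect, value):
        """Queue a change, e.g. request("shading", 0.4)."""
        self.pending.append((effect, value))

    def on_eyelid_sample(self, eyelid_closed):
        """Call on every eye-sensor sample; flush the queue mid-blink."""
        if eyelid_closed:
            for effect, value in self.pending:
                self.lens.apply(effect, value)  # hypothetical lens interface
            self.pending.clear()
```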

(Color alteration)

[549] In another such case, the eyewear 800 can determine an amount of infalling light in each visible frequency range and compare the infalling light with the user’s ability to distinguish each color. If the user has any color blindness, either complete color blindness or a more common form such as red-green color blindness, the eyewear 800 can adjust the colors presented to the user’s eye in response to one or more of:

— optimizing clarity of the user’s field of view (FOV) after accounting for the user’s color blindness;

— presenting a false color image of the user’s FOV so as to alert the user with respect to the presence of colors the user is not able, or not easily able, to distinguish; or otherwise as described herein.

[550] For example, when the user has red-green colorblindness, the eyewear 800 can present additional brightness over and above the actual infalling light in a selected set of frequencies (such as in blue) to outline red areas of the image and other additional brightness over and above the actual infalling light (such as in yellow) to outline green areas of the image. This can have the effect that the user can see brightly outlined those areas that would otherwise appear grey due to color blindness.

[551] Alternatively, when the user has red-green color blindness, the eyewear 800 can present additional brightness over and above the actual infalling light in a selected set of frequencies (such as in blue) to saturate red areas of the image, so as to allow the user to distinguish red areas of the image from green areas of the image, despite the user’s natural color blindness.

[552] Alternatively, when the user has red-green color blindness, the eyewear 800 can present grey areas of the image to indicate red and green areas of the image, along with chyrons or other markers to indicate whether the greyed-out areas should be red or green. When the user has a different type of color blindness, the eyewear 800 can present other types of indicators to the user.
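
The outlining variant for red-green color blindness can be sketched directly on an RGB frame: find strongly red and strongly green areas, then brighten their borders in blue and yellow respectively. Thresholds and outline colors below are assumptions:

```python
import numpy as np

def outline_for_red_green_cvd(frame):
    """Outline strongly red areas in blue and strongly green areas in
    yellow so they remain distinguishable; frame is HxWx3 uint8."""
    f = frame.astype(np.float32)
    r, g = f[..., 0], f[..., 1]
    red_mask = (r > 140) & (r > 1.4 * g)
    green_mask = (g > 140) & (g > 1.4 * r)

    def border(mask):
        # border pixels: in the mask, with at least one 4-neighbour outside
        pad = np.pad(mask, 1)
        interior = pad[:-2, 1:-1] & pad[2:, 1:-1] & pad[1:-1, :-2] & pad[1:-1, 2:]
        return mask & ~interior

    out = frame.copy()
    out[border(red_mask)] = (0, 0, 255)       # blue outline around red areas
    out[border(green_mask)] = (255, 255, 0)   # yellow outline around green areas
    return out
```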

[553] In another such case, the eyewear 800 can determine a type of weather in the user’s field of view (FOV), such as by using the outward-facing camera or other input element 860, or such as by receiving a weather report from a remote device (not shown) in response to a GPS device or other location device (not shown). For example, when the weather is very sunny, the eyewear 800 can perform shading or color alteration, so as to emulate sunglasses or otherwise protect the user’s eyes from excess sunlight, and so as to allow the user to see more clearly in that environment. For another example, when the weather is very cloudy, hazy, or otherwise dark, the eyewear 800 can perform inverse-shading or color alteration, so as to allow the user to see more clearly in that environment.

(Focus alteration)

[554] In another such case, the eyewear 800 can determine whether the image being presented to the user’s eye is in focus. The eyewear 800 can adjust the correction to the image made by one or more lens regions 820 so as to optimize clarity of the image. This can have the effect that unclear images are adjusted by the eyewear 800 so as to present them clearly to the user’s eye. In such cases, the eyewear 800 can perform an autofocus function on the image, such as in response to the gaze direction and focal length of the user’s eye; in response to object recognition of an object being viewed by the user; in response to motion blur, object blur, or visual noise with respect to an object interacting with its background; or otherwise as described herein. In such cases, the user can see a clear image without manually refocusing or otherwise adjusting the eyewear.

[555] In another such case, the eyewear 800 can determine a moving object being viewed by the user. For example, when the user is a participant in or a viewer of a sporting event, the moving object can be a baseball, golf ball, or other sporting equipment. For another example, when the user is a law enforcement officer or military personnel, or is engaged in a search-and-rescue operation, the moving object can be a distant person. The eyewear 800 can determine the location of the object in the user’s field of view (FOV), such as using an artificial intelligence (AI) or machine learning (ML) technique, as further described herein with respect to other figures. Having identified the object, the eyewear 800 can determine a distance to the object and can alter the presentation of the object to the user so as to enhance the user’s depth perception thereof.

User control of hybrid personalization

[556] In one embodiment, the eyewear 800 can receive user input, so as to affect the hybrid personalization. For example, when the user is viewing a selected object or a selected portion of the user’s field of view (FOV), the user can perform eye gestures, facial gestures, hand or finger gestures, or other bodily movements, so as to provide inputs to the eyewear 800 that the user desires one or more selected actions by the eyewear.

(Gestures)

[557] For example, the eye gestures can include one or more eye blinks, eye rolls or other pupillary movements, movements of gaze direction, or otherwise as described herein, or combinations thereof. The user can blink twice or more in rapid succession, or can look up, look down, look right, look left, or in another selected direction, one or more times in succession. In some examples: the user can look left and blink twice; the user can look upward-left three times in succession; the user can look upward-left, upward-right, and then down; or otherwise as described herein.

[558] For example, the facial gestures can include one or more squints, frowns or smiles, nose wiggles, chin movements, teeth clenching, or otherwise as described herein. The user can combine one or more facial gestures, can combine one or more facial gestures with one or more eye gestures, or otherwise as described herein.

[559] For example, the hand or finger gestures can include any type of hand or finger movement or positioning, and can be presented

— within the user’s field of view;

— within an image capture region of the eyewear 800;

— within an image capture region of an auxiliary outward-facing camera, such as one mounted on a side or rear of the eyewear 800; or otherwise as described herein.

The eyewear 800 can determine that the user has performed one or more hand or finger gestures, and which one, using an artificial intelligence (AI) or machine learning (ML) technique. The hand or finger gestures can be combined with any other gestures available to the user.

(Other user inputs)

[560] The user can also provide other inputs to the eyewear 800 using a touch control or other input device 870. For example, the input device 870 can include a button, capacitive sensor, motion sensor, slide, switch, touchpad, another device responsive to touch or to proximity of the user’s hand or fingers, or otherwise as described herein. When the user activates the input device 870, the eyewear 800 can determine that the user desires one or more selected actions by the eyewear. The selected actions can be predetermined when the eyewear 800 is configured or can be altered by the user.

[561] The user can also provide other inputs to the eyewear 800 using a Bluetooth™ control, smartphone, smart watch, or another mobile device. For example, the user can invoke an application (sometimes called an “app”) on a smartphone or other mobile device, which can communicate with the computing device to provide inputs to, or otherwise control, the eyewear 800.

Fig. 9 - Dynamic adjustment of polarization

[562] Fig. 9 shows a conceptual drawing of an example eyewear used to provide dynamic adjustment of polarization.

[563] An example eyewear 900 can include one or more elements as shown in the figure, including at least:

— one or more lenses 910, such as lenses mounted on a frame, or such as contact lenses disposed for wearing by a user (not shown);

— one or more regions 920 disposed on at least one lens, the regions being controllable to adjust polarization in real time;

— one or more polarizers 930 disposed on at least one region, the polarizers being controllable to adjust the polarization of their associated regions;

— one or more sensors 940 disposed to determine an angle of the lenses or the regions, the sensors being coupled to the polarizers;

— (optionally) one or more processors 950 disposed to determine a difference between (A) the polarization provided by the polarizers, and (B) a desired polarization.

[564] As further described herein with respect to other and further embodiments, the one or more regions 920 can cover an entire lens 910. In such cases, when polarization of a region 920 is adjusted, the polarization of the entire lens 910 can be adjusted.

[565] As further described herein with respect to other and further embodiments, the one or more regions 920 can each cover a section of an entire lens 910 defined by a portion of the wearer’s field of view (FOV), such as a close-vision region, a distant vision region, or a mid-range vision region. Alternatively, the portion of the wearer’s FOV can include a central region of vision or a peripheral region of vision.

[566] As further described herein with respect to other and further embodiments, the one or more regions 920 can each cover a section of an entire lens 910 defined by an individual small portion of the wearer’s field of view (FOV), such as an individual pixel. One or more such pixels can be combined to define a larger region. As further described herein, these larger regions can include sets of pixels that are defined statically or dynamically.

[567] As further described herein with respect to other and further embodiments, each such region 920 can be dynamically controlled, such as in real time, to adjust the polarization thereof. For example, each such region 920 can include an electrically controlled polarizer disposed to alter an angle of polarization in real time.

Adjusting polarization to match glare

[568] In one embodiment, the one or more polarizers 930 can be adjusted in real time in response to changes in a relative angle between the wearer’s eye and a direction of infalling glare. When light is reflected from a surface, it can become polarized in a plane. For example, this can apply when the reflective surface includes a body of water or a solid object such as glass or metal. When the polarizers 930 are adjusted in response to the polarization plane of the infalling glare, this can have the effect that glare infalling to the wearer’s eye can be mitigated. Accordingly, the polarizers 930 can be adjusted so as to reduce or eliminate the amount of glare allowed to reach the wearer’s eye.

[569] In one embodiment, the polarizers 930 can be electrically controlled to make desired adjustments. When the polarizing plane is misaligned with respect to infalling glare, the polarizers 930 can be adjusted so that the alignment is improved. For example, when it is desired that the polarizers 930 are aligned at right angles to the plane of the infalling glare, it might occur, due to the wearer’s motion or due to a change in angle or direction of the infalling glare, that the polarizers 930 are no longer aligned properly. In such cases, the plane of polarization can be adjusted to a proper angle.

[570] In one embodiment, the polarizers 930 can be electrically controlled to alter the plane of polarization. One or more sensors 940 can determine an angle at which the glare is being viewed. One or more processors 950 can determine a difference between (A) the angle at which the glare is being viewed and (B) a desired angle. The processors 950 can generate an electronic control signal (not shown), such as at an output pin of a processor chip or circuit board (not shown). The electronic control signal can be coupled to one or more of the electrically controlled polarizers 930. This can have the effect of altering one or more of the polarizers’ planes of polarization.

[571] In one embodiment, the one or more sensors 940 can include one or more gyroscopes or magnetometers, another device suitable to determine a relative orientation of the eyewear with respect to the infalling glare, or a combination thereof. For example, one or more sensors 940 can be mounted on the eyewear, such as on a frame disposed to hold the lenses 910 in place. In such cases, a single sensor 940 can be mounted on the frame near one of the lenses 910, or a pair of sensors can be mounted on the frame near each one of a pair of lenses. Alternatively, a set of sensors 940 can be mounted about one of the lenses 910, such as in a circle or ellipse surrounding a lens, or two sets of sensors can be mounted about two of the lenses, such as in a circle or ellipse associated with and surrounding each lens.

[572] In one embodiment, when the wearer moves their head, the frame can alter its angle with respect to the infalling glare. This can have the effect that the sensors 940 determine that the angle of the wearer’s lenses 910 with respect to a reference, and thus with respect to the plane of the infalling glare, has changed. Thus, the polarization angle of the lenses 910 should be changed to maintain the effectiveness of glare reduction.

[573] Alternatively, the one or more sensors 940 can be disposed within the eyewear (such as mounted between the wearer’s eye and a lens 910) and can include a sensor disposed to measure an amount of infalling light. For example, the sensor 940 can include a light sensor, an infrared (IR) sensor, a camera, or another device suitable to determine an amount of infalling glare. When so disposed, the sensor 940 can measure an amount of infalling glare at one or more polarization angles, in response to which a processor 950 in the eyewear can select an optimum polarization angle at which to minimize the amount of infalling glare.
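
This measure-and-minimize loop has a closed-form backbone: for plane-polarized glare, Malus’s law gives transmitted intensity I(θ) = I0·cos²(θ − θg), so the sensor reading is minimized when the polarizer is crossed with the glare plane. A coarse sweep, with a hypothetical `measure_glare` callback that sets the polarizer angle and reads the inward light sensor, might look like:

```python
def best_polarizer_angle(measure_glare, step_deg=10):
    """Sweep polarizer angles over [0, 180) degrees and return the one
    at which the sensed glare is lowest; by Malus's law this sits about
    90 degrees from the glare's plane of polarization."""
    best_angle, best_level = 0.0, float("inf")
    for angle in range(0, 180, step_deg):
        level = measure_glare(angle)  # set the polarizer, read the sensor
        if level < best_level:
            best_angle, best_level = float(angle), level
    return best_angle
```

A finer search around the returned angle, or a direct computation from the orientation sensors described above, could then refine the coarse sweep.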

[574] In one embodiment, when the one or more processors 950 determine the difference between the two specified angles, this can have the effect of determining a difference between (A) the polarization provided by the polarizers 930, and (B) a desired polarization. The one or more processors 950 can determine this difference periodically, aperiodically, or otherwise from time to time, in real time. This can have the effect that the polarization provided by the polarizers 930 can be adjusted in real time to maintain a desired polarization, such as an optimum polarization for minimizing an amount of infalling glare. For example, when the actual polarization differs from the desired polarization, the one or more sensors 940 can detect that infalling glare is more than a measured minimum amount, and the one or more processors 950 can determine an appropriate correction. This can have the effect that the actual polarization is maintained substantially equal to the desired polarization, or at least close enough that the wearer does not notice a difference.

Adjusting polarization to match external devices

[575] In one embodiment, the one or more polarizers 930 can be adjusted in real time in response to changes in a relative angle between the wearer’s eye and a direction of an external device. For example, the external device can include a display, such as a control display or sensor display in a vehicle, such as an aircraft or a racing car. At least some of these control displays or sensor displays can be polarized; thus, their outputs include polarized light.

[576] Depending on the location where the wearer is seated and the direction at which the wearer’s head is turned, it can occur that one or more of the polarizers 930 is misaligned with the control display or sensor display, and thus, presents an extremely darkened (or even unreadably black) output.

— In one embodiment, the one or more polarizers 930 can be adjusted in response to the location of the wearer and the direction of their head, and in response to knowledge of the location and orientation of the control displays or sensor displays.

— In another embodiment, the one or more polarizers 930 can be adjusted in response to a signal, such as an electromagnetic signal, from the control displays or sensor displays to the polarizers 930, in which the displays indicate their relative orientation (see the sketch after this list). For example, the displays can communicate with the polarizers 930 using Wi-Fi, Bluetooth™, near field communication (NFC), or another method.

— In another embodiment, the one or more polarizers 930 can be adjusted in response to a camera disposed behind the polarizers 930 and looking outward along the wearer’s line of sight; when the camera sees an excessively darkened control display or sensor display, the eyewear can adjust the direction of polarization so as to provide for clear viewing of the control displays or sensor displays.

— In another embodiment, the one or more polarizers 930 can be adjusted in response to an external signal, such as a user control (possibly from a smartphone or other mobile device), or from one or more of the control displays or sensor displays identifying its location or orientation.
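For the signal-based case above, the following is a minimal sketch assuming a display that reports the orientation of its polarization axis; the function names and the reported fields are illustrative, not a protocol defined here. Transmission through a polarizer follows Malus’s law, I = I0 cos^2(theta), so rotating the polarizer parallel to the display’s axis keeps the display readable.

```python
import math

def align_polarizer(display_axis_deg, head_roll_deg):
    """Polarizer angle (in the eyewear's frame) parallel to the display's
    polarization axis, compensating for the wearer's head roll."""
    return (display_axis_deg - head_roll_deg) % 180

def transmission_fraction(polarizer_deg, display_axis_deg, head_roll_deg):
    """Fraction of the display's polarized light passed (Malus's law)."""
    theta = math.radians((polarizer_deg + head_roll_deg - display_axis_deg) % 180)
    return math.cos(theta) ** 2

# Example: display axis at 90 degrees, head rolled 10 degrees.
angle = align_polarizer(display_axis_deg=90, head_roll_deg=10)
assert transmission_fraction(angle, 90, 10) > 0.999
```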

[577] When the eyewear determines a relative orientation between the polarizers 930 and the control displays or sensor displays (by any technique), the eyewear can adjust, in real time, an orientation of the polarizers 930. This can have the effect that the polarizers 930 are disposed to allow polarized light to pass through the lenses to the wearer’s eyes, so as to allow the wearer to see control displays or sensor displays clearly.

[578] When the eyewear receives a user command directing it to change a relative orientation between the polarizers 930 and the control displays or sensor displays, the eyewear can adjust, in real time, an orientation of the polarizers 930. This can have the effect that the polarizers 930 are disposed to allow polarized light to pass through the lenses as directed by the user, so as to allow the user to see control displays or sensor displays in a manner they desire.

[580] When used in combination with a camera or other technique for automatically determining the relative orientation between the polarizers 930 and another device (such as control displays or sensor displays), the eyewear can perform a function of “auto-polarization”, similar to “autofocus” techniques used with respect to camera focus on objects at a distance. This can have the effect that the polarization field can be controlled (A) to optimize viewing of the object the user is looking at and focusing upon; (B) to eliminate glare and maintain night vision adaptation for the user; (C) to see clearly in circumstances with relatively wide variances in brightness; and (D) otherwise when unwanted light can be filtered out of the user’s field of view using polarization.
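A sketch of such an “auto-polarization” loop follows. Like autofocus hunting, it nudges the polarizer a few degrees either way and keeps whichever direction reduces measured glare; the hardware hooks are again hypothetical.

```python
# One iteration of glare hill-descent; call repeatedly in real time.
# read_glare(), get_angle(), and set_angle() are hypothetical hardware hooks.

def auto_polarize_step(read_glare, get_angle, set_angle, step_deg=2):
    current = get_angle()
    glare_here = read_glare()
    for candidate in ((current + step_deg) % 180, (current - step_deg) % 180):
        set_angle(candidate)
        if read_glare() < glare_here:
            return candidate          # keep the improvement
    set_angle(current)                # neither direction helped; stay put
    return current
```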

Use cases for adjusting polarization

[581] As described herein, adjusting polarization can be useful when the user is operating a vehicle, such as an aircraft, a racing car, a sailboat or speedboat, or otherwise when the user’s attention is divided between viewing an outside-the-vehicle field of view and a set of control displays or sensor displays inside the vehicle. In such cases, the light environment outside the vehicle can vary substantially from the light environment inside the vehicle. Other examples include:

— The user can include law enforcement personnel, military personnel, and other personnel whose attention is divided between potential threats and their own equipment. For example, law enforcement personnel can be distracted at night by automobile lights, flashlights, floodlights, gun muzzle flashes, streetlights, and other unexpected light sources. Law enforcement personnel can benefit from polarizing their own light sources and using eyewear that uses that polarization to filter out those law enforcement light sources.

— The user can include racecar drivers and other vehicle drivers whose attention might be impaired by rapid changes in the light environment, such as when entering or exiting tunnels, passing large hills, or otherwise entering or exiting shadowed regions. For example, racing drivers can benefit from using excess light, such as headlights, and filtering that light away when it is not needed.

— The user can include baseball or other players in a stadium context, particularly at night, whose attention is divided between the game equipment itself and external sources of distraction. For example, distractions can include advertisements, floodlights, spectators, other players, and other unexpected attention-seeking objects.

Fig. 10 - Adjustment of magnification

[582] Fig. 10 shows a conceptual drawing of an example eyewear used to provide dynamic adjustment of magnification.

[583] An example eyewear 1000 can include one or more elements as shown in the figure, including at least

— one or more lenses 1010, such as lenses mounted on a frame, or such as contact lenses disposed for wearing by a user (not shown);

— one or more regions 1020 disposed on at least one lens, the regions being controllable to adjust magnification in real time;

— one or more magnifiers 1030 disposed on at least one region, the magnifiers being controllable to adjust the magnification of their associated regions;

— one or more sensors 1040 disposed to determine a gaze direction and/or focal length of the wearer’s eye with respect to the regions, the sensors being coupled to the magnifiers;

— one or more wearer inputs 1050 disposed to receive one or more input controls from the wearer, such as an eye gesture, a touch input, or otherwise as described herein;

— (optionally) one or more processors 1060 coupled to the sensors, the wearer inputs, a combination thereof, or otherwise as described herein.

[584] As further described herein with respect to other and further embodiments, the one or more regions 1020 can cover an entire lens 1010. In such cases, when magnification of a region 1020 is adjusted, the magnification of the entire lens 1010 can be adjusted.

[585] As further described herein with respect to other and further embodiments, the one or more regions 1020 can each cover a section of an entire lens 1010 defined by a portion of the wearer’s field of view (FOV), such as a close-vision region, a distant vision region, or a mid-range vision region. Alternatively, the portion of the wearer’s FOV can include a central region of vision or a peripheral region of vision.

[586] As further described herein with respect to other and further embodiments, the one or more regions 1020 can each cover a section of an entire lens 1010 defined by an individual small portion of the wearer’s field of view (FOV), such as an individual pixel. One or more such pixels can be combined to define a larger region. As further described herein, these larger regions can include sets of pixels that are defined statically or dynamically.

[587] As further described herein with respect to other and further embodiments, each such region 1020 can be dynamically controlled, such as in real time, to adjust the magnification thereof. For example, each such region 1020 can include an electrically controlled magnifier disposed to alter an amount of magnification, such as in real time.

[588] In one embodiment, the one or more magnifiers 1030 can be adjusted (such as in real time) in response to one or more of (A) changes in gaze direction and/or focal length of the wearer’s eye, (B) inputs from the wearer, (C) object recognition, or otherwise as described herein. For example, when the wearer’s gaze is directed to a selected object, the eyewear can adjust its magnification with respect to the selected object so as to make that object easier for the wearer to distinguish. This might involve increasing or decreasing an amount of magnification of a portion of the wearer’s field of view (FOV) in which that object is found.

[589] For example, this can apply when the wearer directs their gaze to a particular drawing, symbol, or word on a display (whether a printed page, a physical sign, a computer display, a smartphone or mobile device display, or a heads-up display). Alternatively, this can apply when the wearer directs their gaze to a particular object or person (whether nearby or distant). When the magnifiers 1030 are adjusted in response to selection of a particular object, that object can be made more easily visible to the wearer.

[590] For another example, this can apply when the wearer desires to apply a binocular effect to their field of view (FOV), such as when the wearer desires to see a distant object more clearly. This can also apply when the wearer desires to see a distant object at a larger magnification, such as when that distant object occupies only a small portion of the wearer’s FOV.

[591] For another example, this can also apply when the eyewear attempts to draw the wearer’s attention to a particular object, such as an object or person that the eyewear has recognized as of interest to the wearer. In such cases, the eyewear can draw the wearer’s attention to the object in one or more of

— highlighting the object/person using shading or inverse-shading;

— highlighting the object/person using color, outlining, artificial phosphorescence (such as by emphasizing the color of the object/person, or altering the color of the object/person to increase contrast with its/their background);

— magnifying the object/person with respect to its/their surroundings;

— magnifying the region in the wearer’s FOV with respect to the object/person; or otherwise as described herein.

[592] In one embodiment, the magnifiers 1030 can be electrically controlled to make desired adjustments, such as to increase/decrease the amount of magnification. For example, the one or more sensors 1040 can determine one or more of (A) a gaze direction and/ or focal length by the wearer, (B) one or more inputs by the wearer, (C) one or more circumstances statistically correlated or otherwise corresponding to circumstances in which the wearer desires an increase/decrease in the amount of magnification, such as when a particular object or person is recognized, or (D) other circumstances in which an increase/decrease in the amount of magnification is desirable. In response thereto, the processors 1060 can generate an electronic control signal (not shown), such as at an output pin of a processor chip or circuit board (not shown). The electronic control signal can be coupled to one or more of the electrically controlled magnifiers 1030. This can have the effect of altering an amount of magnification.
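The following sketch illustrates that control path under stated assumptions: it fuses the conditions above into a target magnification and emits it as a normalized control value. The decision rule, the 2.0x default, and the write_control_pin hook are illustrative, not values or interfaces defined here.

```python
def choose_magnification(gaze_on_target, wearer_request, recognized_object):
    """Pick a magnification factor from the conditions described above."""
    if wearer_request is not None:    # an explicit wearer input wins
        return wearer_request
    if recognized_object and gaze_on_target:
        return 2.0                    # enlarge a recognized object of interest
    return 1.0                        # otherwise, no magnification

def drive_magnifier(write_control_pin, magnification, max_mag=4.0):
    """Scale the factor into a 0..1 control signal for the magnifier."""
    level = min(max(magnification, 1.0), max_mag)
    write_control_pin((level - 1.0) / (max_mag - 1.0))
```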

[593] In one embodiment, the one or more sensors 1040 can be disposed within the eyewear (such as mounted between the wearer’s eye and a lens 1010) and can include a sensor disposed to measure a gaze direction and/or a focal length by the wearer’s eye. For example, the sensor 1040 can include an infrared (IR) sensor or a camera directed at the wearer’s eye (such as their pupil), or another device suitable to determine gaze direction and/or focal length. When so disposed, the sensor 1040 can determine an object the wearer’s gaze is directed to. In response to this information, the processor 1060 can select an optimum amount of magnification to maximize the visibility of the object.

[594] Alternatively, the one or more sensors 1040 can be disposed on the eyewear (such as mounted on an externally accessible surface) and can include a touchable surface disposed to receive an input by the wearer. For example, the sensor 1040 can include a button, a capacitive touch sensor, a slider, a proximity sensor, a voice input, or otherwise as described herein, disposed to detect when the wearer provides an input indicating the wearer’s desire to increase/decrease magnification. When so disposed, the sensor 1040 can determine that the wearer desires to increase/decrease an amount of magnification of the eyewear. In response to this information, the processors 1060 can increase/decrease the amount of magnification as directed by the wearer.

[595] Alternatively, the one or more sensors 1040 can be disposed on the eyewear and can include a forward-looking camera directed at the wearer’s field of view (FOV). In such cases, the sensors 1040 can be disposed to provide information from which the processors 1060 can determine an object or person in the wearer’s FOV. In response to this information, the processors 1060 can increase/decrease the amount of magnification so as to make the detected object or person more visible to the wearer.

[596] Alternatively, the processors 1060 can be disposed to receive information from one or more sensors 1040 and can combine that information so as to detect one or more circumstances in which the wearer has statistically desired a change in an amount of magnification. In response to this information, the processors 1060 can estimate the likelihood that the wearer would desire a change in an amount of magnification. When the processors 1060 determine a sufficient degree of confidence that the wearer would desire a change in an amount of magnification, the processors can increase/decrease the amount of magnification so as to conform to the wearer’s predicted desire.
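A hedged sketch of that predictive behavior appears below: act on a predicted desire for more or less magnification only when confidence is high enough. The threshold value and the predictor interface are assumptions; any statistical model could fill that role.

```python
CONFIDENCE_THRESHOLD = 0.85   # illustrative value, not from this description

def maybe_adjust_magnification(history, context, predictor, apply_delta):
    """history: past (context, wearer_chose_more_zoom) pairs for the model.
    apply_delta(d) changes magnification only when the prediction is confident."""
    probability = predictor(history, context)   # P(wearer wants more zoom)
    if probability >= CONFIDENCE_THRESHOLD:
        apply_delta(+0.5)     # wearer very likely wants more magnification
    elif probability <= 1.0 - CONFIDENCE_THRESHOLD:
        apply_delta(-0.5)     # wearer very likely wants less
    # otherwise: not confident enough; leave magnification unchanged
```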

[597] In one embodiment, when the processors 1060 determine that the amount of magnification should be changed, this can have the effect of making one or more images or objects in the wearer’s field of view (FOV) more visible to the wearer. The one or more processors 1060 can determine whether to make such changes periodically, aperiodically, or otherwise from time to time, in real time or otherwise as described herein.

Fig. 11 — Dynamic adjustment of reflection

[598] Fig. 11 shows a conceptual drawing of an example eyewear used to provide dynamic adjustment with respect to reflection and partial reflection.

[599] An example eyewear 1100 can include one or more elements as shown in the figure, including at least

— one or more lenses 1110, such as lenses mounted on a frame;

— one or more mirrors 1120 disposed to provide a reflective effect, so as to allow the wearer (not shown) to see at an angle not ordinarily available;

— one or more sensors 1130 disposed to determine a gaze direction and/or focal length of the wearer’s eye with respect to the mirrors, the sensors being coupled to the mirrors;

— one or more wearer inputs 1140 disposed to receive one or more input controls from the wearer, such as an eye gesture, a touch input, or otherwise as described herein;

— (optionally) one or more processors 1150 coupled to the sensors, the wearer inputs, a combination thereof, or otherwise as described herein.

[600] The mirrors 1120 can be coupled to the processors 1150. The processors 1150 can control the angle at which the mirrors 1120 are positioned, and where applicable, can electronically control a focal length of the mirrors 1120. The processors 1150 can determine an angle and distance at which the wearer is looking, such as by using the sensors 1130 to determine a gaze direction and/or focal length of the wearer’s eye. The processors 1150 can adjust the angle and focal length of the mirrors 1120 in response thereto. This can have the effect that the wearer can see behind themselves, to the side, or otherwise as described herein, using the mirrors 1120.
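As a geometric illustration of how the processors 1150 might compute a mirror angle from the wearer’s gaze (a sketch of the law of reflection, not an actuation method defined here): the mirror’s normal must bisect the headings of the mirror-to-object and mirror-to-eye rays.

```python
import math

def mirror_normal_deg(object_deg, eye_deg):
    """Heading of the mirror normal that reflects light arriving from the
    object direction into the eye direction (both headings taken as rays
    pointing away from the mirror, in degrees in the horizontal plane)."""
    sx = math.cos(math.radians(object_deg)) + math.cos(math.radians(eye_deg))
    sy = math.sin(math.radians(object_deg)) + math.sin(math.radians(eye_deg))
    return math.degrees(math.atan2(sy, sx)) % 360

# Example: object at heading 90, eye at heading 180 from the mirror:
# the normal bisects them at 135 degrees.
assert round(mirror_normal_deg(90, 180)) == 135
```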

[601] For example, the mirrors 1120 can be disposed so as to provide a continuous image to the wearer that collectively shows a central vision region and a peripheral vision region. For another example, the mirrors 1120 can be disposed so as to provide a continuous image to the wearer that collectively shows a forward-looking view and a rearward-looking view. This can have the effect that the wearer’s eye and brain can integrate the portions of the presentation by the lenses 1110 and the mirrors 1120 so as to present a full image to the wearer, without any disjoint breaks at edges of the lenses 1110 or at disjoint regions therein.

[602] The wearer can also use the lenses 1110 and the mirrors 1120 to view a peripheral vision region of their field of view, using a central vision region of their retina. This can have the effect that the wearer can have as clear vision of the peripheral vision region of their field of view as they have of the central vision region of their field of view.

Fig. 12 - Dynamic adjustment of 3D presentation

[603] Fig. 12 shows a conceptual drawing of an example eyewear used to provide dynamic adjustment with respect to three-dimensional (3D) viewing of a display.

[604] An example eyewear 1200 can include one or more elements as shown in the figure, including at least

— one or more lenses 1210, such as lenses mounted on a frame, or such as contact lenses disposed for wearing by a user (not shown);

— one or more 3D presentation devices 1220 disposed to provide a 3D presentation, such as a 3D still image (not shown) or a 3D moving image (not shown);

— one or more sensors 1230 disposed to determine a gaze direction and/or focal length of the wearer’s eye, the sensors being coupled to the 3D presentation devices;

— one or more wearer inputs 1240 disposed to receive one or more input controls from the wearer, such as an eye gesture, a touch input, or otherwise as described herein;

— (optionally) one or more processors 1250 coupled to the sensors, the wearer inputs, a combination thereof, or otherwise as described herein.

[605] As further described herein with respect to other and further embodiments, the 3D presentation devices 1220 can include one or more controllers with respect to the lenses 1210, so as to provide images to the wearer that collectively show a 3D presentation. As further described herein, the images can include portions of the 3D presentation at distinct depths of the wearer’s field of view (FOV). This can have the effect that the wearer’s eye and brain can integrate the portions of the 3D presentation so as to present a 3D image (still or moving) to the wearer.

[606] For example, one or more 3D images (such as a 3D still image, a 3D moving image, or a combination thereof) can be presented with respect to a display. The display can include one or more of

— a smartphone or another mobile device display, a phablet or tablet display;

— a wearable or implantable device display;

— a computer display, an internet browser display;

— a gaming device display;

— a television display or another video display;

— a head-up display (HUD), a billboard display, a movie theater display, a window or other see-through display;

— a biomedical display or another telemedicine display;

— a computer-aided design (CAD) display, a modeling or presentation display, or another multi-viewer display; or otherwise as described herein.

[607] In such cases, a 3D still image or a 3D moving image can be presented with respect to a smartphone or another mobile device display, such as might be presented with respect to a game “app” executing on the smartphone or mobile device, or such as might be presented with respect to a 3D video call using the smartphone or mobile device.

[608] Alternatively, a 3D still image or a 3D moving image can be presented with respect to a gaming device display, a computer device, or a related type of display, such as might be presented with respect to a game being played by one player, or between or among more than one player, or such as might be presented with respect to a game being played using the internet or another long-distance communication link.

[609] Alternatively, a 3D still image or a 3D moving image can be presented with respect to a 3D presentation being made to an audience, such as might occur with respect to a live-action show, a movie theater, a news event, a sports activity (in which an individual player’s action can be focused-upon and presented to the audience), or otherwise as described herein. Similarly, a 3D presentation can be made to a class of students, or another audience.

[610] Alternatively, a 3D still image or a 3D moving image can be presented with respect to a telemedicine activity or another relatively long-distance expert activity. For example, an expert can provide oversight to a student or another individual performing an activity under the aegis of an expert who is unable to attend physically, or for whom personal attendance is infeasible.

[611] When the display is disposed to provide a 3D presentation and the wearer alters their gaze from/to the display, the eyewear can turn on/off a 3D presentation in response thereto. For example, when the eyewear is disposed to provide a 3D presentation at the display and the wearer moves their gaze from the display, the eyewear can turn off its 3D presentation and allow the wearer to see their normal field of view (FOV) without any 3D adjustment. When the wearer moves their gaze to the display, the eyewear can turn on its 3D presentation and allow the wearer to see the display using 3D viewing.

[612] As further described herein with respect to other and further embodiments, the 3D presentation devices 1220 can provide the portions of the 3D presentation using one or more of

— time-division multiplexing, in which the distinct portions are interlaced with respect to time (see the sketch after this list);

— color-division multiplexing, in which the distinct portions are distinguished by a color (such as red/blue);

— spatial-division multiplexing, in which the distinct portions are distinguished by a spatial offset or an angle at which they are presented;

— lens-division multiplexing, in which the distinct portions are distinguished by which one of the lenses 1210 (or interlaced pixels or regions of the lenses) at which they are presented; or otherwise as described herein.
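For the time-division option, the brief sketch below alternates which lens is transparent in lockstep with the display’s left/right frames, so each eye sees only its own image. The set_lens_opacity hook and the frame rate are illustrative assumptions.

```python
import time

def run_shutter_3d(set_lens_opacity, frame_rate_hz=120, n_frames=240):
    """Interlace left/right views at frame_rate_hz (half that rate per eye).
    set_lens_opacity(left=..., right=...) takes 0.0 (clear) to 1.0 (opaque)."""
    period = 1.0 / frame_rate_hz
    for frame in range(n_frames):
        left_frame = (frame % 2 == 0)       # even frames carry the left view
        set_lens_opacity(left=0.0 if left_frame else 1.0,
                         right=1.0 if left_frame else 0.0)
        time.sleep(period)                  # hold until the display swaps frames
```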

[613] As further described herein with respect to other and further embodiments, the eyewear can determine, in response to the wearer’s gaze direction and/or focal length, or in response to an input by the wearer, or in response to a predictive measure in response to circumstances from which a statistical inference can be drawn, or otherwise as described herein, whether the wearer is directing their gaze to a display. When the eyewear is disposed to provide a 3D image with respect to the display, the eyewear can determine when the wearer adjusts their gaze to/from the display. When the wearer adjusts their gaze to/from the display, the eyewear can adjust whether it provides a 3D image, or alternatively, whether it allows normal sight of the wearer’s normal field of view (FOV) without providing a 3D image. This can have the effect that the wearer can view the 3D image at the display without suffering blur when looking away from the display.

Adapting to changes in light/dark viewing

[614] The eyewear and systems described herein can provide a method of adapting to changes in light/dark viewing, such as by adjusting shading/inverse-shading to activate (or maintain activated) the wearer’s rods in their retina. This can be relevant when, for example,

— the wearer of eyewear described herein transitions from a bright to a dark viewing environment, such as when exiting a bright police car into a dark night-time environment;

— the wearer of eyewear described herein transitions from a dark to a bright viewing environment, such as when entering a bright indoor location from a dark night-time environment, or such as when driving or flying from a dark night-time environment to a bright daytime environment.

[615] In such cases, the wearer’s rods (which can provide detailed viewing in a dark viewing environment) de-activate relatively quickly and re-activate relatively slowly. The eyewear can be disposed to determine when the wearer is about to transition from a dark to a bright viewing environment; in such cases, the eyewear can shade the bright viewing environment so as to prevent the wearer from losing the activation of their rods (losing their “night vision”). If the bright viewing environment is expected to last a substantial time, the eyewear can allow the shading effect to fade, so as to activate the wearer’s cones in their retina (gaining “color vision”). This can have the effect that relatively brief exposure to bright light does not cause the wearer to lose their night vision, a phenomenon sometimes called “night blindness”.

[616] The eyewear can also be disposed to determine when the wearer is about to transition from a bright to a dark viewing environment; in such cases, the eyewear can shade the bright viewing environment for a relatively long time, so as to allow the wearer’s rods to activate (thus, providing the wearer with “night vision”). This can be particularly effective when the wearer is driving a car or piloting an aircraft in the direction of sunset; when the sun is low on the horizon, it can shine directly in the wearer’s eyes, degrading the wearer’s night vision at a time when it is about to be most needed.

Protecting eyesight from changes in light/dark environments

[617] Example ambient luminance cases. The eyewear and systems described herein can provide a method of protecting the wearer’s eyesight from changes in relative light/dark environments, such as by adjusting shading/inverse-shading to prevent excessive ambient luminance from penetrating to the wearer’s retina. This can be relevant when, for example,

— the wearer of eyewear described herein is subject to a sudden increase in ambient luminance, such as when transitioning from a relatively dark ambient environment to a relatively bright ambient environment;

— the wearer of eyewear described herein is subject to a sudden increase in ambient luminance, such as when the wearer receives sudden glare or other bright light directed at their eye(s);

— the wearer of eyewear described herein is subject to a sudden increase in background luminance, such as when the wearer is tracking a moving object that moves in front of a bright light source, such as a floodlight or the sun;

— the wearer of eyewear described herein is subject to a sudden increase in sensitivity to ambient luminance, such as when the wearer has recently been the subject of a medical procedure that has the effect of causing the eyes to become more light-sensitive;

— the wearer of eyewear described herein is subject to a sudden increase in ambient luminance, in which the change in ambient luminance is too fast for the wearer’s eyes to react;

— the wearer of eyewear described herein is subject to a sudden increase in ambient luminance, such as when using a “night vision” device or another type of device that amplifies luminance;

— the wearer of eyewear described herein is subject to an increase in ambient luminance with respect to only one of two eyes, such as when the wearer attempts to keep track of multiple objects, one of which has a bright background and one of which does not;

— the wearer of eyewear described herein is subject to an increase in ambient luminance with respect to only a subset of colors, such as when the wearer is subject to an increase in ambient luminance with respect to only blue or ultraviolet, only green, or only red or infrared;

— the wearer of eyewear described herein is subject to a short pulse, or multiple short pulses, of change in ambient luminance, such as a sequence of short pulses of greatly increased ambient luminance.

[618] In one embodiment, the eyewear and systems described herein can include an ambient luminance sensor, as further described herein, that can determine an amount of ambient luminance to which the wearer’s eye is subject. For example, the ambient luminance sensor can be coupled to a computing device, which can control a shading element so as to protect the wearer’s eye against excessive ambient luminance. The computing device can compare the amount of ambient luminance against a threshold value and can determine whether to provide shading in response to that comparison.

[619] For example, this process can be useful when the wearer is subject to a sudden increase in ambient luminance, such as when transitioning from a relatively dark ambient environment to a relatively bright ambient environment. In such cases, the transition can trigger the computing device to provide a different amount of shading, so as to prevent the wearer from being temporarily blinded or subject to eye pain by the newly-bright ambient environment.

[620] For another example, this process can be useful when the wearer is subject to a sudden increase in ambient luminance, such as when the wearer receives sudden glare or other bright light directed at their eye(s). The sudden glare can be from a “flashbang” grenade, as further described herein, from a reflection of a floodlight or the sun from a reflective surface, as further described herein, from revealing a bright background light such as a floodlight or the sun, as further described herein, or otherwise as described herein. This can occur when a shiny object moves so as to cause a reflection of light into the wearer’s eyes, or when a cloud moves away from the sun to reveal bright light. The sudden glare can also result from the wearer tracking a moving object that moves in front of a bright light source, such as a floodlight or the sun, as further described herein. This can occur when the wearer is involved in a sport, such as a baseball player who is tracking a ball with a floodlight or the sun as background.

[621] For another example, this process can be useful when the wearer is subject to a sudden increase in sensitivity to ambient luminance, such as when the wearer has recently been the subject of a medical procedure that has the effect of causing the eyes to become more light-sensitive. Examples of such medical procedures can include (A) cataract surgery, (B) surgery with respect to detachment of the retina, (C) eye dilation from an optometrist visit, or otherwise as described herein. This can occur when the wearer has had their eyes dilated at an optometrist visit and becomes very sensitive to sunlight or other bright light.

[622] For another example, this process can be useful when the wearer is subject to a sudden increase in ambient luminance, in which the change in ambient luminance is too fast for the wearer’s eyes to react. In such cases, the wearer’s eyes can generally only react within a turn-off or turn-on time for the wearer’s cone or rod cells, while the eyewear can react electronically. Similarly, the wearer’s eyes can generally only react within about 300-500 milliseconds, while the eyewear can react electronically within about 5-50 milliseconds. Thus, the eyewear can react sufficiently fast that bright light, glare, or other debilitating visual input, can be shaded by the eyewear against damage or pain to the wearer.

[623] For another example, this process can be useful when the wearer is using a “night vision” device or another type of device that amplifies luminance, and there is a sudden increase in ambient luminance. In such cases, the device that amplifies luminance can make the error of rapidly increasing luminance as viewed by the wearer, with the possible effects of (A) making it difficult for the wearer to see, (B) debilitating the wearer’s night vision, or otherwise as described herein. For example, when using a “night vision” device, the wearer might be subject to adverse effects when their target shines a light in their direction; in such cases, it can be useful for the eyewear to rapidly shade the wearer against that light. Moreover, in such cases, the shading element can be disposed between the wearer’s eye and the “night vision” device itself.

[624] For another example, this process can be useful when the wearer is subject to an increase in ambient luminance with respect to only one of two eyes. This can occur when the wearer is involved in a sport, such as a baseball player who is (with one eye) tracking a ball with a floodlight or the sun as background, and who is (with another eye) concurrently tracking another player who is moving. In such cases, the wearer’s view of the ball might need to be shaded, while the wearer’s view of the other player might not need to be shaded.

[625] For another example, this process can be useful when the increase in ambient luminance is only with respect to a particular set of frequencies, such as a particular range of colors (e.g., blue/ultraviolet, green, red/infrared, or otherwise as described herein). In such cases, the eyewear can shade only with respect to the color(s) for which there is a substantial increase in luminance; thus, the eyewear can restrict its shading to only those color(s). For example, when the increase in ambient luminance is only with respect to blue, the eyewear can shade only blue light, thus reducing the amount of blue light injected into the wearer’s eyes.

[626] For another example, this process can be useful when the increase in ambient luminance is only applied for a very short time duration, such as a short pulse, or multiple short pulses, of change in ambient luminance. For example, the wearer can be subject to a sequence of short pulses of greatly increased ambient luminance. Without shading, this can have a deleterious effect on the wearer’s visual acuity or other visual capabilities; with shading, the wearer can be protected against this effect.

[627] Multiple ambient luminance thresholds. For example, the computing device can maintain two independent ambient luminance threshold values, such as ℓ1 and ℓ2, at which an amount of shading is altered, such as to provide shading or to remove shading. Similarly, the computing device can maintain two independent amounts of shading, such as σ1 and σ2, which represent amounts of shading that are provided (or removed).

[628] For example, without loss of generality, ℓ1 > ℓ2 and σ1 > σ2. The threshold values ℓ1 and ℓ2 can be separated by an amount of ambient luminance sufficiently large that the wearer would otherwise recognize the difference. Similarly, without loss of generality, σ1 and σ2 can be separated by an amount of shading sufficiently large that the wearer would otherwise recognize the difference.

[629] In such cases, when the amount of ambient luminance becomes more than ℓ1, the computing device can increase the amount of shading to σ1, so as to reduce the amount of luminance reaching the wearer’s eyes to a limit that does not impair the wearer’s sight even temporarily. The amount of shading can then be maintained at σ1, so as to provide the wearer with a relatively stable viewing environment, at least until the amount of ambient luminance is significantly reduced. When the amount of ambient luminance later becomes less than ℓ2, the computing device can decrease the amount of shading to σ2, so as to increase the amount of luminance reaching the wearer’s eyes, again so that the wearer’s sight is not impaired even temporarily. This can effectively provide a hysteresis loop, refraining from unnecessary changes in shading, so as to provide that the amount of shading can be dependent not only on the amount of ambient luminance, but also on the recent history of the amount of ambient luminance.

[630] In one embodiment, the computing device can maintain a third independent ambient luminance threshold value, such as ℓ3, at which an amount of shading is altered, such as to provide shading or to remove shading. Similarly, the computing device can maintain a third independent amount of shading, such as σ3, which represents an amount of shading that is provided (or removed).

[631] For example, without loss of generality, ℓ1 > ℓ3 > ℓ2 and σ1 > σ3 > σ2. As with the pair of threshold values ℓ1 and ℓ2, the pair of threshold values ℓ1 and ℓ3, and the pair of threshold values ℓ3 and ℓ2, can each be separated by an amount of ambient luminance sufficiently large that the wearer would otherwise recognize the difference. Similarly, without loss of generality, the pair of shading values σ1 and σ3, and the pair of shading values σ3 and σ2, can be separated by an amount of shading sufficiently large that the wearer would otherwise recognize the difference.

[632] In such cases, when the amount of ambient luminance becomes more than the intermediate value ℓ3, the computing device can increase the amount of shading to the intermediate value σ3. Thereafter, when the amount of ambient luminance increases to more than ℓ1, the computing device can increase the amount of shading to σ1, or when the amount of ambient luminance decreases to less than ℓ2, the computing device can decrease the amount of shading to σ2. With three or more such values ℓi and σj, the amount of shading can be maintained so as to provide a relatively stable viewing environment, at least until the amount of ambient luminance changes significantly, while also continuously adjusting the amount of shading so that the wearer’s sight is not impaired even temporarily. This can effectively provide a sequence of hysteresis loops, so as to make only necessary changes in shading and to otherwise maintain a relatively constant amount of shading for only small changes in the amount of ambient luminance.
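A minimal sketch of such a multi-threshold hysteresis controller follows, using the notation above (ℓ1 > ℓ3 > ℓ2 mapped to σ1 > σ3 > σ2). The numeric luminance and shading values are illustrative, not values specified by this description.

```python
# Thresholds ordered highest first: (luminance threshold, shading above it).
THRESHOLDS = [
    (10_000.0, 0.9),   # l1 -> s1: very bright, heavy shading
    (2_000.0, 0.5),    # l3 -> s3: intermediate
    (500.0, 0.1),      # l2 -> s2: dim, light shading
]

def update_shading(luminance, current_shading):
    """Return the new shading amount for a measured ambient luminance.
    Shading ratchets up on upward crossings and only relaxes once the
    luminance falls below the lowest threshold, giving hysteresis."""
    for threshold, shading in THRESHOLDS:
        if luminance > threshold:
            return max(current_shading, shading)   # darken, never lighten here
    return min(current_shading, THRESHOLDS[-1][1]) # below l2: relax to s2
```

For example, a wearer stepping from 15,000 down to 5,000 units of luminance keeps shading σ1 = 0.9 (no flicker), and the shading relaxes to σ2 = 0.1 only once the luminance falls below ℓ2 = 500.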

[633] In one embodiment, the first and the second ambient luminance thresholds can be set to optimize an amount of visual acuity by the wearer’s eye. For example, the wearer’s color sensitivity, the wearer’s contrast sensitivity, the wearer’s night vision, or the wearer’s vision sensitivity, can be optimized. Thus, the amount of shading applied to the ambient light can be set so as to allow the wearer the best possible visual acuity, for example, by providing the best possible contrast between a targeted object and a background, between a targeted object and another object, or otherwise as described herein.

[634] In one embodiment, the amount of ambient luminance can be determined for the viewing environment separately with respect to each eye. Thus, the amount of shading applied to the ambient light can be set separately with respect to each eye, so as to allow the wearer (for example) to pay attention to different target objects with each eye. Similarly, this technique can provide a hysteresis loop of shading, separately with respect to each eye, between the first and second ambient luminance threshold, or between any adjacent pair of ambient luminance thresholds.

Fig. 13 — Illumination where the user is looking

[635] Fig. 13 (collectively including Figures 13A and 13B) shows a conceptual drawing of eyewear used to provide dynamic lighting in a direction being viewed by a wearer.

Illumination by eyewear

[636] Figure 13A shows a conceptual drawing of eyewear being used to provide light where the user is looking. As otherwise described herein, an example eyewear 1300 can include one or more elements as shown in the figure, including at least

— a dynamic eye tracking mechanism 1310 disposed to determine a gaze direction and/or focal length of the wearer’s eye;

— one or more sensors 1320 disposed to receive commands or requests from the wearer.

[637] As described herein, the eyewear 1300 can also include additional elements disposed to provide illumination where the wearer is looking, such as one or more of

— a lamp (such as an LED, laser, or other illuminating device) 1330, disposed to provide illumination in a direction in which the wearer is looking;

— one or more sensors 1340 disposed to determine user conditions, such as when the user’s head moves with respect to the user’s field of view, when the user is subject to a medical condition, or otherwise as described herein;

— (optionally) one or more processors 1350 coupled to the dynamic eye tracking mechanism 1310, the sensors 1320 or 1340, a combination thereof, or otherwise as described herein, and disposed to control the lamp 1330.

[638] As described herein, the lamp 1330 can be disposed on a portion of the eyewear 1300, such as on a front piece (such as at a location between the user’s eyes) or on a temple (such as at a location near the user’s temple), and disposed to provide a light beam in a direction in which the user is looking and focused at a distance at which the user is focusing. In the figure, the lamp 1330 is shown disposed on the front piece of a set of glasses. However, in the context of the invention, there is no particular requirement for any such limitation; the lamp 1330 can be disposed anywhere it can be used to illuminate an object at which the wearer looks.

[639] As described herein, the eyewear 1300 can be disposed to determine a gaze direction and/or focal length of the wearer’s eye, using the dynamic eye tracking mechanism 1310. In response thereto, the eyewear 1300 can be disposed to direct light from the lamp 1330 in the direction or at the focal length where the wearer’s eyes are focusing, thus, to the location in three-dimensional (3D) space where the wearer is looking. This location can be a particular object or a particular surface of a designated object (such as a surface of a smartphone display).
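The sketch below illustrates one way to convert the eye tracker’s gaze direction and focal length into an aim point for the lamp and then into pan/tilt angles for its mount. The coordinate frame and the lamp offset are assumptions for illustration, not geometry defined by this description.

```python
import math

LAMP_OFFSET = (0.0, 0.03, 0.0)   # lamp 3 cm above the eye midpoint (assumed)

def gaze_point(yaw_deg, pitch_deg, focal_length_m):
    """3D point the wearer is looking at, in an eye-centered frame
    (x right, y up, z forward)."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (focal_length_m * math.cos(pitch) * math.sin(yaw),
            focal_length_m * math.sin(pitch),
            focal_length_m * math.cos(pitch) * math.cos(yaw))

def lamp_pan_tilt(target):
    """Pan/tilt (degrees) aiming the lamp's beam at the gaze point."""
    x, y, z = (t - o for t, o in zip(target, LAMP_OFFSET))
    pan = math.degrees(math.atan2(x, z))
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))
    return pan, tilt

# Example: looking 10 degrees left, slightly down, focused at 2 m.
pan, tilt = lamp_pan_tilt(gaze_point(-10.0, -5.0, 2.0))
```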

[640] This can have the effect that the eyewear can illuminate objects at which the wearer is looking, thus, to “light where the user is looking”. Thus, when the wearer adjusts the direction in which they are looking, adjusts the depth of field at which they are looking, tilts their head, squints, or otherwise moves due to an external force (such as a centrifugal/centripetal force that pushes or turns the user’s body), the eyewear can “light where the wearer looks”, and if so desired, only where the wearer looks. In such cases, the one or more sensors 1340 can be disposed to determine when the user’s head moves without the user directly attempting to redirect their gaze.

[641] In one embodiment, this form of illumination can also be applied to inverse-shading. Thus, the eyewear can be disposed to illuminate a region surrounding where the user is looking, so as to provide the user with better visual acuity of objects within that region. For example, as described herein, the eyewear 1300 can be disposed to shade/inverse-shade a control or a display in a vehicle or otherwise where the user desires to specifically view that control or display in a relatively shaded or darkened area.

[642] As also described herein, the eyewear 1300 can also be disposed to both illuminate where the wearer looks and to shade areas where the wearer is not looking. For example, when the wearer views a display screen, the eyewear 1300 can be disposed to determine where on the display screen the wearer is looking, to specially illuminate that portion of the display screen, and to shade other portions of the display screen. This can have the effect of highlighting one or more particular portions of the display screen, possibly making those portions easier for the wearer to read. Similarly, when the user looks at a control or a display in a vehicle (such as on a dashboard), or similarly looks at a control or a display in a relatively shaded or darkened area, the eyewear 1300 can be disposed to illuminate that particular control or display and to shade regions of the user’s field of view outside that particular control or display.

(Specific portions of field of view)

[643] In one embodiment, the eyewear 1300 can be disposed to provide illumination only in one or more specific portions of the user’s field of view. For example, when the eyewear 1300 includes multiple lenses (or lenses having multiple portions), the eyewear can be disposed to illuminate only for one of those lenses (or only for one of those portions). In the case where the eyewear 1300 includes “reader” glasses, for instance, in which a first portion of the lenses are disposed to provide close-range viewing and a second portion of the lenses are disposed to provide distant viewing, the eyewear can be disposed to illuminate where the user looks only for the close-range viewing portion of the lenses (alternatively, only for the distant viewing portion of the lenses). Similarly, the eyewear 1300 can be disposed to illuminate where the user looks only for a central-viewing portion or only for a peripheral-viewing portion of the user’s field of view.

[644] In one embodiment, the eyewear 1300 can be disposed to select one or more such portions of the lenses in which to illuminate where the user looks in response to a user input (such as when the user desires to cause the eyewear to illuminate only close-range viewing), in response to an ambient environment (such as when the user is inside a vehicle whose controls are shaded and the user would benefit from illumination of particular controls or displays in the vehicle), in response to a user condition (such as a medical condition), or otherwise as described herein. In such cases, the sensors 1320 can be disposed to receive the user input. The user input can be in response to a touch or capacitive control, a hand/finger gesture, an eye/face gesture, a body movement, or as otherwise described herein.

[645] For example, the eyewear 1300 can be disposed to illuminate one or more portions of the user’s field of view in response to determining that the user is about to suffer, or is currently suffering, migraine or photophobia. The eyewear 1300 can be disposed to restrict illumination of those one or more portions of the user’s field of view to selected frequencies, such as green in the 500-560 nm range. The eyewear 1300 can thus combine illumination (in green) with frequency shading/inverse-shading (to reduce untoward frequencies, such as blue/ultraviolet). While the user might benefit from illumination only in green and without blue/ultraviolet generally, this might be of particular value when the user is reading in a relatively bright environment, such as when using “reader” glasses or bifocals.

(Differential lighting of portions of field of view)

[646] In one embodiment, the eyewear 1300 can be disposed to perform differential amounts of illumination or shading/inverse-shading for distinct locations in the user’s field of view. For example, the eyewear 1300 can be disposed to perform a first amount of illumination or shading/inverse-shading in a close-range portion of the user’s field of view and a second amount of illumination or shading/inverse-shading in a distant portion of the user’s field of view. For a first example, when the user is reading in a bright environment, such as in sunlight, the eyewear 1300 can be disposed to illuminate or shade/inverse-shade the portion of the user’s field of view associated with reading, to account for relative brightness or shadowing of the reading material. For a second example, when the user is operating an aircraft, the eyewear 1300 can be disposed to shade/inverse-shade the portion of the user’s field of view associated with a bright sky field to account for brightness of the ambient environment, and to illuminate an inside of the aircraft to account for that inside portion being relatively shaded or darker than the rest of the ambient environment.

[647] Similarly, the eyewear 1300 can be disposed to adjust coloring/tinting or color balance of at least a portion of the user’s field of view, such as in response to the brightness or coloring/tinting of the ambient environment. For a first example, when the user is determined to be sensitive to bright light, such as when the user is subject to migraine or photophobia, the coloring/tinting of the ambient environment can be adjusted to reduce (in whole or in part) the amount of blue/ultraviolet light in the coloring/tinting of the user’s field of view. For a second example, when the user is determined to be about to be, or currently, subject to migraine or photophobia, the coloring/tinting or color balance of the ambient environment can be adjusted to increase the amount of green light received by the user’s eyes.

[648] In another embodiment, when the user is subject to at least some color blindness (whether natural or induced by an adjustment to the coloring/tinting of the user’s field of view), the coloring/tinting of the ambient environment can be adjusted to enhance those portions of the user’s field of view that relate to particular colors for which the user’s attention is to be drawn. For example, when the user is subject to red/green color blindness or when the user’s field of view is filtered to restrict coloring/tinting to primarily shades of green, the user’s field of view can be adjusted to show red coloring/tinting (such as traffic lights or signs when the user is driving) in a brighter format or in a flashing format, so as to allow the user to determine the presence of those colors even when the user is unable to see them directly.

Illumination by external device

[649] Figure 13B shows a conceptual drawing of eyewear being used to control one or more devices to highlight one or more displays, in response to where the user is looking.

[650] As described herein, the eyewear 1300 can also include elements disposed to control an external device, such as

— an electromagnetic or ultrasonic transmitter 1350,

— possibly disposed to control a device 1360 (such as a smartphone or other mobile device, a phablet or tablet, or a desktop computer having a coupled display),

— wherein the external device has one or more viewable displays 1370.

[651] For example, when the user is looking at the device 1360, the eyewear 1300 can determine the wearer’s gaze direction or focal length. The one or more processors 1350 can receive information from the dynamic eye tracking mechanism 1310 and, in response thereto, determine a three-dimensional (3D) location at which the wearer is looking.

[652] The one or more processors 1350 can direct the transmitter 1350 to send one or more signals to the device 1360, informing the latter where the wearer is looking and directing the device 1360 what tasks to perform with that information. For example, the device 1360 can highlight a designated portion of the mobile device’s screen, specifically, a portion at or near where the wearer is looking on the screen.
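A hedged sketch of that eyewear-to-device message follows; the field names and the “highlight” command are invented for illustration, as this description does not define a wire format.

```python
import json

def make_gaze_message(screen_x, screen_y, action="highlight", radius_px=40):
    """Serialize 'the wearer is looking here; highlight it' for the device."""
    return json.dumps({
        "type": "gaze",
        "screen_pos": {"x": screen_x, "y": screen_y},  # where on the display
        "action": action,                              # what the device should do
        "radius_px": radius_px,                        # extent of the highlight
    })

# The device-side handler would parse this and brighten or magnify the region.
payload = json.loads(make_gaze_message(512, 384))
assert payload["action"] == "highlight"
```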

[653] This can have the effect that the device 1360 can show the wearer just what the wearer is looking for. Some possible advantages include at least the following:

— The wearer can more easily see the portion of the screen they desire to see. For example, when the screen includes text, particularly small text, highlighting the text in which the wearer is interested can more easily allow the wearer to clearly see that text. This can be combined with magnifying the text of interest to the wearer, as other and further described herein.

— The device 1360 can more easily display the portion of the screen of interest to the wearer in otherwise adverse lighting conditions, such as by brightening that portion of the screen, without having to brighten the entire screen. This can have the effect of saving power usage and battery time otherwise available to the device.

— The device 1360 can more easily display the portion of the screen of interest to the wearer at a relatively greater brightness, without excessive power usage. This can have the effect of providing a brighter screen to the wearer for viewing, and can also have the effect of limiting the amount of heating by the device to provide a bright display.

[654] The device 1360 can also urge the wearer to review particular portions of the display, such as by moving the highlighted portions of the display while the wearer is viewing them. This can urge the wearer to move their gaze with the highlighting, thus urging the wearer to move their gaze direction across the display. For example, the device 1360 can move the highlighted portions of the screen across a sequence of text to help the wearer improve their reading speed.

[655] In another embodiment, the eyewear 1300 can be disposed to operate with multiple display screens 1370a and 1370b. The multiple display screens 1370a and 1370b can be controlled either by a single device 1360 (whether a mobile device or a “desktop” device) or by multiple devices 1360 (which can be the same or different types of device). In such cases, the eyewear 1300 can determine whether the wearer is looking at a first screen or a second screen, and in response thereto, cause the screen being looked at (the “active” screen) to have a first visual effect and the screen not being looked at (the “inactive” screen) to have a second visual effect.

[656] For example, the eyewear 1300 can direct the inactive screen to be substantially dimmed, so the user is not subject to excessive brightness directed at their peripheral vision. For another example, the eyewear 1300 can direct the inactive screen to have its color balance altered. In such cases,

— The inactive screen can be filtered to be more amber, so as to reduce peripheral-vision brightness in the blue portion of the visual spectrum; or

— The inactive screen can be directed to provide green light, so as to prevent or reduce the likelihood of, or to treat or reduce the severity of, migraines.

(Altering illumination by wearer commands)

[657] In additional embodiments, the eyewear 1300 can be disposed to recognize commands or requests from the wearer to alter the intensity (or other features) of the illumination. In such cases, wearer commands can include capacitive or touch controls, eye or face gestures, finger or hand gestures, head or mouth movements, voice commands, electromagnetic commands from another device, other user commands described herein, or other ways the wearer can direct the eyewear 1300.

[658] In such cases, the eyewear 1300 can be disposed to allow the wearer to direct the illumination to have a different amount of area at the illuminated device 1360 or object, a different angle or amount of polarization, a different color or color balance (or a different set of colors in a varying color pattern), or another visual effect. In additional such cases, the eyewear 1300 can be disposed to direct the device 1360 to increase a magnification, or to impose other visual effects, on the portion of the screen being viewed by the wearer. For example, the eyewear 1300 can be disposed to alter a color or color balance of that portion, to cause that portion to blink, or otherwise change a way that portion can be viewed by the wearer.

Fig. 14 — Peripheral vision

[659] Fig. 14 (collectively including Figures 14A and 14B) shows a conceptual drawing of eyewear including a peripheral vision lens.

[660] Figure 14A shows a side view of eyewear including a peripheral vision lens.

[661] Figure 14B shows a top view of eyewear including a peripheral vision lens. One or more peripheral vision lenses can be disposed in a wearer’s peripheral field of view, thus, to a side of the wearer’s eyes and face.

[662] As otherwise described herein, an example eyewear 1400 can include one or more elements as shown in the figure, including at least

— a frame 1410 disposed to hold one or more front vision lenses, such as one for each eye;

— a pair of temples 1420L and 1420R disposed to each hold a peripheral lens, such as one for each eye;

— a dynamic eye tracking mechanism 1430 disposed to determine a gaze direction and/or focal length of the wearer’s eye;

— an object recognition system 1440 disposed to determine a three-dimensional (3D) location of an object within the wearer’s field of view;

— a peripheral light sensor 1450 disposed to sense the light on the side of the eyewear (such as in the wearer’s peripheral field of view) and to provide information to a processor (not shown); the processor can use the peripheral light sensor 1450 to determine where to shade/inverse-shade the peripheral lenses.

[663] As other and further described herein, the dynamic eye tracking mechanism 1430 can be disposed to determine a gaze direction and/or focal length separately for each of the wearer’s eyes. For example, the dynamic eye tracking mechanism 1430 can determine whether the wearer’s eye is looking at an object identified by the object recognition system 1440. The eyewear 1400 can be disposed so that, when the wearer’s eye is looking at an identified object 1460, the eyewear can identify which lens the object is visible through, such as whether the object is visible through a front vision lens or a peripheral vision lens.

[664] The eyewear 1400 can be disposed so that, when the wearer's eye is looking at an identified object 1460 through a peripheral vision lens, the eyewear can shade/inverse-shade one or more of the front vision lenses or peripheral vision lenses. This can have the effect that, when the wearer is looking at an object 1460 using a peripheral vision lens, the eyewear 1400 can shade/inverse-shade the object so as to improve the wearer's visual acuity with respect to the object.

[665] For example, as sketched in code after the following list, the eyewear 1400 can be disposed to

— (1) recognize an object 1460 visible in the wearer’s field of view through a peripheral vision lens (which can operate separately for each eye); and one or more of:

— (2a) shade/inverse-shade a front vision lens so as to urge the wearer to view the object through the peripheral vision lens (which can also operate separately for each eye);

— (2b) shade/inverse-shade the peripheral vision lens so as to improve the wearer’s visual acuity with respect to the object (which can also operate separately for each eye).
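The per-eye decision in steps (1), (2a), and (2b) can be modeled as below. This is a minimal sketch under assumed interfaces, not the claimed apparatus: the shade(), inverse_shade(), and clear() methods and the gaze_region attribute are hypothetical names.

    # Minimal sketch (assumed interfaces) of steps (1), (2a), (2b) above,
    # operating separately for each eye.
    def adjust_for_object(eye, object_lens, front_lens, peripheral_lens):
        """object_lens: the lens through which the identified object 1460
        is currently visible for this eye, per the object recognition
        system and the eye tracker."""
        if object_lens is peripheral_lens:
            # (2a) shade the front vision lens to urge the wearer to view
            # the object through the peripheral vision lens ...
            front_lens.shade(amount=0.6)
            # (2b) ... and inverse-shade the peripheral vision lens to
            # improve acuity with respect to the object.
            peripheral_lens.inverse_shade(region=eye.gaze_region, amount=0.4)
        else:
            # Object not in the peripheral lens: leave both lenses clear.
            front_lens.clear()
            peripheral_lens.clear()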

[666] The peripheral light sensors can also provide an indicator to the processor when bright light is about to enter one or more of the front lenses, thus, oncoming directly into the wearer's front field of view, before the wearer (or sensors on the front lenses) has their vision affected by the incoming bright light. For example, when the wearer is controlling a vehicle, such as an aircraft or speedboat, this can have the effect of warning the eyewear of incoming bright light approaching the wearer's peripheral vision and then frontal vision, before that incoming bright light debilitates the wearer's vision (in either direction).

[667] This can also have the effect of providing an artificial compass or artificial horizon for the wearer using the Sun as a reference light. For example, the eyewear can use a timer to determine a proper direction of the Sun, in combination with using a peripheral sensor to determine an angle the wearer or the wearer’s vehicle makes with respect to the Sun. Similarly, the eyewear can use an accelerometer or gyroscope to determine a relative location of the wearer or the wearer’s vehicle (in response to a sequence of accelerations or sequence of movements at constant velocity), or can use a magnetometer to determine a proper direction of the North magnetic pole or South magnetic pole.
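As a rough numerical illustration, and assuming (hypothetically) an ephemeris routine supplying the Sun's current azimuth from time of day plus rough location, and a peripheral sensor reporting the Sun's angle relative to the frame's forward axis, the artificial-compass computation reduces to one subtraction:

    # Hypothetical sketch: estimate the wearer's heading from the Sun.
    # solar_azimuth_deg would come from an ephemeris table or library;
    # sun_angle_deg is the angle the peripheral sensor measures between
    # the frame's forward axis and the Sun, positive to the wearer's right.
    def estimated_heading(solar_azimuth_deg: float, sun_angle_deg: float) -> float:
        """Heading of the wearer's forward axis, in degrees clockwise
        from true North."""
        return (solar_azimuth_deg - sun_angle_deg) % 360.0

    # Example: Sun due South (azimuth 180), sensed 90 degrees to the
    # wearer's right: the wearer is facing East.
    assert estimated_heading(180.0, 90.0) == 90.0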

[668] The indicator to the processor can also predict other incoming bright lights, such as a headlight, laser, or spotlight, aimed errantly or intentionally at the wearer’s eye. The processor, in response to the predicted incoming bright light, can be disposed to shade/inverse-shade one or more of the lenses (whether frontal lenses or peripheral lenses), so as to prevent the wearer from being adversely affected by the incoming bright light before that bright light shines directly into the wearer’s eye.

[669] More generally, any two of the lenses, such as a forward-facing lens and a peripheral-facing lens, any two forward-facing lenses, or any two peripheral-facing lenses, can provide different adjustments to visual effects available to the wearer. For example, these visual effects can include any of the visual effects described below. Moreover, as any two such visual effects can be applied separately for each lens, they can be applied separately for each eye.

[670] In other embodiments, a single lens, such as a single forward-facing lens or a single peripheral-facing lens, can provide one or more adjustments to visual effects available to the wearer. For example, these visual effects can include any of the visual effects described below.

[671] The object recognition system 1440 can be disposed to determine a three-dimensional (3D) location of an object within the wearer's field of view. For example, the object recognition system 1440 can be disposed to determine through which lens the wearer can view the object. This can have the effect that the eyewear 1400 can identify where the wearer should look to view the object 1460, such as whether to view the object through a forward-facing lens or a peripheral-facing lens.

[672] Using shading/inverse-shading, the object recognition system 1440 can be disposed to encourage/discourage the wearer to view objects 1460 using their peripheral vision, or to encourage/discourage the wearer from viewing objects using their forward vision, or combinations thereof.

[673] As described in the Incorporated Disclosures, the eyewear 1400 can improve the wearer's visual acuity with respect to the object 1460 by one or more of:

— shading/inverse-shading the object so as to allow the wearer to more easily view the object in an ambient light environment;

— shading/inverse-shading the object to provide a sequence of still images or a sequence of moving images, so as to allow the wearer to more easily view a moving object (or a spinning object, or an object otherwise changing with respect to its background).

[674] In one embodiment, the eyewear 1400 can shade/inverse-shade the peripheral-facing lens so as to substantially decrease brightness of light incoming from a peripheral direction when the wearer is looking forward. For example, the wearer might be operating a vehicle (such as an aircraft or speedboat) and be directing their attention at instruments for the vehicle, while a relatively bright light is incoming from a peripheral direction. In such cases, it can be desirable to shield the wearer from the bright peripheral light, allowing the wearer to view the instruments while not being blinded from a side.

[675] Similarly, the eyewear 1400 can shade/inverse-shade the frontal-facing lens so as to substantially decrease brightness of light incoming from a frontal direction when the wearer is looking to a side. For example, when the wearer is operating a vehicle (such as an aircraft or speedboat) and is directing their attention at an object outside the vehicle, while a relatively bright light is incoming from a frontal direction, it can be desirable to shield the bright light from the frontal direction to allow the wearer to view the external object on a side while not being blinded from the front.

[676] For example, the eyewear 1400 can shade/inverse-shade the object to provide a still image or a sequence of moving images as described in the Incorporated Disclosures, such as Application 16/684,479, filed Nov. 14, 2019, naming inventor Scott LEWIS, titled "Dynamic visual optimization", Attorney Docket No. 6501, currently pending. As described therein, the eyewear can shade/inverse-shade the object by interleaving shaded periods with unshaded periods (herein sometimes called "frames"), so as to provide an image of the object that the wearer's eye integrates into a single image, but which is relatively increased in contrast with an ambient environment. As described therein, each set of frames can include one or more relatively shaded still/moving images interleaved with relatively unshaded still/moving images, with the relatively shaded images or relatively unshaded images possibly having different amounts of shading. In such cases, this can have the effect that an object viewed against a bright or visually noisy background, or an object viewed while changing against its background (such as a spinning ball or other object), can be viewed with better visual acuity.
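A minimal timing sketch of this interleaving follows; the lens.set_shade() method and the specific shading levels and frame period are assumptions for illustration, not values taken from the Incorporated Disclosures:

    # Hypothetical sketch: alternate shaded and unshaded "frames" fast
    # enough that the wearer's eye integrates them into a single image
    # with increased contrast against the ambient environment.
    import itertools
    import time

    def run_interleaving(lens, shaded=0.7, unshaded=0.1,
                         period_ms=8.0, n_frames=1000):
        """Drive the lens through alternating shading levels; differing
        'shaded'/'unshaded' amounts correspond to frames with different
        amounts of shading, as described above."""
        levels = itertools.cycle([shaded, unshaded])
        for level in itertools.islice(levels, n_frames):
            lens.set_shade(level)          # assumed lens interface
            time.sleep(period_ms / 1000.0)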

[677] In other and further embodiments, the eyewear 1400 can be disposed to alter the wearer's vision through one or more of the front vision lenses, or through one or more of the peripheral vision lenses, using techniques other than shade/inverse-shade. For example, the eyewear 1400 can alter any of the visual effects provided by lenses, including:

— a chromatics effect, color balance effect, color injection effect, or other effect on color, viewable through a front or peripheral vision lens;

— a corrective lens effect, magnification effect, refraction effect, or other effect on focus viewable through a front or peripheral vision lens;

— an anti-glare effect, a polarization effect, or other effect on shading/inverse-shading viewable through a front or peripheral vision lens;

— a prismatic effect, a reflection effect, or other effect on direction of images viewable through a front or peripheral vision lens; or any other effect suitable for improving visual acuity with respect to an object viewable through a front or peripheral vision lens.

[678] The eyewear 1400 can be disposed to combine these effects with a shading/inverse-shading effect, with each other, or in multiple combinations.

[679] In other and further embodiments, the eyewear 1400 can be disposed so as to apply a visual-effect adjustment by one or more of the forward-facing lens or the peripheral-facing lens. For example, the peripheral-facing lens can be disposed to provide a magnification effect, anti-glare effect in response to an amount of light in the ambient environment, or another visual acuity effect, so as to allow the wearer to more easily view an object 1460 appearing in their peripheral vision; the forward-facing lens can be disposed to provide a shading/inverse-shading effect so as to urge the wearer to look in the direction of their peripheral vision. In other examples, the eyewear 1400 can be disposed to apply "digital visual optimization" as described in the Incorporated Disclosures (thus, providing a sequence of still images or a sequence of moving images), so as to allow the wearer to more easily view an object 1460 appearing in their peripheral vision. In other examples, the eyewear 1400 can be disposed to provide other or further possibilities that allow the wearer to more easily view an object appearing in their peripheral vision, to urge the wearer to view an object 1460 appearing in their peripheral vision, or to otherwise improve the wearer's peripheral vision.

[680] In other and further embodiments, the eyewear 1400 can be disposed to apply different adjustments to the forward-facing lens and the peripheral-facing lens, so as to encourage the wearer to view, or to improve the acuity of the wearer's view of, objects 1460 in their peripheral field of view.

Fig. 15 — Music and entertainment shading/inverse-shading

[681] Fig. 15 shows a conceptual drawing of eyewear capable of performing music and entertainment shading/inverse-shading.

[682] In one embodiment, the eyewear 1500 can include one or more of the following:

— An audio/video signal receiver 1501.

— One or more lenses 1502 disposed between a user (not part of the eyewear) and a field of view (not part of the eyewear).

— An optional frame 1503 disposed to support one or more of the lenses 1502.

— An optional lamp 1504 disposed to illuminate at least a portion of the user’s field of view.

— A processor 1505 coupled to the audio/video signal receiver.

[683] The eyewear 1500 can also optionally be couplable to one or more of the following:

— one or more sensors 1506a couplable to the user, to an ambient environment, to an external device having a capability of determining a medical status of the user, or as otherwise described herein, and disposed to provide sensor information 1506b; or

— an external device, such as a smartphone or a mobile device, a display (such as a monitor, a radio or television, or as otherwise described herein), an audio/video transceiver (such as a telephone or video conference system), or as otherwise described herein.

[684] The processor 1505 can be coupled to the audio/video signal receiver 1501 and can be disposed to receive audio/video signals 1511 therefrom. For example, the audio/video signals 1511 can include electromagnetic signals, ultrasonic signals, or different signals as otherwise described herein, such as from a radio station 1512 or another source as otherwise described herein. The audio/video signals 1511 can be disposed to encode one or more audio/video streams; those audio/video streams can include music, a song with vocals, another music presentation, a voice presentation, another audio presentation, or another presentation as otherwise described herein.

[685] The processor 1505 can also be coupled to the one or more sensors 1506a and can be disposed to receive sensor information 1506b therefrom. For example, the sensor information 1506b can include electromagnetic signals, ultrasonic signals, or different signals as otherwise described herein, such as from devices disposed to receive information about the user or about the user’s ambient environment. The sensor information 1506b can be disposed to encode information with respect to user conditions, such as user medical conditions, user facial conditions, or as otherwise described herein.

[686] The processor 1505 can be coupled to memory 1521 maintaining a program and/or data disposed to control operation of the processor. The memory 1521 can be disposed logically local to the processor 1505, or logically remotely from the processor; for example, at least some of the memory 1521 can be disposed at a logically remote server 1522, such as in a virtual machine 1523 or using another processing technique.

[687] The program can be disposed to control the processor 1505 to receive the one or more audio/video signals 1511 and to decode those signals into one or more audio/video streams. For example, the audio/video streams can include audio streams as described above, or different streams as otherwise described herein. The processor 1505 can be disposed to couple information from the decoded audio/video signals 1511 to one or more other devices, such as other elements of the eyewear 1500, to another distinct eyewear 1500, to another external device, or as otherwise described herein.

[688] The processor 1505 can be coupled to one or more of: the lenses 1502, the frame 1503, or both. The processor 1505 can be disposed to control the lenses 1502, the frame 1503, or both, in response to the audio/video streams or in response to other information derived from the audio/video signals 1511. For one example, when the information derived from the audio/video signals 1511 includes a music stream, the processor 1505 can shade/inverse-shade the lenses 1502, such as described herein, in an amount described by, or otherwise in synchrony with, that music stream.

[689] The program can also be disposed to control the processor 1505 to receive the sensor information 1506b and to decode that information into one or more audio/video streams or other information for control of the eyewear 1500 or the mask 1531. For example, the sensor information 1506b can include information with respect to the user’s audio expressions (such as their voice), facial expressions (such as movements of the user’s eyes, nose, or mouth), or as otherwise described herein. The processor 1505 can be disposed to couple decoded sensor information 1506b to one or more other devices, such as other elements of the eyewear 1500, to another distinct eyewear 1500, to another external device, or as otherwise described herein.

Shading/inverse-shading

[690] In one embodiment, the processor 1505 can be disposed to control the lenses 1502 in response to the audio/video signal 1511. The processor 1505 can be disposed to provide a shading/inverse-shading effect on the lenses 1502, so as to alter a luminance level of the user’s field of view in response to the audio/video signal 1511. For example, when the audio/video signal 1511 provides a relatively low volume, the processor 1505 can control shading/inverse-shading of the lenses so that the user sees only a relatively low luminance in their field of view (of the ambient environment). Similarly, when the audio/video signal 1511 provides a relatively high volume, the processor 1505 can control shading/inverse-shading of the lenses so that the user sees a relatively high luminance in their field of view (of the ambient environment). Alternatively, the processor 1505 can control shading/inverse-shading of the lenses in an opposite manner, thus, providing relatively low luminance when the audio/video signal 1511 provides a relatively high volume and relatively high luminance when the audio/video signal 1511 provides a relatively low volume.
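A minimal sketch of this volume-to-shading mapping follows; the 0..1 ranges and the shade_for_volume name are illustrative assumptions, with inverted=True giving the opposite mapping described above:

    # Hypothetical sketch: map a decoded stream's volume (0..1) to a lens
    # shading level (0..1, where 1 admits the least light).
    def shade_for_volume(volume: float, inverted: bool = False) -> float:
        volume = max(0.0, min(1.0, volume))              # clamp
        luminance = volume if not inverted else 1.0 - volume
        return 1.0 - luminance   # more shading = less perceived luminance

    # Loud passage, normal mode: high luminance, little shading.
    assert round(shade_for_volume(0.9), 2) == 0.10
    # Loud passage, inverted mode: low luminance, heavy shading.
    assert round(shade_for_volume(0.9, inverted=True), 2) == 0.90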

[691] This can have the effect that the user experiences a shading/inverse-shading effect in response to the song, other music presentation, or other audio stream derived from the audio/video signal 1511 received by the audio/video signal receiver 1501. When the ambient environment is sufficiently bright, or when the ambient environment has at least one sufficiently bright light within the user's field of view, the user can experience the song or other music presentation derived from the audio/video signal 1511 as a variation in video luminance as well as audio volume. Alternatively, when the ambient environment is near the boundary between relatively brighter and relatively darker, the user can experience the song or other music presentation as a variation between a relatively brighter luminance at which the user's color vision is activated and a relatively darker luminance at which the user's vision is primarily in black/white.

Other lens effects

[692] In one embodiment, the processor 1505 can be disposed to alter one or more other lens effects in response to the audio/video signal 1511. The other lens effects can include, in addition to or in lieu of shading/inverse-shading, one or more of:

— The processor 1505 can alter an amount of coloring/tinting of the lenses 1502 in response to the audio/video signal 1511.

— The processor 1505 can alter an amount of, duty cycle of, frequency of, or other features of, dynamic visual optimization (“DVO”) of the lenses 1502 in response to the audio/video signal 1511.

— The processor 1505 can alter an amount of polarization of the lenses 1502 in response to the audio/video signal 1511.

— The processor 1505 can alter an amount of prismatic deflection of the lenses 1502 in response to the audio/video signal 1511.

— The processor 1505 can alter an amount of refraction of the lenses 1502 in response to the audio/video signal 1511.

— The processor 1505 can alter other lens effects of the lenses 1502, or combine such effects, such as otherwise described herein, in response to the audio/video signal 1511.

[693] In such cases, the processor 1505 can be disposed to alter the one or more lens effects for only a portion of the lenses 1502. Similarly, the processor 1505 can be disposed to alter a first lens effect for a first portion of the lenses 1502 and to alter a second lens effect for a second portion of the lenses 1502.

[694] For example, the eyewear or digital eyewear can be disposed to provide a color change with respect to one or more lenses in response to the song, other music presentation, or other audio/video. This can have the effect that an external device or another person can provide the user with a colorized experience to go with associated music. This can alternatively have the effect that an external device or another person can provide the user with a colorized enhancement of an alarm or other audio/video (such as a fire alarm or an emergency vehicle siren).

[695] For another example, this can have the effect that the user can experience one or more lens effects, either for portions of the lenses 1502 or otherwise, in response to the audio/video signal 1511.

— The user can enjoy the one or more lens effects at a concert, at a party, when listening to radio or stereo, when watching television, or otherwise while participating with entertainment.

— The user can enjoy the one or more lens effects while observing or participating in a sports event.

— The processor 1505 can be responsive to audio senses, such as when the user is listening to a storm approaching, to adjust video senses, such as altering responses to lightning or other light effects.

— The processor 1505 can be responsive to audio streams, such as from operating equipment/machinery, to draw a user's attention to elements of equipment/machinery exhibiting anomalous behavior. For example, when the user is operating a vehicle, such as an aircraft, ground vehicle, or watercraft, the processor 1505 can be disposed to draw the user's attention to particular instruments when selected audio streams from the control surfaces, engine, propellers, wheels, wind effects, or other audio streams are indicative of conditions to which the user ought to address their attention. (A minimal sketch of this audio-triggered alerting follows this list.)

— The processor 1505 can be responsive to audio streams, such as warning signals, to alert the user. For example, when the user is operating a vehicle in otherwise poor visibility conditions, the processor 1505 can be disposed to use visual cues to alert the user to potentially dangerous conditions: thus, when the user is driving a vehicle near a blind entrance, the processor 1505 can respond to an audio stream indicating entrance of another vehicle by visually alerting the user of the approaching other vehicle (such as by flashing the lenses briefly, or by periodically flashing a dot or symbol on the lenses, or by showing a still or moving arrow pointing at the approaching vehicle, or as otherwise described herein).
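One way to realize the audio-triggered alerting above is a simple running-baseline anomaly check per monitored sound channel. This is a minimal sketch under assumed interfaces; display.flash_arrow() and the channel names are hypothetical:

    # Hypothetical sketch: flag a monitored audio channel as anomalous
    # when its level departs from its running baseline by more than a
    # threshold number of standard deviations, then cue the wearer.
    def check_audio_streams(levels, baselines, display, threshold=3.0):
        """levels: channel name -> current RMS level.
        baselines: channel name -> (mean RMS, standard deviation)."""
        for channel, rms in levels.items():
            mean, std = baselines[channel]
            if std > 0.0 and abs(rms - mean) / std > threshold:
                # e.g. channel == "left_engine": flash an arrow or symbol
                # on the lenses pointing toward that instrument.
                display.flash_arrow(target=channel)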

Illumination

[696] In one embodiment, the processor 1505 can be disposed to control the lamp 1504 in response to the audio/video signal 1511. The processor 1505 can be disposed to use the lamp 1504 to illuminate the user's field of view, or a portion thereof, in response to the audio/video signal 1511. This can alter a luminance level of the user's field of view, or a portion thereof, in response to the audio/video signal 1511.

[697] For example, when the audio/video signal 1511 provides a relatively low volume, the processor 1505 can control the lamp 1504 so that the user sees only a relatively low luminance in their field of view (of the ambient environment). Similarly, when the audio/video signal 1511 provides a relatively high volume, the processor 1505 can control the lamp 1504 so that the user sees a relatively high luminance in their field of view (of the ambient environment). Alternatively, the processor 1505 can control the lamp 1504 in an opposite manner, thus, providing relatively low luminance when the audio/video signal 1511 provides a relatively high volume and relatively high luminance when the audio/video signal 1511 provides a relatively low volume.

[698] This can have the effect that the user experiences illumination in response to the song, other music presentation, or other audio stream. This can allow the user to observe an illuminated region of their field of view when that illuminated region is pertinent to that particular audio stream. The user can then review objects in that illuminated region in response to the particular audio stream. The illumination from the lamp 1504 can be combined with other lens effects, such as coloring/tinting and as otherwise described herein.

[699] One or more examples include:

— When the user is navigating a region that is relatively dark, the processor 1505 can cause the lamp 1504 to illuminate a region from which a sound originates, thus allowing the user to see any objects that have caused the sound (see the sketch after this list). When the user is traveling in a wooded area, the processor 1505 can be disposed to respond to sounds from animals, branches, or shifting ground, thus allowing safer and/or quieter travel by the user. For example, the user might be an animal control officer or a search/rescue officer.

— When the user is inspecting or traveling in an urban area, the processor 1505 can be disposed to respond to sounds from animals, people, or vehicles, thus allowing safer travel by the user and possibly more effective inspection of the area. For example, the user might be a law enforcement officer, emergency responder, or firefighter.

— When the user is inspecting or navigating a region including equipment/machinery, the processor 1505 can cause the lamp 1504 to illuminate a region from which an anomalous sound occurs. For example, the user can be inspecting equipment/machinery, such as a belt that is failing, a device with foreign object damage, a pipe with a leak, or as otherwise described herein. The processor 1505 can cause the lamp 1504 to illuminate the particular location from which the problem is occurring.
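Locating the region from which a sound originates could be done many ways; one plausible sketch, assuming two frame-mounted microphones and an aimable lamp (lamp.aim() is a hypothetical interface), estimates a bearing from the time difference of arrival:

    # Hypothetical sketch: far-field bearing from the time difference of
    # arrival (TDOA) between two microphones spaced across the frame,
    # then aim the lamp 1504 along that bearing.
    import math

    SPEED_OF_SOUND_M_S = 343.0   # in air at roughly 20 C

    def bearing_from_tdoa(delay_s: float, mic_spacing_m: float) -> float:
        """Bearing in degrees off the frame's forward axis; positive
        values mean the source is to the wearer's right."""
        ratio = (delay_s * SPEED_OF_SOUND_M_S) / mic_spacing_m
        ratio = max(-1.0, min(1.0, ratio))    # clamp numerical overshoot
        return math.degrees(math.asin(ratio))

    def illuminate_sound_source(lamp, delay_s: float, mic_spacing_m=0.14):
        lamp.aim(bearing_deg=bearing_from_tdoa(delay_s, mic_spacing_m))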

Frame presentation effects

[700] In one embodiment, the processor 1505 can be disposed to control the frame 1503 in response to the audio/video signal 1511. The processor 1505 can be disposed to alter the shading/inverse-shading or coloring/tinting of the frame 1503 so as to alter the presentation of the frame, or a portion thereof, to those observing the user. This can alter a presentation of the user themselves, in response to the audio/video signal 1511, such as an electromagnetic signal, an ultrasonic signal, or as otherwise described herein.

[701] For example, the eyewear 1500 can be disposed to provide a change in shading/inverse-shading or coloring/tinting in response to recognition of a particular person's voice, facial image, or other image, so as to show that the user has identified the particular person. This can be disposed to alert nearby persons that the particular person has been identified, thus allowing teammates or assistants of the user to be alerted to look for that particular person. For example, this can be used by one or more emergency responders, law enforcement officers, search/rescue personnel, or as otherwise described herein.

[702] For another example, the eyewear 1500 can be disposed to provide a change in shading/inverse-shading or coloring/tinting in response to determining whether one or more of a collection of selected persons are sufficiently nearby, or if one or more of them has wandered off or gotten lost. Similarly, the eyewear 1500 can be disposed to provide a change in shading/inverse-shading or coloring/tinting in response to determining that the user would like the attention of that collection of selected persons. This can be disposed to alert those selected persons where they should be congregating, such as when the user is a tour guide or a childcare provider.

[703] For another example, the user can provide one or more frame effects, either for portions of the frame 1503 or otherwise, in response to the audio/video signal 1511.

— The user can provide the one or more frame effects at a concert, at a party, at a political rally, at a product presentation, at a show, or otherwise while participating with business presentations, entertainment, or sports.

— The user can provide the one or more frame effects while observing or participating in a sports event.

— The user can provide the one or more frame effects, such as warning signals, to alert nearby other persons. For example, the processor might determine that the user is subject to a medical condition and needs assistance or needs a volunteer to call emergency responders or medical personnel.

Mask presentation effects

[704] In one embodiment, the eyewear 1500 can be coupled to a mask 1531 or another face covering, disposed to obscure or obstruct at least a portion of a view of a user's face, or another presentation device as otherwise described herein. As described herein, a face covering or another type of "facewear" can include any device disposed to obscure or obstruct, in whole or in part, at least a portion of a view of the user's face. The face covering or facewear 1531 can be coupled to the eyewear 1500 or can be disposed to operate separately or apart from the eyewear 1500.

[705] The processor 1505 can be coupled to the mask 1531 and disposed to cause the mask to present an audio/video stream. For example, the processor 1505 can be disposed to cause the mask 1531 to display a video image or a moving video image; or the processor 1505 can be disposed to cause the mask 1531 to present an audio output, such as a song, voice, sound effect, or as otherwise described herein.

[706] For example, the mask 1531 can include a presentation device 1532, such as one or more lamps or photodiodes, one or more color-alterable devices whose color is perceivable to an observer of the wearer of the mask, or another device that can be disposed to present an image. Thus, the mask 1531 can be disposed to provide a (still or moving) picture or another image (or set of images) to be seen by an observer. Examples of uses can include one or more of the following:

— A mouth image disposed to express movements of the user’s mouth, such as to assist with expression of the user’s voice. For example, the mouth image can show images to express speech, songs, sound effects (such as crying, giggling or laughter, hissing, sighing, shushing, whistling, onomatopoeia, or as otherwise described herein), “stage whispers”, speaking, and other vocal effects as described herein. For another example, the mouth image can show blowing a kiss, breathing, burping, clearing one’s throat, coughing, gagging (whether real or imitated), groaning, hiccups, humming, licking or smacking one’s lips, grimacing or otherwise expressing pain, panting, smiling or frowning, smirking or otherwise expressing an emotion relating thereto, or as otherwise described herein.

— A mouth or other set of images disposed to show emojis, emotions, emoticons, or other symbols as described herein. For example, the images can show one or more of the following: text, whether in a language being spoken, or a translation thereof; known symbols, such as those having political or religious significance; artwork, such as designs, photographs, or images of paintings; data representations, such as medical information with respect to the user, or as otherwise described herein; livestreaming, movies, television images, webcam images, or other moving pictures; or as otherwise described herein.

— A set of images associated with a game or sport, such as an image transferred from user to user among a group of players, or among a group of observers of a game; or such as an image transferred among users at a political rally, or to show support for a candidate or to show an opinion (or support for an opinion) when relative silence is otherwise desirable.

— A set of images associated with a frame effect, such as an image enhancing or contrasting the frame effect, including one or more of the following: in response to an audio/video signal 1511, at a concert, at a party, at a political rally, or otherwise while participating with entertainment; in response to a signal associated with a sports event; or as otherwise described herein.

— A set of symbols, pictures, cartoons or caricatures, or other indicators that might have the capacity to present information from the user to others. For example, the symbols can include letters or words, emoji or other known symbols (such as representing known emotions, events, or objects), political or economic symbols (such as representing known movements or political parties), a cartoon or caricature version of the user’s face or another person’s face, or as otherwise described herein.

— An image providing a warning signal or other information to alert nearby other persons, such as emergency responders, law enforcement officers, nearby volunteers, or other persons in a position to provide assistance.

[707] While this Application is primarily described with respect to a mask 1531 that covers the user’s mouth and nose, there is no particular requirement for any such limitation. The mask 1531 can include any facewear that covers or augments any one or more portions of the user’s face or related anatomy, such as one or more of: a set of display elements disposed to augment portions of the user’s mouth when the user’s mouth is not actually covered, such as the lips or the corners of the mouth; other portions of the user’s face, such as the chin, cheeks, nose, eyelids, corners of the eyes, eyebrows, forehead, ears, skin color; or as otherwise described herein.

[708] In one embodiment, the eyewear 1500 can include one or more presentation elements 1507, such as lights, reflectors, other display devices, or as otherwise described herein. The processor 1505 can be disposed to control those presentation elements 1507 so as to present information to the user, to one or more other nearby persons, or to distant persons (such as possibly using a camera 1508a and/or a communication device 1508b to provide an image of those presentation elements 1507 to one or more distant persons), or as otherwise described herein.

[709] For example, the presentation elements 1507 can include one or more lights 1507a disposed behind the lenses 1502, on the frame 1503, otherwise within the eyewear 1500, or as otherwise described herein. Alternatively, the presentation elements 1507 can include one or more reflectors 1507b disposed so as to direct ambient light or other light generated by the eyewear 1500 to the user or to other persons outside the eyewear. Alternatively, the presentation elements 1507 can include one or more informational devices 1507c disposed to show black/white or color (e.g., visible in response to ambient light or other light generated by the eyewear 1500) and to present those informational devices 1507c to the user or to other persons outside the eyewear.

[710] The presentation elements 1507 can be disposed to provide a “light show” or other display, such as in response to the audio/video signal 1511 or in response to the sensor information 1506b. Alternatively, the presentation elements 1507 can be disposed to provide one or more types of information to the user, to a monitoring or other device, or to another person. The information provided by the presentation elements 1507 can include one or more of:

— Status information, such as current physical, emotional, or medical information about the user. For example, status information can include how long the user has been looking at a particular gaze direction or focal length, such as might place the user at risk for a dry-eye condition (a dwell-time check of this sort is sketched after this list); whether the user's eyes are defocused or are focused at distinct gaze directions or focal lengths; whether the user's pupils have different sizes or other indicators of intoxication, stroke, or another medical condition;

— Predictive information, such as an evaluation by the processor 1505 of how likely the user is to develop a dry-eye condition, a migraine condition or neuro-ophthalmic disorder, or another medical condition; or if the user is likely to develop (or is currently undergoing) one or more such conditions, how severe the condition is likely to become, how long the condition is likely to last, or how likely the user is to engage in self-care with respect to the condition;

— Entertainment, news or weather, sports information, financial or traffic reports, or similar information, such as a radio/television broadcast or other audio/video information of interest to the user. For example, the user can receive information about the ambient environment for one or more purposes: so as to avoid adverse traffic, weather, or other hazards; so as to engage in self-care with respect to an allergy condition, an emotional condition, a medical condition, or another condition; so as to obtain more detail about a sports event that the user is participating in, reporting on, training participants for, or viewing or evaluating; so as to remain informed of current events or other events of interest to the user; or as otherwise described herein.

— Advertising information, other economic or financial offers, or similar information, such as information about the presence or location of one or more of: gas stations (particularly when traveling), restaurants, restrooms, retail locations and objects for sale therein (particularly when shopping), viewing points or other points of interest.
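The dwell-time check referenced in the status-information bullet above can be as simple as the following sketch; the threshold, angle tolerance, and DwellMonitor name are illustrative assumptions, not clinical values:

    # Hypothetical sketch: flag a dry-eye risk when gaze direction has
    # stayed within a small tolerance for longer than a threshold.
    import time

    class DwellMonitor:
        def __init__(self, max_dwell_s=600.0, angle_tol_deg=2.0):
            self.max_dwell_s = max_dwell_s
            self.angle_tol_deg = angle_tol_deg
            self.anchor = None               # (gaze_deg, start_time)

        def update(self, gaze_deg: float, now: float = None) -> bool:
            """Feed the current gaze direction; returns True when the
            dwell time exceeds the threshold (possible dry-eye risk)."""
            now = time.monotonic() if now is None else now
            if (self.anchor is None
                    or abs(gaze_deg - self.anchor[0]) > self.angle_tol_deg):
                self.anchor = (gaze_deg, now)   # gaze moved: restart timer
            return (now - self.anchor[1]) > self.max_dwell_s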

[711] In one embodiment, status information can be disposed to be provided to the user, such as to encourage self-care or to encourage the user to require assistance from other persons; or to other persons, such as to encourage those other persons to treat the user in accordance with that status information.

— For example, when status information indicates an emotional condition by the user, emergency responders, medical personnel, or law enforcement officers can use that status information to provide services to the user, or otherwise to treat the user accordingly. Emergency responders and medical personnel can use that status information to rapidly determine what assistance the user needs. Law enforcement officers can use status information with respect to emotional conditions to determine whether de-escalation is appropriate or whether the user presents a threat.

Control of and by external devices (warnings)

[712] In one embodiment, the processor 1505 can be disposed to control the lenses 1502 in response to an external device. For example, the external device can include one or more of the following: a smartphone or mobile device, a local wi-fi hotspot, a sensor disposed on/in or near the user, a transponder disposed on an item of merchandise or on/in or near another user (such as another eyewear 1500), a local audio/video signal or a broadcast/narrowcast signal associated with an audio/video signal (such as an electromagnetic or ultrasonic signal provided by a “DJ” or another entertainer).

[713] The processor 1505 can be disposed to receive a signal from a source otherwise disposed to shock or surprise the user, such as a firearm (which might generate a loud sound and/or a muzzle flash), a flashbang grenade (which would generate both a loud sound and a bright flash), a demolition charge or other explosive (which might generate a warning, and would generate both a loud sound and possibly a bright flash), other electrical/chemical energy releases, other sudden noises or lighting changes, or as otherwise described herein. In such cases, the processor 1505 can be disposed to control the lenses 1502 so as to protect the user from the audio/video shock that might otherwise occur to the user.

[714] For example, when the user is a law enforcement officer, their eyewear 1500 can receive a signal from the user’s firearm that it is about to be discharged. In such cases, the processor 1505 can be disposed to interpret the signal and respond by adjusting the lenses (and possibly a set of earphones or earplugs) so as to prevent the user from having their eyesight or hearing disturbed by the firearm being discharged. Similarly, when a second law enforcement officer’s firearm is about to be discharged, the second officer’s firearm can be disposed to emit a signal to that effect; the first officer’s eyewear 1500 can receive that signal, whereupon the first officer’s eyewear’s processor 1505 can be disposed to similarly interpret the signal and respond by adjusting the lenses (and possibly a set of earphones or earplugs) so as to prevent the first officer from having their eyesight or hearing disturbed by the firearm being discharged.

[715] In such cases, the signal exchanged between one law enforcement officer's firearm and another officer's eyewear 1500 can be encrypted when sent (such as by the first officer's eyewear's processor 1505) and decrypted when received (such as by the second officer's eyewear's processor 1505), so as to prevent that signal from being received by an unwanted third party. This can allow a set of law enforcement officers to exchange signals indicating warnings of unexpected bright flashes or loud noises, without allowing unwanted third parties to make use of those signals.
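One plausible realization of this encrypted exchange (not specified by this Application) uses an authenticated symmetric cipher under a pre-shared team key. The sketch below uses AES-GCM from the Python 'cryptography' package; the message format, the lens/earphone interfaces, and the function names are assumptions, and key provisioning and replay protection are deliberately out of scope:

    # Hypothetical sketch: encrypt/authenticate a short pre-discharge
    # warning under a shared team key; receiving eyewear decrypts it and
    # pre-shades before the flash/report arrives.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    AAD = b"eyewear-warning-v1"   # binds ciphertext to this message type

    def send_warning(team_key: bytes, kind: bytes = b"FLASH+BANG") -> bytes:
        nonce = os.urandom(12)                        # fresh per message
        return nonce + AESGCM(team_key).encrypt(nonce, kind, AAD)

    def receive_warning(team_key: bytes, packet: bytes, lenses, earphones):
        nonce, ciphertext = packet[:12], packet[12:]
        # Raises an exception if the packet was forged or corrupted.
        kind = AESGCM(team_key).decrypt(nonce, ciphertext, AAD)
        if b"FLASH" in kind:
            lenses.shade(amount=1.0)        # assumed lens interface
        if b"BANG" in kind:
            earphones.attenuate(db=30)      # assumed earphone interface

    # team_key = AESGCM.generate_key(bit_length=128)  # provisioned off-line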

[716] For another example, when the user is a law enforcement officer or is military personnel, their eyewear 1500 can receive a signal from a flashbang grenade (or another explosive charge, such as an explosive used to break into a door or break down a wall) that it is about to be discharged. Similar to a firearm, the processor 1505 can be disposed to interpret the signal and respond by adjusting the lenses (and possibly a set of earphones or earplugs) so as to prevent the user from having their eyesight or hearing disturbed by the flashbang grenade being discharged. In such cases, the explosive charge can be disposed to emit a warning signal. The user's eyewear 1500 can receive that warning signal, whereupon the user's eyewear's processor 1505 can be disposed to interpret the warning signal and respond by adjusting the lenses (and possibly a set of earphones or earplugs) so as to prevent the user from having their eyesight or hearing disturbed by the explosive charge being discharged.

[717] Similarly, when the user is near an area where demolition charges or other explosives are in use (or about to be in use), the demolition charges or other explosives can be disposed to emit a warning signal that they have been triggered and will discharge in a selected amount of time. For example, the demolition charges or other explosives can be disposed to provide 30 seconds of warning, allowing users nearby to take cover. Alternatively, the demolition charges or other explosives (or their triggering devices) can be disposed to emit a warning signal so as to inform all users nearby of possible danger, or so as to inform all users nearby not to use electromagnetic signals that might accidentally trigger those explosives.

Enhancing or de-enhancing presentation of objects

[718] In one embodiment, when the user desires to enhance, or to block, particular recognizable objects from view when using the eyewear 1500, the processor 1505 can be disposed to highlight or to inverse-shade (so as to enhance) or to substantially dim or shade (such as to de-enhance) those particular objects in the user's field of view. For example, shading/inverse-shading can be performed using polarization, such as when the user's gaze is directed toward a display whose output includes polarized light, or when the user's gaze is directed toward a recognized object whose primary annoying or distracting features include polarized light (such as a computing device display or a neon light display).

[719] For one example, when the user is using a display and desires to perform a search for a particular word or phrase appearing on that display, the user can direct the processor 1505 to highlight examples of that word or phrase as presented on the display. This can have the effect that the word or phrase is made more visible to the user on the display, even when the eyewear is otherwise presenting the display to the user using shading/inverse-shading, coloring/tinting, or another audio/video effect. This can also have the effect that the word or phrase can be highlighted using an audio/video effect when presented to the user, even when the control program associated with the display is not otherwise disposed to enhance those particular objects in the user's field of view.
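As a sketch of one way this could be done (assuming word bounding boxes recognized from the outward camera's view of the display, and a hypothetical lens.inverse_shade_region() interface):

    # Hypothetical sketch: inverse-shade each recognized word region that
    # matches the wearer's search term, making the matches stand out even
    # when the display itself does not cooperate.
    def highlight_matches(lens, word_boxes, query: str, amount: float = 0.5):
        """word_boxes: iterable of (text, (x, y, w, h)) pairs in lens
        coordinates, e.g. from an OCR pass over the camera image."""
        q = query.lower()
        for text, box in word_boxes:
            if q in text.lower():
                lens.inverse_shade_region(box, amount=amount)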

[720] For another example, when the user is using a display, or is moving (such as walking or operating a vehicle) in a region, and desires to avoid distractions from selected recognizable objects, such as advertising, billboards, business signs, moving images, or as otherwise described herein, the user can direct the processor 1505 to substantially dim or shade those objects as presented by the eyewear 1500, so as to de-enhance or block display of those particular objects in the user's field of view. This can have the effect that the selected object is made less visible to the user in their field of view, thus less annoying or distracting to the user. This can also have the effect that the recognizable object can be de-enhanced or blocked in the user's field of view, even when the object is otherwise disposed to attempt to attract the user's attention to the maximum extent possible.

Control of and by external devices (medical conditions)

[721] In one embodiment, the eyewear 1500 can be disposed to receive a signal from an external device 1540. The external device can include a sensor disposed to receive medical information with respect to the user wearing the eyewear. For example, the eyewear 1500 might be disposed for use in one or more of the following (exemplary) circumstances:

— The eyewear 1500 can be disposed to detect a current medical condition (or predict an oncoming medical condition) of the user, and in response thereto, to exchange information with the user, with another eyewear, with another person, or with an external device.

— The eyewear 1500 can be disposed to receive information from another eyewear, another person, or an external device, and in response thereto, to request/receive information from the user.

[722] When the user is subject to (or is about to be subject to) a medical condition, the eyewear 1500 can be disposed to determine that medical condition, and in response thereto, the eyewear can be disposed to inform the user and guide the user toward obtaining assistance. In such cases, the processor 1505 can be responsive to one or more sensors coupled to the user, to the ambient environment, or to an external device having a capability of determining a medical status of the user. In response to the one or more sensors, the processor 1505 can be disposed to determine whether the user is subject to (or is about to be subject to) the medical condition. In response thereto, the processor 1505 can be disposed to control the lenses 1502, frame 1503, lamp 1504, mask 1531, or an external object coupled to the eyewear 1500.

[723] For example, the processor 1505 can be disposed to control the lenses 1502 to inform the user of the medical condition, or to assist in ameliorating the likelihood or severity of the medical condition. When the medical condition is migraine, photophobia, or neuro-ophthalmic disorder, the processor 1505 can be disposed to control the lenses 1502 to admit light in a selected frequency range such as 500-560 nm (green), to filter infalling light to alter the color balance toward that frequency range, to inject light in that frequency range, or as otherwise described herein. For those and similar medical conditions, the processor 1505 can be disposed to control the lenses 1502 to filter infalling light in another selected frequency range such as blue/ultraviolet, to filter infalling light or inject light to alter the color balance away from that frequency range, or as otherwise described herein. Similarly, the processor 1505 can also be disposed to control the lenses 1502 to alter the coloring/tinting of the user's field of view so as to reduce the likelihood of an oncoming medical condition, or so as to ameliorate the duration or severity of a current medical condition.
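A minimal sketch of such a wavelength-selective transmission profile follows; the specific transmission fractions are illustrative assumptions, not clinical values:

    # Hypothetical sketch: per-wavelength transmission favoring the
    # ~500-560 nm (green) band and strongly attenuating blue/ultraviolet,
    # per the color-balance behavior described above.
    def transmission(wavelength_nm: float) -> float:
        """Fraction of infalling light admitted at this wavelength."""
        if 500.0 <= wavelength_nm <= 560.0:
            return 0.90     # admit the green band
        if wavelength_nm < 450.0:
            return 0.05     # strongly filter blue/UV
        return 0.30         # attenuate the rest of the spectrum

    # e.g. transmission(530.0) == 0.90 and transmission(420.0) == 0.05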

[724] For another example, the processor 1505 can be disposed to control the frame 1503 so as to inform the user of the oncoming or current medical condition. (This presumes the user can relatively easily see the frame 1503, or a reflection thereof, in their field of view.) When the medical condition is migraine, photophobia, or neuro-ophthalmic disorder, the processor 1505 can be disposed to control the frame 1503 so as to inform the user of the oncoming or current medical condition, to encourage the user to avoid circumstances that might increase the chance, duration, or severity of the oncoming medical condition. For those and similar medical conditions, the processor 1505 can be disposed to control the frame 1503 so as to encourage the user to avoid circumstances in which the oncoming medical condition might pose a serious danger, such as driving at high speed or in heavy traffic, otherwise operating heavy machinery, or as otherwise described herein.

[725] For another example, when the user is subject to (or is about to be subject to) a medical condition, the processor 1505 can be disposed to control the frame 1503, the lamp 1504, or the mask 1531, or an external object coupled to the eyewear 1500, so as to inform other persons besides the user with respect to the medical condition.

[726] For example, when the processor 1505 determines that the user is subject to (or is about to be subject to) an emergency medical condition, such as a heart attack or stroke, the processor 1505 can be disposed to exchange signals with a smartphone or mobile device, so as to call upon emergency responders or other medical personnel (in the US, such as by dialing "911"). When possible, the processor 1505 can be disposed to send/receive medical information, either by a digital protocol, or by generating audio speech for a human telephone responder (as in the case when the processor has called upon emergency responders by dialing "911" or another emergency number).

[727] For another example, when the processor 1505 determines that the user is subject to a particular medical condition, and medical personnel are present, the processor can be disposed to exchange information with those medical personnel. The exchange of information can include the processor 1505 receiving requests and sending responses to medical personnel with respect to the user's condition. The requests and responses can be with respect to a history of the medical condition (recent or otherwise), a current status of the medical condition, and/or a predicted status of the medical condition. The requests and responses can also be with respect to any other associated activity by the user with respect to the medical condition (recent or otherwise), such as activities associated with triggering or exacerbating the medical condition, dangerous or potentially dangerous activities associated with the medical condition, self-care associated with the medical condition (recently or otherwise) performed or declined by the user, or as otherwise described herein.

[728] For another example, the requests and responses can be with respect to one or more effects of treatment of the medical condition by medical personnel. For example, when the medical condition is a stroke, the eyewear 1500 can include a sensor disposed to determine pupil diameters of each of the user’s eyes. In such cases, the processor 1505 can receive requests and send responses to medical personnel with respect to the user’s condition, such as in response to ongoing treatment of the medical condition and/or any relationship between the current state of the medical condition and any earlier (recent or otherwise) state of the medical condition.

Control of and by external devices (other conditions)

[729] In one embodiment, the processor 1505 can be disposed to control the lenses 1502, the frame 1503, the lamp 1504, or the mask 1531, or an external object coupled to the eyewear 1500, with respect to a condition that is not strictly a “medical” condition. For example, the user’s condition might include drunkenness (or otherwise being under the influence of an intoxicant), excessive tiredness, uncontrolled rage, or other socially notable conditions.

[730] For example, when the processor 1505 determines that the user is intoxicated or otherwise impaired, it can be disposed to control the lenses 1502 or the frame 1503 to inform the user of that determination. In relatively early stages of intoxication, the user might be able to use this information to cease or limit their drinking so as to be able to recover in a reasonable time, or at least to be able to take steps to find alternate means to achieve goals otherwise not performable when impaired. The user might call, or ask another person to call, a taxi to get home instead of attempting to drive by themselves.

[731] For another example, when the processor 1505 determines that the user is intoxicated or otherwise impaired, it can be disposed to control the frame 1503, the lamp 1504, or the mask 1531, to so inform other persons. The other persons can include bartenders or other servers who might be made aware that the user should not be served additional alcohol, taxi drivers or volunteers who might be made aware that the user can use assistance in getting home, or possibly other persons.

[732] For another example, the sensor couplable to the user, or the external device capable of determining a medical status of the user, can include one or more of the following: an Apple Watch™, a Fitbit™, a blood oximeter, a blood pressure monitor, a heart rate monitor, a mood-sensing ring, a thermometer or other temperature sensor, or as otherwise described herein. This can have the effect that the user (and possibly others) can obtain relatively quick feedback/information with respect to the user's physical state, without having to perform any measurement or review any measuring device. This can be useful when the user is engaged in a sport or another activity requiring consistent attention and/or rapid reactions, such as operating an aircraft, race car, motorcycle, dirt bike (or many other vehicles); playing a video game (particularly a "first-person shooter"); or as otherwise described herein.

[733] For another example, a first eyewear 1500 can be disposed to receive a signal from an external device, wherein the external device is disposed to provide the processor 1505 with information from other persons observing the user wearing the first eyewear 1500. For example, the eyewear 1500 might be disposed to inform the user whether a substantial number of nearby persons evaluate the user as being intoxicated or otherwise having an impaired condition. In such cases, the user might take advantage of feedback from the "wisdom of crowds" so as to re-evaluate their own condition, and possibly take ameliorative action or avoid any dangerous circumstances they might undertake.

Control by external device (entertainment device)

[734] In one embodiment, the eyewear 1500 can be disposed to receive a signal from an external device, wherein the external device is disposed to tell the processor 1505 to present an AR/VR image (and/or sound) to the user of the eyewear. For example, the eyewear 1500 might be disposed for use with one or more of the following (exemplary) systems:

[735] The user might be observing or participating in a show, such as in a movie theater, as part of an outdoor presentation, as part of an interactive presentation, or in/on a ride (e.g., one having special effects as part of a presentation associated with the ride). The AR/VR image and/or sound might be available to enhance the show; to comment on or provide information about the show; to provide closed-captioning, subtitles, or translation; to assist those users with sight or hearing impairment; or as otherwise described herein. In such cases, the AR/VR image and/or sound can be provided to enhance the user's experience or to make up for the user's inability to fully experience the show.

[736] The user might be operating a vehicle, such as an aircraft (e.g., a fixed-wing aircraft, glider or ultralight aircraft, helicopter, model aircraft or drone, or as otherwise described herein), watercraft (e.g., a motorboat, cigarette boat, jet-ski, or as otherwise described herein), or ground vehicle (e.g., race car, sports car, motorcycle, dirt bike, bicycle, ATV, golf cart, or as otherwise described herein). The AR/VR image and/or sound might be available to provide information about operating the vehicle useful to the user but not easily available to unaided human senses, e.g., information about the ambient environment in which the vehicle is disposed to travel, information about the vehicle’s speed or heading, information about other vehicles not within the user’s field of view, or as otherwise described herein; to provide enhancements or warnings to the user with respect to particular sensor data, e.g., such as when an engine is near failure or when a fuel supply is particularly low; to provide information from an instructor or observer with respect to the user’s operation of the vehicle; to assist those users with sight or hearing impairment; or as otherwise described herein. In such cases, the AR/VR image and/or sound can be provided to assist the user with information useful for operating the vehicle.

[737] The user might be observing or participating in a sporting event, such as in or on a field or sporting arena (e.g., an individual sport, a relay event, or a team sport); a re-creation or re-enactment of an historical event or an alternative version thereof (e.g., an American Revolutionary War or American Civil War battle re-enactment); a role-playing game or live-action role-playing game; a geo-caching event, orienteering event, or scavenger hunt; a sporting event having special effects as part of the event; or as otherwise described herein. Similar to observing a show, the AR/VR image and/or sound might be available to enhance the sporting event; to comment on or provide information about the sporting event, such as statistical information about players or historical information about re-enactments; to assist those users with sight or hearing impairment; or as otherwise described herein. In such cases, the AR/VR image and/or sound can be provided to enhance the user's experience or to make up for the user's inability to fully experience the sporting event.

[738] The user might be observing or participating in an instructional, testing, or training exercise, such as a military field training exercise, a search/rescue training exercise, a law enforcement training exercise, a firefighting training exercise, or as otherwise described herein.

[739] The user might be observing or participating in a computer game or video game, such as a first-person shooter, a role-playing game, or as otherwise described herein. For example, similar to a sporting event, the computer game or video game can include a game imitating/simulating a sporting event, such as described above, including an event performed on a field or sporting arena, a re-creation or re-enactment of an historical or alternate-historical event, a role-playing game or live-action role-playing game, a geo-caching event, orienteering event, or scavenger hunt; a sporting event having special effects as part of the event; or as otherwise described herein. For example, also similar to a sporting event, the computer game or video game can include special effects, such as generated by a computing device, a video game device, or a smartphone or other mobile device, and received by the eyewear and presented to the user, including one or more AR/VR images and/or sound as part of the computer game or video game.

[740] For example, the computer game or video game can include a “first-person shooter”, in which the player attempts to use one or more simulated weapons against targets presented by the game. Such a game can be used to test the user’s skill at identifying and aiming at targets. Similarly, such a game can be used to test the user’s skill at distinguishing between targets at which to aim (such as criminals) and targets at which not to aim (such as innocent civilians). The computer game or video game can be disposed to determine a skill level of the user, or to allow an instructor/trainer or observer/reviewer to determine a skill level of the user. In such cases, the user’s skill level can be used to determine whether the user passes a skill test or should have additional training.

Fig. 16 — Controlling external devices

[741] Fig. 16 (collectively including Figures 16A-D) shows a conceptual drawing of eyewear capable of controlling external devices or being controlled by external devices.

[742] In one embodiment, eyewear 1600 can include one or more of the following:

— An audio/video signal transmitter/receiver 1601.

— One or more lenses 1602 disposed between a user (not part of the eyewear) and a field of view (not part of the eyewear).

— A camera 1603 disposed to receive information with respect to one or more gestures provided by the user.

— A processor 1604 coupled to the audio/video signal transmitter/receiver, the lenses, or the camera.

Controlling external devices

[743] Fig. 16A shows a conceptual drawing of eyewear capable of controlling an external device.

[744] In one embodiment, the eyewear 1600 can be disposed to use the camera 1603 to receive information with respect to one or more gestures provided by the user. The camera 1603 can be disposed to receive information with respect to the user’s eyes (thus, directed inward toward the user’s eyes/face), so as to receive information with respect to one or more eye/face gestures, or can be disposed to receive information with respect to other user gestures (thus, directed outward toward one or more portions of the user’s field of view). The other user gestures can include one or more of: hand/finger gestures, movement of other body parts, or other gestures within the user’s field of view, as otherwise described herein.

[745] The processor 1604 can be disposed to receive information from the camera 1603 within or through the lenses 1602 so as to determine a direction in which the user is looking, thus performing a function of a dynamic eye tracking mechanism. The dynamic eye tracking mechanism can be disposed to provide sufficient information so as to allow the processor 1604 to determine one or more gestures being performed by the user. The processor 1604 can be disposed to use that information to control the audio/video signal transmitter/receiver 1601 to exchange information with an external device 1605.

[746] For example, the external device 1605 can include a smartphone or other mobile device. In such cases, when the user is looking at a smartphone (or other mobile device), the eyewear can send a signal to the mobile device to control the mobile device in response to one or more user controls. For example, user controls can include capacitive or touch controls, eye or face gestures, finger or hand gestures, head or mouth movements, voice commands, electromagnetic commands from another device, other user commands described herein, or other ways the user can direct the eyewear. In such cases, the processor 1604 can be disposed to direct the mobile device in one or more of the following ways:

— The processor 1604 can be disposed to direct the mobile device to highlight a designated portion of the mobile device’s screen. Thus, when the user is looking at a particular portion of the screen, the eyewear can direct the mobile device to highlight, or to shade/inverse-shade, only that portion of the screen. This can have the effect that the mobile device can show the user just what the user is looking for, such as a particular portion of a display or screen on the mobile device. Other possible advantages include saving power usage or battery time: the mobile device can be disposed to only light/highlight those portions of the display or screen at which the user is looking, or the mobile device can be disposed to provide optimal brightness for the user to view the display or screen without excessive power usage or battery drain.

— The processor 1604 can also be disposed to direct the mobile device to urge the user to review selected portions of the display or screen. For example, the mobile device can be disposed to adjust a highlighted portion of the display or screen so as to urge the user to move their gaze direction while reading, similar to a reading speed prompt (which can improve the user’s reading or comprehension). For another example, the mobile device can be disposed to select a portion of the display or screen to highlight so as to prompt the user to move their gaze direction to an alert or notification. For another example, the mobile device can be disposed so as to prompt the user to move their gaze direction away from the display or screen, with the effect of causing the user to relax their eyes and focus at a greater distance than the mobile device.

— The processor 1604 can also be disposed to direct the mobile device to alter its parameters for control of the display or screen. For example, the processor 1604 can be disposed to direct the mobile device to act as if it had received one or more capacitive control or touch commands at selected locations, such as on the display or screen. For another example, the processor 1604 can be disposed to direct the mobile device to act as if it had received one or more button presses or other controls, such as to turn on/off or to adjust brightness or volume. For another example, the processor 1604 can be disposed to direct the mobile device to act as if it had received a particular facial recognition input, fingerprint recognition input, or other biometric input.

[747] For another example, the processor 1604 can be disposed to control the external device 1605 to adjust an angle or amount of polarization, a color or color balance (or a different set of colors in a varying color pattern), or otherwise as described herein, on a portion of the screen being viewed by the user. The processor 1604 can also be disposed to control the external device 1605 to direct the mobile device to increase a magnification, or to impose other visual effects, on a portion of the screen being viewed by the user. For example, the processor 1604 can also be disposed to control the external device 1605 to cause that portion to blink, or otherwise change a way that portion can be viewed by the user.

[748] For another example, the processor 1604 can be disposed to control multiple such external devices 1605 (thus, having multiple such displays or screens), or multiple displays or screens coupled to a single such external device 1605. In response to a dynamic eye tracking mechanism, the processor 1604 can be disposed to determine whether the user is looking at a first screen 1606a or a second screen 1606b, and to cause the screen being looked at (an “active” screen 1606a) to have a first visual effect and the screen not being looked at (an “inactive” screen 1606b) to have a second visual effect. For example, the processor 1604 can be disposed to control the inactive screen 1606b to be substantially dimmed, so the user is not subject to excessive brightness directed at their peripheral vision, and so the user is not otherwise distracted. For another example, the processor 1604 can be disposed to control the inactive screen 1606b to have its color balance altered so as to be less attractive to the user’s eye; for example, the inactive screen 1606b can be filtered to be more amber, so as to reduce peripheral-vision brightness in the blue portion of the visual spectrum, or the inactive screen can be directed to provide green light, so as to prevent or reduce the likelihood of, or to treat or reduce the severity of, migraines.
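
As a minimal illustrative sketch (not part of the original disclosure), the active/inactive screen behavior described above might be modeled as follows; the Screen type, its angular extents, and the gaze test are assumptions introduced here for illustration, and in practice the gaze azimuth would come from the dynamic eye tracking mechanism:

    from dataclasses import dataclass

    @dataclass
    class Screen:
        name: str
        az_min_deg: float   # screen's horizontal extent in the user's field of view
        az_max_deg: float
        brightness: float = 1.0
        color_filter: str = "none"

    def update_screens(screens, gaze_azimuth_deg):
        """Restore the screen being looked at; dim and amber-filter the rest."""
        for s in screens:
            if s.az_min_deg <= gaze_azimuth_deg <= s.az_max_deg:
                s.brightness, s.color_filter = 1.0, "none"   # active screen 1606a
            else:
                s.brightness, s.color_filter = 0.2, "amber"  # inactive screen 1606b

    screens = [Screen("left", -40.0, -5.0), Screen("right", 5.0, 40.0)]
    update_screens(screens, gaze_azimuth_deg=20.0)  # user looks toward the right screen
    print(screens)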

Controlling vehicles

[749] Fig. 16B shows a conceptual drawing of eyewear capable of controlling a vehicle.

[750] In one embodiment, the eyewear 1600 can be disposed to allow the user to control an external device 1605 such as a vehicle (for example, an aircraft, speedboat or other watercraft, or a racing car or other ground vehicle). For example, the processor 1604 can be disposed to determine one or more controls in a control panel 1607 at which the user is looking, and to operate the vehicle 1605 by manipulating one or more of those controls. For another example, the processor 1604 can be disposed to operate one or more such controls in response to a gesture for triggering the control. The gesture can include an eye/face gesture, a hand/finger gesture, a movement of another body part, a voice control, another gesture as otherwise described herein, or a combination thereof. The user can thus use a gesture in combination with a gaze direction or focusing distance to assist the user with operating the vehicle, or to operate the vehicle entirely by such combinations of gaze and gesture.

[751] For example, the processor 1604 can be disposed to direct one or more of the following controls in response to gestures by the user:

— starting the vehicle, controlling “cruise control” or other automatic driving controls, controlling other controls relating to electric vehicles such as golf carts,

— setting a temperature or related controls, turning on/off air conditioning or defrosters or related controls, operating a radio or related equipment, opening/closing doors or windows, opening/closing an engine hood or a trunk, opening/closing a gas or other fluid entry, extruding/retracting cup holders or related equipment,

— turning on/off internal lights or displays, turning on/off or adjusting external lights,

— presenting/highlighting alerts such as from the engine or fuel reserves, or as otherwise described herein.

[752] For another example, the processor 1604 can be disposed to direct one or more controls in a control panel 1607 while the user is operating the vehicle and is occupied with maintaining eye contact with a path of travel, such as watching a road while operating a racing car, or such as watching an airspace for air traffic while operating an aircraft. Similarly, it might occur that some control elements are available on a control panel 1607 (such as on a dashboard) while other control elements are available on another portion of the vehicle (such as on a steering wheel or control yoke), and it might be desirable for the user to maintain their hands on only one of those controls. In such cases, the processor 1604 can be disposed to direct one or more controls in a control panel 1607 while the user is maintaining contact with a different vehicle control. Similar to other controls described herein, the processor 1604 can be disposed to respond to one or more of: a user gaze direction, a user hand/eye gesture or other body movement, a voice control, or as otherwise described herein, so as to direct operation of a first vehicle control while the user otherwise maintains operation of a second vehicle control.

[753] Some possible examples can include, in an aircraft, that the processor 1604 is disposed to direct the operation of engine and/or flight surface controls in response to controls from the user, such as eye/facial gestures, hand/finger gestures (such as possibly hand/finger gestures as described in the Incorporated Disclosures), other body gestures, voice controls, or as otherwise described herein.

— For example, the processor 1604 can be disposed to use a dynamic eye tracking mechanism to determine when the user performs a first selected gesture (such as glancing in a selected direction 1-3 times) while concurrently performing a second selected gesture (such as blinking or squinting). In response thereto, processor 1604 can be disposed to change a throttle setting or a flight control surface setting.

— For another example, the processor 1604 can be disposed to use a dynamic eye tracking mechanism to determine when the user performs a first selected gesture (such as glancing rightward/leftward) while concurrently performing a second (not necessarily different) selected gesture (such as a hand wave or a finger touch at a selected location on a control panel 1607) to execute a slip or turn.

— For another example, the processor 1604 can be disposed to use a dynamic eye tracking mechanism to determine when the user glances or looks at a particular control element and concurrently performs a selected gesture to operate that control element. Thus, the processor 1604 can be disposed to determine when the user looks at an artificial horizon and performs a thumbs-up/down gesture to raise/lower an elevator control.

— For another example, the user can perform other combinations of actions, such as described herein, to operate other aircraft controls (including such possibilities as operating cabin lights, a radio, or otherwise as described herein).

[754] Similarly, in an automobile or racing car, the processor 1604 can be disposed to determine when the user performs other combinations of actions (thus, of eye/facial gestures, hand/finger gestures, and/or other body movements) to operate other vehicle controls.

— For example, the processor 1604 can be disposed to determine when the user performs one or more eye/facial gestures, hand/finger gestures, other body movements, voice controls, or as otherwise described herein. In response thereto, the processor 1604 can be disposed to direct one or more controls, such as accelerator/brake, gearing, and/or turning controls.

— For example, the processor 1604 can be disposed to determine when the user performs a first selected gesture, such as glancing upward/downward 1-3 times (in succession) while concurrently performing a second (not necessarily different) selected gesture (such as blinking or squinting) to apply/relax an accelerator and/or apply/release a brake.

— For another example, the processor 1604 can be disposed to determine when the user performs a first selected gesture, such as glancing rightward/leftward, while concurrently performing a second (not necessarily different) selected gesture, such as a hand wave or a finger touch at a selected location on a wheel, to execute a turn.

— Alternatively, the user can perform other combinations of actions, such as described herein, to operate other vehicle controls (including such possibilities as operating doors, windows, locks, a trunk, or otherwise as described herein).
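
The gesture combinations described above might be modeled, as a minimal sketch and not as the disclosed implementation, by a simple dispatch table; the gesture names and control actions are illustrative assumptions:

    from typing import Optional

    CONTROL_MAP = {
        # (first selected gesture, second concurrent gesture) -> vehicle control
        ("glance_up", "blink"): "accelerator_apply",
        ("glance_down", "blink"): "brake_apply",
        ("glance_left", "hand_wave"): "turn_left",
        ("glance_right", "hand_wave"): "turn_right",
    }

    def dispatch(first_gesture: str, second_gesture: str) -> Optional[str]:
        """Return the control triggered by a concurrent gesture pair, if any."""
        return CONTROL_MAP.get((first_gesture, second_gesture))

    print(dispatch("glance_up", "blink"))      # accelerator_apply
    print(dispatch("glance_up", "hand_wave"))  # None: unmapped pair, no action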

Controlling other devices

[755] Fig. 16C shows a conceptual drawing of eyewear capable of controlling another device, such as weapons or other devices.

(Weapons)

[756] In one embodiment, the external device 1605 can include a weapon, such as a pistol, rifle, or other firearm; a taser or similar weapon; a grenade, “flashbang” grenade, or tear gas grenade; or as otherwise described herein. For example, the weapon might be disposed for use by a law enforcement officer or by military personnel; the weapon might include a safety mechanism so as to prevent the user from accidentally setting off the weapon while it is unaimed or while it is aimed at an unintended target. Similarly, it might be desirable to prevent the weapon from being used by anyone other than its issued recipient: for example, it might be desirable to disable a law enforcement officer’s service pistol if it is lost or if it is taken away by a criminal.

[757] For example, the external device 1605 can be disposed to include both the weapon and a safety mechanism 1605a (which might be disposed as part of the weapon, of course), the safety mechanism 1605a being disposed to be controlled by the processor 1604 by exchanging one or more messages 1605b therewith. The processor 1604 can be disposed to determine, in response to a dynamic eye tracking mechanism, when the user desires to turn the safety mechanism 1605a on/off, and disposed to exchange one or more messages with the safety mechanism 1605a to direct it to do so. This can provide the user with a technique for turning the safety mechanism 1605a on/off without having to adjust the direction in which the weapon is aimed and without having to use both hands to do so.

[758] For example, the processor 1604 can be disposed to determine when the user performs a selected gesture, such as an eye/face gesture, voice gesture, or as otherwise described herein. In response thereto, the processor 1604 can be disposed to exchange one or more messages with the safety mechanism 1605a, with the effect of directing the safety mechanism 1605a to turn itself on/off and thus to disable/enable the weapon. For aimed weapons, such as pistols, this can have the effect that the weapon can be disabled/enabled without having to adjust the aim of the weapon. For weapons that are more easily enabled than disabled, such as grenades (including flashbang grenades), this can have the effect that the weapon can be disabled even after it has been enabled and even after it has been thrown (or launched), particularly if it is discovered that the weapon was misdirected.

[759] In one alternative, the processor 1604 can be disposed to receive a set of explicit non-targets, from the user or in response to a set of identify-friend-or-foe (IFF) transmitters. For example, the explicit non-targets can include non-suspect citizens or other law enforcement officers. This can have the effect that the weapon is not accidentally discharged at any of those explicit non-targets, even when the weapon is mis-aimed or when its aim is altered after targeting and before actual triggering. The processor 1604 can be disposed to identify the explicit non-targets in response to one or more of (A) an explicit IFF signal, which might be encrypted to prevent spoofing, or (B) a facial recognition program, which might be pre-loaded with selected facial recognition information relating to persons already known to law enforcement officers to be non-targets.

[760] In another alternative, the external device 1605 can include both the weapon and an iris scanner or other biometric device 1605b. For example, one or more of the iris scanner or other biometric device 1605b can include a camera associated with a program operating on the processor 1604 and disposed to identify one or more pre-selected persons such as the law enforcement officer associated with the weapon. For example, the biometric device 1605b can be disposed to enable/disable the weapon in response to whether the processor 1604 determines that the weapon’s user is properly associated with the weapon and is properly aiming the weapon. The weapon can be associated with more than one such user, such as a set of law enforcement officers who work together. This can have the effect that if the law enforcement officer is not looking while aiming the weapon, or if the law enforcement officer loses or has their weapon taken away, the safety mechanism will be automatically engaged, and the weapon will not be improperly usable.
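
A minimal sketch of the enable/disable logic described in this alternative, assuming an iris-scan identifier plus azimuth angles from the dynamic eye tracking mechanism and the weapon's aim indicator; the identifiers, tolerance, and function name are illustrative assumptions, not the disclosed implementation:

    AUTHORIZED_IRIS_IDS = {"officer_0412", "officer_0977"}  # illustrative identifiers

    def weapon_enabled(iris_id: str, gaze_azimuth_deg: float,
                       aim_azimuth_deg: float, tolerance_deg: float = 2.0) -> bool:
        """Keep the safety engaged for unknown users, or when the user is not
        looking where the weapon is actually aimed."""
        if iris_id not in AUTHORIZED_IRIS_IDS:
            return False  # lost or taken weapon: safety stays on
        return abs(gaze_azimuth_deg - aim_azimuth_deg) <= tolerance_deg

    print(weapon_enabled("officer_0412", 10.0, 10.5))  # True: authorized and aligned
    print(weapon_enabled("unknown_user", 10.0, 10.0))  # False: not authorized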

[761] Similarly, in another alternative, the external device 1605 can include the weapon, a biometric device 1605b, and a laser sight 1605c or other indicator of where the weapon is being aimed. The laser sight 1605c can be optical, so as to allow the user to see where the weapon is aimed, or infrared (IR), so as to prevent others from seeing where the weapon is aimed. The processor 1604 can be disposed to determine, in response to a dynamic eye tracking mechanism, where the user is looking, and to determine, in response to the laser sight 1605c, where the weapon is actually aimed (thus, whether the weapon is actually aimed where the user is looking). When the weapon is mis-aimed, the processor 1604 can be disposed to perform one or more of the following:

— The processor 1604 can be disposed to direct the eyewear 1600 to indicate, such as using an AR/VR display on the lenses, that the weapon is mis-aimed. For example, the processor 1604 can be disposed to present a flashing light or another alarm to the user to warn against triggering the weapon when mis-aimed.

— The processor 1604 can be disposed to exchange one or more messages with the safety mechanism 1605a to disable the weapon from being triggered when mis-aimed or enable the weapon to be triggered only when properly aimed.

— The processor 1604 can be disposed to direct the eyewear 1600, such as using an AR/VR display on the lenses, to indicate when the weapon is properly aimed, or at least at what target the weapon is actually aimed. For example, when the laser sight 1605c is not visible to the human eye, the processor 1604 can be disposed to direct the eyewear 1600, such as using an AR/VR display on the lenses, to present an image of where the laser sight 1605c would appear if it were visible, thus without revealing the target to suspects.

(Personalization)

[762] In one embodiment, the processor 1604 can be disposed to detect one or more gestures or other controls by the user and to operate the external device itself in response thereto. For example, when the external device includes a weapon, the processor 1604 can be disposed, such as using a camera or a dynamic eye tracking mechanism, to determine when the user performs one or more eye/face gestures, voice commands, or other gestures, so as to trigger the weapon. Thus, the processor 1604 can be disposed, in response to the one or more gestures, to trigger the weapon without the user having to use their hands. For example, when the user is a law enforcement officer, the processor 1604 can be disposed to detect a selected eye/face gesture and to perform an associated operation such as “squint to shoot” or “wink to shoot” with a pistol, firearm, or other weapon.

[763] In one embodiment, the user (or another person associated with maintenance of the weapon) might personalize a selection of gestures and the actions associated therewith. Alternatively, the eyewear or digital eyewear can be disposed to adjust to user gesture capability when determining its sensitivity to those gestures. For example, a user who can easily manipulate their nose might be offered a selection of gestures associated with nasal movements, such as flaring the nostrils, raising the nose bridge, or wiggling the nose; while a user unable to easily perform such actions might be offered a different selection of gestures.

(Other external devices)

[764] In other embodiments, the user can include other personnel and the external device can include other types of devices. For example, the user and external device can include one or more of the following:

— The user can include an emergency responder or one or more medical personnel and the external device can include medical equipment, such as a sensor or an operating tool. In such cases, the processor 1604 can be disposed to determine when a patient condition should have attention, such as when responding to a medical emergency, performing a surgical operation (or a dental operation), or in other circumstances when the medical personnel might have their attention elsewhere. For example, the processor 1604 can be disposed to receive one or more trigger points for which it is assigned to present an indicator that the sensor reading is out of its normal range. In such cases, the processor 1604 can be disposed to present that information, such as using AR/VR images or sound, to the medical personnel.

— For example, during or prior to a medical procedure, the medical personnel can select the warning trigger for the medical sensor, such as using an eye/facial gesture recognizable by the processor 1604. When the medical personnel are performing the medical procedure, the processor 1604 can be coupled to the medical sensor. If the medical sensor presents a sensor value that satisfies the warning trigger, the processor 1604 can be disposed to cause the AR/VR images or sound to present the warning to the medical personnel without the latter having to continually direct their attention or their gaze toward the medical sensor during the procedure.
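
As a minimal sketch of the warning-trigger behavior just described (the trigger points, readings, and function name are illustrative assumptions):

    def sensor_alerts(readings, low, high):
        """Yield an AR/VR warning whenever a reading leaves [low, high]."""
        for value in readings:
            if not (low <= value <= high):
                yield f"WARNING: sensor reading {value} outside {low}-{high}"

    # e.g. trigger points selected by the medical personnel before the procedure
    for alert in sensor_alerts([72, 70, 48, 74], low=50, high=120):
        print(alert)  # presented as an AR/VR image or sound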

[765] For another example, when the user is participating in a sport, the processor 1604 can be disposed to, in response to a dynamic eye tracking device, identify a direction in which they are looking. The processor 1604 can be disposed to match that direction with a direction in which the user is directing sports equipment. For example, for a golfer attempting a putt, the eyewear or digital eyewear can be disposed to show a direction in which the ball would move given the angle of the putter and the degree of backswing the player is allocating; when this lines up with a direction the player is looking, the processor 1604 can be disposed to present a confirming notification. When so programmed and when such data is available, the processor 1604 can be disposed to compute a likely path in response to a contour map of a putting green. When so programmed and when such data is available, the processor 1604 can be disposed to compute a likely path in response to a wind direction and strength.
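
A minimal sketch of the putt-alignment confirmation, assuming the predicted ball path and the gaze direction are each reduced to a horizontal angle; the tolerance is an illustrative assumption:

    def putt_aligned(putter_angle_deg: float, gaze_angle_deg: float,
                     tolerance_deg: float = 1.5) -> bool:
        """True when the predicted ball path matches where the player is looking."""
        return abs(putter_angle_deg - gaze_angle_deg) <= tolerance_deg

    if putt_aligned(putter_angle_deg=12.0, gaze_angle_deg=12.8):
        print("confirming notification: putt lined up")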

[766] For another example, the processor 1604 can be disposed to send an electromagnetic or other signal to the external device 1605, so as to allow a user to control that external device 1605 using eye/face gestures or other gestures recognizable by the processor 1604. In such cases, the processor 1604 can be disposed to detect the user’s eye/face gestures and to send one or more appropriate messages (such as electromagnetic or other signals) to the external device 1605 so as to operate one or more of the external device’s functions.

[767] Some additional possible examples can include one or more of the following:

— The processor 1604 can be disposed to control an external device 1605 such as a garage door or other automatic device by the user using one or more eye/face gestures. The eyewear or digital eyewear can detect the one or more eye/face gestures and, in response thereto, send an electromagnetic signal to the garage door to cause it to open/close, as the user instructs.

— The processor 1604 can be disposed to control an external device 1605 such as a security door in response to an iris scanner or other biometric scanner 1605d. The processor 1604 can be coupled to the iris scanner or other biometric scanner 1605d and can send one or more messages to the security door to cause it to open/close, as the user instructs.

— The processor 1604 can be disposed to control an external device 1605, so as to emulate an automobile key, an entertainment device, a game controller, a house lights controller, a laptop (or other computing device) keyboard or pointing device, a sound system controller, a television remote, a universal remote, or any other remote controller. The processor 1604 can respond to eye/face gestures and in response thereto, send one or more messages to external devices 1605 so as to emulate an appropriate controller, as the user instructs.

— The processor 1604 can be disposed to control an external device 1605 in response to an RFID transponder. The processor 1604 can be coupled to the RFID transponder and can allow the transponder to operate with the external device 1605. Where applicable, the user can send one or more signals to control the external device in response to eye/face gestures.

Control using other devices

[768] Fig. 16D shows a conceptual drawing of eyewear capable of being controlled using another device, such as using control signals.

[769] In one embodiment, the processor 1604 can be disposed to cooperate with one or more external devices 1605, so as to identify one or more control signals from the user and so as to adjust operation of the eyewear 1600, the external devices 1605, or both, in response thereto. For example, the external devices 1605 can include a smartphone or mobile device, such as a mobile device 1605e including a camera 1605f disposed to capture one or more images of the user and having the processor 1604 disposed to operate on those images to detect one or more eye/face gestures, or other gestures, by the user.

[770] For example, the mobile device 1605e can include its own separate processor (not shown), which can be disposed to recognize one or more eye/face gestures, hand/finger gestures, or other gestures, by the user and to adjust one or more of the following:

— A level or volume with respect to music or other audio/video presentation to the user. For example, the user can raise/lower the volume until satisfied.

— An offer of, or receipt of, a screen-sharing or other AR/VR communication with another set of eyewear 1600. For example, the user can present their own field of view to another user who is willing to receive it.

— An operation of the smartphone or mobile device 1605e, such as to make or take a call, send or read a text message (possibly using an AR/VR display with the eyewear or digital eyewear), send or read a social media communication, or as otherwise described herein. For example, the user can communicate with another user using social media or otherwise.

— A shading/inverse-shading or coloring/tinting control with respect to one or more of the lenses 1602. For example, the user can alter shading/inverse-shading or coloring/tinting until satisfied.

— A zoom or distant focus control. For example, the user can “zoom in” or out, or alter their depth of focus. The user might also use this type of control when playing a video game.

Or as otherwise described herein.

[771] For example, the mobile device 1605e can be disposed to recognize one or more features of an ambient environment, such as a measure of luminance, a measure of coloring/tinting, a measure of audio/video complexity or other interference with possible visual acuity, or as otherwise described herein. When the mobile device detects features of the ambient environment which indicate that an adjustment of shading/inverse-shading or coloring/tinting is called for, the mobile device can signal the eyewear or digital eyewear to make that adjustment.

[772] For another example, the user can direct the mobile device 1605e to cause the eyewear 1600 to make such adjustments, in response to the user’s preference in the moment. Thus, rather than requiring the user to pause operation of the mobile device 1605e so as to operate the eyewear 1600, the user can direct the mobile device 1605e to make any adjustments with respect to shading/inverse-shading, coloring/tinting, or other effects as the user might desire, in response to parameters detected in response to the user’s actual use of the mobile device 1605e.

[773] In another embodiment, the external device 1605 can include a vehicle 1605g having a set of controls disposed to operate the eyewear or digital eyewear. The vehicle 1605g can be real or virtual (such as in an AR/VR environment, or such as in a simulation or video game). For example, the controls can be disposed on a dashboard, on a steering wheel or control yoke, on a detachable control device, or as otherwise described herein. The controls can include one or more of the following:

— A control to adjust shading/inverse-shading, coloring/tinting or color balance, refraction, polarization, prismatic deflection, or other audio/video effects.

— A control to set automatic adjustment of one or more audio/video effects, such as setting a threshold at which one or more such audio/video effects are performed.

— One or more sensors disposed to detect objects and/or proximity at a side of the vehicle.

Or as otherwise described herein.

[774] When in use with a vehicle 1605g, the eyewear 1600 (or a processor 1604 included therein) can be disposed to receive one or more messages from the vehicle 1605g indicating its state, and possibly warnings with respect to the vehicle’s status and/or proximity. For example, when backing up, the vehicle 1605g can be disposed to send a message to all nearby sets of eyewear 1600, each of which can alert its own user of a possible hazard. For example, the user can be shown a flashing screen or a flashing icon, a warning color/tint (e.g., red), or a warning message. Similarly, when operating a vehicle 1605g that is backing up, the operator of that vehicle can be warned of any objects in the way or in proximity thereto, in a similar manner.

Fig. 17 - Hand/finger gesture sensor

[775] Fig. 17 shows a conceptual drawing of eyewear capable of including a hand/finger gesture sensor.

[776] As described with respect to the Incorporated Disclosures, the eyewear 1700 can include one or more of the following:

— A hand/finger gesture sensor 1710 disposed to determine one or more of (A) a relative distance or motion with respect to the hand/finger gesture sensor; or (B) another gesture or user input, such as an eye/face gesture, a voice input, or as otherwise described herein.

[777] In one embodiment, the hand/finger gesture sensor 1710 can be coupled to a control circuit 1720, such as a processor or another computing device, and program/data memory, disposed to perform the functions described herein with respect to use of the sensor. The functions with respect to use of the sensor can include one or more of the following:

— Determining a presence of a hand/finger 1730 (not part of the eyewear 1700) near the sensor 1710.

— Determining a relative distance of the hand/finger 1730 with respect to the sensor 1710.

— Determining a relative motion of the hand/finger 1730 with respect to the sensor 1710, such as whether the hand/finger is being moved closer/further with respect to the sensor, or whether the hand/finger is being moved at substantially right angles with respect to the sensor.

Type of hand/finger gesture sensor

[778] For example, the control circuit 1720 can determine one or more of the presence, relative distance, or relative motion, of the hand/finger 1730 by one or more of the following:

— The control circuit 1720 can include a light sensor 1721 (camera, photocell, photomultiplier, or as otherwise described herein) and can determine whether the hand/finger 1730 casts a shadow on (or otherwise obscures) the light sensor, such as by measuring a luminance level or a luminance level at selected frequencies.

— The control circuit 1720 can include a capacitive or touch sensor 1722, such as disposed to detect a touch or near-touch by the hand/finger 1730.

— The control circuit 1720 can include a camera 1723, such as disposed with a processor or another computing device and program/data memory, to detect an eye/facial gesture, such as a blink/wink, an eye/eyebrow movement, a nose gesture, a smile/smirk or grimace, a squint, a tongue gesture, or as otherwise described herein.

— The control circuit 1720 can include an electromagnetic or other emitter 1724, such as possibly an electromagnetic or ultrasonic emitter, disposed to cause a signal to be reflected from the hand/finger 1730 and received by a corresponding receiver 1725. In such cases, the control circuit 1720 can be disposed to measure a flight time or other indicator of a relative distance between the hand/finger 1730 and the combination of the emitter 1724 / receiver 1725.
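
For the emitter 1724 / receiver 1725 pair, a minimal sketch of converting a measured flight time into a relative distance follows; the ultrasonic case and the speed-of-sound constant are illustrative assumptions:

    SPEED_OF_SOUND_M_PER_S = 343.0  # in air at roughly 20 degrees C

    def distance_from_flight_time(round_trip_s: float) -> float:
        """Convert a reflected pulse's round-trip time into a one-way distance
        between the emitter/receiver pair and the hand/finger."""
        return SPEED_OF_SOUND_M_PER_S * round_trip_s / 2.0

    # A ~1.75 ms round trip corresponds to a hand about 0.3 m from the sensor.
    print(f"{distance_from_flight_time(0.00175):.3f} m")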

[779] While this Application primarily describes the use of the hand/finger gesture sensor 1710 to measure relative luminance, such as using a light sensor 1721, there is no particular requirement for any such limitation. As described herein, the user input can include one or more of: a capacitive or touch control (such as using the capacitive sensor 1722), a sensor disposed to respond to an eye or facial gesture (such as using the camera 1723), a sensor disposed to respond to a mouth gesture (such as using the camera 1723), a sensor disposed to respond to movement of a body part (such as using the camera 1723), a voice control (such as using a microphone or other device responsive to sound), or as otherwise described herein.

Use of hand/finger gesture sensor

[780] The user can use their hand/finger 1730 to provide a hand/finger gesture indicating a user input. For example, the user input can indicate one or more of the following:

— A measure of pain, light sensitivity, audio sensitivity, or another indicator of the occurrence or severity of migraine or photophobia (or phonophobia).

— A measure of the user’s desire for shading/inverse-shading (or a measure of brightness or shadow the user prefers), a measure of the user’s desire for coloring or color balancing, a measure of the user’s desire for refraction (or a measure of fuzziness or unclarity the user perceives), or another visual effect to be applied to light infalling to the user’s eye.

— A measure or other indicator of the user’s subjective perception of a treatment for migraine or photophobia as recently or currently being applied by the digital eyewear 1700.

— A measure of the user’s subjective belief that migraine or photophobia (or phonophobia) is oncoming or likely to be so, or is finishing or likely to be so.

— A measure or other indicator of the user’s ambient environment or recent behavior, such as an amount of sleep, stress, perceived confusion or glare, or as otherwise described herein; or a recency, a degree, or an intensity, of efforts at self-care.

— An adjustment of dynamic visual optimization that the user prefers, or a measure of visual acuity currently or recently perceived by the user (such as described in the Incorporated Disclosures); or as otherwise described herein.

[781] The measure of brightness or shadow the patient prefers can be selected for use in combination with one or more other adjustments to user perception, such as one or more of:

— Combining shading/inverse-shading with dynamic visual optimization, such as described in the Incorporated Disclosures, including Application 16/684,479, filed Nov. 14, 2019, Attorney Docket No. 6501.

— Combining shading/inverse-shading with color filtering or color balancing, such as described herein and in the Incorporated Disclosures, including Application 13/841,550, filed Mar. 15, 2013, Attorney Docket No. 5087 P, or including Application 14/660,565, filed Mar. 17, 2015, Attorney Docket No. 5266C3.

[782] The eyewear 1700 can be disposed to periodically, from time to time, or otherwise as described herein, maintain a record of patient inputs. For example, the eyewear 1700 can be disposed to maintain a record of a measure of migraine or photophobia and maintain that measure in a “migraine diary” on behalf of the patient. The migraine diary can be disposed for use by medical personnel, such as a neurologist, ophthalmologist, optometrist, or otherwise as described herein. The eyewear 1700 can be disposed to transmit its determination to an external device, such as using a transceiver (such as a smartphone or other mobile device), a storage device, or as otherwise described herein.

Comparison with ambient environment

[783] In one embodiment, the hand/finger gesture sensor 1710 can include a front sensor 1721F and one or more external side sensors 1721S disposed to measure infalling light from the ambient environment. As described herein, the front sensor 1721F can be disposed to measure luminance of the ambient environment without modification by the user, while the side sensors 1721S can be disposed to measure luminance as deliberately modified by the user to provide input values. As described herein, variability in the front sensor 1721F and side sensors 1721S can be disposed to provide both a value of user input and an indication of the user’s desire to select that particular input.

[784] To measure luminance, or luminance level of selected frequencies, the front sensor 1721F and side sensors 1721S can use a camera, photocell, photomultiplier, or otherwise as described herein. With respect to the front sensor 1721F, the sensor can be disposed to measure a luminance of the ambient environment, such as to compare with a relatively different luminance of a side sensor 1721S. With respect to at least one such side sensor 1721S, the sensor can be disposed to measure a luminance of the ambient environment as modulated by a shadow presented by the user, such as using the user’s hand, fingers, or another body part.

[785] In one embodiment, a comparison with an ambient environment can be represented using pseudocode such as the following:
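
A minimal sketch, assuming normalized luminance readings from the front sensor 1721F and one side sensor 1721S; the intent threshold and function name are illustrative assumptions rather than part of the disclosure:

    from typing import Optional

    def read_user_input(front_lux: float, side_lux: float,
                        intent_threshold: float = 0.15) -> Optional[float]:
        """Derive a 0..1 input value from how much the user shades the side
        sensor 1721S relative to the unshaded front sensor 1721F; return None
        when the shading is too slight to count as a deliberate input."""
        if front_lux <= 0:
            return None  # no ambient light to compare against
        shading = max(0.0, 1.0 - side_lux / front_lux)  # 0 = no shadow, 1 = covered
        return shading if shading >= intent_threshold else None

    print(read_user_input(front_lux=800.0, side_lux=300.0))  # deliberate input, ~0.625
    print(read_user_input(front_lux=800.0, side_lux=790.0))  # None: no input intended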

First and second hand/finger gesture

[786] For example, to adjust a shadow presented on the side sensor 1721S, the user can conduct a first finger or hand gesture, such as moving a finger or hand closer/farther from a light sensor and thus covering it more/less with a shadow. Upon selecting an input value, the eyewear 1700 can be disposed, in response to a secondary finger or hand gesture, to “lock in” a selected indication. For example, the secondary gesture can include a “swipe”, such as moving a finger or hand in a plane parallel to a temple 1122. Thus, the patient can select an input value with the first gesture and confirm that input value with the second gesture.

[787] The eyewear 1700 can be disposed to recognize the user’s first finger or hand gesture in response to a sequence of measurements by the side sensor 1721S. These measurements can be received as a substantially continuous set of values; they are not required to be selected from a set of discrete values, such as selecting a digit from 0-9. As the eyewear 1700 receives user inputs, it can be disposed to present one or more of those values, such as a sequence of values selected periodically or otherwise from time to time, to the user using an AR/VR or other audio/video presentation. This can have the effect that the patient can see or hear the substantially analog value they are providing to the eyewear 1700 using the side sensor 1721S while doing so; this can have the effect that the user can adjust their inputs during the process of input, so as to select substantially the exact input they might desire.
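
A minimal sketch of this two-gesture loop, assuming an event stream of continuous value updates followed by a confirming “swipe”; the event names and stream shape are illustrative assumptions:

    def select_value(events):
        """events: iterable of ("value", v) updates ending with a ("swipe",)
        confirmation. Echo each previewed value (standing in for the AR/VR
        presentation) and return the last value once the user swipes."""
        current = None
        for event in events:
            if event[0] == "value":
                current = event[1]
                print(f"preview: {current:.2f}")  # AR/VR echo; user can still adjust
            elif event[0] == "swipe" and current is not None:
                return current  # secondary gesture locks the value in
        return None

    chosen = select_value([("value", 0.30), ("value", 0.42), ("swipe",)])
    print("locked in:", chosen)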

Fig. 18 - Couplable circuit elements and temples

[788] Fig. 18 (collectively including Figures 18A-C) shows a conceptual drawing of eyewear capable of including couplable circuit elements and temples, and capable of being coupled to an external device.

[789] Fig. 18A shows a conceptual drawing of eyewear capable of including couplable circuit elements and temples.

[790] In one embodiment, an example eyewear 1800 can include one or more of the following:

— A front piece 1810,

— A right temple 1820, or

— A left temple 1830.

[791] One or more of the temples (such as, without loss of generality, the right temple 1820) can include a power source 1821, such as a battery. One or more of the temples (such as, without loss of generality, the left temple 1830) can include a control circuit 1831, such as a processor and program/data memory, disposed to operate the eyewear 1800.

Coupling temples using front piece

[792] The right temple 1820 can include a right circuit element 1822, such as a set of conductors, disposed to couple the power source 1821, using a first hinge 1823, to the front piece 1810. Similarly, the left temple 1830 can include a left circuit element 1832, such as a set of conductors, disposed to couple the control circuit 1831, using a second hinge 1833, to the front piece 1810. The front piece 1810 can include a front circuit element 1812, such as a set of conductors, disposed to couple the first hinge 1823 and the second hinge 1833; this can have the effect that the power source 1821 is coupled to the control circuit 1831.

[793] In one or more of the front piece 1810, or the temples 1820 or 1830, the circuit elements 1812, 1822, or 1832 can include any device or technique for coupling the right temple 1820 or its components to the left temple 1830 or its components. For example, the circuit elements 1812, 1822, or 1832, can include one or more conductive wires 1841, such as disposed in parallel to provide a multiple-bit data connection between the power source 1821 and the control circuit 1831. Alternatively, the circuit elements 1812, 1822, or 1832, can include one or more electromagnetic connectors (not shown), such as devices using capacitive coupling, electromagnetic transmitters, transformers, waveguides, or other power/data transducers. Alternatively, the circuit elements 1812, 1822, or 1832, can include one or more fiber optic connectors (not shown), which might themselves be disposed in parallel to provide a multiple-bit data connection between the power source 1821 and the control circuit 1831.

Coupling/decoupling temples from front piece

[794] In one embodiment, when the first hinge 1823 is disposed to couple the power source 1821 to the front piece 1810 (such as when the eyewear 1800 is in a normal wearing configuration), and the second hinge 1833 is disposed to couple the control circuit 1831 to the front piece 1810 (such as when the eyewear 1800 is in a normal wearing configuration), the circuit element 1812 in the front piece 1810 can be disposed to couple the right temple 1820 to the left temple 1830. This can have the effect that when the temples 1820 and 1830 are coupled to the front piece 1810, the power source 1821 can be coupled to the control circuit 1831, thus allowing the control circuit to draw power from the power source. Thus, when the eyewear 1800 is in a normal wearing configuration, the control circuit can operate using normal power (from the power source).

[795] Alternatively, when either the first hinge 1823 is disposed to decouple the power source 1821 from the front piece 1810 (such as when the eyewear 1800 is no longer in a normal wearing configuration), or the second hinge 1833 is disposed to decouple the control circuit 1831 from the front piece 1810 (such as when the eyewear 1800 is no longer in a normal wearing configuration), the circuit element 1812 in the front piece 1810 can be disposed to decouple the right temple 1820 from the left temple 1830. This can have the effect that when the temples 1820 and 1830 are decoupled from the front piece 1810, the power source 1821 can be decoupled from the control circuit 1831, thus preventing the control circuit from drawing power from the power source.

[796] The front piece 1810 can be disposed to be decoupled from either the right temple 1820 or the left temple 1830, such as by detaching the first hinge 1823 or the second hinge 1833 therefrom. In such cases, the front piece 1810 can be disposed to be decoupled from the right temple 1820 and the left temple 1830 and the front piece 1810 replaced with a new front piece 1810'. For example, the new front piece 1810' can be disposed to support one or more new lenses 1811, one or more new front circuit elements 1812, one or more color-alterable elements (not shown) (as otherwise and further described herein), one or more connections between the right temple 1820 and the left temple 1830 (such as electrical/electronic connections or optical connections), or as otherwise described herein.

[797] Replacing the front piece 1810 with a new front piece 1810' can have the effect of updating or upgrading one or more portions of the eyewear 1800 with improved circuit elements. For example, as described herein, one or more of the following can be updated or upgraded:

— The eyewear’s lenses can be updated or upgraded, such as to match a new prescription for the user.

— One or more of the front circuit elements 1812 can be updated or upgraded, such as to match new connections or to take advantage of new available connections with the right temple 1820 or the left temple 1830, or between the right temple 1820 and the left temple 1830.

— One or more color-alterable elements in the front piece 1810 (not shown), or controls for color-alterable elements in the right temple 1820 or the left temple 1830 (not shown), can be updated or upgraded, such as to match color-alterable elements in other portions of the eyewear 1800, or to take advantage of updates or upgrades in other portions of the eyewear.

— Replacing the front piece 1810 with a new front piece 1810' can have the alternative effect of installing a debugging device in the eyewear 1800, or of replacing a defective part of the eyewear.

[798] Alternatively, the front piece 1810 can be disposed to be decoupled from one or more of the right temple 1820 or the left temple 1830, such as by detaching the first hinge 1823 and/or the second hinge 1833 therefrom. In such cases, the front piece 1810 can be disposed to be decoupled from the right temple 1820 and the latter replaced with a new right temple 1820'. Similarly, in such cases, the front piece 1810 can be disposed to be decoupled from the left temple 1830 and the latter replaced with a new left temple 1830'.

[799] As described herein with respect to replacing the front piece 1810 with a new front piece 1810', replacing the right temple 1820 with a new right temple 1820' or replacing the left temple 1830 with a new left temple 1830' can have the effect of updating or upgrading one or more portions of the eyewear 1800 with improved circuit elements. For example, as described herein, one or more of the following can be updated or upgraded:

— The improved circuit elements can be coupled between one of the new temples 1820' or 1830' and the front piece 1810, can be coupled between one of the new temples 1820' or 1830' and the opposite temple 1830 or 1820 respectively, or can be coupled between one of the new temples 1820' or 1830' and an opposite new temple 1830' and 1820' respectively.

— One or more color-alterable elements in one or more of the new temples 1820' or 1830' (not shown), or controls for color-alterable elements in one or more of the new temples 1820' or 1830', can be updated or upgraded, such as to match color-alterable elements in other portions of the eyewear 1800, or to take advantage of updates or upgrades in other portions of the eyewear.

— Replacing the right temple 1820 with a new right temple 1820' or replacing the left temple 1830 with a new left temple 1830' can have the alternative effect of installing a debugging device in the eyewear 1800, or of replacing a defective part of the eyewear.

Coupling temples to alternative device

[800] Fig. 18B shows a conceptual drawing of eyewear including couplable circuit elements and temples.

[801] In one embodiment, eyewear 1800 including the front piece 1810 and the temples 1820 and 1830 can be disposed to couple the front piece 1810 to one or more temples 1820 or 1830 using a hinge 1880. The hinge 1880 can include a first holder 1881a and a second holder 1881b, rotatably disposed so as to allow the hinge 1880 to rotate along a selected axis 1881c. The hinge 1880 can also include one or more wires 1882a (disposed in the front piece 1810) and 1882b (disposed in one of the temples 1820 or 1830), the wires 1882a and 1882b being disposed so as to be coupled when the hinge 1880 is closed and decoupled when the hinge 1880 is open. When the wires 1882a and 1882b are coupled, a circuit can be completed between the front piece 1810 and one or more temples 1820 or 1830. When the wires 1882a and 1882b are decoupled, no such circuit is completed. The wires 1882a can be enclosed in a first conduit 1883a and the wires 1882b can be enclosed in a second conduit 1883b. The first conduit 1883a and the second conduit 1883b can be disposed to be coupled when the hinge 1880 is closed, and to be decoupled when the hinge 1880 is open, but whether the first conduit 1883a and the second conduit 1883b are coupled or decoupled need not necessarily have any particular effect.

[802] For example, the wires 1882a and 1882b can include copper wires that can couple electrically, fiber optic threads that can couple optically, or other elements that can possibly couple using another technique. Although this Application primarily describes the wires 1882a and 1882b as including copper wires or another conducting material that couples electrically, there is no necessary requirement for any such limitation. The wires 1882a and 1882b can include any device or technique disposed to be coupled/decoupled when the hinge 1880 is closed/open, and vice versa.

[803] Fig. 18C shows a conceptual drawing of eyewear including couplable circuit elements and temples, capable of being coupled to an external device.

[804] In one embodiment, an example eyewear 1800 is capable of being coupled to one or more external devices. An example external device can include one or more of the following:

— A holding device 1840,

— An external power source 1850,

— An external computing device 1860.

[805] For example, the eyewear 1800 can be disposed to be unfolded and worn by a user, or to be folded and maintained in the holding device 1840. When the eyewear 1800 is unfolded and worn by a user, the magnetic hinges 1823 and 1833 can be disposed to couple the temples 1820 and 1830 to the front piece 1810. When the eyewear 1800 is folded and maintained in the holding device 1840, the magnetic hinges 1823 and 1833 can be disposed to uncouple the temples 1820 and 1830 from the front piece 1810 and instead couple the temples 1820 and 1830 (and possibly the front piece 1810) to the holding device 1840. In such cases, one temple, not necessarily the right temple 1820, can be disposed to be coupled to the external power source 1850; possibly the front piece 1810 can also be disposed to be coupled to the external power source 1850. The other temple, not necessarily the left temple 1830, can be disposed to be coupled to the external computing device 1860; possibly the front piece 1810 can also be disposed to be coupled to the external computing device 1860.

[806] In one embodiment, when the first hinge 1823 is disposed to decouple the power source 1821 from the front piece 1810 (such as when the eyewear 1800 is no longer in a normal wearing configuration, or such as when the eyewear has been decoupled into one or more parts), the power source can be disposed to be coupled to a holding device 1840. The holding device 1840 can be disposed to maintain or include an external power source 1850, such as an external battery or a wall socket or other power source. This can have the effect that, when not being worn, or when having been decoupled into one or more parts, the eyewear 1800 can be disposed to be charged by coupling the eyewear’s power source 1821 to one or more external power sources 1850, such as possibly disposed in the holding device 1840 or another charger.

[807] In one embodiment, when the second hinge 1833 is disposed to decouple the control circuit 1831 from the front piece 1810 (such as when the eyewear 1800 is no longer in a normal wearing configuration, or such as when the eyewear has been decoupled into one or more parts), the control circuit can be disposed to be coupled to the holding device 1840. The holding device 1840 can be disposed to maintain or include an external computing device 1860, such as an external processor or program/data memory. This can have the effect that, when not being worn, or when having been decoupled into one or more parts, the eyewear 1800 can be disposed to transmit/receive program/data information to/from the external computing device 1860 or its program/data memory, such as possibly when disposed in a holding device 1840 coupled to the external computing device or its external program/data memory, so as to couple to another remote device.

[808] For example, the external computing device 1860 or its external program/data memory can be disposed to exchange (such as using a wired or wireless coupling) information between the eyewear’s control circuit 1831 and a remote device 1870, so as to perform one or more of the following:

— To maintain information collected by the eyewear’s control circuit at the remote device, such as to maintain a “migraine diary”, an electronic medical record (EMR), or another record of user activity.

— To update information maintained by the eyewear’s control circuit with new or updated information from the remote device, such as to update parameters for the eyewear’s control circuit.

[809] For one example, the eyewear’s control circuit 1831 can send information to the remote device 1870, such as to inform the remote device of one or more of: (A) information with respect to the user’s ambient environment; (B) information with respect to the user’s personal parameters, such as one or more of: a measure of eyeblink time, a measure of light sensitivity, a measure of user pain; a set of user preferences with respect to selected gestures; a set of user parameters with respect to gesture sensitivity; a user location, such as a current location or a location history; or as otherwise described herein; or (C) information with respect to objects recognized by the eyewear, such as alarms or warnings, friends or other selected persons, selected objects, or as otherwise described herein.

[810] For another example, the eyewear’s control circuit 1831 can receive information from the remote device 1870, such as to inform the eyewear 1800 of one or more of: (A) information from emergency responders or medical personnel; (B) information from law enforcement officers or search/rescue personnel; (C) information from medical devices, such as to update a set of parameters for determining one or more user medical conditions; or as otherwise described herein. In such cases, parameters for determining one or more user medical conditions can include artificial neural network (ANN) weights or other artificial intelligence or machine learning parameters.
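
As a concrete illustration of the record exchange described in paragraphs [808] through [810], the following Python sketch shows one plausible shape for a “migraine diary” entry and a parameter update. The class names, field names, and the sync() helper are hypothetical illustrations chosen for this sketch, not elements of the described eyewear.

```python
# A minimal sketch of the record exchange described in paragraphs [808]-[810].
# All class and field names here are hypothetical illustrations.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class MigraineDiaryEntry:
    """One record the control circuit might upload to a remote device."""
    timestamp: float                  # seconds since epoch
    ambient_luminance: float          # measured scene luminance, cd/m^2
    eyeblink_time_ms: float           # measured eyeblink duration
    light_sensitivity: float          # 0.0 (none) to 1.0 (severe)
    reported_pain: Optional[int]      # user-reported pain score, 0-10
    location: Optional[tuple] = None  # (latitude, longitude), if available

@dataclass
class ParameterUpdate:
    """One update the remote device might push back to the control circuit."""
    gesture_sensitivity: float        # threshold for recognizing eye gestures
    ann_weights: list = field(default_factory=list)  # ML model parameters

def sync(entries, updates, upload, apply_update):
    """Upload collected diary entries, then apply any pending updates."""
    for entry in entries:
        upload(entry)         # e.g., send to an EMR or other activity record
    for update in updates:
        apply_update(update)  # e.g., install new gesture or ANN parameters
```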

[811] For example, when the eyewear 1800 is not being worn, the user can position the eyewear in (or otherwise coupled to) the holding device 1840, so as to couple the power source 1821 to an external power source 1850, and so as to couple the control circuit 1831 to an external computing device 1860. This can have the effect that, when not being worn, the eyewear 1800 can be disposed both to charge its power source 1821 and to perform one or more of (A) recording any information it has gleaned from use, and (B) updating its program/data memory from an external source.

Fig. 19 - Clip-on couplable circuit elements and lenses

[812] Fig. 19 shows a conceptual drawing of eyewear capable of including magnetic clip-on couplable circuit elements and lenses.

[813] In one embodiment, an example eyewear 1900 can include one or more of the following:

— A front piece 1910 disposed to support one or more lenses 1920.

— A clip-on piece 1930 disposed to be coupled to one or more of the lenses, such as at one or more points of contact; or

— A right temple 1940R and/or a left temple 1940L.

[814] The one or more lenses 1920 can include one or more magnetic attachment points 1950, such as including relatively small magnets or including relatively small metallic spots to which magnets can be relatively easily attached. An additional couplable lens 1960 can be disposed to attach to those magnetic attachment points 1950, such as using relatively small magnets or relatively small metallic spots to which magnets can be relatively easily attached. When the magnetic attachment points 1950 include magnets, the additional couplable lens 1960 can be disposed to use magnets or metallic spots to attach to those spots; when the magnetic attachment points 1950 include metallic spots, the additional couplable lens 1960 can be disposed to use magnets to attach to those spots.

[815] In one embodiment, each of the one or more lenses 1920 can include a plurality of such magnetic attachment points 1950 (such as three attachment points), so as to maintain stability of the additional lens 1960 when attached to an associated lens 1920. However, there is no particular requirement for any such limitation. The one or more lenses 1920 can each include only one or two such magnetic attachment points 1950, or they can each include three or more such magnetic attachment points disposed at relatively dispersed points, such as on edges of each of the one or more lenses 1920, such as each having those relatively dispersed points disposed in a triangle covering most of each of the lenses. The additional lenses 1960 can be disposed on an inner side (proximal to the user), on an outer side (distal from the user) of the one or more lenses 1920, or both. Alternatively, the additional lenses 1960 can be disposed proximal to the user for one such lens 1920 and distal from the user for another such lens 1920.

[816] The digital eyewear 1900 can be disposed so as to provide the user with the ability to attach one or more such additional lenses 1960 to each of the one or more lenses 1920. For example, the digital eyewear 1900 can be disposed to allow the user to attach a first additional lens 1960 to apply a first effect to light infalling to the user’s eye and a second additional lens 1960 to apply a second effect to that light. The first effect and the second effect can collectively include a shading/inverse-shading effect, a polarization effect, a coloring/tinting or color balancing effect, or as otherwise described herein.

[817] Alternatively, the user can attach a first additional lens 1960 to apply a first effect to light infalling to the user’s eye and a second additional lens 1960 to apply more of the same effect as the first effect to that light. The first effect and the second effect can collectively include a shading/inverse-shading effect, a polarization effect, a coloring/tinting or color balancing effect, or as otherwise described herein. For example, the user can attach a first additional lens 1960 to apply a shading/inverse-shading effect and attach further additional lenses to apply further shading/inverse-shading effects, until the user is satisfied with an amount of shading/inverse-shading applied by the combination of the multiple such additional lenses.

[818] This can have the effect that the user can adjust the amount of shading/inverse-shading applied by the one or more additional lenses 1960 in response to the user’s desires, to a suggestion or warning by the digital eyewear, to a suggestion or warning by medical personnel, or as otherwise described herein. For example, the user can use one or more additional lenses 1960 to adjust the amount of shading/inverse-shading in response to one or more of:

— a luminance of the ambient environment;

— the patient’s then-current sensitivity to light;

— the patient’s subjective determination of whether migraine or photophobia is occurring or is about to occur;

— the patient’s subjective determination of a severity of a then-current episode of migraine or photophobia;

— a suggestion or warning by the digital eyewear with respect to whether migraine or photophobia is about to occur;

— a suggestion or warning by the digital eyewear with respect to whether migraine or photophobia is occurring and/or a severity thereof;

— a suggestion or warning by the digital eyewear with respect to treatment of migraine or photophobia that is occurring and/or about to occur;

— a suggestion or warning by the digital eyewear with respect to self-care by the patient with respect to a possible or an actual migraine or photophobia event;

— a suggestion or warning by medical personnel with respect to prevention, treatment or amelioration, or self-care by the patient, with respect to possible, oncoming, or actual migraine or photophobia events; or otherwise as described herein.

[819] For another example, the eyewear 1900 can be disposed to allow the user to attach a first additional lens 1960 to apply a first coloring/tinting effect and attach further additional lenses to apply further coloring/tinting effects, until the user is satisfied with a combined coloring/tinting effect applied by the combination of the multiple such additional lenses. In one such case, the first coloring/tinting effect can include a high-pass filter applied to frequencies of light with wavelengths above about 500nm and the second coloring/tinting effect can include a low-pass filter applied to frequencies with wavelengths below about 560nm, with a combined effect of a bandpass filter applied to allow passage of frequencies with wavelengths between 500-560nm (green).
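
As a worked illustration of this combined band-pass effect, the following sketch models each add-on lens as an ideal transmission window over wavelength; the passes() helper and the window representation are assumptions made for this sketch, not a description of the actual optics.

```python
# Illustrative sketch: stacking a lens passing wavelengths above 500 nm and a
# lens passing wavelengths below 560 nm yields a 500-560 nm (green) band-pass,
# modeling each lens as an ideal transmission window over wavelength.

def passes(wavelength_nm, windows):
    """True if the wavelength passes every stacked filter window."""
    return all(lo <= wavelength_nm <= hi for lo, hi in windows)

# (low, high) pass windows in nanometers; float('inf') = no upper cutoff
first_lens = (500.0, float('inf'))   # passes above about 500 nm
second_lens = (0.0, 560.0)           # passes below about 560 nm
stack = [first_lens, second_lens]

for wl in (450, 530, 600):
    print(wl, "nm passes:", passes(wl, stack))
# 450 nm passes: False; 530 nm passes: True; 600 nm passes: False
```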

[820] Similarly, this can have the effect that the user can adjust the amount or type of coloring/tinting applied by the one or more additional lenses 1960 in response to the user’s desires, to a suggestion or warning by the digital eyewear 1900, to a suggestion or warning by medical personnel or emergency responders, or as otherwise described herein. For example, the user can use one or more additional lenses 1960 to adjust the amount or type of coloring/tinting in response to one or more of the conditions described herein.

[821] The digital eyewear 1900 can be disposed so as to provide the user with the ability to remove one or more of the additional lenses 1960. The user can pry the additional lenses 1960 away from each of their associated lenses 1920 using sufficient force to overcome the attaching force from the one or more magnets, one by one or all at once, using their hands (if the magnetic attachment is not too strong) or using a removal tool 1970 (if doing so is more convenient). This can have the effect that the user can relatively dynamically adjust which effects are applied to infalling light using one or more of the additional lenses 1960. Alternatively, this can have the effect that the user can adjust how much of an effect, such as a shading/inverse-shading effect, or what kind of an effect, such as a coloring/tinting effect, is applied to the infalling light. The user can adjust the amount of shading/inverse-shading or coloring/tinting (or other possible effects) by adding or removing additional lenses 1960 associated with each of the one or more lenses 1920, each of which applies a relatively different amount of shading/inverse-shading effect, a different type of coloring/tinting effect, or another possible effect, until satisfied with the combined shading/inverse-shading effect, coloring/tinting effect, or other possible effects.

[822] As described herein, the user can adjust the amount of a coloring/tinting or color balancing effect by adding or removing additional lenses 1960 associated with each of the one or more lenses 1920, each of which applies a relatively smaller amount of (or different type of) coloring/tinting or color balancing effect, until the user is satisfied with the combined coloring/tinting or color balancing effect. For example, if the user desires relief from an oncoming, then-currently occurring, or recent migraine or photophobia event, the user can apply one or more green filters (such as with respect to light between about 500-560nm) until the user is satisfied that the color balance of the user’s field of view, as viewed by the user, is sufficiently green to prevent, treat, or provide self-care for, migraine or photophobia. Alternatively, if the user desires relief from harsh lighting, such as might occur in response to excessive blue or ultraviolet light (such as having a wavelength shorter than about 480nm), the patient can apply one or more filters disposed to remove relatively shorter wavelengths (such as blue or ultraviolet light) or promote relatively longer wavelengths (such as red or amber light), until the user is satisfied that the color balance of the patient’s field of view, as viewed by the user, is sufficient to avoid migraine or photophobia.

[823] Similarly, the user can adjust the amount of a refraction effect by adding or removing additional lenses 1960 associated with each of the one or more lenses 1920, each of which applies a relatively smaller amount of refraction effect, until satisfied with the combined refraction effect. For example, if the user desires to engage in viewing of a relatively close or fine image, such as when reading from a small display (e.g., on a smartphone or other mobile device), the user can apply one or more additional lenses 1960 to each of the one or more lenses 1920, until the user is satisfied that the combined refractive effect is suitable for reading the intended image. Similarly, if the user desires to engage in viewing of a relatively less-close image, such as when observing at a mid-range or longer distance, the user can apply one or more additional lenses 1960 to each of the one or more lenses 1920, until the user is satisfied that the combined refractive effect is suitable for viewing the intended image.
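
For a rough sense of how stacked refractive effects combine, the following sketch uses the standard thin-lens approximation, under which the optical powers (in diopters) of thin lenses held in close contact add approximately linearly; the function name and the example prescription values are hypothetical.

```python
# Illustrative sketch of the combined refraction effect, under the standard
# thin-lens approximation: optical powers (diopters) of lenses held in close
# contact add approximately linearly.

def combined_power(base_diopters, addon_diopters):
    """Approximate total power of a base lens plus stacked add-on lenses."""
    return base_diopters + sum(addon_diopters)

# e.g., a -2.00 D distance prescription plus two +1.25 D add-on lenses
# approximates a +0.50 D net power, which may suit close reading.
print(combined_power(-2.00, [1.25, 1.25]))  # 0.5
```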

[824] While the additional lenses 1960 have been described above with respect to an entire one of the one or more such lenses 1920, there is no particular requirement for any such limitation. One or more such additional lenses 1960, but not necessarily all of them, can each be disposed to cover a portion which is less than all of its associated one of the one or more lenses 1920. For example, one or more such additional lenses 1960 can be disposed to cover a lower region 1920L of its associated one of the one or more lenses 1920, so as to provide a particular visual effect only with respect to that portion of the user’s field of view. For another example, when the user intends to use bifocal, trifocal, multifocal, progressive, or other types of lenses 1920, each one of the lenses can have one or more associated additional lenses 1960 disposed to provide more or less refractive effect in one or more regions of the user’s field of view (such as the lower region 1920L, a middle region 1920M, or an upper region 1920U) of the associated one of the one or more lenses 1920. For another example, when the user intends to use bifocal, trifocal, multifocal, progressive, or other types of lenses 1920, each one of the one or more lenses can have one or more associated additional lenses 1960 disposed to provide a coloring/tinting or color balance effect (such as enhancing green light, or de-enhancing blue/ultraviolet light) in one or more regions of the user’s field of view (such as the lower region 1920L, the middle region 1920M, or the upper region 1920U) of the associated one of the one or more lenses 1920.

[825] In one embodiment, each of the one or more lenses 1920 can include circuitry 1980 disposed to be coupled to one or more of the right temple 1940R or the left temple 1940L.

[826] For one example, the circuitry 1980 can be disposed to be coupled to one of the temples (without loss of generality, the right temple 1940R), to allow the circuitry to be coupled to the power source 1821 (as described with respect to Fig. 18). In such cases, when the circuitry 1980 is coupled to one or more of the lenses 1920, the circuitry 1980 can be disposed to be coupled to the power source 1821 and can be disposed to draw power from the power source. This can have the effect that when the additional lens 1960 is coupled to its associated one of the one or more lenses 1920, its circuitry 1980 can be powered and can operate without its own separate power source.

[827] For another example, the circuitry 1980 can be disposed to be coupled to one of the temples (without loss of generality, the left temple 1940L), to allow the circuitry to be coupled to the control circuit 1831 (as described with respect to Fig. 18). In such cases, when the circuitry 1980 is coupled to one or more of the lenses 1920, the circuitry 1980 can be disposed to be coupled to the control circuit 1831 and can be disposed to be controlled by the control circuit. This can have the effect that when the additional lens 1960 is coupled to its associated one of the one or more lenses 1920, its circuitry 1980 can be controlled by the control circuit and can operate without its own separate control circuit.

Fig. 20 - Multi-layer lenses

[828] Fig. 20 shows a conceptual drawing of eyewear capable of including multi-layer lenses.

[829] In one embodiment, an example eyewear 2000 can include one or more of the following:

— A front piece 2010 disposed to support one or more lenses 2020.

— A set of lenses 2020 each having multiple layers 2020a, 2020b, or 2020c, or possibly others, each disposed to perform a coloring/tinting alteration of a user’s field of view.

[830] For example, one or more of the multiple layers 2020a, 2020b, or 2020c, can include a dichromic material disposed on a base layer. The dichromic material can be disposed to change color between a clear state and a color filtering state, in response to an electronic signal, such as an electronic signal received from a processor or another external device. In such cases, one or more of the lenses 2020 can include a base glass component 2021 (which is substantially clear) and a coating 2022 (having a filtering effect on light passing through), with the effect of providing a band-pass filtering effect for a selected color.

[831] The selected color can include one or more of: red, green, blue, cyan, magenta, yellow, or some combination thereof. One such color can include green frequencies between about 500-560nm, which are believed to prevent, ameliorate, or treat, effects of migraine or neuro-ophthalmic disorder.

[832] In one alternative, a selected one or more of the lenses 2020 can include a base glass component 2021 that is substantially clear, with a first coating 2022a (such as on a first side) having a first filtering effect, and a second coating 2022b (such as on a second side) having a second filtering effect. This can have the effect that both filtering effects can be applied to light passing through the selected lens 2020.

[833] In another alternative, one or more of the lenses 2020 can include a base glass component 2021 that is substantially clear, with a first coating 2022a (having a first filtering effect) covering a first portion of the user’s field of view, and a second coating 2022b (having a second filtering effect) covering a second portion of the user’s field of view. This can have the effect that the first filtering effect is applied to the first portion of the user’s field of view and the second filtering effect is applied to the second portion of the user’s field of view.

[834] When two or more of multiple layers 2020a, 2020b, or 2020c, are coupled (or one such layer includes more than one filtering coating), the eyewear 2000 can be disposed to provide a combination filter that selects substantially any color. This can have the effect that the user’s field of view, or a portion thereof, can be disposed to present the selected color. For example, a combination of a red filter and a green filter can provide a yellow filter, thus showing only those elements of the user’s field of view that have yellow components.

[835] When one or more of the multiple layers 2020a, 2020b, or 2020c, are disposed to be controlled as a single element covering the user’s entire field of view, an electronic signal directing a color change would be disposed over the user’s entire field of view. For example, a green color can be disposed over the user’s field of view; this can have the effect of preventing, ameliorating, or treating, effects of migraine or neuro-ophthalmic disorder.

[836] Alternatively, one or more of the multiple layers 2020a, 2020b, or 2020c, can be disposed to be controlled in individual pixels or regions. In such cases, one or more electronic signals can be disposed to direct a color change over only a portion of the user’s entire field of view. When the pixels or regions are small enough, one or more electronic signals can be disposed to present a color change in the form of a detailed picture (still or moving). This can be disposed to provide an augmented reality or virtual reality image to the user.

[837] In one embodiment, one or more multiple layers can be disposed to combine a first effect on the user’s whole field of view, with a second effect on individual pixels or regions within the user’s field of view. For example, a green color (such as in the 500-560nm range of wavelengths) can be imposed on the user’s whole field of view, so as to prevent, ameliorate, or treat, effects of migraine or neuro-ophthalmic disorder. Concurrently, individual pixels or regions can be disposed to show elements of interest to the user that would otherwise be filtered out by the green color (such as red lights, red brake lights, red traffic signs, or other hazard warnings), such as by presenting them in a brighter format, in a flashing format, or otherwise by drawing attention thereto.
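
A minimal sketch of this combined whole-field/per-pixel behavior follows, assuming a simple RGB pixel model; the render_pixel() and is_hazard() functions, and the specific channel attenuations, are illustrative stand-ins rather than the described hardware behavior.

```python
# Illustrative sketch of paragraph [837]: impose a green tint on the whole
# field of view, but pass through and brighten pixels that a hazard detector
# has flagged (e.g., red brake lights), so they are not filtered out.
# Pixel values and the is_hazard() test are hypothetical stand-ins.

def render_pixel(rgb, hazard):
    r, g, b = rgb
    if hazard:
        # present hazard elements in a brighter, unfiltered format
        return (min(255, int(r * 1.5)), g, b)
    # global green tint: attenuate the non-green channels
    return (int(r * 0.3), g, int(b * 0.3))

def is_hazard(rgb):
    """Crude stand-in for hazard recognition: strongly red pixels."""
    r, g, b = rgb
    return r > 180 and g < 80 and b < 80

frame = [(200, 40, 30), (90, 150, 90), (250, 20, 20)]  # tiny example "image"
print([render_pixel(p, is_hazard(p)) for p in frame])
```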

[838] In an alternative embodiment, one or more of the multiple layers, or a different set of multiple layers, can be disposed using a different color gamut, other than red/green/blue. For example, any set of color filters can be used, so long as a reasonable portion of the visual spectrum can be affected by one or more of the multiple layers 2020a, 2020b, or 2020c, or possibly others. There is no particular requirement to restrict the set of multiple layers to exactly or to only three such layers.

Fig. 21 - Highlighting using polarization

[839] Fig. 21 shows a conceptual drawing of eyewear capable of highlighting using polarization.

[840] In one embodiment, eyewear 2100 can include one or more of the following:

— A lens 2110 disposed to perform a function on at least a portion of a field of view.

— A circuit 2120 disposed to control at least one function of the lens.

[841] The function performed by the lens 2110 can include a polarization function. The polarization function can have the effect that one or more selected portions of the field of view can be highlighted and/or blocked with respect to the user’s view. For example, when the user is viewing a display screen, such as one coupled to a coupled device (not shown), the polarization function can be disposed to highlight selected portions of the display, such as selected words or pictures. Alternatively, the polarization function can be disposed to de-highlight or otherwise block selected words or pictures (or other features) of the field of view, so as to allow the user to not pay excessive attention to them.

[842] For example, the user might instruct the eyewear 2100 to highlight certain selected words (such as “abecedarian”). In such cases, the eyewear 2100 can direct the circuit 2120 to perform polarization on those portions of the field of view in which that word appears, thus highlighting the word for the user and making it easier for the user to identify that word within other elements in the display.

[843] For another example, the user might instruct the eyewear 2100 to de-highlight or otherwise block selected words or pictures, or other features, of the field of view, so as to allow the user to not pay excessive attention to them. In such cases, the eyewear 2100 can direct the circuit 2120 to perform polarization on those portions of the field of view in which those selected elements appear, thus de-highlighting or otherwise blocking them so as to reduce or delete their effect on the user. For example, when the user is concentrating on a particular part of the field of view, the user might desire to de-highlight or otherwise block advertising or other extraneous distractions.
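
One way such selective highlighting and de-highlighting might be organized is sketched below, assuming some recognizer (not shown) reports word bounding boxes within the field of view; the mask convention and all names here are hypothetical illustrations, not the claimed circuit design.

```python
# Illustrative sketch of paragraphs [842]-[843]: given word bounding boxes in
# the field of view, build a per-region mask telling the lens circuit where to
# apply the polarization effect. The box source and mask convention are
# assumptions for this sketch.

HIGHLIGHT, BLOCK, PASS = "highlight", "block", "pass"

def polarization_mask(word_boxes, highlight_words, block_words):
    """Map each detected word's bounding box to a polarization directive."""
    mask = []
    for text, box in word_boxes:
        word = text.lower()
        if word in highlight_words:
            mask.append((box, HIGHLIGHT))  # draw attention to the word
        elif word in block_words:
            mask.append((box, BLOCK))      # de-highlight / suppress it
        else:
            mask.append((box, PASS))
    return mask

# (word, (x, y, width, height)) pairs, as a recognizer might report them
boxes = [("abecedarian", (10, 40, 120, 18)), ("advert", (10, 80, 60, 18))]
print(polarization_mask(boxes, {"abecedarian"}, {"advert"}))
```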

[844] In alternative embodiments, the eyewear 2100 can direct the circuit 2120 to use shading/inverse-shading or coloring/tinting, either in lieu of polarization or in addition to polarization, to perform, or to assist in performing, highlighting or de-highlighting of portions of the field of view.

Combination of functions

[845] In one embodiment, the eyewear can combine two or more such functions, such as in response to an input from the wearer designating that those functions should be combined, or such as in response to the eyewear recognizing a circumstance in which the wearer typically requests that those functions should be combined. For example, the wearer can designate that those functions should be combined using an eye gesture or other input. For another example, the eyewear can recognize a circumstance in which the wearer typically requests that those functions should be combined in response to a machine learning technique, such as a statistical response to sensory parameters, wearer parameters, environmental parameters, or otherwise as described herein. In such cases, the sensory parameters or wearer parameters can include information with respect to the wearer’s medical or other status; the environmental parameters can include information with respect to the scene in the wearer’s field of view (FOV). The eyewear can also be responsive to other information, or to a combination of factors, such as the eyewear being more/less sensitive to selected parameters (or to particular wearer inputs) when sensory parameters or wearer parameters indicate particular medical or other status, or otherwise as described herein.
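
As one hedged illustration of such a statistical response, the following sketch scores the current parameters with a simple logistic model; the chosen features, weights, and threshold are invented for illustration, whereas in practice any such parameters would be learned from the wearer’s history.

```python
# Illustrative sketch of paragraph [845]: a simple statistical rule (here a
# logistic model with made-up weights) scoring whether the wearer's current
# sensory/environmental parameters match a circumstance in which the wearer
# typically combines two functions. The features and weights are hypothetical.
import math

def combine_score(luminance, light_sensitivity, recent_combines):
    """Probability-like score that combined functions should be offered."""
    # these weights would in practice be learned from the wearer's history
    z = 0.004 * luminance + 2.0 * light_sensitivity + 0.8 * recent_combines - 3.0
    return 1.0 / (1.0 + math.exp(-z))

if combine_score(luminance=900.0, light_sensitivity=0.7, recent_combines=2) > 0.5:
    print("suggest combining shading and coloring functions")
```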

ALTERNATIVE EMBODIMENTS

[846] While this Application primarily describes systems and techniques that relate to dynamic adjustment of eyewear, including at least one or more of:

— Dynamically adjusting the eyewear in response to wearer commands, such as when the wearer recognizes that a change in eyewear parameters is desirable.

— Dynamically adjusting the eyewear in response to commands from an overseer or other party, such as when the other party recognizes that the wearer is undergoing a medical or other sensory condition.

— Dynamically adjusting the eyewear in response to one or more other eyewear devices, such as when multiple wearers are cooperating to each identify information available to any one of them.

— Dynamically adjusting the eyewear in response to one or more personalization parameters, or

— Dynamically adjusting the eyewear in response to one or more hybrids of environmental factors or wearer commands; or otherwise as described herein.

[847] After reading this Application, those skilled in the art will recognize that the techniques described herein are applicable to a wide variety of different types of eyewear and substitutes for eyewear; to a wide variety of facts about the wearer and their eyewear, and any relationship to their environment; to a wide variety of different ways in which the eyewear could be dynamically adjusted; to a wide variety of other devices that could be used with the eyewear, or ways in which the eyewear could be used; or otherwise as described herein.

[848] This Application describes a preferred embodiment with preferred process steps and, where applicable, preferred data structures. After reading this Application, those skilled in the art would recognize that, where any calculation or computation is appropriate, embodiments of the description can be implemented using general purpose computing devices or switching processors, special purpose computing devices or switching processors, other circuits adapted to particular process steps and data structures described herein, or combinations or conjunctions thereof, and that implementation of the process steps and data structures described herein would not require undue experimentation or further invention.

[849] The claims are incorporated into the specification as if fully set forth herein.