Title:
SYSTEM AND METHOD FOR ENGAGING TARGETS UNDER ALL WEATHER CONDITIONS USING HEAD MOUNTED DEVICE
Document Type and Number:
WIPO Patent Application WO/2023/170697
Kind Code:
A1
Abstract:
The present invention relates to a system and method for providing a virtual sight as an aiming aid to a weapon to engage targets under all weather conditions using a mixed reality head mounted device (HMD). The system and method provide a peripheral target observation device, such as a radar/lidar, that transmits tactical data to a target data receiver (TDR). The TDR processes the received tactical data, and the trajectory of the selected target is transformed from the TDR's frame of reference to that of the HMD. The operator of the weapon system, who is also the wearer of the HMD, is provided with situational cues along with a virtual target over the HMD display that assist the operator in accurately aiming at and locking the intended target.

Inventors:
RAUT PANKAJ UDAY (IN)
PATIL ABHIJIT BHAGVAN (IN)
TOMAR ABHISHEK (IN)
SURI YUKTI (IN)
RATHI PURWA (IN)
BARAI SHANTANU (IN)
KAMPOO ADIL (IN)
TUGAONKAR PRATHAMESH (IN)
Application Number:
PCT/IN2022/050477
Publication Date:
September 14, 2023
Filing Date:
May 20, 2022
Assignee:
DIMENSION NXG PVT LTD (IN)
International Classes:
G06T19/00; F41G3/00; G01S3/00; G06F3/01
Foreign References:
US 8678282 B1 (2014-03-25)
US 2012/0280853 A1 (2012-11-08)
CN 109507686 A (2019-03-22)
Other References:
ANONYMOUS: "84) Augmented Reality (AR) based Head Mounted Display System (Army) (17-09-2018)", ‘MAKE-II’ PROJECTS - AIP PROJECTS, DEPARTMENT OF DEFENCE PRODUCTION, INDIA, India, XP009549547, Retrieved from the Internet
Claims:
We Claim:

1) A system 1000 for tracking and locking one or more targets 50, characterized in utilizing a wearable mixed reality based head mounted display (HMD) 500, the system 1000 comprising: a peripheral target observation device 100 configured to obtain tactical data of the one or more targets 50; a target data receiver (TDR) 200 configured to: receive and process the tactical data of the one or more targets 50 received from the peripheral target observation device 100; and select at least one of the one or more targets 50 based on a plurality of predetermined factors; a computing module 250 communicatively coupled with the TDR 200 configured to receive and process trajectory data of the selected target 50 and determine correction data from TDR's 200 frame of reference to HMD's 500 frame of reference for transmission to a processing module 560; the processing module 560 configured to perform coordinate transformation of the tactical data such that trajectory of the selected target 50 is transformed from the TDR's 200 frame of reference to that of HMD's 500 frame of reference; the wearable mixed reality based HMD 500 configured to render a virtual target based on the transformed trajectory data, and overlay the virtual target 50' on the selected target 50 in a three dimensional space; and a weapon system 600 having a frame of reference aligned with that of the HMD's 500 frame of reference, wherein the weapon system 600 is manoeuvred such that a virtual reticle emanating from the weapon system 600 and viewed from the HMD 500, coincides with the virtual target 50' for accurate aiming at the selected target 50 so overlaid with the virtual target 50'.

2) The system 1000, as claimed in claim 1, wherein the peripheral object observation device 100 is configured to receive a reflected wave of an irradiated wave from the one or more targets 50 existing at an irradiation destination.

3) The system 1000, as claimed in claim 1, wherein the tactical data of the one or more targets 50 received by the target data receiver (TDR) 200 is in a spherical coordinate system in the peripheral object observation device's 100 frame of reference.

4) The system 1000, as claimed in claim 1, wherein the TDR 200 is configured to select the one or more targets 50 based on the plurality of predetermined factors comprising priority and estimation (PSQR) of target, target type, speed and distance of the target, lethality levels and the like.

5) The system 1000, as claimed in claim 1, wherein the target data receiver 200 is configured to process, decode and extract the tactical data to obtain unique identifier information of the one or more targets 50 assigned thereto by the peripheral object observation device 100, radial distance and azimuthal angle of the target 50 from the peripheral object observation device 100, perpendicular height of the target 50 from the ground plane or base, heading angle of the target 50, speed of the target 50, IFF (identification of friend or foe) code of target 50, WCO (weapon control orders) code of the target determined or computed by peripheral object observation device 100, advanced target position, orientation and relative velocity of the target 50.

6) The system 1000, as claimed in claim 1, wherein the wearable mixed reality HMD 500 further comprises of: at least mixed reality glasses 510; a communication module 520; one or more cameras 530; a display unit 540; an audio unit 550; a plurality of dockable sensors 570 configured to gather data related to the target, comprising target's surrounding environment, situational awareness, navigation, binocular vision, weather conditions and presence of objects and humans; a processing module 560 configured to receive target data from the one or more cameras 530 and the plurality of dockable sensors 570; and process the received target data to determine target location, IFF (Identification of Friend or Foe), velocity & distance estimation, weapon information.

7) The system 1000, as claimed in claim 1, wherein the computing module is configured to receive and process the trajectory data of the selected target 50 in real time for interpolating the target trajectory path and prediction of future path, wherein the interpolation and prediction of the target trajectory is based on historical data of track traced by the selected target 50 for a predetermined time period from instance of its detection by the peripheral target observation device 100.

8) The system 1000, as claimed in claim 1, wherein the computing module 250 is configured to utilize a combination of tracking filters selected from a group comprising Extended Kalman Filter (EKF), Kalman filter (KF), Particle filter or Bayesian filter and neural network model to predict position, target velocity, target trajectory and direction estimates of the selected target 50.

9) The system 1000, as claimed in claim 1, wherein the computing module is configured to compute 6dof pose correction data in real time consisting of translation and orientation offset between the TDR 200 and the HMD 500.

10) The system 1000, as claimed in claim 9, wherein the computing module 250 is configured to estimate translation position of the HMD 500 with respect to the TDR 200 from readings obtained from global positioning system (GPS), inertial measurement units (IMUs) and real time kinematics (RTK) GPS.

11) The system 1000, as claimed in claim 10, wherein the computing module is configured to compute orientation offset between HMD 500 and TDR 200 using true north bearing computed from IMU and GPS readings.

12) The system 1000, as claimed in claim 1, wherein the processing module 560 is configured to: calculate and estimate 6 dof pose of the HMD 500; receive the trajectory data and correction data in TDR's 200 frame of reference from the computing module 250; perform coordinate transformation of the tactical data utilizing the 6dof correction data received from the computing module 250, HMD's estimated 6 dof pose and the trajectory data for parallax correction and transforming the target trajectory from the TDR's frame of reference to that of HMD's 500 frame of reference.

13) The system 1000, as claimed in claim 12, further comprising: a GNSS RTK base station hosted at end of the computing module 250 and configured to transmit absolute position thereof, 6dof pose partial correction data associated with a GNSS RTK rover to the processing module 560; and the GNSS RTK rover hosted at end of the processing module 560 and configured to compute position of TDR 200 with respect to the HMD 500 for parallax correction based on the received absolute position of the base station, the 6dof pose partial correction data of the rover and GPS readings of the rover.

14) The system 1000, as claimed in claim 1, wherein the HMD 500 is configured to detect and track the weapon system 600 to compute 6dof pose of the weapon system 600 and view the emanated virtual reticle from centre of said weapon system 600, wherein the viewing of emanated virtual reticle serves as a virtual aid in manoeuvring and target aiming of the weapon system 600.

15) The system 1000, as claimed in claim 14, further comprising: one or more sensors selected from inertial measurement unit (IMU), proximity sensors placed at HMD 500 and on the weapon system 600; or one or more cameras of the HMD 500 to perform visual 3-dimensional tracking; or a combination thereof to track the 6dof pose of the weapon system 600 in real time and align the frame of reference of the weapon system 600 with that of the frame of reference of HMD 500.

16) The system 1000, as claimed in claim 1, wherein the HMD 500 is configured to compute a ballistic parametric curve of the emanated virtual reticle based on weapon type, initial thrust, mass of ammunition, maximum range of weapon system, air drag, gravity etc., and accordingly manoeuvre the weapon system 600 for target engagement.

17) The system 1000, as claimed in claim 1, wherein the HMD 500 is configured to display virtual target spawned as a hollow blip, the virtual reticle providing a visual directional cue and situational cues for the target detection and locking.

18) The system 1000, as claimed in claim 1, wherein the HMD 500 is configured to display virtual target 50' as overlaid over the real target 50 under all weather conditions.

19) The system 1000, as claimed in claim 1, wherein the HMD 500 further comprises of an electrochromic optical element 590 configured to modulate opacity of the HMD in response to input received from one or more ambient light sensors to enable the HMD function under all weather conditions.

20) A method 2000 for tracking and locking one or more targets 50, characterized in utilizing a wearable mixed reality based head mounted display (HMD) 500, the method 2000 comprising: obtaining tactical data of the one or more targets 50 from a peripheral target observation device 100 and transmitting the tactical data to a target data receiver (TDR) 200; receiving and processing, at the TDR 200, the tactical data of the one or more targets 50, and selecting at least one of the one or more targets 50 based on a plurality of predetermined factors; receiving and processing, at a computing module 250 communicatively coupled with the TDR 200, trajectory data of the selected target 50 and determining correction data from TDR’s 200 frame of reference to HMD’s 500 frame of reference for transmission to a processing module 560; performing, at a processing module 560, coordinate transformation of the tactical data such that trajectory of the selected target 50 is transformed from the TDR’s 200 frame of reference to that of HMD’s 500 frame of reference; rendering over a wearable mixed reality based HMD 500, a virtual target based on the transformed trajectory data, and overlaying the virtual target 50’ on the selected target 50 in a three dimensional space; and aligning frame of reference of a weapon system 600 with that of the HMD’s 500 frame of reference, and manoeuvring the weapon system 600 such that a virtual reticle emanating from the weapon system 600 and viewed from the HMD 500, coincides with the virtual target 50’ for accurate aiming at the selected target 50 so overlaid with the virtual target 50’.

21) The method, as claimed in claim 20, wherein a reflected wave of an irradiated wave from the one or more targets 50 existing at an irradiation destination is received by the peripheral target observation device 100.

22) The method, as claimed in claim 20, wherein the tactical data of the one or more targets 50 received by the target data receiver (TDR) 200 is in a spherical coordinate system in the peripheral object observation device’s 100 frame of reference.

23) The method, as claimed in claim 20, wherein the one or more targets 50 are selected based on the plurality of predetermined factors comprising priority and estimation (PSQR) of target, target type, speed and distance of the target, lethality levels and the like.

24) The method, as claimed in claim 20, wherein processing of tactical data further comprises decoding and extracting the tactical data to obtain unique identifier information of the one or more targets 50 assigned thereto by the peripheral object observation device 100, radial distance and azimuthal angle of the target 50 from the peripheral object observation device 100, perpendicular height of the target 50 from the ground plane or base, heading angle of the target 50, speed of the target 50, IFF (identification of friend or foe) code of target 50, WCO (weapon control orders) code of the target determined or computed by peripheral object observation device 100, advanced target position, orientation and relative velocity of the target 50.

25) The method, as claimed in claim 20, wherein receiving and processing of the trajectory data of the selected target 50 is performed in real time for interpolating the target trajectory path and prediction of future path, wherein the interpolation and prediction of the target trajectory is based on historical data of track traced by the selected target 50 for a predetermined time period from instance of its detection by the peripheral target observation device 100.

26) The method, as claimed in claim 20, wherein a combination of tracking filters is selected from a group comprising Extended Kalman Filter (EKF), Kalman filter (KF), Particle filter or Bayesian filter and neural network model to predict position, target velocity, target trajectory and direction estimates of the selected target 50.

27) The method, as claimed in claim 20, wherein determination of 6dof pose correction data in real time comprises of computing translation and orientation offset between the TDR 200 and the HMD 500.

28) The method, as claimed in claim 20, wherein translation position of the HMD 500 with respect to the TDR 200 is estimated from readings obtained from global positioning system (GPS), inertial measurement units (IMUs) and real time kinematics (RTK) GPS.

29) The method, as claimed in claim 20, further comprising: calculating and estimating 6 dof pose of the HMD 500; receiving the trajectory data and correction data in TDR's 200 frame of reference from the computing module 250; performing coordinate transformation of the tactical data, at the processing module 560, by utilizing the 6dof pose correction data received from the computing module 250, HMD's estimated 6 dof pose and the trajectory data for parallax correction and transforming the target trajectory from the TDR's frame of reference to that of HMD's 500 frame of reference.

30) The method, as claimed in claim 20, further comprising: transmitting absolute position, 6dof pose partial correction data associated with a GNSS RTK rover from a GNSS RTK base station hosted at end of the computing module 250 to the processing module 560; and computing position of TDR 200 with respect to the HMD 500 for parallax correction at the GNSS RTK rover hosted at end of the processing module 560 based on the received absolute position of the base station, the 6dof pose partial correction data of the rover and GPS readings of the rover.

31) The method, as claimed in claim 20, wherein aligning the frame of reference of the weapon system 600 with that of the HMD's 500 frame of reference comprises computing 6dof pose of the weapon system 600 and viewing the emanated virtual reticle from centre of said weapon system 600, wherein the viewing of emanated virtual reticle serves as a virtual aid in manoeuvring and target aiming of the weapon system 600.

32) The method, as claimed in claim 20, wherein a ballistic parametric curve of the emanated virtual reticle is computed based on weapon type, initial thrust, mass of ammunition, maximum range of weapon system, air drag, gravity etc., and the weapon system 600 is accordingly manoeuvred for target engagement.

33) The method, as claimed in claim 20, wherein virtual target is displayed as a hollow blip, and wherein the virtual reticle provides a visual directional cue and situational cues for the target detection and locking.

34) The method, as claimed in claim 20, wherein the virtual target 50’ as overlaid over the real target 50 is displayed over the HMD 500 under all weather conditions.

35) The method, as claimed in claim 20, further comprising modulating opacity of the HMD 500 by an electrochromic optical element 590 in response to input received from one or more ambient light sensors to enable the HMD function under all weather conditions.

Description:
SYSTEM AND METHOD FOR ENGAGING TARGETS UNDER ALL WEATHER CONDITIONS USING HEAD MOUNTED DEVICE

FIELD OF THE INVENTION

[0001] Embodiments of the present invention relate to mixed reality based head mounted devices, and more particularly to a system and a method for using an Augmented Reality/Mixed Reality headset or glasses while aiming at a target using handheld firearms or different surface-to-air missile systems or cannons/autocannons under all weather conditions.

BACKGROUND OF THE INVENTION

[0002] Strong armed forces are one of the important aspects of the development and growth of any country. The armed forces participate in peacekeeping operations, military exercises and humanitarian relief missions. They also carry out more traditional tasks such as securing the borders. Weapons are used for the benefit of civilians by law enforcement officers and members of the Army, Navy, Air Force, and Marines. In order to use a weapon effectively, a person must be able to aim accurately at a target. To accurately engage targets, the strike of a bullet must coincide with the aiming point (Point of Aim/Point of Impact) on the target. Over the years, various techniques and devices have been developed to help a person accurately aim a weapon.

[0003] One common approach is to mount a sight on the weapon to view the intended target. The sight must be zeroed before the weapon is used. Zeroing a firearm, such as a rifle, is the process of aligning and calibrating the optical sight with the rifle so the user can accurately aim at the target from a set distance. It is based on the rationale that individual differences in sight alignment will be eliminated by correcting the sight for the individual firing the weapon. Precisely, the user must calibrate the optical sights to the weapon to eliminate all eccentricities and ensure the user will hit the intended target. Typically, this is accomplished by adjusting the sights on the weapon such that when a round is fired from the weapon, it hits the aiming point within the weapon's margin of error. This zeroing process is one of the most critical elements of accurate target engagement. In order to aim accurately at a target, it is imperative for the sighting mechanism to be properly installed and adjusted on the gun. Incorrect sight alignment leads to inaccurate firing that may eventually have negative training impacts, besides incurring huge time, cost and ammunition losses.

[0004] One of the existing challenges in sighting in or zeroing a firearm is parallax correction in the sighting system. The goal is reducing the parallax and adjusting the sights so the projectile (e.g., bullet or shell) may be placed at a predictable impact position within the sight picture. The principle is to shift the line of aim so that it intersects the parabolic projectile trajectory at a designated point of reference, known as a zero, so the gun will repeatably hit where it aims at the distance of that "zero" point.

[0005] Conventional methods of zeroing a firearm are often expensive and time consuming: multiple rounds of live ammunition may be fired to achieve the desired accuracy, besides requiring detailed instruction from skilled professionals. This is time and cost intensive training, especially when a large group of armed personnel or defence forces needs to be trained. Additionally, there is always the burden of employing, at all times, a skilled professional adept at handling such sophisticated weapon systems. More importantly, the real challenge persists in situations of unfavourable weather conditions where visibility of the aerial target is impaired for various reasons: fog, smog, cloud, rain, snow etc. Locating and engaging threatening targets under these disadvantageous weather conditions is critical and plays an instrumental role in mission success.

[0006] In the background of the foregoing limitations, there exists a need in the art for a system and method that can effectively align the weapon system with the line of sight of the operator and engage targets with acute precision in real time under all weather conditions. Preferably, some kind of virtual aid in the form of cues for locating and actually engaging the intended aerial target will be extremely vital for operators in dynamic situations like a battlefield. Therefore, an effective and viable system and method for displaying situational and directional pointers for target sighting, tracking, locking and engagement is desired that does not suffer from the above-mentioned deficiencies.

OBJECT OF THE INVENTION

[0007] An object of the present invention is to provide a virtual sight as aiming aid while using a weaponry system that can assist in engaging targets in real time under all weather conditions.

[0008] Another object of the present invention is to provide a head mounted display that assists the operator in providing a virtual sight along with situational and directional cues for aiming and locking the intended aerial target.

[0009] An object of the present invention is to provide a system to provide virtual aid in the form of cues for finding the aerial target and aiding in actual engagement with the target by displaying virtual lasers and virtual crosshair pointers for target sighting, tracking, locking and engagement.

[0010] Another object of the present invention is to reduce aiming errors and enhance shooting efficacy as visual and audio aid is provided to operator while using his weaponry system to aim at the target.

[0011] In another object of the present invention, significant reduction in training time and ammunition cost is achieved, as the operator of the weaponry system is provisioned with visual/audio cues for locking the target.

[0012] In yet another object of the present invention, the mixed reality based head mounted display facilitates depiction of the entire surveillance picture along with enhanced analytics to the operator for right selection and neutralization of the probable aerial targets.

SUMMARY OF THE INVENTION

[0013] The present invention may satisfy one or more of the above-mentioned desirable features. Other features and/or advantages may become apparent from the description which follows.

[0014] In one aspect of the present disclosure, a system for tracking and locking one or more targets, characterized in utilizing a wearable mixed reality based head mounted display (HMD) is disclosed. Accordingly, the system comprises a peripheral target observation device configured to obtain tactical data of the one or more targets; a target data receiver (TDR) configured to receive and process the tactical data of the one or more targets received from the peripheral target observation device, and select at least one of the one or more targets based on a plurality of predetermined factors.

[0015] The system further comprises of a computing module communicatively coupled with the TDR configured to receive and process trajectory data of the selected target and determine correction data from the TDR’s frame of reference to HMD’s frame of reference for transmission to a processing module. Now, the processing module is configured to perform coordinate transformation of the tactical data such that trajectory of the selected target is transformed from the TDR’s frame of reference to that of HMD’s frame of reference. In one significant aspect, the wearable mixed reality based HMD is configured to render a virtual target based on the transformed trajectory data, and overlay the virtual target on the selected target in a three dimensional space. Finally, the system includes a weapon system having a frame of reference aligned with that of the HMD’s frame of reference, wherein the weapon system is manoeuvred such that a virtual reticle emanating from the weapon system and viewed from the HMD, coincides with the virtual target for accurate aiming at the selected target so overlaid with the virtual target.

[0016] In another aspect of the present invention, the method for tracking and locking one or more targets is disclosed, which is characterized in utilizing a wearable mixed reality based head mounted display (HMD). The method comprises at first obtaining tactical data of the one or more targets from a peripheral target observation device and transmitting the tactical data to a target data receiver (TDR). Next, the tactical data of the one or more targets is received and processed at the TDR and at least one of the one or more targets is selected based on a plurality of predetermined factors. Now, at a computing module communicatively coupled with the TDR, trajectory data of the selected target is received and processed, whereby correction data from TDR's frame of reference to HMD's frame of reference is determined for transmission to a processing module. At the processing module, coordinate transformation of the tactical data is performed such that trajectory of the selected target is transformed from the TDR's frame of reference to that of HMD's frame of reference.

[0017] The following step involves rendering, over a wearable mixed reality based HMD, a virtual target based on the transformed trajectory data and overlaying the virtual target on the selected target in a three dimensional space. Finally, the method steps involve aligning the frame of reference of a weapon system with that of the HMD's frame of reference, and manoeuvring the weapon system such that a virtual reticle emanating from the weapon system and viewed from the HMD coincides with the virtual target for accurate aiming at the selected target so overlaid with the virtual target.

[0018] In one of the most significant aspects of the present disclosure, the aforementioned system and method are capable of locking the targets in real time irrespective of prevailing weather conditions.

[0019] Details of one or more implementations of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages will become apparent from the description, the drawings, and the claims. Neither this summary nor the following detailed description purports to define or limit the scope of the inventive subject matter.

BRIEF DESCRIPTION OF THE DRAWINGS

[0020] The skilled artisan will understand that the drawings described below are for illustrative purposes only. The drawings are not intended to limit the scope of the present teachings in any way.

[0021] Fig. 1 illustrates a block diagram of the system providing virtual sight as aiming aid to a weapon using head mounted device, in accordance with examples of the present invention.

[0022] Fig. 2 illustrates offset correction of the Target Data Receiver (TDR) with respect to the Head mounted device (HMD), in accordance with examples of the present invention.

[0023] Fig. 3 displays tactical information rendered as a hollow blip over the Head mounted device (HMD), in accordance with one exemplary embodiment of the present invention.

DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS OF THE INVENTION

[0024] In the following discussion that addresses a number of embodiments and applications of the present disclosure, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific embodiments in which the disclosure may be practiced. It is to be understood that other embodiments may be utilized and changes may be made without departing from the scope of the present invention.

[0025] Various inventive features are described below that can each be used independently of one another or in combination with other features. However, any single inventive feature may not address any of the problems discussed above or only address one of the problems discussed above. Further, one or more of the problems discussed above may not be fully addressed by any of the features described below.

[0026] As used herein, the singular forms “a”, “an” and “the” include plural referents unless the context clearly dictates otherwise. “And” as used herein is interchangeably used with “or” unless expressly stated otherwise. Likewise, other terms such as “target or object”, “head display or wearable device or head mounted display or mixed reality based head mounted display” may be interchangeably used. All embodiments of any aspect of the invention can be used in combination, unless the context clearly dictates otherwise.

[0027] Unless the context clearly requires otherwise, throughout the description and the claims, the words 'comprise', 'comprising', and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is to say, in the sense of "including, but not limited to". Words using the singular or plural number also include the plural and singular number, respectively. Additionally, the words "herein," "wherein", "whereas", "above," and "below" and words of similar import, when used in this application, shall refer to this application as a whole and not to any particular portions of the application.

[0028] The description of embodiments of the disclosure is not intended to be exhaustive or to limit the disclosure to the precise form disclosed. While the specific embodiments of, and examples for, the disclosure are described herein for illustrative purposes, various equivalent modifications are possible within the scope of the disclosure, as those skilled in the relevant art will recognize.

[0029] Broadly, Fig. 1 schematically and graphically illustrates a system and method for tracking and locking one or more targets, characterized in utilizing a wearable mixed reality based head mounted display (HMD) that provides virtual sight as an aiming aid to a weaponry system. In one of the exemplary embodiments of the present disclosure, a system and method that provides virtual sight as an aiming aid to a weaponry system is proposed that employs an external, smart wearable head mounted display to enable the operator of the weaponry system, or wearer of the HMD, to achieve the intended purpose of accurate shot placement. Accordingly, the mixed reality based HMD is built with capabilities of providing virtual aid in the form of vital cues for locating the aerial target and aiding in the actual engagement of the weapon with the target for accurate aiming and firing.

[0030] Re-referring to Fig. 1, a block diagram of proposed system 1000 is presented, wherein the system 1000 comprises of a peripheral object/target observation device 100, one or more aerial targets (to be aimed at) 50a, 50b, 50c,...50n (singularly referred to by numeral 50), a target data receiver (TDR) 200, a smart mixed reality based head mounted device (HMD) 500 wearable by the operator, and a weapon system 600 operable by the operator who is aiming the weapon 600 towards the stationary or moving aerial target 50 for final accurate shot.

[0031] In accordance with one exemplary embodiment, the tactical data from external peripheral object observation device 100 such as radar/lidar 100 is received by Target Data Receiver (TDR) 200. At TDR 200, the tactical data is received and processed. In one preferred embodiment, the TDR 200 is communicatively coupled with a computing module 250 where the target trajectory data is received and processed. A processing module 560, provided preferably at operator’s end then performs coordinate transformation of the tactical data such that trajectory of the target is transformed from TDR’s frame of reference to that of HMD’s frame of reference. In one exemplary embodiment, a complete surveillance picture and precise position of one or more aerial targets 50 as virtual targets (as explained later) may be rendered over HMD 500 for absolute aiming and firing. Finally, a weapon system 600 configured at operator’s end is manoeuvred such that a virtual reticle emanates from weapon system 600 and is made viewable via HMD 500 to the operator. This virtual reticle is eventually made to coincide with a virtual target rendered over the HMD 500 for accurate aiming and locking of the target 50.

[0032] In accordance with explanatory embodiment, the external peripheral object observation device 100 is configured to receive a reflected wave of the irradiated radar wave from the aerial target 50 existing at the irradiation destination. In one preferred embodiment, the peripheral object observation device (radar) 100 repeatedly observes a relative position of an aerial target 50 within an enhanced operational picture and transmits the data to Target Data Receiver (TDR) 200 over a radio communication. In one illustration, the peripheral object observation device 100 provides the coordinates of the aerial target 50 in the spherical coordinate system in the peripheral object observation device’s 100 frame of reference.
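For illustration, the spherical-coordinate return described above can be converted into local Cartesian coordinates with a few lines of Python. This is a minimal sketch; the function name and the azimuth/elevation convention are assumptions for the example, not details taken from the disclosure:

```python
import math

def spherical_to_cartesian(range_m, azimuth_deg, elevation_deg):
    """Convert a radar return (range, azimuth, elevation) in the observation
    device's spherical frame into local Cartesian (x, y, z) coordinates.
    Assumed convention: azimuth measured from the x-axis in the ground
    plane, elevation measured up from the ground plane."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)  # perpendicular height above the device's base
    return x, y, z

# Example: a target at 5 km range, 30 deg azimuth, 10 deg elevation
print(spherical_to_cartesian(5000.0, 30.0, 10.0))
```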

[0033] The coordinates received by the Target Data Receiver (TDR) 200 from the peripheral object observation device 100 are in addition to other encoded critical data that is later processed. In accordance with one example embodiment, at the TDR 200, at least one target 50 of the one or more targets is selected based on a plurality of factors, including, but not limited to, priority and estimation (PSQR) of target, target launch direction, target type, speed and distance of the target, target trajectory, lethality levels and the like.

[0034] Further, at TDR 200, the tactical data is processed and decoded to extract a unique identifier information of the selected target 50 assigned thereto by peripheral object observation device 100, radial distance and azimuthal angle of the target 50 from the peripheral object observation device 100, perpendicular height of the target 50 from the ground plane or base, heading angle of the target 50, speed of the target 50, IFF (identification of friend or foe) code of target 50, WCO (weapon control orders) code of the target determined/computed by peripheral object observation device 100, advanced target position, orientation and relative velocity of the target 50 approaching the firing zone. In one example embodiment, the tactical data is decoded from a hexadecimal byte string to extract the critical parametric values (as stated above) with respect to the (selected) target 50.
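The disclosure states only that the tactical data arrives as a hexadecimal byte string; the actual field encoding is not published. A decoding sketch under an entirely hypothetical packet layout (field order, widths and endianness are invented for illustration):

```python
import struct

# Hypothetical layout (big-endian): target id (u16), radial distance m (f32),
# azimuth deg (f32), height m (f32), heading deg (f32), speed m/s (f32),
# IFF code (u8), WCO code (u8).
PACKET_FMT = ">HfffffBB"

def decode_tactical(hex_string):
    """Decode one tactical-data packet from its hexadecimal string form."""
    fields = struct.unpack(PACKET_FMT, bytes.fromhex(hex_string))
    keys = ("id", "distance_m", "azimuth_deg", "height_m",
            "heading_deg", "speed_mps", "iff_code", "wco_code")
    return dict(zip(keys, fields))

# Round-trip demonstration with synthetic values
sample = struct.pack(PACKET_FMT, 7, 5000.0, 30.0, 1200.0, 270.0, 680.0, 1, 2)
print(decode_tactical(sample.hex()))
```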

[0035] The coordinate data of the target 50 along with other critical data received at the TDR 200 is processed, decoded and wirelessly transmitted to a smart mixed reality (MR) based head mounted display (HMD) 500 from where the operator designates the virtual target. In a preferred embodiment, TDR 200 is provisioned with a computing module 250 connected thereto over a wired or wireless connection.

[0036] In an embodiment, the tracking information with respect to the one or more intended or locked targets is continuously received at the computing module 250 in real time at an intermittent gap of approx. 1-3 secs. However, for fast-moving dynamic airborne threats/targets such as highly manoeuvrable aircraft, supersonic cruise missiles, tactical and strategic ballistic missile re-entry vehicles, other unmanned aerial vehicles (UAVs) etc., a time gap of 1-3 sec is highly significant. The reason is that the target 50 position will be displaced from its predicted position by a considerable distance (say, to an extent of a mile) within a few seconds; for instance, a target flying at Mach 2 (roughly 686 m/s at sea level) covers about 2 km in 3 seconds. This may bring immense uncertainty in target localization, particularly for target engagement.

[0037] In some specific embodiments, e.g. military or warfare applications, such an unpredictable target displacement of a mile or more may prove disastrous if not effectively monitored and tracked. Hence, continuous tracking and tracing of these targets in a relatively short time frame (e.g. seconds) becomes essential in such war-like scenarios. In a preferred embodiment, the present disclosure attempts to provide a solution by way of predicting the position of such threatening targets continually and accurately throughout, for exact aiming, interception and destruction by the weapon.

[0038] In an example scenario, dynamics of two warfare weapons are explained for clarity and understanding. To strike the target on the first shot, the guided or unguided trajectory of the weapon system must be accurately controlled. This is very much dependent on knowing the exact or probable position of the target threat or its deviation from a predetermined course. For sake of simplicity and explanation, let’s consider two kinds of weapon system:

A) Guided Airborne Ranged Weapons: The popular example that belongs to this warfare system is a missile (also referred to as a guided rocket) that is capable of self-propelled flight based on computation of changes in position, velocity, altitude, and/or rotation rates of a moving target and/or altitude profile based on information about the target's state of motion. The missile is guided on to its target based on the missile's current position and the position of the target, and computation of the course between them to acquire the target.

B) Unguided Weapons: This refers to any free-flight missile type having no inbuilt guidance system e.g. rockets. For such systems, instructions have to be relayed based on commands transmitted from the launch platform for course correction.

[0039] To achieve an accurate course and target location, at the computing module 250 the trajectory data of the selected target 50 is received and processed in real time for trajectory path synthesis via interpolation, which enables the operator to fire at any given point of time without having to wait to learn the exact location of the target from the radar 100. The target trajectory path synthesised at the computing module 250 provides the operator with a confident position of the target 50 at any instant, thereby making the overall operation of target aiming and firing more precise and accurate. Further, as will be acknowledged by those operating at field level, the hit rate in such dynamic war-like situations is not very optimal. However, with the virtual aid of the present system 1000, the probable target position can be predicted a few milliseconds ahead of time, which ensures more directed aiming and hitting of the target 50.

[0040] In one illustrative embodiment, the target trajectory path is interpolated and the future path is predicted by the computing module 250, as sketched below. The interpolation and prediction of the target trajectory is based on historical data of the track traced by the selected target 50 from the instance of its detection by the peripheral target observation device 100. The selected target 50 is observed for its flight path, which is influenced by numerous factors. The parameter values representative of the target's inherent aerodynamics (mass, moment of inertia, drag coefficients etc.), geometry, design, and the immediate environment of the target, such as air pressure, air temperature, wind velocity, humidity, etc., govern the particular trajectory traversed by the target. The computing module 250 predicts the trajectory path of the selected target 50 as a function of time from the instance of its detection by the peripheral target observation device 100.
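As a rough sketch of such interpolation and short-horizon prediction, one plausible approach (the disclosure does not specify the scheme) is a low-order polynomial fit per axis over the recent track history:

```python
import numpy as np

def predict_position(times_s, positions_m, t_query, degree=2):
    """Fit a low-order polynomial per axis to the observed track and evaluate
    it at t_query: between radar updates this interpolates the path; slightly
    beyond the latest update it predicts the near-future path."""
    times_s = np.asarray(times_s, float)
    positions_m = np.asarray(positions_m, float)  # shape (N, 3)
    coeffs = [np.polyfit(times_s, positions_m[:, ax], degree) for ax in range(3)]
    return np.array([np.polyval(c, t_query) for c in coeffs])

# Radar reports every ~2 s; estimate the position 0.5 s past the last report.
t = [0.0, 2.0, 4.0, 6.0]
xyz = [[0, 0, 1000], [1360, 80, 1005], [2730, 170, 1012], [4110, 270, 1021]]
print(predict_position(t, xyz, 6.5))
```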

[0041] In accordance with one working example of present embodiment, the computing module 250 is configured to predict position, target velocity, target trajectory and direction estimates of the selected target 50 using a combination of tracking filters. These tracking filters can be selected from a group comprising Extended Kalman Filter (EKF), Kalman filter (KF), Particle filter or Bayesian filter and Backpropagation trained neural network model for detecting trajectory path and predicting future position of the target 50 based on historical data of such target 50 captured from continuous frames of video of the surrounding environment.

[0042] The computing module 250 is trained in real time with data pertaining to trajectory traced in a predetermined period of time travelled by the target 50 for continual autonomous tracking. In one example embodiment, Extended Kalman Filter (EKF), Kalman filter, Particle filter or Bayesian filter is made operable by taking velocity and position estimates for a target 50 and then predicting where the target 50 will be in the next frame or instance. The actual position of the target 50 in the next video frame is then compared with the predicted position and the velocity, position and orientation estimates are updated.
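A minimal instance of the predict/update cycle described above, using a plain linear Kalman filter with a constant-velocity motion model (one member of the filter family the disclosure names; the noise values are illustrative):

```python
import numpy as np

class ConstantVelocityKF:
    """Minimal linear Kalman filter over the state [x, y, z, vx, vy, vz]."""

    def __init__(self, dt, meas_noise=25.0, accel_noise=5.0):
        self.x = np.zeros(6)                    # state estimate
        self.P = np.eye(6) * 1e4                # state covariance
        self.F = np.eye(6)                      # constant-velocity transition
        self.F[:3, 3:] = np.eye(3) * dt
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # observe position only
        self.R = np.eye(3) * meas_noise         # measurement noise
        self.Q = np.eye(6) * accel_noise        # coarse process noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:3]                       # predicted position

    def update(self, z):
        y = np.asarray(z, float) - self.H @ self.x   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P

kf = ConstantVelocityKF(dt=2.0)
for z in [[0, 0, 1000], [1360, 80, 1005], [2730, 170, 1012]]:
    kf.predict()
    kf.update(z)
print(kf.predict())  # position extrapolated one radar interval ahead
```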

[0043] In accordance with one working embodiment, the Kalman filter (or another variant of the Bayesian filter) may be executed along with other complementary filters (Bayesian/Markov methods) that are used to fuse data obtained from one or more sources. In another embodiment, the real-time coordinate position of a dynamically moving target is captured based on a backpropagation (BP) neural network model or any other preferred neural network model, whereby the track data of the moving target is learned and the model trained for future track prediction. In general, the BP neural network model includes the input layer, hidden layer, and output layer, and the network propagates errors backward, constantly adjusting the weights between the input layer and the hidden layer, and between the hidden layer and the output layer, to minimize errors.

[0044] The challenging and uncertain scenario of determining target movement is accomplished using BP neural network that can realize any non-linear mapping from the m-dimensional input to the n-dimensional output to better fit the non-linear curve according to the target historical track data, thus improving the track prediction performance. In one working embodiment of present disclosure, the data obtained regarding target track is pre-processed to eliminate outliers and biases in data that results in improved prediction accuracy of the track. The data is then normalized to reduce influence of maximum and minimum values in the data during prediction of neural network, and improve computation speed of the neural network. In this regard, the system architecture is universal and not tied to any specific learning algorithm, although certain learning algorithms may be beneficial in certain applications.

[0045] Post processing of the data, the BP neural network model is constructed: the network is first initialized, including initialization of weights and thresholds, and the number of network layers and neurons in each layer, the types of transfer functions, model training algorithms, number of iterations, and training objectives are defined for each layer. Now, the predicted value of the track is obtained, which most accurately and robustly determines the target motion in real time/near real time.
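A toy sketch of such a backpropagation-trained predictor, assuming a single hidden layer and the min-max normalization mentioned above; the architecture, window size and hyperparameters are illustrative, not taken from the disclosure:

```python
import numpy as np

rng = np.random.default_rng(0)

# From the last 4 normalized positions, regress the next one.
W1 = rng.normal(0, 0.1, (4, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.1, (16, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)        # hidden layer
    return h, h @ W2 + b2           # linear output

def train(X, y, lr=0.05, epochs=2000):
    global W1, b1, W2, b2
    for _ in range(epochs):
        h, pred = forward(X)
        err = pred - y                          # dLoss/dpred for 0.5*MSE
        gW2 = h.T @ err / len(X); gb2 = err.mean(0)
        dh = (err @ W2.T) * (1 - h ** 2)        # backprop through tanh
        gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1

# Normalize the historical track to [0, 1] before training, per [0044].
track = np.linspace(0.0, 5000.0, 60)            # synthetic 1-D range history (m)
norm = (track - track.min()) / (track.max() - track.min())
X = np.stack([norm[i:i + 4] for i in range(len(norm) - 4)])
y = norm[4:].reshape(-1, 1)
train(X, y)
_, pred = forward(X[-1:])
print("next normalized position:", pred.item())
```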

[0046] In accordance with one noteworthy embodiment, it is to be understood that the TDR 200 receives the tactical data in its own frame of reference, as mentioned above. Now, a coordinate transform has to be carried out for transforming the target 50 position from the TDR's 200 frame of reference to the HMD's frame of reference for parallax correction, particularly in scenarios where the operator or wearer of the HMD 500 is positioned at a distance from the TDR 200. Thus, the parallax shift between the two frames of reference (TDR 200 and HMD 500) is next corrected to enable the HMD 500 to view the target 50 from the TDR's frame of reference. The computing module 250 thus determines correction data from the TDR's frame of reference to the HMD's by computing 6dof pose correction data in real time that comprises the translation and orientation offset between the TDR 200 and the HMD 500.

[0047] The correction data is now transmitted to a processing module 560, which is typically hosted at HMD’s 500 wearer end. The processing module 560 now performs the coordinate transformation of the tactical data such that trajectory of the selected target 50 is transformed, as explained here below in detail.

[0048] For simplicity of computation and reference purposes, let's consider the TDR 200 as source S, the aerial target 50 as P, the mixed reality-based HMD 500 as H, and the weapon system 600 as G. Referring to Fig. 2, tactical three-dimensional information pertaining to the aerial target P is received from TDR S in a spherical coordinate system. Symbolically, the transformation of an object $a$ w.r.t. $b$, denoted $T_a^b$, is represented as a 4x4 homogeneous matrix:

$T_a^b = \begin{bmatrix} R_{3\times3} & \mathrm{Trans}_{3\times1} \\ 0_{1\times3} & 1 \end{bmatrix}$, where $R_{3\times3}$ is the rotation matrix and $\mathrm{Trans}_{3\times1}$ is the translation vector.

[0049] Therefore, using the above notation, the 3-dimensional pose of the aerial target 50 (P) w.r.t. the TDR 200 (S) is written as $T_P^S$. Next, the position of the MR based HMD 500 (H) with respect to the TDR 200 (S), represented as $T_H^S$, is computed using a combination of positioning methods. In particular aspects, real time or near real time dynamic location data of the selected target 50 is obtained using global positioning system (GPS) and inertial measurement unit (IMU) readings, e.g., its coordinates, trajectory, attitude, heading, etc. These readings are complemented with real-time kinematics (RTK) GPS to increase the accuracy of the position data derived from the GPS/IMU readings. RTK usually relies on a single observation reference point or an interpolated virtual point to provide real-time correction, thereby providing mm-level accuracy under various operating conditions. Therefore, the HMD 500 is equipped with GPS/RTK and an IMU to obtain the absolute position of the target 50 along with true north bearing and azimuth values for orientation offset correction between the TDR 200 and HMD 500 frames of reference.

[0050] In one example embodiment, GPS/GNSS RTK is enabled via a base station hosted at the end of the computing module 250 and a rover hosted at the end of the processing module 560. Here, the base station transmits its absolute position and 6dof pose partial correction data associated with the RTK rover to the processing module 560. Next, the RTK rover hosted at the end of the processing module 560 is configured to compute the position of the TDR 200 with respect to the HMD 500 for parallax correction based on the received absolute position of the base station, the 6dof pose partial correction data of the rover and the GPS readings of the rover. Thus, with the combination of the GPS and RTK positioning systems, positional data down to millimetre-resolution accuracy may be obtained. However, it should be understood that the concepts disclosed in the present disclosure are capable of being implemented with different types of systems for acquiring accurate global position data and are not limited to the specific types and numbers of such devices described and depicted herein.
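One standard building block for differencing such base and rover positions is the WGS84 geodetic-to-ECEF-to-ENU conversion, sketched below. This is textbook geodesy offered as an assumption about how the offset might be computed, not the patent's stated pipeline:

```python
import math

A = 6378137.0            # WGS84 semi-major axis (m)
E2 = 6.69437999014e-3    # WGS84 first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h_m):
    """Convert geodetic coordinates to Earth-centred Earth-fixed (ECEF)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)
    return ((n + h_m) * math.cos(lat) * math.cos(lon),
            (n + h_m) * math.cos(lat) * math.sin(lon),
            (n * (1 - E2) + h_m) * math.sin(lat))

def enu_offset(base_llh, rover_llh):
    """East/North/Up vector (metres) from the base station to the rover."""
    bx, by, bz = geodetic_to_ecef(*base_llh)
    rx, ry, rz = geodetic_to_ecef(*rover_llh)
    dx, dy, dz = rx - bx, ry - by, rz - bz
    lat, lon = math.radians(base_llh[0]), math.radians(base_llh[1])
    e = -math.sin(lon) * dx + math.cos(lon) * dy
    n = (-math.sin(lat) * math.cos(lon) * dx
         - math.sin(lat) * math.sin(lon) * dy + math.cos(lat) * dz)
    u = (math.cos(lat) * math.cos(lon) * dx
         + math.cos(lat) * math.sin(lon) * dy + math.sin(lat) * dz)
    return e, n, u

# Hypothetical coordinates (lat deg, lon deg, ellipsoidal height m)
base = (18.5204, 73.8567, 560.0)
rover = (18.5214, 73.8580, 562.0)
print(enu_offset(base, rover))  # roughly 137 m east, 111 m north, 2 m up
```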

[0051] Now, in order to get the 3-dimensional pose of the aerial target 50 with respect to the MR based HMD 500, represented as $T_P^H$, the following equation is applied:

$T_P^H = T_S^H \cdot T_P^S$ ... (Equation 1)

[0052] This implies that the offset correction or parallax deviation correction to overlay the virtual target on the real target may be represented as:

$T_S^H = \left(T_H^S\right)^{-1}$

[0053] where $T_S^H$ = 3-dimensional pose of the TDR 200 (S) w.r.t. the MR based HMD 500 (H).

[0054] As explained above, for generating $T_H^S$, the translation position is obtained from a combination of GPS, RTK and IMU readings, while the rotation is computed using the true north bearing and IMU readings.
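The frame transformation of Equation 1 is a composition of 4x4 homogeneous transforms. A brief sketch (function names are hypothetical):

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from R (3x3) and t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def invert_T(T):
    """Closed-form inverse of a rigid transform."""
    R, t = T[:3, :3], T[:3, 3]
    return make_T(R.T, -R.T @ t)

def target_in_hmd_frame(T_P_S, T_H_S):
    """Equation 1: T_P_H = (T_H_S)^-1 @ T_P_S, i.e. the target pose in the
    HMD frame from its pose in the TDR frame and the HMD pose in the TDR
    frame (the latter from GPS/RTK + IMU, per paragraphs [0049]-[0050])."""
    return invert_T(T_H_S) @ T_P_S

# Example: target 4 km in front of the TDR; HMD 30 m to the TDR's side.
T_P_S = make_T(np.eye(3), np.array([4000.0, 0.0, 1200.0]))
T_H_S = make_T(np.eye(3), np.array([0.0, 30.0, 0.0]))
print(target_in_hmd_frame(T_P_S, T_H_S)[:3, 3])  # parallax-corrected position
```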

[0055] In the next exemplary embodiment, referring to Fig. 3, tactical information such as the 3dof position of the aerial target 50 is rendered over the mixed reality based HMD 500 in the form of a hollow blip at all times. Thus, a virtual target is spawned in the HMD 500 along with the visual directional cues to identify and locate the spawned target 50 in the airspace. However, in one exemplary embodiment, for the wearer of the HMD 500, who may also be the operator of the weapon system 600, the virtual target viewable through the HMD may have to be aligned for accurate aiming and shooting. In general, aligning the target with the weapon's sight is referred to as zeroing in of the weapon 600.

[0056] Weapon Zeroing in is one of the most essential principles underpinning the effective locking in of the intended target. It involves setting and calibrating the sights to enhance firing accuracy. This zeroing process is one of the most critical elements of accurate target engagement. To accurately engage targets, the strike of a bullet must coincide with the aiming point (Point of Aim/Point of Impact) on the target 50. In general, in order to effectively use a weapon 600, it is imperative for the weapon 600 to be properly calibrated and zeroed in with the sight for securing enhanced functional and operational survivability in a dynamic, hostile situation such as a battlefield.

[0057] However, in the present context, the real target is viewable to the operator as a virtual target displayed over his worn HMD 500. For calibration purposes, the frame of reference of the weapon system 600 is required to be aligned with that of the HMD's 500 frame of reference, where the transformation between the two frames, $T_P^H$, is computed using Equation 1. Now, choosing the MR based HMD 500 as the frame of reference, the 3-dimensional pose of the weapon system 600 'G' with respect to the HMD glasses 500 'H', i.e. $T_G^H$, is required to be computed.

[0058] In one explanatory embodiment, $T_G^H$ values can be computed using a combination of proximity sensors and IMUs that can be strategically arranged on at least one side of the HMD 500 that is adjacent to the weapon system 600. Likewise, another set of IMU and proximity sensors can be arranged on the weapon system 600 that is held in the vicinity of the HMD 500 wearing operator. Further equivalents, alternatives and modifications of the above computation are also possible, as would be recognized by those skilled in the art. Using similar logic as above, the pose of the aerial target 50 'P' with respect to the weapon system 600 'G' can be found using the following equation:

$T_P^G = \left(T_G^H\right)^{-1} \cdot T_P^H$

[0059] Once the alignment of these frames of reference is successfully achieved, the wearer of the HMD 500 views a virtual target 50' spawned in the HMD 500 along with the visual directional cues to identify and locate the spawned target 50 in the airspace. As can be seen in Fig. 3, the virtual target 50' can be located as a virtual blip that is in principle overlaid on the real target 50. In one example embodiment, the wearer is rendered with virtual situational cues along with the virtual blip on display of the target's 50 current position for accurate engagement of the real target 50.

[0060] In one implementation, the HMD 500 may use a raycasting technique to determine the path that has to be travelled by the ammunition fired from the weaponry system 600. In various implementations, the raycasting technique can include casting a thin bundle of rays with substantially no width, or a ray with substantial width (e.g. a cone). The operator thus views the virtual ray emanating from the weapon system 600 via the HMD 500 such that the virtual target 50' coincides with the virtual ray/reticle for accurate aiming and firing at the real target 50, which is overlaid with the virtual target 50'.
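Checking whether the cast ray coincides with the virtual target reduces to measuring how close the ray passes to the target position. A minimal sketch, with a hypothetical miss-distance tolerance standing in for the system's actual lock criterion:

```python
import numpy as np

def ray_hits_target(origin, direction, target_pos, tolerance_m):
    """Does a ray cast from the weapon barrel pass within tolerance_m of the
    virtual target? Returns (hit, closest perpendicular distance in metres)."""
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)                    # unit aiming direction
    v = np.asarray(target_pos, float) - np.asarray(origin, float)
    s = float(v @ d)                             # distance along the ray
    if s < 0:                                    # target behind the muzzle
        return False, float(np.linalg.norm(v))
    closest = float(np.linalg.norm(v - s * d))   # perpendicular miss distance
    return closest <= tolerance_m, closest

hit, miss = ray_hits_target([0, 0, 2], [0.70, 0.10, 0.70],
                            [3500, 500, 3500], tolerance_m=25.0)
print(hit, round(miss, 1))
```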

[0061] In another alternative scenario, the virtual blip may take form of virtual lasers and virtual crosshair pointers for target sighting, tracking, locking and engagement as adjustable overlays over the MR based glasses of the HMD 500; and enable target sighting & locking, weapon deployment and Beyond Line of Sight (BLOS) capability.

[0062] To achieve the above features, the overall schema of HMD functionality is partitioned into hardware and software domains. In one embodiment, the dedicated hardware provides a Mixed Reality (MR) based HMD 500, the HMD having at least MR glasses 510, a communication module 520, one or more cameras 530, a display unit 540, an audio unit 550, a processing module 560, and a plurality of dock-able sensors 570 (a, b, c... n) mounted on one or more external devices 700 wirelessly connected with the HMD 500.

[0063] The one or more external devices 700 are selected from external cameras, a weapon firing system, an aiming device installed on a handheld weapon system 600 such as a gun, rifle etc., unmanned aerial vehicles (UAVs), other HMDs, and external computing devices. Further, the processing module 560 is configured to receive data from the one or more cameras 530 and/or the one or more dock-able sensors 570 on the one or more external devices 700, the data being accumulated with respect to the selected target 50 including the target's surrounding environment, situational awareness, navigation, binocular vision, weather conditions and presence of objects & humans. In one additional embodiment, this data is processed by the processing module 560 to further determine information related to target detection, IFF (Identification of Friend or Foe), locations of targets & team-mates, velocity & distance estimation, weapon information etc.

[0064] In another aspect of present invention, the method provides virtual aid in the form of situational cues for identifying and locating the aerial target 50 and aiding in actual engagement with the target 50 using virtual sights in the form of a virtual laser pointer or laser blip with a raycasting technique as described below:

[0065] 1. The object observation device (radar) 100 provides the world coordinates of the real target 50 in the spherical coordinate system. A virtual target 50' is spawned in mixed reality head mounted device (HMD) 500 along with the visual directional cues to find the spawned target in the airspace.

[0066] 2. A virtual gaze in the form of a crosshair is overlaid in the mixed reality head mounted device 500 vision.

[0067] 3. The barrel of the gun/weapon system 600 is tracked to give its six degrees of freedom (6dof) position in world coordinates using precise object tracking.

[0068] 4. The position of the weapon 600 and its aiming direction are shown in the mixed reality head mounted device 500 in the form of a virtual sight/ray-cast.

[0069] 5. The virtual alignment would encompass aligning the HMD 500 gaze crosshair with the target overlay, and aligning the virtual sight with the HMD gaze crosshair and the target. As all three overlays are in real-world coordinates, these can be used as an aid for the actual aiming and engaging with the target 50.

[0070] 6. The alignment and the shape of the ray-cast take into consideration the weapon specifications, the current positions of the target, and the related ballistic calculation for perfect zeroing.
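A crude numerical sketch of such a ballistic parametric curve, integrating gravity plus quadratic air drag; the parameter names and values are placeholders, and real fire-control ballistics would be considerably more elaborate:

```python
import numpy as np

def ballistic_curve(muzzle_speed, elevation_deg, mass, drag_coeff, area,
                    dt=0.01, max_t=60.0, rho=1.225, g=9.81):
    """Sample points of a projectile trajectory under gravity and quadratic
    air drag; the HMD can render such points as the curved virtual reticle."""
    el = np.radians(elevation_deg)
    pos = np.zeros(2)                                 # (downrange, height) m
    vel = muzzle_speed * np.array([np.cos(el), np.sin(el)])
    k = 0.5 * rho * drag_coeff * area / mass          # drag constant
    points = [pos.copy()]
    for _ in range(int(max_t / dt)):
        speed = np.linalg.norm(vel)
        acc = np.array([0.0, -g]) - k * speed * vel   # gravity + drag
        vel = vel + acc * dt
        pos = pos + vel * dt
        points.append(pos.copy())
        if pos[1] < 0:                                # reached the ground plane
            break
    return np.array(points)

# e.g. a 45 g round at 900 m/s muzzle velocity, 10 deg elevation
traj = ballistic_curve(900.0, 10.0, mass=0.045, drag_coeff=0.3, area=1.3e-4)
print("impact range ~", round(traj[-1, 0], 1), "m")
```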

[0071] 7. Depth occlusion and mapping using mixed reality HMD 500 may be used for finding the intersection of the ray-cast and the virtual target that overlays the real target. This intersection is used to render a virtual overlay like a bullseye. This is an indication that the required alignment has been achieved. Different shapes and colors may denote the confidence/probability of hitting the target as computed by the system.

[0072] 8. The specific HMD 500 integrates Inertial Measurement Unit (IMU), GPS and optical sensors. Using these sensors, the 6dof position of the head as well as the true North direction is computed accurately.

[0073] 9. The unique graphical user interface (GUI) for the system shows the elevation and azimuth relative to the true North of the sight/head. The GUI also shows cues in the form of directional arrows which help to locate the target. The GUI is customized according to the weapon system that it is to be used for. The system might or might not be used with radar based systems for target tracking. The GUI shows the IFF values (differentiation between friendly and foe targets), target ids, target current velocity, heading and position. The GUI is equipped with warning systems to indicate if certain target is within different range limits. Digital zooming, toggling on-off different GUI features are supported. The GUI can show different situational awareness information like weather condition, ammunition status, information about systems in the same troop, thermal/night vision feed, tank detection, vehicle detection, human detection etc.

[0074] In one important aspect of the present disclosure, the system is made effectively operable for all weather conditions. With the virtual target 50' overlaid on the real target 50 and displayed as a hollow blip, all-time visibility and tracking of the real target 50 is achieved even under bad weather conditions (fog, smoke, smog, clouds, etc.). In another implementation, situational cues may appear in front of the HMD display 500 in a form that neither occludes the objects on display, nor distracts the wearer from the content being shown. Thus, these cues do not block the wearer's vision, and the system does not inadvertently emphasize cues that could obstruct the user's clear aiming at the target 50.

[0075] In accordance with another significant aspect of the disclosure, the HMD 500 is provided with an advanced optical element that reduces glare from sunlight or another source of bright light autonomously, without requiring any manual intervention to adjust the amount of light allowed to pass through. Generally, in scenarios where transparent HMD glasses are used, the virtual overlays in the display unit of the HMD 500 are rendered translucent to transparent against light backdrops, making them difficult to see. In order to overcome this limitation, the glasses may be coated with an advanced optical element or film that can autonomously switch the visibility parameters of the HMD with dynamically changing outside weather conditions.

[0076] In one working embodiment, one or more ambient light sensors are provisioned on the HMD 500 that gather data on the surrounding outside weather conditions and input it to the optical element of the HMD for electric stimulation and, eventually, glass tint/opacity modulation. Precisely, the optical element herein comprises an electrochromic element 590 that is configured to monitor, adjust and limit the light by changing its opacity and/or colour in response to electric stimulation, such as the application of a voltage generated in response to input received from the ambient light sensors. For example, the higher the voltage, the greater the opacity of the HMD 500 glasses, for clear and distinct viewing of virtual overlays. In one working embodiment, such an electrochromic element may span the entire region of the optical element or be present only in some portion thereof. It is however appreciated that such specific configurations are merely illustrative and not intended to be limiting.

[0077] Accordingly, the electrochromic element 590 may be electrically actuated, which results in an increase in the opacity of the HMD 500. Here, the degree or level of opacity may be determined based on a plurality of parameters such as the duration and/or amplitude and/or form and/or frequency of the applied electrical signal. The change in opacity refers to changing a colour, shade, hue, gamma, clarity, transmittance, light scattering, polarization, other optical characteristics, attack time, decay time, shape, outline, pattern, and size of said at least one region. In another event, the HMD 500 may be quickly returned to its non-opaque condition within seconds.
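By way of a non-limiting illustration only, the ambient-light-driven opacity control of paragraphs [0076] and [0077] may be sketched in Python as follows. The linear mapping, the 3 V maximum drive voltage and the lux ceiling are hypothetical assumptions; the actual relationship between the applied signal and the resulting opacity is governed by the particular electrochromic element 590 employed.

    def ambient_to_voltage(lux, lux_max=100000.0, v_max=3.0):
        # Map the ambient light sensor reading to a drive voltage for the
        # electrochromic element 590: brighter surroundings -> higher
        # voltage (linear mapping and limits are illustrative assumptions).
        lux = max(0.0, min(lux, lux_max))
        return v_max * (lux / lux_max)

    def voltage_to_opacity(voltage, v_max=3.0):
        # Per paragraph [0076]: the higher the applied voltage, the greater
        # the opacity, keeping virtual overlays distinct in bright light.
        return min(1.0, max(0.0, voltage / v_max))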

[0078] In one other aspect of the present invention, the HMD 500 is configured with a display unit 540 and an audio unit 550 to provide an operator with a visual or audible warning that is activated based on target range, target speed, target type, target velocity and trajectory, the IFF displayed, visual GUIs, target direction, lethality of the target to the operator (thresholds as per weapon type/system), weapon-specific instructions, etc. The audio and video cautions and cues are provided to the operator for taking corrective action, and in one other embodiment, an audio tone may be set with visual changes in symbology from non-flashing to a flashing bright red alert.

[0079] The audio alert enables a uniquely tailored response by the operator to any event or required action by integrating a definable range of alert inputs with the audio alert notification for the ultimate in situational awareness and response. The audio feature is integrated with prerecorded, optimized messages (which may be voice messages, tones or any other audible or visual trigger signals) to allow the operator to trigger the output upon the breach of any target-associated rule.
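By way of a non-limiting illustration only, the rule-driven audio/visual alerts of paragraphs [0078] and [0079] may be sketched as follows, reusing the hypothetical TargetInfo structure sketched earlier. The rule set and thresholds are illustrative assumptions, the disclosure itself leaving them definable per weapon type/system.

    def evaluate_alerts(target, weapon_range_m):
        # Illustrative target-associated rules: each breach yields a
        # (channel, cue) pair; real thresholds are per weapon type/system.
        alerts = []
        if target.iff == "FOE" and target.distance_m <= weapon_range_m:
            alerts.append(("VISUAL", "symbology switches to flashing bright red"))
            alerts.append(("AUDIO", "prerecorded in-range warning tone"))
        elif target.iff == "UNKNOWN":
            alerts.append(("VISUAL", "identification prompt"))
        return alerts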

EXAMPLES

[0080] The present invention is described hereinafter by way of various embodiments. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, the embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art.

[0081] In accordance with an exemplary embodiment of the present invention, the system comprises a plurality of computing devices and a plurality of dock-able sensors mounted on a military grade AR headset, operated by users using military grade Mixed Reality (MR) glasses. The system comprises one or more image capturing modules, one or more RGB cameras, ToF (time of flight) or depth cameras, and IR stereoscopic cameras. The plurality of computing devices comprises, but is not limited to, a microphone, a speaker, a user interface, and an artificial intelligence module. Further, the computing devices include a plurality of electronic components such as a microprocessor, a memory unit, a power source, and a user interface. The user interface may be activated or utilized by the user by pressing a button, hovering the hand and/or other body parts, or providing audio input and/or tactile input through one or more fingers. The plurality of computing devices may be one or more of, but is not limited to, a wearable device such as a Head Mounted Device (HMD) or smart eyewear glasses. Further, the one or more dock-able sensors include, but are not limited to, threat detection sensors, infrared sensors, night-vision sensors, a thermal sensor, IFF (identification friend or foe), Lidar (Light Detection and Ranging), Blue Force Tracking, SONAR and GPS.

[0082] In accordance with an exemplary embodiment of the present invention, the plurality of computing devices may include, but is not limited to, a wearable device such as a Head Mounted Device (HMD) or smart eyewear glasses. The plurality of computing devices is envisaged to include computing capabilities such as a memory unit configured to store machine-readable instructions. The machine-readable instructions may be loaded into the memory unit from a non-transitory machine-readable medium, such as, but not limited to, CD-ROMs, DVD-ROMs, and flash drives. Alternatively, the machine-readable instructions may be loaded in the form of a computer software program into the memory unit. The memory unit in that manner may be selected from a group comprising EPROM, EEPROM and Flash memory.

[0083] In accordance with an exemplary embodiment of the present invention, the military-grade headset includes, but is not limited to, one or more glasses, one or more image capturing modules, one or more IR stereoscopic cameras, one or more RGB cameras, ToF or depth cameras, one or more microphones, and one or more speakers. The one or more glasses, image capturing module, RGB cameras, ToF or depth cameras, IR stereoscopic cameras, Inertial Measurement Unit (IMU), microphone, and speaker are operatively connected. The one or more glasses are configured to provide a 60-degree field of vision, which provides a wider field of view. In another aspect, the system coupled with the glasses provides the user with images and videos of targets and locations beyond the line of sight.

[0084] In accordance with an embodiment of the present invention, the MR glasses are military grade and made of a material selected from a group comprising polycarbonate, aluminium alloy and rubber polymer.

[0085] In accordance with an embodiment of the present invention, the MR glasses are provided with UV protection and shock-proof capability, with anti-scratch and anti-fog coatings, and with an electrochromic coating by which the transparency of the MR glasses is changed from dark shades to no tint, automatically or manually, based on surrounding light and to adjust the clarity of holograms, in order to withstand different conditions.

[0086] In accordance with an embodiment of the present invention, the one or more dock-able sensors include threat detection sensors, infrared sensors, night vision sensors, a thermal sensor, IFF (identification friend or foe), Lidar (Light Detection and Ranging), Blue Force Tracking, SONAR and GPS.

[0087] In accordance with an embodiment of the present invention, the HMD is operated using one or more of physical buttons, hand-gestures, voice commands and gaze-tracking for interaction.

[0088] In accordance with an embodiment of the present invention, the system enables wireless communication between the HMD and one or more external devices selected from external cameras, a weapon firing system, and an aiming device installed on a handheld gun, using the communication module, and receives sufficient information for target sighting, locking, and engagement that does not require information from the radar.

[0089] In accordance with an embodiment of the present invention, the HMD MR glass can connect to the existing physical sight of the weapon firing system, so the user can switch from the virtual sight to the actual view of the physical sight within the glass user interface itself. This gives the additional benefit of using the physical sight as a handheld camera to look around corners without endangering the user.

[0090] In accordance with an embodiment of the present invention, the information received from the one or more external devices includes one or more of: live feeds from external cameras and UAVs, live and enhanced satellite images, information from the radar, weapon information, locations and audio-visual data from other HMDs, and audio or video information from external computing devices.

[0091] In accordance with an embodiment of the present invention, the processing module is configured to project information received from radar directly to the MR glasses and enable a user to lock and engage the target without any additional human intervention.

[0092] In accordance with an exemplary embodiment of the present invention, the user interface is provided to enable the user to navigate between various mixed reality information overlays and to use the sensor data and information in the most efficient manner as per the user's requirements, without it being a hassle to the user. The exemplary user interface includes, but is not limited to, one or more buttons, a gesture interface, an audio interface, a touch-based interface, an eye-tracking interface that tracks gaze and focus, an EEG-based brain-computer interface, and the like.

[0093] In accordance with an exemplary embodiment of the present invention, the system provides information visualization, an intuitive interface, and non-intrusive, adjustable overlays.

[0094] The exemplary method of working of the system is discussed below; an illustrative sketch of this flow is given after paragraph [0095]. The method starts when the one or more IR stereoscopic cameras of the system described above, along with the other dock-able sensors and the microphone, capture audio, visual, and situational data. The dock-able sensors are used to sense the situation around the user, and the information they read alerts the user about threats. The data captured by the cameras and the dock-able sensors is sent to the computing system, which intelligently processes the data and gives an assessment of the conditions around the user.

[0095] It should be understood that the techniques of the present disclosure might be implemented using a variety of technologies. For example, the methods described herein may be implemented by a series of computer-executable instructions residing on a suitable computer-readable medium. Suitable computer-readable media may include volatile (e.g. RAM) and/or non-volatile (e.g. ROM, disk) memory, carrier waves, and transmission media. Exemplary carrier waves may take the form of electrical, electromagnetic or optical signals conveying digital data streams along a local network or a publicly accessible network such as the Internet.
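By way of a non-limiting illustration only, the capture-process-alert flow of paragraph [0094] may be sketched in Python as the following loop. All object interfaces (capture, read, process, alert_user) are hypothetical placeholders for exposition and do not denote any actual API of the system.

    def sensing_loop(cameras, dockable_sensors, computing_system):
        # Illustrative flow of paragraph [0094]: capture audio/visual and
        # situational data, forward it for intelligent processing, and
        # alert the user when a threat is assessed. All interfaces here
        # are hypothetical placeholders.
        while True:
            frames = [camera.capture() for camera in cameras]
            readings = [sensor.read() for sensor in dockable_sensors]
            assessment = computing_system.process(frames, readings)
            if assessment.threat_detected:
                computing_system.alert_user(assessment)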

[0096] It should also be understood that, unless specifically stated otherwise as apparent from the discussion, terms used throughout the description such as "controlling" or "obtaining" or "computing" or "storing" or "receiving" or "determining" or the like refer to the actions and processes of a computer system, or similar electronic computing device, that processes and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system's memories or registers or other such information storage, transmission or display devices.

[0097] Various modifications to these embodiments will be apparent to those skilled in the art from the description and the accompanying drawings. The principles associated with the various embodiments described herein may be applied to other embodiments. Therefore, the description is not intended to be limited to the embodiments shown in the accompanying drawings, but is to be accorded the broadest scope consistent with the principles and the novel and inventive features disclosed or suggested herein. Accordingly, the invention is intended to embrace all such alternatives, modifications, and variations that fall within the scope of the present invention.