
A System For Advanced Surveillance And Security Using Mixed Reality Head Mounted Device

Abstract: A system (100) for advanced surveillance and security using a mixed reality head mounted device comprises an MR-based HMD (102) adjustably mounted on a helmet (110) using a mounting mechanism (108), the HMD (102) having at least MR glasses (1022), a communication module (126), one or more cameras, a display unit (128), an audio unit (130) and a processing module (122); and a plurality of dock-able sensors (106) (LiDAR, SONAR etc.) mounted on the helmet (110) and connected with the HMD (102). Further, the system (100) is configured to enable wireless communication between the HMD (102) and one or more external devices such as a radar, Unmanned Aerial Vehicles (UAVs), external cameras, other HMDs (102), weapon firing systems and external computing devices; and to enable target sighting & locking, weapon deployment, Beyond Line of Sight (BLOS) capability, 360-degree vision and remote communication, using the information received from the plurality of dock-able sensors (106) and the external devices. [FIGURE 1A]


Patent Information

Application #: 202021011865
Filing Date: 19 March 2020
Publication Number: 39/2021
Publication Type: INA
Invention Field: PHYSICS
Status: Granted
Email: vivek@boudhikip.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2024-07-05
Renewal Date:

Applicants

DIMENSION NXG PRIVATE LIMITED
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra - 400607, India

Inventors

1. Pankaj Raut
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India
2. Abhijit Patil
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India
3. Abhishek Tomar
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India
4. Suraj
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India
5. Yukti Suri
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India
6. Shantanu Barai
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India
7. Moaz Momin
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India
8. Bikki Rajbhar
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India
9. Jaya Sai Kiran Patibandla
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India
10. Sunit Kumar Adhikary
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India
11. Janmesh Ukey
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India
12. Prathamesh Tugaonkar
501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India

Specification

FORM 2
THE PATENTS ACT 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
[See section 10 and rule 13]

A SYSTEM FOR ADVANCED SURVEILLANCE AND SECURITY USING MIXED REALITY HEAD MOUNTED DEVICE

DIMENSION NXG PRIVATE LIMITED, an Indian company, having office at 501, Arcadia, Hiranandani Estate, Patlipada, Ghodbunder Road, Thane, Maharashtra -400607, India

The following specification particularly describes the invention and the manner in which it is to be performed.
FIELD OF THE INVENTION
Embodiments of the present invention relate to mixed reality technology and Artificial Intelligence and more particularly to a system for advanced surveillance and security using mixed reality head mounted device, by utilizing Artificial Intelligence and multiple dock-able sensors.
BACKGROUND OF THE INVENTION
Strong armed forces are one of the important aspects of the development and growth of any country. The armed forces participate in peacekeeping operations, military exercises and humanitarian relief missions. They also carry out more traditional tasks such as securing the borders. To prepare strong armed forces, various mission trainings are provided to the soldiers. However, the conventional mission training provided to soldiers is time-consuming, costly and suffers from the difficulty of simulating all real combat scenarios. Traditional mission training often causes difficulties for soldiers in imagining and studying the actual battlefield situation.
Further, soldiers carry a number of heavy devices for navigation, battlefield awareness and study of the battlefield. These heavy devices also slow down the working efficiency of the armed forces. The navigation devices and various security systems used by soldiers are affected by environmental conditions such as smoke, haze, fog, low light, etc., which cause deterioration of visibility. Traditional navigation devices used for on-field operations often suffer from a lack of live navigation cues in unknown terrains or indoor scenarios. For example, when a soldier is inside an armoured vehicle, due to the lack of openings or windows, there is very little visibility of the surroundings.
Moreover, repair and maintenance of the various devices carried by soldiers is often required on-field. Such tasks require experts and engineers to guide the soldiers on-field to carry out such repairs themselves. With the limited capability of traditional communication devices, such repair and maintenance tasks are difficult to carry out.
Therefore, there is a need in the art for a system for advanced surveillance and security using a mixed reality head mounted device, utilizing Artificial Intelligence and multiple dock-able sensors. Such a system should help improve human decision-making capability and human-to-machine interaction, and should not suffer from the above-mentioned deficiencies, thereby providing an effective and viable solution.
OBJECT OF THE INVENTION
An object of the present invention is to provide a system for advanced surveillance and security using mixed reality head mounted device, by utilizing Artificial Intelligence and multiple dock-able sensors.
Another object of the present invention is to improve human decision-making capability and human-to-machine interaction, in critical situations.
Yet another object of the present invention is to provide advanced lethality to the various security systems carried by the armed forces by upgrading the weapons by means of enhanced imaging and tracking sensors.
Yet another object of the present invention is to increase survivability and mission capability by providing live navigation cues especially in unknown terrains and indoor scenarios.
Yet another object of the present invention is to provide a system that operates flawlessly under low light as well as in adverse weather conditions such as dust, smoke, and haze.
Yet another object of the present invention is to provide a system having the capability to work in indoor as well as outdoor scenarios. The system also has operating modes for both day and night lighting scenarios.
Yet another object of the present invention is to provide a system that improves battlefield awareness by reducing the number of devices that must be carried, and helps the user accomplish the mission more efficiently.
Yet another object of the present invention is to provide a system for real time surveying and immersive training for military purposes.
Yet another object of the present invention is to provide see through capabilities to a user even when the user is inside an armoured vehicle or when the target is beyond line of sight.
Yet another object of the present invention is to provide information overlays and target detection by providing information over the normal view of glasses.
Yet another object of the present invention is to provide a system for on-field repair and maintenance of various machinery, artillery or military vehicles by providing remote guidance by a remotely located expert.
SUMMARY OF THE INVENTION
Embodiments of the current invention disclose a system for advanced surveillance and security using a mixed reality head mounted device, by utilizing Artificial Intelligence and multiple dock-able sensors.
The system comprises a Mixed Reality (MR)-based HMD adjustably mounted on a helmet using a mounting mechanism, the HMD having at least MR glasses, a communication module, one or more cameras, a display unit, an audio unit and a processing module; and a plurality of dock-able sensors mounted on the helmet and connected with the HMD. Further, the processing module is configured to: receive data from the one or more cameras and/or the plurality of dock-able sensors, the data being related to surroundings, situational awareness, navigation, binocular vision, weather conditions and presence of objects & humans; process the received data to determine information related to object detection, IFF (Identification Friend or Foe), locations of targets & team-mates, velocity & distance estimation, health status, weapon information, threat recognition and detection; provide the determined information in the form of a visualization, an intuitive interface, and non-intrusive, adjustable overlays over the MR glasses of the HMD, using the display unit; enable wireless communication between the HMD and one or more external devices selected from a radar, Unmanned Aerial Vehicles (UAVs), external cameras, other HMDs, weapon firing systems and external computing devices, using the communication module; and enable target sighting & locking, weapon deployment, Beyond Line of Sight (BLOS) capability, 360-degree vision and remote communication, using the information received from the plurality of dock-able sensors and the external devices.
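By way of a non-limiting illustration only, the receive-process-overlay-communicate cycle described above may be sketched in Python as follows; all names (Overlay, analyzers, etc.) are hypothetical and do not form part of the claimed system:

    from dataclasses import dataclass, field
    from typing import Any, Callable, Dict, List

    @dataclass
    class Overlay:
        label: str                          # text drawn next to the hologram
        position: tuple                     # (x, y) in display coordinates
        data: Dict[str, Any] = field(default_factory=dict)

    Analyzer = Callable[[List[Any], Dict[str, Any]], List[Overlay]]

    def processing_cycle(camera_frames: List[Any],
                         sensor_readings: Dict[str, Any],
                         analyzers: List[Analyzer]) -> List[Overlay]:
        """One cycle of the processing module: run each analyzer
        (object detection, IFF, threat recognition, ...) over the
        received data and collect the overlays to render."""
        overlays: List[Overlay] = []
        for analyze in analyzers:
            overlays.extend(analyze(camera_frames, sensor_readings))
        return overlays

    # Usage: a trivial analyzer that flags a low-battery sensor reading.
    battery_check = lambda frames, readings: (
        [Overlay("LOW BATTERY", (0, 0))] if readings.get("battery", 1.0) < 0.2 else [])
    print(processing_cycle([], {"battery": 0.1}, [battery_check]))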
In accordance with an embodiment of the present invention, the MR glasses are military grade and made of a material selected from a group comprising polycarbonate, aluminium alloy and rubber polymer.
In accordance with an embodiment of the present invention, the MR glasses are provided with UV protection and shock proof capability with anti-scratch, anti-fog coating and photochromatic coating with which the transparency of MR glasses is changed from dark shades to no tints, automatically or manually, based on surrounding lights and to adjust the clarity of holograms, in order to withstand different conditions.
In accordance with an embodiment of the present invention, the plurality of dock-able sensors include threat detection sensors, infrared sensors, night vision sensors, a thermal sensor, IFF (Identification Friend or Foe), LiDAR (Light Detection and Ranging), Blue Force Tracking, SONAR and GPS.
In accordance with an embodiment of the present invention, the HMD is operated using one or more of physical buttons, hand-gestures, voice commands and gaze-tracking for interaction.
In accordance with an embodiment of the present invention, the information received from the one or more external devices includes one or more of live feed from external cameras and UAVs, live and enhanced satellite images, information from the radar, weapon information, locations and audio-visual data from other HMDs and audio or video information from external computing devices.
In accordance with an embodiment of the present invention, the processing module is configured to project information received from radar directly to the MR glasses and enable a user to lock and engage the target without any additional human intervention.
In accordance with an embodiment of the present invention, the processing module is configured to enable a collaborative mixed reality session with other connected HMDs, to assist users in operation Planning and Live Monitoring.
In accordance with an embodiment of the present invention, the processing module is configured to enable remote communication with external computing devices to assist the user wearing the HMD in maintenance, repair and assembly of on-site machines.
In accordance with an embodiment of the present invention, a plurality of HMDs are connected with a control room for soldier data management, training and simulations.
In accordance with an embodiment of the present invention, the system includes a plurality of secondary sensors disposed on one or more external devices, the HMD, and/or the user associated with the respective HMD.
In accordance with an embodiment of the present invention, providing the health status further includes displaying the information of vital parameters such as heart rate, pulse rate, respiration rate, blood oxygen levels, irregular heartbeats, glucose levels, ECG, EEG and EMG, measured using the plurality of secondary sensors disposed on the user of the respective HMD.
BRIEF DESCRIPTION OF THE DRAWINGS
So that the manner in which the above recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.
These and other features, benefits and advantages of the present invention will become apparent by reference to the following text and figures, with like reference numbers referring to like structures across the views, wherein:
Fig. 1A-1C illustrate multiple views of a system for advanced surveillance and security using a mixed reality head mounted device, in accordance with an embodiment of the present invention;
Fig. 2A-2B illustrate the Head Mounted Device (HMD) of the system of Fig. 1A-1C, in accordance with an embodiment of the present invention;
Fig. 3 illustrates an exemplary implementation of the system of Fig. 1A-1C, in accordance with an embodiment of the present invention; and
Fig. 4-6B illustrate exemplary snapshots of multiple implementations of the system of Fig. 1 in multiple scenarios, in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION OF DRAWINGS
While the present invention is described herein by way of example using embodiments and illustrative drawings, those skilled in the art will recognize that the invention is not limited to the embodiments of drawing or drawings described and are not intended to represent the scale of the various components. Further, some components that may form a part of the invention may not be illustrated in certain figures, for ease of illustration, and such omissions do not limit the embodiments outlined in any way. It should be understood that the drawings and detailed description thereto are not intended to limit the invention to the particular form disclosed, but on the contrary, the invention is to cover all modifications, equivalents, and alternatives falling within the scope of the present invention as defined by the appended claims. As used throughout this description, the word "may" is used in a permissive sense (i.e. meaning having the potential to), rather than the mandatory sense, (i.e. meaning must). Further, the words "a" or "an" mean "at least one” and the word “plurality” means “one or more” unless otherwise mentioned. Furthermore, the terminology and phraseology used herein is solely used for descriptive purposes and should not be construed as limiting in scope. Language such as "including," "comprising," "having," "containing," or "involving," and variations thereof, is intended to be broad and encompass the subject matter listed thereafter, equivalents, and additional subject matter not recited, and is not intended to exclude other additives, components, integers or steps. Likewise, the term "comprising" is considered synonymous with the terms "including" or "containing" for applicable legal purposes. Any discussion of documents, acts, materials, devices, articles and the like is included in the specification solely for the purpose of providing a context for the present invention. It is not suggested or represented that any or all of these matters form part of the prior art base or were common general knowledge in the field relevant to the present invention.
In this disclosure, whenever a composition or an element or a group of elements is preceded with the transitional phrase "comprising", it is understood that we also contemplate the same composition, element or group of elements with the transitional phrases "consisting of", "consisting", "selected from the group consisting of", "including", or "is" preceding the recitation of the composition, element or group of elements, and vice versa.
The present invention is described hereinafter by various embodiments with reference to the accompanying drawings, wherein reference numerals used in the accompanying drawing correspond to the like elements throughout the description. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiment set forth herein. Rather, the embodiment is provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art. In the following detailed description, numeric values and ranges are provided for various aspects of the implementations described. These values and ranges are to be treated as examples only and are not intended to limit the scope of the claims. In addition, a number of materials are identified as suitable for various facets of the implementations. These materials are to be treated as exemplary and are not intended to limit the scope of the invention.
Figure 1 illustrates a system (100) for advanced surveillance and security using a mixed reality head mounted device. Herein, Figure 1A represents a perspective view of the system (100); Figure 1B illustrates a side view of the system (100); and Figure 1C illustrates a bottom view of the system (100). As shown in Figures 1A-1C, the system (100) comprises, but is not limited to, a Mixed Reality (MR)-based Head Mounted Device (HMD) (102) adjustably mounted on a helmet (110) using a mounting mechanism (108), and a plurality of dock-able sensors (106) mounted on the helmet (110) and connected with the HMD (102).
In accordance with an exemplary embodiment of the present invention, the helmet (110) can be any commonly available helmet worn by users, but preferably, the helmet (110) is a military grade helmet made of, but not limited to, Kevlar fibres, advanced aramid fibres, ultra-high-molecular-weight polyethylene or a combination thereof. In one embodiment, as the helmet (110) is to be used for military purposes, the helmet (110) may further include one or more neckbands/straps (116), ear covers (118) and a face cover (114). The one or more neckbands/straps (116) help in keeping the helmet (110) (and thereby the system) steady as the user wearing the HMD (102) moves around. Further, the face cover (114) may include one or more air filters, external microphones, protective covers, face masks, additional straps etc. Similarly, the ear covers (118) may be meant for, but not limited to, protection of the ears from the external environment, and/or cancellation of external noise, and/or providing earphones for the users. However, the one or more neckbands/straps (116), the ear covers (118) and the face cover (114) may all be considered optional components and may be removably provided with the helmet (110).
In accordance with an exemplary embodiment of the present invention, the plurality of dock-able sensors (106) connected with the HMD are selected from, but not limited to, threat detection sensors, infrared sensors, night vision sensors, a thermal sensor, IFF (Identification Friend or Foe), LiDAR (Light Detection and Ranging), Blue Force Tracking, SONAR and GPS. One exemplary arrangement may involve mounting the LiDAR on top of the helmet (110), the SONAR on the left side of the helmet (110) and the thermal sensor on the right (opposite the SONAR) side of the helmet (110). The same can be seen in Figure 1A. Herein, the thermal sensors enable the system (100) to work in all weather conditions, i.e., low light, night, fog, smoke, and dust. The plurality of dock-able sensors (106) are configured to improve day/night operation, visualization, and situational awareness. In an aspect, the headset may include one or more capabilities such as a threat detection capability, an IFF (Identification Friend or Foe) capability, a Blue Force Tracking capability, and the like. The threat detection capability is used to detect a threat and provide assisted threat recognition. The system (100) is used for identification of own forces and enemies for a quick and decisive response by using the IFF and Blue Force Tracking capabilities. The IFF capability is a radar-based identification functionality that enables detection of friend and enemy entities. Blue Force Tracking is a GPS-enabled capability that enables monitoring of real-time location information of own/friendly armed forces.
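As a non-limiting illustration of how Blue Force Tracking data might be combined with a detection to tag friend or unknown, the following Python sketch compares a detection's GPS fix against shared friendly-unit fixes; the function names, the 25 m gate and the data shapes are assumptions for illustration, not part of the specification:

    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two GPS fixes."""
        R = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * R * math.asin(math.sqrt(a))

    def tag_detection(detection_fix, blue_force_fixes, radius_m=25.0):
        """Mark a detection FRIENDLY if it coincides with a tracked friendly
        unit's GPS position (within radius_m), otherwise UNKNOWN pending IFF
        interrogation."""
        det_lat, det_lon = detection_fix
        for unit_id, (lat, lon) in blue_force_fixes.items():
            if haversine_m(det_lat, det_lon, lat, lon) <= radius_m:
                return ("FRIENDLY", unit_id)
        return ("UNKNOWN", None)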
In another aspect, the LiDAR sensors facilitate creating a high-resolution map of the surrounding terrain. The HMD (102) can be used to visualize terrain mapped by the docked LiDAR of the system (100), and also by more powerful LiDARs mounted on UAVs and drones, for a better idea of the larger battlefield terrain.
Additionally, the MR-based Head Mounted Device (HMD) (102) may be, but is not limited to, a mixed reality headset or smart eyewear glasses. As already mentioned above, the HMD (102) is adjustably mounted on the helmet (110) using the mounting mechanism (108) and is to be worn by a user. The same has been illustrated in Figures 2A and 2B. The HMD (102) may be operated using one or more of physical buttons, hand-gestures, voice commands and gaze-tracking for interaction.
Kindly note that the Mixed Reality-based HMDs (102) referred to herein may be envisaged to include capabilities of generating an augmented reality environment, a mixed reality environment and a virtual reality environment that lets the user interact with digital content within the environment generated in the HMD (102). Even though the specification mostly describes the HMD (102) as a mixed reality-based HMD, it will be appreciated by a skilled addressee that any Augmented or Virtual Reality-based HMD may be used without departing from the scope of the present invention. It will be understood by a person skilled in the art that the below-mentioned components of the HMD (102) and their description should be considered exemplary and not in a strict sense. The HMD (102) is envisaged to include at least MR glasses (1022), a communication module (126), one or more cameras (1024), a display unit (128), an audio unit (130) and a processing module (122). Some of these components are visible from the outside of the HMD (102), while the rest of the components are internal and therefore provided inside the HMD (102) (not visible). Such components have been shown in a block diagram in Figure 2A.
In accordance with an exemplary embodiment of the present invention, the MR glasses (1022) are military grade and may be made of a material selected from a group comprising, but not limited to, polycarbonate, aluminium alloy and rubber polymer. The military-grade MR glasses (1022) are designed to be shockproof, with shock/impact resistance. A coating material may be used for the military-grade MR glasses (1022), selected from, but not limited to, anti-fog, anti-dust, scratch-resistant, UV-protection, and photochromatic coatings. The photochromatic coating is used to change the transparency of the military-grade MR glasses (1022) from dark shades to no tints, automatically or manually, based on surrounding lights and to adjust the clarity of holograms. The MR glasses (1022) are configured to provide a high resolution and wider field of view. The MR glasses of the present invention may provide a wider field of view in the range of 55-70 degrees.
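Purely as an illustrative sketch of the automatic mode, ambient illuminance could be mapped to a tint factor as below; the lux thresholds and the log-linear ramp are assumed values for illustration, not values taken from the specification:

    import math

    def tint_level(ambient_lux: float, manual_override: float = None) -> float:
        """Map ambient light to a tint factor in [0, 1]:
        0.0 = fully clear (no tint), 1.0 = darkest shade."""
        if manual_override is not None:              # manual mode wins
            return max(0.0, min(1.0, manual_override))
        if ambient_lux <= 100:                       # indoor / night: clear
            return 0.0
        if ambient_lux >= 50000:                     # direct sunlight: darkest
            return 1.0
        # log-linear ramp between the two thresholds
        return (math.log10(ambient_lux) - 2.0) / (math.log10(50000) - 2.0)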
Further, the communication module (126) is configured to enable wireless communication with the one or more external devices via a wireless communication network (not shown in Figures 2A-2B). The wireless communication network may be, but is not limited to, a wireless intranet network, Wi-Fi internet or a GSM/GPRS-based 2G, 3G, 4G, LTE or 5G communication network. In that sense, the communication module (126) may include one or more of, but not limited to, a Wi-Fi module, an RF module and a GSM/GPRS module. Preferably, the communication network is internet or Radio Frequency (RF).
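A minimal sketch of such device-to-device messaging, using only the Python standard library over UDP multicast; the multicast group, port and message schema are hypothetical:

    import json, socket, time

    def broadcast_status(sock: socket.socket, group: str, port: int, payload: dict) -> None:
        """Serialise a telemetry/status message and send it to the squad
        multicast group, to which other HMDs, UAV ground stations and the
        control room may subscribe."""
        message = {"ts": time.time(), **payload}
        sock.sendto(json.dumps(message).encode("utf-8"), (group, port))

    # Usage (addresses and fields illustrative):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)
    broadcast_status(sock, "239.0.0.42", 5005,
                     {"hmd_id": "HMD-01", "lat": 19.2502, "lon": 72.9702})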
In addition, the audio unit (130) (shown in Figures 2A and 2B) includes an array of stereoscopic microphones (1034) and one or more speakers (1032). The one or more microphones (1034) are configured to capture binaural audio along with the motion of the user, which may include voice commands for performing certain operations, and also capture 3D stereo sound with acoustic source localization with the help of the IMU. The one or more speakers (1032) are envisaged to provide output audio to the user, in response to the commands or for providing any other information. The audio unit (130) may implement various noise cancellation techniques to further enhance audio quality. Furthermore, the one or more speakers (1032) may have an audio projection mechanism that projects sound directly to the concha of the user's ear, from where it reaches the ear canal after multiple reflections. In one embodiment, the audio unit (130) may also include a voice recognition module to allow only authorised users to operate the system (100) and the HMD (102).
Moreover, the display unit (128) of the HMD (102) may comprise a Liquid Crystal on Silicon display and a visor. In accordance with an embodiment of the present invention, the one or more sensors may be selected from, but not limited to, an RGB sensor, a depth sensor, an eye tracking sensor, an EM sensor, an ambient light sensor, an accelerometer, a gyroscope and a magnetometer. Furthermore, the one or more cameras (1024) are selected from one or more of, but not limited to, omnidirectional cameras, wide angle stereo vision cameras, RGB cameras, ToF or depth cameras, digital cameras, thermal cameras, infrared cameras and night vision cameras.
Additionally, the processing module (122) is envisaged to include computing capabilities such as a memory unit configured to store machine-readable instructions. The machine-readable instructions may be loaded into the memory unit from a non-transitory machine-readable medium, such as, but not limited to, CD-ROMs, DVD-ROMs and Flash Drives. Alternately, the machine-readable instructions may be loaded in the form of a computer software program into the memory unit. The memory unit in that manner may be selected from a group comprising EPROM, EEPROM and Flash memory. Further, the processing module (122) includes a processor operably connected with the memory unit. In various embodiments, the processor is one of, but not limited to, a general-purpose processor, an application-specific integrated circuit and a field-programmable gate array. In one embodiment, the processing module (122) may be a part of a dedicated computing device or may be a microprocessor, that is, a multipurpose, clock-driven, register-based, digital integrated circuit that accepts binary data as input, processes it according to instructions stored in its memory and provides results as output. The processing module (122) is the brain of the system (100) and is configured to facilitate the operation of each component of the system (100).
The processing module (122) may further implement artificial intelligence and machine learning based technologies for, but not limited to, data analysis, collating data and presentation of data in real-time. In accordance with an embodiment of the present invention, a data repository (not shown) may also be connected with the system (100). The data repository may be, but is not limited to, a local or a cloud-based storage. The data repository may store mixed reality content and data received from the other components (such as the dock-able sensors), which may be provided to the processing module (122) when queried using appropriate protocols.
In accordance with an embodiment of the present invention, the system (100) also comprises a plurality of secondary sensors disposed on the one or more external devices (204), the HMD (102), and/or the user (206) associated with the respective HMD (102). In some cases, the plurality of secondary sensors may be in the form of wearable sensors provided on wearable devices, such as smartwatches, which can be connected with the system. In such embodiments, the plurality of secondary sensors are used to measure vital parameters of the user (wearer) such as heart rate, pulse rate, respiration rate, blood oxygen levels, irregular heartbeats, glucose levels, ECG, EEG and EMG. In that sense, the plurality of secondary sensors are selected from, but not limited to, a temperature sensor, a heartbeat sensor, a pulse sensor, a blood sugar sensor, and electrical bio-sensors such as an Electrocardiogram (ECG) sensor, an Electromyogram (EMG) sensor, and an Electroencephalogram (EEG) sensor. This is extremely beneficial if the users are soldiers, as it enables them to monitor their vital health parameters at all times.
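As an illustrative sketch only, such vitals could be screened against configurable limits before being overlaid or forwarded; the ranges below are placeholder values, not clinical or specification values:

    # Illustrative resting-adult ranges; placeholder values only.
    VITAL_LIMITS = {
        "heart_rate_bpm":  (50, 120),
        "spo2_percent":    (92, 100),
        "respiration_rpm": (10, 25),
    }

    def check_vitals(readings: dict) -> list:
        """Return out-of-range warnings for overlay display and for
        forwarding to the control room."""
        warnings = []
        for name, value in readings.items():
            low, high = VITAL_LIMITS.get(name, (float("-inf"), float("inf")))
            if not low <= value <= high:
                warnings.append(f"{name}={value} outside [{low}, {high}]")
        return warnings

    print(check_vitals({"heart_rate_bpm": 134, "spo2_percent": 97}))
    # -> ['heart_rate_bpm=134 outside [50, 120]']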
In other embodiments, the plurality of secondary sensors may be disposed on the HMD or other connected external devices for receiving additional information of the surroundings. In that scenario, the plurality of secondary sensors may include, but not limited to, secondary LiDARs, secondary IR sensors, secondary thermal sensors, secondary radars, ultrasonic sensors, terrain sensors etc.
Furthermore, the MR glasses (1022), the communication module (126), the one or more cameras (1024), the display unit (128) and the audio unit (130) are operatively connected with the processing module (122).
In accordance with an embodiment of the present invention, the system (100) further implements a Brain-Computer Interface (BCI). In that sense, the system (100) also includes a BCI module comprising a plurality of BCI sensors (120) provided inside the helmet (110) (shown in Figure 1C). As the user mounts the helmet (110) (and thereby the system (100)) on his head and correctly positions the BCI sensors (120), the BCI module and BCI sensors (120) are configured to sense and amplify brain signals received from the brain of the user. The user may then be shown some events with proper actions and scenes, and the corresponding thoughts are sensed by the BCI module and used to train the processing module (122) (implementing AI and ML). Once the processing module (122) is properly trained on the data, the processing module (122) acts as an assistant in mixed reality for the user who sees the real world, analyses the thoughts of the user and automatically gives suggestions as per the thought of the user.
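A deliberately simplified sketch of this calibrate-then-suggest idea, using a nearest-centroid classifier over pre-extracted EEG feature vectors; NumPy is assumed, and the feature extraction step and label set are hypothetical:

    import numpy as np

    def train_centroids(features: np.ndarray, labels: np.ndarray) -> dict:
        """Learn one centroid per labelled 'thought' class from the
        calibration session in which the user is shown known scenes."""
        return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

    def classify_thought(centroids: dict, feature_vec: np.ndarray):
        """Nearest-centroid label for a live EEG feature vector; the HMD
        maps the label to a context-appropriate suggestion."""
        return min(centroids, key=lambda c: np.linalg.norm(centroids[c] - feature_vec))

    # Usage with toy 2-D features:
    X = np.array([[0.1, 0.9], [0.2, 0.8], [0.9, 0.1], [0.8, 0.2]])
    y = np.array(["navigate", "navigate", "engage", "engage"])
    cents = train_centroids(X, y)
    print(classify_thought(cents, np.array([0.85, 0.15])))   # -> "engage"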
Apart from the above-mentioned, the system (100) may further include a cooling module to ensure that the components of the HMD (102) never exceed a predetermined temperature; and a power module (124) to provide electrical power to the components of the system (100). In that sense, the power module (124) may be a rechargeable battery or a non-rechargeable, replaceable battery.
Figure 3 illustrates an exemplary implementation of the system (100) of Figures 1A-1C. As shown in Figure 3, the user (206) can be seen wearing the system (100), and the system (100) is connected with the one or more external devices (204) via the communication network (202).
An exemplary method of operation includes, but is not limited to (and without limiting to any particular order), receiving data from the one or more cameras and/or the plurality of dock-able sensors (106) at the processing module (122). The data captured and received is related to the surroundings, situational awareness, navigation, binocular vision, weather conditions and the presence of objects & humans around the system (100) worn by the user (206). The processing module (122) is then configured to process the received data to determine information related to object detection, IFF (Identification Friend or Foe), locations of targets & team-mates, velocity & distance estimation, health status, weapon information, threat recognition and detection. After processing the data, the processing module (122) provides the determined information in the form of a visualization, an intuitive interface, and non-intrusive, adjustable overlays over the MR glasses (1022) of the HMD (102), using the display unit (128). An exemplary screenshot of how the visual overlays of information appear to the user on the HMD's display has been shown in Figure 4. As can be seen from Figure 4, the user is provided with target information and all the vital parameters for his health status. The information assists the user in decision-making and equips him/her with abundant information at his/her disposal to make a final decision in the given situation.
Apart from this, as already mentioned above, the system (100) enables wireless communication between the HMD (102) and the one or more external devices (204), which can be wirelessly and remotely connected with the system (100). The one or more external devices (204) are selected from a radar, Unmanned Aerial Vehicles (UAVs), external cameras, other HMDs (102), weapon firing systems and external computing devices, using the communication module (126). Such a remote connection with the one or more external devices (204) enables target sighting & locking, weapon deployment, Beyond Line of Sight (BLOS) capability, 360-degree vision and remote communication, using the information received from the plurality of dock-able sensors (106) and the external devices. The information received from the one or more external devices (204) includes, but is not limited to, one or more of live feed from external cameras and UAVs, live and enhanced satellite images, information from the radar, weapon information, locations and audio-visual data from other HMDs (102) and audio or video information from external computing devices.
The above-mentioned functionalities will now be described by way of exemplary applications:
For example, in a security-related operation, the armed forces require a wider view of the battlefield (and much more information than what a naked eye can see) to increase survivability and mission capability. In the present invention, the video feed from UAVs, information from radar and other sources enable the military personnel to look into places where naked-eye vision is not sufficient and make him/her aware of the surroundings. For instance, a bird's eye view may be overlaid on the MR glasses (1022) for better situational awareness, using the video feed from UAVs. The processing module (122) is configured to project information received from radar directly to the MR glasses (1022) and enable the user (206) to lock and engage the target without any additional human intervention. Additionally, binocular vision-enabled cameras (1024) may be provided for enabling beyond line of sight (vision) capability and estimation of velocity and distance, so that far-sighted targets can be engaged with higher accuracy.
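The velocity and distance estimation from binocular cameras can be illustrated with the classical pinhole-stereo relation Z = f x B / d (range = focal length x baseline / disparity); in the sketch below the focal length and baseline are assumed example values, not parameters from the specification:

    def stereo_range_m(disparity_px: float,
                       focal_px: float = 1400.0,          # focal length in pixels (assumed)
                       baseline_m: float = 0.12) -> float:  # camera separation (assumed)
        """Pinhole-stereo range: Z = f * B / d."""
        if disparity_px <= 0:
            raise ValueError("target must be visible in both cameras")
        return focal_px * baseline_m / disparity_px

    def closing_speed_mps(range_t0_m: float, range_t1_m: float, dt_s: float) -> float:
        """Rate of approach from two successive range estimates;
        positive when the target is closing."""
        return (range_t0_m - range_t1_m) / dt_s

    # e.g. a 4-pixel disparity puts the target at 1400 * 0.12 / 4 = 42 m
    assert stereo_range_m(4.0) == 42.0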
In another example, the RADAR and SONAR data may be provided by sensors located remotely from the headset. Such a system (100) may be used in applications to aid aiming for target shooting in military operations. In the military field, the armed forces may use the system (100) to look for and detect difficult targets (difficult due to distance, fog, smoke, rain, dust, haze, low light, etc.), lock onto the targets using virtual overlays and shoot the targets.
Further, the system (100), using the one or more cameras and the plurality of dock-able sensors (106), is configured to see beyond the line of sight. For example, the system (100) may be used in applications such as, but not limited to, seeing through an armoured vehicle when the user (206) is sitting inside the vehicle. The 360-degree camera feed is overlaid on the glasses of the user (206) inside the armoured vehicle, so that the illusion of a glass-like, see-through vehicle hood is created. The same has been shown in the exemplary illustration of Figure 5. As can be seen from Figure 5, the tank has the plurality of secondary sensors mounted on the outside along with the camera setup, which enables the soldier inside to have complete awareness and a 360-degree view of the surroundings. The system (100) enhances navigation and situational awareness for crew members of vehicles like tanks.
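As a non-limiting sketch of how such a see-through overlay could follow head motion, the wearer's IMU yaw may be mapped to the hull camera whose sector contains the gaze direction; six evenly spaced cameras are an assumption for illustration:

    def camera_for_yaw(yaw_deg: float, num_cameras: int = 6) -> int:
        """Pick the hull-mounted camera whose sector contains the wearer's
        current gaze direction, so the overlaid feed follows head motion.
        Camera 0 is assumed to face 0 degrees (vehicle forward), with the
        remaining cameras evenly spaced clockwise."""
        sector = 360.0 / num_cameras
        return int(((yaw_deg % 360.0) + sector / 2) // sector) % num_cameras

    assert camera_for_yaw(0) == 0      # looking forward
    assert camera_for_yaw(59) == 1     # just past the first sector boundary
    assert camera_for_yaw(350) == 0    # wrap-around back to the forward camera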
Further, the processing module (122), with the help of the one or more cameras (1024), may be configured to recognize a face, an object, and a situation. For example, the system (100) may be used in combat or high-risk scenarios, for surveillance, law enforcement or the battlefield, where face and object recognition is required. In such a scenario, the system (100) is used for face, object and situation recognition, with associated data retrieval from the data repository, and overlays this added information and intelligence in real time on the MR glasses (1022) for quicker decision-making in a combat situation.
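A minimal sketch of the recognise-retrieve-overlay join described above; the detection and repository record shapes are hypothetical:

    def enrich_detection(detection: dict, repository: dict) -> dict:
        """Join a recognised face/object identifier against the data
        repository and return the fields to overlay next to the detection."""
        record = repository.get(detection["identity"], {})
        return {
            "box": detection["box"],                  # where to draw the overlay
            "label": record.get("name", "UNKNOWN"),
            "affiliation": record.get("affiliation", "unverified"),
            "notes": record.get("notes", ""),
        }

    repo = {"veh-017": {"name": "Supply truck", "affiliation": "friendly"}}
    print(enrich_detection({"identity": "veh-017", "box": (40, 60, 120, 90)}, repo))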
In accordance with another exemplary embodiment of the present invention, a user interface is also provided to enable the user to navigate between the various mixed reality information overlays and use the sensor data and information in the most efficient manner, as per the user's requirement, without it being a hassle to the user (206). The exemplary user interface includes, but is not limited to, one or more buttons, a gesture interface, an audio interface, a touch-based interface, an eye-tracking interface that tracks gaze and focus, an EEG-based Brain-Computer Interface, and the like.
In an embodiment, the processing module (122), with the help of AI, provides intelligence to detect, classify and track multiple types of targets such as humans, machines, artillery, vehicles, threats and missiles. AI is used to pull more relevant information from the data repository once a specific target is identified, and to display it in an intelligent way on the MR glasses (1022).
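For illustration only, multi-target tracking can be reduced to its simplest form, greedy nearest-neighbour association of fresh detections to existing tracks; the 15 m gate and local ground-frame coordinates are assumptions:

    import math

    def associate(tracks: dict, detections: list, gate_m: float = 15.0) -> dict:
        """Greedily match each detection (x, y) in metres to the nearest
        existing track within gate_m; unmatched detections open new tracks."""
        next_id = max(tracks, default=0) + 1
        for det in detections:
            best, best_d = None, gate_m
            for tid, pos in tracks.items():
                d = math.dist(pos, det)
                if d < best_d:
                    best, best_d = tid, d
            if best is None:
                tracks[next_id] = det
                next_id += 1
            else:
                tracks[best] = det          # update the matched track
        return tracks

    print(associate({1: (0.0, 0.0)}, [(3.0, 4.0), (100.0, 100.0)]))
    # -> {1: (3.0, 4.0), 2: (100.0, 100.0)}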
In accordance with an exemplary embodiment of the present invention, the processing module (122) also implements machine learning to store the graphics of unseen objects, or objects detected for the first time, for future use. The machine learning module enables machine learning in the system (100). A dedicated artificial intelligence module and a machine learning module may be a part of the processing module or provided on separate remote servers or cloud-based servers. The cloud-based servers may share the data with the system (100) for monitoring the conditions around the system (100).
In another embodiment, the system (100) may facilitate soldier data management by enabling inter-connectivity of multiple HMDs (102) and a control room. For example, the system (100) may send data to a soldier data management system. The soldier data management system is interconnected with multiple head gears and a control room. The control room is used to monitor the activity going on in the battlefield and respond accordingly. The health status of the soldiers is also monitored by the system (100). The health status includes the information of vital parameters such as heart rate, pulse rate, respiration rate, blood oxygen levels, irregular heartbeats, glucose levels, ECG, EEG and EMG, measured using the plurality of secondary sensors disposed on the user (206) of the respective HMD (102). The control room generates warnings on critical data, takes immediate action and provides quick decisions and suggestions, based on real-time data, on the MR glasses (1022) using information overlays.
In accordance with an exemplary embodiment of the present invention, the processing module (122) is further configured to enable a collaborative mixed reality session with other connected HMDs (102), to assist users in operation planning and live monitoring. The present invention includes an application, such as a web or mobile application, that enables operation planning. The holographic application provides an accurate representation of the environmental situation at the field of operation in a 3D holographic map. For example, it may provide an accurate representation of the battlefield on a 3D holographic map, with the desired view of the terrain from multiple perspectives, and accurately point out the tactical advantages in the field ahead of the real-life action. Further, the application may fetch live data from a plurality of sources, such as the headgear worn by the user, satellites, GPS, RADAR, SONAR, and any other sensors, and shows the battlefield in real time in 3D holograms for advanced monitoring. The application also enables remote networking capabilities (holographic teleportation), so that other users outside the base of operations can view plans and contribute as if they were on the battlefield. The same has been illustrated in Figures 6A and 6B. In Figure 6A, both users are physically present at a common location and can be seen working in a common mixed reality space. Herein, both users are provided with the accurate representation of the battlefield on a 3D holographic map, with the desired view of the terrain from multiple perspectives, for operation planning and monitoring.
Figure 6B illustrates a scenario where the users are not present at a common location and at least one of the users is at a different location. As can be seen from Figure 6B, the user on the left is a teleported holographic projection of a user present at a remote location, whereas the user on the right is an actual soldier. Both can be seen working in a common mixed reality space on a common project.
In another embodiment, the system (100) and method may enable efficient, effective and immersive training for real combat scenarios. The users may be trained in an immersive way by placing them in a more physically and mentally stressful operational environment. For example, the users may be trained through medical emergencies, virtual bootcamps, battlefield simulations, and advanced ground and aerial vehicle simulators.
In yet another embodiment, the system (100) may enable assembly, repair, and maintenance of the various systems and devices required by the user on the field. The system (100) may assist the user in finding an error, and the potential cause of the error, in any device. The system (100) may also enable on-field remote assistance, where an expert sitting anywhere in the world can virtually guide the user in repairing the machine on-site.
The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments explained herein above. Rather, the embodiment is provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art.
Further, one would appreciate that the communication network used in the system can be a short-range and/or a long-range, wired or wireless communication network. The communication interface includes, but is not limited to, a serial communication interface, a parallel communication interface or a combination thereof.
The Head Mounted Devices (HMDs) referred herein may also include more components such as, but not limited to, a Graphics Processing Unit (GPU) or any other graphics generation and processing module. The GPU may be a single-chip processor primarily used to manage and boost the performance of video and graphics such as 2-D or 3-D graphics, texture mapping, hardware overlays etc. The GPU may be selected from, but not limited to, NVIDIA, AMD, Intel and ARM for real time 3D imaging.
In general, the word "module," as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions written in a programming language, such as, for example, Java, C, Python or assembly. One or more software instructions in the modules may be embedded in firmware, such as an EPROM. It will be appreciated that modules may comprise connected logic units, such as gates and flip-flops, and may comprise programmable units, such as programmable gate arrays or processors. The modules described herein may be implemented as software and/or hardware modules and may be stored in any type of computer-readable medium or other computer storage device.
Further, while one or more operations have been described as being performed by or otherwise related to certain modules, devices or entities, the operations may be performed by or otherwise related to any module, device or entity. As such, any function or operation that has been described as being performed by a module could alternatively be performed by a different server, by the cloud computing platform, or a combination thereof. It should be understood that the techniques of the present disclosure might be implemented using a variety of technologies. For example, the methods described herein may be implemented by a series of computer-executable instructions residing on a suitable computer-readable medium. Suitable computer-readable media may include volatile (e.g. RAM) and/or non-volatile (e.g. ROM, disk) memory, carrier waves and transmission media. Exemplary carrier waves may take the form of electrical, electromagnetic or optical signals conveying digital data streams along a local network or a publicly accessible network such as the Internet.
It should also be understood that, unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as "controlling" or "obtaining" or "computing" or "storing" or "receiving" or "determining" or the like, refer to the action and processes of a computer system, or similar electronic computing device, that processes and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
Various modifications to these embodiments are apparent to those skilled in the art from the description and the accompanying drawings. The principles associated with the various embodiments described herein may be applied to other embodiments. Therefore, the description is not intended to be limited to the embodiments shown along with the accompanying drawings, but is to be accorded the broadest scope consistent with the principles and the novel and inventive features disclosed or suggested herein. Accordingly, the invention is intended to cover all such alternatives, modifications, and variations that fall within the scope of the present invention and the appended claims.
CLAIMS
We Claim:
1. A system (100) for advanced surveillance and security using mixed reality head mounted device, the system (100) comprising:
a Mixed Reality (MR)-based HMD (102) adjustably mounted on a helmet (110) using a mounting mechanism (108), the HMD (102) having at least MR glasses (1022), a communication module (126), one or more cameras, a display unit (128), an audio unit (130) and a processing module (122); and
a plurality of dock-able sensors (106) mounted on the helmet (110) and connected with the HMD (102);
wherein the processing module (122) is configured to:
receive data from the one or more cameras (1024) and/or the plurality of dock-able sensors (106), the data being related to surroundings, situational awareness, navigation, binocular vision, weather conditions and presence of objects & humans;
process the received data to determine information related to object detection, IFF (Identification Friend or Foe), locations of targets & team-mates, velocity & distance estimation, health status, weapon information, threat recognition and detection;
provide the determined information in a form of a visualization, intuitive interface, non-intrusive and adjustable overlays, over the MR glasses (1022) of the HMD (102), using the display unit (128);
enable wireless communication between the HMD (102) and one or more external devices (204) selected from a radar, Unmanned Aerial vehicles (UAVs), external cameras, other HMDs (102), weapon firing systems and external computing devices, using the communication module (126); and
enable target sighting & locking, weapon deployment, Beyond Line of Sight (BLOS) capability, 360 degree vision and remote communication, using the information received from the plurality of dock-able sensors (106) and the external devices.
2. The system (100) as claimed in claim 1, wherein the MR glasses (1022) are military grade and made of a material selected from a group comprising polycarbonate, aluminium alloy and rubber polymer.
3. The system (100) as claimed in claim 1, wherein the MR glasses (1022) are provided with UV protection and shock proof capability with anti-scratch, anti-fog coating and photochromatic coating with which the transparency of MR glasses (1022) is changed from dark shades to no tints, automatically or manually, based on surrounding lights and to adjust the clarity of holograms, in order to withstand different conditions.
4. The system (100) as claimed in claim 1, wherein the plurality of dock-able sensors (106) include threat detection sensors, infrared sensors, night vision sensors, a thermal sensor, IFF (Identification Friend or Foe), LiDAR (Light Detection and Ranging), Blue Force Tracking, SONAR and GPS.
5. The system (100) as claimed in claim 1, wherein the HMD (102) is operated using one or more of physical buttons, hand-gestures, voice commands and gaze-tracking for interaction.
6. The system (100) as claimed in claim 1, wherein the information received from the one or more external devices (204) includes one or more of live feed from external cameras and UAVs, live and enhanced satellite images, information from the radar, weapon information, locations and audio-visual data from other HMDs (102) and audio or video information from external computing devices.
7. The system (100) as claimed in claim 6, wherein the processing module (122) is configured to project information received from radar directly to the MR glasses (1022) and enable a user (206) to lock and engage the target without any additional human intervention.
8. The system (100) as claimed in claim 6, wherein the processing module (122) is configured to enable a collaborative mixed reality session with other connected HMDs (102), to assist users (206) in operation Planning and Live Monitoring.
9. The system (100) as claimed in claim 1, wherein the processing module (122) is configured to enable remote communication with external computing devices to assist the user (206) wearing the HMD (102) in maintenance, repair and assembly of on-site machines.
10. The system (100) as claimed in claim 1, wherein a plurality of HMDs (102) are connected with a control room for soldier data management, training and simulations.
11. The system (100) as claimed in claim 1, comprising a plurality of secondary sensors disposed on one or more external devices (204), the HMD (102), and/or the user (206) associated with respective HMD (102), and connected with the processing module (122).
12. The system (100) as claimed in claim 11, wherein the step of providing the health status includes displaying the information of vital parameters such as heart rate, pulse rate, respiration rate, blood oxygen levels, irregular heartbeats, glucose levels, ECG, EEG and EMG, measured using the plurality of secondary sensors disposed on the user (206) of the respective HMD (102).

Dated this the 19th Day of March 2021

[VIVEK DAHIYA]
AGENT FOR THE APPLICANT- IN/PA 1491

Documents

Application Documents

# Name Date
1 202021011865-PROVISIONAL SPECIFICATION [19-03-2020(online)].pdf 2020-03-19
2 202021011865-FORM 1 [19-03-2020(online)].pdf 2020-03-19
3 202021011865-POWER OF AUTHORITY [19-03-2020(online)].pdf 2020-03-19
4 202021011865-DECLARATION OF INVENTORSHIP (FORM 5) [19-03-2020(online)].pdf 2020-03-19
5 202021011865-FORM FOR STARTUP [19-03-2020(online)].pdf 2020-03-19
6 202021011865-FORM FOR SMALL ENTITY(FORM-28) [19-03-2020(online)].pdf 2020-03-19
7 202021011865-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [19-03-2020(online)].pdf 2020-03-19
8 202021011865-OTHERS [19-03-2020(online)].pdf 2020-03-19
9 202021011865-COMPLETE SPECIFICATION [19-03-2021(online)].pdf 2021-03-19
10 202021011865-DRAWING [19-03-2021(online)].pdf 2021-03-19
11 Abstract1.jpg 2021-10-19
12 202021011865-FORM 18 [13-01-2022(online)].pdf 2022-01-13
13 202021011865-FER.pdf 2022-06-30
14 202021011865-CLAIMS [12-10-2022(online)].pdf 2022-10-12
15 202021011865-FER_SER_REPLY [12-10-2022(online)].pdf 2022-10-12
16 202021011865-OTHERS [12-10-2022(online)].pdf 2022-10-12
17 202021011865-US(14)-HearingNotice-(HearingDate-05-06-2024).pdf 2024-05-16
18 202021011865-FORM 13 [17-05-2024(online)].pdf 2024-05-17
19 202021011865-FORM-26 [17-05-2024(online)].pdf 2024-05-17
20 202021011865-POA [17-05-2024(online)].pdf 2024-05-17
21 202021011865-FORM-26 [11-06-2024(online)].pdf 2024-06-11
22 202021011865-PETITION UNDER RULE 137 [11-06-2024(online)].pdf 2024-06-11
23 202021011865-Annexure [11-06-2024(online)].pdf 2024-06-11
24 202021011865-RELEVANT DOCUMENTS [11-06-2024(online)].pdf 2024-06-11
25 202021011865-Written submissions and relevant documents [11-06-2024(online)].pdf 2024-06-11
26 202021011865-IntimationOfGrant05-07-2024.pdf 2024-07-05
27 202021011865-PatentCertificate05-07-2024.pdf 2024-07-05
28 202021011865-FORM-15 [27-03-2025(online)].pdf 2025-03-27
29 202021011865-FORM FOR SMALL ENTITY [06-05-2025(online)].pdf 2025-05-06
30 202021011865-EVIDENCE FOR REGISTRATION UNDER SSI [06-05-2025(online)].pdf 2025-05-06
31 544353.pdf 2025-06-26
32 202021011865-RELEVANT DOCUMENTS [26-06-2025(online)].pdf 2025-06-26

Search Strategy

1 ss202021011865E_30-06-2022.pdf

ERegister / Renewals

3rd: 08 Jul 2024

From 19/03/2022 - To 19/03/2023

4th: 08 Jul 2024

From 19/03/2023 - To 19/03/2024

5th: 26 Jun 2025

From 19/03/2024 - To 19/03/2025

6th: 26 Jun 2025

From 19/03/2025 - To 19/03/2026

7th: 26 Jun 2025

From 19/03/2026 - To 19/03/2027

8th: 26 Jun 2025

From 19/03/2027 - To 19/03/2028

9th: 26 Jun 2025

From 19/03/2028 - To 19/03/2029

10th: 26 Jun 2025

From 19/03/2029 - To 19/03/2030