ABSTRACT
A SYSTEM FOR PUBLIC SURVEILLANCE AND AWARENESS
The present disclosure describes a system for public surveillance and awareness comprising a public surveillance and awareness robot 101, a ground station 122 and wireless communication 120. The public surveillance and awareness robot 101 includes a master controller 102, a battery and power management unit 104, an infrared (IR) camera 106, a microphone 108, a thermal camera 110, a LIDAR 112, a camera 114, a speaker 116, and actuators 118.
Fig. 1
A SYSTEM FOR PUBLIC SURVEILLANCE AND AWARENESS
FIELD
[0001] The embodiments herein generally relate to the field of surveillance systems. More particularly, the disclosure relates to a public surveillance and awareness system.
BACKGROUND AND PRIOR ART
[0002] Curfew situations arise in countries all around the world due to various reasons. Pandemics and epidemics are among such reasons, wherein lockdowns and strict restrictions are imposed. The restrictions include mandatory rules and regulations to be followed by the people.
[0003] Police and security personnel are deployed in large numbers to ensure adherence to the rules and regulations imposed by the government. The personnel risk their lives while surveying locations and spreading awareness among the people in such situations. In a pandemic, the personnel need to maintain adequate social distancing, reduce human interaction and avoid direct contact with items used by the public. However, these measures are difficult for the personnel to observe, as most of their tasks, including checking documents, advising the public, and surveying a location, involve human-to-human interaction and direct contact. Furthermore, visits to hospitals and medical institutes in an epidemic or pandemic containment zone for surveillance increase the exposure of the personnel to highly contagious viruses.
[0004] Therefore, there is a need for an automated system that provides public surveillance while preventing direct human-to-human interaction. Moreover, there is a need for a system for public surveillance and awareness using an automated robot.
OBJECTS
[0005] Some of the objects of the present disclosure are described herein below:
[0006] The main objective of the present disclosure is to provide an automated system for public surveillance.
[0007] Another objective of the present disclosure is to provide an automated system for promoting public awareness during pandemic situations.
[0008] Still another objective of the present disclosure is to provide a system for automatic navigation of a robot towards people.
[0009] Yet another objective of the present disclosure is to provide a system for a public surveillance and awareness robot that enables police personnel to interact safely with people during a pandemic.
[00010] The other objectives and advantages of the present disclosure will be apparent from the following description when read in conjunction with the accompanying drawings, which are incorporated for illustration of preferred embodiments of the present disclosure and are not intended to limit the scope thereof.
SUMMARY
[00011] In view of the foregoing, an embodiment herein provides a system for public surveillance and awareness.
[00012] In accordance with an embodiment, the system comprises a public surveillance and awareness robot, a ground station and wireless communication. The public surveillance and awareness robot includes a master controller, a battery and power management unit, an infrared (IR) camera, a microphone, a thermal camera, a LIDAR, a camera, a speaker, and a plurality of actuators. The master controller is provided on the robot for controlling the robot, the camera is connected to the master controller for capturing video at a location, and the LIDAR is connected to the master controller for identifying distances of objects and navigating the robot. The camera processes faces of people in the captured video, using face recognition to identify faces wearing a mask and faces not wearing a mask. The LIDAR identifies the distance of the person not wearing the mask based on a command of the master controller and navigates the robot towards that person, and the master controller controls a speaker on the robot for playing an audio on reaching the person not wearing the mask. In an embodiment, the audio is played from pre-recorded messages stored in a memory unit of the master controller.
[00013] In accordance with an embodiment, the ground station includes a control station, a joystick control, a steering control, a Graphical user interface (GUI) control and a mobile phone app.
[00014] In accordance with an embodiment, the public surveillance and awareness robot measures temperature of people, identifies people not following rules, navigates towards the people not following the rules and interacts with them for bringing awareness. The public surveillance and awareness robot also verifies documents and permits submitted by the people.
[00015] In accordance with an embodiment, the camera measures the distance between people at the location through image processing for identifying people separated by less than a standard distance provided in the master controller, and the LIDAR navigates the robot towards the identified people based on a command from the master controller.
[00016] In accordance with an embodiment, the infrared camera is provided on the robot for capturing video in low light at the location. In an embodiment, a thermal camera is provided on the robot for measuring a temperature of the people identified by the camera.
[00017] In accordance with an embodiment, the ground station communicates with the robot for controlling movement of the robot from a remote location. In an embodiment, the ground station includes a control station for wirelessly communicating with the master controller of the robot, and the control station controls actuators connected to wheels of the robot through the master controller based on commands received from a steering control, a joystick control and a graphical user interface control. In an embodiment, the steering control controls the direction of the wheels of the robot, the joystick control controls forward, backward, and sideward movement of the wheels of the robot, and the graphical user interface control provides a user interface for directly communicating with the master controller for controlling the robot. In an embodiment, an authorized user at the ground station can make live announcements for playing the announcements through the speaker.
[00018] In accordance with an embodiment, the microphone is provided on the robot for capturing audio of a person around the robot, the master controller processes a reply to the captured audio based on audio processing and the speaker plays the reply processed by the master controller.
[00019] In accordance with an embodiment, a method for surveillance and awareness, comprises the steps of providing a robot for surveillance at a location, capturing a live video of the location using a camera provided on the robot, processing the captured video using face recognition provided in the camera for identifying faces of people wearing a mask and not wearing a mask, transmitting a signal to the master controller by the camera on identifying a person not wearing the mask, navigating the robot towards the person not wearing the mask by a LIDAR based on a command from the master controller and transmitting an audio output to the person not wearing the mask by a speaker provided on the robot.
[00020] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
BRIEF DESCRIPTION OF DRAWINGS
[00021] The detailed description is set forth with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items.
[00022] Fig.1 illustrates a block diagram of a system for public surveillance and awareness, according to an embodiment of the present disclosure herein;
[00023] Fig.2 illustrates a block diagram of ground station for the system for public surveillance and awareness, according to an embodiment of the present disclosure herein; and
[00024] Fig.3 illustrates the public surveillance and awareness robot of the system for public surveillance and awareness, according to an embodiment of the present disclosure herein.
LIST OF NUMERALS
100 - System of public surveillance and awareness
101 - Public surveillance and awareness robot
102 - Master controller
104 - Battery and power management unit
106 - Infrared camera
108 - Microphone
110 - Thermal camera
112 - LIDAR
114 - Camera
116 - Speaker
118 - Actuators
120 - Wireless communication
122 - Ground station
200 - Ground station for system for public surveillance and awareness
201 - Control station
202 - Steering control
204 - Joystick control
206 - GUI control
208 - App
300 - Public surveillance and awareness robot for system
301 - Wheels
302 - Front side of robot
303 - Enclosure of robot
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[00025] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[00026] As mentioned above, there is a need to provide an automated system for public surveillance and awareness in pandemic or curfew situations. In particular, there is a need to provide an automated system for public surveillance and awareness using a robot. The embodiments herein achieve this by providing “A system for public surveillance and awareness”. Referring now to the drawings, and more particularly to Fig.1 through Fig. 3, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
[00027] Fig. 1 illustrates the system for public surveillance and awareness. The system 100 includes a public surveillance and awareness robot 101 and a ground station 122. The public surveillance and awareness robot 101 includes a master controller 102, a battery and power management unit 104, an infrared (IR) camera 106, a microphone 108, a thermal camera 110, a LIDAR 112, a camera 114, a speaker 116, and a plurality of actuators 118.
[00028] The ground station 122 communicates with the public surveillance and awareness robot 101 wirelessly using wireless communication 120.
[00029] In an embodiment, the wireless communication 120 includes Wi-Fi, Bluetooth, or any other conventional wireless communication technology.
[00030] In an embodiment, the battery and power management unit 104 includes a plurality of batteries for managing and supplying power to components of the public surveillance and awareness robot 101 including the master controller 102, the infrared (IR) camera 106, the microphone 108, the thermal camera 110, the LIDAR 112, the camera 114, the speaker 116, and the actuators 118.
[00031] The master controller 102 controls the infrared (IR) camera 106, the microphone 108, the thermal camera 110, the LIDAR 112, the camera 114, the speaker 116, and the actuators 118. A wireless interface is provided in the master controller 102 for communicating with the ground station 122 using wireless communication 120. The master controller 102 includes a memory unit for storing images captured by the camera 114, thermal camera 110 and the infrared camera 106. The memory unit also stores pre-recorded messages for playing on the speaker 116 and standard values of human body temperature.
[00032] The speaker 116 plays the audio of the pre-recorded messages stored in the memory unit. In an embodiment, the pre-recorded messages are played at regular intervals by the speaker 116, based on a time interval provided in the master controller 102. In an embodiment, the speaker 116 plays live audio transmitted through a wireless/wired microphone connected to it.
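By way of non-limiting illustration, a minimal sketch of such interval-based playback is given below; the file names, the interval value and the play_message() stand-in for the actual audio output on the speaker 116 are assumptions made purely for illustration.

```python
# Non-limiting sketch: periodic playback of pre-recorded awareness messages.
# File names, the interval and the play_message() stand-in are assumed values.
import itertools
import time

PRE_RECORDED = ["wear_mask.wav", "keep_distance.wav", "stay_home.wav"]
INTERVAL_SECONDS = 300  # assumed time interval provided in the master controller 102


def play_message(path):
    # Placeholder for the actual audio output on the speaker 116.
    print("Playing %s on speaker" % path)


def announcement_loop():
    for path in itertools.cycle(PRE_RECORDED):
        play_message(path)
        time.sleep(INTERVAL_SECONDS)


if __name__ == "__main__":
    announcement_loop()
```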
[00033] The actuators 118 are provided for rotating the wheels of the public surveillance and awareness robot 101 for enabling movement.
[00034] The LIDAR 112 is provided for auto navigation, wherein the LIDAR 112 maps the area where the robot 101 is going to navigate and creates a map of the area for auto navigating the robot 101 in that area. The LIDAR 112 is also provided for detecting objects and people in the surroundings of the robot 101 and their distances from the robot 101. The LIDAR 112, on detecting an obstacle in the movement path of the robot 101, transmits a signal to the master controller 102. The master controller 102 controls the actuators 118 and changes the path of the robot 101 for avoiding obstacles. In an embodiment, the auto-navigation algorithm is implemented using a programming language including but not limited to Python over the Robot Operating System (ROS) software platform.
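By way of non-limiting illustration, a minimal obstacle-avoidance sketch over ROS (rospy) is given below; the topic names, the clearance threshold and the speeds are assumed values and do not represent the actual firmware of the robot 101.

```python
#!/usr/bin/env python
# Non-limiting sketch of obstacle avoidance using the LIDAR 112 on ROS.
# Topic names, the threshold and the speeds are assumed values.
import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

SAFE_DISTANCE_M = 0.8  # assumed clearance before the path is changed


class ObstacleAvoider(object):
    def __init__(self):
        self.cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
        rospy.Subscriber("/scan", LaserScan, self.on_scan)

    def on_scan(self, scan):
        valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
        if not valid:
            return  # no usable returns in this sweep
        cmd = Twist()
        if min(valid) < SAFE_DISTANCE_M:
            cmd.angular.z = 0.5   # obstacle ahead: turn away from it
        else:
            cmd.linear.x = 0.3    # path clear: continue moving forward
        self.cmd_pub.publish(cmd)


if __name__ == "__main__":
    rospy.init_node("obstacle_avoider")
    ObstacleAvoider()
    rospy.spin()
```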
[00035] The camera 114 captures images and videos of people and documents. A face recognition algorithm is configured with the camera 114 for identifying faces of the people. The camera 114 is a 360-degree camera for identifying people and objects all around the robot 101. The face recognition algorithm distinguishes between faces of people wearing masks and faces of people not wearing masks. In an embodiment, the face recognition algorithm is implemented using a programming language including but not limited to C, C++ and Python. The camera 114 captures an image of a person when the face recognition algorithm identifies a person around the robot 101.
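By way of non-limiting illustration, one possible sketch of the face detection step is given below using OpenCV's standard Haar cascade; the mask_model.predict() call is a hypothetical mask classifier (for example a small CNN) and is not part of OpenCV itself.

```python
# Non-limiting sketch: face detection with OpenCV's standard Haar cascade,
# followed by a hypothetical mask classifier applied to each face crop.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")


def find_unmasked_faces(frame, mask_model):
    """Return bounding boxes of detected faces judged not to be wearing a mask."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    unmasked = []
    for (x, y, w, h) in faces:
        crop = frame[y:y + h, x:x + w]
        # mask_model.predict() is a hypothetical classifier returning True
        # when a mask is worn; it is not part of OpenCV.
        if not mask_model.predict(crop):
            unmasked.append((x, y, w, h))
    return unmasked
```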
[00036] In an embodiment, the camera 114, on identifying a person not wearing a mask, transmits a signal to the master controller 102. The master controller 102, on receiving the signal from the camera 114, transmits a signal to the LIDAR 112. The LIDAR 112 navigates the robot 101 to the location of the person. On reaching the location of the person, the speaker 116 is activated by the master controller 102 for playing pre-recorded audio messages to the person. The pre-recorded audio message includes advice on wearing the mask.
[00037] In an embodiment, the LIDAR 112 navigates the robot 101 to a group of people when the face recognition algorithm in the camera 114 identifies a group of people standing closer to each other than the standard distance specified for social distancing. On reaching the group of people, the speaker 116 plays pre-recorded audio messages alerting and advising the group of people to follow social distancing.
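By way of non-limiting illustration, a minimal sketch of such a social-distancing check on detected bounding boxes is given below; the pixels-per-metre calibration and the two-metre threshold are assumed values.

```python
# Non-limiting sketch of the social-distancing check: pairwise distances
# between centroids of detected people; calibration values are assumed.
from itertools import combinations
import math

PIXELS_PER_METRE = 120.0   # assumed image-to-ground calibration
MIN_SEPARATION_M = 2.0     # assumed standard distance stored in the master controller 102


def too_close_pairs(person_boxes):
    """person_boxes: list of (x, y, w, h) detections from the camera 114."""
    centroids = [(x + w / 2.0, y + h / 2.0) for (x, y, w, h) in person_boxes]
    pairs = []
    for (i, a), (j, b) in combinations(enumerate(centroids), 2):
        distance_m = math.hypot(a[0] - b[0], a[1] - b[1]) / PIXELS_PER_METRE
        if distance_m < MIN_SEPARATION_M:
            pairs.append((i, j))
    return pairs
```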
[00038] In an embodiment, an image processing algorithm is provided with the camera 114 for verifying documents, permits and applications submitted by people. The image processing algorithm verifies the images of documents captured by the camera 114 for authenticity and validity based on QR code scanning, bar code scanning, date, authorized signature, stamp and other standard data verification factors.
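By way of non-limiting illustration, a sketch of one such verification factor, QR code scanning, is given below using OpenCV's QRCodeDetector; the layout of the decoded payload and its expiry-date field are assumptions made for illustration only.

```python
# Non-limiting sketch of one verification factor (QR code scanning) with
# OpenCV; the payload layout and date field are assumptions for illustration.
import cv2
from datetime import datetime


def verify_permit(image_path):
    image = cv2.imread(image_path)
    if image is None:
        return False  # image could not be read
    payload, _, _ = cv2.QRCodeDetector().detectAndDecode(image)
    if not payload:
        return False  # no QR code found on the document
    # Assumed payload layout: "permit_id;valid_until=YYYY-MM-DD"
    try:
        valid_until = payload.split("valid_until=")[1][:10]
        return datetime.strptime(valid_until, "%Y-%m-%d") >= datetime.now()
    except (IndexError, ValueError):
        return False
```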
[00039] In an embodiment, the infrared camera 106 is provided for capturing images and videos in locations having low/no light and during night time surveillance.
[00040] In an embodiment, the thermal camera 110 is provided for finding the temperature of a person standing in the range of the thermal camera 110. Simultaneously, the camera 114 captures an image of the person. The thermal camera 110 transmits the temperature of the person to the master controller 102. The master controller 102 compares the temperature of the person with the standard value of temperature for a human body stored in the memory unit. The master controller 102 sends an alert to the ground station 122 in the form of an SMS/message to a mobile phone of an authorized user, police personnel or security personnel, when the temperature of the person identified by the thermal camera 110 is greater than the standard temperature. The master controller 102 also sends the photo of the person captured by the camera 114 to the ground station 122.
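By way of non-limiting illustration, a minimal sketch of the temperature comparison and alerting step is given below; the threshold value and the send_alert() stand-in for the SMS/message channel to the ground station 122 are assumptions.

```python
# Non-limiting sketch of the temperature check: the reading from the thermal
# camera 110 is compared with an assumed stored threshold; send_alert() is a
# hypothetical stand-in for the SMS/message channel to the ground station 122.
STANDARD_BODY_TEMP_C = 37.5  # assumed standard value stored in the memory unit


def check_temperature(person_id, measured_temp_c, photo_path, send_alert):
    if measured_temp_c > STANDARD_BODY_TEMP_C:
        send_alert(
            "Elevated temperature %.1f C detected for person %s"
            % (measured_temp_c, person_id),
            attachment=photo_path,
        )
        return True
    return False
```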
[00041] In an embodiment, the microphone 108 is provided for capturing the voice of a person speaking to the robot 101. The microphone 108 captures the audio of the person speaking and transmits it to the master controller 102. Based on the received audio signal, the master controller 102, which includes a voice-enabled algorithm, processes a reply and plays the reply on the speaker 116. The microphone 108 and the speaker 116 enable the robot 101 to interact with people by answering common questions and generating awareness.
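By way of non-limiting illustration, a minimal sketch of the reply selection is given below; it assumes the captured audio has already been converted to text upstream, and the keyword table and canned replies are illustrative placeholders for the voice-enabled algorithm.

```python
# Non-limiting sketch of reply selection for the voice-enabled interaction.
# The transcript is assumed to come from an upstream speech-to-text step;
# the keyword table and replies are illustrative placeholders.
REPLIES = {
    "curfew": "The curfew is in force; please remain indoors unless permitted.",
    "mask": "Masks are mandatory in all public places.",
    "distance": "Please keep at least two metres from other people.",
}


def reply_for(transcript):
    text = transcript.lower()
    for keyword, reply in REPLIES.items():
        if keyword in text:
            return reply
    return "Please contact the nearest help desk for further assistance."
```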
[00042] Fig. 2 illustrates the ground station 122. The ground station 122 includes the control station 201, steering control 202, joystick control 204, GUI control 206 and a mobile phone app 208.
[00043] In an embodiment, the ground station 122 is provided for remotely controlling the robot 101 through the master controller 102. The ground station 122 can control the robot 101 from a maximum distance of 1 km.
[00044] The control station 201 includes a wireless interface for communicating with the wireless interface of the master controller 102 through the wireless communication 120. The control station 201 is connected to the steering control 202, the joystick control 204 and the GUI control 206. Output from the steering control 202, the joystick control 204 and the GUI control 206 is transmitted to the control station 201. The control station 201 transmits the output to the master controller 102 of the robot 101. The master controller 102 controls the components of the robot 101 based on the controls received from the control station 201.
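By way of non-limiting illustration, a minimal sketch of how the control station 201 might forward an operator command to the master controller 102 over the wireless communication 120 is given below; the JSON message layout, address and port are assumptions and not the actual protocol.

```python
# Non-limiting sketch: forwarding an operator command from the control station
# 201 to the master controller 102. The message layout, address and port are
# assumed values, not the actual protocol of the system.
import json
import socket

ROBOT_ADDRESS = ("192.168.1.50", 9000)  # assumed address of the master controller 102


def send_command(source, payload):
    message = json.dumps({"source": source, "payload": payload}).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message, ROBOT_ADDRESS)


# Example: forward a joystick reading to the robot.
# send_command("joystick", {"forward": 0.4, "sideward": 0.0})
```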
[00045] In an embodiment, the joystick control 204 is provided for manually controlling movement of the robot 101 in the forward, backward and sideward directions. Based on the output of the joystick control 204, the master controller 102 controls the actuators 118 of the robot 101 for controlling the rotation of the wheels.
[00046] In an embodiment, the steering control 202 is provided for rotating the wheels of the robot 101 in any direction as per a user requirement. The output from the steering control 202 is transmitted to the master controller 102, by the control station 201. The master controller 102 controls the direction of rotation of the wheels of the robot 101, based on the received control from the control station 201.
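By way of non-limiting illustration, a minimal sketch of how the joystick and steering inputs of the two preceding paragraphs could be mixed into left and right wheel commands for the actuators 118 is given below; the differential-drive mixing and the normalised input range are assumptions.

```python
# Non-limiting sketch: mixing normalised joystick (forward) and steering (turn)
# inputs into left/right wheel commands for the actuators 118. The mixing law
# and the [-1, 1] input range are assumptions for illustration.
def mix_to_wheel_speeds(forward, turn):
    """forward and turn are normalised operator inputs in [-1.0, 1.0]."""
    left = forward + turn
    right = forward - turn
    scale = max(1.0, abs(left), abs(right))  # keep commands within [-1, 1]
    return left / scale, right / scale


# Example: half speed forward with a gentle right turn.
# left_cmd, right_cmd = mix_to_wheel_speeds(0.5, 0.2)
```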
[00047] In an embodiment, the mobile phone app 208 is provided in the ground station 122 for interacting with the master controller 102 of the robot through wireless communication 120. The mobile phone app 208 can be used for sending audio messages to the master controller 102 for playing through the speaker 116.
[00048] In an embodiment, the graphical user interface (GUI) control 206 is provided for changing a mode of the robot 101. The mode can also be changed using the mobile phone app 208. The modes include an autonomous mode and a teleoperation mode. In the autonomous mode, the master controller 102 controls the robot 101 and the wheels automatically based on the signals from the LIDAR 112 and the camera 114. The camera 114 identifies humans and the LIDAR 112 auto navigates the robot 101 towards them. The speaker 116 and the microphone 108 enable the robot 101 to interact with the humans, using voice-enabled algorithms in the master controller 102.
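By way of non-limiting illustration, a minimal sketch of such a mode switch is given below; the function names are placeholders and not the actual interface of the master controller 102.

```python
# Non-limiting sketch of the mode switch: a flag selects whether wheel commands
# come from the autonomous pipeline (LIDAR 112 / camera 114) or from the
# teleoperation inputs relayed by the ground station 122. Names are placeholders.
AUTONOMOUS, TELEOPERATION = "autonomous", "teleoperation"


class ModeManager(object):
    def __init__(self):
        self.mode = AUTONOMOUS

    def set_mode(self, mode):
        if mode not in (AUTONOMOUS, TELEOPERATION):
            raise ValueError("unknown mode: %s" % mode)
        self.mode = mode

    def next_wheel_command(self, autonomous_planner, teleop_input):
        if self.mode == AUTONOMOUS:
            return autonomous_planner()  # command derived from LIDAR/camera
        return teleop_input()            # command relayed from the ground station
```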
[00049] In the teleoperation mode, an authorized person manually controls the robot 101 from the ground station 122 through the wireless communication 120. The authorized user operates the steering control 202 and the joystick control 204, and records audio messages in the mobile phone app 208. The operations of the steering control 202 and the joystick control 204 and the audio messages recorded in the mobile phone app 208 by the authorized user are transmitted from the control station 201 to the master controller 102 for controlling the robot 101. The recorded audio messages in the mobile phone app 208 are played on the speaker 116 of the robot 101. The authorized user can record the audio messages for interaction with the people around the robot 101. Simultaneously, the people around the robot 101 can record their audio through the microphone 108, and the audio can be transmitted to the mobile phone app 208 of the authorized user through the master controller 102. The interaction of the robot 101 with people in both the teleoperation mode and the autonomous mode can be video recorded, and images of the people can be captured by the camera 114. The video recordings and images are stored in the memory unit of the master controller 102 and can be transmitted to the mobile phone app 208 of the authorized user for analyzing the videos or photos. In an embodiment, the authorized user at the ground station 122, using the control station 201, can make live announcements for playing through the speaker 116.
[00050] Fig.3 illustrates the public surveillance and awareness robot 101.
[00051] The public surveillance and awareness robot 101 includes an enclosure 303, LIDAR 112, infrared camera 106, camera 114, thermal camera 110, microphone 108, speaker 116 and wheels 301.
[00052] The LIDAR 112 is placed at a height above the enclosure 303 for easily surveying obstacles and people around the robot 101 and their distances.
[00053] The infrared camera 106 is placed on the enclosure 303 at the front side 302, for night-time surveillance of people or obstacles in front of the robot 101.
[00054] The thermal camera 110 is placed on the enclosure 303 at the front side 302, for detecting temperature of a person in front of the robot 101.
[00055] The camera 114 is a 360-degree camera placed on top of the enclosure 303 for capturing people around 360 degrees of the robot 101.
[00056] The microphone 108 and the speaker 116 are placed inside the enclosure 303.
[00057] The master controller 102, the actuators 118 and the battery and power management unit 104 are also placed inside the enclosure 303 (not visible in the diagram).
[00058] The wheels 301 are placed on two sides of the enclosure 303 for movement of the robot 101. The wheels 301 are connected to the actuators 118.
[00059] A main advantage of the present disclosure is that the system provides a public surveillance and awareness robot.
[00060] Another advantage of the present disclosure is that the system provides an outdoor surveillance robot in curfew, pandemic situations.
[00061] Still another advantage of the present disclosure is that the system provides a public surveillance and awareness robot for police personnel to promote safe interaction with people during a pandemic.
[00062] Yet another advantage of the present disclosure is that the system auto navigates and spreads awareness to people not following certain rules.
[00063] Another advantage of the present disclosure is that the public surveillance and awareness robot provides night surveillance.
[00064] Still another advantage of the present disclosure is that the system provides an autonomous working public surveillance and awareness robot.
[00065] Yet another advantage of the present disclosure is that the system provides a robot for promoting general awareness and guidelines to the public.
[00066] Still another advantage of the present disclosure is that the system provides a robot for movement in high-risk zones of hospitals during a pandemic.
[00067] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.
CLAIMS
We Claim:
1. A system for surveillance and awareness, comprising:
a robot (101) for surveilling a location;
a master controller (102) provided on the robot (101) for controlling the robot (101);
a camera (114) connected to the master controller (102) on the robot (101) for capturing video at the location;
a LIDAR (112) connected to the master controller (102) on the robot (101) for identifying distance of objects and navigating the robot (101);
characterized in that
the camera (114) processing faces of people for identifying a face wearing a mask and identifying a face not wearing a mask based on face recognition in the captured video;
the LIDAR (112) identifying a distance of the face of the person not wearing the mask from the robot (101) based on a command of the master controller (102) and navigating the robot (101) towards the person not wearing the mask; and
the master controller (102) controlling a speaker (116) on the robot (101) for playing an audio on reaching to the person not wearing the mask thereby spreading awareness.
2. The system as claimed in claim 1, wherein the camera (114) measures the distance between people at the location through image processing for identifying people separated by less than a standard distance provided in the master controller (102); and
the LIDAR (112) navigates the robot (101) towards the identified people based on a command from the master controller (102).
3. The system as claimed in claim 1, wherein an infrared camera (106) is provided on the robot for capturing video during low light in the location.
4. The system as claimed in claim 1, wherein a thermal camera (110) is provided on the robot (101) for measuring a temperature of the people identified by the camera (114).
5. The system as claimed in claim 1, wherein a ground station (122) communicates with the robot (101) for controlling movement of the robot (101) from a remote location.
6. The system as claimed in claim 5, wherein the ground station (122) includes a control station (201) for wirelessly communicating with the master controller (102) of the robot (101); and
the control station (201) controls actuators (118) connected to wheels of the robot (101) through the master controller (102) based on commands received from a steering control (202), a joystick control (204) and a graphical user interface control (206).
7. The system as claimed in claim 6, wherein the steering control (202) controls the direction of the wheels of the robot (101), the joystick control (204) controls forward, backward, and sideward movement of the wheels of the robot (101) and the graphical user interface control (206) provides a user interface for directly communicating with the master controller (102) for controlling the robot (101).
8. The system as claimed in claim 1, wherein a microphone (108) is provided on the robot (101) for capturing audio of a person around the robot (101);
the master controller (102) processes a reply to the captured audio based on audio processing; and
the speaker (116) plays the reply processed by the master controller (102).
9. The system as claimed in claim 1, wherein the audio is played from pre-recorded messages stored in a memory unit of the master controller (102).
10. A method for surveillance and awareness, comprising the steps of:
providing a robot (101) for surveillance at a location;
capturing a live video of the location using a camera (114) provided on the robot (101);
processing the captured video using face recognition in the camera (114) for identifying faces of people wearing a mask and not wearing a mask;
transmitting a signal to a master controller (102) by the camera (114) on identifying a person not wearing the mask;
navigating the robot (101) towards the person not wearing the mask by a LIDAR (112) based on a command from the master controller (102); and
transmitting an audio output to the person not wearing the mask by a speaker (116) provided on the robot (101).