ABSTRACT
UNMANNED AERIAL VEHICLE INTEGRATED SMART FENCE SECURITY SYSTEM
Disclosed is a system (101) and a method for monitoring a facility (110), wherein the system (101) is configured to communicate with a set of Unmanned Aerial Vehicles (UAVs) (106) and a smart fence system (104). The smart fence system (104) is configured to detect intrusion of an unauthorized person into the facility (110). Further, the smart fence system (104) is configured to transmit a warning signal, along with information of a target zone (105.1) at which the intrusion has taken place, to the system (101). The system (101) is further configured to instruct an unmanned aerial vehicle (106.1) from the set of unmanned aerial vehicles (106) to navigate to the target zone (105.1) and capture a live video of the target zone (105.1). In one embodiment, the live video may be analysed by the system (101) using a machine intelligence system to determine the type/characteristics of the intrusion. [To be published with Figure 1]
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
Title of invention:
UNMANNED AERIAL VEHICLE INTEGRATED SMART FENCE SECURITY SYSTEM
APPLICANT
MAHINDRA TEQO PVT. LTD.
an Indian Entity having address as:
Mahindra Towers,
B Wing, 5th Floor, Dr. G M Bhosle Marg,
Worli, Mumbai, Maharashtra - 400018
The following specification describes the invention and the manner in which it is to be performed.
CROSS-REFERENCE TO RELATED APPLICATIONS AND PRIORITY
The present application claims priority from Indian Provisional Patent Application (App. No. 201921038503) filed on September 24, 2019.
TECHNICAL FIELD
The present subject matter described herein, in general, relates to an automated surveillance system. More particularly, the present subject matter relates to an automated surveillance system and method for monitoring the boundary of an area to detect any unauthorized intrusion.
BACKGROUND
The cases of robbery and mishaps at remote locations such as power plants, factories and other industrial areas are increasing across the globe. The existing security systems are largely dependent on security guards and CCTV surveillance systems for detecting intrusion of an unauthorised person along the boundaries of such facilities. However, it is not economically feasible to establish security booths and CCTV surveillance cameras along the boundary of the facility, especially when the area of the facility is very large. In some situations, the area of the facility, such as a solar power plant, may extend up to 4800 acres or more. It is not economically possible to install night-vision enabled CCTV cameras along the boundary of such facilities, nor is it possible to appoint security guards along the boundary, since the boundary itself may exceed 20 kilometres.
To address these problems, the concept of smart fencing has evolved, which attempts to reduce dependency on manpower as compared to conventional security systems. Smart fence systems generally use a set of sensors, such as but not limited to electric fences, radar sensors, microwave sensors, video detection systems, optical fibre-based sensors and similar sensors, installed along the fence to detect intrusion at a particular zone along the boundary of the facility. Other technologies being used are motion detection through sensors or video analytics of CCTV footage. Once the intrusion/breach is detected, the nearest security guard is intimated of the breach and the security guards act accordingly. However, a smart fencing system is not a foolproof system, and there are many false detections due to intrusion by animals, wind-blown objects, and treefall along the boundary.
In both solutions explained above, the feed obtained from the CCTVs must either be monitored by humans at a central station or processed by video analytics to detect the intrusion, which again involves human intervention for monitoring the output of the feed. Once the intrusion is detected through either of these processes, the interception still has to be performed by human security guards. As a result, security guards are often alerted unnecessarily, which results in inefficient utilization of human resources. In addition, in most smart fences (i.e., fence detection systems) without CCTVs, an alarm may ring, but there is no way of capturing the intruder on record for future action or detection.
Thus, the previous methods of monitoring by a human workforce, or by using cameras with advanced vision capabilities integrated with a smart fence, result in an added cost of operations. Also, more and more humans and cameras are needed to cover a huge perimeter, which further adds to data processing load and to installation and operation costs. Further, the cameras operate continuously even in the absence of an intruder or issue, adding no value to the asset; the same is the case with manpower, where a dedicated workforce is deployed to remain continuously at a single place.
Thus, there is a long-standing need for a monitoring system that uses minimal human intervention and electronic resources to overcome the deficiencies of existing surveillance and smart fence security systems.
SUMMARY
This summary is provided to introduce concepts related to a system and a method for monitoring a facility, and the concepts are further described in the detailed description. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining or limiting the scope of the claimed subject matter.
In one implementation, a system for monitoring a facility is disclosed. The system may comprise a processor and a memory coupled to the processor. The processor may be configured to execute instructions stored in the memory for receiving an intrusion detection signal from a smart fence system. The smart fence system may comprise a set of sensors installed over a fence of the facility. The fence may be divided into a set of zones, wherein each zone corresponds to a section of the fence. Each zone may be monitored by one or more sensors from the set of sensors to generate sensor data. The sensor data may be analysed by the smart fence system to generate the intrusion detection signal. The processor may be configured for analysing the intrusion detection signal to detect a target zone corresponding to the intrusion detection signal. The processor may be configured for transmitting navigation instructions to a candidate unmanned aerial vehicle from a set of unmanned aerial vehicles. The candidate unmanned aerial vehicle may be configured to capture a live video of the target zone upon reaching the target zone. The processor may be configured for receiving the live video of the target zone from the candidate unmanned aerial vehicle. The processor may be configured for analysing the live video of the target zone in order to determine the number and characteristics of intruders. The processor may be configured for generating one or more alerts based on the analysis.
In another embodiment, a method for monitoring a facility is disclosed. The method may comprise a step for receiving, via a processor, an intrusion detection signal from a smart fence system. The smart fence system may comprise a set of sensors installed over a fence of the facility. The fence may be divided into a set of zones, wherein each zone corresponds to a section of the fence. Each zone may be monitored by one or more sensors from the set of sensors to generate sensor data. The sensor data may be analysed by the smart fence system to generate the intrusion detection signal. The method may further comprise a step for analysing, via the processor, the intrusion detection signal to detect a target zone corresponding to the intrusion detection signal. The method may comprise a step for transmitting, via the processor, navigation instructions to a candidate unmanned aerial vehicle from a set of unmanned aerial vehicles. The candidate unmanned aerial vehicle may be configured to capture a live video of the target zone upon reaching the target zone. The method may further comprise a step for receiving, via the processor, the live video of the target zone from the candidate unmanned aerial vehicle. The method may further comprise a step for analysing, via the processor, the live video of the target zone in order to determine the number and characteristics of intruders. The method may comprise a step for generating, via the processor, one or more alerts based on the analysis.
BRIEF DESCRIPTION OF DRAWINGS
The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like features and components.
Figure 1 illustrates a network implementation (100) of a system (101) for monitoring a facility (110), in accordance with an embodiment of the present subject matter.
Figure 2 illustrates components of the system (101) for monitoring the facility (110), in accordance with an embodiment of the present subject matter.
Figure 3 illustrates a method (300) for monitoring the facility (110) using a candidate unmanned aerial vehicle (106.1), in accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
Reference throughout the specification to “various embodiments,” “some embodiments,” “one embodiment,” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrases “in various embodiments,” “in some embodiments,” “in one embodiment,” or “in an embodiment” in places throughout the specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner in one or more embodiments.
The land area of wind farms, oil and gas properties, refineries, factories, construction sites, corporate offices, warehouses, and solar PV power plants is considerably large when compared to the area of a standalone building or a residential society, which can be easily monitored using CCTV cameras in order to detect intrusion by any unauthorized person. Typically, a solar PV power plant is huge compared with other energy production plants. Further, solar PV power plants are mostly located in remote areas of a country. The average size of a 100 to 250 MW solar PV power plant may extend up to 4800 acres or more, and the perimeter of such an area can run from 20 to 30 kilometres. Due to this, securing the assets in solar PV power plants from theft and damage has always been a huge challenge for the industry. To overcome this challenge, a novel technique of monitoring solar PV power plants using unmanned aerial vehicles is proposed. The entire process not only helps in reducing manpower fatigue, cost and time, but also helps in situations where the intruder holds a weapon and could cause damage to life and property, and it enables recording of faces and events as proof during such intrusion, even if the intruder runs away. The system for monitoring a facility such as a solar PV power plant is further explained with respect to Figures 1 to 3.
Referring now to Figure 1, a network implementation (100) of a system (101) for monitoring a facility (110) is illustrated, in accordance with an embodiment of the present subject matter. In one embodiment, the system (101) may be implemented on a server and may be configured to communicate with a set of Unmanned Aerial Vehicles (UAVs) (106.1-106.n) and a smart fence system (104). In one embodiment, the unmanned aerial vehicle (106) may be a surveillance drone or any similar UAV. In an embodiment, the surveillance drone may be connected to an electronic/user device (103) of a security guard or a remote command centre over a network (102). It may be understood that the surveillance drone may be accessed by multiple users/security guards through one or more electronic devices (103-1), (103-2), (103-3)…(103-n), collectively referred to as the electronic device (103) or user device(s) (103) hereinafter, or through applications residing on the electronic device (103). The user device (103) may be a data processing device of a person, a machine, software, an automated computer program, a robot, or a combination thereof. In one embodiment, the user device (103) may be in the form of a central monitoring station. In one embodiment, the user device (103-1) may be used by a security guard appointed for a particular location/zone in the facility (110).
In an embodiment, though the present subject matter is explained considering that the system (101) is implemented as a server, it may be understood that the system (101) may also be implemented in a variety of user devices, such as, but not limited to, a server, a portable computer, a personal digital assistant, a handheld device, a mobile phone, a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, and the like. In one embodiment, the system (101) may be implemented in a cloud-computing environment. In an embodiment, the network (102) may be a wireless network such as Bluetooth, Wi-Fi, LTE and the like, a wired network, or a combination thereof. The network (102) can be accessed by the user device (103) using wired or wireless network connectivity means, including updated communications technology.
In one embodiment, the network (102) can be implemented as one of the different types of networks, such as a cellular communication network, a local area network (LAN), a wide area network (WAN), a wireless LAN, the Internet, and the like. The network (102) may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network (102) may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
In one embodiment, the Unmanned Aerial Vehicles (106) may facilitate surveillance of the facility (110). Each Unmanned Aerial Vehicle may be equipped with a local positioning system, an imaging device, and a communication unit (not shown). The local positioning system may be a Global Positioning System (GPS) or a Global Navigation Satellite System (GNSS) receiver. The imaging device may be a camera such as, but not limited to, an infrared (IR) camera, a video camera, a thermal imaging camera, an ultraviolet (UV) camera, an RGB fusion camera, or a digital camera. In one embodiment, an infrared and RGB fusion camera may be used in order to obtain an RGB output. In one embodiment, the camera may also provide depth information. In one embodiment, more than one camera may be associated with the UAV (106.1). In one embodiment, the UAV (106.1) may comprise a 360-degree rotating LIDAR (Light Detection and Ranging) sensor used to avoid obstacles and ensure safe passage during flight.
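By way of illustration only, and not as part of the claimed subject matter, the state of each UAV as tracked by the system (101) could be modelled as in the following sketch; the field names (uav_id, battery_pct, cameras, and so on) are hypothetical and merely mirror the equipment listed above.

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class UAVStatus:
    """Illustrative record of one UAV (106.n) as tracked by the system (101)."""
    uav_id: str                          # e.g. "106.1"
    home_zone: str                       # zone whose charging pad the UAV is stationed at
    position: Tuple[float, float]        # latitude, longitude reported by the GPS/GNSS unit
    battery_pct: float                   # remaining battery, 0-100
    cameras: List[str] = field(default_factory=lambda: ["rgb"])  # e.g. "ir", "thermal", "rgb"
    has_lidar: bool = True               # 360-degree rotating LIDAR for obstacle avoidance
```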
Further, the smart fence system (104) may comprise a set of sensors (not shown). In one embodiment, the fence (107) may be divided into a set of zones (105.1-105.n), wherein each zone (105) corresponds to a section/part of the fence (107) of the facility (110). Each zone (105) may be monitored by one or more sensors from the set of sensors (not shown). The one or more sensors are configured to detect an intrusion of an unauthorized person into the facility (110). The sensors may be selected from a motion sensor, an infrared light sensor, a laser sensor, a camera with image processing, or any other sensor that can detect a human or animal crossing/entering the fence (107). In one embodiment, where the fence (107) is an electric fence, the sensors may be configured to detect whether the electric line of the electric fence has been broken by an intruder at a particular zone.
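A minimal sketch of how per-zone sensor data might be analysed to generate the intrusion detection signal is given below; it assumes boolean sensor readings and a simple any-sensor-tripped rule, neither of which is prescribed by the specification.

```python
from typing import Dict, List, Optional


def analyse_sensor_data(zone_readings: Dict[str, List[bool]]) -> Optional[dict]:
    """Return an intrusion detection signal for the first tripped zone, if any."""
    for zone_id, readings in zone_readings.items():
        if any(readings):  # at least one sensor in this zone reports a breach
            return {"event": "INTRUSION_DETECTED", "target_zone": zone_id}
    return None  # no breach detected anywhere along the fence (107)


# Example: a sensor in zone "105.3" reports a break in the electric line.
signal = analyse_sensor_data({"105.1": [False, False], "105.3": [True, False]})
# -> {"event": "INTRUSION_DETECTED", "target_zone": "105.3"}
```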
Further, upon detection of intrusion, the smart fence system (104) is configured to transmit a warning signal along with information of a target zone (105.1) at which the intrusion has taken place. This warning signal along with information of the target zone (105.1) is received by the system (101).
In one embodiment, each drone/UAV is stationed over a charging pad at a corresponding zone from the set of zones (105). The charging pad may be enabled with magnetic chargers or any other wireless charging system in order to charge the UAV. The charging electricity may be drawn from the solar plant itself or from any other source of electric energy. The system (101) may further be configured to instruct a candidate unmanned aerial vehicle (106.1) from the set of unmanned aerial vehicles to navigate to the target zone (105.1) and capture a live video of the target zone (105.1). This live video may be rendered on associated devices, including the display device of the security officer stationed at the watch tower (108.1) in the target zone (105.1).
In one embodiment, the live video may be analysed by the system (101) with the help of human operators or by using a machine intelligence system, such as but not limited to artificial intelligence (AI) algorithms and machine learning algorithms, to determine the number of intruders, weapons carried by the intruders, and other important characteristics of the intruders. Once the live video is analysed, the system (101) may generate one or more alerts based on the analysis and accordingly transmit the one or more alerts to the user and one or more concerned authorities. Once all these details are captured, a central manpower team can be deployed to the target zone (105.1) as per necessity. The drones may also be configured to perform regular patrolling at predefined time intervals.
Referring to Figure 2, components of the system (101) for monitoring the facility (110) are illustrated, in accordance with an embodiment of the present subject matter. The system (101) comprises at least one processor (201), an input/output (I/O) interface (203), a memory (205), modules (207) and data (219). In one embodiment, the at least one processor (201) is configured to fetch and execute computer-readable instructions stored in the memory (205). In one embodiment, the at least one processor (201) may be a microcontroller which can perform high-performance machine learning and AI processing for low-power devices.
In one embodiment, the I/O interface (203), implemented as a mobile application or a web-based application, may include a variety of software and hardware interfaces, for example, a web interface, a Graphical User Interface (GUI), and the like. The I/O interface (203) may allow the system (101) to interact with the smart fence system (104) and the user devices (103). Further, the I/O interface (203) may enable the user device (103) to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface (203) can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface (203) may include one or more ports for connecting to another server. In an exemplary embodiment, the I/O interface (203) is an interaction platform which may provide a connection between users and the system (101).
In an implementation, the memory (205) may include any computer-readable medium known in the art including, for example, volatile memory, such as static random-access memory (SRAM) and dynamic random-access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and memory cards. The memory (205) may include modules (207) and data (219).
In one embodiment, the modules (207) include routines, programs, objects, components, data structures, etc., which perform particular tasks or functions or implement particular abstract data types. In one implementation, the modules (207) may include a data receiving module (209), a navigation module (211), a capturing module (213), a determination module (215), and a transmission module (217). The data (219) may comprise a data store (221) and other data (223). The data (219), amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules (207).
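As a structural sketch only, the composition of the modules (207) and data (219) could be expressed as follows; the class names mirror Figure 2, while the bodies are placeholders rather than an implementation of the specification.

```python
# Placeholders only; each module's behaviour is described in the paragraphs below.
class DataReceivingModule: ...   # (209) receives intrusion detection signals
class NavigationModule: ...      # (211) resolves the target zone and dispatches a UAV
class CapturingModule: ...       # (213) receives the live video stream
class DeterminationModule: ...   # (215) analyses the video for intruders
class TransmissionModule: ...    # (217) generates and sends alerts


class MonitoringSystem:
    """System (101): a processor-executed composition of the modules (207) and data (219)."""

    def __init__(self) -> None:
        self.modules = {
            "data_receiving": DataReceivingModule(),
            "navigation": NavigationModule(),
            "capturing": CapturingModule(),
            "determination": DeterminationModule(),
            "transmission": TransmissionModule(),
        }
        self.data = {"data_store": {}, "other_data": {}}  # (221), (223)
```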
The aforementioned computing devices may support communication over one or more types of networks in accordance with the described embodiments. For example, some computing devices and networks may support communications over a Wide Area Network (WAN), the Internet, a telephone network (e.g., analog, digital, POTS, PSTN, ISDN, xDSL), a mobile telephone network (e.g., CDMA, GSM, NDAC, TDMA, E-TDMA, NAMPS, WCDMA, CDMA-2000, UMTS, 3G, 4G), a radio network, a television network, a cable network, an optical network (e.g., PON), a satellite network (e.g., VSAT), a packet-switched network, a circuit-switched network, a public network, a private network, and/or other wired or wireless communications network configured to carry data. Computing devices and networks also may support wireless wide area network (WWAN) communications services including Internet access such as EV-DO, EV-DV, CDMA/1×RTT, GSM/GPRS, EDGE, HSDPA, HSUPA, and others.
The aforementioned computing devices and networks may support wireless local area network (WLAN) and/or wireless metropolitan area network (WMAN) data communications functionality in accordance with Institute of Electrical and Electronics Engineers (IEEE) standards, protocols, and variants such as IEEE 802.11 (“WiFi”), IEEE 802.16 (“WiMAX”), IEEE 802.20x (“Mobile-Fi”), and others. Computing devices and networks also may support short range communication such as a wireless personal area network (WPAN) communication, Bluetooth® data communication, infrared (IR) communication, near-field communication, electromagnetic induction (EMI) communication, passive or active RFID communication, micro-impulse radar (MIR), ultra-wide band (UWB) communication, automatic identification and data capture (AIDC) communication, and others.
In one embodiment, the data receiving module (209) may be configured to receive an intrusion detection signal from the smart fence system (104). The smart fence system (104) may be configured to receive signals from a set of sensors installed over the fence (107) of the facility (110). The signals may represent a breach/ intrusion activity detected at a particular location/ zone of the facility (110). This signal may be analysed by the smart fence system (104) to detect an intrusion into the facility (110). Upon such detection, the smart fence system (104) may generate the intrusion detection signal and transmit it to the system (101). The data receiving module (209) may be configured to receive the intrusion detection signal from the smart fence system (104) and accordingly process the intrusion detection signal.
Further, the intrusion detection signal may be analysed by the navigation module (211) to detect a target zone (105.1) corresponding to the intrusion detection signal. The target zone (105.1) may be the zone at which the intrusion has taken place. Further, the navigation module (211) is configured to generate navigation instructions and transmit the navigation instructions to the candidate unmanned aerial vehicle (106.1), from the set of unmanned aerial vehicles (106), which is closest to the target zone (105.1). For this purpose, each unmanned aerial vehicle (106) is geo-tagged and the real-time location of each unmanned aerial vehicle (106) is monitored. Furthermore, the health of each UAV, in terms of battery level and rotor blade condition, may be monitored in real time. In another embodiment, the candidate unmanned aerial vehicle (106.1) is selected for surveillance based on a set of factors including the distance from the target zone (105.1), battery level, time of the day/night, weather conditions, type of imaging device mounted on the UAV, and the like. Upon reaching the target zone (105.1), the unmanned aerial vehicle (106.1) is configured to capture a live video of the target zone (105.1).
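A minimal sketch of one possible selection heuristic is given below. The specification lists the factors (distance, battery level, time of day/night, weather, imaging device type) but not how they are combined, so the scoring formula, weights, and field names are assumptions; the weather factor is omitted for brevity.

```python
import math
from typing import List


def select_candidate_uav(uavs: List[dict], target: tuple, night: bool) -> dict:
    """Pick the UAV that best balances proximity, battery level and sensor fit."""

    def score(uav: dict) -> float:
        dist = math.dist(uav["position"], target)        # straight-line distance to the target zone
        battery = uav["battery_pct"] / 100.0             # prefer well-charged UAVs
        # At night, strongly prefer UAVs carrying an IR or thermal imaging device.
        sensor_fit = 1.0 if (not night or {"ir", "thermal"} & set(uav["cameras"])) else 0.3
        return (battery * sensor_fit) / (1.0 + dist)     # higher is better

    return max(uavs, key=score)


candidate = select_candidate_uav(
    uavs=[
        {"id": "106.1", "position": (0.0, 1.2), "battery_pct": 80, "cameras": ["rgb", "ir"]},
        {"id": "106.2", "position": (0.0, 0.4), "battery_pct": 35, "cameras": ["rgb"]},
    ],
    target=(0.0, 0.0),
    night=True,
)  # -> UAV "106.1": farther away, but better charged and IR-equipped for night surveillance
```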
Further, the live video is received by the capturing module (213) of the system (101). The capturing module (213) may be configured to determine the available bandwidth for data transmission and accordingly decide the quality of the live video to be received.
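A minimal sketch of the bandwidth-dependent quality decision is shown below; the thresholds and profile names are illustrative assumptions, not values taken from the specification.

```python
def choose_stream_quality(available_mbps: float) -> str:
    """Map the measured uplink bandwidth to a requested live-video profile."""
    if available_mbps >= 8.0:
        return "1080p"
    if available_mbps >= 4.0:
        return "720p"
    if available_mbps >= 1.5:
        return "480p"
    return "240p"  # degrade gracefully so the live feed is never dropped entirely


assert choose_stream_quality(5.2) == "720p"
```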
Further, the live video is analysed by the determination module (215) using a machine intelligence system to determine the number of intruders, weapons carried by the intruders, and other important characteristics of the intruders such as, but not limited to, their gender and age.
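The following sketch shows how per-frame detections from a generic, unspecified object detector might be aggregated into the intrusion summary described above; the detector interface, labels, and attribute fields are assumptions made for illustration, not part of the specification.

```python
from typing import Callable, Dict, Iterable, List


def analyse_frames(frames: Iterable, detect: Callable[[object], List[dict]]) -> Dict:
    """Aggregate per-frame detections into a single intrusion summary."""
    summary = {"intruder_count": 0, "weapons_seen": False, "attributes": []}
    for frame in frames:
        detections = detect(frame)  # assumed to return dicts like {"label": "person", "attributes": {...}}
        persons = [d for d in detections if d["label"] == "person"]
        # Keep the maximum simultaneous person count seen in any one frame.
        summary["intruder_count"] = max(summary["intruder_count"], len(persons))
        summary["weapons_seen"] |= any(d["label"] == "weapon" for d in detections)
        summary["attributes"].extend(d.get("attributes", {}) for d in persons)
    return summary
```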
Once the analysis is performed, the transmission module (217) may generate one or more alerts based on the analysis and accordingly transmit the one or more alerts to the user. The live video may also be streamed on a display device of a security officer stationed at the watch tower (108.1) in the target zone (105.1). Once all these details are captured, a central manpower team can be deployed to the target zone (105.1) as per necessity. The UAVs (106) may also be configured to perform regular patrolling at predefined time intervals. The path for a particular drone present in that region, to cover a particular zone of the fence, may be predefined or can be defined manually by the monitoring team.
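A minimal sketch of alert generation based on such a summary is shown below; the severity levels, message format, and recipient routing are assumptions, not requirements of the specification.

```python
def generate_alerts(summary: dict, target_zone: str) -> list:
    """Turn the video-analysis summary into one or more alerts for transmission."""
    alert = {
        "zone": target_zone,
        "severity": "high" if summary["weapons_seen"] else "medium",
        "message": f"{summary['intruder_count']} intruder(s) detected in zone {target_zone}",
        "recipients": ["zone_guard", "central_monitoring"],
    }
    if summary["weapons_seen"]:
        alert["recipients"].append("law_enforcement")  # escalate when a weapon is observed
    return [alert]
```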
Figure 3 illustrates a method (300) for monitoring the facility (110) using the candidate unmanned aerial vehicle (106.1), in accordance with the embodiment of the present subject matter. The order in which the method (300) is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method (300) or alternate methods. Furthermore, the method (300) can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method (300) may be considered to be implemented in the above described system (101).
At step (301), the data receiving module (209) may be configured to receive an intrusion detection signal from the smart fence system (104) and accordingly process the intrusion detection signal.
At step (303), the intrusion detection signal may be further analysed by the navigation module (211) to detect a target zone (105.1) corresponding to the intrusion detection signal.
At step (305), the navigation module (211) is configured to generate navigation instructions and transmit the navigation instructions to the candidate unmanned aerial vehicle (106.1) from the set of unmanned aerial vehicles (106). The candidate unmanned aerial vehicle (106.1) may be selected from the set of unmanned aerial vehicles based on one or more factors comprising distance from the target zone, battery level, time of day/night, weather conditions, and type of imaging device mounted on each of the unmanned aerial vehicles.
At step (307), a live video is received by the capturing module (213) of the system (101) from the candidate unmanned aerial vehicle (106.1). The capturing module (213) may be configured to determine the available bandwidth for data transmission and accordingly decide the quality of the live video to be received.
At step (309), the live video is analysed by the determination module (215) to determine the number of intruders, weapons carried by the intruders and other important characteristics of the intruders.
At step (311), once the analysis is performed, the transmission module (217) may generate one or more reports/ alerts based on the analysis and accordingly transmit the one or more reports/ alerts to the security in charge.
WE CLAIM:
1. A system for monitoring a facility (110), wherein the system comprises:
a processor (201); and
a memory (205) coupled to the processor (201), wherein the processor (201) is configured to execute instructions stored in the memory (205) for:
receiving, an intrusion detection signal from a smart fence system (104), wherein the smart fence system (104) comprises a set of sensors installed over a fence (107) of the facility, wherein the fence is divided into a set of zones (105.1-105.n), wherein each zone (105) corresponds to a section of the fence (107), wherein each zone (105) is monitored by one or more sensors from the set of sensors to generate sensor data, wherein the sensor data is analysed by the smart fence system (104) to generate the intrusion detection signal;
analysing, the intrusion detection signal to detect a target zone (105.1) corresponding to the intrusion detection signal;
transmitting, navigation instructions to a candidate unmanned aerial vehicle (106.1) from a set of unmanned aerial vehicles (106), wherein the candidate unmanned aerial vehicle (106.1) is configured to capture a live video of the target zone (105.1), upon reaching the target zone (105.1);
receiving, the live video of the target zone (105.1) from the candidate unmanned aerial vehicle (106.1);
analysing, the live video of the target zone (105.1) in order to determine the number and characteristics of intruders;
generating, one or more alerts based on the analysis.
2. The system as claimed in claim 1, wherein the candidate unmanned aerial vehicle (106.1) is equipped with a local positioning system, an imaging device, and a communication unit.
3. The system as claimed in claim 1, wherein the candidate unmanned aerial vehicle (106.1) is selected from the set of unmanned aerial vehicles (106) based on one or more factors comprising distance from the target zone, battery level, time of day/night, weather conditions, and type of imaging device mounted on each of the unmanned aerial vehicles.
4. The system as claimed in claim 1, wherein analysis of the live video of the target zone (105.1) is performed using a machine intelligence system.
5. A method for monitoring a facility (110), wherein the method comprises:
receiving, via a processor (201), an intrusion detection signal from a smart fence system (104), wherein the smart fence system (104) comprises a set of sensors installed over a fence (107) of the facility (110), wherein the fence (107) is divided into a set of zones (105.1-105.n), wherein each zone (105) corresponds to a section of the fence (107), wherein each zone (105) is monitored by one or more sensors from the set of sensors to generate sensor data, wherein the sensor data is analysed by the smart fence system (104) to generate the intrusion detection signal;
analysing, via the processor (201), the intrusion detection signal to detect a target zone (105.1) corresponding to the intrusion detection signal;
transmitting, via the processor (201), navigation instructions to a candidate unmanned aerial vehicle (106.1) from a set of unmanned aerial vehicles (106), wherein the candidate unmanned aerial vehicle (106.1) is configured to capture a live video of the target zone (105.1), upon reaching the target zone (105.1);
receiving, via the processor (201), the live video of the target zone (105.1) from the candidate unmanned aerial vehicle (106.1);
analysing, via the processor (201), the live video of the target zone (105.1) in order to determine the number and characteristics of intruders;
generating, via the processor (201), one or more alerts based on the analysis.
6. The method as claimed in claim 5, wherein the candidate unmanned aerial vehicle (106.1) is selected from the set of unmanned aerial vehicles (106) based on one or more factors comprising distance from the target zone, battery level, time of day/night, weather conditions, and type of imaging device mounted on each of the unmanned aerial vehicles.
7. The method as claimed in claim 5, wherein the live video of the target zone (105.1) is analysed using a machine intelligence system.
Dated this 24th day of September, 2020