
Method And System To Detect Distance Between Entities

Abstract: The present disclosure provides a system 100 and method 300 for detecting distance between two or more entities out of a group of people. Images are obtained from an image capture device 102 and the images are processed to identify entities in the image. The images are then processed to determine a distance between two or more entities, and if the distance is below a predetermined threshold, an alert is generated, and the identity of the two or more entities is communicated to a central server. The data and logs are stored in a memory device 120.


Patent Information

Application #
Filing Date
11 August 2020
Publication Number
07/2022
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
info@khuranaandkhurana.com
Parent Application
Patent Number
Legal Status
Grant Date
2025-02-27
Renewal Date

Applicants

Bharat Forge Limited
Bharat Forge Limited, Mundhwa, Pune - 411036, Maharashtra, India.
Smartstraw India Private Limited
Smartstraw India Private Limited, Tower A, 4th Floor, Ratha Tek Meadows Road, Elcot Sez, Sholinganallur, Chennai - 600119, Tamil Nadu, India.

Inventors

1. BABASAHEB NEELKANTH KALYANI
Bharat Forge Limited, Mundhwa, Pune - 411036, Maharashtra, India.
2. PURUSHOTTAM BHARDWAJ
Bharat Forge Limited, Mundhwa, Pune - 411036, Maharashtra, India.
3. SUSHMITA PAWAR
Bharat Forge Limited, Mundhwa, Pune - 411036, Maharashtra, India.
4. ATUL BANSAL
Smartstraw India Private Limited, Tower A, 4th Floor, Ratha Tek Meadows Road, Elcot Sez, Sholinganallur, Chennai - 600119, Tamil Nadu, India.
5. MUKUL KUMAR
Smartstraw India Private Limited, Tower A, 4th Floor, Ratha Tek Meadows Road, Elcot Sez, Sholinganallur, Chennai - 600119, Tamil Nadu, India.
6. PRANEET BOMMA
Smartstraw India Private Limited, Tower A, 4th Floor, Ratha Tek Meadows Road, Elcot Sez, Sholinganallur, Chennai - 600119, Tamil Nadu, India.
7. VIPUL VAIBHAW
Smartstraw India Private Limited, Tower A, 4th Floor, Ratha Tek Meadows Road, Elcot Sez, Sholinganallur, Chennai - 600119, Tamil Nadu, India.
8. SARANJIT SAIKIA
Smartstraw India Private Limited, Tower A, 4th Floor, Ratha Tek Meadows Road, Elcot Sez, Sholinganallur, Chennai - 600119, Tamil Nadu, India.
9. TANMAY VAKARE
Smartstraw India Private Limited, Tower A, 4th Floor, Ratha Tek Meadows Road, Elcot Sez, Sholinganallur, Chennai - 600119, Tamil Nadu, India.

Specification

TECHNICAL FIELD
[0001] The present disclosure relates, in general, to public health. In particular, the present disclosure relates to a means to monitor distance between two or more entities.

BACKGROUND
[0002] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] COVID-19 has changed the world. Social distancing and wearing face masks have become essential to combat the spread of the disease. Enforcement of social-distancing and face-mask norms has become a new problem for organizations and society in general. The first problem in the enforcement of these norms is identifying those entities who are not following them. Once such entities are identified, enforcement of these norms becomes easier.
[0004] There is, therefore, a requirement in the art for a means to detect distance between any two entities in a group of entities.

OBJECTS OF THE PRESENT DISCLOSURE
[0005] Some of the objects of the present disclosure, which at least one embodiment herein satisfies, are listed herein below.
[0006] It is an object of the present disclosure to provide methods and systems to maintain an optimum predefined distance between two entities.
[0007] It is an object of the present disclosure to provide methods and systems to recommend maintaining an optimum predefined distance between entities so as to avoid spread of virus, germs and bacteria.

SUMMARY
[0008] An aspect of the present disclosure provides a method for measuring a distance between two entities, said method comprising: receiving, at a processor of a computing device, an image, the image pertaining to two or more entities present within a field of view (FOV) of an imaging device; based on the received image, identifying, at the processor, presence of the two or more entities within the FOV of the imaging device; determining, at the processor, a distance between each of the identified two or more entities using the received image, where the distance may be determined using a learning engine; and comparing, at the processor, the determined distance with a predefined distance value, wherein, upon a mismatch being detected, an alert is generated related to maintaining the predefined distance between the two or more entities.
[0009] In an embodiment, a plurality of regions of interest (ROIs) may be identified from the two or more entities to determine the distance between the two or more entities.
[0010] In an embodiment, the plurality of ROIs to be captured may pertain to any or a combination of a face area or a body area.
[0011] In an embodiment, the learning engine may be any of a neural network, a deep learning neural network, a real-time object detection model, a set of instructions for calibration, or a combination thereof.
[0012] In an embodiment, the imaging device may further comprise a frame integration unit and a noise filter for capturing the image.
[0013] An aspect of the present disclosure provides a system for measuring a distance between two entities, said system comprising: a processing engine of a computing device comprising a processor coupled with a memory, the memory storing instructions executable by the processor to: receive an image, the image pertaining to two or more entities present within a field of view (FOV) of an imaging device; based on the received image, identify presence of the two or more entities within the FOV of the imaging device; determine a distance between each of the identified two or more entities using the received image, where the distance may be determined using a learning engine; and compare the determined distance with a predefined distance value, wherein, upon a mismatch being detected, an alert is generated related to maintaining the predefined distance between the two or more entities.
BRIEF DESCRIPTION OF DRAWINGS
[0014] The accompanying drawings are included to provide further understanding of the present disclosure and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure. The diagrams are for illustration only, which thus is not a limitation of the present disclosure.
[0015] FIG. 1 illustrates an exemplary block diagram for a system to detect distance between two or more entities, in accordance with an embodiment of the present disclosure.
[0016] FIG. 2 illustrates exemplary functional components of the proposed system in accordance with an embodiment of the present disclosure.
[0017] FIG. 3 illustrates a flow diagram for a proposed method to detect distance between two or more entities, in accordance with an embodiment of the present disclosure.
[0018] FIG. 4 illustrates an exemplary computer system in which or with which embodiments of the present invention can be utilized in accordance with embodiments of the present disclosure.

DETAILED DESCRIPTION
[0019] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0020] The present disclosure relates, in general, to public health. In particular, the present disclosure relates to a means to monitor distance between two or more entities.
[0021] FIG. 1 illustrates an exemplary block diagram for a system to detect distance between two or more entities, in accordance with an embodiment of the present disclosure.
[0022] In an aspect, the system 100 can be implemented to detect the distance between two or more entities using images obtained from a camera, such as a closed-circuit television (CCTV) camera and the like.
[0023] In another aspect, the system 100 can be implemented to detect or identify any two entities who are not maintaining a predetermined distance from one another. The detection is accurate even at low image resolution and in low-light conditions.
[0024] In another aspect, the system 100 can employ a 9-cell based approach to determine physical distance between the two or more entities.
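The disclosure does not elaborate on the 9-cell approach. A minimal Python sketch of one plausible interpretation, dividing the frame into a 3x3 grid and flagging entity pairs whose centroids fall in the same cell, follows; the function names, the centroid representation and the same-cell proximity criterion are illustrative assumptions, not the patented implementation.

```python
# Hypothetical sketch of a 9-cell (3x3 grid) proximity check.
# This is one plausible reading of the "9-cell based approach";
# the disclosure does not specify its details.

def cell_index(point, frame_w, frame_h):
    """Map an (x, y) centroid to one of nine cells in a 3x3 grid."""
    x, y = point
    col = min(int(3 * x / frame_w), 2)
    row = min(int(3 * y / frame_h), 2)
    return row * 3 + col

def same_cell_pairs(centroids, frame_w, frame_h):
    """Return index pairs of entities whose centroids share a cell."""
    cells = [cell_index(c, frame_w, frame_h) for c in centroids]
    return [(i, j)
            for i in range(len(cells))
            for j in range(i + 1, len(cells))
            if cells[i] == cells[j]]
```

A pair sharing a cell would then be passed to a finer distance check rather than treated as a violation outright.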
[0025] In another aspect, the system 100 can be implemented in all environments, both indoors and outdoors.
[0026] In another aspect, the system 100 can be implemented without internet connectivity.
[0027] In another embodiment, the system 100 can receive input from an image capture device 102 such as, but not limited to, a camera or a CCTV. The image capture device 102 can have a USB interface with the system 100.
[0028] In another embodiment, the system 100 can include a processing engine 104, which can include processor(s) 106 and a memory 108, the memory 108 storing instructions executable by the processor(s) 106 to detect distance between two or more entities.
[0029] In another embodiment, the processing engine 104 can receive data from the image capture device 102. The data can include images in FOV of the image capture device 102, which can include one or more entities. The received data is pre-processed at a pre-processor 110 and sent to a facial recognition unit 112. The facial recognition unit 112 can include a back-end unit 150, which can include any suitable face detector 152 such as a YOLO v3 HR and a flask server 154 that is operatively coupled with the face detector 152. The facial recognition unit 112 can identify the one or more entities in the received images.
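The interface of the face detector 152 is not specified in the disclosure. As a hedged sketch, the post-processing its output might undergo (confidence filtering and ROI centroid extraction) could look like the following, where the (x, y, w, h, confidence) detection format is an assumption for illustration:

```python
# Sketch of post-processing for a YOLO-style face detector's output.
# The (x, y, w, h, confidence) tuple format and the 0.5 default
# threshold are illustrative assumptions, not taken from the patent.

def filter_detections(detections, conf_threshold=0.5):
    """Keep only detections at or above a confidence threshold."""
    return [d for d in detections if d[4] >= conf_threshold]

def roi_centroid(detection):
    """Centre point of a face/body bounding box, in pixels."""
    x, y, w, h, _ = detection
    return (x + w / 2.0, y + h / 2.0)
```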
[0030] In another embodiment, distance measurement unit 114 can determine, from the images received, a distance between any two entities. The distance measurement unit 114 further continuously monitors the distance between the two or more entities, thereby tracking position of the two or more entities.
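The distance computation performed by the distance measurement unit 114 is not detailed. One simple sketch, assuming entity centroids in pixel coordinates and a hypothetical pixels-per-metre calibration constant (the disclosure only says the distance may be determined using a learning engine or a set of instructions for calibration), is:

```python
import math

# Sketch of the distance-measurement step. The pixels_per_metre
# calibration factor is a hypothetical parameter for illustration.

def pixel_distance(c1, c2):
    """Euclidean distance between two centroids, in pixels."""
    return math.hypot(c1[0] - c2[0], c1[1] - c2[1])

def physical_distance(c1, c2, pixels_per_metre):
    """Approximate physical distance under a flat calibration."""
    return pixel_distance(c1, c2) / pixels_per_metre
```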
[0031] In an exemplary embodiment, the processing engine 104 can employ a learning engine such as neural networks, machine learning and other artificial intelligence (AI)-based learning to process the received data from the image capture device 102 (interchangeably referred to as the imaging device) to measure the distance between two or more entities.
[0032] In another embodiment, when the distance measurement unit 114 determines the distance between two or more entities to be less than a predefined distance value, a corresponding alert is generated at the alert generator 116.
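The alert step reduces to a threshold comparison over measured pairwise distances. A minimal sketch follows; the entity labels, the dictionary input format, and the 2-metre default are assumptions for illustration:

```python
# Sketch of the alert generator: flag every pair of entities whose
# measured distance is below the predefined distance value.
# The 2.0 m default and message wording are illustrative assumptions.

def generate_alerts(distances, predefined_distance=2.0):
    """distances: {(entity_a, entity_b): metres}. Returns alert strings."""
    return [f"ALERT: {a} and {b} are {d:.2f} m apart "
            f"(minimum {predefined_distance} m)"
            for (a, b), d in distances.items()
            if d < predefined_distance]
```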
[0033] In another embodiment, the system includes a display device 118, which can display information pertaining to any or a combination of the data received from the image capture device 102, the detected distance between each of the two or more entities, the identity of each of the two or more entities, and any alert generated.
[0034] In another embodiment, the above information can be communicated to a central server (not shown).
[0035] In another embodiment, the alert generated can be through a visual aid such as flashing LED lamps, displaying of a message etc. or through an audio aid such as an alarm.
[0036] In another embodiment, the system 100 can include a memory device 120, which can store the above-mentioned information.
[0037] In an embodiment, the disclosed system facilitates implementing a physical distance between entities in both indoor and outdoor spaces as this is an essential way to slow down the spread of COVID-19 or any other disease.
[0038] In an exemplary embodiment, the two or more entities may be two or more persons in/at an airport, a shopping mall, a market place, any organisation or any public place.
[0039] FIG. 2 illustrates exemplary functional components of the proposed system in accordance with an embodiment of the present disclosure.
[0040] In an aspect, the system 100 may comprise one or more processor(s) 202. The one or more processor(s) 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) 202 are configured to fetch and execute computer-readable instructions stored in a memory 206 of the system 100. The memory 206 may store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory 206 may comprise any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0041] The system 100 may also comprise an interface(s) 204. The interface(s) 204 may comprise a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 204 may facilitate communication of system 100. The interface(s) 204 may also provide a communication pathway for one or more components of the system 100. Examples of such components include, but are not limited to, processing engine(s) 208 and data 210.
[0042] The processing engine(s) 208 may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s) 208. In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing engine(s) 208 may be processor executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the processing engine(s) 208 may comprise a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing engine(s) 208. In such examples, the system 100 may comprise the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to system 100 and the processing resource. In other examples, the processing engine(s) 208 may be implemented by electronic circuitry.
[0043] The data 210 may comprise data that is either stored or generated as a result of functionalities implemented by any of the components of the processing engine(s) 208 or the system 100.
[0044] In an exemplary embodiment, the processing engine(s) 208 may include an image receiving engine 212, a presence identification engine 214, a distance determination engine 216, a comparison engine 218 and other engine(s) 220.
[0045] In an embodiment, the image receiving engine 212 can facilitate receiving an image. The image can pertain to two or more entities present within a field of view (FOV) of an imaging device.
[0046] In an embodiment, based on the received image, the image receiving engine 212 can identify presence of the two or more entities within the FOV of the imaging device using the presence identification engine 214.
[0047] In an aspect, a plurality of regions of interest (ROIs) may be identified from the two or more entities to determine the distance between any two entities. The plurality of ROIs to be captured may pertain to any or a combination of a face area or a body area.
[0048] In an embodiment, a distance determination engine 216 can determine a distance between the identified two or more entities using the received image. The distance can be determined using a learning engine.
[0049] In an aspect, the learning engine may be any of a neural network, a deep learning neural network, a real time object detection model or a set of instructions for calibration or a combination thereof.
[0050] In an embodiment, a comparison engine 218 can facilitate comparing the determined distance with a predefined distance value. Upon a mismatch being detected, an alert is generated related to maintaining the predefined distance between the two or more entities.
[0051] FIG. 3 illustrates a flow diagram for a proposed method to detect distance between two or more entities, in accordance with an embodiment of the present disclosure.
[0052] With respect to method 300, at block 302, an image is received, the image pertaining to two or more entities present within a field of view (FOV) of an imaging device. Based on the received image, at block 304, presence of the two or more entities within the FOV of the imaging device is identified. Further, at block 306, a distance between the identified two or more entities is determined using the received image. The distance may be determined using a learning engine. At block 308, the determined distance is compared with a predefined distance value. Upon detecting the compared distance to be less than the predefined distance, an alert may be generated related to maintaining the predefined distance between each of the two or more entities.
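The flow of blocks 302 to 308 can be condensed into a short end-to-end sketch. The bounding-box detection format and the pixels_per_metre calibration constant are hypothetical stand-ins; the disclosure leaves the detector output and the calibration method unspecified.

```python
import math

# End-to-end sketch of method 300 (blocks 302-308). Detections are
# assumed to be (x, y, w, h) boxes in pixels and pixels_per_metre is
# a hypothetical calibration constant, not taken from the patent.

def method_300(detections, pixels_per_metre, predefined_distance=2.0):
    # Block 304: identify entities, one per bounding box, as centroids.
    centroids = [(x + w / 2.0, y + h / 2.0) for x, y, w, h in detections]
    alerts = []
    # Blocks 306-308: measure each pairwise distance and compare it
    # against the predefined distance value.
    for i in range(len(centroids)):
        for j in range(i + 1, len(centroids)):
            px = math.hypot(centroids[i][0] - centroids[j][0],
                            centroids[i][1] - centroids[j][1])
            metres = px / pixels_per_metre
            if metres < predefined_distance:
                alerts.append((i, j, round(metres, 2)))
    return alerts
```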
[0053] FIG. 4 illustrates an exemplary computer system in which or with which embodiments of the present invention can be utilized in accordance with embodiments of the present disclosure.
[0054] As shown in FIG. 4, computer system 400 includes an external storage device 410, a bus 420, a main memory 430, a read only memory 440, a mass storage device 450, a communication port 460, and a processor 470. An entity skilled in the art will appreciate that the computer system may include more than one processor and communication ports. Examples of processor 470 include, but are not limited to, an Intel® Itanium® or Itanium 2 processor(s), or AMD® Opteron® or Athlon MP® processor(s), Motorola® lines of processors, FortiSOC™ system on a chip processors or other future processors. Processor 470 may include various modules associated with embodiments of the present invention. Communication port 460 can be any of an RS-232 port for use with a modem-based dialup connection, a 10/100 Ethernet port, a Gigabit or 10 Gigabit port using copper or fibre, a serial port, a parallel port, or other existing or future ports. Communication port 460 may be chosen depending on a network, such as a Local Area Network (LAN), a Wide Area Network (WAN), or any network to which the computer system connects.
[0055] Memory 430 can be Random Access Memory (RAM), or any other dynamic storage device commonly known in the art. Read only memory 440 can be any static storage device(s) e.g., but not limited to, a Programmable Read Only Memory (PROM) chips for storing static information e.g., start-up or BIOS instructions for processor 470. Mass storage 450 may be any current or future mass storage solution, which can be used to store information and/or instructions. Exemplary mass storage solutions include, but are not limited to, Parallel Advanced Technology Attachment (PATA) or Serial Advanced Technology Attachment (SATA) hard disk drives or solid-state drives (internal or external, e.g., having Universal Serial Bus (USB) and/or Firewire interfaces), e.g. those available from Seagate (e.g., the Seagate Barracuda 7200 family) or Hitachi (e.g., the Hitachi Deskstar 7K1000), one or more optical discs, Redundant Array of Independent Disks (RAID) storage, e.g. an array of disks (e.g., SATA arrays), available from various vendors including Dot Hill Systems Corp., LaCie, Nexsan Technologies, Inc. and Enhance Technology, Inc.
[0056] Bus 420 communicatively couples processor(s) 470 with the other memory, storage, and communication blocks. Bus 420 can be, e.g., a Peripheral Component Interconnect (PCI) / PCI Extended (PCI-X) bus, Small Computer System Interface (SCSI), USB or the like, for connecting expansion cards, drives and other subsystems, as well as other buses, such as a front side bus (FSB), which connects processor 470 to the software system.
[0057] Optionally, operator and administrative interfaces, e.g., a display, keyboard, and a cursor control device, may also be coupled to bus 420 to support direct operator interaction with computer system. Other operator and administrative interfaces can be provided through network connections connected through communication port 460. External storage device 410 can be any kind of external hard-drives, floppy drives, IOMEGA® Zip Drives, Compact Disc - Read Only Memory (CD-ROM), Compact Disc - Re-Writable (CD-RW), Digital Video Disk - Read Only Memory (DVD-ROM). Components described above are meant only to exemplify various possibilities. In no way should the aforementioned exemplary computer system limit the scope of the present disclosure.
[0058] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable an entity having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the entity having ordinary skill in the art.

ADVANTAGES OF THE PRESENT DISCLOSURE
[0059] The present disclosure provides systems and methods to maintain an optimum predefined distance between two or more entities.
[0060] The present disclosure provides methods and systems to recommend maintaining an optimum predefined distance between entities so as to avoid spread of virus, germs and bacteria.
CLAIMS:
1. A method for measuring a distance between two entities, said method comprising:
receiving, at a processor of a computing device, an image from an imaging device, the image pertaining to two or more entities present within a field of view (FOV) of the imaging device;
based on the received image, identifying, at the processor, presence of the two or more entities within the FOV of the imaging device;
determining, at the processor, a distance between each of the identified two or more entities using the received image, where the distance is determined using a learning engine; and
comparing, at the processor, the determined distance with a predefined distance value, wherein upon detecting the compared distance being less than the predefined distance, an alert is generated related to maintaining the predefined distance between each of the two or more entities.

2. The method as claimed in claim 1, wherein a plurality of regions of interest (ROIs) are identified from each of the two or more entities to determine the distance between each of the two or more entities.

3. The method as claimed in claim 1, wherein the plurality of ROIs to be captured pertains to any or a combination of a facial area or a body area.

4. The method as claimed in claim 1, wherein the learning engine is any of a neural network, a deep learning neural network, a real time object detection model, a set of instructions for calibration or a combination thereof.

5. The method as claimed in claim 1, wherein the imaging device further comprises a frame integration unit and a noise filter for capturing the image.

6. A system for measuring a distance between two entities, said system comprising:
a processing engine of a computing device comprising a processor coupled with a memory, the memory storing instructions executable by the processor to:
receive an image from an imaging device, the image pertaining to two or more entities present within a field of view (FOV) of the imaging device;
based on the received image, identify presence of the two or more entities within the FOV of the imaging device;
determine a distance between each of the identified two or more entities using the received image, where the distance is determined using a learning engine; and
compare the determined distance with a predefined distance value, wherein upon detecting the compared distance being less than the predefined distance, an alert is generated related to maintaining the predefined distance between each of the two or more entities.

Documents

Application Documents

# Name Date
1 202021034381-STATEMENT OF UNDERTAKING (FORM 3) [11-08-2020(online)].pdf 2020-08-11
2 202021034381-PROVISIONAL SPECIFICATION [11-08-2020(online)].pdf 2020-08-11
3 202021034381-FORM 1 [11-08-2020(online)].pdf 2020-08-11
4 202021034381-DRAWINGS [11-08-2020(online)].pdf 2020-08-11
5 202021034381-DECLARATION OF INVENTORSHIP (FORM 5) [11-08-2020(online)].pdf 2020-08-11
6 202021034381-FORM-26 [28-10-2020(online)].pdf 2020-10-28
7 202021034381-Proof of Right [13-01-2021(online)].pdf 2021-01-13
8 202021034381-ENDORSEMENT BY INVENTORS [10-08-2021(online)].pdf 2021-08-10
9 202021034381-DRAWING [10-08-2021(online)].pdf 2021-08-10
10 202021034381-COMPLETE SPECIFICATION [10-08-2021(online)].pdf 2021-08-10
11 202021034381-CORRESPONDENCE-OTHERS [10-08-2021(online)].pdf 2021-08-10
12 202021034381-FORM 18 [19-08-2021(online)].pdf 2021-08-19
13 Abstract1.jpg 2022-01-24
14 202021034381-FER.pdf 2022-03-09
15 202021034381-ABSTRACT [06-09-2022(online)].pdf 2022-09-06
16 202021034381-CLAIMS [06-09-2022(online)].pdf 2022-09-06
17 202021034381-COMPLETE SPECIFICATION [06-09-2022(online)].pdf 2022-09-06
18 202021034381-CORRESPONDENCE [06-09-2022(online)].pdf 2022-09-06
19 202021034381-FER_SER_REPLY [06-09-2022(online)].pdf 2022-09-06
20 202021034381-FORM-26 [06-09-2022(online)].pdf 2022-09-06
21 202021034381-US(14)-HearingNotice-(HearingDate-17-01-2025).pdf 2024-12-09
22 202021034381-FORM-26 [14-01-2025(online)].pdf 2025-01-14
23 202021034381-Correspondence to notify the Controller [14-01-2025(online)].pdf 2025-01-14
24 202021034381-Annexure [31-01-2025(online)].pdf 2025-01-31
25 202021034381-FORM-26 [31-01-2025(online)].pdf 2025-01-31
26 202021034381-Written submissions and relevant documents [31-01-2025(online)].pdf 2025-01-31
27 202021034381-PatentCertificate27-02-2025.pdf 2025-02-27
28 202021034381-IntimationOfGrant27-02-2025.pdf 2025-02-27

Search Strategy

1 searchstE_03-03-2022.pdf

E-Register / Renewals

3rd: 15 May 2025

From 11/08/2022 - To 11/08/2023

4th: 15 May 2025

From 11/08/2023 - To 11/08/2024

5th: 15 May 2025

From 11/08/2024 - To 11/08/2025

6th: 15 May 2025

From 11/08/2025 - To 11/08/2026