Abstract: The present disclosure provides a system (100) and method (300) for detecting presence of face masks on one or more entities. Images are obtained from an image capture device (102) such as a CCTV, and the images are processed to identify the one or more entities. The images are then processed to determine if the one or more entities are wearing a face mask. If not, an alert is generated, and identification of the one or more entities not wearing the face mask is communicated to a central server. The data and logs are stored in a memory device (122).
DESC:TECHNICAL FIELD
[0001] The present disclosure relates, in general, to public health. In particular, the present disclosure relates to a means to monitor the presence of face masks on one entity or a group of entities.
BACKGROUND
[0002] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] COVID-19 has changed the world. Social distancing and wearing face masks have become absolutely essential to combat the spread of this disease. Enforcement of social distancing and face-mask norms has become a new problem for organizations and society in general. A survey was conducted to find out why some citizens do not wear masks. Generally, entities/people do not wear masks because they believe they do not need to as long as they maintain social distancing. The first problem in enforcing these norms is identifying the entities not following them. Once such entities are identified, enforcement of these norms becomes easier.
[0004] There is, therefore, a requirement in the art for a means to detect presence of face masks on one or more entities at a time.
OBJECTS OF THE PRESENT DISCLOSURE
[0005] Some of the objects of the present disclosure, which at least one embodiment herein satisfies are as listed herein below.
[0006] It is an object of the present disclosure to provide systems and methods to determine whether an entity is wearing a face mask or not.
[0007] It is an object of the present disclosure to provide systems and methods to recommend wearing a face mask when the entity is not wearing one.
[0008] It is an object of the present disclosure to provide systems and methods that facilitate enforcement of rules and regulations.
[0009] It is an object of the present disclosure to provide systems and methods to prevent spread of the diseases.
[0010] It is an object of the present disclosure to provide systems and methods for maintaining safety and enhancing health of the public.
SUMMARY
[0011] An aspect of the present disclosure provides a method for determining whether an entity is wearing a face mask or not, said method comprising: capturing, at a processor associated with an imaging device, a first set of image frames of the entity present within a field of view (FOV) of the imaging device; comparing, at the processor using a learning engine, the captured first set of image frames with a predefined image classifier stored in a memory; and in response to determining an unsuccessful match based on the comparison, generating, at the processor, an alert, where the alert corresponds to the entity not wearing the face mask.
[0012] In an embodiment, the imaging device is an image capture device such as a CCTV camera, security camera, monitor, recorder, and the like.
[0013] In an embodiment, the image capture device is configured to detect and record activities.
[0014] In an embodiment, the alert is any of a visual alert or an audible alert.
[0015] In an embodiment, the learning engine is any of a neural network, a deep learning neural network, and a real time object detection model.
[0016] In an embodiment, a face of the entity is detected from the captured first set of image frames, where the face is detected using any of an image-based model or a feature-based model.
[0017] An aspect of the present disclosure provides a system for determining whether an entity is wearing a face mask or not, said system comprising: a processing engine of a computing device comprising a processor coupled with a memory, the memory storing instructions executable by the processor to: capture a first set of image frames of the entity present within a field of view (FOV) of an imaging device; compare, using a learning engine, the captured first set of image frames with a predefined image classifier stored in the memory; and in response to determining an unsuccessful match based on the comparison, generate an alert, where the alert corresponds to the entity not wearing the face mask.
BRIEF DESCRIPTION OF DRAWINGS
[0018] The accompanying drawings are included to provide further understanding of the present disclosure and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure. The diagrams are for illustration only, which thus is not a limitation of the present disclosure.
[0019] FIG. 1 illustrates an exemplary block diagram for a system to detect the presence of face mask on one or more entities, in accordance with an embodiment of the present disclosure.
[0020] FIG. 2 illustrates exemplary functional components of the proposed system in accordance with an embodiment of the present disclosure.
[0021] FIG. 3 illustrates an exemplary flow diagram for a method to detect the presence of face mask on one or more entities, in accordance with an embodiment of the present disclosure.
[0022] FIG. 4 illustrates an exemplary computer system in which or with which embodiments of the present invention can be utilized in accordance with embodiments of the present disclosure.
DETAILED DESCRIPTION
[0023] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0024] The present disclosure relates, in general, to public health. In particular, the present disclosure relates to a means to monitor the presence of face-masks on one or a group of entities.
[0025] FIG. 1 illustrates an exemplary block diagram for a system to detect the presence of a face mask on one or more entities, in accordance with an embodiment of the present disclosure.
[0026] In an aspect, system 100 can be implemented to detect the presence of face-masks on one or more entities using images obtained from a camera such as a closed-circuit camera or CCTV.
[0027] In another aspect, the system 100 can be implemented to detect or identify the entity or entities who are not wearing face masks. The detection is accurate even at low image resolution and in low-light conditions.
[0028] In another aspect, the system 100 can be implemented in all environments, both indoors and outdoors.
[0029] In another aspect, the system 100 can be implemented without internet connectivity.
[0030] In another embodiment, the system 100 can receive input from an image capture device 102 such as a CCTV. The optical image capture device 102 can have a USB interface with the system 100.
[0031] In another embodiment, the system 100 can include a processing engine 104, which can include processor(s) 106 and a memory 108, the memory 108 storing instructions executable by the processor(s) 106 to detect the presence of face mask on one or more entities.
[0032] In another embodiment, the processing engine 104 can receive data from the image capture device 102. The data can include images in the field of view (FOV) of the optical image capture device 102, which can include one or more entities. The received data is pre-processed at a pre-processor 110 and sent to a facial recognition unit 112. The facial recognition unit 112 can include a back-end unit 150, which can include any suitable face detector 152, such as a YOLO v3 HR detector, and a Flask server 154 that is operatively coupled with the face detector 152. The facial recognition unit 112 can identify one or more entities in the received images.
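The flow through the facial recognition unit 112 can be sketched in Python as follows. StubFaceDetector, the tuple-based frame format, and the 0.5 confidence threshold are illustrative stand-ins for the YOLO v3 HR detector and Flask back-end of the disclosure, not an actual implementation:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class BoundingBox:
    """One detected face region in a frame."""
    x: int
    y: int
    w: int
    h: int
    confidence: float

class StubFaceDetector:
    """Stands in for the YOLO v3 HR face detector of the back-end unit (150).

    A real deployment would load trained weights and run inference behind
    the Flask server (154); here a frame is simply a list of annotated
    regions so the pipeline can be exercised without a GPU.
    """
    def __init__(self, threshold: float = 0.5):
        self.threshold = threshold

    def detect(self, frame) -> List[BoundingBox]:
        # In this sketch 'frame' is a list of (x, y, w, h, confidence) tuples.
        return [BoundingBox(*region) for region in frame
                if region[4] >= self.threshold]

def recognize_entities(frames, detector: StubFaceDetector):
    """Facial recognition unit (112): one list of face boxes per frame."""
    return [detector.detect(frame) for frame in frames]
```

In a real system the frames would be decoded camera images and the detector would return boxes from network inference; the surrounding pipeline shape stays the same.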
[0033] In another embodiment, the optimized YOLO v3 HR face detector is capable of processing 20 camera streams at 20 FPS on a single GPU.
[0034] In another embodiment, a face mask detector 114 determines, from the images of one or more entities, if the one or more entities are wearing a face mask.
[0035] In an exemplary embodiment, the processing engine 104 can employ a learning engine 118 such as neural networks, machine learning, and other artificial intelligence (AI)-based learning to process the received data from the image capture device 102 to detect the presence of face masks and to identify the one or more entities not wearing face masks.
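The mask-detection stage can be sketched as follows; classify_mask, mask_score_fn, and the 0.5 decision threshold are hypothetical stand-ins for the trained learning engine 118, which the disclosure does not specify in detail:

```python
def classify_mask(face_region, mask_score_fn, threshold: float = 0.5) -> str:
    """Mask detector (114): decide whether a detected face wears a mask.

    'mask_score_fn' stands in for the learning engine (118); a real system
    would run a neural network over the cropped face and return a mask
    probability. Here any callable returning a score in [0, 1] works.
    """
    score = mask_score_fn(face_region)
    return "mask" if score >= threshold else "no_mask"

def detect_unmasked(faces, mask_score_fn, threshold: float = 0.5):
    """Return indices of the detected faces classified as not wearing a mask."""
    return [i for i, face in enumerate(faces)
            if classify_mask(face, mask_score_fn, threshold) == "no_mask"]
```

The indices returned here are what the alert generator 116 would act on.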
[0036] In another embodiment, when the face mask detector 114 determines an entity or entities not to be wearing a face mask, a corresponding alert is generated at the alert generator 116.
[0037] In another embodiment, the system includes a display device 120, which can display information pertaining to any or a combination of data received from the image capture device 102, detected entity or entities not wearing a facial mask, identity of the entity or entities not wearing the face mask and an alert generated.
[0038] In another embodiment, the above information can be communicated to a central server.
[0039] In another embodiment, the alert generated can be through a visual aid such as flashing LED lamps, displaying of a message, etc. or through an audio aid such as an alarm.
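A minimal sketch of the alert generator 116, assuming the two alert modalities above; the message strings and AlertType names are illustrative, and a real system would drive an LED, display, or alarm rather than return text:

```python
import enum

class AlertType(enum.Enum):
    VISUAL = "visual"    # e.g. flashing LED lamps or a displayed message
    AUDIBLE = "audible"  # e.g. an alarm

def generate_alert(entity_id, alert_type: AlertType) -> str:
    """Alert generator (116): format an alert for an entity without a mask.

    Returns the message that the visual or audible aid would emit.
    """
    if alert_type is AlertType.VISUAL:
        return f"[DISPLAY] Entity {entity_id}: please wear a face mask"
    return f"[ALARM] Entity {entity_id} detected without a face mask"
```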
[0040] In another embodiment, the system 100 can include a memory device 122, which can store the above-mentioned information.
[0041] FIG. 2 illustrates exemplary functional components of the proposed system in accordance with an embodiment of the present disclosure.
[0042] In an aspect, the system 100 may comprise one or more processor(s) 202. The one or more processor(s) 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) 202 are configured to fetch and execute computer-readable instructions stored in a memory 206 of the system 100. The memory 206 may store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory 206 may comprise any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0043] The system 100 may also comprise an interface(s) 204. The interface(s) 204 may comprise a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 204 may facilitate communication of system 100. The interface(s) 204 may also provide a communication pathway for one or more components of the system 100. Examples of such components include, but are not limited to, processing engine(s) 208 and data 210.
[0044] The processing engine(s) 208 may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s) 208. In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing engine(s) 208 may be processor executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the processing engine(s) 208 may comprise a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing engine(s) 208. In such examples, the system 100 may comprise the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to system 100 and the processing resource. In other examples, the processing engine(s) 208 may be implemented by electronic circuitry.
[0045] The database 210 may comprise data that is either stored or generated as a result of functionalities implemented by any of the components of the processing engine(s) 208 or the system 100.
[0046] In an exemplary embodiment, the processing engine(s) 208 may include an image frames capturing engine 212, a comparison engine 214, an alert generation engine 216, and other engine(s) 218.
[0047] In an embodiment, the image frames capturing engine 212 facilitates capturing, using an imaging device, a first set of image frames of the entity present within a field of view (FOV) of the imaging device. The imaging device may be a CCTV. In an aspect, a face of the entity may be detected from the captured first set of image frames. The face may be detected using any of an image-based model or a feature-based model.
[0048] In an embodiment, the comparison engine 214 facilitates comparing, using a learning engine, the captured first set of image frames with a predefined image classifier stored in a memory. In an aspect, the learning engine may be any of a neural network, a deep learning neural network, and a real time object detection model.
[0049] In response to determining an unsuccessful match based on the comparison, the alert generation engine 216 generates an alert. The alert corresponds to the entity not wearing the face mask. In an aspect, the alert may be any of a visual alert or an audible alert.
[0050] In an embodiment, the system 100 may use an artificial neural network to recognize whether the entity is wearing a mask. The system 100 may be connected to any existing or new IP mask-detection cameras to detect an entity without a mask. The system 100 may facilitate an administrator to add faces of entities along with phone numbers, so that an alert can be sent to an entity not wearing a mask. If the camera captures an unrecognized face, a notification can be sent to the administrator. In yet another embodiment, a photo or video of the entity not wearing a mask, as captured by the camera, is provided.
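The administrator workflow above can be sketched as follows. EntityRegistry and its in-memory outbox are hypothetical stand-ins for the face database and the SMS/notification channel, which the disclosure does not specify:

```python
class EntityRegistry:
    """Administrator-maintained registry of known faces and phone numbers.

    'face_id' stands in for whatever identifier the facial recognition unit
    assigns to a recognized face. Notifications are collected in a list
    rather than sent over a real channel, keeping the sketch self-contained.
    """
    def __init__(self, admin_phone: str):
        self.admin_phone = admin_phone
        self.entities = {}   # face_id -> phone number
        self.outbox = []     # (phone, message) pairs "sent"

    def add_entity(self, face_id: str, phone: str) -> None:
        """Administrator adds a known face with a contact number."""
        self.entities[face_id] = phone

    def notify_no_mask(self, face_id: str) -> None:
        """Alert the entity directly, or the administrator if unrecognized."""
        phone = self.entities.get(face_id)
        if phone is None:
            # Unrecognized face: escalate to the administrator instead.
            self.outbox.append(
                (self.admin_phone,
                 f"Unrecognized entity {face_id} detected without a mask"))
        else:
            self.outbox.append((phone, "Please wear your face mask"))
```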
[0051] In another embodiment, when the face mask detector 114 identifies an entity who is not wearing a mask, the entity may be informed, and AI alerts may be sent with a picture of the entity.
[0052] FIG. 3 illustrates an exemplary flow diagram for a method to detect the presence of face mask on one or more entities, in accordance with an embodiment of the present disclosure.
[0053] At block 302, capture, using an imaging device, a first set of image frames of the entity present within a field of view (FOV) of the imaging device. At block 304, compare, using a learning engine, the captured first set of image frames with a predefined image classifier stored in a memory. At block 306, in response to determining an unsuccessful match based on the comparison, generate an alert. The alert corresponds to the entity not wearing the face mask.
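Blocks 302-306 can be wired together as a minimal sketch; the three callables are illustrative stand-ins for the imaging device, the learning-engine comparison against the predefined image classifier, and the alert generator:

```python
def run_mask_check(frames, classifier_match_fn, alert_fn):
    """Blocks 302-306 of the method wired together.

    302: 'frames' is the first set of captured image frames.
    304: 'classifier_match_fn' compares a frame against the predefined image
         classifier and returns True on a successful match (mask present).
    306: on an unsuccessful match, 'alert_fn' is invoked for that frame,
         signalling an entity not wearing the face mask.
    """
    alerts = []
    for i, frame in enumerate(frames):
        if not classifier_match_fn(frame):   # unsuccessful match at 304
            alerts.append(alert_fn(i))       # alert generated at 306
    return alerts
```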
[0054] FIG. 4 illustrates an exemplary computer system in which or with which embodiments of the present invention can be utilized in accordance with embodiments of the present disclosure.
[0055] As shown in FIG. 4, computer system 400 includes an external storage device 410, a bus 420, a main memory 430, a read only memory 440, a mass storage device 450, communication port 460, and a processor 470. A person skilled in the art will appreciate that the computer system may include more than one processor and communication ports. Examples of processor 470 include, but are not limited to, an Intel® Itanium® or Itanium 2 processor(s), or AMD® Opteron® or Athlon MP® processor(s), Motorola® lines of processors, FortiSOC™ system on a chip processors or other future processors. Processor 470 may include various modules associated with embodiments of the present invention. Communication port 460 can be any of an RS-232 port for use with a modem-based dialup connection, a 10/100 Ethernet port, a Gigabit or 10 Gigabit port using copper or fibre, a serial port, a parallel port, or other existing or future ports. Communication port 460 may be chosen depending on a network, such as a Local Area Network (LAN), a Wide Area Network (WAN), or any network to which the computer system connects.
[0056] Memory 430 can be Random Access Memory (RAM), or any other dynamic storage device commonly known in the art. Read only memory 440 can be any static storage device(s) e.g., but not limited to, a Programmable Read Only Memory (PROM) chips for storing static information e.g., start-up or BIOS instructions for processor 470. Mass storage 450 may be any current or future mass storage solution, which can be used to store information and/or instructions. Exemplary mass storage solutions include, but are not limited to, Parallel Advanced Technology Attachment (PATA) or Serial Advanced Technology Attachment (SATA) hard disk drives or solid-state drives (internal or external, e.g., having Universal Serial Bus (USB) and/or Firewire interfaces), e.g. those available from Seagate (e.g., the Seagate Barracuda 7200 family) or Hitachi (e.g., the Hitachi Deskstar 7K1000), one or more optical discs, Redundant Array of Independent Disks (RAID) storage, e.g. an array of disks (e.g., SATA arrays), available from various vendors including Dot Hill Systems Corp., LaCie, Nexsan Technologies, Inc. and Enhance Technology, Inc.
[0057] Bus 420 communicatively couples processor(s) 470 with the other memory, storage, and communication blocks. Bus 420 can be, e.g. a Peripheral Component Interconnect (PCI) / PCI Extended (PCI-X) bus, Small Computer System Interface (SCSI), USB or the like, for connecting expansion cards, drives and other subsystems, as well as other buses, such as a front side bus (FSB), which connects processor 470 to the software system.
[0058] Optionally, operator and administrative interfaces, e.g. a display, keyboard, and a cursor control device, may also be coupled to bus 420 to support direct operator interaction with computer system. Other operator and administrative interfaces can be provided through network connections connected through communication port 460. External storage device 410 can be any kind of external hard-drives, floppy drives, IOMEGA® Zip Drives, Compact Disc - Read Only Memory (CD-ROM), Compact Disc - Re-Writable (CD-RW), Digital Video Disk - Read Only Memory (DVD-ROM). Components described above are meant only to exemplify various possibilities. In no way should the aforementioned exemplary computer system limit the scope of the present disclosure.
[0059] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to such a person.
ADVANTAGES OF THE PRESENT DISCLOSURE
[0060] The present disclosure provides systems and methods to determine whether an entity is wearing a face mask or not.
[0061] The present disclosure provides systems and methods to recommend wearing a face mask when the entity is not wearing one by generating an alert. The present disclosure provides systems and methods that facilitate enforcement of the rules and regulations.
[0062] The present disclosure provides systems and methods to enable the prevention of the spread of diseases.
[0063] The present disclosure provides systems and methods for maintaining safety and enhancing health of the public.
CLAIMS:
1. A method for determining presence of a face mask worn by an entity, said method comprising:
capturing, at a processor (106) associated with an imaging device, a first set of image frames of the entity within a Field of View (FOV) of the imaging device;
comparing, at the processor (106) using a learning engine (118), the captured first set of image frames with a predefined image classifier stored in a memory (108); and
generating, at the processor (106), an alert in response to determining an unsuccessful match based on the comparison, wherein the alert corresponds to the entity not wearing the face mask.
2. The method as claimed in claim 1, wherein the imaging device is an image capturing device (102).
3. The method as claimed in claim 1, wherein the alert is any of a visual alert or an audible alert.
4. The method as claimed in claim 1, wherein the learning engine is any of a neural network, a deep learning neural network, and a real time object detection model.
5. The method as claimed in claim 1, wherein a face of the entity is detected from the captured first set of image frames, where the face is detected using any of an image-based model or a feature-based model.
6. A system (100) for determining presence of a face mask worn by an entity, said system (100) comprising:
a processing engine (104) of a computing device comprising a processor (106) coupled with a memory (108), the memory (108) storing instructions executable by the processor (106) to:
capture a first set of image frames of the entity present within a field of view (FOV) of the imaging device (102);
compare, using a learning engine (118), the captured first set of image frames with a predefined image classifier stored in a memory (108); and
in response to determining an unsuccessful match based on the comparison, generate an alert, where the alert corresponds to the entity not wearing the face mask.
7. The system (100) as claimed in claim 6, wherein the alert is any of a visual alert or an audible alert.
8. The system (100) as claimed in claim 6, wherein the learning engine (118) is any of a neural network, a deep learning neural network, and a real time object detection model.
9. The system (100) as claimed in claim 6, wherein a face of the entity is detected from the captured first set of image frames, where the face is detected using any of an image-based model or a feature-based model.