Abstract: The present disclosure provides a wearable surveillance device 100 to track location and movement of an entity. The surveillance device 100 can include a processing unit 102, a sensor 104, and a positioning unit 106. The sensor 104 can sense movements of the entity, and accordingly a first set of alarm signals can be generated to alert the entity and prevent him/her from touching his/her face without sanitizing his/her hands. The positioning unit 106 can sense the location of the entity wearing the surveillance device 100 as well as of other entities around the entity, and accordingly a second set of alarm signals can be generated to help the entity in maintaining social distancing.
[0001] The present disclosure relates generally to alerting systems and devices. More
particularly, the present disclosure provides a surveillance device for alerting an entity.
BACKGROUND
[0002] Background description includes information that may be useful in understanding
the present invention. It is not an admission that any of the information provided herein is
prior art or relevant to the presently claimed invention, or that any publication specifically or
implicitly referenced is prior art.
[0003] Various contagious diseases and infections can be easily transmitted from person
to person through surface to surface transmission, gestures like hand shaking, patting,
hugging, and also through sneezing and coughing. Spread of diseases and infections, such as
COVID-19, Ebola, and SARS has proved to be threatening for survival of human beings.
[0004] As the saying goes, "prevention is better than cure"; hence, one must focus
on preventing such diseases. Social distancing is a widely advocated measure to prevent such life-
threatening diseases and infections. Social distancing refers to the practice of maintaining
distance from other persons, and avoiding groups and mass gatherings. Maintaining a safe
distance from others is one of the best tools that can be used for avoiding exposure to such
diseases, and slowing down their spread locally, across the country and world.
[0005] Such diseases and infections can spread through surface-to-surface
transmission: for instance, a surface touched by an infected person may later be touched by
another person, who then carries the infection and transfers it to every further surface he/she
touches. In this manner, an infection can be transmitted from one surface to multiple surfaces,
and from one person to multiple persons, through repeated touching of multiple infected
surfaces by multiple persons. Hence, one should limit touching such surfaces and objects, and
avoid unnecessary contact with them.
[0006] However, while completing daily chores and attending important meetings, a
person happens to touch or brush against objects and surfaces that may be infectious, or to
come in close contact with other persons. In such situations, in order to avoid exposure to such
diseases, one should be cautious enough not to touch his/her face, eyes, and nose, and not to
consume food with the same hands after touching any surface or coming in close contact with
a person. However, while busy with his/her schedule, a person may easily forget to take such
precautions and may happen to touch his/her face with infectious hands, which may lead to
transfer of infections and diseases into his/her body. There is no such system or device that
can alert a person moments before he/she touches his/her face and also remind him/her about
social distancing.
[0007] There is, therefore, a need in the art for a device that overcomes the above-mentioned limitations and aids in maintaining social distancing and hygiene.
OBJECTS OF THE PRESENT DISCLOSURE
[0008] Some of the objects of the present disclosure, which at least one embodiment
herein satisfies, are as listed herein below.
[0009] It is an object of the present disclosure to provide a surveillance device to track
location of entities.
[0010] It is an object of the present disclosure to provide a surveillance device to track
movements of an entity.
[0011] It is an object of the present disclosure to provide a surveillance device to alert an
entity based on tracked location and movements.
[0012] It is an object of the present disclosure to provide a surveillance device that
facilitates in maintaining hygiene, and keeping an entity healthy and safe.
[0013] It is an object of the present disclosure to provide an innovative, wearable, cost-
effective, efficient, safe, and portable surveillance device.
SUMMARY
[0014] The present disclosure relates generally to alerting systems and devices. More
particularly, the present disclosure provides a surveillance device for alerting an entity.
[0015] An aspect of the present disclosure pertains to a surveillance device comprising: a
sensor configured to sense a first set of parameters associated with a first entity, and
correspondingly generate a first set of signals; a positioning unit configured to determine
location associated with the first entity, and correspondingly generate a second set of signals;
and a processing unit operatively coupled to the sensor and the positioning unit, the
processing unit comprising one or more processors coupled with a memory, the memory
storing instructions executable by the one or more processors configured to: extract the first
set of parameters and the determined location from the first set of signals and the second set
of signals respectively; compare the first set of parameters with a dataset comprising pre-
defined security limits; generate a first set of alert signals in case at least one of the first set
of parameters is beyond the pre-defined security limits; determine locations of second
entities present within a pre-defined range from the determined location of the first entity,
and correspondingly compute distance of the first entity from each of the second
entities; compare the computed distance with a pre-defined distance limit; and generate a
second set of alert signals in case the computed distance is within the pre-defined distance
limit.
[0016] In an aspect, the sensor comprises any or a combination of proximity sensor,
ultrasonic sensor, infrared sensor, temperature sensor, gyroscope sensor, position sensor, and
motion sensor.
[0017] In an aspect, the first set of parameters comprise any or a combination of motion,
velocity, distance, and displacement of at least one body part of the first entity.
[0018] In an aspect, the positioning unit comprises any or a combination of Geographic
Information Systems (GIS), Global Positioning System (GPS), and Near Field
Communication (NFC).
[0019] In an aspect, the surveillance device comprises a power source (108) configured to
supply electric power to the surveillance device.
[0020] In an aspect, the power source comprises any or a combination of rechargeable
battery, rechargeable cell, solar cell, solar battery, electrochemical cell, storage battery, and
secondary cell.
[0021] In an aspect, the surveillance device comprises a communication unit operatively
coupled with the processing unit, and configured to communicatively couple the positioning
unit with the processing unit.
[0022] In an aspect, the communication unit comprises any or a combination of Wireless
Fidelity (Wi-Fi), Bluetooth, Li-Fi, optical fiber, Wireless Local Area Network (WLAN),
and ZigBee.
[0023] In an aspect, the surveillance device comprises an image acquisition unit
operatively coupled to the processing unit, and configured to recognize face of the first entity.
BRIEF DESCRIPTION OF THE DRAWINGS
[0024] The accompanying drawings are included to provide a further understanding of
the present disclosure, and are incorporated in and constitute a part of this specification. The
drawings illustrate exemplary embodiments of the present disclosure and, together with the
description, serve to explain the principles of the present disclosure.
[0025] The diagrams are for illustration only, which thus is not a limitation of the present
disclosure, and wherein:
[0026] FIG. 1 illustrates a block diagram of the proposed surveillance device to illustrate
its overall working in accordance with an embodiment of the present disclosure.
[0027] FIG. 2 illustrates exemplary functional components of the processing unit of the
surveillance device, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
[0028] In the following description, numerous specific details are set forth in order to
provide a thorough understanding of embodiments of the present invention. It will be
apparent to one skilled in the art that embodiments of the present invention may be practiced
without some of these specific details.
[0029] Embodiments of the present invention include various steps, which will be
described below. The steps may be performed by hardware components or may be embodied
in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps. Alternatively, steps
may be performed by a combination of hardware, software, firmware and/or by human
operators.
[0030] Embodiments of the present invention may be provided as a computer program
product, which may include a machine-readable storage medium tangibly embodying thereon
instructions, which may be used to program a computer (or other electronic devices) to
perform a process. The machine-readable medium may include, but is not limited to, fixed
(hard) drives, magnetic tape, floppy diskettes, optical disks, compact disc read-only
memories (CD-ROMs), magneto-optical disks, semiconductor memories such as ROMs,
random access memories (RAMs), programmable read-only memories (PROMs),
erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash memory,
magnetic or optical cards, or other type of media/machine-readable medium suitable for
storing electronic instructions (e.g., computer programming code, such as software or
firmware).
[0031] Various methods described herein may be practiced by combining one or more
machine-readable storage media containing the code according to the present invention with
appropriate standard computer hardware to execute the code contained therein. An apparatus
for practicing various embodiments of the present invention may involve one or more
computers (or one or more processors within a single computer) and storage systems
containing or having network access to computer program(s) coded in accordance with
various methods described herein, and the method steps of the invention could be
accomplished by modules, routines, subroutines, or subparts of a computer program product.
[0032] If the specification states a component or feature “may”, “can”, “could”, or
“might” be included or have a characteristic, that particular component or feature is not
required to be included or have the characteristic.
[0033] As used in the description herein and throughout the claims that follow, the
meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates
otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on”
unless the context clearly dictates otherwise.
[0034] Exemplary embodiments will now be described more fully hereinafter with
reference to the accompanying drawings, in which exemplary embodiments are shown. This
invention may, however, be embodied in many different forms and should not be construed
as limited to the embodiments set forth herein. These embodiments are provided so that this
invention will be thorough and complete and will fully convey the scope of the invention to
those of ordinary skill in the art. Moreover, all statements herein reciting embodiments of the
invention, as well as specific examples thereof, are intended to encompass both structural and
functional equivalents thereof. Additionally, it is intended that such equivalents include both
currently known equivalents as well as equivalents developed in the future (i.e., any elements
developed that perform the same function, regardless of structure).
[0035] While embodiments of the present invention have been illustrated and described,
it will be clear that the invention is not limited to these embodiments only. Numerous
modifications, changes, variations, substitutions, and equivalents will be apparent to those
skilled in the art, without departing from the spirit and scope of the invention, as described in
the claims.
[0036] The present disclosure relates generally to alerting systems and devices. More
particularly, the present disclosure provides a surveillance device for alerting an entity.
[0037] According to an aspect the present disclosure pertains to a surveillance device
including: a sensor configured to sense a first set of parameters associated with a first entity,
and correspondingly generate a first set of signals; a positioning unit configured to determine
location associated with the first entity, and correspondingly generate a second set of signals;
and a processing unit operatively coupled to the sensor and the positioning unit, the
processing unit comprising one or more processors coupled with a memory, the memory
storing instructions executable by the one or more processors configured to: extract the first
set of parameters and the determined location from the first set of signals and the second set
of signals respectively; compare the first set of parameters with a dataset including pre-defined
security limits; generate a first set of alert signals in case at least one of the first set
of parameters is beyond the pre-defined security limits; determine locations of second
entities present within a pre-defined range from the determined location of the first entity,
and correspondingly compute distance of the first entity from each of the second
entities; compare the computed distance with a pre-defined distance limit; and generate a
second set of alert signals in case the computed distance is within the pre-defined distance
limit.
[0038] In an embodiment, the sensor can include any or a combination of proximity
sensor, ultrasonic sensor, infrared sensor, temperature sensor, gyroscope sensor, position
sensor, and motion sensor.
[0039] In an embodiment, the first set of parameters can include any or a combination of
motion, velocity, distance, and displacement of at least one body part of the first entity.
[0040] In an embodiment, the positioning unit includes any or a combination of
Geographic Information Systems (GIS), Global Positioning System (GPS), and Near Field
Communication (NFC).
[0041] In an embodiment, the surveillance device includes a power source (108)
configured to supply electric power to the surveillance device.
[0042] In an embodiment, the power source can include any or a combination of
rechargeable battery, rechargeable cell, solar cell, solar battery, electrochemical cell, storage
battery, and secondary cell.
[0043] In an embodiment, the surveillance device includes a communication unit
operatively coupled with the processing unit, and configured to communicatively couple the
positioning unit with the processing unit.
[0044] In an embodiment, the communication unit can include any or a combination of
Wireless Fidelity (Wi-Fi), Bluetooth, Li-Fi, optical fiber, Wireless Local Area Network
(WLAN), and ZigBee.
[0045] In an embodiment, the surveillance device can include an image acquisition unit
operatively coupled to the processing unit, and configured to recognize face of the first entity.
[0046] FIG. 1 illustrates a block diagram of the proposed surveillance device to illustrate
its overall working in accordance with an embodiment of the present disclosure.
[0047] As illustrated in FIG. 1, in an embodiment, the proposed surveillance device 100
(interchangeably referred to as device 100, or surveillance device 100) can include a
processing unit 102, a sensor 104, and a positioning unit 106. The sensor 104 can include any
or a combination of proximity sensor, ultrasonic sensor, infrared sensor, temperature sensor,
gyroscope sensor, position sensor, and motion sensor, and can be configured to sense a first
set of parameters associated with a first entity, where the first set of parameters can be any or
a combination of motion, velocity, distance, and displacement of at least one body part of the
first entity. In an embodiment, the sensor 104 can generate a first set of signals when the
sensor 104 senses at least one of the first set of parameters associated with the first entity.
[0048] In an embodiment, the positioning unit 106 can be configured to determine
location associated with the first entity, and correspondingly generate a second set of signals.
In an exemplary embodiment, the positioning unit 106 can include any or a combination of
Geographic Information Systems (GIS), Global Positioning System (GPS), Near Field
Communication (NFC), and the like.
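By way of a non-limiting illustration, the following Python sketch shows how a GPS-based positioning unit might convert a raw NMEA 0183 "$GPGGA" sentence from a GPS module into decimal-degree coordinates for the processing unit 102; the function name and the sample sentence are illustrative assumptions and do not form part of the disclosure.

```python
# Illustrative sketch only: convert an NMEA "$GPGGA" sentence to decimal degrees.
def parse_gpgga(sentence: str):
    """Return (latitude, longitude) in decimal degrees, or None if there is no fix."""
    fields = sentence.split(",")
    if not fields[0].endswith("GGA") or fields[6] == "0":
        return None  # fix-quality field is "0": no satellite fix yet

    def to_decimal(value: str, hemisphere: str, degree_digits: int) -> float:
        # NMEA encodes latitude as ddmm.mmmm and longitude as dddmm.mmmm.
        degrees = float(value[:degree_digits])
        minutes = float(value[degree_digits:])
        decimal = degrees + minutes / 60.0
        return -decimal if hemisphere in ("S", "W") else decimal

    latitude = to_decimal(fields[2], fields[3], 2)
    longitude = to_decimal(fields[4], fields[5], 3)
    return latitude, longitude

# Example (sample sentence is assumed):
print(parse_gpgga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"))
# -> (48.1173, 11.516666666666667)
```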
[0049] In an embodiment, the processing unit 102 can be operatively coupled to the
sensor 104 and the positioning unit 106, and the processing unit 102 can include one or more
processors coupled with a memory, the memory storing instructions executable by the one or
more processors. In an embodiment, the processing unit 102 can be configured to receive the
first set of signals from the sensor 104 and the second set of signals from the positioning unit
106. In an embodiment, the processing unit 102 can receive the first set of signals from the
sensor 104, and further, can extract the first set of parameters from the first set of signals. The
processing unit 102 can compare the extracted first set of parameters with a dataset, where
the dataset can include pre-defined security limits. The processing unit 102 can generate a
first set of alert signals in case at least one of the first set of parameters is found beyond the
pre-defined security limits; however, in case each of the first set of parameters matches the
dataset, i.e., if each of the first set of parameters is found within the pre-defined security
limits, then no alert signal is generated. Hence, the surveillance device 100 can aid the first
entity in preventing himself/herself from touching his/her face without sanitizing his/her
hands.
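The comparison described above can be summarized, purely as a hypothetical Python sketch, as a check of each sensed parameter against its pre-defined security limit; the parameter names and limit values below are assumptions for illustration and are not prescribed by the disclosure.

```python
# Illustrative sketch only: pre-defined security limits (assumed names and values).
SECURITY_LIMITS = {
    "hand_velocity_m_s": 0.8,         # assumed maximum hand speed toward the face
    "hand_to_face_distance_m": 0.15,  # assumed minimum safe hand-to-face distance
}

def first_alert_needed(parameters: dict) -> bool:
    """Return True if the first set of alert signals should be generated."""
    if parameters["hand_velocity_m_s"] > SECURITY_LIMITS["hand_velocity_m_s"]:
        return True  # hand moving toward the face faster than the security limit
    if parameters["hand_to_face_distance_m"] < SECURITY_LIMITS["hand_to_face_distance_m"]:
        return True  # hand closer to the face than the security limit
    return False     # every parameter within the pre-defined security limits

print(first_alert_needed({"hand_velocity_m_s": 1.2, "hand_to_face_distance_m": 0.10}))  # True
```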
[0050] In another embodiment, the processing unit 102 can receive the second set of
signals from the positioning unit 106, and further, can extract the determined location,
associated with the first entity, from the second set of signals. Further, the processing unit
102 can determine locations of second entities that are present within a pre-defined range
from the determined location of the first entity, and further, can compute distance of the first
entity from the second entities based on the determined location of the first entity and the
determined locations of each of the second entities. The processing unit 102 can compare the
computed distance with a pre-defined distance limit. The processing unit 102 can generate a
second set of alert signals in case at least one of the second entities is found in the vicinity of
the first entity, i.e., if the computed distance of at least one of the second entities is found
within the pre-defined distance limit; however, in case the computed distance of each of the
second entities is found beyond the pre-defined distance limit, then no alert signal is generated.
Hence, the surveillance device 100 can help the first entity in maintaining social distancing.
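As a hypothetical Python sketch of this social-distancing check, the distance between the first entity and each second entity can be computed from their determined coordinates with the standard haversine formula and compared against a pre-defined distance limit; the two-metre limit and the sample coordinates are assumptions for illustration only.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2) -> float:
    """Great-circle distance in metres between two (lat, lon) points in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lambda = math.radians(lon2 - lon1)
    a = math.sin(d_phi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(d_lambda / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def second_alert_needed(first, second_entities, limit_m=2.0) -> bool:
    """True if any second entity lies within the pre-defined distance limit."""
    return any(haversine_m(*first, *pos) < limit_m for pos in second_entities)

# Example with assumed coordinates roughly 1.5 m apart:
print(second_alert_needed((28.613900, 77.209000), [(28.613913, 77.209002)]))  # True
```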
[0051] In an embodiment, the surveillance device 100 can include a power source 108
that can be configured to supply electric power to all the components of the surveillance
device 100. In an exemplary embodiment, the power source 108 can include any or a
combination of rechargeable battery, rechargeable cell, solar cell, solar battery,
electrochemical cell, storage battery, secondary cell, and the like.
[0052] In an embodiment, the surveillance device 100 can include a communication unit
(not shown) that can be operatively coupled with the processing unit 102, and configured to
communicatively couple the positioning unit 106 with the processing unit 102. In an
exemplary embodiment, the communication unit can include any or a combination of
Wireless Fidelity (Wi-Fi), Bluetooth, Li-Fi, optical fiber, Wireless Local Area Network
(WLAN), ZigBee, and the like.
[0053] In an embodiment, the surveillance device 100 can further include an image
acquisition unit (not shown) that can be operatively coupled to the processing unit 102, and
can be configured to capture images of the first entity so that face of the first entity can be
recognized through internal processing by face recognition unit of the processing unit 102. In
an embodiment, the image acquisition unit can include a fundus camera and other such image
acquisition units.
[0054] In an embodiment, the surveillance device 100 can be adapted to be wearable by
the first entity, where the surveillance device 100 can be mounted on the surface of the body
of the first entity in a fixed position, or can be carried along with other devices and articles
associated with the first entity in different positions, such as in clothes pockets, as a wrist
watch, or in various bags.
[0055] In an embodiment, the surveillance device 100 can include a galvanic skin sensor
configured to detect temperature of the first entity. In another embodiment, the surveillance
device 100 can also include LEDs that can illuminate different colours based on the sets of
alert signals generated by the processing unit 102.
[0056] FIG. 2 illustrates exemplary functional components of the processing unit of the
surveillance device, in accordance with an embodiment of the present disclosure.
[0057] As illustrated in FIG. 2, in an embodiment, the processing unit 102 can include one or more
processor(s) 202. The one or more processor(s) 202 can be implemented as one or more
microprocessors, microcomputers, microcontrollers, digital signal processors, central
processing units, logic circuitries, and/or any devices that manipulate data based on
operational instructions. Among other capabilities, the one or more processor(s) 202 are
configured to fetch and execute computer-readable instructions stored in a memory 204 of the
processing unit 102. The memory 204 can store one or more computer-readable instructions
or routines, which may be fetched and executed to create or share the data units over a
network service. The memory 204 can include any non-transitory storage device including,
for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash
memory, and the like.
[0058] In an embodiment, the processing unit 102 can also include an interface(s) 206.
The interface(s) 206 may include a variety of interfaces, for example, interfaces for data input
and output devices, referred to as I/O devices, storage devices, and the like. The interface(s)
206 may facilitate communication of the processing unit 102 with various devices coupled to
the processing unit 102. The interface(s) 206 may also provide a communication pathway for
one or more components of processing unit 102. Examples of such components include, but
are not limited to, processing engine(s) 208 and database 210.
[0059] In an embodiment, the processing engine(s) 208 can be implemented as a
combination of hardware and programming (for example, programmable instructions) to
implement one or more functionalities of the processing engine(s) 208. In examples described
herein, such combinations of hardware and programming may be implemented in several
different ways. For example, the programming for the processing engine(s) 208 may be
processor executable instructions stored on a non-transitory machine-readable storage
medium and the hardware for the processing engine(s) 208 may include a processing resource
(for example, one or more processors), to execute such instructions. In the present examples,
the machine-readable storage medium may store instructions that, when executed by the
processing resource, implement the processing engine(s) 208. In such examples, the
processing unit 102 can include the machine-readable storage medium storing the instructions
and the processing resource to execute the instructions, or the machine-readable storage
medium may be separate but accessible to processing unit 102 and the processing resource. In
other examples, the processing engine(s) 208 may be implemented by electronic circuitry. A
database 210 can include data that is either stored or generated as a result of functionalities
implemented by any of the components of the processing engine(s) 208.
[0060] In an embodiment, the processing engine(s) 208 can include an extraction unit
212, a comparison unit 214, a face recognition unit 216, and other unit(s) 220. The other
unit(s) 220 can implement functionalities that supplement applications or functions
performed by the processing unit 102 or the processing engine(s) 208.
[0062] It would be appreciated that the units described are only exemplary units, and
any other unit or sub-unit may be included as part of the processing unit 102. These units too
may be merged or divided into super-units or sub-units as may be configured.
[0063] As illustrated in FIG. 2, the processing unit 102 can include the extraction unit
212 that can facilitate extraction of a first set of parameters associated with a first entity. In an
embodiment, the first set of parameters can be extracted from a first set of signals received at
the processing unit 102, where the first set of signals can be generated by the sensor 104,
which can include, but is not limited to, proximity sensor, ultrasonic sensor, infrared sensor,
temperature sensor, gyroscope sensor, position sensor, and motion sensor, based on the
sensed first set of parameters. In an exemplary embodiment, the first set of parameters can be
any or a combination of motion, velocity, distance, and displacement of at least one body part
of the first entity.
[0064] In another embodiment, the extraction unit 212 can facilitate extraction of
determined location of the first entity from a second set of signals that are received at the
processing unit 102, where the second set of signals can be generated by a positioning unit
106. In an embodiment, the extraction unit 212 can facilitate determination and extraction
of locations of second entities that are present within a pre-defined range from the determined
location of the first entity. In an embodiment, distance of the first entity from the second
entities can be computed based on the determined location of the first entity and the
determined locations of each of the second entities.
[0065] In an embodiment, the comparison unit 214 associated with the processing unit
102 can facilitate comparison of the extracted first set of parameters with a dataset including
pre-defined security limits. In an implementation, a first set of alert signals can be generated
in case at least one of the first set of parameters is found beyond the pre-defined security
limits; however, if each of the first set of parameters matches the dataset, i.e., if each of the
first set of parameters is found within the pre-defined security limits, then no alert signal is generated.
[0066] In another embodiment, the comparison unit 214 can facilitate comparison of the
computed distance with a pre-defined distance limit. A second set of alert signals can be
generated in case at least one of the second entities is found in the vicinity of the first entity,
i.e., if the computed distance of at least one of the second entities is found within the pre-defined
distance limit; however, in case the computed distance of each of the second entities is found
beyond the pre-defined distance limit, then no alert signal is generated. In an embodiment, the
first set of alert signals and the second set of alert signals can be acoustic, and can be emitted
through an alarm or a buzzer to alert the first entity in time.
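Purely as an illustrative sketch, the acoustic alerts could be driven as distinct beep patterns on a buzzer wired to a GPIO pin, for example using the RPi.GPIO library on a Raspberry Pi-class board; the pin number and beep patterns are assumptions, not part of the disclosure.

```python
import time
import RPi.GPIO as GPIO  # available on Raspberry Pi-class hardware

BUZZER_PIN = 18  # assumed wiring of the buzzer

GPIO.setmode(GPIO.BCM)
GPIO.setup(BUZZER_PIN, GPIO.OUT)

def beep(count: int, duration_s: float = 0.2) -> None:
    """Emit `count` short beeps on the buzzer."""
    for _ in range(count):
        GPIO.output(BUZZER_PIN, GPIO.HIGH)
        time.sleep(duration_s)
        GPIO.output(BUZZER_PIN, GPIO.LOW)
        time.sleep(duration_s)

def emit_alerts(face_touch_alert: bool, distancing_alert: bool) -> None:
    # Two distinct patterns so the wearer can tell the alerts apart (assumed choice).
    if face_touch_alert:
        beep(1)  # first set of alert signals: single beep
    if distancing_alert:
        beep(3)  # second set of alert signals: triple beep
```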
[0067] In an embodiment, the face recognition unit 216 associated with the processing
unit 102 can aid in face recognition. Images captured by an image acquisition unit can be
further processed at the face recognition unit 216 to detect the face of the first entity from the
captured images; once the face is detected, the face recognition unit 216 can further identify
the alignment of the face. In an embodiment, the face recognition unit 216 can extract features
from the detected face, and the extracted features can then be matched against the database
210 to recognize the face. In an implementation, a matrix can be designed on the basis of
pixel values calculated at corners of the detected face. In an exemplary embodiment, the pixel
values can be calculated under different illumination conditions for 2D face recognition. The
images of the face can be represented through a high-dimensional vector containing pixel
values. Feature matching can be done to match the face detected from an image or video
against an available database including pre-defined images with unique face identities.
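As a minimal sketch of the feature-matching step, assuming the detected face has already been reduced to a high-dimensional feature vector as described above, the vector can be matched against pre-enrolled vectors in the database 210 by cosine similarity; the vector size, threshold, and enrolment data below are illustrative assumptions.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def recognize(face_vector: np.ndarray, database: dict, threshold: float = 0.9):
    """Return the identity of the best-matching enrolled face, or None if no match."""
    best_identity, best_score = None, threshold
    for identity, enrolled_vector in database.items():
        score = cosine_similarity(face_vector, enrolled_vector)
        if score >= best_score:
            best_identity, best_score = identity, score
    return best_identity

# Example with an assumed 128-dimensional enrolled vector:
database = {"first_entity": np.random.rand(128)}
print(recognize(database["first_entity"], database))  # -> "first_entity"
```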
[0068] Thus, it will be appreciated by those of ordinary skill in the art that the diagrams,
schematics, illustrations, and the like represent conceptual views or processes illustrating
systems and methods embodying this invention. The functions of the various elements shown
in the figures may be provided through the use of dedicated hardware as well as hardware
capable of executing associated software. Similarly, any switches shown in the figures are
conceptual only. Their function may be carried out through the operation of program logic,
through dedicated logic, through the interaction of program control and dedicated logic, or
even manually, the particular technique being selectable by the entity implementing this
invention. Those of ordinary skill in the art further understand that the exemplary hardware,
software, processes, methods, and/or operating systems described herein are for illustrative
purposes and, thus, are not intended to be limited to any particular named manufacturer.
[0069] While embodiments of the present invention have been illustrated and described,
it will be clear that the invention is not limited to these embodiments only. Numerous
modifications, changes, variations, substitutions, and equivalents will be apparent to those
skilled in the art, without departing from the spirit and scope of the invention, as described in
the claims.
[0070] In the foregoing description, numerous details are set forth. It will be apparent,
however, to one of ordinary skill in the art having the benefit of this disclosure, that the
present invention may be practiced without these specific details. In some instances, wellknown structures and devices are shown in block diagram form, rather than in detail, to avoid
obscuring the present invention.
[0071] As used herein, and unless the context dictates otherwise, the term "coupled to" is
intended to include both direct coupling (in which two elements that are coupled to each
other contact each other) and indirect coupling (in which at least one additional element is
located between the two elements). Therefore, the terms "coupled to" and "coupled with" are
used synonymously. Within the context of this document terms "coupled to" and "coupled
with" are also used euphemistically to mean “communicatively coupled with” over a
network, where two or more devices are able to exchange data with each other over the
network, possibly via one or more intermediary devices.
[0072] It should be apparent to those skilled in the art that many more modifications
besides those already described are possible without departing from the inventive concepts
herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of
the appended claims. Moreover, in interpreting both the specification and the claims, all
terms should be interpreted in the broadest possible manner consistent with the context. In
particular, the terms “comprises” and “comprising” should be interpreted as referring to
elements, components, or steps in a non-exclusive manner, indicating that the referenced
elements, components, or steps may be present, or utilized, or combined with other elements,
components, or steps that are not expressly referenced.
[0073] While the foregoing describes various embodiments of the invention, other and
further embodiments of the invention may be devised without departing from the basic scope
thereof. The scope of the invention is determined by the claims that follow. The invention is
not limited to the described embodiments, versions or examples, which are included to enable
a person having ordinary skill in the art to make and use the invention when combined with
information and knowledge available to the person having ordinary skill in the art.
ADVANTAGES OF THE PRESENT DISCLOSURE
[0074] The present disclosure provides a surveillance device to track location of entities.
[0075] The present disclosure provides a surveillance device to track movements of an
entity.
[0076] The present disclosure provides a surveillance device to alert an entity based on
tracked location and movements.
[0077] The present disclosure provides a surveillance device that facilitates in
maintaining hygiene, and keeping an entity healthy and safe.
[0078] The present disclosure provides an innovative, wearable, cost-effective, efficient,
safe, and portable surveillance device.
We Claim:
1. A surveillance device comprising:
a sensor configured to sense a first set of parameters associated with a first
entity, and correspondingly generate a first set of signals;
a positioning unit configured to determine location associated with the first
entity, and correspondingly generate a second set of signals; and
a processing unit operatively coupled to the sensor and the positioning unit,
the processing unit comprising one or more processors coupled with a memory, the
memory storing instructions executable by the one or more processors configured to:
extract the first set of parameters and the determined location from the
first set of signals and the second set of signals respectively;
compare the first set of parameters with a dataset comprising pre-defined security limits;
generate a first set of alert signals in case at least one of the first set of
parameters is beyond the pre-defined security limits;
determine locations of second entities present within a pre-defined
range from the determined location of the first entity, and correspondingly
compute distance of the first entity from each of the second entities;
compare the computed distance with a pre-defined distance limit; and
generate a second set of alert signals in case the computed distance is
within the pre-defined distance limit.
2. The surveillance device as claimed in claim 1, wherein the sensor comprises any or a
combination of proximity sensor, ultrasonic sensor, infrared sensor, temperature
sensor, gyroscope sensor, position sensor, and motion sensor.
3. The surveillance device as claimed in claim 1, wherein the first set of parameters
comprise any or a combination of motion, velocity, distance, and displacement of at
least one body part of the first entity.
4. The surveillance device as claimed in claim 1, wherein the positioning unit comprises
any or a combination of Geographic Information Systems (GIS), Global Positioning
System (GPS), and Near Field Communication (NFC).
5. The surveillance device as claimed in claim 1, wherein the surveillance device
comprises a power source (108) configured to supply electric power to the
surveillance device.
6. The surveillance device as claimed in claim 5, wherein the power source comprises
any or a combination of rechargeable battery, rechargeable cell, solar cell, solar
battery, electrochemical cell, storage battery, and secondary cell.
7. The surveillance device as claimed in claim 1, wherein the surveillance device
comprises a communication unit operatively coupled with the processing unit, and
configured to communicatively couple the positioning unit with the processing unit.
8. The surveillance device as claimed in claim 7, wherein the communication unit
comprises any or a combination of Wireless Fidelity (Wi-Fi), Bluetooth, Li-Fi,
optical fiber, Wireless Local Area Network (WLAN), and ZigBee.
9. The surveillance device as claimed in claim 1, wherein the surveillance device
comprises an image acquisition unit operatively coupled to the processing unit, and
configured to recognize face of the first entity.