Abstract: The proposed disclosure provides a system 100 for quantifying movements of a subject. The system 100 is configured as a wearable device, and comprises first sensors 102, second sensors 104, and a processing unit 106, where the first sensors 102 and the second sensors 104 sense first parameters and second parameters, respectively, associated with the subject. The processing unit 106, based on the sensed first parameters and second parameters, quantifies a variance associated with movement of the subject, and correspondingly generates a set of alert signals when the quantified variance is beyond a pre-defined range. The system comprises an input device 108 to receive a set of input signals, determine a pattern associated with the input signals, and correspondingly generate a set of status signals.
[0001] The present disclosure relates to monitoring of movement of limbs of a person.
More particularly, the present disclosure relates to a system for quantifying movements of a
subject.
BACKGROUND
[0002] Background description includes information that may be useful in understanding
the present invention. It is not an admission that any of the information provided herein is prior
art or relevant to the presently claimed invention, or that any publication specifically or implicitly
referenced is prior art.
[0003] A brain disorder or a physical disability/ disorder, such as Parkinson's disease, may
lead to paralysis, stiffness, or shaking of the limbs of a person, which results in difficulty in walking,
maintaining balance, and coordinating movement. Such disorders can prove to be fatal if not treated
at an early stage.
[0004] Parkinson's disease occurs when nerve cells, or neurons, in an area of the brain that
controls movement become impaired and/or die. Normally, such neurons produce an important
brain chemical known as dopamine; when the neurons die or become impaired, the resulting loss
of dopamine causes movement problems for a person affected with Parkinson's. In Parkinson's, one
may even lose the nerve endings that produce norepinephrine, the main chemical messenger of the
sympathetic nervous system, which controls many automatic functions of the body, such as heart rate
and blood pressure. Hence, the loss of norepinephrine might result in various non-movement
features associated with Parkinson's, such as fatigue, irregular blood pressure, decreased
movement of food through the digestive tract, and a sudden drop in blood pressure, especially when a
person stands up from a sitting or lying-down position.
[0005] Parkinson's symptoms usually begin gradually and get worse over time. As the
disease progresses, people may have difficulty walking and talking. They may also have mental
and behavioural changes, sleep problems, depression, memory difficulties, and fatigue. Early
symptoms of Parkinson's include a tremor or shaking of a limb and slowed movement, also
known as bradykinesia. Moreover, muscle stiffness also occurs, further limiting range of motion.
Body posture also becomes stooped. As Parkinson's advances through the brain, it further decreases
the ability to perform unconscious movements, including blinking, smiling, and swinging the arms while
walking. Speech modulation also varies while speaking. Basic activities like writing also
become very challenging. Moreover, it may lead to freezing of gait (FOG). During FOG, a person
attempts to complete a step, oscillating back and forth, but is unable to
do so, as FOG delays the feet's response to sudden changes. As a result, affected persons may move by dragging
their feet along the ground, which can cause catastrophic events, as they could fall and injure
themselves.
[0006] Thus, taking account of the serious consequences and the progression risks it
poses, accurate and timely diagnosis of Parkinson's plays a significant role in patient care,
especially at the early stages, as it allows patients to take preventive measures before
irreversible brain damage occurs. Conventional systems and methods that are utilized for
detection of Parkinson's require a sophisticated lab setup and a large number of tests to be performed, which
results in wastage of considerable money and time, and, moreover, may cause
discomfort to the person affected with Parkinson's.
[0007] There is, therefore, a need in the art for an efficient, smart, cost-effective,
and user-friendly system that overcomes the above-mentioned problems and provides a reliable
means for efficient and timely detection of such diseases based on intelligent computational
approaches, communicating the result to the said person, his/ her relatives, and related authorities.
OBJECTS OF THE PRESENT DISCLOSURE
[0008] Some of the objects of the present disclosure, which at least one embodiment herein
satisfies are as listed herein below.
[0009] It is an object of the present disclosure to provide a system for monitoring
movements of a subject.
[0010] It is another object of the present disclosure to provide a system for measuring
position, orientation, and acceleration of limbs of the subject.
[0011] It is another object of the present disclosure to provide a system for identifying
tremors of the subject.
[0012] It is another object of the present disclosure to provide a system for sending an alert
message, based on the identified tremors, to the subject, his/ her relatives, and related authorities.
[0013] It is another object of the present disclosure to provide a portable, wearable,
accurate, fast, efficient, and cost effective system.
SUMMARY
[0014] The present disclosure relates to monitoring of movement of limbs of a person.
More particularly, the present disclosure relates to a system for quantifying movement of a subject.
[0015] An aspect of the present disclosure pertains to a system for quantifying movement
of a subject, the system comprises: one or more first sensors configured to sense first parameters
associated with the subject; one or more second sensors configured to sense second parameters
associated with the subject; and a processing unit operatively coupled to the one or more first
sensors and the one or more second sensors, the processing unit comprising one or more processors
coupled with a memory, the memory storing instructions executable by the one or more processors
and configured to: generate a first set of signals corresponding to the first parameters sensed by at
least one of the one or more first sensors; generate a second set of signals corresponding to the
second parameters sensed by at least one of the one or more second sensors; compare the generated
first set of signals and the generated second set of signals with a first dataset comprising
pre-determined limits associated with the first parameters and the second parameters; and responsive
to the comparison, quantify a variance associated with movement of the subject, and
correspondingly generate a set of alert signals when the quantified variance is beyond a pre-defined
range.
[0016] In an aspect, the one or more first sensors and the one or more second sensors may
comprise any or a combination of gyroscope and accelerometer.
[0017] In an aspect, the first parameters may comprise any or a combination of orientation,
velocity, and acceleration.
[0018] In an aspect, the second parameters may comprise any or a combination of location,
angular position, and rotation.
[0019] In an aspect, the system may be configured in the form of a wearable device.
[0020] In an aspect, the system may comprise an input device operatively coupled to the
processing unit, whereby the input device may be configured to receive a set of input signals from
the subject; wherein the set of input signals may pertain to a pattern, and wherein the input device
may comprise any or a combination of LCD screen, digital pen, keyboard, joystick, and mouse.
[0021] In an aspect, the processing unit may be configured to determine a pattern
associated with the received set of input signals by comparing the received set of input signals
with a second dataset comprising sets of pre-defined patterns.
[0022] In an aspect, the processing unit may be configured to quantify movement of the
subject based on the comparison, and correspondingly generate a set of status signals.
[0023] In an aspect, the processing unit may be configured to update the first dataset and
the second dataset based on any or a combination of the first set of signals, the second set of signals,
and the set of input signals received.
[0024] In an aspect, the system may comprise one or more computing devices operatively
coupled to the processing unit, and configured to receive any or a combination of the generated set
of alert signals and the generated set of status signals, and correspondingly represent any or a
combination of the first set of parameters, the second set of parameters, the quantified variance,
and the determined pattern.
BRIEF DESCRIPTION OF THE DRAWINGS
[0025] The accompanying drawings are included to provide a further understanding of the
present disclosure, and are incorporated in and constitute a part of this specification. The drawings
illustrate exemplary embodiments of the present disclosure and, together with the description,
serve to explain the principles of the present disclosure.
[0026] The diagrams are for illustration only and thus do not limit the present
disclosure, wherein:
[0027] FIG. 1 illustrates an exemplary block diagram of the proposed system to illustrate its
overall working in accordance with an embodiment of the present disclosure.
[0028] FIG. 2 illustrates exemplary functional components of a processing unit, in
accordance with an exemplary embodiment of the present disclosure.
[0029] FIGs. 3A-3C illustrate quantification of exemplary patterns, in accordance with an
embodiment of the present disclosure.
DETAILED DESCRIPTION
[0030] The following is a detailed description of embodiments of the disclosure depicted
in the accompanying drawings. The embodiments are in such detail as to clearly communicate the
disclosure. However, the amount of detail offered is not intended to limit the anticipated variations
of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and
alternatives falling within the spirit and scope of the present disclosure as defined by the appended
claims.
[0031] Various terms as used herein are shown below. To the extent a term used in a claim
is not defined below, it should be given the broadest definition persons in the pertinent art have
given that term as reflected in printed publications and issued patents at the time of filing.
[0032] In some embodiments, the numerical parameters set forth in the written description
and attached claims are approximations that can vary depending upon the desired properties sought
to be obtained by a particular embodiment. In some embodiments, the numerical parameters
should be construed in light of the number of reported significant digits and by applying ordinary
rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the
broad scope of some embodiments of the invention are approximations, the numerical values set
forth in the specific examples are reported as precisely as practicable. The numerical values
presented in some embodiments of the invention may contain certain errors necessarily resulting
from the standard deviation found in their respective testing measurements.
[0033] As used in the description herein and throughout the claims that follow, the
meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates
otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on”
unless the context clearly dictates otherwise.
[0034] The recitation of ranges of values herein is merely intended to serve as a shorthand
method of referring individually to each separate value falling within the range. Unless otherwise
indicated herein, each individual value is incorporated into the specification as if it were
individually recited herein. All methods described herein can be performed in any suitable order
unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and
all examples, or exemplary language (e.g. “such as”) provided with respect to certain embodiments
herein is intended merely to better illuminate the invention and does not pose a limitation on the
scope of the invention otherwise claimed. No language in the specification should be construed
as indicating any non-claimed element essential to the practice of the invention.
[0035] Groupings of alternative elements or embodiments of the invention disclosed herein
are not to be construed as limitations. Each group member can be referred to and claimed
individually or in any combination with other members of the group or other elements found
herein. One or more members of a group can be included in, or deleted from, a group for reasons
of convenience and/or patentability. When any such inclusion or deletion occurs, the specification
is herein deemed to contain the group as modified thus fulfilling the written description of all
groups used in the appended claims.
[0036] The present disclosure relates to monitoring of movement of limbs of a person.
More particularly, the present disclosure relates to a system for quantifying movement of a subject.
[0037] According to an aspect, the present disclosure pertains to a system for quantifying
movement of a subject, the system including: one or more first sensors configured to sense
first parameters associated with the subject; one or more second sensors configured to sense second
parameters associated with the subject; and a processing unit operatively coupled to the one or
more first sensors and the one or more second sensors, the processing unit including one or more
processors coupled with a memory, the memory storing instructions executable by the one or more
processors and configured to: generate a first set of signals corresponding to the first parameters
sensed by at least one of the one or more first sensors; generate a second set of signals
corresponding to the second parameters sensed by at least one of the one or more second sensors;
compare the generated first set of signals and the generated second set of signals with a first dataset
comprising pre-determined limits associated with the first parameters and the second parameters;
and responsive to the comparison, quantify a variance associated with movement of the subject,
and correspondingly generate a set of alert signals when the quantified variance is beyond a
pre-defined range.
[0038] In an embodiment, the one or more first sensors and the one or more second sensors
can include any or a combination of gyroscope and accelerometer.
[0039] In an embodiment, the first parameters can include any or a combination of
orientation, velocity, and acceleration.
[0040] In an embodiment, the second parameters can include any or a combination of
location, angular position, and rotation.
[0041] In an embodiment, the system can be configured in form of a wearable device.
[0042] In an embodiment, the system can include an input device operatively coupled to
the processing unit, whereby the input device can be configured to receive a set of input signals
from the subject; wherein the set of input signals can pertain to a pattern, and wherein the input
device can include any or a combination of LCD screen, digital pen, keyboard, joystick, and
mouse.
[0043] In an embodiment, the processing unit can be configured to determine a pattern
associated with the received set of input signals by comparing the received set of input signals
with a second dataset comprising sets of pre-defined patterns.
[0044] In an embodiment, the processing unit can be configured to quantify movement of
the subject based on the comparison, and correspondingly generate a set of status signals.
[0045] In an embodiment, the processing unit can be configured to update the first dataset
and the second dataset based on any or a combination of the first set of signals, the second set of
signals, and the set of input signals received.
[0046] In an embodiment, the system can include one or more computing devices
operatively coupled to the processing unit, and configured to receive any or a combination of the
generated set of alert signals and the generated set of status signals, and correspondingly represent
any or a combination of the first set of parameters, the second set of parameters, the quantified
variance, and the determined pattern.
[0047] FIG. 1 illustrates an exemplary block diagram of the proposed system 100 to illustrate
its overall working in accordance with an embodiment of the present disclosure.
[0048] As illustrated by the FIG. 1, in an embodiment, the proposed system 100, which
can be adapted to be wearable, can include one or more first sensors 102 (also, collectively referred
to as first sensors 102, and individually referred to as first sensor 102), one or more second sensors
104 (also, collectively referred to as second sensors 104, and individually referred to as second
sensor 104), and a processing unit 106, such that the first sensors 102 and the second sensors 104
are operatively coupled to the processing unit 106. In an embodiment, the first sensors 102 and the
second sensors 104 can be any or a combination of gyroscope and accelerometer, and can be
configured to sense first parameters and second parameters associated with the subject,
respectively. The first parameters can include any or a combination of orientation, velocity,
acceleration, and the likes, and the second parameters can include any or a combination of location,
angular position, rotation, and the likes.
[0049] In an embodiment, the processing unit 106 can generate a first set of signals
corresponding to the first parameters sensed by at least one of the first sensors 102, and a second
set of signals corresponding to the second parameters sensed by at least one of the second sensors
104. In an embodiment, the processing unit 106 can compare the generated first set of signals and
the generated second set of signals with a first dataset, which can include pre-determined
limits associated with the first parameters and the second parameters. The processing unit 106 can,
responsive to the comparison performed, quantify a variance associated with movement of the
subject, and correspondingly generate a set of alert signals when the quantified variance is found
to be beyond a pre-defined range.
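By way of a non-limiting illustration, the comparison-and-quantification step described above can be sketched as follows; the limit value and the variance statistic are hypothetical placeholders, as the disclosure does not fix a particular measure:

```python
import statistics

# Hypothetical pre-determined limit from the first dataset; a deployed
# system 100 would calibrate such limits per subject.
ACCEL_VARIANCE_LIMIT = 0.5  # (m/s^2)^2, an assumed placeholder value

def quantify_variance(accel_samples, baseline):
    """Quantify the variance of sensed acceleration about a baseline."""
    deviations = [a - baseline for a in accel_samples]
    return statistics.pvariance(deviations)

def alert_signal(variance, limit=ACCEL_VARIANCE_LIMIT):
    """Generate an alert signal when the quantified variance is beyond
    the pre-defined range; otherwise report a normal status."""
    return "ALERT" if variance > limit else "OK"
```

Steady readings near the baseline yield a small variance and an "OK" status, while strongly oscillating readings trip the alert.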
[0050] In an embodiment, the proposed system 100 can include an input device 108, such
as, but not limited to, any or a combination of LCD screen, keypad, digital pen, computer, smart
phone, keyboard, joystick, mouse, and the likes, which can be operatively coupled to the
processing unit 106. In an embodiment, the input device 108 can be configured to receive a set of
input signals, which can pertain to a pattern, from the subject, such that the received set of input
signals can be transmitted to the processing unit 106. In an embodiment, the processing unit 106
can be configured to determine a pattern associated with the set of input signals by comparing the
received set of input signals with a second dataset, which can include sets of pre-defined
patterns. In another embodiment, the processing unit 106 can be configured to quantify movement
of the subject based on the said comparison, and can correspondingly generate a set of status
signals that can be indicative of quantification of the movement of the subject.
[0051] In an embodiment, the proposed system 100 can include an input interface 112,
which can be operatively coupled between the input device 108 and the processing unit 106. The
input interface 112 can enable transformation of the set of input signals, which are received through
the input device 108, into a set of instructions, such as, but not limited to, binary code, octal code,
hexadecimal code, and gray code, which can be accepted and processed by the processing unit
106.
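As one non-limiting sketch of such a transformation, the gray-code case mentioned above can be illustrated as follows (binary, octal, and hexadecimal renderings are direct formatting operations; the disclosure does not prescribe a specific encoding scheme):

```python
def to_gray(n: int) -> int:
    """Convert a binary-coded integer to its reflected Gray code."""
    return n ^ (n >> 1)

def from_gray(g: int) -> int:
    """Recover the binary value from a reflected Gray code."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def encodings(n: int) -> dict:
    """Render one input value in the instruction codes named above."""
    return {
        "binary": bin(n),
        "octal": oct(n),
        "hexadecimal": hex(n),
        "gray": bin(to_gray(n)),
    }
```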
[0052] In an embodiment, the proposed system 100 can include a database 110, which can
be operatively coupled to the processing unit 106. In an embodiment, the database 110 can be
configured to store any or a combination of the first set of signals, the second set of signals, the set
of input signals, the first dataset and the second dataset. In an illustrative embodiment, the
processing unit 106 can be configured to update the first dataset and the second dataset based on
any or a combination of the first set of signals, the second set of signals, and the set of input signals
received. In an embodiment, the database 110 can be configured within the processing unit 106, or
can be associated with a cloud 114, or can be configured in a remotely located external source, or,
can even exist as an independent unit.
[0053] In an embodiment, the proposed system 100 can include one or more computing
devices (not shown), such that the one or more computing devices can be operatively coupled to
the processing unit 106, and at least one of the one or more computing devices can receive any or
a combination of the generated set of alert signals and the generated set of status signals, and can
correspondingly represent any or a combination of the first set of parameters, the second set of
parameters, the quantified variance, and the determined pattern.
[0054] FIG. 2 illustrates exemplary functional components of a processing unit, in
accordance with an exemplary embodiment of the present disclosure.
[0055] As illustrated in FIG. 2, in an embodiment, the processing unit 106 can include one
or more processor(s) 202. The one or more processor(s) 202 can be implemented as one or more
microprocessors, microcomputers, microcontrollers, digital signal processors, central processing
units, logic circuitries, and/or any devices that manipulate data based on operational instructions.
Among other capabilities, the one or more processor(s) 202 are configured to fetch and execute
computer-readable instructions stored in a memory 204 of the processing unit 106. The memory
204 can store one or more computer-readable instructions or routines, which may be fetched and
executed to create or share the data units over a network service. The memory 204 can include any
non-transitory storage device including, for example, volatile memory such as RAM, or
non-volatile memory such as EPROM, flash memory, and the like.
[0056] In an embodiment, the processing unit 106 can also include an interface(s) 206. The
interface(s) 206 may include a variety of interfaces, for example, interfaces for data input and
output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 206 may
facilitate communication of the processing unit 106 with various devices coupled to the processing
unit 106. The interface(s) 206 may also provide a communication pathway for one or more
components of the processing unit 106. Examples of such components include, but are not limited
to, processing engine(s) 208 and data 210.
[0057] In an embodiment, the processing engine(s) 208 can be implemented as a
combination of hardware and programming (for example, programmable instructions) to
implement one or more functionalities of the processing engine(s) 208. In examples described
herein, such combinations of hardware and programming may be implemented in several different
ways. For example, the programming for the processing engine(s) 208 may be processor
executable instructions stored on a non-transitory machine-readable storage medium and the
hardware for the processing engine(s) 208 may include a processing resource (for example, one or
more processors), to execute such instructions. In the present examples, the machine-readable
storage medium may store instructions that, when executed by the processing resource, implement
the processing engine(s) 208. In such examples, the processing unit 106 can include the
machine-readable storage medium storing the instructions and the processing resource to execute the
instructions, or the machine-readable storage medium may be separate but accessible to the
processing unit 106 and the processing resource. In other examples, the processing engine(s) 208
may be implemented by electronic circuitry. The data 210 can include data that is either stored or
generated as a result of functionalities implemented by any of the components of the processing
engine(s) 208.
[0058] In an embodiment, the processing engine(s) 208 can include a comparison unit 212,
a quantifying unit 214, and other unit(s) 218. The other unit(s) 218 can implement functionalities
that supplement applications or functions performed by the processing unit 106 or the processing
engine(s) 208.
[0059] In an embodiment, the comparison unit 212 associated with the processing unit 106
can facilitate comparison of a first set of signals and a second set of signals with a first dataset,
which can include pre-determined limits associated with the first parameters and the second
parameters. In an embodiment, a first set of sensors 102 can be configured to sense first parameters
of a subject, and the first set of signals can be generated corresponding to the first parameters
sensed by at least one of the first set of sensors 102. In another embodiment, a second set of sensors
104 can be configured to sense second parameters of the subject, and the second set of signals can
be generated corresponding to the second parameters sensed by at least one of the second set of
sensors 104. In an embodiment, the first sensors 102 and the second sensors 104 can be any or a
combination of gyroscope and accelerometer, the first parameters can include any or a combination
of orientation, velocity, acceleration, and the likes, and the second parameters can include any or
a combination of location, angular position, rotation, and the likes.
[0060] In another embodiment, the comparison unit 212 can be configured to compare a
set of input signals received from an input device 108, which can pertain to a pattern, with
a second dataset, which can include sets of pre-defined patterns. In an embodiment, the input
device 108 can include any device, such as, but not limited to, any or a combination of LCD screen,
keypad, digital pen, computer, smart phone, keyboard, joystick, mouse, and the likes, for receiving
input from the subject. In an embodiment, an input interface 112 can be operatively coupled to the
input device 108 to enable transformation of the set of input signals received through the input
device 108, into a set of instructions, such as, but not limited to, binary code, octal code,
hexadecimal code, and gray code, which can be accepted and processed by the processing engines
208. In an illustrative embodiment, any or a combination of the first set of signals, the second set
of signals, the set of input signals, the first dataset and the second dataset can be stored in a database
110, which can be associated with the data 210, or can be associated with a cloud 114, or can be
configured in a remotely located external source, or, can even exist as an independent unit.
[0061] In an embodiment, the quantifying unit 214 associated with the processing unit 106
can facilitate quantification of a variance associated with movement of the subject based on the
comparison of any or a combination of the first set of signals and the second set of signals with
the first dataset. In an embodiment, a set of alert signals can be generated when the quantified
variance is found to be beyond a pre-defined range. In an illustrative embodiment, comparison
between the quantified variance and the pre-defined range can be facilitated by the comparison
unit 212.
[0062] In an embodiment, the quantifying unit 214 can facilitate determination of a pattern
associated with the set of input signals based on the comparison performed between the received
set of input signals and the second dataset. In another embodiment, the quantifying unit 214 can
facilitate quantification of movement of the subject based on the said comparison, and can
correspondingly generate a set of status signals that can be indicative of quantification of the
movement of the subject. In an embodiment, one or more computing devices can be operatively
coupled to the processing unit 106, such that at least one of the one or more computing devices
can receive any or a combination of the generated set of alert signals and the generated set of status
signals, and can correspondingly represent any or a combination of the first set of parameters, the
second set of parameters, the quantified variance, and the determined pattern.
[0063] FIGs. 3A-3C illustrate quantification of exemplary patterns, in accordance with an
embodiment of the present disclosure.
[0064] As illustrated in FIG. 3A, in an embodiment, a flow chart 300 is represented for
quantification of exemplary patterns. In an embodiment, the flow chart 300 can include an event 302
of initializing important credentials of a subject, such as, but not limited to, age, and gender. In an
illustrative embodiment, the subject can enter the credentials through a Graphical User Interface
(GUI).
[0065] In an embodiment, the flow chart can include an event 304, in which spiral patterns
can be drawn by the subject. In an illustrative embodiment, the subject can draw the said patterns
using LCD 108-1 and digital pen 108-2. In an embodiment, the proposed system 100 can include
an input interface 112, which can be operatively coupled between the LCD 108-1 and the digital
pen 108-2. The input interface 112 can enable transformation of the received spiral patterns into a
set of instructions, such as, but not limited to, binary code, octal code, hexadecimal code, and gray
code, which can be accepted and processed by the processing unit 106.
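A minimal sketch of quantifying a drawn spiral against a template follows; the Archimedean template r = b·θ and the mean-distance measure are assumptions for illustration only, since the disclosure does not specify the reference pattern:

```python
import math

def template_point(theta, b=1.0):
    """Point on an assumed Archimedean template spiral r = b * theta."""
    r = b * theta
    return (r * math.cos(theta), r * math.sin(theta))

def spiral_deviation(drawn_points, thetas, b=1.0):
    """Mean distance between the subject's drawn points and the template."""
    total = 0.0
    for (x, y), theta in zip(drawn_points, thetas):
        tx, ty = template_point(theta, b)
        total += math.hypot(x - tx, y - ty)
    return total / len(drawn_points)
```

A tremor-free drawing tracks the template closely, yielding a deviation near zero, while tremor produces oscillatory departures and a larger deviation, which the processing unit 106 can compare against the second dataset.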
[0066] In an embodiment, the flow chart can include an event 306 of detecting a disease
based on the received spiral patterns. In another embodiment, the processing unit 106 can be
configured to quantify movement of hands of the subject based on the received spiral patterns, and
can correspondingly generate a set of status signals that can be indicative of quantification of the
movement of the subject.
[0067] In an embodiment, the flow chart can include an event 308, in which the wearable
proposed system 100 can be worn by the subject. In an illustrative embodiment, the proposed
system 100 can include an accelerometer 102 and a gyroscope 104, and can be worn by the subject
on his/ her legs, or can otherwise be associated with the legs of the subject.
[0068] In an embodiment, the flow chart can include an event 310, in which delay in
movements of the legs of the subject can be measured through the accelerometer 102 for a specific
time interval. A first set of signals can be generated based on the movements measured by the
accelerometer 102.
[0069] In an embodiment, the flow chart can include an event 312, in which orientation
of posture of the subject can be measured by the gyroscope 104 for a specific time interval. A
second set of signals can be generated based on the movements measured by the gyroscope 104.
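Events 310 and 312 can be sketched, as a non-limiting illustration, as simple signal computations; the threshold and sampling rate below are assumed values, not parameters fixed by the disclosure:

```python
def movement_delay(accel_samples, rate_hz, threshold=0.3):
    """Seconds until the acceleration magnitude first exceeds a threshold:
    a simple proxy for the delay in leg movement measured at event 310."""
    for i, a in enumerate(accel_samples):
        if abs(a) > threshold:
            return i / rate_hz
    return None  # no movement detected within the interval

def integrate_orientation(gyro_dps, rate_hz):
    """Integrate angular rate (deg/s) into an orientation angle (degrees):
    a rough proxy for the posture orientation measured at event 312."""
    angle = 0.0
    dt = 1.0 / rate_hz
    for w in gyro_dps:
        angle += w * dt
    return angle
```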
[0070] In an embodiment, the flow chart can include an event 314, at which the processing
unit 106 can predict tremors based on the first set of signals and the second set of signals
corresponding to measurements of the accelerometer 102 and the gyroscope 104. Further, the
prediction can also be based on a deep learning model. A set of alert signals can be generated based
on the prediction of tremors.
[0071] In an embodiment, the flow chart can include an event 316, in which any or a
combination of the generated set of alert signals and the generated set of status signals can be sent
to a display unit, such as an LCD screen or an LED, and to computing devices registered with the
proposed system 100; correspondingly, any or a combination of the first set of parameters,
the second set of parameters, the quantified variance, and the determined pattern can be represented
on the display unit or on the screens of the computing devices, in the form of graphics, text, audio,
video, and the likes.
[0072] In an illustrative embodiment, if the determined pattern is obtained as represented
in FIG. 3B, then the subject can be considered to be healthy, and a corresponding message can be
displayed at the display unit, whereas, if the determined pattern is obtained as represented in FIG.
3C, then the subject can be considered to be unhealthy, and a corresponding message can be
displayed at the display unit.
[0073] Thus, it will be appreciated by those of ordinary skill in the art that the diagrams,
schematics, illustrations, and the like represent conceptual views or processes illustrating systems
and methods embodying this invention. The functions of the various elements shown in the figures
may be provided through the use of dedicated hardware as well as hardware capable of executing
associated software. Similarly, any switches shown in the figures are conceptual only. Their
function may be carried out through the operation of program logic, through dedicated logic,
through the interaction of program control and dedicated logic, or even manually, the particular
technique being selectable by the entity implementing this invention. Those of ordinary skill in the
art further understand that the exemplary hardware, software, processes, methods, and/or operating
systems described herein are for illustrative purposes and, thus, are not intended to be limited to
any particular named manufacturer.
[0074] While embodiments of the present invention have been illustrated and described, it
will be clear that the invention is not limited to these embodiments only. Numerous modifications,
changes, variations, substitutions, and equivalents will be apparent to those skilled in the art,
without departing from the spirit and scope of the invention, as described in the claims.
[0075] In the foregoing description, numerous details are set forth. It will be apparent,
however, to one of ordinary skill in the art having the benefit of this disclosure, that the present
invention may be practiced without these specific details. In some instances, well-known structures
and devices are shown in block diagram form, rather than in detail, to avoid obscuring the present
invention.
[0076] As used herein, and unless the context dictates otherwise, the term "coupled to" is
intended to include both direct coupling (in which two elements that are coupled to each other
contact each other) and indirect coupling (in which at least one additional element is located
between the two elements). Therefore, the terms "coupled to" and "coupled with" are used
synonymously. Within the context of this document, the terms "coupled to" and "coupled with" are
also used to mean "communicatively coupled with" over a network, where two or
more devices are able to exchange data with each other over the network, possibly via one or more
intermediary devices.
[0077] It should be apparent to those skilled in the art that many more modifications
besides those already described are possible without departing from the inventive concepts herein.
The inventive subject matter, therefore, is not to be restricted except in the spirit of the appended
claims. Moreover, in interpreting both the specification and the claims, all terms should be
interpreted in the broadest possible manner consistent with the context. In particular, the terms
“comprises” and “comprising” should be interpreted as referring to elements, components, or steps
in a non-exclusive manner, indicating that the referenced elements, components, or steps may be
present, or utilized, or combined with other elements, components, or steps that are not expressly
referenced. Where the specification or claims refer to at least one of something selected from the
group consisting of A, B, C, ..., N, the text should be interpreted as requiring only one element
from the group, not A plus N, or B plus N, etc.
[0078] While the foregoing describes various embodiments of the invention, other and
further embodiments of the invention may be devised without departing from the basic scope
thereof. The scope of the invention is determined by the claims that follow. The invention is not
limited to the described embodiments, versions or examples, which are included to enable a person
having ordinary skill in the art to make and use the invention when combined with information
and knowledge available to the person having ordinary skill in the art.
ADVANTAGES OF THE PRESENT DISCLOSURE
[0079] The present disclosure provides a system for monitoring movements of a subject.
16
[0080] The present disclosure provides a system for measuring position, orientation, and
acceleration of limbs of the subject.
[0081] The present disclosure provides a system for identifying tremors of the subject.
[0082] The present disclosure provides a system for sending an alert message, based on
the identified tremors, to the subject, his/ her relatives, and related authorities.
[0083] The present disclosure provides a portable, wearable, accurate, fast, efficient, and
cost effective system.
We Claim:
1. A system for quantifying movement of a subject, the system comprising:
one or more first sensors configured to sense first parameters associated with the
subject;
one or more second sensors configured to sense second parameters associated with
the subject; and
a processing unit operatively coupled to the one or more first sensors and the one
or more second sensors, the processing unit comprising one or more processors coupled
with a memory, the memory storing instructions which, when executed by the one or more
processors, configure the processing unit to:
generate a first set of signals corresponding to the first parameters sensed
by at least one of the one or more first sensors;
generate a second set of signals corresponding to the second parameters
sensed by at least one of the one or more second sensors;
compare the generated first set of signals and the generated second set of
signals with a first dataset comprising pre-determined limits associated with the
first parameters and the second parameters; and
responsive to the comparison, quantify a variance associated with
movement of the subject, and correspondingly generate a set of alert signals when
the quantified variance is beyond a pre-defined range.
2. The system as claimed in claim 1, wherein the one or more first sensors and the one or
more second sensors comprise any or a combination of a gyroscope and an accelerometer.
3. The system as claimed in claim 1, wherein the first parameters comprise any or a
combination of orientation, velocity, and acceleration.
4. The system as claimed in claim 1, wherein the second parameters comprise any or a
combination of location, angular position, and rotation.
5. The system as claimed in claim 1, wherein the system is configured in form of a wearable
device.
6. The system as claimed in claim 1, wherein the system comprises an input device
operatively coupled to the processing unit, whereby the input device is configured to
receive a set of input signals from the subject; wherein the set of input signals pertain to a
pattern, and
wherein the input device comprises any or a combination of LCD screen, digital pen,
keyboard, joystick, and mouse.
7. The system as claimed in claim 6, wherein the processing unit is configured to determine
a pattern associated with the received set of input signals by comparing the received set of
input signals with a second dataset comprising sets of pre-defined patterns.
8. The system as claimed in claim 6, wherein the processing unit is configured to quantify
movement of the subject based on the comparison, and correspondingly generate a set of
status signals.
9. The system as claimed in claim 6, wherein the processing unit is configured to update the
first dataset and the second dataset based on any or a combination of the first set of signals,
the second set of signals, and the set of input signals received.
10. The system as claimed in claim 6, wherein the system comprises one or more computing
devices operatively coupled to the processing unit, and configured to receive any or a
combination of the generated set of alert signals and the generated set of status signals, and
correspondingly represent any or a combination of the first set of parameters, the second
set of parameters, the quantified variance, and the determined pattern.
| # | Name | Date |
|---|---|---|
| 1 | 202011017201-IntimationOfGrant26-06-2024.pdf | 2024-06-26 |
| 2 | 202011017201-STATEMENT OF UNDERTAKING (FORM 3) [22-04-2020(online)].pdf | 2020-04-22 |
| 3 | 202011017201-FORM FOR STARTUP [22-04-2020(online)].pdf | 2020-04-22 |
| 4 | 202011017201-PatentCertificate26-06-2024.pdf | 2024-06-26 |
| 5 | 202011017201-FORM FOR SMALL ENTITY(FORM-28) [22-04-2020(online)].pdf | 2020-04-22 |
| 6 | 202011017201-Annexure [25-06-2024(online)].pdf | 2024-06-25 |
| 7 | 202011017201-Written submissions and relevant documents [25-06-2024(online)].pdf | 2024-06-25 |
| 8 | 202011017201-FORM 1 [22-04-2020(online)].pdf | 2020-04-22 |
| 9 | 202011017201-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [22-04-2020(online)].pdf | 2020-04-22 |
| 10 | 202011017201-Correspondence to notify the Controller [07-06-2024(online)].pdf | 2024-06-07 |
| 11 | 202011017201-FORM-26 [07-06-2024(online)].pdf | 2024-06-07 |
| 12 | 202011017201-EVIDENCE FOR REGISTRATION UNDER SSI [22-04-2020(online)].pdf | 2020-04-22 |
| 13 | 202011017201-US(14)-HearingNotice-(HearingDate-10-06-2024).pdf | 2024-05-07 |
| 14 | 202011017201-DRAWINGS [22-04-2020(online)].pdf | 2020-04-22 |
| 15 | 202011017201-DECLARATION OF INVENTORSHIP (FORM 5) [22-04-2020(online)].pdf | 2020-04-22 |
| 16 | 202011017201-CLAIMS [17-10-2022(online)].pdf | 2022-10-17 |
| 17 | 202011017201-COMPLETE SPECIFICATION [22-04-2020(online)].pdf | 2020-04-22 |
| 18 | 202011017201-CORRESPONDENCE [17-10-2022(online)].pdf | 2022-10-17 |
| 19 | 202011017201-DRAWING [17-10-2022(online)].pdf | 2022-10-17 |
| 20 | 202011017201-FORM-26 [08-07-2020(online)].pdf | 2020-07-08 |
| 21 | 202011017201-FER_SER_REPLY [17-10-2022(online)].pdf | 2022-10-17 |
| 22 | 202011017201-Proof of Right [09-09-2020(online)].pdf | 2020-09-09 |
| 23 | 202011017201-FORM 18 [16-12-2021(online)].pdf | 2021-12-16 |
| 24 | 202011017201-FORM-26 [17-10-2022(online)].pdf | 2022-10-17 |
| 25 | 202011017201-FER.pdf | 2022-04-18 |
| 26 | SearchHistorypatseer202011017201E_18-04-2022.pdf | |