Abstract: SYSTEM AND METHOD TO DETECT PRESENCE OF SKIN ABNORMALITIES
A system (110) for automatically detecting the presence of abnormal heat patterns on skin of a human subject (100) using a non-invasive thermal imaging technique by performing (i) receiving thermal images of the human subject (100) that are captured or recorded by a thermal imaging device, (ii) selecting a region of interest in the thermal image and extracting a plurality of numeric features associated with the region of interest, (iii) generating a probability score that indicates a presence of the abnormal heat patterns on the skin of the human subject (100) by analyzing the plurality of numeric features using a first machine learning model, and (iv) generating an output report that contains the probability score and a label, derived using the probability score, which indicates the presence of the abnormal heat patterns. FIG. 1
Claims:
I/We Claim:
1. A system (110) for automatically detecting a presence of abnormal heat patterns on skin of a human subject (100) using a non-invasive thermal imaging technique, the system comprising:
a memory that stores a set of machine-readable instructions;
a processor (112) retrieving machine-readable instructions from a storage device (114) which, when executed by the processor, enable the processor to:
receive thermal images of the human subject (100) that are captured or recorded by a thermal imaging device (101), the thermal imaging device (101) comprising:
an array of sensors that converts infrared energy into electrical signals on a per-pixel basis, wherein the array of sensors detects temperature values from the subject’s body; and
a specialized processor that processes the detected temperature values into at least one block of pixels to generate a thermal image;
select a region of interest in the thermal image and extract a plurality of numeric features associated with the region of interest;
generate, using a first machine learning model, a probability score that indicates a presence of the abnormal heat patterns on the skin of the human subject (100) by analysing the extracted plurality of numeric features, wherein
the first machine learning model comprises a first classifier model that is trained to determine a likelihood of the presence of the abnormal heat patterns on the skin of the human subject (100) using a first training set, wherein
the first training set comprises a plurality of numeric features extracted from a plurality of thermal images of human subjects and their corresponding labels of each of the thermal images indicating the presence of the abnormal heat patterns; and
generate an output report that comprises the probability score and a label, derived using the probability score, which indicates the presence of the abnormal heat patterns.

2. The system (110) as claimed in claim 1, wherein the plurality of numeric features includes at least one or a combination of textural features, first-order temperature features, time-series features, and Fourier features, wherein the textural features are extracted from at least one gray level co-occurrence matrix or run-length matrix, wherein the Fourier features are obtained by taking a Fourier transform of time series data.

3. The system (110) as claimed in claim 1, wherein the output report comprises the thermal images with annotated regions corresponding to a surface location of the abnormal heat patterns in the thermal image, wherein the abnormal heat patterns are produced by live parasitic worms that are present on skin of the human subject (100).

4. The system (110) as claimed in claim 3, wherein the output report comprises an estimate of a reproductive status of the live parasitic worms on the skin of the human subject (100), wherein the reproductive status of the live parasitic worms on the skin of the human subject (100) is estimated using a second machine learning model, wherein the second machine learning model is at least one or a combination of logistic regression, support vector machine, neural networks, deep neural networks, linear regression, k-nearest neighbour and random forests, wherein the second machine learning model comprises a second classifier model that is trained to determine the reproductive status of the parasitic worms on the skin of the human subject (100) using a second training set, wherein the second training set comprises a plurality of numeric features extracted from the thermal images of human subjects and their corresponding labels indicating the reproductive status of live worms for each of the thermal images.

5. The system (110) as claimed in claim 1, wherein the system stores all probability scores, using a memory unit, generated for the human subject (100) at specific intervals for assessing efficacy of drugs for a specific disease based on a trend in ordinal value of the probability scores, wherein the efficacy of drugs is assessed by comparing a generated probability score of the human subject (100) with the probability scores that are retrieved from the memory unit.

6. The system (110) as claimed in claim 1, wherein the array of sensors in the thermal imaging device is selected from at least one of a thermal imaging camera, a wearable contact-based thermal device or a lens that focuses the infrared energy from a body of the human subject (100) onto the array of sensors.

7. The system (110) as claimed in claim 1, wherein the system obtains a first thermal image from the thermal imaging device (101) and a second visual image from an associated RGB camera lens (116), wherein the system extracts the plurality of numeric features from the region of interest using the obtained first thermal image and the second visual image and provides them as an input to the first machine learning model to generate a probability score, wherein the first machine learning model is at least one or a combination of logistic regression, support vector machine, neural networks, deep neural networks, linear regression, k-nearest neighbour and random forests.

8. The system (110) as claimed in claim 6, wherein the first thermal image and the second visual image of the human subject (100) are used to generate a plurality of multimodal numeric and image features from the region of interest and provided as an input to a third machine learning model to generate the probability score that indicates a presence of the abnormal heat patterns on the skin of the human subject (100), wherein the third machine learning model is at least one or a combination of logistic regression, support vector machine, neural networks, deep neural networks, linear regression, k-nearest neighbour and random forests, wherein the third machine learning model comprises a deep learning classifier model that is trained to determine a likelihood of the presence of the abnormal heat patterns on the skin of the human subject (100) using a third training set, wherein the third training set comprises the plurality of multimodal numeric and image features related to the first thermal images and the second visual images of human subjects and corresponding labels indicating the presence of the abnormal heat patterns.

9. The system (110) as claimed in claim 1, wherein the processor receives the thermal images of the human subject (100) that are recorded by the thermal imaging device, wherein the thermal imaging device records the human subject (100) in the form of a thermal video and selects a plurality of thermal image frames from the thermal video to generate the thermal images.

10. The system (110) as claimed in claim 7, wherein the plurality of numeric features comprises time-series features that are derived from time-varying characteristics of thermal pixels or points in the thermal video and visual pixels or points in the visual video.

11. The system (110) as claimed in claim 3, wherein the output report comprises an estimate of at least one of a number or density of the live parasitic worms on the skin of the human subject (100), wherein at least one of the number or the density of the live parasitic worms on the skin of the human subject (100) is estimated using a fourth machine learning model, wherein the fourth machine learning model comprises a fourth classifier model that is trained to determine at least one of the number or the density of parasitic worms on the skin of the human subject (100) using a fourth training set, wherein the fourth training set comprises a plurality of numeric features extracted from the thermal images of human subjects and their corresponding labels indicating at least one of the number or the density of live worms for each of the thermal images.

12. A method for automatically detecting the presence of abnormal heat patterns on skin of a human subject (100) using a non-invasive thermal imaging technique comprising:
receiving thermal images of the human subject (100) that are captured or recorded by a thermal imaging device;
selecting a region of interest in the thermal image and extracting a plurality of numeric features associated with the region of interest;
generating, using a first machine learning model, a probability score that indicates a presence of the abnormal heat patterns on the skin of the human subject (100) by analyzing the extracted plurality of numeric features, wherein the first machine learning model comprises a first classifier model that is trained to determine a likelihood of the presence of the abnormal heat patterns on the skin of the human subject (100) using a first training set, wherein the first training set comprises a plurality of numeric features extracted from a plurality of thermal images of human subjects and their corresponding labels of each of the thermal images indicating the presence of the abnormal heat patterns; and
generating an output report that comprises the probability score and a label, derived using the probability score, which indicates the presence of the abnormal heat patterns.

Dated this 21st March 2022
Signature:
Name: Arjun Bala Karthik
IN/PA No. 1021
Description:
matrix. The textural features include at least one of second-order features that include non-uniformity, entropy, contrast, dissimilarity, homogeneity, energy, correlation, angular second moment, etc. In some embodiments, the first-order temperature features include mean temperature, maximum temperature, minimum temperature, standard deviation, etc. In some embodiments, the time-series features include at least one of thermal resistance mean variation, rate of warming, slope, etc. In some embodiments, the Fourier features are obtained by taking a Fourier transform of time series data. The Fourier features include skewness, kurtosis, spectral centroid, spectral variance, slope, dynamic model Friedrich coefficients, first minima and maxima relative positions, area under curve, etc.
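The feature families listed above can be computed with standard image-processing and signal-processing tools. The sketch below is illustrative only: it assumes a thermal frame is available as a 2-D NumPy array of temperatures and the region of interest as a boolean mask, and the helper name `extract_roi_features` and the chosen quantisation are assumptions rather than part of the claimed system.

```python
# Illustrative sketch only: first-order, GLCM textural and Fourier features
# for a thermal region of interest. Assumes NumPy, SciPy and scikit-image.
import numpy as np
from scipy.stats import skew, kurtosis
from skimage.feature import graycomatrix, graycoprops

def extract_roi_features(thermal_frame, roi_mask, pixel_series=None):
    """thermal_frame: 2-D array of temperatures; roi_mask: boolean mask of the ROI;
    pixel_series: optional 1-D temperature time series for one point (hypothetical)."""
    roi = thermal_frame[roi_mask]

    # First-order temperature features
    features = {
        "mean_temp": float(roi.mean()),
        "max_temp": float(roi.max()),
        "min_temp": float(roi.min()),
        "std_temp": float(roi.std()),
    }

    # Second-order (textural) features from a gray-level co-occurrence matrix
    levels = 32
    quantised = np.digitize(thermal_frame, np.linspace(roi.min(), roi.max(), levels)) - 1
    quantised = np.clip(quantised, 0, levels - 1).astype(np.uint8)
    glcm = graycomatrix(quantised, distances=[1], angles=[0],
                        levels=levels, symmetric=True, normed=True)
    for prop in ("contrast", "dissimilarity", "homogeneity", "energy", "correlation", "ASM"):
        features[f"glcm_{prop}"] = float(graycoprops(glcm, prop)[0, 0])

    # Fourier features from an optional per-point time series
    if pixel_series is not None:
        spectrum = np.abs(np.fft.rfft(pixel_series - np.mean(pixel_series)))
        freqs = np.fft.rfftfreq(len(pixel_series))
        features["spectral_centroid"] = float((freqs * spectrum).sum() / (spectrum.sum() + 1e-9))
        features["spectral_skewness"] = float(skew(spectrum))
        features["spectral_kurtosis"] = float(kurtosis(spectrum))
    return features
```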
[0037] In some embodiments, the system stores all probability scores generated for the human subject 100 at a specific interval using the memory 114 for assessing efficacy of drugs for a specific disease based on a trend in an ordinal value of the probability scores. In some embodiments, the efficacy of drugs is assessed by comparing a generated probability score of the human subject 100 with the probability scores that are retrieved from the memory 114.
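As one possible reading of this trend-based assessment, the sketch below compares a newly generated score against stored scores and flags whether the scores are falling over successive intervals; the helper `assess_drug_efficacy` and the simple least-squares slope test are illustrative assumptions, not the claimed method.

```python
# Illustrative sketch: compare a new probability score against stored scores
# to flag a decreasing trend (a hypothetical proxy for drug efficacy).
import numpy as np

def assess_drug_efficacy(stored_scores, new_score):
    """stored_scores: earlier probability scores in chronological order."""
    history = list(stored_scores) + [new_score]
    if len(history) < 2:
        return "insufficient data"
    # Least-squares slope of score vs. visit index; a negative slope suggests
    # the abnormal-heat probability is declining across the stored intervals.
    slope = np.polyfit(np.arange(len(history)), history, deg=1)[0]
    return "improving" if slope < 0 else "not improving"

print(assess_drug_efficacy([0.91, 0.84, 0.70], 0.52))  # -> "improving"
```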
[0038] With reference to FIG. 1, FIG. 2 illustrates an exploded view of a system 110 for automatically detecting the presence of the abnormal heat patterns on the skin of the human subject 100 according to some embodiments herein. The exploded view 200 of the system 110 includes a database 202, a thermal image acquisition module 204, a region of interest detection module 206, a feature extraction module 208, a feature analyzing module 210 and a report generation module 212. The thermal image acquisition module 204 receives thermal images of the human subject 100. In some embodiments, the thermal imaging device 101 is communicatively connected to the system 110. The thermal imaging device 101 captures a thermal image of the body of the human subject 100. In some embodiments, the thermal imaging device 101 captures a particular region of the body of the human subject 100. In some embodiments, the thermal image of the body of the human subject 100 is captured from a heat map of the body. In some embodiments, the thermal imaging device 101 includes an array of sensors and a specialized processor. The array of sensors converts infrared energy into electrical signals on a per-pixel basis. The array of sensors detects temperature values from the human subject’s body. The specialized processor processes the captured thermal image of the body of the human subject 100 to extract a plurality of numeric features to detect the presence of the abnormal heat patterns on the skin of the human subject 100. In some embodiments, the array of sensors in the thermal imaging device 101 is selected from at least one of a thermal imaging camera, a wearable contact-based thermal device or a lens that focuses the infrared energy from the body of the human subject 100 onto the array of sensors. The database 202 stores the captured thermal image of the body of the human subject 100.
[0039] The region of interest detection module 206 obtains an input from a user to select a region of interest on the captured thermal image of the human subject 100. In some embodiments, the region of interest on the thermal image of the human subject 100 is obtained using an automated segmentation technique. The numeric feature extraction module 208 extracts the plurality of numeric features associated with the region of interest of the thermal image. In some embodiments, the plurality of numeric features is extracted using at least one of an image processing technique or a mathematical analysis.
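The automated segmentation technique is not limited to any particular algorithm. One minimal sketch, assuming a simple Otsu threshold on the temperature map followed by selection of the largest connected warm component, is shown below; the function name `auto_select_roi` is hypothetical.

```python
# Illustrative sketch: automated region-of-interest selection on a thermal
# frame using Otsu thresholding and the largest connected warm component.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label

def auto_select_roi(thermal_frame):
    """Return a boolean mask of the largest contiguous region warmer than the Otsu threshold."""
    warm = thermal_frame > threshold_otsu(thermal_frame)
    labelled = label(warm)
    if labelled.max() == 0:
        return warm  # nothing segmented; empty mask
    sizes = np.bincount(labelled.ravel())
    sizes[0] = 0  # ignore the background component
    return labelled == sizes.argmax()
```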
[0040] The feature analyzing module 210 analyses a plurality of extracted numeric features to generate a probability score that indicates the presence of the abnormal heat patterns on the skin of the human subject 100 using a first machine learning model. In some embodiments, the first machine learning model includes a first classifier model that is trained to determine the presence of the abnormal heat patterns on the skin of the human subject 100 using a first training set. In some embodiments, the first training set includes the plurality of numeric features related to the thermal images of the human subject 100 and corresponding labels indicating the presence of the abnormal heat patterns. In some embodiments, the first training set is obtained from the storage device 114.
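A minimal sketch of how such a first classifier model might be trained and applied is given below, assuming scikit-learn, a feature matrix built from numeric features of the kind described above, and binary labels; the specific feature values, the 0/1 labels and the logistic-regression choice are illustrative assumptions only.

```python
# Illustrative sketch: train a first classifier on numeric features extracted
# from labelled thermal images, then produce a probability score for a new ROI.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical first training set: one feature vector per thermal image,
# labels indicate presence (1) or absence (0) of abnormal heat patterns.
X_train = np.array([[36.9, 0.42, 0.11], [38.4, 0.67, 0.35],
                    [37.0, 0.40, 0.12], [38.8, 0.71, 0.40]])
y_train = np.array([0, 1, 0, 1])

first_model = make_pipeline(StandardScaler(), LogisticRegression())
first_model.fit(X_train, y_train)

new_features = np.array([[38.5, 0.66, 0.33]])
probability_score = first_model.predict_proba(new_features)[0, 1]
label = "abnormal heat patterns present" if probability_score >= 0.5 else "not present"
print(round(probability_score, 3), label)
```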
[0041] The report generation module 214 generates an output report with the probability score and a label, derived using the probability score which indicates the presence of abnormal heat patterns. The system 110 enables the user to identify the presence of live parasitic worms on the skin of a human subject 100 using the abnormal heat patterns detected on thermal images of the skin of the human subject 100. In some embodiments, the output report includes the thermal images with annotated regions corresponding to a surface location of the abnormal heat patterns in the thermal image. In some embodiments, the abnormal heat patterns are produced by live parasitic worms that are present on the skin of the human subject 100. In some embodiments, the output report includes an estimate of a reproductive status of the live parasitic worms on the skin of the human subject 100. The reproductive status of the live parasitic worms may be estimated using a second machine learning model. In some embodiments, classifier models include at least one of a Support Vector Machine (SVM), a neural network, a Bayesian network, a Logistic Regression, Naïve Bayes, Randomized Forests, Decision Trees, Boosted Decision Trees, K-nearest neighbor, a Restricted Boltzmann Machine (RBM), or deep learning classifiers.
[0042] With reference to FIG. 1 and FIG. 2, FIG. 3 illustrates an example system view of automatically detecting the presence of the abnormal heat patterns on the skin of the human subject 100 using a thermal image and a second visual image according to some embodiments herein. An RGB camera lens 116 captures the second visual image. In some embodiments, second visual images may be selected from the video of the body of the human subject 100. In some embodiments, the thermal imaging device 101 and the RGB camera lens 116 are communicatively connected to the system 110. In some embodiments, the system 110 includes the processor 112, a second processor 118 and the storage device 114. The processor 112 receives the thermal image and the second visual image of the body of the human subject 100. The storage device 114 stores data corresponding to the received thermal image and the second visual image of the body of the human subject 100. Thermal imaging cameras are readily available in various streams of commerce.

[0043] In some embodiments, the system 110 obtains the first thermal image from the thermal imaging device 101 and a second visual image from the RGB camera lens 116. In some embodiments, the first thermal image and the second visual image of the human subject 100 are used to generate a plurality of multimodal numeric and image features from the region of interest and provided as an input to a third machine learning model to generate the probability score.
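One way such a multimodal third model could be realised is sketched below, assuming the thermal frame and a registered RGB image are resized to a common grid and fed to a small convolutional classifier; PyTorch, the network layout and the input sizes are illustrative assumptions rather than the claimed implementation.

```python
# Illustrative sketch: a small multimodal classifier that stacks a thermal
# frame (1 channel) with a registered RGB image (3 channels) and predicts a
# probability of abnormal heat patterns. PyTorch is assumed for illustration.
import torch
import torch.nn as nn

class MultimodalClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(4, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, thermal, rgb):
        x = torch.cat([thermal, rgb], dim=1)            # (N, 4, H, W)
        x = self.features(x).flatten(1)                 # (N, 32)
        return torch.sigmoid(self.head(x)).squeeze(1)   # probability score per image

# Hypothetical inputs: one 64x64 thermal frame and one registered RGB image.
model = MultimodalClassifier()
score = model(torch.rand(1, 1, 64, 64), torch.rand(1, 3, 64, 64))
print(float(score))  # arbitrary until the network is fitted on the third training set
```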
[0001] In some embodiments, the processor 112 receives the thermal image of the human subject 100 that is recorded by the thermal imaging device 101 in the form of a thermal video. The thermal imaging device 101 records the human subject 100 in the form of the thermal video and selects a plurality of thermal image frames from the thermal video to generate the thermal images. The system 110 extracts the plurality of numeric features from a region of interest using the first thermal image and the second visual image and provides them as an input to the first machine learning model to generate a probability score. The system 110 processes the thermal image and the second visual image captured by the thermal imaging device 101 for processing a plurality of numeric features associated with a region of interest of the thermal image, for automatically detecting the presence of the abnormal heat patterns on the skin of the human subject 100 and enabling a user (e.g. a doctor or a physician) to identify the presence of live parasitic worms using an output report. In some embodiments, the output report includes probability scores and labels that are derived by analyzing the plurality of numeric features in the thermal image of the human subject 100. In some embodiments, the plurality of numeric features includes time-series features that are derived from time-varying characteristics of thermal pixels or points in the thermal video and visual pixels or points in the visual video.
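A minimal sketch of deriving such time-series features is given below, assuming the thermal video is available as a 3-D array (frames × height × width) sampled at a known frame rate; the helper name `time_series_features` and the particular warming-rate and slope computations are illustrative assumptions.

```python
# Illustrative sketch: per-point time-series features (mean variation, rate of
# warming, slope) from a thermal video represented as frames x height x width.
import numpy as np

def time_series_features(thermal_video, row, col, fps=30.0):
    """thermal_video: ndarray (num_frames, H, W) of temperatures; (row, col): point of interest."""
    series = thermal_video[:, row, col].astype(float)
    t = np.arange(series.size) / fps
    slope = np.polyfit(t, series, deg=1)[0]  # degrees per second, least-squares fit
    return {
        "mean_variation": float(np.mean(np.abs(np.diff(series)))),
        "rate_of_warming": float((series[-1] - series[0]) / (t[-1] - t[0])),
        "slope": float(slope),
    }

# Synthetic 90-frame video of an 8x8 patch warming by 0.02 degrees per frame.
video = 36.5 + 0.02 * np.arange(90)[:, None, None] + np.zeros((90, 8, 8))
print(time_series_features(video, row=4, col=4))
```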
[0002] With reference to FIG. 1 and FIG. 2, FIG. 4 illustrates an exemplary process flow of processing the thermal image, for generating the probability score that indicates the presence of the abnormal heat patterns on the skin of the human subject 100 by analyzing the plurality of numeric features using the first machine learning model according to some embodiments herein. At step 402, the thermal image is captured using the thermal imaging device 101. In some embodiments, the thermal imaging device 101 may be mounted on a slidable and axially rotatable robotic arm capable of moving the thermal imaging camera in XYZ directions such that thermographic images may be captured in appropriate views required for imaging. In some embodiments, the thermal image can be captured by a human technician by manually adjusting the camera. In some embodiments, the thermal image may be received or retrieved from a remote device over a network, or from media such as a CD-ROM or DVD. In some embodiments, the thermal image may also be received from an application such as those which are available for handheld cellular devices and processed on the cell phone or other handheld computing devices such as an iPad or Tablet-PC. In some embodiments, the thermal image may be received directly from a memory or a storage device of an imaging device that is used to capture the thermal image or a thermal video. In some embodiments, the thermal image is obtained by selecting a single image frame of the thermal video. At step 404, the region of interest in the thermal image of the subject is selected using the region of interest detection module 206. In some embodiments, the region of interest on the thermal image is obtained from at least one of the user or through the automated segmentation technique. For example, the region of interest on the thermal image includes any location of the thermal image of the body of the human subject 100. At step 406, the plurality of numeric features associated with the region of interest of the thermal image are extracted. At step 408, a probability score is generated using the first machine learning model. The first machine learning model includes a first classifier model that is trained to determine the presence of the abnormal heat patterns on the skin of the human subject 100 using a first training set. The first training set includes the plurality of extracted numeric features related to the thermal images of the human subject 100 and corresponding labels indicating the presence of the abnormal heat patterns. At step 410, the report is generated with the probability score and a label that is derived using the probability score which indicates the presence of the abnormal heat patterns. At step 412, the generated report is provided to the system 110 for further analysis.
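Read as software, the flow of steps 402 to 412 can be expressed as a simple driver that chains the earlier sketches together; everything below (the helper names, the feature ordering and the report dictionary) is an illustrative assumption rather than the claimed implementation.

```python
# Illustrative sketch: chaining ROI selection -> feature extraction ->
# probability scoring -> report, mirroring steps 404-412 of FIG. 4.
def process_thermal_image(thermal_frame, model, pixel_series=None):
    roi_mask = auto_select_roi(thermal_frame)                      # step 404 (sketched earlier)
    features = extract_roi_features(thermal_frame, roi_mask,
                                    pixel_series=pixel_series)     # step 406
    vector = [[features["mean_temp"], features["glcm_contrast"],
               features["glcm_energy"]]]                           # hypothetical feature order
    score = model.predict_proba(vector)[0, 1]                      # step 408
    report = {                                                     # step 410
        "probability_score": float(score),
        "label": "abnormal heat patterns present" if score >= 0.5 else "not present",
    }
    return report                                                  # step 412: handed back to the system
```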
[0003] With reference to FIG. 1 to FIG. 4, FIG. 5 illustrates an exemplary process flow of processing the thermal image, for estimating a reproductive status of the live parasitic worms on the skin of the human subject 100 using a second machine learning model according to some embodiments herein. At step 502, the thermal image is captured using the thermal imaging device 101. At step 504, the region of interest in the thermal image of the subject is selected using the region of interest detection module 206. At step 506, the plurality of numeric features associated with the region of interest of the thermal image is extracted. At step 508, the reproductive status of the live parasitic worms on the skin of the human subject 100 is estimated using a second machine learning prediction model. The second machine learning model includes a second classifier model that is trained to determine the reproductive status of the parasitic worms on the skin of the human subject 100 using a second training set. The second training set includes the plurality of numeric features that are extracted from the thermal images of human subjects and their corresponding labels indicating the reproductive status of live worms for each of the thermal images. At step 510, the report is generated with the estimated reproductive status of the live parasitic worms on the skin of the human subject 100. At step 512, the generated report is provided to the system 110 for further analysis.
[0004] With reference to FIG. 1 to FIG. 3, FIG. 6 illustrates an exemplary process flow of processing the thermal image and the second visual image, for generating the plurality of multimodal numeric and image features from the region of interest and the probability score that indicates the presence of the abnormal heat patterns on the skin of the human subject 100 using a third machine learning model according to some embodiments herein. At step 602, the thermal image is captured using the thermal imaging device 101. At step 604, the second visual image is captured using the RGB camera lens 116. At step 606, the region of interest in the thermal image of the subject is selected using the region of interest detection module 206. At step 608, a plurality of multimodal numeric and image features is generated from the first thermal image and the second visual image of the human subject 100. At step 610, the probability score is generated using the third machine learning model. The third machine learning model includes a deep learning classifier model that is trained to determine the likelihood of the presence of the abnormal heat patterns on the skin of the human subject 100 using a third training set. The third training set includes the plurality of multimodal numeric and image features related to the first thermal images and the second visual images of human subjects and corresponding labels indicating the presence of the abnormal heat patterns. At step 612, the report is generated with the probability score that indicates the presence of the abnormal heat patterns on the skin of the human subject 100. At step 614, the generated report is provided to the system 110 for further analysis.
[0005] With reference to FIG. 1, FIG. 7 illustrates a flow diagram of a method for automatically detecting heat patterns associated with the presence of live parasitic worms on the skin of a human subject 100 according to some embodiments herein. At step 702, the thermal image of the body of the human subject 100 is received from the thermal imaging device 101. At step 704, the region of interest in the thermal image of the subject is selected and a plurality of numeric features are extracted. At step 706, a probability score is generated by analyzing the plurality of numeric features using the first machine learning model. At step 708, the report is generated with the probability score and a label that is derived using the probability score which indicates the presence of the abnormal heat patterns. In some embodiments, the output report includes an estimate of at least one of the number or the density of the live parasitic worms on the skin of the human subject 100. At least one of the number or the density of the live parasitic worms on the skin of the human subject 100 is estimated using a fourth machine learning model that includes a fourth classifier model that is trained to determine at least one of the number or the density of parasitic worms on the skin of the human subject 100 using a fourth training set. The fourth training set includes a plurality of numeric features extracted from the thermal images of human subjects and their corresponding labels indicating at least one of the number or the density of live worms for each of the thermal images.
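For the fourth model, a count or density estimate can be framed as regression over the same numeric features; the sketch below, using a random-forest regressor, made-up training rows and a hypothetical region-of-interest area, is only an illustration of that framing.

```python
# Illustrative sketch: estimating the number and density of live worms in a
# region of interest with a regressor trained on numeric features and counts.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical fourth training set: feature vectors with worm-count labels.
X_train = np.array([[37.1, 0.40], [38.2, 0.65], [38.9, 0.80], [37.0, 0.38]])
y_counts = np.array([0, 2, 5, 0])

fourth_model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X_train, y_counts)
estimated_count = fourth_model.predict(np.array([[38.5, 0.70]]))[0]
roi_area_cm2 = 12.0  # hypothetical area of the region of interest
print(round(estimated_count, 1), "worms,",
      round(estimated_count / roi_area_cm2, 2), "per cm^2")
```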
[0006] FIG. 8 illustrates a block diagram of one example system/image processing system for processing a thermal image in accordance with the embodiments described with respect to the flow diagram of FIG. 7 according to the embodiment herein. The system includes an image receiver 802, a temperature processor 803, a feature value extractor 804, a storage device 805, a machine learning model 806, a Central Processing Unit (CPU) 808, a memory 809, a workstation 810, a machine-readable media 811, a display device 812, a keyboard 813, a mouse 814, a database 816 and a network 817. The image receiver 802 wirelessly receives the video via antenna 801, having been transmitted thereto from the video/thermal imaging camera 101 of FIG. 1. The temperature processor 803 performs a temperature-based method to detect pixels in the received thermal image. The numeric feature extractor 804 extracts a plurality of numeric features associated with the region of interest of the thermal image. Both modules 803 and 804 store their results to the storage device 805. The machine learning model 806 retrieves the results from the storage device 805 and proceeds to process values of the plurality of numeric features that are extracted from the thermal image of the subject, for automatically detecting heat patterns associated with the presence of live parasitic worms on the skin of the human subject 100. The machine learning model 806 generates the probability score associated with the abnormal heat patterns that indicate the presence of the live parasitic worms on the skin of the human subject 100 by analyzing the plurality of numeric features. The report generating model 818 generates the report for the user. The Central Processing Unit (CPU) 808 retrieves machine-readable program instructions from the memory 809 and is provided to facilitate the functionality of any of the modules of the system 800. The Central Processing Unit (CPU) 808, operating alone or in conjunction with other processors, may be configured to assist or otherwise perform the functionality of any of the modules or processing units of the system 800 as well as facilitating communication between the system 800 and the workstation 810.
[0007] System 800 is shown having been placed in communication with the workstation 810. A computer case of the workstation houses various components such as a motherboard with a processor and memory, a network card, a video card, a hard drive capable of reading/writing to the machine-readable media 811 such as a floppy disk, optical disk, CD-ROM, DVD, magnetic tape, and the like, and other software and hardware needed to perform the functionality of a computer workstation. The workstation further includes a display device 812, such as a CRT, LCD, or touch screen device, for displaying information, images, view angles, and the like. A user can view any of that information and make a selection from menu options displayed thereon. Keyboard 813 and mouse 814 effectuate a user input. It should be appreciated that the workstation has an operating system and other specialized software configured to display alphanumeric values, menus, scroll bars, dials, slidable bars, pull-down options, selectable buttons, and the like, for entering, selecting, modifying, and accepting information needed for processing in accordance with the teachings hereof. The workstation is further enabled to display thermal images and the presence of the abnormal heat patterns on the skin of the human subject 100. A user or technician may use the user interface of the workstation to obtain the region of interest in the thermal image of the subject from at least one of (i) the user or (ii) through the automated segmentation technique, extract the plurality of numeric feature values associated with the region of interest of the thermal image using at least one of the image processing technique or the mathematical analysis, generate the probability score that indicates a presence of the abnormal heat patterns on the skin of the human subject 100 by analyzing the plurality of numeric features using the first machine learning model, and generate the output report with the probability score and a label, derived using the probability score associated with the abnormal heat patterns which indicates the presence of live parasitic worms, as needed or as desired, depending on the implementation. Any of these selections or inputs may be stored/retrieved to storage device 811. Default settings can be retrieved from the storage device. A user of the workstation is also able to view or manipulate any of the data in the patient records, collectively at 815, stored in database 816. Any of the received images, results, determined view angle, and the like, may be stored to a storage device internal to the workstation 810. Although shown as a desktop computer, the workstation can be a laptop, mainframe, or a special purpose computer such as an ASIC, circuit, or the like.
[0008] Any of the components of the workstation may be placed in communication with any of the modules and processing units of the system 800. Any of the modules of the system 800 can be placed in communication with the storage devices 805, 816 and 202 and/or computer-readable media 811 and may store/retrieve therefrom data, variables, records, parameters, functions, and/or machine-readable/executable program instructions, as needed to perform their intended functions. Each of the modules of the system 800 may be placed in communication with one or more remote devices over network 817. It should be appreciated that some or all of the functionality performed by any of the modules or processing units of the system 800 can be performed, in whole or in part, by the workstation. The embodiment shown is illustrative and should not be viewed as limiting the scope of the appended claims strictly to that configuration. Various modules may designate one or more components which may, in turn, comprise software and/or hardware designed to perform the intended function.
[0009] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope.
| # | Name | Date |
|---|---|---|
| 1 | 202241018284-STATEMENT OF UNDERTAKING (FORM 3) [29-03-2022(online)].pdf | 2022-03-29 |
| 2 | 202241018284-POWER OF AUTHORITY [29-03-2022(online)].pdf | 2022-03-29 |
| 3 | 202241018284-FORM FOR STARTUP [29-03-2022(online)].pdf | 2022-03-29 |
| 4 | 202241018284-FORM FOR SMALL ENTITY(FORM-28) [29-03-2022(online)].pdf | 2022-03-29 |
| 5 | 202241018284-FORM 1 [29-03-2022(online)].pdf | 2022-03-29 |
| 6 | 202241018284-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [29-03-2022(online)].pdf | 2022-03-29 |
| 7 | 202241018284-EVIDENCE FOR REGISTRATION UNDER SSI [29-03-2022(online)].pdf | 2022-03-29 |
| 8 | 202241018284-DRAWINGS [29-03-2022(online)].pdf | 2022-03-29 |
| 9 | 202241018284-DECLARATION OF INVENTORSHIP (FORM 5) [29-03-2022(online)].pdf | 2022-03-29 |
| 10 | 202241018284-COMPLETE SPECIFICATION [29-03-2022(online)].pdf | 2022-03-29 |
| 11 | 202241018284-Proof of Right [25-04-2022(online)].pdf | 2022-04-25 |
| 12 | 202241018284-FORM-9 [08-06-2022(online)].pdf | 2022-06-08 |
| 13 | 202241018284-STARTUP [10-06-2022(online)].pdf | 2022-06-10 |
| 14 | 202241018284-FORM28 [10-06-2022(online)].pdf | 2022-06-10 |
| 15 | 202241018284-FORM 18A [10-06-2022(online)].pdf | 2022-06-10 |
| 16 | 202241018284-FER.pdf | 2022-07-04 |
| 17 | 202241018284-OTHERS [22-11-2022(online)].pdf | 2022-11-22 |
| 18 | 202241018284-FER_SER_REPLY [22-11-2022(online)].pdf | 2022-11-22 |
| 19 | 202241018284-DRAWING [22-11-2022(online)].pdf | 2022-11-22 |
| 20 | 202241018284-CORRESPONDENCE [22-11-2022(online)].pdf | 2022-11-22 |
| 21 | 202241018284-COMPLETE SPECIFICATION [22-11-2022(online)].pdf | 2022-11-22 |
| 22 | 202241018284-CLAIMS [22-11-2022(online)].pdf | 2022-11-22 |
| 23 | 202241018284-US(14)-HearingNotice-(HearingDate-11-05-2023).pdf | 2023-04-11 |
| 24 | 202241018284-Request Letter-Correspondence [11-04-2023(online)].pdf | 2023-04-11 |
| 25 | 202241018284-Power of Attorney [11-04-2023(online)].pdf | 2023-04-11 |
| 26 | 202241018284-FORM28 [11-04-2023(online)].pdf | 2023-04-11 |
| 27 | 202241018284-Form 1 (Submitted on date of filing) [11-04-2023(online)].pdf | 2023-04-11 |
| 28 | 202241018284-Covering Letter [11-04-2023(online)].pdf | 2023-04-11 |
| 29 | 202241018284-Correspondence to notify the Controller [27-04-2023(online)].pdf | 2023-04-27 |
| 30 | 202241018284-FORM-26 [30-04-2023(online)].pdf | 2023-04-30 |
| 31 | 202241018284-Correspondence to notify the Controller [11-05-2023(online)].pdf | 2023-05-11 |
| 32 | 202241018284-Annexure [11-05-2023(online)].pdf | 2023-05-11 |
| 33 | 202241018284-Written submissions and relevant documents [22-05-2023(online)].pdf | 2023-05-22 |
| 34 | 202241018284-FORM 3 [24-07-2023(online)].pdf | 2023-07-24 |
| 35 | 202241018284-FORM 3 [25-08-2023(online)].pdf | 2023-08-25 |
| 36 | 202241018284-PatentCertificate01-09-2023.pdf | 2023-09-01 |
| 37 | 202241018284-IntimationOfGrant01-09-2023.pdf | 2023-09-01 |
| 1 | searchE_04-07-2022.pdf | 2022-07-04 |