Abstract: A system (30) and method (400) for measuring one or more entities in a reproductive organ (10) of a patient, using a three dimensional (3D) transvaginal ultrasound are presented. The method includes generating (412) prediction labels for the entities in the 3D image data of the patient and localizing (414) the entities in the 3D image data based on the prediction labels to obtain localized data. Further, the method includes performing (416) structural feature enhancement of the entities in the 3D image data based on plateness feature extraction to derive a plateness map for the entities and generating (418) a geometric active contour for the one or more entities using the localized data and the plateness map. In addition, the method includes measuring (420) and displaying (422) at least one medically significant feature of the entities indicative of health of the reproductive organ based on the geometric active contour.
BACKGROUND
[0001] Embodiments of the present specification relate to generating an image
of one or more entities in a reproductive organ using three-dimensional (3D)
transvaginal ultrasound, and more particularly to measuring at least one significant
feature of the one or more entities to determine health of the reproductive organ.
[0002] The 3D transvaginal ultrasound is used to image the female
reproductive organ, i.e., the uterus for testing several gynecology related functions
such as implantation, gestation, labor, and child delivery, with a focus on imaging of
fetal anatomy during pregnancy. Anatomically, the uterus consists of three tissue
layers: an inner lining called the endometrium, a middle muscular layer known as
myometrium, and an outer layer known as the perimetrium. The endometrium
maintains patency of the uterine cavity by preventing adhesion between the
myometrium walls. In common medical practice, morphological deformations of the
endometrium are often referred to as uterine abnormalities.
[0003] Besides pregnancy tracking, transvaginal ultrasound, including 3D
transvaginal ultrasound, is performed to detect cysts, fibroid tumors, or other growths
in the region around the uterus. Transvaginal ultrasound is also performed when there is
abnormal vaginal bleeding and menstrual problems, as well as for certain types of
infertility, ectopic pregnancy, and pelvic pain in a patient.
[0004] Though 3D transvaginal ultrasound has provided the ability to image a
spatial volume, the shape of the reproductive organ and its different entities continue
to pose a challenge in accurately extracting medically relevant information.
BRIEF DESCRIPTION
[0005] In one aspect, a method for measuring one or more entities in a
reproductive organ of a patient, using a three dimensional (3D) transvaginal ultrasound
is disclosed. The method includes generating prediction labels for the entities in the
3D image data of the patient. The method further includes localizing the entities in
the 3D image data based on the prediction labels to obtain localized data. Further, the
method includes performing structural feature enhancement of the entities in the 3D
image data based on plateness feature extraction to derive a plateness map for the
entities. Moreover, the method includes generating a geometric active contour for the
one or more entities using the localized data and the plateness map. In addition, the
method includes measuring at least one medically significant feature of the entities
indicative of health of the reproductive organ based on the geometric active contour.
[0006] In another aspect, a system for measuring one or more entities in a
reproductive organ of a patient, using a three dimensional (3D) transvaginal ultrasound
is disclosed. The system includes an image acquisition module for capturing 3D image
data for a reproductive organ of a patient. The system further includes an image
analysis module for processing the 3D image data. In particular, the image analysis
module includes a prediction unit for generating a plurality of prediction labels for the
one or more entities in the 3D image data, an image processing unit configured for
localizing the entities in the 3D image data based on the prediction labels to obtain
localized data, performing structural feature enhancement of the entities in the 3D
image data based on plateness feature extraction to derive a plateness map for the
entities, generating a geometric active contour for the one or more entities using the
localized data and the plateness map, and measuring at least one medically significant
feature of the entities indicative of health of the reproductive organ based on the
geometric active contour. Moreover, the system includes a display unit for displaying
at least one of the geometric active contour for the one or more entities, the plateness
map, and the at least one medically significant feature of the one or more entities.
DRAWINGS
[0007] These and other features, aspects, and advantages of the present system
and method will become better understood when the following detailed description is
read with reference to the accompanying drawings in which like characters represent
like parts throughout the drawings, wherein:
[0008] FIG. 1 is a representation of 3D image data for a reproductive organ
obtained using 3D transvaginal ultrasound;
[0009] FIGs. 2(a), 2(b), and 2(c) depict a set of images of different stages of
an endometrium of the reproductive organ during a menstrual cycle obtained by using
a 3D transvaginal ultrasound;
[0010] FIG. 3 is a block diagram representation of a system for measuring one
or more entities of the reproductive organ of a patient, in accordance with aspects of
the present specification;
[0011] FIG. 4 is a flow chart representation of some exemplary functions
carried out by a prediction unit and an image processing unit of the system of FIG. 3,
in accordance with aspects of the present specification;
[0012] FIGs. 5(a), 5(b), and 5(c) are representations of images of interim outputs
obtained by implementing steps of FIG. 4, in accordance with aspects of the
present specification;
[0013] FIG. 6 is a flow chart representation of further exemplary functions of
the image processing unit of the system of FIG. 3, in accordance with aspects of the
present specification;
[0014] FIGs. 7(a), 7(b), and 7(c) are representations of images obtained after
processing by steps of FIG. 6;
[0015] FIG. 8 is a flow chart representation of further exemplary functions of
the image processing unit of the system of FIG. 3 for obtaining measurements for the
one or more entities of the reproductive organ, in accordance with aspects of the
present specification;
[0016] FIG. 9(a), FIG. 9(b), FIG. 9(c), and FIG. 9(d) are representations of
images obtained after processing by the steps of FIG. 8; and
[0017] FIG. 10 is a flowchart representation of a method for measuring one or
more entities of a reproductive organ of a patient, in accordance with aspects of the
present specification.
DETAILED DESCRIPTION
[0018] The systems and methods described herein relate to delineation of
entities such as the endometrium from a three-dimensional (3D) image of a female
reproductive organ obtained using 3D transvaginal ultrasound. The systems and
methods advantageously allow for more accurate measurements related to such
entities, which in turn is useful in assessing the health of the female reproductive
organ.
[0019] FIG. 1 is an image 10 that depicts one or more entities of the female
reproductive organ, such as the uterus obtained using a 3D transvaginal ultrasound
imaging system. The entities include an inner lining, i.e., the endometrium 12, a
middle muscular layer 14, i.e., the myometrium, and an outer layer 16 i.e., the
perimetrium. A blow-up view of the endometrium 12 is also shown in FIG. 1. The
endometrium 12 includes three primary tissue layers – the functional layer 18, which
is periodically shed and regrown during the menstrual cycle; the basal layer 20 which
is adjacent to the myometrium 14, and finally the uterine cavity 22, also known as the
endometrium cavity, which is located at the level of the mid-coronal plane of
the reproductive organ. As will be appreciated, the mid-coronal plane transects a
standing human body into two halves, front and back (anterior and posterior), along an
imaginary line that cuts through both shoulders. Depending on the menstrual phase,
the appearance of the endometrium 12 in the images taken through 3D transvaginal
ultrasound ranges from the presence of three bright lines to a hyperechoic structure. A
hyperechoic structure denotes a region in an ultrasound image in which the echoes are
stronger than normal or than that of surrounding structures. FIGs. 2(a), 2(b), and 2(c)
represent a set of images 24, 26, 28 of different stages of the endometrium 12 during
the menstrual cycle obtained using the 3D transvaginal ultrasound.
[0020] FIG. 3 is a block diagram representation of a system 30 for measuring
one or more entities of a female reproductive organ using 3D transvaginal ultrasound.
An image acquisition module 32 is used for data acquisition of a spatial volume,
referred herein generally as image data or 3D image data, of the female reproductive
organ using a 3D or a two-dimensional (2D) ultrasound probe. An image analysis
module 34 is used for processing of the acquired image data to render a segmented
image showing medically significant features of the one or more entities of the female
reproductive organ, and measurement of the medically significant features, as shown
by block 42.
[0021] In one implementation, the endometrium is identified as the entity of
interest among the one or more entities. Further, in one example, one of the
medically significant features is a thickness of the endometrium. In another example,
the medically significant feature is a volume of the endometrium. In some
embodiments, both thickness and volume of the endometrium are the medically
significant features. In yet another example, the medically significant feature is a
pattern of lines of the endometrium that is indicative of a phase of the menstrual cycle
of the reproductive organ. All these measurements are in turn used to determine the
health of the reproductive organ of the patient, and used for treatment planning to
restore the health of the reproductive organ, in cases where the segmented image or
the measurements indicate that the reproductive organ is not healthy.
[0022] As shown in FIG. 3, the image analysis module 34 includes a prediction
unit 36, an image processing unit 38, and a display unit 40. The functionalities of the
prediction unit 36 and the image processing unit 38 are described in greater detail with
reference to FIGs. 4-9. The display unit 40 is used for displaying the different outputs
of the image processing unit 38, that include the segmented image having one or more
entities of the reproductive organ and the measurements related to the one or more
entities as shown by block 42. In some embodiments, the display unit 40 is
implemented as a graphical user interface of a computing device, and in some
implementations, the graphical user interface is implemented as a communicating
device, capable of communicating with other devices and systems. A processor 44
and a memory 46 are provided as tangible devices to implement the functionalities of
the prediction unit 36, the image processing unit 38, and display unit 40 of the image
analysis module 34.
[0023] It would be appreciated by those skilled in the art that the image
acquisition module 32 includes a set of instructions executable by the processor 44 to
provide the functionality for receiving the 3D image data. In another embodiment, the
image acquisition module 32 is stored in the memory 46 and is accessible and
executable by the processor 44. In either embodiment, the image acquisition module
32 is adapted for communication and cooperation with the processor 44 and other
components of the image analysis module 34.
[0024] Similarly, it would be appreciated by those skilled in the art that the
image analysis module 34 includes a set of instructions executable by the processor 44
to provide the functionality of the prediction unit 36, the image processing unit 38, and
display unit 40 of the image analysis module 34. In another embodiment, the image
analysis module 34 is stored in the memory 46 and is accessible and executable by the
processor 44.
[0025] Turning now to FIG. 4, a flowchart 48 representing some of the
functionalities of the prediction unit 36 and the image processing unit 38 of FIG. 3 is
presented. As indicated by block 52, a training set of images is processed via a deep
learning methodology using a convolution neural network. The training set of images
includes historic 3D image data 50. Also, the training set of images is pre-annotated
for one or more entities using reference labels. The pre-annotation is achieved via the
deep learning methodology using the convolution neural network. In particular, model
parameters for a neural network model are obtained and learnt using historic 3D image
data. These model parameters are used to indicate a higher probability of presence of
one or more entities in the 3D image data of the patient. Thus, the output of the
application of the convolution neural network is a convolution neural network model
54 for the one or more entities of the reproductive organ.
[0026] Furthermore, at step 56, the convolution neural network model 54 is
applied on 3D image data 60 of the reproductive organ of a patient, to identify regions
of high probability that indicate the presence of the one or more entities in the 3D image
data 60 of the patient. These regions of high probability are tagged as prediction labels
58.
[0027] The prediction labels 58 are used to identify the one or more entities,
and more specifically to provide an initial estimate of a location of the one or more
entities. It may be worthwhile to note that the one or more entities are identified
through this technique in 2D slices of the 3D volume of the 3D image data 60.
Furthermore, at step 62, the prediction labels 58 are applied on the 3D image data 60
to identify localized data 64 in one or more 2D slices of the 3D image data 60. The
localized data 64 includes the identified regions of the one or more entities.
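One way to picture the localization step is the following minimal sketch. It is an assumption rather than the actual implementation: it treats the prediction labels as a per-voxel probability map and crops the 3D image data to the bounding box of the high-probability region; the threshold value is illustrative.

```python
import numpy as np

def localize_entity(prob_map, threshold=0.5):
    """Return the bounding box of voxels whose predicted probability
    exceeds the threshold -- a coarse localization of the entity."""
    mask = prob_map > threshold
    if not mask.any():
        return None
    coords = np.argwhere(mask)           # (N, 3) indices of high-probability voxels
    lo = coords.min(axis=0)              # lowest index along each axis
    hi = coords.max(axis=0) + 1          # exclusive upper bound along each axis
    return tuple(slice(a, b) for a, b in zip(lo, hi))

# Toy 3D probability volume with one high-probability region
vol = np.zeros((8, 8, 8))
vol[2:5, 3:6, 1:4] = 0.9
box = localize_entity(vol)
localized = vol[box]                     # cropped "localized data"
```

In this sketch the localized data is simply the cropped sub-volume; the actual system works on 2D slices of the 3D volume as described above.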
[0028] FIGs. 5(a), 5(b), and 5(c) are representations of images as progressive
outputs 72, 74, 76 having the localized data obtained by the processing steps of FIG.
4. It may be appreciated by those skilled in the art that the prediction labels 58 as
described hereinabove provide a coarse segmentation for the one or more entities of
interest as shown in the images 72, 74, and 76. In some implementations, it may
suffice to use outputs of FIGs. 5(a), 5(b), and 5(c) to determine the health of the
reproductive organ.
[0029] However, aspects of the present system and method advantageously
provide steps for further processing of the outputs 72, 74, and 76 shown in FIGs. 5(a),
5(b), and 5(c) by the image processing unit 38, as described with reference to FIG. 6.
This processing leads to a more refined segmentation and delineation of contours of
the one or more entities, thereby resulting in greater precision in measurements. The
processing steps described hereinbelow allow structural feature enhancement of the
one or more entities in the 3D image data based on plateness feature extraction by
utilizing the geometric aspects of the functional and basal layers of the endometrium
that represent sheets or plate-like structures in the 3D image data.
[0030] Referring now to FIG. 6, a flowchart 78 depicts processing that uses 3D
image data 80 as an input. The 3D image data 80 is processed for enhancing plate-like structures
in the 3D image data 80, as shown by block 82. Subsequently, as shown by block 84,
noise is suppressed to extract the plate-like structures shown by block 86. Further,
attribute filtering is performed as shown by block 88, to generate a plateness map 90
representing an enhanced image for one or more entities.
[0031] The 3D image data 80 used as input and some example outputs of the
processing steps of FIG. 6 are shown in representative images FIGs. 7(a), 7(b), and
7(c). FIG. 7(a) is representative of the input 3D image data 80 of FIG. 6. Also, FIG.
7(b) is an output 91 obtained as shown by block 86. This output 91 has a multi-scale
plate-like structure. Moreover, FIG. 7(c) is an output 92 obtained at step 88, which includes
the plateness map 90 for the one or more entities.
[0032] The processing aspects described hereinabove are derived using a
mathematical basis, where endometrium is used as a non-limiting example for one of
the entities of interest. In one non-limiting implementation, a mathematical
representation of the geometry for different layers of the endometrium is obtained by
analyzing the eigenvalues of a Hessian matrix, and is used to compute local curvature
of one or more entities in the 3D image data.
[0033] In one example, the Hessian matrix of a smoothed 3D image, represented
as f_σ(x) and blurred via a Gaussian function with standard deviation σ, is
symbolically represented as follows:

H_σ(x) = [ ∂²f_σ(x) / ∂x_i ∂x_j ]   (1)

where 1 ≤ i, j ≤ 3 and {x_i} represent the axes of the 3D image.
[0034] Further, in equation (1),

f_σ(x) = f(x) * G_σ(x),  G_σ(x) = (2πσ²)^(−3/2) e^(−|x|²/(2σ²))   (2)

where f(x) denotes the 3D image data and '*' denotes a convolution operator.
[0035] Let |λ₁| ≤ |λ₂| ≤ |λ₃| be the eigenvalue magnitudes corresponding to
the eigenvectors e₁, e₂, e₃ of the Hessian matrix of equation (1), where
λ_i = e_iᵀ H e_i, 1 ≤ i ≤ 3. Using this definition, the unit vector e₃ represents the
direction of the maximum principal curvature, and the curvature magnitude is given
by |λ₃|. Similarly, the direction and magnitude of the least curvature are specified by
e₁ and |λ₁| respectively. By this definition, the eigenvalues corresponding to a bright,
plate-like voxel that represents the one or more entities of interest in the 3D image
data satisfy the following criteria:

|λ₁| ≈ 0, |λ₃| ≫ |λ₁|, |λ₂|, and λ₃ < 0   (3)
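The eigenvalue analysis described above can be pictured with the following sketch. It is a simplified illustration, not the actual implementation: second derivatives are computed with finite differences via np.gradient rather than the Gaussian-derivative formulation, and the eigenvalues are sorted by magnitude so that |λ₁| ≤ |λ₂| ≤ |λ₃|.

```python
import numpy as np

def hessian_eigenvalues(volume):
    """Per-voxel eigenvalues of the 3D Hessian, sorted by magnitude
    so that |lam1| <= |lam2| <= |lam3| as in the plateness criteria."""
    grads = np.gradient(volume)                  # first derivatives along each axis
    H = np.empty(volume.shape + (3, 3))
    for i in range(3):
        second = np.gradient(grads[i])           # derivatives of the i-th gradient
        for j in range(3):
            H[..., i, j] = second[j]
    lam = np.linalg.eigvalsh(H)                  # eigenvalues, ascending by value
    order = np.argsort(np.abs(lam), axis=-1)     # re-sort by magnitude
    return np.take_along_axis(lam, order, axis=-1)

# A bright plate normal to the first axis: lam3 is large in magnitude and negative
vol = np.zeros((9, 9, 9))
vol[4, :, :] = 1.0
lam = hessian_eigenvalues(vol)
l1, l2, l3 = lam[4, 4, 4]
```

At the center of the bright plate the eigenvalues match the criteria of equation (3): λ₁ and λ₂ are near zero while λ₃ is negative and dominant.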
[0036] Now, using the criteria of equation (3), the following functions are
defined:

R_P = |λ₃| / (|λ₁| + |λ₂| + ε)   (4)

R_S = √(λ₁² + λ₂² + λ₃²)   (5)

[0037] The expression in equation (4) enhances plate-like structures and
attenuates other features such as tubes and other artifacts in the 3D image data. The
expression in equation (5) is used to reduce noise. The parameter ε is used to prevent
division by zero.
[0038] A plateness response function at scale σ, used to derive multi-scale
plate-like structures, is then defined as:

R_σ = (1 − e^(−R_P/α)) (1 − e^(−R_S/β)) H(−λ₃)   (6)

[0039] In equation (6), the Heaviside function H may be defined as H(z) = 0 if
z ≤ 0, and H(z) = 1 if z > 0. The first term in equation (6) selectively enhances the
plate-like features, while the second term suppresses noise. The parameters α and β are
tuned to control the influence of the respective terms in equation (6). These parameters
are tuned experimentally, and are set to 0.5 and 10.0 respectively for analyses in one
exemplary implementation.
[0040] Since the thickness of the endometrium envelope may vary between
individuals, a multi-scale analysis using equation (7) is performed, where, if
T = {σ₁, σ₂, …, σ_n} represents a set of scales, the multi-scale plate-like structures
are computed as follows:

R = max_(σ ∈ T) R_σ   (7)
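Under the assumption that the exponential form follows the Frangi-style construction the passage describes, the single-scale response of equations (4)-(6) and the multi-scale maximum of equation (7) can be sketched as follows; the parameter defaults are the illustrative values from the text.

```python
import numpy as np

def plateness_response(lam, alpha=0.5, beta=10.0, eps=1e-9):
    """Single-scale plateness score from magnitude-sorted eigenvalues
    lam[..., 0..2], following equations (4)-(6)."""
    l1, l2, l3 = lam[..., 0], lam[..., 1], lam[..., 2]
    r_p = np.abs(l3) / (np.abs(l1) + np.abs(l2) + eps)  # plate ratio, eq (4)
    r_s = np.sqrt(l1**2 + l2**2 + l3**2)                # structureness, eq (5)
    bright = (l3 < 0).astype(float)                     # Heaviside H(-lam3)
    return (1.0 - np.exp(-r_p / alpha)) * (1.0 - np.exp(-r_s / beta)) * bright

def multiscale_plateness(per_scale_responses):
    """Equation (7): voxel-wise maximum over the per-scale responses."""
    return np.max(np.stack(per_scale_responses), axis=0)

# A bright plate-like voxel scores high; a dark one is gated out by H(-lam3)
score_bright = plateness_response(np.array([0.0, 0.0, -5.0]))
score_dark = plateness_response(np.array([0.0, 0.0, 5.0]))
```

The Heaviside gate ensures that only bright plate-like structures, such as the endometrial layers, survive in the response.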
[0041] However, while equation (7) enhances the basal layer and the uterine
cavity of the endometrium, small neighboring tissues are highlighted as well. To
eliminate such clutter, an area based attribute filtering is used that uses a grayscale
area morphology to remove clutter from the final output. The attribute filtered
plateness map is symbolically obtained and represented as follows:

P = AF[R; a]   (8)

In equation (8), AF[·; a] represents the area based attribute filtering operation of the
image, and a is the area parameter.
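As a simplified stand-in for the area based attribute filtering, the following sketch removes small connected components from a binary 2D mask via flood-fill labeling. This is an assumption for illustration only: the actual filter operates on grayscale images using area morphology, whereas this binary version only captures the "remove small clutter" idea.

```python
import numpy as np
from collections import deque

def area_open(mask, min_area):
    """Binary area opening: keep only 4-connected components of a 2D
    mask with at least min_area pixels (simplified stand-in for AF)."""
    out = np.zeros_like(mask)
    seen = np.zeros(mask.shape, dtype=bool)
    h, w = mask.shape
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not seen[sy, sx]:
                comp, q = [], deque([(sy, sx)])   # flood fill one component
                seen[sy, sx] = True
                while q:
                    y, x = q.popleft()
                    comp.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if len(comp) >= min_area:         # keep only large components
                    for y, x in comp:
                        out[y, x] = 1
    return out

# A 3x3 blob survives; an isolated stray pixel (clutter) is removed
mask = np.zeros((6, 6), dtype=int)
mask[1:4, 1:4] = 1
mask[5, 5] = 1
filtered = area_open(mask, min_area=4)
```

Applied to a thresholded multi-scale response, this removes the small neighboring-tissue clutter described above.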
[0042] Moreover, further processing as described with reference to FIG. 8 may
be performed using an active contour (commonly known as snake) formulation called
Plateness Guided Snake to obtain a segmented geometric active contour representing
the endometrium. The Plateness Guided Snake may be expressed mathematically as
a curve propagation partial differential equation given below:

∂C/∂t = F_ext n   (9)
[0043] In equation (9), C represents a parametric two-dimensional curve, and
n denotes the curve normal. The time for curve propagation is represented by t. F_ext
represents the image external force, which is derived from the plateness map defined
in equation (8), and this force defines the curve evolution procedure. The Plateness
Guided Snake uses the plateness map in equation (9) and localized data, such as the
localized data 64 of FIG. 4, to perform a more refined segmentation of the endometrium
on the sagittal slice.
[0044] Numerically, equation (9) is implemented using level set theory, where
the propagating curve is represented as the zero level set of a level set function φ.
Following general level set theory, the parametric curve evolution equation is
reformulated in terms of the level set function as follows:

∂φ/∂t = F_ext |∇φ|,  φ(t = 0) = φ₀   (10)

[0045] In equation (10), φ₀ represents an initialization of the level set function.
On convergence, the segmented endometrium is obtained by extracting the pixels
enclosed by the zero level set of φ. The segmented endometrium is referred to herein as
the geometric active contour of the endometrium, and is representative of the refined
segmented output of the processing steps.
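A minimal explicit finite-difference sketch of the level set update in equation (10) follows. The uniform speed field, grid size, and step size are illustrative assumptions, standing in for the plateness-derived external force of the actual method.

```python
import numpy as np

def level_set_step(phi, speed, dt=0.1):
    """One explicit update of d(phi)/dt = F * |grad(phi)| (equation (10))."""
    gy, gx = np.gradient(phi)
    grad_mag = np.sqrt(gx**2 + gy**2)
    return phi + dt * speed * grad_mag

# Signed-distance-like initialization: a small circle (phi < 0 inside)
yy, xx = np.mgrid[0:32, 0:32]
phi = np.sqrt((yy - 16.0)**2 + (xx - 16.0)**2) - 5.0
speed = -np.ones_like(phi)               # toy uniform force: contour expands

area_before = int((phi < 0).sum())       # pixels enclosed by the zero level set
for _ in range(10):
    phi = level_set_step(phi, speed)
area_after = int((phi < 0).sum())
```

On convergence of the real scheme, the segmentation is read off as the pixels where φ < 0, i.e., those enclosed by the zero level set.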
[0046] Referring now to FIG. 8, a flowchart 100 representing the processing
via use of equations (1)-(10) is presented. In the method 100, the plateness map 90 (see FIG. 6)
and the localized data 64 (see FIG. 4) are used as inputs to determine a plateness guided
contour (see equation (9)), as depicted by block 102. Furthermore, at block 104,
plateness guided contour is further processed by using adaptive curve evolution (see
equation (10)), to obtain a segmented geometric active contour 106. The segmented
geometric active contour may also be referred to as a segmented endometrium.
[0047] Moreover, measurements are performed on this geometric active
contour 106, as shown by block 108. In one example, the measurement is performed
by analyzing pixel separation between lower and upper boundaries of the segmented
endometrium. This measurement is then used as an indicator of the health of the
reproductive organ, as shown at block 110. The health indicator 110 may be used for
treatment planning for the patient.
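The boundary-separation measurement of block 108 can be pictured with the following sketch; the column-wise definition and the pixel spacing value are illustrative assumptions rather than the actual measurement procedure.

```python
import numpy as np

def endometrial_thickness(mask, pixel_spacing_mm):
    """Mean separation (in mm) between the upper and lower boundaries
    of a binary segmentation, measured column-by-column."""
    thicknesses = []
    for col in mask.T:                    # iterate over image columns
        rows = np.flatnonzero(col)        # segmented pixel rows in this column
        if rows.size:
            thicknesses.append(rows[-1] - rows[0] + 1)
    return float(np.mean(thicknesses)) * pixel_spacing_mm if thicknesses else 0.0

# Toy segmented band 4 pixels thick at 0.5 mm per pixel
mask = np.zeros((10, 6), dtype=int)
mask[3:7, :] = 1
thickness = endometrial_thickness(mask, pixel_spacing_mm=0.5)
```

Repeating such a measurement across a menstrual cycle gives the thickness trend used as the health indicator at block 110.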
[0048] At least in one example, a thickness of the endometrium is measured
across a menstrual cycle of the patient to determine the health of the reproductive
organ. In one comparative experiment, the thickness of endometrium measured by
using the above technique is 6.8836 mm, compared to an observed value of 6.1 mm
using currently available techniques. In another experiment, the measurement using
the technique described herein is 8.6876 mm, and a corresponding observed value
using the currently available methods is 8 mm.
[0049] FIGs. 9(a), 9(b), 9(c), and 9(d) are diagrammatic representations of
images that show the enhanced segmentation for two example cases. In particular,
images 202 and 204 of FIGs. 9(a) and 9(b) show the segmented endometrium during
the proliferative phase of the endometrium. In a similar fashion, images 206 and 208
of FIGs. 9(c) and 9(d) show the segmented endometrium during the menstrual phase.
Thus, the aspects of the system and method described herein also allow for detecting
the phase of the menstrual cycle of the patient.
[0050] FIG. 10 is a flowchart 400 representing a method for measuring one or
more entities of a reproductive organ of a patient, in accordance with aspects of the
present specification. The method 400 includes obtaining 3D image data of the
reproductive organ of the patient, as indicated by step 410. At step 412, prediction
labels for the one or more entities in the 3D image data are generated for at least one
of a sagittal slice or for multiple slices of the 3D image data, using a training set of
historical 3D image data.
[0051] Further, the one or more entities in the 3D image data are localized
based on the prediction labels to obtain localized data, as shown at step 414.
Moreover, at step 416, structural feature enhancement of the one or more entities in
the 3D image data is performed based on plateness feature extraction to derive a
plateness map for the one or more entities. At step 418, a geometric active contour
representing segmented one or more entities is obtained from the plateness map and
the localized data.
[0052] Subsequently, at step 420, at least one medically significant feature of
the one or more entities that is indicative of the health of the reproductive organ is
derived based on the geometric active contour. As previously noted, in one example,
the medically significant feature is a thickness of the endometrium and is measured
across a menstrual cycle of the patient to determine the health of the reproductive
organ. In another example, volume of endometrium is measured as the medically
significant feature.
[0053] The method 400 also includes displaying at least one of the plateness
map, the geometric active contour, and the at least one medically significant feature
for the one or more entities on a display, as indicated by step 422. In some
embodiments, step 422 also includes communicating the at least one of the plateness
map, the geometric active contour, or the at least one medically significant feature for
the one or more entities to a computing or communicating device, for example a health
monitoring system, a personal computer, or to a storage device such as a server device
for further processing or for diagnosis.
[0054] The hybrid approach described hereinabove, which obtains localized
data and then performs segmentation refinement using the plateness guided snake
together with the plateness map, results in higher accuracy of segmentation.
[0055] The method described herein is also implemented in some
embodiments as a computer program product having a non-transitory computer
readable medium encoding instructions that, in response to execution by at least one
processor, cause the at least one processor to perform operations as described herein
above in relation to the method steps.
[0056] The systems and methods described herein provide an automated
processing technique to automatically delineate the endometrium and other entities in
the female reproductive organ, and to measure the thickness and volume of the entities.
The embodiments further simplify the image processing, reduce operator variability,
and improve the outcome for assisted reproductive medicine, cancer screening in
women with postmenopausal bleeding, and endometrial hyperplasia. In addition, the
easy to use, fast, and automated techniques described hereinabove aid in enhancing
the skill and utilization of ultrasound imaging by less experienced users.
[0057] Further, the embodiments described herein are aimed at simplifying
clinical workflow, especially for less skilled ultrasound operators; improving
accuracy and reproducibility by reducing variability; enhancing productivity by
avoiding multiple examinations and reducing dependency on expert clinicians; and
providing objective and reproducible documentation.
[0058] While only certain features of the invention have been illustrated, and
described herein, many modifications and changes will occur to those skilled in the
art. It is, therefore, to be understood that the appended claims are intended to cover
all such modifications and changes as fall within the true spirit of the invention.

CLAIMS:

1. A method (400) for measuring one or more entities in a reproductive organ
(10) of a patient, using three dimensional (3D) transvaginal ultrasound, the
method comprising:
obtaining (410) 3D image data of the reproductive organ of the patient;
generating (412) a plurality of prediction labels for the one or more
entities in the 3D image data based on a training set of historic 3D image
data;
localizing (414) the one or more entities in the 3D image data based on
the plurality of prediction labels to obtain localized data;
performing (416) structural feature enhancement of the one or more
entities in the 3D image data based on plateness feature extraction to derive
a plateness map for the one or more entities;
generating (418) a geometric active contour for the one or more entities
using the localized data and the plateness map; and
measuring (420) at least one medically significant feature of the one or
more entities indicative of health of the reproductive organ based on the
geometric active contour, wherein the at least one medically significant
feature is used for treatment planning for the patient.
2. The method (400) of claim 1 wherein the localized data is obtained for a
sagittal slice from the 3D image data having the one or more entities.
3. The method (400) of claim 1 wherein the localized data is obtained for a
plurality of slices from the 3D image data.
4. The method (400) of claim 1 wherein the one or more entities comprise an
endometrium (12).
5. The method (400) of claim 4 wherein the at least one medically significant
feature comprises at least one of a thickness of the endometrium (12), a volume
of endometrium (12), or a combination thereof.
6. The method (400) of claim 4 wherein the endometrium (12) comprises a
functional layer (18), a basal layer (20), and an endometrium cavity (22), and
wherein the at least one medically significant feature comprises a measurement
corresponding to at least one of the functional layer (18), the basal layer (20),
and the endometrium cavity (22).
7. The method (400) of claim 1 further comprising determining a phase of
menstrual cycle based on the at least one medically significant feature.
8. The method (400) of claim 1 further comprising displaying at least one of the
plateness map, the geometric active contour, and the at least one medically
significant feature for the one or more entities on a display.
9. The method (400) of claim 1 further comprising communicating the at least
one of the plateness map, the geometric active contour, and the at least one
medically significant feature for the one or more entities for treatment planning
for the patient.
10. A system (30) for measuring one or more entities of a female reproductive
organ (10) of a patient across a menstrual cycle using three dimensional (3D)
transvaginal ultrasound, the system (30) comprising:
an image acquisition module (32) for capturing 3D image data
corresponding to the female reproductive organ of the patient;
an image analysis module (34) for processing the 3D image data,
wherein the image analysis module (34) comprises:
a prediction unit (36) for generating a plurality of prediction labels
for the one or more entities in the 3D image data based on a training set
of historic 3D image data;
an image processing unit (38) configured for:
localizing the one or more entities in the 3D image data
based on the plurality of prediction labels to obtain localized
data;
performing structural feature enhancement of the one or
more entities in the 3D image data based on plateness feature
extraction to derive a plateness map for the one or more entities;
generating a geometric active contour for the one or more
entities using the localized data and the plateness map; and
measuring at least one medically significant feature of the
one or more entities indicative of health of the reproductive
organ based on the geometric active contour, wherein the at
least one medically significant feature is used for treatment
planning for the patient; and
a display unit (40) for displaying (42) at least one of the
geometric active contour for the one or more entities, the
plateness map, and the at least one medically significant feature
of the one or more entities.
11. The system (30) of claim 10 wherein the one or more entities comprise an
endometrium (12).
12. The system (30) of claim 11 wherein the at least one medically significant
feature of the one or more entities comprises a thickness of the endometrium
(12), a volume of the endometrium (12), or a combination thereof.
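The thickness and volume of claim 12 can be derived from a binary segmentation mask once the contour of claim 10 has been converted to one. A minimal sketch, assuming a boolean mask and physical voxel spacing (the function name, the `(dz, dy, dx)` spacing convention, and the inscribed-sphere thickness estimate are all assumptions, not the patent's disclosed method):

```python
import numpy as np
from scipy import ndimage

def endometrium_measurements(mask, spacing):
    """Volume (mm^3) and approximate thickness (mm) of a segmented
    structure, given a boolean 3D mask and voxel spacing (dz, dy, dx).

    Volume is the voxel count times the voxel volume.  Thickness is
    estimated as twice the largest Euclidean distance from an interior
    voxel to the background, i.e. the largest inscribed-sphere diameter.
    """
    voxel_vol = float(np.prod(spacing))
    volume = float(mask.sum()) * voxel_vol
    dist = ndimage.distance_transform_edt(mask, sampling=spacing)
    thickness = 2.0 * float(dist.max())
    return volume, thickness
```

For a 5-voxel-thick slab at unit spacing this returns a volume of 2000 mm³ and a thickness of 6 mm; the voxel-center distance transform overshoots the 5 mm slab slightly, a known discretization bias of this estimate.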
13. The system (30) of claim 11 wherein the endometrium (12) comprises a
functional layer (18), a basal layer (20), and an endometrium cavity (22), and
wherein the at least one medically significant feature comprises a measurement
corresponding to at least one of the functional layer (18), the basal layer (20),
and the endometrium cavity (22).
14. The system (30) of claim 10 wherein the image processing unit (38) is further
configured for determining a phase of the menstrual cycle based on the at least
one medically significant feature.
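Claim 14 maps the measured feature to a menstrual-cycle phase. A deliberately simplified sketch of such a mapping is shown below; the thickness cut-off values are illustrative placeholders (the claim does not disclose any thresholds, and these numbers are not clinical guidance):

```python
def phase_from_thickness(thickness_mm):
    """Map an endometrial thickness measurement to a coarse cycle
    phase.  Cut-off values are illustrative placeholders only."""
    if thickness_mm < 5.0:
        return "menstrual/early proliferative"
    if thickness_mm < 8.0:
        return "late proliferative"
    return "secretory"
```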
15. The system (30) of claim 10 wherein the localized data is obtained for at least
one of a sagittal slice from the 3D image data having the one or more entities
and a plurality of slices from the 3D image data.
| # | Name | Date |
|---|---|---|
| 1 | 201641037945-ASSIGNMENT WITH VERIFIED COPY [18-03-2025(online)].pdf | 2025-03-18 |
| 2 | 201641037945-FORM-16 [18-03-2025(online)].pdf | 2025-03-18 |
| 3 | 201641037945-POWER OF AUTHORITY [18-03-2025(online)].pdf | 2025-03-18 |
| 4 | 201641037945-PatentCertificate27-03-2024.pdf | 2024-03-27 |
| 5 | 201641037945-IntimationOfGrant27-03-2024.pdf | 2024-03-27 |
| 6 | 201641037945-Written submissions and relevant documents [05-03-2024(online)].pdf | 2024-03-05 |
| 7 | 201641037945-FORM-26 [20-02-2024(online)].pdf | 2024-02-20 |
| 8 | 201641037945-Correspondence to notify the Controller [19-02-2024(online)].pdf | 2024-02-19 |
| 9 | 201641037945-US(14)-HearingNotice-(HearingDate-23-02-2024).pdf | 2024-01-22 |
| 10 | 201641037945-FER.pdf | 2021-10-17 |
| 11 | 201641037945-ABSTRACT [23-08-2021(online)].pdf | 2021-08-23 |
| 12 | 201641037945-CLAIMS [23-08-2021(online)].pdf | 2021-08-23 |
| 13 | 201641037945-COMPLETE SPECIFICATION [23-08-2021(online)].pdf | 2021-08-23 |
| 14 | 201641037945-Covering Letter [23-08-2021(online)].pdf | 2021-08-23 |
| 15 | 201641037945-DRAWING [23-08-2021(online)].pdf | 2021-08-23 |
| 16 | 201641037945-FER_SER_REPLY [23-08-2021(online)].pdf | 2021-08-23 |
| 17 | 201641037945-OTHERS [23-08-2021(online)].pdf | 2021-08-23 |
| 18 | 201641037945-PETITION u-r 6(6) [23-08-2021(online)].pdf | 2021-08-23 |
| 19 | 201641037945-FORM 13 [30-07-2021(online)].pdf | 2021-07-30 |
| 20 | 201641037945-FORM-26 [30-07-2021(online)].pdf | 2021-07-30 |
| 21 | 201641037945-POA [30-07-2021(online)].pdf | 2021-07-30 |
| 22 | 2021-02-1916-19-47E_19-02-2021.pdf | 2021-02-19 |
| 23 | 201641037945-FORM 13 [13-02-2020(online)].pdf | 2020-02-13 |
| 24 | 201641037945-FORM 13 [13-02-2020(online)]-1.pdf | 2020-02-13 |
| 25 | 201641037945-RELEVANT DOCUMENTS [13-02-2020(online)].pdf | 2020-02-13 |
| 26 | 201641037945-RELEVANT DOCUMENTS [13-02-2020(online)]-1.pdf | 2020-02-13 |
| 27 | Correspondence by Applicant_Petition Under Rule 137_27-10-2017.pdf | 2017-10-27 |
| 28 | 201641037945-PETITION UNDER RULE 137 [24-10-2017(online)].pdf | 2017-10-24 |
| 29 | Form30_Proof of Right Under Section7(2)_26-12-2016.pdf | 2016-12-26 |
| 30 | Correspondence by Agent_Form5 Proof of Right_26-12-2016.pdf | 2016-12-26 |
| 31 | Form30_Proof of Rightt_20-12-2016.pdf | 2016-12-20 |
| 32 | Form26_Power of Attorney_20-12-2016.pdf | 2016-12-20 |
| 33 | Form5_After PS_07-12-2016.pdf | 2016-12-07 |
| 34 | Form18_Normal Request_07-12-2016.pdf | 2016-12-07 |
| 35 | Drawing_After PS_07-12-2016.pdf | 2016-12-07 |
| 36 | Description Complete_As Filed_07-12-2016.pdf | 2016-12-07 |
| 37 | Claims_After PS_07-12-2016.pdf | 2016-12-07 |
| 38 | Abstract_After PS_07-12-2016.pdf | 2016-12-07 |
| 39 | Drawing_As Filed_07-11-2016.pdf | 2016-11-07 |
| 40 | Description Provisional_As Filed_07-11-2016.pdf | 2016-11-07 |
| 41 | Abstract_As Filed_07-11-2016.pdf | 2016-11-07 |