
Systems And Methods For Classification Of An Object

Abstract: A method includes receiving a training image comprising a labeled object and generating a labeled weight matrix corresponding to the labeled object based on one or more features of the labeled object using a neural network. The method further includes receiving a new image comprising an unlabeled object from an imaging device and generating an unlabeled weight matrix corresponding to the unlabeled object based on one or more features of the unlabeled object using the neural network. The method further includes calculating a Hausdorff distance between the unlabeled weight matrix and the labeled weight matrix and classifying the unlabeled object based on the Hausdorff distance.


Patent Information

Application #: 2433/CHE/2015
Filing Date: 13 May 2015
Publication Number: 48/2016
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status: Granted
Email: GEHC_IN_IP-docketroom@ge.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2023-07-26
Renewal Date:

Applicants

General Electric Company
1 River Road, Schenectady, New York 12345, USA

Inventors

1. VAIDYA, VIVEK PRABHAKAR
122, EPIP Phase 2, Hoodi Village, Whitefield Road, Bangalore 560066, INDIA
2. SREEKUMARI, ARATHI
122, EPIP Phase 2, Hoodi Village, Whitefield Road, Bangalore 560066, INDIA
3. SUNDARARAJAN, RAMASUBRAMANIAN GANGAIKONDAN
113, Ranka Court Apartments 18, Cambridge Layout, Ulsoor Bangalore 560008, INDIA
4. ZHAO, FEI
675 Tasman Dr. APT 2212, Sunnyvale, CA 94089 USA

Specification

CLAIMS

1. A method comprising:
receiving a training image comprising a labeled object;
generating a labeled weight matrix corresponding to the labeled object based on one or more features of the labeled object using a neural network;
receiving a new image comprising an unlabeled object from an imaging device;
generating an unlabeled weight matrix corresponding to the unlabeled object based on one or more features of the unlabeled object using the neural network;
calculating a Hausdorff distance between the unlabeled weight matrix and the labeled weight matrix; and
classifying the unlabeled object based on the Hausdorff distance.

2. The method of claim 1, wherein the labeled object comprises one of a duct, an artifact, a cyst, and a lesion.

3. The method of claim 1, wherein the neural network comprises a Restricted Boltzmann Machine.

4. The method of claim 1, wherein the labeled weight matrix represents a transformation of the one or more features of the labeled object between one or more visible units and one or more hidden units of the neural network.

5. The method of claim 1, wherein the unlabeled weight matrix represents a transformation of the one or more features of the unlabeled object between one or more visible units and one or more hidden units of the neural network.

6. The method of claim 1, wherein calculating the Hausdorff distance further comprises:
computing one or more cosine distance vectors between one or more rows of the unlabeled weight matrix and one or more rows of the labeled weight matrix;
determining a minimum distance value from each of the one or more cosine distance vectors; and
calculating the Hausdorff distance between the unlabeled weight matrix and the labeled weight matrix based on the one or more minimum distance values.

7. The method of claim 1, further comprising:
receiving a plurality of training images comprising a plurality of labeled objects;
generating a plurality of labeled weight matrices corresponding to the plurality of labeled objects based on one or more features of each of the plurality of labeled objects using the neural network;
calculating a plurality of Hausdorff distances between the unlabeled weight matrix and each of the plurality of labeled weight matrices; and
classifying the unlabeled object as one of the plurality of labeled objects based on the plurality of Hausdorff distances.

8. The method of claim 1, further comprising sending a notification including the classification of the unlabeled object to a user.

9. A system comprising:
at least one processor;
a communication unit configured to receive a training image comprising a labeled object and to receive a new image comprising an unlabeled object from an imaging device;
a neural network unit communicatively coupled to the communication unit and configured to generate a labeled weight matrix corresponding to the labeled object based on one or more features of the labeled object and generate an unlabeled weight matrix corresponding to the unlabeled object based on one or more features of the unlabeled object using a neural network;
a comparison unit coupled to the neural network unit and configured to calculate a Hausdorff distance between the unlabeled weight matrix and the labeled weight matrix; and
a classification unit communicatively coupled to the comparison unit and configured to classify the unlabeled object based on the Hausdorff distance.

10. The system of claim 9, wherein the comparison unit is further configured to:
compute one or more cosine distance vectors between one or more rows of the unlabeled weight matrix and one or more rows of the labeled weight matrix;
determine a minimum distance value from each of the one or more cosine distance vectors; and
calculate the Hausdorff distance between the unlabeled weight matrix and the labeled weight matrix based on the one or more minimum distance values.

11. The system of claim 9, wherein the communication unit is further configured to receive a plurality of training images comprising a plurality of labeled objects.

12. The system of claim 11, wherein the neural network unit is further configured to generate a plurality of labeled weight matrices corresponding to the plurality of labeled objects based on one or more features of each of the plurality of labeled objects using the neural network.

13. The system of claim 12, wherein the comparison unit is further configured to calculate a plurality of Hausdorff distances between the unlabeled weight matrix and each of the plurality of labeled weight matrices.

14. The system of claim 13, wherein the classification unit is configured to classify the unlabeled object as one of the plurality of labeled objects based on the plurality of Hausdorff distances.
BACKGROUND
[0001] The technology disclosed herein generally relates to classification of an object in an image. More specifically, the subject matter relates to systems and methods for classification of an anatomical object in an image using a neural network.
[0002] The classification of one or more objects, such as a lesion, an organ, a cyst, a duct, an artifact, and the like, in images is a challenge in providing robust medical diagnostics and accelerating healthcare screening workflows. Current methods for classifying such objects have numerous problems, for example, erroneous classifications due to similarities amongst different types of objects (e.g., a cyst and a lesion) or different sub-types of an object (e.g., a malignant lesion and a benign lesion). Furthermore, current methods tend to inaccurately classify rare sub-types of an object whose features differ from those of the common sub-types. For example, current methods may inaccurately classify a rare type of malignant lesion whose morphological features are significantly different from those of common types of malignant lesions. As will be appreciated, such inaccurate classifications of objects may lead to, for example, erroneous diagnosis and treatment methods for a patient.
BRIEF DESCRIPTION
[0003] In accordance with one aspect of the present technique, a method includes receiving a training image comprising a labeled object and generating a labeled weight matrix corresponding to the labeled object based on one or more features of the labeled object using a neural network. The method further includes receiving a new image comprising an unlabeled object from an imaging device and generating an unlabeled weight matrix corresponding to the unlabeled object based on one or more features of the unlabeled object using the neural network. The method further includes calculating a Hausdorff distance between the unlabeled weight matrix and the labeled weight matrix and classifying the unlabeled object based on the Hausdorff distance.
[0004] In accordance with another aspect of the present technique, a system includes a communication unit configured to receive a training image comprising a labeled object and to receive a new image comprising an unlabeled object from an imaging device. The system further includes a neural network unit configured to generate a labeled weight matrix corresponding to the labeled object based on one or more features of the labeled object and generate an unlabeled weight matrix corresponding to the unlabeled object based on one or more features of the unlabeled object using a neural network. The system further includes a comparison unit configured to calculate a Hausdorff distance between the unlabeled weight matrix and the labeled weight matrix. The system further includes a classification unit configured to classify the unlabeled object based on the Hausdorff distance.
DRAWINGS
[0005] These and other features, aspects, and advantages of the present invention will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
[0006] FIG. 1 is a block diagram illustrating an exemplary system for classification of an object in an image in accordance with aspects of the present technique;
[0007] FIG. 2 is a schematic representation of an exemplary method for classification of an object in an image in accordance with aspects of the present technique;
[0008] FIG. 3A is a pictorial representation illustrating an exemplary image including an object in accordance with aspects of the present technique;
[0009] FIG. 3B is a pictorial representation illustrating an exemplary classification of an object in accordance with aspects of the present technique;
[0010] FIG. 3C is a pictorial representation illustrating another exemplary image including an object in accordance with aspects of the present technique;
[0011] FIG. 3D is a pictorial representation illustrating another exemplary classification of an object in accordance with aspects of the present technique; and
[0012] FIG. 4 is a flow diagram illustrating an exemplary method for classification of an object in an image in accordance with aspects of the present technique.
DETAILED DESCRIPTION
[0013] In the following specification and the claims, reference will be made to a number of terms, which shall be defined to have the following meanings.
[0014] The singular forms “a”, “an”, and “the” include plural references unless the context clearly dictates otherwise.
[0015] As used herein, the term “non-transitory computer-readable media” is intended to be representative of any tangible computer-based device implemented in any method or technology for short-term and/or long-term storage of information, such as, computer-readable instructions, data structures, program modules and sub-modules, or other data in any device. Therefore, methods described herein may be encoded as executable instructions embodied in a tangible, non-transitory, computer readable medium, including, without limitation, a storage device and/or a memory device. Such instructions, when executed by a processor, cause the processor to perform at least a portion of the methods described herein. Moreover, as used herein, the term “non-transitory computer-readable media” includes all tangible, computer-readable media, including, without limitation, non-transitory computer storage devices, including, without limitation, volatile and non-volatile media, and removable and non-removable media such as a firmware, physical and virtual storage, a compact disc read only memory, a digital versatile disc, and any other digital source such as a network or the Internet, as well as yet to be developed digital means, with the sole exception being a transitory, propagating signal.
[0016] As used herein, the terms “software” and “firmware” are interchangeable, and include any computer program stored in memory for execution by devices that include, without limitation, mobile devices, clusters, personal computers, workstations, clients, and servers.
[0017] As used herein, the term “computer” and related terms, e.g., “computing device”, are not limited to integrated circuits referred to in the art as a computer, but broadly refers to at least one microcontroller, microcomputer, programmable logic controller (PLC), application specific integrated circuit, and other programmable circuits, and these terms are used interchangeably herein throughout the specification.
[0018] Approximating language, as used herein throughout the specification and claims, may be applied to modify any quantitative representation that could permissibly vary without resulting in a change in the basic function to which it is related. Accordingly, a value modified by a term or terms, such as “about” and “substantially,” is not to be limited to the precise value specified. In at least some instances, the approximating language may correspond to the precision of an instrument for measuring the value. Here and throughout the specification and claims, range limitations may be combined and/or interchanged; such ranges are identified and include all the sub-ranges contained therein unless context or language indicates otherwise.
[0019] A system and method for classification of an object in an image is described herein. The object may include any type of anatomical object, for example, a lesion (e.g., a malignant/cancerous lesion, a benign lesion, and the like), an organ (e.g., the heart, the liver, the brain, and the like), a cyst, a duct, an artifact, and the like. FIG. 1 illustrates a block diagram of an exemplary system 100 configured to classify an object in an image according to one embodiment. The system 100 includes an imaging device 110 and a classification device 130 that are communicatively coupled via a network 120. The imaging device 110 and the classification device 130 are communicatively coupled to the network 120 via signal lines 115 and 125, respectively. The signal lines 115 and 125 are provided for illustrative purposes and represent a wired or a wireless link between the network 120 and the imaging device 110 and the classification device 130, respectively. Although the imaging device 110 and the classification device 130 are communicatively coupled via the network 120 according to the embodiment illustrated in FIG. 1, in other embodiments, the classification device 130 may be included within the imaging device 110. Further, although one imaging device 110 and one classification device 130 are coupled to the network 120 in FIG. 1, a plurality of imaging devices 110 and a plurality of classification devices 130 may be coupled to the network 120.
[0020] The imaging device 110 may be a device that is configured to generate one or more new images of a subject (e.g., a human patient) including one or more unlabeled objects. The one or more new images may be two dimensional and/or three dimensional images. The term “unlabeled object” is used herein to indicate that an object in a new image has not been classified as one of the objects, e.g., a cyst, a malignant lesion, and the like. Non-limiting examples of the imaging device 110 include an ultrasound imaging device, a computed tomography imaging device, a positron emission tomography imaging device, an X-ray imaging device, and a magnetic resonance imaging device. Further, the imaging device 110 is configured to send the one or more new images to the classification device 130 via the network 120.
[0021] The network 120 may be a wired or wireless type. Further, the network 120 may have one or more configurations such as a star configuration, a token ring configuration, or other known configurations. Furthermore, the network 120 may include a local area network (LAN), a wide area network (WAN) (e.g., the Internet), and/or any other interconnected data path across which multiple devices may communicate. In one embodiment, the network 120 may be a peer-to-peer network. The network 120 may also be coupled to or include portions of a telecommunication network for sending data in a variety of different communication protocols. In one embodiment, the network 120 may include Bluetooth communication networks or a cellular communications network for sending and receiving data such as via a short messaging service (SMS), a multimedia messaging service (MMS), a hypertext transfer protocol (HTTP), a direct data connection, a wireless application protocol (WAP), an email, or the like. While only one network 120 is shown coupled to the imaging device 110 and the classification device 130, multiple networks 120 may be coupled to the entities.
[0022] The classification device 130 may be configured to classify one or more unlabeled objects in a new image. In the illustrated embodiment, the classification device 130 includes a lesion classifier 140, a processor 180, and a memory 190. Further, the lesion classifier 140 includes a communication unit 145, a feature extractor 150, a neural network unit 155, a comparison unit 160, and a classification unit 165. In certain embodiments, one or more units of the lesion classifier 140, the processor 180, and the memory 190 may be operatively coupled to a bus (not shown) for communication with each other. The one or more units of the lesion classifier 140 may include codes and routines that may be implemented as software, hardware, or a combination of software and hardware.
[0023] The processor 180 may include at least one arithmetic logic unit, microprocessor, general purpose controller or other processor arrays to perform computations, and/or retrieve data stored in the memory 190. In one embodiment, the processor 180 may be a multiple core processor. The processor 180 processes data signals and may include various computing architectures including a complex instruction set computer (CISC) architecture, a reduced instruction set computer (RISC) architecture, or an architecture implementing a combination of instruction sets. In one embodiment, the processing capability of the processor 180 may support retrieval of data and/or transmission of data. In another embodiment, the processing capability of the processor 180 may also perform relatively complex tasks, including various types of feature extraction, modulating, encoding, multiplexing, and the like. It may be noted that other types of processors, operating systems, and physical configurations are also envisioned.
[0024] The memory 190 may be a non-transitory storage medium. For example, the memory 190 may be a dynamic random access memory (DRAM) device, a static random access memory (SRAM) device, flash memory, or other non-transitory storage mediums. The memory 190 may also include a non-volatile memory or similar permanent storage device, and media such as a hard drive, a floppy drive, a compact disc read only memory (CD-ROM) device, a digital versatile disc read only memory (DVD-ROM) device, a digital versatile disc random access memory (DVD-RAM) device, a digital versatile disc rewritable (DVD-RW) device, a flash memory device, or other non-volatile storage devices.
[0025] The memory 190 stores data that may be required by the classification device 130 to perform associated functions. In one embodiment, the memory 190 includes the units of the lesion classifier 140. For example, the memory 190 includes the communication unit 145, the feature extractor 150, and the like. In another embodiment, the memory 190 stores one or more training images of an anatomical subject including one or more labeled objects. The one or more training images may be two dimensional and/or three dimensional images. The term “labeled object” is used herein to indicate that an object in a training image is pre-classified and labeled as one of the objects, e.g., a benign lesion, a cyst, an artifact, and the like. The one or more labeled objects in the one or more training images may be classified by, for example, a user of the classification device 130 based on previously generated clinical data, and the like. In a further embodiment, the memory 190 stores a distance threshold defined by, for example, a user of the classification device 130 based on a-priori data. The one or more training images and the distance threshold are described below in further detail with reference to the units of the lesion classifier 140.
[0026] The communication unit 145 includes codes and routines configured to facilitate communications between the imaging device 110 and one or more units of the lesion classifier 140. In one embodiment, the communication unit 145 includes a set of instructions executable by the processor 180 to provide the functionality for facilitating communication between the imaging device 110 and one or more units of the lesion classifier 140. In another embodiment, the communication unit 145 is a part of the memory 190. For example, the communication unit 145 may be stored in the memory 190 and is accessible and executable by the processor 180. In either embodiment, the communication unit 145 may be configured to communicate with the processor 180 and other units of the lesion classifier 140.
[0027] In one embodiment, the communication unit 145 receives one or more new images from the imaging device 110. The one or more new images include one or more unlabeled objects. In such an embodiment, the communication unit 145 sends the one or more new images to the feature extractor 150. In another embodiment, the communication unit 145 receives a notification including the classification of one or more unlabeled objects from the classification unit 165. In such an embodiment, the communication unit 145 sends the notification to, for example, a display device (not shown), a user of the classification device 130, and the like.
[0028] The feature extractor 150 includes codes and routines configured to extract one or more features of a labeled object and an unlabeled object from a training image and a new image respectively. In one embodiment, the feature extractor 150 includes a set of instructions executable by the processor 180 to provide the functionality for extracting one or more features of a labeled object and an unlabeled object from a training image and a new image respectively. In another embodiment, the feature extractor 150 is a part of the memory 190. For example, the feature extractor 150 may be stored in the memory 190 and is accessible and executable by the processor 180. In either embodiment, the feature extractor 150 may be configured to communicate with the processor 180 and other units of the lesion classifier 140.
[0029] In one embodiment, the feature extractor 150 receives a training image including a labeled object from the memory 190. The feature extractor 150 extracts one or more features of the labeled object from the training image. The feature extractor 150 also receives a new image from the imaging device 110 via the communication unit 145. The new image may include an unlabeled object. The feature extractor 150 detects the unlabeled object based on, for example, image segmentation algorithms and extracts one or more features of the unlabeled object. Non-limiting examples of the one or more features of the labeled object and the unlabeled object include local binary patterns, intensity values, local intensity oriented patterns, and Minkowski functionals. Although the feature extractor 150 is described as receiving a training image stored in the memory 190 according to one embodiment, in other embodiments, the feature extractor 150 may receive the training image from the imaging device 110 via the communication unit 145. The feature extractor 150 is further configured to send the one or more features of the labeled object and the unlabeled object to the neural network unit 155. Additionally, although the feature extractor 150 is described as receiving one training image including one labeled object according to one embodiment, in other embodiments the feature extractor 150 may receive a plurality of training images including one or more labeled objects. In such an embodiment, the feature extractor 150 is configured to extract and send one or more features of each labeled object from each of the plurality of training images to the neural network unit 155.
[0030] The neural network unit 155 includes codes and routines configured to generate an unlabeled weight matrix corresponding to the unlabeled object and a labeled weight matrix corresponding to the labeled object. In one embodiment, the neural network unit 155 includes a set of instructions executable by the processor 180 to provide the functionality for generating the unlabeled weight matrix corresponding to the unlabeled object and the labeled weight matrix corresponding to the labeled object. In another embodiment, the neural network unit 155 is a part of the memory 190. For example, the neural network unit 155 may be stored in the memory 190 and is accessible and executable by the processor 180. In either embodiment, the neural network unit 155 may be configured to communicate with the processor 180 and other units of the lesion classifier 140.
[0031] The neural network unit 155 includes one or more neural networks configured to generate an unlabeled weight matrix and a labeled weight matrix corresponding to the one or more features of the unlabeled object and the labeled object respectively. Although the neural network unit 155 is described herein as including a Restricted Boltzmann Machine (RBM) according to one embodiment, in other embodiments, the neural network unit 155 may include other types of neural networks, for example, a Deep Boltzmann Machine, a convolutional neural network, a deep belief network, an auto-encoder, a de-noising auto-encoder, a multi-layer perceptron, and the like for generating the labeled and the unlabeled weight matrices. Typically, an RBM is a generative stochastic artificial neural network including one or more visible units, one or more hidden units, and a weight matrix associated with the connections between the one or more visible units and the one or more hidden units. The energy of the RBM is given by the equation:
E(v,h) = -\sum_i a_i v_i - \sum_j b_j h_j - \sum_{i,j} W_{ij} v_i h_j . . . (1)
where v represents the visible units, h represents the hidden units, a represents bias weights for the visible units, b represents bias weights for the hidden units, W is the weight matrix, i = 1, 2, 3, … n, and j = 1, 2, 3, … m.
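As an illustration of equation (1), the following NumPy sketch evaluates the energy of a single RBM configuration. The dimensions and random values are illustrative only; only the formula itself comes from the specification.

```python
import numpy as np

def rbm_energy(v, h, a, b, W):
    """Energy of an RBM configuration per equation (1):
    E(v, h) = -sum_i a_i*v_i - sum_j b_j*h_j - sum_ij W_ij*v_i*h_j."""
    return -(a @ v) - (b @ h) - (v @ W @ h)

# Illustrative dimensions: n = 4 visible units, m = 3 hidden units.
rng = np.random.default_rng(0)
v = rng.integers(0, 2, size=4).astype(float)   # visible unit states
h = rng.integers(0, 2, size=3).astype(float)   # hidden unit states
a, b = rng.normal(size=4), rng.normal(size=3)  # bias weights a_i, b_j
W = rng.normal(size=(4, 3))                    # weight matrix W_ij
print(rbm_energy(v, h, a, b, W))
```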
[0032] In one embodiment, the neural network unit 155 receives one or more features of the labeled object from the feature extractor 150 and generates a labeled weight matrix based on the one or more features of the labeled object. The neural network unit 155 activates an RBM by providing the one or more features of the labeled object as inputs to the one or more visible units of the RBM. The neural network unit 155 transforms the one or more features of the labeled object into a lower dimensional space corresponding to the one or more hidden units of the RBM. The number of hidden units is determined, for example, by a user of the classification device 130 based on a-priori data. Furthermore, the neural network unit 155 generates the labeled weight matrix (i.e., W_ij) that represents an energy compact transformation of the one or more features of the labeled object between the one or more visible units and the one or more hidden units of the RBM. In particular, the one or more features in the visible units may be re-generated by, for example, multiplying the one or more features in the hidden units with the labeled weight matrix. In such an example, the error between the one or more features provided as inputs to the visible units and the re-generated features will be minimal since the labeled weight matrix represents an energy compact transformation. In one embodiment, the neural network unit 155 generates the labeled weight matrix by iteratively transforming the one or more features of the labeled object between the visible units and the hidden units until the energy of the RBM, described with reference to equation (1), is minimized.
[0033] Similarly, the neural network unit 155 receives one or more features of an unlabeled object from the feature extractor 150 and generates an unlabeled weight matrix based on the one or more features of the unlabeled object. The unlabeled weight matrix represents an energy compact transformation of the one or more features of the unlabeled object between the one or more visible units and the one or more hidden units of the RBM. The neural network unit 155 is further configured to send the labeled weight matrix and the unlabeled weight matrix to the comparison unit 160. Additionally, in the embodiment where the feature extractor 150 receives one or more labeled objects in a plurality of training images, the neural network unit 155 generates and sends a labeled weight matrix corresponding to each labeled object in the plurality of training images to the comparison unit 160.
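Paragraphs [0032] and [0033] describe iterating the transformation until the RBM energy is minimized, but the specification does not fix a training algorithm. A common choice for this kind of energy minimization is one-step contrastive divergence (CD-1); the sketch below assumes CD-1 with sigmoid units and features scaled to [0, 1], and is not the patent's prescribed procedure.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(features, n_hidden, epochs=100, lr=0.05, seed=0):
    """Learn a weight matrix W for a feature matrix of shape
    (n_samples, n_visible) using one-step contrastive divergence;
    feature values are assumed to lie in [0, 1]."""
    rng = np.random.default_rng(seed)
    n_visible = features.shape[1]
    W = 0.01 * rng.normal(size=(n_visible, n_hidden))
    a = np.zeros(n_visible)  # visible-unit bias weights
    b = np.zeros(n_hidden)   # hidden-unit bias weights
    for _ in range(epochs):
        v0 = features
        ph0 = sigmoid(v0 @ W + b)                         # visible -> hidden
        h0 = (rng.random(ph0.shape) < ph0).astype(float)  # sample hidden states
        v1 = sigmoid(h0 @ W.T + a)                        # reconstruct visible units
        ph1 = sigmoid(v1 @ W + b)                         # re-infer hidden probabilities
        # Gradient step that lowers the energy of equation (1) on the data.
        W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(v0)
        a += lr * (v0 - v1).mean(axis=0)
        b += lr * (ph0 - ph1).mean(axis=0)
    return W
```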
[0034] The comparison unit 160 includes codes and routines configured to calculate a Hausdorff distance based on a comparison between the labeled and the unlabeled weight matrices. In one embodiment, the comparison unit 160 includes a set of instructions executable by the processor 180 to provide the functionality for calculating the Hausdorff distance based on the comparison between the labeled and the unlabeled weight matrices. In another embodiment, the comparison unit 160 is a part of the memory 190. For example, the comparison unit 160 may be stored in the memory 190 and is accessible and executable by the processor 180. In either embodiment, the comparison unit 160 may be configured to communicate with the processor 180 and other units of the lesion classifier 140.
[0035] In one embodiment, the comparison unit 160 receives the labeled weight matrix and the unlabeled weight matrix from the neural network unit 155. The comparison unit 160 computes one or more cosine distance vectors between one or more rows of the unlabeled weight matrix and one or more rows of the labeled weight matrix. The comparison unit 160 determines a minimum distance value from each of the one or more cosine distance vectors. The comparison unit 160 then calculates the Hausdorff distance between the unlabeled weight matrix and the labeled weight matrix as the median of the one or more minimum distance values. Although the Hausdorff distance is described as the median value of the one or more minimum distance values according to one embodiment, in other embodiments the comparison unit 160 may calculate the Hausdorff distance as, for example, a minimum value, an average value, a mode value, a maximum value, and the like, of the one or more minimum distance values. The one or more minimum distance values and the Hausdorff distance are described below in further detail with reference to FIG. 2. The comparison unit 160 is further configured to send the Hausdorff distance between the unlabeled weight matrix and the labeled weight matrix to the classification unit 165.
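The comparison step of paragraph [0035] can be written out directly: one cosine-distance vector per row of the unlabeled weight matrix, the minimum of each vector, and the median of those minima. The function below is a minimal sketch of that computation; the function and variable names are illustrative.

```python
import numpy as np

def cosine_distance(u, v):
    """1 minus the cosine similarity of two row vectors."""
    return 1.0 - (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

def hausdorff_distance(W_unlabeled, W_labeled):
    """Hausdorff distance between two weight matrices per paragraph [0035]:
    for each row of the unlabeled matrix, compute a cosine-distance vector
    against the rows of the labeled matrix, take its minimum, and return
    the median of the minimum distance values."""
    minima = [min(cosine_distance(u_row, l_row) for l_row in W_labeled)
              for u_row in W_unlabeled]
    return float(np.median(minima))
```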
[0036] Additionally, in the embodiment where the feature extractor 150 receives one or more labeled objects in a plurality of training images, the comparison unit 160 receives a labeled weight matrix corresponding to each labeled object in the plurality of training images. In such an embodiment, the comparison unit 160 calculates a Hausdorff distance between the unlabeled weight matrix and each of the labeled weight matrices. The comparison unit 160 then sends the plurality of Hausdorff distances to the classification unit 165.
[0037] The classification unit 165 includes codes and routines configured to classify the unlabeled object based on the one or more Hausdorff distances. In one embodiment, the classification unit 165 includes a set of instructions executable by the processor 180 to provide the functionality for classifying the unlabeled object based on the one or more Hausdorff distances. In another embodiment, the classification unit 165 is a part of the memory 190. For example, the classification unit 165 may be stored in the memory 190 and is accessible and executable by the processor 180. In either embodiment, the classification unit 165 may be configured to communicate with the processor 180 and other units of the lesion classifier 140.
[0038] In one embodiment, the classification unit 165 receives the Hausdorff distance between the unlabeled weight matrix and the labeled weight matrix. The classification unit 165 classifies the unlabeled object as the labeled object (e.g., a cyst, a lesion, an artifact, a duct, and the like) based on the Hausdorff distance. For example, if the Hausdorff distance is below the distance threshold, then the classification unit 165 classifies and labels the unlabeled object as the labeled object. In the embodiment where the feature extractor 150 receives one or more labeled objects in a plurality of training images, the classification unit 165 receives a plurality of Hausdorff distances between the unlabeled weight matrix and the plurality of labeled weight matrices. In such an embodiment, the classification unit 165 determines the least amongst the plurality of Hausdorff distances. The classification unit 165 classifies and labels the unlabeled object as the labeled object corresponding to the least Hausdorff distance. In either embodiment, the classification unit 165 may be further configured to generate graphical data for providing a notification to, for example, a user of the classification device 130. The notification includes, for example, an output image including the unlabeled object labeled as one of the labeled objects. In one embodiment, the classification unit 165 sends the graphical data to a display device (not shown) operatively coupled to the classification device 130. In such an embodiment, the display device renders the graphical data and displays the notification. In another embodiment, the classification unit 165 sends the notification to a user of the classification device 130 via, for example, an e-mail, a short messaging service, a voice message, and the like. In either embodiment, the user (e.g., a clinician) may use the notification for the diagnosis of a subject, for example, a human patient.
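Combining the threshold test and the least-distance rule of paragraph [0038], the classification step might look like the sketch below, which reuses hausdorff_distance from the previous sketch. The dictionary interface and the None return for an out-of-threshold match are illustrative assumptions, not the patent's specified interface.

```python
def classify(W_unlabeled, labeled_matrices, distance_threshold=None):
    """Label the unlabeled object with the nearest labeled weight matrix.
    `labeled_matrices` maps a label (e.g., "cyst") to its weight matrix;
    if a distance threshold is given, matches beyond it are rejected."""
    distances = {label: hausdorff_distance(W_unlabeled, W)
                 for label, W in labeled_matrices.items()}
    best = min(distances, key=distances.get)
    if distance_threshold is not None and distances[best] > distance_threshold:
        return None, distances  # no labeled object is close enough
    return best, distances
```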
[0039] Referring now to FIG. 2, a schematic representation of an exemplary method 200 for classifying an unlabeled object in a new image 230 using the system illustrated in FIG. 1 is shown. The communication unit 145 receives a plurality of training images 210 and 220. The training images 210 and 220 include a plurality of labeled objects such as, but not limited to, a cyst 202, a benign lesion 204, and a malignant lesion 222. The feature extractor 150 extracts one or more features of each labeled object in the plurality of training images 210 and 220. The neural network unit 155 then generates a first labeled weight matrix 280 based on the one or more features of the cyst 202 using an RBM 235. The RBM 235 includes one or more visible units 240 and one or more hidden units 245. The first labeled weight matrix 280 represents a transformation of the one or more features of the cyst 202 between the one or more visible units 240 and the one or more hidden units 245 of the RBM 235. The neural network unit 155 generates the first labeled weight matrix 280 by iteratively transforming the one or more features of the cyst 202 between the visible units 240 and the hidden units 245 until a gradient of the first labeled weight matrix 280 reaches a local minimum. Similarly, the neural network unit 155 generates a second and a third labeled weight matrix 285 and 290 based on the one or more features of the benign lesion 204 and the malignant lesion 222 respectively. The second labeled weight matrix 285 represents a transformation of the one or more features of the benign lesion 204 between one or more visible units 241 and one or more hidden units 246 of an RBM 236. The third labeled weight matrix 290 represents a transformation of the one or more features of the malignant lesion 222 between one or more visible units 242 and one or more hidden units 246 of an RBM 237.
[0040] The communication unit 145 receives the new image 230 including an unlabeled object 232 from an imaging device. The feature extractor 150 extracts one or more features of the unlabeled object 232. The neural network unit 155 generates an unlabeled weight matrix 295 based on the one or more features of the unlabeled object 232 using an RBM 238. The unlabeled weight matrix 295 represents a transformation of the one or more features of the unlabeled object 232 between one or more visible units 243 and one or more hidden units 248 of the RBM 238.
[0041] The comparison unit 160 generates one or more cosine distance vectors 296, 297, and 298 between each row of the unlabeled weight matrix 295 and each row of the first labeled weight matrix 280. The comparison unit 160 then computes one or more minimum distance values 299 by determining the least value in each of the cosine distance vectors 296, 297, and 298. Further, the comparison unit 160 calculates the median value amongst the minimum distance values 299 as a first Hausdorff distance 291 between the unlabeled weight matrix 295 and the first labeled weight matrix 280. Similarly, the comparison unit 160 calculates a second Hausdorff distance 292 and a third Hausdorff distance 293 between the unlabeled weight matrix 295 and the second labeled weight matrix 285 and the third labeled weight matrix 290 respectively. The classification unit 165 then classifies the unlabeled object 232 based on the first, second, and the third Hausdorff distances 291, 292, and 293.
[0042] In one example, the comparison unit 160 generates the first cosine distance vector 296 as [0.2, 0.4, 0.6], the second cosine distance vector 297 as [1.7, 1.0, 1.5], and the third cosine distance vector 298 as [1.8, 1.0, 0.4]. The comparison unit 160 then computes the minimum distance values 299 as [0.2, 1.0, 0.4] based on the least value in each of the first, second, and third cosine distance vectors 296, 297, and 298. In such an example, since the median value amongst the minimum distance values 299 is 0.4, the comparison unit 160 calculates the first Hausdorff distance 291 as 0.4. Similarly, the comparison unit 160 calculates the second Hausdorff distance 292 as 1.6 and the third Hausdorff distance 293 as 0.2. The classification unit 165 then classifies and labels the unlabeled object 232 as a malignant lesion since the third Hausdorff distance 293 (i.e., 0.2) is the least amongst the three Hausdorff distances 291, 292, and 293. Since the third Hausdorff distance 293 is the least, the classification unit 165 infers that, amongst the three labeled objects 202, 204, and 222, the unlabeled object 232 in the new image 230 is most similar to the malignant lesion 222. The classification unit 165 may further generate graphical data for providing an output image to, for example, a user of the classification device 130. The output image may include a label indicating that the unlabeled object 232 is a malignant lesion.
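The arithmetic of paragraph [0042] can be checked directly from the three cosine-distance vectors it gives:

```python
import numpy as np

# Cosine distance vectors 296, 297, and 298 from paragraph [0042].
vectors = [[0.2, 0.4, 0.6], [1.7, 1.0, 1.5], [1.8, 1.0, 0.4]]
minima = [min(v) for v in vectors]  # minimum distance values: [0.2, 1.0, 0.4]
print(np.median(minima))            # 0.4 -> the first Hausdorff distance 291

# Given the three Hausdorff distances, the least one selects the label.
distances = [(0.4, "cyst"), (1.6, "benign lesion"), (0.2, "malignant lesion")]
print(min(distances))               # (0.2, 'malignant lesion')
```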
[0043] Although the neural network unit 155 is described as including four RBMs 235, 236, 237, and 238 for generating corresponding weight matrices 280, 285, 290, and 295, in other embodiments the neural network unit 155 may include a single RBM to generate the weight matrices 280, 285, 290, and 295. Furthermore, although the dimensions of the weight matrices 280, 285, 290, and 295 are shown as 3×3 matrices, in other embodiments the weight matrices 280, 285, 290, and 295 may be of any dimension. Additionally, the weight matrices 280, 285, 290, and 295 may have the same or different dimensions. For example, the weight matrix 280 may be a 4×4 matrix and the weight matrix 290 may be a 5×5 matrix.
[0044] FIG. 3A is a pictorial representation illustrating an exemplary new image 310 received by the lesion classifier 140 of FIG. 1. The new image 310 includes an unlabeled object 320. FIG. 3B is a pictorial representation illustrating an exemplary output image 330 generated by the lesion classifier 140 corresponding to the new image 310 in FIG. 3A. The lesion classifier 140 classifies the unlabeled object 320 and generates the output image 330 including the label for the unlabeled object 320. In the illustrated embodiment, the lesion classifier 140 classifies and labels the unlabeled object 320 as a malignant/cancerous lesion 340 (i.e., a labeled object).
[0045] FIG. 3C is a pictorial representation illustrating an exemplary new image 350 received by the lesion classifier 140 of FIG. 1. The new image 350 includes an unlabeled object 360. FIG. 3D is a pictorial representation illustrating an exemplary output image 370 generated by the lesion classifier 140 corresponding to the new image 350 in FIG. 3C. In the illustrated embodiment, the lesion classifier 140 classifies and labels the unlabeled object 360 as a cyst 380 (i.e., a labeled object).
[0046] FIG. 4 is a flow diagram illustrating an exemplary method 400 for classifying an unlabeled object in a new image using the system in FIG. 1. At step 402, the communication unit 145 receives a training image comprising a labeled object. At step 404, the feature extractor 150 extracts one or more features of the labeled object based on the training image. At step 406, the neural network unit 155 generates a labeled weight matrix based on the one or more features of the labeled object using a neural network, where the labeled weight matrix corresponds to an energy compact transformation of the one or more features of the labeled object between one or more visible units and one or more hidden units of the neural network. At step 408, the communication unit 145 receives a new image comprising an unlabeled object from an imaging device. Although the communication unit 145 receives the new image from an imaging device according to one embodiment, in other embodiments the communication unit 145 may receive the new image from the memory 190, an external data repository, and the like. At step 410, the feature extractor 150 extracts one or more features of the unlabeled object based on the new image. At step 412, the neural network unit 155 generates an unlabeled weight matrix based on the one or more features of the unlabeled object using the neural network, where the unlabeled weight matrix corresponds to an energy compact transformation of the one or more features of the unlabeled object between the one or more visible units and the one or more hidden units of the neural network.
[0047] At step 414, the comparison unit 160 computes one or more cosine distance vectors between one or more rows of the unlabeled weight matrix and one or more rows of the labeled weight matrix. At step 416, the comparison unit 160 calculates a Hausdorff distance between the unlabeled weight matrix and the labeled weight matrix based on the one or more cosine distance vectors. For example, the comparison unit 160 determines a minimum distance value from each of the one or more cosine distance vectors. The comparison unit 160 then calculates the median value of the one or more minimum distance values as the Hausdorff distance between the unlabeled weight matrix and the labeled weight matrix. At step 418, the classification unit 165 classifies the unlabeled object as the labeled object based on the Hausdorff distance. For example, if the Hausdorff distance between the unlabeled weight matrix and the labeled weight matrix is less than a distance threshold, the classification unit 165 classifies and labels the unlabeled object in the new image as the labeled object. In such an example, if the Hausdorff distance exceeds the distance threshold, the classification unit 165 may be configured not to classify the unlabeled object as the labeled object. Although the lesion classifier 140 is described as classifying the unlabeled object based on one labeled object according to FIG. 4, in other embodiments, the lesion classifier 140 may classify an unlabeled object based on a plurality of labeled objects as described above with reference to FIG. 2.
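Putting the steps of method 400 together, an end-to-end sketch could look like the following. It composes the train_rbm, hausdorff_distance, and classify sketches above; the feature extractor is a stub, since the specification leaves the choice among local binary patterns, intensity values, local intensity oriented patterns, and Minkowski functionals open.

```python
import numpy as np

def extract_features(image):
    """Stub for the feature extractor 150: returns a feature matrix of
    shape (n_samples, n_features) scaled to [0, 1]. A real implementation
    would compute, e.g., local binary patterns or Minkowski functionals."""
    flat = np.asarray(image, dtype=float).reshape(1, -1)
    return (flat - flat.min()) / (np.ptp(flat) + 1e-12)

def method_400(training_images, new_image, n_hidden=8, threshold=0.5):
    """Steps 402-418: one labeled weight matrix per labeled object, one
    unlabeled weight matrix for the new image, then classification by
    the least Hausdorff distance (subject to the distance threshold)."""
    labeled_matrices = {label: train_rbm(extract_features(img), n_hidden)
                        for label, img in training_images.items()}
    W_unlabeled = train_rbm(extract_features(new_image), n_hidden)
    return classify(W_unlabeled, labeled_matrices, threshold)
```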
[0048] Although the features of certain objects (e.g., a duct, a cyst, and the like) are often confounded with features of other objects (e.g., a malignant lesion, a benign lesion, and the like) by prior classification methods, the lesion classifier 140 described above classifies an unlabeled object with a high degree of accuracy, based on the weight matrices. The lesion classifier 140 is advantageous as it classifies an unlabeled object by considering a broad spectrum of information (i.e., features) corresponding to the unlabeled object in the new image and the one or more labeled objects in the training images. Since a weight matrix represents an energy compact description of an object’s features, it enables the lesion classifier 140 to classify the unlabeled object by comparing the salient features of the object and ignoring noise in the broad spectrum of information.
[0049] It is to be understood that not necessarily all such objects or advantages described above may be achieved in accordance with any particular implementation. Thus, for example, those skilled in the art will recognize that the systems and techniques described herein may be embodied or carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other objects or advantages as may be taught or suggested herein.
[0050] While the technology has been described in detail in connection with only a limited number of implementations, it should be readily understood that the invention is not limited to such disclosed implementations. Rather, the technology can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the disclosure. Additionally, while various implementations of the technology have been described, it is to be understood that aspects of the technology may include only some of the described implementations. Accordingly, the inventions are not to be seen as limited by the foregoing description, but are only limited by the scope of the appended claims.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 2433-CHE-2015-ASSIGNMENT WITH VERIFIED COPY [19-03-2025(online)].pdf 2025-03-19
2 2433-CHE-2015-FORM-16 [19-03-2025(online)].pdf 2025-03-19
3 2433-CHE-2015-POWER OF AUTHORITY [19-03-2025(online)].pdf 2025-03-19
4 2433-CHE-2015-IntimationOfGrant26-07-2023.pdf 2023-07-26
5 2433-CHE-2015-PatentCertificate26-07-2023.pdf 2023-07-26
6 2433-CHE-2015-Written submissions and relevant documents [31-05-2023(online)].pdf 2023-05-31
7 2433-CHE-2015-Correspondence to notify the Controller [12-05-2023(online)].pdf 2023-05-12
8 2433-CHE-2015-FORM-26 [12-05-2023(online)].pdf 2023-05-12
9 2433-CHE-2015-AMENDED DOCUMENTS [18-04-2023(online)].pdf 2023-04-18
10 2433-CHE-2015-FORM 13 [18-04-2023(online)].pdf 2023-04-18
11 2433-CHE-2015-POA [18-04-2023(online)].pdf 2023-04-18
12 2433-CHE-2015-US(14)-HearingNotice-(HearingDate-17-05-2023).pdf 2023-03-30
13 2433-CHE-2015-ABSTRACT [30-04-2020(online)].pdf 2020-04-30
14 2433-CHE-2015-CLAIMS [30-04-2020(online)].pdf 2020-04-30
15 2433-CHE-2015-COMPLETE SPECIFICATION [30-04-2020(online)].pdf 2020-04-30
16 2433-CHE-2015-CORRESPONDENCE [30-04-2020(online)].pdf 2020-04-30
17 2433-CHE-2015-DRAWING [30-04-2020(online)].pdf 2020-04-30
18 2433-CHE-2015-FER_SER_REPLY [30-04-2020(online)].pdf 2020-04-30
19 2433-CHE-2015-OTHERS [30-04-2020(online)].pdf 2020-04-30
20 2433-CHE-2015-FER.pdf 2019-11-27
21 2433-CHE-2015-FORM 13 [12-11-2019(online)].pdf 2019-11-12
22 2433-CHE-2015-RELEVANT DOCUMENTS [12-11-2019(online)].pdf 2019-11-12
23 apstract 2433-CHE-2015.jpg 2015-09-01
24 2433-CHE-2015 POWER OF ATTORNEY 17-07-2015.pdf 2015-07-17
25 2433-CHE-2015 FORM-1 17-07-2015.pdf 2015-07-17
26 2433-CHE-2015 CORRESPONDENCE OTHERS 17-07-2015.pdf 2015-07-17
27 274727-1 IN Abstract Drawing.jpg 2015-05-18
28 274727-1 IN Form 2 Specification.pdf 2015-05-18
29 274727-1IN_Form_3.pdf 2015-05-18
30 274727-1IN_POA_Form_26.pdf 2015-05-18

Search Strategy

1 Search_Strategy_2433_CHE_2015_27-11-2019.pdf

ERegister / Renewals

3rd: 19 Aug 2023 (13/05/2017 to 13/05/2018)
4th: 19 Aug 2023 (13/05/2018 to 13/05/2019)
5th: 19 Aug 2023 (13/05/2019 to 13/05/2020)
6th: 19 Aug 2023 (13/05/2020 to 13/05/2021)
7th: 19 Aug 2023 (13/05/2021 to 13/05/2022)
8th: 19 Aug 2023 (13/05/2022 to 13/05/2023)
9th: 19 Aug 2023 (13/05/2023 to 13/05/2024)
10th: 13 May 2024 (13/05/2024 to 13/05/2025)
11th: 12 May 2025 (13/05/2025 to 13/05/2026)